The Miburi is a wearable musical instrument which was released commercially by the
Yamaha Corporation’s Tokyo-based experimental division in 1995.[1]
Categorisation and functions of the Miburi
The Miburi can be characterized as an “inside-in” system according to Axel Mulder’s three categories of motion sensing systems:
inside-in - sensor(s) and source(s) that are both on the body;
inside-out - on-body sensors that sense artificial external sources;
outside-in - external sensors that sense artificial sources on the body.[2]
It conforms to what Todd Winkler refers to as the ‘body sensor’ group of controllers (the others are spatial sensors, acoustic models and ‘new instruments’).[3]
The Miburi system consists of a vest with embedded capacitive displacement sensors, two hand-grips, shoe inserts with pressure sensors, and a belt-worn signal distribution unit joined by a cable to a small synthesizer/MIDI converter. A wireless version, conforming to Japanese wireless frequency regulations, was available in Japan only.
The Miburi's belt unit, “MBU-20”, processes data from the sensors into MIDI pitch and velocity information. The unit can be programmed to interpret the data using three ‘trigger’ modes: ‘Cross-point’ mode, ‘Stop’ mode, and ‘All’, a combination of both. ‘Cross-point’ mode measures the speed of the transducer’s flexion as it crosses its zero point (the point at which the flex sensor is straight). The six flex sensors can send twelve distinct notes, because inward and outward movement of each joint is measured as a separate note. ‘Stop’ mode sends the note and maximum velocity values at the conclusion of a gesture. ‘All’ interprets sensor data in both modes simultaneously.[4]
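The ‘Cross-point’ logic can be illustrated with a short sketch. This is a hypothetical reconstruction, not Yamaha’s implementation: the note numbers, scaling factor, and function name are illustrative only, and the real MBU-20 processes sensor data in hardware.

```python
# Hypothetical sketch of the "Cross-point" trigger mode: a note fires
# when a flex sensor's value crosses its zero point (joint straight),
# with velocity taken from the speed of the crossing. Inward and
# outward movement of the joint map to two different MIDI notes.

def cross_point_events(samples, note_out=60, note_in=62):
    """Return (note, velocity) pairs for each zero crossing.

    samples: sequence of signed sensor readings (straight joint = 0).
    note_out / note_in: illustrative MIDI notes for the two directions.
    """
    events = []
    for prev, curr in zip(samples, samples[1:]):
        if prev < 0 <= curr:            # crossing while extending outward
            velocity = min(127, abs(curr - prev) * 10)
            events.append((note_out, velocity))
        elif prev >= 0 > curr:          # crossing while flexing inward
            velocity = min(127, abs(curr - prev) * 10)
            events.append((note_in, velocity))
    return events

# One outward crossing (-1 -> 3) and one inward crossing (2 -> -5):
print(cross_point_events([-4, -1, 3, 8, 2, -5]))  # → [(60, 40), (62, 70)]
```

The faster the sensor moves through its zero point, the larger the difference between successive readings, and hence the higher the MIDI velocity, which is how the mode ties loudness to the energy of the gesture.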
The mapping of each sensor is highly programmable. Each sensor can be mapped on the synthesizer unit, “MSU-20”, to any MIDI note and interpreted in any of the three trigger modes outlined above, according to 48 different response modes. The response modes (preset by Yamaha) define how the sensor’s output is mapped to velocity. All of the above settings are components of a single map ‘Preset’; 32 programmable preset positions are available.
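One way to picture a map ‘Preset’ is as a per-sensor record combining a note assignment, a trigger mode, and a response-mode index. The structure below is a sketch under that assumption; the field and sensor names are invented for illustration and do not reflect the MSU-20’s actual internal format.

```python
# Hypothetical representation of a Miburi map "Preset": each flex
# sensor is assigned a MIDI note per movement direction, one of the
# three trigger modes, and one of the 48 preset velocity response
# curves. Field and sensor names are illustrative only.

from dataclasses import dataclass

TRIGGER_MODES = ("cross-point", "stop", "all")
NUM_RESPONSE_MODES = 48   # velocity response curves preset by Yamaha
NUM_PRESETS = 32          # programmable preset positions on the MSU-20

@dataclass
class SensorMapping:
    note_outward: int      # MIDI note for outward joint movement
    note_inward: int       # MIDI note for inward joint movement
    trigger_mode: str      # one of TRIGGER_MODES
    response_mode: int     # 0..47, selects a velocity curve

    def __post_init__(self):
        assert self.trigger_mode in TRIGGER_MODES
        assert 0 <= self.response_mode < NUM_RESPONSE_MODES

# A preset holds one mapping per sensor; the unit stores 32 such presets.
preset = {
    "right_elbow": SensorMapping(60, 62, "cross-point", 7),
    "left_elbow":  SensorMapping(64, 65, "all", 12),
}
```

Because each direction of each joint carries its own note and curve, a single preset can turn the same physical vocabulary of movements into a very different musical result, which is what makes the instrument reprogrammable per piece.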
Evaluation of the Miburi
These features make the Miburi extremely effective as a computing input device. However, the Miburi’s synthesizer unit is limited in its possibilities as a sound source and, more importantly, is only able to process gestures in a direct one-to-one relationship to the sounds they produce. The need to ‘tether’ the Miburi to its synthesizer unit is also a clear drawback for movement detection and a restriction for the dancer. On the other hand, the Miburi has the robust design and highly predictable sensor output that might be expected from one of the principal electronic musical instrument manufacturers.
The Miburi may be combined with more sophisticated sound sources and software-based interactive mapping environments such as Max/MSP. Extensions of its basic functions include control of video,[5] lighting,[6] use as a component of a “multimedia orchestra”[7] and “to help children engage their whole bodies while interacting with computers”.[8]
Composers of music for the Miburi
Saburo Hirano,[9] "Ping Bang" (1995) for solo Miburi - uses the Miburi as a multimedia controller
Lindsay Vickery, "your sky is filled with billboards of the sky" (2002) for solo Miburi - uses the Miburi as a multimedia controller [11]
References
^Marrin, Teresa and Paradiso, Joseph, “The Digital Baton: a Versatile Performance Instrument”, International Computer Music Conference, Thessaloniki, Greece, pp. 313-316, 1997
^Mulder, Axel, Human movement tracking technology, Hand Centered Studies of Human Movement Project, Technical Report 94-1, 1994
^Winkler, Todd, Composing Interactive Music: Techniques and Ideas Using Max, Cambridge, Massachusetts: MIT Press, pp. 315-318, 1998
^Yamaha Corporation, Miburi R3 Manual, Tokyo, Japan: Yamaha Corporation, 1996
^Vickery, Lindsay, “The Yamaha Miburi MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne
^Nishimoto, Kazushi, Mase, Kenji, Fels, Sidney, “Towards Multimedia Orchestra: A Proposal for an Interactive Multimedia Art Creation System”, ICMCS, Vol. 1, pp. 900-904, 1999
^Zigelbaum, Jamie, Millner, Amon, Desai, Bella, Ishii, Hiroshi, “BodyBeats: whole-body, musical interfaces for children”, CHI Extended Abstracts 2006, pp. 1595-1600, 2006
^Vickery, Lindsay, “The Yamaha Miburi MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne
Nagashima, Yoichi, “Real-Time Interactive Performance with Computer Graphics and Computer Music”, in Proceedings of the 7th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Man-Machine Systems, IFAC, 1998.
http://nagasm.suac.net/ASL/paper/ifac98.pdf
Nishimoto, Kazushi, Mase, Kenji, Fels, Sidney, “Towards Multimedia Orchestra: A Proposal for an Interactive Multimedia Art Creation System”, ICMCS, Vol. 1, pp. 900-904, 1999.
https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=779322
Vickery, Lindsay, “The Yamaha MIBURI MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne.