Halfway into a recent performance at New York City's Bohemian National Hall, violinist Mari Kimura had already performed "Preludio" from Bach's Partita No. 3 in E Major, followed by several pieces in which she deftly demonstrated her innovative "subharmonics" techniques for extending the range of her instrument. Then things got really interesting.

Kimura donned a white fingerless glove laden with wireless sensors, plugged her "augmented" violin into her laptop onstage, and proceeded to demonstrate how she is redefining the relationship musicians have with both their instruments and their music. After a few moments setting up her interactive bowing technology, Kimura launched into her composition "Voyage Apollonian," during which her bow strokes controlled an animation sequence created by New York University computer science professor Ken Perlin. As she played, the glove's sensors sent data to software running on her laptop, prompting a black-and-white butterfly on the large screen behind her to morph into various shapes and patterns before returning to its original winged configuration.

The sensors are part of a module that includes 3-D accelerometers and three-axis gyroscopes, as well as a wireless transmitter that sends data about Kimura's bowing to her computer as she plays. The module functions as an electronic controller for real-time digital sound processes, such as sound transformation and sound synthesis, says Frédéric Bevilacqua, who leads the Paris Institut de Recherche et Coordination Acoustique/Musique's (IRCAM) Real-Time Musical Interactions Team, which developed the module.
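To make the data flow concrete, here is a minimal sketch of how a receiving program might decode one sample from a six-axis motion module like the one described (three accelerometer axes plus three gyroscope axes). The packet layout of six little-endian floats is an assumption for illustration, not IRCAM's actual wire format:

```python
# Hypothetical sketch: decoding one 6-axis motion sample (3-axis
# accelerometer + 3-axis gyroscope) from a bow-sensor module.
# The six-little-endian-float packet layout is an invented assumption,
# not IRCAM's real protocol.
import struct

AXES = ("x", "y", "z")

def decode_motion_packet(packet: bytes) -> dict:
    """Unpack six little-endian floats into accelerometer/gyroscope readings."""
    ax, ay, az, gx, gy, gz = struct.unpack("<6f", packet)
    return {
        "accel": dict(zip(AXES, (ax, ay, az))),  # linear acceleration
        "gyro": dict(zip(AXES, (gx, gy, gz))),   # angular velocity
    }

# A performance patch would poll packets at the module's sample rate and
# feed the decoded values to the real-time sound-processing software.
sample = struct.pack("<6f", 0.1, -0.9, 9.8, 15.0, -3.0, 0.5)
reading = decode_motion_packet(sample)
```

In a real patch this decoding loop would run continuously, with each reading driving sound transformation or synthesis parameters.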

Augmented violin
IRCAM's augmented violin technology—its first prototype was built in 2004—tracks all bow attacks and bow angles, Bevilacqua says, adding, "We specifically worked on a gesture recognition and synchronization system that is able to distinguish standard bowing styles, such as détaché or martelé, or to recognize a bow pattern specifically chosen by the musician." Others who have used IRCAM's technology include composer Florence Baschet and dancer/choreographer Richard Siegal.

The augmented violin project appeals to Kimura's experimental side, but she is hesitant to draw specific conclusions based on the data produced by the sensors. "When I first learned about the technology I was really excited because bowing is what makes the violin really live," says Kimura, who teaches a graduate class in computer music interactive performance at The Juilliard School in New York City. "I wanted to study: Where does expression lie in? How does bowing motion contribute to the expression? For engineers it would be so easy if the bowing motion itself equaled expression, but it doesn't."

IRCAM's original idea was to attach sensors to the violinist's bow and a battery for the sensors to Kimura's wrist, but she found this cumbersome and moved the sensors and battery to a makeshift glove worn on her right hand. Her first performance using the glove came in 2007. Two years later she was approached by designer Mark Salinas, who revamped the look and designed the version of the glove that Kimura wore during the second half of her recent New York performance.

Kimura's work has taken IRCAM's technology in new directions. Instead of using the system of violin, sensors and software to analyze and reproduce certain sounds, Kimura is more interested in studying her relationship with her instrument. "How you get to the string and how you end your stroke—that's where the character of the bowing is," she adds.

Duet X2
Pushing this technology further during her second performance with the sensor glove at the Bohemian National Hall show on May 20, Kimura had cellist Dave Eggar (wearing a sensor glove of his own) join her on stage to debut a duet she had written to be played interactively with her computer. Throughout the piece, software running on her laptop read data produced by the sensor gloves, extracting musical expression from Kimura's and Eggar's bows as they played. The computer used that information to modify the sounds made by the two instruments throughout the piece.

The musicians were using OMax, an artificial intelligence program written at IRCAM and designed to learn, in real time, the typical features of a musician's style and then use those lessons to alter the musical output, giving the flavor of a machine collaborating with a musician to yield a novel improvisation. OMax, produced by IRCAM's Musical Representations Team, achieves this in part by creating a digital "clone" of each musician that interacts with the musician's output as well as other musicians' clones during a performance.
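OMax itself builds a sophisticated sequence model over a live stream; as a much simpler stand-in for the "clone" idea, this sketch uses a first-order Markov model: it learns which notes tend to follow which in a player's phrase, then recombines them into a new, style-flavored line. Everything here is a hypothetical illustration, not OMax's actual algorithm:

```python
# Toy "clone": learn note-to-note transitions from a phrase, then walk
# them to generate a recombined variation in the same style. OMax uses a
# far more capable sequence model; this is only a conceptual sketch.
import random

def learn_transitions(phrase):
    """Record, for each note, the notes that followed it in the phrase."""
    transitions = {}
    for current, nxt in zip(phrase, phrase[1:]):
        transitions.setdefault(current, []).append(nxt)
    return transitions

def clone_improvisation(transitions, start, length, seed=0):
    """Walk the learned transitions to produce a style-flavored line."""
    rng = random.Random(seed)
    line = [start]
    for _ in range(length - 1):
        choices = transitions.get(line[-1])
        if not choices:
            break  # dead end: the note was never followed by anything
        line.append(rng.choice(choices))
    return line

phrase = ["G", "A", "B", "A", "G", "B", "A", "G"]
model = learn_transitions(phrase)
variation = clone_improvisation(model, "G", 6)
```

The variation never repeats the input verbatim by design, yet every note-to-note move it makes was observed in the original phrase, which is the sense in which the "clone" plays in the musician's style.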

"The bows were actually controlling not only our own sound processing but the processing of the other person," Kimura says. "I could change Dave's processing with the bow, and he changed my processing with his."

The computer acts almost as a composer, but it's not exactly a third instrument. The output originates only from the duo. As a result of the computer's influence, however, "you're not really in control of the music," Eggar says. "Jazz improvisation is actually more predictable. When I do something, I can pretty much tell what someone else will do. Tonight's performance was very different."

Computers and music have become intertwined over the past several decades, says Eggar, himself a Grammy-nominated experimentalist who has performed with musicians running the gamut from Hannah Montana to the Philippines's indigenous Talaandig tribe. "Mari's really owning that and defining what it means for a musician to interface with a computer," he adds.

Technology aside, a striking component of Kimura's performance is the integration of her subharmonic bowing techniques, which she has been developing for nearly two decades. The objective behind subharmonics is to extend the violin's range by a full octave. For example, she can hit notes below the unfingered "open" G string—normally the lowest note in standard tuning—reaching into the cello range without altering the instrument's tuning. Throughout her performance Kimura's bow flew across the strings as she delved into the subharmonic range and back, often during a single bow stroke.

A likely next step for Kimura is to use the technology behind the augmented violin to better understand subharmonics. The late Yale University physicist William Bennett—co-inventor of the first gas laser—dedicated considerable time prior to his 2008 death to studying some of Kimura's techniques, writing in his 2006 book The Science of Musical Sound about what he termed "subtones" produced by her violin. Still, he was not able to pin down precisely what mechanism she used to produce, as he put it, "octave subtones."

Kimura estimates that within a year or two she would like to work with IRCAM to better understand how subharmonics are produced. "Right now, all I have is my kinetic memory and a voodoo feeling that's, like, I can do it, I can do it," she says. "It would be great if there were some sort of data that says this is how much pressure and speed, a quantifier of some sort."

Kimura is leery of her reputation for experimentation superseding her skill as a violinist. "I hope you will see an everyday violinist, except that a computer participates in my performance," she says. Somehow, it is unlikely that her audiences will ever see her performances as commonplace.