John Granzow

Associate Professor
School of Music, Theatre & Dance
Faculty Director, ArtsEngine
University of Michigan
jgranzow [at] umich [dot] edu

Ph.D. Dissertation


My doctoral dissertation, Digital Fabrication for Musical Applications, is available through the Stanford Library at this permanent URL.

Stanford Library : dissertation (.pdf)

Prototyping NIMES in VR

We conducted a full-day workshop at the New Interfaces for Musical Expression (NIME) conference in Mexico City, June 1-4, 2023, which I co-instructed with Anıl Çamcı. Participants designed interactive possibilities with virtual instruments prior to fabricating their physical counterparts, using both virtual and physical sensors. Workshop Proposal

Visualizing Telematic Music Performance

With Michael Gurevich, I create kinetic, mechatronic displays of human movement in telematic chamber music performances. Through a series of design experiments and public workshop performances, we are attempting to efficiently sense, encode, transmit, and display relevant movement features using three-dimensional mechatronic displays, supporting musicians in disparate geographical locations performing over high-quality audio streamed across the Internet. Project Page

Room Modal Response using Auralization

Alvin Lucier's famous piece, I Am Sitting in a Room, inspired a method for exploring room modes in virtual acoustics. Alaa Algargoosh conducted this research, published in Applied Acoustics. Paper

Capturing Kinetic Wave Demonstrations for Sound Control

I am working with Matias Vilaplana and Anıl Çamcı to capture kinetic wave demonstration sculptures invented for acoustic visualization and to deploy their motion for sound control. We are using a 16-camera Qualisys motion capture system to record the dynamics of these physical wave demonstrations. So far we have built a Shive wave machine, captured its wave patterns, and sonified the waveform using MaxMSP. In this research, educational acoustic demonstrations are recast as musical controllers through motion capture. project page
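One common way to turn captured motion into sound control is a simple range mapping from marker displacement to oscillator frequency. The sketch below is mine, not the project's MaxMSP patch; the ranges and function names are illustrative assumptions.

```python
# Illustrative sketch (not the actual MaxMSP patch): map a motion-capture
# marker's displacement onto an oscillator frequency. All ranges here
# (0.1 m travel, 110-880 Hz) are assumed for demonstration only.

def displacement_to_freq(x: float, x_max: float = 0.1,
                         f_lo: float = 110.0, f_hi: float = 880.0) -> float:
    """Linearly map displacement in [-x_max, x_max] metres to [f_lo, f_hi] Hz."""
    t = (x + x_max) / (2 * x_max)   # normalize to [0, 1]
    t = min(max(t, 0.0), 1.0)       # clamp noisy out-of-range samples
    return f_lo + t * (f_hi - f_lo)

print(displacement_to_freq(0.0))    # centre of travel -> 495.0 Hz
```

Streaming each frame's marker position through such a mapping is the basic gesture-to-sound pipeline; in practice the mapping is often nonlinear or smoothed to tame capture jitter.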

Hyperreal Instruments

I continue to work on Hyperreal Instruments with Anıl Çamcı. In this article, we look at how technology has facilitated the materialization of impossible instruments from the twentieth century on. We then discuss the bridging of VR and fabrication as a new frontier in instrument design, where synthetic sounds can be used to condition an equally synthetic sensory scaffolding upon which time-varying spectra can be interactively anchored. The result is new instruments that defy our sense of audiovisual reality while satisfying our proprioceptive and haptic expectations. paper

Utopia Swim Club

We started the Utopia Swim Club, an artist collective with Christian Sandvig, Sophia Brueckner, Catie Newell, and William Calvo-Quiros. We come from schools across the University, including the School of Information, the Stamps School of Art and Design, Architecture, American Studies, and the School of Music, Theatre & Dance. The collective was established through the generous support of the Humanities Collaboratory. Our practice is to place AI into unlikely scenarios to foreground the shifting attributions we make to agents that deliver our data back to us, laden with motives to have the world unfold in a certain way. project page

Digital Fabrication for Acoustics

I presented an update to my Digital Fabrication for Acoustics curriculum at the Acoustical Society of America conference in San Diego (December 2019). The increasing presence of maker spaces in academic settings provides opportunities to study acoustics through digital fabrication, coupling theory to physical play in a well-established tradition of making in acoustics education. This research explores how we can make sounding objects and acoustic demonstrations, test them against our numeric predictions, and apply this experimentation to creative endeavors in digital lutherie and sound art. Slides
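As a taste of the kind of numeric prediction one can test against a fabricated object, here is the textbook Helmholtz resonator formula applied to a 3D-printable geometry. This is my own hedged example, not taken from the slides; the dimensions are arbitrary and end corrections are ignored.

```python
import math

# Hedged example (mine, not from the curriculum): predicted resonance of
# a simple Helmholtz resonator, f = (c / 2*pi) * sqrt(A / (V * L)).
# Dimensions are arbitrary; neck end corrections are ignored.

def helmholtz_hz(neck_area_m2: float, cavity_vol_m3: float,
                 neck_len_m: float, c: float = 343.0) -> float:
    """Resonance frequency in Hz for neck area A, cavity volume V, neck length L."""
    return (c / (2 * math.pi)) * math.sqrt(neck_area_m2 / (cavity_vol_m3 * neck_len_m))

# 1 cm^2 neck, 1 L cavity, 5 cm neck
f = helmholtz_hz(1e-4, 1e-3, 0.05)
print(round(f))  # ~77 Hz
```

Printing the resonator, striking or blowing across its neck, and comparing the measured tone against this prediction (then refining with end corrections) is exactly the theory-to-object loop the curriculum describes.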

Embedded DSP workshop, Stanford

In this annual workshop held at Stanford, participants learn how to program microcontrollers with the Faust programming language for low-latency real-time audio Digital Signal Processing (DSP). Final projects consisted of hardware for musical applications such as digital guitar pedal effects and synthesizer modules. The Teensy 3.6 board was used as the main development platform. Its ARM Cortex-M4 microcontroller provides plenty of processing power to implement advanced DSP algorithms (e.g., feedback delay networks, physical models, band-limited oscillators, filter banks, etc.). Also, its various analog and digital inputs can be used for sensor acquisition. The lack of an operating system allows for very small block sizes (e.g., 8 samples), offering extremely low audio latency. project page
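The latency benefit of small block sizes is simple arithmetic: a block cannot be output until it is filled, so block duration sets a floor on latency. The sketch below is my own illustration of that arithmetic, not workshop material.

```python
# Illustrative arithmetic (mine, not from the workshop): the time needed
# to fill one audio block is a hard lower bound on processing latency.

def block_latency_ms(block_size: int, sample_rate: int = 44100) -> float:
    """Duration of one audio block, in milliseconds."""
    return 1000.0 * block_size / sample_rate

# An 8-sample block at 44.1 kHz fills in ~0.18 ms, versus ~5.8 ms for a
# typical 256-sample block on a general-purpose operating system.
print(round(block_latency_ms(8), 2))    # 0.18
print(round(block_latency_ms(256), 2))  # 5.8
```

Real round-trip latency also includes converter and buffering overheads, but the per-block figure explains why a bare-metal microcontroller running 8-sample blocks feels essentially instantaneous.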

Michigan Music Conference

With Joo Won Park, I presented at the Michigan Music Conference. In our talk, Majoring in Music Technology: An Introduction to Undergraduate Music Technology Programs in Michigan, we presented similarities and differences between the undergraduate music technology majors offered by four-year universities and community colleges in Michigan. The session also provided tips on the application procedure and portfolio preparation. slides

Axes, Sound and Music Computing Conference, Sala Unicaja de Conciertos María Cristina, Málaga

Luthiers use computer-controlled mills for the subtractive manufacture of guitar components. These machines have multiple motors stepping at variable rates to propel cutting tools along three-dimensional paths with corresponding pitch contours. Axes is a work that brings these live robotic sounds of modern guitar making into the concert space. For this piece, stepper motors are fixed to the neck, body, and soundboard of an unassembled guitar. The motors are driven in concert as the x, y, and z axes of a toolpath derived from a digital model of the instrument. The pitched and noisy motors are filtered acoustically through the guitar's components and captured via transducers to become the source for subtractive synthesis. The actuation and vibration also make the guitar components mildly kinetic. MaxMSP is used for additional processing and to generate nebulous quotations from the emerging guitar's future/past repertoire, producing a collage of fine motor skills, both machine and human. Axes is a multichannel work that can be adapted to the channel count of the space. project page

Unrecordables

On November 30th, 2018, artists and researchers from the University of Michigan, Michigan State University, and Wayne State University held a concert and a symposium on live electronic music. The event featured technologies developed for performances and installations rather than those designed to improve or facilitate recording and mixing. During the symposium, the presenters discussed how recent developments and research in performance technology are changing the way we present, experience, and think about music. We also shared our artistic ideas through these technologies in the evening concert. video

String Section

String Section is a collaborative sound installation and performance system created with Catie Newell (Taubman College of Architecture and Urban Planning) and Kim Harty (head of glass at the College for Creative Studies in Detroit). project page

A Carillon Lab for The 21st Century

Throughout the Bicentennial year, the campus heard and shared in our efforts to forge new paths in carillon performance, bell studies, and public musical engagement. Traditionally, carillons have been relatively isolated and elitist institutional symbols, associated with centuries past. Our projects involved the public, students, and scholars in reimagining how the sounds of our campus bells connect the present-day soundscape not just to two centuries of campus history, but also to inclusive, inventive, interdisciplinary new directions. full report

Vox Voxel, National Theatre, Taipei

Vox Voxel, created in collaboration with Fernando Lopez-Lezcano, was included in the Concert of Machines (5 concerts) at the National Theatre in Taipei, Taiwan, in September 2018. Vox Voxel is "composed" by designing a suitably useless 3D shape and capturing the sound of the working 3D printer using sensors. Those sounds are amplified, modified, and multiplied through live processing on a computer using Ardour and LV2/LADSPA plugins, and output in fully matching 3D sound: 3D pixels in space. project page