Always work in progress, excuse the duct-tape
iOS metronome app Tempobot is live on the App Store.
Read about Rhythm in the Brain, an ArtPrize research-art installation I helped develop.
To examine visual feedback in multi-touch interaction design, I built a visual programming environment for multi-touch tablets that lets users assemble interfaces directly on the device. Multiple visual feedback paradigms are implemented on top of a common core visual vocabulary, consisting of visual entities such as regions, links, and containers.
The environment can be used to rapidly construct musical instruments. Sample tasks were designed around it and used in a human-subject usability study; in these tasks, participants built a musical keyboard and a multi-touch mixer instrument.
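The core vocabulary is easiest to see as data. Below is a minimal sketch in Python of how regions, links, and containers might compose a patch (the actual environment is written in Lua on top of urMus, so all names here are illustrative, not the real API):

```python
# Illustrative data model for the environment's core visual vocabulary.
# The real implementation is Lua on urMus; these classes are invented
# for this sketch and do not reflect the actual API.

class Region:
    """A touchable rectangle that produces or consumes events."""
    def __init__(self, name, kind):
        self.name = name    # label shown on the tablet
        self.kind = kind    # e.g. "key", "fader", "sound"
        self.outputs = []   # links originating from this region

class Link:
    """A directed connection carrying events between two regions."""
    def __init__(self, source, target):
        self.source, self.target = source, target
        source.outputs.append(self)

class Container:
    """Groups regions so a sub-patch can be moved or reused as a unit."""
    def __init__(self, name, regions):
        self.name = name
        self.regions = list(regions)

# Assembling a one-key "keyboard": a touch region linked to a sound region.
key = Region("C4", "key")
sine = Region("sine", "sound")
Link(key, sine)
keyboard = Container("keyboard", [key, sine])
print([link.target.name for link in key.outputs])  # -> ['sine']
```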
Videos: Task 1, Task 2
Tools: urMus, Lua, iOS
Representation-Plurality in Multi-Touch Mobile Visual Programming for Music, Qi Yang, Georg Essl, NIME 2015
Using a Kinect depth sensor, we augmented a traditional keyboard instrument with a 3D gesture space above the keys; top-down projection provides visual feedback at the site of the gesture interaction.
This interaction model enabled us to explore a range of visualizations for the gesture space.
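At its core, the gesture tracking thresholds the depth image to keep only what hovers above the keyboard plane, then tracks the resulting blobs. A rough Python/OpenCV sketch of that step (the project itself used Processing and OpenFrameworks; the depth frame and threshold below are synthetic stand-ins):

```python
import numpy as np
import cv2

# Synthetic 8-bit depth frame standing in for a Kinect capture:
# larger values mean closer to the camera. A real pipeline would
# read frames from the Kinect driver instead.
depth = np.zeros((480, 640), dtype=np.uint8)
depth[200:260, 300:380] = 200  # a "hand" hovering above the keys

# Keep only pixels closer than the keyboard plane (value is made up).
KEYBOARD_PLANE = 150
_, above_keys = cv2.threshold(depth, KEYBOARD_PLANE, 255, cv2.THRESH_BINARY)

# Each connected blob is a candidate hand; its centroid can drive both
# the sound mapping and the top-down projected feedback.
contours, _ = cv2.findContours(above_keys, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    m = cv2.moments(c)
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(f"hand centroid at ({cx:.0f}, {cy:.0f})")
```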
Tools: Kinect, Processing, OpenFrameworks, OpenCV
Evaluating Gesture-Augmented Piano Performance, Qi Yang, Georg Essl, CMJ 2014, PDF
Visual Associations in Augmented Keyboard Performance, Qi Yang, Georg Essl, NIME 2013
Augmented Piano Performance using a Depth Camera, Qi Yang, Georg Essl, NIME 2012
Native mobile interface design and implementation in Swift on iOS. Metro is a reinterpretation of a familiar physical artifact, the metronome, for a touch- and gesture-driven mobile interface. The visual look-and-feel and interaction experience are designed to be native to the modern touch UI vocabulary while still retaining their own character. At the same time, selective use of skeuomorphism recalls the tactile experience of a mechanical metronome without resorting to gratuitous textures.
Interface Demo:
Now Available on the App Store
Tools: iOS, Swift, Sketch
In collaboration with Sang Won Lee, I designed the user interface and product concepts for a web-based writing application and a companion mobile app. The application supports timed playback of the writing process as well as rich text-based expression; the mobile app captures nuanced typing gestures to enrich texting-like communication.
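Timed playback reduces to storing each edit with its timing and replaying the deltas. A minimal sketch in Python (the product itself is a web application; this event format is invented purely for illustration):

```python
import time

# Each event is (seconds since the previous event, text inserted).
# A real recorder would also capture deletions and cursor movement;
# this format is invented for illustration only.
recording = [
    (0.0, "Dear"),
    (0.6, " reader"),
    (1.2, ","),
    (0.4, " hello!"),
]

def replay(events, speed=1.0):
    """Re-render the text with the writer's original timing."""
    text = ""
    for delay, insert in events:
        time.sleep(delay / speed)
        text += insert
        print(text)

replay(recording, speed=2.0)  # play back at double speed
```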
Visualizing the contact network between six dormitories, by hour of day
As part of the ExFlu study by the University of Michigan School of Public Health, I cleaned and analyzed multi-sensor data collected from 100 phones over three months, including Bluetooth and Wi-Fi contact events, accelerometer readings, and battery state. Between-phone Bluetooth contacts are used to visualize social contact between study participants.
I also coordinated the collection of GPS positions for local Wi-Fi access points, which let me localize and visualize participants' on-campus activity and hot spots.
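The dorm-by-hour view above boils down to counting Bluetooth contact events per (dormitory, hour-of-day) bucket. A sketch in Python with matplotlib (the toy records and their layout are invented; the real pipeline read from MS SQL):

```python
from datetime import datetime
import numpy as np
import matplotlib.pyplot as plt

# Toy contact records: (timestamp, dorm of phone A, dorm of phone B).
# The real data lived in MS SQL; these rows are invented.
contacts = [
    (datetime(2013, 1, 14, 9, 5), 0, 1),
    (datetime(2013, 1, 14, 9, 40), 0, 1),
    (datetime(2013, 1, 14, 18, 10), 2, 5),
]

DORMS = 6
by_hour = np.zeros((DORMS, 24))
for ts, a, b in contacts:
    by_hour[a, ts.hour] += 1  # credit the contact to both dorms
    by_hour[b, ts.hour] += 1

plt.imshow(by_hour, aspect="auto", cmap="viridis")
plt.xlabel("hour of day")
plt.ylabel("dormitory")
plt.colorbar(label="contact events")
plt.savefig("contacts_by_hour.png")
```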
Tools: Python, MS SQL, KML, matplotlib
As part of the web development team at Harvest Mission Community Church, I develop the organization's website. I also lead the upcoming redesign of the nonprofit's web presence.
An example work in progress: we designed a card interface for the events calendar, where each card represents one event and carries the actions that can be performed on it (e.g. sign-up links, sharing). The cards can be stacked in a list for an agenda view or reused as an in-page popup. (In collaboration with Bo Zhu, Valerie Maldonado, and SJ Ahn)
See prototype in HTML
Tools: PHP, HTML+CSS, Sketch
For a graduate computer security course project, we created a mobile phishing attack and a proof-of-concept defense against it. The demonstration attack targets the iOS mobile browser, mimicking the native application user interface closely enough to pass casual observation. The defense monitors keyboard input to intercept potentially sensitive information before it reaches the attacker.
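The defense idea fits in a few lines. A conceptual sketch in Python (the actual proof of concept hooked the iOS keyboard; the credential store, trusted-host list, and matching rule below are all invented):

```python
# Conceptual sketch of the keystroke-monitoring defense. The real
# proof of concept ran on iOS; everything here (credential store,
# trusted-host list, matching rule) is invented for illustration.

SAVED_SECRETS = {"hunter2-bankpw"}      # credentials worth protecting
TRUSTED_HOSTS = {"bank.example.com"}    # contexts where typing them is fine

def monitor(keystrokes, active_host):
    """Flag typing that prefixes a saved secret on an untrusted page."""
    buffer = ""
    for key in keystrokes:
        buffer += key
        if active_host not in TRUSTED_HOSTS and len(buffer) >= 4:
            if any(secret.startswith(buffer) for secret in SAVED_SECRETS):
                return f"blocked: possible credential entry on {active_host}"
    return "ok"

print(monitor("hunt", "phish.example.net"))  # -> blocked: ...
print(monitor("hunt", "bank.example.com"))   # -> ok
```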