Overview

Working with a community of teens involved in an after-school hip-hop education program, we explored how to increase their music's visibility and reach within their communities. A parallel goal was to raise awareness of the importance of music education initiatives, especially within underserved communities, and to encourage new students to join such programs.

Solution

We created a system called 412 Beats, which allows musicians to link audio content to images that can then be placed in urban spaces. The system leverages computer vision and AR to augment tagged images more naturally, allowing any user with a mobile camera and an internet connection to retrieve the original audio. With this project we ultimately aim to showcase how AR, mobile phones, and new interaction techniques can empower communities to place self-generated digital content into physical space.


Awards: Semifinalist, Adobe Design Achievement Award; Student Notable, Interaction Award, Core77.

Tools: Interviews, Pen & Paper, Sketch, InVision, HTML/CSS, JavaScript, Adobe Creative Suite, DSLR, mobile AR

Year: Ongoing

Team: Manjari Sahu, Agnes Yeong, Vikas Yadav (CMU Design)

Role: AR prototyping, video production, design ideation, mobile UX

Link: Research blog, GitHub code


User Research

How might we increase interest and awareness in music education programs?

Early in our project we decided to work with a community of Pittsburgh youth involved in Arts Greenhouse, a hip-hop music education program that serves Pittsburgh teens through music technology classes, music recording projects, hip-hop performances, and workshops on special topics relating to hip-hop. We sat in on recording sessions with many of the students, held formal and informal conversations with them, and interviewed several program directors.


The research uncovered a few important findings:

  • Music makes a huge impact on these teens' lives. Teens exposed to music programs stay in school at higher rates and tend to perform better academically. Students described the confidence and personal growth they gained from learning how to make their own music.
  • Pittsburgh has few physical spaces where community members can go to hear live music. Young musicians struggle to get their music out in an already crowded and competitive music market.

Pairing these two insights, the driving question for our research became: could we increase interest and awareness in music education programs by helping these musicians get noticed and discovered throughout the city of Pittsburgh?

Initial Ideation


Iteration & Prototyping


What is the relationship between physical space and the resulting user interaction with the system?


What should the look and feel of the system be like? What physical artifacts would provide the right affordances?


Is the scanning interaction intuitive to users? Is the browsing experience enjoyable?


Final Concept

Interactive Music Discovery using Mobile Augmented Reality


Although QR codes have already made users familiar with the act of scanning images to retrieve digital content, they usually function as one-way conversion tools, with a QR code acting as a URL shortcut. In the 412 Beats system anyone with a mobile camera can scan an image to retrieve a song by a local musician. Moving the phone camera over another image produces another song, which can be saved for later. The interaction is simple: with augmented reality, users explore songs just by moving their phones around in space. The system consists of three components: a website where musicians upload their songs and link them to specific images, a mobile application for scanning those images, and a physical installation that holds the images.
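At its core, the system is a mapping from visual markers to songs. The sketch below illustrates this idea with a hypothetical in-memory catalog; the field names, marker IDs, and the trackForMarker helper are illustrative, not the actual 412 Beats schema.

    // Hypothetical marker-to-track catalog (illustrative; not the actual schema).
    // Each visual marker ID resolves to one uploaded soundtrack and its metadata.
    const markerCatalog = {
      "marker-001": {
        title: "Example Track",            // metadata entered by the musician
        artist: "Arts Greenhouse student",
        audioUrl: "/audio/marker-001.mp3", // file stored by the web system
      },
      // ...one entry per image in the installation
    };

    // Scanning a marker then reduces to a single lookup:
    function trackForMarker(markerId) {
      return markerCatalog[markerId] || null;
    }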

1. Web system for musicians to upload songs

The first point of contact in the system is a website where musicians from Arts Greenhouse can upload soundtracks and link each one to a particular image, or visual marker, of their own choosing. After entering basic metadata for each song, musicians can also design how these images will be combined into a physical installation.
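As a rough sketch of this upload flow, assuming a Node/Express backend with multer for file handling (the project's actual server stack is not described here, and the /songs route is hypothetical), the endpoint might look like:

    // Hedged sketch of the upload flow, assuming Node/Express and multer;
    // the real 412 Beats backend may be organized differently.
    const express = require("express");
    const multer = require("multer"); // parses multipart/form-data uploads

    const app = express();
    const upload = multer({ dest: "uploads/" });
    const links = []; // in-memory store, for illustration only

    // A musician submits an audio file, basic metadata, and the marker image
    // the song should be linked to.
    app.post(
      "/songs",
      upload.fields([{ name: "audio", maxCount: 1 }, { name: "marker", maxCount: 1 }]),
      (req, res) => {
        const { title, artist } = req.body; // metadata from the form
        links.push({
          title,
          artist,
          audioPath: req.files["audio"][0].path,   // uploaded soundtrack
          markerPath: req.files["marker"][0].path, // chosen visual marker
        });
        res.sendStatus(201); // the AR app can later resolve marker -> song
      }
    );

    app.listen(3000);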

 
Musicians can upload individual songs, add metadata, and link each song to a visual marker.

They can then design physical art installations out of sets of visual markers (images).

2. Physical installation for scanning music

We combine sets of images (each linked to a unique soundtrack) to form a physical installation that passersby might encounter in public spaces. We purposefully kept the basic components of the installation simple: rhomboid-shaped images acting as visual markers. The idea is that ultimately musicians themselves might create personalized designs on the web system and physically set up the installation on their own.

 
Example installation in a public setting. Each image is linked to a unique audio track.

The installation can accommodate more than one user at once.

Image scanning can occur independently of the rotation of the phone.


3. Mobile augmented reality app for scanning music

A mobile AR application running in a web browser allows passersby to scan images within the physical installation to retrieve music. Once the app is open, a camera-based interface guides users to position their phones over a visual marker. When a marker is recognized, the soundtrack associated with that marker begins playing on the user's phone. Whereas a QR code would have redirected the user to a standard web page, AR lets us provide a more continuous experience: the soundtrack plays as long as the user's camera can recognize its corresponding marker, and simply moving the camera to another marker automatically plays a different song. The app also displays information about the artist and track, in case the user wants to learn more about the music being played.
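The core of this behavior is small: play a song while its marker is in view, pause when the marker is lost, and switch when a new marker appears. Below is a minimal sketch; the tracker here is a stand-in EventTarget (in the prototype, marker events come from Vuforia via Argon.js), the markerFound/markerLost event names are hypothetical, and trackForMarker is the hypothetical lookup sketched earlier.

    // Stand-in for the tracker: in the prototype, Vuforia (via Argon.js)
    // reports when a marker enters or leaves the camera view.
    const tracker = new EventTarget();
    const player = new Audio(); // HTML5 audio element for playback

    tracker.addEventListener("markerFound", (e) => {
      const track = trackForMarker(e.detail.markerId); // hypothetical lookup
      if (track && !player.src.endsWith(track.audioUrl)) {
        player.src = track.audioUrl; // moving to a new marker switches songs
      }
      player.play();
    });

    tracker.addEventListener("markerLost", () => {
      player.pause(); // a song plays only while its marker stays in view
    });

    // Example: simulate a marker sighting.
    tracker.dispatchEvent(
      new CustomEvent("markerFound", { detail: { markerId: "marker-001" } })
    );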

 

A frame gently guides the user to align the phone with an image marker.


The user aligns the phone with an image successfully.

An animation indicates the image has been correctly identified. The image's linked soundtrack begins playing automatically.


Users can choose to save or export the soundtrack, or find more information about the musician behind the music.


Evaluating the Design

To test the design and receive feedback from our users, we implemented a technical prototype. Importantly, we relied on Argon.js, a JavaScript framework developed at Georgia Tech for adding augmented reality content to web applications. Although today one has to download a native Argon application that functions as a browser, the expectation is that in the near future AR will become a feature of standard mobile browsers (Chrome, Safari, IE). We purposefully used HTML, CSS, and JavaScript, the basic building blocks of the web, so audience users would not have to download an application before experiencing the installation; they simply navigate to a website in their mobile browser. The actual image recognition is implemented using Vuforia, a computer vision SDK for target recognition.
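As a rough illustration of how these pieces connect, the sketch below follows the general shape of the published Argon.js Vuforia samples; the exact method names, promise chain, and option fields are our approximation and may differ across Argon.js versions. LICENSE and "targets.xml" are placeholders for the Vuforia license key and target dataset.

    // Approximate Argon.js + Vuforia wiring, modeled on the Argon.js samples;
    // method names and options are assumptions and may vary by version.
    const app = Argon.init();

    app.vuforia.isAvailable().then((available) => {
      if (!available) return; // fall back to a non-AR page
      app.vuforia.init({ encryptedLicenseData: LICENSE }).then((api) => {
        // "targets.xml" stands in for the Vuforia dataset that describes
        // the installation's visual markers.
        api.objectTracker.createDataSet("targets.xml").then((dataSet) => {
          dataSet.load().then(() => {
            // One trackable per marker image; recognition events on these
            // drive the play/pause behavior described above.
            const trackables = dataSet.getTrackables();
          });
          api.objectTracker.activateDataSet(dataSet);
        });
      });
    });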

While everyone we spoke with understood the functionality of the system simply by interacting with the prototype, reactions to the rhomboid frame were mixed. For one group of users, the frame suggested that the phone could be moved and oriented freely using only one hand, an important consideration for mobile usage. For another group, the frame was confusing because, being familiar with scanning QR codes, they expected a rectangular or square shape. Thus, a goal for our next design iteration is to rethink and more rigorously test the form of the frame and the corresponding visual markers.