SXSW: Cognitive Brews

My team was presented with the unique opportunity to build an exhibit at SXSW in March of 2016. We partnered with the IBM Design Studio to present an experience at their Design Hive Party. 🍻

2016 » Role: Lead UX Researcher

TL;DR

We turned a failed project into an awesome cognitive beer tasting experience for South by Southwest (SXSW) that has since traveled the world. We did research every step of the way, which is why it’s been so successful.

Challenge

We needed an experience that drew in millennials but also showcased some of our innovative technology. So what better topic to use than craft beer?

We created a craft brew tasting experience powered by IBM’s Watson machine learning. The experience began with the taster answering a series of colloquial questions about food preferences (What’s your favorite berry?, etc.). From this information, Watson recommended the three local craft brews the taster was most likely to enjoy. As the beers were poured and tasted one by one, object recognition triggered the display of blind tasting information on a large screen behind the bartender. The taster then arranged the beers on the bar top in order of preference. During the final reveal, the name and details of the top beer were unveiled to the taster, along with similar local craft brews they might like. The taster’s rankings were automatically captured by the same object recognition technology, feeding back into Watson to improve our recommendations over time.
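To make the flow concrete, here’s a minimal Python sketch of how a quiz-to-recommendation loop like this could work. The flavor axes, beer profiles, and scoring are all made up for illustration; the real experience used Watson’s models, not this toy math.

```python
# Toy sketch of the quiz -> recommendation -> feedback loop.
# All axes, beers, and weights below are invented for illustration.

FLAVOR_AXES = ["sweet", "bitter", "fruity", "roasty"]

# Hypothetical flavor profiles for a few local brews, scored 0-1 per axis.
BEERS = {
    "Amber Ale":  {"sweet": 0.5, "bitter": 0.4, "fruity": 0.3, "roasty": 0.4},
    "IPA":        {"sweet": 0.2, "bitter": 0.9, "fruity": 0.6, "roasty": 0.1},
    "Stout":      {"sweet": 0.6, "bitter": 0.5, "fruity": 0.1, "roasty": 0.9},
    "Wheat Beer": {"sweet": 0.6, "bitter": 0.2, "fruity": 0.8, "roasty": 0.0},
}

def taster_profile(quiz_answers):
    """Map colloquial quiz answers onto the same flavor axes."""
    profile = {axis: 0.5 for axis in FLAVOR_AXES}
    if quiz_answers.get("favorite_berry") in ("raspberry", "strawberry"):
        profile["fruity"] += 0.3
    if quiz_answers.get("coffee_drinker"):
        profile["bitter"] += 0.2
        profile["roasty"] += 0.3
    return profile

def recommend(profile, n=3):
    """Rank beers by closeness to the taster's profile (smaller distance = better)."""
    def distance(beer):
        return sum((profile[a] - BEERS[beer][a]) ** 2 for a in FLAVOR_AXES)
    return sorted(BEERS, key=distance)[:n]

def record_feedback(profile, ranked_beers, rate=0.1):
    """Nudge future weights toward the taster's top pick, away from their last."""
    top, last = BEERS[ranked_beers[0]], BEERS[ranked_beers[-1]]
    for a in FLAVOR_AXES:
        profile[a] += rate * (top[a] - last[a])
    return profile

if __name__ == "__main__":
    profile = taster_profile({"favorite_berry": "raspberry", "coffee_drinker": True})
    picks = recommend(profile)
    print("Pouring:", picks)
    # Suppose the taster ranked them in this order at the bar top:
    profile = record_feedback(profile, picks)
```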

 


Let's get into the nitty-gritty.

How did we actually get to this point? Through lots and lots of usability testing.

The goal of this project was to create an innovative and interactive retail experience. We were given two constraints: the experience needed to incorporate RFID technology and it needed to be in the retail space. After doing some market analysis, our Offering Manager saw an opportunity for this type of project to exist in gourmet food and wine shops.

Knowing the technology constraints, we needed to make sure the experience paid close attention to human interactions. We wanted to learn how people interacted with objects in a store, and also how they could interact with the Apple TV 2 at the same time.

We wanted to see if we could incorporate RFID technology into our solution, so our team built a retail experience prototype where the screen changed based on which item the shopper was holding. It included item recommendations that helped retailers cross-sell and upsell products.
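For a feel of the prototype’s basic logic, here’s a minimal Python sketch. It assumes an RFID reader that emits each tag ID as a line of text (many inexpensive USB readers behave like keyboards); the tag IDs and product data are invented for illustration.

```python
# Hypothetical sketch: swap the on-screen content to match the item
# the shopper picked up. Tag IDs and products are made up.

PRODUCTS = {
    "04A1B2C3": {"name": "Aged Gouda", "pairs_with": ["Fig Jam", "Sourdough"]},
    "04D4E5F6": {"name": "Pinot Noir", "pairs_with": ["Brie", "Dark Chocolate"]},
}

def show_product(tag_id):
    """Update the display for the scanned item, with cross-sell suggestions."""
    product = PRODUCTS.get(tag_id)
    if product is None:
        print("Welcome! Pick up an item to learn more.")
        return
    print(f"Now viewing: {product['name']}")
    print("You might also like: " + ", ".join(product["pairs_with"]))

if __name__ == "__main__":
    # Each scan arrives as a line on stdin; Ctrl+D (or Ctrl+Z) to quit.
    try:
        while True:
            show_product(input().strip())
    except EOFError:
        pass
```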

"At this point I wouldn't want to use this anymore."

"At this point I wouldn't want to use this anymore."

Pivot!

As we started testing the prototype, we quickly realized the Apple TV was not the right solution. Because of how the remote was designed, users could not figure out how to use it; for example, they had a really hard time understanding how to go back to the previous screen. It was a hardware issue, not an issue with our design.

So we decided to scrap it and go back to the drawing board. How could we have users interact with the TV screen without touching it? We explored other options, including the Microsoft Kinect and Leap Motion. The Kinect didn’t pick up detailed hand gestures (and you would need a hand free from the item), and users felt self-conscious making large movements in a public space. The Leap Motion sensor would have picked up small hand gestures properly, but users would have a hard time finding the area where they were supposed to interact with it since the sensor is so small.

Cue QR technology! Along with the developers, we researched QR codes as a technology option and thought about attaching them to the cups somehow. We chose to put QR code stickers on the bottoms of disposable Solo cups, both for sanitary reasons and to keep costs down. The developers hid a camera beneath the clear plexiglass bar top that read the QR codes and knew exactly what position each cup was in. The designers created multiple versions of bar tops where you ranked each beer you tasted. We made paper prototypes of each design version and tested them so the experience would be as intuitive as possible.
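Here’s a rough Python sketch of what that under-bar camera loop could look like, using OpenCV’s built-in QR detector. The slot math, cup codes, and capture setup are illustrative assumptions, not our production code.

```python
# Sketch: read cup QR codes through the plexiglass and infer their
# left-to-right ranking slot from where they sit in the frame.
import time
import cv2

detector = cv2.QRCodeDetector()
NUM_SLOTS = 3  # ranking positions across the bar top, left to right

def read_rankings(frame):
    """Decode every visible cup QR code and map it to a ranking slot."""
    rankings = {}
    found, codes, points, _ = detector.detectAndDecodeMulti(frame)
    if not found:
        return rankings
    frame_width = frame.shape[1]
    for code, corners in zip(codes, points):
        if not code:
            continue  # detected but couldn't be decoded this frame
        center_x = corners[:, 0].mean()  # mean x of the four corners
        slot = min(int(center_x / frame_width * NUM_SLOTS), NUM_SLOTS - 1)
        rankings[code] = slot
    return rankings

cap = cv2.VideoCapture(0)  # the camera hidden under the bar top
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(read_rankings(frame))  # e.g. {'beer-042': 0, 'beer-017': 2}
    time.sleep(0.5)
cap.release()
```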

We had to test every user interaction point of the experience: picking up the cups, the ranking system, whether tasters understood the beer information on the screen, appropriate timing, sound effects, lighting, and more. We wanted to ensure that we could create the best “physical meets digital” experience.

We posted flyers in the elevator lobbies of our building to recruit users for testing. Since the flyers said "beer" on them, there was no issue finding participants. We held several testing sessions over a couple of weeks where users would go through the tasting experience, give us feedback on the surface design, or stand in line so we could test timing.
Overall, we did card sorting, paper prototypes, surveys, a literature review, and moderated experience testing.

The final TV screen design shown to the user

The final iPad app design for the bartender

 

Phase II

The experience did so well at SXSW that we got a lot of requests to send it to other events. Setting up and breaking down the beer tasting stands was a logistical nightmare, so we set out to create a portable version. The challenge was to keep the immersive experience intact while making the hardware and setup smaller. After doing a lot of technical research into RFID, QR codes, cup holders, and weight sensors, we ended up with an infrared-sensitive beer flight. We conducted multiple rounds of user testing, from cardboard prototypes to the actual wired 3D-printed flight, to ensure the experience was intuitive and translated well from the beer stands. We wrote and tested an instructional PDF and packed it all up in a Pelican case to be shipped to clients.
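As a loose illustration only: polling a sensor-equipped flight might look something like this on a Raspberry Pi. The parts, wiring, and pin numbers are assumptions for the sketch, not the actual build.

```python
# Hypothetical sketch: watch IR sensors under each flight slot and react
# when a cup is lifted or set back down. Pin assignments are invented.
import time
import RPi.GPIO as GPIO

SLOT_PINS = {1: 17, 2: 27, 3: 22}  # illustrative GPIO pins, one per slot

GPIO.setmode(GPIO.BCM)
for pin in SLOT_PINS.values():
    # Assumes a sensor that reads LOW when a cup sits above it.
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def occupied_slots():
    """Return which flight slots currently hold a cup."""
    return [slot for slot, pin in SLOT_PINS.items() if GPIO.input(pin) == GPIO.LOW]

try:
    previous = []
    while True:
        current = occupied_slots()
        if current != previous:  # only react when a cup moves
            print("Cups in slots:", current)
            previous = current
        time.sleep(0.1)
finally:
    GPIO.cleanup()
```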

It has now traveled around the world, including Amsterdam, NYC, Vegas, France, and more. We even got attention from beer companies and were asked to send it to other IBM events.