
Harman Autonomous Vehicle Control

Designing a collaborative control for human and artificial intelligence

Prompt

Harman International, a Samsung company, engaged the Carnegie Mellon Human-Computer Interaction Institute to improve the user experience of autonomous vehicles.

My Role

· Lead Interaction Designer

· Hand-sketched interaction design concepts and proposed them to the team.

· Created high-fidelity interactions in Framer.

· Designed Wizard of Oz driving experiment.

*A greater focus on my individual contributions to this project can be found in the next case study.

Methods

Wizard of Oz Experiment that simulated the experience of being in an autonomous vehicle by obscuring the driver. 

User Interviews  with Tesla Owners and car enthusiasts.

Affinity Diagramming  revealed patterns in our data and actionable insights.

Creative Matrix  comparing Harman's technical capabilities with the areas of focus found in our affinity diagram.

BodyStorming  led to the testing of ergonomics and interactions for the in-vehicle context.

Driving Simulator + Usability Testing  were used together to test new interaction concepts.

The Solution

· A UX concept in which the vehicle cockpit is a bridge for collaboration between the vehicle's AI and the control of the human driver.

· A curved screen interface that embodies the two-sided nature of joint decision making with the vehicle.

· Interactions designed for a curved screen that empower the user with agency at key moments throughout the drive.

The Prototype

The team created a curved-surface interface, using rear projection and hand tracking, that manipulated a driving simulator in real time.

The prototype was used to validate our interaction design concepts in the highest fidelity possible. A video of the prototype is below.

Video: Harman x CMU prototype.
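
The pipeline behind that prototype can be sketched at a high level: read a hand-tracking sample, update the rear-projected interface, and forward the resulting control value to the driving simulator on every frame. The following is a minimal, hypothetical TypeScript sketch under those assumptions; every name in it (HandSample, HandTracker, CurvedScreenUI, SimulatorBridge) is illustrative, not the project's actual code.

// Hypothetical sketch of the prototype pipeline, not the actual project code.
interface HandSample {
  x: number;         // horizontal position over the curved surface, normalized 0..1
  y: number;         // vertical position, normalized 0..1
  touching: boolean; // whether the hand is in contact with the surface
}

interface HandTracker { read(): HandSample }                    // assumed tracking source
interface CurvedScreenUI { render(value: number): void }        // rear-projected interface
interface SimulatorBridge { setThrottle(value: number): void }  // driving-simulator input

function tick(tracker: HandTracker, ui: CurvedScreenUI, sim: SimulatorBridge): void {
  const sample = tracker.read();
  if (!sample.touching) return;    // react only while the hand is on the surface
  const value = sample.y * 2 - 1;  // map vertical position to -1 (decelerate) .. 1 (accelerate)
  ui.render(value);                // immediate visual feedback on the curved screen
  sim.setThrottle(value);          // drive the simulator with the same value
}

// Run at the simulator's frame rate, e.g.:
// setInterval(() => tick(tracker, ui, sim), 1000 / 60);

The essential property the sketch tries to capture is the tight loop between the gesture, the on-screen feedback, and the simulator's response.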

Design for Semi-Autonomy

The Folk Model of Self-Driving Vehicles

Through research we discovered a common, oversimplified view of autonomy - a folk model. The misconception is that autonomy is binary.

In other words, there is only manual driving and full autonomy - and nothing in between.

A common misconception is that driving is either fully automated or fully manual.

Self-Driving Today Requires Sustained Vigilance

In reality, autonomous driving is somewhere in the middle: some of the mental work is lifted from the human driver, but they still have to supervise the vehicle and be prepared to take back control.

Humans perform poorly at tasks requiring sustained vigilance.

Gradually Achieving Full Autonomy

Technology is gradually shifting towards vehicles that can handle every possible situation without the need for human supervision or intervention.

That shift is occurring over time, as AI masters more and more of the tasks involved in driving.

Artificial intelligence is still progressing towards enabling full autonomy.

Human vs. AI Responsibility Over Time

Where We Are Today: Tesla AutoPilot

The Next 5-10 Years - Our Target

Our Problem Space: Levels 2-4

Through secondary research and discussions with experts, we concluded that there was an opportunity to design for semi-autonomous vehicles over the next 5-10 years.

OEMs' investments in teleoperations speak to the difficulty of achieving full autonomy.

Collecting Data

Dealing With an Open-Ended Problem

Our prompt put very few constraints on the scope of our efforts. Grouping the questions we had about designing for AVs made a seemingly infinite domain much more approachable.

Listing, then grouping, our questions about AVs yielded areas for further research.

User Interviews: Tesla Drivers and Car Enthusiasts

The closest thing available to a user of a semi-autonomous vehicle was a Tesla owner who engages AutoPilot.

At the same time, car enthusiasts helped us understand the aspects of the driving experience with which autonomy could actually interfere, and those it could enhance.

Our question categories informed the focus of our interviews with Tesla drivers.

Wizard of Oz Study

Our Wizard of Oz study sought to probe attitudes about self-driving while identifying specific use cases for which to design.

The study followed a strict protocol from start to finish while recording both the road and the participant.

Most participants believed they were in a self-driving vehicle.

Experiment Setup

Experiment Protocol: Capturing Driving Use Cases

Data into Insights

Affinity Diagramming to Find Patterns

With the luxury of a lengthy discovery period, the team was able to build an affinity diagram together from all of our data.

The process of sifting through the data as a team was as valuable as the groupings that we identified in the data.

We coded interview and Wizard of Oz responses into individual data points.

Areas of Exploration

By jointly analyzing the data bottom-up, the team naturally arrived at six areas we believed could meaningfully impact the user experience of autonomous vehicles.

The most salient patterns in our data became "how might we" statements that merited further exploration.

Brainstorming and Validating

Generating Concepts

We next created a matrix in which Harman's technologies intersected with our areas of exploration. Those ideas were further explored and validated through rough prototypes tested with users.

Creative matrix, visioning, and bodystorming exercises generated new concepts.

Narrowing Through User Testing

We employed rough paper prototypes and laptops playing videos of the road to get rapid feedback on our many prototype ideas and narrow them down to a smaller set.

We tested prototypes with users to address our areas of exploration.


Final Direction

A New Interface for Humans and AI

The solution we narrowed towards is an overarching concept of collaborative AI/human control that will become increasingly important as autonomy improves across industries.

The interface we designed following our exploratory research is a control that would live within the larger collaborative environment of a semi-autonomous vehicle.

Focusing in on this concept was the start of an iterative prototyping and user-validation process.

Our curved interface is one incarnation in a larger system of Human-AI Interaction.


Final Prototype and Interactions

The prototype allowed us to test our system of interactions and refine them over several iterations.

Fidelity was gradually increased from paper, to an iPad, and finally to live interactions on a curved surface that manipulated a driving simulator.

A case study detailing the interaction patterns can be found here.

We designed a system of interactions for driving a self-driving car and for understanding the black box of the vehicle's AI. 


Accelerate and Decelerate 

*I was responsible for the interaction design and coding of the prototypes below, but did not create the visual designs.

Human Input
continuous.gif
Vehicle/AI Feedback
accel.gif

In Lane Adjustment

Human Input
continuous.gif
Vehicle/AI Feedback
left.gif

Binary "Go/Don't Go" Decisions

Human Input
binary_ui.gif
Vehicle/AI Feedback
binary.gif
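
The three patterns above share one structure: the driver supplies either a continuous adjustment or a binary go/don't-go decision, and the vehicle's AI answers with feedback on the same curved surface. Below is a minimal, hypothetical TypeScript sketch of that shared structure; the type names and the "decline and explain" behavior are assumptions for illustration, not the prototype's actual code.

// Hypothetical model of the three interaction patterns above.
type HumanInput =
  | { kind: "accelerate"; amount: number }  // continuous: -1 (decelerate) .. 1 (accelerate)
  | { kind: "laneAdjust"; offset: number }  // continuous: -1 (left) .. 1 (right) within the lane
  | { kind: "decision"; go: boolean };      // binary: accept or reject a proposed maneuver

// Feedback the vehicle/AI renders back to the driver on the curved screen.
interface AIFeedback {
  acknowledged: boolean; // the AI has registered the input
  applied: boolean;      // the AI is acting on it
  explanation?: string;  // optional rationale, opening up the AI "black box"
}

// Illustrative assumption: the AI acknowledges every input and explains itself
// when it cannot apply one, preserving the driver's sense of agency.
function respond(input: HumanInput, safeToApply: boolean): AIFeedback {
  return {
    acknowledged: true,
    applied: safeToApply,
    explanation: safeToApply ? undefined : `Could not apply ${input.kind}: conditions judged unsafe`,
  };
}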

Results

We created a concept and prototype specifically for the Harman Future Experiences group, and while our users were the drivers in our tests, our customer was the larger Harman International corporation.

Evangelized a New Mental Model

The concept of co-driving, supported by the data-driven rationale we provided, was extremely well received by Harman. At their Michigan HQ, they told us that no automotive companies were yet thinking about autonomy in these terms.


Market Readiness

The autonomy level we chose allowed Harman to immediately begin conversations with automotive OEMs regarding their autonomy roadmaps.

Supporting Rationale

We empowered Harman to leverage our work for future projects by providing the data behind all of our decisions; future designs could take a new form for Harman customers while still being derived from our insights.

Scalability through a Design System

Our curved interface is one incarnation in a larger system of Human-AI interaction.

By documenting a system of generalizable interaction patterns, we empowered Harman to tailor our concept to the likings of any automotive manufacturer.

Time will tell the exact way in which this product influences the digital cockpits of our future.
