Role: UX Designer
Client: BMW
Timeline: 5 months
Platform: mobile application
The rise of autonomous driving raises the question of how users will interact with their future cars.
How might interfaces adapt to translate between an autonomous machine and a human?
BMW Vision combines functions such as automatic parking and charging with an outlook on how a futuristic UI paired with autonomous functions could look.
We wanted to understand which challenges BMW users currently face and what potential autonomous functions hold. We therefore broke the question down into concrete sub-problems, which we addressed by setting research goals.
As our project was severely limited by COVID, user needs were identified in the research phase from the pain points of the existing BMW app. Desired future functions were derived from studies on autonomous driving.
The app concept is based on a single long page from which users can access every function by scrolling, so that no feature is overlooked.
The entry point features the user's current car model as well as push notifications whenever the car requires an action from the user. The driving feature gives an overview of the car's current location, offers shortcuts (e.g., parking, charging), and lets the user send the car to a specific location or cancel an ongoing journey.
The functions section offers concrete settings for locking and opening doors and windows, temperature control, remote functions such as the headlight flasher and horn, ambient lighting, and music.
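The long-page structure described above can be sketched as a simple data model. This is a minimal illustration of the concept, not BMW's actual code; all type names, section IDs, and actions below are hypothetical assumptions drawn from the descriptions in this case study.

```typescript
// Hypothetical sketch of the single-long-page information architecture.
// All names here are illustrative; they are not from the real BMW app.

type Section = {
  id: string;
  title: string;
  actions: string[]; // shortcuts the user can trigger from this section
};

// One scrollable page exposing every function, so none is hidden
// behind deeper navigation.
const longPage: Section[] = [
  {
    id: "entry",
    title: "My Car",
    actions: ["view current model", "receive push notifications"],
  },
  {
    id: "driving",
    title: "Driving",
    actions: ["locate car", "park", "charge", "drive to location", "cancel journey"],
  },
  {
    id: "functions",
    title: "Functions",
    actions: ["lock/unlock", "windows", "temperature", "headlight flasher", "horn", "ambient light", "music"],
  },
];

// Every shortcut is reachable from the one page.
const allActions = longPage.flatMap((s) => s.actions);
console.log(allActions.length);
```

Modeling the page as an ordered list of sections mirrors the design goal: a user scanning top to bottom encounters every capability exactly once.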
During this project, it became clear that although fully autonomous driving still lies in the future, we need to start thinking now about making processes and functions easy for users to understand. As we worked, more and more questions arose: "To what extent do we still want to control the temperature in an autonomous car?" or "Does an autonomous car allow us to turn on the heating even though the windows are open?" These questions must be answered in the next steps through research and testing with users.