AR

Project info

Timeline

Role

Tools used

Internship project

Client: Hotel Adlon

1 UX/UI designer

2 project managers

5 Unity developers

UX/UI design

Design system

Prototyping

Service design

Visual design

Mar-Jun 2022

Figma

Unity

Blender

After Effects

Adobe Suite


Adlon AR allows users to explore the rich history of Adlon through immersive AR content paired with an audio guide. Users can enter interactive 3D scenery to meet historical figures, interact with virtual objects, and capture photos and videos as souvenirs. Two modes are provided: location-based ‘On site mode’, which allows users to experience content by directly reaching the station, and ‘Home mode’, which allows users to access content regardless of physical distance constraints.

Immersive location-based AR tour

Relevant link

Featured in

The iconic Hotel Adlon Kempinski Berlin is located in the very heart of Berlin, with a storied history that began in 1907. Our client wanted to convey the hotel's history vividly to the public. Therefore, to commemorate the 25th anniversary of its reopening, historic figures, artifacts, and the Presidential Suite have been preserved through augmented reality, offering an immersive experience to visitors.

About Hotel Adlon, a world-famous hotel at the Brandenburg Gate

Why Augmented Reality?

AR technology can bring historical memories, figures, and atmosphere back into the present.

Historical scenes no longer exist physically, but with AR technology they can.

Bell boy wearing Adlon uniform with 25th anniversary cake

The water fountain of Hotel Adlon

The goal

Deliver the story of Hotel Adlon through mobile-based AR while accounting for the user's physical space and safety.

Features

Easily explore the Suite room by placing objects in Home mode

Experience the historic site with On site mode

Commemorate with selfies

Home mode is activated when the user's location is more than 1 km away from the station. By scanning a flat surface to place objects and 360° photo bubbles, users can enjoy interactive content at home.

If the user's location is within 1 km of the station, location-based mode is activated, and tapping the station opens the route. The AR path and Earth Cloud Anchor navigate users to the destination. When they reach the content location, they can play the location-based AR content along with an audio guide that explains it.

In both modes, users can take commemorative photos with historical figures and objects in a virtual or real space and share them on social media.

Location based AR

A vivid tour starts when users reach the location

Audio guide

Hear hotel history and scenes with audio guide

Placement UX

Place AR objects in real space, then zoom, rotate, and scale them

Virtual object

Interact with historically significant objects

Take photo & Share

Create memories to remember the tour and share them on social media

360° scene

Immersive feeling as if the users were actually inside the hotel

Features

Physically reach the station to experience content

Enjoy content regardless of location or distance from the station

On site mode

Home mode

Mode types

Flow chart

Since the AR experiences are presented in two modes, establishing a flow chart that sets the conditions for activating each mode was crucial. If the user is within 1 km of the station location, On site mode experiences are displayed at the top; if farther than 1 km, Home mode experiences are displayed at the top. The flow chart was finalized in collaboration with the engineers on the team.
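The 1 km mode-switching rule can be sketched as a simple distance check. This is an illustrative Python sketch, not the project's actual Unity code; the station coordinates (near Pariser Platz) and all names here are assumptions.

```python
import math

# Assumed station coordinates near Pariser Platz, Berlin (illustrative only)
STATION = (52.5161, 13.3777)
ON_SITE_RADIUS_KM = 1.0

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def mode_for(user_pos):
    """Return which mode's experiences are displayed on top for this user."""
    if haversine_km(user_pos, STATION) <= ON_SITE_RADIUS_KM:
        return "on_site"  # within 1 km: location-based experiences first
    return "home"         # farther away: Home mode experiences first
```

A user standing in front of the hotel would be routed to On site mode, while a user elsewhere in Berlin would see Home mode experiences first.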

User journey

Open app

Scan environment

Experience AR objects

Take a photo & share

Service goal

Deliver the story of Hotel Adlon through mobile-based AR while accounting for the user's physical space and safety.

On site mode

On site mode is a playable experience near Pariser Platz, Berlin, where the actual historic events occurred. In front of the Hotel Adlon, users can walk to the station guided by an AR path to unlock the content.

When the user is within 1 km (0.6 miles) of the station,

Follow AR path navigation

A station contains multiple pieces of content. A UI appears to guide the user to the relevant location to experience each one.

01 Choose station from the map

02 Scan environment

03 Follow AR path navigation

04 Start virtual tour with audio guide

If the user's location is within 1 km of the station, location-based mode is activated, and tapping a station opens the route to it. The station list is displayed starting from the nearest one.

When users reach the selected station, the surrounding environment is scanned and AR content is activated.

User flow

When reaching the content location, users can play the content placed in AR and an audio guide that explains it. To experience the next content, press the ‘>’ button and follow the route leading to the next location.

The process

Initial version of AR route & Earth Cloud Anchor interface

We added distance indicators to mark the remaining distance to the station.

Added carpet texture to the AR path.

Designed the map theme to fit the branding.

Created 4 versions to test and iterate.

Added a destination label to make it easy to spot.

In the process of enhancing the UX of the AR path scenario with the developers...

Home mode

This mode was a feature newly added during my project period. Since the experience starts by placing virtual objects, I mainly focused on placement UX, as well as adding UI to ensure user safety, such as notifications and indicators.

When the user is not within 1 km (0.6 miles) of the station,

Once placement is complete, the panel comes up and the audio guide switches to a playable state.

Home mode is activated when the user's location is more than 1km away from the station. Scan a flat space to place objects.

Key objects related to the audio guide narration can be observed closely while placed in the real environment.

Displays the number of content items included

Go on to next content

Users can take commemorative photos with historical figures and objects in a virtual space and share them on social media.

An object is automatically created on the scanned surface, and users can adjust its size, position, and angle. Once placement is complete, the object is attached to the surface and a shadow is created to add realism.
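The placement behavior above (auto-spawn on a scanned surface, adjust, confirm, tap to re-edit) can be modeled as a small state machine, which also captures the rule that the audio guide stays locked while placing. This is a sketch in Python under assumed names, not the team's Unity implementation.

```python
from enum import Enum, auto

class PlacementState(Enum):
    SCANNING = auto()  # camera searching for a flat surface
    EDITING = auto()   # object spawned; user rotates, scales, moves it
    PLACED = auto()    # object attached to the surface, shadow rendered

class Placement:
    """Minimal sketch of the Home mode placement flow: the audio guide
    only becomes playable once the object is confirmed on the surface."""

    def __init__(self):
        self.state = PlacementState.SCANNING
        self.audio_guide_enabled = False

    def surface_detected(self):
        if self.state is PlacementState.SCANNING:
            # Object auto-spawns on the detected surface, ready to edit
            self.state = PlacementState.EDITING

    def confirm_placement(self):
        if self.state is PlacementState.EDITING:
            self.state = PlacementState.PLACED
            self.audio_guide_enabled = True  # panel comes up, audio playable

    def tap_object(self):
        # Tapping a placed object re-enters editing and re-locks the audio
        if self.state is PlacementState.PLACED:
            self.state = PlacementState.EDITING
            self.audio_guide_enabled = False
```

Tapping a placed object returns it to the editing state, matching the "tap the placed object to edit again" answer in the user flow.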


Final placement flow

01 Scan surface

05 Interact with virtual object in real world

06 Take a photo from the virtual site and share

02 Rotate, scale and move the object

04 Explore the scene and interact with objects

03 Place 360° interactive scene on the ground

Users can tour the hotel by turning the camera and interact with virtual objects by clicking on them.

Off-screen content location indicator

An object that can be clicked to turn the pages and read



Researching existing apps

IKEA place

  1. Choose an object.

  2. The camera turns on and detects a flat surface with a ring.

  3. Tapping the check button places the object.

  4. There is no guideline or message instructing the user, but the object can be rotated with a two-finger gesture. Adjusting the scale is not possible.

  5. When the check button is tapped, the object is placed with a little bouncy interaction, and the circle guideline disappears.

  6. To move or rotate an object, the user can just tap on it. The guide circle then appears again and the object floats in the air, indicating it is being edited.

  7. To exit, just tap the X at the top right.

  1. Choose Home mode.

  2. Instructions are presented: what Home mode is and what you can do.

  3. Prompts the user to scan the floor first.

  4. A popup message is shown with instructions on rotating, scaling, and taking a selfie.

  1. The object is generated at its real size, but the user can scale it with a slider (%). The location of the object is also guided by an arrow next to the slider.

  2. But scaling and rotating can also be done without the slider, using two fingers.

  3. Placeable objects are presented at the bottom and can be slid horizontally. A capture button exists for taking a selfie and sharing it.

TAGESSPIEGEL 89/19

Prior to designing the AR object placement user flow, I researched the usage process of existing apps. I recognized that the placement mechanism differs according to the goal of each app. To design the UX for a Home mode that lets users enjoy AR content without physical constraints related to their distance from a station, I first downloaded and explored existing apps like IKEA PLACE and ADOBE AERO, which allow users to place 3D AR content in real-world spaces. Through hands-on experience, I investigated their processes, strengths, and weaknesses.

User flow

Unlike other apps, this service places its emphasis on playing the audio guide. Therefore, it was crucial to guide users to look at the AR object and tap the play button to hear its description. Considering the user scenario, I designed a streamlined user flow and interactions, from scanning the surface to listening to the audio guide.

Object placing user flow

Use two fingers

Don’t let users play the audio guide while placing

Is ‘Real life scale’ needed?

Audio guide is deactivated

Guide circle

An object sticks to the scanned surface (not floating)

Scanning UI: how should it look?

How can the audio guide look when it is activated?

What if users want to change the location of the object?
- They can tap on the placed object to edit it again

Wireframe

After specifying the user flow, I made wireframes of the screens, considering the appropriate UI components. A panel was selected as the control area since it is foldable, letting users take a detailed view of the AR objects. Tutorials were also displayed as toasts to help users follow the process.

Unlike 2D apps, AR tours take place in a 3D setting. Users might trip or fall while immersed in the experience, and many people will be trying AR for the first time. Therefore, proper instructions are presented with notifications and animations to help them follow along easily. To support user safety, I added interfaces such as indicators that guide the direction to the content.

Tutorial animation

Notification toast

Direction indicator

Interfaces & instructions for scenarios beyond the screen

UX points

1

2

3

The UI panels and labels have to be easily viewable in any environment. To test many versions, I used Adobe Aero to try out what I had made in Figma in a real-world setting. Many variations were tested: dark and light, plus background-blurred versions. Eventually, I found that solid dark panels have better readability than the white versions.

Finding the appropriate panel design for readability

Design system

Typeface

Typography

Moodboard

Brand colors

Adlon gold

#Luxurious #Classic #Shiny

Sub color

2

1

3

4

FFFFFF

DEDBD5

C9BEB4

B5A191

Main color

1

2

0D1F44

031029

Title 1

Title 2

Headline

Body

Detail

Bold

Bold

Regular

Regular

Regular

24pt

20pt

15pt

13pt

11pt

Style

Weight

Size

Aa

Red Hat Display

Aa

Times New Roman

2

2

Spot ray indicating that placement is in progress

Go to the next content

Ratio to actual scale

Tutorial toast

Spot ray

Confirm placement

Number of contents

Carrying out a project as the main designer at an international startup was truly a precious opportunity! If I had more time to finish up the project, I would…

• Do more iterations on the created MVP and update the UI.

• Talk to the engineers and revise the front-end UI so that the MVP sticks to the design I made.

Learnings

Next steps

• Through many meetings, my communication and delivery skills were enhanced. I collaborated with multicultural teams from Sweden, Taiwan, Germany, and many more.

• Also, I had the chance to experience the holistic product development process, from managing client requirements to creating a user-friendly AR tour experience.

• By designing screens based on the actual user process (success and failure states of the UI, requesting permissions such as location, camera, and photos), I was able to gain a deeper understanding of the product's operation while considering its business and security aspects.

• I was able to absorb the fundamentals of AR UX, including the engineering mechanisms behind it.

©2024 Jiwon Park