DESIGNING AR SYSTEMS
COMP 4010 Lecture Seven
Mark Billinghurst
September 7th 2021
mark.billinghurst@unisa.edu.au
REVIEW
XR Prototyping Tools
Low Fidelity (Concept, visual design)
• Sketching
• Photoshop
• PowerPoint
• Video
High Fidelity (Interaction, experience design)
• Interactive sketching
• Desktop & on-device authoring
• Immersive authoring & visual scripting
• XR development toolkits
XR Prototyping Techniques
[Chart: techniques arranged from Lo-Fi and Easy to Hi-Fi and Hard]
• Sketching
• Paper Prototyping
• Video Prototyping
• Wireframing
• Bodystorming
• Wizard of Oz
• Digital Authoring
• Immersive Authoring
• Web-Based Development*
• Cross-Platform Development*
• Native Development*
* requires scripting and 3D programming skills
Interactive Sketching
• Pop App
● Pop - https://marvelapp.com/pop
● Combining sketching and interactivity on mobiles
● Take pictures of sketches, link pictures together
Proto.io
• Web based prototyping tool
• Visual drag and drop interface
• Rich transitions
• Scroll, swipe, buttons, etc
• Deploy on device
• mobile, PC, browser
• Ideal for mobile interfaces
• iOS, Android template
• For low and high fidelity prototypes
Digital Authoring Tools for AR
Vuforia Studio
Lens Studio
• Support visual authoring of marker-based and/or marker-less AR apps
• Provide default markers and support for custom markers
• Typically enable AR previews through an emulator, but need to deploy to an AR device for testing
Zappar
• Zapworks Studio
• Code-free interactivity
• Desktop authoring for mobile AR
• Integrated computer vision (ARKit, ARCore)
• Scripting, visual programming
• Multiple publishing options
• Zappar App, WebAR, App enabled
• Zapbox
• Inexpensive mobile AR HMD solution
• Two handed input
ZapBox
Snap LensStudio - https://lensstudio.snapchat.com/
Author and preview AR prototypes
● Tool behind Snapchat Lenses, but also a powerful AR prototyping tool
● Can do face (using front camera) and world lenses (rear camera)
● Simulated previews using webcam
Deploy and use advanced AR features
● Can deploy to phone running Snapchat app via Snapcode
● Has advanced AR tracking and segmentation capabilities
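As a rough illustration of how interactivity is scripted in Lens Studio, the sketch below (JavaScript, which Lens Studio uses for scripting) toggles a piece of AR content on each screen tap. The "target" input name is illustrative, not something from the slides:

```javascript
// Hypothetical Lens Studio script: toggle a scene object when the user taps.
// @input SceneObject target

var tapEvent = script.createEvent("TapEvent");
tapEvent.bind(function () {
    // show/hide the attached AR content on each tap
    script.target.enabled = !script.target.enabled;
});
```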
Immersive Authoring Tools for AR
• Enable visual authoring of 3D
content in AR
• Make it possible to edit while
previewing AR experience in the
environment
• Provide basic support for interactive
behaviors
• Sometimes support export to
WebXR
Apple Reality Composer
Adobe Aero
Creating On Device
• Adobe Aero
• Create AR on mobile devices
• Touch-based interaction and authoring
• Only iOS support for now
• https://www.adobe.com/nz/products/aero.html
Apple Reality Composer
• Rapidly create 3D scenes and AR experiences
• Creation on device (iPhone, iPad)
• Drag and drop interface
• Loading 2D/3D content
• Simple interactivity – trigger/action
• Anchor content in real world (AR view)
• Planes (vertical, horizontal), faces, images
Digital Prototyping
[Chart repeated: digital prototyping approaches arranged from Lo-Fi and Easy (Digital Authoring, Immersive Authoring) to Hi-Fi and Hard (Web-Based, Cross-Platform and Native Development*)]
* requires scripting and 3D programming skills
XR Tools Landscape
Digital & Immersive Authoring
Proto.io, Tour Creator, ...
Tilt Brush, Blocks, Quill, …
Web-Based Development
THREE.js, Babylon.js, …
A-Frame, AR.js, …
Cross-Platform Development
Unity + SDKs
Unreal + SDKs
Native Development
Cardboard/Oculus/Vive/... SDK
Vuforia/ARCore/ARKit/… SDK
XR Tools Landscape
Digital & Immersive Authoring
Good for storyboarding but limited
support for interactions
Web-Based Development
Good for basic XR apps but often
interactions feel unfinished
Native Development
Good for full-fledged XR apps but
limited to a particular platform
Cross-Platform Development
Good for full-fledged XR apps but
usually high learning curve
XR Toolkits
[Matrix: toolkits (A-Frame, AR.js, SteamVR, MRTK, Vuforia, AR Foundation, XR Interaction Toolkit, WebXR) mapped against target platforms (Cardboard, ARKit, ARCore, Oculus, VIVE, HoloLens, WMR, WebCam)]
WebXR: A-Frame
• Based on Three.js and WebGL
• New HTML tags for 3D scenes
• A-Frame Inspector (not editor)
• Asset management (img, video, audio & 3D models)
• ECS architecture with many open-source components
• Cross-platform XR (see the example scene below)
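To show the "HTML tags for 3D scenes" idea concretely, here is a minimal A-Frame scene in standard hello-world style markup (the library version in the script URL is only an example):

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- A-Frame builds on three.js/WebGL and registers custom HTML tags -->
    <script src="https://aframe.io/releases/1.2.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- <a-scene> sets up the canvas, camera, lights and WebXR session -->
    <a-scene>
      <!-- primitives are entities with components (entity-component-system) -->
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this page in a WebXR-capable browser gives a scene that can be viewed flat on screen or entered in VR/AR with the appropriate components.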
AR.js – WebXR Tracking
• Web based AR tracking library
• Marker tracking: ARToolkit markers
• Image tracking: Natural feature tracking
• Location tracking: GPS and compass
• Key Features
• Very fast: runs efficiently even on phones
• Web-based: a pure web solution, so no installation required
• Full JavaScript, based on three.js + A-Frame + jsartoolkit5
• Open source: completely open source and free of charge
• Standards-based: works on any phone with WebGL and WebRTC
• See https://ar-js-org.github.io/AR.js-Docs/
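For example, marker-based tracking with AR.js can be added to an A-Frame scene with a few lines of markup. This is a minimal sketch following the AR.js documentation; check the docs for the current script URL:

```html
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.2.0/aframe.min.js"></script>
    <!-- AR.js build for A-Frame (marker tracking) -->
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body>
    <!-- "embedded" removes the fullscreen UI; arjs enables webcam tracking -->
    <a-scene embedded arjs="sourceType: webcam;">
      <!-- content is anchored to the standard Hiro marker -->
      <a-marker preset="hiro">
        <a-box position="0 0.5 0" color="#4CC3D9"></a-box>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>
```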
Unity – unity.com
• Started out as game engine
• Has integrated support for many
types of XR apps
• Powerful scene editor
• Asset management & store
• Basically all XR device vendors
provide Unity SDKs
Vuforia
• Highly optimized computer vision tracking
• Multiple types of tracking
• Image tracking, object tracking, model tracking, area tracking, etc.
• Interaction features
• Virtual buttons, occlusion, visual effects,
• Multi-platform
• Mobile AR, AR headsets See https://www.vuforia.com/
AR Foundation
• A unified Framework for AR
• Multi-platform API
• Includes core features from ARKit, ARCore, Magic Leap, and HoloLens
• Set of behaviours and APIs with the following features
• Tracking, light estimation, occlusion, meshing, video pass-through, etc.
• Integrates with Unity MARS
• See https://unity.com/unity/features/arfoundation
Unity XR Interaction Toolkit (preview package)
• Easy way to add interactivity to AR/VR experience
• Object interactions
• UI interactions
• Locomotion
• Enabling common interactions without writing code
• AR gesture, object placement, annotations
• https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@1.0/
Unity MARS
• Features
• Visually author AR apps (WYSIWYG)
• Test apps in Unity editor
• Develop apps that can interact with real world
• Intelligent real-world recognition
• Multi-platform development
• Based on ARFoundation
• ARKit, ARCore, Magic Leap and HoloLens
• See unity.com/mars
Mixed Reality ToolKit (MRTK)
• Open-Source Mixed Reality ToolKit
• Set of Unity modules/Unreal plugin
• Interaction Models
• Controllers, gesture, gaze, voice, etc.
• UX elements
• Foundational elements
• Material, text, light, etc.
• Controls and behaviours
• button, menu, slider, etc.
• Tutorials, documentation, guidelines
• See https://github.com/microsoft/MixedRealityToolkit-Unity
DESIGNING AR SYSTEMS
Design in Interaction Design
Key Prototyping
Steps
Good vs. Bad AR Design
https://www.youtube.com/watch?v=YJg02ivYzSs
AR Design Considerations
• 1. Design for Humans
• Use Human Information Processing model
• 2. Design for Different User Groups
• Different users may have unique needs
• 3. Design for the Whole User
• Social, cultural, emotional, physical, cognitive
• 4. Use UI Best Practices
• Adapt known UI guidelines to AR/VR
• 5. Use of Interface Metaphors/Affordances
• Decide best metaphor for AR/VR application
1. Design for Human Information Processing
• High-level staged model from Wickens and Carswell (1997)
• Relates perception, cognition, and physical ergonomics
[Diagram: Perception → Cognition → Ergonomics]
Design for Perception
• Need to understand perception to design AR
• Visual perception
• Many types of visual cues (stereo, oculomotor, etc.)
• Auditory system
• Binaural cues, vestibular cues
• Somatosensory
• Haptic, tactile, kinesthetic, proprioceptive cues
• Chemical Sensing System
• Taste and smell
Depth Perception Problems
• Without proper depth cues AR interfaces look unreal
Which of these POIs are near or far?
Types of Depth Cues
Improving Depth Perception
Cutaways
Occlusion
Shadows
Cutaway Example
• Providing depth perception cues for AR
https://www.youtube.com/watch?v=2mXRO48w_E4
Design for Cognition
• Design for Working and Long-term memory
• Working memory
• Short-term storage with limited capacity (~5-9 items)
• Long-term memory
• Memory recall triggered by associative cues
• Situational Awareness
• Model of current state of user’s environment
• Used for wayfinding, object interaction, spatial awareness, etc.
• Provide cognitive cues to help with situational awareness
• Landmarks, procedural cues, map knowledge
• Support both ego-centric and exo-centric views
Micro-Interactions
▪ Using mobile phones, people split their attention between the display and the real world
[Chart: time spent looking at the screen vs. the real world]
Oulasvirta, A. (2005). The fragmentation of attention in mobile
interaction, and what to do with it. interactions, 12(6), 16-18.
Dividing Attention to World
• Number of times looking away from mobile screen
Design for Micro Interactions
▪ Design interaction for less than a few seconds
• Tiny bursts of interaction
• One task per interaction
• One input per interaction
▪ Benefits
• Use limited input
• Minimize interruptions
• Reduce attention fragmentation
NHTSA Guidelines - www.nhtsa.gov
For technology in cars:
• Any task by a driver should be interruptible at any time.
• The driver should control the pace of task interactions.
• Tasks should be completed with glances away from the
roadway of 2 seconds or less
• Cumulative time glancing away from the road <=12 secs.
Make it Glanceable
• Seek to rigorously reduce information density. Successful designs afford recognition, not reading.
[Comparison: bad vs. good example]
Reduce Information Chunks
You are designing for recognition, not reading. Reducing the total number of information chunks will greatly increase the glanceability of your design.
[Comparison of two layouts, assuming ~230 ms per eye fixation]
• Layout with 3 information chunks: ~4 fixations (1-2 for chunk 1, 1 each for chunks 2 and 3) ≈ 920 ms
• Layout with 5-6 information chunks: ~8 fixations (1 each for chunks 1-3, 3 for chunk 4, 2 for chunk 5) ≈ 1,840 ms
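The arithmetic behind this comparison can be captured in a tiny helper (an illustrative sketch; the ~230 ms per fixation figure is taken from the slide):

```javascript
// Rough glance-time estimate: total fixations x ~230 ms per fixation.
function estimateGlanceTimeMs(fixationsPerChunk) {
  var fixations = fixationsPerChunk.reduce(function (sum, n) { return sum + n; }, 0);
  return fixations * 230;
}

estimateGlanceTimeMs([2, 1, 1]);        // 3-chunk layout   -> ~920 ms
estimateGlanceTimeMs([1, 1, 1, 3, 2]);  // 5-6 chunk layout -> ~1,840 ms
```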
Ego-centric and Exo-centric views
• Combining ego-centric and exo-centric cues for better situational awareness
Cognitive Issues in Mobile AR
• Information Presentation
• Amount, Representation, Placement, View combination
• Physical Interaction
• Navigation, Direct manipulation, Content creation
• Shared Experience
• Social context, Bodily Configuration, Artifact manipulation, Display space
Li, N., & Duh, H. B. L. (2013). Cognitive issues in mobile augmented reality: an embodied perspective.
In Human factors in augmented reality environments (pp. 109-135). Springer, New York, NY.
Information Presentation
• Consider
• The amount of information
• Clutter, complexity
• The representation of information
• Navigation cues, POI representation
• The placement of information
• Head, body, world stabilized
• Using view combinations
• Multiple views
Example: Twitter 360
• www.twitter-360.com
• iPhone application
• See geo-located tweets in real world
• Twitter.com supports geo tagging
But: Information Clutter from Many Tweets
[Screenshot: AR view cluttered with many overlapping tweet labels]
Solution: Information Filtering
Information Filtering
[Comparison: AR view before and after filtering]
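One simple way to think about filtering is to cap what is shown by distance and count. The sketch below is a minimal illustration of that idea, with hypothetical function and field names; it is not the specific technique used in the pictured system:

```javascript
// Keep only POIs within a radius, closest first, capped at maxVisible items.
function filterPOIs(pois, user, maxDistanceMeters, maxVisible) {
  return pois
    .filter(function (poi) { return distanceMeters(user, poi) <= maxDistanceMeters; })
    .sort(function (a, b) { return distanceMeters(user, a) - distanceMeters(user, b); })
    .slice(0, maxVisible);
}

// Haversine distance between two {lat, lon} points, in metres.
function distanceMeters(a, b) {
  var R = 6371000, toRad = Math.PI / 180;
  var dLat = (b.lat - a.lat) * toRad, dLon = (b.lon - a.lon) * toRad;
  var h = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(a.lat * toRad) * Math.cos(b.lat * toRad) *
          Math.sin(dLon / 2) * Math.sin(dLon / 2);
  return 2 * R * Math.asin(Math.sqrt(h));
}
```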
Outdoor AR: Limited FOV
• Show POI outside FOV
• Zooms between map and panorama views
Zooming Views
• https://www.youtube.com/watch?v=JLxLH9Cya20
Design for Physical Ergonomics
• Design for the human motion range
• Consider human comfort and natural posture
• Design for hand input
• Coarse and fine scale motions, gripping and grasping
• Avoid “Gorilla arm syndrome” from holding arm pose
Gorilla Arm in AR
• Design interface to reduce mid-air gestures
XRgonomics
• Uses physiological model to calculate ergonomic interaction cost
• Difficulty of reaching points around the user
• Customizable for different users
• Programmable API, Hololens demonstrator
• GitHub Repository
• https://github.com/joaobelo92/xrgonomics
Evangelista Belo, J. M., Feit, A. M., Feuchtner, T., & Grønbæk, K. (2021, May). XRgonomics: Facilitating the Creation of
Ergonomic 3D Interfaces. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-11).
XRgonomics
https://www.youtube.com/watch?v=cQW9jfVXf4g
2. Designing for Different User Groups
• Design for Different Ages
• Children require different interface design than adults
• Older users have different needs than younger users
• Prior Experience with AR systems
• Familiar with HMDs, AR input devices
• People with Different Physical Characteristics
• Height and arm reach, handedness
• Perceptual, Cognitive and Motor Abilities
• Colour perception varies between people
• Spatial ability, cognitive or motor disabilities
Designing for Children
• HMDs
• Interpupillary distance, head fit, size and weight
• Tablets
• Poor dexterity, need to hold large tablet
• Content
• Reading ability, spatial perception
3. Design for the Whole User
Consider Your User
• Consider context of user
• Physical, social, emotional, cognitive, etc.
• Mobile Phone AR User
• Probably Mobile
• One hand interaction
• Short application use
• Need to be able to multitask
• Use in outdoor or indoor environment
• Want to enhance interaction with real world
Would you wear this HMD?
Whole User Needs
• Social
• Don’t make your user look stupid
• Cultural
• Follow local cultural norms
• Physical
• Can the user physically use the interface?
• Cognitive
• Can the user understand how the interface works?
• Emotional
• Make the user feel good and in control
Example: Social Acceptance
• People don’t want to look silly
• Only 12% of 4,600 adults would be willing to wear AR glasses
• 20% of mobile AR browser users experience social issues
• Acceptance more due to Social than Technical issues
• Needs further study (ethnographic, field tests, longitudinal)
TAT AugmentedID
4. Use UI Best Practices
• General UI design principles can be applied to AR
• E.g. Shneiderman’s UI guidelines from 1998
• Providing interface feedback
• Mixture of reactive, instrumental and operational feedback
• Maintain spatial and temporal correspondence
• Use constraints
• Specify relations between variables that must be satisfied
• E.g. physical constraints reduce freedom of movement
• Support Two-Handed control
• Use Guiard’s framework of bimanual manipulation
• Dominant vs. non-dominant hands
Follow Good HCI Principles
• Provide good conceptual model/Metaphor
• customers want to understand how UI works
• Make things visible
• if object has function, interface should show it
• Map interface controls to the customer's model
• infix vs. postfix calculator: whose model?
• Provide feedback
• what you see is what you get!
Example: Guiard’s model of bimanual manipulation
Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. Journal of Motor Behavior, 19, 486-517.
[Diagram: roles of the dominant and non-dominant hands]
Non-dominant hand: leads, sets the spatial reference frame, performs coarse motions
Dominant hand: follows, works within that reference frame, performs fine motions
Adapting Existing Guidelines
• Mobile Phone AR
• Phone HCI Guidelines
• Mobile HCI Guidelines
• HMD Based AR
• 3D User Interface Guidelines
• VR Interface Guidelines
• Desktop AR
• Desktop UI Guidelines
Example: Apple iOS Interface Guidelines
• Make it obvious how to use your content.
• Avoid clutter, unused blank space, and busy backgrounds.
• Minimize required user input.
• Express essential information succinctly.
• Provide a fingertip-sized target for all controls.
• Avoid unnecessary interactivity.
• Provide feedback when necessary
From: https://developer.apple.com/ios/human-interface-guidelines/
Applying Principles to Mobile AR
• Clean
• Large Video View
• Large Icons
• Text Overlay
• Feedback
• Interface Components
• Physical components
• Display elements
• Visual/audio
• Interaction metaphors
[Diagram: physical elements (input) and display elements (output) linked by the interaction metaphor]
5. Use Interface Metaphors
AR Interfaces
[Diagram: AR interface types ordered by increasing expressiveness and intuitiveness]
• Browsing: simple input, viewpoint control
• 3D AR: 3D UI, dedicated controllers, custom devices
• Tangible UI: augmented surfaces, object interaction, familiar controllers, indirect interaction
• Tangible AR: tangible input, AR overlay, direct interaction
• Natural AR: freehand gesture, speech, gaze
AR Interfaces
[AR interface spectrum diagram repeated]
Design for Layers
Information Layers
• Head-stabilized
• Heads-up display
• Body-stabilized
• E.g., virtual tool-belt
• World-stabilized
• E.g., billboard or signpost
Head stabilized
• Information attached to view – always visible
Body Stabilized
• Information moves with person
Body Stabilized Interface
• Elements you want always available, but not always visible
World Stabilized
• Information fixed in world
• Elements you want fixed relative to real world objects
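In a WebXR prototype these layers map onto where content is attached in the scene graph. The A-Frame sketch below is a minimal illustration: head-stabilized content is a child of the camera, while world-stabilized content sits at a fixed position in the scene (body-stabilized content would additionally need body tracking, which is not shown):

```html
<a-scene>
  <!-- head-stabilized: child of the camera, so it stays in view like a HUD -->
  <a-camera>
    <a-text value="Heads-up label" position="0 -0.4 -1" align="center"></a-text>
  </a-camera>

  <!-- world-stabilized: fixed position in the scene, like a signpost -->
  <a-text value="Signpost" position="2 1.5 -4" align="center"></a-text>
</a-scene>
```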
“Diegetic UI”
• Integrated with world
Example: Fragments
• UI Elements embedded
in real world
• Real world occlusion
Fragments Demo
• https://www.youtube.com/watch?v=kBGWZztPZ4A
Design to Device Constraints
• Understand the platform and design for limitations
• Hardware, software platforms
• E.g. Handheld AR game with visual tracking
• Use large screen icons
• Consider screen reflectivity
• Support one-hand interaction
• Consider the natural viewing angle
• Do not tire users out physically
• Do not encourage fast actions
• Keep at least one tracking surface in view
Art of Defense Game
Handheld AR Constraints/Affordances
• Camera and screen are linked
• Fast motions a problem when looking at screen
• Intuitive “navigation”
• Phone in hand
• Two handed activities: awkward or intuitive
• Extended periods of holding phone tiring
• Awareness of surrounding environment
• Small screen
• Extended periods of looking at screen tiring
• In general, small awkward platform
• Vibration, sound
• Can provide feedback when looking elsewhere
Common Mobile AR Metaphors
• Tangible AR Lens Viewing
• Look through screen into AR scene
• Interact with screen to interact with AR content
• Touch screen input
• E.g. Invisible Train
• Metaphor – holding a window into the AR world
The Invisible Train
https://www.youtube.com/watch?v=6LE98k0YMLM
Common Mobile AR Metaphors
• Tangible AR Lens Manipulation
• Select AR object and attach to device
• Physically move phone to move AR object
• Use motion of device as input
• E.g. AR Lego
• Metaphor – phone as a physical handle for the AR object
• https://www.youtube.com/watch?v=icmqv32HEPU
AR Interfaces
[AR interface spectrum diagram repeated]
Design for Affordances
Tangible AR Metaphor
• AR overcomes limitations of TUIs
• enhance display possibilities
• merge task/display space
• provide public and private views
• TUI + AR = Tangible AR
• Apply TUI methods to AR interface design
Tangible AR Design Principles
• Tangible AR Interfaces use TUI principles
• Physical controllers for moving virtual content
• Support for spatial 3D interaction techniques
• Time and space multiplexed interaction
• Support for multi-handed interaction
• Match object affordances to task requirements
• Support parallel activity with multiple objects
• Allow collaboration between multiple users
AR Design Space
[Diagram: continuum from Reality (physical design) through Augmented Reality to Virtual Reality (virtual design)]
Affordances
”… the perceived and actual properties of the thing, primarily
those fundamental properties that determine just how the
thing could possibly be used. [...]
Affordances provide strong clues to the operations of things.”
(Norman, The Psychology of Everyday Things 1988, p.9)
Affordances
Affordance Matrix
[Matrix: perceived vs. actual affordances, illustrated with doors: real door, hidden door, fake door, no door]
Physical vs. Virtual Affordances
• Physical Affordance
• Look and feel of real objects
• Shape, texture, colour, weight, etc.
• Industrial Design
• Virtual Affordance
• Look of virtual objects
• Copy real objects
• Interface Design
• AR design is a mixture of physical affordance and virtual affordance
• Physical
• Tangible controllers and objects
• Virtual
• Virtual graphics and audio
Affordances in AR
• Design AR interface objects to show how they are used
• Use visual and physical cues to show possible affordances
• Perceived affordances should match actual affordances
• Physical and virtual affordances should match
Merge Cube Tangible Molecules
Case Study 1: 3D AR Lens
Goal: Develop a lens based AR interface
• MagicLenses
• Developed at Xerox PARC in 1993
• View a region of the workspace differently to the rest
• Overlap MagicLenses to create composite effects
3D MagicLenses
MagicLenses extended to 3D (Viega et al., 1996)
§ Volumetric and flat lenses
AR Lens Design Principles
• Physical Components
• Lens handle
• Virtual lens attached to real object
• Display Elements
• Lens view
• Reveal layers in dataset
• Interaction Metaphor
• Physically holding lens
3D AR Lenses: Model Viewer
§ Displays models made up of multiple parts
§ Each part can be shown or hidden through the lens
§ Allows the user to peer inside the model
§ Maintains focus + context
AR Lens Demo
AR Lens Implementation
[Diagram: the stencil buffer masks the scene into an "outside lens" view and an "inside lens" view, used as a virtual magnifying glass]
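A rough sketch of the stencil idea in three.js terms (illustrative object names; the original system pre-dates three.js, so this only shows the general technique): the lens geometry writes a reference value into the stencil buffer without drawing any colour, and the "inside lens" content is drawn only where that value is present.

```javascript
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ stencil: true });

// 1. The lens shape writes 1 into the stencil buffer but draws nothing visible.
const lensMaterial = new THREE.MeshBasicMaterial({ colorWrite: false, depthWrite: false });
lensMaterial.stencilWrite = true;
lensMaterial.stencilRef = 1;
lensMaterial.stencilFunc = THREE.AlwaysStencilFunc;
lensMaterial.stencilZPass = THREE.ReplaceStencilOp;
const lens = new THREE.Mesh(new THREE.CircleGeometry(0.1, 32), lensMaterial);
lens.renderOrder = 0; // drawn first, to set up the mask

// 2. "Inside lens" content renders only where the stencil value equals 1.
const hiddenLayerMaterial = new THREE.MeshStandardMaterial({ color: 0xff8800 });
hiddenLayerMaterial.stencilWrite = true;
hiddenLayerMaterial.stencilRef = 1;
hiddenLayerMaterial.stencilFunc = THREE.EqualStencilFunc;
const hiddenLayer = new THREE.Mesh(new THREE.BoxGeometry(0.2, 0.2, 0.2), hiddenLayerMaterial);
hiddenLayer.renderOrder = 1; // drawn after the mask

// "Outside lens" content uses ordinary materials and ignores the stencil.
```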
Case Study 2: LevelHead
• Block-based game
Case Study 2: LevelHead
• Physical Components
• Real blocks
• Display Elements
• Virtual person and rooms
• Interaction Metaphor
• Blocks are rooms
Level Head Demo
Case Study 3: AR Chemistry (Fjeld 2002)
• Tangible AR chemistry education
Goal: An AR application to teach
molecular structure in chemistry
•Physical Components
• Real book, rotation cube, scoop, tracking
markers
•Display Elements
• AR atoms and molecules
• Interaction Metaphor
• Build your own molecule
AR Chemistry Input Devices
AR Chemistry Demo
Case Study 4: Transitional Interfaces
Goal: An AR interface supporting
transitions from reality to virtual reality
•Physical Components
• Real book
•Display Elements
• AR and VR content
• Interaction Metaphor
• Book pages hold virtual scenes
Milgram's Continuum (1994)
[Diagram: Mixed Reality (MR) spans the continuum from Reality (Tangible Interfaces) through Augmented Reality (AR) and Augmented Virtuality (AV) to Virtuality (Virtual Reality)]
Central Hypothesis
• The next generation of interfaces will support
transitions along the Reality-Virtuality continuum
Transitions
• Interfaces of the future will need to support
transitions along the RV continuum
• Augmented Reality is preferred for:
• co-located collaboration
• Immersive Virtual Reality is preferred for:
• experiencing world immersively (egocentric)
• sharing views
• remote collaboration
The MagicBook
• Design Goals:
• Allow the user to move smoothly between reality and virtual reality
• Support collaboration
MagicBook Metaphor
MagicBook Demo
Features
• Seamless transition from Reality to Virtuality
• Reliance on real decreases as virtual increases
• Supports egocentric and exocentric views
• User can pick appropriate view
• Computer becomes invisible
• Consistent interface metaphors
• Virtual content seems real
• Supports collaboration
Collaboration in MagicBook
• Collaboration on multiple levels:
• Physical Object
• AR Object
• Immersive Virtual Space
• Egocentric + exocentric collaboration
• multiple multi-scale users
• Independent Views
• Privacy, role division, scalability
Technology
• Reality
• No technology
• Augmented Reality
• Camera – tracking
• Switch – fly in
• Virtual Reality
• Compass – tracking
• Pressure pad – move
• Switch – fly out
Summary
•When designing AR interfaces, think of:
• Physical Components
• Physical affordances
• Virtual Components
• Virtual affordances
• Interface Metaphors
• Tangible AR or similar
AR INTERFACE DESIGN
GUIDELINES
Design Guidelines
By Vendors
Platform driven
By Designers
User oriented
By Practitioners
Experience based
By Researchers
Empirically derived
Design Patterns
“Each pattern describes a problem which occurs
over and over again in our environment, and then
describes the core of the solution to that problem in
such a way that you can use this solution a million
times over, without ever doing it the same way twice.”
– Christopher Alexander et al.
Use Design Patterns to Address Reoccurring Problems
C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.
Example UI Design Patterns
• http://ui-patterns.com/patterns
Design Patterns for Handheld AR
• Set of design patterns for Handheld AR
• Title: a short phrase that is memorable
• Definition: what experiences the pre-pattern supports
• Description: how and why the pre-pattern works, and what aspects of game design it is based on
• Examples: illustrate the meaning of the pre-pattern
• Using the pre-patterns: reveal the challenges and context of applying the pre-patterns
Xu, Y., Barba, E., Radu, I., Gandy, M., Shemaka, R., Schrank, B., ... & Tseng, T.
(2011, October). Pre-patterns for designing embodied interactions in handheld
augmented reality games. In 2011 IEEE International Symposium on Mixed and
Augmented Reality-Arts, Media, and Humanities (pp. 19-28). IEEE.
Handheld AR Design Patterns (A&S = awareness and skills)
• Device Metaphors: using metaphor to suggest available player actions (Body A&S, naïve physics)
• Control Mapping: intuitive mapping between physical and digital objects (Body A&S, naïve physics)
• Seamful Design: making sense of and integrating the technological seams through game design (Body A&S)
• World Consistency: whether the laws and rules of the physical world hold in the digital world (naïve physics, environmental A&S)
• Landmarks: reinforcing the connection between digital and physical space through landmarks (environmental A&S)
• Personal Presence: how a player is represented in the game decides how much they feel like living in the digital game world (environmental A&S, naïve physics)
• Living Creatures: game characters that respond to physical and social events, mimicking the behaviour of living beings (social A&S, Body A&S)
• Body Constraints: movement of one's body position constrains another player's action (Body A&S, social A&S)
• Hidden Information: information that can be hidden and revealed can foster emergent social play (social A&S, Body A&S)
Example: Seamless Design
• Design to reduce seams in the user experience
• E.g., AR tracking failure, change in interaction mode
• Paparazzi Game
• Change from AR tracking to accelerometer input
Xu, Y., et al. (2011). Pre-patterns for designing embodied interactions in handheld augmented reality games. In Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities (pp. 19-28).
Demo: Paparazzi Game
• https://www.youtube.com/watch?v=MIGH5WGMnbs
Example: Living Creatures
• Virtual creatures should respond to real world events
• E.g., player motion, wind, light, etc.
• Creates the illusion that creatures are alive in the real world
• Sony EyePet
• Responds to player blowing on creature
Google ARCore Interface Guidelines
https://developers.google.com/ar/design
ARCore Elements App
• Mobile AR app demonstrating
interface guidelines
• Multiple Interface Guidelines
• User interface
• User environment
• Object manipulation
• Off-screen markers
• Etc..
• Test on Device
• https://play.google.com/store/apps/details?id=com.google.ar.unity.ddelements
ARCore Elements
• https://www.youtube.com/watch?v=pRHmLuXIs0s
ARKit Interface Guidelines
• developer.apple.com/design/human-interface-guidelines/ios/system-capabilities/augmented-reality/
Microsoft Mixed Reality Design Guidelines
• https://docs.microsoft.com/en-us/windows/mixed-reality/design/design
MRTK Interface Examples
• Examples of UX Building Blocks
• http://aka.ms/MRTK
The Trouble with AR Design Guidelines
1) Rapidly evolving best practices
Still a moving target, lots to learn about AR design
Slowly emerging design patterns, but often change with OS updates
Already major differences between device platforms
2) Challenges with scoping guidelines
Often too high level, like “keep the user safe and comfortable”
Or, too application/device/vendor-specific
3) Best guidelines come from learning by doing
Test your designs early and often, learn from your own “mistakes”
Mind differences between VR and AR, but less so between devices
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
More Related Content

What's hot

Comp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and PrototypingComp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and PrototypingMark Billinghurst
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsMark Billinghurst
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR SystemsMark Billinghurst
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR TechnologyMark Billinghurst
 
Lecture 6 Interaction Design for VR
Lecture 6 Interaction Design for VRLecture 6 Interaction Design for VR
Lecture 6 Interaction Design for VRMark Billinghurst
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR InteractionMark Billinghurst
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsMark Billinghurst
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionMark Billinghurst
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Mark Billinghurst
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRMark Billinghurst
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR PrototypingMark Billinghurst
 
Comp 4010 2021 Snap Tutorial 2
Comp 4010 2021 Snap Tutorial 2Comp 4010 2021 Snap Tutorial 2
Comp 4010 2021 Snap Tutorial 2Mark Billinghurst
 
Advanced Methods for User Evaluation in AR/VR Studies
Advanced Methods for User Evaluation in AR/VR StudiesAdvanced Methods for User Evaluation in AR/VR Studies
Advanced Methods for User Evaluation in AR/VR StudiesMark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: PerceptionMark Billinghurst
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VRMark Billinghurst
 
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityCOMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityMark Billinghurst
 
COMP 4010 - Lecture 3 VR Systems
COMP 4010 - Lecture 3 VR SystemsCOMP 4010 - Lecture 3 VR Systems
COMP 4010 - Lecture 3 VR SystemsMark Billinghurst
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseMark Billinghurst
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionMark Billinghurst
 

What's hot (20)

Comp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and PrototypingComp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and Prototyping
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology
 
Lecture 6 Interaction Design for VR
Lecture 6 Interaction Design for VRLecture 6 Interaction Design for VR
Lecture 6 Interaction Design for VR
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and Systems
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-Perception
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VR
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
 
Comp 4010 2021 Snap Tutorial 2
Comp 4010 2021 Snap Tutorial 2Comp 4010 2021 Snap Tutorial 2
Comp 4010 2021 Snap Tutorial 2
 
Advanced Methods for User Evaluation in AR/VR Studies
Advanced Methods for User Evaluation in AR/VR StudiesAdvanced Methods for User Evaluation in AR/VR Studies
Advanced Methods for User Evaluation in AR/VR Studies
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityCOMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
 
COMP 4010 - Lecture 3 VR Systems
COMP 4010 - Lecture 3 VR SystemsCOMP 4010 - Lecture 3 VR Systems
COMP 4010 - Lecture 3 VR Systems
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR Interaction
 
ISS2022 Keynote
ISS2022 KeynoteISS2022 Keynote
ISS2022 Keynote
 

Similar to Comp4010 Lecture7 Designing AR Systems

Mobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - PrototypingMobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - PrototypingMark Billinghurst
 
Why Do Mobile Projects Fail?
Why Do Mobile Projects Fail?Why Do Mobile Projects Fail?
Why Do Mobile Projects Fail?Indiginox
 
Adapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile WorkflowAdapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile WorkflowJoseph Labrecque
 
Workshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their PeculiaritiesWorkshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their PeculiaritiesMartin Lechner
 
Mobile applications development
Mobile applications developmentMobile applications development
Mobile applications developmentVictor Matyushevskyy
 
Cross Platform Mobile Development
Cross Platform Mobile DevelopmentCross Platform Mobile Development
Cross Platform Mobile DevelopmentManesh Lad
 
Simple mobile Websites
Simple mobile WebsitesSimple mobile Websites
Simple mobile Websitescityofroundrock
 
Beginners guide to creating mobile apps
Beginners guide to creating mobile appsBeginners guide to creating mobile apps
Beginners guide to creating mobile appsJames Quick
 
Designing for mobile user experience
Designing for mobile user experienceDesigning for mobile user experience
Designing for mobile user experienceSameer Chavan
 
Is it possible to write cross-native apps in 2020 ?
Is it possible to write cross-native apps in 2020 ?Is it possible to write cross-native apps in 2020 ?
Is it possible to write cross-native apps in 2020 ?Chris Saez
 
Impact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher EducationImpact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher EducationJoseph Labrecque
 
SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architectureSEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architectureSebastien Kuntz
 
COMP 4010 - Lecture10: Mobile AR
COMP 4010 - Lecture10: Mobile ARCOMP 4010 - Lecture10: Mobile AR
COMP 4010 - Lecture10: Mobile ARMark Billinghurst
 
Future of user interface design
Future of user interface designFuture of user interface design
Future of user interface designRanjeet Tayi
 
Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...
Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...
Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...Bala Subra
 
The Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude StudioThe Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude StudioMartin Lechner
 
Mobile web development
Mobile web development Mobile web development
Mobile web development Moumie Soulemane
 
Storytelling using Immersive Technologies
Storytelling using Immersive TechnologiesStorytelling using Immersive Technologies
Storytelling using Immersive TechnologiesKumar Ahir
 

Similar to Comp4010 Lecture7 Designing AR Systems (20)

Mobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - PrototypingMobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - Prototyping
 
Why Do Mobile Projects Fail?
Why Do Mobile Projects Fail?Why Do Mobile Projects Fail?
Why Do Mobile Projects Fail?
 
Adapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile WorkflowAdapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile Workflow
 
Workshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their PeculiaritiesWorkshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their Peculiarities
 
Mobile applications development
Mobile applications developmentMobile applications development
Mobile applications development
 
Xamarin tools
Xamarin toolsXamarin tools
Xamarin tools
 
Cross Platform Mobile Development
Cross Platform Mobile DevelopmentCross Platform Mobile Development
Cross Platform Mobile Development
 
Simple mobile Websites
Simple mobile WebsitesSimple mobile Websites
Simple mobile Websites
 
Beginners guide to creating mobile apps
Beginners guide to creating mobile appsBeginners guide to creating mobile apps
Beginners guide to creating mobile apps
 
Designing for mobile user experience
Designing for mobile user experienceDesigning for mobile user experience
Designing for mobile user experience
 
Is it possible to write cross-native apps in 2020 ?
Is it possible to write cross-native apps in 2020 ?Is it possible to write cross-native apps in 2020 ?
Is it possible to write cross-native apps in 2020 ?
 
Impact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher EducationImpact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher Education
 
SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architectureSEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
 
COMP 4010 - Lecture10: Mobile AR
COMP 4010 - Lecture10: Mobile ARCOMP 4010 - Lecture10: Mobile AR
COMP 4010 - Lecture10: Mobile AR
 
Future of user interface design
Future of user interface designFuture of user interface design
Future of user interface design
 
Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...
Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...
Mobile Development Architecture Ppt with Slides, Book Notes on using Web Silv...
 
Android development first steps
Android development   first stepsAndroid development   first steps
Android development first steps
 
The Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude StudioThe Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude Studio
 
Mobile web development
Mobile web development Mobile web development
Mobile web development
 
Storytelling using Immersive Technologies
Storytelling using Immersive TechnologiesStorytelling using Immersive Technologies
Storytelling using Immersive Technologies
 

More from Mark Billinghurst

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented RealityMark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesMark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationMark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseMark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR SystemsMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional InterfacesMark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 

More from Mark Billinghurst (13)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Recently uploaded

GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhisoniya singh
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxnull - The Open Security Community
 
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter RoadsSnow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter RoadsHyundai Motor Group
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking MenDelhi Call girls
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...HostedbyConfluent
 
Key Features Of Token Development (1).pptx
Key  Features Of Token  Development (1).pptxKey  Features Of Token  Development (1).pptx
Key Features Of Token Development (1).pptxLBM Solutions
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 

Recently uploaded (20)

GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
The transition to renewables in India.pdf
The transition to renewables in India.pdfThe transition to renewables in India.pdf
The transition to renewables in India.pdf
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
 
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter RoadsSnow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
 
Key Features Of Token Development (1).pptx
Key  Features Of Token  Development (1).pptxKey  Features Of Token  Development (1).pptx
Key Features Of Token Development (1).pptx
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 

Comp4010 Lecture7 Designing AR Systems

  • 1. DESIGNING AR SYSTEMS COMP 4010 Lecture Seven Mark Billinghurst September 7th 2021 mark.billinghurst@unisa.edu.au
  • 3. XR Prototyping Tools Low Fidelity (Concept, visual design) • Sketching • Photoshop • PowerPoint • Video High Fidelity (Interaction, experience design) • Interactive sketching • Desktop & on-device authoring • Immersive authoring & visual scripting • XR development toolkits
  • 4. XR Prototyping Techniques Lo- Fi Hi- Fi Easy Hard Digital Authoring Immersive Authoring Web-Based Development* Cross-Platform Development* Native Development* * requires scripting and 3D programming skills Sketching Paper Prototyping Video Prototyping Wireframing Bodystorming Wizard of Oz
  • 5. Interactive Sketching • Pop App ● Pop - https://marvelapp.com/pop ● Combining sketching and interactivity on mobiles ● Take pictures of sketches, link pictures together
  • 6. Proto.io • Web based prototyping tool • Visual drag and drop interface • Rich transitions • Scroll, swipe, buttons, etc • Deploy on device • mobile, PC, browser • Ideal for mobile interfaces • iOS, Android template • For low and high fidelity prototypes
  • 7. Digital Authoring Tools for AR Vuforia Studio Lens Studio • Support visual authoring of marker-based and/or marker-less AR apps • Provide default markers and support for custom markers • Typically enable AR previews through emulator but need to deploy to AR device for testing
  • 8. Zappar • Zapworks Studio • Code-free interactivity • Desktop authoring for mobile AR • Integrated computer vision (ARKit, ARCore) • Scripting, visual programming • Multiple publishing options • Zappar App, WebAR, App enabled • ZapBox • Inexpensive mobile AR HMD solution • Two handed input ZapBox
  • 9. Snap LensStudio - https://lensstudio.snapchat.com/ Author and preview AR prototypes ● Tool behind Snapchat Lenses, but also a powerful AR prototyping tool ● Can do face (using front camera) and world lenses (rear camera) ● Simulated previews using webcam Deploy and use advanced AR features ● Can deploy to phone running Snapchat app via Snapcode ● Has advanced AR tracking and segmentation capabilities
  • 10. Immersive Authoring Tools for AR • Enable visual authoring of 3D content in AR • Make it possible to edit while previewing AR experience in the environment • Provide basic support for interactive behaviors • Sometimes support export to WebXR Apple Reality Composer Adobe Aero
  • 11. Creating On Device •Adobe Aero •Create AR on mobile devices •Touch based interaction and authoring •Only iOS support for now •https://www.adobe.com/nz/products/aero.html
  • 12. Apple Reality Composer • Rapidly create 3D scenes and AR experiences • Creation on device (iPhone, iPad) • Drag and drop interface • Loading 2D/3D content • Simple interactivity – trigger/action • Anchor content in real world (AR view) • Planes (vertical, horizontal), faces, images
  • 14. XR Tools Landscape Digital & Immersive Authoring Proto.io, Tour Creator, ... Tilt Brush, Blocks, Quill, … Web-Based Development THREE.js, Babylon.js, … A-Frame, AR.js, … Cross-Platform Development Unity + SDKs Unreal + SDKs Native Development Cardboard/Oculus/Vive/... SDK Vuforia/ARCore/ARKit/… SDK
  • 15. XR Tools Landscape Digital & Immersive Authoring Good for storyboarding but limited support for interactions Web-Based Development Good for basic XR apps but often interactions feel unfinished Native Development Good for full-fledged XR apps but limited to a particular platform Cross-Platform Development Good for full-fledged XR apps but usually high learning curve
  • 17. WebXR: A-Frame • Based on Three.js and WebGL • New HTML tags for 3D scenes • A-Frame Inspector (not editor) • Asset management (img, video, audio, & 3D models) • ECS architecture with many open- source components • Cross-platform XR
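To make the "new HTML tags" idea concrete, here is a minimal sketch of an A-Frame scene. It assumes an A-Frame build (aframe.min.js) is already loaded on the page; the markup is injected from TypeScript only so the snippet is self-contained, and the same tags could sit directly in index.html.

```typescript
// Minimal A-Frame scene: each <a-*> tag is an entity in the 3D scene.
// Assumes aframe.min.js is already loaded via a <script> tag on the page.
document.body.insertAdjacentHTML(
  "beforeend",
  `<a-scene>
     <a-box position="0 1 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
     <a-sphere position="1 1.25 -4" radius="0.5" color="#EF2D5E"></a-sphere>
     <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
     <a-sky color="#ECECEC"></a-sky>
   </a-scene>`
);
```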
  • 18. AR.js – WebXR Tracking • Web based AR tracking library • Marker tracking: ARToolKit markers • Image tracking: natural feature tracking • Location tracking: GPS and compass • Key Features • Very fast: runs efficiently even on phones • Web-based: a pure web solution, so no installation required • Full JavaScript: based on three.js + A-Frame + jsartoolkit5 • Open source: completely open source and free of charge • Standards: works on any phone with WebGL and WebRTC • See https://ar-js-org.github.io/AR.js-Docs/
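A typical AR.js prototype is just an A-Frame scene with an arjs attribute plus a marker entity. This is a hedged sketch of the common Hiro-marker pattern from the AR.js documentation; as above, it is injected from TypeScript only to keep the example self-contained, and assumes both aframe.min.js and the AR.js A-Frame build (aframe-ar.js) are already loaded.

```typescript
// Marker-based AR.js scene: a box is drawn on top of the standard "hiro" marker
// seen through the phone's camera. Assumes aframe.min.js and aframe-ar.js are loaded.
document.body.insertAdjacentHTML(
  "beforeend",
  `<a-scene embedded arjs="sourceType: webcam; debugUIEnabled: false;">
     <a-marker preset="hiro">
       <a-box position="0 0.5 0" color="yellow"></a-box>
     </a-marker>
     <a-entity camera></a-entity>
   </a-scene>`
);
```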
  • 19. Unity – unity.com • Started out as game engine • Has integrated support for many types of XR apps • Powerful scene editor • Asset management & store • Basically all XR device vendors provide Unity SDKs
  • 20. Vuforia • Highly optimized computer vision tracking • Multiple types of tracking • Image tracking, object tracking, model tracking, area tracking, etc. • Interaction features • Virtual buttons, occlusion, visual effects • Multi-platform • Mobile AR, AR headsets • See https://www.vuforia.com/
  • 21. AR Foundation • A unified framework for AR • Multi-platform API • Includes core features from ARKit, ARCore, Magic Leap, and HoloLens • Set of behaviours and APIs with the following features • Tracking, light estimation, occlusion, meshing, video pass-through, etc. • Integrates with Unity MARS • See https://unity.com/unity/features/arfoundation
  • 22. Unity XR Interaction Toolkit (preview package) • Easy way to add interactivity to AR/VR experiences • Object interactions • UI interactions • Locomotion • Enables common interactions without writing code • AR gestures, object placement, annotations • https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@1.0/
  • 23. Unity MARS • Features • Visually author AR apps (WYSIWYG) • Test apps in Unity editor • Develop apps that can interact with real world • Intelligent real-world recognition • Multi-platform development • Based on AR Foundation • ARKit, ARCore, Magic Leap and HoloLens • See unity.com/mars
  • 24. Mixed Reality ToolKit (MRTK) • Open-Source Mixed Reality ToolKit • Set of Unity modules/Unreal plugin • Interaction Models • Controllers, gesture, gaze, voice, etc. • UX elements • Foundational elements • Material, text, light, etc. • Controls and behaviours • button, menu, slider, etc. • Tutorials, documentation, guidelines • See https://github.com/microsoft/MixedRealityToolkit-Unity
  • 26. Design in Interaction Design Key Prototyping Steps
  • 27. Good vs. Bad AR Design
  • 29. AR Design Considerations • 1. Design for Humans • Use Human Information Processing model • 2. Design for Different User Groups • Different users may have unique needs • 3. Design for the Whole User • Social, cultural, emotional, physical, cognitive • 4. Use UI Best Practices • Adapt known UI guidelines to AR/VR • 5. Use of Interface Metaphors/Affordances • Decide best metaphor for AR/VR application
  • 30. 1. Design for Human Information Processing • High level staged model from Wickens and Carswell (1997) • Relates perception, cognition, and physical ergonomics (Perception → Cognition → Ergonomics)
  • 31. Design for Perception • Need to understand perception to design AR • Visual perception • Many types of visual cues (stereo, oculomotor, etc.) • Auditory system • Binaural cues, vestibular cues • Somatosensory • Haptic, tactile, kinesthetic, proprioceptive cues • Chemical Sensing System • Taste and smell
  • 32. Depth Perception Problems • Without proper depth cues AR interfaces look unreal
  • 33. Which of these POIs are near or far?
  • 36. Cutaway Example • Providing depth perception cues for AR https://www.youtube.com/watch?v=2mXRO48w_E4
  • 37. Design for Cognition • Design for Working and Long-term memory • Working memory • Short term storage, limited capacity (~5-9 items) • Long term memory • Memory recall triggered by associative cues • Situational Awareness • Model of current state of user’s environment • Used for wayfinding, object interaction, spatial awareness, etc. • Provide cognitive cues to help with situational awareness • Landmarks, procedural cues, map knowledge • Support both ego-centric and exo-centric views
  • 38. Micro-Interactions ▪ Using mobile phones people split their attention between the display and the real world
  • 39. Time Looking at Screen Oulasvirta, A. (2005). The fragmentation of attention in mobile interaction, and what to do with it. interactions, 12(6), 16-18.
  • 40. Dividing Attention to World • Number of times looking away from mobile screen
  • 41. Design for Micro Interactions ▪ Design interaction for less than a few seconds • Tiny bursts of interaction • One task per interaction • One input per interaction ▪ Benefits • Use limited input • Minimize interruptions • Reduce attention fragmentation
  • 42. NHTSA Guidelines - www.nhtsa.gov For technology in cars: • Any task by a driver should be interruptible at any time. • The driver should control the pace of task interactions. • Tasks should be completed with glances away from the roadway of 2 seconds or less • Cumulative time glancing away from the road <=12 secs.
  • 43. Make it Glanceable • Seek to rigorously reduce information density. Successful designs afford recognition, not reading. • (Comparison figure: bad vs. good layout)
  • 44. Reduce Information Chunks • You are designing for recognition, not reading. Reducing the total number of information chunks will greatly increase the glanceability of your design. • Example from the slide: a 3-chunk layout needs about 4 eye movements (~920 ms at roughly 230 ms each), while a 5-6 chunk layout needs about 8 eye movements (~1,840 ms)
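The timings on this slide follow from a simple budget of roughly 230 ms per eye movement. The sketch below only reproduces that arithmetic; the constant and function name are illustrative, not from the lecture.

```typescript
// Rough glanceability estimate: total glance time grows with the number of
// eye movements needed, at ~230 ms per movement (illustrative figure from the slide).
const MS_PER_EYE_MOVEMENT = 230;

function estimateGlanceTimeMs(eyeMovementsPerChunk: number[]): number {
  const totalMovements = eyeMovementsPerChunk.reduce((sum, n) => sum + n, 0);
  return totalMovements * MS_PER_EYE_MOVEMENT;
}

console.log(estimateGlanceTimeMs([2, 1, 1]));       // 3-chunk layout: ~920 ms
console.log(estimateGlanceTimeMs([1, 1, 1, 3, 2])); // 5-6 chunk layout: ~1,840 ms
```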
  • 45. Ego-centric and Exo-centric views • Combining ego-centric and exo-centric cues for better situational awareness
  • 46. Cognitive Issues in Mobile AR • Information Presentation • Amount, Representation, Placement, View combination • Physical Interaction • Navigation, Direct manipulation, Content creation • Shared Experience • Social context, Bodily Configuration, Artifact manipulation, Display space Li, N., & Duh, H. B. L. (2013). Cognitive issues in mobile augmented reality: an embodied perspective. In Human factors in augmented reality environments (pp. 109-135). Springer, New York, NY.
  • 47. Information Presentation • Consider • The amount of information • Clutter, complexity • The representation of information • Navigation cues, POI representation • The placement of information • Head, body, world stabilized • Using view combinations • Multiple views
  • 48. Example: Twitter 360 • www.twitter-360.com • iPhone application • See geo-located tweets in the real world • Twitter.com supports geo-tagging
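Geo-located AR browsers of this kind work by converting each POI's latitude/longitude into a compass bearing from the user and comparing it with the phone's heading. The sketch below is a generic illustration of that idea, not code from Twitter 360; the function names are made up for the example.

```typescript
// Where should a geo-located POI appear? Compute its bearing from the user and
// subtract the compass heading; the result is its horizontal offset in the view.
const toRad = (deg: number) => (deg * Math.PI) / 180;
const toDeg = (rad: number) => (rad * 180) / Math.PI;

// Initial bearing from (lat1, lon1) to (lat2, lon2); 0° = north, clockwise.
function bearingTo(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const phi1 = toRad(lat1), phi2 = toRad(lat2), dLambda = toRad(lon2 - lon1);
  const y = Math.sin(dLambda) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) -
            Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLambda);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Signed angle (-180..180) between the POI's bearing and the current compass heading.
function headingOffset(poiBearing: number, compassHeading: number): number {
  return ((poiBearing - compassHeading + 540) % 360) - 180;
}
```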
  • 49. But: Information Clutter from Many Tweets • (Screenshot: the view fills with overlapping tweet bubbles, making individual POIs hard to read)
  • 53. Zooming Views • Show POI outside FOV • Zooms between map and panorama views
  • 55. Design for Physical Ergonomics • Design for the human motion range • Consider human comfort and natural posture • Design for hand input • Coarse and fine scale motions, gripping and grasping • Avoid “Gorilla arm syndrome” from holding arm pose
  • 56. Gorilla Arm in AR • Design interface to reduce mid-air gestures
  • 57. XRgonomics • Uses physiological model to calculate ergonomic interaction cost • Difficulty of reaching points around the user • Customizable for different users • Programmable API, HoloLens demonstrator • GitHub Repository • https://github.com/joaobelo92/xrgonomics Evangelista Belo, J. M., Feit, A. M., Feuchtner, T., & Grønbæk, K. (2021, May). XRgonomics: Facilitating the Creation of Ergonomic 3D Interfaces. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-11).
  • 59. 2. Designing for Different User Groups • Design for Different Ages • Children require different interface design than adults • Older users have different needs than younger • Prior Experience with AR systems • Familiar with HMDs, AR input devices • People with Different Physical Characteristics • Height and arm reach, handedness • Perceptual, Cognitive and Motor Abilities • Colour perception varies between people • Spatial ability, cognitive or motor disabilities
  • 60. Designing for Children • HMDs • Interpupillary distance, head fit, size and weight • Tablets • Poor dexterity, need to hold large tablet • Content • Reading ability, spatial perception
  • 61. 3. Design for the Whole User
  • 62. Consider Your User • Consider context of user • Physical, social, emotional, cognitive, etc. • Mobile Phone AR User • Probably Mobile • One hand interaction • Short application use • Need to be able to multitask • Use in outdoor or indoor environment • Want to enhance interaction with real world
  • 63. Would you wear this HMD?
  • 64. Whole User Needs • Social • Don’t make your user look stupid • Cultural • Follow local cultural norms • Physical • Can the user physically use the interface? • Cognitive • Can the user understand how the interface works? • Emotional • Make the user feel good and in control
  • 65. Example: Social Acceptance • People don’t want to look silly • Only 12% of 4,600 adults would be willing to wear AR glasses • 20% of mobile AR browser users experience social issues • Acceptance more due to Social than Technical issues • Needs further study (ethnographic, field tests, longitudinal)
  • 69. 4. Use UI Best Practices • General UI design principles can be applied to AR • E.g. Shneiderman’s UI guidelines from 1998 • Providing interface feedback • Mixture of reactive, instrumental and operational feedback • Maintain spatial and temporal correspondence • Use constraints • Specify relations between variables that must be satisfied • E.g. physical constraints reduce freedom of movement • Support Two-Handed control • Use Guiard’s framework of bimanual manipulation • Dominant vs. non-dominant hands
  • 70. Follow Good HCI Principles • Provide good conceptual model/metaphor • customers want to understand how the UI works • Make things visible • if an object has a function, the interface should show it • Map interface controls to the customer’s model • infix vs. postfix calculator – whose model? • Provide feedback • what you see is what you get!
  • 71. Example: Guiard’s model of bimanual manipulation • Non-dominant hand: leads, sets the spatial reference frame, performs coarse motions • Dominant hand: follows, works within that reference frame, performs fine motions. Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. Journal of Motor Behavior, 19, 486-517.
  • 72. Adapting Existing Guidelines • Mobile Phone AR • Phone HCI Guidelines • Mobile HCI Guidelines • HMD Based AR • 3D User Interface Guidelines • VR Interface Guidelines • Desktop AR • Desktop UI Guidelines
  • 73. Example: Apple iOS Interface Guidelines • Make it obvious how to use your content. • Avoid clutter, unused blank space, and busy backgrounds. • Minimize required user input. • Express essential information succinctly. • Provide a fingertip-sized target for all controls. • Avoid unnecessary interactivity. • Provide feedback when necessary From: https://developer.apple.com/ios/human-interface-guidelines/
  • 74. Applying Principles to Mobile AR • Clean • Large Video View • Large Icons • Text Overlay • Feedback
  • 75. 5. Use Interface Metaphors • Interface Components • Physical components • Display elements • Visual/audio • Interaction metaphors • (Diagram: Physical Elements (input) → Interaction Metaphor → Display Elements (output))
  • 76. AR Interfaces • Interface types along an axis of increasing expressiveness and intuitiveness: • Browsing: simple input, viewpoint control • 3D AR: 3D UI, dedicated controllers, custom devices • Tangible UI: augmented surfaces, object interaction, familiar controllers, indirect interaction • Tangible AR: tangible input, AR overlay, direct interaction • Natural AR: freehand gesture, speech, gaze
  • 77. AR Interfaces • Same continuum as the previous slide, here highlighting: Design for Layers
  • 78. Information Layers • Head-stabilized • Heads-up display • Body-stabilized • E.g., virtual tool-belt • World-stabilized • E.g., billboard or signpost
  • 79. Head stabilized • Information attached to view – always visible
  • 81. Body Stabilized Interface • Elements you want always available, but not always visible
  • 83. World Stabilized • Elements you want fixed relative to real world objects
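In a scene-graph engine the three information layers largely come down to what each piece of content is parented to. A minimal THREE.js sketch, with illustrative object names and a plain group standing in for a tracked body pose:

```typescript
import * as THREE from "three";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, 16 / 9, 0.1, 100);
scene.add(camera);

// Head-stabilised: parented to the camera, so it stays fixed in view (HUD-style).
const hud = new THREE.Mesh(new THREE.PlaneGeometry(0.2, 0.05));
hud.position.set(0, -0.15, -0.5);
camera.add(hud);

// Body-stabilised: parented to a group that follows the user's body pose
// (e.g. a virtual tool belt); here the pose is an assumed, externally tracked group.
const body = new THREE.Group();
scene.add(body);
const toolBelt = new THREE.Mesh(new THREE.BoxGeometry(0.3, 0.05, 0.1));
toolBelt.position.set(0, -0.6, -0.2);
body.add(toolBelt);

// World-stabilised: parented directly to the scene at a fixed world location
// (e.g. a signpost anchored to a detected plane).
const signpost = new THREE.Mesh(new THREE.BoxGeometry(0.5, 1, 0.05));
signpost.position.set(2, 0.5, -3);
scene.add(signpost);
```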
  • 85. Example: Fragments • UI Elements embedded in real world • Real world occlusion
  • 87. Design to Device Constraints • Understand the platform and design for limitations • Hardware, software platforms • E.g. Handheld AR game with visual tracking • Use large screen icons • Consider screen reflectivity • Support one-hand interaction • Consider the natural viewing angle • Do not tire users out physically • Do not encourage fast actions • Keep at least one tracking surface in view Art of Defense Game
  • 88. Handheld AR Constraints/Affordances • Camera and screen are linked • Fast motions a problem when looking at screen • Intuitive “navigation” • Phone in hand • Two handed activities: awkward or intuitive • Extended periods of holding phone tiring • Awareness of surrounding environment • Small screen • Extended periods of looking at screen tiring • In general, small awkward platform • Vibration, sound • Can provide feedback when looking elsewhere
  • 89. Common Mobile AR Metaphors • Tangible AR Lens Viewing • Look through screen into AR scene • Interact with screen to interact with AR content • Touch screen input • E.g. Invisible Train • Metaphor – holding a window into the AR world
  • 91. Common Mobile AR Metaphors • Tangible AR Lens Manipulation • Select AR object and attach to device • Physically move phone to move AR object • Use motion of device as input • E.g. AR Lego • Metaphor – phone as a physical handle for the AR object
  • 93. AR Interfaces • Same continuum as slide 76, here highlighting: Design for Affordances
  • 94. Tangible AR Metaphor • AR overcomes limitations of TUIs • enhance display possibilities • merge task/display space • provide public and private views • TUI + AR = Tangible AR • Apply TUI methods to AR interface design
  • 95. Tangible AR Design Principles • Tangible AR Interfaces use TUI principles • Physical controllers for moving virtual content • Support for spatial 3D interaction techniques • Time and space multiplexed interaction • Support for multi-handed interaction • Match object affordances to task requirements • Support parallel activity with multiple objects • Allow collaboration between multiple users
  • 96. AR Design Space • A continuum from Reality through Augmented Reality to Virtual Reality • Physical Design sits at the Reality end, Virtual Design at the Virtual Reality end
  • 97. Affordances • “… the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things.” (Norman, The Psychology of Everyday Things, 1988, p.9)
  • 100. Affordance Matrix (perceived vs. actual affordance) • Real door: perceived and actual • Fake door: perceived but not actual • Hidden door: actual but not perceived • No door: neither
  • 101. Physical vs. Virtual Affordances • Physical Affordance • Look and feel of real objects • Shape, texture, colour, weight, etc. • Industrial Design • Virtual Affordance • Look of virtual objects • Copy real objects • Interface Design
  • 102. • AR design is a mixture of physical affordance and virtual affordance • Physical • Tangible controllers and objects • Virtual • Virtual graphics and audio
  • 103. Affordances in AR • Design AR interface objects to show how they are used • Use visual and physical cues to show possible affordances • Perceived affordances should match actual affordances • Physical and virtual affordances should match Merge Cube Tangible Molecules
  • 104. Case Study 1: 3D AR Lens Goal: Develop a lens based AR interface • MagicLenses • Developed at Xerox PARC in 1993 • View a region of the workspace differently to the rest • Overlap MagicLenses to create composite effects
  • 105. 3D MagicLenses • MagicLenses extended to 3D (Viega et al., 1996) • Volumetric and flat lenses
  • 106. AR Lens Design Principles • Physical Components • Lens handle • Virtual lens attached to real object • Display Elements • Lens view • Reveal layers in dataset • Interaction Metaphor • Physically holding lens
  • 107. 3D AR Lenses: Model Viewer • Displays models made up of multiple parts • Each part can be shown or hidden through the lens • Allows the user to peer inside the model • Maintains focus + context
  • 109. AR Lens Implementation • Uses the stencil buffer to split rendering into content outside the lens and content inside the lens (the virtual magnifying glass)
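One common way to get this effect in a WebGL engine is with per-material stencil settings: the lens shape writes a reference value into the stencil buffer, and the "inside lens" content only passes where that value was written. A hedged THREE.js sketch of that idea (not the original implementation; mesh names are illustrative):

```typescript
import * as THREE from "three";

// Lens mask: invisible, but writes stencil value 1 wherever the lens covers the view.
const lensMaterial = new THREE.MeshBasicMaterial({ colorWrite: false, depthWrite: false });
lensMaterial.stencilWrite = true;
lensMaterial.stencilRef = 1;
lensMaterial.stencilFunc = THREE.AlwaysStencilFunc;
lensMaterial.stencilZPass = THREE.ReplaceStencilOp;
const lensMesh = new THREE.Mesh(new THREE.CircleGeometry(0.1, 32), lensMaterial);
lensMesh.renderOrder = 1; // draw the mask before the hidden layer

// Hidden layer: only rendered where the stencil buffer equals 1 (inside the lens).
const hiddenMaterial = new THREE.MeshStandardMaterial({ color: 0xff8800 });
hiddenMaterial.stencilWrite = true;
hiddenMaterial.stencilRef = 1;
hiddenMaterial.stencilFunc = THREE.EqualStencilFunc;
const hiddenLayerMesh = new THREE.Mesh(new THREE.BoxGeometry(0.1, 0.1, 0.1), hiddenMaterial);
hiddenLayerMesh.renderOrder = 2;

// Content outside the lens is drawn normally with default stencil settings;
// the WebGLRenderer keeps a stencil buffer enabled by default.
```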
  • 110. Case Study 2: LevelHead • Block-based game
  • 111. Case Study 2: LevelHead • Physical Components • Real blocks • Display Elements • Virtual person and rooms • Interaction Metaphor • Blocks are rooms
  • 114. Case Study 3: AR Chemistry (Fjeld 2002) • Tangible AR chemistry education
  • 115. Goal: An AR application to teach molecular structure in chemistry •Physical Components • Real book, rotation cube, scoop, tracking markers •Display Elements • AR atoms and molecules • Interaction Metaphor • Build your own molecule
  • 116. AR Chemistry Input Devices
  • 119. Case Study 4: Transitional Interfaces Goal: An AR interface supporting transitions from reality to virtual reality •Physical Components • Real book •Display Elements • AR and VR content • Interaction Metaphor • Book pages hold virtual scenes
  • 120. Milgram’s Continuum (1994) • Reality (Tangible Interfaces) → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtuality (Virtual Reality); AR and AV together make up Mixed Reality (MR) • Central Hypothesis • The next generation of interfaces will support transitions along the Reality-Virtuality continuum
  • 121. Transitions • Interfaces of the future will need to support transitions along the RV continuum • Augmented Reality is preferred for: • co-located collaboration • Immersive Virtual Reality is preferred for: • experiencing world immersively (egocentric) • sharing views • remote collaboration
  • 122. The MagicBook •Design Goals: •Allows user to move smoothly between reality and virtual reality •Support collaboration
  • 125. Features • Seamless transition from Reality to Virtuality • Reliance on real decreases as virtual increases • Supports egocentric and exocentric views • User can pick appropriate view • Computer becomes invisible • Consistent interface metaphors • Virtual content seems real • Supports collaboration
  • 126. Collaboration in MagicBook • Collaboration on multiple levels: • Physical Object • AR Object • Immersive Virtual Space • Egocentric + exocentric collaboration • multiple multi-scale users • Independent Views • Privacy, role division, scalability
  • 127. Technology • Reality • No technology • Augmented Reality • Camera – tracking • Switch – fly in • Virtual Reality • Compass – tracking • Pressure pad – move • Switch – fly out
  • 128. Summary •When designing AR interfaces, think of: • Physical Components • Physical affordances • Virtual Components • Virtual affordances • Interface Metaphors • Tangible AR or similar
  • 130. Design Guidelines • By Vendors: platform driven • By Designers: user oriented • By Practitioners: experience based • By Researchers: empirically derived
  • 131. Design Patterns “Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem in such a way that you can use this solution a million times over, without ever doing it the same way twice.” – Christopher Alexander et al. Use Design Patterns to Address Reoccurring Problems C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.
  • 132. Example UI Design Patterns • http://ui-patterns.com/patterns
  • 134. Design Patterns for Handheld AR • Set of design patterns for Handheld AR • Title: a short phrase that is memorable. • Definition: what experiences the pre-pattern supports. • Description: how and why the pre-pattern works, and what aspects of game design it is based on. • Examples: illustrate the meaning of the pre-pattern. • Using the pre-patterns: reveal the challenges and context of applying the pre-patterns. Xu, Y., Barba, E., Radu, I., Gandy, M., Shemaka, R., Schrank, B., ... & Tseng, T. (2011, October). Pre-patterns for designing embodied interactions in handheld augmented reality games. In 2011 IEEE International Symposium on Mixed and Augmented Reality-Arts, Media, and Humanities (pp. 19-28). IEEE.
  • 135. Handheld AR Design Patterns (Title: Meaning; Embodied Skills*) • Device Metaphors: Using metaphor to suggest available player actions; Body A&S, naïve physics • Control Mapping: Intuitive mapping between physical and digital objects; Body A&S, naïve physics • Seamful Design: Making sense of and integrating the technological seams through game design; Body A&S • World Consistency: Whether the laws and rules in the physical world hold in the digital world; naïve physics, environmental A&S • Landmarks: Reinforcing the connection between digital-physical space through landmarks; environmental A&S • Personal Presence: The way that a player is represented in the game decides how much they feel like living in the digital game world; environmental A&S, naïve physics • Living Creatures: Game characters that are responsive to physical, social events that mimic behaviours of living beings; social A&S, body A&S • Body Constraints: Movement of one’s body position constrains another player’s action; body A&S, social A&S • Hidden Information: The information that can be hidden and revealed can foster emergent social play; social A&S, body A&S • *A&S = awareness and skills
  • 137. Example: Seamless Design • Design to reduce seams in the user experience • E.g. AR tracking failure, change in interaction mode • Paparazzi Game • Changes between AR tracking and accelerometer input. Yan Xu, et al., Pre-patterns for designing embodied interactions in handheld augmented reality games, Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality--Arts, Media, and Humanities, p.19-28, October 26-29, 2011
  • 138. Demo: Paparazzi Game • https://www.youtube.com/watch?v=MIGH5WGMnbs
  • 139. Example: Living Creatures • Virtual creatures should respond to real world events • e.g. player motion, wind, light, etc. • Creates the illusion that creatures are alive in the real world • Sony EyePet • Responds to player blowing on creature
  • 149. Google ARCore Interface Guidelines https://developers.google.com/ar/design
  • 150. ARCore Elements App • Mobile AR app demonstrating interface guidelines • Multiple Interface Guidelines • User interface • User environment • Object manipulation • Off-screen markers • Etc. • Test on Device • https://play.google.com/store/apps/details?id=com.google.ar.unity.ddelements
  • 152. ARKit Interface Guidelines • developer.apple.com/design/human-interface-guidelines/ios/system-capabilities/augmented-reality/
  • 153. Microsoft Mixed Reality Design Guidelines • https://docs.microsoft.com/en-us/windows/mixed-reality/design/design
  • 154. MRTK Interface Examples • Examples of UX Building Blocks • http://aka.ms/MRTK
  • 155. The Trouble with AR Design Guidelines • 1) Rapidly evolving best practices • Still a moving target, lots to learn about AR design • Slowly emerging design patterns, but these often change with OS updates • Already major differences between device platforms • 2) Challenges with scoping guidelines • Often too high level, like “keep the user safe and comfortable” • Or too application/device/vendor-specific • 3) Best guidelines come from learning by doing • Test your designs early and often, learn from your own “mistakes” • Mind differences between VR and AR, but less so between devices