Lecture 9 from a course on Mobile Based Augmented Reality Development taught by Mark Billinghurst and Zi Siang See on November 29th and 30th 2015 at Johor Bahru in Malaysia. This lecture describes principles for effective Interface Design for Mobile AR applications. Look for the other 9 lectures in the course.
Mobile AR lecture 9 - Mobile AR Interface Design
1. LECTURE 9:
DESIGNING MOBILE AR
INTERFACES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
Zi Siang See
zisiang@reina.com.my
November 29th-30th 2015
Mobile-Based Augmented Reality Development
3. Handheld HCI
• Consider your user
• Follow good HCI principles
• Adapt HCI guidelines for handhelds
• Design to device constraints
• Micro-Interactions
• Design Patterns
4. Consider Your User
• Consider context of user
• Physical, social, emotional, cognitive, etc
• Mobile Phone AR User
• Probably Mobile
• One hand interaction
• Short application use
• Need to be able to multitask
• Use in outdoor or indoor environment
• Want to enhance interaction with real world
5. Follow Good HCI Principles
• Provide good conceptual model/Metaphor
• customers want to understand how UI works
• Make things visible
• if object has function, interface should show it
• Map interface controls to customer's model
• infix -vs- postfix calculator -- whose model?
• Provide feedback
• what you see is what you get!
6. Adapting Existing Guidelines
• Mobile Phone AR
• Phone HCI Guidelines
• Mobile HCI Guidelines
• HMD Based AR
• 3D User Interface Guidelines
• VR Interface Guidelines
• Desktop AR
• Desktop UI Guidelines
7. iPhone Guidelines
• Make it obvious how to use your content.
• Avoid clutter, unused blank space, and busy
backgrounds.
• Minimize required user input.
• Express essential information succinctly.
• Provide a fingertip-sized target area for all links and
controls.
• Avoid unnecessary interactivity.
• Provide feedback when necessary.
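The "fingertip-sized target area" guideline above can be made concrete with a small sketch. This is an illustrative helper (the class and method names are my own, not from the lecture) that converts a touch-target size given in iOS points, which Apple commonly recommends to be at least 44pt and defines against a 163 dpi baseline, into physical pixels for a given screen density:

```java
// Sketch only: minimum touch-target size in pixels from a point size
// and screen density. The 44pt figure and 163 dpi baseline follow
// Apple's commonly cited guidance; all names here are illustrative.
class TouchTarget {
    // iOS points are defined relative to a 163 dpi baseline display.
    static final double BASELINE_DPI = 163.0;

    /** Pixels needed so the target keeps its physical size on screen. */
    static int minTargetPx(double points, double screenDpi) {
        return (int) Math.ceil(points * screenDpi / BASELINE_DPI);
    }

    public static void main(String[] args) {
        // A 44pt target on a 326 dpi Retina display needs 88 px.
        System.out.println(minTargetPx(44, 326));
    }
}
```

The same arithmetic applies on Android, where densities are expressed against a 160 dpi baseline instead.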
9. Applying Principles to Mobile AR
• Clean
• Large Video View
• Large Icons
• Text Overlay
• Feedback
10. AR vs. Non-AR Design
• Design Guidelines
• Design for 3D graphics + Interaction
• Consider elements of physical world
• Support implicit interaction
Characteristics  | Non-AR Interfaces      | AR Interfaces
Object Graphics  | Mainly 2D              | Mainly 3D
Object Types     | Mainly virtual objects | Both virtual and physical objects
Object Behaviors | Mainly passive objects | Both passive and active objects
Communication    | Mainly simple          | Mainly complex
HCI Methods      | Mainly explicit        | Both explicit and implicit
12. Design to Device Constraints
• Understand the platforms used and design for limitations
• Hardware, software platforms
• E.g. a handheld AR game with visual tracking:
• Use large screen icons
• Consider screen reflectivity
• Support one-hand interaction
• Consider the natural viewing angle
• Do not tire users out physically
• Do not encourage fast actions
• Keep at least one tracking surface in view
Art of Defense Game
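The constraint "keep at least one tracking surface in view" can be enforced at runtime. This is a minimal sketch with hypothetical names (not from the lecture or any specific AR SDK) of the decision an app might make each frame, based on how many tracking surfaces the camera currently sees:

```java
// Sketch only (hypothetical names): choose player feedback when all
// tracking surfaces drift out of the camera view, following the
// "keep at least one tracking surface in view" guideline above.
class TrackingMonitor {
    /** Returns a hint for the player, or null while tracking is healthy. */
    static String hintFor(int visibleSurfaces) {
        if (visibleSurfaces == 0) {
            // A real app would also vibrate or play a sound here, since
            // the user may not be looking at the screen at this moment.
            return "Point the camera back at the game board";
        }
        return null;
    }
}
```

Pairing the on-screen hint with vibration or sound matters because, as the next slide notes, users look away from the screen frequently during mobile use.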
13. Handheld AR Constraints/Affordances
• Camera and screen are linked
• Fast motions a problem when looking at screen
• Intuitive “navigation”
• Phone in hand
• Two handed activities: awkward or intuitive
• Extended periods of holding phone tiring
• Awareness of surrounding environment
• Small screen
• Extended periods of looking at screen tiring
• In general, small awkward platform
• Vibration, sound
• Can provide feedback when looking elsewhere
• Networking - Bluetooth, 802.11
• Collaboration possible
• Guaranteed minimum collection of buttons
• Sensors often available
• GPS, camera, accelerometer, compass, etc.
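The sensors listed above are enough to sketch a simple points-of-interest AR view: GPS gives the user's position, the compass gives heading, and a bearing calculation places each label on screen. The field-of-view and screen-width values below are illustrative assumptions:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, degrees in [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, heading_deg, fov_deg=60.0, screen_w=640):
    """Horizontal pixel position of a POI label, or None if off screen."""
    # Signed angle between camera direction and POI, in (-180, 180].
    diff = (poi_bearing - heading_deg + 180) % 360 - 180
    if abs(diff) > fov_deg / 2:
        return None          # POI outside the camera's field of view
    return round((diff / fov_deg + 0.5) * screen_w)

# A POI due east of the user, phone facing east: label lands mid-screen.
x = screen_x(bearing_deg(0.0, 0.0, 0.0, 1.0), heading_deg=90.0)
```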
15. Time Looking at Screen
Oulasvirta, A. (2005). The fragmentation of attention in mobile
interaction, and what to do with it. interactions, 12(6), 16-18.
17. Design for MicroInteractions
▪ Design interactions lasting less than a few seconds
• Tiny bursts of interaction
• One task per interaction
• One input per interaction
▪ Benefits
• Use limited input
• Minimize interruptions
• Reduce attention fragmentation
18. Design Patterns
“Each pattern describes a problem which occurs
over and over again in our environment, and then
describes the core of the solution to that problem in
such a way that you can use this solution a million
times over, without ever doing it the same way twice.”
– Christopher Alexander et al.
C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.
19. Handheld AR Design Patterns
(A&S = awareness and skills)
Title | Meaning | Embodied Skills
Device Metaphors | Using metaphor to suggest available player actions | Body A&S, naïve physics
Control Mapping | Intuitive mapping between physical and digital objects | Body A&S, naïve physics
Seamful Design | Making sense of and integrating the technological seams through game design | Body A&S
World Consistency | Whether the laws and rules of the physical world hold in the digital world | Naïve physics, environmental A&S
Landmarks | Reinforcing the connection between digital and physical space through landmarks | Environmental A&S
Personal Presence | The way a player is represented in the game decides how much they feel like living in the digital game world | Environmental A&S, naïve physics
Living Creatures | Game characters that respond to physical and social events, mimicking the behaviours of living beings | Social A&S, body A&S
Body Constraints | Movement of one player's body position constrains another player's action | Body A&S, social A&S
Hidden Information | Information that can be hidden and revealed can foster emergent social play | Social A&S, body A&S
20. Example: Seamful Design
• Design to reduce seams in the user experience
• e.g. AR tracking failure, change in interaction mode
• Paparazzi Game
• Change between AR tracking to accelerometer input
Xu, Y., et al. Pre-patterns for designing embodied interactions in handheld augmented reality games.
Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media,
and Humanities, pp. 19-28, October 26-29, 2011.
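The Paparazzi example can be sketched as a small input-mode state machine: when visual tracking drops out for longer than a grace period, input falls back to the accelerometer instead of freezing. The mode names and grace-period length here are illustrative assumptions:

```python
class InputModeManager:
    """Fall back from AR tracking to motion input when tracking is lost,
    surfacing the technological seam as a deliberate mode change."""

    def __init__(self, grace_frames: int = 10):
        self.grace_frames = grace_frames  # frames of lost tracking tolerated
        self.lost_count = 0
        self.mode = "ar_tracking"

    def update(self, tracking_ok: bool) -> str:
        if tracking_ok:
            self.lost_count = 0
            self.mode = "ar_tracking"
        else:
            self.lost_count += 1
            if self.lost_count >= self.grace_frames:
                self.mode = "accelerometer"  # seamful fallback
        return self.mode
```

Called once per frame, a brief tracking glitch is absorbed silently, while a sustained loss triggers a visible, explainable change of interaction mode.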
21. Example: Living Creatures
• Virtual creatures should respond to real world events
• e.g. player motion, wind, light, etc.
• Creates illusion creatures are alive in the real world
• Sony EyePet
• Responds to player blowing on creature
23. AR Interaction
• Designing AR System = Interface Design
• Using different input and output technologies
• Objective is a high quality of user experience
• Ease of use and learning
• Performance and satisfaction
25. Design of Objects
• Objects
• Purposely built – affordances
• “Found” – repurposed
• Existing – already in use in the marketplace
• Make affordances obvious (Norman)
• Object affordances visible
• Give feedback
• Provide constraints
• Use natural mapping
• Use good cognitive model
26. Affordances: to give a clue
• Refers to an attribute of an object that allows people to
know how to use it
• e.g. a button invites pushing, a door handle affords pulling
• Norman (1988) used the term to discuss the design of
everyday objects
• It has since been widely popularised in interaction design to
discuss how to design interface objects
• e.g. scrollbars afford moving up and down, icons afford clicking
28. ‘Affordance’ and Interface Design?
• Interfaces are virtual and do not have affordances
like physical objects
• Norman argues it does not make sense to talk
about interfaces in terms of ‘real’ affordances
• Instead interfaces are better conceptualized as
‘perceived’ affordances
• Learned conventions of arbitrary mappings between action
and effect at the interface
• Some mappings are better than others
30. • AR is a mixture of physical and virtual affordances
• Physical
• Tangible controllers and objects
• Virtual
• Virtual graphics and audio
31. Affordance Led Design
• Make affordances perceivable
• Provide visual, haptic, tactile, auditory cues
• Affordance Led Usability
• Give feedback
• Provide constraints
• Use natural mapping
• Use good cognitive model
32. Example: AR Chemistry
• Tangible AR chemistry education (Fjeld)
Fjeld, M., Juchli, P., and Voegtli, B. M. 2003. Chemistry education: A tangible
interaction approach. Proceedings of INTERACT 2003, September 1-5, 2003,
Zurich, Switzerland.
35. Case Study 1: AR Lens
• Physical Components
• Lens handle
• Virtual lens attached to real object
• Display Elements
• Lens view
• Reveal layers in dataset
• Interaction Metaphor
• Physically holding lens
36. 3D AR Lenses: Model Viewer
• Displays models made up of multiple parts
• Each part can be shown or hidden through the lens
• Allows the user to peer inside the model
• Maintains focus + context
38. Case Study 2: LevelHead
• Physical Components
• Real blocks
• Display Elements
• Virtual person and rooms
• Interaction Metaphor
• Blocks are rooms
40. Handheld Interface Metaphors
• Tangible AR Lens Viewing
• Look through screen into AR scene
• Interact with screen to interact with
AR content
• e.g. Invisible Train
• Tangible AR Lens Manipulation
• Select AR object and attach to device
• Use the motion of the device as input
• e.g. AR Lego
42. Space vs. Time - Multiplexed
• Space-multiplexed
• Many devices each with one function
• Quicker to use and more intuitive, but creates clutter
• e.g. a real toolbox
• Time-multiplexed
• One device with many functions
• Space efficient
• e.g. the mouse
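The trade-off can be illustrated in code: a space-multiplexed setup dedicates one controller per function, while a time-multiplexed device cycles one controller through several functions, paying a mode-switch cost. All names here are illustrative:

```python
# Space-multiplexed: one dedicated physical controller per function,
# like tools laid out in a real toolbox - any tool is one grab away.
space_tools = {"paint": "brush prop", "erase": "eraser prop", "move": "handle prop"}

class TimeMultiplexedTool:
    """One device, many functions: the user cycles modes over time."""

    def __init__(self, functions):
        self.functions = list(functions)
        self.index = 0

    @property
    def current(self):
        return self.functions[self.index]

    def next_mode(self):
        # The explicit mode switch is the time cost that
        # space-multiplexing avoids.
        self.index = (self.index + 1) % len(self.functions)
        return self.current

mouse_like = TimeMultiplexedTool(["paint", "erase", "move"])
```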
47. Tangible AR Design Principles
• Tangible AR Interfaces use TUI principles
• Physical controllers for moving virtual content
• Support for spatial 3D interaction techniques
• Time and space multiplexed interaction
• Support for multi-handed interaction
• Match object affordances to task requirements
• Support parallel activity with multiple objects
• Allow collaboration between multiple users
49. AR and Perception
• Creating the illusion that virtual images are seamlessly part of the real world
• Must match real and virtual cues
• Depth, occlusion, lighting, shadows, etc.
50. AR as a Perception Problem
• The goal of AR is to fool the human senses – to create the illusion that real and virtual are merged
• Depth
• Size
• Occlusion
• Shadows
• Relative motion
• etc.
53. Use the Following Depth Cues
• Movement parallax
• Icon/object size (for close objects)
• Linear perspective
• e.g. add side perspective bars
• Overlapping (occlusion)
• Works if the objects are big enough
• Shading and shadows
• Depends on the available computation
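The size and parallax cues both follow from pinhole projection: on-screen size and per-frame parallax shift fall off with distance. A minimal sketch, with focal length expressed in pixels:

```python
def projected_size(real_size_m, distance_m, focal_px):
    """Pinhole projection: on-screen size shrinks with distance."""
    return focal_px * real_size_m / distance_m

def parallax_shift(camera_move_m, distance_m, focal_px):
    """Movement parallax: for the same sideways camera translation,
    nearby objects shift further across the screen than distant ones."""
    return focal_px * camera_move_m / distance_m

near_px = projected_size(0.5, 1.0, 800)   # a 0.5 m object 1 m away: 400 px
far_px = projected_size(0.5, 4.0, 800)    # the same object 4 m away: 100 px
```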
57. Information Presentation
• Amount of information
• Clutter, complexity
• Representation of information
• Navigation cues, POI representation
• Placement of information
• Head, body, or world stabilized
• View combination
• Multiple views
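One simple way to control the amount of information is to cap how many labels are drawn at once, e.g. keeping only the nearest points of interest. A minimal sketch with made-up POI data:

```python
def declutter(pois, max_labels=5):
    """Reduce clutter by showing only the nearest few points of interest."""
    return sorted(pois, key=lambda p: p["distance_m"])[:max_labels]

pois = [
    {"name": "cafe", "distance_m": 120},
    {"name": "station", "distance_m": 40},
    {"name": "museum", "distance_m": 300},
    {"name": "kiosk", "distance_m": 15},
]
visible = declutter(pois, max_labels=2)  # kiosk (15 m) and station (40 m)
```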