Tracking Systems in AR
20 Flashcards
GPS
Uses satellite-based navigation to provide location and time information. In AR, it enables geospatial experiences tying virtual content to real-world locations.
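As an illustration of the geospatial idea, the sketch below uses the haversine formula to decide whether a device's GPS fix is close enough to a geo-anchored content location to display it; the coordinates and the 30 m radius are hypothetical.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Show geo-anchored AR content only when the device is within 30 m of the anchor.
anchor = (48.8584, 2.2945)   # hypothetical content location (lat, lon)
device = (48.8585, 2.2947)   # hypothetical current GPS fix
print(haversine_m(*device, *anchor) < 30.0)  # True: content should be displayed
```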
IMU (Inertial Measurement Unit)
Combines accelerometers and gyroscopes to detect motion and orientation. In AR, it tracks device movement to align virtual content with the user’s perspective.
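A common way to combine the two sensors is a complementary filter: integrate the gyroscope for fast response and blend in the accelerometer's gravity-based tilt estimate to correct drift. A minimal single-axis sketch (the sensor readings and the 0.98 blend factor are illustrative):

```python
import math

def complementary_filter(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse a gyroscope rate (rad/s) with an accelerometer tilt estimate for pitch.
    The gyro term tracks fast motion; the accelerometer term corrects long-term drift."""
    gyro_pitch = pitch + gyro_rate * dt         # integrate angular velocity over the timestep
    accel_pitch = math.atan2(accel_y, accel_z)  # tilt implied by the gravity vector
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Example: one 10 ms update with a small rotation and the device held nearly level.
pitch = 0.0
pitch = complementary_filter(pitch, gyro_rate=0.1, accel_y=0.05, accel_z=9.81, dt=0.01)
print(pitch)
```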
Visual Odometry
Estimates a device's position relative to its surroundings using camera images. In AR, it helps maintain accurate positioning of virtual objects as the user moves.
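A minimal monocular sketch of the idea, assuming OpenCV (opencv-python) and a calibrated camera matrix: match ORB features between consecutive frames, estimate the essential matrix with RANSAC, and recover the relative rotation and translation.

```python
import cv2
import numpy as np

def estimate_relative_pose(prev_gray, curr_gray, camera_matrix):
    """Estimate the camera's relative rotation and (unit-scale) translation
    between two consecutive grayscale frames using sparse feature matches."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # The essential matrix encodes the relative camera motion; RANSAC rejects bad matches.
    E, mask = cv2.findEssentialMat(pts1, pts2, camera_matrix, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, camera_matrix)
    return R, t  # translation is only known up to scale from a single camera
```

Because a single camera cannot recover metric scale, this is typically fused with IMU data (visual-inertial odometry) in practice.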
SLAM (Simultaneous Localization and Mapping)
Creates a map of the environment while tracking the user's location within it. In AR, it enables the placement and tracking of virtual objects in complex scenes.
Marker-based Tracking
Recognizes specific visual markers to determine the position and orientation of a device. In AR, it triggers the display of virtual content when markers are in view.
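A sketch of the usual pipeline using OpenCV's ArUco support (opencv-contrib-python is assumed, and the exact aruco function names vary between OpenCV versions): detect marker corners in a camera frame, then solve for each marker's pose so virtual content can be anchored to it. The dictionary choice and marker size are illustrative.

```python
import cv2
import numpy as np

# NOTE: assumes opencv-contrib-python; newer OpenCV releases (4.7+) expose the
# same functionality through the cv2.aruco.ArucoDetector class instead.
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
MARKER_SIZE = 0.05  # marker side length in metres (hypothetical)

def detect_marker_poses(gray, camera_matrix, dist_coeffs):
    """Detect ArUco markers and recover each marker's pose relative to the camera."""
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)
    if ids is None:
        return []
    # 3D corner positions of a square marker centred at its own origin.
    half = MARKER_SIZE / 2
    object_points = np.array([[-half, half, 0], [half, half, 0],
                              [half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    poses = []
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        ok, rvec, tvec = cv2.solvePnP(object_points, marker_corners.reshape(4, 2),
                                      camera_matrix, dist_coeffs)
        if ok:
            poses.append((int(marker_id), rvec, tvec))  # where to anchor virtual content
    return poses
```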
Markerless Tracking
Uses features of the environment, rather than predefined markers, to track position. In AR, it allows for more dynamic and flexible placement of AR content.
RFID (Radio-Frequency Identification)
Uses electromagnetic fields to automatically identify and track tags attached to objects. In AR, RFID can link real-world objects with virtual data overlays.
Beacon Tracking
Utilizes small Bluetooth radio transmitters for location-based interactions. In AR, beacons can trigger specific virtual content when the user is nearby.
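Proximity to a beacon is usually estimated from its received signal strength with a log-distance path-loss model; the calibrated 1 m power and path-loss exponent below are typical but illustrative values.

```python
def estimate_distance(rssi: float, measured_power: float = -59.0, path_loss_n: float = 2.0) -> float:
    """Log-distance path-loss model: rough distance in metres from a beacon's RSSI.
    measured_power is the calibrated RSSI at 1 m; path_loss_n depends on the environment."""
    return 10 ** ((measured_power - rssi) / (10 * path_loss_n))

# Example: an RSSI of -75 dBm with the defaults gives roughly 6.3 m,
# close enough (or not) to trigger location-specific AR content.
print(round(estimate_distance(-75.0), 1))
```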
Time of Flight (ToF) Sensors
Measures distance by calculating the time it takes light to travel from the sensor to an object and back. In AR, ToF sensors assist in creating depth maps for better spatial understanding.
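The underlying arithmetic is simple: distance is half of the round-trip time multiplied by the speed of light. A minimal sketch with an illustrative timing value:

```python
# Time-of-flight distance: light travels to the object and back,
# so the one-way distance is half of speed_of_light * round_trip_time.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a 10-nanosecond round trip corresponds to roughly 1.5 m.
print(tof_distance(10e-9))
```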
Structured-light 3D Scanners
Projects a pattern of light onto a scene and measures distortions to create a 3D model. In AR, it enhances the precision of virtual object placement in the real world.
LIDAR (Light Detection and Ranging)
Uses laser pulses to measure distances to a target. In AR, LIDAR contributes to creating detailed and accurate environmental meshes for real-world interactions.
6DoF (Six Degrees of Freedom)
Tracks both translational and rotational motions in 3D space. In AR, 6DoF enables the user to move around and view virtual content from different angles, mimicking real-world interactions.
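A 6DoF pose is commonly packed into a 4x4 homogeneous transform: a 3x3 rotation for the three rotational degrees of freedom and a 3-vector translation for the three translational ones. A small NumPy sketch (the angle and distances are illustrative):

```python
import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation (3 rotational DoF)
    and a 3-vector translation (3 translational DoF)."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

# Example: rotate 90 degrees about the vertical (y) axis and step 0.5 m forward.
theta = np.pi / 2
rot_y = np.array([[np.cos(theta), 0, np.sin(theta)],
                  [0, 1, 0],
                  [-np.sin(theta), 0, np.cos(theta)]])
device_pose = pose_matrix(rot_y, np.array([0.0, 0.0, 0.5]))

# Transform a virtual anchor point from device space into world space.
anchor_device = np.array([1.0, 0.0, 0.0, 1.0])  # homogeneous coordinates
print(device_pose @ anchor_device)
```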
Electro-Optical Tracking
Combines electrical and optical systems to track the position of objects. In AR, it is used especially in large spaces such as warehouses to track both user and object positions for an enhanced AR experience.
Optical Flow Sensors
Measures the motion of objects across the camera’s field of view. In AR, it contributes to understanding the movement of AR devices relative to the real world for smoother experiences.
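As a rough sketch of the idea, dense optical flow between two frames can be averaged to estimate how the view shifted; the example below uses OpenCV's Farneback implementation on a synthetic frame shifted by three pixels.

```python
import cv2
import numpy as np

def mean_image_motion(prev_gray, curr_gray):
    """Dense optical flow between two grayscale frames; the mean flow vector is a
    rough per-frame estimate of how the image content shifted, in pixels."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(flow[..., 0].mean()), float(flow[..., 1].mean())

# Synthetic test: a textured frame shifted 3 pixels to the right.
rng = np.random.default_rng(0)
frame = cv2.GaussianBlur((rng.random((120, 160)) * 255).astype(np.uint8), (7, 7), 0)
shifted = np.roll(frame, 3, axis=1)
print(mean_image_motion(frame, shifted))  # mean x-flow should be roughly 3
```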
Infrared Sensors
Detects infrared light to capture depth information and track object movement. In AR, infrared sensors are crucial for depth sensing and real-time motion tracking.
Machine Vision
Employs computer algorithms to interpret image data. In AR, machine vision is used to recognize and track objects and scenes for interaction.
UWB (Ultra-Wideband)
Utilizes radio waves to determine precise device location with centimeter accuracy. In AR, UWB is used for precise indoor navigation and spatial interaction.
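Indoor positioning with UWB typically works by ranging to several fixed anchors and trilaterating. A small least-squares sketch in 2D, with hypothetical anchor positions:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares 2D position from ranges to known UWB anchors.
    Subtracting the first anchor's range equation from the rest linearises the problem."""
    anchors = np.asarray(anchors, dtype=float)
    distances = np.asarray(distances, dtype=float)
    x0, d0 = anchors[0], distances[0]
    A = 2 * (anchors[1:] - x0)
    b = (d0 ** 2 - distances[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(x0 ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Three hypothetical anchors in a room and ranges to a device at roughly (2, 3).
anchors = [(0.0, 0.0), (6.0, 0.0), (0.0, 6.0)]
true_pos = np.array([2.0, 3.0])
distances = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, distances))  # ~[2. 3.]
```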
Computer Vision-based Tracking
Applies algorithms to analyze video and images to recognize forms, gestures, and scenery. In AR, it is essential for creating responsive and immersive AR experiences.
Haptic Feedback
Uses touch feedback to simulate physical interaction with virtual objects. In AR, it gives users the sensation of touching virtual objects, enhancing immersion.
Fiducial Markers
Specialized reference symbols placed in a scene so tracking systems can accurately determine the camera's position and perspective. In AR, they assist in the stable and accurate placement of virtual content.