Virtual Reality does remarkable things with our movements. We can embody game characters, immerse ourselves in movies, and even interact with other people. Of course, for this to work, VR software must be able to recognize motion in space and act accordingly. This is achieved with motion tracking technology, a key component of any VR project. Let’s take a look at how it works, examine common methods, and analyze existing cases.
What is motion tracking?
Motion tracking is the process of detecting, recording, and digitizing human movements so they can be used in software and hardware. Motion tracking enables users to interact with the environment, participate in games and movies, and modify their surroundings. In a nutshell, there’s no VR without motion tracking.
Development of the six degrees of freedom
To understand how VR is implemented, let’s examine the way people move in real life. At any given moment, you can shift forwards and backwards, up and down, and left to right – translation along three axes, referred to as three degrees of freedom (3DoF). Tracking these directions alone is not enough for full mobility: you also need to be able to rotate around each of these axes – to bend, lean, and change your angle relative to the surface. Only when all six degrees of freedom (6DoF) are tracked does movement become fully unrestricted, providing a truly immersive experience.
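To make the idea concrete, a 6DoF pose can be modeled as three translation components plus three rotation angles. The sketch below is a minimal, hypothetical Python representation – the class and field names are our own illustration, not taken from any VR SDK:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """Hypothetical 6-degrees-of-freedom pose: 3 translations + 3 rotations."""
    # Translation along the three axes (metres)
    x: float = 0.0  # left / right
    y: float = 0.0  # up / down
    z: float = 0.0  # forwards / backwards
    # Rotation around each axis (degrees)
    pitch: float = 0.0  # nodding up/down
    yaw: float = 0.0    # turning left/right
    roll: float = 0.0   # tilting the head sideways

    def translate(self, dx: float, dy: float, dz: float) -> None:
        self.x += dx
        self.y += dy
        self.z += dz

    def rotate(self, dpitch: float, dyaw: float, droll: float) -> None:
        self.pitch = (self.pitch + dpitch) % 360
        self.yaw = (self.yaw + dyaw) % 360
        self.roll = (self.roll + droll) % 360

head = Pose6DoF()
head.translate(0.0, 0.1, -0.5)  # step back and rise slightly
head.rotate(0.0, 90.0, 0.0)     # turn the head 90 degrees
```

A real headset would update such a pose many dozens of times per second from camera and sensor data.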
Reasons to build a virtual reality motion tracking system
VR can’t exist without motion tracking. It’s crucial to providing a fully immersive experience: users should be able to interact with reality and modify it. However, using motion tracking also comes with additional benefits.
- Depth perception: when all six degrees of freedom are tracked, the device can perceive the depth of movements, which is crucial for driving simulators, surgical training platforms, and many other applications;
- Light reflection: motion tracking allows the environment to adapt to the game, making a room lighter or darker, or adding new shades altogether;
- Understanding the user: motion tracking provides real-time feedback from the user, helping teams understand reactions to virtual stimuli and improve the product;
- Practical impact: the ability to change the settings and participate in the process is exactly what makes VR different from typical 2D and 3D content.
Motion tracking connects VR content to the user, directly contributing to the formation of new skills, habits, and reactions. Users are no longer idle content spectators – they become active participants in the process.
Methods of virtual reality motion tracking
Motion tracking is executed with dozens of methods. Luckily, this variety can be broken down into two main categories: optical and non-optical tracking.
- Optical tracking uses an imaging device (a smartphone camera, a digital camera, etc.) to detect and track bodily movements;
- Non-optical tracking uses sensors, sound-wave devices, or magnetic-field detection devices to get data on users’ movements.
Let’s take a closer look at both categories, examine their specifics, advantages, and problematic areas.
Optical motion tracking systems
The first step of optical tracking is to find optical markers that a camera and software will use to recognize a person’s movements. The identification starts from a known point: if a user is wearing a headset, it’s easy for the system to identify the eyes first. If the game requires a handheld controller, the software can pinpoint the location of palms and fingers.
Once the main markers are established, the system analyzes the image and, based on average measurements of the human body, decides where to place the next markers. Some systems also use AI and machine learning for image recognition: the algorithm analyzes the graphic input from the camera and pinpoints several dozen additional points.
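The placement step above can be sketched in code. The following is a hypothetical, simplified 2D Python example that positions secondary markers relative to a known head marker using made-up average body proportions – real systems calibrate these per user and work in three dimensions:

```python
# Hypothetical average offsets from the head marker, in metres.
# Real trackers calibrate these proportions for each user.
AVG_PROPORTIONS = {
    "neck":       (0.0, -0.12),
    "l_shoulder": (-0.20, -0.25),
    "r_shoulder": (0.20, -0.25),
}

def infer_markers(head_xy):
    """Place secondary markers relative to a known head-marker position."""
    hx, hy = head_xy
    return {name: (hx + dx, hy + dy)
            for name, (dx, dy) in AVG_PROPORTIONS.items()}

# Head marker detected at x = 0.0 m, height 1.7 m
skeleton = infer_markers((0.0, 1.7))
# l_shoulder lands near (-0.20, 1.45)
```

In practice, these estimated positions are only a starting guess that the image-recognition stage then refines frame by frame.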
These markers are designed to reflect light from LED devices. The computer receives input from the wearable LEDs and analyzes their color scheme and wavelength. By analyzing the image from the camera, the system can tell whether the room is dark, adjust the colors users see to match their actual surroundings, and so on. This is especially important for AR, where the app needs to detect users’ facial features and apply filters, effects, virtual makeup, and so on.
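Detecting an LED marker in a camera frame often comes down to color thresholding: find the pixels that match the LED’s known color, then take their centroid. Here is a deliberately tiny, hypothetical Python sketch – real trackers operate on high-resolution video and account for lighting, lens distortion, and noise:

```python
def find_led_marker(image, target_rgb, tol=30):
    """Return the (x, y) centroid of pixels close to the LED's colour.

    image: list of rows, each a list of (r, g, b) tuples.
    target_rgb: the LED's expected colour, e.g. (255, 0, 0) for red.
    tol: per-channel tolerance, since real LEDs never match exactly.
    """
    tr, tg, tb = target_rgb
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if abs(r - tr) <= tol and abs(g - tg) <= tol and abs(b - tb) <= tol:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # marker not visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Tiny 3x3 test frame with one reddish pixel at column 2, row 1
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][2] = (250, 10, 5)
print(find_led_marker(frame, (255, 0, 0)))  # -> (2.0, 1.0)
```

Tracking the centroid across frames gives the marker’s trajectory, which the system then maps onto the user’s body model.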
Issues with optical motion tracking
- Users need to be equipped with high-quality cameras to create a high-resolution input.
- Light-sensitive input requires users to be connected to a power supply and wear LED lights.
- Marker recognition can be imprecise, in which case user experience will be different from what was originally intended.
Microsoft Kinect, the sensor package for the Xbox 360, is a notable exception: its camera doesn’t use markers to enable motion detection. As long as the sensor can see the subject, the console can trace their movements without any handheld controller.
Non-optical motion tracking systems
Some VR devices require users to wear specialized equipment that contains electromechanical sensors. Magnetometers, accelerometers, and gyroscopes are used to detect motion by Oculus Touch, PlayStation Move, SteamVR, and many other popular solutions.
Types of sensors used in non-optical tracking
- Magnetometers determine users’ orientation relative to the Earth’s magnetic field – like a compass, they detect North and South and help define coordinates.
- Gyroscopes detect a person’s rotation through a full 360 degrees.
- Accelerometers measure the acceleration of a person’s movement along each of the three axes and detect changes in its intensity.
Non-optical methods can use all three of these sensors and be combined with optical methods.
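A common way to combine these sensors is a complementary filter, which fuses the gyroscope’s fast but slowly drifting readings with the accelerometer’s noisy but drift-free tilt estimate. The Python sketch below is a simplified, hypothetical illustration – the blending coefficient and sample readings are made up:

```python
def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer data into one tilt estimate.

    gyro_rate: angular velocity from the gyroscope (deg/s) -
        precise over short intervals, but drifts over time.
    accel_angle: tilt derived from the accelerometer's gravity
        vector (deg) - noisy, but drift-free.
    dt: time since the last sample (seconds).
    """
    # Trust the integrated gyro mostly; nudge the estimate towards
    # the accelerometer reading to cancel long-term drift.
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# One 10 ms step: the gyro reports 10 deg/s, the accelerometer reads 0.2 deg
angle = complementary_filter(0.0, 10.0, 0.2, dt=0.01)  # roughly 0.102 deg
```

Commercial headsets and controllers run far more sophisticated fusion (e.g. Kalman-style filters over full 3D orientation), but the principle of blending a drifting fast sensor with a stable slow one is the same.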
Use Cases of motion tracking systems in virtual reality
All VR-powered devices and software use motion tracking to enable smooth interactions with the environment. However, there are startups and companies that focus on motion tracking, because it’s crucial for their main product and target audience. Let’s take a look at such examples to see how motion tracking, even on its own, can create a basis for a successful startup.
iKinema and Apple
iKinema is a motion-tracking company that specializes in software for motion detection, processing, and digitization. The company focuses on implementing seamless motion tracking in videos and games. On its YouTube channel, the company posts videos of game characters controlled by users and moving in sync with them in real time.
The startup wasn’t widely known or commercially successful until it was revealed that the company had been purchased by Apple. Apple has been a prominent investor in Virtual Reality and motion tracking for a while, so it’s likely that the corporation will use iKinema’s resources for its in-house projects.
The takeaway here is that major companies take a huge interest in promising motion tracking startups. You don’t have to create a video game or a 360-degree film to achieve commercial success with this innovation. The technology alone can be enough to drive substantial profit to a business.
Wrnch and Artificial Intelligence
Obviously, Virtual Reality and motion tracking can be combined with other emerging technologies – in particular, Artificial Intelligence. Wrnch is a Canada-based startup that offers companies motion-recognition technology powered by AI. Similarly to iKinema, the company doesn’t offer a ready-made VR product, but rather an instrument for content makers who want to deepen users’ immersion in their work.
The startup uses optical motion-tracking technology: the system is connected to a camera, and as soon as there is input (a photo, a video, or a real-time stream), the software maps facial and bodily features with visual markers, while AI-powered facial recognition increases detection precision.
If your company explores other innovative technologies, you can use this expertise to improve the efficiency of your motion tracking algorithms.
PlayStation VR
Users can connect a VR headset to their PS4 to get a VR experience from their favorite console. The panoramic view provides full visibility of the gameplay: just like in real life, users can rotate their heads at different angles, and the system adjusts the visual output. With an OLED display and 120 frames per second, users are fully immersed in the content.
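The head-rotation adjustment described above essentially rotates the rendered view direction by the tracked yaw angle. Here is a minimal, hypothetical 2D Python sketch of the idea – real headsets use full 3D rotation (typically quaternions), not a single angle:

```python
import math

def rotate_view(view_xz, head_yaw_deg):
    """Rotate the forward view vector by the tracked head yaw.

    view_xz: (x, z) components of the current forward direction
        on the horizontal plane.
    head_yaw_deg: yaw angle reported by head tracking, in degrees.
    """
    a = math.radians(head_yaw_deg)
    x, z = view_xz
    # Standard 2D rotation of the direction vector
    return (x * math.cos(a) - z * math.sin(a),
            x * math.sin(a) + z * math.cos(a))

forward = (0.0, 1.0)                 # initially facing along +z
turned = rotate_view(forward, 90.0)  # head turned 90 degrees
```

Each frame, the renderer applies the latest tracked orientation so the virtual camera follows the user’s head with minimal latency.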
PlayStation didn’t limit the experience to motion tracking and enhanced visual content: a microphone is also embedded in the headset, allowing users to communicate as if they were talking to fellow players in real life.
Motion tracking technologies are clearly one of the most promising directions for investment and development. During this decade, we will likely see a major transition from the traditional, passive ways of consuming content to new methods based on active participation. Brands, technology companies, content creators, educators, and non-profit organizations can all benefit from implementing motion tracking in their content.
You can create realistic games, videos, movies, learning programs, and advertising to communicate your ideas in a powerful way. Even companies that don’t specialize in content making can develop motion-tracking technology as a standalone product and offer it to entertainment and educational businesses. Motion tracking will keep evolving, but one thing is already clear: it’s a highly promising technology, and its opportunities are all but limitless.