Intel RealSense - yesterday, today... tomorrow?
At the end of August, Intel CEO Pat Gelsinger said that as part of the company's restructuring, it would abandon the development of non-core technologies, including RealSense. This prompted numerous media outlets to "bury" the technology, and we were briefly among them, but we quickly came to our senses. Having dug deeper and questioned Intel experts, we are ready to offer a different reading of what is happening.
This post is the story of RealSense yesterday and today, as well as of its plans for survival in an era of change.
To show the context of what is happening, we will quote Gelsinger’s words in their entirety.
I want to invest in the areas that support the company's core businesses: data centers, client computers, graphics, chip manufacturing and autonomous driving. If a technology fits one of these areas, I will invest in it. If not, I won't. As for RealSense, a number of good results have been achieved there that we could use, but they do not fit the main directions.
On the one hand, that is stated very clearly. On the other, there is a discrepancy: within the company, "autonomous driving" means not only self-driving cars (which are handled by Mobileye, an Intel company) but also robotics, where RealSense cameras have long been used successfully. So does the technology still have a place in the company's new value system after all?
However, let’s not get ahead of ourselves. Let’s start from the beginning.
Intel RealSense – yesterday
The first mention of the technology dates back to October 2012, when the first beta of the Intel Perceptual Computing SDK appeared on the Intel website. And already in January 2013, practically alongside the second beta, a blog article about it appeared. We (the blog) were lucky that the technology was developed in part by the Moscow Intel team, so the sources of knowledge were very close at hand.
Description of Intel Perceptual Computing SDK
What is perceptual computing? It is a technology that lets the user interact with a mobile device through voice, gestures and facial expressions, that is, in ways that are more natural for a human.
The first version of the PerC SDK supported the following features:
- Gesture recognition, hand and finger positions, palm openness level
- Face position tracking, identification of control points (lips, nose, eyes), identification (“recognition”) of the face
- Tracking the position of 2- and 3-dimensional objects
Initially Intel used third-party hardware, such as Creative cameras, and later switched to its own.
Almost immediately the company made PerC/RealSense one of its priorities, and as early as 2013 it held a developer competition with an impressive prize fund of one million dollars.
The principle of operation of RealSense cameras
3D cameras work by building a depth map, obtained using one of three main technologies: Time of Flight, coded light and stereo vision. An RGB image enriched with depth data can then be used to manipulate and analyze the depicted objects: for example, segmenting out the background, shifting the focus point, measuring volumes, or building 3D models.
RGB photo and depth map for it
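To make the stereo-vision variant concrete: a point's horizontal shift (disparity) between the left and right images is inversely proportional to its depth, Z = f·B/d. Here is a minimal NumPy sketch of that conversion; the focal length and baseline values are made up for illustration and are not the specifications of any RealSense model:

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (meters): Z = f * B / d."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)  # zero disparity means "infinitely far"
    np.divide(focal_px * baseline_m, d, out=depth, where=d > 0)
    return depth

f = 640.0   # focal length in pixels (assumed)
B = 0.05    # baseline between the two cameras in meters (assumed)
disp = np.array([[64.0, 32.0],
                 [16.0,  0.0]])
print(disparity_to_depth(disp, f, B))
# nearest point: 640 * 0.05 / 64 = 0.5 m; zero disparity stays at infinity
```

The same relationship explains why a wider baseline improves long-range accuracy: the farther apart the two lenses, the larger the disparity for a given depth.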
2014 was marked by the rebranding of Perceptual Computing as the modern RealSense, another competition for developers, and the start of a series of hackathons, meetups and other events that would be held, visibly and invisibly, over the following years. A great deal of effort and money really was invested in promoting the technology.
In 2015, two RealSense cameras appeared that can now be called "classic": the rear-facing R200 and the front-facing F200. It was, perhaps, the golden age of RealSense: the SDK was vigorously updated, new drivers were released, hardware prototypes were demonstrated, and there were plans to build RealSense into smartphones and laptops. We produced developer tutorials on an industrial scale, and one of the pioneers gave a good account of the nuances of programming for RealSense at that time.
Intel RealSense – today
If we take "today" loosely to mean the past year or two, this period also cannot be called stagnant or a crisis for RealSense; as they say, nothing foreshadowed trouble. Yes, there were fewer events, but that is now typical not only of RealSense. New cameras continued to come out regularly:
- Intel RealSense D435i, with an integrated IMU (Inertial Measurement Unit) for use on moving devices.
- Intel RealSense Tracking Camera T265, a fundamentally new device without an IR system, equipped with two ultra-wide-angle fisheye lenses, a 6-axis IMU, and a specialized Intel Movidius Myriad 2 VPU (Visual Processing Unit) as its central processor.
- Intel RealSense LiDAR L515, the first lidar in the RealSense line, using MEMS mirror scanning technology developed by Intel.
- Intel RealSense Depth Camera D455, a classic depth camera, improved and refined, released just six months ago.
As you can see, the current Intel RealSense portfolio has cameras for every occasion, differing in their principles of operation.
Moreover, just a couple of months ago, in August 2021, Xiaomi released CyberDog, a four-legged open-source platform that uses an Intel RealSense D450 as its visual organ.
Intel RealSense – tomorrow?
What has changed at the junction between today and tomorrow? Let’s express our personal opinion.
The main purpose of RealSense was to add a third dimension to everyday human-computer interaction and to the creation and consumption of 3D content. Attempts to bring a digital 3D experience to the masses have been made by various companies in various fields (VR helmets, for example) since the last century. But all of them, alas, fell short of great success: after a bright takeoff, within a couple of years they at best settle into a narrow niche and at worst disappear altogether (in fact, they hibernate, to be reborn later on a new round of technological development). A few examples: do you often use the Windows Aero Glass 3D interface? When did you last watch a 3D movie on TV or in a theater? Have you heard that Microsoft recently decided to remove the 3D Objects folder from Windows 10 Explorer? Did you ever keep anything in that folder?
3D model vikky13, created in 2015 with the help of an Intel RealSense camera by Itseez3D.
Why does this happen? The reasons lie both in human nature and in the level of the technology. The imperfection of created and recognized 3D causes users physical and psychological discomfort, and there is no real mass need that would justify buying and using the appropriate equipment. One could say that the third dimension of users' computer lives is, alas, being cut off by Occam's razor.
And on the other hand, even where depth data could add real value to the everyday user experience, technologies like RealSense are being displaced by software solutions, including ones created by Intel. Recent advances in deep learning and computer vision make it possible to effectively "create depth" from images taken by ordinary two-dimensional cameras. That is, it is now neural networks, rather than 3D cameras, that are widely used to replace the background during video calls or to add robustness to face recognition.
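For comparison, the depth-camera approach to background replacement that neural networks now emulate is essentially a threshold on the depth map: keep pixels closer than some cutoff, repaint the rest. A minimal sketch on synthetic data (the values, threshold and function names are illustrative and not tied to any RealSense API):

```python
import numpy as np

# Synthetic depth map in meters: a "person" at ~0.8 m in front of a wall at ~3 m.
depth = np.full((4, 6), 3.0)
depth[1:3, 2:4] = 0.8

def replace_background(rgb, depth_m, max_depth_m=1.5, fill=(0, 255, 0)):
    """Keep pixels closer than max_depth_m; paint everything else with fill."""
    mask = depth_m < max_depth_m  # True where the foreground subject is
    out = np.where(mask[..., None], rgb, np.array(fill, dtype=rgb.dtype))
    return out, mask

rgb = np.full((4, 6, 3), 128, dtype=np.uint8)  # flat gray stand-in for a camera frame
composited, mask = replace_background(rgb, depth)
print(mask.sum())  # 4 foreground pixels survive; the rest turn green
```

With a real depth camera the mask is a by-product of the sensor; a neural network has to infer the same mask (or the depth itself) from a single RGB frame, which is exactly the substitution described above.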
But neural networks cannot fully replace 3D cameras in every scenario: drones, robots and biometrics work much more accurately and faster with depth cameras. As our CEO noted, too many good developments came out of the RealSense project to simply give up on it. RealSense will evolve, changing to meet the tasks set before the company. The process will be gradual, so that it passes painlessly for the numerous customers who use the technology and buy RealSense cameras.
In fact, Intel is not closing RealSense: its developments will continue to be used in the company's focus areas of robotics and drones. Stereo cameras, as the most successful and in-demand product, will continue to exist and develop (new models are expected), but further novelties should be expected primarily in robotics. At the same time, there are no restrictions on using the cameras in other scenarios with one's own or partners' developments.
This means that the RealSense topic has not yet been exhausted in the Intel blog.