Making the Invisible Visible
The IBM Research group's hyperimaging device will allow us to see what was previously not visible to the human eye, even through fog, snow and rain.
Alberto Valdes-Garcia, IBM Research scientist and manager of the RF Circuits and Systems Group, photographed with the millimeter-wave imaging experimental setup. Photo courtesy of IBM Research
By Jim Utsler | 11/13/2017
What we see with our eyes represents only a tiny fraction of the information around us. If we had evolved to capture X-rays, we could use those wavelengths to see the world; the same holds for ultraviolet or infrared waves. Instead, we’ve evolved to visually understand and react to our surroundings based on “visible light.”
But we may not be so limited in the not-so-distant future. Alberto Valdes-Garcia, IBM Research staff member and principal investigator, and his colleagues are working on a technology that will allow us to “see” beyond the visible light domain. By combining sensors that work across different portions of the electromagnetic (EM) spectrum, such as millimeter waves, with image processing software and eventually artificial intelligence (AI), the group’s hyperimaging device will allow us to see what was previously not visible to the human eye, even through fog, snow and rain.
And this is only the beginning of how this technology—in conjunction with the IBM Research Frontiers Institute—could be applied across different disciplines and applications.
IBM Systems Magazine (ISM): From my understanding, we can’t see 99.9 percent of the electromagnetic environment in which we live with the naked eye. Why is that?
Valdes-Garcia (AVG): There are two reasons. First, there has to be a device that can capture the energy of a particular portion of the electromagnetic spectrum. For instance, let’s talk about light. A mirror doesn’t capture any energy. It bounces it back. But your eyes act like a lens and focus—and capture—the energy that’s coming into your field of view. Another example would be the antenna on your cellphone, which captures a very different part of the electromagnetic spectrum known as microwaves.
Second, you need a sensor that translates that energy into a signal that then can be further processed and interpreted. In the cellphone example, you have to have a receiver that can translate the electrical energy captured by the antenna into some other form of electrical information that can then be turned into voice and data.
ISM: What role do infrared (IR) domain sensors play?
AVG: My daughter was watching a program recently that I found fascinating. Imagine you’re a hunter’s prey. One of the easiest ways to protect yourself is by using camouflage, so you make your skin look like the surrounding bushes and hide inside them. Now, what my daughter and I learned is that some snakes have infrared domain sensors and animals in general emit energy in this domain. People cannot see at those IR wavelengths, but in general, a warm object emits more IR energy than a cold object.
A small mouse emits more energy than its surroundings. The snake’s eyes might tell it one thing, that there’s no mouse, but its IR sensors tell it something else: there’s something warm nearby that’s likely to be food, so it attacks. Some snakes can sense in two domains of the electromagnetic spectrum.
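The claim that a warm object emits more IR energy than a cold one comes down to textbook blackbody physics. Below is a minimal sketch using the Stefan-Boltzmann law; the temperatures are illustrative assumptions for a mouse and cooler surroundings, not figures from the interview.

```python
# Stefan-Boltzmann law: radiated power per unit area P = sigma * T^4.
# Temperatures below are illustrative, not from the article.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power(temp_kelvin: float) -> float:
    """Power emitted per square meter by an ideal blackbody at temp_kelvin."""
    return SIGMA * temp_kelvin ** 4

mouse = radiated_power(310.0)    # ~37 C body temperature
bushes = radiated_power(290.0)   # ~17 C cool surroundings
print(f"mouse: {mouse:.0f} W/m^2, bushes: {bushes:.0f} W/m^2")
print(f"contrast ratio: {mouse / bushes:.2f}")
```

Even a 20-degree temperature difference yields roughly 30 percent more emitted IR power, which is why the mouse stands out against the bushes in the IR domain even when camouflage defeats the visible domain.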
Which leads us to properties, an important topic. Conceptually, we’ve labeled portions of the electromagnetic spectrum using different names, like the visible domain. Below the visible domain, there’s the IR domain, and below that the THz, millimeter-wave and microwave domains. Each has different wavelengths with different reflection and transparency properties. Take paper, which is opaque, as an example. Although we can’t see through paper, that doesn’t mean there’s no way to see through it. If we use other wavelengths, we actually can.
ISM: So is your goal to create a multispectrum device that can capture visible light and other wavelengths?
AVG: Correct. One of the key portions of the spectrum we’re addressing with our hyperimaging platform is millimeter waves, whose properties give us the ability to measure the distance to a given object. Distance can be measured at almost any wavelength, but with millimeter waves it’s easier to achieve centimeter-level accuracy, which is what matters for autonomous navigation and autonomous driving, for example. If you want a sense of the distance to objects with centimeter resolution, and also their speed, it happens to be easier to launch a wave in the millimeter-wave domain and then measure the delay of its reflection off the object.
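The delay-based ranging Valdes-Garcia describes is generic radar time-of-flight arithmetic: the wave travels out and back at the speed of light, so distance is half the round trip. A minimal sketch (illustrative, not IBM’s actual signal chain):

```python
# Radar time-of-flight ranging: distance = c * round_trip_delay / 2.

C = 299_792_458.0  # speed of light, m/s

def range_from_delay(round_trip_seconds: float) -> float:
    """Distance to a reflector given the measured round-trip delay."""
    return C * round_trip_seconds / 2.0

def delay_for_resolution(resolution_m: float) -> float:
    """Timing precision needed to resolve a given range difference."""
    return 2.0 * resolution_m / C

# A 1 ns round trip corresponds to about 15 cm of range.
print(f"{range_from_delay(1e-9) * 100:.1f} cm")

# Centimeter resolution demands roughly 67 ps of timing precision.
print(f"{delay_for_resolution(0.01) * 1e12:.0f} ps")
```

The second number hints at why this is an engineering challenge: resolving centimeters means resolving tens of picoseconds, which practical millimeter-wave radars typically achieve indirectly through frequency-modulated techniques rather than by timing pulses directly.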
ISM: So millimeter waves are key.
AVG: Yes, millimeter waves can “see” through materials that are visually opaque, like everyday materials and environmental conditions that we care about in some applications—clothing, paper, packing materials, fog, rain and snow. Because of its shorter wavelength, light can create sharper images, but millimeter waves can—using the appropriate technology and processing—be made accurate enough to be useful, for example, to detect a car in front of you in the fog.
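The trade-off he mentions, shorter wavelengths make sharper images, follows from the diffraction limit: the smallest resolvable angle scales with wavelength over aperture size. A hedged sketch, where the 77 GHz frequency (a common automotive-radar band) and the 10 cm aperture are illustrative assumptions, not figures from the article:

```python
# Rayleigh diffraction limit: theta ~ 1.22 * wavelength / aperture.

import math

C = 299_792_458.0  # speed of light, m/s

def angular_resolution(wavelength_m: float, aperture_m: float) -> float:
    """Smallest resolvable angle (radians) for a given aperture diameter."""
    return 1.22 * wavelength_m / aperture_m

mm_wave = angular_resolution(C / 77e9, 0.10)  # 77 GHz radar, 10 cm antenna
visible = angular_resolution(550e-9, 0.10)    # green light, same aperture

print(f"millimeter wave: {math.degrees(mm_wave):.2f} deg")
print(f"visible light:   {math.degrees(visible):.6f} deg")
print(f"blur factor: {mm_wave / visible:.0f}x")
```

For the same aperture, the millimeter-wave image is thousands of times blurrier in angle, which is why the processing mentioned above is needed to make it accurate enough to detect a car in the fog.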
But millimeter waves bring unique challenges. If we used the type of sensors that we have for cameras, they wouldn’t work. They’re not sensitive at all to millimeter waves. So new technology had to be developed for this portion of the spectrum, which, up to maybe 15 years ago, was nearly exclusive to the aerospace and defense domains. Our group actually pioneered the integration of that technology into silicon chips for small devices.
This is now becoming revolutionary in the tech world. Automotive radars operate at millimeter waves. Although they really don’t provide images, they can tell you whether something is in front of you. In the hyperimager, we have a unique device called a “phased array” that essentially allows you to electronically scan a beam across a scene. We don’t see the beam, but it carries this millimeter-wave energy.
Imagine you have a flashlight at night. As you’re scanning the scene, you have a sense of what’s going on in the dark. In a similar way, we’re scanning a scene with our beam from a sensor array. The beam can go through fog and snow. You’ll get some reflections and an image, and even though it may not be nearly as sharp as in the visual domain, it will give us a first sense of what’s going on in the scene that can’t be perceived with the eye or a camera. In our current prototype, we are combining a millimeter-wave sensor, a commercial infrared sensor and a visible-domain camera.
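The electronic scanning behind the flashlight analogy works by giving each array element a phase offset so the wavefronts add constructively in the chosen direction. A minimal sketch of the standard phased-array steering formula; the element count and half-wavelength spacing are common textbook choices, not details from the article:

```python
# Phased-array beam steering: element n gets phase
# phi_n = 2 * pi * n * (d / wavelength) * sin(theta),
# so the combined wavefront points theta degrees off boresight.

import math

def element_phases(num_elements: int, steer_deg: float,
                   spacing_wavelengths: float = 0.5) -> list[float]:
    """Per-element phase offsets (radians) that steer the main beam."""
    theta = math.radians(steer_deg)
    return [2.0 * math.pi * n * spacing_wavelengths * math.sin(theta)
            for n in range(num_elements)]

# Steering an 8-element array 20 degrees off boresight: each element
# leads its neighbor by a constant phase step.
phases = element_phases(8, 20.0)
print([round(p, 3) for p in phases])
```

Changing the steering angle only changes these phase values, so the beam sweeps the scene with no moving parts, which is what makes electronic scanning practical in a compact device.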
ISM: What are the results when you combine those three?
AVG: That is an intriguing question, isn’t it? Images that combine all that information have not been created so far for everyday scenes, and we believe they will reveal unique insights. By developing a portable multispectral imaging platform, we want to enable the exploration of multiple use cases including, to name just a couple, the detection of nonvisible defects in manufacturing and the identification of obstacles under conditions of limited visibility.
Under conditions where a camera really can’t distinguish what’s going on, we’ll be able to combine what the IR sensor and the millimeter-wave sensor can see. We think this combination could give autonomous driving systems a higher degree of confidence in what’s ahead of them.
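One simple way to picture combining the three sensors is a per-pixel confidence-weighted blend of co-registered readings. The weighting scheme below is purely an illustration of the idea, assuming hypothetical normalized readings and confidences; it is not IBM’s fusion algorithm, which the interview does not describe.

```python
# Illustrative sensor fusion: blend normalized (0..1) readings from
# three sensors, weighted by how much each sensor can be trusted
# under current conditions. All values here are hypothetical.

def fuse_pixel(readings: dict[str, float],
               confidence: dict[str, float]) -> float:
    """Confidence-weighted average of normalized sensor readings."""
    total = sum(confidence.values())
    return sum(readings[s] * confidence[s] for s in readings) / total

# In fog the camera sees little, so its confidence drops and the
# IR and millimeter-wave readings dominate the fused value.
foggy = fuse_pixel(
    readings={"visible": 0.1, "infrared": 0.8, "mmwave": 0.9},
    confidence={"visible": 0.1, "infrared": 0.6, "mmwave": 0.8},
)
print(round(foggy, 3))
```

The fused value reports a strong obstacle signal even though the camera channel alone would have missed it, which mirrors the higher-confidence picture described above.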
Jim Utsler, IBM Systems magazine senior writer, has been writing for IBM since the mid-1990s.