Combining light and sound to see underwater


Engineers at Stanford University have developed an aerial method of imaging underwater objects by combining light and sound to break the seemingly insurmountable barrier at the interface between air and water.

The researchers envision that their hybrid optical-acoustic system will one day be used to conduct marine biological surveys from airborne drones, carry out large-scale aerial searches for sunken ships and planes, and map the deep ocean with a speed and level of detail comparable to the mapping of landscapes on land. Their system is detailed in a recent study published in the journal IEEE Access.

"Airborne and space-borne radar and laser systems, or lidars, have been able to map Earth's landscapes for decades. Radar signals can even penetrate cloud cover and tree canopy. However, seawater is too absorbent to image through," said study leader Amin Arbabian, an associate professor of electrical engineering at the Stanford School of Engineering. "Our goal is to develop a more robust system that can image even through murky water."

The oceans cover about 70 percent of the Earth's surface, yet only a small fraction of their depths has been imaged and mapped at high resolution.

The main barrier is physical: sound waves, for example, cannot pass from air into water (or vice versa) without losing most of their energy, more than 99.9 percent, to reflection at the interface between the two media. A system that tries to see underwater using sound waves that travel from air into water and back again suffers this loss twice, for a total energy reduction of 99.9999 percent.
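The two figures are consistent: if 99.9 percent of the energy is lost at each crossing, only 0.1 percent survives once, and two crossings compound to the 99.9999 percent loss quoted above. A rough, illustrative calculation:

```python
# Sanity check of the numbers in the text (illustrative only): more than
# 99.9% of a sound wave's energy is reflected at each air-water crossing.
loss_per_crossing = 0.999            # fraction lost at one crossing
surviving = 1 - loss_per_crossing    # ~0.1% gets through once

two_crossings = surviving ** 2       # air -> water, then water -> air
total_loss = 1 - two_crossings
print(f"{total_loss * 100:.4f}% of the energy is lost")  # 99.9999%
```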

An artistic interpretation of the airborne photoacoustic sonar system that operates from a drone to detect and photograph underwater objects. (Photo: Kindea Labs)

Similarly, electromagnetic radiation (a generic term covering light, microwave, and radar signals) also loses energy when passing from one physical medium to another, although the mechanism differs from that of sound. "Light also loses some energy through reflection, but most of the energy loss is due to absorption by water," explained the study's first author, Aidan Fitzpatrick, a Stanford electrical engineering graduate student. Incidentally, this absorption is also why sunlight cannot penetrate the ocean depths and why your smartphone, which relies on cellular signals, a form of electromagnetic radiation, cannot receive calls underwater.

The upshot is that the oceans cannot be mapped from the air or from space the way land can. To date, most underwater mapping has been done by mounting sonar systems on ships that scan a particular region of interest, but this technique is slow, expensive, and inefficient for covering large areas.

To solve this, the researchers developed the Photoacoustic Airborne Sonar System (PASS), which combines light and sound to cross the air-water interface. The idea for the system grew out of another project that used microwaves for "non-contact" imaging and characterization of underground plant roots. Some of PASS's instruments were initially designed for that purpose in collaboration with the laboratory of Stanford electrical engineering professor Butrus Khuri-Yakub.

At its core, PASS plays with the individual forces of light and sound. "If we can use light in air, where light travels well, and sound in water, where sound travels well, we can get the best of both worlds," Fitzpatrick said.

To do this, the system first fires a laser from the air that is absorbed by the surface of the water. When the laser is absorbed, it generates ultrasound waves that propagate through the water column and are reflected off underwater objects before returning to the surface.

The returning sound waves are still sapped of most of their energy when they pass through the water's surface, but by generating the sound waves underwater with lasers, the researchers can prevent the energy loss from occurring twice.
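Once the ultrasound exists in the water, ranging works as in conventional sonar: the depth of a reflector follows from the echo's round-trip travel time and the speed of sound in water. A minimal sketch, using a textbook sound speed rather than a value from the study:

```python
# Illustrative sketch (not from the study): estimating reflector depth
# from an echo's round-trip travel time, as in conventional sonar.
SPEED_OF_SOUND_WATER = 1500.0  # m/s, approximate textbook value for seawater

def depth_from_echo(round_trip_time_s: float) -> float:
    """Depth of a reflector given the echo's round-trip travel time."""
    # The wave travels down and back, so the one-way distance is half.
    return SPEED_OF_SOUND_WATER * round_trip_time_s / 2

print(depth_from_echo(0.04))  # an echo after 40 ms -> reflector ~30 m down
```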

"We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging," Arbabian said.

The reflected ultrasound waves are recorded by instruments called transducers. Software is used to put the acoustic signals back together like an invisible puzzle and reconstruct a three-dimensional image of the submerged feature or object.
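The article does not spell out the reconstruction algorithm. As a generic illustration of how recorded acoustic signals can be "put back together," the sketch below uses textbook delay-and-sum focusing under a simplified monostatic assumption; all names, positions, and rates are assumptions, not details from the study:

```python
import numpy as np

# Minimal delay-and-sum sketch (a generic textbook approach, NOT the
# study's algorithm): for each candidate image point, sum the samples
# that a reflector at that point would have produced at each transducer.
SPEED = 1500.0     # m/s, approximate speed of sound in seawater (assumed)
FS = 1_000_000     # Hz, assumed sampling rate of the transducer recordings

def delay_and_sum(traces, sensors_xz, pixels_xz):
    """Focus echo traces onto candidate image points.

    traces:     (n_sensors, n_samples) recorded echo amplitudes
    sensors_xz: (n_sensors, 2) transducer positions in meters
    pixels_xz:  (n_pixels, 2) candidate reflector positions in meters
    """
    n_sensors, n_samples = traces.shape
    image = np.zeros(len(pixels_xz))
    for i, p in enumerate(pixels_xz):
        # Round-trip distance sensor -> pixel -> sensor for every sensor.
        dists = 2.0 * np.linalg.norm(sensors_xz - p, axis=1)
        idx = np.round(dists / SPEED * FS).astype(int)
        ok = idx < n_samples
        # Echoes from a real reflector align across sensors and add up.
        image[i] = traces[np.arange(n_sensors)[ok], idx[ok]].sum()
    return image
```

A reflector shows up as a peak at the pixel where every sensor's delayed sample lines up; elsewhere the contributions do not align and the sum stays small.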

"Similar to how light refracts or 'bends' when it passes through water or any medium denser than air, ultrasound also refracts," Arbabian explained. "Our image reconstruction algorithms correct for this curvature that occurs when ultrasound waves pass from water to air."
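The refraction Arbabian describes can be illustrated with Snell's law applied to acoustic rays; the sound speeds below are textbook values, not parameters from the study. Because sound travels more slowly in air than in water, a ray leaving the water bends toward the vertical:

```python
import math

# Illustrative Snell's law for an acoustic ray crossing the water-air
# interface (textbook sound speeds, not values from the study).
V_WATER = 1500.0  # m/s, approximate speed of sound in seawater
V_AIR = 343.0     # m/s, approximate speed of sound in air

def refracted_angle_water_to_air(theta_water_deg: float) -> float:
    """Angle from the vertical, in air, of a ray leaving the water."""
    # Snell's law for sound: sin(theta_air)/v_air = sin(theta_water)/v_water
    s = math.sin(math.radians(theta_water_deg)) * V_AIR / V_WATER
    return math.degrees(math.asin(s))

print(refracted_angle_water_to_air(30.0))  # bends toward the normal, ~6.6 deg
```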
