New worlds: Technion engineers develop virtual periscope that doesn’t have to get wet

Prof. Yoav Schechner, of the Technion’s electrical engineering department, and colleagues developed the virtual periscope.

Tekumah, a Dolphin-class submarine (photo credit: REUTERS)
“Up periscope!” may become an outdated order thanks to a team of Technion-Israel Institute of Technology researchers who have developed a new technology for viewing objects above the water’s surface without the need for a periscope poking its head above the waves. The researchers modeled their virtual periscope on technology used by astronomers to counter blurring and distortion caused by layers of atmosphere when viewing stars.
The technology behind a submerged “virtual periscope” was introduced in a presentation at the IEEE International Conference on Computational Photography, held in California earlier this month. Prof. Yoav Schechner, of the Technion’s electrical engineering department, and colleagues developed the virtual periscope, which is called “Stella Maris” (Stellar Marine Refractive Imaging Sensor).
The heart of the underwater imaging system is a camera, together with a pinhole array that admits light (a thin metal sheet with precise, laser-cut holes), a glass diffuser and mirrors.
The sun’s rays are projected through the pinholes onto the diffuser, which the camera images alongside the distorted object of interest. The image is then corrected for distortion.
“Raw images taken by a submerged camera are degraded by water-surface waves similarly to degradation of astronomical images by our atmosphere. We borrowed the concept from astronomers who use the Shack-Hartmann astronomical sensor on telescopes to counter blurring and distortion caused by layers of atmosphere,” explained Schechner. “Stella Maris is a novel approach to a virtual periscope as it passively measures water and waves by imaging the refracted sun.”
The unique technology gets around the inevitable distortion caused by the water-surface waves when using a submerged camera.
According to the Technion engineer, because of the sharp refractive difference between water and air, random waves at the interface produce distortions that are worse than those atmospheric turbulence creates for astronomers peering into space.
“When the water surface is wavy, the sun’s rays refract according to the waves and project onto the solar image plane,” said Schechner. “With the pinhole array, we obtain an array of tiny solar images on the diffuser.”
When all of the components work together, the Stella Maris system acts as both a wave sensor to estimate the water surface, and a viewing system to see the above-surface image of interest through a computerized, “reconstructed” surface.
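The wave-sensing idea described above can be sketched in a few lines of code. This is only an illustrative toy, not the team's actual algorithm: it assumes a small-angle linearization of Snell's law (a surface tilted by slope s deflects a vertically arriving sun ray underwater by roughly (1 − 1/n)·s radians), and it integrates a 1-D slope profile as a stand-in for the full 2-D surface reconstruction. The function names and geometry are assumptions for the sketch.

```python
import numpy as np

N_WATER = 1.33  # approximate refractive index of seawater


def slope_from_spot_shift(shift, depth):
    """Estimate the local water-surface slope (radians) from the
    displacement of one pinhole's solar spot on the diffuser.

    shift : lateral spot displacement from its flat-surface position (m)
    depth : distance from the water surface down to the diffuser (m)
    """
    # Small-angle deflection of the refracted sun ray seen underwater.
    dtheta = shift / depth
    # Linearized Snell's law: a surface tilt of s radians deflects the
    # underwater ray by about (1 - 1/n) * s, so invert that relation.
    return dtheta / (1.0 - 1.0 / N_WATER)


def integrate_slopes(slopes, dx=1.0):
    """Cumulatively integrate a 1-D slope profile into surface heights,
    anchored at height 0 on the left edge."""
    return np.concatenate([[0.0], np.cumsum(slopes) * dx])
```

With the surface height profile recovered this way, the distorted above-water image could then be re-projected ("reconstructed") through the estimated surface, which is the role the article attributes to the computational part of Stella Maris.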
The Stella Maris virtual periscope is just the latest technology developed by the researchers, who have also found ways to exploit “underwater flicker” – random change of underwater lighting caused by the water surface wave motion. The team turned the tables on underwater flicker and used the natural rapid and random motion of the light beams to obtain three-dimensional mapping of the sea floor.
The virtual periscope may have other potential uses in which it could reduce reliance on traditional periscopes, which have been in use for more than a century.
Submerged on the sea floor, Stella Maris could be useful for marine biology research when viewing and imaging both beneath and above the waves simultaneously is important. It could, for example, monitor the habits of seabirds as they fly, then plunge into water and capture prey.
“There are many ways to advance the virtual periscope,” said Schechner, who added that while the system currently requires sunlight, the team is working on a way to gather enough light from moonlight or starlight to use the system at night.
University of Liverpool researchers have found that brief musical training can increase the blood flow in the left hemisphere of our brain. This suggests that the areas responsible for music and language share common brain pathways. The team carried out two separate studies that looked at brain activity patterns in musicians and non-musicians.
The first study looked at patterns of brain activity in 14 musicians and nine non-musicians while they participated in music and word-generation tasks. The results showed that the patterns in the musicians’ brains were similar in both tasks – but this was not the case for the non-musicians.
In the second study, brain activity patterns were measured in a different group of non-musical participants who participated in a word-generation task and a music-perception task. The measurements were also taken again following half an hour’s musical training.
Measurements of brain activity taken before the musical training showed no significant pattern of correlation, but following the training, significant similarities were found.
Researcher Amy Spray said: “The areas of our brain that process music and language are thought to be shared and previous research has suggested that musical training can lead to the increased use of the left hemisphere of the brain. This study looked into the modulatory effects that musical training could have on the use of the different sides of the brain when performing music and language tasks. It was fascinating to see that the similarities in blood flow signatures could be brought about after just half an hour of simple musical training.”