Combining sound and light to identify deep swarms of animals

It is often said that seeing is believing. The deep scattering layer found in the ocean provides a great example of the truth in that statement. That feature was accidentally discovered in 1942, when signals from navy sonars reflected off it and gave a false reading of the seafloor. At first it was thought to be geological because of its intensity and breadth, but it was later confirmed to be biological via in situ visual observations. The enigmatic “feature” was in fact a collection of zooplankton and micronekton (small, actively swimming crustaceans, squids, and fishes between two and ten centimeters in size).

This first example of combining acoustic sensing (injecting sound into the water and looking at its reflections) with optical imaging highlighted the complementary strengths of each of these tools and their enhanced power to observe biological phenomena in the ocean that are not apparent using either technique alone. In this case, the large sampling volume and rapid coverage of an area with acoustic surveys were instrumental in locating the extensive layers. Later, acoustics helped target visual observations by observers in human-occupied submersibles, who immediately determined that the source of the acoustic scattering was a collection of various small mobile animals.

Since then, the fields of acoustics and in situ visual observation (now largely using imaging systems on remotely operated or autonomous vehicles) have made big advances in both technology and application. Yet, despite many attempts, they have been used together infrequently. In most cases, the two tools have been used unequally, with one approach doing a service for the other (e.g., using imaging to ground-truth acoustics) rather than being used synergistically to address a grand question. This lack of integration is not surprising, given that the level of technical expertise required to apply and develop each approach is quite high. Few individual researchers, or even research institutions, have been able to make the coordinated investments required to fully exploit the strengths of an integrated approach. MBARI has been able to step into a leadership role by combining these two technologies.

Acoustics vs. imaging

The deep scattering layer was found using sonar in the 1940s. Visual observations from human-occupied submersibles later found the layer to be made up of a large number of small animals. The technologies for both acoustics and in situ visualization have progressed dramatically in the decades since; however, the techniques have only rarely been used together, largely due to the technical challenges of each approach. MBARI is poised to exploit the strengths of a truly integrated acoustic-imaging approach in investigations of animal life in the water column.

In 2017, the Acoustical Ocean Ecology Group, led by Scientist Kelly Benoit-Bird, with support from the Midwater Ecology Group, led by Scientist Bruce Robison, began developing integrated acoustical-optical methods using the ROV Ventana, leveraging MBARI’s longstanding investments and expertise in imaging technologies. The measures of animal communities that can be obtained from acoustics and imaging, used concurrently from a single platform, promise to reveal much about life in the ocean and to improve the ability to sample it. For example, scientists are beginning to describe the behavioral responses of animals to vehicles and their lights, and to measure the species-specific “acoustic signatures” of pelagic animals needed to interpret remotely collected data. These methods also provide a new view of how animals are distributed in the midwater and how their patterns of organization are affected by the environment.

The challenges of studying the immense, dark, and traditionally inaccessible midwater habitat mean that many relevant aspects of its biology remain largely unexplored. There is a mistaken, yet long-held, view that the huge volume beneath the ocean’s photic zone is largely homogeneous, both physically and biologically, in the horizontal plane. Combining in situ acoustic measures with concomitant imaging is providing an unprecedented view of individuals, the groups they are part of, and their environmental context, thus affording a previously unattainable opportunity to quantify the structure of both biomass and taxonomic composition.

Acoustics of krill patch

A patch of krill was measured acoustically at long range (shown here at a depth of eight meters but first observed at more than 50 meters) and then observed from within the swarm, providing both acoustic and visual measures of the density of the patch, acoustic measurements of the patch’s extent, shape, and internal organization, and visual observations of the species forming the group. Using the acoustics and imaging together, the team measured the distribution of individual krill lengths. Neither acoustics nor imaging alone could have given a complete picture of the swarm. For swarms of this physical scale (less than 10 meters across), even coordinated platforms with independent sensors would have been challenged to obtain matching data sets.
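The way the two data streams complement each other can be sketched numerically: imaging supplies the species identity and length distribution, which set the expected echo strength of one animal, and acoustics supplies the total backscatter from the swarm, from which numerical density follows. The sketch below is a minimal illustration under common simplifying assumptions; the logarithmic target-strength model is a standard form for krill-like scatterers, but the coefficients `a` and `b` are illustrative placeholders (not calibrated values), and the helper names `target_strength_db` and `number_density` are hypothetical, not part of any MBARI software.

```python
import math

def target_strength_db(length_mm, a=34.85, b=-127.45):
    """Simplified target strength (dB re 1 m^2) of one krill as a
    linear function of log10(body length). Coefficients are
    illustrative placeholders, not calibrated values."""
    return a * math.log10(length_mm) + b

def number_density(sv_db, lengths_mm):
    """Estimate animals per cubic meter from the mean volume
    backscattering strength Sv (dB re 1 m^-1) measured acoustically
    and a list of body lengths (mm) measured from images.

    Density = linear backscattering coefficient divided by the mean
    backscattering cross-section implied by the length distribution.
    """
    sv_linear = 10 ** (sv_db / 10)  # m^-1
    sigma_bs = [10 ** (target_strength_db(L) / 10) for L in lengths_mm]  # m^2
    mean_sigma = sum(sigma_bs) / len(sigma_bs)
    return sv_linear / mean_sigma

# Example: a 10 dB increase in Sv implies a tenfold increase in density
# for the same imaged length distribution.
lengths = [18.0, 20.0, 22.0]  # mm, e.g., from image-based measurements
sparse = number_density(-60.0, lengths)
dense = number_density(-50.0, lengths)
```

The point of the sketch is the division of labor: without the imaged lengths, `mean_sigma` is unknown and the acoustic return cannot be converted to a count; without the acoustics, the images sample far too small a volume to characterize the whole patch.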

In 2018, the two research teams, with support from a number of MBARI engineers coordinated by Eric Martin and the AUV operations team led by Hans Thomas, will begin extending this new perspective to an autonomous platform, the i2MAP vehicle, which already carries a state-of-the-art imaging system in its nose cone. While acoustics and imaging can be used in a coordinated fashion from independent platforms, this kind of true integration will provide details at depths not possible with surface vessels. The integration will allow for the observation of animal behavior in response to the platform, identification of targets, and measurement of their sizes, with independent, simultaneous assessments of animal densities at sub-patch scales. This approach parallels efforts begun with the ROV Ventana. Combining the strengths of these two platform types will allow us to quickly increase the sample size of visual-acoustic data. It will also provide the opportunity to quantitatively compare directly observed responses of animals to these two sampling platforms, facilitating appropriate integration of the data from ROV and AUV surveys.

This new toolset will provide critical support for the MBARI goal of taking the laboratory to the ocean to “see” midwater communities in new ways and to understand how the presence of the “laboratory” affects the subjects of study. The mesopelagic is an understudied region that is critically important to the success of a wide range of species, including many commercially valuable food fishes, and, through decreasing oxygen and the shoaling of the oxygen minimum layer, it is being dramatically impacted by climate change. To provide information for the sound management of biological resources in the ocean, the animals in this critical yet understudied region must be brought to light—in this case, using sound and light.

Concept of i2MAP imaging vehicle

Drawing of the acoustic system that will be integrated into the i2MAP imaging vehicle in 2018. The positioning of the transducers is optimized for a combination of vehicle stability, avoidance of interference between instruments, and overlap between imaging and forward-looking acoustic systems.

Collaborations bring distant study sites into reach

In 2018, three programs will be able to conduct operations not generally within reach of MBARI vessels by working with the Schmidt Ocean Institute.