
IEEE JOURNAL OF OCEANIC ENGINEERING, VOL. 34, NO. 3, JULY 2009 225

Generalized Framework for Real Aperture, Synthetic Aperture, and Tomographic Sonar Imaging

Brian G. Ferguson, Senior Member, IEEE, and Ron J. Wyber, Member, IEEE

Abstract—When an underwater object is insonified over a range of aspect angles, the signals that are backscattered from the object can be sampled in space and time by an array of sensors, which can form either a real or a synthetic aperture. The spatial information contained in the backscattered signals can be processed using inverse transform methods to form an image of the 2-D spatial distribution of the object's acoustic reflectivity function. The signal measurement space (or data space) defines a sector of the polar wave number spectrum that quantifies the variation with aspect angle of the spatial frequency components that contribute to the backscattered signal. The temporal bandwidth of the incident sonar pulse determines the radial extent of the sector, while the range of aspect angles over which the object is insonified defines the angular extent of the sector. Equivalently, the extent of the aperture that is formed by the sensor positions relative to the object determines the angular extent of the sector. This concept provides a generalized framework that unifies sonar imaging techniques such as reconstructive tomography (image reconstruction from projections), synthetic aperture sonar, and real aperture sidescan sonar. The difference in these techniques is simply due to the azimuthal angle subtended by the aperture of the sensing array, which is typically a fraction of a degree for a real aperture sonar, several degrees for a strip map synthetic aperture sonar, tens of degrees for a spotlight synthetic aperture sonar, and 360° for a tomographic imaging sonar. Experimental results are presented for a real aperture sidescan sonar, a synthetic aperture sonar, and a tomographic sonar that demonstrate the development of an acoustic image from a single blurred point to a clearly identifiable object as the azimuthal extent of the angle subtended by the sonar aperture is increased.

Index Terms—Beamforming, reconstructive tomography, sonar imaging, synthetic aperture.

I. INTRODUCTION

Historically, the sonar community has been prominent in the development of advanced beamforming algorithms for real arrays, including adaptive beamforming for passive sonar arrays and for active sonar receiver arrays [1]–[3]. Algorithms for linear and circular synthetic apertures have been developed in other fields including medicine [4], radar [5]–[9], and geophysics [10], as well as sonar [11]–[14]. As a consequence, the diversity in the subject disciplines where active imaging methods have been applied is matched by the diversity in the mathematical formulation of these methods, so the relationship between the various algorithms can be considered obscure. Rather than treating the beamforming operation for each type of sonar aperture on a case-by-case basis, this paper develops a generalized approach to beamforming in the wave number domain. This approach enables the theoretical framework for tomographic imaging to be readily extended to encompass real aperture beamforming and both linear and circular synthetic aperture imaging. In the case of linear synthetic aperture imaging, both the strip map and spotlight sonar imaging methods are included. Importantly, the benefits of synthetic aperture imaging in enhancing the quality of sonar imagery are demonstrated using real data.

Manuscript received March 16, 2006; revised August 27, 2007, December 17, 2007, and October 21, 2008; accepted March 06, 2009. Current version published August 05, 2009. This work was supported by Capability Technology Demonstrator Project SEA 1436 for an Advanced Minehunting Sonar.

Guest Editor: P. E. Hagen.

B. G. Ferguson is with the Maritime Operations Division, Defence Science and Technology Organisation, Pyrmont, N.S.W. 2009, Australia (e-mail: [email protected]).

R. J. Wyber is with Midspar Systems Pty, Ltd., Oyster Bay, N.S.W. 2225, Australia (e-mail: [email protected]).

Digital Object Identifier 10.1109/JOE.2009.2017801

Fig. 1 depicts the concept that underpins each of the sonar imaging methods: (a) broadside real aperture (sidescan), (b) strip map synthetic aperture, (c) spotlight synthetic aperture, and (d) reconstructive tomography. Fig. 1(a) represents a real aperture sidescan (side-looking) sonar that has a narrow beamwidth in the along-track (or azimuthal) direction. A real aperture sidescan sonar achieves a high azimuthal resolution by forming a very narrow broadside beam, necessitating the use of a large physical aperture and a high operating frequency, so the acoustical wavelength is short when compared with the length of the real aperture. Since the absorption loss of a sonar signal increases with operating frequency, any high-resolution imaging sonar that uses a real aperture is restricted to a short operating range. Fig. 1(b) represents a strip map synthetic aperture sonar [11]–[14] where a sequence of echoes is recorded as the sonar platform travels in a nominally straight line past the area of interest (or object space that defines the location of the scatterers). The echoes are coherently combined as though they had been received by a single, long array (the synthetic aperture); the sonar beam is focused for each range of interest. A synthetic aperture sonar uses a small real aperture that has a wide beamwidth but achieves high resolution in the along-track direction by virtue of the motion of the sonar platform and the synthetic aperture signal processing algorithm that coherently integrates the sequence of echoes received at each position along the sonar's trajectory. This greatly enhanced resolution in the along-track direction is independent of range and wavelength—a result that is contrary to real aperture imaging. Fig. 1(c) represents the spotlight mode of a linear synthetic aperture sonar, where the real aperture is steered continuously so that it points to the same area of interest. The continuous insonification of the selected area (the same scene) allows the formation of an image with an along-track resolution that exceeds that achievable with a strip map synthetic aperture sonar system. Fig. 1(d) represents a tomographic imaging sonar

0364-9059/$26.00 © 2009 IEEE


Fig. 1. Types of imaging sonars: (a) real aperture sidescan, (b) strip map synthetic aperture, (c) spotlight synthetic aperture, and (d) reconstructive tomography.

where a 2-D cross-sectional image (plan view) of a 3-D object is reconstructed by the digital processing of many 1-D views [15]. The tomographic reconstruction of backscattered sonar signals requires a set of projections for which the viewing (or aspect) angles span a full 360°. The output waveform of the sonar receiver for a given angle of insonification represents a single projectional view of the object. The sonar transducer is moved from point to point around the entire circumference of a circle so as to view the object from all possible aspect angles, thus compiling a complete set of projections. Alternatively, the sonar is fixed and the object is rotated about its axis through 360°. A high-resolution image of the object can be reconstructed from this set of projections. The image represents a 2-D map

Fig. 2. (a) The fan of beams used to define the azimuthal resolution of a real aperture sonar in the classical model. (b) The concept used for synthetic aperture sonar, where the transducer insonifies the object at a particular aspect angle, then senses the backscattered signal, before moving to the next position (aspect angle). The process is repeated over a range of aspect angles.

of the 3-D acoustic reflectivity function when projected on the imaging plane.

This paper is structured so that Section II provides background information on real and synthetic aperture sonars (including range migration), as well as tomographic sonars (including backprojection). Section III develops a generalized framework for sonar imaging from which a mathematical representation for each of the imaging methods can be derived. Section IV depicts the extent of the polar wave number spectrum covered by each of the sonar imaging methods and provides formulas for the point spread functions and spatial resolutions of these methods. Section V shows results of applying the methods to real sonar data, enabling the comparison of sonar images formed by the various imaging methods. Section VI gives the conclusions.

II. BACKGROUND

A. Real and Synthetic Aperture Sonars

The quality of an underwater acoustic image is determined by the resolution of the imaging sonar. The classical approach used to calculate the resolution of a sonar is to consider a fan of beams that covers the object. The cross-range resolution is then the azimuthal resolution of a sonar beam multiplied by the range to the object. This model, which is illustrated in Fig. 2(a), is appropriate for sonars in which the length of the array is small when compared to the range to the object, which is generally true for practical sonars using real apertures. An alternative model,


Fig. 3. Plan view of the transducer-object geometry for a tomographic imaging sonar and the projection for an insonification angle θ. As the object is in the far field, the wave fronts are planar.

which is represented diagrammatically in Fig. 2(b), better describes the synthesis of a virtual aperture. In this model, it is assumed that the signal received at a given sensor position results from insonifying the (entire) object at a particular aspect angle, which changes as the position of the sensor changes; that is, the measurements are made using the same sensor but in different positions and at different times. For this case, the synthesis of a large linear aperture is achieved by coherently processing the phase history information of the backscattered signal.

B. Tomographic Sonar

Historically, tomography was first developed for X-ray absorption imaging in the medical field. For reflection tomography, the transducer-object geometry is depicted in Fig. 3, where $x$ and $y$ represent a Cartesian coordinate system with an origin located at the center of the scene to be imaged by the sonar. The scene includes the object of interest. The direction of propagation of the sonar transmission is at an angle $\theta$ to the $x$-axis and $f(x, y)$ denotes the 2-D distribution of the amplitude of the signal backscattered from the object. Reconstructive tomography (also known as computed tomography) reconstructs an image of the object from projections recorded at all aspect angles. The object function $f(x, y)$ is reconstructed from the inverse of the integral equation given by

$$p_\theta(u) = \int_{-\infty}^{\infty} f(x, y)\, dv \quad (1)$$

where the coordinate pair $(u, v)$ represents the rotation of the coordinate pair $(x, y)$ through an angle $\theta$, that is

$$u = x\cos\theta + y\sin\theta \quad (2)$$

$$v = -x\sin\theta + y\cos\theta \quad (3)$$

Reconstructive tomography requires a complete set of projections $p_\theta(u)$, where both $u$ and $\theta$ are variable. The 2-D function $p_\theta(u)$ is the Radon transform of $f(x, y)$. As an inverse exists for the Radon transform, it is possible to form an image of $f(x, y)$, which is given by

$$\hat{f}(x, y) = \mathcal{R}^{-1}\left[p_\theta(u)\right] \quad (4)$$

where $\mathcal{R}^{-1}$ denotes the inverse Radon transform operator. The operations required to implement the inverse Radon transform are defined in [16] to be

$$\mathcal{R}^{-1} = \mathcal{B}\,\mathcal{F}_1^{-1}\,|k|\,\mathcal{F}_1 \quad (5)$$

where $\mathcal{F}_1$ is the forward spatial Fourier transform $(u \to k)$, $|k|$ is the modulus of the wave number (or spatial frequency) $k$, $\mathcal{F}_1^{-1}$ is the inverse spatial Fourier transform $(k \to u)$, and $\mathcal{B}$ denotes the backprojection operator—see (6) and (7). The operation $\mathcal{F}_1^{-1}\,|k|\,\mathcal{F}_1$ in (5) represents the application of a wave number filter with a response $|k|$ to each of the projections measured at different angles.

For a tomographic imaging sonar [15], the resolution is high in the range direction, but low in the azimuthal (or cross-range) direction; the converse is true in X-ray absorption tomography. In both cases, the poor resolution is overcome by viewing the object at many aspect angles, enabling the reconstruction of a high-resolution image of the object.

C. Backprojection

Associated with the Radon transform is the backprojection operator $\mathcal{B}$, which is defined as

$$f_B(x, y) = \mathcal{B}\left[p_\theta(u)\right] = \int_0^{2\pi} p_\theta(x\cos\theta + y\sin\theta)\, d\theta \quad (6)$$

where, using (2) and (3), $u = x\cos\theta + y\sin\theta$ (see Fig. 3). The quantity $f_B(x, y)$ is referred to as the backprojection of $p_\theta(u)$. In polar coordinates, it can be written as

$$f_B(r, \phi) = \int_0^{2\pi} p_\theta\left(r\cos(\theta - \phi)\right) d\theta \quad (7)$$

Backprojection represents the accumulation of the wave front integrals of all the wave fronts that pass through the point $(x, y)$ or $(r, \phi)$. In other words, for a fixed point $(x, y)$ or $(r, \phi)$, the backprojection $f_B(x, y)$ is evaluated by integrating $p_\theta(u)$ over $\theta$ for all wave fronts that pass through that point—see (6). However, in view of (7), the backprojection at $(r, \phi)$ is also the integration of $p_\theta(u)$ along the sinusoid $u = r\cos(\theta - \phi)$ in the $(\theta, u)$ plane. For example, Fig. 4(a) shows the variation with $\theta$ and $u$ of the set of projections that were observed for an underwater object (a large bomb). Rather than the sonar circumnavigating the stationary object, the sonar was fixed and the object (which was suspended in a dam by wire ropes attached to a revolving horizontal spar) was rotated about its vertical axis through one revolution, that is, $0° \le \theta \le 360°$. After transforming the variable $t \to u$, where $u = ct/2 - r_\theta$, the output waveform $s_\theta(t)$ in the time domain becomes the projection $p_\theta(u)$ in the spatial domain. The received echo signal data (bandpass-limited impulse response measurements) were recorded at angular increments of 0.35° and the distance between the sonar and the object was 60 m [15].


For any given aspect angle, the projection data are dominated by a sequence of impulses related to acoustic highlights (prominent points of reflectivity) on the object. As the object rotates, the distance from the sonar to each of the highlights traces out a sine wave. The amplitude of the sine wave is determined by the distance of the highlight from the axis of rotation, which (in this case) is perpendicular to the longitudinal axis of the object. The phase of the sine wave is determined by the angular position of the highlight on the object. Thus, the backprojection operator estimates the reflectivity of the scatterer at a point $(x, y)$ as the integral of the received echo signal along the path that is traced out by the rotation of the object.
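The inverse Radon operations in (5)–(7) lend themselves to a compact numerical sketch. The Python fragment below applies the $|k|$ wave number filter to each projection and then accumulates the filtered projections along the paths $u = x\cos\theta + y\sin\theta$; the grid sizes, sampling, and the single point scatterer used as input are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def ramp_filter(projections, du):
    """Apply the |k| wave number filter of (5) to each projection (one row per angle)."""
    n_u = projections.shape[1]
    k = 2.0 * np.pi * np.fft.fftfreq(n_u, d=du)          # spatial frequencies
    spectra = np.fft.fft(projections, axis=1)            # forward 1-D transform
    return np.real(np.fft.ifft(spectra * np.abs(k), axis=1))

def backproject(filtered, thetas, u, x, y):
    """Discrete form of the backprojection integral (6)."""
    image = np.zeros_like(x)
    for p_theta, theta in zip(filtered, thetas):
        u_xy = x * np.cos(theta) + y * np.sin(theta)     # u = x cos(theta) + y sin(theta)
        image += np.interp(u_xy.ravel(), u, p_theta).reshape(x.shape)
    return image * (thetas[1] - thetas[0])               # d(theta) weight

# Illustrative input: a point scatterer at the scene center, so every
# projection is a single pulse at u = 0.
n_theta, n_u = 180, 129
thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
u = np.linspace(-1.0, 1.0, n_u)
projections = np.zeros((n_theta, n_u))
projections[:, n_u // 2] = 1.0

x, y = np.meshgrid(np.linspace(-1, 1, 65), np.linspace(-1, 1, 65))
image = backproject(ramp_filter(projections, u[1] - u[0]), thetas, u, x, y)
peak = np.unravel_index(np.argmax(image), image.shape)   # lands at the center pixel
```

For a point scatterer at the scene center, the reconstruction peaks at the center pixel, mirroring the development of a focused image from the full 360° set of projections described above.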

D. Range Migration

In the case of a linear synthetic aperture, the locus of the distance of an acoustic highlight on the object to the sonar scribes a hyperbola, which in practice approximates a parabola [11]. A sequence of received echo signals in the range domain is plotted in Fig. 4(b) as the sonar moves in a straight line along the $y$-axis (which is referred to as the along-track, azimuth, or cross-range direction). The data space is characterized by a collection of hyperbolas, each hyperbola corresponding to one reflecting point in the object space; the hyperbolas become more extended as the range (or time delay) increases. The extent to which a single point in the object space maps to a hyperbolic locus that covers several range cells is referred to as range curvature or range migration. The amount of curvature depends on the range, which complicates the synthetic aperture reconstruction problem.

Like acoustic reflection tomography [15], synthetic aperture sonar image formation is a two-step process. The data acquisition process maps a single point scatterer in object space to a 2-D point spread function (or point scatterer response function) in measurement (or data) space. Synthetic aperture signal processing endeavors to focus the point scatterer response back to a single point in image space. As a result, the measurement space consisting of received echo data is mapped to the image space consisting of synthetic aperture beamformed data that are focused in range. Each point in the image space represents an estimate of the magnitude of the complex reflectivity function; the spatial variation of these estimates forms a 2-D acoustic reflectivity map of the object or the area of interest.
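The growth of the range migration with range can be illustrated numerically. In the sketch below, the illuminated track length is set by an assumed fixed beamwidth, so the aperture, and hence the migration of the hyperbolic locus across range cells, increases with the range to the scatterer; all numerical values are illustrative assumptions.

```python
import numpy as np

def range_history(track, r0):
    """Slant range from each along-track sonar position to a scatterer
    at along-track position 0 and standoff range r0 (a hyperbola)."""
    return np.sqrt(r0**2 + track**2)

beta = np.deg2rad(2.0)          # assumed one-sided beamwidth of the real aperture
migrations = []
for r0 in (50.0, 100.0, 200.0):
    half_aperture = r0 * np.tan(beta)     # illuminated track length grows with range
    track = np.linspace(-half_aperture, half_aperture, 201)
    migrations.append(range_history(track, r0).max() - r0)
# The migration (range cells crossed by the hyperbola) increases with range,
# so the hyperbolas become more extended as the range increases.
```

Near the aperture center the hyperbola is well approximated by a parabola, which is the basis of the parabolic approximation noted in the text.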

III. A GENERALIZED APPROACH TO SONAR IMAGING IN THE FREQUENCY DOMAIN

If it is assumed that the object is composed of an ensemble of omnidirectional point scatterers, then, after insonification by an ideal impulse, the variation with time $t$ of the signal $s_\theta(t)$ backscattered from the object is given by

$$s_\theta(t) = \iint f(x, y)\, \delta\left(t - \tau_\theta(x, y)\right) dx\, dy \quad (8)$$

where $\delta(t)$ is the Dirac delta function, $f(x, y)$ is the amplitude of the signal backscattered from the object at the point $(x, y)$, and $\tau_\theta(x, y)$ is the two-way propagation delay to the scatterer at the position $(x, y)$.

From Fig. 3, it can be shown that

$$\tau_\theta(x, y) = \frac{2}{c}\left(r_\theta + x\cos\theta + y\sin\theta\right) \quad (9)$$

where $r_\theta$ is the distance from the transducer to the origin when the transducer is at an angle $\theta$. After transforming the variable $t \to u$, where $u = ct/2 - r_\theta$, the output waveform $s_\theta(t)$ in the time domain becomes the projection $p_\theta(u)$ in the spatial domain. The projection is represented diagrammatically at the top of Fig. 3.

The temporal Fourier transform of (8) is given by

$$S_\theta(\omega) = \iint f(x, y)\, e^{-j\omega\tau_\theta(x, y)}\, dx\, dy \quad (10)$$

The substitution of the wave number $k = 2\omega/c$ into (10) gives

$$S_\theta(k) = e^{-jkr_\theta} \iint f(x, y)\, e^{-jk(x\cos\theta + y\sin\theta)}\, dx\, dy \quad (11)$$

which can be written as

$$S_\theta(k) = e^{-jkr_\theta}\, F(k_x, k_y) \quad (12)$$

where $k_x = k\cos\theta$ and $k_y = k\sin\theta$. Now (12) can be written as

$$S_\theta(k) = e^{-jkr_\theta}\, \mathcal{P}\left[F(k_x, k_y)\right] \quad (13)$$

where

$$F(k_x, k_y) = \iint f(x, y)\, e^{-j(k_x x + k_y y)}\, dx\, dy$$

and the polar transformation denoted by the operator $\mathcal{P}$ maps the wave number data from the Cartesian $(k_x, k_y)$ domain to the polar $(k, \theta)$ domain via

$$k = \sqrt{k_x^2 + k_y^2} \quad (14)$$

$$\theta = \arctan\left(k_y / k_x\right) \quad (15)$$

This polar transformation is induced by the measurement system.

Since $F(k_x, k_y)$ represents the 2-D wave number transform of the backscattering amplitude function (or acoustic reflectivity function) $f(x, y)$, then an image of the object can be recovered as

$$\hat{f}(x, y) = \mathcal{F}_2^{-1}\, \mathcal{P}^{-1}\left[e^{jkr_\theta}\, S_\theta(k)\right] \quad (16)$$

where $\mathcal{F}_2^{-1}$ denotes the 2-D inverse wave number transform and $\mathcal{P}^{-1}$ denotes the inverse polar mapping operator, for which $k_x = k\cos\theta$ and $k_y = k\sin\theta$. In practice, the spatial frequency data are windowed and padded with zeroes


Fig. 4. (a) Projection data for one complete rotation of an underwater object. Each acoustic highlight (prominent point of reflectivity) on the object characteristically traces out a sinusoid. In absorption (X-ray) tomography, this type of plot is referred to as a sinogram. (b) Responses of prominent point scatterers in the object field plotted in the range-along-track (azimuth) domain for a linear synthetic aperture sonar. Each acoustic highlight traces a hyperbola with a curvature that depends on the range.

prior to calculating the inverse Fourier transformation that results in the image estimate $\hat{f}(x, y)$.
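A minimal numerical sketch of the inversion in (16) regrids polar wave number samples onto a Cartesian grid (here by crude nearest-neighbour binning, standing in for proper interpolation) and applies a 2-D inverse Fourier transform. The bandpass annulus, grid sizes, and the synthetic point scatterer spectrum are assumptions chosen for the demonstration.

```python
import numpy as np

def polar_to_cartesian(S, k, theta, k_grid):
    """Nearest-neighbour regrid of polar wave number samples S(k, theta)
    onto a Cartesian (kx, ky) grid (a simple stand-in for interpolation)."""
    kk, tt = np.meshgrid(k, theta)
    kx_s, ky_s = kk * np.cos(tt), kk * np.sin(tt)        # polar -> Cartesian locations
    dkg = k_grid[1] - k_grid[0]
    ix = np.clip(np.rint((kx_s.ravel() - k_grid[0]) / dkg).astype(int), 0, k_grid.size - 1)
    iy = np.clip(np.rint((ky_s.ravel() - k_grid[0]) / dkg).astype(int), 0, k_grid.size - 1)
    F = np.zeros((k_grid.size, k_grid.size), dtype=complex)
    counts = np.zeros(F.shape)
    np.add.at(F, (iy, ix), S.ravel())
    np.add.at(counts, (iy, ix), 1.0)
    F[counts > 0] /= counts[counts > 0]                  # average samples per cell
    return F

# Synthetic data: a single point scatterer at (x0, y0) has the pure linear
# phase spectrum exp(-j(kx*x0 + ky*y0)), sampled here over a full 360-degree
# bandpass annulus (the tomographic case).
x0, y0 = 0.5, -0.25
k = np.linspace(4.0, 8.0, 64)
theta = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
kk, tt = np.meshgrid(k, theta)
S = np.exp(-1j * kk * (np.cos(tt) * x0 + np.sin(tt) * y0))

k_grid = np.linspace(-10.0, 10.0, 128)
F = polar_to_cartesian(S, k, theta, k_grid)
image = np.abs(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(F))))
xs = 2.0 * np.pi * np.fft.fftshift(np.fft.fftfreq(k_grid.size, d=k_grid[1] - k_grid[0]))
row, col = np.unravel_index(np.argmax(image), image.shape)
# The image magnitude peaks at the scatterer position (x0, y0).
```

Restricting `theta` to a narrow arc reproduces the loss of cross-range resolution of the real aperture and strip map cases discussed in Section IV.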

A. Circular Synthetic Aperture Sonar

Since the trajectory of a circular synthetic aperture imaging sonar circumscribes a circle of radius $R_0$ [see Fig. 5(a)], then $r_\theta = R_0$ is a constant (which can be ignored). So for a circular synthetic aperture imaging sonar, (16) becomes

$$\hat{f}(x, y) = \mathcal{F}_2^{-1}\, \mathcal{P}^{-1}\left[S_\theta(k)\right] \quad (17)$$

The phase factor $e^{jkr_\theta}$ does not appear in (17) because the time origin is shifted from the sonar position to the scene center $(x, y) = (0, 0)$. This is equivalent to multiplying $S_\theta(k)$ by the spatial phase factor $e^{jkR_0}$ in the wave number domain.

B. Linear Synthetic Aperture Sonar

Fig. 5. Geometry of (a) circular synthetic aperture sonar, (b) linear synthetic aperture sonar, and (c) real aperture sidescan sonar.

For a synthetic aperture sonar with a straight-line trajectory that is parallel to the $y$-axis and offset to $x = -X_0$ [see Fig. 5(b)], the transducer occupies the positions $(-X_0, y')$ and the two-way propagation delay to a scatterer at $(x, y)$ is $(2/c)\sqrt{(x + X_0)^2 + (y - y')^2}$. The signal, whose measurement space corresponds to the $(k, y')$ domain, is given by

$$S(k, y') = \iint f(x, y)\, e^{-jk\sqrt{(x + X_0)^2 + (y - y')^2}}\, dx\, dy \quad (18)$$

where the aperture geometry maps the signal data from the Cartesian $(k_x, k_y)$ domain to the $(k, y')$ domain via the local aspect angle $\theta(y')$ of each sensor position

$$k = \sqrt{k_x^2 + k_y^2} \quad (19)$$

$$\tan\theta(y') = k_y / k_x \quad (20)$$

C. Strip Map Synthetic Aperture Sonar

Various inversion methods have been used to form strip map synthetic aperture sonar images including the exact method, the range-Doppler algorithm, and the range migration algorithm [12]. The latter algorithm, which is also known as the wave number algorithm, can be derived by applying a 1-D Fourier transform to (18), that is

$$S(k, k_{y'}) = \int S(k, y')\, e^{-jk_{y'} y'}\, dy' \quad (21)$$

$$= \iiint f(x, y)\, e^{-jk\sqrt{(x + X_0)^2 + (y - y')^2}}\, e^{-jk_{y'} y'}\, dx\, dy\, dy' \quad (22)$$

This integral can be evaluated using the principle of stationary phase [12], so the transformed signal measurement space becomes

$$S(k, k_{y'}) \approx e^{-j\sqrt{k^2 - k_{y'}^2}\, X_0}\, \mathcal{S}\left[F(k_x, k_y)\right] \quad (23)$$

where the approximation is valid for wideband signals and the forward Stolt mapping operator [10], denoted as $\mathcal{S}$, maps $(k_x, k_y)$ to $(k, k_{y'})$ via

$$k = \sqrt{k_x^2 + k_y^2} \quad (24)$$

$$k_{y'} = k_y \quad (25)$$

From (23), it can be shown that the image estimate using the range migration algorithm for a strip map synthetic aperture sonar is given by [12]

$$\hat{f}(x, y) = \mathcal{F}_2^{-1}\, \mathcal{S}^{-1}\left[e^{j\sqrt{k^2 - k_{y'}^2}\, X_0}\, S(k, k_{y'})\right] \quad (26)$$

where the inverse Stolt operator $\mathcal{S}^{-1}$ maps the measured $(k, k_{y'})$ domain data onto the rectilinear $(k_x, k_y)$ domain via

$$k_x = \sqrt{k^2 - k_{y'}^2} \quad (27)$$

$$k_y = k_{y'} \quad (28)$$

A diagrammatic representation of the inverse Stolt mapping operator is shown in Fig. 6, where the small black-filled circles indicate the locations of the data samples (observations), which lie along arcs of radius $k$ at ordinate $k_{y'}$. The inverse Stolt mapping operator transforms these radial data samples to Cartesian coordinates $(k_x, k_y)$, which are then interpolated to form the underlying rectangular grid of data points, before 2-D Fourier inversion.
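The interpolation step of the inverse Stolt mapping can be sketched as a column-by-column 1-D resampling, since the along-track wave number is unchanged by the mapping and only the radial coordinate is warped. The variable names and the smooth test spectrum below are illustrative assumptions.

```python
import numpy as np

def inverse_stolt(S, k, kyp, kx_grid):
    """Resample measured (k, ky') samples onto a rectilinear (kx, ky) grid.
    The along-track wave number ky = ky' is unchanged, so each column is a
    1-D interpolation from the curved sample locations kx = sqrt(k^2 - ky'^2)."""
    F = np.zeros((kx_grid.size, kyp.size), dtype=complex)
    for j, kyp_j in enumerate(kyp):
        valid = k >= np.abs(kyp_j)                 # exclude the evanescent region
        kx_s = np.sqrt(k[valid]**2 - kyp_j**2)     # curved sample locations
        col = S[valid, j]
        F[:, j] = (np.interp(kx_grid, kx_s, col.real, left=0.0, right=0.0)
                   + 1j * np.interp(kx_grid, kx_s, col.imag, left=0.0, right=0.0))
    return F

# Consistency check data: a spectrum that depends only on kx = sqrt(k^2 - ky'^2),
# so each resampled column should reproduce the kx grid itself.
k = np.linspace(5.0, 10.0, 201)
kyp = np.array([0.0, 1.0, 2.0])
kk, uu = np.meshgrid(k, kyp, indexing="ij")
S = np.sqrt(kk**2 - uu**2) + 0.0j
kx_grid = np.linspace(5.0, 9.0, 41)
F = inverse_stolt(S, k, kyp, kx_grid)
```

In a practical implementation the linear interpolation would be replaced by a higher-order kernel, since interpolation accuracy directly limits image quality, as the following discussion of phase oscillations makes clear.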

Note that efficient digital implementation of the range migration algorithm using the fast Fourier transform requires the phase functions in the spatial frequency domain to be referenced to the scene center at the origin $(x, y) = (0, 0)$, instead of the center of the synthetic aperture array at $(x, y) = (-X_0, 0)$. The phase oscillations then have a much lower frequency because of the shift property, that is

$$F_a(k_x, k_y) = e^{jk_x X_0}\, F(k_x, k_y) \quad (29)$$

where $F(k_x, k_y)$ and $F_a(k_x, k_y)$ are the wave number spectra of the object when the phase functions are referenced to $(x, y) = (0, 0)$ and $(x, y) = (-X_0, 0)$, respectively. This lowering of the rate of the phase oscillations is extremely important for accurately interpolating the wave number data when implementing the mapping operations [12], since it reduces the likelihood of uncertainties in the interpolated data that would degrade the image. The factor $e^{j\sqrt{k^2 - k_{y'}^2}\, X_0}$ in (26) effectively removes the highly oscillatory phase functions inside the inverse Stolt mapping $\mathcal{S}^{-1}$ before the data are interpolated onto the $(k_x, k_y)$ domain grid.

If required, the temporal origin at the center of the synthetic aperture at $(x, y) = (-X_0, 0)$ can be shifted to the scene center at $(x, y) = (0, 0)$ [see Fig. 5(b)]. This temporal shift can be represented by a transformation of the time variable $t \to t - 2X_0/c$. This is equivalent to multiplication by the spatial phase factor $e^{jkX_0}$ in the wave number domain, and so (26) becomes

$$\hat{f}(x, y) = \mathcal{F}_2^{-1}\, \mathcal{S}^{-1}\left[e^{j\left(\sqrt{k^2 - k_{y'}^2} - k\right) X_0}\, S(k, k_{y'})\right] \quad (30)$$

D. Spotlight Synthetic Aperture Sonar

Using (16) with a transformation of the time variable, or (18), it can be shown that the image function for a spotlight synthetic aperture sonar is given by

$$\hat{f}(x, y) = \mathcal{F}_2^{-1}\, \mathcal{P}_s^{-1}\left[S(k, y')\right] \quad (31)$$

where the inverse operator $\mathcal{P}_s^{-1}$ maps the signal data from $(k, y')$ space to a rectangular grid defined by

$$k_x = k\cos\theta(y') \quad (32)$$

$$k_y = k\sin\theta(y') \quad (33)$$

where $\theta(y')$ is the aspect angle at which the sensor position $(-X_0, y')$ views the scene center.

In spotlight synthetic aperture, the temporal origin is shifted from the sonar position at $(-X_0, y')$ to the scene center at $(x, y) = (0, 0)$ [see Fig. 5(b)]. This temporal shift can be represented by a transformation of the time variable $t \to t - 2r(y')/c$, where $r(y') = \sqrt{X_0^2 + y'^2}$, which is equivalent to multiplication by the spatial phase factor $e^{jkr(y')}$ in the wave number domain. Hence, the phase factor $e^{-jkr_\theta}$ in (16), or the corresponding range phase factor in (18), does not appear in the spotlight mode formulation (31) due to the definition of the time origin. In practice, the spotlight system data are gated so that only echoes

Fig. 6. The inverse Stolt operation takes the observed data (black-filled circles) and transforms them to Cartesian coordinates. The underlying Cartesian grid of regularly spaced data points is then populated by interpolation of the transformed Cartesian data. The spatial bandwidths $B_{k_x}$ and $B_{k_y}$ outline the rectangular section of the 2-D wave number space that is extracted, windowed, and inverse-Fourier-transformed to produce the image estimate (after [12]).

from around the scene center are stored. In the tomographic formulation of spotlight synthetic aperture, the measurement space consists of polar samples of the 2-D Fourier transform of the imaged scene, so

$$S(k, y') = \mathcal{P}_s\left[F(k_x, k_y)\right] \quad (34)$$

E. Real Aperture Sidescan Sonar

For a real aperture sidescan sonar, a narrow sonar beam of width $\Delta y$ is formed in the broadside direction $\theta = 0$ [see Figs. 1(a) and 5(c)]. Using (8) and (9) with $\theta = 0$ and $r_\theta = 0$ [see Fig. 5(c)] gives

$$s(t) = \iint f(x, y)\, \delta\!\left(t - \frac{2x}{c}\right) dx\, dy \quad (35)$$

Also, using (12) with $\theta = 0$ and $k_y = 0$ gives

$$S(k) = \int p(x)\, e^{-jkx}\, dx \quad (36)$$

where $p(x) = \int f(x, y)\, dy$ is the projection of $f(x, y)$ on the $x$-axis for a planar wave front at a range $x$ from the sonar. In other words, $p(x)$ is the integral of $f(x, y)$ over the lateral extent $\Delta y$ of the beam at range $x$. Equation (36) can also be written as

$$S(k) = \mathcal{F}_1\left[p(x)\right] \quad (37)$$

where the right-hand side involves a 1-D Fourier transform. In the limit of a vanishing beamwidth, the sonar beam has negligible (lateral) extent in the along-track direction. Thus, the 2-D image function formed by a real aperture sidescan sonar is composed of a series of 1-D raster scans that provide the image in the range direction; the motion of the sonar provides the image information in the along-track direction.
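The raster-scan construction described above can be sketched in a few lines of Python (an illustrative sketch only: the function name and the toy echo data are hypothetical, not from the paper):

```python
import numpy as np

def assemble_sidescan_image(range_profiles):
    # Stack successive 1-D echo envelopes (one per ping) into a 2-D image:
    # each row is a range (cross-track) profile; the ping index supplies
    # the along-track coordinate as the sonar advances.
    return np.vstack(range_profiles)

# Toy data: 4 pings, 8 range cells, a single point highlight at range cell 5.
profiles = [np.eye(1, 8, 5).ravel() for _ in range(4)]
image = assemble_sidescan_image(profiles)
```

Each new ping simply appends a row, which is why the along-track resolution of a real aperture sonar is fixed by the physical beamwidth rather than by any aperture processing.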


Fig. 7. Schematic depicting as shaded areas the extent of the polar wave number spectrum covered by various sonars: (a) tomographic aperture, (b) spotlight synthetic aperture, (c) strip-map synthetic aperture, (d) wideband real aperture, and (e) narrowband real aperture.

IV. POLAR WAVE NUMBER SPECTRUM AND POINT SPREAD FUNCTIONS

A. Polar Wave Number Spectral Coverage of Each Sonar Imaging Method

Fig. 7 depicts the sector of the polar wave number spectrum covered by each of the sonar imaging methods. The frequency bandwidth of the sonar transmissions determines the radial extent of the sector and the range of aspect angles defines the angular extent. The zero component of the wave number spectrum lies in the center of the circles. Fig. 7(a) shows the annulus spanned by a tomographic imaging sonar, which requires a wideband sonar transducer and full (360°) aspect angle coverage. In practice, high-frequency sonars transmit bandpass signals, so the region of the polar wave number spectrum spanned by the measurements is constrained to an annulus with an inner radius equal to the minimum wave number and an outer radius equal to the maximum wave number.

A synthetic aperture sonar is generally assumed to travel along a straight line, which means that the measurements are only obtained over a limited range of angles. The spotlight synthetic aperture algorithm, which has its origin in radar applications, has a 2-D wave number spectrum that, although having a larger angular extent than a strip map synthetic aperture, is still only defined over a limited angular region [see Fig. 7(b)]. The diagrams for Fig. 7(b) (spotlight synthetic aperture), Fig. 7(c) (strip map synthetic aperture sonar), and Fig. 7(d) (wideband real aperture sonar) depict a progressive reduction in aspect coverage. Fig. 7(e) (narrowband real aperture) depicts a significant reduction in the information available for high-resolution sonar imaging due to the narrow extent of the aspect coverage and the sonar transmissions being narrowband.

B. Point Spread Functions of Imaging Sonars

The image of a point source in the object space is blurred by the imaging system; image degradation can also arise due to the presence of noise. If the imaging system is linear and spatially (or shift) invariant, the image of an object can be expressed as [16]

(38)

where the quantities in (38) are the image estimate, the object function, the point spread function, the 2-D spatial convolution operation in Cartesian coordinates, and the additive noise function (which is subsequently ignored). The point spread function is equivalent to the impulse response of the imaging system; that is, it is the output at a location in the image space due to an ideal point source at a location in the object space. This is expressed mathematically as

(39)

and, for spatially invariant systems (that is, the shape of the impulse response does not change as the impulse moves around the plane)

(40)

The ideal point spread function is the 2-D Dirac delta function, since

(41)
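The convolution model of (38) and the ideal delta-function case of (41) can be illustrated with a short numerical sketch (assumptions: discrete circular convolution via the FFT stands in for the continuous 2-D convolution, the noise term is omitted, and the grid size and object are arbitrary toy values):

```python
import numpy as np

def image_estimate(obj, psf):
    # Discrete stand-in for (38): blur the object by the point spread
    # function using FFT-based circular convolution (noise term omitted).
    return np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(psf, s=obj.shape)))

# Per (41), an ideal delta-function PSF reproduces the object exactly.
obj = np.zeros((16, 16))
obj[5, 7] = 1.0
delta = np.zeros((16, 16))
delta[0, 0] = 1.0  # discrete 2-D Dirac delta at the kernel origin
est = image_estimate(obj, delta)
```

Replacing `delta` with a broader kernel shows the blurring that the inverse filtering of (43) is designed to undo.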


In this case, the image estimate is an exact match of the object function.

The image estimate using a backprojected Radon transform implementation of a tomographic imaging sonar is given by [16]

(42)

so the image estimate of the object function is blurred by the point spread function. The object function can be restored from the estimate in (42) by a 2-D (inverse) filter whose spatial frequency response compensates for the blur, that is

(43)

This is equivalent to implementing an inverse Radon transform. For a practical sonar, the sonar transmissions are bandpass filtered. When the measurements are confined to an annulus of the polar wave number space [as illustrated in Fig. 7(a)], the reconstructed tomographic image is given by (see Appendix)

(44)

where the minimum and maximum wave numbers are those associated with the bandpass-filtered sonar signal, and J₁ is the first-order Bessel function of the first kind. The point spread function, which is radially symmetric, is the difference of two Airy patterns. The radial resolution for a bandpass-filtered tomographic sonar is determined by the difference between these wave numbers.

The image estimate using a strip map implementation of a linear synthetic aperture (26) is given by [12]

(45)

where the wave number bandwidths of the extracted data (see Fig. 6) are given by [12]

(46)

(47)

The resolution of the image estimate for a strip map synthetic aperture sonar is given by [12]

(48)

(49)

For an image estimate using a spotlight mode implementation of a linear synthetic aperture (31), the wave number bandwidths of the extracted data are given by [12]

(50)

(51)

where the angular diversity of the spotlight aperture is the range of aspect angles over which insonification occurs, that is, the difference between the final and initial squint angles, and the aperture length is that of the spotlight synthetic aperture. The wave number bandwidth in the along-track direction is larger for spotlight mode than for strip map mode because the spotlight aperture is longer. The resolution of the image estimate for a spotlight synthetic aperture sonar is given by

(52)

(53)

Unlike strip map mode (49), the resolution in the along-track direction is range dependent for spotlight mode (53).
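As a numerical illustration, the standard textbook forms of these resolution expressions (range resolution c/2B; along-track resolution of half the receiver length for strip map mode; and a range-dependent along-track resolution of approximately λR/2L for spotlight mode, cf. [12]) can be evaluated as follows. These formulas and the sound speed are assumed textbook values, not expressions reproduced from the paper:

```python
C = 1500.0  # nominal sound speed in water (m/s), an assumed value

def stripmap_resolution(bandwidth_hz, receiver_length_m):
    # Assumed textbook strip-map forms: range resolution c/(2B) and a
    # range-independent along-track resolution of D/2.
    return C / (2.0 * bandwidth_hz), receiver_length_m / 2.0

def spotlight_alongtrack_resolution(center_freq_hz, range_m, aperture_m):
    # Assumed spotlight form lambda_c * R / (2 L): the angular diversity is
    # approximated by L/R, which makes the resolution range dependent.
    wavelength = C / center_freq_hz
    return wavelength * range_m / (2.0 * aperture_m)

# Parameters loosely matching the Section V experiment (150 kHz center,
# 64 kHz bandwidth, 0.258 m receiver, 8.53 m aperture, 60 m range).
dx, dy = stripmap_resolution(64e3, 0.258)
dy_spot = spotlight_alongtrack_resolution(150e3, 60.0, 8.53)
```

The range dependence of `dy_spot` contrasts with the fixed `dy` of strip map mode, which is the point made in (49) versus (53).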

V. RESULTS OF APPLYING THE SONAR IMAGING METHODS TO REAL DATA

A. Tomographic Sonar Images

In the implementation of a tomographic sonar, the data points are measured at discrete angles corresponding to the measurement positions of the sonar [see Fig. 4(a)]. The data points obtained by applying the forward spatial Fourier transform are uniformly distributed along the radii of a circle in a polar plot. It is necessary to map these points to a Cartesian grid with the data points being regularly spaced at uniform intervals in the respective directions of the rectangular grid [15]. This mapping is implemented by applying a 2-D (bilinear) interpolation routine to the measured data points. The 2-D inverse wave number transform is then applied to the (regularly spaced) interpolated data points to provide an estimate of the object function.
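The polar-to-Cartesian regridding step can be sketched as follows. This is a simplified illustration only: nearest-neighbour binning stands in for the paper's 2-D bilinear interpolation, the function name is hypothetical, and the grid sizes are arbitrary:

```python
import numpy as np

def regrid_polar_spectrum(samples, k_radii, thetas, n, k_max):
    # Map polar wave number samples F(k_i, theta_j) onto an n x n Cartesian
    # (kx, ky) grid spanning [-k_max, k_max], averaging samples that land
    # in the same cell. (The paper uses bilinear interpolation instead.)
    K, TH = np.meshgrid(k_radii, thetas, indexing="ij")
    kx, ky = K * np.cos(TH), K * np.sin(TH)
    ix = np.clip(np.round((kx + k_max) / (2 * k_max) * (n - 1)).astype(int), 0, n - 1)
    iy = np.clip(np.round((ky + k_max) / (2 * k_max) * (n - 1)).astype(int), 0, n - 1)
    grid = np.zeros((n, n), complex)
    count = np.zeros((n, n))
    np.add.at(grid, (iy, ix), samples)   # unbuffered accumulation
    np.add.at(count, (iy, ix), 1.0)
    filled = count > 0
    grid[filled] /= count[filled]
    return grid

# A constant polar spectrum stays constant wherever the grid is populated;
# the unpopulated cells show the annulus left empty by the bandpass signal.
k = np.linspace(0.5, 1.0, 8)
th = np.linspace(0.0, 2 * np.pi, 32, endpoint=False)
g = regrid_polar_spectrum(np.ones((8, 32), complex), k, th, 33, 1.0)
```

In the full algorithm, the 2-D inverse FFT of the populated Cartesian grid would then yield the image estimate.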

The projection data (received echo signal as a function of aspect angle for one complete revolution of the object) shown in Fig. 4(a) are for the bomb photographed in Fig. 8(a). A tomographic image of this bomb is shown in Fig. 8(b). This tomogram was formed by applying the Fourier reconstruction method to the wideband sonar projection data shown in Fig. 4(a) (for details, see [15]). The separation distance between the sonar and the object was 60 m. The center frequency of the sonar was 150 kHz and the bandwidth of the sonar transmissions was 100 kHz. Features evident in the image are the outline of the bomb case and the two lugs to which wire lifting ropes were attached.

The experiment was repeated using another object, the Stonefish exercise mine, photographed in Fig. 8(c). Features evident in the tomographic sonar image, which is shown in Fig. 8(d), are the outline of the mine case and the nose dome that is intended


Fig. 8. (a) Photograph and (b) tomographic image of the 900-kg bomb. (c) Photograph and (d) tomographic image of the Stonefish exercise mine.

to be an acoustically transparent window. Also prominent is the recessed section near the center of the mine where the diameter of the cylindrical shape is reduced. The two small white circular features in the image are due to the lifting lugs on the mine casing and the attached support cables.

B. Synthetic Aperture Sonar Images

The U.K. Defence Evaluation and Research Agency deployed its mobile synthetic aperture sonar rail facility in Sydney Harbour. The center frequency of the sonar was 150 kHz and the bandwidth was 64 kHz. An example of a linear synthetic aperture sonar image formed using data from the sonar-on-a-rail system is shown in Fig. 9(a); the length of the aperture synthesized was 8.53 m. This sonar image indicates the presence of two objects: the bomb and the Stonefish exercise mine photographed in Fig. 8(a) and (c), respectively. The geometrical shape and orientation of each object is shown in Fig. 9(b). In the synthetic aperture sonar image, each object is represented by a spatial distribution of acoustic highlights. Unlike the tomographic images shown in Fig. 8(b) and (d), the shapes of the objects are not discernible in the synthetic aperture sonar image. This is a consequence of the limited range of aspect angles from which the objects were insonified. In Fig. 4(a), the target highlights corresponding to point reflectors on the bomb are visible from nearly all aspect angles; these highlights can be seen in the synthetic aperture sonar image where the bomb is 45° from broadside. To observe the shape of the bomb would require reflections from the sides of the case. Due to the specular nature of these reflections, they can only be observed close to broadside. Due to the limited range of look angles available for the synthetic aperture sonar experiment, no measurements were made for these aspects and so the shape of the mine case is not defined.

Due to the strong highlights present and the high resolution of the sonar, the signal to reverberation level was in the order of 50 dB. To prevent sidelobe leakage from the highlights into the remainder of the image, it was necessary to suppress the sidelobes by an amount greater than this. While this can be readily achieved with an appropriate choice of aperture shading over the length of the array, sidelobe suppression of this magnitude requires extremely accurate phase matching over the sensors in the array. For a synthetic aperture array, where measurements are made using the same sensors in different positions and at different times, the phase matching requires compensation for errors in the sensor position and changes in the path lengths due to fluctuations in the speed of sound propagation in water. To measure the fluctuations caused by errors in the sensor positions from a nominal straight line and changes in the sound speed, the propagation time for the signal reflected from a target highlight was accurately measured using the zero crossing of the phase of the analytic signal. As the sensor position changed along the rail, the round-trip time delay (for sound wave propagation from the sonar to the target highlight and back to the sonar) varied due to changes in the range to the target highlight. Over the limited extent of look angles, the variation in this range could be approximated by a parabola. So, by fitting a parabola to the measurements of the round-trip time delay, the position of the target highlight was estimated. The fluctuations in the


Fig. 9. (a) Synthetic aperture sonar image and (b) line drawings of the 900-kg bomb (upper object) and the Stonefish exercise mine (lower object).

round-trip time delays due to sensor position errors and changes in the propagation speed were then estimated as the variation in the time delays about this parabola. Compensating for this fluctuation allowed the signal returns from the highlight to be accurately aligned. To quantify the accuracy of this alignment over the field of view, the compensation determined from the initial highlight was used to align the signal reflected from a separate target highlight. Any change in the speed of sound in the paths to each of the target highlights will result in errors in the alignment about the parabola associated with the second highlight. For target highlights separated by less than 2 m, it was found that the errors were small enough to satisfy the criterion to achieve the required sidelobe suppression. Thus, focusing on a single target highlight is sufficient to focus the signal over the extent of a mine. For target highlights separated by 5 m, the error increased beyond this criterion. Thus, to achieve the required sidelobe suppression for the second object, it would be necessary to focus on a separate highlight from that particular object. It was found that over the region shown in Fig. 9, fluctuations over the period of the measurements resulted in changes in the effective path length in the order of about 1 mm. However, over a distance of a couple of meters they are much smaller, in the order of 100 μm. This was the phase accuracy required to achieve the desired sidelobe suppression, and thus by focusing on the highlight from the lower left corner of the lower object, the image was obtained free of sidelobe leakage. However, the suppression of the sidelobe leakage to the same level for the synthetic aperture sonar image of the other (upper) object would require refocusing the array on a prominent point of reflection (highlight) on the upper object, before applying the shading coefficients.
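The parabola-fitting step that separates the geometric range variation from the medium and sensor-position fluctuations can be sketched as follows (illustrative only: the range, sound speed, and rail positions are made-up values, and the function name is hypothetical):

```python
import numpy as np

def residual_delay_fluctuations(positions, delays):
    # Fit a parabola to round-trip delay versus along-rail sensor position
    # (valid over a limited span of look angles) and return the residuals,
    # which estimate the delay fluctuations caused by sensor position
    # errors and sound-speed changes.
    coeffs = np.polyfit(positions, delays, 2)
    return delays - np.polyval(coeffs, positions)

# Synthetic check: a purely geometric delay history over a short rail is
# nearly parabolic, so the residuals are ~zero; measured data would leave
# the fluctuation term to be compensated before beamforming.
u = np.linspace(-4.0, 4.0, 41)              # along-rail positions (m)
tau = 2 * np.sqrt(60.0**2 + u**2) / 1500.0  # round-trip delay, 60 m range
resid = residual_delay_fluctuations(u, tau)
```

In the experiment, the residuals about the fitted parabola play the role of the micronavigation correction applied before aperture shading.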

C. Real Aperture Sonar Images

The image estimates formed using a real aperture sidescan sonar are displayed in Fig. 10. The images correspond to the lower object shown in Fig. 9 (the Stonefish exercise mine). The


Fig. 10. Real aperture sonar image of the Stonefish exercise mine using (a) wideband signals and (b) narrowband signals.

real aperture was composed of 32 hydrophone elements and the total length of the real aperture was 0.258 m. The center frequency of the sonar was 150 kHz and the bandwidth of the sonar transmissions was large (64 kHz) in Fig. 10(a) and small (8 kHz) in Fig. 10(b). Fig. 10(a) shows that the wideband (64 kHz) sonar allows features (or acoustic highlights) of the object to be resolved in range. However, the details of these features become blurred in the range direction when the sonar transmissions are narrowband (8 kHz). Traditionally, in-service mine hunting sonars use narrowband sonar transmissions. Also, in Fig. 10, the azimuthal resolution is low due to the physical extent of the real aperture being small. This leads to smearing of the object image in the cross-range direction.

VI. CONCLUSION

A generalized approach to sonar image formation enables a number of commonly used active sonar imaging algorithms to be derived by the progressive application of constraints to the equation for tomographic imaging. Real aperture sidescan sonar imagery reveals the presence of mine-like objects, but its coarse cross-range resolution prevents imaging of any of the detail in the azimuthal direction. Linear synthetic aperture sonar provides high-resolution maps of the seafloor’s reflectivity, with the enhanced cross-range resolution enabling the spatial distribution of prominent reflection points on mine-like objects to be resolved in the azimuthal direction. Tomographic sonar provides superior imagery, where the cross-range resolution of the tomographic image that is reconstructed from projections is matched to the range resolution of ultrawideband sonar transmissions.

APPENDIX

EFFECT OF THE BANDWIDTH OF THE MEASUREMENT SIGNAL ON THE IMAGE FORMED BY A TOMOGRAPHIC SONAR

Ignoring the effects of noise, the tomographic image estimate is given by

(A1)

where the image estimate in (A1) is the 2-D spatial convolution, in Cartesian coordinates, of the point spread function of the imaging system with the object function.

In practice, sonar signals are bandpass filtered, so the measurements of the polar wave number spectrum are limited to an annulus defined by

(A2)

(A3)

(A4)

that is, the spectrum is zero for wave numbers below the minimum and above the maximum, and equals the measured values in between.

These constraints on the wave number space available for image reconstruction lead to smoothing and loss of resolution in the image. The effect of the smoothing can be quantified by evaluating the point spread function, which is given by

(A5)

where

Substituting the annulus limits then gives


A polar representation of the point spread function is given by

(A6)

where the Cartesian spatial and wave number coordinates have been expressed in polar form.

Since the integration with respect to the angular wave number variable is independent of the angular position of the image point, then

(A7)

Expanding the exponential in (A7) into its real and imaginary parts gives the radially symmetric function

(A8)

Since the cosine is an even function and the sine is an odd function (so that the imaginary part of the integral is zero), (A8) becomes

(A9)

where J₀ is the zero-order Bessel function of the first kind.

Given the standard identity ∫ x J₀(x) dx = x J₁(x), where J₁ is the first-order Bessel function of the first kind, (A9) becomes

(A10)

Finally, as shown in (44), (A10) can be expressed as

(A11)
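The Bessel-function identity invoked between (A9) and (A10) can be checked numerically from the integral representations of J₀ and J₁, using only NumPy (a verification sketch; the quadrature sizes are arbitrary):

```python
import numpy as np

def _trapz(y, x):
    # Simple trapezoidal rule (avoids NumPy version differences).
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def j0(x, n=2001):
    # Integral representation J0(x) = (1/pi) * int_0^pi cos(x sin t) dt.
    t = np.linspace(0.0, np.pi, n)
    return _trapz(np.cos(x * np.sin(t)), t) / np.pi

def j1(x, n=2001):
    # Integral representation J1(x) = (1/pi) * int_0^pi cos(t - x sin t) dt.
    t = np.linspace(0.0, np.pi, n)
    return _trapz(np.cos(t - x * np.sin(t)), t) / np.pi

# Check the identity int_0^a x J0(x) dx = a J1(a), which carries (A9) to (A10).
a = 3.0
x = np.linspace(0.0, a, 1501)
lhs = _trapz(x * np.array([j0(xi) for xi in x]), x)
rhs = a * j1(a)
```

The two sides agree to within the quadrature error, confirming the step used in the Appendix.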

ACKNOWLEDGMENT

The authors would like to thank the Maritime Operations Division’s High Frequency Sonar Trials Team: J. Shaw, C. Halliday, N. Tavener, L. Criswick, J. Cleary, R. Susic, A. Head, and G. Speechley, and M. Savage from the Woronora Test Facility, for the contributions to this paper. They would also like to thank Dr. K. Lo, Maritime Operations Division (DSTO Sydney), for the technical discussions on high-frequency sonar signal processing and Prof. P. Gough, Department of Electrical and Electronic Engineering, University of Canterbury, New Zealand, for the technical discussions on synthetic aperture sonar. The measurements of the object field were made using an underwater rail and sonar provided by DERA (U.K.) for a joint experiment conducted under the auspices of TTCP MAR TP-13 in Sydney Harbour.

REFERENCES

[1] R. O. Nielsen, Sonar Signal Processing. Boston, MA: Artech House, 1991.

[2] B. G. Ferguson, “Minimum variance distortionless response beamforming of acoustic array data,” J. Acoust. Soc. Amer., vol. 104, pp. 947–954, 1998.

[3] K. W. Lo, “Adaptive array processing for wide-band active sonars,” IEEE J. Ocean. Eng., vol. 29, no. 3, pp. 837–846, Jul. 2004.

[4] A. C. Kak and M. Slaney, Principles of Computerized Tomographic Imaging. New York: IEEE Press, 1988.

[5] M. Soumekh, Fourier Array Imaging. Englewood Cliffs, NJ: Prentice-Hall, 1994.

[6] M. Soumekh, Synthetic Aperture Radar Signal Processing. New York: Wiley, 1999.

[7] D. C. Munson, J. D. O’Brien, and W. K. Jenkins, “A tomographic formulation of spotlight-mode synthetic aperture radar,” Proc. IEEE, vol. 71, no. 8, pp. 917–925, Aug. 1983.

[8] C. V. Jakowatz, D. E. Wahl, P. H. Eichel, D. C. Ghiglia, and P. A. Thompson, Spotlight-Mode Synthetic Aperture Radar: A Signal Processing Approach. New York: Springer-Verlag, 1996.

[9] D. L. Mensa, High Resolution Radar Cross-Section Imaging. Boston, MA: Artech House, 1991, ch. 6.

[10] R. H. Stolt, “Migration by Fourier transform,” Geophysics, vol. 43, pp. 23–48, 1978.

[11] M. P. Hayes, “A CTFM synthetic aperture sonar,” Ph.D. dissertation, Dept. Electr. Electron. Eng., Univ. Canterbury, Christchurch, New Zealand, 1989.

[12] D. W. Hawkins, “Synthetic aperture imaging algorithms,” Ph.D. dissertation, Dept. Electr. Electron. Eng., Univ. Canterbury, Christchurch, New Zealand, 1996.

[13] P. T. Gough and D. W. Hawkins, “Imaging algorithms for a strip map synthetic aperture sonar: Minimizing the effects of aperture errors and aperture undersampling,” IEEE J. Ocean. Eng., vol. 22, no. 1, pp. 27–39, Jan. 1997.

[14] A. Bellettini and M. A. Pinto, “Theoretical accuracy of synthetic aperture sonar micronavigation using a displaced phase-center antenna,” IEEE J. Ocean. Eng., vol. 27, no. 4, pp. 780–789, Oct. 2002.

[15] B. G. Ferguson and R. J. Wyber, “Application of acoustic reflection tomography to sonar imaging,” J. Acoust. Soc. Amer., vol. 117, pp. 2915–2928, 2005.

[16] A. K. Jain, Fundamentals of Digital Image Processing. Englewood Cliffs, NJ: Prentice-Hall, 1989, ch. 2 and 10.

Brian G. Ferguson (M’93–SM’07) received the B.Sc. (with honors) and Dip.Ed. degrees from the University of Sydney, Sydney, N.S.W., Australia, in 1972 and 1973, respectively, and the M.Sc. (research) and Ph.D. degrees from the University of New South Wales, Sydney, N.S.W., Australia, in 1976 and 1982, respectively.

From 1974 to 1984, he was a Physicist with the Australian Department of Science and Technology, where his research activities were in the fields of solar radio astronomy and ionospheric physics. In


1984, he joined the Submarine Sonar Group at the Royal Australian Navy Research Laboratory, which was incorporated into Australia’s Defence Science and Technology Organisation (DSTO), Pyrmont, N.S.W., Australia, in 1988. He is currently a Principal Research Scientist, where he leads a group involved in the research and development of littoral environment sensing systems and concepts. He is Australia’s national leader in Battlespace Acoustics under the Technical Cooperation Program between the Governments of Australia, Canada, New Zealand, the United Kingdom, and the United States. He is also an invited member of NATO Sensors and Electronics Technology Panel TG-53. His work has been published widely in international scientific and engineering journals. In September 2006, the IEEE Oceanic Engineering Society appointed him as General Chair of the prestigious international OCEANS conference and exhibition to be held at the Sydney Convention and Exhibition Centre, Darling Harbour, May 24–28, 2010. In June 2009, he was appointed Principal Scientist and Engineer responsible for acoustic systems science in the DSTO.

Dr. Ferguson is a Fellow of the Acoustical Society of America, a Senior Member of the Australian Institution for Radio and Electronics Engineers, a Chartered Professional Engineer, and a Member of the Australian Institution of Engineers. As an invited member of SET-093, he received the NATO Research and Technology Organisation 2009 Scientific Achievement Award.

Ron J. Wyber (M’85) received the B.Sc., B.E. (honors 1, electrical), and Ph.D. degrees in science and electrical engineering from Sydney University, Sydney, N.S.W., Australia, in 1968, 1970, and 1974, respectively.

From 1974 to 1987, he worked for the Australian Defence Department and the Royal Australian Navy Research Laboratory. From 1983 to 1985, he was a Visiting Scientist at the Applied Research Laboratory, University of Texas (ARL-UT). In 1987, he was a founding director of the company Sonartech and in 1989 the company Midspar Systems Pty, Ltd., Oyster Bay, N.S.W., Australia, for which he currently works.

Dr. Wyber is a member of the Acoustical Society of America.