
Cent. Eur. J. Phys. DOI: 10.2478/s11534-009-0132-7

Central European Journal of Physics

Optics-less smart sensors and a possible mechanism of cutaneous vision in nature

Research Article

Leonid Yaroslavsky1∗, Chad Goerzen1, Stanislav Umansky1, H. John Caulfield2†

1 Department of Physical Electronics, Faculty of Engineering, Tel Aviv University, Tel Aviv, Ramat Aviv 69978, Israel

2 Physics Department, Fisk University, 1000 17th St., Nashville, TN 37298, USA

Received 29 September 2008; accepted 1 August 2009

Abstract: Although optics-less cutaneous (“skin”) vision is not uncommon among living organisms, its mechanisms and capabilities have not been thoroughly investigated. This paper demonstrates, using methods from statistical parameter estimation theory and numerical simulations, that arrays of bare radiation detectors arranged on a planar or curved surface have the ability to perform imaging tasks without any optics at all. The working principle of this type of optics-less sensor and the model developed here for determining its performance may be used to shed light on possible mechanisms, capabilities and evolution of cutaneous vision in nature.

PACS (2008): 42.30.Va, 42.79.Pw, 42.66.-p, 87.19.lt

Keywords: radiation sensors • estimation theory • cutaneous vision • imaging systems

© Versita Warsaw and Springer-Verlag Berlin Heidelberg.

1. Introduction

Organisms in nature use a wide variety of visual systems [1, 2]. Most of them use optics to form images, but optics-less cutaneous vision (skin vision) is also found among many types of living organisms. One example may be heliotropism, the ability of some plants to orient their leaves or flowers towards the Sun. It seems obvious that this ability requires some sort of “skin” vision to determine the direction to the Sun. There are also many examples of animals with organs that may be classified as optics-less imaging sensors. These include:

∗ E-mail: [email protected] (Corresponding author)
† E-mail: hjc@fisk.edu

• “Primitive eyes” of many invertebrates, including eye spots (patches of photosensitive cells located on the skin), cup eyes, and pit eyes [1, 2].

• Cutaneous photo-reception in reptiles [3, 4].

• “Pit organs” of vipers, located near their normal eyes – these image-forming organs are sensitive to infra-red radiation and do not contain any optics [5, 6].

• The “lateral line system”, an organ characteristic of some fishes, which is sensitive to pressure waves in water and which they use to localize sources of vibration located within approximately one body length [7].

• Electro-receptors, found in some types of fishes and sharks, as well as the platypus, are sensitive to electric fields. These organs allow animals to sense electrical-field variations in their surroundings within approximately one body length. Data presented in [8] suggest that the platypus as a whole has approximately 100 times the sensitivity of a single electro-receptor to an electrical source, which may imply that both multiple-receptor data fusion and higher-level processing are being carried out by the platypus.

There are also numerous reports on the phenomenon of cutaneous vision in humans [9–11]. In particular, Ref. [10] provides some quantitative data on the ability of a certain young woman to “see” images using only the fingers of her right hand. It also reports that in a series of carefully conducted tests, this subject demonstrated the ability to detect colours, to resolve patterns in near-contact with her fingers with a resolution of about 0.6 mm, and the ability to determine simple patterns within a maximal distance of 1-2 cm from the fingers. Refs. [12, 13] report experiments which suggest that photosensitive cells on the skin (abdomen, back, foot, and tongue have all been used) may be used to allow blind people to “see” to a limited extent in a process that is termed “sensory substitution”. However, most available publications provide phenomenological findings rather than discussion of the underlying mechanisms and capabilities of cutaneous vision.

The properties of cutaneous vision reported in the literature correlate very well with the properties of the “Optics-less Smart” (OLS) sensors proposed recently in [14–16]. This correlation enables us to consider the operational principles of OLS sensors as a possible mechanism of cutaneous vision, as well as of other types of primitive vision.

2. OLS sensors and their operation principle

An OLS sensor consists of an array of radiation detectors (receptors) arranged on a planar or curved surface and supplemented with a signal processing unit. This unit collects the outputs of the detectors and uses them to compute statistically optimal estimates of radiation source intensities and coordinates. The detectors are not supplemented with optics or any other means with which to shape their angular sensitivity beyond that inherently defined by the physical properties of their surface.

Depending on a priori knowledge about the radiation sources, planar OLS sensors can be used in the following basic operation modes:

• “General localization” mode: estimation of intensities and locations of a known number of point radiation sources.

• “Imaging” mode: estimation of intensities of a given number of point radiation sources with known spatial locations.

Figure 1. Planar OLS sensor (a) and a schematic diagram of its 1D model (b).

For the purpose of this paper, we will concentrate on planar OLS sensors, shown in Fig. 1a. We will explain their operational principle using the 2-D model shown in Fig. 1b. As shown in Fig. 1b, let {I, x0, y0} represent the intensity and coordinates of a radiation source in the sensor’s coordinate system, and let the sensor consist of N radiation detectors whose outputs {sk}, k = 1, ..., N are used in the signal processing unit to determine the source intensity and coordinates {I, x0, y0}. Assume that the k-th detector has an angular sensitivity function AngSen(θk), where θk ∈ [−π/2, π/2] is the angle between the surface normal and the direction to the source from the detector’s position (xk, 0). Assume also that the detector’s response to the radiation is corrupted by a random interference {nk} that can be modelled as additive signal-independent Gaussian noise. Then, taking into account that radiation energy is inversely proportional to the distance from the radiation source to the detector, the output of the k-th detector can be written as

s_k = \frac{I \, \mathrm{AngSen}(\theta_k(x_0, y_0))}{\sqrt{(x_0 - x_k)^2 + y_0^2}} + n_k . \qquad (1)
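To make the forward model of Eq. (1) concrete, the following short numerical sketch (our own illustration, not part of the original paper; the detector positions, noise level and source parameters are assumptions) computes the noisy outputs of a line array of bare Lambertian detectors for a single point source:

import numpy as np

def detector_outputs(I, x0, y0, xk, sigma, P=1, rng=None):
    """Noisy outputs of a line array of bare detectors, Eq. (1).
    I, x0, y0 : source intensity and coordinates
    xk        : 1-D array of detector positions on the line y = 0
    sigma     : standard deviation of the additive Gaussian noise
    P         : exponent of the angular sensitivity; P = 1 is Lambertian
    """
    rng = np.random.default_rng() if rng is None else rng
    r = np.sqrt((x0 - xk) ** 2 + y0 ** 2)        # distance from source to detector
    cos_theta = y0 / r                           # cosine of angle to the surface normal
    s = I * cos_theta ** P / r                   # Eq. (1) with AngSen = cos^P
    return s + rng.normal(0.0, sigma, size=xk.shape)

# Illustrative example: 25 detectors at unit spacing, source on the sensor axis.
xk = np.arange(25, dtype=float) - 12.0
s = detector_outputs(I=1.0, x0=0.0, y0=1.0, xk=xk, sigma=0.01)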

In view of the statistical nature of the sensor noise, the Bayesian approach to optimal parameter estimation can be applied [17]. Without limitation of generality, we will assume here that the signal-processing unit of the sensor is programmed to compute the Maximum Likelihood (ML) estimates {Î, x̂0, ŷ0} of the parameters {I, x0, y0}. Note that computing other statistical estimates, such as, for instance, Maximum A Posteriori Probability estimates, will only require a corresponding re-programming of the signal processor unit.

For the selected Gaussian additive noise model, Maximum Likelihood estimates {Î, x̂0, ŷ0} minimize the mean-square differences between the detectors’ outputs {sk} and their hypothetical noiseless values:

\{\hat{I}, \hat{x}_0, \hat{y}_0\} = \arg\min_{\{I, x_0, y_0\}} \sum_{k=1}^{N} \left[ s_k - \frac{I \, \mathrm{AngSen}(\theta_k(x_0, y_0))}{\sqrt{(x_0 - x_k)^2 + y_0^2}} \right]^2 . \qquad (2)

In what follows we will use the simplest model for the angular sensitivity function of the detectors. Specifically, we will assume that the detector’s output is proportional to the radiation energy per unit surface area of the detector. This assumption implies the so-called Lambertian model, according to which the detector’s output is proportional to the cosine of the angle between the surface normal and the direction to the radiation source. In this case the angular sensitivity function AngSen(θk(x0, y0)) of the k-th detector in Eq. (2) is defined by the equation

\mathrm{AngSen}(\theta_k(x_0, y_0)) = \cos\theta_k(x_0, y_0) . \qquad (3)
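As an illustration of the estimation step of Eq. (2), the following sketch (our own illustrative code; the optimizer choice and starting point are assumptions, and a single simplex search stands in for the multi-start quasi-Newton procedure used later in the paper) recovers the source parameters from simulated detector outputs by numerically minimizing the sum of squared residuals:

import numpy as np
from scipy.optimize import minimize

def model(params, xk):
    """Noiseless Lambertian detector outputs for a source (I, x0, y0), per Eq. (1)."""
    I, x0, y0 = params
    r2 = (x0 - xk) ** 2 + y0 ** 2
    return I * y0 / r2                 # I * cos(theta) / r, with cos(theta) = y0 / r

def ml_estimate(s, xk, start=(1.0, 0.0, 1.0)):
    """Least-squares (ML under Gaussian noise) estimate of {I, x0, y0}, Eq. (2)."""
    cost = lambda p: np.sum((s - model(p, xk)) ** 2)
    res = minimize(cost, x0=np.asarray(start), method="Nelder-Mead")
    return res.x

# Example: simulate readings for a source at (0, 1) and estimate its parameters.
xk = np.arange(25, dtype=float) - 12.0
rng = np.random.default_rng(0)
s = 1.0 * 1.0 / ((0.0 - xk) ** 2 + 1.0) + rng.normal(0.0, 0.01, xk.shape)
print(ml_estimate(s, xk))              # should be close to [1, 0, 1]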

3. Sensors’ performance: theoretical lower bounds

Conventional optical imaging devices satisfy the superposition principle and are modelled as linear systems. Consequently, their performance is commonly characterized by their point-spread functions. OLS sensors are decision-making devices. Because of this they are essentially nonlinear, and their imaging capability cannot be treated in

terms of point-spread functions. We will evaluate the performance of the OLS sensors in terms of the variances of the source parameter estimation errors. Statistical theory of parameter estimation shows that, for parameter estimation from data corrupted by sufficiently small independent Gaussian additive noise, estimation errors have a normal distribution with a mean of zero and a variance given by the Cramér-Rao Lower Bounds (CRLB, [17]). Derivation of the CRLB requires quite bulky computations. For the sake of brevity, we provide here results obtained for a simple special case of three detectors and one source on the sensor’s “optical axis” in front of it, at coordinates (x0 = 0, y0 = 1) in units of the inter-detector distance. In this case the CRLB error variances Var{Ierr}, Var{xerr}, and Var{yerr} in estimating the source intensity and X-Y coordinates are found to be¹:

\mathrm{Var}\{I_{err}\} = \mathrm{Var}\{|\hat{I} - I|\} = 2\sigma^2 , \qquad (4)

\mathrm{Var}\{x_{err}\} = \mathrm{Var}\{|\hat{x}_0 - x_0|\} = \frac{2\sigma^2}{I^2} , \qquad (5)

\mathrm{Var}\{y_{err}\} = \mathrm{Var}\{|\hat{y}_0 - y_0|\} = \frac{3\sigma^2}{I^2} , \qquad (6)

where σ² is the detector noise variance.

This result implies that planar OLS sensors are potentially capable of estimating the intensities and coordinates of radiation sources in their vicinity with estimation errors of the order of magnitude of the relative noise level at the sensor’s detectors. Eqs. (5) and (6) also suggest that the sensor is more accurate in estimating the X-coordinate than the Y-coordinate.

Cases involving a larger number of detectors and a larger number of sources are much more involved. From general principles of mathematical statistics, one can expect that estimation accuracy will increase with an increase of the number of detectors. The increase will, however, soon slow down, because individual detectors do not contribute uniformly to the estimation: detectors which are closest to the source provide the main contributions. On the other hand, one can also expect that, for a given number of detectors, the source parameter estimation accuracy will decrease with an increased number of sources.

¹ S. Umansky, Imaging without optics: localization of radiation sources in close proximity using flat lens-less sensors, www.eng.tau.ac.il/%7Eyaro/StasUmansky/ThesisWebSummary.pdf
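The bounds of Eqs. (4)-(6) can be checked numerically from the Fisher information matrix of the three-detector model. The short sketch below is our own illustration; the detector positions x = -1, 0, 1 are an assumption (one plausible reading of “in units of inter-detector distance”) that reproduces the quoted bounds:

import numpy as np

# Noiseless Lambertian outputs mu_k = I * y0 / ((x0 - xk)^2 + y0^2), per Eq. (1).
def mu(params, xk):
    I, x0, y0 = params
    return I * y0 / ((x0 - xk) ** 2 + y0 ** 2)

def crlb(params, xk, sigma, eps=1e-6):
    """CRLB covariance: inverse Fisher matrix for additive Gaussian noise of variance sigma^2."""
    p = np.asarray(params, dtype=float)
    G = np.empty((xk.size, p.size))          # gradients of mu, one column per parameter
    for j in range(p.size):
        dp = np.zeros_like(p); dp[j] = eps
        G[:, j] = (mu(p + dp, xk) - mu(p - dp, xk)) / (2 * eps)
    fisher = G.T @ G / sigma ** 2
    return np.linalg.inv(fisher)

# Three detectors at x = -1, 0, 1 (assumed), source on the axis at (0, 1), I = 1.
cov = crlb((1.0, 0.0, 1.0), np.array([-1.0, 0.0, 1.0]), sigma=0.01)
print(np.diag(cov) / 0.01 ** 2)              # approximately [2, 2, 3], matching Eqs. (4)-(6)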


4. Numerical simulation of OLS sensors and simulation results

The performance and imaging capability of planar OLS sensors were thoroughly studied by means of statistical simulation. To solve the optimization problem of Eq. (2), a multi-start algorithm based on a Quasi-Newton optimization procedure [18] was used. For each vector of the source parameters, the model was run 20-50 times in order to accumulate statistical data to evaluate the sensor performance in terms of the standard deviation of the parameter estimation error over the runs. It should be noted that the stochastic algorithm used by the global-optimization method is not guaranteed to always converge to the global minimum, although in most cases convergence does occur. In order to eliminate the results of non-converged runs, robust estimates were first found for the mean µ and standard deviation σ:

\mu = Q_2 ; \qquad \sigma = Q_3 - Q_1 , \qquad (7)

where Q1, Q2 and Q3 are the first, second and third quartiles of the data distribution. Data outside the range [µ − 4σ, µ + 4σ] were removed, and the mean and standard deviation were re-calculated from the remaining data. Additionally, in order to accelerate the search and improve the reliability of global-minimum location, the input data were subjected to de-correlation pre-processing by means of the “whitening” algorithm [19], which has proved to be the optimal pre-processing algorithm for target location in clutter. It is interesting to note that a similar de-correlation pre-processing is known in vision science as “lateral inhibition” [20].
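A minimal sketch of this simulation loop is given below (our own illustrative code, not the authors’ implementation; the number of starts, the BFGS quasi-Newton step standing in for the procedure of [18], and the forward model are assumptions). It runs the least-squares estimation of Eq. (2) from several random starting points and then applies the quartile-based filter of Eq. (7) to the repeated-run results:

import numpy as np
from scipy.optimize import minimize

def model(p, xk):                        # noiseless Lambertian outputs, Eq. (1)
    I, x0, y0 = p
    return I * y0 / ((x0 - xk) ** 2 + y0 ** 2)

def multistart_estimate(s, xk, n_starts=10, rng=None):
    """Best of n_starts quasi-Newton (BFGS) minimizations of the Eq. (2) cost."""
    rng = np.random.default_rng() if rng is None else rng
    cost = lambda p: np.sum((s - model(p, xk)) ** 2)
    best = None
    for _ in range(n_starts):
        start = rng.uniform([0.1, xk.min(), 0.1], [2.0, xk.max(), 5.0])
        res = minimize(cost, start, method="BFGS")
        if best is None or res.fun < best.fun:
            best = res
    return best.x

def robust_mean_std(values):
    """Quartile-based mean/std of Eq. (7), recomputed after removing outliers."""
    q1, q2, q3 = np.percentile(values, [25, 50, 75])
    mu, sig = q2, q3 - q1
    kept = values[(values >= mu - 4 * sig) & (values <= mu + 4 * sig)]
    return kept.mean(), kept.std()

# Example: repeat the estimation over 30 noisy realizations of the same source.
xk = np.arange(25, dtype=float) - 12.0
rng = np.random.default_rng(1)
runs = []
for _ in range(30):
    s = model((1.0, 0.0, 1.0), xk) + rng.normal(0.0, 0.01, xk.shape)
    runs.append(multistart_estimate(s, xk, rng=rng))
y_estimates = np.array(runs)[:, 2]
print(robust_mean_std(y_estimates))      # mean near 1; std of the order of the noise level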

The computer model was successfully tested for agreement with the above theoretical lower bounds on the errors of estimating the intensity and coordinates of a single radiation source by a three-detector sensor. The model was then run to evaluate the sensor’s field of view.


Figure 2. Maps of standard deviations of estimation errors of X-Y coordinates (a, b) and of intensity (c) of a radiation source as a function of the source position with respect to the planar line array of 25 detectors. Darker areas correspond to larger errors. Plot (d) shows standard deviations of X-Y and intensity estimation errors as a function of distance from the sensor along its “optical axis” (central cross-sections of plots (a)-(c)).


Figs. 2a-d enable quantitative evaluation of the size and shape of the sensor’s field of view in terms of the standard deviation of estimates of the X-Y coordinates and intensity of a single source placed at different positions in front of a line array of 25 detectors. In particular, the plots in Fig. 2d confirm the above theoretical conclusion that errors in estimating the Y-coordinate of the source exceed those of the X-coordinate. From these data one can see that accurate estimation of source intensity and coordinates is possible within small distances from the sensor, but at distances greater than about half the sensor length the estimation accuracy drops by an order of magnitude.

Further experiments also revealed that planar sensors are capable of resolving two sources if the distance between them exceeds a value of the order of the inter-detector distance, which is intuitively well understood. Experiments with different numbers of detectors gave similar results and confirmed the above suggestion that increasing the number of detectors increases the source parameter estimation accuracy, though to a limited degree.

Figure 3. An illustration of sensing of 8×16 radiation sources arranged on a plane in the form of the characters “SV” by a 3-D model of a planar OLS sensor (made up of 8×16 detectors) in “imaging” mode, for source distances from the sensor of 1 to 8 (in units of inter-detector distance). Each row of the figure shows, for one distance, the detector readings and the estimated source intensities; the corresponding error standard deviations are 5.6850e-04 (distance 1), 0.0105 (distance 2), 0.0590 (distance 4) and 0.1197 (distance 8). SNR was kept constant at 100 by making the source amplitude proportional to the distance between the source plane and the sensor plane. Detector noise was 0.01.


Fig. 3 illustrates the work of a planar sensor of 8×16 detectors in “imaging” mode. It shows the results of reconstruction of the intensities of 8×16 sources arranged in a planar array by a planar sensor also consisting of 8×16 detectors. The test was run several times, each time with the sensor plane placed at a different distance from the source plane. The source intensities form an image containing the characters “SV”. The noise standard deviation at the detectors was set to 0.01 in units of the source intensities, which were set to zero (dark) and one (bright). As one can see from the figure, a planar OLS sensor with a realistic noise level is capable of reasonably good imaging of radiation sources located in its close proximity, though the quality of the estimates rapidly decays with the distance to the sources. This phenomenon matches observations published in the above-mentioned studies of cutaneous vision.
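In “imaging” mode the source positions are known and only the intensities are estimated, so the minimization of Eq. (2) reduces to a linear least-squares problem. The sketch below is our own illustrative reconstruction under that reading; the geometry, the noise level, the random test pattern standing in for the “SV” image, the 3-D attenuation law carried over from Eq. (1), and the plain least-squares solver are all assumptions (the paper itself uses the general multi-start optimizer):

import numpy as np

# Detector grid (8 x 16) on the plane z = 0 and an equal grid of sources at z = d.
ny, nx, d = 8, 16, 1.0                        # d in units of inter-detector distance
yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
det = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(nx * ny)])
src = det + np.array([0.0, 0.0, d])           # sources directly above the detectors

# System matrix following Eq. (1): A[k, j] = cos(theta_kj) / r_kj (attenuation law assumed).
diff = src[None, :, :] - det[:, None, :]
r = np.linalg.norm(diff, axis=2)
A = (diff[:, :, 2] / r) / r

# A binary test pattern stands in for the "SV" source intensities (illustrative only).
rng = np.random.default_rng(2)
intensities = (rng.random(nx * ny) > 0.5).astype(float)

# Simulated detector readings with additive Gaussian noise, then linear LS reconstruction.
s = A @ intensities + rng.normal(0.0, 0.01, nx * ny)
est, *_ = np.linalg.lstsq(A, s, rcond=None)
print(np.sqrt(np.mean((est - intensities) ** 2)))   # reconstruction error std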

A natural option in the design of OLS sensors is to place the detectors on curved surfaces. One can expect that bending the surface of a planar sensor into a curved convex surface will result in widening of the sensor’s field of view, while bending the sensor to form a concave surface will narrow its field of view. Simulation experiments confirm this expectation. This is illustrated in Fig. 4 by maps of the standard deviations of intensity estimates for a radiation source as a function of the source position with respect to a curved line array of eleven detectors. In [16], we show that an array of detectors placed on the outer surface of a sphere and supplemented with a corresponding signal processor has a 4π steradian field of view.

Yet another fact revealed by the experiments is that sharpening the detector angular sensitivity function AngSen(θk) with respect to the Lambertian cosine law, for instance making it proportional to (cos θk)^P with P > 1, improves the sensor’s accuracy in estimating the source parameters.
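To indicate how the forward model generalizes to curved arrays and to a sharpened angular sensitivity, the sketch below (ours, purely illustrative; the arc geometry, source position and exponent value are assumptions) computes each detector’s response using its own surface normal and an AngSen proportional to (cos θ)^P:

import numpy as np

def arc_array(n, radius):
    """Positions and outward normals of n detectors on a convex circular arc
    of the given radius (in inter-detector distances), with unit arc spacing."""
    angles = (np.arange(n) - (n - 1) / 2) / radius          # arc length 1 between detectors
    centers = radius * np.column_stack([np.sin(angles), np.cos(angles) - 1.0])
    normals = np.column_stack([np.sin(angles), np.cos(angles)])
    return centers, normals

def outputs(I, source, centers, normals, P=1, sigma=0.0, rng=None):
    """Detector outputs I * cos(theta)^P / r, as in Eq. (1), with theta measured
    from each detector's own normal; P = 1 is the Lambertian case."""
    rng = np.random.default_rng() if rng is None else rng
    d = source - centers
    r = np.linalg.norm(d, axis=1)
    cos_theta = np.clip(np.sum(d * normals, axis=1) / r, 0.0, None)
    s = I * cos_theta ** P / r
    return s + (rng.normal(0.0, sigma, s.shape) if sigma > 0 else 0.0)

# Eleven detectors on a convex arc of radius 10 idd; compare Lambertian and P = 4.
centers, normals = arc_array(11, radius=10.0)
src = np.array([3.0, 5.0])
print(outputs(1.0, src, centers, normals, P=1))
print(outputs(1.0, src, centers, normals, P=4))   # sharper response: off-axis detectors see much less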


Figure 4. Maps of error-estimate standard deviations for the intensity of a radiation source as a function of the source position with respect to the surface of a curved line array of 11 detectors (dark boxes): a) convex bending, radius 100 inter-detector distances (idd); b) convex bending, radius 10 idd; c) concave bending, radius 10 idd; d) concave bending, radius 5 idd.


5. Discussion and conclusions

Simulation results for OLS sensors presented here show that reasonably good directional vision without optics is possible even when using the simplest possible detectors, whose angular sensitivity is defined only by the surface absorptivity, such as the natural Lambertian one. These results also correlate well with published experimental observations in studies of cutaneous vision. The main advantages of OLS sensors are:

• No optics are needed, making this type of sensor applicable to virtually any type of radiation and to any wavelength.

• The design is flexible: the shape of the sensor’s surface can easily be changed to fit the required field of view without any changes in the signal-processing software, except for updating its input numerical parameters, such as the angles of the detectors’ normals in the sensor coordinate system.

• The sensor’s resolving power is ultimately determined by the number of detectors, their geometry and the signal-to-noise ratio; diffraction-related limits are not relevant to OLS sensors.

The cost of these advantages is high computational complexity, especially when good imaging properties for multiple sources are required: the computational complexity of OLS sensors grows exponentially with the number of sources in the naïve solution, although there are ways to reduce it to polynomial. This might be a reason why natural selection resulted in lens-based camera vision rather than “skin” vision as the main vision instrument in vertebrates. What a lens does in parallel and at the speed of light, optics-less vision must replace by computations in neural machinery, which is much slower and requires many biological resources.

This motivates advancing the hypothesis that optics-less cutaneous vision may have appeared at very early stages of evolution, and that the evolution of vision started from the formation, around primordial light-sensitive cells, of neural circuitry implementing imaging algorithms similar to those in our model of the planar OLS sensor, including, as one of the first steps, lateral inhibition. Then, as our planar OLS sensor model naturally suggests, flat primordial eyespots may have evolved in two independent ways:

(i) bending of the primordial almost-flat sensor surface to a convex spherical form, coupled with sharpening of the angular sensitivity of the detectors by means of lenslets and waveguides, might have allowed the eyespot to evolve into a compound facet eye;

(ii) bending of the primordial sensor’s surface to a concave spherical form covered by a protective transparent medium might have allowed the eyespot to evolve into camera-like vision.

Possible mechanisms of such evolution of eye optics are discussed elsewhere [21–23]. In addition, our model of OLS sensors suggests that, in both cases, the evolution of eye optics had to be paralleled by the evolution of eye neural circuitry as part of animal brains. To understand this process, it is important to note that, as follows from the theory of optimal target location in images, detection and localization of targets does not necessarily require the formation of sharp images and can be carried out directly on images which are not sharply focused [19]. Image sharpness affects the reliability of detection and becomes important only for low signal-to-noise ratios at the detectors. Therefore, gradual improvements of eye optics translated into improved target detection reliability and allowed the transfer of increasingly higher fractions of eye neural circuitry and brain resources from image formation to image understanding.

To conclude, we believe that the operational principle and properties of OLS sensors may cast light on the mechanisms, capabilities and evolution of extra-ocular vision in nature and may suggest new ways of planning further experiments with animals and humans. Specifically, we believe that experiments with blind people would be useful to determine:

- the sensitivity of people’s skin to light of different wavelengths, from the visible range to the infrared;

- the spatial density of light-sensitive skin cells, if there are any, on the surface of different parts of the body;

- the range of light intensities that can be detected by these cells in humans.

If the skin’s sensitivity to light is confirmed and quantified, it could, perhaps, be possible to train blind people to detect and recognize simple patterns of light sources placed in appropriately close proximity to light-sensitive parts of the body, bearing in mind that the pattern complexity must be commensurate with the density of light-sensitive skin cells.

References

[1] R. Parker, In the Blink of an Eye (Free Press, London; Perseus, Cambridge, MA, 2003)
[2] M. F. Land, D. E. Nilsson, Animal Eyes (Oxford University Press, Oxford, 2001)


[3] I. R. Schwab, Brit. J. Ophthalmol. 89, 1078 (2005)
[4] K. Zimmerman, H. Heatwole, Copeia, 860 (1990)
[5] G. J. Molenaar, In: C. Gans, P. S. Ulinski (Eds.), Biology of the Reptilia, Vol. 17 (University of Chicago, Chicago, 1992) 367
[6] A. B. Sichert, P. Friedel, J. L. van Hemmen, Phys. Rev. Lett. 97, 068105 (2006)
[7] N. A. M. Schellart, R. J. Wubbels, The Physiology of Fishes, 2nd ed. (CRC Press, 1998)
[8] J. Pettigrew, J. Exp. Biol. 202, 1447 (1999)
[9] M. Gardner, Science 711, 654 (1966)
[10] M. M. Bongard, M. S. Smirnov, Federation Proceedings, Translation Supplement 24, 1015 (1965)
[11] H. P. Zavala, D. B. Van Cott, D. B. Orr, V. H. Small, Percept. Motor Skill. 25, 525 (1967)
[12] P. Bach-y-Rita, M. E. Tyler, K. A. Kaczmarek, Int. J. Hum.-Comput. Int. 15, 285 (2003)
[13] E. Sampaio, S. Maris, P. Bach-y-Rita, Brain Res. 908, 204 (2001)
[14] H. J. Caulfield, L. P. Yaroslavsky, J. Ludman, arXiv:physics/0703099v1
[15] H. J. Caulfield, L. P. Yaroslavsky, Ch. Goerzen, S. Umansky, arXiv:0808.1259v1
[16] L. Yaroslavsky, In: A. T. Friberg, R. Daendliker (Eds.), Advances in Information Optics and Photonics (SPIE Press, Bellingham, Washington, USA, 2008) 209
[17] H. L. Van Trees, Detection, Estimation and Modulation Theory, Part I (Wiley, New York, 1968)
[18] A. H. G. Rinnooy Kan, G. T. Timmer, Math. Program. 39, 27 (1987)
[19] L. Yaroslavsky, Digital Holography and Digital Image Processing (Kluwer Academic Publishers, Boston, 2004) Chap. 10, 11
[20] G. von Békésy, Sensory Inhibition (Princeton University Press, Princeton, 1967)
[21] D.-E. Nilsson, D. Arendt, Curr. Biol. 18, R1096 (2008)
[22] D.-E. Nilsson, A. Kelber, Arthropod Struct. Dev. 36, 373 (2007)
[23] W. J. Gehring, J. Hered. 96, 171 (2005)