IEEE TRANSACTIONS ON INFORMATION TECHNOLOGY IN BIOMEDICINE, VOL. 15, NO. 5, SEPTEMBER 2011

A Novel Emotion Elicitation Index Using Frontal Brain Asymmetry for Enhanced EEG-Based Emotion Recognition

Panagiotis C. Petrantonakis, Student Member, IEEE, and Leontios J. Hadjileontiadis, Senior Member, IEEE

Abstract—This paper aims at providing a novel method for evaluating the emotion elicitation procedures in an electroencephalogram (EEG)-based emotion recognition setup. By employing the frontal brain asymmetry theory, an index, namely the asymmetry index (AsI), is introduced, in order to evaluate this asymmetry. This is accomplished by a multidimensional directed information analysis between different EEG sites from the two opposite brain hemispheres. The proposed approach was applied to three-channel (Fp1, Fp2, and F3/F4 10/20 sites) EEG recordings drawn from 16 healthy right-handed subjects. For the evaluation of the efficiency of the AsI, an extensive classification process was conducted using two feature-vector extraction techniques and an SVM classifier for six different classification scenarios in the valence/arousal space. This resulted in classification rates of up to 62.58% for the user-independent case and 94.40% for the user-dependent one, confirming the efficacy of the AsI as an index for emotion elicitation evaluation.

Index Terms—Emotion elicitation, emotion recognition, electroencephalogram, frontal brain asymmetry, multidimensional directed information.

I. INTRODUCTION

HUMAN–machine interaction (HMI) has gained intense attention in the last decade, as machines have dramatically influenced our lives in many aspects, such as communication, profession, and entertainment. It has lately been argued [1] that if machines could understand a person's affective state, HMI might become more intuitive, smoother, and more efficient, defining a new approach in the HMI area known as affective computing (AC). AC is the research field that deals with the design of systems and devices that can recognize, interpret, and process human emotions, and it would serve as the means of imbuing machines with the ability to act emotionally.

Emotion recognition is the first step toward the abovementioned ultimate endeavor of AC. In order to recognize emotions, many approaches have been proposed, basically using facial expressions [2], [3], speech [4], [5], signals from the autonomic nervous system (ANS) (e.g., heart rate, galvanic skin response) [6], [7], or even combinations of them [8].

Manuscript received November 6, 2010; revised April 13, 2011; accepted May 20, 2011. Date of publication May 27, 2011; date of current version September 2, 2011.

The authors are with the Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, Thessaloniki, GR-54124, Greece (e-mail: [email protected]; [email protected]).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.

Digital Object Identifier 10.1109/TITB.2011.2157933

Lately, electroencephalogram-based emotion recognition (EEG-ER) [9]–[11] has been proposed, which offers great benefits with regard to its implementation, such as efficient time resolution, less intrusiveness, and signals captured from the origin of the emotion genesis, i.e., the central nervous system (CNS).

During an EEG-ER realization, three major steps have to be implemented. First, the emotion elicitation step deals with the problem of efficiently evoking emotions in the subjects who participate in the corresponding experiment. The second and third steps concern preprocessing and classification of the captured data, respectively. Their effectiveness, however, is highly dependent on the emotion elicitation, which is of major importance in an EEG-ER system. If the subjects have not effectively become emotionally aroused during the emotion elicitation step, the respective signals would not "carry" the corresponding emotional information, resulting in an incompetent emotion classification process.

Emotion elicitation procedures are mostly based on projections of videos or pictures that are assumed to evoke certain emotions. The majority of the studies that have dealt with the emotion recognition problem (e.g., [11]) conducted emotion elicitation processes that consisted of such projections in multiple trials for each emotional state and used all of the signals corresponding to each one of the trials for the classification step. Considering that the situations depicted in the aforementioned videos/pictures are emotionally interpreted by individuals in a way that is highly influenced by factors such as personality and personal experiences, it is significantly questionable whether all of the trials have the same emotional impact on all of the subjects; different subjects may be emotionally affected by different videos/pictures.

In this work, a novel measure to evaluate the emotional impact of each emotion elicitation trial via the EEG signals is introduced. Consequently, by picking out the emotion elicitation trials that correspond to significant emotional responses, according to the introduced evaluation measure, the succeeding classification can achieve a considerable improvement in rate. In order to define an emotion elicitation evaluation measure, the frontal brain asymmetry concept [12] was used, exploiting the neuronal bases of emotion expression in the brain. To extract this evaluation measure in the form of an index, namely the asymmetry index (AsI), a multidimensional directed information (MDI) analysis [13] was adopted, resulting in a robust mathematical representation of the asymmetry concept, in a manner that efficiently handles the information flow (in bits) among multiple brain sites. In order to proceed with the assessment of the effectiveness of the AsI, an extended classification process took place.


A support vector machine (SVM) classifier and two feature-vector extraction techniques were employed for the classification process. EEG signals were acquired from 16 healthy subjects using three EEG channels, which are sufficient both for the representation of emotion expression in the brain (see Section II) and for the implementation of the MDI analysis [13]. Experimental results demonstrated the potential of the AsI as a robust evaluation criterion for the emotion elicitation step of an EEG-ER system.

The rest of the paper is structured as follows. Section II gives some background material with regard to the emotion elicitation process and the frontal brain asymmetry concept, thoroughly describes the MDI method, and introduces the proposed approach. Section III discusses implementation issues, such as the construction of the dataset and the classification setup, whereas Section IV presents the results. Section V provides some discussion on the overall evaluation of the proposed methodology. Finally, Section VI concludes the paper.

II. METHOD

A. Background

1) Valence/Arousal-Based Emotion Elicitation: Psychologists do not present emotions as discrete states, but rather as continuous ones, and therefore demonstrate them in an n-dimensional (n-D) space; usually the 2-D valence/arousal space (VAS) is adopted. Valence stands for one's judgment of a situation as positive or negative, and arousal spans from calmness to excitement, expressing the degree of one's excitation. The most frequently used technique to evoke emotion during an EEG-ER procedure is to use pictures depicting situations that are supposed to elicit affective states lying in the VAS. The International Affective Picture System (IAPS) [14] is a widely used dataset containing such pictures, which come with their individual values of valence and arousal. Consequently, the projection of the IAPS pictures to a subject, with the simultaneous recording of EEG signals, formulates an emotion elicitation process originating from the VAS theory.

2) Frontal Brain Asymmetry: The asymmetry between the left and right brain hemispheres forms the most prominent expression of emotion in brain signals. Davidson et al. [12] developed a model that related this asymmetric behavior to emotions, with the latter analyzed in the VAS. According to that model, emotions are: 1) organized around approach–withdrawal tendencies and 2) differentially lateralized in the frontal region of the brain. The left frontal area is involved in the experience of positive emotions (high values of valence), such as joy or happiness (the experience of positive affect facilitates and maintains approach behaviors), whereas the right frontal region is involved in the experience of negative emotions (lower valence values), such as fear or disgust (the experience of negative affect facilitates and maintains withdrawal behaviors). In order to quantify the abovementioned asymmetry, Davidson used the asymmetry index $DI = (L - R)/(L + R)$, where L and R are the powers of specific bands of the left and right hemispheres, respectively. Later, the asymmetry concept was also confirmed by other studies. For example, Davidson et al. [15] tried to differentiate the emotions of happiness and disgust with EEG signals captured from the left and right frontal, central, anterior temporal, and parietal regions (F3, F4, C3, C4, T3, T4, P3, and P4 positions according to the 10–20 system [16]). The results revealed a more right-sided activation, as far as the power of the alpha (8–12 Hz) band of the EEG signal is concerned, for the disgust condition, for both the frontal and anterior temporal regions. Thus, the results enhanced the applicability of the aforementioned model (this model is used by the method described in the succeeding sections to define the AsI as a metric of emotion arousal; see Section II-C) and confirmed the evidenced extensive anatomical reciprocity of both regions with limbic circuits that have been directly implicated in the control of emotion [17]. Furthermore, Davidson [18] cites several studies that have examined frequency bands other than alpha, including theta, beta, and gamma. In these studies, alpha power was also examined, and in a number of cases asymmetrical effects were found in bands other than alpha, while effects in the alpha band were absent. Moreover, Bos [11] examined the efficacy of the alpha and beta bands and their combination in discriminating emotions within the VAS and concluded that both bands include important information for the aforementioned discrimination. This perspective is adopted in this study (see Section III-A).
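As a concrete illustration of Davidson's index, the sketch below estimates the alpha-band power of one left and one right frontal channel and forms $DI = (L - R)/(L + R)$; the Welch-based band-power helper, the function names, and the default parameters are illustrative assumptions, not details from the original studies.

```python
import numpy as np
from scipy.signal import welch

def band_power(sig, fs=256, band=(8, 12)):
    """Mean Welch PSD within a band, used here as the L or R power term."""
    freqs, psd = welch(sig, fs=fs, nperseg=fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return psd[sel].mean()

def davidson_di(left, right, fs=256):
    """Davidson's asymmetry index DI = (L - R) / (L + R) for one epoch."""
    L, R = band_power(left, fs), band_power(right, fs)
    return (L - R) / (L + R)
```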

B. Multidimensional Directed Information

Correlations among multiple time series are in general expected when they are simultaneously observed from an object. If a relation of temporal ordering is noted as the correlation relation among these time series, some are interpreted as causes and others as results, suggesting a cause–effect relation among the time series (causality analysis). When causality in such a sense is noted in multiple time series, the relation is defined as directed information [19]. There are several methods suited to causality analysis, such as directed-coherence analysis [20], directed-information analysis [19], multidimensional directed information (MDI) analysis [13], Kaminski's method (DTF) [21], partial directed coherence [22], and Granger causality [23]. In this work, the MDI analysis was employed as a means to identify the causality between any two series while considering all acquired series. One of the main advantages of the MDI is that the amount of information propagation is presented as an absolute value in bits and not as a correlation, which is a relative value (e.g., in directed-coherence analysis [20]); a description of the MDI follows.

Consider the simple case of two stationary time series X and Y of length N divided into n epochs of length L = N/n; each epoch of length L = P + 1 + M is written as a sequence of two sections of length P and M before and after the sampled values $x_k$ and $y_k$ of the time series X and Y at time k, respectively, i.e.,

$$X = x_{k-P} \cdots x_{k-1}\, x_k\, x_{k+1} \cdots x_{k+M} = X^P x_k X^M \tag{1}$$

$$Y = y_{k-P} \cdots y_{k-1}\, y_k\, y_{k+1} \cdots y_{k+M} = Y^P y_k Y^M \tag{2}$$

where $X^P = x_{k-P} \cdots x_{k-1}$, $X^M = x_{k+1} \cdots x_{k+M}$, $Y^P = y_{k-P} \cdots y_{k-1}$, and $Y^M = y_{k+1} \cdots y_{k+M}$.


The mutual information between the time series X and Y is written as

$$I(X;Y) = \sum_{k} I_k(X;Y) \tag{3}$$

where

$$I_k(X;Y) = I(x_k; Y^M \mid X^P Y^P y_k) + I(y_k; X^M \mid X^P Y^P x_k) + I(x_k; y_k \mid X^P Y^P). \tag{4}$$

The three terms on the right-hand side of (4) correspond to the information shared by the sample $x_k$ of X at time k and the future part $Y^M$ of Y after time k (first term); the information shared by the sample $y_k$ of Y at time k and the future part $X^M$ of X after time k (second term); and the information that is not contained in the past parts $X^P$ and $Y^P$ of X and Y but is shared by $x_k$ and $y_k$ (third term). Since $I_k(X;Y)$ represents mutual information, which is symmetric, we have $I_k(X;Y) = I_k(Y;X)$, meaning that it contains no directivity, while the three terms on the right-hand side of (4) contain a temporal relation, which produces directivity. This directivity is defined [19] as directed information and is depicted using an arrow for clarification. For example, the first term on the right-hand side of (4) can be written as

$$I(x_k; Y^M \mid X^P Y^P y_k) = I(x_k \to Y^M \mid X^P Y^P y_k) \tag{5}$$

and analyzed as

$$I(x_k \to Y^M \mid X^P Y^P y_k) = \sum_{m=1}^{M} I(x_k \to y_{k+m} \mid X^P Y^P y_k) \tag{6}$$

where each term on the right-hand side of (6) can be interpreted as information that is first generated in X at time k and propagated with a time delay of m to Y, and can be calculated through the conditional mutual information as a sum of joint entropy functions:

$$I(x_k \to y_{k+m} \mid X^P Y^P y_k) = H(X^P Y^P x_k y_k) + H(X^P Y^P y_k y_{k+m}) - H(X^P Y^P y_k) - H(X^P Y^P x_k y_k y_{k+m}). \tag{7}$$

According to [13], the joint entropy $H(z_1 \ldots z_n)$ of n Gaussian stochastic variables $z_1, \ldots, z_n$ can be calculated using the covariance matrix $R(z_1 \ldots z_n)$ as

$$H(z_1 \ldots z_n) = \frac{1}{2} \log\left[(2\pi e)^n \, |R(z_1 \ldots z_n)|\right] \tag{8}$$

where $|\cdot|$ denotes the determinant; by using (8), (7) can be written as

$$I(x_k \to y_{k+m} \mid X^P Y^P y_k) = \frac{1}{2} \log \frac{|R(X^P Y^P x_k y_k)| \cdot |R(X^P Y^P y_k y_{k+m})|}{|R(X^P Y^P y_k)| \cdot |R(X^P Y^P x_k y_k y_{k+m})|}. \tag{9}$$

So far, the calculation of the information flow between two series has been presented. When the relations among three or more series are to be examined, however, the correct result of the analysis is not generally obtained if the aforementioned method is applied to each pair of series [13]. To better comprehend this, consider that there exist three series X, Y, and Z (instead of just the two, X and Y), with information flow from Z to both X and Y but not between X and Y. When conventional directed-information analysis, based only on the relation between X and Y, is applied, an information flow would wrongly be identified, as if there exists a flow between X and Y, since they contain the common component from Z. To circumvent this ambiguity, the aforementioned method has to be expanded accordingly, in order to consider all measured time series, and the MDI, which represents the flow between two arbitrary series, must be defined. Consequently, the following expression for the MDI is obtained for the simple case of three interacting signals X, Y, and Z:

$$I(x_k \to y_{k+m} \mid X^P Y^P Z^P y_k z_k) = \frac{1}{2} \log \frac{|R(X^P Y^P Z^P x_k y_k z_k)| \cdot |R(X^P Y^P Z^P y_k z_k y_{k+m})|}{|R(X^P Y^P Z^P y_k z_k)| \cdot |R(X^P Y^P Z^P x_k y_k z_k y_{k+m})|}. \tag{10}$$

Using (10) and (6), the total amount of information, namely S, that is first generated in X and propagated to Y, taking into account the existence of Z, across the time-delay range, is

$$S_{XY}: \; I(x_k \to Y^M \mid X^P Y^P Z^P y_k z_k) = \sum_{m=1}^{M} \frac{1}{2} \log \frac{|R(X^P Y^P Z^P x_k y_k z_k)| \cdot |R(X^P Y^P Z^P y_k z_k y_{k+m})|}{|R(X^P Y^P Z^P y_k z_k)| \cdot |R(X^P Y^P Z^P x_k y_k z_k y_{k+m})|}. \tag{11}$$

In the subsequent section, (11) will be used to consolidate the AsI measure by estimating the mutual information shared between the left and right brain hemispheres, exploiting in that way the frontal brain asymmetry concept.
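Since (10) and (11) reduce to log-ratios of covariance determinants under the Gaussian model of (8), a direct implementation is short. The following Python sketch estimates the total flow $S_{XY}$ from three simultaneously recorded series; the function name mdi_flow and the default lags P = M = 3 are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def mdi_flow(x, y, z, P=3, M=3):
    """Sketch of S_XY in Eq. (11): information first generated in X and
    propagated to Y, conditioned on Z, under the Gaussian model of Eq. (8)."""
    def logdet_cov(cols):
        # |R(.)| of Eq. (8): determinant of the covariance matrix of the
        # stacked variables (one column per variable, rows are epochs)
        R = np.cov(np.column_stack(cols), rowvar=False)
        return np.log(np.linalg.det(np.atleast_2d(R)))

    N = len(x)
    ks = np.arange(P, N - M)                    # valid epoch centers k
    XP = [x[ks - p] for p in range(P, 0, -1)]   # past section X^P
    YP = [y[ks - p] for p in range(P, 0, -1)]   # past section Y^P
    ZP = [z[ks - p] for p in range(P, 0, -1)]   # past section Z^P
    xk, yk, zk = x[ks], y[ks], z[ks]

    S, base = 0.0, XP + YP + ZP
    for m in range(1, M + 1):
        ykm = y[ks + m]
        # Eq. (10) as a sum/difference of log-determinants
        num = logdet_cov(base + [xk, yk, zk]) + logdet_cov(base + [yk, zk, ykm])
        den = logdet_cov(base + [yk, zk]) + logdet_cov(base + [xk, yk, zk, ykm])
        S += 0.5 * (num - den) / np.log(2.0)    # express the flow in bits
    return S
```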

C. The Proposed Approach

According to the frontal brain asymmetry concept, the experience of negative emotions is related to an increased right frontal and prefrontal hemisphere activity, whereas positive emotions evoke an enhanced left-hemisphere activity. Let us assume that an EEG channel recorded from the left hemisphere (Channel 1), another EEG channel recorded from the right hemisphere (Channel 2), and a third channel recorded from both hemispheres as a dipole channel (Channel 3) represent the signals X, Y, and Z, respectively, previously introduced in the MDI analysis. If we now consider the asymmetry concept, a measure that evaluates this asymmetry information in signals X and Y, taking into account the information propagated by signal Z to both of them (and, as a result, isolating the information shared between the X, Y pair more effectively), would constitute an index of how effectively an emotion has been elicited. Toward this, it is assumed that the total amount of information S [see (11)], hidden in the EEG signals and shared between the left and right hemispheres (signals X and Y, respectively), would become maximum when the subject is calm (information symmetry), whereas S would become minimum when the subject is emotionally aroused (information asymmetry).

Page 4: A Novel Emotion Elicitation Index Using Frontal

740 IEEE TRANSACTIONS ON INFORMATION TECHNOLOGY IN BIOMEDICINE, VOL. 15, NO. 5, SEPTEMBER 2011

In order to investigate the aforementioned assumption, two values were calculated according to the MDI analysis, i.e., the $S_r$ and $S_p$ values. $S_r$ refers to the bidirectional information sharing between X and Y, taking into account Z, when the subject does not feel any emotion, hence, is relaxed, i.e.,

$$S_r = S^{XY}_r + S^{YX}_r \tag{12}$$

whereas $S_p$ is the same shared information during the period in which the subject is supposed to feel an emotion, i.e.,

$$S_p = S^{XY}_p + S^{YX}_p. \tag{13}$$

According to what has already been discussed, $S_p$ will presumably be smaller than $S_r$ if the asymmetry concept holds. Finally, in order to directly define a measure of emotion experience, the asymmetry index (AsI) is introduced, defined as the distance of the $(S_p, S_r)$ point, corresponding to a specific picture, from the line $S_r = S_p$, i.e.,

$$\mathrm{AsI} = (S_r - S_p) \times \frac{\sqrt{2}}{2}. \tag{14}$$

Later in this paper, the AsI will serve as an index of the efficiency of the emotion elicitation of each trial during the corresponding experiment (see the subsequent section for the dataset construction) and will be further evaluated, with regard to its foundation as a metric for emotion arousal, through an extensive classification setup.
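Given the flows of (11), the quantities in (12)–(14) follow directly. A minimal sketch, reusing the hypothetical mdi_flow helper from the previous section and assuming each phase is passed as an (Fp1, Fp2, F3/F4) triple of arrays:

```python
import numpy as np

def asymmetry_index(relax, project, P=3, M=3):
    """AsI of Eq. (14) for one trial: `relax` and `project` each hold the
    (x, y, z) segments of the countdown and picture phases, respectively."""
    def s_total(seg):
        x, y, z = seg
        # Eqs. (12)/(13): bidirectional sharing S = S_XY + S_YX
        return mdi_flow(x, y, z, P, M) + mdi_flow(y, x, z, P, M)

    s_r, s_p = s_total(relax), s_total(project)
    # Eq. (14): distance of (S_p, S_r) from the line S_r = S_p
    return (s_r - s_p) * np.sqrt(2) / 2
```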

III. IMPLEMENTATION ISSUES

A. Dataset Construction

For the construction of the EEG dataset, a specifically designed experiment was conducted through an emotion elicitation process. In this experiment, 16 healthy right-handed volunteers (nine males and seven females) in the age group of 19–32 years participated. The whole experiment was designed to induce emotion within the VAS and, specifically, for four affective states, i.e., LALV (low arousal–low valence), LAHV (low arousal–high valence), HAHV (high arousal–high valence), and HALV (high arousal–low valence).¹ Ten pictures per affective state were selected from the IAPS database according to the arousal and valence values provided by the norms of the IAPS database. In particular, the low (L) grade previously mentioned was considered to be lower than 4 (<4) on a scale of 1–9, and the high (H) grade greater than 6 (>6) on the same scale. Moreover, the standard deviation of these values was required to be lower than 2.2. According to these criteria, 11, 47, 61, and 17 pictures were extracted for the LALV, LAHV, HAHV, and HALV affective states, respectively. From these pictures, ten per affective state were randomly selected and constituted the 40-picture database used in the experiment. Fig. 1(a) shows the location of the finally selected pictures in the VAS.

The experimental protocol included the series of steps schematically depicted in Fig. 1(b). In particular, before the projection of each picture, an 11-s period took place, consisting sequentially of: 1) a 5-s black-screen period; 2) a 5-s period in which countdown frames (5 → 1) were shown; and 3) a 1-s projection of a cross shape in the middle of the screen to attract the sight of the subject. The 5-s countdown phase was employed to accomplish a relaxation phase and an emotion reset [11], due to its naught emotional content before the projection of the new picture. During the experiment, the selected pictures were projected (in sequence: 10 for LALV, 10 for LAHV, 10 for HAHV, and 10 for HALV) for 5 s each. After the picture projection, a computerized 20-s self-assessment mannequin (SAM) [24] procedure took place. The same 36-s procedure was repeated for each of the 40 pictures. The sequence of the projection of the affective states was chosen so as to show emotionally intensive pictures (i.e., pictures with high arousal scores) at the end of the experiment, as a picture that is supposed to evoke an intense emotion would emotionally "cover up" a subsequent one characterized by a lower arousal value; in other words, an intensive emotion would dominate over a milder one. Thus, the HAHV and HALV picture sets were shown after the LALV and LAHV ones.

¹Valence is commonly associated with positive/negative rather than high/low, but the latter is used here for the sake of simplicity.

Fig. 1. (a) Location of the selected pictures (big black dots) for the conduct of the experiment, along with the rest of the pictures of the IAPS database (small black dots), in the arousal–valence space. (b) Schematic representation of the experimental protocol followed. (c) The Fp1, Fp2, F3, and F4 electrode positions in the 10–20 system (marked in black), used for the EEG acquisition.




The EEG signals from each subject were recorded during the whole projection phase. The EEG signals were acquired from the Fp1, Fp2, F3, and F4 positions, according to the 10–20 system [see Fig. 1(c)], related to the emotion expression in the brain, based on the asymmetry concept. The Fp1 and Fp2 positions were recorded as monopole channels (Channels 1 and 2, respectively, according to the description given in the previous section), whereas the F3 and F4 positions were recorded as a dipole (Channel 3), resulting in a three-channel EEG set. Thus, the shared information between the Fp1 and Fp2 channels (see Section II-B) is measured, as effectively as possible, by taking into account the respective information propagated by the dipole F3/F4 to both of them. The ground and reference electrodes were placed at the right (A2) and left (A1) earlobes, respectively [Fig. 1(c)].

After the acquisition part, the EEG signals were subjected to band-pass Butterworth filtering, to retain only the frequencies within the alpha (8–12 Hz) and beta (13–30 Hz) bands. Thus, the mutual relation regarding prefrontal cortical activation or inactivation (see Section II-A) is exploited, and the elimination of superimposed artifacts from various sources is accomplished. As has been reported [25]–[27], such artifacts are effectively eliminated by appropriate bandpass filtering. Specifically, the influence of eye movement/blinking is most dominant below 4 Hz, heart functioning causes artifacts around 1.2 Hz, whereas muscle artifacts affect the EEG spectrum above 30 Hz. Nonphysiological artifacts caused by power lines are also clearly above 30 Hz (50–60 Hz). Consequently, by extracting only the alpha and beta frequency bands from the acquired EEG recordings, most of the noise influence is circumvented. Afterwards, the EEG signals were segmented into 5-s segments corresponding to the duration of each picture projection. Finally, the EEG signals referring to the countdown phase were also cut and filtered, eliminating any undesired artifacts, as they were intended to be used as the ground truth for the evaluation of the emotion elicitation. It must be stressed that the $S_p$ and $S_r$ values were calculated for the signals that correspond to the projection of the IAPS picture and to the countdown phase that took place immediately before the projection of that picture, respectively.
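A minimal sketch of the band extraction described above, using a zero-phase Butterworth filter; the filter order is not stated in the paper and is an assumed value here, and the input is synthetic stand-in data.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # sampling frequency of the recordings (Hz)

def band_filter(eeg, lo, hi, fs=FS, order=4):
    """Zero-phase Butterworth band-pass; the 4th order is an assumption."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg)

# Keep only the alpha (8-12 Hz) and beta (13-30 Hz) content of a 5-s segment
segment = np.random.randn(5 * FS)        # synthetic stand-in for one channel
alpha_beta = band_filter(segment, 8, 12) + band_filter(segment, 13, 30)
```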

EEG recordings were conducted using the g.MOBIlab (g.tec medical and electrical engineering, Guger Technologies, www.gtec.at) portable biosignal acquisition system [four EEG bipolar channels; passive electrodes (i.e., with no inbuilt circuitry); filters: 0.5–30 Hz; voltage change sensitivity: 100 μV; data acquisition: A/D converter with 16-bit resolution and sampling frequency of 256 Hz; data transfer: wireless, Bluetooth "Class I" technology].

B. Classification Setup

To better evaluate the efficiency of the AsI in serving as a metric of the reliability of an emotion elicitation procedure, a thorough classification process was designed. In particular, two feature-vector extraction techniques, one classifier, and six different classification scenarios were employed.

The feature-vector set was constructed based on two methods, i.e., higher order crossings (HOC) [28] and cross-correlation (CC) [29]. HOC as a technique for feature-vector extraction was proposed by the authors [28], yielding good classification results. The CC method was chosen here for further evaluation of both the HOC scheme and the AsI index, as mental and emotional brain activities strongly depend on synaptic connections; hence, the state of mind is partly manifested in the spatial correlation of EEGs. To this end, cross-correlation coefficients of EEGs associated with emotional states are used to form the feature vector of the CC approach [29]. A brief description of the implementation issues of the two feature-vector extraction methods is presented in the Appendix.

As there is no single best, one-size-fits-all classification algorithm, the choice of the most efficient classifier is strongly dependent on the examined problem and the relevant dataset to be classified [30]. After having tested several classifiers, such as quadratic discriminant analysis (QDA) [31], Mahalanobis distance (MD) [32], k-nearest neighbors (k-NN) [33], and support vector machines (SVM) [34], we chose the SVM, in particular with a fifth-order polynomial kernel, which outperformed the others with higher recognition accuracy in our case.
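For illustration, a fifth-order polynomial-kernel SVM of the kind selected here can be set up as follows; the feature standardization step and the synthetic stand-in data are assumptions made for the sketch, not details from the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Fifth-order polynomial-kernel SVM; remaining hyperparameters are defaults
clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=5))

# Synthetic stand-in data, just to show the call pattern
X = np.random.randn(60, 12)           # 60 trials, 12 features (e.g., FV_HOC)
y = np.random.randint(0, 3, size=60)  # three-class scenario labels
clf.fit(X[:42], y[:42])               # 70% training split
print(clf.score(X[42:], y[42:]))      # 30% testing split accuracy
```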

According to the AsI concept, emotion elicitation trials, i.e., corresponding $(S_p, S_r)$ pairs, with big AsI values are supposed to elicit the respective emotional state more effectively, whereas trials with lower AsI values should exhibit the opposite behavior. To investigate this assumption, the AsI values were normalized to the range [0, 1], and the emotion elicitation trials that correspond to an AsI larger than 0.5 (AsI > 0.5) formed the big AsI group, whereas those with an AsI lower than 0.1 (AsI < 0.1) formed the corresponding small AsI group. The threshold for the construction of the small AsI group was chosen so as to result in a group with a number of signals per affective state comparable to that of the big AsI group. In particular, the big AsI group consisted of 31, 29, 21, and 15 signals (gathered from approximately all subjects) for the LALV, LAHV, HAHV, and HALV affective states, respectively, whereas the small AsI group incorporated 34, 35, 35, and 37 signals for the same affective states. Afterwards, a classification procedure was implemented for both the big and small AsI groups under six three-class classification scenarios, i.e.: 1) S1: class 1: LA, class 2: HA, class 3: the respective relax signals; 2) S2: class 1: LV, class 2: HV, class 3: the respective relax signals; 3) S3: class 1: LALV, class 2: HALV, class 3: the respective relax signals; 4) S4: class 1: LAHV, class 2: HAHV, class 3: the respective relax signals; 5) S5: class 1: LALV, class 2: LAHV, class 3: the respective relax signals; and 6) S6: class 1: HALV, class 2: HAHV, class 3: the respective relax signals.
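The grouping step itself is simple; a sketch, assuming the normalization is division by the maximum AsI (the paper states only that the values are normalized to [0, 1]):

```python
import numpy as np

def asi_groups(asi_values, hi=0.5, lo=0.1):
    """Split trials into the big (AsI > 0.5) and small (AsI < 0.1) groups."""
    a = np.asarray(asi_values, dtype=float)
    a = a / a.max()                    # assumed max-normalization to [0, 1]
    return np.flatnonzero(a > hi), np.flatnonzero(a < lo)
```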


Fig. 2. Overall system diagram presenting the collection and processing of the EEG data.

These classification scenarios were chosen in order to: 1) emphasize the discrimination between the adverse expressions of the valence and arousal coordinates in the VAS and the relax state; and 2) create different classification setups for testing the consistency with which the AsI performs efficiently in every one of them. For each of the classification scenarios, 70% of the signals were used as a training set, whereas the remaining 30% were used as a testing set. A cross-validation technique was adopted for a better evaluation of the classification performance, resulting in a 20-iteration classification process. The mean classification rate C̄ over the 20 iterations was finally extracted. The same classification procedure was also conducted for the whole dataset, without the grouping according to the AsI value. Furthermore, up to this point, the categorization of a signal to an affective state, corresponding to a specific picture, was done according to the norms of the IAPS database. A final classification setup was also conducted for the whole dataset, but with the categorization following the self-assessment of each picture, as described in Section III-A.
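A sketch of this evaluation loop, interpreting the 20-iteration cross validation as repeated random 70/30 splits (an assumption; the exact resampling scheme is not detailed in the text):

```python
import numpy as np
from sklearn.model_selection import train_test_split

def mean_rate(clf, X, y, iters=20, test_size=0.3, seed=0):
    """Mean classification rate over repeated random 70/30 splits."""
    rates = []
    for i in range(iters):
        Xtr, Xte, ytr, yte = train_test_split(
            X, y, test_size=test_size, random_state=seed + i, stratify=y)
        rates.append(clf.fit(Xtr, ytr).score(Xte, yte))
    return 100.0 * np.mean(rates)      # C-bar, expressed in percent
```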

All of the above processes were conducted for both the HOC and CC feature extraction methods (see the Appendix). A system diagram defining each of the implementation steps is depicted in Fig. 2.

IV. RESULTS

The efficiency of the AsI as a metric for the reliability of emotion elicitation procedures is evaluated through an extensive classification performance analysis. Before that, in order to investigate whether the assumption made in Section II-C, i.e., that $S_p$ should generally be smaller than $S_r$, actually holds, the $S_r$ values were plotted versus the $S_p$ ones. Fig. 3 shows the aforementioned plot for each of the emotion elicitation trials, for all subjects and all affective states. Indeed, the assumption seems to hold for a great volume of the $(S_p, S_r)$ pairs.

Fig. 4 shows the results of the classification procedure. The black solid and black dashed lines represent the corresponding C̄ values of the big and small AsI groups, respectively, whereas the black dotted and black dash–dotted lines correspond to the whole dataset categorized by norms and by subjective criteria, respectively. The circle (o) and triangle (Δ) marks correspond to the HOC and CC methods, respectively. It is obvious that the signals within the big AsI group exhibit profoundly higher C̄ values in contrast with the C̄ of the small AsI group and of the whole dataset. In particular, the C̄ values for the big AsI group are {61.19, 55.24, 58.16, 54.69, 62.58, 58.55}% for the HOC feature extraction method and {43.71, 48.66, 49.95, 47.72, 50.91, 51.12}% for the CC method, corresponding to the S1, S2, S3, S4, S5, and S6 classification scenarios, respectively, confirming also the superiority of the HOC method over the CC one. It should be noted that we also unified the frequency bands in (23) (see the Appendix), assuming nonorthogonality of the frequency components, and calculated the classification performance of the CC approach. This, however, resulted in a reduced mean classification accuracy [−(6.46% ± 5.7%)] compared with the one derived from the adoption of separate frequency bands.

Fig. 3. $S_r$ versus $S_p$ values for the four affective states of the IAPS-based emotion elicitation.

Moreover, in order to compare the classification results of the big AsI group with those of other groups constituted of the same number of signals per affective state, 50 groups were randomly extracted and subjected to the abovementioned classification process. Fig. 5 shows the results of this procedure. The black dotted line depicts the median of the C̄ values extracted for the 50 randomly constructed groups for each classification scenario. Once again, the superiority of the classification performance in the big AsI group is evident, leading to the conclusion that the sets of signals that constitute the big AsI group represent the most effective emotion elicitation trials.


Fig. 4. Classification rates, C̄, for the HOC and CC methods, for the big and small AsI groups, and for the norm- and subjective-based categorization of the EEG signals.

Fig. 5. Classification rates, C̄, for the HOC and CC methods for the big and small AsI groups, and the median C̄ derived from the 50 randomly created groups, each with a number of signals equal to that of the big AsI group.


From all the aforementioned analyses, it is clear that the results discussed so far refer to the user-independent case. In an attempt to evaluate the efficiency of the AsI in a user-dependent concept, the classification setup previously discussed was also implemented for each of the 16 subjects. Fig. 6 depicts the corresponding results. The black dotted line represents the mean normalized AsI of each subject, extracted from all affective states, plotted in descending order. The black solid and black dashed lines present the mean classification rate, ¯̄C (normalized to the [0, 1] range), i.e., ¯̄C is the mean of C̄ across the six classification scenarios S1–S6, for the HOC and CC methods, respectively. The important conclusion from this figure is that the HOC and CC lines follow the same trend as the AsI line. This shows that a subject with small mean AsI tends to exhibit poorer classification performance in contrast with another with higher mean AsI values. It must be noted that the HOC/CC lines and the mean AsI line correspond to different quantities, i.e., ¯̄C and AsI, respectively. Furthermore, from Fig. 6, it is obvious that Subjects 4 and 13 exhibit the best and the worst classification performance, respectively, as far as the HOC method is concerned. The C̄ values for each of the six classification scenarios for these subjects are tabulated in Table I. In that way, not only can single trials be evaluated for their ability to evoke emotions, but individual subjects can also be characterized according to how easily they can be emotionally aroused by emotion elicitation procedures.

Fig. 6. Mean classification rates, ¯̄C, for the HOC and CC methods, for all subjects. The black dotted line also depicts the mean AsI in descending order.

TABLE I. MEAN CLASSIFICATION RATES (%) FOR SUBJECTS 4 AND 13 FOR ALL CLASSIFICATION SCENARIOS


It should be stressed that the HOC method surpassed the CC one in classification performance in all classification setups, reaching 62.58% for classification scenario S5 (see Fig. 4) in the user-independent setup, while in the user-dependent concept it also exhibited the maximum mean classification rate across all classification scenarios, 86.92% (Subject 4). In particular, in the S3 classification scenario (three classes), the HOC method reached 94.4%. On the other hand, the CC method resulted in 51.12% (S6), 51.62% (Subject 1), and 59.6% (Subject 1, S3) in the respective classification cases. All of the above confirm the superiority of the HOC approach over the CC one.


Fig. 7. Classification rates, C̄, for the HOC method, for the big groups of the AsI and DI indexes.


Finally, in order to compare the efficiency of the AsI with that of the DI, the corresponding big group according to the DI was constructed, taking into consideration its absolute values. The threshold for categorization into the big group was DI ≥ 0.5, in accordance with the definition of the big AsI-based group (see Section III-B). Subsequently, the selected trials were used for classification with the more efficient HOC-based feature-vector extraction method. The corresponding results are depicted in Fig. 7. From the latter, it is obvious that the AsI-based big group provides higher C̄ rates for all classification scenarios, i.e., (S1–S6): {61.19, 55.24, 58.16, 54.69, 62.58, 58.55}%, compared with the corresponding rates from the DI-based one, i.e., (S1–S6): {49.90, 54.89, 55.27, 40.40, 53.20, 43.41}%.

V. DISCUSSION

The presented results have shown that the AsI can serve as an indication of how well an emotion has been evoked during an emotion elicitation process. Firstly, as shown in Fig. 3, the assumption that $S_r$ is larger than $S_p$ holds for a great volume of the emotion elicitation trials. The exclusion of the remaining trials with the opposite behavior (i.e., low AsI values) resulted in better classification rates in contrast with the whole dataset (see Fig. 4) or any random group with a number of signals equal to that of the big AsI group (see Fig. 5). It is noteworthy, regarding the construction of the big AsI group, that the $(S_p, S_r)$ pairs corresponding to the HALV state exhibited, on average, lower AsI values, which also resulted in the selection of only 15 signals from this affective state (see the previous section). This might be due to the intensiveness of the emotions that the pictures in this category elicit. Regarding the projection of pictures in this category, many subjects reported that some of the pictures produced such negative emotional charges that they kept thinking about them while the procedure was proceeding. As a result, the experience of negative emotions during the relaxation (countdown) phase would reduce the symmetry during its realization; hence, the $S_r$ values would tend to be almost equal to the $S_p$ ones. It must also be stressed that all $S_r$ and $S_p$ values are normalized to the range [0, 1] by finding the maximum value over both value sets. The maximum value found was an $S_r$ value from the LALV set. This fact confirms the previous remark, as it implies that during the relaxation (countdown) phases of the projection period of the LALV pictures the maximum symmetry was accomplished. Furthermore, if we obtain the maximum $S_p$ values for the affective states after normalization (see Fig. 3), we have {0.124, 0.106, 0.093, 0.087} for LALV, LAHV, HAHV, and HALV, respectively. This is also a clue that during the projection of pictures with high arousal scores, lower $S_p$ values are obtained, implying a lower amount of shared information between the left and right hemispheres, i.e., more intense asymmetry.

To authors’ knowledge up to now, no other attempt to eval-uate the emotion elicitation reliability has been reported in theliterature. In [9], the final selection of the emotional stimuluswas based on the assessment of a bunch of emotional filmsby subjects that did not participate in the corresponding fi-nal data-collection experiment. This approach raises variousissues, as different emotional stimulus are apprehended differ-ently from each individual. Furthermore, Fig. 4 shows that eventhe categorization of the signals according to the emotional self-assessment of the subjects participated in the experiments leadto low classification performance, maybe due to subconsciousevaluation of the emotional stimulus that is not later projectedin the self-assessment procedure. The later, may be a corollaryof the inability of the self-assessment methods (in this workthe SAM method) to effectively convey the means of such anassessment. Thus, a more objective approach, as the one intro-duced in this paper, may overcome these issues by evaluatingthe emotional impact on subjects using only the informationtransferred from the origin of the emotion genesis. Moreover,the comparison of the AsI with the DI revealed its ability tobetter evaluate the trials of the emotion elicitation procedurethat probably convey emotion information in contrast with thelatter one. A possible reason of this fact is that DI does nottake into consideration a ground truth in regard with how theabsence of asymmetry is reflected in the EEG signals of eachsubject. Due to the definition of the AsI (see Section II.C), thefinal measure of the efficiency of each emotion elicitation trialis based on the nonasymmetry detected immediately before theaforementioned trial. This fact implies a more robust evaluationof how effectively an emotion is elicited, taking into account thepreceding emotional state of the subject.

In this work, the MDI method was employed in order to estimate the amount of information flow between the two hemispheres of the brain. The total amount of information measured [see (11)] was estimated for the whole 5-s signal corresponding to the projection of the IAPS picture. It would be interesting to investigate whether the brain asymmetry described in Section II-A is exhibited during the whole 5-s period or only within a time period inside the signal. This investigation would lead to a further "purging" of the dataset, not only at a subject level (see Fig. 6) or trial level, but also at a time-segment level within each emotion elicitation trial. This should result in more efficient isolation of the emotional information in EEG signals and, thus, in better classification performance during an EEG-ER task.



VI. CONCLUSION

A novel approach to evaluate the effectiveness of an emotion elicitation trial was presented in this paper. The frontal EEG asymmetry concept, along with the MDI approach, led to the definition of the AsI, which was used to discard the signals with less "emotional" information and to perform more reliable EEG-ER processes. The classification-based evaluation of the efficacy of the AsI produced promising results, which pave the way for more robust emotion elicitation processes. In particular, signals with higher AsI values seem to demonstrate better classification behavior, leading to the conclusion that they are imbued with a greater volume of emotional information, in contrast to others with lower AsI values. Moreover, apart from selecting the most effective EEG signals, the AsI also provides the potential of discarding individual subjects that seem not to react emotionally to the respective stimuli. In addition, the AsI seems to be quite an efficient evaluation criterion for an emotion elicitation process, compared to the known DI measure. However, despite the potential of the presented results, further justification of the reliability of the AsI must be provided by applying the proposed approach to larger-scale experiments.

APPENDIX

Consider a finite zero-mean series $\{Z_t\}$, $t = 1, \ldots, N$, oscillating about level zero. Let $\nabla$ be the backward difference operator defined by

$$\nabla Z_t \equiv Z_t - Z_{t-1}. \tag{15}$$

If we define the sequence of filters

$$\mathcal{F}_k \equiv \nabla^{k-1}, \quad k = 1, 2, 3, \ldots \tag{16}$$

with $\mathcal{F}_1 \equiv \nabla^0$ being the identity filter, we can estimate the corresponding HOC, namely the simple HOC [35], by

$$D_k = \mathrm{NZC}\{\mathcal{F}_k(Z_t)\}, \quad k = 1, 2, 3, \ldots; \; t = 1, \ldots, N \tag{17}$$

where $\mathrm{NZC}\{\cdot\}$ denotes the estimation of the number of zero-crossings, and

$$\nabla^{k-1} Z_t = \sum_{j=1}^{k} \binom{k-1}{j-1} (-1)^{j-1} Z_{t-j+1}. \tag{18}$$

In practice, we only have finite time series and lose an observation with each difference. Hence, to avoid this effect, we must index the data by moving to the right, i.e., for the evaluation of the k-th simple HOC, the index t = 1 should be given to the k-th or a later observation. For the estimation of the number of zero-crossings in (17), a binary time series $X_t(k)$ is initially constructed, given by

$$X_t(k) = \begin{cases} 1, & \text{if } \mathcal{F}_k(Z_t) \ge 0 \\ 0, & \text{if } \mathcal{F}_k(Z_t) < 0 \end{cases} \qquad k = 1, 2, 3, \ldots; \; t = 1, \ldots, N \tag{19}$$

and the desired simple HOC are then estimated by counting symbol changes in $X_1(k), \ldots, X_N(k)$, i.e.,

$$D_k = \sum_{t=2}^{N} \left[ X_t(k) - X_{t-1}(k) \right]^2. \tag{20}$$

In finite data records, it holds that $D_{k+1} \ge D_k - 1$ [35]. The feature vector constructed from the HOC, $FV_{\mathrm{HOC}}$, was formed as follows:

$$FV_{\mathrm{HOC}} = [D_1, D_2, \ldots, D_L], \quad 1 < L \le J \tag{21}$$

where J denotes the maximum order of the estimated HOC and L the HOC order up to which they were used to form $FV_{\mathrm{HOC}}$. Due to the three-channel EEG recording setup, the final combined feature vector, $FV^{c}_{\mathrm{HOC}}$, where c stands for "combined," was structured as

$$FV^{c}_{\mathrm{HOC}} = [D^{i}_1, D^{i}_2, \ldots, D^{i}_L, D^{j}_1, D^{j}_2, \ldots, D^{j}_L, D^{s}_1, D^{s}_2, \ldots, D^{s}_L], \quad i = 1; \; j = 2; \; s = 3 \tag{22}$$

where i, j, and s denote the channel numbers that participate in the combination.
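A compact sketch of (15)–(22): each HOC order counts the zero-crossings of one more backward difference of the series; the order count L = 10 is an illustrative choice, not a value stated at this point in the paper.

```python
import numpy as np

def hoc_features(z, L=10):
    """Simple HOC of Eqs. (15)-(20): zero-crossing counts of successive
    backward differences, collected into the vector of Eq. (21)."""
    z = np.asarray(z, dtype=float) - np.mean(z)     # zero-mean series
    feats = []
    for _ in range(L):
        x = (z >= 0).astype(int)                    # binary series, Eq. (19)
        feats.append(int(np.sum(np.diff(x) ** 2)))  # zero-crossings, Eq. (20)
        z = np.diff(z)                              # apply backward difference
    return np.array(feats)

def hoc_combined(eeg, L=10):
    """Combined three-channel vector of Eq. (22); channels as rows of `eeg`."""
    return np.concatenate([hoc_features(ch, L) for ch in eeg])
```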

The CC method introduced in [29] estimates the cross-correlation coefficient $c(\omega; ij)$ between the potentials of the EEG electrodes i and j for the frequency band $\omega$, i.e.,

$$c(\omega; ij) = \frac{\sum_{\omega} X_i(\omega_n) X^{*}_j(\omega_n)}{\sqrt{\sum_{\omega} |X_i(\omega_n)|^2} \sqrt{\sum_{\omega} |X_j(\omega_n)|^2}} \tag{23}$$

where $X_i(\omega_n)$ is the Fourier transform of the EEG at the i-th electrode site and the n-th frequency bin, and the summation is over the frequency bins in the $\omega$ frequency band. In this paper, the cross-correlation coefficient was estimated for both the alpha and beta bands, resulting in a six-feature vector (three features per band, given the three-channel EEG recording set). The final feature vector was named $FV^{c}_{\mathrm{CC}}$.
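A sketch of (23) for a three-channel recording; since (23) is complex-valued, the magnitude is taken here, which is an assumption (the paper does not state how the complex coefficient is reduced to a real feature).

```python
import numpy as np

def cc_features(eeg, fs=256, bands=((8, 12), (13, 30))):
    """Cross-correlation coefficients of Eq. (23) for every channel pair and
    band; with 3 channels and 2 bands this yields the 6-feature FV_CC."""
    X = np.fft.rfft(eeg, axis=1)                  # spectra, channels as rows
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    feats = []
    for lo, hi in bands:                          # alpha and beta bands
        sel = (freqs >= lo) & (freqs <= hi)
        Xb = X[:, sel]
        for i in range(len(Xb)):
            for j in range(i + 1, len(Xb)):
                num = np.sum(Xb[i] * np.conj(Xb[j]))
                den = np.sqrt(np.sum(np.abs(Xb[i]) ** 2)
                              * np.sum(np.abs(Xb[j]) ** 2))
                feats.append(np.abs(num) / den)   # assumed magnitude reduction
    return np.array(feats)
```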

ACKNOWLEDGMENT

The authors would like to thank all 16 subjects who participated in the experiment for their patience during the tedious EEG recording phase.

REFERENCES

[1] P. Rani, N. Sarkar, C. A. Smith, and J. A. Adams, "Affective communication for implicit human-machine interaction," in Proc. IEEE Int. Conf. Syst., Man, Cybern., 2003, pp. 4896–4903.

[2] I. Cohen, A. Garg, and T. S. Huang, "Emotion recognition from facial expressions using multilevel HMM," presented at the Neural Inf. Process. Syst. Workshop on Affective Computing, Colorado, 2000. [Online]. Available: http://www.ifp.uiuc.edu/~ashutosh/papers/NIPS_emotion.pdf

[3] F. Bourel, C. C. Chibelushi, and A. A. Low, "Robust facial expression recognition using a state-based model of spatially-localized facial dynamics," in Proc. 5th IEEE Int. Conf. Automatic Face Gesture Recog., 2002, pp. 106–111.

[4] B. Schuller, S. Reiter, R. Mueller, M. Al-Hames, and G. Rigoll, "Speaker independent speech emotion recognition by ensemble classification," in Proc. 6th Int. Conf. Multimedia Expo, 2005, pp. 864–867.

[5] F. Yu, E. Chang, Y. Q. Xu, and H. Y. Shum, "Emotion detection from speech to enrich multimedia content," in Proc. IEEE Pacific Rim Conf. Multimedia, 2001, pp. 550–557.


[6] R. W. Picard, E. Vyzas, and J. Healey, "Toward machine emotional intelligence: Analysis of affective physiological state," IEEE Trans. Pattern Anal. Mach. Intell., vol. 23, no. 10, pp. 1175–1191, Oct. 2001.

[7] F. Nasoz, C. L. Lisetti, K. Alvarez, and N. Finkelstein, "Emotion recognition from physiological signals for user modeling of affect," in Proc. 9th Int. Conf. User Modeling, Pittsburgh, PA, USA, June 22–26, 2003. [Online]. Available: http://www.eurecom.fr/util/publidownload.en.htm?id=1806

[8] C. Busso, Z. Deng, S. Yildirim, M. Bulut, C. M. Lee, A. Kazemzadeh, S. Lee, U. Neumann, and S. Narayanan, "Analysis of emotion recognition using facial expressions, speech and multimodal information," in Proc. 6th Int. Conf. Multimodal Interfaces, 2004, pp. 205–211.

[9] K. Takahashi, "Remarks on emotion recognition from bio-potential signals," in Proc. 2nd Int. Conf. Autonomous Robots and Agents, 2004, pp. 186–191.

[10] A. Heraz and C. Frasson, "Predicting the three major dimensions of the learner's emotions from brainwaves," Int. J. Comput. Sci., vol. 2, no. 3, pp. 187–193, 2008.

[11] D. O. Bos, "EEG-based emotion recognition: The influence of visual and auditory stimuli," 2006. [Online]. Available: http://hmi.ewi.utwente.nl/verslagen/capita-selecta/CS-Oude_Bos-Danny.pdf

[12] R. J. Davidson, G. E. Schwartz, C. Saron, J. Bennett, and D. J. Goleman, "Frontal versus parietal EEG asymmetry during positive and negative affect," Psychophysiology, vol. 16, pp. 202–203, 1979.

[13] O. Sakata, T. Shiina, and Y. Saito, "Multidimensional directed information and its application," Electron. Commun. Jpn., vol. 4, pp. 3–85, 2002.

[14] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, "International affective picture system (IAPS): Affective ratings of pictures and instruction manual," Univ. Florida, Gainesville, FL, Tech. Rep. A-8, 2008.

[15] R. J. Davidson, P. Ekman, C. D. Saron, J. A. Senulis, and W. V. Friesen, "Approach-withdrawal and cerebral asymmetry: Emotional expression and brain physiology," J. Personality Soc. Psychol., vol. 58, pp. 330–341, 1990.

[16] H. Jasper, "The ten-twenty electrode system of the international federation," Electroencephalogr. Clin. Neurophysiol., vol. 39, pp. 371–375, 1958.

[17] W. J. H. Nauta, "The problem of the frontal lobe: A reinterpretation," J. Psychiatric Res., vol. 8, pp. 167–187, 1971.

[18] R. J. Davidson, "What does the prefrontal cortex 'do' in affect: Perspectives on frontal EEG asymmetry research," Biol. Psychol., vol. 67, pp. 219–233, 2004.

[19] T. Kamitake, H. Harashima, and H. Miyakawa, "Time series analysis based on directed information," Trans. Inst. Electron. Inf. Commun. Eng., vol. 67-A, pp. 103–110, 1984.

[20] G. Wang and M. Takigawa, "Directed coherence as a measure of interhemispheric correlation of EEG," Int. J. Psychophysiol., vol. 13, pp. 119–128, 1992.

[21] M. Kaminski and K. J. Blinowska, "A new method of the description of the information flow in the brain structures," Biol. Cybern., vol. 65, pp. 203–210, 1991.

[22] L. A. Baccala and K. Sameshima, "Partial directed coherence: A new concept in neural structure determination," Biol. Cybern., vol. 84, pp. 463–474, 2001.

[23] W. Hesse, E. Moller, M. Arnold, and B. Schack, "The use of time-variant EEG Granger causality for inspecting directed interdependencies of neural assemblies," J. Neurosci. Methods, vol. 124, pp. 27–44, 2003.

[24] J. D. Morris, "Observations: SAM, the self-assessment mannequin—An efficient cross-cultural measurement of emotional response," J. Advertising Res., vol. 35, pp. 63–68, 1995.

[25] K. Coburn and M. Moreno, "Facts and artifacts in brain electrical activity mapping," Brain Topography, vol. 1, pp. 37–45, 1988.

[26] D. O. Olguin, "Adaptive digital filtering algorithms for the elimination of power line interference in electroencephalographic signals," Master's thesis, Instituto Tecnologico y de Estudios Superiores de Monterrey, Monterrey, Mexico, 2005.

[27] M. Fatourechi, A. Bashashati, R. K. Ward, and G. E. Birch, "EMG and EOG artifacts in brain computer interface systems: A survey," Clin. Neurophysiol., vol. 118, pp. 480–494, 2007.

[28] P. C. Petrantonakis and L. J. Hadjileontiadis, "Emotion recognition from EEG using higher order crossings," IEEE Trans. Inf. Technol. Biomed., vol. 14, no. 2, pp. 186–197, 2010.

[29] T. Musha, Y. Terasaki, H. A. Haque, and G. A. Ivanitsky, "Feature extraction from EEGs associated with emotions," Artif. Life Robot., vol. 1, no. 1, pp. 15–19, 1997.

[30] R. D. King, C. Feng, and A. Sutherland, "StatLog: Comparison of classification algorithms on large real-world problems," Appl. Artif. Intell., vol. 9, no. 3, pp. 259–287, 1995.

[31] W. J. Krzanowski, Principles of Multivariate Analysis. Oxford, UK: Oxford University Press, 1988.

[32] P. C. Mahalanobis, "On the generalized distance in statistics," Proc. Natl. Inst. Sci. India, vol. 2, pp. 49–55, 1936.

[33] T. Mitchell, Machine Learning. New York, NY: McGraw-Hill, 1997.

[34] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge, UK: Cambridge University Press, 2000.

[35] B. Kedem, Time Series Analysis by Higher Order Crossings. Piscataway, NJ: IEEE Press, 1994.

Panagiotis C. Petrantonakis (S'08) was born in Ierapetra, Crete, Greece, in 1984. He received the Diploma degree in electrical and computer engineering from the Aristotle University of Thessaloniki (AUTH), Thessaloniki, Greece, in 2007, where he is currently working toward the Ph.D. degree at the Signal Processing and Biomedical Technology Unit of the Telecommunications Laboratory.

His current research interests include advanced signal processing techniques, nonlinear transforms, and affective computing.

Mr. Petrantonakis is a member of the Technical Chamber of Greece.

Leontios J. Hadjileontiadis (S'87–M'98–SM'11) was born in Kastoria, Greece, in 1966. He received the Diploma degree in electrical engineering and the Ph.D. degree in electrical and computer engineering, both from the Aristotle University of Thessaloniki, Thessaloniki, Greece, in 1989 and 1997, respectively. He also received the Ph.D. degree in music composition from the University of York, York, UK, in 2004.

Since December 1999, he has been with the Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, as a faculty member, where he is currently an Associate Professor, working on lung sounds, heart sounds, bowel sounds, ECG data compression, seismic data analysis, and crack detection in the Signal Processing and Biomedical Technology Unit of the Telecommunications Laboratory. He is also currently a Professor in composition at the State Conservatory of Thessaloniki, Thessaloniki, Greece. His research interests include higher-order statistics, alpha-stable distributions, higher-order zero crossings, wavelets, polyspectra, fractals, and neuro-fuzzy modeling for medical, mobile, and digital signal processing applications.

Dr. Hadjileontiadis is a member of the Technical Chamber of Greece, the Higher-Order Statistics Society, the International Lung Sounds Association, and the American College of Chest Physicians. He was the recipient of the second award at the Best Paper Competition of the 9th Panhellenic Medical Conference on Thorax Diseases, 1997, Thessaloniki. He was also an open finalist at the Student Paper Competition (Whitaker Foundation) of the IEEE EMBS, 1997, Chicago, IL; a finalist at the Student Paper Competition (in memory of Dick Poortvliet) of MEDICON'98, Lemesos, Cyprus; and the recipient of the Young Scientist Award of the 24th International Lung Sounds Conference, 1999, Marburg, Germany. In 2004, 2005, and 2007, he organized and served as a mentor to three five-student teams that ranked third, second, and seventh worldwide, respectively, at the Imagine Cup Competition (Microsoft), Sao Paulo, Brazil (2004)/Yokohama, Japan (2005)/Seoul, Korea (2007), with projects involving technology-based solutions for people with disabilities.