Volumetric Texture Description and Discriminant Feature Selection for MRI

Abstract. This paper considers the problem of classification of Magnetic Resonance Images using 2D and 3D texture measures. Joint statistics such as co-occurrence matrices are common for analysing texture in 2D since they are simple and effective to implement. However, the computational complexity can be prohibitive, especially in 3D. In this work, we develop a texture classification strategy using a sub-band filtering technique, based on the Wilson and Spann [17] Finite Prolate Spheroidal Sequences, that can be extended to 3D. We further propose a feature selection technique based on the Bhattacharyya distance measure that reduces the number of features required for the classification by selecting a set of discriminant features conditioned on a set of training texture samples. We describe and illustrate the methodology by quantitatively analysing a series of images: a 2D synthetic phantom, 2D natural textures, and MRI of human knees.

Keywords: Image Segmentation, Texture classification, Sub-band filtering, Feature selection, Co-occurrence.

1 Introduction

There has been extensive research in texture analysis in 2D, and even if the concept of texture is intuitively obvious, it can be difficult to give a satisfactory definition. Haralick [5] is a basic reference for statistical and structural approaches to texture description; contextual methods like Markov Random Fields are used by Cross and Jain [3], and fractal geometry methods by Keller [8]. The dependence of texture on resolution or scale has been recognised and exploited by workers in the past decade.

Texture description and analysis using a frequency approach is not as common as the spatial-domain method of co-occurrence [6], but there has been renewed interest in the use of filtering methods akin to Gabor decomposition [10] and joint spatial/spatial-frequency representations like wavelet transforms [16]. Although easy to implement, co-occurrence measures are outperformed by such filtering techniques (see [13]) and have prohibitive costs when extended to 3D.

The textures encountered in magnetic resonance imaging (MRI) differ greatly from synthetic textures, which tend to be structural and can often be described with textural elements that repeat in certain patterns. MR texture exhibits a degree of granularity, randomness and, where the imaged tissue is fibrous like muscle, directionality. The importance of texture in MRI has been the focus of some researchers, notably Lerski [9] and Schad [15], and a COST European group has been established for this purpose [2]. Texture analysis has been used with mixed success in MRI, such as for detection of microcalcifications


in breast imaging [6] and for knee segmentation [7], and in Central Nervous System (CNS) imaging to detect macroscopic lesions and microscopic abnormalities, such as for quantifying contralateral differences in epilepsy subjects [14], to aid the automatic delineation of cerebellar volumes [12], and to characterise spinal cord pathology in Multiple Sclerosis [11]. Most of this reported work, however, has employed solely 2D measures, usually co-occurrence matrices, which are limited by computational cost. Furthermore, feature selection is often performed in an empirical way with little regard to the training data which are usually available.

Our contribution in this work is to implement a fully 3D texture description scheme using multiresolution sub-band filtering and to develop a strategy for selecting the most discriminant texture features conditioned on a set of training images containing examples of the tissue types of interest. The ultimate goal is to select a compact and appropriate set of features, thus reducing the computational burden in both feature extraction and subsequent classification. We describe the 2D and 3D frequency-domain texture feature representation and the feature selection method, illustrating and quantitatively comparing results on 2D images and 3D MRI.

2 Materials and Methods

For this work three textured data sets were used:

1. 2D synthetic phantom of artificial textures: random noise and oriented patterns with different frequencies and orientations,

2. 2D composite of 16 natural textures from the Brodatz album, arranged by Randen and Husøy [13]. This image is quite difficult to segment: the individual images have been histogram equalised and, even to the human eye, some boundaries are not evident,

3. 3D MRI of a human knee. The set is sagittal T1-weighted with dimensions 512 × 512 × 87; each pixel is 0.25 mm and the slice separation is 1.4 mm.

Figure 1 presents the data sets; in the case of the MRI only one slice (54) is shown. Throughout this work we will consider that an image, I, has dimensions Nr × Nc for rows and columns and is quantised to Ng grey levels. Let Lc = {1, 2, ..., Nc} and Lr = {1, 2, ..., Nr} be the horizontal and vertical spatial domains of the image, and G = {1, 2, ..., Ng} the set of grey tones. The image I can then be represented as a function that assigns a grey tone to each pair of coordinates:

Lr × Lc; I : Lr × Lc → G (1)

2.1 Multiresolution Sub-band Filtering: The Second Orientation Pyramid (SOP)

Textures can vary in their spectral distribution in the frequency domain, and therefore a set of sub-band filters can help in their discrimination: if the image


Fig. 1. Materials used for this work, three data sets: (a) Synthetic Phantom (2D), (b) Natural Textures (2D), (c) MRI of human knee (3D).

contains textures that vary in orientation and frequency, then certain filter sub-bands will be more energetic than others, and 'roughness' will be characterised by more or less energy in broadly circular band-pass regions. Wilson and Spann [17] proposed a set of operations that subdivide the frequency domain of an image into smaller regions by the use of compact and optimal (in spatial versus spatial-frequency energy) filter functions based on finite prolate spheroidal sequences (FPSS). The FPSS are real, band-limited functions which cover the Fourier half-plane. In our case we have approximated these functions with truncated Gaussians for ease of implementation, with satisfactory results (figure 3). These filter functions can be regarded as a band-limited Gabor basis which provides frequency localisation.

Any given image I whose centred Fourier transform is I_ω = F{I} can be subdivided into a set of regions L^i_r × L^i_c: L^i_r = {r, r+1, ..., r + N^i_r}, 1 ≤ r ≤ Nr − N^i_r, and L^i_c = {c, c+1, ..., c + N^i_c}, 1 ≤ c ≤ Nc − N^i_c, that satisfy the conditions: L^i_r ⊂ Lr, L^i_c ⊂ Lc, Σ_i N^i_r = Nr, Σ_i N^i_c = Nc, and (L^i_r × L^i_c) ∩ (L^j_r × L^j_c) = ∅ for i ≠ j.


Fig. 2. 2D and 3D Second Orientation Pyramid (SOP) tessellation. Solid lines indicate the filters added at the present order while dotted lines indicate filters added in lower orders. (a) 2D order 1, (b) 2D order 2, (c) 2D order 3, and (d) 3D order 1.


For this work, the Second Orientation Pyramid (SOP) tessellation presented in figure 2(a, b) was selected to tessellate the frequency domain. The SOP tessellation involves a set of 7 filters, one for the low-pass region and six for the high-pass, and they are related to the i subdivisions of the frequency domain as:

Lr × Lc;  F^i : L^i_r × L^i_c → N(µ_i, Σ_i),  (L^i_r × L^i_c)^c → 0,  ∀ i ∈ SOP   (2)

where µ_i is the centre of the region i and Σ_i is the variance of the Gaussian, which provides a cut-off of 0.5 at the limit of the band (figure 3).


Fig. 3. Band-limited Gaussian filter F^i: (a) frequency domain, (b) spatial domain.

The feature space S^i in its frequency and spatial domains is defined as:

S^i_ω(k, l) = F^i(k, l) I_ω(k, l)  ∀ (k, l) ∈ (Lr × Lc),   S^i = F^{-1}{S^i_ω}   (3)
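As an illustrative sketch of how one such sub-band feature can be computed (hypothetical NumPy code, not the authors' implementation; the truncated Gaussian and the 0.5 cut-off at the band limit follow the description above, and the function names are our own):

```python
import numpy as np

def gaussian_subband_filter(shape, row_range, col_range):
    """Truncated Gaussian F^i: a Gaussian centred on the sub-band
    (row_range x col_range), falling to ~0.5 at the band limits,
    and set to zero outside the sub-band."""
    Nr, Nc = shape
    r0, r1 = row_range            # e.g. (Nr//4, 3*Nr//4) for the centre band
    c0, c1 = col_range
    mu = np.array([(r0 + r1 - 1) / 2.0, (c0 + c1 - 1) / 2.0])
    half = np.array([(r1 - r0) / 2.0, (c1 - c0) / 2.0])
    # choose sigma so the Gaussian is ~0.5 at the edge of the band
    sigma = half / np.sqrt(2.0 * np.log(2.0))
    rr, cc = np.mgrid[0:Nr, 0:Nc]
    g = np.exp(-0.5 * (((rr - mu[0]) / sigma[0]) ** 2 +
                       ((cc - mu[1]) / sigma[1]) ** 2))
    mask = np.zeros(shape)
    mask[r0:r1, c0:c1] = 1.0      # truncate outside the sub-band
    return g * mask

def subband_feature(image, row_range, col_range):
    """S^i = F^{-1}{ F^i . I_omega } for one sub-band (equation 3)."""
    I_omega = np.fft.fftshift(np.fft.fft2(image))   # centred spectrum
    F_i = gaussian_subband_filter(image.shape, row_range, col_range)
    return np.abs(np.fft.ifft2(np.fft.ifftshift(F_i * I_omega)))
```

The magnitude of the inverse transform is taken so the feature is real-valued, as the filter covers only part of the (half-)plane.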

Every order of the SOP pyramid consists of 7 filters, and the methodology for the first order can be extended to the subsequent orders. At every step, one of the filters contains the low-pass region (i.e. the centre) of the region analysed, I_ω for the first order, and the six remaining filters subdivide the high-pass bands, the surround of the region. This is detailed in the following co-ordinate systems:

Centre: F^1: L^1_r = {Nr/4 + 1, ..., 3Nr/4}, L^1_c = {Nc/4 + 1, ..., 3Nc/4};
Surround: F^{2-7}: L^{3,4,5,6}_r = {1, ..., Nr/4}, L^{2,7}_r = {Nr/4 + 1, ..., Nr/2}, L^{2,3}_c = {1, ..., Nc/4}, L^4_c = {Nc/4 + 1, ..., Nc/2}, L^5_c = {Nc/2 + 1, ..., 3Nc/4}, L^{6,7}_c = {3Nc/4 + 1, ..., Nc}.

For a pyramid of order 2, the region to be subdivided is the first central region, (L^1_r(1) × L^1_c(1)), which becomes (Lr(2) × Lc(2)) with dimensions Nr(2) = Nr(1)/2, Nc(2) = Nc(1)/2 (or, in general, N_{r,c}(o+1) = N_{r,c}(o)/2 for any order o). It is assumed that Nr(1) = 2^a and Nc(1) = 2^b so that the results of the divisions are always integer values. The horizontal and vertical frequency domains are expressed by Lr(2) = {Nr(1)/4 + 1, ..., 3Nr(1)/4}, Lc(2) = {Nc(1)/4 + 1, ..., 3Nc(1)/4}, and the next filters can be calculated recursively: L^8_r(1) = L^1_r(2), L^8_c(1) = L^1_c(2), L^9_r(1) = L^2_r(2), etc.
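The recursion on the central band can be sketched as follows (hypothetical helper under the assumption Nr(1) = 2^a, Nc(1) = 2^b stated above; 0-based index ranges are used for convenience):

```python
def centre_band(Nr, Nc, order):
    """Return the (row, col) half-open index ranges of the low-pass
    (centre) band after `order` subdivisions; the central region
    halves in each dimension at every order."""
    r0, r1, c0, c1 = 0, Nr, 0, Nc
    for _ in range(order):
        nr, nc = r1 - r0, c1 - c0
        # the new centre spans the middle half of the current region
        r0, r1 = r0 + nr // 4, r0 + 3 * nr // 4
        c0, c1 = c0 + nc // 4, c0 + 3 * nc // 4
    return (r0, r1), (c0, c1)
```

For a 256 × 256 image the order-1 centre is rows and columns 64 to 192, and the order-2 centre is 96 to 160, consistent with the halving rule above.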

Figure 4 shows the feature space S^i of the 2D synthetic phantom shown in figure 1(a). Figure 4(a) contains the features of orders 1 and 2, and figure 4(b) shows the features of orders 2 and 3. Note how in S^{2-7}, the features from the high-pass bands, only the central region, which is composed of noise, is present; the oriented patterns have been filtered out. S^{10} and S^{20} show the activation due to the oriented patterns. S^8 is a low-pass filter and still keeps a trace of one of the oriented patterns.


Fig. 4. Two sets of features S^i from the phantom image: (a) features 2 to 14 (note S^{10}, which describes one oriented pattern), (b) features 9 to 21 (note S^{20}, which describes one oriented pattern). In each set, the feature S^i is placed in the position corresponding to the filter F^i in the frequency domain.

2.2 3D Multiresolution Sub-band Filtering

In order to filter a three-dimensional set, a 3D tessellation (figure 2(d)) is required. The filters will again be formed by truncated 3D Gaussians in an octave-wise tessellation that resembles a regular oct-tree configuration. In the case of MR data, these filters can be applied directly to the k-space. As in the 2D case, the low-pass region will be covered by one filter, but the surround or high-pass region is more complicated: while there are 6 high-pass filters in a 2D tessellation, in three dimensions there are 28, so this tessellation yields 29 features per order. As in the two-dimensional case, half of the space is not used because of the symmetry properties of the Fourier transform. The definition of the filters follows from extending the space of rows and columns to Lr × Lc × Ll, with the new dimension l for levels. The need for feature selection now becomes clear, since it cannot be expected that all the sub-bands will carry useful information for texture classification.

2.3 Discriminant Feature Selection: Bhattacharyya Space and Order Statistics

Feature selection is a critical step in classification since not all features derived from sub-band filtering, co-occurrence matrices, wavelets, wavelet packets or any other methodology have the same discrimination power. In many cases, a large number of features are included in classifiers or reduced by PCA or other methods without considering that some of those features will not help to improve classification but will consume computational effort. As well as making each feature linearly independent, PCA allows the ranking of features according to the size of the global covariance in each principal axis, from which a 'sub-space' of features can be presented to a classifier. Fisher linear discriminant analysis (LDA) diagonalises the feature space constrained by maximising the ratio of between-class to within-class variance, and can be used together with PCA to rank the features by their 'spread' and select a discriminant subspace [4]. However, while these eigenspace methods are optimal and effective, they still require the computation of all the features for the given data.

We propose a feature selection methodology based on the discrimination power of the individual features taken independently; the ultimate goal is to select a reduced number m of features or bands (in the 2D case m ≤ 7o, and in 3D m ≤ 29o, where o is the order of the SOP tessellation). It is sub-optimal in the sense that there is no guarantee that the selected feature sub-space is the best, but our method does not exclude the use of PCA or LDA to diagonalise the result to aid the classification.

A set of training classes is required, which makes this a supervised method. Four training classes of the human knee MRI were manually segmented and each class was sub-band filtered in 3D. Figure 5 shows the scatter plots of three bad and three good features, arbitrarily chosen.


Fig. 5. Scatter plots of three features S^i from the 3D second order SOP tessellation of the human knee MRI: (a) bad discriminating features S^{2,24,47}, (b) good discriminating features S^{5,39,54}.

In order to obtain a quantitative measure of how separable two classes are, a distance or cost measure is required. In [1], the Bhattacharyya distance is presented as a method to provide insight into the behaviour of two distributions. This distance outperforms other measures ([citation deleted]). The variance and mean of each class are computed to calculate the distance in the following way:

B(a, b) = (1/4) ln{ (1/4) ( σ²_a/σ²_b + σ²_b/σ²_a + 2 ) } + (1/4) { (µ_a − µ_b)² / (σ²_a + σ²_b) }   (4)


where B(a, b) is the Bhattacharyya distance between the a-th and b-th classes, σ²_a is the variance of the a-th class, µ_a is the mean of the a-th class, and a, b are two different training classes.
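Equation (4) can be computed directly from the sample statistics of two training classes; a minimal sketch in NumPy (our own illustrative helper, not the authors' code):

```python
import numpy as np

def bhattacharyya(a, b):
    """Bhattacharyya distance between two training classes (equation 4),
    computed from the sample mean and variance of each class."""
    mu_a, mu_b = np.mean(a), np.mean(b)
    va, vb = np.var(a), np.var(b)
    # first term: zero when the variances are equal (Mahalanobis case)
    term1 = 0.25 * np.log(0.25 * (va / vb + vb / va + 2.0))
    # second term: zero when the means are equal
    term2 = 0.25 * (mu_a - mu_b) ** 2 / (va + vb)
    return term1 + term2
```

Note that the distance is zero only when both the means and the variances of the two classes coincide, and it grows with the separation of the means relative to the pooled variance.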

The Mahalanobis distance used in Fisher LDA is a particular case of the Bhattacharyya distance when the variances of the two classes are equal, which eliminates the first term of the distance. The second term, on the other hand, is zero if the means are equal, and is inversely proportional to the variances. B(a, b) was calculated for the four training classes (background, muscle, bone and tissue) of the human knee MRI (figure 1(c)) with the following results:

          Background   Muscle   Bone     Tissue
Tissue    11.70        3.26     0.0064   0

Note the small Bhattacharyya distance between the tissue and bone classes. These two classes share the same range of grey levels and therefore have lower discrimination power. For n classes with S^i features, each class pair p at feature i will have a Bhattacharyya distance B^i(a, b), and these produce a Bhattacharyya space of dimensions Np × Ni, where Np = (n choose 2) and Ni = 7o. The domains are Li = {1, 2, ..., 7o} and Lp = {(1, 2), (1, 3), ..., (a, b), ..., (n−1, n)}, where o is the order of the pyramid. The Bhattacharyya space is then defined as:

Lp × Li;  BS : Lp × Li → B^i(S^i_a, S^i_b)   (5)

whose marginal BS^i = Σ_{p=1}^{Np} B^i(S^i_a, S^i_b) is of particular interest since it sums the Bhattacharyya distances over every class pair for a given feature and thus indicates how discriminant a certain filter is over the whole combination of class pairs. Figure 6(a) shows the Bhattacharyya space for the 2D image of natural textures shown in figure 1(b), and figure 6(b) shows the marginal BS^i.

The selection process of the most discriminant features that we propose uses the marginal of the Bhattacharyya space, BS^i, which indicates which filtered feature is the most discriminant. The marginal is a set

BS^i = {BS^1, BS^2, ..., BS^{7o}},   (6)

which can be sorted in decreasing order; its order statistic is:

BS_(i) = {BS_(1), BS_(2), ..., BS_(7o)},   BS_(1) ≥ BS_(2) ≥ ... ≥ BS_(7o).   (7)

This new set can be used in two different ways. First, it provides a particular order in which the feature space can be fed into a classifier; with a mask provided, the error rate can be measured to see the contribution of each feature to the final classification of the data. Second, it can provide a reduced set or sub-space: a group of training classes of reduced dimensions can show which filters are adequate for discrimination, and thus only those filters, and not the complete SOP tessellation, need be calculated.
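The construction of the Bhattacharyya space and its sorted marginal can be sketched as follows (illustrative NumPy code under our reading of equations (4)-(7); `rank_features` is a hypothetical helper, not the authors' implementation):

```python
import numpy as np
from itertools import combinations

def bhattacharyya(a, b):
    """Bhattacharyya distance between two 1D samples (equation 4)."""
    mu_a, mu_b = np.mean(a), np.mean(b)
    va, vb = np.var(a), np.var(b)
    return (0.25 * np.log(0.25 * (va / vb + vb / va + 2.0))
            + 0.25 * (mu_a - mu_b) ** 2 / (va + vb))

def rank_features(class_features):
    """class_features: list of n arrays, one per training class, each of
    shape (num_samples, num_features).  Builds the Bhattacharyya space
    BS (one row per class pair, one column per feature), its marginal
    BS^i, and returns the feature indices sorted most-discriminant-first."""
    n_feat = class_features[0].shape[1]
    pairs = list(combinations(range(len(class_features)), 2))
    BS = np.zeros((len(pairs), n_feat))
    for p, (a, b) in enumerate(pairs):
        for i in range(n_feat):
            BS[p, i] = bhattacharyya(class_features[a][:, i],
                                     class_features[b][:, i])
    marginal = BS.sum(axis=0)          # BS^i, summed over all class pairs
    return np.argsort(marginal)[::-1], marginal
```

The returned ordering is the order statistic BS_(i): presenting the features to a classifier in that order realises the first use described above, while truncating it realises the second.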



Fig. 6. Natural textures: (a) Bhattacharyya space BS (2D, order 5 = 35 features, (16 choose 2) = 120 pairs), (b) corresponding marginal of the Bhattacharyya space, BS^i.


Fig. 7. Human knee MRI: (a) Bhattacharyya space BS (3D, order 2), (4 choose 2) = 6 pairs, (b) Bhattacharyya space of the bone-tissue pair, B^i(bone, tissue).


Fig. 8. Two sets of features S^i from different images: (a) features 2 to 14 of the natural textures image, (b) features 2 to 14 from one slice of the human knee MRI.


3 Classification of the Feature Space

For every data set in this work, the feature space was classified with a K-means segmentation algorithm, selected for its simplicity and speed. The feature space was introduced to the classifier in the order given by the order statistic BS_(i), and for each additional feature included, the misclassification error was calculated.
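This sequential scheme can be sketched as follows (a minimal sketch, not the authors' code; a deterministic farthest-point initialisation is used in place of random seeding, which the paper does not specify, and clusters are mapped to classes by majority vote to score the misclassification):

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with deterministic farthest-point initialisation."""
    centres = [X[0]]
    for _ in range(1, k):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centres], axis=0)
        centres.append(X[int(d.argmax())])
    centres = np.array(centres, dtype=float)
    for _ in range(iters):
        d = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels

def incremental_error(features, truth, ranking, k):
    """Classify with the top-1, top-2, ... ranked features and report
    the misclassification rate at each step."""
    errors = []
    for m in range(1, len(ranking) + 1):
        labels = kmeans(features[:, ranking[:m]], k)
        err = 0
        for j in range(k):
            in_j = labels == j
            if np.any(in_j):
                # map each cluster to its majority ground-truth class
                majority = np.bincount(truth[in_j]).argmax()
                err += np.sum(truth[in_j] != majority)
        errors.append(err / len(truth))
    return errors
```

Plotting the returned error against the number of included features reproduces the kind of curve shown in figures 9(c) and 10(c).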

Figures 4 and 8 show some of the features of the sub-band filtering process; for the MRI set only one slice (54) was used. In figure 4(b), two features, S^{10,20} (features 10 and 20), highlight the oriented patterns of the synthetic phantom. This is a simple image compared with the natural textures, whose feature space is shown in figure 8(a). Still, some of the textures can be spotted in certain features. For instance, S^{2,7} highlight one of the upper central textures, which is of high frequency. Note also that in S^{3-6}, the upper row, the circular regions have a low value, i.e. they have been filtered out since their nature is of lower frequencies.

For the human knee in figure 8(b), the first immediate observation is the high-pass nature of the background of the image in regions 2, 3, 6 and 7, which is expected since it is mainly noise. Also in the high-frequency band, features 4 and 5 (upper central) do not describe the background but rather the bone of the knee. S^8 is a low-pass filtered version of the original slice.

The Bhattacharyya spaces in figures 6 and 7 present very interesting information for the selection of the features for classification. In the natural textures case, a certain periodicity can be found: S^{1,7,14,21,28} have the lowest values. This implies that the low-pass features provide no discrimination at all, as might be expected since the individual images were histogram equalised before generating the composite 16-class image.

The human knee MRI Bhattacharyya space (figure 7(a)) was formed with four 32 × 32 × 32 training regions of background, muscle, bone and tissue. It can be immediately noticed that two bands, S^{22,54} (low-pass), dominate the discrimination. Notice that the distance of the bone-tissue pair is practically zero compared with the rest of the space, which matches the previous calculations. If the marginal were calculated as in the previous cases, the low-pass bands would dominate and the discrimination of the bone and tissue classes, which are difficult to segment, would be lost. Figure 7(b) zooms into the Bhattacharyya space of this pair. Here we can see that some features (12, 5, 8, 38) could provide discrimination between bone and tissue, while the low-pass bands could help discriminate the rest of the classes.

4 Discussion

Figure 9(a) shows the classification of the 2D synthetic phantom at 4.3% misclassification with 7 features (out of 35). Of particular importance were features 10 and 20, which can be seen in the marginal of the Bhattacharyya space in figure 9(b). The low-pass features 1 and 8 also have high values but should not be included in this case, since they contain the frequency energy that will be disclosed in features 10 and 20, giving more discrimination power.



Fig. 9. Classification of figure 1(a). (a) Classified 2D phantom at 4.13% misclassification. (b) Marginal distribution of the Bhattacharyya space BSi (note the high values for features 10 and 20). (c) Misclassification per features included.

[Plot: misclassification vs. number of features included for the k-means classifier.]

Fig. 10. Classification of the natural-textures image (figure 1(b)) with 16 different textures. (a) Classification results at 37.2% misclassification. (b) Pixels correctly classified. (c) Misclassification error for the sequential inclusion of features to the classifier.


Fig. 11. Human knee MRI slices and their corresponding classification (misclassification 8.1%). (a) Sagittal view, slice 31. (b) Sagittal slice 45. (c) Axial slice 200. (d) Coronal slice 250.


The misclassification plot in figure 9(c) shows how the first two features in the classifier manage to classify correctly more than 90% of the pixels, and the next 5, which describe the central circular region, decrease the misclassification further. If more features are added, the classification does not improve.
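This sequential-inclusion experiment can be sketched with a toy loop that adds ranked features one at a time and records the error. Here a nearest-class-mean rule stands in for the k-means clustering used in the experiments, and all names are illustrative:

```python
def nearest_mean_classify(train, train_labels, test):
    """Assign each test vector to the class with the closest mean.

    A simple stand-in for the unsupervised clustering step of the paper.
    """
    classes = sorted(set(train_labels))
    means = {}
    for c in classes:
        vecs = [v for v, l in zip(train, train_labels) if l == c]
        dim = len(vecs[0])
        means[c] = [sum(v[d] for v in vecs) / len(vecs) for d in range(dim)]

    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return [min(classes, key=lambda c: sq_dist(v, means[c])) for v in test]


def misclassification_curve(samples, labels, ranked):
    """Error rate as features are included one at a time in ranked order."""
    curve = []
    for k in range(1, len(ranked) + 1):
        reduced = [[v[i] for i in ranked[:k]] for v in samples]
        pred = nearest_mean_classify(reduced, labels, reduced)
        curve.append(sum(p != t for p, t in zip(pred, labels)) / len(labels))
    return curve
```

Plotting such a curve is how one decides where adding further features stops paying off, as in figure 9(c).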

The natural-textures image presents a more difficult challenge. Randen and Husøy [13] used 9 techniques to classify this image; interestingly, they did not use FPSS filtering. Some of their misclassification results were: dyadic Gabor filter banks (60.1%), Gabor filters (54.8%), co-occurrence (49.6%), Laws filters (48.3%), wavelets (38.2%), quadrature mirror filters (36.4%). The misclassification of SOP filtering is 37.2%, placing it in second place. Figure 10(a) shows the final classification and figure 10(b) shows the pixels that were correctly classified. The misclassification decreases as features are added and requires almost all of the feature space, in contrast with the synthetic phantom previously described.

The most important element of the materials is the human knee MRI. The original data set consisted of 87 slices of 512 × 512 pixels each. The classification was performed with the low-pass feature, 54, and the ordered statistics of the bone–tissue feature space: 12, 5, 8, 39, 9, 51, 42, 62. This reduced the computational burden significantly, since only these features were filtered. The misclassification obtained was 8.1%. Several slices in axial, coronal and sagittal planes with their respective classifications are presented in figure 11.

To compare the discrimination power of the sub-band filtering technique with the co-occurrence matrix, one slice of the human knee MRI set was selected and classified with both methods. The major disadvantage of the co-occurrence matrix is that its dimensions depend on the number of grey levels. In many cases, the grey levels are quantised to reduce the computational cost, and information is inevitably lost. Otherwise, the computational burden just to obtain the original matrix is huge.
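A minimal sketch of a co-occurrence (GLCM) computation with grey-level quantisation makes the trade-off concrete: the matrix is levels × levels, so reducing the number of quantisation levels shrinks the cost but discards intensity information. The function and argument names below are ours, not the paper's:

```python
def glcm(image, levels, offset):
    """Normalised grey-level co-occurrence matrix.

    image: 2-D list of non-negative intensities; levels: number of
    quantisation bins (the matrix is levels x levels); offset: (dr, dc)
    displacement defining which pixel pairs are counted.
    """
    max_val = max(max(row) for row in image)
    # quantise intensities into `levels` equal-width bins
    q = [[min(int(p * levels / (max_val + 1)), levels - 1) for p in row]
         for row in image]

    rows, cols = len(q), len(q[0])
    dr, dc = offset
    mat = [[0] * levels for _ in range(levels)]
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                mat[q[r][c]][q[r2][c2]] += 1
                total += 1
    # normalise counts to a joint probability mass function
    return [[v / total for v in row] for row in mat]
```

With 256 grey levels the matrix already has 65 536 entries per orientation and displacement, which is why quantisation (and the attendant information loss) is hard to avoid.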

The Bhattacharyya space was calculated with the same methodology and the 10 most discriminant features were Contrast: f2(θ = 0, π/2, 3π/4); Inverse difference moment: f5(θ = 3π/4); Variance: f10(θ = 0, π/2, 3π/4); Entropy: f11(θ = 0, π/2, 3π/4).

The misclassification obtained with these 10 features was 40.5%. To improve the classification, the grey-level original data was included as another feature; in this case, with the first 6 features the misclassification reached 17.0%. With the sub-band filters, this slice had a misclassification of 7%.
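The co-occurrence statistics named above are computed directly from the normalised matrix. A sketch of three of them, following the usual Haralick-style definitions (our function and key names, and the feature numbering in the text follows the paper):

```python
import math


def haralick_features(p):
    """Contrast, inverse difference moment and entropy of a normalised
    co-occurrence matrix p (entries form a joint pmf over level pairs)."""
    n = len(p)
    # contrast: weights entries by the squared grey-level difference
    contrast = sum((i - j) ** 2 * p[i][j]
                   for i in range(n) for j in range(n))
    # inverse difference moment: large when mass sits near the diagonal
    idm = sum(p[i][j] / (1 + (i - j) ** 2)
              for i in range(n) for j in range(n))
    # entropy: measures the randomness of the level-pair distribution
    entropy = -sum(p[i][j] * math.log(p[i][j])
                   for i in range(n) for j in range(n) if p[i][j] > 0)
    return {"contrast": contrast, "idm": idm, "entropy": entropy}
```

Evaluating these at each displacement angle θ yields the directional features f2(θ), f5(θ) and f11(θ) referred to in the text.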

5 Conclusions

The Second Orientation Pyramid sub-band filtering is a powerful and simple technique to discriminate both natural and synthetic textures, and it extends well to 3D. The number of features can be drastically reduced by feature selection through the Bhattacharyya space to a most discriminant subset, either from the marginal or from an individual class pair's distances. Our results were compared with the co-occurrence matrix and show that the misclassification for sub-band filtering is almost half that of the joint statistics method; even with a simple classifier, the results


are comparable with Randen's [13]. While co-occurrence is not easily extended to three dimensions, we can employ our feature selection method to effectively select a compact set of discriminant features for this scheme.

References

1. G. B. Coleman and H. C. Andrews. Image Segmentation by Clustering. Proceedings of the IEEE, 67(5):773–785, 1979.

2. COST European Cooperation in the field of Scientific and Technical Research. COST B11: Quantitation of Magnetic Resonance Image Texture. World Wide Web, http://www.uib.no/costb11/, 2002.

3. G. R. Cross and A. K. Jain. Markov Random Field Texture Models. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-5(1):25–39, 1983.

4. K. Fukunaga. Introduction to Statistical Pattern Recognition. Academic Press, 1972.

5. R. M. Haralick. Statistical and Structural Approaches to Texture. Proceedings of the IEEE, 67(5):786–804, 1979.

6. D. James, B. D. Clymer, and P. Schmalbrock. Texture Detection of Simulated Microcalcification Susceptibility Effects in Magnetic Resonance Imaging of the Breasts. Journal of Magnetic Resonance Imaging, 13:876–881, 2002.

7. Tina Kapur. Model Based Three Dimensional Medical Image Segmentation. PhD thesis, AI Lab, Massachusetts Institute of Technology, May 1999.

8. J. M. Keller and S. Chen. Texture Description and Segmentation through Fractal Geometry. Computer Vision, Graphics and Image Processing, 45:150–166, 1989.

9. R. A. Lerski, K. Straughan, L. R. Schad, D. Boyce, S. Bluml, and I. Zuna. MR Image Texture Analysis - An Approach to Tissue Characterization. Magnetic Resonance Imaging, 11(6):873–887, 1993.

10. M. Unser and M. Eden. Multiresolution Feature Extraction and Selection for Texture Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 11(7):717–728, 1989.

11. J. M. Mathias, P. S. Tofts, and N. A. Losseff. Texture Analysis of Spinal Cord Pathology in Multiple Sclerosis. Magnetic Resonance in Medicine, 42:929–935, 1999.

12. O. Yu, Y. Mauss, I. J. Namer, and J. Chambron. Existence of contralateral abnormalities revealed by texture analysis in unilateral intractable hippocampal epilepsy. Magnetic Resonance Imaging, 19:1305–1310, 2001.

13. T. Randen and J. Håkon Husøy. Filtering for Texture Classification: A Comparative Study. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(4):291–310, 1999.

14. N. Saeed and B. K. Piri. Cerebellum Segmentation Employing Texture Properties and Knowledge Based Image Processing: Applied to Normal Adult Controls and Patients. Magnetic Resonance Imaging, 20:425–429, 2002.

15. L. R. Schad, S. Bluml, and I. Zuna. MR Tissue Characterization of Intracranial Tumors by Means of Texture Analysis. Magnetic Resonance Imaging, 11:889–896, 1993.

16. M. Unser. Texture Classification and Segmentation Using Wavelet Frames. IEEE Transactions on Image Processing, 4(11):1549–1560, 1995.

17. R. Wilson and M. Spann. Finite Prolate Spheroidal Sequences and Their Applications: Image Feature Description and Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 10(2):193–203, 1988.