
GENERATION OF CRYPTOGRAPHIC KEYS FROM FACIAL FEATURES


TABLE OF CONTENTS

CHAPTER NO. TITLE

ABSTRACT

LIST OF FIGURES

LIST OF TABLES

LIST OF ACRONYMS

1 INTRODUCTION

1.1 OBJECTIVE

1.2 BACKGROUND

1.3 LITERATURE SURVEY

1.4 PROBLEM DEFINITION

1.5 VASTNESS OF THE PROBLEM

1.6 STATEMENT OF SCOPE

1.7 SOFTWARE CONTEXT

1.8 MAJOR CONSTRAINTS

2 BACKGROUND THEORY

2.1 BIOMETRICS

2.2 FINGERPRINT BIOMETRICS

2.2.1 Principles of fingerprint biometrics

2.2.2 Issues with fingerprint systems

2.2.3 Benefits of fingerprint biometric systems

2.2.4 Applications of fingerprint biometrics

2.3 HAND BIOMETRICS

2.3.1 Principles of hand biometrics

2.3.2 How does hand biometrics work

2.3.3 Benefits of hand biometric systems

2.3.4 Weaknesses of hand biometric systems

2.3.5 Applications of hand biometrics

2.4 IRIS BIOMETRICS

2.4.1 Principles of iris biometrics

2.4.2 How does iris biometrics work

2.4.3 Benefits of iris biometric systems

2.4.4 Weaknesses of iris biometric systems

2.4.5 Applications of iris biometrics

2.5 FACE BIOMETRICS

2.5.1 Principles of face biometrics

2.5.2 How does face biometrics work

2.5.3 Benefits of face biometric systems

2.5.4 Weaknesses of face biometric systems

2.6 CRYPTOGRAPHIC KEY

2.6.1 Key Management

2.7 ADVANCED ENCRYPTION STANDARD

2.8 FEASIBILITY STUDY

2.9 SOFTWARE/HARDWARE REQUIREMENTS

3 SYSTEM ARCHITECTURE

3.1 BASIC ALGORITHM

3.2 RADON TRANSFORMS

3.3 NORMALISATION

3.4 FOURIER TRANSFORM

3.5 BINARISATION

3.6 BINARY FEATURE EXTRACTION

4 IMPLEMENTATION

4.1 DETAILED PROCEDURE

4.1.1 Input image registration

4.1.2 Radon transforms

4.1.3 Normalization

4.1.4 Fast Fourier transforms

4.1.5 Binary feature extraction

5 CONCLUSION

6 REFERENCES

ABSTRACT

Cryptographic key generation from the face biometric was implemented and verified. The method uses an entropy-based feature extraction process coupled with Reed-Solomon error-correcting codes to generate deterministic bit sequences. Combining biometrics with cryptography is a possible solution, but any biometric cryptosystem must be able to overcome the small variations present between different acquisitions of the same biometric in order to produce consistent keys. Hence the biometric key obtained from the face has been combined with the cryptographic key, and a look-up table was generated from which the cryptographic key can be recovered during decoding. The method is also flexible: the bio-keys used to protect the cryptographic key can be changed and revoked, a significant feature not possessed by other methods. The method can be modified to protect keys of increasing length, either by increasing the size of the bio-keys through performing more rotations/iterations of the bispectral transform or by changing the RS encoding scheme used. A standard 128-bit AES algorithm is used here, and the obtained cryptographic key is verified using that 128-bit AES algorithm.

LIST OF FIGURES

FIGURE NO. FIGURE NAME

3.1 Basic Algorithm

4.1.1 Input Image

4.1.4.1 Fast Fourier Transform

4.1.4.2 Removing first half of frequency spectrum

4.1.4.3 Shifting second half of frequency spectrum to first half

4.1.4.4 Normalization

4.1.4.5 Bispectrum

LIST OF TABLES

TABLE NO. TABLE NAME

2.9.1 Software & Hardware requirements

4.1.5 Entropy Calculation

LIST OF ACRONYMS

MATLAB Matrix Laboratory

AES Advanced Encryption Standard

PIN Personal Identification Number

FAR False Accept Rate

FRR False Reject Rate

FTE Failure To Enroll

CHAPTER 1

INTRODUCTION


1. INTRODUCTION

1.1 OBJECTIVE

The main objective of the project is to generate a biometric-based cryptographic

key from facial features. Existing asymmetric encryption algorithms require the storage

of the secret private key. Stored keys are often protected by poorly selected user

passwords that can either be guessed or obtained through brute force attacks. Combining

biometrics with cryptography is seen as a possible solution but any biometric

cryptosystem must be able to overcome small variations present between different

acquisitions of the same biometric in order to produce consistent keys. A new method is implemented which uses an entropy-based feature extraction process coupled with Reed-Solomon error-correcting codes to generate deterministic bit sequences from the output of an iterative one-way transform. The technique is evaluated using face data and is shown to

reliably produce keys of suitable length for 128-bit Advanced Encryption Standard

(AES).

1.2 BACKGROUND

Communications advancements in recent decades have led to an increased volume

of digital data traveling through publicly shared media. This has led to the rapid

development of cryptographic techniques such as AES and public key architectures.

Although keys of sufficient length are strong against both brute force and factorization

attacks they still suffer from weaknesses due to insecure key protection by user selected

passwords. The limitations of passwords are well documented they are simple and can be

easily guessed or obtained using social engineering techniques. They are often written


down and stored in an insecure location, can be shared between users, and cannot provide

a guarantee of non-repudiation. Furthermore, most people tend to use the same password

for a wide range of applications and as a result the compromise of one system leads to the

compromise of many others. In recent years researchers have turned towards merging

biometrics with cryptography as a means to improve overall security by eliminating the

need for key storage using passwords. During the last decade biometrics has become

commonly used for identifying individuals. The success of its application in user

authentication has indicated that many advantages could be gained by incorporating

biometrics with cryptography. A biometric is an inherent physical or behavioural

characteristic of an individual, such as their voice, face, fingerprint, or keystroke

dynamics.

Biometrics, in contrast to passwords, cannot be forgotten, are difficult to copy or

forge, are impossible to share, and offer more security than a common eight-character

password. The principal drawback of a biometric is that it is not precise like a password

or cryptographic key. Limitations of acquisition technology and the inherent changes in

the biometric (such as pose and expression for faces) and environmental conditions (such

as lighting) lead to variations in each sample of the same biometric. For example,

although an iris is considered to be the most accurate of biometrics, there can be up to

30% variation between two different images of the same iris. It is the primary challenge

of all biometric cryptosystems to overcome this variation whilst harnessing the

advantages of biometrics in order to improve the security of encryption keys. Another

challenge stems from the permanence of a biometric. Apart from physical damage,

fingerprints and irises remain largely unchanged throughout a person’s life. This is a desired


property in most applications of biometrics but in cryptography this is a weakness.

Cryptographic keys need to be (and they often are) revoked or changed both proactively

as a measure to increase security and reactively as response to key compromise. Most

proposed schemes ultimately come down to the protection of an existing cryptographic

key with biometric information. While the existing key can be changed the biometric

used to secure it cannot and this shortcoming is often neglected.

1.3 LITERATURE SURVEY

Cryptography is the study of mathematical techniques related to aspects of

information security such as confidentiality, data integrity, entity authentication, and data

origin authentication. Cryptography is not the only means of providing information

security, but rather one set of techniques [1].

Confidentiality is a service used to keep the content of information from all but

those authorized to have it. Secrecy is a term synonymous with confidentiality and

privacy. There are numerous approaches to providing confidentiality, ranging from

physical protection to mathematical algorithms which render data unintelligible.

Data integrity is a service which addresses the unauthorized alteration of data. To

assure data integrity, one must have the ability to detect data manipulation by

unauthorized parties. Data manipulation includes such things as insertion, deletion, and

substitution.

Authentication is a service related to identification. This function applies to both

entities and information itself. Two parties entering into a communication should identify

each other. Information delivered over a channel should be authenticated as to origin,

date of origin, data content, time sent, etc. For these reasons this aspect of cryptography


is usually subdivided into two major classes: entity authentication and data origin

authentication. Data origin authentication implicitly provides data integrity. Non-

repudiation is a service which prevents an entity from denying previous commitments or

actions. When disputes arise due to an entity denying that certain actions were taken, a

means to resolve the situation is necessary. For example, one entity may authorize the

purchase of property by another entity and later deny such authorization was granted. A

procedure involving a trusted third party is needed to resolve the dispute. A fundamental

goal of cryptography is to adequately address these four areas in both theory and practice.

Cryptography is about the prevention and detection of cheating and other malicious

activities.

The Internet and World Wide Web have become important parts of most people’s

lives. Users regularly conduct various transactions over the Web that involves personal

information. These transactions include, among others, on-line banking, use of E-health

services, and engaging in E-commerce [2]. The organizations with which these

transactions occur maintain personal information about users on their computers, as well

as a variety of other types of sensitive information crucial to the organizations’ success,

requiring that this information be secured and access restricted to authorized individuals.

Moreover, the organizations should have policies in place to ensure that the users’

privacy will be protected so that their personal information does not fall into the hands of

people for whom it is not intended. Many methods can be used to restrict access of

information to authorized users and personnel through a process of authentication and

authorization. The most widely used authentication method is the username–password

combination. For the username–password method to be effective, it is essential that users


generate and use strong passwords that are resistant to guessing and cracking. However,

there typically is a trade-off between password memorability and security. Passwords that

are easy to remember tend to be biographical information or simple words that can be

guessed or cracked by other individuals or computer programs. Many studies have shown

that users in fact tend to create passwords of these types that are easy to remember. For

example, Leyden (2003) reported that 12% of users used "password" as their password,

and the three most common types of passwords included a user’s own name, favourite

football team, or date of birth. Another problem with passwords is that people tend to use

the same password for multiple accounts. If the password for one account is exposed,

then the security of all the other accounts is jeopardized. The username–password method

provides less security than other authentication methods such as biometric devices, smart

cards, and token devices. However, for many Web sites that maintain personal

information, the username–password combination is, and will continue to be, the

primary method of identifying and authenticating users. The popularity of the method lies

in its being accepted widely by users and being easy to implement. Thus, use of

username–password combinations is not likely to be replaced any time soon because of

convenience and practicality considerations. Consequently, it is important to determine

methods for password generation that will yield passwords that provide adequate security

but are also memorable.

In traditional cryptosystems, user authentication is based on possession of secret

keys, which falls apart if the keys are not kept secret. Further, keys can be forgotten, lost,

or stolen and, thus, cannot provide non-repudiation. Current authentication systems based

on physiological and behavioural characteristics of persons, such as fingerprints,


inherently provide solutions to many of these problems and may replace the

authentication component of the traditional cryptosystems [3]. In this paper, we present

various methods that monolithically bind a cryptographic key with the biometric template

of a user stored in the database in such a way that the key cannot be revealed without a

successful biometric authentication. We assess the performance of one of these biometric

key binding/generation algorithms using the fingerprint biometric. We illustrate the

challenges involved in biometric key generation primarily due to drastic acquisition

variations in the representation of a biometric identifier and the imperfect nature of

biometric feature extraction and matching algorithms. We elaborate on the suitability of

these algorithms for the digital rights management systems.

The user authentication, which is an essential part of a DRM system, determines

whether a user is authorized to access the content. In a generic cryptographic system the

user authentication is possession based. That is, possession of the decrypting key is a

sufficient evidence to establish user authenticity. Because cryptographic keys are long

and random (e.g., 128 bits for the Advanced Encryption Standard (AES)), they are difficult

to memorize. As a result, the cryptographic keys are stored somewhere and released

based on some alternative authentication mechanism, that is, upon assuring that they are

being released to the authorized users only. Most passwords are so simple that they can

be easily guessed or broken by simple dictionary attacks. It is not surprising that the most

commonly used password is the word “password”! Thus, the multimedia protected by the

cryptographic algorithm is only as secure as the passwords (weakest link) used for user

authentication that release the correct decrypting key(s). Simple passwords are easy to

crack and, thus, compromise security; complex passwords are difficult to remember and,


thus, are expensive to maintain. Users also have the tendency to write down complex

passwords in easily accessible locations. Further, most people use the same password

across different applications and, thus, if a single password is compromised, it may open

many doors. Finally, passwords are unable to provide non-repudiation; that is, when a

password is shared with a friend, there is no way to know who the actual user is. This

may eliminate the feasibility of countermeasures such as holding conniving legitimate

users accountable in a court of law. Many of these limitations of the traditional passwords

can be ameliorated by incorporation of better methods of user authentication. Biometric

authentication refers to verifying individuals based on their physiological and behavioural

characteristics such as face, fingerprint, hand geometry, iris, keystroke, signature, voice,

etc.

It is inherently more reliable than password-based authentication, as biometric

characteristics cannot be lost or forgotten; they are extremely difficult to copy, share, and

distribute and require the person being authenticated to be present at the time and point of

authentication. It is difficult to forge biometric and it is unlikely for a user to repudiate

having accessed the digital content using biometrics. Finally, one user’s biometrics is no

easier to break than another’s; that is, all users have a relatively equal security level,

hence, there are not many users who have "easy to guess" biometrics that can be used to

mount an attack against them. Thus, biometrics-based authentication is a potential

candidate to replace password-based authentication, either by providing the complete

authentication mechanism or by securing the traditional cryptographic keys that contain

the multimedia file in a DRM system. In this paper, we attempt to present an analysis of

implications of the existing biometric technologies to the containment process. We


present a brief summary of biometric technology and dwell on the challenges involved in

incorporating the biometric technologies into the cryptographic systems. We review the

existing approaches for overcoming the challenges involved in designing biometrics-

based cryptographic systems along with their strengths and limitations. Using fingerprint

data, we present the limitations of the present approach to designing biometric

cryptosystems.

Finally, we summarize the advantages of biometric cryptosystems, challenges of

designing such systems and stipulate on some of the promising directions for further

research for a successful marriage of the biometric and cryptographic techniques. A

number of biometric characteristics have been in use in various applications. Each

biometric has its strengths and weaknesses, and the choice depends on the application.

No single biometric is expected to effectively meet all the requirements of all the

applications. In other words, no biometric is “optimal.” The match between a specific

biometric and an application is determined depending upon the requirements of the

application and the properties of the biometric characteristic. A brief comparison of some

of the biometric identifiers can be made based on seven factors. Universality, distinctiveness,

permanence, and collectability are properties of biometric identifiers. Performance,

acceptability, and circumvention are attributes of biometric systems. Use of many other

biometric characteristics such as retina, infrared images of face and body parts, gait,

odour, ear, and DNA in commercial authentication systems is also being investigated.

The following example illustrates how different biometric identifiers may be

appropriate in different scenarios. If one would like to provide "just-in-time" secure

access to the documents for “write/modify” operations to authorized users, e.g., brokers


bidding on commodity items using a keyboard, both for repudiability as well as security, the

most natural biometric for authenticating the bid document would be either keystroke

dynamics or having fingerprint sensors on each key of the keyboard. If the brokers were

bidding vocally, the bid voice segments could be authenticated using voice recognition. If

the application is intended for providing read-only access to a top secret “for your eyes

only” document, ideal authentication would be iris or retina recognition of the authorized

reader as she reads the document. Thus, depending upon the operational situation,

different biometric characteristics are suitable for different DRM applications.

A number of researchers have studied the interaction between biometrics and

cryptography, two potentially complementary security technologies. Biometrics is about

measuring unique personal features, such as a subject’s voice, fingerprint, or iris. It has

the potential to identify individuals with a high degree of assurance, thus providing a

foundation for trust. Cryptography, on the other hand, concerns itself with the projection

of trust: with taking trust from where it exists to where it is needed. A strong combination

of biometrics and cryptography might, for example, have the potential to link a user with

a digital signature she created with a high level of assurance [4]. For example, it will

become harder to use a stolen token to generate a signature, or for a user to falsely

repudiate a signature by claiming that the token was stolen when it was not. Previous

attempts in this direction include a signature-verification pen and associated signal

processor made available with the IBM Transaction Security System in 1989. One

problem with this approach is its complete reliance on hardware tamper-resistance: if the

token is broken, both the template and the key are lost. In many cases, attackers have

been able to break tokens, whether by hardware attacks exploiting chip-testing


technology, or (as with the IBM design) by API attacks on the token’s software. We

therefore set out to find a better way of combining biometrics, cryptography and tamper-

resistance. The main obstacle to algorithmic combination is that biometric data are noisy;

only an approximate match can be expected to a stored template. Cryptography, on the

other hand, requires that keys be exactly right, or protocols will fail. For that reason,

previous product offerings have been based on specific hardware devices. It would be

better to have a more general, protocol-level approach, combining cryptography and

biometrics. Yet another consideration is privacy. Many users may be reluctant to have

biometric data stored on central databases; and there may be less resistance to biometric

technology if users can be credibly assured that their templates are not stored centrally.

Other researchers have tried to map biometric data into a unique and repeatable binary

string. Subsequently, the binary string would be mapped to an encryption key by

referring to a look-up table, or by direct hashing. The potential of this approach is that storage

of a biometric template would not be needed. So far, however, these attempts have

suffered from several drawbacks, which we will now explain.

In this paper, we will use the term biometric key to refer to the repeatable string

derived from a user biometric. The hardest problem with biometrics is the unreliability of

individual bits in the template. Biometric measurements, being made of attributes of the

human body, are noisy by nature, while cryptography demands correctness in keys. There

have been a number of attempts to bridge the gap between the fuzziness of biometrics

and the exactitude of cryptography, by deriving biometric keys from keystroke patterns,

the human voice, handwritten signatures, fingerprints, and facial characteristics.

However, so far, these attempts have suffered from an excessive False Rejection Rate


(FRR) – usually over 20%, which is unacceptable for practical applications. Second,

many proposals have failed to consider security engineering aspects, of which the most

severe are the irrevocability of biometrics and their low level of secrecy. Biometric

features are inherent in individuals, so they cannot be changed easily. A related problem

is key diversity: an individual may wish separate keys for their bank account and for

access to their workplace computer, so that they can revoke one without affecting the

other. Third, biometric data are not very secret. People leave (poor-quality) fingerprints

everywhere, and iris images may be captured by a hidden camera. Generally speaking,

the more a biometric is used, the less secret it will be. It would be imprudent to rely on a

biometric alone, especially if that biometric became used on a global scale. One might

expect Mafia-owned businesses to collect biometric data in large quantities if there was

any potential exploit path. Fourth, social acceptance is crucially important to the success

of biometric technology. The fear of potential misuse of biometric data may make the

public reluctant to use systems that depend on it, and this could be especially the case if

there is a large central database of biometric data which focuses privacy worries and

acts as a target of privacy activists. There may be a fear that personal health information

will leak out via biometric data.

Biometric authentication is typically performed by a sophisticated software

application, which manages the user interface and database, and interacts with a vendor

specific, proprietary biometric algorithm. Algorithms undertake the following processing

steps: 1) acquisition of a biometric sample image, 2) conversion of the sample image to a

biometric template, 3) comparison of the new (or "live") template to previously stored

templates, to calculate a match (or similarity) score. High match scores indicate a


likelihood that the corresponding images are from the same individual. The template is a

(typically vendor specific) compact digital representation of the essential features of the

sample image. Biometric algorithm vendors have uniformly claimed that it is impossible

or infeasible to recreate the image from the templates. These claims are supported by: 1)

the template records features (such as fingerprint minutiae) and not image primitives, 2)

templates are typically calculated using only a small portion of the image, 3) templates

are small (a few hundred bytes), much smaller than the sample image, and 4) the

proprietary nature of the storage format makes templates infeasible to "hack". For these

reasons, biometric templates are considered to be effectively non-identifiable data, much

like a password hash. Because biometric data is considered to be non-identifiable, it can

be managed in ways that the source images cannot. For example, templates stored on

government identification documents are exchanged between governments. This type of

exchange typically requires approval by a national privacy commissioner, and the

primary justification is typically the assumed non-identifiable nature of biometric templates. This assumed non-identifiability is also used to allay concerns expressed by

citizens and employees that their fingerprint, face, and iris images may be accessed from

their storage on identification cards.

1.4 PROBLEM DEFINITION

Cryptographic key generation is the central scenario in our algorithm. The Radon transform is used to convert the 2-D image into one-dimensional projections. The projections thus obtained are normalized and discrete Fourier transformed twice in order to obtain the bispectrum. After the first Fourier transform, the first half of the frequency spectrum is removed, the second half is shifted into the first half, and the vacated second half is zero-padded; the resulting output is then integrated. The transform is iterated in order to increase the bit length of the code. The magnitude and angle matrices are binarised, and feature extraction is performed on these binarised values using entropy. The key thus generated is the bio-key. A random key is generated and encoded with the Reed-Solomon encoding process; this encoded key is the cryptographic key, and it is combined with the bio-key to form the look-up table. The cryptographic key must be usable with a standard encryption algorithm, so the key generated here is verified with the standard AES algorithm.
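The procedure above can be sketched in MATLAB, the project's implementation language. This is a minimal illustration, not the project code: the file name, angle set, and binarisation rules are assumptions; the integration, entropy selection, RS encoding, and look-up table stages are only indicated; and radon requires the Image Processing Toolbox.

    % Minimal sketch of the key-generation pipeline (illustrative assumptions).
    I = im2double(imread('face.png'));    % registered input face image (grayscale assumed)
    R = radon(I, 0:179);                  % Radon transform: 2-D image -> 1-D projections

    bits = [];
    for k = 1:size(R, 2)
        v = R(:, k) - mean(R(:, k));      % remove the mean ...
        v = v / max(abs(v));              % ... and normalise each projection
        F = fft(v);                       % first DFT
        n = numel(F);
        F = [F(floor(n/2)+1:end); zeros(floor(n/2), 1)];  % drop 1st half, shift, zero-pad
        B = fft(F);                       % second DFT -> bispectrum values
        bits = [bits; abs(B) > mean(abs(B)); angle(B) > 0];  % binarise magnitude/angle
    end
    % 'bits' feeds the entropy-based feature selection; a random key encoded with
    % Reed-Solomon would then be combined with the bio-key in a look-up table.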

1.5 VASTNESS OF THE PROBLEM

Although cryptographic key generation using biometrics appears to be a simple task, it is non-trivial for computers. It would be a trivial problem if the biometric of the human were enriched with additional information such as the distances between the eyes, nose, mouth, etc. Possibly no algorithm for cryptographic key generation using biometrics will ever be able to detect all the variations in the biometric of a human, unless it is provided with powerful artificial intelligence.

1.6 STATEMENT OF SCOPE

The success of cryptographic key generation using a biometric ultimately depends on how much of the obtained precision is useful for discrimination. The scope of this project is broad: because passwords are frequently insecure, biometrics plays a key role in securing authentication, since a biometric can neither be copied nor shared by other users. It has a wide range of applications, especially login authentication, where the authorized person must be present at the time of the login process.

1.7 SOFTWARE CONTEXT

The design and implementation of cryptographic key generation using a biometric provides support for the next generation of information processing. The cryptographic key generation using the biometric is performed using MATLAB. The image inputs were taken by cameras.

1.8 MAJOR CONSTRAINTS

This is a very challenging task, especially in adverse circumstances: cryptographic key generation algorithms using biometrics are in general very sensitive to lighting conditions (especially illumination) and image quality; in such cases, cryptographic key generation using a biometric is not an easy task to accomplish. The input object should be captured without any shadows or other noise.


CHAPTER 2

BACKGROUND THEORY


2. BACKGROUND THEORY

2.1 BIOMETRICS

“Biometric" comes from the Greek words "bio" (life) and "metric" (to measure). Biometrics refers to technologies used for measuring and analyzing a person's unique

characteristics. There are two types of biometrics: behavioural and physical. Behavioural

biometrics is generally used for verification while physical biometrics can be used for

either identification or verification. Biometrics is used for identification and verification:

Identification is determining who a person is. It involves trying to find a match for

a person's biometric data in a database containing records of people and that

characteristic. This method requires time and a large amount of processing power,

especially if the database is very large.

Verification is determining whether a person is who they say they are. It involves

comparing a user's biometric data to the previously recorded data for that person to

ensure that this is the same person. This method requires less processing power and time,

and is used for access control (to buildings or data). The main physical biometric

technologies include:

fingerprint

iris

retina

hand

palm vein

face


There are also a number of behavioural biometric technologies such as voice

recognition (analyzing a speaker's vocal behaviour), keystroke (measuring the time

spacing of typed words), gait recognition (manner of walking), or signature.

Biometric devices normally consist of 3 elements:

A scanner / reader that captures the user's biometrics characteristics.

A piece of software that converts this data into digital form and compares it with

data previously recorded.

A database, which stores the biometric data.

The process comprises 4 main steps: sample capture, feature extraction, template

comparison, and matching. At enrolment, a person's biometrics is captured by the

scanner. The software converts the biometric input into a template and identifies specific

points of data as "match points". The match points are processed using an algorithm into

a value that can be compared with biometric data in the database.

Unique: The various biometrics systems have been developed around unique

characteristics of individuals. The probability of 2 people sharing the same

biometric data is virtually nil.

Cannot be shared: Because a biometric property is an intrinsic property of an

individual, it is extremely difficult to duplicate or share.

Cannot be copied: Biometric characteristics are nearly impossible to forge or

spoof, especially with new technologies ensuring that the biometric being

identified is from a live person.


Cannot be lost: A biometric property of an individual can be lost only in case of

serious accident.

Reliable user authentication is essential. The consequences of insecure

authentication in a banking or corporate environment can be catastrophic, with loss of

confidential information, money, and compromised data integrity. Many applications in

everyday life also require user authentication, including physical access control to offices

or buildings, e-commerce, healthcare, immigration and border control, etc.

Currently, the prevailing techniques of user authentication are linked to

passwords, user IDs, identification cards and PINs (personal identification numbers).

These techniques suffer from several limitations: Passwords and PINs can be guessed,

stolen or illicitly acquired by covert observation.

In addition, there is no way to positively link the usage of the system or service to

the actual user. A password can be shared, and there is no way for the system to know

who the actual user is. A credit card transaction can only validate the credit card number

and the PIN, not if the transaction is conducted by the rightful owner of the credit card.

This is where biometrics systems provide a more accurate and reliable user authentication

method, as summarized below:

Existing user authentication techniques include:

Something you know, e.g. a password or PIN. The issue is that many passwords are

easy to guess, and can also be easily forgotten.

Something you have, e.g. a key or card. They can be lost, stolen or duplicated.


Something you know and have, e.g. card + PIN.

Something you are, e.g. fingerprint, hand, iris, retina, voice. You cannot lose them; they are unique to each individual and difficult to forge.

There are two basic types of recognition errors: the false accept rate (FAR) and

the false reject rate (FRR). A False Accept is when a non-matching pair of biometric data

is wrongly accepted as a match by the system. A False Reject is when a matching pair of

biometric data is wrongly rejected by the system. The two errors are complementary:

When you try to lower one of the errors by varying the threshold, the other error rate

automatically increases. There is therefore a balance to be found, with a decision

threshold that can be specified to either reduce the risk of FAR, or to reduce the risk of

FRR.

In a biometric authentication system, the relative false accept and false reject rates

can be set by choosing a particular operating point (i.e., a detection threshold). Very low

(close to zero) error rates for both errors (FAR and FRR) at the same time are not

possible. By setting a high threshold, the FAR error can be close to zero, and similarly by

setting a significantly low threshold, the FRR rate can be close to zero. A meaningful

operating point for the threshold is decided based on the application requirements, and

the FAR versus FRR error rates at that operating point may be quite different. To provide

high security, biometric systems operate at a low FAR instead of the commonly

recommended equal error rate (EER) operating point where FAR = FRR.
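As an illustration of this threshold trade-off, the MATLAB sketch below sweeps a decision threshold over synthetic genuine/impostor score distributions (the score model is an assumption for demonstration, not project data) and locates the approximate EER point:

    % FAR/FRR trade-off for a decision threshold (synthetic similarity scores).
    genuine  = 0.70 + 0.10*randn(1000, 1);        % matching-pair scores
    impostor = 0.40 + 0.10*randn(1000, 1);        % non-matching-pair scores

    t = 0:0.01:1;                                 % candidate thresholds
    FAR = arrayfun(@(x) mean(impostor >= x), t);  % impostors wrongly accepted
    FRR = arrayfun(@(x) mean(genuine  <  x), t);  % genuine users wrongly rejected

    [gap, i] = min(abs(FAR - FRR));               % approximate EER operating point
    fprintf('EER ~ %.3f at threshold %.2f\n', (FAR(i) + FRR(i))/2, t(i));

Raising the threshold above t(i) drives FAR towards zero at the cost of FRR, which is the high-security operating regime described above.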

Paradoxically, the greatest strength of biometrics is at the same time its greatest

liability. It is the fact that an individual's biometric data does not change over time: the


patterns in your iris, retina, or palm veins remain the same throughout your life.

Unfortunately, this means that should a set of biometric data be compromised, it is

compromised forever. The user only has a limited number of biometric features (one

face, two hands, ten fingers, two eyes). For authentication systems based on physical

tokens such as keys and badges, a compromised token can be easily cancelled and the

user can be assigned a new token. Similarly, user IDs and passwords can be changed as

often as required. But if the biometric data are compromised, the user may quickly run

out of biometric features to be used for authentication.

There are seven main areas where attacks may occur in a biometric system:

Presenting fake biometrics or a copy at the sensor, for instance a fake finger or a

face mask. It is also possible to resubmit previously stored digitized biometric signals such as a copy of a fingerprint image or a voice recording.

Producing feature sets preselected by the intruder by overriding the feature

extraction process.

Tampering with the biometric feature representation: The features extracted

from the input signal are replaced with a fraudulent feature set.

Attacking the channel between the stored templates and the matcher: The stored

templates are sent to the matcher through a communication channel. The data

traveling through this channel could be intercepted and modified; there is a real

danger if the biometric feature set is transmitted over the Internet.

Corrupting the matcher: The matcher is attacked and corrupted so that it

produces pre-selected match scores.


Tampering with stored templates, either locally or remotely.

Overriding the match result.

2.2 FINGERPRINT BIOMETRICS

2.2.1 Principles of fingerprint biometrics

A fingerprint is made of a number of ridges and valleys on the surface of the

finger. Ridges are the upper skin layer segments of the finger and valleys are

the lower segments. The ridges form so-called minutia points: ridge endings

(where a ridge ends) and ridge bifurcations (where a ridge splits in two). Many

types of minutiae exist, including dots (very small ridges), islands (ridges

slightly longer than dots, occupying a middle space between two temporarily

divergent ridges), ponds or lakes (empty spaces between two temporarily

divergent ridges), spurs (a notch protruding from a ridge), bridges (small

ridges joining two longer adjacent ridges), and crossovers (two ridges which

cross each other).

The uniqueness of a fingerprint can be determined by the pattern of ridges and

furrows as well as the minutiae points. There are five basic fingerprint

patterns: arch, tented arch, left loop, right loop and whorl. Loops make up

60% of all fingerprints, whorls account for 30%, and arches for 10%.

Fingerprints are usually considered to be unique, with no two fingers having

the exact same dermal ridge characteristics.

The main technologies used to capture the fingerprint image with sufficient detail

are optical, silicon, and ultrasound.


There are two main algorithm families to recognize fingerprints:

Minutia matching compares specific details within the fingerprint ridges. At

registration, the minutia points are located, together with their relative positions

to each other and their directions. At the matching stage, the fingerprint image is

processed to extract its minutia points, which are then compared with the

registered template.

Pattern matching compares the overall characteristics of the fingerprints, not

only individual points. Fingerprint characteristics can include sub-areas of

certain interest including ridge thickness, curvature, or density. During

enrolment, small sections of the fingerprint and their relative distances are

extracted from the fingerprint. Areas of interest are the area around a minutia

point, areas with low curvature radius, and areas with unusual combinations of

ridges.

2.2.2 Issues with fingerprint systems

The tip of the finger is a small area from which to take measurements, and ridge

patterns can be affected by cuts, dirt, or even wear and tear. Acquiring high-quality

images of distinctive fingerprint ridges and minutiae is a complicated task. People with no

or few minutia points (surgeons as they often wash their hands with strong detergents,

builders, and people with special skin conditions) cannot enroll or use the system. The

number of minutia points can be a limiting factor for security of the algorithm. Results

can also be confused by false minutia points (areas of obfuscation that appear due to low-

quality enrolment, imaging, or fingerprint ridge detail).


Note: There is some controversy over the uniqueness of fingerprints. The quality of

partial prints is however the limiting factor. As the number of defining points of the

fingerprint becomes smaller, the degree of certainty of identity declines. There have been

a few well-documented cases of people being wrongly accused on the basis of partial

fingerprints.

2.2.3 Benefits of fingerprint biometric systems

Easy to use

Cheap

Small size

Low power

Non-intrusive

Large database already available

2.2.4 Applications of fingerprint biometrics

Fingerprint sensors are best for devices such as cell phones, USB flash drives,

notebook computers and other applications where price, size and low power are key

requirements. Fingerprint biometric systems are also used for law enforcement,

background searches to screen job applicants, healthcare and welfare.

2.3 HAND BIOMETRICS

2.3.1 Principles of hand biometrics


An individual's hand does not significantly change after a certain age. Unlike

fingerprints, the human hand isn't unique. Individual hand features are not descriptive

enough for identification. However, hand biometric recognition systems are accurate for

verification purposes when combining various individual features and measurements of

fingers and hands.

2.3.2 How does hand biometrics work

Biometric hand recognition systems measure and analyze the overall structure,

shape and proportions of the hand, e.g. length, width and thickness of hand, fingers and

joints; characteristics of the skin surface such as creases and ridges. Some hand geometry

biometrics systems measure up to 90 parameters.

As hand biometrics rely on hand and finger geometry, the system will also work

with dirty hands. The only limitation is for people with severe arthritis who cannot

spread their hands on the reader.

The user places the palm of his or her hand on the reader's surface and aligns his

or her hand with the guidance pegs which indicate the proper location of the fingers. The

device checks its database for verification of the user. The process normally takes only a

few seconds.

To enroll, the user places his or her hand palm down on the reader's surface. To

prevent a mold or a cast of the hand from being used, some hand biometric systems will

require the user to move their fingers. Also, hand thermography can be used to record the

heat of the hand, or skin conductivity can be measured.


2.3.3 Benefits of hand biometric systems

Easy to use

Non intrusive

Small amount of data required to uniquely identify a user, so a large number of

templates can be easily stored in a standalone device: Hand biometric systems

will generally only require a template size of 10 bytes, which is much smaller

than most other biometric technologies (fingerprint systems require 250 to 1,000

bytes and voice biometric systems require 1,500 to 3,000 bytes)

Low FTE (failure to enroll) rates

2.3.4 Weaknesses of hand biometric systems

Lack of accuracy, so it can only be used for verification

Size of the scanner

Fairly expensive, compared with fingerprint systems

Injuries to hands are fairly common and would prevent the hand biometric

system from working properly

2.3.5 Applications of hand biometrics

Hand biometric systems are currently among the most widely used biometric

technologies.


Time and attendance

Access to restricted areas and buildings: Hand biometric systems are currently

used in apartment buildings, offices, airports, day care centers, welfare agencies,

hospitals, and immigration facilities.

2.4 IRIS BIOMETRICS

2.4.1 Principles of iris biometrics

The iris is the elastic, pigmented, connective tissue that controls the pupil. The iris

is formed in early life in a process called morphogenesis. Once fully formed, the texture

is stable throughout life. It is the only internal human organ visible from the outside and

is protected by the cornea. The iris of the eye has a unique pattern, from eye to eye and

person to person.

2.4.2 How does iris biometrics work?

An iris scan will analyze over 200 points of the iris, such as rings, furrows,

freckles, and the corona, and will compare it with a previously recorded template. Glasses, contact lenses, and even eye surgery do not change the characteristics of the iris.

To prevent an image / photo of the iris from being used instead of a real "live"

eye, iris scanning systems will vary the light and check that the pupil dilates or contracts.

2.4.3 Benefits of iris biometric systems

Highly accurate: There is no known case of a false acceptance for iris

recognition


Not intrusive and hygienic - no physical contact required

2.4.4 Weaknesses of iris biometric systems

The user must hold still while the scan is taking place

2.4.5 Applications of iris biometrics

Applications include: Identity cards and passports, border control and other

Government programs, prison security, database access and computer login, hospital

security, schools, aviation security, controlling access to restricted areas, buildings and

homes.

2.5 FACE BIOMETRICS

2.5.1 Principles of face biometrics

The dimensions, proportions and physical attributes of a person's face are unique.

2.5.2 How does face biometrics work

Biometric facial recognition systems will measure and analyze the overall

structure, shape and proportions of the face: Distance between the eyes, nose, mouth, and

jaw edges; upper outlines of the eye sockets, the sides of the mouth, the location of the

nose and eyes, the area surrounding the cheekbones.

At enrolment, several pictures are taken of the user's face, with slightly different

angles and facial expressions, to allow for more accurate matching. For verification and


identification, the user stands in front of the camera for a few seconds, and the scan is

compared with the template previously recorded.

To prevent an image / photo of the face or a mask from being used, face biometric

systems will require the user to smile, blink, or nod their head. Also, facial thermography

can be used to record the heat of the face (which won't be affected by a mask).

2.5.3 Benefits of face biometric systems

Not intrusive, can be done from a distance, even without the user being aware of

it (for instance when scanning the entrance to a bank or a high security area).

2.5.4 Weaknesses of face biometric systems

Face biometric systems are more suited for authentication than for identification

purposes, as it is easy to change the proportion of one's face by wearing a mask,

a nose extension, etc.

User perceptions / civil liberties: Many people are uncomfortable with having their

picture taken.

The main facial recognition methods are: feature analysis, neural networks, eigenfaces, and automatic face processing.

2.6 CRYPTOGRAPHIC KEY


Cryptography is often seen as a 'black art': something others understand but you never will. This of course need not be the case at all. Yes, there are some complex concepts to embrace, but a basic understanding need not be a trial.

If the confidentiality or accuracy of your information is of any value at all, it

should be protected to an appropriate level.

If the unauthorized disclosure or alteration of the information could result in any

negative impact, it should be secured.

These are simple and widely accepted facts. However, the means to achieve the

requisite protection are usually far from obvious.

A number of mechanisms are commonly employed:

Controlling access to the computer system or media. For instance, through 'logon'

authentication (eg: via passwords).

Employing an access control mechanism (such as profiling)

Restricting physical access (eg: keeping media locked away or preventing access

to the computer itself).

All these approaches can be valuable and effective, but equally all can have

serious shortcomings. A more fundamental approach to data security is

cryptography.

Conventional access control mechanisms can often be bypassed (for instance via

hacking). In addition, what if data has to be transmitted, or if the data media (eg:

floppy disk) has to be moved outside the secure environment? What if a number

of people are sharing the computer environment? Cryptography (encryption and


decryption) is a technique designed to protect your information in ALL such

situations.

Encryption is the science of changing data so that it is unrecognizable and useless

to an unauthorized person. Decryption is changing it back to its original form.

The most secure techniques use a mathematical algorithm and a variable value

known as a 'key'.

The selected key (often any random character string) is input on encryption and is

integral to the changing of the data. The EXACT same key MUST be input to enable

decryption of the data.

This is the basis of the protection: if the key (sometimes called a password) is known only by the authorized individual(s), the data cannot be exposed to other parties. Only

those who know the key can decrypt it. This is known as 'private key' cryptography,

which is the most well known form.
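A toy MATLAB demonstration of this point (an XOR keystream stand-in, emphatically not a secure cipher) shows that only the exact key that encrypted the data can recover it:

    % Toy private-key illustration: the EXACT same key must be supplied to decrypt.
    plaintext = uint8('attack at dawn');
    key       = uint8('my secret key!');              % same length for simplicity

    cipher    = bitxor(plaintext, key);               % "encryption"
    recovered = bitxor(cipher, key);                  % decryption with the same key
    disp(char(recovered))                             % -> attack at dawn

    wrong = bitxor(cipher, uint8('wrong key!!!!!'));  % a different key yields garbage
    disp(char(wrong))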

2.6.1 Key Management

As the entire operation is dependent upon the security of the keys, it is sometimes

appropriate to devise a fairly complex mechanism to manage them.

Where a single individual is involved, often direct input of a value or string will

suffice. The 'memorized' value will then be re-input to retrieve the data, similar to

password usage.

Sometimes, many individuals are involved, with a requirement for unique keys to

be sent to each for retrieval/decryption of transmitted data. In this case, the keys


themselves may be encrypted. A number of comprehensive and proven key management

systems are available for these situations.

The two components required to encrypt data are an algorithm and a key. The

algorithm is generally known and the key is kept secret.

The key is a very large number that should be impossible to guess, and of a size

that makes exhaustive search impractical.

In a symmetric cryptosystem, the same key is used for encryption and decryption.

In an asymmetric cryptosystem, the key used for decryption is different from the key

used for encryption.

In an asymmetric system the encryption and decryption keys are different but

related. The encryption key is known as the public key and the decryption key is known

as the private key. The public and private keys are known as a key pair.

Where a certification authority is used, remember that it is the public key that is

certified and not the private key. This may seem obvious, but it is not unknown for a user

to insist on having his private key certified!

Keys should whenever possible be distributed by electronic means, enciphered

under previously established higher-level keys. There comes a point, of course, when no

higher-level key exists and it is necessary to establish the key manually.

A common way of doing this is to split the key into several parts (components)

and entrust the parts to a number of key management personnel. The idea is that none of

the key parts should contain enough information to reveal anything about the key itself.


Usually, the key is combined by means of the exclusive-OR operation within a

secure environment. In the case of DES keys, there should be an odd number of

components, each component having odd parity. Odd parity is preserved when all the

components are combined. Further, each component should be accompanied by a key

check value to guard against keying errors when the component is entered into the

system.

A key check value for the combined components should also be available as a

final check when the last component is entered.

A problem that occurs with depressing regularity in the real world is when it is

necessary to re-enter a key from its components. This is always an emergency situation,

and it is usually found that one or more of the key component holders cannot be found.

For this reason it is prudent to arrange matters so that the components are distributed

among the key holders in such a way that not all of them need to be present.
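A minimal MATLAB sketch of this component scheme follows; the key length and component count are illustrative, and the parity bits and per-component check values described above are omitted for brevity:

    % Key split into components: the key is the XOR of all components, so no
    % single component reveals anything about the key itself.
    key = uint8([18 52 86 120 154 188 222 240]);    % example 64-bit key

    c1 = uint8(randi(256, 1, 8) - 1);               % two random components ...
    c2 = uint8(randi(256, 1, 8) - 1);
    c3 = bitxor(bitxor(key, c1), c2);               % ... third chosen so XOR = key

    recombined = bitxor(bitxor(c1, c2), c3);        % combine inside secure environment
    isequal(recombined, key)                        % -> 1 (true)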

2.7 ADVANCED ENCRYPTION STANDARD (AES)

The Advanced Encryption Standard (AES) specifies a FIPS-approved

cryptographic algorithm that can be used to protect electronic data. The AES algorithm is a symmetric block cipher that can encrypt (encipher) and decrypt (decipher) information.

Encryption converts data to an unintelligible form called cipher text; decrypting the

cipher text converts the data back into its original form, called plaintext. The AES

algorithm is capable of using cryptographic keys of 128, 192, and 256 bits to encrypt and

decrypt data in blocks of 128 bits.
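MATLAB itself has no built-in AES, but the Java runtime it ships with does. The following hedged sketch uses javax.crypto to run a 128-bit AES round trip; the default mode/padding and the sample message are assumptions for illustration:

    % 128-bit AES round trip via the JVM bundled with MATLAB (javax.crypto).
    key  = int8(randi(256, 1, 16) - 129);            % 16 random bytes = 128-bit key
    spec = javax.crypto.spec.SecretKeySpec(key, 'AES');

    c = javax.crypto.Cipher.getInstance('AES');      % provider-default mode/padding
    c.init(javax.crypto.Cipher.ENCRYPT_MODE, spec);
    ct = c.doFinal(int8('sixteen byte msg'));        % ciphertext as int8 bytes

    c.init(javax.crypto.Cipher.DECRYPT_MODE, spec);
    disp(char(c.doFinal(ct))')                       % -> sixteen byte msg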

2.8 FEASIBILITY STUDY


The analyst conducts a study to evaluate the likely usability of the system to the organization. The feasibility team ought to carry the initial architecture and design of the high-risk requirements to the point at which it can answer questions such as whether the requirements pose risks that would likely make the project infeasible. They have to check whether the defects can be reduced to a level matching the application's needs. The analyst has to be sure that the organization has the resources needed, or that the requirements can be negotiated. The following feasibility techniques have been used in this project:

Economic Feasibility

Technical Feasibility

Economic Feasibility:

A system that can be developed technically and that will be used if installed

must still be a good investment for the organization. Financial benefits must exceed or

equal the cost. The analyst estimates the cost of building the software, the cost of the

hardware and software, the benefits in the form of reduced costs or fewer costly errors.

This project is designed in such a way that the financial benefits exceed or equal the costs. It has been decided to develop the system at reasonable cost. The system

is sure to be a good investment for the organization, since it requires minimal cost for

implementing.

Technical Feasibility:


The technical issues raised during the feasibility stage include whether the

necessary technology suggested for the project exists. The proposed equipment should

have the technical capacity to hold the data required to use the new system.

The technical guarantees of reliability, accuracy, ease of access and data security

should be provided by the system. The system must have the capability of being

expanded in future.

The proposed system is designed to operate under minimum technical

requirements. There is a wide range of future enhancement that can be implemented in

the system. The tools used are highly reliable and guarantee ease of access, data security

and accuracy. The proposal is technically feasible.

Justification:

The proposed system has been developed with an eye to future developments. It

can be easily updated with new modules when required. The project has the minimal

software and hardware requirements making it technically feasible. It can also be

implemented in the organization with the guarantee of satisfying the user needs. The

project is highly operable. The above details specify that the project is feasible in all

situations.

2.9 SOFTWARE/HARDWARE REQUIREMENTS


Table 2.9.1 software & hardware requirements

1 MATLAB 7.8.0

2 512 MB RAM, 1.7 GHz Processor

3 10 GB HDD

4 WINDOWS XP & ABOVE

5 INTEL PENTIUM P-IV Processor


CHAPTER 3

SYSTEM ARCHITECTURE

3. SYSTEM ARCHITECTURE

3.1 BASIC ALGORITHM

[Fig. 3.1 Basic Algorithm (flowchart): Radon transform (convert 2-D image to 1-D projections) -> normalize the 1-D projections -> apply DFT -> Fourier transform applied again to get the bispectrum -> iterative transform -> binarisation of angle and magnitude -> binary feature extraction -> Reed-Solomon (RS) code -> cryptographic key generation -> look-up table creation]

Fig. 3.1 Basic Algorithm

3.2 RADON TRANSFORMS

The Radon transform in two dimensions is the integral transform consisting of the integral of a function of a 2-vector over straight lines; it is closely related to the Fourier transform. The transform employed by the system is an iterative, chaotic, bispectrum one-way transform that accepts a one-dimensional vector input and is used to produce a magnitude and angle pair per iteration. The transform incorporates similarity-transformation invariance and shape sensitivity by design. This output can be converted to binary to form a very large bit matrix. The transform requires 1-D input vectors, so the Radon transform is used to convert the 2-D image into a set of 1-D projections; the resulting 1-D vector produced at each rotation angle is fed into the bispectrum transform.

3.3 NORMALIZATION

Normalization is any process that makes something more normal, which typically means conforming to some regularity or rule, or returning from some state of abnormality. The vector is first normalized by the magnitude of the largest vector element; the mean is also removed. In image processing, normalization is a process that changes the range of pixel intensity values. Applications include photographs with poor contrast due to glare, for example. Normalization is sometimes called contrast stretching; in more general fields of data processing, such as digital signal processing, it is referred to as dynamic range expansion.

The purpose of dynamic range expansion in the various applications is usually to bring the image, or other type of signal, into a range that is more familiar or normal to the senses, hence the term normalization. Often, the motivation is to achieve consistency in dynamic range for a set of data, signals, or images, so as to avoid mental distraction or fatigue. For example, a newspaper will strive to make all of the images in an issue share a similar range of gray scale. Normalization is a linear process. If the intensity range of the image is 50 to 180 and the desired range is 0 to 255, the process entails subtracting 50 from each pixel's intensity, making the range 0 to 130, and then multiplying each pixel's intensity by 255/130, making the range 0 to 255. Auto-normalization in image processing software typically normalizes to the full dynamic range of the number system specified in the image file format. The normalization process will produce iris regions which have the same constant dimensions, so that two photographs of the same iris under different conditions will have characteristic features at the same spatial location.
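As a worked illustration of this linear stretch, the following MATLAB sketch maps an image whose observed intensities span 50 to 180 onto the full 0-255 range (a minimal sketch, not the project code; the file name and variable names are illustrative):

I = double(rgb2gray(imread('face.jpg')));  % read an RGB image, convert to grayscale doubles
lo = min(I(:));                            % observed minimum intensity, e.g. 50
hi = max(I(:));                            % observed maximum intensity, e.g. 180
J = (I - lo) * (255 / (hi - lo));          % subtract 50, then scale by 255/130
J = uint8(J);                              % result now spans the full 0..255 range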

3.4 FOURIER TRANSFORM

The Fourier transform is an operation that transforms one complex-valued function of a real variable into another; it describes which frequencies are present in the original function. The Fourier transform is an important image processing tool which is used to decompose an image into its sine and cosine components. The output of the transformation represents the image in the Fourier or frequency domain, while the input image is the spatial-domain equivalent. In the Fourier-domain image, each point represents a particular frequency contained in the spatial-domain image. The DFT is the sampled Fourier transform and therefore does not contain all frequencies forming an image, but only a set of samples which is large enough to fully describe the spatial-domain image. The number of frequencies corresponds to the number of pixels in the spatial-domain image, i.e. the image in the spatial and Fourier domains is of the same size.
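A minimal MATLAB sketch of this decomposition (illustrative only; img stands for any grayscale image matrix):

F = fft2(double(img));   % 2-D DFT: output is the same size as the input image
Fc = fftshift(F);        % move the zero-frequency component to the centre
mag = abs(Fc);           % strength of each spatial frequency
ph = angle(Fc);          % phase of each spatial frequency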

3.5 BINARISATION


Binarisation is a method of binarizing an image by extracting lightness as a feature from the image. When a pixel is selected in the image, a sensitivity value is added to and/or subtracted from the Y (luminance) value of the selected pixel to set a threshold range. When another pixel is selected, the sensitivity is again added to or subtracted from its Y value, and a new threshold range is set containing both the calculation result and the previously set range. Every pixel in the image whose Y value lies within the threshold range is then extracted as having the same brightness as the selected pixel, and the extraction result is displayed.
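A simpler fixed-threshold variant of this idea in MATLAB (a sketch using Otsu's method, not the exact sensitivity-window scheme described above):

Y = rgb2gray(imread('face.jpg'));  % luminance of an example input image
t = graythresh(Y);                 % Otsu threshold, normalised to [0, 1]
B = im2bw(Y, t);                   % pixels above the threshold become 1, others 0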

3.6 BINARY FEATURE EXTRACTION

When the input data to an algorithm is too large to be processed and is suspected to be highly redundant, the input data is transformed into a reduced representation set of features. Transforming the input data into a set of features is called feature extraction. If the features are carefully chosen, it is expected that the feature set will extract the relevant information from the input data, so that the desired task can be performed using this reduced representation instead of the full-size input. Feature extraction can be used in the area of image processing, which involves using algorithms to detect and isolate various desired portions or shapes of a digitized image.


CHAPTER 4

IMPLEMENTATION

4. IMPLEMENTATION


4.1 DETAILED PROCEDURE

The biometric has been extracted from various types of inputs from around five persons, and the output was verified. The step-by-step procedure of the algorithm is explained in detail in the following steps.

4.1.1 Input image registration

The still image of a person's face for which the biometric key should be generated was taken as the input image. This input image is used as the input for the Radon transform, the bispectrum, the entropy calculation, the generation of the cryptographic key, the Reed-Solomon encoding and the AES algorithm. Some of the inputs which we tried were as follows.

Fig.4.1.1 Input Image

4.1.2 Radon transform


The Radon transform in two dimensions is the integral transform consisting of the integral of a function over straight lines; it is closely related to the Fourier transform. In our algorithm, the Radon transform was used to convert the 2-D image into one-dimensional projections. The procedure for the Radon transform is as follows:

1. The registered input image is a 2-D image; the projections are basically the total sum of the pixel values of the image at each angle from zero degrees to ninety degrees. The projections obtained when image 1 was taken as input are as follows:

3032974 3033017 3033005 3033006 3032982 3033000 3033017 3032956 3032930

3032964 3033009 3032991 3033075 3033062 3032968 3033002 3032992 3032992

3032989 3032977 3033166 3033030 3033027 3032974 3032920 3033224 3032895

3032991 3032868 3033086 3032945 3032977 3033063 3033028 3033066 3032981

3032931 3033064 3033004 3033054 3032917 3033024 3032685 3033315 3035292

3032895 3033004 3032956 3033052 3032916 3033049 3032952 3032875 3033034

3032988 3033053 3033030 3033058 3033041 3032996 3033073 3032971 3032897

3033095 3033009 3032960 3033010 3032967 3032974 3032998 3033082 3033091

3033027 3032871 3033078 3033169 3032973 3032947 3033064 3033005 3033022

3032958 3033008 3032984 3032962 3033003 3032980 3033001 3032986 3032959

These are the ninety projections produced by the Radon transform for input image 1. These outputs are normalized, and the fast Fourier transform of the values is then computed.
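A minimal MATLAB sketch of this step (illustrative; it assumes the Image Processing Toolbox and an example file name):

I = double(rgb2gray(imread('face.jpg')));  % registered 2-D input image
theta = 1:90;                              % projection angles in degrees
R = radon(I, theta);                       % one column of line integrals per angle
x = sum(R, 1);                             % total sum per angle -> ninety projection values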


4.1.3 Normalisation

The normalisation of the values obtained above is computed using the following equation:

k = (x - min(min(x))) / (max(max(x)) - min(min(x)))

where k is the output of the normalisation and x is the output obtained from the Radon transform.
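In MATLAB this is the usual min-max rescaling to [0, 1] (a sketch; x is the projection vector from the previous step):

k = (x - min(min(x))) ./ (max(max(x)) - min(min(x)));  % elementwise rescale to [0, 1]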

The values obtained after the normalization are

0.110 0.1273 0.1227 0.1231 0.1139 0.1208 0.1273 0.1040 0.0940 0.1070 0.1243

0.1174 0.1496 0.1446 0.1086 0.1216 0.1178 0.1178 0.1166 0.1120 0.1845 0.1323

0.1312 0.1109 0.0901 0.2068 0.0806 0.1174 0.0702 0.1538 0.0997 0.1120 0.1450

0.1316 0.1461 0.1135 0.0944 0.1454 0.1224 0.1415 0.0890 0.1300 0.2417 1.0000

0.0806 0.1224 0.1040 0.1408 0.0886 0.1396 0.1024 0.0729 0.1339 0.1162 0.1412

0.1323 0.1431 0.1366 0.1193 0.1488 0.1097 0.0813 0.1573 0.1243 0.1055 0.1247

0.1082 0.1109 0.1201 0.1523 0.1557 0.1312 0.0713 0.1507 0.1857 0.1105 0.1005

0.1454 0.1227 0.1293 0.1047 0.1239 0.1147 0.1063 0.1220 0.1132 0.1212 0.1155

0.1051

4.1.4 Fast Fourier transform

In order to find the bispectrum, the values obtained from normalization are passed twice through an n-dimensional fast Fourier transform. The magnitude of the transformed values is taken, and the negative half of the frequency spectrum is discarded; to make the transform a one-way, irreversible process, the Fourier phase information is discarded. The sequence is then zero-padded to length N to produce a real-valued sequence, and the imaginary part is set to zero. The Fourier transform is applied again to this sequence to produce the bispectrum. The phase information is not discarded from the bispectrum; therefore the bispectrum is complex-valued with non-zero imaginary components and is sensitive to asymmetry. The bispectrum is then integrated along radial slices in the bi-frequency plane. The values obtained after this process are shown below.

1.0e+011 *

1.596 -0.0351 - 1.1180i -0.1992 + 0.0475i 0.0284 - 0.2705i -0.0998 + 0.0058i

0.0242 - 0.1569i -0.0767 - 0.0135i 0.0062 - 0.1472i -0.1633 - 0.0081i 0.0067 +

0.0070i -0.0564 - 0.0150i -0.0096 - 0.0129i -0.0342 + 0.0073i 0.0159 - 0.0364i -

0.0487 - 0.0179i -0.0026 - 0.0418i -0.0773 - 0.0057i 0.0231 - 0.0268i -0.1203 -

0.0756i -0.1012 + 0.0550i -0.0747 + 0.0736i 0.0228 + 0.0818i -0.0260 + 0.0046i -

0.0016 + 0.0801i 0.0215 + 0.0162i 0.0198 + 0.0352i 0.0102 - 0.0177i -0.0221 +

0.0281i 0.0059 + 0.0365i 0.0578 + 0.0299i 0.0087 - 0.0797i -0.0877 + 0.0358i

0.0281 + 0.0679i 0.0275 + 0.0198i 0.0361 + 0.0037i 0.0042 - 0.0110i -0.0001 +

0.0145i 0.0243 + 0.0199i 0.0275 - 0.0146i 0.0014 - 0.0067i 0.0043 - 0.0098i -

0.0102 + 0.0122i 0.0203 + 0.0062i -0.0034 + 0.0005i 0.0158 + 0.0163i 0.0227

0.0158 - 0.0163i -0.0034 - 0.0005i 0.0203 - 0.0062i -0.0102 - 0.0122i 0.0043 +

0.0098i 0.0014 + 0.0067i 0.0275 + 0.0146i 0.0243 - 0.0199i -0.0001 - 0.0145i

0.0042 + 0.0110i 0.0361 - 0.0037i 0.0275 - 0.0198i 0.0281 - 0.0679i -0.0877 -

0.0358i 0.0087 + 0.0797i 0.0578 - 0.0299i 0.0059 - 0.0365i -0.0221 - 0.0281i

0.0102 + 0.0177i 0.0198 - 0.0352i 0.0215 - 0.0162i -0.0016 - 0.0801i -0.0260 -

0.0046i 0.0228 - 0.0818i -0.0747 - 0.0736i -0.1012 - 0.0550i -0.1203 + 0.0756i

0.0231 + 0.0268i -0.0773 + 0.0057i -0.0026 + 0.0418i -0.0487 + 0.0179i 0.0159 +


0.0364i -0.0342 - 0.0073i -0.0096 + 0.0129i -0.0564 + 0.0150i 0.0067 - 0.0070i -

0.1633 + 0.0081i 0.0062 + 0.1472i -0.0767 + 0.0135i 0.0242 + 0.1569i -0.0998 -

0.0058i 0.0284 + 0.2705i
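A sketch of this one-way step in MATLAB (illustrative; k is assumed to be the normalised projection row vector, N its length, and the radial integration of the bi-frequency plane is omitted):

N = numel(k);
X = fft(k);                           % first FFT of the normalised projections
mag = abs(X);                         % keep magnitude only: phase is discarded (one-way)
pos = mag(1:floor(N/2));              % discard the negative half of the spectrum
s = [pos, zeros(1, N - numel(pos))];  % zero-pad back to length N, real-valued sequence
B = fft(s);                           % second FFT yields the complex-valued bispectrum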

After applying FFT:

Fig.4.1.4.1 Fast Fourier Transform

Discarding first half of the frequency spectrum:

Fig.4.1.4.2. Removing first half of frequency spectrum

Shifting the second half of the frequency spectrum to first half and zero-padding


Fig.4.1.4.3. Shifting second half of frequency spectrum to first half

Normalization:

Fig.4.1.4.4 Normalization

Bispectrum:


Fig.4.1.4.5 Bispectrum

Iterating the procedure is necessary to produce longer bit sequences, since more iterations increase the size of the output and therefore enlarge the potential pool of bits. The transform can easily be made iterative by feeding back the integrated bispectrum as a complex-valued input vector of length N for the next iteration. The normalization step applied guarantees that the system is bounded-input bounded-output (BIBO) stable regardless of the number of iterations taken.

After each iteration of the procedure, a measure of change is extracted by computing the complex-valued inner product of the difference between the previous and present outputs with the previous output, to obtain

1.0e+004 *

0.0006 -0.0000 + 0.0000i -0.0000 - 0.0000i 0.0000 - 0.0000i -0.0032 -3.2768 - 0.0064i

The magnitude and the phase matrices obtained after this process can be shown as below

Magnitude matrix = 6 0 0 0 32 255

Angle matrix = 0 3.1396 3.1420 6.2793 3.1416 3.1435


4.1.5 Binary feature extraction

These magnitude and angle matrices must then be binarised to allow for the entropy calculation that determines the desirability of the bits. The binary sequences are stored in two matrices, one for the angle and the other for the magnitude. The binarised values of the magnitude and angle matrices are given by

Magnitude: 00000110 00000000 00000000 00000000 00100000 11111111

Angle: 00000000 00000011 00000011 00000110 00000011 00000011

Bit probability seems like a natural statistical property for describing a bit's desirability. However, bit probability is dual-valued in this context: a bit probability of 0 or 1 represents the same level of constancy, and likewise bit probabilities of 0.49 or 0.51 represent the same level of randomness. Hence binary entropy is preferred. Entropy measures the amount of information contained in a bit on a scale from zero to one; the two extreme values of 0 and 1 correspond to a constant bit (no information) and a completely random bit (one bit of information), respectively. The desirability of a bit as a feature, and as part of the bio-key, can be represented using its intra- and extra-class entropies. Rarely will a bit fit perfectly into any of these categories; instead, each bit can be given a weight value between 0 and 1 depending on its usefulness, allowing for the quantitative ranking of bits in order of their desirability as feature bits. The weight (w) is calculated using a function based on both the intra- and extra-class entropies (η_intra and η_extra) and can be broken into two parts:

1. Intra-class weight: from Table 4.1.5 it can be seen that the ideal intra-class entropy is low, so these bits should receive a high weighting:

w1 = 1 - η_intra

2. Extra-class weight: should be high when the extra-class entropy is high:

w2 = η_extra

The overall weight of a bit is simply the product of the intra-class weight and the extra-class weight. The N highest-weighted bits can then be used to form an N-bit bio-key; the locations of these bits are stored and used as a mask that can be applied to the bit matrix derived from future presentations of the biometric in order to extract the same N-bit bio-key.
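A minimal MATLAB sketch of this weighting (illustrative; p_intra and p_extra are assumed matrices holding each bit's probability of being 1, estimated over same-class and different-class acquisitions respectively, and N is the desired number of bio-key bits, e.g. 7):

h = @(p) -p.*log2(max(p,eps)) - (1-p).*log2(max(1-p,eps));  % binary entropy, H(0) = H(1) = 0
eta_intra = h(p_intra);              % low when a bit is stable for the same face
eta_extra = h(p_extra);              % high when a bit varies between faces
w = (1 - eta_intra) .* eta_extra;    % overall weight = w1 * w2
[ws, idx] = sort(w(:), 'descend');   % rank bits by desirability
mask = false(size(w));
mask(idx(1:N)) = true;               % locations of the N best bits form the mask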

Table 4.1.5 Bit properties and the corresponding value for key generation

                           Low intra-class entropy    High intra-class entropy
High extra-class entropy   Desirable                  Undesirable
Low extra-class entropy    Undesirable                Undesirable

The intra entropy for the magnitude matrix can be given as follows

Intra entropy of magnitude matrix =

0.3750 0.6250 0.6250 0.6250 0.5000 0.1250 0.1250 0.2500

0.6250 1.0000 1.0000 1.0000 0.8750 0.7500 0.7500 0.5000

0.6250 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 0.6250

0.6250 0.8750 0.8750 0.8750 1.0000 1.0000 1.0000 0.6250

0.3750 0.5000 0.3750 0.5000 0.6250 0.6250 0.6250 0.3750

0.1250 0.3750 0.3750 0.3750 0.2500 0.2500 0.2500 0.1250

The intra entropy for the angle matrix can be given as follows

Intra entropy of angle matrix =

0.3750 0.6250 0.6250 0.6250 0.6250 0.5000 0.3750 0.1250

0.6250 1.0000 1.0000 1.0000 1.0000 0.7500 0.3750 0.3750

0.6250 1.0000 1.0000 1.0000 0.8750 0.5000 0.6250 0.5000

0.6250 1.0000 1.0000 1.0000 0.8750 0.3750 0.6250 0

0.6250 1.0000 1.0000 1.0000 0.8750 0.5000 0.6250 0.5000

0.3750 0.6250 0.6250 0.6250 0.6250 0.3750 0.3750 0.3750


The extra entropy for the magnitude matrix can be given as follows

Extra entropy of magnitude matrix =

0 0 0 0 0.1250 0.5000 0.5000 0.1250

0 0 0 0 0.1250 0.2500 0.2500 0.1250

0 0 0 0 0 0 0 0

0 0.1250 0.1250 0.1250 0 0 0 0

0.2500 0.5000 0.6250 0.5000 0.3750 0.3750 0.3750 0.2500

0.2500 0.2500 0.2500 0.2500 0.3750 0.3750 0.3750 0.2500

The extra entropy for the angle matrix can be given as follows

Extra entropy of angle matrix =

0 0 0 0 0 0.1250 0.2500 0.2500

0 0 0 0 0 0.2500 0.6250 0.2500

0 0 0 0 0.1250 0.5000 0.3750 0.1250

0 0 0 0 0.1250 0.6250 0.3750 0.6250

0 0 0 0 0.1250 0.5000 0.3750 0.1250

0 0 0 0 0 0.2500 0.2500 0

The overall weights for the magnitude and angle matrices for image 1 can be shown as follows

Weight of magnitude matrix =

0 0 0 0 0.0625 0.4375 0.4375 0.0938

0 0 0 0 0.0156 0.0625 0.0625 0.0625

0 0 0 0 0 0 0 0

0 0.0156 0.0156 0.0156 0 0 0 0


0.1563 0.2500 0.3906 0.2500 0.1406 0.1406 0.1406 0.1563

0.2188 0.1563 0.1563 0.1563 0.2813 0.2813 0.2813 0.2188

Weight of angle matrix:

0 0 0 0 0 0.0625 0.1563 0.2188

0 0 0 0 0 0.0625 0.3906 0.1563

0 0 0 0 0.0156 0.2500 0.1406 0.0625

0 0 0 0 0.0156 0.3906 0.1406 0.6250

0 0 0 0 0.0156 0.2500 0.1406 0.0625

0 0 0 0 0 0.1563 0.1563 0

Finding the top 7 bits of magnitude matrix:

48 47 46 45 23 2 1 19

44 43 42 41 27 22 21 20

40 39 38 37 36 35 34 33

32 26 25 24 31 30 29 28

15 8 3 7 18 17 16 14

10 13 12 11 6 5 4 9

The top-ranked bits are represented by the values 1, 2, 3, etc., with 1 denoting the highest-weighted bit.

Finding the top 7 bits of angle matrix:

48 47 46 45 44 17 10 6

43 42 41 40 39 16 3 9

38 37 36 35 20 5 13 15

34 33 32 31 19 2 12 1

30 29 28 27 18 4 11 14


26 25 24 23 22 8 7 21

The top-ranked bits are represented by the values 1, 2, 3, etc., with 1 denoting the highest-weighted bit.

The magnitude mask obtained after selecting the top seven bits is given by:

0 0 0 0 0 1 1 0

0 0 0 0 0 0 0 0

0 0 0 0 0 0 0 0

0 0 0 0 0 0 0 0

0 0 1 1 0 0 0 0

0 0 0 0 1 1 1 0

In the obtained magnitude mask the selected bits are represented by 1 and the remaining bits by 0.

The angle mask obtained after selecting the top seven bits is given by:

0 0 0 0 0 0 0 1

0 0 0 0 0 0 1 0

0 0 0 0 0 1 0 0

0 0 0 0 0 1 0 1

0 0 0 0 0 1 0 0

0 0 0 0 0 0 1 0

In the obtained angle mask the selected bits are represented by 1 and the remaining bits by 0.

So the seven-bit code obtained for the magnitude matrix is given by

1 1 1 0 1 1 1

and the seven-bit code obtained for the angle matrix is given by

0 1 0 1 0 0 1

So the overall 14-bit code obtained from the image is given by

1 1 1 0 1 1 1 0 1 0 1 0 0 1

The cryptographic key to be encoded, korig, is generated randomly in the code. It is given by

2 3 1 2 2 0

The 14-bit code obtained above, Borig, is the biometric key obtained from the human face taken as the input image. The randomly generated korig is given as input to the Reed-Solomon encoding process.

The key is then given as input to Reed-Solomon encoding. The output obtained after encoding the six-symbol code with a codeword length of 7 and 3 bits per symbol is given by

Korig =

2 3 1 3 0 1 2 2 2 0 4 0 6 4

The encoded code is then given as input to Reed-Solomon decoding. The output obtained after decoding with a codeword length of 7 and 3 bits per symbol is given by

korig =

2 3 1 2 2 0
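A sketch of this step using the MATLAB Communications Toolbox (illustrative; splitting the six key symbols into two RS(7,3) message blocks over GF(2^3) is an assumption made here to account for the 14-symbol codeword shown above):

m = 3;  n = 2^m - 1;  k = 3;      % RS(7,3) over GF(8): 3 bits per symbol
msg = gf([2 3 1; 2 2 0], m);      % korig split into two 3-symbol messages
code = rsenc(msg, n, k);          % two 7-symbol codewords, 14 symbols in total
decoded = rsdec(code, n, k);      % recovers [2 3 1; 2 2 0], i.e. korig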

With Borig and Korig, i.e. the biometric key and the output of RS encoding, a look-up table has to be generated in order to encrypt the output of the Reed-Solomon encoding with the biometric key. The look-up table obtained is as follows:

         bn = 0   bn = 1
n = 1      2        1
n = 2      2        1
n = 3      2        1
n = 4      1        2
n = 5      2        1
n = 6      2        1
n = 7      2        1
n = 8      2        0
n = 9      2        0
n = 10     1        2
n = 11     9        4
n = 12     3        2
n = 13     5        9
n = 14     9        6
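The exact table-generation rule is not spelled out here; the following MATLAB sketch shows one plausible construction consistent with the description, in which row n stores the encoded key symbol under the column selected by biometric bit Borig(n) and a random decoy under the other column (an assumption, not necessarily the project's exact scheme):

T = zeros(numel(Korig), 2);           % columns correspond to bn = 0 and bn = 1
for nn = 1:numel(Korig)
    sel = Borig(nn) + 1;              % column index picked by the biometric bit
    T(nn, sel) = Korig(nn);           % store the true encoded-key symbol
    T(nn, 3 - sel) = randi([0 7]);    % random decoy symbol in the other column
end
% Decoding: presenting the same biometric re-selects the correct column
rows = (1:numel(Korig))';
recovered = T(sub2ind(size(T), rows, Borig(:) + 1));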

The obtained biometric key has to be checked with the standard AES algorithm in order to verify that the key is encodable and decodable. The korig obtained is given by

korig =

2 3 1 2 2 0

The standard AES algorithm takes a binary sequence (the key) and a character string (the plaintext) as input and generates a coded sequence. The character string is given as

‘my name is satish, i am a good boy.’

So the encoded sequence in the AES is given by

76 62 216 212 74 136 230 78 178 212 5 209 203 57 254 122 230 9 125 247 4 237 60 230

69 46 113 114 108 248 78 242 113 13 12 46 57 227 138 133 143 93 126 193 20 48 147

21


The obtained encrypted values are then decrypted with the AES decryption algorithm, and the output obtained was

‘my name is satish, i am a good boy.’

The same algorithm was checked with a 62-bit sequence, and the outputs are as follows.

Finding the top 31 bits of magnitude matrix:

9 9 9 9 0 1 1 0

9 9 9 9 0 0 0 0

9 9 9 9 9 9 9 9

9 0 0 0 0 0 0 0

0 0 1 0 0 0 0 0

1 1 1 1 1 1 1 1

Finding the top 31 bits of angle matrix:

9 9 9 9 9 0 0 0

9 9 9 9 9 0 1 1

9 9 9 9 0 0 1 1

9 9 9 0 0 1 1 0

0 0 0 0 0 0 1 1

0 0 0 0 0 0 1 1

The angle mask obtained after selecting the top thirty one bits is given by

0 0 0 0 0 1 1 1

0 0 0 0 0 1 1 1

0 0 0 0 1 1 1 1

0 0 0 1 1 1 1 1


1 1 1 1 1 1 1 1

1 1 1 1 1 1 1 1

The magnitude mask obtained after selecting the top thirty one bits is given by

0 0 0 0 1 1 1 1

0 0 0 0 1 1 1 1

0 0 0 0 0 0 0 0

0 1 1 1 1 1 1 1

1 1 1 1 1 1 1 1

1 1 1 1 1 1 1 1

So the thirty-one-bit code obtained for the magnitude matrix is given by

0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 1 1 1 1 1

and the thirty-one-bit code obtained for the angle matrix is given by

0 0 0 0 1 1 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 1

korig =

4 10 30 22 15 7 14 31 24 18

Korig =

4 10 30 22 15 1 13 11 30 21 30 27 31 24 24 24 14 23 1 21 6 0 24 3 17 1 22 1 3 28 4 7 14

31 24 18 31 19 3 27 28 14 8 15 21 7 4 14 18 1 16 14 15 7 29 5 23 23 24 4 8 26

Borig =

0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 1 1 1 1 1 0 0 0 0 1 1 0 0 1 1 0 0 1 1 0 0 0

0 0 0 0 1 1 0 0 0 0 0 0 1 1


The obtained biometric key has to be checked with the standard AES algorithm in order to verify that the key is encodable and decodable. The text encrypted is given by ‘my name is satish, i am good boy.’

The standard AES algorithm takes a binary sequence and a character string as the input

and generates a coded sequence. So the encoded sequence in the AES is given by

181 118 119 29 195 118 46 204 77 55 231 76 249 204 185 120 76 127 113 194 184 97

132 155 234 164 174 77 89 97 10 150 94 189 219 7 199 97 186 163 115 143 84 106 179

154 29 1

The obtained encrypted values are then decrypted with the AES decryption algorithm, and the output obtained was

‘my name is satish, i am good boy.’


CHAPTER 5

CONCLUSION


5. CONCLUSION

Hence a method of securing a cryptographic key of arbitrary length using a given biometric has been implemented. Although other biometric-based methods with superior FRR/FAR performance have been proposed, few can produce keys of this length. The method is also flexible: the bio-keys used to protect the cryptographic key can be changed and revoked, which is a significant feature not possessed by other methods. The method can be modified to protect keys of increasing length, either by increasing the size of the bio-keys through performing more rotations/iterations of the bispectrum transform or by changing the RS encoding scheme used. Lastly, the method can theoretically be applied to any biometric, as multidimensional biometrics can be reduced to one-dimensional projections or feature vectors.


CHAPTER 6

REFERENCES

6. REFERENCES

[1] A. Menezes, P. van Oorschot and S. Vanstone, Handbook of Applied Cryptography, USA: CRC Press, p. 180, 1997.

[2] K.-P. L. Vu, R. W. Proctor, A. Bhargav-Spantzel, B.-L. Tai, J. Cook, and E. Eugene Schultz, “Improving password security and memorability to protect personal and organizational information”, International Journal of Human-Computer Studies, vol. 65, pp. 744-757, 2007.

[3] U. Uludag, S. Pankanti, S. Prabhakar, and A. K. Jain, “Biometric cryptosystems: issues and challenges”, in Proceedings of the IEEE, vol. 92, pp. 948-960, 2004.

[4] F. Hao, R. Anderson, and J. Daugman, “Combining Crypto with Biometrics Effectively”, IEEE Transactions on Computers, vol. 55, pp. 1081-1088, 2006.
