Project Report
on
Pattern Recognition Using Neural
network with
MATLAB
(Character Recognition)
Lovely Faculty of Technology & Sciences
Department of ECE/EEE/D6 Domain
Amlani Pratik Ketanbhai
Reg. No. 10903436
SEC: E38E1
Roll No. RE38E1A06

Lovely Professional University
Jalandhar-Delhi Highway, GT Road, NH-1
Phagwara - 144402
CERTIFICATE
This is to certify that AMLANI PRATIK KETANBHAI bearing Registration no. 10903436 has completed dissertation/capstone project
titled, “PATTERN RECOGNITION USING NEURAL NETWORK
WITH MATLABs” under my guidance and supervision. To the best of my
knowledge, the present work is the result of his original investigation
and study. No part of the dissertation has ever been submitted for any
other degree at any University. The dissertation is fit for submission in
partial fulfillment of the conditions for the award of the Capstone project.
Mr. Srinivas Perala
UID 14910
Assistant Professor of ECE/EEE
Lovely Professional University
Phagwara, Punjab.
Date :
DECLARATION
I, AMLANI PRATIK KETANBHAI, student of the Lovely Faculty of
Technology & Sciences, B.Tech ECE LEET RE38E1, under the
Department of Electronics & Communications Engineering of Lovely
Professional University, Punjab, hereby declare that all the information
furnished in this dissertation / capstone project report is based on my
own intensive research and is genuine. This dissertation / report does
not, to the best of my knowledge, contain any part of my work which has
been submitted for the award of any degree either at this university or
any other university without proper citation.
Date : 26/11/2011 AMLANI PRATIK KETANBHAI
Reg. No. 10903436 SEC: E38E1
Roll No. RE38E1A06
PATTERN RECOGNITION USING NEURAL NETWORK WITH MATLABs 2011
L O V E L Y P R O F E S S I O N A L U N I V E R S I T Y
Page 4
INDEX

Topic No. Topic Name Page No.
1.0 ABSTRACT 5
1.1 INTRODUCTION 5
1.2 KEYWORDS 6
2.0 WHAT IS NEURAL NETWORK? 6
2.1 APPLICATIONS OF NEURAL NETWORKS 7
2.2 BIOLOGICAL INSPIRATION 7
2.3 GATHERING DATA FOR NEURAL NETWORK 8
3.0 BASIC ARTIFICIAL MODEL 8
3.1 PATTERN RECOGNITION 11
3.2 TRAINING MULTILAYER PERCEPTRON 13
3.3 BACKPROPAGATION ALGORITHM 14
4.0 HANDWRITTEN ENGLISH ALPHABET RECOGNITION SYSTEM 15
4.1 HANDWRITTEN CHARACTER RECOGNITION SYSTEM 18
4.2 TRAINING METHOD WITH REJECT OUTPUT 21
5.0 CHARACTER RECOGNITION BY LEARNING RULE 23
5.1 ARTIFICIAL NEURAL NETWORK TRAINING 26
5.2 MATLAB IMPLEMENTATION 27
5.3 TRANSLATING THE INPUT INTO MATLAB CODE 27
6.0 CHARACTER RECOGNITION MATLABs SOURCE CODE 31
6.1 REFERENCES 49
1.0 Abstract
In this paper, I have tried to develop a neural network that can be used for handwritten
character recognition. Supervised learning is used in order to provide training to the neural
network. The neural network is designed here with a reject output so that we can easily classify
all the training patterns exactly. Competitive learning algorithm is used in order to get the best
result. The fine-classifiers are realized using multilayered neural networks, each of which solves
the two-category classification problem. We also use the rough-classifier for the selection of the
training samples in the learning process of multilayered neural networks in order to reduce the
learning time.
1.1 Introduction
One of the most important effects the field of Cognitive Science can have on the field of
Computer Science is the development of technologies that make our tools more human. A very
relevant present-day field of natural interface research is handwriting recognition technology.
As evidenced by the fact that we are not all currently using tablet computers, accurate
handwriting recognition is clearly a difficult problem to solve. Before pattern recognition, air photo
interpretation and photo reading from aerial reconnaissance were used to define objects and their
significance. In this process, the human interpreter must have vast experience and imagination to
perform the task efficiently, and the interpreter's judgment is considered subjective. Just recently,
applications utilizing aerial photographs have multiplied and significant technological
advancements have emerged from plain aerial photographs into high-resolution digital images.
Pattern recognition is now widely used in large-scale applications such as monitoring the changes
in water levels of lakes and reservoirs, assessing crop diseases, assessing land use and mapping
archaeological sites. In uni-stroke input, each character is written with a single stroke. This solves the character-level
segmentation problem that previously plagued handwriting recognition. The curve drawn
between pen down and pen up events can be recognized in isolation. Uni-stroke recognition
algorithms can be relatively simple because there is no need to decide which parts of the curve
belong to which character or to wait for more strokes that belong to the same character as is often
the case when we try to recognize conventional handwriting. To develop a handwriting
recognition system that is both as reliable as Uni-stroke, and natural enough to be comfortable,
the system must be highly adaptable.
1.2 Keywords
Pattern recognition, Hidden Markov Model, learning rules in neural networks, MATLAB
Toolbox.
2.0 What is Neural Network?
Neural networks are a different paradigm for computing:
- Von Neumann machines are based on the processing/memory abstraction of human
information processing.
- Neural networks are based on the parallel architecture of animal brains.

Neural networks are a form of multiprocessor computer system, with:
- simple processing elements
- a high degree of interconnection
- simple scalar messages
- adaptive interaction between elements
A biological neuron may have as many as 10,000 different inputs, and may send its output (the
presence or absence of a short-duration spike) to many other neurons. Neurons are wired up
in a 3-dimensional pattern.
Real brains, however, are orders of magnitude more complex than any artificial neural
network so far considered.
2.1 Applications of Neural Networks
Neural networks are applicable in virtually every situation in which a relationship between the
predictor variables (independent variables, inputs) and predicted variables (dependent variables, outputs) exists,
even when that relationship is very complex and not easy to articulate in the usual terms of
"correlations" or "differences between groups." A few representative examples of problems to
which neural network analysis has been applied successfully are:
- Detection of medical phenomena
- Stock market prediction
- Credit assignment
- Monitoring the condition of machinery
- Engine management
2.2 Biological Inspiration
Neural networks grew out of research in Artificial Intelligence; specifically, attempts to mimic
the fault-tolerance and capacity to learn of biological neural systems by modeling the low-level
structure of the brain (see Patterson, 1996). The main branch of Artificial Intelligence research in
the 1960s -1980s produced Expert Systems. These are based upon a high-level model of
reasoning processes (specifically, the concept that our reasoning processes are built upon
manipulation of symbols). It became rapidly apparent that these systems, although very useful in
some domains, failed to capture certain key aspects of human intelligence. According to one line
of speculation, this was due to their failure to mimic the underlying structure of the brain. In
order to reproduce intelligence, it would be necessary to build systems with a similar
architecture.
Figure: Basic artificial neuron model built from various transfer functions.
The brain is principally composed of a very large number (circa 10,000,000,000) of neurons,
massively interconnected (with an average of several thousand interconnects per neuron,
although this varies enormously). Each neuron is a specialized cell which can propagate an
electrochemical signal.
Figure: Human neuron model; millions of such elements form the brain and transfer
signals.
In general, if you use a neural network, you won't know the exact nature of the relationship
between inputs and outputs - if you knew the relationship, you would model it directly. The other
key feature of neural networks is that they learn the input/output relationship through training.
There are two types of training used in neural networks, with different types of networks using
different types of training. These are supervised and unsupervised training, of which supervised
is the most common and will be discussed in this section.
2.3 Gathering Data for Neural Networks
Once you have decided on a problem to solve using neural networks, you will need to gather data
for training purposes. The training data set includes a number of cases, each containing values
for a range of input and output variables. The first decisions you will need to make are: which
variables to use, and how many (and which) cases to gather. The choice of variables (at least
initially) is guided by intuition. Your own expertise in the problem domain will give you some
idea of which input variables are likely to be influential. As a first pass, you should include any
variables that you think could have an influence - part of the design process will be to whittle this
set down.
For example, consider a neural network being trained to estimate the value of houses. The
price of houses depends critically on the area of a city in which they are located. A particular city
might be subdivided into dozens of named locations, and so it might seem natural to use a
nominal-valued variable representing these locations. Unfortunately, it would be very difficult to
train a neural network under these circumstances, and a more credible approach would be to
assign ratings (based on expert knowledge) to each area; for example, you might assign ratings
for the quality of local schools, convenient access to leisure facilities etc.
3.0 The Basic Artificial Model
To capture the essence of biological neural systems, an artificial neuron is defined as follows:
It receives a number of inputs (either from original data, or from the output of other neurons in
the neural network). Each input comes via a connection that has a strength (or weight); these
weights correspond to synaptic efficacy in a biological neuron. Each neuron also has a single
threshold value. The weighted sum of the inputs is formed, and the threshold subtracted, to
compose the activation of the neuron (also known as the post-synaptic potential, or PSP, of the
neuron). The activation signal is passed through an activation function (also known as a transfer
function) to produce the output of the neuron.
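The weighted-sum-minus-threshold computation above can be sketched in a few lines. This is a minimal illustrative example, not the report's own MATLAB code; the weights, threshold, and tanh transfer function below are arbitrary assumptions:

```python
import numpy as np

def neuron_output(inputs, weights, threshold, transfer=np.tanh):
    # Weighted sum of the inputs, minus the threshold, gives the
    # activation (post-synaptic potential, PSP) of the neuron.
    psp = np.dot(weights, inputs) - threshold
    # The activation signal is passed through the transfer function.
    return transfer(psp)

out = neuron_output(np.array([1.0, 1.0]), np.array([0.8, 0.5]), 0.2)
```

Any monotonic transfer function (step, logistic, tanh) can be plugged in the same way.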
This describes an individual neuron. The next question is: how should neurons be connected
together? If a network is to be of any use, there must be inputs (which carry the values of
variables of interest in the outside world) and outputs (which form predictions, or control
signals). Inputs and outputs correspond to sensory and motor nerves such as those coming from
the eyes and leading to the hands. However, there also can be hidden neurons that play an
internal role in the network. The input, hidden and output neurons need to be connected together.
The key issue here is feedback (Haykin, 1994). A simple network has a feed forward structure:
signals flow from inputs, forwards through any hidden units, eventually reaching the output
units.
Such a structure has stable behavior. However, if the network is recurrent (contains connections
back from later to earlier neurons) it can be unstable, and has very complex dynamics. Recurrent
networks are very interesting to researchers in neural networks, but so far it is the feed forward
structures that have proved most useful in solving real problems.
A typical feed forward network has neurons arranged in a distinct layered topology. The input
layer is not really neural at all: these units simply serve to introduce the values of the input
variables.
The hidden and output layer neurons are each connected to all of the units in the preceding layer.
Again, it is possible to define networks that are partially-connected to only some units in the
preceding layer; however, for most applications fully-connected networks are better.
Example: A simple single unit adaptive network:
The network has 2 inputs, and one output. All are binary. The output is
1 if W0*I0 + W1*I1 + Wb > 0
0 if W0*I0 + W1*I1 + Wb <= 0
We want it to learn simple OR: output a 1 if either I0 or I1 is 1.
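With one illustrative choice of weights (W0 = W1 = 1, Wb = -0.5; these values are an assumption, and any weights with a suitably negative bias would do), the unit above computes OR:

```python
def unit(I0, I1, W0, W1, Wb):
    # Output is 1 when the weighted sum plus the bias weight exceeds zero.
    return 1 if W0 * I0 + W1 * I1 + Wb > 0 else 0

W0, W1, Wb = 1.0, 1.0, -0.5
# Truth table: the unit fires whenever either input is 1.
truth = [(I0, I1, unit(I0, I1, W0, W1, Wb)) for I0 in (0, 1) for I1 in (0, 1)]
```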
3.1 Pattern recognition
Pattern recognition is the process of identifying a stimulus: recognizing a
correspondence between a stimulus and information in permanent memory. Pattern
recognition is generally categorized according to the type of learning procedure used to generate
the output value. Supervised learning assumes that a set of training data (the training set) has
been provided, consisting of a set of instances that have been properly labeled by hand with the
correct output.
Automatic machine recognition, description, classification, and grouping of patterns are
important problems in a variety of engineering and scientific disciplines such as biology,
psychology, medicine, marketing, computer vision, artificial intelligence, and remote sensing. A
pattern could be a fingerprint image, a handwritten cursive word, a human face, or a speech
signal. Given a pattern, its recognition/classification may consist of one of the following two
tasks:
1) Supervised classification (e.g. discriminant analysis), in which the input pattern is
identified as a member of a predefined class.
2) Unsupervised classification (e.g. clustering), in which the pattern is assigned to a hitherto
unknown class. The recognition problem here is posed as a classification or
categorization task, where the classes are either defined by the system designer (in
supervised classification) or are learned based on the similarity of patterns (in unsupervised
classification). These applications include data mining (identifying a "pattern", e.g. a
correlation or an outlier, in millions of multidimensional patterns), document classification
(efficiently searching text documents), financial forecasting, organization and retrieval of
multimedia databases, and biometrics. The rapidly growing and available computing power,
while enabling faster processing of huge data sets, has also facilitated the use of elaborate
and diverse methods for data analysis and classification. At the same time, demands on
automatic pattern recognition systems are rising enormously due to the availability of large
databases and stringent performance requirements (speed, accuracy, and cost).
The design of a pattern recognition system essentially involves the following three aspects:
1) Data acquisition and preprocessing.
2) Data representation, and
3) Decision making.
[Figure: Memory model with sensory input, a short-term store, a long-term store, and retrieval strategies.]
Neural networks (NN) have been widely used to solve complex problems of pattern recognition and
they can recognize patterns very fast after they have already been trained. There are two types of
training which are used in neural networks. These are supervised and unsupervised training of
which supervised is the most common. In supervised training, the neural network is provided with a
training data set. This data set contains examples of input patterns together with the
corresponding output results, and the network learns to infer the relationship between input
patterns and output results through training.
The training process requires a training algorithm, which uses the training data set to adjust the
network's weights and biases so as to minimize an error function, such as the mean squared error
(MSE) function, and tries to classify all patterns in the training data set. After training, the NN
will have a set of weights and biases that will be used to recognize the new patterns. In general,
training NN with a larger training data set can reduce the recognizing error rate, but it increases
the complexity. Handwritten character recognition is a typical application of neural networks.
A. Problem with large training data set
If we train a neural network with a large training data set, it requires more time to
minimize the error function. A large set of patterns decreases the speed of reduction of the error
function, but the training data set should not be so small that the network cannot identify the
patterns properly.
If the neural network is not provided with a proper set of training data, it cannot recognize
difficult patterns (Fig. 1). If the NN has to recognize a pattern that approximates in shape to one
of the provided patterns, the recognition result may be wrong.
Fig. 1. Difficult-to-recognize patterns are usually clustered in the area near the
boundary line in the pattern space.
3.2 Training Multilayer Perceptrons
Once the number of layers, and number of units in each layer, has been selected, the network's
weights and thresholds must be set so as to minimize the prediction error made by the network.
This is the role of the training algorithms. The historical cases that you have gathered are used to
automatically adjust the weights and thresholds in order to minimize this error. This process is
equivalent to fitting the model represented by the network to the training data available. The
error of a particular configuration of the network can be determined by running all the training
cases through the network, comparing the actual output generated with the desired or target
outputs. The differences are combined together by an error function to give the network error.
The most common error functions are the sum squared error (used for regression problems),
where the individual errors of output units on each case are squared and summed together, and
the cross entropy functions (used for maximum likelihood classification).
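The two error functions can be written out directly. The example targets and outputs below are made-up illustrative data, and Python/NumPy is used as a stand-in for the report's MATLAB:

```python
import numpy as np

def sum_squared_error(target, output):
    # Individual output-unit errors are squared and summed (regression).
    return np.sum((target - output) ** 2)

def cross_entropy(target, output, eps=1e-12):
    # Negative log-likelihood for 0/1 targets (maximum likelihood classification).
    output = np.clip(output, eps, 1 - eps)
    return -np.sum(target * np.log(output) + (1 - target) * np.log(1 - output))

t = np.array([1.0, 0.0])
y = np.array([0.9, 0.2])
sse = sum_squared_error(t, y)  # (0.1)^2 + (0.2)^2 = 0.05
```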
Neural network error surfaces are much more complex than those of linear models, and are
characterized by a number of unhelpful features, such as local minima (which are lower than the
surrounding terrain, but above the global minimum), flat spots and plateaus, saddle points, and
long narrow ravines.
It is not possible to analytically determine where the global minimum of the error surface is, and
so neural network training is essentially an exploration of the error surface. From an initially
random configuration of weights and thresholds (i.e., a random point on the error surface), the
training algorithms incrementally seek the global minimum. Typically, the gradient (slope) of
the error surface is calculated at the current point, and used to make a downhill move.
Eventually, the algorithm stops in a low point, which may be a local minimum (but hopefully is
the global minimum).
3.3 Backpropagation Algorithm
The best-known example of a neural network training algorithm is backpropagation. In
backpropagation, the gradient vector of the error surface is calculated. This vector points along the
line of steepest descent from the current point, so we know that if we move along it a "short"
distance, we will decrease the error. A sequence of such moves (slowing as we near the bottom)
will eventually find a minimum of some sort. The difficult part is to decide how large the steps
should be.
Large steps may converge more quickly, but may also overstep the solution or (if the error
surface is very eccentric) go off in the wrong direction. A classic example of this in neural
network training is where the algorithm progresses very slowly along a steep, narrow, valley,
bouncing from one side across to the other. In contrast, very small steps may go in the correct
direction, but they also require a large number of iterations. In practice, the step size is
proportional to the slope (so that the algorithm settles down in a minimum) and to a special
constant: the learning rate. The correct setting for the learning rate is application-dependent, and
is typically chosen by experiment; it may also be time-varying, getting smaller as the algorithm
progresses.
The algorithm is also usually modified by inclusion of a momentum term: this encourages
movement in a fixed direction, so that if several steps are taken in the same direction, the
algorithm "picks up speed", which gives it the ability to (sometimes) escape local minima, and
also to move rapidly over flat spots and plateaus.
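The gradient-plus-momentum update above can be sketched on a toy one-dimensional error surface E(w) = w^2; the learning rate and momentum constants here are arbitrary illustrative choices:

```python
def momentum_step(w, gradient, velocity, lr=0.1, momentum=0.9):
    # The step is proportional to the slope (lr * gradient), plus a momentum
    # term that keeps the update moving in the previous direction.
    velocity = momentum * velocity - lr * gradient
    return w + velocity, velocity

# Minimize E(w) = w^2 (gradient 2w), starting from w = 1.0.
w, v = 1.0, 0.0
for _ in range(100):
    w, v = momentum_step(w, 2 * w, v)
# w oscillates but decays toward the minimum at w = 0.
```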
The algorithm therefore progresses iteratively, through a number of epochs. On each epoch, the
training cases are each submitted in turn to the network, and target and actual outputs compared
and the error calculated. This error, together with the error surface gradient, is used to adjust the
weights, and then the process repeats. The initial network configuration is random, and training
stops when a given number of epochs elapses or when the error stops improving (you can select
which of these stopping conditions to use).
4.0 HANDWRITTEN ENGLISH ALPHABET RECOGNITION SYSTEM
In some handwriting, characters are indistinguishable even to the human eye, and they
can only be distinguished by context. In order to distinguish between such similar characters, the
tiny differences that they have must be identified. One of the major problems of doing this for
hand written characters is that they do not appear at the same relative location of the letter due to
the different proportions in which characters are written by different writers of the language.
Even the same person may not always write the same letter with the same proportions. Here, the
goal of a character recognition system is to transform a hand written text document on paper into
a digital format that can be manipulated by word processor software. The system is required to
identify a given input character form by mapping it to a single character in a given character set.
Each handwritten character is split into a number of segments (depending on the complexity of
the alphabet involved) and each segment is handled by a set of purpose-built neural networks. The
final output is unified via a lookup table. The neural network architecture is designed for different
values of the network parameters like the number of layers, number of neurons in each layer, the
initial values of weights, the training coefficient and the tolerance of the correctness.
The optimal selection of these network parameters certainly depends on the complexity of the
alphabet. Two phases are involved in the overall processing of our proposed scheme: the
pre-processing task and the neural-network-based recognition task. The pre-processing steps handle
the manipulations necessary for the preparation of the characters for feeding as input to the
neural network system. First, the required character or part of characters needs to be extracted
from the pictorial representation.
The pre-processing consists of splitting the alphabets into 25-segment grids, scaling the split
segments to a standard size, and thinning the resultant character segments to obtain skeletal
patterns. The following pre-processing steps may also be required to improve the recognition
process:
I. The alphabets can be thinned and their skeletons obtained using well-known image
processing techniques before extracting their binary forms.
II. The scanned documents can be "cleaned" and "smoothed" with the help of image
processing techniques for better performance.
Example: digitization of the input alphabet "B" on a 25-segment (5 x 5) grid:

0 1 0 0 0
0 1 0 0 0
0 1 1 1 0
0 1 0 1 0
0 1 1 1 0
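The 5 x 5 grid above is flattened row by row into a 25-element vector that feeds the network's input layer. A sketch (Python used for illustration; the helper name is hypothetical):

```python
# The 25-segment digitization of "B" shown above, one list per grid row.
grid_B = [
    [0, 1, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 1, 1, 1, 0],
]

def grid_to_vector(grid):
    # Flatten the grid row by row into the network's input vector.
    return [cell for row in grid for cell in row]

vec = grid_to_vector(grid_B)  # 25 inputs, one per grid segment
```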
B. Approach to problem solving
One of the ideas that can be implemented to solve the above problems is “the training data set
should be separated into some parts and the neural network will use many sets of weights and
biases to classify all patterns in each part and between parts”.
A new structure of pattern recognition NN with a special output called the “Reject
output” is introduced, and a training method corresponding to this structure is built. The name
“Reject output” means that it is used to separate all difficult-to-recognize patterns from the training data
set. Hence, these patterns are called “Rejected patterns”. Our training method uses the reject
output to separate the training data set into some parts, and with a smaller number of patterns in
each part, they can be classified by the NN using a distinct set of weights and biases.
Using “Reject output” to separate the training data set into two parts for classifying.
The figure illustrates our idea in principle. A large training data set has been separated into two parts for
classifying; thus, a relatively small NN can use two sets of weights and biases (SWBs) for classifying the
large training data set. Of course, the NN will use the two SWBs for recognizing new patterns.
SWB1 is applied first for recognition.
If the reject output gives the result “Reject”, SWB1 will be replaced by SWB2 for a second
recognition attempt. As a result, all the sets of weights and biases have to be kept in the order in
which we receive them from the training process.
4.1 HANDWRITTEN CHARACTER RECOGNITION
The handwritten character recognition system includes the following steps.
1) Scanning of image,
2) Pre processing,
3) Classification of pattern and
4) Output.
A 300 dpi scanner can be used for scanning the page that contains the handwritten characters. Here,
we consider only black-and-white images of the characters and hence avoid color
scanning.
Preprocessing
At first, a character image is normalized and smoothed in the preprocessing stage. In
normalization, a linear normalization method is employed and an input image is adjusted to
128 x 128 dots. In smoothing, bumps and holes of strokes are patched up by using 3 x 3
masks.
Then, contour extraction is done. If a white pixel adjoins a black pixel in the upward, downward,
left, or right direction, the black pixel is regarded as on the contour. The feature vector is extracted
from the pixels of the contour.
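The 4-neighbour contour rule can be sketched as follows. Treating pixels beyond the image border as white is an added assumption not stated in the text, and Python is used for illustration:

```python
def contour_pixels(image):
    # A black pixel (1) is on the contour if a white pixel (0) adjoins it
    # in the upward, downward, left, or right direction (pixels outside
    # the image are treated as white -- an assumption).
    h, w = len(image), len(image[0])
    contour = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if image[y][x] != 1:
                continue
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or image[ny][nx] == 0:
                    contour[y][x] = 1
                    break
    return contour

# A filled 4x4 block: only the inner 2x2 pixels are not on the contour.
c = contour_pixels([[1] * 4 for _ in range(4)])
```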
Classification of Pattern
Different patterns in the form of line elements and directional elements have been developed by
scanning a pixel and eight adjacent pixels of that pixel which will help in determining the actual
shape of the input pattern.
Line elements form the directional element features. This process is called dot orientation, and it
helps in the weight vector construction and updating process at the time of learning of the neural
network.
MODULAR NEURAL NETWORKS USING SOM IN ROUGH CLASSIFIER
The basic structure of our historical character recognition system is shown in Fig. This consists
of two kinds of classifiers: a rough-classifier to find the several candidates of categories for the
input pattern, and a set of fine-classifiers that determine the category of the input pattern using
multi-layered perceptrons (MLP).
The fine-classifiers are realized using a set of MLPs, each of which solves the two-category
classification problem. In other words, each MLP is trained to output the value '1' only when
the input pattern belongs to the class for which the MLP is responsible, and otherwise to output
the value '0'.
The rough-classifier is constructed using self-organizing maps (SOM), which can derive the
multi-templates for each category from input data. Basic structure of SOM is shown in Fig. 5. In
the learning algorithm of SOM, the code-vector w for the winner cell which is nearest to the
input vector x and its neighborhood cells are updated by the following equation.
Modular neural networks using SOM in rough classifier.
w_i(t+1) = w_i(t) + α(t) h(p_i, t) [x(t) - w_i(t)]    (1)

where α(t) is the learning coefficient after t learning steps. The coefficient starts from its initial
value and then decreases monotonically as t increases, thus reaching its minimum value at the
pre-set maximum number of learning steps Tmax. h(p_i, t) is a neighborhood function with its
center at the winner cell c, and p_i is the distance from cell i to the winner cell c.
In equation (2), σ(t) is a time-varying parameter that defines the neighborhood size in the
competitive layer. Like the parameter α(t) in equation (1), this parameter decreases monotonically
from its initial value as t increases.
In general, SOM has the significant characteristic that the distribution of code-vectors after the
learning of SOM reflects the distribution of the input data. That is, code-vectors tend to converge
on the areas where the density of the input data is high. In this research, based on these
characteristics of SOM, we construct the rough-classifier of our modular neural networks by
assigning one SOM to each class category. Here, we use a one-dimensional SOM for each SOM
in the rough-classifier.
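One SOM learning step for a one-dimensional map can be sketched as below. The Gaussian neighborhood and the linearly decaying α(t) and σ(t) schedules are common choices assumed here rather than taken from the text, and Python/NumPy stands in for MATLAB:

```python
import numpy as np

def som_update(code_vectors, x, t, t_max, alpha0=0.5, sigma0=2.0):
    # Winner cell c: the code-vector nearest to the input vector x.
    c = int(np.argmin(np.linalg.norm(code_vectors - x, axis=1)))
    # Learning coefficient and neighborhood size decrease monotonically with t.
    alpha = alpha0 * (1.0 - t / t_max)
    sigma = sigma0 * (1.0 - t / t_max) + 1e-9
    # One-dimensional map: the neighborhood depends on the distance |i - c|.
    for i in range(len(code_vectors)):
        h = np.exp(-((i - c) ** 2) / (2.0 * sigma ** 2))
        code_vectors[i] += alpha * h * (x - code_vectors[i])
    return c

cv = np.zeros((5, 2))
winner = som_update(cv, np.array([1.0, 1.0]), t=0, t_max=100)
```

The winner's code-vector moves furthest toward x; its neighbors on the map move progressively less.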
In the rough-classification, we select the k templates which are nearest to the test sample among
all the templates for all class categories. The class categories of the selected k templates become the
candidate classes for the fine-classifiers.
4.2 THE TRAINING METHOD WITH REJECT OUTPUT
We realize that the number of rejected patterns is always smaller than the number of
un-rejected patterns, and if the NN were trained with all rejected and un-rejected patterns in the
original order of the training data set, the training process would take a long time to
converge. Thus, the rejected patterns and the un-rejected patterns should be placed one after the
other into the input layer of the NN for training. The training software can do this very easily. As a
result, the rejected patterns are used in rotation in each training epoch; thus, they are usually clustered by the reject output faster than the un-rejected patterns. The reject output value always lies in the range from -1 to +1. The NN uses the reject output to cluster the rejected patterns toward the value +1 and the un-rejected patterns toward -1. Therefore, a threshold value R is determined for the reject output to separate all the rejected patterns from the un-rejected patterns.
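The alternating presentation described above can be sketched in Python (the report's code is MATLAB); since rejected patterns are scarcer, they are reused in rotation, and the function name is hypothetical:

```python
from itertools import cycle

def interleave(rejected, unrejected):
    """Alternate rejected and un-rejected patterns so each epoch presents
    the (scarcer) rejected patterns in rotation, as described above."""
    rej = cycle(rejected)          # reuse rejected patterns in rotation
    ordered = []
    for pat in unrejected:
        ordered.append(next(rej))  # one rejected pattern ...
        ordered.append(pat)        # ... then one un-rejected pattern
    return ordered
```

Each rejected pattern is therefore seen several times per epoch, which is why the reject output clusters them faster.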
That means the minimum value α of the reject output over all the rejected patterns must be higher than R, and the maximum value β of the reject output over all the un-rejected patterns must be lower than or equal to R. The training software can track α and β in this phase, and the NN should be trained until α > β. At that point, the reject output can separate all the rejected patterns from the un-rejected patterns with the threshold R = α. However, we do not need to wait until α > β: even if α < β, setting R = α still clusters all the rejected patterns by the reject output, although some un-rejected patterns are then classified as rejected.
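The α, β, and R computation described above can be written as a small Python sketch (function name and data layout are hypothetical):

```python
def reject_threshold(outputs, is_rejected):
    """Compute alpha (minimum reject-output over rejected patterns),
    beta (maximum reject-output over un-rejected patterns), and the
    threshold R = alpha, as described in the text."""
    alpha = min(o for o, r in zip(outputs, is_rejected) if r)
    beta = max(o for o, r in zip(outputs, is_rejected) if not r)
    # When alpha > beta, R = alpha separates the two sets perfectly; when
    # alpha <= beta, R = alpha still captures every rejected pattern, at
    # the cost of flagging some un-rejected patterns as rejected.
    return alpha, beta, alpha
```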
This is not a problem, because those patterns will be classified again in the next phase. Thus, if α is still smaller than β after hundreds of epochs, we should check all the training patterns and mark the rejected patterns again. If a pattern has already been clustered by the reject output, it remains marked as a rejected pattern; if a pattern is not classified correctly by the normal outputs, it should be marked as a rejected pattern. As a result, the number of patterns marked as rejected may change in this phase.
Determine the threshold for the reject output
After this phase, we have the first set of weights and biases (SWB1) and the threshold value R for the reject output, which together separate the training data set into two parts.
A large training data set can be separated into more than two parts and classified by using several sets of weights and biases (SWBs). The update process need not start from the first phase. For large training data, different learning phases are derived from the two main phases. Phases 1 and 2 decide the rejected and un-rejected patterns, and if a pattern is not rejected, the NN classifies it further to obtain precise outputs. From Fig. 6, we can see that at the end of each phase there is a set of updated new patterns that helps that layer of neurons solve the problem easily and quickly.
5.0 Character Recognition by Learning Rule
OVERVIEW OF THE PROPOSED CHARACTER RECOGNITION
SIMULATOR PROGRAM AND METHODOLOGY
The characters were prepared on a piece of blank paper. A black Artline 70 marker pen was used to write all of the characters. Ten samples of each character (each in a different style) were used, hence a total of 360 characters (26 upper-case letters + 10 numerical digits) were used for recognition. These characters were captured using a digital camera, a Canon IXUS 40.
The captured images were then used to generate input vectors for training the back-propagation neural network. In this case, a multi-scale technique with selective thresholding is proposed, and the results of this method will be compared with those of other methods. For simulation, separate character sets were also prepared using the same method as mentioned previously. The characters were captured automatically using the GUI created, and the captured images were stored in a temporary folder for quick reference. Input vectors are generated and fed into the
trained neural network for simulation. The simulator program is designed so that the characters can be written anywhere on the blank paper. The program captures the characters through careful filtering; 8-connectivity is used so that pixels connected in any direction are treated as part of the same object.
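As an illustrative Python sketch of the 8-connectivity grouping described above (the report uses MATLAB's Image Processing Toolbox; this flood-fill labeler and its names are hypothetical):

```python
def label_objects(binary, h, w):
    """Label connected foreground pixels using 8-connectivity: pixels that
    touch in any of the eight directions belong to the same object."""
    labels = [[0] * w for _ in range(h)]
    current = 0
    for i in range(h):
        for j in range(w):
            if binary[i][j] and not labels[i][j]:
                current += 1                 # start a new object
                stack = [(i, j)]
                labels[i][j] = current
                while stack:                 # flood fill this object
                    y, x = stack.pop()
                    for dy in (-1, 0, 1):    # all eight neighbours
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx] and not labels[ny][nx]):
                                labels[ny][nx] = current
                                stack.append((ny, nx))
    return labels, current
```

Under 8-connectivity even diagonally touching pixels merge into one object, which is why a handwritten stroke stays in one piece.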
EXEMPLARS PREPARATION
The characters prepared as explained in Section 2 are scanned using a scanner or captured using a digital camera, and these characters are segregated according to their own character group. One example is shown below in Fig. 1.
Fig. 1 Sample of character A
Note that the scanned or captured images are in RGB. These images have to be converted to grayscale before further processing can be done. Using appropriate grayscale thresholding, binary images are then created.
Fig. 2 Binary image of sample character A
Fig. 2 shows the binary image generated using the Image Processing Toolbox in MATLAB and 8-connectivity analysis. The next step is to obtain the bounding box of the characters (Fig. 3). The bounding box is the minimum rectangular box that encapsulates the whole character. The size of this bounding box is important, as only a certain width-to-height (WH) ratio of the bounding box will be considered during capture. Objects whose bounding boxes are not within a specific range will not be captured and will be treated as unwanted objects. The next criteria used to select objects are the relative-height (RH) ratio and the relative-width (RW) ratio. The RH ratio is defined as the ratio of one object's bounding box height to the maximum bounding box height among all objects in that image. Likewise, RW is the ratio of one bounding box's width to the maximum width among all bounding box widths in that image.
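The WH, RH, and RW filtering described above can be sketched in Python; the threshold values below are illustrative assumptions, not the report's actual settings:

```python
def filter_boxes(boxes, wh_range=(0.2, 2.0), min_rh=0.3, min_rw=0.1):
    """Keep only bounding boxes plausibly containing characters.
    wh_range, min_rh, and min_rw are hypothetical example thresholds."""
    max_h = max(h for _, h in boxes)
    max_w = max(w for w, _ in boxes)
    kept = []
    for w, h in boxes:
        wh = w / h       # width-to-height ratio of this box
        rh = h / max_h   # relative height vs the tallest box in the image
        rw = w / max_w   # relative width vs the widest box in the image
        if wh_range[0] <= wh <= wh_range[1] and rh >= min_rh and rw >= min_rw:
            kept.append((w, h))
    return kept
```

Boxes that are far too wide, too narrow, or too small relative to the other objects in the image are dropped as noise.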
Fig. 3 The outer rectangle that encapsulates the letter A is the bounding box.
The captured objects should now all be valid characters (and not unwanted objects) to be used for neural network training. Each captured character image has different pixel dimensions because each has a different bounding box size. Hence, each of these images needs to be resized to standard dimensions. However, in order to perform the multi-scale training technique, different image resolutions are required. For this purpose, the images are resized to 20 by 28 pixels, 10 by 14 pixels, and 5 by 7 pixels. Note that these objects are resized using an averaging procedure: 4 pixels are "averaged" and mapped onto one pixel (Fig. 4).
Fig. 4 The pixels shaded in gray on the left are "averaged" (computing the mean of 101, 128, 88, and 50) and the pixel shaded in gray on the right holds the averaged pixel intensity value. This example shows the averaging procedure from a 20 by 28 pixel image to a 10 by 14 pixel image.
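The averaging procedure of Fig. 4 can be sketched in Python: each disjoint 2x2 block of pixels is replaced by its mean, halving each dimension (function name is hypothetical):

```python
def downsample_2x2(img):
    """Halve each image dimension by averaging disjoint 2x2 pixel blocks,
    e.g. a 20x28 image becomes 10x14, and 10x14 becomes 5x7."""
    h, w = len(img), len(img[0])
    out = []
    for i in range(0, h, 2):
        row = []
        for j in range(0, w, 2):
            total = img[i][j] + img[i][j + 1] + img[i + 1][j] + img[i + 1][j + 1]
            row.append(total / 4)  # mean of the four source pixels
        out.append(row)
    return out
```

Applying this twice to a 20 by 28 exemplar yields the 10 by 14 and 5 by 7 versions used in the multi-scale training stages.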
5.1 ARTIFICIAL NEURAL NETWORK TRAINING
A back-propagation neural network is used to perform the character recognition. The network consists of 3 layers: an input layer, a hidden layer, and an output layer. The number of input neurons depends on the image resolution; for example, if the training images have a resolution of 5 by 7 pixels, there are 35 input neurons, and so on. The number of output neurons is fixed at 36 (26 upper-case letters + 10 numerical digits). The first output neuron corresponds to letter A, the second to letter B, and so on, in the sequence A, B, C … X, Y, Z, 0, 1, 2 … 7, 8, 9.
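The output-neuron ordering described above maps directly to a lookup, sketched here in Python (the constant name is hypothetical):

```python
import string

# Output-neuron ordering: 'A'..'Z' (indices 0-25) then '0'..'9' (indices 26-35)
CLASSES = string.ascii_uppercase + string.digits

def output_label(neuron_index):
    """Map an output-neuron index to its character:
    neuron 0 -> 'A', neuron 25 -> 'Z', neuron 26 -> '0', neuron 35 -> '9'."""
    return CLASSES[neuron_index]
```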
MULTISCALE TRAINING TECHNIQUE
Fig. 5 Conceptual diagram of multi scale training technique.
Training begins with the 5 by 7 pixel exemplars (stage 1). These input vectors are fed into the neural network for training. After a few epochs, the neural network is "boosted" by manipulating the weights between the first and second layers. The resulting network is trained for another few epochs on the 10 by 14 pixel exemplars (stage 2). Again, the trained network is boosted for the next training session. In the same fashion, the boosted network is fed the 20 by 28 pixel exemplars for another few epochs until satisfactory
convergence is achieved (stage 3). The conceptual diagram of the multi-scale neural network is shown in Fig. 5.
Fig. 6 Original neural network (top) and the “boosted” neural network (bottom).
Referring to Fig. 6, P1, P2, P3, and P4 are pixel intensity values and Pave is their averaged pixel intensity value. After the boosting process, the original weight value W is split into 4 weights, each connected to one pixel location.
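The boosting step can be sketched in Python. The report does not specify how W is divided among the four pixels, so splitting it equally (W/4 per pixel) is an assumption here; it keeps the neuron's pre-activation unchanged whenever the four pixels equal their average:

```python
def boost_weights(weights):
    """Expand weights learned on an averaged (low-resolution) image so each
    weight W that connected one averaged pixel now connects its four source
    pixels. ASSUMPTION: W is split equally as W/4 per pixel, which preserves
    the weighted sum when the four pixels equal their average."""
    boosted = {}
    for (i, j), w in weights.items():       # (row, col) of the averaged pixel
        for di in (0, 1):
            for dj in (0, 1):
                boosted[(2 * i + di, 2 * j + dj)] = w / 4
    return boosted
```

After boosting, training simply continues at the higher resolution with these expanded first-layer weights as the starting point.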
5.2 Matlab Implementation
Handwriting samples were obtained from 4 subjects, two male and two female. Each subject was given a piece of paper with boxes for each letter of the alphabet, and a large box in which to write the sentence "The quick brown fox jumped over the lazy dogs". All the subjects used identical felt-tip pens. In addition to the handwriting samples, one printed sample was created using Microsoft Word with the font Arial at 12 pt, in all capital letters.
5.3 Translating the Input into Matlab Code
Converting the scanned characters to code readable by Matlab was achieved with a Java application. The application allowed the user to load an image and then convert it into Matlab code by painting over it with the mouse. The application was designed to take mouse input, as opposed to directly scanning the image, so that it could also capture temporal information about the user's strokes; unfortunately, there was not enough time to implement this feature. While the source images were at a high resolution, they were imported into the Matlab code at a resolution of 10 by 10 (represented as a 100-unit vector) to reduce the processing time of the neural network. The black grid in the interface visually depicts this lower resolution.
Pre-processing involves splitting the characters into 25-segment grids, scaling the split segments to a standard size, and thinning the resulting character segments to obtain skeletal patterns. Further pre-processing steps may also be required to support the recognition process.
In the screenshot above, the user has loaded a file and painted over it to register the active pixels. Pressing the save button adds this image's data to the growing Matlab code in the bottom text box. Each input must also be associated with a letter of the alphabet; this is specified in the interface using the small text box under the Add Scan button. The target letter appears in the Matlab code as a 26-unit vector containing one 1 and twenty-five 0s.
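The one-hot target encoding described above is simple to sketch in Python (the function name is hypothetical):

```python
def target_vector(letter):
    """Encode a target letter as a 26-unit vector containing a single 1
    at the letter's alphabet position and 0s everywhere else."""
    vec = [0] * 26
    vec[ord(letter.upper()) - ord('A')] = 1
    return vec
```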
Error Identification
hora had 12 errors
krav had 3 errors
blaz had 7 errors
chme had 15 errors
bart had 13 errors
hyne had 3 errors
barta had 10 errors
fris had 8 errors
cerm had 4 errors
fise had 19 errors
kriz had 9 errors
cimp had 9 errors
holi had 3 errors
jako had 9 errors
krat had 9 errors
habe had 4 errors
bern had 4 errors
chlu had 6 errors
kost had 1 error
---[ Unsuccessful recognitions stats:]
0 not recognised 18 times
1 not recognised 4 times
2 not recognised 11 times
3 not recognised 8 times
4 not recognised 13 times
5 not recognised 5 times
6 not recognised 33 times
7 not recognised 19 times
8 not recognised 20 times
9 not recognised 18 times
We had 851 successful recognitions on 1000 patterns (85.1%)
We had 68 uncertain right and 56 unclassified
The first part lists how many patterns were misclassified for each person, while the second gives the same information for each digit to be classified. At the end we show the rate of successfully classified patterns. To decide whether a pattern was uncertain-right (the highest output in the right position) or unclassified (the highest value in the wrong position), we checked whether the highest output was lower than 0.5.
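The decision rule just described can be sketched in Python (the function name and return labels are hypothetical):

```python
def judge(outputs, target_index, threshold=0.5):
    """Classify a network response as described above: if the highest output
    falls below the threshold, the pattern is 'uncertain right' when that
    output sits at the correct position and 'unclassified' otherwise."""
    best = max(range(len(outputs)), key=outputs.__getitem__)
    if outputs[best] >= threshold:
        return "recognised" if best == target_index else "misclassified"
    return "uncertain right" if best == target_index else "unclassified"
```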
We also tried to reduce the input space dimensionality by reducing the number of segments from seven to three. This reduction allowed us to test a network with an input layer of 72 nodes, but that network was not able to recognize more than 74% of the validation set, which stopped us from pursuing this direction any further.
What can be seen from this image, which compares our best-recognized volunteer "Kost" on the left with the worst, "Fise", on the right, is that the characters do not differ from each other very much; yet the results show that discriminants exist. We might find one in the Z axis, which represents pen pressure. This also hints at where to look for further improvements in the network and the feature extraction. Another hint is the number of unclassified patterns: if we could classify even just 80% of these patterns, we could get close to the 93% of successful recognitions achieved by Marek Musil with an ad-hoc K-Means-based solution. Considering the number of weights, increasing the number of patterns in the training set might help as well (data we did not have access to).
Considering that both RBF and K-Means use codebook vectors, we would have expected the RBF approach to yield good results; we think its shortfall is caused by the high dimensionality of the input space.
WEIGHTS FOR TRAINING
The figures below show how weights can be initialized for training the neural network. Weight initialization is an important task because training a neural network consists of reducing the error between the obtained output and the desired output.
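A common approach, sketched here in Python, is to initialize the weights with small random values so that training can break symmetry and reduce the output error; the scale value is an illustrative assumption, not the report's actual setting:

```python
import random

def init_weights(n_in, n_out, scale=0.1, seed=0):
    """Initialize an n_out x n_in weight matrix with small random values
    drawn uniformly from [-scale, scale] (scale is a hypothetical choice)."""
    rng = random.Random(seed)  # seeded for reproducible initialization
    return [[rng.uniform(-scale, scale) for _ in range(n_in)]
            for _ in range(n_out)]
```

Small initial weights keep the neurons away from saturated activation regions at the start of training, so the error gradient stays usable.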
6.0 CHARACTER RECOGNITION MATLABs SOURCE CODE
function varargout = gskmadechar(varargin)
% GSKMADECHAR M-file for gskmadechar.fig
%      GSKMADECHAR, by itself, creates a new GSKMADECHAR or raises the existing
%      singleton*.
%
%      H = GSKMADECHAR returns the handle to a new GSKMADECHAR or the handle to
%      the existing singleton*.
%
%      GSKMADECHAR('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in GSKMADECHAR.M with the given input arguments.
%
%      GSKMADECHAR('Property','Value',...) creates a new GSKMADECHAR or raises the
%      existing singleton*. Starting from the left, property value pairs are
%      applied to the GUI before gskmadechar_OpeningFunction gets called. An
%      unrecognized property name or invalid value makes property application
%      stop. All inputs are passed to gskmadechar_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help gskmadechar

% Last Modified by GUIDE v2.5 23-Mar-2008 23:58:49

% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @gskmadechar_OpeningFcn, ...
                   'gui_OutputFcn',  @gskmadechar_OutputFcn, ...
                   'gui_LayoutFcn',  [] , ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
% --- Executes just before gskmadechar is made visible.
function gskmadechar_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to gskmadechar (see VARARGIN)

% Choose default command line output for gskmadechar
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes gskmadechar wait for user response (see UIRESUME)
% uiwait(handles.figure1);

clc
clear global
global inmv000 wtmv000 incmv000 iolmv000
inmv000 = zeros(100,1);
inmv000 = (-1) + inmv000;
incmv000 = zeros(1,100);
incmv000 = (-1) + incmv000;
iolmv000 = 45;
wtmv000 = zeros(100,100);
wtmv000 = wtmv000 + (inmv000 * inmv000');
% for intv011 = 1:100
%     wtmv000(intv011,intv011) = 0;
% end
% --- Outputs from this function are returned to the command line.
function varargout = gskmadechar_OutputFcn(hObject, eventdata, handles)
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;

function edit1_Callback(hObject, eventdata, handles)
% hObject    handle to edit1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit1 as text
%        str2double(get(hObject,'String')) returns contents of edit1 as a double

% --- Executes during object creation, after setting all properties.
function edit1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global wtmv000 inmv000 incmv000 iolmv000
inmv001 = zeros(100,1);
inmv001 = (-1) + inmv001;
chrv000 = cd;
chrv001 = get(handles.edit1,'string');
chrv002 = get(handles.edit3,'string');
chrvv002 = str2double(chrv002);
dos('%SystemRoot%\system32\mspaint.exe gskchar.bmp');
un8v000 = imread(fullfile(chrv000,'gskchar.bmp'));
% calculation of x1, y1, x2, y2
[intv000,intv001,intv002] = size(un8v000);
intv005 = 0;
intv006 = 0;
intv007 = 0;
intv008 = 0;
intv009 = intv000;
intv010 = intv001;
for intv003 = 1:intv000
    intv007 = 0;
    for intv004 = 1:intv001
        if (un8v000(intv003,intv004,1) ~= 255) || (un8v000(intv003,intv004,2) ~= 255) || (un8v000(intv003,intv004,3) ~= 255)
            intv007 = intv007 + 1;
            if intv007 == 1
                intv009 = min(intv003,intv009);
                intv010 = min(intv004,intv010);
            end
            intv005 = max(intv005,intv003);
            intv006 = max(intv006,intv004);
        end
    end
end
if ((intv005 == intv009) || (intv006 == intv010)) || (intv005 == 0) || (intv006 == 0) || (intv009 == 0) || (intv010 == 0)
    msgbox([{'Image cannot have less than 2 lines (line must be longer than 2 pixels)',char(13),'Retrieval is not successful'}],'ChrRecg - GSK');
else
    if (intv005 - intv009) > (intv006 - intv010)
        intv003 = 9 / (intv005 - intv009);
    else
        intv003 = 9 / (intv006 - intv010);
    end
    axes(handles.axes1);
    % figure
    cla
    hold on
    for intv011 = 1 : intv000
        for intv012 = 1 : intv001
            if (un8v000(intv011,intv012,1) ~= 255) || (un8v000(intv011,intv012,2) ~= 255) || (un8v000(intv011,intv012,3) ~= 255)
                intv013 = round(intv003 * (intv011 - intv009)) + 1;
                intv014 = round(intv003 * (intv012 - intv010)) + 1;
                inmv001((intv013-1) * 10 + intv014,1) = 1;
                plot(intv014,-intv013);
            end
        end
    end
    xlim([0 11]);
    ylim([-11 0]);
    hold off
    logv000 = 0;
    intv004 = 0;
    intv005 = 0;
    [intv000,intv001] = size(incmv000);
    opmv000 = inmv001;
    while logv000 == 0
        intv004 = intv004 + 1;
        intv016(1:intv000,1) = zeros(intv000,1);
        for intv002 = 1 : intv000
            intv015(1:100,1) = incmv000(intv002,1:100) - opmv000(1:100,1)';
            intv016(intv002,1) = norm(intv015(1:100,1));
        end
        intv017 = min(intv016);
        intv018 = 0;
        for intv002 = 1 : intv000
            if intv016(intv002,1) == intv017
                intv018 = intv018 + 1;
                intv003 = intv002;
            end
        end
        if (intv018 == 1)
            logv000 = 1;
        end
        inmmv001 = opmv000;
        if logv000 ~= 1
            if intv004 > 100
                intv004 = intv004 - 100;
            end
            intv005 = intv005 + 1;
            if get(handles.checkbox2,'value') == 1
                inmv001(intv004,1) = opmv000(intv004,1);
            else
                inmv001 = opmv000;
            end
        end
        if intv005 == chrvv002 * 10000
            msgbox([{'Processing taking too long',char(13),'Retrieval is not successful'}],'ChrRecg - GSK');
            logv000 = 1;
            intv003 = 1;
        end
        opmv000 = wtmv000' * inmv001;
        for intv002 = 1 : 100
            if opmv000(intv002) < 0
                opmv000(intv002) = -1;
            else
                opmv000(intv002) = 1;
            end
        end
    end
    set(handles.edit4,'string',num2str(intv005));
    chrv001 = strcat(chrv001,char(iolmv000(intv003)));
    set(handles.edit1,'string',chrv001);
end
intv000 = get(handles.checkbox1,'value');
intvs000 = num2str(intv000);
if intvs000 == '1'
    gskmadechar('pushbutton1_Callback',hObject, eventdata, handles)
end
% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

function edit2_Callback(hObject, eventdata, handles)
% hObject    handle to edit2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit2 as text
%        str2double(get(hObject,'String')) returns contents of edit2 as a double

% --- Executes during object creation, after setting all properties.
function edit2_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on button press in pushbutton3.
function pushbutton3_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
set(handles.edit1,'string','');

% --- Executes on button press in pushbutton4.
function pushbutton4_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
[chrv002,chrv003] = uiputfile('*.mat','Select a File to store Data');
if ~(num2str(chrv002) == '0')
    save(fullfile(chrv003,chrv002));
    msgbox('Saving is successful','ChrRecg - GSK');
else
    msgbox([{'You have to specify a file to save data',char(13),'Saving is not successful'}],'ChrRecg - GSK');
end

% --- Executes on button press in pushbutton5.
function pushbutton5_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
[chrv005,chrv006] = uigetfile('*.mat','Select a File to Load Data');
if (num2str(chrv005) == '0')
    msgbox([{'You have to specify a file to load data',char(13),'Loading is not successful'}],'ChrRecg - GSK');
else
    chrv007 = questdlg([{'This will delete present learned data!!!'};{'Are You Sure ???'}],'Question ?','Yes','No.','No.','Yes');
    if chrv007 == 'Yes'
        chrv008 = handles;
        load(fullfile(chrv006,chrv005));
    else
        msgbox([{'Cancelled by user',char(13),'Loading is not successful'}],'ChrRecg - GSK');
    end
end
% --- Executes on button press in pushbutton6.
function pushbutton6_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global wtmv000 inmv000 incmv000 iolmv000
inmv000 = zeros(100,1);
inmv000 = (-1) + inmv000;
chrv000 = cd;
chrv001 = get(handles.edit2,'string');
dos('%SystemRoot%\system32\mspaint.exe gskchar.bmp');
un8v000 = imread(fullfile(chrv000,'gskchar.bmp'));
% calculation of x1, y1, x2, y2
[intv000,intv001,intv002] = size(un8v000);
intv005 = 0;
intv006 = 0;
intv007 = 0;
intv008 = 0;
intv009 = intv000;
intv010 = intv001;
for intv003 = 1:intv000
    intv007 = 0;
    for intv004 = 1:intv001
        if (un8v000(intv003,intv004,1) ~= 255) || (un8v000(intv003,intv004,2) ~= 255) || (un8v000(intv003,intv004,3) ~= 255)
            intv007 = intv007 + 1;
            if intv007 == 1
                intv009 = min(intv003,intv009);
                intv010 = min(intv004,intv010);
            end
            intv005 = max(intv005,intv003);
            intv006 = max(intv006,intv004);
        end
    end
end
if ((intv005 == intv009) || (intv006 == intv010)) || (intv005 == 0) || (intv006 == 0) || (intv009 == 0) || (intv010 == 0)
    msgbox([{'Image cannot have less than 2 lines (line must be longer than 2 pixels)',char(13),'Teaching is not successful'}],'ChrRecg - GSK');
else
    if (intv005 - intv009) > (intv006 - intv010)
        intv003 = 9 / (intv005 - intv009);
    else
        intv003 = 9 / (intv006 - intv010);
    end
    axes(handles.axes1);
    % figure
    cla
    hold on
    for intv011 = 1 : intv000
        for intv012 = 1 : intv001
            if (un8v000(intv011,intv012,1) ~= 255) || (un8v000(intv011,intv012,2) ~= 255) || (un8v000(intv011,intv012,3) ~= 255)
                intv013 = round(intv003 * (intv011 - intv009)) + 1;
                intv014 = round(intv003 * (intv012 - intv010)) + 1;
                inmv000((intv013-1) * 10 + intv014,1) = 1;
                plot(intv014,-intv013);
            end
        end
    end
    xlim([0 11]);
    ylim([-11 0]);
    hold off
    [intv000,intv001] = size(incmv000);
    if intv000 > 0
        logv000 = 0;
        for intv002 = 1 : intv000
            if inmv000(1:100,1) == incmv000(intv002,1:100)'
                logv000 = 1;
                if ~isempty(chrv001)
                    intv004 = unicode2native(chrv001);
                    iolmv000(intv002,1) = intv004(1,1);
                    logv001 = 1;
                end
            end
        end
    end
    if (intv000 == 0) || (logv000 == 0)
        incmv000(intv000+1,1:100) = inmv000;
        intv004 = unicode2native(chrv001);
        iolmv000(intv000+1,1) = intv004(1,1);
        wtmv000 = wtmv000 + (inmv000 * inmv000');
        % for intv011 = 1:100
        %     wtmv000(intv011,intv011) = 0;
        % end
    end
end
% --- Executes on button press in pushbutton8.
function pushbutton8_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton8 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
chrv000 = [{'CHARACTER RECOGNITION WITH NEURAL ANALYSIS'}...
    {''}...
    {}...
    {'This programme uses a simple algorithm based on the auto-associative principle of artificial neural network analysis with asynchronous updating. The algorithm used here is very simple and robust.'}...
    {''}...
    {''}...
    {'The number of iterations required depends on the match between the given input and the available teaching data. The number of iterations may grow with improper teaching. You can specify a time limit for each processing run so that, in case of a long iteration, processing stops and the application does not hang.'}...
    {''}...
    {''}...
    {'First we need to train the system by typing an alphabet or numeric character in the textbox provided.'}...
    {''}...
    {'Paintbrush will open where we can draw the input image; use the "Air Brush" tool in Paintbrush for good results.'}...
    {''}...
    {'The same alphabet can be taught more than once (it is recommended to teach each alphabet more than 10 times).'}...
    {''}...
    {'You can save the system''s learned data for future use or future updating. This is similar to stopping work now and continuing later. The Load / Save buttons should be used for this purpose.'}...
    {''}...
    {'By pressing the Start button, you can enter an image which will be approximated to the nearest alphabet or numeric character.'}...
    {''}...
    {'Please send your suggestions to "[email protected]"'}...
    {''}...
    {''}...
    {'Prepared by Suresh Kumar Gadi'}];
msgbox(chrv000,'ChrRecg - GSK');
% --- Executes on button press in pushbutton9.
function pushbutton9_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton9 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% --- Executes on button press in pushbutton10.
function pushbutton10_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton10 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
chrv000 = questdlg([{'This will delete present learned data!!!'};{'Are You Sure ???'}],'Question ?','Yes','No.','No.','Yes');
if chrv000 == 'Yes'
    clear global
    global inmv000 wtmv000 incmv000 iolmv000
    inmv000 = zeros(100,1);
    inmv000 = (-1) + inmv000;
    incmv000 = zeros(1,100);
    incmv000 = (-1) + incmv000;
    iolmv000 = 45;
    wtmv000 = zeros(100,100);
    wtmv000 = wtmv000 + (inmv000 * inmv000');
    % for intv011 = 1:100
    %     wtmv000(intv011,intv011) = 0;
    % end
else
    msgbox([{'Cancelled by user',char(13),'Learned data clearing is not successful'}],'ChrRecg - GSK');
end
function edit3_Callback(hObject, eventdata, handles)
% hObject    handle to edit3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit3 as text
%        str2double(get(hObject,'String')) returns contents of edit3 as a double

% --- Executes during object creation, after setting all properties.
function edit3_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on button press in pushbutton12.
function pushbutton12_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton12 (see GCBO)
% eventdata reserved - to be defined in a future version of
MATLAB
% handles structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
intv000 = wtmv000 / max(abs(max(abs(wtmv000))));
figure
holdon
gridoff
for intv001 = 1:100
for intv002 = 1:100
if intv000(intv001,intv002) < 0
            plot(intv001,100-intv002,'--rs','LineWidth',1,'MarkerEdgeColor','b', ...
                'MarkerFaceColor',[-1*intv000(intv001,intv002) 0 0],'MarkerSize',10);
        elseif intv000(intv001,intv002) > 0
            plot(intv001,100-intv002,'--rs','LineWidth',1,'MarkerEdgeColor','b', ...
                'MarkerFaceColor',[0 intv000(intv001,intv002) 0],'MarkerSize',10);
        else
            plot(intv001,100-intv002,'--rs','LineWidth',1,'MarkerEdgeColor','b', ...
                'MarkerFaceColor',[1 1 0],'MarkerSize',10);
        end
end
end
hold off
% --- Executes on button press in pushbutton15.
function pushbutton15_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton15 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
[intv000, intv001] = size(incmv000);
figure
hold on
for intv002 = 1:intv000
for intv003 = 1:100
if incmv000(intv002,intv003) > 0
            plot(intv003,intv002,'--rs','LineWidth',1,'MarkerEdgeColor','b', ...
                'MarkerFaceColor',[0 1 0],'MarkerSize',10);
        else
            plot(intv003,intv002,'--rs','LineWidth',1,'MarkerEdgeColor','b', ...
                'MarkerFaceColor',[1 0 0],'MarkerSize',10);
end
end
end
hold off
%msgbox({'Recorded input patterns are displayed on the main window of MATLAB'},'ChrRecg - GSK');
% --- Executes on button press in pushbutton16.
function pushbutton16_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton16 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
char(iolmv000)
msgbox({'Recorded outputs are displayed on the main window of MATLAB'},'ChrRecg - GSK');
function edit4_Callback(hObject, eventdata, handles)
% hObject handle to edit4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of edit4 as text
%        str2double(get(hObject,'String')) returns contents of edit4 as a double
% --- Executes during object creation, after setting all properties.
function edit4_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'),get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on button press in checkbox1.
function checkbox1_Callback(hObject, eventdata, handles)
% hObject handle to checkbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of checkbox1
% --- Executes on button press in pushbutton17.
function pushbutton17_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton17 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
figure
hold on
for intv000 = 1:100
if inmv000(intv000,1) > 0
        plot(intv000,1,'--rs','LineWidth',1,'MarkerEdgeColor','b', ...
            'MarkerFaceColor',[0 1 0],'MarkerSize',10);
    else
        plot(intv000,1,'--rs','LineWidth',1,'MarkerEdgeColor','b', ...
            'MarkerFaceColor',[1 0 0],'MarkerSize',10);
end
end
hold off
%msgbox({'Present input matrix is displayed on the main window of MATLAB'},'ChrRecg - GSK');
% --- Executes on selection change in listbox1.
function listbox1_Callback(hObject, eventdata, handles)
% hObject handle to listbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns listbox1 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from listbox1
% --- Executes during object creation, after setting all properties.
function listbox1_CreateFcn(hObject, eventdata, handles)
% hObject handle to listbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'),get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on button press in radiobutton1.
function radiobutton1_Callback(hObject, eventdata, handles)
% hObject handle to radiobutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of radiobutton1
% --- Executes on button press in radiobutton2.
function radiobutton2_Callback(hObject, eventdata, handles)
% hObject handle to radiobutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of radiobutton2
% --- Executes on button press in checkbox2.
function checkbox2_Callback(hObject, eventdata, handles)
% hObject handle to checkbox2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of checkbox2
6.0 REFERENCES

[1] IEEE paper "A Study on Japanese Historical Character Recognition that uses Modular Neural Networks" by Tadashi Horiuchi and Satoru Kato, ref. no. 978-0-7695-3873-0/09.
[2] IEEE paper "A Pattern Recognition Neural Network Using Many Sets of Weights and Biases" by Le Dung and Makoto Mizukawa, ref. no. 1-4244-0790-7/07.
[3] IEEE paper "Automatic Handwritten Postcode Reading System using Image Processing and Neural Networks" by Le Dung and Makoto Mizukawa, 930-931, SI2006.
[4] http://www.codeproject.com/KB/recipes/HopfieldNeuralNetwork.aspx
[5] http://home.eunet.no/~khunn/papers/2039.html
[6] http://www.codeproject.com/KB/dotnet/simple_ocr.aspx