
Fundamentals of Neural Networks


Page 1: Fundamentals of Neural Networks

NEURAL NETWORKS

Gagan Deep
Rozy Computech Services
3rd Gate, Kurukshetra-136119
[email protected], 9416011599


Page 2: Fundamentals of Neural Networks

Artificial Neural Network(ANN)

Artificial + Neural + Network


Page 3: Fundamentals of Neural Networks

Artificial

Made or produced by human beings rather than occurring naturally, especially as a copy of something natural.

However, artificiality does not necessarily have a negative connotation, as it may also reflect the ability of humans to replicate forms or functions arising in nature, as with an artificial heart or artificial intelligence.

Intelligence expert Herbert A. Simon observes that "some artificial things are imitations of things in nature, and the imitation may use either the same basic materials as those in the natural object or quite different materials."


Page 4: Fundamentals of Neural Networks

Artificial Intelligence

Artificial intelligence (AI) is the intelligence exhibited by machines or software. It is also the academic field of study that pursues the goal of creating intelligence.

The central problems (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception, and the ability to move and manipulate objects.


Page 5: Fundamentals of Neural Networks

Knowledge Based System

A knowledge-based system is a program that acquires, represents, and uses knowledge for a specific purpose.

It consists of a knowledge-base and an inference engine.

Knowledge is stored in the knowledge-base, while control strategies reside in the separate inference engine.
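To make this separation of knowledge and control concrete, here is a minimal, hypothetical sketch in Python: a fact set plus if-then rules stand in for the knowledge-base, and a simple forward-chaining loop stands in for the inference engine. The rule format, fact names, and function names are illustrative assumptions, not part of the original material.

```python
# Minimal knowledge-based system sketch (illustrative only).
# Knowledge-base: facts plus if-then rules; inference engine: forward chaining.

facts = {"has_fever", "has_cough"}

# Each rule: (set of required facts, fact to conclude)
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu"}, "recommend_rest"),
]

def forward_chain(facts, rules):
    """Inference engine: repeatedly fire rules whose conditions already hold."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))  # derived facts include 'possible_flu' and 'recommend_rest'
```

The control strategy (the loop) never changes when new rules are added; only the knowledge-base does, which is the point of the architecture described above.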


Page 6: Fundamentals of Neural Networks

[Diagram: knowledge-base and inference engine as the two components of a knowledge-based system]

Page 7: Fundamentals of Neural Networks

Stages of Biological Neural System

The neural system of the human body consists of three stages: receptors, a neural network, and effectors. The receptors receive stimuli, either internally or from the external world, and pass the information to the neurons in the form of electrical impulses. The neural network then processes the inputs and makes the proper decisions about outputs. Finally, the effectors translate electrical impulses from the neural network into responses to the outside environment. The figure shows the bidirectional communication between stages, which provides feedback.


Page 8: Fundamentals of Neural Networks

Neural

Neural: relating to a nerve or the nervous system.

Situated in the region of, or on the same side of the body as, the brain and spinal cord.

The term comes from the Greek word neuron.


Page 9: Fundamentals of Neural Networks

Neuron

A neuron (also known as a neurone or nerve cell) is an electrically excitable cell that processes and transmits information through electrical and chemical signals.

These signals between neurons occur via synapses, specialized connections with other cells.

Synapse: a junction between two nerve cells, consisting of a minute gap across which impulses pass by diffusion of a neurotransmitter.


Page 10: Fundamentals of Neural Networks

The human body is made up of trillions of cells.

Neurons are specialized to carry "messages" through an electrochemical process.

The human brain has approximately 100 billion neurons.

Neurons come in many different shapes and sizes.

Some of the smallest neurons have cell bodies that are only 4 microns wide.

Some of the biggest neurons have cell bodies that are 100 microns wide. (Remember that 1 micron is equal to one thousandth of a millimeter!)


Page 11: Fundamentals of Neural Networks

Neurons vs. Other Cells

Similarities with other cells:

Neurons are surrounded by a cell membrane that protects the cell.

Neurons and other body cells both contain a nucleus that holds genetic information.

Neurons carry out basic cellular processes such as protein synthesis and energy production.


Page 12: Fundamentals of Neural Networks

However, neurons differ from other cells in the body because:

Neurons have specialized cell parts called dendrites and axons. Dendrites bring electrical signals to the cell body, and axons take information away from the cell body.

Neurons communicate with each other through an electrochemical process.

Neurons contain some specialized structures (for example, synapses) and chemicals (for example, neurotransmitters).


Page 13: Fundamentals of Neural Networks

The Structure of a Neuron

There are three basic parts of a neuron: the dendrites, the cell body, and the axon.

However, all neurons vary somewhat in size, shape, and characteristics depending on the function and role of the neuron.

Some neurons have few dendritic branches, while others are highly branched in order to receive a great deal of information.

Some neurons have short axons, while others can be quite long. The longest axon in the human body extends from the bottom of the spine to the big toe and averages a length of approximately three feet!


Page 14: Fundamentals of Neural Networks

Neuron

One way to classify neurons is by the number of extensions that extend from the neuron's cell body (soma).

Page 15: Fundamentals of Neural Networks


Page 16: Fundamentals of Neural Networks

Bipolar neurons have two processes extending from the cell body (examples: retinal cells, olfactory epithelium cells).

Pseudounipolar cells (example: dorsal root ganglion cells) actually have two axons rather than an axon and a dendrite. One axon extends centrally toward the spinal cord; the other extends toward the skin or muscle.

Multipolar neurons have many processes that extend from the cell body. However, each neuron has only one axon (examples: spinal motor neurons, pyramidal neurons, Purkinje cells).

Page 17: Fundamentals of Neural Networks

SYNAPSE


Page 18: Fundamentals of Neural Networks

Brain Interconnections


Page 19: Fundamentals of Neural Networks

BIOLOGICAL (MOTOR) NEURON


Page 20: Fundamentals of Neural Networks

Neurons can also be classified by the direction in which they send information.

Sensory (or afferent) neurons: send information from sensory receptors (e.g., in the skin, eyes, nose, tongue, and ears) TOWARD the central nervous system.

Motor (or efferent) neurons: send information AWAY from the central nervous system to muscles or glands.

Interneurons: send information between sensory neurons and motor neurons. Most interneurons are located in the central nervous system.


Page 21: Fundamentals of Neural Networks

Action Potentials

How do neurons transmit and receive information? In order for neurons to communicate, they need to transmit information both within the neuron and from one neuron to the next. This process utilizes both electrical signals and chemical messengers.

The dendrites of neurons receive information from sensory receptors or other neurons. This information is then passed down to the cell body and on to the axon. Once the information has arrived at the axon, it travels down the length of the axon in the form of an electrical signal known as an action potential.


Page 22: Fundamentals of Neural Networks

Communication Between Synapses

Once an electrical impulse has reached the end of an axon, the information must be transmitted across the synaptic gap to the dendrites of the adjoining neuron. In some cases, the electrical signal can almost instantaneously bridge the gap between the neurons and continue along its path.

In other cases, neurotransmitters are needed to send the information from one neuron to the next. Neurotransmitters are chemical messengers that are released from the axon terminals to cross the synaptic gap and reach the receptor sites of other neurons. After binding to these receptor sites, the neurotransmitters are reabsorbed by the releasing neuron to be reused, in a process known as reuptake.


Page 23: Fundamentals of Neural Networks

Neurotransmitters

Neurotransmitters are an essential part of our everyday functioning. While it is not known exactly how many neurotransmitters exist, scientists have identified more than 100 of these chemical messengers.

The spikes travelling along the axon of the pre-synaptic neuron trigger the release of neurotransmitter substances at the synapse.

The neurotransmitters cause excitation or inhibition in the dendrite of the post-synaptic neuron.


Page 24: Fundamentals of Neural Networks

The integration of the excitatory and inhibitory signals may produce spikes in the post-synaptic neuron.

The contribution of the signals depends on the strength of the synaptic connection.

What effects does each of these neurotransmitters have on the body?

What happens when disease or drugs interfere with these chemical messengers?

The following are just a few of the major neurotransmitters, their known effects, and the disorders they are associated with.


Page 25: Fundamentals of Neural Networks

Acetylcholine: Associated with memory, muscle contractions, and learning. A lack of acetylcholine in the brain is associated with Alzheimer's disease.

Endorphins: Associated with emotions and pain perception. The body releases endorphins in response to fear or trauma. These chemical messengers are similar to opiate drugs such as morphine, but are significantly stronger.

Dopamine: Associated with thought and pleasurable feelings. Parkinson's disease is one illness associated with deficits in dopamine, while schizophrenia is strongly linked to excessive amounts of this chemical messenger.


Page 26: Fundamentals of Neural Networks

Biological Prototype

● Neuron

- Information gathering (D)

- Information processing (C)

- Information propagation (A / S)

human being: 10^12 neurons

electrical signals in the mV range

speed: 120 m/s

[Diagram labels: cell body (C), dendrite (D), nucleus, axon (A), synapse (S)]


Page 27: Fundamentals of Neural Networks

Artificial Neural Network

An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information.

The key element of this paradigm is the novel structure of the information processing system.

It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems.

ANNs, like people, learn by example.

An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process.


Page 28: Fundamentals of Neural Networks

Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons. This is true of ANNs as well.


Page 29: Fundamentals of Neural Networks

BRAIN COMPUTATION

The human brain contains about 10 billion nerve cells, or neurons. On average, each neuron is connected to other neurons through approximately 10,000 synapses.


Page 30: Fundamentals of Neural Networks

DEFINITION OF NEURAL NETWORKS

According to the DARPA Neural Network Study:

"... a neural network is a system composed of many simple processing elements operating in parallel whose function is determined by network structure, connection strengths, and the processing performed at computing elements or nodes."

According to Haykin:

"A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:

• Knowledge is acquired by the network through a learning process.

• Interneuron connection strengths known as synaptic weights are used to store the knowledge."


Page 31: Fundamentals of Neural Networks

NEURAL NETWORKS vs. CONVENTIONAL COMPUTERS

COMPUTERS

Algorithmic approach

They must be explicitly programmed

Work on predefined set of instructions

Operations are predictable

ANN

Learning approach

Not programmed for specific tasks

Used in decision making

Operation is unpredictable


Page 32: Fundamentals of Neural Networks

ARTIFICIAL NEURAL NETWORKS

Information-processing system.

Neurons process the information.

The signals are transmitted by means of connection links.

The links possess an associated weight.

The output signal is obtained by applying an activation function to the net input.


Page 33: Fundamentals of Neural Networks

ARTIFICIAL NEURAL NETWORKS

The figure shows a simple artificial neural net with two input neurons (X1, X2) and one output neuron (Y). The interconnection weights are given by W1 and W2.

[Figure: inputs X1 and X2 connect to the output Y through weights W1 and W2]
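As a concrete reading of this figure, the short Python sketch below (an illustration of my own, not part of the slides) computes the output Y from the inputs X1, X2 and the weights W1, W2 using a simple step activation; the numeric values and the threshold are arbitrary assumptions.

```python
# Simple two-input, one-output artificial neuron (illustrative sketch).

def step(u, threshold=0.5):
    """Binary step activation: fire if the net input reaches the threshold."""
    return 1 if u >= threshold else 0

# Arbitrary example values for inputs and weights.
x1, x2 = 1.0, 0.0
w1, w2 = 0.7, 0.3

net_input = w1 * x1 + w2 * x2   # weighted sum of the inputs
y = step(net_input)             # output of the neuron

print(f"net input = {net_input}, output Y = {y}")  # net input = 0.7, output Y = 1
```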


Page 34: Fundamentals of Neural Networks

ASSOCIATION OF BIOLOGICAL NET WITH ARTIFICIAL NET


Page 35: Fundamentals of Neural Networks

PROCESSING OF AN ARTIFICIAL NETWORK

The neuron is the basic information processing unit of a NN. It consists of:

1. A set of links, describing the neuron inputs, with weights W1, W2, ..., Wm.

2. An adder function (linear combiner) for computing the weighted sum of the inputs (real numbers):

u = Σ (j = 1 to m) Wj Xj

3. An activation function for limiting the amplitude of the neuron output:

y = φ(u + b), where b is the bias.
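A minimal Python sketch of this processing unit follows, assuming a sigmoid as the activation function φ; the particular inputs, weights, and bias are arbitrary example values, not values from the slides.

```python
import math

# Basic processing unit of an ANN: weighted sum (adder) plus activation.

def sigmoid(z):
    """A common activation function that limits the output to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(inputs, weights, bias):
    """Compute y = phi(u + b) with u = sum_j W_j * X_j."""
    u = sum(w * x for w, x in zip(weights, inputs))  # adder / linear combiner
    return sigmoid(u + bias)                         # activation limits the amplitude

# Example with m = 3 inputs (values chosen arbitrarily).
x = [0.5, -1.0, 2.0]
w = [0.4, 0.1, 0.25]
b = -0.2

print(neuron_output(x, w, b))  # a value between 0 and 1
```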


Page 36: Fundamentals of Neural Networks

MOTIVATION FOR NEURAL NET

Scientists are challenged to use machines more effectively for tasks currently solved by humans.

Symbolic rules don't reflect the processes actually used by humans.

Traditional computing excels in many areas, but not in others.


Page 37: Fundamentals of Neural Networks

The major such areas are:

Massive parallelism

Distributed representation and computation

Learning ability

Generalization ability

Adaptivity

Inherent contextual information processing

Fault tolerance

Low energy consumption


Page 38: Fundamentals of Neural Networks

Characteristics of Artificial Neural Networks

A large number of very simple, neuron-like processing elements

A large number of weighted connections between the elements

Distributed representation of knowledge over the connections

Knowledge is acquired by the network through a learning process


Page 39: Fundamentals of Neural Networks

The good news: they exhibit some brain-like behaviors that are difficult to program directly, such as:

learning

association

categorization

generalization

feature extraction

optimization

noise immunity

The bad news: neural nets are black boxes, and they can be difficult to train in some cases.


Page 40: Fundamentals of Neural Networks

NNs exhibit mapping capabilities; that is, they can map input patterns to their associated output patterns.

NNs learn by example. Thus, an NN architecture can be trained with known examples of a problem before it is tested for its 'inference' capability on unknown instances of the problem. NNs can therefore identify new objects that they were not explicitly trained on.

NNs possess the capability to generalize. Thus, they can predict new outcomes from past trends.

NNs are robust systems and are fault tolerant. They can therefore recall full patterns from incomplete, partial, or noisy patterns.

NNs can process information in parallel, at high speed, and in a distributed manner.


Page 41: Fundamentals of Neural Networks

Features of Biological Neural Networks

Some attractive features of the biological NN that make it superior to even the most sophisticated AI computer system for pattern-recognition tasks are the following:

Robustness and fault tolerance: The decay of nerve cells does not seem to affect performance significantly.

Flexibility: The network automatically adjusts to a new environment without using any programmed instructions.

Ability to deal with a variety of data situations: The network can deal with information that is fuzzy, probabilistic, noisy, and inconsistent.

Collective computation: The network routinely performs many operations in parallel, and also performs a given task in a distributed manner.


Page 42: Fundamentals of Neural Networks

Performance Comparison of Computers and Biological Neural Networks

Speed: Brain (slow in processing information) + Computer (fast) = ANN (fast).

Processing: Sequential (computer programs) + Parallel (brain) = Parallel processing.

Size & Complexity: Billions of neurons and trillions of interconnections give the brain its size and complexity.

Storage: Brain (adaptable) vs. Computer (strictly replaceable). In computers new information overwrites the old, whereas in the brain it is added according to interconnection strengths.

Fault Tolerance: Because the network is distributed, information can be retrieved even after a crash or partial destruction.

Control Mechanism: Central nervous system (brain) vs. control unit (computer).


Page 43: Fundamentals of Neural Networks

HISTORICAL BACKGROUND

The history of neural networks can be divided into several periods:

First Attempts: There were some initial simulations using formal logic. McCulloch and Pitts (1943) developed models of neural networks based on their understanding of neurology. These models made several assumptions about how neurons worked. Their networks were based on simple neurons which were considered to be binary devices with fixed thresholds. The results of their model were simple logic functions such as "a or b" and "a and b".
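A McCulloch-Pitts style unit is easy to sketch: binary inputs, fixed weights, and a fixed threshold. The weights and thresholds below are illustrative choices that realise the "a and b" and "a or b" functions mentioned above; they are not taken from the original paper.

```python
# McCulloch-Pitts style binary threshold unit (illustrative sketch).

def mcp_neuron(inputs, weights, threshold):
    """Fire (output 1) if the weighted sum of binary inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# "a and b": both inputs must be on to reach the threshold of 2.
def and_gate(a, b):
    return mcp_neuron([a, b], weights=[1, 1], threshold=2)

# "a or b": a single active input is enough to reach the threshold of 1.
def or_gate(a, b):
    return mcp_neuron([a, b], weights=[1, 1], threshold=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", and_gate(a, b), "OR:", or_gate(a, b))
```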


Page 44: Fundamentals of Neural Networks

Another attempt used computer simulations, by two groups (Farley and Clark, 1954; Rochester, Holland, Haibt and Duda, 1956). The first group (IBM researchers) maintained close contact with neuroscientists at McGill University, so whenever their models did not work, they consulted the neuroscientists. This interaction established a multidisciplinary trend which continues to the present day.


Page 45: Fundamentals of Neural Networks

Promising & Emerging Technology: Not only was neuroscience influential in the development of neural networks, but psychologists and engineers also contributed to the progress of neural network simulations.

Rosenblatt (1958) stirred considerable interest and activity in the field when he designed and developed the Perceptron. The Perceptron had three layers, with the middle layer known as the association layer. This system could learn to connect or associate a given input to a random output unit.
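The flavour of perceptron learning can be shown with a small sketch of the classic update rule w ← w + η(t − y)·x. The OR training data, learning rate, and epoch count below are assumptions made for illustration, not details from the slides.

```python
# Perceptron learning rule sketch: w <- w + eta * (target - output) * x.

def predict(weights, bias, x):
    """Threshold unit: output 1 if w.x + b >= 0, else 0."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s >= 0 else 0

def train_perceptron(samples, eta=0.1, epochs=20):
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            # Adjust the weights only when the prediction is wrong.
            weights = [w + eta * error * xi for w, xi in zip(weights, x)]
            bias += eta * error
    return weights, bias

# Linearly separable example: logical OR.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # expected: [0, 1, 1, 1]
```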


Page 46: Fundamentals of Neural Networks

Another system was the ADALINE (ADAptive LInear Element), developed in 1960 by Widrow and Hoff (of Stanford University). The ADALINE was an analogue electronic device made from simple components. The method used for learning was different from that of the Perceptron: it employed the Least-Mean-Squares (LMS) learning rule.
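For contrast with the perceptron rule above, here is a minimal sketch of the LMS (Widrow-Hoff) update, w ← w + η(d − w·x)·x, which corrects the weights in proportion to the error of the linear output rather than of the thresholded output. The toy data and learning rate are illustrative assumptions.

```python
# LMS (Widrow-Hoff) learning rule sketch: w <- w + eta * (d - w.x) * x.

def lms_train(samples, n_inputs, eta=0.05, epochs=100):
    weights = [0.0] * n_inputs
    for _ in range(epochs):
        for x, desired in samples:
            linear_out = sum(w * xi for w, xi in zip(weights, x))
            error = desired - linear_out          # error of the *linear* output
            weights = [w + eta * error * xi for w, xi in zip(weights, x)]
    return weights

# Toy target: d = 2*x1 - 1*x2 (each x includes a constant 1.0 for the bias weight).
data = [([1.0, 1.0, 0.0], 2.0), ([1.0, 0.0, 1.0], -1.0),
        ([1.0, 1.0, 1.0], 1.0), ([1.0, 2.0, 1.0], 3.0)]
print(lms_train(data, n_inputs=3))  # weights approach roughly [0, 2, -1]
```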


Page 47: Fundamentals of Neural Networks

Period of Frustration & Disrepute: In 1969, Minsky and Papert wrote a book in which they generalized the limitations of single-layer Perceptrons to multilayered systems. In the book they said:

"...our intuitive judgment that the extension (to multilayer systems) is sterile".

The significant result of their book was to eliminate funding for research with neural network simulations. The conclusions supported the disenchantment of researchers in the field. As a result, considerable prejudice against this field developed.


Page 48: Fundamentals of Neural Networks

Innovation: Although public interest and available funding were minimal, several researchers continued working to develop neuromorphically based computational methods for problems such as pattern recognition. During this period several paradigms were generated which modern work continues to enhance. Grossberg's influence (Steve Grossberg and Gail Carpenter, 1988) founded a school of thought which explores resonating algorithms; they developed the ART (Adaptive Resonance Theory) networks based on biologically plausible models. Anderson and Kohonen developed associative techniques independently of each other. Klopf (A. Henry Klopf), in 1972, developed a basis for learning in artificial neurons based on a biological principle for neuronal learning called heterostasis.


Page 49: Fundamentals of Neural Networks

Werbos (Paul Werbos, 1974) developed and used the back-propagation learning method; however, several years passed before this approach was popularized. Back-propagation nets are probably the most well-known and widely applied of the neural networks today. In essence, the back-propagation net is a Perceptron with multiple layers, a different threshold function in the artificial neuron, and a more robust and capable learning rule. Amari (Shun-Ichi Amari, 1967) was involved with theoretical developments: he published a paper which established a mathematical theory for a learning basis (error-correction method) dealing with adaptive pattern classification. Fukushima (Kunihiko Fukushima) developed a stepwise-trained multilayered neural network for interpretation of handwritten characters. The original network was published in 1975 and was called the Cognitron.


Page 50: Fundamentals of Neural Networks

Re-Emergence: Progress during the late 1970s and early 1980s was important to the re-emergence of interest in the neural network field. Several factors influenced this movement.

For example, comprehensive books and conferences provided a forum for people in diverse fields with specialized technical languages, and the response to conferences and publications was quite positive. The news media picked up on the increased activity, and tutorials helped disseminate the technology. Academic programs appeared and courses were introduced at most major universities (in the US and Europe). Attention is now focused on funding levels throughout Europe, Japan, and the US, and as this funding becomes available, several new commercial ventures with applications in industry and financial institutions are emerging.


Page 51: Fundamentals of Neural Networks

Today: Significant progress has been made in the field of neural networks, enough to attract a great deal of attention and to fund further research.

Advancement beyond current commercial applications appears to be possible, and research is advancing the field on many fronts.

Neurally based chips are emerging, and applications to complex problems are developing.

Clearly, today is a period of transition for neural network technology.


Page 52: Fundamentals of Neural Networks

FEW APPLICATIONS OF NEURAL NETWORKS


Page 53: Fundamentals of Neural Networks

We Discussed (Unit I)

Introduction: Concepts of neural networks, Characteristics of Neural Networks, Historical Perspective, and Applications of Neural Networks.

Fundamentals of Neural Networks: The biological prototype, Neuron concept, Single-layer Neural Networks, Multi-Layer Neural Networks, Terminology, Notation and representation of Neural Networks, Training of Artificial Neural Networks.

Representation of the perceptron and issues, perceptron learning and training, Classification, Linear Separability.


Page 54: Fundamentals of Neural Networks

Thanks!

Gagan [email protected]

9416011599
