NN Lecture 1


  • 8/2/2019 NN Lecture 1

    1/20

Neural Networks

and

Its Applications

    By

    Dr. Surya Chitra


OUTLINE

    Introduction & Software

Basic Neural Network & Processing

Software Exercise Problem/Project

Complementary Technologies

Genetic Algorithms

    Fuzzy Logic

    Examples of Applications

    Manufacturing R&D

    Sales & Marketing

    Financial


Introduction

What is a Neural Network?

"A computing system made up of a number of highly interconnected processing elements, which processes information by its dynamic state response to external inputs."

Dr. Robert Hecht-Nielsen

A parallel information processing system based on the human nervous system, consisting of a large number of neurons, which operate in parallel.


Biological Neuron & Its Function

    Information Processed in Neuron Cell Body and

    Transferred to Next Neuron via Synaptic Terminal


Processing in Biological Neuron

    Neurotransmitters Carry information to Next Neuron and

    It is Further Processed in Next Neuron Cell Body


Artificial Neuron & Its Function

(Diagram: Dendrites → Inputs, Neuron Cell Body → Processing Element, Axon → Outputs)


Processing Steps Inside a Neuron

Electronic Implementation

Processing Element: Inputs → Outputs

Summed Inputs: Sum, Min, Max, Mean, OR/AND

Add Weight & Bias

Transform: Sigmoid, Hyperbolic Tangent, Sine, Linear
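The processing steps above (weighted inputs combined, a bias added, then a transfer function applied) can be sketched as a single processing element. This is a minimal illustration, not the lecture's software; `processing_element`, its defaults, and the `combine`/`transform` parameters are my own choices.

```python
import math

def processing_element(inputs, weights, bias, combine=sum, transform=math.tanh):
    # Combine the weighted inputs (sum by default; min/max/mean also fit
    # the slide's list of summation choices), then shift by the bias.
    combined = combine(w * x for w, x in zip(weights, inputs)) + bias
    # Apply the transfer function (hyperbolic tangent by default).
    return transform(combined)
```

Passing `combine=max` mimics the slide's Max summation, and `transform=lambda s: s` gives the linear transfer.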


Sigmoid Transfer Function

(Plot: sigmoid f(X) and its derivative f'(X) over X in [-10, 10]; f(X) rises from 0 to 1)

Transfer Function = 1 / (1 + e^(-sum))
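The transfer function above, f(x) = 1 / (1 + e^(-x)), and its derivative f'(x) = f(x) * (1 - f(x)) (the f'(X) curve in the plot) can be written directly; the function names are my own:

```python
import math

def sigmoid(x):
    # f(x) = 1 / (1 + e^(-x)): squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # f'(x) = f(x) * (1 - f(x)): used later when back-propagating errors
    s = sigmoid(x)
    return s * (1.0 - s)
```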


    Basic Neural Network & Its Elements

Input Neurons

Hidden Neurons

Output Neurons

Bias Neurons

Clustering of Neurons


Back-Propagation Network: Forward Output Flow

    Random Set of Weights Generated

    Send Inputs to Neurons

Each Neuron Computes Its Output

Calculate Weighted Sum: I_j = Σ_i ( W_i,j-1 * X_i,j-1 ) + B_j

Transform the Weighted Sum: X_j = f(I_j) = 1 / (1 + e^-(I_j + T))

    Repeat for all the Neurons
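The forward output flow can be sketched end to end, assuming a fully connected layer and the sigmoid transform from the previous slide; the layer sizes, the random seed, and the name `forward_layer` are illustrative choices, not from the lecture software.

```python
import math
import random

def forward_layer(inputs, weights, biases):
    # One layer of the forward pass: for each neuron j,
    # I_j = sum_i W_ij * x_i + B_j, then X_j = 1 / (1 + e^(-I_j)).
    outputs = []
    for neuron_weights, bias in zip(weights, biases):
        total = sum(w * x for w, x in zip(neuron_weights, inputs)) + bias
        outputs.append(1.0 / (1.0 + math.exp(-total)))
    return outputs

# "Random Set of Weights Generated": a layer with 3 inputs and 2 neurons
random.seed(0)
weights = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
biases = [0.0, 0.0]
```

Repeating `forward_layer` layer by layer, feeding each layer's outputs into the next, gives the full forward output flow.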


Back-Propagation Network: Backward Error Propagation

    Errors are Propagated Backwards

    Update the Network Weights

Gradient Descent Algorithm

ΔWji(n) = F * Hj * Xi

Wji(n+1) = Wji(n) + ΔWji(n)

Add Momentum for Convergence

ΔWji(n) = F * Hj * Xi + E * ΔWji(n-1)

    Where n = Iteration Number; F = Learning Rate

    E = Rate of Momentum (0 to 1)
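The update rule with momentum can be expressed as a small helper. The slide only fixes the form ΔWji(n) = F * Hj * Xi + E * ΔWji(n-1); the name `update_weight` and the default rates are illustrative assumptions.

```python
def update_weight(w, error_term, activation, prev_delta,
                  learning_rate=0.1, momentum=0.9):
    # dW(n) = F * H_j * X_i + E * dW(n-1), where F is the learning rate
    # and E (0 to 1) is the momentum rate carrying over the last step.
    delta = learning_rate * error_term * activation + momentum * prev_delta
    # W(n+1) = W(n) + dW(n); return dW(n) so the caller can reuse it
    # as prev_delta on the next iteration.
    return w + delta, delta
```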


Back-Propagation Network: Backward Error Propagation

    Gradient Descent Algorithm

Minimization of Mean Squared Errors

Shape of Error Surface: Complex, Multidimensional, Bowl-Shaped, with Hills and Valleys

Training by Iterations

Finding the Global Minimum is Challenging


Simple Transfer Functions


Recurrent Neural Network

(Diagram legend: Input Unit, Bias Unit, Computation Node, Context Unit)


Time Delay Neural Network

(Diagram legend: Input Unit, Bias Unit, Computation Node, Higher Order Unit)


Training - Supervised

    Both Inputs & Outputs are Provided

    Designer Can Manipulate

    Number of Layers

Neurons per Layer

Connections Between Layers

    The Summation & Transform Function

    Initial Weights

Rules of Training: Back Propagation

    Adaptive Feedback Algorithm
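A minimal supervised-training sketch in the spirit of the above: both inputs and target outputs are provided, and gradient descent adjusts the weights to reduce the error. The single-neuron OR task, the name `train_or_neuron`, and the hyper-parameters are my own illustrative choices, not from the lecture.

```python
import math

def train_or_neuron(epochs=5000, lr=0.5):
    # Both inputs AND desired outputs are provided (supervised learning):
    # here, the logical OR function as a tiny training set.
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
    w, b = [0.0, 0.0], 0.0  # initial weights the designer can manipulate
    for _ in range(epochs):
        for x, target in data:
            out = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            # Gradient of the squared error through the sigmoid:
            # (out - target) * f'(I), with f'(I) = out * (1 - out)
            grad = (out - target) * out * (1 - out)
            w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
            b -= lr * grad
    return w, b
```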


Training - Unsupervised

    Only Inputs are Provided

    System has to Figure Out

    Self Organization

    Adaptation to Input Changes/Patterns

    Grouping of Neurons to Fields

    Topological Order

    Based on Mammalian Brain

Rules of Training: Adaptive Feedback Algorithm (Kohonen)

Topology: Map one space to another without changing geometric configuration
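Kohonen-style self-organization can be sketched for a 1-D map: only inputs are provided, and the best-matching unit plus its topological neighbours move toward each input, so nearby units come to represent nearby inputs. The function `som_step`, the 1-D map, and the rates are illustrative assumptions, not the lecture's algorithm.

```python
def som_step(units, x, lr=0.5, radius=1):
    # Unsupervised step: find the best-matching unit (BMU), the unit
    # whose weight is closest to the input x.
    bmu = min(range(len(units)), key=lambda i: abs(units[i] - x))
    for i in range(len(units)):
        # Only the BMU and its topological neighbourhood adapt,
        # preserving the map's geometric configuration.
        if abs(i - bmu) <= radius:
            units[i] += lr * (x - units[i])
    return units
```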


Traditional Computing Vs. NN Technology

CHARACTERISTICS      TRADITIONAL COMPUTING          ARTIFICIAL NEURAL NETWORKS

PROCESSING STYLE     Sequential                     Parallel

FUNCTIONS            Logically, via Rules,          Mapping, via Images,
                     Concepts, Calculations         Pictures, and Controls

LEARNING METHOD      By Rules                       By Example

APPLICATIONS         Accounting, Word Processing,   Sensor Processing, Speech Recognition,
                     Communications, Computing      Pattern Recognition, Text Recognition


Traditional Computing Vs. NN Technology

CHARACTERISTICS      TRADITIONAL COMPUTING          ARTIFICIAL NEURAL NETWORKS

PROCESSORS           VLSI - Traditional             ANN & Other Technologies

APPROACH             One Rule at a Time,            Multiple Processing,
                     Sequential                     Simultaneous

CONNECTIONS          Externally Programmable        Dynamically Self-Programmable

LEARNING             Algorithmic                    Continuously Adaptable

FAULT TOLERANCE      None                           Significant, via Neurons

PROGRAMMING          Rule-Based                     Self-Learning

ABILITY TO TEST      Need Big Processors            Require Multiple Custom-Built Chips


HISTORY OF NEURAL NETWORKS

TIME PERIOD    NEURAL NETWORK ACTIVITY

Early 1950s    IBM fails to simulate the human thought process;
               traditional computing progresses rapidly

1956           Dartmouth Research Project on AI

1959           Stanford: Bernard Widrow's ADALINE/MADALINE,
               the first NN applied to a real-world problem

1960s          PERCEPTRON, Cornell neurobiologist (Rosenblatt)

1982           Hopfield (CalTech) models the brain for devices;
               Japanese 5th-Generation Computing

1985           NN conference by IEEE, responding to the Japanese threat

1989           US Defense sponsors several projects

Today          Several commercial applications; still processing limitations;
               chips (digital, analog, & optical)