Line Follower using Neural Nets: Part 2 (Designing the Neural Net)

This post is the second in the series on developing a neural-net line follower. For the post covering dataset generation, go here.

Once the data is handy, the next objective is to plan out the structure of the neural net.
Neural nets are mostly used for uncertain and non-linear problems, such as our task of pattern matching, a.k.a. classification.

A neural net consists mainly of the following layers, which map the input to the output:
  • Input Layer
  • Hidden Layer(s)
  • Output Layer
The number of hidden layers depends on the complexity of the task being performed. For example, a deep-learning net might contain many hidden layers to break the data down into progressively more abstract features. However, there is a trade-off between depth and speed of execution, so for our case a single hidden layer is enough.
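(In the MATLAB toolbox used later in this post, depth is just a constructor argument. The snippet below is only a sketch to illustrate the trade-off, not code used in this project.)

 net_shallow = fitnet(2);         % one hidden layer of 2 neurons -- our case
 net_deep    = fitnet([10 8 4]);  % three hidden layers: more expressive, but slower to train and run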

The designed network is depicted here:

[Figure: an 8-to-1 mapping through a hidden layer of two nodes (and one bias unit)]

Intuitively, this should suffice for our task: splitting the strip in half makes it easy to tell which side of the sensor strip sees the line.

For details of what a neural network is, I would recommend going through the following site: Neural Networks and Deep Learning

For the activation function in our network, I have used the tangent sigmoid (tansig) function:
tansig(n) = 2 / (1 + e^(-2n)) - 1
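This is numerically identical to the hyperbolic tangent, tanh(n); a quick MATLAB check (illustrative):

 n = -3:0.5:3;
 max(abs((2./(1 + exp(-2*n)) - 1) - tanh(n)))   % difference on the order of 1e-16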
For faster development, the neural net designer tool in MATLAB is used. For the sake of understanding, an algorithmic view is shown below:

x   = 1x8;                           {input from the 8 sensors}
W12 = 8x2;                           {weight matrix, input layer to hidden layer}
Z1  = x*W12  = (1x8 * 8x2) = 1x2;    {mapping from input layer to hidden layer}
A1  = TanSig(Z1) = 1x2;              {activation of hidden layer}
W23 = 2x1;                           {weight matrix, hidden layer to output layer}
Z2  = A1*W23 = (1x2 * 2x1) = 1x1;    {mapping from hidden layer to output layer}
A2  = Z2;                            {output node is linear (purelin) in the exported script below}
Y   = A2;                            {output}

(Biases are omitted here for clarity; they appear as b1 and b2 in the exported script, which stores the transposed weights and works with column vectors.)
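As a sanity check of these dimensions, here is a minimal MATLAB sketch of the same forward pass with placeholder (random, untrained) weights; only the shapes matter here:

 x   = [0 0 0 1 1 0 0 0];              % 1x8 sensor reading (illustrative pattern)
 W12 = randn(8,2);                     % input -> hidden weights (placeholder)
 W23 = randn(2,1);                     % hidden -> output weights (placeholder)
 tansig_fn = @(n) 2./(1 + exp(-2*n)) - 1;
 A1 = tansig_fn(x*W12);                % 1x2 hidden-layer activations
 Y  = A1*W23                           % 1x1 output (linear output node)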

The specifications are set up in the NNFit tool in MATLAB, and the data obtained from the previous post is used to train the network with the backpropagation algorithm. After training completes, the entire process is exported as a script from the prompt window.
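For reference, a rough command-line equivalent of that GUI workflow is sketched below; the variable names sensorPatterns and targets are assumptions standing in for the Part 1 dataset:

 X = sensorPatterns';           % 8xN input patterns from the Part 1 dataset (name assumed)
 T = targets';                  % 1xN desired outputs: -1, ~0 or +1 (name assumed)
 net = fitnet(2);               % one hidden layer of two tansig neurons
 net = train(net, X, T);        % backpropagation (Levenberg-Marquardt by default)
 genFunction(net, 'NNLF');      % export the trained net as a standalone function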

By default, the generated script contains a lot of redundant code, which can be trimmed on examination.
The entire script reduces to the following:

 function [y1] = NNLF(x1)
 %NNLF neural network simulation function.
 %
 % Generated by Neural Network Toolbox function genFunction, 13-Aug-2016 14:40:30.
 %
 % [y1] = NNLF(x1) takes these arguments:
 %   x1 = Qx8 matrix, input #1
 % and returns:
 %   y1 = Qx1 matrix, output #1
 % where Q is the number of samples.
 %#ok<*RPMT0>
 % (c) Sanjeev Tripathi ( AlphaDataOne.blogspot.in )
 % ===== NEURAL NETWORK CONSTANTS =====
 % Layer 1
 b1 = [-8.8516132798193108e-10;2.1615423176361062];
 IW1_1 = [-30.312171052276302 -15.230142543653209 -7.5989117276441904 -3.8426480381828529 3.8426480396510354 7.5989117277872076 15.230142543113857 30.312171051352024;1.1787493338995503 1.1684723902794487 -0.30187584946551604 -1.2505266965306716 -0.85655951742083458 -0.61361689937359887 -0.51938433720151178 0.43601182390986715];
 % Layer 2
 b2 = -3.5519126834821509e-10;
 LW2_1 = [1.0000000099939996 2.6043564908190074e-09];
 % ===== SIMULATION ========
 Q = size(x1,1);                      % number of samples (rows of the input)
 xp1 = cast(x1','double');            % samples as columns (8xQ); cast before scaling
 xp1 = 2*xp1 - 1;                     % rescale {0,1} sensor readings to [-1,+1]
 a1 = tansig_apply(b1 + IW1_1*xp1);   % hidden-layer activations (2xQ)
 a2 = repmat(b2,1,Q) + LW2_1*a1;      % linear output layer (1xQ)
 y1 = a2';                            % return as Qx1, matching the header comment
 end
 % Sigmoid Symmetric Transfer Function
 function a = tansig_apply(n,~)
 a = 2 ./ (1 + exp(-2*n)) - 1;
 end

When run, the neural net uses the pre-optimised weights to map the input to the output with 100% accuracy; since the dataset covered every possible input, this is effectively memorisation rather than generalisation, which is fine for our purpose. A demo of the net for a few manually entered inputs is shown below:
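(The exact patterns below are illustrative; which bit value means "sensor over the line" follows the dataset convention from Part 1.)

 NNLF([1 1 1 0 0 0 0 0])   % line under the left sensors  -> expect an output near -1 (move left)
 NNLF([0 0 0 0 0 1 1 1])   % line under the right sensors -> expect an output near +1 (move right)
 NNLF([0 0 0 1 1 0 0 0])   % line centred under the strip -> expect an output near 0 (go straight)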


As stated earlier,
  • -1 means move left
  • +1 means move right
  • ~0 means keep moving forward

The network can handle a line of any width, orientation, or position with full accuracy. It can also cope with multiple lines appearing simultaneously, since it has been trained on every possible input it can encounter.

An implementation on the AVR ATmega32 microcontroller will be covered soon.
Stay tuned for more.

Peace Out

Used:
MATLAB R2016a