
What exactly is "Perception" with ANNs?

Started by February 19, 2005 02:39 AM
3 comments, last by Timkin 20 years ago
Hi, in a book I'm reading the author gives an example of "perception" (below), but I still don't understand what exactly he means by "perception" and what it has to do with ANNs.
Quote:
Cash Register Game
A contestant on The Price is Right is sometimes asked to play the Cash Register Game. A few products are described, their prices are unknown to the contestant, and the contestant has to declare how many units of each item he or she would like to (pretend to) buy. If the total purchase does not exceed the amount specified, the contestant wins a special prize. After the contestant announces how many units of a particular product he or she wants, the price of that product is revealed and rung up on the cash register. The contestant must be careful, in this case, that the total does not exceed some nominal value, in order to earn the associated prize.

We can now cast the whole operation of this game in terms of a neural network, called a Perceptron, as follows. Consider each product on the shelf to be a neuron in the input layer, with its input being the unit price of that product. The cash register is the single neuron in the output layer. The only connections in the network are between each of the neurons (the products displayed on the shelf) in the input layer and the output neuron (the cash register). This arrangement, in which several input neurons all feed one neuron (here, the cash register), is referred to as an instar in neural network terminology.

The contestant actually determines these connections, because when the contestant says he or she wants, say, five of a specific product, the contestant is thereby assigning a weight of 5 to the connection between that product and the cash register. The total bill for the purchases by the contestant is nothing but the weighted sum of the unit prices of the different products offered. For those items the contestant does not choose to purchase, the implicit weight assigned is 0. The application of the dollar limit to the bill is just the application of a threshold, except that the threshold value should not be exceeded for the outcome from this network to favor the contestant, winning him or her a good prize. In a Perceptron, the way the threshold works is that an output neuron is supposed to fire if its activation value exceeds the threshold value.
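From what I can gather, the analogy boils down to a weighted sum of the inputs compared against a threshold. Here's my own rough sketch of that idea (not the book's code), using the first weight and input vectors from further down:

#include <iostream>

// Rough sketch of the Cash Register analogy: the prices are the inputs, the
// quantities the contestant picks are the weights, and the dollar limit is
// the threshold.
int main()
{
    const int   kProducts           = 4;
    float       price[kProducts]    = {1.95f, 0.27f, 0.69f, 1.25f}; // inputs
    float       quantity[kProducts] = {2.0f, 3.0f, 3.0f, 2.0f};     // weights
    const float kLimit              = 7.0f;                         // threshold

    // The "cash register" neuron: weighted sum of the unit prices.
    float total = 0.0f;
    for (int i = 0; i < kProducts; ++i)
        total += quantity[i] * price[i];

    // The perceptron "fires" (outputs 1) if the activation reaches the threshold.
    int output = (total >= kLimit) ? 1 : 0;

    std::cout << "total (activation) = " << total
              << ", output = " << output << "\n";
    return 0;
}

If I've got it right, this prints a total of 9.28 and an output of 1, which matches the book's first run below.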
He says there are 4 input neurons and one output neuron. He then gives an example C++ program, which is so badly written that I can't work out what it has to do with whatever perception is.
Quote:
C++ Implementation of Perceptron Network
In our C++ implementation of this network, we have separate classes for input neurons and output neurons. The ineuron class is for the input neurons and has weight and activation as data members. The oneuron class is similar and is for the output neuron; it is declared as a friend class in the ineuron class and also has a data member called output. There is a network class, which is a friend class in the oneuron class. An instance of the network class is created with four input neurons, all of which are connected to one output neuron.

The member functions of the ineuron class are: (1) a default constructor, (2) a second constructor that takes a real number as an argument, and (3) a function that calculates the output of the input neuron. The constructor taking one argument uses that argument to set the weight on the connection between the input neuron and the output neuron. The functions that determine the neuron activations and the network output are declared public; the activations of the neurons are calculated with functions defined in the neuron classes. A threshold value is used by a member function of the output neuron to determine whether the neuron's activation is large enough for it to fire, giving an output of 1.
Here's the source if you're interested (class declarations and includes put at the top so the whole thing is in one place):

#include <iostream>
#include <cstdio>
#include <cstdlib>
using namespace std;

// Class declarations: each ineuron is one input neuron holding the weight on
// its connection to the single output neuron (oneuron); network ties four
// input neurons to the one output neuron.
class ineuron
{
protected:
    float weight;
    float activation;
    friend class oneuron;
public:
    ineuron() {}
    ineuron(float j);
    float act(float x);
};

class oneuron
{
protected:
    float activation;
    int output;
    friend class network;
public:
    void actvtion(float *inputv, ineuron *nrn);
    int outvalue(float j);
};

class network
{
public:
    ineuron nrn[4];
    oneuron onrn;
    network(float, float, float, float);
};

// constructor: the argument sets the weight on this input neuron's
// connection to the output neuron
ineuron::ineuron(float j)
{
    weight = j;
}

// activation of an input neuron: its input signal times its weight
float ineuron::act(float x)
{
    return x * weight;
}

// output neuron activation: the weighted sum of all four input signals
void oneuron::actvtion(float *inputv, ineuron *nrn)
{
    activation = 0;

    for (int i = 0; i < 4; i++)
    {
        cout << "\nweight for neuron " << i + 1 << " is       " << nrn[i].weight;
        nrn[i].activation = nrn[i].act(inputv[i]);
        cout << "           activation is      " << nrn[i].activation;
        activation += nrn[i].activation;
    }
    cout << "\n\nactivation is  " << activation << "\n";
}

// the output neuron fires (returns 1) only if its activation
// reaches the threshold value j
int oneuron::outvalue(float j)
{
    if (activation >= j)
    {
        cout << "\nthe output neuron activation exceeds the threshold value of " << j << "\n";
        output = 1;
    }
    else
    {
        cout << "\nthe output neuron activation is smaller than the threshold value of " << j << "\n";
        output = 0;
    }

    cout << " output value is " << output;
    return output;
}

// the network constructor sets the weight on each of the four
// input-to-output connections
network::network(float a, float b, float c, float d)
{
    nrn[0] = ineuron(a);
    nrn[1] = ineuron(b);
    nrn[2] = ineuron(c);
    nrn[3] = ineuron(d);
    onrn.activation = 0;
    onrn.output = 0;
}
int main(int argc, char *argv[])
{
    // sample values; overwritten by the data read from the files
    float inputv1[] = {1.95f, 0.27f, 0.69f, 1.25f};
    float wtv1[]    = {2, 3, 3, 2};
    int vecnum = 0, i;
    float threshold = 7.0f;

    if (argc < 3)
    {
        cerr << "Usage: percept Weightfile Inputfile";
        exit(1);
    }

    // open the weight file and the input file
    FILE *wfile = fopen(argv[1], "r");
    FILE *infile = fopen(argv[2], "r");

    if ((wfile == NULL) || (infile == NULL))
    {
        cout << " Can't open a file\n";
        exit(1);
    }

    cout << "\nTHIS PROGRAM IS FOR A PERCEPTRON NETWORK WITH AN INPUT LAYER OF";
    cout << "\n4 NEURONS, EACH CONNECTED TO THE OUTPUT NEURON.\n";
    cout << "\nTHIS EXAMPLE TAKES REAL NUMBERS AS INPUT SIGNALS\n";

    cout << "please enter the number of weights/vectors \n";
    cin >> vecnum;

    // for each vector: read a weight vector and an input vector from the
    // files, create the network (its constructor calls the input neuron
    // constructor once per input neuron), then compute the output neuron's
    // activation and compare it to the threshold
    for (i = 1; i <= vecnum; i++)
    {
        fscanf(wfile, "%f %f %f %f\n", &wtv1[0], &wtv1[1], &wtv1[2], &wtv1[3]);
        network h1(wtv1[0], wtv1[1], wtv1[2], wtv1[3]);
        fscanf(infile, "%f %f %f %f \n",
               &inputv1[0], &inputv1[1], &inputv1[2], &inputv1[3]);
        cout << "this is vector # " << i << "\n";
        cout << "please enter a threshold value, eg 7.0\n";
        cin >> threshold;
        h1.onrn.actvtion(inputv1, h1.nrn);
        h1.onrn.outvalue(threshold);
        cout << "\n\n";
    }

    fclose(wfile);
    fclose(infile);
    return 0;
}


Here are the weights and inputs (the program is run as: percept Weightfile Inputfile):

weights (Weightfile)
  2.0 3.0 3.0 2.0
  3.0 0.0 6.0 2.0

inputs (Inputfile)
  1.95 0.27 0.69 1.25
  0.30 1.05 0.75 0.19

This is the program's output:
Quote:
THIS PROGRAM IS FOR A PERCEPTRON NETWORK WITH AN INPUT LAYER OF
4 NEURONS, EACH CONNECTED TO THE OUTPUT NEURON.
THIS EXAMPLE TAKES REAL NUMBERS AS INPUT SIGNALS
please enter the number of weights/vectors
2
this is vector # 1
please enter a threshold value, eg 7.0
7.0
weight for neuron 1 is 2       activation is 3.9
weight for neuron 2 is 3       activation is 0.81
weight for neuron 3 is 3       activation is 2.07
weight for neuron 4 is 2       activation is 2.5
activation is 9.28
the output neuron activation exceeds the threshold value of 7
output value is 1
this is vector # 2
please enter a threshold value, eg 7.0
7.0
weight for neuron 1 is 3       activation is 0.9
weight for neuron 2 is 0       activation is 0
weight for neuron 3 is 6       activation is 4.5
weight for neuron 4 is 2       activation is 0.38
activation is 5.78
the output neuron activation is smaller than the threshold value of 7
output value is 0

Finally, try adding a data vector of (1.4, 0.6, 0.35, 0.99) to the data file. Add a weight vector of (2, 6, 8, 3) to the weight file and use a threshold value of 8.25 to see the result. You can use other values to experiment also.
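If I'm reading the code right, that suggested extra vector would work out to:

  1.4*2 + 0.6*6 + 0.35*8 + 0.99*3 = 2.8 + 3.6 + 2.8 + 2.97 = 12.17

which is above the 8.25 threshold, so presumably the output would be 1.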
If anyone can help me make sense of what exactly the author is doing, I'd be grateful. Thanks.
The AP is a little off in his/her reply, because perceptrons can be used in multiple layers and for non-linear results. A very good article about them can be found at http://generation5.org/content/1999/perceptron.asp
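For example, with hand-picked weights a second layer of the same kind of threshold unit can compute XOR, which no single-layer perceptron can. A quick sketch of my own (not from that article):

#include <iostream>

// step activation: fire (1) if the weighted sum reaches the threshold
static int step(float sum, float threshold)
{
    return (sum >= threshold) ? 1 : 0;
}

int main()
{
    // Two layers of threshold units with hand-picked weights computing XOR.
    for (int x1 = 0; x1 <= 1; ++x1)
        for (int x2 = 0; x2 <= 1; ++x2)
        {
            int h1  = step(float(x1 + x2), 0.5f);        // hidden unit: x1 OR x2
            int h2  = step(float(x1 + x2), 1.5f);        // hidden unit: x1 AND x2
            int out = step(float(h1 - h2), 0.5f);        // output: h1 AND NOT h2
            std::cout << x1 << " XOR " << x2 << " = " << out << "\n";
        }
    return 0;
}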
(http://www.ironfroggy.com/)(http://www.ironfroggy.com/pinch)
There is a vast difference between the meaning of perception and the term perceptron; don't get them confused just because they look similar. Perception is the general process of interpreting sensory input, while a perceptron is just the specific weighted-sum-plus-threshold unit described above.

