AI robot project

From BenningtonWiki

For my (Max Alexander's) senior project this year, I am creating an autonomous navigating robot controlled by a neural-network-based, artificially intelligent control program. The robot itself will be very simple, consisting of little more than a set of motors and cameras, and the sensory input it receives (primarily camera images) will be complex and ambiguous; however, drawing on the emerging discipline of "Embodied AI," I believe it is feasible (for an undergraduate student) to design a control system that produces basic intelligent behaviors even under these conditions.

In the end, the robot will hopefully be capable of behaviors such as navigating a complex environment without collisions, seeking out and approaching certain target objects, and even collecting target objects and placing them in varied locations. Furthermore, these behaviors will not be directly programmed (which would be a tedious and complicated exercise) but will instead emerge, after a period of learning, from the initially unstructured behavior of the robot itself (a method that is less complicated to implement, and potentially applicable to practical problems in robotics).

In order to learn from its experience in the world, the robot will be controlled by adaptive neural network programs: software that mimics (to some degree) the functioning of the neural networks in animal brains, producing systems that can self-organize to correlate a set of input/output responses that are rewarded by their environment; in other words, systems that "learn" adaptive behavior. The capacity of a single neural network to learn a complex task is limited (otherwise AI in general would have been solved long ago), but I believe it is possible to create networks of neural networks, each tackling one part of the overall problem of a complex behavior.



As of 6/6/9, the neural network software is essentially complete, amounting to about a thousand lines of original code, which is available below. Further modifications will be made to increase the efficiency of neural network adaptivity, but this largely amounts to tweaking the program's constants (or devising functions to modify those constants dynamically). Note also that, in the end, I divided my efforts between two specific algorithms for synaptic adaptivity: a more complicated model that was my initial goal (referred to in comments as the "Izhi" model, after Eugene Izhikevich, the scientist who described it), and a simpler model that may be closer to what I actually want from the system (referred to in comments as the "Grand" model, after Steve Grand, the game designer who described it). It can be shown (though the output of the program is currently pretty inscrutable to anyone but me) that under the proper conditions, networks of both types as implemented here can "learn" simple input/output/reward correlations.

<source lang='c'>
// SPIKE TIMING DEPENDENT PLASTICITY NEURAL NETWORK BASED BEHAVIOR SHAPING ARCHITECTURE PLATFORM
// (STDPNNBBSAP)

#include <stdio.h>
#include <stdlib.h>
#include <assert.h>

#define FUBAR (1 == 0)

#define INITSTR 300     //The starting strength of all (?) synapses.
#define THRESHOLD 1000  //The threshold voltage of all (?) neurons.
#define CONNECT 5       //The connectivity of all (?) neurons.
#define LTPUP 75
#define LTPEXP .75
#define RELAXRATE .9
#define LTPRELAX 100
#define PRE 1
#define POST 1
#define VOLTDESCEND .25
#define SYNCLAMP 800    //This is arbitrary in the Grand model. 800 works.
#define LOWSTR 200
#define STRDECAY 1.01
#define STRCAP 10000

void testModularity();
void confuzzleMod();
void searchForId();
void testPiping();
void learningNet();
void comboTest();
int iterateNeuron();

long idcount;

int main() {

    printf("\n");

    //srand(32984111);
    srand(123456);

    //testNeurons();
    //saveTest();
    //testModularity();
    //testPiping();

    learningNet();

    //comboTest();
    //dualNeuronTest();

    return 0;
}

struct synapse {
    struct neuron *input;  //Points to "owner" neuron, for purposes of pre-post calc.
    struct neuron *output; //Points to target neuron.
    long outid;
    float str;
    float ltp;             //Str += (ltp * reward) is the timestep relation... though this overgeneralizes
};                         //the mechanism; negative ltp and punishment don't ordinarily increase str.

struct neuron {
    struct synapse *axon;    //This should be a pointer to a malloc'd array of length synpop.
    int volt;                //Excitation, but sounds less like an innuendo.
    int input;
    int synpop;
    unsigned int firedtimer; //Cycles since last firing event.
    long id;
};

struct layer { //I'm pretty sure this is actually the efficient way to manage layer structure.
    struct neuron *neurons;
    int pop;
    int type;  //This points to the target layer; should be sufficient for now.
    float goodsyns;
    float decayrate;
    int synpop;
};

struct netMod {
    struct layer *brains;
    int pop;
    struct neuron *inputs;
    struct neuron *outputs;
    int dopamine;
    int inpop;
    int outpop;
    int inid;
    int outid;
    int synpop;
};

struct pipe {
    struct neuron *input;
    struct neuron *outputs;
    int outnum;
    char *types;
    int id;
};

struct synapse synapize(struct neuron *in, struct neuron *out, float strength) {
    struct synapse syn;
    syn.input = in;
    syn.output = out;
    syn.str = strength;
    syn.ltp = 0;

    return syn; //This feels so LISPy.
}

struct neuron neuralize(int synp) {
    struct neuron neu;
    neu.volt = 5;       //Check, just for testing.
    neu.input = 0;
    neu.synpop = synp;
    neu.firedtimer = 5; //This will cause problems, check this. Initialize randomly?
    neu.id = idcount++;

    struct synapse *targets = (malloc (sizeof(struct synapse) * neu.synpop)); //Yay malloc!

    int ncount;
    for (ncount = 0; ncount < neu.synpop; ncount++) {
        targets[ncount] = synapize(&neu, NULL, (INITSTR + ((rand() % 10) - 5))); //Does not set output pointer! Check it!
    }

    neu.axon = targets; //This should be a pointer to an array, or somecrap

    return neu;
}


struct layer layerize(int capacity, int specs) {
    struct layer laer;
    laer.pop = capacity;
    laer.type = specs;
    laer.goodsyns = capacity * CONNECT;
    laer.decayrate = 1.0;

    struct neuron *neuria = (malloc (sizeof(struct neuron) * capacity));

    int lcount;
    for (lcount = 0; lcount < capacity; lcount++) {
        neuria[lcount] = neuralize(CONNECT); //This can be replaced by a specs-based scheme.
    }

    laer.neurons = neuria;

    return laer;
}

struct netMod modulate(int numoflayers, int *layerlengths, int *layertypes, int inlayer, int outlayer) {
    struct netMod module;
    module.pop = numoflayers;

    struct layer *laiers = (malloc (sizeof(struct layer) * numoflayers));

    int mcount;
    for (mcount = 0; mcount < numoflayers; mcount++) {
        laiers[mcount] = layerize(layerlengths[mcount], layertypes[mcount]);
    }

    module.inputs = laiers[inlayer].neurons;
    module.inpop = layerlengths[inlayer];
    module.outputs = laiers[outlayer].neurons;
    module.outpop = layerlengths[outlayer];

    module.inid = inlayer;
    module.outid = outlayer;

    module.brains = laiers;
    module.dopamine = 0;
    module.synpop = 0;

    int i = 0;
    int n = 0;
    for (i = 0; i < module.pop; i++) {
        for (n = 0; n < module.brains[i].pop; n++) {
            module.synpop += CONNECT;
        }
    }

    return module;
}

struct pipe pipify(struct neuron *inpt, struct neuron *outpts, int outputnum, char *typearray) {
    struct pipe tube;

    tube.input = inpt;
    tube.outputs = outpts;
    tube.outnum = outputnum;
    tube.types = typearray;

    return tube;
}

void cleanmod(struct netMod module) {
    int i = 0;
    int n = 0;

    for (i = 0; i < module.pop; i++) {
        for (n = 0; n < module.brains[i].pop; n++) {
            free(module.brains[i].neurons[n].axon);
        }
        free(module.brains[i].neurons);
    }

    free(module.brains);
}

int iterateSynapses(struct neuron* neu, int reward, float strdecay);

void iterateModule(struct netMod* net, int reward) {
    int i;
    int n;
    int firecount = 0;

    for (i = 0; i < net->pop; i++) {
        net->brains[i].goodsyns = 0;
        for (n = 0; n < net->brains[i].pop; n++) {
            firecount += iterateNeuron(&(net->brains[i].neurons[n]));
            net->brains[i].goodsyns += iterateSynapses(&(net->brains[i].neurons[n]), reward, net->brains[i].decayrate);
        }
        //printf("L%d F%d G%f S%f|", i, firecount, net->brains[i].goodsyns, net->brains[i].neurons[1].axon[1].str);
        printf("L%d F%d|", i, firecount);
        firecount = 0;

        net->brains[i].decayrate = .95 + ((net->brains[i].goodsyns / (net->brains[i].pop * CONNECT)) / 50);

        printf("%f|", net->brains[i].decayrate);
    }

    for (i = 0; i < net->pop; i++) {
        for (n = 0; n < net->brains[i].pop; n++) {
            if (net->brains[i].neurons[n].input <= SYNCLAMP) {
                net->brains[i].neurons[n].volt += net->brains[i].neurons[n].input;
            } else {
                net->brains[i].neurons[n].volt += SYNCLAMP;
            }
            net->brains[i].neurons[n].input = 0;
        }
    }
}

int iterateNeuron(struct neuron* neu) {
    int i = 0;
    int fired = 0;

    if (neu->volt > THRESHOLD) {
        if (neu->synpop != 0) {
            for (i = 0; i < neu->synpop; i++) {
                if (neu->axon[i].output != NULL) {
                    neu->axon[i].output->input += neu->axon[i].str; //Applies str to the input variable of the target neuron.
                } else { //(Check validity?)
                    assert(FUBAR);
                }
            }
        }

        neu->firedtimer = 0;
        neu->volt *= VOLTDESCEND;
        fired = 1;
        (neu->firedtimer)++;
    } else {
        neu->firedtimer++;
    }

    neu->volt *= RELAXRATE;
    if (neu->volt < 0) {
        neu->volt = 0;
    }

    return fired; //For testing purposes.
}

/*
int iterateSynapses(struct neuron* neu, int reward) { //Izhi model.
    int i = 0;
    int t = 0;
    int n = 0;

    if (neu->synpop != 0) {
        for (i = 0; i < neu->synpop; i++) {
            if (neu->axon[i].output != NULL) {
                neu->axon[i].str += neu->axon[i].ltp * reward; //Updates synapse str, just like that.

                if (neu->axon[i].str < 0) {
                    neu->axon[i].str = 0;
                }

                neu->axon[i].ltp /= LTPRELAX;

                t = (neu->firedtimer - neu->axon[i].output->firedtimer);
                //printf("T: %d\n", t);

                if ((neu->firedtimer == PRE) || (neu->axon[i].output->firedtimer == POST)) {
                    if (t <= 50 && t >= -50) {
                        if (t == 0) {
                            //Um, I'm not sure this should do anything.
                        } else if (t > 0) {
                            neu->axon[i].ltp += LTPUP / t;
                        } else if (t < 0) {
                            neu->axon[i].ltp += (LTPUP / t) * 1.5;
                        }
                    }
                }
            }
        }
    }
}
*/

int iterateSynapses(struct neuron* neu, int reward, float strdecay) { //Grand model.
    int i = 0;
    int upsyns = 0;

    if (neu->synpop != 0) {
        for (i = 0; i < neu->synpop; i++) {
            if (neu->axon[i].output != NULL) {
                if (neu->axon[i].str < (neu->axon[i].str *= strdecay)) assert(FUBAR);
                neu->axon[i].str += neu->axon[i].ltp * reward; //Updates synapse str, just like that.

                if (neu->axon[i].str < 0) {
                    neu->axon[i].str = 0;
                } else if (neu->axon[i].str > STRCAP) {
                    neu->axon[i].str = STRCAP;
                }

                if (neu->axon[i].str > LOWSTR) {
                    upsyns++;
                }

                neu->axon[i].ltp /= LTPRELAX;

                if (neu->firedtimer == 1) neu->axon[i].ltp += LTPUP;
            }
        }
    }

    //printf("%d", upsyns);
    return upsyns;
}

void initLayer(struct netMod* module, int layer) {
    int n;
    int z;

    if (module->brains[layer].type == -1) {
        for (n = 0; n < module->brains[layer].pop; n++) {
            module->brains[layer].neurons[n].synpop = 0;
        }
    } else {
        for (n = 0; n < module->brains[layer].pop; n++) {
            for (z = 0; z < module->brains[layer].neurons[n].synpop; z++) {
                module->brains[layer].neurons[n].axon[z].output =
                    &(module->brains[(module->brains[layer].type)].neurons[(rand() % (module->brains[(module->brains[layer].type)].pop))]); //HAHAHAHAHAHA!
            }
        }
    }
}

int remakeAxons(struct netMod* module, int layer) { //Remove some random synapses with very low strength and replace them with
    int n;                                          //new random synapses which could fare better.
    int z;

    if (module->brains[layer].type != -1) {
        for (n = 0; n < module->brains[layer].pop; n++) {
            int replacenum = 0;
            char bustedsynapses[module->brains[layer].neurons[n].synpop];

            for (z = 0; z < module->brains[layer].neurons[n].synpop; z++) {
                if (module->brains[layer].neurons[n].axon[z].str < LOWSTR) {
                    bustedsynapses[z] = 1;
                    replacenum++;
                } else {
                    bustedsynapses[z] = 0;
                }
            }

            if (replacenum > (module->brains[layer].neurons[n].synpop / 2)) { //Kinda arbitrarily.
                struct synapse *targets = (malloc (sizeof(struct synapse) * module->brains[layer].neurons[n].synpop));

                int i;
                for (i = 0; i < module->brains[layer].neurons[n].synpop; i++) { //Synpop stays the same for now.
                    assert(bustedsynapses[i] == 1 || bustedsynapses[i] == 0);
                    if (bustedsynapses[i]) {
                        targets[i] = synapize(&(module->brains[layer].neurons[n]),
                                              &(module->brains[(module->brains[layer].type)].neurons[(rand() % (module->brains[(module->brains[layer].type)].pop))]), /*And again! AAH-HAHAHAHAHAHA!*/
                                              (INITSTR + ((rand() % 10) - 5)));
                    } else {
                        targets[i] = module->brains[layer].neurons[n].axon[i]; //Keeping strong synapses.
                    }
                }
                free(module->brains[layer].neurons[n].axon);
                module->brains[layer].neurons[n].axon = targets;
            }
        }
    }

    return 0; //Nothing meaningful to report yet.
}

void confuzzleMod(struct netMod *module) {
    int i;
    long int totvolt = 0;
    int avgvolt;

    //Randomly excite a hundred neurons in the first layer.
    for (i = 0; i < 100; i++) {
        int g = (rand() % (module->brains[0].pop));
        module->brains[0].neurons[g].volt += 1100;
    }

    for (i = 0; i < module->outpop; i++) {
        totvolt += module->outputs[i].volt;
    }
    avgvolt = totvolt / (module->outpop); //Not yet fed back into the inputs.
    //printf("%ld %d\n", totvolt, avgvolt);
}

void saveMod(struct netMod* data, char* filename) {
    int i = 0;
    int n = 0;
    int z = 0;

    FILE *save = fopen(filename, "w");

    fprintf(save, "SHAPER NEURAL NETWORK SAVE FILE.\nPOP%dIN%dOUT%d\n", data->pop, data->inid, data->outid);
    for (i = 0; i < data->pop; i++) {
        fprintf(save, "L%dP%dT%d ", i, data->brains[i].pop, data->brains[i].type);

        for (n = 0; n < data->brains[i].pop; n++) {
            fprintf(save, "N%dY%dV%dF%dI%ld ", n, data->brains[i].neurons[n].synpop, data->brains[i].neurons[n].volt,
                    data->brains[i].neurons[n].firedtimer, data->brains[i].neurons[n].id);
            for (z = 0; z < data->brains[i].neurons[n].synpop; z++) {
                fprintf(save, "S%fL%fO%ld ", data->brains[i].neurons[n].axon[z].str, data->brains[i].neurons[n].axon[z].ltp,
                        data->brains[i].neurons[n].axon[z].output->id);
            }
            fprintf(save, "DN");
        }
        fprintf(save, "DL");
    }
    fprintf(save, ".");

    fclose(save);
}

struct netMod loadMod(char* filename) {
    int i = 0;
    int n = 0;
    int z = 0;
    int c = 0;
    int x = 0;

    FILE *load;
    struct netMod mod;

    load = fopen(filename, "r");

    fscanf(load, "SHAPER NEURAL NETWORK SAVE FILE.\nPOP%dIN%dOUT%d\n", &mod.pop, &mod.inid, &mod.outid);
    mod.brains = (malloc (sizeof(struct layer) * mod.pop));

    for (i = 0; i < mod.pop; i++) {
        fscanf(load, "L%dP%dT%d ", &c, &mod.brains[i].pop, &mod.brains[i].type);

        mod.brains[i].neurons = (malloc ((sizeof(struct neuron)) * mod.brains[i].pop));

        for (n = 0; n < mod.brains[i].pop; n++) {
            fscanf(load, "N%dY%dV%dF%dI%ld ", &c, &mod.brains[i].neurons[n].synpop, &mod.brains[i].neurons[n].volt,
                   &mod.brains[i].neurons[n].firedtimer, &mod.brains[i].neurons[n].id);
            mod.brains[i].neurons[n].axon = (malloc ((sizeof(struct synapse)) * mod.brains[i].neurons[n].synpop));

            for (z = 0; z < mod.brains[i].neurons[n].synpop; z++) {
                fscanf(load, "S%fL%fO%ld ", &mod.brains[i].neurons[n].axon[z].str, &mod.brains[i].neurons[n].axon[z].ltp,
                       &mod.brains[i].neurons[n].axon[z].outid);

                mod.brains[i].neurons[n].axon[z].input = &(mod.brains[i].neurons[n]);
            }
            assert(fgetc(load) == *"D" && fgetc(load) == *"N");
        }
        assert(fgetc(load) == *"D" && fgetc(load) == *"L");
    }
    assert(fgetc(load) == *".");

    fclose(load);

    //Re-link each synapse's output pointer by matching the saved target id.
    for (i = 0; i < mod.pop; i++) {
        for (n = 0; n < mod.brains[i].pop; n++) {
            for (z = 0; z < mod.brains[i].neurons[n].synpop; z++) {
                for (x = 0; x < mod.brains[(mod.brains[i].type)].pop; x++) {
                    if (mod.brains[(mod.brains[i].type)].neurons[x].id == mod.brains[i].neurons[n].axon[z].outid) {
                        mod.brains[i].neurons[n].axon[z].output = &(mod.brains[(mod.brains[i].type)].neurons[x]);
                        break;
                    } else if (x == mod.brains[(mod.brains[i].type)].pop - 1) {
                        assert(FUBAR);
                    }
                }
            }
        }
    }

    return mod;
}


void testModularity() {
    int i = 0;
    int n = 0;
    int z = 0;
    int dop = 1;

    for (n = 0; n < 1; n++) {

        //int layerpop = (rand() % 30) + 5;
        int layerpop = 10;

        int neupop[layerpop];
        int type[layerpop];

        for (i = 0; i < layerpop; i++) {
            neupop[i] = (rand() % 1000) + 1;
            if (i < layerpop - 1) type[i] = i + 1;
            else type[i] = 0;
            printf("%d %d ", neupop[i], type[i]);
        }

        printf("\nLayerpop: %d\n", layerpop);

        struct netMod moddle = modulate(layerpop, neupop, type, 0, 2);

        for (z = 0; z < layerpop; z++) {
            initLayer(&moddle, z);
        }

        for (z = 0; z < 10; z++) {
            confuzzleMod(&moddle);
            saveMod(&moddle, "savetest.txt");
            struct netMod duplicate = loadMod("savetest.txt");

            int v = 0;
            for (v = 0; v < 20; v++) {
                int fa = (rand() % moddle.pop);
                int so = (rand() % moddle.brains[fa].pop);
                int la = (rand() % moddle.brains[fa].neurons[so].synpop);
                assert(moddle.brains[fa].neurons[so].axon[la].output->volt == duplicate.brains[fa].neurons[so].axon[la].output->volt);

                if (!(moddle.brains[fa].neurons[so].axon[la].ltp == duplicate.brains[fa].neurons[so].axon[la].ltp)) {
                    printf("Fa%d So%d La%d \n", fa, so, la);
                    printf("%f %f\n", moddle.brains[fa].neurons[so].axon[la].ltp, duplicate.brains[fa].neurons[so].axon[la].ltp);
                }
            }
            printf("\n");
            iterateModule(&moddle, dop);
            cleanmod(duplicate);
        }

        cleanmod(moddle);

        printf("\n");
    }
}

void iteratePipe(struct pipe* tubule) {
    int i = 0;

    for (i = 0; i < tubule->outnum; i++) {
        switch (tubule->types[i]) {
            case 0:
                tubule->outputs[i].volt = tubule->input->volt; //Simple copying; suitable for NN-to-trainer mods.
                break;
            case 1:
                tubule->outputs[i].volt += (tubule->input->volt - THRESHOLD) ? (tubule->input->volt - THRESHOLD) : 0;
                break;
            case 2:
                tubule->outputs[i].volt = (tubule->input->volt) ? 1 : 0;
                break;
            default:
                printf("Invalid pipe.");
                assert(FUBAR);
        }
    }
}

void testPiping() {
    int i = 0;
    int n = 0;

    int sourcelayers = 2;
    int soulayerlengths[sourcelayers];
    int soulayertypes[sourcelayers];

    int destlayers = 2;
    int destlayerlengths[destlayers];
    int destlayertypes[destlayers];

    for (i = 0; i < destlayers; i++) {
        destlayerlengths[i] = soulayerlengths[i] = 2;
        soulayertypes[i] = destlayertypes[i] = ((i + 1) % destlayers);
    }

    struct netMod source = modulate(sourcelayers, soulayerlengths, soulayertypes, 0, 1);
    struct netMod destination = modulate(destlayers, destlayerlengths, destlayertypes, 0, 1);

    for (i = 0; i < source.pop; i++) { initLayer(&source, i); }
    for (i = 0; i < destination.pop; i++) { initLayer(&destination, i); }

    struct pipe piping[source.outpop];

    char typearray[1] = {0};

    for (i = 0; i < source.outpop; i++) {
        if (i < destination.outpop) {
            piping[i] = pipify(&(source.outputs[i]), &(destination.inputs[i]), 1, typearray);
        }
    }

    for (i = 0; i < 10; i++) {
        printf("\nS: ");
        confuzzleMod(&source);
        iterateModule(&source, 0);
        printf("\nD: ");
        iterateModule(&destination, 0);
        printf("\n");

        for (n = 0; n < source.outpop; n++) {
            iteratePipe(&(piping[n]));
        }
    }

    cleanmod(source);
    cleanmod(destination);
}

void comboTest() {
    int i = 0;
    int n = 0;
    int z = 0;

    int strtotal = 0;
    int lowstrtotal = 0;

    int layers[2] = {1, 1};
    int types[2] = {1, -1};
    struct netMod petri = modulate(2, layers, types, 0, 1); //In/out must index the two layers actually allocated.

    petri.brains[0].neurons[0].synpop = 1;

    initLayer(&petri, 0);
    initLayer(&petri, 1);

    int dopamine = 0;
    int dop = 0;

    for (i = 0; i < 20000; i++) {
        n++;

        petri.brains[0].neurons[0].volt += (!(n % 10)) ? 2000 : 0;
        (petri.brains[1].neurons[0].volt) += (!(rand() % 4)) ? (rand() % 700) : 0;

        dop = 0;
        if ((petri.brains[1].neurons[0].volt > THRESHOLD) && (petri.brains[0].neurons[0].firedtimer < 10)) dop += 20;
        dopamine += dop;
        iterateModule(&petri, dopamine);

        strtotal += petri.brains[0].neurons[0].axon[0].str;
        if (petri.brains[0].neurons[0].axon[0].str < LOWSTR) lowstrtotal++;
        dopamine /= 2;
    }

    printf("\n");
    printf("%d\n", n);
    printf("First: Volt: %d Firedtimer: %d \n", petri.brains[0].neurons[0].volt, petri.brains[0].neurons[0].firedtimer);
    printf("Synapse: Strength: %f LTP: %f \n", petri.brains[0].neurons[0].axon[0].str, petri.brains[0].neurons[0].axon[0].ltp);
    printf("Second: Volt: %d Firedtimer: %d \n", petri.brains[1].neurons[0].volt, petri.brains[1].neurons[0].firedtimer);
    printf("Dopamaminilline: %d Avg strength: %d Percent under lowstr %d\n", dopamine, (strtotal / n), (lowstrtotal * 100 / n));

    cleanmod(petri);

    for (n = 0; n < 10; n++) {
        int netpop = 3;
        int layersizes[3] = {20, 50, 20};
        int layertypes[3] = {1, 2, -1};

        struct netMod net = modulate(netpop, layersizes, layertypes, 0, 2);

        for (i = 0; i < net.pop; i++) { initLayer(&net, i); }

        printf("\n");

        dop = 0;
        int right = 0;
        int wrong = 0;
        int rightcount = 0;
        int wrongcount = 0;
        int cyclecount = 0;
        float righthist = 0;
        float wronghist = 0;
        float proportion = 0;
        int rightvolts = 0;
        int wrongvolts = 0;

        for (z = 0; z < 30000; z++) {
            iterateModule(&net, dop);
            dop /= 5;

            for (i = 0; i < net.inpop; i++) {
                net.inputs[i].input += (rand() % 450);
            }

            for (i = 0; i < net.outpop; i++) {
                if (i < 6 && net.outputs[i].volt > THRESHOLD) {
                    right++;
                    rightvolts += net.outputs[i].volt - THRESHOLD;
                    dop += 20;
                } else if (net.outputs[i].volt > THRESHOLD) {
                    wrong++;
                    wrongvolts += net.outputs[i].volt - THRESHOLD;
                }
            }

            if ((rightvolts > wrongvolts) && (!((z + 3) % 5))) {
                rightcount++;
            } else if (wrongvolts > rightvolts) {
                wrongcount++;
            }
            right = wrong = 0;           //Gee, how nihilistic.
            rightvolts = wrongvolts = 0; //Electro-nihilism.
            cyclecount++;
            remakeAxons(&net, 1);
        }

        righthist += rightcount;
        wronghist += wrongcount;
        proportion = (righthist && wronghist) ? (righthist / wronghist) : righthist;
        printf("Cycles: %d Proportion: %f.\n", cyclecount, proportion);
        rightcount = wrongcount = 0;

        cleanmod(net);
    }
    printf("\n");
}

void dualNeuronTest() {
    int i = 0;
    int n = 0;
    int strtotal = 0;
    int lowstrtotal = 0;
    int strtotaltwo = 0;
    int lowstrtotaltwo = 0;

    int layers[2] = {1, 1};
    int types[2] = {1, -1};
    struct netMod petri = modulate(2, layers, types, 0, 1); //In/out must index the two layers actually allocated.
    struct netMod petritwo = modulate(2, layers, types, 0, 1);

    petri.brains[0].neurons[0].synpop = 1;
    petritwo.brains[0].neurons[0].synpop = 1;

    initLayer(&petri, 0);
    initLayer(&petri, 1);
    initLayer(&petritwo, 0);
    initLayer(&petritwo, 1);

    int dopamine = 0;
    int dop = 0;

    int randone;
    int randtwo;

    do {
        for (i = 0; i < 10; i++) {
            n++;

            randone = (!(rand() % 6)) ? (rand() % 1000) : 0;
            randtwo = (!(rand() % 6)) ? (rand() % 1000) : 0;
            (petri.brains[0].neurons[0].volt) += randone;
            (petritwo.brains[0].neurons[0].volt) += randone;
            (petri.brains[1].neurons[0].volt) += randtwo;
            (petritwo.brains[1].neurons[0].volt) += randtwo;

            dopamine += (!(rand() % 5)) ? (rand() % 10) : 0;

            if (petri.brains[1].neurons[0].volt > THRESHOLD) dop += 20;

            iterateModule(&petri, dopamine);
            iterateModule(&petritwo, dop);

            strtotal += petri.brains[0].neurons[0].axon[0].str;
            strtotaltwo += petritwo.brains[0].neurons[0].axon[0].str;
            if (petri.brains[0].neurons[0].axon[0].str < LOWSTR) lowstrtotal++;
            if (petritwo.brains[0].neurons[0].axon[0].str < LOWSTR) lowstrtotaltwo++;

            dopamine /= 2;
            dop /= 2;
        }

        printf("\n");
        printf("%d\n", n);
        printf("First: Volt: %d Firedtimer: %d \n", petri.brains[0].neurons[0].volt, petri.brains[0].neurons[0].firedtimer);
        printf("Synapse: Strength: %f LTP: %f \n", petri.brains[0].neurons[0].axon[0].str, petri.brains[0].neurons[0].axon[0].ltp);
        printf("Second: Volt: %d Firedtimer: %d \n", petri.brains[1].neurons[0].volt, petri.brains[1].neurons[0].firedtimer);
        printf("Dopamaminilline: %d Avg strength: %d Percent under lowstr %d\n", dopamine, (strtotal / n), (lowstrtotal * 100 / n));

        printf("First: Volt: %d Firedtimer: %d \n", petritwo.brains[0].neurons[0].volt, petritwo.brains[0].neurons[0].firedtimer);
        printf("Synapse: Strength: %f LTP: %f \n", petritwo.brains[0].neurons[0].axon[0].str, petritwo.brains[0].neurons[0].axon[0].ltp);
        printf("Second: Volt: %d Firedtimer: %d \n", petritwo.brains[1].neurons[0].volt, petritwo.brains[1].neurons[0].firedtimer);
        printf("Dopamaminilline: %d Avg strength: %d Percent under lowstr %d\n", dop, (strtotaltwo / n), (lowstrtotaltwo * 100 / n));

        printf("\nDivergences: Dopamine: %d Avg Strength: %d Percent Under Lowstr: %d\n",
               dopamine - dop, (strtotaltwo / n) - (strtotal / n), (lowstrtotaltwo * 100 / n) - (lowstrtotal * 100 / n));

    } while (getchar() != *"d");

    cleanmod(petri);
    cleanmod(petritwo);
}

void learningNet() {
    int i = 0;
    int z = 0;

    int netpop = 3;
    int layersizes[3] = {10, 50, 20};
    int layertypes[3] = {1, 2, -1};

    struct netMod net = modulate(netpop, layersizes, layertypes, 0, 2);

    for (i = 0; i < net.pop; i++) { initLayer(&net, i); }

    int dop = 0;
    int right = 0;
    int wrong = 0;
    int rightcount = 0;
    int wrongcount = 0;
    int cyclecount = 0;
    float righthist = 0;
    float wronghist = 0;
    float proportion = 0;
    int rightvolts = 0;
    int wrongvolts = 0;

    while (getchar() != *"d") {
        for (z = 0; z < 100; z++) {
            iterateModule(&net, dop);
            dop /= 5;

            for (i = 0; i < 3; i++) {
                net.inputs[(rand() % net.inpop)].input += 1000;
            }

            for (i = 0; i < net.outpop; i++) {
                if (i < 10 && net.outputs[i].volt > THRESHOLD) {
                    right++;
                    printf("%dR", i);
                    rightvolts += net.outputs[i].volt - THRESHOLD;
                } else if (net.outputs[i].volt > THRESHOLD) {
                    wrong++;
                    printf("%dW", i);
                    wrongvolts += net.outputs[i].volt - THRESHOLD;
                }
            }
            printf("\n");

            if (right > wrong) {
                dop += 20;
                rightcount++;
            } else if (wrong > right) {
                wrongcount++;
                dop -= 2;
            }
            right = wrong = 0;           //Gee, how nihilistic.
            rightvolts = wrongvolts = 0; //Electro-nihilism.
            cyclecount++;
            remakeAxons(&net, 0);
            remakeAxons(&net, 1);
        }

        righthist += rightcount;
        wronghist += wrongcount;
        proportion = (righthist && wronghist) ? (righthist / wronghist) : righthist;
        printf("%d %d %d %f %d\n", cyclecount, rightcount, wrongcount, proportion, dop);
        rightcount = wrongcount = 0;
    }

    cleanmod(net);
}
</source>


Relevant Works

I'm hoping no one scrolls down this far to see that I'm not citing things properly yet, but at least I'm citing things which need to be cited.

Grand, Steve, _Creation_ : A popular and foundational book on artificial life, this describes the basic principles of dynamic neural networks and artificial life in general (in the context of my favorite computer game as a kid, Creatures).

Izhikevich, E., "Solving the Distal Reward Problem Through Linkage of STDP and Dopamine Signaling," Cerebral Cortex. : Describes a slightly more complicated model of dynamic neural network function, in the context of how it may happen in actual brains. I still feel sorry for inflicting this on my Sci/Math Seminar class.

Dorigo, M., and Colombetti, M., _Behavior Shaping_ : A book describing robotic control using "behavior shaping" networks similar to the networks-of-networks planned in my project, except using learning classifier systems (LCSes) as the adaptive component rather than neural networks.