Neural Nets Vs Genetic Algorithms

Reign Mack
Does anyone know of any advantages that genetic algorithms have over neural networks in terms of artificial intelligence?
 
Yes: absolute chaos. Where a programmatic system needs a programmatic means to create a random event, natural systems win hands down.
 
Neural networks are based on experience probabilities.
If an experience's result is true, or a successful attempt was made, the pattern (called a weight, as I explained earlier in the programming world) strengthens, or you could say the assigned value of the weight increases.

But as far as I know, genetic algorithms are based on a preconceived notion of an event. A system which has a genetically programmed base, or contains a genetic algorithm, is fed with various algorithms, each fed in uniquely. The result of an algorithm applied to a particular event is logged, and this logged result is matched against those of the various other kinds of genetic algorithm applied. Whichever has the highest priority assigned to it, on the basis of the results of application, is used in future.
Thus you might say a kind of competitive environment is set up between the algorithms, and only the fittest survive.

You can get more information on this subject in one of my threads.

Thanks for your time.

Bye!
 
I'm not sure I understand what Zion said about genetic algorithms, but here's what I know about them:

Genetic algorithms, or GAs, are loosely based on Darwin's theory of evolution, or survival of the fittest. You start with:
  1. A problem. A common example is the traveling salesman problem: a salesman has to visit some number n of cities, all interconnected by roads of different, known lengths. (Usually this n is very big.) He has to find the shortest route past all the cities without visiting any one twice.
  2. A set of solutions (called the gene pool). These solutions can be coded in a DNA-like or string-like form. For the salesman's example this could be a list of cities to visit: 1,5,3,7,4,2,6,8,... You generally start off with a whole bunch of randomly generated possible solutions, but some people prefer to start with just one or two.

Each solution gets a value; in the case of the example this would be the total length traveled.

MARK: Now you pick, say, the 10 best solutions, or pick 10 at random from the best 100, or something like that. (As long as it includes "the best ...". That's the survival-of-the-fittest part.) You take these 10 and pair them off to make offspring. For the example you could take part of the traveled path from one parent and the other part of the path from the second parent. This creates a new batch of solutions which (hopefully) are better than the first batch.

Generally you add the new solutions to the old batch and kill off the worst solutions to get back to the starting number.

Now you GOTO MARK (I know, GOTO, bleh! Use a while loop if you want ;) ) and repeat until bored. Then you take the best solution from the final batch and use it as your final outcome. In our example this would be the path to travel.

For most problems brute force or some kind of greedy algorithm is sufficient. But for some problems that's not possible or feasible. In that case a GA is usually better than a plain random search.

So, the question of using a neural net or a GA is more one of: is this problem solvable using an NN, or would it be more easily implemented with a GA?

I hope this helps.
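
To make the loop above concrete, here is a rough sketch in Java. The distance matrix, pool size, number of parents, and the crossover are all made up purely for illustration; none of the numbers are tuned for anything real.

Code:
import java.util.*;

// Rough GA sketch for the traveling salesman example described above.
// The distance matrix and all parameters are arbitrary illustrations.
public class TspGa {
    static final int CITIES = 8, POOL = 50, PARENTS = 10, GENERATIONS = 200;
    static final Random rnd = new Random();
    static double[][] dist = new double[CITIES][CITIES];

    public static void main(String[] args) {
        // Make up a random symmetric distance matrix.
        for (int i = 0; i < CITIES; i++)
            for (int j = i + 1; j < CITIES; j++)
                dist[i][j] = dist[j][i] = 1 + rnd.nextDouble() * 9;

        // Gene pool: each solution is a permutation of the cities.
        List<int[]> pool = new ArrayList<>();
        for (int i = 0; i < POOL; i++) pool.add(randomTour());

        for (int gen = 0; gen < GENERATIONS; gen++) {
            // Survival of the fittest: sort by total length, shortest first.
            pool.sort(Comparator.comparingDouble(TspGa::length));
            // Pair off the best tours to make offspring.
            List<int[]> offspring = new ArrayList<>();
            for (int i = 0; i + 1 < PARENTS; i += 2)
                offspring.add(crossover(pool.get(i), pool.get(i + 1)));
            // Add the offspring, then kill off the worst to get back to POOL.
            pool.addAll(offspring);
            pool.sort(Comparator.comparingDouble(TspGa::length));
            pool = new ArrayList<>(pool.subList(0, POOL));
        }
        System.out.println("Best tour length: " + length(pool.get(0))
                + " " + Arrays.toString(pool.get(0)));
    }

    // A random permutation of the city indices.
    static int[] randomTour() {
        List<Integer> cities = new ArrayList<>();
        for (int i = 0; i < CITIES; i++) cities.add(i);
        Collections.shuffle(cities, rnd);
        int[] tour = new int[CITIES];
        for (int i = 0; i < CITIES; i++) tour[i] = cities.get(i);
        return tour;
    }

    // Fitness: total length of the round trip.
    static double length(int[] tour) {
        double total = 0;
        for (int i = 0; i < tour.length; i++)
            total += dist[tour[i]][tour[(i + 1) % tour.length]];
        return total;
    }

    // Take the first half of the path from one parent and fill in the
    // remaining cities in the order they appear in the other parent.
    static int[] crossover(int[] a, int[] b) {
        int cut = a.length / 2;
        int[] child = new int[a.length];
        boolean[] used = new boolean[a.length];
        for (int i = 0; i < cut; i++) { child[i] = a[i]; used[a[i]] = true; }
        int pos = cut;
        for (int city : b) if (!used[city]) { child[pos++] = city; used[city] = true; }
        return child;
    }
}

Each generation the pool is sorted by tour length, the best tours are paired off for offspring, and the worst are killed to keep the pool at its starting size, which is exactly the MARK/GOTO loop described above.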
 
Yes, that's another option. Instead of taking two parents you take one subject and mutate it (for example, by swapping the order of two of the cities). Or you take two parents, create some offspring, and then flip a coin to see if you want to mutate the offspring. The possibilities are endless.
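
Using the same tour encoding as the sketch above, that swap-two-cities mutation plus the coin flip could look like this (just a sketch; the example tour is arbitrary):

Code:
import java.util.Arrays;
import java.util.Random;

// Sketch of the mutation described above: swap the order of two cities,
// and flip a coin to decide whether an offspring gets mutated at all.
public class SwapMutation {
    public static void main(String[] args) {
        Random rnd = new Random();
        int[] offspring = {0, 1, 2, 3, 4, 5, 6, 7};   // a tour from the crossover step
        if (rnd.nextBoolean()) {                       // the coin flip
            int i = rnd.nextInt(offspring.length);
            int j = rnd.nextInt(offspring.length);
            int tmp = offspring[i];                    // swap two cities
            offspring[i] = offspring[j];
            offspring[j] = tmp;
        }
        System.out.println(Arrays.toString(offspring));
    }
}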
 
Another difference, though possibly a minor one.

Um, tough to know how to say this. Consider a simple problem: hill climbing. Your task is to find the highest point within an area covered by fog. If you decide to always head uphill... then you may find the top of a foothill and miss the mountain.

This approach is called hill climbing, and the false peak is called a local maximum. (Professors have no imagination!) OK, too simple, but warp your mind a little and the idea can be extended to other situations.

NNs can be prone to finding local maxima. GAs have a slightly better chance of finding the real maximum, if you are lucky... Sometimes it works to add occasional catastrophes to kill the dinosaurs and let the apes evolve <grin>
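
To make the foothill picture concrete, here is a toy greedy climber in Java on a made-up one-dimensional landscape with a foothill and a taller mountain; started near the foothill, it stops there and never sees the mountain. The landscape and step size are purely illustrative.

Code:
// Toy illustration of the local-maximum trap described above: a greedy
// hill climber on a made-up landscape with a foothill and a mountain.
public class HillClimb {
    // Two bumps: a foothill near x = 2 and a taller mountain near x = 8.
    static double height(double x) {
        return 3 * Math.exp(-(x - 2) * (x - 2)) + 10 * Math.exp(-(x - 8) * (x - 8));
    }

    public static void main(String[] args) {
        double x = 1.0, step = 0.1;        // start in the fog near the foothill
        while (true) {
            double up = height(x + step), down = height(x - step);
            if (up > height(x)) x += step;          // always head uphill...
            else if (down > height(x)) x -= step;
            else break;                              // nowhere higher nearby: stop
        }
        // Stops around x = 2, height = 3: the foothill, not the mountain at x = 8.
        System.out.println("Stopped at x = " + x + ", height = " + height(x));
    }
}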
 
I think I get what you're saying: that's kind of like random chance or luck (when survival of the fittest does not really come into it), like when you get a few people who are immune to that killer strain of the flu.
 
GAs by themselves cannot climb all the hills in the real world. Neither can NNs.

GAs are constrained by their initial conditions and by the number of subgroups with their specific differentials.

NNs learn without a full deck. In real life, the backpropagation targets are only ~100% accurate after the fact. Otherwise everyone would be using them in the stock market and making money all the time. Why did the whole market lose trillions of dollars?
 
In fact, the sentence above sums it all up:
GAs are constrained by initial conditions...

Bye!
 
In fact, the sentence above sums it all up:
GAs are constrained by initial conditions...

Not if you use random mutations.

In fact, my professor was working on a theory that the best GA is one with a pool size of 1 and mutation as the only operator.
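
As a sketch of what that pool-of-1, mutation-only idea looks like (I believe it is sometimes called a (1+1) strategy), here it is on an arbitrary count-the-ones bit-string problem; the problem and all the numbers are just stand-ins.

Code:
import java.util.Random;

// Sketch of a pool-size-1, mutation-only GA: keep a single solution,
// mutate it, and keep the mutant only if it is at least as good.
// The "count the ones in a bit string" problem is just a stand-in.
public class OnePlusOne {
    public static void main(String[] args) {
        Random rnd = new Random();
        int n = 40;
        boolean[] current = new boolean[n];           // start with all zeros

        for (int step = 0; step < 10_000; step++) {
            boolean[] mutant = current.clone();
            // Mutation: flip each bit with probability 1/n.
            for (int i = 0; i < n; i++)
                if (rnd.nextInt(n) == 0) mutant[i] = !mutant[i];
            if (fitness(mutant) >= fitness(current)) current = mutant;
        }
        System.out.println("Ones found: " + fitness(current) + " / " + n);
    }

    static int fitness(boolean[] bits) {
        int ones = 0;
        for (boolean b : bits) if (b) ones++;
        return ones;
    }
}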
 
I am new to this and would like to know what everybody codes genetic algorithms in, and would also like to know if anybody knows anything about distributed processing and evolutionary algorithms.

I code in QBasic and was wondering if QB is powerful enough to code these AI techniques.
 
What applications are you using these for? AI/ALife/CogSci?

Evolutionary algorithms seem fun.
If you're looking for NNs, code in MATLAB.
From what I know about GAs, any BASIC will do,
but I'd prefer C/C++.

-----: you can join NNs and GAs together... simply, the letter codes of the GAs are NNs themselves.

Depending on how you code NNs, they can be chaotic too, but more controllable than GAs.

But if you're doing anything with multiple solutions, use GAs, unless you're willing to create a highly complex modular NN.

Now let's say you have a vision system, and you have modular parts like detecting lines, then whole shapes, colours, etc. If you have multiple algorithmic solutions for each part that are NNs, like two solutions that can do line detection, I suggest using GAs on these NN solutions.
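
One concrete reading of "the letter code of the GA is the NN itself" is to let each individual in the pool be the weight vector of a small network and let the GA evolve the weights. Here is a rough Java sketch on the toy XOR problem; the network size, pool size, and mutation rate are all arbitrary choices for illustration, not a recipe.

Code:
import java.util.*;

// Sketch of joining the two ideas: each individual in the GA pool is the
// weight vector of a tiny 2-2-1 network, and the GA evolves the weights
// to fit XOR. Every number here is an arbitrary illustration.
public class EvolveNet {
    static final double[][] IN = {{0,0},{0,1},{1,0},{1,1}};
    static final double[] OUT = {0, 1, 1, 0};
    static final int GENES = 9;                 // 2 hidden units (3 weights each) + output unit (3 weights)
    static final Random rnd = new Random();

    public static void main(String[] args) {
        List<double[]> pool = new ArrayList<>();
        for (int i = 0; i < 30; i++) pool.add(randomGenome());

        for (int gen = 0; gen < 2000; gen++) {
            pool.sort(Comparator.comparingDouble(EvolveNet::error));
            List<double[]> next = new ArrayList<>(pool.subList(0, 10)); // keep the best
            while (next.size() < 30) {                                  // breed the rest
                double[] child = pool.get(rnd.nextInt(10)).clone();
                child[rnd.nextInt(GENES)] += rnd.nextGaussian() * 0.5;  // mutate one weight
                next.add(child);
            }
            pool = next;
        }
        pool.sort(Comparator.comparingDouble(EvolveNet::error));
        System.out.println("Best squared error on XOR: " + error(pool.get(0)));
    }

    static double[] randomGenome() {
        double[] g = new double[GENES];
        for (int i = 0; i < GENES; i++) g[i] = rnd.nextGaussian();
        return g;
    }

    // Fitness: how badly this weight vector's network fits XOR (lower is better).
    static double error(double[] g) {
        double err = 0;
        for (int k = 0; k < IN.length; k++) {
            double h1 = sigmoid(g[0]*IN[k][0] + g[1]*IN[k][1] + g[2]);
            double h2 = sigmoid(g[3]*IN[k][0] + g[4]*IN[k][1] + g[5]);
            double out = sigmoid(g[6]*h1 + g[7]*h2 + g[8]);
            err += (out - OUT[k]) * (out - OUT[k]);
        }
        return err;
    }

    static double sigmoid(double x) { return 1 / (1 + Math.exp(-x)); }
}

The same idea scales up to the vision example: the genome could just as well encode which line-detecting NN to use for each module, with fitness measured on detection performance.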
 
Hi, I am the member who started this thread. I have not been able to log in; it seems my old user name was banned for some unknown reason? Thanks for *bumping* this thread back up.

I'm going to be using Java. My GAs will be used to train binary decision trees.

c1earwater, I endorse what you say about random mutation allowing the GA to go beyond its initial conditions, but the second part about a pool of 1 sounds odd, and there really is no "best" GA; it all depends on what type of problem you're solving.

A population of 1 and only mutation as the operator sounds like chaos, which may take a very long time to converge!
 
I'm coding AI and I need someone to point me in the right direction for evolutionary algorithms.

Also, does anybody know anything about low-level intelligence (i.e. birds flocking as a result of "stay close to the bird next to you but don't bump into it")?

These really interest me, and I was wondering if they could be used to let a program branch out and have completely random outcomes.
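
On the flocking question: that "stay close but don't bump" behaviour is usually credited to Reynolds' boids, where each bird follows a few local rules (cohesion, separation, alignment) and the flock emerges. A rough Java sketch with made-up constants, just to show the shape of the rules:

Code:
import java.util.Random;

// Rough sketch of the "stay close but don't bump" flocking idea (boids-style):
// each bird steers a little toward the flock's centre (cohesion), away from
// birds that are too close (separation), and toward the average heading
// (alignment). All constants are made up for illustration.
public class Flock {
    static final int N = 20;
    static double[][] pos = new double[N][2], vel = new double[N][2];

    public static void main(String[] args) {
        Random rnd = new Random();
        for (int i = 0; i < N; i++)
            for (int d = 0; d < 2; d++) {
                pos[i][d] = rnd.nextDouble() * 100;
                vel[i][d] = rnd.nextDouble() * 2 - 1;
            }

        for (int step = 0; step < 500; step++)
            for (int i = 0; i < N; i++) {
                double[] cohesion = {0, 0}, separation = {0, 0}, alignment = {0, 0};
                for (int j = 0; j < N; j++) {
                    if (j == i) continue;
                    for (int d = 0; d < 2; d++) {
                        cohesion[d] += (pos[j][d] - pos[i][d]) / (N - 1);   // drift toward the others
                        alignment[d] += (vel[j][d] - vel[i][d]) / (N - 1);  // match their heading
                    }
                    double dx = pos[j][0] - pos[i][0], dy = pos[j][1] - pos[i][1];
                    if (dx * dx + dy * dy < 25) {                            // too close: back off
                        separation[0] -= dx;
                        separation[1] -= dy;
                    }
                }
                for (int d = 0; d < 2; d++) {
                    vel[i][d] += 0.01 * cohesion[d] + 0.05 * separation[d] + 0.05 * alignment[d];
                    pos[i][d] += vel[i][d];
                }
            }
        System.out.println("Bird 0 ended up at (" + pos[0][0] + ", " + pos[0][1] + ")");
    }
}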

P.S. Is QB good for evolutionary algorithms?
P.P.S. Is PROLOG a good language for AI?
P.P.P.S. I think I am actually coding ALife, but I am not sure.
 