Genetic Programming in Clojure (sulami.github.io)
83 points by sulami on Nov 18, 2018 | 23 comments


This isn't genetic programming at all.

What's being described here is a kind of sample-based stochastic optimization which, if I were pressed, is most similar to an algorithm known as a "(mu, lambda) evolution strategy". This is in the general subarea of evolutionary computation. Another algorithm in this area, for example, is the Genetic Algorithm.
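To make the distinction concrete, here is a minimal sketch of a (mu, lambda) strategy in Python (illustrative, not the article's code; function names are mine): each generation produces lambda offspring from the mu survivors, and only the mu fittest offspring carry over.

```python
import random

def mu_lambda_es(fitness, init, mutate, mu=5, lam=20, generations=100):
    """Minimal (mu, lambda) scheme: each generation produces lam
    offspring from the mu parents, then keeps only the mu fittest
    offspring (the parents themselves are always discarded)."""
    parents = [init() for _ in range(mu)]
    for _ in range(generations):
        offspring = [mutate(random.choice(parents)) for _ in range(lam)]
        # truncation selection: the mu best offspring become the new parents
        parents = sorted(offspring, key=fitness)[:mu]
    return min(parents, key=fitness)

# Toy run: search for the integer 42 by minimising distance to it.
best = mu_lambda_es(
    fitness=lambda x: abs(x - 42),
    init=lambda: random.randint(-100, 100),
    mutate=lambda x: x + random.randint(-5, 5),
)
```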

Genetic Programming (GP) is the application of stochastic optimization techniques to a specific problem: the discovery and optimization of small computer programs or procedures. Because GP is famously associated with candidate solutions in the form of tree structures, it's also applied to non-program optimization as long as the problem involves trees. That's the rough boundary for the community.

The most famous example of a genetic programming system in Clojure is Lee Spector's Clojush (https://github.com/lspector/Clojush).


AKA the usual confusion between "genetic programming" and "genetic algorithms".


And ironically, Push/Clojush don't actually use trees! I quite like Push, and also Linear GP, the DAG-based system used by Schmidt and Lipson, and grammatical GP, depending on the application.


Sure, there's no requirement to use trees: but because trees are so common, non-programming tree-representation stuff is also typically included in the GP umbrella.


ESs don't use uniform sampling, and there is no step-size adaptation in his algorithm, so it is just a (mu, lambda)-EA.


I don't think either of these is a requirement for an ES. Just truncation selection.


It is; otherwise it's just random search with selection. Truncation selection is just (mu, lambda).


Terminology quibble: My understanding is that "genetic programming" refers not to all evolutionary algorithms, but specifically to those where the code itself (rather than "a few key parameters") is evolved. This doesn't seem to use or implement 'eval' anywhere, which I think immediately disqualifies it from meeting that definition.


Perhaps they meant "genetic algorithm"? Quite the confusing set of concepts to be so closely named.

https://en.wikipedia.org/wiki/Genetic_algorithm


Simply put, genetic algorithms evolve solutions. Genetic programming is genetic algorithms where the solutions are themselves programs.

I’m not sure what terminology change could make things more clear.
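A sketch of the distinction (the tuple encoding below is mine, purely illustrative): a GA genome is a fixed parameter vector, while a GP genome *is* a program, here an expression tree that can be evaluated.

```python
# GA: the genome is "a few key parameters".
ga_individual = [0.3, 1.7, -0.2]

# GP: the genome is itself a program -- an expression tree of
# ('op', left, right) tuples with numbers and 'x' at the leaves.
gp_individual = ('+', ('*', 'x', 2), 1)   # encodes x * 2 + 1

def run(tree, x):
    """Recursively evaluate an expression tree at input x."""
    if tree == 'x':
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    l, r = run(left, x), run(right, x)
    return l + r if op == '+' else l * r
```

Evolving `ga_individual` is a genetic algorithm; evolving (and evaluating) `gp_individual` is genetic programming.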


This brings back memories... One of the greatest heads-down hacking sessions I ever did was back in the day, running some variant of Common Lisp and working my way through Koza's _Genetic Programming_ book. The designs arrived at were literally non-human. The story of the FPGA tone recognition circuit (spoiler: lot-specific design) was delightful.


My favorite was using GP to evolve a soccer team that outcompeted the handcrafted competition and won the RoboCup championship in 1997.[1]

I still wonder why GP never got to be as popular as NNs (a.k.a. the currently fashionable "deep learning").

Was it just that GPs didn't perform as well? I find that somewhat hard to believe, as Koza's books are chock full of impressive results, and there are hundreds more papers on them.

GPs also have the virtue of ultimately being analyzable and understandable, at least in some cases (I'm not confident enough to say in all cases). That is a feature NNs seem to lack, and it's becoming a big problem for some critical systems where lives, important decisions, and/or ethics are involved.

[1] - http://www.genetic-programming.com/hc/lukesoccer.html


GP/GAs were more popular than NNs before NNs started using GPUs a few years ago, and they already had some real-world applications like antenna design or circuit solvers. That said, GP (and GA) never gathered the fame NNs have now, since running NNs on GPUs produced results far better than GA/GP could on CPUs alone. If someone has a smart idea for how to exploit the GPU with GA/GP, then GA/GP will become seriously popular.


I don't think GPs not being able to use GPUs is the issue, because GPs have run on GPUs for a long time:

http://gpgpgpu.com/


Uber and OpenAI have written on the subject of evolutionary algorithms in a contemporary context:

https://eng.uber.com/deep-neuroevolution/

https://blog.openai.com/evolution-strategies/


The trouble with genetic algorithms is that they are hardly worth it. They're just one of many metaheuristics [1], and despite being "biology-inspired" and good-sounding, they tend to perform worse than simpler alternatives like tabu search [2] or, even better, LAHC [3]. One of the reasons is that for local search the speed of neighborhood exploration is paramount, so the theoretical advantages of GAs get swamped by slower neighborhood iteration. In addition, LAHC or simulated annealing are way simpler to implement; they are literally just a few lines of code.

[1]: https://en.wikipedia.org/wiki/Metaheuristic

[2]: https://en.wikipedia.org/wiki/Tabu_search

[3]: https://link.springer.com/chapter/10.1007/978-3-642-41019-2_...
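LAHC really is only a few lines. A minimal Python sketch following the late-acceptance idea described in [3] (parameter names and the toy problem are mine): a candidate is accepted if it beats either the current cost or the cost from `history_len` steps ago.

```python
import random

def lahc(cost, init, neighbor, history_len=50, iters=10000):
    """Late Acceptance Hill Climbing: one hyperparameter
    (the history length), a handful of lines of code."""
    s = init()
    c = cost(s)
    history = [c] * history_len
    best, best_c = s, c
    for k in range(iters):
        cand = neighbor(s)
        cand_c = cost(cand)
        v = k % history_len
        # accept if better than the cost history_len accepted steps
        # ago, or better than the current cost
        if cand_c <= history[v] or cand_c <= c:
            s, c = cand, cand_c
            if c < best_c:
                best, best_c = s, c
        history[v] = c
    return best

# Toy run: minimise (x - 7)^2 over the reals.
x = lahc(
    cost=lambda x: (x - 7) ** 2,
    init=lambda: random.uniform(-100, 100),
    neighbor=lambda x: x + random.gauss(0, 1),
)
```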


I've a fair bit of experience with both GAs and LAHC, and in my opinion the results are mixed. LAHC wins on fewer hyperparameters (1 versus a minimum of, say, 3), but in terms of implementation simplicity there isn't that big a difference -- say, 6 lines of Python against 12. People might get a false impression from big EA implementations like ECJ being compared against bare-bones LAHC implementations.


It's been a while since I needed one, but do you know a decent Tabu search package for R or Python (or C)?


To be fair, the author uses a very simple "program" (a number that evaluates to itself) to show the general approach of a genetic algorithm.

It shows quite nicely the advantages of dynamic typing and abstractions together with very readable data manipulation code. A great showcase for Clojure's capabilities.

Looking forward to the next part where the author wants to mutate s-expressions.

(I'm aware that all of this has been done before, sometime in the 90s, mostly with Prolog and Common Lisp.)
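Mutating s-expressions can be surprisingly compact. A hypothetical sketch (not the article's forthcoming implementation, which hasn't been published yet) that point-mutates an expression tree of `('op', left, right)` tuples:

```python
import random

def mutate_sexp(tree, rate=0.2):
    """Point-mutate an expression tree: each numeric leaf may be
    replaced and each operator may be swapped, with probability
    `rate`; variables like 'x' are left untouched."""
    if isinstance(tree, (int, float)):
        return random.randint(-9, 9) if random.random() < rate else tree
    if isinstance(tree, str):          # a variable leaf such as 'x'
        return tree
    op, left, right = tree
    if random.random() < rate:
        op = random.choice(['+', '-', '*'])
    return (op, mutate_sexp(left, rate), mutate_sexp(right, rate))

# The tree shape is preserved; only leaves and operators change.
mutated = mutate_sexp(('+', ('*', 'x', 2), 1))
```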


I don't see how this shows the advantages of dynamic typing in any way. The same code in a statically typed language would hardly be more verbose. But it would be much easier to maintain, evolve, refactor, and understand.


I'd also recommend taking a look at the following libraries:

https://github.com/lspector/Clojush

https://github.com/emilyagras/clojenetics


I've always wanted to try genetic algorithms in practice, but whenever I came across a problem that could be solved this way, I got stuck at step one.

How do I parametrize my problem? Be it mathematical operations, source code, or gates, how do I separate it into pieces that can be mutated efficiently?


And that's half the fun of machine learning ;)

Yes, you will have to fight the GA/GPs and modify your problem definition and utility functions until you get the results that you're looking for.

Sometimes the results are very surprising!



