Analogy-Making as Perception is based on the premise that analogy-making is fundamentally a high-level perceptual process in which the interaction of perception and concepts gives rise to “conceptual slippages” that allow analogies to be made. It describes Copycat—a computer model of analogy-making, developed by the author with Douglas Hofstadter, that models the complex, subconscious interaction between perception and concepts that underlies the creation of analogies.
In Copycat, both concepts and high-level perception are emergent phenomena, arising from large numbers of low-level, parallel, non-deterministic activities. In the spectrum of cognitive modeling approaches, Copycat occupies a unique intermediate position between symbolic systems and connectionist systems—a position that is at present the most useful one for understanding the fluidity of concepts and high-level perception.
On one level the book is about analogy-making, but on another level it is about cognition in general. It explores such issues as the nature of concepts and perception and the emergence of highly flexible concepts from a lower-level “subcognitive” substrate.
The descriptions of applications and modeling projects stretch beyond the strict boundaries of computer science to include dynamical systems theory, game theory, molecular biology, ecology, evolutionary biology, and population genetics, underscoring the exciting “general purpose” nature of genetic algorithms as search methods that can be employed across disciplines.
An Introduction to Genetic Algorithms is accessible to students and researchers in any scientific discipline. It includes many thought and computer exercises that build on and reinforce the reader’s understanding of the text.
The first chapter introduces genetic algorithms and their terminology and describes two provocative applications in detail. The second and third chapters look at the use of genetic algorithms in machine learning (computer programs, data analysis and prediction, neural networks) and in scientific models (interactions among learning, evolution, and culture; sexual selection; ecosystems; and evolutionary activity). Several approaches to the theory of genetic algorithms are discussed in depth in the fourth chapter. The fifth chapter takes up implementation, and the last chapter poses some currently unanswered questions and surveys prospects for the future of evolutionary computation.
This promised to be a very interesting book, but for me it was let down by being too low-level -- too much about the scientific and technological bases, and not enough about any new computational paradigms. (The very poor standard of proofreading, with some chapters thick with spelling mistakes, also detracts.)
I was hoping for an overview of what new tools are being added to our computational capability, with maybe a review of the current state of the art, but what I got was a bunch of essays that have an idiosyncratic viewpoint, with all the details in the wrong places (for me, at least).
For example, the chapter on Genetic Algorithms devotes hardly any space to the schemata model (beyond saying it is intuitive) but instead develops a "statistical mechanics" model, without then providing the intuition of how this model helps us to cast or solve new computational problems. It also seems to imply that mutation is the key concept, with cross-over just an interesting second-order add-on (whereas the study of genetic algorithms has shown that cross-over is key, with mutation playing a surprisingly small role).
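To make the crossover-versus-mutation distinction concrete, here is a minimal sketch of the textbook genetic-algorithm loop on the classic OneMax toy problem (maximize the number of 1-bits). It is illustrative only, not drawn from the book under review: all names, parameters, and the tournament-selection choice are my own assumptions. Crossover recombines large blocks from two parents, while mutation makes small independent bit flips.

```python
import random

def one_max(bits):
    # Fitness: count of 1-bits (the classic OneMax toy problem).
    return sum(bits)

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto
    # a suffix of the other, recombining large building blocks.
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(bits, rate=0.01):
    # Mutation: flip each bit independently with small probability.
    return [1 - b if random.random() < rate else b for b in bits]

def evolve(pop_size=40, length=32, generations=60):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: fitter of two random individuals.
            a, b = random.sample(pop, 2)
            return max(a, b, key=one_max)
        pop = [mutate(crossover(select(), select()))
               for _ in range(pop_size)]
    return max(pop, key=one_max)

best = evolve()
```

Disabling crossover in a sketch like this (e.g. copying a single selected parent instead) typically slows progress far more than setting the mutation rate to zero, which is the point the review is making.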
The two chapters on quantum computing range over the theoretical QM underpinnings, and the current technology, but again provide no intuition of how these devices work as computers. (And the second of these chapters has an almost useless bibliography, since it omits the papers' titles.)
So I was left disappointed.
Can we entrust them with decisions that affect our lives?
How soon do we need to worry about them surpassing us?