Books : reviews

Melanie Mitchell.
Analogy-Making as Perception: a computer model.
MIT Press. 1993

rating : 3.5 : worth reading

The psychologist William James observed that “a native talent for perceiving analogies is … the leading fact in genius of every order.” The centrality and the ubiquity of analogy in creative thought have been noted again and again by scientists, artists, and writers, and understanding and modeling analogical thought have emerged as two of the most important challenges for cognitive science.

Analogy-Making as Perception is based on the premise that analogy-making is fundamentally a high-level perceptual process in which the interaction of perception and concepts gives rise to “conceptual slippages” that allow analogies to be made. It describes Copycat—a computer model of analogy-making, developed by the author with Douglas Hofstadter, that models the complex, subconscious interaction between perception and concepts that underlies the creation of analogies.

In Copycat, both concepts and high-level perception are emergent phenomena, arising from large numbers of low-level, parallel, non-deterministic activities. In the spectrum of cognitive modeling approaches, Copycat occupies a unique intermediate position between symbolic systems and connectionist systems—a position that is at present the most useful one for understanding the fluidity of concepts and high-level perception.

On one level the book is about analogy-making, but on another level it is about cognition in general. It explores such issues as the nature of concepts and perception and the emergence of highly flexible concepts from a lower-level “subcognitive” substrate.

An in-depth description of Copycat, one of the projects at Douglas Hofstadter’s Fluid Analogies Research Group

Melanie Mitchell.
An Introduction to Genetic Algorithms.
MIT Press. 1996

(read but not reviewed)

Genetic algorithms have been used in science and engineering as adaptive algorithms for solving practical problems and as computational models of natural evolutionary systems. This brief, accessible introduction describes some of the most interesting research in the field and also enables readers to implement and experiment with genetic algorithms on their own. It focuses in depth on a small set of important and interesting topics—particularly in machine learning, scientific modeling, and artificial life—and reviews a broad span of research, including the work of Mitchell and her colleagues.

The descriptions of applications and modeling projects stretch beyond the strict boundaries of computer science to include dynamical systems theory, game theory, molecular biology, ecology, evolutionary biology, and population genetics, underscoring the exciting “general purpose” nature of genetic algorithms as search methods that can be employed across disciplines.

An Introduction to Genetic Algorithms is accessible to students and researchers in any scientific discipline. It includes many thought and computer exercises that build on and reinforce the reader’s understanding of the text.

The first chapter introduces genetic algorithms and their terminology and describes two provocative applications in detail. The second and third chapters look at the use of genetic algorithms in machine learning (computer programs, data analysis and prediction, neural networks) and in scientific models (interactions among learning, evolution, and culture; sexual selection; ecosystems; evolutionary activity). Several approaches to the theory of genetic algorithms are discussed in depth in the fourth chapter. The fifth chapter takes up implementation, and the last chapter poses some currently unanswered questions and surveys prospects for the future of evolutionary computation.

Tino Gramss, Stefan Bornholdt, Michael Gross, Melanie Mitchell, Thomas Pellizzari, eds.
Non-Standard Computation: molecular computation, cellular automata, evolutionary algorithms, quantum computers.
Wiley-VCH. 1998

rating : 4.5 : passes the time
review : 2 June 1999

There’s never enough computer power for challenging questions. Problems such as the design of turbines consisting of more than 100 parts, or the simulation of systems of some 50 interacting particles, are far beyond today’s computer capacities. Or how do you find the shortest phone line connecting 100 given cities?

The most promising answers to such questions come from unconventional technologies. The massive parallelism of molecular computers or the ingenious use of quantum systems by universal quantum computers provide solutions to the dilemma. Designed to solve specific problems, they show unprecedented performance.

And as for the phone line problem – genetic algorithms mimic the way nature found its way from the first cells to today’s creatures. While relying on conventional computer hardware, they introduce an element of chance at the software level, thus circumventing the disadvantages of traditional deterministic algorithms.

A textbook for those shaping the future of computing, this volume is also pure fun. Learn about the physical principles of tomorrow’s technology, and enjoy a marvellous tour through their potential applications!

This promised to be a very interesting book, but it was let down for me by being too low level – too much about the scientific and technological bases, and not enough about any new computational paradigms. (The very poor level of proof reading, with some chapters thick with spelling mistakes, also detracts.)

I was hoping for an overview of what new tools are being added to our computational capability, with maybe a review of the current state of the art, but what I got was a bunch of essays that have an idiosyncratic viewpoint, with all the details in the wrong places (for me, at least).

For example, the chapter on Genetic Algorithms devotes hardly any space to the schemata model (beyond saying it is intuitive), but instead develops a “statistical mechanics” model, without then providing the intuition of how this model helps us to cast or solve new computational problems. It also seems to imply that mutation is the key concept, with cross-over just an interesting second-order add-on (whereas studies of some genetic algorithms show that cross-over is key, with mutation playing a surprisingly small role).
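
For readers unfamiliar with the operators being argued over, here is a minimal illustrative sketch (mine, not the book’s): a genetic algorithm on bit strings that maximises the number of 1s, with one-point cross-over and bit-flip mutation. The problem, population size, and rates are arbitrary choices for illustration.

    # Minimal genetic algorithm sketch: maximise the number of 1s in a bit
    # string ("one-max"), using tournament selection, one-point cross-over,
    # and bit-flip mutation. All parameters are arbitrary illustrative values.
    import random

    GENOME_LEN = 64
    POP_SIZE = 100
    GENERATIONS = 200
    CROSSOVER_RATE = 0.7   # probability a pair is recombined rather than copied
    MUTATION_RATE = 0.01   # per-bit flip probability

    def fitness(genome):
        return sum(genome)             # one-max: count the 1 bits

    def select(population):
        # tournament selection of size 2
        a, b = random.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    def crossover(p1, p2):
        if random.random() < CROSSOVER_RATE:
            point = random.randrange(1, GENOME_LEN)
            return p1[:point] + p2[point:]
        return p1[:]

    def mutate(genome):
        return [bit ^ 1 if random.random() < MUTATION_RATE else bit
                for bit in genome]

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    for generation in range(GENERATIONS):
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POP_SIZE)]
    best = max(population, key=fitness)
    print(f"best fitness after {GENERATIONS} generations: {fitness(best)}/{GENOME_LEN}")

Setting CROSSOVER_RATE or MUTATION_RATE to zero in a toy like this is the easiest way to get a feel for the relative contributions of the two operators.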

The two chapters on quantum computing range over the theoretical QM underpinnings, and the current technology, but again provide no intuition of how these devices work as computers. (And the second of these chapters has an almost useless bibliography, since it omits the papers’ titles.)

So I was left disappointed.

Contents

Heinz Georg Schuster. Introduction to Non-standard Computation. 1998
Michael Gross. Molecular Computation. 1998
Using DNA to solve computational problems can allow massive parallelism, by exploring 10^19 cases in parallel. This doesn't remove the trouble with exponentially difficult problems -- but it does delay it by several orders of magnitude! Currently, to program such a device, one needs to be a good bench chemist -- but DNA-sequencing technologies might make automatic compilation easier.
    Supramolecules are large molecules that are (partly) built from non-covalent bonds. Natural supramolecules include DNA; artificial ones also look promising for computational applications.
Stefan Bornholdt. Genetic Algorithms. 1998
Solving optimisation problems with artificial evolution
Melanie Mitchell. Computation in Cellular Automata: A Selected Review. 1998
A good overview of what cellular automata are, and a clear description of how higher-level 'particles' can be used to design cellular automata algorithms (a minimal sketch of a cellular automaton follows this contents list). [By far the best chapter. I find cellular automata intrinsically interesting, but this chapter still left me wondering what advantages they have as computers.]
Tino Gramss. The Theory of Quantum Computation: An Introduction. 1998
Theoretical quantum mechanical underpinnings of quantum computers -- all Hamiltonians and reversibility
Thomas Pellizzari. Quantum Computers: First Steps Towards a Realization. 1998
Current technology for building (very small!) quantum computers, including error correction
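
Since the annotations above can only gesture at what a cellular automaton is, here is a minimal illustrative sketch (mine, not the book's) of an elementary one-dimensional cellular automaton; rule 110, the grid size, and the single-cell initial condition are arbitrary example choices.

    # Elementary one-dimensional cellular automaton: every cell holds 0 or 1,
    # and all cells update in parallel from their own state and their two
    # neighbours' states, via a fixed lookup table (Wolfram's rule 110 here,
    # chosen purely as an example).
    RULE = 110
    WIDTH = 64
    STEPS = 32

    # rule table: neighbourhood (left, centre, right) -> new cell state
    rule_table = {(l, c, r): (RULE >> (l * 4 + c * 2 + r)) & 1
                  for l in (0, 1) for c in (0, 1) for r in (0, 1)}

    def step(cells):
        # periodic boundary: the row wraps around at both ends
        return [rule_table[(cells[i - 1], cells[i], cells[(i + 1) % WIDTH])]
                for i in range(WIDTH)]

    cells = [0] * WIDTH
    cells[WIDTH // 2] = 1          # a single 1 in the middle as initial condition
    for _ in range(STEPS):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)

The 'particles' of Mitchell's chapter are, roughly, regularities that propagate across space-time diagrams like the one this prints, and that can be composed to carry out computations.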

Lashon B. Booker, Stephanie Forrest, Melanie Mitchell, Rick L. Riolo, eds.
Perspectives on Adaptation in Natural and Artificial Systems.
OUP. 2005

Contents

Kenneth De Jong. Genetic Algorithms: A 30 Year Perspective. 2005
John R. Koza. Human-Competitive Machine Intelligence by Means of Genetic Algorithms. 2005
David E. Goldberg. John Holland, Facetwise models, and Economy of Thought. 2005
Arthur W. Burks. An Early Graduate Program in Computers and Communications. 2005
Oliver G. Selfridge. Had We But World Enough and Time. 2005
Bernard P. Zeigler. Discrete Event Abstraction: An Emerging Paradigm for Modeling Complex Adaptive Systems. 2005
Herbert A. Simon. Good Old-Fashioned AI and Genetic Algorithms: An Exercise in Translation Scholarship. 2005
Douglas R. Hofstadter. Moore's Law, Artificial Evolution, and the Fate of Humanity. 2005
Julian Adams. Evolution of Complexity in Microbial Populations. 2005
Bobbi S. Low, Doug Finkbeiner, Carl Simon. Favored Places in the Selfish Herd: Trading Off Food and Security. 2005
Rick L. Riolo, Robert Axelrod, Michael D. Cohen. Tags, Interaction Patterns and the Evolution of Cooperation. 2005
Robert G. Reynolds, Salah Saleem. The Impact of Environmental Dynamics on Cultural Emergence. 2005
Kenneth J. Arrow. John Holland and the Evolution of Economics. 2005
W. Brian Arthur. Cognition: The Black Box of Economics. 2005

Melanie Mitchell.
Complexity: a guided tour.
OUP. 2009

(read but not reviewed)

Melanie Mitchell.
Artificial Intelligence: a guide for thinking humans.
Pelican. 2019

How intelligent are the best AI programs?

Can we entrust them with decisions that affect our lives?

How soon do we need to worry about them surpassing us?