Perceptrons: An Introduction to Computational Geometry
By Marvin Minsky and Seymour A. Papert
I’ve read and loved Papert’s book Mindstorms, and I’ve owned and ogled Minsky’s Society of Mind for about a year without getting around to reading it. When I saw that these two academic giants had written a book together, I couldn’t help but take a closer look, even if this particular subject matter looks way over my head. (Also, I love the cover! Sometimes that really helps solidify a book’s appeal, you know?)
This goes into such fascinating topics as neural networks, computer learning, logic, topology, and pattern recognition. The index is filled with names I do not recognize and mathematical terminology in a language I do not speak. The actual preview is limited to a few pages of the prologue; I can’t even see the table of contents to get a sense of the overall structure and thrust of the book. Such shortcomings, such random and fickle tides of algorithmic preview, make it difficult to complete my task!
Anyway: this is “the first systematic study of parallelism in computation” and is apparently a “classical work on threshold automata networks”. I’ll take it on faith that that’s a legitimate genre. Suffice it to say it’s an important book in AI research — from what I gather, it played a large role in articulating some of the potential and some of the limitations of the kind of computational intelligence potentially arising from neural networks.
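(As an aside, a tiny sketch of the kind of limitation I gather the book is famous for analyzing — this example is mine, not the book’s, and the training code is just the textbook perceptron learning rule: a single-layer perceptron can learn a linearly separable function like AND, but no setting of its weights can compute XOR.)

```python
# Hypothetical illustration (not from the book): the classic perceptron
# learning rule on 2-input boolean functions. AND is linearly separable
# and is learned perfectly; XOR is not, so the perceptron must fail.

def train_perceptron(samples, epochs=20, lr=1.0):
    """Perceptron learning rule: nudge weights by the prediction error."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(samples, w, b):
    correct = sum(
        (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == t
        for (x1, x2), t in samples
    )
    return correct / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print(accuracy(AND, w, b))  # linearly separable: reaches perfect accuracy

w, b = train_perceptron(XOR)
print(accuracy(XOR, w, b))  # not linearly separable: can never reach 1.0
```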
I don’t understand what exactly computational geometry is, but hey, I read things like “the history of the perceptron demonstrates the complexity of analyzing neural networks” and I find that intriguing, so I put the book on my list. I’m not the first one this book has confused, though. Apparently it’s been the source of some confusion and debate, of popular misconceptions, and of some strange turns in the direction of AI research. In any case, there are a lot of things I can imagine reading about in a book called “Perceptrons”; with some books, probably including this one, I think it’s more fun to leave them imagined.