Turing's Cathedral: The Origins of the Digital Universe
EUR 36.18
  • All prices include VAT.
Only 1 left in stock
Sold and shipped by Amazon.
Gift wrapping available.

Turing's Cathedral: The Origins of the Digital Universe (English) Audio CD – Audiobook, Unabridged Edition


All 11 formats and editions:

Format                                     Amazon Price   New from    Used from
Kindle Edition                             -              -           -
Audio CD, Audiobook, Unabridged Edition    EUR 36.18      EUR 36.18   EUR 33.05
Unknown Binding                            -              -           -

3 new from EUR 36.18 | 4 used from EUR 33.05




Product Descriptions

Press Reviews

“An expansive narrative . . . The book brims with unexpected detail. Maybe the bomb (or the specter of the machines) affected everyone. Gödel believed his food was poisoned and starved himself to death. Turing, persecuted for his homosexuality, actually did die of poisoning, perhaps by biting a cyanide-laced apple. Less well known is the tragic end of Klári von Neumann, a depressive Jewish socialite who became one of the world’s first machine-language programmers and enacted the grandest suicide of the lot, downing cocktails before walking into the Pacific surf in a black dress with fur cuffs. Dyson’s well made sentences are worthy of these operatic contradictions . . . A groundbreaking history of the Princeton computer.”
—William Poundstone, The New York Times Book Review

“Dyson combines his prodigious skills as a historian and writer with his privileged position within the [Institute for Advanced Study’s] history to present a vivid account of the digital computer project . . .  A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
—Cory Doctorow, Boing Boing
 
“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ’50s . . . It demonstrates that the power of human thought often precedes determination and creativity in the birth of world-changing technology . . . An important work.”
—Richard DiDio, Philadelphia Inquirer
 
“Dyson’s book is not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
—Josh Rothman, Brainiac blog, Boston Globe
 
“Beyond the importance of this book as a contribution to the history of science, as a generalist I was struck by Dyson’s eye and ear for the delightfully entertaining detail . . . Turing’s Cathedral is suffused . . . with moments of insight, quirk and hilarity rendering it more than just a great book about science. It’s a great book, period.”
—Douglas Bell, The Globe and Mail
 
“The greatest strength of Turing’s Cathedral lies in its luscious wealth of anecdotal details about von Neumann and his band of scientific geniuses at IAS.  Dyson himself is the son of Freeman Dyson, one of America’s greatest twentieth-century physicists and an IAS member from 1948 onward, and so Turing’s Cathedral is, in part, Dyson’s attempt to make both moral and intellectual sense of his father’s glittering and yet severely compromised scientific generation.”
—Andrew Keen, B&N Review

“A mesmerizing tale brilliantly told . . . . The use of wonderful quotes and pithy sketches of the brilliant cast of characters further enriches the text . . . . Meticulously researched and packed with not just technological details, but sociopolitical and cultural details as well—the definitive history of the computer.”
—Kirkus (starred review)
 
“The most powerful technology of the last century was not the atomic bomb, but software—and both were invented by the same folks. Even as they were inventing it, the original geniuses imagined almost everything software has become since. At long last, George Dyson delivers the untold story of software’s creation. It is an amazing tale brilliantly deciphered.”
—Kevin Kelly, cofounder of WIRED magazine, author of What Technology Wants
 
“It is a joy to read George Dyson’s revelation of the very human story of the invention of the electronic computer, which he tells with wit, authority, and insight. Read Turing’s Cathedral as both the origin story of our digital universe and as a perceptive glimpse into its future.”
—W. Daniel Hillis, inventor of The Connection Machine, author of The Pattern on the Stone

About the Author and Other Contributors

George Dyson is a historian of technology whose interests include the development (and redevelopment) of the Aleut kayak (Baidarka), the evolution of digital computing and telecommunications (Darwin Among the Machines), and the exploration of space (Project Orion).





Customer Reviews

There are no customer reviews yet on Amazon.de.

The most helpful customer reviews on Amazon.com (beta)

Amazon.com: 109 reviews
125 of 137 customers found the following review helpful
How it came from bit - March 7, 2012
By A. Jogalekar - Published on Amazon.com
Format: Hardcover | Verified Purchase
The physicist John Wheeler, who was famous for his neologisms, once remarked that the essence of the universe could be boiled down to the phrase "it from bit", signifying the creation of matter from information. This description encompasses the digital universe that now so completely pervades our existence. Many moments in history could lay claim to being the origin of this universe, but as George Dyson marvelously documents in "Turing's Cathedral", the period between 1945 and 1957 at the Institute for Advanced Study (IAS) in Princeton is as good a candidate as any.

Dyson's book focuses on the pioneering development of computing during the decade after World War II and essentially centers on one man: John von Neumann. Von Neumann is one of the very few people in history to whom the label "genius" can authentically be applied. The sheer diversity of fields to which he made important contributions beggars belief; Wikipedia lists at least twenty, ranging from quantum mechanics to game theory to biology. Von Neumann's mind ranged across a staggeringly wide expanse of thought, from the purest of mathematics to the most applied nuclear weapons physics. The book recounts the pathbreaking efforts of him and his team to build a novel computer at the IAS in the late 1940s. Today, when we are immersed in a sea of computer-generated information, it is easy to take the essential idea of a computer for granted. That idea was not the transistor or the integrated circuit or even the programming language, but the groundbreaking notion that you could have a machine where both data AND the instructions for manipulating that data could be stored in the same place, encoded in a common binary language. That was von Neumann's great insight, which built upon Alan Turing's basic abstract idea of a computing machine. The resulting concept of a stored program is at the foundation of every single computer in the world. The IAS computer practically validated this concept and breathed life into our modern digital universe. By present standards its computing power was vanishingly small, but the technological future it unleashed has been limitless.
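To make the stored-program idea concrete, here is a minimal sketch in Python of a machine whose instructions and data share one memory and one numeric encoding. The word format and opcodes are invented for illustration; they are not the actual IAS/MANIAC instruction set.

```python
# A toy stored-program machine: every memory word is just a number,
# whether it encodes an instruction or a piece of data.
# Word format (invented for this sketch): opcode * 100 + address.

memory = [
    106,   # 0: LOAD  mem[6] into the accumulator
    207,   # 1: ADD   mem[7] to the accumulator
    308,   # 2: STORE the accumulator into mem[8]
    0,     # 3: HALT
    0, 0,  # 4-5: unused
    40,    # 6: data
    2,     # 7: data
    0,     # 8: result goes here
]

def run(mem):
    acc, pc = 0, 0
    while True:
        op, addr = divmod(mem[pc], 100)  # fetch and decode one word
        pc += 1
        if op == 0:                      # HALT
            return acc
        if op == 1:                      # LOAD
            acc = mem[addr]
        elif op == 2:                    # ADD
            acc += mem[addr]
        elif op == 3:                    # STORE: code and data share memory,
            mem[addr] = acc              # so a program could even rewrite itself

print(run(memory))  # -> 42, now also stored in memory[8]
```

Because an instruction is just a number in the same store, a program can in principle rewrite its own instructions while running; that single property underlies the universality the review describes.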

Dyson's book excels mainly in three ways. Firstly, it presents a lively history of the IAS, the brilliant minds who worked there, and the culture of pure thought that often looked down on von Neumann's practical computational tinkering. Secondly, it discusses the provenance of von Neumann's ideas, which partly arose from his need to perform complex calculations of the events occurring in a thermonuclear explosion. These top-secret calculations were quietly run at night on the IAS computer and in turn were used to tweak the computer's workings; as Dyson pithily puts it, "computers built bombs, and bombs built computers". Von Neumann also significantly contributed to the ENIAC computer project at the University of Pennsylvania. Thirdly, Dyson brings us evocative profiles of a variety of colorful and brilliant characters clustered around von Neumann who contributed to the intersection of computing with a constellation of key scientific fields that are now at the cutting edge.

There was the fascinating Stan Ulam, who came up with a novel method for calculating complex processes - the Monte Carlo technique - that is used in everything from economic analysis to biology. Ulam, who was one of the inventors of thermonuclear weapons, originally used the technique to calculate the multiplication of neutrons in a hydrogen bomb. Then there was Jule Charney, who set up some of the first weather pattern calculations, early forerunners of modern climate models. Charney was trying to implement von Neumann's grand dream of controlling the weather, but neither he nor von Neumann could anticipate chaos and the fundamental sensitivity of weather to tiny fluctuations. Dyson's book also pays due homage to an under-appreciated character, Nils Barricelli, who used the IAS computer to embark on a remarkable set of early experiments that sought to duplicate evolution and artificial life. In the process Barricelli discovered fascinating properties of code, including replication and parasitism, that mirrored some of the great discoveries taking place in molecular biology at the time. As Dyson tells us, there were clear parallels between biology and computing: both depended on sequences of code, although biology thrived on error-prone duplication (leading to variation) while computing actively sought to avoid it. Working on computing and thinking about biology, von Neumann anticipated the genesis of self-reproducing machines, which have fueled the imagination of both science fiction fans and leading researchers in nanotechnology.
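For a flavor of the Monte Carlo idea, here is a toy sketch in Python that estimates a neutron multiplication factor by sampling random neutron fates. The probability and yields below are invented for illustration; Ulam and von Neumann's actual calculations tracked vastly more physics.

```python
import random

# Toy Monte Carlo sketch: estimate the neutron multiplication factor k
# of a hypothetical assembly by averaging over random trials.
# All numbers here are made up for illustration only.

P_FISSION = 0.4  # assumed chance that a given neutron causes a fission

def estimate_k(trials=100_000):
    next_generation = 0
    for _ in range(trials):
        if random.random() < P_FISSION:                     # does this neutron fission?
            next_generation += 2 + (random.random() < 0.5)  # assumed yield: 2 or 3 neutrons
    return next_generation / trials                         # average offspring per neutron

print(f"k = {estimate_k():.3f}")  # expected value: 0.4 * 2.5 = 1.0
```

The method's power is that the same sampling trick still works when no closed-form answer exists, which is what carries it from bomb design to economics and biology.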

Finally, Dyson introduces us to the remarkable engineers who were at the heart of the computing projects. Foremost among them was Julian Bigelow, a versatile man who could both understand code and fix a car. Bigelow's indispensable role in building the IAS computer brings up an important point: while von Neumann may have represented the very pinnacle of abstract thought, his computer wouldn't have gotten off the ground had Bigelow and his group of bright engineers not gotten their hands dirty. Great credit also goes to the two lead engineers on the ENIAC project, J. Presper Eckert and John Mauchly, who were rather unfairly relegated to the shadows and sidetracked by history. Dyson rightly places as much emphasis on discussing the nitty-gritty of the engineering hurdles behind the IAS computer as he does on its lofty mathematical underpinnings. He makes it clear that the ascendancy of a revolutionary technology requires both novel theoretical ideas and fine craftsmanship. Unfortunately, in this case the craftsmanship was ultimately trampled by the institute's mathematicians and humanists, which only added to its reputation as a refuge for ivory tower intellectuals who considered themselves above pedestrian concerns like engineering. At the end of the computing project the institute passed a resolution forbidding any kind of experimentation from ever taking place again; perhaps in keeping with his son's future interest in the topic, Freeman Dyson (who once worked on a nuclear spaceship and genuinely appreciates engineering details) was one of the few dissenting voices. But this was not before the IAS project spawned a variety of similar machines which partly underlie today's computing technology.

All these accounts are supplemented with gripping stories about weather prediction, the US thermonuclear program, evolutionary biology, and the emigration of European intellectuals like Kurt Gödel and von Neumann to the United States. The book does have its flaws, though. For one thing, it focuses too heavily on von Neumann and the IAS. Dyson says relatively little about Turing himself, about pioneering computing efforts at Manchester and Cambridge (the first stored-program computer in fact was the Manchester "Baby" machine), and about the equally seminal development of information theory by Claude Shannon. James Gleick's "The Information" and Andrew Hodges's "Alan Turing: The Enigma" might be useful complements to Dyson's volume. In addition, Dyson often meanders into one too many digressions that break the flow of the narrative; for instance, do we really need to know so much about Kurt Gödel's difficulties in obtaining a visa? And do we need to get bogged down in minutiae such as the starting dates and salaries for every member of the project and the list of items on the cafeteria menu? Details like these might put casual readers off.

Notwithstanding these gripes, the book is beautifully written and exhaustively researched with copious quotes from the main characters. It's certainly the most detailed account of the IAS computer project that I have seen. If you want to know about the basic underpinnings of our digital universe, this is a great place to start even with its omissions. All the implications, pitfalls and possibilities of multiple scientific revolutions can be connected in one way or another to that little machine running quietly in a basement in Princeton.
341 of 386 customers found the following review helpful
Misleading - March 7, 2012
By Jeremy E. May - Published on Amazon.com
Format: Hardcover
The focus of George Dyson's well-written, fascinating but essentially misleading book, 'Turing's Cathedral', is curiously not on celebrated mathematician, code-breaker and computer theorist Alan Turing but on his equally gifted and innovative contemporary John von Neumann. Von Neumann, whose extraordinarily varied scientific activities included inter alia significant contributions to game theory, thermodynamics and nuclear physics, is especially associated with the early development of the electronic digital computer (i.e. the 'EDC'), an interest apparently sparked by reading Turing's seminal 1936 paper 'On Computable Numbers', which attempted to systematize and express in mathematical terminology the principles underlying a purely mechanical process of computation. Implicit in this article, but at a very theoretical level, was a recognition of the relevance of stored-program processing (whereby a machine's instructions and data reside in the same memory), a concept emanating from the work of mid-Victorian computer pioneer Charles Babbage but which demanded a much later electronic environment for effective realization.
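Turing's abstraction is compact enough to state in a few lines of code. The sketch below, in Python, is a purely illustrative machine of my own devising in the style of Turing's 1936 paper: a finite rule table reading and writing symbols on an unbounded tape, here incrementing a binary number.

```python
from collections import defaultdict

# A minimal Turing-style machine (states and rules invented for this
# sketch): each rule maps (state, symbol) -> (write, move, next_state).
# This particular table adds 1 to a binary number written on the tape.

RULES = {
    ("right", "0"): ("0", +1, "right"),  # scan right to the end of the number
    ("right", "1"): ("1", +1, "right"),
    ("right", " "): (" ", -1, "carry"),  # hit the blank, turn around
    ("carry", "1"): ("0", -1, "carry"),  # 1 + carry = 0, keep carrying
    ("carry", "0"): ("1", 0, "halt"),    # 0 + carry = 1, done
    ("carry", " "): ("1", 0, "halt"),    # overflow into a new leading digit
}

def run(tape_str):
    tape = defaultdict(lambda: " ", enumerate(tape_str))  # unbounded tape of symbols
    head, state = 0, "right"
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in range(min(tape), max(tape) + 1)).strip()

print(run("1011"))  # -> "1100"
```

The rule table itself is ordinary data, which is exactly the observation that the stored-program concept later made mechanical.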

What Mr Dyson insufficiently emphasizes is that, despite a widespread and ever-growing influence on the mathematical community, Turing's paper was largely ignored by contemporary electronic engineers and had negligible overall impact on the early development of the EDC. Additionally, he omits to adequately point out that von Neumann's foray into the new science of electronic computers involved a virtually total dependence on the prior work, input and ongoing support of his engineering colleagues. Invited in August 1944 to join the Moore School (University of Pennsylvania) team responsible for ENIAC, the world's first general-purpose computer, then being built for the US Army, von Neumann was quickly brought up to speed courtesy of the machine's lead engineers, J. Presper Eckert and John Mauchly. As early as the fall of 1943, Eckert and Mauchly had become seriously frustrated by the severe processing limitations imposed by ENIAC's design and were giving serious consideration to implementing major modifications, in particular the adoption of Eckert's own mercury delay line technology to boost the machine's minuscule memory capacity and enable a primitive stored-program capability. These proposals were subsequently vetoed by the School's authorities on the quite understandable grounds that they would seriously delay ENIAC's delivery date; instead it was decided to simultaneously begin research on a more advanced machine (i.e. EDVAC) to incorporate the latest developments. As a new member of the group, von Neumann speedily grasped the essentials of the new science and contributed valuable theoretical feedback, but an almost total lack of hands-on electronic expertise on his part prevented any serious contribution to the nuts and bolts of the project. Relations with Eckert and Mauchly rapidly deteriorated when an elegantly written, but very high-level, document of his entitled 'First Draft of a Report on the EDVAC' was circulated among the scientific community. Not only had this document not been previewed, let alone pre-approved, by Eckert and Mauchly, but it bore no acknowledgment whatsoever of their overwhelming responsibility for much of the content. By default, and in view too of his already very considerable international reputation, the content was therefore attributed exclusively to von Neumann, an impression he made no attempt thereafter to correct, the term 'von Neumann architecture' being subsequently bestowed on the stored-program setup described in the document.

The public distribution of von Neumann's 'Draft' denied Eckert and Mauchly the opportunity to patent their technology. Worse still, despite academic precedents to the contrary, they were refused permission by the Moore School to proceed with EDVAC's development on a commercial basis. In spite of his own links to big business (he represented IBM as a consultant), von Neumann likewise opposed their efforts to do so. All this resulted in a major rift, von Neumann thereafter being shunned by Eckert and Mauchly and forced to rely on lesser mortals to help implement various stored-program projects, notably the IAS computer at Princeton. The following year (1946) Eckert and Mauchly left the School to focus on developing machines for the business market. Before doing so, they jointly delivered a series of state-of-the-art lectures on ENIAC and EDVAC to an invited audience at the School. Among the attendees was British electronics engineer Maurice Wilkes, a fellow academic of Turing's from Cambridge University, but with relatively little interest in the latter's ongoing activity (by this time Turing, a great visionary, had also turned his attention to designing stored-program computers). Blown away by Eckert and Mauchly's presentation, Wilkes returned to England to forge ahead with a new machine called EDSAC, which was completed in May 1949 and represented the first truly viable example of a stored-program computer (an experimental prototype christened 'Baby' had already been developed at Manchester University the year before). Back in the US, Eckert and Mauchly continued their efforts, but persistent problems with funding and also Eckert's own staunch refusal to compromise on quality delayed progress, their partnership finally culminating in the development of the UNIVAC I, the world's first overtly business-oriented computer, delivered initially to the Census Bureau in March 1951.

Mr Dyson is quite right of course (and he does this well) to trace the beginnings of the modern computer to the stored-program concept, but his obsessive focus on von Neumann's role obscures the impact of Eckert and Mauchly's vastly more significant contribution to its development. The triumph of the EDC depended almost wholly on the efforts and expertise of utterly dedicated and outstanding electronics specialists like them, not on mathematicians, logicians and generalists like von Neumann or even Turing. Never one to deny credit where it was due, Wilkes (who later spearheaded advances in software, became the doyen of Britain's electronics community and ended his long and distinguished career as professor emeritus of computer science at Cambridge) unceasingly acknowledged his major debt to Eckert and Mauchly. Hopefully Mr Dyson, a writer of considerable talent, might one day decide to tell their story in full and set the record straight.
119 of 139 customers found the following review helpful
Digital History that Reads Like Code - March 9, 2012
By Book Shark - Published on Amazon.com
Format: Hardcover | Verified Purchase
Turing's Cathedral: The Origins of the Digital Universe by George Dyson

"Turing's Cathedral" is the uninspiring and rather dry book about the origins of the digital universe. With a title like, "Turing's Cathedral" I was expecting a riveting account about the heroic acts of Alan Turing the father of modern computer science and whose work was instrumental in breaking the wartime Enigma codes. Instead, I get a solid albeit "research-feeling" book about John von Neumann's project to construct Turing's vision of a Universal Machine. The book covers the "explosion" of the digital universe and those applications that propelled them in the aftermath of World War II. Historian of technology, George Dyson does a commendable job of research and provide some interesting stories involving the birth and development of the digital age and the great minds behind it. This 432-page book is composed of the following eighteen chapters: 1.1953, 2. Olden Farm, 3. Veblen's Circle, 4. Neumann Janos, 5. MANIAC, 6. Fuld 219, 7. 6J6, 8. V-40, 9. Cyclogenesis, 10. Monte Carlo, 11. Ulam's Demons, 12. Barricelli's Universe, 13. Turing's Cathedral, 14. Engineer's Dreams, 15. Theory of Self-Reproducing Automota, 16. Mach 9, 17. The Tale of the Big Computer, and 18. The Thirty-ninth Step.

Positives:
1. A well researched book. The author faces a daunting task of research but pulls it together.
2. The fascinating topic of the birth of the digital universe.
3. A who's who of science and engineering icons of what would eventually become computer science. A list of principal characters was very welcome.
4. For computer lovers who want to learn the history of the pioneers behind digital computing, this book is for you.
5. Some facts will "blow" you away, "In March 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth".
6. Some goals are counterintuitive. "The new computer was assigned two problems: how to destroy life as we know it, and how to create life of unknown forms".
7. There are some interesting philosophical considerations.
8. As an engineer, I enjoy the engineering challenges involved with some of their projects.
9. Amazing how the Nazi threat gave America access to some of the greatest minds. The author does a good job of describing these stories.
10. The fascinating life of the main character of this book, John von Neumann.
11. So much history interspersed throughout this book.
12. The ENIAC... "a very personal computer". A large portion of this book is dedicated to the original computer concepts, challenges, parts, testing, etc.
13. The fundamental importance of Turing's paper of 1936. It's the inspiration behind the history of the digital universe.
14. Some amusing tidbits here and there, including Einstein's diet.
15. The influence of Gödel. How he set the stage for the digital revolution.
16. Blown away by Leibniz. In 1679 (yes, 1679) he had already imagined a digital computer with binary numbers...
17. So many great stories of how these great minds attacked engineering challenges. Computer scientists will get plenty of chuckles with some of these stories involving the types of parts used in the genesis of computing. Vacuum tubes as an example.
18. There are many engineering principles devised early on that remain intact today. Among many examples, Bigelow provides plenty of axioms.
19. I enjoyed the stories involving how computers improved the art of forecasting the weather.
20. "Filter out the noise". A recurring theme and engineering practice that makes its presence felt in this book.
21. Computers and nuclear weapons.
22. The Monte Carlo method, a new key domain in mathematical physics, and its invaluable contribution to the digital age.
23. The fascinating story of the summer of 1943 at Los Alamos.
24. The Teller-Ulam invention.
25. How the digital universe and the hydrogen bomb were brought into existence simultaneously.
26. Barricelli and an interesting perspective on biological evolution.
27. The amazing life of Alan Mathison Turing and his heroic contributions.
28. A fascinating look at the philosophy of artificial intelligence and its future.
29. The collision between the digital universe and two existing stores of information: genetic codes and information stored in brains.
30. The basis for the power of computers.
31. The five distinct sets of problems running on the MANIAC by mid-1953. All in JUST 5 kilobytes.
32. A look at global digital expansion and where we are today.
33. The unique perspective of Hannes Alfvén. Cosmology.
34. The future of computer science.
35. Great quotes: "What if the price of machines that think is people who don't?"
36. The author does a great job of providing a "where are they now" narration of all the main characters of the book.
37. Links worked great.
38. Some great illustrations in the appendix of the book. It's always great to put a face on people involved in this story.

Negatives:
1. It wasn't an enjoyable read. Plain and simple, this book was tedious to read. The author lacked panache.
2. The title is misleading. It is a metaphor inspired by Google's headquarters in California: the author, who was given a glimpse inside that organization, sensed Turing's vision of a gathering of all available answers and possible equations mapped out in this awe-inspiring facility. My disappointment is that, despite being inspired by Alan Turing's vision, the book has only one chapter dedicated to him. The main driver behind this book was really John von Neumann.
3. A timeline chart would have added value. With so many stories going back and forth, it would help the reader stay grounded in the period in which each event occurred.
4. Some of the stories really took the scenic route to get to the point.
5. The photos should have been included within the context of the book instead of in a separate section of their own.
6. The book was probably a hundred pages too long.

In summary, I didn't enjoy reading this book. The topic was of interest to me, but between the misleading title and the very dry prose, the book became tedious and not intellectually satisfying. The book felt more like a research paper than a book intended for a general audience. For the record, I am an engineer, and a lot of the topics covered in this book are near and dear to my heart, but the author was never able to connect with me. This book is well researched and includes some fascinating stories about some of the icons of science and the engineering involved with the digital origins, but I felt like I was reading code instead of a story. This book will have a limited audience; if you are an engineer, scientist, or in the computer field, this book may be of interest, but be forewarned: it is a monotonous and uninspiring read.

Recommendations: "Steve Jobs" by Walter Isaacson, "The Quantum Universe: (And Why Anything That Can Happen, Does)" by Brian Cox, "Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100" by Michio Kaku, "Warnings: The True Story of How Science Tamed the Weather" by Mike Smith, and "Spycraft: The Secret History of the CIA's Spytechs, from Communism to Al-Qaeda" by Robert Wallace and H. Keith Melton.
27 of 31 customers found the following review helpful
A big disappointment - April 21, 2012
By Narzul Patrick - Published on Amazon.com
Format: Hardcover | Verified Purchase
I bought this book as soon as it came out, after reading a good review in the "Guardian". The subject matter is certainly fascinating, but its treatment by the author is, in my opinion, awful. I was expecting a simple technical description of the basic concepts of early computing (like the famous "40-bit line of code", which he mentions without ever describing it). The only interesting technical discussion concerns the challenges faced by the engineers who built the machine in Princeton. Otherwise, we are treated to the biographies of all the main characters in the story (sometimes going back to their grandparents...) and we learn more about partying in Budapest in the '20s than about Turing's ideas.
Of course, the description of how military applications triggered the development of the first computers is very interesting, but it could have been much more streamlined.
Finally, I found the discussion of computer intelligence very wishy-washy, not based on clear facts and arguments, and, once more, disappointing.
12 of 12 customers found the following review helpful
Turing's Apple Orchard - May 23, 2012
By B. Abramson - Published on Amazon.com
Format: Hardcover
This is a fascinating, occasionally irritating, extraordinarily well-researched examination of the emergence of digital computers and the part played by the Institute for Advanced Study in Princeton, NJ. It is highly recommended for anyone who wants to learn more about early computing machines and the people who made them possible.

In many ways this is as much a biography of John von Neumann as it is a history of the origins of modern computing and the information universe. Von Neumann was astonishingly talented and was centrally involved with many of the intellectual and technological revolutions of the 20th century. The early story of computing cannot be told without concentrating on him. Alan Turing was also remarkably talented and pivotal in the origins of computing. In this, Turing's centenary year, one has to look to Andrew Hodges's biography for a definitive description of his role in these matters.

Others have pointed to problems with the book. I feel these are minor:

- the title is somewhat misleading: the book is not focused on Alan Turing and it only tangentially concerns itself with the cathedral of information that IT has made available to us (see below),

- the digressions, whether into the disposition of George Washington's forces before the battle of Princeton, the design of the hydrogen bomb, or the parallels between the evolution of genes and the evolution of code (programs), can be either fascinating or tedious depending on your interests. I enjoyed them,

- it is assumed that the reader knows what registers, accumulators, and the other standard components of a central processor are. It is also assumed that the reader is familiar with relatively obscure computing techniques such as content-addressable memory,

- Dyson strains to the breaking point to find neat analogies to describe the emergence of "the digital universe." For example, "Turing's model was one-dimensional...von Neumann's implementation was two dimensional. The landscape is now three-dimensional". In fact, Turing's model was deliberately designed to be as simple as it possibly could be, MANIAC used a three-dimensional memory, and the digital universe is at least four-dimensional.

However, not all the criticisms of the book are valid. To pick just one example:

- Dyson is concerned about the role that the IAS played in the emergence of our modern information universe. He acknowledges that Eckert and Mauchly played a pivotal role, but he isn't trying to disentangle who should get how much credit for which aspect of creating the modern stored-program electronic computer. The impossibility of doing so is declared in the Preface.

I find it interesting that both von Neumann and Turing were absorbed with the idea that future stages of computer development would be towards artificial intelligence, self-reproducing machines, and machine consciousness. They saw, and Turing for one feared, computers as potentially replacing humans as the next evolutionary step. Neither of them, and, as far as one can tell, no other pioneer except Thomas J. Watson, saw that the computing ability of machines is of lesser importance. Far more important, in fact crucial, has been the integration of computers, communications, and, above all, information. This is what constitutes the digital universe.

Dyson describes a visit to Google's HQ, saying it felt like "entering a fourteenth-century cathedral". Is it the building or the work that goes on there that reminds him of a huge edifice dedicated to the soul? I suspect he deliberately allows this to remain ambiguous.

Nevertheless, there is some incongruity between this metaphor, which is clearly significant as it provides the book's title, and the very strong emphasis in the later chapters on the biological character of computing. Cathedrals are everything that biology is not: rigid and frigid. Indeed, the similarity between biological life and computing is a recurring theme. It may be helpful to use biological analogies to give insight into the future of computing, but Dyson appears to believe that these are actual descriptions, not analogies. For example, "Google's one million servers constitute a collective, metazoan organism." The server network may resemble an organism, but it is not one. Further, he devotes a chapter to Barricelli's early and relatively unsuccessful efforts to simulate evolution using MANIAC. This chapter ends with some elegant but absurd statements such as "We have already outsourced much of our cultural heritage to the Internet, and are outsourcing our genetic inheritance as well."

Dyson is to be commended for producing a book of remarkable scope and depth.