Monday, February 13, 2006

Wolfram's Talk at PARC (Feb 2003)

A New Kind of Science by Stephen Wolfram (2002), ISBN 1-57955-008-8.

Wolfram’s talk sold out the PARC auditorium, and 95% of the audience stayed an extra half hour, till 9:30. I found it enjoyable, informative, and entertaining, and took a bunch of notes. Lots of nice graphics.

Nothing new, but a more concise (than the book :) explanation of his view that “the universe could be a computer,” like the edge of a seashell, running some very simple program. The best question (also the first) came from a guy who asked, “If the future is deterministic, then what is my question going to be?”

His most often repeated point was his principle of “computational equivalence,” which says that almost any complex computation is pretty much like any other: there’s an upper bound on sophistication, lots of things reach it, and once they do they look a lot alike. But I don’t recall any quantitative evidence or reasoning to back this up. He said to read the last part of Chapter 7.

The take-home point, if there was one, was that if you were trying to program a molecule, you might try starting with Rule 110, rather than trying to emulate a von Neumann architecture. He will soon put a dictionary of computational primitives on his website, like an organic molecule database, for people to use as raw material.
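For the curious, an elementary cellular automaton like Rule 110 takes only a few lines to simulate. Here is a minimal Python sketch (my illustration, not anything from the talk): each cell’s next state is looked up from the bits of the rule number, indexed by the 3-cell neighborhood read as a 3-bit integer.

```python
def step(cells, rule):
    """One synchronous update of an elementary CA (wraparound edges)."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

# Evolve Rule 110 from a single live cell; it grows a triangle of structure leftward.
cells = [0] * 32
cells[-2] = 1
for _ in range(12):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, 110)
```

The same `step` function runs any of the 256 elementary rules; only the rule number changes.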

If some complex behavior is hard enough to predict that we can only discover its outcome by running the program, then “free will” can emerge that appears “free” of its underlying laws. The Turing machines that “win” are usually the most complex, resembling engineered programs.

Randomness can arise from three sources: 1) external variations, like a boat bobbing on unseen waves; 2) initial conditions, which drive chaotic systems that depend sensitively on them; and 3) implicitly within CA (cellular automaton) rules, the truest source of randomness, since it arises from no external actions or inputs. Mathematica has used Rule 30 as a random number generator for 15 years.
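The Rule 30 generator is easy to reproduce: start from a single black cell and read off the center column as it evolves. A hedged Python sketch of that idea (my own illustration, not Mathematica’s actual implementation):

```python
def step(cells, rule):
    """One synchronous update of an elementary CA (wraparound edges)."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

# Rule 30 "random" bits: seed one live cell, then sample the center column each step.
width = 101                    # wide enough that edge wraparound can't reach the center
cells = [0] * width
cells[width // 2] = 1
bits = [cells[width // 2]]
for _ in range(31):
    cells = step(cells, 30)
    bits.append(cells[width // 2])
# bits now holds 32 pseudo-random bits taken from the CA's center column
```

Despite the trivial setup, this center-column sequence has no obvious pattern, which is the point of the talk’s claim.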

He laconically summarized his business success: “I didn’t hire a management team. I managed it myself.”

He felt that quantum computing may be less than it’s cracked up to be. There is way too much idealization of so-called quantum events. The measuring device, which must amplify some tiny signal to huge proportions, needs a lot of time to recover its equilibrium between hits, and to become perfectly efficient it would need infinite time and infinite energy.

As to the criticism and controversy that have surrounded him, he admitted he hadn’t read many of the comments. He and his team are thinking of responding to some of them, but can’t yet draw a line between comments worthy of an answer and those that are not.

Established peer-reviewed science can’t handle big, non-incremental shifts. All new paradigms are attended by controversy, and the degree of anger correlates with their enduring value. He’s published the Mathematica Journal for 10 years, and peer review is not what it appears to be. It’s not easy to get meaningful comments, especially on big-picture material that would need 1000 papers to describe in incremental terms. Most of the comments he got during his career were on his most technical papers, not his general ones.

If his detractors think his science is garbage, “how much longer can they keep saying it?” Sooner or later it will no longer be news. He’s received 15,000 e-mails from people who want to know more about how to apply his discoveries, and who would he rather spend time with, them or detractors from the science establishment?

Originally posted 2-14-2003.

For a more detailed book review, see Cosma Shalizi, "A Rare Blend of Monster Raving Egomania and Utter Batshit Insanity" (10-21-2005).


At 10/14/2009 10:20 PM, Blogger Frank Sudia said...

Followup: "Complex Adaptive Systems," Miller & Page (Princeton 2007) treats Wolfram like a pillar of the field of complex systems research, devoting most of Chapter 8 to extolling his ideas. Maybe he was right that the negative reaction (in 2003) was in proportion to the advance.

