Objectivism Online Forum
Axiom

A New Kind of Science


Has anyone read the fairly recent book by Stephen Wolfram, A New Kind of Science?

What are your general thoughts regarding his approach?

Stephen Wolfram is the famous boy genius who created Mathematica (a mathematical software tool, really a whole new language). He earned his Ph.D. at 20 and had written numerous papers on particle physics and cellular automata by that time. He is, I believe, the youngest winner of a MacArthur Fellowship, the "genius grant" (he was 22). The guy is smart, to say the least.

He is 55 years old now, and he worked in seclusion on A New Kind of Science for over a decade. Here are a few quotes from the introduction of the book:

"Three centuries ago science was transformed by the dramatic new idea that rules based on mathematical equations could be used to describe the natural world. My purpose in this book is to initiate another such transformation, and to introduce a new kind of science that is based on much more general types of rules that can be embodied in simple computer programs."- page 1

"...perhaps immediately most dramatic is that it [the new method] yields a resolution to what has long been considered the single greatest mystery of the natural world: what secret it is that allows nature seemingly so effortlessly to produce so much that appears to us so complex" - page 2

The seminal insight of the book:

"One might have thought - as at first I certainly did - that if the rules for a program were simple then this would mean that its behavior must also be correspondingly simple. For our everyday experience in building things tends to give us the intuition that creating complexity is somehow difficult, and requires rules or plans that are themselves complex. But the pivotal discovery that I made some eighteen years ago is that in the world of programs such intuition is not even close to correct." - page 2
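Wolfram's flagship example of this insight is the elementary cellular automaton: a row of cells, each 0 or 1, where every cell's next value depends only on itself and its two neighbors, looked up in an eight-entry rule table. A minimal sketch in Python (the width and step count are arbitrary choices of mine, not anything from the book):

```python
# Elementary cellular automaton: each new cell depends only on the 3-cell
# neighborhood above it, looked up in the bits of the rule number.
def step(cells, rule):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(rule, width=63, steps=30):
    row = [0] * width
    row[width // 2] = 1            # start from a single black cell
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

if __name__ == "__main__":
    # Rule 30: the entire "program" is an 8-bit table (30 = 00011110),
    # yet the output looks random.
    for row in run(rule=30):
        print("".join("#" if c else "." for c in row))
```

Despite the whole rule fitting in a single byte, Rule 30's center column passes statistical randomness tests, and Mathematica long used it as a pseudorandom number generator.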

"...the Principle of Computational Equivalence now implies that this [traditional mathematical modeling] will normally be possible for rather special systems with simple behavior. For other systems will tend to perform computations that are just as sophisticated as those we can do, even with all our mathematics and computers. And this means that such systems are computationally irreducible - so that in effect the only way to find their behavior is to trace each of their steps, spending about as much computational effort as the systems themselves." - page 6
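To make "computationally irreducible" concrete, contrast a system with a closed-form shortcut against one where, as far as anyone knows, you must grind through every step. Both functions below are my own illustrations, not code from the book:

```python
# Reducible: 1 + 2 + ... + n has a closed form, so we can jump straight
# to any n in constant time -- no need to simulate the additions.
def triangular(n):
    return n * (n + 1) // 2

# Irreducible (so far as known): the center column of the Rule 30
# cellular automaton.  No shortcut formula is known; the only known way
# to get bit t is to run all t steps of the automaton.
def rule30_center_column(steps):
    cells = {0}                              # positions of black cells
    bits = [1]
    for _ in range(steps):
        lo, hi = min(cells) - 1, max(cells) + 1
        cells = {
            i for i in range(lo, hi + 1)
            if (30 >> (((i - 1) in cells) * 4
                       + (i in cells) * 2
                       + ((i + 1) in cells))) & 1
        }
        bits.append(1 if 0 in cells else 0)
    return bits
```

The first function answers "what is the millionth value?" instantly; the second can only get there by doing a million updates, which is exactly the situation Wolfram describes.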

"When I first started at the beginning of the 1980's, my goal was mostly just to understand the phenomenon of complexity. But by the mid 1990's I had built up a whole intellectual structure that was capable of much more, and that in fact provided the foundation for what could only be considered a fundamentally new kind of science." - page 21

My first impression of the book was that it was neither new, nor science. It seemed as though the man sat in his room for a decade and ran funny scripts on his computer, to emerge with a new framework for conducting science in the real world. It just seemed like pure rationalism.

I must admit that after I took a closer look at his work (i.e., after I actually started reading the book), I've been very impressed. If nothing else, it bears closer study.

The book is available online for free at http://www.wolframscience.com/nksonline/


I saw a documentary on this guy. The conclusions he draws seemed obvious to me, and I don't think the basic theory behind them is that revolutionary, though the math and evidence behind it may well be.


That's not his contention. There is actually very little "new" math involved (from what I've seen). According to Wolfram the revolution is primarily conceptual.

What's been bugging me is how this new approach is any different from complexity or even chaos theory, both very new disciplines themselves (complexity especially; it's been around for barely two decades). He outlines what is missing in each approach in the first chapter of his book, but from what I recall it didn't hit home with me.

Regarding the idea that this stuff is obvious, I'd say that likely has more to do with lack of experience on your part than with Wolfram's theory. The stuff he presents, especially the idea of computational equivalence, is pretty radical. In fact, I find the idea very hard to digest. But I haven't read through his book yet, so I'll hold my final verdict until I get at least that far.

Vecheslav Silagadze


Wow, I had no idea there was this much hostility toward Wolfram from the CS field.

On second thought, I think Wolfram does deserve some credit. While the ideas in his book are not new, they are collected and presented for the first time in a coherent thesis accessible to non-programmers. Just as OPAR was a major philosophical achievement despite not presenting a (brand) new philosophy, so NKS provides wide exposure to previously obscure ideas.

As far as the actual ideas in the book go, in my (very limited) grasp of them, I think they are entirely correct. The idea that everything in the universe consists of simple algorithms equivalent to a natural Turing machine is a brilliant discovery.

(Disclaimer: I have no scientific background, and my primary exposure to the book is a PBS documentary, so I could very well be wrong.)
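For readers without a CS background: a Turing machine is just a tape of symbols, a read/write head, and a finite table of rules, yet in principle it can compute anything any computer can. A toy sketch in Python follows; the example machine (a unary incrementer) is a generic textbook exercise of my own choosing, not one of Wolfram's machines:

```python
# A Turing machine is a rule table: (state, symbol) -> (write, move, next state).
def run_tm(rules, tape, state="start", halt="halt", max_steps=10_000):
    cells = dict(enumerate(tape))       # sparse tape; unwritten cells read as 0
    pos = 0
    for _ in range(max_steps):
        if state == halt:
            break
        write, move, state = rules[(state, cells.get(pos, 0))]
        cells[pos] = write
        pos += move                     # +1 moves right, -1 moves left
    return [cells[i] for i in range(min(cells), max(cells) + 1)]

# Example: scan right over a block of 1s and append one more (unary "+1").
increment = {
    ("start", 1): (1, +1, "start"),     # keep moving over the 1s
    ("start", 0): (1, +1, "halt"),      # first blank: write a 1 and stop
}
```

Here `run_tm(increment, [1, 1, 1])` returns `[1, 1, 1, 1]`. Wolfram's claim, roughly, is that almost any system above a very low threshold of complexity can emulate a table like this, and is therefore computationally equivalent to every other such system.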

Tom

It's rationalism with a software analogy instead of a mathematical analogy.

What's compelling about rationalism is that it models the complex as an easy-to-understand system of interacting elements and dependent events.

But no rationalist model should ever be taken as the final word because there will always be some observation that blows the model out of the water. Knowledge is contextual and the best we can do is form and rethink concepts, propositions, and models out of the continuous stream of incoming evidence.

If you want to know more about the pitfalls of mathematical rationalism, read up on Euclidean vs. non-Euclidean geometry, Newton's Principia vs. the orbit of Mercury, and Russell and Whitehead's Principia Mathematica vs. Gödel.

"It's rationalism with a software analogy instead of a mathematical analogy.

What's compelling about rationalism is that it models the complex as an easy-to-understand system of interacting elements and dependent events.

But no rationalist model should ever be taken as the final word because there will always be some observation that blows the model out of the water. Knowledge is contextual and the best we can do is form and rethink concepts, propositions, and models out of the continuous stream of incoming evidence."

Well, this kind of applies to all scientific models; it seems a bit unfair to single out Wolfram and rationalism. Humans aren't omniscient, so it's pretty irrational to demand theories that require omniscience. The criterion for evaluation should be "Is this the best model/theory we have, based on all the evidence we currently have access to?", not "Is this model likely to be proven wrong some time in the future by some new observations?". The latter option seems to lead almost immediately to true scepticism.

Tom

The hallmark of rationalism lies in treating fundamental concepts and fundamental propositions as contextless absolutes or irreducible metaphysical primaries.

There is nothing wrong with a deductive system as such provided that:

1. it doesn't contradict any observation

2. it recognizes the contextual nature of knowledge

3. its fundamental concepts and propositions are traced back to their origins in observation and experiment

Newton's Mathematical Principles of Natural Philosophy is rationalist in the sense of violating condition three: Newton never mentions where he got his Laws of Motion.

"Well this {being rationalist} kind of applies to all scientific models"

No modern scientific model is rationalist. Every generalization is treated as a hypothesis. Every equation is treated as merely "good enough" until more precision is required. No modern scientist worth his salt regards any theory as anything more than merely adequate for what he currently knows.

"it seems a bit unfair to single out Wolfram and Rationalism."

I'm not singling out Wolfram. I'm just pointing out the limitations of rationalism.


Oh, I get you: your point is that rationalism claims its models are universally valid, whereas the modern scientific approach takes the contextuality of knowledge into account when discussing claims to "truth". I thought your problem with rationalism was the idea of "making models" in itself, which is why I said that this approach is fairly uniform throughout the sciences.

I agree with you then. All meaningful scientific discourse regarding modelling should be carried out in E-Prime anyway.


Tom,

I think that you're selling Newton short. He got his laws of motion from Galileo's experiments. He simply applied his new mathematical techniques (calculus) to the same assumptions that Galileo had made, and got the same results, plus some other insights. As he wrote in his letter to Hooke, he saw further by "standing on the shoulders of giants."

He didn't spell out the proofs of these assumptions in the Principia, but why should he have? They were common knowledge among his audience, and thus, outside the scope of that work.

Isaac

Guest Math Bot

"As far as the actual ideas in the book go, in my (very limited) grasp of them, I think they are entirely correct. The idea that everything in the universe consists of simple algorithms equivalent to a natural Turing machine is a brilliant discovery."

Excuse me? What does this even mean? The universe does not contain algorithms in any sense; it contains entities. "Algorithm" is a term humans invented to describe operations performed according to a given pattern. What exist are entities which act; algorithms have no existence apart from relationships which relate the actions of entities in a certain way.



"Newton's Principles of Natural Philosophy is rationalist in the sense of violating condition three. Newton gives no mention of where he got his Laws of Motion from."

 

Incorrect. Read Newton's rules of reasoning, which are his method of deriving hypotheses from experiment. You might call it induction on steroids.

Please see: http://www.fordham.edu/halsall/mod/newton-princ.asp

He states these rules in the Principia Mathematica.

ruveyn1


Good shot there, Math Bot. On the general topic I can also recommend John Rogers Searle's "Machines Like Us" interview and his paper "Is the Brain a Digital Computer?".

I haven't read all of his stuff, but from what I have read I think he's got a great and mostly compatible theory of mind.

