How computational complexity will revolutionize philosophy

August 10, 2011 | Source: The physics ArXiv blog


MIT computer scientist Scott Aaronson has put forward a persuasive argument that computational complexity theory will transform philosophical thinking about a range of topics such as the nature of mathematical knowledge, the foundations of quantum mechanics, and the problem of artificial intelligence.

Computational complexity theory is concerned with the question of how the resources needed to solve a problem scale with some measure of the problem size, call it n. There are essentially two answers. Either the problem scales reasonably slowly, like n, n^2 or some other polynomial function of n. Or it scales unreasonably quickly, like 2^n, 10000^n or some other exponential function of n.

So while computability theory can tell us whether a problem can be solved at all, computational complexity theory tells us whether it can be solved in a few seconds or whether it'll take longer than the lifetime of the Universe.
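The polynomial-versus-exponential divide can be made concrete with a back-of-envelope sketch (the 10^9 operations-per-second rate is an assumed figure for illustration, not from the article):

```python
# Rough illustration of polynomial vs. exponential scaling:
# steps for n^2 and 2^n algorithms, converted to running time
# at an assumed 10^9 operations per second.

OPS_PER_SECOND = 1e9           # assumption: a billion steps per second
SECONDS_PER_YEAR = 3.15e7
AGE_OF_UNIVERSE_YEARS = 1.4e10  # standard rough estimate

def seconds(steps):
    """Wall-clock time for the given number of steps, at the assumed rate."""
    return steps / OPS_PER_SECOND

for n in (10, 50, 100, 200):
    poly = n ** 2
    expo = 2 ** n
    print(f"n={n:>3}  n^2: {seconds(poly):.2e}s   2^n: {seconds(expo):.2e}s")

# At n = 200 the polynomial algorithm still finishes in well under a
# millisecond, while 2^n steps would take roughly 5e43 years -- vastly
# longer than the ~1.4e10-year age of the Universe.
```

This is exactly the "few seconds versus lifetime of the Universe" contrast: the boundary between the two regimes, not raw hardware speed, is what complexity theory studies.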

Aaronson points out that this leads to a powerful new way to think about the problem of AI. Consider the objection that a machine could pass the Turing test "unintelligently," simply by looking up every possible conversation in a giant table. Instead of arguing that "whatever a computer can do using fixed formal rules, it will never be able to 'see' the consistency of its own rules," Penrose could say that even though the lookup-table approach is possible in principle, it is ruled out in practice by the huge computational resources it requires.

By this argument, the difference between humans and machines is essentially one of computational complexity.
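A quick estimate shows why the lookup table is hopeless (an illustrative calculation under assumed parameters, not Aaronson's own figures):

```python
# Back-of-envelope estimate: a lookup table covering every possible
# conversation of n characters drawn from a k-symbol alphabet needs
# on the order of k^n entries.

ATOMS_IN_OBSERVABLE_UNIVERSE = 1e80   # standard rough estimate

def table_entries(n, k=27):
    """Upper bound on distinct length-n strings over a k-symbol alphabet
    (here k=27: 26 letters plus a space, an assumed toy alphabet)."""
    return k ** n

# Even a short 100-character exchange yields ~27^100 (about 10^143)
# possible conversations -- far more entries than there are atoms
# in the observable universe.
print(table_entries(100) > ATOMS_IN_OBSERVABLE_UNIVERSE)  # → True
```

So the table "works" in principle but is physically unbuildable, which is the complexity-theoretic reformulation of the objection.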

Ref: arxiv.org/abs/1108.1791: Why Philosophers Should Care About Computational Complexity

Also see:

A Computational Foundation for the Study of Cognition
ARE WE SPIRITUAL MACHINES? | The Beginning of a Debate