I believe minds are strange loops: level-crossing feedback loops, systems that edit how they edit themselves. This post is about self-improving artificial intelligences, which I believe are strange loops. The question is: could there be a class of strange loops that tends to increase in complexity? What environment would … [Read more…]
Let’s talk about evil artificial intelligence. From Terminator to The Matrix, humans have feared computers taking over the world. I think there are good reasons why that’s unlikely: 1. Intelligence is hard to model. 2. Self-improving intelligence may be even harder. 3. Intelligence requires cooperation. 4. Intelligence grows. It is not a switch that is … [Read more…]
How many generations would it take for an isolated group of ignorant humans, unable to speak or read, to bootstrap themselves and develop these technologies again?