August 5, 2010 by Thomas McCabe
Eliezer Yudkowsky is a Research Fellow at the Singularity Institute for Artificial Intelligence and founder of the community blog Less Wrong. We discussed his upcoming talk at the Singularity Summit on August 15, his forthcoming book on human rationality, his theory of “friendly AI,” the likelihood of the Singularity, and how to achieve it.
What are you working on currently?
I’m working on…