
Michael Shermer on Singularity 1 on 1: Be Skeptical! (Even of Skeptics)

A couple of days ago I interviewed Michael Shermer for Singularity 1 on 1.

I met Dr. Shermer at the recent Singularity Summit in New York where he was one of the most entertaining, engaging and optimistic speakers. Since he calls himself a skeptic and not a singularitarian, I thought he would bring not only balance to my singularity podcast but also a healthy dose of skepticism, and I was not disappointed.

During our conversation we discuss a variety of topics such as: his education at a Christian college and original interest in religion and theology; his eventual transition to atheism, skepticism, science and the scientific method; SETI, the singularity and religion; scientific progress and the dots on the curve as precursors of big breakthroughs; life extension, cloning and mind uploading; being a skeptic and an optimist at the same time; the “social singularity”; global warming; and the tricky balance between remaining skeptical and still being able to learn and make progress.

(As always you can listen to or download the audio file above or scroll down and watch the video interview in full.)

 

Michael Shermer’s Singularity Summit presentation: “Social Singularity: Transitioning from Civilization 1.0 to 2.0”

 

Who is Michael Shermer?

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine (www.skeptic.com), the Executive Director of the Skeptics Society, a monthly columnist for Scientific American, the host of the Skeptics Distinguished Science Lecture Series at Caltech, and Adjunct Professor at Claremont Graduate University and Chapman University.

Dr. Shermer’s latest book is The Mind of the Market, on evolutionary economics. His previous book was Why Darwin Matters: The Case Against Intelligent Design, and he is the author of Science Friction: Where the Known Meets the Unknown, about how the mind works and how thinking goes wrong. His book The Science of Good and Evil: Why People Cheat, Gossip, Care, Share, and Follow the Golden Rule is on the evolutionary origins of morality and how to be good without God. He wrote a biography, In Darwin’s Shadow, about the life and science of the co-discoverer of natural selection, Alfred Russel Wallace. He also wrote The Borderlands of Science, about the fuzzy land between science and pseudoscience, and Denying History, on Holocaust denial and other forms of pseudohistory. His book How We Believe presents his theory on the origins of religion and why people believe in God. He is also the author of Why People Believe Weird Things, on pseudoscience, superstitions, and other confusions of our time.

According to the late Stephen Jay Gould (from his Foreword to Why People Believe Weird Things): “Michael Shermer, as head of one of America’s leading skeptic organizations, and as a powerful activist and essayist in the service of this operational form of reason, is an important figure in American public life.”

Dr. Shermer received his B.A. in psychology from Pepperdine University, M.A. in experimental psychology from California State University, Fullerton, and his Ph.D. in the history of science from Claremont Graduate University (1991). He was a college professor for 20 years (1979-1998), teaching psychology, evolution, and the history of science at Occidental College (1989-1998), California State University Los Angeles, and Glendale College. Since his creation of the Skeptics Society, Skeptic magazine, and the Skeptics Distinguished Science Lecture Series at Caltech, he has appeared on such shows as The Colbert Report, 20/20, Dateline, Charlie Rose, Larry King Live, Tom Snyder, Donahue, Oprah, Leeza, Unsolved Mysteries (but, proudly, never Jerry Springer!), and other shows as a skeptic of weird and extraordinary claims, as well as interviews in countless documentaries aired on PBS, A&E, Discovery, The History Channel, The Science Channel, and The Learning Channel. Shermer was the co-host and co-producer of the 13-hour Family Channel television series, Exploring the Unknown.


  • http://twitter.com/Nikki_OlsonTSIN Nikki Olson

    Enjoyed this interview a great deal!

    He says in the second video, min 1:30-2:00 “I think that because in history of humanity everyone who’s ever made such a claim has been wrong (great grand breakthroughs, extension of life, etc.), skepticism is the appropriate position to start off with, and see how it goes”

    -My response is in accordance with yours, Nikola!

    -I agree that one should be skeptical, but not for this reason. The failure of past predictions gives no scientific information about the current claims themselves. Skepticism ought to come from actually looking at the science, not from the track record of past predictions on the matter (nor from how ‘far away’ the realization of the claim seems (5:15-5:25)).

    -This is a common error that people make when they think about the feasibility of AI. There were a bunch of over-optimistic predictions regarding AI in the 60s, 80s, etc. (between the AI winters) that failed to come true, causing people, in effect, to doubt the feasibility of AI. However, the failure of those predictions to come true gives no information about the actual feasibility of AI itself. One needs to look at neuroscience, etc. to answer these questions, not at the efficacy of past predictions.

    Nor does it really give a lot of information regarding the feasibility in terms of time. As Yudkowsky argues, to realize that predictions of ‘very near AI’ have failed, and then to adopt the position that AI must be ‘very far away’ is to make the same error in the opposite direction. “It’s the opposite side of the same coin”. Again, one needs to look at the actual data to get an idea of feasibility.

  • Sunrider

    Hmm … I agree. Listening to this, I’m wondering whether the primary mistake is one that Muehlhauser alluded to as well, i.e. whether the singularity is an extrapolation of exponential growth curves when, in fact, we’ve only observed too short a part of the curve. I.e. we would then have the good ol’ logarithmic flattening out again. In this world, what singularitarians would want to watch out for are the breakthrough events (like the Wright brothers), and a succession of these would ultimately give rise to a technological or biological singularity. Kind of like a jump diffusion model of different strands running in parallel that ends up being self-directing and self-reinforcing, so that the final result, at convergence, is not a single new strand but, in fact, the singularity.
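On that “too short a part of the curve” point, here is a minimal sketch (with an assumed growth rate and ceiling, nothing taken from the interview) of why a short observation window cannot distinguish runaway exponential growth from a logistic curve that eventually flattens out:

    import numpy as np

    # Hypothetical parameters: a unit growth rate and an arbitrary ceiling
    # (carrying capacity) for the logistic curve. Both curves start at 1.
    growth_rate = 1.0
    ceiling = 1e6

    t = np.linspace(0, 10, 201)  # "time" in arbitrary units
    exponential = np.exp(growth_rate * t)
    logistic = ceiling / (1 + (ceiling - 1) * np.exp(-growth_rate * t))

    # Over the early window (t < 5) the two trajectories differ by well under
    # one percent, so data from that window alone cannot tell them apart.
    early = t < 5
    gap = np.max(np.abs(exponential[early] - logistic[early]) / exponential[early])
    print(f"max relative gap for t < 5: {gap:.2%}")  # roughly 0.01-0.02%

That symmetry is also why, as the Yudkowsky point above suggests, failed “very near” predictions do not by themselves license “very far away” conclusions: the early data are compatible with both curves.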

  • http://cmstewartwrite.wordpress.com/ CMStewart

    Thank you for this “skeptics” podcast, Nikola! You and Shermer have an engaging give and take, and the interview was illuminating and balanced.

    I’m somewhat of a skeptic, but I echo Nikki’s and Sunrider’s comments. Perhaps Copernicus was wrong, and we ARE special. ;)

     

    The crashed plane / not crashed plane scenario made me laugh out loud. :) But it does raise some very important, very serious questions. The technology to replicate people could develop to the point of viability.

     

    Two identical (for the sake of argument) “yous” are running around, each 100% internalized as “you,” feeling as convinced of their “you-ness” as you do right now reading this. Imagine yourself, Nikola, right now, being told you are a copy, and being presented with evidence proving you are, indeed, a copy. Would you willingly walk away from your (as you understood it) life, so that the “original you” (who just walked in the room to kick you out of the house) could resume his life with what you thought was your wife, your career, and your family and friends? Where would you go? What would you do? Imagine your wife’s reaction upon finding out that the plane that carried the “original you” did NOT crash, and the “original you” has just come home. Assuming she would prefer the “original you” to the “copied you,” would you, as a copy, be able to move on with a new life without everything you thought was yours? Remember, you (as a copy) would feel and think and remember exactly as you do right now reading this.

     

    Now imagine that instead of you on the crashed plane / not crashed plane, it was your wife. After the crashed plane news, you made an exact copy of your wife. Then, surprise, your “original” wife comes home and kicks the “copy” out of the house. Knowing the “copy” would be feeling just as lost, lonely, and miserable as you would as a kicked-out copy, could you bear to let your original wife kick her copy out? Would you remain friends with your wife’s copy? Would you still love her and feel sorry for her? How about a wife vs. wife confrontation in which one killed the other: would the original wife get off easier than the copy?

     

    If I were in the crashed plane / not crashed plane scenario, I know I couldn’t bear to see the look on my “copied” husband’s face as my “original” husband told him to get out. My only solution would be to make a copy of myself so the two copies could be together and try to rebuild their lives. At least they would have each other as they tried to re-establish themselves in the world. Which brings me to my last (for now) set of questions. In a situation where a married couple finds themselves each with a copy (as in the situation described above, with four people), would it initially matter whether an original was matched with the other original? If you were an original, would it matter that you were matched with your wife’s original? If you were a copy, would you still prefer the “original wife”? What does that say about how you would see yourself as a copy?
