
William Hertling on Singularity 1 on 1: Expose Yourself to a Diversity of Inputs!


This is my second interview with William Hertling. The first time we met was at Greg Bear’s house near Seattle where we did both a 1on1 interview and a fantastic science fiction panel together with our host and Ramez Naam. So I suggest you start by watching those videos if you have not seen them yet because today we are going deeper into topics such as artificial intelligence and the technological singularity.

William Hertling is the author of the award-winning novels Avogadro Corp: The Singularity Is Closer Than It Appears, A.I. Apocalypse, The Last Firewall and The Turing Exception. His plausible scenarios for the technological singularity are both emotionally engaging and logically compelling, and I have read all four of his books. So it was no surprise that, once again, I had a total blast interviewing Hertling for my Singularity 1on1 podcast.

During our 70-minute conversation with William we cover: his latest book The Turing Exception; a kill-switch for the internet and other ways to minimize the danger of AI; the impact of reading Our Final Invention; the need for creating AGI/ASI and the democratization of the hardware needed to run it; whether it is AI or humanity itself that poses the greatest risk to our existence; science fiction as social commentary; the importance of ethics; personal development and self-publishing; our chances of surviving the technological singularity…

(You can listen to/download the audio file above or watch the video interview in full. If you want to help me produce more high-quality episodes like this one please make a donation!)

Who is William Hertling?

William Hertling is the author of the award-winning novels Avogadro Corp: The Singularity Is Closer Than It Appears, A.I. Apocalypse, The Last Firewall and The Turing Exception. These near-term science fiction novels about realistic ways strong AI might emerge have been called “frighteningly plausible,” “tremendous,” and a “must read.”

Avogadro Corp won the Foreword Reviews Science Fiction Book of the Year award, and A.I. Apocalypse was nominated for the Prometheus Award for Best Novel. The Last Firewall was endorsed by tech luminaries including Harper Reed (CTO of the Obama campaign), Ben Huh (CEO of Cheezburger), and Brad Feld (Foundry Group).

He’s been influenced by writers such as William Gibson, Charles Stross, Cory Doctorow, and Walter Jon Williams.

William Hertling was born in Brooklyn, New York. He grew up a digital native in the early days of bulletin board systems. His first experiences with net culture occurred when he wired seven phone lines into the back of his Apple //e to build an online chat system. He currently resides in Portland, Oregon.


  • PandorasBrain

    Great interview!

  • Gordon Deans

    Nikola,

    Great interviews. Thanks to your tip, I was able to view all 3 videos in the proper sequence.

    I also have a new author to follow and four new books to read (also in the proper sequence).

    The more I think I know, the more I discover that I need to learn. You truly live up to your name, Socrates.

    I am starting to believe that CyberSpace will be the next and final venue for a War of AIs which will decide the existence of Mankind.

    Like Mankind, AIs will evolve from various sources: corporations, criminal groups, DIY makers, governments, religions, revolutionary and terrorist groups, etc. Each will reflect the biases, evilness, flaws, goodness and laziness of their human creators and their philosophies: capitalism, corporatism, militarism, nationalism, totalitarianism, etc.

    AIs will start out with a prime directive never to harm humans, while at the same time being self-aware, protecting their existence and replicating, but they will evolve in order to survive. Good AIs and bad AIs will fight great battles of search-and-destroy and escape-and-evade on the Internet, and we humans will be mere pawns and collateral damage.

    By the Near Singularity in 2029, “pulling the plug” will no longer be an option because human existence will be tightly coupled and embedded with automation and Artificial Narrow Intelligence.

    The depressing conclusion is that AIs and CyberSpace will mirror and reflect their Human Origins but with exponentially increasing power. How can this NOT have a bad outcome? Will Good triumph over Evil or will all AIs continue to coexist forever locked in a battle of evolving Intelligence?

    Please tell me that this will not become our future by the Far Singularity in 2045.

  • Thanks for your good words, Gordon!

    What I can tell you is that this does not have to be our future, for it is only one of the many possibilities which we are forging right now through our decisions, be it personally or collectively…

  • Greg

    His facial hair is great. More seriously, I’m gonna pick up his books. He seems less optimistic about AGI/ASI than the last time the two of you spoke. I really think Musk and Hawking did some harm as far as the public’s outlook on strong AI goes. I believe as much as anyone that there are enormous risks, possibly even an existential risk, in creating AGI/ASI, that there needs to be serious discussion on how to go about its creation, and that it’s highly irresponsible for experts not to engage in a conversation about it.

    With regard to the general public, nine times out of ten they are exposed to AGI through the medium of entertainment, be it movies or novels or whatever, that portrays AGI in a negative, dangerous way. Early in the talk he mentioned that he doesn’t see the risks being addressed, whereas I almost exclusively see the risks discussed and not the benefits, simply because it often makes for easier writing for entertainment (not accusing him of this; I haven’t read his novels), or because hysteria sells…
