2 Comments

A thought-provoking piece, Vasant.

I've never really bought into the doomsday AI scenario. It makes for great science fiction, but the reality is, and will be, something radically different. A couple of things to consider:

1) We presume that, despite any safeguards, the end result will always be the eradication of humanity.

2) If an AI were capable of "taking over everything", wouldn't there be grander missions for a superintelligence than simply wiping out an entire species?

If AI is to become some super mind capable of thinking at a thousand times the level of an Einstein and beyond, I find it ridiculous to imagine that revenge or domination would be the prime directive of such a sentient being.

Another assumption is that a runaway AI is 100% certain to happen.

Interesting times ahead!


Thank you, Craig. And thanks, as always, for reading.
