The Hard-Luck Case For AGI And AI Superintelligence As An Extinction-Level Event
The post The Hard-Luck Case For AGI And AI Superintelligence As An Extinction-Level Event appeared on BitcoinEthereumNews.com.
Realizing that the advent of AGI and ASI could trigger a dire AI-driven extinction-level event that wipes us all out. [Image caption]

In today's column, I examine the widely debated and quite distressing contention that once we attain artificial general intelligence (AGI) and artificial superintelligence (ASI), their arrival will constitute an extinction-level event (ELE). It's a real hard-luck case. On the one hand, we ought to be elated that we have managed to devise a machine that is on par with human intellect and potentially possesses superhuman smarts. At the same time, the bad news is that we are utterly destroyed as a result. Wiped out forever. It's a rather dismal prize. Let's talk about it.

This analysis of an innovative AI breakthrough is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).

Heading Toward AGI And ASI

First, some fundamentals are required to set the stage for this weighty discussion. A great deal of research is underway to further advance AI. The general goal is to reach artificial general intelligence (AGI) or perhaps even the more distant possibility of achieving artificial superintelligence (ASI).

AGI is AI that is considered on par with human intellect and can seemingly match our intelligence. ASI is AI that has gone beyond human intellect and would be superior in many, if not all, feasible ways. The idea is that ASI would be able to run circles around humans by outthinking us at every turn. For more details on the nature of conventional AI versus AGI and ASI, see my analysis at the link here.

We have not yet attained AGI. In fact, it is unknown whether we will reach AGI, or whether AGI might only be achievable decades…
Filed under: News — October 26, 2025, 7:19 am