03/19/2024 / By Ava Grace
Mathematician and futurist Ben Goertzel has warned that artificial intelligence could surpass human intelligence by 2027 – decades earlier than previously predicted.
Goertzel is known for popularizing the term “artificial general intelligence,” which refers to AI able to match natural human cognition. He warned that the rapid pace of advances in AI technology means the field is verging on an exponential “intelligence explosion.”
“It seems quite plausible we could get to human-level AGI [artificial general intelligence] within, let’s say, the next three to eight years,” Goertzel predicted while closing out a summit on AGI – the Beneficial AGI Summit in Panama. “Once you get to human-level AGI, within a few years you could get a radically superhuman AGI.”
While the futurist admitted that he “could be wrong,” he went on to predict that the only impediment to a runaway, ultra-advanced AI – one far more advanced than its human makers – would be if the bot’s “own conservatism” advised caution.
“There are ‘known unknowns’ and probably ‘unknown unknowns,’” Goertzel acknowledged. “No one has created human-level artificial general intelligence yet; nobody has a solid knowledge of when we’re going to get there.”
But unless the required processing power turns out to demand, in Goertzel’s words, “a quantum computer with a million qubits or something,” an exponential escalation of AI strikes him as inevitable.
In recent years, Goertzel has been investigating a concept he calls “artificial superintelligence” – an AI so advanced that it matches the combined brain power and computing power of human civilization.
Goertzel listed “three lines of converging evidence” that, he said, support his thesis that AGI is achievable within the next few years.
First, he cited the updated work of Google’s long-time resident futurist and computer scientist Ray Kurzweil, who has developed a predictive model suggesting AGI will be achievable by 2029. Kurzweil’s model draws on data documenting the exponential nature of technological growth across other tech sectors.
Next, Goertzel cited the well-known improvements to large language model (LLM) systems over the past few years, which he said have “woken up so much of the world to the potential of AI.”
Lastly, the computer scientist pointed to his own research project, “OpenCog Hyperon,” an infrastructure designed to combine LLMs with other new forms of AI so the resulting system can handle further areas of cognitive reasoning such as math, physics or philosophy.
“In the next decade or two [it] seems likely an individual computer will have roughly the computing power of a human brain by 2029, 2030. Then you add another 10-15 years on that, an individual computer would have roughly the computing power of all of human society,” Goertzel said. “Once we have a system that can design and write code well enough to improve upon itself and write subsequent versions, we enter a realm that could lead to a full-on intelligence explosion and Technological Singularity.”
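Goertzel’s timeline is, at bottom, a doubling-time extrapolation. As a rough illustration of that arithmetic – not his or Kurzweil’s actual model – the Python sketch below estimates how many years of steady exponential growth would separate a single machine from an assumed brain-scale compute figure, and from the aggregate compute of roughly eight billion brains. Every constant in it is an assumption chosen for illustration.

```python
import math

# Rough doubling-time arithmetic behind brain-scale compute forecasts.
# Every constant here is an illustrative assumption, not a figure from
# Goertzel or Kurzweil; changing any of them shifts the dates by years.
BRAIN_FLOPS = 1e16          # one commonly cited rough estimate for a human brain, FLOP/s
POPULATION = 8e9            # approximate number of human brains alive today
MACHINE_FLOPS_TODAY = 1e15  # assumed compute of a single large machine now, FLOP/s
DOUBLING_YEARS = 1.0        # assumed doubling period for per-machine compute

def years_until(target_flops: float, start_flops: float, doubling_years: float) -> float:
    """Years of steady exponential growth needed to reach target_flops."""
    doublings = math.log2(target_flops / start_flops)
    return max(0.0, doublings * doubling_years)

one_brain = years_until(BRAIN_FLOPS, MACHINE_FLOPS_TODAY, DOUBLING_YEARS)
all_brains = years_until(BRAIN_FLOPS * POPULATION, MACHINE_FLOPS_TODAY, DOUBLING_YEARS)

print(f"Years to one-brain parity (assumed figures): {one_brain:.1f}")
print(f"Years to whole-society parity (assumed figures): {all_brains:.1f}")
```

The gap between the two milestones is simply log2 of the population – about 33 doublings – so the assumed doubling period, far more than the brain-compute estimate, determines whether the “all of human society” milestone lands 10-15 years after one-brain parity or several decades later.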