There is no way our current world is even close to the Singularity.

Six years? Absolutely not. Even if all of the world's resources were spent on science instead of the military, if every technology provider on the planet were working together on development, and if every person who could get a quality education were getting one across the entire planet, then maybe there would be about a 5% chance that all of that concentrated, coordinated effort would give us that capacity.

Our current environment comes nowhere close to affording us the concentrated technological development needed for our current crop of generative artificial intelligence tools to provide the bedrock for the most transformative scientific discovery since FIRE and NUCLEAR POWER.

What is the Singularity anyway?

The Singularity is a theoretical point in the future when technological growth becomes uncontrollable and irreversible, resulting in profound changes to human civilization. Often associated with advancements in artificial intelligence (AI), the Singularity is marked by the emergence of super-intelligent systems capable of surpassing human intellectual capacity in virtually every domain.

What would change if the Singularity happened?

  • Exponential Growth of Technology: Rapid technological advancements, particularly in AI, biotechnology, and nanotechnology, lead to innovations that accelerate progress in unprecedented ways.
  • Artificial General Intelligence (AGI): The development of AI systems with cognitive abilities on par with or exceeding human intelligence, enabling them to perform any intellectual task a human can do—and potentially far better.
  • Automation and Transformation: Profound shifts in economies, work, and daily life as automation and AI take over increasingly complex tasks, rendering many traditional human roles obsolete.
  • Integration with Technology: Potential merging of human biology with technology (e.g., brain-computer interfaces) to enhance physical and cognitive abilities, challenging the definition of what it means to be human.

Now you know what the outcome would look like. Do we look anywhere near that to you?

We are a civilization at war with itself, battling runaway corporate capitalism, failed democracies, strong-man authoritarian governments springing up all over the planet, somewhere between three and nine military conflicts of varying intensity, runaway climate change, microplastics in every ocean and now in the wind raining down on populated areas, diseases that were nearly eradicated now running rampant again, deforestation, overfishing, and climate migration on every continent.

Does this look like a world able to usher in the development of a world-transforming technological breakthrough just as liable to destroy humanity as it is to save it?

AGI isn’t here yet. Not even close.

Despite recent advances, current AI models are far from achieving the cognitive flexibility and reasoning abilities of artificial general intelligence. Narrow AI excels at specific tasks but lacks the generalized understanding needed to operate autonomously across domains. Several hard constraints stand in the way:

  • Hardware Constraints: The computational power required for AGI, and for the superintelligence a true Singularity implies, would be astronomical. Quantum computing, while promising, is still in its infancy and far from solving these challenges.
  • Data Bottlenecks: Training advanced AI requires enormous, clean, and diverse datasets. Ethical, logistical, and technological issues often limit the quality and scope of these datasets.
  • Energy Demands: The energy consumption of training state-of-the-art AI systems is unsustainable at scale (see the rough sketch after this list). Developing systems capable of hosting superintelligence without catastrophic environmental impact will take longer than six years.
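To put rough numbers on those last two points, here is a minimal back-of-the-envelope sketch in Python. Every constant in it (total training FLOPs, per-accelerator throughput and power draw, cluster size, facility overhead) is an illustrative assumption rather than a measurement of any real model or data center, so treat the output as an order-of-magnitude gut check, not a benchmark.

```python
# Back-of-the-envelope estimate of the compute and energy cost of one large
# training run. All constants below are illustrative assumptions, not
# measurements of any specific model, chip, or facility.

TRAINING_FLOPS = 1e25        # assumed total floating-point operations for one frontier-scale run
GPU_SUSTAINED_FLOPS = 4e14   # assumed sustained throughput per accelerator (FLOP/s)
GPU_POWER_KW = 0.7           # assumed per-accelerator power draw (kW), excluding cooling
CLUSTER_SIZE = 20_000        # assumed number of accelerators running in parallel
FACILITY_OVERHEAD = 1.3      # assumed multiplier for cooling and other facility power

gpu_seconds = TRAINING_FLOPS / GPU_SUSTAINED_FLOPS        # total accelerator-seconds of work
gpu_hours = gpu_seconds / 3600                            # ~7 million GPU-hours
wall_clock_days = gpu_seconds / CLUSTER_SIZE / 86_400     # ~2 weeks on the assumed cluster
energy_mwh = gpu_hours * GPU_POWER_KW * FACILITY_OVERHEAD / 1000  # kWh converted to MWh

print(f"GPU-hours:       {gpu_hours:,.0f}")
print(f"Wall-clock days: {wall_clock_days:,.1f}")
print(f"Energy (MWh):    {energy_mwh:,.0f}")
```

Under those assumptions, a single frontier-scale run works out to roughly seven million GPU-hours and several thousand megawatt-hours. Scale the assumed FLOPs up by a few orders of magnitude, as anything approaching superintelligence presumably would require, and the energy bill starts to resemble the annual output of an entire power plant.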

Six years isn’t even on the menu. Call me in twelve.

Yes, new developments that may offer fresh paths to AGI appear constantly, but progress in artificial intelligence appears to be slowing down, not speeding up, despite the hundreds of billions of dollars being spent on it annually by numerous companies.

The dream of the Singularity captivates many, but history shows that transformative undertakings like the New Deal, the Space Race, and the Human Genome Project required years of focused effort, sustained collaboration, and the resolution of immense technical and societal challenges.

These successes demonstrate what humanity can achieve when unified by purpose and shared vision. However, our current fragmented world—riven by geopolitical conflicts, environmental degradation, and economic inequity—makes the idea of achieving AGI within six years not just implausible, but a distraction from real, achievable goals.

True progress lies not in fantastical timelines, but in addressing systemic challenges and ensuring that emerging technologies serve the collective good, as our greatest achievements have done in the past. Only through such focus and unity can we hope to harness AI’s transformative potential responsibly and equitably.

Reference

Orf, Darren. 2024. “Humanity May Reach Singularity within Just 6 Years, Trend Shows.” Popular Mechanics. November 30, 2024.

#science #technology #singularity #artificialintelligence #AGI
