I mean, what you're describing is technological advancement. It's great! I'm fully in favor of it, and I fully believe in it.
It's not the singularity.
The singularity is a specific belief that we will achieve AGI, and the AGI will then self-improve at an exponential rate, allowing it to become infinitely more advanced and powerful (much more so than we could ever have made it), and it will then also invent loads of new technologies and usher in a golden age. (Either for itself or us. That part's a bit under contention, from my understanding.)
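For what it's worth, the "self-improve at an exponential rate ... infinitely more advanced" part is usually motivated by a toy growth model. A sketch (my notation, not anything from the claim itself): if capability C feeds back into its own growth rate, dC/dt = k*C^a, then a = 1 gives ordinary exponential growth, C(t) = C0*e^(k*t), which never actually reaches infinity; but any a > 1 blows up in finite time, e.g. for a = 2, C(t) = C0/(1 - C0*k*t), which diverges at t = 1/(C0*k). That finite-time divergence is where the recursive-self-improvement version gets its literal "singularity".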
> "The singularity is a specific belief that we will achieve AGI
That is one version of it, but not the only one. "John von Neumann is the first person known to have discussed a 'singularity' in technological progress. Stanislaw Ulam reported in 1958 that an earlier discussion with von Neumann 'centered on the accelerating progress of technology and changes in human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue'"[1]. That is, a time when the people before it would be unable to predict what came after, because it was so different. (And, as I argue in another comment[2], it is not a specific cutoff time but a trend over history of the future becoming hard to predict over shorter and shorter timeframes.)
Apart from AGI, or von Neumann accelerationism, I also understand it as augmenting human intelligence: "once we become cyborgs and enhance our abilities, nobody can predict what comes next"; or artificial 'life': "if we make self-replicating nano-machines (that can undergo Darwinian natural selection?), all bets about the future are off"; or "once we can simulate human brains in a machine, even if we can't understand how they work, we can run tons of them at high speeds".
> and usher in a golden age. (Either for itself or us. That part's a bit under contention, from my understanding.)
Arguably, we have built weakly superhuman entities, in the form of companies. Collectively they can solve problems that individual humans can't, live longer than humans, deploy and exploit more resources over larger areas and longer timelines than humans, and have shown a tendency to burn through workers and ruin the environment that keeps us alive even while supposedly guided by human intelligence. I don't have very much hope that a non-human AGI would be more aligned with our interests than companies made up of us are.
The version I described is the only one I've ever heard anyone else referring to.
Just because you can find someone referring to something else as a "technological singularity" doesn't mean it's reasonable to say that must, or even could, have been the definition the person I was replying to was using.
...And I think you'd be hard-pressed to find anyone to agree with you that "we founded companies" somehow satisfies the conditions of the Singularity, unless they're deeply invested in the idea that The Singularity Is Happening Now, and have entirely forgotten just what that's actually supposed to mean, or why they wanted it to be happening.
I wouldn't say that founding companies satisfies the conditions of the Singularity; the relevance is that companies are superhuman and sociopathic, so the dream that superintelligences will be empathic, cooperative, and aligned with human interests doesn't seem likely.
> The version I described is the only one I've ever heard anyone else referring to.
In 2008 the IEEE magazine Spectrum did a series on the Singularity, and one article[0] says "Like paradise, technological singularity comes in many versions, but most involve bionic brain boosting." Another article had this "Who's Who"[1] cheat-sheet of famous Singularity discussers, for and against; one of the columns is "Kind of Singularity", and there's more than one kind. Kevin Kelly has "singularities are pervasive changes in the state of the world that are often recognizable only in retrospect. As a result, the singularity is always near". Bill Joy's was a more general "computer science, biotech and nanotechnology event horizon". Marvin Minsky had mind uploading as well as machine intelligence.
Vernor Vinge named 'The Singularity' in 1983 and popularised the idea; here are slides from a talk he gave in 2005, after twenty years of thinking about it[2]:
Why call this transition the "Technological Singularity"?

- By analogy with the use of "singularity" in Math
  - A place where some regularity property is lost
  - Not necessarily a place where anything becomes infinite
- By analogy with the use of "singularity" in Physics
  - A place where the rules profoundly change
  - What comes beyond is intrinsically less knowable/predictable than before
- The apocalyptic endpoint of radical optimism :-)
and
Singularity futures
Possible paths to the Singularity

- What if: AI (Artificial intelligence) research succeeds?
  - I.J. Good, "Speculations Concerning the First Ultraintelligent Machine"
  - Hannes Alfvén, The End of Man?
- What if: The internet itself attains unquestioned life and intelligence?
  - Gregory Stock, Metaman
  - Bruce Sterling, "Maneki Neko"
- What if: Fine-grained distributed systems are aggressively successful?
  - Karl Schroeder, Ventus
  - Vernor Vinge, "Fast Times at Fairmont High"
- What if: IA (Intelligence Amplification) occurs
  - As the radical endpoint of Human/computer interface research?
    - Poul Anderson, "Kings Who Die"
    - Vernor Vinge, "Bookworm, Run!"
  - As the outcome of bioscience research?
    - Vernor Vinge, "Fast Times at Fairmont High"
    - Vernor Vinge, "Win a Nobel Prize!"
A place where the rules profoundly change because of bioscience research, human/computer cyborging, or emergent complexity in distributed systems is compatible with Vingean Singularity ideas, or at least it was before the current LLM/AGI hype cycle.
The 'attractor at the end of history' is from Terence McKenna: "[he] saw the universe, in relation to novelty theory, as having a teleological attractor at the end of time, which increases interconnectedness and would eventually reach a singularity of infinite complexity. ... The universe is not being pushed from behind. The universe is being pulled from the future toward a goal ... our ever-accelerating speed through the phenomenal world of connectivity and novelty is based on the fact that we are now very, very close to the attractor."[3]
> "doesn't mean it's reasonable to say that must, or even could, have been the definition the person I was replying to was using."
Okay, but your "LLMs are not a path to AGI and tech bros are dumb" doesn't lead to anything interesting; it's just a mic-drop end to the discussion.
"The singularity is happening" is one of those things.
"LLMs are AGI" is yet another.
They've already been discussed to death. The only reason to discuss them now is if you are one of the people absolutely rejecting reality because you are bound and determined to believe something that is not true.