The majority of us are meme-copying automatons who are easily pwned by LLMs. Few of us have learned to exercise critical thinking and to understand things from first assumptions - the kind of thing we are expected to learn in school, and also the kind of thing that still separates us from machines. A charitable view is that there is a spectrum here. Now, with AI and social media, the movement toward the stupid end of the spectrum will accelerate.
> Only the marketing and distribution matter in a world where it's very easy for others to clone something and sell it at a lower price
Great point. AI remixes and rips off existing codebases in a manner that makes copyright violation impossible to attribute, which effectively makes it legal - i.e., perfect cloning. In a world where cloning is legal, the engineering cost of a product drops to zero. That is where software production could be headed. What remains is marketing/distribution/sales.
There will remain niches solving "hard problems" which can't be cloned, but those will be rare. Hard problems are where a lot of engineering complexity resides, involving interacting components for which there are no examples in training datasets to copy from - for example, a complex distributed system, or hardware with multiple nuanced tradeoffs.
This is what everyone who uses LLMs regularly expected. Good results require a human in the loop, and the internet is so big that just about everything on it has been done by someone. Most often you.
Take a look at the history of the power loom, which automated weaving in the 19th century. The number of handloom weavers dropped by two orders of magnitude after the power loom was introduced.
> This is the same dynamic that kept IBM dominant for decades
IBM still sells mainframes but is no longer a growth darling.
> Markets are right to reassess multiples. But reassessing multiples is very different from pricing in extinction
What you are missing is that the SaaS companies were extremely overpriced. For instance, Salesforce (CRM), after all the carnage, is still priced at 25 times earnings, which is historically high for anything that is not a growth company. The perception was that these companies would print money year after year selling software trinkets on their platforms, and as such they were placed in the growth category. Now it is plainly obvious that these software trinkets can be produced easily by anyone using AI. Their pricing power has dramatically declined; hence the re-rating. None of this contradicts the thesis in your AI-assisted article that these businesses have moats, just like IBM and its mainframes. These businesses are now in a vicious reflexive narrative loop, where the narrative will impact the real world, which will further fuel the narrative.
Automation of easy templated tasks will cause a huge disruption. Production of software used to be a skilled job but is now automated to a large degree. This has huge impacts on the profession as a whole. Already, enrollment in the UC CS program is declining.
> culture is intertwined with the hard problem of consciousness
The majority of people are sleepwalking as machines driven by imitation, habit, and external forces. We live in a dreamlike, mechanical state, lacking awareness of this very fact. Apropos: Gurdjieff.
Very uncharitable, and questionable on a few levels. Every human exists in the context of society; no human exists standalone - the very definition of self, as in self-awareness, has the existence of the other as a prerequisite. The people you see are perfectly aware of themselves; it's just that awareness of yourself does not mean you have to violate societal norms and show how individual you are all the time. At best, it requires a more acute awareness of norms (you have to know what to violate first - cf. all the various counter-cultures), making one more socially integrated and in some ways paradoxically less individual; at worst (if you are properly disconnected), it makes one less of a human, not more.
> People you see are perfectly aware of themselves
Are they rote students, imitating or copying memes and thus driven by inadequate ideas, or are they students who understand the subject from its first assumptions and thus are driven by adequate ideas? In the quote above, the suggestion is that the majority are rote students.
There’s a lot of ground between “imitate” and “understand the subject from its first assumptions”. Arguably, the former is how all learning happens at first. We imitate to get a taste for it and start enjoying it (humans are mirrors), then we can dig deeper if we become sufficiently interested. You can hardly become truly interested in music if you are presented with all the music theory up front and don’t get to have fun playing the instrument; same with math.
Even if someone never becomes sufficiently interested to dig deeper into some academic subject and sticks to imitating, I wouldn't say they are somehow worse or have no awareness. They may have other interests and joys in life; there are many fulfilling things outside academia. Why would you expect everybody to be like you?
> The basic force behind all culture formation is imitation
We are also limited by the linguistic structures we inhabit. And many languages have multiple variants: there is the respectful, obedient "formal" variant used at the workplace, and the informal "colloquial" variant used elsewhere.
The "strong" Sapir-Whorf hypothesis - that cognitive and behavioral categories are limited by linguistic ones - is thoroughly discredited. At most, linguistic categories may influence our perceptions, but they do not constrain them.
Linguistics is one of the fields where HN consensus goes directly against the scholarly mainstream of the discipline, for what I mostly find to be ideological reasons. So hopefully this isn't that and you're just a bit out of date. There has been a big reevaluation of this in the last twenty years, and virtually no contemporary working linguists hold the strong relativist view anymore. It simply did not consistently produce useful results and has been abandoned.
Incredible, how an entire religion has sprung up around AGI.