It used to be that you had to have a strong understanding of the underlying machine in order to create software that actually worked.
Things like cycle times of instructions, pipeline behavior, registers and so on. You had to, because compilers weren't good enough. Then they caught up.
You used to manage every byte of memory and utilize every piece of underlying machinery, like the different chips, DMA transfers and so on, because that's what you had to do. Now it's all abstracted away.
These fundamentals are still there, but 99.9% of developers neither care nor bother with them. They don't have to, unless they are writing a compiler or a kernel, or doing it just because it's fun.
I think what you're describing is also going to go away in the future. Still there, but most developers are going to move up one level of abstraction.
Having worked in a very large company for the past two decades now, one of the best pieces of career advice I ever got is about how you measure whether you are a "good employee".
It is very simple: you are a good employee if your boss(es) think you are.
That’s it. Nothing else matters in terms of career advancement or retainment.
No, you misunderstood. It is not about their output, it almost never is.
Most of the time, the business decision has already been made long before McK is hired. It's all about legitimizing that decision and making it happen.
You can also wield them as a weapon against internal competitors or opponents. Look up how they were used to kill off Cariad for example.
Why is it bizarre? It is inevitable. After all, AI has not ruined creative professions, it merely disrupted and transformed them. And yes, I fully understand my whole comment here being snarky, but please bear with me.
> Actually, all progress will definitely have a huge impact on a lot of lives—otherwise it is not progress. By definition it will impact many, by displacing those who were doing it the old way with something better and faster. The trouble is when people hold back progress just to prevent the impact. No one disagrees that the impact should be softened, but not at the cost of progress.
Now it's the software engineers' turn to not hold back progress.
> [...] At the same time, a part of me feels art has no place being motivated by money anyway. Perhaps this change will restore the balance. Artists will need to get real jobs again like the rest of us and fund their art as a side project.
Replace "Artists" with "Coders" and imagine a plumber writing that comment.
> [...] Artists will still exist, but most likely as hybrid 3d-modellers, AI modelers (Not full programmers, but able to fine-tune models with online guides and setups, can read basic python), and storytellers (like manga artists). It'll be a higher-pay, higher-prestige, higher-skill-requirement job than before. And all those artists who devoted their lives to draw better, find this to be an incredibly brutal adjustment.
Again, replace "Artists" with coders and fill in the replacement.
So, please get in line and adapt. And stop clinging to your "great intellectually challenging job" because you are holding back progress. It can't be that challenging if it can be handled by a machine anyway.
The premise of those comments, just like the premise in this thread, is ridiculous and fantastical.
The only way generative AI has changed the creative arts is that it's made it easier to produce low quality slop.
I would not call that a true transformation. I'd call that saving costs at the expense of quality.
The same is true of software. The difference is, unlike art, quality in software has very clear safety and security implications.
This gen AI hype is just the crypto hype all over again but with a sci-fi twist in the narrative. It's a worse form of work just like crypto was a worse form of money.
I do not disagree; in fact, I'm feeling more and more Butlerian with every passing day. However, it is undeniable that a transformation is taking place -- just not necessarily for the better.
Gen AI is the opposite of crypto. The use is immediate, obvious and needs no explanation or philosophizing.
You are basically showing your hand that you have zero intellectual curiosity, or are delusional about your own abilities, if you have never learned anything from gen AI.
I play with generative AI quite often, mostly for shits and giggles. It's fun to try to make it hallucinate in the dumbest way possible, or to make it invent context.
E.g. try to make any image generating model take an existing photo of a humanoid and change it so the character does a backflip.
It's also interesting to generate images in a long loop, because it usually reveals interesting patterns in the training data.
Outside these distractions I've never had generative AI be useful. And I'm currently working in AI research.
Is it though? I agree the technology evolving is inevitable. But the race to throw as much money at scaling and marketing as possible, before these things are profitable and before society is ready, is not inevitable at all. It feels extremely forced. And the way it's being shoved into every product to juice usage numbers suggests it's all premature and rushed, and that most people don't really want it. The bubble comes from investing far more money in datacenters and GPUs than anyone can possibly pay for or build, with no evidence there's even a market for using that capacity!
It's funny you bring up artists, because I used to work in game development and I've worked with a lot of artists, and they almost universally HATE this stuff. They're not like "oh thank you Mr. Altman", they're more like "if we catch you using AI we'll shun you." And it's not just producers, a lot of gamers are calling out games that are made using AI, so the customers are mad too.
You keep talking about "progress", but "progress" towards what exactly? So far these things aren't making anything new or advancing civilization; they're remixing stuff we already did well before, but sloppily. I'm not saying they don't have a place -- they definitely do, they can be useful. My argument is against the bizarre hype machine and what sometimes seems like sock puppets on social media. If the marketing were just "hey, we have this neat AI, come use it", I think there'd be a lot less backlash than people saying "Get in line and adapt".
> And stop clinging to your "great intellectually challenging job" because you are holding back progress.
Man, I really wish I had the power you think I have. Also, I use these tools daily, I'm deeply familiar with them, I'm not holding back anyone's progress, not even my own. That doesn't mean I think they're beyond criticism or that the companies behind them are acting responsibly, or that every product is great. I plan to be part of the future, but I'm not just going to pretend like I think every part of it is brilliant.
> It can't be that challenging if it can be handled by a machine anyway.
This will be really funny when it comes for your job.
I believe you misunderstood the point of my comment, or rather I didn't make it clear enough. The quotes I quickly picked out feel like they represent a majority opinion on HN, namely that this is progress and disruption. I don't share that opinion.
My own gut feeling is that this sentiment comes from a position of superiority and a definite lack of empathy. It is software engineers who are building the technology that is leading to job loss, the sloppification of everything, and second-order effects like storage and RAM prices soaring because of the hype.
As such, I find it ironic to complain about being replaced. After all, your profession is the one responsible for all of this, so now please take a look in the mirror and take responsibility for the actions of the industry you choose to work in.
Personally, I think the current trajectory of AI is an overall net negative to society. I sincerely hope it all comes crashing down in another AI winter, but we'll see.
I feel differently: the last line is very important in this context, since it communicates the underlying thoughts and values of the poster.
Asking for "amazing" open source projects in this case is not asking out of genuine curiosity or want for debate, it is a rhetorical question asked out of frustration at the general trajectory of AI and who profits off of it -- namely the boot-wearers.
No fucking shit, I paraphrased Anthropic's comments as
> do better than we have publicly admitted most of humanity can do, and we may deign to interview you
If you tell someone who has passed a test that 99.999% of humanity cannot pass that they _may_ get an interview, you are being snarky/condescending.
That's not how paraphrasing works. They probably intentionally held back from guaranteeing an interview, for various reasons. One that seems obvious to me is that with the bar set at "Claude Opus 4.5's best performance at launch", it's plausible that someone could meet it by feeding the problem into an LLM. If a bunch of people do that, they won't want to waste time interviewing them all.
You may want to consider the distribution and quantity of replies before stating that you WILL do something that might just waste more people’s time or not be practical.
The classy thing to do would be responding to every qualifying submission, even if it’s just to thank everyone and let some people know the field was very competitive if an interview won’t be happening.
I like these public challenges, but as someone who has set public questions myself: ask any company that has run a public contest for their opinion. The pool is filled with scammers who either bought the solutions through sites like Chegg, or sometimes even just Stack Overflow.
I think, by your logic, the only thing they do that is condescending is to say that an interview is not guaranteed.
People are mentioning that they do this for a reason, which explains away that behavior, so yeah, it kinda does change whether they are being condescending.
> They've even got their own slogan: "you're probably just not prompting it properly"
That's the same energy as telling other professions to "just learn to code, bro" once they are displaced by AI.
But I guess it doesn't feel nice once the shoe is on the other foot, though. If nobody values the quality of human art, why should anybody value the quality of human code?
>That's the same energy as telling other professions to "just learn to code, bro" once they are displaced by AI. But I guess it doesn't feel nice once the shoe is on the other foot, though.
It's the exact same neoliberal elites who told everyone to code one year, then told them they'd all be automated out of a job the next.
I dunno who exactly you think you're being condescending towards.