> But an LLM can't be held accountable.. neither can most employees
Yes and no.
Yes, this really is a problem, because at the current level of technology some things are affordable only at scale. For example, no single person could be accountable for a machine like a Boeing 747 (~500 human-years of work per plane).
Unfortunately, a modern automobile is also considered a large system, built from thousands of parts, so again no single person can know everything about it.
And no: the Germans say "Ordnung muss sein", which in modern management means that constant, clear organization of the whole team's play matters more than the success of individual players.
Or, in simple words, the right organization, governed by rules, is considered reliable enough to be held accountable.
In the automobile industry, for example, it is now normal to hold the whole organization accountable.
For example, Daimler officials said a few years ago that Daimler safety systems would follow Daimler's own view of robotic laws: the priority would be the safety of the people inside the vehicle. You may know the traditionally cited Lem robotic laws, which take a totally different view, without this inside-vs-outside distinction. Civil aviation takes yet another approach: use only simple designs, or designs with evidence of reliability.
Sure, government regulators could decide on something even more original; we will see.
Anyway, as the technology matures, the accountability of machines will surely be the subject of many discussions.
> But an LLM doesn't understand "never used again" as a consequence and the threat of it is useless as a motivation to improve (also because LLMs have no concept of "motivation" or "threats" or anything else).
You're talking about LLMs as if they were some kind of singular entity, but LLMs as used for coding exist only as products of companies that employ humans. If nobody uses an LLM because it sucks, those people will be out of a job.