Hacker News | ilc's comments

When you realize that being a great "programmer" isn't about writing the most code but about getting the job done...

AI will click as another tool in the toolbox.


In much the same way that buying produce makes you a great farmer.

Farming is a funny example to pick, given that it's one of the best examples of an industry continually revolutionized by evolving technology. Farming today is about owning the best tractor.

It's the difference between lovingly crafting heirloom tomatoes in small batches vs producing a consistent multi-ton quantity of tomatoes at an industrial scale.

There are uses for both, but job/compensation-wise the heirloom grower isn't in the majority.


no, it makes you a great chef

I believe you reinforced the point.

I think you missed the point.

produce : chef :: code : programmer

Chefs use produce to create dishes; chefs do not generally grow their own food. The point they were making is that the code is actually the means to the end, not the end in itself. To wit: I do not write assembly.


This may be more correct than you know. Chefs actually don't cook food customers eat. They plan the menu and manage the operations. The cooks cook.

They know how to cook and can if needed, but usually don't bother as they have 15 other restaurants to manage.

No, I understand the point; mine is that the people drawing these analogies miss that the original intent was production, not consumption.

I think being a great "software engineer" at a company is about getting the job done. A great "programmer" is about designing and writing great code.

It's hard for me to disagree more; being a great programmer is completely orthogonal to how fast you "get the job done."

Speaking as a Principal SWE who has done his fair share of big stuff:

I'm excited to work with AI. Why? Because it magnifies the thing I do well: make technical decisions. Coding is ONE place I do that, but architecture, debugging, etc. all use that same skill: making good technical decisions.

And if you can make good choices, AI is a MEGA force multiplier. You just have to be willing to let go of the reins a hair.


As a self-teaching beginner*, this is where I find AI a bit limiting. When I ask ChatGPT questions about code, it is always quick to offer up a solution, but it often provides inappropriate responses that don't take into account the full context of a project/task. While it can describe what good structure and architecture are, it's missing the awareness to apply good design and architecture to the questions I have, and I don't have the experience or skill set to ask those questions myself. It often suggests solutions (I tend to ask it for suggestions rather than full code, so I can work things out myself) that have drawbacks I only discover down the line.

Any suggestions to overcome this deficit in design experience? My best guess is to read some texts on code design, or alternatively to get a job somewhere I can learn design in practice. I'm mainly learning JavaScript and web app development at the moment.

*Who has had a career in a previous field, and doesn't necessarily think that learning programming will lead to another career (and is okay with that).


I can't easily summarize ~40 years of programming experience (30+ professional).

I can tell you: Your problems are a layer higher than you think.

Coding, architecture, etc.: those get the face time. But process and discipline are where the money is made and lost with AI.

To give a minor example: My first attempt at a major project with AI failed HORRIBLY. But I stepped back and figured out why. What shortcomings did my approach have? What shortcomings did the AI have? Root cause analysis.

Next day I sat down with the AI and developed a PLAN of what to do. Yes, a day spent on a plan.

Then we executed the plan. (Or it did, and I kept it on track and fixed problems in the plan as things happened.) On the third day I'd completed a VERY complex task. I mean STUPIDLY complex: something where I knew WHAT I wanted to do, and roughly how, but not the exact details, and not at the level needed to implement it. I'm sure 1-2 weeks of research could have taught me. Or I could let the AI do it.

... And that formed my style of working with AI.

If you need a mentor pop in the Svalboard discord, and join #sval-dev. You should be able to figure out who I am.


Thanks for the reply. I have notifications off, so I didn't realise you had replied.

I guess it sounds like a bit of a trap in some ways. Those without experience (noobs like me) are gifted a tool that can suggest solutions or strategies that you'd typically come across only after many years of working and learning, peeling back the onion. You don't really have the taste or canniness to know how best to use those tools without the experience, which you typically gain by working and gaining the experience!

I cycle regularly, and it actually reminds me a little of the many people who've taken to ebikes recently. I'm generally in favour of ebikes for adding another mode of non-car transport, but I definitely see a lot of newish ebikers attempting manoeuvres that only experienced cyclists would do, because you wouldn't attempt them without the strength to get up to speed and the handling skills to match. Ebikers can bypass the strength and skills training, and so get themselves into somewhat dangerous situations that you'd normally only find yourself in if you were a pretty strong cyclist with the bike handling skills to match your speed.

A bit of a tangent there!

I do like spending time planning. The problem is, as a relative beginner, there's only so much you can add to your plan when you're still learning how to do so much. I do use AI to help advise on my strategies and toolkit approaches, rather than to do the code. It does quite regularly send me really inappropriate solutions, though, that a human would (hopefully) avoid, because the human understands the wider context or at least would ask questions.


I think you overestimate how valuable really good Principal-level talent is when you have AIs that can take over for entire teams.

As an older and higher up engineer, I worry more for the youngsters than myself. I'll find a spot. I'm using AI, I'm doing things at rates that are pretty crazy.

That's all powered by decades of good decision making practice. Youngsters don't have that. They don't have the painful lessons hard earned.


> I think you overestimate how valuable really good Principal level talent is when you have AIs that can take over for entire teams.

I think you mean underestimate.

A good principal engineer (and they almost always are good) is someone who can understand both business requirements as well as the architectural foundations that underlie the product itself.

Principal (and Principal track Staff, Senior, and early-career) SWEs aren't going anywhere. In fact in an LLM-driven world, their domain experience is much more critical.


Minor note to anyone from taalas:

The background on your site genuinely made me wonder what was wrong with my monitor.


LLMs are where you need the most tests.

You want them writing tests especially in critical sections, I'll push to 100% coverage. (Not all code goes there, but things that MUST work or everything crumbles? Yeah, I do it.)

There was one time I was doing the classic: pull a bug, find 2 more. And I just told the LLM: "100% test coverage on the thing giving me problems." It found 4 bugs, fixed them, and that functionality has been rock solid since.

100% coverage is not a normal tool. But when you need it, man does it help.
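As a concrete sketch of what "one test per branch" looks like (the `clamp` function and test names here are made up for illustration, not code from the project described):

```python
# Hypothetical "must work" helper: everything downstream relies on it.
def clamp(value, lo, hi):
    """Clamp value into the inclusive range [lo, hi]."""
    if lo > hi:
        raise ValueError("lo must not exceed hi")
    if value < lo:
        return lo
    if value > hi:
        return hi
    return value

# One test per branch -> 100% line and branch coverage of clamp().
def test_below_range():
    assert clamp(-5, 0, 10) == 0

def test_above_range():
    assert clamp(50, 0, 10) == 10

def test_inside_range():
    assert clamp(5, 0, 10) == 5

def test_invalid_range():
    try:
        clamp(1, 10, 0)
    except ValueError:
        return
    assert False, "expected ValueError"
```

Rather than taking the LLM's word for it, a tool like coverage.py can confirm every branch was actually hit: `coverage run --branch -m pytest` followed by `coverage report --fail-under=100`.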


> You want them writing tests especially in critical sections, I'll push to 100% coverage.

But how do you know if you got it?

I've seen no LLM that can even verify execution pathway coverage.


I disagree.

Freedom of Speech guarantees the right to speak. Not the right to have no repercussions.

Elon has a GREAT interest in Freedom of Speech: it gives him far more power than regulating the type of "speech" he showed in cancelling that customer's order would.


Elon has an interest in monetary gain and in stirring conflicts around the world. It is sad that individuals like you are drunk on his Kool-Aid.


> Freedom of Speech guarantees the right to speak. Not the right to have no repercussions.

How is that different from, say, Freedom of Theft guaranteeing the right to steal, but not the right to have no repercussions?

By these definitions, everyone has these “rights”?


I think the OP meant 'social repercussions'.

But really it is a citizen, Elon Musk, exercising their property rights. The fact that he claims that such restrictions should not be applied to speech he likes is the problem.


he's controlling the speech, not freeing it


try posting "cisgender" on xhitter


No, it means you have something unique to say.

The bar is there, but it is lower than you expect. If you have a truly unique point of view to express, one that brings some value to the table, slots will open up.

And I've spoken at plenty of conferences. :) Not always in the glamour rooms/slots. But... I did have one talk fill a room out the door. That was a talk on a difficult/controversial topic, and by then... I was probably about as expert as they came on the issue.

I didn't start with that though. I just started with a simple point of view talk. And I'd argue the second version of that talk is still one of the best I've given in my life.


That doesn't mean every talk has to be unique and special. An "introduction to XYZ" talk may have a bunch of equally valid speakers, who all naturally provide a slightly different angle, and there are a bunch of factors going into the decision about who gets the slot.

Some talks are plain craftwork, not unique experiences, and still very worthwhile.


It can. But I don't want to compete for my slot with others who can give the same talk, or a talk that is similar.

I want to make the conference committee choose between "Do we want ilc's talk on X." or "Do we want foo's talk on Y." If we are both discussing the same thing, if I'm unknown, I will lose. OTOH, if I have something interesting to talk about... I have 2 routes to "victory". "ilc gives great talks, he gets good grades and is working on his skills." and "Man that's a damn cool topic. We want that at our conference, even if ilc isn't the BEST speaker, the combo is better."

I didn't start out as the best presenter. I learned. But I always knew I had to have an interesting topic, something that made it worth them giving me a slot.


No, I put them in the same bucket as lmgtfy. Most of the time, you are being told that your question is easy to research and you didn't do the work.

Also heaven forbid, AI can be right. I realize this is a shocker to many here. But AI has use, especially in easy cases.


"I asked AI and it said" is far worse than lmgtfy (which is already rude) because it has zero value as evidence. AI can be right, but it's wrong often enough that you can't actually use it to determine the truth of something.


How is lmgtfy rude?


Do you respond to every comment with a question in it? No? Then why would you respond to a question in a comment with a useless reply?


You didn't answer the question, it's your reply that is useless.


1.) They are not replies to people asking questions.

2.) Posting an AI response has as much value as posting a random Reddit comment.

3.) AI has value where you are able to factually verify it. If someone asks a question, they do not know the answer and are unable to validate the AI.


I don't think LLM responses mean a question is easy to research - they will always give an answer.


I think AI clouds the real issues around Junior hiring. Defective companies.

Let's say you hire your great new engineer. Ok, great! Now their value is going to escalate RAPIDLY over the next 2-3 years. And by rapidly, I mean it could be 50-100%. Because someone else will pay that to NOT train a person fresh out of college!

What company hands out raises aggressively enough to stay ahead of that truth? None of them, maybe a MANGA or some other thing. But most don't.

So, managers figure out fresh out of college == training employees for other people, so why bother? The company may not even break even!

That is the REAL catch-22. Not AI. It is how the value of people changes early in their career.


I think this is the crux of it. When I got my first job, I probably made half the salary of the senior engineer in our division. I am 100% sure I was not half as productive. Juniors take a lot of training and time and aren't very productive, but their salaries don't actually reflect that. For the first few months at your first job, you're probably a net loss in productivity.

If salaries reflected productivity, you'd probably start out at near minimum wage and rapidly get raises of 100% every half year.

On top of that, if the junior is successful he'll probably leave soon after he's up and running, b/c the culture encourages changing jobs every 1-2 years. So then you need to lock people down with vesting stock or something.

It seems not easy at all. Even if you give aggressive raises, at the next interview they can fake/inflate their experience and jump into a higher salary bracket.

Hiring and training junior developers seems incredibly difficult, and like a total waste of energy. The only time I've seen it work is when you get a timid autistic-savant type who is too intimidated by interviewing and changing jobs. These people end up pumping out tons of code for small salaries and stay on for years and years. This is hitting the jackpot for a company.


>Even if you give aggressive raises, at the next interview they can fake/inflate their experience and jump in to a higher salary bracket

I don't think the kinds of people who see a 50% raise and complain that it's not 100% are the kinds of candidates you want to hire anyway. I'd like to see more of that before deciding we tried nothing and ran out of ideas.

I didn't leave my first job because I was non-autistic. I left because I was paid 50k and the next job literally tripled my total comp. Oh, and because I was laid off. But tbf I was already mentally out the door by then, after 2 years of nothing but chastising, and was looking at the next opportunity.

I would have (outside of said chastising) gladly stayed if I got boosted to 75k. I was still living within my means on 50k.

>Hiring and training junior developers seems incredibly difficult and like a total waste of energy

If that's the attitude at large, we're all falling into a tragedy of the commons.


> Juniors take a lot of training and time and aren't very productive, but their salaries are actually not reflective of that

In the current economic situation you can offer a junior 2x, maybe even 3x, less and still get candidates to choose from.

Also, there are juniors who are ready to compensate for lack of experience by working longer hours (though that's not something you would learn during hiring).

> The first few months at your first job you're probably a net loss in productivity.

It's true for a senior too; each company is different, and it takes time to learn the company's specific stuff.


I actually got a major raise after 6m, and then another major raise 1y into my career, because my boss recognized my value.

Sadly this is not as common as it should be - but I've also mentored folks at FAANGs who got promoted after 1y at the new-hire level because they were so clearly excelling. The first promotion is usually not very hard to attain if you're in the top quartile.


>not very hard to attain if you're in the top quartile.

No biggie, just be the best in the interview stage and continue to be the best for years after that. It's that simple.


I used to sit on a Google / Alphabet promo committee (more than half a decade ago, so things have changed a bit). The bar for L3 (newgrad) to L4 promotions is not high. It's certainly not a rubber stamp, but we were looking for:

  - Gets things done fast enough with high quality 
  - Works mostly independently 
  - Has recommendations from more senior peers who can speak to what they've done
  - Is starting to show several of the expected marks of a Senior (L5)
I had at least one, maybe two, new grads I mentored get promoted to L4 within a year of starting full time; they had both done internships at Google, so they didn't have as much ramp-up time.


"Winner take all" environment


What you are saying is not a hiring problem, but an education one.

If colleges stayed up to date and taught valuable skills, the jump wouldn't be so steep!


Dumping our apprenticeship programs onto academia is exactly how we got into this mess to begin with. It has historically not been the job of a college to produce junior talent. At best they teach toward a T-shaped individual, and set up more of their research pipeline should students want to delve deeper.

If industry doesn't want to pay for training, they better pay bootcamps to overhaul themselves and teach what they actually need. I don't think universities will bend much more since they have their own bubble on their hands.


So you can get rooted by the security issues disclosed.

Isn't it a wonderful catch-22?

