Training employees used to be common in the US. Companies trained their workforce and marketed the quality of their training programs to both customers and potential employees.
US employers didn't do this out of the goodness of their shareholders' hearts or (as in Germany) any regulatory obligation. They did it for competitive advantage; skilled labor was scarce and worth the investment.
Those days are long past. Contemporary workers are oblivious to this history as it has been more than two generations since this situation prevailed in the US. The reasons for this change can no longer be discussed in a candid manner without triggering people so I won't attempt it.
It might trigger me but I’m still interested in hearing your perspective on why this went away. My simple take is that the economic pressure to train workforces went away as the supply of labor changed, and there wasn’t a competitive advantage to spending on training. Is it something more than that?
In my view, there's been a strong push towards eliminating skilled labor in a lot of settings, such as factories. We'd rather get rid of both the apprentice and the master.
At the factory attached to my place of employment, I'm not sure what jobs are actually left that are related to any of the skilled trades. Most of our workers are assemblers. We do hire people with two-year trade school degrees, or equivalent military service training. When our welder retires, we will probably outsource the welding.
Now we get pieces from shops that have skilled trades such as machinists, but I don't think those shops require a four-year degree. I don't think carpenters, roofers, or plumbers have bachelor's degrees. Many of the businesses that hire those people are family owned, and hard to break into if you're not connected.
As for health care trades, well, employers might be willing to bear the cost of the training, if there were only one employer: the government.
There are new skilled trades coming up now. My friend is a robotics welder. Unlike a traditional welder, he does most of his work from a computer terminal. He programs the robot to weld a specific joint and then the robot repeats that weld for every part that comes through his station on the line.
The old guard of skilled welders who did the welding by hand are being replaced (and they're not happy about it) by young guys like him. My friend learned his job through a 3-year program at the local community college. His job involves a lot less of the tacit knowledge a skilled welder would have (judging the quality of a joint by feel) and a lot more theoretical knowledge of metallurgy. Instead of intuitive judgement, he relies on nondestructive testing (x-ray scanning) and destructive testing (cutting a test weld in half and inspecting it under an electron microscope).
There are economic reasons behind it. My company used to invest a lot in training employees because most of them stayed with the company a very long time, more than half until retirement. I got a lot of training ~20 years ago as a new hire, while the new hires today get almost nothing, not even the bare minimum to do their jobs. Why? Well, a few weeks ago I had a meeting with a team from a very well-known IT company for a project, and they all expressed their amazement when they heard how long I'd been working in one place; people now stay 3-5 years at a company and move on. Why train them, only to make them better for their next job?
Companies that invested a lot in their employees paid less than companies that had no such expenses, so it became common for a new hire to spend three years in training and then leave for a better salary. This competitive pressure made us adjust and move the training budget into salaries.
Tech employers usually offer reimbursement for online training, conferences, books, etc. That seems commonplace and enables the business models of Pluralsight and the like.
Some also bring outside consultants and training firms for in-house seminars, some pay for conference expenses, and some even reimburse tuition for a professional development course.
I too wish gp would elaborate. I don’t know what caused the change but it’s clear to me that companies wishing to choose hires now with no qualifications other than teachability would have a very difficult time dealing with diversity issues.
Demographics. Compulsory public education became a thing in the US around the same time child labor laws were becoming widespread. This was coincidentally around the time men were coming back from the world wars and needing to take back the jobs their children/wives had been filling. Well, all those boomers are still alive, and we needed an excuse to keep their kids/grandkids out of the work force a little longer. Hence the over-saturation of college degree requirements. That's changing as the boomers die out, so we will see "college degrees" become less of a requirement. The only thing that could keep it going is competition from immigrants, but I doubt that will be the case.
Notice how the high school graduation rate among 25-29 year-olds rose between 1940 and 1950. Those are people who graduated high school in roughly 1929-1934 (the 1940 cohort) and 1939-1943 (the 1950 cohort), before WWII.
This pre-WWII trend is consistent with the post-WWII trend, up until ~1978, when effectively "everyone" graduated high school. In other words, this has been happening since before WWII, and WWI had nowhere near the civilian mobilization of WWII, so the demobilized wives/children argument doesn't explain the inter-war increase.
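The cohort arithmetic above is easy to check with a quick sketch. This assumes graduation at roughly age 18 (the function name and exact ages are my own illustration, not from any census source):

```python
def grad_years(census_year, grad_age=18, ages=range(25, 30)):
    """Return the (earliest, latest) high-school graduation years
    for people aged 25-29 in a given census year, assuming they
    graduated at roughly age 18."""
    years = [census_year - age + grad_age for age in ages]
    return min(years), max(years)

# 25-29 year-olds counted in 1940 graduated around 1929-1933,
# and the 1950 cohort around 1939-1943, i.e. before or at the
# very start of US involvement in WWII.
print(grad_years(1940))  # -> (1929, 1933)
print(grad_years(1950))  # -> (1939, 1943)
```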
As for college:
Notice how the number of 25-29 year-olds with college degrees rises dramatically between 1948 and 1951 and then slowly through about 1960. That's WWII vets using the GI bill to go to college (18-21 year-olds starting 1945 to 1948). Then the increase tapers off.
There's another significant rise between 1965 and 1978. That's boomers avoiding the Vietnam draft with college deferments. Then it levels off (actually, decreases) for 20 years until the late 90's.
The late 90's rise corresponds to the late-80's / early-90's push to throw student loans at anyone with a pulse.
Starting in 2008, the rate starts rising again: Young adults of all ages staying in or going back to finish college because the Great Recession killed a bunch of jobs. Plus, probably, some amount of people who started college around 2002 to avoid being drafted for Iraq (which didn't end up happening anyway).
These things are not mutually exclusive. For example, during the Great Recession, why were young people particularly unemployed, sending them to college? Because their elders held onto those jobs, squeezing them out of the work force. It became a meme that you needed 5 years of experience for an "entry level" position.
Because the apprentice watching you hammer in a nail is not producing $25/hr (or whatever the minimum wage happens to be). Add on all the ways that employees can screw your business over without the ability to get rid of them, and you get the piecemeal training systems we have today.
> Add on all the ways that employees can screw your business over without the ability to get rid of them and you get the piecemeal training systems we have today.
This is an interesting angle. Do you think it is a mutually escalating war of trying to screw each other or maximize one's own gains that has resulted in a hostile relationship between employees and employers? What would mutual de-escalation look like?