I would argue exactly the opposite... with the advent of the web platform and frameworks like .NET, the vast majority (and I mean like 95%) of developers will never touch anything ML-related in their careers. I get that this is MIT and many of their students will end up working with ML, but applying that globally to CS is nonsense. Back when I was studying CS (in the Czech Republic) more than a decade ago, we had to pass linear algebra, graph theory, and calculus. But honestly, that was all in the first year and a half, and then it completely tapered off (later years were all about projects and algorithmization, i.e. "doing the work", with very little hardcore theory). And guess what, I never needed it again. A bit of statistics and some graph theory here and there, but that's about it.
Contrary to popular belief, there are NOT that many ML jobs out there, and the ones that do exist are more about data science and messing with model-zoo type of shit than actually coding useful programs. Most programmers will be lucky if they get to integrate inference from a pre-trained model into the apps they work on.
This is exactly why so many people (myself included) advocate for a pure "software engineering" degree at more universities. Let people who are interested study graph theory, combinatorics, linear algebra, advanced probability and statistics and whatever else. For the rest, provide a path to be ready for an industry job building websites and applications, which is what 90% of graduates will end up doing.
Every other discipline out there has a clear separation of pure from applied science. Why can't we do the same for software? What we end up with is borderline fraudulent coding bootcamps to fill in the gap.
While I spend 99% of my time doing pure "software engineering", I'm pretty grateful to have the advanced probability / graph theory / combinatorics etc. background, because it helps me envision possibilities I wouldn't otherwise be able to see.
That being said, there are probably lighter ways of teaching that instinct than full-depth classes. I try to listen to podcasts these days as a way of expanding my horizons.
I have been saying for years that we need to treat software developers like Jedi when it comes to training: apprenticed to people who actually practice the craft.
Practical, industry expert-led coursework has been by far the most outstanding education I have ever received. My DSP professor was (and still is) an adjunct at the university I attended and works a normal 9-5 job during the day at an engineering firm. He was easily the best educator I have ever experienced because he brought reality into the classroom every day. I still vividly recall the 20-30 minute lecture/rant about making PowerPoint presentations that don't suck.
It's all the little things for me... The nuanced details like "why are you holding it that way?" are impossible to discover until you have a customer complaining at you for a while or have someone who experienced it themselves giving you a heads-up.
For me, the future of practical software engineering education looks a lot more like a machine shop than it does a university campus.
I think this line of thinking fails to recognize what a general math background does for your critical thinking skills.
I'm sure you and I both took plenty of math classes, and therefore we won't ever really know what our computer science skills would be like without a rigorous math background. Even if I never touch anything more complex than algebra II again, taking ~30 credits of applied math allows me to think in a way that I wouldn't otherwise without that background.
Software engineering is based on applied mathematics too. You'll need at least some basic calculus to make sense of O(n) analysis, and Calc II as a prereq for probability. Then add plenty of logic, discrete mathematics (needed for algorithms and data structures), models of computation and concurrency, category theory (which is becoming a shared language of everything "compositional"), topology etc. etc.
If you really want a "math free" intro to tech, look into Business Information Systems. That tends to be more ad hoc, at least for now. At some point, people will start to care about software assurance even in that context, and the standards will rise accordingly.
You don't really need to know calculus (derivatives and such) but it's true that big O notation requires some sort of "asymptotic thinking" which is probably only explicitly taught in a calculus course.
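To make that "asymptotic thinking" concrete, here's a minimal sketch (the function name `compare_growth` is my own, purely illustrative): all a statement like "O(n log n) beats O(n^2)" really claims is that the ratio of the two costs shrinks toward zero as n grows, regardless of constant factors.

```python
import math

def compare_growth(n):
    """Ratio of an n*log(n) cost to an n^2 cost at input size n."""
    n_log_n = n * math.log2(n)
    quadratic = n * n
    return n_log_n / quadratic

# The ratio keeps shrinking as n grows - no derivatives required,
# just the habit of asking "what happens when n gets large?"
for n in (10, 1000, 100_000):
    print(n, compare_growth(n))
```

No actual calculus machinery is involved, which is sort of the point: the limit-style reasoning is the part that transfers, not the integration techniques.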
The thing is, people with a maths-heavy background tend to think you need a much deeper understanding of math for this than you actually do.
You need very little beyond high school level math for most CS. Some areas, sure.
I've done things in my career that touch on a lot of different areas of math. But the number of times I've regretted not having taken more math has been pretty much non-existent. I wish I remembered a bit more of my trig, mostly.
Most software engineers come into contact with far fewer CS subjects where math matters than I do.
I don't have an issue with a place like MIT insisting on lots of math, but this notion that you need to understand so much math for software engineering is deeply flawed - you don't need much even for a lot of theoretical computer science.
The point is that even a "shallow" understanding of math is already much deeper than many, perhaps most, realize. Many high schools don't seriously try to teach math at all - there's no such thing as "high school math" in this day and age. You need college to even have a chance of being exposed to it properly.
(Then there's the whole "learning to code" part, of course. This is actually where middle and high school math provides useful application domains for learning to code, and people have tried to teach coding in schools since the 1980s.)
I don't really think you need much math to learn to code at all. Again, some kinds of code are helped by math, but I've also seen beginners struggle to reconcile differences between whatever programming language they were introduced to and mathematical notation. There's no doubt there are close relationships between math and CS, but you can do a lot of CS just fine without ever being aware of those relationships.
I opted out of pretty much all the math I could at university, and at mine you could opt out of almost all of it (I had to take one introductory course which mostly served to bring those who hadn't taken much in high school up to scratch, and one introductory stats course).
Many of my other courses touched on subjects where a mathematician would probably say "but that's math". E.g. my compiler courses of course touched a lot on parsers and grammars, which are effectively just math restated. But those restatements matter. Maybe if more math were taught in ways that downplayed the dense notation, more people would actually stick with it.
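As a toy illustration of a grammar being "math restated" as code (the grammar and the function `parse_expr` are my own invented example, not from any course): an EBNF rule like `expr ::= digit ('+' digit)*` translates into a recursive-descent parser almost line for line.

```python
def parse_expr(s, i=0):
    """Parse a sum of single digits per expr ::= digit ('+' digit)*.

    Returns (value, next_index)."""
    if i >= len(s) or not s[i].isdigit():
        raise ValueError(f"expected digit at position {i}")
    value = int(s[i])                    # digit
    i += 1
    while i < len(s) and s[i] == '+':    # ('+' digit)*
        if i + 1 >= len(s) or not s[i + 1].isdigit():
            raise ValueError(f"expected digit at position {i + 1}")
        value += int(s[i + 1])
        i += 2
    return value, i
```

The formal production and the code say the same thing, but the code version reads like instructions rather than notation, which is exactly why some people find it the friendlier entry point.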
Code is pretty much defined by having "dense notation". Learning to code involves plenty of familiarity with formal, logical reasoning; simple math helps provide a convenient domain for applying logical reasoning to meaningful problems.
Code is nowhere near as dense as it could be in most languages - we deliberately make languages verbose for human readers, to a much greater extent than most math (of course there are exceptions, like J and K and similar languages). Most math violates the intent of pretty much every coding standard there is, favouring density over readability for anyone not intimately familiar with it. On top of that, math very often omits detail in a way code can't, which demands a lot more effort from the reader. I get that it's intentional - the focus is on expressing what is different in terms of a shared context that people familiar with the math already have - while in code, only a few fringe languages take that approach over favouring readability.
And yes, we need familiarity with formal, logical reasoning, but the primitives you need to be able to understand coding are really basic, and often easiest introduced by showing people code rather than giving it the mathematical treatment.
It's not necessarily math itself that is the issue, but mathematical notation and the way we teach it - there's a very stark divide, I've observed, between those who prefer those really terse notations that you must take time to decipher, and those who want notations that can be read like prose. For my part I'm firmly in the latter camp.
> but the primitives you need to be able to understand coding are really basic, and often easiest introduced by showing people code rather than giving it the mathematical treatment.
The primitives are hopefully simple, but the logical implications are not. That's why it makes sense to have both.
For a subset of CS that most developers will never need, sure. Nobody is arguing math is never needed for CS, but most developers, and indeed a whole lot of CS researchers will never need much of it.
Discrete probability is probably adequate for most software engineering. Almost everything we encounter in our jobs is discrete. One thing I do think we need more of is linear algebra.
If all you're interested in is getting a good job, you don't need a degree at all. The information is available for free in a variety of presentations and formats. The source code to just about all the software you'll use is available for free as are all the tools. You don't even need a bootcamp, just time and energy.
You may not need a degree to learn the material, but as someone new to the field, there are plenty of jobs that list a 2 year or 4 year degree as a requirement. Having that degree will open more doors than just learning on your own simply because that’s what they’re looking for.
Sure, but in the same vein everything you will ever learn at MIT can be found for free online as well. Ultimately a 4-year degree does have value, whether just for the brand, or as a forcing function to learn, or the constant help from teachers and peers or whatever else.
I don't think a "pure" software engineering degree really needs to be four years.
What you're talking about sounds an awful lot like the program I went into initially at a community college. They taught you some coding in a few popular languages, some database concepts and sent you on your way. I dropped out after a year and found a job.
I ended up going to a four year program after a while. Turns out, a lot of the good jobs in software engineering require understanding those pesky abstract fundamentals.
If you just want to be a great web developer, MIT may not be the best place for you.
MIT best prepares people for those less well defined roles, such as designing the next era of web browsers. For that, you can never know exactly which skills will be needed, so it's probably best to have as many neighbouring skills as possible so you don't hit problems you can't solve merely because the knowledge required to see the best solution was in that topic your course didn't cover.
Who knows, maybe the next era of web browsers will browse the web for you, and then condense everything they learned from thousands of resources into a single paragraph for the user to see. And for that, they might need ML.
Of course, if I browse LinkedIn for MIT EECS grads, most are probably just doing bug fixing at FAANG or the latest unicorn, and only some small fraction are doing anything revolutionary. It's also likely that those few would have done so without an MIT education. See e.g. the Collison brothers.
Learning those things isn't necessary for those jobs, but it proves that you're capable of learning something, and as such it's part of the FAANG acceptance process.
Hmm I’m not sure I really agree with this. Does MIT (or any university) teach the creativity needed to envision the kind of thing you’re talking about? Or like most universities, is it just teaching some foundational skills coupled with whatever has condensed into “required reading” from industry over the last couple decades? Just with a higher pedigree and ostensibly better prepared student body.
Alright. You seem to feel pretty confident about this.
Having worked with quite a few MIT grads over the years, at least in my anecdotal experience, they were smart people who were no more or less likely than any of the other smart people working around them to stumble upon the next evolution of the web browser.
I remember finding myself in 3rd year Calculus in a Computer Science degree, and realizing: I don't have to be here! (only two years were mandatory)
I've always enjoyed math and kept enrolling in it out of habit, until it became too esoteric and my actual interests became more solid and practical.
I find a lot of my university career was fascinating and... useless. Not just from an "I will never use this directly" perspective, but also largely from a "this will give me broader understanding and a framework and enable me to learn faster" perspective. We can have a wonderful philosophical discussion on what university should be for - job prep, or educational enhancement for its own sake - but the truth of the matter was that I envied those in engineering fields who had fun AND learned AND were doing practical things AND were going to apply some of it. Whereas my 3rd and 4th year maths were just maths for the sake of maths.
I may be hanging out with uninteresting crowds, but the same experience is broadly true for my friends and co-workers - a Java developer, a VMware architect, a database administrator, an ERP developer, etc. We all value education and love learning, and will go on vacation with a couple of technical books - but a university Computer Science degree seems very mistailored, or at least sold wrong.
> I find a lot of my university career was fascinating and... useless.
I was in college long ago and for my CS undergrad and masters took the usual CS and math courses. When I needed electives though I took courses like economics, finance and accounting. Many years later, those electives ended up being the most useful.
The CS and math courses I wouldn't consider useless, though. I'm sure I lean on theory I learned without realizing it. But at the time, I couldn't have predicted working in small companies or startups, or how important basic finance and accounting would end up being.
Interesting - I think almost the complete opposite. I am happy where I'm at, but I most certainly would have preferred doing a math/CS double major rather than all the BS busy work of the engineering degree I went through. I wouldn't call 60 hours a week of symbol manipulation practical...
If your goal is to be a code monkey writing database-backed web applications, MIT is probably a very expensive way to get there.
The goal of the degree is to prepare people for data analysis, machine vision, 3d graphics, ML, signal processing, and similar. If you're not into that, going to MIT is wasteful for everyone involved.
That's not elitism talking; that's just the nature of MIT. Other schools aren't like that. For example, if you want to do a startup around a database-backed web application, Stanford is a fine choice. I'm not arguing Stanford is either better or worse; it's just a little bit less academic and little bit more entrepreneurial. There are other schools which emphasize other things. Harvard or Yale will move you more into the class of powerful people. Etc.
Calculus was the first time in my mathematics education where I actually had to understand systems and derive results from first principles. Prior to that, everything was just memorization: "this is what a logarithm is", "SOHCAHTOA", multiplication tables, etc. I straight up hated math until calculus (now I have a math PhD).
I'm sure the same could be accomplished with other fields of math, but I don't feel it's necessary to switch. It would be extremely hard to find good teachers and course materials for combinatorics or graph theory, too.