CS degrees aren’t meant to teach any particular skill. They should be teaching general knowledge that helps you adapt to whatever the FOTM is in industry. Realistically you can learn to use git in less than one day. It just shouldn’t be a significant barrier to hiring someone.
I don't even use any of the specific languages or software I learned during my degree in my current role.
Git is not just a fad; it's a literal underpinning of most serious software development these days, especially if you need to work with others.
That a CS grad doesn't even know basic Git is telling not of their degree, but of the individual themselves - that they have no desire to learn beyond what's taught, and couldn't even be bothered to look into the requirements of the industry that they want to work in.
I argue that it should be a barrier to hiring, on that ground.
As someone who has used Git a fair amount over the last 15 years, it has literally one of the worst developer experiences, right up there with C, Python's package management, CUDA kernel development or shader debugging, ffmpeg, Scala, MongoDB, and Rust. All of those tools are very useful in today's world, but they're not necessary to do the job, and unless I'm specifically hiring for that skillset I wouldn't ding anyone who didn't know them.
Sure, but among those things you listed, source control systems are fundamental to working on projects with other people. If a candidate hasn't encountered any source control system (not just git), you should seriously consider what it will take to train this person on not just the tooling, but the concepts as well.
First, git doesn't especially solve problems that CS majors have until they are near the end of their courses. The need to manage collaboration, organization, and change in a code base is trivial until pretty late in a CS program, and typically never gets past basic.
Second, the context here is the git command-line interface. People can and do use git extensively without it.
Third, git's not exactly perfect. Personally, I'll be a little sad if git remains the de facto standard source control in the long term... it's not bad, but we can do better.
I think lack of knowledge of git commands in a CS grad would be essentially uncorrelated with future professional success.
Git would have been helpful to me in my very first intro to programming course. It fixes those situations of: "arg, I changed one thing - it no longer works, and I'm not even sure anymore what that one thing was!"
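Even at intro level, the fix for that situation is a couple of commands. A minimal sketch (the throwaway repo and filename here are made up for illustration):

```shell
# Throwaway repo with one committed, working version of a file.
demo=$(mktemp -d) && cd "$demo"
git init -q
git config user.email student@example.com
git config user.name "Student"
echo 'print("hello")' > main.py
git add main.py
git commit -q -m "working version"

# ...later, an edit breaks things...
echo 'print(hello)' > main.py

# "What was the one thing I changed?"
git diff main.py

# Get the last working version back:
git restore main.py   # on older git: git checkout -- main.py
```

That alone would save a lot of first-semester panic.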
Second, a successful CS grad has hopefully written at least something that is at least 2000 lines long. Doing that without VCS of any kind is not impressive, it is obtuse. That is a strong no hire signal imo.
I had a different CS course than you, because git (as a remote storage and version control tool; branches and submodules came later) and bash were the first two things I had to learn on the first day (how to use diff and patch was on the second day, if I remember right).
> especially when there is money to be made by doing things better.
Is there though? Why would I pay for a VCS when there are already Git, Mercurial, and other free and open-source solutions?
Git may not be the most intuitive but it's a solid product and most people work well with it, even those who every now and then delete and re-clone a repo because they don't know another way to get out of a pickle.
> yeah, but all of the VCSs before git could've claimed the same thing. if that's not a fad, what is?
Git being the de facto standard in the space is coming up on twenty years. That's about as far from a "fad" as you can get.
> i'm glad git is the soup du jour today
The problem isn't so much with the "soup du jour", but that the whole concept of "soup" seems foreign to the people the tweet is talking about.
(But yes, given the twenty years mentioned above, "the concept of soup" in practice is pretty much equal to "the soup of the last few decades". Claiming to be a software developer nowadays while being unfamiliar with git is like claiming you were "computer literate" in the nineties but not being able to use Windows.)
Accept my upvote. The silent cowards who are downvoting you hate that what you're saying is true - or at least are tacitly admitting that they can't actually form a counter-argument.
Git deserves credit for popularizing distributed VCS - which is HUGE - but it was not the first or the best DVCS. Its CLI is full of footguns and it's too eager to irreparably erase history.
Even if they're focused on theoretical CS, they should still be using git to upload their TeX files somewhere in 2024. And from an undergrad degree I'd expect both an OS class and an algorithms or data structures course that would use some kind of VCS.
Or maybe all that doesn't exist when you attend Canadian Chromebook degree mills?
Data structures is where I learned how to write test code. It was the best way to actually get the algorithms correct.
If we'd had git back then... I remember one bug that took 6 hours to find; I finally found it at 8am before the 10am class and was typing as fast as I could for the next two hours. If there had been any kind of basic VCS back then that was as easy to use, there would have been many fewer all-nighters.
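For what it's worth, git even automates that kind of hunt now: `git bisect` binary-searches the history for the commit that introduced a bug. A toy sketch (the failing `check.sh` stands in for a real test):

```shell
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name "Demo"

# 20 commits; commit 13 introduces the "bug" (check.sh starts failing).
for i in $(seq 1 20); do
  if [ "$i" -ge 13 ]; then echo "exit 1" > check.sh; else echo "exit 0" > check.sh; fi
  git add check.sh
  git commit -q -m "commit $i"
done

# Binary search: bad=HEAD, good=first commit. ~4-5 test runs instead of 20.
git bisect start HEAD "$(git rev-list HEAD | tail -n 1)"
git bisect run sh check.sh > /dev/null
first_bad=$(git log -1 --format=%s refs/bisect/bad)
git bisect reset
echo "bug introduced by: $first_bad"
```

With 20 commits that's a handful of test runs; with a semester's worth of commits it's still only logarithmically many.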
Which is to say, a data structures class should have students coding. There are a number of skills you want when coding, basic and common skills that just make it a ton easier. Namely, confidence and iteration speed go up a ton when you can roll back. A modern IDE can do a lot of that too.
I guess I'm becoming an old fart, I've no idea why basic and helpful tools would not be emphasized early. Debugging with those tools is one of those things you want to learn. It helps build a data/fact based approach to coding.
Yes, they divided us into teams of 4-5 and told us about valgrind, which turned out to be invaluable. So if that is done today in some black-box web-interface CS course module from an anti-cheat vendor, one that abstracts the actually useful real-world experience away from the student, that would be pretty sad.
Because part of the homework could be implementing algorithms and data structures. Even if that's done in pseudocode you'd probably want to expose your students to some form of VCS at that point. At my old faculty for example you worked through 1/3 of CLRS as part of the lecture and then implemented a few algorithms and data structures in both Java and C++ under the guidance of TAs.
So even students who failed the programming part and passed by acing the exam and the theoretical part knew what a shell was, what a compiler and linker did, how to find documentation, and how to actually do all that on either their laptop or a provided thin client.
And most classes were set up like this. As a first-semester student you'd be submitting your MIPS assembler homework using a VCS, etc.
That's the heart of the matter: Schools are teaching algorithms and data structures to students who don't yet know how to code.
It's wankery, no matter what the student's future path is intended to be. You can't work in CS research either if you don't know how to use a computer.
To expand on that, and maybe my view is a bit too traditional, but my understanding is that universities ought to prepare you to become a scientist. At least that's how it was historically. They are not meant to be "vocational training plus" or to subsidize the industry by training the work force for them. (It might be different for colleges, I'm mainly speaking from a European perspective.)
I know times have changed, but I don't think git should be taught in a university-level computer science course. You teach graph theory and then many operations in git should become clear after reading the documentation. I don't think any implementation-specific skills should be taught, unless you use it as an example, like Java or C++ to teach OOP. Instead, you teach broad concepts that underpin all of modern computing.
Isn't there a saying along the lines of "computer science is as much about computers as astronomy is about telescopes"? When I was at university twenty years ago people were still claiming you could easily pass a computer science program without ever touching a computer.
I'm a researcher in one of the more "applied" subfields of CS (security and privacy), and we absolutely use standard software engineering tools day-to-day. You're not going to be a productive member of a research group if you don't know how to navigate the tools of the trade. Research isn't about using Git, Bash, or Docker, but you're dead weight if you can't wield those.
Here's a serious research venue asking for technical artifacts for reproducibility: shell scripts, VMs, Docker containers, configs, READMEs, etc. https://petsymposium.org/artifacts.php
It is a fairly new thing in the security/privacy academic research community (probably just in the last 5 years or so, and still spreading), but a very welcome one. Not only does it encourage reproducibility, but it also helps authors have a broader impact when their work involves evaluating ideas and offering practical proofs-of-concept that other people can easily build upon. It's all optional for the venues I deal with, yet lots of people do turn in artifacts for their accepted papers. For all we know, the next Let's Encrypt (itself partially a University of Michigan project) could start as a submitted artifact.
> You teach graph theory and then many operations in git should become clear after reading the documentation
Never have I needed to understand the underlying Merkle tree implementation of git to use it. Instead, I need to know when to branch, how to reset, and what makes up a good commit working set and message.
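Exactly, and none of that falls out of graph theory. The working knowledge looks more like this (a sketch; the branch name and files are invented):

```shell
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name "Demo"
echo base > app.txt && git add app.txt && git commit -q -m "base"

# When to branch: before starting any non-trivial change.
git switch -c fix-parser   # on older git: git checkout -b fix-parser

# A good commit: a focused working set plus a message that says why.
echo fix >> app.txt
git add app.txt
git commit -q -m "parser: handle empty input without crashing"

# How to reset: undo the commit but keep the work staged...
git reset --soft HEAD~1
# ...or discard the work entirely and return to the last good state.
git reset --hard HEAD
```

None of that requires knowing how the object store is implemented; it requires knowing which of the reset modes you actually want.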
All of this would be the same whether I was sullying myself by learning vocational skills or just becoming able to work effectively and efficiently by myself on a research project.
Indeed, I've always been against filtering people on programming language. However, I wouldn't hire a developer who wasn't good at the language they had been using or learning for the last few years. In the same way, I wouldn't refuse to hire someone without git experience. But I would definitely not hire someone who, in 2024, hadn't been using some VCS tool. Plus, I would be curious to know how the applicant had been sharing code with colleagues in group projects and why they had gone for solutions other than git. For someone more experienced, I would wonder how they managed to stay insulated from the most widely used code-hosting services like GitHub, GitLab, Bitbucket, and so on, which are overwhelmingly git-focused (Bitbucket even started out as a Mercurial solution and then slowly shifted to git). This could actually be an interesting window into the applicant's decision-making process. Even though I would keep an open mind, I think the odds are that I would end up finding reasons not to hire that applicant.
This is theory. In practice, you should have done some work during your course and it's reasonable to assume that using version control (which you can learn in a day) is a skill you'd pick up at that time. If not, what exactly were you doing for all those years is a question.
This has proven to be an extremely controversial topic, but, in my opinion, it's perfectly okay to use git in college or university, and we should encourage, not discourage people from using technologies like distributed version control software.
You could substitute git with WhatsApp, Google Drive, or e-mail for small projects and get by just fine, but why not spend 15 minutes learning the basics of git?
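And those 15 minutes really do buy the whole single-player workflow. A sketch of the basics:

```shell
project=$(mktemp -d) && cd "$project"  # stand-in for a course project folder
git init -q                            # 1. make the folder a repository
git config user.email you@example.com  #    one-time identity setup
git config user.name "Your Name"

echo "draft 1" > essay.txt
git add essay.txt                      # 2. stage what should be recorded
git commit -q -m "First draft"         # 3. record a snapshot

echo "draft 2" > essay.txt
git commit -q -am "Second draft"       # -a restages already-tracked files

git log --oneline                      # 4. the history you just built
```

Four commands, and every draft is recoverable. Sharing via a hosting service is one `git push` on top of this.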
As far as I know, the use of git and other distributed version control software is very popular, and we don't see the same hesitation when adopting technologies like Google Docs and its collaborative editing features in college or university.
Is distributed version control software truly a controversial technology?
If distributed version control software is not suitable for use in college and university, what would be a more appropriate technology?
The university could even set up its own private self-hosted GitLab and use it as part of assignment submission.
10 years ago, my no-name college had a CS degree that required us all to take a "Software Engineering" course that covered the fundamentals needed once you graduated, including Git. We did group-style large coding projects where teams had to submit their GitHub repo at the end.
The prof was able to review who committed what and then hammered us on good commit messages, clean coding style, testing, etc.. I feel that a large part of my career success was due to the early start I had from that course.
I think that this would be a great idea, and could also help combat students not doing anything in group projects.
I lone wolfed most of my group projects in college and don't have any regrets, but in the projects that I didn't lone wolf, most people didn't write a single line of code, or only contributed in relatively inconsequential ways.
I think that adopting distributed version control systems in higher education would be mostly good.
This is one of those things that MIT’s missing semester course aims to help with (https://missing.csail.mit.edu/), and although computer science is different from software engineering, the reality is that most CS grads go into software engineering, and thus should try and learn these essential skills.
Computer science in the graphics space requires a ton of gnarly coding. If we consider post-doc computer science in particular, the idea that you don't then need to be at least a semi-competent programmer is... a surprising idea to have.
Which is to say, the idea that CS grads never need to code is a stretch. How does a person work on cutting-edge graphics algorithms without building something with them? Without coding in a pretty serious way?
One example: at the end of my intro to programming course, the final assignment was to build something, and I wrote a chess program (with a GUI). While some classes are quite theoretical, I strongly disagree that a person can get through an entire CS program without their coding skills being challenged.
git is a tool, not a discipline. It's made to facilitate work, not to create a whole knowledge domain unto itself.
I wouldn't fault a seasoned and well-trained mechanic for not knowing the specifics of a specialty tool and needing to find the manual; similarly, although git is the current VCS-ish thing of choice right now, it doesn't represent all of computer science.
VCS is something that is interacted with multiple times a day in any codebase. It is a basic tool, not too far apart from being able to traverse a file system. The next step after viewing files, is being able to commit to and view the history of those files.
I don't know if that answers "who cares", but when a mechanic does not know how to do something they need to do multiple times every day, there is a skill gap. While CS is about theory, I would expect a CS grad to have used all of the basic programming tools and more.
Been programming since '98, professionally since '08.
If a codebase has no VCS... that seems almost ideological. I've even run git init on a Linux file system just to understand patch changes.
That aside, what does disaster recovery look like without VCS? Remote backups are an obtuse form of VCS.
If your laptop explodes, are you losing months of work? I mentor junior devs to push daily; if their laptop blows up, I want them set up again in 30 minutes with at most one day's worth of work lost (because they push their branches remotely, if only for the redundancy; doing so daily makes it easy to recreate what was lost, compared to a week or two of effort).
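And that habit costs almost nothing: after the first `git push -u`, the daily checkpoint is a single command. A sketch, with a local bare repository standing in for GitHub/GitLab:

```shell
# Stand-in "server": a bare repository.
remote=$(mktemp -d)
git init -q --bare "$remote"

work=$(mktemp -d) && cd "$work"
git init -q
git config user.email dev@example.com
git config user.name "Dev"
git remote add origin "$remote"

echo wip > feature.txt
git add feature.txt
git commit -q -m "WIP: end-of-day checkpoint"

# First push sets the upstream; afterwards a daily `git push` suffices.
git push -q -u origin HEAD

# If the laptop dies, a fresh clone recovers everything that was pushed.
restore=$(mktemp -d)
git clone -q "$remote" "$restore/recovered"
```

The clone at the end is the whole disaster-recovery story: anything pushed survives the laptop.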
Without VCS, integration is a larger challenge. It can be done, but continuously and also efficiently?
Seemingly we are talking about non collaborative code bases, a vanishingly small subset of projects. Most software is developed by teams.
Last, it is so damn easy to run 'git init'. At my last job, the team used git locally and then did the stuff in TFS when they finally had to. The point is how easy and useful it was. The other guy did his own thing and didn't use VCS; his stuff was scary. His code was a disaster waiting to happen if it stopped building, or if we lost his laptop.
Which is to say, VCS is an underpinning for CI & CD, DR, and is stupid easy to get the init setup done. Just about any codebase - yeah. Even if that means doing git locally and then throwing it away.
It is one thing to choose not to use all the tools, another to be unable to use them, and yet another to be unwilling to learn them out of prejudice.
Second response after having thought about your question a bit longer, re: "any code base"
First, VCS is a generic term. Saving files to a backup is a VCS; zipping and sending files via email is a VCS.
What kinds of system use VCS as their underpinnings? Build systems, disaster recovery systems, deployment systems, more... Without VCS, you can't have those systems, and which code bases require none of those systems?
Perhaps we are talking mostly demo projects, small ad-hoc scripts. Would you posit other examples perhaps? My perspective is that it is so trivially damn easy to get a VCS set up locally, even a distributed one, that while - yes - you can live without it - in what cases should you really be living without it? Particularly in light of all the downstream dependencies of a VCS.
Maybe we disagree on the ROI of VCS? I can understand someone not building a CI/CD pipeline because it is kinda costly and maybe won't be used enough to justify it. With Git (VCS), there was a time period where I was new to it and was corrupting files left and right. Getting past that, the cost to set up & use is absurdly tiny, my daily workflow has had no issues for years now. Perhaps there is a difference in perspective there? That you view the ongoing cost to be really large? So much so, that a cat walking across your keyboard is so rare and the daily pain so large, that you have a different perspective on the ROI?
It’s a fair point. At the same time, I think it’s pretty important to cover VCS at some point in college. My CS degree included a couple classes with projects where we did project management, code review, VCS, working with stakeholders, design, etc. —- most of that is just part of life in programming. It’s good to at least touch on it in college.
What about containers? These are now pretty ubiquitous too. Should containers also be taught? Package management? Document management systems?
I feel like there's a line where we expect some on the job ramp up. Version control was not ubiquitous 20 years ago even if it technically existed in some early form. We honestly weren't really using SVN/CVS much back then and git wasn't invented 20 years ago. Containerization is another example where it was not ubiquitous even 10 years ago.
What if some new technology becomes fairly ubiquitous across the industry? There's some line where you need to accept people won't be taught this in a CS degree. There's a reasonable assumption that you'll learn on the job and also continuously throughout your career.
Dating myself here, but my Sun Microsystems-sponsored university did have all kinds of VCS. Not just the FOTM stuff but also weird academic toys.
>Should they also be taught containers?
Yes, because that's how things are done today, while 20 years ago they might have all ssh'd into the same server running the development environment, and 30 years ago they would have all sat in the same computer lab. And 'taught' in the sense of pointing them to a few good documentation resources to get them up to speed while using a provided image.
And if they end up with an actual degree they should have covered both hypervisors and containers in an operating systems course.
VCS is an applied project management and communication system in the context of software engineering. Both project management and communication are disciplines.
CS grads don't need to know VCS, but software engineers do.
I would say that you can put physical computers in that same category. It is unnecessary for a good computer scientist to even touch a computer to acquire the scientific skills required to call yourself a computer scientist.
In my uni, "Software Engineering" was the one class of the CS program that had nothing to do with programming. It was about UML, SOLID, DRY, KISS, agile, scrum and more. Useless class that was mostly a waste of time.
On the other hand, we had classes like Operating Systems, Computer Networks and more where we had to write things in C and assembly.
It teaches you how to build practically functional and maintainable software instead of the prototypes that are enough to pass other classes. It's one step beyond just programming.
There is some truth in that but mostly I would disagree. A lot of what I mentioned are ideas/concepts that usually came from one or more "guru-types". They may have been programmers, managers, or other.
They're not necessarily wrong but the issue I have is that, in science what should be taught is to question everything and see if it's actually correct and verifiably so. With a lot of these concepts this is seemingly never done. It also comes from a specific period where everything happens to be quite tightly bound to Java (UML, Clean Code etc. especially). That doesn't mean that I can't learn something from it. But it is almost a little hypocritical when you are taught the importance of abstracting and generalizing but then the tools given themselves are not even general enough to apply to anything besides enterprise Java.
CS is an underpinning to software engineering. Knowing how code is translated to assembly to then machine language is helpful to engineering. It is helpful to know how to analyze algorithms, to have studied and implemented algorithms.
Though.. in software engineering, being able to write well is perhaps the single most important skill. Communication, documentation, very important.
Are we going to then say English majors are the best programmers? If they later become dev managers or directors, then perhaps. Does a person learn English really well by also learning Latin and the origins of English and its great works? I believe yes, and that is why I think knowing the theory and underpinnings of CS is very useful foundational knowledge. Required? No, but nor is touch typing or anything else that makes a person extremely competent.
So where should people who want to learn the discipline of software engineering go? It’s definitely what most CS majors are actually there for, and there’s no separate software engineering major at (almost) any liberal arts colleges. I get that saying “that’s why it’s CS not SE” is the hip thing to say when this comes up, but it doesn’t answer the underlying issue, which is that people want a mix of theory and job training, and right now they’re getting almost entirely theory.
> Computer Engineering or any of the other legitimately useful CS degree paths instead of CS itself.
> At least at the school I went to, CS was kind of a joke for undergrad.
> The students that cared about CPU design, low level software (firmware, drivers, kernel, embedded), or robotics (including computer vision, etc) all went computer engineering.
> The students that cared about cryptography or the formal maths side of computing all went to the mathematics dept in their applied discrete maths or applied computational mathematics degree paths.
> The students that cared primarily about high performance computing or applied computing in general (but didn't go one of the aforementioned routes) went through the computational modelling and data analytics program.
> And the students that wanted to learn CS for the purpose of game design or creative arts had their own program within the school of arts (can't remember the name).
> So out of the students who were interested in computing that went to my uni for undergrad, the ones that were left in the CS department were those who were told "get a CS degree for lots of money", those that didn't bother researching any other programs, and those who wanted to be web devs or enterprise java/c# devs.
But if they aren’t teaching Git, are they at least teaching about source control in general? What it does, how it works, how it should and shouldn’t be used. That would be pretty useful.
How can you graduate from a CS program without having had to learn git somewhere along the way? Do you never collaborate on any project? Do you never check out a repo or make changes to it?
Here's a thought. Imagine another new way of doing things quickly becomes pretty much ubiquitous in programming (containers may be a good example). For those who graduated long ago and thus didn't learn that new thing in university: are you seriously concerned about today's graduates not being rote-taught that particular technology?
Git isn't even 20 years old. I graduated before that. I learnt the basics of version control in a day and searched to unblock myself when needed without fuss. It's not a particularly big deal.
When I read things like this, I'm extremely concerned and embarrassed for the people demanding new graduates be rote-taught specific tech stacks. That's the real concern here. It's projection from having so little faith in their own ability to learn; they can't even see that something like git is just one of the many things that need to be learnt on a new job.
>When I read things like this I'm extremely concerned and embarrassed for the people demanding new graduates be rote taught specific tech stacks.
I think that's too narrow of a reading. There are some foundational practical tools and concepts that are broadly applicable to all computing tasks: version control, command line file-and-folder navigation, quick-and-dirty scripting. These don't (and shouldn't) need to be "rote-taught." But they can absolutely be integrated into existing coursework so students learn how to use them in context. In a data structures class? Run the lab work in Linux and C++ with basic makefiles. Have a compilers course? Offer the skeleton code for the parser as a git repository and submit via merge request.
> the people demanding new graduates be rote taught specific tech stacks
If people were graduating from college with a CS degree and didn't know how to use a keyboard and mouse, then yes, I'd advocate for them to be "rote taught" the basic keyboard and mouse skills they should have learned in elementary school.
It would show that something is going extremely wrong if the most basic skill set isn't known by people whose entire field of study involves computers.
I had a strikingly similar experience at my own university, and took things into my own hands somewhat by teaching a free, basic Git course each semester.
Yep. When I studied CS in the early '00s, the lecture material was only about programming. For everything else you were on your own.
The labs all had Solaris machines. Most students had never seen Unix or a terminal before in their lives. The instruction on how to use it was extremely minimal.
There was a very wide gap between the students who were huge nerds who already knew, or learned in their own, and those who didn’t. There wasn’t even YouTube, or Stack Overflow, or anything much to help in those days.
The biggest difference was between the students who had to physically go to the lab to do their projects, and the others like me who knew SSH existed, and how to use it, doing our projects from the comfort of the dorms.
I was a TA during that time. Some of the students coming in weren't familiar with concepts like files and directories or desktop UI metaphors, because they were used to mobile OSes or, in rare cases, had no actual exposure to computers at all. But because that last case used to be the rule, and traditionally students mostly used the provided thin-client labs and Solaris servers, the basic intro to actually using a keyboard and the provided hardware to get things done didn't change much, even when they switched from Solaris to Linux on x86.
The iPhone came out in 2007, the ugly brown android in 2008 or 2009, what mobile OS were students using during the time period mentioned in OP’s comment?
Yeah, you're right. I'm off by a few years. Early 2000s was when these Sharp personal assistants came out. And some students would have been exposed to Windows 98 or XP and sometimes IRC or ICQ but on average they didn't have any actual computing experience apart from maybe playing games or chatting.
That was a bit before my time so they would have had to cope with actual UNIX-like CDE on Solaris until they figured out how to access the nicer Linux servers that had KDE.
But in those days, thin clients with 24" screens were actually pretty nice compared to what students actually owned. Even on Solaris with Mozilla and CDE.
The actual teaching material didn't change that much, though, because the university had essentially taught the same sh course (here's LaTeX, here's man, here's how you install whatever you need to ~/bin) since the late 1980s. It didn't matter if a professor wanted to use C++ or Modula-2 or Pascal or Java, because every student got that crash course in how to actually use the faculty-provided computing pools.
First semester homework and midterm tests ensured that they were able to start their IDE or interpreter and knew how to compile their homework on the right architecture, how to search for documentation and how to actually use a keyboard to submit their math homework. The beauty of that approach was that it allowed the 10% or so who were really interested in learning more access to university resources while also ensuring that no math-cs double major graduates without at least a modest grasp of tools that were state of the art in Knuth's days.
I did not learn any git from my CS degree, so I'm not surprised at all. Hell, the source control used for the first part of my degree wasn't git, and I didn't graduate that long ago. I don't even really remember what prompted me to teach myself git, but everything I learned was from blog posts and stack overflow.
CS degrees don't teach software engineering. They teach computer science. But do I think they should teach source control basics if the degree involves a significant amount of programming? Yes.
I think every CS undergrad (or self-taught programmer) should run through the Harvard "CS50x" course, and supplement it with the MIT "Missing Semester" course. This way you're guaranteed to have a bit of C, clang, make, Python, git, UNIX shell command, and GitHub understanding by the time you enter the industry as a programmer. I wrote a little guide to how to do this here:
The guide stemmed out of the fact that I saw questions related to all this from beginner programmers, even those with CS degrees. I think one of the issues is that CS departments (including the one I went through at NYU years ago) think some of these things are sort of like "implementation details" of computing/programming, so no one course ever focuses on the topics in a cohesive way. They just sort of expect you to pick it up by osmosis. When I was an undergrad I supplemented by working on GNOME/GTK open source projects, which gave me nice exposure to UNIX tooling, version control, issue trackers, as well as compilers, C, and Python.
(Funny enough, we did have a course on shell programming at NYU way back when, but it was only because the author of ksh, David Korn, was a professor!)
Source control should be taught as a component of any "software engineering" course. Any class that teaches project management, testing, and delivery, and does NOT teach source control, is missing something of vital importance.
most mathematicians and theoretical physicists don’t know what the business end of a spanner is, either.
Mechanical engineers have a fighting chance at knowing, and tradie apprentices assuredly do.
This is less surprising simply because those disciplines and their relative abstraction hierarchies are older, more well understood, and more clearly delineated. CS is less than a century old.
I would rather work with somebody who uses SourceTree and cleans their code of unnecessary, irrelevant, or outright wrong changes before committing, than with somebody who knows three commands (git add ., git commit -m "some changes", git push) and repeats them without thinking the moment the code on their side is "finished", thereby also committing changes that are completely irrelevant to what they were working on.
Yes, so much this. Unnecessary changes often get pushed because people blindly run these commands. I always use SourceTree to check my changes before pushing.
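For what it's worth, the "review before you commit" habit being described doesn't require a GUI at all; plain git supports it. Here's a minimal sketch in a throwaway sandbox (the /tmp paths and demo identity are placeholders):

```shell
# Throwaway sandbox so the commands are safe to try; all paths are placeholders.
rm -rf /tmp/review-demo && mkdir /tmp/review-demo && cd /tmp/review-demo
git init -q .
git config user.email demo@example.com
git config user.name "Demo User"
printf 'feature work\n' > feature.txt
printf 'stray debug output\n' > scratch.txt

# Review what actually changed before staging anything.
git status --short

# Stage only what belongs in this commit (use `git add -p` for hunk-level review).
git add feature.txt
git diff --staged --stat        # final check: exactly what will be committed
git commit -q -m "Add feature notes"

# scratch.txt stays untracked instead of riding along in the commit.
git status --short              # prints "?? scratch.txt"
```

The key difference from the blind `git add .` workflow is the explicit review step (`git status`, `git diff --staged`, or `git add -p`) between editing and committing.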
"its so easy to use!" tho, right? Why should one need formal education to pick it up?
Git is like Autoconf for a lot of problems: utterly unnecessary but often included by default because people have been told "best practice" and never thought to ask themselves if that was true in their case.
At first a person follows a best practice because they don't know anything else.
Next, they learn why something actually is a best practice. That, eventually, is also how they learn when something is no longer the best practice: they learn when to break it, and in which specific situations it does not apply.
Not being able to rollback or view file history are major handicaps. It is like not being able to dribble a basketball with your left hand. It is a best practice to be an ambidextrous dribbler. CS is young, some best practices are opinion, others are tools that you should know about in order to achieve basic competency.
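To make the "rollback and file history" point concrete, here's a minimal sketch of both operations in a throwaway repo (paths and identity are placeholders; `git restore --source` needs a reasonably recent git, 2.23+):

```shell
# Throwaway sandbox; paths and the demo identity are placeholders.
rm -rf /tmp/history-demo && mkdir /tmp/history-demo && cd /tmp/history-demo
git init -q .
git config user.email demo@example.com
git config user.name "Demo User"

printf 'v1\n' > notes.txt
git add notes.txt && git commit -q -m "First draft"
printf 'v2\n' > notes.txt
git commit -q -am "Second draft"

# File history: every commit that touched this file.
git log --oneline -- notes.txt

# Rollback: restore the file to how it looked one commit ago.
git restore --source=HEAD~1 notes.txt
cat notes.txt                    # prints "v1"
```

Note that the rollback here only changes the working tree; the history of both drafts is still intact in the log, which is exactly what makes these two capabilities safe to lean on.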
On the other side of the spectrum, the standard entry point to a CS degree here in Finland is taking https://programming-23.mooc.fi/ , which IIRC is entirely based around automated tests that run on your Git repo. A commit is an assignment turn-in, and you get a generous number of tries each time to get it right. So you won't get around learning at least add, commit, and push. And yes, you do get course credit for this, although it is not 100% remote (you need to take your exams on campus, though it can actually be at any university here).
I've worked with hundreds of CS grads, and the worst programmer I ever came across (as far as readability of his code) had a master's degree in CS from Stanford.
Similar experience here, too. Virtually every time I have worked with, trained, or reported to someone with a CS degree from Stanford or Harvard, they have approached their work with a narrow mind and little empathy for others. This has been true in the military, public sector, and the private sector. I don’t know what it is about these places.
I read something a while back that did color my understanding of folks somewhat, and that was that there are a lot of people who come into the technology field for the money, not necessarily because of their lifelong interest in tinkering with computers. That to me is nuts, but I guess it’s to be expected. Maybe the sample size of my experience with these folks is overrepresented with people who are in it for the money and not curiosity and creativity. I don’t know.
I would look more at their undergrad; a master's doesn't really mean much. It's a short program and mostly focused on more advanced topics, not standard coding skills.
It could be they are working at a level higher than you understand.
Not being snarky, but the elite schools are significantly advanced.
Imagine learning algorithms from Knuth (yes, I know he hasn't taught undergraduates in 50 years). Looking at the results after being taught by Dr. Smith at Generic State University, you might be unable to recognize the genius.
In a professional setting, your primary concern is generally to produce work that's understandable to your coworkers, whatever that means. Just getting the computer to do what you want is generally not that hard. In that regard, "genius" in programming is more like Feynman turning complicated calculations into simple pictures, and less like von Neumann instantly calculating the infinite series in the fly and train puzzle.
Are you trying to defend a software engineer who is bad at writing readable code by... bringing Knuth to the table?
The same Knuth who published "Literate Programming"? "Instead of imagining that our main task is to instruct a computer what to do, let us concentrate rather on explaining to human beings what we want a computer to do"?
I have worked with people that have a command of a given programming language, and algorithms such that it appears to be magic to me. (C++ is common for this problem).
So we share personal experiences now; okay, let me share mine.
Once, one of my teammates committed an extremely complex piece of code that I wasn't able to understand. As I expected, we soon started getting bug reports. Finally, the original author couldn't fix it properly, so I threw away the code and rewrote it from scratch. I made it simple, readable, and, more importantly, debuggable. And it was correct.
So what? Your personal experience says that you are not the smartest person in the world, and my personal experience says that magic is magic — it doesn't work properly in the real world.
It's easy to build "magic" code that's hard to understand. You can just run any readable piece of code through an obfuscator. Still, no one believes that an obfuscator is genius. If you see "magic" code that, for some reason, truly works, it means that the author did a ton of research and forgot to put that into comments — i.e., obfuscated a code. If, even with comments, it still looks like magic, then the author avoided doing a self-review & simplification of the code. And well, sometimes, even after that, code can still be magic... if we have chosen the wrong tool for doing the job (does it sound like a depiction of genius?).
Every college student's "free" time is different. Whether sports, work, clubs, or drinking, people fill their time with what they have to do or what interests them.
I was lucky that my work was mostly free time to further play with and learn computers, OSs, and software. I learned networking with Perl, Java in my free time, and lots of Solaris and AIX.
I didn’t spend any time on VCS, but this was back when RCS and SCCS were dominant.
Way back in my matriculation, I saw the same behavior but written in terms of CVS, even. As some other posters have written, CS, as it is purely taught, doesn't often touch on the realities of software engineering and delivery.
IMO, it should. An education without any relation to practical skills just makes it that much harder to leverage what you learn.
"And the number of CS grads who don't even know basic MSDOS commands like ASSIGN is equally astounding."
Such hubris in that original statement! As if Git is the be-all and end-all of all computer programs. Yes, Git is important. No, it's not the only version-control program in the Universe.
It is more than just that. I often recommend MIT's Missing Semester of Your CS Education https://missing.csail.mit.edu/ to people who aren't familiar with some of the topics we use at work.
I had teachers who wanted code assignments on CD-ROM. No joke, I graduated undergrad in 2014. I had one professor who used Fossil, which was an outlier (I actually still kinda like Fossil).
I feel like this is being blown way out of proportion. Learning git is never the bottleneck to becoming a software engineer, and it doesn't take longer than an hour to grasp the basics.
CS majors should learn it just by messing around with projects, but I don't see why an otherwise great candidate couldn't learn it very quickly
This is the nuance that people aren't able to understand anymore - something happened when the Internet came out, and we've never been the same.
The OP was just surprised that people don't know git, and indicated that he wouldn't hire a junior engineer who didn't know it. But there's very likely nuance to this, and I don't think one person's personal preference necessarily needs to be discussed and debated extensively on Twitter, HackerNews, etc.
In my opinion, git is a very popular tool that lots and lots of people use, and it only takes 15-30 minutes to learn the basics. For this reason, I think it is fair to be surprised that someone doesn't know it.
It's worth noting that the person who said that they'd only hire junior developers who know git isn't the President of the United States or anything, and can absolutely make their own hiring decisions.
It's perfectly reasonable to make your own hiring decisions, IMO, and asking people to know git, or the fundamentals of version control seems totally fair, IMO.
If people are willing to spend a few weeks solving leetcode problems, or answering mock interview questions, I feel like they could absolutely spend 15-30 minutes learning how to use git.
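As a rough illustration of how small that "15-30 minutes" surface area really is, the entire day-one workflow fits in a handful of commands. This sketch uses a local bare repository as a stand-in for a hosted remote like GitHub, so it runs fully offline; the /tmp paths and demo identity are placeholders, and `git init -b` assumes git 2.28+:

```shell
# Throwaway sandbox; the bare repo stands in for a hosted remote.
rm -rf /tmp/origin-demo.git /tmp/work-demo
git init -q --bare -b main /tmp/origin-demo.git
git init -q -b main /tmp/work-demo
cd /tmp/work-demo
git config user.email demo@example.com
git config user.name "Demo User"
git remote add origin /tmp/origin-demo.git

printf 'hello\n' > readme.txt
git add readme.txt                 # stage the change
git commit -q -m "Add readme"      # record it locally
git push -q -u origin main         # share it with the "remote"
```

Everything beyond this (branching, merging, rebasing, bisecting) is learnable on the job; the point is that the entry cost for a candidate is genuinely tiny.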
Whether CS grads know this or that tech is irrelevant.
What matters is whether they are curious and how quickly they can teach themselves this or that tech.
And besides, learning the basics of git should be an afterthought for someone being hired for a programming job!! Come on.*
* Not gatekeeping here, I'm just trying to say that in the spectrum of tasks an entry-level programmer will need to get good at, git is a minor footnote. You can generally learn 99% of what you'll need to do in a short period of time, unless your team is doing exotic stuff, in which case they should stop doing that.