
To C, or Not To C: How Can We Best Teach the Coders of the Future?

A new HackerRank report reveals a striking discrepancy between the skills new college graduates possess and those most desired on the job market.

[Image: college graduates. Photo credit: Unsplash/Good Free Photos]

HackerRank is known for its invaluable research into the lives of those who make the tech world go round, and this year’s Women in Tech Report is no exception. Filled with numerous important insights into the skills and motivations of the youngest female developers entering the workforce, it is an absolute must-read for any hiring manager looking to bring on the best and brightest of this uniquely savvy generation.

But there is one piece of information that I find particularly provocative, and it impacts more than just the women in tech: There is a remarkable discrepancy between the skills desired by a majority of our nation’s employers and those actually possessed by new college graduates. In fact, nearly 60 percent of employers are actively seeking developers with JavaScript experience, making it by far the most sought-after programming language today. Only half of these incoming devs, though, can meet this demand.

What are they actually bringing to the table? Nearly 80 percent claim mastery of C, while 70 percent are fluent in C++, both of which are desired by only about 20 percent of employers. These are certainly not the only languages Gen Z developers have experience with; 69 percent do know Java and 61 percent can work with Python, but JS is desired by nearly 20 percentage points more employers than either of those languages.

So, how did this happen? Why are we not better equipping our new developers with the skills they’ll actually be using on the job?

It turns out, as I’m sure many of you reading this are aware, there are (at least) two distinct camps when it comes to the debate over the ‘proper’ way to train new developers: To begin with C, or not to begin with C.

Why Is C the Basis of So Many CS Degree Programs?

Many refer to C as “the mother of all programming languages,” and there is indeed good reason for that. As this Programming Hub blog post explains, “almost all popular cross-platform programming languages and scripting languages, such as C++, Java, Python, Objective-C, Perl, Ruby, PHP, Lua, and Bash, are implemented in C and borrow syntaxes and functions heavily from C.”

Because these languages share “similar operators, expressions, repetition statements, control structures, arrays, input and output, and functions” with C, the common belief is that subsequently learning them is a relatively easy endeavor, much in the same way that learning to speak additional Romance languages, such as French or Italian, is thought to be dramatically less difficult once fluency in one has been achieved.  
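To make that family resemblance concrete, here is a minimal illustrative sketch (the function name and values are my own, not drawn from any of the sources above): a counting loop written in JavaScript whose operators, for-loop syntax, braces, and return statement all come straight from C.

```javascript
// A counting loop in JavaScript. Swap `function` for a return type and
// `let` for `int`, and this is nearly line-for-line the C a
// first-semester student would write: same operators, same for-loop
// syntax, same braces and return statement.
function sumOfSquares(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i * i;
  }
  return total;
}

console.log(sumOfSquares(5)); // 55
```

On this level, the "learn C first and the rest comes easily" argument holds up: the surface syntax really does transfer.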

But this Romance language analogy also brings up another point: While learning one can indeed make it easier to then learn others, this certainly does not extend beyond the lexicon. The fact that I know French of course has no real bearing on my ability to learn a language like German or Arabic or Japanese. The rules simply do not apply across the board, which many newly minted devs realize in dismay when they attempt to make the transition to JavaScript on their own.

One Redditor put it this way: “JavaScript is multi-paradigm, which means OOP is there but only optionally so. The consequence is that when a university educated Java developer is tasked to modify or extend some JavaScript code they are painfully and hopelessly lost. First of all the OOP in JavaScript follows a different model, prototypes, opposed to the more common C++ class-based model.”
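To see what that Redditor means, here is a minimal sketch of the prototype model (the `animal` and `dog` names are my own illustrative choices, not from the thread): objects delegate directly to other objects at runtime, with no class declaration anywhere.

```javascript
// Prototype-based OOP: no class declaration in sight. `dog` has no
// describe() of its own; the call is resolved by walking dog's
// prototype chain up to `animal` at runtime.
const animal = {
  describe() {
    return this.name + " says " + this.sound;
  }
};

// Object.create() returns a new object whose prototype is `animal`.
const dog = Object.create(animal);
dog.name = "Rex";
dog.sound = "woof";

console.log(dog.describe());                        // "Rex says woof"
console.log(Object.getPrototypeOf(dog) === animal); // true

// ES2015 added a `class` keyword to JavaScript, but it is syntactic
// sugar over this same prototype chain, not a C++-style class model.
```

A developer trained exclusively on C++- or Java-style classes has no direct analogue for this runtime delegation, which is exactly the gap being described.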

There is also no empirical evidence suggesting one must intimately understand the origins of any language to be able to competently use its most recent form. I of course don’t need to speak Old English to be able to write the essay you’re (hopefully) enjoying right now. And devs don't have to learn C to be able to use Java, Python, etc., although many certainly do pick up a lot of its basic tenets through exploration of these newer languages.

And as the plethora of coding boot camps, not to mention on-the-job training programs, make quite clear, plenty of highly skilled developers exist these days who never learned C (or Java for that matter).

There are also plenty of developers who openly question the value of their four-year computer science degree. As Michael Menard shared with DZone via LinkedIn:

“If you are going the four-year route as a means to learn coding, you are going to be disappointed, at least I was. For me, I was less interested in theory and more interested in the practical applications of how to solve problems. So I spent a lot of time on Udemy and the like searching for specific technologies I was interested in,” he said. “In the end, I think it’s a mixture of traditional classes and personal projects via non-traditional means of learning that will make a good coder. If I wasn't tinkering on my own, the four-year degree would not have been enough to really understand how to solve real problems in code, but I CAN wax poetic for hours on measuring time/space complexities or the applications of a Turing machine.”

This distinction between programming theory and practical application is repeated often by developers, including in this post on Reddit. A dev who goes by the handle Hexigonz questions why he was never taught JavaScript in college when this is the skill he uses most on the job. Another Redditor’s assertion that the purpose of college is to “teach you the principles of programming, OOP, and basics of algorithms,” rather than “all possible programming languages,” does little to appease Hexigonz’s disappointment with the overall substance of his education:

“Overarching principles are very important to learn before diving deep into any language,” he acknowledges. “I just found that my program went particularly deep into Java. A strong focus on back end and server side programming will teach a lot of the principles necessary to pick up any language, but in this case I use JavaScript as a representative of the client side of the stack in general. I'd love to see more client side being taught in curriculums. Even our web development elective still teaches C# and web forms, which is viable, but definitely outdated.”

Hexigonz also inadvertently finds himself in the middle of a hot-button issue: his credibility as a developer is called into question merely for suggesting that JavaScript be used as a teaching language. One Redditor practically goes for the jugular, calling out his newbie status in the field. “Going on a hunch,” he says, “this is the problem.”

But after weathering a fairly lengthy diatribe against JS as a whole, with its “weird odds and ends, questionable design decisions, and outright errors in the language design,” Hexigonz sticks to his guns, explaining, “The issue is [not] whether the language is ugly or not; 97 percent of the web uses it now. It’s rough to learn, and probably even worse to teach, but every developer will touch it in his or her lifetime.”

And as the HackerRank report asserts a year later, Hexigonz is absolutely right about this prevalence.

But why does this animosity exist toward any language in the first place? The answer is unfortunately anything but simple: As with most things in life, the status quo likes to assert itself in times of challenge, and with the dizzying array of alternative educational routes for new developers these days, it’s no wonder so many academics are righteously pissed off.

Do they have a point? Maybe. I absolutely see the value in teaching the basics of engineering, but to argue that to learn any language other than C first is unwise, as this developer did on Quora, seems rather tone deaf in the world we live in today.

Higher ed is astronomically expensive and out of reach for a huge proportion of the population. Just look at what happened when the government tried to help more people gain access by making student loans more widely available. People are now strapped with so much debt that many will never be free of it in their lifetimes.

But there is another important truth here: People learn in lots of ways, and there is absolutely no proof saying that one way is superior to any other. There are just the biases we all (often unknowingly) carry around, which are informed by nothing more than our own experiences. Learning to code, though, is not a one-size-fits-all endeavor.

And it’s a good thing because we’re running out of developer talent. But thanks to programs like Code the Dream, a non-profit providing free intensive training to those from underprivileged backgrounds, and companies like Techtonic, whose apprenticeship program is not only training many of its own future employees but actually paying them at the same time, we’re beginning to welcome coders from all walks of life. And these new developers will never have to wonder if their four-year investment was really worth it.
