The increasing accessibility of programming languages and development tools has expanded the talent pool available to companies by what seems like orders of magnitude over the past ten to fifteen years. I don't think anyone is going to argue that point, aside perhaps from the actual order of magnitude.
The Internet makes looking up and learning almost any language a snap.
(In fairness, the Internet has been perhaps the biggest catalyst for making all knowledge more accessible.)
Open source has made it possible to collaborate and develop at a faster rate.
The list goes on.
And now companies both big and small have access not just to those with Computer Science degrees but to those with natural, demonstrable talent who historically might not have been considered.
The Double-Edged Sword
But there is a hidden danger in this. Sure, it may not be the end of the world, but it certainly underscores one of the side effects of making knowledge "too" accessible. Perhaps you have had an exchange like this while interviewing a prospective developer:
You: "So you have listed here under your 'skills' that you know <insert language/tech here> which is vital to this job. Tell me about your experience with it and some projects you've tackled with it."
Prospective Hire: "Well, I'm just learning it. I started learning a month ago."
You: "Oh, I see. Well, how about <insert other language/tech here>?"
P.H.: "Yeah, just learning that one as well."
Is It Such a Big Deal?
In an attempt to catch the eye of those screening resumes, more and more people are listing languages and technologies with which they have minimal experience, counting on the fact that they can "just Google it" and still perform on the job.
Now, this is less of an issue when the skill being learned is merely a "nice to have". But with alarming frequency, it is happening with technologies that are central to the posted job.
This used to be primarily the domain of developers fresh out of school ("hey, how else are we supposed to break into the industry?"); however, with new and increasingly useful technologies arriving all the time, even seasoned developers are resorting to this tactic.
Of course, these situations, while annoying, are easily caught and dealt with by asking a couple of simple questions (for candidates who aren't quite so forthcoming, see this article for some strategies you can use). So what's the big deal?
Maybe it really isn't a big deal at all.
"State of the Art"
While it is frustrating for companies to have to wade through such interviews, the situation itself is a sign that technology has permeated the public at large to the point where anyone with an idea can, with a little gumption, hard work, and an internet connection, see that idea brought to life without a post-graduate degree or 30 years of programming experience. Hobbyist programming is indeed the state of the art in several respects.
Think back to when compilers were prohibitively expensive. Think back to when learning how to program required several books that were long, dry, and not cheap. I can remember that very situation in the 1980s, which improved moderately by the 1990s, which gave way to the deluge of knowledge and collaboration that exists today.
Perhaps a question or two should be posed: What is the next step for programming languages and technology in general? What is the next recognizable hallmark for the advancement of software development?
What do you think? About the "next big step"? About the flood of people who want to be professionals now but are "still learning"? Leave a comment below and let us know!