This post is in reply to Rob Ashton's post, go ahead and read it, I’ll wait. My answer to Rob's post can be summarized in a single word: BULLCRAP!
In particular, this statement:
It is impossible to be a good .NET developer. To work in a development shop with a team is to continually cater for the lowest common denominator of that team and the vast majority of software shops using .NET have a whole lot of lowest common denominator to choose their bad development decisions for.
Um, nope. That only applies to places that are going for the lowest common denominator. Going from there to "all .NET shops" is quite misleading. I could give our own example of building a high-performance database in managed code, which has very little lowest-common-denominator anything anywhere, but that would actually be too easy.
Looking at the landscape, I can see quite a lot of people doing quite a lot of interesting things at the bleeding edge. Now, it may be that this blog attracts a self-selecting crowd, but when you issue statements such as "you can't be a good .NET developer," that is a pretty big statement to stand behind.
Personally, I think that I'm a pretty good developer, and while I dislike the term "XYZ developer," I do 99% of my coding in C#.
Now, some shops have different metrics. They care about the predictability of results, so they will be extremely conservative in their tooling and language usage. The old "they can't handle that, so we can't use it" approach. This has nothing to do with the platform you are using, and everything to do with the culture of the company where you work.
I can certainly find good reasons for that behavior, by the way. When your typical product lifespan is measured in 5–10 years, you have a very different mindset than if you aim for a year or two. Betting on brand new stuff is dangerous. We lost a lot when we decided to use Silverlight, for example. And the decision to go with CoreCLR for RavenDB was made with an explicit exit strategy in case that sunk, too.
Looking at the directions that people leave .NET for, it traditionally has been to the green, green hills of Rails, and then it was Node.JS, although I'm not really paying attention anymore. That means that in the time a .NET developer (assuming they are investing in themselves and continuously learning) has spent mastering their platform and learning how to make it work properly, the person who left for greener pastures has had to learn multiple new frameworks and platforms. If you think that this doesn't have an impact on productivity, you are kidding yourself.
The reason you see backlash against certain changes (project.json coming, going, and then pulling disappearing acts worthy of Houdini) is that there is real value in all of that accumulated experience.
Sure, sometimes change is worth it, but it needs to be measured against its costs.