
The Rise of Developeronomics

by A. Jesse Jiryu Davis · Dec. 18, 2011

I have some thoughts about Venkatesh Rao’s Forbes article, “The Rise of Developeronomics”. The article, in brief, argues that “software is now the core function of every company, no matter what it makes,” and that, as “software eats the world,” maintaining relationships with excellent software developers is a prerequisite for survival for all firms.

One of the article’s insights is, “while other industries have come up with systems to (say) systematically use mediocre chemists or accountants in highly leveraged ways, the software industry hasn’t.” This is certainly true, and the most successful firms realize it. Again and again, I’ve worked for companies that try to save money, or accelerate development, by adding teams of mediocre (typically offshore) developers to a staff of great hackers. It almost never works, either because very few managers know how to use mediocre developers efficiently, or because it’s impossible.

But I’m not sure that’s always going to be so. I kept thinking as I read this article, “each year we write software that prevents us from having to write more software.” WordPress means we don’t have to make CMSes any more. Hadoop means we don’t need to spend months writing ETLs like we did a few years ago. MongoDB makes it much easier to create and deploy a scalable data store. The list goes on—won’t there come an inflection point when we’ve made so much software that the need for new code levels off?

And yet each time we discover a new thing software can do (mobile apps, social networks, big data, …) it accelerates the growth of demand for software. I think this article might be roughly right about the trends for the foreseeable future. Carlo Cabanilla pointed out to me on Facebook that “as more and more software exists to solve common problems, Ops will become more and more valuable because you’ll always need a scalable, cost efficient way to manage these things. You can have the best app in the world, but if it’s always going down, it’s like it doesn’t exist.” He should know, since he works at DataDog, which is trying to solve this problem.

Ken Young, who’s solving big-data problems over at Mortar Data, thinks that “until the world has been faithfully modeled in software to the last degree, there will be a new need to predict and manipulate the real world in all its complexity. And since we are no closer to understanding the world than we were in Newton’s time (or so it seems)….”

Right, and even if we did model the whole world, we’d need another system to model all the software we’ve written so we know whether it’s running correctly, and so we can keep it running correctly. And as Turing proved, we can only get asymptotically close to that goal.
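Turing's result can be sketched in a few lines. This is a minimal illustration of the diagonal argument, not anything from the article; the names (`make_paradox`, `always_loops`) are my own:

```python
def make_paradox(halts):
    """Given any claimed halting checker, build a program it must misjudge.

    `halts(prog)` is supposed to return True iff prog() would halt.
    """
    def paradox():
        if halts(paradox):
            while True:   # checker said "halts", so run forever
                pass
        return "halted"   # checker said "loops", so halt immediately
    return paradox

# Any concrete checker is wrong about its own paradox program. Take a
# checker that always answers False ("loops forever"):
always_loops = lambda prog: False
p = make_paradox(always_loops)

print(always_loops(p))  # False: the checker predicts p never halts...
print(p())              # "halted": ...yet p halts, refuting the checker
```

The same diagonal move defeats every candidate checker, which is why a complete "system to model all our software" can be approached but never finished.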


From http://emptysquare.net/blog/the-rise-of-developeronomics/
