Why Fast Data Is Hard: Top 9 Challenges Ranked by 2400 Developers


As big data ecosystems grow larger and more complex, let's take a look at what devs thought were some of the toughest challenges to deal with.


In The Need For Speed survey, we asked over 2400 developers what’s hard about Fast Data through the lens of the complete project lifecycle: Design, Build, and Run.

The responses were roughly split between the build and run phases of a project. The design phase, however, appears most problematic, with choosing the right tools ranking as the top challenge.

Across the entire project lifecycle, here are the top 9 challenges as ranked by over 2400 developers:

  1. 16% - Choosing the right tools and techniques
  2. 13% - Knowing how to write robust and performant applications
  3. 12% - Integrating and managing the tools and techniques chosen
  4. 12% - Scale / operational complexity for these new applications and systems
  5. 12% - Debugging Fast Data systems
  6. 10% - Monitoring our environment
  7. 10% - Security and compliance
  8. 8% - Integrating data from disparate sources
  9. 7% - Dealing with continuous streams of input data

The biggest hurdle surfaces early in the software development lifecycle: choosing the right tools and techniques to build continuously streaming applications correctly. Yet in adopting Fast Data, respondents appear more confident working with disparate data sources and continuous streams of input than operationalizing their systems in production. Integrating, scaling, debugging, and monitoring all pose challenges for developers.


Published at DZone with permission of Oliver White, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
