Emerging and Disruptive Technologies of 21st Century - What Are We Aiming For?
There has been a huge change in the software/IT industry in the last few years, and the industry has evolved in a way that forces everyone to be agile and adaptive to technological innovation. The plethora of innovative, emerging, and disruptive technologies has not only changed the dynamics of the market but also proved, once again, the fruitfulness of good and healthy research across the industry.
Is the IoT (Internet of Things) the future? Is that all we are working toward?
Please share your views on and experience with the latest revolution, and what you think will make a difference to the overall IT landscape.
NoSQL
The evolution of NoSQL is attributable to necessity more than to innovation. The worlds of social networking and eCommerce posed big challenges to traditional RDBMS systems from the concurrency, consistency, partitioning, performance, and availability points of view. NoSQL exists as an outcome of those challenges.
The rise of NoSQL may never match that of the RDBMS, because the concept of "Polyglot Persistence" (keeping multiple kinds of databases in a system, each chosen for a particular need) will prevail in the enterprise. Moreover, a lot of traditional RDBMS vendors (open source and commercial) are branding themselves as NoSQL vendors as well.
DevOps (Continuous Integration, Continuous Delivery)
Developers can't work alone; operations needs to be part of their job as well. Build and deployment should be integral to the development process, and delivery should take care of development and testing automatically. Both Continuous Integration and Continuous Delivery are essential.
The evolution of Chef, Puppet, Vagrant, and other such tools has solved many challenges in this domain.
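As one hedged illustration of what these tools automate, a hypothetical Vagrantfile that gives every developer the same VM, provisioned by Chef (the box and recipe names are made up for this example):

```ruby
# Hypothetical Vagrantfile: "vagrant up" builds the same environment on
# every machine, removing "works on my machine" from the CI/CD pipeline.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"          # illustrative base box
  config.vm.provision "chef_solo" do |chef|
    chef.add_recipe "build-essential"        # illustrative recipe name
  end
end
```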
Massively Parallel Processing - Vertica
Massively parallel processing (MPP) engines evolved to solve problems where traditional data warehouse systems could not cope. Vertica is one of the major players in this category.
Storm – Real-Time Complex Event Processing
How do you do real-time complex event processing? MapReduce is more of a batch/offline processor of big data, so what is the alternative? Apache Storm (a project that evolved at Twitter) is an answer for real-time data processing. A combination that many enterprises and products use is "Kafka with Storm".
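Storm models a computation as a topology of spouts (event sources) and bolts (processors). A library-free sketch of that shape follows; the class and method names are illustrative analogues, not Storm's actual API:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.BlockingQueue;

// Illustrative only: the same spout/bolt shape Storm provides, minus the
// cluster, stream groupings, and fault tolerance.
class SentenceSpout {
    private final List<String> source;
    SentenceSpout(List<String> source) { this.source = source; }
    // nextTuple() analogue: push raw events onto the stream.
    void emitTo(BlockingQueue<String> stream) {
        for (String s : source) stream.offer(s);
    }
}

class WordCountBolt {
    final Map<String, Integer> counts = new HashMap<>();
    // execute(Tuple) analogue: update rolling state per incoming tuple.
    void execute(String tuple) {
        for (String word : tuple.split("\\s+"))
            counts.merge(word, 1, Integer::sum);
    }
}
```

In a real deployment, Kafka would sit in front as the durable event log and the Storm spout would consume from it, which is exactly the "Kafka with Storm" pairing mentioned above.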
OSGi and Apache Felix
The evolution of the cloud has unlocked a lot of potential in the field of modular application development and deployment through standard specifications.
OSGi is the specification that solves this problem, and Apache Felix is a core reference implementation of it. Most of the major application server containers support OSGi, and Spring supports it too through its "Dynamic Modules" project.
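For illustration, a hypothetical bundle manifest (the bundle and package names are made up): the versioned Export-Package and Import-Package headers are what allow a container such as Felix to wire modules together dynamically.

```
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.greeter
Bundle-Version: 1.0.0
Export-Package: com.example.greeter.api;version="1.0.0"
Import-Package: org.osgi.framework;version="[1.5,2)"
Bundle-Activator: com.example.greeter.internal.Activator
```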
Apache Flume – Log management
Big data and analytics now play an important part in the industry, and the need to process multi-source log data across domains triggered the Apache Flume project.
Next generation Build & Release - Gradle
Ant was the prevailing tool for build and release until Maven took over the space. Both have good characteristics and features worth adopting, and the answer is Gradle, a best-of-both combination of Ant and Maven.
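A minimal, hypothetical build.gradle sketch of that blend: Maven-style conventions and dependency management expressed in a scriptable, Ant-like Groovy DSL (the testCompile configuration reflects Gradle 1.x/2.x-era syntax):

```groovy
// Maven's contribution: conventions, repositories, declarative dependencies.
apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    testCompile 'junit:junit:4.11'
}

// Ant's contribution: ad hoc tasks, defined inline instead of in XML.
task hello {
    doLast {
        println 'Gradle: convention from Maven, flexibility from Ant'
    }
}
```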
Front-end MVC and SPA - Angular and Backbone
Parallel and concurrent programming model - Erlang, Go, and Java 8
The rise of social networking has triggered the need for massive concurrency support, and a few languages have evolved mainly to solve this problem. Ericsson's Erlang and Google's Go are two languages focused on these concerns. Products like RabbitMQ and Riak are built on Erlang; interestingly, WhatsApp is built primarily on an Erlang stack.
To stay in the game, Java has also introduced new APIs to handle concurrency and has enhanced the existing collections packages in version 8.
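A small, self-contained sketch of two of those Java 8 additions: parallel streams, which split work over the common fork/join pool, and CompletableFuture, which composes asynchronous steps into a pipeline:

```java
import java.util.concurrent.CompletableFuture;
import java.util.stream.IntStream;

class Java8Concurrency {
    // Parallel stream: the range is partitioned and summed across cores.
    static int parallelSum(int n) {
        return IntStream.rangeClosed(1, n).parallel().sum();
    }

    // CompletableFuture: an async value transformed by a follow-up stage.
    static String asyncPipeline() {
        return CompletableFuture.supplyAsync(() -> "hello")
                .thenApply(String::toUpperCase)
                .join();   // block only at the very end
    }
}
```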
Functional Programming and Lambda Expressions
Functional programming has come back into focus, and the introduction of lambda expressions in languages such as C++ (C++11) and Java (Java 8) has put it in a strong position.
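As a minimal illustration in Java 8, the filter-and-transform logic that once required anonymous inner classes becomes a single expression with lambdas and method references:

```java
import java.util.List;
import java.util.stream.Collectors;

class LambdaDemo {
    // Declarative pipeline: keep short names, upper-case them.
    static List<String> shortNamesUpper(List<String> names) {
        return names.stream()
                .filter(n -> n.length() <= 4)   // lambda as a predicate
                .map(String::toUpperCase)       // method reference
                .collect(Collectors.toList());
    }
}
```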
High throughput messaging systems - Kafka, RabbitMQ, ZeroMQ
Messaging is not just for legacy applications in the enterprise; it now has to support modern-day transaction volumes (in the millions or billions) with high throughput and without hurting latency. The systems already built were good up to a point, but a new breed of messaging technologies and products was needed. The evolution of the AMQP specification has triggered an array of new messaging products such as RabbitMQ, and Kafka and ZeroMQ are also on the rise.
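Real brokers like RabbitMQ speak AMQP over the network; as a purely in-process sketch of the core idea they share (this is not the RabbitMQ client API), a bounded queue decouples producer from consumer and applies back-pressure when full:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Toy "broker": producer and consumer never meet; the bounded queue in
// between absorbs bursts and signals back-pressure when it fills up.
class TinyBroker {
    private final BlockingQueue<String> queue = new ArrayBlockingQueue<>(1024);

    // Returns false when the queue is full -- the back-pressure signal.
    boolean publish(String message) { return queue.offer(message); }

    // Consumer drains at its own pace, independent of the producer.
    List<String> drain() {
        List<String> out = new ArrayList<>();
        queue.drainTo(out);
        return out;
    }
}
```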
Responsive Web - HTML5 and CSS3
CSS3 and HTML5 have helped enhance the customer experience, and the need for multi-channel support has pushed web pages to become responsive.
SSD over HDD
The evolution of big data, bigger user bases, and concurrent/parallel processing has not only triggered new types of software but also forced the hardware world to change. The popularity of the solid-state drive over the hard disk drive has been on the rise.
Cloud Computing and SMAC
Cloud computing has been the major focus for the last few years and will surely dominate the IT world for the next few. The popularity of Amazon AWS and Microsoft Azure has given enterprises a strong reason to go to the cloud.
SMAC (Social, Mobility, Analytics, and Cloud) is the future.
Hot deployment - JRebel, Play framework
Fast coding and automatic redeployment of code into the container without a server restart have become essential in the new programming world. Several web frameworks, such as Play, support this feature inherently in their architecture without any additional tooling.
JRebel helps Java programs work that way, and Spring also provides a way to hot-reload beans in the container.
Hadoop and its family of technologies
Hadoop and its family of products (Hive, Pig, MapReduce, HBase, ZooKeeper, etc.) have addressed a whole new set of concerns around the processing and analysis of big data.
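The canonical Hadoop example is word count. The same map/shuffle/reduce shape can be sketched with plain Java 8 streams; Hadoop's real API uses Mapper and Reducer classes and shards this work across a cluster:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

class MiniMapReduce {
    // Map phase: split each line into words (conceptually, (word, 1) pairs).
    // Shuffle: group identical words together. Reduce: sum the counts.
    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+")))   // map
                .collect(Collectors.groupingBy(Function.identity(),  // shuffle
                        Collectors.counting()));                     // reduce
    }
}
```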
In-Memory Data Grid
Caching existed before as well; Memcached was a popular, high-performing cache for a long time. But the growth of big data and user volumes demanded a more robust, scalable, distributed, and reliable solution to support data stores spread across geographies. Hence the need for the "In-Memory Data Grid" (a grid being a distributed topology whose nodes communicate across data centers in multiple geographies).
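As a toy illustration of the grid idea, assuming nothing about any particular vendor's API: keys are hashed to a partition, and in a real grid each partition would live on a different node, with replication and rebalancing on top.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy partitioned cache: each key routes to exactly one "node" (a plain
// map here). Real grids replicate partitions and rebalance them when the
// cluster topology changes.
class MiniGrid {
    private final List<Map<String, String>> nodes = new ArrayList<>();

    MiniGrid(int nodeCount) {
        for (int i = 0; i < nodeCount; i++) nodes.add(new HashMap<>());
    }

    private Map<String, String> route(String key) {
        return nodes.get(Math.floorMod(key.hashCode(), nodes.size()));
    }

    void put(String key, String value) { route(key).put(key, value); }
    String get(String key) { return route(key).get(key); }
}
```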
OAuth
The need to share information through social channels (Facebook, LinkedIn, Google, Twitter, etc.) has thrown up a new set of authorization challenges across systems. OAuth has been working as the major enabler of these integrations.
The Rise of APIs
The so-called "app world" popularized by social sites and content distribution networks like Netflix has triggered a new concept of exchanging and sharing data between consumer and producer outside the boundary of the enterprise. SOA has a new challenge, and the "API" is on the rise.
The evolution of JSON, RAML, and Swagger has helped build standard specifications for how APIs should be communicated across participating vendors.
Companies like Apigee and Layer7 are pioneers in this area.
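For illustration, a hypothetical Swagger 2.0 fragment describing a single endpoint (the API and field names are made up); the point is that one machine-readable contract can drive documentation, validation, and client generation for every participating vendor:

```yaml
swagger: "2.0"
info: { title: "Orders API", version: "1.0" }   # illustrative API
paths:
  /orders/{id}:
    get:
      produces: [application/json]
      parameters:
        - { name: id, in: path, required: true, type: string }
      responses:
        "200": { description: "The requested order" }
```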
Microservices
SOA has a new look through "microservices", as opposed to monolithic integration approaches. The term is still new, and a lot of vendors are building systems and products around microservices.
Reactive Systems
Being reactive is a new line of advantage, and a new manifesto is on the horizon: let's be "responsive", "resilient", "elastic", and "message-driven". Many web frameworks, such as Play, conform to this standard.
Configuration Management – Git
The evolution of a new generation of configuration management tools such as Git, beyond conventional players like SVN and ClearCase, has changed the way teams collaborate to develop products.
Mobile development - Objective-C
The evolution of mobile development and mobile operating systems has given a huge push to the programming world. Objective-C has become one of the top languages in use for this purpose.
Alternate Protocol and Serialization
Google's Protocol Buffers are an example of how object serialization and deserialization can be enhanced to gain an advantage when passing data over the network.
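A hypothetical .proto schema (the message and field names are illustrative); from this single definition, the protoc compiler generates compact binary serialization code for Java, C++, Python, and other languages, with numbered fields keeping the wire format stable across versions:

```protobuf
syntax = "proto2";

message Person {
  required string name  = 1;  // field numbers, not names, go on the wire
  optional int32  id    = 2;
  repeated string email = 3;
}
```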
Revive the concept of IDL - Apache Thrift
Though CORBA is almost dormant now, its concept of the IDL (Interface Definition Language) was revolutionary for communication across languages. Apache Thrift is a similar implementation along those lines. Many NoSQL vendors support a Thrift client so that data on the server can be accessed from multiple languages (Java, Python, Ruby, C++, etc.).
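A hypothetical Thrift IDL sketch (the struct and service names are made up); as with CORBA's IDL, one definition yields client and server stubs in many languages:

```thrift
struct User {
  1: required string name,
  2: optional i32 age
}

// The compiler generates a UserStore client and server skeleton for each
// target language from this one service definition.
service UserStore {
  User fetch(1: string name)
}
```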
Opinions expressed by DZone contributors are their own.