
The Journey Towards Autonomic Computing Starts With the Mainframe


As we take our first steps with this series, which considers the path we're taking toward self-managing computing, we look back to the origins of the cloud — mainframes.


We start our tale in the middle of the journey. Today, we have seen computing evolve from its earliest incarnations to platforms like IBM Watson and Tesla Autopilot: self-learning, self-driven, self-organizing systems. They feed on real-time information. They feed back decisions. And they offer a vision of elevating the way we consume IT.

Rewards aren’t without risk. Sadly, we have also seen an early fear of AI realized in the tragic loss of life involving Tesla Autopilot. This should be viewed as a difficult lesson, not as the end of the platform. If anything, it marks the start of the important journey towards the next steps in our autonomic computing evolution.

In the Beginning…

Big Data may seem like a recent trend as we enter a generation that has gathered trillions of data points, or chosen to retain unstructured data pools that can be mined for valuable information in a host of different ways. The truth of Big Data is that it doesn’t get much bigger than the early mainframe environments.

We could dedicate an entire series to the original data designs that were prevalent in the mainframe environment. Data aside, let’s just look at the mainframe conceptual design.

[Figure: conceptual mainframe design]

The centralized computing model was born. It delivered on the idea of a central system that provided compute, memory, and storage. Network access to the compute environment was provided through external controllers connected to what are often called “green screen” terminals, so named for their monochrome, text-only displays.

[Figure: green screen computer terminals]

Data access was provided in real time via terminals and terminal emulators. As data was entered through direct access, FTP (File Transfer Protocol), and other data interfaces, programs would be run to process it.
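
To make that transfer step concrete, here is a minimal sketch in Python of pushing a dataset to a central host over FTP using the standard ftplib module; the hostname, credentials, and dataset names are placeholders, not details of any real mainframe setup.

from ftplib import FTP

# Hypothetical host and credentials -- placeholders only.
HOST = "mainframe.example.com"

def upload_dataset(local_path: str, remote_name: str) -> None:
    """Push a local file to the central host for later processing."""
    ftp = FTP(HOST)
    ftp.login(user="batchuser", passwd="secret")  # placeholder credentials
    with open(local_path, "rb") as f:
        # STOR writes the file into the host's dataset space.
        ftp.storbinary(f"STOR {remote_name}", f)
    ftp.quit()

upload_dataset("daily_sales.csv", "DAILY.SALES.CSV")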

Processing was done in two ways, known as online and batch. Online processing ran the application immediately, whereas batch processing bundled work (thus the name) and triggered it on a time interval. Batch processing usually ran overnight.
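
As a rough illustration of the difference, this Python sketch processes records the moment they arrive in the online case, and accumulates them into a queue for a later scheduled run in the batch case; the record names are invented for the example.

def process(record: str) -> None:
    print(f"processed {record}")

# Online: each record is handled the moment it arrives.
def online(records):
    for record in records:
        process(record)

# Batch: records accumulate, then the whole bundle runs at once,
# typically at a scheduled window (e.g., overnight).
def enqueue(records, queue):
    queue.extend(records)

def run_batch(queue):
    while queue:
        process(queue.pop(0))

queue = []
enqueue(["txn-1", "txn-2", "txn-3"], queue)
# ... later, at the scheduled window ...
run_batch(queue)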

Processing the data consumed CPU capacity, measured in MIPS (Millions of Instructions Per Second). A chargeback model was often employed to make sure the cost of maintaining the large-scale centralized environment was distributed evenly, based on usage by each program. Programs were assigned to budget codes and departments.
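
A chargeback calculation might have looked something like this Python sketch, which prorates a fixed operating cost across departments by their share of CPU consumption; the cost and usage figures here are invented purely for illustration.

# Hypothetical monthly CPU usage per department, in CPU-seconds.
usage = {"FINANCE": 42_000, "PAYROLL": 18_000, "INVENTORY": 30_000}

MONTHLY_COST = 90_000.00  # assumed total cost of running the environment

total = sum(usage.values())
for dept, cpu_seconds in usage.items():
    share = cpu_seconds / total
    print(f"{dept:<10} {share:6.1%}  ${MONTHLY_COST * share:,.2f}")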

The OG Virtualization and Cloud?

Could this have been the dawn of the concept of cloud computing? In effect, it was: a centralized compute model in which you are charged for data storage, CPU usage, and processing time. It contains much of what would one day become the tenets of cloud computing.

LPAR (Logical PARtition) separation within the mainframe environment allowed us to divide the CPU, memory, and storage into subsets of the physical machine. LPAR isolation meant each partition could operate independently without impacting the other LPARs in the environment.
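
Here is a minimal Python sketch of the idea, with invented capacity figures: carve the physical machine’s resources into fixed, isolated slices, and verify the slices never oversubscribe the box.

from dataclasses import dataclass

@dataclass
class LPAR:
    """A logical partition: a fixed slice of the physical machine."""
    name: str
    cpus: int
    memory_gb: int
    storage_gb: int

# Hypothetical physical capacity of the machine.
PHYSICAL = {"cpus": 16, "memory_gb": 256, "storage_gb": 4_000}

def carve(partitions):
    """Check that the partitions fit inside the physical box; each runs
    in isolation, so one LPAR cannot consume another's share."""
    for key in PHYSICAL:
        used = sum(getattr(p, key) for p in partitions)
        assert used <= PHYSICAL[key], f"oversubscribed {key}"
    return partitions

lpars = carve([
    LPAR("PROD", cpus=8, memory_gb=128, storage_gb=2_000),
    LPAR("TEST", cpus=4, memory_gb=64, storage_gb=1_000),
])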

[Figure: LPAR separation]

We had both virtualization and multi-tenant environments in a large centralized model. It sure sounds familiar in that sense, doesn’t it?

Everything Old is New Again

You will see a common theme throughout this series as we discuss the evolution of computing models and platforms. It boils down to a simple statement I like to use:

History doesn’t repeat itself; it iterates on itself.

The mainframe platform provided much more than this, and still does. It’s an important start on the journey towards what we call autonomic computing.

Join us for our next post, which will cover the rise of the distributed computing environment alongside the centralized, mainframe-centric model.



