Serverless Architectures: The Evolution of Cloud Computing
As serverless architecture explodes over the cloudscape, here's MongoDB's take on going serverless and how they've adapted to the changing times.
Since the advent of the computer, building software has been a complicated process. Over the past decade, new infrastructure approaches (IaaS and PaaS), software architectures (SOA and microservices), and methodologies (Agile, Continuous Delivery, and DevOps) have emerged to mitigate the complexity of application development. While microservices have been the hot trend over the past couple of years, serverless architectures have been gaining momentum by providing a new way to build scalable and cost-effective applications. Serverless computing frees developers from the traditional costs of building applications: servers and storage are provisioned automatically, the infrastructure is maintained and upgraded for them, and they are charged only for the resources they consume.
This blog discusses what serverless computing is and key considerations when evaluating a database with your serverless environment.
What Is Serverless Computing?
Serverless computing is the next layer of abstraction in cloud computing. It does not mean that there are no servers, but rather that the underlying infrastructure (physical and virtual hosts, virtual machines, containers), as well as the operating system, is abstracted away from the developer. Applications run in stateless compute containers that are event-triggered (e.g., a user uploading a photo, which triggers notifications to their followers). Developers create functions and depend on the infrastructure to allocate the proper resources to execute each function. If the load on a function grows, the infrastructure creates copies of the function and scales to meet demand.
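As a sketch of what such an event-triggered function might look like, here is a hypothetical Node.js handler for the photo-upload example. The event shape and the `getFollowers`/`notify` helpers are illustrative stand-ins, not any specific provider's API:

```javascript
// Hypothetical event-triggered function: a photo upload event fans out
// notifications to the uploader's followers. The handler is stateless --
// everything it needs arrives in the event payload.
const handler = async (event) => {
  const { userId, photoId } = event;            // payload from the upload event
  const followers = await getFollowers(userId); // app-specific lookup (stubbed)
  await Promise.all(followers.map((f) => notify(f, photoId)));
  return { statusCode: 200, notified: followers.length };
};

// Stubs so the sketch is self-contained; a real application would query a
// datastore and call a push-notification service here.
async function getFollowers(userId) { return ['alice', 'bob']; }
async function notify(follower, photoId) { /* send push notification */ }
```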
Serverless computing supports multiple languages, so developers can choose the tools they are most comfortable with. Users are charged only for the runtime and resources (e.g., RAM) that the function consumes; thus there is no longer any concept of under- or over-provisioning. For example, if a function runs for 500 ms and consumes 15 MB of RAM, the user is charged only for 500 ms of runtime and for the 15 MB of RAM used.
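The pay-per-use arithmetic can be sketched as follows; the per-GB-second rate below is a hypothetical placeholder, not any provider's actual price list:

```javascript
// Illustrative billing math for pay-per-use pricing. Providers typically
// meter in GB-seconds (memory allocated x runtime); the rate here is a
// made-up placeholder for the sketch.
const PRICE_PER_GB_SECOND = 0.0000166667; // hypothetical rate in USD

function costOfInvocation(runtimeMs, memoryMb) {
  const gbSeconds = (runtimeMs / 1000) * (memoryMb / 1024);
  return gbSeconds * PRICE_PER_GB_SECOND;
}

// A 500 ms invocation at 15 MB consumes ~0.0073 GB-seconds; the charge is
// proportional to exactly that usage -- an idle function costs nothing.
```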
Serverless architectures are a natural extension of microservices. Similar to microservices, serverless architecture applications are broken down into specific core components. While microservices may group similar functionality into one service, serverless applications delineate functionality into finer grained components. Custom code is developed and executed as isolated, autonomous, granular functions that run in a stateless compute service.
To illustrate this point, let’s look at a simple example of how a microservice and serverless architecture differ.
In Figure 1, a client interacts with a “User” microservice. A container is pre-provisioned with all of the functionality of the “User” service residing in the container. The service consists of different functions (update_user, get_user, create_user, delete_user) and is scaled based on the overall load across the service. The service will consume hardware resources even while idle, and the user will still be charged for the underutilized resources.
Figure 1: Microservices Architecture
For a serverless architecture, the “User” service would be separated into more granular functions. In Figure 2, each API endpoint corresponds to a specific function and file. When a “create user” request is initiated by the client, the entire codebase of the “User” service does not have to run; instead only create_user.js will execute. There is no need to pre-provision containers, as standalone functions only consume resources when needed, and users are only charged on the actual runtime of their functions. This granularity also facilitates parallel development work, as functions can be tested and deployed independently.
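A sketch of what `create_user.js` might contain, assuming a Lambda-style event with a JSON body; the in-memory `Map` is a stand-in for a real datastore so the example is self-contained:

```javascript
// Hypothetical create_user function -- the only code that runs when the
// client hits the "create user" endpoint. The rest of the "User" service
// (get_user, update_user, delete_user) lives in separate files and is
// never loaded for this request.
const users = new Map(); // stand-in for a real datastore

const createUser = async (event) => {
  const { name, email } = JSON.parse(event.body);
  if (!name || !email) {
    return { statusCode: 400, body: 'name and email are required' };
  }
  const id = `user-${users.size + 1}`;
  users.set(id, { id, name, email });
  return { statusCode: 201, body: JSON.stringify({ id }) };
};
```

Because each endpoint is its own function, `createUser` can be tested and deployed without touching the rest of the service.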
Figure 2: Serverless Architecture
Benefits of Serverless Computing
Costs scale with usage: One of the biggest benefits of serverless computing is that you only pay for the runtime of your function. There is no concept of “idle” resources as you are not charged if the function is not executed. This is especially helpful for applications that are only used a few times an hour, which means any dedicated hardware, VMs, or containers would be sitting idle for the majority of the time, and a user would be charged for underutilized resources. With serverless computing, enterprises could build out an entire infrastructure and not pay for any compute resources until customers start using the application.
Elastic scalability: Elastic scalability is also simple with a serverless architecture. If a function needs to scale, the infrastructure makes copies of the function to handle the load. An example of this could be a chatbot that responds to weather requests. In a serverless architecture, a chatbot function would handle the response by retrieving the user's location and replying with the temperature. For a few requests, this is not a problem, but what happens if the chatbot service is flooded with thousands of requests a second? In that scenario, the chatbot function would automatically scale by instantiating thousands of copies of the function. Once the requests subside, the environment terminates the idle instances and scales down, allowing costs to scale proportionally with user demand.
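The chatbot function itself stays trivially simple, because scaling lives entirely in the platform: under load the provider simply runs many concurrent copies of the same stateless handler. In this sketch the weather lookup is stubbed and all names are illustrative:

```javascript
// Hypothetical weather chatbot function. Note that it contains no scaling
// logic at all -- the platform instantiates as many copies as the request
// rate demands and tears them down afterward.
const weatherBot = async (event) => {
  const location = event.location || 'unknown';
  const celsius = await lookupTemperature(location); // stubbed below
  return { reply: `It is ${celsius}°C in ${location}.` };
};

// Stub so the sketch is self-contained; a real bot would call a weather API.
async function lookupTemperature(location) { return 21; }
```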
Rapid development and iteration: Serverless computing is ideal for companies that need to quickly develop, prototype, and iterate. Development is quicker since there aren't any dependencies on IT Ops. Functions are single-threaded, which makes debugging and deploying them simpler. The build process is also broken down into smaller, more manageable chunks. This increases the number of changes that can be pushed through the Continuous Delivery pipeline, resulting in rapid deployment and more iterative feedback. Iteration is fast because the architecture is conducive to making large code changes quickly, leading to more customer feedback and better product-market fit.
Less system administration: Serverless doesn’t mean that you completely obviate the operational element of your infrastructure, but it does mean that there is less system administration. There are no servers to manage, provision, and scale, as well as no patching and upgrading. Servers are automatically deployed in multiple availability zones to provide high availability. Support is also streamlined; if there is an issue in the middle of the night it is the responsibility of the cloud provider to resolve the problem.
Developer productivity: By using a serverless architecture, developers can focus more on writing code than having to worry about managing the operational tasks of the application. This allows them to develop innovative features and focus on the core business logic that matters most to the business.
MongoDB Atlas and Serverless Computing
With MongoDB Atlas, users can leverage the rich functionality of MongoDB — expressive query language, flexible schema, always-on availability, distributed scale-out — from a serverless environment. MongoDB Atlas is a database as a service that provides all the features of the database without the operational heavy lifting. Developers no longer need to worry about provisioning, configuration, patching, upgrades, backups, and failure recovery. Atlas offers elastic scalability, either by scaling up on a range of instance sizes or scaling out with automatic sharding, all with no application downtime.
Setting up Atlas is simple.
Figure 3: Provisioning MongoDB Atlas cluster
Select the instance size that fits your application needs and click “CONFIRM & DEPLOY.” Depending on the instance size, a MongoDB cluster can be provisioned in seconds.
Figure 4: MongoDB Atlas Cluster Monitoring
MongoDB Atlas provides many benefits for those interested in building a serverless architecture:
Vendor independence: Cloud providers typically only offer databases specific to that provider, which may not fit with what a developer needs. MongoDB Atlas provides independence from the underlying cloud provider and empowers developers to choose the appropriate tools for their needs. Developers can leverage the rich functionality of MongoDB’s query language and flexible data model, without worrying about the operational tasks of managing the database. If you decide to shift to another cloud provider, you won’t have to repopulate your data with a different database technology. MongoDB Atlas is currently available only on AWS, with support for Microsoft Azure and Google Cloud Platform (GCP) coming soon.
Rapid deployment: With MongoDB Atlas, a MongoDB cluster can be provisioned and deployed in minutes, or even seconds. Developers no longer need to worry about configuring or managing servers. Integrating MongoDB Atlas into your serverless platform is as simple as passing the connection string into your serverless application.
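A common pattern is to read the connection string from an environment variable and cache the client outside the handler, so warm invocations reuse the connection instead of reconnecting. In this sketch the connect call is injected (`connectFn`) purely to keep the example self-contained and testable; in a real function it would be the MongoDB Node.js driver's `MongoClient.connect`, and `MONGODB_URI` is an assumed variable name:

```javascript
// Cache the database client in module scope: serverless platforms may keep
// a container warm between invocations, so the connect cost is paid only
// on a cold start.
let cachedClient = null;

async function getClient(connectFn, uri = process.env.MONGODB_URI) {
  if (!cachedClient) {
    cachedClient = await connectFn(uri); // connect only on cold start
  }
  return cachedClient; // warm invocations reuse the same client
}
```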
Figure 5: MongoDB Atlas Connection
MongoDB Atlas features extensive capabilities to defend, detect, and control access to MongoDB, offering among the most complete security controls of any modern database:
- User rights management: Control access to sensitive data using industry standard mechanisms for authentication and authorization at the database level
- Encryption: Protect data in motion over the network and at rest in persistent storage
To ensure a secure system right out of the box, authentication and IP address whitelisting are enabled automatically.
IP address whitelisting is a key MongoDB Atlas security feature, adding an extra layer to prevent 3rd parties from accessing your data. Clients are prevented from accessing the database unless their IP address has been added to the IP whitelist for your MongoDB Atlas group.
For AWS, VPC Peering for MongoDB Atlas is under development and will be available soon, offering a simple, robust solution. It will allow the whitelisting of an entire AWS Security Group within the VPC containing your application servers.
Scalability: You should expect your serverless functions to scale out, so downstream systems need to be architected to keep up and scale with them. Relational databases tend to break down under this model. MongoDB Atlas is designed with scalability as a core principle. When your cluster approaches a capacity threshold, MongoDB Atlas alerts you, and with one click you can provision new servers.
Flexible Schema: Because serverless architectures are event-driven, many use cases revolve around the Internet of Things (IoT) and mobile. MongoDB is ideal for these use cases and more, as its flexible document model enables you to store and process data of any type: events, geospatial, time series, text, binary, and anything else. Adding new fields to the document structure is simple, making it easy to handle changing data generated by event-driven applications. Developers spend less time modifying schemas and more time innovating.
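As a small illustration of why this matters for event data, the in-memory array below stands in for a MongoDB collection (with the Node.js driver, each push would be an `insertOne` call); the device and field names are made up:

```javascript
// Event-driven workloads emit documents whose shapes evolve over time.
// With a flexible document model, a new field is just a new field -- no
// ALTER TABLE, no migration of existing documents.
const events = []; // stand-in for a MongoDB collection

events.push({ type: 'temperature', deviceId: 'sensor-7', celsius: 21.4 });

// A newer firmware revision starts attaching geolocation -- older
// documents are left untouched and remain queryable as-is.
events.push({
  type: 'temperature', deviceId: 'sensor-9', celsius: 19.8,
  location: { type: 'Point', coordinates: [-73.97, 40.77] },
});

// Query logic can simply treat the new field as optional.
const withLocation = events.filter((e) => e.location !== undefined);
```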
For more information on Serverless architecture best practices and benefits download MongoDB's Serverless Architectures: Evolution of Cloud Computing whitepaper.
Published at DZone with permission of Jason Ma, DZone MVB. See the original article here.