Development Best Practices for Hybrid Cloud: Part 1
For anyone seeking to move into the hybrid cloud space, check out these best practices and what you should keep in mind.
Development is both an engineering discipline and an art that is perfected as experience is gained. With the introduction of hybrid cloud, the way we look at applications has changed, our customers' expectations have changed, and, more importantly, our business revenue streams have changed. To accommodate these changes by incorporating evolving new technologies, development best practices need to morph to suit the emerging needs.
There are numerous best practices that can be derived based on multiple factors, but a few form the core, especially for the hybrid cloud environment. This article is the first of a two-part series that tries to shed some light on these best practices, which will greatly help development teams in their day-to-day development activities.
IDE
Starting from Notepad, there are a variety of IDEs available on the market. Some of the key features which should be considered for a hybrid cloud application include:
- Multi-lingual: An IDE should be able to handle multiple programming languages so as to support most of the enterprise development needs.
- Open Source: As IDEs are used by the entire development community, it is recommended to get an open source IDE with extensive development community support. It's also recommended to have a limited number of IDEs, if not just one, so that upgrades, patches, and maintenance become easy.
- Cloud Integration: Especially with hybrid cloud, IDEs should be able to communicate with various clouds like AWS, Azure, Mule Cloudhub, Oracle, and Salesforce.
- SCM Integration: It should have seamless integration with most commonly used source code repositories such as Bitbucket, Gitlab, Github, Subversion, etc.
- Code/Content Assist: The IDE should actively assist developers by generating code automatically, importing the most commonly used libraries, providing shortcut keys for the most commonly used code constructs, generating template projects, and helping with compilation errors, code profiling, autocompletion, and code recommendations.
- Third-Party Constructs: The IDE should be capable of accommodating integration with third-party tools like Mule Studio, BPM Modeler, Swagger, RAML, and databases.
- Publish Content: Developers can save a lot of time if IDEs can publish and pull content directly from API management tools like Apigee, Mule, and Akana.
These are some of the common functionalities an IDE should provide to be used at an enterprise scale. Eclipse seems to be the winner in this category, with plugins covering all the above-mentioned aspects.
Agile, Lean & XP
Following Agile and Lean principles enables the team to systematically deliver value to the stakeholders. Using tools like Jira and VersionOne can not only help to track, assess, and prioritize work, but also provide transparency to stakeholders. Using XP can greatly increase the quality of the code developed. Pair programming and peer review give assurance that the code is effectively written and validated to the best of our knowledge.
DevOps Toolchain
Cloud-agnostic tools, a well-defined sequence of steps, integrated security/governance, and well-thought-out configuration management are key for identifying robust toolchains. Care should be taken to keep the number of steps in the chain limited but effective, which not only facilitates faster builds but also provides enough quality assurance for the code built. More details on DevOps and automation can be found here.
Design Patterns
Incorporating design patterns in the code helps ensure adherence to industry standards. But care should be taken that the appropriate design pattern is used for a specific use case. A mismatched design pattern not only causes confusion but also degrades application effectiveness. With hybrid cloud environments, where multiple replicas of the application are spun up and torn down, detailed analysis needs to be conducted on how objects are created and destroyed. There might be use cases which demand specific handling of the created objects as a result of rapid scaling. These design patterns can also greatly aid in accelerating the development process. More details on the accelerators for development can be found here.
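As a hedged illustration of the object lifecycle concern above, the sketch below (in Python, with hypothetical names) applies the object pool pattern: expensive objects are reused rather than created and destroyed on every request, one common way to tame churn when replicas scale rapidly.

```python
import queue


class ConnectionPool:
    """Object pool sketch: reuse expensive objects instead of
    creating and destroying one per request."""

    def __init__(self, factory, size):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())  # pre-build the pooled objects

    def acquire(self):
        return self._pool.get()  # blocks until a pooled object is free

    def release(self, obj):
        self._pool.put(obj)  # return the object for reuse


# Usage: pretend each "connection" is expensive to construct.
pool = ConnectionPool(factory=lambda: {"connected": True}, size=2)
conn = pool.acquire()
# ... use conn ...
pool.release(conn)
```

The same acquire/release discipline applies whether the pooled resource is a database connection, a thread, or a parsed template; the pool size becomes a tuning knob per replica.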
Service Discovery
Whether you are on-premises, in the cloud, or both, as the number of services grows it becomes more and more critical to have a robust discovery process. Based on the use case and tools in use, it is recommended to aim for self-registration, server-side discovery, and gateway moderation. There are many tools available, a few of which include Eureka, Mule Service Registry, Mule Gateway, Apigee, API Connect, Istio, and Akana. Regardless of the tool and technology, these tools have performance implications which should be evaluated before making a decision.
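To make the self-registration and server-side discovery ideas concrete, here is a minimal in-memory sketch (Python, hypothetical names; real tools like Eureka expose this over HTTP with health checks and leases):

```python
class ServiceRegistry:
    """Minimal sketch of a service registry: instances self-register
    on startup, and a gateway looks instances up server-side."""

    def __init__(self):
        self._instances = {}  # service name -> list of instance addresses

    def register(self, name, address):
        self._instances.setdefault(name, []).append(address)

    def deregister(self, name, address):
        self._instances.get(name, []).remove(address)

    def lookup(self, name):
        # A real gateway would also health-check and load-balance here.
        return list(self._instances.get(name, []))


registry = ServiceRegistry()
registry.register("orders", "10.0.0.1:8080")  # instance registers itself on startup
registry.register("orders", "10.0.0.2:8080")  # a second replica joins
```

In production the registry is a network service, registrations carry TTLs so crashed replicas age out, and the lookup step is where the performance cost mentioned above is paid.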
Servers
There are many server types available to developers, and most IDEs come packaged with some default servers. It is recommended to work with the closest server type and version which the code will eventually be deployed on, either on-premises or in the cloud. Also, pick a lightweight and cloud-agnostic server which can do the job. Lightweight servers greatly decrease the time taken to start and deploy code, and many servers also have the ability to hot deploy the code, saving a significant amount of time.
Enhancement and Refactoring
With the introduction of cloud, it became effortless to rebuild and tear down instances frequently. This facilitates frequent enhancements and patching of the existing code base, third-party libraries, OS, and middleware. It is recommended to upgrade and patch at every possible occasion. This helps reduce vulnerabilities and defects and greatly shortens the upgrade cycle of an application. If possible, have a regular upgrade cycle for the provisioned servers and make it an IT portfolio activity.
Message Translation
JSON and XML are the most widely used message types for sending information across different systems. Having a standard translation/conversion of these messages into objects is critical for further processing. Using tools like the Java API for JSON, Jackson, JAXB, JSON to C#, or XML to C# can greatly help developers create objects from incoming messages. To extend it further, with API-first development, the use of Swagger and RAML is gaining popularity. Having RAML to Java, Swagger to Java, RAML to .NET, Swagger to .NET, etc. can greatly help developers.
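The message-to-object conversion described above can be sketched in a few lines; this example uses Python's standard library and a hypothetical `Order` message type rather than one of the Java/.NET tools named, but the idea (one standard, typed entry point for every incoming payload) is the same:

```python
import json
from dataclasses import dataclass


@dataclass
class Order:
    """Typed representation of a hypothetical incoming 'order' message."""
    id: str
    amount: float


def order_from_json(payload: str) -> Order:
    """Single standard conversion point: raw JSON -> typed object.

    Normalizing types here (e.g. amount arriving as a string) keeps
    every downstream consumer working with one consistent shape."""
    data = json.loads(payload)
    return Order(id=data["id"], amount=float(data["amount"]))


order = order_from_json('{"id": "A-42", "amount": "19.99"}')
```

Tools like Jackson or JAXB generate this boilerplate from schemas; the benefit either way is that validation and type coercion happen once, at the boundary.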
Third-Party Libraries
As the complexity of applications increases, using third-party libraries can greatly decrease development time. But using third-party libraries also results in dependencies, security vulnerabilities, and, most importantly, vendor lock-in. It is recommended to use third-party libraries sparingly and, if possible, to use open source libraries with large community backing and proven adoption.
Versioning
This is one of the most critical aspects of any IT artifact. Semantic versioning is the most widely used strategy, but there are many others that can be used based on the specific use case. Tools like Bitbucket or Git can not only version your artifacts but also give enhanced functionality to revert, merge, and branch versioned artifacts as needed. The rule of thumb is to version every artifact.
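A small sketch of why semantic versions must be compared numerically, not as strings (Python, for illustration):

```python
def parse_semver(version: str) -> tuple:
    """Split a 'MAJOR.MINOR.PATCH' string into a comparable integer tuple.
    (Pre-release and build-metadata suffixes are ignored in this sketch.)"""
    major, minor, patch = (int(part) for part in version.split("."))
    return (major, minor, patch)


# Tuple comparison orders versions correctly where plain string
# comparison would not:
print(parse_semver("2.10.0") > parse_semver("2.9.9"))  # True  (10 > 9 numerically)
print("2.10.0" > "2.9.9")                              # False (string compares '1' < '9')
```

Under semantic versioning, the MAJOR component signals breaking changes, MINOR signals backward-compatible features, and PATCH signals fixes, so consumers can reason about upgrade risk from the number alone.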
This is the first part of the two-part series on Development Best Practices for Hybrid Cloud. The second part can be found here.
Opinions expressed by DZone contributors are their own.