[This article was originally written by Rob Fox.]
By unleashing your digital assets, you draw developers to consume the digital mana, creating new value for consumers in the API age. The proliferation of both open and closed APIs has created a rich, diverse ecosystem that is driving the Internet of Things (IoT) from idea to reality. But how do you get the most out of your existing APIs, and create new ones nimbly? How do application developers benefit by leveraging them? Can you predict it?
As an example, a good poker player uses the behavior and actions of the other players, combined with their own hand and stack of chips, to their advantage. This combination of statistical and behavioral “analytics” boosts the value of every chip in their possession. Without using their environment, the intrinsic “value” of their current bank is significantly reduced.
In order to understand how this same philosophy applies to API design, let’s take a step back and look at the lifecycle of an API.
API Development Lifecycle
To be successful, an API ecosystem needs to be treated like a (loosely connected) software system. The Software Development Lifecycle (SDLC) aims to deliver quality software, with a feedback loop that enables iteration on the system to continually refine and improve the user experience.
APIs should be treated similarly, by way of the API Development Lifecycle. At a time when the availability of APIs has exploded, companies that do not embrace this lifecycle put themselves in a vulnerable position. Given the loosely coupled ecosystems into which APIs are typically integrated, it is critical to understand how an API fits into its ecosystem. What happens if you change or enhance an API? How is it being consumed today? Do you really have a good handle on the needs of the customers who may be building applications that consume your APIs? How about their users?
A recommended top-down approach to API design and development is illustrated below:
By following a design-first approach, and ensuring that an API can be adopted smoothly, you dramatically increase its chances of success. But how do we create a truly full lifecycle that allows successful APIs to be improved and continually gain lift, and poor APIs to be improved or deprecated? Most importantly, how do we identify the needs of our customers (and theirs) predictively?
By collecting usage data and applying predictive modeling results, the feedback loop can be closed and the API lifecycle enriched, which can ultimately increase API consumption by adding value, removing inefficiencies, and, most importantly, predicting what should come next.
Predictive modeling can provide a true understanding of which assets are being consumed, by whom, and which changes will provide a better experience for the developer community building on your APIs, and for their user community.
A modified API Development Lifecycle that incorporates this insight is illustrated below:
So what are the kinds of insight analytics can bring to an organization developing APIs?
Descriptive analytics provides insight into what has happened. It can be manual or automated and is typically described as data mining. Most organizations that implement an analytics solution against their usage data stop at this first stage.
Use cases may include:
- What APIs are called the most often?
- Which ones have the highest latency?
- What applications are the heaviest consumers of an API?
- Where are APIs being called from?
- What APIs (and therefore assets) are underutilized?
- What disparate assets are being used together in an unforeseen way?
The list is endless.
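Several of the questions above reduce to simple aggregation over API usage logs. The sketch below, with entirely hypothetical log records and field names (no specific gateway's log format is assumed), shows how descriptive answers like "most-called API" and "highest average latency" fall out of a few lines of counting:

```python
from collections import Counter, defaultdict

# Hypothetical usage-log records; the field names and endpoints are
# illustrative assumptions, not taken from any particular API gateway.
logs = [
    {"api": "/orders",    "app": "mobile", "latency_ms": 120},
    {"api": "/orders",    "app": "web",    "latency_ms": 95},
    {"api": "/inventory", "app": "web",    "latency_ms": 340},
    {"api": "/orders",    "app": "mobile", "latency_ms": 110},
]

# Which APIs are called most often?
call_counts = Counter(record["api"] for record in logs)

# Which APIs have the highest average latency?
latencies = defaultdict(list)
for record in logs:
    latencies[record["api"]].append(record["latency_ms"])
avg_latency = {api: sum(v) / len(v) for api, v in latencies.items()}

print(call_counts.most_common(1))              # most-called API
print(max(avg_latency, key=avg_latency.get))   # slowest API on average
```

The same pattern extends to the other questions in the list (heaviest consumer applications, call origins, underutilized assets) by grouping on a different field.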
Building on top of what has happened, predictive modeling allows an organization to predict what may happen next (and when). This is a powerful, but harder to achieve, second stage of analytics for API usage.
Use cases may include:
- Which customers are going to exceed their SLA?
- Where is traffic likely to increase or decrease, when, and by how much?
- Predict customer loyalty, retention, and attrition
- Predict resource failures
- Impact analysis: what impact will a change to the API ecosystem have prior to deployment?
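As a minimal illustration of the first two use cases above, the sketch below fits a linear trend to weekly call volumes with an ordinary least-squares slope, extrapolates one week ahead, and flags whether the forecast would exceed an SLA cap. The volumes and the cap are made-up numbers; real predictive modeling would use a proper time-series method rather than a straight line:

```python
# Hypothetical weekly call volumes for one customer (illustrative data).
weeks = [1, 2, 3, 4, 5]
calls = [1000, 1150, 1290, 1440, 1600]

# Ordinary least-squares fit of calls = intercept + slope * week.
n = len(weeks)
mean_x = sum(weeks) / n
mean_y = sum(calls) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, calls)) \
        / sum((x - mean_x) ** 2 for x in weeks)
intercept = mean_y - slope * mean_x

# Predict next week's volume and check it against an assumed SLA cap.
forecast_week6 = intercept + slope * 6
sla_cap = 1700  # hypothetical contractual limit
print(round(forecast_week6))
print(forecast_week6 > sla_cap)  # customer is on track to exceed the SLA
```

This closes the loop described earlier: usage data collected for descriptive reporting becomes the training input for predictions you can act on before the limit is hit.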
Prescriptive analytics is the third and most mature stage of an analytics solution: the ability not only to predict what will happen, but to explain why it will happen, giving insight into what an organization should do to capitalize on the data.
Use cases may include:
- Auto-scaling resources ahead of demand (up or down) when and where they may be needed
- Auto-engage customers/consumers for SLA upgrades
- Auto-engage customers that are likely to drop off
- Notify a customer about opportunities
- Provide feedback and alerting internally on API design issues with recommended adjustments
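What separates prescriptive from predictive is that the output is an action, not just a forecast. A minimal sketch of the auto-scaling use case above, with entirely illustrative thresholds and action names, might look like:

```python
def prescribe(forecast_rps: float, capacity_rps: float) -> str:
    """Recommend a scaling action from predicted load vs. current capacity.

    The 80% / 30% utilization thresholds are illustrative assumptions;
    a real system would tune these against cost and latency targets.
    """
    utilization = forecast_rps / capacity_rps
    if utilization > 0.8:
        return "scale-up"     # add capacity ahead of predicted demand
    if utilization < 0.3:
        return "scale-down"   # release idle capacity
    return "hold"

print(prescribe(900, 1000))   # predicted load near capacity
print(prescribe(200, 1000))   # capacity mostly idle
```

The same predict-then-act shape applies to the other use cases: swap the scaling action for an SLA-upgrade offer, a retention outreach, or an internal design alert.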
Keep ‘em coming back for more
It’s a game of consumption. Keeping APIs relevant and performing well, while preventing the negative impact of change, allows for a truly transcendent implementation. Services come and go all the time. Technology is constantly evolving, and this strains the API economy for those who are not agile. There are endless revenue-generating opportunities in publicizing (and in some cases monetizing) digital assets.
However, in the new service economy built on loosely coupled (REST-based) APIs, no single organization controls the end-to-end experience. It is this very fact that makes combining analytics with API consumption an absolute must for an organization to thrive in the API economy. It isn’t enough to provide a quality service. Services, and the assets they represent, must constantly be weighed against the ecosystem they are a part of (which is constantly changing) and the consumers in that ecosystem.
A good poker player never plays blind, and always uses every aspect of the environment to their advantage. They recognize that this is critical for every game or tournament as different players constantly come and go.
An API developer should do the same thing by using analytics as the new currency to drive the API lifecycle.