TIBCO: AI, Analytics, and Vendor Neutrality
Cathy Traugot writes about the TIBCO Analytics conference in Houston and shares thoughts on AI, analytics, and vendor neutrality.
Journeys were a theme at the recent TIBCO Analytics conference in Houston, from TIBCO COO Matt Quinn's discussion of the company's own journey to rearchitect its products to be cloud-native, to Mercedes-AMG Petronas Motorsport's Formula One race victories fueled by analytics (analytics and tires; more on that later).
Nielsen’s Senior VP, Global Portfolio Leader Sharon Skurnac talked about her company’s digital transformation journey with a detailed breakout of the tech stack used to fuel the consumer market information giant’s operations data collection framework. The culture change aspect of the story was engaging on its own (dealing with cultural differences and entrenched behaviors in 100 countries).
"Everyone had their own way of doing things," Skurnac said. And that was hard with "40,000 boots on the ground."
But Skurnac's tech stack is also a who's who of the latest open-source and flexible tools: PostgreSQL and MongoDB for the persistence layer, Python and Spring Boot for the services layer, and Xamarin, Sencha Ext JS, and TIBCO Spotfire for the presentation layer – and that was just for transforming the back office into an integrated entity.
Skurnac's talk detailed the use of Unity to provide visuals for gamification, helping staff tasked with more routine work stay engaged in their jobs. And what came first in putting this extensive modernization project together? Process, with tools coming second and people layered in during the final phase.
A company facing a heavy digital transformation lift might be forgiven for forgoing a best-in-class data stack and just signing up for an out-of-the-box solution, but then Skurnac likely wouldn't have been speaking at a TIBCO event.
Keeping It Vendor Neutral
TIBCO has committed first to R and now to Python, and while its platform is not fully open source, the company touts its integrations and vendor neutrality. Much of the conference chatter was around moving from established (and sometimes proprietary) languages to Python, or using technologies that don't require a programming background. That theme ran through the presentation Ravi Hubbly, President of Explore Digits, gave on analytics for large healthcare enterprises.
Oil, Gas, and Data
TIBCO has a lot of customers in the oil and gas industry — this particular forum used to be branded specifically around that industry before expanding into other areas. So, its announcement of a commitment to The Open Group's Open Subsurface Data Universe (OSDU) wasn't a mic-drop moment. What is more interesting is how the industry (or at least the companies presenting at this conference) is embracing analytics beyond basic control systems. It's needed.
OAG Analytics presented on using AI to improve well spacing, ConocoPhillips detailed supplier spend analysis, and IHS Markit discussed how it uses analytics to explore future productivity potential for shale. Using analytics to make pipelines safer? To analyze Permian Basin well performance? Got it.
Turning Computing Rules on Their Heads
Company thought leaders often try out the ideas they are working on at conferences. TIBCO’s Quinn first discussed AI as the foundation of your platform instead of a technology sprinkled on top at TIBCO’s London event earlier this fall. He expounded on it in an interview with us after his keynote.
Quinn started at TIBCO as a developer, and as he surveys the landscape of change (including the continued misuse of terms around AI and machine learning), he says it's time for companies to stop thinking of AI as a nice-to-have and start treating it as the fundamental principle driving what they do.
If you look back to the origins of the computer, the mechanical components were the big deal: the machine itself, not the code that drove it, Quinn says. When AI capabilities such as voice recognition first emerged in the '90s, computing power hadn't caught up. "It cost too much and there were limited practical uses," Quinn says.
But that has changed rapidly. Quinn proposes building AI into the foundation so it can find and fix problems, or at least learn from the fixes humans apply and suggest those fixes going forward with less human intervention. He gives, as an example, web services and sites that crash for knowable reasons. "Flip the model and change the perspective. You can build systems that are smarter at their base level so that as you add more and more functionality, it's smarter and more adaptable to learning."
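Quinn's idea of a platform that learns from the fixes humans apply can be reduced to a very simple sketch: a remediation layer that escalates unknown failures to a human, records the fix the human applied, and replays it automatically the next time the same failure signature appears. The names below (`SelfHealingLayer`, the failure signatures) are hypothetical illustrations, not TIBCO APIs.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class SelfHealingLayer:
    # Maps a known failure signature (e.g. "db_pool_exhausted") to the
    # remediation a human previously applied for it. Hypothetical sketch.
    learned_fixes: Dict[str, Callable[[], str]] = field(default_factory=dict)

    def handle(self, signature: str) -> str:
        """Auto-remediate known failures; escalate unknown ones."""
        if signature in self.learned_fixes:
            return self.learned_fixes[signature]()  # fix with no human help
        return f"escalate:{signature}"              # a human must intervene

    def learn(self, signature: str, fix: Callable[[], str]) -> None:
        """Record the fix a human applied so it can be replayed next time."""
        self.learned_fixes[signature] = fix

layer = SelfHealingLayer()

# First crash: the signature is unknown, so it escalates to a human.
print(layer.handle("db_pool_exhausted"))  # escalate:db_pool_exhausted

# A human fixes it once; the layer remembers the remediation.
layer.learn("db_pool_exhausted", lambda: "restarted connection pool")

# The same crash later is handled automatically.
print(layer.handle("db_pool_exhausted"))  # restarted connection pool
```

A real "AI at the base" system would generalize across similar signatures rather than match them exactly, but the flip in perspective is the same: remediation knowledge lives in the platform's foundation, not in a runbook sprinkled on top.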
Data Analytics and the Racing Cars
No conference (even one attended by IT staff, developers, and statisticians) is complete without a cool story. Matt Harris, head of IT for Mercedes-AMG Petronas Motorsport, keynoted on how analytics has helped keep Mercedes drivers at the top of the Formula One leaderboard. The story is not new. You can read about it here.
But using data to figure out which tires to put on a car to increase its speed? It's working. Mercedes-AMG Petronas has been a dominant force since 2014. Someone needs to pour some champagne over the IT staffer who spends three days setting up the computing equipment before every race, and over the data analysts back at the home office crunching all that sensor data to make the right calls about the tires (and other aspects of the vehicles).
The Mercedes story is also a great example of how even products thought of as mechanically driven (race cars) are increasingly software-driven.
Opinions expressed by DZone contributors are their own.