Tibco Delivers a High-Performance ‘Data and Analytics Pipeline’ To Convert Raw Data into Actionable Intelligence

Tibco is delivering a new ‘data and analytics pipeline’ to let companies quickly and continuously convert raw data into actionable intelligence. IDN explores Tibco’s Connected Intelligence Cloud Platform with senior execs Shawn Rogers and Robert Eve.

Tags: analytics, cloud, integration, intelligence, MDM, pipeline, Tibco

Shawn Rogers
senior director for
analytic strategy
Tibco


"Companies need to have a solid framework around their pipeline. We've gone from companies having a handful of algorithms to hundreds or a thousand algorithms."



Robert Eve
senior director
Tibco


"What started as an AI/ML analytics model development challenge for data scientists is now an application integration challenge for application engineers."


Tibco is bringing together integration, analytics and global data management to help companies create a high-performance pipeline to convert raw data into actionable intelligence.  

 

Tibco’s Connected Intelligence Cloud Platform aims to deliver a converged ‘data and analytics pipeline’ by combining crucial technologies into a new reference architecture. Rather than a simple one-way data pipeline, Tibco’s approach offers a ‘closed loop’ model that circles back on itself, providing an iterative way to keep improving how data and analytics are delivered.

 

Tibco’s senior director for analytic strategy Shawn Rogers described it this way:

 

“Our view of a closed loop system has a start and a finish, but then can loop around to begin again. Today, the pipelines that I see that work the best are somewhat circular because at the very end of a pipeline there's this idea you can measure for success, optimize the process and start over again.”
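
To make that closed-loop idea concrete, here is a minimal, generic Python sketch (not Tibco’s implementation) in which each pipeline run ends by measuring its result and feeding an adjustment into the next run. The configuration knob, target threshold, and stand-in functions are illustrative assumptions only.

# A minimal, generic sketch of a closed-loop pipeline: each run ends by
# measuring results and feeding adjustments into the next iteration.
# Illustrative only; not Tibco's implementation.
from dataclasses import dataclass

@dataclass
class PipelineConfig:
    batch_size: int = 100   # a tunable knob the loop can adjust

def ingest(config):
    # Stand-in for pulling raw data from sources.
    return list(range(config.batch_size))

def analyze(records):
    # Stand-in for running analytics / models over the data.
    return sum(records) / max(len(records), 1)

def measure(result):
    # Stand-in for checking the outcome against a business target.
    return result >= 60.0

def optimize(config, success):
    # Feed what was learned back into the next run.
    if not success:
        config.batch_size += 50
    return config

config = PipelineConfig()
for iteration in range(3):  # the "loop around and begin again"
    result = analyze(ingest(config))
    success = measure(result)
    config = optimize(config, success)
    print(f"run {iteration}: result={result:.1f}, success={success}, "
          f"next batch_size={config.batch_size}")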

 

This pivot to closed-loop thinking is becoming important not simply because of the volume of data, Rogers added, but because of the explosion in the volume and complexity of algorithms.

 

“Companies need to have a solid framework around their pipeline. It’s becoming very important because we've gone from companies having a handful of algorithms – maybe five or ten in their whole environment – to deploying, managing and operationalizing hundreds or a thousand algorithms.” Rogers shared that more than one customer has asked how Tibco can best help them support 10,000 algorithms.

 

Rogers, an industry analyst before joining Tibco, explained how Tibco is focused on delivering a multi-purpose pipeline that can (1) support multiple data types from anywhere, (2) perform rapid, even real-time, analytics operations and (3) include all the management services that approach requires.

 

Tibco’s senior director Robert Eve told IDN that by adding a ‘closed loop’ dimension to a traditional pipeline, customers are shifting their entire thinking about how to extract value from it.

 

“Successful organizations are reengineering their ‘analytics pipelines’ into ‘data and analytics pipelines’ to discover hidden trends and predict future business outcomes,” Eve told IDN. 

 

Pipelines That ‘Leverage’ and ‘Operationalize’ All Data and Analytics

Beyond the technologies involved, there is also a new way of looking at analytics teamwork, Eve added. “Further, they are applying an ‘all hands on deck’ approach to analytics that empowers citizen data engineers and citizen data scientists to focus IT's efforts on the agile delivery of secure, consistent, and reusable analytic datasets,” he told IDN.

 

To deliver this new ‘data and analytics pipeline’ that works for technicians, data engineers and non-technical business users, Tibco engineers have been exploring many dimensions of how raw data can best and most quickly become useful analytics, or intelligently respond to queries.

In short, Rogers added, the transformation in pipelines comes down to this:

 

“It’s no longer just about pipes for a data pump. It's no longer just about how much data you can push through it,” Rogers said. “Rather, it’s a recognition that companies need to be able to better leverage all their data – no matter what type it is or where it is. In short, it’s making sure you have the ability to operationalize the environment that you have around analytics.”

 

For companies looking to better ‘operationalize’ their environments for analytics, that means taking a step back and asking some broader questions about their current pipeline approach – and more specifically, about how data becomes intelligence.

 

So, how should companies start thinking about – and implementing – a more ‘operationalized’ environment for analytics ROI?

 

One place to start, Rogers suggested, is to ask some of the same questions asked (and still being asked) by engineers working on Tibco’s Connected Intelligence Cloud Platform:

  • Do you have enough governance?
  • Can you deliver high performance and high scale?
  • Do you have enough access to all data types?
  • Can those data sets work with one another?
  • Can you support a wide number of varied personas – both technical and non-technical workers – with tools they can use to meet their specific goals?

If that sounds like a lot of extra functionality, you’d be correct.

 

And Rogers pointed out: “The need for all that [extra] functionality can put lots of stress and push on the backside of the architecture. Tibco is focusing very much on everything that has to do with the enablement side of an innovative analytics program.”  

 

Tibco’s Recipe for Innovative ‘Data and Analytics Pipelines’

Enter the Tibco Connected Intelligence Cloud platform, where Tibco brings together crucial capabilities – integration, in-memory analytics, MDM, data virtualization, and more – to create the ‘data and analytics pipeline.’

 

With it, businesses can connect disparate data, govern their information, and augment the resulting intelligence for high impact and high performance, Rogers said.

 

Rogers and Eve walked IDN through some of the more important technology components of the solution:

 

TIBCO ComputeDB Fabric. Earlier this year, Tibco acquired the ComputeDB technology (as part of its acquisition of SnappyData). ComputeDB presents a unified analytics data fabric, offering simplified, agile analytics on data in motion and at rest.

 

This technology, which provides an in-memory optimized analytics database based on Apache Spark, also enhances the scale and speed of analytics operations – and supports streaming, transactional and interactive analytics, Rogers said.
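
As a rough illustration of that pattern – and assuming only open-source Apache Spark rather than ComputeDB’s own APIs – the PySpark sketch below lands a synthetic stream in an in-memory table and then runs an interactive SQL query over the same data:

# Generic PySpark sketch of combining streaming ingestion with
# interactive queries over in-memory data. Illustrative only; it does
# not use Tibco ComputeDB's APIs.
import time
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-plus-interactive").getOrCreate()

# Streaming source: a synthetic "rate" stream (rows generated per second).
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Land the stream in an in-memory table that can be queried by name.
query = (events.writeStream
         .format("memory")
         .queryName("live_events")
         .outputMode("append")
         .start())

time.sleep(5)  # let a few micro-batches land (demo only)

# Interactive analytics over the same in-memory data.
spark.sql("SELECT COUNT(*) AS events_so_far FROM live_events").show()

query.stop()
spark.stop()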

 

Integrating ComputeDB technologies with Tibco’s Connected Intelligence Cloud Platform brings several benefits, Rogers noted. Among them:

  • Faster data refresh intervals and quicker query times
  • A high-speed, high-scale in-memory data store to expand access to immense data sets
  • High performance for streaming analytics
  • Rich DataOps support, with the added ability to orchestrate and streamline management of analytics data pipelines

 

Tibco MDM. Tibco also acquired Orchestra Networks and its EBX data asset management and MDM (master data management) technology. Now named Tibco EBX, the technology allows the Tibco Connected Intelligence Cloud Platform to natively provide trust across many critical data assets, including master data, metadata, reference data, hierarchies, and business glossaries.

 

The goal of Tibco EBX, in short, is to provide a single solution to help users govern, manage and consume all shared data assets. These capabilities are critical for operational and analytics processes that drive innovation and transformation, Rogers said.

 

Data Virtualization. Tibco Data Virtualization delivers more agile analytics datasets from any source. Without it, enterprises struggle to access a 360° view of business operations, minimize costs, or reduce operational complexities.

 

Beyond these individual technology enhancements, there is also a compounding effect when they are used together, Rogers added.

 

“There's sort of this ‘one plus one equals three’ idea of when you bring more than one component of our offerings together. You are going to get a multiplier on your value because when we can highly integrate them all into a unified pipeline, there are cool things that happen” in regard to performance, deep analytics, access to large data volumes and other considerations, he explained.

 

As an example, Tibco’s Connected Intelligence Cloud Platform enables IT and DataOps teams to innovate by governing their data and information, exposing it to other systems through a unified data fabric, and extracting the best insights. 

Further, once in place, the Tibco Connected Intelligence Cloud Platform offers a well-governed, business-friendly data foundation ready to work with Tibco Spotfire and Tibco Data Science. This combination enables business users to perform last-mile analytics activities such as visualization, AI/ML modeling, and deployment in a collaborative, self-service manner. 

Notably, Tibco Spotfire recently added some new features that play into the strengths of the Tibco Connected Intelligence Cloud Platform, including:

 

(a) native Snowflake Data Warehouse support, which, once connected, can combine or push live queries into Snowflake, with on-demand retrieval of rows for in-memory analytics (the general push-down pattern is sketched after this list);


(b) a new Data Streams Connectivity Wizard to give users better visibility into all real-time streaming data; and


(c) more high-performance in-cluster analytics and new native BigQuery support.
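
For readers unfamiliar with the push-down pattern behind item (a), the sketch below shows the general idea using the open-source snowflake-connector-python package rather than Spotfire itself: the query executes inside Snowflake, and rows are retrieved on demand in batches for local, in-memory analysis. The account, credentials, warehouse, and table names are all placeholders.

# Generic push-down-query sketch with the Snowflake Python connector.
# Not Spotfire's mechanism; connection values and table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="my_password",    # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
    database="SALES",          # placeholder
    schema="PUBLIC",           # placeholder
)

cur = conn.cursor()
# The aggregation is pushed down and executed inside Snowflake.
cur.execute("""
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
""")

# Retrieve rows on demand, in batches, for local in-memory analysis.
while True:
    rows = cur.fetchmany(1000)
    if not rows:
        break
    for region, revenue in rows:
        print(region, revenue)

cur.close()
conn.close()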

Analytics Pipelines Plus Integration Also Can Transform Customer Engagement

By enabling all these technologies to work better together, Tibco execs say the Connected Intelligence Cloud Platform can also improve how companies engage with customers.

 

“The convergence of a ‘closed loop analytics pipeline’ and ‘enterprise-class integration’ has the power to completely transform customer engagement in today’s multichannel world,” Eve said. 

 

He shared a real-world example of how Tibco sets the stage for data engineers, developers and business users to work together to unlock new customer-centric innovations:

Let’s say a business wants to create a new AI-based propensity-to-buy model that generates next-best-offer recommendations for customers. This is what consumers see online under the title “What other items do customers buy after viewing this item?” or hear from a customer service representative along the lines of “Would you also like to purchase the carrying case or the extended warranty?”

In the first step of the process, data engineers need to gather all the relevant buyer history data, market trends, and customer behaviors. Next, the data scientists go to work developing the AI/ML models, iteratively backtesting, refining and retesting the model’s ability to predict future outcomes. Once the data scientists are satisfied that the model can truly recommend the next best offer, it’s time to deploy it on the business’s e-commerce site, at its call centers, and across every other one of its customer engagement channels. In other words, what started as an AI/ML analytics model development challenge for data scientists is now an application integration challenge for application engineers.
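
To make the first half of that workflow concrete, here is a minimal, hypothetical sketch of a propensity-to-buy model built with scikit-learn rather than Tibco Data Science. The features, synthetic data, and coefficients are illustrative assumptions, not details from Tibco or Eve.

# Hypothetical propensity-to-buy sketch for next-best-offer scoring,
# using scikit-learn. Features and data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5_000

# Illustrative buyer-history features: recency, frequency, monetary
# value, and whether the shopper viewed the candidate add-on offer.
X = np.column_stack([
    rng.integers(1, 365, n),        # days since last purchase
    rng.poisson(3, n),              # orders in last 12 months
    rng.gamma(2.0, 50.0, n),        # average order value
    rng.integers(0, 2, n),          # viewed the candidate offer
])
# Synthetic label: bought the recommended add-on (demo only).
logit = -2.0 - 0.004 * X[:, 0] + 0.3 * X[:, 1] + 0.004 * X[:, 2] + 1.2 * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Backtest loop in miniature: fit, then score held-out data.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout AUC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))

# Score one shopper's propensity to buy the add-on.
shopper = np.array([[30, 5, 120.0, 1]])
print("propensity:", round(model.predict_proba(shopper)[0, 1], 3))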

Every step of this process requires leveraging best-in-class technology, agile synchronized processes, domain experts of all stripes working collaboratively, and more from initial concept to operationalization. Not only does the business need to make this work in the first place, but it also needs to do all this faster and better than the smart people with access to the same technology over at their competitor. 

This is where Tibco’s converged approach stands apart.

Tibco Data Science integrates the model development flow across ideation, data preparation, model development, testing, refinement and training, and deployment. Data engineers and data scientists collaborate more efficiently, and time-to-solution is faster. Most importantly, the model’s recommendations drive more revenue, much to the chagrin of the competition.

At model deployment, Tibco’s integration products take over. If the deployment is on the e-commerce site, Tibco’s Connected Intelligence Cloud solution allows businesses to seamlessly integrate the next best offer recommendation model into the e-commerce user interface. Demonstrating the flexibility of this Tibco solution, it can also be used to simultaneously deploy the same model into the scripts that pop-up on customer service reps’ screens at call centers across the globe. When the data scientists improve the model and want to operationalize the new version across all the channels, it’s easy to do so with Tibco.
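
As a rough sketch of what that integration step can look like in code, the example below exposes a trained model as a single REST scoring service that both the e-commerce front end and the call-center scripts could call. It uses Flask as a generic stand-in for illustration, not Tibco’s integration products, and the endpoint path, field names, and model artifact are assumptions.

# Hypothetical scoring service: one endpoint serving every channel.
# Flask is a stand-in here; it is not Tibco's integration tooling.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("propensity_model.joblib")  # assumed training artifact

@app.post("/next-best-offer")
def next_best_offer():
    payload = request.get_json()
    features = [[
        payload["days_since_last_purchase"],
        payload["orders_last_12_months"],
        payload["avg_order_value"],
        payload["viewed_offer"],
    ]]
    score = float(model.predict_proba(features)[0][1])
    return jsonify({"offer": "extended_warranty", "propensity": score})

if __name__ == "__main__":
    app.run(port=8080)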

In Rogers’ view, Tibco’s focus on scale – in all its dimensions – is also worth noting.

 

It’s not simply having the scale to meet the needs of data volumes and processing power, Rogers told IDN. It’s also the scale of how many knowledge workers need access to the pipeline – and how many different jobs they want to do.

 

“At Tibco, we’re focusing on this new equation, where so many main elements work together to deliver an analytical environment,” Rogers said. “Making data and analytics available to so many more personas is where the opportunity lies.”

 

“The tipping point really for me came from how large the user community has become,” he added. “In the early 90s, BI platforms were serving just tens of users within a large organization. Now we have thousands of users across an organization, and all of them have many different and special needs.”



