Actian Blog / Do you Need a Data Fabric?

Do you Need a Data Fabric?


Are you challenged to provide faster access to integrated data across a diverse and distributed IT landscape?

Over the past few years, IT organizations have increasingly been asked to automate the systems and processes involved in integrating and preparing their data for reporting, leveraging capabilities such as active metadata, artificial intelligence (AI) / machine learning (ML) algorithms, and knowledge graphs.

Bringing traditional data sources together and augmenting them using these modern capabilities requires a different design approach – that’s where a data fabric comes in.

What is a Data Fabric?

A data fabric is a design concept (an architecture) that provides a consistent set of data services and capabilities across on-premise and cloud environments. It enables you to abstract data from systems that are physically and logically different into a common set of data objects, so you can treat them as a unified enterprise data set.

This is particularly important in supporting digital transformation initiatives where business processes leverage different systems that are on-premise, spread across multiple clouds, and even deployed remotely (things like IoT and Mobile apps). By utilizing a data fabric, companies can achieve faster and more efficient data sharing across systems, which leads to more integrated business insights and increased business agility.

A data fabric comprises a set of capability layers that transform and abstract data from different sources on its way to the data consumer. Think of it as a value chain in which raw materials are transformed, through a series of value-add steps, into consumable finished goods. A typical data fabric involves five steps of refinement that take place across three layers of capability.

Raw data is first organized in a data catalog/metadata layer. Each data source includes contextual information, called “metadata,” about what the data is, when it was collected, what format it is in, and so on. The cataloging layer performs a “rough sort” of the raw data.
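The cataloging step above can be sketched as a minimal illustrative example. The class and field names here are hypothetical, not part of any Actian product; the point is simply that each raw source is registered with contextual metadata and can then be “rough sorted” by a metadata attribute.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CatalogEntry:
    """Hypothetical catalog record: what the data is, where it came
    from, what format it arrived in, and when it was collected."""
    name: str
    source_system: str
    format: str  # e.g. "csv", "json", "parquet"
    collected_at: datetime
    tags: list = field(default_factory=list)

class DataCatalog:
    """A 'rough sort' of raw sources keyed by metadata attributes."""
    def __init__(self):
        self.entries = []

    def register(self, entry: CatalogEntry):
        self.entries.append(entry)

    def by_format(self, fmt: str):
        # Group sources by the format recorded in their metadata.
        return [e for e in self.entries if e.format == fmt]

catalog = DataCatalog()
catalog.register(CatalogEntry("orders", "erp", "csv", datetime(2021, 5, 1)))
catalog.register(CatalogEntry("clicks", "web", "json", datetime(2021, 5, 2)))
print([e.name for e in catalog.by_format("csv")])  # ['orders']
```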

This catalog then informs a knowledge-graph layer, where analytics are applied to activate the metadata and infer connections and relationships that may exist. AI/ML algorithms enrich the active metadata to help interpret the data, put it into context, and simplify it so that automation rules can be defined for data integration.
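One way to picture how relationships might be inferred from metadata: sources that share a column name are likely candidates for a join. This is a simplified, hypothetical sketch of that idea; real knowledge-graph layers use far richer signals (lineage, semantics, usage patterns).

```python
# Hypothetical column-level metadata for three catalogued sources.
sources = {
    "orders":    {"columns": ["order_id", "customer_id", "amount"]},
    "customers": {"columns": ["customer_id", "name", "region"]},
    "shipments": {"columns": ["shipment_id", "order_id"]},
}

def infer_edges(sources):
    """Return (source_a, source_b, shared_column) edges of a simple
    knowledge graph, inferred from shared column names."""
    names = sorted(sources)
    edges = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            shared = set(sources[a]["columns"]) & set(sources[b]["columns"])
            for col in sorted(shared):
                edges.append((a, b, col))
    return edges

print(infer_edges(sources))
# [('customers', 'orders', 'customer_id'), ('orders', 'shipments', 'order_id')]
```

Edges like these are what allow integration rules ("join orders to customers on customer_id") to be proposed automatically rather than hand-coded.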

The data from the knowledge graph then moves into an integration layer, where data from different sources is brought together and reconciled into a common, integrated data set. This data set then drives data orchestration and automation that pushes relevant data to the individuals and systems that need to consume it.
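The reconciliation step can be illustrated with a toy example. The source shapes and field mappings below are invented for illustration; the pattern is that each source's fields are mapped to the fabric's common field names and merged on a shared key.

```python
# Two hypothetical sources with different shapes for the same customer.
crm_rows = [{"cust_id": 1, "full_name": "Ada Lopez"}]
billing_rows = [{"customer": 1, "balance": 42.0}]

def to_common(row, mapping):
    """Rename source-specific fields to the fabric's common field names."""
    return {common: row[src] for src, common in mapping.items()}

# Merge both sources into one common data set keyed by customer id.
common = {}
for row in crm_rows:
    rec = to_common(row, {"cust_id": "id", "full_name": "name"})
    common.setdefault(rec["id"], {}).update(rec)
for row in billing_rows:
    rec = to_common(row, {"customer": "id", "balance": "balance"})
    common.setdefault(rec["id"], {}).update(rec)

print(common)  # {1: {'id': 1, 'name': 'Ada Lopez', 'balance': 42.0}}
```

In a real fabric, the mappings themselves would be suggested by the active metadata and knowledge graph rather than written by hand.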

What Problems Does a Data Fabric Address?

The data fabric design concept is intended to solve the age-old data problem – “how can I make things that are fundamentally different look and act similar enough to treat them as if they were the same?”  As IT environments grow and evolve, the challenge gets more significant, and the urgency of providing a solution becomes more apparent. Typical challenges include:

  • Data silos across business functions
  • Diversity of data sources and types
  • IT systems spread across physical operating environments (multi-cloud, on-premise, mobile, etc.)
  • Demand for real-time and event-driven data for decision making
  • Growth in operational analytics and business-led data modeling activities

IT systems are getting more complex while the business is demanding simpler, faster access to data for decision making. A data fabric provides the capabilities to address both.

How Integration Platform as a Service can Help Support Your Data Fabric

If you want to implement a data fabric, before you can start cataloging and refining data through the layers of the fabric, you first need to collect it from all of its various sources and bring it together in one place. An integration platform as a service (iPaaS) handles that collection, and it also enables you to manage and orchestrate the connections from your data fabric to all the target consuming systems.

An iPaaS solution like Actian DataConnect provides the connectivity needed to make a data fabric design successful. The data fabric provides a platform that enables data transformation; the iPaaS solution manages connectivity, security, authorized access, and orchestration of the data flow across your organization.

Do you Need a Data Fabric?

If you have a complicated IT environment, an ever-evolving business environment, and decision-makers who demand real-time data, then you should be looking at a data fabric. You should also be looking at an integration platform like Actian DataConnect to manage the flow of data across your organization in a consistent and controlled way.

To learn more, visit www.actian.com/dataconnect.

About Sampa Choudhuri

Sampa is the Director of Product Marketing for Actian, focusing on messaging, sales enablement, and go-to-market activities for the Data Integration product line. She comes from an enterprise technology background, having worked in software, hardware, networking, and security for companies like Cisco, Sun Microsystems, and Symantec. Her experience ranges from product and partner marketing to market analysis and sales enablement, and she has a keen interest in new technologies in the space. With an MBA from Santa Clara University, she brings a balanced acumen of technical and marketing skills to understand and position solutions that help customers in today’s ever-changing, complex IT landscape. When she is not working, she enjoys traveling, hiking, photography, and spending time with her family.