Data Fabric

What is Data Fabric?

Data fabric is a design concept that serves as an integrated layer (fabric) of data and connecting processes. It accelerates insights by automating ingestion, curation, discovery, preparation, and integration across data silos.

How does data fabric work?

Data fabric creates a semantic layer that accelerates the delivery of data and insights by automating key processes, increasing agility while engaging business users and analysts in the data preparation process. 

How is data fabric used?

Data fabric is not tied to a specific data process, use case, geographic location, or platform. Its design ensures that all types of data can be easily accessed and governed. It adds a layer of supporting information to data lake environments, reducing the need for point-to-point integration for certain users and processes.

At its most useful, data fabric enables faster, more robust, and more reliable results when applying AI capabilities across large amounts of data, regardless of data type or location.

What can data fabric do?

Data fabric abstracts infrastructure, breaking the direct tie between data and specific infrastructure and thereby breaking down silos. It automates data ingestion, data curation, and the integration of diverse data sources, simplifying data analytics and insight delivery for business success. It minimizes complexity by automating processes, workflows, and pipelines, streamlining data flows to simplify deployment.

Data fabric use cases

  • 360-degree view of the customer: Data fabric helps companies identify customers’ likes, dislikes, circle of friends, buying patterns, and past orders. It helps companies gauge customer satisfaction, predict churn, and personalize experiences that are critical for business success.
  • Internet of Things (IoT) analytics: Data fabric offers the ability to efficiently store, process, and access large volumes of IoT data from sensors, devices, and switches through automation and machine learning technologies. It enables analytics by streaming data from other data platforms and integrating with data lakes to deliver operational insights.
  • Real-time and advanced analytics: Data fabric supports pervasive analytics through automation, curation, and intelligence, used in applications such as fraud detection and risk management. These applications benefit from additional data signals to identify patterns in near real time (a minimal sketch of this streaming pattern follows this list).
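
The analytics use cases above ultimately reduce to evaluating signals over a stream of events within short time windows. The following is a minimal, product-agnostic sketch of that pattern in Python; the event fields, window length, and alert threshold are invented for illustration and would come from a real event stream and business rules in practice.

    from collections import defaultdict, deque
    from statistics import mean

    # Hypothetical sensor events as they might arrive from a message stream.
    # In practice these would be consumed from a streams topic; they are
    # hard-coded here so the sketch runs on its own.
    EVENTS = [
        {"device": "pump-1", "ts": 0.0, "temp_c": 61.0},
        {"device": "pump-1", "ts": 1.0, "temp_c": 62.5},
        {"device": "pump-2", "ts": 1.5, "temp_c": 40.2},
        {"device": "pump-1", "ts": 2.0, "temp_c": 88.0},  # anomalous spike
        {"device": "pump-2", "ts": 2.5, "temp_c": 41.0},
    ]

    WINDOW_SECONDS = 60   # sliding-window length (assumed for the example)
    SPIKE_FACTOR = 1.3    # flag readings 30% above the recent average (assumed)

    windows = defaultdict(deque)  # per-device window of (timestamp, temperature)

    def process(event):
        """Compare a reading against its device's recent window, then record it."""
        window = windows[event["device"]]

        # Drop readings that have aged out of the sliding window.
        while window and event["ts"] - window[0][0] > WINDOW_SECONDS:
            window.popleft()

        # Alert in near real time if the new reading spikes above the recent mean.
        if window:
            baseline = mean(temp for _, temp in window)
            if event["temp_c"] > SPIKE_FACTOR * baseline:
                print(f"ALERT {event['device']}: {event['temp_c']} C vs recent mean {baseline:.1f} C")

        window.append((event["ts"], event["temp_c"]))

    for ev in EVENTS:
        process(ev)

Running the sketch flags only the pump-1 spike, illustrating how window-based signals surface anomalies as events arrive rather than after a batch load.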

HPE and data fabric

High-performance hybrid analytics should not require migrating data or working with a limited data set. HPE Ezmeral Data Fabric unifies data across data lakes, files, objects, streams, and databases into one data infrastructure and filesystem. It integrates existing files, objects, streams, and databases into a single technology base with unified security and management, reducing data silos.

Edge-to-cloud topologies are created and accessed through a single global namespace for simplified data access from any application or interface, regardless of where the data resides. A persistent data store simplifies both coding and data analytic models by providing cross-protocol data access via native S3, NFS, POSIX, REST, HDFS, and container storage interface (CSI) APIs.
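
As an illustration of what cross-protocol access through a single namespace can look like from application code, the sketch below reads the same data once through a POSIX file path and once through an S3-compatible API. The mount point, endpoint URL, bucket, and credentials are placeholders assumed for the example, not values taken from HPE documentation.

    import boto3  # AWS SDK for Python; can target any S3-compatible endpoint

    # Placeholder values assumed for this sketch.
    POSIX_PATH = "/mnt/datafabric/sensor-data/readings.csv"
    S3_ENDPOINT = "https://data-fabric.example.com:9000"
    BUCKET, KEY = "sensor-data", "readings.csv"

    # 1) POSIX-style access: the volume is mounted like an ordinary filesystem,
    #    so standard file I/O works unchanged.
    with open(POSIX_PATH, "rb") as f:
        posix_bytes = f.read()

    # 2) S3-style access: the same object is addressed through an S3-compatible
    #    API, so existing object-storage tooling can work against the same data.
    s3 = boto3.client(
        "s3",
        endpoint_url=S3_ENDPOINT,
        aws_access_key_id="ACCESS_KEY",        # placeholder credentials
        aws_secret_access_key="SECRET_KEY",
    )
    s3_bytes = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()

    # Both paths should return the same bytes, because they resolve to the same
    # underlying data rather than to separate copies.
    assert posix_bytes == s3_bytes

The point of the sketch is the design choice it reflects: applications keep their existing protocols and interfaces, while the global namespace resolves them to a single persistent data store.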