
Page 1

Building Advanced Analytical Architectures for Big Data Using Logical Data Warehouse and Data Lakes

Ravi Shankar, CMO

October 2016

Page 2

Speakers

Ravi Shankar

Chief Marketing Officer

Page 3

Agenda

1. The Logical Data Warehouse
2. Common Patterns for Logical Data Warehouse / Data Lakes
3. Performance, Security, and Governance Considerations
4. Q&A

Page 4

What is a Logical Data Warehouse?

A logical data warehouse is a data system that follows the ideas of a traditional EDW (star or snowflake schemas) and includes, in addition to one (or more) core DWs, data from external sources.

The main motivations are improved decision making and/or cost reduction.

Page 5

Logical Data Warehouse (Gartner Definition)

Description:

“The Logical Data Warehouse (LDW) is a new data management architecture for analytics combining the strengths of traditional repository warehouses with alternative data management and access strategy. The LDW will form a new best practice by the end of 2015.”

“The LDW is an evolution and augmentation of DW practices, not a replacement”

“A repository-only style DW contains a single ontology/taxonomy, whereas in the LDW a semantic layer can contain many combinations of use cases, many business definitions of the same information”

“The LDW permits an IT organization to make a large number of datasets available for analysis via query tools and applications.”

Source: Gartner Hype Cycle for Enterprise Information Management, 2012

Page 6

Logical Data Warehouse

Description:

A semantic layer on top of the data warehouse that keeps the business data definition.

Allows the integration of multiple data sources including enterprise systems, the data warehouse, additional processing nodes (analytical appliances, Big Data, …), Web, Cloud and unstructured data.

Publishes data to multiple applications and reporting tools.


Page 7

Three Integration/Semantic Layer Alternatives: Gartner’s View of Data Integration

Application/BI Tool as Data Integration/Semantic Layer

EDW as Data Integration/Semantic Layer

Data Virtualization as Data Integration/Semantic Layer

(Diagram: each alternative shown with the EDW and ODS sources underneath it)

Page 8

Data Virtualization as the Data Integration Layer

(Diagram: the Data Virtualization layer as the integration/semantic layer on top of the EDW and ODS)

• Move data integration and semantic layer to an independent Data Virtualization platform
• Purpose built for supporting data access across multiple heterogeneous data sources
• Separate layer provides semantic models for underlying data
  • Physical to logical mapping
• Enforces common and consistent security and governance policies
• Gartner’s recommended approach

Page 9

Logical Data Warehouse

(Diagram: a virtual layer integrating the EDW, a Hadoop cluster with HDFS files, document collections, a NoSQL database, an ERP system, databases, and Excel sources)

Page 10

The State and Future of Data Integration. Gartner, 25 May 2016

Physical data movement architectures that aren’t designed to support the dynamic nature of business change, volatile requirements and massive data volume are increasingly being replaced by data virtualization.

Evolving approaches (such as the use of LDW architectures) include implementations beyond repository-centric techniques.

Page 11

What about the Logical Data Lake?

A Data Lake will not have a star or snowflake schema, but rather a more heterogeneous collection of views with raw data from heterogeneous sources.

The virtual layer will act as a common umbrella under which these different sources are presented to the end user as a single system.

However, from the virtualization perspective, a Virtual Data Lake shares many technical aspects with an LDW, and most of this content also applies to a Logical Data Lake.

Page 12

Common Patterns for a Logical Data Warehouse

Page 13

Common Patterns for a Logical Data Warehouse

1. The Virtual Data Mart

2. DW + MDM

Data Warehouse extended with master data

3. DW + Cloud

Data Warehouse extended with cloud data

4. DW + DW

Integration of multiple Data Warehouses

5. DW historical offloading

DW horizontal partitioning with historical data in cheaper storage

6. Slim DW extension

DW vertical partitioning with rarely used data in cheaper storage

Page 14

Virtual Data Marts
Simplified semantic models for business users

Business friendly models defined on top of one or multiple systems, often “flavored” for a particular division

Motivation
Hide complexity of star schemas for business users
Simplify model for a particular vertical
Reuse semantic models and security across multiple reporting engines

Typical queries
Simple projections, filters and aggregations on top of curated “fat tables” that merge data from facts and many dimensions
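
A minimal sketch of this pattern, using SQLite as a stand-in for the EDW and a SQL view as a stand-in for the virtual "fat table" a data virtualization layer would publish; all table and column names are hypothetical.

```python
# Illustrative sketch only: SQLite stands in for the EDW, and a SQL view
# stands in for the virtual "fat table" exposed to business users.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales   (sale_id INTEGER, product_id INTEGER, retailer_id INTEGER,
                      sale_date TEXT, amount REAL);
CREATE TABLE product (product_id INTEGER, product_name TEXT, category TEXT);
CREATE TABLE retailer(retailer_id INTEGER, retailer_name TEXT, region TEXT);

-- Business-friendly "fat table": facts pre-joined with the dimensions,
-- so consumers see one flat model instead of the star schema.
CREATE VIEW sales_mart AS
SELECT s.sale_date, s.amount, p.product_name, p.category,
       r.retailer_name, r.region
FROM sales s
JOIN product  p ON p.product_id  = s.product_id
JOIN retailer r ON r.retailer_id = s.retailer_id;
""")

conn.execute("INSERT INTO sales    VALUES (1, 10, 100, '2016-10-01', 250.0)")
conn.execute("INSERT INTO product  VALUES (10, 'Widget', 'Hardware')")
conn.execute("INSERT INTO retailer VALUES (100, 'Acme Stores', 'EMEA')")

# Typical data mart query: a simple filter/aggregation over the flat view.
for row in conn.execute(
        "SELECT region, SUM(amount) FROM sales_mart GROUP BY region"):
    print(row)
```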

Page 15

Virtual Data Marts

(Diagram: virtual Sales and Product views built from the time, product, and retailer dimensions and the sales fact table in the EDW, plus product details from other sources)

Page 16

DW + MDM

Slim dimensions with extended information maintained in an external MDM system

Motivation
Keep a single copy of golden records in the MDM that can be reused across systems and managed in a single place

Typical queries
Join a large fact table (DW) with several MDM dimensions, aggregations on top

Example
Revenue by customer, projecting the address from the MDM
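
A hedged sketch of the DW + MDM pattern: two SQLite files stand in for the EDW and the MDM hub, and SQLite's ATTACH stands in for the federated join that the virtualization layer would perform; all names and data are hypothetical.

```python
# Sketch only: creates two small files in the working directory as stand-ins
# for the EDW and the MDM system; this is not Denodo's actual mechanism.
import sqlite3

dw = sqlite3.connect("edw.db")
dw.execute("CREATE TABLE IF NOT EXISTS sales (customer_id INTEGER, amount REAL)")
dw.execute("DELETE FROM sales")
dw.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 100.0), (1, 50.0), (2, 75.0)])
dw.commit()

mdm = sqlite3.connect("mdm.db")
mdm.execute("CREATE TABLE IF NOT EXISTS customer (customer_id INTEGER, name TEXT, address TEXT)")
mdm.execute("DELETE FROM customer")
mdm.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                [(1, "Alice", "1 Main St"), (2, "Bob", "2 Oak Ave")])
mdm.commit()
mdm.close()

# Federated query: the large fact table in the DW joined with the golden
# customer record (including the address) kept only in the MDM system.
dw.execute("ATTACH DATABASE 'mdm.db' AS mdm")
query = """
SELECT c.name, c.address, SUM(s.amount) AS revenue
FROM sales s JOIN mdm.customer c ON c.customer_id = s.customer_id
GROUP BY c.name, c.address
"""
for row in dw.execute(query):
    print(row)
```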

Page 17

DW + MDM dimensions

(Diagram: the sales fact table and dimensions in the EDW joined with dimensions maintained in the MDM system)

Page 18

DW + Cloud dimensional data

Fresh data from cloud systems (e.g. SFDC) is mixed with the EDW, usually on the dimensions. The DW is sometimes also in the cloud.

Motivation
Take advantage of “fresh” data coming straight from SaaS systems
Avoid local replication of cloud systems

Typical queries
Dimensions are joined with cloud data to filter based on some external attribute not available (or not current) in the EDW

Example
Report on current revenue for accounts where the potential for an expansion is higher than 80%
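
A small illustrative sketch of this example, with a hard-coded list standing in for the live SaaS call (e.g. a Salesforce query) and SQLite standing in for the EDW; the account fields and names are hypothetical.

```python
# Sketch only: the cloud source is simulated so the example stays runnable.
import sqlite3

def fetch_cloud_accounts():
    # Stand-in for a live SaaS call returning fresh account attributes
    # that are not (or not current) in the EDW.
    return [
        {"account_id": 1, "expansion_potential": 0.92},
        {"account_id": 2, "expansion_potential": 0.40},
    ]

edw = sqlite3.connect(":memory:")
edw.execute("CREATE TABLE revenue (account_id INTEGER, amount REAL)")
edw.executemany("INSERT INTO revenue VALUES (?, ?)",
                [(1, 1000.0), (2, 2500.0)])

# Filter warehouse facts by an attribute only the cloud source knows:
# current revenue for accounts with expansion potential above 80%.
hot_accounts = [a["account_id"] for a in fetch_cloud_accounts()
                if a["expansion_potential"] > 0.80]
placeholders = ",".join(["?"] * len(hot_accounts))
sql = (f"SELECT account_id, SUM(amount) FROM revenue "
       f"WHERE account_id IN ({placeholders}) GROUP BY account_id")
for row in edw.execute(sql, hot_accounts):
    print(row)
```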

Page 19

DW + Cloud dimensional data

(Diagram: the sales fact table and time/product dimensions in the EDW joined with a customer dimension that combines CRM data with SFDC customer data)

Page 20

Multiple DW integration
Use of multiple DWs as if they were only one

Motivation
Mergers and acquisitions
Different DWs by department
Transition to new EDW deployments (migration to Spark, Redshift, etc.)

Typical queries
Joins across fact tables in different DWs, with aggregations before or after the JOIN

Example
Get customers with purchases higher than 100 USD that do not have a fidelity card (purchases and fidelity card data live in different DWs)
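
An illustrative sketch of the example above. The federation is done by hand in Python to make the query pattern visible, whereas a data virtualization server would plan and execute it automatically; both warehouses are in-memory SQLite stand-ins and all names are hypothetical.

```python
import sqlite3

finance = sqlite3.connect(":memory:")          # stands in for the Finance DW
finance.execute("CREATE TABLE purchases (customer_id INTEGER, amount REAL)")
finance.executemany("INSERT INTO purchases VALUES (?, ?)",
                    [(1, 120.0), (2, 60.0), (3, 300.0)])

marketing = sqlite3.connect(":memory:")        # stands in for the Marketing DW
marketing.execute("CREATE TABLE fidelity_cards (customer_id INTEGER)")
marketing.executemany("INSERT INTO fidelity_cards VALUES (?)", [(1,), (4,)])

# Aggregate on each side first (less data to move), then combine:
# customers with purchases over 100 USD that have no fidelity card.
big_spenders = {cid for (cid, total) in finance.execute(
    "SELECT customer_id, SUM(amount) FROM purchases GROUP BY customer_id")
    if total > 100}
card_holders = {cid for (cid,) in marketing.execute(
    "SELECT customer_id FROM fidelity_cards")}

print(sorted(big_spenders - card_holders))     # -> [3]
```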

Page 21

Multiple DW integration

(Diagram: sales facts with time, product, and region dimensions in the Finance EDW joined with customer fidelity facts and product, city, and store dimensions in the Marketing EDW)

*Real examples: Nationwide POC, IBM tests

Page 22

DW Historical Partitioning
Horizontal partitioning

Only the most current data (e.g. the last year) is in the EDW. Historical data is offloaded to a Hadoop cluster.

Motivations
Reduce storage cost
Transparently use the two datasets as if they were all together

Typical queries
Facts are defined as a partitioned UNION based on date
Queries join the “virtual fact” with dimensions and aggregate on top

Example
Queries on current dates only need to go to the DW, but longer timespans need to merge with Hadoop
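
A minimal sketch of the partitioned "virtual fact", assuming two SQLite tables standing in for the EDW and the Hadoop store; names and dates are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE current_sales    (sale_date TEXT, amount REAL);  -- stands in for the EDW (last year)
CREATE TABLE historical_sales (sale_date TEXT, amount REAL);  -- stands in for Hadoop/HDFS

INSERT INTO current_sales    VALUES ('2016-09-30', 200.0);
INSERT INTO historical_sales VALUES ('2012-01-15',  50.0);

-- Partitioned virtual fact: consumers query one table regardless of
-- where the rows physically live.
CREATE VIEW sales AS
SELECT sale_date, amount FROM current_sales
UNION ALL
SELECT sale_date, amount FROM historical_sales;
""")

# A query restricted to recent dates only needs the EDW partition; a good
# optimizer can prune the historical branch based on the date predicate.
for row in conn.execute(
        "SELECT SUM(amount) FROM sales WHERE sale_date >= '2016-01-01'"):
    print(row)
```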

Page 23

DW Historical Offloading: Horizontal Partitioning

(Diagram: the virtual sales fact table is the union of Current Sales in the EDW and Historical Sales in cheaper storage, joined with the time, product, and retailer dimensions)

Page 24

Slim DW extension
Vertical partitioning

Minimal DW, with more complete raw data in a Hadoop cluster

Motivation
Reduce cost
Transparently use the two datasets as if they were all together

Typical queries
Tables are defined virtually as 1-to-1 joins between the two systems
Queries join the facts with dimensions and aggregate on top

Example
Common queries only need to go to the DW, but some queries need attributes or measures from Hadoop
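
A minimal sketch of the 1-to-1 join view, again with SQLite tables standing in for the slim EDW fact and the extended store; all names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE slim_sales     (sale_id INTEGER PRIMARY KEY, sale_date TEXT, amount REAL);
CREATE TABLE extended_sales (sale_id INTEGER PRIMARY KEY, coupon_code TEXT, notes TEXT);

INSERT INTO slim_sales     VALUES (1, '2016-10-01', 99.0);
INSERT INTO extended_sales VALUES (1, 'FALL16', 'gift wrap requested');

-- Virtual wide fact: a 1-to-1 join recombining the two physical halves.
CREATE VIEW sales AS
SELECT s.sale_id, s.sale_date, s.amount, e.coupon_code, e.notes
FROM slim_sales s JOIN extended_sales e ON e.sale_id = s.sale_id;
""")

# Queries that only touch slim columns can be answered by the EDW alone;
# this one needs an attribute that lives in the extended store.
for row in conn.execute("SELECT sale_date, amount, coupon_code FROM sales"):
    print(row)
```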

Page 25

Slim DW Extension: Vertical Partitioning

(Diagram: the virtual sales fact table is a 1-to-1 join of Slim Sales in the EDW with Extended Sales in cheaper storage, joined with the time, product, and retailer dimensions)

Page 26

Performance in an LDW

Page 27

It is a common assumption that a virtualized solution will be much slower than a persisted approach via ETL:

1. There is a large amount of data moved through the network for each query
2. Network transfer is slow

But is this really true?

Page 28

Debunking the myths of virtual performance

1. Complex queries can be solved transferring moderate data volumes when the right techniques are applied
   Operational queries: predicate delegation produces small result sets
   Logical Data Warehouse and Big Data: Denodo uses characteristics of the underlying star schemas to apply query rewriting rules that maximize delegation to the specialized sources (especially heavy GROUP BY operations) and minimize data movement (see the sketch below)

2. Current networks are almost as fast as reading from disk
   10 Gigabit and 100 Gigabit Ethernet are a commodity
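
A worked sketch of the aggregation push-down idea from point 1: instead of moving every fact row to the integration layer, the GROUP BY is delegated to the source and only the small aggregated result is transferred and combined with the dimension. SQLite and all names here are hypothetical stand-ins, not Denodo's actual rewriting engine.

```python
import sqlite3

facts = sqlite3.connect(":memory:")            # stands in for the big-data source
facts.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
facts.executemany("INSERT INTO sales VALUES (?, ?)",
                  [(i % 3, 10.0) for i in range(9_000)])   # 9,000 fact rows

dims = {0: "Carol", 1: "Alice", 2: "Bob"}      # small customer dimension elsewhere

# Naive federation would pull all 9,000 rows and aggregate in the virtual layer.
# The push-down rewrite delegates the GROUP BY and pulls only 3 aggregated rows.
pushed = facts.execute(
    "SELECT customer_id, SUM(amount) FROM sales GROUP BY customer_id").fetchall()
print(f"rows moved after push-down: {len(pushed)} (instead of 9,000)")
for customer_id, total in pushed:
    print(dims[customer_id], total)
```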

Page 29

Performance Comparison: Logical Data Warehouse vs. Physical Data Warehouse

Denodo has done extensive testing using queries from the standard benchmarking test TPC-DS* and the following scenario:

It compares the performance of a federated approach in Denodo with an MPP system where all the data has been replicated via ETL. The same schema is used on both sides: Customer Dim. (2 M rows), Sales Facts (290 M rows), and Items Dim. (400 K rows).

* TPC-DS is the de facto industry standard benchmark for measuring the performance of decision support solutions including, but not limited to, Big Data systems.

Page 30

Performance Comparison: Logical Data Warehouse vs. Physical Data Warehouse

Query Description | Returned Rows | Time Netezza | Time Denodo (Federated Oracle, Netezza & SQL Server) | Optimization Technique (automatically selected)
Total sales by customer | 1.99 M | 20.9 sec. | 21.4 sec. | Full aggregation push-down
Total sales by customer and year between 2000 and 2004 | 5.51 M | 52.3 sec. | 59.0 sec. | Full aggregation push-down
Total sales by item brand | 31.35 K | 4.7 sec. | 5.0 sec. | Partial aggregation push-down
Total sales by item where sale price less than current list price | 17.05 K | 3.5 sec. | 5.2 sec. | On-the-fly data movement

Page 31

Performance and optimizations in Denodo: Focused on 3 core concepts

Dynamic Multi-Source Query Execution Plans
Leverages processing power & architecture of data sources
Dynamic to support ad hoc queries
Uses statistics for cost-based query plans

Selective Materialization
Intelligent caching of only the most relevant and often used information (sketched below)

Optimized Resource Management
Smart allocation of resources to handle high concurrency
Throttling to control and mitigate source impact
Resource plans based on rules
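
A hedged sketch of the selective materialization idea referenced above: the result of an expensive federated aggregation is cached locally the first time it is requested and served from the cache afterwards. This only illustrates the concept; Denodo's cache engine is configured declaratively, and all names here are hypothetical.

```python
import sqlite3

source = sqlite3.connect(":memory:")           # stands in for a slow remote source
source.execute("CREATE TABLE sales (region TEXT, amount REAL)")
source.executemany("INSERT INTO sales VALUES (?, ?)",
                   [("EMEA", 100.0), ("EMEA", 50.0), ("AMER", 75.0)])

cache = sqlite3.connect(":memory:")            # local cache store
cache.execute("CREATE TABLE sales_by_region_cache (region TEXT, total REAL)")

def sales_by_region():
    # Serve from the cache when it has been populated; otherwise hit the
    # source once and materialize the small, frequently used result.
    rows = cache.execute("SELECT region, total FROM sales_by_region_cache").fetchall()
    if rows:
        return rows
    rows = source.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall()
    cache.executemany("INSERT INTO sales_by_region_cache VALUES (?, ?)", rows)
    return rows

print(sales_by_region())   # first call hits the source and fills the cache
print(sales_by_region())   # second call is answered from the cache
```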

Page 32

Performance and optimizations in Denodo: Comparing optimizations in DV vs. ETL

Although Data Virtualization is a data integration platform, architecturally speaking it is more similar to an RDBMS:
Uses relational logic
Metadata is equivalent to that of a database
Enables ad hoc querying

Key difference between ETL engines and DV:
ETL engines are optimized for static bulk movements (fixed data flows)
Data virtualization is optimized for queries (a dynamic execution plan per query)

Therefore, the performance architecture presented here resembles that of an RDBMS.

Page 33

Security & Privacy: Challenges of Data Services

Page 34

Security in Denodo: Role-Based Granular Privileges

Page 35

Security in Denodo: Advanced Selective Data Masking

Page 36

Security in Denodo: Advanced Selective Data Masking (continued)

Page 37

Data Governance & Veracity: Challenges of Data Services

Page 38

Enterprise Data Governance

Data lineage: understand the “source of truth” and the transformations of every piece of data in the model

Page 39

#DenodoDataFest
Rapid, Agile Data Strategies for Accelerating Analytics, Cloud, and Big Data Initiatives

October 18, 2016, San Francisco Bay Area, CA
8:30am – 6:00pm PT
Attend in person or watch it online
Register: DenodoDataFest.com

Page 40

Q&A

Page 41

Thanks!

www.denodo.com [email protected]

© Copyright Denodo Technologies. All rights reserved. Unless otherwise specified, no part of this PDF file may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying and microfilm, without prior written authorization from Denodo Technologies.