The Logical Data Warehouse - Architecture, Design and Technology



Two-day practical seminar full of insights and advice about this modern and flexible architecture

24-25 October 2018 (14-21h)
Location: Parker Hotel (Diegem)
Presented in English by Rick van der Lans
Price: 1350 EUR (excl. 21% VAT)

This event has taken place. Please check out the List of Upcoming Seminars, or send us an email.


 Learning Objectives

Why this seminar?

Business intelligence has changed dramatically in recent years. The time-to-market for new reports and analyses has to be shortened, new data sources have to be made available to business users more quickly, self-service BI and data science must be supported, more and more users want to work with zero-latency data, the adoption of new technologies such as Hadoop, Spark, and NoSQL must be easy, and analysis of streaming data and big data is required.

The classic data warehouse architecture has served many organizations well. But it is not the right architecture for this new world of BI. It is time for organizations to migrate gradually to a more flexible architecture: the logical data warehouse architecture. This architecture, introduced by Gartner, is based on a decoupling of reporting and analyses on the one hand, and data sources on the other hand.

Classic data warehouse architectures are made up of a chain of databases, such as the staging area, the central data warehouse, and several data marts, plus countless ETL programs needed to pump data through the chain. Integrating self-service BI products with this architecture is not easy, and certainly not if users want to access the source systems. Delivering 100% up-to-date data to support operational BI is difficult to implement. And how do we embed new storage technologies into this architecture?

With the logical data warehouse architecture, new data sources can be hooked up to the data warehouse more quickly, self-service BI can be supported correctly, operational BI is easy to implement, the adoption of new technology is much easier, and the processing of big data becomes an evolution rather than a technological revolution.

The technology to create a logical data warehouse is available, and many organizations have already completed the migration successfully; a migration that is based on a step-by-step process and not on a full rip-and-replace approach.

In this practical seminar, the architecture is explained and products are discussed. It covers how organizations can migrate their existing architecture to this new one. Tips and design guidelines are given to help make this migration as efficient as possible.

What will you learn at this seminar?

  • What are the practical advantages of the logical data warehouse architecture?
  • How can your organization migrate stepwise and successfully?
  • What are the possibilities and limitations of the various tools?
  • How do the data virtualization products work?
  • How can big data be added in a transparent way to your existing BI environment?
  • How can self-service BI be integrated with the classic forms of BI?
  • How can users access 100% up-to-date data without disturbing the operational systems?
  • What are the real-life experiences of organizations with logical data warehousing?

Who should attend this seminar?

This two-day seminar is aimed at everyone who needs to stay informed about the latest developments in business intelligence and data warehousing, such as business intelligence specialists; data analysts; data warehouse designers; business analysts; data scientists; technology planners; technical architects; enterprise architects; IT consultants; IT strategists; systems analysts; database developers; database administrators; solutions architects; data architects; and IT managers.

Some knowledge of the classic data warehouse architecture is a plus.

You get a copy of the most recent edition of the book "Data Virtualization for Business Intelligence Systems", written by Rick van der Lans.

 Full Programme

Presented by Rick van der Lans

13.30h - 14.00h
Registration on the first day (24 October). Welcoming the participants with coffee/tea and croissants
14.00h

1. Challenges of the classic data warehouse

  • Integrating big data with existing data and making it available for reporting and analytics
  • Supporting self-service BI and self-service data preparation
  • Faster time-to-market for reports
  • Polyglot persistence – processing data stored in classic SQL, Hadoop, and NoSQL systems
  • Operational Business Intelligence, or analyzing zero-latency data

2. The Logical Data Warehouse Architecture (LDWA)

  • The essence: decoupling of reporting and data sources
  • From batch-integration to on-demand integration of data
  • The impact on flexibility and productivity – an improved time-to-market for reports
  • Examples of organizations operating a logical data warehouse
  • Can a logical data warehouse really work without a physical data warehouse?

3. Implementing a Logical Data Warehouse with Data Virtualization Servers

  • Why data virtualization?
  • Market overview: AtScale, Cirro Data Hub, Data Virtuality, Denodo Platform, FraXses, IBM Data Virtualization Manager for z/OS, Red Hat JBoss Data Virtualization, Stone Bond Enterprise Enabler, and TIBCO Data Virtualization
  • Importing non-relational data, such as XML and JSON documents, web services, NoSQL, and Hadoop data
  • The importance of an integrated business glossary and centralization of metadata specifications

4. Improving the Query Performance of Data Virtualization Servers

  • How does caching really work?
  • Using caching to minimize interference on transactional systems
  • Speeding up queries by caching data in analytical SQL database servers
  • Which virtual tables should be cached?
  • Query optimization techniques and the explain feature
  • Smart drivers/connectors can help improve query performance
  • How can SQL-on-Hadoop engines speed up query performance?
  • Working with multiple data virtualization servers in a distributed environment to minimize network traffic
17.45h - 18.45h
Dinner Buffet

5. Migrating to a Logical Data Warehouse

  • An A to Z roadmap
  • Guidelines for the development of a logical data warehouse
  • Three different methods for modeling: outside-in, inside-out, and middle-out
  • The value of a canonical data model
  • Considerations for security aspects
  • Step by step dismantling of the existing architecture
  • The focus on sharing of metadata specifications for integration, transformation, and cleansing

6. Self-Service BI and the Logical Data Warehouse

  • Why self-service BI can lead to "report chaos"
  • Centralizing and reusing metadata specifications with a logical data warehouse
  • Upgrading self-service BI into managed self-service BI
  • Implementing Gartner’s bimodal BI environment
21.00h
End of Day 1
25 October 2018, 13.30h - 14.00h
Welcoming the participants with coffee/tea and croissants

7. Big Data and the Logical Data Warehouse

  • New data storage technologies for big data, including Hadoop, MongoDB, Cassandra
  • The appearance of the polyglot persistent environment, in which each application uses its own optimal database technology
  • Design rules to integrate big data and the data warehouse seamlessly
  • Big data is too "big" to copy
  • Offloading cold data with a logical data warehouse

8. Physical Data Lakes or Virtual Data Lakes?

  • What is a Data Lake?
  • Is developing a physical Data Lake realistic when working with Big Data?
  • Developing a virtual Data Lake with data virtualization servers
  • Can the logical Data Warehouse and the virtual Data Lake be combined?

9. Implementing Operational BI with a Logical Data Warehouse

  • Examples of operational reporting and operational analytics
  • Extending a logical data warehouse with operational data for real-time analytics
  • "Streaming" data in a logical data warehouse
  • The coupling of data replication and data virtualization
17.45h - 18.45h
Dinner Buffet

10. Making Data Vault more Flexible with a Logical Data Warehouse

  • What exactly is Data Vault?
  • Using a Logical Data Warehouse to make data in a Data Vault available for reporting and analytics
  • The structured SuperNova design technique to develop virtual data marts
  • SuperNova turns a Data Vault into a flexible database

11. The Logical Data Warehouse and the Environment

  • Design principles to define data quality rules in a logical data warehouse
  • How data preparation can be integrated with a logical data warehouse
  • Shifting of tasks in the BICC
  • Which new development and design skills are important?
  • The impact on the entire design and development process

12. Closing Remarks

21.00h
End of this seminar


 Speakers


Rick van der Lans (R20/Consultancy BV)

Rick van der Lans is a highly-respected independent analyst, consultant, author, and internationally acclaimed lecturer specialising in data warehousing, business intelligence, big data, and database technology.

He has presented countless seminars, webinars, and keynotes at industry-leading conferences. For many years, he has served as the chairman of the annual European Enterprise Data and Business Intelligence Conference in London and the annual Data Warehousing and Business Intelligence Summit in The Netherlands.

Rick helps clients worldwide to design their data warehouse, big data, and business intelligence architectures and solutions, and assists them with selecting the right products. He has been influential in introducing the new logical data warehouse architecture worldwide, which helps organisations to develop more agile business intelligence systems.

Over the years, Rick has written hundreds of articles and blogs for newspapers and websites and has authored many educational and popular white papers for a long list of vendors. He was the author of the first available book on SQL, entitled Introduction to SQL, which has been translated into several languages, with more than 100,000 copies sold. More recently, he published his book Data Virtualization for Business Intelligence Systems.

Questions about this? Interested but you can't attend? Send us an email!
