In today’s digital economy, the customer is all-powerful: they can switch loyalty in a single click from a mobile device. The internet has made loyalty cheap, and many CEOs want new data to enrich what they already know about customers in order to keep them loyal and offer them a more personalised service. In addition, companies are capturing new data from sensors to gain insight into what is happening and to optimise business operations. This new data has led many companies with traditional data warehouses and data marts to realise that these alone are not enough for analytics. Other systems are needed, and with the pace of change quickening, lower-latency data and machine learning are in demand everywhere. All of it is needed to remain competitive.
So how do you modernise your analytical set-up to improve governance and agility, bring in new data, re-use data assets, modernise your data warehouse to easily accommodate change, lower data latency, and integrate with other analytical workloads to provide a modern data warehouse for the digital enterprise?
This new 2-day seminar looks at why you need to do this. It discusses the tools and techniques needed to capture new data types, establish new data pipelines across cloud and on-premises systems, produce re-usable data assets, modernise your data warehouse, and bring together the data and analytics needed to accelerate time to value.
CDOs, CIOs, CTOs, IT managers, business analysts, data scientists, BI managers, data warehousing professionals, enterprise architects, data architects
After two days attendees will:
THE TRADITIONAL DATA WAREHOUSE AND WHY IT NEEDS TO BE MODERNISED
For most organisations today, the data warehouse is based on a waterfall-style architecture, with data flowing from source systems into operational data stores and staging areas, then on to data warehouses, under the management of batch ETL jobs. However, the analytical landscape has changed. New data sources continue to grow, with data now being collected in edge devices, cloud storage, cloud or on-premises NoSQL data stores, and Hadoop systems as well as data warehouse staging areas. Hadoop, Spark, streaming data platforms and graph databases are also now used in data analysis. In addition, many business units are using the cloud to exploit these new analytical technologies quickly and at lower cost.
This opening session looks at these new activities and explains why data warehouses have to change, not only to speed up development, improve agility and reduce costs, but also to exploit new data, enable self-service data preparation, utilise advanced analytics and integrate with these other analytical platforms.
MODERN DATA WAREHOUSE REQUIREMENTS
This session looks at the key building blocks of a modern data warehouse that need to be in place for flexibility and agility.
MODERN DATA MODELLING TECHNIQUES FOR AGILE DATA WAREHOUSING
To improve agility, change-friendly data modelling techniques have emerged and are becoming increasingly popular in designing modern data warehouses. This session looks at data modelling and asks: Is the star schema dead? Which data warehouse modelling technique is best suited to handling change? Should you use Data Vault? Does data warehouse design need to change? Does data mart design need to change? It also looks at the disadvantages of such techniques and how you can overcome them.
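To make the Data Vault idea concrete, the sketch below models a hub (stable business keys) and a satellite (descriptive attributes) for a hypothetical customer entity. The table names, columns and hashing choice are illustrative assumptions, not a prescription from any specific tool; the point is that new attributes land in new satellites, so existing structures never need to change.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    """Deterministic surrogate key, as commonly used for Data Vault hubs.
    MD5 is shown only for brevity; real designs vary."""
    return hashlib.md5(business_key.encode()).hexdigest()

@dataclass(frozen=True)
class HubCustomer:
    customer_hk: str    # hash of the business key
    customer_id: str    # business key from the source system
    load_ts: str
    record_source: str

@dataclass(frozen=True)
class SatCustomerDetails:
    customer_hk: str    # foreign key back to the hub
    name: str
    email: str
    load_ts: str
    record_source: str

def load_customer(customer_id: str, name: str, email: str, source: str = "CRM"):
    """Split one source record into its hub and satellite parts."""
    ts = datetime.now(timezone.utc).isoformat()
    hk = hash_key(customer_id)
    return (HubCustomer(hk, customer_id, ts, source),
            SatCustomerDetails(hk, name, email, ts, source))

hub, sat = load_customer("C-1001", "Ada Lopez", "ada@example.com")
assert hub.customer_hk == sat.customer_hk  # hub and satellite share the key
```

Adding, say, a customer loyalty tier later would mean creating a new satellite keyed on the same hash key, leaving `HubCustomer` and `SatCustomerDetails` untouched; that is the change-friendliness the session discusses.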
MODERNISING YOUR ETL PROCESSING
This session looks at the challenges posed by new data on ETL processing. What options are available to modernise ETL processing, where should it run, and what are the pros and cons of each option? How does this impact your data architecture?
ACCELERATING ETL PROCESSING USING A MULTI-PURPOSE DATA LAKE & DATA CATALOG
This session looks at how you can use a multi-purpose data lake to accelerate ETL processing and the integration of data for your data warehouse.
o Turning OLTP change data capture into Kafka data streams
o Linking Kafka and ETL tools to process data in real-time
o Running ETL processing at the edge vs. in the cloud or the data centre
o Future proofing streaming ETL processing using Apache Beam
o Ingesting streaming data into your data lake
• Real-time data warehouse – Integrating your data warehouse with streaming data – external tables, data virtualisation and data lake
• Using ETL data pipelines to produce re-usable data assets for use in your data warehouse and other analytical data stores
• Publishing reusable data in a catalog ready for consumption
• Using Data Science to develop new analytical models to run in your data warehouse
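The first steps above (CDC events flowing through a stream into a target) can be sketched in a few lines. This is a stdlib-only simulation: in practice tools such as Debezium and Kafka carry the events, and the event shape shown here is a hypothetical, simplified change payload, not any tool's actual format.

```python
import json

# Hypothetical, simplified CDC events as they might arrive on a Kafka topic:
# each carries an operation (insert/update/delete), a key, and the new row image.
cdc_events = [
    json.dumps({"op": "insert", "key": 1, "row": {"id": 1, "status": "new"}}),
    json.dumps({"op": "update", "key": 1, "row": {"id": 1, "status": "shipped"}}),
    json.dumps({"op": "insert", "key": 2, "row": {"id": 2, "status": "new"}}),
    json.dumps({"op": "delete", "key": 1, "row": None}),
]

def apply_cdc(target: dict, raw_event: str) -> dict:
    """Apply one change event to an in-memory 'table' keyed by primary key."""
    event = json.loads(raw_event)
    if event["op"] == "delete":
        target.pop(event["key"], None)
    else:  # insert and update are both upserts into the target
        target[event["key"]] = event["row"]
    return target

table = {}
for e in cdc_events:
    apply_cdc(table, e)

print(table)  # only key 2 remains; key 1 was deleted after its update
```

The same replay logic is what a streaming ETL job performs continuously against the data lake or warehouse, which is why ordering and exactly-once delivery matter so much in these pipelines.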
RAPID DATA WAREHOUSE DEVELOPMENT USING DATA WAREHOUSE AUTOMATION
In addition to a data lake, this session looks at how you can use metadata driven data warehouse automation tools to rapidly build, change and extend modern cloud and on premises Data Warehouses and data marts. It looks at how these tools help you adopt new modern data modelling techniques quickly, how they generate schemas and data integration jobs and how they can help you migrate to your new data warehouse systems on the cloud.
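The core idea behind such metadata-driven generation can be shown in miniature: schemas and jobs are derived from a metadata description rather than hand-coded. The metadata format below is a made-up illustration, not that of any particular automation product.

```python
# Hypothetical table metadata of the kind an automation tool might hold.
metadata = {
    "table": "dim_customer",
    "columns": [
        {"name": "customer_key", "type": "BIGINT", "nullable": False},
        {"name": "customer_name", "type": "VARCHAR(200)", "nullable": False},
        {"name": "country", "type": "VARCHAR(100)", "nullable": True},
    ],
}

def generate_ddl(meta: dict) -> str:
    """Generate CREATE TABLE DDL from metadata - the essence of
    metadata-driven data warehouse automation."""
    cols = ",\n".join(
        "  {name} {type}{null}".format(
            name=c["name"],
            type=c["type"],
            null="" if c["nullable"] else " NOT NULL",
        )
        for c in meta["columns"]
    )
    return "CREATE TABLE {table} (\n{cols}\n);".format(table=meta["table"], cols=cols)

print(generate_ddl(metadata))
```

Because the DDL (and, in real tools, the matching ETL jobs and documentation) is regenerated from metadata, changing the model means editing the metadata once rather than many hand-written artefacts, which is where the speed and migration benefits come from.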
BUILDING A MODERN DATA WAREHOUSE IN A CLOUD COMPUTING ENVIRONMENT
A key question for many organisations is what to do with the existing data warehouse. Should you try to change the existing set-up to make it more modern, or re-develop it in the cloud? This session looks at the advantages of building a modern data warehouse in a cloud computing environment using a cloud-based analytical relational DBMS.
SIMPLIFYING DATA ACCESS – CREATING VIRTUAL DATA MARTS AND A LOGICAL DATA WAREHOUSE ARCHITECTURE TO INTEGRATE BIG DATA WITH YOUR DATA WAREHOUSE
This session looks at how you can use data virtualisation software to modernise your data warehouse architecture, simplify access to and integration of data in your data warehouse and its underlying big data stores, and improve agility.
• What is data virtualisation?
• How does data virtualisation work?
• How can data virtualisation reduce cost of ownership, improve agility and modernise your data warehouse architecture?
• Simplifying your architecture by using data virtualisation to create Virtual Data Marts
• Migrating your physical data marts to virtual data marts to reduce cost of ownership
• Layering virtual tables on top of virtual marts to simplify business user access
• Publishing virtual views and queries as services in a catalog for consumption
• Integrating your data warehouse with your data lake and low latency data using external tables and data virtualisation
• Enabling rapid change management using data virtualisation
• Creating a logical data warehouse architecture that integrates data from big data platforms, graph databases, streaming data platforms and your data warehouse into a common access layer for easy access by BI tools and applications
• Using a business glossary and data virtualisation to create a common semantic layer with consistent common understanding across all BI tools
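The virtual data mart idea above boils down to mapping differently shaped sources onto one common schema at query time, with no data movement. The sketch below simulates this in plain Python; the source names, column names and schema are invented for illustration, and a real data virtualisation product would do this with SQL views over live connections.

```python
# Two hypothetical sources: a warehouse table and data lake files,
# both holding customer orders but in slightly different shapes.
warehouse_rows = [
    {"order_id": 100, "amount": 25.0},
    {"order_id": 101, "amount": 40.0},
]
lake_rows = [
    {"id": 102, "total": 12.5},  # same data, different column names
]

def virtual_orders():
    """A 'virtual table': each source is mapped onto one common schema
    at query time, without copying data into a new store."""
    for r in warehouse_rows:
        yield {"order_id": r["order_id"], "amount": r["amount"], "source": "warehouse"}
    for r in lake_rows:
        yield {"order_id": r["id"], "amount": r["total"], "source": "lake"}

# A BI tool now queries one logical table spanning both stores.
result = [r for r in virtual_orders() if r["amount"] > 20]
print(result)
```

Because consumers see only the common schema, a physical mart can later be retired or a new source added behind the view without changing any report, which is where the agility and cost-of-ownership arguments in this session come from.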
GETTING STARTED WITH DATA WAREHOUSE MODERNISATION
This final session looks at what you have to do to get started with a data warehouse modernisation initiative. In particular, it looks at:
Mike Ferguson is Managing Director of Intelligent Business Strategies Limited. As an independent analyst and consultant, he specialises in data management and analytics. With over 38 years of IT experience, Mike has consulted for dozens of companies. He has spoken at events all over the world and written numerous articles.
Mike is Chairman of Big Data LDN – the fastest growing Big Data conference in Europe, and chairman of the CDO Exchange. Formerly he was a principal and co-founder of Codd and Date Europe Limited – the inventors of the Relational Model, a Chief Architect at Teradata on the Teradata DBMS and European Managing Director of Database Associates. He teaches popular master classes in Analytics, Big Data, Data Governance & MDM, Data Warehouse Modernisation and Data Lake operations.
If you cannot attend the course, you may send someone else in your place. If you cancel less than 14 days before the course starts, we will charge 50% of the price. In the case of a no-show without cancellation, we will charge the full price. The cancellation fee also applies in the case of illness.