Data warehouses have a long history in decision support and business intelligence applications. While warehouses excel at structured data, many modern enterprises must also handle unstructured and semi-structured data, and data with high variety, velocity, and volume. Traditional data warehouses are not suited to many of these use cases, and they are rarely the most cost-efficient option.
As technology evolved, companies began building data lakes. While suitable for storing data, data lakes lack some critical features. Many of the promises of data lakes have not materialised, and in many cases adopting them has meant losing the benefits of data warehouses.
To address these issues, Databricks introduced the Lakehouse concept - a platform designed to combine the best of the data warehouse and the data lake in a single platform.
In this session, we will focus on how customers can use the platform to build a “modern data warehouse”, drawing on the capabilities of Delta Lake and Delta Engine to deliver the performance and reliability of the data warehouse with the scalability and elasticity of the data lake.
We will also show how customers can leverage existing modelling techniques, such as star schemas in SQL, to help accelerate the migration of existing workloads into Delta Lake.
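As a flavour of what this looks like in practice, the sketch below shows a minimal star schema defined directly on Delta Lake in Spark SQL. The table and column names are illustrative only, not taken from the session:

```sql
-- Hypothetical star schema on Delta Lake: one fact table and two
-- dimension tables, all stored in the Delta format.
CREATE TABLE dim_customer (
  customer_id   BIGINT,
  customer_name STRING,
  region        STRING
) USING DELTA;

CREATE TABLE dim_date (
  date_id        INT,
  calendar_date  DATE,
  fiscal_quarter STRING
) USING DELTA;

CREATE TABLE fact_sales (
  customer_id BIGINT,
  date_id     INT,
  quantity    INT,
  amount      DECIMAL(18, 2)
) USING DELTA;

-- A familiar star-schema query: join the fact table to its dimensions.
SELECT d.fiscal_quarter, c.region, SUM(f.amount) AS total_sales
FROM fact_sales f
JOIN dim_customer c ON f.customer_id = c.customer_id
JOIN dim_date d     ON f.date_id = d.date_id
GROUP BY d.fiscal_quarter, c.region;
```

The point is that existing dimensional models and the SQL written against them carry over largely unchanged; only the storage format switches to Delta.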
- How to bring even better performance to your data lake with Delta Engine, a high-performance query execution engine for Delta Lake.
- Discuss architecture best practices across AWS and Azure with Databricks Senior Solution Architects during our live Q&A.
- Discover answers to common questions about the Lakehouse: What can it do for my industry? How can I explain its value to my colleagues and company leadership?