Extension to read and ingest Delta Lake tables #15755

Merged · 72 commits · Jan 31, 2024

Commits
85795fe
something
abhishekagarwal87 Dec 19, 2023
a3baae9
test commit
abhishekrb19 Dec 19, 2023
23d2616
compilation fix
abhishekrb19 Dec 19, 2023
672894a
more compilation fixes (fixme placeholders)
abhishekrb19 Dec 19, 2023
7b67bd2
Comment out druid-kerberos build since it conflicts with newly added…
abhishekrb19 Dec 19, 2023
e67f5cf
checkpoint
abhishekrb19 Dec 20, 2023
da24f62
remove snapshot schema since we can get schema from the row
abhishekrb19 Dec 20, 2023
5bc39e1
iterator bug fix
abhishekrb19 Dec 20, 2023
27ba7b3
json json json
abhishekrb19 Dec 20, 2023
1bf4f7e
sampler flow
abhishekagarwal87 Dec 20, 2023
934d92d
empty impls for read(InputStats) and sample()
abhishekrb19 Dec 20, 2023
235e03a
conversion?
LakshSingla Dec 20, 2023
defbe6d
Merge branch 'delta_lake_what_not' of https://github.com/abhishekagar…
LakshSingla Dec 20, 2023
33e6fe6
Merge branch 'delta_lake_what_not' of github.com:abhishekagarwal87/dr…
abhishekrb19 Dec 20, 2023
a8600b7
Merge branch 'delta_lake_what_not' of github.com:abhishekagarwal87/dr…
abhishekrb19 Dec 20, 2023
43a4868
conversion, without timestamp
LakshSingla Dec 20, 2023
b78a08f
Merge branch 'delta_lake_what_not' of github.com:abhishekagarwal87/dr…
abhishekrb19 Dec 20, 2023
935582c
Web console changes to show Delta Lake
abhishekrb19 Dec 20, 2023
97c3d0d
Asset bug fix and tile load
abhishekrb19 Dec 20, 2023
bcffb79
Add missing pieces to input source info, etc.
abhishekrb19 Dec 20, 2023
a2502d2
fix stuff
abhishekagarwal87 Dec 20, 2023
73649b2
Use a different delta lake asset
abhishekrb19 Dec 20, 2023
73b0152
Delta lake extension dependencies
abhishekrb19 Jan 3, 2024
9d254ae
Merge branch 'delta_lake_what_not' of github.com:abhishekagarwal87/dr…
abhishekrb19 Jan 3, 2024
b6fa446
Cleanup
abhishekrb19 Jan 3, 2024
f280480
Add InputSource, module init and helper code to process delta files.
abhishekrb19 Jan 3, 2024
68e8dcb
Test init
abhishekrb19 Jan 3, 2024
0bc238d
Checkpoint changes
abhishekrb19 Jan 7, 2024
ffd8bb1
Test resources and updates
abhishekrb19 Jan 10, 2024
c4ec7f9
some fixes
abhishekrb19 Jan 10, 2024
3b0b11f
move to the correct package
abhishekrb19 Jan 10, 2024
dab76b1
More tests
abhishekrb19 Jan 10, 2024
d81efe7
Test cleanup
abhishekrb19 Jan 10, 2024
ff9e0ba
TODOs
abhishekrb19 Jan 10, 2024
633d32d
Test updates
abhishekrb19 Jan 10, 2024
8dd4ccd
requirements and javadocs
abhishekrb19 Jan 11, 2024
8993ed3
Adjust dependencies
abhishekrb19 Jan 11, 2024
001eeb7
Update readme
abhishekrb19 Jan 11, 2024
6cf568c
Merge branch 'master' into delta_lake_connector_ext
abhishekrb19 Jan 24, 2024
eb6f12a
Bump up version
abhishekrb19 Jan 24, 2024
2eb2af3
fixup typo in deps
abhishekrb19 Jan 25, 2024
2b6ba3e
forbidden api and checkstyle checks
abhishekrb19 Jan 25, 2024
8891fbb
Trim down dependencies
abhishekrb19 Jan 25, 2024
597e664
new lines
abhishekrb19 Jan 25, 2024
ce6f56c
Fixup Intellij inspections.
abhishekrb19 Jan 25, 2024
9a3723a
Add equals() and hashCode()
abhishekrb19 Jan 25, 2024
ca566fe
chain splits, intellij inspections
abhishekrb19 Jan 25, 2024
027791d
review comments and todo placeholder
abhishekrb19 Jan 25, 2024
66d69b4
fix up some docs
abhishekrb19 Jan 25, 2024
024ed4c
null table path and test dependencies. Fixup broken link.
abhishekrb19 Jan 25, 2024
3874dd4
run prettify
abhishekrb19 Jan 25, 2024
5b4c2af
Merge branch 'master' into delta_lake_connector_ext
abhishekrb19 Jan 29, 2024
4826731
Different test; fixes
abhishekrb19 Jan 29, 2024
69a654f
Upgrade pyspark and delta-spark to latest (3.5.0 and 3.0.0) and regen…
abhishekrb19 Jan 29, 2024
f6a7658
yank the old test resource.
abhishekrb19 Jan 29, 2024
d7590b6
add a couple of sad path tests
abhishekrb19 Jan 29, 2024
752fa2b
Updates to readme based on latest.
abhishekrb19 Jan 29, 2024
2da7b1a
Version support
abhishekrb19 Jan 29, 2024
cb10475
Extract Delta DateTime conversions to DeltaTimeUtils class and add test
abhishekrb19 Jan 29, 2024
ec7add5
More comprehensive split tests.
abhishekrb19 Jan 29, 2024
9eb5a73
Some test renames.
abhishekrb19 Jan 29, 2024
0dccf14
Cleanup and update instructions.
abhishekrb19 Jan 29, 2024
e8f229b
add pruneSchema() optimization for table scans.
abhishekrb19 Jan 29, 2024
803f73a
Oops, missed the parquet files.
abhishekrb19 Jan 29, 2024
da88270
Update default table and rename schema constants.
abhishekrb19 Jan 29, 2024
883a75c
Test setup and misc changes.
abhishekrb19 Jan 30, 2024
f4c3a99
Add class loader logic as the context class loader is unaware about e…
abhishekrb19 Jan 30, 2024
72cb1f4
change some table client creation logic.
abhishekrb19 Jan 30, 2024
e2159e5
Add hadoop-aws, hadoop-common and related exclusions.
abhishekrb19 Jan 30, 2024
c2b820a
Remove org.apache.hadoop:hadoop-common
abhishekrb19 Jan 30, 2024
1a52b56
Apply suggestions from code review
abhishekrb19 Jan 31, 2024
8b9ecb9
Add entry to .spelling to fix docs static check
abhishekrb19 Jan 31, 2024
4 changes: 4 additions & 0 deletions distribution/pom.xml
@@ -257,6 +257,8 @@
<argument>org.apache.druid.extensions:druid-kubernetes-extensions</argument>
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-catalog</argument>
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-deltalake-extensions</argument>
<argument>${druid.distribution.pulldeps.opts}</argument>
</arguments>
</configuration>
@@ -453,6 +455,8 @@
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-iceberg-extensions</argument>
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-deltalake-extensions</argument>
<argument>-c</argument>
<argument>org.apache.druid.extensions.contrib:druid-spectator-histogram</argument>
</arguments>
</configuration>
44 changes: 44 additions & 0 deletions docs/development/extensions-contrib/delta-lake.md
@@ -0,0 +1,44 @@
---
id: delta-lake
title: "Delta Lake extension"
---

<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing,
~ software distributed under the License is distributed on an
~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
~ KIND, either express or implied. See the License for the
~ specific language governing permissions and limitations
~ under the License.
-->

## Delta Lake extension


Delta Lake is an open source storage framework that enables building a
Lakehouse architecture with various compute engines. [DeltaLakeInputSource](../../ingestion/input-sources.md#delta-lake-input-source) lets
you ingest data stored in a Delta Lake table into Apache Druid. To use the Delta Lake extension, add `druid-deltalake-extensions` to the list of loaded extensions.
See [Loading extensions](../../configuration/extensions.md#loading-extensions) for more information.

The Delta input source reads the configured Delta Lake table and extracts the underlying Delta data files from the table's latest snapshot.
These files are versioned Parquet files.

## Version support

The Delta Lake extension uses the Delta Kernel introduced in Delta Lake 3.0.0, which is compatible with Apache Spark 3.5.x.
Older versions are unsupported, so consider upgrading to Delta Lake 3.0.x or higher to use this extension.

## Known limitations

- This extension relies on the Delta Kernel API and can only read from the latest Delta table snapshot, as the sketch below illustrates.
- Column filtering isn't supported. The extension reads all columns in the configured table.
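
For context on what "latest snapshot" means here, the following is a minimal, self-contained sketch of reading a Delta table through the Delta Kernel 3.0.0 Java API. It is illustrative only: the class and method names (`DefaultTableClient.create`, `Table.forPath`, `getLatestSnapshot`) are assumptions based on the kernel's public API at that version rather than code taken from this PR, and `/delta-table/directory` is a placeholder path.

```java
// Sketch: resolve a Delta table and read its latest snapshot with Delta Kernel.
// Assumes delta-kernel-api and delta-kernel-defaults 3.0.0 on the classpath;
// the API names below are assumptions based on that release.
import io.delta.kernel.Scan;
import io.delta.kernel.Snapshot;
import io.delta.kernel.Table;
import io.delta.kernel.client.TableClient;
import io.delta.kernel.defaults.client.DefaultTableClient;
import io.delta.kernel.types.StructType;
import org.apache.hadoop.conf.Configuration;

public class DeltaSnapshotSketch
{
  public static void main(String[] args) throws Exception
  {
    // The default table client backs the kernel with Hadoop-based file IO.
    TableClient tableClient = DefaultTableClient.create(new Configuration());

    // The kernel exposes only the latest snapshot of the table, which is why
    // the extension cannot read older table versions.
    Table table = Table.forPath(tableClient, "/delta-table/directory");
    Snapshot snapshot = table.getLatestSnapshot(tableClient);

    // The snapshot carries the table schema; a scan enumerates the underlying
    // versioned Parquet files that an ingestion task would read.
    StructType schema = snapshot.getSchema(tableClient);
    Scan scan = snapshot.getScanBuilder(tableClient).build();
    System.out.println("Latest snapshot schema: " + schema);
  }
}
```
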
39 changes: 33 additions & 6 deletions docs/ingestion/input-sources.md
@@ -823,6 +823,13 @@ rolled-up datasource `wikipedia_rollup` by grouping on hour, "countryName", and
to `true` to enable a compatibility mode where the timestampSpec is ignored.
:::

The [secondary partitioning method](native-batch.md#partitionsspec) determines the requisite number of concurrent worker tasks that run in parallel to complete ingestion with the Combining input source.
Set this value in `maxNumConcurrentSubTasks` in `tuningConfig` based on the secondary partitioning method:
- `range` or `single_dim` partitioning: greater than or equal to 1
- `hashed` or `dynamic` partitioning: greater than or equal to 2

For more information on the `maxNumConcurrentSubTasks` field, see [Implementation considerations](native-batch.md#implementation-considerations).

## SQL input source

The SQL input source is used to read data directly from RDBMS.
@@ -925,7 +932,7 @@ The following is an example of a Combining input source spec:
## Iceberg input source

:::info
To use the Iceberg input source, load the extension [`druid-iceberg-extensions`](../development/extensions-contrib/iceberg.md).
:::

You use the Iceberg input source to read data stored in the Iceberg table format. For a given table, the input source scans up to the latest Iceberg snapshot from the configured Hive catalog. Druid ingests the underlying live data files using the existing input source formats.
@@ -1114,11 +1121,31 @@ This input source provides the following filters: `and`, `equals`, `interval`, a
|type|Set this value to `not`.|yes|
|filter|The iceberg filter on which logical NOT is applied|yes|

## Delta Lake input source

:::info
To use the Delta Lake input source, load the extension [`druid-deltalake-extensions`](../development/extensions-contrib/delta-lake.md).
:::

You can use the Delta input source to read data stored in a Delta Lake table. The input source scans the latest snapshot
of the configured table, and Druid ingests the underlying delta files from the table.

The following is a sample spec:

```json
...
  "ioConfig": {
    "type": "index_parallel",
    "inputSource": {
      "type": "delta",
      "tablePath": "/delta-table/directory"
    }
  }
}
```

|Property|Description|Required|
|--------|-----------|--------|
|type|Set this value to `delta`.|yes|
|tablePath|The location of the Delta table.|yes|

156 changes: 156 additions & 0 deletions extensions-contrib/druid-deltalake-extensions/pom.xml
@@ -0,0 +1,156 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing,
~ software distributed under the License is distributed on an
~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
~ KIND, either express or implied. See the License for the
~ specific language governing permissions and limitations
~ under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

  <groupId>org.apache.druid.extensions</groupId>
  <artifactId>druid-deltalake-extensions</artifactId>
  <name>druid-deltalake-extensions</name>
  <description>Delta Lake connector for Druid</description>

  <parent>
    <artifactId>druid</artifactId>
    <groupId>org.apache.druid</groupId>
    <version>30.0.0-SNAPSHOT</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>
  <modelVersion>4.0.0</modelVersion>

  <properties>
    <delta-kernel.version>3.0.0</delta-kernel.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>io.delta</groupId>
      <artifactId>delta-kernel-api</artifactId>
      <version>${delta-kernel.version}</version>
    </dependency>

    <dependency>
      <groupId>io.delta</groupId>
      <artifactId>delta-kernel-defaults</artifactId>
      <version>${delta-kernel.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client-api</artifactId>
      <version>${hadoop.compile.version}</version>
      <scope>compile</scope>
    </dependency>

    <dependency>
      <groupId>org.apache.druid</groupId>
      <artifactId>druid-processing</artifactId>
      <version>${project.parent.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <scope>provided</scope>
    </dependency>

    <dependency>
      <groupId>com.google.code.findbugs</groupId>
      <artifactId>jsr305</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.google.inject</groupId>
      <artifactId>guice</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>joda-time</groupId>
      <artifactId>joda-time</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.12.7.1</version>
    </dependency>
    <dependency>
      <groupId>it.unimi.dsi</groupId>
      <artifactId>fastutil-core</artifactId>
      <version>8.5.4</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-aws</artifactId>
      <version>${hadoop.compile.version}</version>
      <scope>runtime</scope>
      <exclusions>
        <exclusion>
          <groupId>com.amazonaws</groupId>
          <artifactId>aws-java-sdk-bundle</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.druid</groupId>
      <artifactId>druid-processing</artifactId>
      <version>${project.parent.version}</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.hamcrest</groupId>
      <artifactId>hamcrest-all</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.hamcrest</groupId>
      <artifactId>hamcrest-core</artifactId>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.owasp</groupId>
        <artifactId>dependency-check-maven</artifactId>
        <configuration>
          <skip>true</skip>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
70 changes: 70 additions & 0 deletions extensions-contrib/druid-deltalake-extensions/src/main/java/org/apache/druid/delta/common/DeltaLakeDruidModule.java
@@ -0,0 +1,70 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

package org.apache.druid.delta.common;

import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.jsontype.NamedType;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.google.inject.Binder;
import org.apache.druid.delta.input.DeltaInputSource;
import org.apache.druid.error.DruidException;
import org.apache.druid.initialization.DruidModule;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

import java.util.Collections;
import java.util.List;

public class DeltaLakeDruidModule implements DruidModule
{
  @Override
  public List<? extends Module> getJacksonModules()
  {
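    // Register the Delta input source under its JSON type key so Jackson can
    // resolve "type": "delta" in ingestion specs to DeltaInputSource.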
    return Collections.singletonList(
        new SimpleModule("DeltaLakeDruidModule")
            .registerSubtypes(
                new NamedType(DeltaInputSource.class, DeltaInputSource.TYPE_KEY)
            )
    );
  }

  @Override
  public void configure(Binder binder)
  {
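    // Point Hadoop's Configuration at the extension's class loader so that it
    // resolves classes bundled with this extension rather than only those
    // visible to the application class loader.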
    final Configuration conf = new Configuration();
    conf.setClassLoader(getClass().getClassLoader());

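    // Temporarily swap the thread context class loader as well: FileSystem.get
    // triggers Hadoop's FileSystem service discovery, which must see the
    // implementations bundled with this extension. The original loader is
    // restored in the finally block.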
    ClassLoader currCtxCl = Thread.currentThread().getContextClassLoader();
    try {
      Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
      FileSystem.get(conf);
    }
    catch (Exception ex) {
      throw DruidException.forPersona(DruidException.Persona.DEVELOPER)
                          .ofCategory(DruidException.Category.UNCATEGORIZED)
                          .build(ex, "Problem during fileSystem class level initialization");
    }
    finally {
      Thread.currentThread().setContextClassLoader(currCtxCl);
    }
  }
}
