Connecting to Delta Lake
It is also possible to query existing data on Delta Lake directly without loading it into PuppyGraph. Here is a demo of the feature.
In the demo, the Delta Lake data source stores people and referral information. To query the data as a graph, we model people as vertices and the referral relationships between people as edges.
The demo assumes that PuppyGraph has been deployed at `localhost` according to the instructions in Launching PuppyGraph from AWS Marketplace or Launching PuppyGraph in Docker. In this demo, we use the username `puppygraph` and password `puppygraph123`.
People:

| ID | Age | Name |
|---|---|---|
| v1 | 29 | marko |
| v2 | 27 | vadas |

Referral:

| ID | From | To | Weight |
|---|---|---|---|
| e1 | v1 | v2 | 0.5 |
The demo uses people and referral information as shown above.
Here is the shell command to start a SparkSQL instance for data preparation, assuming that the Delta Lake data is stored on HDFS at `172.31.19.123:9000` and the Hive metastore is at `172.31.31.125:9083`.
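The exact command is not reproduced here; the following is a minimal sketch of what it might look like, assuming the Delta Lake Spark connector package (version shown is an example), a catalog named `puppy_delta`, and the HDFS and metastore addresses above. The warehouse path `/delta` is also an assumption.

```shell
spark-sql \
  --packages io.delta:delta-core_2.12:2.4.0 \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.puppy_delta=org.apache.spark.sql.delta.catalog.DeltaCatalog" \
  --conf "spark.sql.catalog.puppy_delta.warehouse=hdfs://172.31.19.123:9000/delta" \
  --conf "spark.hadoop.hive.metastore.uris=thrift://172.31.31.125:9083"
```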
Now we can use the following SparkSQL queries to create data in the database `onhdfs`. The catalog name is `puppy_delta`, as specified in the command above.
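The original statements are not shown here; a sketch of the kind of statements involved might look like the following, where the column names `from_id`, `to_id`, and `weight` are guesses based on the sample data:

```sql
CREATE DATABASE IF NOT EXISTS puppy_delta.onhdfs;

-- People become vertices of the graph
CREATE TABLE puppy_delta.onhdfs.people (id STRING, age INT, name STRING) USING DELTA;
INSERT INTO puppy_delta.onhdfs.people VALUES ('v1', 29, 'marko'), ('v2', 27, 'vadas');

-- Referrals become edges between people
CREATE TABLE puppy_delta.onhdfs.referral (id STRING, from_id STRING, to_id STRING, weight DOUBLE) USING DELTA;
INSERT INTO puppy_delta.onhdfs.referral VALUES ('e1', 'v1', 'v2', 0.5);
```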
Now the data is ready in Delta Lake. We need a PuppyGraph schema before querying it. Let's create a schema file `deltalake.json`:
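The schema file itself is not reproduced here. The sketch below illustrates the general shape suggested by the notes that follow (a `catalog_test` catalog with a `hiveMetastoreUrl`, and `mappedTableSource` entries with `from`/`to` meta fields); the remaining field names, labels, and types are assumptions, so consult the PuppyGraph schema reference for the exact format:

```json
{
  "catalogs": [
    {
      "name": "catalog_test",
      "type": "deltalake",
      "hiveMetastoreUrl": "thrift://172.31.31.125:9083"
    }
  ],
  "vertices": [
    {
      "label": "person",
      "mappedTableSource": {
        "catalog": "catalog_test",
        "schema": "onhdfs",
        "table": "people",
        "metaFields": { "id": "id" }
      },
      "attributes": [
        { "name": "age", "type": "Int" },
        { "name": "name", "type": "String" }
      ]
    }
  ],
  "edges": [
    {
      "label": "refers",
      "fromVertex": "person",
      "toVertex": "person",
      "mappedTableSource": {
        "catalog": "catalog_test",
        "schema": "onhdfs",
        "table": "referral",
        "metaFields": { "id": "id", "from": "from_id", "to": "to_id" }
      },
      "attributes": [
        { "name": "weight", "type": "Double" }
      ]
    }
  ]
}
```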
Here are some notes on this schema:
A catalog `catalog_test` is added to specify the remote data source in Delta Lake. Note that the `hiveMetastoreUrl` field has the same value as the one we used to create the data.
The labels of the vertices and edges do not have to be the same as the names of the corresponding tables in Delta Lake. There is a `mappedTableSource` field in each of the vertex and edge types specifying the actual schema (`onhdfs`) and table (e.g. `referral`).
Additionally, the `mappedTableSource` marks meta columns in the tables. For example, the fields `from` and `to` describe which columns in the table form the endpoints of the edges.
PuppyGraph supports querying Iceberg, Hudi, and Delta Lake, with either Hive Metastore or AWS Glue as the metastore and HDFS, AWS S3, or MinIO as the storage.
For more details on catalog parameters, please refer to Data Lake Catalog.
Now we can upload the schema file `deltalake.json` to PuppyGraph with the following shell command, assuming that PuppyGraph is running on `localhost`:
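The command is not shown above; a sketch using `curl` with the credentials from this demo might look like the following, where the `/schema` endpoint path is an assumption to be checked against the PuppyGraph API documentation:

```shell
curl -XPOST -H "content-type: application/json" \
     --data-binary @./deltalake.json \
     --user "puppygraph:puppygraph123" \
     localhost:8081/schema
```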
Connect to PuppyGraph at http://localhost:8081 and start the Gremlin Console from the "Query" section:
Now we have connected to the Gremlin Console. We can query the graph:
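The queries themselves are not reproduced here; assuming the schema models people as `person` vertices with `name` and `age` properties (hypothetical labels), typical Gremlin queries against this graph might look like:

```groovy
// Count all vertices in the graph
g.V().count()

// List the names of all people
g.V().values('name')

// Follow referral edges out of marko to find who he referred
g.V().has('name', 'marko').out().values('name')
```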
You can also refer to the catalog configuration examples we provide.