Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testing.
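A stack like this is commonly wired together with Docker Compose. The following is a minimal sketch under assumed defaults — image tags, ports, and credentials are illustrative placeholders, not taken from any specific repository:

```yaml
# docker-compose.yml — sketch of a local lakehouse stack (placeholder values)
services:
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: admin          # placeholder credentials
      MINIO_ROOT_PASSWORD: password
    ports:
      - "9000:9000"   # S3 API
      - "9001:9001"   # web console

  hive-metastore:
    image: apache/hive:3.1.3
    environment:
      SERVICE_NAME: metastore          # run the standalone metastore service
    ports:
      - "9083:9083"   # Thrift endpoint Trino will connect to

  trino:
    image: trinodb/trino
    ports:
      - "8080:8080"
    volumes:
      - ./catalog:/etc/trino/catalog   # mount Iceberg/Hive catalog .properties files
    depends_on:
      - minio
      - hive-metastore
```

With this up (`docker compose up -d`), Trino's UI is reachable on port 8080 and MinIO's console on port 9001; the actual catalog wiring lives in the mounted `.properties` files.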
Apache Hive Metastore in Standalone Mode With Docker
Deploying an analytics platform on a Kubernetes cluster
BigData Pipeline is a local testing environment for experimenting with various storage solutions (RDB, HDFS), query engines (Trino), schedulers (Airflow), and ETL/ELT tools (DBT). It supports MySQL, Hadoop, Hive, Kudu, and more.
Trino + Hive + MinIO with Postgres in Docker Compose
Distributed SQL analytics with Trino, Hive Metastore and MinIO storage
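In setups like this, Trino reaches the Hive Metastore and MinIO through a catalog properties file. A sketch using Trino's Iceberg connector — service hostnames and credentials are placeholders that would need to match the rest of the deployment:

```properties
# catalog/iceberg.properties — sketch; hostnames and keys are placeholders
connector.name=iceberg
hive.metastore.uri=thrift://hive-metastore:9083
# Point S3 access at MinIO instead of AWS
hive.s3.endpoint=http://minio:9000
hive.s3.aws-access-key=admin
hive.s3.aws-secret-key=password
# MinIO serves buckets at path-style URLs (http://host/bucket/key)
hive.s3.path-style-access=true
```

Dropping this file into Trino's `etc/catalog/` directory exposes the tables as the `iceberg` catalog, e.g. `SELECT * FROM iceberg.analytics.events`.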
dbt data transformations for lakehouse using trino
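Pointing dbt at such a Trino cluster is done through a profile using the dbt-trino adapter. A hedged sketch — the profile name, host, and catalog/schema names here are assumptions for illustration:

```yaml
# ~/.dbt/profiles.yml — sketch for the dbt-trino adapter (placeholder names)
lakehouse:
  target: dev
  outputs:
    dev:
      type: trino
      host: localhost
      port: 8080
      user: dbt
      database: iceberg     # maps to the Trino catalog name
      schema: analytics     # default schema for dbt models
      threads: 4
```

Models then run with `dbt run`, compiling to Trino SQL executed against the configured catalog.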
Simplified Trino deployment and management using Ansible for non-K8s environments
On-premise data lake architecture with Trino, Delta Tables and Hive Metastore