Utils for streaming large files (S3, HDFS, gzip, bz2...)
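A minimal sketch of the idea behind such a utility, assuming boto3 is available and using placeholder bucket/key names (this is not the library's own API): decompress a gzip object from S3 as a stream instead of downloading it whole.

```python
import gzip

import boto3  # assumption: boto3 is installed; bucket and key below are placeholders

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="my-bucket", Key="logs/2025/07/10/events.log.gz")

# obj["Body"] is botocore's streaming response; gzip decompresses it
# incrementally, so only a small buffer is held in memory at a time.
count = 0
with gzip.open(obj["Body"], mode="rt", encoding="utf-8") as lines:
    for line in lines:
        count += 1  # replace with real per-record processing
print(f"read {count} lines")
```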
80+ DevOps & Data CLI Tools - AWS, GCP, GCF Python Cloud Functions, Log Anonymizer, Spark, Hadoop, HBase, Hive, Impala, Linux, Docker, Spark Data Converters & Validators (Avro/Parquet/JSON/CSV/INI/XML/YAML), Travis CI, AWS CloudFormation, Elasticsearch, Solr etc.
ElasticCTR, the PaddlePaddle elastic-computing recommendation system, is an enterprise-grade open-source recommendation solution based on Kubernetes. It combines high-accuracy CTR models refined in Baidu's production scenarios, the large-scale distributed training capability of the open-source PaddlePaddle framework, and an industrial-grade elastic scheduling service for sparse parameters, letting users deploy a recommendation system on Kubernetes with one click. It offers high performance, industrial-grade deployment, and an end-to-end experience, and as an open-source suite it also supports deep secondary development.
A tool and library for easily deploying applications on Apache YARN
Data Engineering Project with Hadoop HDFS and Kafka
DBM is a database management platform that integrates full-lifecycle management of MySQL, Redis, ES, Kafka, HDFS, InfluxDB, Pulsar, and other database components. It provides batch management for large numbers of clusters, a cluster management toolbox for each DB component, and per-database services such as customized configuration, high-availability switchover, and domain name management, together with comprehensive monitoring, alerting, and observability, so that DBAs, operators, and developers can manage databases more efficiently, securely, and comprehensively.
Analysis scripts for log data sets used in anomaly detection.
Insight Data Engineering project: a platform built on HDFS, Spark, and Airflow to help you find social influencers in the GitHub network.
This project aims to predict smartphone prices using a combination of batch and stream processing techniques in a Big Data environment. The architecture follows the Lambda Architecture pattern, providing both real-time and batch processing capabilities to users.
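A minimal sketch of what the speed (streaming) layer of such a Lambda Architecture might look like, assuming Kafka and Spark Structured Streaming are the stream-processing stack (the topic name, broker address, and record schema here are placeholders, not the project's actual ones):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

# Requires the spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("phone-price-speed-layer").getOrCreate()

# Hypothetical record schema; the real project's fields will differ.
schema = StructType([
    StructField("brand", StringType()),
    StructField("ram_gb", DoubleType()),
    StructField("price", DoubleType()),
])

# Read new listings from Kafka as they arrive.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "phone-listings")
    .load()
)

# Kafka delivers value as bytes; parse the JSON payload into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), schema).alias("r"))
    .select("r.*")
)

# Console sink is a stand-in; a real speed layer would write to a serving store.
query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```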