ekfrazo

Data Engineer

In this role, you will have the opportunity to:
Build a modern data architecture from the ground up, with streaming and batch processing capabilities, using technologies such as Apache Airflow, Apache Flink/Spark, Apache Kafka, and Apache Pinot.
Build a configurable data platform enabling ELT/ETL and reverse ETL connectors and transformations.
Experience with Databricks Lakehouse or Snowflake, leveraging open table formats such as Apache Iceberg/Hudi, query engines such as Trino, or similar technologies, would be beneficial.


To be successful in this role, you will need:
5+ years of software development experience, with at least 3 years building data products, data platforms, or cloud data warehouses.
Proficiency with Python, SQL, and Spark, with 3+ years of experience building data warehouse models.
Fluency in storing, transforming, and managing relational, non-relational, and streaming data on cloud storage.
Experience with dimensional modeling (star schema) and OLTP.
Experience working on AWS, Azure, or Google Cloud infrastructure.
DevOps: Docker, Kubernetes, CI/CD, Terraform.
Excellent communication skills, both written and verbal.

Job Category: Software Development
Job Type: Full Time
Job Location: Remote