Position Details
About this role
This role involves designing, building, and optimizing scalable data lakehouse solutions to enable data-driven decision-making across the organization.
Key Responsibilities
- Design and implement lakehouse architectures
- Build and optimize data pipelines
- Ensure data governance and security
- Monitor and troubleshoot workflows
- Collaborate with cross-functional teams
Technical Overview
The technical environment includes AWS cloud services, the Snowflake data platform, Kafka streaming, Apache Iceberg, and data pipeline tools such as StreamSets and dbt.
Ideal Candidate
The ideal candidate is a mid-level data engineer with 5+ years of experience in designing and maintaining scalable data pipelines and lakehouse architectures. They possess strong expertise in AWS, Snowflake, Kafka, and data modeling, with a focus on data governance and performance optimization.
Deal Breakers
- Lack of experience with AWS services
- No experience with Snowflake or Kafka
- Missing data pipeline or data governance skills
- No relevant technical degree