Position Details
About This Role
This role involves designing and implementing scalable data pipelines and data models in a cloud environment, primarily using Snowflake, dbt, and Airflow. The engineer will collaborate with stakeholders to optimize data workflows and ensure data quality.
Key Responsibilities
- Design data pipelines and workflows
- Implement data transformations
- Optimize data performance
- Troubleshoot pipeline issues
- Collaborate with stakeholders on data requirements
Technical Overview
The technical environment includes the Snowflake data warehouse, ETL/ELT processes, Python scripting, Airflow orchestration, and data modeling best practices. The focus is on building reliable, high-performance data pipelines for analytics.
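As a rough illustration of how this stack typically fits together (not part of the posting itself), below is a minimal sketch of an Airflow DAG that orchestrates dbt transformations against Snowflake. The DAG id, schedule, and project path are illustrative assumptions, and the Snowflake connection is assumed to be configured in the dbt profile.

```python
# Illustrative sketch only: a minimal Airflow 2.4+ DAG that runs dbt models
# and tests against Snowflake. All names and paths here are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_analytics_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ schedule parameter
    catchup=False,
) as dag:
    # Build the dbt models; the Snowflake connection comes from dbt's profile.
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # assumed path
    )

    # Enforce data quality by running dbt tests after the models build.
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    # Tests run only after the transformation step succeeds.
    run_models >> test_models
```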
Ideal Candidate
The ideal candidate is an experienced data engineer with 5+ years of experience in data warehousing, ETL/ELT processes, and cloud-based data pipelines. They are proficient in Snowflake, dbt, and Airflow, with strong problem-solving and collaboration skills.
Deal Breakers
- Lack of experience with Snowflake or ETL/ELT processes
- No proficiency in Python or data pipeline tools
- Insufficient experience in data modeling
- No experience with cloud data platforms