Position Details
About this role
This role involves building and maintaining scalable data pipelines, migrating legacy systems to modern lakehouse patterns, and ensuring data quality and lineage for security data at Amazon.
Key Responsibilities
- Design ETL pipelines
- Migrate legacy pipelines
- Ensure data quality
- Support data standards
- Collaborate on data infrastructure
Technical Overview
The technical environment includes Amazon Redshift, AWS Glue, S3, Lambda, Step Functions, Apache Iceberg, Redshift Spectrum, and Lake Formation, with a focus on data pipeline development, data quality, and modernization.
Ideal Candidate
The ideal candidate is a mid-level data engineer experienced in building and maintaining data pipelines, data modeling, and ensuring data quality using AWS services and modern lakehouse architectures.
Deal Breakers
- No experience with AWS or data pipelines
- No data modeling or data quality experience
- No familiarity with lakehouse architectures
- Not authorized to work in the US