Position Details
About this role
This role focuses on developing and testing data pipelines, automating data workflows, and supporting large-scale data projects within a leading insurance company's analytics team.
Key Responsibilities
- Build and test data pipelines
- Automate data workflows
- Perform data transformation and movement
- Support AI and ML data needs
- Collaborate with cross-functional teams
Technical Overview
The technical environment includes Python, SQL, AWS cloud platform, Databricks, Snowflake, ETL tools, and Terraform, emphasizing data transformation, automation, and large dataset handling.
Ideal Candidate
The ideal candidate is a senior data engineer with at least 8 years of experience building robust data pipelines using Python, SQL, and cloud platforms such as AWS and Databricks. They are skilled in automation, data transformation, and working with large-scale datasets.
Deal Breakers
- Lack of experience with Python, SQL, or AWS
- Less than 8 years of related experience
- No experience with data pipelines or ETL tools