Position Details
About this role
This role involves designing, building, and maintaining complex data pipelines and solutions to support enterprise analytics and AI initiatives within the insurance industry.
Key Responsibilities
- Build and operationalize data solutions
- Design complex data architectures
- Perform data analysis and validation
- Implement data governance and security
- Collaborate across teams
Technical Overview
The technical environment includes the ETL tool Ab Initio; data platforms such as Databricks, Snowflake, and Teradata; cloud services on AWS; and programming in Python and SQL, with an emphasis on data governance and security.
Ideal Candidate
The ideal candidate is a senior data engineer with at least 8 years of experience in data pipeline development, proficient in Ab Initio and in data platforms such as Databricks, Snowflake, and Teradata, with strong AWS skills and programming proficiency in Python and SQL. They should also have a solid understanding of data governance, security, and data quality practices.
Deal Breakers
- Lack of experience with ETL tools and cloud platforms
- Less than 8 years of relevant experience
- No proficiency in Python or SQL
- No experience with data governance or security