Position Details
About this role
This role involves developing and maintaining large-scale data infrastructure, building data pipelines, and supporting analytics and AI initiatives at Intercom.
Key Responsibilities
- Design and build data pipelines
- Collaborate with data teams
- Develop automation tools
- Monitor data infrastructure
- Support data-driven projects
Technical Overview
The role focuses on data pipeline development, cloud data platforms, and data quality monitoring using Python, SQL, Airflow, Snowflake, and AWS.
Ideal Candidate
The ideal candidate is a mid-level data engineer with at least 3 years of experience designing and maintaining large-scale data pipelines. They are proficient in Python, SQL, and cloud data tools, and are passionate about building reliable data infrastructure.
Deal Breakers
- Less than 3 years of experience in data engineering
- Lack of experience with Python or SQL
- No experience with cloud data platforms
- Inability to work with large datasets