Position Details
About this role
Design and build large-scale distributed data systems powering analytics, AI/ML, and business intelligence at Figma. Develop batch and streaming data pipelines, manage core platforms like Snowflake and ML Datalake, and drive data reliability and cost optimization.
Key Responsibilities
- Design and build large-scale distributed data systems for analytics and AI/ML
- Develop batch and streaming data pipelines
- Manage Snowflake, the ML Datalake, and orchestration infrastructure
- Improve data reliability, quality, and compliance
- Mentor engineers and foster technical excellence
Technical Overview
Focus on distributed data infrastructure, batch and streaming processing, and data ingestion/orchestration. The tech stack includes Spark, Flink, Kafka, Airflow, Dagster, dbt, Snowflake, Python, and Go, with an emphasis on data quality, reliability, and governance.
Ideal Candidate
The ideal candidate is a senior backend or data infrastructure engineer with 5+ years of experience designing and operating distributed data systems at scale, proficient in batch and streaming processing, and comfortable mentoring engineers in a fast-paced, data-driven environment.
Deal Breakers
- 5+ years of backend or infrastructure engineering experience
- Experience designing distributed data infrastructure at scale
- Proficiency with Spark, Flink, Kafka, or Airflow/Dagster