Position Details
About this role
This role involves designing and managing enterprise-scale data lakehouse architectures on cloud platforms, supporting large-scale data ingestion, processing, and cost optimization efforts.
Key Responsibilities
- Manage enterprise lakehouse architecture
- Support large-scale data ingestion
- Implement data lifecycle and security standards
- Optimize data storage and compute costs
- Lead best practices for data development
Technical Overview
The technical environment includes Databricks Lakehouse, Delta Lake, PySpark, SQL, and cloud platforms such as Azure and GCP, with a focus on data engineering, lifecycle management, and security.
Ideal Candidate
The ideal candidate is a senior data platform architect with over 10 years of experience managing large-scale cloud data platforms, especially Databricks Lakehouse architectures. They possess strong expertise in data engineering, cloud environments, and cost management, with excellent communication and leadership skills.
Deal Breakers
- Less than 10 years of relevant experience
- Lack of experience with Databricks or cloud data platforms
- No experience with Delta Lake or Data Lakehouse architecture