Position Details
About this role
This role involves designing and implementing scalable data lake solutions, developing data pipelines, and managing data security and governance in cloud environments, primarily using Snowflake, Databricks, and Azure.
Key Responsibilities
- Design scalable data lake solutions
- Develop and optimize data pipelines
- Manage data governance and security
- Troubleshoot cluster and server issues
- Migrate ETL jobs to cloud platforms
Technical Overview
The technical environment includes cloud data platforms such as Snowflake and Azure Synapse; big data tools such as Hadoop, Hive, and Spark; and security tools such as HashiCorp Vault and CyberArk. Scripting in Python and shell is required for automation.
Ideal Candidate
The ideal candidate is a mid-level data engineer with 3+ years of experience in data lake solutions, data pipeline development, and cloud migration, particularly with Azure and Snowflake. Strong scripting and security management skills are essential.
Deal Breakers
- Lack of experience with Snowflake or Databricks
- No cloud migration experience
- No scripting skills in Python or shell
- No experience with data security tools
- Bachelor's degree not in a relevant field