About this role
Booz Allen Hamilton is looking for a Lead Data Engineer to design and build scalable data platforms and pipelines. The role covers end-to-end work from assessment and design through development and sustainment, including migration from on-prem to cloud, and mentoring teammates in data engineering best practices.
Key Responsibilities
- Design, build, and maintain scalable data pipelines and platforms
- Develop secure Python for data processing, automation, and warehousing
- Implement CI/CD pipelines for automated build, test, and deployment
- Migrate and integrate data from on-premises systems to cloud environments
- Support data governance, metadata tagging, and access controls
Technical Overview
You will develop secure, efficient Python for data processing and automation, build SQL scripts and stored procedures, and implement data governance, metadata tagging, and access controls. The stack emphasizes AWS services (S3, IAM, EventBridge, Step Functions, Lambda) and automation via CI/CD pipelines for build, test, and deployment workflows.
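As a purely illustrative sketch of the event-driven pattern this stack implies (an S3 object event routed to a Lambda function), the handler below extracts bucket and key names from an S3-style event payload. The function name and event shape are common AWS conventions, not details specified in this posting.

```python
import json
import urllib.parse

def handler(event, context):
    """Illustrative Lambda handler (not from the posting): summarize the
    S3 object records carried in an S3-style event payload."""
    processed = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        # S3 object keys arrive URL-encoded; decode before use
        key = urllib.parse.unquote_plus(s3.get("object", {}).get("key", ""))
        processed.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```

In practice a handler like this would hand off to downstream processing (for example, a Step Functions state machine); here it only returns a summary so the shape of the event is visible.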
Ideal Candidate
The ideal candidate is a lead-level Data Engineer with 2+ years supporting large-scale enterprise data engineering. They are strong in Python and SQL (including stored procedures and data modeling), have implemented CI/CD for data platforms, understand data governance and data tagging, and have hands-on knowledge of AWS services like S3, IAM, EventBridge, Step Functions, and Lambda, with TS/SCI clearance.
Must-Have Skills
- 2+ years of experience as a data engineer supporting large-scale enterprise systems
- Experience writing clean, secure, and efficient Python for data engineering use cases
- Experience moving data from on-prem environments to the cloud and automating development workflows
- Experience with CI/CD practices
- Experience with SQL, including stored procedures and data modeling
- Experience with data governance and data tagging
- Knowledge of AWS services such as S3, IAM, EventBridge, Step Functions, or Lambda
- TS/SCI clearance
- Bachelor's degree in Computer Science, Data Science, or Mathematics
Nice-to-Have Skills
- Experience with Pipeline Builder, AIP, and Foundry's application development ecosystem
- Experience with ETL tools such as dbt and Airflow
- Experience with NoSQL and graph databases
- Experience designing and building data warehouses
- Experience with DevOps tools and automation practices
- Experience working in multiple SDLC models, including Agile, Waterfall, Iterative, or Spiral
- Experience with open table formats such as Apache Iceberg
- Experience with infrastructure-as-code, including Terraform or CloudFormation
- Knowledge of data lakes and lakehouse architectures
- Database performance concepts such as partitioning
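Since the list above closes on partitioning, here is a minimal, hypothetical sketch of hive-style date partitioning as commonly used in data lakes and lakehouse tables. The path layout is a widespread convention, not something this posting specifies.

```python
from datetime import date

def partition_path(table: str, event_date: date) -> str:
    """Build a hive-style partition prefix (a common data-lake convention):
    data is laid out by year/month/day so query engines can prune whole
    prefixes instead of scanning every file."""
    return (
        f"{table}/year={event_date.year}"
        f"/month={event_date.month:02d}"
        f"/day={event_date.day:02d}/"
    )
```

For example, `partition_path("orders", date(2024, 5, 7))` yields `orders/year=2024/month=05/day=07/`. Table formats such as Apache Iceberg manage partition metadata internally rather than relying on path layout, but the pruning idea is the same.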
Tools & Platforms
Python, SQL, Amazon S3, AWS Identity and Access Management (IAM), AWS EventBridge, AWS Step Functions, AWS Lambda, dbt, Airflow, Pipeline Builder, AIP, Foundry, Apache Iceberg, Terraform, CloudFormation, CI/CD, Agile
Required Skills
Scalable data pipelines and platforms, Python, SQL, stored procedures, data modeling, CI/CD, data governance, data tagging, access controls, migrating on-prem data to cloud, Amazon S3, IAM, EventBridge, Step Functions, Lambda, Agile, ETL, dbt, Airflow, Terraform, CloudFormation, Apache Iceberg
Hard Skills
Data pipelines, Data platforms, Scalable data pipelines and platforms, Python, Secure Python, SQL, SQL scripts, Stored procedures, Data modeling, Data governance, Data tagging, Access controls, CI/CD pipelines, Build/test/deployment workflows, Data migration and integration, On-premises systems, Cloud environments, AWS services (Amazon S3, IAM, EventBridge, Step Functions, Lambda), Agile, Enterprise data engineering, Mentoring teammates, Best practices, Leading technical efforts, Automating development workflows
Soft Skills
Technical leadership, Mentoring, Guidance in a complex enterprise environment, Cross-functional collaboration with analysts, AI engineers, developers, and stakeholders, Ability to work independently and manage tasks with minimal supervision
Keywords for Your Resume
Data Engineer, Lead, data pipelines, data platforms, scalable data pipelines, Python, SQL, stored procedures, data modeling, CI/CD, CI/CD pipelines, build/test/deployment workflows, data governance, data tagging, access controls, AWS services, Amazon S3, IAM (Identity and Access Management), EventBridge, Step Functions, Lambda, on-premises systems, cloud environments, TS/SCI clearance, Computer Science, Data Science, Mathematics, Agile, data warehouses, ETL, dbt, Airflow, Terraform, CloudFormation, Apache Iceberg
Deal Breakers
- Must have TS/SCI clearance
- Must have 2+ years of experience as a data engineer supporting large-scale enterprise systems
- Must have experience with Python for data engineering and SQL, including stored procedures and data modeling
- Must have knowledge of AWS services such as S3, IAM, EventBridge, Step Functions, or Lambda
- Must have a Bachelor's degree in Computer Science, Data Science, or Mathematics