Position Details
About this role
This role involves developing data pipelines and supporting analytic tools for federal agencies, ensuring efficient and secure data processing.
Key Responsibilities
- Develop ETL workflows
- Support federal analytic projects
- Manage structured and unstructured data
- Implement real-time data processing
- Collaborate with cross-functional teams
Technical Overview
The technical scope includes programming in Python, R, Java; big data frameworks like Spark and Kafka; cloud platforms AWS and Azure; data warehousing solutions; and analytic tools such as Advana and Maven Smart Systems.
Ideal Candidate
The ideal candidate is a data engineer with at least 4 years of experience in Python, R, Java, and ETL workflows; familiarity with big data tools such as Spark and Kafka; and experience supporting federal agencies while holding a Secret clearance.
Deal Breakers
- Lack of Secret clearance
- Less than 4 years of experience
- No experience with cloud platforms
- No relevant degree