Data Integration Engineer
FORT MEADE MD | Government - Civil Service | Posted: 1 day ago
Job Description:
One of our top Government Integrator clients is hiring key resources for their existing program supporting the Defense Information Systems Agency (DISA) on various efforts around their core data and analytics platform. You will be joining the Global Automations & Intelligent Network Solutions team that securely connects more than 3 million end users at over 3,000 DoD and Federal sites around the world. In this position, you will help lead data integration development, focusing on expanding the foundational Integrated Data Architecture platform and delivering modern analytics capabilities in support of mission-critical DoD network operations.
Requirements:
• 5+ years of hands-on data integration experience developing and implementing Kafka integrations with the ELK/Elastic Stack (Elasticsearch, Logstash, Kibana), including work with RESTful APIs, connectors, and event streaming pipelines
• 3+ years of Databricks experience building pipelines with Delta tables, performing data cleansing, and implementing the Bronze/Silver/Gold (medallion) architecture
• Extensive Python and/or Java programming experience
• Experience working in an Agile development environment (SDLC), supporting sprint cycles, testing, and deployment activities
• Active SECRET Clearance or higher REQUIRED
• CompTIA Security+ certification required within the first 14 days of employment
Nice to have:
• Certified Confluent Developer and/or Certified Elastic Engineer Certifications
• Kubernetes containerization experience for cloud deployments and/or AWS Gov Cloud environment experience
• Experience developing and deploying software in a DoD environment (DISA experience is a plus)
• Experience with Atlassian tools including JIRA and Confluence
Responsibilities:
• Develop and implement integration solutions for the project using Kafka and Elastic as the primary data architecture platforms, with expanded integration to other technologies, including but not limited to Databricks.
• Integrate data sources into Databricks, Confluent (Kafka), and Elastic platforms in support of the GMS core data and analytics environment
• Develop Kafka system integrations and custom connectors, and use ksqlDB and Kafka Streams for data processing based on the design solution
• Design and implement solutions within Databricks, including Delta Lake tables and medallion architecture layers, to support analytics and data transformation initiatives
• Develop Kafka-based integrations into Databricks and Elastic, including custom connectors, APIs, and event streaming pipelines
• Support the integration of new data sources, including DODNet, into the existing platform environment
• Support and maintain Elasticsearch and Logstash integrations to ensure continuity and stability of the existing ELK environment
• Automate the full software lifecycle from design and development through testing and deployment, including production environments
• Work with other members of the data integration team to propose solutions based on mission needs and platform strategies
• Develop DoD requirements, traceability, and detailed plans and schedules, including software systems engineering and interface documents (IDDs/ICDs)
• Interact with the customer to address data engineering technical considerations and associated problems, issues, or conflicts