Company: SAIC
Location: Reston, VA
Career Level: Mid-Senior Level
Industries: Technology, Software, IT, Electronics

Description

SAIC is seeking a Data Engineer to support the Department of the Air Force Integrated Fires Command and Control (DIFC2) program of record. As the Data Engineer, you will design, develop, and implement data pipelines and analytics for applications, collaborating with cross-functional teams to understand and address operational challenges. This position is 100% onsite in the Reston, VA area.

JOB DUTIES:  
•    Collaborate with cross-functional teams to understand and address operational challenges using data pipelines and analytics.
•    Design, develop, and implement data pipelines and analytics for applications.
•    Perform exploratory data analysis, algorithm development, and testing.
•    Perform data ingest, normalization, sanitization, and extract, transform, load (ETL) processing of structured and unstructured data to common standards for interoperability.
•    Work with multiple data formats, including UCI 2.0+, CSV, JSON, XML, Parquet, and ORC.
•    Develop and deploy data pipelines and analytics in real-world operational environments.
•    Deploy, monitor, and optimize data pipelines to ensure high performance and reliability.
•    Implement event streaming pipelines using Apache NiFi workflows, Apache Kafka, AWS Kinesis, RabbitMQ, or ZeroMQ.
•    Utilize distributed computing platforms such as AWS Lambda, Dask, or Spark.
•    Leverage cloud-native tools including AWS S3, RDS, EFS, SNS, and SQS.
•    Utilize data pipeline frameworks such as Airbyte, Apache Airflow, dbt, Apache Iceberg, and Snowflake.
•    Work with GIS data using ArcGIS, PostGIS, and related tooling.
•    Implement containerized environments using Docker or Kubernetes.
•    Apply cybersecurity principles in the context of secure DoD data applications.
•    Communicate findings and engineering solutions effectively with technical and mission stakeholders.
 

Qualifications

Requirements: 
•    U.S. Citizenship required 
•    Active TS/SCI security clearance required to start.
•    Bachelor's degree in Computer Science, Data Science, Geography, Math, Machine Learning, or Statistics and nine (9) years of experience, or a Master's degree and seven (7) or more years of experience. Equivalent years of relevant experience in lieu of a degree will be taken into consideration.
•    Strong programming skills in Python, Go, Rust, Pandas, R, SQL, or related languages.
•    10 years of experience as a business analyst, data analyst, data scientist, data engineer, database administrator, geospatial analyst/engineer, machine learning engineer, software engineer, or in a related role.
•    Ability to safely carry tools, equipment, and materials aboard ship, including ascending and descending shipboard ladders (stairwells) and navigating confined spaces while maintaining required points of contact. Tools and equipment will weigh no more than 50 lbs.
•    Ability to perform required work aboard Navy vessels and in shipboard environments, including navigating narrow passageways, ascending and descending ladders (stairwells), working on elevated platforms, and operating in variable sea conditions.
•    Ability to perform activities on a recurring basis during shipboard operations or testing evolutions.
•    Ability to comply with Navy safety requirements and wear required personal protective equipment (PPE).
•    Ability to operate in a DDIL (denied, disrupted, intermittent, and limited bandwidth) office environment.
•    Reasonable accommodations may be provided to enable qualified individuals to meet these requirements and perform the essential functions of the position.

Preferred Skills and Experience:
•    Experience with large-scale data architecture across secure DoD or government environments.
•    Experience working with NAVWAR, NIWC Pacific, or naval C2/ISR programs.
•    Experience architecting data solutions across multi-domain or cross-domain systems.
•    Familiarity with MLOps practices or deploying analytics/ML-enabled pipelines in classified, cross-domain, or constrained environments.
•    Experience with cloud-native data architecture and API design.
•    Programming experience in Go or Rust.
•    Proven experience designing, developing, and deploying complex data pipelines.
•    Experience working with multiple data formats (e.g., CSV, JSON, XML, Parquet, ORC).
•    Familiarity with event streaming technologies (e.g., Kafka, AWS Kinesis, RabbitMQ, ZeroMQ).
•    Experience deploying, monitoring, and optimizing operational data pipelines.
•    Expertise in Elasticsearch, Redis, S3, PostgreSQL, or related datastores.
•    Experience with AWS data services (EFS, RDS, S3, SNS, SQS).
•    Experience with distributed computing: AWS Lambda, Dask, Spark.
•    Familiarity with Airbyte, Airflow, dbt, Iceberg, Snowflake.
•    Experience managing, integrating, and retrieving GIS data (ArcGIS, PostGIS).
•    Understanding of cybersecurity principles as applied to data applications and operational environments (including DDIL constraints).
•    Strong analytical and problem-solving skills.
•    Excellent communication skills in a collaborative team environment.
•    Previous experience supporting government agencies or military organizations.
 

