Big Data Engineer

Key Role:

Perform data engineering activities leveraging cutting-edge capabilities, a cloud environment, and Big Data tools. Perform activities, including architecting data systems, standing up data platforms, building ETL pipelines, writing custom code, interfacing with data stores, performing data ingestion, and building data models. Work as part of a client-facing, internal consulting team that addresses data challenges, including discussing, designing, developing, and maintaining scalable data platforms that use the latest and best Big Data tools. Support the practices of continuous integration and continuous delivery and deployment through the automation of application builds and compute environments. Deliver and configure deployment environments tailored to various roles, including development, testing, and operations, on physical servers and in virtualized environments. Work with development teams to ensure their applications can be deployed and maintained at scale using automation.

Basic Qualifications:

- 5+ years of experience in developing and deploying data ingestion, processing, and distribution systems on and with AWS technologies

- 3+ years of experience with custom or structured ETL design, implementation, and maintenance

- 3+ years of experience with ETL tools, including Pentaho Data Integration

- 2+ years of experience with AWS data stores, including RDS, S3, Redshift, or DynamoDB

- 2+ years of experience with Big Data systems, including Hadoop, HDFS, MapReduce, and Hive

- Ability to obtain a security clearance

- BA or BS degree

Additional Qualifications:

- 1+ years of experience with Big Data ETL tools, including StreamSets and NiFi

- 1+ years of experience with data streaming, including Kafka or Kinesis

- Experience with supporting data governance and data quality initiatives, including developing and implementing data standards, policies, practices, and procedures that improve data integrity

- Experience with developing a data strategy and synthesizing complex data management needs into simple and measurable goals and objectives

- Experience with data deployments and operations

- Experience with Cloud infrastructure and automating the deployment of infrastructure components using one of the following tools: Ansible, Chef, or Puppet

- Experience with Linux, RHEL 7, or Windows

- Knowledge of Jira, Git, or Jenkins

Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information.

Equal Opportunity Employer:

We're an Equal Opportunity Employer that empowers our people, no matter their race, color, religion, sex, gender identity, sexual orientation, national origin, disability, veteran status, or other protected characteristic, to fearlessly drive change.

Not ready to apply? Join our talent community and sign up for job alerts.