- Must have an active Secret security clearance and be able to obtain a Top Secret security clearance with SCI access.
- At least 2 years of experience in Big Data, including Hadoop, Elastic, Kafka, and Spark.
- Proficiency in system administration of message pipelines.
- At least 2 years of experience as a Linux system administrator.
- Expert knowledge of Hadoop and Elastic architecture.
- Experience installing, upgrading, and maintaining large Hadoop clusters.
- Experience using Ambari to manage Hadoop clusters.
- Knowledge of Continuous Integration and Continuous Delivery (CI/CD) frameworks.
- Knowledge and understanding of operating systems, networks and services.
- Ability to manage changes to systems and assess the security impact of those changes.
- Strong research, analytical, and problem-solving skills.
- Good communication skills, including preparing and presenting results, findings, and alternatives, and influencing management decision-making based on the best available data.
Desired knowledge and skills:
- Hortonworks
- RHEL 7
Training/certifications in any of the following strongly desired:
- Elastic Certifications
- Hadoop Certifications
Required Education (including Major):
- Bachelor's degree in Science, Technology, Engineering, or Mathematics and a minimum of 6 years of prior relevant experience.
- Master's degree in a related discipline may be substituted for two (2) years of experience.
- Professional experience may be substituted for a degree.