Hadoop Admin

Company: Crossfire Consulting
Job type: Full-time

Reference #: 21-00837
Title: Hadoop Admin
Location: Temple Terrace, FL
Position Type: Contract
Experience Level:
Start Date / End Date: 05/02/2021 / 05/01/2022

Description

As part of the Telecommunication Platform Engineering Team, the candidate (Hadoop Admin) will be responsible for the implementation and ongoing administration of Hadoop big data infrastructure. The Hadoop Admin will support, implement, and maintain big data infrastructure in Telecommunication and will be responsible for end-to-end Hadoop cluster administration.

JOB DUTIES:
- Responsible for implementation and ongoing administration of Hadoop infrastructure initiatives.
- Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
- Work with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, Spark, and MapReduce access for the new users.
- Cluster maintenance, as well as creation and removal of nodes, using Hadoop management tools such as Ambari and Cloudera Manager.
- Sound knowledge of Ranger, NiFi, Kafka, Atlas, Hive, Storm, Pig, Spark, Elasticsearch, Splunk, Solr, Kyvos, HBase, and other big data tools.
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
- Screen Hadoop cluster job performance and handle capacity planning.
- Monitor Hadoop cluster connectivity and security.
- Manage and review Hadoop log files; file system management and monitoring.
- HDFS support and maintenance.
- Diligently team with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
- Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
- Implement automation tools and frameworks (CI/CD pipelines). Knowledge of Ansible, Jenkins, Jira, Artifactory, Git, etc.
- Design, develop, and implement software integrations based on user feedback.
- Troubleshoot production issues and coordinate with the development team to streamline code deployment.
- Analyze code and communicate detailed reviews to development teams to ensure a marked improvement in applications and the timely completion of projects.
- Collaborate with team members to improve the company's engineering tools, systems, procedures, and data security.

MUST HAVE SKILLS:
- Must know Hadoop and big data infrastructure.
- Expert in Hadoop administration with knowledge of Hortonworks, Cloudera, or MapR big data management tools.
- Expert in developing and managing Java and web applications.
- Expert in implementing and troubleshooting Hive, Spark, Pig, Storm, Kafka, NiFi, Atlas, Kyvos, Elasticsearch, Solr, Splunk, and HBase applications.
- Strong command of software-automation production systems (Jenkins and Selenium) and code deployment tools (Puppet, Ansible, and Chef).
- Working knowledge of Ruby or Python and DevOps tools such as Git and GitHub.
- Working knowledge of databases (Oracle/Teradata) and SQL (Structured Query Language).
- General operational expertise, such as good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.

DESIRED SKILLS:
- The most essential requirements: the ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure jobs, and take backups.
- Good knowledge of Linux, as Hadoop runs on Linux.
- Familiarity with open-source configuration management and deployment tools such as Puppet or Chef, and Linux scripting.
- Knowledge of troubleshooting core Java applications is a plus.
- Problem-solving skills.
- A methodical and logical approach.
- The ability to plan work and meet deadlines.
- Accuracy and attention to detail.

EDUCATION/CERTIFICATIONS:
B.S. or equivalent engineering degree with at least 5 years of big data work experience.
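For context on the user-onboarding duty listed above (Linux users, Kerberos principals, and HDFS/Hive access testing), the steps typically reduce to a short sequence of commands. The sketch below is illustrative only: the user name, Kerberos realm, and the dry-run `run` wrapper are assumptions, not part of this posting, and the exact commands vary by distribution (Ambari/Cloudera Manager-managed clusters automate parts of this).

```shell
#!/bin/sh
# Hypothetical sketch of onboarding a new Hadoop user.
# run() only prints each step, so the plan can be reviewed without a live
# cluster; on a real edge node you would execute the commands directly.
run() { PLAN="${PLAN}+ $*
"; echo "+ $*"; }

NEW_USER="jdoe"          # illustrative user name, not from the posting
REALM="EXAMPLE.COM"      # illustrative Kerberos realm

run useradd -m "$NEW_USER"                                 # Linux account
run kadmin.local -q "addprinc ${NEW_USER}@${REALM}"        # Kerberos principal
run hdfs dfs -mkdir -p "/user/${NEW_USER}"                 # HDFS home directory
run hdfs dfs -chown "${NEW_USER}:${NEW_USER}" "/user/${NEW_USER}"
# Smoke tests: confirm the new user can write to HDFS and reach Hive.
run sudo -u "$NEW_USER" hdfs dfs -put /etc/hosts "/user/${NEW_USER}/"
run sudo -u "$NEW_USER" beeline -e "SHOW DATABASES;"
```

On Ranger-secured clusters, the same onboarding would also add the user to the relevant Ranger policies before the smoke tests pass.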


Apply for this job