

Job Details

 

Hadoop Administrator (Full Time)

Location: Tempe, Arizona | Country: United States of America | Rate: 100-120K
 

Prestigious Fortune 500 company is currently seeking a Big Data Engineer with strong Hadoop administration experience. The candidate will work in an agile environment, interacting with multiple technology and business areas to design and develop next-generation analytics platforms and applications. The candidate will be responsible for the strategy and design of complex projects as well as coding, and will also support project planning and mentoring.

Responsibilities:

Work closely with the various teams (data science, database, network, BI, and application teams) to ensure that all big data applications are highly available and performing as expected; an illustrative health-check sketch follows this list

Administer Hadoop, HDFS, YARN, Spark, Sentry/Ranger, HBase, and ZooKeeper

Design, install, and maintain big data analytics platforms (on-premises and cloud), including security, capacity planning, cluster setup, and performance tuning.

Manage public and private cloud infrastructure.
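
Purely as an illustration of the availability responsibility above, the following is a minimal sketch of a cluster health check, assuming a standard Hadoop/YARN CLI installation (hdfs and yarn on the PATH) and an already-authenticated session; the output formats parsed and the exit-code convention are assumptions, not part of this posting.

#!/usr/bin/env python3
# Minimal cluster health sketch (illustrative only): flags dead DataNodes
# and lost NodeManagers using the standard Hadoop/YARN CLI tools.
import subprocess
import sys

def run(cmd):
    # Run a CLI command and return its stdout as text.
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

def dead_datanodes():
    # Parse the dead-DataNode count from `hdfs dfsadmin -report`
    # (the summary line typically looks like "Dead datanodes (2):").
    for line in run(["hdfs", "dfsadmin", "-report"]).splitlines():
        if line.strip().lower().startswith("dead datanodes"):
            return int(line.split("(")[1].split(")")[0])
    return 0

def lost_nodemanagers():
    # Count NodeManagers reported as LOST by `yarn node -list -all`.
    listing = run(["yarn", "node", "-list", "-all"])
    return sum(1 for line in listing.splitlines() if " LOST " in f" {line} ")

if __name__ == "__main__":
    dead, lost = dead_datanodes(), lost_nodemanagers()
    print(f"dead DataNodes: {dead}, lost NodeManagers: {lost}")
    # A non-zero exit lets a scheduler or monitoring agent raise an alert.
    sys.exit(1 if dead or lost else 0)

In practice a check like this would be wired into whichever monitoring tool the team already runs rather than executed by hand.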

Qualifications:

Deep understanding of the distributed Hadoop ecosystem, network connectivity, and I/O throughput, along with other factors that affect distributed system performance

Expert in configuring and troubleshooting all components of the Hadoop ecosystem, such as MapReduce, YARN, Pig, Hive, HBase, Sqoop, Flume, ZooKeeper, and Oozie (an understanding of all of these is expected)

Experience installing and configuring cluster monitoring tools such as Cloudera Manager/Ambari, Ganglia, or Nagios (at least one of these)

Hands-on scripting experience with Bash, Perl, Ruby, or Python (at least one of these); an illustrative Python sketch follows this list

Working knowledge of hardening Hadoop with Kerberos, TLS/SSL, and HDFS encryption.

Working knowledge of Jenkins, Git, and AWS.

Good understanding of automation tools (e.g., Puppet, Ansible)

Expert in configuring and troubleshooting Hadoop ecosystem components such as Spark, Solr, Scala, and Kafka

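As a hedged illustration of the scripting qualification above (Bash, Perl, Ruby, or Python), below is a small Nagios-style HDFS capacity check in Python; the "DFS Used%" parsing and the 80/90 percent thresholds are assumptions rather than anything specified in the posting.

#!/usr/bin/env python3
# Nagios-style HDFS capacity check (illustrative sketch, not from the posting).
# Parses "DFS Used%" from `hdfs dfsadmin -report` and exits with the usual
# Nagios codes: 0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN.
import re
import subprocess
import sys

WARN_PCT = 80.0   # assumed warning threshold
CRIT_PCT = 90.0   # assumed critical threshold

def dfs_used_percent():
    # Return the cluster-wide "DFS Used%" figure from the dfsadmin report.
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"DFS Used%:\s*([\d.]+)\s*%", report)
    if not match:
        print("UNKNOWN: could not parse DFS Used% from dfsadmin report")
        sys.exit(3)
    return float(match.group(1))

if __name__ == "__main__":
    used = dfs_used_percent()
    if used >= CRIT_PCT:
        print(f"CRITICAL: HDFS {used:.1f}% used")
        sys.exit(2)
    if used >= WARN_PCT:
        print(f"WARNING: HDFS {used:.1f}% used")
        sys.exit(1)
    print(f"OK: HDFS {used:.1f}% used")
    sys.exit(0)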


Posted Date: 06 Dec 2018 | Reference: JSCJ-HADAZ | Employment Agency: Request Technology - Craig Johnson | Contact: Craig Johnson