This is Aravind from the Recruitment Team at MSR Cosmos.
We have an urgent requirement as follows:
Please respond with resumes in MS Word format, along with the following details, to email@example.com
Full Name :
Contact Number :
Skype Id :
Last 4 digits of SSN :
Availability for project :
Availability for Interviews :
Visa Status and Validity :
Date of Birth :
Years of Experience :
Sr. Hadoop Admin (Cloudera)
Location: Burlington, NC
USC/GC/GC EAD/L2 EAD/H4 EAD/E3/TN
The consultant is responsible for the implementation and ongoing administration of the Hadoop infrastructure on BDA. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for new users.
* Providing expertise in provisioning physical systems for use in Hadoop.
* Installing and configuring systems for use with the Cloudera distribution of Hadoop (with consideration given to other Hadoop distributions such as Apache and MapR, and to ecosystem components such as Spark, Hive, Impala, Kafka, and Flume).
* Administering and maintaining Cloudera Hadoop clusters.
* Provisioning, patching, and maintaining physical Linux systems.
* Performance tuning of Hadoop clusters and Hadoop MapReduce/Spark routines.
* Managing and supporting Hadoop services including HDFS, Hive, Impala, and Spark, primarily via Cloudera Manager with some command-line work.
* Red Hat Enterprise Linux operating system support, including administration and provisioning of the Oracle BDA.
* Answering trouble tickets concerning the Hadoop ecosystem.
* Integration support for tools that need to connect to the OFR Hadoop ecosystem; this may include tools such as Bedrock, Tableau, Talend, generic ODBC/JDBC, etc.
* Provisioning new Hive/Impala databases.
* Provisioning new folders within HDFS.
* Setting up and validating Disaster Recovery replication of data from the Production cluster.
* Bachelor's degree in Computer Science.
* Demonstrated knowledge/experience in all of the areas of responsibility listed above.
* General operational knowledge, including solid troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
* Must have knowledge of Red Hat Enterprise Linux systems administration.
* Must have experience with Secure Hadoop (sometimes called Kerberized Hadoop) using Kerberos.
* Knowledge of configuration management and deployment tools such as Puppet or Chef, and of Linux scripting.
* Must have fundamentals of central, automated configuration management (sometimes called "DevOps").
Thanks & Regards,
5250 Claremont Ave, Ste 249 | Stockton, CA 95207
Desk : 925-399-7145
Fax : 925-219-0934
E-VERIFIED | WBE CERTIFIED | SAP Service Partner | Microsoft Silver Partner | Hortonworks & Cloudera Silver Partner | ORACLE GOLD PARTNER
Note: This email is not intended to be a solicitation. Please accept our apologies, and reply with REMOVE in the subject line to be removed from our mailing list.
You received this message because you are subscribed to the Google Groups "c2c jobs usa" group.
To unsubscribe from this group and stop receiving emails from it, send an email to firstname.lastname@example.org.
To post to this group, send email to email@example.com.
Visit this group at https://groups.google.com/group/c2cjobsusa.
For more options, visit https://groups.google.com/d/optout.