Hadoop Administrator / Hadoop Infrastructure Admin, San Jose, CA - Job Requirements

·Design and develop architectural models for scalable data processing and scalable data storage

·Build data pipelines and ETL processes from heterogeneous sources

·Build data ingestion from various source systems into Hadoop using Kafka, Flume, Sqoop, Spark Streaming, etc. (a minimal ingestion sketch follows this list)

·Transform data using data-mapping and data-processing capabilities such as MapReduce and Spark SQL (a transformation sketch also follows this list)

·Support Big Data and batch/real-time analytical solutions

·Own the platform architecture and continually improve it to support current and future requirements

·Design, implement, deploy, and maintain security; plan and forecast data capacity and node requirements.

·Provide hardware architectural guidance, plan and estimate cluster capacity, and create roadmaps for the Hadoop cluster deployment.

·Work closely with the Hadoop development, infrastructure, network, database, and business intelligence teams.

·Participate in a 12x7 rotation for production issue escalations.

·Communicate effectively with people at all levels of the organization.
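
For illustration, a minimal PySpark Structured Streaming sketch of the kind of Kafka-to-HDFS ingestion described in the bullets above; the broker address, topic name, and HDFS paths are hypothetical placeholders, and the spark-sql-kafka connector is assumed to be on the classpath:

    from pyspark.sql import SparkSession

    # Requires the Kafka connector, e.g.:
    #   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0 ...
    spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

    # Broker address and topic name are hypothetical placeholders.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")
              .option("subscribe", "events")
              .load())

    # Land raw key/value pairs on HDFS as Parquet; the checkpoint directory
    # lets the stream recover its Kafka offsets after a restart.
    query = (events
             .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
             .writeStream
             .format("parquet")
             .option("path", "hdfs:///data/raw/events")
             .option("checkpointLocation", "hdfs:///checkpoints/events")
             .start())

    query.awaitTermination()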

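Similarly, a sketch of the Spark SQL data-mapping transform mentioned above; the database, table, and column names are invented for the example:

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("sql-transform-sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Hypothetical raw table registered in the Hive metastore.
    spark.table("raw.events").createOrReplaceTempView("events")

    # Map raw fields onto a curated schema with a Spark SQL query.
    curated = spark.sql("""
        SELECT CAST(event_ts AS TIMESTAMP) AS event_time,
               user_id,
               UPPER(event_type)           AS event_type
        FROM events
        WHERE user_id IS NOT NULL
    """)

    curated.write.mode("overwrite").saveAsTable("curated.events")
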
Requirements:

·B.S. degree in Computer Science, Electrical and Computer Engineering, or an equivalent technical field.

·Experience with tools integration, automation, and configuration management systems.

·Strong networking and systems administration skills.

·3+ years of hands-on experience with the Hadoop infrastructure stack (e.g., HDFS, MapReduce, HBase, Flume, Spark, Pig, Hive, Oozie, YARN, ZooKeeper, Presto).

·2+ years of experience in Python or Perl.

·Experience with related ecosystem components (MapReduce, HBase, Hive, Impala, Spark, Kafka, Kudu, Solr).

·Strong development and automation skills; must be comfortable reading and writing Scala, Python, or Java code.

·Proven ability to build curated data models for BI and analytics.

·Strong SQL skills.

·Experience: 10+ years.
