Big Data/Hadoop Developer at Multiple Locations - Job Requirements

Qualifications:

  • Minimum of 4 years of experience in Hadoop/Big Data
  • Big Data technologies: Spark, Scala, Hadoop, MapReduce, HDFS, and Cassandra/HBase
  • Understanding of the information management life cycle required to acquire, process, and analyze large sets of structured and unstructured data.
  • Experience with high-speed messaging and streaming frameworks (Kafka, Storm)
  • Experience working with large volumes of data in high-performance compute environments.
  • Familiarity with Hadoop and MapReduce programming.
  • Ability to create performance metrics to optimize, monitor, and track the effectiveness of models and tactics.
  • Experience optimizing data transformations, model development, and model validation.
  • Ability to design and develop data mining techniques to gather, process, and analyze complex data sets, including structured and unstructured data such as web logs, device logs, text log data, and unstructured social media content.
  • Ability to support the development and use of Hive, SQL and NoSQL, and SQL-H queries to access structured and unstructured data sources.