Big Data Technologies Job requirement

Detailed Job Description:

  • BS degree in computer science, computer engineering or equivalent
  • 5-6 years of experience delivering enterprise software solutions
  • Proficient in Spark, Scala, Python, AWS Cloud technologies
  • 3+ years of experience across multiple Hadoop/Spark ecosystem technologies such as MapReduce, HDFS, HBase, Hive, Flume, Sqoop, Kafka, and Scala
  • Flair for data, schemas, and data modeling, and for bringing efficiency to the big data life cycle
  • Must be able to quickly understand technical and business requirements and translate them into technical ...
