* Experience in developing Hadoop solutions and strong experience with Spark & Scala.
* Significant experience with the overall Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie).
* Extensive knowledge of Hadoop architecture, HDFS, and common file formats (minimum 3 years' experience).
* Hands-on experience writing MapReduce jobs (minimum 3 years' experience).
* Hands-on experience with Storm, HBase, Hive, and Pig Scripting.
* Familiarity with data loading tools such as Flume and Sqoop.
* Knowledge of workflow schedulers such as Oozie.
* Experience working in an agile environment.
* Excellent communication skills are mandatory.
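To illustrate the map/reduce pattern the MapReduce requirement above refers to, here is a minimal word-count sketch in plain Java. It has no Hadoop dependencies (in a real job these would be Hadoop `Mapper`/`Reducer` implementations); the class and method names are illustrative only.

```java
import java.util.*;

public class WordCountSketch {
    // "Map" phase: emit (word, 1) pairs for each word in an input line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(word, 1));
            }
        }
        return pairs;
    }

    // "Reduce" phase: group pairs by key and sum the counts per word.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> input = Arrays.asList("big data big jobs", "data pipelines");
        List<Map.Entry<String, Integer>> mapped = new ArrayList<>();
        for (String line : input) {
            mapped.addAll(map(line));
        }
        System.out.println(reduce(mapped)); // {big=2, data=2, jobs=1, pipelines=1}
    }
}
```

In Hadoop itself, the framework handles the shuffle between the two phases and distributes both across the cluster; this sketch only shows the per-record logic a candidate would be expected to write.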