Spark Developer - Hadoop

Softtech Career Infosystem Pvt. Ltd
  • Location: Bangalore
  • Salary: ₹9-13 lakh per annum
  • Experience: 5-8 years
  • Posted: 18 Dec 2017
  • Industry: IT/Information Technology
  • Category: IT/Technology - Software/Services
Job Description

Principal Accountabilities:

- Design and implement applications that support distributed processing using the Hadoop ecosystem

- Build libraries, user-defined functions, jobs and frameworks around Hadoop

- Research, evaluate and use new technologies, tools and frameworks in the Hadoop ecosystem, such as Apache Spark, HDFS, Hive and HBase

- Develop user-defined functions (UDFs) to provide custom HBase/Hive capabilities

- Develop ETL jobs using Sqoop, Oozie, Impala, Spark, etc. (see the sketch after this list)

- Participate in the installation, configuration and administration of single-node and multi-node clusters with technologies like HDFS and Apache Spark
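
As an illustration of the ETL work described above, here is a minimal Spark sketch in Scala. The table name (`staging.orders`), column names and output path are hypothetical placeholders, and the UDF shown is a Spark SQL UDF rather than a native Hive/HBase UDF, though both follow a similar register-then-use pattern.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    // Hive support lets Spark read tables defined in the Hive metastore.
    val spark = SparkSession.builder()
      .appName("OrdersEtl")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical UDF: normalize free-text country codes to upper case.
    val normalizeCountry =
      udf((c: String) => if (c == null) null else c.trim.toUpperCase)

    spark.table("staging.orders")                    // hypothetical Hive table
      .withColumn("country", normalizeCountry(col("country")))
      .filter(col("order_ts").isNotNull)             // drop rows missing a timestamp
      .write
      .mode("overwrite")
      .partitionBy("country")
      .parquet("hdfs:///warehouse/curated/orders")   // hypothetical output path

    spark.stop()
  }
}
```

In production, a job like this would typically be fed by Sqoop imports from relational sources and scheduled as a step in an Oozie workflow, as the bullet above suggests.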

Role Requirements:

- 5+ years of experience building and managing complex products/solutions

- 2+ years of proficient programming in Scala/Java

- 2+ years of hands-on development experience with Spark

- 2+ years of experience building web services on a Java/Python/Scala stack

- Should be able to understand and write complex SQL queries (a Spark SQL example follows this list)

- Should have hands-on development experience with Big Data ecosystem components such as Hive, Impala, HBase, Sqoop and Oozie

- Good to have: experience with DW/ELT/ETL technologies

- Good to have: experience developing RESTful web services

- Knowledge of web technologies and protocols (NoSQL/JSON/REST/JMS)
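
To make the SQL expectation concrete, here is a small Spark SQL sketch in Scala using a window function, the kind of construct "complex SQL" usually implies. The table and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object TopOrdersQuery {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TopOrdersQuery")
      .enableHiveSupport()
      .getOrCreate()

    // Rank each customer's orders by amount and keep the top 3 per customer.
    // Window functions like ROW_NUMBER() are a common "complex SQL" building block.
    val topOrders = spark.sql(
      """
        |SELECT customer_id, order_id, amount
        |FROM (
        |  SELECT customer_id, order_id, amount,
        |         ROW_NUMBER() OVER (PARTITION BY customer_id
        |                            ORDER BY amount DESC) AS rn
        |  FROM sales.orders          -- hypothetical Hive table
        |) ranked
        |WHERE rn <= 3
        |""".stripMargin)

    topOrders.show(truncate = false)
    spark.stop()
  }
}
```

Essentially the same query also runs in Hive or Impala, which is why fluency in plain SQL carries across the ecosystem components listed above.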


Competencies/Skill sets for this job

Big Data, JMS, Hands-on, Java, JSON

Job Posted By

O.P. Chawla
Director