Spark Developer - Hadoop/HBase/ETL

Soft tech Career Info system Pvt. Ltd
  • Bangalore
  • 10-15 lakh
  • 5-8 years
  • 18 Oct 2016

  • Software Design & Development

  • IT/Technology - Software/Services
Job Description

Position Purpose :

As a Spark Developer, you will be part of a next-generation components build team that designs and implements applications to support distributed processing using the Hadoop ecosystem.

Principal Accountabilities :

- Design and implement applications to support distributed processing using the Hadoop ecosystem

- Build libraries, user-defined functions, jobs and frameworks around Hadoop

- Research, evaluate and utilize new technologies/tools/frameworks in the Hadoop ecosystem, such as Apache Spark, HDFS, Hive and HBase

- Develop user-defined functions to provide custom HBase/Hive capabilities

- Develop ETL jobs using Sqoop, Oozie, Impala, Spark, etc.

- Participate in the installation, configuration and administration of single-node and multi-node clusters with technologies like HDFS and Apache Spark

Role Requirements :

- 5+ years of experience building and managing complex products/solutions

- 2+ years of proficient programming in Scala/Java

- 2+ years of hands-on development experience with Spark

- 2+ years of experience building web services using a Java/Python/Scala stack

- Should be able to understand and write complex SQL queries

- Should have hands-on development experience with Big Data ecosystem components such as Hive, Impala, HBase, Sqoop and Oozie

- Good to have experience in DW/ELT/ETL technologies

- Good to have experience developing RESTful web services

- Knowledge of web technologies and protocols (NoSQL/JSON/REST/JMS)

Competencies/Skill sets for this job

Hands-On, Web Technologies, Protocols, Python, JSON, Apache

Job Posted By

O.P. Chawla