We are looking for a strong team player with a willingness to learn and implement new Big Data technologies.
· Design and implement applications to support distributed processing using the Hadoop ecosystem.
· Build libraries, user defined functions, and frameworks around Hadoop
· Research, evaluate, and utilize new technologies/tools/frameworks in the Hadoop ecosystem, such as Apache Kafka, Apache Spark, HDFS, Hive, HBase, etc.
· Develop user-defined functions to provide custom HBase/Hive capabilities
· Participate in the installation, configuration, and administration of single-node and multi-node clusters with technologies like HDFS, Apache Kafka, Apache Spark, etc.
· 8-12 years of experience building and managing complex products/solutions.
· 5+ years of total experience in Java/J2EE technologies
· 2+ years of hands-on development experience with Big Data technologies such as Hadoop (Apache Hadoop, MapR Hadoop, HBase)
· Proficient in streaming-based analytics using Apache Storm and Apache Spark
· Expert-level programming in Java
· Hands-on experience building web services in a Java/PHP/Python stack
· Experience developing RESTful web services in the Spring framework
· Knowledge of web technologies and protocols (NoSQL/JSON/REST/JMS)