IT development experience in data, databases, Java, or other programming languages.
Minimum of 5 years' working experience with the Apache Hadoop framework: HDFS, MapReduce, Hive, Flume, Sqoop, Oozie, Spark, Spark Streaming, and Scala.
Minimum of 5 years' experience in Big Data and batch/real-time ingestion and analytical solutions leveraging transformational technologies.
Minimum of 5 years' experience in Java, Spring, Elasticsearch, and Cassandra.
Good understanding of Lambda architecture.
Architects Big Data analytics frameworks.
Translates complex functional and technical requirements into detailed architecture, design, and high-performing software.
Codes, tests, and documents new or modified data systems to create robust and scalable applications for data analytics.
Implements security and recovery tools and techniques as required.
Ensures all automated processes preserve data by managing the alignment of data availability and integration processes.
Exposure to Cloudera Manager, Cloudera Navigator, Hue, shell scripting, Apache Pig, and Python.
Good understanding of XSD, XML, and JSON schemas.
Exposure to graph databases (Neo4j).
Technical leadership skills and experience as an Enterprise Architect are a plus.
Experience with IT security and standardization for Hadoop-related technologies.
Experience integrating Hadoop with Oracle and NoSQL databases.
Exposure to MSTR, BI/BO, Tableau, and other data visualization tools.

Competencies / Agile Skills
Technology doers / full-stack developers
Drive efficiency and lead in cutting-edge technologies
Technical Capability development
Transparency and Collaboration
Self-driven and Self-learning
Problem solving and decision making
Appetite to develop on multiple platforms