- Minimum of 5 years of experience, including 3+ years of development experience with Big Data technologies such as Hadoop and Scala/Python.
- Ability to own and establish the physical architecture of a Big Data platform.
- Ability to design and support the development of a data platform for data processing (data ingestion and transformation) and a data repository using Big Data technologies from the Hadoop stack, including HDFS clusters, MapReduce, Spark, Scala, Hive, and Impala.
- Past experience building proofs of concept with Big Data technologies to test various use cases.
- Ability to support logical data model design and convert it into a physical data model.
- Ability to design and support the development of a data mart using Oracle and the Ab Initio ETL platform.
- Ability to design and support the development of RESTful web services for data distribution to downstream applications.
- Past experience working with best practices and standards for Big Data platforms and web services.
- Past experience translating functional and technical requirements into detailed designs.