Responsible for driving projects end to end, acting primarily as an individual contributor
* Work on building data marts, data migration, and automation scripts on the Hadoop ecosystem
* Understanding of Google Cloud Platform is preferred
* The candidate should be able to plan and execute projects and guide junior members of the team
* Drive communication with internal and external stakeholders
Desired Candidate Profile:
Qualification & Experience:
* Bachelor's or Master's degree in a technology related field (e.g. Engineering, Computer Science, etc.) required
* 5+ years of experience with Big Data processing technologies such as Spark, Hadoop, etc.
* 4+ years of extensive experience in object-oriented programming (Java, Scala, Python)
* 4+ years of hands-on experience implementing data integration frameworks to ingest terabytes of data, in batch and real time, into an analytical environment
* 4+ years of experience developing Big Data applications in the cloud, preferably AWS, is highly desirable
* Deep knowledge of database technologies, both relational and NoSQL