The responsibilities of the role include:
Understand client requirements and their business viability.
Develop detailed solutions per the specification.
Build data lakes from multiple data sources, with ingestion, data governance and auditing capabilities.
Have the mindset to challenge the status quo with out-of-the-box solutions.
Able to translate functional and technical requirements into detailed designs.
Able to implement and follow industry best practices in relevant technologies.
Good troubleshooting and application performance tuning experience.
Prepare and produce releases of software components.
Work with stakeholders for regular updates, requirements understanding and design discussions.
Excellent team player.
Able to learn new technologies and cross-skill as per project needs.
Able to work in an agile methodology.
Able to work with remote teams across time zones.
Demonstrated work experience in data analytics, preferably on financial services / surveillance analytics projects.
Strong understanding of cloud computing fundamentals (e.g. PaaS vs. IaaS vs. SaaS, serverless computing, etc.) in Azure/AWS/GCP; Azure is highly preferred.
Hands-on experience developing data pipelines using Azure PaaS offerings such as Data Lake Store, Event Hubs and Azure Functions, or similar offerings from AWS/GCP.
Strong object-oriented / functional programming skills in C# / Python / Java / Scala; C# is highly preferred.
Strong data engineering skills (e.g. ETL, data profiling, data quality) to build data pipelines.
Database: good knowledge of SQL and NoSQL databases (specific experience with Cosmos DB is a bonus).
Hadoop developers on Azure (HDInsight) with Python / Scala / Java experience will also be considered.
Experience & Background
3–5 years of software engineering and solution development experience.
BS/MS degree in Computer Science, Engineering or a related subject.
Excellent communication skills.