Analyze business/functional requirements, review the solution architecture for ETL projects, and develop the detailed design.
Develop and unit test the integration solution, individually or in collaboration with onsite/offshore development resources from external delivery partners.
Perform peer code reviews and ensure that the solution is aligned with pre-defined architectural standards, guidelines, and best practices, and meets quality standards.
Deploy the solution to a high-availability shared-service production environment.
Understand and comply with the established software development life cycle methodology.
Proactively identify opportunities for process improvement and implement them.
Conduct feasibility studies, analyze data issues, and identify and articulate the business impact of data problems.
Establish and enhance technical guidelines and best practices for the integration development team.
Utilize subject-matter expertise in enterprise applications and solutions to evaluate complex, sensitive business problems and architect technical solutions.
Keep current with industry technologies and tools in the integration domain.
Key Technical Requirements:
Good understanding of data integration methodologies, with strong experience in Pentaho Data Integration (PDI/Kettle) version 7, including configuration
Experience working on cloud-based data integration projects on AWS or Azure
Expertise in working with Redshift/PostgreSQL as a target
Working knowledge of Git
Good understanding of Data Management/Data Quality processes
Well versed in data modeling techniques
Ability to work on projects and deliver independently in a critical consulting role
Certification in relevant technologies will be a plus.