Individual Contributor responsible for developing big data, BI, and analytics platform technology capabilities as competitive differentiators. Requires a minimum of 12 years of applied technical experience in BI, analytics, and visualization, including at least three years in big data (Hadoop), search, and various data integration tools.
Essential Duties & Responsibilities:
Partners with key business GIO stakeholders to assess needs and maintains the big data, BI, and analytics demand pipeline tied to business value.
Drives big data, BI, and analytics innovation; develops and maintains technology platform strategies and roadmaps
Responsible for end-to-end creation of big data, NoSQL, and cloud platform service offerings
Understands business needs and emerging technologies, and ensures solution roadmaps are aligned with both
Maintains in-depth technical knowledge of data, BI, and analytics trends and practical solution frameworks
Performs proofs of concept and pilots for big data tools, and manages relationships with technology partners
Assesses the data and BI/analytics solution landscape, and designs data domain services for global analytics.
Delivers agile consultative engagement services across business domains, and serves as the internal expert on big data and analytics platforms and solution designs.
Works well in a matrix organization, partnering across business GIO and IT organizations to understand business, technology, and process requirements and to deliver robust, scalable, cost-effective, and reliable big data, BI, and analytics platforms and solutions.
Proven experience designing and delivering innovative self-service BI analytics (data acquisition, integration, visualization) that effectively surface information and provide actionable insights.
12 to 16 years of total experience in BI, analytics, visualization, data integration, and designing and developing data warehouses (ETL, databases, front-end tools).
9+ years of experience in designing and developing Business Intelligence and data warehouse systems (ETL, data modelling, visualization)
3+ years in any big data or data analytics technologies such as Hadoop, Azure Data Lake, machine learning, Azure Data Factory, Pig, Hive, etc.
6+ years of experience in organizing, planning, and executing small architectural projects.
Experience documenting complex processes and presenting them in a clear and understandable format is required.
Experience working in a regulated environment is preferred.
Competencies required:
Demonstrated ability to plan and execute tasks such as patching, applying hot fixes, and integrating with other systems
Demonstrated strong analytical skills
Ability to proactively identify edge cases
Excellent interpersonal and communication skills (written and verbal), with the ability to present ideas effectively
Demonstrated troubleshooting and analytical skills required to resolve complex issues