Lead/Architect - Big Data Developer - Hadoop/Java

Soft tech Career Info system Pvt. Ltd
  • Delhi, Noida
  • 10-20 lakh
  • 7-10 years
  • 29 Jun 2017

  • IT/ Information Technology

  • IT/ Technology - Software/ Services
Job Description

1. Experience in Java

2. Knowledge in Hadoop (HDFS and MapReduce) concepts

3. Good knowledge of database structures, theories, principles, and practices

4. Ability to write MapReduce jobs

5. Proven understanding of Hadoop, HBase, Hive, and Pig

6. Ability to write Pig Latin scripts

7. Hands on experience in HiveQL

8. Familiarity with data loading tools such as Flume and Sqoop

9. Knowledge of workflow schedulers like Oozie

10. Good aptitude in multi-threading and concurrency concepts

11. Experience loading data from disparate data source sets

12. Certifications such as Cloudera Developer/Administrator are an added advantage

13. Hands-on experience with at least two NoSQL databases

14. Ability to analyze and identify issues with an existing cluster and suggest architectural design changes

15. Ability to implement Data Governance in Hadoop clusters

16. Experience with Hortonworks Data Platform, with a Java background

17. Strong understanding of underlying Hadoop concepts and distributed computing

18. Strong skills in writing MapReduce jobs

19. Expertise with Hive

20. Experience working with Big Data on Amazon Web Services

21. Experience with Redshift, Elastic MapReduce, and S3 on AWS

22. Customer-facing skills; responsibility for deliverables, schedule, and effort management

23. Ability to lead and manage a team

24. Performing requirement analysis and choosing the platform

25. Designing the technical architecture and application design

26. Deploying the proposed Hadoop solution

27. Has played pivotal roles as an engineer and architect across domains

28. Experience with machine learning algorithms and data mining techniques.

29. Analytical and problem-solving skills, applied to the Big Data domain

30. Comfort with Agile methodologies, in order to arrive at difficult engineering decisions quickly

31. Ability to clearly articulate the pros and cons of various technologies and platforms

32. Ability to document use cases, solutions, and recommendations
Position 2

Big Data Lead

Skills Required:

- Work with the team to provide hardware architectural guidance, plan and estimate cluster capacity, and create roadmaps for Hadoop cluster deployment

- Evaluate Hadoop infrastructure requirements and design/deploy solutions (high availability, big data clusters, etc.)

- Troubleshoot and debug Hadoop ecosystem runtime issues

- 4 years' experience with Hadoop

- Strong experience in tuning long-running queries in Hadoop environments

- Hands-on experience with the Hadoop stack, including Sqoop, HDFS, MapReduce, HBase, Pig, Hive, and Oozie

- Ability to multi-task and prioritize duties with respect to business needs

- Hands-on expertise in systems administration, Linux tools, and configuration management in large-scale environments

- Expertise with data integration and scheduling tools, preferably Cloudera Manager, Nagios, and Ganglia

- Exposure to configuration tools like Chef/Puppet

- Working knowledge of the Unix operating system, its command set, and basic shell scripting

- Systems analysis skills to support/maintain operational systems

- Strong communication skills, verbal and written

- Very good knowledge of scheduling and job prioritization for a high demand 24/7 Business Intelligence organization.


Competencies/Skill sets for this job

Hadoop, Java, Hive, Pig, Big Data

Job Posted By

O.P. Chawla
Director