Cassandra Administrator

Soft tech Career Info system Pvt. Ltd
  • Hyderabad
  • Confidential
  • 4-8 years
  • 02 Apr 2018

  • IT/ Information Technology

  • Consumer Durables/ Semi Durables
Job Description

Required Skills: Cassandra DBA; Hadoop/Cloudera experience is a plus.

Job Description: Cassandra Admin with Hadoop/Cloudera experience

We have a requirement for a Cassandra Admin position (possible C2H) who should also have Cloudera Hadoop admin experience, including sysadmin (Unix/Linux) experience. This position should be able to define standards (HA/DR/replication, etc.) on Cassandra DB for the client.

  • Deep understanding of the internals of the NoSQL approach
  • Experience installing, configuring, upgrading, managing, and administering a Cassandra database
  • Responsible for database deployments; monitors capacity, performance, and troubleshooting issues
  • Expert experience with Cassandra and other NoSQL databases
  • Expert knowledge of Cassandra architecture
  • Advanced knowledge of various troubleshooting tasks (e.g. latency analysis, thread state analysis)
  • Knowledge of installation and configuration procedures
  • Knowledge of adding/bootstrapping nodes to clusters
  • Knowledge of removing/replacing nodes in clusters
  • Knowledge of replication between data centers
  • End-to-end performance tuning of Cassandra clusters against very large data sets
  • Monitoring of Cassandra clusters, performance, and capacity planning
  • Cassandra cluster connectivity and security
  • Conducting Cassandra training
  • Software installation and configuration
  • Software patches and upgrades
  • Use of monitoring and management tools (e.g. DataStax OpsCenter)
  • Troubleshooting Cassandra issues with other DBAs/developers
  • Point of contact for vendor escalation
  • Database backup and recovery
  • Database connectivity and security
  • Database creation and role assignment
  • Table indexing and partitioning

Daily operations include, but are not limited to, the installation, configuration, and support of the Hadoop ecosystem on RedHat operating systems. The role provides support and leadership for new project initiatives and supports existing environments, including after-hours support as part of an on-call pager rotation.
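The day-to-day Cassandra tasks listed above (cluster monitoring, latency and thread state analysis, repair, backup, node removal) map onto standard `nodetool` subcommands. A minimal sketch, assuming a live cluster; the host name `cass-node-1`, keyspace `app_ks`, and table `my_table` are hypothetical examples:

```shell
#!/bin/sh
# Routine Cassandra cluster checks and maintenance via nodetool.
# Host "cass-node-1", keyspace "app_ks", and table "my_table" are
# placeholder names -- substitute your own cluster's values.

# Cluster health: status of every node (UN = Up/Normal).
nodetool -h cass-node-1 status

# Latency analysis: read/write latency histograms for one table.
nodetool -h cass-node-1 tablehistograms app_ks my_table

# Thread state analysis: thread pool stats expose blocked or dropped stages.
nodetool -h cass-node-1 tpstats

# Anti-entropy repair of one keyspace (run within gc_grace_seconds).
nodetool -h cass-node-1 repair app_ks

# Backup: flush memtables to disk, then snapshot the SSTables.
nodetool -h cass-node-1 flush app_ks
nodetool -h cass-node-1 snapshot -t nightly app_ks

# Clean node removal: run decommission on the node leaving the ring;
# for a dead node, use `removenode <host-id>` from another node instead.
nodetool -h cass-node-1 decommission
```

These commands require a reachable Cassandra JMX endpoint, so they are illustrative of the workflow rather than runnable standalone.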
MAJOR DUTIES AND RESPONSIBILITIES:

  • Actively and consistently supports all efforts to simplify and enhance the customer experience
  • Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations
  • Working knowledge of the whole Hadoop ecosystem (HDFS, Hive, YARN, Flume, Oozie, Kafka, Storm, Spark, and Spark Streaming), including NoSQL database knowledge
  • Recommends changes to improve systems and network configurations, and determines hardware or software requirements related to such changes
  • Designs and maintains access and security administration for the Hadoop ecosystem
  • Responsible for troubleshooting Hadoop infrastructure problems
  • Creates and maintains backup and recovery strategies
  • Designs, implements, and maintains Disaster Recovery (DR) methodologies and creates documentation
  • Uses established change management processes, requiring operational procedures to be performed with minimal customer impact
  • Researches, evaluates, and recommends software and hardware products
  • Provides new hardware specifications to users based on application and business needs and anticipated growth; installs new servers and maintains the server infrastructure
  • Defines procedures for monitoring; evaluates, diagnoses, and establishes work plans to resolve system issues
  • Performs troubleshooting for complex hardware, Hadoop cluster, and network problems, and provides plans to remediate complex business problems
  • Works on multiple projects as a project team member, occasionally as a project leader
  • Deploys all Hadoop components, including operating systems, for project and operational support
  • Knowledge of private and public cloud computing and virtualization platforms
  • Participates in on-call rotation

Related Work Experience:

  • 5 to 8 years of related IT work experience
  • Extensive experience with RedHat 5.x and 6.x and Cloudera 5.x is mandatory
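The Hadoop infrastructure duties above (monitoring, troubleshooting, backup planning) rest on a small set of standard HDFS and YARN admin commands. A minimal sketch, assuming a running cluster with the client tools on the PATH:

```shell
#!/bin/sh
# Day-to-day Hadoop cluster health checks on a Cloudera/RedHat stack.
# All commands assume a configured Hadoop client on a cluster node.

# HDFS capacity and DataNode liveness report.
hdfs dfsadmin -report

# File system integrity check: flags missing, corrupt,
# and under-replicated blocks across the namespace.
hdfs fsck / -files -blocks

# YARN NodeManager inventory: which workers are RUNNING/LOST.
yarn node -list -all

# Safemode state -- typically toggled around maintenance windows
# with `hdfs dfsadmin -safemode enter|leave`.
hdfs dfsadmin -safemode get
```

These require a reachable NameNode and ResourceManager, so treat the script as an operational checklist rather than a standalone program.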

Job Posted By

O.P. Chawla