Specialist – Big Data Operations Support

Experience: 3 - 5 yrs
Job Location: Gurugram
CTC: Best in Industry
Skills: incident management, problem management, Jira, escalation matrix, operations support activities, on-call support, big data, Hadoop, YARN, HDFS, Hive, Spark, Ambari, Hue, Unix shell scripting, Python, AWS cloud, CloudWatch, CloudTrail, S3, EC2, EMR, RDS, RDBMS, PostgreSQL, SQL Server, Oracle, DB2, data warehousing, verbal communication skills, written communication skills, scheduling tools

Job Description

Role Summary:

The role is for a Big Data Operations Specialist with experience supporting operational activities in the Big Data ecosystem, such as supporting environments, users, live applications, and workflows.
The incumbent will be an adaptable individual with good collaboration skills, able to flex their working hours to manage stakeholder expectations effectively.
Strategically, this role will add value by implementing best practices in Big Data operational activities.

Core Responsibilities:

- Provide on-call support for different applications and workflows on a rotational basis.
- Suggest fixes for complex issues through thorough analysis of the root cause and impact of each defect.
- Prioritize workload, providing timely and accurate resolutions.
- Perform production support activities, including issue assignment, analysis, and resolution within the specified SLAs.
- Coordinate with teams to deploy successfully in both User Acceptance Testing and Production environments.
- Provide support during project implementation and after transitions.
- Multitask across different work items and be flexible with shift timings, including the Canada daytime shift.

Qualifications/Industry Experience:

- BE or B.Tech from a reputed institute.
- 3 to 4.5 years of progressive experience in the design, development, or support of data projects.

Skills:

- Experience in incident and problem management, ticketing tools such as Jira, escalation matrices, and related operations support activities.
- Experience providing on-call support for applications and workflows in the Big Data ecosystem.
- Experience with the Hadoop ecosystem and related components such as YARN, HDFS, Hive, Spark, Ambari, Hue, etc.
- Experience in Unix shell and Python scripting.
- Experience building metrics to manage resources and demand in the Hadoop ecosystem.
- Knowledge and understanding of the AWS cloud ecosystem and components/services such as CloudWatch, CloudTrail, S3, EC2, EMR, RDS, etc.
- Knowledge of RDBMS (PostgreSQL, SQL Server, Oracle, DB2).
- Knowledge of data warehousing concepts.
- Strong verbal and written communication skills.
- Knowledge of scheduling tools.