Engineering Manager – Data DevOps
Job Description
Position: Principal Engineer – Data DevOps
Job Location: Noida
Job Overview
We are seeking an experienced Principal Engineer (Data DevOps) to lead our Data DevOps team in building, managing, and optimizing high-scale, secure, and reliable big data platforms. The ideal candidate will combine strong technical expertise with proven leadership skills, driving best practices in cloud infrastructure, automation, CI/CD, and big data technologies. This role involves managing cross-functional priorities, mentoring engineers, and ensuring delivery of high-quality, scalable data solutions.
Key Responsibilities
● Lead, mentor, and grow a high-performing Data DevOps team, fostering technical excellence and ownership.
● Drive architecture, design, and implementation of large-scale cloud and data infrastructure, ensuring scalability, performance, and security.
● Collaborate closely with Data Engineering, Data Science, Analytics, and Product teams to deliver efficient and reliable data platforms.
● Oversee operations and optimization of AWS-based infrastructure, including VPC, EC2, S3, EMR, EKS, SageMaker, Lambda, CloudFront, CloudWatch, and IAM.
● Manage and scale big data platforms leveraging Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Looker, and Jupyter Notebooks.
● Implement and maintain CI/CD pipelines and infrastructure automation using Terraform, Ansible, and CloudFormation.
● Ensure system observability, proactive incident management, and SLA adherence.
● Champion cloud security best practices, including API security, TLS/HTTPS, and access control policies.
● Partner with stakeholders to prioritize initiatives, manage budgets, and optimize cloud and operational costs.
Required Qualifications
● Experience: 8+ years in DevOps/Data DevOps or related fields, including at least 4 years in a leadership role.
● Proven track record in managing high-scale big data infrastructure and leading engineering teams.
● Strong hands-on experience with AWS services and infrastructure automation tools (Terraform, Ansible, CloudFormation).
● Deep knowledge and hands-on experience with Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Jupyter Notebooks, and Looker.
● Proficiency in Kubernetes/EKS, Docker, ECS, and CI/CD tools.
● Strong understanding of networking, cloud security, and compliance requirements.
● Excellent communication, stakeholder management, and decision-making skills.
● Exposure to SQL and data query optimization is an advantage.