PBT Group Careers

Be part of our team of Data Specialists and embark on a career of the future!

Job Title
Senior DataOps Engineer
Employment Type
Full Time
Experience
5 to 25 years
Salary
Negotiable
Job Published
22 October 2024
Job Reference No.
1056112203

Job Description

We are looking for a Senior DataOps Engineer to join our team and lead the development of DataOps practices within our organisation. As this is a relatively new area for us, we need someone with a strong DevOps background combined with software and data experience. You will set the direction for DataOps within the team, build scalable data pipelines, and ensure efficient data operations in a fully cloud-based environment (AWS). This role suits a senior software professional with a deep understanding of DevOps practices and a solid foundation in data.


Key Responsibilities

  • Lead DataOps Strategy:
    • Define and implement DataOps practices, guiding the team towards a more automated and efficient data pipeline management system.
    • Collaborate with data engineers, data scientists, and other stakeholders to establish best practices for data workflows and infrastructure.
  • Build & Maintain Data Pipelines:
    • Design, deploy, and manage scalable and reliable data pipelines to enable efficient data flow and processing across cloud environments.
    • Automate and optimise ETL/ELT processes, ensuring seamless data integration from various sources.
  • DevOps for Data:
    • Apply DevOps principles to data workflows, focusing on continuous integration, continuous delivery (CI/CD), and infrastructure-as-code (IaC) for data operations.
    • Implement version control and automation for data-related processes, reducing errors and improving data quality.
  • Cloud Infrastructure Management (AWS):
    • Manage and optimise the AWS cloud infrastructure supporting data workflows, including services such as S3, Redshift, Lambda, and RDS.
    • Monitor, maintain, and scale AWS resources for efficient data storage, processing, and analysis.
  • Data Security & Compliance:
    • Ensure data security and compliance with relevant industry standards and regulations, implementing appropriate security protocols and monitoring.
    • Collaborate with security teams to ensure secure handling of sensitive data and manage access controls effectively.
  • Monitoring & Optimisation:
    • Set up monitoring, logging, and alerting mechanisms for data pipelines to ensure high availability and performance.
    • Identify bottlenecks and inefficiencies in data processes, proposing and implementing optimisations.
  • Collaboration & Mentorship:
    • Provide technical leadership and mentorship to junior team members, helping them grow in their roles and deepen their knowledge of DataOps practices.
    • Work closely with data engineering, software development, and IT teams to drive cross-functional initiatives.
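To give a flavour of the pipeline and monitoring responsibilities above, here is a minimal Python sketch of an automated ETL step with retry logic, logging, and dead-letter handling. The functions and records are illustrative only, not part of this role's actual stack; in practice the extract step would read from sources such as S3 or RDS.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retry(fn, attempts=3, delay=0.1):
    """Re-run a flaky pipeline step, logging each failure (hypothetical helper)."""
    def wrapper(*args, **kwargs):
        for i in range(1, attempts + 1):
            try:
                return fn(*args, **kwargs)
            except Exception as exc:
                log.warning("step %s failed (%d/%d): %s", fn.__name__, i, attempts, exc)
                if i == attempts:
                    raise
                time.sleep(delay)
    return wrapper

@with_retry
def extract():
    # Stand-in for reading from S3/RDS; returns raw records.
    return [{"id": 1, "value": "10"}, {"id": 2, "value": "not-a-number"}]

def transform(rows):
    # Coerce values to int, routing bad records to a dead-letter list
    # instead of failing the whole batch.
    good, bad = [], []
    for row in rows:
        try:
            good.append({**row, "value": int(row["value"])})
        except ValueError:
            bad.append(row)
    log.info("transformed %d rows, %d rejected", len(good), len(bad))
    return good, bad

good, bad = transform(extract())
```

Separating retries, transformation, and dead-lettering like this is what makes a pipeline observable and automatable rather than a single opaque script.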


Qualifications & Experience

  • Proven experience as a DevOps Engineer, with significant exposure to data-focused environments.
  • Strong understanding of data concepts, including data pipelines, ETL/ELT processes, and data storage solutions.
  • Expertise in building, deploying, and managing data pipelines in cloud environments, especially AWS.
  • Familiarity with AWS services such as S3, Redshift, Lambda, RDS, and other relevant cloud-based data tools.
  • Experience with CI/CD pipelines, infrastructure-as-code, and automation tools (e.g., Jenkins, Terraform, Ansible).
  • Strong knowledge of version control systems (e.g., Git) and containerisation technologies (e.g., Docker, Kubernetes).


Preferred Skills

  • Experience with DataOps principles and practices, including data versioning, pipeline automation, and data observability.
  • Strong programming skills in languages such as Python, Java, or Scala, with experience in building data processing solutions.
  • Familiarity with data processing frameworks like Apache Spark, Kafka, or similar.
  • Exposure to security best practices for data storage and processing in cloud environments.


Technical Skills

  • DevOps Tools: Strong understanding of CI/CD tools, infrastructure automation, and version control.
  • Cloud Expertise: Deep familiarity with AWS cloud infrastructure, including automation and orchestration of services.
  • Data Management: Knowledge of modern data architectures, pipelines, ETL/ELT processes, and relevant technologies.
  • Scripting & Automation: Proficiency in automating tasks using scripting languages (Python, Bash) and DevOps tools.
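As an example of the scripting and automation skills listed above, the sketch below shows a small data-quality gate of the kind a DataOps pipeline might run in CI before loading a batch. The column names and the null-ratio threshold are illustrative assumptions, not a prescribed schema.

```python
# Required columns and threshold are illustrative, not a real schema.
REQUIRED_COLUMNS = {"id", "value", "updated_at"}

def quality_gate(rows, max_null_ratio=0.1):
    """Return (passed, issues) for a batch of dict records."""
    issues = []
    if not rows:
        return False, ["empty batch"]
    missing = REQUIRED_COLUMNS - set(rows[0])
    if missing:
        issues.append(f"missing columns: {sorted(missing)}")
    for col in REQUIRED_COLUMNS & set(rows[0]):
        # Reject the batch if too large a fraction of a column is null.
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls / len(rows) > max_null_ratio:
            issues.append(f"too many nulls in {col}")
    return not issues, issues

batch = [{"id": 1, "value": 3, "updated_at": "2024-10-22"},
         {"id": 2, "value": None, "updated_at": "2024-10-22"}]
ok, problems = quality_gate(batch)
```

A gate like this, wired into a CI/CD pipeline, turns data quality from a manual check into an automated, versioned control.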
