Apply for DevOps Data Engineer

  • LOCATION:
  • CLOSING DATE:

Job Summary:

We are looking for a dynamic DevOps Data Engineer to join our Teleport Operations team. The ideal candidate will design and maintain CI/CD pipelines, support scalable data engineering workflows, and collaborate across teams to ensure operational excellence. Experience with Skyline DataMiner, ServiceNow, the KESTRA orchestrator, and cloud platforms is a strong advantage.


Key Responsibilities:

Core DevOps & Data Engineering Duties

  • Design, implement, and manage CI/CD pipelines using tools such as Gitea.
  • Automate deployment, testing, and monitoring processes across development and production environments.
  • Integrate SonarQube for continuous code quality and security analysis.
  • Build and maintain data pipelines for ingesting, transforming, and storing operational telemetry.
  • Collaborate with analytics and engineering teams to support real-time data processing and reporting.
  • Ensure secure, compliant, and efficient delivery of software and data workflows.

Collaboration & Support

  • Work closely with network engineers, developers, and system integrators to align DevOps practices with operational goals.
  • Provide technical documentation and support for internal stakeholders.
  • Participate in agile ceremonies and contribute to sprint planning and retrospectives.

Mandatory Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or related field.
  • Minimum 5 years of experience in DevOps or software engineering roles.
  • Strong proficiency in CI/CD pipeline development and automation.
  • Solid understanding of data engineering principles and tools.
  • Familiarity with containerization technologies (Docker, Kubernetes).

Additional Skills That Are a Plus:

Skyline DataMiner Expertise

  • Experience developing DataMiner Integration Studio (DIS) packages.
  • Ability to build Services and SLA definitions aligned with operational requirements.
  • Skilled in creating modern, interactive dashboards for monitoring and analytics.
  • Familiarity with deploying and configuring DataMiner Agents (DMA) in distributed environments.

ServiceNow Integration

  • Experience integrating DevOps workflows with ServiceNow for incident, change, and configuration management.
  • Familiarity with ServiceNow APIs and scripting for automation and ticketing.
  • Ability to design and implement ServiceNow dashboards and reporting for operational visibility.
  • Understanding of CMDB and its role in infrastructure and service mapping.
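As an illustration of the ServiceNow integration work described above, the sketch below builds an incident record and submits it through the ServiceNow Table API. This is a minimal, hedged example: the instance URL is a placeholder, authentication is omitted, and `build_incident_payload` / `create_incident` are hypothetical helper names, not part of any ServiceNow SDK.

```python
import json
import urllib.request

# Hypothetical instance URL -- replace with your own ServiceNow instance.
INSTANCE = "https://example.service-now.com"

def build_incident_payload(short_description, urgency="2", impact="2"):
    # Field names follow the standard ServiceNow incident table schema.
    return {
        "short_description": short_description,
        "urgency": urgency,
        "impact": impact,
    }

def create_incident(payload):
    # POST the record to the Table API endpoint for the incident table.
    req = urllib.request.Request(
        f"{INSTANCE}/api/now/table/incident",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        method="POST",
    )
    # Real calls require authentication (Basic auth or an OAuth token),
    # omitted here for brevity.
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["result"]["sys_id"]
```

In a CI/CD context, a step like `create_incident` would typically be triggered on a failed deployment so the event lands in the same queue operators already monitor.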

KESTRA Orchestrator

  • Experience designing and managing workflows using KESTRA.
  • Ability to orchestrate complex data pipelines and operational tasks across distributed systems.
  • Familiarity with KESTRA’s plugin ecosystem and integration with cloud-native services.
  • Understanding of YAML-based workflow definitions and event-driven architecture.
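To give a feel for the YAML-based workflow definitions mentioned above, here is a minimal sketch of a KESTRA flow. The identifiers (`telemetry-pipeline`, `teleport.ops`) are placeholders, and the exact task and trigger type names depend on the KESTRA version and installed plugins.

```yaml
# Minimal KESTRA flow sketch -- illustrative only.
id: telemetry-pipeline
namespace: teleport.ops

tasks:
  - id: extract
    # Shell task from the KESTRA scripts plugin; type name may vary by version.
    type: io.kestra.plugin.scripts.shell.Commands
    commands:
      - echo "pull telemetry from source"

triggers:
  - id: hourly
    # Cron-style schedule trigger; type name may vary by version.
    type: io.kestra.plugin.core.trigger.Schedule
    cron: "0 * * * *"
```

Each task runs in order, and the schedule trigger makes the flow event-driven rather than manually launched.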

Cloud Technologies

  • Exposure to cloud platforms such as Azure, AWS, or GCP.
  • Understanding of cloud-native DevOps practices and infrastructure-as-code.

Send us your details

FILL IN THE FORM BELOW.

EDUCATION

  • Program
  • University Name
  • Year of Passing

EXPERIENCE DETAILS

  • Organization worked with
  • Designation
  • Duration
  • Reasons for leaving