DataOps Engineer

Job Locations Canada | United States
ID: 2025-10002
Type: Full-Time
Category: Information Technology

Company Overview

Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations by creating digital products, experiences, and solutions around what real people need. Our global consulting and delivery teams facilitate highly strategic digital initiatives through digital product engineering, automation, data engineering, and infrastructure modernization services, while elevating brands through digital experience, creative content, and customer data analytics services.

 

Don't just work, thrive. At Bridgenext, you have an opportunity to make a real difference - driving tangible business value for clients, while simultaneously propelling your own career growth. Our flexible and inclusive work culture provides you with the autonomy, resources, and opportunities to succeed. 

Position Description

Bridgenext is seeking a highly skilled and motivated DataOps Engineer to join our data infrastructure team. This role is ideal for someone passionate about building scalable, automated, and observable data systems. You’ll work on mission-critical data pipelines, infrastructure-as-code, and monitoring systems that support our growing portfolio of innovative projects.

 

You will be part of a collaborative team with on-call responsibilities, ensuring the reliability and performance of our data platforms.

 

Responsibilities include but are not limited to:

  • Design, deploy, and manage distributed applications in Kubernetes-based production environments.
  • Build and maintain change data capture (CDC) pipelines using Kafka and Kafka Connect.
  • Develop and optimize data pipelines and orchestration workflows using Airflow.
  • Manage and tune Azure SQL and NoSQL databases, including performance analysis and query optimization.
  • Implement and maintain infrastructure using Terraform or similar IaC tools.
  • Work with Snowflake and other columnar data stores to support data warehousing and analytics.
  • Set up and manage observability tools, especially New Relic, for monitoring and alerting.
  • Automate operational tasks using scripting languages such as Python and Bash.
  • Collaborate with development teams to build robust, scalable, and secure data systems.
  • Participate in on-call rotations and respond to high-priority incident alerts.

 

Workplace: fully remote from Canada or the US, supporting Central Time (CT) office hours

 

 

Must Have Skills:

  • 5+ years of experience in Data Engineering, DevOps, or a similar infrastructure/platform-focused role.
  • 2+ years of hands-on experience with Terraform or other infrastructure-as-code tools.
  • Strong Kubernetes experience in production environments.
  • Proficiency with Kafka, Kafka Connect, and CDC pipelines.
  • Experience with Snowflake and data warehousing best practices.
  • Solid understanding of orchestration tools like Airflow.
  • Familiarity with GitHub Actions, version control, and CI/CD pipelines.
  • Strong scripting skills (Python, Bash) and a bias toward automation.
  • Experience with observability tools like New Relic.

 

Preferred Skills:

  • Exposure to the logistics or supply chain industry.
  • Experience with Azure cloud services.

 

Professional Skills:

  • Solid written, verbal, and presentation communication skills
  • Strong contributor both on a team and individually
  • Maintains composure in all situations and is collaborative by nature
  • High standards of professionalism, consistently producing high-quality results
  • Self-sufficient and independent, requiring very little supervision or intervention
  • Demonstrates flexibility and openness, bringing creative solutions to address issues
     

Bridgenext is an Equal Opportunity Employer

 

Canadian and US citizens, and those authorized to work in Canada or the US, are encouraged to apply.

 
