Data Engineer III

Location: Remote/Hybrid
Experience: 4-7 Years
Employment Type: Full-Time

To be deployed at a well-known Canadian company.

Role Overview:

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments and requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development “scrums” and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Key Responsibilities:

  • Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
  • Perform application impact assessments and requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development “scrums” and solution reviews.
  • Mentor junior data engineers.
  • Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.

Skills & Qualifications:

  • Demonstrate SQL and database proficiency in various data engineering tasks.
  • Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations.
  • Model data to support business intelligence and analytics initiatives.
  • Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).
  • Bachelor’s degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
  • 4+ years of data engineering experience.
  • 2 years of data solution architecture and design experience.
  • GCP Certified Data Engineer (preferred).

What You Will Get:

  • Competitive compensation and benefits
  • Opportunity to work on enterprise-grade projects with leading clients
  • A collaborative, learning-focused culture
  • Flexible work environment and growth opportunities

How to Apply:
Interested candidates are invited to submit their resume, portfolio, and a cover letter explaining their suitability for the role to hr@servoedge.com with the subject line: Application for Data Engineer III – [Your Name].
