
Senior Data Engineer - MongoDB

Staffinity Inc
Halifax, NS
Posted today
Job details:
Full-time
Executive

Staffinity is currently seeking a Senior Data Engineer (MongoDB) for a client located in Halifax, NS. This is a full-time, permanent position with salary, benefits, RRSP matching, and vacation. The base salary range is $110k to $120k. This is a day-shift position, Monday to Friday, on a hybrid model of two days per week in the office and three from home.
Responsibilities:
Work with business stakeholders and cross-functional teams to understand data requirements and deliver scalable data solutions.
Design, develop, and maintain robust ETL processes to extract, transform, and load data from various sources into our data platform.
Build large-scale batch and event-driven data pipelines using cloud and on-premises hybrid data platform topology.
Design, model, map, and architect MongoDB/NoSQL and SQL databases and Redis caches, drawing on strong MongoDB experience.
Perform data sync, aggregations, and migrations.
Use Node.js for data loading, migrations, scripting, and query builder APIs.
Work closely with data architects to review solutions and data models and ensure adherence to data platform architecture guidelines and engineering best practices.
Take ownership of end-to-end deliverables and ensure high-quality software development.
Implement and enforce data quality standards and best practices while collaborating with data governance teams to ensure compliance with data policies and regulations.
Optimize data integration workflows for performance and reliability.
Troubleshoot and resolve data integration and data processing issues.
Leverage best practices in continuous integration and delivery using DataOps pipelines.
Stay informed about emerging technologies and trends in the data engineering domain.
Lead, mentor, and inspire a team of data engineers to achieve high performance levels.
Qualifications:
5+ years of experience building batch and real-time data pipelines leveraging big data technologies and distributed data processing using Spark, Hadoop, Airflow, NiFi, and Kafka.
Proficiency in writing and optimizing SQL queries and in at least one programming language such as Python or Scala.
Experience with cloud-based data platforms (Snowflake, Databricks, AWS, Azure, GCP).
Expertise using CI/CD tools and working with Docker and Kubernetes platforms.
Experience following DevOps and agile best practices.
Experience with MongoDB.
