About the role:
A leading global software development company is looking for an experienced Data Platform Developer to join its growing team in Oakville. The team is expanding quickly, and we're looking for people who follow their passion, think differently, and want to make an impact.
Core Responsibilities:
Develop and maintain new data infrastructure platforms that manage data ingestion, processing, orchestration, and applications.
Develop and optimize distributed systems for real-time and batch telematics data processing.
Develop processes to enrich big data with telematics data at scale.
Develop processes and implement logging, monitoring, and alerting services to ensure the health of the big data infrastructure.
Work with data scientists to understand data processing needs and develop infrastructure solutions to support these initiatives.
Create and maintain documentation for architecture, requirements, and process flows.
Support internal teams with integrating their data into the newly developed big data platforms.
Minimum Requirements:
Post-secondary degree in Computer Science, Software or Computer Engineering, or a related field.
5 years of experience in Software Engineering, Data Engineering, or a similar role.
5 years of experience developing production-level systems using Python.
3 years of experience with API design and implementation.
3 years of experience designing, building, and maintaining containerized production applications using tools such as Docker, Kubernetes, or OpenShift.
Knowledge of data management fundamentals and data storage principles.
Knowledge of gRPC, Protobuf, Apache Avro, Apache Beam is a plus.
Knowledge of Apache Kafka, Apache Flink, Apache Ignite, Apache Airflow, Apache Superset, Apache Olingo, and DataHub is a big plus.
Knowledge of batch and streaming data architectures.
Familiar with Big Data environments (e.g. Google BigQuery).