With a career at The Home Depot, you can be yourself and also be part of something bigger.
The Home Depot Canada offers a unique opportunity to become part of a transformative team as we continue our mission to provide the best interconnected shopping experience to our customers. As part of the Analytics team, you will be tasked with absorbing billions of rows of data from dozens of sources, then organizing, visualizing, and analyzing them to inform both short- and long-term decision-making, tackle problems, and derive insights. The team isn't just producing monthly reports or rerunning existing statistical models; it's doing creative work in an empowering environment, with each decision backed by data to support different areas of the customer journey. We need more dreamers, innovators, and big thinkers passionate about data and re-imagining the future of retail. Interested in making history with us? If so, apply today to experience what it's like to be part of the Data team at The Home Depot Canada.
Position Overview:
As a Data Engineer, you will work to improve our primary dataset and processing architecture, partnering with teams across the business, such as Contact Centre and Marketing, to help them achieve their goals. You will develop efficient data capture and transformation processes, along with the complex data models that form the core data foundation enabling reporting, data stitching, and other use cases.
As an individual contributor, you will work side by side with internal partners from across the organization to develop creative solutions for the highest-priority business needs. We're looking for self-starters with a strong sense of urgency who thrive in a fast-paced environment, enjoy the challenge of highly complex technical contexts working with hundreds of terabytes of data, and, above all else, are passionate about data and analytics.
Position Responsibilities:
Design, develop, and maintain scalable data pipelines using GCP services such as Dataflow, Cloud Composer (Apache Airflow), Dataproc, and Cloud Functions (a minimal illustrative sketch follows this list).
Extract, transform, and load (ETL) data from diverse sources (e.g., databases, APIs, cloud storage) into GCP-based data warehouses (BigQuery) and data lakes (Cloud Storage).
Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions leveraging GCP services.
Implement data quality checks and monitoring mechanisms using GCP tools like Cloud Data Fusion and Cloud Monitoring.
Analyze large datasets in BigQuery to identify trends, patterns, and insights.
Develop data models and visualizations using Looker or other tools to communicate findings effectively.
Collaborate with business analysts to understand their reporting requirements and provide data-driven solutions.
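To make the pipeline work concrete, here is a minimal sketch of the kind of job this role would own: an Airflow (Cloud Composer) DAG that loads daily clickstream exports from Cloud Storage into BigQuery and runs an in-warehouse aggregation. It is illustrative only; it assumes the apache-airflow-providers-google package, and every project, bucket, dataset, and table name below is a hypothetical placeholder, not an actual Home Depot resource.

# Illustrative sketch: daily clickstream load into BigQuery (all names are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="clickstream_daily_load",   # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Extract/load: stage raw clickstream files from a GCS bucket into BigQuery.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_clickstream",
        bucket="example-clickstream-bucket",                  # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],            # one folder per day
        destination_project_dataset_table="example-project.raw.clickstream_events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform: a simple in-warehouse aggregation for downstream reporting.
    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_daily_sessions",
        configuration={
            "query": {
                "query": """
                    SELECT session_id, COUNT(*) AS events, MIN(event_ts) AS session_start
                    FROM `example-project.raw.clickstream_events`
                    GROUP BY session_id
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "daily_sessions",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> aggregate

A production version of a pipeline like this would also carry the data quality checks and monitoring hooks called out above; this sketch shows only the extract-load-transform skeleton.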
Required Experience/Skills:
Bachelor's degree or higher in Computer Science or related discipline
3+ years of experience with SQL database queries and programming
2+ years of experience programming in Python
2+ years of experience working with Google Cloud Platform, including Cloud Composer
Experience handling unstructured data and building data pipelines, specifically from digital marketing or web clickstream platforms
Familiarity with data quality, cleansing and masking techniques
Strong analytical and problem-solving skills
Excellent communication, collaboration, and interpersonal skills, with a demonstrated ability to work in a team environment
A passion for data and a drive to learn new GCP technologies
Deep understanding of algorithms and performance optimization