Designing and developing data pipelines using cloud technologies, based on business requirements and technical design.
Collaborating with stakeholders, business analysts, and data architects to translate business requirements into technical solutions.
Developing big data and analytics solutions, leveraging new or existing technology to advance clients' lines of business.
Performing exploratory data analysis by querying and processing data stores and databases, and generating reports. Designing, upgrading, and implementing new data workflows.
Writing and maintaining technical documentation.
Executing updates, patches, and other activities required to maintain and enhance on-premises or cloud-based environments.
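As a sketch of the pipeline work described above, the extract–transform–load steps can be illustrated in plain Python. The table name, fields, and sample records below are illustrative only; a production pipeline would read from and write to services such as Azure SQL Database or ADLS rather than an in-memory SQLite database.

```python
import sqlite3

# Hypothetical source records; in a real pipeline these would be
# extracted from a data store such as Azure SQL Database or ADLS.
raw_orders = [
    {"order_id": 1, "amount": "120.50", "region": "east"},
    {"order_id": 2, "amount": "80.00", "region": "WEST"},
    {"order_id": 3, "amount": None, "region": "east"},  # incomplete record
]

def transform(records):
    """Clean and normalize records, dropping those that fail validation."""
    cleaned = []
    for r in records:
        if r["amount"] is None:
            continue  # curation step: discard incomplete rows
        cleaned.append((r["order_id"], float(r["amount"]), r["region"].lower()))
    return cleaned

def load(rows, conn):
    """Load curated rows into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(raw_orders), conn)
report = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(report)  # → [('east', 120.5), ('west', 80.0)]
```

The same extract–transform–load shape applies whether the transform runs in Python, PySpark, or a SQL engine; only the execution environment changes.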
Essential Skills:
Experience working as a Data Engineer with a focus on big data processing and/or relational databases, with a good understanding of SQL.
Experience working with structured and semi-structured datasets.
Experience working with the Microsoft Azure data platform, specifically Azure SQL Database, Azure Data Factory, Azure Data Lake Storage (ADLS), Azure Databricks, and Azure DevOps.
Experience creating data pipelines and developing complex, optimized queries.
Experience with workflow management tools such as Airflow, cron, and CA Workload Automation.
Experience implementing CI/CD pipelines with GitHub and Jenkins/JGP.
Experience writing complex SQL and NoSQL jobs to analyze data in traditional DBMS environments (MS SQL Server, Oracle).
Experience integrating with back-end/legacy environments.
Experience integrating business and technology teams.
Experience in any of the following programming/scripting languages: SQL, Python, shell, Scala.
Excellent organizational and time-management skills and a strong business presence, with the ability to multitask and handle competing priorities.
Experience with PySpark and MongoDB, with solid experience in data ingestion and curation.
Experience with Jenkins, GitHub, and Azure.
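The semi-structured data handling listed above often comes down to flattening nested records into a fixed tabular schema before loading. A minimal sketch in plain Python, assuming a hypothetical JSON payload (all field names are illustrative):

```python
import json

# A semi-structured payload, e.g. an API response or a MongoDB export.
payload = '''
[
  {"id": 1, "customer": {"name": "Acme", "tier": "gold"}, "tags": ["priority"]},
  {"id": 2, "customer": {"name": "Globex"}, "tags": []}
]
'''

def flatten(record):
    """Project a nested record onto a flat, fixed schema.

    Missing optional fields (like customer.tier) default to None so that
    every output row has the same columns.
    """
    return {
        "id": record["id"],
        "customer_name": record["customer"]["name"],
        "customer_tier": record["customer"].get("tier"),
        "tag_count": len(record.get("tags", [])),
    }

rows = [flatten(r) for r in json.loads(payload)]
print(rows)
```

In practice this projection step would run inside a pipeline stage (for example, a PySpark job or a Data Factory mapping), but the schema-fixing logic is the same.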
Desirable Skills:
Proficiency in writing software in one or more languages, such as Java, Node.js, Python, or .NET.
Exposure to microservices.
Experience with scripting languages such as Groovy, Python, or shell.
Grounding in DevOps principles, test-driven development, continuous integration, and continuous delivery.
Experience with Microsoft Azure and AKS, or similar cloud technologies.
Experience automating infrastructure provisioning with tools such as Terraform, Chef, and Helm charts.
Experience with Jenkins, Artifactory, and SonarQube.
Familiarity with package management tools such as npm and pip.
Familiarity with build processes and build tools such as Maven and Gradle.
Familiarity with standard IT security practices.
Familiarity with cloud database technologies such as Azure SQL and Cosmos DB/MongoDB.
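The test-driven development grounding mentioned above can be illustrated with a minimal example using Python's built-in unittest. The normalize_region helper is hypothetical; in TDD the tests below would be written first and drive its implementation.

```python
import unittest

def normalize_region(value):
    """Hypothetical utility under test: normalize a free-form region string."""
    if not isinstance(value, str) or not value.strip():
        raise ValueError("region must be a non-empty string")
    return value.strip().lower()

class NormalizeRegionTest(unittest.TestCase):
    # In TDD, these tests exist before the implementation and define
    # the behavior the code must satisfy.
    def test_lowercases_and_trims(self):
        self.assertEqual(normalize_region("  WEST "), "west")

    def test_rejects_empty(self):
        with self.assertRaises(ValueError):
            normalize_region("   ")

# Run the suite programmatically so the example is self-contained;
# in a CI/CD pipeline this would run via `python -m unittest` instead.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeRegionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # → True
```

Running such suites on every commit is what ties the test-driven development, continuous integration, and continuous delivery items above together.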