Description
NOTE
Assignment Type: This position is currently listed as "Hybrid"; consultants will be required to work onsite at the work location 3 days per week and from home 2 days per week. The details of this arrangement will be at the Hiring Manager's discretion.
Extension/Amendment Attestation: The Statement(s) of Work (SOW) shall expire on March 31, 2026. HSC may exercise its option(s) to extend the SOW beyond March 31, 2026 using unused days/funds left on the contract. Such extension(s) will be allowable only if the Master Service Agreement is extended beyond April 5, 2026, and will be upon the same terms, conditions and covenants contained in the SOW.
====================================================================
Responsibilities:
· Design, develop, test and implement:
o data pipelines using Python and AWS services (Glue, StepFunctions, Lambda)
o complex data transformation procedures
o data models for the efficient storing of data in relational databases as well as in unstructured data repositories
o reusable classes and modules, so as to enhance maintainability and the ability to deliver solutions quickly.
· Review the existing code base and assist more junior team members, ensuring a consistent level of quality.
· Assess new business requirements and propose adequate technical solutions (using on-premises and cloud resources)
· Monitor the automatic execution of the various data loads and proactively address issues such as data processing errors or performance degradation.
Skills
Experience and Skill Set Requirements
General Development Experience (Python)
· Server-side scripting
· Working with XLSX, CSV, JSON files, relational databases, structured and unstructured data
· Data engineering and process automation
· Linux experience
35 points
AWS Cloud Experience
· Experience using AWS Services such as Glue, StepFunctions, Lambda, S3 or equivalent
· Experience with cloud data warehousing and analytics (AWS Redshift or equivalent)
35 points
Data Warehouse
· Data modeling (relational & dimensional), advanced SQL
· Extract/Transform/Load data
· Data reporting/visualization
20 points
Software Development Life Cycle
· Full SDLC from requirements gathering, design, implementation, testing to deployment and production support
· Familiarity with project management (agile/scrum and waterfall)
· Change and Incident management
5 points
General skills
· Communication, presentation and negotiation skills
· Consulting, problem-solving and decision-making skills
· Public sector experience
5 points
Hybrid - Candidate must work 3 days onsite and 2 days remote
Maximum Number of Submissions - two (2)
MUST HAVES:
General Development Experience (Python) -
Server-side scripting
· Working with XLSX, CSV, JSON files, relational databases, structured and unstructured data
· Data engineering and process automation
AWS Cloud Experience -
Experience with cloud data warehousing and analytics (AWS Redshift or equivalent)
Experience using AWS Services such as Glue, StepFunctions, Lambda, S3 or equivalent
Data Warehouse
· Extract/Transform/Load data (ideally Informatica experience on Cloud)