Description
A Note on Assignment Type:
This position is currently listed as "Onsite"; however, the assignment under this request will provisionally be "Hybrid", working 7.25 hours per day between 8:00 AM and 5:00 PM (excluding breaks), Monday to Friday inclusive, unless otherwise identified. These conditions are subject to change as the OPS reviews its current situation. For the duration of the assignment, you will be subject to the Hiring Manager's requirements for the project to which you are assigned.
Scope
- The Justice Technology Services - Digital Design Branch requires a technical specialist with experience in the analysis, design, development, quality assurance, and production operation of highly sensitive, high-performance, large-scale enterprise databases (e.g. SQL Server); database performance modelling; ongoing administration, monitoring, and tuning; and database transformation (ETL), cleansing, and migration (e.g. Oracle to SQL Server), as part of building, enhancing, and sustaining digital products for the Criminal Justice Digital Design (CJDD) modernization program. The successful candidate will work on various applications and integration endpoints, including the Criminal eIntake, Digital Information Repository (DIR), and ICON Integration workstreams.
Assignment Deliverables
- As a member of the development team, you will be responsible for migrating highly sensitive, large-volume data from the existing Oracle database to the new Azure SQL Server, implementing DataOps / pipelines, and performing ongoing database administration.
- A high-level list of deliverables follows:
- Data Analysis: analyze the existing data in the applications; understand its structure, quality, and relationships; and help design an appropriate migration strategy;
- Data Mapping and Transformation: map the data elements from the application to the corresponding entities and fields in Azure SQL Database. Handle necessary data transformations, ensuring compatibility and consistency between the legacy data and the target system;
- Data Extraction: help extract the required data from the application, develop and implement extraction processes to retrieve data from various sources, such as databases, files, APIs, or other relevant systems;
- Data Cleansing and Validation: cleanse and validate the extracted data to ensure its accuracy, completeness, and consistency. Help with identifying and resolving data quality issues, performing deduplication, and applying business rules to ensure the integrity of the migrated data;
- Data Migration Strategy and Execution: review the present migration strategy that outlines the overall approach, sequence, and timeline for migrating the data from Oracle database to Azure SQL Database using a delta-load approach; execute the migration plan efficiently, managing data transfers and ensuring minimal disruption to ongoing operations;
- Data Testing and Quality Assurance: conduct thorough testing to verify the accuracy and integrity of the migrated data; define test cases, perform data reconciliation, and address any issues or discrepancies that arise during the testing phase; develop KPIs to report on the progress, completeness and quality of the data migration effort;
- Documentation: document the entire data migration process, including data mapping rules, transformation logic, migration scripts, and any specific configurations;
- Ongoing Support: provide post-migration support; analyze and address data-related issues or questions; help optimize data management processes in the new environment; and perform ongoing database administration;
- Implement ongoing database administration procedures, processes, and a guidebook;
- Implement relevant DataOps / pipelines;
- Other duties as assigned.
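The delta-load migration and reconciliation deliverables above can be sketched as follows. This is a minimal, illustrative example only: it uses in-memory SQLite to stand in for the Oracle source and Azure SQL target, and the table name (`cases`) and columns are hypothetical, not taken from the actual project. In practice these steps would be implemented with Azure Data Factory pipelines and T-SQL.

```python
# Minimal sketch: watermark-based delta load plus a row-count
# reconciliation check. SQLite stands in for the real source/target;
# all schema names are illustrative.
import sqlite3

src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

src.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, payload TEXT, updated_at TEXT)")
tgt.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, payload TEXT, updated_at TEXT)")

# Seed the source with rows modified at different times.
src.executemany(
    "INSERT INTO cases VALUES (?, ?, ?)",
    [(1, "a", "2024-01-01"), (2, "b", "2024-02-01"), (3, "c", "2024-03-01")],
)

def delta_load(src, tgt, watermark):
    """Copy only rows changed since the last successful load (the
    watermark), then return the new watermark."""
    rows = src.execute(
        "SELECT id, payload, updated_at FROM cases WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    # Upsert so re-running the same window is idempotent.
    tgt.executemany(
        "INSERT INTO cases VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET payload = excluded.payload, "
        "updated_at = excluded.updated_at",
        rows,
    )
    tgt.commit()
    return max((r[2] for r in rows), default=watermark)

def reconcile(src, tgt):
    """Post-load check: compare row counts (a real migration would also
    compare per-column or per-partition checksums)."""
    n_src = src.execute("SELECT COUNT(*) FROM cases").fetchone()[0]
    n_tgt = tgt.execute("SELECT COUNT(*) FROM cases").fetchone()[0]
    return n_src == n_tgt

watermark = delta_load(src, tgt, "2024-01-15")  # partial load: rows 2 and 3
assert not reconcile(src, tgt)                  # row 1 not yet migrated
watermark = delta_load(src, tgt, "")            # catch-up load of all rows
assert reconcile(src, tgt)
```

The watermark column makes repeated loads incremental, and the upsert keeps a re-run of the same window from duplicating rows, which matches the "minimal disruption to ongoing operations" goal of a delta-load strategy.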
Skills
A Note on the VOR Master Service Agreement:
The VOR Master Service Agreement, which expires on April 5, 2026, leaves some contracts with funding unassigned for fiscal 2026-27. If the current statement of work expires on March 31, 2026, the remaining funds can be used to exercise an option to extend the SOW beyond March 31, 2026, based on business case approvals. Such extensions will be allowable only if the Master Service Agreement is extended beyond April 5, 2026, and will be upon the same terms, conditions, and covenants contained in the SOW.
The start date is subject to change based on security clearances and contract signing timelines.
Experience and Skillset Requirements
Mandatory Requirements
- 5+ years of experience working with RDBMS, Azure SQL Server, database administration, security management for highly sensitive databases, database performance management, query performance analysis and tuning, and database quality practices, with a good understanding of Azure storage concepts and technologies.
- 5+ years of working experience in an ETL role; strong understanding of ETL principles, including data extraction, transformation, and loading processes; knowledge of common ETL design patterns. Understanding of data pipeline architectures, Azure workflow orchestration tools, and concepts related to data ingestion, transformation, and movement.
- Proficiency in Azure Data Factory, SSMA, Database Migration Services, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
- Proficiency in scripting languages, relational data models, data manipulation languages (T-SQL, PL/SQL), data definition languages, physical database design, and experience with Azure-specific scripting using PowerShell or Azure CLI.
- Experience with continuous integration/continuous deployment (CI/CD) processes around DevOps, data workflows, Synapse workspaces.
Nice to Have Requirements
- Azure cloud certifications (e.g. Azure Fundamentals, Azure Data Engineer Associate, Azure Database Administrator Associate)
- Proficiency in Oracle database administration
Desired Skills and Experience
- Experience in integrating various data sources and systems, both on-premises and in the cloud, using Azure ETL services or other ETL tools (e.g. Azure ADF, SSMA).
- Expertise in data transformation techniques, such as data cleansing, aggregation, enrichment, and normalization using Azure cloud technologies.
- Understanding of data quality management practices, including data profiling, data validation, and error handling within ETL processes.
- Understanding of data governance principles, data privacy regulations, and experience working with high-sensitivity data, and knowledge of best practices for data security and compliance in Azure.
- Ability to monitor and troubleshoot ETL processes, optimize query performance, and implement efficient data processing techniques in Azure.
- Familiarity with version control systems (e.g., Azure Repos) and collaboration tools (e.g., Azure DevOps) for managing code, tracking changes, and collaborating with team members.
- Experience with SQL Server Management Studio, Azure data management tools, XRM toolbox, data modeling tools (e.g. PowerDesigner, ERWIN).
Resumes Evaluation/Criteria:
Criteria 1: Data Migration, ETL - 40 Points
- Demonstrated experience with ETL development, data pipelines, workflow orchestration and data ingestion, transformation, and movement
- Demonstrated experience in integrating various data sources and systems, both on-premises and in the cloud, using Azure ETL services or other ETL tools (e.g. ADF, SSMA)
- Demonstrated experience working with Azure Data Factory, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
- Demonstrated experience with data manipulation languages (T-SQL, PL/SQL), data definition languages, query performance analysis & tuning
- Demonstrated experience with SQL Server, Oracle, Azure SQL Databases
- Demonstrated experience with data modeling tools (e.g. PowerDesigner, ERWIN)
- Demonstrated experience in scripting languages like Python and with Azure-specific scripting using PowerShell or Azure CLI.
- Experience with software development lifecycle
- Experience with data modeling, physical database design, data flow diagrams
Criteria 2: Database Tuning and Administration - 30 Points
- Demonstrated experience fine-tuning Azure SQL databases for security, cost, performance, availability, and reliability.
- Demonstrated experience setting up monitors and alerts on critical database metrics to ensure high availability.
- Demonstrated experience automating operations.
- Experience with supporting a large database in a production environment.
Criteria 3: Azure Platform and Security - 20 Points
- Experience with Azure Data Factory (ADF) and Database Migration services and tools.
- Demonstrated experience with Azure data management tools and DevOps
- Experience in Azure resource configuration and administration such as Azure SQL Database, Blob Storage, Key Vault, Application Insight resources, resource groups and subscriptions.
- Familiar with Azure cloud platform
- Familiar with database security concepts and practices
- (Nice to have) Azure cloud certifications
Criteria 4: DevOps and CI/CD - 10 Points
- Demonstrated experience with continuous integration/continuous deployment (CI/CD) tools and processes around DevOps, data workflows, Synapse workspaces.
Knowledge Transfer
What needs to be transferred
- All technical artifacts related to the assignment
- Project specific presentations, reports, status decks
To whom
- Project Manager/Team Members
When
- 1:1 meetings / team meetings / documentation on the SharePoint site, throughout the project life cycle