Provided by the employer
Verified Pay: $70 per hour
Hours: Full-time
Location: Washington, DC

About this job

Short Description: 
DHCF is seeking a contractor to support its data modernization efforts.
 
Complete Description: 
 
1. Position Purpose 
The Medicaid Data Systems Analyst is the functional anchor for the agency’s data modernization. This role ensures the technical infrastructure remains trustworthy by leading the implementation of an Enterprise Data Catalog and managing the transition of business logic from TFS/SSIS to the Azure/Databricks environment. The analyst acts as the "Metadata Architect," ensuring that all raw MMIS schemas are translated into governed, auditable, and BI-ready assets. 
 
2. Expanded Key Responsibilities 
 
A. Data Catalog Implementation & Metadata Governance 
  • Business Glossary Ownership: Lead the implementation of the Enterprise Data Catalog (e.g., Microsoft Purview or Unity Catalog) from a functional perspective. Define and maintain the business glossary for all Medicaid data. 
  • Data Lineage Mapping: Document and verify end-to-end data lineage, from the raw MMIS source table through the transformation layer in Databricks to the final Power BI report. 
  • Discovery & Search: Organize the catalog so that policy analysts and actuaries can easily discover Medicaid datasets, understand their sensitivity (PHI/PII), and view their update frequency. 
B. Infrastructure & FinOps Support 
  • Data Reliability Engineering: Support the infrastructure’s health by monitoring Data Quality (DQ) dashboards. Alert the Senior Data Engineer if upstream MMIS schema changes cause data drift or pipeline failures. 
  • FinOps Collaboration: Partner with the Senior Data Engineer to analyze cloud consumption. Identify unused or redundant datasets that can be purged or archived to optimize Azure storage and compute costs. 
  • Infrastructure Validation: Verify that the "Infrastructure as Code" deployments (managed by the Engineer) correctly apply the Row-Level Security (RLS) and masking rules defined by the Lead Architect. 
C. Schema Transformation & Migration Support 
  • Legacy-to-Cloud Logic Audit: Translate legacy SSIS logic stored in TFS into clear business requirements for the Engineer to rebuild in Azure Data Factory and Databricks. 
  • Target Mapping: Take raw table schemas from the current MMIS and define the "Target" format for the Data Warehouse, ensuring it meets the specific needs of modern BI tools. 
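For context on the parity testing this migration support involves, the sketch below compares row counts and checksums between a legacy table and its migrated target. It is purely illustrative: sqlite3 stands in for the actual SQL Server and Azure Synapse endpoints, and the table and column names (claims_legacy, claims_target, claim_id, paid_amt) are hypothetical, not taken from any MMIS schema.

```python
import sqlite3
import hashlib

def table_fingerprint(conn, table):
    """Return (row_count, checksum) for a table, with a deterministic ordering."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

# In-memory stand-in for the legacy and target databases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims_legacy (claim_id INTEGER, paid_amt REAL);
    CREATE TABLE claims_target (claim_id INTEGER, paid_amt REAL);
    INSERT INTO claims_legacy VALUES (1, 120.50), (2, 75.00);
    INSERT INTO claims_target VALUES (1, 120.50), (2, 75.00);
""")

legacy = table_fingerprint(conn, "claims_legacy")
target = table_fingerprint(conn, "claims_target")
# Matching counts and checksums indicate the migrated table is in parity.
print("parity:", legacy == target)
```

In practice this kind of audit would run in T-SQL directly against both platforms (e.g., count and aggregate-hash comparisons per table), but the row-count-plus-checksum idea is the same.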
     
Responsibilities: 
1. Leads the adoption or implementation of an advanced technology or platform. 
2. Expert on the functionality or usage of a particular system, platform, or technology product. 
3. Serves as a consultant to clients, guiding the efficient use or adoption of a particular IT product or platform. 
4. Creates implementation, testing, and/or integration plans. 
5. Demonstrates expertise in a particular IT platform or service, allowing for maximum IT investment. 
 
Minimum Education/ Certification Requirements: 
Bachelor’s degree in Information Technology or related field or equivalent experience 
Training or certification in a particular product or IT platform/service, as required 
 

 
Candidate Skills Matrix: 
 
The following sections are to be completed by the candidate: 
 
Skills | Required/Desired | No. of Years | How many years of experience does the candidate have?
Expert T-SQL skills for deep-dive data auditing and parity testing between legacy SQL Server and Azure Synapse | Required | 5 | 
Experience with MMIS data, specifically claims (837), eligibility (270/271), and T-MSIS federal reporting | Required | 5 | 
Familiarity with TFS (for reviewing legacy logic) and Azure DevOps (for modern project tracking) | Required | 5 | 
Industry certification/training in a specific IT product, platform, or service | Required | 5 | 
5 yrs. leading advanced technology or service projects |  | 5 | 
Hands-on experience with a specific product or IT platform | Required | 5 | 
5 yrs. full system engineering lifecycle | Required | 5 | 
Bachelor’s degree | Required |  | 
Certification in Data Governance (e.g., CDMP) or Cloud Fundamentals (e.g., Azure AZ-900) | Highly desired |  | 
Experience with Power BI or Tableau to validate that transformed data meets visualization requirements | Highly desired |  | 
Hands-on experience with Microsoft Purview, Collibra, Informatica, and Alation | Highly desired |  | 
 
 
  
Background check:   
 
  • An extensive criminal history background check will be required. We cannot submit candidates with recent histories (within the past seven years) of extensive driving, drug, robbery, or other illegal activity. Any criminal activity on the background check will eliminate the candidate from consideration. If selected, please make certain that all candidates are informed that they will have to complete this criminal background check prior to starting. NATIONAL background checks are required; Federal background checks are NOT compliant under this contract. A national background check is a national criminal background check that pulls criminal records from State and County Courts in almost every US State. 


Posting ID: 1226528350 Posted: 2026-02-04 Job Title: Data Analyst