Leeds
Greetings from Mastek! We're seeking an experienced Data Architect for the Healthcare domain.

About Mastek
Mastek is a global digital engineering and cloud transformation specialist, trusted by public sector organisations to deliver large-scale, mission-critical digital services. With a strong, established presence across the UK public sector, including healthcare, Mastek combines proven delivery capability with collaborative, multidisciplinary ways of working to deliver outcomes-focused programmes that improve population health and operational performance.

This role supports Healthcare data programmes delivering enterprise data platforms and analytics capabilities focused on clinical, operational, and population-level insight. Delivery is undertaken through agile teams operating within a highly regulated Healthcare environment, aligned to Healthcare data governance, security, and assurance standards.

Engagement Overview
This contract Data Architect role is focused on stabilising, documenting, and enabling BAU operation of critical data engineering pipelines within a complex Healthcare data estate. The contractor will operate in a delivery-led, time-bound engagement, supporting Phase 1 and Phase 2 pipeline operations by producing clear architectural artefacts, data flow analysis, and operationally usable documentation that reduce risk, enable service continuity, and support handover into managed BAU support.

The role is hands-on and outcomes-driven, requiring the contractor to be productive from day one, work with partially documented systems, and extract critical knowledge from data engineering SMEs.

Objectives for the Contract
The Data Architect contractor will be accountable for:
• Rapidly understanding how critical data pipelines work today
• Making implicit and tribal knowledge explicit
• Producing architecture and data flow artefacts that are:
  • Operable
  • Supportable
  • Transferable into BAU

Key Deliverables & Responsibilities

1. Critical Pipeline Architecture Support
• Analyse and document existing critical data engineering pipelines.
• Contribute directly to:
  • The Critical Pipeline Inventory
  • Service criticality and operational risk assessments
• Identify:
  • Architectural dependencies
  • Single points of failure
  • Knowledge concentration risks
• Ensure architectural documentation aligns with operational continuity needs, not just design intent.

2. Data Flow & Source System Analysis
• Produce end-to-end data flow mappings for critical pipelines, covering:
  • Source systems and data collection mechanisms
  • Processing stages and centralised transformations
  • Intermediate storage and handoffs
  • Downstream consumers and dependencies
• Document:
  • Submission triggers (scheduled, event-driven, manual)
  • Data volumes, cadence, and SLA sensitivities
  • Source system availability and dependency risks
• Explain and contextualise the current "heavy central processing" model, identifying:
  • Statutory or regulatory drivers
  • Legacy or convenience-driven complexity

3. Business Logic & Architectural Knowledge Capture
• Capture and structure business rules, transformations, validations, and aggregations per critical pipeline.
• Clearly distinguish:
  • Regulatory / statutory logic
  • Operationally required logic
  • Legacy technical debt
• Populate and maintain a Business Logic Repository suitable for:
  • Incident resolution
  • Knowledge transfer
  • Ongoing BAU support

4. BAU Service Model Enablement
• Ensure architectural outputs support the defined BAU service model, including:
  • Operational playbooks
  • Incident response procedures
  • Escalation and dependency clarity
• Provide architectural input into:
  • SLA / OLA alignment
  • Change management and release controls
  • Disaster recovery (DR) robustness assessments
• Validate architectural readiness for handover into managed operations.

5. Knowledge Transfer & Handover Support
• Actively support knowledge transfer activities with:
  • Healthcare data pipeline SMEs
  • Operational and support teams
• Ensure all architectural artefacts are:
  • Clear
  • Complete
  • Validated with SMEs
• Support shadow and supervised operation periods as required.
• Enable formal handover and sign-off for BAU readiness.

6. Pipeline Improvement Identification (Architectural Input)
• Identify pipeline-level improvement opportunities with an operations-first mindset.
• Assess recommendations against:
  • Immediate operational benefit
  • Risk reduction
  • Alignment with platform modernisation boundaries
• Support classification of improvements as:
  • Tactical ("Do Now")
  • Strategic ("Defer to Platform Modernisation")
• Ensure recommendations create no rework and do not conflict with future platform changes.

Required Contracting Profile

Essential Experience
• Proven experience as a Data Architect on complex, legacy data estates.
• Strong background working with:
  • Data engineering pipelines
  • Batch / scheduled processing
  • Centralised data processing models
• Demonstrated ability to:
  • Rapidly understand undocumented or poorly documented systems
  • Produce operationally usable architecture and data flow artefacts
• Experience supporting BAU operations, service transition, or operational handover.
• Comfortable working under time-boxed delivery constraints with minimal onboarding.

Desirable Experience
• Experience within Healthcare, public sector, or other highly regulated data environments.
• Familiarity with:
  • SLA/OLA-driven service models
  • Incident, problem, and change management processes
• Previous involvement in knowledge transfer or service transition engagements.
• Ability to work effectively alongside service management and operational teams.
Working Approach (Contractor Expectations)
• Delivery-focused and self-directed
• Low dependency on management direction
• Pragmatic rather than theoretical
• Comfortable engaging with technical SMEs and senior stakeholders
• Produces documentation that is usable in live operations, not just design reviews

Summary
This contract role is suited to a hands-on Data Architect who can rapidly embed into a complex environment and deliver clear, operationally focused data architecture that stabilises critical pipelines, enables BAU support, and reduces delivery and operational risk, without overreaching into platform modernisation.

Why Join Us?
• We're a purpose-driven organisation committed to creating positive impact and championing inclusion.
• You'll join a culture that encourages innovation, collaboration, and continuous growth.
• We celebrate difference and believe diverse perspectives lead to stronger solutions.
• Even if you don't meet every requirement, we encourage you to apply: great talent comes from a variety of backgrounds.
• Your lived experience, unique strengths, and personal story matter; they help us design better outcomes for the communities we serve.
• We celebrate diversity of experience, knowledge, backgrounds, and perspectives, and believe these differences enable us to create meaningful impact.

We are proud to be an equal opportunity employer and are committed to fairness and inclusion for all, regardless of sex, race, religion or belief, ethnic or national origin, disability, age, citizenship, marital or partnership status, sexual orientation, gender identity, pregnancy or related conditions, or any other protected characteristic. If you require any reasonable adjustments or additional support during the recruitment process, please let us know.

Ready to Make an Impact?
📩 Apply now and help us build a more inclusive, digitally empowered future.