Data Architect (Newark)
Role Purpose

Define enterprise data architecture standards, create data exchange and ingestion frameworks, establish data quality and governance patterns, and develop the data domain models and templates that ensure consistent, high-quality data across the organization. This role creates enterprise data patterns and frameworks, not operational data solutions.

What Makes This Role Unique

• Data framework architect: Design the data ingestion framework, with schema validation, quality checks, and exception handling, used enterprise-wide
• Data quality champion: Embed quality into frameworks at ingestion, not as an afterthought
• Exchange pattern owner: Create the patterns for internal and third-party data exchange

Enterprise Data Standards & Patterns (40%)

• Help define enterprise data modeling standards (conceptual, logical, physical, dimensional models)
• Create data domain model standards with quality framework integration
• Define data architecture patterns for common scenarios (OLTP, OLAP, streaming, hybrid)
• Document data persistence patterns (SQL vs. NoSQL selection, data lake vs. warehouse, caching)
• Create master data management patterns and reference data standards

Design the enterprise data ingestion framework:

• Central pipeline architecture for all data ingestion (batch, streaming, real-time)
• Schema validation framework (schemas enforced at ingestion, schema registry)
• Data quality validation framework integrated into the pipeline (completeness, accuracy, consistency, timeliness)
• Exception handling framework (quarantine, alerts, remediation workflows)
• Internal system-to-system data sharing
• Third-party data exchange (inbound/outbound): SFTP/API patterns, data format standards, partner onboarding templates, data contract templates
• Define data integration patterns (ETL, ELT, streaming, CDC, data virtualization)
• Create the enterprise data quality framework (quality dimensions, metrics, rules by domain, monitoring, remediation)
• Define data governance patterns (stewardship model, cataloging standards, metadata management, lineage and impact analysis)
• Establish data privacy and protection patterns (PII handling, encryption, masking, tokenization)
• Develop the enterprise data architecture modernization roadmap
• Train solution architects on data patterns and frameworks
• Review solution architectures for data pattern compliance
• Coordinate with the Data Platform Enablement team on platform capabilities vs. patterns

Experience & Certifications

• 12+ years in data architecture, data engineering, or enterprise architecture
• 5+ years creating enterprise data standards, frameworks, and patterns
• Proven experience designing data ingestion frameworks with quality and governance built in
• Experience with data exchange patterns for internal and external parties
• TOGAF certification
• Cloud data platform certification (AWS Data Analytics, Azure Data Engineer, Google Cloud Data Engineer)
• DAMA CDMP (Certified Data Management Professional)
• Data governance certification

Required Technical Skills

Skill Category           | Required Skills                                         | Proficiency Level
Data Standards           | Data modeling standards (conceptual, logical, physical) | Expert
Data Standards           | Canonical data model design                             | Expert
Data Standards           | Data domain modeling                                    | Expert
Data Ingestion Framework | Central pipeline architecture                           | Expert
Data Ingestion Framework | Schema validation framework                             | Expert
Data Ingestion Framework | Data quality framework integration                      | Expert
Data Ingestion Framework | Exception handling framework                            | Advanced
Data Exchange Patterns   | Internal data exchange patterns                         | Expert
Data Exchange Patterns   | Third-party data exchange patterns                      | Expert
Data Exchange Patterns   | Data contract templates                                 | Advanced
Data Quality Framework   | Quality dimensions and metrics                          | Expert
Data Quality Framework   | Quality validation rules                                | Advanced
Data Governance          | Data governance frameworks                              | Advanced
Data Governance          | Data cataloging standards                               | Advanced
Data Governance          | Data lineage patterns                                   | Advanced
Platform Knowledge       | Cloud data platforms (AWS, Azure, GCP)                  | Advanced
Platform Knowledge       | Data lake and warehouse patterns                        | Expert
Platform Knowledge       | Data pipeline patterns                                  | Advanced

Preferred Qualifications

• Experience in healthcare or financial services with complex data exchange requirements
• Track record implementing data quality frameworks at enterprise scale
• Experience with third-party data exchange and partner onboarding
• Published thought leadership on data architecture or data quality

Success Metrics

• Data quality validation pass rate >95% through the ingestion framework
• 10+ data sources onboarded to the central ingestion framework
• Data exchange patterns adopted by 80%+ of new integrations
• Canonical model coverage for 15+ core enterprise entities

Key Deliverables

• Data ingestion framework with schema validation, quality checks, and exception handling
• Data exchange patterns for internal and third-party data
• Data quality framework integrated with domain models
• Data governance patterns and templates
• Data integration pattern library
• Data architecture reference architectures

Key Partnerships

• Data Platform Enablement team (they operate the platform; you create the patterns)
• Solution architects (apply the data patterns)
• Data stewards and the data governance council
• Integration Architect (data exchange patterns)
• Architecture Review Board (bi-weekly)
• Data governance forums

Our Enterprise Architecture team operates on principles of collaboration, excellence, and innovation:

• Pattern-first mindset: We create reusable blueprints that enable consistency and quality
• Partnership model: We work alongside operational teams (App Dev, Data Platform, Infrastructure, InfoSec) as strategic partners
• Continuous improvement: Patterns evolve based on feedback from implementation
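For candidates unfamiliar with the ingestion pattern this role owns, here is a minimal, hypothetical sketch of the idea: a schema is enforced at ingestion, quality rules run next, and failing records are quarantined with reasons rather than passed downstream. All names (the example schema, the rule functions) are illustrative and not part of this role's actual framework.

```python
# Hypothetical sketch of schema validation + quality checks + quarantine
# at ingestion. The schema and rules below are illustrative examples only.

SCHEMA = {"customer_id": int, "email": str, "age": int}  # assumed example schema


def validate_schema(record):
    """Return a list of schema violations (missing fields, wrong types)."""
    errors = []
    for name, expected_type in SCHEMA.items():
        if name not in record:
            errors.append(f"missing field: {name}")
        elif not isinstance(record[name], expected_type):
            errors.append(f"bad type for field: {name}")
    return errors


def check_quality(record):
    """Example domain quality rules: plausibility of email and age."""
    errors = []
    if "@" not in record.get("email", ""):
        errors.append("email not plausible")
    if not 0 <= record.get("age", -1) <= 130:
        errors.append("age out of range")
    return errors


def ingest(records):
    """Route each record to accepted output or quarantine with reasons."""
    accepted, quarantine = [], []
    for rec in records:
        errors = validate_schema(rec) or check_quality(rec)
        if errors:
            quarantine.append({"record": rec, "reasons": errors})
        else:
            accepted.append(rec)
    return accepted, quarantine
```

Usage: `ingest([{"customer_id": 1, "email": "a@b.com", "age": 30}, {"customer_id": "x"}])` accepts the first record and quarantines the second with its schema violations, so remediation workflows can act on the quarantined batch.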