Data Architect
2 days ago
Charlotte
Job Description

COMPANY OVERVIEW
Riverstone Logistics (RLX) is committed to being an honest, reliable, dependable freight forwarding partner. Every day we are focused on fulfilling our Purpose of using our God-given talents and opportunities to love our neighbors, serve our communities, and improve our industry. RLX provides final mile deliveries through dedicated and network models for various clients across the United States. We love working with new clients that are looking to enhance their customer experience through final mile deliveries. Headquartered in Charlotte, North Carolina, RLX currently has over 700 employees and is always looking for new employees who demonstrate leadership capabilities and exceptional communication skills to grow with us at our home office as well as client sites across the United States.

Position Summary
The Data Architect leads the strategy, design, and implementation of RLX’s enterprise data architecture with a focus on a practical, modern Canonical Data Model (CDM). This role defines the data standards, shared business vocabulary, and data contracts that enable consistent, governed, and interoperable data across our platforms and lines of business (e.g., TMS, OMS, WMS, ERP, EDI). The architect partners closely with Data Engineering, Cybersecurity, Application teams, and business stakeholders to deliver a secure, high-quality, and cost-aware data foundation that supports analytics, operations, and AI/ML use cases.

Essential Duties and Responsibilities
Key responsibilities include, but are not limited to:
• Define and maintain the enterprise Canonical Data Model strategy and roadmap: design a federated, domain-aligned CDM with a thin, shared backbone (core IDs, parties, products, locations, time) and data product–level contracts for interoperability.
• Model data for both batch and streaming (event-first) paradigms; establish RLX’s data platform methodology and define patterns (Dimensional/Kimball, Data Vault) as appropriate.
• Establish data contracts, as required, and mentor developers to align APIs (OpenAPI) and events (AsyncAPI/Avro/Protobuf) with the CDM; implement versioning, schema registries, and contract testing in CI/CD.
• Lead architecture for the cloud data lake/lakehouse (e.g., Snowflake or Databricks), including medallion layering, ingestion (batch/CDC/streaming), transformation (dbt), and serving (BI/semantic layer).
• Define and operationalize data quality SLAs/SLOs (freshness, completeness, uniqueness, validity, timeliness); deploy monitoring and alerting with automated data tests in pipelines.
• Implement end-to-end lineage and metadata management; standardize technical, business, and operational metadata capture and stewardship processes.
• Partner with Cybersecurity to design privacy-by-default data controls: classification, masking, tokenization, row/column-level security, key management, and retention policies; ensure compliance with contractual and regulatory obligations.
• Drive master and reference data strategies (party, location, product, asset/vehicle) and reconciliation back to ERP/source-of-truth systems.
• Provide strategic thought and architectural guidance for new data initiatives; review designs for scalability, reliability, observability, and cost efficiency (FinOps), and ensure alignment to standards.
• Mentor data engineers and analytics developers on modeling conventions, naming standards, and architectural patterns; promote code review, testing, and documentation practices.
• Enable AI/ML readiness: articulate feature/label data needs, data curation and lineage requirements, model observability inputs, and secure access patterns for ML and LLM/RAG scenarios.
• Champion platform reliability: backup/restore, disaster recovery, access management (RBAC/ABAC), secrets management, and environment promotion standards.

Minimum Qualifications (Knowledge, Skills, and Abilities)

Education
• Bachelor’s degree in Computer Science, Data/Information Systems, or a related field required.
• Master’s degree preferred, or equivalent experience.

Experience & Skills
• 8+ years in data architecture and/or data engineering with demonstrated ownership of enterprise data models and platform patterns.
• Deep experience with lakehouse or cloud data platforms (Snowflake and/or Databricks) and one or more of: Delta Lake, Spark, Snowpark, Apache Iceberg.
• Hands-on expertise with ELT/ETL and orchestration (e.g., dbt, Airflow/Dagster/Prefect) and CDC from operational systems (e.g., Fivetran, Debezium, HVR, native connectors).
• Event-driven architecture experience: Kafka/PubSub, schema registry, event modeling, event versioning, and AsyncAPI/OpenAPI alignment to the CDM.
• Data governance tooling and practices: lineage (e.g., OpenLineage), catalog (e.g., Collibra/Atlan/Alation), DQ frameworks (e.g., Great Expectations, Soda, Monte Carlo), and data stewardship workflows.
• Strong SQL and performance tuning; practical experience with SCD patterns (Types 1/2) and CDC strategies in analytics and operational data stores.
• Security-by-design with column/row-level security, data masking, tokenization, retention and disposition, and key/secrets management in cloud environments.
• Fluent with CI/CD, Git, testing, and infrastructure-as-code (e.g., Terraform); promotes contract tests and automated validations in data pipelines.
• Excellent communication and facilitation skills with the ability to translate between business processes and data/technical designs; proven mentorship of engineers and analysts.

Reporting Structure
This position reports directly to the Chief Information Officer (CIO).

Note
This job description in no way states or implies that these are the only duties to be performed by the employee(s) incumbent in this position. Employees will be required to follow any other job-related instructions and to perform any other job-related duties requested by any person authorized to give instructions or assignments. To perform this job successfully, the incumbent must possess the skills, aptitudes, and abilities to perform each duty proficiently. Some requirements may exclude individuals who pose a direct threat or significant risk to the health or safety of themselves or others. The requirements listed in this document are the minimum levels of knowledge, skills, or abilities. This document does not create an employment contract, implied or otherwise, other than an “at will” relationship.

Riverstone Logistics is proud to be an Equal Opportunity Employer and Drug-Free Workplace. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, sexual orientation, gender identity, age, status as a protected veteran, or status as a qualified individual with a disability, among other things. Riverstone Logistics also complies with the Immigration Reform & Control Act and E-Verify, so we ask that you bring the appropriate documents to confirm your authorization to work in the United States upon request.