Sr. Data Engineer - MDM
3 days ago
Iselin
Job Description:
No C2C; W2 only. Local candidates only; an in-person interview will be required. Recent 3-5 years of banking domain experience required.

Job Summary:
• We are seeking a highly skilled and experienced Senior Data Engineer specializing in Master Data Management (MDM) to join our data team.
• The ideal candidate will have a strong background in designing, implementing, and managing end-to-end MDM solutions, preferably within the financial sector.

Responsibilities:
• Lead the design, development, and deployment of comprehensive MDM solutions across the organization, with an emphasis on financial data domains.
• Demonstrate extensive experience with multiple MDM implementations, including platform selection, comparison, and optimization.
• Architect and present end-to-end MDM architectures, ensuring scalability, data quality, and governance standards are met.
• Evaluate various MDM platforms (e.g., Informatica, Reltio, Talend, IBM MDM) and provide objective recommendations aligned with business requirements.
• Collaborate with business stakeholders to understand reference data sources and develop strategies for managing reference and master data effectively.
• Implement data integration pipelines leveraging modern data engineering tools and practices.
• Develop, automate, and maintain data workflows using Python, Airflow, or Astronomer.
• Build and optimize data processing solutions using Kafka, Databricks, Snowflake, Azure Data Factory (ADF), and related technologies.
• Design microservices, especially utilizing GraphQL, to enable flexible and scalable data services.
• Ensure compliance with data governance, data privacy, and security standards.

Qualifications:
• 12+ years of experience in data engineering, with a proven track record of MDM implementations, preferably in the financial services industry.
• Extensive hands-on experience designing and deploying MDM solutions and comparing MDM platform options.
• Strong functional knowledge of reference data sources and domain-specific data standards.
• Expertise in Python, PySpark, Kafka, microservices architecture (particularly GraphQL), Databricks, Snowflake, Azure Data Factory, SQL, and orchestration tools such as Airflow or Astronomer.
• Familiarity with CI/CD practices, tools, and automation pipelines.
• Ability to work collaboratively across teams to deliver complex data solutions.
• Familiarity with financial data models and regulatory requirements.
• Experience with Azure cloud platforms.
• Knowledge of data governance, data quality frameworks, and metadata management.