Data Integration Developer
Edison
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, Fairway Market®, and Gourmet Garage® banners. Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern's co-operative members benefit from the company's extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.

The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. The role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration Developer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
• Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and in the maintenance of systems.
• Implement and enforce data quality and governance standards to ensure accuracy and consistency.
• Provide input for project plans and timelines to align with business objectives.
• Monitor project progress, identify risks, and implement mitigation strategies.
• Work with cross-functional teams to ensure effective communication and collaboration.
• Provide regular updates to the management team.
• Follow standards and procedures in accordance with Architecture Review Board best practices, revising them as requirements change and as technological advancements are incorporated into the tech structure.
• Communicate and promote the code of ethics and business conduct.
• Ensure completion of required company compliance training programs.
• Be trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
• Stay current through personal development and through professional and industry organizations.
Responsibilities
• Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
• Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
• Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
• Design and develop end-to-end data solutions using modern technologies, including scripting languages, databases, APIs, and cloud platforms.
• Ensure data solutions and data sources meet quality, security, and compliance standards.
• Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
• Provide technical training, documentation, and ongoing support to end users of data automation systems.
• Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
Qualifications
• Bachelor's degree or higher in computer science, information systems, or a related field.
• Hands-on experience with cloud data platforms (e.g., GCP, Azure).
• Strong knowledge of data automation technologies such as Python, SQL, ETL/ELT tools, Kafka, APIs, and cloud data pipelines.
• Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Experience with workflow orchestration tools such as Cloud Composer or Airflow.
• Proficiency with iPaaS (Integration Platform as a Service) platforms such as Boomi or SAP BTP.
• Hands-on experience with IBM DataStage and Alteryx is a plus.
• Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
• Ability to design and maintain efficient, scalable, and well-structured database schemas that support both analytical and transactional workloads.
• Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
• Familiarity with data modeling tools.
• Proficiency in project management software (e.g., JIRA, Clarizen).
• Familiarity with DevOps practices for data (CI/CD pipelines).
• Strong knowledge of data management, data quality, and data governance.
• Strong communication, collaboration, and problem-solving skills.
• Ability to manage multiple projects and prioritize tasks effectively.
• Ability to work independently and in a team environment.
• Ability to learn new technologies and tools quickly.
• Ability to handle stressful situations.
• Highly developed business acumen.
• Strong critical thinking and decision-making skills.

Working Conditions & Physical Demands
• This position requires in-person office presence at least four days a week.