Senior Data Engineer
Glasgow
Ready to shape the future of work? At Genpact, we don’t just adapt to change; we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s ___, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to ___, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at ___ and on ___, ___, ___, and ___.

Role: Principal Consultant-I
Tech Stack: ETL (Python + Databricks), Snowflake, Big Data
Location: Glasgow
RTO: 3 days per week in office (hybrid)

You will be responsible for:
• Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks
• Developing and deploying ETL jobs that extract data from various sources and transform it to meet business needs
• Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency
• Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimization
• Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives
• Conducting code reviews, providing constructive feedback, and enforcing coding standards to maintain high quality
• Developing and maintaining tooling and automation scripts to streamline repetitive tasks
• Implementing unit, integration, and other testing methodologies to ensure the reliability of ETL processes
• Utilizing REST APIs and other integration techniques to connect various data sources

What we’re looking for:
• Proficiency in Python programming, including experience writing efficient and maintainable code
• Hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
• Proficiency with Snowflake or similar cloud-based data warehousing solutions
• Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices
• Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment
• Experience with code versioning tools (e.g., Git)
• Meticulous attention to detail and a passion for problem solving
• Knowledge of Linux operating systems
• Familiarity with data visualization tools and libraries (e.g., Power BI)
• Background in database administration or performance tuning
• Familiarity with data orchestration tools such as Apache Airflow
• Previous exposure to big data technologies (e.g., Hadoop, Spark) for large-scale data processing
• Experience with ServiceNow integration