Data Modeller - hybrid
2 days ago
Granada
At T-Systems, you will find groundbreaking projects that contribute to social and ecological well-being. It doesn't matter when or where you work: what counts is doing work that matters and moves society forward. For this reason, we will do everything possible to ensure you have every opportunity to develop, offering you a support network, excellent technology, a new work environment and the freedom to work independently. We develop hybrid cloud and artificial intelligence solutions and drive the digital transformation of companies, industry, the public sector and, ultimately, society as a whole.

Project Description:
We are looking for a Data Engineer to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on One Data Entry (ODE), development of the relevant data products on ODE, and operation of the data products on the ODE infrastructure.

Responsibilities:
- Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures (Shared VPC, hub-and-spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and load balancing.
- Data Processing & Transformation: Use a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow.
- Core GCP Services Management: Work extensively with services such as Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
- Application Implementation: Develop and implement Python applications for various GCP services.
- CI/CD: Integrate and manage GitLab Magenta CI/CD pipelines for automating cloud deployment, testing, and configuration of diverse data pipelines.
- Security: Implement comprehensive security measures, manage IAM policies and secrets using Secret Manager, and enforce identity-aware policies.
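At its core, the Airflow orchestration work described above means running pipeline tasks in dependency order. As a toy illustration only (this is plain stdlib Python, not Airflow's actual DAG API, and the task names are invented), the idea can be sketched with a topological sort:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps mapped to their upstream dependencies;
# in Airflow these would be operators wired together inside a DAG.
pipeline = {
    "ingest_gcs": set(),                     # pull raw files from Cloud Storage
    "transform_pyspark": {"ingest_gcs"},     # PySpark job over the raw data
    "load_bigquery": {"transform_pyspark"},  # publish the data product
}

def execution_order(dag: dict[str, set[str]]) -> list[str]:
    """Return one valid run order for the task graph."""
    return list(TopologicalSorter(dag).static_order())

print(execution_order(pipeline))
# → ['ingest_gcs', 'transform_pyspark', 'load_bigquery']
```

An orchestrator like Airflow adds scheduling, retries, and monitoring on top of exactly this dependency-ordering idea.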
- Data Integration: Handle integration of data sources from CDI, Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS).
- Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
- AI: Implement AI solutions using Google's Vertex AI for building and deploying machine learning models.

Requirements:
- Certified GCP Cloud Architect or Data Engineer (required).
- Google Cloud Platform (GCP) knowledge.
- Expertise in Terraform for infrastructure management.
- Python for application implementation.
- Experience with GitLab CI/CD for automation.
- Knowledge of network architectures, security implementations, and management of core GCP services.
- Proficiency with data processing tools such as Hive and PySpark, and orchestration tools such as Airflow.
- Familiarity with managing and integrating diverse data sources.

What we offer:
- Hybrid work model (teleworking/on-site).
- Continuous training: preparation for certifications, access to Coursera, weekly English and German classes...
- Flexible compensation plan: health insurance, meal vouchers, childcare, transport assistance...
- Social fund.
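A common pattern behind the DEV/TEST/PROD deployment responsibility is deriving environment-scoped settings from one shared definition, so the same workload deploys consistently everywhere. A minimal sketch, assuming invented project IDs and an invented naming scheme purely for illustration:

```python
# Hypothetical per-environment settings; real GCP project IDs would differ.
ENVIRONMENTS = {
    "DEV":  {"project": "ode-dev",  "replicas": 1},
    "TEST": {"project": "ode-test", "replicas": 2},
    "PROD": {"project": "ode-prod", "replicas": 3},
}

def workload_config(name: str, env: str) -> dict:
    """Build an environment-scoped config for one workload."""
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env}")
    base = ENVIRONMENTS[env]
    return {
        "workload": f"{name}-{env.lower()}",  # e.g. ingest-pipeline-prod
        "project": base["project"],
        "replicas": base["replicas"],
    }

print(workload_config("ingest-pipeline", "PROD"))
```

In practice the same idea is usually expressed in Terraform workspaces or variable files rather than application code, but the principle, one definition parameterised per environment, is the same.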