Data Scientist Deutsche Telekom
20 hours ago
Granada
At T-Systems, you will find groundbreaking projects that contribute to social and ecological well-being. We want to welcome new talents like you, who bring fresh ideas and different points of view, who accept challenges and continuous learning, to grow and have an impact on society… All this, in a fun way! It doesn't matter when or where you work. It's about doing work that matters to move society forward. For this reason, we will do everything possible to ensure that you have every opportunity to develop, by offering you a support network, excellent technology, a new work environment and the freedom to work independently. We support you in growing constantly, both personally and professionally, so that you can leave a notable mark on society. T-Systems is a team of around 28,000 people employed around the world, making us one of the world's leading providers of integrated end-to-end solutions. We develop hybrid cloud and artificial intelligence solutions and drive the digital transformation of companies, industry, the public sector and, ultimately, society as a whole.

Job Description
With our models we calculate the strategic fibre rollout plan for Telekom across the whole of Germany. With our models and data analytics we provide transparency and strategic insight. For a correct assessment of market opportunities it is necessary to get a complete, up-to-date picture of competitor activities. Therefore we need your help to build up a crawling pipeline which will continuously update the available data about competitor activities from public sources.

Activity description and concrete tasks:
The tasks comprise all activities needed to support the current crawling process and to improve it continuously, so as to get the latest data on competitor activities from public web sources in the most efficient way. Specifically, there are three main activities. First, for each web source to be analysed, a standard crawler script has to be adapted to the individual page layout of that source.
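This first task can be pictured as keeping one generic extraction routine and swapping in a small per-site configuration. The sketch below is illustrative only (all names, URLs and selectors are hypothetical, not part of the actual pipeline): in production the page HTML would come from Selenium (`driver.page_source`), while here a plain string stands in so the adaptation pattern is shown in isolation.

```python
# Hypothetical sketch of adapting a standard crawler to one site's layout:
# the extraction logic is generic, only SourceConfig changes per web source.
from dataclasses import dataclass
from html.parser import HTMLParser


@dataclass
class SourceConfig:
    name: str            # internal label for the competitor source
    start_url: str       # entry page to crawl (fetched via Selenium in practice)
    listing_tag: str     # HTML tag that wraps one rollout announcement
    listing_class: str   # CSS class of that tag in this site's layout


class ListingExtractor(HTMLParser):
    """Collects the text of every element matching the source's config."""

    def __init__(self, config: SourceConfig):
        super().__init__()
        self.config = config
        self.results: list[str] = []
        self._capturing = False

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "") or ""
        if tag == self.config.listing_tag and self.config.listing_class in classes.split():
            self._capturing = True

    def handle_endtag(self, tag):
        if tag == self.config.listing_tag:
            self._capturing = False

    def handle_data(self, data):
        if self._capturing and data.strip():
            self.results.append(data.strip())


def extract_listings(html: str, config: SourceConfig) -> list[str]:
    """Run the generic extractor over one fetched page."""
    parser = ListingExtractor(config)
    parser.feed(html)
    return parser.results


# Adapting the same extractor to one fictional site's layout:
example_config = SourceConfig(
    name="example-competitor",
    start_url="https://example.com/ausbau",
    listing_tag="div",
    listing_class="rollout-item",
)
page = '<div class="rollout-item">Musterstadt: FTTH ab Q3</div>'
print(extract_listings(page, example_config))  # → ['Musterstadt: FTTH ab Q3']
```

Adding a new competitor source then reduces to writing a new `SourceConfig` (and, where a site needs it, a site-specific Selenium navigation step), rather than a new crawler.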
As a second task, the crawling results should be collected and transferred to our database, and a reporting system should be established to visualize the changes the crawling results produce in the database. The third task covers the automation of the crawling process: a Git pipeline should be established that updates the adapted crawling scripts in an AWS cloud environment. Apart from this, the job includes new approaches to intelligently screen the German competitor market, understand market strategies and provide transparency to stakeholders.

Qualifications
• Proficiency in Python programming
• Hands-on experience with Selenium or similar web scraping tools
• Strong SQL and database management skills
• Experience with Git and version control systems
• Demonstrated experience with AWS cloud services
• Experience with CI/CD pipelines and automation tools (e.g., GitLab CI)

Additional Information
What do we offer you?
• International, positive, dynamic and motivated work environment
• Hybrid work model (teleworking/on-site)
• Flexible schedule
• Continuous training: preparation for certifications, access to Coursera, weekly English and German classes...
• Flexible compensation plan: health insurance, meal vouchers, childcare, transport assistance...
• Life and accident insurance
• More than 26 working days of vacation per year
• Social fund
• Free specialist services (doctors, physiotherapists, nutritionists, psychologists, lawyers...)

If you are looking for a new challenge, do not hesitate to send us your CV. Join our team!
T-Systems Iberia will only process the CVs of candidates who meet the requirements specified for each offer.