
As a Security Operations Analyst, you'll monitor, triage, and respond to threats across our global estate, using modern SIEM/EDR and automation to keep [Company/clients] safe. Hybrid working, strong learning culture, and clear progression.

The role
You'll be part of our Security Operations Centre, detecting and responding to cyber threats, improving our controls, and guiding the business through security incidents. This role suits someone hands-on with SOC tooling, calm under pressure, and eager to automate the boring stuff.

What you'll do
• Monitor and triage security alerts across SIEM (e.g., Microsoft Sentinel/Splunk), EDR (e.g., Defender for Endpoint/CrowdStrike), email security, and cloud platforms.
• Investigate incidents end-to-end: scoping, containment, eradication, and recovery; maintain clear incident records and timelines.
• Execute and improve playbooks/runbooks; contribute to SOAR automation for repetitive tasks (a short illustrative sketch follows this listing).
• Perform threat hunting using hypotheses mapped to MITRE ATT&CK; enrich findings with threat intelligence (internal and external).
• Lead or assist on phishing investigations, triage-level malware analysis, and suspicious user activity reviews.
• Collaborate with IT/Cloud/Network teams on log onboarding, tuning, and control gaps; reduce false positives.
• Track and meet SLAs/KPIs (MTTD/MTTR); deliver concise, executive-ready post-incident reports and lessons learned.
• Support vulnerability management by contextualising exposures and recommending remediation priorities.
• Participate in shift handovers and, if applicable, an out-of-hours/on-call rota.
• Contribute to security awareness and purple-team exercises/attack simulations.

What you'll bring
• Experience in a SOC/IR role (typically 2–5 years for this level) with demonstrable incident handling.
• Working knowledge of SIEM, EDR, email security, network security (IDS/IPS, firewalls), and cloud security (Azure/AWS).
• Ability to query and analyse data (KQL/Splunk SPL/SQL); basic scripting (PowerShell or Python) for enrichment and automation.
• Familiarity with frameworks and standards: MITRE ATT&CK, NIST CSF, ISO/IEC 27001, and Cyber Essentials/Plus.
• Strong written and verbal communication; comfortable translating technical risk for non-technical audiences.
• A proactive mindset: curiosity, ownership, and continuous improvement.

Nice to have (advantageous, not essential)
• Certifications such as Security+, CySA+, SC-200, AZ-500, GCIH/GCIA/GCTI, SSCP, GCED, or equivalent.
• Experience with SOAR tooling, sandboxing, DFIR basics, or purple-team methodology.
• Exposure to identity security (Entra ID, Okta), SaaS security, or container/Kubernetes security.
• Experience in regulated environments (financial services, public sector) and/or UK SC/BPSS clearance eligibility.

What we offer
• Leave: 20 days' annual leave.
• Pension & protection: employer pension contribution, life assurance, and income protection.
• Learning & growth: budget for certifications, paid exam days, access to labs and training platforms; clear progression to Senior Analyst/Incident Responder/Threat Hunter.
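For illustration only, here is a minimal Python sketch of the kind of SOAR-style alert enrichment this role automates. The rule names, ATT&CK mappings, and prioritisation rules are all hypothetical; a real playbook would be driven by your SIEM/SOAR APIs and curated threat intelligence.

```python
# Hypothetical sketch: enrich a security alert with a MITRE ATT&CK technique
# and a simple triage priority. Field names and rules are illustrative only.
from dataclasses import dataclass

# Hypothetical mapping from detection rule names to ATT&CK technique IDs.
TECHNIQUE_MAP = {
    "suspicious_powershell": "T1059.001",  # Command and Scripting Interpreter: PowerShell
    "credential_dumping": "T1003",         # OS Credential Dumping
    "phishing_link_click": "T1566.002",    # Phishing: Spearphishing Link
}

@dataclass
class Alert:
    rule: str
    host: str
    severity: int                          # 1 (low) .. 5 (critical)
    attack_technique: str | None = None
    priority: str = "unassigned"

def enrich(alert: Alert) -> Alert:
    """Attach an ATT&CK technique and a toy triage priority to an alert."""
    alert.attack_technique = TECHNIQUE_MAP.get(alert.rule)
    # Toy prioritisation: escalate anything severe or credential-related.
    if alert.severity >= 4 or alert.attack_technique == "T1003":
        alert.priority = "P1 - investigate now"
    else:
        alert.priority = "P3 - queue for triage"
    return alert

if __name__ == "__main__":
    print(enrich(Alert(rule="credential_dumping", host="ws-042", severity=3)))
```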

Job Overview
We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will be responsible for designing, constructing, and maintaining scalable data pipelines and architectures. You will work closely with data scientists and analysts to ensure the efficient flow of data across various platforms and systems, enabling insightful analysis and decision-making.

We are looking for a Data Engineer to join our growing Data and Analytics team. This role is ideal for someone with a solid foundation in data engineering who wants to develop deeper skills in Azure Databricks and Microsoft Fabric. You will play a key role in developing and maintaining modern data pipelines, shaping the metadata-driven architecture, and building high-quality data models that power reporting and advanced analytics across the business.

Duties
• Develop and maintain robust data pipelines using technologies such as AWS, Hadoop, and Spark.
• Design and implement database solutions for both structured and unstructured data using Oracle and Microsoft SQL Server.
• Collaborate with cross-functional teams to understand data requirements and translate them into technical specifications.
• Perform data modelling and database design to optimise performance and scalability.
• Conduct data analysis to identify trends, patterns, and anomalies in large datasets.
• Utilise programming languages such as Python and Java for data manipulation and transformation tasks.
• Implement ETL processes using tools like Informatica to ensure seamless data integration.
• Write efficient SQL queries for data retrieval, reporting, and analysis.
• Create documentation for data processes, workflows, and system architecture.
• Employ shell scripting (Bash) for automation of routine tasks.
• Build and maintain scalable data pipelines in Azure Databricks and Microsoft Fabric using PySpark and Python.
• Support the metadata-driven architecture (raw, enriched, curated layers) to ensure a clean separation of raw, refined, and curated data (a brief sketch of this pattern follows this listing).
• Design and implement dimensional models such as star schemas and slowly changing dimensions.
• Work closely with analysts, governance, and engineering teams to translate business requirements into data solutions.
• Apply data governance and lineage principles to ensure documentation, traceability, and quality.

What you'll bring
• Proven experience in a Data Engineering role or similar position.
• Strong knowledge of big data technologies including Hadoop, Apache Hive, and Spark.
• Proficiency in programming languages such as Python, Java, VBA, and shell scripting (Bash).
• Experience with database design principles and management of relational databases (Oracle, Microsoft SQL Server).
• Familiarity with data warehousing concepts and best practices.
• Excellent analytical skills with the ability to interpret complex datasets effectively.
• Strong problem-solving abilities coupled with attention to detail.
• Ability to work collaboratively in a team environment while also being self-motivated.
• Strong SQL and Python skills with hands-on experience in PySpark.
• Exposure to Azure Databricks, Microsoft Fabric, or similar cloud data platforms.
• Understanding of Delta Lake, Git, and CI/CD workflows.
• Experience with relational data modelling and dimensional modelling.
• Awareness of data governance tools such as Purview or Unity Catalog.
• Excellent analytical and problem-solving ability with strong attention to detail.
• Familiarity with Agile delivery principles.
• Interest in gaining the Microsoft Fabric Data Engineer certification (supported by the business).

If you are passionate about working with data and have the skills required to thrive in this role, we encourage you to apply. Join us in driving our data initiatives forward!
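As a minimal sketch of the metadata-driven medallion (raw/enriched/curated) pattern named above: the config schema, table names, and transforms below are hypothetical, and in the role itself this would be built with PySpark and Delta Lake on Azure Databricks or Microsoft Fabric rather than in-memory dicts.

```python
# Hypothetical metadata-driven flow: one config entry drives the whole
# raw -> enriched -> curated path, so new sources are config, not new code.
PIPELINE_CONFIG = [
    {"source": "orders", "key": "order_id", "enrich_cols": {"amount": float}},
    {"source": "customers", "key": "customer_id", "enrich_cols": {"age": int}},
]

def land_raw(source: str) -> list[dict]:
    """Raw layer: ingest records as-is (stubbed here with sample data)."""
    samples = {
        "orders": [{"order_id": "1", "amount": "19.99"}, {"order_id": "2", "amount": "5"}],
        "customers": [{"customer_id": "c1", "age": "41"}],
    }
    return samples[source]

def enrich(records: list[dict], enrich_cols: dict) -> list[dict]:
    """Enriched layer: apply the typing/cleaning rules declared in metadata."""
    return [{**r, **{c: cast(r[c]) for c, cast in enrich_cols.items()}} for r in records]

def curate(records: list[dict], key: str) -> dict:
    """Curated layer: deduplicate on the business key, ready for modelling."""
    return {r[key]: r for r in records}

if __name__ == "__main__":
    for cfg in PIPELINE_CONFIG:
        curated = curate(enrich(land_raw(cfg["source"]), cfg["enrich_cols"]), cfg["key"])
        print(cfg["source"], "->", curated)
```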

• Collect, clean, and validate large datasets from multiple structured and unstructured sources.
• Develop and maintain data pipelines using SQL, Python (Pandas, NumPy), and data integration tools.
• Perform statistical analysis and exploratory data analysis (EDA) to identify trends, anomalies, and insights (a small example follows this list).
• Support the design and implementation of predictive models and machine learning algorithms using Python (scikit-learn, TensorFlow, PyTorch) or R.
• Create and maintain interactive dashboards and visual reports using Power BI, Tableau, or Qlik.
• Collaborate with senior analysts to translate data insights into business recommendations.
• Apply data governance and information security best practices throughout all analysis.
• Use version control tools (Git/GitHub) and follow agile working methodologies where appropriate.
• Contribute to documentation, data cataloguing, and reproducibility of analytical work.
• Present findings clearly to non-technical stakeholders through visual storytelling and reports.
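A small, illustrative Python sketch of the validation-and-EDA work described above: load a dataset, report missing values, and flag simple anomalies. The column names, sample data, and z-score threshold are hypothetical; real work would target the team's own data.

```python
# Hypothetical EDA sketch: validate a toy dataset and flag outliers.
import numpy as np
import pandas as pd

# Toy dataset standing in for a real extract.
df = pd.DataFrame({
    "day": pd.date_range("2024-01-01", periods=8, freq="D"),
    "sales": [120, 118, 121, 119, 420, 117, 122, np.nan],
})

# Basic validation: report missing values, then drop incomplete rows.
print("missing per column:\n", df.isna().sum())
clean = df.dropna(subset=["sales"])

# Simple anomaly flag: points more than 2 standard deviations from the mean.
z = (clean["sales"] - clean["sales"].mean()) / clean["sales"].std()
print("flagged anomalies:\n", clean[z.abs() > 2])
```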