Hadoop: 5+ years * Kafka: 3+ years * Spark: 4+ years * PySpark: 3+ years
Must have experience with big data tools such as Hadoop, Cloudera or Elasticsearch * Demonstrable experience in a professional Java environment * Experience of DevSecOps automated deployment tools ...
Must have experience with big data tools such as Hadoop, Cloudera or Elasticsearch * Experience working in an Agile Scrum environment * Experience in design, development, test and integration of ...
Azure and/or AWS -Excellent knowledge of Hadoop and tools such as HBase / Hive and Spark etc -Excellent experience of ETL, data warehousing and handling a variety of data types -Very strong knowledge ...
The following skills/experience are essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent experience across Azure -Excellent knowledge of Hadoop and tools such ...
Understanding of Big Data Technologies (Hadoop, Spark etc) * Experience with Cloud platforms (AWS, GCP or Azure) This is a fully remote role, but may require very occasional travel (once a month or ...
The following skills/experience are essential: -Proven experience as a Lead Big Data Engineer with excellent knowledge of Big Data -Excellent knowledge of Hadoop and tools such as HBase / Hive and ...
The following skills/experience are essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent knowledge of Hadoop and tools such as HBase / Hive and Spark etc ...
Artificial Intelligence and data science approaches (Python, R, Matlab, Spark etc). * database technologies such as Hadoop. * tools that expand the company's toolkit, advancing their ability to ...
... Hadoop, Spark) is a plus Data Engineer - London - £85,000
Experience with big-data technologies Spark/Databricks and Hadoop/ADLS is a must * Experience in any one of the cloud platforms: Azure (preferred), AWS or Google * Experience building data lakes and ...
Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. * Problem solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and ...
Involvement with diverse Big Data tools such as Spark/Kafka/NiFi/HDFS/Hadoop/Docker/ELK/AWS * Great work environment * Assignments to enhance your classroom learning * Personalized coaching from our ...
Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key ...
Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau)
Strong Snowflake development experience * Hands-on experience in Data Modelling/Data Migration and related tools * Sound knowledge of Hadoop architecture * Strong knowledge of SQL * Best pointers for working ...
Build predictive models using machine-learning techniques that generate data-driven insights on modern data platforms (Spark, Hadoop and other MapReduce tools); * Develop and productionize ...
Experience with big data technologies such as Hadoop, Spark, and distributed computing frameworks. What is in it for you? * Significant Bonus Package. * Flexible working for work/life balance
Familiarity with big data technologies such as Hadoop, Spark, or Kafka. * Experience with CI/CD * Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems