Global Enterprise Partners is currently looking for a Big Data Developer for a client in Utrecht, the Netherlands.
For this particular opportunity it is important that the candidate has already resided in the Netherlands for the last 5 years.
Hadoop Platform: Kafka / HDFS / Spark / HBase / Zeppelin / Flume / Hive / Elastic
Tooling: Jira / Confluence / Jenkins / Kylo / Unix scripting / Eclipse / IntelliJ / Git
Agile Framework: Scrum / Kanban
Keep the cluster and services operational and monitor the health of the Hadoop cluster
Realise and implement analyses and models on Big Data
Realise and implement production-worthy applications on the Hadoop platform
Realise data streams and data transformations on the platform
Maintain contact and collaborate with the other Big Data teams
Stimulate DevOps culture within the organisation
Minimum 2 years' experience in Java OR
Minimum 2 years' experience in Unix
Minimum: training completed in Kafka / HDFS / Spark / HBase / Zeppelin / Flume / Hive / Elastic
WSBD1: 2 years' experience in HDFS / Spark / Flume
WSBD2: 1 year's experience in Kafka / Zeppelin / Hive / Elastic / Bash scripting
WSBD3: Experience with HTML5 and AngularJS (2)
WSBD4: Experience with information analysis
Does this match your profile? If yes, please respond directly with an updated CV in Word format to a.doyle(a)globalenterprisepartners.com, including your all-inclusive hourly rate and availability.
Please feel free to forward this to your network.