Global Enterprise Partners is looking for a Big Data Engineer who will work on collecting, storing, processing, provisioning, and analysing massive data sets.
- You will assemble large, complex data sets that meet functional and non-functional business requirements.
- You will work together with the Data Science team to prepare structured and unstructured data for predictive and prescriptive modelling.
- You will work with the DWH architecture team to integrate the client's data warehouse (SQL Server) with big data technologies linked to it, Hadoop being one of them.
- You will identify, design, and implement internal process improvements.
- You have a minimum of 2 years' experience with distributed data storage technologies, including the Hadoop ecosystem.
- Strong experience with integrating data from multiple data sources.
- You have experience with big data platforms such as HDFS, Hive, and LLAP/Impala, as well as NoSQL technologies (Elasticsearch, MongoDB, HBase, Redis, etc.).
- You have experience building stream-processing systems using technologies such as Kafka, Storm, and Spark Streaming.
- Experience with analytic programming languages: Python, R, Java.
- Knowledge or understanding of the Lambda Architecture.
- T-SQL experience would be a plus.
- Linux Bash scripting experience.
Start date: June 2018
Duration: 6-12 months contract
Rate: please let us know your rate expectations
Are you interested and available?
Please share your CV in Word format.