Craig Nel
    Published
    December 14, 2017
    Location
    Sandton, South Africa
    Job Type
    Permanent

    Description

    A financial services industry giant and employer of choice based in Sandton is looking for a Big Data Specialist with solid skills and experience in technologies such as Java, Pig, Hive, Sqoop, Spark and Hadoop.

    Key Purpose

    • Construction of complex Business Intelligence assets based on specifications produced by BI Architects and System Analysts, including mentoring and supporting junior developers.

    Key Outputs

    • Create digital assets on big data platforms.
    • Create, maintain and support data processing functions to import unstructured data sources from various technology stacks into open-source platforms.
    • Translate specifications, including but not limited to the CRS and PRS, into executable code without errors.
    • Find effective software solutions to technical issues.
    • Ensure that the application performs the functions required by the business.
    • Actively coach other developers during the code review process to understand and apply standards, while ensuring that business objectives are reflected in the technical processes they manage.
    • Contribute to Business Intelligence strategy, technical direction and ETL architecture on big data.

    Requirements:

    Qualifications & Experience    

    • Essential: Tertiary degree (BEng, BCom, BSc) or similar
    • 5+ years’ experience in software development on Hadoop for large commercial entities.

    Technical Skills:

    • Processes: SDLC, ITIL (Incident, Change, Release, Problem Management)
    • Technologies: SQL, Oracle, PL/SQL
    • Big data and Hadoop ecosystem-related technologies
    • Excellent understanding of the Java 2 Platform, Enterprise Edition (J2EE)
    • Apache Pig
    • Apache Hive
    • Apache Sqoop
    • Apache Spark
    • Writing Pig UDFs, and Hive UDFs and UDAFs, for data analysis (see the sketch after this list).
    • Importing and exporting data between relational databases and HDFS using Sqoop.
    • Hadoop architecture and its components, such as HDFS, MapReduce, JobTracker, TaskTracker, NameNode and DataNode.
    • Using complex data types in Pig and MapReduce to handle and format data as required.
    • Working experience in Agile and Waterfall models.
    • Expertise in writing ETL Jobs for analyzing data using Pig.
    • Expertise in NoSQL column-oriented databases such as HBase, and their integration with a Hadoop cluster.
    • Excellent working knowledge of Java and SQL in application development and deployment.
    • Good knowledge of relational databases such as DB2 and Oracle, and NoSQL databases such as HBase and MongoDB.
    • Ability to contribute to all stages of the SDLC: client requirement analysis, prototyping, coding, testing and documentation.
    • Tech-functional responsibilities include interfacing with users, identifying functional and technical gaps, producing estimates, designing custom solutions, development, leading developers, producing documentation and production support.
    • Basic knowledge of Apache Spark for fast, large-scale in-memory processing.
    • Diverse experience in utilizing Java tools in business, web and client-server environments.
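
    For candidates wondering what "writing Hive UDFs" means in practice, here is a minimal sketch using the classic org.apache.hadoop.hive.ql.exec.UDF API. The class name, behaviour and column semantics are illustrative assumptions, not part of this role's specification:

        import org.apache.hadoop.hive.ql.exec.UDF;
        import org.apache.hadoop.io.Text;

        // Minimal Hive UDF sketch: masks all but the last four characters
        // of an account number. Hypothetical example, not the employer's code.
        public final class MaskAccount extends UDF {
            public Text evaluate(final Text account) {
                if (account == null) {
                    return null; // preserve Hive's NULL pass-through behaviour
                }
                String s = account.toString();
                if (s.length() <= 4) {
                    return new Text(s);
                }
                StringBuilder masked = new StringBuilder();
                for (int i = 0; i < s.length() - 4; i++) {
                    masked.append('*');
                }
                masked.append(s.substring(s.length() - 4));
                return new Text(masked.toString());
            }
        }

    Once packaged into a JAR, such a function would typically be registered in Hive with ADD JAR followed by CREATE TEMPORARY FUNCTION mask_account AS 'MaskAccount', and then called like any built-in function in a SELECT.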

    Behavioural Skills:

    • Result Orientation
    • Change Readiness
    • Time Management
    • Communication (written, verbal and listening)
    • Creativity
    • Innovation

    The reference number for this position is CN39804. It is a permanent position based in Sandton, offering a salary of R840k – R960k per annum CTC, negotiable.

    The time for change is NOW! e-Merge IT recruitment are specialist niche recruiters with a wide range of positions available. We offer researched positions with top companies to strong technical candidates. For more info on this and other opportunities, contact Craig on (011) 463 3633 / craig@e-merge.co.za
