Big Data Engineer
Are you an experienced Big Data Engineer interested in working within finance?
Do you have experience with Scala and Apache Spark?
If the answer is yes to both, I have a fantastic opportunity for you!
My client has an opportunity in the Data Systems & Change team for an individual with strong Scala and Spark development skills. You will design and develop data solutions for deployment to their internal Hadoop cluster, and you should be familiar with, and committed to, the principles of agile project delivery.
This is a great hands-on role in which you will deliver code to a high standard whilst mentoring less experienced developers within a feature team and reviewing their work.
You will contribute to the team's overall technical direction, drawing on your leadership, communication and stakeholder management skills.
You will build and manage effective stakeholder relationships and have the ability to find creative solutions to unfamiliar problems. You will create new solutions, configure existing systems and provide end-user support. Innovation is at the heart of the team, so you'll need a passion for exploring new ways of working and adopting new technologies. In this role, you should be able to write functional code with a sharp eye for detail and for spotting defects.
Key responsibilities:
* Design and implement data processing applications for batch and real time delivery of information to end users
* Interpret and analyse business use cases and feature requests, and translate them into test cases and technical designs
* Take ownership of development tasks, participate in scrum events and user story refinement
* Collaborate with internal teams to produce high level system designs and influence decisions on architecture
* Develop, test and implement code to departmental standards and scrum development principles
* Influence stakeholders at all levels, working closely with business colleagues to understand and shape requirements and solutions
* Proactively develop and maintain customer relationships to meet current and future business needs
* Ensure compliance with all relevant Group and DS&C procedures, guidelines and reporting requirements as well as any relevant regulatory and statutory requirements
* Mentor and coach other team members in a range of skills and behaviours, including both technical and development-lifecycle best practice
Skills and experience:
* Computer Science or Software Engineering degree or equivalent experience
* Experience working with the Hadoop framework and an understanding of the common ecosystem components and their inner workings (e.g. HDFS, YARN, Hive, HBase and Kafka)
* Demonstrable experience on big data or advanced analytics projects
* Strong SQL, ideally with both relational and dimensional data modelling experience.
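As a rough illustration of the kind of Scala and Spark work this role involves, a minimal batch job might look like the sketch below. The table, column and output path names are illustrative assumptions only, not the client's actual schema or environment.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch of a Spark batch job (hypothetical names throughout):
// read a Hive table from the Hadoop cluster, aggregate, write back as Parquet.
object DailyTradeSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-trade-summary")
      .enableHiveSupport()
      .getOrCreate()

    // Read an (assumed) Hive table of trades
    val trades = spark.read.table("finance.trades")

    // Aggregate total notional per instrument per day
    val summary = trades
      .groupBy(col("trade_date"), col("instrument"))
      .agg(sum("notional").as("total_notional"))

    // Write the result to HDFS as Parquet for downstream consumers
    summary.write.mode("overwrite").parquet("/data/summaries/daily_trades")

    spark.stop()
  }
}
```

In practice a job like this would be built with sbt, submitted via spark-submit to YARN, and covered by unit tests on the transformation logic, in line with the agile delivery and code-quality expectations described above.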
Search is an equal opportunities recruiter and we welcome applications from all suitably skilled or qualified applicants, regardless of their race, sex, disability, religion/beliefs, sexual orientation or age.