Summary of the work
Specialist role – Developer
Contribute to the development of the Borders AFTC service in an Agile delivery environment. The role is for a Data Engineer to continue the design, build, commissioning and testing of a big data platform for a Home Office department.
Latest start date: Sunday 30 July 2017
Expected contract length: Up to 24 months or completion of the services
Organisation the work is for: Home Office Borders Technology Portfolio
About the work
Who the specialist will work with:
Key stakeholders within Border Force, Home Office Digital, Data and Technology, the wider Home Office and the Security Services. The programmes are staffed by a range of specialists, including Home Office civil servants and civil servants from other government departments.
What the specialist will work on:
The developer will need to be an SME in: Hadoop / HDFS, HBase, Kafka, Kerberos, Drools, graph databases, big data architecture, real-time data processing, Spark, Storm, Solr, Java 8, algorithms, data science, systems engineering, ZooKeeper, Spring / Spring Boot and Spring Data REST, REST, TDD, BDD (specifically Cucumber), microservices architecture, GitHub / GitLab with branching and merging strategies, and clean code.
Skills and experience
Essential skills and experience:
Hadoop / HDFS
HBase, Kafka, Kerberos, Drools
Big data architecture, real-time data processing, Spark, Storm
Java 8, Algorithms, Data Science, Systems Engineering
Spring / Spring Boot and Spring Data REST
BDD (specifically Cucumber)
Fluency in English and good communication skills
Experience working within a Scrum team and familiarity with continuous integration (CI) techniques and environments
Nice-to-have skills and experience:
Apache Camel, Knox, Ranger
Enterprise Integration Patterns