Details:
- Salary: £600 - £650 per day
- Job Type: Contract
- Job Status: Full-Time
- Salary Per: Day
- Location: Leeds, West Yorkshire
- Date: 1 week ago
Description:
Spark/PySpark Architect - 12 months+ - Inside IR35 - hybrid working, 3 days on site in Leeds
My client is a global consultancy looking for a number of Spark/PySpark Architects to join them on a long-term programme. As the Spark Architect, you will have the opportunity to work with one of the biggest IT landscapes in the world. You can also look forward to the opportunity to collaborate with senior leadership stakeholders, guide technical teams and drive overall programme objectives.
Responsibilities:
Working on enterprise-scale cloud infrastructure and cloud services in one of the major clouds (GCP).
Drive Data Integration upgrade to PySpark
Collaboration with multiple customer stakeholders
Knowledge of working with Cloud Databases
Excellent communication and solution presentation skills.
Able to analyse Spark code failures through Spark plans and make corrective recommendations
Able to review PySpark and Spark SQL jobs and make performance improvement recommendations
Able to understand Data Frames / Resilient Distributed Data Sets and understand any memory related problems and make corrective recommendations
Able to monitor Spark jobs using wider tools such as Grafana to see whether there are cluster-level failures, and to demonstrate deep knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code.
Mandatory Skills:
At least 12 years of IT experience, with a deep understanding of the components around Spark data integration (PySpark, scripting, variable setting etc.), Spark SQL and Spark explain plans.
Spark SME - Be able to analyse Spark code failures through Spark plans and make corrective recommendations.
Be able to walk through and explain an architecture you have been a part of, and why any particular tool/technology was used.
Spark SME - Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations.
Spark SME - Be able to understand DataFrames / Resilient Distributed Datasets, understand any memory-related problems and make corrective recommendations.
Monitoring - Spark jobs using wider tools such as Grafana to see whether there are cluster-level failures.
Cloudera (CDP) Spark and how the runtime libraries are used by PySpark code.
Prophecy - High-level understanding of the low-code/no-code Prophecy setup and its use to generate PySpark code.
Damia Group Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept our Data Protection Policy which can be found on our website.
Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job.
Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.