Details:
- Salary: £85,000 - £95,000 per annum
- Job Type: Permanent
- Job Status: Full-Time
- Salary Per: Annum
- Location: Nationwide
- Date: 1 week ago
Description:
GCP Data Engineer - Insight & Data Services - Permanent
Salary guideline: £85,000 - £95,000 p.a. (DOE), plus 5-10% bonus, contributory pension up to 6%, health insurance, life assurance, etc.
Base Location: Closest office to your home location / Hybrid working / Part Remote / UK wide
The Client:
Our client is a global leader in Systems Integration and IT Consultancy, and has built a highly advanced, industry-respected Insights & Data practice. The Data Engineering, Architecture and Platform practice is part of the global Insights & Data group.
The Focus of the Role:
Build and deliver GCP data engineering solutions as part of a larger project:
- Use Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) to build solutions for our customers
- Apply experience in Spark (Scala/Python/Java) and Kafka
- Apply experience in MDM, Metadata Management, Data Quality and Data Lineage tools
- Manage E2E data engineering and lifecycle (including non-functional requirements and operations)
- Apply E2E solution design skills: prototyping, usability testing and data visualisation literacy
- Work with modern SQL and NoSQL data stores
- Build relationships with client stakeholders to establish a high level of rapport and confidence
- Work with clients, local teams and offshore resources to deliver modern data products
- Work effectively on client sites, in Capgemini offices and from home
- Use the GCP data-focused reference architecture
- Design and build data service APIs
- Analyse current business practices, processes and procedures, and identify future opportunities for leveraging GCP services
- Design solutions and support the planning and implementation of data platform services, including sizing, configuration and needs assessment
- Implement effective metrics and monitoring processes
Essential Skills and Experience Needed:
- Minimum 3-4 years of experience with Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.); construction industry sector experience preferred
- Google Cloud Platform
- Time-series data
- Building control and monitoring systems
- Java, Scala, Python, Spark, SQL
- Experience developing enterprise-grade ETL/ELT data pipelines
- Deep understanding of data manipulation/wrangling techniques
- Demonstrable knowledge of Data Engineering best practices (coding practices applied to data science, unit testing, version control, code review)
- Big data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc or GCP Cloud Data Fusion
- NoSQL databases: DynamoDB, Neo4j, Elastic, Google Cloud Datastore
- Snowflake data warehouse/platform
- Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming
- Experience working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible, etc.
- Experience and knowledge of application containerisation: Docker, Kubernetes, etc.
- Experience building and deploying solutions to the cloud (AWS, Google Cloud), including cloud provisioning tools
- Strong interpersonal skills, with the ability to work with clients to establish requirements in non-technical language
- Ability to translate business requirements into plausible technical solutions for articulation to other development staff
- Good understanding of Lambda architecture patterns
- Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
- Influencing and supporting project delivery through involvement in project/sprint planning and QA
- Experience with Agile methodology
- Experience with collaboration tools such as JIRA, Kanban boards, Confluence, etc.

83DATA is a boutique consultancy specialising in Data Engineering and Architecture | Data Science (ML, AI, DL) | Data Visualisation | RPA within the UK. We provide high-quality interim and permanent senior IT professionals.