Cloud Platform Engineer
Location: Paddington (Full-time remote working during COVID-19)
Our purpose at Vodafone is to connect for a better future. At Vodafone Group, we partner with our local markets to drive consistency and earn customer loyalty across 20 countries for over 600 million customers. We are restless and passionate about creating a better future for all. We are always open to new things, and curious to create solutions that our customers will love. We are constantly learning to innovate the way we operate, set our global standards and deliver on our strategy. Our ambition is to improve 1 billion lives and halve our environmental impact by 2025. It’s up to each of us to make this happen. Ready?
Today, we connect one in 16 people worldwide. The technology function is fundamental in building a Digital Society. Our Gigabit networks, the Internet of Things and mobile financial services enable incredible innovations that will make our lives easier, healthier, smarter and more fulfilling. We want to make sure that our technology creates a better future for people, communities, businesses and society. We get it done, together. We think big, take risks, keep the best and learn from the rest. Do you?
The Analytics Delivery team, within the Group Technology business function, is onboarding Local Markets onto a bespoke Google Cloud Platform (GCP) and expanding its usage to encompass other Vodafone business functions and projects.
The project delivers Next Generation Data Products on Google Cloud, covering the entire Telecoms business. It leverages existing data repositories and rapid, agile delivery to provision Visualisations and Machine Learning within rich digital apps. The project delivers early, tangible value and helps Local Vodafone Markets be first adopters of Next Generation solutions and technology.
The project’s key strengths are its fast delivery, world-class cloud platforms and user-friendly designs, enabling access to massive datasets in secure, fast apps.
The Platform Engineer delivers cloud infrastructure and software deployment automation on a variety of projects. This will be at industrial scale, delivering local data from 20+ countries to a central platform hosted in the cloud. Our goal is to create a self-healing solution that reduces operational support by over 70%.
Expert experience with Google Cloud Platform and its services, such as Cloud Data Fusion, Dataflow, BigQuery, etc.
Good exposure to Docker/Kubernetes
Expert-level experience with the Hadoop ecosystem (Spark, Hive/Impala, HBase, YARN)
Strong software development experience in the Java, Scala and Python programming languages; other functional languages desirable
Experience with Unix-based systems, including bash programming
Experience with other distributed technologies
Experience with SQL, preferably including knowledge of BigQuery
Experience working in an Agile environment
Expertise in public cloud PaaS, CaaS and IaaS tools and environments, particularly with Google Cloud Platform (GCP).
Demonstrable knowledge and expertise in Jenkins, Gitlab/Github, Nexus or equivalent CI/CD tools
Experience working with Terraform, Ansible and a common scripting language (Bash, Python, Ruby etc)
Experience building systems to perform real-time data processing using Kafka, or similar technologies like Pub/Sub.
Experience with a common SDLC, including SCM, build tools, unit testing, TDD/BDD, continuous delivery and agile practices
Experience working on EDW or Data Warehouse solutions