DysrupIT

GCP Data Engineer

JOB SUMMARY

We are seeking an experienced Senior Data Engineer with strong Google Cloud Platform (GCP) expertise to design, build, and optimize scalable data pipelines and data platforms. The ideal candidate will play a key role in developing data solutions that support analytics, reporting, and business intelligence initiatives while ensuring reliability, performance, and data quality.

This role will collaborate closely with data analysts, data scientists, and engineering teams to deliver high-quality data solutions within a modern cloud environment. The successful candidate will bring not only technical depth, but also the independence, curiosity, and enterprise resilience required to thrive in complex, large-scale delivery environments.

JOB RESPONSIBILITIES

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows on GCP.
  • Build and optimize data architectures to support large-scale data processing and analytics.
  • Implement and manage data solutions using Google Cloud data services.
  • Ensure data quality, integrity, and reliability across data pipelines and storage systems.
  • Work closely with cross-functional teams to understand data requirements and translate them into technical solutions — proactively challenging vague or ambiguous requirements rather than simply executing instructions.
  • Design robust database schemas with future consumers in mind, recognizing that data decisions have long-lasting consequences.
  • Optimize performance of data workflows and improve data processing efficiency.
  • Implement best practices for data governance, security, and monitoring.
  • Navigate enterprise processes such as rigorous peer reviews and architecture approvals with professionalism and efficiency.
  • Manage typical business delays and organisational complexity independently, without requiring close supervision.
  • Support troubleshooting and resolution of production data issues.
  • Mentor junior data engineers and contribute to technical best practices.

WHAT WE’RE LOOKING FOR

Beyond core technical skills, we are looking for engineers who bring the following qualities:

GCP End-to-End Production Experience

We seek senior engineers who have genuinely been through the full lifecycle of GCP delivery — not just development, but production. This means familiarity with rigorous peer review processes, enterprise architecture approval workflows, and the real-world challenges of operating at scale. Candidates should hold one or more Google Professional Certifications (e.g. Professional Data Engineer, Professional Cloud Architect).

Product Mindset

This role operates in an environment where requirements may initially be unclear or evolving. We need independent, curious thinkers who take ownership of understanding the full end-to-end process. The ideal candidate will proactively question assumptions, surface gaps, and challenge vague specifications rather than waiting to be directed.

Database Architecture Excellence

Strong database design skills are critical to this role. Because data is long-lived and its future consumers are often unknown at design time, candidates must be able to design schemas and data models that are robust, extensible, and thoughtfully structured for the long term. Experience with both OLTP and OLAP design patterns is highly valued.

Enterprise Experience & Resilience

Candidates should have experience working within large, complex organisations where processes can be slow and layered. We value individuals who understand how to work effectively within these environments — managing delays pragmatically, navigating stakeholder complexity, and maintaining momentum without becoming frustrated by organisational friction.

JOB QUALIFICATIONS

  • 5+ years of experience in Data Engineering or similar roles, with demonstrable senior-level delivery.
  • Strong hands-on, end-to-end production experience with Google Cloud Platform (GCP).
  • Experience building data pipelines using tools such as Dataflow, BigQuery, Pub/Sub, Cloud Composer, or Dataproc.
  • Strong SQL skills and proven experience working with large-scale datasets.
  • Experience with Python, Java, or Scala for data processing.
  • Experience with ETL/ELT frameworks and modern data architectures.
  • Deep knowledge of data modelling and data warehousing concepts, including schema design for unknown future consumers.
  • Google Professional Certification(s) required (e.g. Professional Data Engineer, Professional Cloud Architect).
  • Demonstrated ability to navigate enterprise processes including architecture approvals and peer review cycles.
  • Experience working in Agile development environments.
  • Strong communication and stakeholder management skills, with the ability to challenge requirements constructively.