Data Engineer
📍 Location: Remote-first, hybrid (Head Office: Cape Town, South Africa; with offices in Johannesburg, Mauritius, and Nairobi, Kenya)
💼 Department: Data and Architecture
🏢 Company: Peach Payments
🌍 About Peach Payments
Peach Payments is a fintech company shaping the future of digital payments across Africa. We empower businesses of all sizes — from startups to enterprise merchants — with secure, scalable, and seamless payments infrastructure.
🎯 Team Mission
Our mission is to turn data into decisions. We provide the insights, tooling, and analytical rigour that empower teams across Peach Payments to understand our merchants, optimise our products, and drive commercial outcomes with confidence.
🕵🏿 Role Overview
Build the pipes that power decisions. As a Data Engineer, you'll design, build, and maintain the pipelines that move payment event data from operational systems into our analytics platform — making it possible for analysts, product teams, and leadership to trust and act on the data they see. You're the person who makes sure the right data is in the right place, in the right shape, at the right time.
1. Pipeline Design & Development (The Plumbing)
- Build & Maintain Pipelines: Design and implement pipelines that ingest payment events into our analytics infrastructure, owning their reliability and performance.
- Data Modelling: Build well-structured dbt models that are tested and documented for downstream use.
- Event Processing: Work with streaming data via Confluent and managed ingestion through Hevo into Snowflake and other downstream systems.
- Pipeline Observability: Instrument pipelines with monitoring and alerting so issues are caught early — not discovered by an analyst three days later.
2. Infrastructure & Platform (The Foundation)
- Analytics Infrastructure: Operate and improve our core stack — Snowflake, Confluent, Hevo, dbt, and Lightdash — keeping it stable, performant, and cost-effective.
- Schema Management: Manage schema evolution gracefully as upstream systems change.
- Performance Optimisation: Identify and resolve bottlenecks across queries, ingestion, and transformation; tune Snowflake warehousing and compute accordingly.
- Cost Awareness: Operate with an eye on spend, balancing performance, warehouse sizing, and cost.
3. Data Quality & Governance
- Testing & Validation: Build data quality checks as a first-class concern, ensuring analytics-layer data reconciles with source systems.
- Documentation: Maintain clear documentation of models, architecture, and source mappings so the team doesn't need to reverse-engineer your work.
- Source System Understanding: Develop deep knowledge of Peach's operational databases and event streams — where the data comes from, what it means, and where the edge cases live.
4. Collaboration & Delivery
- Analyst Partnership: Work closely with analysts to understand their data needs and ensure your models support their work.
- Cross-Team Coordination: Engage proactively with engineering pods when source systems change or new event types emerge.
- Iterative Delivery: Ship incrementally — build, monitor, and improve rather than designing in isolation for months.
Competencies: What You Bring
- Strong Data Engineering Fundamentals: You've built reliable pipelines end to end — ingestion, transformation, modelling, serving — in production.
- SQL Proficiency: You write it well, optimise it, and can debug complex queries.
- Pipeline Thinking: You reason about contracts, schemas, failure modes, and observability — building for reliability, not just the happy path.
- Curiosity About the Domain: You want to understand what the data means. Payment data has nuance — authorisations, settlements, refunds, chargebacks — and you want to learn it.
- Pragmatism: You ship working solutions and improve them iteratively, balancing rigour with the reality of a growing company.
🔧 Our Data Stack
- Languages: SQL (strong proficiency required), Python (beneficial)
- Databases: PostgreSQL, Snowflake
- Streaming & Ingestion: Confluent (Kafka), Hevo
- Transformation: dbt
- BI & Visualisation: Lightdash
- Cloud: AWS
- Tools: Git, Jira, Confluence, Cursor, Claude
💡 Required Qualifications & Skills
- 2–4 years in data engineering or a related role
- Strong SQL and experience with data transformation tooling
- Experience building and maintaining production data pipelines
- Familiarity with cloud infrastructure (AWS preferred)
💫 What Will Make You Succeed
- Strong communication and a collaborative working style
- Experience with Snowflake or other columnar/OLAP databases
- Familiarity with payment or fintech data pipelines
- Experience with Confluent (Kafka) and stream processing patterns
- Experience with Hevo or similar managed ingestion platforms
- Exposure to data quality frameworks (dbt tests, Great Expectations, etc.)
- Understanding of PCI DSS and handling sensitive payment data
✔️ Why This Role?
- Ownership: You own the infrastructure powering analytics and decisions across Peach.
- Impact: Every dashboard and insight flows through what you build.
- Growth: Work across the full stack — from Kafka topics to BI layers — and develop deep payments expertise.
🌟 Why Join Peach Payments?
- Impact: Work on mission-critical payments infrastructure processing millions of transactions.
- Growth: Join a fast-growing company expanding across Africa.
- Culture: A high-performance, diverse, and empathetic team focused on respect for people and merchant success.
- Flexibility: Remote-first hybrid — work from anywhere while staying connected to a world-class team.
- Benefits: Generous annual and life leave, market-related salaries, VSOP equity, learning budget, life insurance, and more.
🚀 Be part of our journey to redefine digital payments in Africa!
At Peach Payments, we value diversity and are committed to inclusion across race, gender, age, religion, identity, and experiences.