Senior Data Engineer
AGENTIC

We are looking for highly experienced Senior Data Engineers to help deliver a robust and scalable data architecture across global ERP systems.

This role is ideal for someone passionate about building end-to-end pipelines, enabling AI/BI solutions, and thriving in fast-paced, high-stakes environments.

You will help accelerate global data transformation initiatives and build modern, enterprise-ready data ecosystems.

Key Responsibilities

Data Integration & Pipeline Development

  • Design, build, and maintain scalable ETL/ELT pipelines from diverse ERP sources into centralized Data Lakes and Warehouses.
  • Develop connectors for structured and semi-structured data using Python, SQL, APIs, or middleware solutions.
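As a rough illustration of the semi-structured ingestion work described above, the sketch below flattens a nested ERP-style JSON payload into a single-level row ready for warehouse loading. The payload shape and field names are hypothetical, not taken from any specific ERP system.

```python
import json

def flatten_record(record, parent_key="", sep="."):
    """Recursively flatten a nested dict (a semi-structured ERP payload)
    into a single-level dict suitable for loading into a warehouse table."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten_record(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

# Hypothetical sales-order payload as it might arrive from an ERP API
raw = json.loads('{"order_id": 1001, "customer": {"id": "C-9", "region": "EMEA"}}')
print(flatten_record(raw))
# {'order_id': 1001, 'customer.id': 'C-9', 'customer.region': 'EMEA'}
```

In practice a connector like this would sit behind an orchestrator (ADF, Airflow) and write to a staging table rather than printing.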

Data Lake & Warehouse Engineering

  • Implement bronze, silver, and gold layers for ingestion, cleaning, and curated datasets.
  • Organize data structures for optimized use in BI and AI systems.
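The bronze/silver/gold (medallion) flow above can be sketched in miniature as plain Python functions; the record fields and cleaning rules here are illustrative assumptions, not a prescribed schema.

```python
# Hypothetical medallion flow: raw ERP rows (bronze) -> cleaned (silver) -> curated (gold)
bronze = [
    {"sku": " ab-1 ", "qty": "5", "region": "US"},
    {"sku": "AB-1", "qty": "3", "region": "US"},
    {"sku": None, "qty": "2", "region": "EU"},  # bad record, dropped at the silver layer
]

def to_silver(rows):
    """Clean and standardize: drop null keys, normalize SKU casing, cast types."""
    out = []
    for r in rows:
        if not r["sku"]:
            continue
        out.append({"sku": r["sku"].strip().upper(),
                    "qty": int(r["qty"]),
                    "region": r["region"]})
    return out

def to_gold(rows):
    """Curate: aggregate quantities per SKU for BI/AI consumption."""
    totals = {}
    for r in rows:
        totals[r["sku"]] = totals.get(r["sku"], 0) + r["qty"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'AB-1': 8}
```

At enterprise scale the same layering would be implemented on Spark/Databricks tables rather than in-memory lists, but the contract between layers is the same.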

Data Standardization & Cleansing

  • Align global units of measure (lbs, kg, packaging, linear feet) across products and regions.
  • Execute data deduplication, enrichment, and harmonization from disparate systems.
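A minimal sketch of the unit-alignment task described above, normalizing mixed weight units to kilograms; the accepted unit aliases and the sample rows are assumptions for illustration (the lb→kg factor itself is the standard exact value).

```python
LB_TO_KG = 0.45359237  # exact international pound-to-kilogram conversion factor

def to_kg(value, unit):
    """Normalize a weight in mixed global units to kilograms."""
    unit = unit.strip().lower()
    if unit in ("kg", "kgs", "kilogram", "kilograms"):
        return value
    if unit in ("lb", "lbs", "pound", "pounds"):
        return value * LB_TO_KG
    raise ValueError(f"Unknown unit: {unit}")

# Hypothetical rows from two regional ERP instances reporting in different units
rows = [("US", 100, "lbs"), ("DE", 45, "kg")]
normalized = [(region, round(to_kg(qty, unit), 2), "kg") for region, qty, unit in rows]
print(normalized)  # [('US', 45.36, 'kg'), ('DE', 45, 'kg')]
```

Rejecting unknown units loudly (rather than passing them through) is deliberate: silent unit mismatches are one of the costliest harmonization bugs across regions.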

Architectural Collaboration

  • Collaborate closely with Data Architecture leadership on schema definitions, partitioning strategies, and infrastructure design.
  • Set up and maintain sandbox/staging environments for safe testing.

Power BI & AI Enablement

  • Provide ready-to-use, clean datasets to support BI dashboards and AI/ML use cases.

Documentation & Governance

  • Document pipeline architectures, data transformation logic, and integration points.
  • Ensure adherence to data governance policies and support metadata management.

Requirements

Technical Requirements

  • 7+ years of experience in enterprise-scale data engineering.
  • Strong proficiency in:
    • SQL (Advanced) and Python for data processing
    • Spark or Databricks for distributed data workflows
    • Cloud platforms such as Azure Data Lake/Blob, Synapse, or equivalents
    • ETL orchestration tools like Azure Data Factory (ADF), Airflow, or dbt
    • API integrations and data ingestion from ERP systems (NetSuite, QuickBooks, Salesforce, RF Smart, etc.)
  • Demonstrated experience with:
    • Master data frameworks, unit conversion, and ERP-to-warehouse mapping
    • Handling structured and unstructured data
    • Data modeling best practices (star schema, snowflake schema)

Soft Skills & Work Commitment

  • Fluent English (C1 level) – required for daily client calls and clear technical documentation.
  • Strong interpersonal and collaboration skills to work with cross-functional teams (BI, QA, DevOps, Business Analysts).

Nice to Have

  • Experience standardizing data across global manufacturing or supply chain environments.
  • Familiarity with Power BI datasets, alert triggers, and integration with messaging/email tools.
  • Exposure to AI/ML pipelines, including data preparation for machine learning or anomaly detection.