

We believe in an "AI-first" development approach. We are looking for an agile learner who leverages modern AI tools (like Gemini, Claude, and Copilot) alongside strong Python and GCP engineering skills to accelerate development and creatively solve complex integration challenges.
The Role
This position is tailored for a Data Engineer who loves the "plumbing" of data—connecting APIs, orchestrating workflows, and moving data seamlessly between systems. You will own our upstream ingestion framework, manage our Google Cloud infrastructure (Cloud Composer, Dataflow, Cloud Run), and power our Reverse ETL processes to ensure our operational systems have the data they need.
Responsibilities:
Build Ingestion Pipelines: Design, develop, and deploy robust data ingestion pipelines from various third-party APIs, webhooks, and source systems into Google Cloud.
AI-Augmented Engineering: Actively leverage advanced AI coding assistants to accelerate pipeline development, generate boilerplate API connection code, debug complex scripts, and automate repetitive tasks.
GCP Infrastructure & Orchestration: Build and manage data workflows using Cloud Composer (Airflow), and leverage Cloud Run and Dataflow for scalable, containerized data processing (a minimal DAG sketch follows this list).
Drive Reverse ETL: Architect and maintain the data pipelines that push refined data from BigQuery back into our operational platforms (marketing tools, CRM, etc.) to drive business action.
Manage Operational Databases: Utilize Firestore and other NoSQL/relational databases to support operational data needs and microservices.
Collaborate and Learn: Partner with our data modeling specialists to ensure smooth handoffs between ingestion and transformation. Participate in code reviews and continuously share new engineering best practices.
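To give a concrete flavor of the work above, here is a minimal, illustrative Cloud Composer (Airflow) DAG sketch: one task lands records from a hypothetical third-party API in Cloud Storage, and a second pushes a refined BigQuery table back to an operational platform (the Reverse ETL step). Every URL, bucket, dataset, and table name below is a placeholder assumption, not a description of our actual pipelines.

```python
# Minimal Composer (Airflow) DAG sketch; all names below
# (API URL, bucket, dataset, table, CRM endpoint) are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_orders(**context):
    """Pull records from a hypothetical partner API and land them in
    Cloud Storage as newline-delimited JSON."""
    import json
    import requests
    from google.cloud import storage

    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()
    blob = storage.Client().bucket("example-landing-zone").blob(
        f"orders/{context['ds']}.json"
    )
    blob.upload_from_string(
        "\n".join(json.dumps(r) for r in resp.json()["results"])
    )


def push_to_crm(**context):
    """Reverse ETL: read a refined BigQuery table and POST each row to
    an operational platform (placeholder CRM endpoint)."""
    import requests
    from google.cloud import bigquery

    rows = bigquery.Client().query(
        "SELECT customer_id, segment FROM `example.marts.customer_segments`"
    ).result()
    for row in rows:
        requests.post(
            "https://crm.example.com/api/contacts",
            json=dict(row),
            timeout=30,
        ).raise_for_status()


with DAG(
    dag_id="orders_ingest_and_reverse_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest_orders", python_callable=ingest_orders)
    push = PythonOperator(task_id="push_to_crm", python_callable=push_to_crm)
    ingest >> push  # a transformation step (Dataflow/dbt) would normally sit between these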
Requirements:
Core Data Engineering Experience: 3 to 5 years of hands-on experience in data engineering, with a strong focus on data integration, APIs, and pipeline architecture.
AI Adaptability & Continuous Learning: A strong desire to learn quickly and adapt to new technologies. You embrace modern development practices and are highly comfortable using AI tools as a force multiplier in your daily work.
API Integration & Python: Strong Python programming skills and a proven track record of building custom API extractors, handling pagination and rate limiting, and working with REST/GraphQL endpoints (see the extractor sketch after this list).
GCP Service Expertise: Hands-on experience with Google Cloud Platform's ecosystem, specifically Cloud Composer, Dataflow, Cloud Run, and Firestore.
Code Quality & CI/CD: Proficient in writing clean, well-documented, and tested code (e.g., pytest), with strong experience using Git, Docker, and CI/CD pipelines (a pytest sketch follows this list).
Bonus Skills (Nice to Have): Experience managing Infrastructure as Code (specifically Terraform) or working with downstream data transformation tools (dbt).
Living in the Netherlands: Please note that due to the urgency of this role, we can only consider candidates who are already living in the Netherlands.
English Proficiency: Excellent written and verbal English communication skills.
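To illustrate the pagination and rate-limiting skills in the API Integration requirement above, here is a minimal sketch of a cursor-paginated REST extractor. The endpoint, query parameters, headers, and response fields are hypothetical assumptions for illustration only.

```python
# Minimal sketch of a paginated REST extractor with rate-limit handling.
# The endpoint, page parameters, and response fields are hypothetical.
import time
from typing import Iterator

import requests

BASE_URL = "https://api.example.com/v1/orders"  # placeholder endpoint


def fetch_all(session: requests.Session, page_size: int = 100) -> Iterator[dict]:
    """Yield every record, following cursor-based pagination and backing
    off when the API returns HTTP 429 (Too Many Requests)."""
    cursor = None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(BASE_URL, params=params, timeout=30)
        if resp.status_code == 429:
            # Respect the server's hint if present, otherwise back off briefly.
            time.sleep(int(resp.headers.get("Retry-After", 5)))
            continue
        resp.raise_for_status()
        payload = resp.json()
        yield from payload["results"]
        cursor = payload.get("next_cursor")
        if not cursor:  # last page reached
            return


if __name__ == "__main__":
    with requests.Session() as session:
        for record in fetch_all(session):
            print(record["id"])
```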
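And, for the testing requirement, a hedged example of how such an extractor might be unit-tested with pytest, assuming the sketch above lives in a module named extractor.py; the fake session stubs the HTTP layer so no real network calls are made.

```python
# Hedged pytest example for the extractor sketch above; module name
# "extractor" and the fake session are assumptions for illustration.
from types import SimpleNamespace

from extractor import fetch_all


class FakeSession:
    """Returns one page with a cursor, then a final page without one."""

    def __init__(self):
        self.pages = [
            {"results": [{"id": 1}, {"id": 2}], "next_cursor": "abc"},
            {"results": [{"id": 3}], "next_cursor": None},
        ]

    def get(self, url, params=None, timeout=None):
        page = self.pages.pop(0)
        return SimpleNamespace(
            status_code=200,
            headers={},
            json=lambda: page,
            raise_for_status=lambda: None,
        )


def test_fetch_all_follows_pagination():
    records = list(fetch_all(FakeSession()))
    assert [r["id"] for r in records] == [1, 2, 3]
```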
About Us:
Hunkemöller strives to be a much-loved, social and inclusive brand: a place where people love to work, are proud of the brand, and become true brand ambassadors. You will work in a passionate, energetic, design-led and performance-driven environment where our key customer persona, 'Shero', sits at the heart of everything we do. Hunkemöller is a certified Top Employer Netherlands 2024, which underlines our people initiatives and achievements.
Together Tomorrow – Join a retailer that's on the move to be better for our planet, better for people, better together! From diversity & inclusion and reducing waste to product care and how we work with our suppliers, our Together Tomorrow initiative reflects what we do and helps drive change across our business. Ready to help us achieve our ambitious goals? Wherever you start working with us, whether in our stores or at our HQ, you can contribute!
We believe in an "AI-first" development approach. We are looking for an agile learner who leverages modern AI tools (like Gemini, Claude, and Copilot) alongside strong Python and GCP engineering skills to accelerate development and creatively solve complex integration challenges.
The Role
This position is tailored for a Data Engineer who loves the "plumbing" of data—connecting APIs, orchestrating workflows, and moving data seamlessly between systems. You will own our upstream ingestion framework, manage our Google Cloud infrastructure (Cloud Composer, Dataflow, Cloud Run), and power our Reverse ETL processes to ensure our operational systems have the data they need.
Responsibilities:
Build Ingestion Pipelines: Design, develop, and deploy robust data ingestion pipelines from various third-party APIs, webhooks, and source systems into Google Cloud.
AI-Augmented Engineering: Actively leverage advanced AI coding assistants to accelerate pipeline development, generate boilerplate API connection code, debug complex scripts, and automate repetitive tasks.
GCP Infrastructure & Orchestration: Build and manage data workflows using Cloud Composer (Airflow), and leverage Cloud Run and Dataflow for scalable, containerized data processing.
Drive Reverse ETL: Architect and maintain the data pipelines that push refined data from BigQuery back into our operational platforms (marketing tools, CRM, etc.) to drive business action.
Manage Operational Databases: Utilize Firestore and other NoSQL/relational databases to support operational data needs and microservices.
Collaborate and Learn: Partner with our data modeling specialists to ensure smooth handoffs between ingestion and transformation. Participate in code reviews and continuously share new engineering best practices.
Requirements:
Core Data Engineering Experience: 3 to 5 years of hands-on experience in data engineering, with a strong focus on data integration, APIs, and pipeline architecture.
AI Adaptability & Continuous Learning: A strong desire to learn quickly and adapt to new technologies. You embrace modern development practices and are highly comfortable using AI tools as a force multiplier in your daily work.
API Integration & Python: Strong Python programming skills with a proven track record of building custom API extractors, handling pagination, rate limiting, and working with REST/GraphQL endpoints.
GCP Service Expertise: Hands-on experience with Google Cloud Platform's ecosystem, specifically Cloud Composer, Dataflow, Cloud Run, and Firestore.
Code Quality & CI/CD: Proficient in writing clean, well-documented, and tested code (e.g., pytest), with strong experience using Git, Docker, and CI/CD pipelines.
Bonus Skills (Nice to Have): Experience managing Infrastructure as Code (specifically Terraform) or working with downstream data transformation tools (dbt).
Living in the Netherlands; Please note that due to the urgency of the role, we are only considering candidates who are already living in the Netherlands.
English Proficiency: Excellent written and verbal English communication skills.
About Us:
Hunkemöller strives to be a much loved, social & inclusive brand. A place where people love to work, are proud of the brand, and where we create true brand ambassadors. Working in a passionate, energetic, design-led and performance-driven environment where our key customer persona 'Shero' sits at the heart of everything we do. Hunkemöller is certified TOP EMPLOYER of the Netherlands 2024, which underlines our people initiatives and achievements.
Together Tomorrow – Join a Retailer that's on the move to be better for our planet, better for people, better together! From diversity & inclusion, reducing waste, to product care and how we work with our suppliers, our Together Tomorrow initiative reflects what we do and helps drive change across our business. Ready to help us achieving our ambitious goals? Where ever you'll start working with us, if in Stores or our HQs, you can contribute!




