
Data Engineer


Location: Warsaw (hybrid or fully remote)  

 

As a key step in scaling up our Supply Chain Analytics team, we are looking for a Data Engineer to drive and support data ingestion, staging, processing, and serving using MS Azure services.  

 

 
Your role:   

•  Develop, optimize & maintain ELT/ETL pipelines for business intelligence and statistical modelling,  
•  Ensure & monitor data quality and integrity,  
•  Manage and support connections between key elements of data infrastructure,   

•  Write and maintain secure, robust, scalable, and efficient code that turns business concepts into tangible solutions,  
•  Drive engineering best practices like automation, CI/CD, and maintainability,  
•  Support standardization and automation by following best practices / common development standards,  
•  Collaborate with BI analysts, data scientists, ML engineers and core IT in cross-functional teams delivering value through data solutions.  
 

  
What we need you to bring to the table:  

•  Broad skills in SQL & MDX (Python, R & bash nice to have),  
•  Experience within Azure data components (Data Factory, Synapse, Data Lake, Blob Storage, Databricks),  
•  Experience with Snowflake and its integration with Azure,  

•  Experience with data workflow orchestration (ADF, Databricks Jobs, SSIS),  
•  Good command of version control and CI/CD pipelines using Azure DevOps or similar,  
•  Familiarity with data governance concepts (e.g., data catalogues & metadata, data lineage, master data),  
•  Business-ready English (B2-C1),  
•  Results focus and an understanding of business goals,  
•  Readiness to share knowledge with others and good communication skills.   

 

 
On top of that, as nice-to-haves: 

•  Familiarity with SAP Business Warehouse as a data source, 

•  Familiarity with Project Management concepts, 

•  Azure Data Engineer / Data Analytics certifications, 

•  Understanding of BI tools, e.g. Power BI, 

• Experience with PySpark / Scala.  

  

 

What we offer:  

•  Permanent employment agreement, 

•  Interesting and challenging work in an international company – a branch of a worldwide, well-recognized FMCG concern, 

•  Competitive benefits package: private medical care, Multisport card, Pension Fund, 50% discount for lunch in a company canteen, coffee benefit, 

•  Possibility to work in a dynamic team of professionals and leaders, 

•  Possibility to work with challenging projects and responsible tasks, 

•  An atmosphere of respect and professionalism, 

•  Possibility of development & career advancement, 

•  Opportunity to participate in Sustainability projects and join a community focused on creating sustainable solutions, 

• Flexible working hours with the possibility to work from home (fully remote work possible).  

 
