A regional body coordinating digital transformation projects for the water sector in Wallonia is building a common data and integration platform to support operational services, asset management and advanced analytics. This role exists to design the Microsoft-based data architecture and to make the platform AI-ready, combining data governance, data lake migration and ETL/integration work across several utilities and public entities.
The mission
The work sits in a small Strategy and Architecture office that defines sector-wide data standards for multiple public water actors. You will shape the target data platform architecture based on Microsoft technologies, define architecture and integration standards, and prepare a phased roadmap covering deployment, migration from the existing data lake and its eventual decommissioning. The platform will serve a portfolio of use cases representing about 40% of upcoming projects, from consumption forecasting to anomaly detection and operational optimisation.
Day to day you will analyse the current data lake's functionality and data flows, translate those into the new Microsoft platform, and supervise progressive migrations of pipelines and datasets. You will define and operationalise data governance (quality, access, traceability), formalise roles (data owners, data stewards) and work with technical teams and stakeholders to keep delivery aligned with the roadmap and compliance requirements. Expect regular coordination with SAP teams, open-source pipeline owners and operational contacts in the utilities.
Your responsibilities
- Define and deliver the target Microsoft data platform architecture and an actionable deployment roadmap that balances security, regulatory constraints and business priorities
- Lead the migration strategy from the existing data lake, ensuring continuity of critical pipelines and phasing the decommissioning work
- Establish and enforce data governance practices, including metadata management, data quality rules, access controls and auditability
- Coordinate and mentor cross-functional teams implementing ETL/ELT pipelines, API integrations and streaming or batch data flows
- Design the platform to be AI-ready: identify integration points for ML models, datasets suitable for training, and requirements for reproducibility and traceability
- Produce architecture documentation and run knowledge transfer sessions with internal teams and beneficiary organisations
Your profile
Essential skills
- 8+ years' experience in data architecture and integration, including hands-on work on data lake or enterprise data platform projects
- Strong experience with data governance, metadata management and GDPR-compliant data practices
- Proven delivery of ETL/ELT and integration solutions, working with both SAP environments and open-source tooling
- Solid modelling skills across relational, dimensional and business-oriented data models
- Ability to translate architecture into roadmaps, standards and clear documentation; effective communicator with technical and non-technical stakeholders
- Understanding of AI/ML lifecycle needs and how to prepare datasets and platforms for reproducible model development
Preferred skills
- Experience in public utilities, water sector systems, GIS or SCADA integrations is an asset
- Familiarity with enterprise architecture modelling tools such as Archi
Languages
- French, C1
- English, B2
Education
- Degree in computer science, engineering or equivalent professional experience