Our Customer is active as a provider of digital services and communication solutions on the Belgian and international markets.
They offer their customers a world of digital opportunities so that they can live better and work smarter. They do this by offering products and services tailored to the needs of every customer and by being a partner to citizens, companies, and Belgian society in their digital evolution.
It is their ambition to become a truly customer-focused digital company. The customer is central to everything they do, and the customer journey should be as simple as possible with accessible and user-friendly solutions.
The concept of Business Activity Monitoring (BAM) is the acquisition, correlation, structuring, analysis, and presentation in real time of business data in the context of business process modelling.
The strategic focus of the team is building a real-time visualization of the customer journey, allowing the different divisions within the Customer to react instantly and proactively, thereby reinforcing the first-time-right and customer-centric concepts, aligned with and focused on achieving the Customer's vision and ambition.
In general, the core architecture of our BAM module is built on top of webMethods EAI, Spring Boot microservices, GraphQL, and an Elasticsearch-based search engine.
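As an illustration of how a BAM service might retrieve customer-journey events from the Elasticsearch layer, the sketch below builds an Elasticsearch Query DSL body in Python. The index schema, field names (`journey_id`, `timestamp`), and the journey identifier are invented for the example; the Customer's actual data model is not described in this document.

```python
def journey_events_query(journey_id: str, size: int = 100) -> dict:
    """Build an Elasticsearch Query DSL body that fetches the events
    of one customer journey, newest first.

    Field names are hypothetical; adapt them to the real BAM index.
    """
    return {
        "query": {
            # exact-match filter on the (assumed) journey identifier field
            "term": {"journey_id": journey_id}
        },
        # newest events first, using an assumed event timestamp field
        "sort": [{"timestamp": {"order": "desc"}}],
        "size": size,
    }

body = journey_events_query("JRN-42")
```

Such a body would typically be sent to the cluster via an HTTP search request or an Elasticsearch client library; keeping the query construction in a small pure function makes it easy to unit-test without a running cluster.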
Our Customer works in an Agile environment using a customized Scrum framework. They are currently undergoing a transformation, shifting to their target architecture while building a robust SDLC and STLC for their Scrum teams to follow. As part of this, they are reviewing their processes and engines to improve data correlation and, therefore, data quality. The intention is to move from the passive role of translating business requirements into data structures to a more active role in which the team suggests business projects and views based on the current or potential structure of their data.
Currently, there is no Data Scientist in the team; your role will therefore be to identify and establish this function, following the Customer's Data Frameworks.
• Data Cleansing using tools and programming languages
• Data mining using APIs or building ETL pipelines
• Establishing, maintaining, and continuously improving Data Quality Frameworks
• Accountable and responsible for data quality: define the relevant data quality metrics, requirements, ranges, parameters, and action plans for each data element. Report regularly on data quality and trigger actions for the detection and correction of data quality issues. Drive root cause analysis and the implementation of corrective measures. Propose additional controls to improve the fitness of data in the relevant systems
• Develop the data plan that enables the business plans, own the backlog, and prioritize its implementation according to available budgets to maximize the return for the business
• Implementing and developing a DataOps methodology that complies with our DevOps methodology
• Creating programming and automation techniques, such as libraries, that simplify day-to-day processes; using tools to develop, implement, and train machine learning models
• Act as the "go-to person" for everyone who works with BAM data. She/he has an end-to-end view of BAM data: she/he knows how this data is Created, Read, Updated, and Deleted along its lifecycle and which business processes and systems are involved in these transformations
• Translate business objectives into data needs and requirements
• Federate the business stakeholders to understand the data needs of the different business use cases and connect the dots between the different silos. Act as the local "Data Steward" for BAM
• Accountable for defining and documenting data and terminology in the relevant glossaries and for ensuring these are mapped to the physical data assets for correct usage and processing in the different use cases
• Act as the link between the Data Governance & Quality Team and BAM
• Thinking outside the box
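To make the data-quality responsibilities above more concrete, here is a minimal sketch of one possible quality metric: field completeness across a batch of BAM records. The record shape, field names, and threshold conventions are assumptions for illustration only, not the Customer's actual metrics.

```python
def completeness(records: list[dict], field: str) -> float:
    """Fraction of records in which `field` is present and non-empty.

    A simple example of a data quality metric; real frameworks would
    track many such metrics per data element, with agreed thresholds.
    """
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

# Hypothetical BAM records: one has an empty MSISDN, one is missing a segment.
records = [
    {"msisdn": "+3247000001", "segment": "B2C"},
    {"msisdn": "", "segment": "B2B"},
    {"msisdn": "+3247000003"},
]
score = completeness(records, "msisdn")  # 2 of 3 records have a value
```

Reporting such scores per field over time is one way to "report regularly on data quality" and to trigger corrective actions when a metric drops below its agreed range.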
• 3+ years of experience working within a Big Data architecture
• Experience in creating services, data mapping, Java services, XML schemas, canonical documents, Broker documents, flat-file DTDs, dictionary changes, parsing JSON strings, web services, REST and SOAP services, publish-and-subscribe documents, triggers, notifications, the Java API, OData services, and GraphQL
• Experience in implementing canonical data models and the pub-sub model
• Knowledge of the following technologies, languages, and tools: SQL, Word, Excel, PowerPoint, Teradata, SQL Server, .NET, C#, HDFS, Hive, Python, Git, Linux, MSY, Power BI
• 3+ years of solid experience in Elasticsearch is required
• Experience with software engineering and/or real-time streaming is a plus
• Experience in developing with WebMethods product suite is a plus
• Experience in developing microservices using the Spring Boot framework is a plus
• Experience in Telco is a plus
• Experience using CI/CD tools is a plus
• Experience with Couchbase and/or Oracle databases is a plus
• PhD or master’s degree in a quantitative field (Artificial Intelligence, Computer Science, Engineering, Statistics, Mathematics, etc.)
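The publish-and-subscribe model mentioned in the requirements above can be sketched in a few lines. This is a stdlib-only illustration of the pattern, not the webMethods Broker API: topic names and the event document shape are invented, and a real Broker works with typed canonical documents rather than plain dictionaries.

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Toy in-process pub-sub broker, illustrating the pattern only."""

    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subs: dict = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, document: dict) -> None:
        # deliver the document to every subscriber of the topic
        for handler in self._subs[topic]:
            handler(document)

broker = Broker()
received = []
broker.subscribe("order.created", received.append)
broker.publish("order.created", {"orderId": "A1"})
```

Decoupling publishers from subscribers this way is what lets independently deployed services (or webMethods integrations) react to the same canonical event without knowing about each other.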
Fluent in English
French and/or Dutch are a plus