Information Technology

Optimise Services With
Machine Learning

We develop machine learning (ML) solutions that uncover valuable insights and drive smarter decisions.


Predict trends, optimise processes, personalise experiences, and extract knowledge from your data.

We have been building bespoke ML systems since 2015, typically with Azure ML. We also customise standard models from the likes of Azure AI Studio and Google Vertex AI. 

Data to Insights

  • Classification and forecasting

  • Anomaly detection and image processing

Intellectual Property

  • Use ML to develop unique solutions

  • Prevent your tools from being copied by competitors

Secure Data Pipelines

  • Automate and harden data pipelines

  • Discover new data sources

The Process

Our approach to data science is based on the Microsoft Team Data Science Process.

You can review the Microsoft TDSP in more detail in Microsoft's documentation.


Business Understanding

  • Identify and articulate the business problem

  • Set clear, measurable objectives for what the project aims to achieve

[Image: warehouse shelves from above]

Machine Learning Case Study


Agentico was asked to optimise parts inventory for a machinery business with 25,000 stock lines across four sites.

Stock demand is seasonal and uneven across sites. Demand for many components is not market-driven but triggered by warranty alerts. However, a third of supply must be ordered months in advance to secure discounts.


We mapped business processes and interviewed stakeholders inside and outside the business: warehouse managers, suppliers and finance providers. Data was sourced and algorithms were tuned to historical demand.


A prototype was trialled with users. The final model was then deployed with user training for each site and maintained on request.


An ensemble model was developed, comprising a 1D convolutional neural network for time series forecasting and a traditional ARIMA model.
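The combining step of such an ensemble can be sketched as follows. The model internals are omitted and all names are illustrative, not Agentico's actual code: each model (here, a CNN and an ARIMA) is assumed to produce its own forecast series, and a single blend weight is fitted on a validation window by least squares.

```python
def blend_weight(pred_a, pred_b, actual):
    """Choose the weight w in [0, 1] that minimises the squared error of
    w*pred_a + (1-w)*pred_b against observed demand on a validation window.
    Closed-form least-squares solution for a two-model blend."""
    d = [a - b for a, b in zip(pred_a, pred_b)]
    denom = sum(x * x for x in d)
    if denom == 0:
        return 0.5  # identical forecasts: any weight works
    num = sum((y - b) * x for y, b, x in zip(actual, pred_b, d))
    return min(1.0, max(0.0, num / denom))


def ensemble(pred_a, pred_b, w):
    """Blend two forecast series with a fixed weight."""
    return [w * a + (1 - w) * b for a, b in zip(pred_a, pred_b)]


# Hypothetical validation-window forecasts from the two models
cnn_val = [10, 12]
arima_val = [14, 16]
actual = [12, 14]

w = blend_weight(cnn_val, arima_val, actual)   # → 0.5 here
combined = ensemble(cnn_val, arima_val, w)     # → [12.0, 14.0]
```

A fixed linear blend is the simplest ensemble; stacking a small regression model over the two forecasts is a common next step when more validation data is available.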

Logic was then constructed to accommodate warranty rules and distribution of stock across sites, given costs of picking and transit.
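The cross-site distribution logic can be illustrated with a minimal greedy sketch, assuming per-site forecast demand, a per-unit transit cost to each site, and a per-unit stock-out cost; the names and figures are hypothetical, not the deployed system:

```python
def allocate(stock, demand, transit_cost, stockout_cost):
    """Greedily assign `stock` units across sites: each unit goes to the
    site where covering one more unit of forecast demand yields the
    largest net saving (avoided stock-out cost minus transit cost)."""
    alloc = {site: 0 for site in demand}
    for _ in range(stock):
        best, best_gain = None, 0.0
        for site, d in demand.items():
            if alloc[site] < d:  # site still has unmet forecast demand
                gain = stockout_cost[site] - transit_cost[site]
                if gain > best_gain:
                    best, best_gain = site, gain
        if best is None:  # no allocation is worth its transit cost
            break
        alloc[best] += 1
    return alloc


# Hypothetical example: 4 units to split between two sites
plan = allocate(
    stock=4,
    demand={"A": 2, "B": 3},
    transit_cost={"A": 1, "B": 2},
    stockout_cost={"A": 5, "B": 10},
)
# → {'A': 1, 'B': 3}: site B's higher stock-out cost is served first
```

In practice this kind of allocation is usually posed as a small linear programme, but the greedy form shows the trade-off the rules encode.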

Data pipelines were built as simply as possible to allow for the inevitable maintenance and for changes in supplier incentives.


The model cut required stocking levels by 10% and reduced parts obsolescence, paying for itself within the first year of use.
