Lukas Bromig, UniteLabs, München

Integrating laboratory hardware and software remains a persistent challenge in lab automation. Today’s solutions rely on proprietary, point-to-point integrations—connecting automation equipment to schedulers or file systems to LIMS—resulting in a fragmented, inflexible, and costly landscape.

A modern laboratory requires a robust infrastructure layer that standardizes connectivity across vendors, from the edge to the cloud. By adopting an API-first approach and leveraging principles from Industry 4.0, labs can achieve seamless automation that is more scalable, cost-effective, and accessible.
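As a purely illustrative sketch of what an API-first device layer can look like (the device, endpoint names, and payloads below are hypothetical and not tied to any specific vendor product), a lab instrument might expose its capabilities as a small, documented HTTP API that schedulers, LIMS, and cloud services can all call in the same way:

```python
# Hypothetical sketch of an API-first device connector.
# Endpoint names, payloads, and the device model are illustrative only.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Incubator Connector (illustrative)")

class ShakeRequest(BaseModel):
    rpm: int          # target shaking speed
    duration_s: int   # shaking time in seconds

@app.get("/status")
def status() -> dict:
    # In a real connector this would query the instrument over its native protocol.
    return {"state": "idle", "temperature_c": 37.0}

@app.post("/commands/shake")
def shake(req: ShakeRequest) -> dict:
    # A scheduler or LIMS calls this uniform endpoint instead of a vendor-specific driver.
    return {"accepted": True, "rpm": req.rpm, "duration_s": req.duration_s}
```

Served with a standard ASGI server (e.g. `uvicorn module:app`), the point of the sketch is that every consumer talks to the same interface, regardless of the instrument behind it.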

Burkhard Schäfer, Splashlake, Griesheim 

This presentation introduces the notion of digital sustainability in scientific research & development and manufacturing. It focuses on maximizing outcomes from available resources through a holistic and scalable approach to data management and integration. 

By implementing this approach, researchers can enhance the efficiency and effectiveness of their experiments, data handling, and analysis processes. The talk outlines a smart lab data blueprint, from experiment planning to analytics, showing how standardization enables more productive use of resources. This framework not only optimizes current research capabilities but also sets the stage for advanced AI and machine learning applications, including closed-loop experimentation. 

Alan Parkin, Titian Software, London, UK

Acoustic liquid handling delivers significant benefits, including the ability to execute complex sample layouts such as pools and combinations. Accordingly, the machines have the potential to deliver a very rich set of data. This introduces data management challenges and a level of complexity that is no longer easily handled with files, making it hard to accurately track sample information and realise the wider potential of the data.

In this session, Titian Software explores integration with acoustic liquid handlers, providing examples of how to capture and manage that operational data to save time and effort and maximise the efficiency gains. Data can be leveraged in the cloud and, as well as providing a full sample history, can serve as a foundation for analytical tools. We look ahead to how AI and ML can be leveraged to provide predictive data for operational insights such as prediction of run time and proactive monitoring of machine performance.
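As a rough illustration of the kind of structured record that replaces file-based tracking (the field names and units below are assumptions made for this sketch, not Titian Software's actual data model), each acoustic transfer event could be captured as a typed object that preserves full sample lineage:

```python
# Illustrative sketch only: field names and units are assumptions,
# not a representation of Titian Software's actual schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AcousticTransfer:
    source_plate: str       # barcode of the source plate
    source_well: str        # e.g. "A1"
    destination_plate: str  # barcode of the destination plate
    destination_well: str   # e.g. "B12"
    volume_nl: float        # transferred volume in nanolitres
    sample_id: str          # identifier linking back to the sample history
    timestamp: datetime     # when the transfer was executed

# Records like this can be streamed to a cloud store, aggregated per sample to
# reconstruct pools and combinations, and reused as input for run-time
# prediction or instrument-health monitoring.
transfer = AcousticTransfer("SRC0001", "A1", "DST0042", "B12",
                            25.0, "SAMPLE-123", datetime.now())
```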

David Garcia Lopez, EU-Openscreen, Berlin

As laboratories evolve, the need for digitalization and automation becomes imperative. This presentation focuses on the transition from manual workflows managed in KNIME to MOSAIC, a powerful platform that centralizes data and automates laboratory processes. The discussion will cover the challenges and solutions involved in this migration, demonstrating how MOSAIC simplifies data handling and optimizes processes.

Janina Bolling, Spectaris, Berlin
Matthias Schuh, Essentim, München

Today's laboratory infrastructure is still extremely heterogeneous:

  • many highly specialized devices from many different manufacturers,
  • different interfaces and data formats that make it difficult to network these devices with each other and integrate them into existing IT infrastructures,
  • and a wealth of digital tools for laboratories.

The digital laboratory is the basis for additional possibilities such as automation, robotics and AI. For this, the smart laboratory needs a communication standard!

This presentation highlights the added value of a manufacturer-independent, industry-compatible, and future-proof standard for the smart laboratory. It also shows how LADS OPC UA meets the challenges of the laboratory of the future from a technical point of view (e.g. automation, robotics, AI), while considering cybersecurity and regulatory compliance in the laboratory and addressing the challenges posed by the Cyber Resilience Act.
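To make the integration idea concrete, the minimal sketch below reads a single value from an OPC UA server using the open-source asyncua client. The endpoint URL and browse path are placeholders; the actual node structure of a LADS-compliant device is defined by the LADS OPC UA companion specification, not by this example.

```python
# Minimal sketch: reading one value from an OPC UA server with the asyncua client.
# The endpoint URL and browse path are placeholders; a LADS-compliant device
# exposes its nodes according to the LADS OPC UA companion specification.
import asyncio
from asyncua import Client

ENDPOINT = "opc.tcp://example-device.local:4840"  # placeholder endpoint

async def read_device_value() -> None:
    async with Client(url=ENDPOINT) as client:
        # Browse from the Objects folder to a (hypothetical) device variable.
        node = await client.nodes.objects.get_child(
            ["2:ExampleDevice", "2:CurrentTemperature"]
        )
        value = await node.read_value()
        print("CurrentTemperature:", value)

if __name__ == "__main__":
    asyncio.run(read_device_value())
```

Because the client speaks standard OPC UA, the same pattern works across manufacturers; only the server address and the browsed node names change per device.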