Controlling and scripting laboratory hardware with open-source, intuitive interfaces: OpenFlexure Voice Control and OpenFlexure Blockly

Samuel McDermott, Richard Bowman, Kerrianne Harrington, William Wadsworth, Pietro Cicuta

Research output: Contribution to journal › Article › peer-review


Making user interaction with laboratory equipment more convenient and intuitive should promote experimental work and help researchers complete their tasks efficiently. The most common form of interaction with current instrumentation is either directly tactile, with buttons and knobs, or mediated through a computer, using a mouse and keyboard. Scripting is another function typical of smart and automated laboratory equipment, yet users are currently required to learn bespoke programming languages and libraries for individual pieces of equipment. In this paper, we present two open-source, novel and intuitive ways of interacting with and scripting laboratory equipment. We choose the OpenFlexure family of microscopes as our exemplar, due to their open-source nature and smart control system. First, we demonstrate 'OpenFlexure Voice Control', which enables users to control the microscope hands-free. Second, we present 'OpenFlexure Blockly', which uses the Blockly Visual Programming Language to let users easily create scripts for the microscope through a drag-and-drop Web interface. We explain the design choices made when developing these tools, and discuss typical use cases and more general applications.
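To illustrate the kind of bespoke scripting that the abstract contrasts with these tools, the sketch below drives a microscope over a REST-style interface. The endpoint path, port, and payload shape here are illustrative assumptions for the sake of the example, not the actual OpenFlexure server API.

```python
import json
from urllib import request

# Hypothetical base URL and endpoint layout -- illustrative only,
# not the real OpenFlexure microscope server API.
BASE_URL = "http://microscope.local:5000/api"


def build_move_request(x, y, z, absolute=False):
    """Construct the URL and JSON body for a hypothetical stage-move call."""
    url = f"{BASE_URL}/stage/move"
    body = {"x": x, "y": y, "z": z, "absolute": absolute}
    return url, json.dumps(body).encode("utf-8")


def move_stage(x, y, z, absolute=False):
    """POST the move command to the (hypothetical) microscope server."""
    url, data = build_move_request(x, y, z, absolute)
    req = request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # requires a running server
        return json.load(resp)
```

Even this small sketch shows why such scripting has a learning curve: the user must know the endpoint layout, payload format, and an HTTP library before the stage moves at all, which is the gap the Blockly interface is designed to close.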

Original languageEnglish
Article number221236
JournalRoyal Society Open Science
Issue number2
Early online date1 Feb 2023
Publication statusPublished - 28 Feb 2023

Bibliographical note

OpenFlexure microscopes (OFMs) are used around the world, so for many users English may not be their preferred language. Internationalization is easy to add to OpenFlexure Voice Control, and is something we will curate over time with the support of users. To do this, users would need to download the relevant language profile from voice2json, which currently supports 18 languages and locales: Catalan, Czech, Dutch, English, French, German, Greek, Hindi, Italian, Kazakh, Korean, Mandarin, Polish, Portuguese, Russian, Spanish, Swedish and Vietnamese. Developers would only need to translate the 'sentences.ini' file for each language while keeping the intent headings the same; the different voice2json language profiles still handle the transcription and intent recognition.
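As a minimal sketch of what translating 'sentences.ini' might look like, the fragment below uses voice2json's template syntax with hypothetical intent names and phrases; the actual OpenFlexure Voice Control grammar may differ. Note that only the sentence templates change between languages, while the bracketed intent headings stay the same.

```ini
# Hypothetical intents for illustration only.
# English profile:
[MoveStage]
move (up | down | left | right){direction}

[CaptureImage]
(take | capture) (a picture | an image)

# The French profile would keep the same intent headings
# and translate only the sentences, e.g.:
# [MoveStage]
# (monte | descends){direction}
```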


Keywords

  • hardware/software interfaces
  • laboratory equipment
  • visual programming
  • voice control

ASJC Scopus subject areas

  • General


