Sep 2023 – Mind-reading tech in the workplace – a good thing, or the end of the world as we know it?

On 8 June, the ICO released its first report on neurotechnology: a fast-emerging technology that records and processes data directly from an individual’s brain and nervous system (“neurodata”).

In its report, the Commissioner discusses the neurotechnology currently available on the market (both invasive examples, such as deep brain stimulation devices, and non-invasive ones, such as headbands that read and interpret signals given off by the brain), the regulatory issues it raises, and the sectors in which neurotech is currently used or expected to be used.

While the medical sector is currently the major focus for this technology, with treatment of nervous system disorders being the most obvious use, the ICO also considered other areas in its report where it envisages the roll-out of this futuristic processing. One foreseen example is the roll-out of neurotech in a workplace context – from recruitment through to safety and productivity monitoring. Unsurprisingly, the ICO does not see this as risk-free, and there are several issues that organisations will need to keep in mind when considering, developing and implementing the use of neurotech in these situations.


As with AI, there is potential for discrimination where the models underlying the technology themselves contain bias, which can produce inaccurate data or assumptions about people. A particular issue arises for neurodivergent people, whose brains and nervous systems may exhibit patterns that differ from those recorded from neurotypical individuals. A neurotech system flagging a person’s neurodata as undesirable or negative could therefore lead to a missed job opportunity or being passed over for promotion, solely because of the system’s ingrained bias – which is clearly an area of concern. It is therefore important that neurotech systems are “trained” on data covering as wide a range of patterns as possible in order to mitigate this risk – something that organisations looking to gain access to these systems should keep in mind.


Another question for those processing neurodata is identifying the appropriate legal basis for that processing. There are calls for explicit consent to be the sole available legal basis, but the innate nature of neurodata makes this problematic: individuals will have limited or no control over the data they are consenting to being processed, as their neurodata output is generated subconsciously. This makes consent difficult, and potentially impossible, to justify as a basis. In a workplace context there is also the usual issue of a power imbalance between employers and potential or current employees, which generally invalidates consent as a basis.

Organisations considering putting neurotech in place will have to think carefully about how they comply with the principle of data minimisation, i.e. processing only the data that is directly relevant and necessary to accomplish a specified purpose. Neurodata is generated subconsciously, so the data collected may not be entirely relevant to the purpose for which it is used. How will organisations recognise what is and is not relevant without first processing and analysing the data they collect? This may be more of a question for those developing the technology; however, businesses would be well advised to ensure that parameters can be put in place so that only the data that is required is collected and processed.


How data subject rights will be addressed is also a question yet to be answered. The complexity of neurodata, and questions about how it is, or will be, presented, will certainly pose issues when it comes to subject rights – for example, how will neurodata be dealt with following a subject access request (an area of ongoing focus for the ICO; see our latest article on employer SARs here)? The hope is that the Commissioner will address these questions in the formal guidance on neurotechnology that is planned. However, this sits on a very long list of things for the Commissioner to do, with international transfers taking centre stage recently (see our article on the recent passing of the Data Privacy Framework here) and AI also contributing to the Commissioner’s workload (see here for our interactive map of AI regulation around the world). The recent emergence, rapid development and complex regulatory questions surrounding neurotech mean that specific guidance may not be forthcoming particularly soon!

Click here for the ICO’s report


If you have any specific questions you would like advice on, then please contact: Abi.Frederick@lewissilkin.com (Lewis Silkin LLP) or koichiro_nakada@btinternet.com.