‘Major risk’ of neurotech being biased, warns UK data regulator

UK – Emerging neurotechnologies risk discriminating against people, particularly neurodivergent people, if not developed properly, the Information Commissioner’s Office (ICO) has warned.


Companies are increasingly using neurotechnology in the personal wellbeing, sports and marketing sectors, as well as to monitor people in the workplace.

The ICO has warned that there is a ‘major risk’ of inherent bias and inaccurate data being embedded in such technology if it is not developed and tested on a wide enough range of people.

The regulator is set to formulate guidance for neurotech developers, to be released by 2025.

Neurotechnology is already used in healthcare under strict regulations, to diagnose and treat illnesses including Parkinson’s disease.

Electroencephalography and other methods of measuring physical and emotional response are well established in market research.

Technologies to gather brain data are also used in various lifestyle sectors, while in recent years, some businesses have added neural interfaces into gadgets such as watches and headphones for use in the workplace – for example, to monitor fatigue. 

However, the ICO warned that neurotechnology could lead to unfair treatment of employees, such as organisations overlooking individuals for promotion if certain neuropatterns are viewed unfavourably due to ingrained bias.  

More generally, the tech could discriminate as a result of devices not being trialled and assessed on a wide variety of people to ensure that data collection remains accurate and reliable.

Neurodivergent people may be particularly at risk, the regulator added, as systems and databases trained on neuro-normative patterns could produce inaccurate results for them.

Stephen Almond, executive director of regulatory risk at the Information Commissioner’s Office, said: “To many, the idea of neurotechnology conjures up images of science fiction films, but this technology is real and it is developing rapidly.

“Neurotechnology collects intimate personal information that people are often not aware of, including emotions and complex behaviour. The consequences could be dire if these technologies are developed or deployed inappropriately.

“We want to see everyone in society benefit from this technology. It’s important for organisations to act now to avoid the real danger of discrimination.”

Research Live is published by MRS.

