Are your thoughts being mined for data?

by Black Hat Middle East and Africa

Your thoughts belong to you. Or at least you think they do. 

The world inside your head is yours and yours alone; you can have private thoughts even in places where you can’t have private conversations, and your inner life is the only thing that’s not being tapped by tech companies and AI algorithms that are hungry for data. 

Except that’s not exactly true. Digital profiling built on expansive datasets is becoming pervasive across every area of our lives, and algorithms are monitoring pretty much everything – including the behaviours and activities from which thought processes and emotions can be inferred.

It’s called ‘affective computing’. Essentially, it’s the study and development of tech that can ‘recognise, interpret, process, and simulate human emotions’. 

And when machines can recognise and interpret our emotions, they can also respond to them in ways that influence our choices and behaviour.
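To make that loop concrete, here’s a minimal, hypothetical sketch in Python. Every signal name, weight, and threshold below is invented for illustration – no real product works exactly like this – but it shows the core pattern: the emotion is never reported by the person; it’s inferred from their behaviour.

```python
from dataclasses import dataclass

@dataclass
class BehaviouralSignals:
    typing_speed_wpm: float   # words per minute
    backspace_rate: float     # fraction of keystrokes that are deletions
    idle_seconds: float       # seconds since the last interaction

def infer_frustration(s: BehaviouralSignals) -> float:
    """Return a 0-1 'frustration' score inferred from behavioural proxies.

    This is the 'recognise and interpret' step: the emotion is never
    reported, only guessed at from activity that loosely correlates with it.
    """
    score = 0.0
    if s.typing_speed_wpm < 20:               # slow, halting input
        score += 0.3
    score += min(s.backspace_rate * 2, 0.4)   # frequent corrections
    if s.idle_seconds > 120:                  # long mid-task pauses
        score += 0.3
    return min(score, 1.0)

def respond(frustration: float) -> str:
    """The 'respond' step: adapt system behaviour to the inferred emotion."""
    return "offer help, simplify the UI" if frustration > 0.6 else "no intervention"

session = BehaviouralSignals(typing_speed_wpm=15, backspace_rate=0.25, idle_seconds=200)
f = infer_frustration(session)
print(f"inferred frustration: {f:.2f} -> {respond(f)}")
```

However crude the proxies, once that score exists it can be stored, aggregated, and acted on – which is exactly where workplace systems come in.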

Algorithmic Affect Management (AAM) technologies at work 

The Institute for the Future of Work (IFOW) recently reported on the introduction of AAM technologies, which are designed to take inferences from affective computing and connect them to algorithmic management systems – opening up a ‘new frontier in surveillance and privacy concerns’. 

These technologies are being used in work environments to drive productivity and occupational health and safety benefits, but they also pose a risk to employee wellbeing and privacy. Because they gather information about people’s emotions at work, AAM technologies can very reasonably be seen as invasive – so we absolutely need careful, controlled implementation with a focus on ethics. 
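To illustrate the pipeline the IFOW describes, here’s a hypothetical sketch that aggregates inferred affect scores (like the frustration score above) and feeds them into an algorithmic management decision. The thresholds and actions are invented; the point is the data flow, not any real system.

```python
from statistics import mean

def aam_decision(employee_id: str, frustration_scores: list[float]) -> str:
    """Turn a stream of inferred emotion scores into a management action.

    This is the step that raises the surveillance concern: emotion
    inferences leave the individual's context and drive decisions about them.
    """
    avg = mean(frustration_scores)
    if avg > 0.7:
        return f"{employee_id}: flag for manager review (avg {avg:.2f})"
    if avg > 0.4:
        return f"{employee_id}: trigger automated 'wellbeing' nudge (avg {avg:.2f})"
    return f"{employee_id}: no action (avg {avg:.2f})"

print(aam_decision("emp-042", [0.9, 0.8, 0.6]))  # -> flagged for manager review
```

Note that the employee never volunteers any of this: a decision about them is driven entirely by emotion inferences they may not know exist.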

What did the IFOW discover about AAM tech? 

The IFOW report surfaced a number of key findings, including: 

  • A recent shift in the way management teams make decisions, in which automated inferences and technological measurements (relating to employees’ identities and their potential behaviours) carry increasing weight. In other words, senior management teams are using AAM insights to make decisions about employees.
  • AAM has the potential to cause exploitative practices and ‘technostress’ – creating new forms of harm to the rights of employees at work.
  • Existing protections are not adequate to safeguard employees’ privacy or their physiological and mental integrity in the face of AAM tech.
  • There’s an urgent need for regulations that cover the various challenges that arise when AAM tech is adopted, including regulations that protect against the risks linked to ‘neurosurveillance’. 

The ethics of emotion data 

The report is a clear call to action against the risks of discrimination, bias, privacy infringement, and psychological harm arising from AAM tech. 

And the underlying ethical conversation extends far beyond any workplace. Collectively, how do we feel about our emotions being mined within datasets? What should that data be used (and not used) for? And how much control should individuals retain over their personal behavioural data, when the organisation collecting it can infer emotional states from it? 

We’re entering a new era of data privacy ethics. Now more than ever, cybersecurity practitioners have to confront the reality that part of the job will be securing the contents of individuals’ inner lives: thoughts, emotions, and the freedom to exist without constant technological analysis. 

Join us at Black Hat MEA 2025 to explore the latest in affective computing and data ethics. Because we all need to be part of this conversation. 
