
Your thoughts belong to you. Or at least you think they do.
The world inside your head is yours and yours alone; you can have private thoughts even in places where you can’t have private conversations, and your inner life is the only thing not being tapped by data-hungry tech companies and AI algorithms.
Except that’s not exactly true. Digital profiling built on expansive datasets is becoming pervasive across every area of our lives, and algorithms are monitoring almost everything, including the behaviour and activity from which thought processes and emotions can be inferred.
It’s called ‘affective computing’. Essentially, it’s the study and development of tech that can ‘recognise, interpret, process, and simulate human emotions’.
And when machines can recognise and simulate our emotions, they can also respond to them in ways that influence our choices and behaviour.
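To make that loop concrete, here’s a minimal, hypothetical sketch of the recognise-and-respond pattern: an emotion is inferred from observed behaviour (text, in this toy case), and the system adapts its output accordingly. The lexicon, labels, and response strategies are all invented for illustration; real systems use trained models over voice, facial expressions, keystrokes, or physiological signals.

```python
# A hypothetical sketch of the affective computing loop: recognise an
# emotion from a signal, then adapt the system's response to it. The
# keyword scoring below is a deliberately simple stand-in for a trained
# emotion-recognition model.

from collections import Counter

# Invented lexicon mapping words to coarse emotion labels.
EMOTION_LEXICON = {
    "frustrated": "anger", "annoyed": "anger", "deadline": "stress",
    "overwhelmed": "stress", "great": "joy", "thanks": "joy",
}

def recognise_emotion(text: str) -> str:
    """Step 1: infer an emotion label from observed behaviour (text)."""
    words = text.lower().split()
    scores = Counter(EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON)
    return scores.most_common(1)[0][0] if scores else "neutral"

def respond(emotion: str) -> str:
    """Step 2: adapt the system's behaviour to the inferred emotion,
    the point at which recognition becomes influence."""
    strategies = {
        "anger": "soften tone, offer escalation to a human",
        "stress": "defer non-urgent prompts, suggest a break",
        "joy": "reinforce with positive feedback",
        "neutral": "default behaviour",
    }
    return strategies[emotion]

message = "I'm overwhelmed and this deadline is impossible"
emotion = recognise_emotion(message)
print(emotion, "->", respond(emotion))  # stress -> defer non-urgent prompts...
```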
The Institute for the Future of Work (IFOW) recently reported on the introduction of affective algorithmic management (AAM): technologies designed to take the inferences produced by affective computing and connect them to algorithmic management systems, opening up a ‘new frontier in surveillance and privacy concerns.’
These technologies are being deployed in workplaces to drive productivity and deliver occupational health and safety benefits, but they also pose a risk to employee wellbeing and privacy. Because they gather information about people’s emotions at work, AAM technologies can reasonably be seen as invasive, so they demand careful, controlled implementation with a clear focus on ethics.
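As a hedged illustration of that pipeline, and not the implementation of any real product, the sketch below shows how affective inferences about a worker might feed an algorithmic management rule. Every name, score, and threshold is invented; the point is that emotional inference, rather than observed conduct, drives the decision, which is precisely what makes AAM feel invasive.

```python
# A hypothetical sketch of the AAM pattern the IFOW report describes:
# affective inferences about each worker are fed into an algorithmic
# management rule. All identifiers and thresholds are invented for
# illustration; real deployments vary widely.

from dataclasses import dataclass

@dataclass
class AffectReading:
    worker_id: str
    stress_score: float      # 0.0-1.0, inferred from e.g. voice or keystrokes
    engagement_score: float  # 0.0-1.0, inferred from activity patterns

def management_action(reading: AffectReading, stress_limit: float = 0.7) -> str:
    """An illustrative management rule: inferred emotional state,
    not observed conduct, determines the outcome."""
    if reading.stress_score > stress_limit:
        return "flag for wellbeing check-in"
    if reading.engagement_score < 0.3:
        return "flag for performance review"
    return "no action"

readings = [
    AffectReading("w-101", stress_score=0.82, engagement_score=0.6),
    AffectReading("w-102", stress_score=0.40, engagement_score=0.2),
]
for r in readings:
    print(r.worker_id, "->", management_action(r))
```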
The IFOW report sets out a number of key findings, and together they amount to a clear call to action against the risks of discrimination, bias, privacy infringement, and psychological harm arising from AAM technologies.
And the underlying ethical conversation extends far beyond any workplace. Collectively, how do we feel about our emotions being mined within datasets? What should that data be used (and not used) for? And how much control should individuals retain over their personal behavioural data when it allows the organisation collecting it to infer their emotional states?
We’re entering a new era of ethics when it comes to data privacy. More than ever, cybersecurity practitioners have to confront the reality that part of the job will be securing the contents of people’s inner lives: thoughts, emotions, and the freedom to exist without constant technological analysis.
Join us at Black Hat MEA 2025 to explore the latest in affective computing and data ethics. Because we all need to be part of this conversation.