Tech Briefing: Affective computing

Dec 6, 2021

Giorgio Tarraf

“Computers are magnificent tools for the realisation of our dreams, but no machine can replace the human spark of spirit, compassion, love, and understanding.”—Louis V. Gerstner, Jr.

Affective computing enables machines to recognise, express and influence emotion. While the technology's most obvious implementation would be a humanoid robot, it is more likely to permeate portable and wearable devices first.

Emotions have long been taboo in science. They appear to be the antithesis of the scientific method, and of the objectives of computer science: computers were long designed to produce flawless, rational, and predictable outputs.

But as our relationships with our silicon companions grow increasingly intimate, the mismatch between the variability of human existence and the steadiness of a microprocessor becomes evident.

The applications of affective computing are nearly infinite. A machine could provide comfort during a panic attack, or calm you during a burst of anger. It could also tell marketers that you're currently highly emotional and vulnerable to suggestion. This capacity to access and influence emotions makes affective computing useful to people with autism, educators, and mental health professionals. Affective gaming, whereby games adapt to a player's emotional state, is also an active research area.
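As a toy illustration of the affective-gaming idea, a game loop might nudge its difficulty in response to an estimated emotional state. Everything here is hypothetical: the function name, the frustration score, and the thresholds are illustrative stand-ins, not any shipping system's design — a real game would obtain the score from facial, voice, or biometric sensors.

```python
# Toy affective-gaming loop: adapt difficulty to an emotion estimate.
# The frustration score (0.0-1.0) and thresholds are hypothetical; a real
# system would derive the score from facial, voice, or biometric sensors.
def adapt_difficulty(current_difficulty, frustration):
    """Nudge difficulty down when frustration is high, up when it is low."""
    if frustration > 0.7:         # player is struggling: ease off
        return max(1, current_difficulty - 1)
    if frustration < 0.3:         # player is cruising: push harder
        return current_difficulty + 1
    return current_difficulty     # comfortable zone: leave it alone

print(adapt_difficulty(5, 0.9))   # frustrated player -> 4
print(adapt_difficulty(5, 0.1))   # relaxed player -> 6
```

The clamp to a minimum difficulty of 1 keeps the loop from adapting itself below the game's easiest setting.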

The complexity of integrating emotional intelligence into machines required a series of breakthroughs over the last two decades. Many consider a 1995 paper by Rosalind Picard of the MIT Media Lab to have launched affective computing as a field of research. Hardware, statistical models, and software all had to mature before Picard's vision could come true. Indeed, for a computer to recognise emotion, it must be equipped with sensors to understand its users.

Sensor technologies are a highly active area of research. The facial expression of emotion, for example, has been studied in depth, and computer models describing facial mannerisms have been developed to detect a wide range of emotions. Basic versions of the technology can be implemented with as few as five lines of machine learning code in the programming language Python:
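A sketch of what such a snippet typically looks like is below. The package name and file path in the commented lines are assumptions (the open-source deepface package, installed via `pip install deepface`, is one common choice), and the confidence scores in the runnable part are illustrative values, not real model output.

```python
# A typical "few-line" implementation wraps a pretrained network, e.g. the
# open-source deepface package (assumed installed via `pip install deepface`):
#
#     from deepface import DeepFace
#     result = DeepFace.analyze(img_path="face.jpg", actions=["emotion"])
#
# The model returns a confidence score per emotion; the detected emotion is
# simply the highest-scoring class (scores below are illustrative only).
scores = {"angry": 0.02, "fear": 0.01, "happy": 0.91, "sad": 0.03, "surprise": 0.03}
dominant = max(scores, key=scores.get)
print(dominant)  # happy
```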


Many mainstream implementations are fuelled by DeepFace, deep-learning facial recognition research published by Facebook's AI team. Body recognition is slower to develop, as it requires contextual analysis; most current implementations rely on hand and arm gesture analysis.

Speech analysis appears to be the most accurate method for detecting emotions. Pattern recognition algorithms have become remarkably good at recognising anger and approval.
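As a toy illustration of the kind of acoustic feature such algorithms consume (real systems extract many features — pitch, energy, spectral statistics — and the signal here is synthetic, not recorded speech), short-time energy alone already separates a loud, agitated frame from a calm one:

```python
import math

# Root-mean-square (RMS) energy: a basic loudness feature used in
# speech-emotion recognition alongside pitch and spectral features.
def rms_energy(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Synthetic stand-ins for one frame of calm vs. agitated speech at 8 kHz
# (real input would be PCM samples captured from a microphone).
calm = [0.1 * math.sin(2 * math.pi * 220 * t / 8000) for t in range(400)]
loud = [0.8 * math.sin(2 * math.pi * 220 * t / 8000) for t in range(400)]

assert rms_energy(loud) > rms_energy(calm)  # the louder frame carries more energy
```

A classifier would consume a vector of many such features per frame; energy alone is merely the simplest member of that vector.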

For a long time, researchers believed machines were just as good as humans at recognising emotions. Recent research using deep learning and artificial neural networks tells a more complex story: outputs are in fact highly data-dependent, and the method used can significantly alter a machine's interpretation, especially for transitional states between emotions.

One must wonder what happens when the machines misinterpret emotions and reshape a virtual environment accordingly.  

Other sensor methods under exploration include health data monitoring, such as facial colour, blood pressure, or the level of sweat on the skin. I discussed these Internet of Life technologies in a previous article.

Overall trends

Affective computing is graduating from theoretical research: large grants are stimulating the generation of patents and market-ready applications. The technology's applicability across a broad spectrum of industries has attracted significant investment in academia, incentivising researchers to produce results across a range of research areas.

Research into emotion-sensing algorithms, and sensor hardware development, are reaching a tipping point. This convergence will create a significant market opportunity for entrepreneurs and investors seeking to capitalise on technology that could become foundational to how we interact with computers. 

Some market forecasting companies foresee significant growth for affective computing over the next seven years. Our research confirms that stratospheric growth is highly likely. We also note a one-year lag between growth in the number of patents and growth in overall investment.

Currently, startups operating in the technology’s ecosystems are mainly nourished by incubators and VCs. L'Atelier's technology intelligence engine suggests a spike in returns on investment in the next three to five years as emerging startups are acquired by major players, likely in the automotive and gaming industries. 

Top research institutions

Much of the research on affective computing comes from the US, UK, China, and Germany. Harvard, Yale, and Stanford, as might be expected, make up the top three universities, joined by University College London and the University of Toronto. Each has published over a thousand papers on affective computing over the past five years. Overall, the number of publications is increasing steadily and is likely to double to 5,000 papers in three years.

Top research funders

Research is primarily funded by US government agencies and the European Commission. Interestingly, fields of research converge here, with computer science-oriented institutions joining forces with mental health institutes, education-focused funders, and the US Army. The number of grants has been stable, with nearly 100 allocated each year, at an average of €500,000.

Top patent holders

The number of patents has grown steadily since 2015, more than tripling by 2020 to nearly 700 granted per year. US databases show the most relevant patents, with over 4,500 registered in the past five years. China comes second with nearly 1,500, though its number of patents granted per year nearly doubled between 2016 and 2019. IBM holds the most patents, closely trailed by Takeda, a Japanese pharmaceutical company that also operates in the US. Chinese patents appear to be held mainly by universities rather than the private sector (though the country appears to have great ambitions for affective computing).

Tech Briefings is a regular L’Atelier Insights feature that breaks down the promise (and/or reality) of a given technology or group of technologies. This includes funding sources, current research, and current and future implementations, sourced from our internal intelligence tools.


Illustration by Debarpan Das.

Giorgio Tarraf

Technology Intelligence Director

Giorgio is the Technology Intelligence Director at L'Atelier BNP Paribas. His background in biology, international affairs, social science, and business allows him to explore a broad spectrum of signals to generate actionable foresight. He is most curious about how emerging technology will generate opportunities and solve global challenges—and what that will mean for our futures.

About L'Atelier

We are an interdisciplinary team of economists, journalists, scientists and researchers interested in cultural phenomena and change related to emerging technology.
