Artificial intelligence (AI) is expected to have a high impact on society, culture and the economy, and its application in cities and urban environments is widely debated. One of the latest hot topics is emotional AI, an emerging technology that allows machines to sense, learn from and interact with people’s emotions, moods and intentions, leveraging data such as body temperature, movements, voices and facial expressions.
Emotional AI might be used for a number of goals, including crime prevention and improving public security. The idea behind it is to predict individual behaviour and intention by detecting how a person moves or talks, and correlating data such as their heart rate and body temperature. If a system could recognise when a driver is feeling tired, for instance, an in-car assistant could prompt them to take a break and help prevent a crash.
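The in-car fatigue example could be sketched as a simple rule that combines several physiological signals. Everything below is an illustrative assumption: the function name, the signals chosen, and the threshold values are hypothetical, not taken from any real driver-monitoring system.

```python
# Hypothetical sketch of a naive in-car fatigue check.
# Signal names and thresholds are illustrative assumptions only.

def is_driver_fatigued(heart_rate_bpm: float,
                       blink_duration_ms: float,
                       steering_variance: float) -> bool:
    """Flag fatigue when at least two indicators cross illustrative thresholds."""
    indicators = [
        heart_rate_bpm < 55,      # a lowered heart rate can accompany drowsiness
        blink_duration_ms > 400,  # prolonged blinks suggest microsleep
        steering_variance > 0.8,  # erratic steering corrections
    ]
    # Requiring two of three indicators reduces false alarms from any single sensor
    return sum(indicators) >= 2

# Slow heart rate plus long blinks together would trigger a break prompt
if is_driver_fatigued(heart_rate_bpm=52, blink_duration_ms=450, steering_variance=0.3):
    print("You seem tired - consider taking a break.")
```

A real system would of course rely on trained models rather than fixed thresholds, but the combination of multiple weak signals into one decision is the core idea the article describes.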
Emotional AI of course has strong ethical implications, due to the collection and processing of personal, sensitive data. The intersection of emotional AI’s opportunities and risks is being investigated by researchers from Northumbria University in Newcastle, who have just announced a three-year project entitled Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life.
They will compare experiences and valuable know-how from the UK and Japan, exploring both the benefits and the potential issues of using emotional AI in smart cities, with a specific focus on policing and security. The team will also examine existing governance frameworks for the collection and use of intimate data relating to people’s emotions, especially in public spaces.
The project will also stimulate the development of a think tank providing impartial advice on the use of emotional AI to governments around the world, as well as to industry, educators and other stakeholders.