Investor Relations case study

One of the first testbeds for deploying sentiment analysis was the analysis of earnings calls of publicly listed companies.

The promise was impressive, but the deliverable we were allowed to show was modest. The sector, however, has not given up on innovative tools for analysing these valuable, data-rich calls.

Using multimodal affective computing to analyse investor relations calls 


Due to NDAs we are only able to show a sneak peek of the technology. Although only a snippet, it offers an entirely new perspective on how to gather and analyse alternative data for the performance evaluation of publicly listed companies.

For this purpose we analysed two primary videos: an internally produced pre-IPO interview and a CNBC interview.

The first interview was filled with negative sentiment, despite the CEO reading a carefully prepared positive script. That negative sentiment was detectable only through multimodal affective computing data, adjusted in advance for Steward’s baseline using publicly available data sets.

In the CNBC interview, the founder disclosed very positive numbers but, again, implicitly conveyed very negative sentiment.


Conclusion 

Affective computing software, or emotional AI, is already being used in the financial and capital markets industry.

Even when calls are carefully scripted, the non-verbal cues, and especially the data-rich video materials, of investor calls and media interviews are well suited to analysing executives' subconscious attitudes and behaviour patterns, such as a lack of congruence between channels and unusual verbal and upper-body movement patterns.
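To make the idea of congruence concrete, here is a minimal, hypothetical sketch of how disagreement between a speaker's words and their non-verbal affect could be flagged. The segment fields, the averaging of channels, and the threshold are illustrative assumptions, not the system used in the study.

```python
# Hypothetical sketch: flagging verbal/non-verbal incongruence per call segment.
# Field names and the threshold are illustrative, not from the original study.
from dataclasses import dataclass

@dataclass
class Segment:
    text_sentiment: float   # transcript sentiment in [-1, 1], e.g. from an NLP model
    facial_valence: float   # facial-expression valence in [-1, 1], e.g. from a vision model
    vocal_valence: float    # vocal-prosody valence in [-1, 1], e.g. from an audio model

def incongruence(seg: Segment) -> float:
    """Score how far the non-verbal channels diverge from the spoken words.

    A positive script delivered with negative facial and vocal affect yields
    a high score; aligned channels yield a score near zero.
    """
    nonverbal = (seg.facial_valence + seg.vocal_valence) / 2
    return abs(seg.text_sentiment - nonverbal)

def flag_segments(segments: list[Segment], threshold: float = 0.8) -> list[int]:
    """Return indices of segments whose channels disagree beyond the threshold."""
    return [i for i, s in enumerate(segments) if incongruence(s) > threshold]

# Example: an upbeat script delivered with visibly negative affect.
call = [Segment(text_sentiment=0.7, facial_valence=-0.4, vocal_valence=-0.3)]
print(flag_segments(call))  # -> [0]
```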

Baselining, or adjusting all multimodal affective computing algorithms to a personal “normal” or baseline state, is a key step in making this process accurate and effective.

The richness of publicly available data on major company executives makes it easier to collect baseline material and train the model.
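Baselining itself can be sketched in a few lines. The following is a minimal, hypothetical illustration (the per-channel z-score approach and function names are my assumptions, not the deployed method): each affect channel is normalised against a mean and standard deviation estimated from the person's public footage, so a new reading is scored relative to that individual's “normal” rather than on an absolute scale.

```python
# Hypothetical sketch of baselining: normalising an affect channel against an
# executive's personal "normal", estimated from publicly available footage.
import statistics

def fit_baseline(history: list[float]) -> tuple[float, float]:
    """Estimate a person's baseline (mean, std) for one affect channel
    from readings taken across publicly available recordings."""
    mean = statistics.fmean(history)
    std = statistics.stdev(history) or 1.0  # guard against a zero-variance baseline
    return mean, std

def baseline_adjust(reading: float, mean: float, std: float) -> float:
    """Express a new reading as a z-score relative to the personal baseline:
    how unusual this moment is for this speaker, not in absolute terms."""
    return (reading - mean) / std

# Example: a speaker whose resting facial valence is mildly positive.
public_footage_valence = [0.2, 0.3, 0.25, 0.15, 0.3]
mu, sigma = fit_baseline(public_footage_valence)
print(round(baseline_adjust(-0.1, mu, sigma), 2))  # strongly below this speaker's baseline
```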



Please find the detailed report here.

Andrea Sagud