Emotions are integral to the social media user experience: we express our feelings, react to posted content, and communicate with emoji. This can lead to emotional contagion and to undesirable behaviors such as cyberbullying and flaming. Near-real-time detection of negative emotion during social media use could mitigate these behaviors, but existing techniques rely on corpora of aggregated user-generated data, such as posted comments or social graph structure. This paper explores how live data extracted from smartphone sensors can predict binary affect, valence, and arousal during the typical social media tasks of browsing content and chatting. Results show that momentary emotion can be predicted from screen-touch and device-motion features, with peak F1-scores of 0.86, 0.86, and 0.88 for affect, valence, and arousal, respectively.
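The abstract describes predicting binary emotion labels from touch and motion features and reporting F1-scores. As a rough illustration of that kind of pipeline (not the authors' actual features or model, which the abstract does not specify), the sketch below trains a random-forest classifier on synthetic, hypothetical sensor features and evaluates it with an F1-score; every feature name and modeling choice here is an assumption.

```python
# Hedged sketch of a sensor-to-emotion classification pipeline.
# The features below (touch duration, touch pressure, swipe speed,
# accelerometer mean/std) are hypothetical placeholders, and the
# random forest is an assumed model, not the one used in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Synthetic feature matrix: one row per interaction window,
# five hypothetical touch/motion features per row.
X = rng.normal(size=(n, 5))
# Synthetic binary labels (e.g., low vs. high valence), loosely
# correlated with two of the features so the task is learnable.
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
score = f1_score(y_te, clf.predict(X_te))
print(f"F1 = {score:.2f}")
```

In a real deployment the feature rows would be computed over short windows of live touchscreen and inertial-sensor data rather than sampled from a synthetic distribution.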