Marketers always look for innovative ways to improve their communication strategy on their social media platforms, but they still seem to be somewhat afraid of machine learning and AI. The thing is, ML and AI models can help you make data-driven decisions about how your brand is representing itself to your fanbase, so it’s worth learning how to implement those models.
In this post, we’ll show you what emotion recognition is, and how it can be used to extract emotions from social media visual posts.
Definition of emotion recognition
There are many AI-powered solutions and programs that can detect faces and recognize the emotions they express. Here's how emotion recognition works, and how you can utilize it in your communication strategy.
The program or model will try to assign an emotional expression to a face. Usually, it will rely on hints like mouth and eyebrow position, but as AI gets more advanced, there will be more cues to pick up on. Currently, emotion recognition models can classify a person's facial expression as:
- Anger
- Disgust
- Fear (scared)
- Happiness
- Sadness
- Surprise
- Neutral
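Under the hood, a classifier like this outputs one raw score (a logit) per emotion category and converts those scores to probabilities with a softmax; the predicted emotion is simply the highest-probability class. Here's a minimal sketch in plain Python, using a typical seven-category setup (the scores below are made-up illustrative numbers, not real model output):

```python
import math

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def softmax(scores):
    """Turn raw model scores (logits) into probabilities that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]  # subtract max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

def predict_emotion(logits):
    """Return the emotion label with the highest probability."""
    probs = softmax(logits)
    return EMOTIONS[probs.index(max(probs))]

# Hypothetical logits for one face -- "happiness" has the largest score.
logits = [0.1, -1.2, 0.3, 2.5, -0.4, 1.1, 0.2]
print(predict_emotion(logits))  # → happiness
```

The same idea scales to any number of emotion categories; only the length of the label list and score vector changes.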
To help you understand how emotion extraction and recognition work, we’ll show you some examples of the two machine learning models in action.
How do we detect faces and emotions using Sotrender’s machine learning models?
Sotrender uses data-driven machine learning approaches: the MTCNN architecture for face detection and the ResNet50 architecture for emotion recognition. MTCNN combines three neural networks to predict the locations of faces and of facial landmarks, key points such as the eyes. From these predictions, a bounding box can be drawn around each detected face. ResNet50, on the other hand, is a deep neural network with multiple layers that recognizes emotions. Layer by layer, it detects edges in the picture and then combines them into representations of bigger features like a smile or eyes, which are ultimately used to predict an associated emotion. To cut a long story short, we combine both of these architectures into a single model that extracts faces from an image and then predicts the emotion on each face.
As of right now, our model achieves 96% of the accuracy of state-of-the-art models. Sotrender's models can analyze videos as well, which makes it possible for brands and organizations to learn how often their ads and video content portray certain emotions. Since video content gets more engagement and takes more effort to create than static image posts, it's worth it for brands and organizations to plan and execute videos for their audiences effectively.
Step-by-step guide for face and emotion recognition
Since you know a bit more about the methods we use, we’ll show you how all of this comes together for a single image.
Step 1. We take an input image and feed it into the face detection model (MTCNN).
Step 2. Each detected face is cropped out of the original image into its own separate image.
Step 3. We classify the emotion in each cropped face with ResNet50, then map the predictions back onto the original input image. Here's what that looks like with visuals.
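The three steps above can be sketched as one small pipeline. The detector and classifier below are hypothetical stand-ins for the real models (in our stack, MTCNN and ResNet50 play these roles), and images are simplified to 2D lists of pixel values:

```python
# Hypothetical stand-ins for the real models: in production, detect_faces
# would be MTCNN and classify_emotion would be a trained ResNet50.

def detect_faces(image):
    """Step 1: return bounding boxes as (top, left, bottom, right) tuples."""
    # A real detector predicts these; here we pretend one face spans rows 1-2, cols 1-3.
    return [(1, 1, 3, 4)]

def crop(image, box):
    """Step 2: cut the face region out of the image."""
    top, left, bottom, right = box
    return [row[left:right] for row in image[top:bottom]]

def classify_emotion(face):
    """Step 3: a trained classifier would map face pixels to an emotion label."""
    return "happiness"  # placeholder prediction

def extract_emotions(image):
    """Full pipeline: detect -> crop -> classify, keyed by bounding box."""
    return {box: classify_emotion(crop(image, box)) for box in detect_faces(image)}

image = [[0] * 5 for _ in range(5)]  # dummy 5x5 grayscale image
print(extract_emotions(image))  # one detected face mapped to its predicted emotion
```

Keying the result by bounding box is what lets the predictions be drawn back onto the original image, as in step 3.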
What Facebook and Instagram posts can we analyze with these models?
Facebook has documented how users can download and analyze images, videos, Stories, and albums from its platforms. You can acquire direct links to these forms of media through endpoints in Instagram's Graph API: requesting the media URLs for a given profile gives you access to the images you want to analyze. Since Instagram is owned by Facebook, Facebook's API offers similar endpoints.
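As a sketch, fetching media URLs boils down to one GET request against a profile's `/media` edge and reading the `media_url` field out of the JSON response. The user ID, token, API version, and sample payload below are fabricated for illustration; the code only builds the request URL and parses an example response:

```python
import json
from urllib.parse import urlencode

GRAPH_API = "https://graph.facebook.com/v19.0"  # version is an assumption; use a current one

def media_endpoint(ig_user_id, access_token):
    """Build the Instagram Graph API URL that lists a profile's media."""
    params = urlencode({
        "fields": "id,media_type,media_url,timestamp",
        "access_token": access_token,
    })
    return f"{GRAPH_API}/{ig_user_id}/media?{params}"

def extract_media_urls(response_json):
    """Pull the downloadable media URLs out of an API response body."""
    payload = json.loads(response_json)
    return [item["media_url"] for item in payload.get("data", []) if "media_url" in item]

# Example payload shaped like a real Graph API response (all values fabricated).
sample = json.dumps({"data": [
    {"id": "1", "media_type": "IMAGE", "media_url": "https://example.com/a.jpg"},
    {"id": "2", "media_type": "VIDEO", "media_url": "https://example.com/b.mp4"},
]})
print(extract_media_urls(sample))  # → ['https://example.com/a.jpg', 'https://example.com/b.mp4']
```

The returned URLs can then be downloaded and fed straight into the face and emotion pipeline described above.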
You should know that there are some limitations. For one, the API only gives you access to Business accounts. Second, there are also limitations when it comes to getting some media forms, such as Stories. In the case of Instagram, we cannot download Live Stories, and since Stories are only available for 24 hours, a large download job might finish too late to capture all the Stories you were hoping to get.
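Because Stories expire after 24 hours, it's worth checking timestamps before queuing downloads so you don't waste requests on media that is already gone. A small helper, assuming UTC timestamps like the ISO-8601 `timestamp` field the Graph API returns:

```python
from datetime import datetime, timedelta, timezone

STORY_LIFETIME = timedelta(hours=24)  # Stories disappear after 24 hours

def still_available(story_timestamp, now=None):
    """Return True if a Story posted at `story_timestamp` is inside its 24-hour window."""
    now = now or datetime.now(timezone.utc)
    return now - story_timestamp < STORY_LIFETIME

# Fabricated timestamps: one Story 10 hours old, one 30 hours old.
now = datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc)
fresh = datetime(2024, 5, 1, 2, 0, tzinfo=timezone.utc)
stale = datetime(2024, 4, 30, 6, 0, tzinfo=timezone.utc)
print(still_available(fresh, now), still_available(stale, now))  # → True False
```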
Overall, it’s good to be aware of the limitations and adjust your expectations for social media data.
You can read more about how we used the emotion recognition model to analyze Elon Musk’s and Richard Branson’s Instagram posts and check what emotions they displayed the most on their profiles.
How can I get more insights about my performance?
You wouldn’t be getting the full picture with only one set of data. Knowing how often emotions appear in your posts won't, on its own, tell you how your audience reacts to them or what you should change about your content.
Instead, we recommend taking all of it into consideration. You can combine emotion data with performance data to see how your posts have impacted your reach, engagement, and follower count. You can check these kinds of performance insights using the native analytics built into platforms like Facebook or Instagram, or an external tool. If you use Sotrender, you can gather data over longer, customizable periods of time to get more specific answers.
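Combining the two datasets can be as simple as joining per-post emotion labels with per-post engagement and averaging by emotion. The posts and numbers below are fabricated for illustration:

```python
from collections import defaultdict

# Fabricated example data: the dominant emotion detected in each post,
# and that post's engagement (e.g. likes + comments + shares).
emotions = {"post1": "happiness", "post2": "surprise", "post3": "happiness"}
engagement = {"post1": 120, "post2": 300, "post3": 80}

def avg_engagement_by_emotion(emotions, engagement):
    """Average the engagement of posts, grouped by the emotion they display."""
    totals, counts = defaultdict(int), defaultdict(int)
    for post, emotion in emotions.items():
        totals[emotion] += engagement[post]
        counts[emotion] += 1
    return {emotion: totals[emotion] / counts[emotion] for emotion in totals}

print(avg_engagement_by_emotion(emotions, engagement))
# → {'happiness': 100.0, 'surprise': 300.0}
```

In this toy dataset, posts showing surprise clearly outperform the rest; that's exactly the kind of signal worth acting on with real data.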
Just by clicking on the interactive graphs in the Sotrender app, you could check which of your posts had the best engagement and highest reach, and see if there is a particular emotion that comes up. If organic content that includes surprised and happy faces tends to get more comments, likes, and shares, then you’ll have your answer. 😉
Ready to start using emotion recognition on your own profiles?
Machine learning can seem intimidating at first glance. However, if you start to understand the basics and logic behind the models, you’ll get a better understanding of how your audience perceives your profile. At some point, simple social media analytics won’t be enough to tell you more about the qualitative side of your social media presence, but machine learning will.
Interested in learning more about how you can use ML to discover new insights about your profile? Contact our Sales Team and we can discuss the ways that Sotrender’s models can be useful for you.