Emotion Detector with Arduino and Machine Learning

Kanchana Kariyawasam
Aug 15, 2024

Combining Arduino and Machine Learning (ML) offers a way to bridge the digital and physical worlds. In this blog article, I’ll show you how I used machine learning to build an emotion detector and linked it to an Arduino that reacts to the detected emotion.


Project Overview

The project detects emotions in images using a pre-trained machine-learning model. The detected emotion is then sent to an Arduino, which lights LEDs to show it visually. For example, if the emotion is “happy,” a specific LED lights up.

Required Components

  • An Arduino board (e.g., an Arduino Uno) and a USB cable
  • Four LEDs and current-limiting resistors
  • A breadboard and jumper wires
  • A computer with Python and the pyfirmata, deepface, and opencv-python packages

Step 1: Setting Up the Arduino

First, set up the Arduino so it can talk to your Python script. Upload the StandardFirmata firmware to the board; this lets your Python programs control it through the pyfirmata library.
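If you want a quick sanity check that the Firmata link works before wiring up the four LEDs, a minimal blink script like the sketch below will do (the filename and the COM6 port are only examples; pin 13 is the built-in LED on most Arduino boards):

## firmata_test.py (optional sanity check)
import time
import pyfirmata

# Open the board on its serial port (adjust the port for your machine)
board = pyfirmata.Arduino('COM6')

# Blink the built-in LED on digital pin 13 a few times
for _ in range(5):
    board.digital[13].write(1)
    time.sleep(0.5)
    board.digital[13].write(0)
    time.sleep(0.5)

board.exit()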

The script connects to the Arduino over a serial port. COM6 is used here, but on your machine the port name will probably differ; you can check it in the Arduino IDE under Tools → Port, or list the ports from Python as in the sketch below.
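Since pyfirmata depends on pyserial, listing the available serial ports from Python is straightforward. A small sketch (the filename is just illustrative):

## list_ports.py
from serial.tools import list_ports

# Print every serial port the OS currently sees;
# the Arduino typically shows up with "Arduino" or "CH340" in its description
for port in list_ports.comports():
    print(port.device, '-', port.description)

With the right port in hand, here’s controller.py, which drives the LEDs based on the detected emotion: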

## controller.py
import time
import pyfirmata

# Serial port the Arduino is connected to (COM6 here; adjust for your machine)
comport = 'COM6'

board = pyfirmata.Arduino(comport)

# Digital pins 2-5 as outputs, one LED per emotion
led_1 = board.get_pin('d:2:o')  # happy
led_2 = board.get_pin('d:3:o')  # angry
led_3 = board.get_pin('d:4:o')  # neutral
led_4 = board.get_pin('d:5:o')  # surprise

def led(emotion):
    if emotion == 'happy':
        led_1.write(1)
        led_2.write(0)
        led_3.write(0)
        led_4.write(0)
    elif emotion == 'angry':
        led_1.write(0)
        led_2.write(1)
        led_3.write(0)
        led_4.write(0)
    elif emotion == 'neutral':
        led_1.write(0)
        led_2.write(0)
        led_3.write(1)
        led_4.write(0)
    elif emotion == 'surprise':
        led_1.write(0)
        led_2.write(0)
        led_3.write(0)
        led_4.write(1)
    else:
        led_1.write(0)
        led_2.write(0)
        led_3.write(0)
        led_4.write(0)

    # Keep the LEDs on for 3 seconds
    time.sleep(3)

    # Turn off all LEDs after 3 seconds
    led_1.write(0)
    led_2.write(0)
    led_3.write(0)
    led_4.write(0)

Step 2: Integrating the Machine Learning Model

Next, we use the DeepFace library to detect the emotion in a picture. The Python script below loads an image, analyzes it to determine the dominant emotion, and then signals the Arduino to light the corresponding LED.

## main.py
from deepface import DeepFace
import cv2
import time
import controller as cnt

# Give the Arduino a moment to finish resetting after the serial connection opens
time.sleep(2.0)

img_path = 'surprise_boy.jpg'
img = cv2.imread(img_path)  # load the image with OpenCV (the path itself is passed to DeepFace below)

# If needed, we can also get age and race by using ['age', 'gender', 'race', 'emotion']
attributes = ['gender', 'emotion']

demography = DeepFace.analyze(img_path, actions=attributes)
demography_data = demography[0].get('dominant_emotion')

print(demography_data)

# Light the LED that matches the dominant emotion
if demography_data == 'happy':
    cnt.led('happy')
elif demography_data == 'angry':
    cnt.led('angry')
elif demography_data == 'neutral':
    cnt.led('neutral')
elif demography_data == 'surprise':
    cnt.led('surprise')
else:
    cnt.led('other')
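For reference, DeepFace.analyze returns a list with one result dictionary per detected face, where 'dominant_emotion' holds the top label and 'emotion' holds a confidence score per class. A short, purely illustrative sketch of how you might inspect those scores (assuming a recent DeepFace version and the same test image):

## inspect_result.py
from deepface import DeepFace

# Analyze only the emotion attribute for a single image
result = DeepFace.analyze('surprise_boy.jpg', actions=['emotion'])

face = result[0]  # first (or only) detected face
print(face['dominant_emotion'])
for emotion, score in face['emotion'].items():
    print(f'{emotion}: {score:.2f}')  # per-class confidence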

We still can’t run and test the project, because the firmware hasn’t been uploaded to the Arduino yet. To run the project:

  • Connect the LEDs to the specified pins on the Arduino (digital pins 2–5)
  • Upload the StandardFirmata sketch to the Arduino using the Arduino IDE
  • Run the Python script (main.py) on the computer

In the Arduino IDE, you’ll find the StandardFirmata sketch under File → Examples → Firmata → StandardFirmata; open it and upload it to your board.

Refer to this GitHub repo 👇

This project shows one way of combining ML with Arduino. By using Python for ML tasks and Arduino for physical control, developers can create a wide range of interactive and intelligent devices.
