
Can AI guess your emotion?

Integrating Digital Technologies
Years F-2; 3-4

DT+ Health and Physical Education


Discuss emotions as a class, and introduce the idea of artificial intelligence (AI). This lesson can also be used to introduce image classification – a key application of AI.

Developed in collaboration with Digital Technologies Institute.

Tool used in the plugged part of the activity: Teachable Machine



Decorative image. Image credit: teguhjati pras / Pixabay


Preliminary notes

Familiarise yourself with the Teachable Machine application. View the supporting videos.

Note: Teachable Machine requires Google Chrome on Windows or Mac (tablets are not supported) and a webcam.

Image 1: Teachable Machine Application Happy/Sad project screenshot

The image above shows the view of a project created in the Teachable Machine AI application. On the left, the classes are shown: Background, Happy and Sad. The preview on the right shows the model being tested using a smiley emoji. The AI recognises the emoji as ‘Happy’ (100%), shown as a completely filled red bar.


Safety warning

Privacy and personal information: Discuss the potential misuse of personal images when uploading images of ourselves or friends to websites. Instruct students not to record images of themselves or others via the webcam, and not to upload such images.

The conditions of use for Teachable Machine state that images are not stored on external servers, provided the program is closed when finished and the project is not saved. If students close the tab without saving, nothing is kept in their browser or on any servers.

Learning map and outcomes

By the end of this lesson students will:

  • Describe the key facial features that help us recognise a person’s emotions.
  • Draw and recognise different faces showing an emotion such as sad, happy or angry.
  • Train an Artificial Intelligence (AI) to recognise an emoji as happy or sad.
  • (In a follow-up lesson) Create their own AI model to recognise images (for example, a pet, an Australian animal or a number from 1 to 10).

Learning hook

Unplugged activity

  1. Use a suitable hook to discuss emotions. Why is it important that we recognise someone’s emotions?

    Write or draw some emotions on cards, one emotion per card. Give a student a card to act out. Then ask the class to guess the emotion. Alternatively, show character images from a picture book, and have students guess the emotions.

    Ask the class: What tells us that a person is happy, sad, angry or surprised? What is a neutral expression?

  2. As a class, create a table of information to look for patterns in the data.

    Example table (yours may be different)

  3. Compare your descriptions of features to emojis or emoticons. How similar are they? What features are used to convey each emotion?

    This activity can be done as a teacher-led activity, or adapted as a hands-on activity to suit a range of skills. The activity has been levelled to enable differentiation: students can begin at Level 1 and demonstrate understanding at each level, or start at a selected level.

    The level of difficulty could be increased as follows, presented using a gaming analogy. When explaining the task, use a combination of verbal and visual instructions.


Learning input


Plugged activity

  1. Explain that computers can be programmed to be intelligent, or at least smart. Ask the class whether they think a computer could guess their emotions. Could the computer work out if they are happy, sad or angry? Explain that instead of using an image of ourselves we will use an emoji. Briefly explain the privacy and protection issues around uploading images of themselves to the internet.
  2. Use the tool Teachable Machine. This tool lets you train an AI application – without having to code – to recognise inputs and match each one to a particular output. Use this pre-made model to test how well the AI recognises a happy or sad emoji. (You will need a device with a webcam.)
  3. Model how to create, train and test an AI model that can recognise happy and sad. Refer to this video explanation or follow these steps (an illustrative code sketch of the same train-and-test idea is included after these steps for teacher background):

    Image 2: Teachable Machine Application screenshot

    • Create one class for the background, which is simply the view of the webcam. To do this, edit ‘Class 1’ and label it ‘Background’. Record images using the webcam.
    • Next, edit ‘Class 2’ and label it ‘Happy’. Record images of smiley emojis – use printed images or student-drawn emojis held in front of the webcam.
    • Finally, add a class, edit ‘Class 3’ and label it ‘Sad’. Record images of sad emojis, again using printed images or student-drawn emojis held in front of the webcam.
    • Train the model, then test it using happy and sad emojis. Discuss how well the AI recognised the happy and sad emojis.
    • The completed project will look something like this:

    Image 3: Teachable Machine Application Happy/Sad project screenshot
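For teacher background, the same train-then-test idea can be sketched in a few lines of code. The sketch below is illustrative only – Teachable Machine itself needs no code – and assumes Python with the numpy and scikit-learn libraries available. It trains a simple classifier on tiny made-up ‘images’ for three classes (Background, Happy and Sad), then reports a prediction with a confidence score, much as the preview panel does.

```python
# Illustrative sketch only: Teachable Machine needs no code.
# The idea is the same: train on labelled example images, then ask the
# model to classify a new image and report how confident it is.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_examples(brightness, n=20, size=8):
    """Make tiny fake 'images' (size x size pixels) around a given brightness."""
    images = rng.normal(loc=brightness, scale=0.05, size=(n, size * size))
    return np.clip(images, 0.0, 1.0)

# Three classes, as in the Teachable Machine project: Background, Happy, Sad.
# Here the classes are (artificially) separated by average brightness.
X = np.vstack([make_examples(0.2), make_examples(0.5), make_examples(0.8)])
y = ["Background"] * 20 + ["Happy"] * 20 + ["Sad"] * 20

model = LogisticRegression(max_iter=1000).fit(X, y)   # the 'Train model' step

# The 'Preview' step: classify a new image and show how sure the model is.
new_image = np.clip(rng.normal(loc=0.5, scale=0.05, size=(1, 64)), 0.0, 1.0)
for label, p in zip(model.classes_, model.predict_proba(new_image)[0]):
    print(f"{label}: {p:.0%}")   # 'Happy' should score close to 100%
```

A real model such as the one in Teachable Machine learns from camera images rather than made-up numbers, but the train-then-predict pattern is the same.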

Learning construction

  1. Provide students with the opportunity to create their own AI model using Teachable Machine.
  2. Students may add a further class to the ‘Happy and Sad’ model, such as surprised or angry, and then demonstrate their model in action.

Learning demo

  1. Students may come up with their own training data and project to test the Teachable Machine AI further. Here are some prompts:
    • Can the AI guess your pet type?
    • Can the AI guess the Australian animal?
    • Can the AI guess the number from 1 to 10?
    • Can the AI work out what word you have written?
  2. Discuss the training data required for their project. What data will they need and where will they get it from? For example, if recognising pets: where will they source the images? How many images of each pet type will they need? What labels will they use? What images will they use for testing?
  3. Ask students to draw a quick plan of their AI model. This plan should show the classes and describe the images needed.

Reflection

  1. Talk about the bar on the interface, which displays the AI’s confidence level.

    For younger students, discuss the scale. If the bar is fully coloured, the AI is very sure. If the bar is only partly coloured, the AI is not quite sure. Here’s an example from our Happy/Sad AI model (a simple sketch of this idea also appears after the discussion points below):

    Discuss:

    • When was the AI very sure of its guess?
    • When was it not so sure?
    • Bias can occur if the training data does not contain a diverse range of examples – for instance, if the images are all one size or all of one type.
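As an optional aside for teachers, the link between the confidence value and the partly filled bar can be shown in a few lines of code. This is a hypothetical illustration (it is not part of Teachable Machine) of how a confidence between 0 and 1 maps to how much of a bar is filled.

```python
# Hypothetical illustration: draw a confidence value as a partly filled bar,
# like the coloured bar on the Teachable Machine preview.
def confidence_bar(label, confidence, width=20):
    """confidence is a value between 0.0 (not sure at all) and 1.0 (very sure)."""
    filled = round(confidence * width)
    return f"{label:10} [" + "#" * filled + "-" * (width - filled) + f"] {confidence:.0%}"

print(confidence_bar("Happy", 1.00))  # fully filled: the AI is very sure
print(confidence_bar("Sad", 0.35))    # partly filled: the AI is not so sure
```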

Why is this relevant

AI is the ability of machines to mimic human capabilities in a way that we would consider 'smart'.

Machine learning is an application of AI. With machine learning, we give the machine lots of examples of data, demonstrating what we would like it to do so that it can figure out how to achieve a goal on its own. The machine learns and adapts its strategy to achieve the goal.

In our example, we are feeding the machine images of emojis via the inbuilt camera. The more varied the data we provide, the more likely the AI will correctly classify the input as the appropriate emotion. In machine learning, the system gives a confidence value – in this case a percentage, shown as a bar that is fully or partially filled with colour. The confidence value gives us an indication of how sure the AI is of its classification.

This lesson focuses on the concept of classification. Classification is a learning technique used to group data based on attributes or features.
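As a rough illustration of what ‘grouping data based on attributes or features’ can look like, here is a minimal sketch. The feature names (mouth, eyebrows) and the simple matching rule are assumptions made up for this example; a real system such as Teachable Machine learns its features from example images rather than from hand-written rules.

```python
# Minimal, illustrative sketch of classification: group a new example with the
# labelled examples it shares the most features with.
from collections import Counter

# Hand-labelled examples (the 'training data'); the features are made up.
training = [
    ({"mouth": "up",   "eyebrows": "relaxed"}, "happy"),
    ({"mouth": "up",   "eyebrows": "raised"},  "happy"),
    ({"mouth": "down", "eyebrows": "relaxed"}, "sad"),
    ({"mouth": "down", "eyebrows": "lowered"}, "angry"),
]

def classify(face):
    """Return the label whose examples share the most features with 'face'."""
    votes = Counter()
    for example, label in training:
        votes[label] += sum(face.get(k) == v for k, v in example.items())
    return votes.most_common(1)[0][0]

print(classify({"mouth": "down", "eyebrows": "lowered"}))  # angry
print(classify({"mouth": "up",   "eyebrows": "raised"}))   # happy
```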