Data bias in AI
Artificial intelligence systems can sometimes be biased towards certain shapes or colours. When such systems are applied to situations involving people, this bias can show up as bias against skin colour or gender. This lesson explores bias in AI: where it comes from and what can be done to prevent it.
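For teachers who want a concrete illustration beyond Scratch or Teachable Machine, the short Python sketch below shows one way data bias can arise. It is not part of the lesson resource: the groups, labels and counts are invented, and the "model" is simply a count of training examples, but it demonstrates how an under-represented group can end up with skewed predictions.

```python
from collections import Counter

# Invented training data as (group, label) pairs.
# group_a is well represented; group_b has few examples and most of
# them carry the "rejected" label, so the data itself is skewed.
training_data = (
    [("group_a", "approved")] * 90 + [("group_a", "rejected")] * 10 +
    [("group_b", "approved")] * 5  + [("group_b", "rejected")] * 15
)

def train(data):
    """Count how often each label appears for each group."""
    counts = {}
    for group, label in data:
        counts.setdefault(group, Counter())[label] += 1
    return counts

def predict(model, group):
    """Predict the label seen most often for that group."""
    return model[group].most_common(1)[0][0]

model = train(training_data)
print(predict(model, "group_a"))  # approved  (plenty of examples)
print(predict(model, "group_b"))  # rejected  (few, skewed examples)
```

If more representative examples are collected for group_b, so that its labels are no longer skewed, the second prediction changes. This mirrors the lesson's point that bias can be reduced by improving the training data.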
Additional details
Year band(s) | 5-6, 7-8
---|---
Content type | Lesson ideas
Format | Web page
Technologies & Programming Languages | Artificial Intelligence
Keywords | Artificial Intelligence, AI, artificial, intelligence, teachable machine, smart phone, algorithms, problem solving, digital systems, Scratch, Lesson idea, Lesson plan, Digital Technologies Institute, data bias
Organisation | ESA
Copyright | Creative Commons Attribution 4.0, unless otherwise indicated.
Related resources
- A matter of style
  In this lesson sequence, students use the Zen Garden website to reflect on criteria for effective design.
- Google CS First: Storytelling
  CS First: Storytelling guides students to use block-based coding in Scratch projects through a series of themed activities.
- Scope and sequence overview
  This resource provides a possible set of sequenced topics for teaching the Australian Curriculum: Digital Technologies and addressing its content descriptions.
- F-2: Digital systems: Hardware and software
  At the F-2 level, students develop understandings of digital systems (hardware and software) as they use key functions to undertake authentic curriculum tasks.
- F-2: Digital systems: Changes in Technology
  Changes to technology over time have affected many aspects of life.