When the iPad was released in 2010, it revolutionized education, and special education in particular. In fact, special educators were among the iPad's earliest adopters because of the customization and personalization the device allows. Built-in accessibility features also enabled students with a range of disabilities to use it. Students with vision loss, hearing loss, learning disabilities, and cognitive disabilities could access materials that had been out of reach without these tools. These were, and are, great solutions for students with unique learning needs, but a larger issue presented itself.

While these students had powerful tools, a more significant challenge remained: The curriculum and educational materials themselves were not easily accessible. Not all students had access to the same educational opportunities as their nondisabled peers. Some barriers were easy to overcome, but others were not: How can a student with physical limitations safely handle lab tools or chemicals in a chemistry class? How can a student who is blind study a graph? How can a student who is deaf or hard of hearing take notes and watch a sign language interpreter at the same time? How do we gauge learning for students who do not learn through traditional methods or at a typical pace? To provide a level playing field and equal opportunities for all students, we can look to the practice of implementing Universal Design for Learning (UDL) in our classrooms alongside today's emerging technologies, including artificial intelligence, augmented reality, mixed reality, and virtual reality.

Universal Design for Learning

UDL is an educational framework developed by the Center for Applied Special Technology (CAST). Its principles guide the design of a curriculum that can be customized and modified to meet every student's individual needs in the classroom. The framework is scientifically based and calls for multiple means of engagement, representation, and expression. Newer technologies may offer tools and solutions that can be matched to the checkpoints within the UDL framework to meet students' unique learning styles and needs in engaging and immersive ways. Along with traditional educational materials and activities, students can explore topics and subjects in novel ways. These technologies also give students multiple ways to express what they know and what they are learning. With the information these technologies gather, we can optimize learning by focusing on each student's specific learning style, strengths, and weaknesses, and we can make adjustments to provide an optimal learning environment.

Artificial Intelligence

Artificial intelligence (AI) is a branch of computer science that allows a machine to process and analyze enormous amounts of information independently to solve problems. We see examples of AI in use around us all the time, often without realizing it. AI analyzes what you search for and buy on the internet and targets ads to your preferences. Alexa, Siri, Google Home, and Cortana are personal assistants that use AI to improve your experiences with your devices; they learn from your interactions and personalize their responses based on your unique preferences and requests. So how will AI impact special education? As more applications are built on AI engines, we will be able to customize and personalize content and curriculum to meet our students' unique learning needs, with the software determining strengths and weaknesses in a student's learning and making adjustments within the application. AI can also provide a comprehensive overview of a student's needs that can be used to target strategies to maximize learning.

AI-powered robots are being used to teach social and communication skills that help students with autism better interact with those around them. There are also many solutions for students with hearing and visual impairments. Today's hearing aids can analyze incoming sound, identify background noise, and filter it out. Microsoft is using AI in its Microsoft Translator app to convert human speech into captions. Apple's upcoming iOS 14 features sound recognition, which will alert deaf users when their phone detects certain sounds. FaceTime in the new iOS will recognize when someone is signing and switch the view to the signing participant. For the visually impaired, the Seeing AI app developed by Microsoft uses artificial intelligence to recognize text and convert it to speech. Another feature uses the smartphone's camera to recognize objects and faces and then identify them for the user.

Augmented Reality

Augmented reality (AR) technology allows you to see the real-world environment through smartphones or glasses and have digital data or images superimposed on the physical world. You can see objects in a real-world environment, manipulate and move them, and move around them. AR became extremely popular with the Pokémon Go app and Snapchat filters. AR apps provide powerful, interactive, and realistic learning opportunities that would not otherwise be possible in a classroom. Some students may not have the ability to interact physically with a real-world object, but with AR, they can work with the object digitally.

For instance, Froggipedia gives a student who may have physical limitations the opportunity to explore the anatomy and life cycle of a frog, or to dissect one using an Apple Pencil or a finger. Students can explore human anatomy more deeply with apps such as Anatomy Atlas 3D or Insight Heart. Students who cannot physically visit a museum or art gallery can have artifacts or pieces of art appear in a real-world environment where they can explore and navigate around the objects. Some students have difficulty understanding abstract or complex concepts; AR can render them as 3D objects and place them in the real world where they can be visualized. GeoGebra helps students better visualize math in the real world by letting them create and manipulate 3D shapes.

SignGlasses allow students to receive live sign language interpreting overlaid on the classroom environment through a pair of smart glasses. With the glasses, a student can watch a lecture without shifting focus between the teacher, the interpreter, and their notes; the interpreter or captions appear on the surface of the glasses.

Apps like ARMakr give students and educators the opportunity to create their own augmented reality content. ARMakr is an easy-to-use app that lets them create 3D objects and then place them virtually in the real world. It is a great way for educators to create engaging content and an excellent tool for students to demonstrate what they know.

A related technology, mixed reality, builds on AR. While AR overlays virtual objects on the real-world environment, mixed reality not only overlays the virtual objects but anchors them to the real world.

Virtual Reality

Unlike AR, virtual reality (VR) is a totally immersive experience. VR is a computer-generated 3D environment that simulates the real world and lets students move about and interact with items in the virtual world. VR requires a head-mounted display; options range from cardboard viewers that hold a cellphone to stand-alone headsets and headsets that tether to a PC or game console.

In the field of special education, there are a wide variety of applications for VR, including virtual simulations, virtual walk-throughs, virtual tours, virtual training, and experiential activities.

There are many advantages to using VR with students with diverse learning needs. The first is that they can learn and interact in a safe environment: Students can explore new environments or activities before experiencing them in the real world. Students in wheelchairs can use virtual wheelchair simulators to learn how to navigate environments safely. Students with autism can practice appropriate social interactions and reading others' emotions and nonverbal cues in controlled settings rather than in real-life situations that may cause anxiety. A second advantage is that learning can happen in an environment relatively free of distractions; because the student is totally immersed in the virtual world, they can focus on the activity at hand. Lastly, students can experience activities they might never otherwise have the opportunity to experience. They can go on virtual field trips around the world using National Geographic Explore VR, discover rich learning environments such as the Anne Frank House VR app, or actively participate in activities such as underwater exploration using the Ocean Rift app.

There are many benefits to using VR in special education, and when its use is tied to specific educational goals and outcomes, it can become an engaging and powerful addition to your pedagogical tool kit.

A critical issue, though, is accessibility. Most controllers are not readily accessible to people with physical and motor difficulties. Many VR apps and games are highly visual, which limits their usefulness for people with visual impairments, and many are simply not created with accessibility in mind. However, as developers build more programs for education, there will also be a push to make them accessible to all students.

It is an exciting time to be in education, and especially in special education. Advances in today's technologies give us amazing opportunities to deliver a curriculum that offers multiple ways for students to be engaged, multiple ways materials can be presented, and multiple ways students can express what they have learned. They also give us ways to measure progress, make adjustments, and personalize learning for our students. Artificial intelligence gives us the tools to analyze a student's learning so we can make adjustments and customizations that maximize it. Augmented reality and virtual reality give us highly engaging, immersive tools that can provide our students with meaningful and educational learning opportunities.


Mark Coppin is the director of disability services at North Dakota State University in Fargo, ND.