Technology will be at the heart of a major disruption in teaching and learning at the university level, says Jennifer Lock, PhD’04, professor and associate dean of Teaching & Learning in the Werklund School of Education at the University of Calgary.
When asked what the UCalgary campus might look like in the year 2067, Lock foresees a very “limited bricks-and-mortar presence. Partly, that’s because the whole notion of where we go to work might change. People will likely have a virtual presence, working and studying from home,” she says. “We’ll have a smaller footprint and fewer buildings here, and there won’t be as many people coming and going.”
Those physically coming to campus might serve more administrative and management roles. Some learning and teaching may still be done on site, perhaps in research labs, but it will be augmented by technology and self-directed by students, much as professor Christian Jacob describes for medical students (see below).
“I think the campuses of the future won’t be bound by time, physical space or geography,” Lock adds. “We’ll be able to work and study around the world, with anyone around the world, without leaving home.”
The whole notion of learning will expand far beyond what students receive during their time on campus. Lock believes they’ll spend more time learning and developing skill sets in community placements or work environments where theory is put into practice. Apprenticeships won’t just be for tradespeople anymore.
Travel and volunteer time — in different spaces and modes — might become important modules for credit. Likewise, massive open online courses (MOOCs), introduced in 2008, already offer alternatives to the lecture theatre. People tend to sign up for MOOCs and take what they need, but don’t necessarily complete them, says Lock.
“That isn’t necessarily a bad thing. Maybe they got just enough to form a building block of their learning about their area of interest, as opposed to taking 10 or 20 formal courses,” she says, musing that students of the future might put together their own educational modules or even degree programs using both formal and informal learning.
If this brave new world upsets our existing ways of recognizing what people know and can do, how will credits and credentials be assessed? “It might be a performance piece, where I demonstrate my knowledge. Or, perhaps I present a series of portfolio pieces or I do a mentorship,” explains Lock. “It opens up a potential wealth of ways of demonstrating knowledge, skill sets and competencies.” Whatever the future holds, she says it’s important to continue to offer “rich, robust learning experiences.”
By 2067, it will be old hat for medical students to virtually “fly” through the human body, “see” its diseases up close and plumb the depths of its cells.
Indeed, the future of medical education is already here, says Christian Jacob, a UCalgary professor in the departments of computer science and biochemistry and molecular biology.
Jacob’s current work combines virtual reality (VR) and augmented reality (AR) with computer-game engines to immerse students around and inside the human body. Jacob and his colleagues have already created HoloCell, educational software that allows users to explore a 3D simulation of the inner workings of a human cell.
In essence, HoloCell combines holograms with AR to provide a live view of the real world in real time, augmented with computer-generated sound, video or graphics (think Pokémon Go). Jacob hopes a version of HoloCell will be used to enhance teaching in alumnus Reed Ferber’s anatomy lab in the Department of Kinesiology this fall.
The long-term plan is to expand the software to build immersive experiences of the entire human body for many educational applications. For example, medical students could dissect a real heart while using AR lenses (picture a big set of ski goggles) to see a 3D heart hologram floating nearby, take it apart virtually, and then watch it pump blood through different heart conditions, such as a heart murmur.
And, while peeling back the muscles with their fingers and instruments, a voice interface could converse with them: “Go further; what else do you see?” This type of self-directed learning, using real tissue along with other technologies such as touch interfaces, will revolutionize medical training, Jacob says. For one, fewer instructors would be required.
As software and hardware continue to improve and the costs decrease, patient education can benefit, too. If knee surgery is needed, both doctor and patient might put on AR gear to see a CT scan projected over the patient’s actual knee. The doctor could explain why surgery is needed and then show what’s involved, using a hologram. The patient’s chart might pop up in another hologram. They’d be able to have a conversation, see one another and see these tools.
“I like AR because it’s blending computers into our real world, without hiding things behind a screen. We can bring digital content from the computer into the real world,” Jacob says. He envisions a time when we’ll be able to activate computer menus and make choices by gesturing or through voice command.
If the future is already here, Jacob muses, medical studies in 2067 might look something like the medical-drama TV series Pure Genius. Students could learn to use 3D printers to build artificial organs, deploy swallowable nano-devices to identify cancers and program “diabetic robots” to monitor and deliver insulin. Augmentation and artificial intelligence will assist medical students and their future patients.
“We’ll know much more about ourselves, simple things like blood pressure and weight, through sensors which will gather data, and that data will be accessible to our doctors,” Jacob says. “We won’t even have to go to the doctor unless an alarm goes off for, say, our blood pressure.”