Bored? Computers might be able to identify and correct the doldrums

You’re sitting in class or at work, and your eyes slip out of focus while your mind drifts elsewhere. You read the same sentence five times over.

Welcome to boredom.

Ironically, boredom is more complex and way more interesting than it appears at first glance. It’s not just the simple act of mentally disengaging from the task at hand. In fact, feeling bored involves three key things: energy, awareness, and environment.

Yes, it takes energy to be bored. If you have nothing to do and little energy, your brain tends to interpret this as relaxation. When you have energy and nowhere to use it, that’s boredom. You also have to be aware of your boredom to feel it, and it must be perceived as external to you, a product of your environment.

According to Psychology Today, “boredom typically occurs when people have trouble focusing their attention and they believe the reason for this difficulty is in the environment.”

It feels out of our control, like the world around us is failing to provide enough stimulus to keep us interested.

But boredom is more than just an unpleasant side effect of waiting rooms, airports, and uncompelling lectures. It’s problematic because it tends to generate negative feelings about the thing causing the boredom.  

Psychology Today says that “these negative feelings can actually impair later performance.  Stress can decrease people’s ability to pay attention and can narrow people’s working memory capacity.” 

While this isn’t a problem in, say, waiting rooms, it becomes a big issue in schools, where boredom can cause students to disengage completely from the material.

New BSMS study suggests computers can now detect, and perhaps prevent, boredom

We reported last week that the school notebook market was valued at $1.35 billion in 2015 and is expected to reach $2.87 billion in 2020, growing at a compound annual growth rate of 17.81%.

At the time, we argued that giving notebooks to students, especially younger ones, could be disruptive to learning. However, one area where computers might have a leg up is in detecting, and maybe even preventing, boredom.

A new study led by Dr. Harry Witchel, Discipline Leader in Physiology at Brighton and Sussex Medical School (BSMS), indicates that computers are now able to read body language to tell if a person is bored.

Witchel, a body-language expert, says that how much a person fidgets is inversely related to their level of interest: the more rapt their attention, the fewer of the tiny, involuntary motions known as non-instrumental movements they make.

Witchel’s study is relatively small: 27 participants were shown three-minute stimuli on a computer, ranging from interesting to downright dull. The researchers used video motion tracking to quantify the participants’ movements and found that the most interesting material produced a 42% reduction in non-instrumental movement.
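To make the idea concrete, here is a minimal, hypothetical Python sketch of how frame-to-frame differences in a video could be turned into a single fidgeting score per clip. It uses the OpenCV library, the file names are invented, and it is not the BSMS team’s actual motion-tracking pipeline, just an illustration of the general technique.

```python
# Hypothetical sketch, not the study's actual pipeline: approximate fidgeting
# by measuring how much each video frame differs from the previous one.
# Requires OpenCV (cv2) and NumPy; the file names below are made up.
import cv2
import numpy as np

def motion_score(video_path: str) -> float:
    """Return the mean per-frame motion for one clip.

    Motion is the mean absolute grayscale difference between consecutive
    frames: a crude stand-in for non-instrumental movement.
    """
    cap = cv2.VideoCapture(video_path)
    prev_gray = None
    frame_scores = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (5, 5), 0)  # damp sensor noise
        if prev_gray is not None:
            diff = cv2.absdiff(gray, prev_gray)
            frame_scores.append(float(np.mean(diff)))
        prev_gray = gray

    cap.release()
    return float(np.mean(frame_scores)) if frame_scores else 0.0

# If the study's finding holds, the "interesting" clip should score lower.
print(motion_score("participant_interesting.mp4"))
print(motion_score("participant_dull.mp4"))
```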

Despite the study’s small scale, the implications here are pretty vast, especially in education. Giving computers insight into when our attention starts to ebb could enable programs that adapt to a user’s interest level and keep them engaged throughout the learning process.
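As a toy illustration of what that might look like, a learning program could compare a live fidgeting score (like the one sketched above) against a threshold and change tack when attention seems to flag. The threshold and the actions below are entirely invented, not drawn from the study.

```python
# Purely illustrative: how a digital lesson might react to a boredom signal.
# The threshold value and the action names are invented for this example.
def next_action(fidget_score: float, boredom_threshold: float = 12.0) -> str:
    """Choose what the lesson does next based on recent fidgeting."""
    if fidget_score > boredom_threshold:
        # Lots of movement suggests waning attention: change the pace.
        return "switch_to_interactive_exercise"
    # Little movement suggests engagement: stay the course.
    return "continue_current_material"
```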

“Being able to ‘read’ a person’s interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process,” said Dr Witchel in a release issued by the University of Sussex. “Further ahead it could help us create more empathetic companion robots, which may sound very ‘sci fi’ but are becoming a realistic possibility within our lifetimes.”