
Unique ‘Pain Fingerprint’ – New Study Charts Brain’s Varied Responses to Pain

pain fingerprint study illustrated by backyard brains
Illustrated by Cristina Mezuk

“How bad does it hurt?” It’s not for nothing that doctors usually struggle to pin down our level of pain: the answer depends not only on how bad we report it to be, but also on how much pain we think we feel.

But are there underlying reasons that would begin to explain our (in)ability to cope with, or even verbalize, the dreaded sensation? According to a recent collaborative study led by Dr. Elia Valentini from the University of Essex, there’s more to this phenomenon than a mere lack of tools that would accurately quantify exactly how much pain there is in an “ouch.”

What Does Our Brain Do While We Hurt?

So far, science has held a more or less persistent view that a surefire way to quantify our level of pain – much like any other physical sensation or state – is to measure the brain’s electrical activity. When you’re sitting and idly scrolling on your phone, your brain waves will likely hover around 12 Hz. Start dozing off and these alpha waves slow down to theta (4-8 Hz), or even delta (1-4 Hz) if you fall asleep.

Sleeping brain EEG

But if a very angry tweet kicks you out of your zen, your brain waves are likely to surge into the beta range, anywhere from 22 to 38 Hz. Finally, if you hop into the kitchen and stub your toe on the way, your brain activity will shoot through the roof into very fast gamma oscillations of up to 80 Hz.
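(A quick aside for the tinkerers: the sketch below is my own illustration, not anything from the study. It shows one common way to estimate how much of a recorded EEG trace falls into each of these frequency bands, using SciPy’s Welch periodogram; the band edges and the fake “relaxed” signal are assumptions for the demo.)

```python
# Illustrative sketch only: estimate the power in each classic EEG band
# from a 1-D voltage trace. Band edges are rough conventions, not the
# study's definitions.
import numpy as np
from scipy.signal import welch

BANDS = {
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 12),
    "beta": (12, 38),
    "gamma": (38, 80),
}

def band_powers(eeg, fs=1000):
    """Average power per band for an EEG trace sampled at fs Hz."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # 2-second windows
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])
    return powers

if __name__ == "__main__":
    fs = 1000
    t = np.arange(0, 10, 1 / fs)
    # Fake "relaxed" EEG: a 10 Hz alpha rhythm buried in noise.
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    print(band_powers(eeg, fs))
```

Run on a real recording instead of the synthetic sine wave, the printed dictionary tells you which band dominates at that moment.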

Or so the theory went!

The study published in the Journal of Neurophysiology paints a more nuanced picture. Different brains, it suggests, show remarkably varied responses to the same type and amount of pain. This leads the researchers to believe that each of us has our own, unique “pain fingerprint.” To gauge what our brain does against what it says it does, the researchers recruited two groups of willing participants and collected two datasets. The first group was zapped with a laser and touched within a 2-week span, whereas the other got only the laser stimulus. All the while, the participants’ response was measured on two fronts. Their EEG was recorded with a focus on the rapid gamma brain waves, and three seconds after each stimulus they were asked to verbally rate their pain from no pain (0) to the maximum pain they were willing to tolerate (10).
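(If you want to picture the analysis, here is a hedged sketch of the general idea rather than the authors’ actual pipeline: cut out a window of EEG after each laser zap, estimate its gamma-band power, and see how well those numbers track the 0-10 ratings. The window length, gamma band and helper names are my own assumptions.)

```python
# Hypothetical sketch of the general approach, not the study's code.
# Assumes you already have the EEG trace, the sample index of each laser
# stimulus, and the verbal 0-10 rating given after each one.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

def gamma_power_per_trial(eeg, fs, stim_samples, win_s=1.0, band=(30, 80)):
    """Gamma-band power in a post-stimulus window, one value per trial."""
    n = int(win_s * fs)
    powers = []
    for s in stim_samples:
        epoch = eeg[s:s + n]
        freqs, psd = welch(epoch, fs=fs, nperseg=min(n, int(fs)))
        mask = (freqs >= band[0]) & (freqs < band[1])
        powers.append(np.trapz(psd[mask], freqs[mask]))
    return np.array(powers)

def gamma_vs_ratings(eeg, fs, stim_samples, ratings):
    """Correlate per-trial gamma power with the pain ratings."""
    gamma = gamma_power_per_trial(eeg, fs, stim_samples)
    return pearsonr(gamma, ratings)  # (correlation, p-value)
```

A weak correlation in one person alongside strong, repeatable gamma responses in another is exactly the kind of pattern the paper describes.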

The most intriguing finding? We may experience and describe a stimulus as painful in a certain way and to a certain extent, but the gamma waves will not necessarily play along. In other words, the waves that have been associated with pain for so long actually vary significantly between individuals. But where they do show up in an individual, they are remarkably stable, consistent and reproducible.


2023 Summer US-Serbian Research Fellowship Concludes: New Experiments in the Works!

2023 research fellowship concludes

Over a dozen busy bees, 5 research projects, 4 hot weeks of July, countless data points, iterations and coffee cups, and one book of experiments to soak it all up and present to a wider audience — with that, the Backyard Brains 2023 US-Serbian Summer Research Fellowship comes to a close. The result will hit the shelves this fall: a new Serbian edition of our book “How Your Brain Works”, containing the brand new experiments our team started working on.

But if you expect to see a bunch of cockroaches, worms, moths, bees and other invertebrates buzzing around the makerspace of Belgrade’s Center for Promotion of Science, where we spent the month, you’re in for a surprise. This time, we ventured into two completely different, even opposite realms, hoping to eventually tie them together. One is the realm of single-celled creatures that don’t seem to be hindered, or even bothered, by their lack of a brain. The other lies behind our all-powerful brain and borders on the philosophy of awareness. What are consciousness and attention? How do we decide what reality is — and how do we share it with others? Finally, is there a way for these two realms to inform and complement each other?

backyard brains 2023 research fellowship in cpn makerspace
A helping hand just got a whole new meaning.

This year’s cohort was small but diverse, composed of three undergrads who flew in from the University of Michigan and four Serbian undergrads from the Universities of Belgrade and Novi Sad. One of its greatest strengths was the wide variety of backgrounds that came together: from neuroscience to electrical engineering, psychology, molecular biology and computer science.


N170 Detection (Or Why Your Brain Picks Faces Over Rolex Watches)

N170 detection experiment
Testing oneself is always the first step!

— Written by Petar Damjanovic —

If I were to show you a photo of a Rolex watch and a face of an unknown person, you’d probably be more interested in the prestigious, shiny object than the random stranger, right? However — and this may come as a surprise — your brain is much more modest and more of a social being than you think you are! 🙂

Evidence indicates that our brain reacts differently when it sees a face than when it sees any other object. This likely has to do with the way human brains have evolved, treating faces as something of greater importance than random objects.

And so we arrive at the N170 from the title of this blog post. What exactly does it mean? It’s a very peculiar spike in EEG recordings: one that is observed approximately 170 milliseconds after a person has been exposed to a stimulus. Multiple papers report that this N170 has a higher amplitude when the stimulus is a human face rather than anything else. And that’s exactly what I set out to prove using the Human SpikerBox and several other electronic devices.

The general idea of the experiment was to record the subject’s EEG as they watch a presentation consisting of photos of human faces and wristwatches. In the presentation, photos of faces and watches flash in random order, with grey screens in between. The goal was to send an event marker to the SpikerBox the moment a face or a watch pops up, and to feed these markers into the EEG recording.
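(For the curious, here is a hypothetical sketch of how such a presentation order could be put together in Python; the trial counts, event codes and function names are made up for illustration, not taken from my actual setup.)

```python
# Hypothetical sketch: build a randomized trial list of faces and watches,
# with a grey screen between stimuli, and map each stimulus type to a
# numeric event code for the marker channel. All names/codes are made up.
import random

EVENT_CODES = {"face": 1, "watch": 2, "grey": 0}

def build_sequence(n_faces=40, n_watches=40):
    """Return (stimulus_type, event_code) trials in random order,
    with a grey screen inserted after every stimulus."""
    stimuli = ["face"] * n_faces + ["watch"] * n_watches
    random.shuffle(stimuli)
    sequence = []
    for stim in stimuli:
        sequence.append((stim, EVENT_CODES[stim]))
        sequence.append(("grey", EVENT_CODES["grey"]))
    return sequence

if __name__ == "__main__":
    for trial in build_sequence(3, 3):
        print(trial)
```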

Now that the EEG recordings had their event markers, I was able to extract some valuable data of interest — such as when exactly the spike appears and what its amplitude is. The results were close to what I’d expected, with higher amplitudes for faces, but the spike appeared a bit earlier than 170 milliseconds. This may have had to do with the delay of the sensor as it sends event markers to the SpikerBox, but we’ll come to that shortly. All of this data is so cool, but it wasn’t easy getting there. Per aspera ad astra! I came across so many problems during the project that I can hardly recall all of them. But, oh well, that’s science – you fail 95% of the time and succeed only in the remaining 5%.
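(Here is a rough sketch of that extraction step; it is my own simplified illustration rather than the exact code I used. The idea: cut epochs around each event marker, average them per condition, and read off the most negative point in a window around 170 milliseconds. Window lengths and function names are assumptions.)

```python
# Simplified sketch of ERP extraction, not the original analysis code.
import numpy as np

def epoch(eeg, fs, marker_samples, pre_s=0.1, post_s=0.4):
    """Stack fixed-length, baseline-corrected epochs around each marker."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    epochs = []
    for m in marker_samples:
        seg = eeg[m - pre : m + post]
        epochs.append(seg - seg[:pre].mean())  # baseline to pre-stimulus mean
    return np.array(epochs)

def n170_peak(erp, fs, pre_s=0.1, window=(0.13, 0.20)):
    """Latency (s after stimulus) and amplitude of the most negative point
    in the 130-200 ms window of an averaged ERP."""
    start = int((pre_s + window[0]) * fs)
    stop = int((pre_s + window[1]) * fs)
    idx = start + np.argmin(erp[start:stop])
    return idx / fs - pre_s, erp[idx]

# Usage: average the face epochs and the watch epochs separately,
# then compare the two N170 peaks.
# face_erp = epoch(eeg, fs, face_markers).mean(axis=0)
# watch_erp = epoch(eeg, fs, watch_markers).mean(axis=0)
# print(n170_peak(face_erp, fs), n170_peak(watch_erp, fs))
```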

n170 detection experiment - tweaking the setup

Anyway, to get started, I had to make a sensor reliable and sensitive enough to detect changes in screen brightness. The goal was to have it detect 6 different levels of brightness, with as small a delay as possible. First, I tried a basic photoresistor, which changes its resistance with light, and used it in a voltage divider. However, as it turned out, it had quite a delay, so after several days I had to switch to a phototransistor. How does a phototransistor work? It ‘generates’ a current proportional to the light intensity. Fortunately, the phototransistor was good enough that I could use it in further experiments. But why do we even need that sensor?
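(As a sketch of the software side, assuming the microcontroller streams raw ADC readings from the phototransistor to the computer: the helper below quantizes each reading into one of the 6 brightness levels and flags every change as an event. The threshold values are placeholders you would calibrate against your own screen.)

```python
# Hypothetical helper: turn raw phototransistor readings (e.g. 10-bit ADC
# values, 0-1023) into one of 6 discrete brightness levels, then report
# each change of level as an event. Thresholds are placeholders only.

LEVEL_THRESHOLDS = [150, 300, 450, 600, 750]  # boundaries between the 6 levels

def brightness_level(adc_value):
    """Map a 0-1023 ADC reading to a discrete brightness level 0-5."""
    level = 0
    for threshold in LEVEL_THRESHOLDS:
        if adc_value >= threshold:
            level += 1
    return level

def detect_events(readings):
    """Emit (sample_index, level) whenever the quantized level changes."""
    events, last = [], None
    for i, value in enumerate(readings):
        level = brightness_level(value)
        if level != last:
            events.append((i, level))
            last = level
    return events
```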
