
N170 Detection (Or Why Your Brain Picks Faces Over Rolex Watches)

[Photo: the N170 detection experiment setup]
Testing oneself is always the first step!

— Written by Petar Damjanovic —

If I were to show you a photo of a Rolex watch and a photo of an unknown person’s face, you’d probably be more interested in the prestigious, shiny object than in the random stranger, right? However — and this may come as a surprise — your brain is much more modest, and much more of a social being, than you think you are! 🙂

Evidence indicates that our brain reacts differently when it sees a face than when it sees any other object. This likely has to do with the way human brains have evolved, recognizing faces as something of greater importance than random stuff.

And so we arrive at the N170 from the title of this blog post. What exactly does it mean? It’s a very peculiar spike in EEG recordings: one that is observed approximately 170 milliseconds after a person has been exposed to a stimulus. Multiple papers state that this N170 has a higher amplitude when the stimulus is a human face than when it is anything else. And that’s exactly what I set out to prove using the Human SpikerBox and several other electronic devices.

The general idea of the experiment was to record a subject’s EEG while they watch a presentation consisting of photos of human faces and wristwatches. In the presentation, photos of faces and watches flash in random order, with grey screens in between. As soon as a face or a watch pops up, an event marker is sent to the SpikerBox and fed into the EEG recording.
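The post doesn’t say which software drove the presentation, but the logic is simple enough to sketch. Here is a minimal, hypothetical version in Python using PsychoPy; the file names, stimulus counts, and timings are placeholders, and the corner-square colours follow the convention described below (black for faces, white for watches):

```python
# Hypothetical sketch of the stimulus presentation (PsychoPy assumed;
# file names, counts, and timings are placeholders, not the real ones).
import random
from psychopy import visual, core

win = visual.Window(fullscr=True, color='gray', units='norm')

# Shuffled list of (image path, marker colour) pairs:
# black corner square for faces, white for watches.
stimuli = [(f'faces/face_{i}.jpg', 'black') for i in range(20)] + \
          [(f'watches/watch_{i}.jpg', 'white') for i in range(20)]
random.shuffle(stimuli)

for image_path, marker_color in stimuli:
    photo = visual.ImageStim(win, image=image_path)
    # Small square in the bottom-left corner, read by the light sensor.
    marker = visual.Rect(win, width=0.1, height=0.1, pos=(-0.95, -0.95),
                         fillColor=marker_color, lineColor=marker_color)
    photo.draw()
    marker.draw()
    win.flip()                         # stimulus and marker appear together
    core.wait(0.5)                     # stimulus duration (placeholder)

    win.flip()                         # blank grey screen between stimuli
    core.wait(random.uniform(1.0, 1.5))

win.close()
```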

With the event markers in place, I was able to extract some valuable data of interest from the EEG recordings — such as when exactly the spike appears and what its amplitude is. The results were close to what I’d expected, with higher amplitudes for faces, but the spike happened a bit earlier than 170 milliseconds. This may have had to do with the delay of the sensor as it sends event markers to the SpikerBox, but we’ll come to that shortly. All of this data is so cool, but it wasn’t easy getting there. Per aspera ad astra! I came across so many problems during the project that I can hardly even recall all of them. But, oh well, that’s science – you fail 95% of the time and succeed only in the remaining 5%.
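To make the “data of interest” part concrete, here is a rough sketch (not the actual analysis code) of how a marked recording could be cut into epochs and averaged, and how the N170 latency and amplitude could be read off the average. It assumes the EEG is a plain NumPy array and that the marker sample indices have already been extracted; the sampling rate and window lengths are placeholders:

```python
# Rough ERP-averaging sketch; fs, window lengths, and marker extraction
# are placeholders, not the values used in the actual experiment.
import numpy as np

def erp_average(eeg, marker_samples, fs, pre=0.1, post=0.4):
    """Cut epochs around each marker, baseline-correct, and average them."""
    pre_n, post_n = int(pre * fs), int(post * fs)
    epochs = []
    for m in marker_samples:
        if m - pre_n < 0 or m + post_n > len(eeg):
            continue                           # skip markers too close to the edges
        epoch = eeg[m - pre_n : m + post_n].astype(float)
        epoch -= epoch[:pre_n].mean()          # baseline = pre-stimulus mean
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

def n170_peak(erp, fs, pre=0.1, search=(0.13, 0.20)):
    """Find the most negative point in the 130-200 ms post-stimulus window."""
    start, stop = int((pre + search[0]) * fs), int((pre + search[1]) * fs)
    idx = start + np.argmin(erp[start:stop])
    latency_ms = (idx / fs - pre) * 1000
    return latency_ms, erp[idx]

# Usage sketch: compare face and watch trials (fs here is a placeholder).
# face_erp  = erp_average(eeg, face_markers,  fs=1000)
# watch_erp = erp_average(eeg, watch_markers, fs=1000)
# print(n170_peak(face_erp, fs=1000), n170_peak(watch_erp, fs=1000))
```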

[Photo: N170 detection experiment, tweaking the setup]

Anyway, to get started, I had to build a sensor reliable and sensitive enough to detect changes in screen brightness. The goal was for it to detect 6 different levels of brightness, with as small a delay as possible. At first, I tried a basic photoresistor, which changes its resistance according to the light falling on it, wired into a voltage divider. However, as it turned out, it had quite a delay. So, after spending several days on the resistor, I had to switch to a phototransistor. How does a phototransistor work? It ‘generates’ a current proportional to the light intensity. Fortunately, the phototransistor was good enough for me to use it in further testing. But why do we even need that sensor?

As I said before, it detects the appearance of a face or a watch by reading a small square in the bottom left corner of each slide (white for watches and black for faces). Therein lay another problem: different screens have different brightness, so I had to find a way around that to make the setup work on any screen. To achieve that, I wrote a calibration routine: before every use, the person calibrates the sensor to their screen. Some more coding was needed to make the sensor work properly, but in the end it was all done (with some of the delay left over, which is what makes our spikes appear a bit earlier than 170 milliseconds). There it is: the sensor is ready to be used and we can start recording and collecting some cool data.
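For illustration, here is roughly what that calibration and brightness-level detection logic could look like, written out in Python. The actual code runs on the sensor hardware itself, so this is only a sketch of the idea; the names and numbers are placeholders:

```python
# Sketch of the calibration idea: learn thresholds from reference shades,
# then map raw sensor samples to one of the 6 brightness levels.
import numpy as np

def calibrate(level_readings):
    """level_readings: 6 arrays of raw sensor samples, one per calibration
    brightness level (darkest to brightest), collected while the screen
    shows each reference shade. Returns the 5 decision thresholds."""
    means = [np.mean(r) for r in level_readings]
    # Place each threshold halfway between neighbouring level means.
    return [(a + b) / 2 for a, b in zip(means[:-1], means[1:])]

def classify(sample, thresholds):
    """Map a raw sensor sample to a brightness level from 0 to 5."""
    return sum(sample > t for t in thresholds)

# Usage sketch: level 0 could mark faces (black square) and level 5 watches
# (white square), leaving the remaining levels free for other event types.
# thresholds = calibrate(readings_per_level)
# event = classify(latest_sample, thresholds)
```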

Well, nice try, but my project had a different idea.

From moving our subjects’ hair around to ensure good enough contact between the electrodes and the skin (which is quite an effort with people with long, thick hair) to removing all possible sources of external noise and interference (phones, laptops, etc.), the real fun began when we started testing on people! What was interesting to observe was the presence of alpha waves, which appear when the brain is resting (for example, when your eyes are closed). The alpha waves were also an indicator that the noise we had was tolerable.
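As a side note, one quick way to check for that alpha activity is to look at the power spectrum of an eyes-closed recording. A minimal sketch, assuming SciPy is available and treating the sampling rate as a placeholder:

```python
# Quick sanity check for alpha activity (8-12 Hz) in an eyes-closed recording,
# using a Welch power-spectrum estimate; fs is a placeholder.
import numpy as np
from scipy.signal import welch

def alpha_ratio(eeg, fs):
    """Ratio of 8-12 Hz power to total 1-40 Hz power."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    alpha = psd[(freqs >= 8) & (freqs <= 12)].sum()
    broadband = psd[(freqs >= 1) & (freqs <= 40)].sum()
    return alpha / broadband

# A clearly larger ratio with eyes closed than with eyes open suggests the
# electrodes are picking up real EEG rather than just noise.
```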

However, I was able to collect some interesting data towards the end of the fellowship. Even though the recordings from different people varied (which needs to be looked into a bit more), the collective data came out as expected and agreed with the papers that already deal with N170 detection. Some possible directions for future research would include optimizing the sensor (especially dealing with the delay), conducting the experiment on more people, and introducing some other types of slides (other objects, familiar faces mixed in with unknown ones) to observe what happens and what the EEG looks like.

About Me

I am a junior undergraduate at the University of Belgrade School of Electrical Engineering. My interests include biomedical engineering and image processing.

