The Fellows! Missing: Ilya and Nathan, who had already started presenting!
Today our Summer Research Fellows “snuck in” and presented their summer work at a University of Michigan Undergraduate Research Opportunity Program (UROP) symposium! Over the two sessions our fellows presented their work and rigs to judges, other students, university faculty, and community members. Some of the fellows are seasoned poster designers, but others had to learn quickly as they all rushed to get their posters printed in time! As our motto goes, we think it’s a shame that science is locked up in labs, and we pride ourselves on being able to take our DIY rigs wherever we go, so of course we encouraged the fellows to bring as much of their rigs as possible to show off in person. Science is much cooler when you can hold and see it in person.
Poster presentations are close to our heart here at Backyard Brains… You might be surprised to learn that our company started out as a poster presentation! The “$100 Spike!” was the poster that launched a thousand ships. Our founders Tim and Greg developed the original SpikerBox as a passion project and presented it at a Society for Neuroscience conference poster session. They pinned up their poster, tacked a hundred-dollar bill to the board, and showed everyone who would listen live action potentials on their first-generation SpikerBox. People expressed interest in purchasing the SpikerBox, and Backyard Brains was born!
We’re proud to see our fellows continuing the tradition of creating affordable, DIY neuroscience experiments. Check out the photos and posters below, and be on the lookout for more blog posts from our fellows as they finish their write ups!
Who would have thought a summer fellowship would grant you sleep sessions during work hours! Not only that, it also comes with the ability to explore the deepest phases of sleep and access to unlimited Delta waves that come in all shapes and heights! Well, that can only happen at Backyard Brains, right here in the sleep lab I’m running this summer in collaboration with Om of Medicine.
Om of Medicine: Where the Magic Happens! Om is letting us use part of their lounge as our sleep lab, where subjects come and perform the study.
For the past couple of weeks, I have been working diligently on designing and implementing the experimental procedures to test whether inducing consolidation during sleep, by cueing certain auditory stimuli, can improve memory recall. This is done using Targeted Memory Reactivation (TMR), a technique where we selectively target memories, reactivate them, and compare them to memories that are not targeted or cued with any stimuli. This methodology allows us to explore different parameters to learn more about the specificity of memory formation and bias in learning. From here, my project splits into two main parts: the memory task and EEG recording/decoding.
For the first part, I am collaborating with Dr. Ken Norman from the Princeton Computational Memory Lab and two of his students, Robert Zhang and Everett Shen, to develop iOS software for the memory tasks. The goal is a fully functional app that users can download from the App Store to run their own sleep studies.
The memory task consists of three main parts. The first is the learning task, where subjects watch 48 different images displayed at random locations on the screen, each paired with a distinct sound (for example, a cat with a meow). Subjects try to memorize where each image was displayed. This phase is followed by two consecutive rounds of testing with feedback, where subjects see each image and click where they remember its correct location being. After this multi-stage learning phase, the subject takes the actual pre-sleep test, which is the same as the previous two rounds but without the feedback. The second part of the app is the cueing phase, played during the nap while the subject is sleeping. The idea is to cue 24 targeted sounds out of the 48 the subject heard before the nap. For the other 24 untargeted sounds, we play baseline sounds the subject did not hear before the nap (different from all 48 presented). Part three is the post-sleep test, which is again the same as the pre-sleep test.
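To make the design concrete, here is a minimal sketch of how the 48 image–sound pairs could be split into the 24 cued and 24 uncued sets. The names and structure are illustrative only; the actual app is iOS software still in development.

```python
import random

def build_cue_lists(pairs, seed=0):
    """Split 48 (image, sound) pairs into 24 cued and 24 uncued sets,
    following the task design described above. All names here are
    illustrative, not the app's actual code."""
    rng = random.Random(seed)
    shuffled = pairs[:]
    rng.shuffle(shuffled)
    cued = shuffled[:24]      # these sounds are replayed during the nap
    uncued = shuffled[24:]    # matched by novel baseline sounds instead
    return cued, uncued

# Hypothetical stand-ins for the real image/sound assets.
pairs = [(f"image_{i}", f"sound_{i}") for i in range(48)]
cued, uncued = build_cue_lists(pairs)
```

The key property is that the cued/uncued assignment is random and balanced, so any post-sleep difference between the two sets can be attributed to the cueing itself.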
Part two of the app, the cueing phase, should play only during the slow-wave sleep cycle, when Delta waves are observed. Here comes the second cool aspect of my project: EEG recording and decoding.
Some screenshots from the current working version of the app, which is still being developed and improved. Code will be available soon on GitHub.
Scoring sleep stages and spotting Delta waves in real time can be very challenging. The end goal of this project is to detect deep sleep automatically and cue the sounds accordingly. For now, I am using our EEG setup and Spike Recorder to observe Delta waves in real time as the subject sleeps; as soon as I see them, I start cueing the sounds from the app.
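For readers who want to experiment, here is a minimal offline sketch of how deep sleep could be flagged automatically from Delta-band power. The sampling rate, band edges, and 0.5 power-ratio threshold are illustrative assumptions, not the project’s final detector.

```python
import numpy as np
from scipy.signal import welch

def is_slow_wave_sleep(eeg, fs=256, threshold=0.5):
    """Flag an EEG epoch as slow-wave sleep when the Delta band
    (0.5-3 Hz) holds more than `threshold` of the 0.5-30 Hz power.
    fs and threshold are illustrative assumptions."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), fs * 4))
    band = (freqs >= 0.5) & (freqs <= 30.0)
    delta = (freqs >= 0.5) & (freqs <= 3.0)
    ratio = psd[delta].sum() / psd[band].sum()
    return ratio > threshold

# Synthetic check: a 1.5 Hz "slow wave" should trigger the detector,
# while broadband noise alone should not.
fs = 256
t = np.arange(0, 30, 1 / fs)
slow = 75e-6 * np.sin(2 * np.pi * 1.5 * t)      # ~75 uV Delta-like wave
rng = np.random.default_rng(0)
noise = 20e-6 * rng.standard_normal(t.size)     # broadband background
print(is_slow_wave_sleep(slow + noise, fs))     # True
print(is_slow_wave_sleep(noise, fs))            # False
```

A real-time version would apply the same power ratio to short sliding windows of the incoming EEG stream and trigger the cueing phase when several consecutive windows cross the threshold.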
My beautiful Delta waves in different shapes and heights, taken from our subjects. Delta waves typically have a frequency of 0.5–3 Hz with an amplitude of around 75 microvolts.
After recording, I perform signal analysis and plot frequency and power graphs in different variations to check that Delta waves were occurring at the same time we did the cueing in real time. So far, the results are on point and matching!
Top Left: Subject 1, Top Right: Subject 2, Bottom Left: Subject 3, Bottom Right: Subject 4
One of the most challenging tasks in my project is finding subjects willing to volunteer, perform the task, and sleep. As this step is crucial, I designed a brochure and handed it out during Tech Trek and at various gatherings. There is a Doodle poll where subjects can sign up for sessions.
Throughout this time, I learned MATLAB from scratch and worked more with electronics and soldering. During the sleeping session, I play white noise from a generator and the cueing sounds from a speaker placed next to the subject’s head. The trick is to keep the cueing sounds no more than 4 dB louder than the white noise, so as not to wake the subject. Setting this up took a lot of testing and playing around with different wires, sound meters, and speakers. All subjects were asked after waking whether they heard any sounds while sleeping; all assured us they did not, which tells us the procedure is working. Next to the speaker sits the EEG shield connected to the Arduino. The electrode placements are as follows: reference electrode on the mastoid, with the active and ground electrodes over the frontal lobe using our EEG headband.
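The 4 dB rule can also be checked numerically from recorded samples, since the level difference in dB is just 20·log10 of the RMS ratio. This sketch uses synthetic signals as stand-ins for actual microphone recordings at the subject’s head.

```python
import numpy as np

def level_difference_db(cue, noise):
    """Level of the cue relative to the white noise, in dB,
    computed from the RMS amplitudes of recorded samples."""
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 20 * np.log10(rms(cue) / rms(noise))

# Synthetic stand-ins: white noise bed and a slightly louder cue.
rng = np.random.default_rng(1)
noise = rng.standard_normal(48000)
cue = 1.5 * rng.standard_normal(48000)   # amplitude ratio 1.5 ~= +3.5 dB
diff = level_difference_db(cue, noise)
print(f"cue is {diff:.1f} dB above the noise")  # just under the 4 dB limit
```

In practice the same comparison would be made with a sound meter or with recordings taken at the subject’s pillow, but the arithmetic is identical.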
Top: iPad running the memory task, connected to the speaker placed inside the room by the subject’s head; I cue the sounds from it once I observe Delta waves. Mac for recording EEG in real time and scoring/observing SWS. Both extend outside the room so that I don’t wake the subject by sitting in the room with them.
Bottom: Speaker, white noise generator, and sound meter.
Subjects during the session. Photos were taken with the subjects’ permission, at the very end of the nap right before waking them.
Finally, here comes the best part!! Our data agrees with the expected results from the published literature.
This is the basic plotting of the data we got; more statistical analysis, error bars, and figure labeling will follow. The graphs show the mean distance in pixels across the 48 images for each category, cued and uncued, before and after sleep. The distance is measured between where the user clicked and the image’s original location, and compared to a threshold we set: larger distances mean more error and count as incorrect, while smaller distances mean less error and count as correct. We can clearly see that both subjects performed worse on the uncued images after sleep compared to before. Subject 4 also clearly shows an improvement in recall for the cued images after sleep compared to before. This supports the TMR technique and shows the selectivity of memory consolidation and recall.
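A minimal sketch of this scoring scheme follows; the 100-pixel threshold and the trial data are illustrative stand-ins, not our actual parameters.

```python
import math

def score_responses(trials, threshold=100.0):
    """Score placement trials. Each trial is (clicked_xy, true_xy).
    Returns the mean error in pixels and the fraction of trials
    within `threshold` pixels (counted as correct).
    The 100 px threshold is an illustrative assumption."""
    dists = [math.dist(click, true) for click, true in trials]
    mean_error = sum(dists) / len(dists)
    accuracy = sum(d <= threshold for d in dists) / len(dists)
    return mean_error, accuracy

# Hypothetical trials: (where the subject clicked, where the image was).
trials = [
    ((110, 205), (100, 200)),   # close: counted correct
    ((400, 90), (250, 80)),     # far: counted incorrect
    ((30, 40), (35, 38)),       # close: counted correct
]
mean_error, accuracy = score_responses(trials)
```

Comparing this mean error (or accuracy) for the cued vs. uncued sets, before vs. after sleep, yields the four bars shown in each subject’s graph.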
The upcoming final month will be filled with more exciting work and experimentation. I will run experiments on more subjects to double-check our results, then start the control experiments, where some subjects will not sleep, and others will sleep but with no sounds cued at all. Stay tuned!!
From the Lab Journal: Artistic Representation of the Sleep Kit
Can we enhance our learning and improve memory recall during sleep? Apparently, sleep does have secret powers! It has been recently discovered that such magic can happen through targeted memory reactivation during sleep. Studies have shown that by associating a sound cue with learning a specific task, you can then play that cue while the subject is sleeping to strengthen their memory of that task! This cueing during sleep helps strengthen memory consolidation and recall compared to subjects who don’t receive the sleep cue. The cue can also be any other form of stimulus, like a distinctive odor, for instance.
There’s more! The cues were shown to be more effective if played during a specific sleep cycle: the slow-wave phase. To time this cue, researchers need to detect when the subject reaches the slow-wave phase of sleep, so they use the subject’s EEG signal (their brain wave recordings). My project aims to create a DIY replication of these fascinating findings, so people can test these hypotheses on their own and come up with their own experiments. The end goal is to create a functional “mini sleep lab” using a mobile application and the BYB EEG headband/SpikeShield.
Flow chart to illustrate the process: start upper left
Decoding the EEG signals to analyze and detect the slow-wave sleep cycle will be the most challenging part of the project. I will be using Deep Neural Networks (DNNs) as a starting point for this task. Once that is achieved, I will move on to creating the app that holds the learning tasks and cues for the user.
I am really excited to work on this project over the summer, and I believe this fellowship will help me build on my skills and implement my knowledge to share it with students and neuroscience enthusiasts!
So… what brought me to Backyard Brains in the first place? Well, it has been a long journey. I am Joud, an international student from Jordan. I am a rising junior at Mount Holyoke College, majoring in Neuroscience and Behaviour with a focus in the Biomedical Engineering Nexus Program.
At my home college, I work with Professor Kenneth Colodner in the Colodner Lab, where we study Alzheimer’s disease and related tauopathies using various genetic techniques in fruit fly (Drosophila melanogaster) models. Last semester, I worked on enhancing the design of a HIT (High Impact Trauma) device used to deliver mechanical force to flies by subjecting them to cycles of acceleration and deceleration, inflicting traumatic brain injury (TBI). The device was used to develop a Drosophila melanogaster model of CTE (Chronic Traumatic Encephalopathy). My next step will be exploring glial cells!
I learned about Backyard Brains while doing a project in collaboration with NeuroTinker, using their NeuroBytes to create “SPLIT,” an educational model simulating neurological symptoms in split-brain patients.
This is my first time visiting Michigan, and I’m already enjoying the vibes here in Ann Arbor! The past week was an experimentation period, where we got to work with the BYB tools and the new plant electrophysiology kit.
With the fellows (I’m in the upper right)!
We also had fun exploring the area and celebrating Memorial Day at the BYB Barbecue. Stay tuned for more posts and updates on my progress!
Also, for those interested, here are some of the papers I have been working with!
Rudoy, J.D., Voss, J.L., Westerberg, C.E., & Paller, K.A. (2009). Strengthening individual memories by reactivating them during sleep. Science, 326, 1079 (with supporting online material).
Antony, J.W., Gobel, E.W., O’Hare, J.K., Reber, P.J., & Paller, K.A. (2012). Cued memory reactivation during sleep influences skill learning. Nature Neuroscience, 15, 1114-1116.
Oudiette, D., Antony, J.W., Creery, J.C., & Paller, K.A. (2013). The role of memory reactivation during wakefulness and sleep in determining which memories endure. Journal of Neuroscience, 33, 6672-6678.
Hu, X., Antony, J.W., Creery, J.D., Vargas, I.M., Bodenhausen, G.V., & Paller, K.A. (2015). Unlearning implicit social biases during sleep. Science, 348, 1013-1015.
Jaskir, A. (2015). Decoding EEG signals using deep neural networks: A basis for sleep analysis. Advisor: Ken Norman, Professor of Psychology and the Princeton Neuroscience Institute.