Fully remote, fully in-person, or somewhere in a sweet spot between the two. Those are the main options being weighed right now by decision makers, on behalf of students, parents, teachers and everyone around them, right at the kickoff of the new academic year. But whichever model prevails, it might turn out to be a temporary fix to a permanent problem. More importantly, it doesn’t answer the key educational question: how do we empower remote learning so that it can fully substitute for in-person teaching if need be?
This issue is especially relevant to teaching STEM. How will an educator facilitate hands-on, project-based learning without projects that students can actually get their hands on? In other words, will the “learning” part of the “distance learning” equation be reluctantly surrendered in a lesser-evil scenario?
Even as COVID-19 begins to stretch out from a single season into an era, it’s becoming clear that distance learning might be here to stay. But it’s not a reason to despair if you’re a teacher or a parent, or both. Quite the contrary – there are ways to leverage all the good aspects of learning from the comfort of one’s couch and still provide hands-on (or should we say: gloves-on?) engagement.
A groundbreaking study by researchers from Purdue and Harvard Universities (DeBoer et al., 2017) has shown exactly that, using our very own Neuron SpikerBox kit. Online learning, the study found, yields remarkable results when complemented with at-home lab kits. Students who enrolled in a MOOC (Massive Open Online Course) and used our bioamplifiers got better grades than their peers who weren’t equipped with the lab kits. More importantly, their self-efficacy was three times higher than that of their counterparts. Both groups followed the same syllabus; both watched videos, took quizzes, and completed virtual simulated labs. The only difference was the chance to do it yourself – which, yet again, turned out to be a key source of confidence.
A two-week online course on neural engineering spruced up with some signal processing and machine learning – is there a better way to spend two weeks of August? Plus, you’ll tinker with a BYB Heart and Brain SpikerBox – and you’ll get to keep it too! Full details here.
If you’re as hyped up about FREE neuroscience education opportunities as we are, you’ll want to know that this course will teach you:
Neurophysiology and brain organization
Brain data acquisition and signal processing
Basic and advanced neural coding using machine learning
All lectures are conveniently divided into AM and PM sessions, so your brain can have some me-time in between studying – studying what, you ask? Why, the brain, of course!
Best of all, it’s not just theorizing but a great deal of hands-on experience, thanks to our little pal SpikerBox. Since the course will be held online in the best tradition of social distancing, you are welcome to apply from anywhere in the world!
This University of Michigan student team developed a way to control a drone with a new kind of controller…
The Braincopter Team!
We work with students of all ages — from outreach to early elementary, to hands-on demonstrations, labs, and even research with students from fifth grade to… well, grad school and beyond!
We wanted to share this novel and exciting project, the work of a group of Aerospace Engineering students who asked an intriguing question: Can we fly a plane, or at least a drone, with our thoughts?
It wasn’t an easy project, but with very minimal support on our end, they were able to get a prototype up and running within just the few weeks allotted to the project!
But how does it work?
Click to view the PDF
The students took advantage of two signals that you can record using the Heart and Brain SpikerBox. First, EEG (electroencephalograms, or brain waves) could be used to “wake up” the drone (take off / ready) by opening your eyes, or “put it to sleep” (land / standby) by closing your eyes. This works because, when you record from your occipital lobe, alpha waves are present when your eyes are closed and “disappear” when they are open – a phenomenon the students leveraged as their “on/off” switch.
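To make the idea concrete, here is a minimal sketch of how such an eyes-closed detector could work – not the students’ actual code. It assumes a one-second window of occipital EEG samples, a 1 kHz sampling rate, and a hypothetical alpha-power threshold that would need tuning for each user.

```python
import numpy as np
from scipy.signal import welch

FS = 1000                 # assumed sampling rate in Hz
ALPHA_BAND = (8, 12)      # the alpha rhythm lives roughly at 8-12 Hz
POWER_THRESHOLD = 50.0    # hypothetical value, tuned per user in practice

def eyes_closed(eeg_window: np.ndarray) -> bool:
    """Return True if alpha-band power suggests the eyes are closed."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=min(len(eeg_window), 512))
    band = (freqs >= ALPHA_BAND[0]) & (freqs <= ALPHA_BAND[1])
    alpha_power = np.trapz(psd[band], freqs[band])
    return alpha_power > POWER_THRESHOLD

# e.g. take off when eyes_closed() flips from True to False, and land on the reverse
```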
Then they used EOG signals (electrooculograms, from your eyes!) to tell the drone to move in different directions depending on whether you are looking up, down, left, or right. This is possible because the eye acts as a small electrical dipole, so glancing in different directions produces distinct voltage deflections in the recording.
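Again purely as an illustration – not the team’s implementation – such deflections could be mapped to movement commands with simple thresholding. The sketch below assumes two EOG channels (horizontal and vertical) and an arbitrary deflection threshold, both hypothetical and in need of calibration.

```python
from typing import Optional

EOG_THRESHOLD = 200  # hypothetical deflection threshold in raw ADC units

def eog_to_command(horizontal: float, vertical: float) -> Optional[str]:
    """Map the dominant eye-movement deflection to a drone command."""
    if abs(horizontal) < EOG_THRESHOLD and abs(vertical) < EOG_THRESHOLD:
        return None  # eyes roughly centered: no command
    if abs(horizontal) >= abs(vertical):
        return "move_right" if horizontal > 0 else "move_left"
    return "move_up" if vertical > 0 else "move_down"
```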
They were able to do this in real time, building an inventive control scheme that could be applied to other devices as well. The sky is the limit for the future of this project! Or maybe not just the sky… maybe space isn’t even a limit anymore for students these days!