Distance learning just got a lot easier for the 7th grade students of Abington Avenue School (Newark, NJ)! Their teacher Khalil Gordon recently won $1,000 in Neuron SpikerBox Bundles. More precisely, they will get 13 BYB research kits that they can use for project-based science learning from the comfort of their PJs!
The funding is part of Society for Science & the Public’s STEM Research Grants, totaling $100,000 and awarded to 100 middle and high school teachers from all across the U.S. The program puts special emphasis on schools in underserved and underrepresented communities.
This year, the program was geared toward distance learning, striving to provide teachers with resources and tools that facilitate hands-on science labs that students can do at home. As we’ve already written, a Harvard study has shown how well our bioamplifiers perform in student dorms, living rooms, bedrooms, or just about anywhere.
Neuron SpikerBox and other standards-aligned Remote Labs lie at the intersection of various nerdy disciplines such as biology, electrophysiology, and computer science. They are already in use in hundreds of U.S. schools, colleges, and other institutions – from elementary to higher education. The tweets speak for themselves!
Congrats to Mr. Gordon! We’re looking forward to hearing about his students’ scientific discoveries in the classroom – be it remote, in-person, or hybrid. (We can already see the working title of a student project: “The day I got kicked by a cockroach leg.”)
Are there any other teacher grants?
Over the year, there are many national and local grants for individuals, schools, and school districts. However, most funding cycles are now over, so we’ll drop just a few that you may still apply for. Heed the deadlines though – the clock is ticking!
Donors Choose Grant (public fundraising opportunity) – no deadline, just pitch well and apply anytime
Fund for Teachers (up to $5,000 for individuals or $10,000 for teams; to be used on customizable professional development) – the deadline is January 21st, 2021
McCarthey Dressman Education Foundation Academic Enrichment Grants (up to $10,000 per year, maximum of $30,000 over three consecutive years; to be used on projects that supplement classroom curriculum or afterschool activities for students from low-income households) – application from January 15th to April 15th or until they reach 350 submissions
Walmart Local Community Grants (up to $5,000 in classroom resources or equipment for K-12 schools) – the application deadline for this cycle is December 31st, 2020
AIAA Foundation Classroom Grant (up to $500 for K-12 teachers to be used on classroom STEM equipment, tools, supplies with an emphasis on aerospace) – the application deadline is January 15th, 2021
Honda Community Grant (up to $75,000 per year for school districts, to be used on classroom equipment or curriculum) – next deadline for new organizations is February 1st, 2021
Lemelson-MIT InvenTeam Initiative (up to $10,000 for high-school teams of teachers and students who need to deploy their STEM expertise to solve a real-life problem with a technological innovation; to be used on related research, materials, and learning experience) – the initial application deadline is June 4th, 2021
What lies at the intersection of math and medicine? Why, many things, of course. Certainly more than could possibly fit into a blog post! But today, I am going to talk about the connection between brain function and numbers.
My name is Natalia Díaz and I am a student of Mathematical Engineering at the University of Santiago de Chile. Ever since I can remember, I have been tantalized by mathematics and medicine (especially brain function). The opportunity to mix both subjects finally arose when I entered college. That is how Neuroscience popped into my life!
To get my degree, I must complete my internship and my thesis. That’s how I started working with my mentors Dr. Patricio Rojas (University of Santiago) and Dr. Patricio Orio (University of Valparaíso). We are investigating, through numerical simulations, how the topology of electrical synapses between inhibitory neurons affects network synchronization.
For this, we use a mathematical model of a mixed network of inhibitory and excitatory neurons of the cerebral cortex, and we study different topologies of connection between the inhibitory neurons (all-to-all or lattice style), characterizing the activity patterns obtained.
For example, the figure below shows a significant difference in network synchronization using different topologies. In the first yellowy-whitish graph, there is no gap junction (electrical synapse). The second shows a gap junction with a lattice topology, and in the last one we apply a gap junction with an all-to-all topology. To plot this, we use different values for the mean synaptic strength between excitatory neurons (mGsynE) and for the mean synaptic strength between inhibitory neurons (mGsynI). Lots of abbreviations, I know. But I promise they are fun!
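To give a feel for what “topology” means here, the toy sketch below builds the gap-junction coupling matrix for a small population of inhibitory neurons under the two schemes described above. This is my own minimal illustration, not the actual model code from the project; the function names, the population size, and the lattice neighborhood width are all assumptions made for the example.

```python
import numpy as np

def all_to_all(n):
    """Gap-junction adjacency: every inhibitory neuron is coupled
    to every other neuron (no self-connections)."""
    return np.ones((n, n)) - np.eye(n)

def ring_lattice(n, k=2):
    """Gap-junction adjacency: each neuron is coupled only to its
    k nearest neighbors on each side of a ring -- a simple
    lattice-style topology."""
    A = np.zeros((n, n))
    for i in range(n):
        for offset in range(1, k + 1):
            A[i, (i + offset) % n] = 1  # neighbor to the "right"
            A[i, (i - offset) % n] = 1  # neighbor to the "left"
    return A

n = 20  # illustrative number of inhibitory neurons
dense = all_to_all(n)
sparse = ring_lattice(n, k=2)
print(int(dense.sum()))   # 20 * 19 = 380 directed couplings
print(int(sparse.sum()))  # 20 * 4  = 80 directed couplings
```

The point of comparing the two matrices is that the all-to-all network has far more electrical couplings than the lattice for the same number of neurons, which is one intuition for why the synchronization patterns in the figure differ so strongly between topologies.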
As I was researching internship opportunities, Dr. P. Rojas told me about Backyard Brains, a go-to company for those who want to tackle neuroscience through mathematics. As for my project, I will be working on our next-generation interface products. Coding neural interfaces can be difficult for beginners to understand, so I’ll help improve the interface by testing an Arduino library Backyard Brains has developed, seeing how easy it is to use, trying to “break” it, and improving the library’s documentation.
This way, someone using our Arduino-based products won’t have to start from scratch as they learn to control devices like robots, computers, musical instruments, and video games with the signals of their bodies (EOG, EMG, EKG, and EEG). My project will last until the end of January, and depending on my time, I may roll up my sleeves and get into some Python data analysis programming, a long-standing data analysis dream for the Backyard Brains team.
In my spare time (before COVID), I used to travel to my mother’s house in Pichidegua (VI Region), where we would get together as a family and have a good time, as you can see in the photo above. Yes, there are many of us – try to find me! But now, due to the quarantine, I have only been in Santiago, which I also love because I can spend time with my boyfriend, Luis. He is very funny, and he’s also a mathematician.
Can you imagine riding on an autonomous car that knows your emotions and thoughts better than you do?
Neither can we, but our co-founder and CEO Dr. Greg Gage can, and he isn’t freaked out by the prospect. On the contrary, he’s quite optimistic about it. How come?
Human-centered technology is the keyword here. As Senior TED Fellows, Greg and artist Sarah Sandman were invited by Lexus to give their two cents on the future of AV (autonomous vehicles) operated by AI rather than humans. They both think that there is a possibility of a car that is not driven by humans yet remains human-driven – or rather, in Greg’s vision, emotion-driven!
Can a Car Feel You?
Emotions are, Greg says in his latest TED video, one of the major evolutionary inventions that we’ve developed in order to better interact with one another. A human-centered autonomous vehicle should therefore be equipped to detect not only obstacles on the road and other external signals, but also the passenger’s state of mind. Are you tired? The seat is already reclining into a bed and tucking you in with some chill-out music. Sensors pick up your bodily signals, such as blood pressure or EKG, so the car can give you exactly what you need before you even know you need it.
This is not science fiction but merely smooth teamwork between the car’s brain and your own. It’s no secret, though, that this car – any car – will be a lot smarter than you are. But there’s no reason to feel bad about it! If you allow it insight into your emotions, it will go one step further, from technology for its own sake to technology for humans’ sake.
Devices can’t read our minds just yet. But Greg doesn’t think it’ll be all doom and gloom once they learn how to do it. We reached out to him to tell us a wee bit more about the broader impact of AV that detects and interprets human emotion. “I think that advanced technology will be forced to interpret our emotional state to better communicate and make better predictions of our needs. It could even help us communicate better with other humans by modeling healthy behaviour within its interaction,” Greg told us while tinkering with the DIY beehive in his backyard. “In an autonomous car, the occupants would not be in direct control. So any detected emotional states of passengers (aggressive or otherwise) would be shielded from the control systems. I feel there could be some feedback in extreme situations, for example, if someone is in an emergency and needs to get to a doctor. But overall, this technology is mostly inward focused on the passenger’s experience.”
So the goal is to improve human-to-human interaction by way of an advanced car-to-human interplay. Plug us in!
Another thought leader and TED Senior Fellow, Sarah Sandman, envisions a car that we’ll happily leave our homes for – one that would fulfill our social as well as physical and emotional needs. As you’re dropping your kid off at school, you could be spending some quality time with them instead of keeping your eyes locked on the road. And how about having a cozy cup of coffee with other passengers instead of silently sitting next to each other on a train, everyone glued to their smartphone? Commuting could finally become communal!
If we create a human-centered artificial intelligence, we won’t have to worry about evil robots chasing us around the scorched earth as they whip us into submission. How about (re)claiming our own future instead of envisioning dystopian havoc?