
Why are Neuroscientists Interested in… plants?

But Why Plants?

Recording an Action Potential from a Sensitive Mimosa!

With the introduction of the Plant SpikerBox, you can, for the first time ever, explore plant behavior and electrophysiology at home or in the classroom. But wait… plants? Why are neuroscientists interested in… plants?

What has a brain?

When we work with young students, we often begin by asking them “What has a brain?” You get your typical responses, like “I have a brain,” “my dog,” “my cat,” etc. Then we ask them to clarify how they’re defining that category, and we often hear the response “They move on their own!” This is true, and the mechanics behind movement in brained creatures are a fundamental element of neuroscience and electrophysiology. But there are living creatures without neurons that move: plants!

Certainly you’ve seen a plant growing towards the sun, opening up its leaves or petals during the day for better exposure or pollination, but what’s more, some plants exhibit rapid movements in response to direct stimulation. We created the Plant SpikerBox to record the electrical activity of these plants! Like the Neuron or Muscle SpikerBox, the Plant SpikerBox is a kit designed to make electrophysiology preps easy, so that students and teachers can focus on the science and experiments rather than getting bogged down by technical issues.

Disclaimer: Venus Flytraps do not have subterranean brains.

We proved this to be an idea worth spreading… Our 2017 TED Talk (Vancouver, BC) introduces viewers to this little-known world of plant electrophysiology. On the TED main stage, our CEO Greg Gage explains the principal elements of electrophysiology research, demonstrating that the electrical signals which control our own bodies are also present in plants! He proves this through a number of demonstrations, first visualizing his own heartbeat with our Heart and Brain SpikerShield before moving on to the plants.

You can see the TED talk here!

To return specifically to the Plant SpikerBox, we encourage users to first find a Venus Flytrap, the plant that Darwin called “one of the most wonderful in the world,” and investigate its eating behavior…

Venus Flytrap

To supplement their nutrition, Venus Flytraps capture and “eat” insects. To do so, they have to snap their traps shut quickly so their prey doesn’t escape. But how does the plant know when to snap its trap shut, and how do the mechanics of this action work?

Stimulating a Trigger Hair in a Venus Flytrap

Just like humans and animals, Venus Flytraps use electrical activity to move! Recording this signal with the Plant SpikerBox reveals that, like us, plants use “Action Potentials” to send movement signals! In the TED talk, Greg demonstrates how Venus Flytraps distinguish between false alarms and real prey. These are the amazing plants that inspired our interest in plant electrophysiology; we hope you find them as incredible as we do! Check out this experimental write-up to learn more!
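If you want to find those action potentials yourself, SpikeRecorder can save your recordings as audio (WAV) files, and a simple threshold crossing is enough to pick the spikes out of the trace. Here is a minimal sketch in Python; the filename and the threshold value are assumptions you would tune to your own recording.

```python
# Minimal sketch: find action potentials in an exported recording by
# threshold crossing. "flytrap.wav" and the 0.5 threshold are placeholder
# assumptions; tune both to your own data.
import numpy as np
from scipy.io import wavfile

rate, trace = wavfile.read("flytrap.wav")   # sample rate (Hz), raw samples
if trace.ndim > 1:                          # keep one channel if stereo
    trace = trace[:, 0]
trace = trace.astype(np.float64)
trace /= np.max(np.abs(trace))              # normalize to [-1, 1]

threshold = 0.5                             # fraction of peak amplitude
above = trace > threshold
rising = np.flatnonzero(above[1:] & ~above[:-1]) + 1  # upward crossings

print(f"Found {len(rising)} threshold crossing(s)")
for i in rising:
    print(f"  spike at {i / rate:.2f} s")
```

A real detector would add a refractory window so one broad spike is not counted twice, but plant action potentials are slow, seconds-long events, so this is usually enough to see them.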

Sensitive Mimosa

Anatomy of a Sensitive Mimosa and its Behaviors

Another interesting, rapidly moving plant is the Sensitive Mimosa, or Mimosa pudica. Also known as the “shy” or “bashful” plant, the Sensitive Mimosa folds up its leaves and branches when it is touched or flicked. Using the Plant SpikerBox, you can experiment with the Sensitive Mimosa and discover how Action Potentials are, again, responsible for the dramatic movement response when you flick the stem of the plant. On the TED stage, Greg demonstrates these two kinds of behaviors, showing how the leaves fold up with soft touches, but entire branches fold when flicked. See the experiment here!

The Sensitive Mimosa has also received some attention lately following the announcement of the 2017 Nobel Prizes! This year’s prize in Physiology or Medicine went to researchers who study circadian rhythms (our internal daily clocks), which were originally discovered in the Sensitive Mimosa! For a great explanation, check out the Nobel Prize website!

Interspecies Plant-Plant-Communicator

But perhaps the most exciting experiment you can perform with your Plant SpikerBox is the Interspecies Plant-Plant-Communicator experiment. To demonstrate the ubiquitous nature of the action potential, Greg uses the Plant SpikerBox on the TED stage to capture a signal from a Venus Flytrap and send it into a Sensitive Mimosa…

Screen capture taken just a moment before Interspecies Plant-Plant-Communication is achieved…

The Plant SpikerBox and Plant Sciences have a lot of potentials (ha!). There are countless other experiments to be performed on these plants alone, but investigating other plants opens a world of opportunities. Perhaps the Trigger Plant or the Telegraph Plant are hiding electrical signals? Perform your own experiments! Let us know what you discover!

The Plant SpikerBox is available in our store, and the companion recording software, SpikeRecorder, is free to download.

What will you discover?


The 2017 Summer Fellowship Concludes

Over 11 sunny Ann Arbor weeks, our research fellows worked hard to answer their research questions. They developed novel methodologies, programmed complex computer vision and data processing systems, and compiled their experimental data for poster, and perhaps even journal, publication. But, alas and alack… all good things must come to an end. Fortunately, in research, the end of one project is often the beginning of the next!

Some of the fellows intend to continue working on the research they began here while they’re away, and many of these projects will be continued next summer! Definitely expect to hear updates from Nathan’s EEG Visual Decoding project and Joud’s Sleep Memory project. Additionally, two of the projects will continue throughout the next few months: Zach’s Songbird Identification and Shreya’s Electric Fish Detector projects will run through to December!

Meet the Fellows, See the Projects

The fellows are off to a great start! Check out their blog posts introducing their projects:

Progress

The team has been working hard to bring their projects to life. Check out these blog posts on their rig construction and data collection efforts!

Conclusions

Our fellows experienced the peaks and valleys of research this summer, but they all came out on top! Check out their final posts for their results, posters, and other details!

Continuations…

A few of our fellows are staying on through this next semester for longer-term development projects! Zach will be back working with his team on the Songbird Identification Device project, and Shreya will be working through to December on the Electric Fish Detector project. Expect updates on their progress from them soon!


Octopus Wrestling and Computer Vision


Hello again, my faithful viewers, and thanks for tuning in for another exciting octopus-themed blog post. As always, I am your host, Ilya Chugunov, but today I’ve come with sad news: all good things must come to an end, and this marks the end of my summer research here with Backyard Brains. Now’s the time to grab a hot cocoa, reminisce on what we’ve learned, and talk about what’s still left to do.

First and foremost, if you haven’t already had the chance to look at my previous blog posts, you can see them here:

Octopus Learning and Behavior

Studying Aggressive Behavior in Octopodes

Now let’s recap and break this down into some conversational dialogue.

First, we found out, rather accidentally, that if left together our Bimacs will wrestle each other to assert dominance. This gave us the idea of using computer vision to gather data for analysis with the hope that we could identify some interesting features within their behaviors.

First I built my acrylic setup to record the octopuses doing their thing, making sure to have even lighting and a stable mount for my GoPro so that the code didn’t just explode from all the variability.


The first, and most classic, behavior we found in our trials with the Bimacs was the “bout”: a little sumo-wrestling fight in which each octopus tried to push the other around as far as possible. Bouts were common when both octopuses were excited and lasted about 5 seconds each.

The second curious behavior was the “poke,” where one octopus wanted to provoke a real fight, but the other just wasn’t feeling it. The more excited octopus would waltz up to the lazy-bones and just briefly tap him with an arm before jetting off across the chamber.

I noticed that in both the bouts and the pokes, right as the distance between the two octopuses closed and they made contact, the angle between them rapidly decreased too. They would approach each other sideways (almost backwards at times), then rapidly spin around right as they got close to poke or fight. In the poke behavior, the offending octopus would then spin back around and jet off, while in the bout behavior they’d just stay locked face to face.
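To put numbers on that observation, all you need per frame is each octopus’s centroid and heading angle; the sketch below shows the two measurements. The array layout here is illustrative, an assumption rather than the format my actual code uses.

```python
# Sketch: per-frame distance between the two octopuses and each one's
# "approach angle" (its heading vs. the direction to the opponent).
# ~0 rad means facing the opponent head-on; ~pi/2 means approaching sideways.
import numpy as np

def distance(c1, c2):
    """c1, c2: (N, 2) arrays of centroids, one row per frame."""
    return np.linalg.norm(c1 - c2, axis=1)

def approach_angle(c_self, c_other, heading):
    """heading: (N,) array of body angles in radians."""
    vec = c_other - c_self
    bearing = np.arctan2(vec[:, 1], vec[:, 0])       # direction to opponent
    d = heading - bearing
    return np.abs(np.arctan2(np.sin(d), np.cos(d)))  # wrap to [0, pi]

# Tiny demo with fake tracks for two frames:
c1 = np.array([[0.0, 0.0], [1.0, 0.0]])
c2 = np.array([[4.0, 0.0], [3.0, 0.0]])
h1 = np.array([np.pi / 2, 0.0])                      # sideways, then facing
print(distance(c1, c2))                              # [4. 2.]
print(approach_angle(c1, c2, h1))                    # [1.5708 0.]
```

Plotted over a trial, the approach angle stays large while the distance shrinks, then collapses toward zero right at contact, which is exactly the spin-around described above.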

Another notable thing our octopuses do in their fighting ritual is change colors. As I assume you already know, these guys are covered in chromatophores, and they seem to flash a deep black as they go on the offensive (can you tell who the attacker is in the picture above?).

The poke behavior elicited the same response, twice! The first bump was the attacked octopus darkening as the poking octopus approached, and the second was the poking octopus turning a dark brown as he squirted away.

“But Ilya, how in the world do you process so much video? And how do you know when the fight starts in the first place?”
Why, thanks for the question, hypothetical reader. I use a mess of MOG (Mixture of Gaussians) background subtraction, erosion, and band-pass filtering, combined with OpenCV’s convex hull functions, to find the general outlines of the octopuses, and then I check whether they form two separate blobs or one combined megaoctopus. If they’re two blobs, they’re not in contact; if they’ve merged into one, they are. From there it’s easy to define first contact and to tell a bout from a poke (long contact vs. short contact).
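In OpenCV terms, the core of that pipeline looks roughly like the sketch below. This is a simplified stand-in for my actual code: it uses the MOG2 subtractor that ships with mainline OpenCV, skips the band-pass filtering step, and the filename, kernel size, and area cutoff are all guesses you would tune.

```python
# Sketch: background-subtract each frame, erode away speckle noise,
# hull the large blobs, and call it "contact" when the two octopuses
# merge into a single blob.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

cap = cv2.VideoCapture("bout.mp4")                  # placeholder filename
contact = []                                        # one bool per frame
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                  # foreground mask
    mask = cv2.erode(mask, kernel, iterations=2)    # strip speckle noise
    found, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x API
    hulls = [cv2.convexHull(c) for c in found
             if cv2.contourArea(c) > 500]           # area cutoff is a guess
    contact.append(len(hulls) == 1)                 # one merged blob = touching
cap.release()
```

With a per-frame contact list in hand, first contact is just the first True, and bout vs. poke falls out of how long that run of Trues lasts.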


Using a simple windowing function and a pretty boring logistic regression, we can take a bunch of our video clips of octopus fights where we’ve already labeled when a fight occurs, and use them to predict the point of contact in a new video we feed into the algorithm. This is where machine learning starts to play into the project: letting a program learn from previous octopus video to predict what will happen in new octopus video.
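As a toy version of that idea: slide a fixed-size window over the per-frame features (blob count, distance, and so on), flatten each window into one vector, and let a logistic regression learn which windows contain a contact. Everything below, from the window size to the random stand-in data, is made up for illustration; it is the shape of the approach, not my actual training code.

```python
# Toy sketch: windowed logistic regression for predicting contact.
import numpy as np
from sklearn.linear_model import LogisticRegression

WINDOW = 15  # frames per window; arbitrary for the example

def to_windows(features, labels):
    """Flatten each WINDOW-frame slice into one sample, labeled 1 if
    any frame inside the window was a contact frame."""
    X = [features[i:i + WINDOW].ravel() for i in range(len(features) - WINDOW)]
    y = [int(labels[i:i + WINDOW].any()) for i in range(len(features) - WINDOW)]
    return np.array(X), np.array(y)

rng = np.random.default_rng(0)
train_feats = rng.normal(size=(600, 4))             # stand-in per-frame features
train_labels = rng.random(600) < 0.1                # stand-in contact labels

X_train, y_train = to_windows(train_feats, train_labels)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

new_feats = rng.normal(size=(200, 4))               # "new video" features
X_new, _ = to_windows(new_feats, np.zeros(200, dtype=bool))
contact_prob = model.predict_proba(X_new)[:, 1]     # per-window probability
print("most likely contact window starts at frame", int(np.argmax(contact_prob)))
```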

I’ve compiled my research results and created a poster which I presented at a University of Michigan symposium.

What’s next?

For me, Canada. Heading up to Montreal next week.

In general, my code is up on my GitHub and is completely open source, so anyone is welcome to make changes to it and take it in whatever direction they want; you don’t even have to use it on octopuses if you don’t want to.

Now for some musings…

I’m excited about computer vision. Historically, behavioral studies involve a lot of humans watching animals, recording specific events (like when an animal eats a certain food), or interpreting their behavior. This is not only time-consuming but also unscientific. In these studies, there needs to be redundancy: multiple people need to record the events, and that data then needs to be checked statistically to ensure that the interpretations are consistent between observers. As you can see, it is challenging. Computer vision programs are changing this!

By taking the humans out of the equation, you remove chances for bias, missed behaviors or interactions, or fudged results. Computer vision techniques can comb over hundreds of hours of video footage, quickly providing researchers with quantifiable results. There are certainly still some behavioral studies that require human discretion (for instance, was a touch affectionate or aggressive?), but for many researchers, computer vision is the future.

I think there’s a lot still to be done with computer vision and behavioral analysis, and this summer research was just me dipping my toe into the pool. There is much more data we could draw from the same video I was working with: arm position and length, how curled the octopuses’ arms were, maybe even their heart rate, extracted with enough clever coding. As I continue onward in whatever field of STEM I find myself in next, I hope to keep throwing computational power at problems that don’t seem like they even need a computer, because who knows, maybe they do.

I’ll leave you with some boring philosophy. No one, not a single scientist, knows for certain what the next big thing is going to be. No one knows when or where the next technological revolution will happen; no one knows if the next world-changing invention will be made in a million-dollar Elon Musk laboratory or at 3 a.m. by a hungover student in their dorm room. So just know that when you read a blog post like this, about an 11-week undergrad project, even it has a chance to be something big; not all scientific breakthroughs are made by bearded dudes in lab coats. The next one could be made by you.