Octopus Wrestling and Computer Vision

 

Hello again, my faithful viewers, and thanks for tuning in for another exciting octopus-themed blog post. As always, I am your host, Ilya Chugunov, but today I’ve come with sad news: all good things must come to an end, and this marks the end of my summer research here with Backyard Brains. Now’s the time to grab a hot cocoa, reminisce on what we’ve learned, and talk about what’s still left to do.

First and foremost, if you haven’t already had the chance to look at my previous blog posts, you can see them here:

Octopus Learning and Behavior

Studying Aggressive Behavior in Octopodes

Now let’s recap and break down what we found.

First, we found out, rather accidentally, that if left together, our Bimacs will wrestle each other to assert dominance. This gave us the idea of using computer vision to gather data for analysis, with the hope that we could identify some interesting features within their behaviors.

To start, I built my acrylic setup to record the octopuses doing their thing, making sure to have even lighting and a stable mount for my GoPro so that the code didn’t just explode from all the variability.


The first, and most classic, behavior found in our trials with the Bimacs was the “bout”: a little sumo-wrestling fight in which each octopus tried to push the other as far as possible. These were common when both octopuses were excited and lasted about 5 seconds each.

The second curious behavior found was the “poke”, where one octopus wanted to provoke a real fight, but the other just wasn’t feeling it. The more excited octopus would waltz up to the lazy-bones and just briefly tap him with an arm before jetting off across the chamber.

I noticed that in both the bouts and the pokes, right as the distance between the two octopuses closed, and they made contact, the angle between them rapidly decreased too. They would approach each other sideways (almost backwards at times), then rapidly spin around right as they got close to poke/fight. In the poke behavior, the offending octopus would then spin back around and jet off, while in the bout behavior they’d just stay locked face to face.
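For the curious, those two measurements are easy to compute once you have tracking data. Here’s a minimal sketch, assuming a tracker already gives per-frame centroids and a unit heading vector for each animal (all names here are hypothetical, not from my actual code):

```python
import numpy as np

def approach_metrics(c1, c2, h1):
    """Per-frame distance between two tracked animals, and the angle
    between animal 1's heading and the line toward animal 2.
    c1, c2: (n, 2) centroid tracks; h1: (n, 2) unit heading vectors."""
    sep = c2 - c1
    dist = np.linalg.norm(sep, axis=1)
    toward = sep / dist[:, None]               # unit vector pointing at animal 2
    cos_a = np.clip((h1 * toward).sum(axis=1), -1.0, 1.0)
    return dist, np.degrees(np.arccos(cos_a))  # 0 deg = facing head-on

# toy tracks: animal 1 sits at the origin, first sideways-on, then facing
c1 = np.array([[0.0, 0.0], [0.0, 0.0]])
c2 = np.array([[10.0, 0.0], [5.0, 0.0]])
h1 = np.array([[0.0, 1.0], [1.0, 0.0]])
dist, angle = approach_metrics(c1, c2, h1)
```

A shrinking distance paired with a rapidly shrinking angle is exactly the “spin around right as they get close” signature described above.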

Another notable thing our octopuses do in their fighting ritual is change colors. As I assume you already know, these guys are covered in chromatophores, and they seem to flash a deep black as they go on the offensive (can you tell who the attacker is in the picture above?).

The poke behavior elicited the same response, twice! The first bump was the attacked octopus darkening as the poking octopus approached it, and the second was the poking octopus turning a dark brown as he squirted away.

“But Ilya, how in the world do you process so much video? And how do you know when the fight starts in the first place?”
Why, thanks for the question, hypothetical reader. I use a mess of MOG (Mixture of Gaussians) background subtraction, erosion, and band-pass filtering, combined with OpenCV’s convex hull functions, to find the general outlines of the octopuses, and then I check whether they form two separate blobs or one combined megaoctopus. If they’re two blobs, they’re not in contact, and vice versa, so it’s now easy to define first contact and to tell a bout from a poke (long contact vs. short contact).
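The contact test itself boils down to counting connected blobs in the foreground mask. Here’s a toy sketch of just that step, using `scipy.ndimage.label` as a stand-in (in the real pipeline, the mask comes out of OpenCV’s MOG subtraction, erosion, and convex hulls):

```python
import numpy as np
from scipy import ndimage

# toy binary foreground mask; in the real pipeline this comes out of
# MOG background subtraction + erosion + convex hulls
mask = np.zeros((8, 12), dtype=int)
mask[2:5, 1:4] = 1     # octopus A
mask[3:6, 8:11] = 1    # octopus B

def in_contact(mask):
    # one connected blob = touching ("megaoctopus"); two = separated
    _, n_blobs = ndimage.label(mask)
    return n_blobs == 1
```

Scanning `in_contact` frame by frame gives you first-contact times for free, and the length of each contact run separates bouts from pokes.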

 

Using a simple windowing function and a pretty boring logistic regression, we can take a bunch of our video clips of octopus fights where we’ve already classified when a fight occurs, and from them predict a point of contact in a new video we feed into the algorithm. This is where the concept of machine learning starts to play into the project, letting a program learn from previous octopus video to predict what will happen in new octopus video.
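As a rough illustration of the windowing-plus-logistic-regression idea (with made-up numbers, not my actual features), assuming scikit-learn is available:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def window_features(signal, win=5):
    # each training sample is one window of per-frame blob distances
    return np.array([signal[i:i + win] for i in range(len(signal) - win)])

rng = np.random.default_rng(0)
# toy distance signal: the blobs are far apart, then touch at frame 200
dist = np.concatenate([rng.uniform(80, 120, 200), rng.uniform(0, 10, 200)])
labels = (dist[5:] < 20).astype(int)   # 1 = contact right after the window

X = window_features(dist)
clf = LogisticRegression().fit(X, labels)
```

Once trained on labeled clips, the same `clf.predict` call can flag likely contact points in new, unlabeled video.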

I’ve compiled my research results and created a poster which I presented at a University of Michigan symposium.

What’s next?

For me, Canada. Heading up to Montreal next week.

In general, my code is up on my GitHub and is completely open source, so anyone is welcome to make changes to it and take it in whatever direction they want; you don’t even have to use it on octopuses if you don’t want to.

Now for some musings…

I’m excited about computer vision. Historically, behavioral studies have involved a lot of humans watching animals, recording specific events (like when an animal eats a certain food), or interpreting their behavior. This is not only time-consuming but also prone to observer bias. In these studies, there needs to be redundancy: multiple people need to record the events, and that data then needs to be checked statistically to ensure that, on average, the interpretations are consistent between different observers. As you can see, it is challenging. Computer vision programs are changing this!
 
By taking humans out of the equation, you remove chances for bias, missed behaviors or interactions, or fudged results. Computer vision techniques can comb over hundreds of hours of video footage, quickly providing researchers with quantifiable results. There are certainly still behavioral studies that require human discretion (for instance, deciding whether a touch was affectionate or aggressive), but for many researchers, computer vision is the future.

I think there’s a lot still to be done with computer vision and behavioral analysis, and this summer of research was just me dipping my toes into the pool. There is much more data we could draw from the same video I was working with: arm position and length, how curled the octopuses’ arms were, maybe even their heart rate, with enough clever coding. As I continue onward in whatever field of STEM I find myself in next, I hope to keep throwing computational power at problems that don’t seem like they even need a computer, because who knows, maybe they do.

I’ll leave you with some boring philosophy. No one, not a single scientist, knows for certain what the next big thing is going to be. No one knows when or where the next technological revolution will happen; no one knows if the next world-changing invention will be made in a million-dollar Elon Musk laboratory or at 3 a.m. by a hungover student in their dorm room. So just know that when you read a blog post like this, about an 11-week undergrad project, even it has the chance to be something big; not all scientific breakthroughs are made by bearded dudes in lab coats. They could be made by you.


Visualizing Harmonic Convergence in Mosquito Mating

Wow, what a summer!!! I have some exciting news to report…I didn’t get bit by ONE mosquito all summer!!! Just kidding, my project is a little more exciting than that! I did it! I successfully put together and executed a project that I was a little iffy about back in May, and developed a new-found love for mosquitoes [fake news, don’t tell them I said that!]. I now like to be referred to as the mosquito whisperer, so if you see me on the streets, I will not respond to any other name.

But now, let’s get to the good stuff! Last time you heard from me, I was getting ready to start recording male/female pairs of mosquitoes. Now, I have about 7,000 audio and video recordings of these interactions, and I couldn’t be happier with the data I collected! The goal for this stage of my research was to observe whether or not mosquitoes actually communicate with one another to signal their interest in mating, or basically flirt. Below are the visual results of this from the previous study.

For my own recordings, I was able to detect the presence of these interactions by importing my audio files into a computer program called Audacity. Within this program, I could convert the sound file into a spectrogram that clearly showed me the frequencies produced by the mosquitoes in the recording. What the heck am I talking about, you ask?? Below is one example of a recording spectrogram that revealed a converging interaction!
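If you’d rather script it than click through Audacity, the same kind of spectrogram can be computed in a few lines; here’s a sketch with a synthetic tone standing in for a real recording:

```python
import numpy as np
from scipy.signal import spectrogram

fs = 8000                                # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
tone = np.sin(2 * np.pi * 600 * t)       # fake 600 Hz "wingbeat"

# f: frequency bins, times: window centers, Sxx: power per (freq, time)
f, times, Sxx = spectrogram(tone, fs=fs, nperseg=512)
peak = f[np.argmax(Sxx.mean(axis=1))]    # strongest frequency overall
```

Plotting `Sxx` against `times` and `f` gives you exactly the pink-and-blue picture below, just without the nice Audacity color scheme.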

But before I get into explaining the scary pink and blue stuff above, let’s talk about how I got these recordings in the first place; that’s the fun part (minus the 500 times mosquitoes got loose in the lab and attacked all of my friends…losers)! About midway through the summer, I changed some of my methods to make my procedure a little easier and reduce the number of casualties caused by pinning my little friends onto insect pins…yeah, they were not happy with me when they woke up from their nap to find themselves stuck to a wire…but you’ve got to do what you’ve got to do for science!!!!!

At the beginning of the summer, I was using insect wax (a yummy combination of beeswax and rosin) to fix these guys to their new home, but it turned out that the wax wasn’t strong enough to keep the mosquitoes in place when they woke up, and more often than not, they flew right off of the pin and straight for my face. So I decided to try pinning them with a tiny amount of superglue, and it worked magically!

The trick was to touch the superglued side of the pin to the mosquito’s thorax (pictured below) instead of the abdomen, which is where I was attempting to pin them when I was using the insect wax. When I tried to pin their abdomen with superglue, sometimes their wings would get stuck to the pin, making it a little difficult to get a good recording when their wings couldn’t move… Instead, the thorax provided a perfect amount of surface area for the pin without interfering with their antennae or wings at all.

Once I adapted this method, pinning them was a breeze! I kid you not, I could probably pin 20 mosquitoes within 30 seconds. You’re impressed, I know, I was too…Below are a few examples of my mad skills.

      

Don’t they look so comfortable and happy!? Next, I set up my recording stands, which were actually 3D printed ‘micro-manipulators’ designed by Backyard Brains! My company is so cool… These stands were used to fix the mosquitoes, with the help of some silly putty, for the duration of the experiment. They were perfect.

 

Now I was ready to record!! Below is a beautiful video of one of my experiments (I’m a little proud of myself, can you tell?) Make sure you turn on your sound!!

 

How creepy is that??? These noises will be burned into my brain for the rest of my life! But isn’t it also super cool? You can definitely hear the difference in sound between the two sexes, but can you hear when they begin converging?? Listen again.

If you’re thinking that it happens roughly 20 seconds into the video and lasts about 15, you’re right!! But just to be safe and make sure that the noises we were hearing were indeed interactions, I imported both files into MATLAB for a closer look.

Here you can see the two different frequencies of the female and male (though there is a bit of noise blocking the female’s fundamental frequency). The key to detecting an interaction is to look at the higher frequencies, up in the harmonics around 1200 Hz, because this is where convergence will normally occur. And lucky for us, it did! On camera! I was so excited I just about packed up and called it a day, but I really wanted to see some more interactions, so I pinned 8 million more mosquitoes and got down to business! In the end, I was able to successfully record, in both audio and video, 49 male/female pairs, observing interactions in 33 of them! That means, in the small sample size I had, the pairs communicated a love interest to one another 67% of the time! Gross, get a room!!!!
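Detecting convergence in code amounts to tracking each mosquito’s dominant harmonic over time and flagging the moments where the two tracks meet. A sketch with synthetic tones (the male sweeping up to meet the female at ~1200 Hz; real data would come from the two audio channels):

```python
import numpy as np
from scipy.signal import spectrogram

fs = 8000
t = np.arange(0, 2.0, 1 / fs)
female = np.sin(2 * np.pi * 1200 * t)                  # steady harmonic
male_freq = np.where(t < 1.0, 1050 + 150 * t, 1200.0)  # Hz over time
male = np.sin(2 * np.pi * np.cumsum(male_freq) / fs)   # sweeping harmonic

def harmonic_track(x, lo=1000, hi=1400):
    # dominant frequency per time window, restricted to the harmonic band
    f, times, S = spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
    band = (f >= lo) & (f <= hi)
    return f[band][np.argmax(S[band], axis=0)]

gap = np.abs(harmonic_track(female) - harmonic_track(male))
converged = gap < 30                                   # Hz tolerance
```

Runs of `True` in `converged` mark candidate interaction windows, which you can then double-check against the video.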

Nearing the end of my time in Ann Arbor, I finally finished recording, throwing in the towel for my beloved new hobby, and I was ready to start processing my data in the hope of making it a little more ‘Hollywood’, as Greg would say! Little did I know, this process wasn’t as appealing as I first thought, and on multiple occasions I considered playing with some more mosquitoes just to get away from the madness known as MATLAB. Lucky for me, I had a MATLAB expert living with me (hmmm…maybe that’s why we became best friends, since she couldn’t escape me anytime I opened my computer to work!). Christy helped me create the most magical, color-coded, satisfying, and all-around perfect video of not only my little buddies interacting, but also a spectrogram underneath that played in perfect sync with the original video recording! Brace yourselves…you will never see anything more beautiful in your life…

 

 

If you caught yourself replaying it multiple times, don’t fret, as you will catch me playing it periodically throughout the day just for fun. I’m not a nerd. But look, I was successful!!!

We also presented our research at a poster symposium at the University of Michigan!

So now is about the time where we wrap up!!! Ah, don’t make me leave!!!! But I am so happy with the work I produced this summer, and I feel so lucky that I got the chance to be part of this program. Greg Gage, you are the best boss I have ever had (don’t tell that to my dad, since he’s the only other boss I’ve had…) and I will be forever thankful for the impact you had on my life as not only a researcher but also an individual. I love you and your family to pieces, especially your little ones that taught me all about Peppa Pig and are still convinced my name is ‘Dirt’. Wonder where they got that…cough, cough, Christy. I already miss you guys, and I haven’t even left Ann Arbor yet! I’d also like to thank all of the staff at Backyard Brains (Stanislav, Zorica, Will, Zach, Caty, Catherine and John), who made my time here so worthwhile and comfortable. I never felt alone, even when my MATLAB would crash or when my fellow interns would shun me for letting some mosquitoes loose in the lab…

And last but not least, thank you to all of the BYB interns that made this summer one for the books! You will all be a part of my life forever, and I can’t wait to see where our lives take us once we leave each other this evening. You’re all such wonderful people, and I couldn’t have asked for better friends. Love you guys!!

Backyard Brains forever!!!! (Tattoo idea, interns?????)


When Computers Hear the Birds Sing…

Hey there! Zach here with a quick update on the Songbird Identification project. Since the last post, I’ve been hard at work creating a prototype device to listen for and record songbirds. I began by building a small circuit around a microphone and an amplifier chip. This acts as a sound recorder and also includes circuitry that triggers recording when sound is detected. Currently, the trigger threshold can be set using an adjuster on the board, and an LED lights up when sound is detected. This worked, but I needed more power and a dedicated board before I could begin field-testing…
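The trigger logic itself is simple enough to sketch in a few lines of Python (the board does this in firmware; the names and the quiet-hold behavior here are illustrative, not the exact circuit):

```python
def detect_events(samples, threshold, hold=3):
    """Return (start, end) sample-index pairs where the signal is loud.
    `hold` quiet samples are required before an event closes, so short
    pauses inside one song don't split the recording in two."""
    events, start, quiet = [], None, 0
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            if start is None:
                start = i          # sound just crossed the threshold
            quiet = 0
        elif start is not None:
            quiet += 1
            if quiet > hold:       # been quiet long enough: close event
                events.append((start, i - quiet + 1))
                start, quiet = None, 0
    if start is not None:          # still loud at the end of the buffer
        events.append((start, len(samples)))
    return events
```

Raising `threshold` is the software equivalent of turning the adjuster on the board: fewer false triggers, but quieter birds get missed.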

After proof of concept, I began work on the actual recording device using an Arduino M0 Pro microcontroller. I connected this to my circuit and an SD card reader and programmed the board to automatically record as a .wav file to the card when sound is detected. Having completed this initial prototype, I had the circuit turned into a PCB “shield” (an extender for the Arduino microcontroller). Once put together, the initial prototype boards looked like this:
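For the curious, the .wav container is simple enough that the whole layout fits in a few lines. Here’s a Python sketch of writing the same kind of 16-bit mono file the recorder saves to the SD card (the actual Arduino firmware builds the file byte-by-byte in C; this is just to show the format):

```python
import io
import math
import struct
import wave

def write_wav(dest, samples, rate=16000):
    """Write 16-bit mono PCM samples as a .wav (dest: path or file-like)."""
    with wave.open(dest, "wb") as w:
        w.setnchannels(1)    # mono microphone
        w.setsampwidth(2)    # 16-bit samples
        w.setframerate(rate)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

# a 0.1 s 440 Hz test tone standing in for a triggered recording
tone = [int(3000 * math.sin(2 * math.pi * 440 * i / 16000))
        for i in range(1600)]
buf = io.BytesIO()
write_wav(buf, tone)
```

Anything that reads .wav (Audacity, MATLAB, scipy) can then open the cards’ files directly, which is the whole point of picking the format.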

At this point, it was about time to do a field test, so I took the board out into the woods near the Nichols Arboretum to see how well the board would pick up birds in the wild. Check out the video below to see it in action. When you hear the birds, watch the red LED on my board… it flickers as the birds sing! This is the visual indicator that the board hears the bird and has started recording!

Bird Box in Action

The next step will be to develop a weatherproof housing for the board. This is an important step for two reasons. First, I need longer term test recordings to make sure that my hardware isn’t running into any issues. Second, the housing itself is a key element to this project, as the end goal is to deploy these boxes for days, maybe even weeks, at a time. Be on the lookout for an update once I have my enclosure built!

My 4th of July Parade Hat, ft. acrylic, laser-cut songbirds!

Thanks for following the project thus far and stay tuned for more!