Studying the Aggressive Behavior of Octopodes

Oh hey there! Long time no see, why don’t you have a seat and hear what I’ve been up to since my last blog update.

If you were at the Ann Arbor 4th of July parade you might have seen me dressed up as a beautiful purple octopus (or maybe it was a squid? The costume was quite ambiguous). The costume was an elegant combination of recycled foam, acrylic paint, paper mache, and hope.

For the behavioral analysis part of my project I have decided to forgo maze solving, as forcing an octopus to run a maze over and over for data proves both more difficult than previously thought and honestly doesn’t seem that fun for the octopus itself.

Instead, I have decided to delve deeply into deciphering the ancient mystery of Octopus wrestling. Believe it or not, if left to their own devices, bimac octopuses absolutely love to have wrestling matches, pushing around their opponent with their tentacles to figure out who among them is truly the dominant bimac. They then take a short break (only a couple minutes at max) and go back at it again, in a rematch to see if the underdog can take a round off the reigning champion.

Before you get too worried for the safety of our small aquatic friends, know that I’m not forcing or aggravating them into wrestling and that it’s actually quite difficult to prevent them from tussling. Additionally, I keep a close watch on them to make sure no one is trying any dirty tricks like biting or going for the eyes.

This unique behavior prompted me to laser cut some custom housing arrangements for these 8-legged boxers. They are very territorial, so I had to construct some acrylic dividers with nylon mesh windows to promote water circulation. I inserted these into their aquarium to separate it into three individual tanks because I’d prefer they not fight unsupervised.

Next, to film the matches I constructed an acrylic wrestling chamber with a rack to hold the video camera for recording. This way they have an area to fight and my camera is guaranteed to give me the same angle of video every time.

I mount a GoPro facing directly down into the white, open-faced tank. This way I get identically framed footage every time, which is very important for later analysis.

One tank becomes three…

There are many intricacies to an octopus wrestling match, many behaviors and patterns that we can try to decompose and comprehend with the help of computation. They circle each other, they taunt their opponent with their curled tentacles, they sometimes even act almost coy towards each other. This is where I switch to a completely different animal, Python, to analyze the speeds, positions, angles, and even colorings of the octopuses.
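As a toy illustration of the kind of computation involved (a hypothetical sketch, not my actual analysis script), here is how per-frame distances and speeds could be derived once tracking has produced centroid coordinates for each octopus:

```python
import math

def pairwise_metrics(track_a, track_b, fps=30.0):
    """Per-frame distance between two tracked centroids and the
    speed of animal A, given lists of (x, y) positions."""
    distances, speeds_a = [], []
    for i in range(len(track_a)):
        ax, ay = track_a[i]
        bx, by = track_b[i]
        distances.append(math.hypot(bx - ax, by - ay))
        if i > 0:
            px, py = track_a[i - 1]
            speeds_a.append(math.hypot(ax - px, ay - py) * fps)
    return distances, speeds_a

# Toy example: octopus A moves right, octopus B stays put.
a = [(0, 0), (1, 0), (2, 0)]
b = [(4, 0), (4, 0), (4, 0)]
d, v = pairwise_metrics(a, b, fps=1.0)
print(d)  # [4.0, 3.0, 2.0]
print(v)  # [1.0, 1.0]
```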

Why won’t you Look me in the Eyes?

Our confidently wrestling octopuses seem to have a bit of a shy side when it comes to making eye contact with each other. As you can see in the two frames of footage recorded before a wrestling bout, the octopuses do not appear to be facing each other at all, and watching the footage confirms this: they approach one another by moving sideways, not with their tentacles leading the movement as one would expect.

This, however, changes drastically as the distance between them closes. As seen in the graph above, once the distance drops below a certain point, the angle between the two octopuses rapidly falls to zero (zero meaning they are directly facing each other).

This seems to indicate a certain “fight zone” or, dare I say, a “danger zone”, where if the distance between the two enters this zone they will rapidly spin around to face their opponent with tentacles at the ready. This is most curious since it’s not only octopuses that display this kind of behavior.
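One way to estimate where that “danger zone” begins is to find the largest distance at which the facing angle has already collapsed. This is a hypothetical sketch; the cutoff and the sample values are made up for illustration:

```python
def danger_zone_threshold(samples, angle_cutoff=15.0):
    """Estimate the largest inter-octopus distance at which the
    facing angle has already dropped below angle_cutoff degrees.
    samples is a list of (distance, angle_degrees) pairs."""
    facing = [dist for dist, angle in samples if angle < angle_cutoff]
    return max(facing) if facing else None

# Toy data: far apart -> large angles, close -> nearly facing.
samples = [(50, 80), (40, 70), (30, 60), (20, 10), (10, 2)]
print(danger_zone_threshold(samples))  # 20
```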

Link to Gorilla Fight: Keep an eye out for how they approach each other at a weird angle…

Even silverback gorillas tend to go into a fight sideways, spinning to face their opponent at the last moment of their approach, and black iguanas have been suggested to use eye contact and directness of approach to assess risk from an approaching animal (as described in the 1992 paper “Risk discrimination of eye contact and directness of approach in black iguanas” by Joanna Burger).

Octopuses: Maybe not as Bright as we Think

No, I’m not talking about their intelligence; of course the octopuses we have are plenty smart, even though they sometimes seem to forget what exactly to do with a crab we give them for food, deciding to instead stare at it for a while. I’m talking about their physical coloring, their chromatophores.

Their coloring is an extremely reliable way to know if you’re getting too close to them. Even when they’re settled in their tank, if you quickly approach the glass they will drastically darken their color, hoping you leave them alone.


When there’s a fair bit of distance between the competitors, they both appear to be a shade of light beige, but once that distance closes and we enter the “danger zone”, the one going on the offensive colors its tentacles a dark brown before entering the active wrestling bout.

In the pictures below we can identify the attacker and defender by their relative colorings, the bottom octopus flaring up with color in its first offensive, and then the top one flaring up in retaliation.
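A minimal sketch of how attacker and defender could be told apart by brightness alone (the pixel patches below are made-up toy data; real patches would be cropped from the GoPro frames):

```python
def mean_brightness(patch):
    """Mean grayscale value (0 = black, 255 = white) of a pixel patch."""
    total = sum(sum(row) for row in patch)
    count = sum(len(row) for row in patch)
    return total / count

def likely_attacker(patch_top, patch_bottom):
    """Flag the darker (lower-brightness) octopus as the attacker."""
    if mean_brightness(patch_top) < mean_brightness(patch_bottom):
        return "top"
    return "bottom"

# Toy patches: the bottom octopus has flared dark brown.
top = [[200, 210], [205, 195]]   # light beige
bottom = [[60, 70], [65, 55]]    # dark brown
print(likely_attacker(top, bottom))  # bottom
```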

I’m Bored Already, So What and What’s Next?

If we can confidently identify parallels between these octopuses and other non-cephalopod animals when it comes to approaching and commencing a fight, we might be left with a great assay tool to study the physiological and genetic influences on this behavior. Bimacs are quite ubiquitous and easy-to-care-for animals, so finding such a use for them would help make neuroscience a more accessible topic for audiences outside of research laboratories, since even a high-school student can take good care of a bimac octopus.

I’m now working on a program that uses convex hulls to draw a contour surrounding the octopus, in order to get a bearing on its relative size and tell us more about what it’s doing with its tentacles. A tightly curled-up octopus will have a very small contour, whereas a more freely spread-out one will take up a much larger area with its contour. This might prove to be interesting information when it comes to analyzing the initiation of the wrestling matches, where there is a lot of shape changing in the octopuses.
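Since that program is still in progress, here is a self-contained sketch of the idea in pure Python (no OpenCV; the silhouette points are toy data, not tracked octopus pixels):

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_area(hull):
    """Shoelace formula over the hull vertices."""
    n = len(hull)
    s = sum(hull[i][0] * hull[(i + 1) % n][1]
            - hull[(i + 1) % n][0] * hull[i][1] for i in range(n))
    return abs(s) / 2.0

# Toy silhouette: a unit square with one interior point.
pts = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
print(hull_area(convex_hull(pts)))  # 1.0
```

A curled-up octopus would yield a small hull area, a spread-out one a large area, which is exactly the signal we want around match initiation.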

Additionally, I’m hoping to gather enough data in the next couple of weeks to show real trends in the octopus behavior and run more general analyses on the full collection of vectors. This will let me make more confident statements about the overall behavior of the bimacs.

[Summer’16 Internship] The Dragonfly: Nature’s Guided Missile

The dragonfly can capture its prey mid-air with 95% accuracy. What biological equipment allows the dragonfly to perform this amazing feat? To examine this very question, I will be replicating Paloma T. Gonzalez-Bellido’s 2012 PNAS paper using DIY open source tools. Gonzalez-Bellido found that there are neurons running from the dragonfly’s compound eyes to its wings that help the dragonfly track and catch its prey with such accuracy. My version of this experiment will control a laser using a laser-light-show set-up (instead of an expensive custom-made projector) that shines a laser dot (representing a fly) as the stimulus. I will simultaneously record the activity of these target-selective neurons on open source bio-amplifiers as the dragonfly watches its “prey” move across a screen.

The goal of this experiment is to provide a way for amateur scientists to replicate professional research.


The dragonfly is nature’s guided missile. They can use their 360° visual span to swoop down and devour their prey mid-flight with a 95% kill rate. They can individually manipulate the trajectory of each wing to switch directions rapidly in mid-flight. Ant-Man? Spider-Man? They really should have Dragonfly-Man. To illustrate this insect’s abilities, I invite you to google “dragonfly catching fly slow motion video,” or something of that sort. You will be amazed.

Despite their stunning predatory accuracy, dragonflies are deceptively simple creatures. 300 million years ago, they were already buzzing around like little killer drones. The source of their accuracy is also deceptively simple: eight pairs of neurons (called TSDNs, for target-selective descending neurons) that run from the dragonfly’s visual system to its flight center help it encode the direction of its target (its prey) and track it with such incredible accuracy. Through examination of this surprisingly simple yet adept insect, it is my hope that my research can be used to further technology, such as building a faster, more accurate helicopter, and education, by providing a cheap, DIY method of reproducing a professional experiment.

In my project, I will be replicating and extending Paloma Gonzalez-Bellido’s 2012 paper, “Eight pairs of descending visual neurons in the dragonfly give wing motor centers accurate population vector of prey direction.” This paper studies the relationship between eight pairs of neurons and the dragonfly’s predatory abilities. The results of the paper show that these neurons, aptly named the target-selective descending neurons (TSDNs), encode a population vector that is strongly correlated with the position of the target (the fly). This means that these eight pairs of neurons are direction-oriented: each has a preferred direction and shows more activity when its direction is favored.
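The population-vector idea can be sketched in a few lines of Python. The preferred directions and firing rates below are made-up illustrative numbers, not data from the paper:

```python
import math

def population_vector(preferred_degrees, firing_rates):
    """Sum each neuron's preferred-direction unit vector weighted by
    its firing rate; return the decoded direction in degrees."""
    x = sum(r * math.cos(math.radians(d))
            for d, r in zip(preferred_degrees, firing_rates))
    y = sum(r * math.sin(math.radians(d))
            for d, r in zip(preferred_degrees, firing_rates))
    return math.degrees(math.atan2(y, x)) % 360

# Toy example: eight neurons tuned every 45 degrees; the 90-degree
# neuron fires hardest, its neighbors less, so the decoded
# direction comes out at ~90 degrees.
prefs = [0, 45, 90, 135, 180, 225, 270, 315]
rates = [0, 5, 20, 5, 0, 0, 0, 0]
print(round(population_vector(prefs, rates)))  # 90
```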

The Original Set-Up:

As previously mentioned, this project will be replicating an academic paper, “Eight pairs of descending visual neurons in the dragonfly give wing motor centers accurate population vector of prey direction.” In this paper, Gonzalez-Bellido used a modified projector to shine a black dot (representing prey) on a curved screen and measured the neural activity of the dragonfly. Below is Gonzalez-Bellido’s set-up:

The set-up in the original 2012 paper

My Set-Up:

In my set-up I will use a laser light show system instead of an expensive modified projector. The laser show is very replicable because it is cheap and made with materials that most makers have around. My laser light show includes two servos and an Arduino Uno. These servos have mirrors attached to them. As different voltages pass through the servos, the mirrors change position. The two servos, one moving in the x direction and one in the y, change the position of the laser. The laser and the two servos are connected to an Arduino Uno, making the position of the laser both reliable and easy to manipulate through Arduino software. Furthermore, a Spikerbox (Backyard Brains) is connected to two electrodes (one in the dragonfly’s abdomen acting as a ground, and one in the TSDN) to measure the neural activity of the target neurons and display it on a laptop. This set-up is pictured below:

My modified set-up
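For the curious, the geometry behind steering the dot is simple: rotating a mirror by theta deflects the reflected beam by 2*theta, so each servo only needs to turn half the desired beam angle. A hypothetical sketch (ignoring the small separation between the two mirrors):

```python
import math

def servo_angles(x, y, screen_distance):
    """Approximate mirror angles (degrees) needed to put the laser dot
    at offset (x, y) on a screen screen_distance away. A mirror
    rotated by theta deflects the beam by 2*theta, so each servo
    turns half the desired beam angle."""
    ax = math.degrees(math.atan2(x, screen_distance)) / 2.0
    ay = math.degrees(math.atan2(y, screen_distance)) / 2.0
    return ax, ay

# A dot 10 cm right of center on a screen 10 cm away needs a
# 45-degree beam deflection, so the X mirror turns 22.5 degrees.
print(servo_angles(10, 0, 10))  # (22.5, 0.0)
```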

If all works, my experiment will provide a way for anyone to replicate professional laboratory neuroscience!


First draft for the final laser show: servo_laser_draft1.ino

Below are the complete instructions for this experiment. If you want to see the whole process (every step and attempt along the way), you can check out the following logs:


Essential materials:

(from top left to bottom right)

1. Spikerbox pro (with appropriate cord to plug into laptop and electrodes)

2. breadboard

3. Arduino Uno and jumper cables/ wires

4. Mirrors (two)

5. servos (two)

6. USB cord (for Arduino to laptop)

7. laser (not pictured)


Non-essential materials:

1. materials for stand and base for laser light show (I used wood and acrylic)

2. supports for mirror (I used wooden sticks and popsicle sticks)

3. microscope (not needed, just helpful)

4. insect wax and popsicle sticks

5. thread and silly putty (for the neural recordings)

6. paper (for the curved screen)

7. solder and soldering iron

optional: Neuron spikerbox (pictured above)

Set up the laser light show:

Gather the following materials for the next couple of steps:

1. servos (two)

2. mirrors (two)

3. jumper cables

4. breadboard

5. Arduino

In this step we will connect the servos to the Arduino so that we can control their movement through code.

This Fritzing diagram shows how one servo is hooked up; do the same for the other servo.

These are the servos I used:

Each servo has a brown, a yellow, and a red cable. The brown and red are for ground and power, respectively, and the yellow is for the signal. Plug the yellow into Arduino pin 10. Do the same for the other servo, except plug its yellow into pin 11.

Try this test code:

  #include <Servo.h>

  const int servo = 10;   // The first servo motor pin
  const int servo2 = 11;  // The second servo motor pin
  Servo myservo;          // create servo objects to control the servos
  Servo myservo2;
  int servoAngle = 0;     // servo position in degrees

  void setup() {
    myservo.attach(servo);    // attaches the servos to their pins
    myservo2.attach(servo2);
  }

  void loop() {
    for (servoAngle = 0; servoAngle < 20; servoAngle++) {  // move the micro servo from 0 to 20 degrees
      myservo.write(servoAngle);
      myservo2.write(servoAngle);
      delay(15);
    }
    myservo.write(0);   // return to zero degrees
    myservo2.write(0);
    delay(1000);        // pause for one second
  }

Next step is to glue the mirrors onto the servos. I used wooden sticks and mirror pieces that I cut from a large hand mirror.

Try to get the mirrors flush on the wooden sticks.

Set up the laser. Take off the back of the laser and remove the batteries. Inside should be a spring. Attach one alligator clip to the spring (ground) and one to the outside metal (positive). Connect the ground to ground, and the power to pin 12 on the Arduino.

Test with this test code:

  #include <Servo.h>

  const int servo = 10;     // The first servo motor pin
  const int servo2 = 11;    // The second servo motor pin
  const int laserPin = 12;  // The laser pin
  Servo myservo;            // create servo objects to control the servos
  Servo myservo2;
  int servoAngle = 0;       // servo position in degrees

  void setup() {
    pinMode(laserPin, OUTPUT);  // set up the laser pin
    myservo.attach(servo);      // attaches the servos to their pins
    myservo2.attach(servo2);
  }

  void loop() {
    digitalWrite(laserPin, HIGH);  // turn the laser on
    for (servoAngle = 0; servoAngle < 20; servoAngle++) {  // move the micro servo from 0 to 20 degrees
      myservo.write(servoAngle);
      delay(15);
    }
    digitalWrite(laserPin, LOW);  // turn the laser off
    myservo.write(0);             // return to zero degrees
    delay(1000);                  // pause for one second
  }

Create a screen. I made mine out of paper, some more wooden sticks and a little bit of string. Do not glue the screen down quite yet. Position the servos (with mirrors attached), the laser, and the screen so that the laser reflects off both mirrors and the beam can hit all parts of the screen. This is a very difficult part of the experiment and needs patience — but you will feel so accomplished when it actually works!

Once you have the laser light show set up, you can work on recording the events. Use a Spikerbox pro (a product from Backyard Brains) to do this.

Spikerbox pro:

Connect the Spikerbox to your laptop using the appropriate cable (green cable in photo) and open Spikerecorder (a free app by Backyard Brains). Next to where you plug the cable into the Spikerbox are eight holes. Using a multimeter, find which is ground and which are events. The events, when connected to power, will show up as a numbered bar on the Spikerecorder screen. Solder wires to ground, event 1, and event 2.

Disconnect the laser power and ground and place them on the breadboard. Connect the Spikerbox like below:

Once you have everything connected, test with code.

The way to turn the laser on:

digitalWrite(laserPin, HIGH);

assuming you have set up laserPin to be whatever pin you plugged the laser power into (for me it is 12).

Since we want double events, connect one end of a resistor to a pin on the Arduino, and the other end of the resistor to the Spikerbox pro event 2 (not pictured in the fritz).

Then add the code:

digitalWrite(laserPinOff, HIGH);

every time the laser is off, and write LOW every time it is on.
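Later, during analysis, those event channels let you reconstruct exactly when the laser was on. A hypothetical Python sketch, assuming event 1 timestamps mark laser-on and event 2 timestamps mark laser-off:

```python
def laser_on_intervals(on_times, off_times):
    """Pair each laser-on timestamp (event 1) with the next laser-off
    timestamp (event 2) to get (start, stop) intervals in seconds."""
    intervals, j = [], 0
    for t_on in on_times:
        # Skip any off events that happened before this on event.
        while j < len(off_times) and off_times[j] <= t_on:
            j += 1
        if j < len(off_times):
            intervals.append((t_on, off_times[j]))
            j += 1
    return intervals

# Toy event trains from the two Spikerbox event channels.
print(laser_on_intervals([1.0, 5.0], [3.0, 8.0]))  # [(1.0, 3.0), (5.0, 8.0)]
```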

Now you have the entire laser set up together!

Now it is time for the neural recordings. For now, we will test without the laser set-up, just to ensure that we can locate the TSDNs. First, catch the dragonflies! They love the vegetation by large bodies of water. I found a lot by a field near a river. Take a butterfly net and cage and catch as many as you can. My suggestion is to track them to where they are perching, and then try to net them from behind. Dragonflies are hard to catch when they are flying.

After you have caught the dragonflies, place them in the refrigerator. This will anesthetize them.

Get these materials ready:

1. Neuron Spikerbox (optional)

2. Spikerbox pro, cable and electrodes

3. popsicle stick

4. microscope (optional but helpful)

5. silly putty

6. thread

7. micromanipulator (optional but helpful)

Anesthetize the dragonflies and heat up the insect wax. When it is melted, place some on a popsicle stick and then place an anesthetized dragonfly onto the stick. Its dorsal side should be on the stick and its wings should be free of the wax.

Set up the dragonfly like above.

Though I have the electrodes plugged into the Neuron Spikerbox in this picture, in the final version I used a Spikerbox pro so that I can also record double events. I just use the Neuron Spikerbox as a stand to put the dragonfly on (this is why it is optional).

Using a microscope, locate the TSDN and pierce with an electrode.

The micromanipulator (the orange 3D-printed manipulator in the picture before the one above) helps keep this electrode in place. Above is a picture of the anesthetized dragonfly under the microscope while I place the electrode in the TSDN.

The TSDN are located here

Turn on the Spikerbox. You can either plug the Spikerbox into the computer or just listen through a speaker. You will hear an electronic tap: these are the action potentials firing! Wave a piece of paper with a black dot on it or make another movement, and you should hear these electronic taps increase in frequency in response to your movement!

Now, test it all together! This is the next step for me too, along with writing code that calculates where the laser beam is as a function of time. But once these two steps are completed, you can calculate and hopefully affirm Paloma T. Gonzalez-Bellido’s population vectors for each of the TSDNs!
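As a starting point for that laser-position code, here is a hypothetical Python sketch that turns a schedule of mirror-angle commands into dot positions on the screen, using the same reflection rule (a mirror angle of theta deflects the beam by 2*theta; all numbers are illustrative):

```python
import math

def beam_positions(angle_schedule, screen_distance):
    """Convert a schedule of (time_s, mirror_angle_degrees) commands
    into (time_s, x_position) points on a flat screen. A mirror
    angle of theta deflects the beam by 2*theta."""
    return [(t, screen_distance * math.tan(math.radians(2 * a)))
            for t, a in angle_schedule]

# Mirror steps from 0 to 22.5 degrees over one second, screen 10 cm
# away: the beam sweeps from the center out to 10 cm.
schedule = [(0.0, 0.0), (0.5, 11.25), (1.0, 22.5)]
for t, x in beam_positions(schedule, 10):
    print(t, round(x, 2))
```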

By Patricia Aguilar

[Summer’16 Internship] The Dragonfly: Building a servo laser set-up, prototype 2

I built a servo laser set-up (prototype 2) that allows me to control the movement of the laser beam across a curved screen using an X direction servo and a Y direction servo.

Below is the final result of prototype 2:

The problems:

The hot glue on the mirrors makes the mirrors tilt at an angle, and not line up exactly with the wooden sticks.

This deflects the laser beam so it does not travel perfectly straight. Furthermore, my set-up was altered many times to get the laser, the mirrors, and the screen to line up. Thus, the set-up is overall very messy and filled with scraps of old glue, tape, and wood.

By Patricia Aguilar