
Quicker and Smarter: Neurorobot on the hunt

Though it feels like just last week that I landed here in Michigan, it's almost time for me to head back home. My work on the project here is wrapping up, and while there's still so much to be done, I'm confident that I'm leaving behind a good framework.

These past few weeks have been so eventful! I was home sick for a week with the flu, which sucked, but thankfully I could just take my work home with me. The Ann Arbor Art Fair (pictured above) was incredible; half of downtown was covered in tents full of beautiful creations from all around the country. Completely unrelated, half my paycheck now seems to be missing.

If you were following my last blog post, you may remember that one of the biggest hurdles I had to overcome was video delay between the bot and my laptop; big delay meant bad reaction speed meant sad bot.

Through changing libraries, multithreading, and extensive testing, we've now gotten the delay to just under 0.4 seconds, less than half of what it was with the old versions of the code! This may not seem too exciting, but it means the bot now reacts to things much more smoothly and calmly than before, as demonstrated in this video of it hunting down a ferocious bag of chips:
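For anyone curious about the multithreading part, one common trick (sketched below in Python with OpenCV; this is an illustration of the idea, not the Neurorobot's exact code) is to grab frames on a background thread and keep only the newest one, so the processing loop never waits on a backlog of stale frames:

```python
import threading
import cv2

class LatestFrameReader:
    """Grab frames on a background thread and keep only the newest one,
    so the main loop always processes a fresh frame and old ones drop."""

    def __init__(self, source):
        self.cap = cv2.VideoCapture(source)
        self.lock = threading.Lock()
        self.frame = None
        self.running = True
        threading.Thread(target=self._reader, daemon=True).start()

    def _reader(self):
        while self.running:
            ok, frame = self.cap.read()
            if not ok:
                continue
            with self.lock:
                self.frame = frame  # silently replaces any unread frame

    def read(self):
        with self.lock:
            return None if self.frame is None else self.frame.copy()

    def stop(self):
        self.running = False
        self.cap.release()
```

The key design choice is that unread frames are thrown away instead of queued: you lose a few frames, but what you do process is always as close to "now" as the link allows.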

Another new accomplishment is neurons! Spiking neurons, to be specific: nothing like your regular generic-brand neural net neurons. These ones are much prettier (using the Izhikevich model) and react to stimuli very organically, making graphs that resemble what you'd see on your lab oscilloscope:
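If you're wondering what an "Izhikevich neuron" looks like in practice, here's a minimal, self-contained Python sketch of a single regular-spiking neuron (an illustration of the published model, not the Neurorobot's actual code):

```python
import numpy as np
import matplotlib.pyplot as plt

# Regular-spiking parameters from Izhikevich (2003)
a, b, c, d = 0.02, 0.2, -65.0, 8.0

dt = 0.5                       # time step in ms
steps = int(1000 / dt)         # simulate 1 second
v, u = -65.0, b * -65.0        # membrane potential and recovery variable
trace = []

for i in range(steps):
    I = 10.0 if i * dt > 200 else 0.0   # step current injected at t = 200 ms
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30:                # spike: record the peak, then reset
        trace.append(30.0)
        v, u = c, u + d
    else:
        trace.append(v)

plt.plot(np.arange(steps) * dt, trace)
plt.xlabel("time (ms)")
plt.ylabel("membrane potential (mV)")
plt.show()
```

Run it and you get exactly the kind of oscilloscope-style trace described above: flat resting potential, then a train of spikes once the input current switches on.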

More importantly, these neuron models don't just look good, they also behave really well. As an example, here are just two neurons (listen for the different clicking tones) connected to the motors of the Neurorobot, one activated and one inhibited by the ultrasonic distance sensor:

With just these two neurons, and fewer than twenty lines of code, the Neurorobot can already display cool exploration behaviours, avoiding getting stuck on walls and objects as best it can. Neurons are powerful; that's why creatures like the roundworm can survive with just over 300 of them: it doesn't take a lot to do a lot.
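To give a feel for how little machinery this takes, here's a hedged Python sketch of a two-neuron obstacle controller in the same spirit. The sensor and motor calls (read_distance_cm, set_motors) are hypothetical placeholders, and the current scaling and wheel mapping are my own assumptions, not the Neurorobot's actual code:

```python
import random
import time

def izhikevich_step(state, I, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Advance one Izhikevich neuron by dt ms; return (new_state, spiked)."""
    v, u = state
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30:
        return (c, u + d), True
    return (v, u), False

def read_distance_cm():
    # Placeholder: swap in the robot's real ultrasonic sensor reading.
    return random.uniform(5, 100)

def set_motors(left, right):
    # Placeholder: swap in the robot's real motor command.
    pass

turn_neuron = (-65.0, -13.0)   # driven harder as an obstacle gets closer
go_neuron = (-65.0, -13.0)     # driven harder as the path ahead gets clearer

while True:
    nearness = max(0.0, 1.0 - read_distance_cm() / 60.0)   # 0 = far, 1 = close
    turn_neuron, turn_spike = izhikevich_step(turn_neuron, 20 * nearness)
    go_neuron, go_spike = izhikevich_step(go_neuron, 20 * (1 - nearness))
    # Spikes from the "go" neuron push both wheels forward; spikes from the
    # "turn" neuron hold one wheel back so the bot veers away from obstacles.
    set_motors(left=80 if (go_spike or turn_spike) else 0,
               right=80 if (go_spike and not turn_spike) else 0)
    time.sleep(0.0005)
```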

Here’s another example, in which position neurons that are more sensitive to left and right areas of the image space are given the task of finding something big and orange:

Notice how when the cup is to the left of the bot’s camera, the blue neuron spikes; whereas when it drifts to the right, the green and red neurons start spiking.
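For the image-processing side, here's a rough sketch of how "big and orange" can be turned into drive for left- and right-preferring position neurons with OpenCV. The HSV bounds for orange and the simple left/right split are assumptions for illustration, not the project's actual detector:

```python
import cv2
import numpy as np

def orange_drive(frame_bgr):
    """Return (left, right) drive from the fraction of 'orange' pixels
    in each half of the frame; the HSV bounds are a rough guess for orange."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([5, 120, 120]), np.array([20, 255, 255]))
    h, w = mask.shape
    left = float(mask[:, : w // 2].mean()) / 255.0
    right = float(mask[:, w // 2 :].mean()) / 255.0
    return left, right

# These values can then be scaled and injected as input currents into the
# left- and right-preferring position neurons, so whichever side the cup
# sits on is the side whose neurons spike.
```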

There's still much optimization that can be done to make the Neurorobot think and react faster. Eventually the entire camera code is going to be rewritten from scratch, along with better visualisation of what's going on under the hood. Lights, speakers, a microphone, and a good user interface are all coming soon to a Backyard Brains near you!

Christopher Harris’s Neurorobot prototypes already have a drag-and-drop interface for putting together neurons:

The real goal of this project isn't to have students write a thousand lines of code for the robot to do something interesting, but for them to learn and explore the wonderful behaviour of neurons just by playing around with them: changing parameters, connecting their synapses to different components, and seeing if they can get the bot to wiggle when it sees a shoe, or beep and back away from anything that looks like a vase. And, in keeping with Backyard Brains' tenets, seeing what they will discover.


Neurorobot Video Transmission In Progress

Hey everybody, it's your favourite Neurorobot project once again, back with more exciting updates! I went to my first knitting lesson this week at a lovely local cafe called Literati, and attended the Ann Arbor Fourth of July parade dressed as a giant eyeball with keyboards on my arms (I meant to dress as "computer vision" but I think it ended up looking more like a strange Halloween costume).

Oh wait… Did you want updates on the Neurorobot itself?
Unfortunately it's been more snags and surprises than significant progress; one of the major hurdles we have yet to overcome is the video transmission itself. (I did, however, put huge googly eyes on it.)

The video from the Neurorobot has to first be captured and transmitted by the bot itself, then sent flying through the air as radio waves, received by my computer, assembled back together into video, loaded into program memory, processed, and only then can I finally give the bot a command to do something. All parts of this process incur delays, some small, some big, but the end result so far is about 0.85 seconds.

(A demo of how I measure delay: the difference between the stopwatch in the bot recording and the one running live on my computer.)
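If you want to reproduce this kind of measurement yourself, here's a rough OpenCV sketch of the idea: display a live millisecond counter on screen, point the robot's camera at it, and compare the counter visible in the received frame with the live one. The capture source here is a generic placeholder, not the Neurorobot's real stream:

```python
import time
import cv2
import numpy as np

# Placeholder source: swap in the robot's actual video stream.
cap = cv2.VideoCapture(0)
t0 = time.perf_counter()

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Live stopwatch window: aim the robot's camera at this window.
    clock = np.zeros((120, 420, 3), dtype=np.uint8)
    elapsed_ms = int((time.perf_counter() - t0) * 1000)
    cv2.putText(clock, f"{elapsed_ms} ms", (10, 80),
                cv2.FONT_HERSHEY_SIMPLEX, 2, (255, 255, 255), 3)
    cv2.imshow("live stopwatch", clock)

    # Received video window: the stopwatch visible inside this frame lags
    # the live one by the end-to-end delay.
    cv2.imshow("received video", frame)

    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```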

Human perception is a finicky subject; in website and application design, anything up to about 100 ms of delay is typically considered "instantaneous," meaning the user won't send you angry emails about how slow a button is to click. At 0.85 seconds, however, even if you show the robot a cup or a shoe and tell it to follow it, the object may well leave its view before the bot has had a chance to react. The user then has a hard time seeing the correlation between showing the object and the bot moving towards it, and starts to question whether it's actually doing anything at all.

Unfortunately the protocol the robot's wifi module uses to send video to the laptop isn't easy to figure out, but we've made sizable progress: the transmission delay is now down to 0.28 seconds. The code that does this, though, is three different applications all "duct-taped" together, so there's still a little room for improvement.

I hope to have much bigger updates for my next blogpost, but for now here’s a video demo of my newest mug tracking software.

(Previous update: https://blog.backyardbrains.com/2018/06/neurorobot-on-wheels/)


The 2017 Summer Fellowship Concludes

Over 11 sunny Ann Arbor weeks, our research fellows worked hard to answer their research questions. They developed novel methodologies, programmed complex computer vision and data processing systems, and compiled their experimental data for poster, and perhaps even journal, publication. But, alas and alack… all good things must come to an end. Fortunately, in research, the end of one project is often the beginning of the next!

Some of the fellows intend to continue working on the research they began here while they're away, and many of these projects will be continued next summer! Definitely expect to hear updates from Nathan's EEG Visual Decoding project and Joud's Sleep Memory project. Additionally, two of the projects will continue over the next few months: Zach's Songbird Identification and Shreya's Electric Fish Detector projects will run through to December!

Meet the Fellows, See the Projects

The fellows are off to a great start! Check out their blog posts introducing their projects:

Progress

The team has been working hard to bring their projects to life. Check out these blog posts on their rig construction and data collection efforts!

Conclusions

Our fellows experienced the peaks and valleys of research this summer, but they all came out on top! Check out their final posts for their results, posters, and other details!

Continuations…

A few of our fellows are staying on through this next semester for longer-term development projects! Zach will be back working with his team on the Songbird Identification Device project, and Shreya will be working through to December on the Electric Fish Detector project. Expect updates from them soon!