## Archive for the ‘Science’ Category

### A Year of Personal Analytics

A couple years ago I read a blog post by Stephen Wolfram about personal analytics. In his post, he showed a number of different types of personal data he's collected over the past couple decades, as well as some interesting insights revealed by graphing the data. I was immediately fascinated, and felt a certain urgency to start recording different types of data about my own life. For the past year or so, I've kept track of a number of different things. Here I include analyses of my weight and my car's gas mileage.

### Weight

Last October I started weighing myself every day. My motivation was two-fold: I wanted to lose a little weight, and I was taking a signals class at the time, so I thought it would be interesting to see what kinds of insight I could gain from the data. Wanting to get as close to the raw data as I could, I opted to use Matlab scripts I wrote for analysis, rather than outsourcing the tracking to other software. It takes more time, but it's more fun and offers more detail. Here's my weight over the past year:

My target weight was 170 lbs. As just a rough way to keep track of how well I was doing, I fit a line to my recorded weights and then extrapolated how long it would take me to reach my target weight at that trajectory. As you can tell, it's going to be a while...
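The fit-and-extrapolate idea is simple enough to sketch in a few lines. I did the real analysis in Matlab, but here's the same idea in Python, with made-up, perfectly linear data standing in for the actual recordings:

```python
import numpy as np

# Made-up daily recordings standing in for the real Dropbox log (lbs).
# A perfectly linear series keeps the example deterministic; real data is noisy.
days = np.arange(60)
weights = 181.0 - 0.05 * days

# Fit a line to the recordings: weight ~= slope * day + intercept
slope, intercept = np.polyfit(days, weights, 1)

# Extrapolate: on what day does the trend line cross the 170 lb target?
target = 170.0
days_to_target = (target - intercept) / slope
print(f"trend: {slope:+.3f} lb/day, target reached around day {days_to_target:.0f}")
```

With noisy data the fitted slope (and therefore the projected date) shifts every time a new recording is added, which is part of what makes it a fun daily check.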

I took most of my weight recordings in the morning, after using the restroom, before eating, and without clothing on. Using my phone, I entered the weight into a text file on Dropbox. However, the actual time of day I weighed myself had some variance. As an example, here are the times of my recordings over the past month, based on the times previous versions of the text file were saved to Dropbox.

The average time I weighed myself during this month was 10:31 am, but with a standard deviation of 3 hours and 27 minutes(!). So there's some jitter in the time of day I recorded my weight. However, the average period between recordings was fairly close to 24 hours. That is shown in this next graph, a plot of the derivative of the first graph.

The average period between recordings was 23.91 hours, with a standard deviation of 5.04 hours. I am going to assume that's close enough to my desired sampling rate of once per day. For the 24 days that I was out of town or forgot to weigh myself, I took the recordings from the days before and after, assumed my weight changed linearly between them, and filled in the missing values.
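That gap-filling step is just linear interpolation. A minimal Python sketch (the day indices and weights here are hypothetical, not my actual data):

```python
import numpy as np

# Hypothetical log with two missed days (3 and 4): day index and weight (lbs)
recorded_days = np.array([0, 1, 2, 5, 6])
recorded_wts = np.array([180.0, 179.5, 179.8, 178.9, 179.1])

# Fill the gaps by assuming weight changed linearly between the
# recordings on either side of each missed day
all_days = np.arange(recorded_days[0], recorded_days[-1] + 1)
filled = np.interp(all_days, recorded_days, recorded_wts)
print(filled)
```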

Then, I wanted to see how much of my weight fluctuation was just noise in the recordings, rather than real changes. I detrended the recordings using a running average.

By doing this, I was able to discover that the noise in my weight is Gaussian distributed (Lilliefors test, p = 0.07), with a standard deviation of about 1 lb.
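The detrending step looks something like this in Python (synthetic data with ~1 lb of injected noise; the one-week window length is my assumption, since the post doesn't record what I actually used):

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365)
# Synthetic stand-in: a slow downward trend plus ~1 lb of Gaussian noise
weights = 180.0 - 0.02 * days + rng.normal(0.0, 1.0, days.size)

# Detrend with a running average (window length is an assumption here)
window = 7
trend = np.convolve(weights, np.ones(window) / window, mode="same")
residual = (weights - trend)[window:-window]  # drop edge effects

print(f"noise standard deviation: {residual.std():.2f} lb")
```

Note that subtracting a running average that includes the current sample slightly shrinks the apparent noise, so the residual standard deviation comes out a bit under the true 1 lb.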

I also wanted to find out if there was a cyclical nature to my weight. I took the Fourier transform of my weight over the past year to see if any obvious peaks would show me a frequency with which my weight fluctuated. Since I recorded my weight once a day, the Nyquist frequency is 0.5 cycles/day, meaning I wouldn't be able to see any cycles with periods shorter than 2 days--no peaks for defecation.

Interestingly, I don't really see anything here that suggests there's a cyclical nature to my weight. I thought there might be some regular increase on the weekends followed by a slow decrease during the week that would manifest itself, but I guess not. This could be because there's nothing to see, or it could mean I haven't sampled frequently enough, or it could mean I haven't recorded for a long enough period. Maybe with a few more years of data collection something will show up.
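For reference, here's what that spectral check looks like in Python when there *is* something to find: synthetic data with an artificial weekly cycle deliberately injected, so a clear peak appears at a 7-day period.

```python
import numpy as np

rng = np.random.default_rng(2)
days = np.arange(365)
# Synthetic weight with a deliberate 7-day cycle, to show what a
# weekend/weekday pattern would look like in the spectrum
weights = 180.0 + 0.8 * np.sin(2 * np.pi * days / 7) + rng.normal(0.0, 1.0, days.size)

# One sample per day -> frequencies up to the Nyquist limit of 0.5 cycles/day
spectrum = np.abs(np.fft.rfft(weights - weights.mean()))
freqs = np.fft.rfftfreq(days.size, d=1.0)  # cycles per day

peak_freq = freqs[spectrum.argmax()]
print(f"strongest cycle has a period of {1 / peak_freq:.1f} days")
```

In my actual data, no bin stood out from the noise floor like this.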

In any case, I will have to try replicating the behavior I had between days 100 and 150 (in the first graph) to see if I can lose weight at an accelerated rate.

### Gas Mileage

I have a 2.5 liter, 4-cylinder, 2002 Nissan Altima. A little over a year ago, I started using it for my daily commute to school. I was curious about how much it was costing me to drive to school every day, so I started keeping my receipts from the gas station and keeping track of the number of miles I drove between fill ups. Now I use a text file on Dropbox to keep track of this, too.

I'm not sure what happened during Q2-14 that made me get such terrible gas mileage. It likely was a recording error on my part. But my average gas mileage hovers around 25.3 miles/gallon, with a standard deviation of 2.7 miles/gallon. The average range of my car between fill ups is about 344 miles with a standard deviation of 69 miles. However, I've gotten up to 441.1 miles on one tank before, so it really depends on what kind of driving I've been doing. I normally drive it until my fuel gauge is pretty close to empty.
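The per-tank bookkeeping is straightforward: each receipt gives gallons purchased, and the odometer gives miles driven since the last fill up. A Python sketch with hypothetical numbers (not my actual receipts):

```python
# Hypothetical fill-up log: (miles driven since the last fill-up, gallons purchased)
fills = [(352.0, 13.9), (298.5, 12.1), (341.2, 13.4), (310.8, 12.6)]

# Miles per gallon for each tank, then the average and spread across tanks
mpgs = [miles / gallons for miles, gallons in fills]
mean_mpg = sum(mpgs) / len(mpgs)
spread = (sum((m - mean_mpg) ** 2 for m in mpgs) / len(mpgs)) ** 0.5
print(f"average {mean_mpg:.1f} mpg, std {spread:.1f} mpg")
```

This only works if every fill up fills the tank completely and none are skipped--a missed receipt inflates the next tank's apparent range, which is exactly the outlier problem described below.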

My driving schedule has me filling up about once every 16 days, with a standard deviation of 12 days, as shown in the next histogram. The outliers were probably times when I forgot to record a trip to the gas station, making it seem like I went a lot longer between fill ups than I really did.

As far as gas prices go, the price per gallon dipped down at the end of 2013, beginning of 2014. Hopefully it dips again soon...

### The Completo

Backyard Brains also sells a kit, called The Completo, with an adapter to turn your cell phone camera into a microscope, an amplifier (called SpikerBox), and a micro-manipulator, all of which folds up into a small toolbox. Your phone acts as both the microscope and the oscilloscope.

Using one of the SpikerBoxes, they can stimulate a detached cockroach leg with the electrical waveform of a song, causing it to flex to the beat of the music. I saw a demonstration of this at their booth this morning. It really works!

There are probably a million other experiments one could do with these tools, and I have no doubt it would inspire many kids and kids-at-heart to learn more about neuroscience.

All of their hardware and software are open, meaning other people can write apps for different or more complicated stimulation protocols, which I am also excited about. It never occurred to me that you could do such cool science with relatively inexpensive, simple tools.

What an exciting company!

So stop what you're doing. Right now. Go to https://backyardbrains.com/. Prepare to be dazzled.

### #SfN13: Day 3

Ed Boyden gave a fascinating lecture today about a variety of methods that he's worked on with a swath of other scientists and engineers. Each method was powerful on its own, and to hear about them all in one talk was quite amazing. Here's an overview of them:

• Automated patch-clamping. Whole-cell patch-clamp already exists, enabling simultaneous electrical, chemical and gene analysis of single cells. However, it takes a long time for trainees to learn and is very difficult to do in vivo. One of Ed's grad students automated this method. They apply square waves and continually monitor changes in impedance as a pipette is lowered into the brain region of interest, detecting nearby neurons. By watching how the impedance changes over a series of lowering steps, they can tell when a neuron is approaching and can reliably and automatically patch-clamp it. The equipment for automatic patching is available online at http://www.neuromatic-devices.com.
• Using this method, their group has looked at how different anesthetics affect single neurons in living animals.
• Another application is to look at how a synaptic connection changes during a learning task. They've been using a setup that uses 4 individually controlled pipettes at once to record from pre- and post-synaptic neurons simultaneously.
• He's also teamed up with Allen Institute for Brain Science to do integrative analysis of the different cell types in the brain, including their morphology, electrophysiology and molecular features.
• Automated animal surgeries. This works by recording the impedance of the drill. Since the skull has a high impedance, it is easy to tell when the drill has reached the inner edge of the skull during drilling, and the drill stops. Automating this method has helped them avoid bleeding during surgery. Also, by having a rapid way to make a grid of holes in the skull with this method, one might be able to do high-throughput in vivo pharmacological testing on many patches of cortex.
• Advanced electrode arrays. They are developing electrode arrays with 120 recording sites to listen to neurons, using algorithms to sort out which neurons each electrode is listening to. He's not sure how many neurons their electrode will be able to listen to, as they are still testing this.
• The arrays are tiny, with each electrode being only 10s of microns wide, reducing bleeding, morbidity, and mortality.
• Spike recording with DNA. This was probably my favorite. When spikes occur, calcium flows into the cell. This causes a DNA polymerase that's copying strands of DNA during the recording time to mutate its shape and make mistakes. By looking for mistakes in the copied DNA and comparing it with the rate of replication, they can have a record of spiking behavior. At the moment, this method is too slow to record neural data, but they are working on improving it.
• Of course he's also worked on optogenetic molecules, of which there are 3 classes: archaerhodopsins and bacteriorhodopsins (pass protons), halorhodopsins (pass chloride ions), and channelrhodopsins (pass protons and sodium, potassium, and calcium ions).
• Now they're working on making a red light sensitive rhodopsin. Because red light can penetrate to deeper tissue than other wavelengths, it is better suited for deep neural stimulation.
• Using two new types of rhodopsins, Chronos and Chrimson, they can activate neurons with blue and red light with no measurable overlap, allowing them to stimulate two different populations of cells individually.
• He also is working on ways to deposit cells in 3-D space to make 3-D cell cultures, in order to analyze, for instance, how connections form between neurons.

I was amazed at all the tools that Ed had to offer, and eagerly anticipate the ones still in development.

Which method do you think is the most promising or exciting? Which method do you think we should prepare for?