Neuron Physics: A process of learning

A Year of Personal Analytics
Thu, 02 Oct 2014 17:42:29 +0000

A couple of years ago I read a blog post by Stephen Wolfram about personal analytics. In his post, he showed a number of different types of personal data he's collected over the past couple of decades, as well as some interesting insights revealed by graphing the data. I was immediately fascinated, and felt a certain urgency to start recording different types of data about my own life. For the past year or so, I've kept track of a number of different things. Here I include analyses of my weight and my car's gas mileage.


Weight

Last October I started weighing myself every day. My motivation was two-fold: I wanted to lose a little weight, and I was taking a signals class at the time, so I thought it would be interesting to see what kinds of insight I could gain from the data. Wanting to get as close to the raw data as I could, I opted to use Matlab scripts I wrote for analysis, rather than outsourcing the tracking to other software. It takes more time, but it's more fun and offers more detail. Here's my weight over the past year:


My target weight was 170 lbs. As just a rough way to keep track of how well I was doing, I fit a line to my recorded weights and then extrapolated how long it would take me to reach my target weight at that trajectory. As you can tell, it's going to be a while...
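The line-fit extrapolation works roughly like this (a Python sketch with hypothetical weights; my actual analysis was done in Matlab scripts):

```python
# Sketch of the line-fit extrapolation: fit a line to daily weights and
# solve for the day the line crosses the target. Hypothetical data, not
# the actual recordings.

def fit_line(days, weights):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(days)
    mx = sum(days) / n
    my = sum(weights) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(days, weights))
             / sum((x - mx) ** 2 for x in days))
    return slope, my - slope * mx

def days_to_target(days, weights, target=170.0):
    """Extrapolate the fitted line to the target weight (day index)."""
    slope, intercept = fit_line(days, weights)
    if slope >= 0:
        return float("inf")  # not trending toward the target
    return (target - intercept) / slope

# Hypothetical: starting at 180 lbs, losing 0.02 lbs/day
days = list(range(10))
weights = [180.0 - 0.02 * d for d in days]
# days_to_target(days, weights) -> about 500 days to reach 170
```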


I took most of my weight recordings in the morning: before eating, without clothing, and after using the restroom. Using my phone, I entered the weight into a text file on Dropbox. However, the actual time of day I weighed myself varied. As an example, here are the times of my recordings over the past month, based on when previous versions of the text file were saved to Dropbox.


The average time I weighed myself during this month was 10:31 am, but with a standard deviation of 3 hours and 27 minutes(!). So there's some jitter in the time of day I recorded my weight. However, the average time period between recordings was fairly close to 24 hours. That is shown in this next graph, a plot of the derivative of the first graph.


The average time period between recordings was 23.91 hours, with a standard deviation of 5.04 hours. I am going to assume that's close enough to my desired sampling rate of 1/day. On the 24 days that I was out of town or forgot to weigh myself, I took the days before and after, assumed my weight changed linearly between them, and filled in the values.
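The gap-filling step can be sketched like this (hypothetical weights; `None` marks a missed day, and the sketch assumes the series starts and ends with a real recording):

```python
# `None` marks a missed day; each gap is filled by linear interpolation
# between its neighboring recordings. Assumes the series starts and ends
# with a real recording. Hypothetical weights for illustration.
def fill_gaps(weights):
    filled = list(weights)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            j = i
            while filled[j] is None:
                j += 1
            lo, hi = filled[i - 1], filled[j]
            step = (hi - lo) / (j - i + 1)
            for k in range(i, j):
                filled[k] = lo + step * (k - i + 1)
            i = j
        i += 1
    return filled

fill_gaps([180.0, None, None, 177.0])  # -> [180.0, 179.0, 178.0, 177.0]
```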

Then, I wanted to see how much of my weight fluctuation was just noise in the recordings, rather than real changes. I detrended the recordings using a running average.


By doing this, I was able to discover that the noise in my weight is Gaussian distributed (Lilliefors test, p = 0.07), with a standard deviation of about 1 lb.
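The detrending step amounts to subtracting a centered running average and examining the spread of what's left (a Python sketch on synthetic data; the actual analysis, including the Lilliefors test, was done in Matlab):

```python
import statistics

def detrend(weights, window=7):
    """Subtract a centered moving average (window must be odd);
    returns the residuals for the interior points."""
    half = window // 2
    residuals = []
    for i in range(half, len(weights) - half):
        trend = sum(weights[i - half:i + half + 1]) / window
        residuals.append(weights[i] - trend)
    return residuals

# On a pure linear trend the residuals are ~0, so any spread left
# over on real data is noise.
line = [180.0 - 0.05 * d for d in range(30)]
resid = detrend(line)
noise_sd = statistics.pstdev(resid)  # ~0 here; ~1 lb on the real data
```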


I also wanted to find out if there was a cyclical nature to my weight. I took the Fourier transform of my weight over the past year to see if any obvious peaks would reveal a frequency at which my weight fluctuated. Since I recorded my weight once per day, the Nyquist frequency is 1/(2 days), meaning I wouldn't be able to see any frequencies higher than that, i.e., fluctuations with periods shorter than two days--no peaks for defecation.
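A plain discrete Fourier transform is enough to sketch this kind of spectral check (synthetic daily series with an artificial weekly wobble, not the real weights):

```python
import cmath
import math

def dft_magnitudes(x):
    """|DFT| at frequencies k/N cycles per sample, for k = 0..N//2.
    With one sample per day, k = N//2 is the Nyquist frequency,
    i.e. a period of two days."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2 + 1)]

# Synthetic daily series with a weekly cycle
n = 70
x = [math.sin(2 * math.pi * t / 7) for t in range(n)]
mags = dft_magnitudes(x)
peak_k = max(range(1, len(mags)), key=lambda k: mags[k])
# peak at k = 10, i.e. 10 cycles / 70 days = one cycle per week
```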


Interestingly, I don't really see anything here that suggests there's a cyclical nature to my weight. I thought there might be some regular increase on the weekends followed by a slow decrease during the week that would manifest itself, but I guess not. This could be because there's nothing to see, or it could mean I haven't sampled frequently enough, or it could mean I haven't recorded for a long enough period. Maybe with a few more years of data collection something will show up.

In any case, I will have to try replicating the behavior I had between days 100 and 150 (in the first graph) to see if I can lose weight at an accelerated rate.

Gas Mileage

I have a 2.5 liter, 4-cylinder, 2002 Nissan Altima. A little over a year ago, I started using it for my daily commute to school. I was curious about how much it was costing me to drive to school every day, so I started keeping my receipts from the gas station and keeping track of the number of miles I drove between fill ups. Now I use a text file on Dropbox to keep track of this, too.


I'm not sure what happened during Q2-14 that made me get such terrible gas mileage. It likely was a recording error on my part. But my average gas mileage hovers around 25.3 miles/gallon, with a standard deviation of 2.7 miles/gallon. The average range of my car between fill ups is about 344 miles with a standard deviation of 69 miles. However, I've gotten up to 441.1 miles on one tank before, so it really depends on what kind of driving I've been doing. I normally drive it until my fuel gauge is pretty close to empty.


My driving schedule has me filling up about once every 16 days, with a standard deviation of 12 days, as shown in the next histogram. The outliers were probably times when I forgot to record a trip to the gas station, making it seem like I went a lot longer between fill ups than I really did.


As far as gas prices go, the price per gallon dipped down at the end of 2013, beginning of 2014. Hopefully it dips again soon...


So what does my daily commute cost? With gas costing about $3.50/gallon, and with my car getting about 25 miles/gallon, the 20-mile round trip to school is costing me about

\frac{20\ \text{mi} \times \$3.50/\text{gal}}{25\ \text{mi/gal}} = \$2.80

And that's just the cost of gas. Ouch!
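The back-of-the-envelope arithmetic above fits in a one-line helper:

```python
# All numbers are the rough figures from the text.
def commute_cost(miles, price_per_gal, mpg):
    """Gas cost of a trip of the given length."""
    return miles * price_per_gal / mpg

cost = commute_cost(20, 3.50, 25.0)  # $2.80 per round trip
```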

What's the point?

Looking at different data from my life lately was a lot of fun. I got to play around with the data and even learned some new things. However, my behaviors haven't really changed much since I started keeping track of these two variables. I eat a little bit healthier, but not so much that I have shed some extra pounds. I'd say my exercise rate has stayed about the same. I also don't really drive more conservatively. I've heard that performance measurement leads to improvement, but when performance is measured and reported, the rate of that improvement accelerates. Maybe I need to start reporting my performance. I guess this blog post is a good way to start.

#SfN13: Day 5
Thu, 14 Nov 2013 06:59:37 +0000

Today was the last day of SfN 2013. Posters came down, airplanes took off, and tears were shed as we all embraced and listened to Green Day's "Good Riddance" one last time together.

As a first timer at the meeting, I was pleasantly surprised at how useful it was. Most conferences I've been to previously were smaller, and as a result, there weren't many others working on research related to my own. This past week, on the other hand, gave me many opportunities to talk with others who were working on things very closely related to what I am working on. I learned a ton and am excited to implement many of the things I learned into my own research. It's been a real boost.

There are also some things I would change. Here are some tips I've thought of over the past few days:

  • Be very picky about what you put on your planner to see. There is not enough time to see all the things you might be interested in, and even if there was, you only have so much physical and mental energy each day.
  • Once you narrow down your list of what you plan on seeing, read the abstracts before you arrive at the conference floor. It will save you awkward time standing in front of each poster reading their abstracts, and you can jump straight into discussing their results with them.
  • If you want to talk to someone, DON'T CHICKEN OUT! You'll regret it later. You don't have to make friends with every neuroscience celebrity you see, but if you have a legitimate reason to talk to them, why not?
  • If you present a poster, try to cut out as much text as possible without losing important details, and don't try to squeeze too many figures in the limited space. It gets overwhelming to your audience, and their eye doesn't know what to focus on. Instead, concentrate on a handful of main results, and refer them to your contact info or blog for more information.
  • Also, if you present a poster, be present for the entire session, not just your designated hour. I've read this advice other places, and most people seemed to follow it, but I'm reiterating it because it's really important. It's not always practical for people to visit your poster during your designated hour.
  • On that note, plan on visiting posters/exhibits/symposiums in an order that will minimize travel time and distractions and maximize learning time.
  • Put priority on posters that you'd like to see over symposiums. Symposiums can certainly be useful and interesting, but posters offer more one-on-one time with authors.
  • Eat healthily. Get enough sleep. Exercise. You will have more energy to learn and have fun during the day if you do those things.
  • Consider sight-seeing or taking some sort of break to see the city. Although you will not want to miss the opportunities offered by the conference, the host city offers unique opportunities of its own.

Those are just a few things that I think would have helped make my conference experience even better. That being said, it was a fantastic experience, and I look forward to next year!


#SfN13: Day 4
Wed, 13 Nov 2013 00:17:31 +0000

Probably the coolest thing I've seen so far at SfN 2013 is the exhibit by Backyard Brains. They're a company that makes neuroscience tools for educators and DIY neuroscientists. I could play with their stuff all day, and the crowd that was gathered around their booth showed that I wasn't alone. Here are a couple of the tools they were demonstrating at their booth:


I want one. They call it the "World's First Cyborg Kit." Basically, they sell a kit so you can stick a chip on the back of a cockroach and control him with your iPhone. The chip on the back has a small battery and communicates with an app on your iPhone via Bluetooth Low Energy. I am not very familiar with cockroaches, but here's how it works: apparently cockroach antennae have neurons running down them that are used to sense walls or other objects. By gluing the chip to the back of the cockroach and gluing wires along the antennae, you can trick the cockroach into thinking there's a wall each time you stimulate an antenna. For example, if you tap on the "Go Right" side of the app, it stimulates the left antenna. The cockroach thinks there's a wall to the left and goes right.

The surgery takes about 30 minutes to perform, and in case you were worried about the cockroach feeling pain (if they even can...), you drop him in some ice water first. They're cold blooded, so ice water slows down their metabolism and acts as anesthesia.

The roach becomes habituated to the stimulus after about five minutes of constant driving (which Backyard Brains spins as a good thing, since it accomplishes their mission of being a neuroscience education tool). After a while without stimulation, or with random stimulation of both antennae (they found, for example, that the electrical patterns used to play music were sufficiently random), the habituation presumably wears off.

Price: $100 for everything needed to use your chip on three different cockroaches (the chip is reusable, but a new set of electrodes is needed for each cockroach) and the app. You can buy replacement electrodes on their website.

The Completo

Backyard Brains also sells a kit, called The Completo, with an adapter to turn your cell phone camera into a microscope, an amplifier (called SpikerBox), and a micro-manipulator, all of which fold up into a small toolbox. Your phone acts as both the microscope and the oscilloscope.

Using one of the SpikerBoxes, they can stimulate a detached cockroach leg with a music wave and cause it to flex to the beat of the music. I saw a demonstration of this at their booth this morning. It really works!

There are probably a million other experiments one could do with these tools, and I have no doubt it would inspire many kids and kids-at-heart to learn more about neuroscience.

All of their hardware and software are open, meaning other people can write apps for different or more complicated stimulation protocols, which I am also excited about. It never occurred to me that you could do such cool science with relatively inexpensive, simple tools.

What an exciting company!

So stop what you're doing. Right now. Go check them out. Prepare to be dazzled.


#SfN13: Day 3
Tue, 12 Nov 2013 07:59:30 +0000

Ed Boyden gave a fascinating lecture today about a variety of methods that he's worked on with a swath of other scientists and engineers. Each method was powerful on its own, and to hear about them all in one talk was quite amazing. Here's an overview of them:

  • Automated patch-clamping. Whole-cell patch-clamp already exists, enabling simultaneous electrical, chemical and gene analysis of single cells. However, it takes a long time for trainees to learn and is very difficult to do in vivo. One of Ed's grad students automated this method. They apply square waves to look at changes in impedance continually as a pipette is lowered into the brain region of interest, detecting nearby neurons. Then, by looking at changes in impedance over a series of lowering steps they can tell that a neuron is approaching, and they can reliably and automatically patch-clamp neurons. The equipment for automatic patching is available online.
    • Using this method, their group has looked at how different anesthetics affect single neurons in living animals.
    • Another application is to look at how a synaptic connection changes during a learning task. They've been using a setup that uses 4 individually controlled pipettes at once to record from pre- and post-synaptic neurons simultaneously.
    • He's also teamed up with Allen Institute for Brain Science to do integrative analysis of the different cell types in the brain, including their morphology, electrophysiology and molecular features.
  • Automated animal surgeries. This works by recording the impedance of the drill. Since the skull has a high impedance, it is easy to tell when the drill has reached the inner edge of the skull during drilling, and the drill stops. Automating this method has helped them avoid bleeding during surgery. Also, by having a rapid way to make a grid of holes in the skull with this method, one might be able to do high-throughput in vivo pharmacological testing on many patches of cortex.
  • Advanced electrode arrays. They are developing electrode arrays with 120 recording sites to listen to neurons, using algorithms to sort out which neurons each electrode is listening to. He's not sure how many neurons their electrode will be able to listen to, as they are still testing this.
    • The arrays are tiny, with each electrode being only 10s of microns wide, reducing bleeding, morbidity, and mortality.
  • Spike recording with DNA. This was probably my favorite. When spikes occur, calcium flows into the cell. This causes a DNA polymerase that's copying strands of DNA during the recording time to mutate its shape and make mistakes. By looking for mistakes in the copied DNA and comparing it with the rate of replication, they can have a record of spiking behavior. At the moment, this method is too slow to record neural data, but they are working on improving it.
  • Of course he's also worked on optogenetic molecules, of which there are 3 classes: archaerhodopsins and bacteriorhodopsins (pass protons), halorhodopsins (pass chloride ions), and channelrhodopsins (pass protons and sodium, potassium, and calcium ions).
    • Now they're working on making a red light sensitive rhodopsin. Because red light can penetrate to deeper tissue than other wavelengths, it is better suited for deep neural stimulation.
    • Using two new types of rhodopsins, Chronos and Chrimson, they can activate neurons with blue and red light with no measurable overlap, allowing them to stimulate two different populations of cells individually.
  • He also is working on ways to deposit cells in 3-D space to make 3-D cell cultures, in order to analyze, for instance, how connections form between neurons.

I was amazed at all the tools that Ed had to offer, and eagerly anticipate the ones still in development.

Which method do you think is the most promising or exciting? Which method do you think we should prepare for?


#SfN13: Day 2
Mon, 11 Nov 2013 06:23:57 +0000

Another poster from North Carolina caught my eye today. Zhenglin Gu and Jerrel Yakel, neuroscientists at Research Triangle Park in Raleigh, North Carolina, kindly presented some compelling research about the medial septum to me. I know, I know, I posted research about the medial septum yesterday. But that little brain structure has a lot of secrets people are interested in!

Their study used a medial septum (MS)-hippocampus co-cultured preparation where both structures are removed from the brain and placed side by side for many days so that the MS can regrow connections to the hippocampus. They then looked at the cooperation between the MS and Schaffer collaterals (SC) in producing theta rhythms in the CA1 region of the hippocampus. Here's how it goes:

  • By just stimulating the SC with a stimulating electrode, no theta is evoked in CA1.
  • However, if you first stimulate the MS (they used channelrhodopsin with 10 ms pulses of light at 10 Hz for one second), and then stimulate the SC, they detected theta in CA1, measured with an extracellular recording electrode. They called this phenomenon theta "induction".
  • Then, after 3-5 times of this pairing between MS and SC inputs to CA1, something changes and simply stimulating the SC is sufficient to evoke theta. They called this theta "expression".

Induction of theta was blocked by the drugs atropine, MLA, and APV. Expression, on the other hand, was not blocked by atropine, MLA, or DHβE, but was blocked by APV. Clearly NMDA receptors are important in this process.

Next, they looked at which type of receptors in CA1 neurons are necessary for this type of theta generation. They found that NMDA receptors on glutamatergic neurons weren't necessary, but that nicotinic acetylcholine receptors on GABAergic neurons were. Narrowing it down further, they discovered that somatostatin GABAergic neurons were necessary for theta induction, but parvalbumin GABAergic neurons were necessary for theta expression. They have yet to test the role of muscarinic acetylcholine receptors, but I think that's somewhere in the pipeline.

The next step they're going to work on is whole-cell recordings from single neurons in CA1 in order to understand their contributions to theta individually. One of the challenges in dealing with the medial septum and theta is that we have a growing body of observations that have yet to be linked together. I am hopeful that their research will combine with research from other labs to help bring the many pieces of the puzzle together and we will understand this important rhythm soon. See you tomorrow!


#SfN13: Day 1
Sun, 10 Nov 2013 07:37:10 +0000

Today I had the treat of meeting Garrett Smith, a fellow North Carolinian, here from Davidson College in Davidson, NC (less than an hour from where I grew up! and where my sister-in-law currently works as an adjunct assistant biology professor). I planned on visiting his poster when I saw it dealt with the medial septum (MS), a brain structure that I'm very interested in, and I was not disappointed.

Garrett was kind enough to guide me through his research: the MS provides excitatory cholinergic and inhibitory GABAergic inputs to the dentate gyrus (DG). In addition, a major input to the DG comes from the ipsilateral entorhinal cortex (EC) through the perforant path. There are also a small number of DG connections from the contralateral EC, but apparently not enough to cause depolarization to threshold in the DG (as measured by population spikes in the DG when the contralateral EC is stimulated).

All this changes when the ipsilateral EC is lesioned. Without the perforant path input, something signals the MS and the contralateral EC to form more synapses with the DG. The added input allows the contralateral EC, when stimulated, to provide inputs resulting in population spikes in the DG.

What Garrett's group found was that stimulating the MS shortly before stimulating the contralateral EC significantly increased the size of the population spike evoked from EC stimulation. This indicates that the MS could be involved in the recovery of learning and memory following an ipsilateral EC lesion in rodents by strengthening the contralateral EC input to the DG. What role exactly this plays in vivo is unknown, though it is compelling, as all these structures are involved in learning and memory.

The mechanism by which the MS achieves this potentiating effect is also unknown, but they plan on investigating it in future studies by studying how the DG responds to cholinergic and GABAergic inputs individually.

It was a good day, and I'm excited for tomorrow!


Blogging at #SfN13
Wed, 06 Nov 2013 00:27:57 +0000

Good news! Between overeating and enjoying the sunny California weather, I will be writing daily posts from Saturday, November 9th, to Wednesday, November 13th, as an official blogger for this year's annual Society for Neuroscience meeting! Specifically, my posts will focus on neural excitability and novel methods and technology development, though I may throw in some other things as well. I look forward to it! So stay tuned...


Consciousness
Tue, 17 Sep 2013 08:06:13 +0000

The cerebellum, the part of the brain that deals with motor learning, makes up about 10% of the volume of the brain. It is packed with so many tiny neurons that it actually contains more neurons than the rest of the brain. Interestingly, if the cerebellum is removed from someone's head, although they will have motor deficits, their personality and conscious experience of the world will remain unchanged. In contrast, even though the cerebrum has fewer neurons than the cerebellum, removal of the cerebrum will leave a person in a permanent, vegetative, and unconscious state. Something about the cerebrum that scientists haven't quite pinned down yet is responsible for conscious, subjective experience. I recently watched an excellent discussion among experts from different fields about the nature of consciousness, where they brought up this very point. I highly recommend the video, if only to see neuroscientist Christof Koch's fabulous shirt.

Although consciousness is difficult to define, much less explain, neuroscientists have made a lot of interesting progress in this area in recent years. It makes neuroscience an exciting field to be a part of, and is one of the things that drew me to neuroscience in the first place. What makes it so intriguing is its mystique--consciousness is one of the most scientifically intractable questions of which I know. It is easy enough to explain the neural mechanisms of something such as the tuning of an individual neuron in the visual cortex, but it is another thing entirely to explain what neural basis there is, if any, for the subjective experience of sight.

Neuroscience often teaches us to view the brain in terms of action potentials. Sodium ions rush into the cell, potassium ions rush out, causing a transient change in the local voltage. That's what the brain is: different kinds of charged particles moving around in response to different forces. Harmonizing this deterministic, mechanical, soulless view of the brain with the idea of a subjective experience of consciousness is a difficult task. How can a large number of particles moving around cause consciousness?

Christof Koch put it this way in an interview on NPR:

"So we know the brain is part of the physical universe, just like anything else. But brains - human brains, animal brains, baby brains - brains also exude this stuff, this feeling, like feelings of pain or pleasure, of artistic sensibility, of seeing red.

"And the big mystery has always been, how is it that a physical system that's described by the laws of physics, how can it give rise to conscious sensation? And can other physical systems such as a computer, can they also give rise to physical sensation? Is it something in the structure, is it something in the information, is it something in the complexity of it that gives rise to consciousness?"

He and another scientist, Giulio Tononi, are advocates of the idea that consciousness is a product of something called integrated information. As the name suggests, this theory borrows concepts from information theory and applies them to neuroscience, emphasizing that the integration of information across brain regions and modalities is critical to consciousness. They talk about this theory some in another video here.

Other scientists have recently developed alternative metrics for measuring consciousness. Traditionally, scientists observed electrical patterns recorded from EEG electrodes and then attempted to correlate them with behavioral manifestations of consciousness or unconsciousness, leading to some technologies such as the bispectral index, or BIS monitor used by many anesthesiologists to measure a patient's depth of anesthesia.

Recent studies using EEG have even found heightened signs of consciousness in rats for a short period of time immediately following death. This surprising result came when scientists found high levels of gamma rhythm synchronization between the front and back of the brain. Synchronous activity in the gamma frequency range is thought to be responsible for binding information from different brain regions together to make a coherent experience. However, it is important to remember that these studies do not measure consciousness directly, but rather measure electrical and behavioral correlates of consciousness.

Researchers in Italy and Giulio Tononi also came up with a new metric for measuring consciousness, called the perturbational complexity index, or PCI. Their metric also relies on EEG to measure different large-scale electrical rhythms in the brain, and was developed by measuring these rhythms in people during a wide variety of states of consciousness. Although they were able to distinguish between different states of consciousness in test subjects using only their analysis of the subjects' EEG signals, many more people would need to be tested before this method becomes clinically relevant.

Yet other researchers have different theories about the nature of consciousness. Some postulate that quantum effects in the brain could explain consciousness. For instance, some have proposed that electrons in structural proteins called microtubules could be candidates for information processing through quantum effects, allowing the brain to perform quantum computer-like calculations. There is even some evidence that microtubules are involved in anesthesia, which could lend support to this hypothesis, but many scientists feel this theory lacks substantial evidence.

Whatever the physical substrates of consciousness are, it will definitely be interesting to see where this field goes in the next few years. What do you think is the basis for consciousness? How should we measure it?

Two Thoughts About Same-Sex Marriage
Sat, 13 Apr 2013 19:18:32 +0000

My parents this past December. They have been married almost 34 years, have five children, and have blessed the lives of countless others. What great examples to me!

With the Supreme Court expected to make a ruling about same-sex marriage this summer, I have been thinking a lot about the issue lately. During 2008, I was (and still am) a strong supporter of Proposition 8. During my more recent musings on the matter, I have come to two different conclusions:

1. Supporters of traditional marriage are seeking to define marriage as being between one man and one woman.

The fact that it's about the definition of marriage is crucial. It can be hard to see through all the rhetoric about equal rights. Of course I believe in equal rights. Of course the churches believe in equal rights. Of course God believes in equal rights, for "God is no respecter of persons," (Acts 10:34), and "all are alike unto God" (2 Nephi 26:33). What supporters of traditional marriage want is not unequal protection under the law. What we want is for the definition of marriage to be between one man and one woman. Marriage between one man and one woman is available to all. If there is a partnership between two people of the same gender, it is not marriage. It is something else. Tax benefits or visitation rights can be afforded to same-sex couples, but not because they are members of the institution called marriage, because the definition of that is a union formed between one man and one woman. In this way, equal rights are not violated. Which brings me to point number two, which is dependent on point number one.

2. The major reason the Church of Jesus Christ of Latter-day Saints opposes same-sex marriage is to protect its rights as a church.

I am not a spokesman for the church. However, from reading and listening to statements from the church and its Apostles, I believe the major reason the church opposes same-sex marriage is to protect its rights. I say this because I think people often feel the major reason the church or other denominations oppose same-sex marriage is to discourage sin. To be sure, discouraging sin is a chief role of the leaders and members of the church. However, if same-sex marriage is legalized, it will be impossible for churches and their members to maintain free exercise of conscience without stepping on the toes of gays and lesbians. Already pressure has been put on church-affiliated adoption agencies and wedding photographers, to name a couple off the top of my head,  to go against their consciences. Nationwide same-sex marriage legalization will only add fuel to this fire, and the rights of churches across the nation will suffer as they try to do what they believe is right.

I do not support mean or unfair treatment of any person based on their sexual orientation. However, I do support the rights of people and churches to free exercise of conscience with regards to matters of sex. If the definition of marriage is changed to include same-sex couples these rights will be compromised, as people will be forced to decide between treating people unequally with regards to the laws of the land and being disobedient to the laws of their Creator. This is what we want to avoid.

One of my favorite lectures on same-sex marriage was given in an address at BYU in 2008 by Robert George, the MP3 of which can be found here.

Anesthetics Change the Intrinsic Excitability of Neurons
Fri, 07 Dec 2012 19:53:16 +0000

Gamma oscillations in the brain are hypothesized to be involved in consciousness (Gray, 1994). Interestingly, general anesthetics are known to change both the incidence and frequency of gamma oscillations in the hippocampus. They are also known to increase the amplitude and decay time-constant of inhibitory postsynaptic currents (Whittington et al., 1996). No causal relationship, however, has been established between these network effects and cellular effects. In an effort to begin this description, I have measured the frequency-current relationships in CA1 pyramidal cells both under control conditions and in the presence of the anesthetic propofol.


Using the whole-cell patch-clamp method, I delivered a series of input currents to the neurons. Whole-cell patch-clamp is a method where a finely-tipped glass micropipette is filled with an artificial intracellular fluid and brought down to the surface of a cell. Negative pressure (sucking) allows the cell to form a seal with the pipette and rupture the section of membrane just under the pipette tip. The intracellular fluid of the cell becomes continuous with the artificial intracellular fluid contained in the pipette. An electrode is inserted into the back end of the pipette, which can be used to record from and manipulate the cell electrically.

Each cell received the series of current steps twice: first without and then with the anesthetic drug propofol.


The most significant result that I saw was a change in the gain of the neuron (as evidenced by a change in the slope of the f-I curve) in response to propofol. I recorded the firing rates of the neurons in response to increasing steps of currents (current steps are shown in Figure 1).

Figure 1. Current steps.

Firing rates were separated into initial rates (measured during the first 0.3 seconds of each pulse) and steady-state rates (measured during the last third of each pulse). For this study, I was most interested in the steady-state firing rates. I recorded the f-I curves from neurons, first without and then with propofol added (Figure 2; n=9). The process took about 30 minutes. There was a modest change in the gain of the neurons in response to the propofol treatment.

Figure 2. Steady-state f-I curve for the control (blue) and with the addition of propofol (red).

However, as I was doing the experiments I noticed that the gain of a neuron would change over time, independent of the addition of propofol. Therefore, I wanted to eliminate the effects of time from my analysis of the effects of propofol. I followed the same protocol as for Figure 2, but without adding propofol before the second f-I curve. This second recording I called a "delayed control" (Figure 3; n=10). Here I also noticed a change in gain, but in the opposite direction.

Figure 3. Steady-state f-I curve for the control (blue) and delayed control (red).

When I compared the gain of the neurons with propofol added with the delayed recordings without propofol, significant differences were seen (Figure 4; p=0.021, unpaired t-test).

Figure 4. Bar graph of changes in gain (slope) as a result of the addition of propofol.

This is indicative of a change in the intrinsic excitability of the neurons as a result of the propofol treatment. This change in intrinsic neuronal excitability may, in addition to the synaptic effects of propofol, lead to changes in network behavior and contribute to propofol-induced anesthesia.
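The gain comparison above boils down to comparing the slopes of steady-state f-I curves. A sketch with synthetic example numbers (not the recorded data):

```python
# Gain estimated as the least-squares slope of the steady-state f-I
# curve. Synthetic example numbers, not the recorded data.
def fi_gain(currents_pA, rates_Hz):
    """Least-squares slope of firing rate vs. injected current (Hz/pA)."""
    n = len(currents_pA)
    mi = sum(currents_pA) / n
    mr = sum(rates_Hz) / n
    num = sum((i - mi) * (r - mr) for i, r in zip(currents_pA, rates_Hz))
    den = sum((i - mi) ** 2 for i in currents_pA)
    return num / den

currents = [50, 100, 150, 200, 250]          # pA steps
control  = [2.0, 7.0, 12.0, 17.0, 22.0]      # slope (gain) 0.10 Hz/pA
propofol = [1.0, 4.5, 8.0, 11.5, 15.0]       # slope (gain) 0.07 Hz/pA
# a drop in slope corresponds to a drop in gain
```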

The entire poster for this study can be seen here: Utah BME 2011 Conference Poster.


Gray, CM (1994). Synchronous oscillations in neuronal systems: mechanisms and functions. Journal of Computational Neuroscience, 1(1-2), 11–38.

Whittington, MA, Jefferys, JG, & Traub, RD (1996). Effects of intravenous anaesthetic agents on fast inhibitory oscillations in the rat hippocampus in vitro. British Journal of Pharmacology, 118(8), 1977–86.
