Monday, January 31, 2011

Autism and Ketogenic Diets

I had forgotten that the good Dr. Su sent me a link to a dietary trial of ketogenic diets in kids with autism a few months ago. He reminded me of this himself when he quoted some comments I made in a recent blog post, but then "Paleo Guy" extraordinaire earned bonus margarita mixes for his machine by reminding me yet again and sending me a link to a complementary paper that is an excellent review of ketosis in general.

I will get back to posting on sleep. It's just that the continuing sleep deprivation I've been experiencing makes reading the sleep textbook a bit too painful. However, I'm committed to restoring good sleep hygiene habits and no more Twitter at 3am. In addition, each little beastie now has an LED nightlight play toy thingie that switches off after 30 minutes. They get huggable freedom from fear of the dark, and we all get blessed nighttime blackness. Win win. We'll just ignore the shutting down of the melatonin with the 30 minutes of LED glow. It's better than leaving the hall light on all night.

Right. Dr. Su's paper. It is a study of 30 kids from Crete with autism who were placed on a ketogenic diet for 6 months in 1999. They went on a "John Radcliffe" version of a ketogenic diet, consisting of 30% medium chain triglyceride oil, 30% fresh cream, 11% saturated fat (oops! overshooting the USDA 2011 guidelines by a bit), 19% carbohydrates, and 10% protein, along with vitamin and mineral supplements. The kids were placed on the diet in 4-week intervals, each followed by 2 weeks of anything goes - so on and off. The kids' urine was tested with ketostix and their serum checked for beta-hydroxybutyrate (a ketone) to measure the amount of ketosis. After 6 months, the diets were discontinued, and the kids were evaluated monthly for another 6 months.
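(For the numerically inclined: the diet proportions above do add up to 100% of energy, and a quick back-of-the-envelope sketch shows what they'd mean in grams. The 1500 kcal/day total is my hypothetical example for a child, not a figure from the study, and counting the cream as pure fat is an approximation.)

```python
# Energy breakdown of the "John Radcliffe" ketogenic diet, as percent of calories.
# The 1500 kcal/day total is a hypothetical example, not a figure from the study,
# and treating the cream as pure fat is an approximation.
diet_pct = {
    "MCT oil": 30,
    "fresh cream": 30,
    "saturated fat": 11,
    "carbohydrate": 19,
    "protein": 10,
}
assert sum(diet_pct.values()) == 100  # the listed fractions cover all the energy

KCAL_PER_GRAM = {"fat": 9, "carbohydrate": 4, "protein": 4}
total_kcal = 1500  # hypothetical daily intake

for component, pct in diet_pct.items():
    kcal = total_kcal * pct / 100
    macro = component if component in KCAL_PER_GRAM else "fat"
    grams = kcal / KCAL_PER_GRAM[macro]
    print(f"{component:>13}: {kcal:5.0f} kcal, about {grams:3.0f} g")
```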

At the beginning of the study, 2 of the 30 kids met criteria for mild autism; the rest were more severe. Interestingly enough, the premise of the study was presumably to improve mitochondrial efficiency in the brain via ketosis (using ketone bodies as fuel rather than glucose). 11 years later, a small study did in fact confirm that kids with autism often have problems with mitochondrial efficiency.

23 kids tolerated the diet beyond the initial 4 weeks, and of those, 5 more discontinued the diet due to lack of improvement during the first few cycles. Of the remaining 18 kids, two boys improved enough in symptoms to be taken out of the special school and placed in mainstream education. Overall the 18 ketogenic kids "presented with improvements in their social behavior and interactions, speech, cooperation, stereotypy, and... hyperactivity, which contributed significantly to their improvement in learning."

The kids who did not stay on the diet were the most severely affected by autism, and the ones who had the best response were the most mildly affected. Another interesting fact from the study is that the kids maintained their improvements through the two-week washout periods and in the 6 months after the study was over. None of the kids had any of the complications (such as poor weight gain or selenium deficiency) seen in other trials of ketogenic diets in kids with epilepsy.

Overall (using the original sample size of 30), 26.66% of the kids benefited significantly from the diet. The researchers also have a nice explanatory paragraph about the biochemistry of ketosis and how it favors the relaxing inhibitory neurotransmitter GABA over the excitatory, and in excess, neurotoxic glutamate:

"The increase of ketone bodies maintains the synaptosomal content of ╬│-aminobutyric acid (GABA) at a higher level, a phenomenon that may contribute to the beneficial effect of a ketogenic diet in children with epilepsy and perhaps children with autistic behavior. Other researchers, in an attempt to clarify the manner in which ketone bodies increase the synaptosomal content of GABA, showed that the metabolism of ketone bodies to acetyl coenyzme A results in a decrease of the pool of brain oxaloacetate, which is consumed in the citrate synthetase reaction. As less oxaloacetate is available for the aspartate aminotransferase reaction, thereby lowering the rate of glutamate transamination, more glutamate becomes accessible to the glutamate decarboxylase pathway, thus favoring the synthesis of GABA."

Couldn't have said it better myself!
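(To lay out the arithmetic of the dropouts and responders in one place - note that the figure of 8 children with "significant" benefit is my inference from 26.66% of 30, not a count stated outright above:)

```python
# Attrition and response numbers from the Cretan ketogenic diet study,
# as quoted in the text above.
enrolled = 30
tolerated_first_cycle = 23           # so 7 dropped out in the first 4 weeks
discontinued_for_no_benefit = 5
completed = tolerated_first_cycle - discontinued_for_no_benefit
assert completed == 18               # the 18 kids who showed improvements

# "26.66%" of the original 30 benefited significantly; that fraction
# corresponds to 8 children (my inference - the count isn't stated outright).
significant_responders = round(26.66 / 100 * enrolled)
assert significant_responders == 8

print(f"completed the diet: {completed}/{enrolled} = {completed / enrolled:.0%}")
print(f"significant benefit: {significant_responders}/{enrolled} = "
      f"{significant_responders / enrolled:.1%}")
```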

Well, this wasn't a large study or a blinded study and there was no control, but for some kids, the improvement was exceptional, and ketosis didn't have to be strictly maintained. My personal preference is not to live in ketosis, but rather to dip in on occasion via 16 or 24 hour fasting and some very low carb breakfasts after overnight fasts. This study seems to suggest that dipping into ketosis can have benefit for brain energetics, though the kids went through a larger scale "dip" than I ever have.

And, once again, dietary therapies prove to be exceedingly beneficial for some, but won't do much of anything for others. It would be important for any parent of an autistic child to know that ahead of time before pinning one's hope on a ketogenic diet. On the other hand, autism is currently incurable, and a ketogenic diet seems like a nice weapon to have in the arsenal against this disease.

(Put in the links and fixed some of those bizarre sentence structures.  I shouldn't be allowed to blog when sleep-deprived.)

Edited to add links to the rest of my posts on autism.  I cover gluten-free diets, inflammation, mitochondria, vitamin D, theories about the pathology:

Diet and Autism 1
Diet and Autism 2
Autism and Vitamin D
Autism 4 - Inflammation Speculation
Brain Efficiency, Pediatric Edition
Autism and Interpregnancy Interval

Friday, January 28, 2011

I Hate Homocysteine (Also It Is Elevated in Schizophrenia and Bipolar Disorder)

There are some biochemical reactions that are just gorgeous. Poetry. Glycolysis and the citric acid cycle, for example. I used to be able to draw out the whole process, and there was always something very pleasing about doing so. Enter glucose and oxygen and a few other things, and exit energy in the form of ATP, flying off the citric acid cycle like sparkling droplets of water off a spinning wheel.

The folate cycle, on the other hand, is ugly. It plays a starring role in Evolutionary Psychiatry, however, and I have to come to terms with it. There are a zillion components, an army of vitamins, and end-products going every which way - amino acids, neurotransmitters, membrane lipids, and whatever the heck homocysteine is. (Seriously - my anchor article for this post calls homocysteine a "non-protein amino acid". What does that even mean? An amino acid that isn't built into proteins, apparently.)

Image from Wikipedia


Homocysteine is a by-product of the folate cycle. It is supposed to be recycled back into methionine, but if you are low in certain B vitamins (or, like 10% of people, genetically deficient in certain enzymes that work to recycle homocysteine), you end up with too much of it hanging around. And when that happens, you happen to have a higher risk of all sorts of nasty things, such as heart disease, stroke, hip fractures, and dementia. Turns out that homocysteine likes to cleave the disulfide bridges in cysteine molecules. That doesn't sound so bad, but it affects things you might need, like collagen, for example, which plays a major part in holding bones together and keeping your arteries nice and elastic. High homocysteine *sometimes* goes hand in hand with high triglycerides, high blood pressure (from those stiff, inelastic arteries, one would presume), low HDL, high fasting glucose, and abdominal obesity. All those signs together (or at least three of them, anyway) make up the so-called metabolic syndrome which plagues our Western populations.
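(For the curious, the "three or more signs" rule can be made concrete. Here is a minimal sketch using the common NCEP ATP III cutoffs - those thresholds come from that consensus definition, not from anything in this post:)

```python
# A minimal "three or more of five signs" check for metabolic syndrome,
# using the common NCEP ATP III cutoffs (the thresholds come from that
# consensus definition, not from this post).
def has_metabolic_syndrome(waist_cm, triglycerides_mgdl, hdl_mgdl,
                           systolic, diastolic, fasting_glucose_mgdl,
                           male=True):
    criteria = [
        waist_cm > (102 if male else 88),          # abdominal obesity
        triglycerides_mgdl >= 150,                 # high triglycerides
        hdl_mgdl < (40 if male else 50),           # low HDL
        systolic >= 130 or diastolic >= 85,        # elevated blood pressure
        fasting_glucose_mgdl >= 100,               # impaired fasting glucose
    ]
    return sum(criteria) >= 3

# High triglycerides, low HDL, and high fasting glucose: three signs, so yes.
print(has_metabolic_syndrome(95, 180, 35, 120, 78, 110))  # True
```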

The good news is that abnormally high blood levels of homocysteine can rather easily be lowered by B vitamin supplementation. Almost any B vitamin will do the trick - B6, B12, folate, even betaine. That's rather exciting, one would think. Plausible biologic mechanism for a big, big problem. Cheap and simple fix. The bad news is that lowering homocysteine with B vitamin supplementation doesn't seem to make one whit of difference in cardiovascular disease - at least it didn't in the 3700 Norwegian heart attack survivors who were followed for two years in the last decade (3), nor did it seem to help 5500 folks with known vascular disease or diabetes (4).

I don't really care about your hearts or bones. That's not true. I care. But your brains interest me a lot more. A new paper came out this month in Psychiatry Research from researchers in Croatia about high blood levels of homocysteine in patients with bipolar disorder and schizophrenia. And you will not be surprised to know that metabolic syndrome and obesity (and diabetes and heart disease) are more common in these patients than in the general population. While some of the medications used to treat these conditions cause obesity and impaired glucose tolerance, when you really parse the data, there appears to be an increased risk of metabolic syndrome just from having the illnesses, apart from any medication contribution.

(A rather unrelated aside - antipsychotic medications are well known for some pretty disturbing side effects. One of the scariest ones is called "neuroleptic malignant syndrome" where you get a high fever, stiffness, blood pressure spikes, and it can lead to kidney failure and death from muscle injury. One of the fastest treatments for NMS is electroshock therapy, believe it or not. What many people don't know is that schizophrenics institutionalized in the years prior to the invention of medication would suffer high fevers, stiffness, and death (it was called "malignant catatonia"). Now there is no question that the medicines cause NMS, but there is also an additional issue with the dopamine regulation in schizophrenia that could lead to autonomic dysfunction in a serious and fatal way. Just some food for thought.)

Right. Homocysteine. Not only does it degrade important things like bones and arteries, but it also might be able to antagonize the NMDA receptor in the brain (1), which could be a mechanism by which homocysteine itself could cause psychosis directly. It has been suggested that high homocysteine and low folate and B12 are independent risk factors for the development of schizophrenia and bipolar disorder (2).

The Croatians did a pretty simple study. They measured the fasting homocysteine and other signs of metabolic syndrome in patients admitted to their hospital ward with schizophrenia and bipolar disorder. They did not measure serum folate and B12, which is unfortunate, because that would be interesting to know. Oh well. The results? 34.2% of the sample of 60-odd patients had metabolic syndrome. And 67% of those with metabolic syndrome had high homocysteine. Only 23% of participants without high homocysteine had metabolic syndrome. In addition, high blood pressure also independently correlated with high homocysteine, which at least makes biologic sense.

One more little interesting tidbit from the paper - high homocysteine has also been found to be correlated with high omega 6 fatty acid levels in patients with major depression.

The take home? As I said in the Zombieland 2 post, I tend to connect high homocysteine levels with poor nutrition in general. Also, high homocysteine can be caused by a number of drugs and supplements, including niacin, metformin, insulin, corticosteroids, NSAIDs, and some anticonvulsants. Chronic high intensity exercise and smoking are also related to high homocysteine. What do we do about it? Having all the B vitamin players on the team can help, so homocysteine can be properly recycled. The Norwegians were heart attack survivors - the damage had already been done. Maybe homocysteine is something to keep low in the long term as a preventative strategy (comparable to omega 3s for mild cognitive impairment = possibly useful, vs omega 3s and Alzheimer's = a disappointment). Lowering homocysteine did seem to reduce stroke incidence by 25% in the HOPE2 trial. But who knows? We'll have to wait for more studies.

The brain is on the front lines, and metabolic syndrome (or the inflammation behind it) has psychiatric components as well. It is all linked in ways that we only barely understand.

Thursday, January 27, 2011

Dietary Fat Intake and Depression Risk

More sleep coming soon, but a paper came out yesterday that ought to be blogged about (open access, too from PLoS One):

Dietary Fat Intake and the Risk of Depression: The SUN Project

The paper begins mild-mannered:  "Emerging evidence relates some nutritional factors to depression risk.  However, there is a scarcity of longitudinal assessments on this relationship."

The researchers followed a group of Spanish university graduates, initially depression-free, for an open enrollment period of 1999-2010.  12,059 ultimately signed up.  At baseline they filled out a food frequency questionnaire to estimate the amount of saturated fat, polyunsaturated fats, trans fats, and monounsaturated fats eaten, and the use of "culinary fats" (olive oil, seed oils, butter, and margarine).  During the follow up period (median 6 years), 657 new cases of depression were identified (via a new diagnosis of depression by a physician or initiating the use of antidepressant drugs).  As is typical, confounders were accounted for, including adherence to a Mediterranean Diet (legumes, fruits, vegetables, fish, cereals, and low in meat and dairy products), which has already been shown to correlate with decreased depression (1).  Turns out there was a reasonably strong (and dose-dependent) correlation between trans fat usage and depression - the less trans fat the Spaniards ate, the less likely they were to become depressed, and those in the highest quintile of trans fat intake had a 48% increased risk of developing depression compared to those who ate no trans fats.  There was also a weaker inverse correlation between depression risk and the amount of MUFA and PUFA eaten (meaning these types of fats may be protective against depression).

In Europe, suicide rates and mental disorders are highest in Northern Europe and lowest in the Southern Mediterranean countries.  I can think of a number of possible reasons for this trend (sunlight, relaxed lifestyle, vitamin D *cough*), but the overall food choices differ mainly with respect to olive oil and pulses.  Very little is known about specific types of fat and risk for depression, except with respect to the omega 3s.  And even those studies are tricky - often, the omega 3 capsule is compared to a placebo capsule of olive oil, and in some trials, both the fish oil group and the olive oil (control) group improve.  It could be that both fish oil and olive oil improve mood, or that the increased fat content in general improved mood, as there is some evidence that low fat diets adversely affect mood (2)(3).

Now the discussion and speculation - always the most interesting part.  The commonly accepted mechanism now for depression is one of inflammation leading to a decrease in BDNF, which is a nerve fertilizer necessary for axonal growth, nerve survival, and synaptic plasticity and function.  Some of this awesome nerve fertilizer is made in the endothelium (possibly through a nitric oxide mechanism).  Now cardiovascular disease, which correlates with depression, is also likely mediated through inflammation and endothelial dysfunction.  Trans fats are thought to cause inflammation and endothelial dysfunction, so it would make a lot of sense if they trash your brain along with your coronary arteries. 

Here's a rather scary analysis (though it also makes me question the conclusions of the whole study) - the people studied overall ate a lot of whole foods and a very low amount of trans fats, most of which came from whole fat dairy (I'm assuming, then, that they are talking about CLA, which is in truth a natural trans fat, though as far as I know it is a very good fat associated with reductions in diabetes and obesity - both of which individually correlate with depression also, of course).  In this study sample, in the highest quintile, trans fat made up only 0.4% of calories.  In America, trans fat intake is up to 2.5% of calories, and the main sources are artificial foods (such as processed snack foods and margarine).

Olive oil, on the other hand, is thought to have antioxidant and anti-inflammatory effects, and various metabolites of olive oil can improve sleep and improve the binding of serotonin to its receptors.  The folks in the study overall (being a Mediterranean population) ate a ton of olive oil and very little butter, margarine, or seed oil.  (These researchers are big olive oil fans.)

So what can we glean from this study?  Not that much - the strength of the design was a large population, but a huge weakness is that the diet was measured only once, at the beginning of the study, through a food questionnaire.  It was felt that the use of college graduates would help the validation, as they might be more likely to give accurate information about diet.  However, it seems that most of this highly educated cohort ate very similar diets, which means the whole effort may have been something of a wash.  I'm always skeptical of the ability of epidemiologists to adjust for confounding variables.

But it does make sense biologically that olive oil and PUFA intake (since they weren't using seed oils, I'm guessing a lot of the PUFA was in fish) would correlate with a happier brain, while the inflammatory trans fats would trash the endothelium and BDNF.  So maybe we hang our hats on that a little.

Tuesday, January 25, 2011

Appetite, Sleep, and Mood - The Connection

Whew.  Between making up for my holiday vacation and the snow days (we might get yet another storm, up to 18 inches, tomorrow, and there is still 2&1/2 feet on the ground!!), work has been incredibly busy.  At the same time, my children have decided to run sleep deprivation experiments on me.  The problem with being sleep-deprived is that when you try to read rather clunky textbooks on the basic and clinical science of sleep and mental illness, you tend to nod off rather quickly.  (the reference for this post is the aforementioned clunky textbook)

That's a rather verbose introduction to the fact that I need a perky song or two to blog by today.  Let's start with Ra Ra Riot's Boy. (right click in new tab if you want to keep reading).

So - appetite, sleep, and mood.  We psychiatrist types ask about such things all the time, because we know they are inexorably related.  In fact, problematic appetite, sleep and energy levels have a clinical nickname - "neurovegetative symptoms."  Well, that's not much of a nickname.  We call it "neuroveg" for short.  We also call ending a relationship with a patient "termination."  And emotions "affects."  We aren't particularly fun people, really.  (Kidding!  Sort of.)

So let's talk neurotransmitters.  Wakefulness in general is supported by several excitatory neurotransmitter pathways, among them acetylcholine and norepinephrine.  These brain chemicals send wakey wakey signals to the forebrain, and when this happens, our brain EEG signals go from large sleepy floppy slow wave sleep to the brisk jagged beta waves of being awake.  If we lack acetylcholine and norepinephrine signals to the forebrain, we get all drowsy and fall into non-REM sleep (all our restorative sleep).   If you are into the whole neurochemistry/electricity thing, the excitatory neurotransmitters lead to rapid firing and depolarization of the wakeful brain neurons.  The lack of that signal leads to a relative hyperpolarization and long, slow lazy signal of the sleeping brain.

To get into even more nitty gritty, remember that the basic energy currency of the cells is ATP.  That's short for adenosine triphosphate.  In order to lend power to chemical reactions, the ATP gives up its three phosphates one by one until it becomes plain old adenosine again.  And in the brain, the more adenosine, the more difficult it becomes to send signal to the wakey wakey neurons.  That makes sense - you've spent your energy reserves and have a bunch of waste adenosine around?  Chill out, cool down, and rest, so the nighttime healing and restoration of reserves can commence.
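(If you like toy models, that adenosine story is roughly "Process S" from Borbély's two-process model of sleep regulation - pressure builds toward a ceiling while awake and is cleared during sleep. Here's a sketch; the time constants are invented for illustration, not physiological measurements.)

```python
import math

# Toy "sleep pressure" model: adenosine accumulates toward a ceiling while
# awake and is cleared during sleep. The time constants below are invented
# for illustration - they are not physiological measurements.
TAU_WAKE = 18.0    # hours; governs how fast pressure builds while awake
TAU_SLEEP = 4.0    # hours; governs how fast sleep clears it

def pressure_after_waking(p0, hours):
    """Exponential rise from p0 toward a ceiling of 1.0 while awake."""
    return 1.0 - (1.0 - p0) * math.exp(-hours / TAU_WAKE)

def pressure_after_sleeping(p0, hours):
    """Exponential decay from p0 toward 0 during sleep."""
    return p0 * math.exp(-hours / TAU_SLEEP)

p = pressure_after_waking(0.0, 16)    # a full waking day
print(f"after 16 h awake:  {p:.2f}")
p = pressure_after_sleeping(p, 8)     # a full night's sleep
print(f"after 8 h asleep:  {p:.2f}")
```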

(Time for another song?  I really like this one by The Black Keys.  It's overplayed but, some songs are never really overplayed.)

To recap - the wake-promoting neurotransmitters norepinephrine and acetylcholine promote wakefulness.  Non-REM sleep (stages 1-4) is the opposite of wakefulness, and lack of monoamine input leads to progressively more slow wave sleep.  BUT, remember that during polysomnography, it is difficult to tell the difference between someone waking up and someone in REM sleep.  The EEG waves of REM sleep are nearly impossible to distinguish from those of being awake - only the rapid eye movements and the paralysis of our major large muscle groups let the sleep researcher know that the subject is in REM sleep and hasn't woken up.

Turns out REM sleep neurons turn on when there is almost NO input from the monoamines, while sleep-inducing neurotransmitters like GABA (GABA receptors are activated by GABA (duh), benzos like Valium, alcohol, and sleep medicines like Lunesta or Ambien) seem to activate both non-REM and REM sleep.

So - in major depressive disorder, we seem to have a lack of monoamines in the right places at the right time in the brain.  GABA is more or less okay.  Therefore, you get a classic sleep hypnogram for major depression - lots of REM sleep that starts way too early in the night, lots of wakefulness, and almost no restorative slow wave sleep.  All clinically effective medicinal treatments for major depressive disorder will improve your hypnogram (by increasing the amount of time it takes you to get to REM sleep) a full 10 DAYS EARLIER than you notice any improvement in mood.

(If your depressive disorder has a lot of anxiety symptoms too, your GABA is probably lacking, so then the sleep gets really messed up.)  

Now, let's bring in appetite.  The Nurses' Health Study and some other studies show that chronic short duration of sleep and chronic long duration of sleep are associated with type II diabetes.  Laboratory subjects who are studied in sleep restriction and sleep deprivation have impaired glucose tolerance and increased appetite.  In the majority of serious psychiatric disorders, disruption of the circadian rhythm and sleep occurs, and major psychiatric disorders are also associated with an increase in appetite and risk of diabetes.  Some of this increase is medication-driven, but there also appears to be an increase in risk of diabetes and weight gain independent of medications.

The center of appetite and mental problems appears to be the hormone orexin.  As we learned earlier, orexin is a hormone that makes you hungry.  Serotonin seems to suppress orexin.  Folks with schizophrenia and atypical depression seem to be pretty low on serotonin, and they will be hungrier and have more central obesity than folks without those disorders.

Putting it all together, disturbances in the sleep-wake cycle reported in psychosis and in atypical depression seem to disrupt the feedback mechanisms of energy and metabolism in a way that decreases glucose tolerance and reduces sleep efficiency and effectiveness.  All the usual neurotransmitter suspects are implicated. 

Another song?  Sure, why not.  How about The Cave by Mumford and Sons.

Or if you want a nice modern classical romantic song, try this little piece from the newest Pride and Prejudice movie.  Good date movie - not too many tears.

Saturday, January 22, 2011

Chronotherapeutics for Affective Disorders

Little update 3/2/11 - Just found this website with research updates on chronotherapeutics which may be of interest: http://www.chronotherapeutics.org/Index.html
(end little update)

I'm going to spend some more time discussing some nitty gritty, genetics, and biochemistry related to mood disorders (especially), treatments for mood disorders, and circadian rhythm abnormalities.  Bet you can't wait!  In the meantime, however, I came upon some neat articles (1)(2) about the process of chronotherapeutics.  That is, using light therapy, dark therapy, sleep deprivation, and sleep phase delay or advance to treat mood disorders such as depression and bipolar disorder.  A little warning - these methods are powerful, quick, and affect the same neurotransmitter systems as psychiatric medications (more on that in a different post), and it is not a good idea to experiment with these all on your lonesome.  Let me give you a worst case scenario - you try to treat depression with light therapy or sleep deprivation.  Turns out you are bipolar.  You get manic, spend $30,000 on a new stereo system, sleep with your boss, antagonize your friends and relatives, and end up in a hospital after singing opera naked on your rooftop (this is an invented but not entirely unreasonable scenario).  So... best to let some loved ones know if you attempt these methods, and if you already have a psychiatric diagnosis, don't attempt these methods without the blessing and observation of your therapist or doctor.  In addition, if the methods aren't done quite right, you can very quickly relapse (within 1-2 days).

Most of you will be familiar with the concept of light therapy.  Sitting in front of special 10,000 lux light sources on late autumn and winter mornings has been proven to be an effective treatment for seasonal affective disorder, major depressive disorders, and even bipolar depression (if you are careful - injudicious use of light therapy can also bring on mania).  The FDA-approved lights (such as the ones from this company - this is just the company I typically recommend to patients; I have no relationship to them and receive no monetary or other benefit from them) all have a 30 day money back guarantee, which is nice, and some insurance companies will pay for them if you are lucky.  The usual method is to sit in front of the lights in the morning for 15-30 minutes, glancing at the light every 30 seconds to a minute or so.  You have to do it nearly every day, and it works best if you begin when the seasons change (late September here at 40 degrees north).  Light therapy can nearly instantaneously improve seasonal depression, and I've heard tales of trucks of light therapy boxes driving around to small towns in Alaska and reducing the winter suicide rate along the way.

But let's get back to the basics of chronotherapeutics.  In general, interventions that lead to sleep phase advance (waking up early and going to sleep early) have an antidepressant effect, and sleep phase delay (going to sleep later and waking up later) will have a depressant (or anti-manic) effect.  Also, reinforcing the natural circadian rhythm will tend to help mental illness - at hospitals in Canada (3) and Italy (4), they noticed that patients in sunny or easterly facing rooms were discharged on average 2&1/2 to 3&1/2 days earlier than patients in rooms without much sunlight. (Even more interestingly, the differences were minimal in the winter, but extended to up to 7 days in the autumn).  Not surprisingly, all of this has been discovered before by our intrepid ancestors.  Classical texts and descriptions of psychiatric wards from 1794 showed that depressed patients were advised to spend time out of doors, and agitated patients were closed up in darkened rooms (5).

One old-fashioned and newly-fashionable method of treating all sorts of depression is sleep deprivation (SD).  There is complete sleep deprivation, which is self-explanatory, and partial sleep deprivation, which generally involves waking people up for the second half of the night.  The only known contraindication to sleep deprivation is epilepsy (I've spent some time on the long-term seizure monitoring units in neurology, where we've been known to elicit seizures for diagnosis via EEG by sleep deprivation - basically, sending the medical students and residents, who were up anyway, to keep the patient awake at all hours - and the use of a judicious amount of red wine).  SD's efficacy has been reported in major depression, bipolar disorder, depression in schizophrenia and in Parkinson's disease, and post-partum depression.  Patients who respond best to sleep deprivation are the same patients who respond best to antidepressant medications - those with a diurnal pattern of mood (typically more depressed in the morning and feeling pretty good by afternoon), low IL-6 levels, and an abnormal dexamethasone suppression test.  Light therapy has similarly proved therapeutic (nearly instantly) for depression associated with ADHD, Parkinson's, Alzheimer's, pregnancy, the post-natal period, and regular depressive disorders.

As with every other method (such as therapy and antidepressants) (except shock therapy, which is up to 90% effective), light therapy and sleep deprivation are at least modestly helpful in 60-70% of cases.  However, and interestingly, people with bipolar depression seem more likely to respond to sleep deprivation or light therapy than to standard antidepressant medications, suggesting to me (and truth be told I've read other papers with other evidence for this theory) that genetic issues with the circadian rhythm system are the primary problem leading to the vulnerability to bipolar disorder.  Due to the tricky nature of bipolar depression and the risk of switching to mania with antidepressant drugs, some of the most robust data has been shown for chronotherapeutics (sleep deprivation, phase advance, or light therapy) for this condition, and mood stabilizers (which work upon the circadian rhythm proteins) can enhance and continue the initial benefits brought about via chronotherapeutics.  The medicine remains useful, as once chronotherapeutics are discontinued (one can't be sleep-deprived forever, for example), the depression can return within hours of a normal night's sleep.  In fact, only 5-10% of the studied bipolar depressed patients maintain a normal mood through chronotherapeutics alone.  Repeating the intervention doesn't always help, as people tend to become tolerant to the treatment.

One way of ameliorating the tolerance to chronotherapeutic techniques is to combine them.  For example, take a severely depressed patient with known bipolar disorder in the hospital.  Start with a few days of sleep deprivation, then begin phase advance treatment (going to bed early, waking up early) and morning light therapy to retain the benefits over time.  Perhaps add in some mood stabilizers to enhance the effect (again, I will go into more specifics as to how mood stabilizers and antidepressants affect, directly, the circadian rhythm system in another post - but to give you a preliminary taste, serum, PET, SPECT, and fMRI data have shown that antidepressants and sleep deprivation/phase delay/light therapy affect the same neurotransmitter systems in similar areas of the brain), and we have a recipe for nearly immediate reversal of severe bipolar depression with maintenance of normal mood for the foreseeable future.

An interesting part of the discussion of chronotherapeutics is that the techniques (other than the physical lights of light therapy) cannot be patented.  Therefore there is less (short-term) economic motivation for future study (as is the case with most evolutionary medicine ideas).  However, in countries with socialized medicine, far-sighted bureaucrats might see the writing on the wall - cheap interventions (such as sticking all the depressed patients in easterly-facing rooms in the autumn) decreasing hospital times saves real taxpayer money very quickly.  Days in the hospital equals thousands of dollars.  It is that simple.

So if you are depressed, seek light and wakefulness (an old timey depression remedy was to wake up at 3am once a month for those known to be vulnerable to the condition), and if you are manic, seek darkness and low stimuli.  Under close supervision, of course.

Friday, January 21, 2011

Sleep Architecture

At the risk of recreating a wikipedia article, we need to lay some groundwork here and discuss the structure of normal sleep. Of course it is never as simple as that - normal sleep architecture changes throughout the life cycle. However, the basics are similar for everyone except perhaps preemies.

Sleep is defined using EEG readings (where you have a bunch of leads taped to your head, and the electrical output of the brain is recorded as a polysomnogram - which, to be completely accurate, also includes muscle and eye movement measurements). In general, all stages of non-REM sleep are characterized by a slowing and deepening of the waveforms as we get deeper and deeper into sleep. When we are awake but zoning out, or meditating, we produce a type of waveform on the EEG called an "alpha wave." When the alpha waves start to become theta waves, we've progressed to stage 1 sleep. In general, someone who has been awakened from stage 1 sleep will not think he or she was actually sleeping. (Think Dad nodding in his chair on Sunday afternoon - when you bug him, he blinks and goes back to watching the game as if nothing happened.)

EEG tracing of stage 1 sleep with mostly theta waves and some alpha


A few minutes into stage 1, if all goes well, you progress to stage 2, and then after another few minutes to stages 3 and 4. These last two stages are characterized by "delta waves," are commonly referred to as "slow wave sleep," and represent the deep, refreshing sleep. After about an hour or a little more (total for stages 1-4), normal adults will transition to REM sleep, where the eyes move rapidly and large muscle groups are paralyzed. This stage is when a lot of dreaming occurs (though dreaming also happens during slow wave sleep). REM sleep waves are somewhere in between stage 1 and wakefulness - in fact, sleep scientists rely on the muscle and eye movement readings to distinguish between someone who has woken up and someone who is in REM sleep.


Once the REM cycle is finished, one drops down through stages 1-4 again. The cycle repeats itself every 90 minutes or so (for newborns, every 45 minutes) throughout the night, though in the last couple of cycles one spends more time in REM sleep and less time in slow wave sleep.
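For the curious, the arithmetic of the night can be sketched in a few lines of code. This is just a toy - the 90-minute cycle length comes from above, but the REM percentages are illustrative placeholders, not polysomnography data:

```python
from datetime import datetime, timedelta

def sleep_cycles(bedtime, cycle_minutes=90, hours_asleep=8):
    """Rough sketch of a night's sleep cycles: each ~90-minute cycle runs
    through the NREM stages then REM, with REM taking a larger share of
    the later cycles (the shares below are illustrative, not clinical)."""
    n_cycles = int(hours_asleep * 60 // cycle_minutes)
    cycles = []
    for i in range(n_cycles):
        start = bedtime + timedelta(minutes=i * cycle_minutes)
        # toy assumption: REM share grows from ~10% to ~40% across the night
        rem_share = 0.10 + 0.30 * i / max(n_cycles - 1, 1)
        cycles.append((start.strftime("%H:%M"), round(rem_share * cycle_minutes)))
    return cycles

for start, rem_minutes in sleep_cycles(datetime(2011, 1, 21, 23, 0)):
    print(f"cycle starting {start}: roughly {rem_minutes} minutes of REM")
```

An 8-hour night at 90 minutes per cycle gives five full cycles, with the toy REM share climbing from about 9 minutes to 36 minutes - mirroring the pattern described above, where early cycles are rich in slow wave sleep and late-night cycles are REM-heavy.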

Normal sleep architecture


A couple of important things - the first cycle or two are the ones where we spend the most time in the deep, refreshing slow wave sleep, so it is vital that these cycles are of good quality. In classic Major Depressive Disorder, for example, patients will often never reach full slow wave sleep throughout the night - thus the common complaints of insomnia, constant fatigue, irritability, and being too easily awakened. These same patients typically aren't hungry and lose weight. In "atypical" depression, seasonal affective disorder, or bipolar disorder, some patients will actually spend too much time in slow wave sleep. They also feel lethargic, and will be hungry and tend to gain weight.

Alcohol consumed close to bedtime will tend to decrease sleep latency (that is, the amount of time it takes to fall asleep when we go down for the night), increase the length of time it takes to get to REM sleep, and increase slow wave sleep for the first half of the night. In the second half of the night, however, there will be more wakefulness, more REM sleep, and less slow wave sleep. Alcohol is the most common go-to substance that chronic insomniacs use to get some shut-eye, and overall it decreases the quality and efficiency of sleep.

All told, sleep problems and fatigue arise from a complicated array of too much or too little in a number of neurochemical systems. There are natural chemicals promoting wakefulness, and chemicals promoting sedation as part of the circadian rhythm, and issues with any of these can lead to complaints of poor sleep, insomnia, or fatigue. I hope to get to more of the details in future posts!

Monday, January 17, 2011

Circadian Rhythm, Psychiatric Symptoms, and Pineal Gland Tumor

Today's Green Journal had an interesting letter to the editor. "Ivan," a 19 y/o, had a history of a pineal gland tumor (the pineal gland secretes melatonin) that had been resected in 2001. In order to decrease building pressure in his brain, he had a second surgery for a VP shunt to drain his cerebrospinal fluid into his abdominal region. After the second surgery, he had insomnia, a disturbed sleep-wake cycle, and fragmented sleep. In June 2004 he began to exhibit paranoia and other classic signs of bipolar disorder. By 2007 he was on several psychiatric medications, and he was not able to continue school due to severe fatigue and bipolar symptoms. A sleep study showed an irregular sleep/wake cycle, and 24 hour urine measure of a melatonin metabolite showed barely detectable levels.

The patient was started on melatonin (a controlled release formulation), his sleep cycle stabilized, and his psychiatric medication was slowly withdrawn over the next several months. Repeat sleep studies showed normalized sleep. Since rapid release melatonin lasts 60-90 minutes, it was felt it was not a suitable replacement for the absent pineal gland - thus the controlled release formulation.

Interestingly enough, the patient discontinued his melatonin in 2009 and remained symptom free, with a stable sleep-wake cycle. The authors of the letter speculated that his bipolar symptoms arose from lack of restorative sleep, and with the help of exogenous melatonin, the patient was able to somehow use other signals to synchronize his sleep wake times, preventing relapse.

Lesson - sleep is exceedingly important (more on this fact in future posts). Also, all my first episode psychotic patients get an MRI and a full medical work-up.

Sunday, January 16, 2011

The Neurobiology of Sleep

There is quite a bit going on, and I'm not always capable of doing it all with one hand behind my back, especially when sleep-deprived. One of my colleagues had a baby recently, and she asked us when our kids began sleeping through the night. Another colleague, mother of three, said "never." And to some extent it is true - sure, the newborn every 3-4 hour torture phase passes, but then you get teething, and then nightmares/fear of the dark, and beyond that I wouldn't know from personal experience, but let's just say I've got one teething and one afraid of the dark, and I am thoroughly sleep-deprived.

Which wouldn't bother me too much, except that I put on a couple of pounds (darn cortisol) and vanity is a failing of mine. From medical training I'm an old hand at sleep deprivation. Now there are actually laws against the amount of work I did back in the day. Not that I disagree with those laws as a rule - I happened to do my psych consultation service on some of the same floors where the medicine interns wore EEG leads post-call (meaning after being up all day and all night, and rounding on patients in the morning) as part of a study on resident work hours. Turns out they spent a good deal of the time after morning rounds in stage 1 sleep. Since I'm sure most of you don't care for sleepwalking doctors in charge of communicating vital information about your admission and hospital course, I imagine you are probably in favor of laws controlling medical residents' work hours. You might be surprised by the response within the medical community. They want residents to suffer - they think it makes better doctors. And working crazy hours does build confidence and experience faster than anything. However, I hope it is not terribly controversial to suggest that we not make new doctors at the expense of safety (too much).

Well, sleep! Which I am sorely missing. I'm not young anymore, after all. I have at hand a basic reference for the neurobiology of the circadian rhythms, from the supplements to the Journal of Clinical Psychiatry (which I normally throw away, as they are generally thinly-gilded advertisements for the pharmaceutical industry). This supplement is no different - it came out in 2005, which was right about when Ambien CR and Rozerem were being marketed, and if you read the entire supplement you get statements about how cheap, long-used sleeping pills aren't FDA approved. Of course they are not - they were generic long before the current FDA approval process, and who would bother to spend the several gazillion dollars needed for the current FDA approval process for generic medicines? But, pharmaceutical advertising notwithstanding, the paper I'm referencing is rather too basic to be dangerous, and it is free (with free registration) for everyone, so let's call it egalitarian.


The Human Circadian System and Normal and Disordered Sleep

The neurobiology of sleep is nearly as simple as light and dark. We have, in our brains, right in the middle a bit above our eyes, a little area called the suprachiasmatic nucleus. It is an area of about 10,000 neurons that runs our circadian rhythms. Without light stimulus, it tends to fire in a rhythmic pattern in a cycle of a bit more than 24 hours. Light stimulus will rein it in and keep it correlated with our natural light/dark cycles.

Light hits the retina of our eyeballs, which then sends a signal up through our optic nerve directly to the suprachiasmatic nucleus. There a dimer of two proteins is made in response to the light - CLOCK/BMAL1 (let me suggest that the scientist who named the CLOCK protein was a tad more poetic than the namer of BMAL1). CLOCK/BMAL1 starts the day running by binding the per/cry promoter regions in the nucleus, leading to the creation of the PER/CRY protein complex (there are in fact 3 Pers and 2 Crys, but let's just call it PER/CRY for simplicity's sake).

From here, PERs and CRYs have to bond with other PERs and CRYs to form dimers, or else they are broken down very readily. The PER/CRY dimers accumulate in the cytoplasm through the day, then translocate back into the nucleus, where they inhibit CLOCK/BMAL1 and shut down their own production. All right! Throughout the night, when no new PERs or CRYs are made, the dimers slowly degrade, until light comes again and more new ones are made. Thus the cycle of life and light and dark. That, in a nutshell, is the circadian rhythm.
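The feedback loop described above - light-driven production, self-inhibition, steady degradation - can be sketched as a toy simulation. Everything here (the rate constants, the 12-hour light/dark split, the functional form of the inhibition) is a made-up illustration of the loop's shape, not real circadian biology:

```python
# Toy model of the PER/CRY feedback loop: light boosts CLOCK/BMAL1-driven
# production of PER/CRY, accumulated PER/CRY dampens its own production,
# and the dimers steadily degrade. All rate constants are illustrative.

def simulate_per_cry(hours=48, production=1.0, degradation=0.15, inhibition=0.5):
    per_cry = 0.0
    trace = []
    for h in range(hours):
        light = (h % 24) < 12                # crude 12h light / 12h dark cycle
        if light:
            # production driven by light, dampened as PER/CRY accumulates
            per_cry += production / (1.0 + inhibition * per_cry)
        per_cry *= (1.0 - degradation)       # degradation runs day and night
        trace.append(per_cry)
    return trace

trace = simulate_per_cry()
peak = trace[:24].index(max(trace[:24]))
trough = trace[:24].index(min(trace[:24]))
print(f"PER/CRY peaks at hour {peak} and bottoms out at hour {trough}")
```

Run it and PER/CRY climbs through the lit hours and decays through the dark ones - the same rise and fall the real per/cry machinery produces on a grander scale.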

Under conditions of sleep deprivation, our circadian clock will keep us from getting too far out of whack from light/dark cycles. We will readily sleep in the wee hours of the morning (and in fact major industrial accidents such as the Exxon Valdez crash and the Three Mile Island radiation leak happened at 3-4 am - at least according to my book on chaos and sleep), and yet even if sleep deprived and given ample opportunity, we have a hard time sleeping during the "forbidden zones" of 9am and 9pm.

And what of that famous pineal gland hormone, melatonin? It is secreted in response to dark, and light stimulus (such as checking the twitter feed on the ipad at 3am, or turning the hall light on to quell the 3 y/o's fear of the dark) will diminish melatonin secretion immediately. The human suprachiasmatic nucleus (SCN) has a bunch of melatonin receptors (there are very few in the human brain outside this region). Melatonin signals the SCN to cool it and settle down for the night. Melatonin seems to sharpen the natural SCN response to light and dark. Without proper melatonin signaling, light and dark signals to sleep or wake up are attenuated, leading to night wakefulness and daytime sleepiness.

So what to do if you have insomnia? The first thing is to cut out any late night retina stimulators - like TV or internet. Music or low-light reading is probably okay. Our retinas were never designed for HDTV at midnight streaming so much signal straight into our brains (I'm reminded of when my oldest as a baby was awake at night, and my husband, trying to spare me, dutifully took her downstairs and began watching "Three Kings" with her at 2am - well, that Iraqi sun on the big screen TV at 2am did nothing for our baby's ability to sleep, and she was WIRED for several hours.) Do not turn on the lights for a midnight visit to the refrigerator or the restroom. I'm not entirely convinced that complete, black darkness is necessary given our ancestral propensity to sleep under the moon and stars, but as little light as possible is likely ideal.

Get rid of the clocks, too. I don't use an alarm and haven't for many years. If you need an alarm to wake up, you aren't getting nearly enough sleep in the first place.
There is an online resource called "CBTforInsomnia.com" that has an inexpensive program to help with insomnia (I have no connections to this resource and receive no money from promoting it).

Failing that, if you have a condition such as depression, anxiety, or bipolar disorder, sleep is exceedingly important. I'll focus more on these individual conditions in separate blog posts. In these cases I often feel it is prudent to prescribe sleep medicines (in fact, with an escalating mania, especially a psychotic mania, sleep medicines will nip it in the bud as quickly as anything else), but it is obviously not the ideal and is not a long term solution.

My own preferred sleep remedy is magnesium supplementation. I take magnesium oxide (low bioavailability, but easy to find at any drugstore) 250-500mg (depending on how many nights in a row I forget to take it), a lower dose of magnesium citrate, or a low dose of magnesium gel meant to spread on the skin, and I sleep well, right up until one of the children wakes me up. Maybe I should coat them with magnesium too.

Wednesday, January 12, 2011

Autism and Interpregnancy Interval

A paper came out earlier this week: Closely Spaced Pregnancies are Associated with Increased Odds of Autism in California Sibling Births.

In short, the researchers matched every sibling birth in California from 1992-2002 with records of receiving services for autism, spent a good deal of time on the statistics, and did a secondary case-control study to make sure they weren't missing anything.  It turns out that a second child born within 12 months of a first child (*actually I messed this up a little - it is an interpregnancy interval of less than 12 months - so a second child born less than 18 months after the first.  sorry!) has a little more than 3-fold higher odds of autism compared to a second child born more than 3 years after the first one.  Odds for second children born at interpregnancy intervals of between 12 and 36 months were middling, but risk rose abruptly between the 12-18 and "0"-12 month intervals.
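As an aside, that "3-fold" figure is an odds ratio, and the arithmetic behind one is simple enough to show. The counts below are made up for illustration - they are not the paper's data:

```python
# Odds ratio from a 2x2 table: odds of the outcome among the exposed,
# divided by odds of the outcome among the unexposed.
# Counts below are hypothetical, NOT from the California study.

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# hypothetical: second siblings with a <12 month interpregnancy interval
# (exposed) vs a >36 month interval (unexposed)
print(odds_ratio(30, 970, 10, 990))  # roughly 3.06
```

With rare outcomes like autism, the odds ratio closely approximates the relative risk, which is why it gets read casually as "3-fold risk."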

Unfortunately, the researchers spent so much time in the paper reviewing the statistics and making sure every last variable was accounted for that the discussion as to why was about one paragraph.  They thought it might be folate depletion, omega 3 depletion, or stress (it is, obviously, very stressful to have a young baby and to be pregnant at the same time.)  In a number of posts on autism I have speculated as to some nutritional and genetic causes:

Diet and Autism1
Diet and Autism 2
Autism and Vitamin D
Autism 4 - Inflammation Speculation
Brain Efficiency, Pediatric Edition

Since, during pregnancy, a baby will tend to suck whatever nutrients are needed straight from mama, whether she can spare them or not, it would make sense that a nutritional explanation could account for the increased risk of autism in second children when the pregnancies are closely spaced. 


Here is a free, online, up to date, and comprehensive review of pregnancy, nutrition, and birth outcomes - if you are interested.

My ultimate preventative solution is, of course, to make sure that moms-to-be out there are consuming nutrient rich diets with plenty of folate, phospholipids, minerals, omega 3s, etc. etc. etc.

Also, it is snowing.  A lot.

Mailbox and Snow Bank

Grill

Tuesday, January 11, 2011

Snowpocalypse II Beef Stew

We're supposed to get another 14-21 inches of snow starting after midnight tonight, and despite several unseasonably warm days, we still have 8-10 inches of snow on the ground from the Christmas storm plus some additional snow last week. Needless to say clinic will be closed, and I'm cooking up some stew for a warm and cozy snowed-in day tomorrow. With that amount of snow, the plows can push up banks 4-6 feet high at the end of the driveway - they probably won't melt completely until late April or May! Well, here's the recipe of the evening (as usual, thrown together):

*Carton of beef broth
*Glug glug × 30 or so of red wine
*A pound of grassfed stew meat (usually I would brown this but I was feeling "slow and gentle cooking methods" today, or perhaps "lazy.")
* one onion (chopped)
* three small potatoes, peeled and chopped
* 2 carrots, chopped
* 1/4 cup cooked rice (to thicken - leftover excess from fried rice in the fridge)
* 4 marrow bones, roasted at 350 in the oven on a cookie sheet for half an hour, then put with tongs into the stew
* 2-3 stalks of celery, chopped
* a bit of baby spinach, chopped
* Celtic Sea Salt®, Light Grey, By The Grain & Salt Society, Coarse Ground, 1 lb, pepper, kelp flakes (that's just the amazon link - of course I don't use a pound of salt!)

Boil on high for a bit, then set to low and boil for a long time. Toward the end I will add some chunks of pasture butter and maybe even some olive oil. Remove the bones before you eat! I usually use 2 bones, but I found a source of grassfed marrow bones not 2 miles from the house, so now I don't have to wait for my (infrequent) 50 pound orders of grassfed beef + marrow bones from my home state. This will feed four of us several times for a couple of days, so I don't sweat the rice and potatoes.

Here's a picture from a previous winter - maybe I will have some new ones tomorrow!


Sunday, January 9, 2011

Evolutionary Psychiatry - the Key Posts

What with the very recent post link from PaNu, and Chris Kresser tweeting and facebooking (eesh is that a word?) my last post (thanks Chris!), it looks like I have some new readers.  Hi!  I can understand you not wanting to go back through 130 some-odd posts to catch up, but a lot of my posts build on previous ones, which have built on previous ones... well, I wouldn't want anyone to miss out.  And one of my goals of this blog is to break down these topics so that the layperson can understand it (sometimes I am more successful at that than other times), but I don't want to keep repeating myself, either.

So here we are, an Evolutionary Psychiatry primer, if you will.

Basic Premise (includes monetary disclaimer, for giggles)

One of the key factors in studying obesity and metabolism is the idea that inflammation is the root of all evil.  Well, it turns out the same is true of depression and other mental illness.  Here are a couple starter posts for that:  Depression 2 - Inflammation Boogaloo, and Depression Crashed Your Party.  Here's one related to Autism.

Another theme of my blog is how mental illness is changing over time.  As an example, back in the age of large asylums, a great many folks with schizophrenia were of the hebephrenic type, meaning they were disorganized, silly, and generally obstinate.  In recent times the hebephrenic type has essentially disappeared - I've seen only one in my career.  No one knows why (as far as I know) - perhaps it is particularly responsive to medication, perhaps it was due to some sort of severe vitamin deficiency that general vitamin repletion has fixed in society at large - all total speculation.  Changes like that in mental illness interest me, especially changes that have escalated along with the escalation of obesity and metabolic syndrome, because that says "diet" to me.  Here's a post on the history of depression: Depression 3 - Not Quite What It Used to Be.  Here's one about Eating Disorders.

Here's some important stuff about the links between metabolic syndrome and depression: Chronic Stress is Chronic Illness and Stress is Metabolic Syndrome

If you want the basics on how neurons work, look at Ketogenic Diets and Bipolar Disorder 2.

Here's my most popular post (by a lot) and requires no explanation:  Your Brain on Ketones.

One of my themes is that a single nutrient on its own is rarely the answer - we need an anti-inflammatory diet rich with nutrients (and do the best we can to control sleep and stressful lifestyle factors as well).  Alzheimer's Pathology and the Dementia-Free Kitavans is a bit heavy, but makes the point.

Here's another self-explanatory one: Your Brain on Omega 3s.

So I think that pretty much covers the bare bones basics.  However, if you have a particular interest, I encourage you to check out my post map, which (more or less) has everything broken down by condition or nutrient.

And, last but not least, a few of my favorite posts (though I'm narcissistic enough to like quite a lot of them, perhaps I'd be better off writing less and thinking/revising/consolidating more!)

Zombieland
The Evolution of Serotonin
That Tapeworm Ate Your Depression
The Case for Evolution

Genius and Madness

Saturday, January 8, 2011

Alzheimer's, Mild Cognitive Impairment, and Ketosis

Funny thing is, not even a month ago I was commenting over at Dr. Parker's Diabetic Mediterranean Diet blog that there weren't really any studies about ketogenic diets and Alzheimer's and dementia.  There were a few case reports that it was helpful, to be sure, but nothing much on Pubmed. Which is disappointing, because if you know anything about (very low carb or coconut oil heavy) ketogenic diets, you know that they should reduce inflammation, enhance energy efficiency, and decrease hyperinsulinemia, all of which are implicated in the pathogenesis of Alzheimer's. 

Thank heavens I follow Dr. Eades' twitter feed.  In December 2010 and January 2011 not one, but two academic papers have come out related to dementia and ketogenic diets or glucose metabolism.  So here they are:


Dietary ketosis enhances memory in mild cognitive impairment

In this study, researchers did the obvious thing and put 23 older folks with mild cognitive impairment (MCI, an early stage of cognitive decline that often precedes dementia) on either a very low carbohydrate diet or their normal diet for 6 weeks.  Many people with MCI will progress to Alzheimer's, and MCI is important because at this point your brain still has more than a fighting chance.  Interventions at this point, if they work, might mean you don't progress to Alzheimer's.

Oh dear, here is a depressing quote from the article:

Contemporaneous with the developing dementia epidemic is an epidemic of obesity and associated metabolic disturbance. Currently, 64% of the USA adult population is overweight and 34% obese (Flegal et al., 2010). It is projected that by the year 2030, 86% will be overweight and 51% of adults in the USA will be obese (Wang, 2008). Likewise, diabetes prevalence is accelerating, particularly in the aging population (National Institute of Diabetes and Digestive and Kidney Diseases, 2008). Hyperinsulinemia, which is a precursor to type 2 diabetes, occurs in more than 40% of individuals aged 60 and older (Craft, 2005; Ford et al., 2002).

So the elders were recruited, given a battery of cognitive tests, then randomized to either a high carbohydrate diet (>50% of calories) meant to mimic an ordinary older American's diet, or a very low carbohydrate diet (10-20g of carbohydrate daily) for six weeks.  Protein, fat, and calories were not restricted.  "All subjects were advised to choose monounsaturated fats when possible, although this was not controlled." 

At the beginning of the studies, the subjects were pretty ordinary older Americans (though anyone with frank diabetes was excluded).  They tended to be overweight, had normal fasting glucose, but were at the high end of fasting insulin levels.  Fasting insulin levels correlated with waist circumference.

After six weeks, the ketogenic diet group got skinnier waists, lower fasting glucose and lower fasting insulin (which shouldn't be a surprise for anyone).  Ketone bodies were measurable in the urine of the very low carb subjects, and total calorie intake was also lower - fat and protein were not significantly different than the high carb group, however.  Basically the low carb folks ditched the carbs and didn't replace them, resulting in lower calories overall.

And the brain effects?   Memory was improved in the low-carb subjects, but not the tests of executive functioning (like trailmaking tests), suggesting a specific hippocampal and parahippocampal effect of ketones in the brain.  Very interesting!

On balance, these preliminary data provide evidence that dietary ketosis by means of carbohydrate restriction can provide neurocognitive benefit for older adults with early memory decline and increased risk for neurodegeneration.

Got it!  Moving on.

The second paper is Brain fuel metabolism, aging, and Alzheimer's Disease from the January 2011 edition of Nutrition.  This paper is a review article rather than a study, with lots of interesting evidence for shoddy glucose metabolism in animal and human studies of Alzheimer's disease.  I like this paper because it targets every piece of dietary insult on our brain that a Kitavan avoids just by being a horticulturist and not subsisting on a diet of *insert the vast majority of foods in the middle aisles of the modern Western grocery store here* - and the Kitavan happens to avoid dementia at the same time.  Poor glucose metabolism, poor omega 3 (DHA) status, and poor mitochondrial function are all found time and time again in folks with Alzheimer's, especially those with the ApoE4 allele or a maternal family history of Alzheimer's dementia.

This paper is 18 pages long and rather amazing, but I'll skip to the ketogenic part here:

Nevertheless, two observations in particular support the notion that the neurons affected in AD are still functional: (1) in AD, brain ketone uptake is apparently normal or at least less impaired than is glucose, and (2) there is a functional response to nutritional supplements that increase brain fuel availability, particularly ketones. Hence, if brain fuel metabolism could be optimized or even partially returned toward normal, the risk of further cognitive decline may diminish. Raising plasma ketones to 0.4-0.5 mM would contribute to 5-10% of the brain's energy requirements (Fig. 3), which is equivalent to the early cortical glucose deficit in those genetically at risk AD. Such a mild, safe level of ketonemia is achievable with ketogenic supplements, so if implemented before symptoms develop, it seems plausible that they could diminish the risk of further metabolic deterioration and clinical onset of cognitive decline.

Ketogenic supplements would mean medium chain triglycerides, such as coconut oil, which the Kitavans eat in spades.

I know, the prescription is very nearly always the same on my blog.  The scientific explanations that lead to that prescription in different disease states are often very different.  Avoid the neolithic dietary agents of disease (I count them, as many paleo types do, as wheat, fructose, and vegetable oils), not to mention other novel fake foods we were never designed to eat, just to be safe.  Ratchet down the carbs if you have love handles or metabolic syndrome.  Dabble with ketosis by intermittent fasting, coconut oil, very low carb, or all three, depending on your own personal brain status.  There's plenty left to eat that is gorgeous and yummy.  And you get to keep your brain.

Friday, January 7, 2011

The Power of a Woman's Tears

First off a big thanks to Kurt Harris over at PaNu for his props for my blog (at the same time he recommended Andrew's) - glad to see you back into blogging, and Professor Gumby is certainly a fascinating fellow.  A bit more housecleaning - there are a couple new studies out about Alzheimer's and Mild Cognitive Impairment and ketosis that I want to cover, and I'll probably get to that this weekend.  However, this morning into my mailbox tumbled a new study that has "Evolutionary Psychiatry" written all over it (or maybe not, but it is cool nevertheless), "Human Tears Contain a Chemosignal"  from pretty much the premier academic science journal in the world, Science (actually this one is in Science Express, which is the advanced online publication arm of Science.) 

The paper starts out with Charles Darwin, and that's always a good sign.  Basically the evolutionary question is why do we cry?  It builds nicely on yesterday's post on sex.  Emotions are troublesome, after all.  While some (like love) are typically sweet and enjoyable, others (like grief) are just plain painful.  We really don't have much choice about feeling them - if you somehow suppress anger or grief, for example, it usually pops out in other unhealthy ways, such as anxiety, substance abuse, or irritability.  Take it from a psychiatrist - the best thing to do with your powerful, driving emotions is to feel them and to act on what they are telling you in a reasonable way (that would not usually include busting a beer bottle over someone's head when you are angry).  For example, I'm still pretty angry about what I consider the misappropriation of medical and nutritional "science" to peddle cheap commodity grains at the expense of our health.  Each time I read Good Calories, Bad Calories: Fats, Carbs, and the Controversial Science of Diet and Health (Vintage), it gets thrown across the room a few times.  So I write a blog, and I promote the blogs of like-minded folks to the right.  Not because many of them also have me on their blogrolls, but because I like how they put their minds to the real problem of nutrition, lifestyle, and health in the 21st century, and I inevitably learn something from them.

But back to the paper.  These researchers started out small.  They took the tears of two women, ages 30 and 31, who watched sad movies in isolation.  They also took a vial of saline that had been trickled over the skin of the women and let each of 24 young men (mean age 28) have a sniff - the men couldn't tell a difference in smell between the emotional tears and the saline.  Then another 24 men (mean age 27) smelled the tears of three women (mean age 30) or saline and filled out some rating scales about the experience. 

Then the young men were given a pad pasted to their upper lip with 100 microliters of tears or saline dribbled onto the pad.  While thus anointed, the young men were asked to rate pictures of women's faces for sadness and for how sexually attractive the faces were, and were given a standard questionnaire to assess empathy.  Each subject (in double blind fashion, and on different days) did the tests with both a saline and a tear-stained pad. Tears or saline didn't change the men's empathy ratings or their ratings of sadness on the women's faces, but 17 out of 24 men smelling tears found the women's faces to be less sexually attractive (p < 0.02).

The researchers at this point brought in 50 men and used a sensory paradigm to generate negative emotions, all the while measuring heart rate, sweating, respiration rate, skin temperature, and self-rating of mood and arousal.  Salivary levels of testosterone were also measured before, during, and after smelling the tears or trickled saline of 5 donor women.  After sniffing, the subjects watched sad movies (one of the news reports mentioned Terms of Endearment). Once again, the subjective arousal of tear-sniffing men was decreased, as was the level of arousal in general measured by sweating, respiration, etc., and, more tellingly, the levels of testosterone in saliva dropped.

Then the researchers got very cute (and this last bit is likely why the paper made it into Science).  They took 16 young men with saline or tear-stained nose pads and had them watch sexually arousing movies and view sexy pictures - then stuck them into fMRIs.  The sexual arousal signal apparently is visible in the hypothalamus and left fusiform gyrus of the brain, and within these regions, activity was significantly lower in men who were sniffing tears.

All in all, ladies, if you are looking to dampen the mood, take him to a tear-jerker.  Otherwise, maybe stick with an action film or comedy.  Also, tears may be a quick way to defuse a man's overall emotional arousal - it may be an ancient hormonal way to tell an aggressive man to back off.

Now other animals also have tears, though it is thought humans are the only ones to shed emotional tears.  In any event, the lacrimal secretions of mice contain chemosignals, as do human urine and sweat.  Mammalian studies of chemosignals have uncovered pheromones that increase receptive sexual behavior or, as in this study, decrease arousal, while other chemosignals trigger sexual maturation or provide information (such as kinship - this may be why women tend to be most attracted to men who have the most different major histocompatibility genes).

I'll let the researchers finish up:

The findings pose many questions: What is the identity of the active compound/s in tears?...Moreover, could the emotional or hormonal state (menstrual phase, oral contraceptives) of the crier/experimenter influence the outcome?  In turn, what if any are the signals in men's tears... children's tears, and what are the effects of all these within rather than across gender?

 My question is - what are we doing to our behaviors by masking scents with bathing, deodorant, and perfume?   Would I be a more perceptive psychiatrist if my patients stopped bathing?  Hmmm.

Thursday, January 6, 2011

Sex, and Drugs, and Sex

Not too terribly long ago, Jamie Scott made a mockery of Mr. Kellogg, who if legend holds true peddled breakfast cereal as an armament against masturbation and excessive sexual behavior. Now why someone would care that much what any other random person does while naked is a continuing puzzlement to me (or not really, if you want to delve into the psychology, which is a whole different sort of blog post. Let's just say that there is a primitive defense called "projection," and it is often why outspoken proponents of sexual prohibitions exhibit the very sexual behavior they loathe the most. Also, never put anything in writing, on Facebook, in a text message, in a blog, or in photos that you wouldn't want showing up in a court of law or the public eye later on. Just common sense, people. For doctors specifically - never write anything in the medical record you wouldn't want your patient and his or her lawyer to see.) In the comments on Jamie's post, I made note of the finding that the young men in the Minnesota semi-starvation experiment noticed a considerable loss of sexual interest.

And let's be frank. Evolution is all about sex, genes, and breeding. The latter two topics might not sound sexy, but sex has everything to do with why you and I are here, and why you and I behave the way we do. Sex is ultimately why I write this blog (narcissism, fame, my picture on computer screens far and wide across the world - all should increase my local fitness and enhance the fitness of my children if all goes well, or at least fools my brain into thinking so). I've been reading The Red Queen: Sex and the Evolution of Human Nature, so genetics is a bit on the mind. I'm a psychiatrist, in America a discipline mostly inherited from the theoretical perspective of one dirty old man. Nothing in human evolution makes sense without considering sex. In my mind one of the positives of a paleolithic style diet is the anecdotal reports of (mostly) increased sexual drive. Vegetarians (in anecdote also) will often report a decreased sex drive.

In the comments for Jamie's post I quipped that decreased libido was probably a bad sign if it seemed to be caused by diet. If your body doesn't think you are fit for reproducing, it won't send you signals leading that direction, after all. And if your body doesn't think your diet is good enough to support little ones, it makes sense that your diet is not likely optimal for human health.

So that's the sex part. What about the drugs? Well, as a psychiatrist, I happen to have a lot of experience with drugs that decrease libido. And I hate being a hypocrite. Some of the data can be clouded by the fact that depression itself causes decreased libido. But the fact of the matter is that SSRIs and their cousins seem to cut down sexual desire and other phases of sexuality. With SSRIs it seems that the drugs act as a direct pharmacologic "anti-Viagra" in men, while in women they more commonly decrease sexual interest by mucking around with certain varieties of serotonin receptors. These are different problems from those of the low-protein diet - which seems to affect sex hormone production (here's an explanatory bit from a book about pigs, anyway).

Ideally, we could figure out a way to get depressed and anxious people feeling better without messy pharmacologic interventions. I would start with therapy, exercise, and a primal-style diet (Mark Sisson's 100-150g carbs a day with a pass on the starchy tubers as part of the non-toxic carbs, to be crystal clear).

And why sex (and males) in the first place, when parthenogenesis is just so much neater and cleaner and doesn't require all this vast effort to blog, not to mention black eyeliner, platform black suede boots, and haircuts? The Red Queen: Sex and the Evolution of Human Nature (pg 86) has the best argument I've heard - disease.

Sex is about disease. It is used to combat the threat from parasites. Organisms need sex to keep their genes one step ahead of their parasites. Men are not redundant after all; they are a woman's insurance policy against her children being wiped out by influenza and smallpox (if that is a consolation). Women add sperm to their eggs because if they did not, the resulting babies would be identically vulnerable to the first parasite that picked their genetic locks.

It is not an accident that the most polymorphic genes are the immunologic histocompatibility genes.  Meaning that, above all, each generation's variability seeks to outwit the pathogens that feed on us. Sex is the best generational bet to supply different MHC genes to confuse the viruses, bacteria, parasites, and fungi. There are no long-lived species without sex. Which, using somewhat un-rigorous thinking, means I will never be a vegan.

Tuesday, January 4, 2011

Bipolar Disorder and the Psychiatry War of the 20th Century

I'm going to start with a little history of psychiatry in America and the DSMIV.  Up into the 60s and 70s, psychiatry in America was heavily influenced by psychoanalysis - Freudian stuff (patient is on a couch free-associating, and the therapist is a "blank screen" - theoretically, though it hardly ever worked out like that), primarily because most of the psychoanalysts fled Germany during WWII and set up shop in London and New York and Boston - these places are still hotspots of psychoanalysis today and the center of East Coast academic training in psychiatry.  Psychoanalysts spend a lot of time talking about rage and repression and the unconscious and the mind, all very interesting, if you like that sort of thing.  But psychoanalysis isn't exactly neuroscience, and when biological correlates to some of the major psychiatric disorders started coming out, psychiatry swung the other direction.

In some respects there is a lot to admire about psychoanalysis as a science - Freud, a Victorian, met his match in a young girl with symptoms of hysteria named Ida (he called her Dora), and he came up with all sorts of Victorian theories about repression and sexuality to explain her symptoms of not being able to speak, along with some neurological hand problems.  The case is often studied in academic feminism as an example of why psychoanalytic theory is patriarchal and misogynist.  And yes, it certainly was, because that was the prevailing idea of the time.  However, it is often missed that the whole reason Freud published Dora was to present a failure of his therapy.  He missed the boat, he knew it, and as a Victorian he didn't quite understand why.  In that respect, Dora is a humble case study and real medical science circa 1901. Now we would get straight to the brass tacks: Dora, at the age of 14, was by her account repeatedly sexually propositioned by the father of the children she babysat, and the children's mother was also the lover of Dora's father.  Such a situation would be difficult now - imagine when one could not speak of such things and would not have been believed in any event.

Well.  Dora has a lot of dream analysis in it, and maybe that was a Victorian indirect way to get to the truth.  In 2011 we prefer more direct methods - saves time.  And I certainly prefer to look at much of psychiatric pathology from a neuroscience perspective, as it seems only rational.  The analysts will say we have lost the art of listening and all the modern psychiatrist does is shove pills down people's throats.  The biologic psychiatrist will say that psychiatric illness has more causes than just mental distress and to ignore those causes is unscientific and unconscionable.  The analyst will say the biologist is "mindless," the biologic psychiatrist will say the analyst is "brainless."

The truth of the matter is that we cannot afford to lose our ability to listen to patients - that is the problem in all of medicine at this time, and psychiatrists may be the last bastion of listening.  On the other hand, we can't afford to base psychiatric treatment on medical science circa 1901 (I'm being a little unfair here - hardly any analysts are Freudian drive-based anymore, most use a mix of more modern theoretical concepts derived from attachment theory, relational therapy, and even chaos theory).  So into the fray between biologic sorts and analytic sorts came the DSMIII and IV.  These books were written to be atheoretical.  Causes (whether it be genes and inflammation or history of trauma and personality style) are left out, on purpose, I think in part due to the fight between the analysts and the biologic psychiatrists.  I came into training rather at the end of this "war" but apparently it raged for decades. (One of my teachers, an analyst, said of another, a biologic psychiatrist, "I don't think he even believes in the unconscious!"   Another teacher talked about how he was forced as a resident to give psychoanalysis to actively psychotic individuals in state mental hospitals, and when it didn't work, was blamed for his failure.)

The DSMIV is merely a recipe book of traits.  Have the traits, match it up to the diagnosis, and there you go.  Mostly it was intended for research - since we don't have lab tests to define psychiatric illness, psychiatrists in one research center needed to be studying the same disorders as in another center - thus a checklist of sorts.  And then psychiatrists in the field needed to be talking about the same sort of problem that the researchers were studying treatments for.  It all makes perfect sense, but the DSMIV is maddeningly boring and the atheoretical part makes it a lightning rod for critics.  Then managed care and insurance and services based on diagnosis came along and the DSMIV became way more important than it should have been.

But the DSMIV is what we have, and there are certain definitions for bipolar disorder.  Bipolar I is when you have a manic episode (a period of insomnia, hypersexuality, impulsivity, rapid speech, increased religiosity, irritability, racing thoughts, manic psychosis often with religious delusions or grandiose delusions, increased energy, and euphoria - you don't need all of these to be manic, just enough of them, and to be mania, it needs to be serious enough for you to be psychotic or hospitalized.)  Bipolar I people usually have major depressive episodes also, but they don't have to.  Some people are only manic.

Then there is Bipolar II, where people tend to be depressed most of the time but occasionally have hypomanic episodes - mostly insomnia, irritability, increased goal-directed behavior, impulsivity, euphoria - but not as serious as a full manic episode. Bipolar II is a little hard to sort from regular depression - most of the people who show up at your clinic will be depressed, and hypomanic symptoms are often forgotten about, even when you ask directly about them.  

Neither of these are the same thing as "moody."  Being moody and irritable does not make you bipolar, though if you are bipolar, you will likely be more moody and irritable than average during an episode.  In a lot of ways, bipolar disorder overlaps (and sometimes exists at the same time with) other disorders - substance abuse, personality disorders, anxiety, depression, ADHD, which makes it all the more controversial.

Bipolar symptoms also tend to be different at different stages in life.  Kids will tend to cycle very rapidly between mood states and could hit many in the same day.  Adults tend to stick with one for several weeks or more.  (Bipolar disorder in kids is a bit controversial - it's called bipolar disorder because the same criteria fit to describe the behaviors, and often kids with bipolar symptoms do grow up to be adults with standard adult bipolar disorder, so it seems to be the same animal.  It is also highly genetic.  However, every kid with a temper or a bratty streak is not bipolar.  Bipolar in kids tends to be very serious - these kids are often kicked out of school (or preschool) for behavior problems.)

In a lot of ways, bipolar disorder is poorly understood.  

Which brings me to the paper I'm blogging about today, "An admixture analysis of the age at index episodes in bipolar disorder."  In this study, researchers interviewed 390 people with bipolar disorder in Canada about their history of illness, threw a bunch of data into a number cruncher, and came out with some interesting correlates.

First off, people with early onset bipolar disorder (average onset age 18) tended to be more likely to have a family history of bipolar disorder, and more likely to have psychosis, anxiety, suicidal thoughts and behaviors, and a chronic and rapid cycling course, and were more likely to have migraines.  People with late onset (usually starting around age 33) disease were more likely to also have diabetes.  Typically, bipolar disorder begins with a depressive episode, and often earlier in women than in men (which would match up with women's greater vulnerability to mood disorders in general).  I can add further speculation that the later onset bipolar being more associated with diabetes would suggest that it is possibly part of metabolic syndrome in certain vulnerable people.  Early onset bipolar disorder may be more its own animal.  I do think in both cases, inflammatory Western diets may be contributory, and there is some (bad epidemiologic) evidence that doesn't dispute that speculation.  Also interesting is the connection between bipolar disorders and migraines - both can respond to medications for epilepsy and, theoretically, to a ketogenic diet.


A modern psychiatrist is hamstrung without time to get a good history and the psychological savvy to establish an excellent rapport with the patient and the understanding of basic human nature - but a modern psychiatrist is also crippled by a lack of knowledge of neuroscience, nutrition, and general medicine.  We need to pursue both threads in 2011.  It all comes back together for the betterment of everyone.

Sunday, January 2, 2011

Zombieland 2 - You Are What You Eat, Mommyblogger Style

First the Zombieland movie trailer, just for fun.  Now take a peek at one of my more popular posts, Zombieland, which a brilliant psychiatrist friend of mine described as "food for thought."  The post was about phospholipids, primarily phosphatidylserine and phosphatidylcholine, delectable and important nutrients nature has supplied most richly in the consumption of brains (though eggs are perhaps a more palatable source).  (Linus Pauling Institute article about choline is here.)

(Hey, Nephropal is back!  Sorry about the jump around.  I spent a good part of the morning playing video games and my attention span is shot, though my reaction times are *awesome*!)

Little did I know that choline would become a very fashionable nutrient in the paleoblogosphere.   Chris Masterjohn, Paul Jaminet, and Stephan Guyenet all did a post or three, and everyone who is anyone doubled down on his or her egg consumption thereafter.  Much of the to-do was focused on fatty liver disease (one really must eat choline to protect one's liver from the rigors of modern life).  But, as it turns out, while the liver is on the front lines, the brain is where the battle really rages.

The data in humans is... scarce (there are some small human trials of induced choline deficiency and fatty liver, reversed by choline repletion).  But let's throw out what we have, be it epidemiology or mice or rats, and do some speculation.

First off, choline is part of the folate cycle, which I discussed a bit in this post here.  The folate cycle is exceptionally important for liver and brain/nerve health, and includes some key players - iron, vitamin B6, vitamin B12, folate, methionine, choline, oxidized choline (called betaine), SAM, niacin, and riboflavin.  All the players need to be on the field for the full folate cycle to run efficiently.  Most choline absorbed by humans is immediately made into phosphatidylcholine and incorporated into cell membranes.  Choline is also made into betaine in the liver, where it serves as a methyl donor for many important chemical reactions (1). One of the phosphorylated products of these many reactions is phosphocreatine, by the way.  Also important to neurons and the brain is sphingomyelin.



Problems or deficiencies in the folate cycle (including choline deficiency) are implicated in fatty liver disease, neural tube defects (like spina bifida), cardiovascular disease, and cancer.   The cycle begins with methionine (an amino acid), which using various B vitamin cofactors is made into SAM.  SAM is a methyl donor and is vital in nearly sixty important chemical reactions in the liver, from making neurotransmitters to cell membranes to DNA.   After being used as a methyl donor, SAM becomes SAH which becomes homocysteine.  Having high homocysteine is associated with dementia and cardiovascular disease - in my mind it is associated with poor nutrition in general.  Without all the players, homocysteine has a hard time being recycled for use in the folate cycle again.  In order to get the majority of the players, you need to eat a lot of whole grains, or vitamins, or organ meats/eggs.   Whole grains have their issues, so I'll stick to the latter sources, thank you very much.

All right, so choline is necessary, but why?  Fatty liver develops because you need choline to make VLDL particles.  VLDL carries triglycerides from the liver into the bloodstream.  If you don't have enough phospholipids like phosphatidylcholine to form the coat of the VLDL particle, you end up with triglycerides stuck in the liver ==> fatty liver.  Now if you don't have enough phospholipids to even get your triglycerides out of the liver, how could you possibly have enough for use in your other cell membranes (or in the neurons of the brain, which are especially rich in phospholipids)?  And how would they even get there without your cholesterol/fat carrying particles?  Fatty liver is just a harbinger of even more serious problems to follow. 

Neural tube defects are extremely common birth defects that result from faulty closing of the ends of the neural tube.  In very early development, we humans spend some time being sort of flat, like an elongated pancake.  Then we roll up in a couple of ways to become more... wormlike, before we grow limb buds and all sorts of other interesting things to become babies.  Rolling up properly requires the folate cycle to be running at full efficiency.  And, sure enough, epidemiologic studies have linked choline deficiency in human mothers to neural tube defects in their offspring (those in the bottom 25% of choline intake have 4 times the risk of babies with neural tube defects as those in the top 25%), and in mice, choline restriction is shown to cause neural defects.

Pregnancy and lactation are periods in a woman's life when she needs some serious choline to keep the machinery going.  Placental/amniotic fluid levels of choline are nearly 10 times mother's serum levels (2).  Moms seem to benefit from an enhanced choline-making machinery during pregnancy, but mom still ends up with depleted levels after childbirth (which will continue during lactation).  It's best to have time to replete this vital nutrient between kiddos.

Beyond the neural tube defect issue is the memory issue.  Neural tube defects usually begin within the first month of pregnancy, so by the time you realize you are expecting, it's pretty much too late to do anything about it.  But studies of rats have shown that choline depletion in mothers during the latter half of pregnancy results in offspring with lousy memories.  The hippocampus is the part of the brain at the epicenter of memory (and depression).   Pregnant rats who have plenty of choline seem to be able to make all the membranes, DNA, and stem cells you need to make an awesome hippocampus.  Pregnant rats without choline have baby rats who just don't remember as well.  In addition, choline supplementation in the second half of pregnancy seems to protect offspring rats from the detrimental brain effects of alcohol given to their mothers (3).  There are no studies in humans showing this link.  However, elevated human maternal homocysteine levels are linked with preeclampsia, prematurity, and low birth weight, and most studies have shown that the higher a mother's choline levels, the lower her homocysteine tends to be.

Other studies (including the Nurses' Health Study and the Framingham Offspring Study) have shown that higher choline levels are associated with lower inflammatory markers of many kinds, including IL-6, TNF-alpha, and C-reactive protein.  Keep in mind that in the PROSPECT-EPIC study, even those women in the highest quartile of choline intake fell below the Institute of Medicine number for Adequate Intake of 425 mg a day for women (to get that amount, you need to eat half a pound of chicken, two eggs, and a quart of 1% milk (not sure why anyone would be drinking that, but okay)! And those are relatively rich sources!) In the Nurses' Health Study, 95% of women fell below an average daily intake of 411 mg.  In addition, nearly 50% of us seem to have a genetic polymorphism where we have a hard time making methionine into choline, meaning choline becomes even more of an essential nutrient.  This fact may be why some people have bulletproof livers, and others develop fatty liver just by looking at a bottle of wine.
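For the curious, here's a quick back-of-the-envelope tally of that "half a pound of chicken, two eggs, and a quart of 1% milk" menu.  The mg values below are rough approximations I've plugged in for illustration (a nutrient database will give you the real numbers), not figures from any of the studies:

```python
# Back-of-the-envelope choline tally for the foods mentioned above.
# The mg-per-serving values are assumed approximations for illustration,
# not study data - look them up in a nutrient database for real numbers.
foods = {
    "chicken breast, 8 oz (~227 g)": 160,  # assuming ~70 mg per 100 g
    "eggs, 2 large": 290,                  # assuming ~145 mg each, mostly yolk
    "1% milk, 1 quart (~4 cups)": 170,     # assuming ~43 mg per cup
}

total = sum(foods.values())
for food, mg in foods.items():
    print(f"{food}: {mg} mg choline")
print(f"Total: {total} mg (Adequate Intake for women: 425 mg/day)")
```

Even with generous estimates, it takes that whole pile of food to clear 425 mg, and the eggs do much of the heavy lifting - which is rather the point of this post.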

The women with adequate intake in these large studies had a higher than average intake of eggs.  I wonder if choline is the reason recent studies have shown that eggs in conjunction with a calorie restricted diet seem to improve diabetes, despite all that fat and cholesterol. (I like these studies and how they seem to annihilate the lipotoxicity hypothesis). 

I find the information about choline to be convincing, and I eat eggs for breakfast about 4X a week (as do my kids).  The perinatal period where baby rats' brain development benefits from increased choline consumption can be extrapolated to about 4 years old in humans.  So eat up your eggs and offal, especially if you are a rugrat or plan to have rugrats any time soon!

*Wheat germ and soy lecithin are the richest vegetarian sources, but they won't be on my menu.