Research Into Pain Shows That When People Expect More Pain, They Feel More Pain

A good study that’s been needed for a while.

Expect a shot to hurt and it probably will, even if the needle poke isn’t really so painful. Brace for a second shot and you’ll likely flinch again, even though — second time around — you should know better.

That’s the takeaway of a new brain imaging study published in the journal Nature Human Behaviour which found that expectations about pain intensity can become self-fulfilling prophecies. Surprisingly, those false expectations can persist even when reality repeatedly demonstrates otherwise, the study found.

“We discovered that there is a positive feedback loop between expectation and pain,” said senior author Tor Wager, a professor of psychology and neuroscience at the University of Colorado Boulder. “The more pain you expect, the stronger your brain responds to the pain. The stronger your brain responds to the pain, the more you expect.”

For decades, researchers have been intrigued with the idea of self-fulfilling prophecy, with studies showing expectations can influence everything from how one performs on a test to how one responds to a medication. The new study is the first to directly model the dynamics of the feedback loop between expectations and pain and the neural mechanisms underlying it.

Marieke Jepma, then a postdoctoral researcher in Wager’s lab, launched the research after noticing that even when test subjects were shown time and again that something wouldn’t hurt badly, some still expected it to.

“We wanted to get a better understanding of why pain expectations are so resistant to change,” said Jepma, lead author and now a researcher at the University of Amsterdam.

The researchers recruited 34 subjects and taught them to associate one symbol with low heat and another with high, painful heat.

Then, the subjects were placed in a functional magnetic resonance imaging (fMRI) machine, which measures blood flow in the brain as a proxy for neural activity. For 60 minutes, subjects were shown low or high pain cues (the symbols, the words Low or High, or the letters L and W), then asked to rate how much pain they expected.

Then varying degrees of painful but non-damaging heat were applied to their forearm or leg, with the hottest reaching “about what it feels like to hold a hot cup of coffee,” Wager explains.

Then they were asked to rate their pain.

Unbeknownst to the subjects, heat intensity was not actually related to the preceding cue.

The study found that when subjects expected more heat, brain regions involved in threat and fear were more activated as they waited. Regions involved in the generation of pain were more active when they received the stimulus. Participants reported more pain with high-pain cues, regardless of how much heat they actually got.

“This suggests that expectations had a rather deep effect, influencing how the brain processes pain,” said Jepma.

Surprisingly, their expectations also strongly influenced their ability to learn from experience. Many subjects demonstrated strong “confirmation bias” — the tendency to learn from things that reinforce our beliefs and discount those that don’t. For instance, if they expected high pain and got it, they might expect even more pain the next time. But if they expected high pain and didn’t get it, nothing changed.

“You would assume that if you expected high pain and got very little you would know better the next time. But interestingly, they failed to learn,” said Wager.
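The feedback loop and confirmation bias described above can be sketched as a toy model (a hypothetical illustration, not the authors’ actual computational model; the functions and parameter values below are made up):

```python
# Toy sketch of the expectation-pain feedback loop (hypothetical
# parameters, not the study's fitted model).

def perceived_pain(stimulus, expectation, weight=0.4):
    """Perceived pain blends the actual stimulus with the expectation,
    so high expectations inflate the experience."""
    return (1 - weight) * stimulus + weight * expectation

def update_expectation(expectation, pain, lr_confirm=0.5, lr_disconfirm=0.05):
    """Move the expectation toward the pain just felt, but learn far
    less from disconfirming evidence (pain lower than expected)."""
    error = pain - expectation
    rate = lr_confirm if error >= 0 else lr_disconfirm
    return expectation + rate * error

# A subject expects intense pain (80 on a 100-point scale) but
# repeatedly receives mild heat (30). Because disconfirming trials
# barely register, the expectation stays far above reality.
expectation = 80.0
for _ in range(10):
    pain = perceived_pain(30.0, expectation)
    expectation = update_expectation(expectation, pain)
print(round(expectation, 1))  # still well above 30 after ten mild trials
```

With symmetric learning rates, the expectation would fall close to the true stimulus level within a few trials; it is the asymmetry that keeps the false belief alive, matching the failure to learn that Wager describes.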

This phenomenon could have tangible impacts on recovery from painful conditions, suggests Jepma.

“Our results suggest that negative expectations about pain or treatment outcomes may in some situations interfere with optimal recovery, both by enhancing perceived pain and by preventing people from noticing that they are getting better,” she said. “Positive expectations, on the other hand, could have the opposite effects.”

The research also may shed light on why, for some, chronic pain can linger long after damaged tissues have healed.

Whether in the context of pain or mental health, the authors suggest that it may do us good to be aware of our inherent eagerness to confirm our expectations.

“Just realizing that things may not be as bad as you think may help you to revise your expectation and, in doing so, alter your experience,” said Jepma.

AI System Successfully Predicts Alzheimer’s Years in Advance

Important research on Alzheimer’s disease, since it’s one of those diseases where treatment is more effective the earlier it’s caught.

Artificial intelligence (AI) technology improves the ability of brain imaging to predict Alzheimer’s disease, according to a study published in the journal Radiology.

Timely diagnosis of Alzheimer’s disease is extremely important, as treatments and interventions are more effective early in the course of the disease. However, early diagnosis has proven to be challenging. Research has linked the disease process to changes in metabolism, as shown by glucose uptake in certain regions of the brain, but these changes can be difficult to recognize.

“Differences in the pattern of glucose uptake in the brain are very subtle and diffuse,” said study co-author Jae Ho Sohn, M.D., from the Radiology & Biomedical Imaging Department at the University of California, San Francisco (UCSF). “People are good at finding specific biomarkers of disease, but metabolic changes represent a more global and subtle process.”

The study’s senior author, Benjamin Franc, M.D., from UCSF, approached Dr. Sohn and University of California, Berkeley, undergraduate student Yiming Ding through the Big Data in Radiology (BDRAD) research group, a multidisciplinary team of physicians and engineers focusing on radiological data science. Dr. Franc was interested in applying deep learning, a type of AI in which machines learn by example much like humans do, to find changes in brain metabolism predictive of Alzheimer’s disease.

The researchers trained the deep learning algorithm on a special imaging technology known as 18F-fluorodeoxyglucose positron emission tomography (FDG-PET). In an FDG-PET scan, FDG, a radioactive glucose compound, is injected into the blood. PET scans can then measure the uptake of FDG in brain cells, an indicator of metabolic activity.

The researchers had access to data from the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a major multi-site study focused on clinical trials to improve prevention and treatment of this disease. The ADNI dataset included more than 2,100 FDG-PET brain images from 1,002 patients. Researchers trained the deep learning algorithm on 90 percent of the dataset and then tested it on the remaining 10 percent of the dataset. Through deep learning, the algorithm was able to teach itself metabolic patterns that corresponded to Alzheimer’s disease.
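The 90/10 split described above can be sketched in a few lines; the scan IDs below are placeholders, and the actual ADNI preprocessing and model architecture are not given in the article:

```python
import random

# Hypothetical IDs standing in for the 2,100+ FDG-PET images;
# a fixed seed makes the split reproducible.
scans = list(range(2100))
random.seed(0)
random.shuffle(scans)

split = int(len(scans) * 0.9)            # 90% train / 10% held-out test
train_set, test_set = scans[:split], scans[split:]
print(len(train_set), len(test_set))     # 1890 210
```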

Finally, the researchers tested the algorithm on an independent set of 40 imaging exams from 40 patients that it had never studied. The algorithm achieved 100 percent sensitivity at detecting the disease an average of more than six years prior to the final diagnosis.
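Sensitivity here means the fraction of actual disease cases the model flags; 100 percent sensitivity means no false negatives. A minimal sketch with illustrative counts (not the study’s actual confusion matrix):

```python
def sensitivity(true_positives, false_negatives):
    """Sensitivity (recall): share of actual disease cases flagged."""
    return true_positives / (true_positives + false_negatives)

# Illustrative counts only: every case that later progressed to
# Alzheimer's was flagged, so there are zero false negatives.
print(sensitivity(true_positives=7, false_negatives=0))   # 1.0
print(sensitivity(true_positives=3, false_negatives=1))   # 0.75
```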

“We were very pleased with the algorithm’s performance,” Dr. Sohn said. “It was able to predict every single case that advanced to Alzheimer’s disease.”

Although he cautioned that their independent test set was small and needs further validation with a larger multi-institutional prospective study, Dr. Sohn said that the algorithm could be a useful tool to complement the work of radiologists — especially in conjunction with other biochemical and imaging tests — in providing an opportunity for early therapeutic intervention.

“If we diagnose Alzheimer’s disease when all the symptoms have manifested, the brain volume loss is so significant that it’s too late to intervene,” he said. “If we can detect it earlier, that’s an opportunity for investigators to potentially find better ways to slow down or even halt the disease process.”

Why Changing the Clocks With Daylight-Saving Time is Absurd

It’s an antiquated practice that has many people driving home from work (at around 5 o’clock) in relative darkness, likely leading to more traffic accidents and less quality time outside as well.

Daylight-saving time (not “daylight-savings” time) was created during World War I to decrease energy use. The practice was implemented year-round in 1942, during WWII. Not waking up in the dark, the thinking went, would decrease fuel use for lighting and heating. That would help conserve energy supplies to help the war effort.

[…]

According to advocacy groups like Standardtime.com, which are trying to abolish daylight-saving time, claims about saving energy are unproven. “If we are saving energy, let’s go year-round with daylight-saving time,” the group says. “If we are not saving energy, let’s drop daylight-saving time!”

In his book Spring Forward: The Annual Madness of Daylight-Saving Time, author Michael Downing says there isn’t much evidence that daylight-saving actually decreases energy use.

In fact, sometimes DST seems to increase energy use.

For example, in Indiana – where daylight-saving time was implemented statewide in 2006 – researchers saw that people used less electricity for light, but those gains were canceled out by people who used more air conditioning during the early evenings.

(That’s because 6pm felt more like 5pm, when the sun still shines brightly in the summer and homes haven’t had the chance to cool off.)

DST also increases gasoline consumption, something Downing says the petroleum industry has known since the 1930s. This is probably because evening activities – and the vehicle use they require – increase with that extra daylight.

Changing the clocks also causes air travel synchronization headaches, which sometimes lead to travel delays and lost revenue, airlines have reportedly said.

There are also health issues associated with changing the clocks. Similar to the way jet lag makes you feel all out of whack, daylight-saving time is like scooting one time zone over.

This can disrupt our sleep, metabolism, mood, stress levels, and other bodily rhythms. One study suggests recovery can take three weeks.

In the days after DST starts or ends, in fact, researchers have observed a spike in heart attacks, increased numbers of work injuries, more automobile accidents, and higher suicide rates.

[…]

The absence of major energy-saving benefits from DST – along with its death toll, health impacts, and economic ramifications – is reason enough to get rid of the ritual.

Combating Climate Change With Free Busing

Ideas worth trying to tackle the extremely relevant problem of climate change.

We are clearly going to have to change much about our lives if we are going to reduce greenhouse gas emissions by enough to save the planet. We can and should look for technical fixes like more fuel efficient cars and increased use of solar and wind energy, but it is not likely that these fixes can be adopted quickly enough to prevent lasting damage to the environment. We will also have to change the way we do lots of things.

One obvious target is commuting. We burn an enormous amount of oil in the process of getting to and from work. Part of this story is rush hour traffic, which causes people to burn fuel sitting in their idling cars, especially in the summer months when they have their air conditioners running.

There are some fairly simple ways to combat this congestion. For example, we could have congestion pricing, which would charge people for driving into city centers in the middle of the day. This is a Milton Friedman idea that was put into practice by London’s left-wing mayor, Ken Livingstone.

A second way to reduce congestion is to try to smooth out the flow of traffic over the work day by encouraging employers to offer flexible work hours. A modest tax credit may go a long way in this regard. After all, a 9-to-5 work day is a norm, not a matter of religious conviction.

The same story would apply to four-day work weeks. Suppose companies switched to four-day work weeks, with workers putting in 9 or 10 hour days, instead of the current standard five-day work week. This would reduce commuting by 20 percent, with the reduction in gas use being even larger since it would also reduce congestion.

But in addition to these actions, we should look to more mechanisms to get people out of their cars and onto more efficient modes of transportation. Most progressives will quickly sign on to mass transit, but this generally means subways or light rail. These modes of transportation have the serious disadvantage that they tend to be very expensive and take a long time to get up and running. The light rail approved in 2019 is likely to still be under construction in 2029. That is not a good story if the goal is a near-term reduction in greenhouse gas emissions.

There is a simple, quick, and cheap alternative. It’s called a “bus.” For some reason, buses don’t seem to feature prominently on the mass transit agenda. I have never been quite able to figure that one out. Perhaps it’s one of the many cases where the answer is too simple to be taken seriously.

It wouldn’t cost a lot of money or take very much time to get more buses on the road. Currently, we manufacture around 5,000 passenger buses a year. I suspect that number could be increased rapidly if there were demand. We could also import buses from foreign manufacturers.

Of course, many city buses now travel half empty. This both makes the cost per trip expensive and raises the obvious question of what good more buses would do if the ones we have are already hugely underutilized.

This is where having free bus transportation would make a big difference. If people had the option of taking a free bus, as opposed to driving to work and paying for gas, insurance, and parking, many more would opt to take the bus. We could try to make buses more attractive by adding more bus-only lanes that would allow them to pass other traffic. We could even follow an example used in other countries, where traffic lights adjust so that buses get a green light as they approach.

But even if these measures sound too expensive and/or exotic, simply making bus rides free should speed up trips considerably. We would no longer have to wait for people to fumble with their money or transit cards, or deal with card readers that don’t want to read. They would just hop on and off the bus.

Would free buses break the bank? To take one example, the Chicago Transit Authority, which serves the whole metropolitan area of 9.5 million, gets a bit less than $300 million a year from its bus fares. This means that replacing current passenger revenue would require annual tax revenue of a bit more than $30 per person.
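The back-of-the-envelope arithmetic checks out:

```python
# Reproducing the per-person estimate from the figures above.
annual_bus_fare_revenue = 300e6   # "a bit less than $300 million" a year
metro_population = 9.5e6          # Chicago metropolitan area

per_person_tax = annual_bus_fare_revenue / metro_population
print(round(per_person_tax, 2))   # roughly $31.58 per person per year
```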

Of course this would go up if we envisioned ridership doubling or even tripling. But there would also be savings if the bus system no longer had to deal with cash or issuing and reading fare cards. And, the cost increase would be nowhere near proportionate to the increase in ridership, since it costs little more to operate a full passenger bus than one that is nearly empty.

When comparing policies to deal with global warming, free bus fares have to rank near the top in yield per dollar. They would also have the great advantage of reducing other pollutants in the air in major cities. Gasoline is much cleaner than it was five decades ago, but the less we burn of it the better.

Taking cars off the road will also reduce the number of injuries and fatalities in car accidents. Yeah, driverless cars will do this too, but that is not going to happen next year. In fact, with average insurance costs per car close to $1,000 a year, the typical driver might save enough on insurance to more than compensate for the taxes needed to pay for free bus fare. (We could also start pushing pay-by-the-mile auto insurance, but that is another story.)

And, free bus travel can be phased in, just to see how people respond. We can have free travel days where the city announces that Tuesdays or some other day of the week will be free. We can also do it by route, where some bus routes are free, while people still have to pay regular fare on others.

Free bus travel is only one part of what we will have to do to limit greenhouse gas emissions, but it is a simple step that could in principle be quickly implemented. It should rank high on the agenda for folks who care about saving the planet.

Study: Aerobic Exercise Has Antidepressant Effects for Those With Major Depression

It seems like doctors should prescribe this sort of moderate-intensity aerobic exercise, instead of pharmaceutical drugs, much more often.

An analysis of randomized controlled clinical trials indicates that supervised aerobic exercise has large antidepressant treatment effects for patients with major depression. The systematic review and meta-analysis is published in Depression and Anxiety.

Across 11 eligible trials involving 455 adult patients (18-65 years old) with major depression as a primary disorder, supervised aerobic exercise was performed on average for 45 minutes, at moderate intensity, 3 times per week, and for 9.2 weeks. It showed a significantly large overall antidepressant effect compared with antidepressant medication and/or psychological therapies.

Also, aerobic exercise revealed moderate-to-large antidepressant effects among trials with lower risk of bias, as well as large antidepressant effects among trials with short-term interventions (up to 4 weeks) and trials involving preferences for exercise.

Subgroup analyses revealed comparable effects for aerobic exercise across various settings and delivery formats, and in both outpatients and inpatients regardless of symptom severity.

“Collectively, this study has found that supervised aerobic exercise can significantly support major depression treatment in mental health services,” said lead author Dr. Ioannis D. Morres, of the University of Thessaly, in Greece.

Three Types of Depression Identified in Research for the First Time

More knowledge about the societal problem of depression should lead to more effective treatments for it.

According to the World Health Organization, nearly 300 million people worldwide suffer from depression and these rates are on the rise. Yet, doctors and scientists have a poor understanding of what causes this debilitating condition and for some who experience it, medicines don’t help.

Scientists from the Neural Computational Unit at the Okinawa Institute of Science and Technology Graduate University (OIST), in collaboration with their colleagues at Nara Institute of Science and Technology and clinicians at Hiroshima University, have for the first time identified three sub-types of depression. They found that one out of these sub-types seems to be untreatable by Selective Serotonin Reuptake Inhibitors (SSRIs), the most commonly prescribed medicines for the condition. The study was published in the journal Scientific Reports.

Serotonin is a neurotransmitter that influences our moods, interactions with other people, sleep patterns and memory. SSRIs are thought to take effect by boosting the levels of serotonin in the brain. However, these drugs do not have the same effect on everyone, and in some people, depression does not improve even after taking them. “It has always been speculated that different types of depression exist, and they influence the effectiveness of the drug. But there has been no consensus,” says Prof. Kenji Doya.

For the study, the scientists collected clinical, biological, and life history data from 134 individuals (half of whom were newly diagnosed with depression and half of whom had no depression diagnosis) using questionnaires and blood tests. Participants were asked about their sleep patterns and whether or not they had stressful issues or other mental health conditions.

Researchers also scanned participants’ brains using magnetic resonance imaging (MRI) to map brain activity patterns in different regions. The technique they used allowed them to examine 78 regions covering the entire brain, to identify how activity in different regions is correlated. “This is the first study to identify depression sub-types from life history and MRI data,” says Prof. Doya.

With over 3,000 measurable features, including whether or not participants had experienced trauma, the scientists were faced with the dilemma of finding a way to analyze such a large data set accurately. “The major challenge in this study was to develop a statistical tool that could extract relevant information for clustering similar subjects together,” says Dr. Tomoki Tokuda, a statistician and the lead author of the study. He therefore designed a novel statistical method that would help detect multiple ways of data clustering and the features responsible for it. Using this method, the researchers identified a group of closely placed data clusters, which consisted of measurable features essential for assessing the mental health of an individual. Three out of the five data clusters were found to represent different sub-types of depression.
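The article doesn’t spell out Dr. Tokuda’s novel clustering method, but the general idea, grouping subjects by similarity across measured features, can be illustrated with a minimal k-means sketch on made-up two-feature data (everything below is hypothetical, and k-means is a stand-in, not the study’s method):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: repeatedly assign points to the nearest center,
    then move each center to the mean of its cluster."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centers[i])))
            clusters[nearest].append(p)
        # Recompute centers; keep the old center if a cluster empties out.
        centers = [tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl
                   else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical subjects described by two features:
# (functional-connectivity score, childhood-trauma score).
subjects = [(0.10, 0.10), (0.20, 0.15), (0.15, 0.20),
            (0.90, 0.80), (0.85, 0.90), (0.95, 0.85)]
centers, clusters = kmeans(subjects, k=2)
print(sorted(len(c) for c in clusters))   # [3, 3]
```

The study’s method goes further, discovering multiple valid clusterings and the features driving each, but the core operation of partitioning subjects in feature space is the same.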

The three distinct sub-types of depression were characterized by two main factors: functional connectivity patterns synchronized between different regions of the brain and childhood trauma experience. They found that the brain’s functional connectivity in regions that involved the angular gyrus — a brain region associated with processing language and numbers, spatial cognition, attention, and other aspects of cognition — played a large role in determining whether SSRIs were effective in treating depression.

Patients with increased functional connectivity between the brain’s different regions who had also experienced childhood trauma had a sub-type of depression that is unresponsive to treatment with SSRIs, the researchers found. On the other hand, the other two sub-types — where participants’ brains did not show increased connectivity among different regions, or where participants had not experienced childhood trauma — tended to respond positively to treatment with SSRIs.

This study not only identifies sub-types of depression for the first time, but also identifies some underlying factors and points to the need to explore new treatment techniques. “It provides scientists studying neurobiological aspects of depression a promising direction in which to pursue their research,” says Prof. Doya. In time, he and his research team hope that these results will help psychiatrists and therapists improve diagnoses and treat their patients more effectively.

Lessons from the 2017 Tax Scam

The tax cuts were supposed to lead to an investment boom, which would increase productivity in the economy, which would in turn boost real wages for most workers. That essentially didn’t happen, of course, and it was simple enough to predict, given an economy already so rigged for the richest at the expense of everyone else.

It’s a bit less than a year since Congress passed the Trump tax cut, but we are far enough along that we can be fairly confident about its impact on the economy. There are three main lessons we can learn:

  • The tax cut is not leading to the promised investment boom;
  • The additional demand generated by the tax cut is spurring growth and reducing the unemployment rate;
  • The Federal Reserve Board’s interest rate hikes are slowing the economy in a way that is unnecessary given current inflation risks.

The Investment Boom: Just Like Jared Kushner’s Hidden Genius, No One Can See It

Taking these in turn, it is pretty clear at this point that we will not see the investment boom promised by proponents of the tax cut. This point really has to be front and center in any discussion of the benefits of the tax cut. By far, the largest chunk of the tax cut was the reduction in the corporate tax rate from 35 percent to 21 percent, along with various other measures lowering corporate taxes.

The immediate impact of a corporate tax cut is to give more money to the richest people in the country since stock ownership is highly skewed towards the top 10 percent of the income distribution and especially the top one percent.

[…]

The data for the first three quarters of 2018 indicate that this is not likely to happen. Investment is up modestly, but we’re clearly not seeing the promised boom. In the first three quarters of 2018, investment was 6.7 percent higher than in the same period a year earlier. By comparison, investment rose by 6.9 percent in 2014 and increased at a 9.1 percent annual rate from 2010 to 2012.

Much of the growth we have seen this year comes not from the tax cuts but from higher world energy prices spurring a boom in oil and gas drilling. If we pull out energy-related sectors, the rise in investment is even smaller.

[…]

There is also no evidence of the promised investment boom in any of the various surveys showing business plans for the future. For example, the Commerce Department reported last week that new orders for non-defense capital goods, the largest component of investment, were up by less than 1.0 percent from their year-ago levels.

In short, at this point it certainly looks like the skeptics were right. Cuts in corporate tax rates are not an effective way to boost investment; they are an effective way to give more money to rich people.

Larger Budget Deficits Can Boost Growth and Employment

The second point is that the tax cuts did boost demand. This meant more growth and more jobs than we would have otherwise. This is a very good story; the 3.7 percent unemployment rate is the lowest we have seen in almost 50 years. The Congressional Budget Office is projecting the unemployment rate will bottom out at 3.2 percent next year. Its pre-tax cut forecast had the unemployment rate at 4.2 percent in 2019.

When we get to low levels of unemployment, the people who are getting jobs are overwhelmingly workers who are disadvantaged in the labor market: blacks, Hispanics, people with disabilities, and people with criminal records. The chance to get a foot in the labor market can make an enormous difference in their lives. We don’t have any social programs that can make as much difference for these people as, say, lowering the unemployment rate from 4.5 percent to 3.5 percent.

In addition to giving people jobs, the tighter labor market also gives workers in the middle and bottom of the wage distribution the bargaining power to achieve real wage gains. While the rate of real wage growth has been disappointing given the low levels of unemployment, workers at the middle and bottom have seen real wage gains over the last four years, in contrast to earlier in the recovery, when their wages were stagnant or declining. It is likely that the rate of real wage growth will pick up if unemployment remains this low or goes lower.

While boosting growth with a larger deficit offers clear and substantial benefits, tax breaks to the rich were hardly the best way to do it. If we had spent the money on infrastructure, education, or even tax breaks to low- and moderate-income households, it would have boosted demand by even more. And, we would have seen a longer-term dividend of a more productive economy if we had spent the money on infrastructure and education.

Obviously, the Republicans’ priority was giving more money to rich people and, given their control of the White House and Congress, it was inevitable that a tax cut for the rich would be the outcome. The Republican Congress blocked efforts by Obama to have any stimulus in his second term, but he really did not push the case. Many of Obama’s top economic advisers were fearful of deficits and were not anxious to see additional spending that was not offset with either tax increases or spending cuts in other areas.

We see from the economy’s response to the tax cut — increased growth, lower unemployment, and no evidence of accelerating inflation — that the economy could benefit from a larger budget deficit. It is unfortunate that we were only able to get this boost by giving still more money to rich people.