Diabetes Drug May Have Potential to Treat Parkinson’s Disease

While further research is needed, it’s another example of how powerful drug research can be. Now if only the U.S. spent more on this kind of research instead of inefficiently funneling so much of it through the pharmaceutical industry.

A drug commonly used to treat diabetes may have disease-modifying potential to treat Parkinson’s disease, a new study suggests, paving the way for further research to define its efficacy and safety.

The study, published in The Lancet and funded by The Michael J. Fox Foundation for Parkinson’s Research (MJFF), found that people with Parkinson’s who injected themselves each week with exenatide for one year performed better in movement (motor) tests than those who injected a placebo.

“This is a very promising finding, as the drug holds potential to affect the course of the disease itself, and not merely the symptoms,” said the study’s senior author, Professor Tom Foltynie (UCL Institute of Neurology). “With existing treatments, we can relieve most of the symptoms for some years, but the disease continues to worsen.”

The researchers followed 60 people with Parkinson’s disease at the National Hospital for Neurology and Neurosurgery (NHNN) as they used either a once-weekly injection of exenatide for 48 weeks, or a placebo, in addition to their regular medications.

They found that people who used exenatide had better motor function at 48 weeks, when they came off the treatment, an advantage that persisted through the 12-week follow-up. Those who had injected the placebo showed a decline in their motor scores at both the 48- and 60-week tests. The exenatide group’s advantage of 4 points, on a 132-point scale of measures such as tremors, agility and speech, was statistically significant.

The participants did not report noticeable improvements in their symptoms during the trial period beyond what their standard medication already did for them. They were tested while temporarily off all medication, to determine how the disease itself was progressing. The research did not determine conclusively whether the drug was modifying the disease itself, so the next stage in the research will investigate that more fully.

Parkinson’s disease affects 1 in 500 people and is the second most common neurodegenerative disease worldwide. Symptoms typically don’t become apparent until over 70% of the brain’s dopamine-producing cells have been affected. The condition results in muscle stiffness, slowness of movement, tremors, sleep disturbance, chronic fatigue and an impaired quality of life.

The above is from August 2017. Update on related Parkinson’s research on August 10th, 2018:

In July 2018, Johns Hopkins researchers developed an experimental drug that slows Parkinson’s disease in mice.

Johns Hopkins researchers say they have developed an experimental drug, similar to compounds used to treat diabetes, that slows the progression of Parkinson’s disease itself — as well as its symptoms — in mice. In experiments performed with cultures of human brain cells and live mouse models, they report the drug blocked the degradation of brain cells that is the hallmark of Parkinson’s disease. The drug is expected to move to clinical trials this year.

“It is amazingly protective of target nerve cells,” says Ted Dawson, M.D., Ph.D., director of the Institute for Cell Engineering and professor of neurology at the Johns Hopkins University School of Medicine.

Dawson explains that if planned clinical trials for the drug, named NLY01, are successful in humans, it could be one of the first treatments to directly target the progression of Parkinson’s disease, not just the muscle rigidity, spasmodic movements, fatigue, dizziness, dementia and other symptoms of the disorder.

A report of the study’s results was published June 11 in Nature Medicine.

According to the investigators, NLY01 works by binding to so-called glucagon-like peptide-1 receptors on the surface of certain cells. Similar drugs are used widely in the treatment of type 2 diabetes to increase insulin levels in the blood. Though past studies in animals suggested the neuroprotective potential of this class of drugs, researchers had not shown directly how it operated in the brain.

To find out, Dawson and his team tested NLY01 on three major cell types in the human brain: astrocytes, microglia and neurons. They found that microglia, a brain cell type that sends signals throughout the central nervous system in response to infection or injury, had the most sites for NLY01 to bind to — two times higher than the other cell types, and 10 times higher in humans with Parkinson’s disease compared to humans without the disease.

Dawson and his team knew that microglia secreted chemical signals that converted astrocytes — the star shaped cells that help neurons communicate with their neighbors — into aggressive “activated” astrocytes, which eat away at the connections between cells in the brain, causing neurons to die off. They speculated that NLY01 might stop this conversion.

“The activated astrocytes we focused on go into a revolt against the brain,” says Dawson, “and this structural breakdown contributes to the dead zones of brain tissue found in those with Parkinson’s disease. The idea was that if we could find a way to calm those astrocytes, we might be able to slow the progression of Parkinson’s disease.”

In a preliminary experiment in laboratory-grown human brain cells, Dawson’s team treated human microglia with NLY01 and found that they were able to turn the activating signals off. When healthy astrocytes were combined with the treated microglia, they did not convert into destructive activated astrocytes and remained healthy neuroprotective cells. Dawson’s team suspected that neurons throughout the body could be protected in the same way.

They explored this hypothesis by testing the drug’s effectiveness in mice engineered to have a rodent version of Parkinson’s disease.

In one experiment, Dawson’s team injected the mice with alpha-synuclein, the protein known to be the primary driver of Parkinson’s disease, and treated them with NLY01. Similar but untreated mice injected with alpha-synuclein showed pronounced motor impairment over the course of six months in behavioral tests such as the pole test, which allows researchers to measure motor impairment such as that caused by Parkinson’s disease. However, Dawson’s team found that the mice treated with NLY01 maintained normal physical function and had no loss of dopamine neurons, indicating that the drug protected against the development of Parkinson’s disease.

In a second experiment, Dawson’s team used mice genetically engineered to naturally produce more human-type alpha-synuclein, a model typically used to study the form of Parkinson’s disease that runs in families. Under normal conditions, these so-called transgenic mice will succumb to the disease in 387 days. However, Dawson’s team found that treatment with NLY01 extended the lives of the 20 mice treated with the drug by over 120 days.

Upon further investigation, Dawson’s team found that the brains of the mice treated with NLY01 showed few signs of the neurodegenerative characteristics of Parkinson’s disease.

Parkinson’s disease is a progressive disorder of the nervous system that affects approximately 1 million people in the U.S., according to the Parkinson’s Foundation. Early symptoms include tremors, trouble sleeping, constipation and trouble moving or walking, which ultimately give way to more severe symptoms such as loss of motor function and the ability to speak, and dementia. Most people begin showing symptoms in their 60s, but cases have been reported in patients as young as 2 years old.

Dawson cautions that the experimental drug must still be tested for safety as well as effectiveness in people, but based on the safety profile of other similar drugs, he does not anticipate any major roadblocks to its use in humans.

Dawson says he and his team have reason to be hopeful that NLY01 could, in a relatively short period of time, make an impact on the lives of those with Parkinson’s.

Drugs similar to NLY01 already approved by the Food and Drug Administration for the treatment of type 2 diabetes include exenatide, lixisenatide, liraglutide and dulaglutide, each of which can cost approximately $2,000 for a 90-day supply. NLY01 is a long-acting drug with improved brain penetration compared to these approved diabetes drugs.

More Climate Change Worsens Natural Disasters

Hurricane Florence has been receiving massive media coverage for the immense damage it’s doing. There are hundreds of thousands of people without electricity in North Carolina now, and among other things, such as threatening nuclear reactors, the flooding is doing major harm.

In the news media, it is almost never mentioned that climate change has made natural disasters such as hurricanes worse. More warm air translates to more water vapor, and more water vapor means worsened superstorms. In 2017, there was a record amount of U.S. economic costs related to natural disasters, in significant part due to hurricanes like Hurricane Florence.

Amazingly, it is now 2018 and there is not even much discussion about ways that human technology could reduce the strength of superstorms. Hurricanes require a sea surface temperature of 26.5 degrees Celsius to form, and there is some research showing that sending compressed bubbles (via perforated pipes located over a hundred meters down) from deeper in the ocean brings colder water to the surface. The cold water would cool the warmer surface water, possibly preventing hurricanes by removing their supply of energy.

The United States has given enormous subsidies to fossil fuel companies that operate oil rigs on the ocean, contributing to the greenhouse effect that leads to warming and worse storms. It doesn’t seem unreasonable to use materials from those rigs to create platforms that use perforated pipes to cool the ocean surface and prevent (or at least ameliorate) hurricanes. Nor does it seem unreasonable that, in response to data predicting where hurricanes are about to form, such platforms could be quickly deployed or transported to other locations.

But the absence of any such discussion reveals the kind of mass media (and therefore significantly communicative) structure currently in place: one that doesn’t discuss a key factor making the problem much worse, and one that barely mentions potentially viable technological solutions in the 21st century.

Climate change (yes, it’s real and at least largely human-caused) will keep making these sorts of disasters much worse if it continues unabated. In 20 years, Hurricane Florence may seem mild compared to the average hurricanes of 2038, and that is clearly a stormy future that needs to be prevented.

Some Antidepressants Are Actually Worsening Antibiotic Resistance

Another side effect of antidepressants, which are overall overrated as solutions to mental health problems. Antidepressants don’t help everyone and inadequately help many who do take them; the real solutions to mental health crises are methods such as creating an improved society for the general public, making concrete improvements in people’s lives through better diet, more exercise, and valuable therapeutic treatments, and using mental techniques (e.g., changing thinking patterns in major depressive disorder) that harness the human mind’s power.

Specifically though, fluoxetine, the essential ingredient in Prozac and an SSRI, has been implicated in spreading antibiotic resistance. The researchers who found this have previously reported in another study that triclosan — a typical ingredient in hand wash and toothpaste — also causes antibiotic resistance to worsen.

It was recently found, however, that there are thousands of new antibiotic combinations that are quite effective, which is worth noting. Antibiotic resistance is becoming a major problem that may do serious damage to the foundations of modern medicine if more research like that isn’t done and applied effectively.

Research: Kindness to Employees Improves Worker Performance

There’s thus good evidence that mean employers devalue companies. Someone ought to mention this to the highest level of management in the economy — there are too many of these employers, as workers generally know quite well.

Want the best results out of your employees? Then be nice to them.

New research from Binghamton University, State University of New York finds that showing compassion to subordinates almost always pays off, especially when combined with the enforcement of clear goals and benchmarks.

“Being benevolent is important because it can change the perception your followers have of you,” said Chou-Yu Tsai, an assistant professor of management at Binghamton University’s School of Management. “If you feel that your leader or boss actually cares about you, you may feel more serious about the work you do for them.”

[…]

They surveyed nearly 1,000 members of the Taiwanese military and almost 200 adults working full-time in the United States, and looked at the subordinate performance that resulted from three different leadership styles:

  • Authoritarianism-dominant leadership: Leaders who assert absolute authority and control, focused mostly on completing tasks at all costs with little consideration of the well-being of subordinates
  • Benevolence-dominant leadership: Leaders whose primary concern is the personal or familial well-being of subordinates. These leaders want followers to feel supported and have strong social ties.
  • Classical paternalistic leadership: A leadership style that combines both authoritarianism and benevolence, with a strong focus on both task completion and the well-being of subordinates.

The researchers found that authoritarianism-dominant leadership almost always had negative results on job performance, while benevolence-dominant leadership almost always had a positive impact on job performance. In other words, showing no compassion to your employees doesn’t bode well for their job performance, while showing compassion motivated them to be better workers.

They also found that classical paternalistic leadership, which combines both benevolence and authoritarianism, had just as strong an effect on subordinate performance as benevolence-dominant leadership. Tsai said the reason for this phenomenon may extend all the way back to childhood.

“The parent and child relationship is the first leader-follower relationship that people experience. It can become a bit of a prototype of what we expect out of leadership going forward, and the paternalistic leadership style kind of resembles that of a parent,” Tsai said.

“The findings imply that showing personal and familial support for employees is a critical part of the leader-follower relationship. While the importance of establishing structure and setting expectations is important for leaders, and arguably parents, help and guidance from the leader in developing social ties and support networks for a follower can be a powerful factor in their job performance,” Dionne said.

Because of the difference in work cultures between U.S. employees and members of the Taiwanese military, researchers were surprised that the results were consistent across both groups.

“The consistency in the results across different cultures and different job types is fascinating. It suggests that the effectiveness of paternalistic leadership may be more broad-based than previously thought, and it may be all about how people respond to leaders and not about where they live or the type of work they do,” Yammarino said.

Tsai said his main takeaway for managers is to put just as much or even more of an emphasis on the well-being of your employees as you do on hitting targets and goals.

“Subordinates and employees are not tools or machines that you can just use. They are human beings and deserve to be treated with respect,” said Tsai. “Make sure you are focusing on their well-being and helping them find the support they need, while also being clear about what your expectations and priorities are. This is a work-based version of ‘tough love’ often seen in parent-child relationships.”

Advanced Automation in the Future

Over the last several decades in the U.S., productivity gains have been concentrated in the upper echelon of the income distribution. The general population hasn’t really received them.


Productivity means the average output per hour in the economy. This has increased due to technological advances such as faster computer processing power and workers becoming more efficient at their jobs.

The story of robots taking all the jobs appears in the mass media with some regularity. However, if robots were actually taking all the jobs today, it would show up in the data. Massive automation implies massive increases in productivity, but as it stands, productivity growth rates have been quite low. Yearly productivity growth was higher in 2003 than it is today, and there has been a slowdown since about 2005. So based on the trend of the last dozen years, it seems unlikely that we will see dramatic advances in productivity (automation) over the next several years.

Society should be structured so that in the next decades, productivity gains will be distributed to the general population instead of primarily to upper-middle class and wealthy people. In a significant way, this will depend on who owns the technology.

It’s crucial that real care be taken with the rights awarded to those who own the most valuable technology. This may frankly determine whether that technology is a curse or a blessing for humanity.

In one example, say that the groundbreaking designs for the most highly advanced robotics are developed by a major corporation, which then patents the designs. The patent is valuable since the robotics would be far more efficient than anything else on the market, and it would allow the corporation to charge much higher prices than would otherwise be possible. This would be good for the minority of people who own the company and are invested in it, but it would almost certainly be harmful to the general public.

The case of prescription drugs shows us what happens when legal enforcement via patents goes wrong. The United States spent $450 billion on prescription drugs in 2017, an amount that would have been about a fifth as much (representing thousands of dollars per U.S. household in savings) were there no drug patents and a different system of drug research incentives. One obvious consequence of this disparity is that many people suffer with health ailments due to unnecessarily expensive medications.
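The arithmetic behind that claim can be sketched quickly. The household count below is my assumption (the U.S. Census put it near 126 million in 2017), not a figure from the article:

```python
# Rough arithmetic behind the prescription drug patent claim.
total_spending = 450e9                     # 2017 U.S. prescription drug spending
patent_free_spending = total_spending / 5  # "about a fifth as much" without patents
savings = total_spending - patent_free_spending

households = 126e6  # assumed number of U.S. households (not from the article)
print(f"Implied savings: ${savings / 1e9:.0f} billion")
print(f"Per household:   ${savings / households:,.0f}")
```

That works out to roughly $360 billion in implied savings, or a few thousand dollars per household, consistent with the "thousands of dollars" figure above.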

The major corporation with the valuable robotics patents may be able to make the distribution of the valuable robotics (which could very efficiently perform a wide range of tasks) much more expensive than necessary, similar to the prescription drugs example. The robotics being too expensive would mean that there’d be fewer of them to do efficient labor such as assembling various household appliances, and this would manifest itself as a cost to a lot of people.

So instead of the advanced robotics (probably otherwise cheap due to the software and materials needed for them being low cost) being widely distributed inexpensively and allowed to most efficiently automate labor, there could be a case where their use is expensively restricted. The robotics may even be used by the potentially unaccountable corporation for mostly nefarious ends, and this is another problem that arises with the control granted by the patents. Clearly, there need to be public interest solutions to this sort of problem, such as avoiding the use of regressive governmental interventions, considering the use of shared public ownership to allow many people to receive dividends on the value the technology generates, and implementing sensible regulatory measures.

There are also standards that can be set into law and enforced. A basic story is that if, after automation advances lower labor requirements among workers generally, the length of the average work year decreases by 20 percent, about 25 percent more people will be employed. The arithmetic may not always be this straightforward, but it’s a basic estimate for consideration.
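The work-sharing arithmetic above follows from holding total hours fixed: cutting each person's work year by 20 percent means 1 / 0.8 times as many workers are needed, i.e. 25 percent more. A minimal sketch:

```python
# Work-sharing arithmetic: total hours held fixed, per-person hours cut.
reduction = 0.20                             # 20% shorter average work year
employment_multiplier = 1 / (1 - reduction)  # workers needed per old worker
extra_employment = employment_multiplier - 1
print(f"{extra_employment:.0%} more people employed")  # 25%
```

As the text notes, real labor markets are messier than this, but the ratio is where the 20-percent/25-percent pairing comes from.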

Less time spent working while employing more people is clearly a good standard for many reasons, particularly in the U.S. where leisure rates among most are low compared to other wealthy countries. More people being employed may also mean tighter labor markets that allow for workers to receive higher real wage gains.

If there is higher output due to technology, that value will go somewhere in the form of more money. Over the last decades we have seen this concentrated at the top, but it is possible to have workers both work shorter hours and have similar or even higher pay levels.

Lacking Net Neutrality Presents Public Safety Risks

It’s horrible that ISPs slowed speeds for emergency responders in the wake of massive wildfires. The issue of net neutrality is really quite simple at its core: it’s about whether ISPs will have too much control over user access to the Internet. The large ISPs would prefer as much control as possible to increase their profits, even if that’s at the expense of public safety.

An ongoing study first reported by Bloomberg reveals the extent to which major American telecom companies are throttling video content on apps such as YouTube and Netflix on mobile phones in the wake of the Republican-controlled Federal Communications Commission (FCC) repealing national net neutrality protections last December.

Researchers from Northeastern University and the University of Massachusetts, Amherst used a smartphone app called Wehe, which has been downloaded by about 100,000 users, to track when wireless carriers engage in data “differentiation,” or when companies alter download speeds depending on the type of content, which violates a key tenet of the repealed rules.

Between January and May of this year, Wehe detected differentiation by Verizon 11,100 times; AT&T 8,398 times; T-Mobile 3,900 times; and Sprint 339 times. David Choffnes, one of the study’s authors and the app’s developer, told Bloomberg that YouTube was the top target, but carriers also slowed down speeds for Netflix, Amazon Prime Video, and the NBC Sports app.

[…]

Jeremy Gillula, tech policy director at Electronic Frontier Foundation, pointed to Verizon slowing down data speeds as Santa Clara County emergency responders battled the largest fire in California’s history. Verizon claimed it was a “customer-support mistake,” but county counsel James Williams said it proves that ISPs “will act in their economic interests, even at the expense of public safety,” and “that is exactly what the Trump administration’s repeal of net neutrality allows and encourages.”

That example, Gillula told Bloomberg, demonstrates “that ISPs are happy to use words like ‘unlimited’ and ‘no throttling’ in their public statements, but then give themselves the right to throttle certain traffic by burying some esoteric language in the fine print” of service contracts. “As a result, it’s especially important that consumers have tools like this to measure whether or not their ISP is throttling certain services.”

Study on Why Ridiculously False Beliefs Remain in Individuals

According to the study, it seems to have to do with an illogical feedback loop created in individuals. In my view, it’s important to think at least occasionally about whether your view of reality is actually an illusion. I know how often people disregard this advice (the truth is admittedly hard to face at times), but since we live in a world so generally obsessed with money, I like to point to a Bloomberg article based on interviews with billionaires, which reads: “Every member of my billionaire sample discussed the importance of discerning reality that others have missed or misunderstood.”

The study may be extrapolating somewhat from its experimental evidence, but it is interesting and the conclusion seems plausible enough nonetheless. I would connect the unwarranted confidence in the low-knowledge part of the Dunning-Kruger effect (low-ability people having too much confidence due to lack of knowledge or experience) to the diminished-curiosity dynamic presented here.

Ever wonder why flat earthers, birthers, climate change and Holocaust deniers stick to their beliefs in the face of overwhelming evidence to the contrary?

New findings from researchers at the University of California, Berkeley, suggest that feedback, rather than hard evidence, boosts people’s sense of certainty when learning new things or trying to tell right from wrong.

Developmental psychologists have found that people’s beliefs are more likely to be reinforced by the positive or negative reactions they receive in response to an opinion, task or interaction, than by logic, reasoning and scientific data.

Their findings, published today in the online issue of the journal Open Mind, shed new light on how people handle information that challenges their worldview, and how certain learning habits can limit one’s intellectual horizons.

“If you think you know a lot about something, even though you don’t, you’re less likely to be curious enough to explore the topic further, and will fail to learn how little you know,” said study lead author Louis Marti, a Ph.D. student in psychology at UC Berkeley.

This cognitive dynamic can play out in all walks of actual and virtual life, including social media and cable-news echo chambers, and may explain why some people are easily duped by charlatans.

“If you use a crazy theory to make a correct prediction a couple of times, you can get stuck in that belief and may not be as interested in gathering more information,” said study senior author Celeste Kidd, an assistant professor of psychology at UC Berkeley.

Specifically, the study examined what influences people’s certainty while learning. It found that study participants’ confidence was based on their most recent performance rather than long-term cumulative results. The experiments were conducted at the University of Rochester.

For the study, more than 500 adults, recruited online through Amazon’s Mechanical Turk crowdsourcing platform, looked at different combinations of colored shapes on their computer screens. They were asked to identify which colored shapes qualified as a “Daxxy,” a make-believe object invented by the researchers for the purpose of the experiment.

With no clues about the defining characteristics of a Daxxy, study participants had to guess blindly which items constituted a Daxxy as they viewed 24 different colored shapes and received feedback on whether they had guessed right or wrong. After each guess, they reported on whether or not they were certain of their answer.

The final results showed that participants consistently based their certainty on whether they had correctly identified a Daxxy during the last four or five guesses instead of all the information they had gathered throughout.

“What we found interesting is that they could get the first 19 guesses in a row wrong, but if they got the last five right, they felt very confident,” Marti said. “It’s not that they weren’t paying attention, they were learning what a Daxxy was, but they weren’t using most of what they learned to inform their certainty.”

An ideal learner’s certainty would be based on the observations amassed over time as well as the feedback, Marti said.

“If your goal is to arrive at the truth, the strategy of using your most recent feedback, rather than all of the data you’ve accumulated, is not a great tactic,” he said.

11 EU Countries to Supposedly Make Public Science Research Results Freely Available by 2020

Science should be much less about profit for paywalls and more about positively advancing humanity. More openness can enable more collaboration, which is beneficial to scientific researchers. If research is closed off, it may also be the case that more than one group of highly competent researchers is working on a specific problem, which can be inefficient since those people could be working on another important problem.

The UK, France, Italy, and eight other countries have formed a bold pact called cOAlition S, designed to ensure that from 1 January 2020, all publicly funded scientific research is freely, immediately available and fully open access (OA).

For the nations taking part, the plan represents the imminent realisation of an open access dream that began decades ago, and looks destined to signify the end of the paywall as we know it.

“‘Knowledge is power’ and I firmly believe that free access to all scientific publications from publicly funded research is a moral right of citizens,” the EU’s Commissioner for Research, Science and Innovation, Carlos Moedas, said in a statement.

“It is one of the most important political commitments on science of recent times and puts Europe at the forefront of the global transition to open science.”

[…]

The key principle of Plan S is that from 2020 forward, all scientific research funded by public grants awarded by the 11 nation funders must be published in compliant Open Access journals or on compliant Open Access platforms – immediately, and with no restrictions.