Bipolar Disorder Associated with Increased Levels of Vitamin D Binding Protein

A blood test may have the potential to speed accurate diagnosis – and proper treatment – of bipolar disorder in children, new research suggests.

Researchers at The Ohio State University found that children with bipolar disorder had higher blood levels of a protein associated with vitamin D compared to children without mood disorders. Finding a blood test to confirm bipolar disorder could improve care and cut the current 10-year average lag time between onset and diagnosis, said Ouliana Ziouzenkova, the study’s lead author and an associate professor of human nutrition at Ohio State.

In the study of 36 young people, levels of the vitamin D binding protein were 36 percent higher in those with bipolar disorder than in those without a mood disorder. The study appears online in the journal Translational Psychiatry.

Confirming the significance of the blood marker with further research will take time, but Ziouzenkova and her collaborators are enthusiastic about its potential, and the benefits it could offer to children and their parents.

Fig. 1: Serum 25-hydroxy vitamin D levels are inversely correlated with BMI in combined groups. Inverse linear correlation between vitamin D concentrations in plasma and BMI measured in combined groups of patients with and without MMD. Pearson correlation. Translational Psychiatry volume 8, Article number: 61 (2018) doi:10.1038/s41398-018-0109-7
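
Pearson correlation, the statistic cited in the caption, measures how close two variables come to a straight-line relationship; a value near −1 indicates a strong inverse correlation like the one plotted. A minimal sketch of the computation, using made-up numbers rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical numbers, not the study's data: BMI vs serum vitamin D
bmi = [18.5, 22.0, 25.3, 29.1, 33.4]
vit_d = [34.0, 30.5, 27.2, 22.8, 19.1]
print(round(pearson_r(bmi, vit_d), 3))
```

With real measurements, a library routine such as SciPy's `scipy.stats.pearsonr` also returns a p-value alongside r.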

“Childhood bipolar disorder can be very difficult to distinguish from other disorders, especially in youth with certain types of depression,” said Barbara Gracious, a study lead co-author and associate professor of clinical psychiatry and nutrition at Ohio State.

“Prompt diagnosis and appropriate treatment alleviates the suffering of the child and family, and can potentially lessen the risk for suicide,” she said.

Sensitive and specific biomarkers could give clinicians more confidence in choosing the most appropriate treatment, and decrease lags in proper diagnosis, Gracious said, adding that more research will be needed to confirm whether testing for the vitamin D protein could prove a valuable tool in practice.

The clinical part of the pilot study was conducted at Harding Hospital at Ohio State’s Wexner Medical Center and included 13 children without mood disorders, 12 children with diagnosed bipolar disorder and 11 children with major depressive disorder.

Ziouzenkova said it made sense to look at vitamin D binding protein because it potentially plays a role in brain inflammation. The researchers also looked at inflammatory markers in the blood, but found no significant correlations. Looking for the nutrient vitamin D in the blood, as opposed to the binding protein, appears to have low diagnostic power, she said.

“We wanted to look at factors that could be involved in mood disorders on a cellular level and that could be easily found in the blood,” Ziouzenkova said.

To date, finding a reliable blood marker for bipolar diagnosis has been elusive, she said. Her lab used an intricate technique to evaluate blood plasma, in which they essentially used biological “bait” to fish for inflammatory factors. That helped them identify the vitamin D binding protein as a potential diagnostic target.

“We want to help psychiatrists and other doctors diagnose children early and accurately. Once bipolar disorder progresses, it is more challenging to treat,” Ziouzenkova said.

If further research confirms the findings, developing a blood test would be a fairly straightforward and relatively inexpensive proposition, she said. Ziouzenkova is currently seeking support for a larger study using blood that has already been collected from patients with bipolar disorder, including adults.

The research was supported by the National Institutes of Health and the National Center for Research Resources.

Other Ohio State researchers involved in the work were Brawnie Petrov, Ayat Aldoori, Cindy James, Kefeng Yang, Aejin Lee, Liwen Zhang, Tao Lin, Jonathan Parquette, Arpad Samogyi, L. Eugene Arnold and Mary Fristad.

Publication: Brawnie Petrov, et al., “Bipolar disorder in youth is associated with increased levels of vitamin D-binding protein,” Translational Psychiatry, volume 8, Article number: 61 (2018) doi:10.1038/s41398-018-0109-7

Source: scitechdaily.com

Physicists Just Discovered an Entirely New Type of Superconductivity

One of the ultimate goals of modern physics is to unlock the power of superconductivity, where electricity flows with zero resistance at room temperature.

Progress has been slow, but physicists have just made an unexpected breakthrough. They’ve discovered a superconductor that works in a way no one’s ever seen before – and it opens the door to a whole world of possibilities not considered until now.

In other words, they’ve identified a brand new type of superconductivity.

Why does that matter? Well, when electricity normally flows through a material – for example, the way it travels through wires in the wall when we switch on a light – it’s fast, but surprisingly inefficient.

Electricity is carried by electrons, which bump into atoms in the material along the way, losing some of their energy with each collision. This energy loss, known as resistance, is the reason electricity grids lose up to 7 percent of their electricity.
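
To see why that loss matters and why superconductors are so appealing, resistive dissipation can be worked through directly: the power lost as heat is I²R, so for a fixed delivered power the loss falls with the square of the transmission voltage – and would vanish entirely at zero resistance. A quick sketch with illustrative numbers (not real grid data):

```python
# Resistive loss in a transmission line: P_loss = I^2 * R, with I = P / V.
# All numbers below are illustrative, not real grid data.

def line_loss_fraction(power_w, voltage_v, resistance_ohm):
    """Fraction of delivered power dissipated as heat in the line."""
    current = power_w / voltage_v            # I = P / V
    loss = current ** 2 * resistance_ohm     # Joule heating, P = I^2 * R
    return loss / power_w

# The same 100 MW sent down a 5-ohm line at two transmission voltages:
low_v = line_loss_fraction(100e6, 110e3, 5.0)
high_v = line_loss_fraction(100e6, 400e3, 5.0)
print(f"{low_v:.1%} lost at 110 kV vs {high_v:.1%} at 400 kV")
```

Raising the voltage from 110 kV to 400 kV cuts the loss by a factor of (400/110)² ≈ 13 – which is exactly why grids transmit at high voltage; a superconducting line would take the loss to zero.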

But when some materials are chilled to ridiculously cold temperatures, something else happens – the electrons pair up and begin to flow in an orderly fashion, without resistance.

This is known as superconductivity, and it has incredible potential to revolutionise our world, making our electronics unimaginably more efficient.

The good news is we’ve found the phenomenon in many materials so far. In fact, superconductivity is already used to create the strong magnetic fields in MRI machines and maglev trains.

The bad news is that it currently requires expensive and bulky equipment to keep the superconductors cold enough to achieve this phenomenon – so it remains impractical for broader use.

Now researchers led by the University of Maryland have observed a new type of superconductivity when probing an exotic material at super cool temperatures.

Not only does this type of superconductivity appear in an unexpected material, the phenomenon actually seems to rely on electron interactions that are profoundly different from the pairings we’ve seen to date. And that means we have no idea what kind of potential it might have.

To understand the difference, you need to know that the way electrons interact is dictated by a quantum property called spin.

In regular superconductors, electrons carry a spin referred to as 1/2.

But in this particular material, known as YPtBi, the team found that something else was going on – the electrons appear to have a spin of 3/2.

“No one had really thought that this was possible in solid materials,” explains physicist and senior author Johnpierre Paglione.

“High-spin states in individual atoms are possible, but once you put the atoms together in a solid, these states usually break apart and you end up with spin one-half.”

YPtBi was first discovered to be a superconductor a couple of years ago, and that in itself was a surprise, because the material doesn’t actually fit one of the main criteria – being a relatively good conductor, with a lot of mobile electrons, at normal temperatures.

According to conventional theory, YPtBi would need about a thousand times more mobile electrons in order to become superconducting at temperatures below 0.8 Kelvin.

But when researchers cooled the material down, they saw superconductivity happening anyway.

To find out, the latest study looked at the way the material interacted with magnetic fields to get a sense of exactly what was going on inside.

Usually, as a material undergoes the transition to a superconductor, it will try to expel any applied magnetic field from its interior – but the field can still penetrate a short distance near the surface before quickly decaying away. How far it penetrates depends on the nature of the electron pairing happening within.

The team used copper coils to detect changes in YPtBi’s magnetic properties as they changed its temperature.

What they found was odd – as the material warmed up from absolute zero, the amount that a magnetic field could penetrate the material increased linearly instead of exponentially, which is what is normally seen with superconductors.
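
The behaviour being measured can be stated with two standard textbook relations (these are general results from superconductivity theory, not formulas quoted from the paper). An applied field decays inside a superconductor over the London penetration depth λ, and how λ changes at low temperature depends on the pairing:

```latex
% Field decay just inside a superconductor's surface (London equation)
B(x) = B_0 \, e^{-x/\lambda}

% Conventional, fully gapped pairing: the penetration depth is
% exponentially flat at low temperature
\Delta\lambda(T) \propto e^{-\Delta_0 / k_B T}

% Pairing with gap nodes (as the YPtBi data suggest): linear in T
\Delta\lambda(T) \propto T
```

The linear temperature dependence the team observed is the signature of the second case, which is hard to reconcile with simple spin-one-half, conventional pairing.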

After running a series of measurements and calculations, the researchers concluded that the best explanation for what was going on was that the electrons were behaving as particles with higher spin – something that hadn’t even been considered as a possibility for a superconductor before.

While this new type of superconductivity still requires incredibly cold temperatures for now, the discovery gives the entire field a whole new direction.

“We used to be confined to pairing with spin one-half particles,” says lead author Hyunsoo Kim.

“But if we start considering higher spin, then the landscape of this superconducting research expands and just gets more interesting.”

This is incredibly early days, and there’s still a lot we have to learn about exactly what’s going on here.

But the fact that we have a brand new type of superconductivity to test and measure, adding a cool new breakthrough to the 100 years of this type of research, is pretty exciting.

“When you have this high-spin pairing, what’s the glue that holds these pairs together?” says Paglione.

“There are some ideas of what might be happening, but fundamental questions remain – which makes it even more fascinating.”

Source: sciencealert.com

SpaceChain, Arch Aim to Archive Human Knowledge in Space

SpaceChain on Monday announced that it has entered a partnership with the Arch Mission Foundation to use open source technology to launch an ambitious project involving the storage of large data sets in spacecraft and on other planets.

Arch Mission will load large quantities of data onto SpaceChain’s satellite vehicles with the eventual aim of storing data on other planets.

“The goal of archiving and preserving knowledge for future generations will advance archiving science and human knowledge by itself,” SpaceChain cofounder Zheng Zuo said. “The ambitious goal of disseminating this knowledge throughout the solar system is finally achievable today, thanks to greatly reduced launch costs through new space launch providers.”

SpaceChain’s decision to support the Arch mission to archive human data in space will help launch the Earth Library — a ring of backup data orbiting around the Earth — said Nova Spivack, cofounder of the Arch Mission Foundation.

The goal of the foundation is to “preserve and disseminate humanity’s most important information across time and space, for the benefit of future generations,” he told LinuxInsider.

Data Preservation

Among the data sets included are Wikipedia, the human genome, Project Gutenberg, the Internet Archive and the Rosetta Project, Spivack said. The project ultimately will include a vast library of books, music, photos, film, video and other data sets.

The partnership would allow SpaceChain’s long-term goal of storing data archives throughout the solar system to come to fruition.

The venture follows an earlier partnership involving Elon Musk’s SpaceX, which launched its Falcon Heavy rocket into space last month. Among other things, the launch carried a cherry red Tesla Roadster into space. The car is expected to orbit the sun for at least 30 million years.

The Arch Mission placed its first library, containing the Isaac Asimov Foundation trilogy, in that Falcon Heavy payload. There are plans to include additional libraries on future space flights, including a Lunar Library scheduled for delivery to the moon by 2020, and a Mars Library designed to accompany the first human settlers to the red planet.

SpaceChain, which was co-founded by CTO Jeff Garzik, a pioneer in the blockchain field and a key Linux kernel engineer, recently entered a venture with the Qtum Foundation to launch the world’s first blockchain node in space.

Qtum launched a CubeSat into space that uses its blockchain technology on a Raspberry Pi.

“Blockchain is an interesting technology as a foundational infrastructure for future space colonies,” said Aditya Kaul, research director at Tractica.

“Beyond the secure storage of data, I am personally very excited about how these could come together to create decentralized economies and possibly even governing mechanisms,” he told LinuxInsider.

“There is an interesting conference on some of these themes coming up in June in London,” Kaul noted.

Source: NewsDigitize

“Marine noise budgets” – a new method to manage the impact of underwater noise

Cefas scientists have published a study which proposes a new methodology to manage the impact of underwater noise on marine life.

The work, titled “Marine Noise Budgets in Practice” and published in the journal Conservation Letters, allows policy makers to measure how much noise pollution a particular marine species or protected area is exposed to, and to set targets to manage pollution levels.

What makes underwater noise, and what does it mean for marine animals?

Underwater noise pollution can disturb or injure many marine animals, from the largest whales down to microscopic zooplankton. Noise can be produced by activities such as shipping, sonar, explosions, pile driving (e.g. to construct offshore wind farms) and geophysical surveys (e.g. to look for oil and gas beneath the seabed). The study will assist ongoing efforts in the UK to better manage underwater noise.

Marine noise budgets: the new approach

The new method considers the population density of marine animals and their exposure to noise pressure across a managed area of ocean to map the risk it poses. In doing so, policy-makers can better target efforts to manage this noise.
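
As a rough illustration of the idea (a hypothetical sketch, not the indicator actually defined in the paper), a density-weighted exposure figure can be computed per grid cell of a managed area:

```python
# Hypothetical sketch of a density-weighted noise-exposure indicator;
# the cell values below are invented, not taken from the Cefas study.

def exposure_indicator(density, noise_activity):
    """Share of the animal population exposed to impulsive noise.

    density: animals per grid cell; noise_activity: fraction of the
    assessment period each cell had impulsive-noise activity (0..1).
    """
    total = sum(density)
    exposed = sum(d * n for d, n in zip(density, noise_activity))
    return exposed / total

# Four grid cells: porpoise density and noise activity per cell
density = [120.0, 80.0, 10.0, 40.0]
noise_activity = [0.5, 0.0, 1.0, 0.25]
print(f"{exposure_indicator(density, noise_activity):.1%}")
```

Mapping a quantity like this cell by cell is what lets policy-makers see where noise and animals overlap, and set targets for the cells where the product is highest.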

Mapping risk and calculating exposure indicators – the new approach demonstrated using North Sea harbour porpoise density

The article uses data from the 2017 OSPAR Intermediate Assessment, which carried out the first international assessment of impulsive noise activity (underwater noise made by pile driving, geophysical surveys, explosions, and some sonar activity), and was coordinated by the UK.

Sex-based differences in phagocyte metabolic profile in rats with monosodium glutamate-induced obesity

Abstract

An important component of obesity pathogenesis is the inflammatory activation of innate immune cells within adipose tissue and in other body locations. Both the course of obesity and innate immune reactivity are characterized by sex-associated differences. The aim of this work was a comparative investigation of the metabolic profiles of phagocytes from different locations in male and female rats with MSG-induced obesity. The administration of monosodium glutamate (MSG) caused obesity, with sex-associated differences, that was more severe in male rats. Obesity was associated with pro-inflammatory activation of CD14+ phagocytes from adipose tissue in female, but not in male, rats, demonstrated by decreased phagocytic activity along with increased ROS generation. Phagocytes from the peritoneal cavity and peripheral blood of obese female rats exhibited a neutral metabolic profile, whereas those cells from obese male rats displayed a pro-inflammatory metabolic profile. Thus, the manifestation of obesity-induced inflammation was characterized by different patterns of phagocyte metabolic profile in male and female rats. The identified immune cell characteristics expand our knowledge of obesity immunobiology and may help to develop more effective preventive and therapeutic interventions for obese patients of different sexes.

Introduction

The worldwide prevalence of obesity and its metabolic complications has substantially increased in recent years [1,2]. The propensity towards development of obesity differs between the sexes, due first of all to the effect of sex hormones on adipocyte metabolism [3,4]. In addition, sex-associated differences in cell types other than adipocytes within adipose tissue, such as innate immune cells, also account for differences in obesity between males and females. Sex-based differences in immune responses are well documented. These differences are attributed to the immunomodulatory effects of sex hormones, as well as to contributions of X-chromosome genes. The X chromosome encodes a number of critical genes involved in the regulation of immunity, such as Toll-like receptors. Moreover, the X chromosome contains about 10% of all microRNAs in the genome, which regulate immune cell differentiation and functioning [5,6]. The sex-differential expression of pattern recognition receptors (PRRs) produces sex-specific activity of innate immune cells following stimulation. Peritoneal phagocytes from female rodents produce higher levels of anti-inflammatory prostanoids in response to microbial stimuli than do male-derived cells, whereas male phagocytes produce more pro-inflammatory cytokines and chemokines following PRR stimulation than do female cells. The phagocytic activity of innate immune cells from many locations is higher in females than in males [7,8,9]. Sex hormones exert different immunomodulating effects: natural levels of testosterone show a significant positive relationship with the Th1 immune response, whereas natural levels of estrogen correlate with a Th2 immune cell metabolic shift [10,11].

Adipose tissue and the immune system are closely interrelated. The major alterations of immune responses seen during obesity have been described as obesity-induced low-grade inflammation, or «meta-inflammation» [12,13]. This disorder is associated with an increase in local adipose-tissue levels of inflammatory cytokines and other proteins (TNFα, IL-1β, IL-6, IFNγ, MCP-1, …)

Read full article: journalgazett.com

Second SpaceShipTwo performs first powered test flight

WASHINGTON — Virgin Galactic’s second SpaceShipTwo suborbital vehicle successfully performed its first powered flight April 5, the first such test flight since a fatal crash nearly three and a half years ago.

The vehicle, named VSS Unity, was released from its WhiteKnightTwo carrier aircraft in the skies about 14,200 meters above Mojave, California, at approximately 12:00 p.m. Eastern time. The vehicle ignited its hybrid rocket motor on what the company said was a “partial duration burn” lasting 30 seconds. The vehicle reached a top speed of Mach 1.87 and altitude of 25,686 meters.

After the end of its powered flight, SpaceShipTwo deployed its feathering system, raising its twin booms to permit a safe reentry before lowering them again for landing. The vehicle glided to a landing at the Mojave Air and Space Port in California a little more than 10 minutes after release.

The flight was the first powered flight for VSS Unity, which the company rolled out more than two years ago. Virgin started a series of glide flights of VSS Unity in December 2016, with the most recent such flight on Jan. 11. Company officials at the time suggested that January glide flight would be the last before the start of powered test flights.

This flight featured a longer burn, and thus higher speed and altitude, than prior powered test flights by the first SpaceShipTwo, VSS Enterprise. In a January 2014 test flight, VSS Enterprise reached a top speed of Mach 1.4 and altitude of more than 21,000 meters.

That flight was the third powered test flight by VSS Enterprise. On the fourth, in October 2014, the vehicle broke apart seconds into the powered phase of flight, killing co-pilot Michael Alsbury and injuring pilot Peter Siebold.

An investigation by the National Transportation Safety Board concluded Alsbury had prematurely unlocked SpaceShipTwo’s feathering mechanism as it passed through the sound barrier, causing an aerodynamic breakup. The investigation also blamed the vehicle’s developer, Scaled Composites, for design elements that contributed to the accident, including its inability to foresee a single-point failure such as a premature unlocking of the feathering mechanism.

The Spaceship Company, originally a joint venture between Virgin and Scaled Composites but now wholly owned by Virgin, had already been building a second SpaceShipTwo at the time of the accident. Virgin leadership, including Sir Richard Branson, decided to continue development of that vehicle, which the company named VSS Unity at its February 2016 rollout. The company also built the rocket motor, using nitrous oxide and HTPB propellants.

This powered test flight is the first in a series of such flights planned by the company to expand the vehicle’s performance envelope and to prepare for commercial flights carrying tourists and research payloads. The company has not stated how many such powered tests it expects to fly before the vehicle enters commercial service.

“We’re looking forward to having a full 2018 with powered test flights,” George Whitesides, chief executive of Virgin Galactic, said at a suborbital research conference in December. “We’re going to take our time to do it right.”

In addition to the test program, Virgin announced a non-binding agreement in October with the Public Investment Fund (PIF) of Saudi Arabia whereby the PIF would invest $1 billion into Virgin’s space companies, which also include small launch vehicle developer Virgin Orbit.

Virgin has not provided an update on the status of that investment. However, the company said that Saudi Crown Prince Mohammad bin Salman Al-Saud, whose duties include serving as chairman of PIF, visited the facilities of Virgin Galactic and The Spaceship Company in Mojave last weekend, while the company was preparing for what it only described as an “upcoming” SpaceShipTwo flight test. Photos of the event show the logo for the Saudi Arabian government’s “Vision 2030” initiative on the side of WhiteKnightTwo.

Virgin Galactic is “back on track,” Branson tweeted shortly after the successful test flight. “Data review to come, then on to the next flight. Space feels tantalizingly close now.”

Source: spacenews.com

Injectable goo could bring drugs to cancerous tumors

An injectable gel-like scaffold can hold combination chemo-immunotherapeutic drugs and deliver them locally and sequentially to tumors.

The results in animal models so far suggest this approach could eventually ramp up therapeutic benefits for patients bearing tumors or after removal of the primary tumors.

The research, published in Science Translational Medicine, focused on two specific types of melanoma and breast cancer, but this approach could work in other tissue types. Also, the research showed that this localized delivery of combination therapy significantly inhibited the recurrence of cancer after the primary tumor was surgically removed.

“We’ve created a simple method to use chemotherapy while leveraging the biology of the tumor and our natural defense against foreign invaders to beat back tumor development with limited side effects,” says senior author Zhen Gu, associate professor in the joint University of North Carolina/North Carolina State University biomedical engineering department. “We have a lot more work to do before human clinical trials, but we think this approach holds great promise.”

How immunotherapy works

In our bodies right now, there are normal cells mutating from their typical form and function. Thankfully, as our immune system lets normal cells move along and perform important biological functions, mutated cells are recognized and destroyed. Unfortunately, though, these cells can hijack the system designed to dispatch them. If that happens, these cancerous cells become virtually undetectable, free to multiply unabated, and able to form tumors.

Immunotherapy tries to reset our immune response to recognize these hijacker cancer cells. For example, immune checkpoint blockade (ICB) therapies block the inhibitory “checkpoint” signals – such as the programmed cell death (PD-1/PD-L1) pathway – that tumors exploit to switch off immune attack, freeing immune cells to kill the cancer cells. This kind of therapy has shown incredible potential to treat various forms of cancer, such as melanoma, kidney cancer, head and neck cancer, bladder cancer, and non-small cell lung cancer.

But there can be troublesome side effects, including kick-starting the immune system to attack healthy tissue. And often this immunotherapy does not work because many tumors lack the specific characteristics needed in order for the immunotherapy to recognize and attack the cancer cells as enemies. These sorts of tumors are called low-immunogenic tumors.

Doctors have achieved better results with immunotherapy if they attack the tumors with chemotherapy first. But still, this approach is not sufficient for patients with low immunogenic tumors. Scientists, therefore, have been engineering various methods to make immunotherapy more effective. For example, scientists are utilizing delivery systems to transport drugs and immunotherapy directly to the tumor site to enhance treatment efficacy and decrease toxicity in other parts of the body.

A gel with two parts

To this end, the researchers developed what they call a bioresponsive scaffold system. Essentially, it’s a hydrogel—a polymeric network that can be loaded with therapeutics.

“The trick is that the gel can be formed quickly inside the body once a biocompatible polymer and its crosslinker are mixed together,” says co-lead author Jinqiang Wang, a postdoctoral researcher in the Gu lab. “We made sure that one of these agents can be cleaved apart by reactive oxygen species, or ROS—a natural chemical byproduct of cell metabolism.” In the context of cancer, a high level of ROS is a major player in tumor development and growth.

Researchers loaded the hydrogel scaffold with a chemotherapeutic, gemcitabine, and an immunotherapeutic agent – an anti-PD-L1 blocking antibody. When injected into the tumor, the gel promotes the kinds of tumor characteristics that immunotherapies can identify. Then, in response to the highly abundant ROS in the tumor, the scaffold gradually breaks down, releasing gemcitabine first, and then anti-PD-L1.

“The cytotoxic chemotherapy can first kill some cancer cells and enhance the sensitivity of the tumor toward ICB therapy, which then stimulates the effectiveness of the ICB therapy,” says coauthor Gianpietro Dotti, professor of microbiology and immunology at the UNC School of Medicine and member of the UNC Lineberger Comprehensive Cancer Center. “With degradation of the gel, the ROS level in the tumor site can be reduced, which also helps inhibit tumor growth.”

The scientists tested this therapeutic gel-mediated approach against two cancers—B16F10 melanoma and 4T1 breast cancer, the latter being low immunogenic. The method was effective at making the tumor microenvironments susceptible to treatment. And when the payload was released, tumors decreased significantly. The researchers then conducted experiments to have the hydrogel scaffold form at the surgical site after removal of primary tumors. They witnessed a remarkable inhibition of cancer recurrence.

“Regarding the potential of this approach, scientists should further investigate the biocompatibility of using the gel scaffold for clinical benefit,” Gu says. “Meanwhile, we will optimize the dosages of combination drugs as well as treatment frequencies.”

The Alfred P. Sloan Foundation, the NIH Clinical and Translational Science Awards, the North Carolina Translational and Clinical Sciences (NC TraCS) Institute, and the UNC Lineberger Comprehensive Cancer Center supported the work.

Source: futurity.org

The macaques who care about dental hygiene

When a Nicobar long-tailed macaque feels hungry, it looks for the most easily available food in its vicinity – a coconut – and handles it dexterously to get at the white meat inside. These macaques de-husk the coconut using their hands and teeth. When the inner brown shell is sufficiently exposed, they pound it on a hard surface until it cracks open, then take out the white meat. Once hunger is taken care of, the macaques don’t rest. After all, there is a pressing need for oral hygiene. These long-tailed macaques from Nicobar floss their teeth after eating, using a twig, a thin metal wire, a piece of coir or even a sharp grass blade as dental floss.

Tooth flossing in macaques is rare. Only two other types of macaques across the globe, the Japanese macaques and the Thai long-tailed macaques, are known to do it. Indian macaques have never been seen doing it before.

It is the same story with the macaque’s ability to pry open a coconut. Very few macaques, such as some populations of rhesus macaques and some Balinese macaques, are known to eat coconut. Even pigtail macaques (also called coconut harvesters), who at the bidding of their owners climb trees to pluck coconuts and hand them over, are never inspired to crack a nut open and eat it.

This foraging-related behaviour is demonstrative of “a high level of sensorimotor skills and intelligence,” said Honnavalli N. Kumara, one of the scientists involved in studying the foraging techniques employed by the Nicobar long-tailed (NLT) macaques. Kumara and his colleagues studied these macaques for three years to arrive at their conclusions. Four PhD students tailed a group of macaques in Campbell Bay (a village in Nicobar) every day for the entire three-year period. They have published their findings in the journal Primates.

The NLT macaque is endemic to three islands in the Andaman and Nicobar Archipelago – Great Nicobar, Little Nicobar and Katchal. These macaques have not been studied much in the past. However, a population survey on macaques, done right after the 2004 Asian tsunami, had shown a sharp decline in their numbers. A substantial chunk of the coastal habitat occupied by these macaques was lost in the tsunami. The scientists wanted to know how the species had fared in the subsequent years. Much to their delight, they found that the population had bounced back.

“In the aftermath of the tsunami the macaques were forced to move inland and adapt,” said Kumara. While some groups moved deeper into the forests, others started living on the periphery of villages and human settlements. One such group, living in proximity to humans, was the subject of the study on foraging behaviour. It comprised three adult males, eight adult females, a couple of adolescent males, and some infants and juveniles.

The scientists found that the NLT macaques were ‘generalist’ feeders – they ate a variety of food. They also engaged in a range of complex behaviour to make their food easily accessible and palatable. For example, if some food was covered with thorns, hair or mud, they would wipe it clean with a leaf, a cloth or a polythene sheet before consuming it. They were also seen cleaning their food by rubbing it between their palms or sometimes by washing the food item in a water hole to get rid of the muck.

When interested in eating insects hiding under vegetation, the macaques were seen to vigorously shake the vegetation to flush out the insects, making them easier to catch. The scientists call this vigorous shaking of plants ‘beating the bush.’ The fact that the NLT macaques engage in something like beating the bush indicates that they have a “conceptual understanding of the hiding places of the insects,” the paper noted.

In the repertoire of foraging techniques displayed by NLT macaques, “beating the bush to obtain insects appears to be novel to the Nicobar macaques, which is fascinating,” said Andie Ang, a primatologist and the vice-president of the Jane Goodall Institute – Singapore. Ang was not associated with this study.

These foraging techniques have allowed the NLT macaques to survive despite the major habitat alteration of the 2004 tsunami. In fact, in the years since the tsunami, the macaques’ dependence on coconuts has increased. This is causing coconut farmers much distress, as they lose a substantial part of their harvest to the macaques. The farmers are now raising an alarm, and “have even complained to forest department officials, asking for compensation,” said Kumara.

However, he is hopeful that the study will not only improve our understanding of this highly intelligent species of macaque but also help address management aspects, such as conflicts with farmers, more effectively.

This article first appeared on Mongabay India

Source: indiabioscience.org

Five things to consider before ordering an online DNA test

This article was originally published on The Conversation. Read the original article.

You might be intrigued by what your genes could tell you about your ancestry or the health risks hidden in your DNA. If so, you’re not alone.

Fascination with personal genetics is fuelling an explosion of online DNA testing. More than 12 million people have been tested – 7 million through ancestry.com alone. Amazon reported the 23andMe online DNA test kit as one of its top five best-selling items on Black Friday in 2017.

But while online genetic testing can be interesting and fun, it has risks. Here are five things to keep in mind if you’re considering spitting in a tube.

1. Understand the limits of what’s possible

Consider the evidence behind the claims a DNA testing company makes. Some companies list the science that backs up their claims, but many don’t.

DNA testing can be used to tell your ancestry and family relatedness quite accurately, but companies claiming to predict wine preferences or children’s soccer prowess from DNA are in the realm of fantasy.

There is also a lack of regulation on this issue to protect consumers.

Genetic testing products like 23andMe are exploding in popularity. From shutterstock.com

2. Make sure you’re prepared for the information

Genetics can tell us many things, some of which we may not be prepared for. You may go in looking for information on your ancestry, but could find out about unexpected paternity. Or you might discover you’re at risk of certain diseases. Some of these, like Alzheimer’s disease, have no cure, and that knowledge could leave you distressed.

Some products can test for genetic changes in the BRCA genes that put you at risk of breast and ovarian cancers. Other online genetic interpretation tools can take raw data from ancestry DNA tests and, for a small payment, provide a wide range of disease risk estimates, many of which have been brought into question by the scientific community.

Think carefully about whether you really want to know all this information, and whether it’s valid, before you proceed.

3. Consider the medical follow-up you might need

If something serious is discovered in your genes, you might need the results to be professionally interpreted, or to have genetic counselling to come to terms with what you’ve learnt.

Some genetic information can be complex and difficult to interpret, and have medical implications for you and your family. Relying on the internet for interpretation is not advised.

Does the DNA testing company offer any counselling or medical services? If not, are you hoping your GP or genetics clinic will provide this? You might find GPs are not adequately trained to understand DNA results, and public genetics services have very long waiting lists. This means you might be left on tenterhooks with a potentially distressing result.

Before you spit into a tube, be prepared for what you might discover. From shutterstock.com

4. Think how the results may affect your insurance

In Australia, private health insurance can’t be influenced by genetic test results. But life insurance companies can use genetic test results to discriminate against applicants, with little consumer protection. All genetic test results known to an applicant at the time of a life insurance application must be disclosed if requested, including internet-based test results.

Once you have a result that indicates increased risk of disease, the life insurance company may use this against you (by increasing premiums, for instance), even if the scientific evidence isn’t solid. This applies to life, income protection, disability and even travel insurance.

5. Consider who will have access to your DNA and data

Some online genetic testing companies don’t comply with international guidelines on privacy, confidentiality and use of genetic data. Many online testing companies retain DNA samples indefinitely. Consumers can request samples be destroyed, but sometimes have difficulties with this.

Some online testing companies have been accused of selling access to databases of genetic information to third parties, potentially without the knowledge of donors. You might have to plough through the fine print to find out what you have consented to.

In many ways, it is wonderful we now have access to our personal DNA code. However, as always, understanding the limitations and risks of fast-moving medical technology is very important.

Jane Tiller, Ethical, Legal & Social Adviser – Public Health Genomics, Monash University and Paul Lacaze, Head, Public Health Genomics Program, Monash University


DNA testing has its risks, including that you don’t know who will own your genetic data. Photo by Markus Spiske on Unsplash

Source: sciblogs.co.nz

Automatically-Triggered Brain Stimulation during Encoding Improves Verbal Recall

Fig. 4 (modified from Ezzyat et al., 2018). Stimulation targets showing numerical increase/decrease in free recall performance are shown in red/blue. Memory-enhancing sites clustered in the middle portion of the left middle temporal gyrus.

Everyone forgets. As we grow older, or have a brain injury or a stroke, or develop a neurodegenerative disease, we forget much more often. Is there a technological intervention that can help us remember? That is the $50 million question funded by DARPA’s Restoring Active Memory (RAM) Program, which has focused on intracranial electrodes implanted in epilepsy patients to monitor seizure activity.

Led by Michael Kahana’s group at the University of Pennsylvania and including nine other universities, agencies, and companies, this Big Science project is trying to establish a “closed-loop” system that records brain activity and stimulates appropriate regions when a state indicative of poor memory function is detected (Ezzyat et al., 2018).

Initial “open-loop” efforts targeting medial temporal lobe memory structures (entorhinal cortex, hippocampus) were unsuccessful (Jacobs et al., 2016). In fact, direct electrical stimulation of these regions during encoding of spatial and verbal information actually impaired memory performance, unlike an initial smaller study (Suthana et al., 2012).1

{See Bad news for DARPA’s RAM program: Electrical Stimulation of Entorhinal Region Impairs Memory}

However, during the recent CNS symposium on Memory Modulation via Direct Brain Stimulation in Humans, Dr. Suthana suggested that “Stimulation of entorhinal white matter and not nearby gray matter was effective in improving hippocampal-dependent memory…” 2

{see this ScienceNews story}

Enter the Lateral Temporal Cortex

Meanwhile, the Penn group and their collaborators moved to a different target region, which was also discussed in the CNS 2018 symposium: “Closed-loop stimulation of temporal cortex rescues functional networks and improves memory” (based on Ezzyat et al., 2018).

Fig. 4 (modified from Ezzyat et al., 2018). Horizontal section. Stimulation targets showing numerical increase/decrease in free recall performance are shown in red/blue. Memory-enhancing sites clustered in the middle portion of the left middle temporal gyrus.

Twenty-five patients performed a memory task in which they were shown a list of 12 nouns, followed by a distractor task, and finally a free recall phase, where they were asked to remember as many of the words as they could. The participants went through a total of 25 rounds of this study-test procedure.

The first three rounds were “record-only” sessions, during which the investigators trained a classifier: a pattern of brain activity that could predict whether or not the patient would recall the word at better than chance (AUC = 0.61, where chance = 0.50).3 The classifier relied on activity across all electrodes that were placed in an individual patient.
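As a rough illustration of this record-then-classify step (a sketch, not the authors’ actual pipeline), one can train a penalized logistic regression over per-electrode feature vectors on the record-only blocks and score it by AUC on held-out words. The data below are synthetic; extracting spectral features from real iEEG is assumed to have happened upstream.

```python
# Sketch of a recall classifier of the kind described: logistic regression
# over features pooled across all of a patient's electrodes, scored by AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_words, n_features = 300, 40          # words x (electrodes * frequency bands)
X = rng.normal(size=(n_words, n_features))
w_true = rng.normal(size=n_features)   # synthetic "encoding state" signal
p = 1 / (1 + np.exp(-(X @ w_true * 0.3)))
y = rng.binomial(1, p)                 # 1 = later recalled, 0 = forgotten

# Fit on the "record-only" words, then evaluate on held-out words;
# chance-level AUC is 0.50.
clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
clf.fit(X[:200], y[:200])
auc = roc_auc_score(y[200:], clf.predict_proba(X[200:])[:, 1])
print(f"held-out AUC = {auc:.2f}")
```

An AUC around 0.61, as reported, is well above chance but far from a clinically reliable single-trial predictor, which is the point footnote 3 makes.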

Memory blocks #4-25 alternated between Stimulation (Stim) and No Stimulation (NoStim) lists. In Stim blocks, 0.5-2.25 mA stimulation was delivered for 500 ms whenever the classifier predicted a recall probability below 0.5 during word presentation. In NoStim lists, stimulation was not delivered on analogous trials, and the comparison between those two conditions comprised the main contrast shown below.
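The triggering rule itself is simple to state. A minimal sketch with illustrative names (the real system operates on streaming iEEG in real time):

```python
# Closed-loop decision rule as described above: stimulate for 500 ms only
# when the classifier's predicted recall probability falls below 0.5,
# and only on Stim lists. Function and parameter names are illustrative.
def closed_loop_decision(p_recall: float, stim_list: bool,
                         threshold: float = 0.5) -> bool:
    """Return True if stimulation should be triggered for this word."""
    return stim_list and p_recall < threshold

# A predicted-poor-encoding word triggers stimulation on a Stim list...
assert closed_loop_decision(0.3, stim_list=True) is True
# ...the analogous word on a NoStim list is logged but not stimulated,
# providing the matched comparison trial.
assert closed_loop_decision(0.3, stim_list=False) is False
# Words predicted to be well encoded are never stimulated.
assert closed_loop_decision(0.7, stim_list=True) is False
```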

Fig. 3a (modified from Ezzyat et al., 2018). Stimulation delivered to lateral temporal cortex targets increased the probability of recall compared to matched unstimulated words in the same subject (P < 0.05) and stimulation delivered to Non-lateral temporal targets in an independent group (P < 0.01).

The authors found that lateral temporal cortex stimulation increased the relative probability of item recall by 15% (using a log-binomial model to estimate the relative change in recall probability). {But if you want to see all of the data, peruse the Appendix below. Overall recall isn’t that great…}
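For intuition about what a “15% relative increase” means: with a single binary predictor (Stim vs NoStim), a log-binomial model’s exponentiated coefficient reduces to a simple recall-probability ratio. The counts below are illustrative, not the paper’s data.

```python
import math

# Illustrative counts (not the paper's data): recalled / presented words
recalled_stim, n_stim = 115, 500
recalled_nostim, n_nostim = 100, 500

p_stim = recalled_stim / n_stim        # 0.23
p_nostim = recalled_nostim / n_nostim  # 0.20

# With one binary covariate, the log-binomial MLE coefficient is just
# log(p_stim / p_nostim), so exp(beta) is the probability ratio.
beta = math.log(p_stim) - math.log(p_nostim)
rel_change = math.exp(beta) - 1
print(f"relative increase in recall probability: {rel_change:.0%}")
# prints "relative increase in recall probability: 15%"
```

Note that a 15% relative increase on a modest baseline (here, 0.20 → 0.23) is a small absolute gain, which is consistent with the Appendix caveat above.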

Lateral temporal cortex (n=18) meant MTG, STG, and IFG (mostly on the left). Non-lateral temporal cortex (n=11) meant elsewhere (see Appendix below). The improvements were greatest with stimulation in the middle portion of the left middle temporal gyrus. There are many reasons for poor encoding, and one could be that subjects were not paying enough attention; the authors didn’t have the electrode coverage to test that explicitly. This leads me to believe that electrical stimulation was enhancing the semantic encoding of the words. The MTG is thought to be critical for semantic representations and language comprehension in general (Turken & Dronkers, 2011).

Thus, my interpretation of the results is that stimulation may have boosted semantic encoding of the words, given the nature of the stimuli (words, obviously), the left lateralization with a focus in MTG, and the lack of an encoding task. The verbal memory literature clearly demonstrates that when subjects have a deep semantic encoding task (e.g., living/non-living decision), compared to shallow orthographic (are there letters that extend above/below?) or phonological tasks, recall and recognition are improved. Which led me to ask some questions, and one of the authors kindly replied (Dan Rizzuto, personal communication). 4

  1. Did you ever have conditions that contrasted different encoding tasks? Here I meant to ask about semantic vs orthographic encoding (because the instructions were always to “remember the words” with no specific encoding task).
    • We studied three verbal learning tasks (uncategorized free recall, categorized free recall, paired associates learning) and one spatial navigation task during the DARPA RAM project. We were able to successfully decode recalled / non-recalled words using the same classifier across the three different verbal memory tasks, but we never got sufficient paired associates data to determine whether we could reliably increase memory performance on this task.
  2. Did you ever test nonverbal stimuli (not nameable pictures, which have a verbal code), but visual-spatial stimuli? Here I was trying to assess the lexical-semantic nature of the effect.
    • With regard to the spatial navigation task, we did observe a few individual patients with LTC stimulation-related enhancement, but we haven’t yet replicated the effect across the population.

Although this method may have therapeutic implications in the future, at present it is too impractical, and the gains were quite small. Nonetheless, it is an accomplished piece of work to demonstrate closed-loop memory enhancement in humans.

Footnotes

1 Since that time, however, the UCLA group has reported that theta-burst microstimulation of….

….the right entorhinal area during learning significantly improved subsequent memory specificity for novel portraits; participants were able both to recognize previously-viewed photos and reject similar lures. These results suggest that microstimulation with physiologic level currents—a radical departure from commonly used deep brain stimulation protocols—is sufficient to modulate human behavior and provides an avenue for refined interrogation of the circuits involved in human memory.

2 Unfortunately, I was running between two sessions and missed that particular talk.

3 This level of prediction is more like a proof of concept and would not be clinically acceptable at this point.

4 Thanks also to Youssef Ezzyat and Cory Inman, whom I met at the symposium.

References

Ezzyat Y, Wanda PA, Levy DF, Kadel A, Aka A, Pedisich I, Sperling MR, Sharan AD, Lega BC, Burks A, Gross RE, Inman CS, Jobst BC, Gorenstein MA, Davis KA, Worrell GA, Kucewicz MT, Stein JM, Gorniak R, Das SR, Rizzuto DS, Kahana MJ. (2018). Closed-loop stimulation of temporal cortex rescues functional networks and improves memory. Nat Commun. 9(1): 365.

Jacobs, J., Miller, J., Lee, S., Coffey, T., Watrous, A., Sperling, M., Sharan, A., Worrell, G., Berry, B., Lega, B., Jobst, B., Davis, K., Gross, R., Sheth, S., Ezzyat, Y., Das, S., Stein, J., Gorniak, R., Kahana, M., & Rizzuto, D. (2016). Direct Electrical Stimulation of the Human Entorhinal Region and Hippocampus Impairs Memory. Neuron 92(5): 983-990.

Suthana, N., Haneef, Z., Stern, J., Mukamel, R., Behnke, E., Knowlton, B., & Fried, I. (2012). Memory Enhancement and Deep-Brain Stimulation of the Entorhinal Area. New England Journal of Medicine 366(6): 502-510.

Titiz AS, Hill MRH, Mankin EA, M Aghajan Z, Eliashiv D, Tchemodanov N, Maoz U, Stern J, Tran ME, Schuette P, Behnke E, Suthana NA, Fried I. (2017). Theta-burst microstimulation in the human entorhinal area improves memory specificity. eLife Oct 24;6.

Turken AU, Dronkers NF. (2011). The neural architecture of the language comprehension network: converging evidence from lesion and connectivity analyses. Front Syst Neurosci. Feb 10;5:1.

Appendix (modified from Supplementary Table 1)  


In the table above, Stim and NoStim recall percentages are for ALL words in the blocks. But:

  • Only half of the words in each Stim list were stimulated, so this comparison is conservative. The numbers improve slightly if you compare just the stimulated words with the matched non-stimulated words. Not all subjects exhibited a significant within-subject effect, but the effect is reliable across the population (Figure 3a).

Source: neurocritic.blogspot.in