What Is the Main Idea?
Telomere biology disorders (TBDs) are inherited conditions that affect how parts of chromosomes called telomeres are maintained. In the open-access review article “Inherited Telomere Biology Disorders: Pathophysiology, Clinical Presentation, Diagnostics, and Treatment”, published in the journal Transfusion Medicine and Hemotherapy, the authors review our current understanding of TBDs and issues that patients living with them can face.
What Else Can You Learn?
In this blog post, the roles of telomeres in cellular aging and maintaining the integrity of chromosomes are described. The varying ways that TBDs can present are also discussed.
Take-Home Message
TBDs can particularly affect the bone marrow, liver, and lungs. If there is a chance that a patient may have a TBD, they should be screened because they will need individual and specific treatment approaches.
What Are Telomeres?
Telomeres are found on the ends of chromosomes. Our chromosomes consist of DNA wound tightly around spool-like proteins called histones. DNA molecules in cells are very long, and packaging them in the form of chromosomes means that they can fit inside the nuclei of cells. Coiling (also known as “condensing”) the DNA in this way also helps to maintain its integrity because it is less likely to get tangled up.
Human chromosomes look X-shaped when they are fully coiled up, and the ends need to be protected from becoming damaged or fusing with other loose DNA ends (such as the ends of other chromosomes). Telomeres act as protective “caps” on chromosome ends, similar to how the bit of plastic at the end of a shoelace helps to stop it from unraveling.
How Do Telomeres Function?
Telomeres are made up of repeats of a six-base sequence, TTAGGG in humans (“bases” are part of the building blocks that link together to form a strand of DNA). They prevent the otherwise free ends of chromosomes from getting tangled and fusing together. By doing this, they ensure that chromosomes are stable and are protected from being degraded.
Every time a cell divides to make two new cells, the telomeres at the ends of the chromosomes within it get shorter. This means that your telomeres shorten as you get older. Once the telomeres become shortened to a particular point, the cell stops replicating. It either becomes inactive (“senescent”) or is broken down by a process called apoptosis. This limit on cell division, a form of “cellular aging”, in turn contributes to the aging and lifespan of the organism as a whole. The limit is important because if “old” cells continue to divide, the short telomeres can destabilize the DNA, potentially causing problems such as an extra chromosome, loss of a chromosome, or breaks in the chromosomes’ DNA. These can cause a variety of conditions, including cancer.
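The “countdown” described above can be sketched in a few lines of code. This is a minimal illustration only: the starting length, loss per division, and senescence threshold below are made-up, order-of-magnitude placeholders, not values taken from the review article.

```python
# A minimal sketch of telomere shortening; all numbers are illustrative
# placeholders, not values from the review article.
TELOMERE_START_BP = 10_000       # starting telomere length, in base pairs
LOSS_PER_DIVISION_BP = 70        # bases lost with each cell division
SENESCENCE_THRESHOLD_BP = 4_000  # below this length, the cell stops dividing

def divisions_until_senescence(start_bp: int = TELOMERE_START_BP) -> int:
    """Count how many divisions occur before the telomere hits the threshold."""
    length, divisions = start_bp, 0
    while length > SENESCENCE_THRESHOLD_BP:
        length -= LOSS_PER_DIVISION_BP
        divisions += 1
    return divisions

print(divisions_until_senescence())  # 86 divisions with these example numbers
```

In a TBD, the starting length is shorter or the per-division loss is faster, so the threshold is reached much sooner – one way to picture why rapidly dividing tissues are affected first.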
What Are TBDs?
TBDs are a group of rare inherited conditions that affect how telomeres are maintained. People with this type of condition develop, or are born with, abnormally short telomeres because the telomeres shorten prematurely or at a faster rate than normal. In many cases, the fact that someone has a TBD becomes obvious during their childhood, although it may only start to cause serious problems when they reach adulthood. Children with a TBD can develop bone marrow failure (where the bone marrow does not make enough of the different types of blood cells), mouth and skin changes, and abnormal nails.
Common conditions that indicate that an adult has a TBD include:
- bone marrow failure,
- interstitial lung disease (this term is used to describe a group of conditions that cause scarring of the lungs),
- cirrhosis (scarring of the liver as a result of long-term liver damage) and liver failure, and
- myelodysplastic syndrome (a rare type of blood cancer that affects the bone marrow and the blood cells it produces).
These parts of the body can be particularly affected by TBDs because the cells in them replicate rapidly as part of their normal functioning.
What Treatment Issues Do People with TBDs Have?
As well as the conditions outlined above, adults with TBDs are also at increased risk of developing a range of other health problems, including:
- skin cancers and head and neck tumors,
- hardening of the arteries (blood vessels that transport blood from the heart to other areas of the body),
- complications during pregnancy, and
- kidney, dental and/or joint problems.
It is important that patients with these issues are identified correctly as having a TBD if they have one, because they may not respond to a standard treatment in a way that would be expected. In some cases, there may be no response to the treatment at all.
Another issue is that patients who undergo allogeneic stem cell transplantation (allo-SCT) have been reported to have worse outcomes if they have a TBD. Allo-SCT is a transplant that replaces bone marrow that is not producing enough healthy blood cells with healthy stem cells (cells that self-renew and can develop into many different cell types) from a donor. It is particularly important that donors for such patients are carefully selected and that the normal treatment regimen is adapted for them. Special checkups are also recommended for patients with TBDs who undergo allo-SCT.
What Do the Authors Conclude?
The authors of the review article stress the importance of healthcare professionals – particularly those who specialize in the treatment of conditions that affect the blood, liver, and lungs – being aware of the different ways that TBDs can present. TBDs are considered to be underdiagnosed, particularly when they manifest in adulthood.
Screening of patients in whom a TBD is suspected is essential because people with TBDs need individual and specific treatment approaches. In addition, they are at increased risk of complications that can rapidly become life-threatening and of developing cancer. Adequate counseling of patients and their families is also important because the diagnosis of one person in a family having a TBD has implications for their relatives.
Note: One of the authors of this paper makes a declaration about receiving scientific support. It is normal for authors to declare this in case it might be perceived as a conflict of interest. For more detail, see the Conflict of Interest Statement at the end of the paper.
What Is the Main Idea?
In addition to opening up new treatment options for people with neurological conditions, advances in wearable sensor and electrode technology have enabled the development of a new generation of neuromodulation devices. These may allow people who do not have neurological conditions to enhance their brain function. In the open-access review article “Electrical and Magnetic Neuromodulation Technologies and Brain-Computer Interfaces: Ethical Considerations for Enhancement of Brain Function in Healthy People – A Systematic Scoping Review”, published in the journal Stereotactic and Functional Neurosurgery, the authors investigated ethical research published to date on the use of neuromodulation by people who are considered to be healthy.
What Else Can You Learn?
In this blog post, different neuromodulation techniques are described. How nerves transmit signals around the body, and the differences between systematic and scoping reviews, are also discussed.
Take-Home Message
Rapid advances in neuromodulation technology and direct-to-consumer marketing mean that neuromodulation may be readily available in some regions. However, a lack of information about its long-term effects means that the risks of harm may outweigh any potential benefits, and there are ethical concerns about its use.
What Is Neuromodulation?
Neuromodulation uses chemicals, a magnetic field, or an electric current to alter the activities of nerve cells in specific areas of the body. Nerves send electrical signals that control our senses, like pain and touch, and essential processes – such as breathing, digestion, and movement – from one part of the body to another.
When an electrical signal reaches the end of a nerve it is converted into a chemical signal. This causes molecules called neurotransmitters, such as dopamine and epinephrine (also known as adrenaline), to be released. By stimulating nerve cells, neuromodulation influences the release of these neurotransmitters. As an extension of this, brain–computer interfaces can be used to connect the brain’s electrical activity to support devices, such as wearable sensors.
What Conditions Are Treated with Neuromodulation?
Although the exact mechanisms of neuromodulation are not known, a range of treatments are already in use or in development. There is evidence that neuromodulation may be useful in treating epilepsy that cannot be managed with medication alone, as well as conditions that cause chronic head pain.
Transcranial magnetic stimulation works non-invasively. An electromagnetic coil is placed against the scalp and delivers magnetic pulses to stimulate nerve cells in the brain. It is used to treat conditions such as major depressive disorder, migraines, and addiction.
Transcranial electrical stimulation is also non-invasive, and involves a low electric current being passed between two or more electrodes on the scalp to alter the electrical excitability of nerve cells in specific areas of the brain. There is increasing evidence that it is useful in the treatment of depression and schizophrenia.
Deep brain stimulation is an example of an invasive approach that involves surgery. Small holes are made in the skull, through which electrodes are inserted and positioned in deep brain structures. It can be used to treat movement-related symptoms of Parkinson’s disease and other neurological conditions, such as tremor.
How Does Neuromodulation Aim to Enhance Brain Function?
In addition to its potential as a treatment for medical conditions, there has been increasing interest in the possibility of using neuromodulation to enhance mental functioning beyond what is necessary for health in people who are well. Such enhancements include improved cognition (mental processing), motor functions (to improve physical performance) and mood (to improve the stability of a person’s moods).
What Did This Study Investigate?
Although neuromodulation technologies offer exciting possibilities, the speed at which they have come onto the market for healthy individuals, the likelihood of them being widely used, and direct-to-consumer marketing have raised concerns about access, the potential for misuse, and whether or not current regulatory frameworks are adequate to protect users.
The authors of this study conducted a scoping review to map studies published to date that discuss ethical challenges relating to the use of neuromodulation technologies and brain–computer interfaces to enhance brain function in healthy individuals, with the aim of guiding future research. Of the 159 articles initially identified for evaluation, 23 were included in the final analysis.
What Is a Scoping Review?
Scoping reviews have some similarities with systematic reviews: both seek to comprehensively review and analyze all of the available research literature that has been published about a research question. However, their aims differ. A systematic review aims to produce a statement that can guide decision-making (for example, about the best type of treatment for a particular condition) on the basis of critical analysis of studies that have been assessed for risk of bias. A scoping review instead presents a descriptive overview of the literature, with the aim of informing policies and helping to identify priority areas for future research.
What Were the Findings of This Study?
The authors found that studies looking at this area of research were lacking. The most common ethical concerns were linked to a lack of data about how safe and effective the new technologies are, as well as what constitutes an acceptable level of risk for a person undergoing a procedure. There were also concerns relating to:
- potential loss of privacy, which could lead to discrimination (for example, in the workplace or in access to health insurance coverage if information has been recorded about psychological states and mental health risks),
- the possibility that procedures could fundamentally change people (for example, their sociability or their character/personality), and not necessarily in a desirable way – some researchers even raised concerns that neuromodulation technologies could be used for military purposes (for example, by causing people to become immune to traumatic harm),
- potential socioeconomic consequences, in that existing inequalities could be worsened or new ones introduced, resulting in some individuals being disadvantaged (for example, high-cost technologies would likely only be available to those who can afford them), and
- the risk of people feeling societal pressure to have procedures, and implications relating to decision-making for children and their well-being.
What Did the Authors of the Study Conclude?
The authors concluded that the ethical debate about the neuroenhancement of healthy individuals is still developing and that there is a critical shortage of ethical research on these approaches. They also noted that the risks of invasive approaches may currently outweigh the potential benefits, and that there was very little information about the long-term effects of neuromodulation enhancement technologies or their effects on children, in whom the brain is still developing.
The authors expressed the view that there is an urgent need for the integration of ethical considerations into neuroscientific research to address significant gaps in knowledge and ensure equitable outcomes and access. In addition to pursuing the enhancement of human capabilities, it is critical that the development of neuromodulation technologies safeguards individual well-being and autonomy, and actively works against increasing social inequalities that already exist.
What Is the Main Idea?
Alopecia areata is an autoimmune condition that causes sudden hair loss. In the brief research report “Alopecia Areata Is Associated with Posttraumatic Stress Disorder and Alcohol Use in a Case-Control Study of 4,785 Patients”, published in the journal Skin Appendage Disorders, the authors investigated whether there are any associations of alopecia areata with lifestyle factors and mental health disorders.
What Else Can You Learn?
In this blog post, alopecia areata and autoimmune conditions are described. Different types of observational research studies are also discussed.
Take-Home Message
If you have alopecia areata, it is important that you discuss any social or emotional impacts with a health care provider so that you can access support.
What Is Alopecia Areata?
Alopecia is the medical term for hair loss. Although hair loss can run in families (for example, male and female pattern baldness), some types of hair loss have other causes. The term “alopecia areata” specifically describes hair loss that is caused by an autoimmune condition.
What Causes Alopecia Areata?
Autoimmune conditions are caused by the immune system mistakenly recognizing the body’s own tissue as foreign and attacking it. This results in inflammation (the process by which your body responds to an injury or a perceived threat, such as a bacterial infection). Although it is not fully understood how or why, alopecia areata involves the immune system mistakenly attacking hair follicles, which stops them from growing hair. Because alopecia areata does not cause scarring to the scalp, the hair may begin to regrow over time, but it is not guaranteed and there is a risk of more hair loss in the future.
Alopecia areata can occur at any age and causes sudden hair loss that can develop anywhere, often in the form of small, round, coin-sized bald patches. Some people experience a burning, itching, or tingling sensation in the skin underneath the hair before it is lost. Compared with the rest of the population, people with alopecia areata are slightly more likely to develop or have another autoimmune condition, such as type 1 diabetes and vitiligo. There also appears to be a link with allergic conditions such as asthma and eczema.
What Did This Study Investigate?
Although alopecia areata is known to be an autoimmune condition, the causes of the immune system starting to attack the hair follicles are not well understood. There are no known links with a poor diet or vitamin deficiencies, and although there is some evidence that stress can trigger alopecia areata, many people with the condition do not report significant stress, so this may be coincidental. However, the development of the condition can be very upsetting, particularly if a person is unable to disguise their hair loss with their hairstyle, and some studies have reported that alopecia areata is associated with an increased risk of anxiety and/or depression. The authors conducted a nested case–control study to look for evidence of associations of alopecia areata with lifestyle factors and disorders that affect people’s mental health.
What Is a Nested Case–Control Study?
Different types of studies can be used to investigate factors that cause conditions. A nested case–control study is a type of observational study, so called because the people conducting the study observe what happens to a group of people over time without any manipulation or intervention. “Prospective” observational studies involve researchers following a group of people (a cohort) over a long period of time, watching for outcomes and then looking for patterns that suggest factors that may contribute to or prevent them.
In contrast, “retrospective” observational studies look backwards from an outcome and review information about the participants’ pasts to try to identify associations. Case–control studies are usually retrospective. They compare a group of people with the condition that is being investigated (cases) with a second, similar group of people who do not have it (controls). If a case–control study is “nested”, the cases and controls are drawn from the same larger cohort, and several healthy control participants are selected for each case.
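As a toy illustration of the matching step (the study described below matched each case to four controls by age, sex at birth, and self-reported ethnicity), control selection might be sketched as follows. The field names and data layout are hypothetical, not taken from the All of Us program.

```python
# A toy sketch of 1:4 matched control selection; field names are hypothetical.
import random

def select_matched_controls(case: dict, pool: list, n_controls: int = 4) -> list:
    """Pick controls that match the case on age group, sex at birth, and ethnicity."""
    candidates = [p for p in pool
                  if not p["has_condition"]
                  and p["age_group"] == case["age_group"]
                  and p["sex_at_birth"] == case["sex_at_birth"]
                  and p["ethnicity"] == case["ethnicity"]]
    return random.sample(candidates, n_controls)
```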
What Did the Study Show?
The authors used information collected by the All of Us research program, a diverse health information database that anyone in the USA can join, to analyze the health data of 957 adults with a diagnosis of alopecia areata matched 1:4 (1 case to 4 controls) with 3,828 controls by age, sex at birth, and self-reported ethnicity. The results showed that when compared with controls, people with alopecia areata were more likely to have depression and anxiety. There were also associations with post-traumatic stress disorder, obesity, and increased alcohol use, although people with alopecia areata were less likely to smoke.
The authors suggest that the association with post-traumatic stress disorder supports the idea that stress plays a role in the dysregulation of the immune system that leads to alopecia areata. They also note that the increased likelihood of obesity in people with alopecia areata may be related to the effects of mental health conditions, which may result in people being less physically active. The increased alcohol use may similarly be influenced by people having mental health disorders, in addition to the distress that some people experience when alopecia areata develops.
On the basis of these results, the authors suggest that people with alopecia areata should be referred to patient support groups by their dermatologist. They also recommend that patients are screened for anxiety, depression, post-traumatic stress disorder, alcohol use disorder and other health issues, with referral to specialist health care practitioners as necessary.
What Is the Main Idea?
High blood pressure in the portal vein, known as portal hypertension, can cause serious complications and early detection is a priority. In the open-access review article “Non-Invasive versus Invasive Assessment of Portal Hypertension in Chronic Liver Disease”, published in the journal GE – Portuguese Journal of Gastroenterology, the authors review research published to date regarding different methods used to assess and measure portal hypertension in patients with liver cirrhosis.
What Else Can You Learn?
In this blog post, different methods that are used to detect and assess portal hypertension are discussed. The role of the liver and different stages in the progression of chronic liver disease are also described.
Take-Home Message
Although hepatic vein catheterization is currently considered the most effective way to assess portal hypertension, research is being conducted to develop non-invasive methods with the aim of detecting it earlier and slowing the progression of chronic liver disease.
What Does the Liver Do?
The liver is the largest solid organ in the body and plays many essential roles that keep it healthy. These include making a fluid called bile (which helps the body to break down fats in the food we eat), processing digested food from the intestine by breaking down proteins and carbohydrates so that the body can use them, storing vitamins and iron, and fighting infections.
The liver also cleans the blood to remove harmful substances and microbes that can cause infections. As a result, the liver holds about 13% of the body’s total volume of blood at any one time. Blood enters the liver via two major blood vessels: the hepatic artery (which supplies oxygen-rich blood to the liver) and the portal vein (which carries blood from the digestive tract and spleen to the liver). This ensures that molecules from digestion are taken up into the blood and processed or checked before they begin to circulate around the body in the bloodstream.
What Is Chronic Liver Disease?
Because the liver filters toxins from the blood, it is vulnerable to damage, and to a loss of function, if it is exposed to high levels of toxins. Although the liver is able to produce new cells and regenerate itself, its ability to regenerate becomes reduced over time if it keeps having to work too hard. Eventually, chronic liver disease can develop, in which damage to the liver progresses (gets worse) over a long period of time (at least 6 months). This damage cannot be reversed.
Chronic liver disease develops in four stages:
- The first (called “hepatitis”) means that there is inflammation in the liver tissue. In the short term, this means that the liver can deal with infections and start the healing process. However, if it continues for a long time, hyperactive healing can take place.
- This eventually causes the second stage, known as “fibrosis” (scarring). During this stage, thin bands of scar tissue build up over time, leading to the liver gradually becoming stiffer and the blood flow through it becoming reduced. Some of this damage can be reversed.
- If it continues, though, the third stage is eventually reached. This is called “cirrhosis” and is characterized by severe, permanent scarring that is no longer reversible.
- When the damage becomes so extensive that the fourth stage, “liver failure” (also known as “decompensated cirrhosis”), is reached, the liver can no longer function properly.
What Are the Symptoms of Cirrhosis?
Early on in the development of cirrhosis, signs and symptoms may not be noticeable. The first signs can include feeling generally ill, weak or tired; loss of appetite; nausea; pain in the upper abdomen (tummy); and red patches on the palms of the hands or spider-like, visible blood vessels. As cirrhosis progresses, symptoms can include jaundice (yellowing of the whites of the eyes and skin), itchy skin, swollen legs, and bleeding or bruising easily.
Although cirrhosis cannot be reversed, treatment may be able to stop it from getting worse or slow its progression. This is important because cirrhosis is associated with serious complications that can by themselves be life-threatening, including swelling of the abdomen as a result of fluid build-up (called “ascites”) and variceal bleeding (“varices” are veins that have become abnormally widened) in the digestive tract. This results in a person vomiting blood or their poo being black or bloody. Ascites and variceal bleeding are mainly caused by the development of portal hypertension.
What Is Portal Hypertension?
The term “portal hypertension” describes an increase in the blood pressure in the portal vein. It can be caused by resistance in the liver increasing as a result of cirrhosis or by a blockage, such as a blood clot. As the drainage of blood from the abdomen becomes impeded, the blood starts to try to leave the abdomen via other veins, which become more fragile as they are stretched wider.
In addition to ascites and varices, portal hypertension can cause a number of other serious complications. Because prompt treatment of portal hypertension with medication that reduces blood pressure is known to be able to slow the progression of chronic liver disease, it is important that any increases in blood pressure in the portal vein are detected as early as possible. A variety of techniques are used, or are in development, to assess whether a patient has portal hypertension.
Detecting Portal Hypertension
Hepatic Vein Catheterization
Hepatic vein catheterization is currently considered to be the most effective way to measure the difference in blood pressure between the portal and hepatic veins.
- First, a flexible tube called a “catheter” with a tiny balloon on the end is inserted into the jugular vein in the neck.
- X-ray imaging is then used to guide the catheter into the hepatic vein so that the blood pressure inside it can be measured.
- The balloon on the end of the catheter is then inflated and a second measurement of blood pressure is taken.
- The two values can then be used to calculate whether portal hypertension is present (classed as a portal venous pressure gradient greater than 5 mm Hg, and regarded as clinically significant if greater than 10 mm Hg).
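As a rough sketch of the arithmetic, the classification in the last step might be coded as below. It assumes (as is standard for this procedure, although the article is not quoted here in these terms) that the gradient is the balloon-inflated reading minus the baseline reading; the function name and example pressures are made up.

```python
# A minimal sketch of the pressure-gradient classification described above.
# Thresholds follow the definitions in the text; example values are made up.
def classify_gradient(balloon_inflated_mm_hg: float, baseline_mm_hg: float) -> str:
    """Classify the portal venous pressure gradient (inflated minus baseline)."""
    gradient = balloon_inflated_mm_hg - baseline_mm_hg
    if gradient > 10:
        return f"{gradient} mm Hg: clinically significant portal hypertension"
    if gradient > 5:
        return f"{gradient} mm Hg: portal hypertension"
    return f"{gradient} mm Hg: no portal hypertension"

print(classify_gradient(24, 7))  # 17 mm Hg: clinically significant portal hypertension
```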
Although hepatic vein catheterization is effective, and problems during the procedure are rare, it is invasive and needs to be done under local or general anesthetic. In addition, it can only be done at highly specialized medical centers and cannot be used to take a series of measurements as the chronic liver disease progresses over time. As a result, researchers are investigating whether non-invasive methods are as effective at assessing portal hypertension.
Serum Biomarkers
These have included serum biomarkers that can be detected by analyzing blood samples taken from patients (sometimes called “liquid biopsies”). Serum is the liquid that is left when all the cells and clotting factors are removed from the blood. The term “biomarker” describes a measurable characteristic, such as a molecule in your blood or a change in your genes, that indicates what is going on in the body.
Endoscopic Ultrasound
Among the other methods available is endoscopic ultrasound, which uses a camera device (an endoscope) with a small ultrasound device (which emits high-frequency sound waves) on the end to look at the digestive tract and the surrounding organs. Although this approach is invasive, with the endoscope inserted into the mouth or anus, it can be repeated over time and has been shown to have good accuracy.
Scoring Systems
Scoring systems have also been developed that include measurements of the stiffness of the liver and/or spleen, taken using a non-invasive technique called transient elastography. This method works by applying a vibrating probe and transducer to the skin over the liver or spleen to produce something called a “shear wave”. The speed at which the shear wave travels through the organ is linked to its level of stiffness and indicates how much fibrosis there is.
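For the physics-minded, the link between shear-wave speed and stiffness follows a standard elastography relation (general background, not a formula quoted in the review article): stiffness grows with the square of the wave speed.

```python
# Standard shear-wave relation used in elastography: E = 3 * rho * v^2,
# where rho is tissue density and v is the shear-wave speed.
# General physics background, not a formula from the review article.
TISSUE_DENSITY_KG_M3 = 1_000  # soft tissue is approximately water-dense

def stiffness_kpa(shear_wave_speed_m_s: float) -> float:
    """Convert shear-wave speed (m/s) into tissue stiffness in kilopascals."""
    stiffness_pa = 3 * TISSUE_DENSITY_KG_M3 * shear_wave_speed_m_s ** 2
    return stiffness_pa / 1_000

print(stiffness_kpa(1.0))  # 3.0 kPa – a slow wave indicates soft tissue
print(stiffness_kpa(2.5))  # 18.75 kPa – a faster wave indicates a stiffer organ
```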
Conclusion
Although these approaches are not currently as effective as hepatic vein catheterization by themselves, there is some evidence that combining different techniques may improve accuracy. Future research in this area aims to improve the non-invasive early detection of portal hypertension.
What Is the Main Idea?
Current treatment approaches for Alzheimer’s disease focus on managing its symptoms and the changes in behavior that happen as the disease progresses. In the open-access review article “Personalized Paths: Unlocking Alzheimer’s via the Gut-Brain Axis”, published in the journal Visceral Medicine, the authors review research published to date regarding the potential role of the gut microbiota in the development and progression of Alzheimer’s disease, and potential future treatment strategies.
What Else Can You Learn?
In this blog post, factors that are known to contribute to the development of Alzheimer’s disease are discussed. The gut–brain axis and the role of the gut microbiota are also described.
Take-Home Message
Although further research is needed to better understand the link between the gut and the development of Alzheimer’s disease, taking steps to improve gut health (for example, by consuming a diet that is low in red meat and processed foods but high in fruits and vegetables, with moderate consumption of dairy products, poultry, eggs, and fish) may be one way for people to reduce their risk of developing Alzheimer’s disease in the future.
What Is Alzheimer’s Disease?
Alzheimer’s disease is the most common type of dementia in adults and is usually diagnosed in people aged 60 years and older, although it can develop in younger people. People with dementia often experience declines in cognitive function that affect their memory and other thinking skills like language, problem-solving, attention, and reasoning. Their behavior, feelings, and relationships can also be affected, with significant effects on their daily lives.
Alzheimer’s disease develops in three stages:
- The first “preclinical stage” is characterized by changes in the brain without the person experiencing any obvious symptoms.
- In the second stage, known as “mild cognitive impairment”, the person experiences problems with their memory and thinking skills, although they are not severe enough to affect their independence.
- In the third stage, which is when a diagnosis of Alzheimer’s disease is usually made, the person may experience memory loss and may have difficulty recognizing and identifying familiar people, objects, and words, or be unable to do so.
What Causes Alzheimer’s Disease?
Alzheimer’s disease is known to be caused, at least in part, by the abnormal functioning of two proteins called beta-amyloid and tau. In people with Alzheimer’s disease, beta-amyloid forms clumps called “plaques” on neurons (central nervous system cells that transmit messages between different parts of the brain and around the body) that make it hard for them to stay healthy and communicate with each other. Abnormal forms of tau cling to other tau proteins inside neurons and form “tau tangles”. However, this is not the whole story, and Alzheimer’s disease is now believed to be a complex condition that is caused by a variety of factors – including genetic, environmental, and lifestyle factors – that affect the brain over time.
There is no cure for Alzheimer’s disease. Current treatment approaches focus on managing its symptoms and the changes in behavior that happen as the disease progresses. As a result, researchers are investigating factors that may slow or prevent the development of the disease in the future, and improve the lives of patients and their carers.
What Is the Gut–Brain Axis?
The gut–brain axis is the term given to the connections between the central nervous system (which includes the brain and the spinal cord) and the gut (also known as the gastrointestinal system; it refers to the parts of our bodies that are involved in digestion, including the stomach and intestines), and the ways that they communicate with each other. A key component of the gut that enables it to function effectively is its microbiota, which is the name given to the community of microorganisms that live in it. Together, these microorganisms play essential roles in breaking down food, producing nutrients that the body needs, protecting against infection, and regulating the immune system.
How May the Gut–Brain Axis Be Linked to Alzheimer’s Disease?
Some of the microorganisms that make up the microbiota release signaling molecules that can influence brain function, as well as having anti-inflammatory and potentially neuroprotective properties. This means that the different microorganisms present in a person’s microbiota may either contribute to the development of Alzheimer’s disease or slow its development.
In addition, changes to the normal composition of the community of microorganisms that make up the microbiota – caused by factors such as infections, poor diet, and stress – can lead to long-term (chronic) inflammation in the gut. This inflammation is known to activate immune cells in the central nervous system called microglia, and chronic inflammation is a known characteristic of Alzheimer’s disease that is thought to contribute to the neurons becoming damaged.
In turn, the brain influences the gut and its microbiota in what can become a feedback loop. By releasing stress hormones like cortisol, the brain is able to alter the function of the gut and the composition of its microbiota. As a result, stress negatively affects the gut, and the gut can in turn negatively affect the brain, potentially worsening some neurological conditions and contributing to the development of neurodegenerative diseases like Alzheimer’s disease.
The Microbiota and Alzheimer’s Disease
Recent research has suggested that the gut microbiota may directly influence the development and progression of Alzheimer’s disease. Some bacteria found in the gut produce amyloid proteins that are similar to those found in the brain, and which may add to the buildup of plaques. Researchers have also reported that the different microorganisms that make up the gut microbiota are present in different proportions in people with Alzheimer’s disease compared with people without the condition, with some types of bacteria potentially being useful as biomarkers for the development of Alzheimer’s disease or targets for treatment in the future.
Potential Future Treatments
A number of treatment approaches related to the microbiota are being investigated to see whether or not they have any potential in treating people with Alzheimer’s disease. Among them, fecal microbiota transplantation (FMT, also known as poo or stool transplantation) has already been suggested by some researchers to benefit cognition in patients with dementia. FMT works by transferring the microbiota of a healthy donor to the intestines of a recipient, usually in capsule or liquid form. However, the lack of knowledge regarding potential long-term side effects (any unintended effects of a treatment) means that larger studies are needed before FMT can be judged to be clinically useful in the treatment of patients with Alzheimer’s disease.
Other approaches that are being investigated include the use of probiotics (live yeasts and bacteria that are found in some foods and supplements), although there have been conflicting results to date, and dietary changes. Diet is thought to be the single most influential factor on the composition of the gut microbiota. A diet that is low in red meat and processed foods, but high in fruits and vegetables with moderate consumption of dairy products, poultry, eggs, and fish, has been reported to be potentially protective against chronic inflammation and diseases that are caused by it.
All of these approaches are affected by each person’s individual gut microbiota composition. As research to better understand the mechanisms that underlie the gut–brain axis continues, future treatments will need to be personalized to be most effective at preventing a person’s cognitive decline and reducing the risk of side effects.
What Is the Main Idea?
There is a risk of patients who undergo kidney transplant operations developing antibody-mediated rejection after they receive their new kidneys. In the open-access research article “Positive Long-Term Outcome of Kidney Allocation via Acceptable Mismatch Program in Highly Sensitized Patients”, published in the journal Transfusion Medicine and Hemotherapy, the authors analyze whether the Eurotransplant “acceptable mismatch” program increases the chance of better long-term outcomes for patients at increased risk of antibody-mediated rejection.
What Else Can You Learn?
In this blog post, chronic kidney disease (CKD) and kidney failure in general are discussed. Issues relating to the potential rejection of transplanted kidneys, particularly antibody-mediated rejection, are also described.
Take-Home Message
Thorough assessment of patients who receive kidney transplants and of donated kidneys can reduce the risk of the recipient developing antibody-mediated rejection, and increase the long-term success of kidney transplants in patients at higher risk of developing it.
What Is End-Stage Renal Disease (ESRD)?
The kidneys do several important jobs in the body, including helping to control your blood pressure, making red blood cells, and removing waste products and extra water from your body to make urine (wee). If the kidneys become damaged and no longer work as well as they should, their ability to remove waste products from your blood is reduced and too much fluid and waste remains in the body. This is called CKD, an umbrella term that generally means that the kidneys have been permanently damaged by a variety of conditions and that kidney function (how well the kidneys do their job) is reduced.
Factors that can increase your risk of developing CKD include having diabetes, high blood pressure, and heart disease. People over 60 years of age are also more likely to develop CKD. Although CKD can initially be a mild condition, with no or few symptoms, some patients progress to ESRD (also known as end-stage renal failure and kidney failure) where kidney function drops to below 15% of its normal level. When this happens, it means that the kidneys have lost their ability to look after the body’s needs by themselves.
How Is ESRD Treated?
When the kidneys stop working, kidney replacement therapy in the form of dialysis or a kidney transplant is needed so that the person can survive. Some people undergo dialysis, a procedure by which the blood is regularly “cleaned” by a machine that filters it to remove excess water and waste products, for the rest of their lives. Other patients, if they are fit enough, choose to receive dialysis until they can get a kidney transplant: a type of surgery that places a healthy kidney from another person (the “donor”) into the patient’s (the “recipient’s”) body to filter their blood. Data show that kidney transplantation is the best long-term treatment option for patients with ESRD. However, a number of factors can affect its success, and patients can wait several years for a suitable kidney to become available.
What Affects the Success of a Kidney Transplant?
When kidney transplants are successful, the person who receives the transplant benefits from fewer restrictions on what they can eat and drink and from a better quality of life. Patients with ESRD who receive a kidney transplant also tend to live longer than those who do not, although this is not guaranteed.
Nonetheless, a kidney transplant operation is a major procedure with risks of complications and infections. There are also risks that something might go wrong with the transplanted kidney. Furthermore, patients need to take immunosuppressant medicines (usually for the rest of their lives) that reduce the activity of the immune system so that it does not attack the new kidney as “foreign”, which can cause the new kidney to be rejected. There is also a risk of antibody-mediated rejection developing months or even years after the person receives their new kidney, which can cause the transplant to fail.
What Is Antibody-Mediated Rejection?
Antibodies are specialized proteins that are made by the immune system and recognize markers that are considered foreign to the body (these are called “antigens”), such as those on bacteria and viruses. Different antibodies specifically recognize and neutralize different antigens. Once the immune system has recognized and responded to a particular antigen, antibodies against that antigen continue to circulate in the blood to provide protection if it is encountered again (this is how we become immune to some diseases).
As well as existing on the surfaces of bacteria and viruses, antigens are also present on the cells of our own bodies. The immune system uses specific antigens called human leukocyte antigens (HLAs) to recognize which cells belong in our bodies and which do not. When a kidney is transplanted, HLA mismatches between the donor and the recipient can be detected as “foreign” by the recipient’s immune system and trigger it to make donor-specific antibodies. This significantly increases the chance of the transplanted kidney being rejected or failing sometime after the operation.
Can the Risk of Antibody-Mediated Rejection Be Reduced?
The most effective way to reduce the risk of antibody-mediated rejection is for both the donor and the recipient to be “HLA typed”. Each person has many HLA markers, and research has shown that at least six HLA markers of a donor must match those of the recipient for a transplant to have a chance of being successful, although much closer matches are usually required. Some HLA types are less common than others, so some patients may face a longer wait for a suitable donor to be found. In addition, there is evidence that not all HLA mismatches are equal in terms of how they contribute to the risk of antibody-mediated rejection.
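As a toy illustration of the counting involved in HLA typing, a sketch might look like the following. The loci and marker names are simplified and hypothetical; real HLA typing is considerably more detailed.

```python
# A toy sketch of HLA matching; loci and marker names are simplified and
# hypothetical – real HLA typing is far more detailed.
def count_matching_markers(donor: dict, recipient: dict) -> int:
    """Count how many of the donor's HLA markers the recipient shares."""
    return sum(1 for locus, markers in donor.items()
               for m in markers if m in recipient.get(locus, ()))

donor = {"A": ("A2", "A24"), "B": ("B7", "B44"), "DR": ("DR4", "DR15")}
recipient = {"A": ("A2", "A3"), "B": ("B7", "B8"), "DR": ("DR4", "DR15")}
print(count_matching_markers(donor, recipient))  # 4 of 6 markers match here
```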
What Did This Study Investigate?
Eurotransplant is an international non-profit organization that facilitates cross-border exchange of donor organs between eight countries in Europe. By mediating between donor hospitals and transplant centers in its member countries, Eurotransplant aims to increase the likelihood that a person waiting for a transplant will find a suitably matched kidney and decrease the length of time that they will have to wait.
Some patients are classed as higher urgency by Eurotransplant, and these include patients for whom there is a risk that they have antibodies that will react to blood or tissue from another person (termed “immunized” patients). Because this means that the risk of organ rejection is also increased, immunized patients are eligible to join a dedicated program called the “acceptable mismatch” program. This identifies HLA mismatches that are unlikely to cause severe antibody-mediated reactions (in other words, they are “acceptable” to the potential recipient’s immune system).
Immunized patients who have received kidneys through the acceptable mismatch program have been shown to have short-term transplant survival rates similar to those of patients who are not immunized. However, data showing success rates over longer periods of time have been lacking. The authors of this study compared the long-term outcomes of immunized patients who received kidneys through the acceptable mismatch program with those of patients who received kidney transplants and were either not sensitized to HLA mismatches or were only sensitized to a small extent (all patients were allocated kidneys by Eurotransplant). They also looked at whether HLA compatibility and the type of induction therapy received (a treatment given at the time of the transplant operation to reduce the risk of the new kidney being rejected) affected the chance of success.
What Were the Findings of the Study?
The authors of the study report that overall graft survival rates 10 years after transplant are similar between patients who are not sensitized and those who are allocated a kidney via the acceptable mismatch program. In contrast, overall graft survival after 10 years in patients who are slightly sensitized but not eligible for the acceptable mismatch program is significantly lower than in patients who are not sensitized. The study also identified broad HLA mismatches that can predict an increased risk of the antibody formation that can cause antibody-mediated rejection.
In conclusion, the authors state that patients on the acceptable mismatch program benefit from improved long-term outcomes, and that the risk of them developing antibodies that could cause rejection of their new kidneys is decreased. They also note that patients who are partly sensitized have better outcomes with a particular type of induction therapy compared with other types. Overall, the acceptable mismatch program delivers better outcomes for immunized patients with ESRD over the long term.
Note: One of the authors of this paper makes a declaration about grants, research support, consulting fees, lecture fees, etc. received from pharmaceutical companies. It is normal for authors to declare this in case it might be perceived as a conflict of interest. For more detail, see the Conflict of Interest Statement at the end of the paper.
What Is the Main Idea?
At-home use of nail cosmetic products (NCPs) is increasingly popular but can be associated with increased risk of adverse effects compared with their use in salons. In the free-access review article “Adverse Effects of Do-It-Yourself Nail Cosmetics: A Literature Review”, published in the journal Skin Appendage Disorders, the authors assess the most common adverse effects that can result from the use of NCPs.
What Else Can You Learn?
Different types of NCPs are described along with adverse effects that can result from their use.
Take-Home Message
Users of at-home NCPs should be aware of the chemicals in the products that they use and should check that they are using NCPs correctly. If you use artificial nails, it is worth taking a break from them every few months to let your nails recover from exposure to the chemicals used to apply and remove them.
What Are the Most Popular Types of NCPs?
Although traditional nail polish remains one of the most popular NCPs, a range of other NCPs and techniques have been developed that offer the advantage of being longer lasting and harder wearing. These newer NCPs were originally available only in salons, but at-home manicures have become increasingly popular because they can be relatively easy to apply and enable people to follow nail art trends seen in video tutorials available online.
Press-on nail sets include plastic tips in a range of sizes so that one can be fitted to each fingernail; the tips are attached to the nail with nail glue or a peel-and-stick adhesive. In contrast, gel nail polish contains acrylic- or cyanoacrylate-based compounds, which are applied as at least three layers: a base coat, a colored layer, and a top coat. The layers are then “cured” (dried and hardened) using a light-emitting diode (LED) or ultraviolet (UV) lamp.
Shellac nail polish, which similarly needs to be cured using a UV lamp, is a method that combines both traditional and gel nail polish. In contrast, acrylic nails are formed by combining a liquid with a powder to form a mixture that is applied and sculpted onto the nail, and that hardens through air-drying. Both acrylic nails and gel polish need to be removed by either soaking in acetone or the use of a nail drill.
Nail hardeners are used to protect nails and increase their strength, and are also used to deter nail-biting habits in both children and adults. Like traditional nail polish, nail hardeners often contain substances called tosylamide resin and/or formaldehyde resin, which help the polish/hardener to stick to the nail.
Can NCPs Have Adverse Effects?
The short answer is yes. The term “adverse effect” describes any unintended harmful effect that is caused by a medication, intervention or treatment. Although NCPs are not inherently harmful, a range of adverse effects have been reported following NCP use in salons over the years. Furthermore, there is evidence that consumers using at-home NCPs are at increased risk of experiencing adverse effects, due in part to a lack of training and not being aware of the potential consequences of not applying the NCPs correctly. Some nail tutorials available online do not show accurate and safe techniques for the application of the NCPs being used, and some NCPs do not come with detailed information about how to apply the NCP correctly or warnings about potential issues if the person fails to do so.
What Is the Most Common Adverse Effect That NCPs Can Cause?
The most common adverse effect caused by NCPs is allergic contact dermatitis, which is triggered when a person comes into contact with a particular substance, such as a chemical in cosmetics or nickel in jewelry. It is a type of eczema that can affect any part of the body, and can develop immediately after exposure to the substance causing the reaction or hours or days later. Symptoms vary from dry, cracked, itchy skin to a burning or stinging sensation, with some people developing painful fluid-filled blisters on the affected area. In the case of NCPs, most cases of allergic contact dermatitis are caused by acrylates found in acrylic nails, gel polish, and nail glue.
What Other Adverse Effects Are Associated with NCPs?
Chemical Burns
The nail glues that are used to stick on press-on nails can cause chemical burns (these are burns caused by contact with caustic, acid, or alkali chemicals rather than heat). They often occur as a result of accidental spillage, particularly when children are using NCPs without supervision, and the burns can be severe. In some cases, people who have had chemical burns caused by nail glue have had to have skin grafts.
In many cases, part of the problem is believed to be a lack of awareness about how to best deal with a chemical burn. Any contaminated clothing needs to be removed carefully and in a way that will not spread the chemical that is causing the burn. It is also important not to wipe the skin, because this can spread the chemical over a greater area. Instead, the affected skin should be rinsed with clean water (ensuring that the water can run off the skin freely) as soon as possible without wiping or rubbing, and the person should seek medical attention as soon as possible.
Nail Infections
Extended use of press-on nails has been associated with increased risk of nail infections. This is often caused when the real nail (the nail plate) under a press-on nail is partially dislodged from the nail bed, for example if the press-on nail is knocked against something. The nail bed is a layer of skin that is visible directly under the nail plate (which is semi-transparent).
If the nail plate becomes dislodged, microbes like bacteria and fungi can get into the gap between the nail plate and the nail bed and start to grow. An example of this is green-nail syndrome (also known as “the greenies” or “chromonychia”), which is caused by infection with a type of bacteria called Pseudomonas aeruginosa. Although green-nail syndrome usually responds well to treatment, in some cases the affected nail may need to be removed.
Risk of Photosensitivity
In addition to the above, the use of UV lamps to cure nails is associated with a risk of photosensitivity (an unusual reaction or heightened sensitivity when the skin is exposed to UV radiation). This has been particularly associated with people with forms of an autoimmune condition called lupus erythematosus. Some studies have also reported concerns about whether or not the use of UV nail lamps increases a person’s risk of developing skin cancer. Although it is still not clear whether using these lamps does significantly increase a person’s risk, frequent users may wish to wear gloves or apply sunscreen before using them as a precaution.
Note: The authors of this paper make a declaration about grants, research support, consulting fees, lecture fees, etc. received from pharmaceutical companies. It is normal for authors to declare this in case it might be perceived as a conflict of interest. For more detail, see the Conflict of Interest Statement at the end of the paper.
What Is a Food Allergy?
A person is described as having a “food allergy” if their immune system has an unusual, and usually unpleasant, reaction to a specific food. Reactions to foods that are not classed as food allergies include food intolerances (where a food irritates the digestive system or the body cannot digest a particular food properly), reactions to food that has become contaminated, or something being in the food that can have drug-like effects on the body (like caffeine in coffee).
The body’s immune system protects your body from things that could make you ill, like harmful substances and infections. Key components are inflammation (which traps things that might be harmful and begins to heal injured tissue) and white blood cells (which identify and eliminate things that might cause infection). Some white blood cells make antibodies that, together with other specialized immune cells, enable the body to recognize and fight specific germs that it has previously come into contact with, sometimes providing lifelong protection. Antibodies are divided into five different classes – IgD, IgG, IgM, IgA, and IgE – based on their characteristics and roles.
Are There Different Types of Food Allergy?
Food allergies are divided into two types: IgE-mediated and non-IgE-mediated (NIM).
- If a food allergy is IgE-mediated, it is caused by IgE antibodies wrongly recognizing the food as a threat. Within minutes of the food being ingested, hives (a raised and itchy rash, also called “urticaria”) and redness of the skin can appear. The person may also start to vomit and, if the reaction is serious, anaphylaxis can occur. Anaphylaxis is life-threatening and symptoms can include difficulty breathing, swelling of the throat and tongue, feeling faint or dizzy, wheezing or coughing, and tightness in the throat.
- NIM food allergies are caused by components of the immune system other than IgE. They are not as well understood as IgE-mediated food allergies, but a key difference is that allergic reactions do not develop as quickly. Whereas IgE-mediated allergic reactions appear almost immediately after the trigger food is eaten, the appearance of symptoms of NIM reactions is delayed, sometimes appearing as long as several days later. This can make it more difficult to identify the food that is causing the reaction.
What Are the Symptoms of NIM Allergic Reactions?
NIM allergic reactions can affect any part of the gastrointestinal tract. This refers to the route that food and drink take as they enter the body at the mouth and travel through the stomach and intestines, before waste is passed out of the body. Symptoms can include diarrhea, vomiting, discomfort in the stomach area, and constipation. Babies can also have “colic”, which is when a baby cries a lot without there seeming to be an obvious reason for it.
What Is FPIES?
FPIES is a rare type of NIM food allergy that is usually diagnosed in infants and that is likely to have been present from birth. It affects the small intestine, which is the part of the digestive tract that receives partially digested food from the stomach before it moves on to the colon (large intestine). In most cases, symptoms include repeated vomiting between 1 and 4 hours after the trigger food is eaten, often not long after the infant has first eaten it (for example, during weaning), and diarrhea within 24 hours. However, in some cases symptoms appear several days later. If the vomiting is severe, the infant may become pale and floppy.
Common trigger foods include cow’s milk, hen’s eggs, and soy, but FPIES can also be caused by rice, meats, and other foods that are not often associated with food allergies. Although the reactions can be severe, some children “grow out of” the allergy and become able to tolerate the trigger food by the age of 2 years.
What Did the Study Investigate?
Unlike with IgE-mediated food allergies, there are currently no skin or blood allergy tests for NIM food allergies like FPIES. Instead, FPIES is diagnosed by removing foods from the diet one at a time and, if the symptoms start to get better, reintroducing the suspected food to confirm that it is the trigger. As a result, it is thought that some cases of FPIES are not diagnosed, which is not helped by the fact that the symptoms are non-specific (in other words, they are common symptoms of illness that could be caused by several different things). Better understanding of the characteristics of FPIES that goes undiagnosed may help to prevent underdiagnosis or someone being wrongly diagnosed with something else.
The authors of the study used information collected by the Japan Environment and Children’s Study to investigate how commonly FPIES is diagnosed in Japan, and to look for differences between parent- and healthcare professional-diagnosed FPIES. The Japan Environment and Children’s Study involved more than 100,000 pregnant Japanese women. Each woman was asked to complete questionnaires about her child and family at regular intervals until the child was aged 3 years, and was asked whether the child had ever had any of the symptoms that suggest FPIES, particularly repeated vomiting. The authors of the study were then able to analyze the results collected at age 1.5 years and look for differences and trends.
What Did the Study Show?
The number of children diagnosed with FPIES was low, which was expected because FPIES is relatively rare, with less than 1% (0.69%) of parents reporting that their child had shown symptoms of FPIES. However, only 0.06% of children had been diagnosed as having FPIES by a healthcare practitioner, equivalent to only around 10% of the children whose parents had reported FPIES symptoms. This suggests that FPIES is underdiagnosed in Japan.
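For readers who want to see where the “around 10%” figure comes from, the following short Python sketch reproduces the arithmetic using the two percentages reported in the study (the variable names are ours, not the authors’):

```python
# Percentages reported in the study (both relative to all children surveyed)
parent_reported = 0.69   # % of children with parent-reported FPIES symptoms
doctor_diagnosed = 0.06  # % of children diagnosed by a healthcare practitioner

# What fraction of parent-reported cases received a formal diagnosis?
diagnosed_fraction = doctor_diagnosed / parent_reported
print(f"{diagnosed_fraction:.1%} of parent-reported cases were formally diagnosed")
# -> about 8.7%, i.e. "only around 10%"
```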
In addition, there was a discrepancy between the trigger foods that the parents reported as causing the symptoms and the ones that healthcare professionals identified as the cause of the allergic reaction when making their diagnoses. Parents were more likely to report hen’s eggs as being the trigger food, while healthcare practitioners were more likely to diagnose an allergy to cow’s milk. These results suggest that, as well as more research being needed, healthcare practitioners need more accurate information about a child having had episodes of similar symptoms in the past to be able to make an accurate diagnosis.
What Is the Main Idea?
Allergic rhinitis and chronic rhinosinusitis both involve inflammation of the nose. In the research article “National Trends in Allergic Rhinitis and Chronic Rhinosinusitis and COVID-19 Pandemic-Related Factors in South Korea, from 1998 to 2021”, published in the journal International Archives of Allergy and Immunology, the authors describe how the increasing rate of incidence of allergic rhinitis and chronic rhinosinusitis slowed in South Korea during the COVID-19 pandemic.
What Else Can You Learn?
The causes and symptoms of allergic rhinitis and chronic rhinosinusitis are discussed. Lifestyle measures adopted during the COVID-19 pandemic that might benefit people with these conditions, and the role of the sinuses, are also described.
Take-Home Message
Lifestyle factors such as the wearing of face masks and eye protection, as well as social distancing, frequent hand washing, and disinfection of surfaces, which were adopted to limit the spread of the SARS-CoV-2 virus during the COVID-19 pandemic, may benefit patients with allergic rhinitis and chronic rhinosinusitis.
What Are Allergic Rhinitis and Chronic Rhinosinusitis?
Allergic rhinitis and chronic rhinosinusitis both affect the nose (as indicated by the prefix “rhin”):
- Allergic rhinitis describes inflammation of the inside of the nose caused by a person coming into contact with something that they are allergic to. Inflammation is a normal process through which the body responds to an injury or infection by causing blood cells and other substances to gather at the affected area. If someone is allergic to something, their immune system identifies it as potentially harmful and inflammation is triggered in an attempt to remove it. Common causes of allergic rhinitis include dust, mold spores, pollen (this form of allergic rhinitis is more commonly known as hay fever), contact with animals, and chemicals used to maintain the quality of the water in swimming pools.
- Chronic rhinosinusitis is also caused by inflammation, but specifically describes inflammation of the sinuses that is not necessarily caused by an allergy and that lasts longer than 12 weeks, even with treatment. Although it is not yet known how inflammation of the sinuses becomes chronic in some people, smoking, having a weakened immune system, the presence of growths (known as “nasal polyps”) in the nose, and allergies and related conditions like asthma have all been shown to be associated with it.
What Are the Sinuses?
The term “sinus” is used in medicine to describe more than one thing, but one use of the term is to specifically describe air-filled cavities inside the skull that are connected with each other. The role of the sinuses is not fully understood, but their presence means that the overall mass of the skull is less than it would be if it was entirely made up of bone.
Both the sinuses and the inside of the nose are lined by a membrane layer that produces and secretes mucus (snot). Mucus is a sticky liquid that contains water, salt, and cells that are produced by the immune system. It keeps the nasal passages lubricated, and also protects the body from irritants (like dust and pollen) and microbes that can cause infections. If a microbe or irritant enters the nose, it gets trapped in the sticky mucus and the body then tries to get rid of it, for example by sneezing. If you have a cold or an allergic reaction, more mucus is produced than normal because the body is trying to get rid of the microbes or irritants that are causing the immune system to mount a response.
How Do Allergic Rhinitis and Chronic Rhinosinusitis Differ?
The symptoms of allergic rhinitis often develop quickly after a person comes into contact with something they are allergic to, and are similar to those caused by having a cold: a runny or blocked nose, a cough, sneezing, and reddened, itchy, or watery eyes. People with chronic rhinosinusitis often have similar symptoms, and because the condition keeps mucus from draining away, they can also experience swelling resulting in pain and tenderness around the forehead, nose, eyes, or cheeks. Other symptoms include aching in the teeth, bad breath, and ear pain. The key difference between the two conditions is the length of time over which a person experiences symptoms, with the symptoms of chronic rhinosinusitis lasting much longer.
How Common Are Allergic Rhinitis and Chronic Rhinosinusitis?
Allergic rhinitis and chronic rhinosinusitis are both common. Allergic rhinitis is estimated to affect 40–50% of the world’s population, while studies of chronic rhinosinusitis have estimated that it affects between 5% and 12% of people. Although the prevalences of both conditions differ between countries, global incidence is increasing and has been linked to increased environmental air pollution, climatic factors such as humidity and increased exposure to particles carried by winds, and lifestyle factors such as increased exposure to allergens and changes in the foods that people eat.
What Did the Study Investigate?
The authors of this article analyzed data collected as part of a large, national study conducted in South Korea called the Korea National Health and Nutrition Examination Survey (KNHANES), which was begun to enable the health and nutrition status of thousands of Korean citizens aged 1 year or older to be monitored over a long period of time. Studies like this enable researchers to identify changes in the health of a population and to identify things that may increase the risk of developing conditions like allergic rhinitis and chronic rhinosinusitis.
The authors analyzed data relating to a group of adult KNHANES participants over a period of 24 years (between 1998 and 2021) to see how the incidence of allergic rhinitis and chronic rhinosinusitis changed. They found that the incidence of allergic rhinitis and chronic rhinosinusitis increased by more than 3% and more than 2%, respectively, over the course of the study. These findings mirror those of studies conducted in other countries.
How Did Incidence Change during the COVID-19 Pandemic?
The authors also observed that the rate at which the incidence of allergic rhinitis and chronic rhinosinusitis increased slowed down between the start of the COVID-19 pandemic in 2020 and 2021. COVID-19 is caused by a virus called SARS-CoV-2, which is mainly spread via small “respiratory droplets” (so small that you cannot see them) that are released into the air when a person infected with the virus breathes, coughs, speaks, and sneezes. Although South Korea did not experience a lockdown like some other countries, the wearing of face masks and eye protection, as well as social distancing, frequent hand washing, and disinfection of surfaces was quickly and widely adopted.
Other studies have reported that such lifestyle measures, which limit the spread of SARS-CoV-2, are also useful in the management of patients with allergic rhinitis. In addition, reductions in air pollution as a result of lockdowns have been reported by some researchers to have had positive effects on people with allergic rhinitis and chronic rhinosinusitis. Although more research is needed, the authors conclude that the reduced incidence of allergic rhinitis and chronic rhinosinusitis seen in South Korea during the pandemic indicates the potential for lifestyle changes like these to benefit people with these conditions.
Dr DongKeon Yon, corresponding author, on the relevance of the article for patients:
“The increase in prevalence of allergic rhinitis (AR) and chronic rhinosinusitis (CRS) from 1998 to 2021 underscores the need for enhanced public health efforts to prevent and manage these conditions. These could include improving public awareness, increasing access to diagnostic and treatment services, and implementing preventive measures such as improving air quality. The pandemic-related decrease suggests that lifestyle and behavioral changes, such as reduced outdoor activity and increased use of face masks, have protective effects against these conditions.”
What Is the Main Idea?
The nails on the ends of our fingers and toes protect them from damage, but can indicate underlying health conditions and may themselves become diseased. In the open-access review article “Vitamins for the Management of Nail Disease: A Literature Review”, published in the journal Skin Appendage Disorders, the authors review the evidence published to date that investigates whether the treatment of nail disorders with vitamins and their derivatives is effective or not.
What Else Can You Learn?
The roles of fingernails and toenails, and how they are formed, are discussed. Different types of nail disorders and their causes are also described.
Take-Home Message
There is little evidence to support vitamin supplements being effective in improving nail disorders. People who develop a nail disorder should seek advice and treatment from a dermatologist.
Why Do We Have Fingernails and Toenails?
Our fingernails and toenails are essentially tough, rigid plates that protect the ends of our fingers and toes from damage, and act as a barrier to stop microbes like bacteria, fungi, and viruses entering the body and causing infections. Nails also enhance the sensitivity of the ends of the fingers and toes when they touch an object, enable us to scratch things, and make it easier for us to pick up very fine things like a hair on a jumper or a needle on the floor.
How Do Nails Grow?
Nails have several parts. The hard surface that we think of as being the actual nail is called the nail plate and is mostly made of a protein called keratin, which is also found in claws, horns, and hooves of other animals. The nail bed is a layer of skin that is visible directly under the nail plate (which is semi-transparent). At the base of the nail, a thin layer of skin called the “cuticle” grows over the nail plate and provides a waterproof barrier.
Below the cuticle is the “proximal nail fold”, which covers a pouch of skin, called the nail matrix, that the base of the nail is tucked into. The whitish, half-moon-shaped area at the base of the nail is the part of the matrix that is visible. The nail matrix is the part of the nail responsible for its growth. Keratin is constantly being produced and slowly pushes the nail plate forward, causing it to grow longer. Fingernails grow faster than toenails and do so, on average, at a rate of 3–3.5 millimeters per month.
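To give a feel for what that growth rate means in practice, here is a small illustrative calculation; the nail length used is an assumption for the example, not a figure from the article:

```python
# Assumed length of an adult fingernail plate, in millimeters (illustrative only)
nail_length_mm = 15.0

# Growth rate range given in the article: 3-3.5 mm per month
for rate_mm_per_month in (3.0, 3.5):
    months = nail_length_mm / rate_mm_per_month
    print(f"At {rate_mm_per_month} mm/month, full regrowth takes about {months:.1f} months")
# -> roughly 4.3-5 months for a 15 mm nail plate
```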
How Does Nail Integrity Represent Overall Health?
The keratin that forms the nail plate comes from a specialized type of cell. The way in which these cells link together can affect the consistency, strength, and look of the nail plate. Because the nail plate forms from living cells, changes to the way the nail looks can indicate that a person has a health problem like a nutritional deficiency.
For example, people with a chronic iron deficiency may have nails that bend up at the sides or that are unusually pale, while clubbing (when nails appear swollen or wider than normal) can indicate low oxygen levels in the blood, potentially caused by a chronic lung disorder. In addition to nail abnormalities caused by underlying health problems, the nails themselves can become diseased. The nail plate is more permeable (a measure of how easily gases and liquids can pass through something) than skin, which means that harmful substances and microbes can penetrate it more easily.
What Are Some Common Nail Disorders?
There are a range of nail disorders that are caused by different things:
- Brittle nail syndrome causes the nail to become fragile.
- Onychomycosis is a fungal nail infection that can cause the nail to become brittle and discolored.
- Habit-tic nail deformity is caused by the repeated rubbing, picking, or pushing back of the proximal nail fold.
- Periungual/subungual verrucas are a type of wart that forms in the grooves of the proximal nail fold or under the nail plate.
- Nail psoriasis is an autoimmune disorder that can cause nails to become pitted and discolored, and may be accompanied by a psoriatic rash (patches of skin that are red, dry, and flaky).
What Did the Article Investigate?
Because many nail disorders are chronic conditions (lasting for a long time or that come back over and over again), there is a need for more safe, effective treatments for nail disorders that can be used long-term. Although there is limited clinical evidence that treatment with vitamin supplements can be effective, survey-based studies have shown that dermatologists (doctors that specialize in treating nail, skin, and hair problems) often recommend vitamin supplements to patients, and that self-reported use of vitamin supplements to improve nail, skin, and hair disorders among affected patients almost doubled between 2011–2012 and 2017–2020.
The authors of the article searched the published medical literature for studies that have assessed the effectiveness of vitamins and vitamin derivatives in treating nail disorders when taken by mouth (orally), applied on the outside of the body (“topically”, for example a cream that is rubbed into the skin or nail), or applied directly to lesions (areas of nail or skin damage, for example by being injected into an abnormal area of skin). In total, 49 articles were considered suitable for assessment. In addition to looking at research involving the common nail disorders described above, the authors also looked at evidence relating to a rare condition called yellow nail syndrome.
What Did the Authors Conclude?
The authors concluded that, overall, there is currently limited evidence to support treating nail disorders with vitamins and their derivatives. Exceptions to this included the treatment of some patients with yellow nail syndrome with oral or topical vitamin E, and the treatment of patients with onychomycosis with topical vitamin E (although the authors note that clinical trials are needed to investigate its efficacy and possible side effects).
The treatment of nail psoriasis with topical tazarotene, a type of retinoid (a form of vitamin A), or with analogs of vitamin D (forms of naturally occurring vitamin D that have been chemically modified to have different or greater therapeutic effects) was judged to have been proven effective. In addition, intralesional vitamin D3 treatment seemed to be effective in treating periungual/subungual verrucas, but the authors noted that more studies are needed.
Overall, the authors did not find good evidence that taking vitamin supplements is effective in treating nail disorders. These findings confirm that further research is needed to develop more effective treatments for nail disorders, and that people who develop nail disorders should seek specialist advice from a dermatologist rather than relying on vitamin supplements to improve the condition.
Shari Lipner MD, PhD, corresponding author, on the relevance of the article for patients:
“There is increasing interest by the public in treatment of medical conditions, including nail disorders, with vitamins because they believe they are safe and hope that they are effective. While more research is needed, vitamin E may be reasonable for yellow nail syndrome treatment, given limited treatment options. Biotin is not recommended for brittle nail syndrome treatment, given potential laboratory interactions and lack of efficacy. Topical vitamin A and D analogs are efficacious and may safely be prescribed for nail psoriasis.”
What Is the Main Idea?
Platelets, a type of blood cell, are involved in the control of blood loss. In the review article “In vitro Hemostatic Functions of Cold-Stored Platelets”, published in the journal Transfusion Medicine and Hemotherapy, the authors discuss how the temperature at which platelets are stored before they are transfused into patients affects their functions. They also review research published to date comparing the effects of storing them in the cold compared with at room temperature.
What Else Can You Learn?
The role of platelets in stopping blood loss and the importance of platelet transfusion are discussed. Transfusion-transmitted sepsis is also described.
Take-Home Message
While research continues to optimize how platelets are stored, there is an increasing need for people to become platelet donors.
What Are Platelets?
Blood is made up of a liquid called plasma and three main types of blood cell:
- Red blood cells (also known as “erythrocytes”) carry oxygen around the body.
- There are several different types of white blood cells (also known as “leukocytes”) and they help fight infection.
- Platelets (also known as “thrombocytes”) are the smallest type of blood cell and are involved in the process that enables blood to clot to promote healing and control “hemostasis” (the process that stops us from losing blood if we begin to bleed).
How Do Platelets Help Control Blood Loss?
The process by which blood clots are formed involves platelets and a number of different types of protein. Platelets are made in the bone marrow and continuously travel around the body in the bloodstream. Under normal conditions, when there is no damage to blood vessels that needs to be repaired, platelets are “quiescent” (inactive) and plate-shaped. If a blood vessel becomes damaged, platelets start to be attracted to the site of injury because a protein called collagen becomes exposed. Once they get there, they start to stick to the collagen and also to each other. This is helped by the cells lining the damaged blood vessel releasing a molecule called von Willebrand factor.
The platelets then become activated. As this happens, their shape changes to spherical with long “spines” or “tentacles”. The platelets start to secrete chemical signals that attract other platelets to the injury site, and they clump together to temporarily cover and close the wound, a bit like a plaster. This platelet covering isn’t able to last long, so clotting factors in the blood start to convert a protein called fibrinogen into a different form called fibrin. Fibrin can form long, tough, insoluble strands that bind to the platelets and cross-link together to form a strong, long-lasting mesh on top of the platelet plug. The fibrin then acts as a scaffold as part of the healing process.
What Is Platelet Transfusion and When Is It Needed?
Platelet transfusion is the process by which platelets that have been donated are transferred into the bloodstream of another person. The body needs to have a certain number of platelets in the blood to be able to control hemostasis properly. Too few or too many platelets can cause problems. A normal platelet count is considered to be 150,000–450,000 platelets per microliter of blood. If a person’s platelet count is greater than this they are classed as having a condition called thrombocytosis. Thrombocytosis can have a number of different causes, which affect how serious it is and whether or not a person needs treatment. If a person’s platelet count is less than 150,000 platelets per microliter of blood they are classed as having thrombocytopenia.
Thrombocytopenia can develop as the result of a number of conditions, including cancer, some types of anemia, autoimmune conditions, and viral infections, and as a result of certain types of medical treatment. Symptoms can include frequent gastrointestinal bleeding or bleeding from the gums and nose, and bruising easily. A person may also have a low platelet count if they have bled severely (for example during surgery or as the result of being in an accident) or if their spleen (the organ that “cleans” the blood to keep it healthy) starts to remove too many platelets. In addition to having too few platelets, people may also need a platelet transfusion if they have a platelet function disorder that means that they have enough or too few platelets but they do not work properly.
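A minimal sketch of how the thresholds described above divide up platelet counts (the function name and wording are ours; in practice, interpreting a platelet count depends on much more than a single number):

```python
def classify_platelet_count(platelets_per_microliter: int) -> str:
    """Classify a platelet count using the thresholds described above."""
    if platelets_per_microliter < 150_000:
        return "thrombocytopenia (too few platelets)"
    if platelets_per_microliter > 450_000:
        return "thrombocytosis (too many platelets)"
    return "within the normal range"

print(classify_platelet_count(90_000))   # -> thrombocytopenia (too few platelets)
print(classify_platelet_count(250_000))  # -> within the normal range
print(classify_platelet_count(600_000))  # -> thrombocytosis (too many platelets)
```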
What Happens to Donated Platelets?
There is a constant and increasing demand for donations of platelets. Some donations are used to transfuse people with a low platelet count, while others are used to help patients who are receiving cancer treatment or are in intensive care. One of the reasons for the increasing need for new donors is that platelets don’t last very long. They are only usable for 7 days after they have been donated, and in the body they are removed from the bloodstream by the spleen or liver after 7–10 days.
Until the 1960s, platelets were stored in the cold (at 4 °C, i.e. in a fridge) because cold-stored platelets are better at stopping blood loss than ones stored at room temperature. However, research showed that patients recovered better after platelet transfusion and that platelets lasted longer if they were stored at room temperature, and the cold storage of platelets was abandoned. While this improved patient recovery after transfusion, it brought a new problem: storage of platelets at room temperature increases the risk of septic transfusion reactions caused by the platelets being contaminated with bacteria.
What Is Transfusion-Transmitted Sepsis?
Transfusion-transmitted sepsis can develop if a patient is transfused with donated platelets that are contaminated with bacteria. It is typically caused by contamination with bacteria that usually live harmlessly on a person’s skin, contamination getting into the platelet sample during collection or processing, or the donor unknowingly having bacteria in their blood. Symptoms can begin during or shortly after a transfusion and include severe shivering and chills, high fever, nausea and vomiting, breathing difficulties, low blood pressure, a fast heart rate, and circulatory collapse. Severe cases can be fatal.
What Did This Review Article Investigate?
Research continues to investigate ways to improve the safety of platelet transfusion and to optimize how platelets are stored. As a result, over recent years, interest in cold storing platelets has increased because it reduces the risk of bacterial contamination and potentially increases the length of time that platelets can be stored.
There is some evidence that storing platelets at room temperature for 5 days (during which time they are constantly gently shaken), followed by cold storage without shaking for up to another 16 days, may result in platelets being of better quality at the time of transfusion. Other studies have reported that cold-stored platelets are “primed” for activation to a greater extent when compared with platelets stored at room temperature, and also seem to be better at sticking to collagen at sites of blood vessel damage. In addition, one research group has reported that cold-stored platelets are able to form denser clots, with thinner fibers and more crosslinks, making them more effective at stopping blood loss.
Although these results seem promising, there is also evidence from research reports that how platelets are prepared for transfusion, the solutions that are added to them (such as plasma), and variations in how they are stored can affect how well they function. Until these factors become globally standardized, it is difficult to draw conclusions regarding whether storage at room temperature or at 4 °C is best. In the meantime, platelet donations are increasingly needed to help a broad group of patients with a variety of conditions. Requirements vary from region to region, but if you are interested in becoming a platelet donor you will be able to get information from the health service in your country.
Note: The authors of this paper make a declaration about grants, research support, consulting fees, lecture fees, etc. received from pharmaceutical companies. It is normal for authors to declare this in case it might be perceived as a conflict of interest. For more detail, see the Conflict of Interest Statement at the end of the paper.
What Is the Main Idea?
The gastrointestinal system or “gut” refers to the parts of our bodies that are involved in digestion, including the stomach and intestines. In the open-access review article “Gut Frailty: Its Concept and Pathogenesis”, published in the journal Digestion, the author discusses a concept called “gut frailty”, and describes how the extent to which our gut becomes frail in old age can affect our overall health.
What Else Can You Learn?
The concept of gut frailty is discussed. The importance of the gut microbiome and the link between constipation and frailty are also described.
Take-Home Message
Taking steps to keep our guts healthy can increase the chances of us staying well in old age.
How Do Life Expectancy and Healthy Life Expectancy Differ?
In many countries, life expectancy (the average number of years that a person can expect to live) has increased over the last 50 years. However, there can be big differences between life expectancy and healthy life expectancy (the average number of years that a person can expect to live in good health). For example, in Japan, one out of every two babies born in 2023 is now expected to live until the age of 100 years; however, current healthy life expectancy is approximately 9 years less than life expectancy for men and 12 years less for women.
A number of research studies are being conducted to help us better understand how to narrow the gap between healthy life expectancy and life expectancy. Increased prevalence of obesity and levels of physical activity have both been shown to be significant factors. In addition, recent research has suggested that gut frailty may also be involved.
What Is Gut Frailty?
People are described as being frail when they are between the states of being healthy and needing care. As people age, they become frail when they have reduced physical (muscle) and mental strength and health, and the risk that they will need assistance with daily activities begins to increase. The term “gut frailty” refers to the functions of the gastrointestinal system becoming “weakened”. Recent research has shown that gut frailty can be a precursor to overall frailty, can worsen the symptoms and severity of some diseases, and also causes chronic inflammation.
What Are the Symptoms of Gut Frailty?
The following symptoms are considered to be potential indicators of gut frailty:
- Pain or discomfort in the abdomen;
- Constipation (finding it hard to poo or going to the toilet less often than usual) or diarrhea (when the poo is loose and watery, and needing to go to the toilet more often than usual);
- Abdominal bloating;
- Stress-related symptoms;
- Weight loss or decreased appetite.
What Is the Link between Constipation and Frailty?
Among the symptoms listed above, constipation seems to be particularly associated with frailty. Studies have shown that people who experience constipation are at greater risk of developing a number of conditions that include disorders affecting the heart and blood vessels (cardiovascular disorders), chronic kidney disease, and neurodegenerative disorders such as Parkinson’s disease. Although constipation is often thought to simply be a result of the colon not functioning properly (the colon is the part of the digestive system where water and some other nutrients are absorbed into the body from our partially digested food), it can actually be a symptom that disease is developing and can make it worse.
In a study that compared the cognitive decline (changes in cognitive function that are considered to be a normal part of the aging process, like difficulties with multitasking and sustaining attention, and an overall slowing of thinking speed) of elderly people who experienced constipation with people who did not, the rate of cognitive decline was 2.7 times faster among the people with constipation. Similarly, another study reported that loss of muscle and strength as a person gets older (known by the medical term “sarcopenia”) was significantly greater in a group with constipation symptoms compared with a group without them.
What Causes Gut Frailty?
The exact causes of gut frailty aren’t yet known, but they are thought to include reduced secretion of mucus inside the gut (believed to be a key factor in the early stages of gut frailty’s development) and an imbalance in the community of microbes that live in the gut (termed “dysbiosis”), among other factors. The guts of healthy adults contain more than 1,000 different species of microbes, collectively known as the gut microbiome. Although the majority of the microbes are beneficial to us, breaking down indigestible fibers and producing essential nutrients that we would not otherwise be able to get, some are pathogenic (cause disease). If the numbers of “good” bacteria decrease, it becomes possible for the “bad” bacteria to increase in number and overrun the population of good microbes.
Research has shown that the gut microbiome and the immune system are intimately linked. The gut microbiome communicates with the immune system and, if it is healthy, effectively helps it to increase the number of immune cells that dial down the immune system responses that cause inflammation. It is also becoming apparent that the gut microbiome and the nutrients that it produces influence aging. One study reported that people with a low level of gut microbiome diversity had a lower rate of survival after 4 years than people with a higher level of diversity.
How Can Gut Frailty Be Prevented?
Research investigating how gut frailty can be prevented is ongoing. Potential approaches include dietary changes, medications, next-generation prebiotics (plant fibers that help “good” bacteria to thrive in your gut) and probiotics (live bacteria and yeasts, promoted as having health benefits, that are usually taken as supplements or added to yoghurts), and fecal microbiota transplantation (FMT, also known as poo or stool transplantation). FMT works by transferring the microbiome from a healthy donor to the intestines of a recipient, usually in capsule or liquid form, and has been shown to have positive effects lasting several years in patients with irritable bowel syndrome.
Although the concept of gut frailty is not yet widely recognized, better understanding of how gut frailty affects our health will open up the possibility of developing new preventive and therapeutic interventions that focus on the gastrointestinal system, with the aim of helping us to lead healthier lives well into old age.
Note: The author of this paper makes a declaration about grants, research support, consulting fees, lecture fees, etc. received from pharmaceutical companies. It is normal for authors to declare this in case it might be perceived as a conflict of interest. For more detail, see the Conflict of Interest Statement at the end of the paper.
What Is the Main Idea?
Trigeminal neuralgia (also called “tic douloureux”) causes intense facial pain, usually on one side of the face. In the open-access clinical study “Open and Percutaneous Trigeminal Nucleotractotomy: A Case Series and Literature Review”, published in the journal Stereotactic and Functional Neurosurgery, the authors assess a surgical technique called nucleotractotomy that can reduce the pain experienced by people with this debilitating condition.
What Else Can You Learn?
The roles of the trigeminal nerve are discussed. Trigeminal neuropathy and two different ways that surgery can be done to treat it are also described.
Take-Home Message
Nucleotractotomy can be highly effective in treating intense facial pain that cannot be treated with medication.
What Is the Trigeminal Nerve?
The trigeminal nerve is a large, three-part nerve in the head that is responsible for us being able to feel sensations like touch and pain in our faces. We each have two trigeminal nerves, one on each side of the head. Each one starts in the brain and then splits into three different branches that extend out across the face like the branches of a tree, and that have different roles:
- One branch travels to the lower part of your face and is involved in the lower jaw’s functions (biting, chewing, and swallowing). It’s also involved in feeling sensations with your lower lip and gums.
- A second branch is involved in the upper lip and gums feeling sensation, and also the cheeks, nose, and lower eyelids.
- The final branch covers the scalp and the upper part of the face, including the eyes, upper eyelids, and forehead.
What Is Trigeminal Neuralgia?
Like the skin, nerves can sometimes become damaged or bruised. Although the trigeminal nerve can recover over time if it becomes damaged, some people experience numbness or facial pain in the area that the damaged branch of the trigeminal nerve serves (this is known as “trigeminal neuropathy”).
Trigeminal neuralgia (also called “tic douloureux”) is a type of trigeminal neuropathy that causes intense pain, usually on one side of the face, that some people describe as being like severe stabbing, burning, or electric shock-like pain. People with trigeminal neuralgia often have attacks of pain that get worse over time, with shorter pain-free periods. It can be caused by compression of or pressure on the trigeminal nerve (for example as the result of the growth of a tumor or cyst), a facial injury, and disorders that affect the myelin sheaths of nerve cells (these act to insulate the signal-sending parts of the nerve cells, a bit like the covering of an electrical wire) like multiple sclerosis. Essentially, the trigeminal nerve keeps sending signals of intense pain to the brain, whether or not anything is actually happening to that part of the face.
How Is Trigeminal Neuropathy Treated?
Trigeminal neuropathy can be treated in different ways according to the individual needs of patients. A number of medications can be effective, but some patients find that their pain is not significantly reduced by medication and surgery is offered instead. Nucleotractotomy is a type of surgery that involves the selective cutting or damaging of a region of the trigeminal nerve called the nucleus caudalis. This is the area where the different signals from the branches extending out to the rest of the face are brought together. The technique works by stopping the intense pain signals from reaching the brain, but importantly does not stop the person from being able to sense that the affected region of the face is being touched.
How Is Nucleotractotomy Done?
There are two main ways that nucleotractotomy can be done:
- The first is an “open” technique under general anesthesia. A portion of the patient’s skull is removed and an electrode is inserted and used to “thermolesion” (damage part of the trigeminal nerve using heat) the part of the nerve that is causing the intense pain signals, so that they are no longer received.
- The second technique is done under local anesthetic while the patient is awake and involves an electrode being inserted through the skin (“percutaneous”) and guided by computed tomography scanning to the location that will be damaged.
What Did This Study Investigate?
The aim of this study was to review how effective nucleotractotomy is at stopping patients from experiencing severe facial pain in the long term. The authors of the study assessed the amount of pain that 13 patients (7 who underwent the open procedure and 6 who underwent the percutaneous one) experienced before and after surgery using a pain intensity score questionnaire (which rates pain from 0 to 10). They found that before surgery, patients’ pain was rated on average as 9.3. Not long after surgery, this had decreased to an average of 1.57 for patients who underwent open nucleotractotomy and 2.66 for patients who underwent the percutaneous technique.
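To put those scores in context, the drop can be expressed as a percentage reduction on the 0 to 10 rating scale. This is simple arithmetic on the averages reported above, not an additional analysis from the paper:

```python
# Average pain intensity scores (0-10 scale) reported in the study
before_surgery = 9.3
after_surgery = {"open": 1.57, "percutaneous": 2.66}

for technique, score in after_surgery.items():
    reduction = (before_surgery - score) / before_surgery
    print(f"{technique}: pain reduced by about {reduction:.0%}")
# -> open: about 83%; percutaneous: about 71%
```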
Although there was some evidence that pain was more likely to return with the percutaneous method, the smaller area of tissue affected by this technique seems to be linked to a lower chance of patients experiencing severe complications after surgery. The patients were followed up for an average of 40 months (range 1 to 71 months), and at the end of this period the pain scores across the two groups were on average 2.6. Although severe facial pain developed again in 3 patients after percutaneous surgery and in 1 patient after open surgery, the authors of the study judged the techniques to be safe overall and to be equal in terms of how well they relieve severe facial pain in the long term.
What Is the Main Idea?
Immunohematology is the study of antigens on red blood cells and antibodies that are associated with blood transfusions. In the open-access review article “Long-Read Sequencing in Blood Group Genetics”, published in the journal Transfusion Medicine and Hemotherapy, the authors review the latest developments in DNA sequencing technology and the potential benefits to the field of immunohematology.
What Else Can You Learn?
Different types of DNA sequencing are discussed. The structure of DNA, and the roles of different blood group systems in influencing whether a transfusion of blood from a donor to a recipient will be successful are also described.
Take-Home Message
There are currently few full-length blood group system variant sequences available, and it is hoped that long-read sequencing will change this, making it easier to accurately screen blood donors and therefore reducing the risk of a patient receiving an incompatible blood transfusion.
What Is Long-Read Sequencing?
Long-read sequencing is a method that is used to determine the sequences of stretches of DNA (deoxyribonucleic acid). The cells in your body contain long strings of double-stranded DNA that are coiled up as chromosomes in a part of the cell called the nucleus, which acts as the cell’s command center. Your genes are short sections of this DNA that carry the genetic information for the growth, development, and function of your body.
In many types of living organism, including humans, the DNA exists as a two-stranded molecule, which can be thought of as being like a “ladder”, that is twisted into a shape called a double helix. Each strand is made up of units called nucleotides, which consist of a sugar molecule (deoxyribose), a phosphate group, and a nitrogenous (nitrogen-containing) base. There are four different nitrogenous bases in DNA – adenine (A), thymine (T), cytosine (C), and guanine (G) – and they bind together in pairs (A with T and G with C) to form the “rungs” of the ladder.
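The pairing rule (A with T, G with C) is simple enough to express in a few lines of code. The sketch below builds the matching “rungs” for one strand; it is a toy illustration of the pairing rule, not a tool from the article:

```python
# Watson-Crick base pairing: A pairs with T, and G pairs with C
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand: str) -> str:
    """Return the sequence that pairs with the given strand, base by base."""
    return "".join(PAIRS[base] for base in strand)

print(complementary_strand("ATGCGT"))  # -> TACGCA
```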
When new cells are made, the sequence of the nucleotides in a gene should be copied exactly. If it is not, a mutation (a change in the sequence of the DNA) results. Although some mutations have no obvious effect or can have a positive effect on the organism, enabling it to adapt better to its environment over time, others can have a significant negative effect. A number of diseases are caused by the mutation of only one nucleotide, and mutations can also lead to the development of cancer.
DNA sequencing enables the sequence of nucleotides in a piece of DNA to be determined. It is used in medicine to diagnose and treat rare diseases and to identify new drug targets, and can be used as a form of genetic testing to identify whether someone is at risk of developing a genetic disease and to provide counselling to affected couples who want to have a child. DNA sequencing has also helped scientists to understand the functions of genes and other parts of the genome (all of the DNA in a living organism).
How Does DNA Sequencing Work?
The first DNA sequencing technique was developed in the 1970s by Fred Sanger and his team. It involves making lots of copies of a target region of DNA in which some of the deoxyribose sugar molecules are replaced by a different version called dideoxyribose, in which the part of the sugar molecule that acts as a “hook” to join the strand to the next nucleotide is missing. As a result, once a dideoxyribose-containing nucleotide has been added to a DNA strand, no more nucleotides can be added and the strand ends.
The dideoxyribose-containing nucleotides are also marked with different colored fluorescent dyes, one for each of the nitrogenous bases. The different copies of the target DNA strand are then “read” in order of size and the dye color on the end of each strand is detected, which enables the sequence of the original piece of target DNA to be worked out.
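To make the “read in order of size” step concrete, here is a toy simulation of the Sanger read-out: a fragment exists for every possible length, each ending in a dye-labelled base, and reading the dye of each fragment from shortest to longest recovers the sequence. This is purely illustrative; the real chemistry is more involved:

```python
# Toy Sanger read-out for a short strand
strand = "GATTACA"

# Chain termination produces fragments of every length, each ending
# where a dye-labelled dideoxy nucleotide was incorporated
fragments = [strand[:i] for i in range(1, len(strand) + 1)]

# Read the fragments in order of size and record the dye (final base) of each
read_sequence = "".join(fragment[-1] for fragment in sorted(fragments, key=len))
print(read_sequence)  # -> GATTACA
```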
Although the Sanger sequencing method can produce accurate sequences of DNA segments up to 900 nucleotides long and is still used, it is expensive and it takes a long time to sequence a large amount of DNA like a human genome. As a result, second-generation DNA sequencing techniques were developed that used “short reads” (essentially, a large number of sequencing reactions are run in parallel, each sequencing a DNA strand that is between 50 and 70 nucleotides long), enabling large quantities of DNA to be sequenced more quickly and cheaply. Third-generation sequencing techniques, which include long-read sequencing, are now also being developed and have technical advantages over short-read sequencing.
How Does Long-Read DNA Sequencing Work?
As its name suggests, long-read sequencing can sequence long reads of DNA in one go without them needing to be broken up into smaller fragments. There are currently two companies that offer long-read sequencing and they use different methods:
- The first involves joining the two ends of the piece of DNA that is going to be sequenced so that it becomes circular, and then using it as a template to make a long chain of DNA. A single circular piece of DNA is placed on a surface with thousands of tiny wells, so that a different reaction can take place in each well. Nucleotides labelled with fluorescent dyes are then used to make new strands, and the circular DNA is copied many times.
- The second method involves a single strand of DNA being passed through a small hole, called a nanopore, in a membrane that is submerged in a salt solution. When an electrical current is established through the pore as well, each nitrogenous base blocks the flow of the current in a different way as the DNA strand passes through the nanopore. The order of these flow disruptions can then be translated into the sequence of the nitrogenous bases on the DNA strand. For both methods, many different copies of a particular sequence are then put together to form a high-accuracy “consensus” sequence.
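The final “consensus” step works like a majority vote at each position across the repeated reads. Here is a minimal sketch of that idea (our own illustration, not either company’s software):

```python
from collections import Counter

# Several imperfect reads of the same stretch of DNA (two contain an error)
reads = ["GATTACA", "GATTACA", "GACTACA", "GATTAGA", "GATTACA"]

# Majority vote at each position produces the high-accuracy consensus
consensus = "".join(Counter(bases).most_common(1)[0][0] for bases in zip(*reads))
print(consensus)  # -> GATTACA
```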
What Is Immunohematology?
Immunohematology is a medical specialty that brings together the fields of hematology (the study of the blood and blood disorders) and immunology (the study of the immune system) in the study of antigens on red blood cells and antibodies that are associated with blood transfusions. The term “antigen” describes anything that causes a response by the immune system, while antibodies are molecules that specifically bind to antigens and identify them to the immune system as needing to be dealt with.
There are many different types of antigens on the surfaces of red blood cells. These antigens are normally ignored by the immune system, but if a person receives blood from someone else in the form of a blood transfusion, their immune system will identify and attack any red blood cells with antigens that are different to the ones on their own red blood cells. As a result, it is essential that the red blood cell antigens of both the blood donor and the recipient are determined before a transfusion is given, which enables their “blood groups” to be identified.
Red blood cell antigens are coded for by genes in our DNA called blood group systems, and more than 40 have been identified in humans. Some blood group systems can have more than one form, called variants. We each inherit one set of our chromosomes from our mother and another from our father, and this can mean that we inherit different variants of a blood group system. Some variants are dominant over others, which means that a child’s blood group may be different from the blood groups of its parents, as the sketch below illustrates.
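The best-known example is the ABO blood group system, where the A and B variants are dominant over O. A small sketch of that logic (simplified to ABO only; the function name is ours):

```python
def abo_blood_group(variant_from_mother: str, variant_from_father: str) -> str:
    """Work out an ABO blood group from the two inherited variants."""
    variants = {variant_from_mother, variant_from_father}
    if variants == {"A", "B"}:
        return "AB"  # A and B are co-dominant: both show up
    if "A" in variants:
        return "A"   # A is dominant over O
    if "B" in variants:
        return "B"   # B is dominant over O
    return "O"       # two O variants

# Parents with blood groups A (A/O) and B (B/O) can each pass on their
# hidden O variant, giving a child whose blood group matches neither parent:
print(abo_blood_group("O", "O"))  # -> O
print(abo_blood_group("A", "B"))  # -> AB
```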
How Can Long-Read Sequencing Aid Immunohematology?
A number of blood group systems are very long and have complicated sequences that are difficult to work out using short-read sequencing. One of the main advantages of long-read sequencing is that it is able to span entire lengths of complicated regions of DNA. This means that it is better able to detect and sequence regions that contain a lot of repetition or for which there are several different variant forms, or variation that affects more than 50 “rungs” on a DNA “ladder”.
Long-read sequencing can also be used to work out which copy of a variant exists on which copy of a chromosome (i.e., whether it’s on the chromosome inherited from the mother or the father). New variant forms of blood group systems are still being identified, and long-read sequencing is expected to help resolve confusion about what different variant forms of these blood group systems can mean for an individual and how the genes involved are regulated. The technology may also help to establish new reference databases for all blood group systems.
What Is the Main Idea?
People with Parkinson’s disease often have movement-related symptoms such as tremors. In the open-access research article “Long-Term Follow-Up of Unilateral Deep Brain Stimulation Targeting the Caudal Zona Incerta in 13 Patients with Parkinsonian Tremor”, published in the journal Stereotactic and Functional Neurosurgery, the authors investigate whether using deep brain stimulation that targets a region of the brain called the posterior subthalamic area (PSA) to treat patients with Parkinson’s disease who have severe tremor still reduces their symptoms at least 3 years after surgery.
What Else Can You Learn?
The symptoms and causes of Parkinson’s disease are discussed. The roles of different areas of the brain in regulating movement and of the neurotransmitter dopamine are also described.
Take-Home Message
The results suggest that treating patients with Parkinson’s disease who have severe tremor with deep brain stimulation that targets the PSA is effective and safe, and significantly improves their tremor symptoms, while slowness of movement is slightly improved. Future studies directly comparing the effects of targeting the ventral intermediate nucleus (VIN) or the PSA in deep brain stimulation will provide researchers with more information that can be used to refine the techniques and improve the quality of life of people with Parkinson’s disease.
What Is Parkinson’s Disease?
Parkinson’s disease is an age-related neurodegenerative disorder that develops when nerve cells in the nervous system or brain stop functioning and eventually die, causing more severe symptoms over time. It is the most common movement-related brain disease and is slightly more common in men than in women. Although it can develop in adults as young as 20 years old, this is extremely rare. It is inherited in around 10% of cases, but in most cases is not linked to gene changes inherited from a parent. The average age at which Parkinson’s disease develops is 60 years and it is estimated to affect more than 1% of people aged over 60 years worldwide.
What Are the Symptoms of Parkinson’s Disease?
Parkinson’s disease is characterized by a range of symptoms that can be broadly divided into two groups: those that are movement-related and those that are not. The two best known movement-related symptoms are bradykinesia and tremor:
- The term bradykinesia means that movement is slow and that a person’s continuous movements may be hesitant or halt midway. This is caused by problems with muscle control rather than a loss of strength.
- Tremor is a rhythmic shaking of the muscles, even when a person is resting and not using them.
Other movement-related symptoms include a hunched or stooped posture, rigidity or stiffness of the joints, changes in the way a person walks (often resulting in them taking shorter, shuffling steps and needing to take more steps when turning), difficulty swallowing, and blinking less often than usual.
Symptoms that are not related to movement and muscle control include a loss of sense of smell, problems with focusing and with sleep, depression, problems relating to the stomach and intestines (gastrointestinal problems), urinary incontinence, and low blood pressure when standing up. Some of these symptoms are now thought to be warning signs that Parkinson’s disease is developing, which begin years before movement-related symptoms start to be noticeable.
What Causes Parkinson’s Disease?
The brain is made up of several regions with different roles. The outside surface of the brain is made up of a thin layer of cells called the cerebral cortex. This area of the brain is responsible for language and social skills, memory, reasoning, and decision-making. Below it is the sub-cortex, which contains four regions that are important for emotion, thinking, and movement.
One of these is the brainstem, a stalk-like structure that connects the brain to the spinal cord. It is in an area of the brainstem that the neurotransmitter dopamine is produced (neurotransmitters carry chemical signals between neurons, a type of cell that transmits messages from one part of the brain and nervous system to another, and trigger an action or change in the target cell).
Dopamine has a wide range of roles including:
- motivation and pleasurable reward,
- attention,
- behavior,
- cognition (an umbrella term that describes a combination of processes that take place in the brain, such as the ability to learn, remember, and make judgements based on experience, thinking, and information from the senses), and
- movement.
The basal ganglia is another important region in the sub-cortex.
How Is the Basal Ganglia Linked to Parkinson’s Disease?
The basal ganglia is a group of structures near the center of the brain that is about 10 cubic centimeters in size. The structures within it are responsible for important connections between different areas of the brain that enable them to work together and send signals back and forth, a bit like a circuit board in an electronic device. The basal ganglia plays a key role in our ability to move by managing the signals that the brain sends to help you move your muscles.
The structures within it can filter out signals that are unnecessary or wrong, and approve or reject movement signals so that you can control particular muscles without using other ones in the same area of your body. They also process sensory information, which helps you to further refine your movements, and are involved in emotions, motivation, and habits.
Parkinson’s disease develops when the basal ganglia begins to deteriorate and causes a major change in the chemistry inside the brain that results in there not being enough dopamine. Because the basal ganglia’s fine-tuning of your movements involves cells that require dopamine to function properly, this reduction in the amount of dopamine results in people having the slowed movement and tremors that are characteristic of the disease.
How Is Parkinson’s Disease Treated?
Although there is currently no cure for Parkinson’s disease, the symptoms of the condition can be treated with medication and/or surgery. Although many people with Parkinson’s disease can have their symptoms reduced by taking medication, which often works by increasing the levels of dopamine in the brain, it can become less effective and side effects (these are positive or negative unintended effects of a medication) can become more severe as the condition progresses.
Tremors can be particularly difficult to treat with medications that affect dopamine levels, and some researchers have reported evidence that as well as dopamine, levels of other neurotransmitters are involved in causing tremors in Parkinson’s disease.
Patients who do not experience significant improvements in their symptoms as a result of taking medication may be offered a type of surgical therapy called deep brain stimulation. This approach involves the reversible implantation of a device that works in a similar way to how a pacemaker regulates the heart. A device called a pulse generator, which is connected to one or two fine wires, is implanted under the skin in the stomach or chest area. The wires are inserted into specific areas of the brain and deliver a mild electrical current that changes some of the signals in the brain that cause the movement-related symptoms of Parkinson’s disease.
What Did This Study Investigate?
The thalamus, which is located close to the basal ganglia, acts as the main relay station of signals that come into the brain and passes them on to other areas for interpretation and response. An area of the thalamus called the ventral intermediate nucleus (VIN) is usually the location of choice for the implantation of wires when a patient with a tremor disorder like Parkinson’s disease is treated with deep brain stimulation.
However, over the last decade, targeting an area below the thalamus called the posterior subthalamic area (PSA) has been shown to be at least as effective as targeting the VIN in reducing tremors in patients. It has also been reported to reduce other Parkinson’s disease symptoms, such as rigidity and the loss of the ability to move muscles voluntarily, which is not seen when the VIN is targeted. To assess whether this approach is effective in reducing symptoms of tremor in the long term, the authors of this research study investigated whether deep brain stimulation targeting the PSA remained effective in reducing tremors more than 3 years after patients with Parkinson’s disease underwent surgery.
Thirteen patients were included in the study, and 12–24 months after surgery their tremor symptoms had improved by an average of 88% and slowness of movement by an average of 40%. When they were assessed on average 62 months after surgery, the improvement in tremors was still seen and slowness of movement symptoms remained an average of 20% better than when they were assessed before surgery.
What Is the Main Idea?
Dryness of the mouth and eyes are known side effects of some anticancer treatments. In the open-access article “Sjögren’s Syndrome Caused by PD-1 Inhibition in a Lung Cancer Patient”, published in the journal Case Reports in Oncology, the authors describe the case of a 71-year-old woman who was receiving immunotherapy treatment for a type of lung cancer, went on to develop these symptoms, and was discovered to have an autoimmune condition called Sjögren’s syndrome.
What Else Can You Learn?
The symptoms of Sjögren’s syndrome, and how it differs from general symptoms of dryness of the mouth and eyes, are discussed. Differences between chemotherapy and immunotherapy anticancer treatments are also described.
Take-Home Message
This case report demonstrates the importance of patients who are receiving immunotherapy treatment and who have symptoms that are difficult to interpret being evaluated by multidisciplinary healthcare teams. Involving specialists from different medical disciplines means that less common adverse events can be identified and effective treatment started quickly, which can significantly improve the quality of life of patients.
What Is Sjögren’s Syndrome?
Sjögren’s syndrome is an autoimmune disease that is more common in women than in men, and usually develops between the ages of 45 and 55 years. The cells in your body have molecules on their surfaces that the immune system usually recognizes as “self-antigens” (in other words, it recognizes them as “not foreign” and therefore not potentially dangerous).
However, sometimes the body’s immune system starts to recognize self-antigens as foreign ones and begins to attack them. When this happens, the inflammation caused by the “autoimmune” response can result in the destruction of normal, healthy body tissue, or changes in the function or growth of an organ.
What Causes Sjögren’s Syndrome?
The exact causes of Sjögren’s syndrome are not yet understood, but it mainly affects “exocrine” glands, particularly the ones that produce tears (the lacrimal glands) and saliva (the salivary glands). Exocrine glands are organs in the body that produce and release substances through ducts (openings), and include the glands that release milk, digestive juices, tears, and sweat.
Although Sjögren’s syndrome is mainly associated with having dry eyes and a dry mouth, it is a systemic disease (a condition that affects the whole body rather than a single body part or organ) because the long-term (chronic) inflammation that causes it often occurs in other organ systems as well. Patients can experience tiredness, skin rashes (particularly after they have been out in the sun) and dry skin, pain in the muscles or joints, vaginal dryness, and swollen salivary glands.
For some people, Sjögren’s syndrome develops in isolation and is referred to as “primary” Sjögren’s syndrome. For others, its development can be associated with another related autoimmune condition, such as rheumatoid arthritis, and is referred to as “secondary” Sjögren’s syndrome. One reason that Sjögren’s syndrome is difficult to diagnose is that symptoms of constant dryness in areas of the body are not uncommon, particularly as people age, and can vary widely between one person and another.
To be able to diagnose Sjögren’s syndrome, healthcare practitioners look for evidence that an autoimmune response is causing the patient’s symptoms, often by measuring the levels of particular antibodies in blood samples. If there is no evidence that the dryness symptoms are being directly caused by an autoimmune response, the symptoms are classified as “sicca” (which literally means “dry”) syndrome. It is worth noting that patients with other autoimmune conditions, such as rheumatoid arthritis and lupus erythematosus, can also experience dryness of the eyes and mouth.
How Is Dryness of the Eyes and Mouth Linked to Cancer Treatment?
Severe mouth and eye dryness are known side effects of some anticancer treatments, with approximately 9.4% of cancer patients who receive chemotherapy treatment developing them. Chemotherapy targets and kills rapidly dividing cells like cancer cells, but can also affect other cells in the body that divide rapidly, causing side effects. Patients with cancer who are treated with immunotherapy experience fewer adverse events (these are unintended and undesirable effects that develop after exposure to a medicine, although they may not have been caused by it) compared with patients treated with platinum-based chemotherapy.
Immunotherapy is a type of treatment that uses the body’s own immune system to tackle a cancer. For example, some immunotherapy medicines target and block a protein called PD-1. PD-1 is found on the surfaces of some immune cells and plays a role in preventing autoimmune responses from developing. Blocking PD-1 triggers these immune cells to find and kill cancer cells.
Although some adverse events that can occur during or after immunotherapy treatment are well known and are routinely looked for by patients’ healthcare teams, others are extremely rare and can have symptoms that are difficult to diagnose and interpret. This can lead to delays both in diagnosis and in patients receiving treatment for these symptoms.
Approximately 5.3% of cancer patients treated with immunotherapy develop symptoms of dryness of the mouth and eyes because the immune system starts to attack normal, healthy cells as well as the cancerous ones. Where this occurs, the signs and symptoms are different to those of primary Sjögren’s syndrome. Around 50% of all cases of mouth and eye dryness that are linked to immunotherapy occur in men, as opposed to only around 5% for primary Sjögren’s syndrome, and the average age of diagnosis is around 10 years older.
What Does This Case Report Describe?
In this case report, a type of study that looks in depth at the case of a single individual or a specific group of patients, the authors describe the case of a 71-year-old woman who was eventually diagnosed with Sjögren’s syndrome after receiving immunotherapy treatment for non-small cell lung cancer. Case reports are useful because they enable healthcare practitioners to communicate information about rare or previously unreported conditions, complications, or treatments to the rest of the medical community. The authors report that this woman’s case is unusual because she developed Sjögren’s syndrome only 18 months after her immunotherapy treatment began, and because she developed a broad range of signs and symptoms of the condition.
After 18 months of receiving immunotherapy treatment, the woman’s non-small cell lung cancer was in partial remission (this means that the cancer had reduced in size or stopped growing) and the immunotherapy was stopped. She had been experiencing some mild side effects from the immunotherapy and these had been treated with low-dose steroid treatment. However, once the steroid treatment ended, she quickly started to experience an extremely dry mouth that made swallowing difficult, resulting in her rapidly losing weight. She also developed severe dry eye syndrome (known as “xerophthalmia”) and a type of skin inflammation called “erythema nodosum”, which results in painful reddish lumps developing under the skin, usually on the shins.
In the case of this patient, evaluation by a dermatologist (a doctor that specializes in conditions that affect the nails, hair, and skin) and a rheumatologist (a doctor that specializes in chronic inflammatory conditions like Sjögren’s syndrome, rheumatoid arthritis, and lupus erythematosus) meant that her symptoms were correctly diagnosed as Sjögren’s syndrome. She was able to receive the treatment that she needed (corticosteroid treatment) and her symptoms rapidly improved.
What Is the Main Idea?
Feminizing adrenocortical tumors are an extremely rare type of cancer that develops in the adrenal glands. In the open-access article “Feminizing Adrenocortical Tumor with Multiple Recurrences: A Case Report”, published in the journal Case Reports in Oncology, the authors describe the case of a 35-year-old man diagnosed with this type of cancer and how his treatment has been managed.
What Else Can You Learn?
Feminizing adrenocortical tumors and their symptoms are discussed, alongside the roles of case reports in raising awareness of rare conditions. The roles of the endocrine system and its key components, particularly the adrenal glands, are also described.
What Are Glands?
Glands are organs in the body that produce substances and release them either through ducts (openings) or directly into the bloodstream. Glands that release substances through ducts are called “exocrine” glands, and this group includes the glands that release milk, digestive juices, tears, and sweat.
“Endocrine” glands release hormones, molecules that act as chemical messengers, into the bloodstream. Together, hormones and endocrine glands make up the endocrine system, a messenger system that targets and regulates organs all over the body and controls almost all of the processes that take place within it.
What Does the Endocrine System Do?
To be able to function properly, the various parts of the body need to be able to communicate with each other to make sure that the internal environment is kept constant, and that any changes in the internal or external environment get an appropriate response. Two systems enable this communication:
- The nervous system is made up of the nerves, spinal cord, and brain, and enables messages to travel from one part of the body to another within fractions of seconds.
- In contrast, the endocrine system is better suited to responding to situations where a longer-lasting and more widespread response is needed, because it involves hormones being made and travelling around the body in the bloodstream.
Although the two systems complement and interact with each other, the endocrine system is responsible for regulating development, growth, metabolism (the process by which the food and drink that we consume is changed into energy), and our ability to reproduce, as well as the components that make up bodily fluids like saliva and blood, our emotions and moods, and even our sleep.
Which Parts of the Body Are Involved in the Endocrine System?
Although hormones are made in many parts of the body, there are several key components of the endocrine system. These include the pituitary and pineal glands and the hypothalamus in the brain, the thymus in the upper part of the chest, the thyroid and parathyroid glands in the neck, the pancreas (which is behind the stomach and is also part of the digestive system), the gonads (the “sex glands”: ovaries in women and testes in men), and the adrenal glands, which are located on top of the kidneys.
The production of hormones and their release must be tightly controlled to ensure that the body’s functions are regulated properly. To achieve this, many functions are controlled by several hormones that regulate each other via positive and negative feedback loops. For example, the effect of one hormone on an organ may cause that organ to release a second hormone that feeds back to the gland that sent the first hormone. This can prevent the message being sent by the first hormone from being “on” continuously.
What Do the Adrenal Glands Do?
The adrenal glands are triangular and there is one on top of each kidney. They are made up of two parts that have different functions and make different sets of hormones:
- The inner part of the adrenal gland is called the “adrenal medulla”, and it is here that hormones called “catecholamines” are made. The best-known catecholamine is adrenaline (also known as epinephrine or the “fight or flight” hormone), which increases the body’s heart rate and blood pressure when it is under stress.
- The outer part of the adrenal gland is called the adrenal cortex, and it is here that hormones called “corticosteroids” are made. These hormones are involved in regulating metabolism, the body’s response to stress, the immune system, and sexual development and function.
What Does This Case Report Describe?
A case report is a type of study that looks in depth at the case of a single individual or a specific group of patients. Case reports are particularly useful when healthcare practitioners want to communicate information about rare or previously unreported conditions, complications, or treatments to the rest of the medical community. In this study, the authors describe the case of a 35-year-old man who had a type of adrenal gland cancer called a “feminizing adrenocortical tumor”.
Primary tumors (tumors that have not spread from elsewhere in the body) that start in the tissues that cover your organs and glands can be classed as adenomas (these are “benign”, meaning that they are not able to invade surrounding tissue or spread to other areas of the body) or carcinomas (these are “malignant”, which means that they can invade and spread). Primary carcinomas of the adrenal glands are rare and, although it is unusual, sometimes a tumor in an adrenal gland can start to produce and release corticosteroids abnormally.
In the case of feminizing adrenocortical tumors, only estrogens are secreted. Estrogens are a type of sex hormone, so called because they are critical in regulating the biological differences between males and females, and are particularly involved in reproduction and puberty. In humans, the key sex hormones are estrogens, progesterone, and testosterone. The high levels of estrogens produced by feminizing adrenocortical tumors have a feedback effect on the levels of testosterone, meaning that testosterone production is usually suppressed in patients with this type of tumor.
As a result, common symptoms are hypogonadism (where the gonads produce low levels of or no hormones) and overdevelopment or enlargement of the breast tissue in men and boys. Patients with this type of tumor can also experience discomfort or pain in one side of the body between the back and the upper abdomen (belly area). Feminizing adrenocortical tumors most commonly occur in men but can also develop in women and children. In women, additional symptoms include irregular or postmenopausal bleeding.
One of the difficulties in treating feminizing adrenocortical tumors is that they are extremely rare, accounting for less than 2% of all adrenal gland tumors. In fact, only 50 cases were reported in the medical literature between 1970 and 2015. As a result, case reports have an important role to play in increasing awareness of this type of tumor and improving its diagnosis and treatment.
Take-Home Message
Feminizing adrenocortical tumors are often aggressive (meaning that they develop and/or spread quickly), are almost always malignant, and the chance that they will recur is high. Case reports like this study help to raise awareness of the need to recognize and treat this type of cancer aggressively, and monitor patients closely for signs of recurrence.
What Is the Main Idea?
Primary immune thrombocytopenia is a type of autoimmune disorder. In the research article “The Role of Follicular Regulatory T Cells/Follicular Helper T Cells in Primary Immune Thrombocytopenia”, published in the journal Acta Haematologica, the authors discuss how two types of immune system cells are linked to primary immune thrombocytopenia and may have potential as future therapeutic targets.
What Else Can You Learn?
The symptoms and our understanding of the causes of primary immune thrombocytopenia are described. The roles of the innate and adaptive branches of the immune system, and of B and T cells, are also discussed.
Take-Home Message
Although further research is needed, these results suggest that targeted immunotherapies (treatments that work by activating or suppressing the immune system) may be worth investigation as potential treatments for patients with primary immune thrombocytopenia in the future. For example, it may become possible to directly target components of the immune system to prevent the recognition of platelets as foreign and reduce their breakdown so that normal levels of platelets in people with primary immune thrombocytopenia can be maintained.
What Is an Autoimmune Disorder?
Primary immune thrombocytopenia is an autoimmune disorder, which means that it develops when the body’s immune system starts to attack cells in the body that are not harmful by mistake. The immune system protects the body from things that are potentially harmful by recognizing “antigens”, which is a term used to describe anything that causes an immune response and can include chemicals or molecules on the surfaces of bacteria and viruses.
The cells in your body also have molecules on their surfaces, but the immune system usually recognizes them as “self-antigens”. In other words, the immune system knows that they are not “foreign” and should not be attacked. However, sometimes the body’s immune system starts to recognize self-antigens as foreign ones and begins to attack them. This is an “autoimmune” response and can result in the destruction of normal, healthy body tissue, or changes in the function or growth of an organ. Autoimmune disorders include type 1 diabetes, rheumatoid arthritis, and multiple sclerosis.
What Are the Symptoms of Primary Immune Thrombocytopenia?
The main symptom of primary immune thrombocytopenia is a low number of platelets in the blood. Blood is made up of a liquid called plasma and three main types of blood cells:
- Red blood cells carry oxygen around the body.
- White blood cells fight infection. There are several different types, including lymphocytes called B cells and T cells (lymphocytes are the main type of white blood cell found in the lymph fluid that circulates around the body).
- Platelets are the third type of blood cell and are involved in the process that enables blood to clot to promote healing and control blood loss.
In people with primary immune thrombocytopenia the immune system starts to mistakenly attack platelets in the blood, stopping them from working or breaking them down. This results in reduced levels of platelets in the blood, which can result in excessive bleeding because the blood is less able to clot. Although the symptoms of primary immune thrombocytopenia vary between patients, common symptoms include spontaneous bruising or bruising easily, bleeding from the gums, blood blisters on the insides of the cheeks, frequent heavy nose bleeds that are hard to stop, and fatigue.
How Do Immune Cells Contribute to Primary Immune Thrombocytopenia?
There is some evidence that abnormalities in the function and number of some types of immune cell are associated with primary immune thrombocytopenia. The immune system can be thought of as having two branches:
- The first, the “innate immune system”, includes the inflammatory response and does not have the ability to “remember” antigens that it has encountered.
- The second, the “adaptive immune system”, does have memory, which means that if the immune system has encountered an antigen once before it will be able to mount a stronger response if it encounters it again, a property that is exploited by vaccines.
B and T cells are the main mediators of the adaptive immune system. B cells have immunoglobulin molecules on their surfaces that act as receptors that recognize antigens; these immunoglobulins can also be secreted as antibodies. T cells have specific antigen receptors called T-cell receptors on their surfaces that other types of lymphocyte do not have.
What Do T Cells Do?
There are three main types of T cell, each with different functions:
- Cytotoxic T cells can directly kill virus-infected cells and cancer cells (cytotoxic means toxic to cells or “cell killing”).
- Helper T cells help to activate other cells in the immune system, like B cells and cytotoxic T cells, and play a role in regulating the responses of the immune system.
- Regulatory T cells play important roles in the immune system’s ability to recognize self-antigens, and a subtype called T follicular regulatory (Tfr) cells can also suppress the functions of B cells and influence the breakdown of helper T cells.
What Did This Study Show?
The exact mechanisms by which changes in the immune system lead to the development of primary immune thrombocytopenia are currently unknown. If these changes can be better understood it may be possible to target parts of the immune system that become dysregulated during the development of primary immune thrombocytopenia as a way of treating the condition.
The authors of this study therefore compared blood samples from people with primary immune thrombocytopenia with blood samples donated by people without the condition, and looked for differences in gene expression and numbers of different types of T cells. (Gene expression is the process by which the information encoded by a gene is translated into a function, usually through the production of a protein, by being switched on or off or by the activity of the gene being increased or reduced.)
The results of the study showed that a number of genes, mainly involved in immune responses and the process of inflammation, were expressed differently in patients with primary immune thrombocytopenia compared with people without the condition. In particular, there were differences in the proportions of Tfr and T follicular helper (Tfh) cells, and the proportions of these cell types also changed when patients with primary immune thrombocytopenia responded to treatment. The levels of Tfh cells were increased in patients with primary immune thrombocytopenia but decreased after they responded to treatment, while the ratio of Tfh cells to Tfr cells increased after treatment responses.
In addition, two genes were much more highly expressed in people with primary immune thrombocytopenia than in people without the condition. BCL-6 encodes a protein that regulates the proliferation of Tfh cells, while IL-21 encodes a protein that is able to increase the differentiation of Tfh cells and also suppress the differentiation of Tfr cells.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is the Main Idea?
Flukes are parasitic flatworms that can infect mammals, including humans. In the review article “Biliary Parasitic Diseases Associated with Hepatobiliary Carcinoma”, published in the journal Visceral Medicine, the authors discuss the links between infections with some types of fluke and the development of cancer in the liver or bile ducts.
What Else Can You Learn?
Three parasitic flukes that cause disease in humans are described. The roles of the liver and bile ducts in the digestive system are also discussed.
What Are Liver Flukes?
The name “fluke” describes a group of species of parasitic flatworms that are able to infect mammals including humans. Parasites are organisms that live on or in another organism (known as the “host”). They depend on their hosts for their survival, getting their food either from their hosts or at the hosts’ expense, and they are specially adapted to live in this way.
Because parasites need their hosts to be able to survive, many species do not kill the host directly but often carry diseases that can be life-threatening to the host. Once they have infected a person, blood flukes tend to reside in the blood vessels. Liver flukes are small enough to travel around the body in the blood circulation, and they often end up in the liver, gallbladder, and bile ducts, where they can cause disease.
What Do the Liver and Bile Ducts Do?
The liver has a number of roles in the body, including cleaning the blood to remove harmful substances and metabolizing proteins, fats, and carbohydrates so that the body can use them. The liver also makes a fluid called bile, which helps the body to break down fats from food; bile can be stored in the gallbladder or can travel directly from the liver to the small intestine.
Most of the digestion of the food we consume takes place in the small intestine and it is here that nutrients and minerals from our food are absorbed into the blood. The bile ducts are part of the digestive system and are small tubes that connect the liver to the gallbladder and small intestine.
Where Are Flukes Found?
Different types of fluke have different life cycles (the different stages that organisms go through during their lives) and are found in different areas of the world. Some types of liver fluke are endemic (this means regularly occurring or constantly present) in areas of southern and southeastern Asia, but other types can be found on all continents except Antarctica. Although health authorities in areas where flukes are endemic have made major efforts to prevent and control their presence, with some successes, flukes are still a problem and many people become infected.
How Are Fluke Infections Linked to Cancer?
Although the symptoms caused by fluke infections are often mild, flukes can survive in the human body for several decades if the infection is not treated. This can lead to chronic (long-term) inflammation, the process by which your body responds to an injury or a perceived threat.
Liver fluke infection can also lead to an increase in the number of cells lining the ducts and the passageways connecting the liver, gallbladder, and small intestine (known as “epithelial hyperplasia”), and thickening or scarring of the ducts (known as “periductal fibrosis”). These symptoms can cause further complications over time including the formation of stones and the development of hepatobiliary cancers (the prefix “hepato” refers to the liver and “biliary” refers to the gallbladder and bile ducts).
Which Flukes Are Linked to Cancer?
There are a number of flukes that have been linked to the development of hepatobiliary cancer. Three of the most common are Schistosoma japonicum, Clonorchis sinensis, and Opisthorchis viverrini.
Schistosoma japonicum
S. japonicum is a blood fluke that is responsible for a disease called schistosomiasis, which is estimated to affect 200 million people worldwide. People become infected with this type of fluke through contact with fresh water in which the parasite is present, either through work and agriculture, or activities of daily living. During its life cycle, S. japonicum can cause blockages in small blood vessels in the liver and cause cirrhosis (scarring of the liver tissue that causes long-term damage and prevents the liver from working properly).
It is suspected that S. japonicum infection is directly linked to the development of a type of liver cancer called hepatocellular carcinoma, because rates of this type of cancer are much higher in areas where S. japonicum is endemic. The International Agency for Research on Cancer (IARC) has classified this fluke as a group 2B carcinogen (a substance or organism that can cause cancer), which means that the IARC has brought together a panel of experts on the subject that has evaluated all of the available published evidence and agreed that it is “possibly” able to cause cancer in humans.
Clonorchis sinensis
C. sinensis is endemic in China and Korea, and is estimated to affect 35 million people worldwide. People can become infected with this liver fluke by eating raw or undercooked infected fish, crab, or crayfish. Similarly to S. japonicum, C. sinensis has been linked to the development of cholangiocarcinoma, the name given to a group of cancers that form in the bile ducts, because the incidence of cholangiocarcinoma is much higher in areas where C. sinensis is endemic.
In one study, C. sinensis infection was shown to increase the risk of developing cholangiocarcinoma 14-fold compared with individuals with no history of C. sinensis infection. The IARC has classified this liver fluke as a group 2A carcinogen, which means that it is “probably” able to cause cancer in humans.
Opisthorchis viverrini
O. viverrini is known by some as the “South East liver fluke”, is endemic in northern Thailand, and is estimated to affect 10 million people worldwide. Similarly to C. sinensis, people can become infected with this liver fluke by eating raw or undercooked infected fish, crab, or crayfish. The evidence that O. viverrini infection is linked to the development of cholangiocarcinoma is so strong that the IARC has classified it as a group 1 carcinogen, which means that the IARC views O. viverrini as “definitely” being carcinogenic.
Take-Home Message
It is clear that fluke infections can have serious long-term implications that go beyond the initial effects of the parasite on the host. It is therefore essential that fluke infections are recognized and treated as soon as possible after infection to reduce the risk of hepatobiliary cancer developing in the future.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is the Main Idea?
Psychoneuroimmunology is a field of study that brings together researchers that traditionally work in separate fields. In the open-access editorial article “2nd European Psychoneuroimmunology Network Autumn School: The Skin–Brain Axis and the Breaking of Barriers”, published in the journal Neuroimmunomodulation, the authors summarize how studying the skin is enabling us to better understand the relationships between the brain, hormones, and the immune system that contribute to good health when we are well and that fail when we become ill.
What Else Can You Learn?
The different research fields that are brought together under the term psychoneuroimmunology are described. The skin–brain axis and how studying the skin is enabling researchers to gain new knowledge about health and disease are also discussed.
What Is Psychoneuroimmunology?
Psychoneuroimmunology is a multidisciplinary area of research that brings together people working in different fields so that they can pool knowledge and explore research questions across traditional subject boundaries. It incorporates fields that have traditionally been separate, such as:
- neuroscience (which considers the function and disorders of the nervous system, including the brain),
- physiology (which considers how living organisms or parts of the body function when they are working normally),
- immunology (which considers the function and disorders of the immune system),
- genetics (which studies genes, how traits are inherited, and genetic variation), and
- psychosocial disciplines (which consider how psychological factors and the surrounding social environment influence how people function and behave).
How Does Psychoneuroimmunology Link Different Fields of Research?
Psychoneuroimmunology brings people like clinicians and healthcare practitioners, epidemiologists (researchers who study the causes, effects, and patterns of disease in groups of people), basic scientists (researchers who seek to improve our understanding of the world and how things work), and statisticians (researchers who compile and use statistical data to solve problems) together, to study how processes that influence our minds and thoughts interact with the body’s immune and nervous systems.
This can involve looking at how the nervous and immune systems function and their effects on people’s behavior when they are well or unwell, such as when disorders like autoimmune diseases (which develop when the body’s immune system mistakenly starts to recognize the body’s own tissue as foreign and attacks it) or immune deficiencies (which occur when the immune system becomes weakened, potentially enabling problems like infections to occur more easily) develop. There is now good evidence that psychosocial stresses and interventions can affect our immune systems in ways that lead to changes in our health.
Although further research is needed, it seems that stressful events can trigger physical and cognitive (thought-based) responses that induce changes in the body that weaken or damage the immune system, by altering the way the endocrine system (a network of organs and glands that uses hormones to control processes in the body) and the sympathetic nervous system (the part of the nervous system that is responsible for the “fight-or-flight” response to things that we perceive to be threatening or harmful) work.
For example, environmental and psychosocial factors can influence the development of cancer and autoimmune diseases, and affect the speed at which we heal. It has also been shown that regular physical activity has an immunoregulatory effect, as well as improving symptoms of depression and low mood and having positive effects on heart and muscle fitness.
What Is the Skin–Brain Axis?
The skin–brain axis is the term given to the connections between the brain and the skin, and the ways that they communicate with each other. The skin is the body’s largest organ and has a number of functions. It covers the body’s entire surface, acts as a barrier to things like UV light, chemicals, and microbes (such as bacteria) that have the potential to make us ill, and helps to regulate our body temperature.
The skin also plays a key role in sensing changes in our environment, and environmental changes that it senses are translated into chemical and biological messengers that travel via hormones, or the immune or nervous system, to reach the brain and other organs. For example, if you put your hand on something that is very hot, the skin on your hand sends signals via nerves to the brain that are translated as pain so that you move your hand away quickly, and the immune system is activated to repair any damage to your hand. Communication can also run from the brain to the skin.
Many skin conditions, including eczema, psoriasis, and acne, are linked with chronic stress. Stress has also been shown to reduce the skin’s ability to act as a barrier, increasing the chance of infections. Sleep is also important for the immune system to work effectively, and meditation has been shown to have positive effects on mental health and immune system function in people with long COVID.
How Does Studying the Skin Help Research in Psychoneuroimmunology?
Studying the skin has several advantages for psychoneuroimmunology researchers. It is relatively easy for research participants to give skin samples and for those samples to be cultured in a laboratory. Skin swabs can be used to quickly and cheaply sample the populations of microbes (such as bacteria and viruses) that live on the skin’s surface when people have inflammatory skin diseases, and to investigate how these populations differ under non-stressed and stressed conditions.
Skin-related experimental models (these use systems, such as the culturing of cells in a laboratory, to investigate processes that are thought to be involved in diseases and to evaluate new drugs that are being developed) can also be used to investigate the influences of lifestyle, perception, and our behavior on how well our organs function. For example, it has been shown that, in older adults, the levels of stress that a person is experiencing around the time of being vaccinated against flu can predict how long-lasting their antibody response against that flu strain will be.
It is also hoped that the study of skin diseases may improve our understanding of how inflammation in different areas of the body affects the brain and, conversely, how inflammation and dysregulation in the brain and nervous system affect other areas of the body. Such research has the potential to increase our understanding of how trauma and disruptions to the immune system affect our mental health, and whether there are links between them and the development of neurodegenerative diseases such as dementia in later life.
What Is the Main Idea?
A colorectal polyp is a small clump of cells that forms on the lining of the colon or rectum, and its size can indicate whether or not it is likely to become cancerous. In the research article “Usefulness and Educational Benefit of a Virtual Scale Endoscope in Measuring Colorectal Polyp Size”, published in the journal Digestion, the authors investigate the accuracy of a new type of endoscope, called a virtual scale endoscope, and explore its potential application as an educational tool.
What Else Can You Learn?
Colorectal cancer and its symptoms are described. The roles of the colon and rectum in the digestive system, and different types of endoscopy procedure used to examine them during screening, are also discussed.
What Are Colorectal Polyps?
The colon and rectum are both part of the digestive system, and together with the anus are known as the large intestine. Once food enters the body it is broken down in the stomach before being passed on to the small intestine. The breakdown of food continues in the small intestine, and it is here that most of the nutrients in our food are absorbed into the body. The leftover material, which is mostly liquid, then moves into the colon where water and some further nutrients are absorbed.
The remaining waste (known as “stool”, “feces”, or “poo”) is stored in the rectum before it is passed out of the body via the anus. Colorectal polyps are small clumps of cells that form on the lining of the colon and/or rectum. Most colorectal polyps are harmless (benign) and do not cause any symptoms, but over time some polyps can begin to grow out of control, leading to colorectal cancer.
What Is Colorectal Cancer?
Colorectal cancer is one of the most common types of cancer worldwide, with incidence varying widely between different regions and countries. Although colorectal cancer is usually diagnosed in people older than 50 years of age, it can occur in people who are younger than 50 years, particularly if they have a family history of colorectal cancer or have certain conditions that can be inherited. As well as genetic conditions, it is known that lifestyle choices such as eating a diet that is high in processed and red meat are linked to increased risk of developing it.
Common signs and symptoms of colorectal cancer include a persistent change in bowel habits (such as diarrhea, constipation, and/or a change in stool consistency that does not clear up after a short period of time), rectal bleeding or blood in the stool, a feeling that the bowel has not been completely emptied of stool when going to the toilet, losing weight without trying, a persistent feeling of weakness or tiredness, and discomfort in the abdomen area (the area between the chest and the pelvis) that does not go away.
Why Is Colorectal Polyp Size Important?
Research has shown that the larger the size of a colorectal polyp, the greater the chance that it will become cancerous. Changes in genes (mutations) that take place in the cells of a colorectal polyp over time can mean that they become able to grow more quickly and live longer, resulting in the polyp growing larger. If a polyp that is beginning to grow too large and too quickly is identified, and removed from the colon or rectum before it becomes cancerous and begins to spread, the development of colorectal cancer is prevented. As a result, the sizes of polyps influence how often follow-up screening will be conducted and whether or not they will need to be removed.
What Did This Study Investigate?
Because the size of a colorectal polyp can indicate whether or not it is likely to become cancerous, it is important that polyp size is estimated correctly. Polyp size can be assessed by endoscopy, a medical procedure that uses a long, thin, flexible tube with a small camera inside (called an endoscope) to look inside the body. The two types of endoscopy that are most commonly used to assess colorectal polyps are colonoscopy and flexible sigmoidoscopy.
Colonoscopy involves using an endoscope to assess the entire length of the colon, while flexible sigmoidoscopy only looks at the lower third. Most judgments about the sizes of colorectal polyps are made by endoscopists estimating polyp sizes by sight, but this means that there is a risk that their estimates may be wrong. Correctly estimating the sizes of polyps is difficult because endoscopes have “fisheye” lenses (also called ultra-wide lenses), which tend to curve straight lines and distort the images that endoscopists see. This means that objects such as polyps at the center of the display can appear larger than they really are, while things at the outer edge of the display can appear smaller. Although it has been suggested that forceps be used to measure polyps during endoscopy procedures, their use can make the procedures more complicated and time-consuming.
To get around these problems, a new type of endoscope called a virtual scale endoscope (VSE) has been developed. A VSE is able to project a red laser dot onto the surface of the colon or rectum that changes position according to the distance between the polyp and the end of the endoscope. The software that processes the VSE’s images then detects the position of the red laser dot and uses it to display a virtual scale (this can be linear, a bit like a ruler, or circular) on the image that is produced to help the endoscopist estimate polyp size in real time during the procedure. The authors of this study looked at whether polyp measurements made using a VSE were accurate, and also investigated whether the images produced could be used as a teaching aid to help endoscopists better estimate polyp sizes.
How Was the Study Conducted?
The authors carried out two studies. The first study compared the sizes of polyps as measured using a VSE before they were removed from the colon with their actual size as measured with a ruler after they were removed. The second study involved 14 endoscopists with differing levels of experience estimating the sizes of 42 polyps in a pre-test before receiving a lecture about how to measure colorectal polyps using VSE images.
The endoscopists were categorized as beginner, intermediate, or experienced based on the number of years of experience that they had. The lecture included the endoscopists being shown the correct sizes of the polyps that they had been asked to estimate in the pre-test, together with VSE images with a virtual scale, and each endoscopist received an explanation of how sizing errors were being introduced. The endoscopists then had a 1-month training period in which they could practice what they had learned before doing a post-test using the original images (which were shown to them in an order that differed from the order shown in the pre-test to try to reduce bias).
What Did the Study Show?
The results of the first study indicated that there was agreement between the polyp size estimates produced using a VSE and the actual sizes of the polyps. In the second study, the accuracy of the beginner and intermediate endoscopists in measuring polyp sizes was significantly better in the test conducted after the training period than in the pre-test, with accuracy improving by approximately 50% in some cases.
These results indicate that VSEs can accurately measure colorectal polyp sizes before polyps are removed from the colon, and that the images they produce are useful tools for training endoscopists in the first few years of their careers. This suggests that VSE use has the potential to increase the accuracy of polyp measurement during endoscopy and, as a consequence, improve the detection of polyps that are beginning to grow out of control.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is the Main Idea?
Atherosclerosis develops when our arteries begin to become narrowed or hardened. In the research article “Assessment of Subclinical Atherosclerosis in Children with Atopic Dermatitis”, published in the journal International Archives of Allergy and Immunology, the authors investigate whether early signs of developing atherosclerosis can be detected in children with atopic dermatitis and attempt to identify risk factors associated with both conditions.
What Else Can You Learn?
Atherosclerosis and its symptoms are described. Atopic dermatitis and the role of inflammation in the development of cardiovascular disease are also discussed.
What Is Atherosclerosis?
Atherosclerosis is a progressive disease that develops slowly when the arteries, a type of blood vessel that carries oxygen-rich blood from our heart to the organs and tissues around our body, become narrowed or hardened. It is caused by the buildup of fatty deposits called plaque, which consist of fats, cholesterol, and other substances. Over time, as the amount of plaque in the arteries increases, the narrowing makes it more difficult for the blood to flow freely and cardiovascular disease (a general term that is used to describe diseases that affect the heart or blood vessels) can develop.
Cardiovascular diseases that can be caused by atherosclerosis include:
- peripheral arterial disease (where a blockage develops in the arteries that deliver blood to your limbs, usually the legs),
- aortic disease (where the aorta, the body’s main artery, is unable to work properly),
- stroke (where the blood supply to the brain becomes disrupted), and
- coronary artery disease (where the coronary arteries, which are the main sources of blood supply to the heart, become narrowed or blocked), which can lead to angina or heart attack.
Although many people with atherosclerosis do not have any symptoms, some people experience pain in their chest, or in their arms or legs when exercising, a feeling of weakness and/or confusion, and may feel short of breath or tired most of the time.
What Causes Atherosclerosis?
Atherosclerosis can begin to develop in early childhood. High levels of fats and cholesterol in the blood are known to contribute because they make up some of the components of plaque. Damage or injury to the inner layers of arteries is also thought to be involved because the immune system responds to and seeks to repair the damage through a process called inflammation.
When inflammation is initiated, it causes blood cells and other substances to gather at the site of injury, and this can contribute to plaque starting to build up inside the arteries. Interestingly, there is evidence that the inflammation caused by inflammatory diseases such as rheumatoid arthritis, psoriasis, and inflammatory bowel disease can contribute to the development of atherosclerosis, and atopic dermatitis (more commonly known as eczema) may also be involved.
What Is Atopic Dermatitis?
Atopic dermatitis is an inflammatory skin condition that is usually long-term and recurrent, although in children it can improve or clear up completely as they get older. It causes the skin to be dry, cracked, itchy, and sore, and can range from occurring in small, localized patches to all over the body. Although the exact causes of atopic dermatitis are unknown, it is considered to be a systemic disease (a condition that affects the whole body rather than a single body part or organ) because the chronic inflammation that causes it often occurs in other organ systems as well as in the skin. It also often occurs in people who have allergies or asthma.
What Did This Study Investigate?
Because atopic dermatitis is an inflammatory disease and chronic inflammation has been linked to the development of atherosclerosis, it is possible that there is a relationship between people having atopic dermatitis and developing cardiovascular disease later in life. Some studies have found that the systemic inflammation caused by atopic dermatitis may double the risk of cardiovascular disease. Some research has suggested that the two may have an indirect relationship, with atopic dermatitis causing risk factors that are linked to increased risk of cardiovascular disease (such as sleep problems caused by itching, inactivity, and the use of corticosteroid treatments).
However, other research has suggested that there is a direct relationship caused by the excessive inflammation in the body that is independent of other factors. Recent research has shown that the levels of molecules that are prognostic (in other words, they can be used to indicate how a condition is likely to progress) for atherosclerosis and damage to the arteries are increased in skin and blood serum samples from patients with atopic dermatitis.
Most studies to date looking at whether there is a link between atherosclerosis and atopic dermatitis have involved adult patients. Considering that atherosclerosis can start to develop in early childhood, the authors of this study investigated whether early signs of developing atherosclerosis can be detected in children with atopic dermatitis and attempted to identify risk factors associated with both conditions. They compared a group of children who had atopic dermatitis with a similar number of children without the disease who were alike in terms of factors like age, weight, and height.
What Did the Study Show?
The results of the study showed that early signs of atherosclerosis were detectable in children with atopic dermatitis, with the length of time that they had had atopic dermatitis, the severity of their disease, and their age all associated with the likelihood of signs being present.
In particular, increases in a measure called carotid intima–media thickness were found to be associated with children having atopic dermatitis. Carotid intima–media thickness is measured using a special type of ultrasound that assesses the thickness of the two innermost layers of the carotid arteries (the major arteries that supply blood to your brain, with one on each side of your neck), the intima and the media, and is used to assess whether atherosclerosis may be present. The greater the carotid intima–media thickness, the greater the likelihood that atherosclerosis is developing.
The authors of the study suggest that it may be important that children with atopic dermatitis be monitored for signs of atherosclerosis development and other risk factors that are known to be associated with cardiovascular disease. These include obesity, high levels of fats in the blood, and high blood pressure. Studies following the health of children with atopic dermatitis over longer periods of time are now needed to shed more light on the relationship between it and the development of atherosclerosis and cardiovascular disease.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is the Main Idea?
Glioma is a type of tumor that develops in the nervous system, with differences between gliomas that develop in children and adults. In the open-access review article “Pediatric Glioma Models Provide Insights into Tumor Development and Future Therapeutic Strategies”, published in the journal Developmental Neuroscience, the authors summarize different experimental models that are being used to study glioma in children and how they may contribute to improvements in its treatment.
What Else Can You Learn?
Glioma and its symptoms are described. Differences in gliomas arising in children and adults, driver mutations, and the use of experimental models in cancer research are also discussed.
What Is Glioma?
Glioma is a type of tumor that is found in the nervous system. It usually develops in the brain but can also, rarely, develop in the spinal cord (a tube of nervous tissue that runs from the brain to the lower back). Glioma develops when glial cells begin to grow out of control. There are different types of glial cell, and they play essential roles in the nervous system that support the function of neurons (cells that transmit messages from one part of the nervous system to another via electrical impulses); glial cells are sometimes described as the “glue” that holds the nervous system together.
As well as surrounding neurons and holding them in place, glial cells create a myelin sheath around neurons that insulates their electrical impulses so that they can transmit messages effectively, a bit like the coating of an electrical wire. Glial cells also supply oxygen and nutrients to neurons to keep them nourished, and regulate inflammation (an immune system process through which the body responds to an injury or a perceived threat, like a bacterial infection or damaged cells). They also form the blood–brain barrier, a barrier between the blood vessels in the brain and the other components that make up brain tissue that allows nutrients to reach the brain while preventing other things from entering that could cause infections or damage.
What Are the Symptoms of Glioma?
There are different types of glioma, depending on the type of glial cell from which the glioma develops and the speed at which the tumor is growing. As a result, the symptoms and signs of glioma can vary between people, and are also affected by where the tumor is in the nervous system and its size. Common symptoms are:
- headache, which may hurt more in the morning;
- changes in mental function (such as problems with understanding information and memory) and personality;
- feeling sick and vomiting;
- problems with vision (such as blurred or double vision); and
- seizures, especially if the person has not had them before.
Gliomas can develop at any age, and although glioma is most commonly diagnosed in adults there are some types of glioma that are more common in children and young adults.
What Did This Article Look at?
Review articles survey the information that has been published on a topic to date. Rather than presenting new findings from their own research, the authors aim to clarify current thinking on a topic and the evidence that supports it, and sometimes set out suggestions for changes to what is considered to be best practice.
In this article, the authors review the different experimental models that are being used to study glioma in children and summarize how these models may contribute to improvements in its treatment. Experimental models use systems, such as the culturing of cells in a laboratory, to investigate processes that are thought to be involved in diseases and to evaluate new drugs that are being developed before they are assessed by going through the clinical trials process in humans.
There is a need for new treatments for glioma in children. Treatments that are currently the standard of care for childhood glioma have been chosen based on their effects on gliomas in adults, but adult and child gliomas that are high-grade (this means that they are “malignant”, growing in an uncontrolled way and able to spread to nearby tissues and other parts of the body) progress differently and have different underlying “driver mutations” (these are changes in genes in the tumor cells that give them a growth advantage and, as a result, promote the development of cancer).
Until recently, a lack of experimental models that could accurately recreate the environment in which gliomas form meant that efforts to study child gliomas were limited. However, the discovery of child glioma-specific driver mutations has enabled researchers to investigate the origins of these tumors, laying the foundation for the development of more appropriate and effective treatments.
What Experimental Models Are Being Used?
Over the last 10 years there have been major advances in the development of cell lines (a population of cells that can be grown and maintained in a laboratory) and models derived from tissue samples obtained from glioma patients during surgical procedures. Cell lines have the advantages of being relatively cost-effective and easily shared with other researchers, as well as being suitable for use in high-throughput screening (this is a process by which hundreds of samples of cells and hundreds of different potential drugs can be tested quickly, often using robotics).
Advances in stem cell engineering have also opened up new opportunities to investigate the development of glioma. Stem cells are unique in that they can self-renew, are either undifferentiated or only partially differentiated, and are the source of specialized cell types, like red blood cells and types of brain cell. Stem cells are useful in glioma research because they can be used to model tumor types from which it is difficult to obtain tissue samples or establish cell lines, which is the case for some types of glioma, and they can be controlled so that specific cell types and driver mutations can be investigated.
Organoids are three-dimensional tissue cultures that are grown from stem cells. Although cell cultures can be very useful in cancer research, they are usually grown as flat sheets of cells in tissue culture flasks and do not accurately represent all of the complicated interactions that take place between tumor cells and their environment in the body. This “tumor microenvironment” includes immune cells, signaling molecules, the matrix that surrounds cells in tissues and supports them, and the surrounding blood vessels. Tumors and their surrounding microenvironment constantly interact and it is known that they can influence each other.
Immune cells in the microenvironment can affect the growth and development of tumor cells. In turn, a tumor can influence its microenvironment by releasing signaling molecules that promote the development of new blood vessels, which increase the supply of nutrients to the tumor and aid its ability to start to spread around the body, and by inhibiting and evading the immune system’s ability to recognize and destroy tumor cells. Using organoids enables elements of the tumor microenvironment to be incorporated into models of glioma so that the experiments more accurately mimic the situation in the body.
These model systems are enabling us to better understand how glioma develops. As our understanding increases, more features of glioma cells will be identified that can be targeted specifically by new treatments, increasing the range of therapies that can be used to treat glioma in children and improving the outcomes of patients.
What Is the Main Idea?
Glutamate is the body’s main excitatory neurotransmitter, stimulating neurons to send signals around the body. In the free-access review article “Sex Hormones, Neurosteroids, and Glutamatergic Neurotransmission: A Review of the Literature”, published in the journal Neuroendocrinology, the authors summarize the current research evidence regarding whether or not there is a link between glutamate’s role as a neurotransmitter and the levels of sex hormones and neurosteroids in the body.
What Else Can You Learn?
In this blog post, the role of the amino acid glutamate as a neurotransmitter in the brain is discussed. Sex hormones and neurosteroids, amino acids, and the general purpose of review articles are also discussed.
What Is Glutamate?
Glutamate is found naturally in the food we eat and is also produced by the body. It is a type of molecule called an “amino acid”. Amino acids are best known for being the component molecules that make up proteins, with the amino acids used and the order in which they are joined together in a protein influencing its functions, shape, and ability to interact with other molecules. If the order of the amino acids in a particular protein changes (for example if the gene that codes for it becomes mutated), the protein produced may no longer be able to function properly or even at all.
An example of this is when a single amino acid is changed in a protein called beta-globin because of a mutation in its coding gene. Beta-globin is a component of hemoglobin, which is found in red blood cells and is involved in carrying oxygen around the body. The single amino acid change creates a “sticky” patch on hemoglobin molecules that causes them to clump together and distort the red blood cells into a sickle shape, giving rise to a condition called sickle cell disease.
What Does Glutamate Do in the Body?
Glutamate plays several important roles in the body. It is a key component of metabolism, the process by which the food and drink that we consume is changed into energy, and can be broken down as an energy source in the brain when glucose levels are low. Glutamate is involved in the removal of excess nitrogen from our bodies via the production of urea (which is passed out of our bodies in urine). It is believed to be involved in the regulation of the sleep–wake cycle because levels are high during the rapid-eye-movement phase of sleep and when you are awake. Another major role of glutamate is as an “excitatory neurotransmitter”.
What Are Neurotransmitters?
Neurotransmitters carry chemical signals between neurons, a type of cell that transmits messages from one part of the brain and nervous system to another, and trigger an action or change in the target cell. The effect can be “inhibitory” (it prevents or blocks the message from being transmitted any further), “modulatory” (it influences the effects of other neurotransmitters), or “excitatory” (it “excites” the target neuron, causing it to send the message on to the next cell).
Glutamate is the most abundant excitatory neurotransmitter in the human nervous system. It is involved in processes that take place in the brain such as memory and learning (it is estimated to be involved in more than 90% of the brain’s excitatory functions), and high levels of glutamate are also associated with increased pain levels. Glutamate is also converted into an important inhibitory neurotransmitter called gamma-aminobutyric acid (GABA) that is known as the “calming” neurotransmitter because it is involved in the regulation of anxiety, relaxation, and sleep. The process by which glutamate acts as a neurotransmitter is called “glutamatergic neurotransmission”.
What Are Sex Hormones and Neurosteroids?
Sex hormones are so called because they are critical in regulating the biological differences between males and females, and are particularly involved in reproduction and puberty (hormones are chemical messenger molecules that coordinate different processes and functions in the body). In humans, the key sex hormones are estrogen, progesterone, and testosterone. Neurosteroids are steroids that are produced in the brain or that have an effect on its functions (they can also act as signaling molecules). They are involved in a wide range of roles such as memory, learning, and behavior, as well as responses to stress and depression.
What Did This Article Look at?
A review article is a sort of survey of all the information that has been published on a topic. Rather than presenting new findings, review articles aim to clarify current thinking on a topic and the evidence that supports it, and sometimes set out suggestions for changes to what is considered to be best practice. A growing number of research articles report a link between glutamate’s role as a neurotransmitter and the levels of sex hormones and neurosteroids in the body.
There is also evidence that changes to the regulation or levels of sex hormones and neurosteroids may be linked to the development of a range of neurological conditions. For example, dysregulation of glutamate’s role as a neurotransmitter has been linked to a number of disorders including epilepsy and post-traumatic stress disorder. It has also been linked to premenstrual dysphoric disorder, which is a severe form of premenstrual syndrome. It is therefore important that we gain a better understanding of how sex hormones and neurosteroids influence the normal functioning of the brain and identify any roles in the development of conditions that affect its function.
What Were the Review’s Findings?
The authors of the review concluded that, based on the current evidence, sex hormones can directly affect glutamate’s role as a neurotransmitter. In particular, there was evidence that estrogens can be protective against excitotoxicity, which occurs when excessive or prolonged activation of neurotransmission, particularly if mediated by glutamate, has a negative effect on neurons, leading to their loss of function or death. This is particularly relevant to stroke, where loss of blood flow (known as “ischemia”) in a region of the brain can not only damage neurons directly, but can also affect glutamate transport, allowing glutamate to increase to levels at which neurons die.
Other conditions known to be linked to excessively high levels of glutamate in the brain include Alzheimer’s disease, multiple sclerosis, Parkinson’s disease, and chronic fatigue syndrome. Equally, abnormally low levels of glutamate in the brain are linked to low energy, trouble concentrating, and insomnia. Estrogen levels in the brain have also been shown to be linked to memory function in several non-human species. Progesterone may also have a neuroprotective effect, although further research is needed to investigate the link.
There was some conflicting evidence regarding whether testosterone has a protective or negative effect on neurons, and a number of neurosteroids that are produced from the conversion of testosterone and progesterone may also play an independent role in altering the levels of glutamate in the brain. As we learn more about the relationships of sex hormones and neurosteroids with glutamate-mediated neurotransmission, it is hoped that we will gain new insights regarding how to prevent the development of disorders and treat them more effectively.
What Is the Main Idea?
The use of biologically-based complementary and alternative medicines (CAMs) by patients with long-term health conditions is increasing. In the research article “Biologically-Based Complementary and Alternative Medicine Use in Breast Cancer Patients and Possible Drug-Drug Interactions”, published in the journal Breast Care, the authors describe how the use of biologically based CAMs by patients with breast cancer has the potential to cause drug interactions, both with anticancer medicines as part of a chemotherapy treatment and with each other.
What Else Can You Learn?
In this blog post, standard medical treatment for breast cancer and the possibility of drug interactions when medicines are taken together are discussed. Different types of complementary and alternative medicine are also described.
What Is Breast Cancer?
Breast cancer can start in one or both breasts. It develops when cells in the breast become abnormal, start to grow out of control, and begin to invade the surrounding tissue. Breast cancer cells can also spread to other areas of the body by being carried there by the blood and lymphatic systems. The lymph fluid that is transported around the body by the lymphatic system is an important part of the immune system. There are different types of breast cancer, with the exact type determined by which type of cells in the breast has become cancerous. Breast cancers are also classified on the basis of whether or not the cancer cells produce certain proteins or have changes (mutations) in specific genes. Genes are short sections of DNA that carry the genetic information for the growth, development, and function of your body.
How Is Breast Cancer Treated?
Treatments that have been assessed and accepted as effective treatments for particular diseases by the medical community are known as “standard medical treatments”. The standard medical treatments for breast cancer include surgery, chemotherapy, radiotherapy, hormone therapy, and targeted therapy.
- Types of surgery that are used to treat breast cancer include breast-conserving surgery (where a cancerous lump is removed) and mastectomy (where a whole breast is removed).
- Chemotherapy uses medicines that are “cytotoxic” (which means that they are toxic to cells, damaging them or causing them to die) to kill cancer cells. However, because cells in the body that are not cancerous can also be affected by chemotherapy medicines, many people who receive this type of treatment experience side effects. As this term is used to describe any unintended effects of a medicine, it can refer to beneficial and/or unfavorable effects.
- Radiotherapy aims to kill cancer cells by using controlled doses of radiation.
- Hormone therapy is used to lower the levels of the hormones estrogen and progesterone, which naturally circulate in the body, because some breast cancers develop the ability to be stimulated to grow by them.
- Targeted therapy specifically targets molecules that cancer cells need to survive and spread.
What Is Complementary and Alternative Medicine?
The term “complementary and alternative medicine” (CAM) is an umbrella term that describes medical practices and products that are not part of standard medical care. Complementary medicine is used alongside standard medical treatment, whereas alternative medicine is used instead of standard medical treatment. A wide range of different types of products and practices are included in CAM that can be broadly divided into five groups.
- Whole medical systems, such as ayurveda and naturopathy
- Mind–body therapy, including meditation, yoga, and hypnotherapy
- Manipulative and body-based practices, such as reflexology and massage
- Energy healing, such as reiki
- Biologically-based approaches, such as vitamins and dietary supplements, plants and plant extracts, and special foods or diets
The effectiveness and safety of most types of CAM approaches are less well understood than for standard medical treatment and more research is needed. However, while some CAM therapies have been shown to be generally safe and effective (such as acupuncture and yoga), some may be harmful and others may not work. Some may also cause drug interactions.
What Is a Drug Interaction?
A drug interaction happens when a medicine that is being taken by a person reacts with something else. Drug interactions can happen when one medicine reacts with another medicine or medicines, with something that the person is consuming (such as a herbal supplement or a particular food), or with another medical condition that the person has, causing side effects. When drug interactions occur, the results can range from mild side effects to a drug working less well or not at all. This means that a drug interaction has the potential to have a serious effect on the patient.
What Did the Study Investigate?
Advances in standard medical treatment for breast cancer have led to significant increases in 5- and 10-year survival rates in all countries in the European Union and in the UK in recent years. At the same time, health information has become more widely available and a large proportion of patients with long-term health conditions look for ways to improve their health and quality of life that fall outside of standard medical treatment.
Research has shown that the use of biologically-based CAMs is particularly popular among women with cancer, primarily because it is hoped that biologically-based CAMs can lessen the side effects of chemotherapy and strengthen the body against the effects of anticancer treatments. However, many of the biologically-based CAMs that people use carry the risk of drug interactions, and patients may begin taking them without consulting or notifying their medical team, making it difficult for any effects caused by drug interactions that do occur to be identified.
The authors of this study followed 47 patients with breast cancer as they began chemotherapy treatment, and asked them to complete questionnaires on their first day of treatment and again 10–12 weeks later. During this time period, 91% of the participants in the study reported that they used a biologically-based CAM, with the most popular types being vitamins, minerals, trace elements, and plants or plant extracts.
Drug interactions that had the potential to be clinically relevant (i.e., that could affect the effectiveness of a chemotherapy medicine or increase its toxicity in the body) were identified for 30 out of the 43 patients who reported using biologically-based CAMs. This was particularly true for patients who were using turmeric and ginger supplements together, which shows that the taking of more than one biologically-based CAM at once can cause drug interactions with each other, not just with anticancer medicines.
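As a quick back-of-the-envelope check of how these percentages relate to the participant numbers (this is our own arithmetic; the article reports the figures directly):

$$\frac{43}{47} \approx 0.91 = 91\% \quad \text{(participants who reported using a biologically-based CAM)}$$

$$\frac{30}{43} \approx 0.70 = 70\% \quad \text{(CAM users with a potentially clinically relevant interaction)}$$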
While the consumption of turmeric and ginger in food has generally been reported to have health benefits, both can have a blood-thinning effect when consumed at high levels. This puts a person at risk of dangerous bleeding if they are also taking an anticoagulant (a type of medicine that prevents blood clots from forming). There are also some instances where drug interactions only occur if two substances are taken at the same time. In such cases, the patient’s medical team can help put together a medication plan that avoids drug interactions by ensuring that the two medicines are taken at safe time intervals.
Take-Home Message
Although some biologically-based CAMs may have beneficial effects on the health of patients undergoing treatment for breast and other cancers, further studies are needed to identify potential interactions that can occur with chemotherapy drugs and with other biologically-based CAMs. If you are undergoing treatment for breast cancer, let your medical team know if you start to use a biologically-based CAM. This will enable them to monitor you for any potential drug interactions and will also add to the pool of knowledge regarding the best CAM options for patients undergoing anticancer treatment. There may also be known potential drug interactions that should be taken into consideration, and your medical team will be able to provide advice to help you support your standard medical treatment safely.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is the Main Idea?
Some changes in cognitive function are considered to be a normal part of aging, but others can indicate the presence of disease, such as dementia. In the research article “Cognitive Activity Is Associated with Cognitive Function over Time in a Diverse Group of Older Adults, Independent of Baseline Biomarkers”, published in the journal Neuroepidemiology, the authors investigate whether there is a relationship between a person’s level of cognitive activity, biomarkers in their blood that can indicate Alzheimer’s disease or dementia, and changes in their cognitive function in older age.
What Else Can You Learn?
In this blog post, changes in cognitive function as we age are described. Cognitive reserve and different forms of dementia are also discussed.
How Does Cognitive Function Change as We Age?
The term “cognitive function” describes a combination of processes that take place in the brain that enable us to learn, manipulate information, remember, and make judgements based on experience, thinking, and information from the senses. These processes affect every aspect of life and our overall health, including how we form impressions about things, fill in gaps in knowledge, and interact with the world.
Some changes in cognitive function that are considered to be a normal part of the aging process include difficulties with multitasking and sustaining attention, and an overall slowing of the speed at which we think. The ability to “hold information in mind”, which means the ability to think about something without steady input about it from the outside world, also tends to decrease.
In contrast, skills like verbal reasoning and vocabulary tend to increase or stay the same as we get older. Changes in cognitive function that are considered a normal part of aging are usually subtle over time; however, some people experience major changes in cognitive function that may indicate the development of a neurodegenerative disease caused by abnormal changes in the brain, such as dementia. The term “neurodegenerative” means the degeneration or death of neurons, a type of cell that transmits messages from one part of the brain and nervous system to another.
What Is Dementia?
Dementia mainly occurs in people aged over 65 years and covers a range of conditions with different causes. For example, vascular dementia develops when blood flow to one or more areas in the brain is blocked or reduced, preventing cells from getting the oxygen and nutrients that they need to function properly.
In contrast, Alzheimer’s disease is believed to be caused by the abnormal functioning of two proteins called beta-amyloid and tau. In people with Alzheimer’s disease, beta-amyloid forms clumps called “plaques” on neurons that make it hard for them to stay healthy and communicate with each other, while abnormal forms of tau cling to other tau proteins inside neurons and form “tau tangles”. People with dementia often experience declines in cognitive function that affect their memory and other thinking skills like language, problem-solving, attention, and reasoning. Their behavior, feelings, and relationships can also be affected, with significant effects on their daily lives.
What Did the Study Investigate?
It is well known that the extent to which a person engages in cognitive activity (mental tasks that require focus, reading, learning, creativity, memory, and/or reasoning) can affect their cognitive function as they age. There is strong evidence that people who are more cognitively active maintain higher levels of cognitive function over time than people who are less cognitively active, regardless of whether they develop a form of dementia. In other words, some brains keep working more efficiently than others despite experiencing similar amounts of cognitive decline and/or damage. However, it remains unclear whether this is because cognitive activity directly benefits cognitive health or because people with declining cognitive function become less cognitively active.
What Is “Cognitive Reserve”?
The possibility that cognitive activity can positively affect our brain health relates to an idea called “cognitive reserve”. It suggests that people build up a reserve of cognitive abilities during their lives that can protect them against some of the cognitive decline that can happen as the result of aging or the development of a disease such as dementia. A person can increase their cognitive reserve through activities that engage their brain, such as learning a language or a new skill, solving puzzles, and high levels of social interaction, particularly if the activities are novel and varied. Regular physical activity, not smoking, and a healthy diet are also important.
The idea of cognitive reserve is supported by research that has shown that the relationship between cognitive activity and function in older age is not affected by the degree of abnormal brain changes. In other words, two people with Alzheimer’s disease may have similar levels of beta-amyloid plaques and tau tangles in their brains, but may differ regarding the extent to which their cognitive function has declined. Equally, two people who seem to have the same level of cognitive function may differ regarding the extent of abnormal change that has happened in their brains.
What Role Do Biomarkers Play?
The authors investigated whether there is a relationship between the levels of three biomarkers in the blood that can be used to predict and stage some types of dementia, including Alzheimer’s disease, and the extent to which a person’s level of cognitive activity affects their cognitive function as they age. Biomarkers are measurable characteristics, such as molecules in the blood or changes in genes (mutations), that can indicate whether the body is working normally or a disease is present.
In this study, the authors measured the levels of three biomarkers in blood samples: total tau, neurofilament light chain (NfL), and glial fibrillary acidic protein (GFAP).
- As already mentioned, tau tangles are a characteristic of Alzheimer’s disease, and high levels of total tau (both normal and abnormal forms of tau) in the blood have been reported to be associated with increased risk of cognitive impairment.
- High levels of NfL in the blood have been linked to neurodegeneration and there is evidence that it may be possible to use levels of NfL in the blood to detect whether a person has dementia.
- Levels of GFAP in the blood have been shown to be increased early on in the development of Alzheimer’s disease. This can be used to determine whether a person has Alzheimer’s disease or frontotemporal dementia, a rarer type of dementia that affects the frontal and temporal lobes of the brain, which are responsible for language, behavior, and emotions.
Who Participated in the Study?
The people who participated in the study were all aged 65 years or older, and one-third of the participants were randomly selected to give blood samples for biomarker testing at the start of the study. All of the participants reported how often they participated in cognitive activities that were judged to be common to older adults because they are not overly dependent on a person’s financial or social situation:
- Watching television
- Listening to the radio
- Visiting a museum
- Playing games or doing puzzles
- Reading books
- Reading magazines
- Reading newspapers
Their cognitive function was also assessed at the start of the study and in 3-year cycles after that, using tests of short-term and immediate memory, perceptual speed, and language functioning.
What Did the Authors Find?
The authors of the study found that higher levels of cognitive activity were associated with better cognitive function not only at the start of the study, but also after an average of 6.4 years of follow-up (the period during which the authors contacted participants at prearranged dates to check on their progress). However, the levels of the blood biomarkers did not affect this relationship. In other words, the benefits of high levels of cognitive activity on cognitive function were not affected by the levels of tau, NfL, and GFAP in the blood, even when these biomarkers were present at high levels.
These results lend weight to the idea of cognitive reserve and suggest that people who engage in enriching activities throughout their lives may enter old age with a higher level of cognitive function, which can delay the onset of symptoms of dementia and other neurodegenerative diseases or reduce their impact on quality of life.
Take-Home Message
Ensuring that we are cognitively active before we reach our 60s (i.e., before the age at which the study’s participants were initially assessed) may benefit our brain health and cognitive function as we age. The fact that the authors of the study did not find a link between the blood biomarkers and cognitive activity over time also suggests that people benefit from enrichment activities throughout their lives, including in their later years.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is Stroke?
Arteries are blood vessels that carry oxygen-rich blood from the heart to cells and organs throughout the body. Stroke is a disease that affects the arteries that lead to and pass through the brain. The oxygen and nutrients that brain cells need to function properly are carried around the brain by the blood. When stroke happens, the blood supply to part of the brain is cut off or reduced.
This can be caused by a blockage in an artery (this is called an “ischemic” stroke) or by an artery rupturing, causing bleeding in or around the brain (this is called a “hemorrhagic” stroke). The cells in the affected area of the brain can no longer get all the oxygen and nutrients they need and quickly begin to die. The bleeding can also cause irritation and swelling, and pressure can build up in surrounding tissues, which can increase the amount of damage in the brain.
As well as the two main types of stroke, some people experience “mini-strokes” called transient ischemic attacks (TIAs). A TIA is essentially a stroke caused by a temporary, short-term blockage of an artery. Once the blockage clears the symptoms stop. Although someone who has a TIA may feel better quickly they still need medical attention as soon as possible, because the TIA may be a warning sign that they will have a full stroke in the near future.
What Are the Effects of Stroke?
The effects of stroke differ from one person to another and depend on the severity, the area of the brain that is affected, and the type of stroke experienced. The main symptoms of stroke include one side of the face drooping or the person being unable to smile, not being able to lift both arms and keep them raised, difficulty understanding what is being said, and slurred speech or not being able to talk.
Other symptoms include confusion or memory loss, numbness or weakness on one side of the body, a sudden fall or dizziness, sudden severe headache, and/or loss of sight or blurred vision (in one or both eyes). Although some people will have a full recovery after stroke, others will have permanent effects that do not get better.
What Causes Stroke?
There are some factors that are known to increase your chance of stroke. These include your age, ethnicity, having a close relative (a sibling, parent, or grandparent) who has had a stroke, especially if the stroke happened before they reached age 65 years, and having other conditions such as diabetes or a type of heart disease. Your arteries naturally become narrower as you get older, and blood clots that cause ischemic stroke often form in areas where arteries have become narrower or blocked over time as a result of the buildup of fatty deposits (a process called “atherosclerosis”).
Smoking, high levels of lipids (fats) in the blood (such as cholesterol and triglycerides), diabetes, drinking excessive amounts of alcohol (binge drinking), obesity, and high blood pressure (also called “hypertension”) can all speed up this process. High blood pressure is also the main cause of hemorrhagic stroke because it can weaken arteries in the brain. The roles of smoking, diabetes, high lipid levels, and high blood pressure in causing stroke are so well known that they are sometimes called “traditional” risk factors.
What Did This Study Investigate?
Although stroke is more common among the elderly it can happen at any age, even in infants. There is some evidence that the global incidence of stroke among younger and middle-aged people (aged 18–64 years) is increasing, with significant increases in low- and middle-income countries. As these countries have undergone economic changes, so too have the dietary and lifestyle habits of their inhabitants, resulting in increases in high blood pressure, diabetes, and obesity.
Although it used to be thought that rare, non-traditional risk factors (such as conditions that give a person a tendency to develop blood clots, or rheumatic heart disease) were mainly responsible for stroke in younger people, this may no longer be the case.
The authors of this study used data from a study called INTERSTROKE to assess whether traditional risk factors are now the main cause of stroke in people aged 18–45 years. INTERSTROKE was a case–control study that involved 142 centers located in 32 countries across the world between 2007 and 2015. A case–control study is a type of study that compares the medical and lifestyle histories of two different groups of people to identify risk factors that may be associated with a disease or condition:
- one group of people with the disease being studied (cases) and
- another similar group of people who do not have the disease (controls).
In INTERSTROKE, people who experienced their first acute stroke and who presented to medical professionals within 5 days of their symptoms beginning were matched with control participants based on their age and sex. In total, 1,582 pairs of participants were assessed.
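To illustrate the general principle of matching in a case–control design, here is a small sketch in Python. This is our own simplified illustration, not the actual INTERSTROKE matching procedure; all of the records and the age tolerance below are hypothetical.

```python
# Pair each case with an unused control of the same sex and a similar age.
# A simplified illustration of case-control matching; all data are made up.

cases = [
    {"id": "case_1", "age": 42, "sex": "F"},
    {"id": "case_2", "age": 35, "sex": "M"},
]
controls = [
    {"id": "ctrl_1", "age": 43, "sex": "F"},
    {"id": "ctrl_2", "age": 36, "sex": "M"},
    {"id": "ctrl_3", "age": 60, "sex": "F"},
]

MAX_AGE_GAP = 2  # hypothetical tolerance, in years

pairs = []
unused = list(controls)
for case in cases:
    for control in unused:
        if (control["sex"] == case["sex"]
                and abs(control["age"] - case["age"]) <= MAX_AGE_GAP):
            pairs.append((case["id"], control["id"]))
            unused.remove(control)  # each control is used at most once
            break

print(pairs)  # [('case_1', 'ctrl_1'), ('case_2', 'ctrl_2')]
```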
What Did the Study Show?
As in older people, ischemic stroke was more common than hemorrhagic stroke in younger adults (accounting for 71% of cases). No statistically significant regional differences in risk factors were identified, although this may have been influenced by the low numbers of participants from individual regions. Traditional risk factors such as high blood pressure, high lipid levels, smoking, excessive alcohol consumption, obesity, and psychosocial stress (caused by our environment and relationships) were also shown to be significant risk factors for stroke in younger adults. High blood pressure was shown to be particularly significant, and was consistently identified as the strongest risk factor across all of the regions included in the study, different stroke types, and both sexes.
These results show that, worldwide, the traditional risk factors for stroke are now as important for younger adults as they are for older members of the population. The authors suggest that public health efforts that aim to identify and address traditional risk factors for stroke should start when people are in their 20s and 30s, which is much earlier than previously thought.
Take-Home Message
Taking steps to control your blood pressure and keep it low, whatever your age, can have significant health benefits that include reducing the risk of stroke. Eating a healthy diet that includes plenty of vegetables, wholegrains, fruit, some dairy products, fish, poultry, nuts, seeds, and beans, and reducing your consumption of sugars and red and processed meat can help. Stopping smoking, only drinking moderate amounts of alcohol (and avoiding binge drinking in particular), and being more active can also have significant positive effects on your health.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is the Main Idea?
The exact causes of migraine are unknown, but it is thought that migraine attacks develop as a result of abnormal brain activity. In the research article “Migraine Attacks Triggered by Ingestion of Watermelon”, published in the journal European Neurology, the authors describe how watermelon consumption may trigger migraine headache attacks by activating a process called the L-arginine-nitric oxide pathway.
What Else Can You Learn?
In this blog post, different types of migraine and what is known about how migraine attacks develop are described. The processes by which nerves transmit signals throughout the body and the L-arginine-nitric oxide pathway are also discussed.
What Is Migraine?
Migraine is often characterized as a headache that causes severe throbbing pain or a pulsing sensation, usually on one side of the head. However, there are different types of migraine and headache, and it can be difficult to tell them apart. Different people also experience different migraine symptoms.
Although many migraine attacks involve a severe throbbing headache, some people will experience migraine attacks without headache (known as silent migraine). When this happens, the person experiences “aura” symptoms such as flashing lights or seeing zigzag lines, but does not develop head pain.
Other people may experience migraine that includes severe head pain with or without aura symptoms, such as changes in their vision, numbness or tingling, feeling dizzy, having difficulty speaking, and feeling or being sick. Migraine attacks can last anywhere between several hours and three days, and some symptoms may begin one or two days before the headache itself develops.
What Causes Migraine?
The exact causes of migraine are not known, although the fact that people are more likely to experience migraine if they have a close family member who gets migraines suggests that there is some sort of genetic involvement. It is thought that migraine attacks develop when nerve signals, chemicals, and blood vessels in the brain are affected by abnormal brain activity.
Neurogenic inflammation (a type of inflammation caused when particular types of nerves are activated and release mediators of inflammation such as nitric oxide) and the widening of blood vessels in the membrane layers that protect the brain and spinal cord are believed by some researchers to be key causes of migraine headache. Leakage of blood plasma (the liquid component of blood that does not include blood cells) from blood vessels into the surrounding tissues may also be involved.
Nerves (also known as neurons), together with the spinal cord and brain, are key components of the nervous system and consist of bundles of nerve fibers wrapped up to form cable-like structures. Nerves send electrical signals that control our senses, like pain and touch, and essential processes such as breathing, digestion, and movement, from one part of the body to another. When an electrical signal reaches the end of a nerve, it is converted into a chemical signal. This causes molecules called neurotransmitters, such as dopamine and epinephrine (also known as adrenaline), to be released into the space between the end of one nerve and the start of the next one, which is called a synapse.
Once they have crossed the synapse, the neurotransmitters bind to receptors on the new nerve, and the signal is converted back into an electrical signal and travels on along the neuron. The ability of nerves to transmit signals internally or between one nerve and another is dependent on a process called depolarization, which is essential to the function of many cells and communication between them. Most cells have an internal environment that is normally negatively charged compared with the cell’s external environment.
When depolarization occurs, the internal charge of the cell temporarily becomes more positive before returning back to normal. Migraine aura is thought to be caused by a wave of “spreading depolarization” in a part of the brain called the cortex. Nitric oxide and glutamate are released during spreading depolarization, and some studies have reported increased levels of nitric oxide during headache attacks. This has led some researchers to suggest that the pathways that break down nitric oxide may be involved in migraines.
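As a concrete illustration of the size of this change (these are typical textbook values for a neuron, not figures taken from the article):

$$V_{\text{rest}} \approx -70\ \text{mV} \xrightarrow{\ \text{depolarization}\ } V_{\text{peak}} \approx +30\ \text{mV} \longrightarrow \text{return to } V_{\text{rest}}$$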
What Did This Study Investigate?
Although the exact causes of migraine are still unclear, migraine attacks are known to be triggered by stress and tiredness, hormonal changes, prolonged fasting or skipping meals, and the consumption of too much alcohol or caffeine and certain foods. Watermelon is the main natural source of an amino acid (the component units that are joined together to make proteins) called L-citrulline (in fact, its name is derived from the scientific name for watermelon, Citrullus vulgaris).
L-citrulline is also made by the body in the liver and intestine, and is an important component of the urea cycle, the process by which toxic ammonia is converted into urea so that it can be passed out of the body in urine. L-citrulline in the body can be converted to another amino acid called L-arginine, from which nitric oxide is produced via a process called the L-arginine-nitric oxide pathway. This means that watermelon may be an indirect source of nitric oxide in the body and may trigger migraine in some people.
The authors of this study conducted a clinical trial to investigate whether eating watermelon causes headache attacks in people who experience migraine. They recruited 38 volunteers who experience migraine without aura and 38 who do not, and asked them to each consume a portion of watermelon after avoiding consumption of watermelon and other L-citrulline-containing foods in the preceding 7 days, and fasting for the preceding 8 hours.
All of the volunteers gave blood samples before and after eating the watermelon to enable the researchers to assess whether there were any changes in blood serum nitrite levels (produced by the breakdown of L-citrulline). All of the volunteers were then allowed to eat normally and were followed up for 24 hours by telephone, so that the researchers could be informed if any of them developed headache.
What Were the Results of the Study?
Headache was triggered in almost one-quarter of the people in the group who experienced migraine (23.7%), on average around 2 hours after the watermelon was consumed. In contrast, none of the volunteers in the migraine-free group developed headache over the 24-hour follow-up period. Interestingly, around one-quarter of the volunteers in both the migraine (23.4%) and migraine-free (24.3%) groups were shown to have increased nitrite levels in their blood serum samples after consuming watermelon. These increases from the values recorded before watermelon consumption were statistically significant.
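To translate the headline percentage into an approximate number of people (our own arithmetic, based on the group sizes given above):

$$0.237 \times 38 \approx 9 \quad \text{(volunteers in the migraine group who developed headache)}$$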
These findings suggest that eating watermelon can trigger headache attacks in people who experience migraine and increase serum nitrite levels, which may be due to activation of the L-arginine-nitric oxide pathway. Although everyone is different and not all of the migraine group volunteers developed headache after consuming watermelon, people who experience migraine may wish to consider reducing or avoiding consumption of watermelon.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available.
What Is the Main Idea?
The treatment of ulcerative colitis has traditionally focused on the control of symptoms. In the review article “Current and Emerging Targeted Therapies for Ulcerative Colitis”, published in the journal Visceral Medicine, the authors describe how advances in targeted treatments have the potential to improve the quality of life of people with ulcerative colitis.
What Else Can You Learn?
In this blog post, ulcerative colitis and emerging treatments for it are described. Different phases of clinical trials are also discussed.
What Is Ulcerative Colitis?
Ulcerative colitis is a form of inflammatory bowel disease. People with ulcerative colitis have chronic (long-term) inflammation and ulcers (sores) in the colon (also known as the large bowel and part of the large intestine, it removes water and some nutrients from partially digested food before the remaining waste is passed out of the body).
For many people with ulcerative colitis, the disease follows a “relapsing and remitting” course, which means that there will be times when their symptoms get worse and others when their symptoms partly or completely go away. Symptoms of ulcerative colitis include needing to go to the toilet frequently and urgently, abdominal pain, a general feeling of being unwell, and fatigue, which can combine to have a major impact on a person’s quality of life and ability to work.
What Causes Ulcerative Colitis?
The exact causes of ulcerative colitis are not fully understood, but it is known that a combination of factors causes the immune system to activate inflammation. Inflammation is a normal process through which your body responds to an injury or a perceived threat, such as a bacterial infection. In ulcerative colitis, a high level of inflammation taking place for too long results in tissue damage in the colon and disease-related complications that cause the symptoms described above.
Ulcerative colitis is thought by some to be an autoimmune condition, which means that the body’s immune system wrongly attacks normal, healthy tissue. The intestines contain hundreds of different species of bacteria, which are part of the “gut microbiome” (the term given to all of the microorganisms that live in the intestines and their genetic material). Although some of these species can cause illness, many are essential to our health and wellbeing, playing key roles in digestion, metabolism (the chemical reactions in the body that produce energy from food), regulation of the immune system, and mood.
Some researchers believe that in ulcerative colitis, the immune system may mistakenly identify harmless bacteria inside the colon as a threat and start to attack them, causing the colon to become inflamed. Genetic factors like changes in genes and environmental factors are also known to be involved in the development of ulcerative colitis, and recent advances in our understanding have enabled new targeted therapies to be developed that selectively block or reduce the activity of components involved in inflammation.
Treatment of ulcerative colitis has traditionally focused on symptom control, whereas the development of new targeted treatments aims to achieve remission (the signs and symptoms of disease are reduced either partially or completely) and the restoration of people’s quality of life. A number of new treatments are in phase 2 or 3 clinical trials and may soon add to the range of treatments available to people with ulcerative colitis.
What Are the Different Types of Clinical Trials?
To be approved, a treatment must be proven to be safe and better than existing treatments. New treatments have to successfully go through several phases of clinical trials before they are approved for use, and cannot move on to the next phase unless the previous phase has yielded positive results. Phase 0 and phase 1 trials are the earliest-phase trials. They usually involve a small number of people (up to around 50), aim to determine whether a treatment is safe and, if the treatment involves a drug being given, what happens to it in the body.
Once found to be safe, treatments enter larger phase 2 trials (usually up to 100 people), where they are assessed as treatments for specific illnesses and any side effects (unintended effects of the drug) are investigated in more detail. Phase 3 trials include hundreds or thousands of people and test new treatments against an existing treatment to see whether the new treatment is better. Phase 3 trials are randomized and often take place over several years so that the long-lasting effects of the new treatment can be assessed.
Emerging Therapies for Ulcerative Colitis
Interleukin-23 (IL-23)
A protein called interleukin-23 (IL-23) is known to inhibit the responses of a type of white blood cell called regulatory T cells. These cells play an important role in the body by suppressing the response of the immune system, ensuring that its normal level of activity remains within set limits and that its activity is reduced once a threat has been dealt with. They are also critical in preventing the development of autoimmunity.
When IL-23 inhibits regulatory T cells, inflammation is able to continue unchecked. A particular form of IL-23 called IL-23p19 has been identified as being involved in the development of ulcerative colitis. Four IL-23p19 inhibitors are currently in or have completed phase 2 or 3 trials. They appear to be particularly effective in patients whose ulcerative colitis has become resistant to treatment with tumor necrosis factor (TNF) inhibitors, and their effectiveness in combination with TNF inhibitors is also being investigated.
Sphingosine 1-Phosphate (S1P)
S1P is a type of molecule called a “lipid mediator” and is produced in response to a cell receiving a stimulus, and then exported from the cell so that it can bind to a receptor to transmit a signal to target cells. S1P binds to five different S1P receptors expressed on various types of immune cell, resulting in lymphocytes (cells that make antibodies and help control the immune system) being able to travel toward inflamed tissue in the intestine. Drugs that bind to S1P receptors and cause them to be internalized back into the cell and broken down are called S1P agonists. One S1P agonist has already been approved for the treatment of ulcerative colitis and another is in clinical development.
Toll-Like Receptor 9 (TLR-9)
A receptor inside cells called Toll-like receptor 9 (TLR-9) recognizes and binds to bacterial and viral DNA that is present inside cells. It does this by recognizing components called CpG motifs, which are made of a cytosine and a guanine bound together (these are two of the four components of DNA that make up the “genetic code”). CpG motifs are known to be the components of bacterial and viral DNA that cause the immune system to be activated.
As a result, some researchers are investigating the use of short, single-stranded synthetic stretches of DNA (called CpG oligonucleotides) to stimulate the immune system. One such molecule, which activates TLR-9 on target cells, has been shown in clinical trials to suppress immune cells that promote inflammation and to activate immune cells that suppress it, and is undergoing further testing.
microRNAs
Another approach is investigating the potential use of microRNAs. Your genes are short sections of DNA that carry the genetic information for the growth, development, and function of your body. Each gene carries the code for a protein or an RNA. There are several different types of RNA, each with different functions, and they play important roles in normal cells and the development of disease. MicroRNAs are small RNA molecules that do not code for proteins and instead play important roles in regulating genes, for example by inhibiting (silencing) gene expression.
Some microRNAs also activate signaling pathways inside cells, turning processes on or off. One such microRNA is miR-124, which negatively regulates inflammation. Reduced expression levels of miR-124 have been reported in studies of patients with ulcerative colitis, and a treatment that has been designed to upregulate miR-124 is currently in clinical trials involving patients with a variety of inflammatory diseases, including ulcerative colitis and rheumatoid arthritis.
Interleukin-6 (IL-6)
Interleukin-6 (IL-6) is another molecule that promotes inflammation and has been shown to play a central role in the development of inflammatory bowel disease. The binding of IL-6 to its receptor results in uncontrolled accumulation of activated T cells that stop inflammation from being reduced. Results of a phase 2 trial investigating an IL-6 inhibitor have been positive and it will be investigated further to assess its safety and efficacy in treating ulcerative colitis.
Take-Home Message
It is hoped that the emerging treatments described above, and others, will increase the options available to patients with ulcerative colitis. In addition, their investigation will continue to improve our understanding of how ulcerative colitis is caused, enabling further targeted therapies to be developed and opening up the possibility of personalizing each patient’s treatment.
Note: This post is based on an article that is not open-access; i.e., only the abstract is freely available. Furthermore, in the Conflict of Interest Statement at the end of this paper, the authors make a declaration about grants, research support, consulting fees, lecture fees, etc. received from pharmaceutical companies. It is normal for authors to declare this in case it might be perceived as a conflict of interest.
What Is a Systematic Review?
A systematic review is a type of research study that seeks to summarize all of the available primary research (i.e., research that has collected data first-hand) that has been conducted to answer a research question. It involves a systematic search for data using a specific, repeatable method with a clearly defined set of objectives. The search is usually conducted using databases that hold information about research publications and aims to identify all studies within them that meet predefined eligibility criteria.
The validity of the findings for each study is then assessed, particularly regarding whether there is any risk that the results may be biased, following which the results are considered together and any conclusions drawn. Systematic reviews enable up-to-date assessment of what is known about a subject and are often used in the development and updating of clinical guidelines.
What Did This Study Investigate?
The authors of this study conducted a systematic review to summarize the results of studies that have reported adverse reactions when patients with neurological disorders – conditions that affect the brain, spinal cord, and/or nerves throughout the body – are treated with intravenous immunoglobulin. Intravenous immunoglobulin is a product that is made up of different human antibodies (immunoglobulins is another word for antibodies) that have been pooled together and are given intravenously (through a vein).
Antibodies are specialized protective proteins that are made by the immune system and recognize anything that is foreign to the body (these are called “antigens”), like bacteria and viruses. Different antibodies specifically recognize and neutralize different antigens. Once the immune system has recognized and responded to a particular antigen, antibodies against that antigen continue to circulate in the blood to provide protection if it is encountered again (this is how we become immune to some diseases).
Because intravenous immunoglobulin is prepared from blood samples donated from a large number of different people (depending on the manufacturer, the number of donors can be between 1,000 and 100,000) it contains a diverse collection of antibodies, which reflects the exposure of everyone who has donated blood to their environment, against a broad range of antigens. As a result, intravenous immunoglobulin can be effective in preventing or treating infections in people who are unable to make enough antibodies (known as “humoral immunodeficiency”) or who have an autoimmune disease (where the body mistakenly recognizes a cell type or specific protein in the body as foreign, treats it as an antigen, and attacks it).
Although a large number of clinical trials have reported that treatment with intravenous immunoglobulin is safe and generally well tolerated, some patients experience adverse reactions (an undesired effect of the treatment). The authors of this study therefore set out to systematically review studies that have reported adverse reactions to intravenous immunoglobulin therapy when it is used to treat more than one neurological disorder, to investigate whether any particular characteristics of individual neurological disorders are associated with patients experiencing adverse reactions.
How Was the Study Conducted?
The authors of the study searched three electronic databases for all research studies published up until the date of the search, using the following combination of search terms (an illustrative example of how these terms might be combined into a single query is shown after the list):
- IVIg (the acronym for intravenous immunoglobulin), intravenous immunoglobulin, or immunoglobulin G (the type of immunoglobulin that makes up the greatest proportion of intravenous immunoglobulin), and
- any term beginning with “neurolog”, and
- adverse reaction, adverse effect, side effect, or any term beginning with “allerg”.
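Combined into a single query, the search might look something like the example below. This is an illustrative reconstruction only: the exact syntax depends on the database being searched, and the article does not report the query in this form. The asterisk is a common “wildcard” character that matches any term beginning with the letters before it (so neurolog* matches “neurology”, “neurological”, and so on).

```
("IVIg" OR "intravenous immunoglobulin" OR "immunoglobulin G")
AND neurolog*
AND ("adverse reaction" OR "adverse effect" OR "side effect" OR allerg*)
```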
Articles were then included in the review if they described primary research, reported adverse reactions to intravenous immunoglobulin therapy in more than one neurological disorder, and were available as full-text publications in English. Although 2,196 studies were identified initially, only 65 met all of the eligibility criteria and were included in the final analysis.
What Did the Study Find?
After systematically reviewing the eligible studies, the authors of this study reported that when the results from all the studies were combined, the chance of patients developing an adverse reaction was estimated to be between 24 and 34%. In many studies the definition of specific adverse reactions was unclear or not specified. In addition, a large proportion of studies were conducted retrospectively, which increased the chance of selection bias. Selection bias is introduced when a group of patients is selected for analysis in a way that does not allow the sample population to be truly randomized, which means that it isn’t representative of the population as a whole, potentially leading to errors when the researchers draw conclusions about associations or outcomes.
Overall, there was a lack of high-quality comparative data (data that can be used to estimate the extent of similarity or dissimilarity between two things), which made it difficult for the authors to determine whether any specific neurological symptoms or signs are associated with patients having an increased risk of having an adverse reaction if treated with intravenous immunoglobulin therapy. Although intravenous immunoglobulin treatment was found to be generally well tolerated by patients with neurological conditions, headache was a common adverse reaction and there were some reports of “thromboembolic” complications (caused by the obstruction of a blood vessel by a blood clot that has become dislodged from another site in the circulatory system, which circulates the blood and lymph fluid through the body).
The authors concluded that an increased risk of experiencing adverse reactions was likely in patients with limited mobility (as seen in some conditions that affect both nerves and muscles), in patients with paraproteinemia, which occurs when an abnormal protein called a paraprotein is secreted by a population of antibody-producing cells (as seen in some conditions where nerve damage causes pain, weakness, or numbness, often in the hands, arms, and feet), and in patients with cardiomyopathy (a general term that describes problems with your heart that make it harder for it to pump blood). They also found some evidence that children might be at increased risk of experiencing adverse reactions.
Although the systematic review was unable to identify neurological disease characteristics that are definitely associated with adverse reactions in patients treated with intravenous immunoglobulin, the knowledge gained from this study can be used to guide the design of research studies in the future. Systematic reviews like this one play a key role in shaping future research directions by identifying areas relating to research questions that remain poorly understood or that need further investigation because different studies have reported conflicting results. This increases the chance of positive discoveries in the future that may improve the prevention and treatment of adverse reactions.