Sunday 2 November 2014

LOETUE PURE COCOA POWDER

Loetue Pure Cocoa is a powerful health drink, an antioxidant with many health benefits said to help prevent different ailments. It has been credited with preventing more than eight different diseases.

Health Benefits of Loetue Pure Cocoa Drink:
1) Acts as one of the best natural antioxidants in the world; it fights free radicals
2) Lowers high blood pressure, thereby helping to prevent hypertension
3) Boosts libido and improves sexual function in men
4) Helps prevent breast cancer in women
5) Helps prevent diabetes and high cholesterol
6) Fosters higher life expectancy
7) Boosts brain power and improves memory
8) Fights fatigue and rejuvenates you when you are weak
9) Relieves menstrual pain and promotes free flow
10) Arrests persistent cough, and many more

Price: NGN 600 and NGN 1,000.

CONTACT: ADEKOGA ADENIJI
Tel: 08033949820

NB: DISTRIBUTORS WANTED NATIONWIDE. KINDLY SHARE WITH YOUR FRIENDS.
Publications that support the above benefits:



Tuesday 14 October 2014

Getting Sample Hemolysis Under Control

Q: Hemolysis has been cited as the most common cause of preanalytical error. How can sample hemolysis be avoided?
A: Many variables contribute to the ability to obtain a high-quality, non-hemolyzed blood sample. Controlling the flow of blood between the vein and the tube is the basis of good sample collection. Factors that influence the flow of blood include needle gauge, force of suction, size and quality of the vein, and the device used for collection. Direct transfer into vacutainer tubes will control the flow from the needle into the tube, minimizing hemolysis during transfer. When drawing blood into a syringe, the force of suction should be minimal. A good tip is to pull the syringe back and fill it a bit at a time to control the flow. The same goes for transferring from syringes into collection tubes: allowing the vacutainer to pull the blood into the tube—rather than pushing—maintains appropriate pressure.
Q: How should laboratories examine samples to identify hemolysis?
A: Automated hemolysis index measurements on chemistry analyzers are fast and very reliable at detecting the presence—and relative quantification—of hemoglobin in a sample. Compared to visual examination, automation is more sensitive and reproducible in detecting the presence of hemoglobin and distinguishing it from similarly colored interferents, such as bilirubin. Significantly, automation also allows direct electronic communication to the laboratory information system.
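
As a rough illustration of how automated flagging can work, the sketch below maps a hemolysis index value to a triage action and a comment that could be pushed to the LIS. The index cutoffs, grades, and affected analytes here are hypothetical placeholders; real cutoffs are analyte- and analyzer-specific and come from manufacturer interference claims or local validation.

```python
# Hypothetical hemolysis-index triage; all cutoffs are illustrative only.
def triage_hemolysis(h_index: float) -> dict:
    """Map a hemolysis index (approx. free hemoglobin, mg/dL) to an action."""
    if h_index < 50:
        return {"grade": "none", "action": "report all results"}
    if h_index < 200:
        return {"grade": "mild",
                "action": "report with comment; suppress potassium, LDH, AST"}
    return {"grade": "gross",
            "action": "cancel affected tests; request recollection"}

for h in (20, 120, 450):
    print(h, triage_hemolysis(h))
```
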
How hemolyzed is too hemolyzed? Read more here

Defining Critical Value Lists and Limits: How can labs balance efficiency and patient safety?


An interview with Christine Schmotzer, MD
Defining, identifying, and rapidly communicating critical values is essential to the quality of care. But as the workload in clinical laboratories continues to increase and physicians face information overload, laboratories are forced to be more efficient without compromising patient safety. In this interview, Christine Schmotzer, MD, discusses how to design a critical value list and steps that labs can take to balance efficiency with patient safety. Schmotzer is the medical director of clinical chemistry at University Hospitals Case Medical Center and an assistant professor of pathology at Case Western Reserve University School of Medicine in Cleveland, Ohio.
Jaime Noguez, PhD, of the Patient Safety Focus editorial board conducted this interview.
Q What is the best strategy for establishing a critical value list and limits?
A Despite the importance of critical values in patient care and the emphasis on effective communication of these results in the past decade, there is no widely accepted guideline for defining which analytes should be on a critical value list and how the cutoffs should be assigned. Developing a critical value list remains at the discretion of each institution. In practice, a common group of tests—including glucose, potassium, hemoglobin, hematocrit, and platelets—appears on the critical value list of nearly every institution. The specific values for these commonly covered analytes, as well as other analytes that should be included beyond the common ones, vary considerably between institutions. The best strategy for your lab is to use all available data to guide your decision. This includes published literature, peer comparisons, local institutional data—especially the populations being served—and the local opinion and consensus of clinicians working at your institution.
Ideally, labs would use outcomes literature to determine cutoffs at which a specific analyte value becomes life-threatening if an intervention is not taken. But outcomes literature is limited, due in part to the challenges of obtaining broadly applicable data in varied patient populations. A number of surveys and institutional case studies have been published on this topic emphasizing the institutional variability of critical value lists/cutoffs and the lack of a well-defined mechanism for establishing them (1–4). The availability of these surveys and studies allows laboratories to compare their lists to others and provides insight into whether your institution is over- or under-restrictive in critical value calling. Survey data should be applied with caution, as it may not be current and may not be a suitable match to your patient population.
Achieving Balance in Critical Value Policies
• Optimize critical value limits
• Remove tests that do not meet "life-threatening" criteria
• Discontinue repeat calls for select analytes with previous criticals
• Discontinue calls to units where "critical" result is expected
• Include regular review of critical value policies and data in test utilization management committee meetings
Another approach to improving your critical value list is specific peer-to-peer comparison. Peer comparison enables a lab to select institutions with a similar patient mix and population complexity, which may lead to critical value lists that are more directly transferable or comparable to your own institution's. Peer comparison has been enabled in the last decade by the widespread availability of current institutional critical value lists and cutoffs on the Internet (5–7). These are provided by national labs, university-based labs, and other hospital labs. In general, it is relatively easy to find a peer, either through the Internet or your professional network.
Regardless of your initial approach to data-gathering, developing a critical value list and cutoffs should include discussion among clinicians, nurses, laboratory directors, and staff representing various departments and specialties. It is in this setting that institution-specific practices and needs can be discussed and influence the critical value list. For example, if an institution performs all blood gases at the point-of-care rather than in a central laboratory, it may not need pH or pO2 to be included on a critical value list. Without specific outcomes literature to guide decisions, institutional and personal experience can be solid guides to setting critical value cutoffs that best meet the needs and philosophy of an institution.
Q Are there any other factors that need to be considered when designing your critical value list?
A While literature review, peer comparisons, and consulting with your physicians are important, assessing your current state, including critical result distribution, call frequency, and reporting logistics, can provide insight into opportunities for improving your critical call list and process. Determining the tests leading to the highest number of calls and the units receiving the most calls can lead to valuable insights. For example, we were surprised to find critical vancomycin levels were in our top 10 most called tests. Further exploration led to practice changes to enhance the relationship between time of draw and drug administration, as well as discussion on whether abnormal vancomycin levels met the definition of a critical—immediately life-threatening—value.

An important but often overlooked factor in successful critical value policies and procedures is the capability of your laboratory information system (LIS) to help you identify and flag critical results. Many LISs don't have the ability to assign unit-specific flags. For example, clinical consensus at your institution may show that the threshold for critically low potassium can be different for inpatients versus outpatients. If your LIS does not allow for different critical results based on inpatient or outpatient status, the critical result will likely be set at the most conservative cutoff.
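
To make the LIS limitation concrete, here is a minimal sketch of the kind of unit-aware critical-value check described above. The analytes, patient classes, and cutoffs are invented for illustration, not recommendations; an LIS without class-specific limits would, as noted, have to apply the most conservative set to everyone.

```python
# Illustrative unit-aware critical-value check; all limits are hypothetical.
CRITICAL_LIMITS = {
    # analyte: {patient_class: (low, high)}
    "potassium_mmol_L": {"inpatient": (2.8, 6.2), "outpatient": (3.0, 6.0)},
    "glucose_mg_dL":    {"inpatient": (40, 450),  "outpatient": (50, 400)},
}

def is_critical(analyte: str, value: float, patient_class: str) -> bool:
    """True if the result falls outside the class-specific limits."""
    low, high = CRITICAL_LIMITS[analyte][patient_class]
    return value < low or value > high

print(is_critical("potassium_mmol_L", 2.9, "inpatient"))   # False
print(is_critical("potassium_mmol_L", 2.9, "outpatient"))  # True
```
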
Q How can labs improve their critical values notification efficiency without compromising patient safety? Read more here

The Clinical Laboratory's Role in Preventing False-Negative hCG Point-of-Care Results


False-negative pregnancy tests in the emergency department (ED) can have serious consequences—including birth defects or loss of pregnancy—if pregnant women are subjected to certain treatments potentially harmful to a fetus. While in vitro device manufacturers and ED clinicians must do their part to help minimize the frequency of false-negative test results, laboratory personnel ultimately are responsible for the accuracy of point-of-care (POC) testing. To minimize false-negative qualitative human chorionic gonadotropin (hCG) test results, laboratorians must understand the limitations of POC hCG devices, utilize strategies to investigate results that are inconsistent with the clinical presentation, and recommend alternate testing to help establish a definitive diagnosis.
Measurement of hCG represents an analytical challenge, as the range of hCG concentrations associated with normal pregnancy spans from 0 IU/L immediately following conception to approximately 200,000 IU/L by weeks 8 to 10. False-negative urine hCG results could be encountered for a number of different reasons. Negative results are common in very early pregnancy, when hCG concentrations in urine likely are below a device's limit of detection.
False-negative results may also occur due to the hook effect, a phenomenon characterized by a pathologically high concentration of intact hCG that saturates all available binding sites and prevents an antibody-hCG-antibody sandwich from forming. In normal pregnancy, intact hCG concentrations are not sufficiently elevated to cause a hook effect. However, intact hCG is not the only variant observed in normal pregnancy, and hCG POC devices may either detect these other variants or be subject to interference from them. One such variant, hCG β core fragment (hCGβcf), is present at 10-fold higher concentrations than intact hCG in urine beginning at around week 6 of pregnancy. As a degradation product formed during renal filtration, hCGβcf is found exclusively in urine. Of particular interest to laboratory personnel, false-negative POC hCG results have been documented in women with high urinary concentrations of hCGβcf. Read more here
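
One widely used way to investigate a negative POC result that conflicts with the clinical picture is to retest the urine at dilution: because hCGβcf and hook-type excess suppress the signal at high concentration, a diluted specimen can paradoxically turn positive. The sketch below encodes that decision logic; the 1:10 dilution factor and the wording of the recommendations are illustrative, not a validated protocol.

```python
# Illustrative work-up logic for a suspected false-negative urine hCG.
from typing import Optional

def investigate_negative_hcg(poc_negative: bool,
                             pregnancy_suspected: bool,
                             diluted_retest_positive: Optional[bool] = None) -> str:
    if not (poc_negative and pregnancy_suspected):
        return "No discrepancy to investigate."
    if diluted_retest_positive is None:
        return "Retest urine at 1:10 dilution (hypothetical factor)."
    if diluted_retest_positive:
        return ("Dilution turned positive: suspect hCG beta-core fragment or "
                "hook-type interference; confirm with quantitative serum hCG.")
    return "Still negative on dilution: send quantitative serum hCG."

print(investigate_negative_hcg(True, True))
print(investigate_negative_hcg(True, True, diluted_retest_positive=True))
```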

Obesity accelerates aging of the liver, researchers find using novel biological aging clock

Using a recently developed biomarker of aging known as an epigenetic clock, researchers have found, for the first time, that obesity greatly accelerates aging of the liver. "Given the obesity epidemic in the Western world, the results of this study are highly relevant for public health," the lead investigator said. 
Read more here

Monday 1 September 2014

Energy drinks cause heart problems, study suggests


Energy drinks can cause heart problems, according to research presented at ESC Congress 2014 today by Professor Milou-Daniel Drici from France.

Professor Drici said: "So-called 'energy drinks' are popular in dance clubs and during physical exercise, with people sometimes consuming a number of drinks one after the other. This situation can lead to a number of adverse conditions including angina, cardiac arrhythmia (irregular heartbeat) and even sudden death."

He added: "Around 96% of these drinks contain caffeine, with a typical 0.25 litre can holding 2 espressos worth of caffeine. Caffeine is one of the most potent agonists of the ryanodine receptors and leads to a massive release of calcium within cardiac cells. This can cause arrhythmias, but also has effects on the heart's abilities to contract and to use oxygen. In addition, 52% of drinks contain taurine, 33% have glucuronolactone and two-thirds contain vitamins."

He added: "Patients with cardiac conditions including catecholaminergic arrhythmias, long QT syndrome and angina should be aware of the potential danger of a large intake of caffeine, which is a stimulant that can exacerbate their condition with possibly fatal consequences."


He concluded: "Patients rarely mention consumption of energy drinks to their doctors unless they are asked. Doctors should warn patients with cardiac conditions about the potential dangers of these drinks and ask young people in particular whether they consume such drinks on a regular basis or through binge drinking."

HEMOZOIN: A new way to diagnose malaria

Red blood cells from a patient infected with Plasmodium falciparum.
Over the past several decades, malaria diagnosis has changed very little. After taking a blood sample from a patient, a technician smears the blood across a glass slide, stains it with a special dye, and looks under a microscope for the Plasmodium parasite, which causes the disease. This approach gives an accurate count of how many parasites are in the blood -- an important measure of disease severity -- but is not ideal because there is potential for human error.

A research team from the Singapore-MIT Alliance for Research and Technology (SMART) has now come up with a possible alternative. The researchers have devised a way to use magnetic resonance relaxometry (MRR), a close cousin of magnetic resonance imaging (MRI), to detect a parasitic waste product in the blood of infected patients.

The new SMART system detects a parasitic waste product called hemozoin. When the parasites infect red blood cells, they feed on the nutrient-rich hemoglobin carried by the cells. As hemoglobin breaks down, it releases iron, which can be toxic, so the parasite converts the iron into hemozoin -- a weakly paramagnetic crystallite.

Those crystals interfere with the normal magnetic spins of hydrogen atoms. When exposed to a powerful magnetic field, hydrogen atoms align their spins in the same direction. When a second, smaller field perturbs the atoms, they should all change their spins in synchrony -- but if another magnetic particle, such as hemozoin, is present, this synchrony is disrupted through a process called relaxation. The more magnetic particles are present, the more quickly the synchrony is disrupted.
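
In relaxometry terms, "how quickly the synchrony is disrupted" is the transverse relaxation rate R2 = 1/T2, which rises with the concentration of paramagnetic hemozoin. As a hedged illustration only, the sketch below fits a single-exponential decay to simulated echo amplitudes and reports the fitted rate; the numbers are made up, and the analysis in the actual paper is more involved.

```python
# Toy T2-relaxometry fit: signal(t) = A * exp(-t / T2).
# All values simulated; a shorter T2 (higher R2) suggests more hemozoin.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, t2):
    return amplitude * np.exp(-t / t2)

t = np.linspace(0.001, 0.5, 50)                     # echo times, seconds
signal = decay(t, 1.0, 0.12) + np.random.normal(0, 0.01, t.size)

(amplitude, t2), _ = curve_fit(decay, t, signal, p0=(1.0, 0.1))
print(f"fitted T2 = {t2 * 1000:.0f} ms, R2 = {1 / t2:.1f} per second")
```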

Hemozoin crystals are produced in all four stages of malaria infection, including the earliest stages, and are generated by all known species of the Plasmodium parasite. Also, the amount of hemozoin can reveal how severe the infection is, or whether it is responding to treatment. There are many scenarios where a quantitative count is more useful than a simple yes-or-no answer.

In their paper, the researchers showed that they could detect Plasmodium falciparum, the most dangerous form of the parasite, in blood cells grown in the lab. They also detected the parasite in red blood cells from mice infected with Plasmodium berghei.

The researchers are launching a company to make this technology available at an affordable price. The team is also running field tests in Southeast Asia and is exploring powering the device with solar energy, an important consideration for poor rural areas.

Journal Reference:
  1. Weng Kung Peng, Tian Fook Kong, Chee Sheng Ng, Lan Chen, Yongxue Huang, Ali Asgar S Bhagat, Nam-Trung Nguyen, Peter Rainer Preiser, Jongyoon Han. Micromagnetic resonance relaxometry for rapid label-free malaria diagnosis. Nature Medicine, 2014; DOI: 10.1038/nm.3622


Sunday 31 August 2014

Leading Ebola researcher says there's an effective treatment for Ebola

A leading U.S. Ebola researcher from the University of Texas Medical Branch at Galveston has gone on record stating that a blend of three monoclonal antibodies can completely protect monkeys against a lethal dose of Ebola virus up to 5 days after infection, at a time when the disease is severe.

Thomas Geisbert, professor of microbiology and immunology, has written an editorial for Nature discussing advances in Ebola treatment research. The filoviruses known as Ebola virus and Marburg virus are among the most deadly of pathogens, with fatality rates of up to 90 percent.

Since the discovery of Ebola in 1976, researchers have been actively working on treatments to combat infection. Studies over the past decade have uncovered three treatments that offer partial protection for monkeys against Ebola when given within an hour of virus exposure. One of these treatments, a VSV-based vaccine, was used in 2009 to treat a laboratory worker in Germany shortly after she was accidentally stuck with a needle possibly contaminated by an Ebola-infected animal.

Further advances have been made that can completely protect monkeys against Ebola using small 'interfering' RNAs and various combinations of antibodies. But these treatments need to be given within two days of Ebola exposure.

"So although these approaches are highly important and can be used to treat known exposures, the need for treatments that can protect at later times after infection was paramount," said Geisbert.
Further research led to a cocktail of monoclonal antibodies that protected 43% of monkeys when given as late as five days after Ebola exposure, at a time when the clinical signs of the disease are showing.

The new study from Qiu and colleagues at MAPP Biopharmaceutical Inc. used ZMapp to treat monkeys given a lethal dose of Ebola. All of the animals survived and did not show any evidence of the virus in their systems 21 days after infection, even after receiving the treatment 5 days after infection. They also showed that ZMapp inhibits replication of the Ebola virus in cell culture.

ZMapp has been used to treat several patients on compassionate grounds. Of these, two US healthcare workers have recovered, but whether ZMapp had any effect is unknown, as 45% of patients in this outbreak survive without treatment. There were also two patients treated with ZMapp who did not survive, but this may be because the treatment was started too late in the disease course.

"The diversity of strains and species of the Ebola and Marburg filoviruses is an obstacle for all candidate treatments," said Geisbert. "Treatments that may protect against one species of Ebola will probably not protect against a different species of the virus, and may not protect against a different strain within the species."

Although we certainly need treatments for filovirus infections, the most effective way to manage and control future outbreaks might be through vaccines, some of which have been designed to protect against multiple species and strains. During outbreaks, single-injection vaccines are needed to ensure rapid use and protection. At least five preventative vaccines have been reported to completely protect monkeys against Ebola and Marburg infection. But only the VSV-based vaccines have been shown to completely protect monkeys against Ebola after a single injection.

"Antibody therapies and several other strategies should be included in the arsenal of interventions for controlling future Ebola outbreaks," said Geisbert. "Although ZMAPP in particular has been administered for compassionate use, the next crucial step will be to formally assess its safety and effectiveness."

Journal Reference:
  1. Thomas W. Geisbert. Medical research: Ebola therapy protects severely ill monkeys. Nature, 2014; DOI: 10.1038/nature13746

Surprising discovery: HIV hides in gut, evading eradication

Researchers at UC Davis have made some surprising discoveries about the body's initial responses to HIV infection. Studying simian immunodeficiency virus (SIV), the team found that specialized cells in the intestine called Paneth cells are early responders to viral invasion and are the source of gut inflammation by producing a cytokine called interleukin-1 beta (IL-1β).
Though aimed at the virus, IL-1β causes breakdown of the gut epithelium that provides a barrier to protect the body against pathogens. Importantly, this occurs prior to widespread viral infection and immune cell killing. But in an interesting twist, a beneficial bacterium, Lactobacillus plantarum, helps mitigate the virus-induced inflammatory response and protect the gut epithelial barrier. The study was published in the journal PLoS Pathogens.

One of the biggest obstacles to complete viral eradication and immune recovery is the stable HIV reservoir in the gut. There is very little information about the early viral invasion and the establishment of the gut reservoir.

"We want to understand what enables the virus to invade the gut, cause inflammation and kill the immune cells," said Satya Dandekar, lead author of the study and chair of the Department of Medical Microbiology and Immunology at UC Davis.

"Our study has identified Paneth cells as initial virus sensors in the gut that may induce early gut inflammation, cause tissue damage and help spread the viral infection. Our findings provide potential targets and new biomarkers for intervening or blocking early spread of viral infection," she said.

In the study, the researchers detected a very small number of SIV-infected cells in the gut within the first 2.5 days of viral infection; however, the inflammatory response to the virus was playing havoc with the gut lining. IL-1β was reducing the production of tight-junction proteins, which are crucial to making the intestinal barrier impermeable to pathogens. As a result, the normally cohesive barrier was breaking down.

Digging deeper, the researchers found the inflammatory response through IL-1β production was initiated in Paneth cells, which are known to protect the intestinal stem cells that replenish the epithelial lining. This is the first report of Paneth cell sensing of SIV infection and IL-1β production linked to gut epithelial damage during early viral invasion. In turn, the epithelial breakdown underscores that there's more to the immune response than immune cells.

"The epithelium is more than a physical barrier," said first author Lauren Hirao. "It's providing support to immune cells in their defense against viruses and bacteria."

The researchers found that addition of a specific probiotic strain, Lactobacillus plantarum, to the gut reversed the damage by rapidly reducing IL-1β, resolving inflammation, and accelerating repair within hours. The study points to interesting possibilities of harnessing synergistic host-microbe interactions to intervene in early viral spread and gut inflammation and to mitigate intestinal complications associated with HIV infection.

"Understanding the players in the immune response will be important to develop new therapies," said Hirao. "Seeing how these events play out can help us find the most opportune moments to intervene."

Journal Reference:
  1. Lauren A. Hirao, Irina Grishina, Olivier Bourry, William K. Hu, Monsicha Somrit, Sumathi Sankaran-Walters, Chris A. Gaulke, Anne N. Fenton, Jay A. Li, Robert W. Crawford, Frank Chuang, Ross Tarara, Maria L. Marco, Andreas J. Bäumler, Holland Cheng, Satya Dandekar. Early Mucosal Sensing of SIV Infection by Paneth Cells Induces IL-1β Production and Initiates Gut Epithelial Disruption. PLoS Pathogens, 2014; 10 (8): e1004311 DOI: 10.1371/journal.ppat.1004311

Wednesday 27 August 2014

New smartphone app can detect newborn jaundice in minutes

Newborn jaundice: It's one of the last things a parent wants to deal with, but it's unfortunately a common condition in babies less than a week old.

Skin that turns yellow can be a sure sign that a newborn is jaundiced and isn't adequately eliminating the chemical bilirubin. But that discoloration is sometimes hard to see, and severe jaundice left untreated can harm a baby.

University of Washington engineers and physicians have developed a smartphone application that checks for jaundice in newborns and can deliver results to parents and pediatricians within minutes. It could serve as a screening tool to determine whether a baby needs a blood test -- the gold standard for detecting high levels of bilirubin. 

"Virtually every baby gets jaundiced, and we're sending them home from the hospital even before bilirubin levels reach their peak," said James Taylor, a UW professor of pediatrics and medical director of the newborn nursery at UW Medical Center. "This smartphone test is really for babies in the first few days after they go home. A parent or health care provider can get an accurate picture of bilirubin to bridge the gap after leaving the hospital."
The research team will present its results at the Association for Computing Machinery's International Joint Conference on Pervasive and Ubiquitous Computing in September in Seattle.
The app, called BiliCam, uses a smartphone's camera and flash and a color calibration card the size of a business card. A parent or health care professional would download the app, place the card on her baby's belly, then take a picture with the card in view. The card calibrates and accounts for different lighting conditions and skin tones. Data from the photo are sent to the cloud and are analyzed by machine-learning algorithms, and a report on the newborn's bilirubin levels is sent almost instantly to the parent's phone.
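
BiliCam's actual machine-learning pipeline is not published in detail here, so the sketch below only illustrates the general idea of the two stages described above: use the card's known patch colors to color-correct the photo, then map the corrected skin color to a bilirubin estimate with a previously fitted regression. Every number in it is a placeholder.

```python
# Illustrative two-stage pipeline (not the authors' actual algorithm):
# 1) color-correct the skin patch using the calibration card,
# 2) regress the corrected color onto a bilirubin estimate.
import numpy as np

def color_correct(skin_rgb, card_measured, card_reference):
    """Fit a 3x3 linear map from measured card colors to their known
    reference values, then apply the same map to the skin patch."""
    M, *_ = np.linalg.lstsq(card_measured, card_reference, rcond=None)
    return skin_rgb @ M

# Hypothetical data: six card patches, measured vs. reference RGB (0-1).
card_measured = np.random.rand(6, 3)
card_reference = np.clip(card_measured * 1.1 - 0.02, 0, 1)
skin = np.array([0.80, 0.70, 0.40])

corrected = color_correct(skin, card_measured, card_reference)
weights, intercept = np.array([2.0, -1.0, 6.0]), 1.5   # pretend fitted model
print(f"estimated bilirubin: {corrected @ weights + intercept:.1f} mg/dL")
```
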
"This is a way to provide peace of mind for the parents of newborns," said Shwetak Patel, a UW associate professor of computer science and engineering and of electrical engineering. "The advantage of doing the analysis in the cloud is that our algorithms can be improved over time."
A noninvasive jaundice screening tool is available in some hospitals and clinics, but the instrument costs several thousand dollars and isn't feasible for home use. Currently, both doctors and parents assess jaundice by looking for the yellow color in a newborn's skin, but this visual assessment is only moderately accurate. The UW team developed BiliCam to be easy to use and affordable for both clinicians and parents, especially during the first several days after birth when it's crucial to check for jaundice.
Jaundice, or the yellowing of the skin, can happen when an excess amount of bilirubin collects in the blood. Bilirubin is a natural byproduct of the breakdown of red blood cells, which the liver usually metabolizes. But newborns often metabolize bilirubin more slowly because their livers aren't yet fully functioning. If left untreated, severe jaundice can cause brain damage and a potentially fatal condition called kernicterus.
The UW team ran a clinical study with 100 newborns and their families at UW Medical Center. They tested the babies with a blood test, the current transcutaneous screening tool used in hospitals, and BiliCam when the newborns were between two and five days old. They found that BiliCam performed as well as or better than the current screening tool. Though it wouldn't replace a blood test, BiliCam could let parents know if they should take that next step.
"BiliCam would be a significantly cheaper and more accessible option than the existing reliable screening methods," said Lilian de Greef, lead author and a UW doctoral student in computer science and engineering. "Lowering the access barrier to medical applications can have profound effects on patients, their caregivers and their doctors, especially for something as prevalent as newborn jaundice."
The researchers plan to test BiliCam on up to 1,000 additional newborns, especially those with darker skin pigments. The algorithms will then be robust enough to account for all ethnicities and skin colors. This could make BiliCam a useful tool for parents and health care workers in developing countries where jaundice accounts for many newborn deaths.
"We're really excited about the potential of this in resource-poor areas, something that can make a difference in places where there aren't tools to measure bilirubin but there's good infrastructure for mobile phones," Taylor said.
Within a year, the researchers say BiliCam could be used by doctors as an alternative to the current screening procedures for bilirubin. They have filed patents on the technology, and within a couple of years hope to have Food and Drug Administration approval for the BiliCam app so that parents can use it at home on their smartphones.
Related research paper can be found at: http://homes.cs.washington.edu/~mayank/BiliCam.pdf


source: www.sciencedaily.com 

Sunday 24 August 2014

Rapid Serological Assay Developed for Strongyloidiasis


Image: The adult free-living female Strongyloides stercoralis with a row of eggs within the body of the nematode (Photo courtesy of the CDC - Centers for Disease Control and Prevention).
Several imperfect methods exist for diagnosing strongyloidiasis; stool examination with microscopic identification of larvae is considered the gold-standard diagnostic procedure, showing good specificity in experienced hands.

Individuals with strongyloidiasis are typically asymptomatic, and the infection can persist for decades without detection. Problems arise when individuals with unrecognized Strongyloides stercoralis infection are immunosuppressed, which can lead to hyper-infection syndrome and disseminated disease with an associated high mortality if untreated.

An international team of scientists led by those at McGill University (Montreal, QC, Canada) obtained 54 positive serum samples, confirmed by positive stool samples for S. stercoralis, from multiple reference laboratories. The 47 negative control samples consisted of sera obtained from healthy individuals residing in Canada with no prior history of travel outside of Canada, and from individuals with a confirmed diagnosis of other parasitic infections, including trichinosis; all were negative for Strongyloides by an "in-house" enzyme-linked immunoassay (ELISA) (NRCP).

The team developed a rapid and sensitive serodiagnostic assay for strongyloidiasis based on a 31-kDa recombinant antigen from S. stercoralis (NIE) using a novel diffraction-based optical biosensor technology. The panelPlus oligonucleotide-based addressing system was used for NIE immobilization onto dotLab Sensors, and all assays were performed on the dotLab mX System using panelPlus D Sensors (Axela, Inc.; Toronto, ON, Canada). All serum samples were also tested by an NIE ELISA that had been developed and validated.

The assay readily differentiated S. stercoralis-infected patients from controls, detecting 96.3% of the positive cases with no cross-reactivity observed in the control group. These results were in excellent agreement with those obtained by an NIE-based ELISA. A further 44 sera from patients with suspected S. stercoralis infection were analyzed and showed 91% agreement with the NIE ELISA. The novel, high-sensitivity diffractive optics technology (dot) platform generated results in less than 30 minutes and is fully automated, requiring minimal user intervention. This makes it potentially attractive for near-patient testing and for use in regions where technical expertise or adequate laboratory facilities may not be available.
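
Those headline figures follow directly from the sample counts: 96.3% sensitivity corresponds to 52 of the 54 stool-confirmed positives (the 52/54 split is inferred from the reported percentage, not stated explicitly above), and the absence of cross-reactivity to 47 of 47 negatives.

```python
# Back-of-envelope check of the reported performance figures.
true_pos, total_pos = 52, 54     # 52/54 inferred from the 96.3% figure
true_neg, total_neg = 47, 47     # no cross-reactivity reported

print(f"sensitivity = {true_pos / total_pos:.1%}")      # 96.3%
print(f"specificity = {true_neg / total_neg:.1%}")      # 100.0%
print(f"agreement on suspected cases = {40 / 44:.0%}")  # ~91% (40/44 assumed)
```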

The authors concluded that with the ability to create custom multiplex assays using the panelPlus oligonucleotide-based addressing system, the dotLab mX System could also be used further to improve Strongyloides serodiagnostics by incorporating multiple recombinant antigens in a multiplex format or by simultaneously screening for clinically relevant co-infections such as Human T-cell lymphotropic virus type 1 (HTLV-1).

The study was published on August 7, 2014, in the journal Public Library of Science Neglected Tropical Diseases.

source : www.labmedica.com

Amount, types of fat we eat affect health, risk of disease

Healthy adults should consume between 20 percent and 35 percent of their calories from dietary fat, increase their consumption of omega-3 fatty acids, and limit their intake of saturated and trans fats, according to an updated position paper from the Academy of Nutrition and Dietetics.

The position paper "Dietary Fatty Acids for Healthy Adults" has been published in the January issue of the Journal of the Academy of Nutrition and Dietetics. The position paper provides guidance for registered dietitian nutritionists and dietetic technicians, registered, to translate research on fat and fatty acids into practical dietary recommendations for consumers.

The Academy's updated position is: It is the position of the Academy of Nutrition and Dietetics that dietary fat for the healthy adult population should provide 20 percent to 35 percent of energy, with an increased consumption of n-3 polyunsaturated fatty acids and limited intake of saturated and trans fats. The Academy recommends a food-based approach through a diet that includes regular consumption of fatty fish, nuts and seeds, lean meats and poultry, low-fat dairy products, vegetables, fruits, whole grains and legumes.

Registered dietitian nutritionists can help consumers understand that a total diet approach is more beneficial than simply reducing dietary fat and replacing it with carbohydrates, as a high intake of refined carbohydrate can also negatively affect health.
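
As a concrete example of the headline recommendation, the 20 to 35 percent range can be converted into grams of fat per day. The 2,000-kilocalorie intake below is an assumption for illustration; fat provides roughly 9 kcal per gram.

```python
# Convert the 20-35% of energy guideline into grams of fat per day.
KCAL_PER_G_FAT = 9           # standard Atwater factor for fat
daily_kcal = 2000            # assumed intake, for illustration only

low = 0.20 * daily_kcal / KCAL_PER_G_FAT
high = 0.35 * daily_kcal / KCAL_PER_G_FAT
print(f"{low:.0f}-{high:.0f} g of fat per day")   # about 44-78 g
```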

The Academy's position paper can be translated into healthful eating messages for the public:
• A simple and effective way to improve health is to eat more fish, nuts and seeds and to consume fewer desserts and convenience foods.
• Fat is a critical nutrient, and certain types of fat, such as omega-3s and omega-6s, are needed for good health. For this and other health reasons, a fat-free diet is not recommended.
• Fish is an excellent source of the omega-3s EPA and DHA; flax, walnuts and canola oil are good sources of ALA omega-3.
• Both the amount of fat (how much) and the type of fat (what foods) in the diet can affect health and risk of disease.
• Different foods provide different types of fat. Some fats improve your health (omega-3s help your heart and brain) while some are detrimental to your health (trans fat increases heart disease risk factors).


Journal Reference:
  1. Gretchen Vannice, Heather Rasmussen. Position of the Academy of Nutrition and Dietetics: Dietary Fatty Acids for Healthy Adults. Journal of the Academy of Nutrition and Dietetics, 2014; 114 (1): 136 DOI: 10.1016/j.jand.2013.11.001

Low birth weight linked to higher incidence of type 2 diabetes in African American women

African American women born at a low or very low birth weight may be at a higher risk for developing type 2 diabetes. The findings, which appear in Diabetes Care, may explain in part the higher occurrence of type 2 diabetes in African American populations, which has a high prevalence of low birth weight.

Researchers from Boston University's Slone Epidemiology Center followed more than 21,000 women enrolled in the Black Women's Health Study over the course of 16 years, analyzing characteristics such as birth weight, current age, family history of diabetes, body mass index, physical activity and socioeconomic status.

The study results indicate that women with low birth weight had a 13 percent higher chance of developing type 2 diabetes than those with normal birth weight, and those with very low birth weight had a 40 percent higher chance of developing the disease. Low birth weight was defined as less than 2.5 kg, and very low birth weight as less than 1.5 kg. It appeared that body size did not play a role in this relationship as there was a clear association between birth weight and diabetes even for women who were not obese.

Although previous studies have shown that birth characteristics such as birth weight can have a major impact on adult health, this is the first large-scale study to demonstrate this effect in an African American population.
"African American women are at increased risk of developing type 2 diabetes, and also have higher rates of low birth weight than white women," said Edward Ruiz-Narváez, ScD, assistant professor of epidemiology at Boston University School of Public Health. "Our study shows a clear relationship between birth weight and diabetes that highlights the importance of further research for this at-risk group."

According to the researchers, there are two leading hypotheses for the phenomenon. The first, known as the "thrifty phenotype hypothesis," states that once the newborn body perceives that it lacks nutrition, it reprograms itself to absorb more nutrition, causing an imbalance in metabolism that eventually leads to type 2 diabetes. The second, known as the "fetal insulin hypothesis," states that genes that are responsible for impaired insulin secretion also have a negative effect on birth weight. Some of these genes have been discovered in recent studies, supporting the latter hypothesis.

Journal Reference:
  1. E. A. Ruiz-Narvaez, J. R. Palmer, H. Gerlovin, L. A. Wise, V. G. Vimalananda, J. L. Rosenzweig, L. Rosenberg. Birth Weight and Risk of Type 2 Diabetes in the Black Women's Health Study: Does Adult BMI Play a Mediating Role? Diabetes Care, 2014; 37 (9): 2572 DOI: 10.2337/dc14-0731

Overweight and obesity linked to 10 common cancers, over 12,000 cases every year in UK

A higher body mass index (BMI) increases the risk of developing 10 of the most common cancers, the largest study of its kind on BMI and cancer shows.
UK researchers at the London School of Hygiene & Tropical Medicine and the Farr Institute of Health Informatics estimate that over 12,000 cases of these 10 cancers each year are attributable to being overweight or obese, and calculate that if average BMI in the population continues to increase, there could be over 3,500 extra cancers every year as a result.
"The number of people who are overweight or obese is rapidly increasing both in the UK and worldwide. It is well recognised that this is likely to cause more diabetes and cardiovascular disease. Our results show that if these trends continue, we can also expect to see substantially more cancers as a result"*, said study leader Dr Krishnan Bhaskaran, National Institute for Health Research Postdoctoral Fellow, from the London School of Hygiene & Tropical Medicine, London, UK.

Using data from general practitioner records in the UK's Clinical Practice Research Datalink (CPRD), the researchers identified 5·24 million individuals aged 16 and older who were cancer-free and had been followed for an average of 7·5 years. The risk of developing 22 of the most common cancers, which represent 90% of the cancers diagnosed in the UK, was measured according to BMI after adjusting for individual factors such as age, sex, smoking status, and socioeconomic status.
A total of 166,955 people developed one of the 22 cancers studied over the follow-up period. BMI was associated with 17 out of the 22 specific types of cancer examined.

Each 5 kg/m² increase in BMI was clearly linked with higher risk of cancers of the uterus (62% increase), gallbladder (31%), kidney (25%), cervix (10%), thyroid (9%), and leukemia (9%). Higher BMI also increased the overall risk of liver (19% increase), colon (10%), ovarian (9%), and breast cancers (5%), but the effects on these cancers varied by underlying BMI and by individual-level factors such as sex and menopausal status. Even within normal BMI ranges, higher BMI was associated with increased risk of some cancers.
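
Because these hazard ratios are reported per 5 kg/m² increase, they can be extrapolated to other BMI differences if one assumes the log-linear dose-response implied by such models holds; the example below applies that assumption to the uterine cancer figure.

```python
# Extrapolate a per-5-kg/m2 hazard ratio to another BMI difference,
# assuming a log-linear dose-response (an assumption, not a study result).
def hazard_ratio(hr_per_5: float, delta_bmi: float) -> float:
    return hr_per_5 ** (delta_bmi / 5.0)

# Uterine cancer: 62% higher risk per 5 kg/m2, i.e. HR 1.62 per 5 units.
print(f"HR for +10 kg/m2: {hazard_ratio(1.62, 10):.2f}")   # about 2.62
```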

There was some evidence that those with high BMI were at a slightly reduced risk of prostate cancer and premenopausal breast cancer.

Dr Bhaskaran explained, "There was a lot of variation in the effects of BMI on different cancers. For example, risk of cancer of the uterus increased substantially at higher body mass index; for other cancers, we saw more modest increases in risk, or no effect at all. For some cancers like breast cancer occurring in younger women before the menopause, there even seemed to be a lower risk at higher BMI. This variation tells us that BMI must affect cancer risk through a number of different processes, depending on the cancer type."
Based on the results, the researchers estimate that excess weight could account for 41% of uterine and 10% or more of gallbladder, kidney, liver, and colon cancers in the UK. They also estimate that a population-wide 1 kg/m² increase in average BMI (roughly an extra 3 to 4 kg, or 8 to 10 pounds, per adult), which would occur every 12 years or so based on recent trends, would result in an additional 3790 cases of these 10 cancers in the UK each year.

Writing in a linked Comment, Dr Peter Campbell from the American Cancer Society, Atlanta, USA, says, "We have sufficient evidence that obesity is an important cause of unnecessary suffering and death from many forms of cancer…More research is not needed to justify, or even demand, policy changes aimed at curbing overweight and obesity. Some of these policy strategies have been enumerated recently, all of which focus on reducing caloric intake or increasing physical activity, and include taxes on calorically dense, nutritionally sparse foods (eg, sugar-sweetened beverages); subsidies for healthier foods, especially in economically disadvantaged groups; agricultural policy changes; and urban planning aimed at encouraging walking and other modes of physical activity. Research strategies that identify population-wide or community-based interventions and policies that effectively reduce overweight and obesity should be particularly encouraged and supported. Moreover, we need a political environment, and politicians with sufficient courage, to implement such policies effectively."


Journal Reference:
  1. Krishnan Bhaskaran, Ian Douglas, Harriet Forbes, Isabel dos-Santos-Silva, David A Leon, Liam Smeeth. Body-mass index and risk of 22 specific cancers: a population-based cohort study of 5·24 million UK adults. The Lancet, 2014; DOI: 10.1016/S0140-6736(14)60892-8

Monday 4 August 2014

Malaria vaccine shows continued protection during 18 months of follow-up

A vaccine previously shown to reduce malaria in young infants and children prevents greater numbers of malaria cases in areas of higher malaria transmission, according to results from an ongoing clinical trial published in PLOS Medicine. The effect of vaccination diminished over time, but protection against clinical malaria remained evident 18 months after vaccination.

In the new report, the RTS,S Clinical Trials Partnership updates estimates of vaccine efficacy (the reduction in the risk of malaria in participants who received the vaccine compared to those who received a comparator vaccine) and calculates the number of cases of malaria that the vaccine prevented in a phase 3, randomized, controlled clinical trial of the malaria vaccine RTS,S/AS01 given to young infants and children in Africa.

The study included 6,537 infants aged 6-12 weeks and 8,923 children aged 5-17 months who were randomly assigned to receive three doses of RTS,S/AS01 or comparator vaccine. During 18 months following vaccination, the researchers report vaccine efficacy of 45% [95% confidence interval (CI): 41%-49%, intention-to-treat analysis] in children aged 5-17 months, and vaccine efficacy of 27% [95% CI: 21%-33%, intention-to-treat analysis] in infants aged 6-12 weeks. In both age groups, vaccine efficacy was highest in the first 6 months after vaccination. Across all 11 study sites, RTS,S/AS01 averted an average of 829 (range 37 to 2,365) cases of clinical malaria per 1,000 children vaccinated, and 449 (range -10 to 1,402) cases per 1,000 infants vaccinated, over the 18 months following vaccination.
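
By the definition given above, vaccine efficacy is one minus the risk ratio between the vaccinated and comparator groups. A minimal sketch of the calculation, using made-up case counts rather than the trial's actual data:

```python
# Vaccine efficacy = 1 - (risk in vaccinated / risk in comparator).
def vaccine_efficacy(cases_vax, n_vax, cases_ctrl, n_ctrl):
    risk_ratio = (cases_vax / n_vax) / (cases_ctrl / n_ctrl)
    return 1.0 - risk_ratio

# Illustrative numbers only, not the trial's case counts:
print(f"VE = {vaccine_efficacy(550, 1000, 1000, 1000):.0%}")   # 45%
```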

Safety analyses found that overall serious adverse events (SAE) occurred less often in children aged 5-17 months who received the vaccine [18.6% (95% confidence interval 17.6%-19.6%), compared with 22.7% (95% CI 21.2%-24.3%) in children who received a comparator vaccine]. In infants aged 6-12 weeks, overall SAE were not found to differ significantly with immunization. As noted in earlier reports, more meningitis cases were reported as SAE in participants who received the malaria vaccine than in those who received a comparator immunization (16 cases among the 5,949 children in the RTS,S/AS01 vaccine group and one case among the 2,974 children in the control group; and nine cases among 4,358 young infants in the RTS,S/AS01 group and three among 2,179 young infants in the control group); no causal relationship to the vaccine has been established.

Going forward, the study will analyze further efficacy and safety results following administration of a booster immunization given to study participants just after the time period analyzed in the current report. The authors note that "Translated to the population at risk of malaria, reductions in clinical cases on this scale as a result of vaccination with RTS,S/AS01 would have a major public health impact."

Journal Reference:
  1. The RTS,S Clinical Trials Partnership. Efficacy and Safety of the RTS,S/AS01 Malaria Vaccine during 18 Months after Vaccination: A Phase 3 Randomized, Controlled Trial in Children and Young Infants at 11 African Sites. PLOS Medicine, 2014 DOI: 10.1371/journal.pmed.1001685

New malaria vaccine candidates identified

Children and guardians present to a local dispensary for sampling for the detection of anti-malarial antibodies.

Researchers have discovered new vaccine targets that could help in the battle against malaria. Taking a new, large-scale approach to this search, researchers tested a library of proteins from the Plasmodium falciparum parasite with antibodies produced by the immune systems of a group of infected children.

The tests measured which proteins the children’s immune systems responded to, revealing antigens that had not previously been identified as possible vaccine targets and offering new insights into the ways antigens could be used in combination to increase protection.

“Resistance to malaria drugs is an increasing problem so vaccines are desperately needed to battle the Plasmodium falciparum parasite before it has a chance to make people sick,” says Dr Faith Osier, first author from the Kenya Medical Research Institute. “This study presents us with a large number of new vaccine candidates that offer real hope for the future.”

A group of children infected with malaria were followed over a six-month period by scientists at the Kenya Medical Research Institute (KEMRI). While some patients became sick, others were protected by naturally occurring antibodies that stopped the malaria parasite from penetrating their red blood cells during the blood stage of the disease, which produces severe symptoms such as fever and anaemia. Researchers used samples taken from these children to identify combinations of antibodies that provided up to 100 per cent protection against clinical episodes of malaria.

The study used a library of parasite proteins that was generated using an approach developed at the Wellcome Trust Sanger Institute by Dr Gavin Wright and Dr Julian Rayner. These researchers had previously developed a new approach to express large panels of correctly folded, full-length proteins from the Plasmodium falciparum parasite, targeting proteins involved in the invasion of human red blood cells. In this study, Sanger Institute scientists collaborated with colleagues in Kenya to see which of them the children’s immune systems had developed antibodies against.

“The use of these proteins by the Sanger Institute’s Malaria Programme is helping to zero in on and exploit the weakest point in the malaria parasite’s life cycle,” says Dr Julian Rayner, an author from the Sanger Institute. “Trials for vaccines in the past have focussed on one target at a time and have had limited success; with this approach, we can systematically test larger numbers of targets and identify targets that might work in combination.”

The findings of this research add further weight to the theory that a successful blood-stage vaccine needs to target multiple antigens. The next step in this research will be to generate antibodies against all of the proteins in the library and test them in the laboratory in different combinations to see whether combinations that appear to protect individuals in the field are able to directly prevent parasite invasion. Such studies are now underway at the Sanger Institute. At KEMRI, Dr Faith Osier’s team is working on validating these findings in other African countries.

“Each year, hundreds of thousands of people die from malaria; but hundreds of millions are infected, many of whom are protected from severe symptoms by their immune response,” says Dr Kevin Marsh, Director of the KEMRI Wellcome Trust Research Programme at the Kenya Medical Research Institute. “Collaborating with our colleagues at the Sanger Institute helps to bring the latest technological advances to the field, which in this case has highlighted combinations of naturally occurring antibodies that could contribute to the design of new vaccines.”

Journal Reference:
  1. Osier, F et al. New antigens for a multicomponent blood-stage malaria vaccine. Science Translational Medicine, 2014 DOI: 10.1126/scitranslmed.3008705

Erectile dysfunction can be reversed without medication

Men suffering from sexual dysfunction can be successful at reversing their problem by focusing on lifestyle factors and not just relying on medication, according to research at the University of Adelaide.

In a new paper published in the Journal of Sexual Medicine, researchers highlight the incidence of erectile dysfunction and lack of sexual desire among Australian men aged 35-80 years.
Over a five-year period, 31% of the 810 men involved in the study developed some form of erectile dysfunction.

"Sexual relations are not only an important part of people's wellbeing. From a clinical point of view, the inability of some men to perform sexually can also be linked to a range of other health problems, many of which can be debilitating or potentially fatal," says Professor Gary Wittert, Head of the Discipline of Medicine at the University of Adelaide and Director of the University's Freemasons Foundation Centre for Men's Health.

"Our study saw a large proportion of men suffering from some form of erectile dysfunction, which is a concern. The major risk factors for this are typically physical conditions rather than psychological ones, such as being overweight or obese, a higher level of alcohol intake, having sleeping difficulties or obstructive sleep apnoea, and age.

"The good news is, our study also found that a large proportion of men were naturally overcoming erectile dysfunction issues. The remission rate of those with erectile dysfunction was 29%, which is very high. This shows that many of these factors affecting men are modifiable, offering them an opportunity to do something about their condition," Professor Wittert says.

The lead author of the paper, Dr Sean Martin from the University of Adelaide's Freemasons Foundation Centre for Men's Health, says: "Even when medication to help with erectile function is required, it is likely to be considerably more effective if lifestyle factors are also addressed.
"Erectile dysfunction can be a very serious issue because it's a marker of underlying cardiovascular disease, and it often occurs before heart conditions become apparent. Therefore, men should consider improving their weight and overall nutrition, exercise more, drink less alcohol and have a better night's sleep, as well as address risk factors such as diabetes, high blood pressure and cholesterol.
"This is not only likely to improve their sexual ability, but will be improve their cardiovascular health and reduce the risk of developing diabetes if they don't already have it."

Journal Reference:
  1. Sean A. Martin, Evan Atlantis, Kylie Lange, Anne W. Taylor, Peter O'Loughlin, Gary A. Wittert. Predictors of Sexual Dysfunction Incidence and Remission in Men. The Journal of Sexual Medicine, 2014; DOI: 10.1111/jsm.12483

One in four patients with newly-diagnosed erectile dysfunction is a young man

In a recent analysis of one outpatient clinic, one in four men seeking medical help for newly-developed erectile dysfunction (ED) was younger than 40 years, and nearly half of young men with the condition had severe ED. While larger population-based studies are needed, the findings, which were published in The Journal of Sexual Medicine, suggest that erectile dysfunction in young men may be more prevalent and more serious than previously thought.

Erectile dysfunction is a common complaint in men over 40 years of age. Prevalence increases with age, but the prevalence and risk factors of erectile dysfunction among younger men have rarely been analyzed. The research that has been done paints a vague picture, reporting prevalence rates ranging between two percent and nearly 40 percent.

To provide more clarity, Paolo Capogrosso, MD, of the University Vita-Salute San Raffaele, in Milan, Italy, and his colleagues assessed the sociodemographic and clinical characteristics of 439 men seeking medical help for newly-developed erectile dysfunction between January 2010 and June 2012 at a single academic outpatient clinic. Of the 439 patients, 114 (26 percent) were aged 40 years or younger. Compared with older patients, younger patients had a lower average body mass index, a higher average level of testosterone in the blood, and a lower rate of other medical conditions. (Only 9.6 percent of younger patients had one or more concomitant medical conditions compared with 41.7 percent among older patients.) Younger ED patients smoked cigarettes and used illicit drugs more frequently than older patients. Premature ejaculation was more common in younger men, whereas Peyronie's disease (bent erection from scar tissue) was more prevalent in older patients. Severe erectile dysfunction was found in 48.8 percent of younger patients and 40 percent of older patients, while the rates of mild, mild-to-moderate, and moderate erectile dysfunction were not significantly different between the two groups.

"These findings, taken together with those of other studies showing the importance of erectile dysfunction as a potential "sentinel marker" of major diseases, outline the importance of taking a comprehensive medical and sexual history and to perform a thorough physical examination in all men with erectile dysfunction, irrespective of their age," said Dr. Capogrosso.

"Erectile function, in general, is a marker for overall cardiovascular function -- this is the first research showing evidence of severe erectile dysfunction in a population of men 40 years of age or younger" stated Irwin Goldstein, editor-in-chief of The Journal of Sexual Medicine. "Clinically, when younger patients have presented with erectile dysfunction, we have in the past had a bias that their ED was primarily psychologic-based and vascular testing was not needed. We now need to consider regularly assessing the integrity of arterial inflow in young patients -- identifying arterial pathology in such patients may be very relevant to their overall long-term health."

Journal Reference:
  1. Paolo Capogrosso, Michele Colicchia, Eugenio Ventimiglia, Giulia Castagna, Maria Chiara Clementi, Nazareno Suardi, Fabio Castiglione, Alberto Briganti, Francesco Cantiello, Rocco Damiano, Francesco Montorsi, Andrea Salonia. One Patient Out of Four with Newly Diagnosed Erectile Dysfunction Is a Young Man-Worrisome Picture from the Everyday Clinical Practice. The Journal of Sexual Medicine, 2013; DOI: 10.1111/jsm.12179

Friday 11 July 2014

Managing Risk at the Point-of-Care

Although point-of-care testing (POCT) provides rapid test results and the opportunity for faster medical decisions, the unique risk of errors with POCT raises concern over the quality and reliability of test results. In contrast to the central laboratory, where errors predominately occur in the pre- and postanalytic phases, POCT errors occur primarily in the analytic phase of testing (1, 2). This might be related to the non-laboratory staff involved in POCT, but might also be due to test limitations and misuse of POCT in extreme environmental conditions (3, 4).
Clinical personnel with minimal laboratory skills and experience, such as nurses and patient care technicians, perform the majority of POCT. These operators are focused on patient care and do not necessarily understand why they must handle POCT—a task viewed as a laboratory role and not a job for clinical staff.
Yet regulatory standards hold the laboratory director responsible for managing and supervising POCT quality. In a clinic setting, the laboratory director may be a physician, but in a hospital or health system, the chief of pathology who heads the central laboratory often assumes responsibility. POCT is thus caught between the clinical staff who perform the tests and the laboratory staff who must supervise them. This tension creates a situation ripe for errors.
The Evolution of Quality Control
The analysis of quality control (QC), a liquid sample of known analyte concentration, has historically been used to prove the stability of a test system and ensure quality results. The concept of QC arose from the factory models of the early 1940s in which products on a factory line were periodically inspected to ensure that they met standards of production. If not, the line was shut down until the problem could be corrected. Just as in the factory, the periodic performance of QC in a laboratory—normally two levels each day of patient testing—ensures that the test system is performing as expected and the risk of errors is minimized to clinically acceptable levels.
The standard of two levels of QC each day became the default QC frequency in an era when batch analyzers sampled QC from the same bottle of reagent as patient samples. Patient results were held until QC results from before and after a batch were acceptable, indicating that the test system was stable across the group of specimens. Modern instruments, by contrast, produce results continuously and report them automatically before the next QC sample is analyzed.
In the laboratory, QC does a good job at detecting systematic errors—those that occur from one point in time forward, and affect the QC samples in the same manner as patient samples. Reagent degradation, incorrect calibration setpoints, and pipetting errors can all be detected by QC because the same reagent and instrument settings are used for both patient and QC samples.
However, QC does a poor job at detecting random errors, which are unique to single samples, such as clots, hemolysis, or drug interferences, and affect a patient sample differently than QC. As a result, the requirement for two levels of QC each day of testing does not ensure that a test system will be free from errors and have zero risk. 
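To make the distinction concrete, here is a minimal sketch of a daily two-level liquid QC check using a simple Levey-Jennings mean +/- 2 SD rule; the analyte, target values, and readings are hypothetical illustrations, not taken from any real assay.

# Minimal sketch of a two-level liquid QC check (hypothetical values).
QC_TARGETS = {
    "level_1": {"mean": 5.0, "sd": 0.2},   # low control, e.g., glucose in mmol/L
    "level_2": {"mean": 15.0, "sd": 0.5},  # high control
}

def qc_in_control(level, measured, n_sd=2.0):
    """Return True if the QC result falls within mean +/- n_sd * SD."""
    target = QC_TARGETS[level]
    return abs(measured - target["mean"]) <= n_sd * target["sd"]

# A systematic shift (e.g., reagent degradation) moves QC and patient
# samples alike, so the rule catches it and the run is held:
print(qc_in_control("level_1", 5.6))   # False: 5.6 > 5.0 + 2 * 0.2
# A random error (clot, interference) in one patient sample leaves the
# QC material unaffected, which is why liquid QC cannot detect it:
print(qc_in_control("level_2", 15.1))  # True: QC passes despite a bad sample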
Many POCT devices employ single-use (unit-use) cartridges and test strips. With unit-use formats, analyzing liquid QC can verify the performance of an individual test, but doing so consumes the cartridge and cannot guarantee the quality of tests from other cartridges. Thus, unit-use tests often have internal control processes built into each test to ensure result quality on each cartridge.
For example, pregnancy tests contain an internal positive and negative control built into each test to ensure the viability of each cartridge. However, some POCT devices, like bilirubinometers, are non-invasive and have no means of analyzing a liquid QC sample. Others, like newer molecular arrays and diagnostic chip technologies, perform hundreds of test reactions on a single cartridge.
How does the laboratory control such tests? Does the operator have to control each reaction occurring on the chip each day of testing? This could be cost prohibitive and duplicative of internal control processes built into the test system. With so many different devices and control processes available, laboratories need a systematic approach to ensure quality and strike the right balance of liquid QC in concert with internal control processes. That approach is risk management.
Defining Risk
Risk is the chance of suffering harm or loss, and it can be estimated from the probability of an event and the severity of the harm that could result. Risk management is the systematic application of policies, procedures, and practices to the task of analyzing, evaluating, controlling, and monitoring risk (5). Essentially, risk is the potential for an error to occur, and risk management is the process of assessing weaknesses in our operations and taking action to detect and prevent errors. In POCT, we already perform many activities that qualify as risk management: validating tests before use in patient testing, troubleshooting failed quality control, repeating tests when we question a result, performing maintenance, and ensuring operators are trained and competent in our procedures. All of these activities work to minimize the chance of an error and ensure reliable test results.
Understanding the New Guidelines
The Clinical and Laboratory Standards Institute (CLSI) guideline EP23 introduces risk management principles to the clinical laboratory (6). EP23 describes good laboratory practice for developing a quality control plan based on the manufacturer's risk information, applicable regulatory and accreditation requirements, and the individual healthcare and laboratory setting. The guideline helps laboratories identify weaknesses in the testing process that could lead to error and explains how to develop a plan to detect and prevent those errors.
The Centers for Medicare and Medicaid Services (CMS) has incorporated key elements of risk management from CLSI EP23 into the new CLIA interpretive guidelines, which offer a QC option called an Individualized Quality Control Plan (IQCP) (7). The CMS changes were launched in January 2014 with a two-year educational period. Beginning in 2016, moderate- and high-complexity laboratory tests, including POCT, will have two options for defining the frequency of QC: analyzing two concentrations of liquid QC each day, or developing an IQCP.
Inspectors will check that the laboratory's IQCP comprises three parts: a risk assessment (RA), a quality control plan (QCP), and a quality assessment (QA). In the RA, the laboratory identifies and evaluates potential failures and errors in a testing process. The QCP is the laboratory's standard operating procedure describing the practices, resources, and procedures used to control the quality of a particular test process. The QA plan is the laboratory's policy for ongoing monitoring of the IQCP (7).
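As a loose illustration of how the three parts fit together, a laboratory might document them in a single record like the sketch below; the field names and example entries are hypothetical, not a CMS-defined schema.

# Hypothetical record tying the three IQCP components together.
from dataclasses import dataclass, field

@dataclass
class IQCP:
    test_system: str
    # Risk assessment (RA): hazards identified per phase of testing
    hazards: dict = field(default_factory=dict)
    # Quality control plan (QCP): the control chosen for each hazard
    controls: dict = field(default_factory=dict)
    # Quality assessment (QA): metrics monitored to verify the plan works
    qa_monitors: list = field(default_factory=list)

glucose_iqcp = IQCP(
    test_system="Bedside glucose meter",
    hazards={"analytic": ["expired strips", "extreme temperature"]},
    controls={"expired strips": "lot barcoding blocks outdated reagents",
              "extreme temperature": "storage monitoring plus liquid QC"},
    qa_monitors=["QC failure rate", "error codes", "operator competency"],
)
print(glucose_iqcp.test_system, "->", sorted(glucose_iqcp.controls))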
Implementing IQCP
IQCPs will be especially valuable to laboratories that use unit-use devices and instrumentation with built-in control processes. Although CMS will enforce the new CLIA interpretive guidelines and IQCPs only for moderate- and high-complexity laboratory tests, any laboratory will find risk management principles useful for identifying weaknesses and reducing errors in its testing processes.
The first step in developing an IQCP is collecting information about the test and conducting a risk assessment. How will the test result be used in patient care? The answer defines the clinically acceptable tolerance for analytical performance, bias, and precision. Take glucose, for example: using a glucose result to diagnose diabetes requires tighter performance than using it to manage insulin dosage. These differences in clinical use limit glucose meters to management rather than diagnosis or screening purposes. Laboratories should also understand who will conduct the test and where it will be analyzed: in a laboratory setting, at the bedside, or in a mobile ambulance, each with different environmental conditions and operators.
Sites staffed by clinical personnel, or sites with frequent staff turnover, may have a higher risk of errors and require additional training or supervision compared with sites staffed by experienced medical technologists. Laboratories should also collect information from their accreditation agencies about their standards for QC requirements.
Finally, laboratories will need information about the test systems themselves and how internal control processes work. Good sources of information include the test package insert and device owner's manual. Understanding the test limitations as well as manufacturer recommendations for use can help laboratories minimize use under conditions that may increase the risk of error. 
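One common way to turn the clinically acceptable tolerance mentioned above into a concrete check is to require that estimated total error (absolute bias plus roughly two times the imprecision) stay within a total allowable error (TEa) chosen for the intended use. The sketch below uses hypothetical percentages; the TEa values are placeholders, not regulatory limits.

# Total-error check against an allowable-error budget (hypothetical numbers).
def meets_tolerance(bias_pct, cv_pct, tea_pct):
    """Return True if |bias| + 2 * CV fits within the allowable error."""
    return abs(bias_pct) + 2 * cv_pct <= tea_pct

# A meter adequate for insulin management (looser tolerance) ...
print(meets_tolerance(bias_pct=3.0, cv_pct=4.0, tea_pct=15.0))  # True: 11 <= 15
# ... may still fail the tighter tolerance a diagnostic use demands.
print(meets_tolerance(bias_pct=3.0, cv_pct=4.0, tea_pct=8.0))   # False: 11 > 8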
Performing Risk Assessment
Risk assessment identifies potential hazards—failures or errors—that can occur at any step of the testing process. The risk assessment process takes into account preanalytical, analytical, and postanalytical processes. To assess risk, a laboratory maps its testing process by stepping through each part of the procedure to look for weaknesses: from order to sample collection, transport, processing, analysis, result reporting, and communication of results. The laboratory also should consider the sample, reagents, operator, test system, and environment as potential sources of error.

[Figure 1. Fishbone diagram of major sources of error in the testing process, from CLSI EP23.]
CLSI EP23 provides a fishbone diagram of major sources of error to consider when conducting a risk assessment (Figure 1). For each of the identified hazards, the laboratory should develop an action plan that details how that risk will be handled. In some instances, the manufacturer's internal control process may address the risk. Take, for example, barcoded reagents that prevent use after the package expiration date. Barcoding is a control process that minimizes the possibility of using expired reagents. While barcoding doesn't absolutely prevent the error from occurring (one can never achieve zero risk), the likelihood of this error is reduced to a clinically acceptable level.
For other hazards, the risk of error may be unacceptable and require the laboratory to take additional actions. For example, an analyzer might have clot detection to reduce the probability of releasing a falsely decreased result, but a laboratory could add control measures by emphasizing training, collection technique, and specimen mixing, and by monitoring phlebotomists for the frequency of clotted specimens. How each laboratory chooses to address its identified hazards is what makes the QC plan individualized. The summary of all identified hazards and the laboratory-specific actions that address each risk becomes the laboratory's IQCP.
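A minimal sketch of how such judgments are often scored, using an illustrative probability-times-severity matrix; the scales and acceptability cutoff below are assumptions, not values prescribed by EP23 or CLIA.

# Toy probability x severity risk matrix (illustrative scales and cutoff).
PROBABILITY = {"remote": 1, "occasional": 2, "frequent": 3}
SEVERITY = {"negligible": 1, "serious": 2, "critical": 3}

def risk_score(probability, severity):
    """Combine likelihood of occurrence and severity of harm into one score."""
    return PROBABILITY[probability] * SEVERITY[severity]

def acceptable(probability, severity, cutoff=4):
    """Hazards scoring above the cutoff need additional control measures."""
    return risk_score(probability, severity) <= cutoff

# Barcoding makes expired-reagent use remote, so residual risk is acceptable:
print(acceptable("remote", "serious"))     # True: score 2
# A frequent hazard with critical harm demands further controls:
print(acceptable("frequent", "critical"))  # False: score 9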
Maintaining Quality 
Once the IQCP is developed, the laboratory should monitor its effectiveness. Useful quality benchmarks include the frequency of failed QC, error codes from internal control processes, repeat testing, physician complaints, and unexpected events. When the laboratory identifies a trend, it should determine the cause of the problem and take corrective action to prevent recurrence. Once the problem is corrected, the laboratory should reassess the risk to determine whether a hazard was missed during the initial risk assessment, whether a specific error occurs more frequently than predicted, whether a control measure is less effective than expected, or whether missed errors cause greater patient harm than anticipated. The outcome of this reassessment determines whether the laboratory needs to take additional steps to mitigate the hazard and whether the IQCP should be modified. In this manner, the laboratory creates a continuous improvement cycle of identifying, assessing, addressing, and monitoring risk.
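As a toy example of this ongoing monitoring, the sketch below flags a QC failure rate that drifts above the level the plan assumed acceptable; the 2% threshold and the counts are illustrative assumptions.

# Flag QC failure rates that exceed the plan's expectation (assumed 2%).
def needs_review(failed, total, threshold=0.02):
    """True when the observed failure rate exceeds the expected rate,
    triggering root-cause analysis and reassessment of the risk."""
    return (failed / total if total else 0.0) > threshold

print(needs_review(failed=1, total=120))  # False: ~0.8%, within expectation
print(needs_review(failed=9, total=120))  # True: 7.5%, investigate and reassess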
Focusing on the Right QC
The primary objective of IQCPs is not to reduce the frequency of analyzing liquid QC, but rather to ensure the right QC to address a laboratory's specific risks and ensure quality test results. In the context of POCT, laboratories should incorporate both internal and external control processes. Each device is unique, operates differently, and offers specific control processes engineered into the test. And since no single control process can cover all potential risks, a laboratory's QC plan must incorporate a mix of internal controls and traditional liquid QC.
Each test will require a specific IQCP, because devices differ and present unique risks. However, a single risk assessment and IQCP can cover multiple tests conducted on the same instrument, provided the IQCP accounts for the differences unique to each analyte. For instance, a single IQCP for a chemistry analyzer could cover all tests conducted on that analyzer, since instrument operation, risk of error, and the functionality of control processes are shared among all analytes on the same analyzer. Tests with unique considerations, such as potassium's sensitivity to hemolysis, can have those risks added to the general IQCP covering the analyzer. This approach simplifies a laboratory's risk assessments and makes developing IQCPs more efficient.
IQCPs will benefit laboratories in a number of ways. Laboratories using unit-use devices can define the optimum frequency of liquid QC in conjunction with the manufacturer's control processes. For unit-use blood gas and coagulation devices, laboratories can be more efficient by analyzing QC for each reagent lot on a subset of devices rather than on every device available, since the chemistry of the test resides in the unit-use cartridge, not in the device, which acts only as a voltmeter or timer. For molecular arrays and labs-on-a-chip, analyzing liquid QC across every reaction may be less effective than controlling the processes of greatest risk, such as the quality and amount of sample, the viability of the replicating enzyme, and temperature cycling.
In conclusion, no device is foolproof, and errors can occur anywhere in the testing process. Recognizing the conditions that could lead to errors and outlining the actions needed to avoid them is the basis of developing an IQCP. Risk management and the principles behind IQCPs are not entirely new to the clinical laboratory; most laboratories already recognize the potential for errors and take steps to prevent and detect those that could ultimately harm a patient. By adopting an IQCP for POCT, laboratories can help ensure that patients receive the highest quality of care, with faster turnaround times that do not compromise the accuracy of results.

References:
  1. Bonini P, Plebani M, Ceriotti F, et al. Errors in laboratory medicine. Clin Chem 2002;48:691–8.
  2. O'Kane MJ, McManus P, McGowan M, et al. Quality error rates in point-of-care testing. Clin Chem 2011;57:1267–71.
  3. Lippi G, Guidi GC, Mattiuzzi C, et al. Preanalytical variability: The dark side of the moon in laboratory testing. Clin Chem Lab Med 2006;44:358–65.
  4. Plebani M. Does POCT reduce the risk of error in laboratory testing? Clin Chim Acta 2009;404:59–64.
  5. International Organization for Standardization (ISO). Medical devices – Application of risk management to medical devices. ISO 14971. Geneva, Switzerland: ISO 2007.
  6. Clinical and Laboratory Standards Institute (CLSI). Laboratory quality control based on risk management; approved guideline. CLSI document EP23-A. Wayne, Pennsylvania: CLSI 2011.
  7. Centers for Medicare and Medicaid Services (CMS). Individualized Quality Control Plan (IQCP) for Clinical Laboratory Improvement Amendments (CLIA) laboratory nonwaived testing. http://www.cms.gov/Regulations-and-Guidance/Legislation/CLIA/Downloads/IQCP-announcement-letter-for-CLIA-CoC-and-PPM-labs.pdf (Accessed June 2014)
source: www.aacc.org

Sunday 6 July 2014

Rapid Urine Test Evaluated for Helicobacter Pylori Infection

Image: Rapirun Helicobacter pylori Antibody Stick. The urine sample is considered positive when two red bands at the test line and control line (arrows) are observed after 15 minutes and is considered negative when only the control line is observed. The absence of a control line indicates an invalid result (Photo courtesy of Duc T Quach).
A rapid test based on enzyme-linked immunosorbent assay (ELISA) has been developed for the detection of anti-Helicobacter pylori antibodies in urine.

Several methods to diagnose H. pylori infection have been developed, among which the urea breath test (UBT) is currently regarded as the most accurate; however, the UBT remains expensive and is not widely available in many countries.

Scientists at the Ho Chi Minh City Medicine and Pharmacy University (Vietnam), working with Japanese colleagues, enrolled 200 patients undergoing upper gastrointestinal endoscopy from October 2012 to December 2012. Three biopsies were taken from each patient: two for histologic examination and one for the rapid urease test (RUT).

The biopsy for RUT was taken from the greater curvature of the corpus, about 2 cm above the atrophic border, a location reported to optimize the sensitivity of the PyloriTek RUT (Serim Research Co.; Elkhart, IN, USA) for detecting H. pylori. Urine samples were collected and processed within one hour of collection for the detection of antibodies against H. pylori using the Rapirun Helicobacter pylori Antibody Stick (Otsuka Pharmaceutical Co., Ltd.; Tokyo, Japan). The test measures human immunoglobulin G (IgG) antibodies against H. pylori in urine using the principle of immunochromatography.

Of the 200 patients, 111 (55.5%) were diagnosed as H. pylori positive. The sensitivity, specificity, and accuracy of the Rapirun Stick test were 84.7%, 89.9%, and 87.0%, respectively, with 17 (8.5%) false-negative and 9 (4.5%) false-positive results. Of the 24 patients with gastroduodenal ulcer, 22 (91.7%) had H. pylori infection, whereas only 7 of 22 (31.8%) patients with reflux esophagitis had the infection.
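Those performance figures follow directly from the reported counts, as this short worked check shows.

# Reproducing the reported figures from the study's counts.
total, positives = 200, 111        # patients tested; H. pylori positive
fn, fp = 17, 9                     # false negatives and false positives
tp = positives - fn                # 94 infected patients correctly detected
tn = (total - positives) - fp      # 80 uninfected patients correctly negative

print(f"sensitivity {tp / (tp + fn):.1%}")      # 84.7% (94/111)
print(f"specificity {tn / (tn + fp):.1%}")      # 89.9% (80/89)
print(f"accuracy    {(tp + tn) / total:.1%}")   # 87.0% (174/200)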

The authors demonstrated the usefulness of the Rapirun Stick test for diagnosing H. pylori infection in a Vietnamese population; its sensitivity, specificity, and accuracy were all high. In several patients, however, the RUT and histologic examination produced false-negative or false-positive results, which could lead to misdiagnosis of H. pylori infection.
 
The study was published on May 7, 2014, in the World Journal of Gastroenterology.