Background and Objectives: Malnutrition is under-recognized and under-treated in Asia due to resource constraints, lack of awareness and knowledge among healthcare professionals and patients, and lack of standardized procedures for malnutrition management. While international guidelines for the management of malnutrition are available, they may not be easily applicable to the patient population and healthcare settings within Southeast Asia. This paper provides consensus recommendations, developed by the Regional Nutrition Working Group, to foster evidence-based nutritional care in Southeast Asia to improve patient outcomes. Methods and Study Design: The group convened and discussed evidence-based recommendations and clinical experiences in the management of malnutrition in hospitalized and community-dwelling adults, and the relevance of oral nutritional supplements in clinical practice. Supported by a literature search from January 2007–September 2017, consensus statements on key aspects of malnutrition management were developed. Results: Malnutrition management should be considered an integral part of patient care and managed by a multidisciplinary team. Hospitalized patients and outpatients should be screened for risk of malnutrition with validated tools. Nutrition intervention, including oral, enteral, or parenteral nutrition, should be accessible and individualized for all patients who are malnourished or at risk of malnutrition. Education on nutrition care is imperative for healthcare professionals, patients and caregivers. Conclusion: These consensus recommendations provide practical guidance to improve nutrition practice within healthcare in Southeast Asia. With collaborative efforts from the clinical community, professional societies and policy makers, this regional effort may also facilitate change in nutrition practice at the institutional and national levels.
Background and Objectives: Soy products are essential to the daily life of the Chinese population. However, the association between soy products and serum uric acid remains unclear. A better understanding of their relationship could provide food choice information for patients with gout. This study assessed the acute effects of soy and soy products on serum uric acid. Methods and Study Design: Sixty healthy adult male volunteers were recruited and randomly assigned to six groups. The ten participants in each group ingested one of six foods: water, soybeans, or one of four different soy products. Blood tests were conducted over the following 3 h to examine uric acid concentrations. Results: The serum uric acid concentration significantly increased by 21.4±23.4 μmol/L at 1 h and 16.3±19.4 μmol/L at 2 h following ingestion of whole soybeans; similar changes were observed in the soy powder group. The serum uric acid concentration rapidly increased by 38.1±20.5 μmol/L at 1 h, 34.4±18.2 μmol/L at 2 h, and 24.1±24.2 μmol/L at 3 h after the ingestion of soybean milk. The maximum serum uric acid concentration was observed at 1 h after intake of soybeans and soy products, and then gradually decreased during the subsequent 2-h period. No significant uric acid changes were detected after ingestion of bean curd cake or dried bean curd stick. Conclusions: Ingesting different soy products resulted in different concentrations of serum uric acid, with soybeans, soybean milk, and soy powder considerably increasing serum uric acid.
Background and Objectives: Some cereals, consumed at breakfast, have shown lower glycemic responses. Limited data exist in the Indian context, where the effect could be modified by genetic or racial differences. This study aimed to investigate the effect of cereal and milk, with or without fruits/nuts, on the glycemic response in healthy Indian men. Methods and Study Design: A randomized cross-over study was carried out in 16 men (18-45 years), with 3 interventions providing equal amounts of glycemic carbohydrate: a glucose drink (Reference), cereal and milk (CM), and cereal, milk, fruits and nuts (CMO), on separate days. Plasma glucose, serum insulin, C-peptide, ghrelin, energy expenditure (EE), substrate oxidation and appetite/satiety were measured repeatedly over 3 hours post meal. Results: A significant time effect and time × meal interaction, with higher values for the Reference meal, was observed for plasma glucose (p<0.001), insulin (p<0.001), C-peptide (p<0.001), and carbohydrate oxidation (p<0.001), whereas satiety was lower for the Reference meal (p<0.001). The plasma glucose concentrations of the CM and CMO meals returned to baseline 60 min postprandially and then remained there, unlike the Reference meal, for which plasma glucose returned to baseline at 120 min and dipped significantly below baseline at 150 and 180 min. A significant effect of time (p<0.001) was observed for EE between meals. Ghrelin levels did not differ significantly between the test meals. Conclusions: Cereal with milk, along with fruits and nuts, at breakfast elicits a lower and more stable glycemic response and greater satiety in healthy men.
Background and Objectives: Probiotic treatment has been shown to increase bone mass density, protect against bone loss, and improve bone formation. We aimed to assess the effect of oral administration of the probiotic Lactobacillus casei Shirota (LcS) on pain relief in patients with a single rib fracture. Methods and Study Design: A total of 283 eligible patients with a single rib fracture were enrolled and randomly assigned to receive skimmed milk containing either a commercial probiotic LcS or placebo every day by oral administration for 1 month after the fracture. Pain relief was assessed during activities that elicited pain; sleep quality and sustained maximal inspiration (SMI) lung volumes were also monitored. Results: Patients in the LcS group had more effective pain relief than those in the placebo group during deep breathing, coughing and turning over. The increase in SMI lung volume was larger in the LcS group than in the placebo group. Sleep quality did not show significant improvement after 1 month of LcS treatment. Conclusions: In patients with a single rib fracture, oral administration of the probiotic LcS alleviated pain intensity.
Background and Objectives: Associations between blood 25-hydroxyvitamin D (25(OH)D) concentration and sarcopenia remain controversial; thus, this meta-analysis was conducted to explore the relationship between blood 25(OH)D concentration and sarcopenia. Methods and Study Design: We searched the PubMed and EMBASE databases for relevant published observational studies investigating blood 25(OH)D and sarcopenia up to June 2017, and extracted data from studies that compared blood 25(OH)D between sarcopenia and healthy control groups. A random-effects model was used to calculate the pooled weighted mean difference (WMD) of blood 25(OH)D concentration with a 95% confidence interval (95% CI). Results: Twelve studies (eight cross-sectional, two matched case-control, and two prospective cohort studies) with a total of 22,590 individuals were included. Sarcopenic individuals had lower blood 25(OH)D concentrations than healthy controls (WMD=−2.14, 95% CI: −2.81 to −1.48; I2=74.6%). Subgroup analysis showed that the methods of assessing blood 25(OH)D concentration and sarcopenia might be sources of heterogeneity, and that excluding obese individuals and using different sarcopenia assessment criteria strengthened the relationship. One-study-removed sensitivity analysis confirmed the robustness of these results. Conclusions: Our study shows that sarcopenic adults have lower blood 25(OH)D concentrations. Further high-quality, large-scale prospective cohort studies are needed to confirm these findings.
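The pooled estimate above is a random-effects weighted mean difference. Below is a minimal sketch of how such a WMD can be computed with the DerSimonian-Laird method; the study-level differences and variances are placeholder values, not the data from the twelve included studies.

```python
import numpy as np

def pooled_wmd_random_effects(mean_diffs, variances):
    """DerSimonian-Laird random-effects pooling of study-level mean differences."""
    md = np.asarray(mean_diffs, dtype=float)
    v = np.asarray(variances, dtype=float)
    w_fixed = 1.0 / v                                   # inverse-variance (fixed-effect) weights
    q = np.sum(w_fixed * (md - np.sum(w_fixed * md) / np.sum(w_fixed)) ** 2)
    df = len(md) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w = 1.0 / (v + tau2)                                # random-effects weights
    wmd = np.sum(w * md) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0 # heterogeneity statistic
    return wmd, (wmd - 1.96 * se, wmd + 1.96 * se), i2

# Placeholder study-level 25(OH)D differences (sarcopenia minus control) and their variances
wmd, ci, i2 = pooled_wmd_random_effects([-2.5, -1.8, -3.0, -1.2], [0.4, 0.6, 0.9, 0.5])
print(f"WMD={wmd:.2f}, 95% CI=({ci[0]:.2f}, {ci[1]:.2f}), I2={i2:.1f}%")
```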
Background and Objectives: The relationship between vitamin C intake and hyperuricemia among the general US adult population has seldom been reported; thus, the present study examined the associations of total vitamin C (dietary vitamin C plus supplementary vitamin C) and dietary vitamin C intake with the risk of hyperuricemia. Methods and Study Design: Pooled data from three 2-year cycles (2007–2012) of the cross-sectional National Health and Nutrition Examination Survey were used. Dietary intake data were extracted from two 24-hour dietary recall interviews. Logistic regression models were used to determine the associations between vitamin C intake and hyperuricemia risk. Results: A total of 14,885 adults aged 20 years or older (7269 men and 7616 women) were included. The prevalence of hyperuricemia was 19.1%. With the lowest quartile of dietary vitamin C intake as the reference, multivariate-adjusted odds ratios with 95% confidence intervals of hyperuricemia for quartiles 2–4 were 0.84 (0.74–0.95), 0.83 (0.73–0.94), and 0.72 (0.63–0.82), and those for total vitamin C intake were 0.87 (0.77–0.99), 0.85 (0.75–0.96), and 0.66 (0.58–0.76). Inverse associations between vitamin C intake and hyperuricemia were observed in both men and women, with or without covariate adjustment. Conclusions: Total vitamin C and dietary vitamin C intake are inversely associated with hyperuricemia in the general US adult population.
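The quartile odds ratios above come from covariate-adjusted logistic regression with the lowest intake quartile as reference. The sketch below illustrates that kind of analysis on simulated data; the variable names are hypothetical, and the NHANES survey weights and full covariate set used in the study are omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
# Simulated data standing in for the survey variables (illustration only)
df = pd.DataFrame({
    "vitc": rng.gamma(shape=2.0, scale=40.0, size=n),   # vitamin C intake, mg/day
    "age": rng.integers(20, 80, size=n),
    "male": rng.integers(0, 2, size=n),
})
logit_p = -1.5 - 0.004 * df["vitc"] + 0.01 * df["age"] + 0.4 * df["male"]
df["hyperuricemia"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Quartiles of intake, with the lowest quartile as the reference category
df["vitc_q"] = pd.qcut(df["vitc"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Covariate-adjusted logistic regression; exponentiated coefficients are odds ratios
model = smf.logit("hyperuricemia ~ C(vitc_q, Treatment('Q1')) + age + male", data=df).fit(disp=0)
print(np.exp(model.params))        # ORs for Q2-Q4 vs Q1, plus covariates
print(np.exp(model.conf_int()))    # 95% confidence intervals as ORs
```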
Background and Objectives: The aim of this study was to determine the associations of intake of soy products and isoflavones with allergic diseases. Methods and Study Design: We conducted a cross-sectional study of 1437 participants (aged 20-64 years) living in Tokushima Prefecture, Japan, during 2010–2011. We obtained anthropometric data and information on lifestyle characteristics, including dietary intake and current medical histories of allergic diseases, using a structured self-administered questionnaire. Multiple logistic regression models were used to assess the associations of soy products and isoflavones with allergic diseases after controlling for age, family history of allergic diseases, smoking, drinking, physical activity, energy intake, BMI and dietary factors. Results: Intake of soy products showed a significant inverse dose-response relationship with allergic rhinitis. The third quartile of soy product intake had an adjusted OR of 0.56 (95% CI: 0.35-0.91) compared with the reference group (first quartile), although intake of soy products showed no dose-response relationship with atopic dermatitis. Intake of soy isoflavones showed a significant inverse dose-response relationship with atopic dermatitis, although the association was U-shaped after adjustment for potential confounders. The associations between intake of soy isoflavones and other allergic diseases were not significant. Conclusions: The results indicate that higher intake of soy products is associated with a reduced risk of allergic rhinitis in Japanese workers. Furthermore, moderate intake of soy products and soy isoflavones is associated with a reduced risk of atopic dermatitis.
Background and Objectives: An adequate level of maternal vitamin D is essential for maternal and fetal health during pregnancy. We examined the relationship between lifestyle, maternal vitamin D intake and the vitamin D status of pregnant women. Methods and Study Design: This cross-sectional study included 203 third-trimester pregnant women in September-November 2016 in four districts of West Sumatra, Indonesia. A questionnaire was used to assess lifestyle, dietary intake, anthropometry, maternal characteristics, demography and socioeconomic data. Serum vitamin D was measured by ELISA, and the data were analyzed using descriptive statistics, chi-squared tests, Pearson's correlation and logistic regression. Results: Blood serum samples were collected from 160 of the pregnant women. The mean 25-hydroxyvitamin D concentration and maternal vitamin D intake were 29.06±11.39 ng/mL and 7.92±5.26 μg/day, respectively. The prevalence of vitamin D deficiency-insufficiency was 61.25%, and more than 85% of the women had inadequate vitamin D intake. Living in mountainous areas (p=0.03) and low physical activity (p=0.02) were significantly associated with maternal vitamin D status. In addition, younger women with lower pre-pregnancy weight had a higher prevalence of vitamin D deficiency. Conclusions: Low levels of vitamin D were common among pregnant women in West Sumatra, Indonesia. Additional intake of vitamin D from supplements may be important to meet the recommended dietary level for pregnant women.
Background and Objectives: To explore advantages and challenges of exclusive breastfeeding (EBF), compared to non-exclusive breastfeeding (nEBF). Methods and Study Design: Mothers from 7 cities in China were visited at 3, 10, 60, 120, and 180 days postpartum. Data on feeding practices, infant growth, and the macronutrient contents of human milk (HM) were collected. Results: In total, 130 lactating mothers attended all 5 visits; 59 (45.4%) exclusively breastfed their infants for 0-4 months. Breastfeeding frequency per day was higher in the EBF group than in the nEBF group at days 3, 10, 120 and 180, and was less than 8 times per day in the nEBF group. Weight-for-age z scores did not differ between the two groups. The length-for-age z score was greater in the nEBF group at day 180 (0.74±1.05 vs 0.33±1.28). Weight-for-length z scores were greater in the EBF group at days 120 and 180 (day 120: 0.88±1.08 vs 0.36±1.1; day 180: 1.1±0.94 vs 0.54±1.07). The average protein and lactose contents of HM in the nEBF group were higher than in the EBF group at day 10. Conclusions: In nEBF infants, formula intake replaced breastmilk intake owing to lower breastfeeding frequency, without producing additional weight gain. During the introduction of complementary foods, EBF infants needed complementary nutrients to support growth. Therefore, lactating mothers may need to provide appropriate complementary feeding, and extended maternity leave may help them attend to their infants' nutritional requirements. The criteria for linear growth may also need to be more commensurate with breastfeeding and relevant to later health outcomes.
Background and Objectives: Eating habits established during childhood affect health in later life. The United Arab Emirates (UAE) has a high prevalence of obesity in adolescents and adults; however, data on the health of preschool children are scarce. This study assessed the weight status and dietary habits of Emirati and non-Emirati children attending nurseries in Abu Dhabi, UAE. Methods and Study Design: Weight and height were measured in children aged 18 months–4 years. Z scores for height-for-age (HAZ), weight-for-age (WAZ), and BMI-for-age (BAZ) were calculated based on WHO protocols. Parents completed a questionnaire regarding demographics and food frequency. Results: A total of 203 children participated. Abnormal anthropometric status (z scores of <-2 or >2) for WAZ was indicated in 12.8% of Emirati children versus 1.4% of non-Emirati children (p=0.008) and for BAZ in 19.9% of Emirati children versus 8.4% of non-Emirati children (p<0.05). Emirati children exhibited higher prevalences of malnutrition (4.3% vs 1.4%), wasting (11.5% vs 2.8%), and overweight (8.5% vs 4.2%) than non-Emirati children and consumed discretionary calorie foods and typical components of Emirati cuisine (rice, fish, and pulses) significantly more often than non-Emirati children. Conclusions: Similar to findings in other countries undergoing economic transition, an indication of a double burden of disease was revealed in children attending nurseries in Abu Dhabi. Malnutrition and overnutrition were represented, especially among Emirati children, and were seemingly related to lifestyle rather than genetics. Therefore, policies focusing on child health interventions are required.
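The WAZ, HAZ, and BAZ values above are WHO growth-standard z scores, which are computed from age- and sex-specific LMS reference values. A minimal sketch of the LMS calculation is shown below; the L, M, and S numbers are illustrative placeholders, as the real values come from the WHO reference tables.

```python
from math import log

def lms_zscore(x, L, M, S):
    """WHO LMS z-score: measurement x against reference L (skewness), M (median), S (CV)."""
    if L == 0:
        return log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical LMS reference values for illustration only (real values come from WHO tables)
waz = lms_zscore(x=14.2, L=-0.1, M=13.0, S=0.11)
print(f"WAZ = {waz:.2f}; flagged as abnormal if < -2 or > 2")
```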
Background and Objectives: The dietary inflammatory index (DII®) is a measure of the overall inflammatory potential of a person's diet. However, there have been no studies examining the effect of DII on measures of muscle mass and strength. We aimed to examine the association between DII and skeletal muscle mass and strength in Chinese children. Methods and Study Design: A total of 466 children aged 6-9 years completed the study. Total body skeletal muscle mass (TSM), appendicular skeletal mass (ASM) and appendicular lean mass (ALM) were determined using dual-energy X-ray absorptiometry. TSM/Height2, TSM/Weight, ASM/Height2 and ASM/Weight were calculated. The residual method was applied to compute an ALM index (ALMI) adjusted for height and body fat. Hand grip strength was measured using a hand dynamometer. DII scores were calculated from a 79-item food frequency questionnaire. Results: Fully adjusted linear regression models showed a statistically significant negative relationship between DII and ASM, ASM/Height2, ASM/Weight, ALMI, TSM, TSM/Height2, and TSM/Weight (p: 0.019‒0.014). Analysis of covariance indicated that the percentage differences between the extreme quartiles (Q4 vs Q1) of DII for the above-mentioned measures ranged from -1.04% to -4.36% (p-trend: <0.001‒0.013). When boys and girls were analyzed separately, similar findings were observed for boys but not for girls. No significant associations were detected between DII and hand grip strength. Conclusions: The DII score was inversely associated with skeletal muscle mass in boys but not in girls aged 6-9 years. No significant associations were observed between DII and hand grip strength.
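The ALMI described above uses the residual method: appendicular lean mass is regressed on height and body fat, and the residual is kept as a lean-mass index independent of body size. A generic sketch on simulated values, not the study data, is shown below.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 466
height = rng.normal(1.30, 0.08, n)            # m (illustrative values for young children)
fat_mass = rng.normal(6.0, 2.0, n)            # kg
alm = 4.0 + 8.0 * height + 0.15 * fat_mass + rng.normal(0, 0.6, n)   # appendicular lean mass, kg

# Residual method: regress ALM on height and body fat, keep the residuals as the index
X = np.column_stack([np.ones(n), height, fat_mass])
beta, *_ = np.linalg.lstsq(X, alm, rcond=None)
almi = alm - X @ beta                          # ALMI: lean mass independent of height and fat

print(f"ALMI mean ~ {almi.mean():.3f} (centred at 0 by construction), SD = {almi.std():.3f}")
```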
Background and Objectives: This study aimed to evaluate the prevalence of malnutrition and the usefulness of the body mass index (BMI) in the assessment of malnutrition in hospitalized elderly patients with adrenal insufficiency (AI). Methods and Study Design: 318 hospitalized patients with a history of steroid treatment were diagnosed with AI by a rapid ACTH stimulation test and compared with 374 control patients. Nutritional risk was screened using the Malnutrition Universal Screening Tool (MUST), and nutritional status was evaluated using the Mini Nutritional Assessment short form (MNA-SF) and BMI. Results: There was no difference in nutritional screening between the AI and control groups. Nutritional assessment indicated that 31.2% of all elderly patients suffered from malnutrition and 33.8% were at risk of malnutrition; less than half (34.9%) were identified as well nourished. Malnutrition was present in 33.6% of the AI group and 29.1% of the control group; overall, the prevalence of malnutrition was higher in the AI group than in the control group. In the AI group, patients with low basal cortisol had a higher incidence of malnutrition than those with high basal cortisol. The BMI of patients in the AI group was higher than in the control group. According to BMI criteria, 64.3% of malnourished patients in the AI group were overweight or obese. Conclusions: Elderly AI patients are prone to develop malnutrition despite being overweight or obese. Therefore, more extensive nutritional assessment of elderly patients with AI is required regardless of BMI.
Background and Objectives: Nutritional and dietary habits may affect children's behaviors and learning. The etiology of attention-deficit/hyperactivity disorder (ADHD), a common neurodevelopmental disorder in children, may be associated with unhealthy diets or nutrient deficiencies. The purpose of this study was to examine whether children with ADHD exhibit different dietary habits or nutrient profiles from healthy control subjects. Methods and Study Design: We recruited 42 patients with ADHD (mean age: 8.1 years) and 36 healthy children as the control group (mean age: 9.8 years). We used the ADHD Rating Scale and the Swanson, Nolan, and Pelham Version IV Scale to interview both the ADHD patients and the control subjects and then evaluated participants' dietary intake with a food frequency questionnaire. Logistic regression models were used to produce a composite dietary/nutrient score, and receiver operating characteristic (ROC) analysis was used to differentiate between the two participant groups. Results: Compared with the control children, children with ADHD demonstrated a higher intake proportion of refined grains (p=0.026) and lower proportions of dairy (p=0.013), calcium (p=0.043), and vitamin B-2 (p=0.024). The composite dietary/nutrient score significantly distinguished patients with ADHD from healthy controls (p<0.001) and demonstrated a significant correlation with the severity of ADHD clinical symptoms (p<0.05). Conclusions: Children with ADHD and healthy controls have different dietary patterns, and dietary and nutrient factors may play a role in the pathophysiology of ADHD. Clinicians should consider dietary habits and specific nutrients in the routine assessment of children with ADHD.
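The composite dietary/nutrient score and ROC analysis described above can be illustrated as follows: a logistic model combines dietary variables into a single predicted probability, whose discriminative ability is then summarized by the AUC. The variables and data below are simulated stand-ins, not the study's FFQ items.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)
n = 78
# Simulated intake variables standing in for FFQ-derived items (illustration only)
refined_grains = rng.normal(3.0, 1.0, n)
dairy = rng.normal(2.0, 0.8, n)
adhd = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * refined_grains - 1.0 * dairy))))

X = np.column_stack([refined_grains, dairy])
clf = LogisticRegression().fit(X, adhd)
composite = clf.predict_proba(X)[:, 1]        # composite dietary/nutrient score

auc = roc_auc_score(adhd, composite)
fpr, tpr, thresholds = roc_curve(adhd, composite)
best = np.argmax(tpr - fpr)                   # Youden index picks a discriminating cut-off
print(f"AUC = {auc:.2f}, suggested cut-off = {thresholds[best]:.2f}")
```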
Background and Objectives: To investigate the association of dietary patterns and dietary diversity with cardio-metabolic disease risk factors among South Asians. Methods and Study Design: In a population-based study conducted in 2010-11, we recruited 16,287 adults aged >20 years residing in Delhi, Chennai, and Karachi. Diet was assessed using an interviewer-administered 26-item food frequency questionnaire. Principal component analysis identified three dietary patterns: Prudent, Indian, and Non-Vegetarian. We also computed a dietary diversity score. Multinomial and binary logistic regressions were used to calculate adjusted prevalence (95% confidence intervals) of cardio-metabolic disease risk factors across quartiles of dietary pattern and dietary diversity scores. Results: The adjusted prevalence of diagnosed diabetes was lower among participants in the highest versus lowest quartile of the Prudent Pattern (4.7% [3.8–5.6] versus 10.3% [8.5–12.0]) and the Indian Pattern (4.8% [3.7–5.9] versus 8.7% [6.7–10.6] in highest versus lowest quartile, respectively). Participants following the Indian Pattern also had a lower adjusted prevalence of diagnosed hypertension (7.0% [5.4–8.5] versus 10.6% [8.6–12.5] in highest versus lowest quartile, respectively). Participants in the highest versus lowest quartile of the dietary diversity score had a lower adjusted prevalence of diagnosed diabetes (4.1% [3.0–5.2] versus 8.2% [7.1–9.3]), diagnosed hypertension (6.7% [5.3–8.1] versus 10.3% [9.1–11.5]), and undiagnosed hypertension (14.2% [12.0–16.4] versus 18.5% [16.9–20.1]). Conclusions: High dietary diversity appears to be protective against cardio-metabolic disease risk factors in this urban cohort of South Asian adults. Further investigation to understand the underlying mechanism of this observation is warranted.
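A brief sketch of how dietary patterns can be derived by principal component analysis from FFQ data, together with a simple count-based dietary diversity score, is shown below. The intake matrix is simulated, and the pattern labels and exact diversity scoring used in the study are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_people, n_items = 500, 26                   # 26 FFQ items, as in the questionnaire described
ffq = rng.poisson(lam=3.0, size=(n_people, n_items)).astype(float)   # weekly intake frequencies

# Dietary patterns: principal components of standardized intake frequencies
z = StandardScaler().fit_transform(ffq)
pca = PCA(n_components=3)
pattern_scores = pca.fit_transform(z)          # per-person scores on 3 patterns
loadings = pca.components_                     # item loadings used to label each pattern

# Dietary diversity score: count of items consumed (here, any non-zero intake)
diversity = (ffq > 0).sum(axis=1)

print(pca.explained_variance_ratio_, diversity[:5])
```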
Background and Objectives: To evaluate the associations of dietary factors with the risk of gout and hyperuricemia. Methods and Study Design: The PubMed and Embase databases were searched from inception to June 2017 for eligible studies. Nineteen prospective cohort or cross-sectional studies with adequate sample sizes were included, covering red meat, seafood, alcohol, fructose, dairy products, soy foods, high-purine vegetables and coffee. Results: Meta-analysis revealed several dietary associations with gout risk: red meat, OR 1.29 (95% CI 1.16-1.44); seafood, OR 1.31 (95% CI 1.01-1.68); alcohol, OR 2.58 (95% CI 1.81-3.66); fructose, OR 2.14 (95% CI 1.65-2.78); dairy products, OR 0.56 (95% CI 0.44-0.70); soy foods, OR 0.85 (95% CI 0.76-0.96); high-purine vegetables, OR 0.86 (95% CI 0.75-0.98); coffee, OR 0.47 (95% CI 0.37-0.59). Dietary associations with hyperuricemia risk were: red meat, OR 1.24 (95% CI 1.04-1.48); seafood, OR 1.47 (95% CI 1.16-1.86); alcohol, OR 2.06 (95% CI 1.60-2.67); fructose, OR 1.85 (95% CI 1.66-2.07); dairy products, OR 0.50 (95% CI 0.37-0.66); soy foods, OR 0.70 (95% CI 0.56-0.88); high-purine vegetables, OR 1.10 (95% CI 0.88-1.39), p=0.39; coffee, OR 0.76 in men (95% CI 0.55-1.06) and OR 1.58 in women (95% CI 1.16-2.16). Conclusion: The risk of hyperuricemia and gout is positively associated with intake of red meat, seafood, alcohol or fructose, and negatively associated with dairy products and soy foods. High-purine vegetables showed no association with hyperuricemia but a negative association with gout. Coffee intake is negatively associated with gout risk, whereas it may be associated with an increased hyperuricemia risk in women but a decreased risk in men.
Background and Objectives: To examine the availability and price of healthier foods versus regular counterparts and their association with obesity. Methods and Study Design: A cross-sectional survey of weight and height among Māori in 2 urban and 96 rural areas in the Waikato/Lakes Districts, NZ (2004-06) was undertaken. Concurrently, the availability of 11 'healthier' foods in fast-food outlets was examined by location (urban vs rural) and median income (high vs low). In supermarkets, five specific 'regular' foods were scored against 'healthier' counterparts (white vs wholemeal bread, with-skin vs skinless chicken, regular vs trim meat, standard vs trim milk, sugar-sweetened beverages vs water) for in-store availability and price according to the Nutrition Environment Measures Survey. Results: Overall, 3,817 Māori (BMI: women 32.9±7.8 kg/m2; men 33.1±6.7 kg/m2) were included, with 451 food outlets in two urban clusters and 698 food outlets in 96 rural clusters. Fast foods: the availability of healthier food choices was higher for 8/11 items in rural and low-income areas than in urban and high-income areas. Multivariate analysis considered location and income as cofactors; no association between the number of fast-food outlets per cluster or healthier foods per cluster and obesity prevalence (General/Māori BMI cut-offs) was observed. Supermarkets: water was cheaper than sugar-sweetened beverages and was negatively associated with obesity prevalence (General r=-0.53, p=0.03; Māori r=-0.53, p=0.03); high availability scores for trim milk compared with standard milk correlated with higher obesity prevalence (General r=0.49, p=0.04; Māori r=0.57, p=0.01). Conclusions: Bottled water versus sugar-sweetened beverage prices were inversely associated with obesity. This supports the argument for regulating the availability and price of sugar-sweetened beverages in NZ. The positive association between the availability of trim milk and the prevalence of obesity warrants investigation into individuals' dietary and food-purchase behaviour.
Background and Objectives: When iodine intake is excessive, a susceptible population with a genetic predisposition has an increased risk of hypothyroidism or autoimmune thyroiditis. This study evaluated vulnerability to iodine excess and subclinical thyroid disease through screening of single nucleotide polymorphisms (SNPs) in reproductive-age women, to provide evidence for the prevention of subclinical thyroid disease. Methods and Study Design: In Shanxi province, four areas covering a range of water iodine exposures from low to high were chosen; in each area, 60 women were expected to enrol, including 20 pregnant women, 20 lactating women, and 20 non-pregnant, non-lactating women. Genotyping was performed using whole-blood samples, and the genotypes of 21 SNPs were determined and compared among areas with different water iodine concentrations and between controls and patients with subclinical thyroid disease. Results: In total, 241 participants were enrolled. Among the 21 candidate SNPs, no differences were found among areas with different water iodine concentrations, whereas TG (rs2252696), TSHR (rs4903957), CTLA-4 (rs231775), CAPZB (rs1472565), PDE4D (rs27178), and HLA (rs2517532) were significantly associated with various subclinical thyroid diseases; in particular, the TT allele of PDE4D (rs27178) was associated with all examined subclinical thyroid diseases. Conclusions: Vulnerability to subclinical thyroid diseases is influenced by the presence of gene polymorphisms. Screening of suspected genes is needed to effectively prevent and reduce the occurrence of thyroid diseases. People with the TT allele in PDE4D (rs27178) should be made aware of an increased risk of subclinical thyroid disease.
Background and Objectives: In China, some studies have reported that solute carrier family 30 member 8 (SLC30A8) gene polymorphisms might increase the risk of T2DM, whereas others have not. The aim of this meta-analysis was to systematically investigate the association between the rs13266634 polymorphism of the SLC30A8 gene and T2DM in Chinese Han and ethnic minority populations. Methods and Study Design: Published articles were retrieved from PubMed, Web of Knowledge, the Chinese National Knowledge Infrastructure (CNKI), the Wanfang database, the VIP database and Google Scholar. Pooled ORs and 95% CIs were calculated using random- or fixed-effects models. Results: Twenty-five articles involving 62,285 subjects were included in this meta-analysis. In the total population, significant associations between the rs13266634 polymorphism and T2DM were observed under the allele model (C vs T: OR=1.23, 95% CI=1.18-1.29), the additive models (CC vs TT: OR=1.44, 95% CI=1.32-1.56; CC vs CT: OR=1.08, 95% CI=1.02-1.15; CT vs TT: OR=1.25, 95% CI=1.15-1.37), the dominant model (CC vs CT+TT: OR=1.24, 95% CI=1.17-1.32) and the recessive model (CC+CT vs TT: OR=1.26, 95% CI=1.16-1.35). In subgroup analysis, these associations were stronger in ethnic minority groups than in the Han population, except for the CC vs CT model, under which no association was observed in ethnic minority groups (OR=1.26, 95% CI=0.95-1.66, p=0.105). Conclusions: Chinese C allele carriers could have an increased risk of T2DM. Well-designed future studies with larger sample sizes should be conducted to better understand this association in ethnic minority groups.
Background and Objectives: Energy requirement estimation is crucial for the nutrition management of patients with major burns. We aimed to find a practical equation for patients with burns over 50% of their total body surface area (TBSA) in an intensive care unit (ICU). Methods and Study Design: We conducted a six-week follow-up study of 21 ICU burn patients aged 17-28 years (second- and third-degree burns, TBSA 50-90%) who were prescribed enteral nutritional support. The energy consumption ratio (ECR) was calculated by dividing the actual energy intake by the estimated energy requirement. Linear regression was used to evaluate the stability of each equation and the wound healing rate over time. Results: All included patients survived. On the fifth day, among the seven equations used, the ECRs of those dependent on the basal metabolic rate and body weight, namely 35 kcal/kg BW, BMR × 1.5, and the Toronto formula, reached 74%, 71% and 69%, respectively. The ECRs for these formulae achieved nutritional support goal sufficiency (0.9-1.1) from the third week. Additionally, with every 1% increase in the energy consumption increase rate per week, the wound healing rate increased from 0.35% to 0.80% per week. Both the 28 and 35 kcal/kg BW formulas had the smallest regression coefficients (0.46) over 6 weeks. Conclusion: The 35 kcal/kg BW equation was suitable for young patients with burns over 50% TBSA in the ICU because it could be applied unequivocally and promptly, with acceptable wound healing rates. Additionally, it was well tolerated and contributed to stable management with feeding simplicity.
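As a worked example of the ECR described above, assuming a hypothetical 60 kg patient who actually received 1500 kcal on a given day, the 35 kcal/kg BW estimate and the resulting ratio would be:

```python
def estimated_requirement_35kcal(weight_kg: float) -> float:
    """Simple weight-based estimate: 35 kcal per kg body weight per day."""
    return 35.0 * weight_kg

def energy_consumption_ratio(actual_intake_kcal: float, estimated_kcal: float) -> float:
    """ECR = actual energy intake / estimated energy requirement."""
    return actual_intake_kcal / estimated_kcal

# Illustrative patient, not study data: 60 kg, 1500 kcal actually delivered on one day
req = estimated_requirement_35kcal(60)                 # 2100 kcal/day
ecr = energy_consumption_ratio(1500, req)              # ~0.71, i.e. about 71%
print(f"Requirement {req:.0f} kcal/day, ECR = {ecr:.2f}; goal sufficiency is 0.9-1.1")
```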
Background and Objectives: Potassium has a naturally occurring radioactive isotope, so total body potassium (TBK) can be measured and used to derive body cell mass (BCM), making it useful in clinical conditions, early growth and pregnancy. The objective was to build a whole-body potassium counter (WBKC) to accurately measure TBK. Methods and Study Design: A WBKC was designed and constructed using a shadow shield. A cellular four-compartment (4C) model of fat-free mass (FFM), using estimates of TBK along with total body water (TBW), was compared with a molecular 4C model of the body in twenty healthy adults (10 men and 10 women). The molecular 4C model used measurements of TBW, bone mineral content (BMC), and body volume from deuterium dilution (DD), dual-energy X-ray absorptiometry (DXA) and air displacement plethysmography (ADP), respectively. Results: The accuracy and precision of the WBKC were 2.8% and 1.9% with TBK phantoms. The mean estimate of FFM by the molecular 4C model was 40.4±6.8 kg, while it was 41.2±7.3 kg using the cellular 4C model. Conclusions: A WBKC constructed from first principles was relatively low cost, efficient, safe and non-invasive, but requires some design considerations. Its measurement of FFM compared well with the molecular 4C model. Once constructed, it offers an accurate and repeatable method, at minimal ongoing cost, to measure body composition in conditions with uncertain hydration status, at all life stages.
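As a sketch of the quantities involved, the cellular model converts TBK to BCM via the classic Moore constant and partitions FFM into cellular and extracellular components, while the molecular 4C model estimates fat mass from body volume, TBW, bone mineral and body weight. The general forms are shown below; the specific coefficients c1-c4 depend on which published 4C equation is used and are not stated in the abstract, so they are left symbolic.

```latex
\begin{align*}
\text{BCM (kg)} &\approx 0.00833 \times \text{TBK (mmol)} \\
\text{Cellular 4C:}\quad \text{FFM} &= \text{BCM} + \text{ECW} + \text{ECS} \\
\text{Molecular 4C:}\quad \text{FM} &= c_1\,\text{BV} - c_2\,\text{TBW} + c_3\,\text{BMC} - c_4\,\text{BW}, \qquad \text{FFM} = \text{BW} - \text{FM}
\end{align*}
```

Here ECW and ECS denote extracellular water and extracellular solids, BV body volume, and BW body weight.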
Background and Objectives: The objectives of this study were to identify and validate a screening tool to detect malnutrition among Indigenous and non-Indigenous Australian patients. Methods and Study Design: This study included medical patients admitted to three regional hospitals in Australia. A literature review of current screening tools was undertaken before the Malnutrition Screening Tool (MST) and the newly developed Adult Nutrition Tool (ANT) were validated in participants against the Subjective Global Assessment (SGA). The sensitivity and specificity of both the MST and ANT were determined for all study participants as well as by participants' Indigenous status. Results: A total of 608 participants were enrolled, of whom 271 (44.6%) were Indigenous. The area under the curve (AUC) for the ANT was higher than for the MST in all participants (0.90, 95% CI 0.88–0.92 versus 0.81, 95% CI 0.77–0.84, p<0.001) and in Indigenous participants (0.88, 95% CI 0.84–0.92 versus 0.78, 95% CI 0.73–0.83, p<0.001). An ANT score ≥2 demonstrated higher sensitivity than the MST for both Indigenous and non-Indigenous participants (96.0%, 95% CI 92.8–98.7% versus 84.0%, 95% CI 78.9–88.3%) but lower specificity (59.5%, 95% CI 54.2–64.6% versus 70.7%, 95% CI 65.7–75.3%). Conclusions: The ANT is a valid and accurate tool for Indigenous and non-Indigenous Australian patients. Further research is required to validate the ANT to aid in the detection of malnutrition in other clinical settings.
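The sensitivity and specificity above come from a 2×2 comparison of each screening tool against the SGA reference standard. A minimal sketch with toy data (not study values) is:

```python
def screen_performance(tool_positive, sga_malnourished):
    """Sensitivity and specificity of a screening tool against SGA as the reference standard."""
    tp = sum(t and s for t, s in zip(tool_positive, sga_malnourished))
    fn = sum((not t) and s for t, s in zip(tool_positive, sga_malnourished))
    tn = sum((not t) and (not s) for t, s in zip(tool_positive, sga_malnourished))
    fp = sum(t and (not s) for t, s in zip(tool_positive, sga_malnourished))
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: an ANT score >= 2 flags risk; the SGA booleans are illustrative
ant_scores = [3, 0, 2, 1, 4, 0, 2, 1]
sga = [True, False, True, False, True, False, False, True]
sens, spec = screen_performance([s >= 2 for s in ant_scores], sga)
print(f"Sensitivity = {sens:.2f}, Specificity = {spec:.2f}")
```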
Background and Objectives: In practice, the most common potential side effects of phosphate binders are gastrointestinal. We hypothesized that regular use of phosphate binders may be associated with decreased appetite and dietary intake and, consequently, poor nutritional status. Methods and Study Design: This was a cross-sectional study of 78 patients (mean age 67.5±13.0 years, 34.6% women) undergoing maintenance hemodialysis (MHD). Participants were divided into three groups: sevelamer (n=25), lanthanum (n=24) and a control group (n=29). Eating motivation was assessed using visual analogue scales (VAS) and by a self-reported appetite assessment graded on a 5-point Likert scale. The main outcome measure was the difference in VAS scores for appetite, dietary intake and nutritional status (malnutrition-inflammation score [MIS]) among the study groups. Results: Appetite, dietary intake, biochemical nutritional markers, anthropometric measures and MIS were similar in the three groups. A statistically significant difference was observed in the sensation of fullness between the groups: the multivariable-adjusted OR was 4.90 (95% CI: 1.12 to 21.43; p=0.04) in the sevelamer carbonate group and 5.18 (95% CI: 1.15 to 23.30; p=0.03) in the lanthanum carbonate group versus the control group. However, no linear association was observed between MIS and VAS scores for appetite in any study group. Conclusions: Regular use of these phosphate binders was not associated with anorexia, decreased dietary intake or impaired nutritional status in the study population. Therefore, there is no preference in the choice of phosphate binder in MHD patients with hyperphosphatemia, even those who are at nutritional risk.
Background and Objectives: Malnutrition has an adverse impact on the survival of cancer patients. The aim of the present study was to investigate the prevalence of malnutrition and the status of nutrition support in hospitalized patients with cancer in China. Methods and Study Design: A multi-center, cross-sectional study was conducted in 29 tertiary public hospital wards in 14 Chinese cities. Malnutrition was defined as weight loss (WL) >5% over the past 6 months, or body mass index (BMI) <20 kg/m2 with WL >2%. The nutrition risk index (NRI) and performance status (PS) were evaluated. Results: 1138 hospitalized cancer patients (93.4% of the initial sample, 662 men, 60.6±14.5 years) were enrolled. Overall, 41.3% of patients were malnourished. The percentage of nutritional disorders as determined by the NRI was 51.4%. PS was 0 in 50.3%, 1 in 15.4%, 2 in 13.9%, and 3 or 4 in 20.4% of patients. Compared with patients with PS of 0-1, patients with PS of 3-4 had a relative risk of malnutrition of 1.275 (95% CI 0.250-0.488, p<0.0001). Only 38.6% of patients received nutrition support (45.0% of the malnourished and 31.9% of the non-malnourished patients); 63.2% of patients complained of poor appetite, while merely 14.0% had received nutrition counseling. Conclusions: The prevalence of malnutrition is high in hospitalized cancer patients, and inappropriate use of nutritional interventions highlights the urgent need to define standard operating procedures and quality control processes.
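The malnutrition definition used in this study is a simple decision rule, sketched below for clarity; the thresholds are those stated in the abstract, and the example inputs are illustrative.

```python
def is_malnourished(weight_loss_pct_6mo: float, bmi: float) -> bool:
    """Study definition: weight loss >5% over 6 months, or BMI <20 kg/m2 with weight loss >2%."""
    return weight_loss_pct_6mo > 5 or (bmi < 20 and weight_loss_pct_6mo > 2)

print(is_malnourished(weight_loss_pct_6mo=6.0, bmi=23.0))   # True: >5% weight loss
print(is_malnourished(weight_loss_pct_6mo=3.0, bmi=19.0))   # True: BMI <20 with >2% loss
print(is_malnourished(weight_loss_pct_6mo=1.0, bmi=24.0))   # False
```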
Background and Objectives: For the delivery of parenteral nutrition (PN), long-term central access is often required in infants with intestinal failure (IF). Compared with central venous catheters (CVCs), peripherally inserted central catheters (PICCs) are less invasive, as they are smaller and can even be placed without general anesthesia. In this study, we report the complications of long-term use of PICCs and compare our results with previously published research. Methods and Study Design: We reviewed infants treated at Xin Hua Hospital to determine the incidence of catheter-related bloodstream infections (CRBSIs) and of other complications. Results: A total of 43 infants diagnosed with intestinal failure and receiving PN through a PICC met the inclusion criteria. There were 66 PICCs accounting for 2563 catheter days, and a total of 29 complications were recorded. The overall incidence of complications was 11.31 per 1000 catheter days, and the incidence of CRBSI was 5.85 per 1000 catheter days. Gram-positive bacterial species were the most common organisms grown in blood cultures. Regarding risk factors, low weight at PICC insertion and low mean weight during the PICC dwell time were associated with an increased risk of complications. Conclusions: We did not find an increased incidence of CRBSI when using PICCs as an alternative to CVCs. As PICCs also offer advantages over CVCs in placement and nursing care, we recommend PICCs as the first choice in patients with IF.
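The complication rates above are expressed per 1000 catheter days, a rate obtained by dividing the event count by the total catheter days and scaling; the overall figure in the abstract is consistent with this arithmetic:

```python
def incidence_per_1000_catheter_days(events: int, catheter_days: int) -> float:
    """Rate of events per 1000 catheter days."""
    return events / catheter_days * 1000

print(incidence_per_1000_catheter_days(29, 2563))   # ~11.31 complications per 1000 catheter days
```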
Background and Objectives: Malnutrition is commonly diagnosed in patients with inflammatory bowel disease (IBD). However, only a few clinical studies have adequately explored the importance of body composition in the nutritional assessment of Chinese patients with IBD. Methods and Study Design: A total of 78 IBD patients were enrolled, and the Patient-Generated Subjective Global Assessment (PG-SGA) was used to assess malnutrition. Bioelectrical impedance analysis was used to analyze the body composition of IBD patients, and their fat-free mass index (FFMI) was also calculated. FFMI values <17 kg/m2 in men and <15 kg/m2 in women were considered low. Food consumption data were collected using a semi-quantitative food frequency questionnaire. Results: Of the 78 patients, 49 (62.8%) had a low FFMI. Among the patients with PG-SGA <4, 12 (41.4%) had altered body composition with a low FFMI. FFMI correlated negatively with PG-SGA scores and disease activity. No statistically significant differences in fat-free mass (FFM) or skeletal muscle mass were observed between patients in the active phase and patients in remission (p>0.05). However, the fat mass and visceral fat area of patients in remission were higher than those of patients in the active phase (p<0.05). The average energy derived from fat, protein and carbohydrate was 29.6±8.45%, 10.4±1.97% and 60.3±9.33%, respectively. Conclusions: Our study shows that 41.4% of IBD patients had altered body composition despite being well nourished according to the PG-SGA. Patients in remission presented with fat accumulation while their FFM remained low. The dietary pattern was not adequate among the IBD patients, especially regarding protein intake.
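The FFMI used above is fat-free mass divided by height squared, with the sex-specific cut-offs stated in the abstract; a minimal sketch with an illustrative patient (not study data) is:

```python
def ffmi(fat_free_mass_kg: float, height_m: float) -> float:
    """Fat-free mass index: FFM divided by height squared (kg/m2)."""
    return fat_free_mass_kg / height_m ** 2

def low_ffmi(value: float, male: bool) -> bool:
    """Cut-offs used in the study: <17 kg/m2 for men, <15 kg/m2 for women."""
    return value < (17 if male else 15)

v = ffmi(fat_free_mass_kg=46.0, height_m=1.72)   # ~15.5 kg/m2
print(f"FFMI = {v:.1f} kg/m2, low (male cut-off): {low_ffmi(v, male=True)}")
```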