Human history began with the understanding of food: what can be eaten, what cannot, and what is beneficial for health. This was the earliest and simplest form of human nutrition knowledge. The history of nutrition, from ancient dietary habits to the modern nutritional science we know today, is a fascinating journey spanning thousands of years. This article provides a comprehensive overview of the development of human nutrition.
Background and Objectives: Globally, there is an increasing trend in the number of individuals utilizing home enteral nutrition (HEN). In China, HEN is currently at a nascent stage. We aimed to investigate the current situation of caregivers of home enteral tube feeding (HETF) patients in order to provide more convenient services and assistance to them in China. Methods and Study Design: We conducted a questionnaire survey among family caregivers of enteral tube feeding patients preparing for hospital discharge to assess their awareness, needs, and preferences for resources. Results: A total of 108 family caregivers were recruited into the study; 65 were considered proficient in tube feeding knowledge, while 43 were not, a non-proficiency rate of 39.8%. Education level (p < 0.001), employment status (p = 0.029), and the patients’ age-adjusted Charlson Comorbidity Index (aCCI) (p = 0.032) were the critical factors affecting the caregivers’ tube feeding knowledge. The risk of non-proficiency in tube feeding knowledge was increased for those with lower education levels compared with those with higher education levels (OR = 5.08, 95% CI: 1.77-14.52). The family’s monthly income and expenditure (p = 0.030) was the sole factor affecting the type of tube feeding services needed. Conclusions: Providing tube feeding knowledge to HETF patients and their caregivers before discharge is essential. Personalized training, especially for caregivers with lower education levels, can improve their understanding. Additionally, implementing online nutrition follow-ups and Nutrition-Nursing Joint Clinics can help address healthcare resource disparities and offer more accessible services to HETF patients in China.
Background and Objectives: Sleeve gastrectomy (SG) is a commonly utilized surgical procedure for managing weight, and patients must adopt healthy lifestyle practices and dietary modifications to sustain weight loss and prevent relapse. This study aims to evaluate the dietary habits, levels of physical activity, and weight outcomes among Saudi women post-gastric sleeve surgery. Methods and Study Design: This study involved 352 female participants aged 20 to 50 who had undergone gastric sleeve surgery. Participants were also recruited through the Bariatric Patients Association, Bariatric World, and Patients Forum, and were contacted via phone. Surveys were used to ascertain their physical activity levels and dietary behaviors; a food frequency questionnaire (FFQ) was also administered, and each participant’s BMI was calculated. Results: The study involved 352 women, of whom the largest proportion was in the 40-50 age group. Prior to the surgery, nearly all of the participants (98.9%) were diagnosed as having obesity or morbid obesity, which significantly decreased following the surgery. According to the participants, the primary reason for undergoing the surgery was failed dietary regimens (26.4%). A considerable portion of participants continued consuming dietary supplements post-surgery (35.59%) and frequently consumed juices and sweets. Most of the participants did not meet the WHO recommendations for regular physical activity. Correlation analysis revealed a significant relationship between BMI and the consumption of healthy foods post-surgery. Conclusions: The study identified concerning lifestyle habits among the participants, underscoring the importance of maintaining a healthy diet and engaging in regular physical activity to optimize the long-term benefits of weight loss surgery and enhance overall well-being.
Background and Objectives: The low fermentable oligo-, di-, mono-saccharides and polyols (FODMAP) diet is an effective dietary intervention for irritable bowel syndrome (IBS), yet up to 50% of patients fail to respond adequately. Identifying reliable predictors of response could optimize treatment selection and improve treatment outcomes while avoiding unnecessary dietary restrictions. This narrative review examines current evidence for predictors of response to the low FODMAP diet and highlights gaps in knowledge that must be addressed to develop clinically useful indicators for routine practice. Methods and Study Design: We reviewed the literature on the low FODMAP diet and on studies investigating factors that may predict treatment response, including clinical, diagnostic, biological, biochemical, and microbial markers. Results: Several potential predictors of response to the low FODMAP diet have emerged, including baseline symptom severity, psychological factors (particularly depression), hydrogen breath test results, volatile organic compounds in fecal samples, and specific gut microbiota profiles. Clinical and psychological measures show the most immediate potential for implementation due to accessibility and established measurement tools. Biological markers, including breath testing, metabolomics, and microbiome analysis, show promise but require further validation in larger, diverse populations and standardization of methodologies. Conclusions: Despite promising research, significant gaps remain in developing reliable, accessible predictors of response to the low FODMAP diet. Future research should focus on validating simple clinical tools that combine symptom profiles with psychological assessment to guide treatment decisions. A personalized approach to dietary management of IBS based on reliable response predictors would optimize clinical outcomes while minimizing unnecessary dietary restriction and healthcare resource utilization.
Background and Objectives: Consuming a diet that ensures adequate nutrient intake is essential to address all forms of malnutrition. In Japan, a meal combining staple food, main dish, and side dish is considered a balanced diet. This study was conducted to investigate whether the frequency of meals combining staple food, main dish, and side dish is associated with nutrient adequacy. Methods and Study Design: This cross-sectional study included 6,264 adults. All data were obtained from the 2015 Health and Nutrition Survey in Shiga prefecture. Staple food, main dish, and side dish were each defined as a dish with primary ingredients of ≥50 g. Based on meal frequency, participants were divided into ≥2 and <2 times/d groups. Nutrient adequacy was evaluated using a score based on the reference values provided in the Dietary Reference Intakes for Japanese 2020 (DRIs–J). The t-test was used to compare nutrient adequacy between the two groups. Results: Of the total participants, only 1,423 (22.7%) were classified into the ≥2 times/d group, and they had significantly higher DRIs–J scores than participants in the <2 times/d group (p < 0.001). The adequacy percentage of all nutrients except saturated fatty acid, particularly dietary fiber and most micronutrients, was >1.5-fold higher in the ≥2 times/d group than in the <2 times/d group (p < 0.001). Conclusions: This study provides important evidence that consuming meals combining staple food, main dish, and side dish at least twice a day is effective in maintaining a diet with high nutrient adequacy.
Background and Objectives: To explore the nutritional challenges of adults aged 40-69 living in Chinese cities and their influencing factors. Methods and Study Design: This cross-sectional study involved 300 subjects from 29 cities in China. Questionnaires were used to collect demographic information, the presence of chronic disease, and the use of nutritional supplements and fortified foods. Twenty-four-hour food intake was recorded using the Eat-Right Assistant, a validated digital service. Results: Fiber (56.7%), calcium (66.3%) and selenium (67.0%) were the nutrients with the highest prevalence of insufficient intake. The foods with the highest prevalence of inadequate consumption were dairy products (91%), fruits (84.3%), tubers (76.3%), soybeans and nuts (70%), and whole grains (65%). Even though 95.7% of the study population showed a medium-to-high level of dietary diversity, dietary imbalance was present in 99% of the subjects. Higher socioeconomic status, passive health awareness, and the use of nutritional supplements or fortified foods showed a positive influence on nutrient intake and dietary quality. Conclusions: This research provided insights into the dietary intake status of 300 urban residents aged 40-69 and its influencing factors. The adult population still faces the challenges of inadequate nutrient intake and an imbalanced diet. In addition, this study supported the feasibility of using a digital service in research. Further studies with a larger sample size are needed to confirm the current findings. This will help to clarify the unmet nutritional needs of adults in China and thus help to achieve healthy aging.
Background and Objectives: To investigate the correlation between the weight-adjusted-waist index (WWI), a novel obesity index, and serum ferritin (SF) levels in patients with type 2 diabetes mellitus (T2DM), and the association between the WWI and the prevalence of hyperferritinemia. Methods and Study Design: A total of 943 patients with T2DM were divided into three groups based on WWI tertiles. Disparities in SF levels and the prevalence of hyperferritinemia were compared among these groups. The correlations among the WWI, SF levels, and hyperferritinemia were analyzed in patients with T2DM. Results: As the WWI tertile increased, SF levels tended to increase (p for trend <0.01). A statistically significant positive correlation was observed between the WWI and SF levels (R = 0.263, p < 0.001). After adjusting for confounders by multiple linear regression, a significant positive correlation was maintained between the WWI and SF levels (β = 0.194, 95% CI: 49.914-112.120, p < 0.01). Binary logistic regression analysis revealed a positive association between the WWI and the likelihood of hyperferritinemia, with a notably stronger correlation observed in females compared with males (OR = 3.248, 95% CI: 2.027-5.204, p < 0.01 vs. OR = 2.091, 95% CI: 1.432-3.054, p < 0.01). Conclusions: With increasing WWI, SF levels gradually increased in patients with T2DM. The WWI exhibited a positive correlation with SF levels and hyperferritinemia, more significantly in female patients.
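The abstract above does not state how the WWI is computed; it is commonly defined as waist circumference (cm) divided by the square root of body weight (kg). A minimal sketch under that assumed definition:

```python
import math

def weight_adjusted_waist_index(waist_cm: float, weight_kg: float) -> float:
    """Weight-adjusted-waist index (WWI).

    Assumed definition (not stated in the abstract): waist
    circumference in cm divided by the square root of weight in kg.
    """
    return waist_cm / math.sqrt(weight_kg)

# Illustrative values only: a 95 cm waist at 80 kg body weight
print(round(weight_adjusted_waist_index(95, 80), 2))  # 10.62
```

Because weight is divided out, two patients with the same waist circumference but different body weights receive different WWI values, which is what makes the index a weight-adjusted measure of central obesity.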
Background and Objectives: Exploring the effects of circulating micronutrients on Graves' disease (GD) through observational research or randomized controlled trials has drawn increasing attention. To investigate putative causal relationships, we performed a two-sample Mendelian randomization (MR) study. Methods and Study Design: The inverse-variance weighted (IVW) method was employed as the primary approach to determine the causal relationships between micronutrient levels and GD. Several complementary sensitivity analyses were also undertaken to evaluate the impact of potential violations of MR assumptions. In addition, we utilized cross-sectional data from the National Health and Nutrition Examination Survey (NHANES) to analyze differences in the prevalence of GD among participants with different trace nutrient concentrations. Results: In terms of vitamins, IVW MR analysis revealed a suggestive relationship between each standard deviation decrease in vitamin D level and increased risk of GD (OR=1.28, 95% CI: 1.04-1.59, p = 0.0212). A nominally significant association was also noted between genetically predicted vitamin B-6 concentration and higher risk of GD (OR=1.56, 95% CI: 1.08-2.25, p = 0.0171). Genetically predicted concentrations of the other vitamins and of six minerals were not associated with GD susceptibility. The causal estimates from other complementary MR approaches were consistent with these findings. Additionally, we found that participants from NHANES with vitamin D and vitamin B-6 deficiency had a higher prevalence of GD. Conclusions: Our study provides evidence of a unidirectional causal relationship of circulating vitamin B-6 and vitamin D with GD. Dietary supplementation with these micronutrients may complement classical therapies for preventing and treating GD.
Background and Objectives: Anemia is a major health problem worldwide, with complex etiologies, that significantly affects quality of life and health outcomes. In Indonesia, anemia is a public health concern with a complex interplay between genetic, environmental, and infectious disease factors, and its prevalence increased from 2007 to 2018. This study aims to explore factors contributing to anemia in Indonesia. Methods and Study Design: We used archived data from various population studies collected between 1995 and 2023. A total of 5,486 subjects from 17 study populations in Indonesia were included in the analyses. Results: The proportion of anemic women was higher than that of anemic men (p<0.001), and the prevalence of anemia varied across populations. More than 50% of the study subjects had microcytic hypochromic anemia, with 35% indicative of iron deficiency and 13% of thalassemia based on the Mentzer index and RDW index cut-offs. Hb analysis showed that the proportions of above-normal HbA2 and HbF were significantly higher in the anemic group (p<0.001). We also found that the proportion of beta thalassemia was higher in the anemic group (p<0.001), indicating that genetic disorders are prevalent in Indonesia. Conclusions: The prevalence of anemia in Indonesia is high, and its etiology is very complex, involving nutritional and non-nutritional factors. Therefore, anemia mitigation in the Indonesian population should consider both. Policy makers should consider intervention programs that go beyond nutrient-specific strategies, such as accounting for the genetic background of individuals.
Background and Objectives: This study aimed to assess the predictive power of the Geriatric Nutritional Risk Index (GNRI) and the triglyceride-glucose (TyG) index for poor prognosis in non-ST segment elevation acute coronary syndrome (NSTE-ACS) patients post-percutaneous coronary intervention (PCI). Methods and Study Design: A cohort of 393 NSTE-ACS patients who underwent PCI at the People’s Hospital of Nanjing Jiangbei from 2016 to 2022 was analyzed. Major adverse cardiovascular events (MACEs), including death, non-fatal myocardial infarction, and target vessel revascularization, served as the primary outcome. Relationships between GNRI, TyG index, and MACEs were explored using univariate and multivariate logistic regression, with results presented as odds ratios (OR) and 95% confidence intervals (CI). The predictive value was further evaluated using the area under the curve (AUC) from the receiver operating characteristic (ROC) curve. Results: MACEs occurred in 34 patients. A TyG index ≥1.36 was associated with a significantly increased risk of MACEs (OR=5.07, 95%CI: 1.64-15.71), while a GNRI ≥108 indicated a decreased risk (OR=0.17, 95%CI: 0.04-0.68). These associations were consistent across various subgroups, including age, gender, and specific pre-existing conditions. The combined predictive value of TyG index and GNRI was higher than each alone (AUC=0.711, 95%CI: 0.642-0.779). Conclusions: In post-PCI patients with NSTE-ACS, the TyG index and GNRI are significant predictors of MACEs, with the TyG index indicating higher risk and GNRI lower risk. Their combined use may enhance the predictive accuracy for MACEs in this patient population.
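Neither index formula appears in the abstract above. As commonly defined, TyG = ln(fasting triglycerides × fasting glucose / 2) and GNRI = 1.489 × albumin (g/L) + 41.7 × (weight / ideal weight); the sketch below assumes those standard definitions, with TyG inputs in mmol/L (which yields values on the ~1-2 scale consistent with the study's cut-off of 1.36, whereas mg/dL inputs give values around 8-9):

```python
import math

def tyg_index(tg_mmol_l: float, glucose_mmol_l: float) -> float:
    """Triglyceride-glucose (TyG) index, assumed definition:
    ln(fasting triglycerides x fasting glucose / 2), inputs in mmol/L."""
    return math.log(tg_mmol_l * glucose_mmol_l / 2)

def gnri(albumin_g_l: float, weight_kg: float, ideal_weight_kg: float) -> float:
    """Geriatric Nutritional Risk Index, assumed definition:
    1.489 x albumin (g/L) + 41.7 x (weight / ideal weight),
    with the weight ratio typically capped at 1."""
    ratio = min(weight_kg / ideal_weight_kg, 1.0)
    return 1.489 * albumin_g_l + 41.7 * ratio
```

Under these assumed definitions, an albumin of 40 g/L at ideal body weight gives a GNRI of about 101, below the study's low-risk threshold of 108.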
Background and Objectives: Repeating food frequency questionnaires (FFQs) within the same population has been reported to improve the validity of correlation coefficients (CCs). However, the enhancement of validity in ranking agreement remains underreported. Herein, we assessed the validity of energy and nutrient intake estimates using single and multiple FFQs and their ability to rank individuals. Methods and Study Design: A total of 213 men and women aged ≥20 years were recruited from residents participating in the Tohoku Medical Megabank Project (TMM) cohort studies; three FFQs were conducted in November each year from 2019 to 2021, with 12-day weighed food records (WFRs) as the reference method. Spearman’s rank CCs were calculated between estimates from single or multiple FFQs and those obtained through the 12-day WFR. Additionally, ranking agreement was compared based on cross-classification. Results: CCs between intake estimated using a single FFQ and the 12-day WFR were moderate for several nutrients, with median CCs of 0.52 for men and 0.48 for women. CCs for multiple FFQs were slightly higher than those for a single FFQ, with median CCs of 0.59 for men and 0.56 for women. Regardless of the number of FFQs, the proportion of subjects classified into the opposite extreme category was ≤5% for most nutrients. Conclusions: A single FFQ used for adults in the TMM cohort studies showed moderate validity. Estimates from multiple FFQs improved accuracy slightly; nevertheless, this indicates that relying on a single FFQ is unlikely to result in serious misclassification compared with using intake data from multiple FFQs over a relatively short period.
Background and Objectives: Nutrition is important in promoting health and preventing disease, while malnutrition can exacerbate disease symptoms and lead to adverse clinical outcomes in patients. The process of nutritional diagnosis and treatment includes nutritional risk screening, nutritional assessment, and nutritional therapy. This study aims to characterize the volume of publications, the collaboration among research subjects, the progress of research content, and the research hotspots of nutritional risk screening and assessment, and thereby identify future trends and directions in global nutritional screening. Methods and Study Design: Articles on nutritional risk screening, nutritional assessment, and the application of nutritional diagnostic tools were identified from the Web of Science, and the collected data were analysed using bibliometrics and information visualisation with the help of CiteSpace software. A total of 10,632 articles published between 1991 and 2024 were selected. Results: The country with the highest number of articles was the United States; two institutions, the University of São Paulo and the Karolinska Institutet, had higher centrality and numbers of articles. Keyword burst analysis revealed that global leadership initiative on malnutrition, diagnosis, criteria, myasthenia gravis, and clinical nutrition were the five emergent terms that lasted until 2024 and were the most popular hot topics among experts and scholars. Conclusions: We describe the characteristics and trends of the development of nutritional risk screening and assessment studies. Currently, collaboration between institutions and authors is not close, while the field is trending towards more specific research and a greater focus on disease progression in patients.
Background and Objectives: This study aimed to analyze the relationship between thyroid volume (TVOL) and the physical development of children, and to explore suitable TVOL correction methods. Methods and Study Design: A total of 1500 children aged 8-10 years from Gansu province, northwest China, were selected. The height (H), weight (W), and urine iodine of the children were measured, and their thyroids were examined by ultrasound. Body mass index (BMI), body surface area (BSA, calculated by three formulas) and TVOL were calculated. The relationships between TVOL and age, sex, and physical development were analyzed. The applicability of TVOL correction methods, including BMI-corrected volume (BMIV), BSA-corrected volume (BSAV), the weight- and height-corrected volume indicator (WHVI), and the height-corrected volume indicator (HVI), was compared. Results: Median urinary iodine concentrations of children aged 8, 9, and 10 years were 166.6 μg/L, 167.2 μg/L and 178.8 μg/L, respectively. The rate of iodine deficiency was 20.3%, and the rate of thyroid goiter was 3.2%. Physical development indexes (height, weight, BMI and BSA) and TVOL increased with age, and the physical development indexes of boys were higher than those of girls (p < 0.05). Only BSAV1 had no correlation with any physical development index (p > 0.05). The TVOL P97 (97th percentile) of children aged 8, 9, and 10 years was 4.4 mL, 4.9 mL, and 6.5 mL, respectively; after BSAV1 correction, the values were 4.6 mL, 4.7 mL, and 5.9 mL. The difference between TVOL and BSAV1 ranged from -0.37% to 0.36%. Conclusions: Thyroid volume is affected not only by age but also by physical development, and thyroid goiter should therefore be assessed based on both. The formula BSAV1 = TVOL/(W^0.425 × H^0.725 × 71.84 × 10^-4) was a suitable TVOL correction method.
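The denominator of the BSAV1 formula in the conclusions is the Du Bois body surface area estimate (W^0.425 × H^0.725 × 71.84 × 10^-4, with weight in kg and height in cm). A minimal sketch of the correction; the example values are illustrative, not taken from the study:

```python
def dubois_bsa(weight_kg: float, height_cm: float) -> float:
    """Du Bois body surface area (m^2): W^0.425 x H^0.725 x 71.84e-4,
    matching the denominator of the BSAV1 formula in the abstract."""
    return (weight_kg ** 0.425) * (height_cm ** 0.725) * 71.84e-4

def bsav1(tvol_ml: float, weight_kg: float, height_cm: float) -> float:
    """BSA-corrected thyroid volume: measured TVOL divided by BSA."""
    return tvol_ml / dubois_bsa(weight_kg, height_cm)

# Hypothetical child of 30 kg and 135 cm: BSA is roughly 1.07 m^2,
# so a measured thyroid volume of 5.0 mL corrects to about 4.7 mL/m^2.
```

Dividing by BSA is what removes the residual correlation with body size, which is why BSAV1 alone showed no correlation with the physical development indexes.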
Background and Objectives: Obese and diabetic individuals tend to have insulin resistance but are less likely to develop osteoporosis. The association of triglyceride-glucose (TyG)-related indices with osteoporosis remains controversial, and longitudinal evidence on male osteoporosis (MOP) is limited. This study aims to examine the associations of TyG, TyG-body mass index (TyG-BMI) and the metabolic score for insulin resistance (METS-IR) with osteoporosis risk among older men. Methods and Study Design: A cohort study of 1622 middle-aged and older men was initiated in 2015 and followed up until 2022. Participants with osteoporosis at baseline or with recognized secondary risk factors were excluded. TyG, TyG-BMI, METS-IR and their corresponding quantiles were calculated. Cox proportional hazards regression models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs). Receiver operating characteristic (ROC) curves were applied to estimate their performance in osteoporosis screening. Results: 72 of the 1622 participants newly developed osteoporosis during 9317 person-years. The adjusted HRs of TyG, TyG-BMI, and METS-IR for MOP were 0.573 (95%CI 0.336-0.976), 0.991 (95%CI 0.984-0.999) and 0.929 (95%CI 0.892-0.968), respectively, with linear dose-response relationships. Subgroup analysis showed that the estimated benefit for MOP incidence was consistent among participants aged more than 70 years and was related to BMI and to the amount of milk, fresh fruit and vegetables consumed. No difference was found in the areas under the ROC curves for screening osteoporosis, which ranged from 0.585 to 0.617. Conclusions: TyG and related indices were associated with the incidence of osteoporosis in older men, and the relationship appeared to correlate with BMI and nutritional behaviors.
Dear Editor,
The underweight classification for adults, defined by the World Health Organization (WHO) as a body mass index (BMI) <18.5 kg/m², has been applied uniformly across populations worldwide. Although the WHO has recognized that body composition and health risks differ according to ethnicity by offering adjusted BMI cut-offs for overweight and obesity in Asian populations (23-24.9 kg/m² for overweight and ≥25 kg/m² for obesity),1 the underweight threshold has not been similarly adjusted. Because there is evidence of distinct physiological and metabolic profiles in Asian populations, the applicability of this uniform threshold to them remains an important question.
The modified overweight and obesity standards were informed by research showing that Asians often have a higher percentage of body fat at a given BMI than their Western counterparts. Nevertheless, no comparable adjustment has been made for the underweight category, even though research indicates that underweight Asians may face different health risks than people of other ethnicities with the same BMI. For example, underweight Asian individuals may be particularly susceptible to sarcopenia, which can compound the health problems associated with a low body mass index.2
In addition, a single global underweight cut-off affects clinical practice and public health initiatives. In countries like India, where malnutrition remains a pressing issue, a tailored approach that considers local dietary patterns, socio-economic inequalities, and health outcomes could improve underweight classification. Otherwise, the prevalence of malnutrition may be underestimated, which can affect the allocation of resources and the design of interventions.
We recommend that the WHO reconsider the underweight BMI cut-off with the same rigor applied to the overweight and obesity classifications, using data from large, population-specific studies. Closing this gap is in line with the goals of global health equity, ensuring that BMI categories accurately reflect the diverse health profiles of individuals. We encourage scholars and policy makers to cooperate in developing and synthesizing robust evidence to support this necessary amendment.
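The letter's point can be made concrete with a small classifier contrasting the global WHO categories with the Asian-specific overweight/obesity cut-offs it cites; the function name and structure are illustrative only, and the underweight threshold of 18.5 kg/m² is currently identical under both schemes:

```python
def classify_bmi(bmi: float, asian_cutoffs: bool = False) -> str:
    """Classify BMI (kg/m^2) using WHO categories. With asian_cutoffs=True,
    the overweight/obesity thresholds drop from 25/30 to 23/25, as cited
    in the letter; the underweight cut-off of 18.5 is the same under both."""
    overweight, obese = (23.0, 25.0) if asian_cutoffs else (25.0, 30.0)
    if bmi < 18.5:
        return "underweight"
    if bmi < overweight:
        return "normal"
    if bmi < obese:
        return "overweight"
    return "obese"

# A BMI of 24 is "normal" under the global scheme but "overweight"
# under the Asian-specific cut-offs:
print(classify_bmi(24.0), classify_bmi(24.0, asian_cutoffs=True))
```

Note that `asian_cutoffs` changes only the upper thresholds; any ethnicity-specific underweight cut-off of the kind the letter advocates would require an additional, evidence-based parameter.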
Background and Objectives: The aim of this study was to elucidate the dose-response relationship between dietary carbohydrate consumption and the improvement of glycemic control and insulin sensitivity in individuals with type 2 diabetes mellitus (T2DM) following an intensive dietary intervention. Methods and Study Design: Randomized controlled trials published up to December 2023 were systematically reviewed from four databases: PubMed, Embase, Web of Science, and the Cochrane Database of Systematic Reviews. Primary outcomes were glycated hemoglobin (HbA1c) and fasting glucose (FG); secondary outcomes were BMI, fasting insulin (FI), and the Homeostasis Model Assessment-Insulin Resistance (HOMA-IR). We performed a random-effects dose-response meta-analysis to estimate mean differences (MDs) for each 10% reduction in carbohydrate intake. Results: A total of 38 articles were analyzed, encompassing 2,831 participants. Across the range from the highest recorded carbohydrate intake (65%) down to 5%, each 10% decrease was associated with the following improvements: HbA1c (MD: -0.39%; 95%CI: -0.5 to -0.28%), FG (MD: -0.55 mmol/L; 95%CI: -0.82 to -0.28 mmol/L), BMI (MD: -0.83 kg/m2; 95%CI: -1.27 to -0.38 kg/m2), FI (MD: -2.19 pmol/L; 95%CI: -3.64 to -0.73 pmol/L), and HOMA-IR (MD: -1.53; 95%CI: -3.09 to 0.03). Conclusions: Reducing dietary carbohydrate intake significantly improves glycemic control and insulin resistance in individuals with type 2 diabetes. The dose-response relationship was linear, with significant effects occurring within the first 6 months of the intervention; these effects diminished beyond this period. Notably, the improvements in glycemic parameters were not significantly affected by whether calorie restriction was implemented.
Cow’s milk protein allergy is an adverse immune reaction to proteins found in cow’s milk, primarily casein and whey, affecting artificially fed, breastfed and mixed-fed infants. The immunological mechanisms involved lead to diverse clinical presentations, most commonly affecting the digestive, respiratory, and integumentary systems. Diagnosis relies primarily on clinical evaluation due to the absence of specific diagnostic tests, making accurate identification crucial to prevent misdiagnosis or underdiagnosis. Treatment requires strict avoidance of cow's milk proteins in the diets of both children and breastfeeding mothers, with close monitoring of nutritional status during long-term management. Recent advancements in treatment, including the use of probiotics, provide new options for improving clinical outcomes. This narrative review aims to provide clinicians with evidence to standardise diagnosis and treatment, improve food allergy management by non-allergy specialists and develop accurate feeding recommendations.
Background and Objectives: Behavioural strategies can promote adherence to intensive lifestyle treatments for obesity. This study aimed to explore effective behavioural strategies for weight maintenance in Chinese patients with overweight or obesity. Methods and Study Design: A retrospective analysis of weight maintenance data was conducted. Patients with overweight or obesity who had received a 3-month weight loss and behavioural intervention were asked to complete questionnaires to monitor compliance with behavioural strategies after the weight loss. They continued to follow a daily calorie restriction for 12 months to maintain their weight. The primary outcome was the association between a total weight loss (TWL) of more than 5% and compliance with behavioural strategies during the 6-month weight maintenance phase. Results: A total of 131 patients completed the questionnaire. The top three easiest-to-perform behaviours were eating vegetables and protein first and carbohydrates later, self-weighing each day, and having regular eating times. Of the 131 patients, 61 (46.5%) and 42 (32.1%) were followed up for 6 and 12 months, respectively. Reducing high-fat food intake (p = 0.002) and eating an average of 5 times a day (p = 0.034) were associated with a TWL of more than 5% during 6-month weight maintenance. Reducing high-fat food intake was also associated with a TWL of more than 5% during 12-month weight maintenance (p = 0.029). Conclusions: Chinese patients with overweight or obesity who achieved a TWL of more than 5% were more likely to have reduced high-fat food intake during long-term weight maintenance.
Background and Objectives: Protein-energy wasting (PEW) is common among maintenance hemodialysis (MHD) patients and is strongly associated with mortality and adverse outcomes. This study aimed to assess the effects of low-protein energy supplements on the nutritional status of MHD patients with PEW. Methods and Study Design: We conducted a prospective randomized controlled trial in 68 MHD patients suffering from PEW. Patients randomized to the intervention group received dietary counseling along with a daily low-protein supplement providing 212 kcal of energy and 2.4 g of protein for 3 months. The control group received dietary counseling only. Dietary data, nutritional assessments, anthropometric measurements, bioelectrical impedance analysis, and blood analysis were collected from both groups at baseline and after three months. Results: Fifty-nine MHD patients completed the study. Patients in the intervention group showed an increase in energy intake (p < 0.001). A significant decrease in the Malnutrition Inflammation Score (MIS) (p < 0.001) and the Nutrition Risk Screening 2002 score (p < 0.001) was found in the intervention group compared with the control group. Moreover, significant improvements in mid-upper arm circumference (p < 0.001), mid-arm muscle circumference (p < 0.001), albumin (p = 0.003), and prealbumin (p = 0.033) were observed in the intervention group compared with the control group. Conclusions: The combination of oral low-protein supplements and dietary counseling for three months was more effective than dietary counseling alone in improving the nutritional status of MHD patients with PEW.
Background and Objectives: Patients with gastrointestinal (GI) malignancies are at high risk for malnutrition because of reduced food intake, poor digestion, and altered absorption. Methods and Study Design: In a retrospective review of medical records for patients admitted to urban hospitals in an Asian nation for GI tumor surgery (gastric, colon, or anorectal cancers), we found that malnutrition was common yet often overlooked. Our review identified records for 349 adult GI-tumor surgery patients. The Nutrition Risk Screening-2002 (NRS-2002) was the most frequently used screening instrument. In a further review, we compared outcomes for malnourished GI tumor surgery patients given daily oral nutritional supplements (ONS) with outcomes for patients not given ONS. Results: Only 20% of patients in our sample underwent nutritional screening or assessment on admission. Of those who did, nearly 60% were malnourished. Although no statistically significant differences were observed, likely owing to small sample sizes, malnourished patients who received ONS had fewer complications and lengths of stay shorter by one day. Such findings reveal many missed opportunities to improve patient outcomes and to avert excess healthcare costs for treatment of complications, slowed recovery, longer hospital stays, and readmissions. Conclusions: Based on our findings, nutritional training for professionals is necessary to address the serious problems of under-recognition and inadequate treatment of malnutrition in hospitalized patients.
Background and Objectives: Acquired acrodermatitis enteropathica (AE), a rare dermatological condition, often stems from nutritional zinc deficiency linked to prolonged total parenteral nutrition (TPN). This study aims to explore the pathogenesis, clinical characteristics, and treatment approaches for AE, emphasizing the importance of early recognition and intervention. Methods and Study Design: A 51-year-old female patient with acquired AE presented with widespread erythema, pustules, and itching. A comprehensive diagnostic approach, including various tests and skin biopsy pathology, confirmed the diagnosis. Treatment involved zinc gluconate supplementation, topical applications, and symptomatic TPN support. Results: Significant improvement was observed one week post-treatment, with reduced erythema, pustules, and skin lesions, along with improved hair loss. Erosive and ulcerative surfaces healed substantially, indicating positive treatment outcomes. Conclusions: The successful management of adult-onset AE in this case underscores the significance of recognizing clinical features and implementing effective treatment strategies. These findings provide valuable insights for diagnosing and managing AE.
Background and Objectives: Critically ill patients require individualized nutrition support, with assessment tools like the Nutrition Risk Screening 2002 and Nutrition Risk in the Critically Ill scores. Challenges in continuous nutrition care prompt the need for innovative solutions. This study develops an artificial intelligence assisted nutrition risk evaluation model using explainable machine learning to support intensive care unit dietitians. Methods and Study Design: Ethical approval was obtained for a retrospective analysis of 2,122 patients. Nutrition risk assessment involved six dietitians, with 1,994 patients assessed comprehensively. Artificial intelligence models and Shapley additive explanations (SHAP) analysis were used to predict and understand nutrition risk. Results: High nutrition risk (35.2%) correlated with older age, lower body weight, BMI, and albumin, and higher disease severity. The AUROC scores achieved by XGBoost (0.921), CatBoost (0.926), and LightGBM (0.923) were superior to those of logistic regression. Key features influencing nutrition risk included the Acute Physiology and Chronic Health Evaluation II score, albumin, age, BMI, and haemoglobin. Conclusions: The study introduces an artificial intelligence assisted nutrition risk evaluation model, offering a promising avenue for continuous and timely nutrition support in critically ill patients. External validation and exploration of feature relationships are needed.