Background – Observational studies have reported associations between size at birth and both higher levels of cardiovascular disease risk factors and increased risk of cardiovascular disease in adult life. It has been suggested that these associations provide evidence of programming in humans. While programming is well recognised in animals, the existence and importance of programming in humans is disputed. In this paper I will summarise analyses from the Avon Longitudinal Study of Parents and Children (ALSPAC) that have attempted to clarify the meaning of these associations with size at birth and to explore associations between early life exposures and offspring blood pressure.
Methods – ALSPAC is a prospective study investigating the health and development of children. 14,541 pregnant women living in one of three Bristol-based health districts in the former County of Avon, with an expected delivery date between April 1991 and December 1992, were recruited. Detailed information has been collected from pregnancy onwards using questionnaires, data extraction from medical notes, linkage to routine information, and clinics that the children have been invited to attend from the age of seven years.
Results – Birthweight was associated with blood pressure at age 7: in unadjusted models a 1 kg higher birthweight was associated with a 0.8 mmHg lower systolic blood pressure (95% CI -1.4, -0.3) in boys but not in girls (1). We have shown that birthweight was positively associated with measures of lean mass (320 g per SD increase in birthweight, p=0.001) and fat mass (2.5% per SD increase in birthweight, p=0.001) at age 9, and that ponderal index at birth was more strongly associated with fat mass than was birthweight (2). Using results from several studies including ALSPAC, a common variant in the glucokinase gene, the A allele at GCK(-30), was shown to be associated with a 0.075 mmol/L (p=0.003) increase in fasting plasma glucose in pregnancy (3). Maternal genotype was associated with a higher offspring birthweight of 64 g (95% CI 25-102), but child genotype was not associated with birthweight (3). We showed that birthweight was associated with fat intake at 43 months but not at age 7 (4). We found no evidence that maternal age (5), maternal diet (6) or maternal smoking (7) were associated with blood pressure.
Conclusions – We have found some evidence to support the suggestion that shared genetics may in part explain birthweight-cardiovascular disease associations. We have found little evidence that early life exposures are associated with offspring blood pressure in this well-nourished contemporary population.
Background – Bones develop to withstand the mechanical forces applied to them from muscle pull and growth (1). Heritable factors account for up to 80% of the variance in bone mass (2). Of the modifiable factors, physical activity, lean tissue mass and dietary calcium intake have been extensively evaluated. After genetics, physical activity appears to be the major determinant of peak bone mass, accounting for up to 17% of its variance (3).
Objective – To discuss the major modifiable factors of bone strength including physical activity, lean tissue mass and dietary calcium intake.
Design – Review of the literature on modifiable factors of bone strength.
Outcomes – Bone mineral density testing has shown that muscle mass is a major predictor of bone mineral content (BMC) (4). This is seen in children with reduced ambulation, where reduced lower limb bone and muscle mass co-exist (1). In healthy children, physical activity can positively influence bone mass (5), but this is not maintained once activity is ceased (3). Mechanical stimulation of bone through vibration takes advantage of the anabolic effects of muscle stimulation on bone (6).
There are no data on the minimum threshold of calcium intake required for optimal bone health during childhood (7). Studies of calcium supplementation have lacked adequate control groups to account for the multiple factors influencing bone, and none have assessed bone strength (7). Further, calcium's effect appears short lived, with very few studies showing a prolonged benefit on bone mass. Milk calcium may be more important than calcium salts, as milk may also provide essential proteins and growth factors necessary for bone development (3).
Conclusion – As more is done to unravel the modifiable influences on bone strength, it appears that the greatest promise lies with the development and maintenance of optimal physical activity and thus lean tissue mass.
Background – Breast-feeding is associated with higher intelligence in later life, but the mechanisms remain unknown. The subject is controversial because it is difficult to disentangle genetic, environmental and nutritional factors. Understanding the molecular basis of learning, memory and cognitive development is one of science's most difficult frontiers, yet of increasing clinical and public health importance. Sialic acid (Sia), a 9-carbon sugar, is a vital component of brain gangliosides and the building block of polysialic acid (PSA) on neural cell adhesion molecule (NCAM). Human milk is one of nature's richest sources of Sia (~1 g/L), but infant formulas contain little (0-0.25 g/L). Gangliosides and polysialylated NCAM in the brain have an important role in cell-to-cell interactions, neuronal outgrowth, modifying synaptic connectivity, and memory formation. The alpha 2,8 sialyltransferase IV (ST8SiaIV) is one of two key enzymes for synthesizing PSA on NCAM. In rodents, the level of NCAM polysialylation increases with learning behavior. The liver can synthesise Sia from glucose, but the activity of the limiting enzyme, UDP-N-acetylglucosamine-2-epimerase (GNE), is low during the neonatal period. In rat pups, supplementation with Sia has been shown to enhance learning, concomitantly with increased brain ganglioside-bound and protein-bound Sia content. An exogenous source of Sia may be critical under conditions of extremely rapid brain growth, particularly during the first months after birth.
Objective – To examine whether dietary Sia may be a conditionally essential nutrient for early brain development and cognition, using piglets as an animal model of human infants.
Design – Piglets (n = 54) were allocated to 1 of 4 groups fed sow’s milk replacer supplemented with increasing amounts of Sia as casein glycomacropeptide for 35 days. Learning speed and memory were assessed using an easy and difficult visual cue in an 8-arm radial maze. Brain ganglioside and sialoprotein concentrations and mRNA expression of two learning-associated genes (ST8SiaIV and GNE) were determined.
Results – In both tests, the supplemented groups learned significantly faster than the control group, with a dose-response relationship in the difficult task (P = 0.018), but not in the easy task. In the hippocampus, there were significant dose-response relationships between level of Sia supplementation and mRNA levels of ST8SiaIV (P = 0.002) and GNE (P = 0.004), corresponding to proportionate increases in protein-bound Sia concentration in the frontal cortex.
Conclusion – Sia may be a limiting nutrient in the neonatal period, facilitating optimal cognitive development in young animals. An exogenous source of Sia enhances brain development, providing a mechanism to explain the link between breast-feeding and higher intelligence.
Background – Poor nutrition can adversely affect health and quality of life in the elderly population. Whilst several studies have used diagnostic screening questionnaires as a tool for nutritional risk assessment, few studies have investigated the relationships of score results derived from the questionnaires with the nutritional status indicated by anthropometric measurements and biochemical markers.
Objectives – To determine the nutritional status among a group of Australian aged care residents utilising three available diagnostic screening questionnaires: Mini Nutritional Assessment (MNA), Geriatric Depression Scale (GDS), and Katz Activities of Daily Living (ADL). The associations of the questionnaire results with the nutritional (anthropometry and serum concentrations of 25(OH)D, folate and vitamin B12) and mobility (using the timed up and go, TUG, walking test) status of the study population were examined.
Subjects and Methods – Of 115 participants, 66 were able and willing to complete all three questionnaires, 113 provided a blood sample for biochemical tests and underwent anthropometric measurements, and 46 were able and willing to complete the TUG test.
Results – According to the participants' responses to the questionnaires, 16% were malnourished (MNA score <17) and 39% were depressed (GDS score ≥6). GDS score was inversely associated with MNA score (the greater the depression, the lower the MNA score) (r = -0.353, P = 0.015) and with serum zinc concentrations (r = -0.343, P = 0.001), and GDS was positively associated with the TUG (r = 0.30, P = 0.030) and ADL score (r = 0.37, P = 0.001).
Conclusions – Nutritional risk assessment using questionnaires indicated that nearly 20% of the study population were classified as malnourished, and more than one-third were depressed. Those who were depressed were at greater risk of malnutrition, and had reduced serum zinc concentrations and reduced mobility. Results from this study indicate that depression is a key factor affecting dietary intake and mobility in elderly populations. Minimising or alleviating depression may therefore assist in increasing food intake, improving mobility, and consequently enhancing the quality of life for aged care residents.
Background – Lipid oxidation is one of the most important chemical reactions leading to food spoilage. In vivo, oxidation reactions have been associated with several important diseases, including cancer and cardiovascular disease. Both in foods and in vivo, oxidation reactions are complex. Therefore, a comprehensive understanding of the roles of various antioxidants and their activities is challenging. Natural plant-derived antioxidants continue to be one of the most popular research topics in food science. One of the trends is to search for simple measurements to characterize foods and their potential for providing antioxidants and thus supporting our health.
Review – In foods, antioxidants are needed to inhibit lipid oxidation, which may proceed by autoxidation, photo-oxidation or enzymatic oxidation. Antioxidants act by several mechanisms, e.g. by donating phenolic hydrogen to radicals, chelating metals, and quenching singlet oxygen. In addition, food antioxidants are often multifunctional, i.e. each antioxidant may inhibit oxidation by several mechanisms. The dominant mechanism depends on conditions. For example, the composition of foods and experimental systems (oxidizing substrates, initiators, and other components present), the physical structure of foods, and temperature are among the factors that may influence the dominant mechanism and antioxidant activities. Partitioning of oxidizing substrates, antioxidants, and pro-oxidants in the studied food is critical. Antioxidant activity may thus be distinctly different in bulk oils and in multiphase foods, such as emulsions. In addition, the method used to measure and calculate antioxidant activities has a major impact on the results. For all these reasons, use of several test conditions and methods simultaneously is generally highly recommended. Antioxidant activity in a given food is reliably shown by measuring the decrease in the formation of a number of oxidation products from different steps of the oxidation chain reactions. Antioxidant activities measured using radical scavenging tests do not take into account the effect of the substrate, its structure and other properties.
In vivo, an imbalance between antioxidant defense and initiating factors may lead to oxidation reactions, which cause undesirable changes not only in lipids but also in DNA, enzymes and other proteins. In vivo the role of enzymatic defense is emphasized compared with food systems. Liberation of antioxidants from foods during digestion may differ from liberation during extraction for antioxidant tests. Further, dietary antioxidants have to be absorbed and localized in active forms at the oxidation site to enable the antioxidant effect. Recently, there has been growing interest in measuring total antioxidant activities or total antioxidant capacities of foods by simple radical trapping methods, with the aim of providing data on the nutritional quality of the studied foods. Because of the high number and diversity of individual antioxidants in our foods, the effort to look for simplified procedures is understandable. However, the current approaches still leave many open questions.
Conclusions – When antioxidant activities are considered, we need to distinguish the role of antioxidants in foods from their role in vivo. The factors affecting oxidation reactions and antioxidant activities in foods and in vivo differ. In addition, the multifunctionality of many antioxidants further complicates the search for simple measures of antioxidant activity.
Background – Antioxidants have been a rapidly growing area of scientific research over the last decade. The term "antioxidant" is now widely recognised, and to some extent understood, by the general public as media coverage increases and the food industry promotes their health benefits. However, there is still considerable confusion over which compounds are antioxidants, the levels present in different foods, which foods have the highest antioxidant capacity and their particular health benefits. Accurate data on the antioxidant composition of foods are important for a number of reasons, including enabling the food industry to promote their products and helping health professionals compare different foods in order to make the best recommendations for their clients. They are also an essential tool for population studies and intervention trials in order to translate the promise surrounding the health benefits of antioxidants into validated reality.
Review – Participants at the First International Congress on Antioxidant Methods agreed that antioxidant methods must be standardized but disagreed on the best methods to use (1). Analysis of antioxidants is a complex issue because of the number of compounds involved, the variety of ways in which they act and the different food matrices they are present in. There are several different approaches to quantifying antioxidants:
1) Measure individual antioxidant compounds: Some antioxidants such as vitamin C, vitamin E and selenium are relatively simple to quantify. However, the challenges lie in identifying and quantifying the carotenoids and flavonoids, as these groups are made up of hundreds and thousands of different compounds respectively. The U.S. Department of Agriculture, Agricultural Research Service (2) is building up food composition databases for selected carotenoids, flavonoids, proanthocyanidins and isoflavonoids. These databases provide useful information but have their limitations, and because of the number of compounds present in some foods, food labels could become complex if antioxidants were to be shown.
2) Quantify antioxidants by class (e.g. total phenolics, total carotenoids): This is perhaps the simplest way to measure some antioxidants. There is consensus that the Folin-Ciocalteu method is an appropriate assay for quantification of total phenolics (3), and simple spectrophotometric methods are available for total carotenoids. There are limitations to these general assays, and they do not always provide enough information to be meaningful.
3) Determine antioxidant capacity: This approach has benefits over simply quantifying antioxidant components as it provides a measure of their effectiveness. The difficulty is that no one assay can capture the different modes of action of antioxidants. Current commonly used and accepted methods include the oxygen radical absorbance capacity (ORAC) assay and the Trolox equivalent antioxidant capacity (TEAC) assay (3). These assays are useful measures of in vitro activity but it may be more important to learn how the compounds affect cell activity and if any of the beneficial antioxidants are absorbed. Hence, there is a need to develop better processes for cell model screening of biological activity, and subsequent absorption.
Conclusions – With the growing research and consumer awareness of antioxidants it is important to have accurate antioxidant composition and antioxidant capacity data for consumer education, food labelling and promotion. However, there is still some way to go before a consensus is reached on the best measures to use.
Background – Antioxidants are compounds that help prevent oxidation. Information on their presence in foods and their uses as neutralizers of oxidants or free radical compounds in the prevention of chronic non-communicable diseases has attracted scientists in various fields. Hence information related to their analytical determination is crucial.
Objectives – To discuss the major analytical methods in the determination of antioxidants in foods.
Discussion – The choice of analytical methods for the determination of antioxidants depends on the types of antioxidants of interest. Antioxidants in foods can be analysed as a functional group, as antioxidant groups, or as individual antioxidants. The functional group can be measured as total antioxidant capacity (TAC) using any of the following major assays: Trolox equivalent antioxidant capacity (TEAC) decolourization (1), ferric reducing antioxidant power (FRAP) (2), and oxygen radical absorbance capacity (ORAC) (3). The major antioxidant group assays include the determination of total polyphenols (TPP) by the Folin-Ciocalteu method, total carotenoids, total anthocyanins and total flavonoids (4). The determination of individual antioxidants, such as carotenoid and flavonoid profiles, involves the use of HPLC, LC-MS and other more complex assays (5).
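By way of illustration, the sketch below shows how a TEAC-style result might be derived from a Trolox standard curve. The inhibition readings, concentrations and dilution factor are invented placeholders, not data from the assays cited above, and laboratory protocols vary in the details.

```python
# Minimal sketch: converting TEAC assay readings into Trolox equivalents.
# Assumes percentage inhibition of ABTS radical absorbance has been measured
# for a Trolox dilution series and for a food extract; all values below are
# illustrative placeholders only.
import numpy as np

# Trolox standard curve: concentration (mM) vs. % inhibition of ABTS absorbance
trolox_conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
inhibition = np.array([0.0, 17.5, 34.0, 51.2, 68.1])

# Fit a straight line: inhibition = slope * concentration + intercept
slope, intercept = np.polyfit(trolox_conc, inhibition, 1)

# Reading for the diluted food extract
sample_inhibition = 42.0
dilution_factor = 10.0

# Interpolate the Trolox-equivalent concentration and correct for dilution
teac_mM = (sample_inhibition - intercept) / slope * dilution_factor
print(f"TEAC of extract: approximately {teac_mM:.2f} mM Trolox equivalents")
```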
Conclusion – Attempts are being made to validate these antioxidant assays for use in routine analysis in regional laboratories in the Pacific.
References
1. Re R, Pellegrini N, Proteggente A, Pannala A, Yang M, Rice-Evans C. Antioxidant activity applying an improved ABTS radical cation decolorization assay. Free Radic Biol Med 1999; 26: 1231-1237.
2. Pulido R, Bravo L, Saura-Calixto F. Antioxidant activity of dietary polyphenols as determined by a modified ferric reducing/antioxidant power assay. J Agric Food Chem 2000; 48: 3396-3402.
3. Cao G, Prior RL. Measurements of oxygen radical absorbance capacity in biological samples. In: Methods in Enzymology: Oxygen Radicals in Biological Systems. New York: Academic Press, pp. 51-62.
4. Sellappan S, Akoh CC. Flavonoids and antioxidant capacity of Georgia-grown Vidalia onions. J Agric Food Chem 2002; 50: 5338-5342.
5. Lako J, Trenerry VC, Wahlqvist M, Wattanapenpaiboon N, Sotheeswaran S, Premier R. Phytochemical flavonols, carotenoids and the antioxidant properties of a wide selection of Fijian fruit, vegetables and other readily available foods. Food Chem 2006; in press.
Background – Challenges in the analysis of food include new residues that need monitoring and ever lower concentrations of target compounds. New methods of analysis have provided clever approaches, but quantitative methods now need adequate validation and laboratories using those methods need proper quality control. Food analysis has traditionally relied on mutually agreed standard methodology, eschewing approaches such as the Guide to the Expression of Uncertainty in Measurement (GUM) (1), and thereby foregoing metrological traceability. This paper will explore aspects of quality assurance and will argue that food analysis can be brought into the orbit of normal chemical analysis.
Quality Assurance – Australia has led the world in aspects of metrology ever since minister Dedman stated in Parliament that "Measurements must be what they purport to be" (2). The National Association of Testing Authorities (NATA) was the first such accreditation body in the world. A proper quality assurance program will have the following elements: 1. Accreditation of the laboratory, 2. Participation in proficiency testing schemes, 3. Use of certified reference materials (CRMs) for calibration and quality control, 4. Estimation of measurement uncertainty,
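As an illustration of element 4, the sketch below combines hypothetical standard uncertainties in quadrature and applies a coverage factor of k = 2, in the spirit of the GUM. The component names and values are assumptions for illustration, not figures from this paper.

```python
# Minimal GUM-style uncertainty budget: independent standard uncertainties are
# combined in quadrature, then multiplied by a coverage factor (k = 2 gives an
# expanded uncertainty at roughly 95% confidence). Values are illustrative.
import math

components = {
    "calibration (CRM certificate)": 0.8,        # standard uncertainty, e.g. mg/kg
    "repeatability (replicate analyses)": 1.2,
    "recovery correction": 0.5,
    "volumetric / gravimetric steps": 0.3,
}

combined_u = math.sqrt(sum(u ** 2 for u in components.values()))
expanded_U = 2 * combined_u  # coverage factor k = 2

print(f"combined standard uncertainty u_c = {combined_u:.2f}")
print(f"expanded uncertainty U (k = 2)    = {expanded_U:.2f}")
```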
People are living longer. Australia now has the second longest life expectancy (at birth) in the world: about 71 years for males and 74 for females. At 60 years our average health-adjusted life expectancy (HALE) is another 16.9 years in men and 19.5 in women (life expectancy based on mortality alone is longer). Around 20% of Australians are now over 65 years old. Many occupations are no longer obliged to retire people at a fixed date and a number continue working past 65. "70 is the new 60!" So we have to adjust the cut-offs for "older" people and "elderly".
Nutrition research in older people is handicapped by (i) great heterogeneity of the people’s health and conditions, (ii) special problems with access, (iii) confounding by medications, (iv) the varied, often indirect criteria used for nutritional status.
Nutrition science’s concerns with this age group can seem to be looking in two opposite directions. “Don’t eat too much, so you’ll stay healthy for longer” (and there’s the calorie restriction hypothesis) OR “We’ll help you get more to eat to keep you healthy for longer”. Which direction depends partly on interpretation of the limited research. It depends more on the stage an individual is at: third age or fourth. Chronological timing of aging is more variable than the timing of growth in teenagers.
For the third age there is much good advice in the NHMRC's 'Dietary Guidelines for Older Australians' (1999) (1). Among the messages to be found in it: 'Anyone refusing to give a sprightly 66 year old dietary advice to prevent cardiovascular disease could be accused of age discrimination' (p 85). 'We know that salt sensitivity (of blood pressure) increases with age' (p 110). 'Keep active to maintain muscle strength and a healthy body weight' (p 29). 'Older people may become infected with food borne pathogens at low doses that might not produce a reaction in (younger) people' (p 48). As people age, a lowered energy output leads to a lowered energy intake: we eat less food. 'Because of this older people need foods that are rich in nutrients – nutrient-dense foods – if they are to maintain their intake of essential nutrients' (p xxi).
The National Nutrition Survey (1995) confirmed lower energy intakes over age 65 y. As to requirements of essential nutrients, the NHMRC (2005) sets RDIs higher for protein, calcium, riboflavin and (especially) vitamin D for those aged 70+ years than for younger adults.
In the fourth age, old people are no longer healthy, and under- or malnutrition is likely to be associated. SENECA Study participants who lost 5 kg of body weight had a significantly shorter survival. Undernutrition in old people has many, often combined, causes. Broadly these are (a) serious disease or advanced ageing, and (b) socio-economic and management problems. The first group is a matter for medical care, AUSPEN and high-tech nutrition support. The latter group is the main challenge for this society: older people who are not eating what they need because they are socially isolated, have chronic disabilities, were nutritionally depleted by major illness in hospital, are housebound or do not eat enough in nursing homes.
Different agencies and professionals are theoretically in a position to help nourish and re-nourish people in some of these situations. Nutritionists/dietitians are likely to be advisers rather than at the front line. For most jobs there’s a manual telling the best way to do the job. In nursing homes this has been lacking for nutrition. Bartl and Bunney asked 100 people for practical ways of looking after nutrition and food in nursing homes. This is embodied in their manual (2), which a book reviewer considers “ideal for aged care staff who want a single source of information in plain English, which will assist them, or prompt them, to address all the food and nutrition related standards and guidelines for aged care”.
The housebound elderly are scattered, isolated, out of sight and more difficult for nutritional advice to reach. This was discussed – with no major answers – at a conference at Sydney University last year. We heard stories about individuals with food insecurity, and about the adverse effects of bad teeth, early Alzheimer's, depression, social isolation and other disadvantages. In Lipski's experience at least 30% of independent community-living elderly are undernourished, most unrecognised. Meals on Wheels cost more than they used to. Community nurses and care services help some people. An increasing number of commercial companies will deliver meals for those who can pay. It is estimated there will be a large increase in frail older Australians living in the community and not enough people to care for them (us?). This important exception to the obesity epidemic deserves much more of our attention.
Background – Childhood diet may have important implications for health in old age for two reasons. First, diet in childhood may influence diet in later life. Second, diet in childhood may have direct effects on the pathogenesis of chronic disease processes independent of subsequent adult diet. Few studies have examined the associations of diet in childhood with later diet and mortality in old age prospectively. I will attempt to summarise findings from the Boyd Orr cohort that have examined the association between diet in childhood and later diet and health outcomes.
Methods – The Boyd Orr cohort is based on the long-term follow up of 4,999 children who were surveyed in the Carnegie Survey of Family Diet and Health in pre-World War II Britain (1). Socio-economic data, anthropometric data and a seven-day household dietary inventory were collected on 1,352 families living in 16 areas of England and Scotland between 1937 and 1939, and repeat dietary data were collected on over 300 families (1). The name, age and address of the children of the families surveyed were used to trace the children, and 88% have been successfully traced (1). In 1997-1998 a self-completion questionnaire that included a 113-item food frequency questionnaire was sent to all 3,182 surviving traced study members (2).
Results – A childhood diet rich in vegetables was associated with a healthy diet in early old age. In multivariable models the healthy diet score (a 12 item score) for those in the upper quartile of childhood vegetable intake was 0.30 (95% CI -0.01 to 0.61, p for trend 0.04) higher compared to those in the lowest quartile (2). Higher childhood energy intake was associated with increased cancer mortality: in multivariable proportional hazard models that adjusted for social variables, the relative hazard for all cancer mortality was 1.15 (95% CI 1.06 to 1.24, p=0.001) for every MJ increase in adult equivalent daily intake (3). Higher fruit intake was associated with reduced risk of incident cancer. In fully adjusted logistic regression models, odds ratios (95% CI) across increasing quartiles of fruit consumption were 1.0 (reference), 0.66 (0.48 to 0.90), 0.70 (0.51 to 0.97) and 0.62 (0.43 to 0.90), p for linear trend = 0.02 (4). Higher childhood intake of vegetables was associated with lower risk of stroke. After controlling for age, sex, energy intake and a range of socio-economic and other confounders, the rate ratio between the highest and lowest quartiles of intake was 0.40 (95% CI 0.19 to 0.83, p for trend = 0.01) (5). Higher intake of fish was associated with higher risk of stroke. The fully adjusted rate ratio between the highest and lowest quartile of fish intake was 2.01 (95% CI 1.09 to 3.69, p for trend = 0.01) (5). There was no association between intake of any of the foods and constituents considered and deaths attributed to coronary heart disease or all cause mortality (5). The reported childhood diet-cancer associations were robust to adjustment for measurement error (6).
Conclusions – Though these results are based on household measures of children across a range of ages, measured in the 1930s, they do suggest that diet in childhood influences diet in old age and cancer risk. These findings require replication but suggest that childhood diets rich in vegetables and fruit may be beneficial.
Background – Healthy aging depends, in part, on good nutrition, to compress morbidity, or, in other words, to minimize the time spent in states of ill-health in the later years of life. Little information is available in Australia about trends in the food and nutrient intakes of any age group, including older people. The Blue Mountains Eye Study, a population based cohort study, provided an excellent opportunity to study trends in intakes of foods and nutrients, measured three times over a 10-year period in a large cohort of older people. Such data enable the exploration of patterns of dietary change among the same people as they age over time, which is not possible with cross sectional population surveys. We have also begun to analyse the dietary trend data in relation to health outcomes, to see whether particular patterns of intake predict better health status and longevity.
Review – The Blue Mountains Eye Study is a population-based cohort of older people living in two postcode areas west of Sydney. At baseline (1992-1994) 3654 people aged 49 years and over (82% of those eligible) were examined (mean age at baseline 62 years). Five years later (1997-1999) 2334 people were re-examined, and ten years after the baseline data collection (2002-2004) 1952 people were re-examined (75% of survivors). A validated 145 item food frequency questionnaire (FFQ) was used to assess food and nutrient intake at each of the three assessments. During the ten year period, 1166 people completed the FFQ satisfactorily on all three occasions.
Mean intakes of foods and nutrients were examined. In addition, food and nutrient intakes at baseline were examined in relation to various health outcomes, including weight gain, eye disease, and mortality. Mean intakes of energy and sugars significantly increased among women over the 10 year period (7920 kJ vs 8272 kJ, p<0.05; 118 g vs 127 g sugar, p<0.05). Mean intakes of n-3 polyunsaturated fatty acid (PUFA) and fish significantly increased among women and men over the ten year period (n-3 PUFA: women 0.9 g vs 1.1 g, p<0.05; men 1.0 g vs 1.1 g, p<0.05) (fish: women 27 g vs 38 g, p<0.0001; men 26 g vs 36 g, p<0.0001). Mean dietary folate intake increased in women and men, reflecting changes to the food supply during the study period (folate (μg): women 323 vs 376, p<0.001; men 346 vs 393, p<0.001). Study participants consumed significantly lower quantities of cuts of red meat (red meat cuts: women 42 g vs 36 g, p<0.0001; men 49 g vs 40 g, p<0.0001), but more mixed dishes containing red meat over the ten year period (red meat dishes: women 54 g vs 62 g, p<0.001; men 69 g vs 73 g, p=0.23, not significant). Whole milk consumption decreased and low fat yoghurt increased (whole milk: women 98 g vs 78 g, p=0.002; men 131 g vs 102 g, p=0.002; yoghurt: women and men 18 g vs 30 g, p<0.0001). There were no significant changes in total fruit and vegetable intake, though some sub-types of fruits and vegetables increased, notably canned fruit and avocado (p<0.05). Mean intakes of wholemeal bread decreased over the 10 year period (52 g vs 40 g, p<0.05). Amongst men, mean intake of beer decreased but wine increased (210 g vs 166 g for beer, p<0.01; 68 g vs 78 g for wine, p=0.051). People who consumed the highest tertile of fruit and vegetables were more likely to consume lean red meat and fish (p<0.0001). At baseline 57.9% of participants were overweight or obese and ten years later 65.9% were overweight or obese. During the 10 years of follow-up median weight gain was 2.4 kg and 31% of people had weight gain greater than 5 kg. Participants with the highest vs lowest quintile of n-3 PUFA at baseline had a lower risk of incident early age-related macular degeneration (ARM) at five years (OR 0.41, 0.22-0.75), and there was a 40% reduction of incident early ARM associated with fish consumption of at least once a week (OR 0.58, 0.37-0.9) (3).
Conclusions – Many of the observed changes in diet over a 10-year period among older Australians were in line with current population dietary recommendations, including an increase in intakes of fish, n-3 fatty acids, folate and low fat dairy products. However, some changes resulted in poorer dietary choices, such as decreasing use of wholemeal bread. A full analysis of the dietary trends will inform nutrition policy and programs targeted to older Australians.
Background – Long-term calorie restriction (CR) extends the life of many lower animals, including rats, mice, fish, flies and worms. However, in humans there are no life-long studies, only short term trials over 2 to 6 years which indicate that CR reduces the risk factors for cardiovascular disease and diabetes.
Review – In 1935 McCay (1) clearly showed that prolonged CR extended the life of rats. This has been repeatedly confirmed by later investigators, who also showed that CR delays the onset of various age-related diseases and thereby extends life. No life-long CR studies have been performed in humans. However, the inhabitants of Okinawa (islands south of Tokyo), who are the longest lived people on earth, eat 40% fewer calories than Americans and live 4 years longer (2). The Okinawans have a lower mortality from stroke and cancer. In developed countries human life expectancy increased in the 20th century by an average of nearly 30 years, mainly due to reduced deaths from infectious diseases, accidents and cardiovascular disease (3). During the latter part of the 20th century, diets in developed countries were improved by increasing the intake of fruit, vegetables and fish and decreasing saturated fat consumption. However, calorie intake has increased since 1970 by about 30% in the USA and in many countries is leading to overweight, which is life-shortening. Overweight in middle age shortens life by 3 years and obesity by 7 years (4). Obesity in young adults takes 13 years off life. Thus CR to prevent overweight and obesity could add 3 to 13 years to life expectancy. The semi-starvation (20% CR) of wartime in the 1940s in Scandinavian countries reduced the incidence of cardiovascular disease. In the 1990s, 20% CR studies on healthy adult humans over a 6 year period produced significant falls in the risk factors for cardiovascular disease and diabetes (5). A recent 25% CR study on overweight adult humans over 6 months reduced their body weight and decreased body temperature and plasma insulin, which are biomarkers of longevity (6), indicating increased life expectancy.
Conclusions – Thus CR over 6 months to 6 years has been able to reduce the risk factors (body weight, blood pressure, cholesterol, glucose) for the major diseases that kill humans and improve metabolic functions (body temperature, plasma insulin) that determine life expectancy. While these studies indicate that CR should extend human life, the only proof will come from very long-term CR studies over 50 or more years.
Background – There is little information about whether the effects of milk supplementation on bone turnover markers still persist, 3 years after supplements cease.
Objectives – To assay biochemical markers of bone turnover and also calcitropic hormones in blood, 3 years after the completion of a milk supplementation trial.
Design – A follow-up study of a two-year school milk intervention trial, in which 757 Beijing girls aged 10 years at baseline were assigned to Ca-fortified milk (Ca milk), Ca plus vitamin D-fortified milk (CaD milk) and control groups. Three years after the completion of the intervention, fasting blood and urine samples were collected at the end of winter (March to April) from 504 girls. Using standard assay kits, the blood concentrations of osteocalcin (OC) and bone alkaline phosphatase (BAP) and the urine deoxypyridinoline/creatinine (Dpd/Cr) ratio were used as bone biomarkers. Plasma calcium (PCa), parathyroid hormone (PTH) and insulin-like growth factor-I (IGF-I) were also measured.
Outcomes – Three years after milk supplement withdrawal, there were no significant differences in BAP, Dpd/Cr, PCa, PTH and IGF-I between groups after adjusting for baseline values, baseline pubertal stage and clustering by schools. However, the Ca milk group subjects had a significantly greater percentage increase in plasma OC concentration than controls over the 5 years from the pre-supplement baseline (21.5 ± 10.1%; P=0.04).
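One way a between-group comparison adjusted for baseline values, pubertal stage and clustering by school could be implemented is with a linear mixed model, as in the hypothetical sketch below. The file name, column names and model form are assumptions for illustration, not the authors' actual analysis.

```python
# Sketch: between-group difference in percentage change of plasma osteocalcin,
# adjusted for baseline OC and baseline pubertal stage, with a random intercept
# per school to account for clustering. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed data layout: one row per girl at the 3-year follow-up
df = pd.read_csv("bone_markers_followup.csv")

model = smf.mixedlm(
    "oc_pct_change ~ C(group, Treatment('control')) + baseline_oc + baseline_tanner",
    data=df,
    groups=df["school"],  # random intercept for each school
)
result = model.fit()
print(result.summary())
```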
Conclusions – The effects of milk supplementation on biochemical markers of bone turnover and on calcitropic hormones were no longer present 3 years after the cessation of milk supplementation.
Funding – Danone-China, Nestle Foundation.
Background – Current treatments and outcomes for ulcerative colitis (UC), a form of inflammatory bowel disease (IBD), are generally disappointing; hence new forms of therapy are urgently required. It has been proposed that manipulation of the gut microbiota with a diet containing wheat bran plus resistant starch (WBRS) may modulate the colonic microflora towards a more remedial community and thus potentially act as a novel therapeutic approach in the management of UC.
Objective – To examine the effect of a diet containing high (~32 g/day) levels of WBRS, compared to a diet containing low (~10 g/day) levels of WBRS, on the composition of the colonic microflora in individuals with UC.
Design – Single blind randomised 48-day crossover dietary intervention, in which a total of five subjects (two women, three men) between the ages of 18 and 54 years, with UC in remission, consumed dietary supplements containing low and high levels of WBRS.
Outcomes – Compared to the low WBRS diet, the high WBRS diet significantly reduced the numbers of the Bacteroides-Prevotella spp. group (P = 0.043) and increased the numbers of the Bifidobacterium spp. group (P = 0.04). No effects on the total number of bacteria, Enterobacteriaceae including Escherichia coli, Clostridium spp. group, Eubacterium spp. group and Lactobacillus spp. group were observed.
Conclusions – Addition of high level WBRS supplements to the diet of UC subjects produced favourable changes in specific groups of bacteria, in particular the Bacteroides-Prevotella spp. group and the potentially beneficial Bifidobacterium spp. group, which suggests that this novel combination of dietary fibres may be useful in the management of UC.
Background – Food fortification and/or multivitamin supplementation may improve nutritional status in aged care residents who are at risk of malnutrition and related diseases.
Objective – To assess the effectiveness of a multivitamin tablet, and calcium-vitamin D3 fortified milk supplementation for six months, on serum indices of nutritional status and bone quality (quantitative heel ultrasound, QUS) in a group of Australian aged care residents.
Design – With a 2x2 factorial design, subjects were randomized to receive either a placebo (P) or multivitamin (MV) tablet containing various vitamins including 5 μg cholecalciferol, 250 mg calcium carbonate, 20 μg cyanocobalamin and 200 μg folic acid (1 tablet/day), and, matched on mobility levels, to receive fortified milk (7 g protein, 200 mg calcium and 5 μg cholecalciferol), or usual milk for six months. Measurements of body weight, QUS, and serum concentrations of nutrients were performed at baseline and at six months.
Outcomes – Low compliance in consuming the fortified milk, caused by difficulties in delivering the milk to the participants, led to cessation of the milk intervention after 16 weeks of the study. Therefore only the effect of multivitamin supplementation was examined. Of 115 participants entering the study, 92 (49 MV and 43 P) completed the study. After six months of supplementation, the MV group had a greater rise than the P group in serum 25(OH)D (33.4 ± 2.6 nmol/l), folate (13.4 ± 2.8 nmol/l), and vitamin B12 (163.5 ± 40.3 pmol/l). The number of participants with adequate 25(OH)D concentrations (>50 nmol/l) increased from 23% to 77% in the MV group, and fell from 17% to 10% in the P group. After adjustment for baseline levels, the MV group had an improvement in QUS of 2.7 dB/MHz, compared with -2.5 dB/MHz for the P group (P=0.041).
Conclusions – Daily multivitamin supplementation improved nutritional status with respect to serum 25(OH)D, and raised vitamin B12 and folate concentrations. Additionally, there was an indication of a positive effect on bone density; this could in the long term contribute to a reduction in falls and fractures.
Background – Squalene is a component of shark liver oil and has been speculated to have cholesterol reducing properties. High levels of total and LDL cholesterol have been shown to contribute to the development of chronic heart disease. The liver is central to the regulation of cholesterol metabolism and dietary intervention has long been recognized as a primary means to reduce the risks of chronic heart disease and related ailments.
Objectives – To determine the effect of dietary squalene supplementation on gene transcripts associated with liver cholesterol metabolism. Specifically the effect of squalene supplementation on mRNA levels for proteins that regulate cholesterol biosynthesis (HMDH & ERG1), cholesterol elimination (SRB1), bile synthesis (CP7A1 & CP27A) and cholesterol excretion by the liver into bile (ABCG5 & ABCG8) was investigated.
Design – Rats (n=32) were divided into four groups and fed their respective diets for 12 weeks. Groups one and two were fed a cholesterol rich diet for six weeks, followed by six weeks of a cholesterol rich diet plus squalene at 1.75 mg/day or 3.5 mg/day respectively. Group three was fed a cholesterol rich diet for 12 weeks and group four was fed standard rat chow for 12 weeks. Blood lipid levels were monitored during the study and liver gene expression was determined at the conclusion of the feeding trial by RT-PCR.
Outcomes – 3.5 mg/day of squalene lowered total and LDL cholesterol in rats consuming a cholesterol rich diet. This dose of squalene also resulted in constant levels of HMDH and ERG1 whereas the cholesterol rich diet halved mRNA levels of these enzymes. Furthermore 3.5 mg/day of squalene caused a greater than 3.0 fold increase in mRNA levels of the proteins SRB1, CP7A1, CP27A and ABCG5.
Conclusion – Dietary squalene supplementation at a dose of 3.5 mg/day lowers total and LDL cholesterol in rats consuming a cholesterol rich diet. These reductions in cholesterol levels may be due to increased cholesterol elimination, bile synthesis and cholesterol excretion by the liver into bile, mediated by changes in gene expression of key enzymes involved in these metabolic pathways.
Background – Inverse associations between fibre intake and cardiovascular risk factors are reported in many studies. In a randomised controlled trial we have previously shown that dietary fibre, in the form of a supplement, decreases cardiovascular risk factors in the postprandial period in overweight and obese individuals. The effects of dietary fibre in conjunction with a healthy diet on cardiovascular disease have not been studied in overweight and obese subjects.
Objectives – To evaluate the effects of dietary fibre with or without a healthy eating plan on blood lipids and glucose in overweight and obese individuals.
Design – A 12-week randomized controlled trial, including ad libitum diets and fibre supplements, was undertaken with four groups: (1) control, (2) control plus fibre supplement, (3) healthy diet and (4) healthy diet plus fibre supplement. Participants in groups 2 and 4 were given extra fibre in the form of supplements (36 grams of psyllium husk per day). The healthy diet groups (3 and 4) followed a healthy eating plan based on the Australian Dietary Guidelines. Fasting blood triglyceride, total cholesterol, LDL-cholesterol, HDL-cholesterol and glucose were measured at baseline, six and twelve weeks.
Outcomes – In total, 70 overweight and obese subjects were recruited and 57 (81%) completed the trial. In the primary intention-to-treat analysis, total cholesterol and LDL-cholesterol were significantly different between all treatment groups and the control group (P<0.001 for both total cholesterol and LDL-cholesterol). In comparison with the control group, total and LDL-cholesterol decreased by 3% and 9% respectively in the control plus fibre group (P=0.03 and P=0.002 for total cholesterol and LDL-cholesterol respectively), by 6% and 8% respectively in the healthy diet group (P=0.001 and P=0.013 respectively) and by 12% and 13% respectively in the healthy diet plus fibre group (P<0.001 for both). There were no differences in triglyceride, HDL-cholesterol or glucose between the groups.
Conclusion – These results suggest that consumption of additional fibre in the form of psyllium husk is as effective as a healthy eating plan in improving fasting total cholesterol and LDL-cholesterol in overweight and obese individuals.
Background – Bioelectrical impedance analysis (BIA) has been commonly used as a convenient, cost effective way to measure body composition in large population studies. However, the validity of this technique remains uncertain.
Objective – To compare measurements of body fat mass using single frequency BIA with dual-energy X-ray absorptiometry (DEXA).
Design – Forty-two overweight and obese (body mass index 25 to 40 kg/m²) but otherwise healthy volunteers aged 30 to 70 y were recruited from the general population. In a cross sectional analysis, fat mass was measured using foot-to-foot BIA (Tanita Scale Model TBF-300) and whole body DEXA (GE Lunar Prodigy scanner).
Outcomes – Although fat mass measured using BIA was strongly correlated with fat mass measured using DEXA (r²=0.89), correlation alone is not a useful way to compare methods of measurement. There was bias in the measurement of fat mass, with BIA measuring lower than DEXA (mean±SD: -2.96±2.91 kg). In weighted least products regression there was a significant fixed bias (mean (95% CI): -6.24 (-10.75, -1.72)). However, there was no significant proportional bias (1.09 (0.96, 1.23)); that is, the slope did not differ significantly from 1. This implies that the difference between the methods did not increase with increasing fat mass.
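For readers unfamiliar with least products regression, the sketch below illustrates the unweighted form of the method on synthetic data: fixed bias is indicated when the intercept's confidence interval excludes 0, and proportional bias when the slope's interval excludes 1. The study used a weighted variant, so this is an illustration of the general idea only, with invented values.

```python
# Sketch of (unweighted) ordinary least products regression for comparing two
# measurement methods, with bootstrap confidence intervals. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)

def least_products(x, y):
    """Slope and intercept of the ordinary least products (geometric mean) line."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

def bootstrap_ci(x, y, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CIs for (slope, intercept)."""
    n = len(x)
    stats = np.array([least_products(x[idx], y[idx])
                      for idx in (rng.integers(0, n, n) for _ in range(n_boot))])
    lo, hi = 100 * alpha / 2, 100 * (1 - alpha / 2)
    return np.percentile(stats, [lo, hi], axis=0)  # rows: lo/hi; cols: slope, intercept

# Synthetic paired fat-mass measurements (kg), purely for illustration
dexa = rng.normal(35, 8, 42)                      # reference method
bia = 0.95 * dexa - 1.5 + rng.normal(0, 2, 42)    # test method with some bias

slope, intercept = least_products(dexa, bia)
ci_lo, ci_hi = bootstrap_ci(dexa, bia)
print(f"slope = {slope:.2f}, 95% CI ({ci_lo[0]:.2f}, {ci_hi[0]:.2f})")      # proportional bias if CI excludes 1
print(f"intercept = {intercept:.2f}, 95% CI ({ci_lo[1]:.2f}, {ci_hi[1]:.2f})")  # fixed bias if CI excludes 0
```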
Conclusions – Although the results suggest that there may be some concern regarding the accuracy of the BIA technique in overweight and obese individuals, this method may be useful for estimating fat mass in large cross-sectional and longitudinal population studies, as well as in intervention studies.
Background – Both soy and whey proteins have been shown to have additional health benefits beyond their basic nutritional value, including anti-carcinogenic properties against colorectal, breast and prostate cancers. At present, limited knowledge exists of the mechanisms by which soy and whey exhibit these anti-carcinogenic properties and whether anti-tumour properties extend to other tissues.
Objective – To determine the effects of soy and whey protein-based diets on the incidence and progression of N-nitrosodiethylamine (NDEA)-induced pre-neoplastic lesions and gene expression in the liver.
Design – Pre-neoplastic lesions were induced in the livers of Wistar rats by NDEA injections (25 mg/kg body weight, twice a week for 3 weeks). Rats (9 per treatment) were fed control (meat-based rodent chow), whey-based or soy-based diets for 15 weeks. Following euthanasia, liver samples were obtained and the total area occupied by preneoplastic lesions was determined by immunohistochemical staining for GST-Yp. DNA microarray analysis is being undertaken to investigate the effects of the diets upon gene expression.
Outcomes – The incidence of preneoplastic lesions in whey-fed rats (0.31% total liver area) was reduced by 41% compared to rats fed a control diet (0.53% total liver area; P < 0.05). Soy-fed rats showed an 81% increase in lesion incidence (0.98% total liver area; P < 0.013).
Conclusions – Whey-based diets protect against initiation and progression of NDEA-induced preneoplastic lesions in liver compared to either meat- or soy-based diets. A similar pattern of protection has been observed against colorectal tumours. The molecular mechanisms of this protection remain to be elucidated.
Background – Increases in the prevalence of asthma may be explained by changes in the dietary intake of long chain omega-3 polyunsaturated fatty acids (LCn-3 PUFAs), which have been shown to have anti-inflammatory properties.
Objective – To evaluate the dietary intake of LCn-3 PUFAs in adult women with and without asthma and to examine the effect of inhaled corticosteroid (ICS) use on dietary intake.
Design – We recruited asthmatic women (n=34) and non-asthmatic controls (n=26). Participants completed a food frequency questionnaire and dietary intake was analysed using Foodworks software (Xyris, QLD).
Outcomes – There were no significant differences in age and BMI between the asthmatic and non-asthmatic control groups. Asthmatic women had lower lung function compared to controls (mean ± SEM FEV1%: 96 ± 2% vs. 100 ± 1%, P=0.51). The median (Q1-Q3) inhaled corticosteroid dose for asthmatic women was 625 (287.5-1000) μg/day. A significant difference in the dietary intake of LCn-3 PUFAs between asthmatic women and non-asthmatic controls was observed (mean (g/day) ± SEM; asthma: 0.16 ± 0.42, control: 0.05 ± 0.005, P=0.02). The control group had a lower LCn-3 PUFA intake than the Australian average (0.159 g/day) (1). Further subgroup analysis showed that there was no significant difference in the dietary intake of asthmatic women using ICS compared to asthmatic women who were steroid naïve.
Conclusions – An unexpected elevation in LCn-3 PUFA intake was seen in these asthmatic women. This may reflect an increased awareness of the health benefits associated with LCn-3 PUFA consumption in asthma, leading to an increased dietary intake. These results also indicate that the non-asthmatic control group is failing to meet Australian intake recommendations for LCn-3 PUFAs.
Background – LCn-3PUFA are known to have beneficial effects on human health. However, changes to the human diet in the past 150 years appear to have resulted in reduced intakes of LCn-3PUFA which now fall well short of an adequate intake for optimal health.
Objective – To examine awareness of and intake of LCn-3PUFA major food sources by young adult consumers. Design – Questionnaires about knowledge and consumption of omega-3 fats were completed by 78 tertiary students (28 male and 50 female). Demographic variables were considered as influences on behaviour.
Outcomes – Intake of LCn-3PUFA (ALA, EPA and DHA) was considerably lower in females (60 mg/day) than in males (100 mg/day). These intakes were both lower than the average Australian adult intakes (females 159 mg/day and males 222 mg/day), and much lower than the recommended 430 mg/day for females and 610 mg/day for males. Seventy four percent of participants did not know about the best food sources of n-3 PUFA or the potential health benefits of consuming these fatty acids.
Conclusions – The low intake of LCn-3PUFA by young adults is a concern. It is possible that awareness and consumption increase with age, as reflected in the Australian adult population. Consideration should be given to promotions targeting young adults to ensure an adequate consumption to maximise the health benefits in later years.
Background – Pro-inflammatory cytokines are released in response to conditions of stress such as trauma, surgery, burns, sepsis and exercise. Exercise has been suggested to affect the immune system in a J curve, where moderate exercise improves immune function and chronic exercise impairs the immune system and increases the risk of upper respiratory tract infection [1, 2].
Objective – The objective of the study was to determine the effect of acute maximal exercise on plasma markers of inflammation, carotenoids and fatty acids in healthy endurance athletes and untrained adults.
Design – Twenty endurance trained athletes and 15 sedentary adults completed an overnight fasting treadmill VO2max test, a 7-day physical activity record and a 4-day weighed food record. Blood was collected at baseline and post-exercise for the analysis of inflammatory markers, fatty acids and carotenoids in plasma.
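As an illustration of how baseline versus post-exercise changes in a marker could be tested, the sketch below runs a paired t-test on plasma monocyte counts. The values and the choice of test are assumptions for illustration, not the study's data or its actual statistical analysis.

```python
# Minimal sketch: paired comparison of a marker measured at baseline and
# immediately post-exercise in the same participants. Values are invented.
import numpy as np
from scipy import stats

baseline = np.array([1.2, 1.5, 1.3, 1.6, 1.1, 1.4])        # e.g. x10^6 cells/ml
post_exercise = np.array([2.4, 3.1, 2.2, 3.0, 2.1, 2.8])

t_stat, p_value = stats.ttest_rel(post_exercise, baseline)
print(f"paired t = {t_stat:.2f}, P = {p_value:.4f}")
```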
Outcomes – Plasma monocyte concentration (x10⁶ cells/ml) significantly increased (P<0.001) in all participants from baseline (1.35 ± 0.33) to post-exercise (2.65 ± 1.10). Athlete monocyte concentration was significantly higher (P < 0.05) at baseline (1.44 ± 0.37) and post-exercise (3.00 ± 0.80) compared to sedentary adults (1.25 ± 0.24, 2.23 ± 1.29). There was no difference in the plasma inflammatory markers IL-6, TNF-alpha and LTB4 at baseline and post-exercise. Acute maximal exercise significantly increased plasma lutein/zeaxanthin (P < 0.001), beta-cryptoxanthin (P < 0.001) and lycopene (P < 0.01) concentrations. Plasma non-esterified fatty acid (NEFA) concentration significantly increased but total fatty acids remained unchanged in response to exercise.
Conclusion – In healthy human adults acute maximal exercise can increase plasma monocyte, carotenoid and NEFA concentration without a change in inflammatory mediators.
Background – The in vivo glycaemic index (GI) testing of foods can give a broad range of results between laboratories, and even within laboratories for the same food item. Demographic factors such as gender, ethnicity and age of participants may influence individuals' glycaemic responses to a standard oral glucose tolerance test (OGTT) and hence may affect the determination of the GI value of foods.
Objective – To determine the effect of the age, gender and ethnicity of subjects attending the GI testing site on their glycaemic response (GR) to a standard OGTT.
Design – Glucose response data were collected on 204 subjects of varying age, ethnicity and gender from multiple OGTTs (minimum 3, maximum 11). The individuals were grouped by ethnicity into Western European (n = 105), Asian (n = 83), and Indian and Middle Eastern (n = 16) groups, by age into 18-26, 27-35 and 36-45 years, and by gender (male and female).
Outcomes – There was no significant difference in the area under the curve (AUC) for the 50 g glucose solution between the Asian, Western European, and Indian and Middle Eastern groups (F(2, 201) = 10.3, P = 5.2). The blood glucose level in Indian and Middle Eastern subjects tended to return to the fasting level more slowly than in the other ethnic groups. The effect of age on glucose responses was not significant. Statistical analysis also showed no significant difference in the AUC based on gender (57.3% female; 42.6% male), F(1, 202) = 0.42, P = 0.5.
Conclusion – Ethnicity, gender and age did not have a significant influence on the GR and thus may have no impact on the GI determination of foods.
Background – Increasingly, measures of dietary patterns have been used to capture the complex nature of dietary intake and to investigate its association with health. Healthy dietary patterns may be important in the prevention of chronic disease; however, there are few investigations in adolescents.
Objective – The objective of this study was to describe the dietary patterns of adolescents and their associations with socio-demographic factors and health outcomes.
Design – Analysis was conducted of data collected in the 1995 National Nutrition Survey on participants aged 12-18 years who completed a 108 item food frequency questionnaire (n=764). Dietary patterns were identified using factor analysis and pattern scores were calculated from the consumption of the food items in each dietary pattern.
Outcomes – Factor analysis revealed three dietary patterns, labelled on the basis of the food items that loaded highly: a fruit, salad, fish and cereals pattern, a vegetables pattern and a high fat and sugar pattern, which explained 11.9%, 5.9% and 3.9% of the variation in food intakes, respectively. High consumers of the high fat and sugar pattern were more likely to be male (p<0.001), while high consumers of the vegetable pattern were more likely to be living in rural areas compared to metropolitan areas (p=0.004). There were no significant associations with age or area-level index of relative socio-economic disadvantage. The fruit, salad, fish and cereals pattern was significantly associated with diastolic blood pressure (p=0.019), with high consumers having lower blood pressure. In contrast, high consumers of the high fat and sugar pattern had significantly higher systolic blood pressure (p=0.026).
Conclusion – Diets characterised by healthier food items were associated with lower blood pressure, while consumption of foods high in fat and sugar was associated with higher blood pressure. Specific dietary patterns are already evident in adolescence and may be associated with health.
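As an illustration of how dietary patterns and per-person pattern scores of this kind can be derived, the sketch below applies factor analysis to standardised FFQ item intakes. The file name, column layout and use of scikit-learn are assumptions for illustration; the original analysis may have used a different package or rotated the factors.

```python
# Sketch: derive dietary patterns from FFQ item intakes by factor analysis,
# then score each participant on every pattern. All names are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

ffq = pd.read_csv("nns95_ffq_items.csv")       # one column per FFQ food item
X = StandardScaler().fit_transform(ffq)        # standardise item intakes

fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(X)                   # per-person pattern scores

# Inspect which foods load highly on each factor in order to label the patterns
loadings = pd.DataFrame(fa.components_.T, index=ffq.columns,
                        columns=["pattern_1", "pattern_2", "pattern_3"])
print(loadings.abs().sort_values("pattern_1", ascending=False).head(10))
```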
Background – Glycaemic response (GR) is the extent to which a food raises the blood glucose level and is determined by the structure of the carbohydrate and its rate of digestion and absorption in the small intestine. The glycaemic index (GI) is a measure used to rank carbohydrates based on their effect on blood glucose relative to a standard such as pure glucose. Ethnicity is one factor that may change the rate of gastrointestinal emptying, digestion and absorption, and hence influence the GR and the GI value.
Objective – To ascertain the effect of racial differences on the glycaemic response to, and the glycaemic index of, white bread, comparing Western European and South East Asian cohorts.
Design – The standard oral glucose tolerance test (OGTT) was performed for both the standard food (glucose solution) and the test food (white bread) in 40 healthy individuals, aged 18 to 45 years, from Western European and South East Asian backgrounds. "Homebrand" white bread was selected as the test food and was consumed on one occasion. A 50 g glucose solution was used as the standard food and was consumed on 3 occasions. Finger-prick blood samples were obtained in duplicate and blood glucose concentration was measured immediately using automated analysers (HemoCue® glucose 201) over a two-hour period. The area under the curve (AUC) for each 2-h blood glucose response to the glucose solutions and the white bread was calculated and used to determine the GI value of the test food.
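The GI computation described above follows the conventional approach: the incremental area under each 2-h blood glucose curve is obtained by the trapezoidal rule (ignoring area below the fasting baseline), and the test food's area is expressed as a percentage of the mean area for the repeated glucose reference. A minimal sketch of that arithmetic is shown below; the function names and the example glucose profile are illustrative, not study data.

```python
# Minimal sketch (not the study's code): incremental AUC by the trapezoidal rule,
# ignoring area below the fasting baseline, and GI as the ratio of the test-food
# iAUC to the mean iAUC of the repeated 50 g glucose reference tests.
def iauc(times_min, glucose_mmol_l):
    """Incremental AUC above the fasting (t=0) value; negative increments ignored."""
    baseline = glucose_mmol_l[0]
    area = 0.0
    for (t0, g0), (t1, g1) in zip(zip(times_min, glucose_mmol_l),
                                  zip(times_min[1:], glucose_mmol_l[1:])):
        h0, h1 = max(g0 - baseline, 0.0), max(g1 - baseline, 0.0)
        area += (h0 + h1) / 2.0 * (t1 - t0)
    return area

def glycaemic_index(test_iauc, reference_iaucs):
    """GI (%) = iAUC of test food / mean iAUC of glucose reference tests * 100."""
    return test_iauc / (sum(reference_iaucs) / len(reference_iaucs)) * 100.0

# Hypothetical 2-h profile sampled at 0, 15, 30, 45, 60, 90, 120 min:
times = [0, 15, 30, 45, 60, 90, 120]
bread = [4.8, 6.2, 7.5, 7.1, 6.4, 5.5, 4.9]
print(glycaemic_index(iauc(times, bread), [210.0, 230.0, 220.0]))  # ~70
```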
Outcomes – Each ethnic group comprised 55% females and 45% males. The AUCs for both the glucose standard (M = 234, SD = 8.1) and the test food (M = 174, SD = 13.4) were greater in Asian subjects than in Caucasian subjects (M = 214, SD = 12.6 and M = 153, SD = 4.2, respectively). ANOVA showed no significant difference in the GI value of white bread between Asian (M = 78, SEM = 5.4) and Caucasian (M = 73, SEM = 8.6) subjects, F(2, 37) = 0.29, P = 0.6.
Conclusion – The in vivo determination of the GI value of foods does not appear to be influenced by the ethnicity of the participants.
Background – The origins of children’s food preferences remain largely unexplored. However, experimental research suggests they are affected by parents’ feeding behaviours. Outside the laboratory, in daily life, there is little indication of how parents attempt to influence their children’s food likes and dislikes, or of the associations between feeding behaviours and children’s food preferences. Furthermore, parents’ use of feeding behaviours may be partly determined by characteristics of the child, especially their food neophobia.
Objective – To explore parents’ use of strategies for influencing their preschool-aged children’s food preferences, and associations with children’s food preferences and food neophobia.
Design – Semi-structured interviews were conducted with three groups of parents: those of (a) children with healthy food preferences (N=20), (b) children with unhealthy food preferences (N=18), and (c) food neophobic children (N=19). Parents were asked to describe how they tried to influence their children’s food preferences in general, as well as a specific time when they attempted to promote liking or disliking of a food. Interviews were transcribed verbatim and entered into a qualitative software package (N6) for thematic analysis and extraction of quotes.
Outcomes – Several themes concerning parents’ feeding behaviours emerged, some of which differed by group. Themes that arose from parents of children with healthy food preferences included use of exposure, repeated exposure, encouragement, parental modelling, manipulating peer influence, and involving children in food preparation and selection. Conversely, themes emerging from parents of children with unhealthy food preferences or food neophobia included forcing, fighting, restricting and controlling, rewarding or bribing, and indulging children’s desires.
Conclusions – The results support the hypothesis that parents’ feeding behaviours may be a source of variation in children’s food preferences. Parents’ use of feeding behaviours promoting unhealthy food preferences may be partially in response to children’s food neophobia. Education of parents about effective strategies for promoting healthy food preferences in children, and especially in food neophobic children, is needed.
Background – Carotenoids are fat-soluble pigments which may play an anti-inflammatory role. Circulating carotenoid levels are low in asthma, a disease characterised by chronic inflammation of the airways. It has been shown that lycopene supplementation can reduce exercise-induced oxidative stress in stable asthma.
Objectives – The aim of this study was to determine the effect of lycopene supplementation (as tomato juice and lycopene capsules) on plasma levels of selected cytokines in stable asthma.
Design – Thirty-two asthma patients visited the clinic on 7 occasions over a 51-day period. They received 3 different treatments, i.e. tomato juice, lycopene capsules and placebo, for 7 days each, in random order, with a 10-day washout period between treatments. During the study period, patients were advised to consume a low-antioxidant diet. Blood samples were collected and interleukin-6, interleukin-8, high-sensitivity CRP and TNF-α levels in plasma were measured by ELISA. Differences in cytokine levels after each treatment were analysed using the Friedman test, and the change during the initial washout period was analysed using the Wilcoxon test.
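As an illustration of the statistical comparisons described above, the sketch below applies the Friedman test to one cytokine measured after the three treatments and the Wilcoxon signed-rank test to paired washout values, using SciPy. The data values and layout are hypothetical; this is not the study's analysis code.

```python
# Minimal sketch (assumed layout, not the study's code): Friedman test across three
# treatments within the same subjects, and Wilcoxon signed-rank test for the change
# over the initial washout period.
from scipy.stats import friedmanchisquare, wilcoxon

# Hypothetical plasma IL-6 values (pg/mL) for the same subjects after each treatment.
il6_placebo = [2.1, 3.4, 1.8, 2.9, 2.2, 3.1]
il6_juice   = [2.0, 3.1, 1.9, 2.7, 2.4, 2.8]
il6_capsule = [2.2, 3.0, 1.7, 2.8, 2.3, 2.9]

stat, p = friedmanchisquare(il6_placebo, il6_juice, il6_capsule)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")

# Hypothetical paired values before and after the washout period.
before = [2.5, 3.6, 2.0, 3.0, 2.6, 3.3]
after  = [2.3, 3.4, 2.1, 2.9, 2.4, 3.2]
stat, p = wilcoxon(before, after)
print(f"Wilcoxon W = {stat:.2f}, p = {p:.3f}")
```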
Outcomes – Seventeen subjects completed the study. There was no significant difference in the levels of each cytokine after each treatment, and cytokine levels were not statistically different following the initial washout period. There was a trend toward decreased CRP after tomato juice and lycopene capsules, but because of the small sample size and the high variability of the data, these changes were not statistically significant.
Conclusion – Lycopene supplementation did not lead to changes in markers of systemic inflammation in these stable asthmatic subjects.
Background – Dietary intervention for obese populations is vital in light of the growing incidence of obesity within Australia. The Healthy Eating Lifestyle Program (HELP) and Healthy Eating for the Reduction of Obesity (HERO) are randomized controlled dietary intervention studies that target this condition. Both studies aim to assist overweight volunteers with weight loss through structured dietary prescriptions.
Objective – This study aimed to identify differences in nutrient intakes and food patterns between obese subjects with and without type 2 diabetes mellitus (T2DM).
Design – The HELP study included healthy obese adults, while the HERO study included obese adults with T2DM. Participants were recruited through the local media and internet mailing lists. Dietary macronutrient intakes were assessed at baseline using diet history interviews and analysed using FoodWorks Professional (version 4.00.1178). Gram amounts of macronutrient intake were obtained and compared using independent-samples t-tests. Reported foods were grouped for each study based on the Australian Guide to Healthy Eating food groups and fatty acid content.
Outcomes – Thirty-eight (15 males, 23 females) healthy obese subjects without T2DM and 50 (22 males, 28 females) obese subjects with T2DM were recruited. Weight (89.6 ± 13.2 vs 92.8 ± 15.4 kg) and BMI (31.8 ± 3.5 vs 33.2 ± 4.2 kg/m2) were not significantly different between studies. Reported intakes were significantly lower among obese volunteers with T2DM for energy (P<0.001), total fat (P=0.011), saturated fat (P<0.001), protein (P=0.018) and carbohydrate (P<0.001). Meat-based dishes, dairy foods and ‘extra’ foods were the major food groups contributing to saturated fat in both groups. No differences were found in reported intakes of monounsaturated fat, polyunsaturated fat and fibre between obese subjects with or without T2DM.
Conclusion – The presence of a disease state in obese subjects appears to have a significant impact on dietary intake; subjects appear to follow a lower-energy, lower-fat dietary pattern once diagnosed with T2DM.
Funding Source – HELP Study: National Health and Medical Research Council; HERO Study: California Walnut Commission.
Background – Osteoporosis is a disease which places considerable burden on the health budget, and brings pain, disability and possibly death to those in whom it develops. As it is essentially incurable, a preventative public health approach is required. Perceptions of personal susceptibility, belief in the seriousness of the disease and in the efficacy of recommended risk reducing behaviours have been shown to be critical in bringing about behavioural change.
Objective – To investigate the attitudes and knowledge of New Zealand women towards osteoporosis risk factors and risk reduction.
Design – A descriptive, web-based survey was completed by 622 Auckland women between the ages of 20 and 49 years. Two validated questionnaires measured levels of knowledge about osteoporosis and preventative behaviours, and perceptions of personal susceptibility and seriousness of the disease. Subjects were recruited by email and the sample was opportunistic.
Outcomes – The subjects reported higher than average educational attainment, and were well motivated to take care of their health. However, over 60% of the women denied feeling any susceptibility to osteoporosis, and 78% did not believe the disease to be crippling. Although most women (90%) agreed that a diet low in calcium increased the risk, 77% thought that calcium-rich foods contained too much cholesterol and 87% did not feel good about eating calcium-containing foods.
Conclusion – The findings of this study suggest that public health strategies aimed at increasing osteoporosis-preventing behaviours in pre-menopausal women should address attitudes about personal susceptibility and the seriousness of the disease. Further research into the opinions of women about the cholesterol content of calcium-rich foods would be valuable.
Background – Whilst in most cases changes in diet and lifestyle can account for the sharp increase in morbidity and mortality from cardiovascular disease (CVD) in many Asian countries, the relatively high incidence of CVD in Sri Lanka cannot be fully explained by traditional plasma lipid markers or by an excessive total fat intake. Nevertheless, an excessive intake of saturated fat derived primarily from coconut (oil/milk/flesh), and a low consumption of polyunsaturated fatty acids (PUFA) coupled with a background diet rich in highly digestible carbohydrates may promote vascular endothelial dysfunction leading to the development of CVD.
Objective – To evaluate the effect of dietary saturated fats on vulnerability to cardiac arrhythmia in the rat.
Design – Twelve-week-old rats were fed ad libitum standard AIN-97 diets containing different dietary fats. The control diet contained 5% (w/w) fat from canola oil, and the test diets were made by replacing canola oil with either lard or a mixture of coconut oil (CO; 4.5% w/w) and sunflower oil (0.5% w/w). After a 3-month pre-feeding period, myocardial ischaemia was induced by temporary occlusion of the left anterior descending coronary artery. Parameters of cardiac arrhythmia, including percent incidence and duration of ventricular tachycardia (VT) and ventricular fibrillation (VF), and an arrhythmia score (AS), were calculated.
Outcomes – Compared with the control group, the CO and lard diets resulted in a significant increase in the duration (sec) of VT (1.9±0.8 control; 38.8±13.4 lard; 31.0±10.1 CO; P<0.01). VF was absent in the control group, compared with an incidence of 58% in the lard group and 91% in the CO group. The duration of VF and the percent mortality from VF were also higher in the CO-fed rats. The PUFA profile of myocardial phospholipids was unchanged by the dietary manipulation.
Conclusions – CO appears to have direct pro-arrhythmic actions in the rat model of cardiac arrhythmia and sudden cardiac death. Some of these actions may be elicited at the vascular endothelium level. Therefore, strategies directed at promoting vascular integrity and function may afford protection in such situations.
References – Abeywardena MY. Dietary fats, carbohydrates and vascular disease: Sri Lankan perspectives. Atherosclerosis 2003; 171: 157-161. Abeywardena MY. Dietary fatty acids and cardiovascular disease: modulation of non-lipid risk factors. In: Huan et al., eds. Dietary Fats and Chronic Disease. Champaign: AOCS Press, 2006; pp 157-167.
Background – Ready access to safe and affordable food was identified as an important element of food security at the 1996 World Food Summit in Rome (1). In 1994, 30% of Indigenous adults worried at least occasionally about going without food (2). The health status of Indigenous people remains the worst of any subgroup within the population, with little evidence of significant improvement over the past two decades (3). The causes of health disadvantage are complex; however, improved diet, access to food and health care play a role in improving health (3).
Objectives – To examine the literature for evidence of food security in the Indigenous population, and to examine the amount and type of materials published over the last 20 years.
Design - Biomedical databases were interrogated to determine the amount and type of materials published over the last two decades. Other resources, including departmental reports, were also examined.
Outcomes – Of the materials collected, 84% were published in the second decade. Most of those published in the first decade were descriptive and focused on Indigenous health, nutrition, diet and factors affecting food access. During the second decade, the focus was more on the development of policy and intervention programs. Most related to the Northern Territory.
Conclusion – Lack of food security in many communities is a major concern contributing to poor health status in Indigenous communities. Over the last decade this issue has received more attention in the literature. Policies have been developed to address food insecurity in some jurisdictions; however, few reports describe implementation of these policies. Little work, if any, has reported on levels of food insecurity in the urban environment or on interventions undertaken to address it.
Background – Laboratory studies have shown that the resistance of isolated low density lipoprotein (LDL) cholesterol or linoleic acid to oxidation is increased in incubations with chilli extracts or capsaicin - the active ingredient of chilli. It is unknown if these in vitro antioxidative effects also occur in serum of people eating chilli regularly.
Objective – To investigate the effect of daily ingestion of chilli on serum oxidation in adult men and women.
Design – This study investigated the effects of regular consumption of chilli on in vitro serum lipoprotein oxidation and total antioxidant status in healthy adult men and women. In a randomized crossover study, twenty-seven participants (13 men and 14 women) ate 30 g/day of a ‘freshly chopped chilli’ blend (55% cayenne chilli) and a no-chilli (bland) diet, for four weeks each. Use of other spices such as cinnamon, ginger, garlic and mustard was restricted to minimum amounts. At the end of each dietary period, serum samples were analysed for lipids, lipoproteins, total antioxidant status (TAS) and copper-induced lipoprotein oxidation. Lag time (before initiation of oxidation) and rate of oxidation (slope of the propagation phase) were calculated.
Outcomes – There was no difference in serum lipids, lipoproteins or TAS at the end of the two dietary periods. In the whole group, the rate of oxidation was significantly lower (mean difference (MD) -0.23 A × 10^-3/min; p = 0.04) after the chilli diet compared with the bland diet. In women, lag time was longer (MD 9.61 min; p < 0.001) after the chilli diet compared with the bland diet.
Conclusions – Regular consumption of chilli for four weeks increases the resistance of serum lipoproteins to oxidation.
Background – In Australia, the concept of the Glycemic Index (GI) was introduced to the general public through dietitians, other health professionals and the popular press in the mid 1990s, with the recommendation to consume more low-GI foods.
Objective – To determine whether general advice to consume more low GI foods has impacted the mean dietary GI of a representative sample of older Australians over the last 10 years.
Design – Prospective cohort study of 3,654 people aged 49 years or older in the Blue Mountains region of NSW. A total of 2,868 people satisfactorily completed a 145-item semi-quantitative food frequency questionnaire in 1992-94 (BMES 1) and were followed up in 1997-1999 (BMES 2) and 2002-04 (BMES 3). This analysis includes those people who satisfactorily completed the FFQ on all three occasions (n=1166). Nutrient intake data were analysed in a custom-built database using the NUTTAB (1-2) databases and Australian GI values (3). One-way analysis of variance was used to determine differences in mean dietary GI between surveys.
Outcomes – Mean dietary GI was 56.5±4.2 in BMES 1, 56.4±4.3 in BMES 2 and 56.2±4.3 in BMES 3 (P=0.037). Post-hoc comparisons using the Tukey HSD test indicated that mean dietary GI for BMES 3 was significantly lower than BMES 1, but BMES 2 did not differ significantly from either BMES 1 or 3.
Conclusion – The mean dietary GI of older Australians has dropped by a small but significant amount since the mid-1990s. Recommendations to consume more low-GI foods may be having a positive effect on older Australians’ diets.
References
1 Department of Community Services and Health. NUTTAB 90 Nutrient Data Table for Use in Australia. Canberra: Australian Government Publishing Service, 1990.
2 Department of Community Services and Health. NUTTAB 95 Nutrient Data Table for Use in Australia. Canberra: Australian Government Publishing Service, 1995.
3 Foster-Powell K, Holt SH, & Brand-Miller JC. International table of glycemic index and glycemic load values: 2002. Am J Clin Nutr 2002; 76: 5-56.
Background – One of the most effective intervention methods to assist with lifestyle modifications for weight management involves individual counselling with face-to-face contact; however, this method is time intensive and costly for patients. Recently, internet-based interventions and education programs have been developed. The internet can reach a large number of consumers in a more cost-effective manner than other information delivery channels.
Objectives – To determine whether an online, personalised weight reduction program including dietary advice plus exercise is more effective in reducing weight than an exercise-only program over 12 weeks.
Design – Participants were randomized to either an exercise only group (EX) or a diet plus exercise group (ED). Body Mass Index (BMI) and 24 hour dietary records were collected at baseline and week 12. Participants attended a baseline and final intervention visit where anthropometric measurements were performed. Subjects wore a pedometer, recorded daily steps and set weekly goals to increase daily steps through the internet program. The ED group also received healthy eating advice, set dietary goals via the internet and received personalised email assistance.
Outcomes – Seventy-three participants commenced and 53 completed the study (EX n = 26; ED n = 27; BMI (mean (SD)) 29.7 (2.5) kg/m2; age 46.3 (10.8) years; 21% male). Percent weight changes were EX 2.1 (0.6)% and ED 0.9 (0.6)% (P = 0.15); the change in total energy intake was EX +110 (666.6) kJ and ED -1812.6 (803.4) kJ (P = 0.07 between groups), with no difference in daily step change (ED 3525 (896.7) vs EX 3148 (848.2) steps, P = 0.76).
Conclusions – An internet-based program with goal setting resulted in a mean weight loss of 1-2%. The combined exercise and dietary modifications did not result in a greater weight loss when compared to exercise alone. Dietary education did not enhance weight loss over 12 weeks and there was an indication of a greater weight loss in the exercise only group, even though the ED group reported a similar increase in physical activity and a greater fall in energy intake. It may be that those randomised to the exercise group made additional lifestyle changes that we were unable to detect.
Background – There is little recent data on the dietary intake of sodium (Na) and potassium (K) in the Australian population. The best method for assessing dietary Na is 24-hour urine collections, which require a high level of subject co-operation. Dietary assessment can provide an estimate of Na intake but the association between dietary assessment and urinary measurement in Australian community dwelling adults is not known.
Objective – To determine the dietary intake of Na and K measured by 24-hr urinary excretion (UNa and UK) and 24-hr dietary recall (Diet Na and Diet K).
Design – Adults recruited to dietary intervention studies had dietary intake measured using a 24-hr recall (analysed with FoodWorks) and provided a single 24-hr urine collection, whilst on their usual diets.
Outcomes – Of the 144 participants, 85% (54 females (F), 69 males (M)) had UNa over the suggested dietary target (SDT) of 70 mmol/day and 62% (32 F, 57 M) were over the upper limit (UL) of 100 mmol/day.
Only 19% of participants (5 F, 23 M) met the SDT for K (120 mmol/day). Among those with two 24-hr recalls at baseline (n=88), Diet Na and Diet K were significantly correlated with UNa and UK (r=0.391, B(se)=0.018(0.005), P=0.0001 and r=0.579, B(se)=0.500(0.076), P=0.0001, respectively). BMI was also significantly correlated with UNa (r=0.397, B(se)=5.293(1.318), P=0.0001).
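For readers unfamiliar with how the r and B(se) statistics quoted above are typically obtained, the sketch below regresses urinary sodium on recall-derived sodium intake with SciPy's linregress. All values are hypothetical and the analysis shown is illustrative only, not the study's.

```python
# Minimal sketch (illustrative only): relating 24-hr urinary sodium excretion to
# sodium intake estimated from a 24-hr recall, reporting r and the regression
# slope with its standard error. Values below are hypothetical.
from scipy.stats import linregress

diet_na_mg = [2300, 3100, 1800, 2700, 3500, 2100, 2900, 2500]   # dietary Na, mg/day
urinary_na = [95, 130, 70, 110, 150, 85, 120, 100]               # UNa, mmol/day

res = linregress(diet_na_mg, urinary_na)
print(f"r = {res.rvalue:.3f}, B(se) = {res.slope:.3f} ({res.stderr:.3f}), p = {res.pvalue:.4f}")
```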
Conclusions – Most participants exceeded the UL for Na and few met the SDT for K. Dietary assessment was correlated with urinary output. Body size was a predictor of UNa and with the increasing BMI of the population, meeting the SDT for Na and K presents a great challenge.
Background – In recent years, there has been a focus on using the general practice setting for health promotion including improving dietary and exercise practices among patients.
Objectives – To determine the extent to which Australian-based GPs advise overweight and obese patients to make lifestyle changes for weight loss and advise hypertensive patients to reduce intake of salt and/or salty foods.
Design – A face-to-face survey was conducted on a representative sample of South Australian residents. Respondents provided information on height and weight (self-report), whether they had received lifestyle advice from their GP for weight loss, whether their GP had recorded their weight in the past 12 months, whether they had ever been told that they had high blood pressure, their current use of anti-hypertensive medication, and whether they had ever received advice to reduce their intake of salt and/or salty foods.
Outcomes – The sample included 2947 people aged 18 years or older (47% female; BMI (mean (SD)), 26.6 (5.3) kg/m2; age, 50.7 (18.0) years). Ninety-six percent of respondents had visited their GP in the past 12 months. Forty-one percent of males and 25% of females were overweight, and 19% of males and 20% of females were obese. Forty-five percent of overweight or obese respondents were weighed and 27% received lifestyle advice for weight loss purposes from their GP (5.5% received only dietary advice, 6.5% received only exercise advice and 15% received both dietary and exercise advice). Thirty-three percent of all respondents had been told in the past by their doctor that they had high blood pressure. Of these, 66.7% were taking medication for blood pressure control and 33.7% had been advised to reduce salt intake.
Conclusion – Although almost half of overweight and obese patients had been weighed, less than one-third of these had received lifestyle advice that could assist with weight loss. Additionally, just over one-third of those who had been told by a doctor that they had high blood pressure had received advice to reduce salt intake. These are potentially missed opportunities for GPs to reinforce the benefits of lifestyle changes with respect to weight and blood pressure control. Strategies should be investigated that encourage GPs to assess risk of overweight and obesity, and that support GPs in providing simple advice to help patients make positive lifestyle changes that could at least assist in preventing weight gain and reducing blood pressure.
Background – Heart rate recovery (HRR) is an independent risk factor for cardiovascular disease and mortality and is inversely associated with insulin resistance and related metabolic risk markers.
Objective – The objective of this study was to determine the effects of weight loss on HRR and its association with traditional cardiovascular risk markers in overweight and obese men with components of the metabolic syndrome.
Design – HRR (defined as the decrease in HR from peak HR to that measured 1 minute after a standardised graded treadmill test) and a range of well-established cardiovascular risk factors were measured in 42 overweight and obese men (BMI: 33.8±0.6 kg/m2; age: 46.5±1.3 yrs) who had no symptoms of cardiovascular disease but had components of the metabolic syndrome, before and after 12 weeks of weight loss.
Outcomes – The dietary intervention resulted in a 9% weight reduction (P<0.001). There were significant reductions in waist circumference, blood pressure, plasma triglycerides, total, LDL and HDL cholesterol, triglyceride:HDL ratio, CRP, plasma insulin, glucose and HOMA (P<0.05). Although peak heart rate remained unchanged, HRR at 1 minute improved from 33.4±1.4 to 37.0±1.2 bpm (P<0.001) after weight loss. There were no changes in either cardiorespiratory fitness (P=0.30) or physical activity levels (P=0.67). The improvement in HRR was significantly correlated with decreases in body weight, BMI, waist circumference, plasma glucose, serum triglycerides and triglyceride:HDL ratio, but was only independently associated with changes in weight and plasma glucose concentrations.
Conclusion – In addition to improving a range of well-accepted cardiovascular and metabolic risk factors, weight loss also improves HRR after exercise, a less well-recognised risk factor.
Acknowledgment – This work was funded by a Medical Research grant from Meat and Livestock Australia. None of the authors had any conflict of interest.
Background – There is growing concern over the increasing prevalence of obesity, diabetes and dental erosion among Australian children. The association between marketing, such as television advertising of low-nutrient, energy-dense foods, and childhood obesity is becoming an issue of concern for public health (1).
Objectives – The purpose of this study was to determine the nutrient content, particularly fat and sugar content, of food products marketed to children in Australia.
Design – Packaged food products (408) were purchased from supermarkets in different Brisbane regions on random days in April, June and July in 2005. A further 23 products were purchased from fast food outlets. Food products were chosen based on the marketing techniques directed at children. The food products were categorized into 17 food types. A Microsoft Access database was created to consolidate information on nutrients, labels and marketing techniques of the products evaluated. The RED category criteria tables from the Smart Choices food and drink selector (2) and the Nutrition Australia Selection Guidelines (3) were used as the benchmarks for mean energy, total fat, saturated fat, sugar and sodium levels. Statistical analyses were performed using MINITAB 14.
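The benchmark comparisons described above can be illustrated with a simple one-sample test of product means against a criterion value, as sketched below. The study used MINITAB 14; this Python sketch, with hypothetical products and a placeholder benchmark, shows the general idea only.

```python
# Minimal sketch (not the study's MINITAB analysis): testing whether the mean
# sugar content of a food type differs from a benchmark criterion value.
# The benchmark and product values below are hypothetical placeholders.
from scipy.stats import ttest_1samp

sugar_g_per_100g = [28.0, 35.5, 31.2, 40.1, 26.8, 33.4]   # products in one food type
benchmark = 20.0                                            # placeholder criterion value

t, p = ttest_1samp(sugar_g_per_100g, popmean=benchmark)
mean_val = sum(sugar_g_per_100g) / len(sugar_g_per_100g)
print(f"mean = {mean_val:.1f} g/100 g, t = {t:.2f}, p = {p:.4f}")
```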
Outcomes – Important marketing strategies included ease-of-use packaging well designed for children (88.9%), use of popular personalities (63.3%) and television advertising (37.4%). Mean total fat, saturated fat and/or mean sugar content of most food types were found to be significantly higher (p<0.05) than the comparable benchmark values. The majority of the 431 food products were classified into the RED category of the Smart Choices food and drink selector, i.e. foods that are high in energy, saturated fat, and/or added sugar and/or sodium.
Conclusion – Manufacturers use a variety of techniques to market food products to children. Most of the food products marketed especially to children in this study are not among the most appropriate or healthiest food choices.
References
1 Zuppa JA, Morton H, Mehta K. TV food advertising: Counterproductive to children’s health? Nutr Diet 2003;60:78–84.
2 Department of Education and the Arts & Queensland Health. Smart Choices: Healthy food and drink supply strategy for Queensland Schools. Queensland Government, 2005.
3 Nutrition Australia. Food selection guidelines for children and adolescents. Nutrition Australia, 2004.
Background – Obesity is a major concern for westernised populations and one of the leading contributors is the prevalence of foods that are high in fats and sugars. Honey is a ready source of sugar that offers nutritional benefits over the use of sucrose.
Objectives – To assess whether replacing sucrose in a standard western diet with honey would have any impact on weight, food intake or blood sugar and cholesterol levels.
Design – Forty rats aged 6 weeks were fed one of four experimental diets containing either no sugar, 8% w/w sucrose, 8% w/w mixed sugars as found in honey, or 10% w/w high-peroxide/high-antioxidant rewarewa honey (honey is 20% w/w water). The diets were fed ad libitum for 6 weeks. The carbohydrate/fat/protein ratio of each diet was formulated to be equivalent to a typical New Zealand diet, based upon data from the 1997 NZ National Nutrition Survey. During the experiment, the rats were housed in standard rat cages (2 animals per cage) with a raised mesh floor. Animals’ weights and food intakes were assessed weekly. On day 42, all rats were anaesthetised using CO2 gas. Blood samples were taken via cardiac puncture and analysed for glycated haemoglobin (HbA1c) and fasting lipid profiles. After euthanisation, each rat was minced using a Sunmile SM-G50 mincer (Vantage) and total body fat and protein levels were determined using Soxtec fat extraction and the LECO total combustion method, respectively.
Outcomes – Overall percent weight gain in honey-fed rats was significantly reduced, by 16.7%, compared with sucrose-fed rats (p < 0.01), and was similar to that observed in rats fed the sugar-free diet after 6 weeks. Total food and calorie intake was significantly higher in all sugar treatments compared with the sugar-free treatment group (P<0.01); however, no statistically significant differences in total food intake were observed between the three sugar treatments. No differences in HbA1c, total, LDL- or HDL-cholesterol, or triglyceride levels were observed between the three sugar treatments. Body fat measurements were inconclusive due to large data variability, but no significant differences in total body protein levels were observed.
Conclusion – The replacement of sucrose with honey in the diet can lead to lower weight gain in young animals despite a similar food intake. Mixed sugars (as in honey) did not produce decreased weight gain, suggesting the effect of honey may be due to components other than its sugars.
Background – Adhesion and colonization of the mucosal surfaces by probiotics are possible protective mechanisms against pathogens through competition for binding sites and nutrients (1) or immune modulation (2).
Objectives – To test the abilities of probiotics to inhibit, displace and compete with pathogens, in order to screen for the most effective adhesive probiotic combination, and to develop methods for characterising new probiotics.
Design – A human intestinal mucus model (2) was used to assess probiotic strains and their combinations. The strains were selected on the basis of their use as commercial probiotic strains, and each has been demonstrated to have beneficial in vivo health effects.
Outcomes – Probiotic strains showed different abilities to counter pathogen adhesion. These properties were strain- and combination-specific, indicating the need for case-by-case characterisation. All combinations were able to reduce pathogen adhesion (p<0.05), and in some cases inhibition of over 40-50% was demonstrated, but not all strains alone were able to inhibit pathogen adhesion. Thus, the selection of probiotic strains or combinations to inhibit or displace a specific pathogen could be the basis for both product development and future clinical intervention studies on the prevention or treatment of dysfunctions.
Conclusion – Our results suggest that different probiotic combinations can be formulated to enhance the inhibition and displacement of pathogen adhesion to intestinal mucus. New combinations could be more effective than a single strain in inhibiting and displacing pathogen adhesion. Further studies are needed to characterise each combination and to understand their roles in the inhibition mechanisms.
Background – Epidemiological studies suggest that adhering more closely to National Dietary Guidelines is associated with improved diet-related health outcomes, with a reduction in morbidity and mortality. A number of methods have been used to generate dietary scores to measure diet quality and variety.
Objectives – To evaluate whether an association exists between diet quality and indices of quality of life, health service use and Medicare costs in the Australian Longitudinal Study of Women’s Health (ALSWH).
Design – Cross-sectional measurement of the association between the Australian Recommended Food Score (ARFS), self-reported variables and Medicare costs in women (n = 11,194, 50-55 yr) participating in the 2001 survey of the mid-aged cohort of the ALSWH. The ARFS was derived from responses to the Dietary Questionnaire for Epidemiological Studies FFQ and increases with the number of foods consistent with the Australian Dietary Guidelines that are consumed at least once a week. ARFS was divided into quintiles, with higher scores having more favourable macro- and micronutrient profiles. Data linkage allowed examination of associations with Medicare costs.
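A minimal sketch of the scoring idea described in this Design is shown below: one point is counted for each recommended food reported as eaten at least once a week, so higher scores reflect greater variety of guideline-consistent foods. The foods, response categories and scoring rules here are illustrative and are not the published ARFS algorithm.

```python
# Minimal sketch of the scoring idea (not the published ARFS rules): one point for
# each guideline-consistent food reported as eaten at least once a week, so higher
# scores reflect greater variety of recommended foods. Foods and frequencies are
# hypothetical.
WEEKLY_OR_MORE = {"1-2/week", "3-4/week", "5-6/week", "daily"}

def variety_score(ffq_responses, recommended_foods):
    """Count recommended foods consumed at least once per week."""
    return sum(1 for food in recommended_foods
               if ffq_responses.get(food) in WEEKLY_OR_MORE)

responses = {"apples": "daily", "wholemeal bread": "3-4/week",
             "legumes": "1-3/month", "fish": "1-2/week"}
print(variety_score(responses, ["apples", "wholemeal bread", "legumes", "fish"]))  # -> 3
```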
Outcomes – More women in the lowest quintile of the Australian Recommended Food Score reported their general health as fair or poor compared to those in the highest quintile (18 vs 10%, P<0.0001). The mean SF36 general health perception domain score was higher for those in the top ARFS quintile compared to the bottom (mean (95%CI): 75.3 (74.3, 76.2) vs 67.1 (66.2, 68.0) P<0.0001). Fewer women in the highest ARFS quintile reported four or more GP consultations in the previous year compared to the lowest (13% vs 17%, P=0.0024) but there was no difference in Medicare costs across the quintiles, P>0.05.
Conclusion – Higher ARFS is associated with improved self-reported indices of quality of life, but not reduced Medicare costs. Longitudinal evaluation will determine whether a higher ARFS is protective in terms of predicting health outcomes or reducing long-term health costs.
Background – Following ingestion of a meal, postprandial hyperglycaemia in cats persists for 20-24 hrs, and the reasons for this are unknown.
Objectives – To describe the patterns of postprandial plasma glucose, D-lactate, and L-lactate concentrations, and gastric emptying time in meal-fed cats, and to assess the effects of meal volume on gastric emptying time.
Design – Eleven healthy cats were fed a commercially available, high-carbohydrate (54% of metabolisable energy) diet for 2 weeks. In the third week, on two separate occasions, fasted cats were fed a meal of 50 kcal/kg, of which at least 90% was consumed within 30 min. On the first occasion, the cats underwent repeated ultrasound examinations over 26 hrs to determine gastric emptying time. On the second occasion, plasma glucose, D-lactate and L-lactate concentrations were measured over 24 hrs. To assess the effect of the volume of food eaten on gastric emptying time, 2 weeks later five of the same cats were fed a meal of the same composition but half the volume (25 kcal/kg), and a second series of ultrasound examinations was performed.
Outcomes – Glucose concentrations were significantly higher than baseline from 1 to 18 hrs after feeding (P<0.001), reaching a peak at 10.7 ± 5.3 hrs (mean ± SD) after the meal. Median time to gastric emptying when cats were fed their total daily energy intake in a single meal was 24 hrs (range 16-26 hrs). In contrast, times to gastric emptying were substantially shorter when cats were fed 50% of their daily intake in a single meal (median 14 hrs; range 12-14 hrs). D- and L-lactate concentrations did not change substantially after feeding.
Conclusion – These results suggest that prolonged gastric emptying time contributes to the prolonged postprandial hyperglycaemia observed in meal-fed cats. They also show that gastric emptying is faster if the meal size is reduced.
Background – Most plant foods, especially green vegetables, wholegrain breads and cereals, and peas and dried beans, contain folates. There is a critical need to estimate dietary folate intakes for nutrition monitoring and food safety evaluations in the South Pacific.
Objectives – One objective of this survey was to ascertain the knowledge of the people of the Central and Eastern parts of Viti Levu in the Fiji Islands about the importance of folates in their diet. Another objective was to collect information on the types of food consumed by the population in order to select folate containing foods and to analyse their folate content. A third objective was to assess whether the people surveyed were getting sufficient folates in their daily diet.
Design – A short qualitative food frequency questionnaire (FFQ) to assess folate intake was developed. A random sample of 200 males and females was interviewed in 2005. The average age of the study sample was 35 years; 50% were from rural areas and 50% from urban areas.
Outcomes – The FFQ showed that most of the foods high in folic acid/folate were consumed only once a week by both males and females. In a research study of folate levels in Fijian leafy vegetables, Chinese cabbage (Brassica chinensis) and Bele (Abelmoschus manihot) were found to have high levels of folates (1). However, none of the females surveyed consumed these vegetables on a daily basis.
Conclusion – There is a need to explore ways to improve folate intakes in the overall population of the Fiji Islands and the South Pacific to prevent folate deficiency diseases, especially neural tube defects in females of childbearing age.
Reference
1 Sotheeswaran S, Devi R, Arcot J, Ali S. Folate concentration of selected Fijian foods using the tri-enzyme extraction method. J Food Composition and Analysis 2006 (submitted).
Background – In-vitro models of the human digestive system are useful for identifying factors that may influence the molecular behaviour of nutritional ingredients during digestion and passage into the circulatory system. It is important that models faithfully represent important digestive processes with the minimum of operational complexity. Current models (a) usually use mechanical size reduction to mimic chewing and (b) sometimes use a Caco-2 epithelial cell monolayer to estimate uptake into cells, but not to evaluate metabolism across the cell layer.
Objectives – To investigate (a) whether chewing can be substituted by mechanical size reduction and (b) whether passage across Caco-2 cells can be used to assess the potential for metabolism of food components during uptake.
Design – (a) Using fresh and processed fruit as example foods, the size profile and microstructure of chewed (ready to be swallowed) pieces were examined, and the subsequent release of fruit sugars during simulated gastric and intestinal processing was monitored. (b) The passage of beta-carotene and catechin across a Caco-2 epithelial cell monolayer was studied.
Outcomes – (a) Physiologically, chewing cannot be simulated with simple size-reduction methods because of the heterogeneity of chewed particle sizes (75 μm – 7 cm) and shapes, and the effects of oral processing on bolus characteristics. The large size (> 0.5 cm) of chewed fresh or dried fruit results in incomplete release of sugars after simulated gastric and small intestinal digestion (up to 47% lower compared with juice). (b) After a 2 h assay, beta-carotene was metabolised by the Caco-2 cell monolayer more extensively (approximately 8.03% total conversion to retinol) than catechin (about 1.43E-5% conversion to catechin metabolites).
Conclusion – (a) Chewed particle characteristics are a likely determinant of subsequent nutrient release from solid foods and should not be overlooked in the development of in-vitro digestion models. (b) The Caco-2 cell monolayer can be used to monitor metabolic transformation of nutrients that may be relevant to first-pass metabolism in vivo.
Background – Current scientific literature has yet to provide strong support for a role of nutrition knowledge in influencing food intake behaviours. This relationship may have been prematurely rejected, as previous measures of nutrition knowledge have lacked validity and reliability, and measuring dietary intake is notoriously difficult.
Objective – To use a valid and reliable measure of nutrition knowledge to assess the relationship between knowledge and dietary intake in two South Australian communities of differing socio-economic status (SES).
Design – Nutrition knowledge was measured in two community groups in South Australia: a lower SES sample (Low-SES, n = 118) and a higher SES sample (High-SES, n = 96). Dietary intake was measured within a smaller sub-sample of the two groups by validated food frequency questionnaire (FFQ) and assessed using a diet quality index and food group analysis.
Outcomes – The Low-SES group scored significantly lower on three of the four knowledge areas: knowledge of the sources of nutrients, choosing everyday foods, and knowledge of diet-disease relationships (p<0.001). Overall, the total nutrition knowledge score (out of a possible 113) was almost 13 points lower for Low-SES than for High-SES (p<0.001). Preliminary analysis of reported dietary intake (FFQ) suggests that individuals from the lower-SES community were consuming less fruit and vegetables, more high-sugar, low-fibre carbohydrate foods, and less variety in their food choices.
Conclusions – Nutrition knowledge levels differed significantly between community samples of differing SES, and differences in diet quality appear to be present. Further analysis will examine the strength of the relationship between nutrition knowledge and dietary intake within the context of other factors that affect dietary behaviour.
Background – Selenium deficiency may be associated with increased risk of viral virulence, cancer, negative mood states and immunological dysfunction, and Se intakes above the minimum requirements appear to have a positive effect on the latter three. There is relatively little detailed information on selenium intake in Australian adults.
Objective – This study aimed to estimate the selenium intake of a randomly selected population of Victorian women enrolled in the Geelong Osteoporosis Study, to ascertain the main food groups contributing to daily selenium intake, and to investigate the effect of age-group differences in food choices on Se intake and Se sufficiency.
Design – A detailed, semi-quantitative food frequency questionnaire (FFQ) was administered to randomly selected women (n=556), aged 20-88 y, from the Barwon Statistical Division, representative of Australia in several demographic factors. Se values for Australian foods were used where available (Australia New Zealand Food Authority).
Outcomes and conclusion – The FFQ captured responses on 359 foods. Se intake (mean±SD) was 77±29 μg/day (median 73 μg/day), which is higher than mean intakes in NZ and most European countries, but lower than US and Canadian mean Se intakes. Mean intake was 1.1±0.4 μg/kg body weight (range 0.2-3 μg/kg). 43% of women consumed less Se than the Australian NHMRC recommended intake for women (70 μg/day). Wheat products, fish, vegetables, beef, fruit (Australian canned fruit has relatively high Se), dairy and poultry provided 20%, 10%, 10%, 10%, 9%, 9% and 7%, respectively, of the total Se intake. Significant age-group differences were also found: women aged 40-49 y were most at risk of Se insufficiency, and older women consumed more Se from fruits, lamb, peas and beans. Younger women consumed more Se from mixed dishes (including many “take away” foods), chocolate foods and non-wheat grains.
Background – Australian fresh pork composition was last analysed in 1994; since then, rearing and butchering practices have improved considerably, with the potential to affect the composition of pork as currently available.
Objective – To assess the folate composition of nationally representative retail fresh pork cuts in 2005/6.
Design – Fresh pork cuts were purchased anonymously in three capital cities in 2005/2006 according to a sampling design that generates representative independent duplicate samples per state. Pork loin chop was studied as independent state duplicates, and other cuts were analysed as national homogenates. Gross composition was measured at Food Science Australia and lean and fat homogenates, raw and cooked, were formed for freight as fresh chilled samples for folate measurement in triplicate by the tri-enzyme method (1), modified with a fourth lipase enzyme treatment.
Outcomes – Folate levels varied from 0-83 μg/100 g in raw lean, and from 0-37 μg/100 g in raw fat. The nationwide average folate levels for all cuts, in μg/100 g (mean ± SD), were: raw lean, 39.8 ± 24.8; cooked lean, 28.6 ± 23.2; raw fat, 17.8 ± 12.3; cooked fat, 25.9 ± 21.5.
Conclusions – Australian raw fresh pork contains highly variable levels of folate. The average levels of folate in Australian pork were higher than expected, and much higher than those reported for British or American fresh pork.
Background – Maternal nutrition plays a major role in a healthy outcome for pregnancy. Increasing numbers of women have experienced an eating disorder and recovered. Does this influence weight gain and nutrition in pregnancy?
Objective – To explore how some women recover from an eating disorder (ED) and manage the experience of pregnancy and mothering, specifically weight gain and the nutritional needs of themselves and their child.
Design – Two groups of women were studied: the ED group (n=10) and a reference group (n=8) of women of comparable body mass who said they had never had an ED. This was a qualitative study using Grounded Theory as the research method.
Outcomes – Recovery involved adopting more constructive coping strategies (exercise) to “measure up”. Their pregnancies were characterised by predominantly high weight gains (6/10 women gained 18-30 kg, i.e. more than the upper recommended limit for women with a low BMI, although only 3 of those 6 women were in the low pre-pregnancy BMI range; 2/10 women gained within the normal range and 2/10 gained less than recommended), above-normal infant birth weights (8/10 babies weighed >3300 g and none was in the low birth weight range) and successful breastfeeding (all women breastfed for a minimum of 4 months and 7/10 breastfed for 6 months or longer). None of the 10 women reported receiving any personalised nutritional information from their health professionals during their pregnancies.
Conclusions – ED recovered women are characterised by having a need to “measure up” throughout their lives. They are highly motivated to have healthy pregnancies and can be particularly receptive to nutritional guidance at this time.
Background – Approximately 12% of people of Northern European descent are heterozygous for the C282Y mutation of the HFE gene (homozygosity for which is associated with hereditary haemochromatosis). Improved phenotypic characterisation is needed to assess health risks for the heterozygous genotype.
Objective – To determine the relative importance of HFE genotype, diet, lifestyle, and blood loss for predicting iron status in a sample of UK men aged 40 years or over.
Design – Iron status (serum ferritin (SF), transferrin saturation (TS), soluble transferrin receptor (sTfR)) was measured in 44 C282Y heterozygote and 85 age- and BMI-matched wildtype men. Dietary intake of iron (total, haem and non-haem), and of components that influence iron bioavailability, was determined using a validated Meal-Based Intake Assessment Tool. Lifestyle and blood loss data were obtained by questionnaire, and height and weight were measured. Linear mixed models were used to determine the predictors of iron status, controlling for matching.
Outcomes – C282Y heterozygosity was associated with 18% higher TS (95% CI: 7%, 31%) but no difference in SF or sTfR concentrations. Blood donation was negatively associated with TS (-13% (-3%, -22%)) and SF (-58% (-44%, -68%)), and had a marginally significant positive association with sTfR concentration. Self-reported faecal blood loss was negatively associated with SF concentration (-35% (-54%, -7%)). Alcohol was the only dietary variable associated with iron status and was associated with all three iron status indices. Serum ferritin concentration was positively associated with BMI (10% increase per BMI unit increase (6%, 15%)).
Conclusions – Blood loss was a stronger predictor of iron status than either C282Y heterozygosity or diet in this population of UK men.
Bioelectrical impedance analysis (BIA) measures the impedance and resistance associated with passage of an alternating current through the body. These measures are related to total body water (TBW) and can therefore be used to provide expedient estimates of body composition. However, little validity information is available for popular, commercially available bathroom-scale-type devices which perform segmental (lower limb) measurements. The aim of this study was therefore to compare body composition estimates from a commercially available, easy-to-use segmental BIA device (Tanita BC-532, Tanita Corp, Tokyo, Japan) with criterion values in a group (n = 9) of healthy males (mean ± SD: 48.6 ± 18.8 yr; 173.8 ± 4.5 cm; 72.8 ± 8.9 kg). Criterion four-compartment body composition determinations involved measures of body density, TBW and bone mineral mass. The results (mean ± SD) are summarised below:
Measures | BIA | Four compartment | Four compartment - BIA | P |
% Body fat | 20.8 ± 5.5 | 21.0 ± 6.2 | 0.2 ± 2.5 | 0.785 |
Fat free mass (kg) | 57.4 ± 4.9 | 57.0 ± 4.2 | -0.4 ± 1.8 | 0.543 |
TBW (kg) | 39.3 ± 3.5 | 41.1 ± 3.2 | 1.8 ± 1.2 | 0.002 |
While the mean %BF and fat-free mass values for the two methods were not significantly different, considerable intra-individual differences were observed. BIA values varied from the four-compartment values by -3.0 to 4.4% BF and -3.3 to 1.9 kg fat-free mass. The BIA estimates of TBW were significantly different from the criterion measures, and intra-individual differences displayed a large range of -0.6 to 3.6 kg. Significant underestimation of TBW via BIA is concerning given that this is the parameter initially established by this method. Furthermore, the BIA data resulted in a FFM hydration value of 68.5%, which was significantly (P<0.001) lower than the four-compartment value of 72.0%. Presumably the BIA algorithms use an assumed FFM hydration value to determine body composition. In conclusion, the BIA device tested displayed poor individual accuracy for the estimation of body composition compared with a criterion method.
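The FFM hydration figures quoted above follow directly from the tabled group means (TBW expressed as a percentage of fat-free mass), as the short sketch below shows; it approximately reproduces the 68.5% and 72.0% values and is included only to make the calculation explicit.

```python
# Sketch of the hydration calculation implied above: fat-free mass (FFM) hydration is
# total body water expressed as a percentage of FFM. Group means are taken from the
# table in this abstract.
def ffm_hydration(tbw_kg, ffm_kg):
    return tbw_kg / ffm_kg * 100.0

print(f"BIA:              {ffm_hydration(39.3, 57.4):.1f}%")   # ~68.5%
print(f"Four compartment: {ffm_hydration(41.1, 57.0):.1f}%")   # ~72.1%
```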
Background – Since mandatory folic acid fortification of flour has contributed to the reduction of neural tube defects (NTDs) in the U.S., Canada and Chile (1), mandatory folic acid fortification of bread-making flour has been proposed for Australia and New Zealand (2).
Objectives – To determine patterns of eating behaviour of women of childbearing age, and evaluate the appropriateness of bread-making flour as the selected food vehicle of folic acid fortification.
Design – A population sample of 197 women aged 19-45 years was recruited to complete a food frequency questionnaire (FFQ). Data on the women’s dietary preferences and intakes were collected and analysed.
Outcomes – For all subgroups, average intakes of bread and cereal products did not meet the recommendations set in the Dietary Guidelines for Australian Adults (3). Bread consumption averaged three to four slices per day. With fortification of bread, dietary folate intake would increase by 117-156 μg/day, achieving the proposed target. However, the survey revealed that one-third of women did not eat sufficient bread to benefit from fortification.
Conclusion – Although potentially beneficial, folic acid fortification warrants an educational campaign, particularly for women with low bread consumption.
Background – The majority of epidemiological studies examining associations between glycemic load (GL) and risk of chronic diseases have used validated food frequency questionnaires with carbohydrate correlation coefficients ranging from 0.4 to 0.8. Few have reported the degree of agreement between the questionnaire and reference intake.
Objectives – To test the validity of a GL questionnaire (GLQ) by comparison with a detailed diet history.
Design – 54 women aged 42 to 82 years were recruited from a cohort of 511 participants in the Longitudinal Assessment of Ageing in Women (LAW). Carbohydrate intake was assessed by a specially designed GLQ; GL values [carbohydrate (g) × glycemic index (%)] were summed to provide the average daily GL. Data were validated against a diet history.
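A worked sketch of the GL summation stated in this Design is shown below: each food contributes its available carbohydrate multiplied by its GI (as a fraction), and the contributions are summed to give daily GL. The foods, carbohydrate amounts and GI values are hypothetical examples, not study data.

```python
# Worked sketch of the glycaemic load calculation stated in the Design:
# GL for a food = available carbohydrate (g) x GI (as a fraction), summed over the
# day to give average daily GL. Values below are hypothetical, not study data.
foods = [
    # (name, available carbohydrate g/day, GI %)
    ("white bread", 45, 70),
    ("apple",       20, 38),
    ("milk",        12, 31),
]

daily_gl = sum(carb_g * gi / 100.0 for _, carb_g, gi in foods)
print(f"Daily glycaemic load = {daily_gl:.1f}")   # 45*0.70 + 20*0.38 + 12*0.31 = 42.8
```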
Outcomes – Mean ± SEM intakes from the diet history were 6% higher than those from the GLQ for carbohydrate (216 ± 6 versus 203 ± 8 g/day, P<0.05) and GL (110 ± 4 versus 103 ± 4, P=0.1), respectively. There were significant correlations between methods for carbohydrate (r=0.54, P<0.01) and GL (r=0.57, P<0.01). The 95% limits of agreement determined by Bland-Altman plots ranged from -111 to 83.7 g for carbohydrate, with almost half the subjects recording a difference of ±40 g, and from -60.0 to 46.6 for GL, with a third of subjects recording a difference of ±25 units or more.
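The 95% limits of agreement quoted above are the Bland-Altman bounds, i.e. the mean difference between methods plus or minus 1.96 standard deviations of the differences. The sketch below illustrates the calculation on hypothetical paired carbohydrate intakes; it is not the study's analysis.

```python
# Minimal sketch of Bland-Altman 95% limits of agreement: mean of the method
# differences +/- 1.96 SD of the differences. Paired carbohydrate intakes (g/day)
# below are hypothetical.
import statistics

diet_history = [210, 250, 190, 230, 205, 275, 220, 240]
glq          = [200, 270, 170, 215, 225, 240, 210, 250]

diffs = [a - b for a, b in zip(diet_history, glq)]
mean_diff = statistics.mean(diffs)
sd_diff = statistics.stdev(diffs)
print(f"bias = {mean_diff:.1f} g, 95% limits of agreement = "
      f"{mean_diff - 1.96 * sd_diff:.1f} to {mean_diff + 1.96 * sd_diff:.1f} g")
```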
Conclusion – Our GLQ had acceptable validity in terms of correlation with the dietary history. From a clinical perspective however, substantial error existed in estimation of individual carbohydrate and GL intakes. We suggest that studies using food frequency questionnaires to estimate GL state limits of agreement instead of or as well as correlations when discussing validity. Failure to accurately assess carbohydrate intake could explain some of the discrepancies in results of dietary studies investigating associations between GL and the development of chronic diseases in individuals.
Background – Blueberries are considered a healthy addition to the diet as a rich source of antioxidants. Although the nutritional content of blueberries is highly affected by environmental growing conditions, no research has been published on New Zealand-grown blueberry varieties.
Objective – To investigate the antioxidant activities and total phenolic content of four NZ-grown blueberry cultivars.
Design – Atlanta, Burlington, Jersey and Stanley blueberries were harvested from one commercial producer in Canterbury, New Zealand. Total phenolic content (TPC) was measured using Folin–Ciocalteu reagent, and antioxidant activities using the superoxide anion (O2•-) scavenging activity (SASA) method and the DPPH method.
Outcomes – TPC of NZ blueberries ranged from 230.10 ± 18.00 to 497.10 ± 63.20 mg GAE/100 g. Burlington blueberries had significantly (P<0.001) higher TPC than other varieties. Antioxidant activity (SASA) was significantly (P<0.05) different between cultivars, with Burlington showing the highest activity (1369 ± 141 mg GAE/100 g). There was a significant correlation between TPC and SASA (r = 0.58, P<0.05). Antioxidant activity by DPPH showed no differences between cultivars (P>0.05).
Conclusion – The TPC of NZ blueberries is similar to that of blueberries grown in America. This study suggests Burlington may offer a slight advantage in antioxidant content over the Atlanta, Jersey and Stanley cultivars.
Background – Coffee is a very popular beverage and contains several bioactive compounds. Antioxidant activities have been reported in coffee; however, little is known about the levels of phenolics and the antioxidant capacity of commercial coffee drinks.
Objective – To determine the phenolics content of six commercial coffee drinks (espresso, long black, filtered, flat white, latte and cappuccino) from 4 different coffee retailers, and to investigate their antioxidant activities.
Design – Six commercial coffee drinks were purchased from 4 different retailers on 3 different days. Total phenolics concentration (TPC) was determined using the Folin–Ciocalteu method. Antioxidant activities of the coffees were estimated using the DPPH and the superoxide anion (O2·−) scavenging activity (SASA) methods.
Outcomes – Black coffee (espresso, long black, and filtered) was higher (P < 0.001) in TPC/g freeze-dried extract compared with white coffee (flat white, latte and cappuccino). Based on the serve of coffee, TPC (GAE/cup) from different retailers was not different (P > 0.05), whereas white coffee had higher (P < 0.001) TPC (GAE/cup) compared with black coffee. The overall %SASA per serve was higher (P < 0.001) in white coffee than in black coffee. A similar trend was seen using the DPPH assay. Significant correlations were found between TPC, SASA and DPPH.
Conclusions – The data show that coffee is a good source of antioxidants, but variation between and within retailers can affect the daily intake of phenolics from coffee.
Background – Many studies have documented that exercise and nutrition both have regulatory effects upon carcinogenesis; indeed, the progression of lesions and cancers can be reduced or eliminated. Both soy and whey proteins have been shown to have health benefits beyond their basic nutritional value, including anti-carcinogenic properties against colorectal, breast and prostate cancers. At present, limited knowledge exists of the mechanisms by which exercise, alone or in combination with nutrition, exerts these anti-carcinogenic properties and whether anti-tumour properties extend to other tissues.
Objective – To determine the effects of moderate voluntary exercise on the incidence and progression of N-nitrosodiethylamine (NDEA)-induced pre-neoplastic lesions and on gene expression in the liver.
Design – Pre-neoplastic lesions were induced in the livers of Wistar rats by NDEA injections (25 mg/kg body weight, twice weekly for 3 weeks). Rats (9 per treatment) were then allowed voluntary access to exercise for up to 12 weeks. Following euthanasia, liver samples were obtained and the total area occupied by pre-neoplastic lesions was determined by immunohistochemical staining for GST-Yp. DNA microarray analysis was undertaken to investigate the effect of exercise upon gene expression.
Outcomes – This study found that pre-neoplastic lesions failed to progress after 9 and 12 weeks of exercise (P<0.01). Exercising also resulted in altered gene expression profiles which were consistent with enhancement of the immune system. Exercise also counteracted many of the effects upon gene expression due to the presence of lesions.
Conclusions – Moderate exercise prevented the progression of NDEA-induced pre-neoplastic lesions in the liver compared with sedentary controls, possibly via a mechanism involving modulation of gene expression. The interactive effects of exercise with dietary change are being further investigated.
Background – Increased net acid excretion has been found to negatively affect bone mass in children and women and may have an impact on other indicators of health such as blood pressure.
Objective – To determine the potential renal acid load (PRAL) and net acid excretion (NAE) levels derived from dietary assessment in community dwelling adult men and women following a variety of different diets.
Design – PRAL and NAE (mEq/d) were calculated from 24-hour recall data (1-3 days) in subjects (mean age 54 (SD 10) years) on a range of diets (>4 weeks): low vegetable (veg), dairy; low fat, high CHO, energy deficit (LFHCHOEND); Dietary Approaches to Stop Hypertension, energy deficit (DASHEND); low sodium (Na), high potassium (K); and DASH.
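For illustration only: PRAL is commonly derived from nutrient intakes using the Remer and Manz coefficients, with NAE then estimated by adding an organic acid term scaled to body surface area. Whether this exact formulation was used in the present study is an assumption, and the intake values below are hypothetical.

def pral(protein_g, p_mg, k_mg, mg_mg, ca_mg):
    # Potential renal acid load (mEq/day) from daily intakes, Remer & Manz coefficients
    return (0.4888 * protein_g + 0.0366 * p_mg
            - 0.0205 * k_mg - 0.0263 * mg_mg - 0.0125 * ca_mg)

def estimated_nae(pral_meq, body_surface_area_m2):
    # NAE (mEq/day) = PRAL + organic acid excretion estimated from body surface area
    return pral_meq + body_surface_area_m2 * 41.0 / 1.73

# Hypothetical day: 90 g protein, 1400 mg P, 3000 mg K, 320 mg Mg, 900 mg Ca; BSA 1.8 m2
p = pral(90, 1400, 3000, 320, 900)
print(round(p, 1), round(estimated_nae(p, 1.8), 1))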
Outcomes – The usual Australian diet and the low fruit and vegetable diet had the highest acid loads. Diets with increased fruit and vegetables reduced acid load. PRAL on DASHEND was lower (-9 mEq/d, P=0.03) than on the DASH diet.
Conclusion – Diets higher in fruit and vegetables reduce the acid load for a similar intake of protein. Diets with a similar dietary pattern but designed for energy deficit also reduced acid load. This reduction in acid load may account for some of the favourable effects on bone of these dietary patterns.
Background & Objectives – Dietary antioxidants are often defined by in vitro measures of antioxidant capacity. Such measures are valid indicators of the antioxidant potential, but provide little evidence of efficacy as a dietary antioxidant. This study was undertaken to assess the in vivo antioxidant efficacy of a berry fruit extract.
Design – Rats were fed basal diets containing fish and soybean oil likely to generate different levels of oxidative stress. After two weeks oxidative stress was assessed by measuring biomarkers of oxidative damage to protein (carbonyls), lipids (malondialdehyde, MDA), and DNA (8-oxo-2’deoxyguanosine urinary excretion) and plasma antioxidant status (antioxidant capacity ORAC, vitamin E). Boysenberry (Rubus loganbaccus x baileyanus Britt) extract was used as the dietary antioxidant.
Outcomes – The basal diets (chow (CD), synthetic/soybean oil (SD), or synthetic/fish oil (FD)) had significant effects on the biomarkers of oxidative damage and antioxidant status, with rats fed FD having the lowest levels of oxidative damage and the highest antioxidant status. For example, plasma MDA was 45 ng/mL for the FD-fed rats and significantly higher at 182 ng/mL for the SD-fed rats. Furthermore, the plasma antioxidant capacity was 9.2 mmol TE/L for the FD-fed rats and significantly lower at 7.3 mmol TE/L for the SD-fed rats. When boysenberry extract was added to the diet, there was little change in 8-oxo-2’deoxyguanosine excretion in urine, oxidative damage to proteins decreased, and plasma malondialdehyde either increased or decreased depending on the basal diet. For example, the mean protein carbonyl concentration for the CD-fed control rats was 0.21 nmol/mg protein and was significantly lower at 0.07 nmol/mg protein when 10% boysenberry extract was added to the diet. Interestingly, for MDA, concentrations decreased to 36% of the control for the SD-fed rats, increased by 256% for the FD-fed rats, and remained unchanged for the CD-fed rats when 10% boysenberry extract was added to the diet.
Conclusion – This study showed that Boysenberry extract functioned as an in vivo antioxidant and raised the antioxidant status of plasma while decreasing some biomarkers of oxidative damage, but the effect was highly modified by basal diet. These results are further evidence of complex interactions between dietary antioxidants, background nutritional status as determined by diet, and the biochemical nature of the compartments in which antioxidants function.
Background – Dairy foods such as cheese and yogurt provide a health promoting option, and components present within their fat could offer some significant benefits. Dairy fat has an optimal ratio of ω6/ω3 fatty acids (2), and conjugated linoleic acids (CLA), particularly cis-9, trans-11 18:2, are also present; these have been identified as potentially beneficial to bowel health, e.g. anti-inflammatory and cancer-preventing effects (1).
Design – Dairy fat has undergone investigation in experimental animal models of disease, and in human clinical studies, to evaluate potential benefits with respect to bowel cancer prevention.
Outcomes – A rat study of high fat diets showed cheese was an optimal source of ω3 fatty acids, producing high concentrations of the long chain ω3 fatty acids EPA and DHA in liver triglycerides. A report by Larsson et al (2) from the women’s mammography prospective cohort study showed that over 15 years of observation the highest intakes of full fat dairy products (≥ 4 serves/day) were associated with a reduced occurrence of colorectal cancer (down by 41%, p trend = 0.002) relative to those ingesting ≤ 1 serve of dairy per day. CLA intake in this study also showed a significant inverse association (multivariate rate ratio 0.71, p trend = 0.004). Increased apoptosis was identified in animal studies as a possible protective mechanism.
Conclusions – Dairy foods have an important role to play in health and carry some functionally significant components (ω3 fatty acids and CLA) of benefit in maintaining bowel health.
References
Background – Numerous health benefits have been attributed to both eicosapentaenoic acid (EPA, 20:5n3) and docosahexaenoic acid (DHA, 22:6n3) found in fish oil. However, docosapentaenoic acid (DPA, 22:5n3), found particularly in red meat, has been less well studied. Australians consume 6 times more meat than fish. The richest commercial capsule source of DPA available is seal oil.
Objective – To compare the effects of DPA rich seal oil supplementation with DHA rich fish oil, on measures of plasma lipids in hypertriglyceridaemic subjects.
Design – A randomised, parallel, placebo controlled, double blind study was conducted in 52 hypertriglyceridaemic subjects. They were randomly allocated to one of three groups receiving a total of 1 g/d of EPA, DPA and DHA but in different relative amounts: seal oil capsules (360 mg EPA, 250 mg DPA, 450 mg DHA), fish oil capsules (210 mg EPA, 30 mg DPA, 810 mg DHA) or placebo capsules (containing a vegetable oil) for 6 weeks. Fasting blood samples were taken at baseline and at 6 weeks post-intervention. Blood samples were tested for red blood cell (RBC) fatty acids and plasma lipids (triglycerides, total cholesterol, LDL-C and HDL-C).
Outcomes – Seal oil supplementation significantly increased incorporation of DPA (from 2.5 to 2.7%), DHA (from 4.9 to 5.8%) and EPA (from 1.0 to 1.8%) into RBC (p<0.0005), whereas fish oil increased incorporation of DHA only (from 5.2 to 6.2%, p<0.01). Plasma triglycerides remained unchanged in the placebo group (2.30-2.36 mmol/L), whilst reductions of 7% (2.24-2.09 mmol/L) and 14% (2.54-2.19 mmol/L) were seen in the fish oil and seal oil groups respectively, but only the reduction in the seal oil group reached significance (p<0.05).
Conclusion – Seal oil supplementation increased RBC levels of DPA, EPA and DHA whilst DHA rich fish oil supplementation increased RBC levels of DHA only. It appears that seal oil may be more effective than fish oil at lowering plasma triglyceride levels in hypertriglyceridaemic subjects.
Acknowledgement – Supported by funding from Meat and Livestock Australia.
Background – Control of postprandial hyperglycemia is one of the main therapeutic targets in diabetic, glucose-intolerant and normal subjects. High amylose starch, as found in Indica rice and corn flour, suppresses blood glucose and insulin responses when compared with low amylose starch. Since traditional Japanese rice crackers are generally made from Japonica rice, which has a low amylose content (16-18% amylose), they have a high glycemic index (GI, 91). We developed a rice cracker using Indica rice with high amylose content (25-35% amylose) and non-digestible dextrin.
Objective – To determine the GI and insulinemic index (II) of the new rice cracker, plasma glucose and insulin concentrations were monitored in healthy volunteers for 2 hr after consumption of the reference starch solution (Trelan-G), the traditional Japanese rice cracker and the new rice cracker.
Design – Twelve healthy volunteers, 6 men and 6 women aged 28.6±6.6 y, with normal body mass indexes (20.5±1.7 kg/m²), participated. Blood samples were collected before and 15, 30, 45, 60, 90 and 120 min after the intake of the reference starch solution (Trelan-G), the traditional Japanese rice cracker and the new rice cracker.
Outcomes and Conclusions – The GI and II of the traditional Japanese rice cracker were 87±28 and 142±123, respectively. The GI and II of the new rice cracker were 63±24 and 85±55, respectively, significantly lower than those of the traditional Japanese rice cracker. These results indicate that the new rice cracker might be useful for the prevention of the metabolic syndrome.
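A rough sketch of how GI and II values such as these are conventionally derived: incremental areas under the curve, expressed relative to the reference food (= 100). The exact computation used by the authors is not stated, so this Python sketch, with invented concentrations at the stated sampling times, is an assumption; II is computed the same way from the insulin curves.

import numpy as np

def incremental_auc(times_min, conc):
    # trapezoidal area above the fasting (time 0) baseline; area below baseline ignored
    conc = np.asarray(conc, dtype=float)
    above = np.clip(conc - conc[0], 0.0, None)
    return np.trapz(above, np.asarray(times_min, dtype=float))

t = [0, 15, 30, 45, 60, 90, 120]              # sampling schedule from the Design
ref  = [5.0, 6.8, 7.9, 7.4, 6.6, 5.8, 5.2]    # invented glucose (mmol/L) after Trelan-G
test = [5.0, 6.2, 7.0, 6.7, 6.1, 5.5, 5.1]    # invented glucose (mmol/L) after the cracker
gi = 100.0 * incremental_auc(t, test) / incremental_auc(t, ref)
print(round(gi))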
Background – A balanced diet is an important aspect of disease prevention. Legumes are high in fibre and plant protein, low in fat and cholesterol and have a low glycaemic index, making them an important addition to a balanced diet. Despite their healthy nature, consumption of legumes within Australia is low, and qualitative research into the perceived benefits and barriers of legume consumption is an important step in addressing this. Hypocholesterolaemic properties and enhanced sensations of satiety have been reported. Through increasing satiety, chickpeas may replace other food items in the diet, potentially leading to beneficial nutrient changes.
Objective – To determine the effect of chickpea supplementation on other food consumption in an ad libitum diet, and the participants’ perception of chickpea consumption.
Design – In an ordered crossover study, 42 participants consumed their habitual diet for four weeks, an ad libitum diet supplemented with 104 g/day of chickpeas for 12 weeks, and their habitual diet for another four weeks. Fifteen participants from the above study took part in focus groups exploring factors that determine food choice, the acceptability of chickpeas, and the benefits and barriers of legume consumption.
Outcomes – When chickpeas were added they replaced foods in the cereals, vegetables and dairy food groups. Focus groups identified many factors influencing dietary choice and many barriers to and benefits of legume consumption. Participants were particularly concerned with choosing healthy foods, and eating a variety of different foods that are tasty, convenient and accepted by other family members. A number were concerned about flatus.
Conclusions – Chickpeas mainly replaced carbohydrate-rich foods. The health benefits of legumes are the main characteristics encouraging their consumption, while inconvenience and gastrointestinal upset discourage consumption.
Background – Although diets with increased protein to carbohydrate appear more effective for improving body composition and metabolic outcomes in women, they have been poorly studied in men.
Objective – Our study aimed to comprehensively assess the efficacy of a high protein low GI diet (HP) compared to a high carbohydrate low GI diet (HC) for weight loss in overweight/obese men.
Design – One hundred and twenty three overweight men aged 49±9 y and BMI 34±6 kg/m² were randomized to one of 2 isocaloric (7 MJ) weight loss diets for 12 weeks: HP (Protein:CHO:Fat; %Sat Fat = 35:40:25; 8%) or HC (17:58:25; 8%). Outcome measures were regional fat and lean loss as well as cardiovascular risk markers.
Outcome – Weight loss on both diets was similar; 8.9±4.2kg (mean±SD). Total abdominal fat mass loss was significantly greater on HP compared with HC even after controlling for baseline differences (HP -0.76±0.38 kg vs HC, -0.56±0.36 kg; P=0.02). Triglycerides (TG) fell by 0.45±0.70 mmol/L, LDL cholesterol by 0.48±0.66 mmol/L, blood pressure by 11/12±10/8mmHg and HDL cholesterol remained unchanged independent of diet composition. Glucose and insulin fell by 0.27±0.58mmol/L and 4±6mU/L respectively. CRP fell significantly only in those subjects with TG>2 mmol/L and on HP (P=0.03 time/diet/TG status interaction). Plasma folate increased 7% on both diets and homocysteine remained unchanged at 7.7 μmol/L. Plasma B12 increased significantly only on HP by 20% (P=0.027 for diet interaction).
Conclusion – Both high protein and high carbohydrate low GI diets are effective in improving cardiovascular risk factors in obese men, but with some metabolic advantages for the higher protein pattern.
Background – Epidemiological studies have shown low folate status is associated with colorectal cancer. Colonic tissue folate levels at different stages of cancer development should give important information, but different methodologies to extract the colonic tissue folates have been used. This has hampered progress in defining the relationship between systemic and tissue folate levels.
Objective – To evaluate two methods of colonic tissue preparation for estimation of total folate content.
Design – Whole tissue punch biopsy samples were obtained from the descending colon of 31 individuals following a normal colonoscopy. Blood samples were obtained for the determination of plasma homocysteine (Hcy), red cell folate (RCF), methylenetetrahydrofolate reductase 677C>T genotype, and serum vitamin B12 and folate. Colonic tissue folate was measured both in washed whole tissue biopsies and in epithelial cells isolated from tissue biopsies.
Outcomes – Whole biopsy and epithelial cell folate concentrations were significantly correlated (R=.375; P=.038). Hcy was inversely correlated with both measures (R=-.365; P=.043 and R=-.364; P=.044 respectively). RCF was significantly correlated with isolated epithelial cell folate (R=.477; P=.007) but not with whole tissue biopsy folate (R=.264; P=.151). There were no significant associations between serum and colonic folate in this study.
Conclusion – Both methods are useful for comparing systemic and localised tissue folate status but epithelial cells may provide more reliable data.
Background – The incidence of obesity is increasing, and products that suppress appetite and subsequently food intake may be important in controlling body weight. Murray Goulburn Nutritionals has produced a dairy based supplement that has been shown to stimulate the release of cholecystokinin (CCK), a potent satiating hormone, in vitro. This product may potentially influence appetite, leading to a reduced food intake.
Objective – To determine if a dairy based meal replacement product can stimulate the release of CCK to increase satiety and decrease food intake in obese and lean volunteers.
Design – Double-blind, randomised, placebo-controlled, cross-over study with 15 lean (age 27.3 ± 5.7 yr; BMI 22.0 ± 1.3 kg/m²) and 15 obese (age 38.2 ± 9.5 yr; BMI 35.3 ± 4.6 kg/m²) men. On separate days, volunteers consumed 250 mL novel dairy fraction (test) and placebo beverages following an overnight fast. Plasma CCK concentrations and visual analogue scale assessments of appetite were measured. Subsequent food intake was assessed at a buffet meal.
Outcomes – Food intake (kilojoules) was lower in both lean and obese volunteers after the placebo supplement (lean 7.98% less, obese 7.39% less), with no difference between groups (P = 0.99). Obese volunteers rated themselves as less hungry and more full after the test supplement, whereas lean volunteers rated themselves less hungry and more full after the placebo; however, these differences were not significant. CCK concentrations increased following both the test and placebo supplements (P < 0.001) but there was no difference in the CCK response between lean and obese volunteers (P = 0.13).
Conclusions – Although the test product proved inactive, this approach could be used to evaluate other nutrients which have the potential to suppress appetite. Further investigation of possible alterations in appetite and food intake in response to novel dairy based supplements in lean and obese volunteers is warranted.
Background – Experimental and epidemiological data suggest that dietary fibre and resistant starch (RS) promote bowel function through faecal bulking and short-chain fatty acid (SCFA) production. Our data have recently shown that dietary n-3 polyunsaturated fatty acids (PUFA) from fish oil (FO) also have important actions on the gut (1).
Objective – To feed normotensive rats and the spontaneously hypertensive rat (SHR) diets supplemented with FO and RS and examine indices of gut health, including effects on in vitro contractility.
Design – In experiment 1, young Sprague Dawley (SD) rats were fed 4 diets that contained 100 g/kg fat as sunflower oil or FO and 10% fibre supplied as α-cellulose or high amylose maize starch rich in RS. In experiment 2, older SHR and WKY control rats were fed 3 diets with 50 g/kg fat as lard, canola oil, or FO. We measured gut tissue fatty acid composition, muscarinic receptor binding properties, the SCFA pool and agonist-induced gut contraction.
Outcomes – FO supplementation led to increased n-3 PUFA content of gut tissue, while RS resulted in increased caecal content of SCFA, especially as butyrate, and lowered pH. There were no changes in total muscarinic binding in gut tissue of older SHR. However, in young SD rats FO supplementation altered the sensitivity of the M1 receptor subtype compared to the other diets. In SD ileum, FO feeding also led to higher 8-iso-PGE2- (83%), PGE2- (259%) and PGF2α-induced (203%) maximal contractility, with an RS effect noted for carbachol (105%). Lower prostanoid effects in young SD rats and older SHR were also enhanced by FO. It was also noted that in SHR, FO supplementation resulted in the first observation of increased maximal contraction of the colon. While a 5% canola oil diet rich in α-linolenic acid led to a marginal increase in total tissue n-3 PUFA in SHR ileum, there were no effects on contractility.
Conclusion – Although few interactive effects were noted for FO and RS, the data suggest developmental changes in ileal receptor systems, with independent effects of RS and FO on some bowel properties of juvenile rats. In older SHRs, FO supplementation increased contractility of the ileum and colon and restored depressed prostanoid contractility in the ileum, with docosahexaenoic acid (DHA) indicated as the active agent. FO and RS produce positive outcomes for bowel health, likely by independent mechanisms, which may be of interest to the food industry.
Reference
1 Patten GS, Conlon MA, Bird AR, Adams MJ et al. Interactive effects of resistant starch and fish oil on SCFA production and agonist-induced contractility in ileum. Dig Dis Sci 2006;51:254-61.
Background – Colorectal cancer (CRC) is the second ranked cause of cancer related death in Australia. A healthy lifestyle based on a good diet is the most attractive preventative strategy against this disease. Dietary fruits, vegetables, legumes and cereals are recommended for the prevention of CRC development. It is desirable to know which particular foods possess specific anticancer properties. From our research, several plant-derived products with anticancer properties have been identified and found to protect at different stages of the complex process of carcinogenesis.
Objective – The objective was to identify dietary plant sources with anti-proliferative and apoptotic properties against HT-29 colon cancer cells.
Design – Aqueous-ethanolic extracts of the edible parts of plants, nuts and herbs were tested in exponentially growing HT-29 cells in 96 well plates using the cell titre blue (CTB) assay. Lead samples with anti-proliferative activity were then further tested for their specific effects on apoptosis, histone deacetylase (HDAC) activity, cell cycle phase, and expression of cyclooxygenases (COX) and enzymes implicated in CRC such as quinone reductase and glutathione S-transferase (GST).
Outcomes – Salicylate at 10 mM inhibited the growth of HT-29 cells by up to 55% with no effect on apoptosis; conversely, butyrate at 10 mM inhibited the growth of HT-29 cells by about 30% and induced apoptosis by up to 25%. We found that several plant extracts, including rosemary, chives, bay leaves, coriander, basil, cardamom and cloves, demonstrated anti-proliferative and apoptotic activity comparable to salicylate and butyrate and altered the activity and expression of key enzymes linked to CRC development.
Conclusion – We have identified leads from food extracts with potential anticancer properties. We intend to further validate these lead extracts in animal models of CRC to develop functional foods and nutraceutical products that might prevent the development of human CRC.
Background – Children are often targeted for nutrition intervention programmes with the aim of preventing ill health later in life. Schools are seen as an ideal place to implement such interventions, as most children attend school and usually consume food there. Most schools in New Zealand are publicly owned, and government agencies therefore have the authority to influence policy surrounding food at school, allowing agencies to be pro-active in the fight against nutrition related disease. There are no data on the dietary intakes of children during school hours in New Zealand. This prevents school based nutrition interventions from being evidence based and specifically targeted to sub groups of children who may be in particular need.
Objectives – This study describes the dietary intakes of New Zealand children during school hours, including ethnic and socio-economic determinants of those intakes. The information gathered will be of benefit to organisations planning school based nutrition policies or interventions.
Design – The study is a secondary analysis of data from the National New Zealand Children’s Nutrition Survey 2002 (CNS 02). The CNS 02 sample included 3275 children aged 5-14 yr, of which 2958 provided dietary data on school days.
Dietary intakes were measured by 24 hr diet recall. All foods consumed between 9am and 3pm, Monday to Friday were assumed to have been consumed at school.
Outcomes – Preliminary results show the mean energy intake of children during school hours was 2.5MJ, which equates to 30.7% of their daily energy intakes. Of the energy consumed during school hours 29.1% was contributed from fat (11.5% from saturated fats), 53.8% from carbohydrates (9.6% from sucrose) and 9.7% from protein.
Conclusions – The results so far show that children’s dietary habits during school hours are in accordance with current guidelines.
Background – There has been a gradual development of interest in the contribution of pulses to a healthy lifestyle as awareness of ethnic diets and lifestyles has grown. A small number of controlled dietary intervention studies with chickpeas have shown a small but significant reduction in serum low density lipoprotein cholesterol (LDL-C) and total cholesterol (TC) concentrations in women and men. However, a question remains as to the potential effect of chickpeas on nutrient intake and on metabolic and physiological changes in a more realistic ad libitum setting.
Objective – To estimate the effect of including a realistic quantity of chickpeas in an otherwise ad libitum diet in free-living adults.
Design – An ordered crossover design of 20 weeks duration, with four weeks of habitual diet at commencement and end. Forty-five adult women and men, as a group slightly hypercholesterolaemic but normoglycaemic, included 104 g of chickpeas per day in their habitual diet for 12 weeks. Comparisons were made of macronutrient and dietary fibre consumption, body mass index, fasting plasma glucose, serum lipids, lipoproteins, insulin, leptin and ghrelin concentrations after the habitual diet supplemented with chickpeas and after four weeks of post-chickpea ad libitum diet. Semi-quantitative assessment of bowel function was made using anchored visual analogue scales. All data were analysed with repeated measures ANOVA using GLM with robust standard error estimation, and ordinal logistic regression for ordinal data.
Outcomes – Chickpea-related increases in mean dietary fibre and PUFA intake were associated with significant decreases in serum TC and LDL-C, fasting insulin and HOMA-IR (p<0.05 for all) when compared to the usual dietary phase. Small but significant reductions in body weight (p=0.001) and improved bowel function were noted during the chickpea phase compared to the usual dietary phase.
Conclusion – Adding chickpeas to the diet is a sensible option for individuals wanting to modify their diet-associated CVD risk factors.
Background – Chickpeas are common in many ethnic diets and are rich in polyunsaturated fatty acids (PUFA), dietary fibre and resistant starch. However, little information is available on the health effects of regular chickpea consumption.
Objective – To compare the effects of a diet supplemented with chickpeas to a wheat-supplemented diet of similar fibre content on serum lipids and glycaemic control, and to compare these diets plus a wheat based diet of low fibre content on satiety and bowel function.
Design – Twenty-seven free-living adults followed two randomized, crossover dietary interventions each of five weeks duration. The chickpea diet included canned chickpeas (140g/day), bread and biscuits containing 30% chickpea flour. The diets were isoenergetic to the participants’ usual diet, matched for macronutrient content and controlled for dietary fibre. Following on from the second randomised intervention, a sub-group of 18 participants underwent a third lower-fibre wheat diet. Measures at the end of the diets were compared by repeated measures ANOVA using GLM.
Outcomes – Serum TC was 0.25 mmol/L lower (p<0.01) and LDL-C 0.20 mmol/L lower (p=0.02) following the chickpea diet compared to the wheat diet. An unintended significant increase in PUFA and corresponding decrease in MUFA consumption occurred during the chickpea diet, and statistical adjustment for this reduced the effect on serum lipids by about 50%. There was no significant difference in glucose or insulin concentrations. Perceived general bowel health improved significantly during the chickpea diet, although there was considerable individual variation. Greater satiety was reported by some participants and was significantly greater than on the low fibre diet.
Conclusions – The small but significantly lower serum TC and LDL-C during the chickpea diet could provide a valuable health benefit.
Background – The high consumption of non-core or ‘extra’ foods is of concern as they may contribute to excessive energy intakes and replace more nutritious foods in the diet. The definitions used to classify ‘extra’ foods are inconsistent and need to be standardised.
Objective – To develop a classification system to identify ‘extra’ foods which can be used in the analysis of dietary intake data and development of nutrition policy in Australia.
Design – Two sets of criteria to identify ‘extra’ foods were developed based on standards for fat and sugar content; one set was based on principles outlined in the Australian Guide to Healthy Eating (AGHE) and the other set was more stringent and followed the additional principles outlined in the Dietary Guidelines (AGHE+). The energy and nutrient contributions of ‘extra’ foods based on these two sets of criteria were compared, using dietary data for children aged 2-18 years who participated in the 1995 NNS.
Outcomes – Using the AGHE criteria, ‘extra’ foods contributed 41% of energy, 19% of protein, 48% of fat, 54% of sugar and 20-30% of micronutrient intakes. By comparison, using the AGHE+ criteria, ‘extra’ foods contributed 70% of energy, 58% of protein, 82% of fat, 80% of sugar and 60-70% of micronutrients.
Conclusion – ‘Extra’ foods contribute a large proportion of energy, fat and sugar intake in the diets of Australian children, using either set of criteria. However, the AGHE+ system is too stringent, in that it includes many foods that would be regarded as high-fat or high-sugar forms of core foods. We recommend the AGHE criteria be considered more widely for use in identifying ‘extra’ foods. This set of criteria uses defined cut-points for each food category and yields results similar to the few international studies which have assessed the contribution of energy-dense, nutrient-poor foods to the total diet. It is important that there is agreement on standardised criteria to identify ‘extra’ foods.
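To show how a cut-point based classification of this kind can be applied to dietary survey records, a minimal Python sketch follows; the category names and fat and sugar thresholds are hypothetical placeholders, not the actual AGHE or AGHE+ criteria.

def is_extra_food(category, fat_g_per_100g, sugar_g_per_100g):
    # flag a food as 'extra' if it exceeds the (hypothetical) cut-point for its category
    fat_cutoffs = {"dairy": 10, "cereal products": 5, "snack foods": 5}
    sugar_cutoffs = {"dairy": 15, "cereal products": 10, "snack foods": 10}
    return (fat_g_per_100g > fat_cutoffs.get(category, 5)
            or sugar_g_per_100g > sugar_cutoffs.get(category, 10))

records = [("snack foods", 28.0, 3.0), ("dairy", 3.5, 5.0), ("cereal products", 2.0, 22.0)]
share = sum(is_extra_food(*r) for r in records) / len(records)
print(f"{share:.0%} of these example records classified as 'extra' foods")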
Background – The first week after hatch is the most critical period in the life of a broiler chicken. Several recent studies have examined the utilisation of energy and protein in the newly hatched chick, but corresponding data on mineral utilisation are scarce.
Objective – To determine the retention of major minerals (Ca, P, Mg, K and Na) and some trace minerals (Fe, Mn, Zn and Cu) of diets based on wheat, sorghum and maize during the first two weeks post-hatch of broilers.
Design – Three diets containing wheat, sorghum and maize as the cereal base were formulated. All three diets were formulated to contain similar levels of energy, amino acids and major minerals. Each diet was fed ad libitum to six replicate groups (8 birds/replicate) from days 1 to 14 post-hatching. On days 3, 5, 7, 9 and 14, total excreta collection method was employed to determine the mineral retention.
Outcomes – For all minerals, retention values were high at day 3, declined from days 5 to 9, and then increased at day 14 post-hatching. Among the minerals analysed, the retention of Na was the highest and those of Zn and Cu were the lowest. Cereal effects were significant only for Ca and P, with sorghum-based diets having higher retention values.
Conclusion – The capacity to absorb and retain minerals appears to be limiting during the first week of life in modern fast-growing broilers.
Background – Hereditary haemochromatosis is one of the commonest genetic disorders in Australia. There is considerable variability in the rate of end-organ disease in people with susceptible HFE genotypes for hereditary haemochromatosis, possibly due to variability in the rate of iron accumulation.
Objectives – To estimate the association between dietary iron and other nutrients, and iron status.
Design – A community cross-sectional survey of 114 men and 119 women with different HFE genotypes conducted in northern Tasmania. Macro- and micro-nutrient intake was assessed by food-frequency questionnaire.
Outcomes – 67% of men and 71% of women with the C282Y homozygous genotype had elevated transferrin saturation. The median ferritin in C282Y homozygous men was 151 μg/dL under age 35 and 809 μg/dL over age 35. Only 33% of C282Y homozygous women (all over age 50) had ferritin levels over 350 μg/dL. Serum transferrin saturation and ferritin levels were strongly associated with dietary fat intake in C282Y homozygotes, but not in heterozygous and normal genotypes. Serum ferritin levels were strongly associated with dietary haem iron in homozygotes with ferritin > 350 μg/dL but not below this. When the two factors were analysed together in this group, a 1 SD increase in haem iron increased serum ferritin by 470 μg/dL (95% CI 82 to 857; P=0.03) and a 1 SD increase in total fat increased serum ferritin by 310 μg/dL (95% CI 31 to 589; P=0.036). There was no association between dietary non-haem iron and iron status.
Conclusions – There is a strong association between haem iron intake and ferritin in C282Y homozygotes with elevated ferritin levels. Total fat intake had an independent strong association. In these people dietary change may slow iron accumulation and may delay the need for venesection.
Background – The incretin hormones such as GIP and GLP-1 are released from endocrine cells in the intestinal mucosa after ingestion of carbohydrates and enhance postprandial insulin release from the pancreatic beta cells. The released insulin stimulates glucose uptake in skeletal muscle and adipocytes, and the enhancement of the rate of glucose disappearance from blood (glucose clearance: GC) is promoted by the facilitative subcellular redistribution of the glucose transporter isoform (GLUT4) from an intracellular compartment to the plasma membrane. The ability to stimulate incretin hormone secretion differs among the various types of carbohydrates.
Objective – To clarify the relationship between the insulin release (AUC of plasma insulin concentration: AUCINS) and GC after intravenous and oral administration of glucose, and to examine whether the relationship can be used to predict the intestinal glucose absorption rate after intakes of various types of carbohydrates such as sucrose.
Design – Glucose was intravenously infused into male Wistar rats. The oral glucose tolerance test (OGTT) was also performed using Trelan-G, and sucrose was administered orally. The alterations in plasma glucose and insulin concentrations were measured, and a simple kinetic model was used to determine GC.
Outcomes and Conclusions – The values of AUCINS and GC after oral administration of glucose were greater than those after intravenous infusion of the same amount of glucose. There was a strong positive correlation between AUCINS and GC after either intravenous or oral administration of glucose. It was clarified that the intestinal glucose absorption rate after intake of sucrose could be predicted from the relationship between AUCINS and GC.
Background – Nuts are often referred to as high oxalate containing foods but reliable data on the oxalate content of many commonly eaten nuts are hard to find.
Objectives – This study was conducted to determine the oxalate content of common nuts either grown in or imported into New Zealand. Samples of imported nuts were purchased from supermarkets in Christchurch while home-grown nuts were obtained directly from the growers. Gastric soluble and intestinal soluble oxalates were extracted from the nuts using an in vitro assay, and the extracted oxalates were then determined by HPLC.
Outcomes – The gastric soluble oxalate content of the nuts represents the total oxalate in the samples; however, the more interesting fraction is the intestinal soluble oxalate, as this is the fraction that will be absorbed in the small intestine. Peanuts, Spanish peanuts, peanut butter, ginkgo and pecan nuts all contained relatively low levels of intestinal soluble oxalate, ranging from 129 to 173 mg intestinal soluble oxalate/100 g fresh weight (FW). Almonds, Brazil, cashew and candle nuts contained higher levels of intestinal soluble oxalate, ranging from 216 to 305 mg/100 g FW. Pine nuts contained the highest levels of intestinal soluble oxalate (581 mg/100 g FW) while, in contrast, chestnuts and pistachio nuts were low (72 and 77 mg/100 g FW). Over all the nuts studied, the mean soluble oxalate content was 78% of the total oxalate content (range 41 to 100%).
Conclusion – The results obtained in this study confirm that the soluble oxalate contents of nuts range widely and people who have a tendency to form kidney stones would be wise to moderate their consumption of certain nuts.
Background – Studies in both humans and rodents have shown an endogenous entrainment of 24 h rhythms in plasma glucose and insulin metabolism (1). These rhythms are regulated by the suprachiasmatic nucleus and are independent of the influence of feeding activity (1). Little is known of these metabolic rhythms in the domestic pig despite the fact that pigs housed in commercial environments are maintained at ambient photoperiod.
Objectives – To determine 24 h profiles in insulin, glucose and feeding behaviour in pigs fed ad libitum and entrained to a 12 h (0600 to 1800 h) light regimen.
Design – Ten entire male pigs were allocated randomly to individual pens in the same room maintained at 22.0 ± 0.7°C. Jugular cannulae were introduced into each pig via the ear vein 24 h prior to the onset of sampling. Blood samples (3 mL) were taken at intervals of one hour for 24 h and the plasma stored at -20°C until assayed. Circulating insulin concentrations were determined by radioimmunoassay and plasma glucose concentrations by enzymic analysis. Feeding behaviour was monitored by video analysis over the sampling period.
Outcomes – Feeding behaviour was characterized by a distinct photoperiod entrainment, with activity greater during the period of light compared to the period of darkness. Both glucose and insulin concentrations displayed an ultradian rhythm, although plasma insulin secretion was correlated to neither feeding behaviour nor glucose status.
Conclusions – The data suggest that pigs consume food at regular intervals throughout the daylight hours. Feeding animals ad libitum may not be metabolically efficient as insulin secretion is not correlated with feeding behaviour.
Reference
1 la Fleur, SE. Daily rhythms in glucose metabolism: suprachiasmatic nucleus output to peripheral tissue. J Neuroendocrinol 2003; 15: 315-322.
Background – Folates are sensitive to temperature, pressure and exposure to light and thus can be affected during food processing.
Objective – To review some of the common practices of processing foods and their impact on folate stability.
Review – A number of studies in the literature have reported thermal destruction of folate in model systems at temperatures up to 100°C and under UHT conditions, and in food systems; in particular, the concentrations of folic acid and 5-methyltetrahydrofolate (THF) were followed. Given the different extraction methods used for folate analysis with the microbiological assay, data vary widely. The traditional single-enzyme extraction, on cereals in particular, gave one third to almost one half of the amount of folate detected by the tri-enzyme method. Common processing methods such as boiling, fermentation, roasting and baking, where heating may be dry or moist, caused considerable losses of folic acid in foods. Several studies of the effects of the baking process on wheat products (eg. bread) have been done at pilot scale, and it is generally agreed that, despite the increase in folate during fermentation, total folate is lost due to the high baking temperatures employed. Other studies reveal a loss of 40% of total folate during commercial soy milk processing compared to the total folate concentration in the raw beans (1). This paper will analyse existing data on the effects of processing, keeping in mind the differences in extraction methods used for folate analysis in foods.
Reference
1 Arcot J, Wong S, Shrestha AK. Comparison of folate losses in soybean during the preparation of tempeh and soymilk. J Sci Food Agric 2002; 82: 1365-68.
Background – Accurate, representative data for the level of nutrients found in foods are critical for the assessment of the quality of a nation’s food supply and population nutrient intakes. Due to the dynamic nature of the food supply, advances in nutrition research and the need for comprehensive data for public health and labelling purposes, the task of providing official, up-to-date nutrient composition data is both difficult and ongoing. Analytical difficulties such as sampling variability and inadequate methods of analysis make this process even more difficult.
Objective – To compile and publish quality up-to-date data on the nutrient composition of Australian foods.
Design – Nutrient data were derived primarily from Australian analytical data, with a strong focus on including new foods and known changes in food practices that could affect nutrient levels, within the constraints of available resources. Nutrient data will be released in a form similar to that used by the USDA, capturing not only the nutrient composition of each food, but also a detailed description and sample details for each food. The revised data will be published as a web-based database with additional downloadable PDF files.
Outcomes – A revised reference database of nutrient values in Australian foods, NUTTAB06, will replace the current nutrient database NUTTAB95. NUTTAB06 will include additional nutrients such as iodine and selenium and will be available to the public for free. NUTTAB06 will form the underlying dataset for the development of the survey nutrient database for the National Nutrition and Physical Activity Survey and will be available as summary PDF files by the end of 2006 and in a web-based searchable format by early 2007.
Conclusions – NUTTAB06 will reflect current knowledge of the nutrient content of the food supply within time and funding constraints. Updating nutrient data is a difficult and time consuming process and it is inevitable that the database will only represent knowledge at one particular time. The dynamic nature of the food supply, with the introduction of fortified foods and consumer demand for nutrient-modified products, changes in food production practices and changes in analytical techniques all contribute to the difficulty of maintaining an up-to-date set of data.
Background – Patients with anorexia nervosa (AN), a psychiatric disorder in which diet is a central issue, present from diverse cultural backgrounds but to date there have been no cross-cultural studies into diet and AN.
Objective – To investigate the energy intake and macronutrient profile of diets in women with and without AN in Sydney and Singapore with respect to socio-cultural factors.
Design – Participants were aged 14-35 y – 39 with AN and 89 controls – and of North European or East Asian ethnicity. Daily food intakes when AN was most acute or (for controls) at one month prior to the interview date were obtained through diet history interviews. The percentages of energy contributed by protein (%Pro), fat (%F) and carbohydrate (%CHO) and the natural log of energy intake were analysed with multiple linear regression in terms of AN/control status, cultural background and orientation, socio-economic status (SES), age and education.
Outcomes – AN status was associated with lower energy intakes (P<0.001) and higher %CHO (P<0.001). Higher %Pro was associated with greater Western orientation (P=0.055) but %Pro was similar regarding AN/control status. Greater Western orientation was associated with lower %F (P=0.03) in women with AN but with higher %F in controls (P=0.04). Higher SES was associated with lower %Pro (P=0.001) and %F (P=0.03) but higher %CHO (P=0.01). Cultural background was not associated with energy, %Pro, %F or %CHO.
Conclusion – The relationship between %F and orientation to Western culture may reflect the higher fat and protein intakes associated with Western diets but also the increased knowledge of current weight-loss diets promoted by Western culture. The findings regarding SES may stem from the affordability of “diet foods”.
Background – A recent Food Regulation Ministerial Council policy guideline endorsed mandatory fortification of the food supply with folic acid as an effective public health strategy for reducing the prevalence of neural tube defects.
Objective – To conduct a comprehensive risk assessment including the determination of an appropriate food vehicle for adding folic acid and assessing the potential impact of a mandatory fortification program on intakes.
Design – The steps involved in assessing folic acid intakes include (1) the compilation and evaluation of nutrient data from sources such as food composition programs and the uptake of provisions allowing voluntary fortification of foods with folic acid by manufacturers; (2) the identification of food composition and/or food consumption data gaps and addressing them; and (3) dietary modelling of different scenarios and interpretation of results. Bread was selected as the vehicle for folic acid fortification due to the high consumption of bread by the target group, women of child-bearing age. A number of levels of fortification were assessed. Mean intakes and the proportion of respondents exceeding the Upper Level of intake were determined for women of child-bearing age and selected non-target groups.
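The scenario modelling step can be sketched, in simplified form, as follows: for each candidate fortification level, folic acid is added in proportion to each respondent's bread intake, and the mean intake and the proportion exceeding the Upper Level are summarised. All numbers below are invented for illustration and do not reproduce the assessment described here.

import numpy as np

def model_scenario(baseline_ug, bread_g_per_day, ug_per_100g_bread, upper_level_ug=1000):
    # add folic acid in proportion to bread intake; summarise mean and % above the UL
    modelled = np.asarray(baseline_ug) + np.asarray(bread_g_per_day) * ug_per_100g_bread / 100.0
    return modelled.mean(), float((modelled > upper_level_ug).mean())

baseline = [120.0, 200.0, 90.0, 310.0, 150.0]   # invented baseline folic acid intakes (ug/day)
bread = [80.0, 150.0, 40.0, 220.0, 100.0]       # invented bread intakes (g/day)
for level in (100, 200, 300):                   # candidate ug folic acid per 100 g bread
    mean_intake, over_ul = model_scenario(baseline, bread, level)
    print(level, round(mean_intake), f"{over_ul:.0%} above UL")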
Outcomes – There was no level of fortification that enabled the target group to reach the recommended 400 μg of folic acid a day, which reduces the risk of neural tube defects, without at least one other population group having an undesirable proportion exceeding the Upper Level of intake. However, a fortification level that maximises benefit while minimising risk can be identified.
Conclusion – Mandatory food fortification can assist in increasing folic acid intakes in women of child-bearing age, but other strategies, such as voluntary fortification of a number of foods and peri-conceptional supplement use, must still be continued.
Background – Nut consumption is associated with a protective effect against coronary heart disease, partly due to the high antioxidant content of nuts. It is hypothesized that the inclusion of nuts in the diet will improve the antioxidant status of subjects with metabolic syndrome, who may be vulnerable to impaired antioxidant status.
Objective – To investigate the effects of high cashew nut and high walnut diets on the antioxidant status of subjects with metabolic syndrome.
Design – Sixty-four volunteers (29 male and 35 female, 45±10y) with metabolic syndrome (diagnosed by using the ATP III criteria) received a prudent control diet, prepared in the metabolic kitchen of the North-West University, Potchefstroom campus (NWU-PC) for a period of 3 weeks (run-in). The participants were grouped according to gender and age and randomized into three groups, receiving either the walnut, cashew nut or the control diets for 8 weeks, while maintaining a stable body weight. Nuts provided 20% of daily energy intake. Fasting blood samples were taken after the run-in period (baseline) and at the end of the intervention period and analysed for various antioxidant capacity markers.
Outcomes – The oxygen radical absorbance capacities (ORAC) of the walnut and cashew nut diets were significantly higher than that of the control diet (2277±160 and 2142±73, respectively, vs 1457±226 mmol TE/g wet mass in the control diet). Despite this, the walnut and cashew nut diets had no significant effects on serum ORAC, reduced (GSH) and oxidized (GSSG) glutathione, GSH:GSSG ratio or hydroperoxide levels compared to the control group. However, all three groups showed significant improvements in antioxidant status from baseline to end (GSSG and hydroperoxide levels decreased; GSH:GSSG ratio and ORAC levels increased). This might have been due to a generally increased antioxidant intake from the prudent diet compared to the habitual diets.
Conclusion – Although the inclusion of walnuts and cashew nuts into a prudent diet resulted in an increased antioxidant capacity (ORAC) of the nut diets, compared to the control diet, it did not improve the serum antioxidant profiles of subjects with metabolic syndrome.
Background – Current folate extraction methods for food analysis use a tri-enzyme technique: addition of a protease and α-amylase to hydrolyse the protein and starch in foods and free the folate, and a conjugase (chicken pancreas, human plasma or rat plasma) to convert the polyglutamyl forms of folate to monoglutamate for determination using the traditional microbiological assay or chromatographic methods. Given that some foods contain high levels of fat, the effect of the addition of a lipase enzyme was investigated.
Objective – To assess the effect of the presence of inherent and added fat on the folate extraction from foods by modifying the existing method (Tamura et al, 1997) with the addition of a fourth enzyme namely lipase.
Design – Several foods in the milk, cereal and meat categories were analysed for folate with and without the addition of a fourth enzyme, namely lipase. The foods were purchased from at least three different outlets, homogenised in the laboratory using a Waring blender and stored as composite samples at -20°C until extraction.
Outcomes – In the milk category, the amount of folate in cheese was on average 71% higher (21 μg/100 g) when lipase was added than when only the tri-enzyme technique was used (6 μg/100 g). Among processed cereal foods, croissants showed a mean increase of 60% (20 μg/100 g) with added lipase and jam doughnuts a 71% mean increase. Mean increases were also observed for chicken thigh (75%), lamb loin chops (68%), beef blade steak (85%), tuna (78%), egg yolk (80%) and potato chips (50%).
Conclusions – The presence of fat in foods clearly interfered with the maximum extraction of folate in the selected foods, more so in foods containing inherent fat, and further systematic studies on animal foods are warranted.
Reference
Background – Milk can be an important source of minerals in the diet. In Australia, seasonal variation in the nutritive characteristics of pasture is likely to be associated with seasonal variation in the concentrations of minerals in milk.
Objective – To determine if there is seasonal variation in the concentrations of some minerals (table) in milk.
Design – Milk samples representative of calving groups (autumn and spring) within commercial dairy herds located in northern Victoria were taken at 6-7 week intervals and analysed for concentrations of minerals. Summary statistics (table) are based on the entire data set, where 158≤N≤166 for the mean of each mineral; units are mg mineral/kg milk except for selenium, where units are μg/kg milk. Seasonal variations in concentrations of minerals for mid autumn (April), mid winter (July), mid spring (October) and mid summer (January) are representative of the bulk milk supply, where N=12. Values with a subscript in common are not significantly different (P<0.05).
Outcomes – The range in concentrations for all minerals was large with maximum concentrations approaching twice that of minimum values for the macro-minerals calcium, magnesium, phosphorus and potassium, and exceeding twice that of minimum values for the micro-minerals selenium and zinc. The average concentrations of all components examined also varied with season.
Conclusion – Environmental factors (farm management and season) are important determinants of the concentrations of some minerals in milk produced in northern Victoria. Milk produced in pasture-based production systems should not be considered as a uniform product across seasons.
Background – There is increasing evidence that milk contains a range of lipids that can promote health. In Australia, seasonal variation in the nutritive characteristics of pasture is likely to be associated with seasonal variation in the concentrations of lipids in milk.
Objective – To determine if there is seasonal variation in the concentrations of some milk lipids (table) that may be associated with health benefits in humans.
Design – Samples representative of the milk supply from northern Victoria for mid autumn (April), mid winter (July), mid spring (October) and mid summer (January) were analysed for concentrations of milk lipids (g component/kg milk fat). Values with a subscript in common are not significantly different (P<0.05).
Outcomes – The range in values observed for all components was large with maximum values being a whole number multiple of minimum values. The average concentrations of all components examined, with the exception of sphingomyelin, varied with season.
Conclusion – Environmental factors (farm management and season) are important determinants of the concentrations of some lipids in milk produced in northern Victoria. Milk produced in pasture-based production systems should not be considered as a uniform product across seasons.
Background – Dysphagia (swallowing disorders) affects 30-60% of nursing home residents, giving rise to serious health issues, amongst which dehydration is the most important, affecting up to 25% of non-ambulatory residents. Thickened fluids are used to manage dysphagia but their efficacy is modified by the particular thickening agent, the liquid medium being thickened and the lack of standardised preparation to achieve a given density and viscosity.
Objective – To determine the rheological properties of fluids (fruit juices, cordials and milk) thickened with commercially available thickening agents.
Design – Laboratory based measurement of density and viscosity of thickened fluids at 20°C using a strain controlled rheometer. Yield stress was extrapolated from rheology data.
Outcomes – The density, yield stress and viscosity of the thickened fluids were significantly influenced (P < 0.05) by the type of dispersing liquid and thickener (xanthan or guar gums and modified starch). Rheological models, determined for each medium, were used to predict the viscosity at an assumed shear rate of 50 s⁻¹. Physical properties of thickened food are sensitive to changes in solids content, and errors in preparation can be minimised by using weights rather than volumes from scoops and spoons that are not standardised.
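The abstract does not name the rheological model fitted; a Herschel-Bulkley form (shear stress = yield stress + K × shear rate^n) is a common choice for gum- and starch-thickened fluids, and the sketch below, with invented parameters, is offered only to show how an apparent viscosity at 50 s⁻¹ would be predicted from such a fit.

def apparent_viscosity(shear_rate, yield_stress_pa, consistency_k, flow_index_n):
    # Herschel-Bulkley: shear stress = yield stress + K * (shear rate)**n
    shear_stress = yield_stress_pa + consistency_k * shear_rate ** flow_index_n
    return shear_stress / shear_rate            # apparent viscosity (Pa.s)

# invented fit for a xanthan-thickened juice: yield stress 3 Pa, K = 4 Pa.s**n, n = 0.45
print(round(apparent_viscosity(50.0, 3.0, 4.0, 0.45), 3), "Pa.s at a shear rate of 50 per second")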
Conclusions – The rheological models defined in this study can be used as the basis of a standardised method of preparation of thickened fluids. Reduced compliance with fluid intake due to incorrect fluid preparation affects hydration state. Elderly persons may then suffer poor nutrition, dwindling health and medical complications associated with failure to thrive.
Background – Bioimpedance spectroscopy (BIS) is recognised as a more accurate method for the assessment of body composition than single frequency bioimpedance analysis (BIA). Nevertheless the method still relies on certain assumptions, most notably the assumed values for the resistivities of intra- and extracellular water (ICW and ECW respectively). Currently used values for adults originate from a study in an Italian population using D2O and Br dilution as reference methods for total body water (TBW) and ECW respectively.
Objective – To determine resistivity constants for BIS by two methods in an Australian population.
Design – ECW was measured in 12 healthy control subjects (9 M, 3 F) by Br dilution and TBW by D2O dilution or from fat-free mass measurements using DXA (Hologic Discovery) assuming a hydration constant of 0.732. Concurrently, whole body BIS measurements were performed using an SFB7 impedance instrument (Impedimed, Brisbane) and apparent resistivity constants calculated according to the mixture theory model of impedances of body fluid volumes.
Outcomes – TBW(DXA) and TBW(D2O) were highly and significantly correlated (r = 0.88, P < 0.001). The resistivity constants for ECW were 201.5 ohm.cm and 183.6 ohm.cm for males and females respectively. ICW resistivity constants differed depending upon whether D2O or DXA was used as the reference method: 735.8 ohm.cm (DXA) and 770.8 ohm.cm (D2O) for males, and 690.8 ohm.cm (DXA) and 757.2 ohm.cm (D2O) for females.
Conclusions – Resistivity constants for use in BIS analysis of body composition were derived in an Australian population. The exact values of the constants depend upon the reference method used in their derivation and thus they should not be used interchangeably. It is recommended that the appropriate constants be used where body composition data, derived from BIS, are to be compared with those from either DXA or dilution studies.
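A minimal sketch of the DXA-anchored arm of the Design (the example values are invented, and the mixture-theory resistivity calculation itself is not reproduced here): total body water is taken as the stated hydration constant times DXA fat-free mass, and intracellular water follows by difference from the bromide-dilution ECW.

# Illustrative only: DXA fat-free mass -> TBW via the 0.732 hydration
# constant stated in the Design, then ICW by difference from Br-dilution ECW.
# Example values are assumptions, not study data.
ffm_dxa_kg = 55.0            # fat-free mass from DXA (assumed example)
hydration_constant = 0.732   # assumed hydration of fat-free mass

tbw_dxa_litres = hydration_constant * ffm_dxa_kg
ecw_br_litres = 16.0         # ECW from bromide dilution (assumed example)
icw_litres = tbw_dxa_litres - ecw_br_litres

print(f"TBW(DXA) = {tbw_dxa_litres:.1f} L, ICW = {icw_litres:.1f} L")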
Background – Although obesity is primarily an increase in the fat mass of the body, the condition is also associated with changes in body fluid content. In the morbidly obese (BMI > 40 kg/m²) the extracellular water compartment (ECW) is often expanded such that the ECW:ICW (intracellular water) ratio is increased (> 0.8), and may remain so even after weight reduction.
Objective – To determine the size of body fluid compartments in obese, but not morbidly obese, subjects.
Design – Whole body impedance was measured in 92 obese (mean BMI = 34.9 kg/m²) but otherwise healthy females using multiple frequency bioimpedance analysis (Bodystat Quadscan 4000). Total body water (TBW), ECW and ICW volumes were determined according to the mixture theory model of body fluid volumes using different published values for fluid resistivity constants and also using the proprietary Bodystat Quadscan software. Fat and fat-free mass measurements were simultaneously obtained using DXA analysis.
Outcomes – Predicted TBW volumes varied slightly depending upon the specific resistivity constants used for calculation but averaged 37.2 L or 40 % of body weight (BW). This compares to a TBW of 50-60 % BW commonly observed in normal weight individuals. Irrespective of the method of calculation, the mean ECW:ICW ratio was 0.62, typical of that found in normal weight subjects.
Conclusions – Despite the subjects being obese, with a body fat content approaching half (45.7 %) of body weight, this study showed no evidence of expansion of ECW compared to normal individuals unlike that previously reported for the morbidly obese.
Background – The potential for agricultural practices to influence the nutritional quality of food has long been debated particularly when comparing foods produced by organic and conventional agricultural methods. Whilst it has been established that different feeds affect the quality of cows' milk, there are relatively few studies that compare the effect of organic agricultural practices on the fatty acid composition of dairy products.
Objectives – To determine the fatty acid composition of dairy products that are available on the Australian market that have been produced through certified organic or conventional agricultural methods.
Design – Sixty two samples of certified organic dairy products [milk, n=11; cheese, n=14; cream, n=6; and yoghurt, n=31]; and 45 conventional samples [milk, n=7; cheese, n=13; cream, n=5; and yoghurt, n=20] were purchased from a range of commercial outlets in Sydney. The samples were homogenised, the lipids were extracted and the fatty acid composition was determined by using gas chromatography.
Outcomes – Small but statistically significant differences were observed in the saturated fatty acid composition, with lower percentages of lauric (2.2 vs 2.4, P<0.05) and myristic (10.1 vs 10.5, P<0.05) acids detected in organic dairy products compared to their conventional counterparts. No differences were noted in either the ω-3 or ω-6 fatty acids between the organic and conventional samples. However, organic cheese was found to have a lower total polyunsaturated fatty acid content (P<0.05), and milk was found to have a higher ω-3:ω-6 ratio (P<0.05) compared to the matched conventional products.
Conclusion – Dairy products that are produced by organic or conventional methods have small but significant differences in their fatty acid composition. Whilst the decrease in lauric and myristic acids and increase in ω-3:ω-6 ratio indicate a potentially favourable fatty acid profile in organic products, the magnitude of the differences is small and the observation cannot be applied uniformly to all dairy product categories.
Background – The VicHealth initiative Food for All: Improving access to food for healthy eating. A food security program will progress improvements in food security in Victoria through a number of creative projects over several years. One of these projects is the e-based Food Security Network (1) auspiced by the Victorian Local Governance Association (VLGA).
Objectives – The network aims to provide support for local governments and other stakeholders, who are working with their communities to reduce barriers to local food access for healthy eating, and to improve food security. Design – The VLGA website provides up to date news and events in the local government sector. Open access to the network provides entry to links for other organizations and resources related to food security. The discussion forum creates a space for ongoing interactive debate, discussion, support and resources around the issues of food security, particularly those in local government areas. This information can be accessed by anyone. The actual discussants have been initially restricted to organisations or individuals in Victoria who are registered discussion group members.
Outcomes – After one year of operation, questionnaire evaluation of the evolving website indicates that it is beginning to fulfil the purpose intended, with 70 registered members from local governments and a range of other settings. Local food policies and municipal public health plans linked to municipal strategy statements and corporate plans include opportunities for improving food security through the natural, built, economic, social-cultural, and health environments for residents in all neighbourhoods. Other important stakeholders include residents, self-help groups and clubs, primary health care partnerships, primary health care agencies, community services, welfare organisations, and health institutions such as local hospitals.
Conclusion – The Food Security Network provides information and support for local governments who have a whole of population responsibility and for other stakeholders who can contribute to improved community health and well-being through equitable and local food chain systems.
Reference
Background – Surprisingly little is known about optimal protein intake and health. Estimates of protein requirement are based on several assumptions the relevance of which can be challenged for populations consuming western diets.
Review – The most recent Nutrient Reference Values for Australia and New Zealand (1) list the RDI for men 19-70 years as 64 g/day and for women as 46 g/day, with 25% increases in individuals >70 years. These recommendations are based on reference body weights of 76 kg and 57 kg for men and women respectively. These values rely heavily on the meta-analysis by Rand et al (2) based on 19 nitrogen balance studies in 235 individuals. This study provided estimated average requirements (EAR) and 97.5th percentile (RDI) protein intakes for healthy adults of 0.65 and 0.83 g good-quality protein per kilogram per day, respectively. These values were not different across adult age groups, sex, or diet groups, although the data were not powered to detect anything less than very major differences. Millward (3) argues that the adequacy of protein and amino acid intakes cannot be discussed separately from the adequacy of energy intake.
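A back-of-envelope check of the figures quoted above (the per-kilogram value and the reference weights are those given in the text; the rounding is mine):

# Back-of-envelope check of the RDI figures quoted above; the 0.83 g/kg/day
# value and reference weights are from the text, the rounding is mine.
rdi_g_per_kg = 0.83                                       # 97.5th percentile requirement
reference_weight_kg = {"men 19-70 y": 76, "women 19-70 y": 57}

for group, weight in reference_weight_kg.items():
    print(f"{group}: {rdi_g_per_kg * weight:.0f} g protein/day")
# Prints ~63 and ~47 g/day, close to the listed RDIs of 64 and 46 g/day.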
There are at least three issues which need to be borne in mind in applying these data to food-based recommendations. Firstly, the definition of measured body nitrogen equilibrium (zero nitrogen balance) as the criterion of nutritional adequacy may be questionable. As skeletal muscle accounts for approximately 35% of total body mass and energy expenditure, optimising accretion of lean tissue may be advantageous in minimising adiposity and enhancing insulin sensitivity. Secondly, whether the most appropriate method of expressing the protein requirement should be based on absolute body weight or lean body mass has not been determined. Thirdly, nitrogen balance data from these studies relate to energy balance, not energy restriction. These are important limitations given that 60% of Australians are overweight or obese, and estimation of protein needs in this group, both in energy balance and particularly during energy restriction, is needed.
Limited evidence suggests that a higher ratio of protein to carbohydrate during weight loss has several subtle metabolic advantages, most notably lean mass preservation and enhanced fat loss, triglyceride reduction, increased thermic effects, improved satiety, lower postprandial glucose and insulin responses, and improved nutritional status (4-6). Whether these effects are related to increased protein, to carbohydrate restriction, or to an interaction between these macronutrients is not straightforward.
Epidemiological studies (7,8) also suggest that higher protein intakes may be associated with a lower risk of ischaemic heart disease and stroke. The National Survey (1995) reports median (50th percentile) protein intakes of 96-115 g in men and 70-74 g in women. Protein sources in the Australian and New Zealand diet comprise meat, poultry and fish (33%) and dairy (16%), with at least 25% from non-animal sources such as cereal and cereal-based foods (2). Protein foods are also sources of several micronutrients. In omnivorous western diets, obtaining the RDI for calcium, iron and zinc from wholefoods necessitates protein intakes in excess of current RDIs to achieve optimal nutrient intakes.
Conclusion – The optimal dietary pattern for a western diet requires protein in excess of current recommendations. References
Background – Brain tissue contains large quantities of n-3 fatty acids, particularly docosahexaenoic acid (DHA). DHA is derived from its precursor alpha-linolenic acid, a dietary essential n-3 fatty acid, or it can be obtained directly from dietary sources. Dietary deficiency of n-3 fatty acids leads to impairment of spatial learning and memory in experimental animals, and dietary supplementation with DHA appears to improve mental development in human infants. How DHA influences brain memory function is not fully understood, although a number of potential mechanisms have been proposed. One possible mechanism is an effect on the synthesis of neurotrophic factors in the brain.
Objectives – The present study aimed to investigate the effect of dietary n-3 fatty acid deficiency on brain-derived neurotrophic factor (BDNF) gene expression and spatial learning performance in young rats.
Design – Sprague Dawley rats were fed on an n-3 fatty acid deficient diet for three generations. The rat pups of the third generation were tested for their spatial learning performance using Morris Water Maze at four and ten weeks of age. At the end of the behaviour test, the rat pups were killed and the brain tissue was dissected for measurement of BDNF and its mRNA levels using ELISA and real-time PCR techniques respectively.
Results – Dietary n-3 fatty acid deficiency led to a marked depletion of DHA in the brain tissue. The concentration of BDNF mRNA in the cerebral cortex and the hippocampus was significantly lower in n-3 deficient rat pups than in their control counterparts (P<0.05), although the BDNF level in the brain tissue did not differ significantly between the control and n-3 deficient animals. In comparison with the control animals, n-3 deficient rats performed significantly worse in the Morris Water Maze task (P<0.05), and the effect was particularly evident at four weeks of age. Conclusion – Dietary n-3 fatty acid deficiency leads to impairment of spatial learning and memory performance and to a reduction of BDNF gene expression in the brain, the latter being a neurotrophic factor known to be involved in the memory function of the brain.
Background – The labelled magnitude scale (LMS) has been found to provide better discrimination of satiety sensations compared to other scales in a homogeneous population (1). Verbal anchors were placed on the scale to represent numerical ratios of perceived satiety. Satiety perception in a diverse population such as Australia's may produce differences in the numerical ratios due to language acquisition and diversity.
Objective – To investigate whether the LMS is an appropriate methodology to assess satiety in a diverse population. Design – Forty three subjects (28 female, 15 male) took part in the study. Of this group, 44% had English as their first language (EFL) while 56% had a language other than English as their first language (EOL). Subjects quantified the semantic meaning of 47 English words denoting hunger/fullness at various intensities. Ambiguous words were removed and geometric means (GM) were calculated. Eleven final words were chosen as anchors for scale construction. Outcomes – Words removed due to ambiguity differed between the EFL and EOL groups as these words have no equivalent in non-English first languages, e.g. ravenous and voracious. An asymmetrical scale was constructed. The scale developed for this diverse population had some differences in the magnitude of the numerical ratios of words such as extremely full/hungry and very full/hungry compared to the previous study (1).
Conclusions – Provided ambiguous words are avoided, the labelled magnitude scale can be used in English to assess satiety in populations differing in their first language.
Reference
Background – Food macronutrient composition is linked to satiety, but does not define the physical architecture of foods. Little is known about the effect of food architecture on satiety for the same macronutrient composition. Objective – To determine the effects of food architecture, together with other factors, on perceived satiety. Design – Fifteen lean subjects (8 male, 7 female) who were non-smokers, non-diabetic, regular breakfast eaters, non-athletic, not on medication affecting appetite and complying with a questionnaire (1) were selected. Subjects consumed a breakfast test meal (188 g) of beef steak (BS) or beef mince (BM). As a control for method familiarization, cereal with milk (CM) was used. Subjects fasted overnight, consumed the test meal with 200 ml water and recorded feelings of satiety on a pre-constructed scale. A pizza lunch was provided and subjects ate until comfortably full. Food intake for the rest of the day was recorded and energy intake was calculated using Food Works Software Version 3.02 (Australia). Outcomes – Subjects reported being moderately hungry prior to test meal consumption. Subjects felt fuller 3 hours after BM than BS consumption although the difference was not significant. Satiety scores were significantly lower (P<0.05) after CM at all points of recording except at the 75th min.
Conclusions – Foods made from the same raw material but differing in architecture may produce some differences in satiety perception after a BS or BM breakfast, but the differences were not significant. A more extreme difference in food architecture, e.g. a sausage-type meat emulsion, will now be studied in comparison to beef steak.
Reference
1. Stunkard AJ, Messick S. The three-factor eating questionnaire to measure dietary restraint, disinhibition and hunger. J Psychosom Res 1985; 29: 71-83.
Background – Nutrition is known to have an important influence on bone mass accretion during puberty.
Objective – To assess, in a longitudinal study, the effect of differences in food intake on total body bone mass accretion in Chinese girls during puberty, using a mixed model analysis.
Design – Subjects were 377 Chinese girls who had participated in a two-year milk supplementation trial and a three-year follow-up study from the ages of 10 to 15 yr. Total body bone mass was measured by dual energy x-ray absorptiometry at baseline and then 12, 24, 48 and 60 months later. Food intake was estimated from a 7-d food record at baseline and then subsequently from a 3-d food record over two weekdays and one weekend day. The quantity of cereals, vegetables and fruit, legumes and nuts, meat, eggs, dairy products, and “other” foods consumed was estimated from these records.
Outcomes – After adjusting for age, total body bone mineral content (tBMC) from baseline to the end of the follow-up study was positively associated with the quantity of dairy products (β=0.043, P=0.01) and eggs (β=0.172, P=0.03) consumed for all subjects, with or without milk supplementation. In contrast, when tBMC of the control group (representing typical girls in China at puberty) was analysed separately, the only dietary association was with consumption of dairy products (β=0.075, P=0.02). For all subjects, the foods associated with tBMC were dairy products and eggs during the intervention study (P≤0.05). However, during the 3 yr follow-up study positive associations were found with the intake of dairy products (P=0.05). The effect of dietary variation on total body bone mineral density (tBMD) differed slightly from that on total body BMC in the intervention trial and the follow-up study. In general, tBMD was positively associated with the consumption of eggs (P<0.05).
Conclusions – The quantity of dairy products consumed was the most significant dietary factor related to total body BMC in Chinese girls at puberty. The quantity of eggs consumed might also be related to bone mass accretion in these subjects.
Acknowledgement – This study was funded by Dairy Australia, Danone-China and the Nestle Foundation.
Background – Diminished taste acuity as an early symptom of zinc deficiency has been demonstrated in animals and humans. A number of trials have also documented hypogeusia in association with low zinc in patients with a variety of aetiologies and in ‘healthy’ individuals in different life-stages, while others have failed to demonstrate such a correlation. Diagnosis of hypogeusia in these studies has been based on a range of chemical gustometry designs and more recently electrogustometry techniques.
Objectives – To identify the preferred methods of zinc assessment used by Australian naturopaths.
Design – Questionnaires were sent to 205 naturopaths regarding their zinc assessment methods.
Outcomes – A response rate of 57% was achieved. A taste acuity test proposed by Bryce-Smith in 1984 was the primary method of zinc assessment for 80% of respondents, with only 4% not ranking this method in their top three assessment tools. The use of hair mineral analysis was also prevalent, with 30% of naturopaths identifying this as their second preferred method. Other methods reported with lower frequency include kinesiology, plasma zinc, live blood analysis and electro-dermal screening.
Conclusion – These preliminary data suggest that the ‘zinc taste test’ is overwhelmingly the most frequently employed method for zinc assessment by naturopaths and is used in combination with clinical assessment but independent of any other biochemical index for zinc.
Reference
Background – The role of protein in the obesity crisis has, until recently, been largely ignored. This is for two reasons. First, protein provides the minor part of the human energy budget. Second, protein intake has remained far more constant over time and across populations than either fat or carbohydrate, both as a percentage of energy in the diet and in terms of absolute amounts eaten. Hence, while the obesity epidemic has spread, protein intake has remained relatively unchanged – giving the impression that protein cannot be responsible.
Method – We use state-space models (the Geometric Framework) developed from extensive animal studies to postulate a key role for protein appetite in the obesity epidemic, and provide supporting evidence from experimental, nutritional survey, and animal studies.
Results – Protein is the most satiating macronutrient group for humans and is the most tightly regulated post-absorptively. Results from comparative studies of other vertebrates, human experiments, and population-level data strongly suggest that the response of humans when faced with imbalanced diets is to prioritise protein intake. Hence, when the percent protein in the diet is low, non-protein energy is overeaten, whereas when dietary % protein is high, energy intake is limited: in both cases, absolute intake of protein is maintained near constant.
Conclusion – We show how, paradoxically, it may be because protein comprises a relatively small component of the human diet and is tightly regulated that it has sufficient leverage over human ingestive behavior to explain obesity. Focusing on this leverage over intake clarifies the role of dietary protein in the development of obesity, provides a possible means of ameliorating the problem, and explains the effectiveness of high-protein diets as weight loss regimes.
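A minimal numerical sketch of the leverage effect argued above (the reference protein target and the example diet compositions are assumptions, not figures from the cited studies): if absolute protein intake is defended at a near-constant level, total energy intake scales inversely with the percentage of dietary energy supplied as protein.

# Illustrative protein-leverage arithmetic: if absolute protein intake is
# defended, total energy intake ~ protein energy target / dietary % protein.
# The 1125 kJ/day protein target (12.5% of an assumed 9 MJ reference intake)
# and the example diet compositions below are assumptions for illustration.
protein_target_kj = 0.125 * 9000

for pct_protein in (0.10, 0.125, 0.15, 0.20):
    total_energy_mj = protein_target_kj / pct_protein / 1000
    print(f"{pct_protein:.1%} protein diet -> ~{total_energy_mj:.1f} MJ/day "
          "to hold absolute protein intake constant")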
Plenary 1: Nutrition in-utero
Aspects of an Australian Aboriginal birth cohort:
a unique resource for a life course study of an Indigenous population
Background: Dietary protein promotes growth and regulates appetite. In the gut, protein elicits digestive, absorptive and hormonal responses that facilitate nutrient disposal and the control of growth. In bone, dietary protein promotes bone density and resistance to fracture. The molecular and cellular mechanisms that underlie these effects of protein, however, have been poorly understood.
Review: As dietary protein intake changes, serum and intracellular amino acid levels also change and recent research indicates that the body is equipped with sophisticated amino acid sensing mechanisms for detecting and responding to states of feast and famine. The molecular mechanisms that are emerging are predominantly intracellular in the case of amino acid famine and predominantly extracellular in the case of amino acid plenty. Mechanisms for amino acid depletion are based on the generation of amino acid-free forms of transfer RNA and operate, for example, to regulate appetite, perhaps even the appetites for specific types of foods. Two basic mechanisms appear to operate in the case of amino acid plenty. One is based on amino acid transporters and the other on class 3 G-protein coupled receptors, some of which bind multiple amino acids in their extracellular bilobed “Venus FlyTrap” domains. Expression of these receptors in the gut provides mechanisms by which amino acids contribute to the control of digestion and absorption. Expression in endocrine cells, on the other hand, provides mechanisms by which the synthesis and/or secretion of growth-regulating hormones can be regulated. Finally, amino acid modulation of calcium-sensing receptors provides a mechanism by which protein and calcium metabolism are linked, providing insights into the molecular control of bone homeostasis.
Conclusions: The cellular and molecular mechanisms responsible for detecting and responding to changes in dietary protein intake are, at last, coming to light.
Reference
Conigrave AD, Hampson DR. Broad-spectrum amino acid sensing by class 3 G-protein coupled receptors. Trends Endocrinol Metab 2006; in press.
Background – Medical nutrition therapy for the management of type 2 diabetes (T2DM) is focussed on weight, glycaemic control and plasma lipids. Higher protein diets, when compared to high carbohydrate, low fat diets, may be beneficial for weight loss; effects on plasma lipids are variable. Because weight and glycaemic control are related, high protein diets may also result in improved glycaemic control.
Objectives – To compare and contrast a higher protein (HP) diet with a high carbohydrate (HC) diet in free-living people with T2DM.
Design – One year, randomised trial of HP vs HC nutrition education given by a dietitian in three sessions over a period of three months. At baseline, 12 weeks and one year anthropometric measurements were taken; three-day weighed food records and blood samples for biochemical analyses were collected; questionnaires relating to exercise and quality of life were completed. Participants’ food preferences were defined at the start of the study.
Outcomes – 123 subjects were randomised to either the HC (n=61) or HP (n=62) group. All values are least squares means ± 95% confidence interval of the least squares means. Weight was significantly (P=0.001) lower at one year compared to baseline for participants consuming the HC diet (88.4 ± 4.3 kg vs 86.8 ± 4.3 kg) and significantly (P<0.001) lower for participants consuming the HP diet (87.5 ± 4.3 kg vs 85.7 ± 4.3 kg). Lipids improved in both groups but there was no change in glycaemic control. There was no difference between groups. Participants in the HP group did not follow the advice to eat a HP diet.
Conclusion – Neither the HC nor the HP diet proved superior in the management of people with T2DM. The study suggests that it is more difficult for people to follow advice to eat a HP diet compared to a HC diet. However, it may suit some motivated individual patients and it may help them to lose more weight, but implementation at a mass level appears impractical.
Background – The effects of high protein and high carbohydrate diets on cardiovascular risk factors are inconclusive. Few studies have examined the long-term effects in a free-living population.
Objective – To determine the long-term effects of a high protein versus a high carbohydrate diet on cardiovascular risk factors post-weight loss.
Design – One hundred and forty-one overweight and obese subjects who had previously lost >10% of their body weight using a very low energy diet (VLED) for 12 weeks were randomised to a high protein (HP) or high carbohydrate (HC) iso-energetic diet for weight maintenance, or further weight loss if necessary, for 12 months. Outcomes – Both dietary groups maintained their initial weight loss over the 12 month follow-up period. There was no significant difference in weight loss between the dietary groups at any time point. Total cholesterol and triglyceride levels were significantly reduced after the VLED and remained lowered throughout the trial for all subjects. After six months dietary treatment, the HP group experienced a significant increase in High Density Lipoprotein (HDL) cholesterol compared to the HC group (P=0.036). The difference was close to significant at 12 months (P=0.06). Pairwise comparisons across all individuals revealed a significant mean decrease in SBP of 13.5 mmHg (P<0.001) from baseline to the end of month 3. After randomisation at month 3, no significant difference in mean SBP decrease from baseline was detected between the two groups at this time (P=0.375) or at month 9 (P=0.194). However, by study completion the mean decrease of 14.3 mmHg in SBP from baseline for the HP group was significantly larger than 7.7 mmHg for the HC group (P=0.045). On average, whilst individuals in the HC group struggled to maintain their initial reduction in SBP, those in the HP group did not.
Conclusion – Despite there being no difference in weight loss achieved by the two dietary groups, the HP diet appeared to confer beneficial effects on HDL cholesterol and SBP in the medium to long-term.
This study was funded by a grant from Meat and Livestock Australia.
Rationale – Diet and life-style related illnesses are the major causes of disability and premature death in affluent westernised countries and are emerging also in countries in the Asian region with increasing affluence. Illnesses such as coronary heart disease, certain cancers (eg large bowel) and diabetes are a consequence of the ready availability of energy-dense, readily digestible, highly refined foods. Dietary change through altered purchasing practice is an effective means of risk reduction and modifying the carbohydrate content of foods is an important route to achieving this.
Background – Humans possess one intestinal polysaccharidase (α-amylase) which can hydrolyse only starch. All other polysaccharides (non-starch polysaccharides, NSP; major components of dietary fibre) resist human digestion. High fibre foods are effective in controlling constipation and increased fibre consumption is thought to protect against colo-rectal cancer. While the importance of fibre is recognised, evidence is emerging that starches which are digested less efficiently by human small intestinal enzymes may be even more important. The significance of fibre arose from population studies showing that groups who ate unrefined foods had lower risk of the diseases associated with the refined products consumed by affluent populations. However, it has emerged that the former have quite low fibre intakes but eat large amounts of starch and relatively small quantities of animal products (1). Further, their cooking practices favoured the generation of resistant starch (RS) through retrogradation. RS is that fraction of starch (and products of starch digestion) which escapes into the large bowel of healthy humans, and retrogradation is a recrystallisation of starch after gelatinisation giving more RS. RS is metabolised by the colonic microflora generating short chain fatty acids (SCFA) which promote several aspects of visceral function (2). RS is a mass term, reflecting the extent of small intestinal starch digestion. Glycaemic response (GR), applied to starch, describes the rate of its small intestinal digestion. Slowing this process gives a lower glycaemic index (GI) and reduced demand for insulin and is of value in controlling diabetes. GR and RS are related but not synonymous and limiting both is of clear benefit.
Progress – Engineering starches to control their digestion is an important research target and CSIRO is developing a range of substantiated high RS/low GI products. One of these is a novel, hull-less barley cultivar (BARLEYmax™) which was produced by Plant Industry. It has a lower total starch content, an increased proportion of amylose and higher dietary fibre. The new barley can be incorporated readily into a range of common processed foods including breakfast cereals, bread and other bakery products. Wholegrain foods made from BARLEYmax™ have higher RS and lower GR than those made from standard wheat or barley and also raise faecal SCFA. The latter finding supports a recent intervention showing that the combination of RS (22 g/day) and a moderate fibre intake (12 g/day) resulted in improved indices of bowel health compared with NSP alone (3). High RS/low glycaemic index (GI) wheat cultivars are being developed by the Food Futures Flagship and a high amylose wheat has been generated using RNA interference (RNAi) and shown to have high RS in rats (4). Other functional foods are being developed in the p-Health Flagship to assist in the promotion of gut function and the prevention of colo-rectal cancer. One of these is a chemically modified starch which delivers SCFA to the large bowel. It is hoped that these products and ingredients will contribute to improved health and lowered disease risk. High throughput in vitro screens are also being developed to facilitate product development.
Future developments – Substantiation of the nutritional attributes of the new products is under way including establishing their benefits for gut health and cardiovascular risk factors.
References
Background – Resistant starch is that fraction of dietary starch that escapes digestion in the human small intestine
with consequent fermentation in the colon. There are three potential mechanisms by which unmodified starches may avoid hydrolysis by salivary and/or pancreatic alpha amylases. One is the presence of intact granules e.g. in uncooked grains, a second is through encapsulation of starches e.g. intact plant tissues that do not disintegrate during digestion, and the third is through adoption of double helical conformations by segments of starch molecules. The first two mechanisms are based on the presence of structural barriers that limit the access of amylase to the starch component. The third mechanism is based on the inability of a starch double helix to fit into the active site of human alpha amylases. This latter mechanism offers the greatest scope for optimization via food processing, and is the only relevant route for common starchy foods such as breads, many breakfast cereals, and extruded products.
Results – Techniques for characterizing the molecular conformation of starch polymers (NMR, FTIR) and the way in which these are organized at the supramolecular level (XRD, SAXS) have been developed and applied to native and processed high amylose maize starches before and after digestion with alpha amylase. 13C CPMAS (solid state) NMR is used to quantify the double helical content of ‘dry’ starches (1) and to calibrate FTIR spectra, previously shown to be sensitive to starch crystallinity (2). Levels of double helices in native starches are reduced significantly or completely after laboratory processing under a range of ‘cooking’ and ‘drying’ conditions. Similarly, structural features of starch granules detected by XRD and SAXS and interpreted in terms of a side chain liquid crystalline assembly of polymers (3) are reduced or eliminated by laboratory processing. These results may suggest that processed high amylose maize starches would not exhibit resistance to alpha amylase digestion. However, in vitro simulated digestion under conditions that mimic in vivo digestion (4) resulted in significant yields of ‘resistant’ starch. Structural analysis of this fraction showed the presence of double helices (NMR) arranged into different crystalline polymorphs depending on the processing conditions applied (XRD) with an apparent repeating motif different to both native granules and processed material (SAXS). These characteristics are similar to the structure of enzyme-resistant amylose from retrograded (high double helix content) starches (5). In addition, differential scanning calorimetry (DSC) showed that processed starch materials of low double helix content have characteristic endotherms for amylose double helix melting at 130-160 °C in excess water.
Conclusions – Taken together, the data obtained suggest that rehydration of processed high amylose maize starch leads to adoption of significant double helix formation, and that this fraction subsequently forms the basis for resistance to enzyme digestion. The inference is that double helix formation triggered by hydration is faster than extensive enzymic depolymerisation of starch, as this would slow down or prevent double helix formation. We propose that the physical structure of foods containing processed high amylose maize starch is likely to be a determinant of resistant starch levels in vivo by influencing the relative rates of rehydration, retrogradation and enzyme attack during the digestive process.
References
Background – Complex carbohydrates are important for the function of the large bowel. Dietary resistant starch (RS) has an important role in this regard. It undergoes fermentation in the colon, resulting in the production of short chain fatty acids (SCFA), especially butyrate, which helps maintain the normal phenotype of colonocytes. Recent human epidemiological and experimental studies also suggest that high dietary protein intakes have adverse effects on large bowel health.
Objectives – We have carried out a series of studies in rats to examine if high levels of dietary protein compromise colonic integrity by examining DNA damage (single strand DNA breaks measured by the comet assay) and mucus layer thickness and whether inclusion of RS in the diet can protect against changes that may increase the risk of diseases such as cancer. Changes in large bowel SCFA levels were examined to determine if they are associated with such protection.
Outcomes – Increasing dietary casein from 15% to 25% in the absence of added dietary RS (substituted for digestible starch; 5% wheat bran was included) resulted in a doubling of the number of DNA strand breaks in colonocytes extracted from the colon. Inclusion of cooked red meat at 25% of the diet caused even greater damage, whereas in a different study whey and soy had lesser and greater effects than casein respectively. Inclusion of RS in the diet as high amylose maize starch at high levels (48%) abolished or lowered the levels of the protein-induced DNA damage in the colon in these studies irrespective of protein source. High levels of dietary casein and red meat also resulted in thinning of the colonic mucus barrier, which was reversed by the inclusion of dietary RS. In another study rats were fed 25% dietary protein as casein together with 0%, 10%, 20%, 30% or 40% high amylose maize starch. The RS dose-dependently reduced protein-induced DNA damage in colonocytes with a noticeable effect with as little as 10%. The protection was strongly correlated with caecal SCFA levels, but most strongly with butyrate. Conclusions - Our data suggest that high protein diets may be harmful to the health of the large bowel but that addition of RS to the diet may protect against this damage.
Background – Cereal grains with their high starch content are fed to livestock predominantly as a source of energy for rapid growth or high milk yield. The capacity of an individual grain sample to provide energy is known to vary widely between and within cereal and animal species. A large research effort, the Premium Grains for Livestock Program, funded jointly by the Australian grains and animal research and development organisations and Ridley Agriproducts, was established in 1996 to define the range in and identify the causes for variation in available energy released during digestion (MJ/kg) and in total available energy intake (MJ/d) for wheat, barley, triticale, oat and sorghum grains fed to sheep, cattle, pigs, broilers and laying hens.
Review – Over 3300 cereal grains with a wide range in chemical and physical characteristics were collected from germplasm archives, plant breeders and farmers, and were selected because of drought, frost damage or pre-harvest germination. Over 190 grains selected on variation in near infrared spectroscopy scans and in vitro fermentation/digestion assays were fed to animals and 40 grain samples were offered across all animal types. The energy from grains made available following digestion was measured in all animal types and voluntary intake was measured in cattle, pigs, broilers and layers. A comprehensive chemical and physical analysis was conducted on all grains. Available energy is expressed as digestible energy for pigs, apparent metabolisable energy for poultry and metabolisable energy for ruminants.
There were large variations across cereal grain species, individual grain samples and animal types in the available energy content (MJ/kg DM, figure). Barley tended to have the lowest values for pigs and poultry and sorghum the highest values, whereas sorghum had the lowest energy content of all grain species for cattle. Pigs tended to extract more energy from the grains than the other animal species and cattle the least. The smallest within grain species variation was observed for cattle. The extent of the within grain variation depended on the grain species, with the variation being particularly small for pigs offered sorghum. There were low and negative correlations in available energy content of grains between the animal types (e.g. broilers-pigs, 0.19; broilers-cattle, -0.28; pigs-cattle, 0.21). There were also low and negative correlations between available energy content and available energy intake within each animal type (e.g. 0.2 for broilers to -0.1 for pigs) indicating that different characteristics of grains determine digestibility and intake. Similarly, there were low and negative correlations in available energy intake across animal types (e.g. broilers-pigs, -0.15; broilers-cattle, -0.03; pigs-cattle, 0.12).
Conclusions – The combined results suggest that grains with high digestibility will not necessarily provide high intakes of available energy for the animals examined, and that some individual grain samples provide more energy for one animal type than another and vice versa. The differences between grains and animal types are determined primarily by an interaction between the animal digestive system, the chemical composition of the grain and the accessibility of amylolytic enzymes to starch, as influenced by the composition and thickness of endosperm cell walls in wheat, barley and triticale and the protein matrix in sorghum.
Background – Current health claims indicate that 25 g daily of soy protein (SP) may reduce the risk of heart disease by lowering cholesterol, particularly low-density lipoprotein cholesterol (LDL-C). Whether the isoflavones (ISO) associated with the SP contribute to this benefit is still unclear. However, they may offer additional protection against heart disease by improving arterial dilatation and arterial compliance as a result of their ability to bind to endothelial oestrogen receptors and stimulate vasorelaxation.
Objective – To investigate differential effects of SP and ISO on total cholesterol (TC), LDL-C and other risk factors for heart disease.
Design – 91 hypercholesterolaemic subjects (TC> 5.5 mM) underwent an 18 week dietary intervention using a randomised, controlled, three-way cross-over design. For three 6-week periods, and in random order, subjects consumed foods containing 24 g of SP with 80 mg of ISO per day (S), foods containing 12 g SP and 12 g dairy protein with 80 mg ISO (SD) or a control diet consisting of foods with 24 g dairy protein and no ISO (D). At the end of each six week diet phase blood lipids, flow-mediated dilatation (FMD) of the brachial artery and compliance of large and small arteries were assessed.
Results – Compared with the control diet (D) there was a small but significant reduction in TC on the S diet only (2.8 ± 1.1%, P<0.05). FMD was improved to a similar extent with both S (7.05 ± 0.47%, P<0.05) and SD (7.06 ± 0.49%, P<0.05) compared with D (5.93 ± 0.35%). LDL-C and arterial compliance did not differ between diets. Conclusions – In contrast to the approved health claim, we found that 24 g/day of SP did not reduce LDL-C and resulted in only a small reduction in TC. Improvement in FMD was similar with both 24 g/day and 12 g/day of SP, suggesting that this effect may have been at least partly mediated by ISO.
Background – With escalating costs of pharmaceuticals to manage cardiovascular risk factors, there is a need to develop more effective lifestyle intervention programs that can reduce the reliance on these agents.
Objectives – To evaluate the efficacy of a pilot Comprehensive Lifestyle Intervention Program (CLIP) compared with qualitative lifestyle advice (L) and Simvastatin plus qualitative lifestyle (S+L) on cardiovascular risk factors in overweight hypercholesterolaemic individuals at mild-moderate cardiovascular risk.
Design – Parallel randomised controlled trial of 6 weeks duration. Intervention groups were CLIP (n=22): structured meal plan comprising energy restriction (6 MJ), fish 2 meals/week, cereal high in soluble fibre, saturated fat <8% energy, wholegrain bread, nuts and 25 g plant sterol margarine per day, plus exercise advice and self-monitoring. The groups were matched for total cholesterol 6.3 ± 0.8 mmol/L, age 51 ± 9 y and BMI 32 ± 4 kg/m². L (n=22) were provided comprehensive qualitative diet and exercise advice and S+L (n=22) received 20 mg/day simvastatin plus the same advice.
Outcomes – CLIP lowered LDL cholesterol by 0.70 ± 0.73 mmol/L (18%), L by 0.23 ± 0.63 mmol/L (6%) and L+S by 1.5 ± 0.6 mmol/L (39%), all significantly different (P<0.001). The total cholesterol/HDL ratio was lowered only by CLIP and S+L. Weight and waist circumference were significantly lowered by CLIP (-4.4 ± 2.1 kg; -6.7 ± 3.9 cm) compared to L (-1.1 ± 1.7 kg; -2.6 ± 3.5 cm) and L+S (-1.0 ± 1.3 kg; -2.7 ± 2.2 cm) (P<0.001). β-carotene levels increased on CLIP and L relative to S+L (P=0.001). Folate increased on CLIP only (P<0.01). CLIP was well accepted by participants.
Conclusion – CLIP is more effective than qualitative lifestyle advice in improving cardiometabolic risk factors. Although not as effective as simvastatin in lowering LDL cholesterol, this program, if sustainable, may assist in comprehensive risk factor management and delay the need for lipid lowering drugs in this group.
Background – Hospital admissions of Aboriginal Australians for coronary heart disease (CHD) are double those of non-indigenous Australians in men and fourfold in women (1), with greater average cost and length of stay (LOS). Objectives – To examine predictors of hospital admission and LOS for CHD in Aboriginal Australians.
Design – In 1988-89, randomly selected Australian Aborigines (256 men, 258 women), aged 15-88 years, completed interviewer-administered questionnaires about diet, exercise, smoking and alcohol drinking. Blood pressure, weight, height and blood lipids were measured. The WA Data Linkage Unit linked participants to hospital records to the end of 2002. Cox regression and negative binomial models were used to model hospital admissions and LOS for CHD. Outcomes – Among 51 men and 55 women admitted with CHD, greater LOS was predicted by hypertension (hazard ratio (HR) 1.32; 95% CI 1.03, 1.68); diabetes (HR 1.62, 95% CI 1.03, 1.53); smoking (HR 1.90, 95% CI 1.02, 3.53); eating processed meat > 4 times/month (HR 1.81, 95% CI 1.01, 3.24); sweet foods > 6 times/month (HR 1.69, 95% CI 0.94, 2.88); and > 6 eggs/week (HR 1.72, 95% CI 1.03, 2.94). Relative to abstainers, lower alcohol intake (HR 0.54, 95% CI 0.35, 0.83) predicted shorter LOS. Intake of eggs (HR 1.05, 95% CI 1.01, 1.09) predicted shorter intervals between admissions; exercise ≥ 1/week (HR 0.61, 95% CI 0.36, 1.05), eating bush meats ≥ 7 times/month (HR 0.46, 95% CI 0.23, 0.92), and red meat > 7 times/week (HR 0.56, 95% CI 0.31, 1.03) predicted longer intervals. Risk of admission was higher in hypertensives (HR 4.07, 95% CI 1.32, 15.52), ex-drinkers (HR 6.60, 95% CI 2.30, 19.00), and those adding salt to prepared food (HR 3.16, 95% CI 1.12, 8.92), and lower with eating bush meats ≥ 7 times/month (HR 0.26, 95% CI 0.10, 0.67) and red meat > 7 times/week (HR 0.98, 95% CI 0.97, 0.99).
Conclusion – Hospital admissions for CHD in Aboriginal Australians are predicted by hypertension and diabetes and by several aspects of diet (intake of processed meat, red meat, “bush” food, eggs and sweet foods) and lifestyle. Such findings can inform planning of preventive health programs and health services for indigenous Australians. Reference
Background – The “fetal-origins” hypothesis of adult disease postulates that fetal undernutrition is associated with
an increased susceptibility to the development of coronary heart disease (CHD) and allied disorders in later life. Individual published studies of the relation between size at birth and subsequent CHD risk factors have had limited statistical power to assess an association reliably, and explored the impact of confounding to differing degrees. Review – A systematic review of the size and direction of the association between birthweight and subsequent coronary heart disease was conducted. Seventeen published studies of birthweight and subsequent coronary heart disease were identified including a total of 144,794 singletons. Relative risk estimates for the association between birthweight and coronary heart disease were obtained from sixteen of these studies. Additional data from two unpublished studies of 3801 individuals were also included. In total, the analyses included data from eighteen studies on 4210 non-fatal and 3308 fatal coronary heart disease events in 147,009 individuals. The mean weighted estimate for the association between birthweight and the combined outcome of non-fatal and fatal coronary heart disease was 0.84 (95% CI 0.81-0.88) per kg of birthweight. There was no strong evidence of heterogeneity between estimates in different studies (p=0.09) or of publication bias (Begg test p=0.3). Restricting the analysis to fatal coronary heart disease events had little effect on the overall mean weighted estimate (0.84, 95% CI 0.80-0.88). Fifteen studies were able to adjust for some measure of socioeconomic position, either in early or current life, but such adjustment did not materially influence the association: 0.85 (95% CI 0.81-0.90).
Conclusions – These findings are consistent with one kilogram higher birth weight being associated with 10-20% lower risk of subsequent coronary heart disease. Further studies are needed to establish whether the observed association reflects a stronger underlying association with a related exposure, or is due, at least in part, to residual confounding.
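Two small arithmetic notes on the Review above. First, a relative risk of 0.84 per kg of birthweight corresponds to about a 16% lower risk per kg (and roughly 12-19% across the 95% CI of 0.81-0.88), which is where the "10-20%" statement in the Conclusions comes from. Second, a sketch of the fixed-effect, inverse-variance pooling on the log relative-risk scale that is typical of such meta-analyses; the three study estimates in the sketch are invented for illustration and are not the eighteen studies analysed above.

import math

# 1. Converting the pooled RR per kg into a percentage risk reduction.
for rr in (0.84, 0.81, 0.88):
    print(f"RR {rr:.2f} per kg -> {100 * (1 - rr):.0f}% lower risk per kg birthweight")

# 2. Hypothetical fixed-effect inverse-variance pooling on the log-RR scale.
#    The (RR, SE of log RR) pairs below are invented, not the Review's data.
studies = [(0.82, 0.04), (0.88, 0.05), (0.80, 0.06)]
weights = [1 / se ** 2 for _, se in studies]
pooled_log_rr = sum(w * math.log(rr) for (rr, _), w in zip(studies, weights)) / sum(weights)
print(f"pooled RR per kg = {math.exp(pooled_log_rr):.2f}")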
Background – Both eicosapentaenoic acid (EPA, 20:5n3) and docosahexaenoic acid (DHA, 22:6n3) have been shown to have numerous health benefits. However, docosapentaenoic acid (DPA, 22:5n3) found particularly in red meat has been less well studied. The richest commercial source of DPA available is seal oil.
Objective – To compare the effects of DPA rich seal oil supplementation with fish oil, on measures of platelet activation and other CVD risk markers.
Design – A randomised, parallel, placebo controlled, double blind study was conducted. Thirty healthy subjects were randomly allocated to one of three groups receiving: seal oil capsules (350 mg EPA, 450 mg DHA, 250 mg DPA), fish oil capsules (210 mg EPA, 810 mg DHA, 30 mg DPA) or placebo capsules (containing a vegetable oil with no EPA, DHA or DPA) for 14 days. Baseline and 14 day blood samples were tested for platelet activation, platelet aggregation, ATP release and incorporation of omega-3 fatty acids into platelet phospholipids (PL). Full lipid profiles were also performed including total cholesterol, LDL-cholesterol, HDL-cholesterol and TG. Outcomes – Seal oil supplementation significantly increased incorporation of DPA, DHA and EPA (P<0.05) into platelet PL, whereas fish oil increased EPA and DHA only (P<0.05). A significant decrease in plasma TG (1.58 to 1.18 mmol/L, P<0.05) and a significant increase in HDL-cholesterol (1.40 to 1.56 mmol/L, P<0.05) were observed post seal oil supplementation.
Conclusion – This study further supports the suggestion that DPA may also have beneficial health effects. Acknowledgement – Supported by funding from Meat and Livestock Australia.
Background – Diet based strategies to promote cardiovascular health are becoming increasingly popular, and recent studies have identified potential therapeutic roles in human health for specific dietary components (bioactives). At present, the main heart health ingredients are those with strong scientific evidence for their efficacy (eg n-3 PUFAs; plant sterols). The market for foods to promote heart health is expected to grow 60% in the next 5 years, and therefore high demand exists for novel bioactives so long as they are backed by good scientific evidence.
Objective – 1) To develop a robust bioactive discovery process spanning from identification of target mechanisms, through in vitro screening assays, to pre-clinical substantiation in animal models prior to validation in humans; 2) To assess the vascular relaxation actions of different plant extracts and grape seed extract (GSE) fractions on different blood vessel preparations representing conductance and resistance vessels of the normotensive WKY (Wistar-Kyoto) rat.
Design – Thoracic aorta, intact mesenteric vascular bed and the third arcade of mesenteric arteries (250-350 micron diameter) were isolated from 20 week old rats. Aortic rings (3 mm) were mounted under 4 g resting tension while the mesenteric bed was perfused continuously with oxygenated Krebs-Henseleit buffer. The micro-vessels were studied using a myograph. Test extracts were added cumulatively to vascular preparations pre-contracted with noradrenaline and the extent of relaxation recorded. GSE extracts were provided by Tarac Technologies (SA).
Outcomes – In the aortic ring preparation, different GSE extracts – the polymeric fraction and Vinlife® – caused the greatest relaxation (80-90%), followed by the oligomeric fraction, while the monomeric fraction was least effective as evident by a rightward shift in the dose-response curve. A similar profile was apparent for the perfused mesentery. Only the Vinlife® preparation resulted in a dose-dependent relaxation of the micro-vessels. Several other plant extracts caused relaxations in aortic rings ranging from 30-97% that were endothelium dependent. Lemongrass extract showed preferential relaxation in the mesenteric bed that may be mediated via cannabinoid (CB1) receptors. Conclusions – The observed differences in vascular relaxation among different GSE preparations may be due to compositional differences in the polyphenols.
Background – Epigallocatechin gallate (EGCG), considered to be the active component of green tea, has yet to be shown to lower cholesterol in the cholesterol-fed hypercholesterolaemic rabbit model. Furthermore, cholesterol reduction by EGCG may involve the reduction of cholesterol synthesis through inhibition of squalene epoxidase. Objectives – To determine the effects of EGCG on serum cholesterol and on the cholesterol synthesis precursor, lathosterol, in the cholesterol-fed rabbit model of hypercholesterolaemia.
Design – Twelve NZ White rabbits were fed, for two weeks, a rabbit chow to which was added 0.25% (w/w) cholesterol. For the next four weeks, one group (6 controls) continued on the 0.25% (w/w) cholesterol chow while the other group (6 treated) was fed the 0.25% (w/w) cholesterol diet with 2% (w/w) EGCG added. Serum cholesterol (enzymatic assay) and lathosterol (GC) were measured and expressed as mean ± SEM.
Outcomes – The EGCG effectively lowered serum cholesterol by 85% (P = 0.02) from 8.6 ± 2.3 at week 2 to 1.3 ± 0.1 mmol/L at week 6. In contrast, serum cholesterol did not change significantly (P = 0.72) in the control group, going from 8.6 ± 2.3 at week 2 to 7.9 ± 4.0 mmol/L at week 6. Serum lathosterol decreased from 2.4 ± 0.5 at week 2 to 0.25 ± 0.05 μmol/L at week 6 (P < 0.01) in the treatment group and from 3.1 ± 1.1 at week 2 to 0.93 ± 0.32 μmol/L at week 6 (P < 0.01) in the control group and was significantly lower in the treatment group at week 6 compared to the control group (P = 0.03). However, when serum lathosterol was normalised using serum cholesterol (lathosterol/cholesterol) to account for lipoprotein carrier capacity, there were no differences between the groups. Conclusions – The green tea catechin, EGCG, effectively lowered cholesterol and lathosterol in the cholesterol-fed hypercholesterolaemic rabbits but it did not have any effect on the serum lathosterol to cholesterol ratio.
Background – Increased platelet aggregation or “stickiness” is associated with the progression of coronary atherosclerosis, coronary thrombosis and myocardial infarction (MI). Inhibiting platelet aggregation reduces the risk of MI (1,2). Platelet aggregation tendency increases with age and with increased risk of heart disease, and may be reduced by a number of food groups including seafood, garlic, tomato or chilli (3).
Objective – To determine background factors that influence the tendency of platelets to aggregate using a whole- blood impedance aggregometer.
Design – Exploratory study examining whole-blood collagen stimulated impedance platelet aggregation in 80 participants after an overnight fast and a light breakfast. Background details including participants’ age, gender, BMI, garlic, chilli and alcohol intake were recorded and correlated with aggregation.
Outcomes – Whole-blood platelet aggregation was not related to smoking, garlic, chilli or alcohol intake or age, but was significantly (P < 0.001) higher in female participants than males (aggregation area under the curve = 34.1 ± 0.85 versus 27.0 ± 0.94 Ohm.min respectively). Although male participants were heavier, after controlling for BMI in a multiple regression analysis, whole-blood platelet aggregation was still significantly predicted by sex. Conclusions – Gender appears to be an important determinant of platelet aggregation. The gender of participants is likely to have implications for the platelet aggregation response in drug and/or nutrition intervention trials. Future studies examining the effects of food on platelet aggregation should consider matching groups based on gender. References
Background – Adiponectin is reduced in obesity and may be associated with atherosclerosis and coronary artery disease. Hypoadiponectinemia may also contribute to the development of obesity-related hypertension.
Objective – To determine the blood pressure and adiponectin responses to weight loss.
Design – Weight loss study in 25 obese men and women over a 12 month period.
Outcomes – Weight loss after 3 months of energy restriction was 7% (7.7 ± 3.4 kg) and 5% at 12 months. Resting systolic blood pressure (SBP) at baseline was 123±13 mmHg and was lower after weight loss at 3 months (114±16 mmHg, P<0.05) but was not different at 52 weeks. Adiponectin fell by 4% at 3 months (P=0.1) but rose by nearly 20% at 12 months (P<0.05). There was a negative correlation between adiponectin and SBP (r=-0.427, P≤0.05) at baseline which remained after adjustment for BMI. Diastolic blood pressure (DBP) was also negatively correlated with adiponectin. These correlations were not observed at 3 months. After 52 weeks SBP was negatively correlated with adiponectin (r=-0.554, P≤0.05), but this was lost after adjustment for BMI. After 52 weeks DBP was also negatively correlated with adiponectin (r=-0.610, P≤0.05), but after adjustment for BMI this relationship was weaker (r=-0.528, P=0.08).
Conclusions – There is an inverse relationship between adiponectin and blood pressure that appears to be disrupted after short-term weight loss but is re-established at 52 weeks when adiponectin rises.
Background – The process of evidence based practice involves a lengthy cycle of research, systematic review of all relevant data, implementation of the research/systematic review findings, evaluation, and modification of the implementation or further research, whichever is required. However, in practice this cycle is not always apparent. Objective – The aim of this paper is to examine the evidence-based cycle in the research process and the translation of data relating to omega-3 long chain polyunsaturated fatty acid (LCPUFA) requirements in the perinatal period.
Results from studies involving infants – Initial trials were designed to test the efficacy of supplementing infant formulas for preterm and term infants with omega-3 LCPUFA. These trials were relatively small in size, of variable duration and used doses that ranged from 0.1 to 1% of total dietary fat. Outcomes have tended to range from simple LCPUFA status to more complex visual and cognitive outcomes. Despite the large number of studies now in the literature, it has been difficult to bring all the data together satisfactorily into systematic reviews because of the wide range in methodology, the differing trial protocols and the assessment methods used as outcomes. Nevertheless, the individual studies have been suggestive of a developmental benefit, and this has been most consistent in infants born preterm. As a consequence, infant formulas for preterm infants, and many for term infants, are supplemented with omega-3 LCPUFA. However, a point of controversy has been the dose and the ratio of omega-3 to omega-6 LCPUFA supplementation. Our systematic review of the effects of LCPUFA supplementation of infant formulas on growth has addressed part of this controversy (1), but discussion remains.
Results from studies with pregnant women – On the other hand, trials designed to test the effect of dietary omega-3 LCPUFA on pregnancy outcomes have generally been larger and have used higher doses of marine oils. As the outcomes have been very specific (pregnancy outcomes), we have a good body of work that has enabled trials to be combined in systematic reviews, giving us a clear idea of the safety and efficacy of omega-3 LCPUFA in pregnancy (2). In summary, we now know that up to 3 grams of fish oil per day in pregnancy is safe, but that there is not enough evidence to support the routine use of marine oil, or other prostaglandin precursor, supplements during pregnancy to reduce the risk of pre-eclampsia, preterm birth, low birthweight or small-for-gestational age. Very few of these trials have satisfactorily followed the effects on the growth and development of the resulting children. In the prenatal supplementation area it has been possible to commence the next phase of trials, which are focussed on maternal well-being and the development of the children, with confidence.
References
1. Makrides M, Gibson RA, Udell T, Ried K and the International LCPUFA Investigators. LCPUFA supplementation of infant formula does not influence the growth of term infants. Am J Clin Nutr 2005;81:1094-1101.
2. Makrides M, Duley L, Olsen SF. Marine oil, and other prostaglandin precursor, supplementation for pregnancy uncomplicated by pre-eclampsia or intrauterine growth restriction. In: The Cochrane Database of Systematic Reviews 2006 Jul 19;3:CD003402.
Background – An antioxidant-rich diet has been associated with reduced asthma prevalence in epidemiological studies. However, there has been no direct evidence that altering the intake of antioxidant-rich foods affects asthma outcomes.
Objectives – This study aimed to investigate changes in asthma control and airway inflammation resulting from a low antioxidant diet and subsequent use of lycopene-rich supplements.
Design – Adults with stable asthma (n=32) consumed a low antioxidant diet for 10 days, then commenced a placebo controlled, randomized, cross-over trial using lycopene-rich supplements. Seventeen subjects completed the trial, in which they received 3 x 7 day treatment arms (placebo, lycopene concentrate and tomato juice), each separated by a 10 day washout. The tomato juice and lycopene concentrate each provided 45mg/day lycopene. Clinical status was monitored using spirometry and the Asthma Control Score questionnaire. Airway inflammation was assessed using total and differential cell counts from induced sputum. Plasma carotenoids and tocopherols were measured by HPLC.
Outcomes – Following the initial washout period on the low antioxidant diet, plasma carotenoid concentrations decreased (p=0.026) and asthma control worsened, including an increased Asthma Control Score (p=0.035) and decreased %FEV1 (p=0.004) and %FVC (p=0.002). Furthermore, airway inflammation worsened, with increased %neutrophils in induced sputum (p=0.038). Supplementation with both tomato juice and lycopene concentrate reduced neutrophilic airway inflammation (placebo: 55.1 (35.0, 91.1)% versus tomato juice: 42.0 (21.0, 67.8)% versus lycopene concentrate: 39.8 (18.4, 77.5)%; p=0.006).
Conclusion – A low antioxidant diet worsens asthma control, lung function and noneosinophilic airway inflammation, and lycopene-rich supplements reverse this trend in airway inflammation. Dietary antioxidant consumption is an important variable modifying clinical asthma status, and changes in dietary antioxidant intake may be relevant both to the rising prevalence of asthma and as a therapeutic intervention.
Background – There is evidence to suggest that mild zinc deficiency may be present in New Zealand (NZ) children. Toddlers may be at an increased risk of zinc deficiency due to their high zinc requirements for growth and low intakes of meat, which is a highly bioavailable source of dietary zinc.
Objectives – To examine the baseline zinc status of 12-20 month old South Island NZ children who participated in a randomised controlled trial designed to determine the efficacy of a meat-based or a fortified cow’s milk-based dietary intervention on biochemical zinc status.
Design – A 20-week randomised-controlled intervention trial was conducted, with 225 toddlers each randomised to one of the two diet groups or the control group. At baseline, a hair and non-fasting serum sample were collected using trace-element free techniques for zinc analysis by flame atomic absorption spectrophotometry. Dietary intakes were assessed via a three-day weighed food record. Trained anthropometrists measured weight and length. Outcomes – At baseline, the toddlers in this study had a mean (SD) age of 17.1 (2.8) months and 56.4% (n=127) were boys. Mean (SD) Z-scores for length-for-age were 0.14 (1.13) and for BMI-for-age were 0.77 (1.04) (n=225). Mean (SD) dietary zinc intake was 4.8 mg/day (1.2; n=224), with 1.6% estimated to be at risk of inadequate zinc intakes. Mean serum zinc concentration at baseline was 9.8 μmol/L (n=183), with 38.3% of toddlers classified as having a low serum zinc concentration using time-of-day specific cut-offs. The mean hair zinc concentration at baseline was 1.83 μmol/g (n=215), with 31.6% of toddlers found to have a low hair zinc concentration using season-specific cut-offs.
Conclusions – Baseline results suggest the existence of mild zinc deficiency in these NZ toddlers, which may be related to low dietary intakes of zinc.
Background – Many studies of pregnancy include a dietary assessment component, yet nationally representative data describing diet quality during this life stage are lacking.
Objectives – To investigate the overall diet quality of young Australian women, and to compare this according to pregnancy status, defined as: pregnant, actively trying to conceive, given birth in the previous 12 months, or otherwise not pregnant.
Design – Cross-sectional study of a nationally representative sample of 9,118 women aged 25 to 30 years, who participated in survey three of the Australian Longitudinal Study on Women’s Health (March 2003). The Dietary Questionnaire for Epidemiological Studies was used to calculate diet quality, consistent with the Australian Recommended Food Score (ARFS) methodology (1). This is a summative estimation of food variety and frequency, in line with the Australian Dietary Guidelines.
Outcomes – Pregnancy status significantly predicted diet quality even after logistic regression accounted for disparities in education, marital status, and area of residence (P=0.004). Pregnant women and those who had given birth in the previous 12 months had significantly higher mean ARFS than those who were otherwise not pregnant (respective means (95% CI): 29.4 (28.7-30.1); 29.5 (28.9-30.1); 28.4 (28.2-28.7)), although the differences in scores were modest.
Conclusion – Opportunities exist for enhancing the diet quality of young Australian women in line with national recommendations. Recent or current pregnancy appears to be associated with higher diet quality and variety. Further examination of the composition and correlates of maternal diet, including the drivers of particular behaviours, may help identify where improvements can be achieved.
Reference
1. Collins C, Hodge A, Young A. Are you what you eat? Associations between diet quality and health utilisation in mid-aged women from the Australian Longitudinal Study of Women’s Health. 23rd National DAA Conference, Perth, 2005.
Background – Lifestyle diseases contribute substantially to the increased mortality of Aboriginal Australians relative to the non-indigenous population. Poor nutrition, sedentary behaviour, alcohol drinking and smoking have been implicated, using cross-sectional data. We have examined unique longitudinal data, which include aspects of diet and lifestyle and cardiovascular risk factors, in a cohort of WA Aborigines with follow-up of mortality and hospital data. Objectives – To examine predictors of CHD and all-cause mortality in Aboriginal Australians.
Design – In 1988-89, randomly selected Australian Aborigines (256 men, 258 women), aged 15-88 years, completed interviewer-administered questionnaires about diet, exercise, smoking and alcohol drinking; blood pressure, weight, height and blood lipids were measured. The WA Data Linkage Unit linked participants to hospital and death records to 31 December 2002. Cox regression was used to examine predictors of CHD and all-cause mortality.
Outcomes – CHD risk increased with smoking (Hazard Ratio (HR) 2.62, 95% CI: 1.19, 5.75), eating processed meats >once/week (HR 2.21, 95% CI: 1.05, 4.63), eggs >twice/week (HR 2.59, 95% CI: 1.11, 6.04) and using spreads on bread (HR 3.14, 95% CI: 1.03, 9.61). All-cause mortality risk decreased with exercise >once/week (HR 0.51, 95% CI: 0.26, 1.05) and increased in ex-drinkers (HR 3.66, 95% CI: 1.08, 12.47), heavy drinkers (HR 5.26, 95% CI: 1.46, 7.52), and with eating takeaway foods >9 times/month (HR 1.78, 95% CI: 0.96, 3.29). Adverse behaviours clustered in 55% of participants and increased risk of CHD (HR 2.1, 95% CI: 1.1, 4.0) and all-cause mortality (HR 2.3, 95% CI: 1.2, 4.2) (see Figure).
Conclusion – Aspects of diet and lifestyle in Aboriginal Australians predict CHD and all-cause mortality. Clustering of adverse behaviours is common and increases risk of CHD and death.
Background – The ‘science of nutrition and the science and art of dietetics can be likened to a rope of many strands twisted around a central core representing food with a series of inputs representing progressively the various sciences and then several of the humanities…’(1). In likening the ever-expanding range of the nutritional sciences beyond ‘vitamins, cholesterol, sugar and fats’, Dr Fred Clements showed his characteristic breadth of thinking that made this unassuming scientist a major influence in the course of nutrition, and its teaching, in Australia for decades.
Review – The plenary theme of this 30th meeting of the Nutrition Society of Australia, ‘Nutrition through the Lifecycle’, is a direct descendant of the emphasis he placed on the whole individual and his/her role in a wider society, at a time when the emphasis was more on the biochemistry of nutrients, the emerging nutritional epidemiology and clinical dietetics. It is instructive to look at the programme of the first Annual Meeting of the Nutrition Society of Australia when, as the Society’s first president (1975-76), Fred Clements spoke on the nutrition of children in Australia with particular reference to underprivileged groups (2). Strikingly for that time, he noted that Aboriginal workers would need to be heavily involved in any solutions to the undernutrition and poor health of their children.
This first meeting 30 years ago sums up much of the person he was: a scientist with respect for evidence; a human being with concern for children and those from ‘underprivileged groups’; the educationalist wanting to disseminate the new science and knowledge; and finally, someone with an international approach whose career included time at the World Health Organization Headquarters in Geneva. Nevertheless he also urged that the new Nutrition Society of Australia adopt guidelines for the interests and activities of the Society; and that these should interpret both contemporary and local issues, and ‘should not accept, uncritically, those of similar organizations in other countries.’ What would he find today? Acceptance of the need to intervene throughout the life-cycle; shock that 11 million children globally are dying from the want of known interventions, and that over half of these deaths are due to underlying undernutrition (3); and likely amazement at the global epidemic, including in Australia, of obesity and noncommunicable diseases, including in children and adolescents (4).
Conclusions – The social dimensions of nutrition remain even more important than when first espoused in Australia by Fred Clements at the first Society meeting in 1976. Interventions must be evidence-based and continue throughout the life-cycle, because of the intergenerational dimensions and because poor foetal and young child nutrition, besides demonstrating continuing global inequities, will have consequences for the global epidemic of noncommunicable diseases.
References
Background – Compensatory growth (or catch-up growth) is the greater than normal growth of an animal following a period of nutritional restriction. In grazing animals, compensatory growth responses are commonly observed due to variations in seasonal conditions and hence pasture supply and quality. Diets of intensively housed animals can also be manipulated to include short periods of nutritional restriction to achieve changes in the body composition of the carcass. In humans, catch-up growth may occur during infancy/early childhood in children born small for gestational age due to fetal undernutrition. Catch-up growth can influence ultimate body muscularity and fatness and, in humans at least, may be associated with detrimental metabolic and health effects in later adult life.
Review – Compensatory growth responses in pigs can occur following a short period of dietary restriction, although the degree of catch-up growth can vary. In some cases the animals can fully compensate, such that they are the same weight at the same slaughter age as their non-restricted counterparts, while in other cases they do not reach the same final weight. There is some evidence that restricting protein intake for a short period during the weaner phase (similar to the human infant) may result in an increase in protein deposition and a decrease in fat deposition during later growth. During a period of protein restriction, both protein and lipid deposition rates are reduced, although the rate of lean tissue deposition is reduced to a greater degree (1). During realimentation the ratio of lipid to protein deposition has also been shown to decrease (1). This reduction in lipid deposition during the realimentation period may reduce the lipid content in the body. Many investigations into compensatory growth responses in pigs have also observed improvements in feed efficiency during the realimentation period, and there may be additional environmental benefits due to a reduction in the excretion of unused nutrients.
In addition to a period of dietary restriction via a reduction in the nutrient content of the diet, pigs experience periods of nutritional restriction due to other production processes such as weaning. Weaning typically occurs between 21 and 28 days of age and involves the removal of the piglet from the sow. This process imposes a number of environmental and social stresses on the piglet, including a change in diet from sow’s milk to solid feed, mixing with piglets from other litters and a change in environment. Weaning at younger ages, typically around 14 days of age, is practised in some systems in order to reduce the transfer of disease from sow to offspring. Such early weaning of piglets has been shown to result in larger growth depressions immediately post weaning (2). However, by approximately six weeks of age the early-weaned pigs can catch up to be the same weight as those conventionally weaned.
In humans, compensatory growth responses may have detrimental effects in later adult life. Infants born at low birth weights for gestational age may have increased risks of disease in later life such as coronary heart disease, stroke, hypertension and type 2 diabetes (3). The basis for this hypothesis is that undernutrition during fetal development, either via poor placental function or reduced maternal nutrient intake, leads to altered homeostatic mechanisms that in turn increase the susceptibility to disease in adult life. Rapid childhood catch-up growth may exacerbate the effects of impaired fetal development. Eriksson et al. (1999) reported that in one study of men born in Helsinki, those that had the highest rates of coronary heart disease were thin at birth but had caught up in weight by 7 years of age and had above average BMI (4).
Conclusions – Compensatory growth responses occur in both animals and humans, although the response can be variable depending on the timing and magnitude of the restriction. Periods of restriction and subsequent realimentation can be utilised in production animals to achieve desired body compositions at slaughter. In humans, compensatory or catch-up growth during infancy following fetal undernutrition may lead to increased risks of a range of adult diseases in later life, although the mechanisms by which this occurs are not yet fully understood.
References
Background – The role of dietary fatty acids (FA) and their effects on metabolism have received considerable attention in mammalian species. It is becoming increasingly clear that fatty acids have metabolic consequences over and above their influence on the energy density of the diet. Recent studies have linked changes in the fatty acyl composition of the cell membrane, induced by variation in the dietary fat profile, to alterations in both lipid and glucose metabolism (1,2). These diet-induced changes have effects on insulin action, glucose transport and the activity of enzymes that regulate triglyceride and fatty acid synthesis, factors that ultimately influence protein and lipid deposition in animals. However, few studies have identified the duration of dietary feeding that is required for FA to exert changes in metabolism and carcass composition.
Review – To determine the time required to orchestrate both morphological and physiological responses in broiler chickens fed different sources of fatty acids: fish oil (n-3 FA), sunflower oil (n-6 FA) or edible tallow (saturated FA). Fish oil, sunflower oil or tallow was added to the basal diet at a concentration of 50 g/kg. Growth rates, feed conversion efficiencies, carcass composition and circulating metabolite concentrations were calculated at weeks 5, 6 and 7. In addition, the respiratory quotient (RQ) was assessed at weeks 4, 5, 6 and 7 to determine any change in substrate oxidation over time. There was no difference in RQ between the three dietary groups at week 4, with all groups oxidizing similar proportions of carbohydrate and lipid. However, birds consuming the n-3 and n-6 FA diets oxidized a significantly higher (P<0.05) proportion of lipid than carbohydrate at weeks 5 and 6 compared with those birds fed tallow, and this pattern extended to week 7. Consistent with the higher proportion of lipid oxidation, the fish and sunflower oil dietary groups also had lower circulating concentrations of triglycerides and cholesterol, with the greatest reduction in these two metabolites occurring at week 7. Feed conversion efficiency and final breast muscle mass were improved by feeding birds sunflower oil compared with feeding either fish oil or tallow. Birds fed fish oil had lower abdominal fat pad mass at weeks 5, 6 and 7 compared with birds consuming either tallow or sunflower oil.
Conclusions – The data indicate that omega-3 and omega-6 FA have a differential effect on broiler performance and alter both energy utilization and metabolite concentrations, with subsequent effects on carcass composition. Fish oil (n-3 FA) is more effective than either sunflower oil (n-6 FA) or tallow (saturated FA) at reducing body fat, whereas sunflower oil is more effective in improving breast muscle mass and feed conversion efficiency than either fish oil or tallow. Most changes in carcass composition and broiler performance occur after six weeks of feeding these FA, although significant changes in both energy utilization and endocrine status occur after 5 weeks.
References
Background – The addition of fats to pig diets has primarily concentrated on the contribution to the energy component of the diet, principally the digestible energy content. The results of experiments where pigs were kept under individual (ideal) housing conditions are consistent in that the pigs adjusted their voluntary feed intake with digestible energy (DE) density to maintain a constant energy intake (1). The results of experiments where pigs were kept in groups (commercial housing conditions) showed that pigs tended to increase their daily DE intake as the DE density of the feed increased. This increase in DE intake improved the growth rate of the pigs but also increased their fat deposition (1). Economic analysis of the experiments indicates that formulating diets to a least cost per MJ of DE is not the most profitable point at which to set the DE density (1). In recent years there has been an upsurge in evidence that individual fatty acids can have significant, defined metabolic effects, led by interest from human nutrition circles. The latest work on dietary fats has shown a significant effect on metabolism and feed intake control separate from their role in energy metabolism.
Objectives – To show that the inclusion of fat into a balanced diet will increase the growth performance of growing pigs above that expected from increased DE density of the diet.
Design – The experiment involved 576 male pigs of 25 kg randomly allocated to 2 dietary treatments over a four week period. The pigs were assigned to either a low DE density feeding program or a high energy density feeding program. The low DE program consisted of feeding a diet containing 13.8 MJ of DE per kg for 48 days and a diet consisting of 13.6 MJ DE/kg for 35 days. The high DE program consisted of a 14.6 MJ DE/kg diet and a 14.5 MJ DE/kg diet fed over the same time periods. The extra DE content of the high DE diets was obtained from adding fat to the diets. The diets were balanced for amino acids. Average live weights were recorded on a pen basis at the start of the experiment and again at 21, 48, 69 and 83 days into the experiment. Average daily feed intakes were recorded at day 21, 48, 69 and 83 of the experiment. An ultrasound fat depth was recorded at the P2 site at the end of the experiment. The animals were then sent to the abattoir for slaughter where hot standard carcass weight and a carcass P2 fat depth measurement were recorded.
Outcomes – The 7% increase in DE density and 7.8% increase in net energy density during the first 49 days of the experiment had no significant effect on growth rate or average daily intake. The pigs on the high DE diets did have a 3.3% reduction in feed conversion. This indicates that the pigs did not efficiently utilise the 6% increase in DE intake per day; the reasons for this are unclear, but it does show that the pig is not responding to DE. The response of the pigs through the final 35 days of the experiment is shown in Table 1. There was a significant 7.63% increase in rate of gain with a reduction in feed intake. The 6.6% increase in DE density and 8.2% increase in net energy resulted in a 13% improvement in feed conversion, despite the fact that the daily DE intake of the pigs was equivalent. This clearly shows that there is an effect on metabolism, above that of DE, when the fat level of the diet is increased. The increase in fat deposition shown in this experiment and the previous experiments discussed above indicates that the effects on growth rate and feed efficiency are magnified further than can be explained by the energetic relationships alone.
Conclusion – The series of experiments discussed above indicates that there is clearly an effect of fat inclusion in the diet, above that of a direct energy intake effect, in increasing the conversion of energy to live weight. The mechanisms for this are as yet unclear, but it appears that the greatest effects occur with changes in fat levels and the ability of the animal’s metabolism to react to those changes in the fat content of the diet. The implications for pig and human nutrition are that, while we tend to take into account just the energy density of the diet, the functional properties of fat addition must also be taken into account, as they can be as significant as the energy density of the diet. This will extend further as we look at the metabolic and functional properties of individual fatty acids.
Background – Major advances in our understanding of the central and peripheral regulators of energy homeostasis have provided new pathways to exploit for controlling weight gain and obesity in humans and for boosting feed intake and productivity in the pig industry. These factors are directed by both chronic and acute mechanisms designed to co-ordinate energy balance and the flow of substrate between tissues. The identification of a role in the regulation of feeding for ghrelin, the endogenous ligand for the hypothalamic/pituitary GH secretagogue receptors, provides an important functional link between the actions of a key anabolic hormone and the flow of energy substrate required to meet biosynthetic demands. Thus, as a regulator of feeding, ghrelin was targeted for pharmacological intervention to control obesity irrespective of species.
Review – Numerous studies (1) have shown distinct diurnal rhythms in ghrelin, characterized by a pre-prandial rise and a post-prandial fall in circulating ghrelin, consistent with the hormone contributing to the pre-prandial orexigenic drive in humans. In many of these studies ghrelin status is positively associated with the adipocyte hormone leptin and negatively associated with insulin status, as energy substrate is partitioned according to need and then assimilated to support tissue metabolism. In contrast, feeding animals for production purposes is associated with maximizing feed intake and its efficiency of conversion into liveweight. Our recent studies have shown that feeding high protein/energy diets to pigs ad libitum results in dissociation of ghrelin status and feeding behaviour, irrespective of the number of meals over which the feed is offered. In contrast, insulin concentrations continued to track the glycaemic status of the animal, in line with its obligatory glucoregulatory role. Similarly, we have found that fasting day-old piglets for 12 hours did not alter circulating ghrelin status, suggesting that this hormone plays little role in the initial suckling response for colostrum intake. However, circulating ghrelin status decreased significantly between days 1 and 4 post-partum, although levels of expression in both the gastric fundus and the pancreas remained similar. In contrast, in the human, ghrelin status increases from birth to day 4 of life, although, as with the pig at this age, ghrelin is not responsive to feeding (2). These differences may be related to differences in thermogenic mechanisms between species, as infants rely on brown adipose tissue whereas this is absent in the newborn piglet. The strong relationship between circulating ghrelin status in humans and anthropometric and metabolic parameters through development (3) suggests that ghrelin plays a chronic role in maintaining energy balance and body composition. And yet circulating ghrelin status responds acutely to amino acid and glucose infusions more readily than to fat, at least in rodents.
Conclusion – Hormones regulating feeding behaviour acutely also play a role in regulating energy balance over the life of the animal. It is important that species differences in these mechanisms are elucidated to see how the pig can be used for studying the orexigenic drive in humans.
References
1. Cummings DE, Purnell JQ, Frayo RS, Schmidova K, Wisse BE, Weigle DS (2001) Diabetes 50, 1714-1719.
2. Bellone S, Baldelli R, Radetti G, Rapa D, Vivenza D, Petri A, Savastio S, Zaffarono M, Broglio F, Ghigo E, Gona G (2006) J Clin Endocr Metab 91, 1929-1933.
3. Soriano-Guillen L, Barrios V, Chowen JA, Sanchez I, Vila S, Quero J, Argente J (2004) J Pediatr 144, 30-35.
Background – Ruminant depot fat has a high ratio of saturated fatty acids (SFA) to polyunsaturated FA (PUFA) due to ruminal hydrogenation of dietary FA. However, the lipid contained in trimmed lamb (intramuscular fat) contains a higher proportion of PUFA and omega-3 (ω-3) FA than depot lipid and may provide an important source of these FA.
Objectives – To determine the variation in muscle FA between genotypes in 14 month old yearling sheep.
Design – One side of each of 147 carcasses from five genotypes (pure Merino=Merino; Border Leicester x Merino=BLM; Poll Dorset selected for growth x Merino=PDgM; Poll Dorset selected for muscling x Merino=PDmM; PDg x BLM=2X) of sheep maintained under the same grazing conditions was used. Carcass lean and fatness, entire loin muscle weight and muscle FA composition were determined.
Outcomes – Carcass fatness (%) increased in ascending order from Merino to Poll Dorset to Border Leicester genetics. Muscle lipid, SFA, PUFA:SFA ratio and omega-3 FA did not differ between genotypes.
Conclusion – These data indicate that consumption of one serve of yearling sheep meat (150 g) would contribute 120 mg of long chain ω-3 FA to the diet, which is 24% of the suggested daily allowance recommended by the National Health and Medical Research Council (2005).
Background – Increased consumption of the long chain omega-3 (n-3) fatty acids eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) is associated with a reduction in cardiovascular (CV) and inflammatory risk factors, but it is unclear what level of intake is required to achieve benefits.
Objective – The aim of this study was to establish a relationship between changes in red blood cell (RBC) membrane DHA levels and changes in CV and inflammatory risk factors.
Design – Seventy subjects (42 males and 28 females, mean age 51.8 yr) with habitually low dietary n-3 intake, elevated triglycerides (>1.6 mmol/L) and BMI >25 kg/m² were enrolled in a randomized, double-blind, placebo-controlled intervention trial. Subjects were assigned to consume 6 x 1g oil capsules per day for 12 weeks. Varying combinations of DHA-rich tuna oil (26% DHA, 6% EPA) or sunflower oil (placebo) capsules provided intakes of 0, 2, 4 or 6g of either oil/day. RBC membrane fatty acid composition and markers of CV risk and inflammation were measured.
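As a quick worked illustration of the daily DHA doses implied by the design above, the arithmetic below simply applies the stated 26% DHA content to each tuna-oil intake level; it is an illustration only, not part of the trial's reporting:

```python
# Illustrative arithmetic only: approximate daily DHA dose at each
# tuna-oil intake level, using the stated 26% DHA content of the oil.
dha_fraction = 0.26
for tuna_oil_g in (0, 2, 4, 6):
    dha_g = tuna_oil_g * dha_fraction
    print(f"{tuna_oil_g} g tuna oil/day = about {dha_g:.2f} g DHA/day")
# -> 0.00, 0.52, 1.04 and 1.56 g DHA/day
```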
Outcomes – DHA incorporation into RBC membranes increased over 12 weeks and was proportional to the level of DHA consumed (P<0.05). After 6 weeks of supplementation, there was a dose-response relationship between the intake of DHA and reduction in plasma triglycerides (R=0.315, P<0.05), with no effect on plasma cholesterol. Conclusions – Plasma triglycerides are reduced in a dose-dependent manner in response to supplementation with DHA-rich fish oil. The relationship between the intake of DHA, its incorporation into RBC membranes and changes in markers of cardiovascular risk may provide a potential index of the health benefits of DHA rich foods.
Background – Pregnant women and their babies are a priority public health target group. Long Chain Omega-3 Polyunsaturated Fatty Acids (LC n-3 PUFA) and their importance during pregnancy have been studied extensively over the years. It is not known whether adequate information about the risks and benefits of omega-3 (n-3) is available to pregnant women from their health professionals.
Objective – To determine communication strategies between health professionals and pregnant women, using a series of interviews and surveys to understand health professionals’ attitudes regarding the risks and benefits of omega-3 and the subsequent information flow.
Design – A total of 16 health professionals were recruited and interviewed (7 midwives, 7 dietitians, 2 general practitioners). Interviews were transcribed and content analysis was performed in order to generate both qualitative and quantitative data. Interviews examined the extent to which information is being delivered to pregnant women. Pregnant women are currently being surveyed to determine information flow of LC n-3 PUFA.
Outcomes – Only four of the 16 interviewed health professionals have a wide knowledge of n-3; two of these are midwives who are involved in educating other midwives and the other two are long-term practising dietitians. Five of seven midwives provide no advice about n-3. Dietitians see pregnant women only if there are underlying nutritional disorders and hence have limited access. The general practitioners interviewed were not involved in the management of pregnant women, as these women are referred to ante-natal clinics. Survey data collection and analysis are still pending.
Conclusions – Preliminary results show that there is limited knowledge about n-3 and that the health professionals who do know about n-3 have limited contact with pregnant women. Therefore, a more strategic information flow about n-3 to pregnant women is warranted.
Background – It has been recognized that specific fatty acids have the ability to directly influence the abundance of gene transcripts in organs such as the liver. However, little comparison has been made between the effects of common dietary fatty acids and their influence on gene expression.
Objectives – To determine the effect of diets rich in saturated, monounsaturated or polyunsaturated fat on gene transcripts associated with liver fat metabolism. Specifically, the influence of these three classes of fatty acids on mRNA levels of key transcriptional regulators (PGC1a, PPARa, PPARd, SREBP1C & ChREBP), fat oxidative (ACO, L-CPT1, HMG-CoA lyase & UCP-2) and fat synthetic (ACC, MCD, GPAT & malic enzyme) genes was investigated. Design – Rats (n=32) were evenly divided into four groups: a saturated fat diet, a monounsaturated fat diet, a polyunsaturated fat diet (each containing 23% fat) or a standard rat chow (7% fat) diet, and fed for 12 weeks. Real-time PCR analysis was performed on liver tissue.
Outcomes – PGC1a and SREBP1C increased 1.9 fold or greater in all groups. Conversely, PPARa, PPARd and ChREBP demonstrated variable changes with diet composition. Monounsaturated and polyunsaturated fat increased HMG-CoA lyase 2.8 fold, a response that was absent in the saturated fat fed animals. UCP-2 was decreased 3.0 fold by all dietary treatments. Malic enzyme was increased 2.8 and 2.4 fold by the saturated and polyunsaturated diets respectively, yet was unaltered by the monounsaturated fat diet.
Conclusion – Modifications in common dietary fat composition initiated divergent gene responses in liver. These alterations were complex, with no uniform alteration in transcription factors with closely related functions (PPAR- family) and genes encoding proteins within the same metabolic pathway (fat oxidation or fat synthesis). Further studies are necessary to identify the predominant mechanisms regulating these differences in gene expression.
Background – Long chain omega-3 fatty acids (LCO3FA), eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), have a range of specific health benefits in the prevention and treatment of chronic diseases, particularly coronary heart disease (CHD). Australian experts recently recommended intakes of 610 mg/day for men and 430 mg/day for women (1). However, recent estimates suggest that Australians consume, on average, only 189 mg of LCO3FA/day (2). Increasing demand for foods rich or enriched in LCO3FA may increase demand upon wild fisheries, which are at maximum levels of exploitation, with some declining. Aquaculture meets some demand but is also dependent upon wild-caught fish for fishmeal. There is a need for alternative methods of producing LCO3FA. Objectives – To elicit predictors of variation in intentions to consume foods rich in the LCO3FA, EPA and DHA, so as to understand potential demand for novel sources of LCO3FA.
Design – Responses from a consumer sample (n = 220) were elicited including protection motivation theory constructs as independent variables. Descriptions of model products representing options for possible (future) consumption were presented, including fish, currently available enriched foods (breads and milks) and novel products derived from genetic modification (GM) of oilseed crops for direct human consumption or consumption by fish in aquaculture.
Outcomes – Multivariate regression was undertaken and the significant (P<0.05) predictors (β) for likelihood to purchase “farmed fish (fed fishmeal)” were: self efficacy 0.56; behaviour (product) efficacy 0.19; belief that fishmeal is unnatural -0.14 (R² = 0.44). For likelihood to purchase “farmed fish (fed GM oilseed)” the predictors were: self efficacy 0.65; perceived severity of CHD 0.15; BMI -0.13; significant other has arthritis 0.11; belief that GM oilseed is unnatural 0.11 (R² = 0.49).
Conclusion – Self efficacy (confidence to consume) was the most important predictor of likelihood to purchase.
References
1. Baghurst K. (2006). Nutrient Reference Values for Australia and New Zealand Including Recommended Dietary Intakes. http://www7.health.gov.au/nhmrc/publications/synopses/n35syn.htm
2. Meyer BJ, Mann NJ, Lewis JL, Milligan GC, Sinclair AJ, Howe PRC. (2003). Dietary Intakes and Food Sources of Omega-6 and Omega-3 Polyunsaturated Oils. Lipids; 38: 391-398.
Background – The difficulty of affording a healthy diet on a low income is thought to be a major problem contributing to widening inequalities in nutrition and health. It is perceived that many people cannot afford to meet the recommended 400g of fruit and vegetables a day.
Objective – To conduct market research on the price of selected fresh fruit and vegetables to determine the daily cost of meeting the 5-a-day initiative in New Zealand.
Design – Four waves of data collection were conducted at five selected food stores to reflect differences in store prices and seasonal variation. To establish the price per 80g serving, two different strategies were applied, as sketched below. Items sold at a price per kilogram were converted by dividing the price by 1000 to give the price per gram, then multiplying by 80. Items sold by the piece were converted by dividing the price by the actual weight of the item in grams and multiplying by 80.
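The two conversion strategies described above amount to a simple unit-price normalisation; a minimal sketch is shown below, in which the item names, prices and weights are hypothetical examples rather than the survey data:

```python
# Minimal sketch of the two unit-price conversions described above.
# Item names, prices and weights are hypothetical, not the survey data.

def price_per_80g_from_per_kg(price_per_kg):
    """Items priced per kilogram: divide by 1000 for the price per gram, then multiply by 80."""
    return price_per_kg / 1000 * 80

def price_per_80g_from_item(price_per_item, item_weight_g):
    """Items priced per piece: divide the price by the item weight in grams, then multiply by 80."""
    return price_per_item / item_weight_g * 80

apples = price_per_80g_from_per_kg(3.50)      # e.g. NZ$3.50/kg -> NZ$0.28 per 80 g serve
lettuce = price_per_80g_from_item(1.80, 600)  # e.g. NZ$1.80 for a 600 g item -> NZ$0.24 per 80 g serve
print(f"apples: ${apples:.2f}, lettuce: ${lettuce:.2f} per 80 g serve")
```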
Outcomes – To meet the 5-a-day initiative the total cost ranged from $1.40-$1.97 (spring), $1.13-$1.98 (summer), $1.37-$2.00 (autumn), and $1.64-$2.12 (winter) per person. There was no significant difference in the price of fruit and vegetables between seasons once sale items were included in the daily cost (F(3, 16) = 0.256, P > .05); however, there was evidence of a significant effect of food store (F(4, 15) = 4.67, P = .012).
Conclusions – The cost of meeting the 5-a-day initiative ranged from NZ$1.37 (the cost of a typical chocolate bar) to NZ$2.12 (the cost of a typical packet of biscuits) throughout the year. The results of the present study provide encouraging evidence to empower low-income populations to consume a healthy diet.
Introduction
New Zealand was once known as one of the lowest selenium (Se) environments in the world, but the implications in terms of human health were not clear. In 1988 Ray Burk described the situation as “selenium deficiency in search of a disease” (1), which reflected the uncertainties at that time. Clearly there were many ‘selenophiles’ who believed that Se was a cure for all ills, including cancer, cardiovascular disease (CVD), rheumatoid arthritis, male infertility and many others, while other researchers were more cautious because of the lack of clear evidence for these associations. We now know much more about Se deficiency and function, which helps to clarify the role of Se in health and disease, but many questions remain. This paper will review briefly what we currently know about Se’s role in health and disease, with particular reference to those conditions of interest for the New Zealand population.
Se exerts its functions through the selenoproteins, which contain selenocysteine residues, usually at their active sites (2). There are 25 mammalian selenoproteins, not all of whose functions are known. When Se intake is limited, there is a clear priority for Se supply, both to certain tissues and to certain selenoproteins, resulting in a “hierarchy” of importance of selenoproteins (3). Many of the deficiency or health effects of Se can be attributed to these proteins, but some actions of Se, such as its proposed anti-cancer properties, may operate independently of the selenoproteins.
Background – In developed countries, persons of low socioeconomic status (SES), particularly women, are less likely to consume diets consistent with dietary guidelines. Little is known about the mechanisms that influence SES differences in eating behaviours and food purchases. Cost is a strong influence on food purchases and given that persons of low SES often have more limited budgets, healthier foods such as fruit and vegetables may be overlooked in favour of less healthy, more energy-dense lower cost options.
Objective – To investigate the importance of the available household food budget as a predictor of food purchasing choices among women of low and high SES.
Design – This study used a novel experimental design which included a mixture of quantitative and qualitative methods. A sample of 74 women (37 low SES women and 37 high SES women) was selected on the basis of their household income and sent an itemized shopping list in order to calculate their typical weekly household shopping expenditure. The women were also asked to indicate those foods they would add to their list if they were given an additional 25% of their budget to spend on food and what foods they would remove if they were restricted by 25% of their budget.
Outcomes – Total food expenditure and expenditure on fruit and vegetables, meat products and alcohol were lower among low SES households compared with high SES households. However, expenditure on ‘extra foods’ such as biscuits and convenience foods was higher in low SES households. When the women were asked what foods they would add to their shopping list with a larger household food budget, low SES women chose more foods from the ‘healthier’ categories.
Conclusion – This study highlights the importance of cost when making food purchasing choices among low SES groups. Public health strategies aimed at reducing SES inequalities in diet might focus on promoting healthy diets that are low cost.
Background – The prevalence of obesity and of type 2 diabetes has increased over the last 20 years, while mortality from coronary heart disease has been declining. The reasons for these changes remain relatively unclear, as longitudinal data on predictors of change in obesity are scarce.
Objective – To understand the food and lifestyle antecedents of BMI and their changes over time between 1976 and 2005.
Design – Between 300 and 725 self-reported questionnaires from the Sydney Adventist Hospital were randomly selected for each of the years 1976, 1986 and 2005. Analyses included simple descriptive statistics, reliability analysis, univariate analysis of variance and linear regression analysis.
Outcomes – In 1976 dieting, physical activity, breakfast, chicken and pie/cake consumption predicted BMI for males; for females dieting, time urgency, eating between meals, butter and soft drink consumption were predictive. In 1986, dieting, physical activity, eating between meals, regular meal patterns and consumption of spreads, coffee, cereals, and salt predicted BMI in males; dieting and consumption of margarine, spreads and coffee were predictive for females. In 2005, choosing low-fat foods, consumption of eggs, spreads, coffee, cola and liquor predicted BMI for males; consumption of cola was predictive for females.
Conclusions – Foods and behaviours appear to be differentially related to BMI by sex and year. Factors predicting increases in BMI reflected changes in food patterns toward more energy dense foods.
Background – Diets with a low energy density are associated with reduced energy intakes and are promoted as a sustainable strategy for long-term weight control.
Objectives – To determine whether education sessions on ways to reduce dietary energy density (ED) can successfully decrease the energy density of diets consumed by women following weight reduction.
Design – Overweight or obese women entering a 14-week weight maintenance education program following a weight loss diet (~5 MJ/day for 8 weeks) were studied. Women were randomised to receive either standard education on the Dietary Guidelines (six 30 minute sessions) or modified education involving twelve 30 minute sessions on principles of ED. The ED education program included food examples, homework, and group discussion. Five day diet diaries were used to measure dietary intake, and Nutritionist V (FirstDataBank Inc., San Bruno, CA) was used to determine energy intake, total weight of food consumed, and the number of servings from each of the food groups. ED was calculated as kilocalories divided by the total weight of food and beverages, excluding water.
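The ED calculation described above is simply total energy divided by the total weight of food and non-water beverages; a minimal sketch, using hypothetical diary entries rather than trial data, is:

```python
# Minimal sketch of the energy density (ED) calculation described above:
# ED = kilocalories / total weight (g) of food and beverages, excluding water.
# The diary entries are hypothetical examples, not trial data.
diary = [
    # (item, kilocalories, weight_g, is_water)
    ("porridge", 250, 300, False),
    ("apple", 80, 150, False),
    ("water", 0, 500, True),
]
kcal = sum(item[1] for item in diary if not item[3])
grams = sum(item[2] for item in diary if not item[3])
ed = kcal / grams
print(f"ED = {ed:.2f} kcal/g")  # 330 kcal / 450 g = 0.73 kcal/g for this example
```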
Outcomes – Compared with the group receiving standard nutrition education sessions (n=48), those receiving the reduced-ED education (n=17) had a significant reduction in ED at the end of the study (0.94 vs 0.79, β -0.29, 95% CI -0.43, -0.06). The reduction was due to a decrease in the number of servings from fats, oils and sweets (9.8 vs 3.7, β -0.25, 95% CI -4.45, -0.13), meat, fish, poultry and alternatives (1.8 vs 1.3, β -0.24, 95% CI -0.71, -0.01) and breads, cereals, rice and pasta (5.6 vs 4.2, β -0.28, 95% CI -2.02, -0.23). No significant differences were seen between the groups with respect to the number of servings of fruits (1.9 vs 1.5, β -0.09, 95% CI -0.96, 0.40), vegetables (2.4 vs 2.3, β 0.03, 95% CI -0.65, 0.81) or dairy products (1.8 vs 1.5, β -0.11, 95% CI -0.72, 0.25).
Conclusion – Decreases in ED can be achieved following education sessions. However, the reduction is largely achieved through decreased intake of high fat foods rather than an increase in low energy dense foods. The lack of a significant increase in the number of servings from low energy dense food groups (fruits and vegetables) suggests that this decrease may not be sustained over the long term.
Background – Diet indices reflecting recommended or optimal eating patterns have been suggested as a method for describing dietary patterns; however, there is little published work on indices relevant to the Australian context. Objective – The objective of this study was to develop and evaluate a food-based dietary index reflecting adherence to the Dietary Guidelines for Australian Adults and the Australian Guide to Healthy Eating for use in epidemiology. Design – Analysis was conducted of data collected in the 1995 National Nutrition Survey on participants aged >19 years who completed a 108 item food frequency questionnaire (n=8332). The dietary index consisted of fifteen items reflecting the dietary guidelines, including intake of vegetables and legumes, fruit, total cereals, meat and alternatives, total dairy, fluids, sodium, saturated fat, alcoholic beverages, sugars and “extra” foods (as defined by the Australian Guide to Healthy Eating). Diet quality was incorporated by inclusion of items relating to wholegrain cereals, lean meat, reduced/low fat dairy and dietary variety. Mean dietary index scores were calculated across socio-demographic factors, and mean nutrient intakes from 24-hour recalls were calculated across quintiles of dietary index score (a sketch of this kind of summative scoring follows).
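A minimal sketch of the kind of summative, item-based scoring described above is shown below; the item names, cut-offs and point values are hypothetical illustrations, not the published scoring rules of the index:

```python
# Hedged sketch of a summative, food-based dietary index: each guideline-based
# item earns points and the total reflects adherence. Item names, cut-offs and
# point values here are hypothetical, not the published index.

def score_item(value, cutoffs_points):
    """Award the points for the highest cut-off the reported intake meets."""
    points = 0
    for cutoff, pts in cutoffs_points:
        if value >= cutoff:
            points = pts
    return points

def dietary_index_score(ffq):
    score = 0
    score += score_item(ffq["vegetable_serves_per_day"], [(2, 1), (5, 2)])
    score += score_item(ffq["fruit_serves_per_day"], [(1, 1), (2, 2)])
    score += score_item(ffq["wholegrain_proportion"], [(0.25, 1), (0.5, 2)])
    # ...remaining items (dairy, lean meat, fluids, sodium, variety, etc.)
    return score

example = {"vegetable_serves_per_day": 4, "fruit_serves_per_day": 2,
           "wholegrain_proportion": 0.6}
print(dietary_index_score(example))  # -> 5 for this hypothetical respondent
```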
Outcomes – Significant differences were found in mean dietary index scores according to sex, age, income and area-level index of relative socio-economic disadvantage, with higher scores shown amongst women, older people, those with higher incomes and those living in the least socio-economically disadvantaged areas. Higher dietary index scores were associated with lower intakes of energy, total fat and saturated fat and higher intakes of fibre, β-carotene equivalents, vitamin C, folate, calcium and iron (p<0.05).
Conclusion – This dietary index based on the recommendations for healthy eating in Australia is able to discriminate across a variety of socio-economic factors and reflects intakes of key nutrients. Further work is required to determine whether this index is useful in predicting health outcomes.
Background – Legislation for regulating nutrition, health and related claims on foods in Australia and New Zealand is currently under review. It has been proposed that in future, all claims be regulated through the Food Standards Code (1), as the current co-regulatory system comprising the Food Standards Code and Code of Practice on Nutrient Claims (2) is not fully enforceable, and may inadvertently lead to consumer deception.
Objective – To determine a) the number of Australian foods currently carrying nutrition, health and related claims, and; b) of those that do, the proportion that meet the current provisions in the Food Standards Code and Code of Practice on Nutrient Claims (the co-regulatory system).
Design – A comprehensive survey of the labels of 4,171 foods in a large suburban supermarket in central Sydney, NSW between August and September, 2005. All food label information was entered into a custom-built, MS Access database for collation and statistical analysis.
Outcomes – A total of 2,611 (62.6%) foods carried some kind of nutrition, health or related claim. Of these, 872 (33.3%) foods carried a claim that did not comply with the co-regulatory system. The most common reason for a breach was failure to list the total fat content in the nutrition information panel (47.1%). Of the products that did not make any nutrition, health or related claims, only 296 (7.1%) breached the co-regulatory system.
Conclusion – A large proportion of foods currently making nutrition, health or related claims are in breach of either the current Food Standards Code or Code of Practice. As such, our data provide support for the placement of all nutrition, health and related claims in the Food Standards Code and the abolition of the current co-regulatory system. This will provide much stronger protection for consumers in Australia and New Zealand.
References
1 Australia New Zealand Food Authority. Food Standards Codes – Volume 2. Canberra: Information Australia, 2000.
2 National Food Authority. Code of Practice. Nutrient claims in food labels and in advertisements. Canberra: National Food Authority, 1995.
Background – A probiotic has been defined as a viable microbial food supplement which beneficially influences the health of the host (1). Lactobacillus rhamnosus GG (=ATCC 53013) is one of the best-documented probiotics in human studies, and this experience forms the basis for future developments.
Objectives – To evaluate the mechanisms of probiotics and identify targets for probiotic use. Based on this assessment, a rationale for probiotics and for future probiotic developments has been formulated.
Design – Mechanistic and human studies on probiotics, and especially on Lactobacillus GG, were reviewed to identify targets and mechanisms of probiotic action. Based on the review, an outlook for new criteria has been formulated.
Outcomes – Individual probiotics have defined, strain-specific actions in human studies, and all probiotics are unique. Clinical efficacy has been demonstrated for clearly identified target aberrancies related to intestinal microbiota development. Based on meta-analyses, Lactobacillus GG, Saccharomyces boulardii and the combination of Bifidobacterium lactis and Streptococcus thermophilus have been identified as effective in preventing antibiotic associated diarrhoea. As studies have also indicated new probiotic combinations effective for specific aberrancies, such developments should be further characterized. On this basis, future selection criteria for specific probiotics and probiotic combinations can be established.
Conclusion – The available evidence indicates that microbiota targets should be identified for probiotic action. Genetic information on specific strains and their actions can assist in finding new probiotics and combinations for future target-specific preparations with increased efficacy in human studies.
References
1. Szajewska H, Ruszczynski M, Radzikowski A. Probiotics in the prevention of antibiotic-associated diarrhea in children: A meta-analysis of randomized controlled trials. J Pediatr. 2006;149:367-372.
2. McFarland LV. Meta-analysis of probiotics for the prevention of antibiotic associated diarrhea and the treatment of Clostridium difficile disease. Am J Gastroenterol. 2006;101:812-22.
Background – New probiotic combinations are an opportunity to develop more targeted products for different consumer groups or health problems. The general criteria for an industrially interesting probiotic are safety, survival in the gastrointestinal tract, adhesion, colonisation, documented physiological effects shown in clinical trials, and technological feasibility.
Objective – The aim was to develop a clinically effective and technologically applicable probiotic combination. Design – Microbiological and biochemical screening, combined with cell line studies, was used to identify the most suitable combination for clinical interventions.
Outcomes – The screening resulted in a combination of four strains: Lactobacillus rhamnosus GG, Lactobacillus rhamnosus Lc705, Propionibacterium freudenreichii ssp. shermanii JS and Bifidobacterium Bb-12. Clinical efficacy was demonstrated in subjects suffering from irritable bowel syndrome, and in subjects undergoing Helicobacter pylori eradication treatment. All the strains were capable of colonising the intestinal tract. The combination was technologically feasible for industrial scale production, and the stability of all the strains in a fermented milk drink was good during a five-week shelf life.
Conclusions – This development process shows that the combination of LGG, L. rhamnosus Lc705, P. freudenreichii ssp. shermanii JS and Bb-12 is a promising probiotic ingredient for functional foods targeted at relieving symptoms of irritable bowel syndrome.
Background – Adhesion and colonization of the mucosal surfaces by probiotics are possible protective mechanisms against pathogens through competition for binding sites and nutrients (2) or immune modulation (1).
Objectives – The objective was to test the abilities to inhibit, to displace and to compete with pathogens in order to screen the most effective adhesive probiotic combination, and to develop methods for new probiotic characterization.
Design – A human intestinal mucus model (2) was used to assess probiotic strains and their combinations. The strains were selected on the basis of their use as commercial probiotic strains, and each has been demonstrated to have beneficial in vivo health effects.
Outcomes – All probiotic strains showed some ability to act against pathogen adhesion, but displacement, inhibition and competition were clearly strain- and combination-dependent, indicating the need for case-by-case characterization of each probiotic strain and combination. The selection of probiotics to inhibit or displace a specific pathogen could be the basis for both product development and future clinical intervention studies on prevention or treatment of dysfunctions.
Conclusion – Our results suggest that different probiotic combinations can be formulated to enhance the inhibition and displacement of pathogen adhesion to intestinal mucus. New combinations could be more effective in inhibiting and displacing pathogen adhesion than a single strain. Further studies are needed to characterize each combination and to understand their role in inhibition mechanisms.
References
1 Salminen S, Bouley C, Boutron-Ruault MC, et al. Br J Nutr 1998; 80 Suppl 1:S147-71.
2 Ouwehand AC, Salminen S, Tolkko S, Roberts P, Ovaska J, Salminen E. Clin Diagn Lab Immunol 2002; 9:184-6.
Background – New species and more specific strains of probiotic bacteria are constantly being sought for novel probiotic products. Their safety cannot be assumed. Prior to incorporating novel strains into products a careful evaluation of their efficacy is required and an assessment made as to whether they share the safety status of traditional food-grade organisms. Probiotic products which claim specific nutritional, functional or therapeutic characteristics blur the boundaries between what is a food, a diet supplement or a medicine, posing challenges for regulators.
Objective – To report on the adequacy of contemporary studies to characterize and substantiate probiotic safety.
Design – Probiotic studies were examined in relation to the guidelines proposed for safety of probiotics.
Outcomes – Evidence for the safety and efficacy of probiotic organisms has until recently been largely anecdotal or based on relatively little, and often poorly designed research. Food organisms intrinsic to the production of traditional foods have been arbitrarily classified as safe in the absence of scientific criteria, partly because they exist as normal commensal flora, and because of their presence for generations presumably without adverse effect.
Many bacteria are being tested to find a putative probiotic, yielding conflicting data, sometimes for the same organism. Comparisons between studies and organisms cannot be readily made because of non-standardised dosing procedures, particularly for the number of bacteria and the duration of dosing. Information is not readily available on the equivalence or comparability of formulations in different probiotic preparations. Intake data are not generally available for those countries where products are used.
Conclusions – The demonstration of efficacy in probiotics offers vast opportunities to develop human and veterinary products. A new probiotic culture must be at least as safe as its conventional counterparts. There is vigorous debate on what constitutes appropriate safety testing for novel strains proposed for human consumption. Conventional toxicology and safety evaluation is of limited value in assessing the safety of probiotic bacteria. The addition of novel bacterial strains to foods and therapeutic products requires reconsideration of safety assessment procedures.
Background – Several studies demonstrate that probiotic cultures have beneficial effects on human health. Some of these studies have used food products as delivery vehicles, whereas other studies have used dietary supplements to deliver probiotics to the gastrointestinal tract (GIT). It is hypothesised that (1) the physiological state of the probiotic cells will affect their survivability and beneficial activity, particularly in the harsh conditions of the human stomach, and that (2) consuming probiotics in conjunction with food will improve survival and activity of probiotics in the GIT.
Objectives – The current study had two objectives. Firstly, to investigate if the physiological state of the probiotic cells (fresh or lyophilised) affects the survival and metabolic activity of probiotic cultures, following an incubation of probiotic cultures in artificial gastric juice. Secondly, to investigate if co-administration of probiotics with cows milk and soymilk improves survival and/or metabolic activity of probiotics in gastric juice.
Design – Lyophilized and fresh (cultivated overnight) cultures of three probiotic strains (Lactobacillus acidophilus LAFTI L10, Lactobacillus casei LAFTI L26 and Bifidobacterium animalis LAFTI B94) were re-suspended in peptone water, soymilk and cows milk. Gastric juice (pH 2.0) was inoculated with suspensions containing probiotic cultures (1/10), and the survival and metabolic activity of probiotic cultures were monitored for 30 min.
Outcomes & Conclusions – After 30 minutes in the gastric juice, the viability of fresh cultures was reduced by 0.5 to 4 log10 units per ml depending on the strain used. The corresponding viability losses of lyophilized cultures were between 0.5 and 2.5 log10 units per ml. Cows milk and soymilk significantly improved the survival, and metabolic activity, of the probiotic cultures in gastric juice. When cultures were delivered to gastric juice in sweet (unfermented) milk, soymilk or yoghurt, viability losses of less than one log10 unit were detected after 30 minutes of incubation in the gastric juice. These results suggest that lyophilized cultures are more tolerant of gastric acid than fresh cultures, and that both survival and activity of probiotics are enhanced if they are delivered to the gastrointestinal tract together with milk, soymilk or an excipient that protects the probiotic cultures from the harsh conditions of the human stomach.
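To put the reported log10 reductions into perspective, the note below is illustrative arithmetic only (not additional study data) and simply converts a log10 viability loss into a surviving fraction.

\[
\text{surviving fraction} = 10^{-\Delta}, \qquad \Delta = \log_{10}\ \text{reduction}
\]

so a 0.5 log10 reduction corresponds to roughly 32% survival, whereas a 4 log10 reduction corresponds to 0.01% survival.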
Background - UNICEF has recently highlighted the fact that there are 146 million children under 5 years of age who are underweight (<-2SD W/A) (1). Whereas the proportion of underweight children has decreased globally, the numbers in sub-Saharan Africa have actually increased. Over a third of births in South Asia are of low birthweight, predisposing to increased morbidity, mortality and intergenerational stunting. Of the 102 countries for which UNICEF have data, only 42 (45.1%) are on track to reach the 1st Millennium Development Goal target of halving, by 2015, the prevalence of underweight children under five years of age. At the same time, there is an increasingly strong evidence base for proven and promising interventions that improve nutrition and child health, with an estimated 60% or more of the 11 million annual child deaths able to be prevented if these known interventions were implemented (2). Over 50% of these deaths have undernutrition as the underlying cause. On the other hand, increasing attention is directed to the growing epidemic of childhood obesity and the subsequent development of nutrition-related chronic diseases.
Review - There have been many public multilateral policy responses to nutrition issues since 1976, when the FAO boldly targeted the elimination of famine. Thirty years later there are still an estimated 830 million people ‘hungry’. In 1990, the UN held the Summit for Children, which endorsed a series of nutrition and other goals. In 1992, FAO and WHO hosted the International Conference on Nutrition in Rome, with equally ambitious targets. A decade later, in the absence of achievement of the goals, but with a perception that they had been useful in securing government and donor commitment, a UN Special Session for Children refined the goals, including those for nutrition. The six main goals of the MDGs have been a major boost for nutrition-related policy and programming. The World Bank, the single largest investor in global nutrition, has recently recognized nutrition as key to its development agenda. The UN Standing Committee on Nutrition (SCN) has developed an action plan to address the double burden of malnutrition, i.e. the co-existence of underweight (usually in infants and children) and overweight (usually in mothers), which has become common, especially in disadvantaged populations. The World Food Programme and UNICEF have sponsored a global initiative, ‘Ending Child Hunger and Undernutrition’, aimed at increasing country implementation. There has been joint action by several UN agencies, with emergency NGOs and bilateral donors, especially Canada and the USA, in harmonizing interventions to address severe mal(under-)nutrition. WHO has just released new child growth standards, based on optimally fed infants and children, which reinforce the fact that all young children have the same growth potential. Industrialized countries, including Australia, are starting to take a national policy approach to obesity in their young populations. The private sector, alone and in co-operation with large philanthropic foundations, is seeking coordinated and sustainable solutions, with existing partners and Governments, to nutrition problems in developing countries.
Conclusions - These many efforts provide an enormous and potentially unprecedented opportunity to leverage the content and direction of global health and nutrition policy towards improving nutrition and health in childhood and adolescence. The challenge now will be to have these efforts harmonized, funded and implemented.
References
Background – Irritable bowel syndrome (IBS) is one of the most common diagnoses in gastroenterology, but current therapies are of limited efficacy. Recent clinical trials suggest beneficial effects of certain probiotics in IBS.
Objective – The aim was to evaluate the clinical efficacy of a probiotic combination (L. rhamnosus GG, L. rhamnosus Lc705, P. freudenreichii ssp. shermanii JS and a bifidobacterium) in IBS patients.
Design – Two randomised, double-blind, placebo-controlled clinical intervention trials were conducted. In the first trial, 103 IBS patients received daily probiotic supplementation or placebo for six months. In the second trial, 86 IBS patients received daily probiotic supplementation or placebo for five months. IBS symptoms (bowel movements, abdominal pain, distension, flatulence, rumbling) were followed using symptom diaries.
Outcomes – In the first trial, the total symptom score (abdominal pain + distension + flatulence + rumbling) was 7.7 (95% CI 1.6 to 13.9) points lower in the probiotic group than in the placebo group (p=0.015) at six months. This corresponds to a median reduction of 42% in the symptom score of the probiotic group compared with 6% in the placebo group. The total symptom score also decreased significantly in the second trial when the probiotic group was compared with placebo (14 points vs. 3 points; p=0.0083). When each symptom was analysed separately, the probiotic combination had a beneficial effect on abdominal pain (p=0.052) and bloating (p=0.023).
Conclusions – Two long-term clinical interventions indicate that the combination of these four probiotics is a useful and safe treatment option for IBS. Studies on the mechanisms of the beneficial effects are in progress.
Iodine of maternal origin is essential for brain development during foetal and early neonatal life. Globally, iodine deficiency is the leading cause of preventable mental handicap. Individually, the developing brain is extremely vulnerable to even minor degrees of maternal hypothyroxinemia secondary to iodine deficiency. Even mild, clinically unrecognisable, hypothyroxinemia can cause serious irreversible neuromotor deficits rendering a child handicapped for life. The invisibility of the deficiency makes it all the more dangerous. WHO estimates that 2 billion people worldwide, comprising over 300 million children in 54 countries, still have inadequate iodine intake, with over 41 million newborns not protected annually from iodine deficiency. Meta-analyses of IQ studies in children, born to mothers who are moderately to severely iodine deficient during pregnancy, show an IQ loss of 10 to 15 points.
The recommended daily intake of iodine in the child and the adult non-pregnant state is 150 μg, increasing to 250 μg during pregnancy. The increased daily requirement represents the need to satisfy increased maternal T4 production, transfer of both T4 and iodine to the foetus and increased maternal renal iodide clearance. The infant requires around 90 to 100 μg iodine per day, mandating a requirement of 250 μg iodine daily in the breastfeeding mother. Iodine content of human breast milk varies with maternal iodine intake, emphasising the need to ensure iodine intake is optimised during lactation to protect the infant from hypothyroxinemia.
Several localised, regional studies in South Eastern Australia and Tasmania have recently documented the re-emergence of mild to moderate iodine deficiency. To provide a comprehensive snapshot of iodine nutrition throughout Australia, we undertook a National Iodine Nutrition Study between mid 2003 and end 2004. The survey was a cross-sectional study of 8 to 10 year old school children, randomly selected from government and non-government primary schools, in the 5 mainland Australian states of New South Wales, Victoria, South Australia, Western Australia and Queensland. The sample consisted of 1,709 students from 88 schools, comprising 881 boys and 828 girls. 1) Urinary iodine excretion levels (UIE) were determined and compared with WHO/ICCIDD criteria for the severity of iodine deficiency. 2) Thyroid volumes measured by ultrasound were compared with new international reference values (WHO/ICCIDD). On a State basis, NSW and Victorian children are mildly iodine deficient with median UIE levels of 89 μg/L and 73.5 μg/L, respectively. South Australian children are borderline iodine deficient with a median UIE of 101 μg/L. Both Queensland and Western Australian children are iodine sufficient with median UIE levels of 136.5 μg/L and 142.5 μg/L, respectively. Thyroid volumes were significantly larger in Australian children compared with the iodine-replete international reference range. Ongoing studies of iodine nutrition in pregnant women in NSW, and their offspring, confirm mild to moderate iodine deficiency is widespread throughout the State.
The results of this study confirm the existence of inadequate iodine intake in the Australian population and call for the implementation of mandatory iodisation of all edible salt in Australia. In the interim, we recommend iodine supplementation be considered for pregnant women, those contemplating a pregnancy, and breastfeeding mothers.
Iodine is essential for normal development of the brain and central nervous system. It is not surprising, therefore, that the dietary requirements for iodine are highest during pregnancy, lactation, and early childhood. Despite the importance of an adequate intake of iodine at these stages in the life cycle, there is increasing evidence that pregnant women and very young children in Australia and NZ are iodine deficient.
Recent studies in Australia have reported mild iodine deficiency in pregnant women with a median urinary iodine concentration (MUIC) of 52-85 μg/L (1,2); a MUIC ≥100 μg/L indicates adequate iodine status. The one recent published report assessing iodine status in NZ pregnant women reported a MUIC of 38-44 μg/L, indicative of moderate iodine deficiency; however, the sample size used in this study was relatively small (n<50) and only included women living in Dunedin (3). In October and November 2005, the ThyroMobil and Iodine in Pregnancy (TRIP) survey assessed the iodine status of 174 pregnant women living throughout NZ. The MUIC of these women was 38 μg/L and 7% of the women had goitre (i.e. thyroid volume >18 mL).
ICCIDD/UNICEF/WHO have suggested that neonatal TSH levels be used as an index of iodine status, with no more than 5% of neonates having a TSH value >5 mU/L (4). Both Australia and New Zealand routinely screen newborns using a heel-prick blood sample. A study by McElduff reported elevated TSH concentrations in 5.4% and 8.1% of two samples of newborns born in Sydney (5), compared with only 2.2% of the newborns in the more recent study of Travers et al. (2). The use of neonatal TSH concentrations to assess iodine status in NZ has not been fully investigated.
There is a dearth of information about the iodine status of lactating mothers in Australia and NZ. A randomised, double-blind, placebo-controlled, intervention trial of breast-feeding mothers in Dunedin was recently carried out to determine the effect of two levels of iodine supplementation (75μg I/day and 150μg I/day) during the first six months postpartum. Supplementation resulted in a UIC 2.1-2.4 times higher compared to placebo women (P<0.001). Breast milk iodine concentration (BMIC) in supplemented mothers was 1.3-1.7 times higher compared to placebo (P<0.0001). Despite these increases, supplementation of 75μg I/day or 150μg I/day was insufficient to increase maternal iodine status to levels considered adequate by ICCIDD/UNICEF/WHO.
There are also limited data suggesting that the iodine status of NZ children <2 years of age is sub-optimal (6); unfortunately, there are no comparable Australian studies in this age group. Inadequate intakes of iodine in this age group are not surprising given recommendations that salt should not be added to foods prepared for infants, that iodised salt is not used in manufactured infant foods, and that only a small number of children at this age eat fish and seafood.
Together, the results of these studies strongly suggest that pregnant and lactating women, and very young children living in Australia and, in particular NZ, are at increased risk of iodine deficiency. The recent proposal by FSANZ for mandatory iodine fortification of breakfast cereals, bread and biscuits will need to supply sufficient additional iodine to meet the requirements of these vulnerable groups.
References
Background – Evidence from studies and surveys has shown that New Zealand and some parts of Australia have experienced a re-emergence of mild-to-moderate iodine deficiency over the past 20 years. Food Standards Australia New Zealand has been asked by the Australia and New Zealand Food Regulation Ministerial Council to give priority consideration to increasing the iodine content of the food supply.
Objective – To reduce the prevalence of iodine deficiency in affected parts of Australia and in New Zealand by means of mandatory iodine fortification of food.
Design – A benefit/risk analysis was carried out to determine a suitable approach to iodine fortification of food that would permit increases in iodine intakes where needed without compromising public safety. The analysis included:
• assessment of the health risks associated with the current deficiency;
• health risks and benefits of mandatory fortification, based on dietary intake estimates of two fortification scenarios involving cereal-based foods or all salt-containing processed foods, and partial iodisation of discretionary salt;
• considerations of consumer choice; and
• analysis of the cost and qualitative benefits of implementing a mandatory fortification solution.
Following the benefit/risk analysis, a risk management approach and communication strategy were formulated.
Outcomes – The replacement of non-iodised with iodised salt in selected cereal products was proposed. Dietary intake estimates indicated that this approach had an impact comparable to replacing non-iodised with iodised salt in all processed foods. However, restricting mandatory fortification to selected cereal products has a considerably lower impact on local industry and trade. Public comment was sought on this proposal in September and a final recommendation is expected in November 2006.
Conclusions – The replacement of non-iodised with iodised salt in selected cereal products is the draft preferred option for mandatory fortification with iodine as at September 2006.
Background – Thirty percent of Australians have high blood pressure, and the prevalence of hypertension rises with age, such that between 60 and 70 years of age 70% of the population has hypertension [1]. Hypertension is a major risk factor for the development of coronary disease and stroke. There is a large body of evidence implicating high intakes of dietary sodium in the development of hypertension.
Objective – To summarize the evidence to date implicating dietary salt (sodium) in the development and maintenance of hypertension, cardiovascular disease and osteoporosis and review the information on the current dietary sodium intake with reference to the Suggested Dietary Target for health [2].
Outcomes – Meta-analyses of intervention studies lasting at least four weeks have demonstrated significant reductions in blood pressure. Across twenty trials in individuals with elevated blood pressure (n=802), a mean sodium reduction of 78 mmol/day (4.6 g/day salt) lowered systolic blood pressure (SBP) by 5.1 mmHg and diastolic blood pressure (DBP) by 2.7 mmHg. Across 11 trials in individuals with normal blood pressure (n=2220), a 74 mmol/day (4.4 g/day salt) reduction lowered SBP by 2.0 mmHg and DBP by 1.0 mmHg [3]. On a population basis, it has been estimated that a reduction of 2 mmHg in SBP would result in a 6% reduction in risk of stroke, a 4% reduction in risk of coronary heart disease, and an overall reduction in mortality of 3% [4]. Few studies have linked lower sodium intake to reduced risk of cardiovascular disease, but positive associations have been found between higher sodium intakes and stroke incidence and mortality, and mortality from cardiovascular disease. Although some prospective studies suggest that a high salt intake has adverse effects on cardiovascular disease mortality, there are insufficient reliable data on morbidity and premature mortality outcomes. The lack of this evidence reflects the difficulty of undertaking large-scale dietary intervention studies of long duration, which require a high degree of dietary compliance that is difficult to maintain given the salt levels inherent in the current food supply. A high salt intake has been found to increase left ventricular mass independently of blood pressure, and higher dietary salt increases calcium losses from bone. In Australia, sodium intakes have been estimated to be about 150 mmol/day (9 g salt) [5], with more than 75% of the dietary sodium consumed already present in the food supply. To ensure that Australians are able to reduce dietary sodium to levels that approach the Suggested Dietary Target of 70 mmol sodium/day (4 g salt), a reduction in the salt content of the food supply must occur.
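For readers converting between the sodium units quoted above and below (mmol sodium versus grams of salt), the figures follow from standard molar arithmetic; the sketch below is a consistency check on the quoted values, not an additional result. Sodium has a molar mass of about 23 g/mol and sodium chloride about 58.4 g/mol.

\[
78\ \text{mmol Na} \times 23\ \tfrac{\text{mg}}{\text{mmol}} \approx 1.8\ \text{g Na}, \qquad 1.8\ \text{g Na} \times \tfrac{58.4}{23} \approx 4.6\ \text{g salt}
\]

By the same arithmetic, the Suggested Dietary Target of 70 mmol sodium/day corresponds to about 4 g salt/day, and the estimated Australian intake of 150 mmol/day to about 9 g salt/day.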
Conclusion – There is good evidence that, in both hypertensives and normotensives, there is a significant fall in blood pressure with dietary sodium reduction in the range 50-100 mmol/d (3-6 g salt/d). There is moderate evidence that diets high in sodium are associated with increased blood pressure and increased prevalence of hypertension. Dietary sodium reduction can be achieved on a population-wide basis with little individual effort by reducing the sodium content of the food supply. An average reduction of 50 mmol sodium (3 g salt) would lower population blood pressure and significantly contribute to a decrease in the burden of cardiovascular disease.
References
Worldwide, raised blood pressure throughout its range is the major cause of death and the second leading cause of disability after childhood malnutrition, acting through the strokes, heart attacks and heart failure it causes. More than 60% of all strokes and approximately half of all heart disease are due to raised blood pressure.
Our current high salt intake plays a major role in raising blood pressure and, particularly, the rise in blood pressure that occurs with increasing age. Evidence that relates salt intake to blood pressure comes from six different lines of evidence - epidemiology, migration, intervention, treatment, animal and genetic studies. All of these suggest that our salt intake is not only important in raising blood pressure, but a reduction in salt intake would lead to a reduction in population blood pressure, a reduction in the rise in blood pressure with age and better control of those who are already on blood pressure treatment.
The current salt intake in most countries in the world is between 10 and 20 grams/day, and recommendations from the WHO set a world-wide target of reducing salt intake in all adults to less than 5 grams/day. Recommendations in other countries are similar, and the UK has set much lower levels for children depending on age. In most developed countries salt consumption is passive - that is, the salt is already added to processed, ready-prepared, canteen, restaurant, fast and takeaway foods. Only 15% of salt intake is added in cooking or at the table and 5% is naturally present in foods.
The only way, therefore, that a reduction in the population's salt intake can be achieved in these countries is for the food industry to reduce the very high and unnecessary salt concentrations of all foods to which salt has been added, and to do so gradually over a period of time, e.g. in the UK, five years. In addition, a public campaign educating the public about the dangers of eating too much salt would lead to the use of less table and cooking salt and put additional pressure on the food industry.
The public has a right to know exactly what foods contain and, in relation to salt, it is vital that the salt content per serving appears on all products, accompanied by a recommended intake. There should also be a signpost or traffic-light labelling system which indicates whether a product is low (green), moderate (amber) or high (red) in salt. The benefits of reducing salt intake are very large. For instance, if salt intake were reduced by 6 grams/day there would be a 24% reduction in stroke mortality and an 18% reduction in coronary heart disease mortality.
Of all public health strategies, a reduction in population salt intake is the easiest to achieve, as it does not require the population to change what it eats, but does require changes from the food industry. Although this change could occur without the public necessarily being involved, it would be greatly helped by a public health campaign. The UK, for once, is leading in this area of public health, but one would expect, given the evidence, that other countries, particularly Australia, will rapidly adopt this approach and overtake the UK in the next few years.
Background – The National Heart Foundation of Australia has advocated that Australians eat less salt for more than forty years. A recent review by the Heart Foundation to determine whether a causal association exists between a decrease in dietary sodium intake and CVD risk reduction confirms that there is good evidence that reducing dietary sodium intake will lower blood pressure (1). Much of the salt consumed by Australians is from the food supply rather than the salt shaker. The Heart Foundation’s Tick Program has been profiling healthier food choices for 17 years. By challenging manufacturers to meet tough nutrition standards, the Tick Program aims to improve the food supply for ultimate public health gains, while helping consumers with their choices at point-of-sale. The Heart Foundation sets strict standards for sodium across categories representing the foods Australians consume most often and tracks the impact of these benchmarks.
Objective – To outline the impact of the Heart Foundation’s Tick Program on reducing the salt intake of Australians by improving the food supply.
Outcomes – The Tick Program sets tough sodium criteria for 36 of its 54 categories. A recent review of the Tick criteria has seen further sodium reductions in 10 categories. Sodium reduction is a challenge for industry, as sodium has a critical role in consumers' taste acceptance, texture, pathogen inhibition and flavour. There are limits to the sodium reduction that can be achieved in some categories, and a requirement to lower sodium levels incrementally to achieve desired targets. There are several examples of how the Tick Program's sodium benchmarks have driven reformulations and innovations towards healthier offerings for consumers. For example, 235 tonnes of sodium were removed annually from the food supply by a manufacturer reformulating 12 cereals using the Tick criteria as a target (2). On average, luncheon meats have a sodium content of 1000-1500 mg/100 g. A range of Tick-approved luncheon meats has been developed with 50% of the sodium content of comparable products. On average, Australians eat 240 million meat pies each year. Tick meat pies contain 50% less sodium than the average pie, so switching to a Tick alternative would save Australians 111 tonnes of sodium each year. Bread is a significant contributor of sodium to the Australian diet. The Tick Program is incrementally reducing the sodium benchmark for bread from 450 mg to 400 mg over a two-year period in partnership with the bread industry. A manufacturer of pasta sauces has achieved a 25% reduction in the salt content of its sauces over a two-year period using the Tick's benchmarks.
Conclusions – The Heart Foundation Tick Program offers Australians a real solution to lower the salt content of the foods they eat most often. The Tick is a simple, independent, easy-to-spot and trusted guide that shoppers are using to identify healthier choices when faced with aisles of food choices and numerous nutrition claims. For the food industry, the Tick defines healthier choices, sets realistic benchmarks and provides an incentive to modify products. In doing so, the Tick Program is removing thousands of tonnes of sodium from the Australian food supply each year, thereby improving the health of the population.
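As a rough consistency check on the meat pie figures in the Outcomes above (back-of-envelope arithmetic from the quoted numbers only, not an additional result):

\[
\frac{111\ \text{tonnes Na/year}}{240 \times 10^{6}\ \text{pies/year}} \approx 0.46\ \text{g Na saved per pie}
\]

which, given the quoted 50% reduction, implies that an average conventional pie contains on the order of 0.9 g of sodium.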
References
Background – Data are limited but it is estimated that Australians currently consume in the order of 150mmol sodium (9.0g salt) per day [1] compared to the Suggested Dietary Target of 70mmol sodium (4.0g salt) per day [2]. 75% of salt in the Western diet is likely to come from processed foods such as breads, cereals and ready prepared foods [3]. By reducing the sodium content of these foods, food manufacturers can remove significant volumes of sodium from the food supply with potentially significant benefits for population blood pressure levels.
Objective – To investigate the feasibility of sodium reduction across a range of > 130 processed savoury food products with an average sodium content of 800mg/100g.
Design – Product specific benchmarks for sodium were derived from a combination of international and national dietary recommendations and existing signposting systems such as the National Heart Foundation Tick program. These benchmarks were used to reformulate existing products and develop new products and were reviewed after 3 years.
Outcomes – Between 2001 and 2004, there was an average sodium reduction of 25% across this range of products, resulting in an estimated removal of 36 tonnes of sodium from the Australian food supply. Results for the period 2004-2006 are currently being analysed. Initial large sodium reductions were relatively easy to achieve by a combination of removing added salt and adding potassium chloride. Subsequent reductions were smaller and limited by a range of factors including consumer acceptance, the lack of acceptable sodium substitutes and the need for lower-sodium ingredients.
Conclusion – Sodium remains a challenging nutrient to reduce in the food supply. Reductions made tend to be small and gradual to allow for consumer taste adaptation. In addition, consumer awareness of the adverse health effects of sodium is low and strong demand for more ‘natural products’ appears to favour addition of salt in place of ‘unnatural’ sodium substitutes to foods. However, the development of novel sodium substitutes and lower sodium ingredients offers some promise for achieving greater sodium reductions in the future. In addition, research into taste perception is an emerging area of science which may offer a different approach to sodium reduction. The food industry does have a responsibility to reduce the sodium content of processed foods. However, it is the responsibility of all sectors to work together to educate the public and raise awareness about the importance of sodium reduction as well as agree on best approaches to achieve this in practice. Only then can significant and sustainable reductions in sodium in the food supply be achieved.
References
1 Beard TC, Woodward DR, Ball PJ, et al. The Hobart Salt Study 1995: few meet national sodium intake target. Med J Aust 1997;166:404-7.
Background – Zinc is an essential trace element required for growth and development. Nutritional zinc deficiency causes dermatitis, diarrhoea, reduced wound healing, neurological disturbances and increased susceptibility to infections. Zinc deficiency ranks within the top 20 selected leading risk factors in relation to global deaths (1); however, the extent of zinc deficiency is difficult to determine as there is no reliable indicator of body zinc status. Genetic factors can cause zinc deficiency, and individuals with inherited disorders of zinc metabolism show similar features to nutritional deficiency. These genetic disorders provide an opportunity to understand the underlying pathology of zinc deficiency. A severe form of zinc deficiency is seen in some premature breast-fed babies, caused by reduced levels of zinc in maternal milk (2). We hypothesised firstly that cellular zinc transporters may be present in the mammary gland to mediate secretion of zinc into milk and, secondly, that defects in one of these zinc transporters may underlie the condition leading to the production of zinc-deficient milk.
Approach – We analysed human breast tissue for the presence of the SLC30 family of zinc transporters, which are predicted to mediate cellular zinc efflux. We found that 5 members of the SLC30 family of zinc transporters were expressed in the breast. The multiplicity of zinc transporters in the lactating breast was surprising and may be consistent with the requirement for processing of zinc for different cellular functions including secretion. We then investigated three families in which the lactating mothers produced milk with reduced zinc levels (25% of age-matched controls). Breast-fed infants of these mothers who produced zinc-deficient milk developed infected dermatitis, sparse hair, weakness and failure to thrive (3). Analysis of cells from mothers producing zinc-deficient milk showed reduced expression of two genes, SLC30A5 and SLC30A6.
Conclusion – The human breast expresses 5 zinc transporters belonging to the SLC30 family. Two of these have altered patterns of expression in cells from women with inherited disorders of zinc secretion into milk. We postulate that the tissue-specific expression of different types of zinc transporters may account for the variability in zinc levels seen between different organs and tissues. Understanding the function of individual zinc transporters will facilitate the development of indicators of zinc status. Given the multiplicity of cellular zinc transporters, genetic factors may interact with dietary factors to influence the susceptibility or predisposition of individuals to zinc deficiency.
References
Background – There is growing evidence to suggest that higher than recommended dietary intakes of selenium (Se) confer additional health benefits (1), and many individuals are interested in supplementing their diet. Brazil nuts are the richest known natural food source of Se, yet no studies have investigated their efficacy in humans in raising Se status.
Objective – To assess the efficacy of Brazil nuts in increasing Se status in comparison to a selenomethionine (SeMet) supplement and a placebo, as measured by the response of plasma Se and glutathione peroxidase (GPx) activity in plasma and whole blood.
Design – A semi-blinded, placebo controlled trial was conducted with 56 healthy Dunedin adults (18-60 yr) with low Se status. Participants consumed two Brazil nuts containing an estimated 100μg Se, 100μg Se as SeMet, or a placebo tablet (Alaron Products Ltd, Nelson), daily for 12 weeks. Because of a large range in Se concentrations in Brazil nuts, the intake for the nuts averaged 79 μg/day with a possible range of 50-105 μg. Fasting, morning blood samples were taken at baseline, weeks 2, 4, 8 and 12 for measurement of plasma Se and plasma and whole blood GPx activities. The effects of the three treatments were compared using a random effects model (STATA 8.2), adjusting for baseline values, age, sex and BMI.
Results – Mean (SD) baseline plasma Se concentrations were 90 (13), 92 (14) and 89 (14) μg/L in the Brazil nut, SeMet supplemented and placebo groups, respectively. Plasma Se increased by 67.9% (p<0.001), 73.1% (p<0.001), and 6.9% (p=0.117) in the three groups, respectively. Changes in plasma Se over time in the Brazil nut and SeMet groups differed significantly from the placebo group (p<0.0001), but not from each other (p=0.301). Whole blood GPx activity increased by 12.6% (p<0.001), 6.4% (p=0.005), and 1.4% (p=0.478) in the three groups, respectively. The change in whole blood GPx activity was greater in the Brazil nut group than in the placebo group (p<0.001), but did not differ between the SeMet and placebo groups (p=0.102). The change also tended to be greater in the Brazil nut group than in the SeMet group, but the difference was not significant (p=0.087). Plasma GPx activity decreased by 4.0% (p=0.165) in the placebo group and increased by 12.1% (p<0.001) and 6.4% (p<0.01) in the Brazil nut and SeMet groups, respectively. These changes were greater in both the Brazil nut and SeMet groups than in the placebo group (p<0.001), but the two groups did not differ from each other (p=0.165).
Conclusions – Consumption of two Brazil nuts daily is at least as efficacious at increasing Se status and enhancing GPx activity as a supplement providing 100 μg Se as SeMet. This was despite the lower average Se intake from the nuts. It is possible, therefore, that Se from Brazil nuts is more bioavailable for GPx synthesis than SeMet. Although SeMet is probably the major form of Se in Brazil nuts (2), uncharacterized Se species in Brazil nuts may be more bioavailable. In view of the increasing interest in the possible health benefits of higher Se intakes, Brazil nuts are a convenient source of Se with which to increase the Se status of New Zealanders. A simple public health message to consume this high-Se food would avoid the need for fortification of foods or for expensive supplements.
References
Background - Little is known about the bone mineral accretion rate and the relationship between habitual dietary intakes and bone mineral accretion during puberty in Chinese girls habitually consuming plant-based diets low in calcium.
Objectives - To evaluate the rate of bone mineral accretion during puberty in Chinese girls and to study the association between calcium and milk intake and bone mineral accretion rate.
Design - A 5-year observational cohort study was carried out on the unsupplemented controls from a milk intervention trial in Chinese girls. Eighty-seven Beijing urban girls aged 9.5-10.5 years at baseline were included in this analysis. For each of these subjects, there was a complete dataset both at baseline and at years 1, 2, 4 and 5 afterwards. Total body bone mineral content was assessed by dual energy x-ray absorptiometry with a Norland XR-36 densitometer. Average calcium and milk intakes were estimated from 7-day food records at baseline and from 3-day food records at years 1, 2, 4 and 5.
Outcomes - Mean follow-up time was 4.8 years, with a range of 4.6 to 4.9 years. The mean total bone mineral accretion over 5 years from 10 to 15 years of age was 961 (SD 140) g. The mean annual rate of bone mineral accretion was 199 (SD 30) g/year, representing a calcium accretion rate of 164 (SD 24) mg/day and giving an apparent calcium retention efficiency in Chinese girls during puberty of 41.0 (SD 14.7) %. There was a significant association between bone mineral accretion and mean milk intake over the 5 years (r = 0.216, P = 0.04), but not between bone mineral accretion and mean calcium intake (r = 0.109, P = 0.3).
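The accretion figures above are linked by simple unit conversions; the sketch below uses only the values reported in this abstract, with the calcium fraction of bone mineral back-calculated from those values rather than assumed.

\[
\frac{199\ \text{g bone mineral/year}}{365\ \text{days}} \approx 545\ \text{mg/day}, \qquad \frac{164\ \text{mg Ca/day}}{545\ \text{mg/day}} \approx 0.30
\]

That is, the reported calcium accretion treats bone mineral as roughly 30% calcium, and a retention efficiency of 41% implies an average calcium intake of about 164/0.41 ≈ 400 mg/day, consistent with the stated intake of less than 500 mg/day.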
Conclusion - Despite their low calcium intakes of less than 500 mg per day, Chinese girls accumulated similar amounts of bone mineral during puberty as western girls consuming 1000 mg calcium per day (1). Milk intake of Chinese girls during puberty appears to have a beneficial effect on bone mineral accretion, independent of calcium intake.
References
Background – Selenium is an essential trace element with well-established antioxidant and redox-related biological roles. There is increasing evidence of its importance in the prevention of chronic diseases such as cancer.
Objectives – The aims of this study were to estimate the dietary selenium intake of a sample of northern Tasmanian adults, to determine dietary differences between genders, and to establish the major contributing food groups.
Design – A sample of 69 adults aged 23-74 yr, drawn largely from an electoral roll sample, was selected from the northern Tasmanian region, an area hypothesised to be at risk of inadequate selenium intakes due to low soil selenium content. Responses from a 121-item semi-quantitative FFQ, standard serving size data and food content data (ANZFA where possible, or USDA) were used to produce dietary selenium intake estimates.
Outcomes – Selenium intakes were not significantly different between genders; men (n=30) consumed 85.9 ± 24.6 μg/day (mean ± SD) while women (n=39) consumed 79.1 ± 26.4 μg/day. Sixty three percent of men and 38% of women had estimated selenium intakes below the Australian RDI (85 μg/day and 70 μg/day respectively). These mean values are higher than countries with established low selenium status such as New Zealand, but are much lower than selenium intakes that have been associated with chemopreventive effects in recent studies. The major contributing food groups for male and female subjects respectively were meat/fish (38% and 41%), cereal based foods (both 24%) and vegetables (9% and 10%).
Conclusion – While the estimated selenium intakes of this sample are sufficient to avoid symptoms of overt deficiency, a large proportion of participants consume less than the Australian RDI and most appear to receive considerably less than what may be considered an optimal intake. Further investigation of selenium intake in the Tasmanian population is warranted to identify the groups most at risk.
Background – Selenium (Se), an essential micronutrient, is incorporated into cells and proteins of the immune system, boosting immune function. Incorporation of micronutrients into biological materials (fortification) is proposed to improve micronutrient absorption. Recently, technologies have been developed to increase the incorporation of micronutrients into cow’s milk.
Objective – To determine the dose-response relationship between 25, 73 and 121 μg Se /100g diet and blood selenium concentrations and immune function in healthy 3 month old mice.
Design – Three-month-old mice were housed 3 per box and fed either non-fortified milk protein (MP, 18 μg Se/100 g total diet) or one of 3 Se-fortified MP diets (25, 73 and 121 μg Se/100 g total diet), combined with normal mouse chow. The fortification was achieved by feeding cows Se-enriched diets, allowing greater incorporation of Se into the cow’s milk proteins. All mice were vaccinated against Mem71 strain influenza virus at 21 d and euthanised at 49 d; peripheral blood was then collected via cardiac puncture and the spleen removed. In blood, Se concentrations and the activity of the immune enzyme glutathione peroxidase (GPx) were measured. In the spleen, non-specific proliferative capacity and Mem71 virus-specific proliferation were measured.
Outcomes – Blood Se concentrations increased linearly with increasing dietary Se concentration (P<0.001). This corresponded with increasing red blood cell GPx activity. Splenocyte proliferative capacity increased with 73 and 121 μg Se /100g diet compared to non-supplemented control, but the virus specific proliferation was highest when supplemented to 121 μg Se /100g diet.
Conclusion – Increasing Se fortification of cow’s milk linearly increased peripheral selenium concentrations, with corresponding improvement in immune function in healthy mice.
Background – While the Se intakes of Australian and New Zealand consumers are sufficient to ensure no overt signs of deficiency, the relatively low intakes may contribute to the risk of some cancers. However, Se supplementation is problematic, as a high Se intake can be toxic, particularly if the source is inorganic. Protein-bound Se is more bioactive and less toxic than inorganic forms of Se, and there is interest in delivering Se in organic forms in food products.
Objectives – To use real time RT-PCR to determine the colonic gene expression of important selenoproteins and genes involved in development of some colon cancers in rats fed Se-enriched casein or yeast.
Design – 51 male Sprague Dawley rats were fed diets containing either control casein (0.035 ppm Se), Se-enriched casein (1 ppm Se) from cows fed Sel-Plex® (HseC) or control casein plus Se-enriched yeast (1 ppm, as Sel-Plex®) (HseY). Rats were fed diets for 5 wk prior to initial s.c. azoxymethane (AOM) injection (15 mg/kg) in the lower abdomen. A 2nd dose of AOM was given one wk later and after 9 wks rats were euthanised and colon removed.
Outcomes – There was no significant effect of dietary Se on colonic GPx expression (+840 and +164% for HseC and HseY respectively, P=0.19). When the HseC diet was compared to the other diets there was an increase in GPx expression (+623%, P=0.078). While there was no significant effect of dietary Se on K-ras expression (-61% and -90%, P=0.15), when both high-Se diets were compared to the control diet the expression of K-ras was decreased by supplemental Se (-79%, P=0.10). There was no effect of dietary Se on expression of SelP (+31 and -50%, P=0.59).
Conclusion – These data suggest that dietary Se can influence the expression of key biomarkers of Se status and colon cancer formation, but the form of the Se may be important. Increased dietary Se either as yeast or casein reduced the expression of K-ras whereas only casein bound Se increased the expression of colonic GPx.
Background – Suboptimal status of selenium and iodine has been reported in New Zealand adults. This is likely to be exacerbated in older adults, who are particularly prone to inappropriate dietary intakes and inadequate nutrient status. Both selenium and iodine are essential for optimal thyroid hormone metabolism. To our knowledge, no studies have investigated the effect of a combined selenium and iodine intervention on thyroid status in a region that is low in both selenium and iodine.
Objectives – To assess the efficacy of selenium, iodine, and combined selenium and iodine supplementation on thyroid hormone status in older adults, in comparison with a placebo.
Design – A randomized, double-blind intervention trial was conducted between August and November 2005. Participants aged 60 to 80 years (n=102) consumed daily supplements of 100 μg selenium as selenomethionine, 80 μg iodine, 100 μg selenium plus 80 μg iodine, or placebo for 12 weeks. Fasting, morning blood samples were taken at baseline and at weeks 2, 4, 8 and 12 for measurement of thyroid hormone status (plasma TSH, free T3 and T4, and thyroglobulin) and selenium status (plasma selenium and whole blood glutathione peroxidase (GPx) activity). Median urinary iodine concentration (MUIC) was determined at baseline and at week 12 from casual urine samples.
Outcomes and Conclusions – Participants had a mean (SD) age of 73 (4.8) years and an average BMI of 27.4 (4.3). The MUIC was 48μg/L (IQR 31, 79), a level indicative of moderate iodine deficiency. Mean plasma selenium was 94.5 (25.7) μg/L and the correlation between plasma selenium and whole blood GPx was 0.329 (P=0.001). Selenium supplementation had no significant effect on plasma thyroid hormone concentrations. These results suggest the selenium intake of many of these older adults is still insufficient for optimal GPx activity, yet adequate for thyroid hormone status.
Background – Many elderly people are at increased risk of poor nutritional intake as a result of increased sedentary behaviour and low food consumption, compounded by complex social and psychological factors. Most studies investigating the nutritional status of the elderly focus on those in nursing homes (high level care) or free-living elderly in the community. Few studies have examined those living in hostels (low level care).
Objective – To determine the nutritional status of elderly residents in low level care.
Design – Cross sectional study of 77 residents (59 women, 18 men) from 10 low level care facilities in metropolitan Melbourne. Nutrient intake was assessed by three day weighed food records; body composition by DXA; and height and weight via usual methods. Blood samples were drawn and analysed for serum albumin.
Outcomes – Residents were aged 83 ± 18.1 years (mean ± SD) with a mean BMI of 27.1 ± 4.7 kg/m² in women and 25.7 ± 3.5 kg/m² in men. Few women (9.4%) or men (6%) were underweight (BMI <22 kg/m²), while almost half were overweight or obese (47% of women and 44% of men with BMI >27 kg/m²). A significant proportion of residents were sarcopenic as determined by appendicular skeletal muscle mass/ht² (24.5% of women <5.4, 43.7% of men <7.26). Moreover, 17% of women with a BMI >22 kg/m² were sarcopenic, as were 31.2% of men. Only five women (9.2%) and no men had low serum albumin. Average energy intakes were 6.5 ± 1.9 MJ/day for women and 8.1 ± 1.6 MJ/day for men.
Average intakes of protein, fibre, calcium, zinc, magnesium and folate were low (56.3 ± 17.1 g/day, 16.5 ± 4.8 g/day, 668 ± 284 mg/day, 7 ± 2 mg/day, 215 ± 69 mg/day and 207 ± 69 μg/day, respectively), with many residents not meeting the EAR or RDI. There were no gender differences in nutrient intake after correction for energy intake.
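For clarity, the sarcopenia classification used in the Outcomes above is based on a height-adjusted muscle mass index; the formula below restates it, with the units (kg/m²) assumed from the usual convention for this index rather than stated in the abstract.

\[
\text{sarcopenia index} = \frac{\text{appendicular skeletal muscle mass (kg)}}{\text{height (m)}^{2}}
\]

with the reported cutoffs of <5.4 for women and <7.26 for men.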
Conclusions – Elderly people living in low level residential care demonstrate a degree of poor nutritional intake. They may be sarcopenic even when apparently within a healthy weight range. Periodic monitoring of residents may help ensure that nutritional deficiencies are addressed and poor nutritional status does not become chronic in this group.
Background – Postmenopausal women are at increased risk of secondary hyperparathyroidism resulting in reduction in bone mass because of loss of the beneficial effects of estrogen on stimulation of intestinal calcium absorption and reduction of renal calcium excretion.
Objective – To evaluate the relative benefits of vitamin D and calcium supplementation compared with calcium alone on hip bone mineral density (BMD) in ambulant elderly Western Australian women aged 70-80 years at baseline.
Design – A 5-year randomised controlled double-blind study of 40 women per group, assigned to receive daily either 1000 IU vitamin D and 1200 mg calcium carbonate (CaD group), 1200 mg Ca and placebo vitamin D (Ca group), or placebo Ca and placebo vitamin D (placebo group) over the 5 years. Hip BMD was measured by DXA at baseline and at years 1, 2, 3 and 5 using an identical protocol. Vitamin D status was measured at baseline.
Outcomes – The mean baseline age of subjects was 74.5 ± 2.4 years, mean total hip BMD by DXA 817 ± 99 mg/cm² and mean 25(OH)D 68.0 ± 28.7 nmol/L; 43% had vitamin D insufficiency (25(OH)D <60 nmol/L). There were no baseline differences between the groups. Adjusted for baseline values, both the Ca and CaD groups had significantly better maintenance of hip structure than the placebo group at 1 year, and these effects were maintained in the CaD group at 3 and 5 years.
Conclusions – Addition of vitamin D to calcium may have long-term beneficial effects on bone structure in elderly postmenopausal Australian populations on the 35th parallel South.
Background – Malnutrition is common in the elderly. Vitamin B12 is of particular interest as deficiency is linked to chronic diseases including dementia, osteoporosis, stroke and macular degeneration.
Objective – To examine the extent and determinants of malnutrition, including micronutrient deficiencies in an aged care rehabilitation unit.
Design – Patients (n=67, age >60 y) admitted to Calvary Health Care Sydney underwent screening using the Mini Nutritional Assessment (MNA) and a modified MNA (mMNA) incorporating Australian protein and anthropometric indices. Biochemical screening assessed protein and micronutrient status. A sub-group of patients (n=22) underwent additional investigations to determine vitamin B12 status, as assessed by methylmalonic acid (MMA) and homocysteine concentrations, and dietary intake using a semi-quantitative food frequency questionnaire (FFQ).
Outcomes – Eighteen percent and 27% of patients were malnourished as determined by the MNA and mMNA, respectively, with both tools indicating that 61% of patients were at risk of malnutrition. Twenty-six subjects (38%) were deficient in vitamin B12 (based on values <220 pmol/L) and 53 subjects (79%) were deficient in vitamin D (<50 nmol/L). A positive correlation was found between serum folate concentrations and the mMNA score (r= +0.28, P<0.05). Hyperhomocysteinemia (>12 μmol/L) was found in 73% of subjects (n=16), and was significantly associated with lower serum- and erythrocyte-folate concentrations. Thirty-six percent (n=8) of subjects had elevated MMA, indicating cellular vitamin B12 deficiency, but all had serum vitamin B12 in the normal range. All subjects had adequate intakes (>77% of the 1991 RDI) of vitamin B6, B12 and folate.
Conclusions – The present study highlights the prevalence of malnutrition and of vitamin D and vitamin B12 deficiencies, the latter occurring even in subjects with normal serum vitamin B12 levels. This study supports routine screening and appropriate supplementation of micronutrients.
Background – Falls and fractures are common in aged-care residents and are a costly health burden. Sub-optimal nutritional intake, especially that of calcium, protein and vitamin D may contribute to fragility fractures and falls risk either by directly affecting metabolic process or functions, or by affecting body composition.
Objectives – To determine the risk factors for falls and fractures in ambulatory aged care residents.
Design – Cross sectional analysis of 82 aged care residents (mean age 86.2 yrs) from 18 hostels in Melbourne. Nutrient intake was determined from 3-day weighed food intake records. Bone mineral density (BMD) was ascertained using DXA. Balance, leg strength and functional capacity were measured using established methods. Blood samples were drawn and analysed for 25(OH) vitamin D, PTH, creatinine, albumin, and markers of bone metabolism. Medical records were reviewed to determine medication use, medical conditions and fracture history. Prospective falls data were collected over a 12-month period.
Outcomes – Over the 12 months, falls were reported in half of the residents, with 100 falls recorded. One third of residents had a history of fractures. Mean calcium (668 mg/day) and protein (0.82 g/kg BW) intakes were below recommended levels. 34% had vitamin D levels below 30 nmol/L. Residents with a history of fractures consumed less calcium and had less lean mass, after adjusting for size (P<0.05). Fracture sufferers had reduced leg strength, lower femoral neck BMD (P<0.05), and tended to have reduced functional capacity (P<0.1). Fallers were heavier and had a higher BMI than non-fallers (P<0.05), and tended to demonstrate more body sway (P<0.1). Those with vitamin D levels below 30 nmol/L tended to have lower functional capacity and more body sway (P<0.1).
Conclusion – Risk factors for falls and fractures appear to differ. The potential for vitamin D to improve balance and functional capacity, and for calcium to reduce fracture risk, in this group warrants further investigation. Protein intake was not related to falls or fractures in this group; however, frank protein deficiency may still contribute to risk.
Background – The elderly living in residential aged-care facilities are particularly at risk of nutritional inadequacies due to their overall health status, state of dependency and general lack of control over food provision.
Objective – To measure energy and nutrient intakes in aged-care facilities and evaluate against dietary recommendations.
Design – A three-day weighed food record, incorporating all main meals and snacks, was performed on 83 participants ranging in age from 65 to 94 years in five Melbourne residential aged-care facilities. Dietary intake data were analysed using FoodWorks 3.01. Mean energy and nutrient intakes were compared to nutrient reference values (NRV).
Outcomes – Estimated energy requirements (EER) were met (M 126% ± 41% of EER; F 122% ± 27%; all data are mean ± SD). Estimated average requirements (EAR) for protein were also met (M 109% ± 56%; F 140% ± 36%). Carbohydrate provided 50% ± 7% of energy for males and 48% ± 5% for females, with sugars making up a large proportion of carbohydrate intake (M 28% ± 9%; F 25% ± 6%). Fat provided 34% ± 7% of energy for males and 36% ± 5% for females, and saturated fat intake was high (M 17% ± 5%; F 16% ± 2%). Fibre intake was low (M 56% ± 28% of the adequate intake (AI); F 66% ± 24%). Participants failed to meet the EAR for calcium (M 94% ± 50%; F 88% ± 41%) and magnesium (M 77% ± 26%; F 94% ± 27%), and potassium intake was below the AI (M 79% ± 29%; F 96% ± 24%). Males failed to meet the EAR for zinc (77% ± 27%). Sodium intake in both sexes was exceptionally high (M 308% ± 92% of AI; F 247% ± 77%).
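For clarity, each figure above expresses mean group intake relative to the corresponding nutrient reference value: % of NRV = 100 × (mean daily intake ÷ reference value). As a purely illustrative example (the EAR figure used here is assumed for the arithmetic, not taken from the study), a mean male calcium intake of 1,034 mg/day against an EAR of 1,100 mg/day would be reported as 100 × 1034/1100 ≈ 94% of the EAR.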
Conclusion – Results of this study can be used to guide education programs for food service staff to ensure the nutritional requirements of residents are met in the future.
Background – Anaemia is relatively common in pregnancy, and the most common cause is iron deficiency. Despite the lack of evidence on the most effective way to treat anaemia in pregnancy, it is often treated with high doses of iron, which may cause gastrointestinal side effects, interfere with mineral absorption and cause haemoconcentration. Both severe anaemia and haemoconcentration have been linked with adverse pregnancy outcomes.
Objective - To compare the efficacy and side effects of low dose vs. high dose iron supplements in treating anaemia in pregnant women.
Design – Double blinded randomised dose response trial. Eligible pregnant women with anaemia (haemoglobin <110g/l) at the mid-pregnancy routine blood test were randomly allocated to receive either 20mg, 40mg or 80mg of iron daily for 8 weeks. Iron status of the women was assessed at the end of treatment. Gastrointestinal side effects were assessed every 2 weeks. Information on pregnancy outcomes and pregnancy complications were collected from medical records.
Outcomes – A total of 180 women were enrolled and 179 completed the study. At the end of treatment there was a clear dose response of increasing Hb concentration with iron dose (111 ± 13g/L at 20mg/day, 114 ± 11g/L at 40 mg/day, 118 ± 13g/L at 80mg/day, P=0.015). However, the incidence of moderate anaemia (Hb<100g/L) or outcome of pregnancy did not differ between groups. Gastrointestinal side effects (including nausea, stomach pain and vomiting) also increased with iron dose (P<0.05). Similarly, there was a tendency for more women to have haemoconcentration, defined as Hb>130g/L, with increasing iron dose (4%, 7%, 13%, P=0.175).
Conclusions – Although high dose iron supplements are more effective in increasing Hb levels in pregnancy, they are associated with more gastrointestinal side effects and may also be associated with a higher risk of haemoconcentration. Further research is needed to determine the optimal levels of Hb in pregnancy and the most effective and safe dose of iron to treat anaemia in pregnancy.
Adolescence is a time of rapid physical and psychosocial change. Physical changes include the growth spurt and pubertal maturation, which require nutritional adequacy and which result in significant changes in body composition. The psychosocial tasks of adolescence, which are essential to the transition to functioning adulthood, include increased independence from family of origin and integration into the peer group, which in turn alter many lifestyle behaviours. There is evidence that the nutritional habits acquired in adolescence persist into young adulthood. Meal skipping, snacking, eating away from home and binge drinking may all affect dietary balance. There are also changes in physical activity throughout adolescence that may alter energy balance, particularly for females who reduce their activity levels. Puberty itself is a risk factor for obesity in adolescence, particularly in females.
Adolescent nutritional issues include underweight and overweight, as well as specific nutrient deficiencies, restrictive and fad dieting behaviours and the use of nutritional supplements. The presence of an eating disorder is a common cause of underweight, and the diagnosis carries implications for both adolescent and future morbidity. A growing group of adolescents with potential for nutritional problems are the survivors of chronic childhood illness whose life expectancy has increased with improved medical and surgical therapies. These young people need to transition into adult care and are at risk of inadequate engagement with adult services. Nutritional advice and intervention need to take into account both the illness-specific needs and the specific needs of adolescence.
The prevalence of overweight and obesity among children and adolescents has been increasing since the mid-1980s in most developed economies, as well as in developing economies, making overweight one of the most common chronic disorders of childhood and adolescence. Adolescence is a “critical period” for the development of adult obesity, with obese adolescents having a 70-80% risk of becoming obese adults. Overweight and obese adolescents suffer a range of immediate and longer-term health and psychosocial problems. For these reasons, effective management of overweight and obesity during adolescence is a priority. Barriers to successful intervention include factors intrinsic to adolescence, as well as the failure to perceive overweight as being present. Other barriers are the belief that overweight will resolve spontaneously at puberty and unfounded concerns that weight management might impair the adolescent growth spurt or induce an eating disorder.
The NH&MRC Clinical Practice Guidelines for the management of overweight and obesity in children and adolescents were released in 2003, including a general practitioner resource. Unfortunately, these guidelines were not supported by any funding to educate health professionals, and it is impossible to determine their impact. There are about 10 published randomised controlled trials of obesity management among adolescents, of which a third are pharmacological studies. The majority of these randomised controlled trials were performed in North America, generally in a tertiary care setting, and all involved intense behavioural management support. Such studies provide guidance as to the efficacy of treatment interventions for adolescents in resource-intensive settings. Research interventions are costly to operate, making them difficult to sustain, and tertiary care settings are unlikely to have the capacity to meet the high need or to be sufficiently convenient for the potentially large numbers of young people who may require therapy. Interventions that operate at a sustainable intensity in accessible, community-based settings are required, as these do not exist for adolescents in Australia. However, there is no high-level evidence regarding what types of interventions may be useful.
A major challenge when targeting adolescent health is to ensure easy access and retention. When adolescents are asked their views of an ideal health service, their suggestions include group programs, wide publicity, youth-specific services, confidentiality, respect and location in an informal setting with welcoming staff. These findings are highly relevant to the development of a successful adolescent weight management program, or indeed any nutrition program. Community health centres are readily accessible by adolescents and offer a multi-disciplinary approach and competence in group programs, both of which are well-established elements of adult obesity management. However, information is needed on whether a community-based weight management approach is effective for adolescents. Initial results from an evaluation of a community-based weight management program for adolescents, the Loozit study, have shown positive results in weight management and lifestyle competency, and the program is presented as a model of how adolescent interventions might best be accomplished.
Background – Malnutrition is common in the elderly. Vitamin B12 is of particular interest as deficiency is linked to chronic diseases including dementia, osteoporosis, stroke and macular degeneration.
Objective – To examine the extent and determinants of malnutrition, including micronutrient deficiencies in an aged care rehabilitation unit.
Design – Patients (n=67, age >60 y) admitted to Calvary Health Care Sydney underwent screening using the Mini Nutritional Assessment (MNA) and a modified MNA (mMNA) incorporating Australian protein and anthropometric indices. Biochemical screening assessed protein and micronutrient status. A sub-group of patients (n=22) underwent additional investigations to determine vitamin B12 status, as assessed by methylmalonic acid (MMA) and homocysteine concentrations, and dietary intake using a semi-quantitative food frequency questionnaire (FFQ). Outcomes – Eighteen per cent and 27% of patients were malnourished as determined by the MNA and mMNA, respectively, and both tools indicated that 61% of patients were at risk of malnutrition. Twenty-six subjects (38%) were deficient in vitamin B12 (values <220 pmol/L) and 53 subjects (79%) were deficient in vitamin D (<50 nmol/L). A positive correlation was found between serum folate concentrations and the mMNA score (r=+0.28, P<0.05). Hyperhomocysteinemia (>12 μmol/L) was found in 73% of the sub-group (n=16) and was significantly associated with lower serum and erythrocyte folate concentrations. Thirty-six percent (n=8) of the sub-group had elevated MMA, indicating cellular vitamin B12 deficiency, but all had serum vitamin B12 in the normal range. All subjects had adequate intakes (>77% of the 1991 RDI) of vitamins B6 and B12 and folate.
Conclusions – The present study highlights the prevalence of malnutrition and of vitamin D and vitamin B12 deficiencies, the latter occurring even in subjects with normal serum vitamin B12 levels. This study supports routine screening and appropriate supplementation of micronutrients.
Background – Recent cross-sectional research suggests that iron stores diminish with age in the first two years of life.
Objectives – To determine the dynamics of serum ferritin (SF) concentration over a five month period in a sample of healthy 12-20 month old New Zealand (NZ) children.
Design – In a 20-week randomised placebo-controlled trial, 225 toddlers were assigned to one of three groups, including a placebo group (n=90) in which the toddlers’ regular milk was replaced with unfortified (<0.1 mg Fe/100 mL) cow’s milk. Three-day weighed dietary intakes were recorded. Non-fasting venipuncture blood samples and anthropometric measures were collected at baseline and 20 weeks. Suboptimal iron status was defined as: “depleted iron stores”, SF ≤10 μg/L; “iron deficiency” (ID), haemoglobin (Hb) ≥110 g/L and two or more abnormal values for SF, mean corpuscular volume (≤73 fL) and zinc protoporphyrin (≥70 μmol/mol haem); or “iron deficiency anaemia” (IDA), Hb <110 g/L and ID.
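A minimal sketch of how these case definitions could be applied programmatically is given below; the function and its inputs are hypothetical illustrations, with the thresholds taken directly from the Design above.

def classify_iron_status(sf, hb, mcv, zpp):
    # sf: serum ferritin (ug/L); hb: haemoglobin (g/L);
    # mcv: mean corpuscular volume (fL); zpp: zinc protoporphyrin (umol/mol haem)
    abnormal = sum([sf <= 10, mcv <= 73, zpp >= 70])  # count of abnormal iron indices
    if hb < 110 and abnormal >= 2:
        return "iron deficiency anaemia"
    if abnormal >= 2:
        return "iron deficiency"
    if sf <= 10:
        return "depleted iron stores"
    return "iron replete"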
Outcomes – At baseline the children (n=71) were predominantly boys (58%), and had a mean (SD) age of 16.6 (2.7) months. The mean (SD) intake of dietary iron was 5.6 (2.7) mg/d and 28.8% (95% CI 18.3 to 39.4) had iron intakes below the Australian & NZ EAR for iron (4 mg/d). The prevalence (95%CI) of suboptimal iron status increased from 13.2% (6.2 to 23.6) at baseline to 17.6% (9.5 to 28.8) at 20 weeks. Mean SF concentration (95%CI) declined from 22.8 (19.9 to 26.0) μg/L to 18.6 (16.1 to 21.4) μg/L over 20-weeks (P=0.0359). There was no change in Hb concentration (P=0.1968). Faster growth in length was associated with lower SF concentration (P=0.0004) but not Hb concentration (P=0.3779). Younger girls had lower Hb concentration than boys, and older girls had higher Hb than boys (P=0.0177).
Conclusions – Iron stores decreased over 20 weeks among healthy 12-20 month old NZ children who followed typical NZ toddler diets. Faster growth in length appeared to have contributed to diminishing iron stores. Dietary intervention strategies aiming to improve iron status of NZ toddlers should be assessed.
Background – Folic acid taken before conception and through early pregnancy reduces the risk of neural tube defects. The amount of folic acid (400 μg/d) proven in randomised control trials to prevent neural tube defects far exceeds that readily obtained by eating foods naturally rich in folate. Therefore, population strategies to reduce rates of neural tube defects require either the addition of folic acid to foods that are consumed regularly by women of child-bearing age or increasing the proportion of women who take folic acid supplements during the critical period. Food Standards Australia New Zealand has recently proposed (P295) to mandate the addition of folic acid (230-280 μg/100 g) to bread-making flour and estimates this will increase the mean intake of folic acid in the target population by 100 μg/d in Australia and 131 μg/d in New Zealand.
Review – Mandatory fortification has the advantage that it reaches most women regardless of educational or socio-economic background, but it suffers because the need to minimise high folic acid intakes in the young and the old limits the level of fortificant (μg folic acid/100 g food) to an amount at which only a small percentage of women receive 400 μg/d. Supplementation, on the other hand, has the advantage that it provides folic acid at the correct dose and time directly to – and only to – the target population; the disadvantage is that almost half of pregnancies are unplanned. Overseas experience shows that in regions with high rates of neural tube defects and low folate status, increased population intakes of folic acid reduce neural tube defect rates dramatically. However, in regions where neural tube defect rates are low and folate status is high, the effect of increasing population intakes of folic acid is uncertain. The rates of neural tube defects in Australia and New Zealand rank low by international comparison. Furthermore, a population-based sample of Dunedin women (18-45 y) suggests high folate status in New Zealand. These conditions are likely to influence the effectiveness of mandatory folic acid fortification or supplementation programmes.
Background – Analytical methods have been a limitation in the study of folates due to their inability to distinguish accurately between the added form (folic acid) and naturally occurring forms in foods. This is critical in view of the need for accurate data in establishing folate composition, requirements and assessing the bioavailability of the vitamin. The complexity, diversity and instability of folates are substantial obstacles encountered in the development and selection of analytical methods. The analysis of folates is further complicated by difficulties in sample preparation, which include extraction, deconjugation and extract purification. Currently, there are three main analytical approaches: microbiological assay (MA), high pressure liquid chromatography (HPLC) and biospecific procedures including enzyme protein binding assays (EPBA), enzyme linked immunosorbent assays (ELISA) and radioassays (RBPA). Folate in foods is commonly measured by microbiological assay, which is based on the assumption that L. rhamnosus (the most commonly used folate-dependent microorganism) has identical growth responses to the mono-, di- and triglutamyl folate structures present in foods. However, there is much debate over this assumption, as some investigators report that the L. rhamnosus response is similar for all forms (O’Broin et al., 1975; Shane, Tamura and Stokstad, 1980) while others do not (Phillips and Wright, 1982; Goli and Vanderslice, 1992). High pressure liquid chromatography separation techniques with ultraviolet and/or fluorescent detection have been documented to detect the different forms of folate, but in many instances these methods lack specificity and have failed to reach the required detection limits due to matrix interference by breakdown products that arise during extraction (Shane, 1986). Recently, LC-MS methods, which combine the separation power of reversed phase chromatography with the superior detection capability of mass spectrometry, have been reported as an acceptable and accurate approach to the analysis of folates in foods and biological materials (Rychilik et al, 2003).
Objectives – The main objective is to critically review the current status of folate analysis methods. This session will provide a discussion of 1. strengths and limitations of the various methods of analysis, 2. considerations in the selection of existing analytical methods and 3. further research needs concerning folate analysis, with an added review of the liquid chromatography mass spectrometry methods available today. Quantification of folates is performed using ¹³C isotopically labelled internal standards.
Review – Though the microbiological assay is the most commonly used, it is time-consuming and requires great care and skill; moreover, it cannot quantify the different forms of food folates, and whether microorganisms respond differently to the different forms is still in question. Immunoassay techniques are quick, easy and cheaper but are not suitable for food folate determination. HPLC has proven to be a better analytical technique, and more recently liquid chromatography mass spectrometry (LC-MS) techniques offer better sensitivity and specificity to accurately quantify folates in foods and biological samples. It appears that the method of choice depends on the purpose. Conclusions – The LC-MS/MS techniques offer an accurate, reproducible and reliable method for profiling and quantifying the folate forms present in foods. This new method provides enhanced sample throughput of 36 samples per 12 hours.
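As a hedged sketch of the stable-isotope-dilution quantification mentioned in the Objectives (the abstract does not give the calibration model, so the response factor below is an assumption of the illustration), the mass of each folate vitamer is obtained from the analyte-to-internal-standard peak-area ratio:

m(analyte) = (A(analyte) / A(IS)) × Rf × m(IS)

where A denotes the LC-MS/MS peak area, m(IS) the known mass of ¹³C-labelled internal standard added before extraction, and Rf a response factor determined by calibration with standard mixtures; dividing by the sample weight then gives the folate content in μg/100 g.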
References
Background – Folate is a B vitamin that acts as a coenzyme in single-carbon transfer reactions to synthesise components of DNA, RNA and proteins. Folate deficiency causes megaloblastic anaemia and neural tube defects. In addition, the association of folate with cardiovascular disease, certain cancers and cognitive disorders is under active research. In many countries folate intake has been shown to fall below recommendations, and mandatory or voluntary fortification is therefore widely practised. In countries that do not practise fortification, enhancing folate levels by other means is important. In Finland, cereal products provide ca. 40% of daily folate intake, so increasing folate levels in cereal products further would significantly affect the folate status of the population.
Review – We have studied the possibilities of increasing folate levels in cereal products by selecting optimal cereal raw materials and by utilising in situ folate synthesis. We have mainly focused on rye and wheat fractionation and on bioprocesses such as fermentation, malting and germination. Variation in folate levels between varieties has also been investigated. A microbiological assay on microtiter plates after tri-enzyme extraction was used to measure total folate contents, and HPLC after purification with affinity chromatography was used to determine the folate vitamer composition of selected samples.
Genetic and environmental factors affect folate levels in cereal raw materials. However, fractionating the grains and utilising the folate-rich fractions was shown to be a more efficient means of enhancing folate levels in cereal raw materials, and thus in the final products. Fractionation of rye grains with a laboratory-scale roller mill into bran, shorts and two flours led to products with folate contents ranging from ca. 10 μg/100 g (flour) to ca. 110 μg/100 g (bran). In wheat fractions, taken from a commercial-scale mill, the folate contents increased linearly with the ash content up to 4% ash. In rye milling fractions the correlation was less clear-cut; however, when the ash content exceeded 3% (approximating 25% fibre content) the folate levels were relatively high, ca. 110–130 μg/100 g.
Both germination/malting and fermentation enhanced folate contents. As an example, germination of rye grains for six days at 25 °C led to 3.5-fold higher levels compared with the ungerminated grains. Fermentation with yeast in rye and wheat baking processes was shown to significantly enhance folate levels: for instance, a total folate content of 62 μg/100 g in the flour rose to 162 μg/100 g after the fermentation step in rye bread baking. Some bacteria also produce folate.
Conclusions – Utilization of folate-rich cereal fractions would significantly increase folate levels and simultaneously contents of many other bioactive compounds in cereal-based foods. Developing further bioprocesses, especially fermentation, offers a useful means to enhance folate contents in foods available for the entire population.
Background – There are various polymorphisms in the genes coding for enzymes and carriers involved in folate metabolism that are known to affect folate distribution and disposition. Recently, a single nucleotide polymorphism (SNP) in the reduced folate carrier (RFC) has been found to modulate the uptake of folate by cells. The SNP, a change from guanine to adenine at position 80 of exon 2 of the gene (G80A RFC), leads to an arginine replacing a histidine in the expressed RFC protein.
Objectives – As the G80A RFC SNP may affect the absorption of dietary folate and its uptake by cells, the aim was to determine whether it affected the associations between dietary folate and serum and red cell folate.
Design – Subjects (119, 52 males, 67 females) were recruited from a retirement village. Dietary folate intake was assessed by food frequency questionnaire, serum and red cell folates were measured by immunoassay and Pearson correlation coefficients (r) and their significance (P < 0.05) were determined using SPSS.
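The genotype-stratified correlations reported below could, for illustration, be computed along the following lines; this Python sketch is not the study's analysis (which used SPSS), and the file and column names are hypothetical.

import pandas as pd
from scipy.stats import pearsonr

# Hypothetical data file: one row per subject with columns
# genotype (GG/GA/AA), dietary_folate, serum_folate, red_cell_folate
df = pd.read_csv("folate_cohort.csv")

for genotype, grp in df.groupby("genotype"):
    for outcome in ("serum_folate", "red_cell_folate"):
        r, p = pearsonr(grp["dietary_folate"], grp[outcome])
        print(f"{genotype} vs {outcome}: r = {r:.3f}, P = {p:.3f}, n = {len(grp)}")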
Outcomes – Dietary folate intake was significantly associated with serum folate in the elderly having the GG (r = 0.524; P = 0.002) and GA (r = 0.408; P = 0.002) genotypes but not in those with the AA (r = 0.347; P = 0.060) genotype. Similarly, dietary folate was significantly associated with red cell folate in the GG (r = 0.399; P = 0.022) and GA (r = 0.564; P < 0.0001) but not in the AA genotypes (r = 0.223; P = 0.236).
Conclusions – The G80A RFC SNP modulated the association of dietary folate intake with serum and red cell folate in this elderly population, with significant associations in the GG and GA genotypes but not the AA genotype.
Background – Pregnancy is a condition in which the glycaemic index (GI) concept may be of particular relevance, because maternal glucose is the main energy substrate for intrauterine growth (1-2).
Objectives – The aim was to compare the effects of a low GI and conventional dietary strategy on pregnancy outcomes in healthy women. Compliance and acceptability were also investigated.
Design – Volunteers were assigned alternately to receive dietary counselling that encouraged either low GI carbohydrate foods (LGI) or high fibre, moderate-to-high GI foods (HGI) and studied five times between <16 weeks gestation and delivery. Of the 70 women who met the inclusion criteria, 62 completed the study (32 in LGI and 30 in HGI). Primary outcomes were measures of foetal size.
Outcomes – Mean diet GI fell significantly in the LGI group but not the HGI group. Compared with the LGI group, women in the HGI group gave birth to infants who were heavier (3408 ± 78 vs 3644 ± 90 g respectively, P = 0.051) with higher birth centiles (48 ± 5 vs 69 ± 5, P = 0.005), higher ponderal index (2.62 ± 0.04 vs 2.74 ± 0.04, P = 0.03) and higher prevalence of large-for-gestational age (3% vs 33%, P = 0.01). There was no effect of diet composition on maternal weight gain, method of delivery or indirect measures of insulin sensitivity. Compared with baseline, only the LGI group reduced intake of saturated fat. Women in the LGI group found the diet easier to follow. Conclusion – Since birth weight and ponderal index may predict chronic disease in later life, a low GI diet may favourably influence long-term outcomes.
Background – Cortisol is a key hormone in the response to stress, and depression, anxiety and stress are associated with increased daily cortisol secretion. Dietary factors may influence daily cortisol secretion.
Objective – To compare the effects on cortisol secretion of two test diets, a high-calcium diet rich in low-fat dairy foods (HC) and a low-sodium, high-potassium diet rich in fruits and vegetables (LNAHK), with those of a moderate-sodium, high-potassium, high-calcium “DASH”-type diet high in fruits, vegetables and low-fat dairy foods (OD).
Design – In a crossover design, subjects were randomised to two test diets of 4 wk each, the OD and either HC or LNAHK, each preceded by a 2 wk control diet (CD). Saliva samples were collected in the morning and at 1200 h, 1600 h and 2000 h for 1 d at the end of each diet.
Outcomes – Seventy-four subjects completed the study (29 men, 45 women), with a mean (SD) age of 56.3 (9.8) yr and a mean BMI of 29.2 (3.8) kg/m². Cortisol variability was high for morning samples (176% CV); however, afternoon/evening samples (area under the curve (AUC), nmol·L⁻¹·8 hr⁻¹) had less variation (30% CV). CD cortisol concentrations predicted the change in AUC: for the OD β = -0.8 (0.1) (SEM), for LNAHK β = -0.7 (0.1) and for HC β = -0.7 (0.1) (R²: 0.4-0.6). The percentage change in AUC was lower on the HC diet than on the OD diet (P=0.058), and significantly lower than on the LNAHK diet (P<0.05). Conclusion – Consumption of 3-4 serves/day of dairy foods resulted in a fall in cortisol secretion, compared with the rise seen on the two diets requiring some dietary restrictions. This suggests increased dairy intake may have beneficial effects on cortisol secretion in the afternoon/evening period.
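For clarity, the afternoon/evening AUC above spans the 1200 h to 2000 h saliva samples. Assuming trapezoidal integration over the three samples taken 4 h apart (the abstract does not state the integration method), AUC ≈ 4 × (C1200 + C1600)/2 + 4 × (C1600 + C2000)/2 = 2·C1200 + 4·C1600 + 2·C2000, where C denotes the salivary cortisol concentration (nmol/L) at each time point.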
Background – Endothelial dysfunction is a key feature of type 2 diabetes (T2D) and plays a significant role in the
early development of atherosclerosis. While lifestyle interventions incorporating weight loss and increased physical activity are advocated as the first line of treatment for T2D, the effects of weight loss, particularly when combined with exercise training on endothelial function in patients with T2D are largely unknown.
Objectives - To compare the effects of a moderate energy restricted diet, with and without aerobic exercise training on endothelial function, oxidative stress and established markers of cardiovascular risk in patients with T2D.
Design – Using a parallel randomised controlled study design, 29 sedentary, overweight and obese patients with T2D followed a 12-week moderate energy restricted diet (~5000 kJ/day, ~30% energy deficit consisting of two meal replacements and one self-prepared high protein meal) whilst either maintaining their habitual physical activity levels (DO, N=16) or undertaking a progressive aerobic exercise training program (DE, N=13).
Outcomes – Both interventions resulted in significant reductions in body weight (DO 9.5%, DE 9.0%, P<0.001 for time), body fat (14.3%), waist circumference (9.3%), blood pressure (7/4 mmHg), fasting glucose (24%), HbA1c (18%), triglycerides (38%), total cholesterol (12%, P=0.001) and malondialdehyde (28%, P<0.001), but there were no differences in the magnitude of these effects between treatments. At baseline, endothelial function assessed by brachial artery flow-mediated dilatation (FMD) was similar in both groups (DO 2.5 ± 5.7%, DE 4.3 ± 4.6%; P=0.26) and did not change after the interventions (P=0.76).
Conclusion – In overweight and obese patients with T2D, weight loss with and without aerobic exercise training did not improve FMD, but was effective in improving glycemic control and a range of cardiovascular risk factors. Acknowledgment – This study was supported by the Diabetes Australia Research Trust, Pharmacy Health Solutions Pty Ltd., CSIRO Human Nutrition and the ATN Centre for Metabolic Fitness.
Obesity is a serious public health problem in Australia. The most recently available nationally representative data, from the 1990s, suggest that approximately 19-23% of Australian children and adolescents are overweight or obese, and there is evidence that the prevalence is increasing. Data from five surveys covering the period 1969 to 1997 show that the prevalence of overweight increased by 60-70% and the prevalence of obesity trebled. The changes in prevalence between 1969 and 1985 were smaller than those over the subsequent 12 years, indicating that the rise in obesity may also be accelerating.
Treatment of established obesity is challenging and a focus on primary prevention is essential. Currently, there is considerable interest in the role of birth size, a proxy for prenatal growth, growth in infancy and childhood, early puberty and the risk of obesity. Several studies, including data from a prospective study in western Sydney, the Nepean Study, have shown that lower birth size is associated with early pubertal development and increased risk of adult disease including cardiovascular disease and type 2 diabetes.
Most studies report a direct association between birth weight and attained body mass index (BMI): high birth weight is associated with a high BMI. This positive association appears to contradict evidence that low birth weight programs an increased risk of cardiovascular disease and type 2 diabetes. However, BMI is a measure of relative weight and does not distinguish between lean and fat mass. There is limited evidence, including results from the Nepean Study, that lower birth weight is associated with a subsequently greater abdominal or central fat mass and a higher ratio of fat mass to lean mass. In contrast, high birth weight is associated with a relatively greater proportion of lean mass.
Both obesity and birth size have also been associated with earlier pubertal development. It is not clear if increased adiposity in childhood causes earlier sexual maturation, if earlier sexual maturation induces an increase in adiposity, or whether both these phenomena co-occur. Results from the Nepean Study indicate that after adjusting for BMI at 8 years of age girls who were long and light at birth attained menarche 1 year earlier (mean age of menarche ± SEM: 12.0 ± 0.3 yr) than girls who were short and heavy (13.0 ± 0.3 yr). Early maturation with obesity has been associated with polycystic ovarian syndrome, insulin resistance and a higher risk of adult disease including metabolic syndrome and cancer.
The biological basis for the association between birth size, early pubertal development and increased adiposity has been attributed to developmental plasticity and rapid or compensatory growth in infancy and early childhood. Infants who have been growth-restrained in utero tend to gain weight more rapidly during the postnatal period. Different developmental pathways are thought to be triggered by environmental events during sensitive or critical periods of development. However, whether or not there are early critical periods in the programming of obesity is still a matter for debate.
Background – A novel hydrolysate (NatraBoost XR, NBXR) of whey protein isolate (WPI) reduced production of tumor necrosis factor-α (TNFα) and increased cell growth in vitro.
Objectives – This study examined whether feeding NBXR could enhance recovery of muscle function following eccentric exercise.
Design – Muscle soreness (MS, by visual analogue scale), serum creatine kinase activity (CK), plasma TNFα and insulin concentrations, and knee extensor peak isometric torque (PIT) were determined in 40 healthy sedentary males at baseline. One hundred maximal eccentric contractions (ECC) of the knee extensors were then performed. MS, CK, TNFα, insulin and PIT were then reassessed prior to consuming 250 ml of flavoured water (FW; n = 11), or 250 ml of FW containing 25 g of NBXR (n = 6), WPI (n = 11) or casein (C, n = 12), in a double-blind randomised parallel design. All assessments were repeated 1, 2, 6 and 24 hr later, and supplements were consumed at 6 and 22 hr. Outcomes – There was no difference in PIT between groups at baseline (P = 0.70). PIT decreased in all groups following ECC (P < 0.001), with no difference in the reduction between groups (P > 0.58). PIT remained suppressed in WPI, C and FW, but recovered rapidly in NBXR such that it was not different from baseline by 2 hr (P > 0.05) and was greater than all other groups at 6 hr (P < 0.01) and 24 hr (P < 0.001). MS increased in all groups following ECC (P < 0.001) and remained elevated, with no difference between groups (P = 0.93). TNFα (P > 0.83) and CK (P > 0.32) did not change from baseline. Insulin increased transiently in NBXR and WPI only at 1 hour (P < 0.001), but the increases were not different from each other (P = 0.77).
Conclusion – NBXR enhanced recovery of PIT following eccentric exercise. The effect did not appear to be mediated by suppression of inflammation or MS, or by any anabolic effect of insulin. The enhanced recovery may be related to the activity of some novel peptide(s) in the hydrolysate.
Background – We will describe briefly the continuing gap in health and life expectancy between the Aboriginal and the broader Australian peoples. While some important advances have occurred in Indigenous health in other countries, Aboriginal people in Australia continue to experience significant inequalities in terms of life expectancy at birth, life chances, health status and life expectancy at almost any age compared to the rest of the population.
In addition, the Aboriginal population is only 2.8% of the total, which raises the question of why such a small and geographically dispersed population continues to experience such high levels of social, material and health disadvantage (1).
Objectives – To describe, via an Epidemiological Transition Model (2), some reasons behind the continuing poor morbidity and mortality and use this model to review nutrition programmes in the community context.
Discussion – Indigenous peoples are in a mid-20th-century epidemiologic transition of premature disease and disability, a transition that occurred for the broader Australian population some 100 or so years ago. At the same time, the Indigenous population is increasing and growing younger, and is not ageing in the same way as the rest of Australian society. These facts challenge many in policy and programme development to design appropriate interventions. Those programmes that work well seem to use a determinants-of-health model (3) and build on a definition of health that considers not only the absence of physical disease, but also the wellbeing of one's family and community, physically, emotionally and spiritually.
Conclusion – Programmes and interventions must be designed with the target community and population in mind. The use of programmes developed for populations that are in the fifth epidemiologic transition, that is, for the broader Australian community, will not work for those still dealing with a burden of disease and disability that the broader population experienced a century ago. As we have seen, those programmes that work do so because they take this into account.
References
Background – Berry fruit have very high antioxidant capacities as determined by in vitro antioxidant assays such as the oxygen radical absorbance capacity (ORACFL) and the ferric reducing antioxidant power (FRAP) methods. In a selection of blackcurrant (Ribes nigrum L.) genotypes we have found that antioxidant capacities (ORACFL) vary from 71 to 194 μmol TE/g fresh weight. Anthocyanins are the main contributor to antioxidant capacity in blackcurrant and their concentrations range from 180 to 732 mg/100g FW.
Objectives – To further explore the potential of blackcurrant anthocyanins as dietary antioxidants, we used preparative HPLC to isolate the four individual anthocyanins and measured, for each, antioxidant capacity (ORACFL and FRAP) and the ability to protect plasma proteins from peroxynitrite-mediated tyrosine nitrosylation. In a previous study we found that blackcurrant anthocyanins are absorbed intact into plasma following consumption (1). We have therefore also investigated biochemical properties that may contribute to the efficacy of any health benefits, such as plasma protein binding and octanol partition coefficients.
Outcomes – We have found that the four anthocyanins vary for the parameters measured. The two cyanidin-based anthocyanins have greater (15%) antioxidant capacity than the two delphinidin-based compounds. The mean octanol partition coefficient (log p) is lower for the two rutinosides (-1.56) compared to the two glucosides (-0.71) suggesting there are differences in the ability of the anthocyanins to penetrate cell membranes. The phenolic aglycone component of anthocyanin appears to have a greater effect on protein binding than the sugar component. The percentage of anthocyanin bound to human serum albumin averaged 75% for delphinidin, 69% for cyanidin, 58% for peonidin, and 55% for malvidin.
Conclusion – These results extend our understanding of the potential health benefits of berry fruit containing high concentrations of anthocyanins. Anthocyanins are a large, diverse group of compounds, and these results indicate that they differ in the biochemical properties associated with health benefits, suggesting that specific compositions of anthocyanins may confer greater benefit.
Reference
Background – Red meat intake has been associated with increased risk of coronary heart disease and type 2 diabetes. The saturated fat often consumed with red meat, and the processing of red meat, may be at least partly responsible. The capacity of iron derived from red meat to increase iron stores and initiate oxidative damage and inflammation is another possible pathway.
Objective – To determine whether an increase in unprocessed and lean red meat intake, with a concomitant reduction in carbohydrate intake, adversely influences markers of oxidative stress and inflammation.
Design – Sixty participants completed an 8 wk parallel-designed study. They were randomized to maintain their usual diet (control) or to partially replace energy from carbohydrate-rich foods with approximately 200 g/d of lean red meat (protein) in isoenergetic diets. Markers of oxidative stress and inflammation were measured at baseline and at the end of intervention.
Outcomes – Results are presented as the between-group difference for protein relative to control. There was a significant decrease in urinary [-137 (-264, -9) pmol/mmol creatinine, P=0.04], but not plasma [-12 (-122, 100) pmol/L, P=0.84], F2-isoprostane concentrations. There was a significant decrease in leucocyte [-0.51 (-0.99, -0.02) ×10⁹/L, P=0.04] and lymphocyte [-0.20 (-0.36, -0.05) ×10⁹/L, P=0.01] counts, a decrease in plasma high-sensitivity C-reactive protein concentrations [-1.6 (-3.3, 0.0) mg/L, P=0.06] of borderline significance, but no significant effect on plasma fibrinogen concentrations [-0.08 (-0.40, 0.24), P=0.63].
Conclusion – Our results do not support the suggestion that an increase in the intake of lean red meat, partially replacing carbohydrate, increases oxidative stress or inflammation.
Background – This paper reports the antioxidant composition of fruits, vegetables and beverages in Fiji. Objectives – To provide antioxidant composition data for commonly available foods and thereby help promote their use in the daily diet of the people of Fiji.
Design – The total antioxidant capacity (TAC) was assayed using trolox equivalent antioxidant capacity (TEAC) decolourization method (1). The total polyphenol (TPP) assay was performed using the Folin-Ciocalteu method (1). HPLC was used to determine the major carotenoid and flavonoid profiles.
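As an illustrative sketch only (the cited reference (1) describes the exact decolourisation protocol; the standard-curve values and sample reading below are invented), the TEAC of an extract is read off a linear Trolox standard curve:

import numpy as np

# Hypothetical Trolox standard curve: decolourisation of the ABTS radical
# (decrease in absorbance, commonly read at 734 nm) versus Trolox concentration.
trolox_um = np.array([0, 50, 100, 200, 400], dtype=float)  # uM Trolox standards
delta_abs = np.array([0.00, 0.07, 0.14, 0.28, 0.55])       # absorbance decrease

slope, intercept = np.polyfit(trolox_um, delta_abs, 1)     # fit the standard curve

sample_delta_abs = 0.21                                    # reading for the extract
teac_um = (sample_delta_abs - intercept) / slope           # uM Trolox equivalents
print(f"TEAC of extract ~ {teac_um:.0f} uM TE (before dilution/weight correction)")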
Outcomes – Commercial noni (Morinda citrifolia) fruit drink, which is exported to Australia, was shown to have the highest total polyphenol levels (375.1 mg/100g juice) followed by turmeric (Curcuma longa) (320 mg/100g). Sweet potato leaves (Ipomoea batatas) (240-280 mg/100g) and drumstick (Moringa oleifera) leaves (260 mg/100g) were also high in total polyphenols. The paper also discusses the TAC levels of the foods assayed. Flavonoid assay showed that quercetin was present in sweet potato leaves (43-90 mg/100g), drumstick leaves (100 mg/100g) and also in turmeric (41 mg/100g). It appears that polyphenols and carotenoids contribute to the antioxidant capacity of most foods (2).
Conclusion – Attempts are made to publicise and promote the consumption of a variety of the antioxidant-rich vegetables and fruits in the diets of the people of Fiji.
References
1. Sellappan S, Akoh CC. Flavonoids and antioxidant capacity of Georgia-grown Vidalia onions. J Agric Food Chem 2002; 50: 5338-5342.
2. Lako J, Trennery VC, Wahlqvist M, Wattanapenpaiboon T, Sotheeswaran S, Premier R. Phytochemical flavonols, carotenoids and the antioxidant properties of a wide selection of Fijian fruit, vegetables and other readily available foods. Food Chemistry 2006; in press.
Background – Polyphenols from different plant sources have been investigated for their antioxidant activities.
Polyphenols have been reported to exhibit anti-allergenic, anti-inflammatory and cardioprotective effects (1).
Objective – To compare the outcomes of in vitro and in vivo measures of antioxidant activity of palm polyphenols.
Design – In the in vitro experiments, human plasma was incubated with palm polyphenol extracts, and the antioxidant capacity was measured by two methods: 2,2’-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) radical cation (ABTS•+) scavenging and ferric reducing antioxidant power (FRAP) assays. For the in vivo study, hamsters were fed an atherogenic diet and supplemented with palm fruit juice (PFJ) for 8 wk. PFJ was administered at three different polyphenol concentrations: 750, 1000, and 1500 mg gallic acid equivalents (GAE)/L. The antioxidant capacity of the hamster plasma was measured at the end of 8 wk by the ABTS•+ scavenging and FRAP assays.
Outcomes – Incubation of human plasma with palm polyphenols did not result in a significant increase in plasma ABTS•+ scavenging capacity, but plasma FRAP values were significantly elevated. A similar trend was observed in the in vivo experiments. ABTS•+ scavenging capacity in hamster plasma was unaffected by PFJ treatment at the different PFJ concentrations administered. Plasma FRAP values, on the other hand, increased from 45.64 ± 24.96 μM Trolox equivalents (TE) in control animals given water to 82.33 ± 41.26 μM TE in animals supplemented with PFJ at 1500 mg GAE/L.
Conclusions – In vitro measures of antioxidant capacity are useful indicators of possible outcomes of in vivo trials. Nevertheless, the quantitative outcomes of the latter may differ from what could be extrapolated from purely in vitro measures due to absorption, metabolism and bioavailability of ingested antioxidants.
Reference
Background – Phytochemicals are abundant micronutrients in fruit and vegetables. There is an emerging body of evidence regarding their health benefits, some of which may be due to their antioxidant properties.
Objective – To compare chemical antioxidant assays with more biologically relevant cell-based assay systems for measuring antioxidant activities of food polyphenols.
Design – The free radical scavenging activities of three flavonoids: quercetin, rutin and catechin, commonly found in apple, onions and tea respectively, were measured. The three flavonoids were evaluated using both oxygen radical absorbance capacity (ORAC) and lipid peroxidation inhibition capacity (LPIC) assays. Cytoprotective effects were measured by the degree of protection against H2O2-induced damage of human Jurkat cells.
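As a hedged illustration of how the cytoprotective EC50 values mentioned below might be estimated (the abstract does not state the curve-fitting procedure, and the data here are invented for the example), a four-parameter logistic can be fitted to the viability dose-response:

import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    # Four-parameter logistic dose-response curve.
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

conc      = np.array([1, 3, 10, 30, 100], dtype=float)    # uM flavonoid (hypothetical)
viability = np.array([22, 35, 58, 80, 91], dtype=float)   # % of untreated control

params, _ = curve_fit(four_pl, conc, viability, p0=[20, 95, 10, 1])
print(f"Estimated EC50 ~ {params[2]:.1f} uM")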
Outcomes – As expected all compounds exhibited activity in these assays. Quercetin offered the strongest protection against H2O2-induced cell death. A comparison of the results of the assays showed that the ability to inhibit peroxidation of lipids in a liposomal system (LPIC) correlated well with the cytoprotective activities (expressed as EC50), but not with the ability to protect an aqueous fluorescent substrate in the ORAC assay. Conclusions – In vitro assays can only rank antioxidant activity for their particular reaction system and their relevance to in vivo health-protective activities is uncertain. Therefore, it is prudent to use more than one type of antioxidant assay to measure antioxidant activities, and to include at least one assay that has biological relevance.
Background – Exercise is known to increase the production of reactive oxygen species (ROS)[1]. Dietary carotenoids have antioxidant properties and may possess anti-inflammatory effects [2].
Objective – The objective of the study was to determine the effect of dietary antioxidant restriction on short- duration maximal exhaustive exercise induced markers of inflammation, carotenoids and fatty acids in healthy male endurance athletes.
Design – Seventeen endurance-trained athletes performed two separate exercise tests. Participants followed their habitual (high) antioxidant diet (HA) and performed an overnight fasting treadmill exercise test. Participants then followed a reduced antioxidant diet (RA) for 2 weeks and then performed the same overnight fasting treadmill exercise test. Blood was collected at rest and post-exercise for the analysis of inflammatory markers, fatty acids and carotenoids in plasma.
Outcomes – The RA diet induced a significant increase (P<0.01) in baseline plasma TNF-alpha concentration (612.86 ± 325.23 ng/ml) compared to the HA diet (28.30 ± 39.07 ng/ml). Baseline plasma beta-carotene concentration was significantly lower (P<0.05) on the RA diet (122.61 ± 54.49 ng/ml) compared to the HA diet (194.96 ± 92.07 ng/ml). Exercise decreased plasma carotenoid concentrations on both diets. Exercise significantly decreased (P<0.01) plasma n-6 fatty acid concentration on the RA diet (186.38 ± 94.54 to 96.16 ± 48.76 μg/ml) and increased (P<0.05) plasma n-3 fatty acid concentration on the HA diet (14.27 ± 5.04 to 18.63 ± 4.94 μg/ml). Conclusion – Healthy endurance-trained adults performing short-duration exhaustive exercise may require higher intakes of carotenoids to combat oxidative stress and inflammation generated through exercise, which can be achieved via a diet containing high-carotenoid foods.
References
Background – The results of observational studies suggest that high plasma homocysteine concentrations are inversely related to cognitive function in older people.
Objective – To test the hypothesis that lowering plasma homocysteine concentration improves cognitive function in healthy older people.
Design – Two year, double-blind, placebo-controlled, randomised clinical trial involving 276 healthy participants, 65 years of age or older, with plasma homocysteine concentrations of at least 13 μmol/L. Homocysteine-lowering treatment was a daily supplement containing folate (1000 μg), vitamins B12 (500 μg) and B6 (10 mg). Tests of cognition were conducted at baseline and after one and two years of treatment. Treatment effects were adjusted for baseline values, sex, and education.
Outcomes – On average, during the course of the study, plasma homocysteine concentration was 4.36 μmol/L (95% CI, 3.81 to 4.91; P<0.001) lower in the vitamin group than in the placebo group. Overall, there were no significant differences between the vitamin and placebo groups in the scores on tests of cognition (1).
Conclusions – The results of this trial do not support the hypothesis that homocysteine-lowering with B-vitamins improves cognitive performance.
Reference