
Institute of Medicine (US) Panel on Micronutrients. Dietary Reference Intakes for Vitamin A, Vitamin K, Arsenic, Boron, Chromium, Copper, Iodine, Iron, Manganese, Molybdenum, Nickel, Silicon, Vanadium, and Zinc. Washington (DC): National Academies Press (US); 2001.


9 Iron

SUMMARY

Iron functions as a component of a number of proteins, including enzymes and hemoglobin, the latter being important for the transport of oxygen to tissues throughout the body for metabolism. Factorial modeling was used to determine the Estimated Average Requirement (EAR) for iron. The components of iron requirement used as factors in the modeling include basal iron losses, menstrual losses, fetal requirements in pregnancy, increased requirement during growth for the expansion of blood volume, and/or increased tissue and storage iron. The Recommended Dietary Allowance (RDA) for all age groups of men and postmenopausal women is 8 mg/day; the RDA for premenopausal women is 18 mg/day. The median dietary intake of iron is approximately 16 to 18 mg/day for men and 12 mg/day for women. The Tolerable Upper Intake Level (UL) for adults is 45 mg/day of iron, a level based on gastrointestinal distress as an adverse effect.

BACKGROUND INFORMATION

Almost two-thirds of iron in the body is found in hemoglobin present in circulating erythrocytes. A readily mobilizable iron store contains another 25 percent. Most of the remaining 15 percent is in the myoglobin of muscle tissue and a variety of enzymes necessary for oxidative metabolism and many other functions in all cells. A 75-kg adult man contains about 4 grams of iron (50 mg/kg) while a menstruating woman has about 40 mg/kg of iron because of her smaller erythrocyte mass and iron store (Bothwell et al., 1979).

Function

Iron can exist in oxidation states ranging from –2 to +6. In biological systems, these oxidation states occur primarily as the ferrous (+2), ferric (+3), and ferryl (+4) states. The interconversion of iron oxidation states is a mechanism whereby iron participates in electron transfer, as well as a mechanism whereby iron can reversibly bind ligands. The common biological ligands for iron are oxygen, nitrogen, and sulfur atoms.

Four major classes of iron-containing proteins exist in the mammalian system: iron-containing heme proteins (hemoglobin, myoglobin, cytochromes), iron-sulfur enzymes (flavoproteins, hemeflavoproteins), proteins for iron storage and transport (transferrin, lactoferrin, ferritin, hemosiderin), and other iron-containing or activated enzymes (sulfur, nonheme enzymes). In iron sulfur enzymes, iron is bound to sulfur in one of four possible arrangements (Fe-S, 2Fe-2S, 4Fe-4S, 3Fe-4S proteins). In heme proteins, iron is bound to porphyrin ring structures with various side chains. In humans, the predominant form of heme is protoporphyrin-IX.

Hemoglobin

The movement of oxygen from the environment to the tissues is one of the key functions of iron. Oxygen is bound to an iron-containing porphyrin ring, either as part of the prosthetic group of hemoglobin within erythrocytes or as part of myoglobin as the facilitator of oxygen diffusion in tissues.

Myoglobin

Myoglobin is located in the cytoplasm of muscle cells and increases the rate of diffusion of oxygen from capillary erythrocytes to the cytoplasm and mitochondria. The concentration of myoglobin in muscle is drastically reduced in tissue iron deficiency, thus limiting the rate of diffusion of oxygen from erythrocytes to mitochondria (Dallman, 1986a).

Cytochromes

The cytochromes contain heme at the active site, with the iron of the porphyrin ring cycling reversibly between the ferric and ferrous states as electrons are transferred; cytochromes thus act as electron carriers. The roughly 40 proteins that constitute the respiratory chain include six heme proteins, six proteins with iron-sulfur centers, two with copper centers, and ubiquinone, which links reduced nicotinamide adenine dinucleotide (NADH) to oxygen.

Physiology of Absorption, Metabolism, and Excretion

Absorption

The iron content of the body is highly conserved. In the absence of bleeding (including menstruation) or pregnancy, only a small quantity is lost each day (Bothwell et al., 1979). Adult men need to absorb only about 1 mg/day to maintain iron balance. The average requirement for menstruating women is somewhat higher, approximately 1.5 mg/day. There is, however, a marked interindividual variation in menstrual losses, and a small proportion of women must absorb as much as 3.4 mg/day. Towards the end of pregnancy, the absorption of 4 to 5 mg/day is necessary to preserve iron balance. Requirements are also higher in childhood, particularly during periods of rapid growth in early childhood (6 to 24 months), and adolescence.

In the face of these varying requirements, iron balance is maintained by the regulation of absorption in the upper small intestine (Bothwell et al., 1979). There are two pathways for the absorption of iron in humans. One mediates the uptake of the small quantity of heme iron derived primarily from hemoglobin and myoglobin in meat. The other allows for the absorption of nonheme iron, primarily as iron salts, that can be extracted from plant and dairy foods and rendered soluble in the lumen of the stomach and duodenum. Absorption of nonheme iron is enhanced by substances, such as ascorbic acid, that form low molecular weight iron chelates. Most of the iron consumed by humans is in the latter nonheme form.

Heme iron is highly bioavailable and little affected by dietary factors. Nonheme iron absorption depends on the solubilization of predominantly ferric food iron in the acid milieu of the stomach (Raja et al., 1987; Wollenberg and Rummel, 1987) and its reduction to the ferrous form by compounds such as ascorbic acid or by a ferrireductase present at the mucosal surfaces of cells in the duodenum (Han et al., 1995; Raja et al., 1993). This bioavailable iron is then absorbed in a three-step process in which the iron is taken up by the enterocytes across the cellular apical membrane by an energy-dependent, carrier-mediated process (Muir and Hopfer, 1985; Simpson et al., 1986), transported intracellularly, and transferred across the basolateral membrane into the plasma.

The duodenal mucosal cells involved in iron absorption are formed in the crypts of Lieberkuhn. They then migrate up the villi, becoming functional iron-absorbing cells only when they reach the tips of the villi. After a brief period of functionality, the cells are shed into the lumen together with any iron that had entered the cell but had not been transferred to the plasma. In humans, mucosal cell turnover takes between 48 and 72 hours. The amount of iron that cells acquire from the plasma during their early development programs the extent to which they will absorb iron once they reach the tips of the villi. Recent studies by Canonne-Hergaux and coworkers (1999) strongly suggest that a metal transporter (divalent metal transporter [DMT-1] protein), a transmembrane protein and isoform of natural resistance associated macrophage protein (NRAMP2), mediates the uptake of elemental iron into the duodenal cells. The quantity of this transport protein that is formed is inversely proportional to the iron content of the cell; synthesis is regulated by posttranscriptional modification of the DMT-1 messenger ribonucleic acid (mRNA) (Conrad and Umbreit, 2000). The regulatory mechanism involves the cellular iron response proteins (IRP) and the iron response element (IRE) on the mRNA (Eisenstein, 2000).

The mechanism by which iron is transported through the enterocyte has not been completely elucidated. Absorbed iron in the intracellular “labile iron pool” is delivered to the basolateral surface of enterocytes, becomes available for binding onto transferrin, and is then transported via transferrin in the plasma to all body cells. Ceruloplasmin, a copper-containing protein, facilitates the binding of ferric iron to transferrin via ferroxidase activity at the basolateral membrane (Osaki et al., 1966; Wollenberg et al., 1990).

Heme is soluble in an alkaline environment and is less affected by intraluminal factors that influence nonheme iron uptake. Specific transporters exist for heme on the surface of rat enterocytes (Conrad et al., 1967; Grasbeck et al., 1982); however, rats do not absorb heme iron as efficiently as do humans (Weintraub et al., 1965). To date, no specific receptor/transporter for heme has been identified in humans. After binding to its receptor, the heme molecule is internalized and degraded to iron, carbon monoxide, and bilirubin IXa by the enzyme heme oxygenase (Bjorn-Rasmussen et al., 1974; Raffin et al., 1974). This enzyme is induced by iron deficiency (Raffin et al., 1974). It is thought that the iron that is liberated from heme enters the common intracellular (enterocyte) pool of iron before being transported to plasma transferrin.

Transport and Metabolism

Iron movement between cells is primarily conducted via reversible binding of iron to the transport protein, transferrin. One atom of iron can bind to each of two binding sites on transferrin and will then complex with a highly specific transferrin receptor (TfR) located on the plasma membrane surfaces of cells. Internalization of transferrin in clathrin-coated pits results in an endosomal vesicle where acidification to a pH of approximately 5.5 results in the release of the iron from transferrin. The movement of iron from this endosomal space to the cytoplasm is not completely understood at this time, but recent discoveries provide some clues. DMT1 (NRAMP2) has now been identified in endosomal vesicles (Gunshin et al., 1997). Although it is not a specific iron transporter and although it is capable of transporting other divalent metals, recent studies suggest that it may play a primary role in the delivery of iron to the cell. A second transporter, stimulator of iron transport (SFT), has been cloned and characterized as an exclusive iron transporter of both ferric and ferrous iron out of the endosome (Gutierrez et al., 1997).

Iron entering cells may be incorporated into functional compounds, stored as ferritin, or used to regulate future cellular iron metabolism by modifying the activity of the two IRPs. The size of the intracellular iron pool plays a clear regulatory role in the synthesis of iron storage, iron transport, and iron metabolism proteins through an elegant posttranscriptional set of events (see review by Eisenstein and Blemings, 1998).

Storage

Intracellular iron availability is regulated through increased expression of TfR by iron-deficient cells and through increased ferritin production when the iron supply exceeds the cell's functional needs. Iron is stored in the form of ferritin or hemosiderin. The latter is a water-insoluble degradation product of ferritin. The iron content of hemosiderin is variable but generally higher than that of ferritin. While all cells are capable of storing iron, the cells of the liver, spleen, and bone marrow are the primary iron storage sites in humans.

Excretion

In the absence of bleeding (including menstruation) or pregnancy, only a small quantity of iron is lost each day (Bothwell et al., 1979). Body iron is therefore highly conserved. Daily basal iron losses are limited to between 0.90 and 1.02 mg/day in adult men (Green et al., 1968). The majority of this basal loss occurs via the feces; daily iron losses from urine, the gastrointestinal tract, and skin are approximately 0.08, 0.6, and 0.2 to 0.3 mg/day, respectively. These basal losses may drop to 0.5 mg/day in iron deficiency and may be as high as 2 mg/day in iron overload (Bothwell et al., 1979). Menstrual iron losses are quite variable. Studies on Swedish and British women demonstrated a mean iron loss via menses of 0.6 to 0.7 mg/day (Hallberg et al., 1966b).

Clinical Effects of Inadequate Intake

Important subclinical and clinical consequences of iron deficiency are impaired physical work performance, developmental delay, cognitive impairment, and adverse pregnancy outcomes. Several other clinical consequences have also been described. The bulk of experimental and epidemiological evidence in humans suggests that functional consequences of iron deficiency (related both to anemia and tissue iron concentration) occur only when iron deficiency is of a severity sufficient to cause a measurable decrease in hemoglobin concentration.

Once the degree of iron deficiency is sufficiently severe to cause anemia, functional disabilities become evident. It is difficult to determine whether any particular functional abnormality is a specific consequence of the anemia per se, presumably due to impaired oxygen delivery, or the result of concomitant tissue iron deficiency. However, it has been shown that anemia and tissue iron deficiency exert independent effects on skeletal muscle (Davies et al., 1984; Finch et al., 1976). Anemia primarily affects maximal oxygen consumption. Endurance exercise is markedly impaired by intracellular iron deficiency in the muscle cells (Willis et al., 1988). From a practical point of view, the distinction may be relatively unimportant since anemia and tissue iron deficiency develop simultaneously in humans who suffer from nutritional iron deficiency.

Work Performance

Various factors may contribute to impaired work performance with iron deficiency. It has been shown that anemia and tissue iron deficiency exert independent effects on the function of organs such as skeletal muscle (Davies et al., 1984; Finch et al., 1976). Anemia primarily affects maximal oxygen consumption. Mild anemia reduces performance during brief but intense exercise (Viteri and Torun, 1974) because of the impaired capacity of skeletal muscle for oxidative metabolism. Endurance exercise is more markedly impaired by intracellular iron deficiency in skeletal muscle cells (Willis et al., 1988).

In laboratory animals, the depletion of oxidative enzymes in skeletal muscle occurs more gradually than the development of anemia (Dallman et al., 1982). The significant decrease in myoglobin and other iron-containing proteins in skeletal muscle of laboratory animals contributes significantly to the decline in muscle aerobic capacity in iron-deficiency anemia and may be a more important factor contributing to the limitation in endurance capacity (Dallman, 1986a; Siimes et al., 1980a).

One study used 31P nuclear magnetic resonance spectroscopy to examine muscle bioenergetics in iron-deficient and iron-replete rat gastrocnemius muscle at rest and during 10 seconds of contraction (Thompson et al., 1993). Compared to controls, muscle from iron-deficient animals showed a marked increase in phosphocreatine breakdown, a decrease in pH, and a slower recovery of phosphocreatine and inorganic phosphate concentrations after exercise. During repletion for 2 to 7 days with iron dextran, there was no substantial improvement in these indicators of muscle mitochondrial energetics. The authors concluded that “tissue factors” such as reduced mitochondrial enzyme activity, a decreased number of mitochondria, and altered mitochondrial morphology might be responsible for impaired muscle function.

Cognitive Development and Intellectual Performance

Studies of iron deficiency anemia and behavior in the developing human and in animal models suggest persistent functional changes. Investigators have demonstrated lower mental and motor test scores and behavioral alterations in infants with iron deficiency anemia (Idjradinata and Pollitt, 1993; Lozoff et al., 1982a, 1982b, 1985, 1987, 1996; Nokes et al., 1998; Walter et al., 1989). In studies conducted in Guatemala and Costa Rica, infants with iron deficiency anemia were rated as more wary and hesitant and maintained closer proximity to caregivers (Lozoff et al., 1985, 1986).

Several studies have shown an improvement in either motor or cognitive development, as measured with the Bayley Scales of Infant Development, after iron treatment of iron-deficient infants (Idjradinata and Pollitt, 1993; Lozoff et al., 1987; Oski et al., 1983; Walter et al., 1983). Other studies have failed to show an improvement in either motor or cognitive development scores after providing iron supplements to iron-deficient infants (Lozoff et al., 1982a, 1982b, 1987, 1996; Walter et al., 1989). Lower arithmetic and writing scores, poorer motor functioning, and impaired cognitive processes (memory and selective recall) have been documented in children who were anemic during infancy and were treated with iron (Lozoff et al., 1991, 2000).

Specific central nervous system processes (e.g., slower nerve conduction and impaired memory) appear to remain despite correction of the iron deficiency anemia. There is a general lack of specificity of effect and of information about which brain regions are adversely affected. Recent data from Chile showed a decreased nerve conduction velocity in response to an auditory signal in formerly iron-deficient anemic children despite hematologic repletion with oral iron therapy (Roncagliolo et al., 1998). This is strongly suggestive evidence for decreased myelination of nerve fibers, though other explanations could also exist.

Current thinking about the impact of early iron deficiency anemia attributes some role for “functional isolation,” a paradigm in which the normal interaction between stimulation and learning from the physical and social environment is altered (Pollitt et al., 1993; Strupp and Levitsky, 1995).

Adverse Pregnancy Outcomes

Several large epidemiological studies have demonstrated that maternal anemia is associated with premature delivery, low birth weight, and increased perinatal infant mortality (see Table 9-1) (Allen, 1997; Garn et al., 1981; Klebanoff et al., 1991; Lieberman et al., 1988; Murphy et al., 1986; Williams and Wheby, 1992). Some of these studies have been criticized because maternal hemoglobin concentration was measured only at the time of delivery. Physiological factors cause the maternal hemoglobin concentration to rise shortly before delivery. Delivery, occurring early because of known or unknown factors unrelated to anemia, could therefore be expected to show an association with a lower hemoglobin concentration even though anemia played no causal role. Other surveys have shown the association to be present even when hemoglobin concentration was measured earlier in pregnancy. In one recent prospective study, only anemia resulting from iron deficiency was associated with premature labor (Scholl et al., 1992). Furthermore, Goepel and coworkers (1988) reported that premature labor was four times more frequent in women with serum ferritin concentrations below 20 μg/L than in those with higher ferritin concentrations, irrespective of hemoglobin concentration.

TABLE 9-1

Association of Anemia and Iron Deficiency with Inadequate Weight Gain and Pregnancy Outcome.

High hemoglobin concentrations at the time of delivery are also associated with adverse pregnancy outcomes, such as the newborn infant being small for gestational age (Yip, 2000). Therefore, there is a U-shaped relationship between hemoglobin concentration and prematurity, low birth weight, and fetal death, the risk being increased for hemoglobin concentration below 90 g/L or above 130 g/L. The etiological factors are different, however, at each end of the spectrum. Iron deficiency appears to play a causal role in the presence of significant anemia by limiting the expansion of the maternal erythrocyte cell mass. On the other hand, elevated hemoglobin concentration probably reflects a decreased plasma volume associated with maternal hypertension and eclampsia. Both of the latter conditions have an increased risk of poor fetal outcome (Allen, 1993; Hallberg, 1992; Williams and Wheby, 1992).

Fetal requirements for iron appear to be met at the expense of the mother's needs, but the iron supply to the fetus may still be suboptimal. Several studies suggest that severe maternal anemia is associated with lower iron stores in infants evaluated either at the time of delivery by measuring cord blood ferritin concentration or later in infancy. The effect of maternal iron deficiency on infant status has been reviewed extensively by Allen (1997).

While the observations relating iron status of the mother to the size of stores in infants (based on serum ferritin concentration) are important, it should be noted that the total iron endowment in a newborn infant is directly proportional to birth weight (Widdowson and Spray, 1951). Maternal iron deficiency anemia may therefore limit the infant's iron endowment specifically through an association with premature delivery and low birth weight. Preziosi and coworkers (1997) evaluated the effect of iron supplementation during pregnancy on iron status in newborn babies born to women living in Niger. The prevalence of maternal anemia was 65 to 70 percent at 6 months gestation. The iron status of the infants was also evaluated at 3 and 6 months of age. Although there were no differences between the supplemented and unsupplemented groups in cord blood iron indexes, the children born to iron-supplemented women had significantly higher serum ferritin concentrations at both 3 and 6 months of age. Furthermore, it was reported that Apgar scores were significantly higher in infants born to supplemented mothers. There were a total of eight fetal or neonatal deaths, seven of which occurred in the unsupplemented group.

Other Consequences of Iron Deficiency

In in vitro tests and animal models, iron deficiency is associated with impaired host defense mechanisms against infection, such as cell-mediated immunity and phagocytosis (Cook and Lynch, 1986). The clinical relevance of these findings is uncertain, although iron deficiency may be a predisposing factor for chronic mucocutaneous candidiasis (Higgs, 1973). Iron deficiency is also associated with abnormalities of the mucosa of the mouth and gastrointestinal tract, leading to angular stomatitis, glossitis, esophageal webs, and chronic gastritis (Jacobs, 1971). Spoon-shaped fingernails (koilonychia) may be present (Hogan and Jones, 1970). The eating of nonfood material (pica) and a craving for ice (pagophagia) are also associated with iron deficiency (Ansell and Wheby, 1972). Finally, temperature regulation may be abnormal in iron deficiency anemia (Brigham and Beard, 1996).

SELECTION OF INDICATORS FOR ESTIMATING THE REQUIREMENT FOR IRON

Functional Indicators

The most important functional indicators of iron deficiency are reduced physical work capacity, delayed psychomotor development in infants, impaired cognitive function, and adverse effects for both the mother and the fetus as discussed above. As indicated earlier, these adverse consequences of iron deficiency are associated with a degree of iron deficiency sufficient to cause measurable anemia.

A specific functional indicator, such as dark adaptation for vitamin A (see Chapter 4), is used to estimate the average requirement for some nutrients. This is done by evaluating the effect on that functional indicator in a group of experimental subjects fed diets containing graded quantities of the nutrient. The effect of different levels of iron intake on the important functional indicators identified above cannot be measured in this way because of the difficulty inherent in quantifying abnormalities in these functional indicators, as well as the complexity of the regulation of iron absorption.

Biochemical Indicators

A series of laboratory indicators can be used to characterize iron status precisely and to categorize the severity of iron deficiency. Three levels of iron deficiency are customarily identified:

depleted iron stores, but where there appears to be no limitation in the supply of iron to the functional compartment;

early functional iron deficiency (iron-deficient erythropoiesis) where the supply of iron to the functional compartment is suboptimal but not reduced sufficiently to cause measurable anemia; and

iron deficiency anemia, where there is a measurable deficit in the most accessible functional compartment, the erythrocyte.

Available laboratory tests can be used in combination to identify the evolution of iron deficiency through these three stages (Table 9-2).

TABLE 9-2

Laboratory Measurements Commonly Used in the Evaluation of Iron Status.

Storage Iron Depletion

Serum Ferritin Concentration. Cellular iron that is not immediately needed for functional compounds is stored in the form of ferritin. Small quantities of ferritin also circulate in the blood. The concentration of plasma and serum ferritin is proportional to the size of body iron stores in healthy individuals and those with early iron deficiency. In an adult, each 1 μg/L of serum ferritin indicates the presence of about 8 mg of storage iron (Bothwell et al., 1979). A similar relationship is present in children in that each 1 μg/L of serum ferritin is indicative of an iron store of about 0.14 mg/kg (Finch and Huebers, 1982). When the serum ferritin concentration falls below 12 μg/L, the iron stores are totally depleted.
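
For illustration, these conversion factors translate directly into a storage-iron estimate. The short sketch below is illustrative (the function name and interface are not part of the report) and uses only the figures given above: about 8 mg of storage iron per 1 μg/L of serum ferritin in adults, and about 0.14 mg/kg per 1 μg/L in children.

```python
def storage_iron_mg(serum_ferritin_ug_per_l, body_weight_kg=None, child=False):
    """Estimate storage iron from serum ferritin.

    Adults: about 8 mg of storage iron per 1 ug/L of serum ferritin.
    Children: about 0.14 mg/kg per 1 ug/L (body weight required).
    Below about 12 ug/L, stores are considered essentially depleted.
    """
    if child:
        if body_weight_kg is None:
            raise ValueError("body weight is required for the child estimate")
        return 0.14 * body_weight_kg * serum_ferritin_ug_per_l
    return 8.0 * serum_ferritin_ug_per_l

# An adult with a serum ferritin of 15 ug/L (the minimal-store criterion
# used later in this chapter) holds roughly 120 mg of storage iron.
print(storage_iron_mg(15))  # 120.0
```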

Based on the Third National Health and Nutrition Examination Survey (NHANES III), the median serum ferritin concentrations for adults living in the United States were 36 to 40 μg/L in menstruating women and 112 to 156 μg/L in men (Appendix Table G-3). The median serum ferritin concentration was 27 μg/L for adolescent girls and 28 μg/L for pregnant women. These concentrations are well above the cutoff of 12 μg/L used to define depleted iron stores in adolescent girls and pregnant women (IOM, 1990; Table 9-2).

However, the direct correlation between estimated iron intake and iron status is weak (Appendix Table H-5). Serum ferritin concentrations are known to be affected by factors other than the size of iron stores. Concentrations are increased in the presence of infections, inflammatory disorders, cancers, and liver disease because ferritin is an acute phase protein (Valberg, 1980). Thus, serum ferritin concentration may fall within the normal range in individuals who have no iron stores. Elevated serum ferritin concentrations are also associated with increased ethanol consumption (Leggett et al., 1990; Osler et al., 1998), increasing body mass index (Appendix Table H-3), and elevated plasma glucose concentration (Appendix Table H-4) (Tuomainen et al., 1997). Dinneen and coworkers (1992) reported high serum ferritin concentration in association with newly diagnosed diabetes mellitus. Analysis of the NHANES III database demonstrated a statistically significant direct correlation between body mass index and serum ferritin concentration in non-Hispanic white men over the age of 20 years, non-Hispanic black men and women aged 20 to 49 years, Mexican-American men aged 20 to 49 years, and Mexican-American women over the age of 50 years (Appendix Table H-3). An examination of the NHANES III database also showed that individuals in the highest quartile for plasma glucose concentration had higher serum ferritin concentrations than those in the lowest quartile for all gender and age groups (Appendix Table H-4). Similar findings were reported by Ford and Cogswell (1999). For these reasons, and because of the variability in consumption of promoters and inhibitors of iron absorption, iron intake does not necessarily correlate with serum ferritin concentration.

Despite the influence of these unrelated factors, serum ferritin concentration remains the most sensitive indicator of the amount of iron in the storage compartment.

Total Iron-Binding Capacity. Iron is transported in the plasma and extracellular fluid bound to transferrin, a metalloprotein with a very high affinity for iron; virtually all plasma iron is bound to it. It is therefore convenient to measure plasma transferrin concentration indirectly by quantifying the total iron-binding capacity (TIBC), the total quantity of iron bound to transferrin after the addition of exogenous iron to plasma. TIBC is elevated with storage iron depletion before there is evidence of inadequate delivery of iron to erythropoietic tissue. An increased TIBC (> 400 μg/dL) is therefore indicative of storage iron depletion, although it is less precise than the serum ferritin concentration. About 30 to 40 percent of individuals with iron deficiency anemia have TIBCs that are not elevated (Ravel, 1989). TIBC is reduced in infectious, inflammatory, or neoplastic disorders (Konijn, 1994).

Early Iron Deficiency

Early iron deficiency is signaled by evidence indicating that the iron supply to the bone marrow and other tissues is only marginally adequate. A measurable decrease in the hemoglobin concentration is not yet present and therefore there is no anemia.

Serum Transferrin Saturation. As the iron supply decreases, the serum iron concentration falls and the saturation of transferrin is decreased. Levels below 16 percent saturation indicate that the rate of delivery of iron is insufficient to maintain the normal rate of hemoglobin synthesis. Low saturation levels are not specific for iron deficiency and are encountered in other conditions such as anemia of chronic disease (Cook, 1999), which is associated with impaired release of iron from stores.

The median serum transferrin saturation was 26 to 30 percent for men and 21 to 24 percent for women (Appendix Table G-2). The median serum transferrin saturation was 21 percent for pregnant women and 22 percent for adolescent girls. These values exceed the cut-off value of 16 percent (Table 9-2).

Erythrocyte Protoporphyrin Concentration. Heme is formed in developing erythrocytes by the incorporation of iron into protoporphyrin IX by ferrochelatase. If there is insufficient iron for optimal hemoglobin synthesis, erythrocytes accumulate an excess of protoporphyrin, which remains in the cells for the duration of their lifespans (Cook, 1999). An increased erythrocyte protoporphyrin concentration in the blood therefore indicates that the erythrocytes matured at a time when the iron supply was suboptimal. The cutoff is an erythrocyte protoporphyrin concentration greater than 70 μg/dL of erythrocytes. An elevated concentration is not specific for iron deficiency, however; it also occurs with inadequate iron delivery to developing erythrocytes (e.g., anemia of chronic disease) and with impaired heme synthesis (e.g., lead poisoning). In iron deficiency, zinc can be incorporated into protoporphyrin IX, resulting in the formation of zinc protoporphyrin (Braun, 1999). The zinc protoporphyrin:heme ratio is used as an indicator of impaired heme synthesis and is sensitive to an insufficient iron delivery to the erythrocyte (Braun, 1999).

Soluble Serum Transferrin Receptor Concentration. The surfaces of all cells express transferrin receptors in proportion to their requirement for iron. A truncated form of the extracellular component of the transferrin receptor is produced by proteolytic cleavage and released into the plasma in direct proportion to the number of receptors expressed on the surfaces of body tissues. As functional iron depletion occurs, more transferrin receptors appear on cell surfaces. The concentration of proteolytically cleaved extracellular domains, or soluble serum transferrin receptors (sTfR), rises in parallel. The magnitude of the increase is proportional to the functional iron deficit. The sTfR concentration appears to be a specific and sensitive indicator of early iron deficiency (Akesson et al., 1998; Cook et al., 1990). Furthermore, sTfR concentration is not affected by infectious, inflammatory, and neoplastic disorders (Ferguson et al., 1992). Because commercial assays for sTfR have become available only recently, there is a lack of data relating iron intake to sTfR concentration, as well as relating sTfR concentration to functional outcomes. This indicator may prove to be very useful in identifying iron deficiency, especially in patients who have concurrent infections or other inflammatory disorders.

Iron Deficiency Anemia

Anemia is the most easily identifiable indicator of functional iron deficiency. As discussed above, physiological impairment occurs at this stage of iron deficiency both because of inadequate oxygen delivery during exercise and because of abnormal enzyme function in tissues.

Hemoglobin Concentration and Hematocrit. The hemoglobin concentration or hematocrit is neither a sensitive nor a specific indicator of mild yet functionally significant iron deficiency anemia. Iron deficiency anemia is microcytic (reduced mean erythrocyte volume and mean erythrocyte hemoglobin). However, microcytic anemia is characteristic of all anemias in which the primary abnormality is impaired hemoglobin synthesis. Iron deficiency is only one of the potential causal factors. The diagnosis of iron deficiency anemia, based solely on the presence of anemia, can result in misdiagnosis in many cases.

Garby and coworkers (1969) recognized this fundamental problem. After supplemental iron tablets (60 mg/day) or a placebo were provided to a group of women with mild anemia for 3 months, the women were characterized as having iron deficiency anemia based on a change in hemoglobin concentration in response to the iron supplement that was greater than that which occurred with the placebo. There was a significant overlap between the distribution curves for the initial hemoglobin concentration of the responders (iron deficiency anemia) and the nonresponders (no iron deficiency anemia). A single hemoglobin concentration used as a discriminant value for detecting iron deficiency anemia therefore lacks precision.

Based on NHANES III data (Appendix Table G-1), the median hemoglobin concentration for men was 144 to 154 g/L and 132 to 135 g/L for women. The median hemoglobin concentration was 132 g/L for adolescent girls and 121 g/L for pregnant women. The hemoglobin concentration for pregnant women approaches the cutoff concentration of 120 g/L (IOM, 1990).

Erythrocyte Indexes. Iron deficiency leads to the formation of small erythrocytes. Mean corpuscular hemoglobin (MCH) is the average amount of hemoglobin per erythrocyte; mean corpuscular volume (MCV) is the volume of the average erythrocyte. Both MCH and MCV are reduced in iron deficiency, but the reductions are not specific for it: they occur in all conditions that cause impaired hemoglobin synthesis, particularly the thalassemias (Chalevelakis et al., 1984).

Surrogate Laboratory Indicators

As discussed earlier, functional abnormalities occur only when iron deficiency is sufficiently severe to cause measurable anemia. Low iron storage does not appear to have functional consequences in most studies. This does not imply that all functional consequences of iron deficiency are mediated by anemia, but rather that cellular enzymes that require iron become depleted in concert with the development of anemia. There is extensive experimental evidence indicating that tissue iron depletion has significant physiological consequences that are independent of the consequences of anemia (Willis et al., 1988).

Early anemia could nevertheless be chosen as the surrogate functional indicator. However, the significant overlap between the iron-sufficient and the iron-deficient segments of a population limits the sensitivity of this indicator. The precision of the laboratory diagnosis of iron deficiency anemia can be improved by combining hemoglobin measurements with one or more indicators of iron status. The Expert Scientific Working Group (1985) described two models or conceptual frameworks. The ferritin model employs a combination of serum ferritin concentration, erythrocyte protoporphyrin concentration, and transferrin saturation. The presence of two or more abnormal indicators of iron status is indicative of iron deficiency. The MCV model uses MCV, transferrin saturation, and erythrocyte protoporphyrin concentration as indicators; again, two or more abnormal indicators are indicative of iron deficiency. The two models give similar results and improve the specificity of the hemoglobin concentration or hematocrit as an indicator of iron deficiency anemia. They were considered as potential surrogate laboratory indicators of functional iron deficiency for use in estimating requirements, but were rejected because they were felt to lack sufficient sensitivity to provide an adequate margin of safety in calculating iron requirements.
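
The two-of-three logic of the ferritin model can be sketched in a few lines. The cutoffs below are those given elsewhere in this chapter (serum ferritin less than 12 μg/L, transferrin saturation less than 16 percent, erythrocyte protoporphyrin greater than 70 μg/dL); the function itself is illustrative rather than part of the report.

```python
def ferritin_model(serum_ferritin_ug_per_l, transferrin_saturation_pct,
                   erythrocyte_protoporphyrin_ug_per_dl):
    """Ferritin model: iron deficiency when two or more indicators are abnormal.

    Cutoffs as given in this chapter: serum ferritin < 12 ug/L, transferrin
    saturation < 16 percent, erythrocyte protoporphyrin > 70 ug/dL of
    erythrocytes. The MCV model substitutes mean corpuscular volume for
    serum ferritin (its cutoff is not given in this excerpt).
    """
    abnormal = sum([
        serum_ferritin_ug_per_l < 12,
        transferrin_saturation_pct < 16,
        erythrocyte_protoporphyrin_ug_per_dl > 70,
    ])
    return abnormal >= 2

# Combined with a low hemoglobin concentration, a positive result points to
# iron deficiency anemia rather than anemia of another cause.
print(ferritin_model(9, 14, 82))   # True: three abnormal indicators
print(ferritin_model(25, 14, 40))  # False: one abnormal indicator
```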

The sTfR concentration may, in the future, prove to be a sensitive, reliable, and precise indicator of early functional iron deficiency. At present, however, there are insufficient dose-response data to recommend this indicator.

Methods Considered in Estimating the Average Requirement

In light of the rationale developed in the previous section, the calculation of the Estimated Average Requirement (EAR) is based on the need to maintain a normal, functional iron concentration, but only a minimal store (serum ferritin concentration of 15 μg/L) (IOM, 1993). Two methods of calculation were considered—factorial modeling and iron balance.

Factorial Modeling

Because the distributions of some components of iron requirement are skewed rather than normal, the components (losses and accretion) cannot simply be added. Instead, the physiological requirement for absorbed iron is calculated by factorial modeling: the distribution of each component (basal losses, menstrual losses, and accretion) is modeled on the basis of known physiology, and the total need for absorbed iron is estimated by combining the component distributions (see Chapter 1, “Method for Setting the RDA when Nutrient Requirements Are Not Normally Distributed”). Monte Carlo simulation is used to generate a large theoretical population with the characteristics described by the component distributions. Once the final distribution representing the convolution of the components has been derived, its 50th percentile (median) provides the average requirement for absorbed iron and its 97.5th percentile provides the basis for the Recommended Dietary Allowance (RDA). The EAR and RDA for dietary iron are then obtained by dividing these values by the upper limit of iron absorption.
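
A minimal sketch of this Monte Carlo step is shown below. The component distributions (a roughly normal distribution of basal losses and a right-skewed lognormal distribution of menstrual losses) and their parameters are illustrative assumptions, not the fitted distributions used in this report; only the structure of the calculation follows the method described above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # size of the simulated population

# Illustrative component distributions for adult menstruating women; the
# parameters are assumptions for demonstration, not the report's values.
basal = rng.normal(loc=0.9, scale=0.15, size=n)                  # mg/day
menstrual = rng.lognormal(mean=np.log(0.55), sigma=0.6, size=n)  # mg/day, skewed

absorbed = basal + menstrual  # convolution of the component distributions

ear_absorbed = np.percentile(absorbed, 50)    # median requirement
rda_absorbed = np.percentile(absorbed, 97.5)  # 97.5th percentile

# Convert absorbed iron to dietary iron at 18 percent bioavailability.
print(f"EAR: {ear_absorbed / 0.18:.1f} mg/day of dietary iron")
print(f"RDA basis: {rda_absorbed / 0.18:.1f} mg/day of dietary iron")
```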

Basal Losses. Basal losses refer to the obligatory loss of iron in the feces, urine, and sweat and from the exfoliation of skin cells. Attempts to quantify these losses by measuring each individual component have yielded highly variable results because of the technical difficulties in distinguishing between the small quantities of iron lost from the body and contaminant iron in the samples collected. The only reliable quantitative data for basal iron losses in humans are derived from a single study (Green et al., 1968). However, iron absorption data derived from radioiron absorption tests (Bothwell et al., 1979) provide collateral support for the accuracy of the measurements made by Green and coworkers (1968).

The observations made by Green and coworkers (1968) were based on earlier experimental data demonstrating that all body iron compartments are in a constant state of flux and that uniform labeling of all body iron could be achieved several months after the injection of a long-lived radioisotope of iron (55Fe, half-life 2.6 years). After uniform labeling is achieved, the change in specific activity of a readily accessible iron compartment (circulating hemoglobin) can be used to calculate the physiological rate of iron loss, provided that iron balance is maintained during the period of observation. The investigators also measured individual compartmental losses from skin and in sweat, urine, and feces separately in other volunteers; results obtained by summing these compartmental losses were similar to those of the whole-body excretion studies. They reported an average calculated daily iron loss of 0.9 to 1.0 mg/day (≈14 μg/kg/day) in three groups of men with normal iron storage status who lived in South Africa, the United States, and Venezuela (Table 9-3). While there is a need for more information associating body weight with basal iron losses, subsequent analyses of the data from South Africa (R. Green, University of Witwatersrand, Johannesburg, South Africa, personal communication, 2000) showed that within the substudy groups body weight was an important explanatory variable for basal iron loss; the other very important variable was the magnitude of iron stores.
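
The logic of this isotope-dilution measurement can be made explicit. Assuming iron balance and uniform labeling, the specific activity of circulating hemoglobin iron declines exponentially as unlabeled dietary iron replaces the iron that is lost, so the daily loss can be recovered from two specific-activity measurements. The sketch below uses illustrative numbers, not the data of Green and coworkers.

```python
import math

def daily_iron_loss_mg(total_body_iron_mg, s0, s_t, days):
    """Daily iron loss from the decline in specific activity of hemoglobin iron.

    Assumes iron balance and uniform labeling, so that
    S(t) = S(0) * exp(-(loss / total_body_iron) * t).
    """
    return total_body_iron_mg * math.log(s0 / s_t) / days

# Illustrative numbers: a 9 percent fall in specific activity over one year
# in a man with about 4,000 mg of total body iron implies a loss near 1 mg/day.
print(round(daily_iron_loss_mg(4000, 1.00, 0.91, 365), 2))  # 1.03
```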

TABLE 9-3

Total Body Iron Losses in Adults.

Menstrual Losses. Additional iron is lost from the body as a result of menstruation in fertile women. Menstrual iron losses have been estimated in a number of studies (Beaton, 1974) (see review by Hefnawi and Yacout, 1978) and in three large community surveys conducted in Sweden (Hallberg et al., 1966b), England (Cole et al., 1971), and Egypt (Hefnawi et al., 1980). There was a reasonable degree of consistency between the different studies. The median blood volume lost per period reported in the three largest studies was 20.3 mL (Egypt), 26.5 mL (England), and 30.0 mL (Sweden). Losses greater than 80 mL were reported in less than 10 percent of women.
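
Menstrual blood loss per period can be translated into an average daily iron loss given a hemoglobin concentration, a cycle length, and the iron content of hemoglobin. The sketch below is illustrative: the figure of 3.47 mg of iron per gram of hemoglobin is a standard value that does not appear in this excerpt, and the assumed hemoglobin concentration and 28-day cycle are likewise assumptions.

```python
def menstrual_iron_loss_mg_per_day(blood_loss_ml_per_period,
                                   hemoglobin_g_per_l=135.0,
                                   cycle_days=28,
                                   iron_mg_per_g_hb=3.47):
    """Average daily iron loss implied by a given menstrual blood loss."""
    hemoglobin_g = blood_loss_ml_per_period * hemoglobin_g_per_l / 1000.0
    return hemoglobin_g * iron_mg_per_g_hb / cycle_days

# The Swedish median of 30 mL per period corresponds to about 0.5 mg/day.
# Mean blood losses exceed the median because the distribution is skewed,
# consistent with the mean loss of 0.6 to 0.7 mg/day cited earlier.
print(round(menstrual_iron_loss_mg_per_day(30.0), 2))  # 0.5
```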

Accretion. The requirement for pregnancy and for growth in children and adolescents can also be estimated from known changes in blood volume, fetal and placental iron concentration, and the increase in total body erythrocyte mass.

Balance Studies

Chemical balance is the classical method for measuring nutrient requirements through the estimation of daily intake and losses. While this direct approach is conceptually appealing, its use in measuring iron requirements presents several major technical obstacles (Hegsted, 1975). For instance, it is difficult to achieve a steady state with nutrients such as iron that are highly conserved in the body. Because the fraction of the dietary intake that is absorbed (and excreted) is very limited, even small errors in the recovery of unabsorbed food iron in the feces invalidate the results.

Thirteen adult balance studies were evaluated (Table 9-4). All of these studies yielded values that exceed the daily iron loss calculated on the basis of the disappearance of a long-lived iron radioisotope after uniform labeling of body iron (Green et al., 1968). One might therefore conclude that all of the subjects were in positive balance during the period of observation. Moreover, the magnitude of the estimated positive balance would, in most cases, have predicted a relatively rapid accumulation of body iron. Neither of these conclusions is compatible with numerous other experimental observations. Therefore, balance studies were not considered in estimating an average requirement.

TABLE 9-4

Iron Balance Studies in Adults.

FACTORS AFFECTING THE IRON REQUIREMENT

The proportion of dietary iron absorbed is determined by the iron requirement of the individual. Absorption is regulated by the size of the body iron store in healthy humans (percentage absorption is inversely proportional to serum ferritin concentration) (Cook et al., 1974). There is a several-fold difference in absorption from a meal between an individual who is iron deficient and someone with sizeable iron stores. The calculation of dietary requirements must be based on the maintenance of a well-defined iron status. This has been accomplished by setting the need for the maintenance of a minimal iron store (serum ferritin concentration cutoff of 15 μg/L) as the surrogate indicator of functional adequacy.

The other major factor to take into account when computing dietary iron requirements is iron bioavailability based on the composition of the diet. Iron is present in food as either part of heme, as found in meat, poultry, and fish, or as nonheme iron, present in various forms in all foods. As previously discussed, the absorption mechanisms are different. Heme iron is always well absorbed and is only slightly influenced by dietary factors. The absorption of nonheme iron is strongly influenced by its solubility and interaction with other meal components in the lumen of the upper small intestine.

Gastric Acidity

Decreased stomach acidity, due to overconsumption of antacids, ingestion of alkaline clay, or pathologic conditions such as achlorhydria or partial gastrectomy, may lead to impaired iron absorption (Conrad, 1968; Kelly et al., 1967).

Nutrient-Nutrient Interactions: Enhancers of Nonheme Iron Absorption

Ascorbic Acid. Ascorbic acid strongly enhances the absorption of nonheme iron. In the presence of ascorbic acid, dietary ferric iron is reduced to ferrous iron which forms a soluble iron-ascorbic acid complex in the stomach. Allen and Ahluwalia (1997) reviewed various studies in which ascorbic acid was added to meals consisting of maize, wheat, and rice. They concluded that iron absorption from meals is increased approximately two-fold when 25 mg of ascorbic acid is added and as much as three- to six-fold when 50 mg is added. There appears to be a linear relation between ascorbic acid intake and iron absorption up to at least 100 mg of ascorbic acid per meal.

Because ascorbic acid improves iron absorption through the release of nonheme iron bound to inhibitors, the enhanced absorption effect is most marked when consumed with foods containing high levels of inhibitors, including phytate and tannins. Ascorbic acid has been shown to improve iron absorption from infant weaning foods by two- to six-fold (Derman et al., 1980; Fairweather-Tait et al., 1995a).

Other Organic Acids. Other organic acids including citric acid, lactic acid, and malic acid have not been studied as thoroughly as ascorbic acid, but they also have some enhancing effects on nonheme iron absorption (Gillooly et al., 1983).

Animal Tissues. Meat, fish, and poultry improve iron nutrition both by providing highly bioavailable heme iron and by enhancing nonheme iron absorption. The mechanism of this enhancing effect on nonheme iron absorption is poorly described though it is likely to involve low molecular weight peptides that are released during digestion (Taylor et al., 1986).

Nutrient-Nutrient Interactions: Inhibitors of Nonheme Iron Absorption

Phytate. Phytic acid (inositol hexaphosphate) is present in legumes, rice, and grains. The inhibition of iron absorption from added iron is related to the level of phytate in a food (Brune et al., 1992; Cook et al., 1997). The absorption of iron from soy protein isolate was shown to increase four- to five-fold when the phytic acid concentration was reduced from 4.9 to 8.4 mg/g down to less than 0.1 mg/g (Hurrell et al., 1992). Genetically modified, low-phytic acid strains of maize have been developed. Iron absorption with consumption of low-phytic acid strains was 49 percent greater than with consumption of wild type strains of maize (Mendoza et al., 1998). Still, the overall availability of iron remained quite low, generally under 8 percent, even for subjects with marginal iron status. The absorption of iron from legumes such as soybeans, black beans, lentils, mung beans, and split peas has been shown to be very low (0.84 to 1.91 percent) and similar among them (Lynch et al., 1984). Because phytate and iron are concentrated in the aleurone layer and germ of grains, milling to white flour and white rice reduces the content of phytate and iron (Harland and Oberleas, 1987), thereby increasing the bioavailability of the remaining iron (Sandberg, 1991).

Polyphenols. Polyphenols markedly inhibit the absorption of nonheme iron. This was first recognized when tea consumption was shown to inhibit iron absorption (Disler et al., 1975). Iron binds to tannic acid in the intestinal lumen forming an insoluble complex that results in impaired absorption. The inhibitory effects of tannic acid are dose-dependent and reduced by the addition of ascorbic acid (Siegenberg et al., 1991; Tuntawiroon et al., 1991). The response to iron supplementation was shown to be significantly greater for Guatemalan toddlers who did not consume coffee (which contains tannic acid) than for those who did (Dewey et al., 1997). Polyphenols are also found in many grain products, other foods, herbs such as oregano, and red wine (Gillooly et al., 1984).

Vegetable Proteins. Soybean protein has an inhibitory effect on nonheme iron absorption that is not dependent on the phytate effect (Lynch et al., 1994). Bioavailability is improved by fermentation, which leads to protein degradation. The iron bioavailability from other legumes and nuts is also poor.

Calcium. Calcium inhibits the absorption of both heme and nonheme iron (Hallberg et al., 1991). The mechanism is not well understood (Whiting, 1995); however, calcium has been shown to inhibit iron absorption, in part by interfering with the degradation of phytic acid. Furthermore, it has been suggested that calcium inhibits heme and nonheme iron absorption during transfer through the mucosal cell (Hallberg et al., 1993). Calcium has a direct dose-related inhibiting effect on iron absorption such that absorption was reduced by 50 to 60 percent at doses of 300 to 600 mg of calcium added to wheat rolls (Hallberg et al., 1991). Inhibition may be maximal at this level. When preschool children consumed mean calcium intakes of 502 or 1,180 mg/day, no difference was observed in the erythrocyte incorporation of iron (Ames et al., 1999). Despite the significant reduction of iron absorption by calcium in single meals, little effect has been observed on serum ferritin concentrations in supplementation trials with supplement levels ranging from 1,000 to 1,500 mg/day of calcium (Dalton et al., 1997; Minihane and Fairweather-Tait, 1998; Sokoll and Dawson-Hughes, 1992).

Algorithms for Estimating Dietary Iron Bioavailability

Despite the complexity of the food supply, the various interactions, and the lack of long-term bioavailability studies, attempts have been made to develop an algorithm for estimating iron bioavailability based on nutrients and food components that improve and inhibit iron bioavailability. Monsen and coworkers (1978) developed a model that was based on the level of dietary meat, fish, or poultry and ascorbic acid.

Most recently, an algorithm has been developed and validated for calculating absorbed heme and nonheme iron by the summation of absorption values derived from single-meal studies to estimate the iron absorption from whole diets (Hallberg and Hulthen, 2000). This algorithm involves estimating iron absorption on the basis of the meal content of phytate, polyphenols, ascorbic acid, calcium, eggs, meat, seafood, soy protein, and alcohol. Reddy and coworkers (2000) have developed another algorithm based on the animal tissue, phytic acid, and ascorbic acid content of meals. It is also important to note that single-meal studies may exaggerate the impact of factors affecting iron bioavailability. Cook and coworkers (1991) compared nonheme iron bioavailability from single meals with that of a diet consumed over a 2-week period. There was a 4.5-fold difference between maximally enhancing and maximally inhibiting single meals. The difference was only two-fold when measured over the 2-week period.

The determination of an Estimated Average Requirement (EAR) depends on a precise assessment of the physiological requirement for absorbed iron and the estimation of the maximum rate of absorption that can be attained by individuals just maintaining the level of iron nutriture considered adequate to ensure normal function. As discussed earlier, normal function is preserved in individuals with a normal functional iron compartment provided that the dietary iron supply is secure and of sufficiently high bioavailability. There appears to be no physiological benefit to maintaining more than a minimal iron store (Siimes et al., 1980a, 1980b). The EAR is therefore set to reflect absorption levels in individuals with a normal complement of functional iron, but only minimal storage iron as indicated by a serum ferritin concentration of 15 μg/L (IOM, 1993). The selection of this criterion for adequate iron balance is critical to determining the EAR because iron absorption is controlled primarily by the size of iron stores. As iron stores rise, the percentage of dietary iron absorption and apparent bioavailability fall (Cook et al., 1974).

The second factor that is critical to determining the EAR is dietary iron bioavailability. Although much is known about the factors that enhance and inhibit iron absorption, the application of specific algorithms based on these factors to complex diets remains imprecise. Based on the general properties of the major dietary enhancers, the FAO/WHO (1988) identified three levels of bioavailability and the associated compositional characteristics of such diets. The typical diversified U.S. and Canadian diets containing generous quantities of flesh foods and ascorbic acid were judged to be 15 percent bioavailable. Constrained vegetarian diets, consisting mainly of cereals and vegetable foods with only small quantities of meat, fish, and ascorbic acid, were judged to be 10 percent bioavailable; very restricted vegetarian diets were judged to be 5 percent bioavailable. These levels of absorption were predicted for individuals who were not anemic, but had no storage iron. A mixed American or Canadian diet would therefore be predicted to allow the absorption of about 15 percent of the dietary iron in an individual whose iron status was selected as a basis for calculating the EAR (serum ferritin concentration of 15 μg/L).

Hallberg and Rossander-Hulten (1991) suggested that the bioavailability of iron in the U.S. diet may be somewhat higher than 15 percent: approximately 17 percent. Some support for this contention was provided by the observation of Cook and coworkers (1991) who measured nonheme iron absorption over a 2-week period in free-living American volunteers eating their customary diets. After correcting nonheme iron values (to a serum ferritin concentration of 15 μg/L), the bioavailability of nonheme iron in self-selected diets was 16.8 percent ([34 μg/L ÷ 15 μg/L] × 7.4 percent). Heme constitutes 10 to 15 percent of iron in the adult diet (Raper et al., 1984) and the diet of children (see Appendix Table I-2) and is always well absorbed. Based on a conservative estimation for overall heme absorption of 25 percent (Hallberg and Rossander-Hulten, 1991) and again a conservative estimate for the proportion of dietary iron that is in the form of heme (10 percent), estimated overall iron bioavailability in the mixed American or Canadian diet is approximately 18 percent:

Overall iron absorption = ([fraction of nonheme iron (0.9) × nonheme iron absorption (0.168)] + [fraction of heme iron (0.1) × heme iron absorption (0.25)]) × 100 = 17.6 percent.
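
The arithmetic can be verified in a few lines; every value below is taken from the text above.

```python
# Nonheme absorption of 7.4 percent measured at a mean serum ferritin of
# 34 ug/L, corrected to the reference ferritin concentration of 15 ug/L:
nonheme_absorption = 7.4 / 100 * (34 / 15)   # ~0.168

# Weight the two iron pools: 10 percent heme (absorbed at 25 percent) and
# 90 percent nonheme.
overall = 0.9 * nonheme_absorption + 0.1 * 0.25
print(f"{overall * 100:.1f} percent")  # 17.6, rounded to 18 percent
```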

For these reasons, 18 percent bioavailability is used to estimate the average requirement of iron for children over the age of 1 year, adolescents, and nonpregnant adults consuming the mixed diet typically consumed in the United States and Canada. The diets of most infants aged 7 through 12 months contain little meat and are rich in cereals and vegetables, a diet that approximates a medium bioavailability of 10 percent (Davidsson et al., 1997; Fairweather-Tait et al., 1995a; FAO/WHO, 1988; Skinner et al., 1997).

FINDINGS BY LIFE STAGE AND GENDER GROUP

Infants Ages 0 through 6 Months

Method Used to Set the Adequate Intake

No functional criteria of iron status have been demonstrated that reflect response to dietary intake in young infants. Thus, recommended intakes of iron are based on an Adequate Intake (AI) that reflects the observed mean iron intake of infants principally fed human milk.

At birth, the normal full-term infant has a considerable endowment of iron and a very high hemoglobin concentration. Because these body iron stores can be readily mobilized, the requirement for exogenous iron is virtually zero. After birth, an active process of shifts in iron compartments takes place. Fetal hemoglobin concentration falls, usually reaching a nadir when the infant is between 4 and 6 months of age, while hematopoiesis is very active and adult hemoglobin formation begins. Sometime between 4 and 6 months of age, infants begin to draw on exogenous sources of iron, and after 6 months it can be assumed that the stores endowed at birth have been utilized and that the physiological norm is to meet iron needs from exogenous rather than endogenous sources as erythropoiesis becomes more active. Thereafter, the hemoglobin concentration rises slowly but continuously (1 to 2 g/L/year) through at least puberty (longer in males) (Beaton et al., 1989). This normal physiological sequence of events complicates the estimation of iron requirements.

It is widely accepted that the iron intake of infants exclusively fed human milk must meet or exceed the actual needs of almost all of these infants and that the described pattern of utilization of iron stores is physiologically normal, not indicative of the beginning of iron deficiency. For this age group, it is assumed that the iron provided by human milk is adequate to meet the iron needs of the infant exclusively fed human milk from birth through 6 months. Therefore, the method described in Chapter 2 is used to set an AI for young infants based on the daily amount of iron secreted in human milk. The average iron concentration in human milk is 0.35 mg/L (Table 9-5). Therefore, the AI is set at 0.27 mg/day (0.78 L/day × 0.35 mg/L).

TABLE 9-5

Iron Concentration in Human Milk.

Since there is strong reason to expect that iron intake and iron requirement are both related to achieved body size and growth rate (milk volume relating to energy demand), it is assumed that a correlation between intake and requirement exists. This allows the group mean intake to be lower than the ninety-seven and one-half percentile of requirements (the level analogous to a Recommended Dietary Allowance). Therefore, an intake of 0.27 mg/day should not be expected to meet the needs of almost all individual infants, and the AI should be applied with extreme care.

Iron AI Summary, Ages 0 through 6 Months

AI for Infants
0–6 months 0.27 mg/day of iron

Special Considerations

The iron concentration in cow milk ranges between 0.2 and 0.3 mg/L (Lonnerdal et al., 1981). Although the iron content in human milk is lower, iron is significantly more bioavailable in human milk (45 to 100 percent) compared to infant formula (10 percent) (Fomon et al., 1993; Lonnerdal et al., 1981). Casein is the major iron-binding protein in cow milk (Hegenauer et al., 1979). Because of the poor absorption of iron, in the United States cow milk is not recommended for ingestion by infants until after 1 year of age; in Canada it is not recommended until after 9 months of age. In addition, the ingestion of cow milk by infants, especially in the first 6 months of life, has been associated with small amounts of blood loss in the stool. The cause of the blood loss is not well understood, but is assumed to be an allergic-type reaction between a protein in cow milk and the enterocytes of the gastrointestinal tract. Because the early, inappropriate ingestion of cow milk is associated with a higher risk of iron deficiency anemia, it would be prudent to monitor iron status of any infants ingesting cow milk. If anemia is detected, it should be treated with an appropriate dose of medicinal iron.

The American Academy of Pediatrics (AAP, 1999) and Canadian Paediatric Society (1991) reviewed the role of commercial formulas in infant feeding. Their conclusion was that infants who are not, or only partially, fed human milk should receive an iron-fortified formula.

Infants Ages 7 through 12 Months

Evidence Considered in Estimating the Average Requirement

For older infants the approach to estimation of requirements is parallel to that of other age and gender groups. Although body iron stores decrease during the first 6 months (and this is seen as physiologically normal), it is appropriate to make provision for the maintenance and development of modest iron stores in early life, even though requirements for older children, adolescents, and adults do not make provision for iron storage as a part of requirement.

For infants over the age of 6 months, it becomes both feasible and desirable to model the factorial components of absorbed iron requirements to set the Estimated Average Requirement (EAR) and Recommended Dietary Allowance (RDA) (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”). The major components of iron need for older infants are:

obligatory fecal, urinary, and dermal losses (basal losses);

increase in hemoglobin mass (increase in blood volume and increase in hemoglobin concentration);

increase in tissue (nonstorage) iron; and

increase in storage iron (as noted earlier, building a small reserve in very young children is seen as important).

A number of these component estimates can be linked to achieved size and growth rate. Dibley and coworkers (1987) provided data on both. Median body weights at 6 and 12 months were 7.8 and 10.2 kg for boys and 7.2 and 9.5 kg for girls (Dibley et al., 1987), and the body weights at the midpoint between 7 and 12 months (0.75 years) were 9.0 and 8.4 kg for male and female infants, respectively. These weights are similar to the reference weights provided in Table 1-1. Approximate normality was assumed, and the standard deviation (SD) estimates for infants fed human milk were used as an indicator of likely variability in body size (WHO, 1994). These were taken to represent a coefficient of variation (CV) of about 10 percent for this age group.

Basal Losses. The estimated basal loss of iron in infants is taken as 0.03 mg/kg/day (Garby et al., 1964). On the assumption that the variability of these losses is proportional to the variability of weight, the accepted estimates of basal losses are 0.22 ± 0.02 (SD) mg/day at 6 months and 0.31 ± 0.03 (SD) mg/day at 12 months for both genders; the midrange estimate is 0.26 ± 0.03 (SD) mg/day.

Increase in Hemoglobin Mass. The rate of hemoglobin formation, and hence the iron needed for that purpose, is a function of the rate of growth (weight velocity). The median or average growth rate is estimated as 13.3 g/day (2,400 g/180 days) for boys and 12.8 g/day (2,300 g/180 days) for girls, suggesting 13.0 g/day (0.39 kg/month) for both genders (Dibley et al., 1987). The World Health Organization Working Group on Infant Growth (WHO, 1994) gathered data on growth velocity from limited longitudinal studies of infants fed human milk. The reported means and SDs for 2-month weight increments at ages 8 to 20 months were 0.27 ± 0.14 kg/month (9 g/day) for boys and 0.26 ± 0.12 kg/month (8.6 g/day) for girls. The observed CV was 45 to 52 percent. Although skewing of the distributions would be expected, no information was provided. For the purposes of this report, the median weight increment is taken as 13 g/day for both genders, and the SD is taken as 6.5 g/day (CV, 50 percent).

If blood volume is estimated to be 70 mL/kg (Hawkins, 1964), the median hemoglobin concentration as 120 g/L, and the iron content of hemoglobin as 3.39 mg/g (Smith and Rios, 1974), then the amount of iron utilized for increase in hemoglobin mass can also be estimated:

Weight gain (0.39 kg/month) × blood volume factor (70 mL/kg) × hemoglobin concentration (0.12 g/mL) × iron concentration in hemoglobin (3.39 mg/g) ÷ 30 days/month = 0.37 mg/day.

The CV of iron utilization for this function is taken as the CV for weight gain, and thus the estimate becomes 0.37 ± 0.195 (SD) mg/day.
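
A minimal sketch of this computation, using only the constants cited above:

```python
# Iron deposited in new hemoglobin as the infant grows (constants cited
# above: 70 mL blood/kg, 120 g hemoglobin/L, 3.39 mg iron/g hemoglobin).
weight_gain = 0.39     # kg/month
blood_volume = 70.0    # mL of blood per kg of body weight
hb_conc = 0.12         # g of hemoglobin per mL of blood (120 g/L)
fe_per_hb = 3.39       # mg of iron per g of hemoglobin

iron_per_day = weight_gain * blood_volume * hb_conc * fe_per_hb / 30
print(f"{iron_per_day:.2f} mg/day")  # ~0.37
```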

Increase in the Nonstorage Iron Content of Tissues. The nonstorage iron content of tissues has been estimated as 0.7 mg/kg body weight for a 1-year-old child (Smith and Rios, 1974). On the assumption that this estimate can be applied at age 7 months as well, the average tissue iron deposition would be

Weight gain (13.3 g/day = 0.0133 kg/day) × nonstorage iron content (0.7 mg/kg) = 0.009 mg/day.

Applying the CV accepted for weight gain (50 percent) gives a modeling estimate of tissue iron deposition of 0.009 ± 0.0045 (SD) mg/day.

Increase in Storage Iron. The desired level of iron storage is a matter of judgment rather than physiologically definable need. In this report, it is assumed that body iron storage should approximate 12 percent of total iron deposition (Dallman, 1986b), or

(Increase in hemoglobin iron [0.37 mg/day] + Increase in nonstorage tissue iron [0.009 mg/day]) × (Percent of total tissue iron that is stored [12 percent] ÷ Percent of total iron that is not stored [100 – 12 percent]) = 0.051 mg/day.

The variability would be proportional to the combined variability of hemoglobin deposition and nonstorage iron deposition.

Total Requirement for Absorbed Iron. Median total iron deposition (hemoglobin mass + nonstorage iron + iron storage) is 0.43 mg/day (0.37 + 0.009 + 0.051) and basal iron loss is 0.26 ± 0.03 (SD) mg/day. Therefore, the median total requirement for absorbed iron is 0.69 ± 0.145 (SD) mg/day (Table 9-6).

TABLE 9-6

Summary Illustration of Median Absorbed Iron Requirements for Infants and Young Children.

Dietary Iron Bioavailability. During the second 6 months of life, it is assumed that complementary feeding is in place. The primary food introduced at this time is infant cereal, most often fortified with iron of low bioavailability (Davidsson et al., 2000); this cereal is the primary source of iron (see Appendix Table I-1). Feeding with human milk and infant formula (possibly fortified with iron) may continue. Iron absorption from human milk averaged 14.8 percent (Abrams et al., 1997). A study on food intakes of infants showed that by 1 year of age, over half of the infants consumed cereals and fruits, but less than half consumed meat or meat mixtures (Skinner et al., 1997). Only 32 percent of infants consumed beef at 12 months of age. Therefore, a moderate bioavailability of 10 percent is used to set the EAR at 6.9 mg/day (0.69 ÷ 0.1).
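
The full factorial chain for this age group can be sketched briefly; the component medians are those given above, so this is only an arithmetic check:

```python
# Factorial model for infants aged 7 through 12 months.
basal = 0.26                    # mg/day, midrange basal losses
hemoglobin = 0.37               # mg/day, increase in hemoglobin mass
tissue = 0.009                  # mg/day, nonstorage tissue iron
storage = (hemoglobin + tissue) * 12 / (100 - 12)   # ~0.051 mg/day

absorbed = basal + hemoglobin + tissue + storage    # ~0.69 mg/day
ear = absorbed / 0.10           # dietary iron at 10 percent bioavailability
print(f"absorbed {absorbed:.2f} mg/day, EAR {ear:.1f} mg/day")  # ~6.9
```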

Iron EAR and RDA Summary, Ages 7 through 12 Months

The EAR has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the fiftieth percentile, with use of an upper limit of 10 percent iron absorption and rounding (see Appendix Table I-3).

EAR for Infants
7–12 months 6.9 mg/day of iron

The RDA has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the ninety-seven and one-half percentile, with use of an upper limit of 10 percent iron absorption and rounding (see Appendix Table I-3).

RDA for Infants
7–12 months 11 mg/day of iron

Children Ages 1 through 8 Years

Evidence Considered in Estimating the Average Requirement

The EAR for children 1 through 8 years is determined by factorial modeling of the median components of iron requirements (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”). The model is presented for males and females though gender is ignored in deriving the EAR for young children because the gender differences are sufficiently small. The major components of iron need for young children are:

basal iron losses;

increase in hemoglobin mass;

increase in tissue (nonstorage) iron; and

increase in storage iron.

A fundamental influence on body iron accretion is the rate of change of body weight (growth rate). Because variability in body weight is needed for calculating the distribution of basal losses, the reference weights in Table 1-1 were not used. Median change in body weight was estimated as the slope of a linear regression of reported median body weights on age (weight = 7.21 + 2.29 × age, for pooled gender) (Frisancho, 1990) (Table 9-6). The fit was satisfactorily close for the purpose of modeling. Inclusion of gender in the model demonstrated that boys typically weighed more than girls, but the interaction term was insignificant, statistically and biologically. A median rate of weight change of 2.3 kg/year or 6.3 g/day was assumed for both sexes.

For each component discussed above, the data for children 1 through 3.9 years and 4 through 8.9 years were used for modeling the iron needs for children 1.5 through 3.5 years and 4.5 through 8.5 years, respectively. The midpoints for these age ranges are 2.5 and 6.5 years, which were used to estimate the total requirements for absorbed iron.

Basal Losses. Basal iron losses for children aged 1.5 to 8.5 years were derived from the total body iron losses directly measured in adult men (Green et al., 1968) (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”). Rather than assuming a linear function of body weight, estimated losses were adjusted to the child's body size on the basis of estimated surface area (Haycock et al., 1978). Body surface area was used rather than body weight because it is directly related to dermal iron losses (Bothwell and Finch, 1962) and because it is a predictor of metabolic size. On this basis, the adult male basal loss was computed as 0.538 mg/m²/day (Green et al., 1968). The derived values are presented in Table 9-6.

Garby and coworkers (1964) found that iron lost from the gastrointestinal tract alone was 0.03 mg/kg in infants, an amount that would yield higher estimated basal losses than were determined by extrapolating from the data of Green and coworkers (1968). Therefore, the basal losses of children 1 through 8 years of age may be underestimated. Nonetheless, the data of Green and coworkers (1968) were used because of the greater number of study subjects (n = 41 versus n = 3 studied by Garby and coworkers [1964]), as well as the finding that basal losses are related to body size (Bothwell and Finch, 1962; R. Green, University of Witwatersrand, Johannesburg, South Africa, personal communication, 2000).

Increase in Hemoglobin Mass. Median increase in hemoglobin mass was estimated as

Hemoglobin mass (g) = blood volume (L) × hemoglobin concentration (g/L).

During growth, both blood volume and hemoglobin concentration change with age. Although blood volume is a function of body weight, the actual relationship between blood volume and weight appears to change with age. Hawkins (1964) estimated blood volume at specific ages by averaging estimates obtained by several calculations based on body weight or body surface area. Hawkins' estimates are presented in Table 9-7. Age- and gender-specific hemoglobin concentration is estimated from the equations of Beaton and coworkers (1989): 119 + (1.4 × age [years]) g/L in males and 121 + (1.1 × age [years]) g/L in females. Estimated blood volume and hemoglobin mass are shown in Table 9-7. Change in hemoglobin mass was estimated as the difference between the masses at successive ages (Table 9-6). Iron needs were computed from the estimated change in hemoglobin mass and its expected iron content (3.39 mg/g). Thus, for example, from Table 9-7, the increase in hemoglobin mass between ages 7 and 8 years was 29.9 g (from 231.8 to 261.7 g). That represents 101.4 mg of iron (29.9 g × 3.39 mg/g) per year, or 0.28 mg/day, as shown for increase in hemoglobin mass at 7.5 years in Table 9-6.
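
This worked example can be expressed as a short sketch, using the Table 9-7 values quoted above:

```python
# Iron for the increase in hemoglobin mass between ages 7 and 8 years
# (hemoglobin masses from Table 9-7; 3.39 mg iron per g of hemoglobin).
hb_mass_7 = 231.8   # g, estimated hemoglobin mass at age 7
hb_mass_8 = 261.7   # g, estimated hemoglobin mass at age 8

iron_per_year = (hb_mass_8 - hb_mass_7) * 3.39   # ~101.4 mg/year
print(f"{iron_per_year / 365:.2f} mg/day")       # ~0.28 at age 7.5
```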

TABLE 9-7

Estimates of Blood Volume and Hemoglobin Mass, by Age and Gender.

Increase in the Nonstorage Iron Content of Tissues. Iron deposition was derived with use of the estimate of nonstorage tissue iron content (0.7 mg/kg) (Smith and Rios, 1974) and the median rate of weight change (2.29 kg/year). The estimated deposition is 0.004 mg/day (2.29 kg/year × 0.7 mg/kg ÷ 365 days/year) for all age groups (Table 9-6).

Increase in Storage Iron. Similar to the calculation described for older infants, increase in storage iron was computed as

(Increase in hemoglobin mass [mg/day] + Increase in tissue iron [mg/day]) × Portion of total tissue iron that is stored (12 percent).

This calculation was used for estimating an increase in iron stores for children up to 3 years old and was based on an estimated 12 percent of iron that enters storage (Dallman, 1986b). Beyond age 3 years, this percentage progressively falls, reaching zero (no provision for iron stores) by 9 years of age. The iron storage allowance for each age group is shown in Table 9-6.

Total Requirement for Absorbed Iron. Total requirement for absorbed iron for children 1 through 8 years is based on the higher estimates derived for males. Median total iron deposition (hemoglobin mass + nonstorage iron + iron storage) is 0.21 mg/day (0.18 + 0.004 + 0.023) and basal iron loss is 0.33 mg/day for children aged 1 through 3 years. Therefore, the median total requirement for absorbed iron is 0.54 mg/day (Table 9-6). The median total iron deposition is 0.27 mg/day (0.25 + 0.004 + 0.015) and basal iron loss is 0.47 mg/day for children 4 through 8 years. Therefore, the median total requirement for absorbed iron is 0.74 mg/day (Table 9-6).

Dietary Iron Bioavailability. Based on a heme iron intake of 11 percent of total iron for children 1 to 8 years old, the upper limit of absorption is 18 percent (see “Factors Affecting the Iron Requirement—Algorithms for Estimating Dietary Iron Bioavailability” and Appendix Table I-2).
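
As a brief consistency check, the component medians given above can be summed and divided by the 18 percent upper limit of absorption; the tabulated EARs were read from modeled skewed distributions, so this simple sum-and-divide is only approximate:

```python
# Median absorbed iron requirement and implied EAR for young children
# (component medians from the text and Table 9-6; 18 percent absorption).
components = {
    "1-3 years": (0.33, 0.18, 0.004, 0.023),  # basal, Hb, tissue, storage
    "4-8 years": (0.47, 0.25, 0.004, 0.015),
}
for age, parts in components.items():
    absorbed = sum(parts)
    print(f"{age}: {absorbed:.2f} mg/day absorbed, "
          f"EAR ~{absorbed / 0.18:.1f} mg/day")  # ~3.0 and ~4.1
```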

The derived estimates of dietary requirements are shown in Table 9-8. Representative values are selected for tabulated EARs and RDAs. The derived distributions of requirements for children 1 year of age and older are skewed and are tabulated in Appendix Table I-3.

TABLE 9-8

Derived Estimates of the Estimated Average Requirement (EAR) and Recommended Dietary Allowance (RDA) for Young Children.

Estimation of the Variability of Requirements. For the estimation of variability of requirements, it is necessary to have an estimate of the variability of weight velocity. The Infant Growth Study (WHO, 1994) offers an estimate of the variability of 2-month weight gains at 10 to 12 months. The apparent CV was 62.5 percent in boys and 63.6 percent in girls. The report on weight velocity standards for the United Kingdom (Tanner et al., 1966) suggests a CV of 25 to 30 percent for 1-year weight velocities at each year of age in the children examined. Given that the relative variability (CV) increases as the duration of the increment interval decreases, it was judged appropriate to accept a somewhat higher estimate of the variability over biologically meaningful intervals. A CV of 40 percent for weight velocity in boys and girls at ages 1 through 8 years is estimated. In all likelihood, the actual distribution of weight velocities is skewed, but no estimates of the actual distribution characteristics have been identified.

The variabilities of both hemoglobin iron deposition and tissue iron deposition were assigned the CV for weight gain (40 percent). Basal iron loss is estimated on the basis of surface area, so its logical variability would be proportional to the variability of surface area. To obtain an estimate of the variability of basal losses, these were computed for the weights and heights reported in the U.S. Department of Agriculture (USDA) Continuing Survey of Food Intakes by Individuals (CSFII) (self-reported weight and height), and the variability of the estimate within 1-year age intervals was examined. CVs were examined for individual age-sex groups on both the linear scale and the square root-transformed scale; the distributions showed appreciable departure from normality, and the square root transformation was empirically the best fit. CVs for individual age-sex groups ranged from 29 percent in 8-year-old boys to 47.4 percent in 4-year-old boys. With use of a statistical model that took into account age and gender effects, an overall CV of the basal iron loss was estimated as 38 percent. That CV was applied to the square root of the median basal losses shown in Table 9-6.

Iron EAR and RDA Summary, Ages 1 through 8 Years

The EAR has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the fiftieth percentile, and with use of an upper limit of 18 percent iron absorption and rounding (see Table 9-8 and Appendix Table I-3).

EAR for Children
1–3 years 3.0 mg/day of iron
4–8 years 4.1 mg/day of iron

The RDA has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the ninety-seven and one-half percentile, and with use of an upper limit of 18 percent iron absorption and rounding (see Table 9-8 and Appendix Table I-3).

RDA for Children
1–3 years 7 mg/day of iron
4–8 years 10 mg/day of iron

Children and Adolescents Ages 9 through 18 Years

Evidence Considered in Estimating the Average Requirement

The EAR for children and adolescents ages 9 through 18 years is determined by factorial modeling of the median components of iron requirements (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”). The major components of iron need for children are:

basal iron losses;

increase in hemoglobin mass;

increase in tissue (nonstorage) iron; and

menstrual iron losses in adolescent girls (aged 14 through 18 years).

In this model, no provision was made for the development of iron stores after early childhood. It is accepted that all recognized functions of iron are met before significant storage occurs and that stores are a reserve against possible future shortfalls in intake rather than a necessary functional compartment of body iron. Because most individuals in this age group in the United States and Canada are believed to consume iron at levels above their own requirement, it can be assumed that most will accumulate some stores.

The major physiological event occurring in this age group is puberty. The associated physiological processes that have major impacts on iron requirements are the growth spurt in both sexes, menarche in girls, and the major increase in hemoglobin concentrations in boys. Because the growth spurt and menarche are linked to physiological age, the chronological age at which these events occur varies among individuals. The factorial model distorts this by using averages. Since the growth spurt and menarche can be detected in the individual, provision is made for adjustments of requirement estimates when counseling specific individuals. These are addressed later under “Special Considerations”.

Estimation of the variability of requirements in this age range is complicated because of the physiological changes that occur. In this report, median requirements for absorbed iron are estimated for each year of age, but the variability of requirement and the requirement for absorbed iron at the ninety-seven and one-half percentile are estimated at the midpoint for children 9 through 13 years (11 years) and adolescents 14 through 18 years (16 years).

For modeling, the entire age range is treated as a continuum; for description, the conventional age intervals of the DRIs are used. Although requirement estimates have been developed for individual ages, these should be interpreted with care. Unsmoothed data have been used and year-by-year fluctuations may not be meaningful.

In addition to achieved size, it is necessary to estimate growth rates (weight velocities). After fitting linear regressions to median weights for segments of the age range, the regression slopes were taken as estimates of median weight velocities for the age interval. The estimates used are shown in Table 9-9.

TABLE 9-9

Growth Velocity for Boys and Girls.

Basal Losses. Basal iron loss estimates are based on the study of Green and coworkers (1968) (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”). Observations in adult men were extrapolated to adolescents on the basis of basal losses of 14 μg/kg/day applied to median body weight; the losses for each age group are shown in Table 9-10.

TABLE 9-10

Summary Illustration of Median Absorbed Iron Requirements for Children and Adolescents, Aged 9 through 18 Years.

Increase in Hemoglobin Mass. Estimation of the net iron utilization for increasing hemoglobin mass necessitates estimation of the rate of increase in blood volume and estimation of the rate of change in hemoglobin concentration. Blood volume is taken as approximately 75 mL/kg in boys and 66 mL/kg in girls (Hawkins, 1964). The average yearly weight gains for boys and girls are shown in Table 9-9. The rate of change in hemoglobin concentration has been directly estimated as the coefficients of the linear regression models applied to hemoglobin versus age for Nutrition Canada data by Beaton and coworkers (1989). The rate of change in hemoglobin concentration and the average hemoglobin concentrations for boys and girls are shown in Table 9-11. The iron content of hemoglobin is 3.39 mg/g (Smith and Rios, 1974), therefore the daily iron need for increased hemoglobin mass can be calculated as follows:

TABLE 9-11

Equations Used to Estimate Hemoglobin Concentration and Increase in Hemoglobin (Hb) Concentration.

For boys: ([Weight (kg) × increase in hemoglobin concentration (g/L/year)] + [Weight gain (kg/year) × hemoglobin concentration (g/L)]) × blood volume (0.075 L/kg) × hemoglobin iron (3.39 mg/g) ÷ 365 days/year.

For girls: ([Weight (kg) × increase in hemoglobin concentration (g/L/year)] + [Weight gain (kg/year) × hemoglobin concentration (g/L)]) × blood volume (0.066 L/kg) × hemoglobin iron (3.39 mg/g) ÷ 365 days/year.

For example, the median daily need for increased hemoglobin mass for a 16-year-old girl would be ([55.6 × 0.28] + [1.63 × 135]) × 0.066 × 3.39 ÷ 365, or 0.14 mg/day.
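
The same calculation can be expressed as a small function; the constants are those in the equation above, and the example reproduces the 16-year-old girl figure:

```python
# Daily iron need for the increase in hemoglobin mass during adolescence.
def hb_iron_need(weight_kg, hb_rate, gain_kg_yr, hb_conc, blood_vol_l_kg):
    """Implements the equation above; 3.39 mg iron per g of hemoglobin."""
    return ((weight_kg * hb_rate + gain_kg_yr * hb_conc)
            * blood_vol_l_kg * 3.39 / 365)

# 16-year-old girl: 55.6 kg, hemoglobin rising 0.28 g/L/year,
# gaining 1.63 kg/year, hemoglobin 135 g/L, blood volume 0.066 L/kg.
print(f"{hb_iron_need(55.6, 0.28, 1.63, 135, 0.066):.2f} mg/day")  # ~0.14
```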

Increase in the Nonstorage Iron Content of Tissues. Nonstorage tissue iron concentration (myoglobin and enzymes) (Table 9-10) can be calculated when the average weight gain for boys and girls and the iron content in muscle tissue are known. The iron deposition is approximately 0.13 mg/kg total weight gain (0.26 mg/kg muscle tissue) (Smith and Rios, 1974). The median need for absorbed iron associated with increase in weight in both sexes is

Tissue iron = Weight gain (kg/year) × nonstorage tissue iron (0.13 mg/kg) ÷ 365 days/year, or weight gain × 0.00036 mg/day.

For example, the median daily need for nonstorage iron for a 16-year-old boy is 0.001 mg/day (2.75 × 0.13 ÷ 365) after rounding. No provision is made for iron storage after the age of 9 years. It is not a component of requirement though it can be expected to occur when intake exceeds actual requirement.

Menstrual Losses. Iron losses in the menses can be calculated when the average blood loss, the average hemoglobin concentration, and concentration of iron in hemoglobin (3.39 mg/g) (Smith and Rios, 1974) are known. It was deemed appropriate to use the blood losses reported by Hallberg and coworkers (1966a, 1966b) with additional information from Hallberg and Rossander-Hulthen (1991) and, more specifically, to use the blood loss estimates for 15-year-old girls. These losses were lower than those reported for older ages.

Several important features of these and other data related to menstrual blood loss were recognized in developing models to predict requirements:

Menstrual losses are highly variable among women and the distribution of losses in the population shows major skewing, with some women having losses in excess of three times the median value.

Menstrual losses are very consistent from one menstrual cycle to the next for an individual woman.

Once the woman's menstrual pattern is established after her menarche, menstrual losses are essentially unchanged until the onset of menopause in healthy women. Hallberg and coworkers (1966b) found very little difference in blood loss with age. Losses were lower in the 15-year-old group, but incomplete collection might have been a factor. Cole and coworkers (1971) reported a small effect of age that was attributed to two covariates, parity and infant birth weight.

Contraceptive methods have a major impact on menstrual losses. Bleeding is significantly increased by the use of certain intrauterine devices and significantly decreased in individuals taking oral contraceptives.

Age, body size, and parity were not considered to have an effect of sufficient magnitude on menstrual blood losses to include them as factors in the models for estimating iron requirements in females—except with regard to the lower menstrual loss assumed for adolescents.

The data on menstrual losses reported by Hallberg and coworkers (1966a, 1966b) were used for all calculations in adolescent and adult females. This data set was selected for the following reasons:

It is representative of the other survey data quoted above and can be considered generalizable to women living in countries other than that of the study, including the United States and Canada.

Women were selected to fall into six age groups between 15 and 50 years, thus permitting estimates for all women.

Although the original data were not available, comprehensive descriptions of the distribution of menstrual losses are available from a series of publications by Hallberg and colleagues.

The survey was carried out before intrauterine devices and oral contraceptives were widely available. Only one woman in the study was using an oral contraceptive. None of them used an intrauterine device. The measurement can therefore reasonably be assumed to reflect “usual losses”.

Blood losses per menstrual cycle were converted into estimated daily iron losses averaged over the whole menstrual cycle. The following assumptions were made:

Blood loss does not change with mild anemia and is therefore independent of hemoglobin concentration.

In estimating hemoglobin loss (blood loss × hemoglobin concentration), hemoglobin concentration was taken as a constant (135 ± 9 g/L in adult women and based on age in adolescents) (Hallberg and Rossander-Hulthen, 1991) and variance was ignored.

The iron content of hemoglobin is 3.39 mg/g (Smith and Rios, 1974).

The duration of the average menstrual cycle is 28 days. Beaton and coworkers (1970) reported a cycle duration of 27.8 ± 3.6 days in 86 self-selected healthy volunteers.

Since the distribution of menstrual blood losses in the data reported by Hallberg is skewed, it was modeled as described previously (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”). Comparison of the observed and modeled values (Table 9-12) provides a way of visualizing the adequacy of the fit of the model. A log-normal distribution was fitted to the reported percentiles of the blood loss distribution (natural log of blood loss = 3.3183 ± 0.6662 [SD]), resulting in a median blood loss of 27.6 mL/menstrual cycle. Blood losses of greater than 100 mL/menstrual cycle are observed at the ninety-fifth percentile (Table 9-12), and the distribution is highly skewed. Although these high menstrual losses were found in apparently healthy women, it would be difficult to exclude unidentified hemostatic disorders (Edlund et al., 1996) or occult uterine disease as possible contributory factors. The investigators considered all the subjects they studied to be free of any condition that might affect menstruation. There are no criteria for identifying a subpopulation at risk for increased menstrual blood loss or for setting an upper limit for “normal” losses. Calculation of the EAR and RDA was therefore based on the complete set of observations.

TABLE 9-12

Comparison of Reported and Modeled Distributions of Blood Loss Per Menstrual Cycle of Swedish Women.

Regression estimates of hemoglobin concentration and rates of change in hemoglobin concentration by age and gender have been derived by Beaton and coworkers (1989). The estimated hemoglobin concentration for females 14 to 20 years of age was 131 + (0.28 × age [years]) g/L.

The above data were used to compute median menstrual iron loss as follows:

Menstrual iron loss (mg/day) = blood loss (27.6 mL ÷ 28 days) × hemoglobin concentration (131 + [0.28 × age] g/L, ÷ 1,000 mL/L) × iron content of hemoglobin (3.39 mg/g).

Thus for adolescent girls, the median iron loss would be 0.45 mg/day (Table 9-10). Menstrual iron losses prior to age 14 years are discussed under “Special Considerations.” (For a discussion of menstrual iron losses during oral contraceptive use, see the “Special Considerations” section following “Lactation”.)
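
A minimal sketch of this calculation for a 16-year-old, using the fitted median and the regression above:

```python
# Median daily menstrual iron loss for an adolescent girl.
blood_loss = 27.6            # mL per 28-day cycle (modeled median)
hb_conc = 131 + 0.28 * 16    # g/L at age 16 (Beaton et al., 1989)
fe_per_hb = 3.39             # mg of iron per g of hemoglobin

iron_loss = (blood_loss / 28) * (hb_conc / 1000) * fe_per_hb
print(f"{iron_loss:.2f} mg/day")  # ~0.45
```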

Total Requirement for Absorbed Iron. Because the components (basal iron loss, hemoglobin mass, and nonstorage iron) are not all normally distributed (some are skewed), the components shown in Table 9-10 cannot simply be summed to determine an EAR and RDA accurately. After summing the components for each individual in the simulated population, the estimated percentiles of the distribution were tabulated and are shown in Appendix Tables I-3 and I-4. The modeled distribution of iron requirements is used to set the EAR (fiftieth percentile) and RDA (ninety-seven and one-half percentile) with the assumption of an upper limit of 18 percent for iron absorption.

Dietary Iron Bioavailability. The upper limit of dietary iron absorption was estimated to be 18 percent and used to set the EAR based on the fiftieth percentile of absorbed iron requirements (see “Factors Affecting the Iron Requirement—Algorithms for Estimating Dietary Iron Bioavailability”).

Estimation of the Variability of Requirements. While Table 9-10 shows an estimate of median requirement, it is a simple summation and does not reflect the distributions. The distribution of requirements must be modeled using Monte Carlo simulation before the EAR and RDA can be estimated. This necessitates estimation of variability for components of requirements.

Basal or obligatory losses were derived from Green and coworkers (1968) with the assumption of proportionality to body surface area. To derive an estimate of variability of surface area, basal losses were computed with use of heights and weights reported in the USDA CSFII 1994–1996. Various transformations were then tested; a square root transformation approximated normality. The relative variability of surface area in this proxy data set was taken as an estimate of variability of basal iron loss. The observed CVs of proxy basal loss were 22.7 and 8.7 percent for boys aged 11 and 16, respectively, and 19.1 and 13.2 percent for girls aged 11 and 16, respectively. These CVs were applied to the square root of median iron loss, estimated on the basis of weight at ages 11 and 16 years (median loss shown in Table 9-10).

Estimating iron associated with change in hemoglobin mass requires consideration of rate of increase in blood volume and in hemoglobin concentration. Blood volume estimates were based on body size, and estimated median growth velocity is shown in Table 9-9. The algorithm for estimating iron need was presented earlier. For the purpose of modeling, blood volume as a proportion of body weight and rate of hemoglobin change as a function of age were taken as constants. The variability of iron need was attributed to variation in weight and weight velocity.

Based on reported percentiles of body weight in the Third National Health and Nutrition Examination Survey (NHANES III), normal distributions were fitted at 11 and 16 years of age for boys and girls. The fit was approximate only but acceptable for the present purpose. The resultant body weight distributions (kg) were 42.96 ± 12.47 and 70.30 ± 12.70 for boys aged 11 and 16, respectively, and 44.96 ± 9.96 and 61.36 ± 12.88 for girls aged 11 and 16, respectively.

The average weights differ from the median weights shown in Table 9-10. Estimates of weight velocity at ages 11 and 16 years were based on the analyses of longitudinal data reported by Tanner and coworkers (1966) (Table 9-9). Approximation of a normal distribution was assumed. The resultant distributions of weight velocities (kg/year) were used for modeling: 4.87 ± 1.65 and 2.75 ± 2.27 for boys aged 11 and 16, respectively, and 4.77 ± 2.06 and 1.63 ± 1.63 for girls aged 11 and 16, respectively. The variability of tissue iron deposition was based on the variability of body weight.

Values for hemoglobin concentration and rate of change in hemoglobin concentration were estimated for these ages from the equations of Beaton and coworkers (1989), and variability in hemoglobin concentration was ignored. Variability arising from menstrual loss was estimated from the fitted log-normal distribution of blood loss (ln blood loss = 3.3183 ± 0.6662 [SD]).
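
The simulation can be illustrated with a simplified Monte Carlo sketch for 16-year-old girls, using the distributions quoted above. The handling of basal-loss variability and the normality assumptions are simplifications of the report's treatment, so the resulting percentiles only approximate the tabulated EAR and RDA (7.9 and 15 mg/day):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # number of simulated 16-year-old girls

# Body weight (kg) and weight velocity (kg/year), both taken as normal
# (NHANES III and Tanner et al., 1966, as quoted above); negative
# velocities from the normal approximation are retained.
weight = rng.normal(61.36, 12.88, n)
velocity = rng.normal(1.63, 1.63, n)

# Hemoglobin concentration (g/L) and its rate of change (g/L/year),
# treated as constants (Beaton et al., 1989).
hb_conc = 131 + 0.28 * 16
hb_rate = 0.28

# Basal losses: 0.014 mg/kg/day at the median weight, with the CV of
# 13.2 percent applied on a square-root scale (simplified here).
sqrt_med = np.sqrt(61.36 * 0.014)
basal = rng.normal(sqrt_med, 0.132 * sqrt_med, n) ** 2

# Iron for expansion of hemoglobin mass: blood volume 0.066 L/kg,
# 3.39 mg of iron per g of hemoglobin.
hb_iron = (weight * hb_rate + velocity * hb_conc) * 0.066 * 3.39 / 365

# Nonstorage tissue iron: 0.13 mg per kg of weight gained.
tissue = velocity * 0.13 / 365

# Menstrual losses: log-normal blood loss per 28-day cycle
# (ln mL = 3.3183 +/- 0.6662), converted to daily iron loss.
blood = rng.lognormal(3.3183, 0.6662, n)
menstrual = (blood / 28) * (hb_conc / 1000) * 3.39

# Total absorbed need, converted to dietary iron at 18 percent absorption.
dietary = (basal + hb_iron + tissue + menstrual) / 0.18
print(np.percentile(dietary, [50, 97.5]))  # roughly 8 and 15 mg/day
```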

Iron EAR and RDA Summary, Ages 9 through 18 Years

The EAR has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the fiftieth percentile, and with use of an upper limit of 18 percent iron absorption and rounding (see Appendix Tables I-3 and I-4). For the EAR and RDA for girls, it is assumed that girls younger than 14 years do not menstruate and that all girls 14 years and older do menstruate.

EAR for Boys
9–13 years 5.9 mg/day of iron
14–18 years 7.7 mg/day of iron
EAR for Girls
9–13 years 5.7 mg/day of iron
14–18 years 7.9 mg/day of iron

The RDA has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the ninety-seven and one-half percentile, and with use of an upper limit of 18 percent iron absorption and rounding (see Appendix Tables I-3 and I-4).

RDA for Boys
9–13 years 8 mg/day of iron
14–18 years 11 mg/day of iron
RDA for Girls
9–13 years 8 mg/day of iron
14–18 years 15 mg/day of iron

Special Considerations

Adjustment for Growth Spurt. During the growth spurt, median rates of growth in boys might be double those seen at 11 years of age; for girls the difference is smaller (about a 50 percent increase). The needs for absorbed iron associated with growth (increase in body weight) were estimated as 0.035 mg/g of weight gained for boys and 0.030 mg/g of weight gained for girls. The additional weight gain in the peak growth spurt years was estimated as the difference between the maximum and average growth rates (Table 9-9), which is 15.2 g/day ([10.43 – 4.87 kg/year] × 1,000 g/kg ÷ 365 days/year) for boys and 6.76 g/day ([7.24 – 4.77 kg/year] × 1,000 g/kg ÷ 365 days/year) for girls. These represent demands of 0.53 mg/day of absorbed iron for boys and 0.20 mg/day for girls. Therefore, the increased requirement for dietary iron is approximately 2.9 mg/day for boys identified as currently in the growth spurt and approximately 1.1 mg/day for girls.
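
A short check of this arithmetic, using the Table 9-9 figures above and the chapter's standard 18 percent bioavailability:

```python
# Extra dietary iron during the peak growth spurt, from the difference
# between peak and average growth rates (Table 9-9 values).
for sex, peak, average, mg_per_g in [("boys", 10.43, 4.87, 0.035),
                                     ("girls", 7.24, 4.77, 0.030)]:
    extra_g_per_day = (peak - average) * 1000 / 365   # g/day of extra gain
    absorbed = extra_g_per_day * mg_per_g             # mg/day absorbed iron
    print(f"{sex}: {absorbed:.2f} mg/day absorbed, "
          f"{absorbed / 0.18:.1f} mg/day dietary")
# boys: ~0.53 and ~2.9-3.0 (2.9 in the text, after intermediate rounding);
# girls: ~0.20 and ~1.1
```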

Menstruation Before Age 14 Years. In the United States, the average age of menarche is about 12.5 years. It is reasonable to assume that by age 14 almost all girls will have started to menstruate, and hence the estimates of iron requirements should include menstrual losses at that time. It would be unreasonable to assume that no girls are menstruating before age 14 years. For girls under age 14 who have started to menstruate, it would be appropriate to consider a median menstrual loss of 0.45 mg/day of iron. Therefore, the requirement is increased by approximately 2.5 mg/day of iron.

Adults Ages 19 Years and Older

Method Used to Estimate the Average Requirement

Factorial modeling was used to calculate the EAR and RDA for adult men and women (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”). Requirements for maintaining iron balance were derived by estimating losses. No provision is made for growth beyond age 19 years, and therefore there is no allowance for deposition of tissue iron.

Men. Basal iron loss was the only component used to estimate total needs for absorbed iron. Basal losses are based on the study by Green and coworkers (1968). Basal iron losses are taken as related to body weight (14 μg/kg/day), and for adult men, the requirement for absorbed iron is equivalent to the basal losses:

Basal losses (mg/day) = Weight (kg) × 0.014 mg/kg/day. (1)

There are insufficient data for estimating the variability of basal losses in adult men. Therefore, the median and variability of basal losses were calculated by using the median and variability values for body weight reported in NHANES III. Because variability in body weight is needed for calculating the distribution of basal losses, the reference weights in Table 1-1 were not used. The square roots of the recorded weights for men are reasonably normally distributed:

Weight^0.5 = (77.4 kg)^0.5 = 8.8 ± 0.84 (SD). (2)

The distribution of basal losses, and therefore requirements in men, was obtained by combining equations (1) and (2). The estimated median daily iron loss in men living in the United States—and therefore the median requirement for absorbed iron—is 1.08 mg/day (77.4 kg × 0.014 mg/kg/day). The ninety-seven and one-half percentile of absorbed iron requirements is 1.53 mg/day.

The upper limit of dietary iron absorption was estimated to be 18 percent (see “Factors Affecting the Iron Requirement—Algorithms for Estimating Dietary Iron Bioavailability”). Using this value, the EAR is 6 mg/day (1.08 mg/day ÷ 0.18).
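
Because the requirement for men is a deterministic function of weight, the key percentiles can be checked directly from the square-root-scale weight distribution; a minimal sketch using equations (1) and (2):

```python
# Median and 97.5th percentile of absorbed iron requirement for men,
# from the square-root-scale weight distribution (equations 1 and 2).
median_weight = 8.8 ** 2                  # = 77.4 kg
p975_weight = (8.8 + 1.96 * 0.84) ** 2    # ~109 kg at the 97.5th percentile

median_loss = 0.014 * median_weight       # ~1.08 mg/day absorbed
p975_loss = 0.014 * p975_weight           # ~1.53 mg/day absorbed
ear = median_loss / 0.18                  # ~6 mg/day dietary iron
print(f"{median_loss:.2f}, {p975_loss:.2f}, EAR {ear:.1f} mg/day")
```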

It is important to note that these calculations ignore the fact that men have higher iron stores than women. Moreover, the calculations assume that this widely recognized observation has no biological importance, but is merely the consequence of a total intake of food energy and associated food iron that is typically higher in men than in women, coupled with a much lower iron need in men. Appendix Table I-3 provides the estimated percentiles of the distribution of iron requirements for adult men.

Menstruating Women. Factorial modeling is again used to estimate the requirement for absorbed iron. Iron requirements for women were estimated by using the customary two-component model:

Iron requirement = basal losses + menstrual losses.

There are no direct measurements of basal iron losses, separated from menstrual iron losses, in women. Values for women have therefore been derived from the observations made in men (Green et al., 1968) (see “Selection of Indicators for Estimating the Iron Requirement—Factorial Modeling”) by using a simple linear weight adjustment. The mean and variability of basal losses are based on the distribution of body weights recorded in NHANES III. Because variability in body weight is needed for calculating the distribution of basal losses, the reference weights in Table 1-1 were not used. The square roots of the reported weights are reasonably normally distributed:

Weight^0.5 = (64 kg)^0.5 = 8.0 ± 1.06 (SD).

Therefore the median basal iron loss was calculated as follows:

Basal iron losses (mg/day) = Median weight (64 kg) × 0.014 mg/kg/day = 0.896 mg/day.

The ninety-seven and one-half percentile of the estimated absorbed iron requirement is 1.42 mg/day (101.6 kg × 0.014 mg/kg/day). Menstrual blood (iron) losses have been estimated in many small studies (Beaton, 1974) and in two large community surveys, one in Sweden (Hallberg et al., 1966b) and the other in the United Kingdom (Cole et al., 1971). The findings of all of these studies were reasonably consistent. The factors and choice of data selection described for adolescent girls were also used for estimating menstrual losses in premenopausal women. Table 9-12 shows that the modeled median blood loss per menstrual cycle is 30.9 mL. The average concentration of iron in hemoglobin is 3.39 mg/g (Smith and Rios, 1974). As determined by Beaton and coworkers (1989), the average hemoglobin concentration for nonanemic women is 135 g/L. Using the above information, the daily menstrual iron loss can be calculated as follows:

Menstrual iron loss (mg/day) = blood loss per cycle (30.9 mL) × hemoglobin concentration (135 g/L ÷ 1,000 mL/L) × iron concentration in hemoglobin (3.39 mg/g) ÷ 28 days/cycle = 0.51 mg/day.

The simulated distribution of menstrual losses is shown in Table 9-13. Median total iron needs were derived by summing the component needs (basal loss [0.896 mg/day] + menstrual loss [0.51 mg/day] = 1.4 mg/day).

TABLE 9-13

Estimated Distribution of Menstrual Losses and Absorbed and Dietary Iron Needs in Adult Women.

The upper limit of dietary iron absorption was estimated to be 18 percent (see “Factors Affecting the Iron Requirement—Algorithms for Estimating Dietary Iron Bioavailability”). By dividing the sum of absorbed requirements by 18 percent, a distribution of dietary requirements was derived (see Appendix Table I-4). Based on this calculation and rounding, the EAR and RDA are set at 8.1 and 18 mg/day, respectively, for menstruating women not using oral contraceptives.

Postmenopausal Women. As for men, basal iron loss is the only component of iron needs for postmenopausal women. The physiological iron requirement and the EAR and RDA were derived by factorial modeling using the following equation:

Basal losses (μg/day) = weight (kg) × 14 μg/kg/day.

As was the case for men, the median and variability of basal losses were calculated using the median and variability values for body weight reported in NHANES III. Because variability in body weight is needed for calculating the distribution of basal losses, the reference weights in Table 1-1 were not used. The square roots of the recorded weights approximate a normal distribution:

Weight^0.5 = (64 kg)^0.5 = 8.0 ± 1.06 (SD).

The distribution of basal losses, and therefore requirements for postmenopausal women, was obtained by combining the equations relating weight to basal losses and describing the weight distribution, as outlined for men (Appendix Table I-3). The estimated median daily iron loss in postmenopausal women living in the United States, and therefore the median requirement for absorbed iron, is 0.896 mg/day (64 kg × 0.014 mg/kg/day). The ninety-seven and one-half percentile of the estimated absorbed iron requirement is 1.42 mg/day (101.6 kg × 0.014 mg/kg/day).

The upper limit of dietary iron absorption was estimated to be 18 percent (see “Factors Affecting the Iron Requirement—Algorithms for Estimating Dietary Iron Bioavailability”). Based on this value, the EAR is set at 5 mg/day (0.896 ÷ 0.18).

It is assumed that basal losses, as a function of lean body mass, are essentially constant with age. Thus with increasing age, the only adjustment made to the EAR was the reduction associated with menopause.

Iron EAR and RDA Summary, Ages 19 Years and Older

The EAR has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the fiftieth percentile, and with use of an upper limit of 18 percent iron absorption and rounding (Appendix Tables I-3 and I-4).

EAR for Men
19–30 years 6 mg/day of iron
31–50 years 6 mg/day of iron
51–70 years 6 mg/day of iron
> 70 years 6 mg/day of iron
EAR for Women
19–30 years 8.1 mg/day of iron
31–50 years 8.1 mg/day of iron
51–70 years 5 mg/day of iron
> 70 years 5 mg/day of iron

The RDA has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the ninety-seven and one-half percentile, and with use of an upper limit of 18 percent iron absorption and rounding (Appendix Tables I-3 and I-4).

RDA for Men
19–30 years 8 mg/day of iron
31–50 years 8 mg/day of iron
51–70 years 8 mg/day of iron
> 70 years 8 mg/day of iron
RDA for Women
19–30 years 18 mg/day of iron
31–50 years 18 mg/day of iron
51–70 years 8 mg/day of iron
> 70 years 8 mg/day of iron

Pregnancy

Evidence Considered in Estimating the Average Requirement

Factorial modeling is used to estimate median requirements of pregnant women (see “Selection of Indicators for Estimating the Requirement for Iron—Factorial Modeling”) with use of the equation:

Requirement for absorbed iron = basal losses + iron deposited in fetus and related tissues + iron utilized in expansion of hemoglobin mass.

Basal Losses. Using a body weight of 64 kg for a nonpregnant woman and an average basal loss of 14 μg/kg (Green et al., 1968), basal iron losses were calculated to be 0.896 mg/day (64 kg × 0.014 mg/kg) or approximately 250 mg for the entire pregnancy (280 days).

Fetal and Placental Iron Deposition. Numerous estimates of the iron content of the fetus and placental tissue exist. In the computation of the requirements, an estimate of 315 mg has been used (FAO/WHO, 1988). Bothwell and coworkers (1979) and Bothwell (2000) offered an estimate of 360 mg/pregnancy (270 + 90), whereas Hytten and Leitch (1971) suggested a total of 450 mg/pregnancy (375 + 75) but noted that there were insufficient data to estimate deposition by trimester. Thus, while there is considerable disagreement regarding these estimates, there are no new data to determine which estimate is more accurate. For this reason, the FAO/WHO total of 315 mg of iron partitioned by trimester was used.

Increase in Hemoglobin Mass. Although controversy continues, a generally accepted value for the iron needed to allow for expansion of hemoglobin mass is approximately 500 mg (FAO/WHO, 1988). Hemoglobin mass changes very little during the first trimester but expands greatly during the second and third trimesters. Information on the precise timing of the increase remains uncertain. For modeling, an equal division between the second and third trimesters is assumed, in keeping with FAO/WHO (1988).

The actual magnitude of hemoglobin mass expansion depends on the extent of iron supplementation provided (De Leeuw et al., 1966). Beaton (2000) suggested that for every 10 g/L difference in the final hemoglobin concentration in the last trimester of pregnancy, there would be a difference of about 175 mg in the estimate of need for absorbed iron. It follows that the estimate of iron needs in pregnancy is directly dependent upon the cut-off that is used for hemoglobin concentration. In turn, that cut-off may depend on whether one believes that the iron needs of pregnancy can ever be met by diet alone. Evidence is needed concerning the functional significance of using a somewhat lower cut-off for final hemoglobin concentration. In this connection, it is to be recognized that targeting a high hemoglobin concentration implies a lower efficiency of dietary iron utilization, given that iron absorption is strongly affected by body iron status (Beaton, 2000). At this time, the hemoglobin concentration implied by the reference curve portrayed in Figure 9-1 is accepted.

FIGURE 9-1

Hemoglobin concentrations in healthy, iron-supplemented (100–325 mg/day) pregnant women living in industrialized countries. The upper solid line represents the median hemoglobin concentration; the lower dashed curve represents the fifth percentile.

With the above estimates, the total usage of iron throughout pregnancy is 250 mg (basal losses) + 320 mg (fetal and placental deposition) + 500 mg (increase in hemoglobin mass), or 1,070 mg. At delivery, actual loss of iron in blood, including blood trapped in the placenta, may be in the range of 150 to 250 mg. That implies that of the 500 mg allowed for erythrocyte mass expansion during pregnancy, as much as 250 to 350 mg remains in the body to revert to maternal stores. The net cost of pregnancy could then be estimated as approximately 700 to 800 mg of iron (1,070 – [250 to 350]). This amount could be seen as the obligatory need for absorbed iron. Iron is not utilized at a uniform rate during pregnancy. The estimates of deposition of iron in the conceptus by stage of pregnancy are presented in Table 9-14.
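
The arithmetic of this accounting can be sketched briefly, using the component values as they appear in the total above:

```python
# Gross and net iron cost of pregnancy (values from the text).
basal = 250         # mg over 280 days (0.896 mg/day, rounded)
conceptus = 320     # mg deposited in fetus and placenta
hb_expansion = 500  # mg for expansion of maternal hemoglobin mass

gross = basal + conceptus + hb_expansion   # 1,070 mg
# An estimated 250 to 350 mg of the hemoglobin iron reverts to stores.
net_low, net_high = gross - 350, gross - 250
print(f"gross {gross} mg; net {net_low} to {net_high} mg")  # ~700-800
```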

TABLE 9-14

Estimated Deposition of Iron in Conceptus by Stage of Pregnancy.

Dietary Iron Bioavailability. The upper limit of dietary iron absorption is approximately 25 percent during the second and third trimesters (Barrett et al., 1994). This may be an underestimate of efficiency, coupled perhaps with the acceptance of too high a target for third trimester hemoglobin concentrations.

Table 9-15 presents a summary of the factorial model for estimation of median physiological needs, and Table 9-16 translates this to the median dietary iron requirement for pregnant women for each trimester. The iron requirement for women during the first trimester is less than that for premenopausal women because menstruation has ceased.

TABLE 9-15

Summary of Absorbed Iron Requirements in Pregnant Adult Women.

TABLE 9-16

Dietary Iron Requirement During Pregnancy.

Estimation of the Variability of Requirements. Several approaches regarding components of variation could be considered in estimating the CV for iron needs in pregnancy:

variability of basal requirement based on prepregnancy body weight; this would then need to be matched with the estimates of basal losses in nonpregnant females;

variability of iron in the fetus based on variation in fetal weight at term; basing variability on birth weight alone would be a conservative (low) approach;

variability of blood iron based on variation in hemoglobin concentration (SD of about 9 g/L) ignoring variation in blood volume; and

variation based on the responses to level of iron supplementation.

The most conservative approach is based on variation in basal loss and assumes a CV of body weight of 21 percent (see “Adults Ages 19 Years and Older”) and a CV of hemoglobin concentration in iron-supplemented women during the third trimester of about 7 percent (9 g/L ÷ 135 g/L) (Beaton et al., 1989). When these assumptions are applied, with basal losses based on prepregnancy weight, the iron need for the products of conception is 315 ± 66.2 (SD) mg, and the iron need for hemoglobin mass expansion is 500 ± 35 (SD) mg. For the total pregnancy, this model yielded an estimated requirement of 1,055 ± 99.2 (SD) mg (CV, 9.4 percent). Table 9-16 summarizes the average requirement for absorbed and dietary iron for each trimester.

To estimate the needs of pregnant adolescents, the approach described above was followed with the notable exception that for adolescents the factorial model included basal losses and iron deposition in tissue as computed for adolescents. The fact that birth weights for adolescent mothers tend to be lower than for older women was ignored. In adolescents, the ninety-seven and one-half percentile of requirement was estimated for each trimester from simulation models rather than deriving one CV estimate and applying it to all three trimesters.

Iron EAR and RDA Summary, Pregnancy

The EAR and RDA are established by using the estimates for the third trimester; meeting this level from the first trimester onward allows iron stores to be built early in pregnancy.

EAR for Pregnancy
14–18 years 23 mg/day of iron
19–30 years 22 mg/day of iron
31–50 years 22 mg/day of iron

The RDA has been set by modeling the components of iron requirements, estimating the requirement for absorbed iron at the ninety-seven and one-half percentile, and using an upper limit of 25 percent iron absorption and rounding.

RDA for Pregnancy
14–18 years 27 mg/day of iron
19–30 years 27 mg/day of iron
31–50 years 27 mg/day of iron

Lactation

Evidence Considered in Estimating the Average Requirement

Components of Requirement. Until menstruation resumes, assumed to be after 6 months of exclusive breast feeding, median iron needs during lactation are estimated as the sum of iron secretion in human milk and basal iron losses calculated for nonpregnant, nonlactating women (0.896 mg/day). The derived estimate of iron secreted in mature human milk is 0.27 ± 0.089 (SD) mg/day (0.35 mg/L × 0.78 L/day) (Table 9-5 and Chapter 2). Therefore, the median total requirement for absorbed iron is 1.17 mg/day (0.896 mg/day + 0.27 mg/day). For adolescent lactating mothers, the approach was identical to the one above except that in addition to basal losses (0.85 mg/day) and milk secretion (0.27 mg/day), provision was made for the deposition of iron in tissues (0.001 mg/day) and hemoglobin mass (0.14 mg/day) (see Table 9-10) as part of expected growth of the mother. Thus, the median requirement for absorbed iron is 1.26 mg/day (0.85 + 0.27 + 0.001 + 0.14). Again, a simulation model was used to derive the ninety-seven and one-half percentile of need.

Dietary Iron Bioavailability. To estimate the total iron requirement for lactation, iron secreted in milk and basal iron losses must be added by means of their simulated distributions. The resultant distribution of iron needs, with an assumed absorption of 18 percent, yields the EARs and RDAs listed below.

Estimation of the Variability of Requirements. The variability of requirement was based on basal needs modeled as described for nonpregnant, nonlactating women and on milk secretion modeled from the distribution above. Large breast-feeding studies suggest CVs of milk volume between 10 and 40 percent (Dewey and Lonnerdal, 1983; Vaughan et al., 1979), and a CV of 30 percent was adopted for milk volume. The iron concentration of mature human milk, for the purpose of modeling, is taken as 0.35 mg/L, assuming normality with a CV of 33 percent.
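
A minimal sketch of the median calculation follows; the component values are those in the text, and because the EARs themselves were read from simulated distributions, this sum-and-divide is only an approximate check:

```python
# Median absorbed iron need during lactation and the implied dietary
# requirement at 18 percent absorption (component values from the text).
milk_iron = 0.35 * 0.78                       # ~0.27 mg/day secreted in milk

adult = 0.896 + milk_iron                     # ~1.17 mg/day absorbed
adolescent = 0.85 + milk_iron + 0.001 + 0.14  # growth components added

print(f"adult: {adult / 0.18:.1f} mg/day")            # ~6.5 (EAR, 19-50 y)
print(f"adolescent: {adolescent / 0.18:.1f} mg/day")  # ~7.0 (EAR, 14-18 y)
```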

Iron EAR and RDA Summary, Lactation

EAR for Lactation
14–18 years 7 mg/day of iron
19–30 years 6.5 mg/day of iron
31–50 years 6.5 mg/day of iron

The RDA for iron is set by determining the estimate of requirements at the ninety-seven and one-half percentile.

RDA for Lactation
14–18 years 10 mg/day of iron
19–30 years 9 mg/day of iron
31–50 years 9 mg/day of iron

Special Considerations

Use of Oral Contraceptives and Hormone Replacement Therapy

It has been reported that approximately 17 percent of women in the United States use oral contraceptives (Abma et al., 1997), which are known to reduce menstrual blood loss. Although many studies have documented lower menstrual blood losses among women using oral contraceptives, only one study actually allowed estimation of the magnitude of the reduction compared to expected loss. A reanalysis of data from that study (Nilsson and Solvell, 1967) suggested that a reasonable estimate of the effect would be the equivalent of a 60 percent reduction from expected loss. Therefore, the requirements at the fiftieth and ninety-seven and one-half percentiles for adolescent girls taking oral contraceptives are 6.9 and 11.4 mg/day, respectively, and 6.4 and 10.9 mg/day for premenopausal women (see Appendix Table I-4).

Hormone replacement therapy (HRT), which provides estrogen and progesterone, is commonly practiced by postmenopausal women. Some uterine bleeding can occur in some women during HRT, especially during the first year of therapy (Archer et al., 1999; MacLennan et al., 1993; Oosterbaan et al., 1995). Therefore, women on HRT who continue to menstruate may have higher iron requirements than postmenopausal women who are not on HRT.

Vegetarianism

As previously discussed, iron is more bioavailable from meat than from plant-derived foods, and meat and fish also enhance the absorption of nonheme iron. Therefore, nonheme iron absorption is lower for those consuming vegetarian diets than for those eating nonvegetarian diets (Hunt and Roughead, 1999). Serum ferritin concentrations have been observed to be markedly lower in vegetarian men, women, and children than in those consuming a nonvegetarian diet (Alexander et al., 1994; Dwyer et al., 1982; Shaw et al., 1995). For these reasons, individuals who typically consume vegetarian diets may have difficulty obtaining enough bioavailable iron to meet the EAR. Cook and coworkers (1991) compared iron bioavailability from single meals with that from a diet consumed over a 2-week period. There was a 4.4-fold difference between maximally enhancing and maximally inhibiting single meals, but the difference was only two-fold when measured over the 2-week period. It is therefore estimated that the bioavailability of iron from a vegetarian diet is approximately 10 percent, rather than the 18 percent typical of a mixed Western diet; hence the requirement for iron is 1.8 times higher for vegetarians. It is important to emphasize that diets of even lower bioavailability (approaching 5 percent overall absorption) may be encountered with very strict vegetarianism and in some developing countries where access to a variety of foods is limited.
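
The 1.8-fold figure follows directly from the ratio of the two absorption estimates, since the dietary requirement is the absorbed requirement divided by fractional absorption. A short sketch, using an illustrative absorbed-iron need:

    absorbed_need = 1.46                    # mg/day, illustrative value
    mixed_western = absorbed_need / 0.18    # ~8.1 mg/day at 18 percent absorption
    vegetarian = absorbed_need / 0.10       # ~14.6 mg/day at 10 percent absorption
    print(vegetarian / mixed_western)       # 1.8, i.e., 0.18 / 0.10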

Intestinal Parasitic Infection

Intestinal parasites infect approximately 1 billion people worldwide. Some of these parasites, particularly hookworm, cause significant intestinal blood loss. These infections are prevalent in developing countries where the intake of bioavailable iron is often inadequate. When possible, the primary intervention should be elimination of the parasitic infection. In addition, an adequate intake of bioavailable dietary iron may be necessary to treat iron deficiency. When sufficient bioavailable dietary iron cannot be obtained, supplemental iron may be needed. Various regimens are provided for such groups at risk of iron deficiency anemia (Stoltzfus and Dreyfuss, 1998; WHO/UNICEF/UNU, 1998).

Blood Donation

An annual donation of 0.5 L of blood is equivalent to between 200 and 250 mg of iron, which represents approximately 0.6 to 0.7 mg/day. Blood donors have lower serum ferritin concentrations than nondonors (Milman and Kirchhoff, 1991a, 1991b). More frequent donations can be problematic, especially for women, resulting in a need for supplemental iron (Garry et al., 1995).
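
These figures can be verified with simple arithmetic, assuming whole blood contains roughly 0.4 to 0.5 mg of iron per mL (a typical value at normal hemoglobin concentrations; this per-mL figure is an assumption, not stated in the text):

    iron_per_donation = 500 * 0.45       # mL x mg/mL = ~225 mg per 0.5-L donation
    daily_extra_loss = iron_per_donation / 365
    print(round(daily_extra_loss, 2))    # ~0.62 mg/day, consistent with 0.6 to 0.7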

Increased Iron Losses in Exercise and Intense Endurance Training

Many reviewers of the scientific literature conclude that iron status is marginal or inadequate in a large number of individuals, particularly females, who engage in regular physical exercise (Clarkson and Haymes, 1995; Raunikar and Sabio, 1992; Weaver and Rajaram, 1992). Dietary intake patterns of these individuals are frequently suboptimal, with a reduced intake of a number of micronutrients. Weaver and Rajaram (1992) estimated that daily iron losses increase to 1.75 mg/day in male athletes and to 2.3 mg/day in female athletes with prolonged training. This is in contrast to a whole-body iron loss of approximately 1.08 mg/day in males beyond puberty and 1.45 mg/day in menstruating females. Ehn and coworkers (1980) demonstrated that highly trained, long-distance runners have a biologic half-life of body iron of only approximately 1,000 days, a significantly shorter time than the 1,300 and 1,200 days, respectively, of male and female nonexercisers. Several reviewers of this topic conclude that increased fecal losses and perhaps sporadic hematuria contribute to depressed iron stores in athletic segments of the population (Siegel et al., 1979; Stewart et al., 1984). There is a notable reduction in hematologic parameters that could be the result of increased intravascular hemolysis of erythrocytes, and many studies have found an increased rate of erythrocyte turnover and fragility in athletes (Lampe et al., 1991; Newhouse and Clement, 1995; Rowland et al., 1991). Thus, several mechanisms by which iron balance could be affected by intense physical exercise have been advanced (Fogelholm, 1995; Magnusson et al., 1984; Weight, 1993), including increased gastrointestinal blood losses after running and hemoglobinuria as a result of erythrocyte rupture within the foot during running. For the above reasons, and based on the strong whole-body iron loss data collected by Ehn and coworkers (1980), the EAR for iron is conservatively set 30 percent higher for those who engage in regular intense exercise. If the estimate of Weaver and Rajaram (1992) is used, the EAR may be as much as 70 percent greater in the subpopulation of athletes.
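
The adjustment itself is a simple scaling of the EAR. In the sketch below, the baseline EARs are illustrative (they approximate the adult values derived in this chapter), and the factors 1.30 and 1.70 correspond to the conservative estimate and the Weaver and Rajaram estimate, respectively.

    def exercise_adjusted_ear(ear, factor=1.30):
        """Scale a baseline EAR (mg/day) upward for regular intense exercise."""
        return ear * factor

    print(exercise_adjusted_ear(6.0))          # adult man, conservative: ~7.8 mg/day
    print(exercise_adjusted_ear(8.1, 1.70))    # menstruating woman, upper estimate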

Validation of Requirement Estimates

The theoretical and operational derivation of iron requirement estimates has been described for each life stage group. Requirements have been based on the estimation of the amount of iron needed to meet body functions with minimal storage. This level of nutriture is marked by a serum ferritin concentration of about 15 μg/L in children, adolescents, and adults and by a somewhat lower concentration (10 to 12 μg/L) in infants. Percentiles of the simulated distributions of requirement are presented in Appendix Tables I-3 and I-4.

The prevalence of apparently inadequate intakes was estimated by assessing the distribution of usual intakes and applying risk tables (Appendix Tables I-5, I-6, and I-7) derived from the estimated requirement distributions; this prevalence was then compared with the estimated prevalence of inadequate iron status based on serum ferritin concentration (see Table 14-1). The data sets used in this comparison were the USDA CSFII 1994–1996 for iron intake and NHANES III for serum ferritin concentration. Statistical procedures were used to derive estimates of usual iron intake and usual serum ferritin concentration; the data were also adjusted, with use of reported weighting factors, to represent the U.S. population and to compensate for the fact that sampling weights were not identical in the two data sets. Table 9-17 presents the outcome of this comparison. Considering that the dietary data do not include iron ingested as supplements and that no adjustment for presumed underreporting has been made, the agreement between apparent dietary inadequacy and apparent biochemical deficiency is reasonable for most age groups.
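
The comparison rests on the probability approach: each individual's risk of inadequacy is the fraction of the requirement distribution lying above his or her usual intake, and the group prevalence is the average of those risks. A self-contained sketch follows, with stand-in distributions rather than the CSFII or NHANES data:

    import numpy as np

    def prevalence_inadequate(usual_intakes, simulated_requirements):
        # Risk for each person = fraction of requirements above his or her
        # usual intake; prevalence = mean risk across the group.
        reqs = np.sort(np.asarray(simulated_requirements))
        risks = 1.0 - np.searchsorted(reqs, usual_intakes, side="right") / reqs.size
        return risks.mean()

    # Stand-in distributions for illustration only:
    rng = np.random.default_rng(1)
    intakes = rng.lognormal(np.log(12.0), 0.35, 50_000)     # usual intakes, mg/day
    requirements = rng.normal(8.1, 8.1 * 0.30, 50_000)      # requirements, mg/day
    print(prevalence_inadequate(intakes, requirements))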

TABLE 9-17

Comparison of Estimated Prevalence of Apparently Inadequate Iron Intakes and Serum Ferritin Concentrations Indicative of Apparent Iron Deficiency, Third National Health and Nutrition Examination Survey, 1988–1994.

Children Ages 1 through 8 Years

The estimated prevalence of inadequate intake is lower (less than 5 percent) than the estimated prevalence of inadequate iron status for children (Table 9-17). One reason for the lack of congruence between iron intake and iron status may be the lack of validation of cut-off concentrations for serum ferritin in young children. Although studies have confirmed the correlation between a lack of storage iron and low ferritin concentrations, such studies have not been conducted in children. Thus ferritin concentrations of 10 and 15 μg/L may not be indicative of low iron stores in children.

Children and Adolescents Ages 9 through 18 Years

When the predicted prevalence of inadequate intakes and the reported prevalence of iron deficiency are compared in those aged 9 through 18 years, agreement is not consistent (Table 9-17). For example, in girls aged 9 through 13 years, the prevalence of inadequate intake is less than 5 percent, but the prevalence of low serum ferritin concentration is 8 percent. The lack of congruence of these results is likely due to the fact that a proportion of girls aged 12 and 13 years have reached menarche and have higher iron requirements than those who have not reached menarche. There is better congruence between dietary and biochemical estimates for 14- through 18-year-old girls whose iron requirements include menstrual iron losses. Among boys, the prevalences of inadequate intakes and low serum ferritin concentrations are both less than 5 percent.

Adults Ages 19 Years and Older

There is congruence between the prevalences of inadequate iron intakes and low serum ferritin concentrations for men and for pre- and postmenopausal women (Table 9-17). The prevalence of inadequate iron intakes for premenopausal women is approximately 20 percent, and the prevalence of low serum ferritin concentrations is 13 to 16 percent; these prevalences indicate that the additional iron requirement due to menstrual losses is not being met in this group of women.

The overall pattern offers some degree of reassurance that the general model used to estimate requirements, the specific estimates of components of that model, and the assumed limits to bioavailability of dietary iron are reasonable.

INTAKE OF IRON

Food Sources

The iron content of vegetables, fruits, breads, and pasta varies from 0.1 to 1.4 mg/serving. Because most grain products are fortified with iron, approximately one-half of ingested iron comes from bread and other grain products such as cereals and breakfast bars. Some fortified cereals contain as much as 24 mg of iron per 1-cup serving. Heme iron represents only 7 to 10 percent of dietary iron for girls and women and only 8 to 12 percent for boys and men (Raper et al., 1984). Human milk provides approximately 0.27 mg/day (Table 9-5).

Dietary Intake

Data from nationally representative U.S. surveys are available to estimate iron intakes (Appendix Tables C-18, C-19, D-3, E-5). Data from these surveys indicate that the median daily intake of dietary iron by men is approximately 16 to 18 mg/day, and the median intake by pre- and postmenopausal women is approximately 12 mg/day. Data from a survey done in two Canadian provinces showed that the dietary intake of iron by both men and women was slightly lower than intakes in the United States (Appendix Table F-2). The median intake of dietary iron by pregnant women was approximately 15 mg/day, which is less than the Estimated Average Requirement (EAR) of 22 mg/day, indicating the need for iron supplementation during pregnancy.

Intake from Supplements

Approximately 21 to 25 percent of women and 16 percent of men were reported to consume a supplement that contains iron (Moss et al., 1989; see Table 2-2). The median intake of iron from supplements is approximately 1 mg/day for both men and women, an amount based on the difference between the median iron intake from food plus supplements and that from food alone (Appendix Tables C-18 and C-19). The median iron intake from food plus supplements by pregnant women is approximately 21 mg/day.

TOLERABLE UPPER INTAKE LEVELS

The Tolerable Upper Intake Level (UL) is the highest level of daily nutrient intake that is likely to pose no risk of adverse health effects for almost all individuals. Although members of the general population should be advised not to routinely exceed the UL, intake above the UL may be appropriate for investigation within well-controlled clinical trials. Clinical trials of doses above the UL should not be discouraged, as long as subjects participating in these trials have signed informed consent documents regarding possible toxicity and as long as these trials employ appropriate safety monitoring of trial subjects. In addition, the UL is not meant to apply to individuals who receive iron under medical supervision.

Hazard Identification

Iron is a redox-active transition metal. In health, it is carried from one tissue to another bound to transferrin and is stored in cells in the form of ferritin or hemosiderin. These proteins hold iron in the ferric state. Kinetic restrictions prevent the iron from being reduced by cellular reductants, and it is thus shielded from unwanted participation in redox reactions (McCord, 1996). If the transport and storage mechanisms are overwhelmed, the free iron is immediately chelated by cellular compounds, such as citrate or adenosine diphosphate; iron bound to such compounds readily participates in redox reactions, catalyzing the formation of highly toxic free radicals or initiating lipid peroxidation.

Adverse Effects

Acute Effects. There are reports of acute toxicity resulting from overdoses of medicinal iron, especially in young children (Anderson, 1994; Banner and Tong, 1986; NRC, 1979). Accidental iron overdose is the most common cause of poisoning deaths in children under 6 years of age in the United States (FDA, 1997). Vomiting and diarrhea characterize the initial stages of iron intoxication. With increasing time after ingestion, at least five organ systems can become involved: cardiovascular, central nervous system, kidney, liver, and hematologic (Anderson, 1994). The severity of iron toxicity is related to the amount of elemental iron absorbed. Symptoms occur with doses between 20 and 60 mg/kg, with the low end of the range associated primarily with gastrointestinal irritation and systemic toxicity occurring at the high end (McGuigan, 1996). These data, however, were not used to set the UL because acute intake data are not considered in deriving a UL.
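
To put the 20 to 60 mg/kg range in concrete terms, a short calculation for a hypothetical 10-kg toddler; the tablet strength is an assumption added here, though 325-mg ferrous sulfate tablets providing about 65 mg of elemental iron are common:

    weight_kg = 10
    low_mg, high_mg = 20 * weight_kg, 60 * weight_kg   # 200-600 mg elemental iron
    tablets_low = low_mg / 65     # ~3 tablets reach the low end of the range
    tablets_high = high_mg / 65   # ~9 tablets reach the high end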

Iron-Zinc Interactions. High intakes of iron supplements have been associated with reduced zinc absorption as measured by changes in serum zinc concentrations after dosing (Fung et al., 1997; Meadows et al., 1983; O'Brien et al., 2000; Solomons, 1986; Solomons and Jacob, 1981; Solomons et al., 1983). However, plasma zinc concentrations are not considered to be good indicators of body zinc stores (Whittaker, 1998). Studies using zinc radioisotopes showed reduced zinc absorption when both minerals were administered in the fasting state at an iron-zinc ratio of 25:1 but not at 1:1 or 2.5:1 (Sandstrom et al., 1985). When iron and zinc supplements were given with a meal, however, this effect was not observed. Other investigators have reported similar observations (Davidsson et al., 1995; Fairweather-Tait et al., 1995b; Valberg et al., 1984; Walsh et al., 1994; Yip et al., 1985). A radioisotope-labeling study by Davidsson and coworkers (1995) showed that fortifying foods such as bread, infant formula, and weaning foods with iron had no effect on zinc absorption. In general, the data indicate that large doses of supplemental iron inhibit zinc absorption if both are taken without food, but do not inhibit zinc absorption if they are consumed with food. Because there is no evidence of any clinically significant adverse effect associated with iron-zinc interactions, this effect is not used to determine a UL for iron.

Gastrointestinal Effects. High-dose iron supplements are commonly associated with constipation and other gastrointestinal (GI) effects, including nausea, vomiting, and diarrhea (Blot et al., 1981; Brock et al., 1985; Coplin et al., 1991; Frykman et al., 1994; Hallberg et al., 1966c; Liguori, 1993; Lokken and Birkeland, 1979) (Table 9-18). Because GI effects are local, their frequency and severity depend on the amount of elemental iron released in the stomach (Hallberg et al., 1966c). The adverse effects of supplemental iron appear to be reduced when iron is taken with food (Brock et al., 1985). While most of the observed effects are relatively minor, some individuals have found them severe enough to stop supplementation (Frykman et al., 1994).

A single-blinded, 8-week study by Brock et al. (1985) reported “moderate to severe” GI effects in 50 percent of subjects taking 50 mg/day of elemental iron as ferrous sulfate. This finding is supported by other, better-controlled prospective studies showing GI effects at similar doses (Coplin et al., 1991; Frykman et al., 1994; Lokken and Birkeland, 1979). Together, these data support a causal relationship between high supplemental iron intake and GI effects.

Secondary Iron Overload. Secondary iron overload occurs when the body iron stores are increased as a consequence of parenteral iron administration, repeated blood transfusions, or hematological disorders that increase the rate of iron absorption. Although the iron in patients with secondary iron overload tends to be stored initially in macrophages where it is less damaging, the typical pathological consequences of iron overload that are characteristic of hereditary hemochromatosis may eventually occur.

Whether an excessive iron intake alone can lead to secondary iron overload and associated organ damage is unknown. Some individuals appear to control their rates of iron acquisition very effectively in the face of a high iron intake, but as yet there has been no study with a large number of experimental subjects and a sufficient duration to be certain of this conclusion. Individuals who are heterozygous for hemochromatosis manifest minor phenotypic expression, usually a slight to moderate increase in serum ferritin concentrations and transferrin saturation (Bulaj et al., 1996). Iron stores are modestly increased but do not continue to rise significantly with increasing age, and the pathological features of homozygous hemochromatosis do not occur.

There is only one clear example of dietary iron overload. The high prevalence of iron overload in South African and Zimbabwean blacks is associated with the consumption of traditional beer with an average iron content of 80 mg/L (Bothwell et al., 1964). The iron is highly bioavailable, and some people may consume several liters of the beer per day. Iron overload does not occur in members of the population who do not consume large quantities of the beer. There is therefore little doubt that the high iron intake plays a major role in the pathogenesis of sub-Saharan iron overload. However, intake may not be the only factor: Gordeuk and coworkers (1992) collected evidence suggesting that there is also a genetic component, involving a gene different from the HFE gene linked to hereditary hemochromatosis (Feder, 1999).

Cardiovascular Disease. Sullivan (1981) first hypothesized that increased body iron plays a role in the development of coronary heart disease (CHD). This hypothesis was based on the observation that the prevalence of ischemic heart disease is lower in premenopausal women than in either men or postmenopausal women. According to Sullivan's hypothesis, the prevalence of CHD is higher in men and increases after menopause in women as a result of higher body iron stores.

Epidemiological support for this hypothesis was provided by Salonen and coworkers (1992). In a cohort study, they demonstrated a significant association between high serum ferritin concentrations and the risk of myocardial infarction (MI) among middle-aged men in Finland. Men with serum ferritin concentrations greater than 200 μg/L had a 2.2-fold greater risk of acute MI than men with levels less than 200 μg/L. The association was even stronger in those with high cholesterol concentrations. Their original conclusions were confirmed by a reanalysis of the same group of subjects after a 5-year follow-up (Salonen et al., 1994). Another prospective cohort study reported an association between high serum ferritin concentrations and carotid vascular disease (Kiechl et al., 1997). However, several other large prospective cohort studies failed to demonstrate a significant relationship between serum ferritin concentrations and increased risk for CHD (Aronow and Ahn, 1996; Frey and Krider, 1994; Magnusson et al., 1994; Manttari et al., 1994; Stampfer et al., 1993) (Table 9-19).

TABLE 9-18

Iron and Gastrointestinal (GI) Adverse Effects, by Increasing Dose.

The relationships between various other measures of iron status (e.g., serum transferrin saturation, serum iron concentration, and total iron-binding capacity) and CHD severity, incidence, or mortality have been examined in other prospective cohort studies. Investigators reported that transferrin saturation (Liao et al., 1994), serum iron concentrations (Liao et al., 1994; Morrison et al., 1994; Reunanen et al., 1995), and total iron-binding capacity (Magnusson et al., 1994) were related to CHD (Tables 9-19 through 9-22). However, some of these same studies and several other large prospective cohort studies failed to demonstrate any relationship with transferrin saturation (Baer et al., 1994; Reunanen et al., 1995; Sempos et al., 1994; Van Asperen et al., 1995) or total iron-binding capacity (Liao et al., 1994; Reunanen et al., 1995; Van Asperen et al., 1995).

Danesh and Appleby (1999) recently conducted a systematic assessment of 12 prospective epidemiological studies of iron status and CHD. They concluded that these studies do not support a strong association between iron status and CHD.

There was no association between CHD and heterozygosity for hereditary hemochromatosis in two studies (Franco et al., 1998; Nassar et al., 1998). Two subsequent surveys from Europe, however, demonstrated a two-fold increase in acute MI in heterozygous men (Tuomainen et al., 1999) and a 1.6-fold increase in overall CHD mortality in heterozygous women (Roest et al., 1999). In summary, the currently available data do not provide convincing support for an association between high body iron stores and increased risk of CHD.

Taken as a whole, this body of evidence does not provide convincing support for a causal relationship between the level of dietary iron intake and the risk for CHD. However, it is also important to note that the evidence is insufficient to definitively exclude iron as a risk factor. Several studies suggest that the serum ferritin concentration is directly correlated with the risk for CHD. However, serum ferritin concentrations are affected by several factors other than dietary iron intake. The significance of the high serum ferritin concentrations that have been observed in population surveys and the nature of the relationship between serum ferritin concentration and CHD risk remain to be determined.

TABLE 9-19

Serum Ferritin Concentration and Cardiovascular Disease.