Food fortification or enrichment is the process of adding micronutrients (essential trace elements and vitamins) to food. It can be carried out by food manufacturers, or by governments as a public health policy that aims to reduce the number of people with dietary deficiencies within a population. The predominant diet within a region can lack particular nutrients because of the local soil or because of inherent deficiencies in the staple foods; the addition of micronutrients to staples and condiments can prevent large-scale deficiency diseases in these cases.[1]
As defined by the World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO), fortification refers to "the practice of deliberately increasing the content of an essential micronutrient, i.e. vitamins and minerals (including trace elements) in a food, to improve the nutritional quality of the food supply and to provide a public health benefit with minimal risk to health", whereas enrichment is defined as "synonymous with fortification and refers to the addition of micronutrients to a food which are lost during processing".[2]
The WHO and FAO have identified food fortification as the second of four strategies for reducing the incidence of nutrient deficiencies at the global level. As outlined by the FAO, the most commonly fortified foods are cereals and cereal-based products; milk and dairy products; fats and oils; accessory food items; tea and other beverages; and infant formulas.[3] Undernutrition and nutrient deficiencies are estimated to cause the deaths of between 3 and 5 million people per year worldwide.
Fortification appears in common food items in two different ways: adding nutrients back and adding new ones. Flour loses nutritional value because of the way grains are processed; enriched flour has iron, folic acid, niacin, riboflavin, and thiamine added back to it. Conversely, other fortified foods have micronutrients added to them that do not naturally occur in those substances. An example of this is orange juice, which is often sold with added calcium.[4]
Food fortification can also be categorized according to the stage of addition: commercial and industrial fortification of staples such as wheat flour, corn meal, and cooking oils; biofortification, in which crops are bred or engineered to have a higher nutrient content; and home fortification, such as vitamin D drops.
Micronutrients play an important role in bodily development and growth. Deficiencies of these micronutrients may cause improper development or even disease.
The WHO and FAO, among many other nationally recognized organizations, have recognized that over 2 billion people worldwide have a variety of micronutrient deficiencies. In 1992, 159 countries pledged at the FAO/WHO International Conference on Nutrition to make efforts to combat these micronutrient deficiencies, highlighting the importance of decreasing the number of people with iodine, vitamin A, and iron deficiencies. A significant statistic that led to these efforts was the finding that approximately 1 in 3 people worldwide was at risk for an iodine, vitamin A, or iron deficiency.[6] Although it is recognized that food fortification alone will not eliminate these deficiencies, it is a step toward reducing their prevalence and the health conditions associated with them.
In Canada, the Food and Drug Regulations outline specific criteria under which food fortification is justified.
There are also several advantages to addressing nutrient deficiencies in populations through food fortification rather than other methods. These include, but are not limited to: treating a population without requiring a change in dietary patterns or other specific dietary interventions; delivering the nutrient continuously; not depending on individual compliance; and the potential to maintain nutrient stores more efficiently when fortified foods are consumed regularly.
The subsections below describe fortification in some jurisdictions around the world. A more comprehensive view is given by the online Global Fortification Data Exchange, whose datasets[8] and country profiles[9] indicate which of 197 countries worldwide have mandatory or voluntary food fortification. The website is maintained by the Food Fortification Initiative, GAIN, the Iodine Global Network, and the Micronutrient Forum.[10]
In Argentina, wheat flour must by law (Ley 25.630 of 2002)[11] be fortified with iron, thiamine (vitamin B1), riboflavin (B2), niacin (B3), and folic acid (B9).[12]
Wheat flour sold in Colombia must by law be fortified with vitamin B1, vitamin B2, niacin (B3), folic acid (B9) and iron (Decreto 1944 of 1996).[13]
The four countries, also called the C-4, all legally require wheat flour to be fortified with vitamins B1, B2, B3, B9, and iron.[14] [15]
The Philippine law on food fortification has two components: mandatory (covering select staples)[16] and voluntary (under the Sangkap Pinoy program). The latter has been criticized for covering foods of low nutritional value, namely junk food, in order to enable them to be sold in schools.[17]
UK law (The Bread and Flour Regulations 1998)[18] [19] requires that all flour (except wholemeal and some self-raising flours) be fortified with calcium. Wheat flour must also be fortified with iron, thiamine (vitamin B1) and niacin (vitamin B3).[20]
In the 1920s, food fortification emerged as a strategy in the United States to address and prevent micronutrient deficiencies in the population's diet. In the 1930s and 1940s it was discovered that micronutrient deficiencies are often linked to specific diseases and syndromes. Consequently, the Committee on Food and Nutrition suggested that micronutrients be added to flour.[21] In 1980, the Food and Drug Administration put into action its Food Fortification Policy, which included six fundamental rules. In addition to establishing safety guidelines for food fortification, this policy aimed to ensure that fortification was used only when there was a national deficiency of the supplemental micronutrient and when the food chosen to deliver that nutrient was consumed by enough of the population to make a difference. This policy also emphasized the importance of clinical data, a shift from earlier policies which relied on dietary data alone.[4] The 2002 farm bill (P.L. 107–171, Sec. 3013) requires the Administrator of USAID, in consultation with the Secretary of Agriculture, to establish micronutrient fortification programs under P.L. 480 food aid. Section 3013 replaces a similarly named pilot program authorized in the 1996 farm bill (P.L. 104–127, Sec. 415). Under the programs, grains and other commodities made available to countries selected for participation will be fortified with micronutrients (e.g., iron, vitamin A, iodine, and folic acid).
In addition to criticism of government-mandated fortification, food companies have been criticized for indiscriminate enrichment of foods for marketing purposes. Food safety worries led to legislation in Denmark in 2004 restricting foods fortified with extra vitamins or minerals. Products banned include: Rice Krispies, Shreddies, Horlicks, Ovaltine, and Marmite.[22]
One factor that limits the benefits of food fortification is that adding isolated nutrients back into a processed food that has had many of its nutrients removed does not always make those nutrients as bioavailable as they would be in the original, whole food. An example is skim milk that has had the fat removed and then had vitamin A and vitamin D added back. Vitamins A and D are fat-soluble rather than water-soluble, so a person consuming skim milk without fats may not absorb as much of these vitamins as they would from drinking whole milk. On the other hand, the nutrient added as a fortificant may have a higher bioavailability than it does in foods, which is the case with folic acid used to increase folate intakes.[23]
Phytochemicals such as phytic acid in cereal grains can also impair nutrient absorption, limiting the bioavailability of both intrinsic and added nutrients and reducing the effectiveness of fortification programs.
There is a concern that micronutrients are legally defined in a way that does not distinguish between different forms, and that fortified foods often have nutrients in a balance that would not occur naturally. For example, in the U.S., food is fortified with folic acid, which is one of the many naturally occurring forms of folate but which contributes only a minor amount to the folates occurring in natural foods.[24] In many cases, such as with folate, it is an open question whether there are any benefits or risks to consuming folic acid in this form.
In many cases, the micronutrients added to foods in fortification are synthetic.
Certain forms of micronutrients can be actively toxic in a sufficiently high dose, even if other forms are safe at the same or much higher doses. There are examples of such toxicity in both synthetic and naturally occurring forms of vitamins. Retinol, the active form of vitamin A, is toxic at a much lower dose than other forms, such as beta-carotene. Menadione, a phased-out synthetic form of vitamin K, is also known to be toxic.
Many foods and beverages worldwide have been fortified, whether voluntarily by product developers or by law. Although some may view these additions as strategic marketing schemes to sell a product, a lot of work must go into a product before it can be fortified. To fortify a product, it must first be shown that the addition of the vitamin or mineral is beneficial to health, safe, and an effective method of delivery. The addition must also abide by all food and labeling regulations and have a nutritional rationale. From a food developer's point of view, the costs associated with the new product must also be considered, along with whether there will be a market to support the change.[25]
The Food Fortification Initiative lists all countries in the world that conduct fortification programs,[26] and within each country, what nutrients are added to which foods, and whether those programs are voluntary or mandatory. Vitamin fortification programs exist in one or more countries for folate, niacin, riboflavin, thiamine, vitamin A, vitamin B6, vitamin B12, vitamin D and vitamin E. Mineral fortification programs include calcium, fluoride, iodine, iron, selenium and zinc. As of December 21, 2018, 81 countries required food fortification with one or more vitamins. The most commonly fortified vitamin – as used in 62 countries – is folate; the most commonly fortified food is wheat flour (enriched flour).[27] Examples of foods and beverages that have been fortified are described below.
See main article: Iodised salt.
Iodised salt has been used in the United States since before World War II. It was discovered in 1821 that goiters could be treated with iodised salts. However, it was not until 1916 that iodised salts could be tested in a research trial as a preventive measure against goiters. By 1924, iodised salt was readily available in the US.[28] Currently in Canada and the US, the RDA for iodine ranges from 90 μg/day for children (4–8 years) to 290 μg/day for breast-feeding mothers.
Diseases associated with an iodine deficiency include intellectual disabilities, hypothyroidism, and goiter. There is also a risk of various other growth and developmental abnormalities.
Folate (added as the fortification ingredient folic acid) functions in reducing blood homocysteine levels, forming red blood cells, supporting proper growth and division of cells, and preventing neural tube defects (NTDs).[29] In many industrialized countries, the addition of folic acid to flour has prevented a significant number of NTDs in infants. Two common types of NTDs, spina bifida and anencephaly, affect approximately 2500–3000 infants born in the US annually. Research trials have shown that supplementing pregnant mothers with folic acid can reduce the incidence of NTDs by 72%.[30]
Niacin (a form of vitamin B3) has been added to bread in the US since 1938 (when voluntary addition started), a program which substantially reduced the incidence of pellagra.[31] Pellagra was seen among poor families who used corn as their main dietary staple. Although corn itself does contain niacin, the niacin is not bioavailable unless the corn undergoes nixtamalization (treatment with alkali, traditional in Native American cultures), and therefore it did not contribute to the overall intake of niacin.
Diseases associated with niacin deficiency include pellagra, which consists of signs and symptoms called the three Ds: dermatitis, dementia, and diarrhea. Others may include vascular or gastrointestinal diseases. Conditions associated with a high frequency of niacin deficiency include alcoholism, anorexia nervosa, HIV infection, gastrectomy, malabsorptive disorders, and certain cancers and their associated treatments.[32]
Since vitamin D is a fat-soluble vitamin, it cannot be added to a wide variety of foods. Foods to which it is commonly added include margarine, vegetable oils, and dairy products.[33] In the late 1800s, after it had been discovered that scurvy and beriberi could be cured through diet, researchers aimed to determine whether the disease later known as rickets could also be cured by food. Their results showed that sunlight exposure and cod liver oil were the cure. It was not until the 1930s that vitamin D was actually linked to curing rickets.[34] This discovery led to the fortification of common foods such as milk, margarine, and breakfast cereals, and turned a condition that had caused varying degrees of bone deformation in approximately 80–90% of children into a very rare one.[35]
Diseases associated with a vitamin D deficiency include rickets, osteoporosis, and certain types of cancer (breast, prostate, colon, and ovarian). Vitamin D deficiency has also been associated with increased risks of fractures, heart disease, type 2 diabetes, autoimmune and infectious diseases, asthma and other wheezing disorders, myocardial infarction, hypertension, congestive heart failure, and peripheral vascular disease.[35]
See main article: Water fluoridation. Although fluoride is not considered an essential mineral, it is useful in preventing tooth decay and maintaining adequate dental health.[36] [37] In the mid-1900s it was discovered that in towns with a high level of fluoride in their water supply, residents' teeth showed both brown spotting and an unusual resistance to dental caries. This led to the fortification of water supplies with fluoride in safe amounts (or the reduction of naturally occurring levels) to retain the resistance to dental caries while avoiding the staining caused by fluorosis (a condition caused by excessive fluoride intake).[38] The tolerable upper intake level (UL) set for fluoride ranges from 0.7 mg/day for infants aged 0–6 months to 10 mg/day for adults over the age of 19.