Measuring Human Energy Intake and Ingestive Behavior


From the Technical Committees

According to the World Health Organization, 39% of the world's adults are overweight and 13% are obese. Obesity is associated with increased risks for cardiovascular disease, diabetes, and certain forms of cancer. It has become a bigger global problem than being underweight and is a leading cause of preventable death. The study and treatment of obesity are aided by tools that measure energy intake, determined by the amount and types of food and beverage consumed. Traditional tools include questionnaires about the frequency of food consumption, food diaries, and 24-h recalls of the foods consumed during the day. However, these tools rely on self-reporting and have a number of limitations, including high user and experimenter burden, interference with natural eating habits, decreased compliance over time, and underreporting bias. Experts in the field of dietetics have emphasized the need for technology to advance the tools used for energy intake monitoring [1].
The challenge for engineers is to create new sensor-based tools that can automatically determine when, what, and how much, in terms of energy and nutrients, a person eats. Toward this goal, there are two major active paradigms in research: 1) using wearable sensors to monitor behavioral and physiological manifestations of food intake and 2) using cameras to capture and analyze images of the foods consumed.
Wearable sensors have been applied to detect ingestion activities by instrumenting the head and the wrist. On the head, several positions can be instrumented to detect activities associated with eating [2]. The ear can be fitted with a microphone to detect sounds associated with chewing, and the muscles of the head can be instrumented with strain or electromyographic sensors to detect chewing motion. The throat can also be instrumented with microphones and other sensor modalities to detect swallowing sounds and laryngeal motion. Challenges in this paradigm include detecting intake-related sounds or motions while ignoring those related to speech and motion artifacts from activities of daily living. Alternatively, the wrist can be equipped with sensors that detect when a person is eating [3] by tracking hand-to-mouth gestures, also known as bites; a simplified sketch of such a bite detector appears below.
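The following is a minimal, hypothetical sketch of a threshold-based bite counter operating on wrist roll velocity, loosely inspired by the wrist-motion approach in [3]. The thresholds, sample rate, and timing constants are illustrative placeholders, not the validated parameters from that work.

```python
# Hypothetical sketch: count "bites" as a positive wrist roll (toward the mouth)
# followed by a negative roll (back toward the plate), with a minimum interval
# enforced between successive bites. All constants are illustrative only.

def count_bites(roll_velocity, sample_rate_hz=15.0,
                pos_threshold=10.0,   # deg/s, hypothetical
                neg_threshold=-10.0,  # deg/s, hypothetical
                min_interval_s=8.0):  # minimum spacing between bites, hypothetical
    bites = 0
    waiting_for_negative = False
    last_bite_index = -int(min_interval_s * sample_rate_hz)
    for i, v in enumerate(roll_velocity):
        if not waiting_for_negative and v > pos_threshold:
            waiting_for_negative = True          # wrist rolled toward the mouth
        elif waiting_for_negative and v < neg_threshold:
            if i - last_bite_index >= min_interval_s * sample_rate_hz:
                bites += 1                       # wrist rolled back: count one bite
                last_bite_index = i
            waiting_for_negative = False
    return bites
```

In essence, each counted bite corresponds to one hand-to-mouth gesture inferred from the roll of the wrist; in practice the parameters would have to be tuned and validated against observed meals.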

Three people eating while wearing wrist monitors that automatically track the number of bites consumed during a meal.

Sensor-based methods offer some advantages over traditional food diaries and 24-h recalls. First, they are not biased by human memory or interpretation and thus provide an objective measurement of activities. Second, they are automated and require no manual data entry. Perhaps the biggest advantage of wearable sensors is their ability to measure behavioral variables and their potential for real-time feedback. For example, wearable sensors can detect the pace of consumption and provide a cue to slow down. They can also provide cues toward a target amount, helping a person limit total consumption during a meal. Use of wearable sensors in the framework of just-in-time adaptive intervention opens new, previously unexplored frontiers in behavioral modification aimed at weight loss.
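As one illustration of the real-time feedback idea, the sketch below turns a stream of detected bite timestamps into a pacing cue. The window length and target rate are hypothetical values chosen only to make the example concrete; they are not drawn from the cited work.

```python
# Hypothetical pacing cue based on bite timestamps (seconds since meal start).
# Returns "slow down" when the recent bite rate exceeds a target, else None.

def pacing_cue(bite_times_s, window_s=60.0, max_bites_per_min=4.0):
    if not bite_times_s:
        return None
    now = bite_times_s[-1]
    recent = [t for t in bite_times_s if now - t <= window_s]
    rate_per_min = len(recent) * 60.0 / window_s
    return "slow down" if rate_per_min > max_bites_per_min else None

# e.g., pacing_cue([2.0, 10.5, 18.0, 24.0, 31.0, 38.5]) -> "slow down"
```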
Wearable sensors also have disadvantages and limitations. First, a person must be willing to wear one or more sensors, either during all waking hours or at least during eating. Some positions on the body lend themselves more easily to this requirement: sensors mounted on the wrist can take the form of a standard watch, and sensors mounted on the ear can take the form of an earpiece or be integrated into eyeglasses. Other positions potentially create more social stigma or discomfort for the user. Second, there is limited potential for determining the type of food consumed using sensors. Liquid and solid intake can be at least partially discriminated based on differences in sounds, swallowing, and wrist motion patterns. The rheological properties of the food being consumed, such as crunchy versus chewy, can also be discriminated from sound.
However, present-day sensors cannot provide the types of nutrition measurements that can be obtained from food diaries or 24-h recalls. This limitation motivates the use of cameras to take pictures of foods. The images may be taken manually before and after consumption or captured by a wearable camera. Portion size, energy, and nutrient content may then be estimated through manual review of the images by trained nutritionists or by computer vision methods. Image-based methods potentially provide richer, but still imperfect, information about the energy and nutrient content of the consumed foods. For example, imagery cannot differentiate between sugary and diet drinks or accurately estimate the amount of added oils. Thus, as of now, no single sensing modality can perfectly capture all aspects of ingestive behavior and nutrient intake.
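To make the last step of such an image-based pipeline concrete, the sketch below combines estimated portion sizes (whether produced by a trained nutritionist or by computer vision) with an energy-density lookup. The food names and kilocalorie densities are illustrative examples, not an authoritative nutrient database.

```python
# Hypothetical final step of an image-based pipeline: convert recognized foods
# and estimated portion sizes (grams) into an energy estimate (kcal).

KCAL_PER_GRAM = {
    "white rice, cooked": 1.30,   # illustrative energy densities,
    "grilled chicken": 1.65,      # not an authoritative database
    "apple": 0.52,
}

def estimate_energy_kcal(portions):
    """portions: list of (food_name, grams) pairs produced by image analysis."""
    return sum(grams * KCAL_PER_GRAM[food] for food, grams in portions)

# e.g., estimate_energy_kcal([("white rice, cooked", 150), ("grilled chicken", 100)])
# -> 360.0 kcal
```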
The automatic monitoring of human energy and nutrient intake is still in its infancy, and the list of methods mentioned here is by no means exhaustive. The number of research publications proposing new sensors and monitoring methods is growing rapidly. The future may see the exploration of new dimensions, such as intraoral sensing, which allows direct access to consumed foods, and the monitoring of gastrointestinal activity or other physiological processes related to ingestion. A combination of several approaches will likely be needed to obtain a complete picture of one's nutrition and related behaviors. At present, the accurate and objective measurement of energy intake and ingestive behavior remains both a challenge and an opportunity for bioengineers.

References

  1. National Institutes of Health. Improving diet and physical activity assessment. [Online].
  2. J. Fontana and E. Sazonov, "Detection and characterization of food intake by wearable sensors," in Wearable Sensors: Fundamentals, Implementation and Applications, 1st ed. San Diego, CA: Academic, 2014, pp. 591–616.
  3. Y. Dong, J. Scisco, M. Wilson, E. Muth, and A. Hoover, “Detecting periods of eating during free-living by tracking wrist motion,” IEEE J. Biomed. Health Inform., vol. 18, no. 4, pp. 1253–1260, July 2014.