Scientists have developed a mathematical model of how red blood cells (RBCs) change in size and hemoglobin content during their four-month lifespan.
The study is a collaboration between a physician-researcher at Massachusetts General Hospital (MGH) and a mathematician from Harvard University.
John Higgins, MD, of the MGH Center for Systems Biology and Department of Pathology, and L. Mahadevan, PhD, of the Harvard School of Engineering and Applied Sciences (SEAS), also describe how their model may be used to provide valuable clinical information.
"This study describes a promising way to predict who is likely to become anemic before they actually do, and it is based on tests routinely performed in hospitals," says Higgins. "More generally, we found that a type of mathematical analysis commonly used in physics can be applied to clinical data and uncover new details of human physiology which can help improve diagnosis."
Mahadevan adds, "We show that it is possible to use minimal models to compress the information available in existing clinical data into a few parameters, which can then serve as a quantitative basis for comparing characteristics across the entire population."
For the study, the investigators worked to develop a relatively simple mathematical description of how the volume and hemoglobin content of the average RBC change over time.
Starting from the known characteristics of young and mature cells, they developed equations that approximate how the young cells are transformed into mature cells.
After building their model with data from healthy individuals, they discovered that data from patients with three types of anemia correspond to different parameter values in the model.
For example, it appears that RBCs from healthy individuals are cleared from the bloodstream before they shrink beyond a specific size. But in patients with mild iron-deficiency anemia or a genetic condition called thalassemia trait, RBCs continue shrinking past the clearance threshold for healthy cells.
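The shifted-threshold idea can be illustrated with a toy simulation. The sketch below is not the authors' actual model; it simply assumes cells shrink exponentially with age and are removed from circulation once they fall below a clearance threshold, with all parameter values (birth volume, decay rate, thresholds, cutoff) chosen for illustration only. Lowering the threshold then leaves a visible population of small cells, as the article describes for latent iron deficiency.

```python
import math
import random

def circulating_volumes(n_cells=10_000, v_birth=115.0, decay=0.00356,
                        threshold=85.0, lifespan_days=120, seed=0):
    """Steady-state snapshot of circulating RBC volumes (fL).

    Each cell's age is uniform over the lifespan; volume shrinks
    exponentially with age.  A cell stays in circulation only while its
    volume is above the clearance threshold.  All numbers are
    illustrative assumptions, not fitted parameters from the paper.
    """
    rng = random.Random(seed)
    volumes = []
    for _ in range(n_cells):
        age = rng.uniform(0, lifespan_days)
        v = v_birth * math.exp(-decay * age) + rng.gauss(0, 1.0)
        if v > threshold:          # cells at or below threshold were cleared
            volumes.append(v)
    return volumes

def small_cell_fraction(volumes, cutoff=80.0):
    """Fraction of circulating cells smaller than the cutoff volume."""
    return sum(v < cutoff for v in volumes) / len(volumes)

# Lowering the clearance threshold (the model's signature of latent
# iron deficiency) lets cells keep shrinking in circulation, so the
# small-cell fraction grows.
healthy = circulating_volumes(threshold=85.0)
latent = circulating_volumes(threshold=75.0)
print(small_cell_fraction(healthy))  # → 0.0 (no circulating cells below 80 fL)
print(small_cell_fraction(latent))   # substantially larger
```

Under these assumptions, an increasing small-cell fraction in otherwise normal blood tests is exactly the early-warning signal the investigators looked for.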
The investigators were able to predict the development of iron-deficiency anemia by looking for an increasing population of small RBCs in blood samples from individuals whose blood tests were normal but who went on to develop the condition 30 to 90 days later.
"Looking for the initial shifting of this threshold may allow us to identify a developing anemia significantly earlier than we can now," Higgins says.
The study has been published online in PNAS Early Edition.