Breaking Health News
Health Disparities - What are Health Disparities?
Washington, DC – (February 28, 2013) - Health disparities (also called healthcare inequality in some countries) refer to gaps in the quality of health and health care across racial, ethnic, sexual orientation and socioeconomic groups.
The Health Resources and Services Administration defines health disparities as "population-specific differences in the presence of disease, health outcomes, or access to health care."
In the United States, health disparities are well documented in minority populations such as African Americans, Native Americans, Asian Americans, and Latinos.
When compared to whites, these minority groups have higher incidence of chronic diseases, higher mortality, and poorer health outcomes.
Among the disease-specific examples of racial and ethnic disparities in the United States is the cancer incidence rate among African Americans, which is 10% higher than among whites.
In addition, adult African Americans and Latinos have approximately twice the risk of developing diabetes as whites. Minorities also have higher rates of cardiovascular disease, HIV/AIDS, and infant mortality than whites.
Health disparities are evident in the developing world, where the importance of equitable access to healthcare has been cited as crucial to achieving many of the Millennium Development Goals.
Eliminating Health Disparities
Washington, DC (February 28, 2013) - Recent studies have shown that despite steady improvements in the overall health of the United States, racial and ethnic minorities experience a lower quality of health services, are less likely to receive routine medical procedures, and have higher rates of morbidity and mortality than non-minorities. Disparities in health care exist even when controlling for gender, condition, age, and socioeconomic status.
The AMA has encouraged physicians to examine their own practices to ensure equality in medical care.
Prompted by a request from Congress, the Institute of Medicine (IOM) assessed the differences in the kinds and quality of health care received by United States racial and ethnic minorities and non-minorities.
Note: The results of that study will be published in a future edition of Brain Brawn & Body.
Oprah Winfrey first referred to Mehmet Oz as "America's doctor" in 2004, during one of his earliest appearances on her television show. The label stuck.
Oz was a rare find: so eloquent and telegenic that people are often surprised to learn he is a highly credentialed member of the medical establishment.
"The Dr. Oz Show" frequently focuses on essential health issues: the proper ways to eat, relax, exercise, and sleep, and how to maintain a healthy heart.
Much of the advice Oz offers is sensible, and is rooted solidly in scientific literature. That is why the rest of what he does is so hard to understand.
Oz is an experienced surgeon, yet almost daily he employs words that serious scientists shun, like "startling," "breakthrough," "radical," "revolutionary," and "miracle." There are miracle drinks and miracle meal plans and miracles to stop aging and miracles to fight fat.
I asked Oz several times why he promotes that kind of product, and allows psychics, homeopaths, and purveyors of improbable diet plans and dietary supplements to appear on the show. He said that he takes his role as a medium between medicine and the people seriously, and he feels that such programs offer his audience a broader perspective on health.
Medical Malpractice: Why Is It So Hard For Doctors To Apologize?
The Boston Globe, February 28, 2013
The paradox of modern medicine is that the increasing specialization that has revolutionized care has also depersonalized it. When a mistake is suspected, it may be unclear who from a team must step in to take responsibility.
For patients seeking information, the only obvious recourse is to call a malpractice lawyer, whose livelihood depends on replacing a patient’s desire for comfort and understanding with a need for vengeance. There is reason for hope that things can be done differently, even among doctors like myself who are conditioned to be suspicious of malpractice claims.
Massachusetts recently enacted a law that, among other things, generally allows doctors to speak more openly to patients and families who were harmed, even apologize to them, without worry that their words will later be used against them in court.
Dr. Darshak Sanghavi
The Atlantic: 'He Didn't Seem Crazy': Where Violence Meets Health Care
In 2008 Thomas Scantling, who at the time was not taking medication to treat his schizophrenia and who compounded his mental health problems by abusing PCP, attacked 20-year-old Dewayne Taylor.
Around the same time as Scantling's subway hammer attack, Philadelphia rolled out its criminal mental health court, designed to steer low-level offenders toward outpatient mental health treatment instead of county jail. Advocates of mental health courts say they can prevent terrifying high-profile violence of the sort described here by catching mentally ill offenders early and providing them with supportive services.
Critics claim that expanding the reach of the judicial system into the lives of people with severe mental illness will actually backfire, driving people away from therapists and doctors for fear of being reported to the police.
HealthyCal: Study reveals unexpected link between adversity and aging
By Elise Craig
Scientists looking for a correlation between factors like childhood hunger and cognitive aging found a surprising result. Though earlier studies have shown that childhood adversity may be related to the incidence of health problems such as heart disease and mental illness in old age, new research shows that African Americans who went hungry during childhood experienced slower cognitive decline than those who did not, according to a recent study published by the American Academy of Neurology.
“The finding was unexpected,” said study author Lisa L. Barnes, Ph.D., of Rush University Medical Center in Chicago. “We hypothesized that early-life adversity would be related to faster decline.”
Researchers tracked more than 6,000 Chicago residents with an average age of 75, some for as long as 16 years. Each person was asked about his or her childhood health, family financial situation, and factors in their home learning environment, such as how often people played games with them or read to them. Then, participants were tested every three years for symptoms of cognitive changes.
Among African American participants, the 5.8 percent who reported they sometimes, always or often did not have enough food to eat as kids showed a slower rate of cognitive decline—by about one-third—than those who said they either rarely or never didn’t have enough to eat. The study also found that the 8.4 percent of African-Americans who remembered being much slimmer at age 12 than their counterparts had a slower rate of cognitive decline than those who said they were either heavier or the same size as other kids their age—again by about a third.
According to Barnes, the researchers don’t really know why, but there are two possibilities.
Some research on animals has found that calorie restriction may slow the onset of age-related diseases and lengthen lifespans. One human study found that restricting calories led to improved memory, but, the authors note, it was limited to a three-month period.
It’s also possible that it’s a question of selective survival. “Older adults with early adversity may represent the hardiest and most resilient; those with the most extreme adversity may have died before reaching old age,” the authors wrote.
In earlier studies of the same population, researchers found that obesity later in life is not related to a decline in cognitive function, Barnes said.
Zhenmei Zhang, an associate professor of sociology at Michigan State University who has studied early life influences and cognitive decline among older Chinese populations, found the results surprising as well.
“No studies that I know of have reported that childhood hunger is associated with slower cognitive decline, although there are reports that markers of childhood deprivation is not associated with cognitive decline in old age,” says Zhang, who was not involved in the study.
But Zhang also notes that the study of cognitive decline and aging is relatively new when compared to health problems like cancer and cardiovascular disease, which can cause death. It’s also a particularly important area of study as baby boomers grow older, as age is one of the strongest predictors of dementia.
The study’s authors did not find a relationship between childhood adversity and cognitive decline among Caucasians, though they’re not sure why.
“It could be that we did not have enough Caucasians who experienced extreme childhood adversity in our sample,” Barnes said. The majority of participants in the study—62 percent—were African American. The researchers purposely chose a geographic area with a high concentration of both Caucasians and African Americans with a diverse set of socio-economic statuses in order to create a better comparison.
However, Zhang notes, the small sample size of Caucasian participants who reported adversity is also a limitation of the study.
Researchers also found no correlation between the type of home learning environment individuals grew up in and cognitive decline, according to Barnes.
“People who reported not being told stories frequently or not playing games with someone frequently as a child, had lower scores on our cognitive tests at the beginning of the study – but they did not decline faster than those without adversity in the home,” she said.
In the study, the authors note that the results are for a particular population in the Midwest and “may not be generalizable to elders in other parts of the country.” However, Barnes said, the findings point to a need for further research into childhood experiences’ effects on disease later in life.
“The results of this study suggest that early-life factors are important and need to be considered in studies of cognitive decline and other diseases of aging,” Barnes said. Zhang agrees.
“Early childhood nutrition matters for old-age cognition but we know so little about how it works,” Zhang says. “There are also conflicting reports.”
The study is ongoing, and researchers will continue to examine the factors that affect aging in that population. The study was published December 11 in Neurology, the journal of the American Academy of Neurology.
This is part of Kaiser Health News' Daily Report - a summary of health policy coverage from more than 300 news organizations.