A series of recent articles highlights problems with health care. NPR has a story, and the New York Times has a review of the new book Doctored: The Disillusionment of an American Physician, by Sandeep Jauhar, which discusses "unnecessary testing" and "uncoordinated care." Reuters has an article about how "a growing number of doctors simply are not taking contracts with insurance companies," including "45 percent of psychiatrists."
What all these stories have in common is that none of them includes the word "ObamaCare." What was the point of a massive overhaul of the health care system if it doesn't fix these problems, or if it in fact makes them worse? Somehow the president escapes the blame.