There's a well-known paper by Brendan Nyhan and Jason Reifler showing -- I haven't read it in detail, but the analysis is simple, seems pretty airtight, and is widely cited -- that political misinformation is very hard to correct; in particular, people tend to "double down" on, and lend more credence to, myths that line up with their political beliefs after those myths are debunked. (Empirical evidence for Credo quia absurdum est.) This is, of course, a broader phenomenon than just politics; it appears to have been behind some of the hysterical reaction last year (was it last year?) when someone suggested that mammograms were unnecessary for certain age groups. Via Nyhan's blog, here's an article describing how ineffectual evidence-based medicine is at getting people to give up their false beliefs about medication. (The article also has some pretty interesting stuff about breast cancer.)
This also applies to things like listing calorie counts on menus. For instance, it's hard to debunk the myth that salads are healthy no matter what, even when the dressing makes them anything but. (If you don't like this example -- I don't know how true it is; frankly, food is not something I've ever been that interested in -- there must be other cases of deeply held misperceptions about diet.) In general, this research on myth-busting suggests why (as e.g. Yglesias notes here in passing) calorie labeling has so far been ineffectual, and will likely continue to be, at getting people to form accurate beliefs about what's healthy. Of course, it might still be good for other things, like reminding people just how unhealthy the foods they already consider unhealthy are.