When I went away to a Big Ten university, I was lost, under-supervised, and frequently bewildered. Nothing in the curriculum seemed to lead toward my intended major.
Not even close. For instance, before my first psychology class, somebody announced, "This is a research psychology school, not a clinical psychology school." Uh-oh. Well, damn; I'm stuck here.
But when I research my DCIS, cancer, hospitals, or whatever, two rules from that college still help me stay out of trouble:
1. Papers with unsupported generalities will receive an F. The instructors would bemoan "glittering generalities" in class and mark that label on guilty papers.
In one class, maybe sociology, the instructor asked what makes a person Jewish. (That was before the age of PC.) I answered, and he replied, "I think we can categorically reject that." In a rare attack of bravery, I brought him printed support for my answer. Instead of flunking me for insubordination, he praised me.
Years later, a screenwriter I knew from the neighborhood coffee shop insisted I omit any reference that lacked a specific source. (Oops. Yes, it had been a while since college.)
"Studies show" is the new and dangerous buzz phrase for "Here comes the vague part." (Anything unsupported is an opinion. And that goes for the "experts" who torture me on the car radio during traffic jams.)
That phrase is so dangerous because if the article topic is exciting, like mammograms-in-doubt, we may scan the piece even if we've never heard of the study or the writer. Its sentences may stick with us, mixed in with reliable material, when we're researching our own diagnoses.
2. Question every experiment, every "study," for careful controls, and ask whether the researchers have ruled out enough variables to prove they're measuring only what they say they're measuring. This is where the university's experimental mindset became a blessing for me.
I couldn't run a t-test or a phi test if you paid me. Yet I still ask myself which variables were considered in a given experiment and which were ruled out. But sometimes the reporting member of a study team simply does not write clearly. What "years" is he referring to? Does he mean time before recurrence, or does he mean life span?
Of course, there is plenty of material on studies that seem very well designed and clearly reported. And when a nationally respected doctor I trust says there have been "some studies," I trust him.
In short, "studies show" and unclear conclusions are not always red lights for me, but they are at least yellow lights warning:
Get a second opinion!