One of the problems I have with newspapers is the struggle to keep track of the reliability of particular journalists. With so many stories each day, I might remember only one or two bylines, and then it's usually because they've done very good or very bad work.
Seymour Hersh, for example, is very good. But how to arrive at that sort of conclusion?
There are several metrics that could be useful:
1. A breakdown of the types of sourcing - anonymous, named, government, think tank, corporate, party-affiliated. The tenor and target of that sourcing - supportive, hostile, whistleblower. Shown as percentages.
2. Track record. Chart the claims made by the journalist, i.e., the "story," and whether they are descriptive or predictive. If descriptive, is the claim supported or contradicted by readily available data? If predictive, are hard numbers given or is the thesis vague? Does the claim bear out over time? Shown as ratios - descriptive supported/descriptive contradicted and predictive vindicated/predictive wrong.
3. Case studies. Two case studies in particular would be useful for journalistic context (and these, of course, could be amended or changed as new cases gain notoriety): the Clinton investigations and the run-up to the Iraq invasion. In both instances, the conduct of journalists is especially relevant: each produced a large number of falsehoods that were repeated as conventional wisdom long after they were debunked. How many stories did a journalist file on the subject, and how many accepted falsehoods did those pieces contain?
4. Money. Who pays the journalist? Do they accept speaking fees? Have they written, edited, or funded books? How much do they earn and from where?
5. Affiliations. What kinds of friendships have they claimed? Do they regularly dine with elected officials, corporate board members, or known partisan activists?
You could fit all this data on a fairly small pop-up. Or a deck of cards.
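The five metrics above amount to a small data schema, so the "card" could be sketched directly in code. Here is a minimal sketch in Python; all field names, classes, and the example figures are my own invention for illustration, not an existing system:

```python
from dataclasses import dataclass, field

@dataclass
class TrackRecord:
    """Metric 2: counts of claims by type and outcome."""
    descriptive_supported: int = 0
    descriptive_contradicted: int = 0
    predictive_vindicated: int = 0
    predictive_wrong: int = 0

    def descriptive_ratio(self) -> float:
        # Supported vs. contradicted descriptive claims; max(1, ...)
        # avoids division by zero for a journalist with no misses yet.
        return self.descriptive_supported / max(1, self.descriptive_contradicted)

    def predictive_ratio(self) -> float:
        # Vindicated vs. wrong predictions.
        return self.predictive_vindicated / max(1, self.predictive_wrong)

@dataclass
class JournalistCard:
    """One pop-up card per byline."""
    name: str
    # Metric 1: source type -> percentage of stories using it.
    sourcing: dict = field(default_factory=dict)
    # Metric 2: the track-record ratios above.
    record: TrackRecord = field(default_factory=TrackRecord)
    # Metric 3: case study -> (stories filed, accepted falsehoods repeated).
    case_studies: dict = field(default_factory=dict)
    # Metric 4: employers, speaking fees, book deals.
    income_sources: list = field(default_factory=list)
    # Metric 5: claimed friendships and regular dinner companions.
    affiliations: list = field(default_factory=list)

# Illustrative card for a made-up journalist; every number is invented.
card = JournalistCard(
    name="A. Reporter",
    sourcing={"anonymous": 55, "named": 30, "government": 15},
    record=TrackRecord(descriptive_supported=40, descriptive_contradicted=8,
                       predictive_vindicated=3, predictive_wrong=6),
    case_studies={"Iraq run-up": (12, 5)},
    income_sources=["Example Daily", "speaking fees"],
    affiliations=["dines with Senate staff"],
)
```

On this invented data, `card.record.descriptive_ratio()` comes out to 5.0 and `card.record.predictive_ratio()` to 0.5, which is the kind of at-a-glance pair the pop-up would surface.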
This is the kind of thing I'd do if I had good enough organizational skills.