
Fifty-odd years ago, as an undergraduate at Georgetown's School of Foreign Service (please pardon the name-dropping), I got a rudimentary education in public opinion—what it is, how it's measured, what it means, etc. I decided not to pursue a career in political science—opting instead for a career in policy-making and implementation/administration.

What has struck me — and increasingly frustrated and then irritated me — is how pollsters keep asking, over and over, questions that don't really mean anything. A question about whether you strongly approve, approve, neither approve nor disapprove, disapprove, or strongly disapprove of a candidate, president-elect, president, or policy says, by itself, nothing important. What matters is not the 'grade' a person assigns but their reasons for assigning it—and what matters even more is whether those reasons are grounded in anything substantive. And as you note, the usefulness of comparisons to the past is increasingly limited because the present doesn't look much like the past.

Sure, these "reports" on what "the public" feels/thinks (in that order) make for flashy 20-second segments on TV news and in 'print' media—not to mention fodder for 'political pundits' and the consultant class in their TV/marketing appearances. I'm inclined to argue that bad polling bears considerable responsibility for the current dysfunctional state of politics in the U.S., and probably elsewhere.

I'd be interested in your assessment of the current state of opinion polling/research, and in any suggestions you have for making it more useful as a tool or resource for achieving — to use the term Rorty borrowed from Baldwin — a working democracy.
