When the public sees evidence of election fraud, as in the Louisiana and California Congressional races last fall, confidence in the basic integrity of democracy is threatened. But imagine what would happen if the whole US statistical system, including the population count upon which Congress itself is apportioned, and by which billions of dollars of federal programs are allocated, became suspect.
Yet such a nightmare could happen in less than three years' time. A revolutionary plan is before Congress to change the US Census for 2000 from an easily understood head count of the population to an arcane process adjusted by survey samples.
A crucial trial run of the census is planned for next year.
But the Clinton Administration and Congress are still locked in disagreement over the sampling proposal. If a resolution is not reached soon, the results could be questionable population figures, an increase in state and municipal lawsuits over the Census process, and another drop in public trust in government.
The decennial Census is a core institution of representative democracy and one of the few federal functions specifically mandated in the Constitution. It is described in Article I as an "actual enumeration." Dictionaries and common sense indicate that enumeration means "to count off or name one by one; list" (American Heritage) and "to ascertain the number of; to specify one after another" (Webster's). Yet the Clinton Administration believes the Supreme Court will be willing to stretch that definition to include the imputation of fictional persons into the census from survey samples.
It is true that sampling would save money (about $400 million), but the same could be said for elections, where we also demand a hard count. Yet which of us would like to let pollsters elect public officials instead of going through the “expensive” process of an actual election?
Sampling also is supposed to provide a more "accurate" population total. The hard count has always failed to include some residents who do not fill out their census forms or who cannot be found. This "undercount" was an estimated 1.8 percent of the national total in 1990, up from 1.2 percent in 1980, but well down from, say, the 5.4 percent of 1940. The undercount is higher for rural dwellers and inner-city minorities, including illegal aliens, people working odd hours, and the homeless.
With sampling, it is argued, the adjusted census totals for the nation, the states, and even counties and most cities could be slightly more accurate. But at the level of small areas such as city blocks, where the virtual people would be listed, the resulting numbers could be in error by as much as 28 percent (according to one analysis) or even 35 percent (another analysis). It is these small-area details that are the stuff of sensitive political redistricting. As with elections, a small number in a specific area can lead to big changes.
Therein lies the prospect of increased lawsuits. With all its human flaws, the Census enumeration has stood up to every court challenge. But survey data representing only statistical abstractions standing in for flesh-and-blood residents could easily be unmasked in court. Any citizen could investigate and prove his case when the official number of people supposedly living on his block was off by 28 percent or more.
Unfortunately, the sampling issue is already politicized. Republicans and states in the East and Midwest that stand to lose if sampling is deployed tend to oppose sampling and adjustment on those grounds, while Democrats and Western and Southern states that stand to gain tend to support them.
But there is great wariness in Congress, nonetheless, including among Democrats. Most of the states that would gain are in Republican parts of the country (Texas, Florida, Arizona, notably), while predominantly Democratic states that in the past have complained most about the undercount (such as New York) are suddenly discovering from computer simulations that an undercount adjustment actually might hurt them.
Some African-Americans are worried, correctly, that once the Census Bureau knows it can rely heavily on samples to improve accuracy in a county or state, it may spend less money and effort on outreach in central cities, leaving the base upon which the sample is taken as flawed as it is today.
In Congressional testimony and reports, representatives from both parties express concern that when Americans as a whole find out that sampling will take the place of the 10 percent or so of individuals who fail to return the mailed Census forms or otherwise cannot be counted, the voluntary Census response rate will fall still further, leading to a need for still more sampling.
The US statistical system could come to rest increasingly on the soft base of sampled data rather than on the base of hard data that the census has always provided. Subsequent government surveys effectively would be samples of a sample. People in both parties have noticed how soft that base can be. Computer models show that even national population totals can change dramatically depending upon what assumptions go into formulating samples and adjusting the data. And a small error can lead to a major problem. In a simulated adjustment of the 1990 Census, such an error produced a one-million-person population shift that, had it been implemented, would have unwarrantedly cost Pennsylvania a House seat.
Many statisticians remain sanguine about such dangers. But they tend to see the Census mainly as a statistical process and not the crucial democratic institution that the American Founders intended. If the whole population count becomes another abstract, complicated activity understood only by experts, the Census will lose much of its historic role and much public support.
Congress, therefore, should act now: ban sampling and provide the extra funds needed for improved outreach and promotion of a standard Census in 2000. The alternative means trouble.