
PRESIDENTIAL MANAGEMENT OF THE REGULATORY STATE

John D. Graham, Ph.D.
Administrator
Office of Information and Regulatory Affairs
Office of Management and Budget
Executive Office of the President of the United States

Remarks to the Committee on National Statistics, National Research
Council/National Academy of Sciences, Washington, DC

May 10, 2002

INTRODUCTION

I am delighted to have this opportunity to engage in a dialogue with the members of the Committee on National Statistics and the leaders of our Nation’s statistical agencies. As some of you know, I spent time here at the Academy earlier in my career, and I have the deepest respect for the work that is carried out by its many committees and panels. As an academic recently turned “bureaucrat,” I also have a keen awareness of the contributions that our university colleagues can and must make if we are to discharge our responsibilities in a competent and creative manner. And, to the agency leaders, I would underscore my long-term commitment to having the highest-quality, unbiased information to guide our actions.

To ensure that we have the critical statistical information that we need, I am lending my support to initiatives that will fill critical gaps in our current programs and introduce efficiencies in our work. Thus, for example, I am an advocate of the emerging American Community Survey, an initiative that shows promise of being a significant innovation in demographic data collection – and that will provide decision-makers at every level with far more current information to determine needs and allocate resources. Similarly, I support increased funding to close critical gaps in our Nation’s economic statistics. And, though I am well aware of the intricacies and sensitivities surrounding potential changes to our measurements of income and poverty, I am hopeful that we will be able to capitalize on the work of this committee and the efforts of the statistical agencies to better address some of the longstanding criticisms of our measures. Last but certainly not least, I am pleased that we have gained a partner in the campaign to realize a long-held dream of the statistical system. With new-found energy from the Council of Economic Advisers, we are working in close collaboration with the Bureau of Economic Analysis, the Bureau of Labor Statistics, the Bureau of the Census, and the Department of the Treasury to craft and gain support for legislation that would provide statutory protection for the confidentiality of all data collected for statistical purposes, and would then permit the sharing of business data among BEA, BLS and Census to improve the comparability and accuracy of economic statistics.

Let me turn now to the “advertised” topic of my remarks to you today – our recent efforts related to a new mandate on “information quality.” The Office of Management and Budget (OMB) is of course well known for its role in budgetary matters and is becoming better known for its role in regulatory policy. Yet OMB’s responsibilities in the field of information and statistical policy are not widely recognized -- unless, that is, you are the object of our designation of Metropolitan Statistical Areas! Just as the importance of the word “Management” in OMB is poorly appreciated, so too is the importance of the word “Information” in the title of my Office, the Office of Information and Regulatory Affairs (OIRA). And the steps that OMB is taking to improve the quality of information that agencies disseminate to the public are just beginning to be known and appreciated.

Before discussing these steps, I should note that both Congress and OMB have a longstanding interest in the field of information policy. OIRA was officially created by Congress in the Paperwork Reduction Act of 1980, the law that established the basic clearance processes for “information collections” now required for all federal agencies. In the arguably obscure OMB Circular A-130 entitled “Management of Federal Information Resources,” OMB stated its strong support for dissemination of information to the public.

It is certainly true that Federal agencies have disseminated information to the public for decades. Until recently, that dissemination was accomplished principally by making paper copies of documents available to the public. With the advent of the Internet, there has obviously been a revolution in communications that has enabled agencies to disseminate an increasing volume of information to users throughout the world.

A major question we are currently addressing is “What steps should agencies take to ensure a basic level of quality in the information they choose to disseminate to the public?” A recent law passed by Congress gives urgency to finding answers to this question.

LEGISLATIVE HISTORY OF THE INFORMATION QUALITY LAW

The story begins toward the end of the previous Administration, when Congress enacted a law requiring OMB to develop uniform guidelines establishing quality standards for information disseminated by Federal agencies. The law was enacted as a rider to our appropriations bill without any hearings or extensive legislative history. I am told by my career staff that the quality of information disseminated via agency web sites was a particular concern at the time.

This information quality law should not be confused with an earlier “information access” law -- one with which I know the members of CNSTAT and your academic colleagues are extremely familiar -- that amended the Freedom of Information Act to provide greater public access to research data generated under Federal research grants. OMB believes that the information access and information quality laws are compatible and in fact are mutually reinforcing in the way that they promote responsible public access to technical information used by government agencies.

RATIONALE FOR INFORMATION-QUALITY CONCERNS

There is plenty of evidence that the quality of the information advanced for use by government decision-makers needs to be improved. In the scholarly literature on what is called “science-policy,” there are entire books of case studies demonstrating technical problems with the information collected, used and published by federal regulatory agencies. Although my examples are drawn primarily from environmental policy, where I have written previously, all agencies have their share of information quality problems.

My field of science, cost-benefit analysis, certainly has its share of quality problems. An instructive example occurred in the late 1970s, when a contractor for EPA reported that the extra cost of controlling water pollution at municipal treatment plants was $1.20 per pound. Analysts at the Council on Wage and Price Stability – a precursor office to OIRA – found a technical error in the contractor’s work and produced a corrected estimate of $0.30 per pound. When EPA was informed of the error, it asked a court to remand a pending case so that the cost estimate could be corrected and the relevant regulation re-issued in revised form. In this case, since the cost estimate was being used as a benchmark for controlling pollution at pulp and paper mills, the revised standard at paper mills became more cost-effective as a result of the correction.
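To make the benchmarking arithmetic concrete, here is a minimal sketch in Python. The $1.20 and $0.30 figures come from the example above; the per-pound control costs for the paper mill and the simple screening rule are my own illustrative assumptions, not the actual regulatory analysis.

    # Minimal sketch of cost-effectiveness benchmarking (illustrative only).
    BENCHMARK_ERRONEOUS = 1.20   # $/lb removed, contractor's flawed estimate
    BENCHMARK_CORRECTED = 0.30   # $/lb removed, after the technical correction

    # Hypothetical marginal costs of successively tighter controls at a
    # pulp and paper mill, in $/lb of pollutant removed.
    mill_controls = [0.10, 0.25, 0.60, 1.00]

    def justified(benchmark, controls):
        """Keep only the control steps whose marginal cost does not exceed
        the benchmark -- the rough test such a comparison applies."""
        return [cost for cost in controls if cost <= benchmark]

    print(justified(BENCHMARK_ERRONEOUS, mill_controls))  # [0.1, 0.25, 0.6, 1.0]
    print(justified(BENCHMARK_CORRECTED, mill_controls))  # [0.1, 0.25]

Under the erroneous benchmark, all four hypothetical control steps appear justified; the corrected benchmark screens out the two costlier ones, which is the sense in which the revised standard became more cost-effective.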

Sometimes poor interpretation of technical information can result in rules or standards that are not adequately protective of public health. The safe level of exposure to nitrates in drinking water, for example, is a case where scientific peer reviewers of a draft EPA document found that published studies may have been misinterpreted by EPA analysts. Peer reviewers persuaded the agency that, in order to provide an adequate margin of safety for infants, a key susceptible subgroup, the amount of allowable exposure to nitrates in water needed to be smaller than originally thought.

Information disseminated by EPA in support of its new air-quality standard for particulate matter has been widely criticized as erroneous or unreliable. Two studies by my faculty colleagues at the Harvard School of Public Health were especially controversial because the original data were not made available for public scrutiny. Yet an independent organization funded by the car companies and EPA, the Health Effects Institute in Cambridge, Massachusetts, did a major reanalysis of the two key studies and found no significant mathematical errors. The HEI reanalysis did find that the quantified health risks of pollution changed significantly when alternative methods of analysis were employed. The HEI work also offers an intriguing model of how reproducibility of analytic results can be achieved without insisting on public access to original data. That model may prove to be useful under the OMB information-quality guidelines. The controversy surrounding these particular health studies continues and may not be dispelled until the ideal of public access to original data -- with identifiers removed to protect confidentiality of subjects -- is achieved.

In my own work as a scholar, I must confess to a quality problem here and there -- even in those papers published in good journals! For example, I projected that a policy of mandatory airbags would save 9,000 lives per year in this country. The best published estimates based on real-world crash data are now around 3,000 lives saved per year. I also did not predict the harmful effects of passenger airbags on young children.

In citing these various examples of quality problems, I do not mean to suggest that the work of scientists can be perfect. Even the best of scientists are human. In addition, the scientific data may be ambiguous, allowing several equally plausible interpretations. Science is an evolutionary process where the work of one scientist is enhanced by the criticism of others. What we are discussing is an organizational challenge motivated by the reality that scientists and analysts are not perfect. How can we improve the quality of information disseminated by federal agencies, including disseminations that must convey scientific ambiguity?

PHASE ONE: OMB’S 2002 GUIDELINES

The Bush Administration is committed to vigorous implementation of the new information quality law. We believe it provides an excellent opportunity to enhance both the competence and accountability of government. Yet the law -- Section 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 -- charged OMB with a huge task: the development of government-wide guidelines to ensure and maximize the quality of information disseminated by agencies. The law covers both the independent agencies and the executive agencies but provides few limitations on the scope or types of information that are to be covered.

To make a long story short, OMB has now published -- after two rounds of public and interagency comment -- final guidelines in this area. These guidelines take effect October 1st of this year. They impose three core responsibilities upon all federal agencies.

First, agencies must commit to embracing a basic standard of quality as a performance goal and take appropriate steps to incorporate quality into their information dissemination practices. Obviously, the act of dissemination is not readily separated from the processes of generation and use of information -- particularly given “sunshine” laws -- and thus the OMB guidelines have important ramifications for all aspects of information management at agencies.

Second, agencies are to develop information quality procedures that are applied BEFORE information is disseminated. The practice of scientific peer review plays an important role in the guidelines, particularly in establishing a presumption that peer-reviewed information is “objective.” We recognize review at scientific journals as an acceptable form of peer review and offer some guidelines for assuring competent and credible peer review at agencies.

Third, and here is perhaps the key provision, Congress required each agency to develop an administrative mechanism whereby affected parties can request that agencies correct poor quality information that has been or is being disseminated by agencies. The burden of proof is squarely on the affected parties: They must demonstrate that a specific dissemination does not meet the quality standards in the OMB guidelines or the agency-specific guidelines. It is this opportunity for complaint and prompt correction that begins in October of this year. The OMB guidelines stipulate that, if an agency denies a correction request, an opportunity for appeal must be provided. Needless to say, many procedural details need to be worked out.
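Stated schematically, the mechanism is a simple request-review-appeal flow. The short Python sketch below is one way to picture it; the states, field names, and decision logic are illustrative assumptions on my part, not language drawn from the guidelines or from any agency's procedure.

    from dataclasses import dataclass, field
    from enum import Enum

    class Status(Enum):
        SUBMITTED = "submitted"
        CORRECTED = "corrected"
        DENIED = "denied"
        UNDER_APPEAL = "under appeal"

    @dataclass
    class CorrectionRequest:
        # Hypothetical fields; the guidelines prescribe no data model.
        dissemination: str   # the specific agency dissemination at issue
        claimed_defect: str  # how it falls short of the quality standards
        evidence: list = field(default_factory=list)  # burden rests on the requester
        status: Status = Status.SUBMITTED

    def agency_review(request: CorrectionRequest, burden_met: bool) -> CorrectionRequest:
        """Correct the dissemination only if the affected party has shown it
        fails the OMB or agency-specific quality standards."""
        request.status = Status.CORRECTED if burden_met else Status.DENIED
        return request

    def appeal(request: CorrectionRequest) -> CorrectionRequest:
        """A denied request must be given an opportunity for appeal."""
        if request.status is Status.DENIED:
            request.status = Status.UNDER_APPEAL
        return request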

As we meet today, “phase two” of the information quality guidelines process is entering the public arena. In particular, each Federal department and independent agency was required to publish, by May 1st, an announcement of the availability of its draft guidelines. At this juncture, most of the Departments have published such notices, indicating where their draft guidelines can be accessed for review. In general, the Departments have indicated that they will receive comments for either 30 or 45 days.

It is noteworthy that the statistical agencies were decidedly “out in front” on this challenge. In a very real sense, they were perhaps most ready to meet the challenge, for information quality standards historically have been central to their work. What was especially remarkable, however, was the fact that the statistical agencies voluntarily came together at the earliest stages of this process to develop a common template for their agency guidelines. Moreover, these agencies – the ones in this room – are publishing a common Federal Register notice to draw the public’s attention to their individual statistical agency guidelines.

CONCLUSION

I urge you to examine the guidelines and provide feedback to the agencies. We have already benefited from informal discussions with many interested parties, and from a workshop held in March here at the Academy. This is an ambitious legislative mandate that we must turn into a process that is practical for both the agencies and the public. I look forward to your comments and questions.