ExpectMore.gov

Detailed Information on the Capability Enhancement of Researchers, Institutions, and Small Businesses Assessment

Program Code: 10004403
Program Title: Capability Enhancement of Researchers, Institutions, and Small Businesses
Department Name: National Science Foundation
Agency/Bureau Name: National Science Foundation
Program Type(s): Research and Development Program; Competitive Grant Program
Assessment Year: 2006
Assessment Rating: Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 100%
Program Management 100%
Program Results/Accountability 82%
Program Funding Level (in millions):
FY2007: $222
FY2008: $242
FY2009: $262

Ongoing Program Improvement Plans

Year Began: 2006

Improvement Plan: Continue efforts to increase the percentage of awards to new investigators (i.e., new companies) in Phase I of the Small Business Innovation Research and Small Business Technology Transfer Programs.

Status: Action taken, but not completed

Comments: This measure refers to NSF's broadening participation efforts, in particular the effort to increase the percentage of awards to new PIs in the SBIR/STTR Program. That Program continues to pursue aggressive strategies to attract and fund quality proposals from new PIs (i.e., new companies) through co-funding, use of supplements with NSF's research programs, and outreach, particularly targeted efforts toward small businesses owned by underrepresented minorities and women.

Completed Program Improvement Plans

Year Began: 2006

Improvement Plan: Ensure increased timeliness of yearly project reports from award recipients.

Status: Completed

Comments: On Nov. 18, 2006, changes were implemented in the Project Reports System to enable NSF to monitor and enforce timely submission of annual and final project reports by PIs. Annual reports are due 90 days prior to the end of the report period and are required for all standard and continuing grants and cooperative agreements. Final reports are due within 90 days after expiration of the award. Policy documents have been updated to reflect the changes.
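For illustration only, the due-date arithmetic described above can be sketched as follows (a minimal example; the function and field names are hypothetical and are not drawn from NSF's actual Project Reports System):

    from datetime import date, timedelta

    def annual_report_due(report_period_end: date) -> date:
        # Annual reports are due 90 days prior to the end of the report period.
        return report_period_end - timedelta(days=90)

    def final_report_due(award_expiration: date) -> date:
        # Final reports are due within 90 days after the award expires.
        return award_expiration + timedelta(days=90)

    # Example: an award whose report period and award both end Sept. 30, 2007.
    print(annual_report_due(date(2007, 9, 30)))  # 2007-07-02
    print(final_report_due(date(2007, 9, 30)))   # 2007-12-29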

Program Performance Measures

Term Type  
Long-term Outcome

Measure: External Advisory Committee validation that NSF, through Capability Enhancement Program components such as the Small Business Innovation Research (SBIR) program, has fostered connections between discoveries and their use in the service of society.


Explanation: This measure is a direct reflection of a primary goal of the SBIR component of the Capability Enhancement Program. Assessment of achievement is performed annually by the external Advisory Committee for GPRA Performance Assessment (AC/GPA). That committee assesses performance toward this goal and deems either that NSF has made "significant achievement" in this area (which translates to Success in this measure) or that it cannot validate "significant achievement" (which translates to Failure). Relative success in meeting this goal is formally reassessed every year, thus promoting continuing improvement.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External Advisory Committee validation that NSF achieves the Capability Enhancement Program objective to encourage collaborative research and education efforts across organizations, disciplines, sectors and national and international boundaries.


Explanation: This measure is a direct reflection of a goal of the Capability Enhancement Program. Assessment of achievement is performed annually by the external Advisory Committee for GPRA Performance Assessment (AC/GPA). That committee assesses performance toward this goal and deems either that NSF has made "significant achievement" in this area (which translates to Success in this measure) or that it cannot validate "significant achievement" (which translates to Failure). Relative success in meeting this goal is formally reassessed every year, thus promoting continuing improvement.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External Advisory Committee validation that NSF achieves its strategic objective to support activities that enable people who work at the forefront of discovery to make important and significant contributions to science and engineering knowledge.


Explanation: This measure reflects an NSF-wide goal that is relevant to the Capability Enhancement Program. Assessment of achievement is performed annually by the external Advisory Committee for GPRA Performance Assessment (AC/GPA). That committee assesses performance toward this goal and deems either that NSF has made "significant achievement" in this area (which translates to Success in this measure) or that it cannot validate "significant achievement" (which translates to Failure). Relative success in meeting this goal is formally reassessed every year, thus promoting continuing improvement.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Annual Efficiency

Measure: For 70 percent of proposals submitted to the Major Research Instrumentation Program, which was a major component of Capability Enhancement, inform applicants of funding decisions within six months of proposal receipt or deadline or target date, whichever is later, while maintaining a credible and efficient merit review system.


Explanation: Because the program category "Capability Enhancement" no longer exists under NSF's new Strategic Plan, data on the measures associated with the PART Program can no longer be tracked. Capability Enhancement was a category composed of distinct and very different programs, such as EPSCoR, SBIR/STTR, HBCU-RISE, CREST (Centers of Research Excellence in Science and Technology), and I/UCRC (Industry/University Cooperative Research Centers). Under the new Strategic Plan there is no program portfolio comparable to Capability Enhancement. Therefore, the edited measure reflects a Foundation-wide result.

Year Target Actual
2003 70% 96%
2004 70% 92%
2005 70% 90%
2006 70% 93%
2007 70% 87%
2008 70%
2009 70%
2010 70%
Long-term Outcome

Measure: External Advisory Committee validation that NSF achieves the Capability Enhancement Program objective to promote increased opportunities for underrepresented individuals and institutions to conduct high quality, competitive research and education activities.


Explanation: This measure is a direct reflection of a primary goal of the Capability Enhancement Program. Assessment of achievement is performed annually by the external Advisory Committee for GPRA Performance Assessment (AC/GPA). That committee assesses performance toward this goal and deems either that NSF has made "significant achievement" in this area (which translates to Success in this measure) or that it cannot validate "significant achievement" (which translates to Failure). Relative success in meeting this goal is formally reassessed every year, thus promoting continuing improvement.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Annual Output

Measure: Percentage of Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) Program Phase I awards to new principal investigators.


Explanation: The average percentage of SBIR/STTR Phase I awards made to new principal investigators during the period 2003-2005 is 68%. This figure exceeds the NSF-wide norm and reflects a major influx of new principal investigators in response to economic factors affecting the Nation's small business community. Because variations in the economic climate significantly affect SBIR/STTR, it may prove difficult to maintain the funding rate for new principal investigators at this level over the period 2006-2008. Moreover, the SBIR/STTR programs require that companies demonstrate increasing sophistication, not only in the development of innovative technologies but also in business planning and management. New principal investigators may face a special challenge in meeting these requirements. In spite of these difficulties, SBIR/STTR will expend every effort to extend participation to new investigators, especially members of groups underrepresented in science and engineering. Because this is a new measure, there were no targets before 2006.
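For reference, the 68% figure is the simple average of the 2003-2005 actuals shown in the table below: (69% + 65% + 69%) / 3 ≈ 67.7%, which rounds to 68%.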

Year Target Actual
2003 N/A 69%
2004 N/A 65%
2005 N/A 69%
2006 68% 65%
2007 68% 65%
2008 68%
2009 68%
2010 68%

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the Capability Enhancement (CE) Program is clearly stated. It is to enhance the capability of individuals, institutions, and small businesses to conduct high quality, competitive science and engineering (S&E) research, education, and technological innovation. Therefore, participants in the CE Program, including members of groups underrepresented in science, technology, engineering, and mathematics (STEM) fields, are positioned to more fully contribute to the Nation's research and development (R&D) enterprise. CE component programs are: the Centers of Research Excellence in Science and Technology (CREST) and its Historically Black Colleges and Universities Research Infrastructure in Science and Engineering (HBCU-RISE) activity; the Experimental Program to Stimulate Competitive Research (EPSCoR); the Small Business Innovation Research and Small Business Technology Transfer programs (SBIR/STTR); the Industry/University Cooperative Research Centers program (I/UCRC); the Research in Undergraduate Institutions (RUI) program; and Research Opportunity Awards (ROA).

Evidence: Evidence of the purpose of the CE program can be found in the following references: the NSF FY 2003-2008 Strategic Plan (pp. 15-18) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201); the National Science Foundation Act of 1950, as amended, under the Functions of the Foundation (42 U.S.C. 1862, Sec. 3. (a)(5)) (http://www.washingtonwatchdog.org/documents/usc/ttl42/ch16/sec1862.html); and the NSF Authorization Act of 2002 (P.L. 107-378, Section 2) (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=107_cong_bills&docid=f:h4664enr.txt.pdf). The following program solicitations also provide evidence of CE's clear program purpose. Each contains a description of the program's purpose and detailed information concerning program eligibility, proposal preparation and submission, proposal review procedures, and award administration. The institutional-based CE effort includes the following components: CREST/HBCU-RISE (NSF 06-510) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06510) and RUI/ROA (NSF 00-144) (http://www.nsf.gov/pubs/2000/nsf00144/nsf00144.pdf). Geographic and private sector CE components are represented by: EPSCoR (NSF 05-589) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05589); I/UCRC (NSF 01-116) (http://www.nsf.gov/pubs/2001/nsf01116/nsf01116.pdf); and SBIR/STTR (NSF 05-605) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05605), with SBIR/STTR supplemental funding for minority-serving community college research teams (http://www.nsf.gov/pubs/2006/nsf06008/nsf06008.jsp).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The CE Program addresses a specific and existing national need: to enhance the capability of individuals, institutions, and small businesses, including members of groups underrepresented in science, technology, engineering, and mathematics (STEM) fields, to conduct high quality, competitive science and engineering (S&E) research, education, and technological innovation, so that they can contribute more fully to the Nation's research and development (R&D) enterprise. The CE component programs that address this need are listed under Question 1.1.

Evidence: The specific interests and existing national needs addressed by the CE Program are well documented. Evidence concerning the role played by the NSF and its CE Program in addressing these needs can be found in the following documents. (1) NSF Authorization Act of 2002 (P.L. 107-378, Section 2) (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=107_cong_bills&docid=f:h4664enr.txt.pdf). (2) A description of the relationship of specific CE Program objectives to NSF strategic goals is found in the NSF Strategic Plan FY 2003-2008 (pp. 15-18) (http://www.nsf.gov/pubs/2004/nsf04201/FY2003-2008.pdf). (3) The need for increased opportunities for underrepresented individuals and institutions to conduct high quality, competitive research and education activities is well documented. Specific data concerning the under-representation of minorities, women, and persons with disabilities are provided in the FY 2006 NSF Science and Engineering Indicators (http://www.nsf.gov/statistics/nsb0602/ and http://www.nsf.gov/statistics/seind06/) and in Women, Minorities and Persons with Disabilities in Science and Engineering (http://www.nsf.gov/statistics/wmpd/). (4) Data relating to the selection of EPSCoR participants from among those jurisdictions that receive a lesser amount of NSF research funding are available at the EPSCoR web site (http://www.nsf.gov/ehr/epscor/eligible.jsp). (5) A review of the SBIR program by the GAO found the program to have successfully addressed the needs of small businesses (Testimony before the Subcommittee on Environment, Technology, and Standards, Committee on Science, House of Representatives, June 28, 2005, Federal Research: Observations on the Small Business Innovation Research Program, http://www.gao.gov/cgi-bin/getrpt?GAO-05-861T). (6) Regular external evaluation of the CE Program indicates that it effectively addresses national S&E needs. For example, see the Committee of Visitors (COV) reviews of the individual component programs that comprise the overall CE Program on the Office of Integrative Activities web site (http://www.nsf.gov/od/oia/activities/cov/covs.jsp).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The CE Program is designed not to be redundant of any other Federal, state, local, or private effort (i.e., NSF coordinates with other Federal agencies to avoid duplication of efforts and spending). In contrast to the focused R&D efforts of the Federal "mission agencies" (e.g., NIH-biomedical, NASA-space, DOD-defense), NSF is the only federal agency charged with promoting the progress of fundamental science and engineering research and education in all STEM fields and disciplines. For example, in comparison with other Federal programs targeting Historically Black Colleges and Universities (HBCUs), the Foundation's programs are virtually unique in their focus on excellence in STEM research and education, including institutional and faculty development and the training of highly skilled S&T professionals. The Army Research Laboratory's HBCU program focuses on skilled technicians, and the Department of Education's Strengthening Black Graduate Institutions program focuses on education rather than facilities and infrastructure; neither addresses institutional capacity or research competitiveness, nationally and internationally. NSF's unique relationship with the national scientific community and its competitive grant mechanisms strongly position the agency to carry out programs specifically designed to enhance the research and education capabilities of significant segments of the Nation's S&E communities. Investments in designated institutional, geographic, and private sector communities address national STEM innovation, education, and workforce needs that are not under the purview of other mission-oriented federal, state, or local agencies. For example, the primary objective of the SBIR/STTR program is to increase the incentive and opportunity for small firms to undertake cutting-edge, high-risk, high-quality scientific, engineering, or science/engineering education research with a high potential economic payoff if successful. The program is unique within NSF in that it provides an opportunity for small businesses to participate in federally funded research and promotes the growth and development of small businesses. In addition, NSF's investments in SBIR/STTR are both complementary to and different from other Federal, state, local, and private efforts to stimulate technological innovation. For example, although all agencies provide some R&D support for small businesses, NSF's programs differ in that they address national research and innovation needs in areas of fundamental R&D that are not generally supported by the more mission-specific agencies.

Evidence: NSF has statutory authority to evaluate the status of various science and engineering programs and to consider the results in correlating its programs with others. See the National Science Foundation Act of 1950, as amended, under the Functions of the Foundation (42 U.S.C. 1862, Sec. 3. (a)(5)) (http://www.washingtonwatchdog.org/documents/usc/ttl42/ch16/sec1862.html). NSF's EPSCoR and NIH's Institutional Development Award (IDeA) programs are examples of the difference between NSF's CE Program and the focused R&D efforts of other "mission agencies." Both build research capability within jurisdictions that receive lesser R&D funding; however, IDeA is directed exclusively to biomedical research while EPSCoR builds capability in all STEM fields. Examples of projects funded by EPSCoR and IDeA may be viewed here: (http://www.nsf.gov/ehr/epscor/statewebsites.jsp). Information concerning NSF's involvement with HBCUs may be found in the NSB report Broadening Participation in Science and Engineering Faculty (NSB 04-41) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsb0441), while information concerning other Federal programs targeting HBCUs may be obtained from the following: (http://mercury.dll.org/HBCUs/GateWay_files/FederalPrograms.asp). The NSF SBIR/STTR program differs from those conducted by other agencies in that it links the fundamental research conducted in STEM fields supported by NSF to topics driven by market and national needs. Examination of recent SBIR/STTR awards displays the breadth of STEM topics supported by this CE Program (http://www.nsf.gov/awardsearch/progSearch.do?SearchType=progSearch&page=2&QueryText=&ProgOrganization=OII&ProgOfficer=&ProgEleCode=&BooleanElement=false&ProgRefCode=&BooleanRef=false&ProgProgram=&ProgFoaCode=&RestrictActive=on&Search=Search#results). SBIR/STTR performance is reviewed on a triennial basis by its COV and on a biannual basis by its AC. The reviews are conducted within the context of NSF and other programs concerned with forging academic/business partnerships. In addition, the Engineering Advisory Committee considers SBIR/STTR activities as an integral part of its overall review of Directorate efforts. The SBIR/STTR program uses these external evaluations to assess the potential benefits of its efforts. Evidence can be found in the 2004 Committee of Visitors Report for the SBIR/STTR programs (pp. 4 and 10-12, http://www.nsf.gov/od/oia/activities/cov/covs.jsp#eng); the SBIR/STTR Advisory Committee Report, June 2004 (p. 6, http://www.nsf.gov/eng/sbir/Adcom/June%202004.doc); and external evaluations, for example the GAO review of the SBIR program (http://www.gao.gov/cgi-bin/getrpt?GAO-05-861T) and NRC reports on the SBIR program: The Small Business Innovation Research Program Challenge and Opportunities (http://www.nap.edu/books/0309061989/html/R1.html) and SBIR Program Diversity and Assessment Challenges (http://www.nap.edu/books/0309091233/html/R1.html).

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The CE program is free from major design flaws that would prevent it from meeting its defined objectives and performance goals, and there is no strong evidence that another approach or mechanism would be more efficient or effective in achieving the intended purpose. The extensive oversight and review that the CE Program receives, like all NSF programs, effectively ensures that its design is free of major flaws that would limit its effectiveness or efficiency. This oversight and review includes: (1) a merit review process that has been recognized as a best practice for administering R&D programs; (2) the professional judgment of trained NSF program officers knowledgeable in their respective fields; and (3) triennial external Committee of Visitors (COV) review to ensure effectiveness and efficiency of program operations. Independent reviews by COVs and other external groups such as NSF Advisory Committees (AC), the National Science Board (NSB), and other external organizations provide additional scrutiny of the portfolio's goals and results. The CE Program makes improvements based on recommendations received from its independent reviewers, following the guidance provided in the R&D criteria outlined in the Office of Management and Budget/Office of Science and Technology Policy Guidance Memo.

Evidence: Evidence of the effectiveness of the CE Program's design can be found in the following reports. (1) The 2005 AC/GPA report (p. 11, http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210). (2) The COV reports for the individual component grant programs that comprise the overall CE Program provide support for its effectiveness and efficiency (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). For example, see the 2005 COV report for EPSCoR (http://www.nsf.gov/od/oia/activities/cov/ehr/2005/EPSCoRcov.pdf) or the 2004 COV report for SBIR/STTR (http://www.nsf.gov/od/oia/activities/cov/eng/2004/SBIRcov.pdf).

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The design of the CE Program is effectively targeted such that its resources directly address the program's purpose and reach the intended beneficiaries. The CE Program achieves its purpose by supporting the development of high quality, competitive STEM research, education, and technological innovation among individuals, institutions, and small businesses. It relies upon two mechanisms to ensure that resources reach the intended beneficiaries. First, program solicitations contain a clear statement of the purpose in the context of the particular activity and the audience to which they are directed. Second, the NSF ensures effective targeting of its CE activities through: (i) outreach visits to specific groups, (ii) extensive use of the World Wide Web, (iii) targeted e-mails and list servers, and (iv) participation in relevant workshops, meetings, and conferences.

Evidence: The effectiveness of the CE Program's design is demonstrated through the following examples. To ensure open access and to inform individuals interested in proposing ideas of opportunities to do so, all CE Program solicitations contain clear statements of the corresponding activity's purpose and context and are available online (http://www.nsf.gov/funding/). Targeted outreach is accomplished through several means, including MyNSF, an electronic communications system that alerts people who have registered their interests to specific opportunities (http://www.nsf.gov/mynsf/), and an extensive NSF-wide outreach visitation program to EPSCoR jurisdictions. The latter provides approximately $100,000 each year to support 80 to 100 outreach visits by permanent NSF staff to universities and colleges within the 27 EPSCoR jurisdictions and strongly encourages NSF staff participation in annual and topical conferences organized by the jurisdictions (http://www.nsf.gov/ehr/epscor/statewebsites.jsp). NSF staff members also conduct outreach activities at numerous regional and national professional conferences and during visits to academic institutions located throughout the Nation. Several dozen NSF staff members also participate in outreach at the NSF's biannual Regional Grants Conference and at national SBIR/STTR and EPSCoR meetings (http://www.nsf.gov/bfa/dias/policy/outreach.jsp). In addition, SBIR/STTR has established an extensive outreach program with specific emphasis on states and regions with historically lower participation (http://www.nsf.gov/od/oia/activities/cov/eng/2004/SBIRcov.pdf). Finally, the Report to the National Science Board on the National Science Foundation's Merit Review Process Fiscal Year 2005 (NSB 06-21) indicates that NSF's extensive external merit review process ensures that funding is awarded to those proposals that are of high scientific and technical merit and best address the goals and objectives of the CE Program (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf).

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The CE Program has four specific long-term performance measures that focus on outcomes and meaningfully reflect the program's purpose. These measures are drawn from the specific CE Program objectives set forth in the NSF Strategic Plan FY 2003-2008, and they encompass NSF's commitment to broadening S&E participation and to strengthening the Nation's STEM workforce. They are: increase opportunities for underrepresented individuals and institutions to conduct high quality, competitive research and education activities; foster connections between discoveries and their use in the service of society; encourage collaborative research and education efforts across organizations, disciplines, sectors, and national and international boundaries; and enable people who work at the forefront of discovery to make important and significant contributions to S&E knowledge.

Evidence: CE Program performance measures may be found in the Measures Tab. Additional information regarding the assessment of performance may be found on the Performance Assessment Information web site (http://www.nsf.gov/about/performance/) and in the NSF Strategic Plan FY 2003-2008 (pp. 27-29, http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201). Further information on the CE Program is available from the following sources. (1) Specific RUI goals are given in the announcement (http://www.nsf.gov/pubs/2000/nsf00144/nsf00144.pdf) and in the Dear Colleague Letters about ROA (http://www.nsf.gov/pubs/2005/nsf05548/nsf05548.pdf) and (http://www.nsf.gov/attachments/104362/public/Funding_Opps_Envir_Biology.doc). (2) SBIR/STTR activities are described in the Office of Industrial Innovation's portion of the Foundation's strategic plan (http://www.nsf.gov/attachments/104206/public/Final_OII.doc). (3) The I/UCRC fosters connections between discoveries and their use in the service of society and works to develop self-sufficient partnerships among industry, academe, and government to conduct research that is of interest to both industry and university researchers. I/UCRC's specific goals are shown in NSF's Engineering Education and Centers Strategic Plan (pp. 12-13, http://www.nsf.gov/attachments/104206/public/Final_EEC.pdf) and in the Synopsis of Program section of the program's solicitation (http://www.nsf.gov/pubs/2001/nsf01116/nsf01116.pdf). (4) The CREST program evaluates the impact of Federal funds on institutional research capacity by monitoring indicators of increased research capability, such as publications, presentations, patent applications, participant demographics, and successful proposals produced by awardees as a result of CREST support. This information appears in the CREST data collection system (CRESTWeb), administered by the QRC Division of ORC Macro (Bethesda, MD) (http://chaffee.qrc.com/nsf/ehr/crestweb/index.cfm). The system is an internal resource available to NSF staff and requires a password for access; specific information concerning CRESTWeb data may be sought by contacting NSF's Division of Human Resource Development (HRD).

YES 9%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The targets and timeframes established for the CE Program's long-term measures are inherently ambitious. They are directly related to the program's ability to enhance the capability of individuals, institutions, and small businesses to conduct high quality, competitive STEM research, education, and technological innovation. NSF has demonstrated "significant achievement" in meeting the CE Program's objectives as assessed by the external Advisory Committee for GPRA Performance Assessment (AC/GPA). In addition, external advisory committees regularly assess the goals and timeframes for the CE Program to ensure that they are appropriately ambitious and promote continuous improvement consistent with NSF goals and objectives. The COV process is the primary mechanism for external evaluation at the division or program level.

Evidence: Ambitious targets and timeframes for long-term measures can be found in the Measures Tab of this document. As discussed there, the targets are ambitious due to the constant increases in the complexity and number of research proposals each year. Additional information regarding targets and timeframes as well as general performance assessment information can be accessed at the following web sites. (1) Assessment of performance information may be found in Performance Assessment Information (http://www.nsf.gov/about/performance/) and the NSF Strategic Plan FY 2003-2008 (pp. 27-29, http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201). (2) Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (3) FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (4) Information concerning the methods and targets developed by individual programs to enhance the capability of individuals, institutions, and small businesses to conduct high quality, competitive STEM research, education, and technological innovation may be found in their COV and AC reports, (http://www.nsf.gov/od/oia/activities/cov/covs.jsp) and in their grant solicitations (http://www.nsf.gov/publications/index.jsp?org=NSF&archived=false&hideSelects=false&pub_type=Program&nsf_org=NSF&x=9&y=10).

YES 9%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The CE Program has three specific annual performance measures, shown in the Measures Tab, which can demonstrate progress toward achieving the program's long-term goals and the agency's strategic objectives. These are: (1) the percentage of award decisions made available to CE Program applicants within six months of proposal receipt or deadline date; (2) the percentage of proposals submitted to the CE Program, excluding the SBIR/STTR components, from academic institutions not in the top 100, as defined by total NSF funding; and (3) the percentage of SBIR/STTR Phase I awards to new principal investigators. These annual measures help ensure effective and efficient operation of the CE Program, which is relevant to all of the program's long-term goals. They also work to broaden participation by increasing the pool of ideas proposed to the Foundation (especially relevant to the first long-term goal) and enhance the agency's ability to identify the most promising new ideas from people who work at the forefront of discovery in science and engineering (especially relevant to the fourth long-term goal).

Evidence: Specific annual performance measures demonstrating progress toward achieving long-term goals may be found in the Measures Tab. There are no annual outcome measures because of the difficulty basic research programs face in setting short-term goals for specific successes toward research breakthroughs and other outcomes.

YES 9%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Baselines for the annual performance measures are obtained from internal NSF sources. Ambitious targets, commensurate with the budget environment, are set under the Measures Tab. As discussed there, the targets are ambitious due to the constant increases in the complexity and number of research proposals each year. In all instances the CE Program has met NSF performance standards.

Evidence: Performance measures can be found in the Measures Tab. Additional information may be accessed at the Performance Assessment Information web site (http://www.nsf.gov/about/performance/), while annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp) and the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par) provide more specific information. COV reports provide information about the performance of individual programs. They can be accessed at the Office of Integrative Activities web site (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). For example, recent COV reports for the SBIR/STTR programs (http://www.nsf.gov/od/oia/activities/cov/eng/2004/SBIRcov.pdf) and EPSCoR (http://www.nsf.gov/od/oia/activities/cov/ehr/2005/EPSCoRcov.pdf) examine the ability of these programs to meet agreed-upon performance standards for program processes and management and for investment outputs and outcomes.

YES 9%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: All participants in the CE Program commit to and work toward achieving the program's long-term and annual goals. The CE Program achieves this level of commitment by ensuring that all CE Program descriptions and solicitations are consistent with these goals. The CE Program employs merit review to select proposals that demonstrate commitment to the CE goals and requires grantees to submit satisfactory template-based annual and final progress reports, subject to NSF program officer approval, as a prerequisite for continuation and/or renewal support. Continuing support (e.g., continuing grant increments) is released only after NSF program officers review and approve the required annual progress reports. To receive further support (subsequent awards), all applicants are required to include in their new proposals a report on the results of previous NSF support, which is then considered in the merit review process. In addition, a final project report must be submitted after an award ends, and no subsequent awards can be made to an applicant unless a program officer has approved the final project reports for all previous awards.

Evidence: Evidence that all partners commit to and work toward the annual and/or long-term goals of the CE Program includes annual and final awardee project reports, as required by the NSF Grant General Conditions (GC-1) (http://www.nsf.gov/pubs/gc1/gc1_605.pdf). All of the CE Program solicitations issued annually (http://www.nsf.gov/publications/index.jsp?nsf_org=NSF&org=NSF&pub_type=Program&archived=false&page=1&ord=title) reference NSF's two merit review criteria, which are directly aligned with the goals of the CE Program and are described at the following web site: (http://www.nsf.gov/pubs/1999/nsf99172/nsf99172.htm). Other evidence may be found here: (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf).

YES 9%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: External committees conduct regularly scheduled high quality, independent evaluations of CE Program activities primarily through the COV process. The evaluations are conducted to support program improvements and to evaluate effectiveness and relevance to CE program goals. The results of these evaluations directly influence CE Program planning. COV reviews provide NSF with external expert judgments in two areas: (1) assessments of the quality and integrity of program operations and program-level technical and managerial matters pertaining to proposal decisions; and (2) comments on how the results generated by awardees have contributed to the attainment of NSF's mission and strategic outcome goals. COV reviews are conducted every three years. Advisory Committees, which meet several times per year, review Directorate performance and COV reports, and the AC/GPA assesses performance on an NSF-wide basis for the Strategic Outcome Goals. NSF's approach to evaluation was highlighted by GAO as an "evaluation culture--a commitment to self-examination, data quality, analytic expertise, and collaborative partnerships." These activities inform NSF senior management and contribute to development of plans for the agency. Additional evaluations are conducted on an as-needed basis by other independent entities.

Evidence: Evidence of independent evaluations of sufficient scope and quality may be found in the following documents: (1) GAO's report, Program Evaluation: An Evaluation Culture and Collaborative Partnerships Help Build Agency Capacity (GAO-03-454, May 2, 2003, http://www.gao.gov/new.items/d03454.pdf). (2) Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (3) Performance Assessment Information (http://www.nsf.gov/about/performance/). (4) COV reports and NSF responses, which provide information about the performance of individual programs and can be accessed at the Office of Integrative Activities web site (http://www.nsf.gov/od/oia/activities/cov/). (5) Internal reports and databases commissioned by the individual programs are also used in the evaluation process. For example, the CREST program evaluates the impact of Federal funds on institutional research capacity by monitoring indicators of increased research capability, such as publications, presentations, patent applications, participant demographics, and successful proposals produced by awardees as a result of CREST support. This information appears in the CREST data collection system (CRESTWeb), administered by the QRC Division of ORC Macro (Bethesda, MD) (http://chaffee.qrc.com/nsf/ehr/crestweb/index.cfm). The system is an internal resource available to NSF staff and requires a password for access; specific information concerning CRESTWeb data may be sought by contacting NSF's Division of Human Resource Development (HRD).

YES 20%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The NSF's annual budget request is explicitly tied to accomplishment of its annual and long-term performance goals, and its resource needs are presented in a complete and transparent manner. The Foundation's performance structure provides the underlying framework for its request, with each major NSF organization (i.e., Directorates and Offices) tying its budget directly to this performance framework. This is further documented in the performance summary included in each organization's chapter of the budget, which ties its budget directly to the PART activities. To present resource needs clearly and transparently, the NSF budget displays resource requests by structural component and by performance goal. This presentation is based on consultations over the past year with key Congressional and OMB staff, and it also incorporates recommendations from the 2004 report on NSF by the National Academy of Public Administration. The purpose of this budget presentation is to highlight the matrix structure that NSF employs, with the major organizational units each contributing to the goals and investment categories established in the NSF Strategic Plan. This revised presentation contains additional information on the portfolio of investments maintained across NSF, including the components of the CE Program.

Evidence: The documents cited below provide evidence that NSF budget requests are tied to accomplishment of the annual and long-term performance goals of the CE Program. (1) The Executive Branch Management Scorecard (/results/agenda/scorecard.html). (2) The NSF FY 2007 Budget Request to Congress (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=budreqfy2007). (3) National Science Foundation Governance and Management for the Future (National Academy of Public Administration, Order Number 04-07) (http://www.napawash.org/resources/news/news_4_28_04.html) and (http://71.4.192.38/NAPA/NAPAPubs.nsf/17bc036fe939efd685256951004e37f4/23f8c16a35c7eb6485256e85004b4a4f?). (4) A description of specific CE Program objectives and their relationship to NSF strategic goals is found in the NSF Strategic Plan FY 2003-2008 (pp. 15-18) (http://www.nsf.gov/pubs/2004/nsf04201/FY2003-2008.pdf).

YES 9%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Strategic planning deficiencies of the type and scope that would jeopardize the success of the CE Program have not been identified. To ensure that such deficiencies do not arise, the CE Program uses the COV and AC processes on an ongoing basis to obtain constructive feedback concerning areas where strategic planning can be strengthened, and it responds directly to the issues raised by the committees. Each NSF division or office prepares an annual update describing key actions taken to implement the recommendations cited in the previous COV report.

Evidence: Evidence demonstrating CE Program strategic planning may be found in the following documents: COV reports and NSF responses for CE Program components (http://www.nsf.gov/od/oia/activities/cov/) and annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). A description of the relationship of specific CE Program objectives to NSF strategic goals is found in the NSF Strategic Plan FY 2003-2008 (pp. 15-18, http://www.nsf.gov/pubs/2004/nsf04201/FY2003-2008.pdf). Examples of meaningful steps taken by the CE Program to improve strategic planning include the following cases. (1) For Phase I SBIR proposals, the COV recommended that the merit review panels "have more representatives from the business sector in order to provide earlier input to PI's and small businesses, and to improve the market success rates in the commercial world. More balance between technical and business reviewers should be achieved." The SBIR program accepted the COV recommendation and plans to integrate "commercial/business" reviewers into the Phase I technical review process. The actual process to be used (e.g., by mail or as part of the panel) is being developed as the program experiments with the best format. For details see the SBIR COV report (p. 5, http://www.nsf.gov/od/oia/activities/cov/eng/2004/SBIRcov.pdf) and its update (p. 2, http://www.nsf.gov/eng/general/cov/sbircov2005update.doc). (2) COV review deemed I/UCRC an excellent performer in all categories of assessment, but the COV also felt that I/UCRC was under-funded and could use critical-mass resources to sustain this important link to industry and exposure to real-world problems. In response, funding for I/UCRC was increased in FY 2005 from $6 million to $7 million and included a supplement for basic research at these Centers (I/UCRC Supplemental Funding Requests for Fundamental Research, http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13659&org=EEC&from=home).

YES 9%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

Explanation: The CE Program uses regularly scheduled independent evaluations, primarily the COV process, to assess and compare the potential benefits of its efforts to those of other programs that have similar goals. The NSF's investments in the CE Program address unique national STEM research and educational needs that are not under the purview of the more mission-specific federal, state or local agencies. In areas where R&D activities may overlap, NSF coordinates activities to: avoid duplication, promote synergy, and ensure that its programs support those efforts that are most appropriate to NSF's mission. For example, SBIR/STTR is unique in its links to fundamental S&E research, while NSF utilizes its leadership of the EPSCoR Interagency Coordinating Committee (EICC) to coordinate the activities of the seven Federal agencies with EPSCoR-like programs. The Office of Science and Technology Policy, the National Science and Technology Council, the National Science Board, OMB, the Congress, and other policy-making bodies regularly review NSF's efforts within the context of the overall Federal S&E investment.

Evidence: Examples of independent evaluations of sufficient scope and quality for CE Program components are included in their COV reports and NSF responses (http://www.nsf.gov/about/performance/advisory.jsp) and in annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). The CE Program assesses and compares the potential benefits of its individual programs through the evaluation process. For example, coordination of SBIR/STTR activities with other similar programs both inside and outside of NSF undergoes COV and AC review to: (i) avoid duplication, (ii) promote synergy, and (iii) ensure that each program supports those efforts most appropriate to the mission of each respective organization. A recent AC report indicated that the SBIR/STTR program could make a "major contribution to achieving the NSF innovation vision" through "increased synergy between SBIR and NSF academic programs." The AC went on to note that "the NSF SBIR/STTR program, the Engineering Directorate and NSF in total, cover technology areas not addressed by other mission agencies and this strongly supports the NSF mission providing innovation and service to society" (SBIR/STTR Advisory Committee Report, June 2004, p. 6 http://www.nsf.gov/eng/sbir/Adcom/June%202004.doc). Information concerning the EPSCoR Interagency Coordinating Committee can be found at (http://www.nsf.gov/ehr/epscor/ehr_epscor_eicc.jsp) while additional documentation of the assessment process can be found in the following: (1) The FY 2004 COV Report for the SBIR/STTR programs (pp. 4, 10-12) (http://www.nsf.gov/od/oia/activities/cov/covs.jsp#eng). (2) External evaluations, such as the GAO review of the SBIR program (http://www.gao.gov/cgi-bin/getrpt?GAO-05-861T); the NRC report on the SBIR program: The Small Business Innovation Research Program Challenge and Opportunities (http://www.nap.edu/books/0309061989/html/R1.html) and the SBIR Program Diversity and Assessment Challenges (http://www.nap.edu/books/0309091233/html/R1.html).

YES 9%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: A prioritization process is used to formulate the specific budget requests and guide funding decisions for the CE Program. This process develops both NSF's overall highest priorities and individual programmatic priorities. In developing priorities for individual STEM activities, information on the following factors is obtained: (1) NSF's highest funding priorities listed in the FY 2007 Budget Request - especially strengthening the core and addressing major national challenges identified by the Administration; (2) needs and opportunities identified by COV and AC review; (3) new frontiers and topics of major impact that are identified by the scientific community, e.g., through workshops; and (4) important emerging areas for which large numbers of highly ranked proposals are received. Senior management integrates that information, prioritizes budget requests within and among programs, and determines funding levels for CE activities. The CE Program relies on the merit review process to prioritize proposals for funding decisions; final funding decisions also include consideration of NSF's core strategies and maintenance of a diverse portfolio.

Evidence: Evidence demonstrating a prioritization process to guide budget requests and funding decisions may be found in the following references: (1) The NSF Strategic Plan FY 2003-2008 (pp. 22-26, http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201). (2) The NSF FY 2007 Budget Request to Congress (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=budreqfy2007). (3) The Grant Proposal Guide (NSF 04-23) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg). (4) National Science Board reports, minutes, and agendas (http://www.nsf.gov/nsb/). (5) COV reports for the components of the CE Program, accessible at the Office of Integrative Activities web site (http://www.nsf.gov/od/oia/activities/cov/).

YES 9%
Section 2 - Strategic Planning Score 100%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The CE Program regularly collects timely and credible performance data, including information from key program partners, and uses this information to manage the program and to improve its overall performance. Evidence relating to the use of high-quality performance information may be found in COV reports and AC reports, including the AC/GPA report. The CE Program collects performance information through: (1) internal sources (annual, interim, and final project reports, an internal database, annual contract performance evaluations, and site visit reports) and (2) external evaluation reports and program monitoring. For example, CREST monitors indicators of increased research capability, such as publications, presentations, patent applications, participant demographics, and successful proposals produced by awardees as a result of CREST support. The CE Program uses performance information to take corrective actions, adjust program priorities, make decisions on resource allocations, and make other adjustments in management actions.

Evidence: Performance data and information are included in the following sources: (1) COV reports and NSF responses for the components of the CE Program, accessible at the Office of Integrative Activities web site (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). (2) The annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (3) The FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (4) The NSF FY 2007 Budget Request to Congress (http://www.nsf.gov/about/budget/fy2007/). (5) Internal documents such as awardee project reports, internal database information, project site visit reports, and the CREST data collection system (CRESTWeb), administered by the QRC Division of ORC Macro (Bethesda, MD). CRESTWeb provides a historical encapsulation of Center-specific, cohort-based, and program-wide outputs. Since setting objective and reasonable expectations for Center performance is not always feasible among a portfolio with such disparate and highly focused research areas, CRESTWeb data are used to indicate reasonable and ambitious targets for project productivity (e.g., publications, enrollments, presentations, patents, and leveraged funding) over all Centers. These metrics can in turn be used to express expectations and benchmarks to current or potential applicants. Although CRESTWeb is an internal resource available to NSF staff and requires a password for access, information about the reported data may be sought by contacting NSF's Division of Human Resource Development (HRD) (http://chaffee.qrc.com/nsf/ehr/crestweb/index.cfm).

YES 9%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: All NSF awardees and contractors must meet specific reporting and financial record keeping requirements, and are held accountable for cost, schedule and performance results. Accountability extends to NSF managers who also regularly monitor costs, schedule, and performance results and take corrective action when necessary. To receive further support (subsequent awards), all applicants are required to include in their new proposals a report on the results of previous NSF support. Such past performance is then considered in the merit review process. The efforts of NSF staff undergo supervisory and COV review. Sub-grantees are similarly held accountable to NSF by grantees and contractors.

Evidence: Evidence demonstrating that federal managers and program partners are accountable for cost, schedule and performance results may be found in COV reports (http://www.nsf.gov/od/oia/activities/cov/); awardee project reports; the NSF Grant General Conditions (GC-1) (http://www.nsf.gov/pubs/gc1/gc1_605.pdf); the FY 2005 Federal Cash Transaction Report (http://www.nsf.gov/pubs/2006/nsf0601/pdf/09.pdf); and annual performance evaluations of NSF staff/program officers.

YES 9%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: The NSF routinely obligates its funds in a timely manner, and they are closely monitored to assure that they are spent for the intended purposes. Accurate reporting of grant fund expenditures is a primary requirement for continued NSF support. Funds appropriated for regular research and educational grants made through the CE Program are mostly of two- to three-year duration, and approximately 99 percent of NSF support is obligated within the first year it is appropriated. Provisions for automatic awarding of continuing grants are made at the beginning of each fiscal year; each funding increment is subject to the approval of an annual progress report by the program officer. NSF also has pre- and post-award internal controls to reduce the risk of improper payments. Beginning in FY 2004, NSF incorporated erroneous-payments testing of awardees into its on-site monitoring program.

Evidence: Data and information demonstrating that funds are obligated in a timely manner and spent for the intended purpose are included in Federal Cash Transaction Reports, a clean opinion on financial statements for the past 8 years, and the following reports: (1) FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (2) NSF carryover presented in the NSF budget requests to Congress (http://www.nsf.gov/about/budget/). (3) The Risk Assessment and Award Monitoring Guide (www.nsf.gov/about/contracting/rfqs/dcca_060018/Risk%20Assessment%20Guide%20for%20Post%20Award%20Monitoring%20Site%20Visits.pdf). (4) The FY 2005 Federal Cash Transaction Report (http://www.nsf.gov/pubs/2006/nsf0601/pdf/09.pdf).

YES 9%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The NSF is a Federal Government leader in the vigorous and dynamic use of information technology to advance the agency's mission and to achieve efficiencies and cost effectiveness in the execution of its programs. Information technology improvements permit more timely and efficient processing of proposals. This has allowed NSF to establish a six-month time-to-decision goal that ensures that grant applicants receive a funding status decision in a reasonable amount of time. The CE Program is a leader in this area and has consistently exceeded Foundation-wide goals. Monthly reports on progress are sent to managers, and results are available to all staff through the agency's Enterprise Information System. Next-generation e-capabilities (in the planning and implementation stage) will be influenced by eGov activities, and NSF will continue its leadership in the Government-wide Grants Management Line of Business Strategy (GMLoB). For example, NSF has been designated as one of three initial Consortia Provider Candidates (CPC) for GMLoB. This designation recognizes NSF's current capabilities and success as a model for other agencies. NSF will begin planning to leverage its extensive capability and experience base to provide grants-management-related services for other government agencies. NSF continues to investigate ways of making its systems even more efficient, such as broadening the use of letters of intent and proactively coordinating grant application deadlines.

Evidence: Evidence of procedures to measure and achieve cost effectiveness may be found in the Measures Tab and in the following documents: (1) A discussion of management integrity appears on pp. 17-18, and time-to-decision data are located on p. II-81, of the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (2) Other information regarding program effectiveness may be found in the COV reports and program solicitations of the CE Program components (http://www.nsf.gov/od/oia/activities/cov/ and http://www.nsf.gov/funding/, respectively); the NSF Strategic Plan FY 2003-2008 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201); and the Grant Proposal Guide (NSF 04-23) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg). (3) Information relating to the GMLoB may be found in the NSF FY 2007 Budget Request to Congress (p. 311, http://www.nsf.gov/about/budget/fy2007/pdf/7-OrgnizationalExcellence/1-SalariesandExpenses/33-FY2007.pdf).

YES 9%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The NSF promotes partnerships, both within the agency and outside it, collaborating with other agencies, industry, national laboratories, and other countries on programs of mutually related interest. The CE Program is a leader in such collaborative activities and effectively coordinates its activities with related programs. The CE Program's emphasis on coordination and collaboration avoids overlap and duplication of effort and ensures effective utilization of existing funds. Interagency collaboration is exemplified by the long-standing EPSCoR Interagency Coordinating Committee (EICC), which is led by the NSF and coordinates the efforts of seven Federal agencies with EPSCoR-like programs. Within NSF, the CE Program exemplifies this spirit of collaboration and coordination to the benefit of the national S&E community. For example, during the period 2003-05 EPSCoR co-funded over 800 awards with the Foundation's regular research and educational grant programs. These proposals, although deemed meritorious by external review, would not have received NSF grants totaling over $218 million without this Foundation-wide cooperative split-funding effort. The alliance- or center-based construct of a CREST award also speaks directly to forging linkages. The most competitive proposals specify cooperation between departments; between institutions; between academe and industry; and between academe and federal, state, and local government, as well as the sharing of research and education content at all academic levels. In addition, CREST and HBCU-RISE are structured to complement the mission and objectives of other NSF programs targeting Historically Black Colleges and Universities, most directly the HBCU-Undergraduate Program (HBCU-UP, detailed in CEOSE 04-01, pp. 36-38), which emphasizes quality undergraduate education experiences. HBCU-RISE builds upon this investment by supporting graduate-level institutional capacity. CREST, in turn, builds upon HBCU-RISE efforts to develop the specialized facilities and institutional changes necessary for world-class research centers at minority-serving institutions. Cooperative efforts with other NSF-funded efforts, including minority-serving institutional alliances (e.g., LSAMP, AGEP), the Small Business Innovation Research (SBIR) program, and Science and Technology Centers (STCs), are also encouraged in the CREST/HBCU-RISE program solicitation. These collaborative efforts help successful CRESTs become more competitive applicants for full-fledged Engineering Research Centers (ERCs) or Science and Technology Centers (STCs). The I/UCRC program is another example of a CE Program component that coordinates with other NSF programs interested in funding research with industry; it requires its grantees to establish partnerships among industry, academe, and government. During FY 2005 the following organizations/programs collaborated in supporting I/UCRC awards: the Computer and Information Science and Engineering Directorate (CISE), the Electrical & Communications Systems (ECS) Division of the Engineering Directorate, the Office of International Science & Engineering (OISE), and the Research Experiences for Teachers/Undergraduates (RET/REU) programs.

Evidence: Data and information demonstrating that the CE Program collaborates effectively with related programs are included in the NSF's management plans, internal administrative manuals, and program solicitations (http://www.nsf.gov/funding/). For example, evidence of CREST's foundation of community-based networks of institutions is described in its current program solicitation (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06510), and efforts directed toward HBCUs are described in Broadening Participation in America's Science and Engineering Workforce: The 1994-2003 Decennial and 2004 Biennial Reports to Congress (CEOSE 04-01, pp. 36-38) (http://www.nsf.gov/od/oia/activities/ceose/reports/ceose2004report.pdf). Collaboration between NSF and DOE, whereby various NSF programs, including CREST, provide support for students to work in DOE labs, is described at the following web sites: (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06522) or (http://www.nsf.gov/pubs/2006/nsf06522/nsf06522.pdf). Evidence demonstrating coordination and collaboration of the SBIR/STTR program with similar grant programs (e.g., the Special Funding Opportunity to Team with a Minority Institution and EPSCoR) may be found on NSF and interagency web sites (http://www.nsf.gov/eng/iucrc), (http://www.nsf.gov/eng/sbir/supplemental_requests.jsp), and (http://www.sbirworld.com). Information concerning the EICC collaboration may be found on the EPSCoR web site (http://www.nsf.gov/ehr/epscor/ehr_epscor_eicc.jsp) as well as on those of the participating agencies (e.g., http://www.sc.doe.gov/bes/EPSCoR/eicc1.htm).

YES 9%
3.6

Does the program use strong financial management practices?

Explanation: The financial management practices of the CE Program are consistent with the strong practices that led to NSF being the first federal agency to receive a "green light" for financial management on the President's Management Agenda (PMA) scorecard. NSF continues to maintain a green rating and has received a clean opinion on its financial audits for the past 8 years. NSF is committed to providing quality financial management to all its stakeholders. It honors that commitment by preparing annual financial statements in conformity with generally accepted accounting principles in the U.S. and then subjecting the statements to independent audits. As a federal agency, NSF prepares the following annual financial statements: Balance Sheet, Statement of Net Cost, Statement of Changes in Net Position, Statement of Budgetary Resources, and Statement of Financing. Supplementary statements are also prepared, including Budgetary Resources by Major Accounts, Intergovernmental Balances, Deferred Maintenance, and Stewardship Investments.

Evidence: Data and information demonstrating strong financial practices are included in the Executive Branch Management Scorecard (/results/agenda/scorecard.html); in the results of NSF financial audits; and in performance and management assessments (/omb/).

YES 9%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: No management deficiencies have been identified. The CE Program receives and acts on advice on how to improve program management from external experts, including the triennial COV reviews of its programs. COV reports regularly provide feedback on programmatic and management-related concerns, including external program assessments that are used in program and grant management. The COVs conduct detailed reviews of the materials associated with individual grant proposal actions and have traditionally assessed the integrity and efficiency of the processes for proposal review. Each COV addresses management issues through a series of questions in Section A5 of the COV template. NSF staff members, in turn, respond to any management deficiencies identified through a formal response to the COV report that outlines the steps the Foundation will take to address the issues raised. The NSF response to COV recommendations is updated annually. In addition, the NSF response, as well as the initial COV report, is reviewed by Directorate Advisory Committees.

Evidence: The COV reports and NSF responses for CE Program components can be accessed at the Office of Integrative Activities web site (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). An example of COV review of program management is the FY 2005 EPSCoR/COV recommendation that "The EPSCoR program should have a dedicated Advisory Committee (constituted as a subcommittee of the EHR/AC) to resolve challenging issues such as: graduation/progression of jurisdictions, launching new initiatives (e.g., Strength Based Research Collaborative), resource allocations, program evaluation and internal NSF organizational issues." EPSCoR concurred with the COV recommendation and has taken preliminary steps to form an AC for EPSCoR composed of an EHR/AC member and representatives of academic research, academic administration, business, and jurisdictional government. Another example of external review of program management involved the SBIR/STTR Advisory Committee, which recommended allocating greater resources to onsite monitoring and mentoring of awardees; this recommendation is being implemented within the scope of available resources (p. 7, http://www.nsf.gov/eng/sbir/Adcom/June%202004.doc). Additional information regarding management planning procedures is available to NSF staff via documents such as the Proposal and Award Manual (http://www.inside.nsf.gov/cgi-bin/getpub?pam) and annual reviews of the NSF's internal accounting and administrative controls (http://www.nsf.gov/pubs/2006/nsf0601/pdf/05c.pdf). Advisory Committees also provide advice about program goals and management. For example, the Advisory Committee for SBIR/STTR recommended that the Office of Industrial Innovation be created to highlight contributions of the SBIR/STTR programs to the NSF and to advocate for a stronger innovation role for the Engineering Directorate. This change is underway (p. 7, http://www.nsf.gov/eng/sbir/Adcom/June%202005.pdf).

YES 9%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: CE Program grants are awarded on the basis of NSF's competitive, merit review process, which includes external peer evaluation using two standard NSB-approved criteria. A qualified expert serving as an NSF program officer carefully reviews every proposal. Additionally, each proposal is usually reviewed by 3 to 10 external reviewers who are professionals in the particular field represented by the proposal. All such activities are subject to COV review. Competitive merit review, with peer evaluation, is NSF's accepted method for informing its proposal decision process. The NSB-approved criteria address the "Intellectual Merit" and the "Broader Impacts" of the proposed effort. Some program solicitations contain additional criteria that address specific programmatic objectives. For example, the Research Infrastructure Improvement (RII) program conducted by EPSCoR has the following additional review criteria: (1) Strategic Fidelity and Impact, (2) Value Added, (3) Management Plan, (4) Evaluation, (5) Sustainability, (6) Outreach Strategy, and (7) Dissemination and Communication.

Evidence: Evidence demonstrating that grants are awarded through a clear competitive process is included in the following sources: (1) The Report to the National Science Board on the National Science Foundation's Merit Review Process Fiscal Year 2005 (NSB 06-21) (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf). (2) The NSB Policy on Recompetition (http://www.nsf.gov/nsb/documents/1997/nsb97224/nsb97224.txt). (3) NSF Merit Review Criteria (http://www.nsf.gov/pubs/1999/nsf99172/nsf99172.htm). (4) COV reports and NSF responses can be accessed at the Office of Integrative Activities web site (http://www.nsf.gov/od/oia/activities/cov/). (5) Details of the additional review criteria employed in the RII program conducted by EPSCoR are available at the RII program solicitation web site (pp. 11-12, http://www.nsf.gov/pubs/2005/nsf05589/nsf05589.pdf).

YES 20%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The CE Program uses a multifaceted array of oversight practices, including merit review, appropriate grant mechanisms, site visits, project reports, an IT-enabled grants management system, and a risk-based monitoring program, that together provide sufficient knowledge of grantee activities. The NSF merit review process provides a high degree of assurance that awardees are technically qualified and have the resources available to successfully undertake the work they have proposed. NSF's award oversight is tailored to each award and may consist of any combination of the following: regular progress reports from grantees, desk reviews, site visits, meetings with project staff, and interim reviews by special panels. Annually, awardees undergo a risk-based monitoring process, the Award Monitoring and Business Assistance Program (AMBAP). The AMBAP provides focused assurance that each awardee institution maintains sufficient knowledge of all NSF project activities. Through AMBAP, NSF has increased the number of staff devoted to post-award administration, including a dedicated outreach manager to interact and communicate with awardees. Internal Improvements: NSF has implemented, and is continuously improving, an IT-enabled grants management system to support all awards and ensure effective post-award management. NSF uses technology and innovative practices, such as teleconferencing, videoconferencing, and reverse site visits, to enhance performance oversight. Federal Requirements: Finally, NSF adheres to the oversight standards used by all Federal agencies. Using these standards allows NSF to benefit from awardee compliance with all relevant OMB Circulars regarding annual audits and with other Federal regulations regarding the use of Federal funds. The Single Audit Act and cognizant audit agencies mandate significant oversight functions for grant and contract recipients. This law and the concomitant oversight audit and review activities provide baseline oversight procedures that govern all grant recipients. As a system, NSF's oversight mechanisms provide sufficient knowledge of grantee activities to monitor and understand how funds are utilized by grantees.

Evidence: Data and information demonstrating sufficient oversight practices are included in COV reports (http://www.nsf.gov/od/oia/activities/cov/); awardee project reports; the Report to the National Science Board on the National Science Foundation's Merit Review Process Fiscal Year 2005 (NSB 06-21) (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf); the Risk Assessment and Award Monitoring Guide; clean audit opinions; the President's Management Agenda (PMA) Scorecard for Financial Management (/results/agenda/scorecard.html); site visit reports; and trip reports from attendance at professional meetings, workshops, and grantee meetings, as presented in the FY 2005 Performance and Accountability Report (www.nsf.gov/pubs/2006/nsf0601/index.jsp). Information concerning the Award Monitoring and Business Assistance Program (AMBAP) may be obtained at http://www.nsf.gov/about/contracting/rfqs/dcca_060018/Risk%20Assessment%20Guide%20for%20Post%20Award%20Monitoring%20Site%20Visits.pdf.

YES 9%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The NSF collects grantee performance data on an annual basis from all NSF-supported research and educational programs, including the CE Program. The data are made available to the public in a transparent and meaningful manner through the Discoveries area of the NSF web site, through press releases, through the Performance and Accountability Report, in an annual brochure on Performance Highlights, in annual Budget Requests to Congress, and in the Report of the Advisory Committee for GPRA Performance Assessment. Grantees provide project reports to NSF, which are examined and approved or disapproved by program officers. NSF Grant General Conditions require that results of NSF-supported research be published in the open literature, such as peer-reviewed journals. Members of the general public have access to data on the numbers of proposals and awards as well as, for each award, the name of the principal investigator, the awardee institution, the amount of the award, and an abstract of the project. NSF proactively seeks out noteworthy discoveries and distributes these in general press releases.

Evidence: The following sources demonstrate that performance data are collected from grantees and made available to the public in a transparent and meaningful manner. (1) NSF Discoveries web site (http://www.nsf.gov/discoveries/). (2) News releases (http://www.nsf.gov/news/news_list.cfm?nt=2). (3) FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (4) Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (5) FY 2005 Performance Reports and Highlights (http://www.nsf.gov/about/performance/reports.jsp). (6) The NSF FY 2007 Budget Request to Congress (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=budreqfy2007). (7) The NSF Grant General Conditions (GC-1) (http://www.nsf.gov/pubs/gc1/gc1_605.pdf). (8) Highlights of annual meetings/grantees meetings, workshops, and the awards database (http://www.nsf.gov/awardsearch/).

YES 9%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: Not applicable to the CE Program since all funding is allocated via the competitive grants process.

Evidence:

NA 0%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The CE Program has demonstrated adequate progress in achieving its long-term performance goals, as determined by an external expert panel, the Advisory Committee for GPRA Performance Assessment (AC/GPA). Since FY 2002, the AC/GPA has determined that the accomplishments in all the indicators for the Ideas category (of which CE is a major component) demonstrate "significant achievement." Those indicators are: increase opportunities for underrepresented individuals and institutions to conduct high quality, competitive research and education activities; foster connections between discoveries and their use in the service of society; encourage collaborative research and education efforts across organizations, disciplines, sectors and international boundaries; and enable people who work at the forefront of discovery to make important and significant contributions to S&E knowledge.

Evidence: The AC/GPA has consistently determined that NSF has demonstrated significant progress towards meeting the long-term performance goals related to the Ideas Strategic Outcome Goal, which includes the CE Program (FY 2005 AC/GPA Report, NSF 05-210, pp. 26-35) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210). A description of the relationship of specific CE Program objectives to NSF strategic goals is found in the NSF's strategic plan (pp. 15-18, National Science Foundation Strategic Plan FY 2003-2008, http://www.nsf.gov/pubs/2004/nsf04201/FY2003-2008.pdf). The long-term outcome measures of the CE Program are based upon these objectives. In addition, the FY 2005 Performance and Accountability Report (p. II-41) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par) and independent COV evaluations (http://www.nsf.gov/about/performance/advisory.jsp) are replete with examples of the progress demonstrated by the CE Program in meeting its four performance goals. For example, progress in establishing collaborative activities in research and education has been demonstrated by the I/UCRC program, which has successfully stimulated the development of long-term R&D partnerships among academe, industry, and government. The R&D centers, catalyzed by a small NSF investment, are supported primarily by industry center members, with NSF assuming a supporting role in their development and evolution (I/UCRC program evaluation information and grantee leverage funding data are available at http://www.ncsu.edu/iucrc/NatReports.htm). The CREST/HBCU-RISE (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06510) and RUI/ROA (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf00144) programs also provide excellent examples of the progress achieved by the CE Program in meeting its long-term goal of increased opportunities for underrepresented individuals and institutions to conduct high quality, competitive research and education activities.

YES 18%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The CE Program achieves its three annual performance goals. These are: (1) the percentage of award decisions made available to CE Program applicants within six months of proposal receipt or deadline date, while maintaining a credible and efficient competitive merit review system, as evaluated by external experts; (2) the percentage of proposals submitted to the CE Program, excluding the SBIR/STTR components, from academic institutions not in the top 100, as defined by total NSF funding; and (3) the percentage of SBIR/STTR Phase I awards made to new principal investigators.

Evidence: Evidence showing that the CE Program meets its annual performance goals can be found in the Measures Tab and in internal NSF databases.

SMALL EXTENT 6%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The CE Program has demonstrated improved efficiencies and cost effectiveness in achieving its program goals. NSF is a government leader in administrative efficiency, sharing its processes with other Federal agencies. To maintain this position, NSF continues to improve the efficiency and effectiveness of agency operations through the increasing use of electronic business systems and operations. This includes building on the past successes of FastLane, which accepts proposals from the university community electronically and supports university/NSF business processes. FastLane, in turn, was the model for the recently unveiled government-wide grants.gov site. NSF continues this improvement, notably with the Electronic Jacket (eJ), which supports merit review, provides real-time access to proposal and award information, and enables "shared-work" processes for interdisciplinary proposals that involve more than one organization. The eJ has been an essential enabler for NSF to maintain its dwell-time results in the face of increased numbers of proposals. The NSF's Business and Operations Advisory Committee has rated the agency as successful in developing and using "new and emerging technologies for business application." The NSF has also received green ratings in e-Government, indicating current success and positioning for future successes in government-wide electronic business methods.

Evidence: Evidence demonstrating improved efficiencies in achieving program goals can be found in the FY 2005 Performance and Accountability Report (pp. I 9-16; II 1-92, http://www.nsf.gov/pubs/2006/nsf0601/index.jsp). Efficiencies in proposal processing are demonstrated by the CE Program, which significantly exceeded the NSF's six-month time-to-decision goal of 70 percent, as shown in the Measures Tab. Increased program management efficiency is also evident in the CE Program. For example, I/UCRC has increased the number of universities participating in the program while decreasing the number of actual I/UCRC centers by helping professors and universities learn to work together in multi-university centers (http://www.ncsu.edu/iucrc).

LARGE EXTENT 12%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The performance of the CE Program compares favorably to that of other government and private programs with similar purposes and goals. Because of the high quality, relevance, and performance of the CE Program, programs in other Federal agencies and international governments often emulate aspects of NSF investments and processes. For example, the Foundation's innovative EPSCoR initiative led to the development of EPSCoR-like S&E capability enhancement efforts in six other federal agencies (DOE, DOD, EPA, NASA, NIH, and USDA). The NSF assumed leadership of this federal-wide effort through its chairmanship of the EPSCoR Interagency Coordinating Committee (EICC). Since NSF is the only Federal agency charged with promoting the progress of S&E research and education across all fields and disciplines while also avoiding "undue concentration of such research and education," its investments in the CE Program provide a principal source of Federal support for STEM research and education at: (1) predominantly undergraduate and minority-serving colleges and universities (RUI/ROA and CREST/HBCU-RISE, respectively); (2) jurisdictions that receive a lesser amount of NSF research funding (EPSCoR); and (3) members of the private sector, including the small business community (I/UCRC and SBIR/STTR). For example, NSF created and launched the SBIR program and remains a national leader and a model for other agencies with respect to proposal processing dwell time, the development of comprehensive activities, and efforts to promote broadening participation in SBIR/STTR. All of this has been accomplished while supporting world-class research that develops spin-off companies and technologies. No other entity, government or private, addresses such far-reaching and diverse purposes related to the enhancement of the Nation's STEM research and education capabilities. NSF uses COV and AC external expert review to ensure the continued high quality of the CE Program.

Evidence: Evidence of the CE Program comparing favorably to other government and private programs with similar purposes and goals can be found in several sources, including the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par) and COV and AC reports (http://www.nsf.gov/od/oia/activities/cov/). Additional examples that demonstrate CE Program performance are available from a variety of sources. Information concerning NSF's leadership of the EICC, which involves six other Federal agencies in developing a coordinated Federal response to the issue of S&E capability enhancement in jurisdictions that receive a lesser amount of Federal R&D funding, may be found on the EPSCoR web site (http://www.nsf.gov/ehr/epscor/ehr_epscor_eicc.jsp) as well as on those of the participating agencies (e.g., http://www.sc.doe.gov/bes/EPSCoR/eicc1.htm). GAO testimony before the Subcommittee on Environment, Technology, and Standards, Committee on Science, House of Representatives, on June 28, 2005, found the SBIR program to be successful in enhancing the role of small businesses in Federal R&D, stimulating commercialization of research results, and supporting the participation of small businesses (Federal Research: Observations on the Small Business Innovation Research Program, http://www.gao.gov/cgi-bin/getrpt?GAO-05-861T). The National Research Council also examined the SBIR/STTR program (The Small Business Innovation Research Program: Challenges and Opportunities, Board on Science, Technology, and Economic Policy, National Research Council, National Academy Press, Washington, D.C., 1999). Finally, a study by Professor Joshua Lerner of the Harvard Business School compared the long-term performance of SBIR-funded companies with a matched set of small companies that did not receive SBIR funding. This study included companies that received SBIR awards from the NSF and nine other Federal agencies (The Government as Venture Capitalist: The Long-Run Effects of the SBIR Program, Journal of Business, July 1999, v. 72, pp. 285-297).

YES 18%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Independent evaluations by Committees of Visitors and other external groups such as Advisory Committees, the National Science Board, and national S&E organizations find that the CE Program is effective and achieves desired results. For example, the FY 2004 AC/GPA evaluation cited the CREST-supported centers at Tuskegee and Jackson State Universities for their innovative research focus. In addition, the 2004 COV that evaluated CREST/HBCU-RISE remarked, "Some programs, such as CREST, focus explicitly on developing research capacity at minority-serving institutions, and they have established very impressive track records of catalyzing world-class research communities." In addition, independent COV and AC reviews for SBIR/STTR and I/UCRC find that these programs are effective and achieve results consistent with CE Program goals. As noted in a recent NRC Report, there is some evidence that awards from programs like SBIR attract private investment by certifying that companies have technical quality, thus reducing some of the uncertainty involved in early-stage investment. The I/UCRC program, meanwhile, has become a model for leveraging a small amount of Federal funding to create over $60,000,000 in cooperative research support.

Evidence: Evidence demonstrating that independent evaluations are of sufficient scope and quality and that the CE Program is effective and achieving desired results can be found in: (1) the FY 2004 AC/GPA report (p. 35, http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04216); (2) the FY 2005 AC/GPA report (pp. 13-16, http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210&org=NSF); (3) the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par); (4) the FY 2005 COV report for the Human Resources Division (p. 3, http://www.nsf.gov/od/oia/activities/cov/ehr/2005/HRDcov.pdf); and (5) COV and AC reports and NSF responses (http://www.nsf.gov/about/performance/advisory.jsp). Examples of assessments that demonstrate the effectiveness of the SBIR component of the CE Program include: SBIR Program Diversity and Assessment Challenges (p. 27, http://www.nap.edu/books/0309091233/html/R1.html) and Evaluating the Small Business Innovation Research Program: A Literature Review by Joshua Lerner and Colin Kegler, in The Small Business Innovation Research Program: An Assessment of the Department of Defense Fast Track Initiative, Board on Science, Technology, and Economic Policy, National Research Council, Washington, D.C., 2000. Information concerning the conduct and achievements of the I/UCRC program can be found at its web site (http://www.nsf.gov/eng/iucrc/); in the Division of Engineering Education and Centers COV report (http://www.nsf.gov/od/oia/activities/cov/eng/2004/EEC_COV.pdf); and at the I/UCRC Program Evaluation Project web site (http://www.ncsu.edu/iucrc/NatReports.htm).

YES 30%
Section 4 - Program Results/Accountability Score 82%


Last updated: 09062008.2006SPR