|Program Title||Support for Small Research Collaborations|
|Department Name||National Science Foundation|
|Agency/Bureau Name||National Science Foundation|
Research and Development Program
Competitive Grant Program
|Assessment Section Scores||
|Program Funding Level
|Year Began||Improvement Plan||Status||Comments|
The program will improve performance targets and will continue to improve monitoring of performance against those targets.
|Completed||This improvement plan is duplicative of the 2006 Performance goal.|
External committees of visitors are continuing targeted reviews of the components of the program.
|Completed||Committees of Visitors (COVs) were convened in FY2004-2005 for the Centers for Learning and Teaching Program, the Informal Science Education Program, and the Math and Science Partnership Program. COV Reports as well as NSF responses are posted on NSF's website. COVs conduct thorough program reviews, assess outcomes, and make recommendations on program and management improvements. This is NSF's primary method for conducting assessment at the program and portfolio level.|
The Budget provides funding to continue this program's current effectiveness in enhancing science and engineering education.
The program will strengthen its performance goals and improve how it monitors performance against those goals.
|Completed||This is part of NSF's Stewardship Goal - Broadening Participation. Concerted efforts continue to encourage proposals from MSIs, HSIs, and other institutions serving underrepresented minorities. Alliances for Graduate Education and Professoriate Program workshops and model projects are aimed at increasing minority participation. Proposal writing workshops, outreach visits, participation in national conferences are all aimed at producing more competitive proposals from MSIs.|
All Collaborations programs will ensure increased timeliness of yearly project reports from investigators.
|Completed||On Nov. 18, 2006, changes will be implemented in the Project Reports System to enable NSF to monitor and enforce that PIs are submitting annual and final project reports within the appropriate timeframes. Annual reports are due 90 days prior to report period end date and are required for all standard and continuing grants and cooperative agreements. Final reports are due within 90 days after expiration of award. Policy documents have been updated to reflect the changes.|
NSF will develop new ways and measures to monitor its efforts to broaden participation in this and other programs.
|Completed||This is part of NSF's Stewardship Goal - Broadening Participation. One significant step already in place is to develop a searchable reviewer database with demographic data, which will broaden and diversify the reviewer pool for proposals. Other recommendations concern training for staff and panelists on implicit bias, enhancing tracking mechanisms, and including a broadening participation performance indicator in annual staff evaluations.|
NSF continued its focus on strengthening and expanding broadening participation activities involving research institutions.
Measure: For 70 percent of proposals submitted to the Education and Human Resources Directorate (EHR), be able to inform applicants about funding decisions within six months of proposal receipt or deadline, or target date, whichever is later, while maintaining a credible and efficient merit review system.
Explanation:Because the program category "Small Research Collaborations" no longer exists under NSF's new Strategic Plan, data on the measures associated with the PART Program can no longer be tracked. However, because the PART Program corresponds to several programs administered by the EHR Directorate, the Foundation has adopted a Directorate-wide measure in its place for FY 2007 and beyond.
Measure: External validation by the Advisory Committee for GPRA Performance Assessment that NSF promotes public understanding and appreciation of science, technology, engineering and mathematics (STEM) disciplines and builds bridges between formal and informal science education.
Explanation:Assessment of the impact of Collaborations on promoting public understanding of the STEM disciplines by the Advisory Committee for GPRA Performance Assessment.
Measure: Increase the percentage of proposals submitted to the Education and Human Resources Directorate (EHR) programs from academic institutions not in the top 100 of NSF funding recipients.
Explanation:Because the program category "Small Research Collaborations" no longer exists under NSF's new Strategic Plan, data on measures associated with the PART Program can no longer be tracked. However, because the PART Program corresponds to several programs administered by the EHR Directorate, the Foundation has adopted a Directorate-wide measure in its place for FY 2007 and beyond.
Measure: External validation by Advisory Committee that NSF programs promote greater diversity in science, technology, engineering, and mathematics (STEM) workforce through increased participation of underrepresented groups and institutions.
Explanation:Assessment of the impact of Collaborations on promoting greater diversity in the STEM workforce by the Advisory Committee for GPRA Performance Assessment.
|Section 1 - Program Purpose & Design|
Is the program purpose clear?
Explanation: The purpose of NSF's investments in Collaborations is to 'foster partnerships with colleges, universities, school districts, and other institutions - public, private, state, local, and Federal - to strengthen science and engineering (S&E) education at all levels and broaden participation in S&E fields.' This statement is derived from the statutes that govern NSF. The NSF Act of 1950 authorizes and directs NSF to support science and engineering education at all levels. Other statutes, notably the Science and Engineering Equal Opportunities Act, address the underrepresentation of women, minorities, and persons with disabilities in science and engineering. These purposes have since been further expanded and clarified in the recently enacted NSF Authorization Act of 2002. Continuing as a high priority of the Administration, No Child Left Behind calls for research that enables the successful development and implementation of science-based programs and practices in K-12 education and calls for collaboration between universities and K-12 schools and districts.
Evidence: Relevant information concerning the Collaborations program purpose may be found in the NSF Strategic Plan FY 2003-2008 (pages 14-15, www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf). Additional information may be found in the National Science Foundation Act of 1950, 42 USC 1861 et seq., the Science and Engineering Equal Opportunities Act, 42 USC 1885, the NSF Authorization Act of 2002 (P.L. 107-378), and the No Child Left Behind Act of 2001 (P.L. 107-110).
Does the program address a specific and existing problem, interest or need?
Explanation: The national imperative for NSF's investments in Collaborations is addressed in Paragraphs 1 and 2 of Section 3 (Policy Objectives) of the NSF Authorization Act of 2002: "To strengthen the Nation's lead in science and technology by ... (C) expanding the pool of scientists and engineers in the United States; ... and (2) To increase overall workforce skills by ... (A) improving the quality of mathematics and science education, particularly in kindergarten through grade 12; ... (D) increasing access to higher education in science, mathematics, engineering, and technology fields for students from low-income households; and (E) expanding science, mathematics, engineering, and technology training opportunities at institutions of higher education."
Evidence: Collaborations address issues identified in the National Science Foundation Act of 1950, 42 USC 1861 et seq., and in the NSF Authorization Act of 2002 (P.L. 107-378).
Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?
Explanation: NSF's relationship with the scientific research and education communities and its competitive grant mechanisms uniquely position the agency through its investments in Collaborations to address national science, technology, engineering and mathematics education and workforce needs that are not under the purview of mission-oriented federal, state or local agencies. NSF is the only federal agency charged with promoting the progress of science and engineering research and education in all fields and disciplines.
Evidence: The Informal Science Education (ISE) program is the only activity of its kind in the federal government. The Institute of Museum and Library Services (IMLS) has a different mission, and NASA and NIH have recently instituted some efforts to fund informal science education along lines consistent with their missions. NSF's ISE has previously collaborated with IMLS and other agencies on funding a research study on learning in museums. ISE communicates regularly with other agencies and reviews changes in their direction and solicitations to assure cooperation and avoid unnecessary duplication.
Is the program design free of major flaws that would limit the program's effectiveness or efficiency?
Explanation: NSF's investments in Collaborations rely upon the competitive merit review process, NSF program officers, and Committees of Visitors (COVs) to ensure program effectiveness and efficiency. Merit review by peers has been recognized as a best practice for administering R&D programs. Independent reviews by COVs and other external groups (e.g., Advisory Committees, National Science Board, National Academy of Science/National Research Council, President's Committee of Advisors on Science and Technology) provide additional scrutiny of the portfolio and program goals and results. This follows the guidance provided in the R&D Criteria, as outlined in the OMB/OSTP Guidance Memo.
Evidence: Evidence demonstrating that the Collaborations program is free of major flaws may be found in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), the FY 2003 Report on NSF Merit Review System (www.nsf.gov/nsb/documents/2004/MRreport_2003_final.pdf) and in Committee of Visitors reports.
Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?
Explanation: NSF's investments in Collaborations rely upon two mechanisms to ensure that the program is effectively targeted and that funding addresses the program's purpose directly. First, the solicitations for each activity contain a clear statement of the purpose in the context of the particular activity. Then, the merit review process ensures that funding is awarded to proposals that best address the activity's purpose.
Evidence: Examples from Collaboration solicitations include: Centers for Learning and Teaching proposals must involve partnerships of organizations with a scientific, engineering, and/or educational mission. Each Center must have one or more school district partners (or an appropriate group of schools, e.g., specialized schools), as well as a partner that is authorized to award doctoral degrees in an appropriate science, technology, engineering and mathematics (STEM) education area. Alliances for Graduate Education and the Professoriate is designed to increase the number of minority students pursuing advanced study, obtaining doctoral degrees, and entering the professoriate in STEM disciplines. Alliances participating in this program are expected to engage in comprehensive institutional cultural changes that will lead to sustained increases in the conferral of STEM doctoral degrees, significantly exceeding historic levels of performance.
|Section 1 - Program Purpose & Design||Score||100%|
|Section 2 - Strategic Planning|
Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?
Explanation: Specific long-term performance measures for NSF's investments in Collaborations are listed in the 'Measures' tab. These are drawn from the objectives set forth in the NSF Strategic Plan FY 2003-2008 and they encompass NSF's commitment to broadening participation in science and engineering and to strengthening the U.S. workforce in science, technology, engineering and mathematics (STEM).
Evidence: Performance measures may be found in the Measures tab. Additional information regarding the assessment of performance may be found in the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf).
Does the program have ambitious targets and timeframes for its long-term measures?
Explanation: Ambitious targets and timeframes are set under 'Measures' tab.
Evidence: Targets and timeframes for long-term measures may be found in the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubsys/ods/getpub.cfm?nsf04011) and in annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par).
Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?
Explanation: The program has identified a number of quantitative annual measures, shown in the 'Measures' tab, that relate directly to the agency's strategic goals.
Evidence: Performance measures may be found in the Measures tab.
Does the program have baselines and ambitious targets for its annual measures?
Explanation: Baselines are obtained from internal NSF sources. Ambitious targets are set under the 'Measures' tab.
Evidence: Performance measures may be found in the Measures tab. Additional information may be found in NSF's Enterprise Information System (internal); annual and final project reports.
Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?
Explanation: The key partners for NSF's investments in Collaborations both commit to and work toward the goals of the program. The commitment is ensured through the mechanisms described in the response to Q1.5 -- namely the combination of the program purpose being expressed in program solicitations and the selection of awards through the merit review process. NSF then ensures that its partners are working toward the goals of the program via the following mechanisms: 1) continuing support is based upon annual progress reports submitted by Principal Investigators and reviewed by NSF program officers; 2) to receive subsequent awards, all applicants are required to report on the results of previous NSF support, which is then considered in the merit review process.
Evidence: Evidence of commitment to annual and long-term goals may be found in annual and final project reports and in the grant conditions. For example, Informal Science Education (ISE) specifies required elements of the Annual Report in the solicitation: The Annual Report 'should highlight major accomplishments, describe the lessons learned, document alignment with the proposed time line, and describe the status of the development of the materials. Samples of completed materials, or drafts of materials, should be included.' Centers for Learning and Teaching have as a grant condition that they must participate in the external program evaluation, which includes a monitoring function. The Math Science Partnership solicitation has, as an additional review criterion, evidence of an effective partnership among the partnering institutions and organizations. Projects report on progress in this area in their strategic plans, implementation plans and annual reports.
Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?
Explanation: Evaluations are conducted regularly to bring about program improvements and influence program planning. Each activity at NSF is reviewed once every three years by a Committee of Visitors (COV). NSF's approach to evaluation was recently highlighted by GAO as an "evaluation culture - a commitment to self-examination, data quality, analytic expertise, and collaborative partnerships." Advisory Committees review Directorate performance, and the Advisory Committee for GPRA Performance Assessment assesses performance on an NSF-wide basis for the Strategic Goals. NSF staff and external experts conduct site visits for major activities, such as Math Science Partnership Critical Site Visits for all Comprehensive projects. All these activities inform NSF senior management and contribute to the development of plans for the program. NOTE: Weight reflects the importance NSF places on the conduct of independent evaluations to support program improvements and evaluate effectiveness.
Evidence: Independent evaluations are critical to NSF performance assessment as evidenced by the GAO report Program Evaluation: An Evaluation Culture and Collaborative Partnerships Help Build Agency Capacity GAO-03-454, May 2, 2003. Examples of independent evaluations of sufficient scope and quality include Committees of Visitors reports and NSF responses, Advisory Committee reports, including the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf), and site visit reports (internal).
Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?
Explanation: Performance information is used by managers to make informed decisions and is incorporated into NSF's budget requests to the Congress. The FY 2005 Budget Request to Congress was built around the R&D Criteria, thereby highlighting specific performance information for NSF's investment portfolio. The budget also clearly presents the resource request and outlines the activities that will be supported with the funds. In the Budget Request, NSF displays the full budgetary cost associated with the new program framework defined in the NSF Strategic Plan FY 2003-2008.
Evidence: The FY 2005 NSF Budget Request to Congress presents the long-term goals of the Collaborations program and the resources needed in a complete and transparent manner (www.nsf.gov/bfa/bud/fy2005/pdf/fy2005.pdf). Additional information may be found in the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf).
Has the program taken meaningful steps to correct its strategic planning deficiencies?
Explanation: For NSF's investment in Collaborations, the Committee of Visitors (COV) process provides a valuable mechanism for identifying and addressing planning-related issues. The COV for Informal Science Education (ISE), for example, recommended more outreach to other organizations, resulting in the creation of the new ISE web site. The Math and Science Partnership (MSP) conducted capacity building workshops, some focused primarily on minority institutions, in order to strengthen and expand the proposal pool. Additionally, MSP provides fiscal and programmatic technical assistance to awardees, conducting financial and reporting workshops for all new MSP awardees and providing feedback to each MSP project's detailed strategic and implementation plans.
Evidence: Relevant evidence may be found in the Math and Science Partnership Annual Report To Congress, Math and Science Partnership awardee Strategic Plans, Committee of Visitors reports and NSF responses, and at the Informal Science Education website.
If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?
Explanation: NSF's investments in Collaborations address national science, technology, engineering and mathematics (STEM) workforce and education needs that are not addressed in the same ways at the more mission-specific federal, state, or local agencies. The NSF investments in Collaborations are positioned to focus on STEM workforce and education issues using a research and development strategy, with a strong focus on research and evaluation of the activity and projects within the activity, and the collaborative engagement of STEM disciplinary practitioners with STEM educators and educational researchers. The Collaborations program assesses and compares the potential benefits of its efforts through the evaluation process. The process uses external review at several levels: Advisory Committees, National Science Board Committee on Education and Human Resources, and external evaluations of the program.
Evidence: The Collaborations program assesses and compares the potential benefits of its efforts through the evaluation process. Evidence of this assessment can be found in the National Science and Technology Council Subcommittee on Education and Workforce Development, National Science Board Report on National Workforce Policy, The Science and Engineering Workforce: Realizing America's Potential (NSB 03-69), the National Science Board report on diversity in the scientific and technological workforce (NSB Report -- Broadening Participation Workshop Proceedings [Note: This is currently in late draft stage]), Advisory Committee reports, and external evaluations on programs such as Louis Stokes Alliances for Minority Participation and Centers for Learning and Teaching.
Does the program use a prioritization process to guide budget requests and funding decisions?
Explanation: The external merit review system includes review of proposals by expert panelists with backgrounds that match the core constituencies of the program. These reviewers assess proposals according to the Merit Review Criteria, intellectual merit and broader impacts. Often, proposals for large scale awards, such as those in Centers for Learning and Teaching and Math Science Partnership, participate in further external review in which teams from prospective collaborations that were deemed as competitive for funding come to NSF for reverse site visits involving expert external reviewers and NSF staff. Final decisions to make awards include consideration of NSF's core strategies and maintaining a diverse portfolio. For Budget Requests, each of the activities within the program provides input to senior management about past performance and future needs. Senior management integrates that information, prioritizes budget requests within and among programs, and determines funding levels, all of which is reviewed by the National Science Board.
Evidence: Relevant information regarding the prioritization process may be found in the NSF Budget Requests to Congress and the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf). Additional information regarding funding decisions may be found in the Grant Proposal Guide (www.nsf.gov/pubsys/ods/getpub.cfm?gpg).
|Section 2 - Strategic Planning||Score||100%|
|Section 3 - Program Management|
Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?
Explanation: Performance information is collected from NSF grant recipients via interim, annual and final project reports. Site visits to larger projects are another mechanism used to collect performance information. Committee of Visitors reviews and recommendations are utilized to improve program performance. Process-related or quantitative goals such as dwell time are monitored via the agency's Enterprise Information System (EIS). All of these assessments impact management practices. NSF programs collect high-quality performance data relating to key program goals and use this information to adjust program priorities, make decisions on resource allocations and make other adjustments in management actions. In addition, NSF utilizes on-line monitoring of projects in many of its programs. GPRA performance data are verified and validated by an independent, external consulting firm.
Evidence: Evidence relating to the use of credible performance information may be found in Committee of Visitors reports (internal documents) and Advisory Committee reports, including the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf). Data is collected through annual, interim, and final project reports (internal documents), the Enterprise Information System (EIS) data - GPRA module, annual contract performance evaluations, site visit reports (internal documents), external evaluation reports, and program monitoring.
Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?
Explanation: NSF awardees must meet annual and final reporting requirements as well as financial record keeping requirements. NSF staff monitor cost, schedule and technical performance and take corrective action when necessary. The efforts of NSF staff are reviewed by their supervisors and by Committees of Visitors. Individual performance plans are directly linked to NSF's strategic goals.
Evidence: Federal managers and program partners are held accountable through cooperative agreements or contracts and annual performance evaluation of NSF employees/program officers. Relevant evidence of this may be found in Committee of Visitors reports, annual and final project reports, and in the NSF General Grant Conditions.
Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?
Explanation: NSF routinely obligates its funds in a timely manner. NSF also has pre-award internal controls to reduce the risk of improper payments. Beginning in FY 2004 NSF has incorporated erroneous payments testing of awardees into the on-site monitoring program. When this testing is complete, it will provide NSF with information about the usage of NSF funding by awardees.
Evidence: Evidence of the agency's financial obligations may be found in the NSF FY 2001 Risk Assessment for Erroneous Payments, Data on NSF Carryover (found in the NSF Budget Request to Congress), the Risk Assessment and Award Monitoring Guide, NSF's clean opinion on financial statements for past six years, and in the Statement of Net Costs.
Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?
Explanation: NSF is a leader in the vigorous and dynamic use of information technology (IT) to advance the agency mission. IT improvements permit more timely and efficient processing of proposals. The NSF-wide priority of increasing award size and duration enhances efficiency because larger, longer awards allow the community to spend more time developing and researching their projects and less time preparing proposals. Several Collaborations activities limit the number of proposals invited from a single institution. Such limits mean that many proposals have already faced a competitive process within the institution before they reach NSF, which tends to strengthen submitted proposals while relieving administrative burden on NSF. This also allows for higher success rates and maximized interdisciplinary collaboration. Internal "effective practices" research is examining further efficiency alternatives such as broadening the use of letters of intent, proactively coordinating program deadlines, and developing strategies that reach out to a greater pool of reviewers.
Evidence: Procedures to measure and achieve efficiencies are found in a number of documents: annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par); Committees of Visitors reports; NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf); NSF Grant Proposal Guide; and the program solicitations: Math and Science Partnership -www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf03605; Centers for Learning and Teaching - www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf04501; Graduate Research Fellows in K-12 Education - www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf04533.
Does the program collaborate and coordinate effectively with related programs?
Explanation: NSF promotes partnerships, intra-agency and interagency cooperation for its Collaborations program. NSF regularly shares information with other agencies and participates in coordination activities. Senior staff from the Education and Human Resources directorate and other agencies with interests in education meet regularly for planning and coordination purposes. Informal Science Education (ISE) communicates regularly with others, including the Institute for Museum and Library Sciences, the National Academies and NIH. NSF and the Department of Education coordinate the activities of the Math and Science Partnership activity. Within NSF, mechanisms are established and implemented for split funding between directorates.
Evidence: Evidence relevant to demonstrating the Collaboration program's coordination and collaboration with similar programs may be found in management plans (internal documents). In addition, meetings between senior NSF and Department of Education officials have resulted in mutually beneficial goals and projects, including NSF/ED cooperation on the administration of their separate Math and Science Partnership activities. These collaborations are evidenced in the Math and Science Partnership Semi-Annual Report to Congress.
Does the program use strong financial management practices?
Explanation: NSF uses strong financial management practices at the agency level and at the program level. NSF was the first federal agency to receive a 'green light' for financial management on the President's Management Agenda scorecard. NSF has received a clean opinion on its financial audits for the last six years. NSF is committed to providing quality financial management to all its stakeholders. It honors that commitment by preparing annual financial statements in conformity with generally accepted accounting principles in the U.S. and then subjecting the statements to independent audits. As a federal agency, NSF prepares the following annual financial statements: Balance Sheet, Statement of Net Cost, Statement of Changes in Net Position, Statement of Budgetary Resources, and Statement of Financing. Supplementary statements are also prepared, including Budgetary Resources by Major Accounts, Intragovernmental Balances, Deferred Maintenance, and Stewardship Investments.
Evidence: Evidence of NSF's strong financial management practices may be found in the Executive Branch Management Scorecard, the results of NSF financial audits, and in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par).
Has the program taken meaningful steps to address its management deficiencies?
Explanation: Committees of Visitors regularly provide feedback on programmatic and management-related concerns. In addition, the Foundation conducts an annual review to assess administrative and financial systems and procedures to ensure that effective management controls are in place and that any deficiencies are identified and addressed.
Evidence: Reports indicating no significant management deficiencies in the Collaborations program include the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), the NSF Business Analysis, Committee of Visitors reports, Advisory Committees' review of COV reports, annual report to senior management; and IG reports and NSF responses.
Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?
Explanation: All activities rely upon NSF's competitive, merit review process that includes external peer evaluation. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF program officer, and usually by 3-10 other persons outside NSF who are experts in the particular field represented by the proposal. Competitive merit review, with peer evaluation, is NSF's accepted method for informing its proposal decision process. The NSB-approved criteria address the "Intellectual Merit" and the "Broader Impacts" of the proposed effort. Some solicitations contain additional criteria that address specific programmatic objectives. NOTE: The weight of this question has been doubled to reflect the relative importance of merit review in verifying the relevance, quality, and performance of NSF's investments.
Evidence: Evidence of grants awarded through a clear competitive process may be found in the FY 2003 Report on NSF Merit Review System (www.nsf.gov/nsb/documents/2004/MRreport_2003_final.pdf). Additional information may be found in the Enterprise Information System and the NSF Performance and Accountability Reports.
Does the program have oversight practices that provide sufficient knowledge of grantee activities?
Explanation: NSF has a formal Award Monitoring and Business Assistance Program (AMBAP) based on a financial and administrative risk assessment of NSF awardee institutions, focusing on award oversight, including desk and on-site monitoring and providing assistance to awardees. AMBAP is a collaborative effort between NSF administrative and financial managers/technical staff and NSF program managers working with their awardee counterparts. Oversight mechanisms are currently sufficient. NSF's capacity to provide adequate oversight depends on available resources for salaries and expenses; current resource levels reduce NSF's ability to perform the level of oversight deemed desirable. NSF is using technology creatively, such as teleconferencing, videoconferencing, and reverse site visits, to enhance performance oversight within current resource constraints. NSF maintains scientific oversight of all awards through annual and final project reports, and funds are tracked (via reporting systems) to ensure that they are used for their designated purpose.
Evidence: Oversight activities that demonstrate a sufficient knowledge of grantee activities may be found in Committee of Visitors reports; quarterly, annual, and final project reports; directorate reviews; the FY 2003 Report on NSF Merit Review System (www.nsf.gov/nsb/documents/2004/MRreport_2003_final.pdf); the Risk Assessment and Award Monitoring Guide; clean audit opinions for the last six years; the Executive Branch Management Scorecard; site visit reports; workshops and grantee meetings; grants and cooperative agreements; and project audits.
Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?
Explanation: NSF Grant General Conditions require that results of NSF-supported research be published in the open literature and that NSF support is appropriately referenced/cited. NSF's annual Performance and Accountability Report and its annual Budget Request contain highlights of NSF-supported research. Principal Investigators provide annual progress reports to NSF that are examined and approved/disapproved by the program directors. Information is made available to the public on the numbers of proposals and numbers of awards as well as, for each award, the name of the principal investigator, the awardee institution, amount of the award, and an abstract of the project. The Budget Internet Information Site (BIIS) contains extensive information on awards and funding trends.
Evidence: Grantee performance data is collected annually as evidenced in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par) and the annual Budget Request to Congress. Additional information may be found in the Grant General Conditions (GC-1), highlights of annual meetings/grantees meetings, workshops, and the Budget Internet Information Site (dellweb.bfa.nsf.gov/).
For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?
Explanation: Not applicable; NSF programs are administered as competitive grant programs.
|Section 3 - Program Management||Score||100%|
|Section 4 - Program Results/Accountability|
Has the program demonstrated adequate progress in achieving its long-term performance goals?
Explanation: NSF relies on external evaluation to determine whether it is achieving its long-term objectives. Since FY 2002, the NSF Advisory Committee for GPRA Performance Assessment (AC/GPA) has served as the focal point for these activities. Input is derived from numerous sources, including Committees of Visitors, annual and final project reports, and summaries of substantial outcomes ('nuggets') from funded research. The AC/GPA has determined that the accomplishments under the People goal have demonstrated "significant achievement" toward annual and long-term performance goals. In addition, component activities of the Collaborations program undergo third-party evaluations. Collectively, these evaluations provide further evidence that the activities are demonstrating adequate progress in achieving their long-term performance goals.
Evidence: Evidence demonstrating progress in meeting the Collaboration program's long-term goals may be found in the Measures tab, the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf); in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par); in annual and final project reports; and in third party evaluations and impact monitoring of component activities of the program.
Does the program (including program partners) achieve its annual performance goals?
Explanation: The program achieves its annual performance goals.
Evidence: Evidence demonstrating achievement of Collaboration's performance goals may be found in Committee of Visitors and other assessment reports, program reports, project reports, and the annual Performance and Accountability Reports.
Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?
Explanation: The NSF-wide priority of increasing award size and duration enhances efficiency because larger, longer awards allow the research community to spend more time conducting research and less time preparing proposals. Independent reviews by COVs and other external groups provide additional scrutiny of portfolio and program goals, ensuring effectiveness and operational efficiency. Where appropriate, the number of proposals accepted from a single institution is limited, leading to higher success rates and more interdisciplinary collaboration within submitting universities. Several activities use pre-proposals to improve merit review efficiencies and reduce the burden on researchers and reviewers. NSF continues to improve operational efficiencies through electronic systems, including the use of interactive electronic panels, online reviews, and online award processing. Increases for high-priority graduate fellowships and traineeships required reallocations across the People activities, affecting the ability to make consistent improvements in award size and duration.
Evidence: Evidence demonstrating cost effectiveness is shown in the Measures tab. Additional information may be found in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), the NSF Budget Requests to Congress, Centers for Learning and Teaching (www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf04501); Math Science Partnership (www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf03605); Presidential Awards for Excellence in Science, Mathematics, and Engineering Mentoring (www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf04525); Graduate Research Fellows in K-12 Education (www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf04533); and Informal Science Education (www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf03511).
Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?
Explanation: NSF's activities through its investment in Collaborations address national science, technology, engineering, and mathematics education and workforce needs that are not addressed by the mission agencies. Because of its recognized effectiveness, aspects of NSF's investments in Collaborations are often emulated by other programs in government and the private sector. NSF activities also generate a nationwide response to the goals of the program.
Evidence: Other federal agencies have implemented similar programs with guidance from NSF. Evidence may be found in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), Committee of Visitors reports, Advisory Committee reports, and in data from the Enterprise Information System (internal).
Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?
Explanation: Independent reviews by Committees of Visitors and other external groups (e.g., Advisory Committees and the National Science Board) provide additional scrutiny of portfolio and program goals and results. In particular, the most recent evaluation covering the entire Collaborations portfolio was the 2004 meeting of the Advisory Committee for GPRA Performance Assessment (AC/GPA). The AC/GPA determined that NSF demonstrated "significant achievement" with respect to its FY 2004 GPRA Strategic Outcome Goals for People (which includes Collaborations). In reaching this determination, the committee specifically considered indicators that matched the objectives used here for Collaborations. NOTE: The weight of this question has been doubled to reflect the importance of independent evaluation in verifying the relevance, quality, and performance of NSF's investment in Collaborations.
Evidence: Evaluations of the Collaborations program are sufficient in scope and quality, as evidenced by the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf), the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), annual and final project reports, Committee of Visitors reports and NSF responses, and other external reports (e.g., from the National Academy of Sciences).
|Section 4 - Program Results/Accountability||Score||78%|