|Program Title||K-12 Math and Science Education|
|Department Name||National Science Foundation|
|Agency/Bureau Name||National Science Foundation|
Research and Development Program
Competitive Grant Program
|Assessment Rating||Moderately Effective|
|Year Began||Improvement Plan||Status||Comments|
Measure: For MSP projects focused on mathematics, percentage of MSP schools meeting Adequate Yearly Progress in mathematics.
Explanation: By law, Adequate Yearly Progress (AYP) must be defined by States in a manner that (a) applies the same high standards of academic achievement to all public elementary school and secondary school students in the State; (b) is statistically valid and reliable; and (c) results in continuous and substantial academic improvement for all students (Public Law 107-110, the No Child Left Behind Act of 2001, http://www.ed.gov/policy/elsec/leg/esea02/pg2.html#sec1116). Through its Management Information System, the MSP program collects a broad array of information from projects, including the names of MSP schools (those schools with significant MSP intervention likely to have a cumulative impact on student performance, collected at the school level). This enables the program to determine whether MSP schools have met AYP in any reporting year. The school as the unit of analysis makes sense conceptually given the goals of NSF's MSP program, which include an emphasis on institutional change. Mathematics is the focus because it is a required assessment area for AYP. Science assessments are only now coming on board nationally and are not part of AYP formulations; after the science assessments have demonstrated multiyear validity and reliability as tests, we will include "increasing percentages of MSP projects reaching AYP in Science" as an annual indicator for those MSP projects with strategies focused on science. MSP program staff use these data to gauge progress within specific projects; the data are discussed during annual staff site visits to projects and at annual PI meetings, in an effort to help projects heighten their focus on MSP school progress toward AYP and thereby manage the program effectively.
Measure: Minimum number of resources (instructional programs, models, or interventions) developed by the DRK-12 program whose effectiveness has been examined using rigorous methods.
Explanation: The goal of the DR-K12 program is to "enable significant advances in K-12 student and teacher learning of the STEM disciplines through research about, and development and implementation of, innovative resources, models, and technologies for use by students, teachers, and policy makers" (NSF 08-502, DRK-12 solicitation, p. 2). The cycle of research-development-implementation-evaluation-revision needs to span several years in the process of instructional materials development. The most recent DRK-12 solicitation (2007) emphasizes the importance of framing evaluation questions that address the project's goals, designing evaluation methods that will be useful in answering the questions, and applying those methods rigorously. These instructions are much clearer and more explicit than in previous versions of the solicitation. Thus the NSF K-12 program expects to generate, within the next 10 years, between 10 and 18 products rated by external experts as having used appropriate evaluation methods rigorously. This measure focuses on the outcomes of projects with a substantial development component that produce materials for use in K-12 STEM instruction. The program funds full research and development projects, exploratory projects, synthesis projects, and conferences and workshops. In FY 2007 (the first year of the program), 58 projects were funded; of these, 38 intend to develop materials for use in K-12 STEM instruction (either for students or teachers). Given that the development process is a long-term activity involving basic research, proof-of-concept design, early testing and revision, and wider efforts to implement under special conditions, again followed by revision, it will take several years before the full R&D projects in the DRK-12 program produce interventions that are specific enough, and promising enough, to be evaluated for effectiveness in typical settings.
In addition, some DR-K12 R&D projects will essentially be research projects that examine the effectiveness or implementation of a program, model, or intervention developed previously, rather than developing new materials or a new model. The targets are ambitious because the tradition in this program's precursors has emphasized pre-efficacy research (i.e., early design research and implementation under more clinical conditions). The current solicitation emphasizes the importance of moving these materials closer to readiness for wide-scale implementation, with appropriate testing and design to ensure their readiness. The targets are based on evidence about NSF-supported instructional programs, models, or interventions funded in DRK-12 precursor programs whose effectiveness has been examined using rigorous methods. Program development and rigorous evaluation occur in cyclic and interconnected ways; target years and numbers represent points roughly midway in those cycles. Both the What Works Clearinghouse (WWC) and IES's funding programs have examined NSF-developed materials rigorously. NSF-funded instructional programs entered in the WWC in 2007: Connected Mathematics Project (CMP) (#9150217, #9986372), evaluation studies in 2000, 2001, and 2002; Cognitive Tutor (#9253161, #0087396, #0336585), evaluation studies in 2001 and 2002; Everyday Mathematics (#9252984), evaluation studies in 1997, 1998, 2000, 2001, and 2003. In 2008, the WWC posted a "quick review" of the NSF-funded Simulations for Calculus Learning - SimCalc (#0455868). IES Mathematics and Science Education Research program awards to conduct rigorous studies of materials developed with NSF funding: Taylor, 2006 (high school BSCS materials, #9911614, #0242596); Alevan, 2008 (Cognitive Tutor); Clements, 2004 (Big Math for Little Kids, #9730683); Pane, 2007 (Cognitive Tutor); Barnett, WestEd, 2005 (Math Pathways and Pitfalls, #9911374); and Clements, 2005 (Building Blocks, #9730804).
Measure: Percentage of development-intensive projects in the DRK-12 program that employ appropriate methods to evaluate efficacy and apply them rigorously.
Explanation: The cycle of research-development-implementation-evaluation-revision needs to span several years in the process of instructional materials development. The most recent DRK-12 solicitation (2007) emphasizes the importance of framing evaluation questions that address the project's goals, designing evaluation methods that will be useful in answering the questions, and applying those methods rigorously. These instructions are much clearer and more explicit than in previous versions of the solicitation. Thus the NSF K-12 program expects to generate, within the next 10 years, between 10 and 18 products rated by external experts as having used appropriate evaluation methods rigorously. DRK-12 will add to this base annually approximately 10-12 research studies tied to development projects that employ quasi-experimental or experimental designs. This emphasis on more rigorous evaluation is featured in new solicitations, and EHR is working with the field, and with reviewers, to heighten awareness of these expectations. Each year the number of such projects in the portfolio should grow as the field becomes more cognizant of NSF's evolving expectations. Data from the external evaluation of the Teacher Professional Continuum (TPC) program, another precursor to DRK-12, provide context: the draft report (Abt Associates, May 2008) notes that "nearly one fourth of the projects (9 projects) proposed to randomly assign schools or teachers to conditions and one third of the projects (13 projects) proposed to use a quasi-experimental design" (p. 29). Because TPC involved the implementation of instructional programs or models for K-12 learning, this allows us to establish an earlier baseline for this annual measure and to demonstrate progress.
The What Works Clearinghouse (WWC) criteria for study inclusion (http://ies.ed.gov/ncee/wwc/overview/review.asp?ag=pi), which are similar to the criteria for study inclusion in the recent report of the National Mathematics Panel (http://www.ed.gov/about/bdscomm/list/mathpanel/report/final-report.pdf) reflect the standards that we will use for determining which projects meet this level of rigor in evaluation in the case of RCTs and quasi-experiments. We also acknowledge that rigorous evaluation can employ other methodologies, and will use the NRC Scientific Research in Education principles in addition.
|Section 1 - Program Purpose & Design|
Is the program purpose clear?
Explanation: The National Science Foundation's K-12 Mathematics and Science Education Program is designed to build strong foundations and foster innovations to improve teaching, learning, and evaluation in pre-college mathematics and science. It explicitly aims to generate research-based outcomes, develop innovative resources and tools, and build human capacity to improve K-12 Science, Technology, Engineering, and Mathematics (STEM) learning. Parts of the program also include a commitment to directly improving students' STEM learning. The program addresses these themes by funding awards and contracts for the development of model programs, tools, and resources; research on teaching and learning; and methods for improving assessment. There are two distinct components of the program: Mathematics and Science Partnership (MSP) and Discovery Research K-12 (DR-K12). In the FY 2009 NSF Budget Request for these program components, MSP accounts for roughly one-third of the funds and DRK-12 for the other two-thirds. The goals of these programs are described as follows in the program solicitations: "The Math and Science Partnership (MSP) program is a major research and development effort that supports innovative partnerships to improve K-12 student achievement in mathematics and science. MSP projects are expected to raise the achievement levels of all students and significantly reduce achievement gaps in the mathematics and science performance of diverse student populations. In order to improve the mathematics and science achievement of the Nation's students, MSP projects contribute to the knowledge base for mathematics and science education and serve as models that have a sufficiently strong evidence base to be replicated in educational practice."
(http://www.nsf.gov/pubs/2008/nsf08525/nsf08525.htm) "The Discovery Research K-12 (DR-K12) program seeks to enable significant advances in K-12 student and teacher learning of the STEM disciplines through research about, and development and implementation of, innovative resources, models, and technologies for use by students, teachers, and policy makers. Activities funded under this solicitation begin with a research question or hypothesis about K-12 STEM learning or teaching; develop, adapt, or study innovative resources, models, or technologies; and demonstrate if, how, for whom, and why their implementation affects learning." (http://www.nsf.gov/pubs/2008/nsf08502/nsf08502.htm) The DRK-12 solicitation includes a description of a cycle of innovation and learning (see http://www.nsf.gov/pubs/2008/nsf08502/nsf08502.htm) that explains how various EHR programs provide support for scholarly work at a variety of stages of design, testing, and implementation. In particular, the programs emphasize the study and clarification of phenomena (basic research about STEM learning), design and development of resources and materials for learning, implementation of innovations and efficacy testing, evaluation and effectiveness studies, and synthesis. The initial stages - basic work and proof-of-concept design, and implementation for testing and refinement - are essential before it is cost effective or worthwhile to implement and test innovations at large scale. NSF programs expect that research and development work at all positions of this cycle should be scientifically rigorous, consistent with the methodologies and standards of the profession. NSF uses various mechanisms - especially program solicitations and reports - to communicate the intent of its investments in K-12 education.
Evidence: The program purpose is stated in the agency strategic plan, the mission statement for the overall program, and within the proposal solicitations for the two major components of the K-12 Program (MSP and DRK-12). Both the DRK-12 and MSP solicitations were revised for the FY 2008 competition, incorporating feedback from the field in order to make the solicitations more clear. Sources: NSF's Strategic Plan (pages 7-8): http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0648; Program mission and goal statement: http://www.nsf.gov/ehr/about.jsp; MSP: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06539 http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03605&org=NSF DRK-12: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06591
Does the program address a specific and existing problem, interest, or need?
Explanation: The program addresses an identified need for strong STEM education and a well-prepared STEM workforce. The need is presented quite forcefully in several analyses and serves as the basis for investments by the Administration, Congress, and a host of public and private organizations. The analyses show that U.S. students lag behind those from other nations on tests of knowledge and skills in mathematics and science; that sizeable numbers of students are taught by teachers without degrees in mathematics and science; and that the materials used in mathematics and science courses often do not incorporate contemporary knowledge in mathematics and science as well as research knowledge about learning. The K-12 program is designed to address these specific concerns.
Evidence: The solicitations for the program draw on the scholarly literature and discussions in the public arena to frame the issues to be examined. Key sources supporting the need for focus in this area include: Rising Above the Gathering Storm (NRC, 2007); Taking Science to School (NRC, 2007); Foundations for Success: Report of the National Mathematics Advisory Panel, 2008. The approach taken in the K-12 program reflects extant knowledge about what is likely to be effective to address a given issue. DRK-12 focuses specifically on the need for resources and materials for learning that are based on scientific research. For MSP, the relevant literature centers on teachers' subject matter knowledge and the importance of that knowledge for improving K-12 science and mathematics teaching and learning. MSP contributes to the American Competitiveness Initiative goal to produce 100,000 highly qualified mathematics and science teachers by 2015. DR-K12 is built in support of the goals of the American Competitiveness Initiative and clarifies the dimensions of the problems it is addressing by drawing on analyses by AAAS, NRC, and leading researchers to help in shaping research-based models, assessments, curricular components, and tools and implementation activities to improve K-12 STEM learning. Sources: http://ostp.gov/html/ACIBooklet.pdf; recommendations of CAWMSET, Congressional Commission on the Advancement of Women and Minorities in Science, Engineering and Technology Development (2000, September). Land of plenty: Diversity as America's competitive edge in science, engineering and technology. Washington, DC. http://www.nsf.gov/publications/ods/results.cfm?TextQuery=Land+of+Plenty&Current_status=Current&timeframe=Restrict+timeframe+to%3A&docType=0&docSubtype=0 Rising Above the Gathering Storm (NRC, 2007): http://www.nap.edu/catalog.php?record_id=11463 Taking Science to School (NRC, 2007): http://www.nap.edu/catalog.php?record_id=11625 Foundations for Success: Report of the National Mathematics Advisory Panel (The U.S. Department of Education, 2008), http://www.ed.gov/about/bdscomm/list/mathpanel/report/final-report.pdf
Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?
Explanation: The National Science Foundation (NSF) has sought to design its K-12 Mathematics and Science Education Program to complement efforts undertaken elsewhere and to take account of NSF's strengths, capacities, and past investments. In contrast to the focused R&D efforts of the Federal mission agencies (e.g., NIH-biomedical, NASA-space, DOD-defense, etc.), NSF is the only federal agency charged with promoting the progress of fundamental science and engineering research and education in all STEM fields and disciplines. The MSP program at NSF is the only national, merit-reviewed, research and development (R & D) effort that requires partnerships between higher education and K-12 to improve student achievement in mathematics and science, grades K-12, through foci on (a) challenging courses and curricula; (b) teacher quality, quantity, and diversity; and (c) evidence-based outcomes that contribute to our understanding of how students effectively learn mathematics and science. The cycle of innovation and learning as described in the DRK-12 solicitation demonstrates how the program is aimed at R&D activity that will produce innovations that can then be scaled up and studied in effectiveness trials such as those funded in the Institute of Education Sciences' Mathematics and Science Education Research Program. Ongoing collaboration between NSF program staff with disciplinary expertise and those with education research expertise makes NSF unique among federal agencies for innovative and effective contributions to K-12 STEM education. In addition, the NSF's K-12 Program combines research and development and focuses on innovation and transformation of the highly problematic K-12 STEM teaching and learning environment. Within the private sector, there are some recent and ongoing efforts to fund improvements in STEM teaching and learning at the K-12 level. 
None of these is comparable to the major NSF R&D investment in this area in scope, in STEM focus, or in emphasis on research findings. For example, the Carnegie Corporation's Teachers for a New Era initiative funded 11 institutions over the past several years to reform the preparation of teachers, but there was no specific STEM focus required. And, the National Mathematics and Science Initiative, funded with ExxonMobil resources, has a very prescribed agenda for replicating the UTeach teacher education program (originally NSF-funded) at the University of Texas Austin and for propagating an emphasis on Advanced Placement programs. Other major foundations such as Intel, Noyce, and Gates have mathematics and science interests that are aimed at directly making change in schools, with less focus on building the knowledge base about how to best do this.
Evidence: The integration of research and education makes NSF's K-12 Math and Science Education program unique, in part because of its emphasis on a research and development cycle. All NSF-funded efforts are expected to contribute to the educational knowledge base for how to best improve STEM teaching and learning (in contrast to the implementation programs at science R&D mission agencies.) The U.S. Department of Education (ED) also administers a Mathematics and Science Partnerships program. However, the two programs complement rather than duplicate each other. NSF's Partnerships are funded at approximately $1 million to $7 million a year for five years. An important aspect of the R & D role of the MSP program at NSF is the development of tools and instruments needed by both NSF's and ED's Partnerships to effectively carry out and assess their work. The DRK-12 program and its precursors (most specifically the long-standing Instructional Materials Development program) complement IES's more recent Mathematics and Science Education program, which has focused on funding efficacy and effectiveness trials to examine the quality of instructional materials. A review of the projects funded by this program since its inception indicates that several of these are studies of the effectiveness of materials developed originally with NSF funds, including BSCS materials, "Big Math for Little Kids", SimCalc, Data Modeling, and Diagnostic Embedded Classroom Assessment. This is evidence that the idea of materials developed through NSF funding and basic research being "handed off" to IES-funded researchers for effectiveness studies is actually working in practice. DR-K12 fills a unique niche that blends research and development in K-12 STEM education. 
DR-K12 brings together STEM education faculty, mathematicians and scientists, cognitive scientists, teachers and administrators, and STEM education graduate students to conduct the research, develop resources and tools and build the nation's capacity to develop and test innovative solutions to improve K-12 teaching and learning. Within the R & D context, DR-K12 focuses on resources needed to address contextual and frontier challenges in K-12 STEM education, and encourages innovative thinking from the field while continuing to build on the solid foundation of what works in K-12 STEM teaching and learning, in unique ways not duplicated elsewhere. Sources: MSP: http://hub.mspnet.org/index.cfm; U.S. Department of Education: http://www.ed.gov/index.jhtml; NSF DR-K12: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06593 http://ies.ed.gov/ncer/funding/math_science/index.asp http://www.carnegie.org/sub/program/teachers_prospectus.html http://www.nationalmathandscience.org/ http://www.intel.com/community/grant.htm http://www.gatesfoundation.org/default.htm http://www.noycefdn.org/
Is the program design free of major flaws that would limit the program's effectiveness or efficiency?
Explanation: Yes, the National Science Foundation's K-12 Mathematics and Science Education Program design has matured without major flaws that would limit the program's effectiveness or efficiency. Proposal pressure for both the DRK-12 and MSP programs is strong, indicating that the programs are appealing to the field. The K-12 Program receives extensive oversight and review including: (1) a merit review process recognized as a best practice for administering R&D programs; (2) expert professional judgment of trained NSF program officers knowledgeable in their fields; and (3) triennial external Committee of Visitors (COV) review to ensure effectiveness and efficiency of operations. Independent reviews by COVs and other external groups such as NSF Advisory Committees, the National Science Board, and other organizations provide scrutiny of the portfolio's goals and results. The K-12 Program makes improvements based on recommendations received from these reviewers, following guidance provided in R&D criteria as outlined in the OMB/Office of Science and Technology Policy Guidance Memo. In addition, a new practice was implemented in DRK-12 in FY 2007: panelists come together for a debriefing discussion at the end of each review session, where any lack of clarity about the intent of the solicitation is discussed.
Evidence: NSF's MSP component was initiated in response to P.L. 107-368 and drew upon substantial research and educational expertise and experience of program staff from throughout NSF, as well as that of experts in the field from higher education and K-12. Initially, MSP solicitations emphasized Comprehensive and Targeted Partnerships. Experiences with these Partnerships and input from experts informed later solicitations for Institute Partnerships: Teacher Institutes for the 21st Century and projects that engage national professional and disciplinary societies. In its review of five MSP solicitations and the resulting portfolio, the 2005 COV stated "The COV applauds NSF for achieving a breadth in the MSP portfolio that supports the integration of research and implementation, along with capacity building designed to improve learning outcomes in high quality mathematics and science by all students, at all pre-K-12 levels." Similarly, NSF's oversight will ensure that the DR-K12 program will meet its objectives and performance goals. DRK-12 made its first set of awards in spring 2007, and FY 2008 awards are being made currently. The program solicitation states that all projects must have an evaluation plan "that will examine whether the project has met its goals… All proposals should specify the evaluation questions and evaluation data to be gathered… When appropriate, project goals should include teacher or student learning outcomes, and assessment of progress toward those outcomes should be included in the project evaluation." (NSF 08-502, p. 15) Plans for a full program evaluation are underway, based in part on a workshop organized by the contractor, AIR, to provide initial guidance in fall 2007. In addition, DR-K12 is well aligned with the NSF Strategic Plan, specifically with the Learning Strategic Goal. Any flaws in the original solicitation were remedied through the rigorous internal NSF review process; a new solicitation was issued for FY 2008.
Flaws that emerge as the program is implemented will be identified through COVs and evaluations, and will be addressed. Sources: MSP: http://www.nsf.gov/od/oia/activities/cov/ehr/2005/MSPcov.pdf; MSPnet: http://www.mspnet.org; http://www.nsf.gov/pubs/2008/nsf08525/nsf08525.htm http://www.nsf.gov/pubs/2008/nsf08502/nsf08502.htm Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.) (2007). Taking science to school: Learning and teaching in grades K-8. Washington, DC: National Academy Press. http://books.nap.edu/execsumm_pdf/11625.pdf
Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?
Explanation: The National Science Foundation's K-12 Mathematics and Science Education Program is designed to target STEM education (rather than education in general) through research and development that will provide resources for improvement. These resources ultimately are available to the nation, although schools and districts of course have autonomy in their choices of what materials and resources to use. NSF K-12 programs reach project-level participants, schools, regions, states, and the nation. Specifically, the program is aimed at improvement in areas known to be important for STEM teaching and learning, groups shown to be critical for such teaching and learning, and methods proven to be important for assessment. With reference to the areas of emphasis, the focus on enhancing the content knowledge and teaching skills of mathematics and science teachers (MSP) and on engagement of future researchers (DR-K12) emerges from analyses of where investments should be targeted. Similarly, the attention to student learning (central to all components of the program) and research on it flow from findings in the extant literature, and the methods of assessment are ones that have been clearly established. In terms of design and allocation, the K-12 Program allocates resources to test ideas found in the literature that demand refinement and examination within particular contexts. The program's design, and the allocation of resources to its execution, aim to benefit students and teachers.
Evidence: The NSF K-12 Mathematics and Science Education design ensures that resources address the program purpose and reach the intended beneficiaries, including K-12 students, teachers, administrators, and policy makers, to support the common purpose of generating improvements in curriculum, assessment, teacher professional development, and student achievement in STEM. Resources are allocated according to a documented merit-based system (see http://www.nsf.gov/pubs/policydocs/pappguide/nsf08_1/gpg_3.jsp) through proposals submitted to solicitations (see http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf08502 and http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf08525). MSP has held six competitions, beginning in FY 2002, and its 89 awards comprise a diverse portfolio of Comprehensive Partnerships, Targeted Partnerships, Institute Partnerships: Teacher Institutes for the 21st Century, and Research, Evaluation and Technical Assistance awards. The 52 Partnerships funded to date unite more than 150 institutions of higher education with more than 700 school districts, including more than 3,300 schools, in 30 states and Puerto Rico. The Partnerships are expected to impact more than 138,000 teachers of mathematics and/or science. In academic year 2005-2006, for example, more than 33,000 teachers participated in MSP professional development designed to deepen and expand their expertise. Evidence that the program's activities reach the intended beneficiaries may be seen in the resulting portfolio of awards. DR-K12 represents a new effort that made its first awards in FY 2007. Awards made in the precursor programs to DRK-12 have resulted in materials and models that are being used in schools, in teacher education, and in professional development, that have been examined in the What Works Clearinghouse, and that are being studied by IES-funded researchers. Proposal pressure is strong in both programs, with a success rate in FY 2007 of only 19.5% for DRK-12.
Over the 4 major MSP competitions for partnership awards, the success rate has always been less than 10%. Sources: MSP: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06539; http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03605&org=NSF; DR-K12: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06593; MyNSF: http://www.nsf.gov/mynsf/; NSF's bi-annual Regional Grants Conference and at national meetings: http://www.nsf.gov/bfa/dias/policy/outreach.jsp; The Report to the National Science Board on the National Science Foundation's Merit Review Process Fiscal Year 2006 (nsb0722): http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsb0722 Bureau of Labor Statistics, U.S. Department of Labor, "Career Guide to Industries, 2004-05 Edition." http://bls.gov/oco/home.htm; Horrigan, M.W. (2004). Employment projections to 2012: Concepts and context. "Monthly Labor Review Online," 127, 3-22. http://bls.gov/opub/mlr/2004/02/art1full.pdf
|Section 1 - Program Purpose & Design||Score||100%|
|Section 2 - Strategic Planning|
Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?
Explanation: NSF has a strategic long-term Learning outcome goal to cultivate a world-class, broadly inclusive science and engineering workforce and expand the scientific literacy of all citizens. The NSF K-12 program contributes directly to this effort. The Math and Science Partnership (MSP) program goals are to improve K-12 student achievement in mathematics and science; generate evidence-based outcomes that promote understanding of how to improve K-12 mathematics and science learning environments; and build human capacity (e.g., teacher training) to provide effective STEM learning environments. The MSP long-term (and annually tracked) goal concerns student proficiency on state mathematics assessments required by NCLB. The DR-K12 program goal is to enable significant advances in K-12 student and teacher learning of the STEM disciplines through research and development of innovative resources, models, and technologies for use by students, teachers, administrators, and policy makers. The long-term measure assesses the growth in the number of projects in the portfolio that apply scientifically rigorous research to intervention-focused efforts. This measure is similar to that used by the Department of Education's Institute of Education Sciences.
Evidence: See the Measures Tab for more information. The long-term performance measures for the National Science Foundation's K-12 Mathematics and Science Education Program are reflected in program solicitations, and these measures are consistent with the long-term outcome goal of Learning in the NSF Strategic Plan: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0648; MSP: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06539; DR-K12: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06593
Does the program have ambitious targets and timeframes for its long-term measures?
Explanation: The targets and timeframes for the long-term measures are ambitious. For the MSP measures, the targets are consistent with the goals of No Child Left Behind, and moving from 70% of schools meeting AYP in 2005 to 80% in 2008 is a particularly large jump, especially given the number of MSP schools in rural and urban areas, where teacher turnover, shortages, and weak backgrounds, together with low achievement levels and large achievement gaps at the outset of MSP involvement, require especially aggressive intervention efforts. DR-K12 is a new program which made its first awards in 2007, but it is possible to establish a baseline using WWC data that includes three projects funded in the Instructional Materials Development program, a precursor to DRK-12. (Note that the materials included in the WWC took 10-15 years of development to reach the point where rigorous effectiveness studies were feasible and indicated.) Given the timeframe needed to do basic research about students' understandings of core STEM concepts; develop and test proof-of-concept materials; revise, implement, and test for efficacy; and then be ready for large-scale effectiveness studies, as well as the need to increase the capacity of the field to do this style of R&D (many of the most accomplished developers are nearing retirement age), it is very ambitious to aim for a doubling of this number by 2012 and a tripling by 2018.
Evidence: Targets, timeframes, and further justification for the ambitiousness of the long-term measures can be found in the Measures Tab of this PART. Data from the National Assessment of Educational Progress provided the context for establishing targets for the long-term student achievement goal of student learning for MSP. Specifically, in 2005, 30% of fourth graders nationally performed at or above the proficient level in mathematics, and 30% of eighth graders did so. All MSP projects having direct contact with K-12 classrooms must report student achievement on state proficiency tests or some suitable substitute assessment instruments and/or use other rigorous criteria for success to determine significant improvement in student achievement. The DRK-12 targets are ambitious on the basis of the experience of the What Works Clearinghouse. This US Department of Education effort to publicize the results of studies of the effectiveness of instructional materials has been in place for six years. The WWC reports on the effectiveness of materials as long as one sufficiently rigorous study is available. For elementary mathematics instructional materials the WWC reports on only five programs; for middle grades mathematics it reports on only seven. Given this national data, the targets established for DR-K12, which is only in its second year of making awards, are ambitious. These targets are based on the number of DRK-12 projects funded in FY07 that focus on the development of instructional resources for K-12 students (about 34% of the funded projects).
Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?
Explanation: The MSP measure is both a long-term and an annual outcome measure. Keeping a long-term view on the goal of bringing all MSP schools to the point of meeting AYP, while at the same time pushing for incremental progress, supports the overall MSP program goal of improving K-12 student achievement. The DRK-12 annual measure on appropriate and rigorous evaluation of efficacy in development-intensive projects (roughly 34 percent of the portfolio) is connected to the long-term goal of the program to enable significant advances in STEM learning through the research and development of resources for learning. The annual and long-term measures are coordinated, because DRK-12 needs to include research studies that are conducted rigorously to test the efficacy of the interventions being developed, so that appropriate revision can be undertaken and the materials can be readied for effectiveness studies. Then, in the long term, when such development and research has been successfully undertaken, it will be possible and cost-effective to undertake large-scale effectiveness evaluations of those materials that are ready and promising. The DRK-12 measure is related to the long-term goals of the program and accounts for three-fourths of the DRK-12 portfolio in terms of dollars.
Evidence: Targets, timeframes, and further justification for the ambitiousness of the long-term measures can be found in the Measures Tab of this PART. Data from the National Assessment of Educational Progress provided the context for establishing targets for the long-term and annual student achievement goal of student learning for MSP.
Does the program have baselines and ambitious targets for its annual measures?
Explanation: The baseline for the MSP annual measure is taken directly from the MSP MIS data. The annual targets are ambitious if the long-term goal is to reach 100% AYP in schools by 2014, given the difficulties of working in diverse and challenged schools in the area of mathematics. The DRK-12 baseline is determined on the basis of awards made in FY2007 and reviewed by an external contractor. The new solicitation for DRK-12 (first awards being made currently) will stimulate the field to do more rigorous work in conjunction with development, but this will take time, and the targets are ambitious. The community's capacity to combine development with rigorous research about student learning and effectiveness is growing (these tend to be different types of expertise, and DRK-12 now calls for projects to integrate this expertise). Thus a ramp-up to a 100% target in 2014 is ambitious, and will require substantial outreach to the field.
Evidence: Please see the Measures Tab for MSP and DRK-12 measures.
Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?
Explanation: All solicitations clearly describe the K-12 program's current annual and/or long-term goals. In order for partners to be approved for funding and/or involvement, they must commit to these goals. As part of the award process, the program solicitations specify the goals and how NSF expects projects funded under the solicitation to contribute to its overall success. Program managers ensure that all funded projects contribute to the K-12 program's annual and long-term goals.
Evidence: The K-12 Program employs merit review (see http://www.nsf.gov/pubs/policydocs/pappguide/nsf08_1/gpg_3.jsp) to select proposals that demonstrate commitment to the K-12 goals, and requires grantees to submit satisfactory annual and final progress reports, subject to NSF program officer approval, as a prerequisite for continuation and/or renewal support. To receive further support (subsequent awards), all applicants are required to include in new proposals a report on results of previous NSF support, which is considered in the merit review process. A final project report must be submitted after an award ends, and no subsequent awards can be made to an applicant unless a program officer has approved the final project reports for all previous awards. The approval process includes examination against the program goals. Particular components of the K-12 program have additional expectations to assure that partners commit to and work toward the annual and/or long-term goals. In the MSP Program, upon receiving an award, each Partnership must submit a Strategic Plan, Annual Implementation Plan, and Evaluation Plan that summarize its commitments, including benchmarks that signify progress toward goals. Annual Progress Reports are required both from the PI and the independent project evaluator to document progress toward goals. All projects are required to submit information to an externally contracted MSP-Management Information System (MSP-MIS) that calls for common quantitative and qualitative data on students, teachers, schools, districts, STEM disciplinary faculty, and others, thus providing an additional avenue for monitoring partners' progress toward program goals. Many of the MSPs operate under cooperative agreements, thereby ensuring especially close interaction with all partners.
In FY2008 the DR-K12 program will fund a DRK-12 Resource Network, which will help with coordination and coherence in ensuring that DRK-12 PIs are addressing short- and long-term goals. Experience with evaluation and technical assistance in the legacy K-12 programs on which DR-K12 is based will serve to assure the commitment of partners to program goals in DR-K12. Strategies that will be continued from the past include the use of PI meetings to update the community about knowledge accumulation and the program goals, and the collection of annual performance data. There was 100% compliance in the performance data collection process in the CLT program.
Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?
Explanation: NSF relies on two approaches to evaluation, both independent of the PIs and NSF staff: 1) monitoring and evaluation of programmatic efforts by well-established evaluation organizations or institutions selected through a competitive process (every three to five years for each program), and 2) the use of external expert judgment to conduct quality reviews of proposals and the award portfolio (COV process). Together, these two approaches allow for high quality, sufficient scope, independence, and regularity to support program improvements. Third-party program evaluations are conducted every three to five years, and include project monitoring, formative program assessment/implementation evaluation, an outcome evaluation, and evaluative research studies. Merit-based K-12 grants are reviewed triennially by a committee of experts, through NSF's Committee of Visitors process, to assess pre- and post-award processes and programmatic management, to review the quality of the award portfolio, to determine the responsiveness of the program to emerging research and education opportunities, and to review the accomplishments of the K-12 investment, as well as comment on areas in need of improvement. The results of these evaluations directly influence K-12 Program management and planning. The MSP Program has an extensive suite of evaluation studies underway, using multiple methodologies that include pre-post comparisons and other comparisons. These studies examine the impact of MSP program projects on (a) student achievement, (b) accomplishments in the domain of teacher quality, quantity, and diversity, (c) the evolution and nature of partnerships (types/kinds, assessment, and potential outcomes), (d) the degree of engagement of disciplinary faculty in mathematics, engineering, and the sciences, and (e) the degree of challenge in the mathematics and science curricular content of MSPs and the role of university faculty in their development.
Drawing on a workshop held in September 2007 with external experts, the DRK-12 program has developed a program evaluation plan, which was submitted to and accepted by the NSTC Subcommittee on Education in fall 2007 as part of the process of producing a follow-up report to the report of the Academic Competitiveness Council. The evaluation plan, to be implemented beginning in FY08, involves identifying DRK-12 projects that are undertaking rigorous evaluation studies with RCTs or high-quality comparisons and monitoring those studies so that meta-analysis is possible later in the evaluation process. A statement of work will be prepared this fiscal year.
Evidence: An independent evaluation of MSP (MSP-PE) by COSMOS Corporation was funded at the end of FY 2004 following an open competition among proposals responding to an NSF contracts task order that covered both programmatic and performance requirements. The MSP-PE design consists of a series of substudies, each focusing on a different but essential part of the MSP grantees' work (e.g., student achievement trends at MSP sites; partnership implementation in the MSP portfolio; and teacher quality, quantity, and diversity accomplishments of the MSP portfolio). The relevant evaluation strategy for each substudy might be considered a meta-analytic strategy, quantitatively as well as qualitatively amassing and synthesizing evidence from the MSP grants. As of April 2008, 19 MSP-PE publications from the substudies have been published, are in press, or have been submitted and are under review for publication. The DRK-12 plan is a meta-analytic strategy following the methodologies used in the work of the National Mathematics Advisory Panel. As stated previously, the Committee of Visitors meets once every three years to review and assess K-12 priorities, program management, and award accomplishments or outcomes. The COV process is of high quality in terms of the expertise of the members; ensures that the full scope of the portfolio is examined through the process by which awards are selected for review; is unbiased and independent; and is conducted regularly. The schedule for K-12 COV meetings is outlined below. Discovery Research K-12 (new in FY07): COV - Scheduled for Fiscal Year 2010 (Note: Evaluation planning is underway with the assistance of AIR; the first planning workshop was held in FY2007.)
MSP: COV - Fiscal Year of Most Recent COV - 2005; Fiscal Year of Next COV - FY 2008 Sources: COV reports and NSF responses provide information about the performance and relevance of individual K-12 efforts and how NSF plans to respond to suggestions for program improvement. These reports are shared publicly at http://www.nsf.gov/od/oia/activities/cov/. The evaluation design and timeline of the evaluation activities are detailed in the proposals received from COSMOS (MSP evaluation). The planned DRK-12 evaluation description and the agenda from the AIR-organized DRK-12 evaluation workshop are available on request.
Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?
Explanation: NSF's annual budget requests present the funding needs for both components of the K-12 portfolio (MSP and DR-K12). Each is specifically highlighted in the request, and justifications address goals and priorities addressed in the NSF Strategic Plan. The Performance Information chapter of the budget request discusses the NSF-wide framework for performance assessment. Each major NSF organization (i.e. Directorates and Offices) ties its budget directly to this performance framework, but only at the Strategic Goal level. However, the NSF budget justification documents do not make clear the impact of funding decisions on expected performance (e.g., how would a 10% increase or decrease in funding impact the performance of the programs?).
Evidence: Budget information on the K-12 portfolio is included in the NSF FY 2009 Budget Request to Congress, pp. EHR 17-21. Specific K-12 Program objectives and their relationship to NSF strategic goals are found in NSF's strategic plan (pp. 7-9, National Science Foundation, Investing in America's Future, Strategic Plan FY 2006-2011, http://www.nsf.gov/pubs/2006/nsf0648/nsf0648).
Has the program taken meaningful steps to correct its strategic planning deficiencies?
Explanation: No major strategic planning deficiencies have been identified for the K-12 Program. However, a variety of meaningful steps are in place to identify, monitor, and correct strategic planning deficiencies of the type and scope that may appear and that would jeopardize the success of the K-12 Program. These include rigorous project monitoring processes such as annual reports, site visits, and reverse site visits; frequent communication with the scientific community through principal investigator meetings, conferences, and workshops; the merit review process, recognized as a best practice for administering R&D programs; and the professional judgment of trained NSF program officers knowledgeable in their respective fields. Additional corrective measures include the triennial external Committee of Visitors (COV) and Advisory Committee (AC) processes, which provide valuable constructive feedback concerning areas where strategic planning can be strengthened; the program responds directly to the issues raised by the committees on an ongoing basis. Each NSF division or office prepares an annual update describing key actions that have been taken to implement the recommendations cited in the previous COV report.
Evidence: NSF's K-12 program has two major components: Math and Science Partnership (MSP) and Discovery Research K-12 (DR-K12). The MSP program has taken meaningful steps to correct two strategic planning deficiencies: (1) the lack of a common language for communication about project-level evaluation and a framework to guide MSP projects in their thinking about evaluation activity; and (2) the additional value to be derived from a second stage of external merit review for the most complex and most meritorious Partnership proposals submitted. To correct the deficiencies in the MSP program, the following actions have taken place using the strategies outlined above: (1) The Consortium for Building Evaluation Capacity at Utah State University, an MSP-funded project, convened a meeting in fall 2004 of MSP Principal Investigators and evaluators to produce a framework that would guide MSP projects in their communication and thinking about project-level evaluation. From that meeting and subsequent discussions, the Consortium produced Evidence: An Essential Tool: Planning for and Gathering Evidence Using the Design-Implementation and Outcomes (DIO) Cycle of Evidence (NSF 05-31), http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0531. (2) Proposals for Comprehensive Partnerships submitted in response to the second Partnership solicitation (NSF 02-190) and deemed most meritorious in external merit review were all subject to an additional, second stage of external merit review in the form of a Reverse Site Visit prior to award decisions. This second stage of external merit review via Reverse Site Visit was then extended to the most meritorious proposals for Targeted Partnerships in the third Partnership solicitation (NSF 03-605). As a new program, DR-K12 has only had active projects since FY 2007, so there are no reports of deficiencies. Nevertheless, the program has a number of self-correcting mechanisms consistent with the other components of the NSF K-12 Program.
These include the unbiased peer review process for recommending projects; the triennial COV reports; annual reports, site visits and reverse site visits; and regular communication with the scientific community through meetings with principal investigators and other disciplinary experts.
If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?
Explanation: While these programs are aimed at clear public benefits, they are basic R&D programs from an education R&D perspective. This question is intended for those applied R&D programs with potential commercial/industrial benefits.
Evidence: A brief summary of the type of R&D carried out in these K-12 programs is located in the NSF FY 2009 Budget Request to Congress, pp. EHR 17-21. http://www.nsf.gov/about/budget/fy2009/pdf/28_fy2009.pdf
Does the program use a prioritization process to guide budget requests and funding decisions?
Explanation: A prioritization process is used to formulate the specific budget requests and guide funding decisions for the K-12 Mathematics and Science Education Program. This process develops both NSF's overall highest priorities and individual programmatic priorities. In developing priorities for individual K-12 STEM education activities, information on the following factors is obtained: (1) NSF's highest funding priorities listed in the FY 2009 Budget Request - especially those addressing major national challenges identified by the Administration; (2) needs and opportunities identified by Committee of Visitors (COV) and Advisory Committee (AC) review; (3) new frontiers and topics of critical need that are identified by the education and scientific communities, e.g., through workshops; and (4) important emerging areas for which large numbers of highly ranked proposals are received. The FY09 NSF budget request identifies the five thematic priorities of EHR: Broadening Participation to Improve Workforce Development, Enriching the Education of STEM Teachers, Furthering Public Understanding of Science and Advancing STEM Literacy, Promoting Cyber-enabled Learning Strategies to Enhance STEM Education, and Promoting Learning through Research and Evaluation. Both the MSP and DRK-12 programs address elements of all of these themes. Senior management integrates that information, prioritizes budget requests within and among programs, and determines funding levels for K-12 activities. The K-12 Program relies on the merit review process to prioritize proposals for funding decisions; final funding decisions also include consideration of NSF's core strategies and maintenance of a diverse portfolio.
Evidence: Documentation of the prioritization process used to guide budget requests and funding decisions may be found in the following references. NSF Strategic Plan: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0648; The NSF FY 2009 Budget Request to Congress: http://www.nsf.gov/about/budget/fy2009/index.jsp; National Science Board reports, minutes, and agendas: http://www.nsf.gov/nsb/; COV reports for the components of the K-12 Program: http://www.nsf.gov/od/oia/activities/cov/
|Section 2 - Strategic Planning||Score||89%|
|Section 3 - Program Management|
Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?
Explanation: The Math and Science Partnership (MSP) component collects performance information from all MSP Partnerships through (a) an externally contracted MSP-Management Information System (MSP-MIS), which currently consists of seven on-line modules that collect an array of common quantitative and qualitative data from all Partnerships on an annual basis; (b) site and reverse site visits to awardees; and (c) review of annual progress reports that are aligned with the project's strategic and evaluation plans and that include a financial report with detailed budget information. This information, as well as other reports and documentation, is used to manage the program and improve performance. All NSF grantees must submit annual and final project reports to NSF. The DR-K12 Program collects grantee information through annual and final project reports. Since the first awards were funded in FY2007, however, those reports are not yet available. One PI meeting has been held, and annual reports for continuing awards will be coming in during the spring and summer of 2008. The DR-K12 Program managers review and use performance information in these reports to adjust program priorities, make changes in resource allocations if appropriate, and make other adjustments in managing the program. A management information system is being considered as part of the DRK-12 program evaluation, and in FY08 a cooperative agreement will be funded for a DRK-12 program resource network, which was competed through the FY08 solicitation. The performance of the agency itself in running the program is generally assessed via the periodic Committee of Visitors (COV) process discussed in response to Question 2.6.
Evidence: All MSP Comprehensive Partnerships are managed via cooperative agreement and are divided into two phases for post-award performance review: a two-year Phase I (initial implementation) and a three-year Phase II (full scale implementation). A critical Phase I site visit is conducted at the MSP site approximately 18-20 months after the project's start date by the project's NSF program manager and a team of external reviewers. The team provides recommendations to NSF regarding partnership performance progress and future plans. If the critical site review process indicates that a project has made satisfactory progress toward realizing its goals and that future plans are consistent with realization of these goals, the award is supported for the remaining three years in Phase II. If progress has been less than satisfactory, the cooperative agreement may be modified, phased out or terminated, depending upon the severity of the shortcomings identified. The DR-K12 Program managers review and use performance information in the annual project reports to adjust program priorities, make changes on resource allocations if appropriate, plan site visits as needed, and make other adjustments in managing the program. See Question 2.6 for more on the COV process.
Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?
Explanation: As explained in 3.1, NSF program managers are held accountable for regularly monitoring the program performance of grantees and program partners (NSF awardees and contractors), who must meet specific reporting and financial reporting requirements. Experienced NSF Program Officers, many of whom are "rotators" or visiting scientists who return to their home institutions after serving one or two years at NSF, monitor progress on an annual basis through frequent communication with grantees, annual reports from PIs, and site visits as travel funds permit. To receive further support (subsequent awards), all applicants for NSF support are required to include in their new proposals a report on the results of previous NSF support. Such past performance is then considered in the merit review process. The NSF Proposal and Award Policies and Procedures Guide states: "For all multi-year grants (including both standard and continuing grants), the PI must submit an annual project report to the cognizant Program Officer at least 90 days before the end of the current budget period. (Some programs or awards require more frequent project reports). Within 90 days after expiration of a grant, the PI also is required to submit a final project report. Failure to provide final technical reports delays NSF review and processing of pending proposals for that PI." A project's Principal Investigator and the grantee institution are accountable for all aspects of project management, including the performance of any subawardees, contractors, and partners. All NSF awards are subject to NSF general grant conditions and any special terms of award indicated in the official award letter. In addition to programmatic oversight, NSF has a plan for post-award financial monitoring, but this is not carried out by program officers.
Rather, the Division of Institution and Award Support is responsible for assessing the administrative and financial practices of awardee institutions through its Award Monitoring and Business Assistance Program (AMBAP). This annual risk-based portfolio management process ultimately results in the assignment of organizational awards-based risk scores. In general, institutions identified as high risk receive site visits; those of medium risk receive desk reviews. Program and administrative offices supplement AMBAP's list with their recommendations of institutions potentially needing business assistance. In FY 2008, high and medium risk awards covered by the Risk Assessment Award Portfolio Analysis will represent 93 percent of NSF's total outstanding obligations and 87 percent of its active awards.
Evidence: Program Planning and Management is one of the critical elements in the performance plans of most of the program managers in the K-12 Program. Program managers' responsibilities include advising and assisting in the development of long-range plans and budgets for the Program in a timely manner, as well as managing the peer review system and post-award evaluation process, including appropriate and timely communication with awardees. In the annual performance appraisal process, performance in this area is given one of the following ratings: outstanding, very good, fully successful, minimally successful, or unacceptable. NSF Proposal and Award Policies and Procedures Guide, January 2008 (NSF 08-1): http://www.nsf.gov/publications/pub_summ.jsp?ods_key=papp NSF has satisfactorily addressed a recent Inspector General concern over post-award financial monitoring. See the OIG website: http://www.nsf.gov/oig/.
Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?
Explanation: Funding for the DRK-12 and MSP components of the K-12 portfolio is obligated in the year in which it is appropriated. NSF's improper payments rate is only a few hundredths of one percent, and NSF has received a clean opinion on its financial statements for the past 9 years. NSF Program Officers and Grants Officers work in concert to monitor awardee progress and release funds only as reporting and other requirements outlined in grant agreements are met. The NSF Office of Inspector General (OIG) reviews whether NSF award funds are spent for intended purposes and properly reported, and whether associated costs are allowable, allocable, and reasonable. NSF management also assists in the mitigation of this risk through its site visits and desk reviews under the Award Monitoring and Business Assistance Program (AMBAP). During both site visits and desk reviews, expenditures reported to NSF via quarterly financial reports are reconciled to the institution's project/general ledger. Entities that expend $500,000 in federal funds in a year must have a single audit conducted in accordance with OMB Circular A-133. These audits are required annually and must be submitted to the Federal Audit Clearinghouse. NSF has two potential paths for addressing the findings and recommendations that result: (1) For those entities for which NSF has cognizance, where there are systemic findings and/or questioned costs related to its awards, NSF would employ its formal audit resolution process. (2) When NSF is providing funding to an entity but is not the cognizant agency (the cognizant agency is typically NIH or DOD), it still evaluates and considers the systemic findings and recommendations that result from the A-133 process if they affect NSF awards. NSF would formally resolve questioned costs relative to NSF awards. Also, the PI must submit a project report annually for NSF grants.
Internally, the obligation and expenditure information is linked together and presented in the same document so that the program officer can compare the progress report with the expenditure pattern for the award. Further funding is contingent on successful project reports; lack of a due project report trips a flag in the system that prevents further obligations until the project report is submitted and successfully reviewed by the NSF program officer.
Evidence: NSF Budget Requests to Congress show limited carryover funds in general. DRK-12 and MSP have had no funds carryover. In FY 2007, NSF processed about 275 DRK-12 proposal actions and made over 80 initial and continuing increment (for precursor programs) awards totaling over $98 million, and 95 MSP proposal actions and 15 initial and continuing increment awards totaling almost $46 million. The audit and reporting procedures are contained in the Award and Administration Guide (http://www.nsf.gov/pubs/policydocs/pappguide/nsf08_1/aag_index.jsp) and in the cooperative agreement Financial and Administrative Terms and Conditions (FATC) document (http://www.nsf.gov/pubs/cafatc/cafatc605.pdf).
Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?
Explanation: There are NSF-wide and MSP and DRK-12 specific plans in place to encourage efficiencies and effectiveness. However, since there are no efficiency measures that meet the PART criteria, this answer must be "no." NSF has posted 100% of discretionary grant applications on Grants.gov, which is one of the factors that has earned NSF a Green rating on the President's Management Agenda Scorecard. NSF IT improvements, especially those associated with FastLane and the Electronic Jacket (ejacket) permit more timely and efficient processing of proposals, given the increasing numbers of proposals received each year with about the same numbers of program staff to handle them. This has allowed NSF to establish a Foundation-wide time-to-decision goal to inform applicants about funding recommendations within six months of proposal receipt. The K-12 Mathematics and Science Education Program is a leader in this area, consistently exceeding the NSF-wide goal that states that for 70% of proposals, a recommendation on funding will be made and conveyed to the PI within six months of proposal receipt. A recent study by the National Research Council reaffirmed that a time-to-decision goal is a valid efficiency metric for research grant-making agencies: Evaluating Research Efficiency in the U.S. Environmental Protection Agency, National Research Council, Washington, DC, January 2008, p.30. However, this proposed measure does not transparently account for the level of administrative resources applied, nor does it clearly demonstrate continual improvement. 
The proposed MSP efficiency measure called for participating schools and school districts to meet NSF annual reporting requirements in a timely fashion, enabling program managers to use the data to plan post-award monitoring, refine data collection and reporting systems, guide the planning of site visits and other review procedures, and refine program solicitations to ensure maximum program management efficiency. However, it is not clear how this measure links to NSF program management efficiency.
Evidence: In FY 2008, the DRK-12 Program will fund a cooperative-agreement-based incentive, the DRK-12 Resource Network. This entity will be engaged in planning, implementing, and evaluating DRK-12-related initiatives. A web-based monitoring system for the Math and Science Partnership (MSP) is conducted under contract by Westat, Inc. using OMB 3145-0199, which collects quantitative and qualitative data annually, allowing for comparisons both within and among projects over time. Information is gathered through seven web-based surveys. Given the absence of longitudinal data, monitoring is a key way to understand MSP cost-effectiveness, allowing NSF to adjust funding levels. Additional evidence may be found in the report on the results of NSF's program management goals in the Performance Information chapter of the FY 2009 Budget Request to Congress: http://www.nsf.gov/about/budget/fy2009/index.jsp
Does the program collaborate and coordinate effectively with related programs?
Explanation: NSF agrees with the report of the Academic Competitiveness Council, which found that interagency coordination could be improved and that agencies should improve the coordination of their K-12 STEM education programs with states and local school systems. NSF has been involved in the NSTC Education Subcommittee activity (co-chaired by EHR AD Cora Marrett) to establish better approaches to coordination and evaluation. NSF coordinates with the Department of Education through the National Math Advisory Panel and the Institute of Education Sciences. NSF also coordinates its Math and Science Partnership (MSP) program with the related program at the Department of Education. Within NSF, an example of program collaboration and coordination is the Research, Evaluation and Technical Assistance (RETA) component of MSP, which co-funds several projects with the Research and Evaluation on Education in Science and Engineering (REESE) program, thereby making tools available for study and use in the DRK-12 program.
Evidence: NSF's Math and Science Partnership (MSP) collaborates with the Department of Education (ED) to define linkages necessary to manage the two parallel, but separate, programs for greatest effectiveness. MSP collaboration between ED and NSF occurs at three levels: at the agency level, at the program level, and at the project level in the field. At the project level, almost two-thirds of NSF's funded Partnerships report direct collaboration with ED's state MSP sites. At the agency level, NSF and ED agreed through a Memorandum of Understanding (MOU) on cofunding for two large Partnership projects and, through a Supplemental MOU agreement, have continued their co-management. NSF collaborated with the US Department of Education during 1998-2000 on a joint venture related to mathematics education. This led to the Figure This! public awareness campaign on middle school mathematics and a joint report, High Standards in Mathematics for Every Student: A Guide to Effective Use of Resources (NSF 00-83). Within NSF, formal cross-directorate collaboration is evident in the following program solicitations: (1) The DR-K12 program participates in the International Polar Year (IPY) initiative (NSF 07-536), for which proposals were submitted in March 2007. The programs embedded in the DR-K12 program also co-funded a number of projects under the first IPY solicitation (NSF 06-534). (2) The forerunner DR-K12 programs developed the K-12 component of the Nanoscale Science and Engineering Education solicitation (NSF 03-044) and manage the resulting awards, in collaboration with the Engineering and Mathematical and Physical Sciences directorates. (3) The forerunner DR-K12 programs contributed to the development of the solicitation for the Global Learning and Observations to Benefit the Environment (GLOBE) program (NSF 02-013) and co-funded all the awards related to K-12 education.
Informal cross-directorate collaboration is reflected in joint participation and co-funding: (1) DR-K12 program staff members have served on cross-directorate committees, such as the Environmental Education Venture Fund. (2) DR-K12 program staff members attend panel meetings and participate in discussions on funding for programs outside EHR, such as the Division of Physics panel on QuarkNet. (3) DR-K12 program staff members, jointly with program staff members from ENG and MPS, planned two workshops related to nanoscale science and engineering education involving education experts and education directors from the NSF-funded nanoscale science and engineering research centers. The report for the first workshop has been published (NSF 06-54). (4) Program officers from the R&RA directorates have been reviewers and members of DR-K12 panels. (5) The DR-K12 program and its forerunner programs regularly co-fund projects with other directorates. Recent examples include (a) Divisions in CISE and MPS co-funded grant #ESI-0628091; and (b) DRL co-funded grants #DMR-0606387 and #PHY-0312038. NSF Press Release 07-005 states, "NSF's MSP program focuses on research and development and complements programs at the Department of Education that disseminate educational strategies and tools to the 50 states via formula-driven funds." (See http://www.nsf.gov/news/news_summ.jsp?cntn_id=108299&org=EHR&from=news)
Does the program use strong financial management practices?
Explanation: The NSF K-12 Mathematics and Science Education Program relies upon the same strong financial management practices that are used across the Foundation. NSF's practices led to NSF being the first federal agency to receive a "green light" for financial management on the President's Management Agenda (PMA) scorecard. NSF has maintained the rating since then. In FY 2007, NSF received its tenth consecutive unqualified "clean" audit opinion from an independent audit of its financial statements, with no material weaknesses reported.
Evidence: NSF has received a clean opinion on its financial audits for the past 10 years. NSF is committed to providing quality financial management to all its stakeholders. It honors that commitment by preparing annual financial statements in conformity with generally accepted accounting principles in the U.S. and then subjecting the statement to independent audits. For the latest statements, see the Foundation's FY 2007 Annual Financial Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=afr)
Has the program taken meaningful steps to address its management deficiencies?
Explanation: The leadership of NSF's Education and Human Resources Directorate regularly reviews the management of the K-12 Mathematics and Science Education Program portfolio and has found no management deficiencies in that program. In addition, external expert review for every NSF program is conducted once every three years by a Committee of Visitors, which reviews program and portfolio management as well as program results, makes recommendations to the program leadership, and reports to the appropriate directorate advisory committee. NSF management, in turn, responds to the COV report, outlining the steps the Foundation will take to address issues raised. The NSF response to COV recommendations is updated annually. In addition, the NSF response, as well as the initial COV report, is reviewed by directorate Advisory Committees. All COV reports are available on NSF's website.
Evidence: MSP Program grants are awarded on the basis of NSF's competitive, merit review process that includes external peer evaluation. MSP post-award management and oversight follow an overall six-pronged approach: (1) use of cooperative agreements and other mechanisms, such as carefully formulated "conditions of award," that enable focused oversight; (2) site and reverse site visits to awardees; (3) ongoing monitoring and Program Officer review of annual progress reports that are aligned with the project's strategic and evaluation plans and that include a financial report with detailed budget information; (4) technical assistance, especially for new awardees; (5) an externally contracted program-level management information system that collects and analyzes core data required annually from each funded Partnership; and (6) a substantial overall MSP program evaluation. See: http://www.nsf.gov/od/oia/activities/cov/covs.jsp
Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?
Explanation: All awards in the K-12 Mathematics and Science Education program are recommended on the basis of NSF's competitive, merit review process that includes the two merit review criteria approved by the National Science Board. NSF's Proposal and Award Policies and Procedures Guide explains the proposal processing and review procedures. NSF may also add specific criteria to individual program solicitations. During the 2005-2007 period, 142 first-time PIs applied to the DRK-12 or precursor-program competitions (31 were awarded), and 50 applied to the MSP competitions (5 awarded). Substantial outreach is provided for MSP through regional workshops and sessions at professional meetings; for DRK-12, sessions are offered at major professional meetings, and outreach workshops and webinars were held specifically for the FY07 and FY08 competitions. (See http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=500047&org=EHR)
Evidence: The two merit review criteria are "1. What is the intellectual merit of the proposed activity? How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields? How well qualified is the proposer (individual or team) to conduct the project? (If appropriate, the reviewer will comment on the quality of prior work.) To what extent does the proposed activity suggest and explore creative and original concepts? How well conceived and organized is the proposed activity? Is there sufficient access to resources? "2. What are the broader impacts of the proposed activity? How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, disability, geographic, etc.)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society? " The MSP program includes additional review criteria related to its Five Key Features (Partnership-Driven; Teacher Quality, Quantity and Diversity; Challenging Courses and Curricula; Evidence-based Design and Outcomes; and Institutional Change and Project Sustainability) plus Evaluation, Management Plan, and other project attributes that support program goals. In addition, proposals for Comprehensive Partnerships submitted in response to the second Partnership solicitation (NSF 02-190) and deemed most meritorious in external merit review were further subject to a second stage of external merit review in the form of a Reverse Site Visit with external reviewers prior to award decisions. 
This second stage of external merit review via Reverse Site Visit was extended to the most meritorious proposals for Targeted Partnerships in the third Partnership solicitation (NSF 03-605). http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf06539 http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03605&org=NSF Additional background evidence demonstrating that grants are awarded through a clear competitive process is included in the following sources: (1) The Report to NSB on NSF's Merit Review Process Fiscal Year 2006 (nsb0722) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsb0722). (2) The NSB Policy on Recompetition (http://www.nsf.gov/nsb/documents/1997/nsb97224/nsb97224.txt). (3) COV reports and NSF responses: http://www.nsf.gov/od/oia/activities/cov/. Substantial outreach is provided for MSP through regional workshops and sessions at professional meetings; for DRK-12, sessions are offered at major professional meetings, and outreach workshops and webinars were held specifically for the FY07 and FY08 competitions. (See http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=500047&org=EHR)
Does the program have oversight practices that provide sufficient knowledge of grantee activities?
Explanation: NSF's oversight mechanisms provide sufficient knowledge of grantee activities to monitor and understand how funds are utilized by grantees. NSF continuously makes internal improvements in its oversight practices; in 2008, the COV report template was revised to include more specific questions on program management and review. In addition to programmatic oversight, NSF has a plan for post-award financial monitoring that is not carried out by program officers. Rather, the Division of Institution and Award Support is responsible for assessing the administrative and financial practices of awardee institutions through its Award Monitoring and Business Assistance Program (AMBAP). AMBAP uses a variety of activities to monitor these areas, including site visits and desk reviews. Site visit teams are typically composed of program officers and cost analysts; during the visits, the team ensures that any award-specific terms and conditions (e.g., cost sharing) are being met. This annual risk-based portfolio management process ultimately results in the assignment of organizational awards-based risk scores. In general, institutions identified as high risk receive site visits; those of medium risk receive desk reviews. Program and administrative offices supplement AMBAP's list with their recommendations of institutions potentially needing business assistance. In FY 2008, high and medium risk awards covered by the Risk Assessment Award Portfolio Analysis will represent 93 percent of NSF's total outstanding obligations and 87 percent of its active awards.
Evidence: Data and information demonstrating sufficient oversight practices are included in COV reports (http://www.nsf.gov/od/oia/activities/cov/); PI annual and final project reports; the annual Report to the National Science Board on the National Science Foundation's Merit Review Process (latest is for Fiscal Year 2006: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsb0722); site visit reports; trip reports from attendance at professional meetings; and workshops and grantee meetings. Additional material showing site and desk reviews for the K-12 Programs is available for review (provided to the OMB Examiner as a spreadsheet of Post Award Monitoring activities by NSF at awardee organizations that were managing awards as part of the K-12 Program over the last three years, as an addendum to the Evidence Notebook prepared in 2007). See also site and reverse site visit reports and MSP-MIS modules. NSF has satisfactorily addressed a recent Inspector General concern over post-award financial monitoring. See the OIG website: http://www.nsf.gov/oig/.
Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?
Explanation: The National Science Foundation's (NSF's) K-12 Mathematics and Science Education Program collects grantee performance data on an annual basis as it does from all NSF-supported research and educational programs. These data are made available to the public through the Discoveries area of the NSF web site, through press releases, through the annual performance report and the Performance Highlights Report, and in the Report of the Advisory Committee for GPRA Performance Assessment.
Evidence: Grantees provide annual and final project reports to NSF, which are reviewed by the program officers. NSF Grant General Conditions require that results of NSF-supported research be published in open literature such as peer-reviewed journals. Members of the general public have access to data on the numbers of proposals and numbers of awards as well as, for each award, the name of the principal investigator, the awardee institution, the amount of the award, and an abstract of the project. NSF proactively seeks out noteworthy discoveries and distributes these in general press releases. Through an award to TERC, Inc., the Math and Science Partnership (MSP) program also maintains MSPnet: An Electronic Community of Practice Facilitating Communication and Collaboration, an e-community linking active MSP projects for sharing, professional examination, and dissemination of MSP work. The website is available to the public. The DRK-12 program will make an award in FY2008 to a DRK-12 Resource Network that will help with coordination, evaluation, and outreach. The following sources provide additional background information. (1) NSF Discoveries web site (http://www.nsf.gov/discoveries/). (2) News releases (http://www.nsf.gov/news/news_list.cfm?nt=2). (3) Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (4) FY 2007 Performance Highlights (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0803). (5) The NSF FY 2009 Budget Request to Congress: http://www.nsf.gov/about/budget/fy2009/index.jsp (6) The NSF Grant General Conditions (GC-1) (http://www.nsf.gov/pubs/gc1/gc1_605.pdf). (7) Highlights of annual meetings/grantees meetings, workshops, and awards database (http://www.nsf.gov/awardsearch/). (8) MSPnet (http://www.mspnet.org)
For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?
Explanation: The K-12 Program has not awarded grants to investigators at FFRDCs, so this question is not applicable to this NSF program.
|Section 3 - Program Management||Score||90%|
|Section 4 - Program Results/Accountability|
Has the program demonstrated adequate progress in achieving its long-term performance goals?
Explanation: In the MSP program, projects have been reporting to the Management Information System with school level data since FY2003. Each year the system is refined and capacity among the PIs to provide good data is increasing, even as the number of projects and MSP schools is also increasing. The MSP program represents one-third of the funding in this assessment. For DRK-12, evidence of progress in this newly configured program suggests that the program will be prepared to demonstrate progress toward the long-term measure. This progress in setting up the program for success is evident both in the set of awards funded in the first year of the program (FY2007) and in the pending proposals in the FY2008 competition (see evidence below). Even though this is a newly configured program, there are sufficient data from precursor programs (see measures tab) to demonstrate progress toward this new long-term measure. The DRK-12 program represents two-thirds of the funding in this assessment.
Evidence: Documentation of the percentages of MSP schools meeting AYP is available in internal NSF documents. These analyses of student achievement in the MSP program are derived from data entered annually by MSP projects into an online Management Information System (MIS). For each school that has worked with MSP in any capacity, NSF obtains through the MIS a variety of information that enables the program to determine if the school has met AYP in mathematics in the collection year. For those schools deemed to be MSP schools because of significant MSP involvement, student achievement is being characterized along multiple dimensions (AYP status of the school, student success on state assessments, and student enrollment and passing rates in mathematics and science courses). The revised DRK-12 solicitation for FY 2008 is clear in its expectations that effectiveness of instructional resources is to be examined rigorously. An external contractor reviewed the 58 DRK-12 proposals funded in FY07. The contractor was charged to make an independent judgment about the research design. These external experts found that 11 of the projects proposed to use Randomized Control Trials, 4 planned to use well-matched comparison groups in quasi-experimental designs, and 9 had "potential" to implement quasi-experimental designs. These projects began in FY2007. The target that 12 of these projects will have examined effectiveness of materials using rigorous methods is both plausible and at the same time ambitious; many are designing new instructional materials, and thus need to move through stages of basic research, iterative design and testing in clinical settings, wider implementation, efficacy, and then effectiveness studies. It will require constant project monitoring to ensure reaching these numbers, but the program is poised to do so and is committed to ensuring this level of evaluation.
Program officers working on proposals that have come in through the FY2008 DRK-12 competition report that virtually all submissions have shown particular attention to research design. Of course, not all of these will meet expectations for funding, but it is clear that the field is responding to the heightened emphasis on rigor that is articulated in the new DRK-12 program solicitation. The FY 2007 Performance Report, http://www.nsf.gov/about/budget/fy2009/index.jsp and independent COV evaluations (http://www.nsf.gov/about/performance/advisory.jsp) include examples of the progress demonstrated by the K-12 Program in meeting its performance goals. (See sections A.4.13 and B3 in the FY05 MSP COV report, and the following sections in COV reports for precursor programs to DRL: FY05 IMD COV A2.10, B1; FY07 TPC COV report A5, B1.) Information on MSP progress can be found in NSF Press Releases 06-029 and 07-005: http://www.nsf.gov/news/news_summ.jsp?cntn_id=108154 http://www.nsf.gov/news/news_summ.jsp?cntn_id=108393 A new release is currently in preparation. Information about the FY2007 DRK-12 awards as analyzed by an external contractor is available on request. Information used in establishing the baseline and targets for the DRK-12 measures can be found at http://ies.ed.gov/ncee/wwc/. Information about specific projects in these programs is used in the GPRA process through internal NSF Highlights documents. A list of highlights from the MSP program and from the DRK-12 precursor programs is available on request.
Does the program (including program partners) achieve its annual performance goals?
Explanation: While the MSP program has some performance data (though it accounts for only one-third of the funding in this PART), the DRK-12 program (two-thirds of the funding) is too new to have the data necessary to demonstrate progress on its annual measure. Also, the DRK-12 program funds significant R&D that is not captured by the annual measure, even though the entire portfolio supports the long-term goal. In the MSP program, awardees are required to provide data annually about performance of students in MSP schools to the MSP Management Information System. This information has been gathered annually with excellent and increasing response rates and timeliness since 2003. The data in the table in the measures tab indicate the progress that is occurring in percentages of schools making the AYP criterion. In DRK-12, information about the baseline is drawn from the work in the precursor programs of DRK-12. An external contractor reviewed the 58 DRK-12 proposals funded in FY07. The contractor was charged to make an independent judgment about the research design. These external experts found that 11 of the projects proposed to use Randomized Control Trials, 4 planned to use well-matched comparison groups in quasi-experimental designs, and 9 had "potential" to implement quasi-experimental designs. These projects began in FY2007. The target that 12 of these projects will have examined effectiveness of materials using rigorous methods is both plausible and at the same time ambitious; many are designing new instructional materials, and thus need to move through stages of basic research, iterative design and testing in clinical settings, wider implementation, efficacy, and then effectiveness studies. It will require constant project monitoring to ensure reaching these numbers, but the program is poised to do so and is committed to ensuring this level of evaluation, with attention to progress annually.
The DRK-12 solicitation was revised for FY2008 and is far more explicit about the program's expectation of rigorous research for whatever stage of design and development is being undertaken. FY2008 awards are being made now, and an analysis of the level of rigor in those awards will also be contracted. There is sufficient evidence from precursor programs (see measures tab) to allow for two years of relevant baseline data demonstrating progress against the annual measure for this newly constituted DRK-12 program.
Evidence: See the Measures tab for a discussion of the annual measure, and the relevant data used in establishing baseline number(s). Analyses of student achievement in the MSP program are derived from data entered annually by MSP projects into an online Management Information System (MIS). For each school that has worked with MSP in any capacity, NSF obtains through the MIS a variety of information that enables the program to determine if the school has met AYP in mathematics in the collection year. For those schools deemed to be MSP schools because of significant MSP involvement, student achievement is being characterized along multiple dimensions (AYP status of the school, student success on state assessments, and student enrollment and passing rates in mathematics and science courses). In addition, the FY 2007 Performance Report, http://www.nsf.gov/about/budget/fy2009/index.jsp and independent COV evaluations (http://www.nsf.gov/about/performance/advisory.jsp) provide examples of the progress demonstrated by the K-12 Program in meeting its performance goals. (See sections A.4.13 and B3 in the FY05 MSP COV report, and the following sections in COV reports for precursor programs to DRL: FY05 IMD COV A2.10, B1; FY07 TPC COV report A5, B1.) Information on MSP progress can be found in NSF Press Releases 06-029 and 07-005: http://www.nsf.gov/news/news_summ.jsp?cntn_id=108154 http://www.nsf.gov/news/news_summ.jsp?cntn_id=108393 Information about the FY2007 DRK-12 awards as analyzed by an external contractor is available upon request. Information used in establishing the baseline and targets for the DRK-12 measures can be found at http://ies.ed.gov/ncee/wwc/. Information about specific projects in these programs is used in the GPRA process through internal NSF Highlights documents. A list of highlights from the MSP program and from the DRK-12 precursor programs is available on request.
Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?
Explanation: Since the answer to Question 3.4 was "no" because of the lack of an acceptable efficiency measure, this answer must also be "no." NSF has implemented several critical IT improvements that have allowed it to hold its "overhead" costs at 5-6% of the overall budget while dealing with a steadily increasing workload of grant applications and related activities.
Evidence: Evidence demonstrating improved efficiencies in achieving program goals can be found in the FY 2007 Performance Report, which is in the FY 2009 Budget Request to Congress: http://www.nsf.gov/about/budget/fy2009/index.jsp
Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?
Explanation: There do not appear to be any studies or reports that have explicitly compared the NSF K-12 program to other government, private, or similar programs. In particular, NSF is not aware of any comparison of ED MSP schools and NSF MSP schools. The ED MSP uses tools and instruments being developed under NSF MSP (Hill/Ball, Smith/Horizon, and Sadler items are being used in ED MSPs), and ED MSP regional conferences draw on NSF work as models. However, NSF undertook a modest comparison by using the latest National Research Council report on K-12 education, Taking Science to School. This NRC report is a consensus study synthesizing research literature on K-12 education. An external group checked the Department of Education's Institute of Education Sciences (IES) and NSF records to determine which of these authors/organizations had received funding. They found that two had received IES funding, and 11 had received NSF funding. (Both authors who had received IES funding had also received NSF funding.) This is an empirically based indication that the NSF K-12 program compares favorably to another government program with similar purpose and goals. However, it is not a separate, external evaluation carried out with the express purpose of making a comparison.
Evidence: The NRC report can be found at http://www.nap.edu/catalog.php?record_id=11625. NSF selected a specific chapter (Learning Progressions) because of its relevance and currency for the types of projects that NSF is funding in K-12 research and development. There are 39 references cited in this chapter, involving 25 individual lead authors or organizations.
Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?
Explanation: Third-party evaluations have been contracted out by the MSP program. These generally indicate that the MSP program is effective and achieves desired results, particularly in areas of student achievement. Additionally, the MSP program was positively reviewed by a COV in 2005. The evaluation activities for DRK-12 began in FY 2007. The design received input from the field via an evaluation workshop conducted by AIR and was further refined with advice from the NSTC Education Subcommittee. The evaluation design has been submitted as part of the ACC evaluation follow-up report. While the planning and structure of this reconstituted program look very promising, it is not yet appropriate to conduct a full retrospective analysis of effectiveness. The external evaluations of DRK-12's precursor programs referenced in the measures tab (Teacher Professional Continuum and What Works Clearinghouse) have shown that the precursor programs achieved results.
Evidence: An analysis conducted by COSMOS, a firm with expertise in applied research and evaluation, of 123 schools participating in the National Science Foundation (NSF) Math and Science Partnership (MSP) Program shows improvements in student proficiency in mathematics and science at the elementary, middle-, and high-school levels over a 3-year period (2002-2003, 2003-2004, and 2004-2005 school years). Students showed the most significant improvements in mathematics proficiency, with a 13.7 percent increase for elementary, 6.2 percent increase for middle-school, and 17.1 percent increase for high-school students. Science proficiency at each level showed marked gains as well, with a 5.3 percent increase for elementary, 4.5 percent increase for middle-school, and 1.4 percent increase for high-school students. The proficiency data also reveal a correlation between teachers who participate in MSP professional development and their school's change in student achievement. The correlations are positive in both mathematics and science at all grade levels (elementary, middle, and high school) and are statistically significant for both elementary and high-school mathematics and science. This and other analyses of student achievement in the MSP program are derived from data entered annually by MSP projects into an online Management Information System (MIS). For each school that has worked with MSP in any capacity, NSF obtains through the MIS a variety of information that enables the program to determine whether the school has met AYP in mathematics in the collection year. For those schools deemed to be MSP schools because of significant MSP involvement, student achievement is being characterized along multiple dimensions (AYP status of the school, student success on state assessments, and student enrollment and passing rates in mathematics and science courses).
The most recent COV report (2005) for the MSP program can be found at http://www.nsf.gov/od/oia/activities/cov/ehr/2005/MSPcov.pdf. For DRK-12, see the measures tab for reference to external evaluations.
|Section 4 - Program Results/Accountability||Score||60%|