ExpectMore.gov


Detailed Information on the
Smaller Learning Communities Assessment

Program Code 10003314
Program Title Smaller Learning Communities
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2008
Assessment Rating Adequate
Assessment Section Scores
Section Score
Program Purpose & Design 80%
Strategic Planning 88%
Program Management 90%
Program Results/Accountability 16%
Program Funding Level
(in millions)
FY2007 $94
FY2008 $80
FY2009 $0

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Working with Congress to terminate funding for this duplicative program.

Action taken, but not completed
2006

Continuing to implement a technical assistance strategy to raise student achievement, such as focusing on literacy interventions for ninth graders.

Action taken, but not completed
2006

Make performance data for active grants and for all school years available to the public.

Action taken, but not completed
2007

Develop a plan for phasing in the use of data collected from States (school-level data on student performance on standardized assessments and graduation rates) through the Education Data Exchange Network (EDEN) submission system instead of collecting it directly from grantees.

No action taken
2007

Produce and disseminate a guide for grantees on methods for improving the validity and reliability of the data they report on student enrollment in postsecondary education.

No action taken

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Use program data to establish baselines and long-term and annual targets for performance measures that do not yet have them.

Completed

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: FY03 Cohort: The percentage of students scoring at or above proficient on state math assessments.


Explanation: FY03 cohort received funding in FY04.

Year Target Actual
2004 Baseline 48.4
2005 60 48.8
2006 62 51.8
2007 64 Sept. 08
2008 66 Feb. 09
Long-term/Annual Outcome

Measure: FY04 Cohort: The percentage of students scoring at or above proficient on state math assessments.


Explanation:

Year Target Actual
2005 Baseline 53.9
2006 60 52.8
2007 62 Sept. 08
2008 64 Feb. 09
2009 66 Feb. 10
Long-term/Annual Outcome

Measure: FY05 Cohort: The percentage of students scoring at or above proficient on state math assessments.


Explanation:

Year Target Actual
2006 Baseline 45.5
2007 60 Sept. 08
2008 62 Feb. 09
2009 64 Feb. 10
2010 66 Feb. 11
Long-term/Annual Outcome

Measure: FY03 Cohort: The percentage of students scoring at or above proficient on state reading assessments.


Explanation:

Year Target Actual
2004 Baseline 54.1
2005 60 54.6
2006 62 55.9
2007 64 Sept. 08
2008 66 Feb. 09
Long-term/Annual Outcome

Measure: FY04 Cohort: The percentage of students scoring at or above proficient on state reading assessments.


Explanation:

Year Target Actual
2005 Baseline 59.8
2006 60 59.2
2007 62 Sept. 08
2008 64 Feb. 09
2009 66 Feb. 10
Long-term/Annual Outcome

Measure: FY05 Cohort: The percentage of students scoring at or above proficient on state reading assessments.


Explanation:

Year Target Actual
2006 Baseline 43.4
2007 60 Sept. 08
2008 62 Feb. 09
2009 64 Feb. 10
2010 66 Feb. 11
Long-term/Annual Outcome

Measure: FY03 Cohort: The percentage of students in schools receiving Smaller Learning Communities grants who graduate from high school.


Explanation:

Year Target Actual
2004 Baseline 84.8
2005 87 85.6
2006 88 85.9
2007 89 Sept. 08
2008 90 Feb. 09
Long-term/Annual Outcome

Measure: FY04 Cohort: The percentage of students in schools receiving Smaller Learning Communities grants who graduate from high school.


Explanation:

Year Target Actual
2005 Baseline 84.7
2006 87 84.7
2007 88 Sept. 08
2008 89 Feb. 09
2009 90 Feb. 10
Long-term/Annual Outcome

Measure: FY05 Cohort: The percentage of students in schools receiving Smaller Learning Communities grants who graduate from high school.


Explanation:

Year Target Actual
2006 Baseline 80.4
2007 87 Sept. 08
2008 88 Feb. 09
2009 89 Feb. 10
2010 90 Feb. 11
Long-term/Annual Outcome

Measure: FY03 Cohort: The percentage of graduates from schools receiving Smaller Learning Communities grants who enroll in postsecondary education, apprenticeships, or advanced training for the semester following graduation.


Explanation:

Year Target Actual
2004 Baseline 81.2
2005 83 82.5
2006 84 81.7
2007 85 Sept. 08
2008 86 Feb. 09
Long-term/Annual Outcome

Measure: FY04 Cohort: The percentage of graduates from schools receiving Smaller Learning Communities grants who enroll in postsecondary education, apprenticeships, or advanced training for the semester following graduation.


Explanation:

Year Target Actual
2005 Baseline 82.2
2006 83 83.7
2007 84 Sept. 08
2008 85 Feb. 09
2009 86 Feb. 10
Long-term/Annual Outcome

Measure: FY05 Cohort: The percentage of graduates from schools receiving Smaller Learning Communities grants who enroll in postsecondary education, apprenticeships, or advanced training for the semester following graduation.


Explanation:

Year Target Actual
2006 Baseline 80.3
2007 83 Sept. 08
2008 84 Feb. 09
2009 85 Feb. 10
2010 86 Feb. 11
Long-term/Annual Efficiency

Measure: FY03 Cohort: Cost (in dollars) per student demonstrating proficiency or advanced skills in mathematics.


Explanation: The numerator of the efficiency measure is the amount of funds expended during the year, and the denominator is the number of students who scored proficient in mathematics during that same year.

Year Target Actual
2005 Baseline 680
2006 670 608
2007 660 Sept. 08
2008 650 Feb. 09
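The efficiency measure above is a simple ratio: dollars expended during the year divided by the number of students scoring proficient that year. The following is a minimal sketch of that calculation; the function name and dollar figures are illustrative assumptions, not actual SLC program data.

```python
# Sketch of the efficiency measure described above:
# cost per proficient student = funds expended during the year
# divided by the number of students who scored proficient that year.
# Figures below are illustrative, not actual SLC program data.

def cost_per_proficient_student(funds_expended: float, proficient_students: int) -> float:
    """Return the dollar cost per student scoring at or above proficient."""
    if proficient_students <= 0:
        raise ValueError("proficient_students must be positive")
    return funds_expended / proficient_students

# Example: $6.8 million expended, 10,000 students proficient -> $680 per student
print(cost_per_proficient_student(6_800_000, 10_000))
```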
Long-term/Annual Efficiency

Measure: FY04 Cohort: Cost (in dollars) per student demonstrating proficiency or advanced skills in mathematics.


Explanation: The numerator of the efficiency measure is the amount of funds expended during the year, and the denominator is the number of students who scored proficient in mathematics during that same year.

Year Target Actual
2005 Baseline 483
2006 475 506
2007 465 Sept. 08
2008 455 Feb. 09
2009 445 Feb. 10
Long-term/Annual Efficiency

Measure: FY05 Cohort: Cost (in dollars) per student demonstrating proficiency or advanced skills in mathematics.


Explanation: The numerator of the efficiency measure is the amount of funds expended during the year, and the denominator is the number of students who scored proficient in mathematics during that same year.

Year Target Actual
2006 Baseline 546
2007 540 Sept. 08
2008 530 Feb. 09
2009 520 Feb. 10
2010 510 Feb. 11
Long-term/Annual Efficiency

Measure: FY03 Cohort: Cost (in dollars) per student demonstrating proficiency or advanced skills in reading.


Explanation: The numerator of the efficiency measure is the amount of funds expended during the year, and the denominator is the number of students who scored proficient in reading during that same year.

Year Target Actual
2005 Baseline 595
2006 585 543
2007 575 Sept. 08
2008 565 Feb. 09
Long-term/Annual Efficiency

Measure: FY04 Cohort: Cost (in dollars) per student demonstrating proficiency or advanced skills in reading.


Explanation: The numerator of the efficiency measure is the amount of funds expended during the year, and the denominator is the number of students who scored proficient in reading during that same year.

Year Target Actual
2005 Baseline 435
2006 425 442
2007 415 Sept. 08
2008 405 Feb. 09
2009 395 Feb. 10
Long-term/Annual Efficiency

Measure: FY05 Cohort: Cost (in dollars) per student demonstrating proficiency or advanced skills in reading.


Explanation: The numerator of the efficiency measure is the amount of funds expended during the year, and the denominator is the number of students who scored proficient in reading during that same year.

Year Target Actual
2006 Baseline 559
2007 550 Sept. 08
2008 540 Feb. 09
2009 530 Feb. 10
2010 520 Feb. 11

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the Smaller Learning Communities (SLC) program is to award grants to school districts to create and implement smaller, more personalized school learning environments. The statute lists many different activities, besides creating structural changes in high schools, that can be supported with program funds. In an effort to direct funds to those high schools that experience more of the negative effects of being large, the Department decided, with House Committee input, to narrow the pool of eligible applicants and thus, from the beginning of the program, has defined an eligible LEA as one that applies on behalf of high schools of 1,000 or more students, in grades 9 and higher, that include grades 11 and 12. Department grant competitions from the past four years have attempted to create more focus for the program by emphasizing activities that have the greatest likelihood of improving student achievement. Beginning in FY 2006, appropriations language has focused the program by directing the Department to "use funds only for activities related to establishing smaller learning communities within large high schools or small high schools that provide alternatives for students enrolled in large high schools…." The Department determined that it was already following this directive.

Evidence: Title V, Part D, Subpart 4 of the Elementary and Secondary Education Act of 1965 (ESEA), as amended by Public Law 107-110, the No Child Left Behind Act of 2001. House Appropriations Language, FY 2006 and FY 2008. Notices of application priorities for calendar year 2004 through 2008 grant competitions.

YES 20%
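The eligibility rule described above (an LEA applying on behalf of high schools of 1,000 or more students, in grades 9 and higher, that include grades 11 and 12) can be sketched as a simple check. The function and parameter names below are hypothetical, not drawn from any Department system.

```python
# Hedged sketch of the SLC eligibility criteria described above.
# A high school qualifies if it enrolls 1,000 or more students,
# serves only grades 9 and higher, and includes grades 11 and 12.

def school_is_eligible(enrollment: int, grades: set[int]) -> bool:
    """Check a high school against the stated SLC eligibility criteria."""
    return (
        enrollment >= 1000          # 1,000 or more students
        and min(grades) >= 9        # grades 9 and higher only
        and {11, 12} <= grades      # must include grades 11 and 12
    )

print(school_is_eligible(1926, {9, 10, 11, 12}))  # True
print(school_is_eligible(800, {9, 10, 11, 12}))   # too small: False
```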
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The program addresses the problem of student alienation in large schools - from teachers, other students, and learning - that can hamper student achievement. While structural changes and personalization strategies, by themselves, are not likely to improve student academic achievement, they may create the more optimal school conditions and opportunities to improve the quality of instruction and curriculum and to provide the individualized attention and academic support that all students need to excel academically. Some empirical studies have found that smaller school size and smaller learning communities, different reforms intended to create similar effects, are associated with higher student achievement, particularly among disadvantaged students, but no studies have been conducted using true experimental designs or quasi-experimental designs with rigorously matched comparison groups (Bomotti, 2004). Researchers generally assert that the smaller size of a school or a "learning community" within a larger school promotes improvement in student achievement only indirectly by facilitating curricular and instructional reforms, greater student engagement, and more collaboration among teachers. A "small" environment, in and of itself, does not improve student academic outcomes.

Evidence: Bomotti, 2004; Lee, Smith, Croninger, 1997; Gladden, 1998; Stern, Wing, 2004

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: SLC is the only Federal program that is exclusively focused on creating smaller learning communities and promoting improvements in curriculum and instruction in large high schools. Significant private-sector funding has been made available to promote small schools, including, particularly, a major high school initiative sponsored by the Bill and Melinda Gates Foundation. Early in its support for high schools, the Gates Foundation supported both the creation of new, small schools through grants to charter school management organizations and other non-profit organizations, as well as efforts to reform large high schools. In 2006, the Gates Foundation reported that, since 2000, it had invested about $1.5 billion in efforts to improve high schools, including support for the creation of more than 1,100 new small schools and 700 improved schools in 40 States and the District of Columbia. An October 11, 2006 article in Education Week described the shift in the Foundation's investment strategy. While the Gates Foundation has, since 2005, shifted the focus of its high school investments to supporting advocacy efforts and improving State and district capacity to implement curricular and instructional reforms, it continues to support the creation of new, small schools. The Annenberg Foundation has provided funds to education reform groups in support of the creation of new small schools, as has the Carnegie Corporation. For example, in 2006 the Annenberg Foundation made a $20 million grant to support small schools in New York City. The SLC program is unique in that it does not provide funds to school districts to create small schools, but rather only to school districts to restructure large high schools into smaller learning communities. 
Recent private investments have focused primarily on the creation of new small schools rather than trying to restructure large high schools since research has indicated that outcomes for students at new small schools are better than those for students in restructured large high schools. While private investments appear to focus largely on creating new small schools, there is evidence that the schools targeted for small school and smaller learning communities reforms overlap.

Evidence: In 2006, the Gates Foundation reported that, since 2000, it had invested about $1.5 billion in efforts to improve high schools, including support for the creation of more than 1,100 new small schools and 700 improved schools in 40 States and the District of Columbia (All Students College-Ready: Findings from the Foundation's Education Work 2000-2006, www.gatesfoundation.org/nr/downloads/ed/researchevaluation/EducationFindings2000-2006.pdf). An October 11, 2006 article in Education Week ("Gates Learns to Think Big") described the shift in the Foundation's investment strategy (http://www.cftexas.org/press/Education%20Week%20Gates%20Article%20101106.pdf); Spreadsheet of 2007 and 2008 Foundation awards (Gates, Annenberg, and Carnegie) and their relationship to SLC awards

NO 0%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: While the program narrowly focuses on implementing smaller learning communities, it has design advantages (e.g., flexibility) that have allowed the Department to focus grant competitions to emphasize improved outcomes for students enrolled in programs funded by the SLC program. Recent competitions have required grant recipients to implement research-based improvements in curriculum and instruction, and also to use program funds to link SLC grants to broader high school reforms through research, technical assistance, and program outreach. While research suggests that creating new small high schools may be more effective and efficient than reforming existing large high schools, that option may not always be financially feasible. There are also many benefits to large high schools that are sacrificed in small high schools. Creating smaller learning environments within large high schools preserves the many advantages and efficiencies of large high schools while creating more personal learning opportunities for students. In the FY07 Notice Inviting Applications, the Department set an absolute priority for "projects that create or expand SLCs that are part of a comprehensive effort to prepare all students to succeed in postsecondary education and careers without need for remediation." The notice includes guidance on what projects must do to meet the requirements of this priority. The Department also set a competitive preference priority for the same competition for projects in school districts with schools identified for improvement, corrective action, or restructuring under ESEA. Both priorities provide direction and guidance to applicants about effective strategies for creating successful smaller learning communities without prescribing the exact reforms to use. Congress has also directed the Department to promote connections between local grants and broader high school reforms.

Evidence: Department SLC grant application notices and other materials; http://www.ed.gov/legislation/FedRegister/announcements/2007-4/112607a.html. House Appropriations Committee reports, beginning in FY 05, have requested an outline of Department plans for the use of SLC national activities funds for research, technical assistance, and program outreach and networking to promote connections between local grants and broader high school reforms.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The program targets funds exclusively to high schools serving at least 1,000 students in grades 9 and higher, which creates a pool of approximately 4,167 eligible high schools. As of March 2008, program grants have been awarded to support the planning and implementation of SLCs in about 30 percent of the eligible high schools. However, many of the eligible schools have not chosen to create smaller learning communities or have already received support from the program. This seemed to indicate that the program had already reached a large proportion of the LEAs with both eligible high schools and a strong commitment to the SLC restructuring strategy. Applications for FY 2003 funds offered some evidence of decreased demand for program funds. Even with two competitions, the pool of fundable applications was insufficient to absorb the available FY 2003 funds. Rather than fund lower-quality applications, the Department lapsed $26.5 million in unused FY 2003 SLC funds. However, the Department has not lapsed funds in subsequent years. Schools funded through SLC grants are larger than the average large high school (median enrollments of 1,926 students vs. 1,613 in large high schools generally) and have a much higher percentage of minority students (median of 63 percent vs. 25 percent). In 2007, 21 percent of the high schools served by active grants (i.e., grants awarded with funds from FY 2003 through 2006) were Title I high schools that had been designated as in need of improvement, corrective action, or restructuring, which is an indication that the program, targeted to serve large high schools, is effectively targeted to the high schools most in need. The addition of the competitive preference priority for schools identified in need of improvement, corrective action, or restructuring, beginning in FY07, has helped to target funds to the schools and districts most in need of them.

Evidence: SLC program competition slates, FY03 - FY07

YES 20%
Section 1 - Program Purpose & Design Score 80%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has adopted long-term measures that meaningfully reflect SLC's purpose. These measures include performance on reading and math assessments, high school graduation rates, and enrollment in postsecondary education or advanced training. The first three measures are consistent with the goals of NCLB. The fourth performance measure is consistent with the job training common measures.

Evidence: ED's Visual Performance Suite (VPS) system Measure 1: The percentage of students scoring at or above proficient on state math assessments. Measure 2: The percentage of students scoring at or above proficient on state reading assessments. Measure 3: The percentage of students in schools receiving Smaller Learning Communities grants who graduate from high school. Measure 4: The percentage of graduates from schools receiving Smaller Learning Communities grants who enroll in postsecondary education, apprenticeships, or advanced training for the semester following graduation. Measure 5: The cost per student demonstrating proficiency or advanced skills in reading. Measure 6: The cost per student demonstrating proficiency or advanced skills in mathematics.

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The Department has set targets for each cohort for four years following each baseline year. Targets for the measures were set to establish high and reasonable expectations for improved performance in these areas, starting from the baseline.

Evidence: ED's VPS system

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The annual performance measures reflect the goals for students participating in smaller learning communities, including students' achievement on State reading and math assessments, graduation rates, and enrollment in postsecondary education and advanced training. Annual targets and data provide regular and frequent information on how students in smaller learning communities are progressing toward the long-term goals.

Evidence: ED's VPS system

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The Department first collected data in 2001 for the antecedent program and has established ambitious targets for four years for each cohort of grantees for the four annual measures (student achievement in reading and math, high school graduation, and enrollment in postsecondary education) and two efficiency measures. Targets for the measures were set to establish high and reasonable expectations for improved performance in these areas, starting from the baseline. Data and targets for three cohorts are displayed.

Evidence: ED's VPS system

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: In the awards resulting from earlier competitions, grantees primarily focused on implementing structural changes and other strategies to create smaller, more personalized educational environments for students. However, projects funded through the most recent competitions stress improvements in curriculum and instruction designed to enhance student achievement, and grantees receive technical assistance to implement the new elements. Technical assistance providers and grantees work closely with Department staff on implementation of smaller learning environments and changes in curriculum and instruction designed to help improve student achievement. Grantees are required to set out annual targets for each of the program's performance measures in their applications. Their success in achieving these goals is considered in evaluating their progress and in identifying grantees that should receive additional technical assistance or more intensive monitoring.

Evidence: Grantee applications; SLC program national activities spending plans; Statements of work for Technical Assistance contracts

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Current evaluations include an implementation evaluation conducted by Abt Associates and released in May 2008, which supports program improvements by identifying barriers to implementation of SLCs in high schools and, with some major limitations, assesses effectiveness by examining achievement and other data in schools both before and after they received SLC funds. The Department used the preliminary results of this evaluation to inform the development and implementation of the FY 2004 and subsequent SLC grant competitions. In addition, the Department's Institute of Education Sciences (IES) is conducting a rigorous study of SLC grantees that will inform efforts to promote smaller learning communities by examining whether two supplemental reading interventions improve reading proficiency in high schools operating 9th-grade "freshman academies" (a popular approach that schools are implementing to create smaller learning communities). With approximately one-third of all students entering and exiting high school with low-level reading skills, there is an urgent need to identify effective interventions to help young people acquire the reading skills they need to succeed in postsecondary education and the workforce. Evaluation findings will help to guide the structure, content, and professional development activities of future smaller learning communities initiatives by providing evidence-based information on which programs can be built, thereby yielding significant future savings in staff time and costs associated with development and implementation. MDRC has conducted rigorous quasi-experimental evaluations of two reform models (not necessarily being implemented within schools with SLC grants) that couple smaller learning communities with improvements in curriculum and instruction. 
For first-time ninth-graders, the Talent Development model produced substantial gains in attendance, academic course credits earned, and promotion rates during the ninth grade year. These impacts on credits earned and promotion rates were sustained in the students' subsequent years in high school, and their likelihood of graduating on-time improved by about 8 percentage points. A 2001 Smaller Learning Communities grant supported the implementation of the Talent Development model in the five Philadelphia high schools that were the subject of this study. The second intervention assessed, First Things First, significantly increased rates of student attendance and graduation, reduced student dropout rates, and improved student performance on State reading and math assessments when it was implemented by one urban district. The implementation of First Things First in five of the Houston schools included in this evaluation was supported by a 2000 SLC implementation grant. In addition, each project must also conduct an independent third-party evaluation to provide information for use in gauging the project's progress and identifying areas for improvement. The Department used the preliminary results of the program evaluation to improve the FY 2004 and subsequent years' grants competitions by, for example, extending the grant period from three to five years based, in part, on the evaluation's finding that cohort 1 grantees needed greater time to implement career academies and other SLCs in the upper grades. IES released the first-year results of its rigorous evaluation of literacy interventions in January 2008. It found that, taken together, the interventions produced a modest, statistically significant impact on reading comprehension. The second-year results will be released in late 2008. 
While a randomized controlled trial (RCT) of the effectiveness of the overall program is not feasible, consistent with OMB guidelines the Department has initiated a series of RCTs to test the effectiveness of different interventions that are commonly used by grantees to improve program performance and efficiency.

Evidence: http://ies.ed.gov/ncee/pubs/20084015.asp; Annual Performance Reports; Quint, Bloom, et al., 2005; Kemple, Herlihy, Smith, 2005.

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Since 2003, the Administration has not requested any funding for this program. Although these requests have been tied to accomplishment of a major policy goal (that of eliminating funding for programs that do not reflect an appropriate Federal role or, for other reasons, are not Administration priorities), they have not been tied to accomplishment of the annual or long-term performance goals. However, the Department's budget submissions show the full cost (including S&E) for all programs.

Evidence: Department of Education Budget Justifications, 2002-2009

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department has made incremental changes in the program competitions each year since 2001 to link structural changes to instructional and other reforms in order to raise students' academic achievement. Also, a decline in fundable applications, which resulted in the program lapsing a significant amount of FY 2003 funds, prompted changes in the program's grant-making and technical assistance strategies. The Department also has invested in improving the transparency and quality of the performance data it collects. In 2002, the Department began to require recipients of implementation grants to use the funds to implement smaller environments that served all students in the school, rather than create one or two smaller learning communities involving only some students. Since FY 2003, ED has required applicants to implement research-based strategies to address the needs of students who enter ninth grade with reading or math skills that are significantly below grade level. Beginning with 2004, competition priorities have focused on changes that not only address the organizational structure of schools, but also focus on making curriculum more rigorous. Beginning with 2005, the Department instituted changes designed to provide grantees with more time and resources to carry out their plans by, for example, extending the project period from three to five years and increasing the award amounts for individual grants to accommodate additional independent evaluation activities and comprehensive strategies and interventions to assist struggling students, both of which are required activities for SLC grantees. 
Based on the findings of the two MDRC reports, the program established new requirements and selection criteria in the FY 2006 and 2007 competitions designed to ensure that grantees implement key elements of the Talent Development and First Things First reform models in their projects, such as offering all students a common, rigorous curriculum that prepares them for postsecondary education and advanced training, and providing teachers with the professional development needed to implement a more rigorous curriculum. The Department has awarded a contract to establish a searchable, web-based database that provides public access to all of the performance data and information it collects from grantees. This database will be available in late 2008. In the interim, the Department has posted summary charts of each grantee's performance online. In addition, the Department has awarded a contract to develop a resource guide that will provide guidance and information to assist grant recipients in improving the quality of data they collect on student participation and success in postsecondary education and in using this information effectively to strengthen the preparation they offer students for postsecondary education. This guide will be available in early 2009.

Evidence: Notices Inviting Applications, FY02 - FY07; MDRC Reports, http://www.mdrc.org/publications/471/overview.html

YES 12%
Section 2 - Strategic Planning Score 88%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department collects and utilizes performance information through a web-based performance reporting system, annual evaluations, and reports from technical assistance contractors. The Department used information from grantees and contractors to structure changes in annual grant competitions in order to help achieve improved program and grantee performance. During FY 2000 through 2002, the program's performance reporting requirements were driven by the needs of the independent evaluation of the program; the evaluator determined the performance data that would be collected. In subsequent years, the program revised its performance indicators to make them more consistent with the requirements of ESEA Title I, as amended by the No Child Left Behind Act, and required grantees to establish specific performance goals for each indicator for each year of the grant period. The Department established, through rule-making, its authority to take action (including terminating the grant) against grantees that failed to meet their performance goals. Few other Department discretionary grant programs now agree upon annual performance goals with grantees, and still fewer have established through rule-making the authority to terminate funds if a grantee fails to meet these goals. Starting in FY 2003, the program also implemented a web-based performance reporting system that reduced burden on grantees and made it easier for program staff to use performance data to manage the program and improve performance.

Evidence: http://www.ed.gov/legislation/FedRegister/finrule/2004-1/031504k.html (Section L, High-Risk Status and Other Enforcement Mechanisms)

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: As part of the President's Management Agenda, the Department has implemented an agency-wide system that links employee performance to progress on strategic planning goals. New performance agreements hold managers accountable for meeting established deadlines for awarding grants and for managing their programs through monitoring, technical assistance, and data collection activities that are designed to focus on raising and measuring performance. Contractors also provide technical assistance to grantees to help improve the quality of project implementation and evaluation, as well as the information submitted in annual performance reports. Performance agreements for Federal managers address the five most important performance objectives specific to their area of responsibility during each rating cycle, including cost, grant scheduling, and performance. Information obtained from reports, conferences, and technical workshops helps the Department set work agendas for technical assistance providers, who are held accountable for work products, especially site visit reports that detail grantees' project implementation status. Grantees are required to submit annual performance and financial reports and information from independent project evaluations.

Evidence: Education Department's Performance Appraisal System (EDPAS)

YES 10%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: Funds are obligated within the time frames set by Department schedules and used for the purposes intended. Funds are awarded promptly and accurately. Prior to funding a grant, program staff evaluate each proposed expenditure to determine its consistency with OMB Circular A-87, Department regulations, and the program statute. Excessive, unreasonable, or unallowable expenditures are reduced or eliminated, as appropriate. Grants in FY08 will be awarded prior to the start of the 2008-2009 school year, giving grant recipients 90 days to plan for the implementation of grant activities at the start of the school year. To ensure that funds are spent for the intended purposes and in a timely way, program staff monitor grantee drawdowns on a quarterly basis, review annual performance reports, and communicate regularly with grantees about project implementation.

Evidence: Department of Education Grant Administration and Payment System (GAPS); FY 2009 Budget Justification; FY 2008 Grant Award Schedule; memorandum recommending funding for FY 2006 grant awards; and program monitoring plan.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The program has established two efficiency measures (cost per student demonstrating proficiency or advanced skills in mathematics and cost per student demonstrating proficiency in reading/English language arts) and has set baselines and annual targets for these measures through FY 2011. The measures track program improvements in achieving successful outcomes for more students per dollar. The program has provided technical assistance to grantees to help them implement their programs more effectively and thus achieve better results for students. The Department has contracted to develop and maintain other tools to assist grantees, such as a database of all SLC grantees. Current grantees and any interested parties may search the database to locate school districts that are implementing reforms of interest; this tool provides a public service in the form of information dissemination. The Department also uses SLC national activities funds to provide technical assistance that helps current grantees achieve better results. The Department has set guidelines for the competitions based on current research and the greatest needs of large high schools; focusing the efforts of grantees should increase their chances of success.

Evidence: ED's VPS system; Notices Inviting Applications; National Activities spending plans; Grantee database, http://slcprogram.ed.gov/

YES 10%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Several formal collaborative arrangements are designed to help achieve research and networking objectives. Through a collaboration between the Department's Institute of Education Sciences and the Office of Elementary and Secondary Education, SLC funds are supporting a 4-year experimental research study that will evaluate the effectiveness of two promising supplemental reading programs implemented within a freshman academy smaller learning community model. As part of this study, the Department conducted a special competition in calendar year 2005 to select school districts with high schools that will implement these reading programs. Findings from the study will be widely disseminated to inform schools that are interested in using literacy interventions in small learning environments. SLC funds also have been provided to IES to identify and describe available supplemental math programs and to investigate the feasibility of a rigorous evaluation of their effectiveness when implemented in smaller learning communities. Also, the SLC program has established a partnership with the Mathematics and Science Partnerships and Title I programs to support a technical assistance institute in May 2008 that will convene State Title I directors, State math curriculum coordinators, members of the National Math Panel, and districts with high schools that receive both SLC and Title I funds. In addition, the Department consults periodically with officials from private foundations that are active in supporting high school improvement activities, including the Bill & Melinda Gates Foundation and the Carnegie Corporation.

Evidence: "The Enhanced Reading Opportunities Study," http://www.mdrc.org/publications/471/overview.html

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of the program. Pre-award review assesses the financial capability of grantees, and auditors have reported no internal control weaknesses. Site visits include a review of grants management issues. The Department has a system for identifying excessive drawdowns of grant funds and can put individual grantees on probation, which requires ED approval of all grantee drawdowns. Monitoring by Department and program staff minimizes the risk of improper payments. In addition, staff monitor grantee obligations and track grantee drawdowns through the Department's Grant Administration and Payment System (GAPS), which tracks the financial activities of a grant from the initial obligation of funds by ED through drawdown and final settlement.

Evidence: The Department's Grant Administration and Payment System (GAPS)

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: Accountability for program outcomes is a key management objective. Toward that end, beginning in FY 2003, the Department began holding grantees accountable for agreed-upon multi-year performance targets for student academic achievement, high school graduation, and placement in postsecondary education and employment. In prior years, grantees merely reported on student outcomes without being held to specific targets. The Department has also expanded monitoring through contractors to promote early identification and resolution of management and implementation issues that may affect a grantee's performance.

Evidence: Application notices and other materials; grantee files; contractor's work plans; Department of Education's VPS system.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: All program grant funds are awarded on a competitive basis and judged on their relative merits. The process includes public notice in the Federal Register, technical assistance workshops to help prospective applicants, and an application review process that utilizes external reviewers with appropriate professional expertise. The highest-scoring applications are selected for awards. National activity funds are awarded competitively by contract. The program conducts extensive outreach to encourage the participation of new grantees. In FY 2008, the program sponsored three regional workshops (Washington, DC; St. Louis, MO; Phoenix, AZ) and one webcast to provide technical assistance to prospective applicants; 259 individuals participated in these events. Continuation awards are made according to the terms described in the notice inviting applications, which states that grantees must demonstrate substantial progress in implementing the goals and objectives established in their approved applications before receiving continuation awards.

Evidence: Notice Inviting Applications for New Awards Using FY 2007 Funds; memorandum recommending funding for FY 2006 grant awards; Department of Education FY 2007 and 2008 Acquisition Plan; Discretionary Grants Handbook

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: Department oversight of grantee activities is accomplished through the program's monitoring and reporting requirements and through site visits to projects by technical assistance providers and program staff. Program staff review the annual performance reports submitted by grantees and maintain telephone contact with project directors throughout the year. Contractors and program staff visit each grantee at least once during the grant period to monitor their progress and to provide technical assistance in resolving impediments to implementation. Grantees that are experiencing significant management problems are targeted for additional technical assistance and more intensive on-site consultations with technical assistance contractors.

Evidence: Annual performance reports; Site visit reports; Contractors' performance reports

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The Department maintains a searchable web-based database of grantees, which allows the public to identify school districts that are implementing specific structures or strategies in their smaller learning communities. In addition, the Department has awarded a contract to establish another searchable, web-based database that provides public access to all of the performance data and information it collects from grantees. This database will be available in late 2008. In the interim, the Department has agreed to post summary charts of each grantee's performance online, although they have not yet been posted.

Evidence: SLC searchable database of grantees: http://slcprogram.ed.gov/; The Department's VPS system

NO 0%
Section 3 - Program Management Score 90%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: While program grantees from three different cohorts have modestly increased the percentage of students scoring at proficient or above on State reading assessments, the program did not meet its GPRA targets for this indicator. Program grantees from the three cohorts have shown small increases in the percentage of students scoring at proficient or above on State math assessments, but the program did not meet its GPRA targets for this indicator either. Since 2003, the program has met its targets in two years for the measure of the percentage of students in participating high schools who graduate from high school. The program has only two years of data for the fourth measure, the percentage of graduates from participating high schools who enroll in postsecondary education, apprenticeships, or advanced training in the semester following graduation; performance has remained flat or declined slightly from year one to year two for all three cohorts of grantees, and the program has just missed its GPRA target for this measure.

Evidence: The Department's VPS system; Data are not yet available for the 06-07 program year (2006 data in VPS), but should be available in September 2008.

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program has met its performance goal for one measure, the percentage of students in participating high schools who graduate from high school, but has not met its annual performance goals for the other three GPRA performance measures.

Evidence: The Department's VPS system; Data are not yet available for the 06-07 program year (2006 data in VPS), but should be available in September 2008.

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The cost (in dollars) per student demonstrating proficiency or advanced skills in reading declined from $595 in 2004 to $543 in 2005, while the cost per student demonstrating proficiency or advanced skills in mathematics declined from $680 in 2004 to $608 in 2005. The numerator of the efficiency measure is the amount of funds expended during the year and the denominator is the number of students who scored proficient in reading or math during that same year. In carrying out the peer review of applications for the FY08 competition, the Department relied more heavily on the E-reader system and conference calls among reviewers in order to reduce the average cost of reviewing an application. The peer review cost per application was $2,471 in FY07 and $2,390 in FY08. Also, the costs for technical assistance workshops are lower in FY08, compared to FY07. The cost savings can be attributed to the Department coordinating with other federal programs to share rented space, thus reducing the costs for space rental. Awarding the grants in two increments (3 years then 2 years) enables the Department to reduce the amount of funds grantees may return to the Treasury. Grantees with large carryover amounts will have their continuation awards reduced. In FY09, the Department will likely use FY08 funds to fund down the slate created in FY08, thus saving the costs of having another peer review.
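The efficiency measure described above is a simple ratio: funds expended during the year divided by the number of students who scored proficient in that subject that year. A minimal sketch of the calculation follows; the input figures are hypothetical (the underlying counts are not published here), chosen only so the ratio matches the reported 2005 reading value of $543.

```python
def cost_per_proficient_student(funds_expended: float, students_proficient: int) -> float:
    """Efficiency measure: program dollars expended per student scoring
    proficient or above on the relevant State assessment that year."""
    if students_proficient <= 0:
        raise ValueError("number of proficient students must be positive")
    return funds_expended / students_proficient

# Hypothetical inputs chosen to reproduce the 2005 reading figure ($543):
print(cost_per_proficient_student(5_430_000, 10_000))  # 543.0
```

A falling value from one year to the next, as reported for 2004 to 2005, means more students reached proficiency per program dollar spent.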

Evidence: The Department's VPS system; Data are not yet available for the 06-07 program year (2006 data in VPS), but should be available in the fall of 2008; Spending reports from the contractors.

SMALL EXTENT 8%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: Although similar to the Gates Foundation initiative, the SLC program has put more emphasis on grantee accountability for improved student achievement and on rigorous evaluation of promising instructional approaches for raising student achievement in smaller learning environments. Currently, no evaluation or data exist to compare the SLC program to investments made by private foundations such as Gates and Annenberg. Because these private foundations are not required to collect or share data on their grantees, a valid comparison to the SLC program is inherently difficult.

Evidence: NA

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Past evaluations of career and freshman academies, the most common approaches used by SLCs, and of reform models that couple SLCs with improvements in curriculum and instruction, have found only modest impacts on student outcomes. A random assignment evaluation, conducted by MDRC, of career academies, a restructuring favored by a majority of grantees, found that these academies increase student earnings after high school but have no impact on student academic achievement, high school completion, or enrollment in postsecondary education. The SLC National Implementation Study, completed in 2005, was a comprehensive, descriptive study of the extent to which the first and second cohorts of grantees met the objectives of the program as they implemented SLCs. The study included an interrupted time series analysis of program outcomes, but the design was not sufficiently rigorous to allow conclusions about the program's impact or effectiveness: the study looked at trends in outcomes before and after receipt of Federal funds, but did not use a comparison group or other means of ensuring that changes were due to the receipt of SLC funds and not extraneous factors. Abt Associates of Cambridge, Massachusetts, an independent third party, performed the evaluation. The study measured the extent to which schools funded in FY 2001 implemented all of the key features of the SLC program by the end of the grant period. Freshman and career academies were classified as high-, moderately-, or low-implementing based on whether each had successfully implemented a specific number of defined features. A majority of the freshman academies created by schools (46 of 58) were found to be high- or moderately-implementing, as were a majority of the career academies (34 of 44).
Data showed only modest changes in academic and behavioral outcomes: no significant changes in student achievement; increases in student extracurricular participation and promotion rates from 9th to 10th grade; and slight decreases in the incidence of school violence, disciplinary action, and alcohol and drug use. The evaluation also noted other trends in outcomes, such as increases in the reported percentages of students taking the SAT and intending to continue to postsecondary education. A rigorous quasi-experimental MDRC study of Talent Development freshman and career academies found substantial increases in the promotion of first-time, ninth-grade students to the tenth grade, and substantial gains in attendance and academic course credits earned. These impacts on credits earned and promotion rates were sustained in the students' subsequent years in high school, and their likelihood of graduating on time improved by about 8 percentage points. Another quasi-experimental evaluation of First Things First by MDRC found that it significantly increased rates of student attendance and graduation, reduced student dropout rates, and improved student performance on State reading and math assessments when it was implemented by one urban district. During the most recent year for which data are available (2005), the rate of postsecondary enrollment of graduates of high schools funded by the program (81.6%) exceeded that of participants in GEAR UP (55.2%), Talent Search (77.8%), and Upward Bound (78.4%). In 2005, 68.6 percent of all high school completers enrolled in postsecondary education immediately following high school graduation, according to NCES. The graduation rate of students attending schools funded by the program (85.2%) is comparable to that of students participating in the GEAR UP program (84.4%).
In 2004, the program's graduation rate (85.98%) exceeded the average graduation rate reported by States for that year in their Consolidated State Performance Reports (National Assessment of Title I).

Evidence: Kemple, Herlihy, and Smith (2005), Making Progress Toward Graduation: Evidence from the Talent Development High School Model; Quint, Bloom, et al. (2005); http://www.mdrc.org/publications/408/full.pdf; http://www.mdrc.org/area_index_1.html; http://www.nces.ed.gov

SMALL EXTENT 8%
Section 4 - Program Results/Accountability Score 16%


Last updated: 09/06/2008