
Report Card 2010


Methodology

Summary 

 

  • Selection: The College Sustainability Report Card 2010 evaluates environmental sustainability efforts at 332 schools in the United States and Canada.
  • Survey Composition: Four surveys were designed to gather information about sustainability in campus operations, dining services, endowment investment practices and student activities. This year, for the first time, the complete survey responses are available online.
  • Data Collection & Verification: Data collection for the College Sustainability Report Card 2010 took place from June through August 2009. The research process included sending surveys to administrators and students at all 332 institutions. Researchers also gathered information from publicly available sources.
  • Assessment: A school's overall grade is calculated from the grades received in nine equally-weighted categories. A total of 48 indicators are used to evaluate performance within the categories.
  • Recognition: The Overall College Sustainability Leaders award is given to schools that have made notable achievements in sustainability by earning an overall grade of "A-" or higher.

 

School Selection

The College Sustainability Report Card 2010 evaluates environmental sustainability efforts at 332 schools in the United States and Canada.

 

Schools were selected from the 2007 NACUBO Endowment Study and from additional independent research on endowment size. Published in January 2008 by the National Association of College and University Business Officers, the 2007 NACUBO Endowment Study provides data about college and university endowments. Starting with the Report Card 2009, we went beyond the NACUBO Endowment Study and searched public records to develop a more comprehensive list of schools that met our criteria of having approximately $160 million or more in endowment assets. As a result of this search, 8 schools in the United States and 11 schools in Canada were added to the Report Card.  The Report Card does not assess institutions limited to a single, specialized field of graduate or professional study, institutions that do not have traditional campus facilities, or institutions that share endowments with primary or secondary schools.

 

For the first time, schools were invited to apply for inclusion in the Report Card.  In total, 32 schools were added to the Report Card 2010 through this application process. A $700 contribution was requested of each school that applied, to cover the costs associated with additional research and analysis. Grant assistance was offered to schools with demonstrated need.

 


 

Survey Composition

Four surveys were designed to gather information about sustainability in campus operations, dining services, endowment investment practices and student activities. This year, for the first time, the full versions of completed surveys are available online.

 

In order to compile and publish a comprehensive resource and accurate assessment of sustainability in higher education, the Sustainable Endowments Institute is continually working to improve the tools we use in our research process. A key advance in the research process for the Report Card 2010 was the development of new surveys to facilitate the collection of more detailed and quantitative information.   

 

In the spring of 2009, the Institute surveyed hundreds of college and university administrators, faculty, and staff to obtain feedback about the Report Card research process in previous years, as well as suggestions for improving the three main surveys. We received nearly 300 responses from school administrators, and their feedback shaped the changes to the surveys.

 

Completed surveys from previous years were closely examined to find patterns in content and format. Schools' sustainability websites also provided insight into the information that is commonly reported and in demand. Corresponding questions were composed to assess the existence and comprehensiveness of sustainability programs, and survey questions requested information in the formats that were clearest and most convenient for respondents. To address the evolving nature of sustainability in higher education, we researched new and growing trends (e.g., trayless dining and composting in residence halls) and added corresponding questions to the surveys.

 

Because of the breadth and variety of organizations working to assess environmental sustainability, survey questions mentioned only some of the most recognized and reliable organizations and rating systems by name. These included the United States Green Building Council's LEED rating system for green building, the EPA's Energy Star rating for buildings and appliances, the Fair Trade Certified label for items grown and processed under humane labor conditions, and the Marine Stewardship Council and Monterey Bay Aquarium Seafood Watch standards for sustainable seafood harvesting, among others. A comprehensive list of the organizations and rating systems mentioned, along with links to their websites, can be found on our resources page.

 

In order to provide as much information as possible about each school's sustainability initiatives, the full versions of completed surveys are now available on the website.

 


 

Data Collection & Verification

Data collection for the Report Card 2010 took place from June through August 2009. The research process for the Administration, Climate Change & Energy, Green Building, Food & Recycling, Student Involvement, and Transportation categories included sending four surveys to administrators and students at each participating institution. The research team also gathered information from publicly available sources.  

 

We sent Campus, Dining, and Endowment surveys to verified contacts at every campus.  In the spring of 2009, the Report Card 2010 schools were asked to confirm the appropriate recipients for this year's Campus, Dining, and Endowment surveys. Contacts at more than 90 percent of schools were verified through this effort. In early June, blank versions of the three surveys were sent to the confirmed contacts at each school. If a school did not respond to SEI's previous attempt to verify contacts, the campus survey was sent to the president or another upper-level administrator, and the separate dining survey was sent to the dining services director or equivalent administrator.  

 

For the three endowment-related sections (Endowment Transparency, Investment Priorities, and Shareholder Engagement), a multiple-choice survey was sent via email to officials at each school whose duties pertain to endowment management. Typically, this individual was a chief investment officer, chief financial officer, vice president for investments, vice president for finance, director of investments, or another person with similar responsibilities. Most schools confirmed the appropriate recipient for the endowment survey during the spring outreach initiative mentioned above. The same follow-up process used for the campus survey was also carried out to confirm receipt of the endowment survey.

 

In order to gather information about student-led environmental efforts, a new student survey was sent to student leaders of environmental and sustainability organizations on campus. Our research team searched for student contact information using publicly available sources. Because student contact information was not publicly available in some cases, the student survey was also sent to the campus survey contact, with a request to forward it to the appropriate students.

 

We conducted preliminary research for the school profiles by gathering publicly available information about campus sustainability management. The campus and dining surveys were then partially filled out with information from our independent research. Sources included schools' websites, media coverage, and information from the United States Environmental Protection Agency (EPA) and the United States Green Building Council (USGBC). Data from both the public and members-only sections of the Association for the Advancement of Sustainability in Higher Education (AASHE) website were also used in our preliminary research.

 

Institutional demographic information, such as student enrollment and location, presented on each school's profile page and in search results, comes from the school or from the Carnegie Classification of Institutions of Higher Education.  The U.S. Department of Education's Integrated Postsecondary Education Data System (IPEDS) was also consulted for data on administrative staff levels.  In cases where these sources did not provide sufficient data, as in the case of the 17 Canadian universities included in the Report Card 2010, school websites and/or school administrators provided the necessary demographic information.

 

If school officials did not respond promptly to the Campus, Dining, or Endowment surveys, SEI made additional attempts to contact each school. At least two additional follow-up emails were sent and at least three phone calls were made to each contact. The Student surveys were also sent again to the student contacts as a follow-up. In total, 296 of the 332 schools (89 percent) responded to the campus survey, and 295 of the 326 schools with dining programs (90 percent) responded to the dining survey. We received at least one student survey from 248 of the 332 schools (75 percent).

 

Because little information is publicly available on endowments, SEI was unable to conduct the same type of initial background research for endowments as was employed for the campus survey. Responses were received from 265 of the 324 schools (82 percent) after following up by both phone and email.

 

We used publicly available information to construct profiles of schools that did not respond to the survey. If more recent information was not provided in response to the surveys, the research team used previously collected information, including relevant survey responses from the Report Card 2009 and findings from publicly available sources, to create institutional profiles.

Before publication, we sent a verification email with draft findings, consisting of the nine category paragraphs, to all school administrators who completed surveys, as well as to administrators who had either not responded or had declined participation. The email was designed to allow administrators to flag out-of-date or otherwise inaccurate data before it was published and graded. This also served as an important step in ensuring the accuracy of data for the small group of schools that chose not to respond to one or more of our surveys.

 


 

Profile Composition and Assessment

Among the potential formats for presenting research findings, the system of assigning letter grades was considered appropriate for educational institutions. Schools' overall grades are based on cumulative points awarded for 48 sustainability indicators, which are distributed among nine equally-weighted categories. When appropriate, school size, endowment size, and geographic setting are taken into consideration in the assessment process.

 

SEI was careful to avoid potential bias or conflicts of interest in the research and assessment processes. Members of the research team were assigned to schools with which they had no current or previous affiliation, and each school’s complete information was reviewed by at least two evaluators.  

 

All information received was included in the assessment process. Some schools submitted extensive and detailed responses to our surveys. Due to space limitations, SEI edited this information to fit within the profile format. Each profile paragraph was composed to include as much information as possible and to highlight recent and noteworthy achievements. In order to provide as much information as possible about schools' sustainability initiatives, the full versions of completed surveys for the Report Card 2010 are available on the website, as links from each school's profile and directly from the surveys section.

 

Based on thorough research of best practices in sustainability in higher education, specifically concerning campus operations and endowment policies, 48 indicators were chosen for the assessment process. While these indicators take a broad range of policies and programs into consideration, they do not encompass all college and university sustainability efforts, nor do they include teaching, research, or other academic aspects concerning sustainability. The 48 indicators were then grouped into the nine categories. The maximum number of points that can be earned for a single indicator varies based on the indicator's impact on overall campus sustainability and relative importance compared to other indicators within the category.

 

Category grades were calculated based on the total number of points earned for the indicators within the category. To simplify grading, only full letter grades (i.e., no plus or minus) of A, B, C, D, and F were used for the individual categories. To receive an "A" in any category, a school needed to accumulate at least 70 percent of total available points for the indicators in that category. At least 50 percent of available points were necessary to receive a "B," 30 percent of available points for a "C," and 10 percent of available points for a "D." No school received a "D" or “F” in the Investment Priorities category because all schools were awarded a minimum grade of “C” for aiming to optimize investment return.
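The thresholds above can be expressed as a small helper. This is an illustrative sketch of the stated cutoffs, not SEI's actual grading code:

```python
def category_grade(points_earned: float, points_available: float) -> str:
    """Map a school's share of available category points to a letter grade,
    using the thresholds stated in the text (70/50/30/10 percent)."""
    share = points_earned / points_available
    if share >= 0.70:
        return "A"
    elif share >= 0.50:
        return "B"
    elif share >= 0.30:
        return "C"
    elif share >= 0.10:
        return "D"
    return "F"
```

For example, a school earning 35 of 100 available points in a category would receive a "C."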

 

The nine equally-weighted category grades were totaled to calculate an overall grade point average (GPA) on a 4.0 scale (where A = 4, B = 3, C = 2, D = 1, and F = 0). The GPA was then translated into an overall sustainability grade, ranging from “A” to “F,” using a standard grading scale.
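The GPA step can be sketched as follows, with the grade scale taken directly from the text (the helper name is hypothetical):

```python
# Grade scale from the text: A = 4, B = 3, C = 2, D = 1, F = 0
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def overall_gpa(category_grades: list) -> float:
    """Average the nine equally weighted category grades on a 4.0 scale."""
    points = [GRADE_POINTS[g] for g in category_grades]
    return sum(points) / len(points)
```

A school earning straight "A" category grades would have a GPA of 4.0, which a standard grading scale would translate back into an overall "A."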

 

The research team employed a variety of techniques to quantify information. A tracking system was created to notate data received in survey format or collected through independent research. We created clear and specific guidelines for the research team to use when interpreting data to ensure generalizable and universal numerical interpretation. Within each indicator, points were awarded based on algorithms that either operate independent of other schools (in the case of binary, incremental, and qualitative questions) or award points based on relative performance to other schools on scale-based questions. Final grades were then reviewed for accuracy.

 

Binary grading

For some of the data we used a binary grading scale. Points were awarded for the existence of certain programs, while no points were awarded when those programs were absent.

 

Incremental grading

In an effort to recognize the varying levels of complexity and stages of development of sustainability programs on different campuses, in some cases the grading process allowed points to be awarded for partial existence of programs. In other situations, we broke credit for a given program or indicator into several pieces in order to award partial credit for different aspects of a program. For example, for the survey questions related to the Green Purchasing indicator of the Administration category, we asked whether a campus buys "none," "some," or "all" of a specific product. If "some" of the product was purchased, the school received partial credit for that piece of data.

 

One potential problem with this system is that the word "some" encompasses a wide range: a school that purchases 30 percent of its paper products from environmentally preferable sources receives the same credit as a school that purchases 80 percent. However, in many cases, the only schools that purchase "all" of a product from sustainable sources have made a strict commitment to do so. It is this additional level of dedication that earns these schools full credit for this piece of data.
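As a minimal illustration of this none/some/all scheme (the 0, 0.5, and 1.0 weights are assumptions for the sketch, not SEI's published values):

```python
# Hypothetical partial-credit weights for a "none"/"some"/"all" survey response.
PURCHASE_CREDIT = {"none": 0.0, "some": 0.5, "all": 1.0}

def purchasing_points(response: str, max_points: float) -> float:
    """Award full, partial, or no credit for a green-purchasing response."""
    return PURCHASE_CREDIT[response] * max_points
```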

 

Scale grading

In order to award points most fairly for the various levels of complexity of sustainability programs, a scale-based grading format was used in all applicable cases. This type of system was used often, mainly for quantitative data (e.g., the percentage of the annual food budget spent on local food items or the ratio of sustainability staff to all administrative staff). Points were awarded based on a school's rank relative to other responding schools.
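A minimal sketch of rank-based scale grading, assuming points are awarded in proportion to the fraction of responding schools a value exceeds (the proportional mapping is an assumption; the text specifies only that points follow relative rank):

```python
def scale_points(value: float, all_values: list, max_points: float) -> float:
    """Award points in proportion to the fraction of responding schools
    whose value this school's value exceeds (assumed rank-to-points rule)."""
    below = sum(1 for v in all_values if v < value)
    return max_points * below / len(all_values)
```

For example, with responses of 10, 20, 30, and 40 percent and 10 points available, a school reporting 30 percent exceeds half of the respondents and would earn 5 points.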

 

Grading qualitative data

Certain pieces of data were collected in a qualitative format (e.g., descriptions of projects worked on by an administrative sustainability advisory committee). In order to quantify this information, we often used notation indicating comprehensiveness. Guidelines were created for notating qualitative data, including examples of programs at different levels of comprehensiveness. Grades on qualitative data were checked multiple times by different researchers to minimize variability and personal bias.

 

Not Applicable (N/A)

Every campus has different limitations and in some cases certain sustainability programs and commitments are not applicable to a school. Examples of this include:

 

- It may not benefit an urban campus to run a campus shuttle when city-wide public transportation frequently serves the campus.

 

- Canadian schools cannot sign the ACUPCC or earn the Energy Star label for buildings because these programs are specific to the United States.

 

- A community college may not have food services on the campus.

 

- When an endowment is entirely invested in mutual funds and commingled funds the school is unable to vote proxies.

 

- For schools with smaller endowments, certain endowment-related indicators did not apply; the threshold for evaluation this year was $16 million, one-tenth of the lowest endowment size among schools included in the Report Card on the basis of endowment.

 

In these situations, the indicator is marked Not Applicable, and the points normally available for that piece of data are not factored into the calculation of the category grade. When an entire category is Not Applicable for a school, the overall grade is weighted evenly across the remaining categories.
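The reweighting can be sketched by simply excluding Not Applicable categories from the average (an illustrative helper; here `None` marks an N/A category):

```python
# Grade scale from the text: A = 4, B = 3, C = 2, D = 1, F = 0
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def overall_grade_points(category_grades: list) -> float:
    """Average category grade points, weighting evenly across only the
    applicable categories (None = Not Applicable)."""
    applicable = [GRADE_POINTS[g] for g in category_grades if g is not None]
    return sum(applicable) / len(applicable)
```

A school graded "A" and "B" in two applicable categories, with a third category Not Applicable, would average 3.5 rather than having the N/A category count as a zero.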

 

In addition to the general points available for each indicator, extra credit points could be earned for innovative and comprehensive programs. This "extra credit" system, new to the assessment process this year, rewards particularly innovative and successful programs. Extra credit was made available only for indicators for which we found, through statistical and qualitative analysis, a wide range of responses, including some rare and highly noteworthy examples. This system allowed schools that may be focusing on, and excelling in, one area within a category to receive appropriate credit within that category for their efforts.

 

The size of a school's student body, its total building square footage, its endowment size, and its geographic location were taken into account when assigning points. While there is a high degree of geographic and size diversity among the schools in the Report Card, many of the best practices can be applied to all colleges and universities, be they large or small, public or private. In the research and grading, factors that might be primarily attributed to size or geographic location were taken into account and those categories were graded accordingly.

To ensure that we do not unfairly treat schools with relatively small endowments, we adjust our performance metrics as follows: We examine whether a given performance metric is strongly, and positively, predicted by schools' endowments per student, using a linear regression model. For performance metrics exhibiting this  strong correlation (t-value greater than 2), we adjust scores for schools with positive residuals in the performance-endowment per student regression. That is, schools with performance greater than would be predicted based on their endowment per student receive an incremental boost to their performance score. The size of the boost is a fraction of a single standard deviation of the performance metric, with the fraction equal to one minus the percentile of the school in terms of endowment per student.
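The adjustment described above can be sketched in plain Python. This is an illustrative reconstruction under the stated rules, not SEI's actual code; the function name and the ordinary-least-squares details are assumptions:

```python
import math

def adjust_scores(scores, endow_per_student):
    """Boost scores of schools whose performance exceeds what their
    endowment per student would predict (sketch of the adjustment above)."""
    n = len(scores)
    mx = sum(endow_per_student) / n
    my = sum(scores) / n
    sxx = sum((x - mx) ** 2 for x in endow_per_student)
    sxy = sum((x - mx) * (y - my) for x, y in zip(endow_per_student, scores))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x)
             for x, y in zip(endow_per_student, scores)]
    # t-statistic of the slope: adjust only when endowment per student
    # strongly and positively predicts the performance metric (t > 2)
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    strongly_predicted = slope / se > 2 if se > 0 else slope > 0
    if not strongly_predicted:
        return list(scores)
    # percentile of each school by endowment per student, in [0, 1]
    order = sorted(range(n), key=lambda i: endow_per_student[i])
    pct = [0.0] * n
    for rank, i in enumerate(order):
        pct[i] = rank / (n - 1)
    # boost = (1 - percentile) * one standard deviation of the metric,
    # applied only to schools with positive residuals
    sd = math.sqrt(sum((y - my) ** 2 for y in scores) / (n - 1))
    return [y + (1 - p) * sd if r > 0 else y
            for y, r, p in zip(scores, resid, pct)]
```

Under this rule, a small-endowment school far above the regression line receives the largest boost, while the wealthiest school (endowment percentile 1.0) receives no boost even with a positive residual.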

 

As sustainability in higher education has evolved, our grading methodology has shifted accordingly. Thus, a comparison of grades in the 2007, 2008, 2009, and 2010 editions of The College Sustainability Report Card can provide only a general way to track progress among schools, recognizing that the basis for evaluation changed somewhat each year. The Report Card has aimed to remain as current as possible with the advancement of campus sustainability. For this reason, we have continually updated the specific sustainability initiatives assessed, making the Report Card an optimal tool for cross-school comparisons within any given year.

 


 

Recognition

The Sustainability Leaders awards are given to schools that have made notable achievements in sustainability.

 

Schools that receive a score of A- or higher are designated as Overall College Sustainability Leaders.  

In order to recognize schools that have made notable achievements in various areas of sustainability, the Sustainable Endowments Institute designates schools that are taking particularly significant action as Sustainability Leaders. In the Report Card 2010, 26 out of 332 schools earned a cumulative grade average of "A-" or higher to qualify as Overall College Sustainability Leaders.

 

The Institute also recognizes schools that earn high overall scores in each of the categories. Campus Sustainability Leaders have received an average grade of “A-” or better on all six campus operations categories (Administration, Climate Change & Energy, Food & Recycling, Green Building, Student Involvement, and Transportation); Endowment Sustainability Leaders have received an average grade of “A-” or better on the three endowment management categories (Endowment Transparency, Investment Priorities, and Shareholder Engagement). In addition, we have recognized all schools that received an “A” grade in a specific category with the corresponding leadership designation (e.g. Green Building Leader, Transportation Leader, etc.).
