Three Resources for Documenting Your Impact

The 2012-2013 school year is upon us!  One of my goals this year is to improve my methods of assessment.  School librarians are often so busy teaching that they forget to assess the learning taking place.  Can you imagine a classroom teacher NOT assessing student learning?

I hope that the three resources I have gathered for this post are as helpful/motivational to you as they have been to me.

  • “Assessing Learning: The Missing Piece in Instruction?” by Violet H. Harada and Joan M. Yoshina is an article I reread often.  It motivates me to keep looking for ways to assess learning in my library.
  • Ross Todd’s “The Evidence-Based Manifesto for School Librarians” summarizes much of the discussion at the 2007 Leadership Summit sponsored by School Library Journal.  He discusses evidence-based practice as it applies to school libraries, shares multiple types of evidence we can be collecting, and provides questions to guide us as we consider student outcomes and how we can share the good news.
  • The University of South Carolina School of Library and Information Science’s SLMImpact wiki.  This awesome wiki includes tools, resources, strategies, and additional readings for proving your library’s impact.  Many of the links on the Tools page provide ways to assess student learning.

How do you assess student learning in your school library?

What Speed Do You Read?

ereader test
Source: Staples eReader Department

Click on the image above to take this simple reading test to determine your reading speed and how it compares to the national average.

(Be warned:  read for comprehension!)

Food for Thought: Data Collection and Analysis

Last month, in an effort to improve my own practice, I studied monthly reports from several high school libraries around the country.  I found many outstanding examples that assisted me in creating a new format for my monthly reports.  (I reported on this process here and here.)

I am still pleased with the transformation of my monthly report, but….

New and Improved?

This wonderfully relaxing five-day holiday has provided time to further reflect on the data I collect to share with the school community.  School librarians understand the value of statistics and measure such things as student and class visits to the library as well as items circulated.  These numbers are the backbone of many school library monthly (and yearly) reports.  In the past, library resource and facility usage equaled proof of the necessity of a school library.

However, the more I contemplate this, the more I am convinced that these statistics are not enough to prove the need for a school library program.

In my annual report, I include statistics on the number of pages read by those participating in our voluntary reading program.  But even that does not provide proof positive that my program is impacting student achievement.

Using Collaboration Data

Pam Harland of Plymouth Regional High School includes a Monthly Collaboration Highlight table in her report.  She indicates five levels of collaboration in her table, ranging from merely scheduling classes to teaching an information literacy skill or concept and planning a unit with teachers.

Armed with her monthly reports and test data from the New England Common Assessment Program (NECAP), Pam could correlate data indicating her program’s impact on student achievement.  Having taught only in South Carolina, I do not have specific knowledge of the NECAP, but if it is similar to our High School Assessment Program (HSAP), then it tests students’ research skills.

Extracting the research skills data from test reports of students who benefited from an information literacy skill library lesson would be time consuming and tedious, but it could be done. However, school librarians do not have to await state test results to obtain proof of their program’s impact.

“Our Instruction DOES Matter!”

Sara Poinier and Jennifer Alevy**, teacher librarians at Horizon High School in Thornton, Colorado, successfully proved their program impacts student achievement.  In “Our Instruction DOES Matter! Data Collected from Students’ Works Cited Speaks Volumes” (Teacher Librarian, February 2010, p. 38-39)* they share that success.  Partnering with health classes, they spoke with the students about available reliable resources and demonstrated how to create citations and Works Cited pages. When the students had finished their reports, the teachers shared the Works Cited pages with the teacher librarians.

Sara and Jennifer also collected a class set of Works Cited pages from a science class that did not receive library instruction.  Then they began to analyze the papers and gathered data concerning the types of resources students had used as well as the format of the Works Cited pages.  When the dust settled, these ladies proved their instruction made a difference in student achievement.
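The kind of tally Sara and Jennifer performed can be sketched in a few lines of Python.  The source-type labels and counts below are hypothetical, standing in for the hand-coded data from each class’s Works Cited pages: the “instructed” list represents classes that received library instruction and the “control” list the class that did not.

```python
from collections import Counter

# Hypothetical source-type labels coded by hand from each citation
# on the collected Works Cited pages.
instructed = ["database", "database", "book", "website", "database", "book"]
control = ["website", "website", "website", "database", "website"]

def source_mix(citations):
    """Return the share of each source type as a percentage of all citations."""
    counts = Counter(citations)
    total = len(citations)
    return {kind: round(100 * n / total, 1) for kind, n in counts.items()}

print("Instructed classes:", source_mix(instructed))
print("Control class:     ", source_mix(control))
```

Comparing the two resulting percentage breakdowns makes the difference in resource quality easy to see and to graph for a report.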

How Can You Measure Your Impact on Student Achievement?

We know that our programs increase student achievement, but being able to provide data that demonstrates it can be powerful.  What suggestions do you have for measuring your program’s impact?

I’m starting small.  Tomorrow, two English classes are coming in for brief instruction before they begin researching aspects of the Roaring Twenties.  I’ll ask them to complete a Google form and use the feedback to help us as we plan library instruction for next semester.


*You’ll want to read this excellent article for more details on their accomplishment.  I do not subscribe to Teacher Librarian but was able to locate the article through SC DISCUS, the databases our state library helps provide.  You might be able to locate it through databases in your own school or public library.

**Jennifer Alevy is now a teacher librarian at Northglen High School.

Image attribution: “Clementine” by ilmungo

Worth the Effort!

October 2010 report

My last post concerned revamping my monthly report.  The previous format I was using left a LOT to be desired.  It was useful for quickly compiling data to submit to my principal.  Period.  Ever heard the saying that goes something like “You get out of it what you put into it”?  It definitely applies here.

I put more into October’s report.  Not just more data.  More thought, more time, more effort.  October’s report does more than present data; it analyzes data.  When I finished the report (I can’t say “completed” the report because there is actually more that I wanted to add), I found that I was using it to analyze my collection’s cost effectiveness.

I finally met with my principal on Friday to share the report.  I had been anxious to see his reaction to the new format and discuss how I was using it to inform my practice.

I was confident that the report was superior to any other monthly report I had created but wasn’t prepared to be overwhelmed by his response.  After just a few minutes of discussion, he picked up the phone and asked our assistant principal in charge of curriculum to join us (this was a first).  She found the graph illustrating each department’s usage of our facilities and resources informative and requested that I create a larger copy to be placed on our school’s Data Wall.

Then they discussed sharing this information with department heads this week and accompanied me to the library for a show and tell – identifying the Dewey sections each department would find useful.


All this because of one little monthly report.  Ladies and gentlemen, it was worth the extra effort!

Just a side note:  I remove pictures of students before I post reports online.  The five photos this month put a “face” on my program – reminding the reader that it is all about students.

Gearing Up for the New Year: How Do You Assess Students?

As you prepare to begin a new school year, consider starting an “Advocacy” file on your computer.  Include links to resources (see the Advocacy page of this blog) that can assist you as you plan your advocacy strategy for the year.

We often refer to studies conducted by Lance, Todd, Baumbach, or others as we explain the need for school library programs.  But in bleak economic times, statistics from a study conducted years ago in another area (studies have been conducted in Ontario, Canada, and 18 states) aren’t going to provide the support you need to prove YOUR program is making a difference.

Gather Evidence

How do you assess student learning in your media center?  If you have only used observation in the past, plan to gather concrete evidence this year. Add this evidence to your Advocacy file and include information from it in each and every meeting you have with your principal.  Plan on sharing your monthly reports with your superintendent and your school board.

There’s Strength in Numbers

In a March post, I shared the above presentation created in Google Docs and asked readers how they assessed student learning in their media centers.  Two school librarians responded, but only Joquetta Johnson of Milford Mill Academy in Baltimore, Maryland, added information to the presentation.

I have met many awesome school librarians at conferences and online and know they use a variety of methods to assess learning.  I hope that some of them are reading this and will add to the presentation, allowing us all to benefit as we face one of the toughest years yet in education.

Methods to Assess Learning in the Library Media Center

Are You Ready to Rumble?

Is your school library program worth fighting for?  Better yet, is it worth acting now to prevent a fight for its preservation?

Folks, you REALLY don’t want to see an angry librarian.  If you thought he was angry when he had to straighten the display you knocked over in your hasty exit to avoid being tardy for class, think again. If you thought she was angry when you couldn’t find that overdue book in your locker or bookbag or room, think again.  If you thought he was going to hit the ceiling when you used a proxy to get around the Internet filter, think again.

Try telling the librarian that her budget, program, or job is being cut.  An angry librarian, a truly angry librarian, is not a pretty sight.  However, there is hope.

Preventing the Angry Librarian Population from Growing
A good friend and fellow school librarian, Heather Loy, has written an excellent post challenging school librarians to answer some tough questions in these times of budget cuts.  By arming ourselves with evidence to support our answers, we might just be able to save our programs.

  • Will my principal fight for me – if he’s given the opportunity?
  • Have I given him reason enough to fight for me/my program – have I had an impact on student achievement and learning?  If so, how?
  • In this Internet age, why am I still relevant?
  • Also, how are the other media specialists in my district perceived?  Will their actions/inaction reflect back on me positively or negatively?


She admits she also needs to answer these questions and then says, “I need to document and advocate for how I and my program are essential to my students and school.”

Methods to Assess Learning in the Library Media Center

School library programs across the United States are on the chopping block.  How can I ensure that my program won’t be one of them?  I need to gather evidence that my program adds value to our students’ educational experience and helps them to become information literate.

I’ve been a fan of Tom Barrett’s “Interesting Ways” presentations for some time.  Instead of creating wikis where educators collaborate to build shared knowledge, he has created presentations in Google Docs and invited others to add ideas.  The visual aspect of these presentations and their format (each idea is limited to one slide) is refreshing.

Why not use the same method to cull assessment ideas for library media centers?

Unfortunately, my blogging platform will not allow me to embed a Google Docs presentation in my blog, so I have included a screen shot.  I started the presentation with three methods for assessment in the media center and hope other school librarians will contribute to the presentation.

To view the presentation, please go to

Please share this presentation with other school librarians so that we can all benefit from our shared knowledge and practices.  Email it, tweet it, Facebook it, Diigo it, Delicious it, blog it….the phrase “the more, the merrier” certainly fits here.

Photo used under a Creative Commons License

Attribution:  “untitled”