Pretty soon youth services librarians across the country will heave a giant sigh of relief and wrap up their summer reading programs for the year. Beyond soothing over-competitive or under-organized parents, making sure all the vendors are paid, and breaking out the wine and/or chocolate, it’s also a time for evaluation and reporting.
Traditionally we track the following, depending on the structure of the program:
Number of registrations
Number of participants reaching a pre-determined completion point
Circulation (You are including this in your SRP stats, right?)
This is all important information, and numbers are essential when telling our story and advocating for our programs and services to administration and outside funding partners.
However, given the wide use of Evanced and other electronic registration systems, there are some other numbers that can help us evaluate our offerings and possibly grow participation in the future. If you didn’t collect this information this year, don’t worry; it will be time to plan next year’s program before you know it. (Sorry.)
New Participants vs. Repeat Registrations
Are you reaching your entire community or are you preaching to the choir? Of course we love our regulars, but those are the families who would likely read over the summer anyway. If we really want to make a difference we want to reach out to families that might need the incentives, whatever they might be, to make reading a priority over the summer.
One way to see if this is happening is to count the number of kids registered this year who have never participated in summer reading before. It’s true you would need data from previous years for this, unless you asked up front whether this is their first time, but it might be worth it.
While you are doing that, look for other patterns in new registrations, such as school affiliation or location, which we’ll talk more about below.
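If your registration system lets you export a list of registrant IDs (or name-and-birthdate pairs) for each year, the new-versus-repeat count is a simple set comparison. Here is a minimal sketch; the IDs and years below are invented for illustration, and you would substitute your own exports.

```python
# Made-up registrant IDs from prior years' summer reading programs.
previous_years = {
    2022: {"a101", "a102", "a103"},
    2023: {"a102", "a104"},
}
# Made-up registrant IDs from this year's program.
this_year = {"a101", "a104", "a105", "a106"}

# Anyone seen in any previous year is a repeat registration.
everyone_before = set().union(*previous_years.values())
returning = this_year & everyone_before
new_participants = this_year - everyone_before

print(f"Total registered:     {len(this_year)}")
print(f"New participants:     {len(new_participants)}")
print(f"Repeat registrations: {len(returning)}")
```

The same comparison works on spreadsheet exports: deduplicate on whatever identifier your system keeps stable from year to year.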
Age Groups
Usually we keep stats in two or three broad age groups such as preschool, school-age, and teen. Yet it might be instructive to look a little deeper at these numbers. If you run your program from birth to high school, are you actually getting participation from all of those age groups? Looking for more detail will let you see who you are and aren’t reaching.
Do you need to modify your teen program to appeal to high school students? Do you need better publicity to get the word out to parents that their littlest ones can participate as well? Do fifth or sixth graders participate in your kids’ program or is it aimed too young? You won’t know without the numbers.
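If ages or grades are in your registration export, getting the finer breakdown is just a matter of bucketing and counting. A quick sketch, with made-up ages and age bands you would adjust to match your own program’s structure:

```python
from collections import Counter

# Made-up registrant ages pulled from a registration export.
ages = [2, 4, 4, 6, 7, 8, 9, 10, 11, 13, 16]

def age_band(age):
    """Assign an age to a finer band than preschool/school-age/teen."""
    if age < 5:
        return "0-4 (preschool)"
    elif age <= 8:
        return "5-8 (early elementary)"
    elif age <= 11:
        return "9-11 (upper elementary)"
    elif age <= 13:
        return "12-13 (middle school)"
    return "14-18 (high school)"

counts = Counter(age_band(a) for a in ages)
for band, n in sorted(counts.items()):
    print(f"{band}: {n}")
```

A table like this makes gaps obvious at a glance: a band with one or two registrants out of hundreds is a group your publicity or program design isn’t reaching.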
School and Location Information
For libraries serving larger populations this can be very interesting, especially when these numbers don’t line up along socio-economic or geographic boundaries.
What schools have the highest participation compared to enrollment? The lowest? Gains or losses from previous years? It might be wise to reach out to schools in the fall both to encourage and thank those schools with high participation and to see how we might better reach schools with low participation. Sometimes schools can help us overcome tricky problems like providing translations of program materials or contacts for summer programs where we might reach children who can’t get to the library.
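Comparing registrations to enrollment is simple division, assuming you ask for a school name at sign-up and can get enrollment figures from the district. A sketch with invented numbers:

```python
# Made-up registration counts by school, from the sign-up form.
registrations = {"Lincoln Elementary": 120, "Washington Elementary": 45}
# Made-up enrollment figures, e.g. from district or state report cards.
enrollment = {"Lincoln Elementary": 400, "Washington Elementary": 450}

# Participation rate per school; schools with zero registrations still show up.
rates = {
    school: registrations.get(school, 0) / enrollment[school]
    for school in enrollment
}
for school, rate in sorted(rates.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{school}: {rate:.0%} of enrolled students registered")
```

Rates are more telling than raw counts here: a big school can top the registration list while a small school quietly outperforms it in percentage terms.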
What branches have large gains or losses in registration? There may be new staff, schools that have opened or closed, changes in school staff, or competing programs that influence these numbers. How can underperforming branches plan for next year to improve their results? How can the library as a whole support both these branches and ones that are overwhelmed with participation?
Obviously there are tons of factors to consider when evaluating your summer reading program, and we know that the raw numbers only tell part of the story. Still, it’s worth our time to get the whole story the numbers can provide. Now, go dig into that chocolate. You’ve earned it this summer.