Library Assessment Report for Academic Year 2015-2016

Institutional Effectiveness Functions: As College leaders in information technology and information literacy, the Library connects learners with:
- Resources that are both selected and organized to be easily discovered and meaningfully incorporated into the learning process.
- Assistance, guidance and instruction in the use of information sources and learning technologies.
- Online and physical spaces where they can pursue learning.

I. Institutional Effectiveness Goals:

1. Select, acquire and organize resources to be easily discoverable and meaningfully incorporated into the learning and research process. Collections should have a balance of formats and content reflecting current curricular needs and college priorities for research, scholarship, teaching and learning; technology should be up-to-date and include common learning tools.
   1. Collections (print/electronic), including books, journals, media and archives.
   2. Learning Technology for student use within and outside the library.
   3. Interlibrary Loan to temporarily acquire and provide those items we do not own.
2. Provide quality assistance, guidance and instruction in the use of information sources, learning technologies and collections to the entire SUNY Oswego community (students, faculty and staff, both on-campus and off-campus).
   1. Research, Instruction and Technology Help, including Ask A Librarian services.
   2. Special Collections/Archives
   3. Access to materials
   4. Customer Service
3. Provide a responsive and user-friendly online library presence, and a variety of engaging and accommodating physical spaces that meet diverse student and faculty academic needs.
   1. Physical spaces
   2. Website

Alignment with the College's Strategic Plan:

Goal 1: Impact Two (Our Education Ecosystem): "Our academic quality is demonstrated, extended, recognized, sought after and heralded."
   Performance Driver 1 of 6: Provide and deliver scholarly, interdisciplinary resources and innovative learning technologies that match student and faculty needs, and maintain information resources that have proven essential for academic needs.
   Key Indicator: Support for rigorous academic programs and research.

Goal 2: Impact One (Our Students): "Our students and graduates thrive through knowledge, experience, perspectives, and discovery gained here that animates and informs their work, their communities, and their personal lives."
   Performance Driver 1 of 4: Foster a college-wide culture of caring that provides supportive, effective information literacy instruction and personalized mentoring.
   Key Indicator: Retention; assistance, guidance and instruction in the use of information sources, learning technologies and collections.
Goal 3: Impact Four (Our Institution): "Our proven institutional effectiveness results in increased resources, flexibility, academic capacity and institutional success."
   Performance Driver 4 of 5: Put physical resources, capital assets, and technological capabilities to their highest usefulness, and ensure they are strengthened to advance continuous improvement.
   Key Indicator: Facilities and technology: facilities that offer appropriate and varied spaces for individual or collaborative activities; a web presence that provides seamless access to resources.

II. Assessment activities completed in the recent academic year 2015-2016

a. Report goals assessed in academic year 2015-2016:

A. Using vendor statistics and EBSCO A-to-Z's Overlap Analysis, assess journal databases for duplication of coverage, both for specific journal titles identified as core and for needed coverage of academic disciplines. (CD) (Goal 1.1)
Results and Interpretations: Due to technical difficulties with EBSCO's A-to-Z Overlap Analysis, this review was done using Excel along with database vendor information. Nine subject-specific databases were selected for in-depth assessment based on potential overlap with both open access collections and comprehensive interdisciplinary databases received from SUNYConnect and NNYLN (consortial purchasing agreements). Statistical and overlap analysis provided information to assess the percentage overlap between resources. In addition, titles that were not duplicated in multiple databases were assessed to determine whether they were considered core titles for the specific subject discipline. Three of the databases reviewed had approximately 50% overlap with comprehensive databases. One had 87.76% overlap, resulting in only 87 unique titles not found in other library resources. In several cases, vendor statistics on core titles were available as a subset that was separately examined for overlap.
Overall, the assessment found that none of the databases reviewed overlapped entirely with the comprehensive databases examined.
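The spreadsheet-based overlap analysis described above can be illustrated with a short sketch: for each subject database, compare its journal title list against a comprehensive database's list and report the percentage of shared titles and the titles unique to the subject database. The function name, title lists and database contents below are invented for illustration; they are not the actual data reviewed.

```python
# Hypothetical sketch of the Excel-style overlap analysis: all titles and
# database contents here are invented examples, not the assessed databases.

def overlap_report(subject_titles, comprehensive_titles):
    """Return (percent_overlap, unique_titles) for a subject database."""
    subject = set(t.strip().lower() for t in subject_titles)
    comprehensive = set(t.strip().lower() for t in comprehensive_titles)
    shared = subject & comprehensive
    percent = 100.0 * len(shared) / len(subject) if subject else 0.0
    unique = subject - comprehensive          # titles found nowhere else
    return percent, sorted(unique)

subject_db = ["Journal A", "Journal B", "Journal C", "Journal D"]
comprehensive_db = ["Journal A", "Journal B", "Journal C", "Journal X"]

percent, unique = overlap_report(subject_db, comprehensive_db)
print(f"{percent:.2f}% overlap; {len(unique)} unique title(s)")
```

A high percentage with few unique titles (as with the 87.76% case above) would flag a candidate for cancellation review, while the unique-title list supports the follow-up check of whether those titles are core to the discipline.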
B. Using a survey tested in 2014-15, better understand Metro Center student preferences for library resources and services. (Instruction) (Goals 1.1 & 1.3)
Results and Interpretations: At the end of the Spring 2016 semester, students at the Metro Center were asked to complete a brief survey (6 questions: 3 Likert-scale, 1 multiple choice, 2 short answer) regarding their experiences using the services and resources of Penfield Library. 92 students representing various majors responded. An initial review of the responses tells us several things: 1) It initially seemed disappointing that more students did not use the library resources on a regular basis (daily or weekly use totaled 18.9%). 2) The good news is that the preponderance of students did use the library's resources at some point during the semester (61.1% total: 2.2% daily; 16.7% weekly; 20% monthly; 22.2% once a semester). 3) In the short answer questions, those who did use the library's resources had mostly positive experiences. (Sample feedback: "I use the library web page to connect to information for personal use and for classwork"; "Love Ask a Librarian"; "I used the online library link a lot as an undergrad human development student. I have not used it a lot in my master's program.") It should be noted that many students were able to navigate to the library's website despite a minimal introduction to the library. Some of these students may have received library instruction on the main campus. Further analysis of the results will need to be done.

C. Using data collected via the Gimlet reference statistics program, determine the most appropriate schedule for library faculty staffing of the Research Help Desk. This will involve identifying service times with slower traffic and analyzing the type and content of questions received during these times. (Reference) (Goal 2.1)
Results and Interpretations: Data was analyzed, and changes were made to the Research Help Desk schedule to align staffing with patron demand patterns.
Summary of changes: staffing hours begin at 9 AM rather than 8 AM during the semester; during winter and summer sessions librarians serve on call for most open hours, and at the desk during times of library programming (EOP activities, summer camps, etc.). Online chat reference help continues to be available 24/7.

D. Collect and analyze summer data to inform discussion about library open hours and our research help schedule. Specific measures will include traffic count, materials checkout, computer logins and a questions log for the Access Services desk. (Access Services/Reference) (Goal 2.4)
Results and Interpretations: Data was collected from May 17 to August 23, 2015. Traffic counts decreased as the evening wore on, with a daily average of two people entering the building from 6-7 PM. Computer logins were very low, as was materials checkout (averaging 7 items per evening). Saturday numbers were also very low. A summer-long total of 14 reference questions was answered during the evening hours; this low number confirms that an on-site Reference Librarian is not needed at this time. It was decided to continue with a 7 PM closing during the summer, Mondays through Thursdays. Saturday hours were eliminated, except on Alumni Weekend, in consideration of low use and high per-use staffing expense. (See item C for other Research Help schedule changes.)
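The schedule analyses in items C and D both reduce to the same step: tallying logged questions or visits by hour of day to find low-traffic periods. A minimal sketch of that tally, assuming an exported log of ISO-format timestamps (the timestamps, threshold, and function name below are invented for illustration, not actual Gimlet data):

```python
# Illustrative sketch only: tally logged reference questions by hour of day
# to surface low-traffic hours. Timestamps and threshold are invented.
from collections import Counter
from datetime import datetime

def slow_hours(timestamps, threshold):
    """Return open hours (8 AM-8 PM) whose question count is below threshold."""
    counts = Counter(datetime.fromisoformat(ts).hour for ts in timestamps)
    # Include hours with zero logged questions across the observed range.
    return sorted(h for h in range(8, 21) if counts.get(h, 0) < threshold)

log = [
    "2015-06-01T10:15:00", "2015-06-01T10:40:00",
    "2015-06-01T14:05:00", "2015-06-02T10:30:00",
]
print(slow_hours(log, threshold=2))  # every hour except 10 falls below 2
```

In practice a cutoff like this would feed the kind of decision reported above, e.g. dropping the 8 AM desk hour or moving librarians to on-call status during slow sessions.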
E. Using web analytics software and usability testing with students, broaden our assessment to the entire library website. (Webmaster) (Goal 3.2)
Results and Interpretations: Google Analytics provided usage patterns for the entire library website, showing that by far our most popular pages are the pages people use to search for resources: the homepage, research guides that recommend which databases users should search, and the database listing page. Services like interlibrary loan and Ask a Librarian (research help) are also among our top five most popular pages. Crazy Egg confirmed that most of the clicks on those pages are exactly what librarians want to see: users searching the library discovery system or entering the library catalog or databases to continue their research there. Unexpected usage patterns were investigated further with usability tests. Twenty-two students at the Phoenix Center participated in a mixed-format usability test involving survey questions, a first-click test, and a top-task analysis. Eighty on-campus students participated in surveys about library website use, and four students participated in half-hour think-aloud usability tests on the library website. These surveys and tests helped us to better understand what users want out of the library's website, what words they use to describe their needs, and which parts of the library's design, content, and information architecture need improvement.

F. Assess Access Services student assistants' ability to handle routine service desk business and their understanding of good customer service practices, after improvements were made in training and testing. All student assistants will complete a mastery test administered via Blackboard following their training, and results will be analyzed. (Access Services) (Goals 2.3 & 2.4)
Results and Interpretations: 22 student assistants (all except new hires) were given a short answer test, with an average score of 86% and 20 students scoring 80% or better.
This measure was administered in 2014-15 with a 72% average score, after which both the training and the test were improved, as reflected in the increase in scores. Administration via Blackboard allowed for randomization of questions and automated student feedback at the conclusion of the test.

(Letters under each item below correspond to the comparable measure in the table above, part a.)

b. Report ALL changes (e.g. course revisions and delivery, faculty professional development, curriculum change, etc.) that were implemented based on the previous assessment data gathered and the impact of these changes on previous results:

A. None at this time. More time is needed for further data collection and teaching faculty feedback.

B. This year, we worked closely with the Student Services Advisor at the Metro Center to administer the library survey. With his assistance, we were able to reach a much larger sample of students than in the past, when the survey was emailed to students.
C. Summary of changes: staffing hours begin at 9 AM rather than 8 AM during the semester; during winter and summer sessions librarians serve on call and at the desk during times of library programming (EOP activities, summer camps, etc.). Online chat reference help continues to be available 24/7.

D. The Library Director and Advisory Group determined that the patron population and circulation statistics were so low on Saturdays that they did not warrant the library being open. The only Saturday open hours planned for 2016 were during Alumni Reunion. This decision was partly in consideration of the three employees needed each Saturday for a four-hour shift, and the savings that would be realized from discontinuing this shift.

E. Website improvements were made based on the data collected, including wording improvements for some links, reorganization of some pages and sections of the website, etc.

F. Student assistants were instructed in their areas of weakness.

c. Report ALL future changes based on assessment data gathered:

A. Continuous review of these resources will provide a fuller picture of use patterns, identification of core titles, potential impact of cancellations, and areas in which funds could be redirected to provide key databases in areas of weaker coverage.

B. As a first step, we need to increase our visible presence at the Metro Center, which initially includes prominently displayed promotional posters and literature. The planned posters will be a series of about six, each highlighting specific library services useful at specific times during the semester, and will be changed out at appropriate times. We will also work closely with the Student Services Advisor to involve librarians in the fall orientation events for Metro students.

C. Work with the new library director to discuss reference schedule alternatives for the upcoming year, 2016-17.

D. The study will be replicated to further examine library activity during summer evening hours.

E. The data collected this year will continue to inform decisions about updates to the website. Planned updates at this time include reorganization and possible renaming of the Reference and the Multimedia Creation sections of the website.

F. None. The training and testing will continue, to ensure quality library service.

d. Required resources to implement the above changes:

A. Librarian time only.
B. Funds for professional printing of posters and brochures, and literature display racks.
C. None
D. Librarian and staff time.
E. Time of the library webmaster.
F. None

e. Recommended changes to the assessment process:

A. Confirm early in the year that the needed vendor tools are functioning properly. Complete database assessments early in the year, to allow time for faculty conversations in preparation for contract renewals.
B. None at this time.
C. None
D. In next year's review of summer traffic, the data will include an additional measure: a full library observation and count of the number of patrons utilizing the library each evening.
E. None
F. None
III. Assessment activities planned for upcoming academic year 2016-2017:

A. Continue database review using vendor statistics to assess select specialized journal databases for use and cost per use. Using curriculum/course descriptions and faculty feedback, examine the relevance and importance of continuing lower-use databases. (CD) (Goal 1.1)

B. Using both quantitative and qualitative data, examine the impact and participation trends of the "maker services" programming. Measures will include number of participants and projects/collaborations created, circulation statistics for maker tools, and participant feedback. (Learning Technologies Librarian) (Goals 1.2, 2.1 & 2.3)

C. Continue using data collected via the Gimlet reference statistics program to determine the most appropriate schedule for library faculty staffing of the Research Help Desk. This will involve identifying service times with slower traffic and analyzing the type and content of questions received during these times. Access Services documentation of research questions received during times when librarians are off duty will also be considered in this analysis. (Reference) (Goal 2.1)

D. Use ILLiad statistics and reports to inform discussion about document delivery services, recently expanded to all undergraduates. Specific measures will include number of requests, requests filled, and the proportion of requests from undergraduates compared to other patron groups. (ILL) (Goal 2.3)

E. Repeat the summer hours study, with added data collection on the number of patrons using the library during the evening, determined by counting patrons present at set times each evening. Collect and analyze summer data to inform decisions about summer evening library open hours. Specific measures will include traffic count, materials checkout, computer logins, the questions log for the Access Services desk, and attendance counts. (Access Services) (Goal 2.4)

F.
Continue with the use of web analytics software and usability testing with students, to deepen our understanding of the use of the library website. (Webmaster) (Goal 3.2)

See also: Library Assessment Report: Information Literacy Learning Outcomes

Submitted by Barbara Shaffer, Library Director (6/24/16)
Please address feedback to Sarah Weisman, incoming Library Director (as of 7/5/2016)