Standard 3: Information and Knowledge
3.4.1. Research and Knowledge Creation: Candidates use evidence-based, action research to collect data.
Artifact:
Evaluation Tool
As with the other three projects I completed for School Library Administration, I learned an incredible amount creating the evaluation instrument. I am currently disenchanted with the county's evaluation system for classroom teachers; many tenured teachers are not held accountable and are thus providing poor instruction for their students. I can only imagine the situation is similar in the school librarian world. Without an evaluation system that generates any accountability for school librarians, some of our country's school libraries are being run poorly. What distinguishes school librarians from classroom teachers is that in some areas of our country, school librarians' positions have already been cut. Without a system to ensure that the education coming out of libraries is of high quality and influential, more positions may soon be gone.
I spent a significant amount of time with three other students in School Library Administration creating our evaluation tool. We used evidence: we interviewed current school librarians and spent considerable time examining a variety of school library evaluation tools at the county, state, and national levels. We completed action research by working as a group to determine possible solutions to a current issue, namely the lack of accountability in our county's library evaluation system. Our professor charged us with creating an evaluation tool that matched the five school librarian roles and 10 standards drafted by the MSDE subcommittee. In the end, she asked our permission to share our completed evaluation tool with the subcommittee – I guess we did a good job!
The evidence we collected came from documents and interviews. The six documents we cited and used as we created our own evaluation tool are as follows: the MSDE subcommittee's draft, the current MSDE Standards for School Library Programs in Maryland, the current FCPS evaluation system rubric, A Planning Guide for Empowering Learners with School Library Program Assessment Rubric (AASL), Empowering Learners: Guidelines for School Library Programs Assessment Rubric (AASL), and the Alabama Professional Education Personnel Evaluation Program. The four of us pored over these six documents and shared our findings with two middle school librarians. During these interviews, we also discussed how and when the school librarians had been evaluated by administrators. One of the two librarians shared with us that she keeps track of all of her materials that meet the current MSDE school library standards; however, she has never needed to show her artifacts to anyone. Both shared that their observations or evaluations always occurred at the school level; county or state administrators had never officially evaluated their library programs. Moreover, their school-based evaluations lacked any concrete feedback. According to both of them, school librarians in Frederick County are left somewhat to their own devices. Because of this, the four of us made sure to include areas within the evaluation that allow evaluators to write narrative feedback. We also stress the need for an administrative conference between the school librarian and the evaluator following each evaluation session.
From what our professor shared with us, MSDE is working to create new SLM standards that align with the Common Core Standards, to be implemented in the next three years. She shared MSDE's draft with us. The indicators for all six roles in our evaluation instrument came primarily from MSDE's drafted school librarian roles; however, we also modified ideas from the Alabama evaluation and AASL's Empowering Learners Assessment Rubric. From the Empowering Learners rubric we adopted the concept of moving beyond a simple satisfactory or unsatisfactory rating to multiple levels of mastery. We also modified some indicators from Empowering Learners and integrated them into the six roles from the MSDE draft. We examined and adopted portions of the Alabama evaluation instrument because it addresses evaluations at various levels, from a self-evaluation to a state administrator evaluation. The Alabama evaluation process also concludes with a conference between the librarian and a supervisor, at which point the school librarian may provide further evidence of effectiveness. As mentioned above, we unanimously decided to include the conference as a part of our evaluation tool. Much of our discussion about what to include and how to design the evaluation tool occurred over Blackboard discussions and online WIMBA conversations. We divided the tasks, but ultimately made decisions about what to include and what to leave out together.
Every step of the process was a learning experience. Our results from the librarian interviews unsettled me. As I mentioned earlier, I feel that the lack of accountability among mediocre teachers is debilitating our education system. To discover that these low expectations also exist in the school library field was upsetting. This narrative evidence from the interviews only strengthened our determination to create a useful evaluation tool. As we moved into the examination of the multiple evaluation tools and the actual creation of our own, the decisions we made grew out of strong discussion and a respect for one another's ideas and experiences. I truly feel that our evaluation tool is one that can accurately determine the effectiveness of a school librarian.
The action research that my group completed was evidence-based. Along the way we collected data to create a project that may, in the end, be useful to the MSDE task force that is creating the evaluation system to hold school librarians statewide accountable. As an educator who improves my materials and teaching with every passing semester, I most certainly hope to see changes in the way educators, including school librarians, are evaluated.