good assessment practice

February 10, 2013

300 club

CC Image Source: http://www.flickr.com/photos/puuikibeach/6844493279/sizes/l/in/photostream/

Assessment sits at the heart of student learning and, by extension, at the heart of the school as a learning organisation. Research shows that purposeful, planned strategies for accurately interpreting student growth feed an iterative cycle of refinement. The studies are broad (Barber & Mourshed, 2007; Hattie, 2009), policymakers are enthusiastic (CRESST, 2011), case-study findings vary, and theorists differ over how improvement in performance is achieved, so a balanced perspective is required in the school context.

Learning analytics cycle

CC Image Source: http://www.flickr.com/photos/dougclow/5484184568/lightbox/

Research suggests that more accurate interpretation of student growth through best assessment practice can be described as visible learning (Hattie, 2009), attained by setting clear and high expectations for what students should achieve (Barber & Mourshed, 2007; Sharratt & Fullan, 2012). Common to these accounts is an emphasis on feedback, seen as the bridge between teaching and learning (Hattie, 2003; Wiliam, 2011) and as a means of opening the black box of the classroom to closer scrutiny (Black & Wiliam, 1998).

To effect change in the classroom, school leaders bridge competing demands and make nuanced judgements about leading best practice with staff in their own school context, driving an explicit improvement agenda (Masters, 2010). The process is underscored by an important credo, “know your students and how they learn” (AITSL, 2011, p. 9), and involves a sophisticated synthesis of three concepts: formative assessment practice, school- and system-level evaluation, and the overarching principles of educational measurement.

when learning is the goal

CC Image Source: http://www.flickr.com/photos/7815007@N07/5456941522/lightbox/

The central concerns relate to scale, context and the reliability of data. Care is essential when interpreting data to drive improvements in learning:

When teachers are provided with opportunities to use and interpret assessment data in order to become more responsive to their students’ learning needs, the impact is substantive. Teachers, however, cannot do this alone, but require system conditions that provide and support these learning opportunities in ways that are just as responsive to how teachers learn as they are to how students learn (Timperley, 2009, p. 24).

Planning with the end in mind

CC Image Source: http://www.flickr.com/photos/7815007@N07/6916661035/in/photostream/

Because the data are complex, effective evaluation demands that professional on-balance judgements hold sway. While there are lessons to be learned from why some school systems succeed and others do not (Barber & Mourshed, 2007), school- and system-level evaluation yields overly broad classifications of student performance, and the scale of such testing generates noise that increases the likelihood of error (Wu, 2010).
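To make the point about noise concrete, the sketch below simulates a cohort of students who all make genuine progress between two test occasions and then adds measurement error to each score. The growth figure (40 points) and error size (25 points) are hypothetical values chosen purely for illustration; they are not Wu’s figures and not NAPLAN parameters. The aim is only to show how error on individual scores can make real growth look like decline.

```python
# Illustrative sketch only: the numbers below are assumptions, not NAPLAN values.
# It simulates how measurement error on two test occasions can swamp the
# "growth" read off an individual student's scores (cf. Wu, 2010).
import random

random.seed(1)

TRUE_ANNUAL_GROWTH = 40   # hypothetical true gain, in scale-score points
MEASUREMENT_SD = 25       # hypothetical error on a single test score

def observed_growth() -> float:
    """Observed gain = true gain + the error on test 1 and the error on test 2."""
    test1_error = random.gauss(0, MEASUREMENT_SD)
    test2_error = random.gauss(0, MEASUREMENT_SD)
    return TRUE_ANNUAL_GROWTH + (test2_error - test1_error)

gains = [observed_growth() for _ in range(10_000)]
apparent_decline = sum(g < 0 for g in gains) / len(gains)

print(f"Mean observed growth: {sum(gains) / len(gains):.1f} points")
print(f"Apparent decline despite genuine growth: {apparent_decline:.0%} of simulated students")
```

Even though every simulated student genuinely improves, a noticeable share appear to go backwards, which is why on-balance professional judgement, rather than a single pair of scores, should carry the interpretation.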

Shifting the lens to a local, school-based, learner-focused view of the growth process puts meaning in context. Research therefore points to the need to log events such as students “self-assessing, self-evaluating, self-monitoring, self-learning” (Hattie, 2009, p. 37) in order to build hypotheses, support considered interpretation of fine-grained assessment data, and make both learning goals and success criteria visible.
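As one illustration of what such a log might look like, the sketch below defines a minimal event record. The structure, field names and sample entries are hypothetical; only the event types are drawn from the Hattie quotation above.

```python
# Hypothetical sketch of a classroom learning-event log (fields and entries are
# assumptions, not taken from the post): each record ties an observed instance of
# self-assessment or self-monitoring to a visible goal and success criterion.
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class LearningEvent:
    student: str
    event_type: str          # e.g. "self-assessing", "self-monitoring"
    learning_goal: str       # the learning goal made visible to the student
    success_criterion: str   # the success criterion the student checked against
    recorded_on: date
    note: str = ""           # fine-grained teacher observation

log = [
    LearningEvent("Student A", "self-assessing",
                  "Write a persuasive paragraph",
                  "States a clear position in the opening sentence",
                  date(2013, 2, 4), "Highlighted own opening sentence against the criterion"),
    LearningEvent("Student A", "self-monitoring",
                  "Write a persuasive paragraph",
                  "Supports the position with two pieces of evidence",
                  date(2013, 2, 8)),
]

# Summarise the log to support hypothesis-building about a learner's growth.
events_by_type = Counter(event.event_type for event in log)
print(events_by_type)
```

Because every entry ties an observation to a named goal and criterion, later interpretation of the fine-grained data stays anchored to what was made visible to the learner.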

summary of assessments term planning template

CC Image Source: http://www.flickr.com/photos/7815007@N07/7187116023/in/photostream/lightbox/

Selected references

AITSL. (2011). National Professional Standards for Teachers. Retrieved February 9, 2013 from Australian Institute for Teaching and School Leadership website: http://www.aitsl.edu.au/verve/_resources/AITSL_National_Professional_Standards_for_Teachers.pdf

Barber, M., & Mourshed, M. (2007). How the world’s best-performing school systems come out on top. Retrieved January 25, 2013 from McKinsey & Company website: http://mckinseyonsociety.com/how-the-worlds-most-improved-school-systems-keep-getting-better/

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Retrieved January 20, 2013 from http://blog.discoveryeducation.com/assessment/files/2009/02/blackbox_article.pdf

CRESST. (2011). Knowing and doing: What teachers learn from formative assessment and how they use information (CRESST Report 802). Retrieved January 20, 2013 from http://www.cse.ucla.edu/products/reports/R802.pdf

Hattie, J. (2003). Teachers make a difference: What is the research evidence? Retrieved January 25, 2013 from http://www.acer.edu.au/documents/RC2003_Hattie_TeachersMakeaDifference.pdf

Hattie, J. (2009). Visible learning. Oxon: Routledge.

Masters, G. (2010). Teaching and learning school improvement framework. Retrieved February 1, 2013 from ACER website: http://www.acer.edu.au/documents/c2e-teach-and-learn-no-crop.pdf

Sharratt, L., & Fullan, M. (2012). Putting faces on the data: What great leaders do. Thousand Oaks, CA: Corwin.

Timperley, H. (2009). Using assessment data for improving teaching practice. Retrieved February 9, 2013 from http://research.acer.edu.au/cgi/viewcontent.cgi?article=1036&context=research_conference

Wiliam, D. (2011). Embedded formative assessment. Bloomington, IN: Solution Tree Press.

Wu, M. (2010). The misuse of NAPLAN data. Retrieved January 26, 2013 from NSW Teachers’ Federation website.
