Legal Education Review
THE MODIFICATION OF ASSESSMENT TASK DIMENSIONS IN SUPPORT OF STUDENT PROGRESSION IN LEGAL SKILLS DEVELOPMENT
CLAIR HUGHES*
One of the outcomes of the past two decades of changes to the higher education sector has been a growth in the number of professional roles requiring university preparation.1 While the emergence of what has been termed a ‘new vocationalism in higher education’2 has given rise to intense philosophical debate concerning the true purpose of a university education, it has also prompted concerted institutional efforts to improve the quality of professional programs. Coordinators and teachers now have access to an extensive body of literature offering both theoretical and practical guidelines for those engaged in preparing students to profit from the work-based or work-integrated learning components of their programs. This literature has been developed from a range of sources, including formal research, reflection on practice, and various disciplinary and interdisciplinary intervention and scoping projects. The work of Mantz Yorke3 and the joint work of Brenda Little and Lee Harvey4 in the United Kingdom are representative of this field.
This article owes its origins to one such project, one with an assessment focus led by the Council of Australian Law Deans (CALD) and funded by the Australian Learning and Teaching Council (ALTC). At a symposium conducted as part of this project, group discussion centred on issues relating to the systematic assessment of law students’ abilities to undertake a variety of professional roles. This article elaborates on some of the ideas canvassed by participants at this group discussion, identifying key task variables that could be manipulated for the assessment of student learning at different stages in the progression from ‘novice’ to ‘expert’ legal practitioner.
After a general introduction to the structure and design of assessment tasks, this article identifies and describes task dimensions applicable to the modification of assessment tasks and discusses possible applications with reference to specific examples.
Assessment tasks comprise a number of individual and inter-related elements, each of which requires decision-making at the design stage. An assessment task design (ATD) framework developed from systemic functional linguistics5 illustrates task elements and associated task design decisions (see Figure 1 below) and is based on the following assumptions:
• A text is defined as ‘any meaning-producing event, be it a book, a film, an advertisement, a phone conversation and so on’.6
• All assessment involves students in the production of ‘texts’. The essays, reports, oral presentations, interviews, posters or blogs that students produce when engaging in assessment tasks are all accepted as lying within this definition of ‘text’.
Figure 1: Assessment Task Design (ATD) Framework7
Texts are shaped by the cultural and social contexts in which they are produced. It is helpful to our understanding of cultural context to consider the legal profession as a cultural group — referred to as a ‘community of practice’8 or ‘discourse community’.9 Cultural groups share specialised knowledge and terminology, and communication practices which employ particular text types. Students may simultaneously participate in a number of communities, including peripheral membership in a professional community of practice, while striving for ‘full enculturation’.10 Full membership in a community of practice is demonstrated by being able to participate in the discourses or cultural practices that distinguish that community,11 including producing the texts characteristically produced by its members. For example, to raise an issue of concern, a social scientist might write an essay, a manager make an oral presentation at an office meeting, a cartoonist draw a caricature for the editorial page of a newspaper and a group of friends meet for a discussion over coffee.
Within any particular social context or field of practice, the texts produced will be shaped by the role assumed by the text producer, the presumed audience of the text, its subject matter, and the mode or medium of communication. For example, a ‘text’ as simple as an apology may be expressed as a formal note inserted in a newspaper, an email to a colleague or a card accompanied by a bunch of flowers to a friend.
The legal profession, like other professional communities, has its own collection of traditional and valued text types which form the basis of students’ text production during their years of university study. The independent production of many of the ‘signature’ texts of a professional community is the ultimate purpose of many professional law programs and an essential condition for attaining full membership of this particular cultural group.
The priority of the academics who participated in the role discussion at the ALTC/CALD symposium was quite specific — to design assessment tasks that provided appropriate opportunities for students to demonstrate learning and understanding at different levels of the professional program, while encouraging them to periodically revisit legal knowledge, skills and ways of disciplinary thinking. According to Daugherty12 and his colleagues, this assessment priority is common among those responsible for the design and implementation of learning programs.
Assessment task design requires an assessor to develop specifications for the texts that students are to produce to demonstrate their learning. The decisions that this involves are related to elements of both the cultural and social context of text production, as illustrated in Figure 1 above. Although, in practice, assessment task descriptions commonly explicate some specifications — usually subject matter and text type — and merely imply others, the elements of the framework can be identified in all assessment tasks. For example, a law student working in the context or situation of a legal practice and assuming the role of solicitor may be required to conduct a face-to-face (medium) oral (mode) interview (text type) to elicit relevant information (purpose) from a client (audience) wishing to add a codicil to a will (subject matter).
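For readers who find it helpful to see the elements of the framework set out as a structured record, the short Python sketch below restates the codicil-interview example in that form. It is an illustrative aid only, not part of the ATD framework; the class and field names are assumptions introduced here.

```python
# Illustrative sketch only: the ATD framework elements from Figure 1
# represented as a simple record. Class and field names are hypothetical.
from dataclasses import dataclass


@dataclass
class AssessmentTaskSpec:
    context: str         # situation or field of practice
    role: str            # role assumed by the text producer
    audience: str        # presumed audience of the text
    subject_matter: str  # content the text addresses
    purpose: str         # what the text is meant to achieve
    text_type: str       # eg interview, essay, report
    mode: str            # eg oral or written
    medium: str          # eg face-to-face, paper-based, electronic


# The worked example from the paragraph above.
client_interview = AssessmentTaskSpec(
    context="legal practice",
    role="solicitor",
    audience="client",
    subject_matter="adding a codicil to a will",
    purpose="elicit relevant information",
    text_type="interview",
    mode="oral",
    medium="face-to-face",
)

print(client_interview.text_type)  # -> interview
```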
To address the assessment priority of the academic participants at the ALTC/CALD symposium, a set of task ‘dimensions’ was identified. These dimensions would constitute variables which could be used to determine or adjust the challenge or demand of individual assessment tasks. Following the symposium, the initial set of dimensions that had been generated was further developed and links were established with relevant assessment literature. This process indicated that while there is an impressive amount of literature addressing sound assessment planning — for example, design principles and criteria,13 constructive alignment,14 effective feedback15 and the importance of standards16 — literature informing the adjustment of assessment tasks suited to stages of progression from peripheral to full membership in a community of practice is relatively sparse. Some resources, such as the ‘United Kingdom Quality Assurance Agency Subject Benchmark Statements’ and graduate attribute maps developed by individual institutions and disciplines, address progression to some extent, but are too general or holistic to inform the design of individual tasks. However, the dimensions of oral assessment identified by Gordon Joughin,17 and the dimensions of authenticity articulated by Judith Gulikers, Theo Bastiaens and Paul Kirschner18 suggest a way of advancing the ideas participants generated at the symposium when drawing on their own assessment design experience. The following discussion combines selected components from each source and locates them in the ATD framework. There are, of course, a number of other variables, such as text length or reference specifications, that assessors can manipulate in order to vary the challenge of assessment tasks. However, as the object of this exercise is identifying dimensions capable of adjustment for the design of tasks that suit different stages of student development, only those dimensions that can be represented on a continuum have been considered for inclusion in this paper (Table 1 below).
Although each of the dimensions in Table 1 is discussed in turn below, a degree of overlap will be apparent because of the inter-relatedness of the elements of the ATD framework.
Table 1: Dimensions of Tasks to Assess Role Development and Continua

TASK DESIGN DIMENSIONS AND CONTINUA

Elements of the cultural context

Authenticity of task context
• Decontextualised (eg, focused skill assessment task: time constraints determined by examination conditions) →
• Situated in authentic professional/work context — some aspects of performance may be consequential: authentic time frame (eg, scattered or immediate response).

Complexity of demand
• Remembering/identifying/applying: closed — single ‘correct’ response: convergent, fixed or predetermined outcome →
• Creating and justifying original or innovative responses with multiple possibilities: open or divergent outcome.

Predictability of text structure
• Rigid/formulaic →
• Devised by student.

Elements of the social context

Role range and formality
• Small range of informal roles, some able to be assumed using commonsense or everyday knowledge and language →
• Wide range of roles: formal and requiring expertise — specialised knowledge, terminology.

Predictability of interaction
• Audience is familiar, equal or lower in status or power and/or with known understanding of the subject matter; task is predictable — all aspects can be prepared in advance and delivered as prepared →
• Audience is unfamiliar, of higher status or power and with unknown understanding of the subject matter; aspects of the task are unpredictable (eg, client response, audience questions or comments) and require spontaneous or flexible response.

Selection of subject matter
• Prescribed by others →
• Identified by student.

Task conditions

Level of scaffolding
• Task is heavily scaffolded through the provision of detailed notes, templates, rubrics, modelling and other teaching and learning activities; only partial text production required →
• ‘Fading’ of support — increasing student autonomy (eg, student development and application of criteria and standards); full text production required.

Task weighting
• Low weighting — formative/developmental emphasis →
• High weighting — summative/grading/credentialing emphasis.
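To make the continua easier to work with when profiling tasks (as discussed later in relation to Table 2), the sketch below encodes each dimension as a position between 0 and 1, where 0 stands for the left-hand, heavily supported end of the continuum and 1 for the right-hand, more autonomous end. The dimension identifiers, the numeric scale and the example values are assumptions introduced for illustration and do not appear in the article.

```python
# Illustrative sketch only: encoding the Table 1 continua as 0-1 positions
# so that a task can be given a rough profile. Scale and values are assumptions.
from dataclasses import dataclass, field

# The eight dimensions from Table 1, grouped as in the ATD framework.
CULTURAL = ("authenticity_of_context", "complexity_of_demand",
            "predictability_of_text_structure")
SOCIAL = ("role_range_and_formality", "predictability_of_interaction",
          "selection_of_subject_matter")
CONDITIONS = ("level_of_scaffolding", "task_weighting")
ALL_DIMENSIONS = CULTURAL + SOCIAL + CONDITIONS


@dataclass
class TaskProfile:
    """Position of one assessment task on each continuum.

    0.0 = left-hand (novice, heavily supported) end,
    1.0 = right-hand (expert, autonomous) end.
    """
    name: str
    positions: dict = field(default_factory=dict)

    def set_position(self, dimension: str, value: float) -> None:
        if dimension not in ALL_DIMENSIONS:
            raise ValueError(f"Unknown dimension: {dimension}")
        self.positions[dimension] = max(0.0, min(1.0, value))

    def overall_demand(self) -> float:
        """Unweighted average across the dimensions profiled so far."""
        if not self.positions:
            return 0.0
        return sum(self.positions.values()) / len(self.positions)


# Hypothetical example: an early-years, heavily scaffolded task sits
# near the left of most continua.
early_task = TaskProfile("Online quiz on interview protocols")
early_task.set_position("authenticity_of_context", 0.1)
early_task.set_position("level_of_scaffolding", 0.1)
early_task.set_position("task_weighting", 0.2)
print(round(early_task.overall_demand(), 2))  # -> 0.13
```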
Authenticity of task context, purpose (as related to cognitive complexity) and selection of text type all present the assessor with possibilities for manipulating elements of the cultural context in the design of assessment tasks.
The situations or fields of practice in which learning takes place provide a spectrum of contexts for the assessment of role progression, ranging from the decontextualised to the fully authentic. Role-related learning can be decontextualised when students learn about role as a component of course subject matter. Students become aware of the expectations, regulations, accountability and ethical codes of roles associated with the legal profession. They also develop a knowledge and appreciation of the roles of people with whom they are likely to work collaboratively or whom they are otherwise likely to encounter in professional practice. However, they are distanced from practice or are ‘standing outside’ it,19 when the assumption of these roles is not integral to teaching and learning activities. Decontextualised assessment would reflect this type of learning in that students would be required to demonstrate learning about these roles through the production of texts such as essays, multiple choice tests or oral presentations within traditional university assessment contexts.
Experiential learning pedagogies involve learning through role (role-play or simulations) or in role (practicums, internships, work-based/integrated or service learning). These provide either lifelike or real-life scenarios as contexts for learning and assessment. Simulated contexts can be provided through role-play with peers (solicitor and client), with academics or actors as clients, or through the use of technology that facilitates replication of client interactions. Suggestions provided by Lee Andresen, David Boud and Ruth Cohen20 illustrate types of assessment tasks congruent with experience-based learning and include group projects, critical essays located in the learner’s own experience, reading logs and learning journals. Case-based or scenario-type examination items can also address the need to connect assessment to authentic contexts. There are, for example, instances where a legal practice has been invited to contribute to the design of assessment tasks to boost authenticity.21 Not only did members of the practice suggest authentic scenarios to contextualise assessment tasks, but they also made some contribution to the teaching program and to the assessment of student responses to the task. The additional work required to negotiate this involvement was considered worthwhile, as academics who are no longer active professional practitioners are not as well situated as current practitioners to devise authentic scenarios as assessment tasks.
In authentic, real-life contexts, students undertake actual tasks with varying levels of supervision depending on ethical, legal and safety factors. The value of such learning and assessment experiences can be determined by the extent to which students are involved in real situations, engaging with real problems,22 in real time frames.23 Assessments in university contexts are generally time restricted, especially if undertaken in examination conditions. Professional tasks may either be ‘scattered over days or on the contrary, require fast and immediate reaction in a split second’.24
The level of cognitive demand is a significant determinant of complexity of purpose. Though the level of cognitive demand required of the student in producing any particular text should directly correlate with the learning objectives or purposes of the course, resources such as Bloom’s (revised) Taxonomy25 can support the development of progressive assessment tasks that are mainly formative or developmental in orientation. Activities requiring the reproduction or identification of information represent the lowest level of demand on the taxonomy and provide a foundation for further progression through other stages of understanding, applying, analysing, evaluating and synthesising to creating (the highest level). Thus, development of a text type such as an interview can be undertaken initially as an observation exercise using a videotape, with students required only to recognise or identify examples of specified behaviours and to record them on a checklist. Students can then progress to assessment tasks requiring the application of interview prompts in a formulaic way, to analysing and evaluating their own and others’ interview effectiveness through role-play, before conducting interviews in more authentic contexts. This in no way suggests that students should be limited to lower-level cognitive tasks in the early years of a law program. Rather, it suggests a useful sequence of activities for systematic skill development. Students whose skill levels restrict them to lower levels of cognitive activity in one area of an introductory program should still be required to perform at higher cognitive levels in other areas.
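The sequence just described, from recognising behaviours on a checklist through to conducting interviews in authentic contexts, can also be expressed as a simple ordered check against the taxonomy. The sketch below is a hypothetical illustration; the level labels follow the revised taxonomy's ordering, and the task descriptions merely paraphrase the examples above.

```python
# Illustrative sketch only: checking that a sequence of interview-skill tasks
# moves to the same or a higher cognitive level at each step.
BLOOM_LEVELS = ["remember", "understand", "apply", "analyse", "evaluate", "create"]

# Paraphrased from the interview example above; level tags are assumptions.
interview_sequence = [
    ("Identify specified behaviours in a recorded interview on a checklist", "remember"),
    ("Apply interview prompts in a formulaic role-play", "apply"),
    ("Analyse and evaluate own and peers' interview effectiveness", "evaluate"),
    ("Conduct an interview in a more authentic context", "create"),
]


def is_progressive(sequence) -> bool:
    """True if each task sits at the same or a higher level than the previous one."""
    indices = [BLOOM_LEVELS.index(level) for _, level in sequence]
    return all(a <= b for a, b in zip(indices, indices[1:]))


print(is_progressive(interview_sequence))  # -> True
```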
A related concept, ‘solution space’, is used by Gulikers et al26 in referring to the number of possible correct responses students can give to an assessment task. Real-life tasks or problems can be ‘closed’ — there is only one acceptable correct response — or ‘open’ — problems are open to interpretation and the generation of multiple responses that will be accepted as ‘correct’ providing task criteria are met. However, in many cases, cognitive complexity can be highly correlated with the extent to which a problem requires a closed/convergent or open/divergent response. Peter Knight argues for a move from ‘program design and determinate learning outcomes to the design of learning environments rich in opportunities for complex learning’, especially in non-formal learning engagements.27 It is also essential that students learn to recognise whether one or multiple responses are required by an assessment task and that they gain experience in producing texts as appropriate responses to problems with differing levels of cognitive complexity and ‘solution space’.
It has been observed that assessment tasks in higher education exist on a continuum between traditional genres (text types) with strict norms, and less structured problem- or project-based assignments.28 Law students, for example, are sometimes required to produce texts that have clearly defined, formulaic or rigid structures such as case notes. In some circumstances, this can lend a certain amount of predictability to the task of text production, making these tasks suited to students in the earlier years of a program. For example, the production of texts such as simple contracts or wills can be scaffolded through the provision of templates, sometimes partially completed or containing numerous prompts. Other text types more suited to students in later years — a research essay or memorandum — may require the student to devise an original way of organising and relating information.
Within each cultural context, texts are shaped by the role assumed by text producers; their knowledge of, relationships and interactions with others involved in the production process and with the audience of the text; the subject matter of the text; and the mode and medium in and through which it is expressed. To some extent, in a field such as law, mode and medium are determined by the selection of text type — a will (text type) is a written document (mode) stored in a paper-based or electronic form (medium). So, although much assessment innovation in recent times has been achieved through novel or original approaches to text type such as the development of ‘Trivial Pursuit’ questions on population health content or trading cards to demonstrate the different theoretical perspectives of archaeologists,29 this discussion is limited to more recognisable variations on traditional assessment practices. Consequently, mode and medium will be considered as inextricably bound to the selection of text type and therefore beyond significant independent variation.
Legal practitioners undertake a variety of professional roles in developing and maintaining society’s legal, industry, political and other systems. This provides a rich opportunity for the development of assessment tasks in which students assume a variety of roles in simulated or authentic contexts. Assigned roles can be professional — advisor, advocate, policy developer — or related to those with whom legal practitioners are likely to interact in professional practice — client, industry advocate, or environmentalist. Proponents of authentic assessment stress the importance of providing students with opportunities to undertake activities that reflect a broader variety of real-life roles than is currently included in professional programs.30 Lambert Schuwirth, for example, has identified medical graduate roles related to fields of direct patient care, the health-care system, personal development, scientist and teacher/supervisor.31 Corresponding fields for law graduates could relate to client service, the legal system, personal development, researcher and teacher/supervisor. It is therefore suggested that those who design tasks to assess role development consider a comprehensive range of roles which reflect different fields of potential professional practice and the role possibilities within those fields. These different roles — from ‘client’ to ‘legal professional’ — require different levels of legal expertise. While role-play in non-professional roles may not in itself offer direct assessment opportunities, it can provide a valuable learning experience to be assessed in other ways. Letters to the editor, informational pamphlets, posters, ministerial briefings or speeches to community groups are only some of the texts that can be produced in simulated contexts, while field or service learning experiences provide rich contexts for the production of a variety of more authentic texts.
Predictability has already been considered in the sense that some texts have formal, sometimes rigid structures or sequences of required components. In addition, the notion of predictability can also be applied to the extent to which the text is shaped through interaction with an audience, interlocutor or other. Joughin discusses these possibilities as ‘presentation’ or ‘dialogue’.32 Although Joughin’s discussion is in the context of formal oral assessment — his description of interaction being ‘reciprocity between examiner and candidate, with each acting on, responding to and being influenced by the other’ — it is also applicable to any text shaped by interaction.
Predictability can be manipulated by deciding whether, first, the audience will be one with which the student is familiar and comfortable, or whether audience dispositions and positions will be either unknown to a student (for example, an actor ‘primed’ to be hostile or otherwise difficult) or known to be antagonistic to an argument or perspective to be presented. Second, the task could allow texts to be prepared in advance and presented as intended (a prepared talk on a familiar topic delivered with no interruptions) or it could require interaction with others (interviewee, client, audience), which would call for spontaneity of response and an ability to improvise or ‘think on one’s feet’.
Requiring students to take on a variety of roles and interactions with a variety of peers and audiences meets Bates’s criterion of worth that an activity ‘provides opportunities for the student to engage in a range of relationships that either support or challenge his/her world view and his/her current and future professional frame of reference’.33 Knight also argues that complex learning is facilitated in learning environments that provide opportunities for diversity of task and interaction, opportunities for role variety and stretch and opportunities to develop and appreciate the attitudinal factors associated with professional workplaces.34
John Bransford35 also argues that the preparation of students to undertake professional life in fast-changing environments — what he refers to as ‘adaptive expertise’ — requires opportunities to learn how cultural settings influence interactions, thoughts, emotions and behaviours and that these opportunities are best provided through peer collaboration and client interaction.
The subject matter of assessment tasks can most easily be adapted for students at different stages of a professional program by the extent to which relevant legal principles are specified or otherwise indicated in task instructions. Some tasks restrict, prescribe, or identify the subject matter to be included in a text. Other tasks merely state a problem or issue and leave it to students to draw on their expanding knowledge of the disciplinary field to identify the appropriate legal principles or subject matter to bring to bear in investigating the problem and in proposing or justifying a solution or argument.
Therefore, students in the early years of a program can be given quite specific parameters regarding the subject matter to be included (or excluded) when responding to assessment tasks and guidelines for its treatment. More advanced students may be provided with a case or problem for which the relevant legal principle may be open to various interpretations and therefore the quality of response will be determined by students’ breadth and depth of legal knowledge, their ability to discuss and evaluate a range of perspectives and to present a defensible conclusion or opinion.
Two further task dimensions relate to the conditions under which assessment is conducted. The level of scaffolding provided to students and the weighting attached to individual assessment tasks offer possibilities for adjustment. As these can be varied for all other dimensions that have been discussed, they are addressed separately below.
Scaffolding refers to the type and level of support provided to students completing learning and assessment tasks and it is therefore an essential component of a teacher’s pedagogical repertoire. Assessment tasks can be heavily scaffolded for students in the early years of a program or the early stages of development of a particular skill or attribute and then gradually reduced or ‘faded’36 as students acquire greater independence.37 Mary Macken et al, for example, suggest a three-stage curriculum framework beginning with modelling the types of text students are expected to produce, followed by supported text construction and then the withdrawal of scaffolding during the final stage of independent text production.38 Scaffolding is also consistent with Hilda Taba’s model of curriculum development in which content is revisited periodically to increase breadth and depth of learning on a particular topic or skill.39 In general, the level of scaffolding or guidance provided will depend on ‘the student’s stage of development in the field and the complexity of the material’.40 This by no means absolves the teacher from any teaching responsibility as students advance through a program, but rather changes the nature of teaching to one more commensurate with the shift of responsibility for learning from teacher to student. Teaching is therefore more likely to take the form of questioning, prompting, provoking or challenging. Scaffolding can take many forms, including providing detailed criteria and standards, prompts, templates or one or more opportunities for feedback on draft texts prior to final submission or performance.
In addition, text types with clearly distinguishable components suggest activities where the task can be scaffolded by requiring only partial text completion. In their early years students may not necessarily be required to complete an entire text — for example, an interview schedule may be prepared but the interview itself need not be conducted; one aspect of a legal argument may be explained but an entire essay is not required.
Task weighting is also easy to modify. The advantages of assigning a low weighting to an assessment task are that it can minimise the level of stress induced by other dimensions of the task and, by giving the task a mainly formative or developmental orientation, it can provide a supportive learning environment. However, as students often equate low weighting with low importance, it is essential to clarify the intent behind such assessment decisions and to apply rigorous minimum standards.
The process of mapping the dimensions generated during the symposium against existing assessment literature has resulted in some modification of terminology, some clustering and some culling of elements from the original list. In order to trial the usability of the revised list of dimensions (in Table 1 above) they have been used to analyse or profile a sequence of tasks41 illustrating (in Table 2 below) methods of assessing student progression in the development of interviewing expertise.
The profiling illustrated in Table 2 is more of a rough guide than a precise gauge of task progression. Professional judgement is needed to interpret such analysis and to apply it to specific contexts, individuals and tasks. Nevertheless, it offers usable prompts to those designing assessment tasks and also supports the analysis, revision or justification of existing tasks. A further caution relates to the level and type of scaffolding provided to students at any level — there is growing concern in the assessment community that many current scaffolding practices are resulting in inappropriate levels of student dependency and a general reluctance by students to accept responsibility for their own learning or develop either the disposition or attributes required for lifelong learning.42
Table 2: Illustrative Application of the Dimensions

TASK DESIGN DIMENSIONS

Elements of the cultural context

Authenticity of task context
• Decontextualised (eg, focused skill assessment task: time constraints determined by examination conditions) →
• Situated in authentic professional/work context — some aspects of performance may be consequential: authentic time frame (eg, scattered or immediate response).

Complexity of demand
• Remembering/identifying/applying: closed — single ‘correct’ response: convergent, fixed or predetermined outcome →
• Creating and justifying original or innovative responses with multiple possibilities: open or divergent outcome.

Predictability of text structure
• Rigid/formulaic →
• Devised by student.

Elements of the social context

Role range and formality
• Small range of informal roles, some able to be assumed using commonsense or everyday knowledge and language →
• Wide range of roles: formal and requiring expertise — specialised knowledge, terminology.

Predictability of interaction
• Audience is familiar, equal in status or power and/or with known understanding of the subject matter; task is predictable — all aspects can be prepared in advance and delivered as prepared →
• Audience is unfamiliar, of higher/lower status or power and with unknown understanding of the subject matter; aspects of the task are unpredictable (eg, client response, audience questions or comments) and require spontaneous or flexible response.

Selection of subject matter
• Prescribed →
• Identified by student.

Task conditions

Level of scaffolding
• Task is heavily scaffolded through the provision of detailed notes, templates, rubrics, modelling and other teaching and learning activities; only partial text production required →
• ‘Fading’ of support — increasing student autonomy (eg, student development and application of criteria and standards); full text production required.

Task weighting
• Low weighting — formative/developmental emphasis →
• High weighting — summative/grading/credentialing emphasis.

Key

Task 1: Complete online quiz regarding relevant topic knowledge, communication theory and interview protocols. Learning is supported through provision of reading materials and opportunities to make summaries, take notes and research relevant law content and aspects of communication theory; discussion of client interview protocols; review and critique of video interviews. Weighting comprises 10 per cent of course grade.

Task 2: Conduct interview with tutor providing advice regarding a legal matter on a specified topic. Learning is supported through guest lecture on client interviewing from corporate partner/community legal centre in relation to a particular problem; role-play interviews in pairs with peer feedback provided with reference to interview protocol and assessment criteria. Weighting comprises 15 per cent of course grade.

Task 3: Conduct an interview with a client within a workplace under supervision. Learning is supported through observation of interviews within a particular workplace context and discussion of expectations with supervisor; assisting in interviews and discussion with the supervisor; supervised interviews and provision of feedback from supervisor. Weighting comprises 30 per cent of course grade.
The identification of the key dimensions of assessment tasks that can be manipulated to prepare students for professional practice is intended to serve as a resource that supports the work of those with assessment responsibilities in professional programs. The continua suggest ways of supporting skill development and assessment over a single course or over a number of years, of bridging course-based and work-based elements of professional programs, and of preparing students to ‘perform’ under varying levels of supervision. The continua in no way suggest the fixed or staged development approaches that have been subject to critique because of their inflexibility and narrowness of perspective.43 Rather, the independence with which each of the dimensions can be manipulated provides assessors with a flexible means of adjusting assessment design in response to diverse and changing student characteristics and assessment contexts.
The development of the legal skills that are fundamental to professional practice is too important to be left to chance. It is improbable that work-based or integrated learning experiences will be able to offer the structured skill development that can be provided through formal course-based activities. Skills such as those required for professional communication and interaction therefore need to be introduced during the formal education phase that generally precedes work-based learning.
The practice suggested in this paper is intended to be feasible — it is based on the practical assessment experience of legal academics — and also theoretically defensible as it is mapped against relevant assessment literature. Though the examples included are drawn from ideas generated by legal academics, the approach is widely applicable to any discipline offering skill development in professional programs.
[*] Lecturer, Teaching and Educational Development Institute, The University of Queensland. Thanks are given to the following participants in the ALTC/CALD project symposium role discussion group whose ideas initiated this paper and whose examples have illustrated the points made — Andrew Antonopoulos, Vivienne Brand, Kelly Burton, Adrian Evans, Rosemary Howell, Wendy Larcombe, Geraldine McKenzie, Brenda Marshall and Chris Symes. I am grateful to Gary Davis and Susanne Owen for the opportunity to participate in the symposium and, in particular, to Susanne for developing and providing the examples included in Table 2.
[1] Simon Marginson, ‘Towards a Politics of the Enterprise University’ in Simon Cooper, John Hinkson and Geoff Sharp (eds), Scholars and Entrepreneurs: The Universities in Crisis (2002) 109.
[2] Colin Symes and John McIntyre (eds), Working Knowledge: The New Vocationalism and Higher Education (2000).
[3] Mantz Yorke, Issues in the Assessment of Practice-Based Professional Learning (Report prepared for the Practice-Based Professional Learning CETL at the Open University, 2005) <http://www.open.ac.uk/cetl-workspace/cetlcontent/documents/464428ed4aa20.pdf> at 23 December 2009.
[4] Brenda Little and Lee Harvey, Learning through Work Placements and Beyond (Report for HECSU and the Higher Education Academy’s Work Placements Organisation Forum, 2006) <http://www.prospects.ac.uk/downloads/documents/HECSU/Reports/Workplacement_Little_Harvey.pdf> at 23 December 2009.
[5] Clair Hughes, ‘Assessment as “Text” Production: Drawing on Systemic Functional Grammar to Frame the Design and Analysis of Assessment Tasks’ (2009) 34(5) Assessment and Evaluation in Higher Education 553.
[6] Peter Knapp and Megan Watkins, Genre, Text, Grammar: Technologies for Teaching and Assessing Writing (2005) 13.
[7] Hughes, above n 5, 556.
[8] Jean Lave and Etienne Wenger, Situated Learning: Legitimate Peripheral Participation (1991) 56.
[9] John Swales, Genre Analysis: English in Academic and Research Settings (1990) 9.
[10] Olga Dysthe et al, ‘A Theory-Based Discussion of Assessment Criteria: The Balance between Explicitness and Negotiation’ in Anton Havnes and Liz McDowell (eds), Balancing Dilemmas in Assessment and Learning in Contemporary Education (2008) 121.
[11] Mark Tennant, ‘Learning to Work, Working to Learn: Theories of Situational Education’ in Colin Symes and John McIntyre (eds), Working Knowledge: The New Vocationalism and Higher Education (2000) 123.
[12] Richard Daugherty, Paul Black, Katherine Ecclestone, Mary James and Paul Newton, ‘Alternative Perspectives on Learning Outcomes: Challenges for Assessment’ (2008) 19(4) Curriculum Journal 243, 24.
[13] Anne Hewitt, ‘A Critique of the Assessment of Professional Skills’ (2008) Legal Education Review 143.
[14] John Biggs, Aligning Teaching and Assessment to Curriculum Objectives (2002) <http://www.heacademy.ac.uk/assets/York/documents/resources/resourcedatabase/id477_aligning_teaching_for_constructing_learning.pdf> at 23 December 2009.
[15] Graham Gibbs and Claire Simpson, ‘Conditions under Which Assessment Supports Students’ Learning’ (2004) 1 Learning and Teaching in Higher Education 3; David Nicol and Debra Macfarlane-Dick, ‘Formative Assessment and Self-Regulated Learning: A Model and Seven Principles of Good Feedback Practice’ (2006) 31(2) Studies in Higher Education 199.
[16] D Royce Sadler, ‘Interpretations of Criteria-Based Assessment and Grading in Higher Education’ (2005) 30(2) Assessment and Evaluation in Higher Education 175.
[17] Gordon Joughin, ‘Dimensions of Oral Assessment’ (1998) 23(4) Assessment and Evaluation in Higher Education 367.
[18] Judith Gulikers, Theo Bastiaens and Paul Kirschner, ‘Defining Authentic Assessment: Five Dimensions of Authenticity’ in Anton Havnes and Liz McDowell (eds), Balancing Dilemmas in Assessment and Learning in Contemporary Education (2008) 73.
[19] Tennant, above n 11, 132.
[20] Lee Andresen, David Boud and Ruth Cohen, ‘Experience-Based Learning: Contemporary Issues’ in Griff Foley (ed), Understanding Adult Education and Training (2nd ed, 1995) 225.
[21] Example offered from the practice of a participant at the ALTC/CALD symposium.
[22] Merrelyn Bates, ‘Work-Integrated Curricula in University Programs’ (2008) 27(4) Higher Education Research and Development 305, 313.
[23] Gulikers, Bastiaens and Kirschner, above n 18, 78.
[24] Ibid 80.
[25] David Krathwohl, ‘A Revision of Bloom’s Taxonomy: An Overview’ (2002) 41(4) Theory into Practice 212.
[26] Gulikers, Bastiaens and Kirschner, above n 18, 80.
[27] Peter Knight, ‘The Assessment of Complex Learning Outcomes’ (Paper presented at the International Conference on Engineering Education, Manchester, 18–21 August, 2002).
[28] Dysthe et al, above n 10.
[29] Claire Smith and Heather Burke, ‘Mortimer Wheeler, Lewis Binford, Ian Hodder ... and You: Active Learning in Archaeology’ (Paper presented at the Annual HERDSA Conference on Higher Education in a Changing World, Sydney, 3–6 July, 2005).
[30] Joy Cumming and Graham Maxwell, ‘Contextualising Authentic Assessment’ (1999) 6(2) Assessment in Education 177.
[31] Lambert Schuwirth, Assessment Overview? (2004) <http://www.fdg.unimaas.nl/educ/lambert/bern/assessment%20overview.ppt> at 23 December 2009.
[32] Joughin, above n 17, 370.
[33] Bates, above n 22, 313.
[34] Peter Knight, ‘Grading, Classifying and Future Learning’ in David Boud and Nancy Falchikov (eds), Rethinking Assessment in Higher Education: For Future Learning (2007) 72, 80.
[35] John Bransford, ‘Preparing People for Rapidly Changing Environments’ (2007) 96(1) Journal of Engineering Education 1.
[36] Nancy Falchikov, ‘The Place of Peers in Learning and Assessment’ in Nancy Falchikov and David Boud (eds), Rethinking Assessment in Higher Education (2008) 128, 137.
[37] Malcolm Knowles, The Adult Learner: A Neglected Species (1973).
[38] Mary Macken et al, An Approach to Writing K-12: The Theory and Practice of Genre-Based Writing: A Genre-Based Approach to Teaching Writing Years 3–6 (1989).
[39] Hilda Taba, Curriculum: Theory and Practice (1962).
[40] The Quality Assurance Agency for Higher Education, Subject Benchmark Statement: Law <http://www.qaa.ac.uk/academicInfrastructure/benchmark/honours/law.asp#9> at 23 December 2009.
[41] Tasks illustrated in Table 2 are those suggested by participants in a parallel discussion group on the same symposium topic and further developed by Dr Susanne Owen, the ALTC/CALD project manager.
[42] Jenifer Davies and Kathryn Ecclestone, ‘“Straitjacket” or “Springboard for Sustainable Learning?” The Implications of Formative Assessment Practices in Vocational Learning Cultures’ (2008) 19(2) Curriculum Journal 71, 83.
[43] Gloria Dall’Alba and Jorgen Sandberg, ‘Unveiling Professional Development: A Critical Review of Stage Models’ (2006) 76(3) Review of Educational Research 383.