Legal Education Review
USING STRUCTURES TO TEACH LEGAL REASONING
DUNCAN BENTLEY*
INTRODUCTION
In 1992, the American Bar Association Task Force
Report on legal education and professional development was
published.1 Part of the central mission of the Task
Force was to identify the skills and values required by a competent
lawyer.2 Ten skills were identified. The second of
these is legal analysis and reasoning.3 Legal reasoning
is usually a fundamental element in the teaching and understanding of law in
common law countries.4 In most core substantive law
courses this takes place at least in part through a study of cases and the use
of standard undergraduate
problems.5 These problems are
generally fairly straightforward fact patterns designed to raise one or more
issues within a specific area of
law.
At Bond University specific structures
are generally used in teaching legal reasoning. The hypothesis underlying their
use is that
students using such structures will improve their legal reasoning.
The first part of this article describes an experiment6
to test the use by students of one such structure.7
The second part of this article gives the results of the research,
categorised according to the research aims, together with a discussion of
those results.
The third part of this article draws some conclusions as to the weight which should be placed on unstructured anecdotal evidence in assessing success in teaching methods; discusses some difficulties in teaching legal reasoning to undergraduates; and places the results of this experiment in context.
Part 1: Research Into the Use of a Structure in Teaching Legal Reasoning
BACKGROUND
Law teachers spend a good deal of time helping their students to understand
and apply the process of legal reasoning. The common undergraduate
problem used
in law school tutorials and examinations has as its primary objective
the testing of a “student’s ability
to recognize and articulate legal
issues in a fact pattern and to give a reasoned opinion about which party would
succeed should
the facts be placed before a
judge”.8 Various structures have been used to
break down the reasoning process into its component parts in order to facilitate
a greater understanding
of it.9 At Bond University, the
School of Law uses variations of the method of problem solving described by the
acronym MIRAT.10 The acronym stands for:
M – material/missing facts
I – issues
R – rule (principle) of law
A – application/argument
T – tentative conclusion
The key to its success is its simplicity, which makes
it sufficiently adaptable for use in most undergraduate law subjects.
In considering any fact pattern, and particularly in examinations,
where there are often severe time constraints, many students find
that they only really identify the material/missing facts once they
have determined the issues and the applicable rules of law.
Identification of material/missing facts can be
seen as a continuum. With most fact patterns there are some obviously
material/missing
facts. Depending on the level of expertise of the student,
these should be identified early on in the reasoning process. Other facts
are
less obviously material/missing and would generally be recognised as such after
the identification of the issues. Further material/missing
facts would likely
only become evident after consideration of possible sub-issues. Experience in
teaching MIRAT at Bond has shown
that many students find it confusing to have to
identify material/missing facts first, when they keep finding new
material/missing
facts as they progress through the reasoning process.
To
help students cope with this process, and as a variation on MIRAT, Kay
Lauchland11 has put forward the idea of a spiral.
Students, when faced with a fact pattern, should first work through a
series of questions about the issues, the applicable rules and the
application of those rules to the facts. Having done this, students
should then be in a position to set out their analysis in written form.
The final step is to ask if any other facts are material/missing and
start the process again: hence the spiral concept. The written
element of this format gives the acronym IRAFT.
This is one possible
approach, and is the approach studied in this paper. Whatever the structure
chosen, the question remains as to
whether using such structures to teach legal
reasoning actually works.
Scope of the Research
To answer that question, I commenced research into
the effectiveness of using the IRAFT structure in teaching legal
reasoning.12 Extensive discussions were held with
fellow faculty members at Bond University. From the queries and perceived
problems raised by
members of faculty it was possible to isolate certain key
considerations which would form the basis for the research.
When examined in the context of a course in which students are being
taught legal reasoning using the IRAFT structure, these considerations
formed the basis of the research questions addressed in Part 2.
Some faculty members were sceptical of the usefulness of structures in assisting students with the legal reasoning process, particularly in examinations. These faculty members felt that the quality of the answers given, whether or not a structure was used, fell below their expectations.
THE RESEARCH SAMPLE13
The Sample Group and Why It Was Chosen
The research was conducted during a taxation law course. Most students would
take taxation law about half way through their law degree,
but the sample group
included students from their second to eighth (final) semester. Accordingly,
they would, in most cases, have
had reasonable exposure to law subjects
including teaching on the use of basic structures to assist in legal reasoning
and analysis.
The basic structure used would in almost all cases have been MIRAT
or a variation such as IRAFT.
It was felt that using a course with most
students in the middle of their degree would be useful. The students should have
overcome
any initial culture shock they may have had in a first year course,
which could have distorted the results.14 This meant
that the research was aimed at discovering whether IRAFT is appropriate as a
tool for students, with a basic knowledge
of law, to use in helping them to give
structure to their own process of legal reasoning.
Another advantage in
using a mid-degree course was that the emphasis on legal reasoning and analysis
in the early part of their degree
should have firmly established the importance
and some experience of these skills in the students’ minds. This
observation
is based on schema theory, which is fundamental to my analysis of
students’ use of structures in their legal reasoning. Schema
theory argues
that in every aspect of human experience we develop patterns to explain the
relationship between the different elements
of those experiences; to impose an
order on our sensory input.15 On entering any new field
of knowledge, we have to build up an interpretive framework, based on past
knowledge and experience, that
allows us “to make sense out of the bits
and pieces of information presented to us in given
situations”.16 Novices in any area have limited
knowledge and experience. Long periods of learning and experience are required
to build up expert
knowledge structures.17 Law is no
exception and students entering law school have to build up a
“legal” interpretive framework.18
This
can be particularly traumatic for students with high level interpretive
frameworks, which have worked well for them in other
fields, when they are faced
with the demand to develop a new interpretive framework as they commence their
study of law.19 Using a mid-degree course ensures that
this initial trauma should not affect results and the students should have
developed beyond
the lowest level novice interpretive framework.
Taxation
law was in many ways an ideal course to use in the research. It is strongly
statute based, to an extent not found in most
of the earlier courses taken by
students. In this sense a student’s interpretive framework built up in the
early part of the
law degree needs to be adapted to cope with what is
effectively a new rule structure. This provides an ideal opportunity to test
the
students’ legal reasoning skills as they approach a different style of
problem and adapt and expand their interpretive
frameworks.
Possible Problems with the Research Sample: The Ethical Issue20
The ethical issue of using students to assist in
research of this kind was addressed by explaining to students in detail the
nature of the research and by asking those who did not wish to participate to
indicate this to the lecturer or their tutor. It was stressed
that participation
or otherwise would in no way affect assessment of students’ performance.
This last assertion was given further weight by the fact that all written
work and examination scripts are marked blind in the School of Law. Only the student
number must appear
on the script.
No students indicated an unwillingness to
participate. In fact only 50 out of the 146 students enrolled for the course
completed all
three written elements used in the research, although 101 students
completed at least one of the two non-compulsory elements of the
research.
Students at Bond University have a three semester year. They usually complete
four pieces of assessment during each semester
and have weekly tutorials, at
which their performance is assessed. A number of students stated that pressure
of work prevented them
from completing the first two written elements used in
the research. This would suggest that non-participation could be attributed
to
pressure of work, rather than an unwillingness to participate.
Possible Problems with the Research Sample: The Hawthorne Effect
The Hawthorne effect21
suggests that knowingly being part of an experiment can improve performance so
as to distort the results. Otherwise known as expectancy
bias, it has been shown
in a wide range of experiments that the expectations of the experimenter can be
transmitted to the subjects
and can powerfully influence the subjects’
responses.22
It is suggested that the methodology
used in this experiment would not have produced sufficient distortion to
invalidate the results.
The process was part of the ordinary teaching program
and no mention was made of the experiment after the initial explanations.
Students
participating indicated verbally that they valued the opportunity to
practise exam type questions and to obtain feedback. Similar
opportunities are
available to students every semester. The likelihood of distortion with respect
to the examination question that
formed part of the experiment is particularly
low. Students did not know which examination question would be used in the
experiment
and their incentive to do well in the examination to attain a high
grade was far greater than any incentive they may have had as
participants in
the experiment.
METHODOLOGY
Legal Reasoning in Standard Problems
Three standard undergraduate problems23 were completed by students in the sample group as part of the ordinary teaching program. The first two were handed out in weeks 7 and 11, and students had a week to complete them. The third was part of the examination in week 14.24 All three were of a similar level of difficulty. Scripts from the first two problems were returned to students with limited written feedback, but a full model answer using the IRAFT structure was provided.
Reinforcement Through Teaching and Learning
In the course of a two hour lecture in week 7 the
lecturer reviewed the use of the IRAFT model and demonstrated it using several
examples.
During the ten hours of lectures over the final five weeks of the
course, the lecturer continued to demonstrate the use of IRAFT
using examples in
each lecture. The examples chosen covered both simple, single issue problems and
complex, multiple issue problems.
Small group
tutorials25 were also carried out over this period, in
which students were sometimes given a complete written problem and sometimes had
to seek
further information with respect to the problem in order to be able to
analyse it properly and formulate possible solutions. A mixture
of directed
questioning by the tutor, demonstration and student-led learning was
used.26 Guidance was provided by tutors in ensuring an
understanding of the legal reasoning process used in answering the tutorial
problems.
Any demonstration of the legal reasoning process by tutors used the
IRAFT structure.
Collation and Assessment of Results
The collation and assessment of the research questions, and the assessment
of the exam questions for research purposes, took place in the following
semester.27 Each of the answers
to the questions was marked out of ten. Marks were recorded by student number. A
sample of the marks was checked by another lecturer marking blind. The
sample showed an average difference between the original and the second
marker of less than one quarter of a mark, with no single variation greater
than one mark.28 Accordingly, the marks given have been
assumed to be reliable and consistent.29
All
scripts were then reviewed for a second time by the lecturer, again by student
number and without reference to the mark given
for each script. This time it was
to determine whether the IRAFT structure or any other structure had been used in
answering each
question and if so, how well it had been used. Note was also made
of which aspect of the structure identified had been used inadequately
in
answering the question. The use of the structure was classified as good,
satisfactory or poor. Any more detailed classification
would have been too
subjective to provide adequate data.30
The marks
and rankings as to the use of a structure were then entered into a composite
spreadsheet for each of the answers to the
three questions. The student numbers
were then removed to give anonymous raw data.
It was found that only 50 out
of 146 students had completed all three questions used in the experiment. This
group of 50 students
was used as the core sample for analysis. The results are
set out in Figures 1 and 2.
All three problems required the
consideration of only one major issue and the consideration of two or three main
rules. The authority
for those rules was based in case law and statute. There
were a number of different cases which could have been cited as authority,
but
only one section or sub-section of the statute was applicable. The problems were
of average difficulty for an undergraduate subject.
A good answer using an IRAFT
approach would have to show an understanding of the issue involved, the relevant
rules of law and their
application to the particular facts in question, drawing
a valid tentative conclusion on the basis of a well reasoned argument. An
example of an answer to a problem using the IRAFT approach is set out in
Appendix A.
PART 2: ANALYSIS AND DISCUSSION OF RESULTS
The results are drawn from the raw data, which is summarised in Figures 1 and 2, and from basic regression analysis performed to identify statistical correlation and relationship(s).31
FIGURE 1: Table Showing the Use of Method and Marks Achieved by Students Participating in the Experiment (percentages are of the core sample of 50 students)

Use of method and          Question 1        Question 2        Exam Question
marks out of 10            No        %       No        %       No        %

Good
  8–10                     15       30       11       22       15       30
  5–7                       9       18       14       28       17       34
  <5                        1        2        1        2        –        –
  Total good               25       50       26       52       32       64

Satisfactory
  8–10                      –        –        –        –        –        –
  5–7                       9       18       13       26       11       22
  <5                        2        4        5       10        1        2
  Total satisfactory       11       22       18       36       12       24

Poor
  8–10                      –        –        –        –        –        –
  5–7                       2        4        1        2        2        4
  <5                        1        2        1        2        2        4
  Total poor                3        6        2        4        4        8

Other methods
  8–10                      –        –        1        2        1        2
  5–7                       4        8        2        4        1        2
  <5                        7       14        1        2        –        –
  Total other methods      11       22        4        8        2        4
FIGURE 2: Table Showing a Summary of the Marks Achieved by Students Participating in the Experiment (percentages are of the core sample of 50 students)

Marks out of 10            Question 1        Question 2        Exam Question
                           No        %       No        %       No        %

  8–10                     15       30       12       24       16       32
  5–7                      24       48       30       60       31       62
  <5                       11       22        8       16        3        6
  Total students           50      100       50      100       50      100
ANALYSIS OF RESULTS
Did Students Use the Structure Taught?
From Figure 1 it can be
seen that in the first question, 22% of students used either a method other than
IRAFT or a method not discernible
to the markers. The percentage decreased to 8%
in the second question and to 4% in the exam question. Of those students who did
not
use IRAFT, only one student used an alternative discernible structure which
remained consistent for all three questions.
The results are not surprising
following a period of teaching and learning which emphasised the usefulness of
the IRAFT structure
in the legal reasoning process. Similar results could be
expected whatever the structure taught. Students would tend to use any technique
demonstrated and affirmed by a lecturer who is setting assessment which they
have to pass.32 It is interesting to note that only one
of the students showed sufficient confidence in her/his legal reasoning
process to develop her/his own structure and to use it consistently and
successfully.
Did Students Using the Structure Use It Well, and If Not, Why Not?
Of the students who used IRAFT, the percentage whose
use was “good” was 64% for the first
question,33 56% for the second
question,34 and 67% for the exam
question.35 In contrast, the percentage of students
whose use of IRAFT was “poor”, was 8% for the first
question,36 4% for the second
question37 and 8% for the exam
question.38
The 150 answers were analysed to
ascertain any area where the use of IRAFT was not “good”. This
analysis included answers
by students whose overall use of the structure
was “good”. In 28 answers, students had difficulty identifying the
appropriate rules, while in 27
answers they had difficulty applying the
appropriate rules to the material facts. In a further 50 answers, students had
difficulty
both in identifying the appropriate rules and applying them to the
material facts. In only nine of the 150 answers was there a problem
in
identifying the relevant issues.
Clearly a majority of students find the
structure simple to understand and to use well. This suggests that IRAFT is very
successful
as a simple structure to help students in the legal reasoning
process. However, in using the structure, students have most difficulty
in
identifying the appropriate rule to apply and then in actually applying the rule
to the material facts. The structure does not
assist with carrying out these
aspects of the legal reasoning process other than to identify them as steps in
the process.
It is not the purpose of this article to move to the next step
and try to explain why students have difficulties with rule identification
and
application. Mitchell39 believes that it is because
students are in the early stages of moving along a continuum, which starts with
the “novice”
interpretive framework of the first year law student
and moves to the “expert” interpretive framework of the practising
lawyer. His article provides useful suggestions for law teachers to help
students develop their thinking processes.40 However,
further research is required to explore and test the validity of these ideas.
It could be argued that the use of a successful structure may encourage
students to adopt a surface approach. This may well be so.
However, the
structure is merely a model or scheme to assist students with a limited legal
knowledge base to cope with the organisation
and manipulation of large amounts
of information in areas with which they are unfamiliar. Structures to help in
the legal reasoning
process are tools to be used by teachers in the development
of expert interpretive frameworks in their pupils. It is the role of
the teacher
to go beyond the mere provision of a structure. Glaser sees a form of
interrogation and confrontation of basic structures
as essential to the
development of any student’s knowledge base.
Such structures, when they are interrogated, instantiated, or falsified, help organize new knowledge and offer a basis for problem solving that leads to the formation of more complete and expert schemata. The process of knowledge acquisition can be seen as the successive development of structures which are tested and modified or replaced in ways that facilitate learning and thinking.41
Did Students Using the Structure Use It Consistently?
There was a correlation between the
“good” use of IRAFT by students in question 1 and their
“good” use of
it in question 2. However, there was no such
correlation between the “good” use of IRAFT in question 1 and its
subsequent
“good” use by those students in the exam question. There
was a correlation between the “good” use of IRAFT
in question 2 and
its subsequent “good” use by those students in the exam question.
What can be suggested from this is that using the IRAFT method well in one
question did not guarantee its being used well in subsequent
questions. A skill
is only acquired with practice. It would have been interesting to see whether
the correlation between a “good”
use of IRAFT in the second question
and the exam question would have continued for a third question, given that
students would then
have had more practice in using the structure.
Did Students’ Use of the Structure Improve?
For the first question, 39 of the 50 students used IRAFT.
This increased to 46 students
for the second question
and 48 students for the exam question. Of those not using IRAFT only one student
used an identifiable alternative
structure consistently over more than one
question.
It can be seen from the results to the question, “did
students using the structure, use it well, and if not, why not?”
that the
percentage of students whose use of IRAFT was “good” did not
increase significantly. However, it is interesting
to note that of the nine students who changed from another or no
discernible method in the first question to using IRAFT in the
second question,42 only
two showed a “good” use of IRAFT. Subsequently, in the exam
question, six of those nine students showed a “good”
use of IRAFT.
The fact that only one student consistently used an identifiable structure
other than IRAFT, suggests that it could be reasonable
to test the improvement
in use of IRAFT over 49 students for all three questions. On that basis,
the proportion of students showing a “good” use of IRAFT would be 51% for
the first question, 53% for the second question and 65% for the exam
question. This does
show an improvement
by students in their use of an identifiable structure. It is
also consistent with an expected improvement in performance as the students
practised the skill of using the structure in the legal reasoning process.
Did Students’ Marks Improve as a Result of Using the Structure?
From an analysis of pure marks over the three
questions there was no statistically significant
correlation.43 However, there appears to be a trend
showing that students generally improved their performance over the three
questions.
For each individual question, it was clear that students who
achieved good marks tended to show a “good” use of IRAFT
in their
answers. There was an extremely strong correlation between the
two.44 In an analysis of the use of IRAFT in all three
questions looked at together, there was an extremely strong correlation between
the
achievement of a high mark and the “good” use of
IRAFT.45
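The reported relationship between skilled use of the structure and marks can be illustrated with a short calculation. The sketch below is hypothetical: the article does not publish its per-student raw data, so the coded values and marks are invented for illustration only, with use-of-structure coded ordinally (poor = 0, satisfactory = 1, good = 2) and paired with marks out of ten.

```python
# Illustrative sketch only. The coded data below are hypothetical, not the
# study's actual per-student results, which were not published.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical coding for ten students: 0 = poor, 1 = satisfactory, 2 = good.
use_of_structure = [2, 2, 1, 2, 0, 1, 2, 1, 0, 2]
marks_out_of_10  = [8, 9, 6, 7, 3, 5, 8, 6, 4, 9]

r = pearson(use_of_structure, marks_out_of_10)
print(f"r = {r:.2f}")  # a strongly positive r mirrors the reported pattern
```

A value of r close to +1 on data of this shape is what the "extremely strong correlation" reported above would look like in numerical terms.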
It is interesting to note further, that of
the 32% of students achieving 80% or more for the exam question, all showed a
“good”
use of IRAFT or another method. Students who did not show
“good” use of a structure did not achieve over 70%.
The results
did not show that students who showed “good” use of IRAFT in an
earlier question consistently achieved higher
marks in the later questions. It
was shown to be generally true that a student who showed “good” use
of IRAFT in the first question in week 8 did achieve good marks in the
second question in week 12.46 However, neither a student who
showed “good” use of IRAFT in the first question in week
8,47 nor a student who showed “good” use of
IRAFT in the second question in week 12,48 necessarily
scored a high mark in the exam in week 14. So the closeness of the second
question to the exam did not necessarily help
students who showed
“good” use of IRAFT in the second question to use it well enough in
the exam to obtain a high mark.
There is clearly a very consistent
relationship between using a structure skilfully in an answer and achieving a
high mark. However,
as discussed above, students who showed that they could use
the structure skilfully, did not necessarily do so consistently.
DISCUSSION OF RESULTS
The use of IRAFT or any other structure to help in the analysis of legal fact patterns and the process of legal reasoning is largely an acquired skill. As with any skill, natural talent can enhance it and for some people it comes more easily than to others. The effective use of a skill is likely to diminish under pressure. The students’ use of IRAFT is largely consistent with these observations.
Students Not Using IRAFT
A good application of IRAFT is likely to produce a
better result for the student. For those students who did not use IRAFT, it was
only those who used a discernible alternative structure who achieved pass marks.
Most of the students who started out with an alternative
or no discernible
structure subsequently adopted IRAFT in their answers (the number not using
IRAFT shrank from 22% in the first
exercise to 4% in the exam) and by the exam
there was a significant improvement in their marks.
Of the two students who
did not use IRAFT in the exam, one used a structure which did not clearly
identify the rules until after trying
to apply them to the facts. This did not
work successfully and the student averaged a bare pass, with a best mark of 60%
in the exam.
The remaining student who did not use IRAFT per se, in fact
used a more sophisticated version of it. After identifying the issues,
the
student discussed the rules and their application to the facts in a clear and
cogent manner, linking them through to the conclusion.
Yet there was no specific
identifiable order in the discussion. In my view it was representative of a
developed structure, reflecting
a more expert than novice interpretive
framework.49 The student achieved an average in the top
14%.
What Makes a Good Student?
The other students in that top 14% all used IRAFT
well. Again, this in my view reflects a move towards using an expert
interpretive
framework. Many other factors will influence the consistent
achievement of a good mark. However, in the light of the other results,
the use
of an effective structure to assist in the reasoning process stands out as a
strong indicator of which students will generally
achieve good marks and which
will not.
The number of students who used a structure well increased from
50% in the first question to 64% in the exam question. The number
of actual
failures decreased from 22% in the first question to 6% in the exam
question (refer Figure 2). It could be said that the exam conditions would
result in better marks in any event. However,
the correlation between the better
marks and a “good” use of IRAFT or another structure cannot be
ignored.
Part 3: Conclusions50
WHY DO STUDENTS FAIL TO DELIVER IN THE EXAM?
Consideration can now be given to the perceived problem that the quality of students’ answers in examinations falls below the expectation of the markers. This research has in fact shown that in this course the students did deliver what was required of them in examinations. They did not fall apart and give sub-standard answers. In accordance with what should be expected of the teaching process, the students improved over the course and performed at their peak in the exam. Indeed, only the best students showed evidence of having acquired something of an expert interpretive framework. Yet, in my view, this is a totally appropriate result to expect from a course taken half way through a law degree.
THE SUBJECTIVE VIEW
As indicated in Part I of this article, there is a
tendency for some law teachers to expect higher standards than in fact the
majority
of students are able to achieve. This could be explained in part by the
fact that law teachers have “expert” interpretive
frameworks in
relation to their own subjects and find it difficult to relate to the
“novice” interpretive approach taken
by many
students.51
Nonetheless, the objective research
into the students’ answers in this experiment, shows that students
appeared to be progressing
along the continuum of knowledge and application of
that knowledge. The students did use the legal reasoning process taught; their
marks were better because they used that process; and on an objective test of
the marks, as checked by an independent marker, they
performed to the standard
expected. Indeed, most had not yet developed an “expert”
interpretive framework. But they were
progressing satisfactorily along the path
to doing so.
Anecdotal evidence is not an adequate basis on which to give
opinions or make decisions of substance. As discussed in Part I of this
article,
I and some of my colleagues perceived that students were failing to use
structures given to them to assist in the development
of their legal reasoning
and, perhaps as a result, were producing poor answers to problems. It was on the
basis of this anecdotal
evidence that I undertook this research. The results
suggest that the anecdotal evidence and subjective perceptions were largely
false.52
USING STRUCTURES TO TEACH LEGAL REASONING: SOME CAVEATS
It is important to recognise that there are many
difficulties which arise in teaching legal reasoning. The use of IRAFT will not
solve
these difficulties. Often, the fact patterns students are given to solve
bear little relation to those faced by practitioners. Of
necessity, as teachers
try to present material to students in learnable, bite-sized chunks, problems
are placed in boxes, such as
“Contract” or “Tax”. This
is unlikely to happen in real life. Even when a problem given does purport to
cover
several areas of law, it is only usually final year law students who could
hope to cope with all the legal nuances of most real-life
fact
patterns.53
A fact pattern presented to students
also generally includes the material facts necessary to solve any problems
arising in that fact
pattern. Any missing facts are covered by the use of
“assumptions”. In a traditional law school program, the real life
skills in drawing out and recognising facts material to a client’s problem
can be dealt with through training in other skills
such as client interviewing
and areas of dispute resolution.
Applying legal reasoning to problems raised
in set fact patterns presupposes a certain knowledge base in the area of the law
in which
the problems are set. Simply providing students with a framework for
legal reasoning will not generally be sufficient for someone
with no knowledge
in an area of law to solve a problem in that area.54
Furthermore, there is seldom, if ever, one right approach, one right analysis or
one right answer to any legal problem.
Nonetheless, legal reasoning is an
essential lawyering skill that law teachers want their students to learn. It is
all very well to say that law teachers should introduce teaching methods
which overcome the drawbacks of current methods of teaching legal reasoning
and analysis in isolation from the real-world environment. For most
undergraduate law school programs that is impractical given the
funding and resources available.
The important point is that law teachers need to be aware of the problems
inherent in the methods
in order to compensate for those shortcomings.
THE RESULTS IN CONTEXT
The IRAFT structure was chosen as being fairly
representative of models developed for use in the legal reasoning
process.55 In the experiment, the results showed that
students did use the structure taught and its use in the legal reasoning process
did assist
students in their analysis of basic fact patterns. There was also a
very strong relationship between use of the structure and the
achievement of
high marks. However, the structure was not applied consistently; consistent application would probably come only with practice and experience, as the novice student moved towards the acquisition of an expert interpretive framework.
Significant research needs to be
undertaken into the teaching and learning of the legal reasoning process.
Particular problems for
students identified by this experiment were how to
recognise the appropriate legal rules and how to apply those rules to the
material
facts. Useful study could be done on the best methods of using a
structure in the teaching and learning of legal reasoning skills
in the context
of these particular problems.
As William Twining said:
“what is involved in teaching, learning and assessing individual professional skills is under-theorised and under-researched. The result is that almost everyone involved in general debates about professional competency and professional training ... do not really know what they are talking about ...”.56
What also comes out of the experiment described in this article is that empirical research can contradict the pure anecdote which can so easily shape the way skills are taught in law schools.
Legal
reasoning is not some mystical talent given to the fortunate and favoured few.
It is a skill to be taught as part of a structured
and incremental curriculum,
designed to best take the novice first year law student to the threshold of an
expert interpretive framework,
sufficient to equip that law student to step out
into any one of the diverse jobs now open to lawyers.
APPENDIX A
SAMPLE PROBLEM
Gold Coast Machinery Ltd
uses a complex and specifically designed electrical conveyor system which
conveys machines to the loading
dock, where it lifts them onto the trucks. The
conveyor system was originally commissioned in 1984 and cost $3.8 million. In
1994,
the original gearbox, which formed part of the engine which runs the
conveyor system, reached the end of its useful life. It was
replaced by a new
gearbox designed to present day technical standards.
The new gearbox cost
$300,000 and is being used in conjunction with the existing drive motors. It has
been designed for future power
upgrades but will only be able to increase the
original design capacity of the machinery if new motors are fitted. This is
expected
to happen in 1995, once the current general upgrade to the machinery
has been completed.
Is the gearbox replacement a repair to the machinery for the purposes of section 53?
Write an answer to this problem of not more than one page.
Bring the answer to your tutorial and it will be collected by your tutor.
SAMPLE SOLUTION USING IRAFT
The requirements of section 53 are met, with the
possible exception that the expenditure incurred may be capital in nature. This
is
the issue to be determined in this question. (Issue)
Whilst
numerous cases suggest that it is essentially a question of fact whether or not
expenditure is of a capital nature, two principles
have been established.
(General Rule)
The expenditure must be incurred in relation to the
renewal or replacement of a part and not of an entirety. (Rule)
In
Lindsay v FCT a slipway in a shipyard was held to be an
“entirety”, being identifiable as a separate item of capital
equipment. In contrast,
the relaying of 74 out of some 394 miles of a railway
line in order to remove specific defects and to restore the line to its normal
condition, was held to be a repair to the railway line as a whole. The Privy
Council in Rhodesia Railways Ltd v Resident Commissioner &
Treasurer Bechuanaland Protectorate held that it was a periodical renewal
and did not constitute a reconstruction of the whole railway. Rowlatt J in
O’Grady v Bullcroft Main Collieries Ltd said that the
identification of an “entirety” is largely a matter of impression
and degree. (Rule)
In the present case it seems that the gearbox is
part of the machinery rather than being a separate asset in its own right.
Neither
is it the major part of the machinery and unlike the slipway, it can be
regarded as a mere component of the whole. (Application of the Rule to the
Specific Facts and Tentative Conclusion)
The expenditure must not result
in a substantial improvement, addition or alteration to the existing asset.
(Rule) It was held in W Thomas & Co Pty Ltd v FCT that a
repair must restore an asset’s efficiency in function without improving on
it. Most cases illustrating the application
of this principle relate to
buildings and do not help in this case. (Rule)
Although the gearbox
is specifically designed for a future power upgrade, an improvement to the
function of the machine can only occur
with the installation of new motors.
Accordingly, it is arguable that the new gearbox has not in itself resulted in a
substantial
improvement, addition or alteration to an existing asset.
(Application to the Facts)
However, a substantial improvement may be
effected by a series of piecemeal repairs, which should each then be regarded as
capital
expenditure (FCT v Western Suburbs Cinemas Ltd). This is
particularly so when the expenditure involved is substantial. (Rule)
As the new gearbox was installed as part of a wider plan to upgrade the
machinery and the installation of new motors is anticipated
as part of this
upgrade, the replacement gearbox should be treated as a capital asset.
(Application of the Rule to the Specific Facts and Tentative Conclusion)
Accordingly, it is strongly arguable that the expenditure is not
deductible as a repair under section 53. (Tentative Conclusion)
* Assistant Professor, School of Law, Bond University. This article was born out of the discussions of the Action Research Group in the Bond University Law School, who should take the credit for any good ideas. I would like to express my particular thanks to Associate Professor Kay Lauchland, Skills Co-ordinator in the Bond University Law School for her constant encouragement and helpful comments throughout the writing of this paper; Associate Professor Tapen Sinha of the School of Business at Bond University, for his useful comments and without whose input the statistical analysis and interpretation could not have been performed and Professor John Wade of the Bond University Law School, for his guidance and comment.
1 American Bar Association (ABA), Legal Education and Professional Development — An Educational Continuum, Report of the Task Force on Law Schools and the Profession: Narrowing the Gap (Illinois: American Bar Association, 1992). For an accessible review from an Australian perspective see EE Clark, Legal Education and Professional Development — An Educational Continuum, Report of the Task Force on Law Schools and the Profession: Narrowing the Gap, (Illinois: American Bar Association, 1992) [1993] LegEdRev 9; (1993) 4 Legal Educ Rev 201.
2 ABA, id, at 8.
3 Id at 138–140.
4 See for example, GW Paton & DP Derham, A Text-book of Jurisprudence (4th ed., Oxford: OUP, 1972) ch 8; Lord Lloyd of Hampstead & MDA Freeman, Lloyd’s Introduction to Jurisprudence (5th ed., London: Stevens & Sons, 1985) ch 12; M Davies, Asking the Law Question (Sydney: Law Book, 1994) ch 2; G Morris, C Cook, R Creyke & R Geddes, Laying Down the Law (3rd ed., Sydney: Butterworths, 1992) Part Two; and FK Maher & PL Waller, Derham, Maher, Waller: An Introduction to Law (6th ed., Sydney: Law Book, 1991) Part III.
5 Examples can be found in many textbooks, such as IG Wallschutzky & GL Payne (eds), Tax Questions and Answers (Sydney: Butterworths, 1994); L Griffiths, Corporations Law Workbook (Sydney: Law Book, 1994); and S Graw, An Introduction to the Law of Contract (2nd ed., Sydney: Law Book, 1993). A detailed discussion of the design of undergraduate problems is found in most introductory texts, such as Glanville Williams, Learning the Law (11th ed., London: Stevens & Sons, 1986) ch 8 and Morris, Cook, Creyke & Geddes, ibid, ch 3.
6 “Experiments (which can be conducted either in laboratory or field settings) include those observational studies in which data are collected under conditions where behavioural choices are limited, or in some way constrained by the controlled manipulation of variables and measures selected by the researcher.” WD Crano & MB Brewer, Principles and Methods of Social Research (Newton: Allyn & Bacon, 1986) 19.
7 Readers not interested in the detailed description of the research methodology may wish to read only the “Background” section in this part before moving to the second part.
8 S. Nathanson, The Role of Problem Solving in Legal Education (1989) 39 J Legal Educ 167.
9 See for example, BJ Ward, The Problem Method at Notre Dame (1958) 11 J Legal Educ 100, in which Ward breaks the process down into 1) material fact identification; 2) issue identification; 3) rule identification; and 4) application of the rules to the facts. N Jackling, J Lewis, D Brandt & R Sell, Problem Solving in the Professions (1990) 9 Higher Education Research and Development 133, use an algorithm to help structure the legal reasoning process. They use similar elements: 1) identification of the issues; 2) marshalling facts; and 3) applying the relevant law to the facts. The Bar Examiners’ Handbook (2nd ed., 1980) at 287–291, quoted in M Josephson, Learning and Evaluation in Law School Vol 2 (Los Angeles: 1984) at 491–495, uses the elements: 1) analysis of the problem; 2) knowledge of the law; 3) application and reasoning; and 4) conclusions. The ABA Task Force Report, supra note 1, at 151–157, sets out in great detail the steps in the process of legal reasoning and analysis. They can be summarised broadly as: 1) analysis of the facts; 2) formulation of the legal issues; 3) identification and formulation of pertinent rules or principles of law; 4) application of the rules and principles to material facts; 5) formulation of relevant legal theories by analysing and synthesising the pertinent legal rules and principles in light of the facts; 6) elaboration and enhancement of legal theories; and 7) evaluation of the efficacy of legal theories in persuading decision makers to reach a particular result.
10 JH Wade, Meet MIRAT: Legal Reasoning Fragmented Into Learnable Chunks (1990–91) [1991] LegEdRev 14; 2 Legal Educ Rev 283.
11 Associate Professor and Skills Co-ordinator, School of Law, Bond University.
12 The idea for the research arose out of discussions with colleagues in a teaching interest group in the Law School at Bond University.
13 Readers not interested in the detailed research methodology should now move on to Part 2 of the article.
14 This aspect is one of many excellent points raised by Mitchell JB, Current Theories on Expert and Novice Thinking: A Full Faculty Considers the Implications for Legal Education (1989) 39 J Legal Educ 275 at 287. See further on this issue of novice versus expert thinking: RC Anderson, The Notion of Schemata and the Educational Enterprise, in RC Anderson, RJ Spiro & WE Montague, (eds) Schooling and the Acquisition of Knowledge (NJ: Hillsdale, 1977), JB Biggs & R Telfer, The Process of Learning (2nd ed., Sydney: Prentice-Hall of Australia, 1987) and Robert Glaser, Education and Thinking: The Role of Knowledge (1984) 39 Am Psychologist 93.
15 Anderson, id at 417–418.
16 Mitchell, supra note 14 at 277.
17 Glaser, supra note 14 at 98–99.
18 Mitchell, supra note 14 at 277.
19 Id at 289–292.
20 Ethical concerns arise as soon as students are used as the subjects of any experiment. Crano & Brewer, supra note 6 at 323, state that “(t)he guidelines for psychological research set by the American Psychological Association’s Committee on Ethical standards (1983) and by the President’s Panel on Privacy and Behavioural Research (1967) stress the idea of recruiting subjects for such research on the basis of ‘informed consent’ — that is, that participation be voluntary and with the volunteer’s full knowledge of what participation will involve.”
21 FJ Landy & DA Trumbo, Psychology of Work and Behaviour (Homewood: Illinois, 1980).
22 Refer for example, R Rosenthal, Experimenter Effects in Behavioural Research (New York: Appleton-Century-Crofts, 1966).
23 As discussed, supra note 5 and text.
24 DT Campbell & JC Stanley, in Experimental and quasi-experimental designs for research (Chicago: Rand-McNally, 1966) quoted in Crano & Brewer, supra note 6 at 28–29, have identified eight major threats to the internal validity of any research program. These are: 1) history — intervening events; 2) maturation — changes in the subjects studied; 3) testing — previous exposure to the measurement; 4) instrumentation — changes in nature of measurement instrument; 5) statistical regression — unreliable measurement; 6) selection — different selection procedures; 7) experimental mortality — varied dropout due to different treatment; 8) selection-history interactions — different selection procedures resulting in groups with different histories. The only one of these applicable to this experiment would be that the instrumentation varied. Using an exam question as the final test of the use of IRAFT may have resulted in distortion. Students may have performed differently because of the examination environment.
25 Students attended a one hour tutorial each week in groups of about 10 students. Attendance was high as tutorial performance made up 20% of the assessment.
26 This could be classified loosely as a mixture of the “case” and “modified case-based” methods, two of the six categories of problem-based learning methods identified by H Barrows in A Taxonomy of Problem- Based Learning Methods (1986) 20 Medical Educ 481. For further discussion of Barrows’ approach see inter alia, S Kurtz, M Wylie & N Gold, Problem-Based Learning: An Alternative Approach To Legal Education, (1990) 13 Dalhousie Law J 797, and A Blunden, Problem- Based Learning And Its Application To In-House Law Firm Training (1990) 8 J of Professional Legal Educ 115 at 116–118.
27 This avoided the problem of initial observations influencing the interpretation of subsequent observations. Refer Crano & Brewer, supra note 6 at 217.
28 The importance of this check lies in the converse of the Hawthorne effect or experimenter expectancy, namely experimenter bias. It is suggested that experimenters over-anxious to confirm their theoretical expectations may show bias in their recording, observation or computation of results. Refer Crano & Brewer, supra note 6 at 90–92.
29 Reliability of measurement data in observational research such as this, is discussed further by KE Weick, Systematic Observational Methods, in G Lindzey & E Aronson (eds), The Handbook of Social Psychology (3rd ed., Reading: Addison-Wesley, 1985).
30 Crano & Brewer, supra note 6 at 218–228. See also, Weick, id at 38.
31 For readers unfamiliar with statistical analysis, it is important to note that the correlational statistics used here provide information on the association, if any, between two variables: the use of IRAFT and the marks achieved. They cannot tell us that the use of IRAFT caused students’ marks to improve; they can only indicate, at a stated level of confidence (say 95%), that good use of IRAFT is associated with higher marks. See further, SK Kachigan, Statistical Analysis (New York: Radius Press, 1986) ch 10. For this experiment, the data were arranged into three categories: first question, second question, exam question. The use of IRAFT was graded as good, satisfactory and poor. For the purpose of Pearson Product Moment Correlation, good was assigned the number 3, satisfactory the number 2 and poor the number 1. This assignment was subjective in that there is no statistical basis for saying that satisfactory use of IRAFT is twice as good as poor use of IRAFT, and so on. For Spearman Rank Correlation the ranking assigned was: good, 4; satisfactory, 3; poor, 2; and other or no method used, 1. The Use heading refers to the use of IRAFT in each of question 1, question 2 and the exam. The Q1, Q2 and Exam headings refer to the pure marks for each of the questions.
The results were as follows:
Pearson Product-Moment Correlation

          Use Q1   Use Q2   Use E    Q1       Q2       Exam
Use Q1    1.000
Use Q2    0.454    1.000
Use E     0.248    0.487    1.000
Q1        0.680    0.413    0.095    1.000
Q2        0.319    0.340    0.249    0.447    1.000
Exam      0.070    0.194    0.493    0.224    0.255    1.000

Spearman Rank Correlation

          Use Q1   Use Q2   Use E    Q1       Q2       Exam
Use Q1    1.000
Use Q2    0.492    1.000
Use E     0.199    0.550    1.000
Q1        0.724    0.492    0.164    1.000
Q2        0.297    0.501    0.359    0.410    1.000
Exam      0.147    0.255    0.530    0.250    0.295    1.000
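The Pearson and Spearman coefficients reported above differ only in that Spearman is computed on ranks rather than on the raw coded values. As a minimal sketch (the per-student data below are invented for illustration; the article does not publish the raw scores), the two statistics can be computed as follows:

```python
def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(x):
    """1-based ranks; tied values share the average of their ranks."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation is Pearson applied to the ranks."""
    return pearson(ranks(x), ranks(y))

# Hypothetical coding per footnote 31: good = 3, satisfactory = 2, poor = 1
use_q1 = [3, 2, 3, 1, 2, 3, 1, 2]    # graded use of IRAFT on question 1
marks_q1 = [8, 6, 9, 4, 5, 7, 3, 6]  # raw marks for question 1

print(pearson(use_q1, marks_q1))
print(spearman(use_q1, marks_q1))
```

As footnote 31 stresses, either coefficient measures association only; a high value does not show that using IRAFT caused the higher marks.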
32 Refer supra note 14 and see also: R Rosenthal & KL Fode, Three Experiments in Experimenter Bias, (1963) 12 Psychological Reports, 491; and R Rosenthal & L Jacobson, Pygmalion in the Classroom (New York: Holt, Rinehart & Winston, 1968).
33 Twenty-five out of the 39 students who used IRAFT.
34 Twenty-six out of the 46 students who used IRAFT.
35 Thirty-two out of the 48 students who used IRAFT.
36 Three out of the 39 students who used IRAFT.
37 Two out of the 46 students who used IRAFT.
38 Four out of the 48 students who used IRAFT.
39 Mitchell, supra note 14.
40 Id at 283–296.
41 Glaser, supra note 14 at 101.
42 Two students changed from IRAFT to other methods.
43 The correlation between Q1 and Q2 was 0.447 and Q1, Q2 and the exam was 0.23.
44 There were F-ratios of 16.0 for Q1, 6.82 for Q2 and 11.5 for the exam.
45 There was an F-ratio of 27.9.
46 There was an F-ratio of 2.45, giving a 7% chance of error in the correlation.
47 There was an F-ratio of 1.87.
48 There was an F-ratio of 1.34. An F-ratio of less than 2 has been taken as unacceptable in showing evidence of correlation.
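Footnotes 44 to 48 report F-ratios comparing groups of marks. Assuming these come from a standard one-way analysis of variance (the article does not set out the computation, so this is a sketch only), the statistic compares variation between group means with variation within groups:

```python
def f_ratio(groups):
    """One-way ANOVA F-ratio: between-group mean square divided by
    within-group mean square.  Larger values indicate that the group
    means differ by more than within-group scatter would suggest."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n  # grand mean
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical marks for IRAFT users versus non-users (illustrative only)
users = [7, 8, 6, 9, 7]
non_users = [5, 6, 4, 5, 6]
print(f_ratio([users, non_users]))
```

On this reading, the threshold mentioned in footnote 48 amounts to treating an F-ratio below about 2 as too weak to evidence a real difference between the groups.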
49 Refer Mitchell, supra note 14.
50 It must be emphasised that this experiment is only the first step in showing that use of structures by students will improve their legal reasoning. The inference can be drawn from the results of the experiment that the hypothesis may be true. Further experimentation is required, primarily through falsification of as many competing hypotheses as possible. For example, the hypothesis should be tested that students’ legal reasoning will improve to the same extent without the use of a structure. See Crano & Brewer, supra note 6, for a discussion of the principles and methods which should be followed.
51 This would certainly flow from the analysis by Mitchell, supra note 14.
52 This is not unique to legal education. Cf the report by P Williams, R Williams, A Goldsmith & P Browne, The Cost of Civil Litigation Before Intermediate Courts in Australia (AIJA 1992), in which it was stated with reference to procedural reform in the civil litigation process: “(w)hat makes matters even more difficult is that reform challenges the often longstanding practices and preferences of participants. More often than not, these practices are justified on the basis of personal experience without the wider reference point of empirical research.”
53 This may well be different in law schools using curricula which try and overcome this specific problem, such as occurs in some clinical programs.
54 Mitchell, supra note 14 at 289.
55 For discussion of the various models used see supra note 9.
56 W Twining, Taking Skills Seriously (1986) 4 Jnl Prof Leg Ed 1. See also, C Roper, Issues in Skills Training in Australia, an as yet unpublished paper presented at the 1994 Professional Legal Skills Conference, held at Bond University 10–11 February 1994.