Legal Education Review


Greaves, Kristoffer; Lynch, Julianne --- "Is the Lecturer in the Room? A study of Student Satisfaction with Online Discussion in Practical Legal Training" [2012] LegEdRev 7; (2012) 22(1) Legal Education Review 147


IS THE LECTURER IN THE ROOM?

A STUDY OF STUDENT SATISFACTION WITH ONLINE DISCUSSIONS IN PRACTICAL LEGAL TRAINING

KRISTOFFER GREAVES* AND JULIANNE LYNCH**

I  INTRODUCTION

This article reports on exploratory practitioner research conducted at a practical legal training (PLT) college in 2011 (the site), which sought to identify the major factors associated with student satisfaction with online discussion forums as a medium for teaching and learning in practical legal training.1

In Australia, law graduates must complete PLT to be eligible for admission to the legal profession.2 The Australasian Professional Legal Education Council (APLEC) website identifies more than 20 PLT providers.3 Many of these use some form of online interaction as a teaching medium.4

The effectiveness of online interactions in PLT in Australia has attracted comment.5 There is some literature regarding online interactions in legal education and lawyers’ professional development;6 however, the literature regarding online interactions in Australian PLT appears less well established. In 2008 Roper recommended that PLT providers seeking accreditation for courses with online components submit ‘an argument for the basis upon which the effectiveness of distance learning can be assured’,7 and in 2009 Lansdell called for research regarding the ‘widespread incorporation of new technologies’ in PLT.8

This paper describes and discusses an exploratory study concerning PLT students’ satisfaction with online discussions as a teaching medium. The study, framed by theories of the ‘affective domain’ of student learning and a ‘community of inquiry framework’ for online learning, used an online questionnaire to collect quantitative and lexical data about students’ satisfaction with online discussions.

Part II provides a justification of the theoretical frameworks upon which the study was based. Part III includes a contextual background regarding the research site and the subjects, describes the objectives of the study and the research questions, and summarises the methodology and methods used to undertake the study.

Part IV describes the findings arising from the quantitative data using descriptive statistics, and includes some qualitative data extracted from participants’ responses to open-ended survey questions. Part V describes how the data was cross-tabulated to produce contingency tables from which 2 x 2 tables were extracted and submitted to Barnard’s exact test for independence to identify potential correlations between certain variables.

Drawing on the findings, Part VI identifies certain features of online discussions without which student satisfaction with online discussions as a teaching medium would be materially affected. The implications this might have for further research are also discussed.

II  ONLINE DISCUSSIONS: ‘SELF-SYSTEM’ AND ‘COMMUNITY OF INQUIRY FRAMEWORK’ ASPECTS

This part provides a justification of the theoretical frameworks upon which this study was based, together with an explanation of how they were used to inform the study.

Generally, the expression ‘online interactions’ refers to synchronous and asynchronous discussions between two or more individuals at a distance, enabled by information and communications technology. Types of online interactions include email, text-based synchronous and asynchronous discussion groups, learning and content management systems, blogging, micro-blogging, social networking sites, virtual environments, telephony, voice over internet protocol, webcams, webinars, and other emerging technologies. However, despite the variety of technologies, Garrison and Archer observe that the ‘prevalent form of online learning is asynchronous and text-based and mediated through written language with a minimum of non-verbal or paralinguistic cues’.9 The research to which this article refers focused on this type of online discussion/interaction.

Existing literature outside legal education indicates that the affective domain is highly relevant to student satisfaction,10 and that satisfaction with online interactions can affect students’ motivation to engage with the learning activities.11 Ally et al observe that student satisfaction and learning behaviours partly constitute the context that ought to be considered as part of an instructional design.12

Two authoritative frameworks used widely in education research were selected to frame this study: the ‘community of inquiry’ framework, and Marzano and Kendall’s ‘Self-system’ level of cognitive processing as part of their taxonomy of educational objectives. The two frameworks are described briefly here. They were used to frame questions for the study and to generate an explanation for the findings produced by the study.

The ‘community of inquiry framework’ has been very influential in the area of student satisfaction with online learning.13 This framework assumes that a community of inquiry comprising students and teachers provides ‘the optimum learning experience directed toward realisation of learning outcomes’.14 The framework consists of three overlapping core elements: ‘social presence’, ‘cognitive presence’ and ‘teaching presence’.15 ‘Social presence’ describes students’ ability to project their personal, social and emotional characteristics into a community of inquiry. ‘Cognitive presence’ describes students’ ability to engage through discussion and reflection in critical inquiry. ‘Teaching presence’ involves instructional design and organisation, facilitation of discussion, and direct instruction, focused on achieving learning outcomes through cognitive and social processes.16 The three ‘presences’ overlap, and by implication notions of satisfaction, motivation, and self-efficacy intersect with each of them. Arbaugh, Bangert and Cleveland-Innes assert that the community of inquiry framework is applicable to applied disciplines (like legal practice) because of ‘the emphasis on using inquiry to develop applicable knowledge’, compared with the ‘cumulative instructor-oriented approaches associated with pure hard disciplines’ (such as natural sciences and mathematics).17

Marzano and Kendall describe a ‘self-system’ level of knowledge processing in which the interaction between students’ attitudes, beliefs and emotions determines students’ motivation and attention to learning.18 This proposition builds on that part of Bloom’s taxonomy of learning related to student satisfaction and to student motivation within the ‘response’ sub-category of the ‘affective domain’.19 The ‘self-system’ overarches the metacognitive and cognitive systems of processing and involves the student deciding whether to engage with a learning task and how much energy to allocate to it. Self-system thinking can be analysed in four categories: (1) ‘examining importance’; (2) ‘examining efficacy’; (3) ‘examining emotional response’; and (4) ‘examining overall motivation’.20 The category of ‘overall motivation’ is derived from the first three. A condition affecting the perceived importance of a task is whether it meets a student’s need or advances the attainment of a goal. Perceived self-efficacy is derived from the student’s beliefs about their own resources, ability or power to effect change. A student’s emotional response to a learning task may have significant consequences, because students have variable control over their own emotions and those emotions can have lasting effects.21 The ‘self-system’ might overlap with elements of the community of inquiry framework where self-system factors are relevant to a student’s social and cognitive engagement and interaction with the online learning activities.

As Maharg and Maughan observe, the affective domain of knowledge can be ‘problematic’ for legal educators trained in the Socratic tradition, which separates reason and logic from emotional affect.22 They suggest that satisfaction and motivation interconnect with students’ emotions and needs for belonging, self-esteem and self-actualisation; if those needs are not met students may become risk-averse, fearful, restless and resistant to learning.23 Emotional processes are physically integrated with learning, and it is possible to optimise teaching and learning strategies with this in mind.24

Using the community of inquiry framework and the self-system concept as theoretical frameworks helped to identify certain recurrent issues in the review of educational research literature. For example, Wu, Tennyson and Hsia found that ‘computer self-efficacy’, ‘performance expectations’, ‘system functionality’, ‘content feature’, ‘interaction’, and ‘learning climate’ were ‘primary determinants’ of student satisfaction.25 Expanding on this, and consistent with the community of inquiry and self-system factors, students’ perceived importance of the learning task,26 perception of self-efficacy (including efficacy with computers),27 student–student interactions,28 student–teacher interactions,29 and the online discussion software,30 were identified as factors affecting student satisfaction with online interactions.

The capacity to unite geographically and temporally separate students into a community of inquiry might provide flexibility, parity, and equity of access to instruction and instructors. However, it is prudent to investigate the contextual and affective factors that influence PLT students’ satisfaction with online discussions, and their perceptions of community of inquiry elements and self-efficacy factors, with the goal of improving the use of online discussions for teaching and learning in PLT.

III  STUDY AND CONTEXT

This part provides some contextual background regarding the research site and subjects, and describes the methodology and methods used to undertake the study.

The research was undertaken during April–October 2011 and involved an online survey of students (the research subjects) enrolled in the full-time and part-time PLT blended learning programs at PLT college campuses in Queensland, New South Wales, and Victoria.31 After first obtaining ethics clearance for the study from the Deakin University Human Research Ethics Committee in early April 2011, invitations to participate in the study were emailed to students.

The PLT program is an accredited program involving online coursework with several formative assessments, face-to-face skills workshops, and oral and written summative assessments. The students must also complete work experience and continuing legal education requirements to complete the PLT program. Subject to satisfactory completion of the program, the students are issued with the PLT completion certificate and academic conduct report necessary to apply for admission to the profession. They are also entitled to conferral of a Graduate Diploma of Legal Practice.

Course evaluations previously undertaken at the site disclosed that most students were satisfied or very satisfied with a blended program of PLT, but gave lower satisfaction ratings for online discussions as a teaching medium compared with other media such as email feedback.32 The course evaluations did not collect data from the students about specific factors affecting their satisfaction with the online discussions.

The PLT program required students to participate in the online discussions for the Professional Responsibility and Ethics subject (‘the subject’), which is a subject specified in the Competency Standards for Entry-Level Lawyers. The subject was divided into seven units, each dealing with some part of it, such as practitioners’ professional relationship with the courts. Three units involved online discussions, which were not used as a teaching medium in any other subject at the site. There were jurisdictional differences in the content of the subject; however, the instructional design for the discussion groups was uniform across the three jurisdictions. The students were informed:

in this activity you will be required to participate in an online discussion group with other students, and facilitated by your lecturer, during the period specified in the timetable.

The students were instructed to participate in the discussion group by commenting on one or more topics, suggesting other topics, and/or responding to questions posed by other students or their lecturer. Individual lecturers acted autonomously as facilitator for their students’ discussions, guided by tips in the college lecturers’ manual, such as sending a group email in advance to explain the activity and expectations of the students, posting discussion starters to the group, daily checks and responses to student interactions, and follow-up prompts for non-contributing students.

In the first unit, students studied ‘acting ethically in practice’, and were supplied with ten ‘topics of current professional interest’; the students were required to research one or more of the topics and to post a contribution to the discussion regarding it. In the fourth unit, students studied the ‘duties of account and of care’, and were asked to post examples, giving reasons, of work management strategies and objectives aimed at complying with the duties of account and care. In the fifth unit, students studied ‘complying with the law and the duty to the court’, and were given ten problem scenarios with set questions; the students were required to contribute answers to one or more of the problems.

The research did not involve an intervention or ‘treatment’, nor did it collect data on lecturer online behaviour or the degree to which lecturers followed the guidelines provided, but did seek participants’ perceptions of the program as it was usually offered. The objectives of the study were to:

• investigate the relationship, if any, between the use of online discussions as a teaching medium and students’ perceptions regarding the online discussions and student satisfaction;

• ascertain students’ perceptions of the importance and relevance of the online discussion activities; and

• frame recommendations that might improve the use of online discussions as a teaching medium in practical legal training.

In this study, data about students’ perceptions regarding the online discussions was obtained by asking the participants questions framed by the community of inquiry and self-system theoretical frameworks, namely the participants’ perception of the importance and relevance of the subject and the online discussion activities, their self-efficacy (including efficacy with computers),33 student–student interactions,34 student–teacher interactions,35 and the online discussion software.36 For example, data was obtained about students’ prior experiences with online discussions, the location at which they usually participated in the discussions, their preferences regarding the software format, their ease of self-expression and of understanding others in the discussions, and their satisfaction with other students’ participation and with the lecturer’s interactions in the group and in direct response to the respondent’s contributions. The questions regarding interactions with the lecturer and other students were designed to obtain data regarding the ‘community of inquiry’ framework elements of ‘teaching presence’ and ‘social presence’.37

Participants were asked about their perceptions of the importance and relevance of online discussion activities, and the subject studied, in the context of their learning and practical legal training. These questions, together with the questions regarding the participants’ experience with the online discussion technology, were relevant to Marzano and Kendall’s ‘self-system’ level of cognitive processing,38 and more generally to Bloom’s ‘affective domain’.39

Garrison has identified a methodological issue in research on the online community of inquiry framework, concerning the analysis and validity of qualitative data:40 he differentiated the qualitative interpretivist methodology (typified in community of inquiry research by using text analysis to understand online interactions) from an inferential quantitative methodology used to validate data from online interactions. After stating that there ‘has been surprisingly little discussion about the reasonableness and usefulness of the community of inquiry framework in studying online learning’, Garrison observed that ‘the time may be right to transition to a phase that utilises both qualitative and quantitative approaches’.41 In this study, data was collected via a largely quantitative online questionnaire with 50 items, including demographic questions, five-point Likert-scale choices, forced-choice items, and free-response questions to elicit lexical data. Likert-scale items included a ‘neither agree nor disagree’ choice, and respondents could skip a question if they wished; this facilitated voluntary answers but produced some variation in the total number of respondents for each item.

Items aimed at illuminating the participants’ perceptions of the online discussions were developed based on the ‘self-system’ and the ‘community of inquiry’ frameworks. For example, Marzano and Kendall’s self-system incorporates the learner’s self-efficacy, the learner’s perceptions of relevance, and the learner’s motivation towards and emotional response to the learning task. Hence, items were included that asked the survey respondents to choose whether they strongly agreed, agreed, disagreed, strongly disagreed, or neither agreed nor disagreed, with propositions regarding the importance and relevance of the subject studied and the learning task, experience with online discussions, ease of use of the software, and the effectiveness of the online discussions as a teaching medium. The community of inquiry framework involves teaching presence, social presence, and cognitive presence in the online discussions. Survey respondents were asked to respond to propositions regarding their satisfaction with the amount and quality of lecturer–student interactions and student–student interactions.

Invitations to participate in the study were issued after the students’ assessments were complete, to avoid any perception of a connection between their participation in the survey and their PLT assessments. The invitations included a link to the online questionnaire hosted on Survey Monkey™. The service automatically counts and collates the responses into summary reports that can be imported into Microsoft Excel or the SPSS statistics program to produce charts and, where appropriate, the mean and the standard deviation statistic.
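
To make the summary statistics reported below concrete, the following is a minimal sketch in Python (not the authors’ actual workflow) of how an item mean and standard deviation can be computed from five-point Likert responses coded 1 (strongly disagree) to 5 (strongly agree); the response values are hypothetical.

```python
# A minimal sketch (not the authors' actual workflow) of the descriptive
# statistics reported in this study: five-point Likert responses are coded
# 1 (strongly disagree) to 5 (strongly agree) and summarised as a mean and
# a sample standard deviation. The responses below are hypothetical.
import statistics

CODES = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}

responses = ["A", "SA", "A", "N", "D", "A", "SA", "A"]  # hypothetical item data
scores = [CODES[r] for r in responses]

n = len(scores)                  # respondents who answered this item
mean = statistics.mean(scores)   # the item mean
s = statistics.stdev(scores)     # the standard deviation statistic

print(f"n={n}; mean={mean:.1f}; s={s:.1f}")
```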

Of 30 respondents who validly completed the online questionnaire, two were enrolled in Queensland courses, 18 in New South Wales, and 10 in Victoria; the respondent group was derived from at least 13 separate courses, each with its own lecturer. Of those sent invitations to participate in the online survey, the response rate was 17.5 per cent.42 Female students comprised 70 per cent of the respondents (compared to an average 60 per cent of the students enrolled in PLT courses during the study). More respondents were enrolled in a part-time course (65 per cent), which was consistent with enrolments during the study. Just over half of the respondents were in full-time employment (51 per cent), with 13.8 per cent in part-time employment and 17.2 per cent in casual employment.

It is acknowledged that the small sample size limits the statistical significance of the findings, and further research with a larger sample would be desirable. This was an exploratory study conducted by a teacher–practitioner, and was not intended to prove that any specific factor affected student satisfaction. However, some items were identified as potentially associated and were tested for independence. After descriptive statistics were produced, the responses were cross-tabulated to create contingency tables. Given the sample size, the ‘Strongly Disagree’, ‘Disagree’ and ‘Neither Agree nor Disagree’ responses were aggregated as ‘Not Agree’, and the ‘Agree’ and ‘Strongly Agree’ responses were aggregated as ‘Agree’, to produce 2 x 2 tables. Barnard’s Exact Test was applied to test the 2 x 2 tables for independence.43 For the test, the null hypothesis holds that the data in the rows and columns of the tables are independent. The null hypothesis was retained if the probability of independence (p value) was greater than five per cent (p > 0.05); otherwise the null hypothesis was rejected.
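
As an illustration of the procedure just described, the sketch below collapses hypothetical five-point responses to two items into ‘Agree’ and ‘Not Agree’, cross-tabulates them into a 2 x 2 contingency table, and applies Barnard’s exact test via scipy.stats.barnard_exact (available in SciPy 1.7 and later); all counts are invented for illustration.

```python
# A minimal sketch, with hypothetical counts, of the aggregation and
# independence testing described above: five Likert categories are
# collapsed into 'Agree' vs 'Not Agree', cross-tabulated into a 2 x 2
# table, and tested with Barnard's exact test.
from scipy.stats import barnard_exact

def collapse(response: str) -> str:
    # 'Agree' and 'Strongly Agree' -> Agree; all other choices -> Not Agree
    return "Agree" if response in ("A", "SA") else "Not Agree"

# Hypothetical paired responses by the same respondents to two items.
item_a = ["A", "SA", "N", "D", "A", "A", "SD", "N", "A", "SA"]
item_b = ["A", "A", "D", "N", "SA", "A", "D", "D", "A", "A"]

# Cross-tabulate into a 2 x 2 contingency table (rows: item A, cols: item B).
table = [[0, 0], [0, 0]]
for a, b in zip(item_a, item_b):
    i = 0 if collapse(a) == "Agree" else 1
    j = 0 if collapse(b) == "Agree" else 1
    table[i][j] += 1

result = barnard_exact(table)
# The null hypothesis of independence is rejected when p <= 0.05.
print(f"statistic = {result.statistic:.3f}, p = {result.pvalue:.2f}")
```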

For reasons of concision, not all items from the questionnaire are included in the following discussion. Items chosen for inclusion were those that provide context, together with those that produced a p value of less than or equal to five per cent when tested for independence with Barnard’s Exact Test (most of the reported items involve p ≤ 0.01). A complete copy of the questionnaire and the de-identified data is available for inspection on request.

IV  DESCRIPTION OF QUANTITATIVE AND QUALITATIVE DATA

This part provides an analysis of the quantitative data, together with illuminating excerpts from respondents’ open-ended comments. Items tested for correlation are reported, with comments informed by the literature.

A Student Contexts

The literature suggests that students’ attitudes, beliefs, and emotions affect their motivation and their attention to learning;44 these partly constitute the context that ought to be considered as part of an instructional design.45 Consequently, data were sought regarding the participants’ attitudes towards PLT; previous experience of online discussions; ease of access to the discussions; and preferences regarding the online discussion software.

B Vocational Orientation

Morgan observes that ‘students have to choose to engage in post-compulsory education or training’ and that by exploring student attitudes we can better understand variations in student performance. He also describes students as ‘vocationally extrinsically oriented’ or ‘vocationally intrinsically oriented’.46 The former category can be described as students who see PLT as merely a means to obtain accreditation enabling them to apply for admission to the legal profession. The latter category can be described as perceiving PLT as an opportunity to learn and develop professional skills, knowledge and understanding.

To obtain an indication of their ‘vocational orientation’ respondents were asked to think ‘about my own attitudes and expectations about PLT generally’, and choose one of these statements:

Practical legal training is an opportunity to learn and/or develop skills, knowledge and understanding as part of becoming a lawyer (skills statement); or

Practical legal training is something I have to do to get admitted to the legal profession as a lawyer (certification statement).

Of 28 respondents, 53.6 per cent chose the skills statement; while this represents a little over half of the sample, it is interesting to note that almost half of the sample may be more focused on accreditation than on acquisition of professional skills. Independence testing of the vocational orientation variable did not disclose an association with variables relevant to satisfaction with the online discussions, save that those who did not agree that it was easy to understand other students in the online discussions were more likely to have chosen the certification statement (p = 0.02).47

C Attitudes to Grades

The online discussions were ungraded formative assessments, during which the lecturer provided feedback to the group. Baroudi defines ‘formative assessment’ as:

activities used by the teacher to determine a student’s level of knowledge and understanding for the purpose of providing the student with feedback and planning future instruction. The feedback and future instruction may be concerned with remediation or the provision of further learning opportunities.48

Participation in the online discussions was a mandatory requirement to be eligible to sit a two-hour open-book written examination at the conclusion of the PLT coursework. The examination was weighted 100 per cent for grading in the professional responsibility subject. Given that the online discussions were ungraded, were a ‘hurdle’ requirement to sit the examination, and were intended to provide feedback and instruction, data was sought regarding the participants’ attitude to grading, and whether there might be an association between those attitudes and satisfaction with the online discussions. To the statement, ‘Personally, I worked hard for high grades in the PLT coursework because high grades are important to me’, most agreed (40.7 per cent) or strongly agreed (14.8 per cent) (n=27; x̄=3.4; s=1.1);49 however, independence testing did not disclose an association between attitudes to grading and other variables, including ‘vocational orientation’.

D Student Experience with Online Discussions

Artino,50 Liaw,51 and Lin, Lin and Laffey,52 identified student perception of self-efficacy as a factor in satisfaction with online communications. Wu, Tennyson and Hsia found that student perception of self-efficacy with information and communications technology was a ‘primary determinant’ of satisfaction in online communications;53 however, Drennan, Kennedy and Pisarski found those perceptions to be influential at course commencement, but less likely to affect student satisfaction at the end of the course.54 Sun et al observed that learner computer anxiety and perceived ease of use are ‘critical factors’ that might ‘hamper’ satisfaction with online learning.55 Brinkerhoff and Koroghlanian found that the nature and degree of students’ previous online experience might influence satisfaction with online instruction,56 and Hong found that students with previous experience of online communications might be more likely to express satisfaction with an online course, but that previous experience did not significantly affect learning outcomes.57 Pena-Shaff, Altman and Stephenson found that attitudes and expectations about information and communications technology and online discussions ‘were not significantly correlated to students’ participation levels and perceptions of learning’, but that greater participation did produce greater satisfaction.58

To explore what associations, if any, there might be between student efficacy and their satisfaction with the online discussions, data was sought concerning participants’ previous experience and preferences regarding online discussions.

Most respondents (93.1 per cent) stated they had previously used online discussion forums or something similar (such as Facebook or Twitter) before the PLT course (n=29). Of these, 55.2 per cent said the prior use was mostly for social purposes, and 37.9 per cent said the prior use was mostly for study purposes.

When asked whether the prior experience was satisfactory, most agreed (60.7 per cent) (n=28; x̄=3.6; s=0.8). Most respondents strongly agreed (62.1 per cent) that they felt competent to use information and communications technology as a part of their PLT (n=29; x̄=4.4; s=0.9). Independence testing indicated that those who did not agree they were satisfied with their prior experience might be less likely to agree they were satisfied with the quality of other students’ responses (p < 0.05).

Most respondents (89.7 per cent) used a computer at home for the online discussion activities; the remainder used computers at work (n=29), and most respondents strongly agreed (69 per cent) that ‘getting access to a computer during the [online discussions] was not a problem for me’ (n=29; x̄=4.6; s=0.6). Independence testing disclosed that those who used a computer at work for the online discussions were less likely to agree they were satisfied with the contribution of the discussions to their learning (p < 0.01).

The online discussion software used by the students in this study was the ‘InstantForum’ product,59 which offers a simple user interface via a web page with some graphical content. Most respondents (64.3 per cent) agreed they found the online discussion software easy to use (n=28; x̄=4.1; s=0.7). When asked whether the software used in the online discussions met their expectations, most agreed (64.3 per cent) (n=28; x̄=3.7; s=0.7). When asked to nominate their preferred software format for the online discussions, most respondents (60.7 per cent) ‘did not mind what format was used’, when offered ‘a simple graphical format like a simple web page’, ‘a simple text-based format like a plain email exchange’, ‘more complex format with text and graphics’, or ‘more content rich but mostly text like a blog or Twitter’ (n=28). Independence testing disclosed that those who did not agree the discussion software met their expectations were less likely to agree they were satisfied with the quality of other students’ participation in the discussion groups (p = 0.01), and were less likely to agree they were satisfied with the discussions’ contribution to their learning (p < 0.01).

It seems that vocational orientation and an orientation to high grades might not be directly associated with student satisfaction with the online discussions, although vocational orientation might be associated with perceived ease of understanding other students in the online discussions. The associations between prior experience of online discussions, expectations of the discussion software, satisfaction with student–student interactions, and satisfaction with the activities’ contribution to learning seem broadly consistent with the literature. These factors could be further investigated to learn more about how they might be taken into account as part of the instructional design and teaching with online discussions.

E Importance, Relevance, Self-Efficacy (Self-system Aspects)

Consistent with Marzano and Kendall’s taxonomy,60 several items focused on perceived self-efficacy, and on the perceived importance and relevance of the ‘professional responsibility’ subject and its online components. Overall, the responses to these items suggest that the majority of respondents agreed that learning the subject was important and the online discussions were relevant. For example, 82.7 per cent of respondents agreed or strongly agreed that the professional responsibility subject was important for their practical legal training, and 69.0 per cent agreed or strongly agreed that online discussions were relevant to their practical legal training. Table 1 presents a summary of responses to the self-system items. Of interest is that while most agreed the subject was important, the learning tasks were relevant, and they were satisfied with the contribution of the online discussions to their learning, the response to the online discussions as a learning experience was less positive, with over 50 per cent of responses either neutral or negative.

Of nine open-ended responses to the question, five indicated the subject was important. For example:

The subject matter is important as this is one of the foundations in one’s professional practice.

Even in the ‘ethics’ component of my LLB, we were not guided through the Professional Conduct rules, so it was important to be well grounded in them before being let loose on the public.

However, disagreement focused on perceived duplication of prior learning. For example:

The course is exactly the same as in university so there was no new knowledge.

Ethics is a Priestly Eleven subject — every single Victorian law graduate has already studied it. I found the material tedious, trite and repetitive.

Table 1: Student Perceptions of Self-system Aspects

| Item | SD % | D % | N’r % | A % | SA % |
| The professional responsibility subject was important for PLT (n=29; x̄=4.0; s=0.9)61 | 3.1 | 3.4 | 10.3 | 51.7 | 31.0 |
| The online discussions were relevant to PLT (n=29; x̄=3.7; s=1.1) | 3.4 | 13.8 | 13.8 | 48.3 | 20.7 |
| I was satisfied with the contribution of the online discussion group activities to my learning (n=28; x̄=3.5; s=1.0) | 7.1 | 10.7 | 7.1 | 71.4 | 3.6 |
| I was satisfied with the choice of topics and problems offered for the online discussion groups (n=28; x̄=4.0; s=0.7) | 0.0 | 3.6 | 14.3 | 64.3 | 17.9 |
| I had the personal resources to complete the professional responsibility online discussion group activities (n=28; x̄=4.2; s=0.9) | 0.0 | 10.7 | 0.0 | 50.0 | 39.3 |
| I had the personal abilities to complete the professional responsibility online discussion group activities (n=28; x̄=4.2; s=1.0) | 3.6 | 3.6 | 10.7 | 35.7 | 46.4 |
| I thought the amount of work I had to complete for the professional responsibility online discussion group activities was appropriate for what I was expected to learn to pass the subject in practical legal training (n=28; x̄=3.6; s=1.0) | 3.6 | 14.3 | 14.3 | 57.1 | 10.7 |
| I was satisfied with the online discussion group as a learning experience (n=28; x̄=3.3; s=1.1) | 3.6 | 25.0 | 25.0 | 35.7 | 10.7 |

SD = strongly disagree; D = disagree; N’r = neither agree nor disagree; A = agree; SA = strongly agree; x̄ = mean; s = standard deviation. This legend also applies to Tables 2 and 3.

It may be that new ‘goods’ ought to be identified in advance, to demonstrate that undertaking the discussions would advance prior knowledge.62 However, 82 per cent of respondents did agree or strongly agree they were satisfied with the choice of topics and problems for the discussions, and it could be inferred that most participants were satisfied with the content of the learning activity.

Open-ended responses as to whether the online discussions were relevant seem to indicate issues with the instructional design and facilitation of the discussions, where student posts duplicated the information in previous posts:

The study groups asked us to respond to specific questions, which was difficult, because after one student had answered that question there was no direction given. I felt that my peers and I wrote glib answers for the sole purpose of gaining the participation credit and gained nothing from the experience whatsoever.

Some people posted in such an intellectualised/philosophical way that the meaning was lost and it felt like a war to use the biggest words. I also don’t trust other people’s opinions or interpretations so I always prefer to do the reading/study myself even after reading other people’s summaries or input.

These comments might indicate student expectations for more pronounced teaching presence, to provide encouragement, feedback and guidance, and to introduce fresh challenges.

In relation to ‘self-efficacy’ items, 89 per cent of participants agreed or strongly agreed they had the personal resources, and 82 per cent agreed or strongly agreed they had the personal abilities, to complete the online discussion activities; 67 per cent agreed or strongly agreed the amount of work they were required to complete was appropriate. The literature indicates that affective factors such as student satisfaction, motivation, and engagement are connected with self-system components, namely students’ perception of the relevance and importance of the learning task and students’ self-efficacy. The data suggests that although most participants agreed that the subject and the activities were relevant and important, and most agreed they had the resources and the abilities necessary to undertake (and were satisfied with the content of) the learning tasks, substantially fewer were satisfied with the online discussions as a learning experience. Here, Marzano and Kendall’s ‘self-system’ provided a useful taxonomy for identifying broad questions about the students’ perceptions of the importance and relevance of the learning activities and the subject matter, and their self-perception regarding their capacity to undertake and complete them. What must be borne in mind, however, is that those ‘self-system’ factors are dynamic and integrated, not static and discrete: they are formed and reformed through actions and interactions taking place in the online discussion space. The following exploration of the ‘community of inquiry’ elements of ‘social presence’ and ‘teaching presence’ provides further data about the factors that affect student satisfaction with the online discussions.

F Online Interactions (Community of Inquiry Aspects)

Online discussions involve student participation through interactions between students, and between students and teachers.63 To explore the associations between those interactions and student satisfaction with online discussions, the study drew on the ‘community of inquiry’ framework. A community of inquiry comprises three overlapping core elements: ‘social presence’, ‘cognitive presence’ and ‘teaching presence’.64 ‘Social presence’ describes students’ ability to project their personal, social and emotional characteristics into a community of inquiry. ‘Cognitive presence’ describes students’ ability to engage through discussion and reflection in critical inquiry. ‘Teaching presence’ may involve instructional design and organisation, facilitation of discussion, and direct instruction, focused on achieving learning outcomes through cognitive and social processes.65

In relation to teaching presence, Bolliger and Wasilik, Bower and Kamata, Herbert, and Hong found in separate studies a relationship between student satisfaction and the teacher’s role in online learning,66 whereas Shin concluded that teaching presence affects student-perceived learning achievement, rather than student satisfaction with online learning.67 Elements of the teaching role include: ‘sense of availability and connectedness’;68 quality and timeliness of instruction;69 and the instructor’s expertise and counselling.70 On the other hand, Kelly, Ponton and Rovai found that online students consider the teacher less important than face-to-face students do, and rated the course and course materials as a higher priority,71 whereas Dennen, Darabi and Smith found that, despite teachers’ focus on course content and feedback, students’ satisfaction is affected by the ‘perception that they are treated as individuals and that their interpersonal communication needs are met’.72

Wise et al found that the teacher’s social presence affects the learner’s interactions and perception of the teacher, but did not affect perceptions of learning, satisfaction, engagement, or the quality of the teaching,73 and Bangert found that teaching presence combined with social presence produced more high-quality cognitive responses from students.74 Unsurprisingly, individual teaching practices can be influential;75 for example, the amount of instructor interventions may actually impinge on students’ free expression of thoughts and opinions.76

The literature shows that there may be an association between student satisfaction with online discussions and the teacher’s teaching presence and social presence, but the data produced interesting findings as to the strength of the association. Ten questionnaire items were directed to the ‘community of inquiry’ aspects of the online discussions: ‘teaching presence’, ‘social presence’, and ‘cognitive presence’. Interestingly, while most respondents agreed they were satisfied with the lecturer’s interactions and other students’ participation in the online discussions, only 25 per cent of the respondents agreed they felt a ‘sense of community’ in their online discussions, and only 28.5 per cent of respondents agreed they were satisfied with other students’ responses to their postings. Table 2 presents a summary of the lecturer’s teaching presence items, and Table 3 summarises the student social presence items. The items relate to satisfaction with cognitive presence to the extent that respondents comment on the quality of lecturer and student interactions.

Table 2 shows that the number of those who were satisfied with the quality of the lecturer’s responses to their postings (75 per cent), was not matched by those satisfied with the quality of the lecturer’s participation in the discussions (57.1 per cent), or the amount of the lecturer’s participation (53.6 per cent).

Table 2: Student Perceptions of Lecturer’s Teaching Presence

| Item | SD % | D % | N’r % | A % | SA % |
| The lecturer’s presence in the discussion groups as a teacher was important to me (n=28; x̄=3.6; s=1.1) | 3.6 | 10.7 | 28.6 | 35.7 | 21.4 |
| I was satisfied with the quality of the lecturer’s participation in the discussion groups (n=28; x̄=3.6; s=1.1) | 3.6 | 10.7 | 28.6 | 35.7 | 21.4 |
| I was satisfied with the quality of the lecturer’s responses to my postings in the discussion groups (n=28; x̄=3.8; s=0.9) | 3.6 | 3.6 | 17.9 | 57.1 | 17.9 |
| I was satisfied with the amount of the lecturer’s participation in the discussion groups (n=28; x̄=3.5; s=1.0) | 3.6 | 10.7 | 32.1 | 42.9 | 10.7 |

The open-ended responses focused on a need for timely guided instruction:

Because at the end of the day I need someone to in essence regulate the content and ensure that what I said is correct/incorrect.

Provided reassurance you were on the right track.

If the discussion is conducted at the same time (similar to a talk show), the presence of the lecturer as a moderator would certainly be a great help.

Near the end of the survey, participants were asked to give an open-ended response to the statement, ‘If there was just one thing that I could ask College and/or the lecturer to do to improve my satisfaction with the online discussion groups as a teaching method, it would be... ’. Most responses expressed a wish for timely and ongoing lecturer interactions in the online discussions:

Lecturer to get more involved. This would encourage discussions.

The lecturer to get involved earlier to set a tone or direction of the discussion. This might be useful.

More lecturer interaction, not just at the end.

Have the lecturer respond to postings with feedback. Have students comment on each other’s postings.

More interaction between the students and the lecturer.

Have the discussion interactive and in real time (similar to a talk show).

More lecturer involvement.

Directly comment on each of the posts so that I know if I am on the right track. For example if I refer to a specific rule I would expect the lecturer to tell me if that reference was not correct.

It might be inferred from the data that, to establish a sense of community, the lecturer must provide more than a satisfactory level of participation in the group and on a one-to-one basis. Rovai found that a stronger sense of ‘community’ produces greater satisfaction, an increased likelihood of persistence with learning, and a stronger feeling among students that their educational goals are being met.77 According to Garrison, to establish a community of inquiry it is necessary to establish critical reflection and discourse that support systematic inquiry, sustain community through expression of group cohesion, encourage and support the progress of inquiry through to resolution, evolve collaborative relationships in which students are supported in assuming increasing responsibility for their learning, and ensure that there is resolution and metacognitive development.78 With this in mind, further research might more closely study lecturer–student interactions with a ‘treatment’ directed to establishing those processes.

In relation to student–student interactions, Drouin observes that a ‘sense of community’ might be more dependent on those interactions than on instructor–student interactions.79 Pena-Shaff, Altman and Stephenson also found that high levels of student–student interaction in online learning can contribute to student satisfaction;80 however, and consistent with Garrison’s observations above, Gilbert, Morton and Rowley observed that ‘other learning support’ is necessary.81 While So and Brush found a positive, but not statistically significant, ‘relationship between social presence and overall satisfaction’, it was ‘course structure, emotional support, and [the] communication medium’ that were critical to satisfaction.82

Several questionnaire items were directed to obtaining data about the participants’ perceptions of their interactions with other students. Table 3 shows that while over 70 per cent of respondents were satisfied with the quality of other students’ participation in the online discussions, less than 29 per cent of respondents were satisfied with other students’ responses to their postings. Student–student interactions were not rated more highly than lecturer–student interactions.

Table 3: Student Perceptions of Students’ Social Presence

| Item | SD % | D % | N’r % | A % | SA % |
| It was easy for me to understand what other people were saying in the discussion groups (n=29; x̄=3.9; s=1.0) | 0.0 | 17.2 | 6.9 | 48.3 | 27.6 |
| It was easy to express myself in the online discussion groups (n=29; x̄=3.9; s=1.1) | 3.4 | 13.8 | 3.4 | 48.3 | 31.0 |
| My interactions with other students were more important to me than my interactions with the lecturer in the discussion groups (n=28; x̄=2.6; s=0.8) | 7.1 | 42.9 | 35.7 | 14.3 | 0.0 |
| I was satisfied with the quality of most other students’ participation in the discussion groups (n=28; x̄=3.7; s=0.9) | 0.0 | 14.3 | 14.3 | 60.7 | 10.7 |
| I was satisfied with the quality of other students’ responses to my postings (n=28; x̄=3.2; s=0.8) | 3.6 | 7.1 | 60.7 | 21.4 | 7.1 |
| I felt a sense of community with the other students in the online discussion groups (n=28; x̄=2.6; s=1.0) | 14.3 | 32.1 | 28.6 | 25.0 | 0.0 |

That less than 29 per cent of respondents were satisfied with other students’ responses to their postings was illuminated by open-ended responses indicating there was little or no student–student interaction with cognitive value. For example:

There were no responses specifically to my postings. We each posted our required number of posts, with no follow up. I and some other students referred to existing posts when we posted our answer or opinion on the chosen topic, but did not post direct responses to individual posts.

Respondents were generally satisfied with their peers’ interactions in the online discussions, but dissatisfied with their peers’ feedback to their posts; conversely, respondents were generally satisfied with lecturer feedback to their posts, but less satisfied with the lecturers’ interactions in the group. The findings suggest a need to balance the lecturers’ teaching and social presence in the group with the lecturers’ feedback to individual students. Similarly, the students’ social presence in the group may need to be balanced with the teaching and social presence they exercise in giving peer feedback. Hew and Cheung describe techniques that students use to encourage other students to participate, including ‘giving own opinions or experiences’, ‘questioning’, ‘showing appreciation’, ‘establishing ground rules’, ‘suggesting new direction’, ‘personally inviting contributions from other people’, and ‘summarising’.83 Further research might involve transactional analysis of student–student interactions, possibly involving a ‘treatment’ in which a lecturer, or a student leader, models facilitation techniques that are explicitly described in guidelines.

The ‘community of inquiry’ framework, coupled with Marzano and Kendall’s ‘self-system’, provides a useful theoretical approach to investigating the student experience of online discussions as a teaching medium. By identifying factors such as ‘teaching presence’, ‘social presence’, and the ‘self-system’ factors of self-efficacy and perceptions of the importance and relevance of the learning activity, it was possible to investigate broadly the student experience of the online interactions and activities, and to identify possible factors relevant to student satisfaction with online discussions as a teaching medium.

In the following part, a number of the items in relation to the lecturer and student online interactions and self-system factors are shown to be potentially associated.

V  ASSOCIATIONS: LECTURER INTERACTIONS, STUDENT INTERACTIONS, AND SELF-SYSTEM ASPECTS

Part III described how Barnard’s Exact Test was used to test for possible associations. For the test, the null hypothesis holds that the variables tested are independent. The null hypothesis is rejected if the probability of independence is less than or equal to five per cent (p ≤ 0.05). Not all independence test results are discussed here; the following focuses on aspects relevant to teaching presence and social presence (community of inquiry elements), and to self-system aspects.84 All p values shown are ‘rounded up’ to two decimal places.

A Lecturer Interactions (Teaching Presence)

The lexical data in Part IV suggests the lecturer’s teaching presence was associated with the respondents’ satisfaction with the online discussions. The independence tests noted in Table 4 support this association.

The results indicate that the quality and the timeliness of a lecturer’s teaching presence in both individual and group interactions were strongly connected to some respondents’ satisfaction with online discussions as a teaching medium.

Table 4: Items Associated with Teaching Presence

| Item pair tested for independence | p |
| Satisfaction with the quality of lecturer participation and satisfaction with the amount of lecturer participation in the online discussions | 0.01 |
| Dissatisfaction with the quality of the lecturer’s participation and dissatisfaction with the online discussions as a learning experience | 0.01 |
| Dissatisfaction with the quality of the lecturer’s responses to students’ posts and dissatisfaction with the quality of the lecturer’s participation in the forums | 0.01 |
| Dissatisfaction with the amount of the lecturer’s participation and dissatisfaction with the quality of the lecturer’s responses to respondents’ posts | 0.01 |
| Agreement that the lecturer’s presence was important and satisfaction with the amount of the lecturer’s participation | 0.01 |
| Satisfaction with the amount of lecturer participation and satisfaction with the learning experience | 0.01 |

B Student Interactions (Social Presence)

In relation to student interactions in the online discussions, independence testing produced p values of ≤ 0.05 for seven potential associations (see Table 5).

Table 5: Items Associated with Social Presence

| Item pair tested for independence | p |
| Agreement that understanding others was easy and satisfaction with the contribution to learning | 0.01 |
| Agreement that self-expression was easy and satisfaction with the contribution to learning | 0.01 |
| Agreement that the professional responsibility subject was important and dissatisfaction with the quality of other students’ responses to postings | 0.01 |
| Disagreement that the online discussions activity was relevant and disagreement that it was easy to understand other students in the discussions | 0.01 |
| Disagreement that the discussion software met expectations and dissatisfaction with the quality of other students’ participation in the online discussions | 0.01 |
| Dissatisfaction with the quality of other students’ responses to the respondent’s posts and dissatisfaction with the quality of other students’ participation in the discussion groups | 0.04 |
| Dissatisfaction with previous experience of online discussions and dissatisfaction with the quality of other students’ responses | 0.05 |

These results suggest that satisfaction with student–student interactions is strongly connected to the ‘self-system’ factors of the perceived relevance and importance of the learning tasks. Self-efficacy factors, namely understanding others and being understood, were also involved.

Dissatisfaction with prior experience of online discussions, or dissatisfaction with the online discussion software, appears to be relevant to some respondents’ satisfaction with the quality of interactions with other students.

C Self-system Aspects

In addition to self-system aspects already mentioned above (relevance, importance, self-efficacy), independence testing of the following self-system aspects produced p values of ≤ 0.05 (Table 6).

Table 6: Items Associated with Self-system Aspects

| Item pair tested for independence | p |
| Disagreement that the subject was relevant to respondents’ PLT and dissatisfaction with the discussion groups as a learning experience | 0.01 |
| Dissatisfaction with the learning experience and dissatisfaction with the online discussions’ contribution to learning | 0.01 |

It seems that some respondents’ perceptions of the importance and relevance of the subject were relevant to their satisfaction with their online interactions as a learning experience, and the discussions’ contribution to their learning.

Although Barnard’s Exact Test is regarded as a powerful test where small counts are used, the results of the independence tests might differ with a larger sample tested with the chi-square or other tests. The results do suggest, however, that the quality and amount of lecturer–student interactions in a community of inquiry are more likely to be connected to student satisfaction with the online discussion learning experience than other factors, and that student satisfaction with student–student interactions might be more likely to be connected to self-system factors, such as perceptions of the importance and relevance of the subject and the activities, and students’ perceptions of their self-efficacy.
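
To illustrate the caveat about test choice, the brief sketch below (again with invented counts, not data from this study) compares the p values returned by Barnard’s exact test and the chi-square test on the same small 2 x 2 table.

```python
# A brief illustration, with hypothetical counts, of the caveat above:
# on small 2 x 2 tables, Barnard's exact test and the chi-square test
# can return noticeably different p values.
from scipy.stats import barnard_exact, chi2_contingency

table = [[9, 3],
         [4, 12]]  # hypothetical counts from a small sample

b = barnard_exact(table)
chi2, p_chi2, dof, _expected = chi2_contingency(table)

print(f"Barnard's exact test: p = {b.pvalue:.3f}")
print(f"Chi-square test:      p = {p_chi2:.3f} (dof = {dof})")
```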

VI  IMPLICATIONS FOR FURTHER RESEARCH

This part summarises some key findings, and discusses some implications and possible approaches for further research. It is acknowledged that given the sample size, the statistical significance of this exploratory practitioner research is limited, although the study produced data that might assist further research.

The data suggests that most respondents did not have issues with using the software or accessing a computer to participate in the online discussions. Most respondents had previous experience of online discussions, mainly for social or study purposes, and most indicated they had no particular preference for the software format. It seems that provided the software is reasonably robust and easy to use, it is what takes place within the online discussions that is important to students, not the look and feel of the software. The interactions should therefore be the focus of further research.

The data in this study showed that most respondents agreed that the subject being studied was important and relevant for their practical legal training, that the online discussions were relevant to their learning, and that they were satisfied with the contribution of the online discussions toward their learning. However, less than half of the respondents indicated they were satisfied with the online discussions as a learning experience. Independence testing indicated that three aspects were associated with dissatisfaction with the learning experience: perceptions of the quality of the lecturer’s participation; the relevance of the subject; and the contribution of the online discussions to learning.

Independence testing indicated that disagreement that the subject was important, or that the online discussions were relevant, was strongly associated with aspects of student–student interactions, including dissatisfaction with the quality of other students’ responses to postings, and disagreement that it was easy to understand others in the discussions. Further research might disclose whether it is more likely that student perceptions of relevance and importance influence perceptions of student–student interactions, or vice versa.

Most respondents were satisfied with the quality of the lecturer’s responses to their postings (75 per cent), but substantially fewer respondents were satisfied with the quality of the lecturer’s participation in the discussions (57.1 per cent), or the amount of the lecturer’s participation (53.6 per cent). Interestingly, the distribution of responses for those satisfied with the lecturer’s participation in the online discussions was identical to those agreeing that the lecturer’s presence in the discussion groups as a teacher was important. Most respondents did not agree that their interactions with other students were more important than their interactions with the lecturer. The lexical data indicated there were respondents who wanted the lecturer to be more involved in the discussions and to provide direct feedback and guidance. The results from independence testing indicated that perceptions of the quality and the amount of the lecturer’s contributions in the group discussions, together with the quality of the lecturer’s responses to the respondent’s postings, were strongly associated.

Student interactions were not rated more highly than interactions with the lecturer. Most respondents agreed that self-expression and understanding what others were saying in the online discussions was easy; independence testing indicated that both of those aspects were strongly associated with perception of the online discussions’ contribution to learning. While over 70 per cent of respondents were satisfied with the quality of other students’ participation in the online discussions, less than 29 per cent of respondents were satisfied with other students’ responses to the respondent’s postings. The lexical data indicated that there was little or no student feedback to the respondents’ postings. Independence testing indicated a strong association between respondents’ agreement that the subject was important, and dissatisfaction with the quality of other students’ responses to the respondent’s postings. Most respondents did not agree they felt a sense of community in the online discussions.

Drawing on the above, respondents were mostly satisfied with other students’ participation in the online discussions but much less satisfied with the lecturer’s participation in the group. Conversely, respondents were mostly satisfied with the lecturer’s responses to their postings, but much less satisfied with the responses from their peers. From a ‘community of inquiry framework’ perspective, further research might reveal how lecturers’ teaching and social presence in the group could be balanced more satisfactorily with the social and teaching presence of the students. The study did not investigate the lecturers’ behaviours, but the data suggests that many respondents judged the lecturers’ interactions to be unsatisfactory in some respects, or at least that a significant proportion of respondents were ambivalent about the amount and quality of the lecturers’ participation in the online discussions. This may indicate that a close study of lecturers’ interactions in the online discussions is warranted, combining qualitative transactional methods with quantitative methods.

The literature and the data do point to the lecturer’s role as vital to student satisfaction with online discussions as a teaching medium. That is not to say that online discussions should be teacher-centred rather than learner-centred. The data shows that most respondents were satisfied with the individual feedback they received from lecturers, but less satisfied with lecturers’ participation in the group and with the feedback they received from their peers. By implication, the role of the lecturer might warrant more scrutiny when investigating students’ satisfaction with online discussion groups, and several approaches suggest themselves: exploring variation in lecturers’ online practice; considering how and whether this is linked to student satisfaction; exploring lecturers’ aims, motivations and satisfaction regarding online learning, and the factors that affect lecturers’ decision-making; and developing case studies of lecturers who attract high satisfaction ratings for the online aspects of their courses.

Further research should be undertaken in the PLT field to investigate the intersections between students’ self-system factors, their contexts, the community of inquiry elements, and evidence-based teaching practices, within blended instructional designs. We can aspire and strive to produce effective teaching and learning experiences for students and lecturers. Online discussions can effectively span distances to join geographically and temporally separated students and lecturers in an online room, but regardless of whether the classroom is physical or virtual, it is vital that the lecturer be ‘in the room’.


[*] Lawyer and PhD candidate at the School of Education at Deakin University.

[**] Senior Lecturer in Curriculum and Pedagogy at Deakin University.

[1] The first author undertook the research during study for a master of professional education and training degree under the supervision of the second author.

[2] Legal Practitioners Act 1981 (SA) s 14C; Legal Practitioners Education and Admission Council Rules 2004 (SA) r 2; Legal Profession Act 2004 (NSW) s 24(b)(i); Legal Profession Act 2007 (Qld) s 30(1)(c); Supreme Court Admission Rules 2004 (Qld) ss 7–7A; Legal Profession Act 2004 (Vic) s 2.3.2(1)(c); Legal Profession Act 2006 (ACT) s 21(b)(i); Legal Profession Act 2007 (NT) s 30(1)(c); Legal Profession Act 2007 (Tas) s 25(b)(i); Legal Profession Act 2008 (NT) s 29(1)(c)(i); Legal Profession Act 2008 (WA) s 21(2)(c).

[3] Australasian Professional Legal Education Council, APLEC Corporate Members <http://www.aplec.asn.au/Pdf/ACFD815.pdf>.

[4] For example: Legal Practice Centre Griffith University, Graduate Certificate in Legal Practice, Skills and Ethics (In-Practice) — Program Overview (2011) <http://www17.griffith.edu.au/cis/p_cat/admission.asp?ProgCode=3194&type=overview>; Leo Cussen Institute, Course Features — Online PTC (2011) <http://www.leocussen.vic.edu.au/cb_pages/ptc_course_features.php#online_ptc>; College of Law, Coursework Component (2011) <http://www.collaw.edu.au/Future-Students/Practical-Legal-Training-Programs/Program-Information/Coursework-Component/>; University of Technology Sydney, Practical Legal Training Information Session November 2010 (2011) <http://www.law.uts.edu.au/practical/plt_info_10.pdf>; ANU Legal Workshop, Program Outline (2011) <http://law.anu.edu.au/legalworkshop/Coursework.aspx>.

[5] For example: Gaye T Lansdell, ‘Have We “Pushed the Boat Out Too Far” in Providing Online Practical Legal Training? A Guide to Best Practices for Future Programs’ (2009) 19(1 & 2) Legal Education Review 149; Christopher Roper, ‘Standards for Approving Practical Legal Training Courses and Providers’ (Victoria Council of Legal Education, November 2008).

[6] For example: Pamela O’Connor and Beth Gaze, ‘Training for Better Decisions: Designing a Computer-Mediated Distance Education Subject for Tribunal Members’ (2002) 13(1) Legal Education Review 21; Bernadette Richards, ‘Alice Comes to Law School: The Internet as a Teaching Tool’ (2003–4) 14(1) Legal Education Review 115; Jennifer Yule, Judith McNamara and Mark Thomas, ‘Mooting and Technology: To What Extent Does Using Technology Improve the Mooting Experience for Students?’ (2010) 20(1) Legal Education Review 137; Archie Zariski, ‘Teaching Legal Ethics Online: Pervasive or Evasive?’ (2001) 12(1 & 2) Legal Education Review 131; Nikki Bromberger, ‘Enhancing Law Student Learning: The Nurturing Teacher’ (2010) 20(1) Legal Education Review 45; Des Butler, ‘Entry into Valhalla: Contextualising the Learning of Legal Ethics Through the Use of Second Life Machinima’ (2010) 20(1) Legal Education Review 85; Lillian Corbin, Kylie Burns and April Chrzanowski, ‘If You Teach It, Will They Come? Law Students, Class Attendance and Student Engagement’ (2010) 20(1) Legal Education Review 13; Jennifer Ireland, ‘Blended Learning in Intellectual Property: The Best of Both Worlds’ (2008) 18(1) Legal Education Review 139; Lawrence McNamara, ‘Lecturing (and Not Lecturing) Using the Web: Developing a Teaching Strategy for Web-Based Lectures: Flexible Delivery in a First Year Law Subject, Part I’ [2000] LegEdRev 6; (2000) 11(2) Legal Education Review 149; Lawrence McNamara, ‘Why Teaching Matters and Technology Doesn’t: An Evaluation and Review of Web-Based Lectures: Flexible Delivery in a First Year Law Subject, Part II’ [2000] LegEdRev 7; (2000) 11(2) Legal Education Review 175; Marina Nehme, ‘E-Learning and Students’ Motivation’ (2010) 20(1) Legal Education Review 223.

[7] Roper, above n 5, 41. The term ‘distance education’ was used generally to include online education.

[8] Lansdell, above n 5, 155.

[9] D Randy Garrison and Walter Archer, ‘A Theory of Community of Inquiry’ in Michael G Moore (ed), Handbook of Distance Education (Lawrence Erlbaum Associates, 2nd ed, 2007) 77, 78.

[10] The authors’ list of example references comprises several pages and is omitted here; the list can be downloaded from: http://thekglawyerblog.com/LER_References/LER_References_Addendum.pdf.

[11] For example, Ruth Butler, ‘Task-Involving and Ego-Involving Properties of Evaluation: Effects of Different Feedback Conditions on Motivational Perceptions, Interest and Performance’ (1987) 79(4) Journal of Educational Psychology 474; Charlotte N Gunawardena and Frank J Zittle, ‘Social Presence as a Predictor of Satisfaction Within a Computer-Mediated Conferencing Environment’ (1997) 11(3) American Journal of Distance Education 8; Virginia Roach and Linda Lemasters, ‘Satisfaction with Online Learning: A Comparative Descriptive Study’ (2006) 5(3) (Winter 2006) Journal of Interactive Online Learning 317; M Cecil Smith and Amy Winking-Diaz, ‘Increasing Students’ Interactivity in an Online Course’ (2004) 2(3) (Winter 2004) Journal of Interactive Online Learning 1.

[12] Mohamed Ally et al, Theory and Practice of Online Learning (Athabasca University, 2004), 18.

[13] For an introduction to the ‘community of inquiry framework’ see Garrison and Archer, above n 9. Additional references can be downloaded from: http://thekglawyerblog.com/LER_References/LER_References_Addendum.pdf.

[14] Garrison and Archer, above n 9, 78.

[15] Ibid, 79–80.

[16] Ibid.

[17] J B Arbaugh, Arthur Bangert and Martha Cleveland-Innes, ‘Subject Matter Effects and the Community of Inquiry (CoI) Framework: An Exploratory Study’ (2010) 13(1–2) The Internet and Higher Education 37, 43.

[18] Robert J Marzano and John S Kendall (eds), The New Taxonomy of Educational Objectives (Corwin Press, 2nd ed, 2007).

[19] Benjamin S Bloom, Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain (David McKay Co Inc, 1956); David R Krathwohl, Benjamin S Bloom and Bertram B Masia, Taxonomy of Educational Objectives, the Classification of Educational Goals. Handbook II: Affective Domain (David McKay Co Inc, 1973).

[20] Marzano and Kendall, above n 18, 55–9.

[21] Ibid, 58.

[22] Paul Maharg and Caroline Maughan (eds), Affect and Legal Education: Emotion in Learning and Teaching the Law, Emerging Legal Learning (Ashgate, 2011) 1–2; Caroline Maughan, ‘Why Study Emotion?’ in Maharg and Maughan, ibid, 11.

[23] Ibid, 51.

[24] Richard Roche, ‘Learning and the Brain: An Overview’ in Maharg and Maughan, above n 22, 45.

[25] Jen-Her Wu, Robert D Tennyson and Tzyh-Lih Hsia, ‘A Study of Student Satisfaction in a Blended e-Learning System Environment’ (2010) 55(1) Computers & Education 155, 157, 160–3.

[26] For example, Jennifer Gilbert, Susan Morton and Jennifer Rowley, ‘e-Learning: The Student Experience’ (2007) 38(4) British Journal of Educational Technology 560; Yi-Mei Lin, Guan-Yu Lin and James M Laffey, ‘Building a Social and Motivational Framework for Understanding Satisfaction in Online Learning’ (2008) 38(1) Journal of Educational Computing Research 1; Leah E Wickersham and Patricia McGee, ‘Perceptions of Satisfaction and Deeper Learning in an Online Course’ (2008) 9(1) (Spring 2008) Quarterly Review of Distance Education 73.

[27] For example, Judy Drennan, Jessica Kennedy and Anne Pisarki, ‘Factors Affecting Student Attitudes Toward Flexible Online Learning in Management Education’ (2005) 98(6) Journal of Educational Research 331; Kian-Sam Hong, ‘Relationships Between Students’ and Instructional Variables With Satisfaction and Learning from a Web-Based Course’ (2002) 5(3) The Internet and Higher Education 267; Judith Pena-Shaff, William Altman and Hugh Stephenson, ‘Asynchronous Online Discussions as a Tool for Learning: Students’ Attitudes, Expectations, and Perceptions’ (2005) 16(4) Journal of Interactive Learning Research 409; Pei-Chen Sun et al, ‘What Drives a Successful e-Learning? An Empirical Investigation of the Critical Factors Influencing Learner Satisfaction’ (2008) 50(4) Computers & Education 1183.

[28] For example, Avner Caspi, Paul Gorsky and Eran Chajut, ‘The Influence of Group Size on Nonmandatory Asynchronous Instructional Discussion Groups’ (2003) 6(3) The Internet and Higher Education 227; Michelle A Drouin, ‘The Relationship Between Students’ Perceived Sense of Community and Satisfaction, Achievement, and Retention in an Online Course’ (2008) 9(3) (Fall 2008) Quarterly Review of Distance Education 267; Alfred P Rovai, ‘Sense of Community, Perceived Cognitive Learning, and Persistence in Asynchronous Learning Networks’ (2002) 5(4) The Internet and Higher Education 319; Hyo-Jeong So and Thomas A Brush, ‘Student Perceptions of Collaborative Learning, Social Presence and Satisfaction in a Blended Learning Environment: Relationships and Critical Factors’ (2008) 51(1) Computers & Education 318.

[29] For example, Heejung An, Sunghee Shin and Keol Lim, ‘The Effects of Different Instructor Facilitation Approaches on Students’ Interactions during Asynchronous Online Discussions’ (2009) 53(3) Computers & Education 749; Arthur Bangert, ‘The Influence of Social Presence and Teaching Presence on the Quality of Online Critical Inquiry’ (2008) 20(1) Journal of Computing in Higher Education 34; Beverly L Bower and Akihito Kamata, ‘Factors Influencing Student Satisfaction with Online Courses’ (2000) 4(3) Academic Exchange Quarterly 52; Vanessa P Dennen, A Aubteen Darabi and Linda J Smith, ‘Instructor–Learner Interaction in Online Courses: The Relative Perceived Importance of Particular Instructor Actions on Performance and Satisfaction’ (2007) 28(1) Distance Education 65; Alyssa Wise et al, ‘The Effects of Teacher Social Presence on Student Satisfaction, Engagement, and Learning’ (2004) 31(3) Journal of Educational Computing Research 247.

[30] For example, Jongpil Cheon and Michael M Grant, ‘Are Pretty Interfaces Worth the Time? The Effects of User Interface Types on Web-Based Instruction’ (2009) 20(1) Journal of Interactive Learning Research 5; Serçin Karatas and Nurettin Simsek, ‘Comparisons of Internet-Based and Face-To-Face Learning Systems Based on “Equivalency Of Experiences” According to Students’ Academic Achievements and Satisfactions’ (2009) 10(1) (Spring 2009) Quarterly Review of Distance Education 65.

[31] ‘Blended learning’ involves face-to-face instruction combined with computer-mediated instruction. See, for example, Ireland, above n 6, 140.

[32] For example, end-of-course evaluations conducted by the PLT college in Queensland, New South Wales, Victoria and Western Australia disclosed that students reported a lower satisfaction rating for online discussions compared with face-to-face workshops and email feedback. For the sake of clarity, the end-of-course evaluation forms and data mentioned here were not used in this study. A new survey instrument was created, based on the theoretical frameworks described, to collect data for this study, and the data was collected and stored in accordance with the Deakin University Human Research Ethics Committee requirements.

[33] Above n 27.

[34] Above n 28.

[35] Above n 29.

[36] Above n 30.

[37] Above n 9.

[38] Above n 26.

[39] Above n 27.

[40] D Randy Garrison, ‘Online Community of Inquiry Review: Social, Cognitive, and Teaching Presence Issues’ (2007) 11(1) Journal of Asynchronous Learning Networks 61.

[41] Ibid 68–9.

[42] Ethics approval for the research required that invitations be sent to the students after they completed their PLT coursework and assessments, partly to avoid bias where respondents might provide survey answers in contemplation of the effect the answers might have on grades. On the other hand, collecting survey data after completion of assessments might also skew results because of respondents’ reaction to their grades. Additionally, it is likely that many if not most students were focused on preparing their admission documents at the time and that this would affect the response rate. The low response rate might limit the statistical significance of the data. The problem of ‘timing’ the collection of data is an example of one difficulty encountered in practitioner research.

[43] Goteti Bala Krishnamurty, Patricia Kasovia-Schmitt and David J Ostroff, Statistics: An Interactive Text for the Health and Life Sciences (Jones and Bartlett Publishers, 1994) 588, caution that chi-square should not be used where any expected count is less than one or more than 20 per cent of the expected counts are less than five. David S Moore and George P McCabe, Introduction to the Practice of Statistics (W H Freeman and Company, 3rd ed, 1998) 631, 658, advise that a chi-square test for 2 x 2 tables should only be used ‘where all four expected cell counts be five or more’ and that where expected cell counts are low it is best to use an exact test rather than chi-square. Alan Agresti, An Introduction to Categorical Data Analysis (John Wiley & Sons, Inc, 2nd ed, 2007) 45, advises that when counts are small, ‘one can perform inference using exact distributions rather than large sample approximations’.

  For a complete description of Barnard’s exact test see George A Barnard, ‘A New Test for 2 x 2 Tables’ (1945) 156 (3954) Nature 177; George A Barnard, ‘A New Test for 2 x 2 Tables’ (1945) 156 (3974) Nature 783; George A Barnard, ‘Significance Tests for 2 x 2 tables’ (1947) 34 Biometrika 123. For discussion comparing conditional and unconditional binomial tests see Cyrus R Mehta and Pralay Senchaudhuri, ‘Conditional versus Unconditional Exact Tests for Comparing Two Binomials’ (2003) <www.cytel.com/papers/twobinomials.pdf>. For discussion and practical application of tests to analyse 2 x 2 tables see John Ludbrook, ‘Analysis of 2 x 2 Tables of Frequencies: Matching Test to Experimental Design’ (2008) 37 (18 August 2008) International Journal of Epidemiology 1430. Barnard’s test is not uncontroversial: see Agresti, ibid, 95, for a discussion of Fisher’s criticism. Agresti observed, however, that an ‘unconditional test tends to be more powerful and less conservative than Fisher’s exact test’.
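
As a practical illustration of the cell-count cautions above, the following sketch (the 2 x 2 table values are invented for demonstration) checks the expected counts computed for a chi-square test and falls back to Barnard’s exact test where any expected count is below five:

```python
# Illustrative sketch of the expected-count rule of thumb; table invented.
from scipy.stats import barnard_exact, chi2_contingency

table = [[3, 9],
         [7, 2]]

chi2, p_chi2, dof, expected = chi2_contingency(table)
if (expected < 5).any():
    # Low expected counts: an exact (unconditional) test is preferable.
    result = barnard_exact(table)
    print(f"Barnard's exact test: p = {result.pvalue:.3f}")
else:
    print(f"Chi-square test: p = {p_chi2:.3f}")
```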

[44] Marzano and Kendall, above n 18.

[45] Ally et al, above n 12.

[46] Alistair R Morgan, Improving Your Students’ Learning: Reflections on The Experience of Study (Kogan Page, 1993) 28, 120.

[47] Independence testing is described at Part V.

[48] Ziad M Baroudi, ‘Formative Assessment: Definition, Elements and Role in Instructional Practice’ (2007) 8(1) Post-Script: Postgraduate Journal of Education Research 37, 39.

[49] Here ‘n’ = the total number of respondents to the question. Responses to Likert-scale items were scored: 1.0 Strongly Disagree, 2.0 Disagree, 3.0 Neither Agree nor Disagree, 4.0 Agree, 5.0 Strongly Agree. ‘x̄’ is the mean score, and ‘s’ = the standard deviation, where the question involved a five-point Likert item. The authors acknowledge the debate concerning treatment of Likert items as interval or ordinal data.
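
As a worked example of this scoring convention (the response vector is invented), the summary statistics reported in Part IV would be computed along these lines:

```python
# Worked example of the Likert scoring described in this note;
# the responses are invented for illustration.
import numpy as np

# 1 = Strongly Disagree ... 5 = Strongly Agree
scores = np.array([4, 5, 3, 4, 2, 4, 5, 4])

n = scores.size
x_bar = scores.mean()          # mean score (x-bar)
s = scores.std(ddof=1)         # sample standard deviation

print(f"n = {n}, mean = {x_bar:.2f}, s = {s:.2f}")
```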

[50] Anthony R Artino Jr, ‘Online or Face-To-Face Learning? Exploring the Personal Factors that Predict Students’ Choice of Instructional Format’ (2010) 13(4) The Internet and Higher Education 272.

[51] Shu-Sheng Liaw, ‘Investigating Students’ Perceived Satisfaction, Behavioral Intention, and Effectiveness of E-Learning: A Case Study of the Blackboard System’ (2008) 51(2) Computers & Education 864.

[52] Lin, Lin and Laffey, above n 26.

[53] Wu, Tennyson and Hsia, above n 25, 157.

[54] Drennan, Kennedy and Pisarki, above n 27, 337.

[55] Sun et al, above n 27, 1196.

[56] Jonathan Brinkerhoff and Carol M Koroghlanian, ‘Student Computer Skills and Attitudes Toward Internet-Delivered Instruction: An Assessment of Stability Over Time and Place’ (2005) 32(1) Journal of Educational Computing Research 27.

[57] Hong, above n 27, 279.

[58] Pena-Shaff, Altman and Stephenson, above n 27, 409.

[59] <http://www.instantasp.co.uk/Products/InstantForum/WhatsNew.aspx>.

[60] Above n 18.

[61] Professional responsibility and ethics was the subject studied in the online discussions.

[62] Nehme, above n 6.

[63] Terry Anderson and Alex Kuskis, ‘Modes of Interaction’ in Michael G Moore (ed), Handbook of Distance Education (Lawrence Erlbaum Associates, 2nd ed, 2007); Judith Blanchette, ‘Characteristics of Teacher Talk and Learner Talk in the Online Learning Environment’ (2009) 23(5) Language and Education 391; Jason F Rhode, ‘Interaction Equivalency in Self-Paced Online Learning Environments: An Exploration of Learner Preferences’ (2009) 10(1) International Review of Research in Open and Distance Learning <http://www.irrodl.org/index.php/irrodl>.

[64] Garrison and Archer, above n 9, 77, 78.

[65] Ibid, 79–80.

[66] Bolliger and Wasilik, above n 29; Bower and Kamata, above n 29; Herbert, above n 29; Hong, above n 27.

[67] Shin, above n 29.

[68] Shin and Chan, above n 29, 285.

[69] Young and Norgard, above n 29, 114.

[70] Paechter, Maier and Macher, above n 29, 228.

[71] Kelly, Ponton and Rovai, above n 29.

[72] Dennen, Darabi and Smith, above n 29, 77.

[73] Wise et al, above n 29.

[74] Bangert, above n 29.

[75] Blignaut and Trollip, above n 29.

[76] An, Shin and Lim, above n 29, 758.

[77] Rovai, above n 28.

[78] In Garrison and Archer, above n 9, 77, 82.

[79] Drouin, above n 28.

[80] Pena-Shaff, Altman and Stephenson, above n 27.

[81] Gilbert, Morton and Rowley, above n 26, 570–1.

[82] So and Brush, above n 28, 331–2.

[83] Khe Foon Hew and Wing Sum Cheung, ‘Attracting Student Participation in Asynchronous Online Discussions: A Case Study of Peer Facilitation’ (2008) 51(3) Computers & Education 1111, 1120.

[84] It is not suggested that the correlations prove causation; however the correlations could inform further investigation.

