The Clinical Legal Education Handbook: 3. Assessment in clinics: Principles, practice and progress


Part 3
Assessment in clinics: Principles, practice and progress


Richard Grimes and Beverley Rizzotto

Introduction

Academics and students spend considerable amounts of their time undergoing various forms of assessment, be it revising for exams, marking scripts or otherwise producing evidence of performance and achievement.

Not all clinics are, of course, assessed.1 This section, however, is aimed at those lecturers and institutions developing clinics that form part of the curriculum, or those delivering extra-curricular clinics, who want to maximise opportunities for learning and reflection even where participation in a clinic is not assessed summatively.

Given that assessment is time-consuming, resource-intensive and brings with it significant pressure – even stress – we need to be sure, as participants in the process, that what we are doing has an identifiable rationale and value.

Before looking at the detail of assessment in a clinical legal education (CLE) context, let us raise some fundamental questions:

• What do we mean by assessment?

• Why do we assess in any given circumstance?

• When is it appropriate to assess?

• What form might, and should, assessment take?

We will deal with each question in turn and then look at assessment in the clinic.

Finally, it is worth raising the point that those assessing students’ performances in clinical work are sometimes subject to the criticism that assessment in this context is overly subjective and possibly inconsistent. Working in close proximity with clinic students does mean, first, that the ‘teacher’ has a high degree of interaction with the student and, second, that this regular contact understandably rubs off on the student. It raises the question of whose work is being assessed – the student’s or the teacher’s?

Also, if students are working in teams or ‘firms’ within the clinic, they may be putting in – to a greater or lesser extent – collective effort. How is this taken into account in assessment?2

We acknowledge that these issues are problematic, but would ask:

• Why shouldn’t a student benefit from the advice and guidance of their teacher?

• Isn’t all assessment ultimately a subjective exercise, mitigated only by a careful and robust adherence to assessment regimes and a system of moderation and independent3 review?

Ensuring that learning outcomes are specific and measurable and that assessment tools accurately capture the extent to which students achieve these outcomes should minimise any difficulties presented by experiential learning.

Meaning of assessment

In common parlance, assessment can be taken to mean a process of evaluation.4 In universities and colleges this is normally considered to be focused on the measurement of student performance. We suggest – particularly in the context of assessment for learning (see below) – that assessment can be much more: incorporating reflection on the form and content of the curriculum, on teacher/learner perceptions of the educational process and on the efficacy of the module or programme, particularly if it involves a live client dimension.

We wish to add one other consideration encapsulated by the saying ‘less can be more’. The temptation in many law schools is to teach as much as possible in what is often an overcrowded and content-driven curriculum. This stress on volume and substance is normally reflected in a raft of learning outcomes (LOs), each of which, if their inclusion is to be justified, needs to be capable of measurement in terms of the extent of student attainment.5

Careful consideration needs to be given to what the LOs cover; whether each is essential in the context of the wider curriculum; and whether the assessment of each LO is, in fact, measurable, specifically related to the individual LO concerned, achievable given the context and timeframe, and appropriate to the level of study.6

Further, assessment is often categorised as either summative or formative.7 The former is concerned with a definitive statement of performance at a defined moment (such as an examination result), whereas the latter is intended to support the learning process, often being accompanied by detailed feedback.

While we recognise that the distinction between these types of assessment is clear and widely practised, there is no reason why assessment should not contain both summative and formative elements, and indeed every reason why this might be highly appropriate.8 Unless the form of assessment is designed to endorse competence (the legal equivalent of the driving test) we suggest that every effort should be made to have a formative element in all assessment – whether or not the student is being summatively assessed. In other words, students’ work should, ideally, be linked to feedback to enable the students to reflect on the quality of their efforts and how to improve in future.

The starting point in all assessment has to be with the LOs. What are the outcomes that the module/programme is designed to achieve? These may focus on doctrinal principles, on legal and related skills and/or on professional responsibility and wider ethical considerations.

The LOs may be a combination of knowledge, skills and values depending on desired outcomes. For example, on an undergraduate law degree the emphasis, understandably, could be focused on developing doctrinal understanding. On a vocational course undertaken by aspirant legal professionals, the focus may be more on skills and professional practice considerations. Some argue (and we agree) that ethical issues are pervasive to all legal study and a clinical (or other) module and degree programme must take this into account in setting relevant outcomes.9

Why we assess

There may be a wide range of externally imposed reasons for assessment, many of which appear – in practice at least – to relate to defined outputs, such as university performance league tables, inter-student rankings and assessing competency for the purpose of a professional qualification.

We suggest that for educators and students the initial objective of assessment should be to support the educational process – in other words, a focus on assessment for learning.10 Second, albeit importantly and unavoidably, assessment can, and often must, serve additional purposes, including supplying evidence for external requirements and, of course, the marking and grading of students.

When to assess

If assessment is to aid learning then the timing of assessment is highly significant. A very common assessment regime – whether it is of student performance or academic input – is for the evaluation to take place at the end of the period of relevant delivery. This might typically be at the end of the term or semester following the student’s completion of the module in question.

We suggest that while it may be logical and/or required by university regulations to place the entire assessment requirement at the end of the relevant period of study, to do so misses learning and revision opportunities. Thought should therefore be given to introducing formative (and possibly summative) assessment components at different points of a programme so that you can monitor the progress of the student and the teacher/module/programme and build on the lessons learnt. The term continuous assessment is sometimes used to describe this process, although we prefer the expression regular and frequent assessment, as this is more reflective of how assessment is actually delivered, especially in a clinical setting.

A model for evaluation that has staggered assessment – a mix of the formative and summative (together and, if necessary, separately) – offered at different points during a course of study has the potential to capture the maximum learning opportunity and the means by which to assess relative progress, be that the progress of the students or the teaching/module/programme.

Outlines for clinical modules are contained in Part 5.4 of this Handbook as examples of how such an assessment regime might be designed and delivered. The models suggested are simply examples of what can be done and are intended to be neither prescriptive nor exhaustive.

What and how to assess: the options for the clinic

What to assess clearly depends on the overall programme aims and learning outcomes for specific modules. Part 5.4 of this Handbook contains sample outline clinical modules. You will need to think about the assessment criteria (applicable regardless of the mode of assessment) and the grade descriptors against which the relative quality of student work can be judged. These descriptors may, of course, be institutionally set and required.

Given all of the above considerations, what are the assessment options for a clinical module?

In principle any of the following assessment methods could be used to assess either student and/or teacher/module/programme performance. The relative advantages and challenges associated with each are noted briefly. The options are not mutually exclusive and a combination of two or more may be appropriate depending on LOs set and university/regulator practices and requirements.

Let us first consider the assessment of a student’s academic performance.

Written examination

This may be seen or unseen, open or closed (in terms of the materials allowed to be used in the exam), taken at home or conducted in a university setting under exam conditions. For assessing clinical work the written examination is perhaps the least appropriate method, although it can be used to assess LOs, as for any other module. The reason is that the interactive and reflective nature of clinical legal education, and the stress on assessment for learning, open up the opportunity for models of assessment that build on knowledge and understanding.

Written examinations also traditionally take place at the end of a module which, referring to the point made above, deprives the student of the chance to test their learning part-way through a module, and of the space to further develop their studies through involvement in the latter stages of the module.

That said, the attraction of the written examination is that it can be set, sat and marked in a relatively short time and fits in with much of the rest of the taught programme. It is, of course, possible to introduce a formative element to examinations through the use of practice or mock exams, and of previous exam papers on which students can practise and receive feedback.

Multiple choice tests (MCTs)

This means of assessment appears to be increasingly favoured at the vocational stage of legal education.11 MCTs have a ‘right’ or preferred option in terms of answer. While this may be objectionable to many (on the basis that there is often no ‘correct’ answer), MCTs can be a very effective and resource-economical tool for assessment, particularly when delivered and assessed by computer and online. In the context of clinic they could, for example, be used as part of an induction programme to check whether a student understands fundamental principles, such as client confidentiality or conflicts of interest.

At York Law School a package was developed to assess such issues and used as a pass/fail hurdle – the student can only progress to real client work once they have shown a basic level of competence through selecting the ‘right’ answer. Under the York model a student can access the test online and take it as many times as they need to in order to pass.

There are, of course, limitations to this approach, but at the very least it requires the student to think about the outcome even if by a process of elimination.

Coursework

Again this traditional form of assessment used on many modules can be easily adapted for use in the clinic. This may consist of essay or problem questions in which the students address issues relevant to their clinical experience. Similar advantages and disadvantages to those identified above can be cited here.

As seen in the model module outlines contained in Part 5.4, selective use of essays – particularly when timed part-way through a module – can be a very effective assessment technique. Coursework can also be used to assess students who must carry out independent research and study as part of their degree. The clinic may be the setting for such research – for example, a detailed analysis of the law on disability discrimination and related practical considerations, arising out of a client case. Assessment of this activity may take the form of a research report or essay.12

Presentations

As part of their clinical work, law clinic students are frequently called upon to present their findings and other aspects of their work. This may be done on an individual and/or group basis. Group presentations are, however, problematic in terms of assessment as the question arises of whose work is being assessed, although there are ways of addressing this.13

The presentations may be oral or written, may be submissions or reports, may make use of visual aids (such as posters) and/or may be e-technology based, including web postings, blogs and email exchanges. Of course, for anything that involves case-sensitive information students must ensure that client confidentiality is respected.

A presentation can also be used as part of students’ clinical work to show the extent of their reflection on their engagement in their clinical work, and to consider the skills they have utilised during their time in the university or externally-based clinic.

Portfolios

One of the most common forms of assessment in clinics is the portfolio.14 This device can be simply a collation of student work – for example, a collection of documents drafted by the student, materials they used (especially in their research) and entries maintained in a diary or journal detailing what they did in the clinic.

A portfolio may take on a strongly reflective nature where the student not only outlines what they did, but also goes on to set out how and why matters worked out as they did – and, importantly, what could have happened and what they learnt from that experience, particularly what they might do (or not do) next time in terms of improving performance. A portfolio might also include identifying shortcomings in the law and practice and how this might be reformed.

Assessment here might not only be based on the portfolio submission itself but can include the student’s response to tutor feedback on the original submission.15

Viva

Using an oral defence of a student’s understanding is not common in law school assessment, at least not at undergraduate or Master’s level. We suggest that this is an oversight and that the viva voce can be a very powerful and valuable assessment method. Personal experience suggests that individual vivas are somewhat intimidating for students and are time-consuming to conduct.16 We therefore prefer a group viva, in which all of the students who have worked on specific cases sit down with the examiners to discuss their experiences. Clearly the LOs need to be linked to this assessment technique, and you need to think about the problems noted above with assessing individual and group work.

One solution is to assess individual performances but to allow the ‘conversation’ to take place in a group setting. There are several advantages to this, including the logistical and the psychological, which are discussed in detail elsewhere.17 The viva can avoid some of the problems caused by the tendency of students to ‘retro-fit’ evidence to suit assessment demands (deliberate or otherwise). It also allows the examiner to probe beyond initial responses and to more fully test the extent of a student’s understanding. Such a face-to-face encounter also largely avoids the potential for plagiarism.

Casework performance

For obvious reasons, one tempting medium for assessment is the work that the student has actually done in the clinic, for example as evidenced by what appears on the client file. We suggest that while this may be an appropriate format for assessment it runs the danger of making the quality of what the student does in terms of casework the focus for assessment. Given that what is done for the client must be professionally competent, is it fair to set such a standard for the student who is not yet legally qualified? Of course, the clinic supervisor must ensure that the client receives a professionally acceptable service but it is not the student’s responsibility to ensure this.

Unless the learning outcome requires demonstration of professional competence (for example, as a pre-qualification bar examination might) a student, for example at the undergraduate level, should be assessed on criteria other than competence. Again, depending on specific LOs, on an LLB degree credit should perhaps be given to students who reflect insightfully on their shortcomings rather than those who can draft a letter as a practising lawyer might. Being able to recognise what is good (or not so good) is an indicator of the depth of a student’s understanding.

Finally, under this topic, assessing actual casework is also problematic in terms of equivalence. Assessing one student’s casework will almost certainly and necessarily be different from another’s, given the different case facts and legal issues involved in each case.18

Dissertation/thesis

In principle, there is nothing to prevent (and many reasons in favour of) assessment of student performance by way of a substantial written submission on a topic relevant to the student’s clinical studies.

For example, a student may have worked on a family law case where a client was seeking financial payments for herself and her children from her ex-husband. After advising the client the student may choose to consider the law on child support more generally and whether this is an area of law ripe for reform. The study may also involve a comparative element looking at the legal provision elsewhere, for example in the EU. If one of the LOs addresses law reform then submission of a dissertation on this subject may be an entirely valid form of assessment. Something less than a formal dissertation – a long essay or report – may also serve a similar purpose.

We can now consider assessment outside of the context of student performance.

Feedback from students

For several reasons the clinic lends itself to self-scrutiny. The interactive nature of clinical study and the reflective process it encourages and requires offer a unique opportunity for all participants – teachers, students and clients – to assess the value of what has taken place.

Post-case questionnaires for all concerned are a valuable way of eliciting such responses. Students might also be given opportunities, before their clinic experience, to record their perceptions and expectations, and to use these to aid reflection during and after the clinical experience. An annual report can capture some of this feedback. Part 5.5.4 contains an example of such a report from the University of Wolverhampton. Experience suggests that all concerned – staff, students and clients – greatly appreciate the work and value of the clinic.

Sample clinic experience questionnaires for students and clients are also contained in Parts 5.4 and 5.5.

External feedback

Similarly, feedback can (and in the case of external examiners in UK universities, must) be sought from external sources. In particular, we recommend an advisory board. While having recommendatory powers only, such a body can be an invaluable source of help, especially if all relevant stakeholders are represented – including the local practising profession, the bar association/law society, not for profit/NGO groups providing legal services, academic staff, students and relevant community groups. Not only will the board assist in the process of evaluation and the shaping of future strategy, but the involvement of the board will add credibility to and ensure a degree of ownership of the clinic in the wider world.

A forum for discussion

As we know well in the academic world, peer review is a frequently used and often highly valuable means of sharing views and enhancing the ultimate quality of what is produced, be that clinical practice or scholarship based on experiential learning.

To this end, clinic networks such as CLEO19 in the UK and CLEA20 in the USA are an invaluable and easily accessible conduit for the exchange of views and ideas, and for the sharing of proven good or ‘best’ practice. On the international stage, GAJE functions as a highly effective clinical network.21

Quality assurance is also promoted, and to an extent guaranteed, by the publication of articles and other manuscripts on clinical legal education. The International Journal of Clinical Legal Education (IJCLE) is one such outlet and has, in recent times, published a growing body of empirical and related research. The largely US-focused and based Clinical Law Review is also a very helpful source of relevant publications.

All of these fora promote and support critical evaluation and, in consequence, assessment of what is happening on the ground. More detail on CLE networks can be found in Part 6 of this Handbook.

As a final comment under this section it should be noted that most, if not all, of the above methods of assessment can be delivered and assessed through electronic means, and increasingly both submission and assessment (and also feedback) can be carried out (and given) in e-format.

Feedback

So far we have only mentioned feedback in passing. Reams have been published on the role and relevance of feedback, particularly since the rise and now prominence of the concept of student-centred learning.22 In a clinical context many clinical faculty pride themselves on the quality of the teacher/student dynamic, particularly in terms of learning.

A law clinic readily lends itself to such a positive relationship. The extent of supervision required, the small-group learning context and the high levels of motivation resulting from live client clinical work all contribute to a fertile feedback environment. In other modules feedback will principally take the form of notations on student submissions (essays in particular and, normally to a much lesser extent, examinations), though it is increasingly given in e-format. The big difference in the clinic is that there is ample opportunity for feedback to be given to students, promptly and in detail, as client care (let alone good education) demands it.

Students can be given verbal and/or written feedback, to meet institutional requirements and address individuals’ learning styles, both formally and informally. For example:

• a feedback pro forma developed as part of the client file that a student is assigned to

• an email that is then placed on the client file, serving also as an accurate and true record of feedback given as part of the supervisory process.

In using informal discussions as a means of providing feedback, students are encouraged to conduct self and peer assessment of their performances in the clinic – what they think they did well, what did not go to plan and what they need to improve on for the next time. This is best done relatively soon after the event, so the experience is fresh in the students’ minds and they can then take the feedback away, reflect on it and develop their knowledge, skills and values for the next occasion.

Also, with much clinical work being conducted in small groups (hereafter referred to as student law firms (SLFs)), peer feedback is an important component of the process, with feedback becoming an exchange of ideas and concepts, each piece leading from and building on what has gone before. Regular meetings between student and case supervisor are also needed to ensure that the necessary discussions take place on case progress. Where issues of common interest arise that affect everyone in the clinic – for example, professional practice issues around client confidentiality or conflicts of interest – there should be meetings of the whole clinic cohort. The regular SLF meetings are sometimes referred to as ‘rounds’, drawing on the medical analogy of the consultant and medical students touring wards to discuss symptoms, diagnoses and treatment.23

Even though it may not be formally recognised as such, all of this is feedback in the sense that it informs student understanding. Requiring students to assume responsibility for their cases (albeit closely supervised) encourages them to assume a degree of ownership and promotes learning, with teacher and learner interacting in a shared experiential context.

Finally, each institution is likely to have its own assessment/feedback rules and policies. This regime must be satisfied over and above the feedback described here, although the teacher should have ample evidence on which to base assessment and feedback given the interaction involved in clinic work.

Some have suggested, understandably, that the close relationship between clinical supervisor and student can skew the ‘objectivity’ of the assessment process. We put the word in inverted commas as it presupposes that any assessment process is objective. We suggest that this is not the case, unless perhaps the assessment format is a multiple choice question and there is a pre-defined ‘right’ answer with the marker sticking rigidly to matching the relevant answers. Anything short of this is relatively subjective. The safeguards against unacceptable degrees of bias include having clear and transparent LOs and assessment criteria, assessing in teams of two or more and having an internal and external moderation system.

Not all CLE activities are, however, formally assessed. Many clinics operate on an extra-curricular basis, as an opportunity for students to gain much-needed legal work experience and an exposure to the significance of pro bono services. In such circumstances, of course, there may be formative assessment – for example, detailed feedback on student performance – and the clinical work may have an impact on other aspects of the programme.

Practical legal skills are currently assessed as part of formal training on the way to becoming a barrister or solicitor. ‘Practising’ legal skills in a clinic should provide students with invaluable experience: the opportunity to engage in activities in a legal setting – such as conducting interviews or drafting correspondence – outside the confines of formal assessment, and to receive feedback on a one-to-one basis (perhaps in much more detail than they would as part of their course). In turn, this should equip them better for a summative examination elsewhere on the degree or vocational course.

Such opportunities will also allow the student to conduct self-assessment and increase the prospect of reflection prior to any formal examination, not to mention enhancing their CVs and giving them ammunition for use in future job interviews.

Future implications: informing the curriculum and the academic/professional regulators

The legal education world is in a state of flux. Some would say this is long overdue. The work and final report of the Legal Education and Training Review (LETR)24 set a new agenda, clearly highlighting diversity and access, ethics and professional responsibility and the value of work-based learning.

While many have criticised the report25 its implications have been profound, with both the Solicitors Regulation Authority (SRA) and the Bar Standards Board – that respectively govern the admission of solicitors and barristers in England and Wales – actively involved at the time of writing in major reviews of the future of legal education and training. In the case of the SRA, the advent of the Solicitors Qualifying Examination (SQE) will remove the requirement for candidates to have completed a Qualifying Law Degree in order to be admitted as a solicitor.26

Strictly speaking, the remit of the LETR did not include undergraduate programmes, but the quasi-regulatory body for the so-called academic stage of legal (and other) education – the Quality Assurance Agency (QAA) – has reached similar conclusions on the standards, achievements, qualities and aptitudes expected of a law graduate.27

It seems that in a post-LETR world law schools will face less specific regulation of both the content and the form of the curriculum, but that what universities and colleges produce will need to be outcome focused – although those outcomes will not necessarily be measured until after graduation, at least on routes to qualification as a legal professional.

One thing is certain: depending on the LOs set and the assessment regimes designed, clinics stand to tick many of the relevant boxes – be that an ability to apply doctrine and theory to practice, an understanding of the nature and extent of the skills required of a practising (and competent) lawyer, or an appreciation of what it means to be a legal professional and of the role of a lawyer in modern society. As explored in Part 2, time spent in clinic may also form work experience that can be counted towards professional qualification.28

Thinking of assessment in its widest sense, the clinic may also be highly relevant (and its impact measurable) in terms of its contribution to legal service provision, recruitment and retention of students, and the degree to which clinical experience enhances a student’s employability. Robust and demonstrable assessment may serve many purposes, from convincing an employer of a student’s worth (having passed, for example, a clinic programme that requires an understanding of professional practice rules) to providing a credible and accepted form of qualification.

Assessment, therefore, is not to be regarded as an obstacle to be overcome or a commitment to be resourced: if taken in a constructive spirit, it is an important part of the legal education process.

______________

1 See Part 1 for a discussion on the many different models of clinic.

2 For more on the existing literature exploring these issues, see Tribe Mkwebu’s literature review in Part 4.

3 Often through external examining.

4 For a discussion on what assessment means in the light of traditional and more contemporary approaches to the subject, see P Griffin (ed), Assessment for teaching (Cambridge University Press, 2014).

5 Linking outcomes and assessment modes and enhanced learning is sometimes referred to as ‘constructive alignment’: see J Biggs, ‘Enhancing teaching through constructive alignment’ (1996) Higher Education 32(3), 347.

6 One device for ensuring that outcomes are effectively linked to assessment is in the use of the management tool often referred to by the acronym SMART: specific, measurable, achievable, realistic and timely. See G Doran, ‘There’s a SMART way to write management’s goals and objectives’ (1981) Management Review 70(11), 35.

7 For a discussion of both summative and formative assessment and the relationship between them, see K Sambell, L McDowell and C Montgomery, Assessment for learning in higher education (Routledge, 2013), and in particular pp. 32–48.

8 For a further discussion of this, see R Grimes and J Gibbons, ‘Assessing experiential learning – us, them and the others’ (2016) International Journal of Clinical Legal Education 23(1), 107.

9 See in particular D Nicolson, Teaching and learning legal ethics: What, how and why?, in R Grimes, Rethinking legal education under the civil and common law: A roadmap for constructive change (Routledge, 2017).

10 For a practical but thorough guide to this, see P Black, C Harrison, C Lee, B Marshall and D Wiliam, Assessment for learning: Putting it into practice (Open University Press, 2004).

11 For example, MCTs are currently used on the Bar Professional Training Course and are likely to form part of the assessment regime for aspiring solicitors; see the proposals for the Solicitors Qualifying Examination, available at <https://www.sra.org.uk/home/hot-topics/Solicitors-Qualifying-Examination.page> accessed 23 August 2019.

12 This can extend to other forms of experiential and ‘hands-on’ learning, such as mooting and role play.

13 For an interesting discussion of the challenges and benefits of assessing group work, see D Williams, J Beard and J Rymer, ‘Team projects: Achieving their full potential’ (1991) Journal of Marketing Education 13, 45.

14 Using a medical setting, as is so often the case in academic writing on experiential learning, a very helpful discussion on the learning (and assessment) potential of portfolios can be found in M Jasper, ‘The portfolio workbook as a strategy for student-centred nursing’ (1995) Nursing Education Today 15, 446.

15 As practised, for example, in the law clinic at Strathclyde University.

16 The use of the viva in clinical assessment is discussed in R Grimes and J Gibbons (note 8).

17 Ibid.

18 For more on equivalence see Part 2.8: Quality assurance: Higher education and clinical legal education.

19 CLEO is the Clinical Legal Education Organisation (see the Glossary at Part 6).

20 The umbrella group for clinicians in the USA (which heavily influenced the formation of CLEO) is the Clinical Legal Education Association.

21 The Global Alliance for Justice Education (see the Glossary at Part 6).

22 For an overview of this concept and a discussion of assessment in this context, see D Brandes and P Ginnis, A guide to student-centred learning (Simon and Schuster Education, 1986).

23 The use of ‘rounds’ is well discussed in S Bryant and E Milstein, ‘Rounds: a “signature pedagogy” for clinical legal education’ (2007) 14 Clinical Law Review 195.

24 Setting standards: The future of legal services education and training regulation in England and Wales, (LETR, 2013) <https://www.letr.org.uk/wp-content/uploads/LETR-Report.pdf> accessed 14 July 2017.

25 For example, see A Sanders, ‘Poor thinking, poor outcome? The future of the law degree after the Legal Education and Training Review and the case for socio-legalism’ in H Sommerland, S Harris-Short, S Vaughan and R Young (eds), The futures of legal education and the legal profession (Hart Publishing, 2015).

26 <https://www.sra.org.uk/home/hot-topics/Solicitors-Qualifying-Examination> accessed 23 August 2018.

27 Quality Assurance Agency for Higher Education, Subject benchmark statement: Law (2015) <https://www.qaa.ac.uk/docs/qaa/subject-benchmark-statements/subject-benchmark-statement-law.pdf?sfvrsn=b939c881_16> accessed 14 July 2017.

28 See Part 2.9: Clinical legal education as solicitor qualifying work experience.

© Contributors, 2020