Here is the text of my talk from Legal Cheek conference today…
I’ve got ten minutes and five points, so I am going to be quick and not very nuanced.
Point 1. A lesson from history
I did the Law Society Finals. We spent a year learning very dry bits and pieces of law and process; as much as we could manage before we were tested to destruction.
And we learnt to pass. Although, interestingly, plenty did not pass. Who passed and who failed among my friends seemed disturbingly random. And I can point to some very successful lawyers who did not pass the LSF first time.
The course was laughably dull. Utterly uninspiring. Without a single moment of professional growth. If cynicism was the key ingredient of practice in law, it got us ready.
But it was a uniform test. And providers' pass rates were published. Interestingly, the College of Law had a somewhat poorer pass rate, but was the destination of choice for many students because (they thought) law firms preferred it.
That’s my short take on point 2, which is…
Why league tables won’t work
Any league table will be measuring all of the following at once: the students (who come with differing levels of ability, not evenly distributed amongst providers), courses, crammers, and work experience. Is a league table measuring the performance of the kinds of students University A gets? Or the kind of education University B gives? Or the impact of SQE provider X? Or crammer Y? Or the work experience students have had in Firms A, B, C or D?
My educated guess is that it is practice, practice, practice that will make the most difference. He who pays will win.
Point 3. SQE focuses on the wrong problem (consistency) and looks like it will duck the real problem (day 1 competence)
The SRA case is that concerningly inconsistent assessment on the LPC and LLB delivers ridiculously consistent performance on entry (close to 100% completion of training contracts). The very visible inconsistency early in the process worries them more than the absurd consistency at the end.
I don’t want to trivialise the SRA’s concerns about consistency, but I worry about competence more than I worry about LLB scores. Almost every study that has looked seriously at lawyer competence finds significant levels of incompetence. These studies span all kinds of work: high street to Magic Circle.
What lies behind that problem? Is it supervision and training in law firms or assessment in law schools? A genuine, if slightly leading, question. If we make the reasonable assumption that work-based learning is a significant problem, if not the main or the only one, the SRA proposals are weak. Work-based learning is regulated less – there is more flexibility and no attempt to raise the standards or horizons of the training contract. The SRA have absolutely no idea how uniform outcomes from the training contract are. This is the problem they should be concerned with. Assuring competence on Day One of qualification is the goal, but they are coy about testing at the point of qualification.
It may be, though, that SQE2 can ameliorate this problem. If SQE2 is sat at the end of a period of QWE, or really does need to be sat at the end of that period, as the SRA hopes, then it may test something important. Yet there is quite a lot of demand for SQE2 to be passed pre-QWE, for the convenience of firms and not for training and development reasons. This shows you what’s important: work planning matters more than competence and training (and suggests I am right to think that the problem is work-based learning, not college-based marking).
Point 4 is that there are some positives both to the form of assessment and the skills elements in the SQE.
Law schools will take MCTs more seriously as a, but not the, form of assessment. And that is a good thing insofar as law schools will benefit from a proper debate on assessment which looks at MCTs and other methods.
But the emerging curriculum on knowledge provides a depressingly deconstructed view of the lawyer. It portrays and assesses lawyers as mechanical, not creative or critical thinkers, and does not lay the ground for a deep understanding of law or problem-solving in law.
The assessment requirements will be worked on but right now they seem rather incoherent and atomistic. They read like bits of the LPC and bits of the degree squashed together by committee.
And in truth the assessment is a massive gamble. The work on setting the right level and testing consistent pass-fail levels is important but it is not what is most important.
The SRA can get the test reasonably consistent on a pass-fail basis; but not at any more finely graded level. And they do not know whether the assessment will really deliver on what is important. Are the right knowledge and skills – let’s forget about attitudes – assessed well?
I wonder too how this approach will impact on students as life-long learners. That is what we should want from professionals. Curiosity, self-improvement, ultimately – later on – vision and leadership: these are the values we should be inspiring through courses and assessments. I see no incentive coming from the SRA for SQE-oriented providers to take that seriously. Perhaps the market for SQE courses will drive it. We will see.
Similarly, the Alpha Plus report points out that the SQE will not test higher-level professional skills. The ability to initiate and carry out projects, or to generate new insights or propose new hypotheses, is not assessed. Maybe lawyers don’t need to do this. Maybe routine, mechanical thinking on problems is all that is needed.
I am reasonably confident that more purposeful, project type working, better critical thinking skills, and more creativity, are really important to legal services providers in all markets. The SQE walks us away from that not towards it.
And more generally, law schools shaping their curricula towards SQE1 or 2 are squeezing out, not injecting, creativity, depth and insight in law degrees. And of course electives are getting squeezed out at the equivalent of LPC stage, now to be Masters stage, if there is to be one.
On the plus side though, I think SQE facing law schools are taking clinical education more seriously – and this is one area where I think the SQE will drive some improvement. Law schools will move away from clinics as interesting experience for student CVs towards clinics being stronger engines of education, as well as trying to claim QWE from it. Turning on the clinical education engine is a demanding job but a worthwhile one.
Let me end with a couple of suggestions:
- The SRA need to get ready, and fund, cohort studies of students passing through the system post 20 whenever. They need to look seriously at the impact on career trajectories, diversity, and the relationships between different pathways and career success.
- They want a cooperative and evidence-based approach to evolving the assessment. Well good. This is a subject that deserves a respectful and more honest debate than it has engendered hitherto. There is fault on all sides for the quality of that debate. The academy is sometimes intransigent and imperious, the SRA has sometimes behaved very cynically. But it is up to the SRA to set the tone.
- If I were them I’d start by signalling phased introduction of the assessments. Testing them properly. And starting with the bit that has the most potential to do some good: SQE2. Only if SQE2 really needs good quality work-based education to get people to standard, and only if it really tests whether students are practice-ready on qualification, will SQE have done anything of value. Do that first. Show us it works. Then you have a case. And you will have made worthwhile progress. And incidentally, the SRA will be in a better position to judge how important inconsistency in LPC marking really is. The SRA should chase the real prize of competence on admission first, and not concern themselves with consistency pre-QWE.