Our articled clerks (graduates) take part in an intensive programme aimed at building their legal research skills during their first three months with us. They attend a one-hour workshop every two weeks on legal research method (that is, how to find judicial consideration, NOT how to use LexisNexis). Our passionate and creative training librarians try to make these workshops as engaging as possible, with games, competitions, video and lots of hands-on participation. The feedback is always great!
But although the graduates may enjoy coming to these sessions, I still had a million questions racing through my head:
How do we know their behaviour changes in the long run?
Have they actually learnt anything?
Are they better researchers at the end of the programme compared to the beginning?
How do we prove to the decision makers of the firm that our programme made a difference? What if it didn’t? What if…?
So many questions! But these questions were so vital (plus I got sick of asking them) that we knew we just needed to start somewhere. So, it might not be perfect, but here is our attempt to demonstrate the value of our programme:
1. We need an assessment!
Not only do we need the graduates to prove what they've learnt via an assessment, we also need consequences if they don't complete it. So we tied a research skills assessment to their first rotation performance appraisal. You know, the one their supervising partner reads and completes. Now that is an incentive for a fledgling lawyer!
The toughest thing about introducing an assessment, though, was our own reluctance. We worried about how people (graduates, partners, the HR department) would react. All that worry was for nothing! The graduates realised the assessment itself isn't scary (they need to create a research strategy, a plan for how they would research the answer to a question set by me; total time spent: about 20 minutes). People Development were wonderful and supportive, and the partners thought that any initiative to improve the quality of research work had to be a good thing.
It has been in place for three years now. What seemed like a big deal and a massive culture change was actually a pain-free process! Oh, and the results? Spectacular. I am pretty optimistic about the bright and creative graduates who join us here at Ashurst Australia, but the quality of their work left me amazed.
2. Measuring incremental improvement on specific research methods.
We know graduates can already research at some level. We don't really need to measure their ability to find a case or some legislation; we need to measure the nuance in their legal research method. In other words, I was trying to work out a way to measure the growing sophistication, or refinement, of their approach to legal research.
To do this, we had a short quiz to be completed at the beginning AND the end of each workshop. These quizzes were mapped to a set of legal research skills standards we had created, and those skills were, of course, developed through the workshop. The graduates filled out the identical quiz before and after each session — obviously, the "after" result was 100%! This helped to reinforce, to them and to me, that their research skills had been refined.
3. What is your biggest research challenge?
At the start of our programme, the graduates are asked to think about what their biggest challenge is when it comes to legal research. I get them all to write it down on big pieces of paper. You can probably guess the types of comments:
Where do I start?
I dread the walk of shame, when I have to go into the partner’s office empty handed!
How do I know if what I’ve found is up to date?
Why do I always seem to get either 10,000 results or 0? I want 20!
At each workshop, these challenges are pinned around the room. During discussion of the research topics the trainers relate the topic back to some of these challenges. Our final session is a game show. It looks back on the standards and the suggestions for improving the quality of legal research. How do they feel about their biggest research challenge now? It may not be solved, but has it eased?
If we have given these graduates strategies to deal with the aspect of legal research they find most daunting, we have succeeded! If we have proof, in the form of an assessment, that their research ability meets set standards, we have succeeded! If we can see evidence of a refinement in their ability to research, adding a layer of sophistication to their approach, then we have succeeded.
What have you done to measure the success of your research skills programmes?
How do you seek feedback on improvement?
How do you publicise the work that you do on refining graduate skills in your workplace?