Today's guest post comes via the ACLEA "In the Loop" Newsletter, from H. Lalla Shishkevish* (D.C. Bar) and Susan Tomita* (ALI-ABA). For those who don't know, ACLEA, the Association for Continuing Legal Education, is a key resource for CLE professionals – administrators, trainers, managers, educators, publishers, programmers, and meeting professionals. The membership organization is devoted to providing educational opportunities and professional interaction for its members, who hail from the US, Canada, and many international locales.
ACLEA "Customers" of MCLE Regulators Respond to Survey
In the nearly 35 years since Minnesota established the first mandatory CLE (MCLE) requirements, lawyers have come to expect CLE providers to deliver high-quality courses with turnkey CLE accreditation service. To make this happen, CLE providers spend an ever-increasing amount of time on the administrative details of the accreditation process – calculating CLE hours to the nearest hundredth (or tenth, quarter, or half) of a credit, electronically monitoring attendance anywhere on the planet and at any time, and much more.
The relationship between CLE providers and the CLE regulators is ideally one of cooperation, with both groups striving to ensure that attorneys have the best options and tools possible to maintain their competence to practice law and improve their legal skills and knowledge. But how close to this ideal are we? And where are the problem areas, if any? Anecdotal evidence provides no clear answer – but indicates that there might be areas that could use improvement.
ACLEA Members as “Customers”
ACLEA’s MCLE Committee decided to take the first step toward measuring the administrative component of MCLE regulation by developing a survey to ask ACLEA members about their experience with the MCLE accreditation process and their interactions with MCLE regulators. This past June, ACLEA conducted its first-ever survey of members as “customers” of mandatory CLE regulatory agencies. Electronic questionnaires were emailed to all ACLEA members.
A summary of the collected data will be available on the MCLE Committee webpage. Ratings and rankings of regulatory agencies on specific aspects of their service are included, plus members’ suggestions and comments. The 75 completed questionnaires included a non-U.S. member version completed by nine individuals. A summary was given to the ACLEA Executive Committee, shared with CLEReg leadership, and provided as background information to the CLE Summit planners.
The MCLE Committee plans to conduct a similar Consumer Reports-style survey on a regular basis and to revise the questions for broader applicability. Because of assumptions underlying some of the questions, a number of members, including non-U.S. members, were unable to complete parts of the survey. The survey will be revisited after the CLE Summit, where discussion of the future of MCLE will be informed, in many ways, by the data we have collected.
Thanks go to Alan Treleaven, Dick Lee, Jan Majewski, and Gina Roers for their work on the survey; to MCLE Committee members for their input; and to ACLEA Executive Director Donna Passons and ACLEA staff for creating the electronic survey instrument and emailing it to all ACLEA members.
The diversity of CLE providers who participated in the survey is evident from answers to the first question (on the number of annual programs/events sponsored) and to its follow-up. According to the results, about one-third sponsored 45 or fewer programs, one-third sponsored 50 to 100, and one-third sponsored 101 to 3,000.
Sixty-five percent (65%) of respondents primarily submit their programs for accreditation to only one jurisdiction, with New York receiving the largest number (six) of these single-jurisdiction submissions, followed by California and North Carolina (four each). Of the 35% who submit to multiple jurisdictions, about one-third apply to fewer than five jurisdictions, one-third apply to six to 25, and one-third apply to more than 25.
For overall satisfaction with service, the 37 evaluated accrediting agencies received, on average, a solid “good” evaluation on a five-point scale that ranged from “excellent” to “poor” (4.02; for non-U.S. jurisdictions, 4.27). Ten agencies were rated “excellent” (Alabama by four respondents; and one respondent each for Colorado, Delaware, Mississippi, Missouri, Montana, Rhode Island, West Virginia, British Columbia (in Canada), and Queensland (in Australia)). Two agencies tied for second place (4.5 for Kentucky by two respondents and for Pennsylvania by 10 respondents). Only one jurisdiction received a “poor” evaluation in this category (Northern Mariana Islands by one respondent).
Accrediting agencies will be heartened to know that their highest average ratings were for professionalism, courtesy, and service-orientation of communications (U.S., 4.22; non-U.S., 4.64) and for accuracy of their output (U.S., 4.21; non-U.S., 4.43). The areas in which they received the lowest average ratings – in the “fair” range – were for the website containing answers for most issues (U.S., 3.52) and timely handling of appeals (non-U.S., 3.00).
Beyond the Numbers
In promising anonymity, the survey solicited candid comments and suggestions. Following is a summary of the results.
Uniformity, consistency, and standardization: The largest number of comments (25) concerned the differing rules for accreditation, monitoring, and reporting among the various types of CLE – differences that contribute to inconsistent interpretations, confusion, and inefficiencies adversely affecting agencies, providers, and lawyers. Providers and lawyers seeking credit approval in multiple jurisdictions are especially challenged to stay on top of the rules.
Communication: An almost equal number (23) of concerns related to communication issues – from receiving timely notice of approvals and rule changes to agencies’ ability to answer questions and resolve problems quickly. Commentators also noted the shortcomings of website postings to notify sponsors of rule changes and provide user-friendly information, such as program groupings by sponsor.
Simplification, efficiency, and speed: The need to simplify and speed up the accreditation and reporting process was the theme of many more comments (22), particularly through efficient use of email and online solutions. Too much time and paper are needlessly wasted.
Fees and payment: The high cost of fees – for applications, reporting, and late filings – was noted by a handful of commentators, with a suggestion that use of credit cards or debit accounts would streamline the payment process. These expenses can mean fewer accredited CLE options for lawyers.
Modern learning methods and rules: Rules reflecting 21st-century technology and learning preferences were the focus of another handful of commentators, one of whom gave credit limits on distance education as an example of a nonprogressive regulatory stance.
CLE quality: Only two comments linked the quality of CLE to the quality of its regulation. They noted that impending compliance deadlines and loose accreditation standards can contribute to compromises in quality.
Kudos to regulators: Respondents singled out 10 accrediting agencies for praise, noting, for example, their resourceful staff and efficient software for online reporting. But one commentator objected to those who step out of their regulatory role to intervene as a competing provider in the CLE market.
*H. Lalla Shishkevish is the Director of Continuing Legal Education at the D.C. Bar. She has held this position since 1996. She is the immediate past president of ACLEA.
*Susan Tomita directs special projects and AV at ALI-ABA Continuing Professional Education in Philadelphia. A program producer and member of the Pennsylvania bar for nearly 30 years, she was instrumental in establishing CLE via satellite TV for the American Law Network.