What are the Risks Associated with AI in Legal Finance?

By Elaine Pasini MCIM

Updated 6 February 2024

Artificial intelligence (AI) is everywhere in the news and in conversation. There is a great deal of noise around the topic, and nowhere more so than in law firms: from tech to marketing to support, through to transparency, legislation and, of course, regulators.

The questions range from understanding the different definitions of AI in developing technologies, to central monitoring of its use, to future-proofing the firm's central functions and its AI systems, all while safeguarding client accounts.

How does AI impact legal finance teams?

Will UK law firm legal cashiers be impacted by AI?

The role of legal cashiers could certainly be impacted by artificial intelligence (AI) in several ways, although each practice will have its own principles and policies.

The ILFM believes that these five factors will make a difference in legal finance and accounts departments across the board:

  1. Automation of routine tasks. AI is already well established within our roles and firms, and it will increasingly be used to automate the repetitive tasks that legal cashiers and legal finance teams often handle, such as data entry and basic accounting. If AI is integrated carefully and respectfully, it could increase efficiency and accuracy, and free these busy and important roles to focus on more complex work.
  2. Improved decision making. Because AI can analyse large volumes of data and provide insights that aid decision making, it can reduce the scope for human error and support more accurate forecasting and budgeting.
  3. Risk management. PII and regulatory compliance concerns always loom over law firm owners and practice managers, and especially COFAs, so knowing that AI can help identify patterns and anomalies that may indicate fraud or other financial risks can only be a good thing for public trust and reputation. Innovation and investment in this technology could enhance the role of legal cashiers in managing risk.
  4. Job transformation. All our roles in the legal profession have evolved in some way and continue to do so with AI, so change should never be feared. While AI can automate certain tasks, it also creates a need for new skills, so it is worth law firms investing in the continued training and development of their legal finance teams.
  5. Job reduction. There is no getting away from the fact that AI could lead to a reduction in jobs for human legal cashiers and wider law firm teams, including certain lawyer roles. A 2021 report from the Law Society gives more detail.
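To make point 3 above concrete, here is a minimal, purely illustrative sketch of how an automated check might surface an anomalous client-account transaction for human review. The payment figures, function name and threshold are all invented for illustration; real AI-based monitoring tools are far more sophisticated than a simple statistical test.

```python
from statistics import mean, stdev

def flag_anomalies(amounts, z_threshold=2.0):
    """Flag transaction amounts that sit far outside the usual range.

    A simple z-score check: any amount more than `z_threshold` standard
    deviations from the mean is returned for human review. (With small
    samples a single outlier can never score much above 2, hence the
    modest default.) The idea of surfacing outliers for a cashier to
    inspect, rather than deciding automatically, is the key point.
    """
    sigma = stdev(amounts)
    if sigma == 0:
        return []  # all payments identical: nothing stands out
    mu = mean(amounts)
    return [a for a in amounts if abs(a - mu) / sigma > z_threshold]

# Routine client-account payments plus one unusually large transfer.
payments = [120.0, 135.5, 110.0, 128.0, 125.0, 9800.0, 130.0, 122.5]
print(flag_anomalies(payments))  # → [9800.0]
```

In practice, AI systems learn patterns across many features (payee, timing, frequency, matter history), not just amounts, but the workflow is the same: the system flags, and the human decides.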

We know AI can bring significant changes to our roles; it is a tool that augments human capabilities rather than replacing them, so we should take an interest and embrace change. If AI makes our lives easier, freeing us to focus on strategic decision making, better evaluation and a considered approach to AI within its regulatory framework, then it can only be a good thing.

Government and AI Regulation White Paper Update, 6 February 2024

The UK is on course for more agile AI regulation, with regulators backed by the skills and tools they need to address the risks and opportunities of AI, as part of the government's response to the AI Regulation White Paper consultation, published on 6 February 2024.

Please read the government's latest response, published 6 February 2024, on the government's website.

What are AI systems?

Artificial Intelligence (AI) systems are computer systems capable of performing complex tasks that, before recent advances in technology and machine learning, only a human could do: reasoning, making decisions and solving problems. AI systems are designed to think or act more like us (humans), taking in information from a variety of sources and deciding on a response based on what they have learned.

Don't worry too much about the different terminologies within the AI conversation. It's more important to understand which tools can open up new possibilities to increase technology adoption in your firm. This is where Generative AI tends to come into play.

Chatbots, of which the more advanced versions use AI, can provide more dynamic and contextually relevant interactions with the user.

Large Language Models (LLMs) work with language and are built on machine learning to understand how characters, words, and sentences function together.

Here's a list of AI technologies that Law Firm Finance Departments might find useful*:

  • OpenAI's ChatGPT and DALL-E. These are "Generative AI” (GenAI) tools that can produce text, audio, visuals and code.
  • Google's Bard. This is also a GenAI tool and can be used for a variety of tasks.
  • Anthropic's Claude. This is great for generating new outputs based on large quantities of existing or synthetic input data.
  • GitHub Copilot. This AI tool assists with code generation, which might be useful for legal cashiers working with software and databases.
  • ClickUp AI. This AI system can generate various document types (great for legal project managers) to speed up workflows.

*The ILFM has no affiliation with any of the above tools; this blog is for information purposes only.

What are the risks associated with using AI in Law Firm Finance and Accounting?

From the research we have done at the ILFM, the main risks associated with using AI in accounting and finance are the following:

  • Data Quality: because AI relies on large volumes of data, issues with data quality can lead to inaccurate predictions and, in turn, poor decisions.
  • Bias: AI systems can inadvertently learn and replicate biases present in their training data, leading to unfair or discriminatory outcomes. In October 2023 the UK government launched an innovation challenge to tackle bias in AI systems; see its press release for details.
  • Lack of Transparency: AI algorithms, especially those based on deep learning, are often referred to as "black boxes" because their decision-making processes are not easily understandable by humans. Taking a pro-innovation approach, the ILFM recommends asking your tech developers/suppliers about the mechanics behind their tools. Ask for explicit details about how their systems make decisions and process data. You may need to acquire detailed technical documentation and assurances of compliance with relevant regulations. It's all about due diligence and backing yourselves up.
  • Privacy: AI tools may require access to sensitive and confidential financial information, such as bank account numbers, National Insurance numbers and tax information. Can you imagine if this data fell into the wrong hands? Law firms, and especially legal cashiers and compliance officers, have a huge responsibility to protect client data and money, so understanding how AI affects the risk of identity theft and fraud must be taken seriously. More policies!
  • Regulatory Compliance: The use of AI in finance and accounting must comply with relevant laws and regulations. Non-compliance can result in legal penalties and reputational damage. We have listed current regulations below.
  • Ethics: Ethical considerations are paramount when using AI, especially when it comes to respecting user privacy and ensuring fairness. Ethics tends to come under the same umbrella as privacy and bias.
  • Expertise: Implementing and managing AI systems requires personnel with relevant expertise. If you are being told to manage the systems and train your colleagues, make sure you have training and continued development in this regard.
  • Cybersecurity: As AI systems become more prevalent, they also become a target for cyberattacks. 
  • Data Security: AI can capture sensitive data that may fall under various privacy laws across multiple borders, which must then be carefully managed by the provider (the AI company). Do make sure you and your colleagues know exactly the right questions to ask of the provider.

The Bar Council's latest guidance, 'Considerations when using ChatGPT and generative artificial intelligence software based on large language models', warns against taking the output of generative AI systems at face value.

It stresses that such systems do not analyse the content of data but rather act as 'a very sophisticated version of the sort of predictive text systems that people are familiar with from email and chat apps on smart phones, in which the algorithm predicts what the next word is likely to be'.
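That "predictive text" idea can be sketched in miniature. The following is a toy illustration only: real LLMs use neural networks trained on vast corpora, not simple word counts, and the training sentence below is invented. But it shows the basic principle of predicting the most likely next word from what has been seen before.

```python
from collections import Counter, defaultdict

def build_model(text):
    """Count, for each word, which words most often follow it."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(model, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

training_text = (
    "the client account must balance "
    "the client ledger must balance "
    "the client account must reconcile"
)
model = build_model(training_text)
print(predict_next(model, "client"))  # → "account" (seen twice vs "ledger" once)
print(predict_next(model, "must"))    # → "balance" (seen twice vs "reconcile" once)
```

The key takeaway is the same one the Bar Council makes: the model predicts what is *likely* to come next, not what is *true*, which is exactly why its output cannot be taken at face value.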

The guide goes on to identify three key risks with the technology:

  • Anthropomorphism: systems are designed and marketed to give the impression that the user is interacting with a human, when this is not the case.
  • Hallucinations: outputs which may sound plausible but are either factually incorrect or unrelated to the given context.
  • Information disorder: the ability of ChatGPT to generate misinformation is a serious issue of which we need to be aware.

All the above risks highlight the importance of careful implementation and ongoing management of AI systems in the accounting and finance sectors.

AI Regulation - UK Government

When it comes to AI governance, it's important to note that in England and Wales there is currently no AI-specific legislation. That said, UK law firms and businesses must take various existing legal obligations into account when developing and using AI.

The government's recent white paper confirms that there is no plan to give responsibility for AI governance to a new single regulator, but that existing sector-specific regulators will be supported and empowered to produce and implement context-specific approaches that suit the way AI is used in their sector.

The framework outlines five principles for regulators to keep in mind when balancing innovation and safety:

  • Safety, security and robustness.
  • Appropriate transparency and explainability.
  • Fairness.
  • Accountability and governance.
  • Contestability and redress.

The Legal Services Board (LSB) has also responded to the white paper; its reply is publicly available.

Here's the ILFM's quick rundown of regulations to be aware of:

  • Data Protection Act 2018;
  • General Data Protection Regulation (GDPR) - keep on top of the Information Commissioner's Office updates; and
  • Financial Conduct Authority (FCA) Guidelines.

There are also a number of must-read publications in this area.

Summary

Clients must continue to be protected. Support teams in law firms should be involved, alongside the lawyers, in conversations about guidance, firm consultation processes and tech supplier proposals, because the legal sector has to be seen as a trusted market authority that looks after its staff, its suppliers and its bottom line.

Don't let the buzzwords and trends overwhelm you. Expect change, be ready to have conversations with your partners, stay in communication with the banking sector, and develop a deeper interest in AI and the positives it brings, but make sure you stay compliant and remain aware of the risks.

Engage with suppliers of tech and AI, keep an eye out for ILFM forums and webinars too, as we will be welcoming more experts in this field.
