Chatbots are now a common way we interact digitally.
We may visit a website and ask a question where we receive an automated answer from a chatbot, often suggesting a page we may want to visit, or in some cases even initiating a simple transaction or process. We may experience something similar at work, for example when we want to ask a question about HR policies and the chatbot provides us with an automated answer.
Although chatbots have been around for a long time, they started to become more commonly deployed from the mid-2010s onwards, usually with the mixed results we would expect from an emerging technology. Over that time, chatbots have progressively improved and, as a result, there has been more consumer and employee acceptance of them. Now the introduction of generative AI has accelerated the use of chatbots and dramatically increased their potential.
The emergence of chatbots in our lives as consumers and in the workplace means there are some considerations for in-house legal teams in terms of:
- Mitigating risk
- Using them to support the work of the team.
What is a chatbot?
A chatbot is a software application that a user – let’s say a website visitor or an employee – can interact with using natural language (a “conversational user interface”), to help them find items, discover information, complete simple transactions or solve issues.
Typically, a user accesses the chatbot through a web browser where it is embedded into a website page or intranet, or through a standard application that includes a messaging / chat facility such as Microsoft Teams, Facebook Messenger, WhatsApp or even via SMS. Chatbots are also sometimes embedded within software products and applications to help administrators and new users ask questions.
On a consumer website, chatbots are typically used to:
- Answer basic questions such as when a store opens.
- Find common pages and documents which may resolve basic questions.
- Initiate common requests such as signing up to a mailing list.
- Gather information about an issue before “handing off” a user to live chat support facilitated by a real person.
Within the enterprise, chatbots are generally used to:
- Perform simple searches for contact information, documents and “how to” information.
- Show how to carry out processes in a simple, step-by-step way.
- Perform self-service tasks such as booking annual leave or a meeting room.
- Retrieve simple information pertinent to an individual, such as how much annual leave they have left.
The use of generative AI potentially ramps up what a chatbot can do, for example, with the ability to summarise content and give full answers to questions, rather than just providing links to documents and pages.
What is the value proposition for chatbots?
The value of deploying chatbots for organisations is that they:
- support automation at scale
- require fewer resources for customer support, driving towards a self-service approach
- funnel people away from making contact via telephone
- capture more data in different systems to support automation and reduce the need for more data entry
- drive efficiencies for customers too, who can resolve issues more quickly
- provide insights and analytics to refine answers and responses
- can improve over time through machine learning
- are usually a good option for gathering customer feedback
- provide an effective common interface to which a team can continue to add new capabilities.
And for customers, chatbots can:
- help them to self-serve to find the information they need
- allow them to achieve an end goal faster
- provide a natural language method of interaction which some people prefer
- provide an interface that is well-suited to interacting on a mobile device and can also sometimes support voice recognition.
Of course, many customers will still prefer not to use a chatbot, and more complex requests or needs are often not met by chatbots. This can be frustrating, especially when it is hard to find a telephone number or a means of contacting a real person.
Mitigating chatbot-associated risks
Of course, there are some risks associated with using chatbots that provide automated responses to customers or employees. Generally, chatbot solutions, both standalone and embedded within products and platforms, have matured, which has helped to reduce some common issues. However, in-house legal teams have a role to play in assessing and helping to mitigate risks.
Erroneous answers and the associated liability
Chatbot responses are generally controlled and drafted to reduce the risk of giving erroneous answers, but there have been cases such as when Air Canada was found liable for providing the wrong information to a passenger via a chatbot. In-house teams can play a role here in ensuring that the answers fed into the chatbot are reviewed, and that any statements and terms around the use of the chatbot are appropriate. Contracts with any software supplier can also be reviewed.
Chatbot scope
Generally, the risks around chatbots can be partly mitigated by limiting the scope of what a chatbot covers and also its capabilities. For example, you might not want it to cover sensitive or contentious topics where there is room for the misinterpretation of an answer. You may also want to limit capabilities – for example, the ability to summarise key policies for employees may create the potential for an erroneous answer that bypasses or oversimplifies important details. Again, in-house teams can play a role in raising concerns about the scope of chatbots, or in reviewing chatbot projects, which are often decentralised across different teams and functions and may not always consider the risks involved.
Data privacy and security
Chatbots, particularly when used for transactions, can involve customers or employees entering personally identifiable data. Many of the standard risks around data privacy and security will apply, relating to access to personal data by customer service agents, data residency, robust security and so on.
Future generative AI evolution
With the advent of generative AI, a chatbot facility could potentially be integrated into every software product and digital interface. It is possible that the term “chatbot” will become redundant as conversational interaction becomes the prime way we interact with any software. As generative AI continues to evolve extremely rapidly, in-house teams have an important role in establishing the guardrails to mitigate a range of risks.
Leveraging the potential of chatbots for in-house legal teams
The use of chatbots by in-house legal teams is part of a wider consideration of the use of AI to automate their processes and drive efficiency. Essentially, a chatbot has the potential to save time for very busy and resource-challenged teams by providing an interface that can:
- Help users locate the right in-house specialist, particularly where there are large global in-house teams, or where the bot could include other risk and compliance-related roles.
- Navigate users to legal resources, knowledge and content, with the ability to answer the most common questions, or point them to an expert if their query or subject is not covered by existing content.
- Act as a front end and ask simple questions to gather data, which could then be used for either automated document or contract creation.
- Act as a place to ask questions about a particular area where compliance is necessary, for example on cyber-crime, allowing in-house teams to help reduce risks and drive user education.
Of course, there are a lot of caveats involved here, and we take a closer look at using chatbots for legal work in our article on “lawbots” and “legal chatbots”.
Conclusion
When we first wrote about chatbots in 2018, we portrayed them as an emerging tech area where in-house legal teams needed to “watch this space.” Since then, the maturity of chatbots and related solutions, acceptance from consumers and employees, more collective experience of successfully deploying chatbots, and now the emergence of generative AI, mean that chatbots are here to stay and are going to evolve. They are a channel which in-house legal teams need to be aware of, ensuring that the right approaches are being taken to minimise risks.