AI Minute-Taking Tensions: Navigating Risks And Best Practices

Torys LLP

A number of artificial intelligence (AI)-powered transcription services have emerged as assistive tools to streamline the process of minute-taking during discussions, meetings and presentations. Unlike traditional transcription and recording tools, AI-notetakers leverage machine learning for natural language processing to automate and enhance the minute-taking process: they can generate summaries of key discussion points, track action items, identify different speakers and format meeting notes in accordance with company precedents. While AI-transcription services can help improve organizational productivity and promote efficiency in meetings and information sharing, the sophisticated nature of these tools also presents novel business and legal risks.

Core issues: the risks

Litigation risk

Traditional meeting minutes do not characteristically capture precise dialogue attributed to each person at a meeting. Indeed, the level of detail in management and board meeting minutes is a specialty area requiring deep training and knowledge of organizational needs and industry practices. The creation of robust and detailed, yet unvetted and unfiltered, meeting minutes generated by AI-notetakers may increase the volume of information that is discoverable by third parties in litigation. Similarly, multiple versions of those minutes, and the conflicts or discrepancies among them, may also be discoverable and inadvertently expose a company to greater litigation risk.

Privileged communications

The use of AI-notetakers raises concerns about privilege, particularly where legal advice is intended to be privileged or delivered in camera. If the AI-notetaker cannot be limited to certain sessions or disabled for certain portions of a meeting, privileged information may be compromised. Further, where settings permit meeting minutes or summaries to be distributed automatically to all attendees indiscriminately, legal advice intended to be privileged or delivered in camera could be circulated without appropriate vetting, notification or consent.

Proprietary and confidential information

AI-notetakers have access to proprietary and confidential information disclosed in meetings that may be stored on vendor systems. Proprietary and confidential information stored on vendor or company systems could be compromised by a data breach or cyber-attack, exposing the company to reputational harm and additional regulatory and litigation risk. Additional concerns arise where certain vendors could extract company data to train AI models, embedding that data within the model itself.

Chill on debate and oversight

At both the management and board level, the free exchange of ideas and debate are central to effective decision-making and oversight. While there is a risk that employees and directors will be more reluctant to engage in healthy debate if any form of recording is used, this risk may be increased where AI-notetakers are preparing the minutes, with less human review and discretion over the final content.

Similar to the litigation risk described above, corporate decisions and the individuals participating in them may also be subjected to unwarranted scrutiny if the organization has records of the details of disagreements and debates leading up to a final decision, in addition to more summary minutes that reflect the ultimate path chosen.

Bias and impact on culture

AI-notetakers can sometimes exhibit bias by giving deference to senior people at a meeting. For example, the software may summarize an item by giving more weight to a comment from a director than to an equally important perspective from an analyst. This bias can appear in subtle ways, such as where the AI-notetaker alters the language in a summary document to reflect the authority of the senior person while overlooking valuable insights provided by a more junior individual. Consequently, these tools may inadvertently amplify existing power dynamics within organizations, reinforcing hierarchies to the detriment of culture and collaboration.

Best practices: mitigating the risks

Companies should mitigate the risks associated with the use of AI-notetaking tools by taking preventative steps to ensure the responsible use of this generative technology.

  • Engage AI-transcription vendors whose software allows the company to control its use by enabling or disabling certain functionalities to meet the company's policies and standards. For example, if meetings often entail discussion of confidential, proprietary or privileged information or in camera sessions, employees should be able to limit the use of the software to certain portions of the meeting and control the subsequent distribution of automated summaries and/or minutes.
  • Verify that AI-transcription vendors comply with data protection laws and industry standards to safeguard confidential information from unauthorized or inadvertent access or disclosure. Certain AI models are considered more secure and limit unnecessary exposure and risk of misuse of proprietary and confidential information. For instance, walled-garden AI models preserve data in a closed environment, with limited access to and interactions with external data sources, mitigating the risks associated with unauthorized data exposure.
  • Understand how an AI-transcription vendor handles company information: what data is stored, where it is stored and how long it is retained, to ensure alignment with internal policies. To mitigate the risks of company information being held in vendor environments that may not meet the company's technical and security standards, companies should ensure that most information, especially any information considered sensitive, is stored on company systems. Data retention procedures should be updated so that a designated employee is responsible for vetting AI-generated meeting summaries and for administering the destruction schedule for raw materials.
  • Review internal and external privacy notices and electronic monitoring policies to ensure that all meeting participants understand how their contributions may be recorded and summarized by AI minute-taking tools, and how they can raise concerns about the use of these technologies for particular discussions. Train employees to answer common questions on how participant information will be collected, used and stored in these scenarios.
  • Make a conscious effort to address the inherent bias in AI-notetakers by testing whether the software is impartial and inclusive, valuing contributions based on merit rather than status. Though AI-notetakers are designed to streamline and enhance efficiency and productivity, ensure there is a "human-in-the-loop" to discern how much weight should be given to what each person says at a meeting.

The adoption and use of AI-notetaking tools present an opportunity for companies to streamline record-keeping processes. However, these tools also present novel risks that companies must be attuned to when engaging AI-transcription services. If choosing to adopt AI-notetaking tools, companies should take preventative steps to ensure that data is secure and the use of the tools aligns with internal data-related policies and workplace culture, as well as external regulatory requirements.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
