Artificial Intelligence

Overview


Lavery Legal Lab on Artificial Intelligence (L3IA)

 

We anticipate that within a few years, all companies, businesses and organizations, in every sector and industry, will use some form of artificial intelligence in their day-to-day operations, whether to improve productivity or efficiency, ensure better quality control, reach new markets and customers, implement new marketing strategies, or improve processes, automation and the profitability of operations.

For this reason, Lavery created the Lavery Legal Lab on Artificial Intelligence (L3IA) to analyze and monitor recent and anticipated developments in artificial intelligence from a legal perspective. Our Lab is interested in all projects pertaining to artificial intelligence (AI) and their legal peculiarities, particularly the various branches and applications of artificial intelligence which will rapidly appear in companies and industries.

"As soon as a company knows what it wants, tools exist, it must make the best use of them, and our Lab is there to advise it in this regard. "

 

The development of artificial intelligence, through a broad spectrum of branches and applications, will also have an impact on many legal sectors and practices, from intellectual property to protection of personal information, including corporate and business integrity and all fields of business law.

Discover our lexicon which demystifies the most commonly used terms in artificial intelligence:

Lexicon on Artificial Intelligence: Click here to learn more



Click on the images and links to view some of our articles and interviews on artificial intelligence:

Lavery has created its own legal "ChatGPT"

Deepfakes on the web: the dangers of artificial intelligence

  1. Application for an interim injunction: Manufactured urgency is not a 9-1-1 emergency

    On March 3, 2025, Superior Court Justice Nancy Bonsaint dismissed an application for an interim interlocutory injunction that would have allowed Les Entreprises de la Batterie inc. to use a property it did not own for major construction work on its building. The judgment serves as a reminder that a party cannot manufacture a sense of urgency and then use that urgency to support its application for an interim injunction.

    Summary of facts

    The Plaintiff, Les Entreprises de la Batterie inc., owns a building that has been under construction since March 2021, in order to convert it into a hotel that will serve as an extension to the hotel the Plaintiff currently operates.1 The Defendant owns a hotel and a piece of property adjacent to the building under construction. The property is used as a parking lot for his hotel guests.2

    Construction work on the Plaintiff’s building was initially carried out in two separate phases, from March to November 20213 and from August 23, 2022, to July 2024.4 During those phases, the Parties reached various agreements whereby the Plaintiff could use one (1) of the Defendant’s parking spaces in exchange for compensation.5

    On February 14, 2025, the Plaintiff informed the Defendant that it planned to begin a new phase of construction (Phase 3) on February 28, 2025.6 The Plaintiff also informed the Defendant that, as part of this new phase, it would need to use half of the Defendant’s parking lot, that is, six (6) parking spaces, and that the entrance to the parking lot would have to be relocated for more than two (2) years.7 Additionally, the Plaintiff pointed out that it would need access to the Defendant’s entire parking lot for a few days in the spring of 2025.8 The Plaintiff alleged that construction work on its building had to begin urgently on February 28, 2025.9

    The Defendant objected to having to tolerate such a major disruption for an additional two (2) years, given that he had already endured the inconveniences caused by the Plaintiff’s construction work for over four (4) years without being offered any form of compensation that would be considered fair or reasonable in the circumstances.

    On February 27, 2025, the Plaintiff brought an originating application before Justice Bonsaint, seeking orders for an interim interlocutory injunction, an interlocutory injunction and a permanent injunction, as well as for a declaration of abuse of process and damages; the application was amended on February 28, 2025.10 At the interim interlocutory injunction stage, the Plaintiff asked the Court to issue a temporary order granting it access to the Defendant’s six (6) parking spaces so it could continue setting up its construction site.11 The Plaintiff also sought reimbursement of the professional fees incurred in applying for the injunction.

    The Plaintiff alleged that the hotel expansion was [TRANSLATION] “a large-scale project with costs in the tens of millions of dollars”.12 It further alleged that [TRANSLATION] “there is an urgent need for the construction work required to repurpose the building and turn it into a hotel to continue, without being interrupted by the Defendant’s actions”.13 The Plaintiff argued that halting construction work on its building would result in delays, significantly disrupting the timeline of the project, which was planned over the next two (2) years. Furthermore, it would lead to substantial additional costs associated with the various extras charged by the contractors it had hired to carry out the conversion and construction work.14

    Needless to say, the Defendant opposed the application for an interim interlocutory injunction, arguing in particular that the facts alleged by the Plaintiff failed to meet the urgency test.15 Those are the facts that Justice Bonsaint took into account when rendering her decision.

    The criteria for granting interim interlocutory injunctions

    In her judgment, Justice Bonsaint reviewed the legal principles governing interim interlocutory injunction applications. We will do the same below. The criteria for granting an interim interlocutory injunction are as follows:

    - Urgency
    - Serious issue to be tried or strong prima facie case
    - Serious or irreparable harm
    - Balance of convenience16

    It is a discretionary and exceptional remedy that should only be granted sparingly and under strict conditions.17

    The urgency criterion

    Urgency is [TRANSLATION] “of paramount importance”18 in determining whether an interim interlocutory injunction should be granted. If the urgency test is not met, the application simply cannot be allowed.19 Courts often describe the level of urgency required as being akin to [TRANSLATION] “a 9-1-1 emergency”.20 Interim interlocutory injunctions should only be granted in cases of [TRANSLATION] “extreme urgency”.21

    For a court to find that the urgency test is met, the urgency must not result from a delay in bringing legal action. It must be [TRANSLATION] “immediate and apparent”—not the product of the plaintiff’s own lack of diligence.22 In other words, [TRANSLATION] “the alleged urgency must be real—not manufactured by the person asserting it”.23

    Upon reviewing the case, Justice Bonsaint noted that the Defendant had been made aware only on January 31, 2025, that the Plaintiff would need access to his property for construction work.24 Prior to January 2025, the Plaintiff had not informed the Defendant of its true intentions regarding the work.25 It was not until February 14, 2025, that the Plaintiff officially informed the Defendant of the nature of the access required for the third phase of the project, namely, the use of at least half of the Defendant’s property from February 28, 2025, to March 31, 2027.26

    Further to the Defendant’s contestation, Justice Bonsaint noted that the Plaintiff had known for several months that the third phase of the work would begin in early 2025.27 She found that the Plaintiff [TRANSLATION] “had not treated the issue of accessing the parking lot as one requiring urgent resolution”.28

    The Plaintiff tried to justify its failure to be proactive, arguing that it had been unable to inform the Defendant of its space requirements before 2025 because the project timeline was still unknown at the time.29 However, Justice Bonsaint found that such explanations simply did not excuse the Plaintiff’s delay in filing its application for an interim interlocutory injunction against the Defendant.30 On the contrary, the supporting documents that the Plaintiff had submitted with its letter dated February 14, 2025, such as a plan of the Defendant’s parking lot and the preliminary project timeline, included references to “2024”.31

    Given the above, Justice Bonsaint could only conclude that the Plaintiff had known for several months that construction work on its building was scheduled to begin in 2025.32 On that point, Justice Bonsaint was clear: [TRANSLATION] “The Court understands that preliminary construction timelines may be subject to change, but there is nothing to suggest that construction needed to begin ‘urgently’ on February 28, 2025. . . . the Plaintiff should have taken action as early as January 2025”.33 The Plaintiff had been aware of the access issues involving the Defendant’s property since the fall of 2024—and certainly since January 2025.34 Those issues should have prompted discussions between the Parties’ lawyers well before February 2025, and no later than January 2025.35

    Discussions or attempts to settle the matter

    The Plaintiff also argued that, at the interim interlocutory injunction stage, discussions or attempts to settle the matter could have a bearing on whether the urgency requirement was met.36 Justice Bonsaint rejected that argument, given that no real negotiations had taken place, other than failed calls in November and December 2024, and again in January 2025, and that the Plaintiff had been aware of the access issues involving the Defendant’s property since the fall of 2024—and certainly since January 2025.

    Consequently, Justice Bonsaint dismissed the application for an interim interlocutory injunction, seeing as the Plaintiff had asked the Court to find that such an order, which would grant the Plaintiff access to half of the Defendant’s parking lot for two (2) years, needed to be issued urgently, even though the Plaintiff itself had not considered the need to access the parking lot as an urgent matter to be resolved before the third phase of construction began.37

    Key takeaways

    - The urgency criterion is of paramount importance in determining whether an interim interlocutory injunction should be granted. That requirement must be met for the Court to allow such an application.
    - In assessing the facts and allegations related to an application for an interim interlocutory injunction, the Court must ensure that the urgency is real—akin to a 9-1-1 situation—and not manufactured by the party seeking the relief.
    - A delay attributable to the plaintiff cannot serve as a basis for granting an interim interlocutory injunction against the defendant.
    - Half-hearted attempts at settlement discussions or negotiations do not excuse the delay between a party becoming aware of the facts warranting an interim interlocutory injunction and the filing of the application.
    - Diligence in managing and mounting such cases is therefore essential, and makes it more likely that an interim interlocutory injunction will be granted.

    1. Entreprises de la Batterie inc. c. Biron, 2025 QCCS 608, paras. 1 and 10 (hereinafter the “Judgment”).
    2. Judgment, para. 4.
    3. Judgment, para. 10.
    4. Judgment, paras. 16 to 19.
    5. Judgment, paras. 10 to 18.
    6. Judgment, para. 27.
    7. Judgment, paras. 3 and 27.
    8. Judgment, para. 3.
    9. Judgment, para. 2.
    10. Judgment, para. 6.
    11. Judgment, para. 7.
    12. Judgment, para. 46.
    13. Judgment, para. 47.
    14. Judgment, para. 48.
    15. Judgment, para. 8.
    16. Judgment, paras. 35 and 37 to 39.
    17. Judgment, para. 36.
    18. Judgment, para. 41.
    19. Id.
    20. Judgment, paras. 41 and 43.
    21. Judgment, para. 42.
    22. Judgment, para. 42.
    23. Judgment, para. 40.
    24. Judgment, paras. 61 and 62.
    25. Judgment, para. 62.
    26. Judgment, paras. 64 and 65.
    27. Judgment, para. 68.
    28. Id.
    29. Judgment, para. 74.
    30. Judgment, para. 75.
    31. Judgment, paras. 76 and 77.
    32. Judgment, para. 82.
    33. Judgment, para. 82.
    34. Judgment, para. 84.
    35. Judgment, para. 85.
    36. Judgment, para. 83.
    37. Judgment, para. 90.

    Read more
  2. Businesses: Four tips to avoid dependency or vulnerability in your use of AI

    While the world is focused on how the tariff war is affecting various products, it may be overlooking the risks the war poses to information technology. Yet many businesses rely on artificial intelligence to provide their services, and many of these technologies are powered by large language models, such as the widely used ChatGPT. It is relevant to ask whether businesses should rely on purely US-based technology service providers. There is talk of using Chinese alternatives, such as DeepSeek, but their use raises questions about data security and the associated control over information.

    Back in 2023, Professor Teresa Scassa wrote that, when it comes to artificial intelligence, sovereignty can take on many forms, such as state sovereignty, community sovereignty over data and individual sovereignty.1 Others have even suggested that AI will force the recalibration of international interests.2 In our current context, how can businesses protect themselves from the volatility caused by the actions of foreign governments? We believe that it’s precisely by exercising a certain degree of sovereignty over their own affairs that businesses can guard against such volatility. A few tips:

    Understand intellectual property issues: The large language models underlying the majority of artificial intelligence technologies are sometimes offered under open-source licenses, but certain technologies are distributed under restrictive commercial licenses. It is important to understand the limits imposed by the licenses under which these technologies are offered. Some language model owners reserve the right to alter or restrict the technology’s functionality without notice. Conversely, permissive open-source licenses allow a language model to be used without time restrictions. From a strategic standpoint, businesses should keep the intellectual property rights over the data compilations they integrate into artificial intelligence solutions.

    Consider other options: Whenever technology is used to process personal information, a privacy impact assessment is required by law before such technology is acquired, developed or redesigned.3 Even if a privacy impact assessment is not legally required, it is prudent to assess the risks associated with technological choices. If you are dealing with a technology that your service provider integrates, check whether there are alternatives. Would you be able to quickly migrate to one of these if you faced issues? If you are dealing with a custom solution, check whether it is limited to a single large language model.

    Adopt a modular approach: When a business chooses an external service provider to provide a large language model, it is often because the provider offers a solution that is integrated with other applications the business already uses, or because it provides an application programming interface developed specifically for the business. In making such a choice, you should determine whether the service provider can replace the language model or application if problems were to arise. If the technology in question is a fully integrated solution from a service provider, find out whether the provider offers sufficient guarantees that it could replace a language model if it were no longer available. If it is a custom solution, find out whether the service provider can, right from the design stage, provide for the possibility of replacing one language model with another (see the illustrative sketch following this article’s notes).

    Make a proportionate choice: Not all applications require the most powerful language models. If your technological objective is middle-of-the-road, you can consider more possibilities, including solutions hosted on local servers that use open-source language models. As a bonus, choosing a language model proportionate to your needs helps reduce the environmental footprint of these technologies in terms of energy consumption.

    Each of these tips requires different steps to put into practice. Remember to take legal considerations, in addition to technological constraints, into account. Licenses, intellectual property, privacy impact assessments and limited liability clauses imposed by certain service providers are all aspects that need to be considered before making any changes. This isn’t just about being prudent—it’s about taking advantage of the opportunity our businesses have to show they are technologically innovative and exercise greater control over their futures.

    1. Scassa, T. 2023. “Sovereignty and the governance of artificial intelligence.” 71 UCLA L. Rev. Disc. 214.
    2. Xu, W., Wang, S., & Zuo, X. 2025. “Whose victory? A perspective on shifts in US-China cross-border data flow rules in the AI era.” The Pacific Review, 1–27.
    3. See in particular the Act respecting the protection of personal information in the private sector, CQLR c. P-39.1, s. 3.3.
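    The “modular approach” described above can be made concrete in code. Below is a minimal, hypothetical sketch (the class names, endpoint and stubbed calls are ours for illustration and are not drawn from any particular vendor or from this bulletin): the application depends on a small in-house interface rather than on one provider’s SDK, so a hosted large language model can later be swapped for another provider’s model, or for a locally hosted open-source model, through a configuration change rather than a rewrite.

```python
# Minimal sketch of a provider-agnostic text-generation layer.
# All names and endpoints below are hypothetical placeholders.

from abc import ABC, abstractmethod


class TextGenerator(ABC):
    """Interface the rest of the business application codes against."""

    @abstractmethod
    def generate(self, prompt: str) -> str:
        ...


class HostedModelBackend(TextGenerator):
    """Wrapper around an external provider's API (call intentionally stubbed)."""

    def __init__(self, api_key: str):
        self.api_key = api_key

    def generate(self, prompt: str) -> str:
        # Replace this stub with the provider's own client call.
        return f"[hosted-model response to: {prompt!r}]"


class LocalOpenSourceBackend(TextGenerator):
    """Wrapper around a model served on the company's own infrastructure."""

    def __init__(self, endpoint: str = "http://localhost:8080/generate"):
        self.endpoint = endpoint  # hypothetical local inference endpoint

    def generate(self, prompt: str) -> str:
        # Replace this stub with an HTTP call to the local inference server.
        return f"[local-model response to: {prompt!r}]"


def summarize_contract(text: str, generator: TextGenerator) -> str:
    """Business logic stays identical no matter which backend is injected."""
    return generator.generate(f"Summarize the following contract:\n{text}")


if __name__ == "__main__":
    # Swapping providers is a one-line configuration change, not a rewrite.
    backend: TextGenerator = LocalOpenSourceBackend()
    print(summarize_contract("Sample clause...", backend))
```

    The same pattern also supports the “proportionate choice” tip: where a top-tier hosted model is not needed, a lighter local backend can be injected without touching the business logic.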

    Read more
  3. Can artificial intelligence be designated as an inventor in a patent application?

    Artificial intelligence (“AI”) is becoming increasingly sophisticated, and the fact that this human invention can now generate its own inventions opens the door to new ways of conceptualizing the notion of “inventor” in patent law. In a recent ruling, however, the Supreme Court of the United Kingdom (“UK Supreme Court”) found that an artificial intelligence system cannot be the author of an invention within the meaning of the applicable regulations under which patents are granted. This position is consistent with that of several courts around the world that have already ruled on the issue. But what of Canada, where the courts have yet to address the matter? In this bulletin, we will take a look at the decisions handed down by the UK Supreme Court and its counterparts in other countries before considering Canada’s position on the issue. In Thaler (Appellant) v Comptroller-General of Patents, Designs and Trade Marks,1 the UK Supreme Court ruled that “an inventor must be a person”.

    Summary of the decision

    In 2018, Dr. Stephen Thaler filed patent applications for two inventions described as having been generated by an autonomous AI system. The machine in question, DABUS, was therefore designated as the inventor in the applications. Dr. Thaler claimed that, as the owner of DABUS, he was entitled to file patent applications for inventions generated by his machine. That being so, he alleged that he was not required to name a natural person as the inventor.

    Both the High Court of Justice and the Court of Appeal dismissed Dr. Thaler’s appeal from the decision of the Intellectual Property Office of the United Kingdom not to proceed with the patent applications, in particular because the designated inventor was not valid under the Patents Act 1977. The UK Supreme Court, the country’s final court of appeal, also dismissed Dr. Thaler’s appeal. In a unanimous decision, it concluded that the law is clear in that “an inventor within the meaning of the 1977 Act must be a natural person, and DABUS is not a person at all, let alone a natural person: it is a machine”.2 Although there was no doubt that DABUS had created the inventions in question, that did not mean that the courts could extend the notion of inventor, as defined by law, to include machines.

    An ongoing trend

    The UK Supreme Court is not the first to reject Dr. Thaler’s arguments. The United States,3 the European Union4 and Australia5 have adopted similar positions, concluding that only a natural person can qualify as an inventor within the meaning of the legislation applicable in their respective jurisdictions. The UK ruling is part of the Artificial Inventor Project’s cross-border attempt to ensure that the DABUS machine—and AI in general—is recognized as a generative tool capable of generating patent rights for the benefit of AI system owners.

    To date, only South Africa has issued a patent to Dr. Thaler naming DABUS as the inventor.6 This country is the exception that proves the rule. It should however be noted that the Companies and Intellectual Property Commission of South Africa does not review applications on their merits; as such, no reason was given for considering AI as the inventor. More recently, in February of this year, the United States Patent and Trademark Office issued guidance on AI-assisted inventions. The guidance confirms the judicial position and states in particular that “a natural person must have significantly contributed to each claim in a patent application or patent”.7

    What about Canada?

    In 2020, Dr. Thaler also filed a Canadian patent application for inventions generated by DABUS.8 The Canadian Intellectual Property Office (“CIPO”) issued a notice of non-compliance in 2021, establishing its initial position as follows:

        Because for this application the inventor is a machine and it does not appear possible for a machine to have rights under Canadian law or to transfer those rights to a human, it does not appear this application is compliant with the Patent Act and Rules.9

    However, CIPO specified that it was open to receiving the applicant’s arguments on the issue, as follows:

        Responsive to the compliance notice, the applicant may attempt to comply by submitting a statement on behalf of the Artificial Intelligence (AI) machine and identify, in said statement, himself as the legal representative of the machine.10

    To date, CIPO has issued no notice of abandonment and the application remains active. Its status in Canada is therefore unclear. It will be interesting to see whether Dr. Thaler will try to sway the Canadian courts to rule in his favour after many failed attempts in other jurisdictions around the world, most recently in the UK Supreme Court.

    At first glance, the Patent Act11 (the “Act”) does not prevent an AI system from being recognized as the inventor of a patentable invention. In fact, the term “inventor” is not defined in the Act. Furthermore, nowhere is it stated that an applicant must be a “person,” nor is there any indication to that effect in the provisions governing the granting of patents. The Patent Rules12 offer no clarification in that regard either. Such a requirement, implied by the clear use of the term “person” in the wording of the relevant statutory provisions, is important: it was a key consideration in the UK Supreme Court’s analysis in Thaler.

    Case law on the subject is still ambiguous. According to the Supreme Court of Canada, given that the inventor is the person who took part in conceiving the invention, the question to ask is “[W]ho is responsible for the inventive concept?”13 That said, we note that the conclusion reached was that a legal person—as opposed to a natural person—cannot be considered an inventor.14 The fact is that the Canadian courts have never had to rule on the specific issue of recognizing AI as an inventor, and until such time as the courts render a decision or the government takes a stance on the matter, the issue will remain unresolved.

    Conclusion

    Given that Canadian law is not clear on whether AI can be recognized as an inventor, now would be a good time for Canadian authorities to clarify the issue. As the UK Supreme Court has suggested, the place of AI in patent law is a current societal issue, one that the legislator will ultimately have to settle.15 As such, it is only a matter of time before the Act is amended or CIPO issues a directive. Moreover, in addition to having to decide whether AI legally qualifies as an inventor, Canadian authorities will have to determine whether a person can be granted rights to an invention that was actually created by AI. The question as to whether an AI system owner can hold a patent on an invention generated by their machine was raised in Thaler. Once again, unlike the UK’s patent act,16 our Patent Act does not close the door to such a possibility. Canadian legislation contains no comprehensive list of the categories of persons to whom a patent may be granted, for instance.

    If we were to rewrite the laws governing intellectual property, given that the main purpose of such laws is to encourage innovation and creativity, perhaps a better approach would be to allow AI system owners to hold patent rights rather than to recognize the AI as an inventor. Patent rights are granted on the basis of an implicit understanding: a high level of protection is provided in exchange for disclosure sufficient to enable a person skilled in the art to reproduce an invention. This ensures that society benefits from such inventions and that inventors are rewarded. Needless to say, arguing that machines need such an incentive is difficult. Designating AI as an inventor and granting it rights in that respect is therefore at odds with the very purpose of patent protection. That said, an AI system owner who has invested time and energy in designing their system could be justified in claiming such protection for the inventions it generates. In such a case, and given the current state of the law, the legislator would likely have to intervene.

    Would this proposed change spur innovation in the field of generative AI? We are collectively investing a huge amount of “human” resources in developing increasingly powerful AI systems. Will there come a time when we can no longer consider that human resources were involved in making AI-generated technologies? Should it come to that, giving preference to AI system owners could become counterproductive. In any event, for the time being, a sensible approach would be to emphasize the role that humans play in AI-assisted inventions, making persons the inventors rather than AI. As concerns inventions conceived entirely by an AI system, trade secret protection may be a more suitable solution.

    The professionals on our intellectual property team are at your disposal to assist you with patent registration and provide you with a clearer understanding of the issues involved.

    1. [2023] UKSC 49 [Thaler].
    2. Ibid., para. 56.
    3. See the decision of the United States Court of Appeals for the Federal Circuit in Thaler v Vidal, 43 F. 4th 1207 (2022), application for appeal to the Supreme Court of the United States dismissed.
    4. See the decision of the Boards of Appeal of the European Patent Office in J 0008/20 (Designation of inventor/DABUS) (2021), request to refer questions to the Enlarged Board of Appeal denied.
    5. See the decision of the Full Court of the Federal Court of Australia in Commissioner of Patents v Thaler, [2022] FCAFC 62, application for special leave to appeal to the High Court of Australia denied.
    6. ZA 2021/03242.
    7. Federal Register: Inventorship Guidance for AI-Assisted Inventions.
    8. CA 3137161.
    9. Notice from CIPO dated February 11, 2022, in Canadian patent application 3137161.
    10. Ibid.
    11. R.S.C., 1985, c. P-4.
    12. SOR/2019-251.
    13. Apotex Inc. v. Wellcome Foundation Ltd., 2002 SCC 77, paras. 96–97.
    14. Sarnoff Corp. v. Canada (Attorney General), 2008 FC 712, para. 9.
    15. Thaler, paras. 48–49, 79.
    16. Ibid., para. 79.

    Read more
  4. The forgotten aspects of AI: reflections on the laws governing information technology

    While lawmakers in Canada1 and elsewhere2 are endeavouring to regulate the development and use of technologies based on artificial intelligence (AI), it is important to bear in mind that these technologies are also part of the broader family of information technology (IT). In 2001, Quebec adopted a legal framework aimed at regulating IT. All too often forgotten, this legislation applies directly to the use of certain AI-based technologies.

    The very broad notion of “technology-based documents”

    The technology-based documents referred to in this legislation include any type of information that is “delimited, structured and intelligible”.3 The Act lists a few examples of technology-based documents contemplated by applicable laws, including online forms, reports, photos and diagrams—even electrocardiograms! It is therefore understandable that this notion easily applies to the user interface forms used on various technological platforms.4 Moreover, technology-based documents are not limited to personal information. They may also pertain to company or organization-related information stored on technological platforms. For instance, Quebec’s Superior Court recently cited the Act in recognizing the probative value of medical imaging practice guidelines and technical standards accessible on a website.5 A less recent decision also recognized that the contents of electronic agendas were admissible as evidence.6

    Because their algorithms are resource-intensive, various AI technologies are offered as software as a service (SaaS) or platform as a service (PaaS). In most cases, the information entered by user companies is transmitted to supplier-controlled servers, where it is processed by AI algorithms. This is often the case for advanced client relationship management (CRM) systems and electronic file analysis. It is also the case for a whole host of applications involving voice recognition, document translation and decision-making assistance for users’ employees. In the context of AI, technology-based documents in all likelihood encompass all documents that are transmitted, hosted and processed on remote servers.

    Reciprocal obligations

    The Act sets out specific obligations when information is placed in the custody of service providers, in particular IT platform providers. Section 26 of the Act reads as follows:

        26. Anyone who places a technology-based document in the custody of a service provider is required to inform the service provider beforehand as to the privacy protection required by the document according to the confidentiality of the information it contains, and as to the persons who are authorized to access the document.

        During the period the document is in the custody of the service provider, the service provider is required to see to it that the agreed technological means are in place to ensure its security and maintain its integrity and, if applicable, protect its confidentiality and prevent accessing by unauthorized persons. Similarly, the service provider must ensure compliance with any other obligation provided for by law as regards the retention of the document. (Our emphasis)

    This section of the Act therefore requires the company wishing to use a technological platform and the supplier of the platform to enter into a dialogue. On the one hand, the company using the technological platform must inform the supplier of the required privacy protection for the information stored on the platform (a simplified sketch of this exchange follows this article’s notes). On the other hand, the supplier is required to put in place “technological means” with a view to ensuring security, integrity and confidentiality, in line with the privacy protection requested by the user.

    The Act does not specify what technological means must be put in place. However, they must be reasonable, in line with the sensitivity of the technology-based documents involved, as seen from the perspective of someone with expertise in the field. Would a supplier offering a technological platform with outmoded modules or known security flaws be in compliance with its obligations under the Act? This question must be addressed by considering the information transmitted by the user of the platform concerning the required privacy protection for technology-based documents. The supplier, however, must not conceal the security risks of its IT platform from the user, since doing so would violate the parties’ disclosure and good faith requirements.

    Are any individuals involved?

    These obligations must also be viewed in light of Quebec’s Charter of Human Rights and Freedoms, which also applies to private companies. Companies that process information on behalf of third parties must do so in accordance with the principles set out in the Charter whenever individuals are involved. For example, if a CRM platform supplier offers features that can be used to classify clients or to help companies respond to requests, the information processing must be free from bias based on race, colour, sex, gender identity or expression, pregnancy, sexual orientation, civil status, age except as provided by law, religion, political convictions, language, ethnic or national origin, social condition, a handicap or the use of any means to palliate a handicap.7 Under no circumstances should an AI algorithm suggest that a merchant should not enter into a contract with any individual on any such discriminatory basis.8 In addition, anyone who gathers personal information by technological means making it possible to profile certain individuals must notify them beforehand.9

    To recap, although the emerging world of AI is a far cry from the Wild West decried by some observers, AI must be used in accordance with existing legal frameworks. No doubt additional laws specifically pertaining to AI will be enacted in the future. If you have any questions on how these laws apply to your AI systems, please feel free to contact our professionals.

    1. Bill C-27, Digital Charter Implementation Act, 2022.
    2. In particular, the U.S. Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, October 30, 2023.
    3. Act to establish a legal framework for information technology, CQLR c. C-1.1, sec. 3.
    4. Ibid., sec. 71.
    5. Tessier v. Charland, 2023 QCCS 3355.
    6. Lefebvre Frères ltée v. Giraldeau, 2009 QCCS 404.
    7. Charter of Human Rights and Freedoms, sec. 10.
    8. Ibid., sec. 12.
    9. Act respecting the protection of personal information in the private sector, CQLR c. P-39.1, sec. 8.1.
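    To make the section 26 “dialogue” more tangible, here is a minimal, hypothetical sketch in Python (our own illustration; the names, such as CustodyDeclaration, are invented, and this is neither a compliance tool nor text drawn from the Act): the depositing company records, for each document placed in a provider’s custody, the privacy protection it requires and the persons authorized to access it, and the provider checks access requests against that declaration.

```python
# Illustrative sketch only: recording, for each technology-based document
# placed in a provider's custody, the required privacy protection and the
# authorized persons (the two pieces of information the depositor must
# communicate), plus a simple provider-side access check. Hypothetical names.

from dataclasses import dataclass, field
from enum import Enum


class Confidentiality(Enum):
    PUBLIC = "public"
    INTERNAL = "internal"
    CONFIDENTIAL = "confidential"


@dataclass
class CustodyDeclaration:
    """What the depositor tells the service provider before the transfer."""
    document_id: str
    confidentiality: Confidentiality
    authorized_persons: set[str] = field(default_factory=set)


def provider_can_release(declaration: CustodyDeclaration, requester: str) -> bool:
    """Provider-side check that mirrors the agreed access restrictions."""
    if declaration.confidentiality is Confidentiality.PUBLIC:
        return True
    return requester in declaration.authorized_persons


if __name__ == "__main__":
    declaration = CustodyDeclaration(
        document_id="client-file-2024-001",
        confidentiality=Confidentiality.CONFIDENTIAL,
        authorized_persons={"jane.doe@example.com"},
    )
    print(provider_can_release(declaration, "jane.doe@example.com"))      # True
    print(provider_can_release(declaration, "someone.else@example.com"))  # False
```

    In practice, the agreed technological means (encryption, access logging, retention controls) would sit behind such a check; the point here is only the exchange of information that must precede the transfer.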

    Read more