Innovators have been developing artificial intelligence (AI), IoT, cybersecurity, and related technologies for some time, yet the legal world has had limited engagement with the issues likely to arise from these technologies, with the notable exception of privacy. This poses real challenges. For example, it won’t do us a great deal of good to produce an AI prediction device if the result is a successful class-action lawsuit that bankrupts the companies that made it, sold it, and used it. In 2017, William & Mary’s Center for Legal and Court Technology (CLCT) began research into the legal aspects of these developments. Cisco supports CLCT with research funding.

The nation’s oldest law school tackles the world’s newest technologies
CLCT is a joint initiative of William & Mary Law School, the nation’s oldest law school (W&M had the nation’s first law professor), and the National Center for State Courts. W&M is world famous for its work in court and courtroom technology and has long been interested in the interrelationship between technology and law. Cisco’s grant support enabled CLCT to expand into the AI, IoT, and related technologies space.

CLCT’s first two findings were fundamental and important. First, CLCT confirmed that, with notable exceptions, legal professionals, including judges and lawyers, were largely unfamiliar with AI and related technologies. The classic approach of most legal professionals is to wait until a case presents itself and then pursue self-education, often under great time pressure.

Second, CLCT concluded that when, and if, they dealt with these issues, legal academics, judges, and lawyers tended to consider each technology in a vacuum. This can easily produce erroneous results. Consider AI, for example. A “perfect” AI algorithm is vulnerable to the data used to train it and to the data that it uses. If the data used by the algorithm comes from an IoT device with an uncontrolled Internet connection, it’s vulnerable to the false and error-prone data found throughout the Internet. In the event of a lawsuit, who would be liable: designer, data provider, trainer, user, or some other party? Equally important, could we even determine the cause of a bad result, if it were arguably based on erroneous data that cannot be replicated or adequately explained?

Students pose the questions
With Cisco’s support, CLCT is augmenting its academic research work by conducting conferences, presentations, and international law student paper contests in which students discuss legal issues related to these interconnected technologies.

Last year’s contest was open to students in the EU, Canada, and the United States. The first prize went to Jordan Cohen, now at Emory Law, for his paper “Lights, Camera, AI: Artificial Intelligence and Copyright Ownership in the Entertainment Industry of Tomorrow.” The second prize was awarded to “Perfect Enforcement & Filtering Technology,” written by Brian Mund, who graduated from Yale Law School in May 2018. The third prize went to “AI-‘Agents’: to be or not to be in legal ‘domain’?” co-authored by Federica Casano and Francesco Cavinato, both from Alma Mater Studiorum, Università di Bologna. You can read the papers at the CLCT website.

This year’s competition and international conference will be held at William & Mary on October 26 and 27, and is open to law students worldwide. Of the students engaged in the competition, Iria Giuffrida, CLCT’s associate director for research and its European law expert, has noted:

These events give us the privileged opportunity to establish a dialogue with the new generation of lawyers, whose careers will be defined by many of these emerging legal issues. They also permit us to collaborate with academics across the globe.

Preparing the next generation for the intersection of law and technology
From CLCT’s perspective (and mine), technology is changing so quickly that cross-discipline education is one of the few ways to prepare both the current and next generations of lawyers and judges to address these issues. Accordingly, CLCT created “AI & More: Legal Issues Likely to Arise from AI & Related Emerging Technologies,” a unique course at W&M Law School in which CLCT’s principals, professors knowledgeable in American, Canadian, and European Union law, team-taught an unprecedentedly large seminar on the legal issues growing out of AI technologies. The course was so successful that CLCT will create an online version next year. CLCT is also producing additional courses, including an innovative data course to be taught by Michelle Dennedy, Cisco’s vice president for Privacy.

As a technologist, I am used to hard, pragmatic questions: How secure is this? How easily could it be hacked? What will it cost to counter a vulnerability? It’s been an interesting experience to explore the legal vulnerabilities—and opportunities.

Find out more about Cisco’s longstanding commitment to education.



Chris Shenefiel

Security Research Principal Engineer

Advanced Security Research Group