
CoSN 2024: Why Schools Must Vet Generative AI

Without compliance on artificial intelligence, schools risk losing community trust and more.

K–12 school leaders eyeing generative artificial intelligence for use with students might want to hit the brakes: they could be setting their schools up for trouble, advised Victoria Thompson, an education strategist at CDW. Thompson encouraged educators who joined her at the AI Playground at the CoSN conference in Miami to establish an AI policy before diving in headfirst.

This year’s conference included several sessions on integrating AI into K–12 education. However, Thompson stressed the importance of vetting AI tools before adopting them. “People are jumping into AI without considering that there are serious implications,” she said. “Educators have to make sure that AI tools are being used safely and effectively.”

Without careful review of AI tools, Thompson explained, schools could face civil lawsuits, monetary damages, remediation measures, loss of community trust and negative media coverage. 

Educators Can Avoid Noncompliance by Learning Terms of Service

Educators are mandated to comply with an array of federal and local laws that protect students while they are online. Requiring students to download and use tools that include generative AI could jeopardize a school’s compliance not only with its own AI policies but also with government regulations.

“There are risks involved with noncompliance,” Thompson said. Every AI tool has terms of service, she added, and schools need to start reading them to stay compliant with both the tool’s terms and the district’s own policies.

For example, most social media tools set a minimum age for users; the same is true for AI. However, if educators are unaware of a tool’s terms of service, they could be unknowingly noncompliant.

“There is a difference between convenience and compliance,” Thompson said. “Compliance is what we strive for. It might take longer, but that’s okay. When we are compliant, we know we are getting it right the first time.”

She added that schools should not only consider new generative AI tools that have become popular in the past year but also take a look at their existing tech stack, which may already have generative AI embedded. She pointed out that generative AI is far more prevalent than many people realize and shared one often-overlooked example: text-to-speech.

What School Leaders Should Ask Ed Tech Companies That Use Generative AI

It’s crucial for schools to be transparent about their use of AI, Thompson said. “For educators, our work is relationship-based, and if we lose that trust with the community, it takes so long to rebuild,” she explained. “When we are not transparent, people get away with not being compliant, and that’s how mistakes happen.”

When it comes to AI tools, Thompson noted, schools should prioritize the following: consistent and responsible instructional and data practices; data management and transparency; the tool’s effectiveness as the school uses it; and educator and student safety.

As part of the vetting process, Thompson said, school IT and technology integration teams should ask ed tech companies that use generative AI for metrics to back up claims about how they ensure student data privacy and measure the tool’s impact on learning. She also said schools should ask about companies’ generative AI capabilities and limitations, and how they mitigate bias and ensure ethical data use. Information on human oversight and quality control, as well as accessibility and inclusive design, is also important.

Taashi Rowe

Managing Editor, EdTech
Taashi Rowe is the Managing Editor for EdTech magazine.