From 81e76502daad4943c18eab4d5ff25966ea546e83 Mon Sep 17 00:00:00 2001
From: William Warriner <6930772+wwarriner@users.noreply.github.com>
Date: Fri, 16 May 2025 11:18:30 -0500
Subject: [PATCH] update AI section for clarity

---
 syllabus.html | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/syllabus.html b/syllabus.html
index cefe66c..c656529 100644
--- a/syllabus.html
+++ b/syllabus.html
@@ -199,13 +199,13 @@
 <p>&nbsp;</p>
 
 <h3 id="resources">Usage of Artificial Intelligence (AI):</h3>
-<p>We encourage you to responsibly explore usage of AI. As always, beware of AI nonsense or "hallucinations". As a budding technologist, we want you to be aware that "hallucintaion" is a marketing term. We prefer to call it what is: nonsense. Nevertheless, the term of art is "hallucination" so, to avoid confusion, we will continue using this term.</p>
+<p>We encourage you to explore the use of AI responsibly. As always, beware of AI nonsense or "hallucinations". As a budding technologist, we want you to be aware that "hallucination" is a marketing term. We prefer to call it what it is: nonsense. Nevertheless, the term of art is "hallucination" so, to avoid confusion, we will continue using that term.</p>
 <p>
   You should be aware that hallucinations are a
   <a href="https://arxiv.org/abs/2409.05746">mathematical certainty</a>
-  and undetectable without independent verification. In other words, you need to already understand the AI's output well enough to be able to spot hallucinations, or have an expert check the output. Otherwise, you cannot hope to spot hallucinations reliably. And, now that you are aware of these facts, it will be
+  and undetectable without independent verification. In other words, you need to already understand the AI's output (and its relationship to reality) well enough to be able to spot hallucinations. Alternatively, you can have an expert check the output. Otherwise, you cannot hope to spot hallucinations reliably. Now that you are aware of these facts, it would be
   <a href="https://en.wikipedia.org/wiki/Intellectual_honesty">intellectually dishonest</a>
-  to pretend AI output is valid or truthful when you haven't verified.
+  to pretend AI output is valid or truthful when you haven't verified it. <strong>Always verify AI output!</strong>
 </p>
 <p>
   You may use AI to assist with the development of any demonstration assignments, subject to the
-- 
GitLab