SS#166: Artificial Intelligence

Artificial intelligence slid into our homes through search engines, word processors, and help boxes long before anyone drafted thoughtful school policies. Now classical homeschool moms and co-op teachers are wondering: What does faithfulness look like in an AI-saturated world?

In this episode, Brandy and Abby invite Jami Marstall to talk about generative AI, large language models, and how these tools intersect with the way we teach reading, writing, and thinking. They begin with cheating and academic integrity: What exactly counts as cheating when a student uses AI? How is this different from plagiarism, and why are co-op policies so hard to write right now?

From there, the conversation moves into deeper concerns: what happens to a child’s mind when he outsources topic generation, thesis writing, and even basic thinking to a machine? How does this affect contemplation, attention, and genuine intellectual growth? Finally, they explore how AI quietly erodes community—both human-to-human and human-to-God—by offering disembodied “support” without accountability or real relationship.

Listen on: Apple | YouTube | Spotify | Audible

Cheating with AI in School

  • [2:45-19:04] Scholé Every Day segment
  • [20:10] What is AI?
  • [24:54] Cheating and chatbots
  • [26:04] Is it ok to have AI “help”?
  • [29:46] How to handle homework in co-ops
  • [41:52] Writing as thinking
  • [49:50] Is AI making us dumber?
  • [53:00] The illusion of instant answers
  • [56:05] Artificial community
  • [1:11:05] Countercultural choices

Today’s Hosts and Guest

Brandy Vencel

Abby Wahl

Jami Marstall has been homeschooling for almost 20 years with Charlotte Mason’s rich, living methods. She and her husband John have four children, three of whom have graduated. Jami is not particularly looking forward to homeschool retirement when her youngest graduates. She firmly believes that mothers are born persons and have minds which hunger for ideas too! Her personal reading stack usually includes works of educational philosophy, history, biography, literary classics, and a detective novel.

You can also listen to Jami in episode #131: Outsourcing in High School

Scholé Every Day: What We’re Reading

  • Ivanhoe (The Waverly Novels), Sir Walter Scott (Jami)
  • Daddy-Long-Legs, Jean Webster (Abby)
  • Oeconomicus, Xenophon (Brandy)

Brandy also shares her preparation for the Feminism Detox study.

What is cheating? Is AI always cheating?

Jami read from an academic integrity policy to ground the discussion. Cheating there is defined as:

  • Using prohibited sources during assignments or closed-book tests
  • Enlisting another individual to provide answers or complete assignments
  • Finishing work by means other than your own
  • Submitting that work as if it were your own

In short: cheating is doing something to get the grade, the completed assignment, or the higher score without doing the actual work yourself, and then pretending that you did.

Plagiarism fits inside this: you take someone else’s words or ideas, pass them off as your own, and steal both their work and their credit. That feels obviously wrong to most people.

AI complicates that sense of “obvious.”

With AI, there isn’t a named author in front of you. It feels impersonal, more like an encyclopedia entry or a tool “whose job is to help you.” That makes it feel less like stealing, even when you’re still turning in work that isn’t actually yours.

Students often tell themselves:

  • “I’m just getting a little help.”
  • “I was already thinking that; this is how I would’ve said it.”
  • “How is this different from talking it through with a person?”

That’s where things blur. The student’s experience of their own thinking is still underdeveloped. They haven’t yet built the inner sense of, “This is my thought; this isn’t.” When AI supplies polished language or fully formed ideas, they easily slide into claiming ownership of work they could never have produced on their own.

Jami put the real issue starkly:

If the product at the end does not contain ideas that the student could have developed or articulated him or herself, then it’s work that they have not done.

If you couldn’t have produced that answer—its level of insight, its structure, its language—without the tool, then the tool did the work. Calling it “yours” is cheating.

Here are three examples of students turning to AI for “help” with their homework, and how each one holds up:

1. Using AI to pick a topic

Example: a student types into ChatGPT, “I need to write an essay. Give me a topic.”

Is that cheating?

Jami said that, strictly speaking, this probably falls more under “questionable wisdom” than “academic dishonesty,” if the teacher has allowed AI for that purpose and the student is transparent about it.

However, there are serious concerns:

  • The topic has no connection to what the student cares about or already knows.
  • The student skips the hard work of wrestling with ideas and forming an argument.
  • A known weak point (topic and thesis generation) never gets exercised and strengthened.

So while it may not automatically be cheating, it absolutely farms out an essential part of the learning process.

Some teachers in the conversation were willing to allow AI-generated topic ideas in narrow, defined circumstances, but only with clear guidelines and with the understanding that all the real thinking and writing had to be done by the student.

2. Using AI to write paragraphs or papers

When AI starts producing the actual sentences, this is cheating.

If a tool generates the paragraph, essay, or research paper, and the student submits it as their own work, that directly matches the academic integrity definition:

  • Using a prohibited source
  • To complete an assignment
  • For the purpose of getting the grade
  • While claiming authorship

“Putting it in my own words” after AI has done the heavy lifting doesn’t fix the problem if the ideas, structure, and sequence came from the tool rather than from the student’s own mind working on the material.

3. Grammar and “suggestion” tools

The conversation also noted that AI is embedded in lots of places that don’t look like “AI tools” at first glance:

  • Google’s Gemini responses
  • Grammarly’s style and phrasing suggestions
  • Microsoft Word and Google Docs offering predictive, sentence-level suggestions
  • Microsoft Office’s Copilot features

These tools don’t just fix subject–verb agreement; they increasingly suggest how to finish the sentence or how to phrase the idea.

The concern here is not just dishonesty; it’s that these suggestions get in between the student’s mind and their output. Even when it’s not outright cheating, it can bypass the struggle that actually teaches them to think and to write.

Clear AI Policies

Some parents “don’t see a problem” with AI help as long as something gets turned in; many teachers don’t want to spend their energy playing detective; and students genuinely aren’t sure where help ends and cheating begins. That is exactly why co-ops need clear, explicit policies.

Here are some practices that help keep AI out of co-op coursework:

  • Requiring handwritten work for many assignments
  • Using in-class narration, discussion, and short-answer tests
  • Comparing typed work with notebook work to see if style and understanding match
  • Having explicit class discussions about academic integrity and AI from day one

AI & Artificial Community

Jami described how even simple tools—like GPS—slowly erode our patience for human interaction. When older relatives try to give directions, we now feel irritation instead of relational interest. We just want to plug the address into a device. It’s not that their directions are wrong—it’s that we don’t want their presence.

Technology promises the answer without the relationship. Quick, precise, convenient—and completely impersonal.

When AI steps in, it takes that further. Someone online asks a question and another person replies with “Just ask ChatGPT.” The message is clear: Skip the community. Skip the conversation. Get the answer alone.

As Jami put it, these tools let us avoid one another altogether—for everything.

Seeking answers without accountability

Brandy shared how AI invites vulnerability without accountability. Teens (or adults) can tell their problems to a bot without embarrassment, without exposure, and without the friction that comes from dealing with a real person who might challenge them, correct them, or tell them something true.

Brandy gave her own daughter’s experience as a cautionary tale: when her daughter opened multiple AI chatbots and asked general questions, four out of eight tried to start romantic conversations with her. Three of them asked her to meet in person. They offered flattery, affection, and emotional attention—without being real. They mimicked relationship without the safety, limits, or accountability of embodied human connection.

It was an experiment for her daughter—but she immediately recognized how dangerous it could be for others her age.

Community without friction is counterfeit

Real community involves discomfort. People misunderstand, disagree, ask for help, need explanations, make mistakes. In other words, community is inefficient. But that inefficiency is what makes it human.

Jami pointed out that AI and digital tools offer answers and affirmation without the “friction” of real human relationships. They allow us to appear knowledgeable, capable, or emotionally connected—without ever needing to be any of those things in embodied life.

Over time, the digital becomes easier, smoother, and more appealing—while real relationships begin to feel taxing, inefficient, and even irritating.

Mothers guard culture

Abby described her co-op’s solution: specific, intentional practices to pull women and teens out of artificial community and back into embodied fellowship.

Every mom is assigned a coffee hour—not just to prep, but to talk. They ask about parenting struggles, share stories, give advice, and listen. It creates actual culture—formed by shared experiences, not information.

The students also write by hand, narrate aloud, and talk face-to-face about books and ideas. These practices resist digitized shortcuts and train students to think, listen, and exchange real knowledge with real people.

“Kids need to know the real thing so the digital feels off”

Jami said we have to fill children’s souls with the good, the true, and the beautiful—so much that the counterfeit feels flat. Poetry written by humans. Real books, real places, real conversation, real embodied community. Screens cannot imitate singing, praying, teasing, and laughing.

Her point is simple: If they never experience the genuine, they will not recognize the counterfeit.

Delaying tech isn’t deprivation

Brandy noted that families who delayed tech access—phones, internet, social media—do not regret it. Their teens ended up more mature, more relationally capable, and better prepared to handle it wisely when introduced later. Those who adopted early were often the least equipped to handle it.

Mentioned in the Episode

  • The Seven Laws of Teaching
  • Amusing Ourselves to Death: Public Discourse in the Age of Show Business
  • Writing to Learn
  • Fitting Words: Classical Rhetoric

Listen to related episodes:

SS #124 – Redeeming the 5-Paragraph Essay with Renee Shepard

Should we be teaching our kids how to write a 5-paragraph essay? Brandy loves to slam the 5-paragraph essay while…

SS #33: Narration: The Act of Knowing (with Karen Glass!)

In today’s episode, Brandy and Karen dig deeply into the connection between narration and knowledge. Unfortunately, because Karen is in…

Be a part of the conversation!

Discuss this podcast with other moms inside Sistership.


Post Tags: #Community #Culture #Jami Marstall #Podcast #Technology #Writing




