In a recent column for Christianity Today, Yi-Li Lin argued for a significant increase in the use of AI tools in church work. I’m sympathetic, but he goes too far. The ways he does so reveal the challenges every profession is facing, or will face, with this technology.
As a proponent of AI and creator of what became the first publicly available AI Bible study tool, I was excited when Lin’s headline came to my attention. We need positive coverage of this new technology within the ministry world.
Lin’s argument is simple and immediately familiar to anyone in pastoral ministry. A relative unfamiliar with the inner workings of a church organization once quipped to me about pastors only working “one day a week.” The opposite is normally true; we tend to struggle not to work long hours seven days a week.
Besides preaching, which, Lin notes, takes a significant chunk of the week to prepare for if done right, pastors are on call for anyone in the church going through a crisis, often handle much of the organizational leadership and have to be the point person on everything from managing computers to counseling to charitable aid to marketing.
The combination can be overwhelming. Pastors, in an experience shared with small business owners, find themselves doing all sorts of things that would be specialized to different individuals in larger organizations. The allure (and value) of AI is to find the things that don’t need a human touch and replace them with automation. As a fellow pastor, I’m sympathetic with Lin and with anyone like-minded who feels stretched to wear a multitude of hats.
But, it’s all about what we choose to pass off to AI and how we do so.
If AI could look at my preaching schedule, event plans and even the articles I write and handle posting social media promotional information for them, that would be a net win. No one benefits from me spending my time tweaking information to fit the idiosyncrasies of Facebook, Twitter/X, Mastodon and, sometimes, Instagram. Nor is there benefit when, say, an article here on OFB takes longer to be linked to on socials because I don’t have time to do that dance.
If Lin were proposing replacing those kinds of tasks, I would eagerly agree. Even if AI did that, and I’m sure it will eventually, the content being linked to would still be “handcrafted.” Targeted AI to replace supplemental tasks makes a ton of sense.
Consider the example of a photographer over the last few decades. Purists sometimes turn their noses up at autofocus and other automation on cameras. Getting aperture and focus right by hand is an art that takes rigorous practice to do well. Anyone claiming to be a photographer should know how to do it. Many shots will never achieve their full potential without the individual photographic artist’s eye knowing how things should look and what the genuine limits of his or her equipment are.
However, in high-speed photography, for example, insisting on full manual controls is insisting on an inferior output of photos. A skilled photographer, or a skilled person of any profession, knows when and where to use automation because it is not an easy way out, but an additional tool wielded to amplify one’s abilities.
Unfortunately, Lin appears to barrel toward the reasons why purists in photography hate automatic controls.
Like the art of genuinely composing pictures, as opposed to simply snapping them, the process of writing a sermon is far more arduous than those who’ve never done it might imagine. “Homiletics,” as it is known, parallels other “professional” tasks that people are currently nervously eyeing AI over. Preaching is something that takes years of study and practice, along with ample time to craft any individual sermon.
ChatGPT can scarily replicate a great deal of that, much as modern smartphones can do most of the “photographic work” for the user. But, it cannot truly replicate it.
When I craft a sermon, it comes through prayer, study and plenty of background immersed in the challenges and struggles of the congregation I’m preparing it for. Lin acknowledges these things, yet leans far further into AI than I can comfortably follow.
For example, he talks about having AI take over the effort of creating a first-person narrative about being in some story of the Bible. Placing oneself into the mindset of someone living in the moment of the Biblical text can be a helpful preparation for a sermon. Why bother with such an exercise? To accomplish something every thoughtful preacher wants to do: to think through what was going on in the minds of the person or people in a Biblical story, aiming to see the common human experience we share with them and can apply today.
Working through what it would be like to live in that setting, to have that particular problem or fear, to experience that miracle — those are not questions that can be outsourced to an AI. The key phrase above was “to think through.” Giving AI the job of thinking means I am no longer wrestling, and the wrestling itself is where God will often reveal the heart of the passage.
ChatGPT, or even a far more advanced AI of the future, should not replace the hard work of “earning” an understanding of the Scriptures.
Yes, I’m impressed when AI can be fed a reference to an obscure Bible passage and “preach” — decently, even. I’d never preach its output, though. Lin reflects on AI sermon output, “If necessary, users can expand from the outline option to a full sermon that is at least moderately accurate and free of errors. Obviously, however, a 100 percent AI-generated sermon would miss the context of the speaker and the congregation.”
This is akin to copying answers from the student next to you on a math exam. If you can’t do the formula yourself, what happens when the square being measured on your exam is different from the one on his? Or what if the kid next to you used the wrong formula?
There are plenty of times I’ve heard someone approach a part of the Bible in a thoughtful way, but become convicted by further study that it wasn’t accurate. I use numerous commentaries from trustworthy scholars, examination of the original languages and those years of background study to discern the difference between “thoughtful” and “accurate.”
Each week’s work studying and preparing helps me be better equipped to make those distinctions, not just that week, but in the weeks beyond. If the pastor starts taking AI outlines and adding a layer of polish to them, what happens when the AI uses that wrong “formula”?
Pastors leaning heavily on AI may mistake “thoughtful” for “accurate,” or they may realize something is amiss but not know what to do next.
While I use my iPhone a lot for photography, because it now exceeds my DSLR’s capabilities to a surprising degree, the years I spent behind manual camera controls are not wasted. I have an app (Halide) that gives manual controls I regularly return to when, say, autofocus can’t know I want to grab onto that blade of grass.
Anyone who has taught in any setting, whether the church, academia, corporate training or elsewhere, knows that when others are involved, they’ll bring questions and situations we were not thinking of. Blades of grass, so to speak, that are hard to focus on using “autofocus.”
After weeks or months of staring at a passage before preaching it, by God’s grace, I can take bits I didn’t even think worth mentioning and use them to help someone with his or her unique questions, needs, fears and the like. Ditto when I taught in the college classroom. Had my preparation been merely what the AI spit out as a finished product, any outlier needs or concerns would have to be punted to, “Let me ask the AI about that.”
In your own line of work, you almost certainly know that preparing to teach someone else is one of the best ways to deepen your own knowledge of the area. Mechanics who teach apprentices how to fix cars on the job become better mechanics; pastors immersed in God’s Word to help others understand it will grow in their own understanding of it in a way that cannot be short-circuited.
And set all that aside. What of the experience of praying over one’s studies? Unless we assume God is absent in those moments, replacing the arduous hike from a Scripture passage to finished sermon with a few mouse clicks and polishing the result seems… troubling.
Lin raises an objection at this juncture. Some pastors turn to prewritten sermon outlines or even wholesale prewritten sermons and have for decades. (Not in the plagiarized sense, though that happens, but in cases where people intentionally write and provide the materials.) This, however, should do little to commend AI as the sermon author.
This is ill-advised for all the reasons I listed above. At least it is akin to copying (with permission) from the teacher’s notes while taking an exam, though.
Copying accurate notes will not yield a great understanding of the subject being studied. Copying AI’s notes is a gamble that very well could be more like copying a mediocre fellow student’s notes, clapping one’s hands in satisfaction and considering that “having done the work.”
Over and over in Lin’s examples, the dangers of over-dependence on AI arise. Lin suggests having the AI generate discussion questions, but how can one know the validity of those questions, or how to answer diverging interactions with them, if AI also synthesized the sermon outline itself?
I sound like an opponent of AI, but I’m not. What can this technology do in a field like preaching? For one, it can be a fantastic sounding board. I’ve come to love asking AI to critique my handiwork after I have finished it. Lin makes reference to that use and here, I largely agree with him.
(Even here, though, the tools almost always make mistakes. It is a fantastic reminder: these tools are always beholden to their inevitably flawed sources. They can be used, but we have to bring our preparations and critical thinking to them.)
Likewise, AI can be used as a complementary resource to human-sourced materials. For example, in Bible study, my own Bible Bot AI tool can be fed a passage and offer commentary specific to the verse or verses I want it to focus on. This is great as a check on my own work. Frequently, it says something interesting, to which I apply the same standard I do to Wikipedia: go find sources I am more confident in to back up that interesting find.
Think of a doctor who, having done an examination, punches in her patient’s information to an AI for a “second opinion” that combs seemingly unrelated bits of information to catch an otherwise imperceptible health problem. The potential is vast, but it needs to build on, not replace, the initial examination.
AI is even better for ancillary tasks, as I noted earlier. Consider: many of the things I do in ministry and beyond, like these articles, need visual illustrations. Using AI to generate an image or video so that I can focus on what I’m actually called (and claiming) to do — care for people, write sermons, write articles and so on — is a great help.
A nice picture at the top of this article can help people find it and decide to read it, but it isn’t the thing itself: these words are. Even then, I will spend a long time (sometimes even hours) crafting the right prompts before hand-editing the results, frequently merging several different AI outputs together, to create the visual look appropriate to the content I’ve labored over. Because I’ve done the hard work on the material this output illustrates, I know how to “fact check” it, so to speak.
(That is why illustration credits here will often be “Timothy R. Butler” plus one or more generative AI image tools. They provided parts, but they almost never provide the end result.)
Earlier, I chose the word “amplify” for what automation does advisedly. Wise future use of AI depends on our using it to magnify hard work, not as a shortcut around it.
If we use it as an easy out, it will amplify our laziness to create mediocrity or even malpractice.