The W.I.S.E. Framework: A Theological Ethic for the Evaluation and Use of Artificial Intelligence Tools in Christian Education

Generative AI as a Defining Moment in Human History

Certain events are so consequential that they indelibly alter the trajectory of human existence. They divide time into a before and after, not merely chronologically but also culturally, intellectually, and theologically. The life, death, and resurrection of Jesus Christ stand as the most decisive of these moments, shaping not only Christian faith but also the very structure of Western life. Other events have punctuated history in similarly lasting ways, including the Protestant Reformation, the Enlightenment, and the World Wars of the twentieth century, each reshaping societal values and ethical reflection.[1] Technological change has also frequently served as a catalyst for shifts in human history, yet rarely at the pace witnessed in our present moment. Whereas the printing press, industrialization, and the personal computer reshaped societies over decades or even centuries, the advent of artificial intelligence has achieved cultural saturation in only a matter of months.[2]

November 30, 2022, stands as the defining moment in this unfolding landscape. On that date, OpenAI publicly released ChatGPT, making advanced Generative Artificial Intelligence widely accessible for the first time.[3] Within months, millions of users had adopted AI tools for a wide variety of content-generation tasks. Unlike earlier technologies, Generative AI does not merely deliver content but produces it in a way that closely resembles human capability.[4] This has significantly disrupted long-standing assumptions about authorship and originality. Generative AI is fast blurring the distinction between human cognition and machine output, forcing educators to reassess how meaning, creativity, and authority are defined within academic contexts.[5] When Large Language Models can easily complete most educational tasks and generate plausible arguments and explanations, academics are forced to reconsider what it means to know, to understand, to reason, and to demonstrate learning.[6]

For Christian education in particular, the challenge is not simply pedagogical or technological but also inherently theological. The increasingly wide availability of artificial intelligence tools raises concerns not only about academic integrity but also about human identity and how we relate to God. Christian education’s future relevance hinges on how it navigates the next phase of AI, planning not merely for widespread adoption but for adoption that is responsible and biblically appropriate. This requires more than embracing new technologies or creating new institutional policies. What is needed is a coherent theological ethic capable of guiding educators and students alike through the evaluation and use of AI tools. Such a framework must integrate faith and learning while simultaneously orienting educational practice toward God and spiritual formation.

The Seismic Impact of Artificial Intelligence on Higher Education

Higher education has historically served two interrelated functions: educating students and certifying that education has occurred.[7] Academic institutions are entrusted with publicly validating to stakeholders that learning has actually taken place. Written examinations, essays, and research projects have long functioned as the primary mechanisms through which institutions demonstrate student learning to society.[8] However, the widespread availability of Generative AI tools in the last three years has significantly destabilized this model. Artificial intelligence is now easily capable of producing work that is virtually indistinguishable from authentic student submissions.[9]

This reality undermines the assumption that traditional assessments still reliably certify understanding or mastery.[10] The issue is not simply academic dishonesty but the erosion of assessment validity. In today’s AI-saturated context, the integrity of academic certification increasingly rests with students themselves and the extent to which they genuinely want to learn. Even in fully proctored or supervised environments, Generative AI raises legitimate questions about whether existing assessment formats alone accurately measure comprehension, synthesis, and judgment. Artificial intelligence disrupts the long-standing assumption that traditional forms of assessment certify learning, and it exposes the extent to which many current educational practices persist because of historical tradition rather than pedagogical appropriateness.[11]

Student adoption of AI tools has also accelerated at an unprecedented pace. Recent surveys indicate that a majority of university students now use Generative AI for academic purposes, not just for editing and summarizing but for actual content creation.[12] Adult learners, in particular, report adoption rates exceeding those of any other demographic group.[13] These tools are no longer on the periphery of education; they are quickly becoming embedded in the daily routines of students across all disciplines. Institutional responses to this rapid adoption have varied widely. Some institutions have attempted to prohibit AI use entirely, framing Generative AI tools as threats to academic integrity. Others have adopted permissive policies that offer little guidance beyond broad encouragement. Both approaches have proven inadequate. Prohibition-based strategies are difficult to enforce and often drive AI usage underground, while permissive approaches risk normalizing uncritical and irresponsible dependence.[14] The danger is that neither approach adequately addresses the underlying problem.

The central challenge posed by artificial intelligence is not whether students will use these tools, but whether educational institutions will adapt and encourage responsible learning in a context where the use of such tools has become commonplace.[15] Artificial intelligence is not a future challenge for higher education; it is a present reality that requires careful pedagogical and ethical consideration. The Christian academy is not simply standing at the intersection of AI and the educational process. It is being actively thrust into its rapid integration. The question is not whether educational institutions should adopt AI tools, because that decision has already been forced upon them. The question is how educators, especially Christian educators, should respond to the seismic changes taking place around them in the wake of the emerging AI era. The solution lies in training students to responsibly evaluate and use Generative AI tools.

Christian educational institutions must find not only an ethical response to AI but one that clearly aligns with their distinctive mission. The Christian academy exists not merely to transmit information or certify competence, but to form students spiritually and theologically. Christian education is not simply the end. It is a means to the end, which lies in the furtherance of the mission of Jesus Christ (Matt 28:18–20, ESV). This holistic view of education has historically distinguished Christian colleges and universities from purely technical training environments.[16] The temptation for institutions, including Christian ones, is to respond to AI primarily through policy and technological adaptation. While such measures may address surface-level concerns, they do little to engage the deeper formative and ethical questions at stake. At this juncture, Christian education faces a critical choice. Reactionary resistance risks cultural irrelevance, while uncritical adoption risks theological erosion. What is required is a solution grounded in theological clarity and biblical wisdom.[17]

Digital Skills, AI Fluency, and the Limits of a Skills-Only Response

As Generative AI tools become embedded across nearly every sector of society, the demand for digital skills and AI fluency has accelerated rapidly. Employers now increasingly expect graduates to enter the workforce not merely familiar with AI tools, but capable of integrating them productively into daily workflows.[18] The marketplace is clear: artificial intelligence is no longer seen as an addition to existing work processes. Instead, it is rapidly becoming foundational infrastructure, reshaping how work is organized, evaluated, and performed.[19] It is not just a tool to use at work; it is changing how work itself gets done.[20] Higher education has responded to these developments by expanding training initiatives, revising curricula, and investing in professional development focused on AI literacy.[21] Christian institutions, however, must also prepare their students to navigate AI as a matter of biblical stewardship. To graduate students who lack basic digital fluency would be to leave them unprepared for the realities of contemporary professional life.

Yet while digital skills and AI fluency are necessary, they are not sufficient. A skills-only response risks reducing education to technical competence detached from moral, spiritual, and relational formation. Studies examining cognitive engagement when students rely heavily on Generative AI suggest that excessive outsourcing of thinking can weaken neural pathways associated with reasoning and synthesis.[22] Teaching students how to use these tools well is therefore increasingly important. This is especially the case for Christian education.

The Christian intellectual tradition has consistently affirmed that education concerns the formation of wisdom rather than the accumulation of information. Scripture distinguishes knowledge from wisdom, grounding true understanding in reverence for God and moral discernment (Prov 1:7; Jas 3:13–17).[23] Without theological guidance, AI fluency risks becoming formative in unintended ways, shaping habits without appropriate ethical reflection.[24] When AI systems generate answers instantly, students may come to value speed over reflection and plausibility over truth.[25] Christian education must therefore move beyond teaching AI skills toward cultivating theological discernment, equipping students to evaluate the tools before they are ever used. This discernment requires more than isolated conversations about academic integrity. It demands an ethical framework capable of integrating technological competence with theological truth. Christian educators must be equipped not only to teach students how to use AI tools, but also to help them ask deeper questions about why, when, and to what ends such tools should be employed. These questions set the stage for a coherent theological framework.

Why Christian Education Requires a Theological Framework for Artificial Intelligence

Artificial intelligence is frequently treated as a technical or administrative challenge in education. Policies addressing academic integrity, data privacy, and acceptable use dominate institutional conversations. While such concerns are legitimate, they are insufficient. Technologies embody assumptions about what it means to be human, what counts as knowledge, and how authority is exercised.[26] Generative AI intensifies these questions because of the ease with which it simulates both intellectual and emotional responses.[27] When machines can generate content, articulately summarize arguments, and mimic reasoning, long-standing distinctions between human cognition and machine output become blurred.

These developments raise fundamentally theological questions. What distinguishes human intelligence from that which is artificial? What constitutes authorship, creativity, and responsibility in an age of generative systems? How should Christians understand human vocation when machines increasingly perform tasks once associated with professional identity? These questions cannot be answered through policy alone. They require theological reflection grounded in Scripture and the Christian intellectual tradition. Christian education cannot afford to be theologically silent as to how it integrates Generative AI tools. Adoption without reflection tacitly affirms the values embedded within technology. Rejection without discernment abdicates responsibility for cultural engagement. Scripture presents God as Creator and humanity as stewards, entrusted with the care and cultivation of creation (Gen 1:26–28). Educators, as stewards of intellectual and moral formation, communicate theological commitments through curricular design, assessment practices, and institutional policies.[28]

A theological framework provides coherence amid complexity. It enables educators to move beyond reactive responses toward principled engagement. Rather than asking only whether artificial intelligence is permissible or efficient, a theological framework asks whether its use is faithful and aligned with God’s purposes. Such a framework does not prescribe uniform practices but offers guiding commitments that can be contextualized across disciplines and institutional settings. The W.I.S.E. framework set out below is advanced as a theological ethic for Christian education. Rooted in Scripture and informed by theological reflection on technology, W.I.S.E. articulates four interrelated components that together orient faithful engagement with artificial intelligence. Each element addresses a distinct dimension of Christian formation while remaining integrally connected to the others.

The W.I.S.E. Framework: A Theological Ethic for Christian Education

Wisdom occupies a central place in the biblical understanding of knowledge and learning. True wisdom is grounded in reverence for God and expressed through righteous living (Prov 1:7). While Generative AI tools excel at processing information, they cannot generate wisdom. The challenge for Christian education is therefore not whether AI can enhance efficiency, but whether its use contributes to spiritually faithful formation. The W.I.S.E. framework articulates four commitments that together guide Christian educators and students alike in navigating the opportunities and challenges of artificial intelligence today. These are not sequential steps but interrelated components that help inform decision-making across policy, pedagogy, assessment, and spiritual formation.

WORSHIP: Orienting Artificial Intelligence Toward the Glory of God

Christian theology begins with worship. Jesus summarized the law as love for God with heart, soul, and mind (Matt 22:37). Scripture rejects any division between the sacred and secular, portraying work itself as service rendered to God (Gen 2:15; Col 3:17). Artificial intelligence, as a tool employed within human work, therefore falls within the scope of worship. A theological response to AI cannot be purely technological. It must also be doxological. The primary danger posed by advanced technologies is idolatry. Idolatry emerges when created things are treated as ultimate sources of meaning or authority (Ex 20:3–5). In educational contexts, this can occur when efficiency, productivity, or AI-generated output replaces theological discernment. Christian academic institutions must guard against the danger of users surrendering human agency in exchange for convenience.

A worship-centered ethic encourages gratitude for technological creativity and the possibilities it opens. Human creativity and innovation are gifts exercised as part of humankind’s dominion over the earth. When artificial intelligence is used as a tool in service of God as Creator, it becomes one more way in which Christians love God with their minds and offer their work back to Him in praise. The question is not whether AI may be used, but whether its use is ordered toward God’s glory rather than human autonomy.

A worship-oriented approach therefore helps users learn that Generative AI tools always remain subordinate to God’s purposes and human stewardship. AI tools may assist human work, but they must not define truth, value, or identity. For Christian educators, this commitment requires intentionality and transparency. The goal is not maximal use of AI, but faithful use. When learning is framed as an act of devotion rather than mere productivity, technology is placed in its proper theological context. The appropriate theological response is to view every digital act, every prompt, query, or generated image, as an offering to God (1 Cor 10:31). Every interaction with AI must be done to His glory, not our own. A worship-centered orientation toward artificial intelligence also reshapes how Christian educators understand formation over time. Worship is not a single act but a habit-forming posture that directs love and attention. Technologies, including AI systems, inevitably train users in particular habits of mind and patterns of desire. Christian education must therefore ask not only what AI enables students to produce, but who it trains them to become.

Practically, this means Christian educators should not only teach their students how to use Generative AI tools but ought also to design learning experiences that make space for theological reflection on how the use of those tools enables (or hinders) the user’s ability to worship God. Artificial intelligence may assist learning, but it must never crowd out practices that foster dependence on God, humility, and responsibility for one’s own thinking. When AI use is situated within a broader understanding of worship, it becomes a servant of formation rather than a substitute for it. When rightly ordered, artificial intelligence does not diminish worship but expands the spaces in which faithful work can be offered to God. The first of four key questions for reflection is therefore: how are we worshipping God as we use artificial intelligence today?

IMAGE: Stewarding Artificial Intelligence as Bearers of the Imago Dei

Scripture affirms that humans are uniquely created in the image of God (Gen 1:26–27). Humankind, created in God’s image, is entrusted to represent and steward all of creation on His behalf (Gen 1:28–29). We are the pinnacle of creation and have been granted dominion over everything else that exists. This dominion extends to the digital realm. Artificial intelligence, regardless of sophistication, does not reflect the Imago Dei in and of itself. It possesses no moral agency, spiritual capacity, or relational accountability. The creation narrative clarifies humanity’s role as stewards rather than creators (Gen 2:15). God alone creates ex nihilo. Humans are entrusted with cultivating and governing creation on God’s behalf, exercising authority as accountable stewards rather than autonomous masters.[29] Artificial intelligence is therefore a derivative of creation, emerging from human ingenuity exercised within God’s created order.

As advanced as they are, Generative AI tools are still created things. Contrary to popular phraseology, they are not co-creators or co-stewards. Theological stewardship involves interacting with all forms of technology in ways that uphold truth and human dignity, avoiding both exploitation and passivity. Consequently, to steward artificial intelligence faithfully is to evaluate and use it in an ethically and theologically consistent manner. Digital stewardship therefore demands using AI tools with integrity and transparency. To steward AI faithfully is to use artificial intelligence for Him and through Him, not apart from Him.

Precisely because human beings bear the image of God, they are uniquely qualified to engage with and steward artificial intelligence. Reflecting on the Imago Dei alongside AI ought not, therefore, to restrict its use; rather, it affirms human dominion over it. In this sense, the use of Generative AI tools is not antithetical to image-bearing, but a way in which image-bearing is exercised responsibly and theologically today. An imago Dei–centered ethic insists that stewardship responsibility ought not to be shifted to LLMs and chatbots; ethical responsibility remains with the user. Christian education must affirm human accountability, particularly in an era when automation can obscure lines of responsibility.

An imago Dei–centered framework also clarifies why artificial intelligence poses unique challenges for Christian education beyond those associated with earlier technologies. Unlike calculators or learning management systems, Generative AI produces output that closely resembles human reasoning and expression. This resemblance can obscure the distinction between humans and technology, particularly for those who have grown up in the digital age. Christian education must therefore engage in explicit theological instruction that reinforces what it means to be human in contrast to what machines can simulate.

Scripture consistently portrays humans as moral agents called to exercise judgment and to give account for their actions (Deut 30:19; Mic 6:8; Rom 14:12). When educators or students rely uncritically on AI systems to generate content, the risk is not merely to academic integrity. The deeper danger is the erosion of spiritual habits and moral agency. A theological ethic, on the other hand, insists that education preserve spaces where users must take responsibility for their work. This has direct implications for how Christian educators think about learning outcomes. Assessments and learning activities should require students to articulate reasoning, defend judgments, and understand what it means to reflect God in the AI era. Christian education plays a vital role in nurturing students who not only understand Generative AI technologies but who also recognize their ongoing accountability to God for the responsible stewardship of those technologies. Refusing to engage with artificial intelligence is not a neutral act; it is a decision to abandon stewardship over a powerful tool that will shape human lives regardless. The second question for theological reflection consequently becomes: how are we reflecting the Imago Dei as we integrate AI tools in Christian education today?

SENT: Mission, Cultural Engagement, and Artificial Intelligence

Christian education has always played a vital role in the mission of the Church (Matt 28:18–20). Jesus’ command to make disciples of all nations includes the teaching and spiritual formation that take place in the classroom just as much as that which occurs every Sunday in the pew. The mission of the Church was not limited to a particular historical or technological context, nor did it contain an implicit exclusion of emerging tools. Throughout Christian history, faithfulness has required engagement with changing cultural conditions rather than withdrawal from them.[30] Artificial intelligence now represents one of the most influential cultural forces shaping communication, work, and meaning. Therefore, to be sent in the present age necessarily involves engagement with an AI-saturated world (Acts 17:22–28; 1 Cor 9:19–23).

Historically, the Church has often approached new technologies with hesitation or resistance.[31] At times this caution reflected legitimate moral concern, yet it has also frequently resulted in missed opportunities for witness and leadership. Scholars of religion and culture have observed that when Christian communities disengage from emerging technologies, secular frameworks are left to define their meaning and ethical boundaries.[32] Artificial intelligence raises profound questions about intelligence, agency, value, and what it means to be human. If Christian institutions decline to participate in these conversations, those questions will be answered by secular ethical frameworks, and the responsible use of Generative AI will default to populist ideology rather than biblical theology.[33]

A missional posture toward artificial intelligence therefore requires more than reactive participation. It demands that Christian educators and their institutions actively lead the conversation. Mission today involves proclaiming truth not only through the spoken or written word, but also through the ways technologies are evaluated and employed (1 Chron 12:32). When Christian educators engage AI thoughtfully and transparently, they offer a counter-narrative that integrates wisdom, responsibility, and worship rather than efficiency alone.

Artificial intelligence also presents unprecedented opportunities for cultural engagement in today’s marketplace. Students trained within the Christian academy are entering workplaces where digital skills and AI fluency are required rather than optional. To prepare them for faithful witness, Christian education must help students understand how to responsibly and theologically engage colleagues, clients, and communities within the AI era. Learning to use AI competently and ethically must become an urgent priority, enabling Christians to participate meaningfully in conversations where questions of purpose, trust, and responsibility naturally arise.

To be sent in the age of artificial intelligence is thus to bring the light of Christ into digital and technological frontiers. This does not require uncritical adoption of every new tool, nor does it justify technological enthusiasm detached from theological discernment. Instead, it reflects a commitment to faithful presence. Christian education can model what it looks like to inhabit technological spaces with humility, wisdom, and hope. By leading conversations about responsible AI usage, its impact on relationships, and its implications for human dignity, Christian institutions can engage the broader culture in ways that invite dialogue rather than defensiveness.

Ultimately, mission-oriented engagement with artificial intelligence guards against both withdrawal and capitulation. The Church’s task is not to retreat from the digital age, nor to embrace technological progress uncritically, but to demonstrate that wisdom and worship belong together. When Christian educators approach artificial intelligence as an arena for discipleship rather than a threat to be avoided, they affirm that Christ’s lordship extends over every domain of human creativity and innovation. Such an approach prepares students not only to navigate an AI-shaped world, but to bear faithful witness within it. Faithful participation in the mission of God within an AI-saturated culture requires intentional formation that equips students with both theological discernment and practical competence in the use of emerging technologies. The third key question of the W.I.S.E. framework therefore becomes: how can we further the mission of Jesus as we use artificial intelligence tools?

EQUIPPED: AI Fluency, Digital Skills, and Faithful Vocation

Christian education has long understood learning as preparation for faithful vocation rather than just for employability. Scripture affirms that believers are God’s workmanship, created in Christ Jesus for good works prepared in advance (Eph 2:10). These good works are not isolated from historical and cultural conditions but are carried out within particular social, economic, and technological contexts. In the present moment, those contexts are increasingly shaped by Generative AI tools. To be faithful in one’s calling today therefore requires intentional engagement with the very tools that are changing the way we work and live.

Equipping students for faithful vocation in the AI era cannot be reduced to simple course redesign or isolated curricular additions. Artificial intelligence is not merely a tool that is added into the educational process. It is reshaping every element of it and changing the way students learn and generate knowledge.[34] Employers increasingly expect graduates to work using AI systems as collaborative partners, integrating Generative AI tools into daily workflows rather than treating them as external aids. At the same time, marketplace studies reveal growing concern that higher education is not keeping pace with the rate of technological change, particularly in developing applied AI competencies and digital fluency.[35]

Student behavior further underscores the urgency of this challenge. Surveys indicate that a substantial majority of higher education students now use Generative AI tools regularly for academic work.[36] Yet this widespread adoption is accompanied by significant gaps in understanding how AI systems function, how outputs should be evaluated, and how such tools should be used responsibly. Global studies on trust in artificial intelligence suggest that while use is accelerating, confidence in ethical governance and human oversight remains fragile.[37] This disconnect highlights the insufficiency of prohibition-based approaches and reinforces the need for intentional formation rather than reactive enforcement.

From a theological perspective, equipping students with AI fluency is best understood as an expression of stewardship rather than a technological accommodation. Artificial intelligence, as a powerful and pervasive tool embedded within contemporary life, falls squarely within the domain of stewardship. Excellence in technological competence, rightly ordered, can therefore be understood as a form of faithful service rather than a distraction from spiritual formation (Prov 22:29; Eccl 9:10).

At the same time, Christian education must resist framing AI fluency as a purely technical skill set. Equipping students requires integrating digital literacy with theological discernment. This reflects the Christian conviction that wisdom involves the right ordering of knowledge toward faithful action rather than efficiency alone. Institutionally, this calls for embedding digital fluency and AI competencies across disciplines rather than confining them to specialized programs. Studies of higher education suggest that students increasingly question the relevance of degrees that fail to demonstrate clear alignment with emerging professional realities.[38] Christian institutions, therefore, face both a pedagogical and missional imperative to articulate what they offer that artificial intelligence cannot: formation in judgment, character, relational competence, and ethical responsibility.

Finally, equipping students for faithful vocation in the AI era requires institutional leadership willing to invest in long-term innovation. This includes providing equitable access to AI tools, regularly updating curricula as technologies evolve, and supporting faculty development that integrates digital fluency with theological reflection. A key question facing Christian education is not whether artificial intelligence will shape learning, but whether institutions will equip students to engage these tools wisely, transparently, and theologically. To equip the next generation is to integrate discipleship with digital literacy, preparing graduates who can engage artificial intelligence as stewards rather than subjects, in service of God’s purposes. The final question of the W.I.S.E. framework is therefore: how are we equipping our students with appropriate AI skills and discernment to further their calling?

Conclusion: From Disruption to Discernment

The advent of Generative AI tools leaves in its wake foundational questions the academy must answer about learning outcomes, assessments, and critical thinking. For the Christian institution, however, it also raises immensely important questions about worship, the imago Dei, mission, and how we are preparing ourselves and our students for the AI era. The challenge is not whether AI will reshape education, but whether Christian educators will engage these changes faithfully. The W.I.S.E. framework offers a theological ethic for navigating this moment. By orienting AI use toward worship, affirming the image of God, embracing mission, and equipping students for faithful vocation, Christian education can respond to artificial intelligence with wisdom rather than with fear or efficiency alone. This framework does not resolve every practical question, nor does it prescribe uniform policies. Instead, it provides a theological structure capable of guiding ongoing reflection and conversation about AI.

The emergence of Generative AI tools has not created the issues raised in this paper, but it has intensified them. Questions of authorship, agency, wisdom, and responsibility have always belonged to the task of education. What artificial intelligence has done is to expose how fragile many of our inherited assumptions about learning and assessment have become. For Christian education, this should not be seen as a threat. It is an invitation and opportunity to recover deeper theological clarity about the purpose of education itself. The temptation in moments of rapid technological change is to respond either with fear or with uncritical enthusiasm. Both responses abdicate the formative responsibility entrusted to Christian educators. Fear-driven resistance risks isolating Christian education from the cultural contexts students will inhabit, while enthusiasm untethered from theology risks conforming education to technological logics rather than Christian wisdom. The task before Christian educators is neither rejection nor capitulation, but discernment grounded in faith.

 

Notes:

[1] See for example, Jacques Ellul, The Technological Society, trans. John Wilkinson (New York: Vintage Books, 1964).

[2] Project on Workforce Team, “The Generative AI Adoption Tracker,” Project on Workforce, Harvard Kennedy School (November 13, 2025), available online: https://www.genaiadoptiontracker.com.

[3] Ethan Mollick, Co-Intelligence: Living and Working with AI (New York: Portfolio, 2024).

[4] Kate Crawford, Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (New Haven, CT: Yale University Press, 2022).

[5] See for example: Steven Watson, Erik Brezovec, and Jonathan Romic, “The Role of Generative AI in Academic and Scientific Authorship: An Autopoietic Perspective,” AI & Society 40 (2025): 3225–39, available online: https://doi.org/10.1007/s00146-024-02174-w; Hassan Bukhari and [Author], “Authorship and Ownership Issues Raised by AI-Generated Works,” Laws 14.4 (2025): 57–72, available online: https://doi.org/10.3390/laws14040057.

[6] Historical comparisons of technology adoption indicate that generative artificial intelligence is being integrated into everyday use far more rapidly than previous transformative technologies such as the personal computer and the internet. Empirical analyses of innovation output likewise reveal a magnitude of research acceleration in AI not seen in earlier technological epochs, supporting the claim that the speed of advancement in the present AI era is without parallel in the historical record. See Alexander Bick, Adam Blandin, and David Deming, “The Rapid Adoption of Generative AI,” On the Economy, Federal Reserve Bank of St. Louis (September 23, 2024), available online: https://www.stlouisfed.org/on-the-economy/2024/sep/rapid-adoption-generative-ai.

[7] Pat Hutchings, Jillian Kinzie, and George D. Kuh, Using Evidence of Student Learning to Improve Higher Education (San Francisco: Jossey-Bass, 2015), 146–82.

[8] Derek Bok, Higher Education in America (Princeton, NJ: Princeton University Press, 2013), 28–90.

[9] Steffen Herbold et al., “AI, Write an Essay for Me: A Large-Scale Comparison of Human-Written versus ChatGPT-Generated Essays,” arXiv (April 24, 2023), available online: https://arxiv.org/abs/2304.14276.

[10] Peter M. Newton and Kevin Essex, “How Common Is Cheating in Online Exams?” Journal of Academic Ethics 22.3 (2024): 323–43, available online: https://doi.org/10.1007/s10805-023-09485-5.

[11] Lee S. Shulman, “From Minsk to Pinsk: Why a Scholarship of Teaching and Learning?” Change 34.3 (2002): 36–44; Galina Ilieva et al., “A Framework for Generative AI-Driven Assessment in Higher Education,” Information 16.6 (2025): 472, available online: https://doi.org/10.3390/info16060472; Jian Zhao, Elaine Chapman, and Peyman G. P. Sabet, “Generative AI and Educational Assessments: A Systematic Review,” ERP Journal 51 (2024): 124–55; see also “Rethinking Assessment in Higher Education in the Age of Generative AI,” in Emerging Technologies and Pedagogies in the Curriculum (Springer, 2025).

[12] Menlo Ventures, Consumer AI Adoption Report (Menlo Park, CA: Menlo Ventures, 2025), available online: https://www.globenewswire.com/news-release/2025/06/26/3105905/0/en/Menlo-Ventures-2025-Consumer-AI-Report-12B-Market-1-8-Billion-Global-Users-Signal-AI-s-Mainstream-Moment.html.

[13] Stanford Institute for Human-Centered Artificial Intelligence, What Workers Really Want from Artificial Intelligence (Stanford, CA: Stanford University, 2025), available online: https://hai.stanford.edu/news/what-workers-really-want-from-artificial-intelligence.

[14] Jordan Adair, “The Elephant in the Room: 4 Actionable Tips to Manage Student Cheating,” eCampus News (February 12, 2025), available online: https://www.ecampusnews.com/teaching-learning/2025/02/12/actionable-tips-to-manage-student-cheating.

[15] Richard Adams, “UK Universities Warned to ‘Stress-Test’ Assessments as 92% of Students Use AI,” The Guardian (February 25, 2025), available online: https://www.theguardian.com/education/2025/feb/26/uk-universities-warned-to-stress-test-assessments-as-92-of-students-use-ai.

[16] Arthur F. Holmes, The Idea of a Christian College, rev. ed. (Grand Rapids, MI: Eerdmans, 1987), esp. 6–14, 43–56. Holmes articulates a formative and teleological vision of Christian higher education, arguing that education is a means rather than an end, oriented toward the formation of the whole person through the integration of faith and learning in service of Christ’s mission.

[17] John Dyer, From the Garden to the City: The Redeeming and Corrupting Power of Technology (Grand Rapids, MI: Kregel Publications, 2011), esp. 23–52, 101–28. Although over a decade prior to the public availability of Generative-AI, John Dyer wrote about the importance of a theological approach to technology. Dyer offers a theological framework for evaluating technology grounded in the biblical narrative of creation, fall, redemption, and restoration. He argues that technology is a good and God-given expression of human creativity rooted in the imago Dei, yet also subject to distortion through sin. Rejecting both technological determinism and technophobia, Dyer emphasizes that technologies are never neutral, as they embed values that shape human practices and desires, and therefore require intentional, redemptive use oriented toward spiritual flourishing rather than uncritical acceptance or rejection.

[18] Microsoft Corporation, Work Trend Index 2025 (Redmond, WA: Microsoft, 2025), available online: https://www.microsoft.com/en-us/worklab/work-trend-index.

[19] “How Higher Education Can Realize the Potential of Generative AI,” UniNewsletter (September 22, 2025), available online: https://uninewsletter.com/blogs/how-higher-education-can-realize-the-potential-of-generative-ai/en.

[20] See for example Hannah Mayer et al., “Superagency in the Workplace: Empowering People to Unlock AI’s Full Potential at Work,” McKinsey & Company (January 28, 2025), available online: https://www.mckinsey.com/capabilities/tech-and-ai/our-insights/superagency-in-the-workplace-empowering-people-to-unlock-ais-full-potential-at-work; The State of AI: Global Survey 2025, McKinsey & Company (November 5, 2025), available online: https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-state-of-ai.

[21] EDUCAUSE, 2024 Horizon Report: Teaching and Learning Edition (Louisville, CO: EDUCAUSE, 2024), available online: https://library.educause.edu/resources/2024/5/2024-educause-horizon-report-teaching-and-learning-edition.

[22] Nataliya Kosmyna, Pattie Maes, and Robert Langer, “Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Tasks,” arXiv (2025), available online: https://arxiv.org/abs/2506.08872.

[23] All Scripture quotations are taken from the English Standard Version (Wheaton, IL: Crossway, 2001/2016).

[24] Neil Selwyn, Should Robots Replace Teachers? Artificial Intelligence and the Future of Education (Cambridge: Polity Press, 2019), 63–104. Selwyn argues that educational uses of artificial intelligence are inherently formative, shaping habits of dependence, authority, and authorship, and warns that without sustained ethical and pedagogical reflection, AI adoption risks reinforcing problematic assumptions about learning and human judgment.

[25] There is emerging scholarly and empirical evidence that generative AI use can shape how students approach knowledge, critical thinking, responsibility, and creativity, particularly when it is used in ways that encourage quick answers rather than reflective engagement. These studies support the claim made in this paper that AI-generated instant answers may inadvertently reinforce valuing speed over reflection and plausibility over truth. A systematic literature review in Educational Sciences examined multiple studies on AI tools (including generative models like ChatGPT) and found impacts on critical thinking and independent judgment among higher education students. While some uses of AI can enhance access to information, the research highlights concerns that ease of access may come at the cost of thoughtful engagement with content, potentially diminishing students’ ability to evaluate information independently. See Rahyuni Melisa et al., “Critical Thinking in the Age of AI: A Systematic Review of AI’s Effects on Higher Education Students’ Reasoning and Independent Judgment,” Educational Sciences (2025), available online: https://eric.ed.gov/?id=EJ1459623.

[26] This is not a debate that has arisen only during the last three years. There have been ongoing concerns in the three decades leading up to publicly available Generative AI tools. See for example, Albert Borgmann, Technology and the Character of Contemporary Life: A Philosophical Inquiry (Chicago: University of Chicago Press, 1984).

[27] M. Jones and M. Schonewille, “Synthetic Relationships and Artificial Intimacy: An Ethical Framework for Evaluating the Impact of Generative-AI on Community,” Journal of Ethics in Entrepreneurship and Technology (October 14, 2025), available online: https://doi.org/10.1108/JEET-06-2025-0040.

[28] David S. Dockery and Christopher W. Morgan, eds., Christian Higher Education: Faith, Teaching, and Learning in the Evangelical Tradition (Wheaton, IL: Crossway, 2018); Nathan F. Alleman, Perry L. Glanzer, and David S. Guthrie, “The Integration of Christian Theological Traditions into the Classroom: A Survey of CCCU Faculty,” Christian Scholar’s Review 45.2 (2016): 103–24.

[29] Christopher J. H. Wright, Old Testament Ethics for the People of God (Downers Grove, IL: InterVarsity Press, 2004).

[30] Lesslie Newbigin argues that the gospel must be proclaimed as public truth within the cultural, intellectual, and institutional structures that shape modern society, rather than being confined to the private sphere. His emphasis on faithful, critical participation in dominant cultural systems provides a constructive framework for Christian engagement with emerging technologies, including artificial intelligence. See Lesslie Newbigin, The Gospel in a Pluralist Society (Grand Rapids, MI: Eerdmans, 1989).

[31] During the early stages of the late twentieth century technological revolution, Ellul argued that technological systems frequently advance faster than moral and theological reflection, producing delayed or defensive responses from Christian institutions. See Ellul, The Technological Society.

[32] See for example James Davison Hunter, To Change the World: The Irony, Tragedy, and Possibility of Christianity in the Late Modern World (Oxford: Oxford University Press, 2010).

[33] For example, in a recent edition of eCampus News, Laura Ascione, the Editorial Director at eSchool Media, outlined what she termed the five essential dimensions of AI literacy today: understanding AI, critical thinking and judgment, ethical and responsible use, human centricity, and domain expertise. However, as expected, there was a theological void, and this will increasingly be the case unless Christian academics insert themselves into the conversation. See Laura Ascione, “5 Essential Dimensions of AI Literacy,” eCampus News (December 12, 2025), available online: https://www.ecampusnews.com/ai-in-education/2025/12/12/5-essential-dimensions-of-ai-literacy.

[34] Microsoft Corporation, Work Trend Index 2025.

[35] Deloitte Insights, How Higher Education Can Realize the Potential of Generative AI (2023).

[36] “GenAI in Higher Education: Positive Sentiment Builds with Rapid Transformation,” Cengage Group Perspectives (2025), available online: https://www.cengagegroup.com/news/perspectives/2025/genai-in-higher-education---positive-sentiment-builds-with-rapid-transformation.

[37] KPMG International and the University of Melbourne, Trust, Attitudes and Use of Artificial Intelligence: A Global Study 2025 (2025), available online: https://kpmg.com/au/en/insights/artificial-intelligence-ai/trust-in-ai-global-insights-2025.html.

[38] “Digital Education Council Global AI Student Survey 2024,” Digital Education Council (August 2, 2024), available online: https://www.digitaleducationcouncil.com/post/digital-education-council-global-ai-student-survey-2024.


Martin R. Jones

Associate Dean, College of Business and Economics

Professor of AI, Law & Ethics

Anderson University