Generative AI in College Writing: Critical Literacies, Student Perspectives, and Strategies for Authorial Balance

Authors

  • Heidi McKee

Keywords

Artificial Intelligence, Writing, Technological Literacies

Abstract

Drawing from interviews with undergraduates, this article offers a critical analysis of the problems posed by generative artificial intelligence (GenAI) and student-centered recommendations for approaching college writing with GenAI. The data presented include examples of how students work with GenAI, showing their critical engagement with writing processes, the challenges created by no-AI-use policies, and how students negotiate concerns about overreliance. Recommendations address the benefits of an invitational, trust-centered pedagogy, the importance of fostering critical literacies about GenAI, the need to articulate what overreliance looks like for specific assignments, and strategies for helping students find authorial balance.

References

Alvero, A. J., Lee, J., Regla‑Vargas, A., Kizilcec, R. F., Joachims, T., & Antonio, A.L. (2024). Large language models, social demography, and hegemony: Comparing authorship in human and synthetic text. Journal of Big Data, 11(1), 138. https://doi.org/10.1186/s40537-024-00986-7

Anderson, B.R., Shah, J.H., & Kreminski, M. (2024). Homogenization effects of large language models on human creative ideation. In Proceedings of the 16th Creativity & Cognition Conference (C&C ’24) (pp. 413–425). Association for Computing Machinery. https://doi.org/10.1145/3635636.3656204

Banks, A. J. (2005). Race, rhetoric, and technology: Searching for higher ground. Routledge.

Deidentified Source, 2024.

Bender, E.M., Gebru, T., McMillan‑Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’21) (pp. 610–623). Association for Computing Machinery. https://doi.org/10.1145/3442188.3445922

Bennett, J. (2010). Vibrant matter: A political ecology of things. Duke University Press.

Bergmann, D., & Stryker, C. (2024). What is artificial general intelligence (AGI)? (IBM Think). IBM. https://www.ibm.com/think/topics/artificial-general-intelligence/

Bondari, N. (2025). AI, copyright, and the law: The ongoing battle over intellectual property rights. Intellectual Property & Technology Law Society, University of Southern California. https://sites.usc.edu/iptls/2025/02/04/ai-copyright-and-the-law-the-ongoing-battle-over-intellectual-property-rights/

Burns, H.L. (1979). Stimulating rhetorical invention in English composition through computer-assisted instruction (Doctoral dissertation, University of Texas at Austin). Internet Archive. https://archive.org/details/DTIC_ADA106372

Feenberg, A. (1991). Critical theory of technology. Oxford University Press.

Fernandes, M., & McIntyre, M. (2025a). Giving voice to generative AI refusal in rhetoric and writing studies. Kairos: A Journal of Rhetoric, Technology, and Pedagogy, 29(2). https://kairos.technorhetoric.net/29.2/disputatio/fernandes-mcintyre/

Fernandes, M., & McIntyre, M. (2025b). Drafting defensively, documenting authorship: An analysis of Draftback and Grammarly Authorship. Computers and Composition, 76.

Graham, S.S. (2023). Post-process but not post-writing: Large language models and a future for composition pedagogy. Composition Studies, 51(1). https://compstudiesjournal.com/wp-content/uploads/2023/06/graham.pdf

Hale, R. (2025). She lost her scholarship over an AI allegation — and it impacted her mental health. USA Today. https://www.usatoday.com/story/life/health-wellness/2025/01/22/college-students-ai-allegations-mental-health/77723194007/

Hawisher, G.E., & Selfe, C.L. (1991). The rhetoric of technology and the electronic writing class. College Composition and Communication, 42(1), 55–65.

Hofmann, V., Kalluri, P.R., Jurafsky, D., & King, S. (2024). AI generates covertly racist decisions about people based on their dialect. Nature, 633, 147–154. https://doi.org/10.1038/s41586-024-07856-5

International Energy Agency. (2025). AI is set to drive surging electricity demand from data centres while offering the potential to transform how the energy sector works. IEA. https://www.iea.org/news/ai-is-set-to-drive-surging-electricity-demand-from-data-centres-while-offering-the-potential-to-transform-how-the-energy-sector-works

Johnson, G.P. (2023). Don’t act like you forgot: Approaching another literacy “crisis” by (re)considering what we know about teaching writing with and through technologies. Composition Studies, 51(1). https://compstudiesjournal.com/wp-content/uploads/2023/06/johnson.pdf

Jiang, J., Vetter, M. A., & Lucia, B. (2024). From hype to practice: Reinterpreting the writing process through technical writing students’ engagement with ChatGPT. Technical Communication Quarterly. https://doi.org/10.1080/10572252.2024.2445302

Kolko, B. E., Nakamura, L., & Rodman, G. B. (Eds.). (2000). Race in cyberspace. Routledge.

Kosmyna, N., Hauptmann, E., Yuan, Y.T., Situ, J., Liao, X., Beresnitzky, A.V., Braunstein, I., & Maes, P. (2025). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task. arXiv. https://doi.org/10.48550/arXiv.2506.08872

Latour, B. (2005). Reassembling the social: An introduction to actor‑network theory. Oxford University Press.

Lee, H., Sarkar, A., Tankelevitch, L., Drosos, I., Rintel, S., Banks, R., & Wilson, N. (2025). The impact of generative AI on critical thinking: Self‑reported reductions in cognitive effort and confidence—effects from a survey of knowledge workers. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI ’25) (Article 1121, pp. 1–22). Association for Computing Machinery. https://doi.org/10.1145/3706598.3713778

McKee, H. (2002). “YOUR VIEWS SHOWED TRUE IGNORANCE!!!”: (Mis)Communication in an online interracial discussion forum. Computers and Composition, 19(4), 411–434.

McKee, H.A., & Porter, J.E. (2020). Ethics for AI writing: The importance of rhetorical context. In AIES '20: Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society (pp. 110–116). https://dl.acm.org/doi/10.1145/3375627.3375811

Miller, K. (2024). Privacy in an AI era: How do we protect our personal information? Stanford Institute for Human-Centered Artificial Intelligence. https://hai.stanford.edu/news/privacy-ai-era-how-do-we-protect-our-personal-information/

MLA‑CCCC Joint Task Force on Writing and AI. (2024). Building a culture for generative AI literacy in college language, literature, and writing (Working Paper No. 3). https://hcommons.org/app/uploads/sites/1003160/2024/11/MLA-CCCC-Joint-Task-Force-WP-3-Building-Culture-for-Gen-AI-Literacy.pdf

Owusu‑Ansah, A.L. (2023). Defining moments, definitive programs, and the continued erasure of missing people. Composition Studies, 51(1), 143–148.

Sano‑Franchini, J., McIntyre, M., & Fernandes, M. (2024). Refusing GenAI in writing studies: A Quickstart Guide. https://refusinggenai.wordpress.com/

Selber, S. A. (2004). Multiliteracies for a digital age. Southern Illinois University Press.

Selfe, C. L., & Selfe, R. J. (1994). The politics of the interface: Power and its exercise in electronic contact zones. College Composition and Communication, 45(4), 480–504.

Sperber, L., MacArthur, M., Minnillo, S., Stillman, N., & Whithaus, C. (2025). Peer and AI Review + Reflection (PAIRR): A human‑centered approach to formative assessment. Computers and Composition, 76, 102921. https://doi.org/10.1016/j.compcom.2025.102921

Turness, D. (2025). How distortion is affecting AI assistants. BBC Media Centre. https://www.bbc.co.uk/mediacentre/2025/articles/how-distortion-is-affecting-ai-assistants/

Wang, C. (2024). Exploring students’ generative AI-assisted writing processes: Perceptions and experiences from native and nonnative English speakers. Technology, Knowledge and Learning, 29(2), 345–367. https://doi.org/10.1007/s10758-024-09744-3

Wilson, C. (2025). AI hallucinations are getting worse – and they're here to stay. New Scientist. https://www.newscientist.com/article/2479545-ai-hallucinations-are-getting-worse-and-theyre-here-to-stay/

Xu, Z., Jain, S., & Kankanhalli, M. (2025, February 13). Hallucination is inevitable: An innate limitation of large language models. arXiv. https://doi.org/10.48550/arXiv.2401.11817

Yan, D. (2023). Impact of ChatGPT on learners in a L2 writing practicum: An exploratory investigation. Education and Information Technologies. https://doi.org/10.1007/s10639-023-11742-4

Zewe, A. (2025, January 17). Explained: Generative AI’s environmental impact. MIT News. https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117

Published

2026-04-06

Data Availability Statement

The interview transcripts are not available.

How to Cite

Generative AI in College Writing: Critical Literacies, Student Perspectives, and Strategies for Authorial Balance. (2026). Journal on Excellence in College Teaching, 37(Special Issue). https://celt.miamioh.edu/index.php/JECT/article/view/1330