Artificial Intelligence (AI) has become a common tool in education, particularly in software engineering, where it can assist with coding, debugging, design, and learning new concepts. In ICS 314, I primarily used ChatGPT to improve my workflow and deepen my understanding of course material. This essay reflects on how I applied AI in different parts of the course, how it influenced my learning, and how I see its future role in software engineering education.
Below are my reflections for each course element, including example prompts and evaluations of AI’s usefulness.
Experience WODs (e.g., E44: Next.js Hello World)
Prompt:
"In the terminal output after running npx create-next-app, it says '129 packages are looking for funding — run npm fund for details.' What does 'funding' mean in this context?"
Reflection: AI explained that this message indicates some installed packages are maintained by open-source projects that accept financial support. While unrelated to the WOD task itself, it improved my understanding of the open-source ecosystem and reminded me that every framework and library we use has maintainers working behind the scenes — often for free — to keep the tools we rely on functioning.
In-class Practice WODs
Prompt:
"Give me a beginner-friendly explanation of how Next.js works, including its folder structure, routing system, and how it differs from a regular React app."
Reflection: AI filled in conceptual gaps by explaining the /app directory, routing changes, and features like server-side rendering and static site generation. This additional context helped me understand why certain steps were taken in the WOD instructions, which made following them feel less like guesswork and more like applying a consistent system.
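For illustration, a minimal sketch of the file-based routing AI described might look like this; the file path and contents are hypothetical examples of the pattern, not code from the WOD itself.

```javascript
// app/about/page.js: because this file lives in app/about/, Next.js serves it at /about.
// (Hypothetical file; the root route "/" would come from app/page.js in the same way,
// so the folder structure under app/ effectively is the routing table.)
export default function About() {
  return <h1>About</h1>;
}
```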
In-class WODs
Prompt:
"How do I improve the following blocks of code? Provide explanations for why."
Reflection: After timed WODs, I used AI to review my solutions, identifying inefficient patterns or syntax mistakes I had overlooked under time pressure. This turned each WOD into a mini post-mortem, where I could analyze my mistakes and learn how to avoid them in future high-pressure settings.
Essays
Prompt:
"How do I structure this essay in .md?"
Reflection: Early on, I relied on AI to learn markdown formatting for essays, from basic headers and bullet points to proper code block formatting. Over time, I became comfortable enough to use markdown without assistance, but AI helped accelerate that early learning curve.
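As a rough illustration of the kind of formatting I was learning, a markdown essay skeleton might look something like this (the headings and text are placeholders):

````markdown
# Essay Title

## A Section Heading

- A bullet point
- Another bullet point

```js
// A fenced code block with a language tag for syntax highlighting
console.log('hello');
```
````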
Final Project
Prompt:
"Provide me a workflow for git push, making sure to avoid messing up the main build."
Reflection: AI helped me understand safe Git workflows, including branching strategies, pull requests, and merge conflict prevention. This guidance reduced the risk of overwriting teammates’ work and gave me more confidence in collaborating through GitHub.
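A sketch of the kind of workflow AI suggested is shown below; the branch and commit names are placeholders, and the exact steps varied by task.

```bash
git checkout main
git pull origin main            # start from the latest main branch
git checkout -b add-login-page  # do the work on a feature branch (placeholder name)
# ...edit and test locally...
git add .
git commit -m "Add login page"
git push origin add-login-page  # push the branch, never push directly to main
# Then open a pull request on GitHub and merge only after review passes.
```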
Learning a Concept / Tutorial
Prompt:
"In Next.js, what does the 'pages' folder do and how does file-based routing work?"
Reflection: AI clarified that each file in the pages folder becomes a route, making it easier to connect folder structure with application URLs. This explanation helped me visualize the relationship between code organization and user navigation, making tutorials more intuitive.
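For example, under the Pages Router the mapping is one file per route; the file below is a hypothetical illustration rather than code from the tutorial.

```javascript
// pages/contact.js: any file placed in the pages/ folder becomes a route,
// so this hypothetical component is served at /contact automatically.
export default function Contact() {
  return <p>Contact us here.</p>;
}
```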
Answering a Question in Class or on Discord
Reflection: I never used AI for this purpose because most questions I asked were grading-related and answered quickly by the instructor or peers. However, knowing AI was available for deeper technical clarifications gave me a safety net if I ever needed to explore a topic further before asking.
Asking or Answering a Smart Question
Reflection: I didn’t use AI here, as I was comfortable forming these questions without help. Still, I recognize that AI could be useful for refining a question to make it clearer and more specific if I ever needed assistance in framing a complex query.
Coding Example
Prompt:
"Show me an example of a simple Next.js component that displays a list of items from an array, and explain the code."
Reflection: AI provided a clean example that demonstrated array mapping and JSX rendering in Next.js. This concrete example not only saved time but also became a reusable reference for similar features in other projects.
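A component along those lines might look like the sketch below; the item names are placeholders, and this is my own reconstruction of the pattern rather than AI's exact output.

```javascript
// FruitList.js: a hypothetical component that renders an array as a bulleted list.
const fruits = ['Apple', 'Banana', 'Cherry'];

export default function FruitList() {
  return (
    <ul>
      {/* map() turns each array item into an <li>; the key prop helps React track items */}
      {fruits.map((fruit) => (
        <li key={fruit}>{fruit}</li>
      ))}
    </ul>
  );
}
```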
Explaining Code
Prompt:
"Explain what this Next.js page component does and how it works: <paste code snippet>."
Reflection: AI broke down imports, function definitions, return statements, and Next.js conventions, helping me understand both syntax and intent. This meant I wasn’t just copying code — I understood how to adapt it for new situations.
Writing Code
Prompt:
"Write a Next.js page that fetches JSON data from an API endpoint and displays it in a table."
Reflection: AI gave me a working scaffold with getStaticProps, which I modified to match assignment requirements. This helped me avoid “blank page syndrome” and focus on refining functionality instead of starting from scratch.
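The scaffold was roughly in this shape; the endpoint URL and field names below are placeholders I made up, not the assignment's actual data.

```javascript
// pages/items.js: hypothetical page that fetches JSON at build time and renders a table.
export async function getStaticProps() {
  const res = await fetch('https://example.com/api/items'); // placeholder endpoint
  const items = await res.json(); // assume an array of { id, name, price }
  return { props: { items } };
}

export default function ItemsPage({ items }) {
  return (
    <table>
      <thead>
        <tr><th>Name</th><th>Price</th></tr>
      </thead>
      <tbody>
        {items.map((item) => (
          <tr key={item.id}>
            <td>{item.name}</td>
            <td>{item.price}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}
```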
Documenting Code
Reflection: I didn’t use AI for documentation, as I had developed strong habits in previous ICS courses. However, AI could have been helpful for generating consistent JSDoc comments or converting technical explanations into more beginner-friendly language.
Quality Assurance
Prompt:
"For the following block of code, I received these errors, what went wrong?"
Reflection: AI helped pinpoint the causes of errors, particularly with database-related issues I hadn’t encountered before. It often suggested multiple possible fixes, which encouraged me to experiment and learn through comparison.
Other Uses
Reflection: Beyond structured coursework, I mostly used AI to clarify the purpose and function of software tools. It was like having a glossary and tutor rolled into one — ready whenever I hit an unfamiliar term or concept.
AI was especially helpful for steep learning curves, such as understanding Next.js or interpreting unfamiliar terminal output. Its explanations were often more approachable than official documentation, helping me grasp concepts faster. However, AI sometimes produced outdated or incorrect information, reminding me to treat it as a starting point and verify results through testing or trusted sources.
Outside ICS 314, I used AI for a personal project involving computer vision in Python to analyze bouldering videos. It helped me learn libraries like OpenCV and MediaPipe, troubleshoot frame extraction and object tracking, and explore alternative solutions. While not always perfect, AI accelerated my experimentation process.
A major challenge was over-trusting AI answers, especially with rapidly evolving frameworks like Next.js. Incorrect suggestions sometimes led to wasted debugging time. Still, AI offers significant potential as an on-demand tutor, providing quick explanations and examples while encouraging critical thinking.
AI offered instant guidance and examples, reducing downtime when I was stuck. Traditional methods like documentation and trial-and-error often led to deeper retention. The best approach combined both: using AI for quick direction, then reinforcing knowledge through hands-on practice.
AI could be integrated into ICS 314 as a supplementary resource, possibly through guided prompts tailored to course content. This would help students get targeted help while avoiding over-reliance. Encouraging students to restate AI answers in their own words could maintain critical thinking.
Using AI in ICS 314 proved valuable for tackling steep learning curves and quickly clarifying concepts. However, it also reinforced the need for verification and critical thinking. In the future, AI should be positioned as a research and brainstorming tool rather than a complete solution, ensuring students develop both efficiency and understanding.