Tag Archives: learning and failure

Is Artificial Intelligence Reducing Our Intelligence?

Joe Truss shared a great article with me, ‘The hidden cost of letting AI make your life easier’, by Shai Tubali on Big Think. Towards the end of the article, Shai shares this:

“[Sven Nyholm’s] deeper worry is not that AI will outperform humans, but that it will appear to do so, especially to non-expert eyes. “Current forms of AI threaten meaningful activities,” he argues, “because they look far more intelligent than they are.” This appearance invites trust. People begin to treat AI as an oracle, mistaking an impressive engineering achievement for understanding. As misplaced confidence grows, judgment weakens. Skills develop less fully. Capacities are handed over too easily, and with them, forms of meaning that depend on effort.

Nyholm links this directly to the value of processes, including confusion, detours, and lingering with complexity. He punctures the idea that everything should be fast and efficient. Speed may feel pleasant, he concedes, yet it undermines patient thinking and reconsideration. He points to an Anthropic advertisement promising a paper completed in a single day: brainstorming in the morning, drafting by noon, polishing by afternoon. What disappears in this vision is the slow work of searching, getting lost, following the wrong thread, and returning with insight. “Many ideas,” Nyholm says, “come from looking for one thing and finding something else instead.” When AI delivers tidy, unified answers, it spares us that work. In doing so, it risks weakening our capacity to break complex problems into parts, examine assumptions, and think things through with precision.”

AI reduces the productive effort and struggle that make both learning and understanding stick. Accessing information is profoundly different from understanding it, and it directs the learner towards an answer instead of a learning process. This article reinforced some ideas I’ve already shared.

In ‘Keeping the friction’ I said, “Decreasing the challenge doesn’t foster meaningful learning. Reducing the required effort doesn’t make the learning more memorable. Encouraging deeper thinking is the goal, not doing the thinking for you.”

And in ‘What’s the real AI risk in education?’ I said, “Real learning has a charge to it; it needs to come with some challenge and hardship. If the learning experience is too easy, it won’t be remembered. If there isn’t enough challenge, if the answers are provided rather than constructed, the learning will soon be forgotten. Remove being stuck, struggling, and failure, and you’ve removed the greatest part of a learning experience.”

I see this in my own learning. There are times I sit and read a full article, like the one shared above, but there are other times that I don’t bother and just throw a long article into an LLM and ask for a bulleted summary of the key ideas. However, I remember articles I read far better than articles where I only read the AI summaries.

How deep would my learning and understanding be if I only ever went as far as reading AI summaries? How much would my confidence and belief in my understanding grow without the depth of knowledge to support them? Would I be creating a kind of false fluency in topics where I lack true depth of understanding?

The convenience of using AI might not just be changing how we learn, it might be changing what we believe learning is… perceiving learning as having access to information rather than as having a deep understanding of a topic we needed to wrestle with to truly understand. In this way, the convenience of letting AI think for us might just be reducing our intelligence.

What’s the real AI risk in education?

I read a great article on LinkedIn by Ken Shelton. He looked at two articles:

“On one side:
AI as productivity infrastructure.
On the other:
AI as compliance enforcement.

But in both cases, the conversation centers on efficiency and policing, not on whether learning itself has been redesigned for an AI-rich world. Using historical context, one could reasonably make similar arguments around the implementation of technology as well. If students are learning to “sound human” to avoid detection… If institutions are investing in increasingly sophisticated surveillance tools… If teachers are primarily using AI to move faster within the same structures… Then we have to ask, as I have shared in previous posts:

Are we adapting learning?
Or are we simply optimizing and defending legacy systems?”

I found his article more interesting than the two he shared. I especially loved his final paragraph:

“The risk isn’t just that AI is moving too fast. The risk is that our response remains reactive, oscillating between efficiency and enforcement, without addressing purpose, power, and pedagogy. Therefore, the real inflection point isn’t technological, it’s analytical and philosophical.”

My thoughts: In education, go ahead and use AI to make teaching and lessons better, use it to help students learn, and also help them understand how to use AI to enrich their learning. But don’t use it to make learning easier. Real learning has a charge to it; it needs to come with some challenge and hardship. If the learning experience is too easy, it won’t be remembered. If there isn’t enough challenge, if the answers are provided rather than constructed, the learning will soon be forgotten. Remove being stuck, struggling, and failure, and you’ve removed the greatest part of a learning experience.

So educators need to do two things: First, use AI to make what they are doing even better. Second, shift the learning experience to one where they no longer need to worry about policing and compliance. For example: the work isn’t finished with an essay, but with students defending the points in their essays against classmates who hold slightly different points or perspectives. Students who wrote the essay with AI and didn’t fully comprehend the topic can’t argue their perspective as well as those who were willing to do the work… and if they did use AI and can still argue the points better than their peers, that only proves they understand how to use AI as a learning tool rather than a tool to do the work for them. Because the real risk of AI in education is that the AI does the work, the struggle, and the learning for the student.

The problem we face is that learning can be circumvented by AI. So the challenge for educators is twofold: make it harder to use AI inappropriately, and use AI to help make learning experiences more challenging. This is not an easy task, but it’s one we need to figure out and do well if we want our students to be learners who can thrive in a world where AI is all around us.

_________
Update: Just found this LinkedIn post by William (Bill) Ferriter, and it has two awesome images to fit with the above.

Update 2: I forgot about this post: Thinking Requires Effort

Reducing busywork, and maximizing the problem-solving time, in a community of learners who find benefit from working together, is what schools should be in service of.

Thinking Requires Effort

I recently read a great article by Alec Couros, The Radical Act of Thinking. In it he said, “The challenge isn’t finding the tool anymore. The challenge is avoiding it. We’ve reached the point where AI is the path of least resistance for almost every task.”

And then he concluded with this:

To succeed, we need to fundamentally reframe “effort.” We have to stop viewing the struggle of thinking as an inefficiency to be solved, and start protecting it as the very thing that helps us grow.

Here are a few ways that I see teachers doing this at Inquiry Hub:

  1. Community video or podcast challenges. Part of the challenge might include creating the video in a specific genre, or a meta part of the presentation where students explicitly describe what they have learned.
  2. Personalized inquiry projects. This is offered through a course designed around the process of learning, not content. So it doesn’t matter if a student is learning to code, designing a website, publishing a book, learning a specific skill in art, composing a song, starting a business, or even learning to crochet… the inquiry is designed around students learning skills they want to learn.
  3. Solving problems in class. I’ve questioned the value of homework for over 15 years now. Watching our senior math teacher teach Math & Physics, I see him focusing on the why of questions. I see his students working in pairs and groups to solve problems together on whiteboards. I see students actively struggling and learning in class, where they have access to support, and the focus is on the struggle and on understanding the problem.

Something else we do is take care not to add to students’ loads unnecessarily. I can’t count the times I’ve heard well-intentioned educators say, “You know what would be a good project for your students to do?” followed by a legitimately good idea. But we are not an alternate school; we are a regular school with an alternative approach. Our students still need to fulfill the entire regular curriculum on top of the inquiries they do for credit. As good as other ideas may be, they become make-work activities that not all students are interested in, and this just invites students to use AI or to feel like the work is busywork.

Will Richardson asks, “Every time you’re about to implement a new program or pedagogy or technology or initiative or building project or anything else, ask and answer this simple question: ‘In service of what?’”

When we add anything to our schedule, it’s to serve one of two purposes:

1. Integrate curriculum or make the curriculum more engaging. Our students go on to universities, colleges, and technical institutes, and they need the required courses to get there and do well. But the required curriculum doesn’t need to be taught in a linear, boring fashion. When a project is added in class, the intent is to meaningfully cover more curriculum in less time.

2. We add things in service of students. A recent example: For the last 10 years our PAC has fundraised to provide students with FoodSafe every 2nd year. So all our students learn life skills around preparing and serving food. This year our PAC is also providing our seniors with first aid training. The plan is that they will alternate years between FoodSafe and first aid so that every student who goes through Inquiry Hub will have these life skills when they leave the school. Carving out 8 hours of training time over 2 days involves our senior teachers reworking their schedule… in the service of giving our students a life skill.

I won’t pretend that everything we do is AI proof, and that there aren’t lessons and activities where students could avoid thinking using a tool that does the work for them. I also won’t pretend that every assignment and project is ‘in service’ of authentic learning for students. But I will say that we’ve worked hard to make the learning meaningful for students. We provide them with opportunities to work in our community towards common goals, and we provide them with opportunities to pursue projects meaningful to them, focusing on the process of learning… on the struggle, with a perspective that failure and struggle are a path to real learning, not a barrier.

I’ve said before,

“We talk a lot about ‘learning through failure’ in education, but we don’t really mean failure. Because when a student takes lessons from something not working, then it’s a learning opportunity and not actually a failure.”

This fits with what Alec said above,

To succeed, we need to fundamentally reframe “effort.” We have to stop viewing the struggle of thinking as an inefficiency to be solved, and start protecting it as the very thing that helps us grow.

The secret sauce is in providing the space and time for students to struggle out in the open, facing challenges or learning life skills that they will use. However, you don’t create these opportunities by continually adding things to a student’s plate. Adding more to their plates only invites them to find tools to do the work for them.

Thinking requires effort, and providing students with opportunities to demonstrate that effort in meaningful ways is, in my mind, the project of schools. Reducing busywork, and maximizing the problem-solving time, in a community of learners who find benefit from working together, is what schools should be in service of.

Content Free Learning (in a world of AI)

Yesterday, when I took a look at how it’s easier to make schoolwork Google-proof than it is to make it AI-proof, I said:

How do we bolster creativity and productivity with AND without the use of Artificial Intelligence?

This got me thinking about using AI effectively, and that led me to thinking about ‘content free’ learning. Before I go further, I’d like to define that term. By ‘content free’ I do NOT mean that there is no content. Rather, what I mean is learning regardless of content. That is to say, it doesn’t matter if it’s Math, English, Social Studies, Science, or any other subject, the learning is the same (or at least similar). So keeping with the Artificial Intelligence theme, here are some questions we can ask to promote creativity and productivity in any AI infused classroom or lesson:

“What questions should we ask ourselves before we ask AI?”
“What’s a better question to ask the AI?”
“How would you improve on this response?”
“What would your prompt be to create an image for this story?”
“How could we get to a more desired response faster?”
“What biases do you notice?”
“Who is our audience, and how do we let the AI know this?”
“How do we make these results more engaging for the audience?”
“If you had to argue against this AI, what are 3 points you or your partner would start with?”

In a Math class, solving a word problem, you could ask AI, “What are the ‘knowns and unknowns’ in the question?”

In a Social Studies class, looking at a historical event, you could ask AI, “What else was happening in the world during this event?” Or you could have it create narratives from different perspectives, before having a debate from the different perspectives.

In each of these cases, the AI responses become the material students develop, think about… and learn from. The subject matter can be vastly different, but the students are asked to think metacognitively about the questions and tasks they give AI, or to do the same with the results an AI produces.

A great example of this is the Foundations of Inquiry courses we offer at Inquiry Hub. Students do projects on any topic of interest, and they are assessed on their learning regardless of the content. See the chart of Curricular Competencies and Content in the course description. As described in the Goals and Rationale:

At its heart inquiry is a process of metacognition. The purpose of this course is to bring this metacognition to the forefront AS the learning and have students demonstrate their ability to identify the various forms of inquiry – across domains and disciplines and the stages of inquiry as they move through them, experience failure and stuckness at each level. Foundations of Inquiry 10 recognizes that competence in an area of study requires factual knowledge organized around conceptual frameworks to facilitate knowledge retrieval and application. Classroom activities are designed to develop understanding through in-depth study both within and outside the required curriculum.

This delves into the idea of learning and failure, which I’ve spoken a lot about before.

In each of the examples above, we are asking students challenging questions. We are asking them to think critically about what we are asking AI; to think about how we can improve on AI responses; or to use AI responses as a launching point to new questions and directions. The use of AI isn’t to ‘get to’ the answer, but rather to get to a challenging place that stumps students and forces them to think critically about the questions they ask AI and the responses they get.

And sometimes the activity will be too easy, other times too hard, but even those become learning opportunities… content free learning opportunities.

Every learner is a hero

Yesterday Inquiry Hub teacher John Sarte and I did a webinar with Will Richardson. In it, we shared the ‘Every Learner is a Hero’ whiteboard model we’ve been developing. I realize that this was drawn over 2 years ago and yet this was the first time we shared it in full.

There is a lot here, and John and I will share more, but here are some ideas on this whiteboard that I’ve already shared:

  1. The metaphor of Teacher as Compass – Think also of ‘Teacher as Guide’.
  2. The relationship between Learning and Failure, which involves re-examining the term failure.

It’s about more than just Transforming our Classrooms

It’s about creating a place for students to Dream, Create, and Learn… where student voice isn’t just about students presenting, but also about them helping to develop and create the learning spaces and experiences they want.

Students should be the heroes of their own learning journeys… after all it’s their learning that really matters.


Don’t know what you don’t know

I broke my bow a couple weeks ago, and bought a used but better bow than the one I had. This new-to-me bow was used by a top Canadian archer at the World Championships a few years ago… it’s a better bow than I’ll ever need. But I’m having such a hard time with it.

To be clear, it’s not the bow, it’s me. I’m a go-kart driver trying to drive a Ferrari. With my old bow, I could tell when I was shooting well and when I struggled. With this bow, my shooting feels good but gives inconsistent results. Good shots and bad shots feel the same. On worse shots, the bow feels like it has a mind of its own, torquing in my hand after my shot, the string hitting my arm. I never had this issue with my old bow.

Here’s the challenge: I don’t know what I’m doing wrong. I don’t know what I don’t know. I’ve made all kinds of adjustments and still get inconsistent results. Yesterday while practicing, my buddy who is helping coach me heard me complain (again) that the bow feels too narrow in my hand, and I don’t know why.

He said, ‘Well, you can keep complaining and do the same thing, and get the same results. Or you can stop and try to fix it.’ And he sent me to get cardboard and tape, and try to make the grip wider to see if that helped. Covid makes these conversations a bit tougher, because he’s making suggestions from a distance, where we would normally be shoulder-to-shoulder working this out. So, he shot a couple more rounds while I hacked away at cardboard and wrapped my handle in tape.

It seemed to work; a lot of the inconsistencies went away. I started shooting better, and the string stopped hitting my arm after my shot. I came home and wrapped a new handle with better material than cardboard, covered with some tennis racket overgrip.

I’ll give this a try for a while. It might help considerably. It might be one of many adjustments I make. It might be something that promotes bad habits and I might need to undo it and start all over again. I need to remember that I’ve only had this bow for two weeks, and I’m still a rookie on a huge learning curve. Right now I’m in an experimental phase and need to shoot my next 1,000 arrows before I can consider my feedback valid enough to ‘know’ more. It’s hard to fix things when you don’t know what you don’t know…