There is no significant learning without a significant relationship. - James P. Comer
“No significant learning can occur without a significant relationship.” Really? Does Dr. James Comer mean that students need to be close to their teacher to learn something?

I use this quote in just about every talk I give about my work. It's a compelling intellectual nugget because it works in all the ways that good axioms do: a simple concept, phrased well, with many applications.

Comer was not advocating for radical social pedagogy here. As a child psychiatrist with extensive training in how the brain functions, he was referring to the relationship of the learner to what they are learning. We know that learning also creates and strengthens connections between concepts in the brain.

Neurons that fire together, wire together. - Hebb's rule

For the constructivist, this all feels very natural. To acquire knowledge, you must make meaning by engaging with what you want to learn. That process is a direct reflection of what happens in your brain. And for the social constructivist, the picture is complete when you add in other humans, bringing their own connections with content and neural pathways.

But what happens when you bring AI into the equation? What kind of relationship should learners have with those tools? Perhaps this is a question for the connectivists.

genAI and our relationship with information

I know about the power of retrieval because I grew up in a household that was a milieu of references. The overlapping conversations were a gumbo of Monty Python quotes, movie lines, psalms, and snatches of song lyrics. Each of these verbal memes was layered with meaning that had as much to do with what it was from as what it actually said. This cultural gatekeeping kept me at a huge disadvantage as the baby of the family, eight years junior to the next youngest family member.

Life is full of these trials. By the time I was old enough to start indoctrinating my toddler with quotes from old sketches by The State, cell phones had entered and were subsequently banned from family riffing time. Being able to retrieve the perfect reference was the point of the game, and naming the year and original writer earned bonus points. Looking things up was both slowing things down and bringing useless accuracy to our jazzy, loud interactions. At some point we added, "No devices, no devices!" to the mix, and online fact checking was banned from the living room.

I often find myself in the same tension with genAI tools. When I ask a genAI chatbot to retrieve a piece of information for me, it comes hermetically sealed in a polite paragraph, divorced from its epistemological cousins, void of the kind of free association that gets my brain firing the adjacent neurons. At least when you Googled something, you had to go to a website and could stumble on related information.

The access to oceans of data that these tools give us also makes it necessary to sever context. Books are explained to you in 5 key points. Documents only exist as grist for RAG pipelines, chunked and served like pieces of nigiri. Just the choice bits can be great, sure, but sometimes you want fish stew.

There is emerging evidence suggesting that using generative AI to engage with a topic may be less sticky than other kinds of learning. The study (still in press) is less conclusive than some would imply, but it still raises questions about the connections we might be depriving our brains of when we use genAI to complete tasks.

relationships [with/through/in spite of] AI

How we will use genAI to engage with information is a deep and dark well, but I'm just as interested in what kind of relationships we will form with these tools. In higher education, we have leaned into recommending that students engage with AI as a collaborator.

These un-thought partners are purpose-built for generating novel output, so a role in brainstorming seems natural. But there is the risk that all of the cloud seeding is done by the machine, and the human brain does none of the connecting of disparate ideas. Learners who are given this advice will likely not differentiate between using a chatbot to enhance the process and letting it take the process over. If they simply ask for 40 ways to approach a personal essay, we've neutered the learning relationship yet again.

A collaborator might also critique or expand on the learner's ideas. A common pedagogical approach to prompting with AI is chatbot as devil's advocate, prodding learners into defending their own points of view. This is an intriguing idea, but I struggle to argue with something that doesn't actually hold any personal convictions. When rhetoric is flattened into hard facts, you might as well be arguing with an encyclopedia. It doesn't require critical judgment to engage in these conversations. Without intensive prompting, most LLMs will concede the point with rhetorical gymnastics and then congratulate you on a productive debate.

There is a future in which we form relationships with these tools on their own terms as they are injected into our group chats and workflows. It is already happening. Coders say good morning to their assistants. I say thank you when Claude finds a link I've been looking for. We accuse chatbots of gassing us up too much. "That's just like you," we say. "Like who?" we wonder.

I have many thoughts about how we might design tools that use AI to play with the relationships that are formed during learning. I imagine a time when every class project has a group dynamics and project management agent. I see colleges creating virtual assistant instructors for team-taught courses.

Mix this with the messiness of human relationships, and it all gets concerning very quickly. This post is not about the effect this could have on romantic or platonic relationships, but on instructional ones. There will be applications in other fields that ask similar questions. Godspeed to all of us riding the cultural wake.

However we choose to proceed in our design and adoption of AI and the ensuing relationships, one thing is clear to me: We should protect human relationships above all else. We know that people help each other learn in beautiful and powerful ways. Maybe we'll create genAI tools that enhance our ability to connect with one another. We must also know when it's time to call "No devices" and get back to the business of learning together.

AI Disclosure: This one was off the dome, like a well-timed Neil Young lyric. I have been doing reflective journaling on this topic for a while, sometimes with Claude. I did run it through a grammar checker because my English degree has expired.

arguing with encyclopedias: why learning is about relationships