AI Flipping the Fixed OER (and much more, maybe)

I’ve been reading and sharing what I read about AI, but frankly have not been as deeply immersed in the land of prompt engineering. Like many of you, I somewhat feel like the firm ground I have been relying on might be shifting.

That’s not bad. Yet.

Influenced some by what David Wiley wrote last summer about Generative Textbooks, a realization has repeated in my head that so much of our process, efforts, and approaches with OER-- all of which have been a positive change in the open learning space-- is really focused on a fixed entity. A video, an interactive applet, an open textbook. It’s published in a specific platform and format-- but more or less becomes a solid object, if you will.

Brick City flickr photo by cogdogblog shared under a Creative Commons (BY) license

Hey, at one time it was supposed to be all the learning object lego blocks we could snap together as remixes.

It seems to me, though, that it’s possible, as David describes, that future learning content, perhaps mediated through an LLM or similar thing, could be something more… liquid, flexible, not just limited to what was set into print or web code.

That does not mean better or worse, but different. So I think there is opportunity, and people are doing it not by tossing prompts into the vast milieu of ChatGPT/Bard/Claude et al., but by focusing the LLM’s capability to connect and summarize on the content you choose to give it.

One of many examples I came across is where @dajbelshaw exported a set of his blog notes for a degree he is pursuing, into a format he could then query through an LLM.

There is a lot of this going on, I know, and I have an idea I want to pitch elsewhere later to do this on a domain of content in an OE Global program.

This is nothing I have done myself or plunged into, but I am thinking conceptually: how will our concept of OER change if it is not in a fixed form? Of course it must first be published into something you can feed an LLM, but maybe it also frees us of the constraints of format and of being locked inside a platform-- it’s really just text, and if I understand correctly, it need not even be structured.

But Wait, There Is More

This might bend your mind (it did mine), but David’s thinking again is pushing my own with a new post “An ‘AI Student Agent’ Takes an Asynchronous Online Course” – the opening is what I think we ought to be chewing on around here:

Given the magnitude of impact generative AI is having and will have in education (and many other aspects of life), I’m working with some diligence to keep up to date with developments in the field. Recently, I noticed how a couple of the emerging capabilities of generative AI will come together in the future in a way that will impact education much more dramatically than I am hearing anyone talking about currently (if I’m missing this conversation somewhere, please help me connect to it!). But before I give away the punch line, let me share the individual pieces. Maybe you’ll see what I saw.

I leave it for you to take in some of the pieces, but what I saw via the (weirdly narrated) demo video for OpenInterpreter suggested even a future fluidity of computer interfaces. I am not too sure what to make of Open Interpreter since everything looks shiny on the websites, but at least it’s not just washing the word “open”.

I am just fumbling around here with ideas, but isn’t that what we ought to be doing here? Is the future paved by fixed bricks or floating down rivers or some of both?

I have no idea. I just am seeking conversations.

I concur with this perspective and anticipate the continued integration of generative AI into learning management systems (LMS). My own experience in late 2022 involved successfully piloting a branching scenario using a large language model. The setup involved scripting multiple personalities, defining specific learning objectives, and establishing both duration and evaluation criteria. The detailed prompt enabled the successful execution of this AI-driven branching scenario, which included not only grading and feedback but also offered multiple choices at each decision point, pausing for user input before proceeding.

Since conducting this test, I have not revisited the experiment, yet I am optimistic about the advances that may have occurred in the interim. Looking forward, I can easily envision the development of these ‘fluid textbooks’ - dynamic, adaptive learning resources that evolve in real-time based on student interactions and progress. In this evolving educational landscape, communication and writing skills - particularly the ability to describe complex ideas clearly and conduct thorough research - will become increasingly critical. These skills will likely emerge as the most valuable competencies for students and professionals alike, essential for navigating and succeeding in a continually adapting educational environment.

Yeah, that’s what we’ve been doing and will continue doing. We’ve highlighted the ability to translate content that exists in an LMS (Moodle, but it can be used in other LMSs), including everything in what’s commonly referred to as ‘online homework systems.’ Translating the content into the student’s first language is not the only thing a teacher can do with the OERtist Tool’s AI. Revising the content to meet specific learning standards that vary from country to country, and state to state, is also very useful.

Some of us have actually been promoting the ability to revise OER in an LMS for a while - see Developing Professional Staff: For-Profit Involvement in OER - Part 3

AI is just pushing the envelope.


Thanks Kristy, and welcome here-- that’s very interesting to hear about the use of AI for a branching scenario activity. Was it used to create the activity, or was the interaction through an AI interface?

And we hope that the community space here lives up to the description in your profile! I am excited to see Manitoba represented.

Trust me Dan that I respect the innovative work you have been doing for a while to apply AI for translating and making OER available in languages like Dagbani but also making use of offline-first platforms like Moodlebox.

This is important work in creating OER as we have known it; a translation is important, yet it is another copy. What I was trying to probe, and likely did not write well, was how our approach might need to change if OER can exist not in as fixed a form as a course or a lesson, but as something more dynamic.

There is great value of course in a structured approach, especially for basic knowledge and early education, but might there be some element where the OER is not in a fixed format? This is just my wondering, not a suggested path, just trying to imagine what a learning environment experience might look like if not delivered in a book-like or LMS-like way.

  "That is Not What I Meant At All; That is Not It, At All."

Let us go then, you and I, through an analysis of our most recent experiences of teaching and or learning (because they’re so entwined) in an environment that used an OER course in an open source learning management system and include a translation or two or more.

Thank you, cogdog! Happy to be here :blush:

The development of the branching scenario ended up being a fascinating (and fun!) collaborative effort between myself and the AI. I wrote (and rewrote!) the outline for the thematic framework, scenario specifics, educational goals, character profiles, and operational parameters (the phrasing and parameters took a few iterations to successfully finalize). I instructed the AI to create 10 multiple-choice pathways aligned with the outlined learning objectives and scenario parameters, and to only launch the scenario once I had entered the prompt, “BEGIN”.

The AI then assumed “control”, generating the questions in the roles of the personas I had designated and communicating with me in their respective voices at each branch. This process removed the need to individually author every single branching option and response!

Upon completion of the scenario, I prompted the AI to assess my responses and assign a grade to me, which it did. I then prompted the AI to explain why I was awarded that grade, and it went through a breakdown of each of my chosen responses and how each had contributed to my overall grade.

This interactive experience took place exclusively within the large language model, reminiscent of the old text adventures characteristic of early King’s Quest games, but again this test was run about a year and a half ago and there have been many changes to the LLM I was using since then. I can’t wait to see what we are capable of if this technology is applied to H5P or Articulate Storyline! I hope this answered your question!
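For anyone curious to try something similar, here is a rough sketch of how a setup prompt like the one described above might be assembled before pasting it into an LLM chat. The persona names, objectives, and exact wording below are invented placeholders, not the actual prompt from this experiment; the general shape (personas in voice, aligned decision points, pause for input, grade at the end, a "BEGIN" trigger) follows the description in this thread.

```python
# A sketch of composing a single setup prompt for an LLM-run branching
# scenario: personas, learning objectives, pacing rules, and a start trigger.

def branching_scenario_prompt(personas, objectives, branches=10):
    """Compose a setup prompt for an interactive branching scenario."""
    lines = [
        "You will run an interactive branching scenario as a text adventure.",
        f"Create {branches} multiple-choice decision points aligned with "
        "these learning objectives:",
    ]
    lines += [f"- {obj}" for obj in objectives]
    lines.append("Speak in the voice of these personas at each branch:")
    lines += [f"- {name}: {voice}" for name, voice in personas.items()]
    lines += [
        "Pause after every decision point and wait for my choice.",
        "When the scenario ends, grade my responses and explain the grade.",
        'Do not start until I enter the prompt "BEGIN".',
    ]
    return "\n".join(lines)
```

The payoff, as noted above, is that the AI generates every branch and response in character, so none of them have to be individually authored in advance.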