Efforts to Regulate A.I.

I have not posted here in a while because of conferences, in-service training, travel, and other local elearning events. It was a whirlwind summer that included AI in Education, a conference at the University of Central Florida. We then presented with our newfound expertise at a nursing deans conference, and we are working on AI initiatives on campus, including a summer AI Faculty Institute. I have been attempting to take a measured approach, looking at the ethics, accessibility, data privacy, climate impacts, racism, etc. But that does not seem to be a popular take on all of this. I am interested in really examining it in order to push for a truly open and transparent version of these kinds of tools, geared towards education rather than profit.

There has been some movement on A.I. regulation, in the form of one step forward and then two steps backwards. UNESCO has had some interesting discussions, meetings, and publications, but I am waiting for something with more teeth. It is important, though, to get the principles we are talking about down. It would be ridiculous to give up on agreement altogether just because we may not agree on every one of the principles.

I am impressed with how fast Joe Biden got something together. The teeth in this one are pointing in the right direction: make the ethical principles a requirement for federal funding. Hit them in the pocketbook. It will take state governments years to get a statement down that will mean anything. This started last October with the Blueprint for an AI Bill of Rights (the rights are for us, not the AI).

The sad thing is that we have safeguards in place for student data (FERPA) and for accessibility, yet the onus for protecting students is on the schools and faculty – not the ed tech corporations. Again and again, ed tech companies build and sell products that end up selling student information to data brokers and/or turn out not to be accessible. Pretty much the same scenario is playing out here. Rishi Sunak hosted the international AI safety summit, and his big message was that AI needs to be regulated, but not at the expense of “innovation” (innovation = corporate profits).

Posted in AI

Open Pedagogy and Teaching with Zines

UK and US zines via Wikipedia.

Recently, just for artistic inspiration, I went to San Francisco for the Labor Day weekend. My wife and I used to live there in the early 90s, when we first graduated from college and got married. She worked her first “real” job at Pacific Gas and Transmission, and I was a substitute teacher, writer, and publisher of the humble zine “Notes from the Underworld.” It was a poetry zine, but one inspired by Dada, Surrealism, and the Fluxus movement.

Anyway, we go back now and again to hit some museums and galleries, see some friends, hang out in North Beach, and eat some really great food. This time, for no reason really, we decided to go to the SF Zine Fest. It was an amazing experience. I expected it to be a handful of tables, thinking that since the 90s the space for self-published DIY expression had been taken over by the internet and social media. I was so wrong: I was expecting maybe 50 vendors, but there were at least 200 there. I hear that the LA and Chicago zine fests draw that many as well. I love zines because of the artistic freedom they represent, the subversion of the publishing model, and just the raw human experience that goes into many of them.

At the Zine Fest, I ran into a couple of tables representing the work of students from a high school and a K-12 school. I thought that was fantastic: art teachers working with English teachers to create joint projects. One of the things that I am concerned about as an instructional designer is student ownership of the learning. This fits right into our work on open pedagogy, where students are not only demonstrating that they have met the outcomes of an assignment but also learning to own the media with which they are creating. This spirit should be in our approach to the internet as well. There is so much to unpack here: it is a totally different relationship to learning, information, and media.

At the Olympia Zine Fest here in Washington State.

We grow up in this country as passive consumers of media and that informs our education. The corporate textbook often defines the course. Assignments that involve creating, like blogging, making videos or podcasts, posters, zines, etc. put the creation and the engagement with media back in the hands of the students. When the students are creating, they are asked to interrogate their own choices about what images they use, what text they use, how their message is presented, what they leave in, what they leave out, whose voices are centered in the narrative – these questions become skills. They will also start to critically question media that they didn’t create. It is a part of media literacy and the process of critical thinking.

The readings I have listed below are interesting to me because of the wide-ranging examples and uses of zines: from student assignments to a teacher, Kate Ozment, turning her syllabus into a zine. This kind of work will be going into my instructional design work with open pedagogy and into my own teaching when I return to the classroom (English and ABE).

If there are other readings that you think belong here, or that I and others would benefit from, drop me a message or leave a comment below. If you are already using zines in your classroom, I would love to gather more examples.

Readings, References, and Explorations

Bakaitus, Elvis. (2019) Zines as Open Pedagogy. Open Pedagogy Notebook.

Brown, A., et al. (2021) Zines as Reflective Evaluation Within Interdisciplinary Learning Programmes. Frontiers in Education, v. 6.

Fields, Erin,  Alisauskas, A., and Taylor, J. (2020) Participatory Publishing: Zines as Open Pedagogy. UBC Library & Archives. University of British Columbia. Presentation.

Lonsdale, Chelsea (2015) “Engaging the ‘Othered’: Using Zines to Support Student Identities.” Language Arts Journal of Michigan: Vol. 30: Iss. 2, Article 4.

Orozco, Cynthia Mari. (n.d.) Informed Open Pedagogy and Information Literacy Instruction in Student-Authored Open Projects. Open Pedagogy Approaches.

Ozment, Kate (2020) “Making the Syllabus Zine.” Sammelband. Women in Book History.

Rallin, Aneil, and Ian Barnard. (2008) “The Politics of Persuasion versus the Construction of Alternative Communities: Zines in the Writing Classroom.” Reflections 7.3: 46-57.

Sahagian, Jacqui (2022) “Zine-making as Critical DH Pedagogy.” Scholar’s Lab. University of Virginia Library.

Scheper, J. (2023). Zine Pedagogies: Students as Critical Makers. Radical Teacher, 125.

Smith, Christopher. (2021) Zines: An Intro to Multidisciplinary Writing. Georgia Southern University. Book Chapter.

Stahura, Dawn. (2023) Teaching with Zines. SSU Library. Salem State University.

Teaching with Zines (n.d.) Barnard College.


Posted in education

AI and the End of Education

By the click-baity title “end of education,” I don’t mean the inevitable intellectual dystopian smoking hellscape of education that is to come, but the purpose of education: to what end are we doing what we do, and by what means do we wish to do it? What is the purpose of education? Even here in a technical college, our goal shouldn’t be just to get someone through a test, but to teach them how to learn. We are not just teaching someone how to repair a car, but how to analyze problems. The uncritical adoption of the latest technology (or idea, pedagogy, philosophy, etc.) does not do that.

“He [Plato] defined education as we would: as training of personality to absorb the greatest possible scope and intensity of meaning and value from experience.” – Kenneth Rexroth

I think that the question we need to ask is why we are in education in the first place. Why are we teaching? If it is purely transactional – if we are here only to provide student X with a certification for a job – then, yes, by all means, let’s get rid of teachers and replace them with AI or maybe a “smart” vending machine. But if it is more than that, then we should be taking the time to look closely at the tools we choose to use.

The idea that AI is going to make it easier to write a paper is like saying that a Xerox machine makes it easier to draw. The purpose of drawing is not just to easily reproduce what we see, but to bring our minds and bodies, our whole person, to bear on ideas in a tangible way, connecting us to the world around us. A camera can reproduce things more accurately, but even modern architectural drawings try to communicate the spirit of a place or the concept behind the building.

Learning how to think and how to solve problems creatively is not something to be off-loaded onto technology just because it is difficult. Thinking and creating are difficult – that difficulty is called “cognitive dissonance,” and the resolution of that dissonance is where learning takes place. This is not just the acquisition of information or the production of a paper: the knowing happens in the work.

To bowdlerize McLuhan, the tools we choose to shape the world also shape us, so it is critical that we understand what tool we are actually choosing. Proponents of tools like ChatGPT are already projecting human thought processes onto a Large Language Model that does not actually think, create, or decide. Educators need to account for the ethical costs of the tools we choose to use. There are problems with the lack of transparency at companies like OpenAI: we have no way to verify the data that the company uses to “train” ChatGPT because that information is proprietary. But I have touched on these issues elsewhere.

“The function of education, therefore, is to teach one to think intensively and to think critically. But education which stops with efficiency may prove the greatest menace to society. The most dangerous criminal may be the man gifted with reason, but with no morals.” – Martin Luther King, Jr.

I don’t object to a single thing about AI itself – it is just another tool, and tools have their uses. It is being used to make huge improvements in medical treatments, assist during natural disasters, and help us respond to climate change, just to name some innovations. What is objectionable is that corporations like OpenAI are treating education as just another marketplace and students as just another commodity. They also promote the view that education and assessment are just the production of another product. And institutions are going along with it all because they think they will miss out on “the latest thing”: if we don’t do this, then someone else will. Instead, I am hoping that we can create some spaces for critical thinking and analysis of AI and make decisions about how AI might fit into education based on something besides marketing.

Posted in AI

Clover Park Technical College is excited to host our second annual OER Faculty Institute!

Dates: Friday, August 11 and Saturday, August 12 
Time: 9 am to 3 pm 
Location: Zoom

There will be keynote presentations from internationally recognized OER professionals, as well as sessions on finding OER, accessibility in OER, open licenses, and open pedagogy, and a panel on OER and librarians.

Although some sessions are prof/tech focused, we opened the institute to all community and technical colleges to encourage wide participation and collaboration on OER initiatives.

There is no cost to attend, and it is held virtually via Zoom. We welcome your participation and look forward to seeing you at the institute! 

Please fill out the event registration form for the Zoom link, a follow-up email with links to the recordings and slides, and a copy of the program. 

Please forward widely and broadly to your colleagues and your communities in our Washington State CTC system. For more information, contact Geoff Cain at geoffrey.cain@cptc.edu

Posted in education

Brief Notes on Issues Around ChatGPT

Here are some quick notes on issues around ChatGPT:

  1. Lack of corporate transparency from the mis-named OpenAI (Silicon Republic)
  2. Ethical issues around labor practices with OpenAI (Time)
  3. Difficulties with attribution (Duke University)
  4. Unresolved copyright issues (Bloomberg Law)
  5. Spreading misinformation (NY Times)
  6. Biased and discriminatory responses (UNESCO)
  7. Privacy and ethical concerns (Wired)
  8. Produces poor writing that can teach bad habits (The Atlantic)
  9. A turn towards learning as product instead of a process (requiring thinking and creating)
  10. The carbon footprint is horrendous (MIT Review)
  11. We are not looking at open source alternatives (Tech Talks)
  12. There is no vetting beyond buying into the hype cycle
Posted in AI

OER Faculty Institute Extends CFP

The OER Faculty Institute CFP has been extended to June 16th.

Our Clover Park Technical College mascot, Simon.

Clover Park Technical College’s Teaching and Learning Center is excited to invite proposals for the 2023 OER Faculty Institute, to be held on August 11th and 12th, completely online. Our emphasis is professional and technical education, but faculty, librarians, students, instructional designers, administrators, and other OER advocates from all disciplines are encouraged to submit a proposal. Last year’s presentations saw representation from a wide variety of institutions and disciplines.

This no-cost, virtual faculty institute features keynote presentations from internationally recognized OER professionals, as well as breakout sessions on finding OER, accessibility in OER, and open licenses, plus panels on prof/tech OER opportunities and challenges and on OER and librarians. All sessions are held virtually via Zoom and will be recorded.

Submit your proposal by NEW DATE: June 16th!

If you missed last year’s OER Faculty Institute, we have the program with links to the presentations online and, if you are Clover Park faculty, these presentations are also in a Canvas course where you can get PDU credit for viewing them.

Posted in OER

Using Single Point Rubrics

I had a great conversation with an instructor today about rubrics. The instructor emailed me asking how to deal with a student’s repeated failure to meet the requirements of a project, and whether a rubric could progressively downgrade the student’s points in response. We are a Canvas college, so fortunately the rubrics in our LMS will not take negative points. Better still, this instructor had NOT yet created a rubric for the class, so we got to have a conversation about Single Point Rubrics.

In my own teaching, I used to use a generic, traditional rubric but I would have the students review and rewrite the rubric based on their goals for the course. I would then have the students use the rubric to review one another’s work and then they would complete the rubric and turn it in with their projects. The Single Point Rubric makes this process a lot easier.

A Single Point Rubric relies on your discussion of the paper with the student. In three columns, a teacher can express what works, what the goals of the assignment are, and what the student needs to do to grow. Assessment needs to be collaborative, and Single Point Rubrics are a way for students to take ownership of their learning. They remove vague or subjective concepts from rubrics (e.g. “compelling argument”) and help the students understand where they are as writers.

I sent him some links to more reading on the single point rubric.

I think that the single point rubric is a way to make assessment more transparent and inclusive. If you have used them, I would appreciate your take in the comments below, or join me for a discussion on Mastodon: @geoffcain@mastodon.social

Posted in assessment

The Environmental Impact of AI

Here is something from the MIT Technology Review that I don’t see talked about much:

“In a new paper, researchers at the University of Massachusetts, Amherst, performed a life cycle assessment for training several common large AI models. They found that the process can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself).”

We can add this to the list of ignored downsides to AI :-).

Posted in AI

Frances Bell: Guerilla Edtech

This is a must read from Frances Bell:

“Rather than taking a dystopian or utopian position, a ’rewilding’ is a more productive frame to address technology’s place within education (ibid). This is akin to other guerrilla projects such as gardening, wherein gardeners reintroduce nature to abandoned or neglected land as an act of protest or taking action (‘Guerrilla gardening’, 2023). Healing damage through unsolicited action. Exploring how convivial technology and the various ways ‘open’ can ameliorate the increasing negative impact on our environment is just one way in which we can stimulate conversations and discussions (Selwyn, 2023).”


Posted in education

Look Before We Leap: Evaluating Ed Tech

I saw the best minds of my generation destroyed by madness, hysterical from FOMO,
Dragging themselves through the education conference at dawn looking for the latest app…
Apologies to Allen Ginsberg.
Photo by Walter Lehman.

The willy-nilly embrace of AI in education, and ChatGPT in particular, has got me thinking again about the importance of rubrics for the evaluation of education technology. I think we are at a place where all of the efforts that have gone into the vetting of education technology are more important than ever. I am not a Luddite or a technophobe. I have been involved in education technology for more decades than I will admit. I am not “wringing my hands,” as many ed tech folks dismissively claim educators are doing about AI. But this is not my first rodeo. There are scores of tools, apps, programs, etc. that have come and gone over the years. A lot of this reminds me of the Web 2.0 flurry around 2008-10, when faculty and IDs embraced every tool as the latest thing despite the lack of accessibility and the coming spectre of “sustainability” – that is, the moment when we all figure out that corporations do not do anything for free. Many of the same people who were excoriating companies for flogging non-accessible software and apps ten years ago seem to be very excited about AI. Pages have been written on issues around ed tech and the corporate takeover of education, yet somehow all of that gets forgotten. I think it is important to think about why, and about what we should be doing instead.

So What?

There are a number of issues around using AI in education. I am not concerned about plagiarism, because that is an easily addressed instructional design issue, one I dealt with elsewhere in this blog even before anyone was worried about AI. There are other issues that I think are even more troubling.

DEI
One of the ironies of all the hubbub around ChatGPT is that it is happening during a much needed examination of how we approach diversity, equity, and inclusion in education. There have been a lot of serious steps taken; as just one example, I would point to the numerous rubrics that are available for analyzing a syllabus or a course for DEI issues. And yet it all stops when it comes to the latest thing in education. Large Language Model AI uses huge amounts of data scraped from an internet produced by a society that is endemically racist. I think this represents a huge step backwards.

Data Security
As reported in Cyber Security Hub, “Italy has made the decision to temporarily ban ChatGPT within the country due to concerns that it violates the General Data Protection Regulation (GDPR). GDPR is a law concerning data and data privacy which imposes security and privacy obligations on those operating within the European Union (EU) and the European Economic Area (EEA).” The Italian Data Protection Agency is concerned about all of the personal data that was used to “train” ChatGPT. It is unclear how OpenAI uses the personal data that ChatGPT was trained on, or what it does with the information it gleans from its use. Your email, your phone number, and information from the browser you use, combined with what can be gleaned from the kinds of questions being entered, provide ChatGPT with a wide array of actionable personal information that can be monetized.

Ethics
There are so many ethical issues here, starting with how ChatGPT, for instance, was trained using exploitative labor practices. But there are also unresolved copyright issues. It is all well and good that the courts seem to be saying that text, images, and audio created by AI are not copyrightable, but ethically, AI is not creating anything – it is exploiting the creative work of others. Eventually, this will have to be addressed. If I Xerox a poem or an artwork and say it is mine, I am wrong: it is a derivative work based on someone else’s creation. Once the courts understand how LLM AI works and what is really going on, we may get a different reading, but I am not holding my breath.

So Why Are We Doing This?

The real question for me is why ed tech and instructional design folks are diving into this headlong. Many are not just testing and evaluating; they are actively promoting it as a tool to do their job and coaching instructors in the “proper use” of a tool when we know there are serious problems with it. I would also add that very few people really understand ChatGPT well enough to make that recommendation. I hear instructional designers use all kinds of verbs around ChatGPT that are misleading and dangerous (it “thinks,” it “creates”).

Sloth?
Two-toed sloth from Costa Rica, via Wikipedia.

Maybe there is an allure to an “easy” way to do instructional design or to write a paper. Thinking and processing information is hard work – but I think it is work worth doing. If the cost of saving some time and effort is the reinforcement of an inherently racist (or at least substantially biased) program that was built using exploitative labor practices, then maybe it is just not worth it. I am not excited about thinking being off-loaded to an algorithm; doing the work should be part of the fun. The travel industry brags that it has been using canned ad copy for years. When was the last time you were excited by travel ad copy?

FOMO?
FOMO is the “Fear of Missing Out” – we had better do this before someone else does. No instructional designer or “futurist” wants to be left behind in the next ed tech revolution. Except it is not an ed tech revolution; it is a corporate raking-in of user information and cash, because none of the corporations want to miss out either.

Cutting Edge?
Max Headroom image.

This is somewhat related, but the idea here is that, as an instructional designer, unless you are using the latest version of WidgetWare3000, you are somehow less-than. There has to be room for analyzing what we are doing and why we are doing it. I think that being on the cutting edge should include a careful look at what we are recommending to faculty and why. I am not saying that there is NO role for AI in education. I am asking why there is no oversight on what these corporations are doing in education.

Toothlessness?
The toothless abominable snowman from Rudolph the Red-Nosed Reindeer.

And let’s face it: there is no real oversight. None of the recommendations, and few of the education technology purchasing rubrics used by IT departments, have any real consequences for anyone using whatever software a faculty or staff member decides to use. Is this really an academic freedom issue? One would hope it was part of a budget process, but then any vetting only applies to things that are purchased with that particular budget or purchased in a particular way. No one in education is willing to take responsibility until someone sues. We have made a lot of great inroads into accessibility, but we still have a long way to go. ChatGPT has the same problem with data security.

Greed?
Ed tech companies that engage in open-washing and routine copyright violations are thrilled with AI as courts determine that images and text generated by AI prompts are not subject to copyright, even though the text generated may have been derived from materials under copyright. So the open-washers are in the clear legally, even though ethically they are, as usual, neck deep in the steaming miasma of for-profit at any cost.

What Is To Be Done?

I am not asking anyone to abolish AI or anything like that. I am only asking that we follow up on efforts meant to vet education technology before we recommend its use in education. That’s it. We have VPATs for education technology, but those deal with accessibility: we need to include data security, among other things, as well.

Ed Tech Evaluation Rubrics: An Annotated Reading List

Ansley, Lauren & Watson, Gavan. (2018). A Rubric for Evaluating E-Learning Tools in Higher Education. Educause Review. An examination of the Rubric for eLearning Tool Evaluation. This rubric includes criteria for privacy, data protection, and rights. It also considers cost under the “Accessibility” criteria.

Digital Tool Evaluation Rubric. (n.d.). Mississippi Department of Education. This rubric is for K-12 instructors, and it is surprisingly comprehensive. It includes data security and “support for all learners.” I like the simplicity, but for our purposes, I think it needs more links out to explanations and research.

Sinkerson, Caroline, et al. (2014). RAIT: A Balanced Approach to Evaluating Educational Technologies. Educause Review. I like the general approach and the method used for evaluation. This method does not look at student privacy, accessibility, or DEI issues, but focuses solely on the instructor’s relationship to the technology.

TrustEd Apps™ Data Privacy Rubric (2018). 1EdTech. IMS Global. This is a narrow rubric meant to evaluate how an app handles user privacy. This is particularly important considering that ed tech companies may either directly sell their user data (dis-aggregated or no) or sell their data to companies who in turn sell it to data brokers. Why isn’t this in other ed tech evaluation rubrics?

Lee, C-Y., & Cherner, T. S. (2015). A comprehensive evaluation rubric for assessing instructional apps. Journal of Information Technology Education: Research, 14, 21-53. Significantly, this method includes “cultural sensitivity” as a criterion. See “Appendix A” of this article for a copy of their rubric.

Coming Soon…

ELC Equity and Tech Rubric (Draft). (2023).  E-Learning Council. Washington State Board for Community and Technical Colleges. This is a draft that the DEI workgroup from the ELC is working on. It covers a lot of the issues that are being discussed around AI in education. I think this is going to be a real game-changer in how we look at ed tech in WA state. The link will be live when they have finished. Watch this space…  

If you have other resources that you think might address these issues, feel free to put a link in the comments below.

Posted in AI