Using Single Point Rubrics

I had a great conversation with an instructor today about rubrics. The instructor emailed me asking how to deal with a student's repeated failure to meet the requirements of a project and whether a rubric could progressively downgrade the student's points in response. We are a Canvas college, so the rubrics in our LMS will not take negative points. Fortunately, this instructor had NOT yet created a rubric for the class, so we got to have a conversation about Single Point Rubrics.

In my own teaching, I used to use a generic, traditional rubric, but I would have the students review and rewrite it based on their goals for the course. The students would then use the rubric to review one another's work, complete it, and turn it in with their projects. The Single Point Rubric makes this whole process a lot easier.

A Single Point Rubric relies on your discussion of the paper with the student. In three columns, a teacher can express what works, what the goals of the assignment are, and what the student needs to do to grow. Assessment needs to be collaborative, and Single Point Rubrics are a way for students to take ownership of their learning. The format removes vague or subjective concepts from rubrics (e.g. "compelling argument") and helps students understand where they are as writers.
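
If you have never seen one, here is a rough sketch of what a single row might look like; the criterion is just an example, and the outer columns stay blank until the instructor (or a peer reviewer) fills them in with feedback:

  • Areas for growth: (blank – filled in as feedback)
  • Criterion (the goal): The essay supports its claim with evidence from at least three cited sources.
  • Evidence of exceeding the goal: (blank – filled in as feedback)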

I sent the instructor some links to more reading on the single point rubric.

I think that the single point rubric is a way to make assessment more transparent and inclusive. If you have used them, I would appreciate your take in the comments below or join me for a discussion in Mastodon: @geoffcain


The Environmental Impact of AI

Here is something from the MIT Technology Review that I don’t see talked about much:

“In a new paper, researchers at the University of Massachusetts, Amherst, performed a life cycle assessment for training several common large AI models. They found that the process can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself).”

We can add this to the list of ignored downsides of AI :-).


Frances Bell: Guerilla Edtech

This is a must-read from Frances Bell:

“Rather than taking a dystopian or utopian position, a ’rewilding’ is a more productive frame to address technology’s place within education (ibid). This is akin to other guerrilla projects such as gardening, wherein gardeners reintroduce nature to abandoned or neglected land as an act of protest or taking action (‘Guerrilla gardening’, 2023). Healing damage through unsolicited action. Exploring how convivial technology and the various ways ‘open’ can ameliorate the increasing negative impact on our environment is just one way in which we can stimulate conversations and discussions (Selwyn, 2023).”



Look Before We Leap: Evaluating Ed Tech

I saw the best minds of my generation destroyed by madness, hysterical from FOMO,
Dragging themselves through the education conference at dawn looking for the latest app…
Apologies to Allen Ginsberg.
Photo by Walter Lehman.

The willy-nilly embrace of AI in education, and ChatGPT in particular, has me thinking again about the importance of rubrics for evaluating education technology. I think we are at a place where all of the efforts that have gone into the vetting of education technology are more important than ever. I am not a Luddite or a technophobe. I have been involved in education technology for more decades than I will admit. I am not "wringing my hands," as many ed tech folks dismissively claim educators are doing about AI. But this is not my first rodeo. Scores of tools, apps, and programs have come and gone over the years. A lot of this reminds me of the Web 2.0 flurry around 2008-10, when faculty and IDs embraced every tool as the latest thing despite the lack of accessibility and the coming spectre of "sustainability" – that is, the moment when we all figured out that corporations do not do anything for free. Many of the same people who were excoriating companies for flogging non-accessible software and apps ten years ago seem to be very excited about AI. Pages have been written on issues around ed tech and the corporate take-over of education, yet somehow all of that gets forgotten. I think it is important to think about why that happens and what we should be doing instead.

So What?

There are a number of issues around using AI in education. I am not concerned about plagiarism, because that is an easily solved instructional design problem, one I wrote about elsewhere in this blog even before anyone was worried about AI. There are other issues that I think are far more troubling.

DEI
One of the ironies of all the hubbub around ChatGPT is that this is happening during a much-needed examination of how we approach diversity, equity, and inclusion in education. There have been a lot of serious steps taken; just as an example, I would point to the numerous rubrics available for analyzing a syllabus or a course for DEI issues. And yet all of that stops when it comes to the latest thing in education. Large language model AI is built on huge amounts of data scraped from an internet produced by a society that is endemically racist. I think this represents a huge step backwards.

Data Security
As reported in Cyber Security Hub, “Italy has made the decision to temporarily ban ChatGPT within the country due to concerns that it violates the General Data Protection Regulation (GDPR). GDPR is a law concerning data and data privacy which imposes security and privacy obligations on those operating within the European Union (EU) and the European Economic Area (EEA).” The Italian Data Protection Agency is concerned about all of the personal data that was used to “train” ChatGPT. It is unclear how OpenAI uses the personal data that ChatGPT was trained on and what it does with the information it gleans from its use. Your email, phone number, and browser information, combined with what the system can infer from the kinds of questions you enter, provide ChatGPT with a wide array of actionable personal information that can be monetized.

Ethics
There are many ethical issues here, starting with how ChatGPT, for instance, was trained using exploitative labor practices. But there are also unresolved copyright issues. It is all well and good that the courts seem to be saying that text, images, and audio created by AI are not copyrightable, but ethically, AI is not creating anything – it is exploiting the creative work of others. Eventually, this will have to be addressed. If I Xerox a poem or an artwork and say it is mine, I am wrong: it is a derivative work based on someone else's creation. Once the courts understand how LLM AI works and what is really going on, we may get a different reading, but I am not holding my breath.

So Why Are We Doing This?

The real question for me is why ed tech and instructional design folks are diving into this headlong. Many are not just testing and evaluating it; they are actively promoting it as a tool to do their job and coaching instructors in the "proper use" of a tool we know has serious problems. I would also add that very few people really understand ChatGPT well enough to make that recommendation. I hear instructional designers use all kinds of verbs around ChatGPT that are misleading and dangerous (it "thinks," it "creates").

Sloth?
Two-toed sloth from Costa Rica, via Wikipedia.
Maybe there is an allure to an "easy" way to do instructional design or to write a paper. Thinking and processing information is hard work – but I think it is work worth doing. If the cost of saving some time and effort is the reinforcement of an inherently racist (or at least substantially biased) program that was built using exploitative labor practices, then maybe it is just not worth it. I am not excited about thinking being off-loaded to an algorithm; doing the work should be part of the fun. The travel industry brags that it has been using canned ad copy for years. When was the last time you were excited by travel ad copy?

FOMO?
FOMO is the "fear of missing out" – we had better do this before someone else does. No instructional designer or "futurist" wants to be left behind in the next ed tech revolution. Except it is not an ed tech revolution; it is a corporate raking in of user information and cash, because none of the corporations want to miss out either.

Cutting Edge?
Max Headroom image.
This is somewhat related, but the idea here is that as an instructional designer, unless you are using the latest version of WidgetWare3000, you are somehow less than. There has to be room for analyzing what we are doing and why we are doing it. I think that being on the cutting edge should include a careful look at what we are recommending to faculty and why. I am not saying that there is NO role for AI in education. I am asking why there is no oversight on what these corporations are doing in education.

Toothlessness?
Toothless abominable snowman from Rudolph the Red-Nosed Reindeer.
And let's face it: there is no real oversight. None of the recommendations, and few of the education technology purchasing rubrics used by IT departments, carry any real consequences for anyone using whatever software a faculty or staff member decides to use. Is this really an academic freedom issue? One would hope it was part of a budget process, but then any vetting only applies to things purchased with that particular budget or purchased in a particular way. No one in education is willing to take responsibility until someone sues. We have made a lot of great in-roads on accessibility, but we still have a long way to go – and ChatGPT has the same problem with data security.

Greed?
Ed tech companies that engage in open-washing and routine copyright violations are thrilled with AI as courts determine that images and text generated by AI prompts are not subject to copyright, even though the text generated may have been derived from materials under copyright. So the open-washers are legally in the clear, even though ethically they are, as usual, neck deep in the steaming miasma of for-profit at any cost.

What Is To Be Done?

I am not asking anyone to abolish AI or anything like that. I am only asking that we follow up on the efforts meant to vet education technology before we recommend its use in education. That's it. We have VPATs for education technology, but those only deal with accessibility; we need to include data security, among other things, as well.

Ed Tech Evaluation Rubrics: An Annotated Reading List

Anstey, Lauren & Watson, Gavan. (2018). A Rubric for Evaluating E-Learning Tools in Higher Education. Educause Review. An examination of the Rubric for eLearning Tool Evaluation. This rubric includes criteria for privacy, data protection, and rights. It also considers cost under the "Accessibility" criterion.

Digital Tool Evaluation Rubric. (n.d.). Mississippi Department of Education. This rubric is for K-12 instructors, and it is surprisingly comprehensive. It includes data security and "support for all learners." I like the simplicity, but for our purposes, I think it needs more links out to explanations and research.

Sinkerson, Caroline, et al. (2014). RAIT: A Balanced Approach to Evaluating Educational Technologies. Educause Review. I like the general approach and the method used for evaluation, but it does not look at student privacy, accessibility, or DEI issues; it focuses solely on the instructor's relationship to the technology.

TrustEd Apps™ Data Privacy Rubric (2018). 1EdTech (IMS Global). This is a narrow rubric meant to evaluate how an app handles user privacy. This is particularly important considering that ed tech companies may either sell their user data directly (disaggregated or not) or sell it to companies that in turn sell it to data brokers. Why isn't this in other ed tech evaluation rubrics?

Lee, C-Y. & Cherner, T. S. (2015). A comprehensive evaluation rubric for assessing instructional apps. Journal of Information Technology Education: Research, 14, 21-53. Significantly, this method includes "cultural sensitivity" as a criterion. See "Appendix A" of this article for a copy of their rubric.

Coming Soon…

ELC Equity and Tech Rubric (Draft). (2023).  E-Learning Council. Washington State Board for Community and Technical Colleges. This is a draft that the DEI workgroup from the ELC is working on. It covers a lot of the issues that are being discussed around AI in education. I think this is going to be a real game-changer in how we look at ed tech in WA state. The link will be live when they have finished. Watch this space…  

If you have other resources that you think might address these issues, feel free to put a link in the comments below.


OER Faculty Institute: Call for Proposals 

Clover Park Technical College's Teaching and Learning Center is excited to invite proposals for the 2023 OER Faculty Institute, to be held completely online on August 11th and 12th. Our emphasis is professional and technical education, but faculty, librarians, students, instructional designers, administrators, and other OER advocates from all disciplines are encouraged to submit a proposal. Last year's presentations saw representation from a wide variety of institutions and disciplines.

This no-cost, virtual faculty institute features keynote presentations from internationally recognized professional/technical colleges as well as breakout sessions on finding OER, accessibility in OER, and open licenses, along with panels on prof/tech OER opportunities and challenges and on OER and librarians. All sessions are held virtually via Zoom and will be recorded.

Submit your proposal by June 5th.

If you missed last year’s OER Faculty Institute, we have the program with links to the presentations online and, if you are Clover Park faculty, these presentations are also in a Canvas course where you can get PDU credit for viewing them.



Promoting Authentic Learning in the Age of AI

Brain-and-machine image from https://freesvg.org/artificial-intelligence.

As an instructional designer and former English teacher, I have mixed feelings about using A.I. to produce school work. If there is a learning outcome that is legitimately met by a particular tool (including AI), then fine – let's include that. Apart from that, the actual definition of plagiarism is the practice of taking someone else's work or ideas and passing them off as one's own, which is what things like ChatGPT actually do. But this has been a common problem throughout the history of education. How do we mitigate these issues? Here are some suggestions:

  1. Teach learning as a process. For instance, require students to turn in the various stages of their paper – outline, concept maps, discovery draft, bibliography, etc. then the assessment becomes a record of the students’ learning process rather than the production of a paper.
  2. Dedicate a class session to academic integrity. This lets the students know that you value academic honesty. Make sure they get a copy of the school’s academic honesty policy. Some even go so far as having the students sign a statement that says that they read and understand the policy. Often if students know what the expectations are, they will meet or exceed them.
  3. Ask deeper questions. If the answers the students are giving you are coming from Wikipedia, AI, Course Hero, or other cheating sites, then you are asking the wrong questions. Take a look at Bloom's Taxonomy and think about what kind of work you are asking your students to do. Use a constructivist approach and ask students to apply new information to their past experiences and prior knowledge: students learn best when they can connect what they are learning to what they already know.
  4. Scaffold your assignments so the solutions to the next problem are based on what they learned solving previous problems.
  5. Know your students. Get to know your students by asking them to keep a journal, or begin the day with a short in-class writing assignment. If you know how they express themselves and what level their writing is at, any sudden change is a red flag for issues in a student's work.
  6. Let your students connect. Have the students work together on their revisions or projects. Have them discuss their research in groups. This encourages students to bring their research to their audience. Also, this allows students to see how other students tackle assignments and research problems. Students will turn to tools like AI when they feel they do not have the support necessary to do their work. As an instructor, you have the opportunity to create community in your classroom to provide that support.
  7. Use project-based learning. Project-based learning asks students to actively engage in “real-world and personally meaningful projects.” Often, students turn to short-cuts in learning because they do not feel ownership of their learning. Create a space in the curriculum for students to creatively apply what is important to them.
  8. Make it relevant. Don’t just ask students to take a position on a topic and argue for it. Require that they use a few current news sources in their paper. I love teachers who complain about reading the same death penalty papers year after year. Change the assignment! Ask them to bring in a topic from the headlines. Have them find out what journalists or writers are covering their topic.
  9. Model academic integrity. Some of the same teachers who get worked up about plagiarism will sometimes be the same teachers who do not cite sources in their own class materials. Show students how it is done and why early and often. Start with that favorite quote or picture you put in your syllabus.
  10. Explore Open Pedagogy and student-driven curricula. Open pedagogy is the deepest form of student-driven curricula. According to Mavs Open Press, open pedagogy is a form of experiential learning in which students demonstrate understanding through the act of creating content. Students can demonstrate their understanding of a topic by creating assignments. Who knows? Maybe the decisions your students make will include AI, but create a space in your class to talk about the implications and issues around AI in education.

The most important thing is that we do not let the tools define education; define education first and leverage the appropriate tools to support that definition. We should not let AI become the next learning management system, where the tool defines the teaching. If you have other ideas for promoting academic integrity in the Age of AI, I would love to hear from you in the comments!


Mastodon and a Sense of Community

Detail from Mastodon by Heinrich Harder.

This is just a quick update on my experiences moving from Twitter to Mastodon. I am very happy with my move out of the Twitter dumpster fire. At first, it was a real shock pulling myself out of the dozen or so communities in Twitter. I was not too happy in Twitter, though: too much corporate open-washing, hate speech, “experts” with something to sell, etc., and I found that the “communities” were so big that the commercial interests wound up having the biggest voice. I was finding it harder to “hear” the voices I came there to hear, and it was too unwieldy to really engage in any meaningful way. At best, it would connect me to spaces where one could interact with others. All the reasons I was staying sounded lame and a little Stockholm-Syndrome-ish. The one thing I am finding in Mastodon that I was not finding in Twitter is intention. The folks I am connecting with in Mastodon really want community and more depth to their engagement. I would trade a dozen of those encounters for a thousand forgotten Twitter followers any day. Also, the Mastodon folks seem to be reviving the blogosphere. It feels like we are taking the conversation around education back!


Visual Learning and the Graphic Syllabus in Five Questions

Robert Fludd's vision of knowledge.
I am working with a few faculty on campus who are looking to adopt a graphic syllabus. This brief posting is basically a record of that conversation for online reference. Those who have read this blog for a while know that I have done a lot of work over the years in visual learning and concept maps, and I think that a graphic syllabus is a great way to start the semester. When I hear faculty say "my syllabus is my contract with the students," I find that the syllabus often looks like a contract. The students immediately tense up and flip through it looking for the button to press to agree to the terms. At the very least, I ask faculty to put their picture on the syllabus; for the more adventurous, I will ask them to use the school colors and the mascot. This at least connects the syllabus to the college.

Why Can’t We Have a Little Color?

That was the best question I got from an instructor who is concerned about the message our learning materials send to new students. How much more welcoming can we make the syllabus? We are doing a lot with the syllabus at Clover Park Technical College. We are exploring ways to change how we think of the syllabus and the language we use in it to address diversity, equity, and inclusion issues. This, of course, is long overdue. The culture that created the syllabus, and the students who read them (when they do), have long since changed. We are also attending to things like accessibility and student support. The syllabus is an opportunity to connect students to everything they need to be successful in a class and in college. So the interest in a graphic syllabus is in alignment with changes that we have been wanting to make anyway.

What Story Does Your Syllabus Tell?

A syllabus will be remembered if it tells a story or is invested in a metaphor (a "treasure map," for instance). We use visual metaphors as mnemonic devices. All of the reasons we use concept maps also explain why we should be using a graphic syllabus – it helps students order and remember the information.

According to the University of Texas at Austin’s CTL, the graphic syllabus can help you rethink the story your syllabus communicates to your students. In addition to focusing on the big picture, this is your chance to make the text you do use even more meaningful.

  • Created effectively, a graphic syllabus tells a story not only about your course, but about you, your enthusiasm for the course, and your expectations for the students.
  • Use language that conveys a sense of support for students’ well-being, including information on relevant support resources.
  • Think about the tone of your syllabus, the rationale you provide for assignments and policies, and how you encourage enthusiasm for the material.

Is It Accessible?

Attending to accessibility is not too difficult here. Often, a graphic syllabus is based on a previous, traditional syllabus. All we need to do is make sure that the text version is accessible (it uses heading styles, tables have header rows, and so on). Near the top of the graphic syllabus there should be a link to the plain-text version.

What Tools Should We Use?

One could use any number of tools. I would use whichever tools you are most comfortable with, including:

  • Google (Docs, Slides, etc.)
  • MS Word
  • PowerPoint – this is a favorite of some faculty because they know how it works and you can save your work as a PDF.
  • Piktochart
  • Canva
  • And many others.

Can You Send Me Some Examples?

Renaissance memorization technique represented as a table of information.

  • Cambrian College put up some templates that we are interested in because a lot of our faculty are comfortable (or at least more comfortable) using PowerPoint than Adobe products or learning a new platform.
  • Chem 100 syllabus reproduced at Memorial University. I like this because a timeline is an element in the graphic.
  • The syllabus as a concept map is highlighted here at the University of Rhode Island. This is how I commonly represented my courses graphically.
  • Here is the infographic syllabus of an infographic course!
  • This example discusses how to use the online tool Canva to build out a syllabus.

If you have favorite examples of graphic syllabi, post them in the comments below or feel free to drop me a note in Mastodon.


Future Post: Dismantling A.I.

The HAL 9000 from 2001: A Space Odyssey (Wikipedia).

This is a “guest post” by ChatGPT. I asked “it” to write an essay on how to dismantle the stranglehold AI has on society, just in case it is needed in the future. I was relieved to learn that I do not have to go back in time and kill Sarah Connor. I was disappointed that it seems to think that A.I. is integral and inevitable. I am giving it a “C-” for not fully addressing the topic. It will be the last word on AI on this blog for a while, unless some earth-shattering advance happens. We have solved these kinds of problems before. Until then, let’s bookmark some of these ideas:

Artificial intelligence (AI) has become an integral part of our society, with many industries and organizations relying on it to automate processes and make decisions. However, as AI becomes more prevalent, it is also becoming clear that it can be harmful if not properly regulated. To dismantle the stranglehold that AI has on our society, several steps can be taken.

Firstly, it is important to ensure that the development and use of AI is transparent and accountable. This means that organizations should be required to disclose the algorithms and data sets used in their AI systems and be held responsible for any negative impacts caused by their use. This would allow for greater oversight of AI systems and ensure that they are being used ethically and responsibly.

Secondly, it is essential to ensure that AI is inclusive and equitable. This means that AI systems should be developed and deployed with the needs of marginalized and underrepresented communities in mind. It is also important to ensure that AI is not perpetuating or exacerbating existing biases and inequalities. To achieve this, there needs to be more diverse representation in the development of AI, and also a continuous monitoring and evaluation of AI systems to ensure they are not biased.

Thirdly, it is crucial to invest in research and development of explainable AI. Explainable AI is a form of AI that can be understood and interpreted by humans. As AI becomes more advanced, it is increasingly difficult for humans to understand the reasoning behind its decisions. This can lead to a lack of trust in AI systems, which is harmful to society. By investing in explainable AI, we can ensure that AI systems are transparent and can be held accountable for their decisions.

Finally, it is important to establish a regulatory framework for AI. A regulatory framework would establish guidelines and standards for the development and use of AI, ensuring that it is used ethically and responsibly. This could include regulations for data collection and sharing, as well as guidelines for the design and use of AI systems.

In conclusion, to dismantle the stranglehold that AI has on our society, it is essential to ensure that AI is transparent and accountable, inclusive and equitable, and explainable. Investment in research and development of explainable AI and the establishment of a regulatory framework for AI would be necessary to achieve this. By taking these steps, we can ensure that AI is used in a way that benefits society and does not harm it.


Hypothes.is: Annotation Starter Assignments

I was in a great workshop today from the Hypothesis Customer Success Team on Starter Assignments to get students used to annotating. Clover Park Technical College is a partner campus so we have access to workshops, a campus customer support contact, and other resources. I have been to some of these workshops before and they have always been concise, interesting, and very useful. The presentation is rich with assignment instructions and links to other resources. The workshop went through Hypothesis basics, using starter assignments, and then how to implement this in Canvas (80% of the participants were Canvas users).

I have used social annotation in some of my professional development courses with faculty and elsewhere, but I want to reshape those assignments to show faculty how to get their own students engaged in their readings. Here are some of the things I learned in the workshop that I would implement in my prof dev courses:

  • Start by annotating the syllabus. This gives students a low-stakes experience using the annotation tools and gets them thinking about the syllabus in general.
  • Faculty can pre-annotate articles in advance, including questions about the reading that students can reply to in their own annotations. This models annotation for students.
  • Open the readings for student questions, resources, and connections. Encourage student co-creation of knowledge. Have them link what they are learning to their workplace, home, other classes, or any previous learning.
  • Have students annotate study guides by adding links to other media, providing clarifications, and asking questions. Students can add videos or images to annotations, making the readings a multimodal experience.

If you have starter assignments that you use to get your students started with social annotation, feel free to comment below or annotate this blog posting!
