Art lesson resource: AI for Workstream

Screenshot 2024-04-14 at 19.16.48

We hear a lot about Artificial Intelligence at the moment, mainly worrying and negative stories. There are plenty of positives, though, and I have recently been involved in an evaluation project that I can outline here. It was based on using OpenAI’s ChatGPT-4 (the paid-for version) to enhance workflow, support day-to-day workstreams and make teaching more productive. The evaluation was on behalf of ImpactEd, who will be publishing a wide-ranging report later this year; I hope to link this post to it in future.

The Project

You may have seen some of the scare stories in the media, such as the recent CNN report Teachers are using AI to grade essays (here, 6th April) or Education Week’s Teachers Told Us They’ve Used AI in the Classroom (here, 5th January). The Guardian, meanwhile, has swung from It’s an educational revolution (here, 17th April 2019) to AI to spell the end of the traditional school classroom (here, 7th July 2023). I have even seen an article, which I can no longer find, saying something like “Teachers evaluating use of AI!” Ha, but of course we are. The tools are available and should be explored, both to see any potential benefits and to understand how they are already being used by students.

ImpactEd, in partnership with OpenAI, organised an evaluation of the latest version of ChatGPT-4 and how it might help teachers become more productive with their time. This was prompted by the many surveys suggesting that the majority of teachers are working 60-hour weeks, with burdensome administrative tasks and planning that go beyond the expectations set by school leaders and our Department for Education; recent union action has centred on pay that reflects the hours spent trying to keep on top of it all. Initially, teachers from STEM and MFL backgrounds were invited to apply to take part; I petitioned to join as a cross-curricular participant on the strength of my previous tech experience. I believe around 16 teachers nationally were part of the eight-week project, which involved exploratory tasks, collaborative group discussions and round-ups of successes and issues arising, followed by a summative survey evaluating use and potential.

Inspiration

You may be wondering what ChatGPT-4 can do and how it differs from the free version. The free version of ChatGPT cannot access the internet; the paid-for version can, and so can provide more accurate, up-to-date answers. OpenAI has built a lot of extra features into the paid-for version, including a library of custom GPTs (generative pre-trained transformers) designed for specific tasks. One integrates faster internet search into GPT-4, which means the chatbot can provide current information on recent events. Another integrates DALL·E for image generation. There are hundreds of these GPTs: some work with data, others produce programming code, and there are even a few tattoo generators! You can upload a scheme of work and ask the AI to generate a multiple-choice end-of-unit quiz. There are UX designers, writing coaches, travel guides and even assistants for medical diagnosis. You can upload files (PDFs, Word documents, PowerPoints and diagrams) and ask the GPT for a single-paragraph summary, or even to transform the text from one format into another.

Method

I used a variety of GPTs as part of my evaluation. For instance, I uploaded the PowerPoint slides for a PSHCE lesson I was teaching on Vietnamisation to Year 10 and asked the AI to produce an ideal answer to the written task: one that would include all the salient points, give correct answers and use the linguistic style of a 15-year-old UK student. I then reframed the request to keep the summary to 300 words, so that I could share a model answer with which to check students’ understanding.
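For the project itself I worked entirely in the ChatGPT-4 web interface and uploaded the slides directly, but for anyone comfortable with a little Python, a rough sketch of the same request through the OpenAI API might look something like this. The model name, the prompts and the idea of exporting the slide text to a plain-text file first are all assumptions for illustration, not part of what I actually did.

```python
# Rough sketch only: the real workflow used the ChatGPT-4 web interface with an
# uploaded PowerPoint. Here the slide text is assumed to have been exported to a
# plain-text file, and "gpt-4" is a placeholder model name.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Hypothetical plain-text export of the lesson slides
slide_text = open("vietnamisation_slides.txt").read()

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": (
                "You write model answers for UK secondary school students. "
                "Use the linguistic style of a 15-year-old UK student and "
                "keep the answer to roughly 300 words."
            ),
        },
        {
            "role": "user",
            "content": (
                "Using only the lesson content below, write an ideal answer to the "
                "written task on Vietnamisation, covering all the salient points:\n\n"
                + slide_text
            ),
        },
    ],
)

# Print the model answer so it can be copied into the lesson resources
print(response.choices[0].message.content)
```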

Screenshot 2024-04-14 at 19.07.24

I used image generators to create responses to themes alongside students: they provided the key words and I input them. I did this as a control over what the AI would use as starting points, and I intend to credit ChatGPT-4 as a source in the students’ work. I used DALL·E and a few different tattoo generators for this, with student direction.

Screenshot 2024-04-14 at 19.09.16

I used a Canva GPT to create graphics for posters and displays, then tweaked the outcomes in Photoshop to personalise them.

Screenshot 2024-04-14 at 19.10.12

I uploaded the GCSE specification (as a PDF) and asked for assessment summaries based on specific art projects, pitched at grades 3, 4, 5 and 6, to create a writing framework for student assessment.
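Again, I did this by uploading the specification PDF within the ChatGPT-4 interface, but a scripted equivalent could be sketched roughly as below. Extracting the PDF text locally with pypdf, the "portrait project" example, the file name and the exact prompt wording are all assumptions for illustration.

```python
# Rough sketch only: the project used the ChatGPT-4 web interface with the GCSE
# specification uploaded as a PDF. Here the PDF text is extracted locally and
# sent as part of the prompt; "gpt-4" and the file name are placeholders.
from openai import OpenAI
from pypdf import PdfReader

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Pull the text out of the specification PDF (hypothetical file name)
spec_text = "\n".join(
    page.extract_text() or "" for page in PdfReader("gcse_art_spec.pdf").pages
)

prompt = (
    "Using the GCSE Art and Design specification below, write short assessment "
    "summaries for a portrait project, pitched separately at grades 3, 4, 5 and 6, "
    "so they can be used as a writing framework for student feedback.\n\n"
    + spec_text
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)

# Print the graded summaries for checking and personalising by hand
print(response.choices[0].message.content)
```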

Feedback

As you can imagine, this technology hasn’t been around long enough to be perfect, and it is only by using it that we can tell what works well and what needs improving. The first version of my model answer was produced in a highly academic style, and as well as drawing answers from the slides, further clarification was pulled in from the internet! When I asked for a rephrasing in the style of a 15-year-old, I got some clichéd Californian “like uh, gee” type answers. The third version was pretty much perfect and convincing for my students.

Image generation has some faults too; glasses and anatomy often become floating or disjointed shapes (at times with too many fingers). The style can be tweaked into something photorealistic or even painterly; my preference is for a woodcut style.

Posters and colour schemes come out well; I really like a retro 60s futuristic outcome.

The writing frameworks are excellent as a starting point but incredibly vague, entirely because they are summaries of sections of assessment text that is itself subjective. I then personalised them with a few specific sentences of my own for each student.

In terms of improving a workstream of tasks to get things done more productively, I found that creating model answers saved a lot of preparation time for PSHCE lessons, which are never really the highest priority in my planning (probably because I enjoy prepping art lessons more).

For image generators, the quality was far higher than most of my students could produce, but it gave them a visual reference they could then use to explore ideas themselves. Although this didn’t do much for time constraints, it was a short-cut of sorts that helped them create something a little more original. I do wonder: if similar word prompts are used many times, will the AI create very similar images and thereby become a lot less original? As a tool to assist unimaginative students it was useful, albeit limited to the same kinds of glossy reproductions. With the right tweaks and then further embellishment in image-editing software, I think image GPTs will soon become a staple within the software itself. Photoshop’s AI generator, added this year, can do some amazing things!

Finally, using AI as part of assessment feedback was incredibly productive. Asking for alternate rephrasings of grade descriptors created a varied and individualised feedback framework that only needed a few sentences of advice for improvement for each student. I never used copy-and-paste for this anyway; by handwriting the feedback I rephrased and personalised it in my head. I managed to do two classes’ worth of feedback in just a few hours. A bit like the automated prompts when texting, you can’t always use the suggestions as they are; they still need human modes of speech. I’m sure this aspect will improve in future versions.