So much fire and hands, weird hands.
AI-yi-yi
- Dr. Medulla
- Atheistic Epileptic
- Posts: 116000
- Joined: 15 Jun 2008, 2:00pm
- Location: Straight Banana, Idaho
Re: AI-yi-yi
Where it's at after a year of the AI beast on campus: https://archive.ph/8iDNJ
"I never doubted myself for a minute for I knew that my monkey-strong bowels were girded with strength, like the loins of a dragon ribboned with fat and the opulence of buffalo dung." - Richard Nixon, Checkers Speech, abandoned early draft
Re: AI-yi-yi
Hello,

Dr. Medulla wrote: ↑17 May 2023, 8:05am
Where it's at after a year of the AI beast on campus: https://archive.ph/8iDNJ

I think the point about teachers having to adapt is most relevant for my situation. Most of the lab reports involve specifics of the subjects and results, so I'm not as worried there. Writing prompts in my other classes have been and will continue to be very specific. All other writing involves critiques of specific articles. I think I'm pretty safe from ChatGPT. I can see students using Wordtune rather than ChatGPT - the real danger there is becoming totally dependent on it to polish everything.

Adaptation will be crucial. Provide unique writing prompts, ask specific questions, etc.
- Dr. Medulla
Re: AI-yi-yi
gkbill wrote: ↑17 May 2023, 10:43am
Adaptation will be crucial. Provide unique writing prompts, ask specific questions, etc.

Since most of my writing assignments are book reviews—term papers, I’ve come to believe, aren’t an especially useful pedagogical exercise—and I take great pains in explaining how a book review should be written, right down to a format I expect them to follow until they become skilled enough to go jazz, AI isn’t a huge threat … yet. But if I do have a class where a research paper makes sense, I might build an interview component into it. Which, really, isn’t a bad thing either, as it promotes more human connection between teacher, student, and work. As terrible as AI is overall, it’s a good thing if we have to be more conscious about our methods and their rationale. “But that’s the way I’ve always done it” isn’t a good argument.
Re: AI-yi-yi
Hello,

Dr. Medulla wrote: ↑17 May 2023, 10:57am
But if I do have a class where a research paper makes sense, I might build into it all an interview component.

If you assign a research paper, you could require students to include a few (5?) direct text citations. I don't think ChatGPT will do that - although I'm not sure. Oral exams would be best, but they would be too time-intensive. I assign an essay question in my Psychology of Sport and Exercise undergrad class: I give them the question in the class prior to the exam and tell them they can write up an answer but bring no notes to class - they need "walking around knowledge," studying their answer sufficiently to reproduce it verbatim. If they learn the material from my class notes or from AI, so be it - they've learned the material. It's more about application anyway.
- Dr. Medulla
Re: AI-yi-yi
gkbill wrote: ↑17 May 2023, 11:47am
If you assign a research paper, you could require students include a few (5?) direct text citations. I don't think ChatGPT will do that - although I'm not sure.

With any history term paper, there's an expectation of some primary research, which AI, I suspect, isn't up to. That helps a great deal.

And you're right about learning stuff one way or another. Even a textbook lecture course, where you can get by without going to class if you follow along in the text, generates some degree of education, tho it raises the question of what the instructor is even doing there.

The biggest thing I hate about all this, and I'm sure I've mentioned it before, is that it pushes me into a default position of "students are going to cheat and I have to compensate for that." Yes, some students have always cheated, but they were a distinct minority. My fear is that we'll come to expect that a too-many-to-ignore number will cheat. And I hate sacrificing the trust that I go into any class with.
Re: AI-yi-yi
Hello,

Dr. Medulla wrote: ↑17 May 2023, 12:05pm
Yes, some students have always cheated, but they were a distinct minority. My fear is that we come to expect a too-many-to-ignore number will cheat.

I have my exams on Canvas. About 80% of each exam is short answer, fill-in-the-blank, and multiple choice. A colleague is very concerned about cheating on Canvas exams and expresses this to his classes. If you look at the grades in my classes, the Canvas scores on the short answer, etc. are very average - very similar to pre-Canvas exams. I'm amazed at this. I couldn't prove kids are cheating - there's no statistical evidence. I'd bet there are one or two who have peeked at something during an exam, but by and large it seems most kids don't cheat. My colleague does have problems, and I sometimes wonder if it's because he brings up cheating - almost like "I'm expecting it, so don't let me down."
- Dr. Medulla
Re: AI-yi-yi
That's an interesting issue, chiefly because a colleague who is nearing retirement is constantly grumbling about students and the quality of their work, etc. etc. And, sure, there are always human slugs and sloths, but I'm always impressed by the insights and efforts of so many. I sometimes get a kind of retro-envy when I see someone figure stuff out at 21 where I wouldn't have, or write so beautifully, so much more coherently than I did. Doubtless my colleague and I are seeing what we want to see—if I'm a bit of a grumbler by nature, when it comes to teaching I'm stupidly optimistic and cheerful—but how much does our approach in the classroom, our feedback, and all that feed into the results of the work? Not that I'm suggesting I'm somehow willing people to be better, but I would like to think my dorky enthusiasm does make people want to try a bit harder.
- Dr. Medulla
Re: AI-yi-yi
So, The Big Bang Theory was written by robots all along?
https://www.theguardian.com/tv-and-radi ... o-write-tv
- coffeepotman
- Graffiti Bandit Pioneer
- Posts: 1488
- Joined: 23 Jun 2008, 1:51pm
Re: AI-yi-yi
OK, this is just WRONG
- Dr. Medulla
Re: AI-yi-yi
*shudder* What we really need is AI giving us Paul doing every Clash song!
Re: AI-yi-yi
Listening to it now - it just doesn't sound right to my ears. This AI stuff is pretty scary.
Just came across this one as well
God, what a mess, on the ladder of success
Where you take one step and miss the whole first rung
Re: AI-yi-yi
Fuck this shit.
"Suck our Earth dick, Martians!" —Doc
Re: AI-yi-yi
*screams into a pillow*
- Silent Majority
- Singer-Songwriter Nancy
- Posts: 18702
- Joined: 10 Nov 2008, 8:28pm
- Location: South Londoner in the Midlands.
Re: AI-yi-yi
Fucking hell, it hurts when it's close to home. I can't take part in a culture that thinks this is clever or cute.