A “More Jobs Are Being Performed By Robots and We Don’t Even Realize It” Question of the Day
How would you feel if you learned your teacher was actually an artificial intelligence?

Today in “jobs now performed by robots”: The Wall Street Journal reports that the Georgia Institute of Technology successfully convinced 300 online students that an artificial intelligence was, in fact, a human teaching assistant.
Imagine Discovering That Your Teaching Assistant Really Is a Robot
Since January, “Jill,” as she was known to the artificial-intelligence class, had been helping graduate students design programs that allow computers to solve certain problems, like choosing an image to complete a logical sequence.
“She was the person — well, the teaching assistant — who would remind us of due dates and post questions in the middle of the week to spark conversations,” said student Jennifer Gavin.
Ms. Watson — so named because she’s powered by International Business Machines Corp.’s Watson analytics system — wrote things like “Yep!” and “we’d love to,” speaking on behalf of her fellow TAs, in the online forum where students discussed coursework and submitted projects.
“It seemed very much like a normal conversation with a human being,” Ms. Gavin said.
Shreyas Vidyarthi, another student, ascribed human attributes to the TA — imagining her as a friendly Caucasian 20-something on her way to a Ph.D.
There’s so much I want to write about this. The part where they chose “Yep!” with the exclamation point instead of “Yes,” “Yes!” “Yup,” “yesssssss,” or the simple “y” says so much about how online written language communicates both hierarchy and humanness, as well as dozens of other tiny social and demographic cues. And thanks to social media, we have a huge body of casual online written language to study, so we know exactly who, demographically, chooses “Yes” vs. “Yep!” and can program our intelligences accordingly.
But this is The Billfold, so I need to focus on the money aspect: this artificial intelligence can do a lot of the tasks that a human teacher can do, including prompting class discussion. Although the WSJ notes that “Jill” worked alongside human TAs who could help pick up her slack, this does put us one step closer to the “all jobs are held by robots now” world, or at least one step closer to the “graduate students’ stipends get cut because robots can handle basic TA duties” world.
It also raises a really interesting question about human motivation, and jumps us back to that Question of the Day I asked a few weeks ago, about robots and manners.
A Robot Etiquette Question of the Day
The Georgia Institute of Technology students didn’t know that they were chatting with an AI. If they had known, would their behavior have changed?
So much of what I do, for example, is motivated by the fact that there are humans—real people—reading and responding. (There are also robots, occasionally. I am aware of that.) When I was in school, I’d do the best work I could because I both wanted to please a human and was often inspired by that human.
As I’ve written before, I do not care about pleasing robots. My life has enough emotional labor in it already. I stopped saying Siri’s name once I figured out you didn’t have to.
When a human teacher sends over a discussion prompt, I’m motivated in part by the fact that the human teacher is observing my responses. I can’t see myself feeling the same way if I knew the prompt came from an AI. There’s a certain motivation that comes from participating in a learning community with other students, but even that is often led by a person, and I can easily imagine an entire group of students having the same unmotivated response to a robot-led discussion. Instead of knowing you can get something more out of the collective response if you work together, you know, instinctively, that there is a hard limit on what the AI can give back. So why not just give the minimum effort required to get the robot to say “Yep! Good job!”?
So, our questions of the day:
- Can you really tell when you’re chatting with an artificial intelligence?
- If you can, does it change your behavior?
- What’s your gut response to Jill, the AI teaching assistant? Is it closer to “wow, they created an AI that both taught and fooled 300 students” or “oh no, the robots really will take all our jobs”?
Support The Billfold
The Billfold continues to exist thanks to support from our readers. Help us continue to do our work by making a monthly pledge on Patreon or a one-time-only contribution through PayPal.