Today I was watching another Dall-E 2 video on YouTube. I usually leave these videos pretty impressed, but this time something stuck. If you're not up for spending 20 minutes watching it, the summary is simple: the team all found Dall-E 2 pretty cool and wanted to see how their human counterpart (aka their illustrator, Tim) would fare against it.
Right in the middle of it, I noticed my reaction diverging significantly from the video's intended tone. Sure, it does sound like an entertaining, funny challenge, but for those of us aiming to build a career on content creation, it is easy to find it dreadful.
If you pay enough attention during the video, you will notice that Tim looks pretty distressed throughout the challenge. His job clearly wasn’t on the line during this experiment, and his coworkers seem like nice enough folks to have reassured him along the way. But putting myself in his position, I don’t think any words of reassurance would make a difference if I couldn’t defeat this ungodly AI.
And I don’t think it’s difficult to see why. Selling our craft to a company is something on which a large part of our identity relies. The value we offer grants us a stable income and a secure position in the hierarchy of the company (and of society). But this value is constantly reevaluated whenever the scenario changes. And an AI able to generate illustrations of similar quality in a few seconds (compared to hours for a human illustrator) is a huge elephant in the room, no matter what your colleagues say.
Even if the AI had lost the challenge in the end, Tim’s work would never be the same. His job and the value he offers will now be evaluated in relation to what Dall-E 2 (or any future AI) can produce. That devalues his skills and, most importantly, removes his leverage. Leverage is, in my opinion, vital for workers to maintain a healthy relationship with their job; without it, we quickly end up accepting worse conditions and less respect. Or, alternatively, we give up trying to monetize that skill and accept it as a hobby.
And I think that’s what made Tim nervous during this challenge. He may not have thought of it in these terms, but this whole situation is a threat to him, and he correctly anticipated that. Dall-E 2 may not be incredibly better, but it doesn’t have to be. Tim’s relative value has decayed, permanently, with a single OpenAI paper.
I started by stating that content creators should be worried, but I don’t think the usual sense of the term captures the group of people I am referring to. When I say content creators, I don’t mean only illustrators, designers, painters, and so on. I mean every one of us whose vocation involves creating instead of consuming.
You can see how someone only interested in watching Netflix all day has nothing to worry about. Actually, if the pace of AI progress holds, they should probably expect a very large gain in the available catalogue. But if you are a software developer, for example, you are by definition writing code to solve problems that are not trivially solved. And that's the process we are seeing disrupted in front of us, in front of our coworkers and, tragically, in front of our bosses.
I find myself entering the well-navigated waters of the “automation vs. unemployment” debate. Which is unfortunate, because a lot has already been said about it and I am not sure I have anything to contribute. What I will say, though, is that thinking this type of technology will only make our work better, and never obsolete, is self-deception in its purest form. There is no obvious way to make an AI-generated image “better”. Therefore, the result will not be higher-quality output, but higher productivity.
Take agriculture, for example. The quality of its output can only be improved so much. Sure, some people value organic alternatives, but most of us are satisfied with the undifferentiated, low-quality product and don’t want to pay a premium for more. Technology therefore generates pressure on production, not quality. And that, combined with a limited demand for calories, is why employment in the agricultural sector declined so much.
And I see the same happening with illustrations, or code. See, I am not saying that an AGI will generate everything and humans won’t be part of the process. What I am saying is that far fewer humans will need to be involved in the enterprise of content creation. AI has reached a level where it can make cognitively heavy tasks look like commodities. And having done so, those tasks are now susceptible to the same forces agriculture faced in the last century.
We are destined to be consumers.
AI "shadow employees" seem pretty inevitable, right?
Pretty soon, it seems extremely likely that we'll have the tools to set up an AI to watch and listen to you while you work, and then support you by performing automated tasks (e.g., queries, analysis, and research based on the emails you're getting, your to-do list, your calendar, etc.). To start, they would just be supervised "suggestion bots" without authority to send emails or write files unless approved, but they'd still be able to automate a ton of work and improve the output of their human counterpart.
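As a concrete (entirely hypothetical) illustration, here's a minimal sketch of what such a supervised suggestion bot might look like: it watches incoming items, drafts replies for routine requests, and sends nothing without explicit approval. The class names and the keyword heuristic are placeholders for whatever model would actually do the watching.

```python
from dataclasses import dataclass

@dataclass
class Suggestion:
    trigger: str         # the event the bot reacted to
    draft: str           # the proposed action, e.g. a reply
    approved: bool = False

class SuggestionBot:
    """Toy 'shadow employee': observes work, queues drafts, never acts alone."""

    def __init__(self):
        self.queue: list[Suggestion] = []

    def observe(self, email_subject: str) -> None:
        # A trivial keyword heuristic stands in for a learned model:
        # routine information requests get an auto-drafted reply.
        if "status" in email_subject.lower():
            self.queue.append(Suggestion(
                trigger=email_subject,
                draft="Here is the latest status report: ...",
            ))

    def approve_all(self) -> list[Suggestion]:
        # The human counterpart reviews the queue; only approved
        # drafts would ever actually be sent.
        for s in self.queue:
            s.approved = True
        return self.queue

bot = SuggestionBot()
bot.observe("Status update for project X?")
bot.observe("Lunch on Friday?")
sent = bot.approve_all()
print(len(sent))  # → 1: only the routine request triggered a draft
```

The point of the design is the approval gate: the bot can learn and draft all it wants, but authority stays with the human until it's explicitly delegated.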
On one hand, this will be AWESOME! If I could hand over half my workload to an AI, that would be amazing, especially if it handles the annoying stuff that disrupts your day, like people emailing you for routine information.
As long as you are in personal control of the AI, it sounds great, right? But what if your employer forces you to be constantly recorded by an AI that is learning your job? Obviously there's a privacy concern, as well as the fear of losing your job as soon as the AI gets close to being as good at it as you are.
But there could also be benefits to working at a company with super-productive AIs: you'd have access to far more analytical power than if you relied on humans alone. And you know Hank in IT, who always takes 5 days to respond to an email? Now his shadow AI can respond immediately and give you the info you need.
So personally, if I were in control of it, I would love to have an AI automating my job. Even if my employer controlled it, I think I would still be OK with it, just because I feel secure in my ability to contribute; and if my job could be fully supplanted, we would presumably already be living in a post-scarcity utopia.
Even if programs like Dall-E 2 generate output of variable quality, that could turn into a scenario where the audience becomes the co-creator. It would be easy to build feedback mechanisms from audience impressions back into the creative engine, so there wouldn't need to be a single "creative human" in the loop making decisions. This feedback loop could generate frighteningly compelling content that appeals to the lowest common denominator of human engagement (not that different from the dynamic created by social media feedback mechanisms).
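The loop described above can be sketched as simple hill climbing: generate a variation, keep whichever version the audience engages with more, repeat. Everything here is invented for illustration: the single numeric "content parameter", the simulated engagement peak, and the mutation step would in reality be a generative model and aggregated impression data.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

def audience_score(params: float) -> float:
    # Simulated aggregate engagement, peaking at params == 0.7.
    # A real system would measure clicks, watch time, shares, etc.
    return 1.0 - abs(params - 0.7)

def mutate(params: float) -> float:
    # Small random variation of the content, clamped to [0, 1].
    return min(1.0, max(0.0, params + random.uniform(-0.1, 0.1)))

best = 0.1  # initial content parameter
for _ in range(200):
    candidate = mutate(best)
    # keep whichever version the audience engages with more:
    if audience_score(candidate) > audience_score(best):
        best = candidate

print(round(best, 1))
```

No human judgment appears anywhere in the loop; the content converges toward whatever the engagement signal rewards, which is exactly the "lowest common denominator" dynamic described above.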