As someone who collects NFT art, I now see a lot of AI-generated images. Usually I can tell when an image was generated by a computer, but sometimes it's hard, and I'm sure that eventually I won't be able to tell at all. But if I'm being honest, today I find that I have a bias toward art that was created without any AI prompts. Maybe that changes in the future, or maybe it doesn't.
Either way, the marginal cost of producing new content, such as images and videos, has now gone down to zero as a result of AI tools. (Here are some of my crappy creations.) That means that, if you aren't already, you're soon going to be faced with a deluge of things created in this way. This will almost certainly become the dominant form of content that we consume.
I don't think that we need to be scared by this future, but I do agree with Ben Thompson and others that it's going to make authenticity and human-created content more valuable. In other words, we're probably going to need to know what is digitally scarce and what is just another thing generated by AI. Thankfully we have a suitable technology for this: it's called a blockchain.
Okay, so I haven't tried it yet. But Apple Vision Pro looks pretty awesome, and the people who have tried it seem to be very impressed by it. The best article that I have read so far is this one by Ben Thompson (of Stratechery). He gets into some of the tech details and explains why Apple is probably the only company in the world that could have created a device like this.
For those of you who are interested, Vision Pro is still technically a VR device, even though it is being marketed as an augmented reality (AR) device that allows you to stay engaged with the world around you. This last part is true, but it is all done digitally: cameras capture the world around you and then display it back to you.
So experientially, yes, it is an AR device; however, the tech behind it is actually just exceptional VR.
But this is not the point of today's post. The point I would like to make is one that Ben raises at the end of his article. After praising Apple Vision's achievements, he goes on to argue that the arc of technology is one that is leading toward "ever more personal experiences." In other words, it is increasingly about individual, rather than group, use cases.
And this is one of the first things that I thought of when I watched the Vision Pro keynote. "Wow, this looks like a really cool way to watch and experience a movie. But how do I do that with my partner? I guess we both now need Vision Pros. And what about families with a bunch of kids? That is a lot of Vision Pros."
But maybe this doesn't matter. Ben's point is that it's probably not an accident that this technology arc is happening at the same time as a larger societal shift away from family formation and toward more feelings of loneliness. Indeed, the number of single-person households has been steadily increasing in the US since the 1960s. The current figure sits at more than 1 in 4 households.
So there is an obviously dystopian narrative that we could all tell ourselves here. It is one where everyone works from home, plugs into virtual workplaces, and then flips over to other, more exciting, virtual worlds when it's time to unwind from the stresses of the former. And if you think about it, this isn't that much of a stretch compared to what many of us do today.
Whatever the case, in my mind, none of this is any reason to become bearish on cities. Humans will still be humans. And none of this tech is going to replace the feeling of enjoying a perfect pesto gnocchi in an impossibly narrow laneway in Milan, or drinking a caipirinha on the beaches of Rio de Janeiro while being surrounded by shockingly beautiful people.
Or at least let's hope so.
AI is going to be very disruptive, right? At this point, I think it is pretty clear to most that the answer is yes, almost regardless of what industry you're in. But is it going to be really disruptive? Like disruptive in the Clayton Christensen sense of the word. (Christensen is known for coining the term "disruptive innovation," which he contrasted with "sustaining innovation.")
This is a good question, and I like how Ben Thompson thought about it in his newsletter this morning:
I tend to believe that disruptive innovations are actually quite rare, but when they come, they are basically impossible for the incumbent company to respond to: their business models, shareholders, and most important customers make it impossible for management to respond. If that is true, though, then an incumbent responding is in fact evidence that an innovation is actually not disruptive, but sustaining.
The point he is making is that, given that the big tech companies (and of course everyone else) are all now responding to AI by incorporating it into their businesses, it must, by this definition, not be a disruptive innovation. It's a sustaining one. This doesn't mean that AI won't have significant impacts on our economy; it just means that maybe it won't put a company like Alphabet out of business.
I thought this was an interesting way of looking at things because it is a reminder that "disruptive innovations" often start out at the bottom of the market. They start in a way that can feel innocuous to incumbents; that is, until they move upmarket. But this is not at all how AI feels. As soon as you play around with ChatGPT you immediately think to yourself, "holy shit, this thing can do my job."
That is obviously something very meaningful. But is it going to shake up the big tech world order? I don't know. If you follow Christensen's definition, crypto sounds like the more disruptive innovation.