I like Sam's philosophy on this and I generally agree with him. However, I do not like how all the wealthy AI people are hand-waving the massive labor market shift in the coming years.
> As one example, we expect that this technology can cause a significant change in labor markets (good and bad) in the coming years, but most jobs will change more slowly than most people think, and I have no fear that we’ll run out of things to do (even if they don’t look like “real jobs” to us today). People have an innate desire to create and to be useful to each other, and AI will allow us to amplify our own abilities like never before. As a society, we will be back in an expanding world, and we can again focus on playing positive-sum games.
It's very easy as an extremely rich person to just say, "don't worry, in the end it'll be better for all of us." Maybe that's true on a societal scale, but these are people's entire worlds being destroyed.
Imagine you went to college for a medical specialty for 8-10 years, you come out as an expert, and 2 years later that entire field is handled by AI and salaries start to tank. Imagine you have been a graphic designer for 20 years supporting your 3 children and bam, a diffusion model can do your job for a fraction of the cost. Imagine you've been a stenographer working in courtrooms to support your ill parents and suddenly ASR can do your job better than you can. This is just simple stuff we can connect the dots on now. There will be orders of magnitude more shifts that we can't even imagine right now.
To someone like Sam, everything will be fine. He can handle the massive societal shift because he has options. Even a moderately wealthy person will be OK.
But the entire middle class is going to start really freaking the fuck out soon, as more and more jobs disappear. You're already seeing anti-AI sentiment all over the web. Even in expert circles, you can see skepticism. People saying things like, "how do I opt out of Apple Intelligence?" People don't WANT more grammar correction or AI emojis in their lives, they just want to survive and thrive and own a house.
How are we going to handle this? Sam's words of "if we could fast-forward a hundred years from today, the prosperity all around us would feel just as unimaginable" don't mean shit to a family of 4 who went through layoffs in the year 2025 because AI took their job while Microsoft's stock grows 50%.
For this reason I read Andrew Yang's "The War on Normal People". Beyond UBI and "social credits", I don't see him offering many other solutions to this problem. And UBI itself still needs to be proven, as far as I'm aware.
When o1 was released, I ran an internal eval and saw it plainly outperforming our highly educated colleagues. I got goosebumps, and I haven't been able to sleep well for days. This will dramatically impact society in 2-5 years.
Do you know of any relevant material related to this?
o1 was what got you stirred up? It honestly feels like an incremental change at best to me. I had similar feelings about gpt-3.5, but since then my fears have normalized into a sort of dull, typical (for me) cynicism (so no sleepless nights).
Welcome to the anxiety party, it sucks in here. As someone who's been working on AI theory full time for ~1 year, I desperately wish we could go back to the days of my faraway youth (5 years ago) before intuition was cracked by accident by spellcheck algorithms. I agree with him that it holds the key to massive prosperity, but selfishly, it's gonna upend my life and the lives of everyone I love. Already has for me, as I grapple with how to (ethically) pay rent while spending all day lighting the Warning Beacons of Gondor...
The only real answer, IMHO, is to vote for political systems that put control of society (and AI) in the hands of the public. Call it socialism, call it Georgism, call it anarcho-free-market-space-communism, call it whatever you want; there's no way that "a tiny number of people have immense inherited power" (capitalism) and "people fundamentally understand themselves as members of a tribe put in opposition to all other tribes by default" (nationalism) mesh well with an intelligence explosion.
Here's to hoping the haters are right, and we all turn out to be wrong! I'll be thrilled if Sam Altman is just a rich company leader in 10 years, and intuitive algorithms are still confined to direct usage (chatbots).
I wish I could upvote this more than once. I feel like every conversation about AI changing society comes from rich founders telling Joe banker to "Just start a company. AI will make it easy."
The reality is, this transition is going to be painful for the average person.
We're going to need to link the two. Those wealthy enough to not care, we're going to have to organize and make them care. Ideally we can find a way to do it nonviolently.