That's great to hear, thank you for sharing! I read your piece and definitely resonate with parts of it -- I think AI will ultimately free us to live more authentic lives. BUT I think it will take a lot of thought, a lot of policy, and a lot of support to help people through this transition, because a massive amount of change lies ahead.
As humans, we hate change, which is where a lot of the anxiety comes from.
I don't believe humans hate change - change is our very nature. It's just that most humans don't live as humans, but as extensions of the systems they are part of.
What hates change are the communities and social groups we belong to. We fear calling that out because we are more in love with being part of a false group than with facing the unknown.
"Shame is the pain of death of the community within us and prevents our premature independence." - Eugen R.-Huessy
Nice artwork, and I appreciate the food for thought this post offered!
I find it interesting that writing helps calm your A(I) anxiety. Since starting the AI safety course, I have tried writing, but I have been unable to follow through on any of my recent ideas due to how negative they sound. When I write negative valence content, it seems to add fuel to the fire of my AI dread, so I have been taking a break from writing until the course is over.
To calm my anxiety, I have been exploring science fiction that touches on AI, looking for the good outcomes that are possible for us if we are able to develop the AI advancements to come safely. For example, Dune does not have any AI at all! And Star Trek has some very interesting philosophical discussions around human-AI relationships. Playing and watching soccer and playing with my cat have been great ways to reduce "future shock" for me as well.
Thanks so much for sharing your thoughts, Dan. I agree, it's hard to write about the concepts we're learning about in the course, because so much of it is hypothetical right now. The hypothetical matters, but I do think that just relaying the things that could happen, without deeply explaining the logic behind them, adds unnecessary stress for readers.
I've found by talking to my readers that so many people want to ignore AI; they're scared of it, and they don't necessarily want to open an email about these really difficult concepts. I think our challenge as writers is to raise awareness about the issues in a way that doesn't create aversion, but instead persuades people that they do have agency over the AI future.
Interesting take. I just wrote my first piece about AI from a pretty different angle - I don't have any fear of AI at all.
I believe that we don't really fear AI; we fear what AI is telling us to embrace - the highest version of ourselves.
I'd be happy to hear your thoughts on it.
https://open.substack.com/pub/placzebo/p/stop-hitting-snooze-on-the-ai-larm?r=14inwj&utm_medium=ios