In a world increasingly shaped by technologies such as AI, which can erode human connection, it is imperative to prioritize and actively cultivate authentic relationships.
Today, we are thrilled to welcome back Cal Fussman, an American journalist, author, and host of the Big Questions Podcast. In this episode, Cal takes us through the profound implications of artificial intelligence (AI) for human connection.
Discover what bit rot is and pick up practical tips on protecting your photos in the digital age. We also take a closer look at how AI is reshaping photography and videography.
Get ready to embark on a journey with Cal as he shares how he built deep connections during his travels around the world. Uncover strategies for overcoming the obstacles that hinder our ability to connect with others and gain valuable insights into why focusing on building authentic connections is important.
As the conversation unfolds, he reveals why young people are finding it difficult to embrace vulnerability today and shares his perspective on AI. Additionally, he highlights the limitations of ChatGPT, asserting that while AI can attempt to copy humans, it cannot replace us.
During this episode, you will learn about:
[01:32] What is “bit rot”, and how can you protect your photos?
[02:15] How AI is impacting photography and videography
[07:29] Why it’s crucial to prioritize strengthening authentic connections, particularly now that so many people work remotely after the pandemic
[23:10] Why young people find it increasingly difficult to be vulnerable: personal matters once stayed private, but with smartphones everywhere, a moment of vulnerability can quickly be publicized worldwide
[38:27] How, in sales, making a connection brings you one step closer to the sale
[47:12] Why ChatGPT’s knowledge is limited to the past: it cannot generate new ideas beyond the information it has been given
[01:04:01] Why AI can try to copy humans but cannot be us