• Spotting and Debunking Tech Propaganda

  • 2024/09/17
  • Duration: 15 min
  • Podcast

  • Summary

Tech Propaganda: An Introduction to Critical Thinking in Technology

Episode Notes

1. FOMO (Fear of Missing Out)
Definition: Rushing to adopt new technologies without clear benefits
Examples:
  • Implementing GenAI without clear ROI just because competitors are doing it
  • Skill development driven by fear of obsolescence
  • VCs worried about missing the next big thing

2. Naive Utopianism
Definition: Assuming all technology is inherently good
Examples:
  • Believing more smartphone scrolling is always better
  • Expecting social media to lead to world peace
  • Promoting UBI or crypto as universal solutions
  • Assuming AI can completely replace teachers

3. Disruption and Technological Solutionism
Definition: Ignoring negative consequences of tech solutions
Key point: Tendency to overlook negative externalities

4. "Selling Two Day Old Fish"
Definition: Resisting improvements to maintain profitable but outdated products/services
Examples:
  • Exaggerating job market demand for outdated skills
  • Appealing to authority (big tech companies)
  • Dismissing newer technologies as unnecessary or overly complex
  • Claiming established technologies aren't actually old/slow

5. Superficial Media
Definition: Promoting shallow or misleading information about technology
Examples:
  • Media monetizing via supplements
  • Conspiracy theory forums
  • Inexperienced podcast hosts discussing complex topics
  • Making sensational predictions about future tech with little evidence
  • Oversimplifying complex topics

6. Push to Disrupt
Definition: Overconfidence in technology's ability to solve complex problems
Examples:
  • "Figure out the business model later" mentality
  • Pushing products to market prematurely
  • Ignoring negative externalities
  • Dismissing critics as "not understanding the vision"

7. Billionairism
Definition: Excessive admiration of tech billionaires and their perceived expertise
Examples:
  • Equating extreme wealth with universal expertise
  • Idolizing tech billionaires as infallible visionaries
  • Romanticizing the "Harvard/Stanford dropout genius" narrative
  • Ignoring the role of luck vs. skill
  • Overemphasizing individual genius over team efforts

8. Irrational Exceptionalism
Definition: Unrealistic beliefs about a startup's chances of success
Examples:
  • "We're different from other startups that fail"
  • "Weekends are a social construct"
  • Obsession with "changing the world"
  • Rationalizing present hardships for imagined future gains
  • Dismissing industry-wide failure rates
  • Glorifying extreme effort and sacrifice

9. Double Down
Definition: Making increasingly grand claims to distract from unfulfilled promises
Examples:
  • Promising self-driving cars "next year", then pivoting to Mars travel
  • Deflecting from current AI model flaws with promises of future sentience

10. Trojan Source
Definition: Open source projects that later switch to commercial licensing
Examples:
  • "Rug pull" strategy in open source
  • Using community labor before pivoting to a commercial model

11. "Generous Pour" Ethical Framing
Definition: Highlighting easy ethical actions while ignoring larger issues
Examples:
  • Claiming unbiased AI training sets while hiding addictive design
  • Emphasizing harm reduction in AI outputs while ignoring IP theft

12. Business Model Circular Logic
Definition: Exploiting legal grey areas and claiming they're essential to the business model
Examples:
  • Justifying use of pirated data for AI training
  • Creating unfair competition by ignoring regulations (e.g., taxi services, hotels)

🎓📚 Unlock the power of AI with two Master's degrees worth of courses on edX, covering everything from ☁️ Cloud Computing to 🦀 Rust to 🤖 LLMs and 🎨 Generative AI!
🚀👉 Join the Pragmatic AI Labs Community now:
  • 🔥 edX 🔥
  • 💬 Discord Community 💬
  • 🌟 Coursera 🌟
  • 🌟 Future Learn 🌟
  • 🌟 LinkedIn Learning 🌟
🎉 Start your AI journey today and take your skills to the next level! 🎉