
Content provided by Natasha Bajema - Fiction Author. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Natasha Bajema - Fiction Author or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ms.player.fm/legal.

My Hero (Ch. 42) – Bionic Bug Podcast Episode 042

29:08
 
Hey everyone, welcome back to the Bionic Bug podcast! You're listening to episode 42. This is your host Natasha Bajema, fiction author, futurist, and national security expert. I'm recording this episode on January 26, 2019. I'm sad to say that this is my final episode of the Bionic Bug podcast. This is somewhat bittersweet because I've grown fond of sharing my thoughts with you. If you've been listening from the beginning, thank you so much for joining me on this journey.

If you want to keep listening to me, I'll be launching a new podcast called the Authors of Mass Destruction Podcast. I'll still talk tech and weapons of mass destruction, but I'll take a slightly different approach, focusing on helping authors write great stories about national security issues while getting the technical details right. Tune in for interviews with leading experts on weapons of mass destruction and emerging technologies, author interviews, technical modules, and reviews of what TV shows and movies get right and wrong. The podcast will help authors who write about mass destruction develop impactful ideas for their page-turning plots and provide tips for conducting research.

Let's talk tech one more time. I have two headlines for this week.

The first is "Beware the Jabberwocky: The AI Monsters Are Coming," published on www.natashabajema.com on January 22. I wrote this essay as part of a Strategic Multilayer Assessment (SMA) Periodic Publication entitled AI, China, Russia, and the Global Order: Technological, Political, Global, and Creative Perspectives, edited by Nicholas D. Wright and Mariah C. Yager. I'll link to the full paper in the show notes. Science fiction plays an important role in shaping our understanding of the implications of science and technology and in helping us cope with things to come. I describe three AI monsters depicted in science fiction films as one day disrupting the global order and potentially destroying humanity: the automation monster, the supermachine monster, and the data monster. Fears about the implications of the automation and supermachine monsters distract us from the scariest of them all. Below the surface of our daily lives, the data monster is stealthily assaulting our sense of truth, our right to privacy, and our freedoms.

My second headline is "AI can be sexist and racist — it's time to make it fair," published on Nature.com on July 18, 2018. If you listen to this podcast, you're aware of the growing influence of machine learning algorithms in our lives. One of the more troubling issues with the excitement around the power of algorithms to help society is the lack of attention paid to data. Machine learning algorithms rely on huge datasets to learn relationships in the data. But what if the data is biased? Humans are biased, so the data we generate is biased too. If data scientists do not take special care to ensure the data does not under- or over-represent certain groups, things go wrong with the algorithms they develop; the short sketch after these notes illustrates the effect with synthetic data. Data is not the only place where bias can occur: algorithms are created by humans, who can inject their own biases into them as well. This is one of the most important issues of our time, and it is not well understood or even regulated by policymakers.

Okay, let's turn to the final chapter of Bionic Bug. Last week, we left Lara in a bit of a sticky situation, with a deadly syringe pressed to her neck. Let's find out what happens next.
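A quick aside on that second headline: the snippet below is a minimal sketch using made-up synthetic data and hypothetical "group A"/"group B" labels (nothing from the Nature article), showing how under-representing a group in training data can leave a model nearly useless for that group.

# Minimal sketch with synthetic data: a model trained on data that
# under-represents one group mostly learns the majority group's pattern.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def sample(n, rule):
    # Two features; the "true" label depends on a different feature per group.
    X = rng.normal(size=(n, 2))
    y = (X[:, rule] > 0).astype(int)
    return X, y

# Training data: group A (rule 0) heavily over-represented vs. group B (rule 1).
Xa, ya = sample(950, rule=0)
Xb, yb = sample(50, rule=1)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Held-out evaluation, one test set per group.
for name, rule in [("group A (majority)", 0), ("group B (minority)", 1)]:
    Xt, yt = sample(2000, rule)
    print(name, "accuracy:", round(model.score(Xt, yt), 3))
# Typically prints high accuracy (around 0.95 or better) for group A and
# near-chance accuracy for group B: the model learned the majority group's
# relationship and it does not transfer.

In a real pipeline, the fix starts with auditing and rebalancing the training data and reporting performance per group, not just overall accuracy.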
The views expressed in this blog are those of the author and do not reflect the official policy or position of the National Defense University, the Department of Defense or the U.S. Government.

44 episodes

