Social media censorship
Description
This thesis explores the role of social media companies in moderating users' content, particularly during outbreaks of health-related misinformation. Specifically, the thesis discusses the history of censorship policies, the spread of health-related misinformation, the impacts of misinformation on individuals and companies, the most effective means of correcting misinformation, and how the spread and correction of misinformation changed during the 2020 to 2021 COVID-19 pandemic. The thesis is structured as a business case and comprises three sections: a case study, an instructor's manual, and a literature review. The case study examines Facebook's history of filtering information on its platform, its reactions to misinformation outbreaks, and its response to misinformation during the COVID-19 pandemic. The literature review surveys scholarly articles on health-related misinformation, compares and contrasts scholars' viewpoints on the topic, and concludes with recommendations specific to the COVID-19 pandemic. The instructor's manual offers a teaching plan for the case, providing learning objectives and discussion questions for instructors to use in management, marketing, and finance courses. Leading scholars agree that authorities can correct misinformation on social media and that proactive filtering measures can reduce misinformation outbreaks. However, scholars differ on the best options for correcting misinformation on social media and on the degree of responsibility they believe social media platforms bear for addressing and correcting it.