How to Protect Your Organisation from Plausible Rubbish and Misinformation

Protection from misinformation before it's too late

In the era of social media, AI tools such as ChatGPT and mass digital communication, fake news, misinformation and disinformation can spread like wildfire. Because information enters an organisation through myriad sources and entry points, organisations are vulnerable to falling prey to plausible rubbish. Misinformation and disinformation, whether deliberate or not, can cause significant harm to an organisation’s operations, market share, reputation and decision-making processes.

In this article I provide leaders with concise, research-based guidance on how to protect your organisation effectively from the dangers of misinformation.

 

The pervasiveness of misinformation and plausible rubbish

Misinformation (false information spread without malicious intent) and disinformation (false information spread with the express intention of causing a malicious outcome) are a growing and serious issue for many organisations and businesses, both large and small.

A 2018 study by the Massachusetts Institute of Technology (MIT), published in the journal Science, showed that false news stories are 70% more likely to be retweeted on Twitter than true ones[i]. The study tracked around 126,000 rumour cascades circulating on Twitter and found that false news spread significantly further, faster and more broadly than true news.

The study also found that the top 1% of false news cascades (chains of re-shares of the same false story) typically spread to between 1,000 and 100,000 people. This demonstrates how plausible rubbish can infiltrate communication channels, influence opinion and become ‘knowledge’ at significant speed and scale.

 

Misinformation, disinformation and plausible rubbish

The distinctions between misinformation, disinformation and plausible rubbish are:

Misinformation refers to incorrect or misleading information that is shared without a deliberate intention to mislead, usually because of an error or misunderstanding. For example, an individual might unknowingly share an untrue or inaccurate ‘fact’ on social media, believing it to be true. This incorrect information, now spread to a broader audience, is misinformation.

Disinformation is false information that is deliberately created and shared with the intention of causing harm or misleading others. Disinformation is manipulation and is often used for propaganda, to deceive, or to obscure the truth. Examples include spreading false rumours to smear a business competitor or maliciously circulating false information for political gain or control. The key difference from misinformation lies in the intent to deceive.

Plausible rubbish refers to information or assertions that sound credible or believable on first hearing because of their logical or coherent presentation, but are in fact false, misleading, or without a solid foundation in fact. Plausible rubbish can spread as misinformation but can also result from disinformation. It often enters an organisation and takes hold as ‘common fact’ through poor training programmes, a lack of critical thinking, or weak research and evidence-based practice skills. Plausible rubbish succeeds in deceiving or misleading because it is cloaked in a veneer of plausibility, seeming reasonable or probable to an unsuspecting and untrained audience even when it is not supported by reliable evidence or research.

 

Dangers of misinformation and disinformation

There are many problems posed by misinformation, disinformation and plausible rubbish entering an organisation’s knowledge base, including:

  1. Eroding trust. When an organisation’s systems, knowledge management processes and common organisational knowledge become infiltrated by false information, it can seriously undermine the trust and confidence of stakeholders such as customers, employees and investors[ii].
  2. Reducing decision-making effectiveness. Decision-makers who rely on inaccurate information often make strategic and operational errors that can jeopardise an organisation’s operations and outcomes[iii].
  3. Increasing legal and regulatory risk. Misinformation can expose an organisation to serious legal and regulatory peril. A number of cases of misinformation entering organisational systems and being accepted as ‘knowledge’ have resulted in penalties for non-compliance with data protection and other laws, for example[iv].
  4. Increasing disruption to supply chains. Misinformation and disinformation have been found to be at the heart of several supply chain disruptions and even collapses[v].

 

Evidence-based strategies to protect your organisation

There is a range of strategies you can use to protect your organisation from fake news, misinformation, disinformation and plausible rubbish:

  1. Develop clear policies. Every organisation needs a clear policy stating its stance on misinformation and disinformation, the seriousness with which they are treated and the procedures that should be followed to verify information. Additionally, there should be a process in place to help identify and rectify inaccuracies, as well as to identify sources of misinformation and sources of trusted information. This policy should be regularly reviewed and updated[vi].
  2. Train staff in critical thinking and information literacy. People are at the heart of any organisation’s efforts to counter false information. Employees who can think critically and are trained to use evidence-based practices (see – https://oxford-review.com/evidence-based-practice-essential-guide/) are significantly less likely to fall foul of false information. Empowering employees to recognise and counter misinformation by providing training on critical thinking, digital literacy and evidence-based practice is one of the most potent ways to deal with the problem of false information impacting the organisation. This goes hand in hand with the previous point about policy. Policy without critical thinking and evidence-based practice is unlikely to succeed[vii].
  3. Monitor your organisation’s online presence and communication channels. It is important to regularly monitor your organisation’s online presence, including social media channels and news sources, so that emerging instances of misinformation can be identified and responded to quickly[viii].
  4. Collaborate with credible sources. Build relationships with trusted organisations and sources, such as universities and The Oxford Review, to ensure that information entering your organisation is both valid and reliable[ix]. Book a call with us to talk about how we can help protect your organisation from misinformation and disinformation.
  5. Encourage open communication across the organisation. Foster an organisational culture that values open communication and the sharing of accurate information. This helps employees feel comfortable reporting instances of misinformation without fear of reprisal. This includes creating the expectation that people will clearly communicate the sources and methods used to obtain and verify information[x].
  6. Implement robust fact-checking protocols. Establish internal fact-checking procedures for all outgoing communications, such as press releases, marketing materials and social media posts, to ensure the consistent dissemination of accurate information[xi].
  7. Develop an incident response plan. Formulate a detailed plan outlining the steps your organisation will take in the event that misinformation is detected or has already spread. This plan should include guidelines for responding swiftly and effectively to minimise the impact of false information, and for showing stakeholders and employees that your organisation is proactive and takes the veracity of the information and knowledge with which it operates very seriously[xii].
  8. Evaluate and leverage technology. Utilise high-quality technological tools and platforms, such as artificial intelligence and machine learning algorithms, to detect and prevent the spread of misinformation within your organisation’s digital channels (a minimal illustrative sketch follows this list)[xiii].
  9. Learn from incidents and create an organisational learning culture / learning orientation. Following any incident involving misinformation, conduct a thorough review to identify the root causes, evaluate the effectiveness of your organisation’s response and make necessary improvements to policies, procedures and systems[xiv].
  10. Use reputable sources and research-backed information. Give your staff access to credible sources, such as The Oxford Review, for evidence-based information and insights. By relying on reputable sources, your organisation not only significantly reduces the risk of adopting and disseminating misinformation, it also shows staff how to become more evidence-based critical thinkers. Research briefings often challenge common thinking and knowledge and increase cognitive flexibility and adaptability. Sources such as The Oxford Review often conduct extensive research, follow rigorous methodologies and present findings in a transparent manner, ensuring that your organisation’s decision-making processes are based on solid evidence[xv]. Book a call with us to talk about how we can help protect your organisation from plausible rubbish, misinformation and disinformation.
  11. Assess the evidence base of external consultants, trainers and organisations. Whenever engaging external consultants, trainers or organisations, it is important to conduct due diligence to check their credibility and adherence to evidence-based practice. As well as checking their references and qualifications, it is wise to check the quality of the information they provide and ask about their sources. Out-of-date or just plain wrong information entering your organisation through these routes can have significantly negative consequences[xvi] (Briner & Rousseau, 2011; Broome, 2022; Moleman et al., 2022).
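To make point 8 a little more concrete, the sketch below shows the general shape of a machine-learning filter an organisation might trial on its internal channels. It is a minimal illustration only, written against the open-source scikit-learn library, and is not a recommended or production-ready detector: the example texts, labels, the 0.6 threshold and the flag_for_review helper are all hypothetical, and any real system would need a large, carefully labelled dataset, proper evaluation and human fact-checkers making the final call.

```python
# Minimal, illustrative sketch only: a simple text classifier that flags content
# for human fact-checking before it is shared internally. The training examples,
# labels and threshold below are hypothetical; a real deployment would need a
# substantial labelled dataset, proper evaluation and human review of every flag.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = previously debunked / unverified, 0 = verified.
train_texts = [
    "Our competitor's product has been secretly recalled by regulators",
    "Quarterly revenue figures confirmed in the audited annual report",
    "Internal memo proves the merger was cancelled last night",
    "Press release: new partnership announced with full documentation",
]
train_labels = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: simple, transparent and quick to retrain.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

def flag_for_review(text: str, threshold: float = 0.6) -> bool:
    """Return True if the text should be routed to a human fact-checker."""
    probability_unverified = model.predict_proba([text])[0][1]
    return probability_unverified >= threshold

incoming = "Leaked document shows our main supplier is about to collapse"
if flag_for_review(incoming):
    print("Route to the fact-checking team before sharing internally.")
```

The design choice is deliberate: simple, interpretable approaches of this kind are easy to audit and quick to retrain as new misinformation themes emerge, which matters more in this context than squeezing out the last few points of accuracy.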

 

Conclusion

As leaders, it is essential to take a proactive approach in protecting your organisation from the dangers of misinformation. By implementing the strategies outlined above, you can significantly reduce the risks associated with plausible rubbish and strengthen the integrity of your organisation’s decision-making processes.

 

References

[i] Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.

[ii] Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.

[iii] Pennycook, G., & Rand, D. G. (2019). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116(7), 2521-2526.

[iv] Alemanno, A. (2018). How to counter fake news? A taxonomy of anti-fake news approaches. European Journal of Risk Regulation, 9(1), 1-5.

Kropf, B., Wood, M., & Parsons, K. (2023). Message matters: Correcting organisational fake news. Computers in Human Behavior, 144, 107732.

[v] Petratos, P. N., & Faccia, A. (2023). Fake news, misinformation, disinformation and supply chain risks and disruptions: risk management and resilience using blockchain. Annals of Operations Research, 1-28.

[vi] Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.

[vii] Davies, H. T., Nutley, S. M., & Smith, P. C. (Eds.). (2000). What works? Evidence-based policy and practice in public services. Policy Press.

Davies, H., Nutley, S., & Smith, P. (2000). Introducing evidence-based policy and practice in public services. In What works? (pp. 1-12). Policy Press.

Pennycook, G., & Rand, D. G. (2019). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116(7), 2521-2526.

[viii] Vicario, M. D., Quattrociocchi, W., Scala, A., & Zollo, F. (2019). Polarization and fake news: Early warning of potential misinformation targets. ACM Transactions on the Web (TWEB), 13(2), 1-22.

[ix] Day, C. (2019). The Future of Misinformation. Computing in Science & Engineering, 21(1), 108.

[x] Pennycook, G., & Rand, D. G. (2019). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116(7), 2521-2526.

Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policymaking (Vol. 27, pp. 1-107). Strasbourg: Council of Europe.

[xi] Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policymaking (Vol. 27, pp. 1-107). Strasbourg: Council of Europe.

[xii] Nielsen, R., Fletcher, R., Newman, N., Brennen, J., & Howard, P. (2020). Navigating the ‘infodemic’: How people in six countries access and rate news and information about coronavirus. Reuters Institute for the Study of Journalism.

[xiii] Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., & Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094-1096.

Zhou, X., & Zafarani, R. (2020). A survey of fake news: Fundamental theories, detection methods, and opportunities. ACM Computing Surveys (CSUR), 53(5), 1-40.

[xiv] Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.

[xv] Millar, R., & Hames, V. (2003). Improving the research-practice interface: The impact of research-informed teaching materials on science teachers’ practices. Paper presented at the Annual Conference of the National Association for Research in Science Teaching (NARST), Philadelphia, 23-26 March 2003. University of York repository.

Pino, P., Fallahi, N., Buccheri, D., Alberti, E., Salini, S., Bettuzzi, A., & Luddeni, G. (2021). Contrasting information disorder by leveraging people’s biases and pains: innovating in the post-truth era. CERN IdeaSquare Journal of Experimental Innovation, 5(2), 18-27.

Anderson, J., Bebbington, S., Brown, A., Engitu, S., Fortier, A., Gassie, L., … & Wood, E. (2022). Guidelines for Parliamentary Libraries.

Arora, A., & Gambardella, A. (1992). New trends in technological change: the use of general and abstract Knowledge in Industrial Research. Rivista Internazionale di Scienze Sociali, 259-277.

[xvi] Briner, R. B., & Rousseau, D. M. (2011). Evidence-based I–O psychology: Not there yet. Industrial and Organizational Psychology, 4(1), 3-22.

Broome, A. (2022). Gaming country rankings: Consultancies as knowledge brokers for global benchmarks. Public Administration, 100(3), 554-570.

Moleman, M., Jerak‐Zuiderent, S., van de Bovenkamp, H., Bal, R., & Zuiderent‐Jerak, T. (2022). Evidence‐basing for quality improvement; bringing clinical practice guidelines closer to their promise of improving care practices. Journal of Evaluation in Clinical Practice, 28(6), 1003-1026.


David Wilkinson

David Wilkinson is the Editor-in-Chief of The Oxford Review. He is also acknowledged to be one of the world's leading experts in dealing with ambiguity and uncertainty and developing emotional resilience. David teaches and conducts research at a number of universities, including the University of Oxford Medical Sciences Division, Cardiff University, Oxford Brookes University School of Business and many more. He has worked with many organisations as a consultant and executive coach, including Schroders (where he coaches and runs their leadership and management programmes), Royal Mail, Aimia, Hyundai, the RAF, the Pentagon, and the governments of the UK, US, Saudi Arabia, Oman and Yemen, for example. In 2010 he developed the world's first and only model and programme for developing emotional resilience across entire populations and organisations, which has since become known as the Fear to Flow model and is the subject of his next book. In 2012 he drove a 1973 VW across six countries in Southern Africa whilst collecting money for charity and conducting on-the-ground charity work, including developing emotional literacy in children and orphans, among a number of other activities. He is the author of The Ambiguity Advantage: What Great Leaders Are Great At, published by Palgrave Macmillan. See more: About David | David's Wikipedia Page
