Tools That Fight Disinformation Online
Adblock Plus is a browser extension and app that blocks advertisements and websites using filtering lists. The tool automatically whitelists ads that meet its "acceptable ads" standards, so ads can avoid being blocked by adhering to those standards; users can opt out of even these ads by adjusting the default settings. While initially an ad blocker, the tool has increasingly positioned itself as a way for users to protect themselves not only from harmful ads but also from other harmful sites, including those that spread disinformation.
This tool is intended to build user understanding of the techniques used to disseminate disinformation. The game exposes players to the fake news tactics used against them by putting them in the position of a fake-news baron: players win by publishing headlines that attract the most followers.
Bot Sentinel is a free platform developed to detect and track trollbots and untrustworthy Twitter accounts. It uses machine learning and artificial intelligence to study Twitter accounts, classify them as trustworthy or untrustworthy, and identify bots, then stores those accounts in a database and tracks each one daily. The developers use the data they collect to study the effect of bots and their propaganda on discourse and to explore ways to counter the spread of bots and the information they disseminate. Classifying untrustworthy accounts is a manual process: reviewers examine hundreds of tweets and retweets, and an account with a large number of followers and a high percentage of misleading and/or factually incorrect tweets may be classified as untrustworthy. This bot-tracking platform is not affiliated with RAND; it is owned and operated by Bot Sentinel Inc., which you can read more about here: https://botsentinel.com/info/about
Botometer is a web-based program that uses machine learning to classify Twitter accounts as bot or human by looking at features of a profile, including friends, social network structure, temporal activity, language, and sentiment. Botometer outputs an overall bot score (0-5) along with several other scores that provide a measure of the likelihood that the account is a bot.
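The idea of combining profile features into a 0-5 bot score can be sketched as follows. This is a minimal illustration, not Botometer's actual model: the feature names, weights, and logistic combination are all assumptions made for the example.

```python
import math

def bot_score(features, weights, bias=0.0):
    """Toy bot scorer: a logistic combination of normalized account
    features, rescaled to a Botometer-style 0-5 range. The weights are
    illustrative, not Botometer's actual model parameters."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    probability = 1.0 / (1.0 + math.exp(-z))  # likelihood the account is a bot
    return round(5.0 * probability, 2)

# Hypothetical features (normalized to 0-1) and hand-picked weights.
weights = {"posts_per_hour": 2.0, "follower_ratio": -1.5, "duplicate_text": 3.0}
account = {"posts_per_hour": 0.9, "follower_ratio": 0.1, "duplicate_text": 0.8}
score = bot_score(account, weights, bias=-1.0)  # high activity + repetition -> high score
```

A real classifier would learn the weights from labeled accounts; the point here is only that many weak signals are combined into a single calibrated score.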
BotSlayer is a browser extension that helps track and detect potential manipulation of information spreading on Twitter. BotSlayer uses a detection algorithm to identify hashtags, links, accounts, and media that are being amplified in a coordinated fashion by likely bots. Users can view/explore tweets and accounts associated with such amplification on Twitter or search for related content.
CaptainFact is a web-based collection of tools designed for collaborative verification of internet content. It includes a browser extension that provides a video overlay to internet videos with sources and contextual information, as well as icons showing the credibility based on user votes. It also has a "debate platform" that allows for discussions of specific points. While currently focused on video, they are developing a tool to provide a similar overlay to articles.
The Certified Content Coalition is an initiative to encourage standards among online media publishers and certify publishers who meet such standards. Publishers who are certified will receive and display a digital certificate.
ClaimBuster is a web-based automated, live fact-checking tool developed by the University of Texas at Arlington. The tool relies on natural language processing and supervised learning (based on a human-coded dataset) to identify factual and false information. There is also an app available for Slack.
Climate Feedback is a web-based content annotation tool that allows scientists to annotate articles to provide additional context and draw attention to inaccuracies. The process results in a credibility score.
CrashCourse is a YouTube channel that offers series of educational videos on specific topics, including a media literacy course; a set of six of its videos focuses on media literacy topics.
CrossCheck is a collaborative initiative through FirstDraft News that is focused on verification. It began prior to the French election, with newsrooms and tech companies working together to report on and identify misleading news leading up to the election.
This tool allows for the crowdsourced verification of claims submitted by university students. Students submit claims and then work with other students at participating organizations to collect relevant information, viewpoints, and evidence about each claim. All submitted claims, questions, and supporting information are stored on the platform.
DIRT is a blockchain verification tool that allows communities to moderate data, such that anyone is able to add data to the platform and any user can then challenge that data. Users earn tokens by identifying and correcting errors. Thus, there is an economic incentive for data to always be improving and for inaccurate data to be removed from the platform.
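The stake-and-challenge economics described above can be sketched in a few lines. This is a toy model under assumed rules, not DIRT's actual protocol: the token amounts, starting balances, and resolution mechanism are all invented for illustration.

```python
class Registry:
    """Toy stake-and-challenge data registry: writers stake tokens on an
    entry; a successful challenger wins the writer's stake and the entry
    is corrected. Amounts and rules are illustrative, not DIRT's protocol."""
    STAKE = 10

    def __init__(self):
        self.entries = {}   # key -> (value, owner)
        self.balances = {}  # user -> token balance

    def write(self, user, key, value):
        # Writing costs a stake, which the writer forfeits if challenged successfully.
        self.balances[user] = self.balances.get(user, 100) - self.STAKE
        self.entries[key] = (value, user)

    def challenge(self, challenger, key, corrected_value, vote_upholds_challenge):
        self.balances.setdefault(challenger, 100)
        if vote_upholds_challenge:
            # Challenger earns the forfeited stake; the data improves.
            self.balances[challenger] += self.STAKE
            self.entries[key] = (corrected_value, challenger)
        return self.entries[key]

reg = Registry()
reg.write("alice", "capital_of_australia", "Sydney")  # inaccurate entry
reg.challenge("bob", "capital_of_australia", "Canberra", vote_upholds_challenge=True)
```

The economic incentive is visible in the balances: the challenger profits from catching the error, so inaccurate data attracts correction.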
The Global Disinformation Index is a web-based tool that rates news outlets based on the "probability of disinformation on a specific media outlet." This rating system will cover all types of media, and will be a real time score.
The Duke Videofactchecking Tool is a browser extension that will provide live factchecking of information on television. It was assessed by the developers using user-based experiments. Users generally liked having the pop-ups generated by this tool, although they differed on whether they preferred getting a "rating" or just the raw factual information.
Emergent.Info is a web-based tool that tracks, verifies, or debunks rumors and conspiracies online. Rumors are suggested by individuals on the site, and then staff review and determine whether the claim is verified or false.
This tool shows users the advertisements on their Facebook feeds and guesses which ones are political. It also shows users political advertisements aimed at other users. All political ads that are collected are put into a database that is publicly available.
Factcheck.org is a fact-checking website affiliated with the Annenberg Public Policy Center of the University of Pennsylvania. The purpose of the website is to "monitor the factual accuracy" of claims in American politics, focusing on statements made by U.S. politicians. The group has a number of journalists who research the material.
This tool is intended to build user skill in identifying false information in a gamified format. Using a Tinder-like interface, players swipe left or right depending on whether they think the news presented is real or fake. Users can get hints by looking at the source of the article, and players earn points and can progress through several levels.
This tool is intended to build user skill in identifying false information in a gamified format. In a retro, 1980s arcade-game setup, players see headlines and must guess whether they are fake or real by swiping left or right in a Tinder-like fashion. High scores are recorded on a global leaderboard.
FakerFact is an artificial intelligence tool that assesses the purpose and characteristics of information. Ratings include agenda-driven, journalism, wiki, opinion, and satire. The tool does not rate an article as true or false, but rather provides an assessment of its purpose and its objectivity.
Fakey is a web-based interactive educational tool designed to improve media literacy. It presents news stories that incorporate characteristics of clickbait, fake news, conspiracy theories, and the like, and users are asked to share, hide, or fact-check that information. The goal is to give users experience distinguishing true from false information.
First Draft's web-based "verification curriculum" is designed to teach users (there are versions for both journalists and for the general public) how to verify the accuracy and credibility of different types of media.
Forensically is a web-based collection of tools that can be used for "digital image forensics." Some functionalities include magnifying functions, clone detection, error level analysis, noise analysis, level sweep, and many more.
The Get-Metadata Viewer is a web-based tool that provides users with metadata about photos, videos, and texts, including the location, time, date it was modified, format, file size, etc.
Hamilton 2.0 is a web-based dashboard that provides real-time information on Russian propaganda and disinformation online. It does so by tracking hundreds of Russian-linked Twitter accounts that are related to influencing information in the United States and Europe. The tool provides analysis of the narratives and topics promoted by the Russian government and state-backed media on Twitter, YouTube, broadcast television, and state-sponsored news websites.
Hoaxy is a web-based tool that visualizes the spread of articles online. Hoaxy searches for claims and fact-checking going back to 2016. It tracks the sharing of links to stories from low-credibility sources and independent fact-checking organizations. It also calculates a bot score, which is a measure of the likely level of automation. Overall, the IU Observatory on Social Media is interested in studying and better understanding how information is shared online. They are also interested in studying how social media affects public discourse.
The Iffy Quotient is a web-based tool that uses NewsWhip to query Facebook and Twitter and identify URLs that are known to be biased or to be frequent reporters of false information. The tool then calculates the percentage of URLs on each site that are "iffy," or known for reporting false or misleading information.
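The "percentage of iffy URLs" computation described above reduces to a simple ratio. A minimal sketch follows; the domain list and URL sample are made up for illustration, and the real tool draws its URL data from NewsWhip.

```python
from urllib.parse import urlparse

# Hypothetical list of domains known for false or misleading reporting.
IFFY_DOMAINS = {"fakenews.example", "rumormill.example"}

def iffy_quotient(urls, iffy_domains=IFFY_DOMAINS):
    """Percentage of shared URLs whose domain is on the 'iffy' list."""
    if not urls:
        return 0.0
    hits = sum(1 for u in urls if urlparse(u).netloc in iffy_domains)
    return 100.0 * hits / len(urls)

shared = [
    "https://fakenews.example/shocking-claim",
    "https://news.example/report",
    "https://rumormill.example/hoax",
    "https://news.example/analysis",
]
quotient = iffy_quotient(shared)  # 2 of the 4 URLs are on the iffy list
```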
This tool is an archive of publicly available and attributed data from known online information operations: public, non-deleted posts on Twitter and Reddit attributed to Russian and Iranian state-sponsored actors. The archive currently holds over 10 million messages and will be updated on an ongoing basis.
The International Fact Checking Network is an initiative created in order to promote fact-checking in journalism. Part of this includes the creation of the IFCN Code of Principles, which may help establish standards for fact-checking methods. The IFCN also organizes fellowships, trainings, and conferences.
The Journalism Trust Initiative promotes trust in journalism through the development of standards. It will do so through its Workshop Agreement of the European Centre of Standardization and the participation of media outlets, press councils, and other stakeholders.
KnowNews is a browser extension developed through the Media Monitoring Africa Initiative that classifies news sites based on their credibility. Sites are rated as credible, dodgy, or not rated.
Lead Stories is a web-based fact-checking platform that identifies false or misleading stories, rumors, and conspiracies by using its Trendolizer technology to identify trending content that is then fact-checked by their team of journalists. This fact-checking website is not affiliated with RAND; it is owned and operated by Lead Stories LLC, which you can read more about here: https://leadstories.com/about.html
IREX's Learn to Discern online games were developed to teach media and information literacy skills. The program includes techniques for identifying disinformation.
Making Sense of the News is a six-week online course in journalism and news literacy offered by the City University of New York. It includes teaching from professors at The University of Hong Kong and Stony Brook University and covers topics including how to identify bias, how to evaluate sources, and how to create and share information.
Media Lit: Overcoming Information Overload is an online media literacy course hosted by Arizona State University that educates users on the skills needed to navigate the information ecosystem, to engage as viewers and creators of information, and to analyze information, distinguishing between true and false information.
MediaBugs is a service for reporting and correcting specific errors and problems in media coverage. Once a problem is reported, the tool administrators try to engage the journalist or organization and work toward a correction. The tool provides a neutral, civil, moderated discussion space, which can serve as a forum for debate and discussion.
BitPress's Misinformation Detector is a web-based "decentralized trust protocol" blockchain tool designed to track the credibility of news in a transparent manner. The tool measures trust by analyzing content and what it links to, establishing a network of how the content spreads across media organizations. All media sources are given trust rankings, and thus the association between content and specific sources can affect the trust rankings of other sources. The tool includes fact-checking services, using a combination of human fact-checkers and blockchain technology. Journalists and publishers can sign up to be partners.
Stony Brook's News Literacy Course Pack is a media literacy curriculum that intends to provide students with the skills needed to be engaged and aware information consumers in the 21st century. It covers topics such as accessing and evaluating sources, analyzing information, bias, and the economics of the media industry. It is open to the public (for auditing), but full course credit requires enrollment.
NewsCheck is a web-based platform that performs credibility scoring using a combination of machine technology (blockchain) and human review to fight fake news. It has a system for managing and delivering content deemed trustworthy to consumers. Part of the platform is the NewsCheck Trust Index, a set of journalistic standards used to transparently verify news information; the tool also identifies bias. NewsCheck offers a university pilot that allows journalism and data science students to use the platform for free as experiential learning.
OpenSources is a web-based database of information sources that have been analyzed in terms of their reputations for producing credible news. The database classifies websites as: Fake News, Satire, Extreme Bias, Conspiracy Theory, Rumor Mill, State News, Junk Science, Hate News, Clickbait, Proceed with Caution, Political, and Credible.
Our.news is a website, browser extension, and app that provides fact-checking through crowdsourcing. Users can rate news content or add sources, rating content on "spin," "trust," "accuracy," and "relevance." Ratings are weighted based on credibility, and bias-detection algorithms are also used to weight user ratings. Users can view fact-check information about articles on the site, which includes information on bias, sources, and, where relevant, links to Politifact and Snopes.
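Credibility-weighted aggregation of crowd ratings, as described above, can be sketched as a weighted average. The rating scale and the credibility weights here are assumptions for illustration, not Our.news's actual scheme.

```python
def weighted_rating(ratings):
    """Aggregate crowd ratings, weighting each user's vote by a
    credibility weight in [0, 1]. Scale and weights are illustrative."""
    total_weight = sum(weight for _, weight in ratings)
    if total_weight == 0:
        return None
    return sum(score * weight for score, weight in ratings) / total_weight

# (accuracy score on a 1-5 scale, rater credibility weight)
votes = [(5, 0.9), (1, 0.2), (4, 0.7)]
accuracy = weighted_rating(votes)
```

The low-credibility rater's score of 1 barely moves the result, which is the point of weighting by credibility.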
Politifact is a website that fact-checks "newsworthy and significant" statements and rates these statements as "True," "Mostly True," "Half True," "Mostly False," "False," and "Pants on Fire." The process involves reviewing other fact-checking sources, Google searches, online database searches, expert consultation, and other literature reviews.
This tool is intended to build user skill in identifying false information in a gamified format. Using a Tinder-like format, players swipe left or right to guess whether the quote, author, and context (taken from the fact-checking site Politifact) are truthful. Scores are recorded anonymously and compared with others' high scores.
Polygraph.info is a web-based fact-checking platform that relies on journalists to research government statements and reports, along with statements by high-profile individuals. It classifies the reports based on their measure of credibility and provides additional context.
Project Look Sharp is an educational program that promotes media and digital literacy through the development of curriculum materials and professional development workshops.
Public Editor is a crowd-sourced credibility rating system that outputs scores based on the evaluations of a network of "citizen scientists" who annotate articles with comments regarding mistakes, biases, or other relevant information.
The Civic Online Reasoning Program is an educational initiative through the Stanford History Education Group that produces exercises and assessments to instruct students on how to judge the credibility of online content. It also develops rubrics that can be used to assess student performance.
Rbutr is a collaborative online platform that is developing a database of webpages, each of which is a rebuttal of another webpage, with the goal of combating disinformation and reducing fake news. It does not distinguish between true and false; it simply identifies rebuttals to provide users with a more diverse set of perspectives.
The Media Verification Assistant is a web-based image verification tool. It includes "image tampering detection," "metadata analysis," "GPS geolocation," "EXIF Thumbnail extraction," and a reverse image search.
ClaimReview is a source label attached to website content that provides search engines with some information about the content of the website. This can include a "Fact Check" tag for information that is fact-checked by verified sources and organizations with the aim of providing more accurate information to users. This content will then appear with a Fact Check tag in search engines.
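ClaimReview is published as schema.org structured data, typically embedded in a page as JSON-LD. The sketch below builds a minimal example; the URLs, organization name, and claim text are hypothetical, while the field names follow the schema.org ClaimReview type.

```python
import json

# Minimal schema.org ClaimReview markup a fact-checker might embed in a
# page. All URLs, names, and claim text here are hypothetical.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://factchecker.example/reviews/123",
    "claimReviewed": "Drinking seawater cures the common cold.",
    "author": {"@type": "Organization", "name": "Example Fact Checkers"},
    "datePublished": "2019-06-01",
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False",
    },
}

# Serialized for embedding in a <script type="application/ld+json"> tag.
jsonld = json.dumps(claim_review, indent=2)
```

Search engines that parse this markup can then surface the "False" verdict as a Fact Check tag alongside the result.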
"Share the Facts" is a web-based widget and Slack application that provides readers with a summary of a fact-checked article, highlighting the claim that is checked and the conclusion. The widget has a general structure but can be customized to identify the fact-checking organization. By using the widget, fact-checks can be highlighted in Google search results and shared through platforms like Google and Twitter.
Snopes.com is a website that conducts extensive fact-checking research on popular topics, often chosen based on reader interest. Snopes uses a number of icons to classify content: True, Mostly True, Mixture, Mostly False, False, Unproven, Outdated, Miscaptioned, Correct Attribution, Misattributed, Scam, and Legend.
The Factual is a mobile app and browser extension that scores news content based on "the extent and quality of its sources," "the expertise of the journalist," "the opinionated nature of the language used," and "the historical reputation of the site." It ranks content on a 0-100 scale to measure its quality, and the process is mainly automated. The tool also surfaces higher-quality sources and sources from the other side of the political spectrum.
The Trust Project is a collaborative initiative of several news companies that is developing standards to increase transparency in journalism. The project developed the following "Trust Indicators" as a start: Best Practices, Author/ Reporter Expertise, Type of Work, Citations and References, Methods, Locally Sourced, Diverse Voices, Actionable Feedback. News Organizations will display the indicators on their own articles.
Trive is a web-based platform and browser extension that provides fact-checking through crowdsourcing using blockchain indexing. It uses a Nash Equilibrium incentive structure for fact-checking. Several roles go into the process: the Curator (bids for a story), the Researcher (picks up and researches claims), the Verifier (reviews researched claims), the Witness (reviews anonymous claims), and the Consumer, who uses the Trive plugin. Incentives for "truth telling" are built into the system through tokens, which accrue to the researcher for verified stories, to the verifier for identifying false information, and to the witness.
Trust and Verification is an online open enrollment course that teaches how to build trust as a journalist or content creator in an age of misinformation. It focuses on how to build trust, how the media landscape changes, and how to verify trustworthiness online.
Trusted Times is a browser extension that classifies fake news and unreliable content. It uses machine learning to provide additional information about articles, including bias and details about who is covered positively and negatively. It can classify articles as fake, unreliable, verified, or mainstream media; provide summaries of verified content; identify important topics; identify potential bias; and identify the historical bias of reporters and of specific websites. It uses icons to present the reliability of websites.
Truth Goggles 2.0 is a web-based tool aimed at presenting factual content to partisan audiences by asking users to read claims, then engage with PolitiFact analysis of those claims, then read the claims again. It is an initiative through the Duke Reporter's Lab. While this new version of the tool is still in development (since 2017), the original tool was automated software that flagged suspicious claims in articles.
The TV News Archive is an initiative developing an archive of digital media, including web pages, books and texts, audio recordings, videos, images, and software programs. One of its projects, the "Political TV Ad Archive," is an archive of 2016 political ads combined with fact-checking from a variety of sources (e.g., Politifact, Factcheck.org).
Twitter Trails is a web-based tool that uses an algorithm to analyze the spread of a story and how users react to the story. The tool measures the spread of the story and the skepticism expressed by users.
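The two measures described above can be sketched as simple aggregates over a story's reactions. This is a toy illustration, not Twitter Trails' actual algorithm: the data shape and the "skeptical" labels (which a real system would derive from a classifier) are assumptions.

```python
def story_metrics(tweets):
    """Toy versions of a spread measure (total retweets) and a skepticism
    score (share of reactions expressing doubt). In a real system the
    'skeptical' flags would come from language analysis, not hand labels."""
    spread = sum(t["retweets"] for t in tweets)
    skeptical = sum(1 for t in tweets if t["skeptical"])
    skepticism = skeptical / len(tweets) if tweets else 0.0
    return spread, skepticism

# Hypothetical reactions to one story.
reactions = [
    {"retweets": 120, "skeptical": False},
    {"retweets": 45, "skeptical": True},
    {"retweets": 10, "skeptical": True},
    {"retweets": 300, "skeptical": False},
]
spread, skepticism = story_metrics(reactions)  # wide spread, half the reactions doubting
```

A story with high spread but low skepticism spreads largely unquestioned, which is the combination such tools try to flag.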
Check is an online set of journalism tools that were developed to assist journalists in verifying information. Journalists are able to upload stories and events, and then Check employees work to verify (or authenticate) that information. The authenticated information can serve as a resource to journalists seeking out accurate information. The tool also includes reverse image search resources.
InVid is a plug-in toolkit designed to assist fact-checking through video verification. The tool provides users with contextual information on videos, reverse image searching, video metadata, video copyright information, along with other features to assist in verifying content.
Mozilla's Web Literacy is a set of games and tools created to promote media literacy by teaching users how to write, read, and participate online in a responsible way. There are tools for learning and a community for interacting on the Mozilla platform.
Website Whitelist is a browser extension that allows users to specify sites to be whitelisted and blocks any request to sites not included on that list. The extension also blocks external tracking and advertising websites.
This tool allows users to create an anonymous profile, then collect information about the political and other ads that they see, along with information about why they were targeted with those ads. The tool can provide users with statistics on who/what has been targeting them and uses this information to build a database of political advertising and targeting.
The YouTube Data Viewer is a web-based video verification tool offered through The Citizen Evidence Lab, created by Amnesty International. Users input a YouTube URL, and the tool outputs information about the video that is helpful in verifying a video. This includes upload time and thumbnails that can be used for reverse image searching.