In a bid to tackle fake news on its platform, Facebook has launched a new fact-checking service.
Facebook has joined forces with Full Fact, a fact-checking charity, which will review any images, videos or stories flagged by users.
Each review will focus on misinformation perceived to be the most damaging.
This includes fake medical information, misleading stories around terror attacks and hoaxes around elections.
The new service comes amid criticism of Facebook from politicians over its handling of misinformation surrounding elections worldwide.
The Brexit referendum and 2017 general election were both found to have been tarnished by so-called fake news, while online mistruths around the NHS and immigration have been blamed for stoking division in nations around the world.
Social media companies have faced the threat of regulation if they fail to act on false information on their platforms, and Facebook has been called to answer questions from lawmakers in numerous countries on the subject.
In a highly publicized evidence session before the US Congress in April, founder Mark Zuckerberg addressed the company's failings on false information and the data scandal involving Cambridge Analytica.
However, he failed to appear when called to the UK Parliament's inquiry into fake news, prompting MPs to leave an empty chair for him during a session with vice-president Richard Allan in November.
Under the new measures, Facebook users will be able to report posts they fear may be inaccurate for Full Fact to review, while other suspicious posts will be identified by Facebook technology.
Posts will then be labelled as true, not true or a mixture when users share them.
If a piece of content is found to be false, it will appear lower in Facebook's News Feed but will not be deleted.
Claire Wardle, executive director of First Draft, which worked with Full Fact on the 2017 general election, said the biggest problem is that Facebook holds all the information about the project, making it almost impossible for independent auditors to assess whether it is working.
"Facebook has this global database of online misinformation and that is something that should be available to researchers and the public," said Dr Wardle.
"The first concern is to protect free speech and people's ability to say what they want," said Will Moy, director of Full Fact, adding that the main problem on social media is often that "it is harder and harder to know what to trust".
Rather than the "nuanced political fact-checking" on topics such as Brexit and immigration often found on Full Fact's website, Mr Moy predicted misinformation around health will be one of the biggest issues his team will be tackling.
Facebook first launched its fact-checking initiative in December 2016, after concerns were raised about hoaxes and propaganda spread around the election of Donald Trump.
The social network now works with fact-checkers in more than 20 countries to review content on its platform but studies disagree as to whether their efforts have been effective.
Full Fact will publish all its fact-checks on its website, Mr Moy said, as well as quarterly reports reviewing the relationship with Facebook.
Sarah Brown, training and news literacy manager, EMEA at Facebook, said in a statement: "People don't want to see false news on Facebook, and nor do we.
"We're delighted to be working with an organization as reputable and respected as Full Fact to tackle this issue.
"By combining technology with the expertise of our fact checking partners, we're working continuously to reduce the spread of misinformation on our platform."