The tech giant is currently operating without a third-party fact-checking service in the Netherlands amid an ongoing dispute over allowing political ads to contain misinformation. Facebook, however, insists it's not an "appropriate role for us to referee political debates and prevent a politician's speech from reaching its audience and being subject to public debate and scrutiny." For the past year, NU.nl had been Facebook's only third-party fact-checker in the country after Leiden University pulled out of its partnership with the social network over similar issues, as first noted by The Verge.

With just over a year left until the 2020 US presidential election, social media giants have been preparing by updating their misinformation policies and political advertising rules. Last month, Facebook released a set of new tools to better "protect the democratic process," but it has repeatedly come under fire for allowing politicians to openly lie in ads on the platform. As outlined in NU.nl's post, the publisher has been in conflict with Facebook since May, when it ruled that an ad by Dutch politician Esther de Lange was "unsubstantiated" after she claimed that 10 percent of Romanian farmland was owned by non-Europeans. Soon after, Facebook intervened, saying that politicians were not to be fact-checked on the platform, which ultimately made NU.nl "uncomfortable" in the partnership.

"We value the work that NU.nl has done and regret to see them go, but respect their decision as an independent business," a Facebook spokesperson told The Verge.
"We have strong relationships with 55 fact-checking partners around the world who fact-check content in 45 languages, and we plan to continue expanding the program in Europe and hopefully in the Netherlands."

As Facebook continues to struggle to justify its stance on political ads — something its own employees have taken issue with — other tech giants, including Google, Twitter, and Snapchat, have fine-tuned their approaches to handling political ads on their platforms. Last month, Twitter CEO Jack Dorsey tweeted that the platform would ban all political ads. Last week, Google updated its policy to limit advertisers' ability to target ads based on voters' political leanings or public voter records. That same week, Snap CEO Evan Spiegel said his company would fact-check and review all political ads on its platform.

With election campaigns directly shaped by technological advances, social platforms like Facebook must take responsibility for stemming the spread of misinformation and for preventing their platforms from being weaponized to manipulate mass public opinion.