A self-regulating, technocratic leviathan, Meta has announced the rollout of a massive “fact-checking” campaign to coincide with the pending federal election in Australia.
Although an election date is yet to be formally set, the parent company of Facebook and Instagram revealed it has long been preparing for the showdown with “misinformation” miscreants.
In a recent statement posted on Facebook Australia, Josh Machin, Facebook’s head of policy in Australia, asserted:
“Meta has been preparing for this year’s Australian election for a long time and will be using a comprehensive strategy to combat misinformation, election interference, and other forms of abuse on our platforms.”
Meta’s crackdown campaign expands on their already gung-ho, third-party “fact-checking.”
The expansion adds an RMIT University initiative, RMIT FactLab – which works alongside RMIT ABC (the Australian Broadcasting Corporation) – to the “fact-checking” services already provided to Meta by Agence France-Presse and Australian Associated Press.
The “fact-checking” partnership also involves the non-profit media training organisation First Draft, whose raison d’être is:
“To protect communities from harmful misinformation, and empower society with the knowledge, understanding, and tools needed to outsmart false and misleading information.”
Facebook Australia’s Josh Machin wrote that the expansion will also offer ‘one-off grants to all [affiliated] fact-checkers [in order] to increase their capacity in the lead up to the [Australian] election.’
The well-coordinated “fact-checking” blitz will see “fact-checkers” work alongside Artificial Intelligence, and includes a joint First Draft and Meta media campaign, “Don’t be a Misinfluencer”, to be rolled out on Facebook and Instagram.
The campaign’s alleged aim is to ‘help creators and influencers to share tips for spotting false news.’
For their part, RMIT FactLab teams will monitor:
“News media as well as social media platforms to identify verifiable statements that relate to topics that matter to people and form part of the national debate.”
Despite its affiliation with the leftist-dominated Australian Broadcasting Corporation, FactLab claims to be independent and neutral:
“We do not seek to influence voters or push for a particular outcome. We do not speculate…We simply follow the facts no matter where they lead…checking information against publicly available data and consulting experts.”
Notably, FactLab attests that it does not fact-check “opinions, speculation about future events, news reports or statements from journalists – noting, ‘that’s the job of the ABC’s Mediawatch program.’”
In his Facebook Australia statement, Josh Machin claimed Meta has invested $7 billion in ‘safety and security,’ employing 40,000 people around the world to ‘reduce the likelihood of election interference, misinformation and online harms.’
He explained that the expansion of Meta’s “fact-checking” apparatus is to:
“Identify and take action against threats to the election; protect candidates and leaders running in the Australian election; encourage transparency, and allow Australians to control their experience.”
Machin’s statement included comments from FactLab director, Russell Skelton, stating, “We see this as a really important public service.”
While Meta’s intention to safeguard the Australian election makes surface-level common sense, given the proven cyber-warfare capabilities and past actions of the Chinese Communist Party, Meta’s self-regulation and ideological bent raise questions about its own problem with managing information.
Who polices the Big Tech thought police?
Other questions arise from Meta’s relationship with organisations like First Draft, which appears to exist as a hive mind for schooling journalists in Leftist newspeak and groupthink – where claims to truth are made, yet adherents cannot define the term “woman”, and consider abortion to be “healthcare”.
The reality check for Meta’s “fact-checking” is that Silicon Valley’s Globalist Technocracy is not capable of being “a source for truth” without both internal and external checks and balances.
No human organisation is free of corruption. This includes Meta.
It is a truism echoed by Facebook whistle-blower Frances Haugen, whose infamous testimony ‘warning of election risks’ accused Facebook executives of manipulating algorithms and putting “astronomical profits” before safety.
Another example is Meta’s removal of United Australia Party leader Craig Kelly from its platforms.
Kelly was perma-banned for alleged breaches of community standards regarding medically backed arguments about potential COVID-19 treatments.
For all of Josh Machin’s rhetoric about Meta protecting Australian politicians and candidates, Meta still has what amounts to a gag order on a sitting member of the Australian parliament, inadvertently giving the member’s political opponents a clear advantage.
These examples, and others, demand solid answers to questions like:
Is Meta capable of objectivity?
Does Meta have the moxie to police itself against its own lies, and election interference?
In other words, is Meta ready to police its own proven double standards, and leftist bias?
Until Meta’s elite can answer these questions and halt their own flirtations with Cancel Culture, they cannot resolve legitimate concerns about their possible end-game intent.