Apple and Google are trying to get more U.S. states to adopt their phone-based approach for tracing and curbing the spread of the coronavirus by building more of the necessary technology directly into phone software as doctors sound the alarm about Covid-19 misinformation running rampant on social media.
That could make it much easier for people to get the technology on their phones even if their local public health agency hasn’t built its own compatible app.
The tech giants on Tuesday launched the second phase of their “exposure notification” system, designed to automatically alert people if they might have been exposed to the coronavirus.
Until now, only a handful of U.S. states have built pandemic apps using the tech companies’ framework, which has seen somewhat wider adoption in Europe and other parts of the world.
States must choose whether they want to enable the Apple-Google system. If they do, iPhone users in those states will automatically be able to opt into the system without having to download an app. They’ll be prompted with a notification asking if they consent to running the system on their phones.
For people with Android phones, Google will automatically generate an Android app for public health agencies that phone users can then download.
The companies said they expect Maryland, Nevada, Virginia and Washington, D.C., to be the first in the U.S. to launch the new version of their tool. Virginia says nearly half a million residents have downloaded its app since the state in August became the first to launch a customized pandemic app using the Google-Apple framework.
But state officials have said their app doesn’t work as well outside Virginia, although they expect a group of coordinating public health agencies to get a national server up and running before long so other states can join in.
The technology relies on Bluetooth wireless signals to determine whether an individual has spent time near anyone else who has tested positive for the virus. Both people in this scenario must have the system enabled on their phones. The system relies on proximity rather than geographic location, and the companies say it won’t reveal personal information either to them or to public health agencies.
Individuals who receive such proximity alerts will typically be offered testing and health advice to prevent potential future spread.
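The matching process described above can be sketched in a few lines. This is a simplified illustration of the general idea, not the actual Apple-Google protocol (which derives rotating identifiers cryptographically from daily keys): each phone broadcasts random rotating identifiers over Bluetooth and logs the identifiers it hears nearby; when a user tests positive, their recent identifiers are published, and every other phone checks that published list against its own local log. All names here (`Phone`, `check_exposure`, and so on) are hypothetical.

```python
import secrets

def new_rolling_id() -> bytes:
    """A random rotating identifier -- a stand-in for the protocol's
    cryptographically derived rolling proximity IDs."""
    return secrets.token_bytes(16)

class Phone:
    def __init__(self):
        self.my_ids = []        # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers heard from nearby phones

    def broadcast(self) -> bytes:
        rid = new_rolling_id()
        self.my_ids.append(rid)
        return rid

    def hear(self, rid: bytes) -> None:
        self.heard_ids.add(rid)

    def check_exposure(self, published_ids) -> bool:
        # Matching happens locally on the phone: no location data or
        # identity ever leaves the device.
        return any(rid in self.heard_ids for rid in published_ids)

# Two phones spend time near each other.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())

# Alice tests positive and her recent identifiers are published.
published = alice.my_ids

print(bob.check_exposure(published))      # True: Bob is alerted
print(Phone().check_exposure(published))  # False: no contact, no alert
```

Because the published list contains only random identifiers, a phone that matches one learns only that it was near *some* positive case at *some* point, which is what lets the system alert people without revealing who exposed them or where.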
Social media warning
Meanwhile, on the social media front, a new report notes the spread of misinformation peaked at an estimated 460 million views on Facebook in April 2020, right as the pandemic escalated around the world. The report about Facebook comes from Avaaz, a non-profit civil society group.
Doctors Seema Yasmin and Craig Spencer wrote about the report in an opinion piece in The New York Times, where they discussed conversations with colleagues and patients about coronavirus myths they had read on social media.
“Everyone, at some level, is susceptible to this health misinformation and disinformation because there is so much nonsense that’s circling out there. The people who are pushing it are really preying on the fact that we are all really vulnerable right now. We’re scared, we’re anxious, we’re overwhelmed with information, we don’t know what to believe,” Yasmin told CNN’s Don Lemon Monday. “The President is saying one thing, the scientists are saying another.”
The report, published on August 19, concludes that, bottom line, Facebook is failing to keep people safe and informed.
In a statement to CNN, a Facebook spokesperson said while the company shares Avaaz’s goal to stop the spread of misinformation, “their findings don’t reflect the steps we’ve taken to keep it from spreading on our services.”
“Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98 million pieces of COVID-19 misinformation and removed 7 million pieces of content that could lead to imminent harm,” the statement said. “We’ve directed over 2 billion people to resources from health authorities and when someone tries to share a link about COVID-19, we show them a pop-up to connect them with credible health information.”
‘Viral misinformation is spreading faster than the virus itself’
“It’s not just the patients and it’s not just people on the left or people on the right, it’s all people,” Spencer told Lemon.
“We see the consequences in the clinic and the emergency room,” Spencer and Yasmin wrote in the NYT. “Patients question our evidence-based medical guidance, refuse safe treatments and vaccines, and cite Facebook posts as ‘proof’ that Covid-19 is not real.”
Virus rumors, stigma and conspiracy theories have been circulating in 25 different languages across at least 87 countries — and this spread of misinformation has led to deaths and injuries, according to a study published in the American Journal of Tropical Medicine and Hygiene in early August.
The researchers found 2,311 reports related to possible Covid-19 misinformation and of those reports, 89% were classified as rumors; 7.8% were conspiracy theories; and 3.5% were stigma.
The study included some examples: “Poultry eggs are contaminated with coronavirus” and “Drinking bleach may kill the virus” were rumors; “Every disease has ever came from China” was stigma; and “It’s a bio-weapon funded by the Bill & Melinda Gates foundation to further vaccine sales” was a conspiracy theory.
“This undermining of public health and science since the beginning of this outbreak and this pandemic in the US has resulted in people just don’t know where to go,” Spencer said. “The fact that we don’t have the CDC out in front of the public every single day telling us the updates, giving us some type of information that’s credible, people have to find it on Facebook or have to find it on the President’s Twitter feed.”
Content from the top 10 websites spreading health misinformation had almost four times as many estimated views on Facebook as content from the top 10 leading health institutions, like the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC), according to Avaaz’s report.
Spencer said the absence of readily available information from the CDC and The Coronavirus Task Force has led people to places like Facebook where “viral misinformation is spreading faster than the virus itself.”
And the spread of misinformation is only making doctors’ jobs harder.
“While we try, each day, to counter these dangerous falsehoods that circulate among our patients and our peers, our ability to counsel and provide care is diminished by a social network that bolsters distrust in science and medicine,” Spencer and Yasmin wrote in the NYT. “Facebook is making it harder for us to do our jobs.”