
From suicide prevention to genetic testing, there’s a widening disconnect between Silicon Valley health-tech and outside experts who see red flags


When the Harvard psychiatrist and tech consultant John Torous learned that Facebook monitors its users’ posts for warning signs that they might be at risk of suicide, he was shocked.

Having grown accustomed to working with tech giants like Microsoft on scientific research, he wondered why he’d never heard about Facebook’s program. He was even more surprised to find out that as part of its efforts, Facebook was sending emergency responders to people’s homes.

Facebook’s monitoring tool has been running since 2017 and was involved in sending emergency responders to people more than 3,500 times as of last fall, the company said. But the reason Torous hadn’t heard of it is that the company hasn’t shared information about the tool with researchers like him, or with the broader medical and scientific community.

Without that information, Torous said, big questions about Facebook’s suicide-monitoring tool are impossible to answer. Torous is worried the tool might home in on the wrong users, discourage frank discussions about mental health on the platform, or escalate, or even create, a mental-health crisis where there wasn’t one. In sum, Torous said Facebook’s use of the tool could be harming more people than it’s helping.

“We as the public are partaking in this grand experiment, but we don’t know if it’s useful or not,” Torous told Business Insider.

Facebook says the tool isn’t a health product or research initiative but more akin to calling for help if you see someone in trouble in a public space.

It is the latest example of a trend in Silicon Valley, where the barriers that separate tech from healthcare are crumbling. A growing array of products and services — think Apple Watch, Amazon’s Alexa, and even the latest meditation app — straddle the gap between health innovation and tech disruption. Clinicians see red flags. Tech leaders see revolution.


Silicon Valley’s critics like to point to Theranos as a dramatic example of what can go wrong as a result of the breakdown. Bolstered by big investors, the secretive blood-testing startup reached a valuation of $9 billion, despite publishing little research showing its tech worked. When it was gradually revealed that the advanced technology required for its devices did not exist, the company and its founder, Elizabeth Holmes, toppled.

Clinicians and researchers interviewed for this article said that tech products and services can often be overhyped or even harmful, even if they don’t reach Theranos-level deception. They said the health claims that companies make frequently run ahead of the evidence — a problem when people’s health is on the line.

“There’s almost this implicit assumption that they play by a different set of rules,” Torous said.

Take Juul, which is now the top seller of e-cigarettes in the US. When the San Francisco company launched its high-nicotine vaping devices, it styled them as compatible with a healthy lifestyle. When Juul was then tied to a teen-vaping epidemic, experts called attention to Juul’s lack of published health research and its youthful launch campaign. Had Juul been required to rigorously study its e-cigarettes before flooding the market with them, the company might have avoided putting youth at risk, experts say.

Founded in 2006, 23andMe is one of the oldest Silicon Valley healthcare startups. The company has long portrayed its genetic tests as helping people take better control of their health by providing a snapshot of their risk of diseases like late-onset Alzheimer’s and cancer. In 2013, however, regulators forced the company to stop selling the tests on the grounds that they hadn’t proven their results to be accurate with published research.

Now, with limited regulatory sign-off and dozens of published studies, 23andMe is selling its health tests once again. Yet clinicians still call the reports subpar and say they can mislead. They point out that although regulators have approved them as medical tools, the bar for that approval was significantly lowered recently. 23andMe, for its part, says its reports are empowering and touts regulators’ blessing.

‘The walls are breaking down fast’

Some experts find 23andMe’s health reports concerning.
Hollis Johnson/Business Insider

In the view of Laura Hercher, the director of research in human genetics at Sarah Lawrence College, tech companies and clinicians approach health problems from fundamentally different perspectives. Where tech tends to prioritize disruption and convenience, healthcare puts an emphasis on safety.

But the invisible barriers that once separated tech from health are deteriorating, she said. In the meantime, patients and consumers may suffer the consequences, other experts say.

“The walls are breaking down fast,” Hercher told Business Insider. “There’s going to be a lot to figure out as we go along.”

At Facebook, a health problem came to the company. Staff had known there was an issue since 2009, when a cluster of suicides occurred at two high schools near the company’s headquarters in Palo Alto. Then things became personal. After the company rolled out a livestreaming tool called “Facebook Live,” several people used it to livestream their suicides: first a 14-year-old girl, then a 33-year-old man, both in the US. Later, in the fall, a young man in Turkey broadcast himself taking his own life.

Facebook tasked its safety-and-security team with doing something about it.

The team spoke with experts at several suicide-prevention nonprofits, including Daniel Reidenberg, the founder of Save.org. Reidenberg told Business Insider that he helped Facebook create a solution by sharing his experiences, bringing in people who’d struggled personally with suicide, and having them share what helped them.

The result was Facebook’s suicide-monitoring algorithm, or, as the company calls it, its suicide-prevention algorithm. Using pattern-recognition technology, the tool identifies posts and livestreams that appear to express suicidal intent. It scans the text of a post along with the comments on it, looking for signals such as “Are you OK?” When a post is ranked as potentially suicidal, it is sent first to a content moderator and then to a trained staff member tasked with notifying emergency responders.
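Facebook has not published how the algorithm works, so any concrete description is speculative. As a rough illustration only, a triage pipeline of the general shape the company describes (score a post and its comments, then route anything above a threshold to human review) might look like the Python sketch below. Every phrase, weight, and threshold here is hypothetical and stands in for the trained pattern-recognition models Facebook presumably uses rather than keyword lists.

    # Hypothetical sketch of the triage flow described above; Facebook's real
    # system is unpublished and presumably uses trained models, not keyword lists.

    POST_SIGNALS = {"can't go on": 0.6, "goodbye everyone": 0.5}
    COMMENT_SIGNALS = {"are you ok": 0.4, "please don't do this": 0.5}
    REVIEW_THRESHOLD = 0.8  # made-up cutoff for sending a post to human review

    def risk_score(post_text, comments):
        """Sum the weights of hypothetical signal phrases found in a post and its comments."""
        score = sum(w for p, w in POST_SIGNALS.items() if p in post_text.lower())
        for comment in comments:
            score += sum(w for p, w in COMMENT_SIGNALS.items() if p in comment.lower())
        return score

    def triage(post_text, comments):
        """Route a post: flagged posts go to a human content moderator first,
        then to trained staff who decide whether to notify emergency responders."""
        if risk_score(post_text, comments) >= REVIEW_THRESHOLD:
            return "send to content moderator"
        return "no action"

    # Example: a post plus a worried comment crosses the hypothetical threshold.
    print(triage("Goodbye everyone, I can't go on", ["Are you OK?"]))

Note that even in the system the article describes, no automated step contacts responders directly; the model only prioritizes posts for human reviewers.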

Clinicians and companies disagree on the definition of health research

Antigone Davis, Facebook’s global head of safety, told Business Insider that she likens the tool to crisis response and does not consider it health research. She said Facebook doesn’t store data on individuals related to what the algorithm detects about their suicide risk.

“The AI is working on the content, not on the individual,” Davis said.

It is unclear how well the tool works. Because of privacy issues, emergency responders can’t tell Facebook what happened at the scene of a potential suicide, Davis said. In other words, emergency responders can’t tell Facebook if they reached the scene too late to stop a death, showed up to the wrong place, or arrived only to learn there was no real problem.

Torous, a psychiatrist who’s familiar with the thorny issues in predicting suicide, is skeptical. He points to a review of 17 studies in which researchers analyzed 64 different suicide-prediction models and concluded that the models had almost no ability to successfully predict a suicide attempt.

“It’s one thing for an academic or a company to say this will or won’t work. But you’re not seeing any on-the-ground peer-reviewed evidence,” Torous said. “It’s concerning. It kind of has that Theranos feel.”

Reidenberg told Business Insider that he believes Facebook is doing good work in suicide prevention, but because its efforts are in uncharted waters, he expects issues to arise with the tool. He disagrees with Torous’ view that the efforts amount to health research. “There isn’t any company that’s more forward-thinking in this area,” Reidenberg said.

Something that’s easier or prettier may not be good enough in healthcare

When Pax Labs launched the Juul e-cigarette in 2015, the company had published no health research on the device.
Melia Robinson/Business Insider

Juul has long presented itself as a health-tech company and is eager to show that its devices can improve the health of adult smokers. When it launched its e-cigarettes in 2015 with a party in New York City, Juul’s then-parent company, the tech startup Pax Labs, called the Juul “smoking evolved.”

Before the launch party, though — and for several years afterwards — neither Pax nor Juul published any real health research. Then, reports of a vaping epidemic among teens began to surface.

Meanwhile, clinicians and academics looked at Juul’s devices and saw a big problem: They had a handful of qualities that made them uniquely appealing to young people.

Even compared with other e-cigarettes, Juul devices contain very high levels of addictive nicotine, which may help adult smokers but which also appear to interfere with learning and memory in the still-developing teen brain, according to Suchitra Krishnan-Sarin, a professor of psychiatry at Yale’s center for nicotine and tobacco research. Juuls are also easier to hide and to use discreetly, another quality that could be helpful for adults but especially harmful for teens, Krishnan-Sarin said.

Other experts point to Juul’s 2015 ads — which depicted young models against flashy backgrounds — and to Juul’s sweet flavors, such as crème brûlée and cool cucumber. They say both appealed uniquely to youth. Had the startup studied its devices before selling them, those problems might have been foreseeable, they say.

“The problem is Juul products just came onto the market without any regulation and without any controlled studies,” Krishnan-Sarin told Business Insider.

In a statement emailed to Business Insider, a Juul spokesperson said the company “exists to help adult smokers switch from combustible cigarettes, which remain the leading cause of preventable death around the world,” and added that Juul is now publishing research. The representative also said the company is committed to preventing youth access to its products and supports raising the national tobacco and vapor purchasing age to 21.

“We invite those who criticize us for launching in 2015 to talk to former smokers about the impact switching to Juul has had on their lives,” the spokesperson said.

Who are 23andMe’s genetic tests for?

Then there’s 23andMe, which rolled out the health and disease component of its genetic tests in 2013 before publishing research showing the tests to be accurate, according to regulators at the US Food and Drug Administration. Today, the agency has approved 23andMe’s products as medical tools, thanks in part to a less stringent review process introduced last year. In addition, the company has now published dozens of basic research papers. But clinicians say those things don’t mean the tests are safe.

Jeffrey Pollard, 23andMe’s director of medical affairs, told Business Insider that its tests are designed for healthy people who want to learn more about their genes and are not intended to meet the level of care expected in the clinic. He said the company is clear in how it communicates that to customers. Pollard also said 23andMe regularly engages with outside experts to ensure its products are up to date.

“I think it’s obvious that genetics and the tests we provide are quite impactful, and in that way, we deserve to be paid attention to and embraced,” he said.

But to Hercher, 23andMe’s reports are concerning for several reasons. One is that the tests don’t include counseling, a service that she and other experts see as critical to ensuring that people understand their results and their real risk of disease. Another is that they are not comprehensive: they look at only a select few of the genes involved in a person’s risk of disease.

“Producing something that kind of works and is faster, cheaper, or easier isn’t always an adequate answer if it turns out to put some people at risk,” she said.

‘Move fast and break things’ vs. ‘First, do no harm’

Facebook CEO Mark Zuckerberg answered questions about privacy at a Senate committee last year.
Chip Somodevilla/Getty Images

A decade ago, Facebook cofounder and CEO Mark Zuckerberg told Business Insider founder Henry Blodget that his prime directive to his team was to “move fast and break things.”

“Unless you are breaking stuff,” he said, “you are not moving fast enough.”

It has become the unofficial motto of Silicon Valley.

But experts including Torous say that mantra is at odds with medicine’s Hippocratic oath, in which doctors swear to “first, do no harm.”

In the cases of suicide prevention, genetic testing, and e-cigarettes, lives may hang in the balance. In the tech universe, much of the motivation for a new technology is wrapped up in its potential to disrupt existing markets. But in healthcare, clinicians have to think about what could happen to someone after they use the tool they are given.

Healthcare is an industry that requires particular caution because patients are often in a vulnerable position. They might be sick or facing an elevated risk of disease or death. The chance of causing harm is high.

Hercher and Torous said that academics and clinicians play by rules different from Silicon Valley’s.

“It’s not that we’re dealing with different fact sets — we have different obligations,” Hercher said. “We live in different universes.”

Academics are worried about vulnerable populations


Torous is worried that Facebook’s suicide-monitoring tool doesn’t work very well, especially based on what he’s seen published about other similar algorithms. He’s also concerned that it could cause problems by either identifying the wrong people, which would add stress to an already strained healthcare system and waste money, or by discouraging Facebook users from speaking frankly about their mental state with their peers.

“We know Facebook built it and they’re using it, but we don’t really know if it’s accurate, if it’s flagging the right or wrong people, or if it’s flagging things too early or too late,” Torous said.

Krishnan-Sarin and University of Southern California preventive-medicine professor Jessica Barrington-Trimis are concerned that even if Juul helps adult smokers, the products could hurt thousands of young people who wouldn’t otherwise have smoked by making them more likely to pick up a cigarette.

“We want smokers to quit. If you can provide them with a cleaner form of nicotine, that’s great. But many kids say they go through a whole pod in 24 hours. That’s very concerning. Nicotine is a neurotoxin to the adolescent brain,” Krishnan-Sarin said.

In a similar vein, Hercher and Ross are worried that people at a high risk of disease who take a 23andMe test could be harmed. Both of them said the tests are set up in a way that customers could believe that they’ve been screened for a serious disease such as cancer, for example, when in reality, they have not.

One case of this occurred in 2010, when an oncologist named Pamela Munster took a 23andMe breast-cancer test and was relieved to learn she was negative, as The New York Times reported.

Two years later, Munster learned she had breast cancer. A more thorough clinical test revealed that indeed, she had a genetic mutation that had raised her risk of the disease. It was a mutation that 23andMe didn’t test for.

“It’s like you bring your car in for service and they say, ‘OK, we checked your rear right brake and it’s working,'” Hercher said. “If you think you’ve just had your car serviced, you’ve not been well informed.”

A developing playbook for health-tech startups: Publish more research

To John Ioannidis, an early Theranos skeptic and a professor of medicine at Stanford University, the time is ripe for another Theranos-like debacle in health tech. In January, he and a team of researchers published a study suggesting that very few of the well-funded health-tech startups out there are publishing scientific literature.

The way to avoid that problem is simple, he and his coauthors suggest: Startups need to start publishing results. “Startups are key purveyors of innovation: holding them to a minimum standard of evaluation is essential,” they wrote.

Peer-reviewed research involves subjecting your work to scrutiny by outside experts in the same field. Whether it’s a biotech company claiming its new therapy can cure cancer or a tech company trying to prevent suicide, those assertions can and should be measured and quantified, Ioannidis and his coauthors say.

Juul appears to have heeded Ioannidis’ call.

This year, the company began to publish health research, and it told Business Insider last month that it was beefing up its research efforts with a team focused on doing more of that kind of work. Business Insider also exclusively reported that Juul is exploring a digital-health offering that could complement its devices with an app or other smartphone-based tool designed to help smokers quit.

In addition, Juul has made over its image and done away with ads that outside experts said appealed to teens. Advertisements that featured young models on bright backgrounds have been swapped for images of adults with pops of gray in their hair. A neon online color scheme has shifted to muted hues of navy and gray. Flavors such as cool cucumber and crème brûlée have been shortened to cucumber and crème.


“We are committed to continuing to research the potential public health impact of our products and have over 100,000 participants enrolled in behavioral studies across the world,” the Juul spokesperson said.

Prioritize safety

In addition to publishing research, experts say startups need to place a clearer emphasis on safety and transparency. To do that, companies like 23andMe, Juul, and Facebook all need to think more about how their tools might impact potentially vulnerable people.

“There have to be quality controls, there has to be truthfulness,” Hercher said. “Is simply having good information enough? There has to be a line somewhere.”

Facebook maintains that its suicide-prevention work does not fall under the domain of health but is instead a form of emergency response. It has not published any data on the algorithm.

As for the world of genetic testing, experts say it’s still something of a Wild West for customers, but a handful of companies are currently trying to address that. They hope to combine the convenience and simplicity of 23andMe with the thoroughness of a clinical experience.

Invitae, one startup, recently announced plans to roll out a test that, like 23andMe’s, could be ordered by consumers, but that would also incorporate genetic counseling and require a doctor’s sign-off. Last month, another startup, Helix, launched a comparable test that includes optional genetic counseling. And Color Genomics has long had a test that works similarly and links customers to genetic counselors over the phone.

“There’s some great science out there,” Ross said. “I want to see more of it.”


