Health insurers vacuuming up details about you will affect your rates

To an outsider, the fancy booths at last month's health insurance industry gathering in San Diego weren't very compelling: a handful of companies pitching "lifestyle" data and salespeople touting jargony phrases like "social determinants of health."

But dig deeper and the implications of what they're selling might give many patients pause: a future in which everything you do — the things you buy, the food you eat, the time you spend watching TV — may help determine how much you pay for health insurance.

With little public scrutiny, the health insurance industry has joined forces with data brokers to vacuum up personal details about hundreds of millions of Americans, including, odds are, many readers of this story. The companies are tracking your race, education level, TV habits, marital status, net worth. They're collecting what you post on social media, whether you're behind on your bills, what you order online. Then they feed this information into complicated computer algorithms that spit out predictions about how much your health care could cost them.
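To make the pipeline described above concrete — lifestyle attributes fed into a model that predicts a person's cost — here is a minimal sketch. Everything in it is invented for illustration: the attribute names, weights and baseline are hypothetical and come from no actual insurer's model, which vendors in this story decline to disclose.

```python
# Illustrative sketch only: hypothetical attributes and weights,
# not any insurer's actual model or methodology.

def predicted_annual_cost(person: dict) -> float:
    """Toy linear model: lifestyle attributes -> predicted annual cost in dollars."""
    base = 4500.0                        # hypothetical baseline annual cost
    weights = {                          # hypothetical dollar weight per attribute
        "behind_on_bills": 900.0,
        "recent_name_change": 650.0,
        "hours_tv_per_day": 120.0,       # per hour of daily viewing
        "plus_size_purchases": 400.0,
    }
    cost = base
    for attr, weight in weights.items():
        cost += weight * float(person.get(attr, 0))
    return cost

# A profile with two of the hypothetical flags set:
profile = {"behind_on_bills": 1, "hours_tv_per_day": 4}
print(predicted_annual_cost(profile))  # 5880.0
```

The point of the sketch is only that such a model scores people on non-medical data; a missing attribute silently defaults to zero, which hints at how error-prone inputs propagate into the prediction.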

Are you a woman who recently changed your name? You could be newly married and have a pricey pregnancy pending. Or maybe you're stressed and anxious from a recent divorce. That, too, the computer models predict, may run up your medical bills.

Are you a woman who’s purchased plus-size clothing? You’re considered at risk of depression. Mental health care can be expensive.

Low-income and a minority? That means, the data brokers say, you are more likely to live in a dilapidated and dangerous neighborhood, increasing your health risks.

“We sit on oceans of data,” said Eric McCulley, director of strategic solutions for LexisNexis Risk Solutions, during a conversation at the data firm’s booth. And he isn’t apologetic about using it. “The fact is, our data is in the public domain,” he said. “We didn’t put it out there.”

Insurers contend they use the information to spot health issues in their clients — and flag them so they get the services they need. And companies like LexisNexis say the data shouldn’t be used to set prices. But as a research scientist from one company told me: “I can’t say it hasn’t happened.”

At a time when every week brings a new privacy scandal and worries abound about the misuse of personal information, patient advocates and privacy scholars say the insurance industry’s data gathering runs counter to its touted, and federally required, allegiance to patients’ medical privacy. The Health Insurance Portability and Accountability Act, or HIPAA, only protects medical information.

“We have a health privacy machine that’s in crisis,” said Frank Pasquale, a professor at the University of Maryland Carey School of Law who specializes in issues related to machine learning and algorithms. “We have a law that only covers one source of health information. They are rapidly developing another source.”

Patient advocates warn that using unverified, error-prone “lifestyle” data to make medical assumptions could lead insurers to improperly price plans — for instance, raising rates based on false information — or discriminate against anyone tagged as high cost. And, they say, the use of the data raises thorny questions that should be debated publicly, such as: Should a person’s rates be raised because algorithms say they are more likely to run up medical bills? Such questions might be moot in Europe, where a strict law that took effect in May bans trading in personal data.

This year, ProPublica and NPR are investigating the various tactics the health insurance industry uses to maximize its profits. Understanding these strategies is important because patients — through taxes, cash payments and insurance premiums — are the ones funding the entire health care system. Yet the industry’s bewildering web of strategies and inside deals often has little to do with patients’ needs. As the series’ first story showed, contrary to common belief, lower bills aren’t health insurers’ top priority.

Inside the San Diego Convention Center last month, there were few qualms about the way insurance companies were mining Americans’ lives for information — or what they planned to do with the data.

The sprawling convention center was a balmy draw for one of America’s Health Insurance Plans’ marquee gatherings. Insurance executives and managers wandered through the exhibit hall, sampling chocolate-covered strawberries, champagne and other delectables designed to encourage deal-making.

Up front, the prime real estate belonged to the big guns in health data: the booths of Optum, IBM Watson Health and LexisNexis stretched toward the ceiling, with flat-screen monitors and some comfy seating. (NPR collaborates with IBM Watson Health on national polls about consumer health topics.)

To understand the scope of what they were offering, consider Optum. The company, owned by the massive UnitedHealth Group, has collected the medical diagnoses, tests, prescriptions, costs and socioeconomic data of 150 million Americans going back to 1993, according to its marketing materials. (UnitedHealth Group provides financial support to NPR.) The company says it uses the information to link patients’ medical outcomes and costs to details like their level of education, net worth, family structure and race. An Optum spokesman said the socioeconomic data is de-identified and is not used for pricing health plans.

Optum’s marketing materials also boast that it now has access to even more. In 2016, the company filed a patent application to gather what people share on platforms like Facebook and Twitter, and to link this material to the person’s clinical and payment information. A company spokesman said in an email that the patent application never went anywhere. But the company’s current marketing materials say it combines claims and clinical information with social media interactions.

I had a lot of questions about this and first reached out to Optum in May, but the company didn’t connect me with any of its experts as promised. At the conference, Optum salespeople said they weren’t allowed to talk to me about how the company uses this information.

It isn’t hard to understand the appeal of all this data to insurers. Merging information from data brokers with people’s clinical and payment records is a no-brainer if you overlook potential patient concerns. Electronic medical records now make it easy for insurers to analyze massive amounts of information and combine it with the personal details scooped up by data brokers.

It also makes sense given the shifts in how providers are getting paid. Doctors and hospitals have typically been paid based on the quantity of care they provide. But the industry is moving toward paying them in lump sums for caring for a patient, or for an event, like a knee surgery. In those cases, the medical providers can profit more when patients stay healthy. More money at stake means more interest in the social factors that might affect a patient’s health.

Some insurance companies are already using socioeconomic data to help patients get appropriate care, such as programs to help patients with chronic diseases stay healthy. Studies show that social and economic aspects of people’s lives play an important role in their health. Knowing these personal details can help insurers identify those who may need help paying for medication or getting to the doctor.

But patient advocates are skeptical that health insurers have altruistic designs on people’s personal information.

The industry has a history of boosting profits by signing up healthy people and finding ways to avoid sick people — called “cherry-picking” and “lemon-dropping,” experts say. Among the classic examples: a company was accused of putting its enrollment office on the third floor of a building without an elevator, so only healthy patients could make the trek to sign up. Another tried to appeal to spry seniors by holding square dances.

The Affordable Care Act prohibits insurers from denying people coverage based on pre-existing health conditions or charging sick people more for individual or small-group plans. But experts said patients’ personal information could still be used for marketing, and to assess risks and determine the prices of certain plans. And the Trump administration is promoting short-term health plans, which do allow insurers to deny coverage to sick patients.

Robert Greenwald, faculty director of Harvard Law School’s Center for Health Law and Policy Innovation, said insurance companies still cherry-pick, but now they’re subtler. The center analyzes health insurance plans to see if they discriminate. He said insurers will do things like fail to include enough information about which drugs a plan covers — which pushes sick people who need specific medications elsewhere. Or they may change what a plan covers, or how much a patient has to pay for a type of care, after a patient has enrolled. Or, Greenwald added, they might exclude or limit certain types of providers from their networks — like those with skill in caring for patients with HIV or hepatitis C.

If there were concerns that personal data might be used to cherry-pick or lemon-drop, they weren’t raised at the conference.

At the IBM Watson Health booth, Kevin Ruane, a senior consulting scientist, told me that the company surveys 80,000 Americans a year to assess the lifestyle, attitudes and behaviors that could relate to health care. Participants are asked whether they trust their doctor, have financial problems, go online, or own a Fitbit, and similar questions. The responses of hundreds of adjacent households are analyzed together to identify social and economic factors for an area.
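The household-level aggregation Ruane describes — pooling survey answers from adjacent households into area-level factors rather than scoring individuals — might look like this in miniature. The survey fields and values here are hypothetical, invented for illustration, not IBM Watson Health's actual data or method.

```python
from statistics import mean

# Hypothetical survey rows: one dict per household in an area,
# with 1/0 answers to yes/no questions (fields invented for illustration).
households = [
    {"trusts_doctor": 1, "financial_problems": 0, "owns_fitbit": 1},
    {"trusts_doctor": 0, "financial_problems": 1, "owns_fitbit": 0},
    {"trusts_doctor": 1, "financial_problems": 1, "owns_fitbit": 0},
]

# Area profile: the share of surveyed households reporting each factor.
# Individual answers disappear into the average.
area_profile = {field: mean(h[field] for h in households)
                for field in households[0]}
print(area_profile)
```

Averaging over neighbors is what lets a vendor characterize an area without claiming to know any one person — and also why the salesman's "wrong street" joke later in this story lands: everyone on the street inherits the area's profile.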

Ruane said he has used IBM Watson Health’s socioeconomic analysis to help insurance companies assess a potential market. The ACA increased the value of such assessments, experts say, because companies often don’t know the medical history of people seeking coverage. A region with too many sick people, or with patients who don’t take care of themselves, might not be worth the risk.

Ruane acknowledged that the information his company gathers may not be accurate for every person. “We talk to our clients and tell them to be careful about this,” he said. “Use it as a data insight. But it’s not necessarily a fact.”

In a separate conversation, a salesman from a different company joked about the potential for error. “God forbid you live on the wrong street these days,” he said. “You’re going to get lumped in with a lot of bad things.”

The LexisNexis booth was emblazoned with the slogan “Data. Insight. Action.” The company said it uses 442 non-medical personal attributes to predict a person’s medical costs. Its cache includes more than 78 billion records from more than 10,000 public and proprietary sources, including people’s cellphone numbers, criminal records, bankruptcies, property records, neighborhood safety and more. The information is used to predict patients’ health risks and costs in eight areas, including how often they are likely to visit emergency rooms, their total cost, their pharmacy costs, their motivation to stay healthy and their stress levels.

People who downsize their homes tend to have higher health care costs, the company says. As do those whose parents didn’t finish high school. Patients who own more valuable homes are less likely to land back in the hospital within 30 days of discharge. The company says it has validated its scores against insurance claims and clinical data. But it won’t share its methods and hasn’t published the work in peer-reviewed journals.

McCulley, LexisNexis’ director of strategic solutions, said the predictions the algorithms make about patients are based on the combination of personal attributes. He gave a hypothetical example: a high school dropout who had a recent income loss and doesn’t have a relative nearby might have higher than expected health costs.
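McCulley's point — that no single attribute drives the prediction, only their combination — could be sketched as a rule that fires only when several factors co-occur. The flags and threshold below are hypothetical, modeled loosely on his own example, and are not LexisNexis's actual logic.

```python
# Illustrative only: hypothetical flags mirroring McCulley's hypothetical.
def flag_elevated_cost(dropout: bool, income_loss: bool,
                       relative_nearby: bool) -> bool:
    """Flag higher-than-expected cost only when risk factors combine."""
    risk_factors = sum([dropout, income_loss, not relative_nearby])
    return risk_factors >= 3  # all three together trigger the flag

print(flag_elevated_cost(True, True, False))  # True: all factors present
print(flag_elevated_cost(True, False, True))  # False: dropout alone
```

Even in this toy form, the rule illustrates the reporter's follow-up question below: a person can satisfy every factor and still be perfectly healthy.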

But couldn’t that same type of person be healthy? I asked.

“Sure,” McCulley said, with no apparent dismay at the possibility that the predictions could be wrong.

McCulley and others at LexisNexis insist the scores are only used to help patients get the care they need, and not to determine how much someone might pay for health insurance. The company cited three different federal laws that restricted it and its clients from using the scores in that way. But privacy experts said none of the laws cited by the company bar the practice. The company backed off the assertions when I pointed out that the laws did not seem to apply.

LexisNexis officials also said the company’s contracts expressly prohibit using the analysis to help price insurance plans. They would not provide a contract. But I knew that in at least one instance a company was already testing whether the scores could be used as a pricing tool.

Before the conference, I’d seen a press release announcing that the largest health actuarial firm in the world, Milliman, was now using the LexisNexis scores. I tracked down Marcos Dachary, who works in business development for Milliman. Actuaries calculate health care risks and help set the cost of premiums for insurers. I asked Dachary if Milliman was using the LexisNexis scores to price health plans, and he said: “There could be an opportunity.”

The scores could allow an insurance company to assess the risks posed by individual patients and make adjustments to protect itself from losses, he said. For example, he said, the company could raise premiums, or revise contracts with providers.

It’s too early to tell whether the LexisNexis scores will actually be useful for pricing, he said. But he was excited about the possibilities. “One thing about social determinants data — it piques your mind,” he said.

Dachary acknowledged the scores could also be used to discriminate. Others, he said, have raised that concern. As much as there could be positive potential, he said, “there could also be negative potential.”

It’s that negative potential that still bothers data analyst Erin Kaufman, who left the health insurance industry in January. The 35-year-old from Atlanta had earned her doctorate in public health because she wanted to help people, but one day at Aetna, her boss told her to work with a new data set.

To her surprise, the company had obtained personal information from a data broker on millions of Americans. The data contained each person’s habits and hobbies, like whether they owned a gun, and if so, what type, she said. It included whether they had magazine subscriptions, liked to ride bikes or ran marathons. It had hundreds of personal details about each person.

The Aetna data team merged the data with the information it had on the patients it insured. The goal was to see how people’s personal interests and hobbies might relate to their health care costs. But Kaufman said it felt wrong: the information about the people who knitted or crocheted made her think of her grandmother. And the details about individuals who liked camping made her think of herself. What business did the insurance company have looking at this information? “It was a data set that really dug into our clients’ lives,” she said. “No one gave anyone permission to do this.”

In a statement, Aetna said it uses consumer marketing information to supplement its claims and clinical information. The combined data helps predict the risk of repeat emergency room visits or hospital admissions. The information is used to reach out to members and help them, and plays no role in pricing plans or underwriting, the statement said.

Kaufman said she had concerns about the accuracy of drawing inferences about an individual’s health from an analysis of a group of people with similar traits. Health scores generated from arrest records, home ownership and similar material may be wrong, she said.

Pam Dixon, executive director of the World Privacy Forum, a nonprofit that advocates for privacy in the digital age, shares Kaufman’s concerns. She points to a study by the analytics company SAS, which worked in 2012 with an unnamed major health insurance company to predict a person’s health care costs using 1,500 data elements, including the investments and types of cars people owned.

The SAS study said higher health care costs could be predicted by looking at things like ethnicity, watching TV and mail-order purchases.

“I find that enormously offensive as a list,” Dixon said. “This is not health data. This is inferred data.”

Data scientist Cathy O’Neil said drawing conclusions about health risks from such data could lead to a bias against some poor people. It might be easy to infer that they are prone to costly illnesses based on their backgrounds and living conditions, said O’Neil, author of the book “Weapons of Math Destruction,” which looked at how algorithms can increase inequality. That could lead to poor people being charged more, making it harder for them to get the care they need, she said. Employers, she said, could even decide not to hire people with data points that could indicate high medical costs in the future.

O’Neil said the companies should also measure how the scores might discriminate against the poor, sick or minorities.

American policymakers could do more to protect people’s information, experts said. In the United States, companies can harvest personal data unless a specific law bans it, although California just passed legislation that could create restrictions, said William McGeveran, a professor at the University of Minnesota Law School. Europe, in contrast, passed a strict law called the General Data Protection Regulation, which went into effect in May.

“In Europe, data protection will be a constitutional right,” McGeveran said.

Pasquale, the University of Maryland law professor, said health scores should be treated like credit scores. Federal law gives people the right to know their credit scores and how they’re calculated. If people are going to be rated by whether they listen to sad songs on Spotify or look up information about AIDS online, they should know, Pasquale said. “The risk of improper use is extremely high. And data scores are not properly vetted and validated and available for scrutiny.”

As I reported this story, I wondered how the data vendors might be using my personal information to score my potential health costs. So I filled out a request on the LexisNexis website for the company to send me some of the personal information it has on me. A week later, a somewhat creepy, 182-page walk down memory lane arrived in the mail. Federal law only requires the company to provide a subset of the information it collected about me. So that’s all I got.

LexisNexis had captured details about my life going back 25 years, many of which I’d forgotten. It had my phone numbers going back decades and my home addresses going back to my childhood in Golden, Colorado. Each location had a field to show whether the address was “high risk.” Mine were all blank. The company also collects records of any liens and criminal activity, which, thankfully, I didn’t have.

My report was boring, which isn’t a surprise. I’ve lived a middle-class life and grown up in decent neighborhoods. But it made me wonder: What if I had lived in “high risk” neighborhoods? Could that ever be used by insurers to jack up my rates — or to avoid me altogether?

I wanted to see more. If LexisNexis had health risk scores on me, I wanted to see how they were calculated and, more importantly, whether they were accurate. But the company told me that if it had calculated my scores, it would have done so on behalf of its client, my insurance company. So I couldn’t have them.
