Anne Weiler and Mike Van Snellenberg This Week in Health IT
August 8, 2019

 – Episode #


Co-founders Anne Weiler and Mike Van Snellenberg of Wellpepper talk machine learning, data protection, data bias, big tech, and privacy.

Anne Weiler & Mike Van Snellenberg on Machine Assisted Healthcare



Bill Russell:                   00:10                Welcome to This Week in Health IT Influence, where we discuss the influence of technology on health with the people who are making it happen. My name is Bill Russell, recovering healthcare CIO and creator of This Week in Health IT, a set of podcasts and videos dedicated to developing the next generation of health IT leaders. This podcast is brought to you by Health Lyrics. Professional athletes hire coaches for every aspect of their life to ensure top performance. Healthcare technology is much more important, yet too many leaders choose to go it alone. Put me in your corner. Wow, that sounds so... I'll fix that later. Visit healthlyrics.com to schedule your free consultation. If you want to support the fastest-growing podcast in the health IT space, here are five ways you can do it: share it with a peer, share it on social media, follow our social accounts on LinkedIn, Twitter, and YouTube.

Bill Russell:                   00:53                Send me feedback, I love your feedback, it's very helpful, and subscribe to our newsletter. And then one last thing before we get to our guests. We have two new services I want to make you aware of. One is This Week Health Insights. We made the commitment about 18 months ago to invest in the careers of people in health IT, and a lot of health leaders have stepped up and been a part of that. We've packaged that wisdom up so you can receive it in your inbox every Tuesday and Thursday. I take a short snippet from these interviews and answer the question of "so what," and try to make it pragmatic, something you can apply to your career today. You can visit thisweekhealth.com/insights. And then the last thing before I get to our phenomenal guests for today: This Week Health Staff Meeting is the result of a conversation I had with a couple of CIOs, and it launches in two weeks.

Bill Russell:                   01:46                It's designed to expose your team to new thinking and get the conversation going. Essentially what that's going to be, again, is a short snippet from an industry leader with two discussion questions. We're going to send out one of those per week, and you can share it with your staff, have the conversation, and get your staff meeting kicked off on the right foot. You can preregister for that at thisweekhealth.com/staffmeeting, no spaces. Okay, enough. Today I'm joined by two guests, one returning and one new guest to introduce to the community. Anne Weiler is the CEO of Wellpepper and a returning guest, and she brought along the CTO and co-founder of Wellpepper, Mike Van Snellenberger. Good morning, both of you. Welcome to the show. And did I get your name right?

Anne Weiler:                02:34                Yeah. You got my name right, but there’s no er on Van Snellenberg.

Bill Russell:                   02:46                I'm going to have to get the names right before I have a show with you guys. Wellpepper is still the name of the company, correct? Of course, I messed that up too. This is what happens when I wear a tie, I get all flustered. It just doesn't happen. Wow. And this is the first of the shows that I'm doing with two guests, and as you guys know, this is sort of a balancing act, not only because Anne is actually standing on a server to do the show so the height disparity isn't too bad, but it's a balancing act to get both of your perspectives on a lot of different topics. So I'm looking forward to this. The topics we discussed ahead of time: we're going to talk about machine learning, data protection, data bias, big tech and privacy, and a bunch of other stuff. And we're going to try to do all that in a half hour. Are you guys ready for this?

Anne Weiler:                03:39                We're ready. Yeah. And with a slant toward patient-generated data, because that's what we're all about here.

Bill Russell:                   03:45                Patient-generated data, absolutely. That'd be great. So, Anne, let's start with a quick update on Wellpepper. What, if anything, has happened or changed since the last time we spoke, which was at HIMSS?

Anne Weiler:                03:56                Ah, well, I got over the flu. You remember I had the flu. But yeah, lots of stuff. We announced a partnership with eVideon to do inpatient systems, and that was the result of customer demand. What we're seeing from customers is wanting to have a few integrated systems that really cross the whole patient experience, and you're seeing that in health systems hiring chief digital officers and really thinking about this end to end. We love the folks there. They've taken a very patient-centric approach, with strong product architecture, and it was just a great fit to announce that partnership. We also got the results of two of our studies, which have been published in peer-reviewed journals: a study with Boston University for people with Parkinson's disease and a study with Harvard for seniors who are at risk of adverse events.

Anne Weiler:                04:51                So both of those have been published. Some of the great highlights I can share from those: in the Harvard study, people had improved mobility by working with a remote care plan and remote interaction and check-in. So they had improved mobility over the course of the year, but more importantly, they had reduced ED visits. So true cost savings that you can prove from these interventions. And then our friends at BU, we deployed and launched a new study with them. They've got a five-year NIH grant for a broader Parkinson's study that includes some behavioral health components, so not just a medical and physical clinical intervention, but the behavioral health side of it as well. So we're very excited about those studies. They're very important to us to independently prove that what we're doing works in empowering patients to follow their care plans. And then for our customers, we've deployed new scenarios in orthopedics with shoulder and ACL surgery, in neurology with new headache care plans, and we just deployed an Alzheimer's intervention. So again, proving that patients can and will self-manage if you give them the right tools, but also our platform approach, which was basically that a patient engagement experience needs to support all patients, whatever age they are, and all conditions.

Bill Russell:                   06:17                Wow, so a ton has happened. Can you provide me those links? I'll include them with the podcast and the video, because that's really important research. That's essentially saying, hey, we have digital interventions that are producing results. That's essentially what those studies are showing us.

Anne Weiler:                06:37                And you know, we've taken the theory that patients can self-manage if you give them the right tools. In the BU study, patients who came into it with lower patient activation scores, so they feel less confident about their ability to follow their care plan, saw the greatest gains. So again, that goes back to, sometimes when you talk to clinicians, they say, it doesn't matter what I tell the patients, they won't do it. That's not true. It's about how you tell them, how you support them, and how you provide them that link back to the care team if they need additional help.

Bill Russell:                   07:15                Fantastic. Well, Mike, we're going to introduce you to the community here. You wrote a really cool article on self-driving healthcare, and the concept was around machine learning and data, and essentially how to help healthcare have its self-driving moment. Give us a little background on the premise of the article before we jump in.

Mike Van Snellenberg:    07:39                Thanks. Happy to be here. Long-time listener, first-time caller, I guess. So yeah, the premise of the article is that we're super excited about the opportunity for machine learning in healthcare. And one of the things that we discovered when we started looking at what the possibilities were is that a lot of the data that you have in healthcare is really still siloed in the EMR. So you have a lot of billing data and diagnostic data, and that kind of data is good for some things in machine learning, but it really doesn't enable healthcare to have a robust self-driving moment in the same way that machine learning is helping to propel the self-driving car and autonomous vehicle industry. If you think back 10 years ago in self-driving cars, there really was no such thing as the concept of a self-driving car.

Mike Van Snellenberg:    08:30                There were really these large government studies, millions of dollars, just trying to get a car to drive short distances. And really, the presence of data was what catalyzed that movement in self-driving cars, where you could get a lot of data about the environment and about mapping. The problem that we have in healthcare is we just don't have that kind of data about patients when they're out on the street, in a way. We have a lot of data about what happens inside the four walls of the hospital, but not a lot about what happens to patients when they're not in the hospital. So you can think about the difficulty if all you had were shop maintenance records for a car and fluid level measurements, and you tried to train a self-driving car off of that data. You really can't do it. It needs a much more robust data set. And so that's really what we've been focused on: helping to capture that data set of how patients interact with the health system when they're between visits, and using that data set in the future to be able to help augment, and eventually maybe start automating, care provision.

Bill Russell:                   09:38                So let's stay on cars for a second just to get the context right. Machine learning is a subset of AI, but machine learning is the act of training the machine by example. So it learns by example. I guess from your article, about 15 years ago they had, you know, government-sponsored studies, and you had universities out there with teams trying to get their cars to drive 10 miles, 50 miles, those kinds of things. And then the big breakthrough was Stanford, and they essentially taught their car to learn by example. Give us a little idea of how that works, and then we're going to come back to healthcare, because that really brings up some of the biggest challenges of training machines with healthcare data.

Mike Van Snellenberg:    10:27                Yeah. So the way that self-driving cars evolved, as you say, the big breakthrough was using example-driven data to train a machine, a combination of sensor data and visual data captured from vehicles, and not training by algorithm. There were many companies, or many participants in the research, that tried that approach, where you try to have a smart person sit in a room and say, well, if you see these conditions, do this instead of that. Instead of doing that, they said, well, let's just take a whole bunch of data, we'll label which things are the good things to do and the outcomes that we want, and then we'll let the machine learn how to produce those outcomes. It's a great technique when you have very large input spaces of variables and a set of things that you want to drive towards, which is actually a lot like healthcare.

Mike Van Snellenberg:    11:21                You have a whole bunch of data about people, whether it's their genomic information, their diagnostic information, their diet and exercise, their activities of daily living. All of that data, if you were able to quantify it, is a fairly large data set. It would be really hard for a person to decide, well, I'm going to weight this factor so much and that factor so much, but machines are really good at that. If you said, here's a million people, and we can observe from all of this data that we collect about these people where we're getting good outcomes and poor outcomes and which things might lead to diseases or not, then you can start to learn from that data and build predictive models for people's health.
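To make that concrete, here is a minimal sketch of the kind of example-driven training Mike describes: a model learns how to weight each factor from labeled outcomes rather than from hand-written rules. The features, labels, and data below are hypothetical placeholders, not Wellpepper's actual pipeline or a real clinical model.

```python
# Minimal sketch of example-driven learning: the model infers how to weight
# each factor from labeled outcomes instead of a person hand-tuning rules.
# Feature names, labels, and data are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hypothetical per-patient features: age, average daily steps,
# care-plan adherence, and number of chronic conditions.
X = np.column_stack([
    rng.integers(40, 90, n),       # age
    rng.integers(0, 12000, n),     # average daily steps
    rng.uniform(0.0, 1.0, n),      # care-plan adherence (0 to 1)
    rng.integers(0, 5, n),         # number of chronic conditions
])

# Hypothetical label: 1 = poor outcome (e.g., readmission), 0 = good outcome.
y = ((X[:, 0] > 75) & (X[:, 2] < 0.4)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Predicted risk for a new (hypothetical) patient.
print("risk:", model.predict_proba([[82, 1500, 0.3, 3]])[0, 1])
```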

Bill Russell:                   12:03                Is it quantity of data or is it quality of data?

Bill Russell:                   12:06                I know the answer would be both, but which is more important?

Mike Van Snellenberg:    12:10                Both. Certainly machine learning works best with large quantities of data; it's notoriously data hungry to train most kinds of models. But paired with that, you need to start with data that is of high quality. One of the problems we have with healthcare data is that a lot of the data we get from the EMR has really been tweaked towards billing purposes. For example, we see a predominance of procedure codes that are skewed towards higher billing rates, even though that might not be a fair representation of the observed diagnostics and [inaudible] procedures that are actually needed for patients. So there are some data cleanliness issues that you need to overcome when working with some of the data sets that we have in healthcare today.

Anne Weiler:                12:58                So if you look at what we're doing, we've got a contained experience. We've got patients where we know what disease or condition they have based on the care plan that they've been assigned, and we've got the things that they've been assigned to do, so we know what they're supposed to be doing and we know how they're doing against that care plan. So with a smaller data set, we can start to get insights from this patient-generated data. That's very different from the approach that says, get as much data as you can and try to find some sort of insight in it, which is great if you're a large academic researcher and you can keep doing regressions on it to start to find trends. Instead, you can start from a hypothesis. So for example, we've got somebody on a total joint replacement care plan after surgery.

Anne Weiler:                13:52                We can say, all right, we're going to look for factors that might predict readmissions. Well, we've already got a smaller surface area; we don't need quite as much data to be able to find those things that might indicate an adverse event. And so we've actually trained a machine-learned message classifier on patient-generated messages for patients who are part of that sort of post-ambulatory surgery care plan, and we've been able to say, hey, this type of message seems like it might be a problem. So you can do things like that if the data's good and you know what the data is worth; you can start to find these insights a lot faster. I think some of the challenges today of machine learning are that people are just trying to take these massive data sets from the EMR that weren't necessarily developed for those insights and trying to find those insights.

Anne Weiler:                14:46                Now, that's the kind of job Google is doing, Amazon is doing, Microsoft is doing. They have enough compute power, they have enough people, and enough money, quite honestly, to do that. I think where we're looking at it, and where health systems should be looking at it, is a little more contained. And I think you see that with health systems. They're looking at things like, how do we fix sepsis? So rather than asking what's in all the data, it's what's the insight. Going back to the way we're looking at it: just on this care plan, what's going to indicate someone's about to not be a tier one, what's going to indicate that they might have an ED visit, all of those things.
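As a rough illustration of the message classifier Anne mentions, the sketch below trains a small text model to flag patient-generated messages that may need clinical follow-up. The messages, labels, and model choice are invented for illustration; this is not Wellpepper's classifier, and a real system would also have to handle PHI and labeling privacy, as discussed later in the conversation.

```python
# Illustrative text classifier: flag patient-generated messages that may
# indicate a problem after surgery. Training data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "Feeling good, finished my exercises today",
    "Walked twice around the block, no pain",
    "My incision looks red and swollen and I have a fever",
    "Very dizzy this morning and my knee is throbbing",
    "Did my stretches, sleeping better this week",
    "Out of pain medication and the pain is getting worse",
]
# 1 = potentially concerning, escalate to the care team; 0 = routine.
labels = [0, 0, 1, 1, 0, 1]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(messages, labels)

new_message = "My incision is oozing and I feel feverish"
probability = classifier.predict_proba([new_message])[0, 1]
print(f"escalation probability: {probability:.2f}")
```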

Bill Russell:                   15:22                Yeah, so here's the distinction I want you guys to draw. We've been doing this for years, but we do it with people. People sit down and look at the data and say, oh look, this correlates with this, this correlates with that, those kinds of things. How does a machine step in the middle there, and how do you focus it in on just a few variables, so that it's actually picking those things out and alerting you, not the Wellpepper team sitting in a room going, hey, I'm seeing this trend?

Anne Weiler:                15:52                It does start with the Wellpepper team sitting in a room, right? Someone, a human, needs to label the data to begin with, which is kind of the secret of machine learning and AI. You know, everyone was very upset about the fact that there were people listening to utterances on the home devices, but people have previously been reading what you've typed in to actually apply insight, your search queries for example. So it does start with people, and then it goes into the black box, which I'll turn over to Mike.

Mike Van Snellenberg:    16:29                Yeah, the data labeling can be a fairly intensive process, and it's hard in healthcare because the industry approach to data labeling in a lot of cases is not very privacy savvy. We've seen some press coverage recently of some of the issues with both Amazon and Google about contractors having access to that data. But that is kind of the industry approach today: you get a kind of crowdsourcing of people to go and label data, which then becomes input to a machine learning training algorithm. In healthcare, we have to be careful about that because, for example, with the message classifier that we trained, we looked at the data and said, well, there's a possibility that there's PHI in these messages, and so we really can't do that.

Mike Van Snellenberg:    17:18                We have to do all the data labeling ourselves. So there's a sensitivity there with healthcare data around how you perform data labeling. But then after that, it's kind of the magic of machine learning. There have been articles about why machine learning is so unreasonably effective in doing what it does in image recognition, diagnostics, and self-driving cars, and the big answer is we don't really fully understand why it's as effective as it is. We know that it is. One of the tradeoffs is we don't always know why it's making the decisions that it's making, so there's a loss of some visibility into the prediction mechanism when you train a machine learning classifier. It's an area of active research to figure out how we build models that are not just accurate but also explainable, where we can explain why they predicted what they predicted.

Anne Weiler:                18:11                Which is also challenging in healthcare, because there's a lot of inconsistency today. First of all, people don't want something prescribed to their patient unless they understand why it's been prescribed. And sometimes you'll see a patient talk to three surgeons and get three different opinions; there's a whole idea of getting a second opinion. So having that kind of closed box, where we don't know where this came from, is challenging, which is why, again, it's got to be baby steps. The steps have to be: all right, let's see what things we can identify, and if we can identify them, let's use that to scale the clinicians. So present to the clinician, hey, we've identified that this patient might be at greater risk of readmission, and maybe suggest some actions they can take. Or maybe just say, we've identified this, and it's enough of a flag for a person to reach out and see what's going on. So it absolutely cannot be without people in healthcare.
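One rough, admittedly partial way to get at the explainability Mike and Anne describe is to report how much each input feature contributes to a trained model's predictions, for example with permutation importance. The sketch below continues the hypothetical readmission example above; the feature names are still invented.

```python
# One simple, partial answer to "why did the model predict that?":
# permutation importance measures how much held-out accuracy drops when
# each feature is shuffled. Continues the hypothetical example above
# (model, X_test, y_test come from the earlier sketch).
from sklearn.inspection import permutation_importance

feature_names = ["age", "daily_steps", "adherence", "chronic_conditions"]

result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

ranked = sorted(zip(feature_names, result.importances_mean), key=lambda t: -t[1])
for name, score in ranked:
    print(f"{name:20s} importance drop: {score:.3f}")
```

A readout like this does not fully open the black box, but it at least shows which factors the model is leaning on, which is the kind of context a clinician would want alongside a risk flag.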

Bill Russell:                   19:13                Yeah, absolutely. In the conversations we've had around AI previously, the physicians who we've had talking about it have said, you know, I just think about it as another opinion, right? So it sort of comes through and says, hey, look at this, and I say, all right, well, that's the computer's opinion based on this data, here's my opinion, and I'm going to get another consult just to weigh those things up. You know, the other thing that was fascinating to me was we did one of those projects. Our team all had their own personal time to do projects, and two of the players went out and just got, you know, TensorFlow. They brought all this stuff in, and then they fed the images through, and the results of the images through, and they were starting to predict. I mean, these are just two people sort of hacking in the corner of an office, who obviously have access to healthcare data.

Bill Russell:                   20:07                But they were going through it, and they were approaching, you know, 80, 85, 88, 90% accuracy on imagery reads after like a month of work, which sort of speaks to your point: it is effective, even if we're not entirely sure what's going on behind there. It's pretty interesting that it's that effective. But let's talk about bias for a second. How does bias present itself? I assume it presents itself through the data. And then what are we doing about bias? How do we prevent bias?

Anne Weiler:                20:46                Well, Mike pointed this out: the EMR data is biased because it's biased towards billing, for the most part. And unfortunately, a lot of the data is not clean. You've seen the burdens of documentation; stuff gets copied and pasted. It's like, we know that this billing code applies and the patient sort of has this. But think of the number of times, I'm sure you've had this as a patient, where someone is reading something back to you and saying, is this correct? And you're like, no. The one that always gets me is when they read the medication list, and I'm like, am I still taking that? Well, of course not, that was a seven-day course. So the data is missing a lot of information.

Anne Weiler:                21:36                It's missing what really happened with the patient. It's missing what's happening with the patient in their daily life, and the activities of daily living are the things that really affect your health. It may also be missing the opinions of the full care team. So who has actually done the documentation in the EMR? Think about how doctors, and especially new residents, are told: if a nurse thinks something is wrong, believe that nurse. So is that hunch a nurse had, that a patient is about to code or something like that, in there too? So when we're looking at patient data,

Anne Weiler:                22:25                is it the full spectrum of what's happening with that patient? And then also, are we exacerbating human bias? I think back to... starting this company was the result of an experience that my mom had. She actually faced a lot of bias before she got diagnosed, and there was bias, I suspect, about her age. And I just finished reading a book that I blogged about, about data bias, about how there's a lot of data missing on women, because studies were not done on them because of some famous studies where there were problems, the thalidomide example. But because of that, there's a whole lot of data missing. So a heart attack presents differently in a woman than in a man, but we don't actually have the data to show it. As a woman, I'm reading this thinking, how will I know if I'm having a heart attack?

Anne Weiler:                23:20                And it seems like we won't. So I think the thing we've got to recognize before we go too far down this path is: do we have all the data, and how do we get more data? And that's something we're quite passionate about, the patient-generated data, the experiences. Why didn't you take your medication? Was it a side effect? Was it that you couldn't afford it? All of those additional things around it. Because you could come along and say, well, our data shows that patients with this condition just don't take their medication. Well, that's only part of the information. Why didn't they take it? Is it something to do with a side effect? Is it something to do with the condition they may have? So I'm worried that we're getting a little too far ahead of ourselves.

Anne Weiler:                24:06                We need to stop and say, okay, before feeding this data into the machine, what bias might that data have to begin with, because it's missing something? Think about all the questions that you could ask and how you can get that additional data. We think that you can get a lot of data from patients, because that's where we're focused. We do get feedback from clinicians as well, like what action they took when a patient presented a certain alert. But really think about all of those things that are happening and how you get, I want to say, everyone's viewpoint, because there's a lot of stuff going on.

Bill Russell:                   24:40                So a picture of someone's health is greater than their EHR data. We know that a whole-person profile is necessary. You have a bias towards collecting the data directly from the patient, go figure; that sort of makes sense. But where are we going? Are there other sources of data that we can tap into? And are there some limitations in the patient-generated data as well? I guess those are two questions: limitations, and then are there industry initiatives that are starting to aggregate valuable information outside the EHR itself that we can tap into?

Mike Van Snellenberg:    25:23                Yeah, there are definitely limitations in the data that you collect from patients. Some of those are kind of obvious things, like, will the patient actually respond to all these surveys and questionnaires that we're giving them? So you have some percentage that do and don't respond. And there's some percentage of the data that you get from patients that may not be totally clean. The data that we have says that when you ask patients continuously and on an ongoing basis to report their conditions and the things that are happening, they're more likely to be truthful if they're reporting it in the moment than if they're sitting in your waiting room trying to fill out a retrospective form of what symptoms they have. They tend to misremember things.

Mike Van Snellenberg:    26:08                So we believe in trying to capture things as real-time as possible; it helps keep that data as clean as possible. And there are a ton of new data sources that I think are starting to feed into healthcare. There's a lot of interest in data coming from sensors, whether that's activity data or weight data or blood glucose. There's a ton of information that we can capture about patients. I don't think we know yet what that data means; that's kind of the classic problem today, where you hear reports of patients bringing their Fitbit printout to their doctor, and, well, what does this mean? The doctor has no idea; there are no norms to read it against. So there's a missing step there in taking that kind of sensor data and merging it with all the other things we know about patients to draw meaningful clinical insights.

Mike Van Snellenberg:    26:56                And then the other big piece is on the genomic side. There are a lot of companies, like the Project Baseline initiative from Verily, that are starting to study that: let's look at a whole bunch of people, let's take their genetic information as well as their activities of daily living and measurements, and start to understand how people's health progresses over time. I think that's a super meaningful and useful data set that will take decades to collect, but the way you practice medicine in 20 or 30 years will be different than the way we do it today because of that.

Anne Weiler:                27:29                I think we're also seeing the government making data sets available as well to compare against. But I don't yet see the two things coming together. I think we're at the stage where there's a lot of data, and people aren't taking this data and putting it with that data to see what happens. There's one group looking at claims data, there's another group looking at patient-generated data, and it is hard to bring those things together. And frequently, when we're working with a health system, we're always looking at, hey, how do we prove for you the outcomes that we're getting? But the outcomes can include things like cost, and frequently the surgeons that we're working with to improve the outcomes are not the ones who have the cost data.

Anne Weiler:                28:15                I think that's the opportunity: trying to figure out how we match these data sets, how we get them together. And then there's also a little, I'm going to say, chicken-and-egg thing. You saw with Propeller Health that they were able to predict air quality based on inhaler usage. Should we be flipping that around and saying, okay, the air quality is going to be bad, or is bad? That was actually something that a physician at Kaiser said to me, because they do take a population health management approach. This was in Seattle, where last summer we had some pretty terrible air quality from forest fires, and he said, I need to go tell all my patients with asthma what they should be doing.

Anne Weiler:                29:01                And he had no way of doing that. So he had the environmental data; every day we all saw the air quality rating. But he had no way of actually matching that with his patient data to say, all of you, here are the things I want you to proactively do, because this is going to affect your health. So I think we're at the point of knowing that there are these large data sets, and knowing that there's valuable data in the individual data sets. Putting them together, that's going to be the thing that really breaks through in health.
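As a sketch of the kind of matching Anne describes, joining an external air quality feed to a patient registry is conceptually a simple filter-and-join; the hard part in practice is that the two data sets rarely live in the same system. The field names, thresholds, and records below are hypothetical.

```python
# Hypothetical sketch: match a daily air quality reading to patients whose
# conditions make them sensitive to it, to drive proactive outreach.
import pandas as pd

patients = pd.DataFrame({
    "patient_id": [101, 102, 103, 104],
    "zip_code":   ["98101", "98101", "98052", "98052"],
    "condition":  ["asthma", "hypertension", "asthma", "copd"],
})

air_quality = pd.DataFrame({
    "zip_code": ["98101", "98052"],
    "aqi":      [182, 95],   # hypothetical AQI readings for the day
})

UNHEALTHY_AQI = 150
RESPIRATORY_CONDITIONS = {"asthma", "copd"}

merged = patients.merge(air_quality, on="zip_code")
to_notify = merged[
    (merged["aqi"] >= UNHEALTHY_AQI)
    & (merged["condition"].isin(RESPIRATORY_CONDITIONS))
]

# In a real system this list would feed outreach messaging, not a print.
print(to_notify[["patient_id", "condition", "aqi"]])
```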

Bill Russell:                   29:36                Wow. So still the best way for us to communicate that air quality is through a sign on the road, a radio broadcast in the morning, and the TV news. Is that what I hear you saying?

Anne Weiler:                29:48                Well, I was always asking my phone. Every day I'd wake up and be like, what's the air quality? Siri was doing a really good job on that and would tell me the number and then pull up the whole chart of what it was. But then I want Siri, or somebody, to then say, if I'm at risk, and we were all at risk when it was that bad: hey, here's what you should do. Proactively use your inhaler, stay inside. The news was saying how bad it was, and you'd still see people jogging as if they didn't know. And that goes back to the Kaiser doctor, who was like, I want to go tell people they should not be doing these things.

Bill Russell:                   30:30                So let's talk about, I was going to use the term Silicon Valley here and I probably shouldn't, since you guys are in Seattle and some of the players are in Seattle, so let's just call it the tech industry and their use of data. Last week, Microsoft, Amazon, IBM, Oracle, I don't remember who else, they all signed their pledge to share healthcare data, and I talked about it on Tuesday's show. What does this really mean? What do you think it means for the industry?

Mike Van Snellenberg:    31:04                Yeah, I think there's an increasing movement towards openness in patient data. Certainly some of the regulations coming out of CMS around giving patients access to their own data are a huge step in the right direction; for too long we've had patient data that's locked in silos inside the healthcare systems. In terms of big tech using that data, I think there's a lot of good that can come out of that, but there's a need for the right privacy controls to be in place, and an acknowledgement from those big tech players that health data is a different animal than a lot of the other data that they deal with. It's really easy to inadvertently leak even data sets that you believe may be de-identified; there are certainly cases where people have easily re-identified data. So with the right focus on privacy, I think it's a promising trend, a lot more good than bad. But there need to be careful rules put in place by those health systems and big tech companies to kind of self-police, I guess.

Bill Russell:                   32:17                Right. So let me drive this conversation. If you get a certain part of the industry, those executives, in a room, they'll say, look, I'm all for giving patients their data, but they're going to be used and abused. People are going to get that data, they're going to give it away. We need to find ways to protect that data. And essentially what they're saying is they can't share the data with me because they think I'm not sophisticated enough to know how to protect my own data. I mean, what mechanisms are we going to put in place? You have it on your phone, and Mike, you have it on your phone, and your parents have it on their phones, and now all of a sudden they have this great data set that you guys know how to use, and people who understand big data and machine learning and AI can start to create some really interesting things around it. But what they're afraid of is not Wellpepper and the good actors. They're worried about the nefarious actors, who are already calling them every five minutes trying to get their social security number, now getting access to their entire medical record.

Anne Weiler:                33:27                Yeah, I don't think that's an easy problem to solve, and it's not limited to healthcare. Look at the number of people who have their email address with their date of birth in it, or the number of people who have their date of birth published on Facebook. It does seem like perhaps data privacy needs to be taught somewhere. I'm going to say I'm particularly paranoid, but I remember back in the 90s being concerned about who had my data, and then Mike and I worked at a company that was acquired by Microsoft, and I was like, okay, well, I can trust Microsoft now because I have to, because not only do they have my date of birth, they know my social security number, they have my retirement plan, and my healthcare.

Anne Weiler:                34:17                So pretty much everything about me. But before that point, I was pretty concerned about who received my information. I think, on the one hand, it's my right if I want to disclose anything to you about my health; it's my right to do that. I did put my x-rays on the internet when I broke my finger, because I like to blog about healthcare experiences. But at the same time, I think most people don't understand much about privacy in general. On the other hand, and I'm going to talk myself in and out of things here, who's protecting our data? Look at some of the most recent breaches, things like Equifax, whose whole job is protecting people's data. So is it about educating people, or is it about changing the systems? I don't know.

Mike Van Snellenberg:    35:11                I think at the end of the day you have to think about whose data it is. If I'm a patient, is it my data, or is it somebody else's? Certainly in our contracts our standard language is that it's the patient's data; the health system has a license to use it for your care, but it's your data. And if you take the approach that the patient's data belongs to the patient, I think that helps guide a lot of policy decisions on how and when you release it in smart ways. But I think there's a fundamental ownership question there.

Bill Russell:                   35:41                That's interesting, though, because, Mike, here's the counterargument that comes up in too many of these discussions, not that I agree with it, but it is a counterargument: if I'm sitting here taking notes about you and Anne and how well you're doing on the podcast, just because I'm writing about you, is that your data?

Anne Weiler:                36:03                Not necessarily, but what we have is patient-generated data. It's me saying I took my medication, I'm feeling a little dizzy, I did those things. That's patient-generated data; I'm choosing to tell you this, it's not your interpretation. I agree that when it becomes your interpretation of that as a doctor, that's another matter. But if the doctor is doing something to you, that seems to be your data; someone has done something to you. And that's a tricky one. And you look at something like GDPR and the right to be forgotten. What about medical records if I want to be forgotten? But at the same time, medical records actually need to be kept so that we know what happened. So it feels like, same as with machine learning, in data privacy it's almost like the regulations and the technology are on totally different timelines.

Bill Russell:                   37:07                Yeah, I've said this on a couple of the news podcasts I do on Tuesdays: I'd like to press a button and essentially have you download my health data to me, and then I'd like for it to go out of your EHR. And my case on that, and this is going to get me in trouble, is that I do enough work within healthcare that I don't trust healthcare to protect my data. There are just too many things changing. There are too many mergers and acquisitions, and every time you do a merger and acquisition, you connect to another health system that has its own security profile, and you've just increased your attack vectors, because the attack vectors primarily are the people you employ. So when your health system goes from 20,000 employees to 40,000, you now have 40,000 attack vectors into that health system. I just don't think anyone's going to say, oh gosh, I've got to target Bill Russell and get his health data; it can't be worth it to them. They want to get 20,000 records, they don't want to get one.

Anne Weiler:                38:08                Yeah. So you made the case, you own it. You give people rights to it.

Bill Russell:                   38:12                So what are you going to do with it? Say Seema Verma is successful, and Secretary Azar is successful, and actually even Joe Biden, who was making this case around cancer research, to get that data out there. Okay, so it's bipartisan. Say they're successful: all right, here's all the data. What are we going to do with it? What's some of the best thinking around what we can start doing for patients if we can start aggregating all this data?

Anne Weiler:                38:41                I think, oh, aggregating. Sorry, I thought you were saying...

Bill Russell:                   38:44                I don't know. If I had it all and gave it to you, what are you going to do for me?

Anne Weiler:                38:50                What am I going to do for you?

Mike Van Snellenberg:    38:53                I think it's a big question, right? If you take the approach that big data sets can lead to big improvements, we don't know what all those areas are yet. Someone's going to discover links between health data and zip code data and air quality, and I think it's on researchers to be able to make good use of that data. I don't know that we know all the places. We certainly see places where today machine learning is making early inroads, like helping to improve diagnostics, things like that. But I'm not enough of a futurist to try and predict where we'll be in five years with access to healthcare data.

Anne Weiler:                39:37                Yeah.

Bill Russell:                   39:37                Let me rephrase the question, because I really want to tap into your expertise. You've started a company, it's successful, you're growing it.

Anne Weiler:                39:45                So you’re just, you know, hi

Bill Russell:                   39:49                This little thing, with studies from Boston University and Harvard, and partnering with Mayo. I get it, this little thing you guys were doing as a little side project. But with that being said, it has been successful. So the question becomes: if you guys didn't have this going on and you were just getting into healthcare, and all of a sudden Seema Verma and Secretary Azar are successful and all this data is available, what areas would you like to see a startup start to tackle or go after, where you think, man, that could really have a significant impact in the community where I live?

Anne Weiler:                40:23                I would focus on seniors, because of demographics and because we just don't have enough clinicians to care for everyone. So I would start to look at where we can really help people. What are the common things that people struggle with as they age, where maybe they don't need to be struggling if we knew certain things ahead of time? Everybody's concerned about Alzheimer's and dementia. What can you do to help people be more independent, because everybody wants to be? And then also, what can you do to connect them to their caregivers, whether that's their children or professional caregivers? What I see right now in that space is remote monitoring, a lot of devices. I don't know that the devices alone are the answer; there's got to be a lot more about what's going on and how you help someone be proactive. That's where I would focus. Mike may focus differently.

Mike Van Snellenberg:    41:28                I think there are a lot of clinical and diagnostic things that are super important and very domain specific. One of the big challenges we have in healthcare is that we just have an unsustainably expensive model. So I think there's a lot we can do around cost, to understand the patterns of how patients progress through diseases and which things actually do help curb costs. Is preventative treatment effective at helping bend the cost curve or not? I think that needs to shape policy about how we pay for things in healthcare, and help us continue to move away from incident-based payment and towards more value-based payment, with a complete understanding of what value means. If you understand how a patient progresses over a long course of time, not just through a few billing codes, but, you know, are they able to be successful in getting back to work after health disruptions, that sort of thing.

Anne Weiler:                42:26                Which I think is also about whether we can shorten the research cycle. We've got the clinical trials and studies, which you have to do, and then we've got the insights from machine learning. How do we bring those two things together to shorten the cycle? Because right now, someone publishes something and it's 17 years before it gets into clinical practice. So we know things today that would improve everybody's life, but they're not necessarily in clinical practice. How do you marry those two? Because the big data is also telling you things, and it would maybe be super interesting to match those analyses to studies that have already been done, to further prove the study.

Bill Russell:                   43:04                Fantastic. You guys are awesome. I love having you on the show and having you both on it.

Anne Weiler:                43:11                That's what I wanted, to talk about this. That's why I wanted to bring Mike this time. I told him it would be fun.

Bill Russell:                   43:17                Yeah. Anything you want to leave our listeners with before I close out here?

Anne Weiler:                43:24                Don't underestimate patients. Mike was going to say the same thing.

Bill Russell:                   43:29                That's why you guys are together and that's why you're doing the work that you're doing. How can they follow you guys? Is it just wellpepper.com?

Anne Weiler:                43:37                @wellpepper on Twitter is probably the best place to get all our news. And then on our website we've got our blog, where we blog about all kinds of things: things that we're doing, but also general industry thoughts, always related to the patient experience though.

Bill Russell:                   43:53                Fantastic. Well, thanks again for coming on. Please come back every Friday for more great interviews with influencers. And don't forget, every Tuesday we take a look at the news which is impacting health IT. The show is a production of This Week in Health IT. For more great content, check out our website at, actually, I'm going to stop that now: thisweekhealth.com. I keep saying thisweekinhealthit, but I changed the URL about two months ago, so it's thisweekhealth.com. And you can go to the YouTube channel from thisweekhealth.com, click on the video link, and you can see any one of our 800-plus videos that are out there. Thanks for listening. That's all for now.
