Willpwr Confidentiality
What is especially innovative about our Willpwr apps is that they learn about you as you use them and tailor their advice to fit your specific circumstances. To do that, we need to keep track of when and where a user presses an urge button or an “oops” button, what the intensity of the urge was, what the strength of willpower was estimated to be at the time, and many other pieces of information that the user identifies as buttons and sliders are pressed. That information helps us offer just the right advice and eventually coach the user about precisely what changes need to be made in his/her lifestyle to beat a habit. However, that information can be pretty private stuff. We decided from the outset that our apps absolutely had to protect the privacy of the individual, so, knowing that users would not want this information shared, we did the following:
1) We kept all information on the user’s phone, not in the cloud, and never transmitted any of it to us.
2) We made it possible for the user to erase all information if s/he decides to do so. You simply delete the app, and that dumps all the data the app has collected. The data exists nowhere except on your phone, and when you delete the app, it’s gone, gone, gone. You haven’t lost the app; you can always re-install it at a later time, but the data you collected earlier is gone. You’ll start fresh with an empty database.
3) We decided not to make the apps work as social media. You cannot locate or link up with any other users. Now this was a tough decision because a lot of research shows the importance of social support for recovery. But our app is not a complete treatment. It is an assistant. We provided information for finding bona fide groups such as Overeaters Anonymous or Sex Addicts Anonymous, but we kept the app private.
4) While we aren’t advertising this as a form of therapy but rather as an assistant, we are nonetheless following HIPAA guidelines, being as careful as technology allows to protect confidentiality.
5) The privacy decision also meant that we had to sacrifice the ability to analyze the data our users collect, which could have helped us improve our products. We’ll handle that issue by asking for volunteers some time in the future. For now, we’ll rely only on published scientific findings to guide the advice we give, and we won’t try to have our app substitute for professional help. Well, there is one exception to this statement. We’ve found interviewing recovering users extremely helpful, so at times we include advice from others who are beating their problems. We’ll clearly label those sections when we use them, but in our opinion, the app wouldn’t be complete without that kind of valuable first-hand experience.
We even hired a talented medical ethicist, Emmi Bane, to help us thread this confidentiality needle and navigate the “assist but not treat” issue that we face. Our app is intended as self-help, not as professional treatment. We think it fills a void, however, and I think you can see that we are being as careful as possible with privacy issues.