Tuesday, November 6, 2018
Years ago, mothers used to place their hands on their children's foreheads to determine if they had a fever. Thermometers now provide more precise measurements and, thus, more appropriate health care. Just as the thermometer did for physical health, could social media do the same for mental illness? And what do we risk by opening our social channels to algorithmic observation? Dr. Munmun De Choudhury has spent years investigating what our social media posts can say about our mental health.
Transcript:
Ayanna Howard: The social media boom over the past decade has come so fast and furious that it is difficult for users and researchers to quantify its costs and benefits. The challenge is: how do you present to society new tools that could be used? Years ago, mothers used to place their hands on their infants' foreheads to determine if their child had a fever, an indication that they might be sick. Thermometers, which we use now, provide more precise measurements, and thus more appropriate health care. Like what thermometers did for physical health, perhaps we can use social media to do the same for mental illness.
What do we risk by opening our social channels to algorithmic observation? How do we balance those risks with the potential benefits to patient care? In our episode today, we'll discuss those topics with School of Interactive Computing Assistant Professor Munmun De Choudhury, who has spent the last several years investigating what our social media posts say about our mental health.
(Instrumental)
I'm School of Interactive Computing Chair Ayanna Howard, and this is the Interaction Hour.
(Instrumental stops)
Munmun received her Ph.D. in 2011 from Arizona State University. Since then, she has spent time at Rutgers, at Harvard as a faculty associate, and as a postdoctoral researcher in the NeXus group at Microsoft Research. At Georgia Tech, she leads the Social Dynamics and Wellbeing Lab. Thanks for joining us, Munmun.
Munmun De Choudhury: My pleasure, Ayanna.
Ayanna: So, there's a lot of discussion lately about social media and whether it has a net positive or negative impact on society, on adolescents and the like. But your research takes a different approach. You've kind of identified it as a tool. So, let's talk a little bit about this. How did you arrive at that possibility?
Munmun: A lot of people ask me that question, and it was fairly accidental, I would say. So, several years ago, I was a new postdoctoral researcher at Microsoft Research, and I was looking for new problems to work on, because I was a little bit tired of the directions I had explored until that point and I wanted something really fresh and really new, at least for myself. What I knew was that I was always intrigued by questions at the intersection of computer science and people. I just didn't quite know which question would give me a fresh perspective. So, I started to look at the literature and to look back on the skills that I had developed until that point. I talked with my mentors and collaborators at MSR, and one question became very obvious that hadn't quite been looked at until that point: how do we take all this wealth of data and information that people leave behind on social media platforms like Twitter and Facebook, and use it to say something meaningful about those people themselves -- not at the aggregate scale of populations, but for the very individual who is leaving behind those digital traces?
This led us to a problem where we wanted to look at shifts in people's behaviors around major life events, and how we can characterize and measure those shifts in a better manner. We started to look at childbirth, because it turns out that it's a life event that a lot of people talk about on these platforms, which allows us to observe these shifts in a very rich fashion. We started to look at how new moms announce the birth of their child on Twitter and on Facebook, and we collected a ton of public data about these new moms -- a long period preceding those announcements and a long period following those announcements.
We started to measure those shifts, and initially we found a lot of what we had expected to see. Obviously, any major life event is followed by a lot of changes, and we did see that in this case as well. However, we also saw some changes that we had not expected. After several iterations of going over those analyses and discussing them with social workers and other experts who are social scientists, we began to hypothesize that some of those changes could be related to postpartum depression.
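To make this kind of before-and-after comparison concrete, here is a minimal sketch in Python. It is not the study's actual pipeline: the field names, the twelve-week window, and the single posting-volume feature are all assumptions for illustration.

```python
from datetime import datetime, timedelta
from statistics import mean

def daily_post_counts(posts, start, end):
    """Count posts per day in the half-open window [start, end)."""
    days = (end - start).days
    counts = [0] * days
    for p in posts:
        offset = (p["timestamp"] - start).days
        if 0 <= offset < days:
            counts[offset] += 1
    return counts

def pre_post_shift(posts, announcement, weeks=12):
    """Compare mean daily posting volume before vs. after an announcement."""
    window = timedelta(weeks=weeks)
    pre = daily_post_counts(posts, announcement - window, announcement)
    post = daily_post_counts(posts, announcement, announcement + window)
    return {"pre_mean": round(mean(pre), 3),
            "post_mean": round(mean(post), 3),
            "shift": round(mean(post) - mean(pre), 3)}

# Toy usage: one post every other day around a hypothetical announcement date.
posts = [{"timestamp": datetime(2013, 5, 1) + timedelta(days=d)}
         for d in range(0, 180, 2)]
print(pre_post_shift(posts, announcement=datetime(2013, 8, 1)))
```

A real analysis would compare many such features across the pre- and post-event windows, such as emotion words, social engagement and linguistic style; daily volume is simply the easiest one to show.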
Ayanna: Okay, so, when you say changes -- so, you're not saying I was posting about cats and now I'm not posting about cats. That's not really what you're talking about. So, what do you mean when you say changes?
Munmun: These changes can be topical. But oftentimes the kinds of changes that we found to be the most salient and meaningful were in the usage of certain types of words or certain linguistic styles that we don't consciously decide on. For instance, take the example of non-content words. These would be adverbs, adjectives, verbs, pronouns and so forth -- words that are kind of the gluing elements of language, where we don't quite pay conscious attention to how we are using them.
Ayanna: So, these are action words like, "I'm running."
Munmun: (Affirmation) Function words, also.
Ayanna: Okay.
Munmun: We found that changes in the use of those words were the most characteristic of these shifts.
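As a toy illustration of what tracking non-content or function-word usage can look like computationally, the sketch below scores a post by the fraction of its tokens drawn from a small, hand-picked function-word list. The list is illustrative only, not a validated psycholinguistic lexicon of the kind used in this line of research.

```python
import re

# A deliberately tiny, illustrative list of "gluing" words.
FUNCTION_WORDS = {
    "i", "me", "my", "we", "our", "you", "it", "they",   # pronouns
    "a", "an", "the",                                     # articles
    "and", "but", "or", "because", "so",                  # conjunctions
    "is", "am", "are", "was", "were", "be",               # auxiliaries
    "in", "on", "at", "of", "to", "with",                 # prepositions
}

def function_word_ratio(text):
    """Fraction of tokens in a post that are function words."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    return sum(t in FUNCTION_WORDS for t in tokens) / len(tokens)

# Two made-up posts: one heavy on function words, one mostly topical.
print(function_word_ratio("I was up all night and I just can't do this anymore"))
print(function_word_ratio("Beautiful sunrise over the mountains this morning"))
```

Tracking how a score like this drifts over time for one person, rather than its absolute value, is closer to the kind of shift being described here.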
Ayanna: This is interesting. So, by mining people's patterns -- language, how they post things -- you can actually see a change in their mental wellbeing.
Munmun: Yes.
Ayanna: Very interesting. So, what did you conclude in this one study that kind of pushed you into this area?
Munmun: So, we found that these digital traces that people leave behind on these platforms can act as a window into people's cognitions and psychological states, which in turn is really valuable information that is often used to identify risks of mental health challenges.
Ayanna: So, how would you -- there are some days I'm, you know, not quite happy, maybe. But it's only temporary. So, how would you see a difference between something that's temporary -- I'm sad today -- versus depression?
Munmun: Right. That's a really good question. And the way we did that was by looking at longer-term data that would allow us to identify whether those shifts are temporary or permanent. Fortunately, a lot of people have been using these platforms for many years now. Many of us have had Facebook since 2006. This allows us to see all the ebbs and flows and all the other changes, short-term or long-term, that we have all gone through.
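One simple way to picture the difference between a passing low day and a sustained shift: smooth a weekly signal and flag it only when it stays below a baseline for several consecutive weeks. The sketch below is a placeholder illustration; the score, window and thresholds are not values from this research.

```python
def rolling_mean(values, window=3):
    """Trailing rolling mean; uses shorter windows at the start of the series."""
    return [sum(values[max(0, i - window + 1): i + 1]) /
            len(values[max(0, i - window + 1): i + 1])
            for i in range(len(values))]

def sustained_drop(weekly_scores, baseline, min_weeks=4, window=3):
    """True if the smoothed signal stays below baseline for min_weeks in a row."""
    smoothed = rolling_mean(weekly_scores, window)
    run = 0
    for s in smoothed:
        run = run + 1 if s < baseline else 0
        if run >= min_weeks:
            return True
    return False

# One rough week in an otherwise steady signal vs. a persistent decline.
print(sustained_drop([0.6, 0.2, 0.6, 0.7, 0.6, 0.6, 0.7, 0.6], baseline=0.4))  # False
print(sustained_drop([0.6, 0.3, 0.3, 0.2, 0.3, 0.2, 0.3, 0.2], baseline=0.4))  # True
```

The smoothing absorbs a single bad week, so only changes that persist across the window register as a shift.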
Ayanna: So, what kind of mental health diagnosis -- I won't say "diagnosis." What kind of areas do you think you can identify with this besides depression?
Munmun: Yeah, so we have so far looked at a variety of different illnesses, ranging from anxiety, bipolar disorder, post-traumatic stress and other forms of clinical stress, to eating disorders, which have been one of our core focuses in the last few years. In the last couple of years we have also been looking a lot at the early cycles of schizophrenia. What is interesting is that across these different conditions, a lot of our observations hold.
Ayanna: Ok, so, we have this social media, we're talking about mental health issues, and there's a big elephant in the room. So, let's just think about this. I was recently reading an article, a Wired article, I think it was, where they were talking about using machine learning algorithms at a school district in Michigan to monitor public posts on social media. And what they were looking for were signs of conflict or violence. So, I can think -- ok, maybe that's good. But, I mean, really, is this something that we should be looking at on a larger scale?
Munmun: Yeah, that's a really good and very complex question. Like any gamechanger we see in the research world, it's a double-edged sword. There are the positives and there are the negatives. Sometimes science and technology have to push the boundaries of what can be done, and then answer the question of how we can do it in a responsible and ethical way that honors the values of the people who are affected by it. So, it sounds like these kinds of surveillance can have a lot of positive impact, because they would provide us a mechanism to reach people who might be at risk, who might be vulnerable, so that they're connected with help and advice at an earlier point in time, which is not normally possible currently. But at the same time, what we need to pay attention to is who is monitoring these posts, how they are monitoring them and how that information is being used. As you can imagine, a lot of this information can be used for good, but with bad actors involved, or actors that may not be paying careful attention, it can also become a basis for discrimination. So, I think we need to answer all those questions, as well.
Ayanna: So, there's some social issues then, obviously.
Munmun: Absolutely.
Ayanna: So, let's think about this. Imagine everything's working and I can get a ping, or maybe my mom can get a ping, that says, "Oh, she's looking kind of unhappy over a long period of time." So, some people might actually think -- do we really want to do this kind of diagnosing? I mean, won't this replace, say, the care of clinicians?
Munmun: I do not think so. I think we need clinicians more than ever, given the demand that exists -- the need for clinicians, the need for mental health resources throughout the world. What I believe is that these kinds of techniques could actually empower clinicians, and empower even the individuals themselves and their friends and family who care about their wellbeing. I believe that these types of algorithms and these types of data provide new insights that currently are not accessible to some of these stakeholders, like clinicians.
Take the typical therapeutic setting, which, courtesy of Sigmund Freud, is the setup we have today. A patient comes in, and a face-to-face conversation happens between the person and the therapist or clinician. A lot of the diagnosis is based on those face-to-face conversations. What is missing, what the clinician misses, is what happens to that person outside of that room. What we are seeing is that these algorithms and these data can give the clinician access to all those cues that are currently inaccessible. In a way, it's a new source of information that can help clinicians do their jobs better.
Ayanna: Ok, so it's a good tool for the clinician. So, what can you do for the person? There are a lot of people, for example, that use WebMD. It's like, WebMD says I'm not sick, so I'm not going to go to a doctor, and it's really because there might be a fear of going to the doctor in the first place. So, how do you address some of that? I have this tool and I can use it, and maybe it pings me and tells me something. So, then I say, "Oh, my posts tell me this, and maybe there's a solution, so I don't need to go see a clinician." What would you tell those individuals to look for, to be careful about?
Munmun: Right, so I think there is an element of knowledge, awareness and transparency -- a good understanding of how these algorithms work. The important piece of information here is that they are not diagnostic tools. The potential that we see, at least in the near future, is to make these individuals better aware of their psychology, of their moods, which is often a challenge, and to use the outcomes of these algorithms as an early warning, as a way to be more proactive about their own wellbeing. Instead of saying, "I do not need to see a doctor," this would be a mechanism through which they can say, "Maybe I should see a doctor," when they were otherwise not considering it.
Ayanna: So, it changes the conversation in a positive way.
Munmun: Yeah.
Ayanna: Well, this is great. We appreciate this conversation. Munmun, thank you for joining us. Mental health is a real concern, and the conversation about it and about addressing the needs of society is very important. Thank you for showing that there is some hope in the computing that's out there to really attack this problem that we have in our society.
Munmun and a host of other faculty and students from the School of Interactive Computing will be attending the ACM Conference on Computer-Supported Cooperative Work and Social Computing this month, November 3-7.
(Instrumental)
If you have a topic that you would like to hear discussed on the Interaction Hour, we want to hear from you. Reach out to us on Twitter or Facebook @ICatGT or go online to ic.gatech.edu for information on how to submit your questions.
(End)