
Shir Meir Lador | Using Data Science to Keep Financial Data Secure



Podcast: Women in Data Science
Episode: Shir Meir Lador | Using Data Science to Keep Financial Data Secure
Episode pub date: 2019-08-15

In addition to her job at Intuit, Lador is a WiDS ambassador in Israel, has her own podcast about data science, and is a co-founder of PyData Tel Aviv meetups.

Lador’s team at Intuit focuses on machine learning for security and fraud applications, protecting customers’ sensitive financial data from fraudsters and hackers. The team uses anomaly detection and semi-supervised methods to secure Intuit products and data. “In general, putting AI into products is not an easy task.” But she thinks we need to put a lot of effort into securing our data, especially after the recent data leaks at Equifax and Facebook. “I think the world is going into that direction with the GDPR and other initiatives. AI has a lot of potential of helping in that domain,” she explained during a conversation with Margot Gerritsen, Stanford professor and host of the Women in Data Science podcast.

Israel has deep expertise in the security domain because many young people study security and encryption during the country’s mandatory military service. Lador had the option to do this during her own service, but since she already knew she would pursue a career in this area, she instead chose to become a pilot instructor in the flight simulator. “It was a very unique experience that I would probably never get to do.”

When Lador was starting her career in data science, she did not know many people in the field. She decided to start a PyData branch in Israel because she wanted to build a professional data science community. “My main motivation was that I wanted to learn and that I wanted to have friends and people to consult with and learn from. And now I have so many data scientist friends because of all this work and it’s great. I love it.”

She noticed when organizing PyData events that it was much easier to get male speakers. When she would ask a talented female scientist to talk about her work, the response was often: “No, I’m not an expert… I’m not ready. I need to learn more…” Lador would push back: “I was like, no, you’re enough years in the field. Everyone can learn something from you.”

Being a WiDS ambassador was like an extension of her PyData work. “I get to decide what’s in the conference and bring the best talks there.” Her experience organizing the PyData meetups helped her know how to create a valuable conference. She sees WiDS as a great opportunity to encourage more women to speak by giving them a platform, but also by bringing all the people together. “Seeing all those women on stage. This gives great inspiration to speak at other events, not just in WiDS. I think this is just an amazing initiative.”

RELATED LINKS
Connect with Shir Meir Lador on Twitter (@shirmeir86) and LinkedIn
Listen to Shir’s podcast Unsupervised
Learn about PyData Tel Aviv Meetup
Read more about Intuit
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Find out more about Margot on her personal website

The podcast and artwork embedded on this page are from Professor Margot Gerritsen, which is the property of its owner and not affiliated with or endorsed by Listen Notes, Inc.

Powered by: ListenNotes

Marzyeh Ghassemi | Applying Machine Learning to Understand and Improve Health



Podcast: Women in Data Science
Episode: Marzyeh Ghassemi | Applying Machine Learning to Understand and Improve Health
Episode pub date: 2019-08-28

During her conversation with Women in Data Science Co-Director Karen Matthys on the Women in Data Science podcast, Ghassemi explains how she is tackling two issues: eradicating bias in healthcare data and models, and understanding what it means to be healthy across different populations.

She says there are built-in biases in data, access to care, treatments, and outcomes. If we train models on biased data, those models will operationalize the biases. Her goal is to recognize and eliminate those biases in both the data and the models. For example, research shows that end-of-life care for minorities is significantly more aggressive. “This mistrust between patient and provider, which we can capture and model algorithmically, is predictive of who gets this aggressive end-of-life care.”

Ghassemi is also interested in the fundamental question of what it means to be healthy, and whether that definition generalizes. Answering it requires a different mode of data collection and analysis. She explains that data is typically generated only when you go to the doctor because you are sick. But what matters more than your infrequent doctor check-ins is how you’re experiencing things day to day: the self-report. She sees a huge opportunity in combining doctor visit data, self-reported data, and passively collected data from wearable devices worn by people who consent to their behavioral data being used. All of these data modalities can help us understand what it means to be healthy for all kinds of people.

She also offers valuable insights from her career in data science as a woman, a minority and a mother. She is a visible minority because she chooses to wear a headscarf. “I became comfortable very early on with defending choices that I had made about my life. And that for me really was instrumental in the academic process. Because what is academia if not constant rejection?”

Ghassemi made the decision to become a mother while pursuing her PhD. “As a society we should recognize that having kids is not a career hit.” She felt she was able to have kids and be successful as a graduate student because there was a community around her that was supportive and recognized that having children would enrich her life and experience. She credits having a supportive mentor as being instrumental in making it all work, saying, “You have to choose the race that you can be successful at.”

She wants young women entering the field to know there is no one defined path. She says don’t worry about checking boxes. Choose things that you are very passionate about. Find a mentor who’s willing to invest in you, and the path you want to take. Surround yourself with good people. It’s not the project that makes you successful; it’s the people. If you can’t trust the people around you, and learn how to work together, you are going to fail. Having the right mentors and having the right people around you should always be your guiding star.

RELATED LINKS
Connect with Marzyeh Ghassemi on Twitter (@MarzyehGhassemi) and LinkedIn
Find out more about Marzyeh on her personal website
Read more about the University of Toronto Faculty of Medicine and Vector Institute
Interview with Marzyeh: Artificial Intelligence Could Improve Health Care for All — Unless it Doesn’t
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Find out more about Margot on her personal website



Jennifer Chayes | Eliminating Bias

Podcast: Women in Data Science
Episode: Jennifer Chayes | Eliminating Bias
Episode pub date: 2018-10-19

Attaining tenured status at a major university is often the culmination of an academic’s career; giving it up is unthinkable for most. But after 10 years at UCLA, Jennifer Chayes was offered a job at Microsoft. The offer, she says, “scared me to death,” but she took the job and is now managing director for Microsoft Research in New England, New York, and Montreal.

“There are brass rings that come along, and they always come along at the most inopportune times, and they look really scary, but I believe that we should grab them when they come along,” Chayes says during a conversation with Margot Gerritsen, Stanford professor and host of the Women in Data Science podcast. Chayes is a big advocate of eliminating biases in search algorithms and believes that data scientists have “the opportunity to build algorithms with fairness, accountability, transparency and ethics, or FATE.” FATE, a group formed at one of Chayes’ labs, works to address inequity in the field.

In one instance, the group discovered that certain searches returned skewed results: searches for computer programmers, for example, typically returned people with male names. Chayes’ team changed the search algorithm to remove that built-in bias. Removing bias from hiring is not only fair, it produces better outcomes, she says. “I think that you’re more likely to ask the right questions if you have been on the wrong side of outcomes. So you’re much more likely to see a lack of fairness or bias as a problem before it happens.” Chayes believes that the field of data science is changing and that the increase in underrepresented voices will be critical to its future.
