Ms. Thee is a Top 50 Global Thought Leader for AI, Privacy, and Safety with demonstrated experience in delivering revenue and solving complex business technology, governance, privacy and risk challenges at scale.
Ms. Thee is a consultant to some of the world's most innovative healthcare and global technology companies, including Microsoft and UCSF’s Center for Digital Healthcare Innovation, working to accelerate FDA approval for AI use in clinical settings. She is the CEO and Co-Founder of Minor Guard, an artificial intelligence software company focused on making children safer online and in real life. She is a keynote speaker whose talks include the TEDx talk "Bringing Light To Dark Places Online: Disrupting Human Trafficking Using AI." She hosts the Navigating Forward podcast. She was named to Thinkers360's 2021 lists of Top Health and Safety, Privacy, and AI Thought Leaders and Influencers and Women in Business you should follow, and was recently named to the 2022 “Top 100 Brilliant Women in AI Ethics” global list.
Human Trafficking Donation
Medical AI Startup
More From Melissa and Pursuing Uncomfortable:
fiLLLed Life Newsletter
Leave a review
Pursuing Uncomfortable Book
🎶 Podcast Intro: Welcome to the Pursuing Uncomfortable podcast, where we give you the encouragement you need to lean into the uncomfortable stuff life puts in front of you, so you can love your life. If you are ready to overcome all the yuck that keeps you up at night, you're in the right place. I am your host, Melissa Ebken. Let's get going. 🎶
🎶 Episode Intro: Friend, you may not know Lisa Thee by name, but she is an important figure in our future. She's a consultant to some of the world's most innovative healthcare and global technology companies. She is a pioneering ethicist for artificial intelligence; she's worked on keeping children safe online, on stopping human trafficking, and on creating data sets to help us better diagnose and treat disease. I cannot wait for you to hear what she has to say today. Welcome, my friend Lisa. 🎶
Melissa Ebken 0:01
Friend, I am so glad you are catching this podcast today because Lisa is here. And she is amazing. In fact, I don't want to waste any more time. I want to jump right in. Lisa, welcome to the podcast today.
Lisa Thee 0:16
Thank you so much, Melissa, I'm thrilled to be with you.
Melissa Ebken 0:18
Well, I am really thrilled; I've really been looking forward to this conversation. Lisa, you are the CEO and co-founder of Minor Guard, an artificial intelligence software company. Artificial intelligence is coming at us from all directions, and it's exciting and terrifying, all at the same time. What particular nuance of artificial intelligence do you champion and hope to bring to the masses?
Lisa Thee 0:53
Yeah, so when I think of artificial intelligence, having an engineering background, it's really just a tool. It's not a solution. I think there are a lot of use cases people have seen in science fiction movies, for example, or in news clippings, about it being misapplied and causing harm, and that's definitely possible. But I am really focused on what can be done to solve big, hairy problems at scale that we just haven't been able to touch before. And the area that I'm particularly passionate about is improving outcomes for marginalized women and children globally. That's where having a tool like AI, which can look at data in huge amounts and make better predictions, helps us solve some of the challenges of things like human trafficking, child sexual abuse material, livestreamed terror events, or even getting diagnostics and health care for chronic conditions. All of those can really be uplifted by applying this new capability to a space where maybe science hasn't spent enough time doing the research, because the data can tell us what is and isn't working for people at a more personalized level, and we can adjust our course of action from there.
Melissa Ebken 2:10
This is fascinating. When we talk about changing the world, you are talking about huge changes to huge problems in the world, being able to do that with humongous data sets, if I'm hearing you correctly. But interestingly, you also said to make it more personal. And those just seem like two opposite things to me: huge magnitudes of data sets, but also bringing about a personal effect or result. That's mind blowing.
Lisa Thee 2:45
That's the promise of digital transformation. And we already see a lot of it in our daily lives, in recommendation engines for marketing or for product selection. The advertising industry has really taken advantage of this. They can collect enough information to understand what people like us are interested in, make more probable recommendations for things that might benefit our lives, and create a more frictionless customer experience. And the beauty of that is it's all happening quietly behind the scenes, so it's not something you have to learn a lot about to be able to take advantage of. What I'm really excited about going into 2023 is that that kind of technology is starting to proliferate into additional industries that tend to be a little bit slower in adoption: things like health care, public safety, and government, places where they move a little bit slower because the risks of problems are higher. But once they can integrate this capability, I really have a lot of hope for seeing improvement in the moments that matter, around climate change, around civil rights, around equity and dignity in our world. We're going to have a lot more wisdom gleaned out of the oceans of information that are created every day. It's been estimated that 90% of the data on the internet today has been created in the last two years, and 80% of that data is unstructured data that needs some kind of interpretation. As a result, humans can't possibly ingest all of that to make better decisions as subject matter experts. So as we start to look at things like robotic process automation and optical character recognition, capabilities that exist as open source tools in artificial intelligence today, nobody has to create them from scratch. They already exist, you can apply them, and we can start to get much more personalized treatments for healthcare.
Treatments that will work better for you based on your genome, based on your environment, based on people that live like you do, versus just a one-size-fits-all solution. Same thing with crimes: money laundering, human trafficking, things like that. Follow the money, that's always been the wisdom. And when you have all of this information being generated, and we can start to actually glean the right insights out of it, I'm really excited about the ability to make sure that people who are manipulating the systems to victimize others are found. They're leaving a trail, we just can't find it yet. And I think that's where AI is going to be really promising.
Melissa Ebken 5:40
The image or analogy that came to my mind when you were describing this is when I start a really good novel, a novel with some substance and complexity, because at first it jumps around to completely unrelated people, contexts, situations, and it takes about 100 pages to get a sense that, okay, these are going to intertwine, these are going to be related. And all of these diverse factors are going to come together and produce an action, a context, some kind of situation. A coming together of convergent data sets, I guess, is one way to put it, that will cause something particular to happen. And that's what it sounds like the potential will be with AI. I picture a bunch of detectives in a squad room with a bunch of coffee, and they're stressed out trying to find which data is relevant and which isn't, what to follow and what to ignore. Is this what we are getting to, that we can streamline and prioritize the information?
Lisa Thee 6:48
Yeah, let me give you an example. I think there's a lot of fear that AI is going to replace everyone's jobs and we're all going to be unemployed. I'm not as concerned about that future as maybe some others, although I think it is going to require some upskilling. But in the example of that detective, this is based on some real work we've done with nonprofits, the tech industry, and law enforcement around recovering known child victims of sex trafficking. The challenge that was brought to us back in 2017 was that facial recognition technology has mostly been trained on adult white male faces, because that's the labeled data that's publicly available. Obviously, we protect children's data for good reasons. But that did not generalize very well when they tried to apply those models to the problem of identifying known missing children. The missing children posters that the National Center for Missing and Exploited Children produces tend to have grainy, low quality images of the children, whereas the escort ads online tend to be well lit, with a lot of makeup and a lot of done-up-ness to them, because they're advertising. And so they weren't getting a close enough match recommendation based on the existing models to make it usable or actionable. But when we looked at it, what we decided was that we never intended to take the human out of the loop in the process. The detective will always be there. So how can we make the detective's job easier? The way we were able to accomplish that was by organizing the photos in the order of which one was most likely the same person. It might not get the first photo right, but almost every time, in the top line of photos, the first five, it was in there somewhere. So envision going from scanning tens of thousands of ads looking for a specific person, to a day when they're organized right next to each other in a tool, and you just have to go, yep, number three, that's the person.
And they can set up the meet, they can recover the child, and they can work on reintegration into our society for that victim, versus doom scrolling, just hoping to stumble upon where this child is.
Melissa Ebken 9:12
Sure. And whatever it takes.
Lisa Thee 9:15
As an example of that, the first month it went into production, they recovered 130 children.
Melissa Ebken 9:19
Wow! Whatever it takes to keep children safe. I mean, I say that with a grain of salt, but seriously, whatever reasonable, or even somewhat unreasonable, tactics keep our children safe, let's do it. And also, these crimes tend to disproportionately affect people who are minorities. How can this AI help bring those numbers back into balance? How can we leverage it to close that loophole?
Lisa Thee 9:56
Well, extending that use case, we know that trafficking victims tend to be diverse teenage girls, who didn't perform well on those models; the facial recognition had 70% accuracy, and it just had so many mistakes it wasn't useful. When we were able to shift to a more modern approach, nearest neighbor searches, we were able to get it to 99% accuracy. And that's how you give detectives good enough tools to go and recover those children. So I think it's a matter of recognizing that every model is going to have to be specified for a business case, whether that be a law enforcement use case, a banking use case, or a healthcare use case. And once you know what you're solving for, you can optimize a solution to work well for certain demographics based on the data that you feed into it. That's why I'm really, really passionate about encouraging more women to go into data careers and become data literate, because this is one of the emerging skill sets that are going to be needed in the workforce, in things like product development and a lot of the tools that we interact with every day as consumers. And I want to see a lot more women in the boardrooms, helping make the decisions when they're being made, so that they serve a broader population of people than just the typical demographic of tech workers in Silicon Valley.
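[Editor's note] The nearest-neighbor ranking Lisa describes can be sketched in a few lines. This is purely illustrative, not the actual system: the toy embeddings below are made up, and a real deployment would use a face-embedding model and a proper vector index.

```python
# Illustrative sketch: rank candidate photos by nearest-neighbor similarity
# of their embeddings, so a reviewer sees a short ordered list of likely
# matches instead of scanning tens of thousands of images.
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors (1.0 = identical direction).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(query_embedding, candidate_embeddings, top_k=5):
    """Return candidate indices ordered by similarity to the query, best first."""
    scores = [cosine_similarity(query_embedding, c) for c in candidate_embeddings]
    order = np.argsort(scores)[::-1]  # highest similarity first
    return [int(i) for i in order[:top_k]]

# Toy embeddings: candidate 2 points almost the same way as the query.
query = np.array([1.0, 0.0, 0.0])
candidates = [np.array([0.0, 1.0, 0.0]),
              np.array([0.5, 0.5, 0.0]),
              np.array([0.9, 0.1, 0.0])]
print(rank_candidates(query, candidates))  # → [2, 1, 0]
```

The human stays in the loop exactly as described: the tool only orders the candidates; a person makes the final identification.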
Melissa Ebken 11:33
Amen to that. Amen to that. Now, we've talked about how it can help solve that particular horrific crime. Can it also help keep children safe when they're online? That's something that terrifies me.
Lisa Thee 11:51
That's what we set out to do with our Minor Guard solution: to identify that a device was being used by or registered to a child, that an image being taken on it was explicit in nature, which is a felony to create, and block that image from ever being saved on the device. Today, that technology can be applied by using a family iOS account and an iPhone 12 or later. You can set that up in parental controls. It used to take 130 choices in settings to block your child from doing something like that; today, it's a single one. So if you go to my website at lisathee.com/tedtalk, we show you how to set up that capability, and also more about our journey in terms of developing that technology to disrupt human trafficking.
Melissa Ebken 12:46
And that link will be in the show notes. Friends, pause the podcast, go to the show notes, click on that link, and enact these settings so that we can keep our kids safe online. But then don't take it for granted that you've done the parenting.
Lisa Thee 13:01
Exactly, the parent still needs to help police this content, not just block it in the first place.
Melissa Ebken 13:08
Yeah, and not just do the setting and be done, but keep monitoring what our kids are doing, and keep asking them questions and making sure that they're doing what they say they're doing and what they're supposed to be doing.
Lisa Thee 13:24
Absolutely. On my page at the bottom, there are a couple of products out there that parents can use to help with that AI monitoring. They're the things that I subscribe to, being an insider and knowing what works and what doesn't work. And I decided to put my name on that, and my code on that to give you a discount, because I want to see more parents engaged. You really are the gatekeepers for the digital lives of your children, and I wanted to make it as easy as possible to find something that works. So I'm a big fan of Gabb phones for first phones; that's what my elementary school aged children use, because it doesn't have the overhead of the internet and social media on it. When they age up and I have to unlock the world of the Instas and the TikToks and all of that, I plan to use Bark Technologies, which is who I partnered with once I exited from Minor Guard, on the Android side of the market. There are many solutions out there that work. Those are the two that I choose to partner with, because I think the technology is best and the user experience is best, but my encouragement to you is do something instead of nothing. Whatever you do is going to be beneficial. And usually when people are grooming children for bad choices, it's not one moment that matters, it's staying engaged that matters. Parents are really the safe harbor for their children when somebody violates their boundaries online. And it's really important to be an ally to them when that happens, and not freak out and take their phones away, because then they'll be secretive.
Melissa Ebken 14:57
That's a tough, tough situation. And thank you for those tools that we can use. And again, all of those links will be in the show notes. Make sure you click on those links and enact these safety measures. So can we jump over to health care briefly? (Sure, sure.) You've mentioned that these large data sets can help us on an individual level in health care. How does that all come together for us?
Lisa Thee 15:31
Yeah, the earliest experimentations I saw in that were during my days as an AI solution owner at Intel Corporation. There was some really interesting work happening in 2015-16 around a collaborative cancer cloud, where research institutions were sharing data in a federated way to be able to train better predictive models for care. What that allowed them to do was to ask questions of the information without seeing personally identifiable information. Let me give you an example, because I think it makes it a little easier to grok. An example would be: I'm a woman of a certain age, with this type of genome and this particular cancer diagnosis, and I want to know whether to take drug A or drug B to have better outcomes for getting my cancer into remission. Well, in order for recommendation engines to be approved by the FDA, they have to have a certain amount of data diversity, to guarantee that they will perform as well on somebody that's middle aged versus young, across different ethnic groups, different gender groups. So it's really, really hard, within a HIPAA-controlled environment like the US healthcare system, to have access to enough information to prove that you've done your research and that it will work better based on somebody's genome. So a collaboration happened between the technology companies and some of the major cancer institutions, like MIT, Harvard, and Oregon State University. And it allowed them to ask a question of the data without having to see my name, my patient medical record number, all of those things. It can ask: how many women have you seen that are within this age range, that have this type of genome and this diagnosis, that have benefited from this treatment? And it can query all those places without commingling all the information and come back with an answer: it's seven.
But if it only went to one research hospital, maybe it could only get two, and with two you can't move forward in medical research. So I'm really excited about federated learning and the use of confidential computing to accelerate research on a lot of rare diseases that maybe don't have enough information at any one hospital to be solved. And so we partnered with a really promising startup called BeeKeeper AI that has a solution offering in Microsoft Azure to help accelerate FDA model approval for people that are developing healthcare technology in this space. I encourage anybody who is doing startups in pharmaceuticals or in medical equipment to explore their capabilities; that's BeeKeeper AI, and you can check out their offering in the Microsoft Azure store. We helped build that solution for them.
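[Editor's note] The federated query Lisa describes, in which each institution answers an aggregate question locally and only counts leave its walls, can be sketched roughly like this. The field names and records below are entirely hypothetical, and real systems (federated learning, confidential computing) involve far more machinery than a summed count.

```python
# Minimal sketch of a federated aggregate query: each hospital computes a
# count over its own records; the coordinator only ever sees per-site counts,
# never patient-level data.

def local_count(records, age_range, genome_marker, diagnosis, treatment):
    """Count matching patients at one institution; raw records never leave it."""
    lo, hi = age_range
    return sum(1 for r in records
               if lo <= r["age"] <= hi
               and r["genome_marker"] == genome_marker
               and r["diagnosis"] == diagnosis
               and r["benefited_from"] == treatment)

def federated_count(hospitals, **query):
    """Sum per-hospital counts; only aggregates are commingled."""
    return sum(local_count(records, **query) for records in hospitals.values())

# Hypothetical records at two institutions.
hospital_a = [{"age": 52, "genome_marker": "BRCA1", "diagnosis": "breast_cancer", "benefited_from": "drug_A"},
              {"age": 47, "genome_marker": "BRCA1", "diagnosis": "breast_cancer", "benefited_from": "drug_B"}]
hospital_b = [{"age": 55, "genome_marker": "BRCA1", "diagnosis": "breast_cancer", "benefited_from": "drug_A"}]

n = federated_count({"A": hospital_a, "B": hospital_b},
                    age_range=(45, 60), genome_marker="BRCA1",
                    diagnosis="breast_cancer", treatment="drug_A")
print(n)  # → 2
```

No name or medical record number ever reaches the party asking the question; it only learns that two matching patients benefited from drug A across the two sites.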
Melissa Ebken 18:35
Excellent. And we'll have a link for that as well for people to check out. Is this technology present also in the diagnostic side of healthcare, or mainly in the treatment side?
Lisa Thee 18:49
We're early days right now. What the technology does is act as more of an escrow account for data. It helps people find the right data to train their models, so that they can cut down the cost and the time needed to get FDA approval for the models; we estimate it's about a 50% reduction. What those models are focused on doing has a wide range of applications.
Melissa Ebken 19:14
Great. So you are an ethicist in the artificial intelligence world. What are a couple of the big questions that you grapple with?
Lisa Thee 19:28
What is an acceptable cost of doing business? Nothing is risk free. How do we balance the risks between privacy and safety? How do we define privacy? In the examples of the crimes that I focus on, what's more important: the privacy of the criminal who doesn't want people knowing what they're doing, or the privacy of the child who's having the worst day of their life watched by other adults for entertainment thousands of times a day? I think we have some really big challenges in terms of bringing more experts into the technology space and not making engineers the ethicists who help us navigate these waters. And I think we need to bring a lot more technologists into government to help set more appropriate regulation. We didn't have car seats in cars for quite a long time, because we didn't believe there was a big enough risk to require them. But eventually the data showed that there were more and more accidents happening and that seatbelts mattered. So we all have to wear them today, even if we are good drivers, right? I look forward to the day that the internet has some guardrails for us all.
Melissa Ebken 20:48
And yet, when you say it, it sounds like such an obvious answer: obviously, I want to protect the kids. But also, in my day to day life, I don't want to give out my genome for anyone to use, because yes, the AI models can use it to track criminals, but criminals can use it too. Or other people can use it that don't have ethics departments. So the privacy issues and the safety issues are not always so cut and dried. There's such a murkiness in there.
Lisa Thee 21:27
It's always going to be two sides of the same coin, and no trade-off comes without an impact on the other side. That's why I think it's going to be really important that we get more technologists into the regulation space and more regulation people into the technology space to come up with the right balances, because that threat landscape will evolve over time, and we need to evolve along with it.
Melissa Ebken 21:52
And life is a continuing exercise in asking the big questions and sitting with the difficult questions. We make real progress when we do that, when we don't shy away from it, when we do sit with what's uncomfortable and dissect it and bisect it, and look at it from all angles.
Lisa Thee 22:16
I've been doing that for quite a few years now. It never gets easier, but it gets more tenable. And if we want to have better outcomes for everyone and democratize dignity, we're going to have to sit down and face some of the hard questions.
Melissa Ebken 22:31
Lisa, I have 100 questions going through my mind that I would love to ask you every single one of them. Maybe I'll have you back again someday and ask you a few of those. But I do know that you are working on a cause that is very close to your heart right now. Related to keeping children safe, and in an environment where they know they are valued. Can you tell us a little bit about your project? And how we can help?
Lisa Thee 22:58
Sure. Well, first, I'd actually like to answer your first question. If you're interested in learning more about my work and how you can apply mission to your work (obviously, Melissa, as a pastor, you have that intrinsically, but maybe some of your listeners are looking to infuse more mission into their work lives), I'm releasing a book in 2023 called Let's Go, a women's guide from burnout to a more sustainable life, where I share a lot more about how I redefined my career to focus on the moments that matter, and how I've been able to adjust over time, with the challenges that we've faced in this pandemic, to maintain impact while I have developed some permanent disabilities. I want to give a guide for other people that are maybe looking toward the future and wanting something a little bit different than what they've had historically. So if you go to www.lisathee.com/go, you can get a free preview of the first two chapters of that book. It answers a lot more of your questions about some of the work that I've mentioned today, in a lot more detail, as well as giving guides on how to find what your life's work can be that will give you mission. Secondarily, the organization that I wanted to mention today is one I've been a board member of for about six years, called Three Strands Global Foundation. It's an organization focused on prevention and reintegration services for victims of human trafficking. I'm really passionate about their mission because they are training children about healthy boundaries through school programs in eight states now, and we're continuing to expand, developing programs in Canada and even into Africa. We also have reintegration services for victims of human trafficking that are looking to be part of our society but have probably missed school, probably have some PTSD, and have some situational needs, and so we provide services to get them gainful employment in our local communities.
In my local community in Sacramento alone, we place about 200 people a year. And I'm really proud of the generational opportunities that can provide for not only the victims of trafficking but their children, breaking cycles. So if you're interested in supporting that organization, I think the holidays are a really nice time to think about where you can give back to your communities. And I know that Three Strands Global Foundation will make good use of every dollar that's donated, because I see the balance sheets. And I have a lot of confidence, knowing that teaching teachers what to look for in classrooms, connecting communities with resources, and being on the frontlines to make sure kids get the help that they need when somebody is trying to take advantage of them, is the way that we're going to end this terrible crime.
Melissa Ebken 25:57
Friends, human sex trafficking seems like a huge issue that we are powerless to do anything about from our place in this world. But this is an action you can take that will make a real difference in this effort. So make sure you click on those links, get informed and make a donation. Be an active participant in healing and preventing these horrific situations. Lisa, I want to thank you for sharing all of this today on the podcast.
Lisa Thee 26:30
Thank you so much Melissa. It's been a wonderful opportunity.
🎶 Episode Outro: Thank you so much for tuning into today's episode. If this encouraged you, please consider subscribing to our show and leaving a rating and review so we can encourage even more people just like yourself. We drop a new episode every Wednesday so I hope you continue to drop in and be encouraged to lean into and overcome all the uncomfortable stuff life brings your way. 🎶