We're all on the journey.
March 21, 2024

The Future of Well-Being: AI Chatbots and Accessible Mental Health Care

In this enlightening episode of The Light Inside, we delve into the transformative world of mental health care through the lens of artificial intelligence. I, Jeffrey Besecker, had the pleasure of conversing with Karin Stephan, the co-founder of the innovative AI chatbot app, EarKick.

Tune in to The Light Inside to explore the impact of artificial intelligence on mental health with Karin Stephan, co-founder of the AI chatbot app EarKick.

Discover how EarKick is revolutionizing mental health support by offering compassionate guidance and wisdom in a user-friendly way.

Join us as we delve into the future of well-being and the potential of technology to transform the way we approach mental health challenges.

 

Timestamps:

00:00:02 - Introduction to Mental Health Stigma
00:01:06 - Introducing EarKick AI Chatbot
00:01:32 - Mint Mobile Sponsorship
00:03:08 - Addressing Mental Health with AI
00:04:14 - Understanding AI in Mental Health
00:04:35 - EarKick's Approach to Mental Health
00:05:08 - Guest Introduction: Karin Stephan
00:05:32 - Karin Stephan's Background and Inspiration
00:06:36 - The Genesis of EarKick
00:07:53 - The Concept of a Mental Health Sidekick
00:08:17 - EarKick as an AI-Driven Chatbot Hybrid
00:09:09 - Tracking Personal Data for Mental Health
00:11:21 - Building Trust and Inclusion in EarKick
00:13:23 - EarKick's Board of Advisors
00:15:27 - EarKick's Data-Driven Feedback System
00:17:36 - Cultural Availability and Accessibility
00:21:00 - Addressing Subconscious Patterns with AI
00:24:47 - Detecting Serious Mental Health Issues
00:27:53 - Adapting Traditional Mental Health Approaches
00:31:27 - Addressing Cultural Disparities in Mental Health Care
00:37:18 - Future Growth and Reshaping Mental Health Care
00:40:04 - How to Access EarKick Resources

 

(00:02) AI Chatbot EarKick

This chapter explores the intersection of artificial intelligence and mental health through my conversation with Karin Stephan, co-founder of EarKick, an AI chatbot designed to support emotional well-being. I share my excitement about how EarKick is making mental health care more accessible and socially inclusive. Karin opens up about her personal experiences with cultural differences and self-doubt during childhood, which inspired her to create a tool that could offer encouragement and reduce anxieties. We discuss the transformative potential of AI in providing timely and compassionate support for those struggling with mental health, highlighting how EarKick aims to serve as a personal sidekick, offering kind words and motivation exactly when needed.

(13:23) Trust in AI Mental Health Platforms

This chapter tackles the importance of building trust and privacy in AI applications, focusing on the EarKick app as a case study. We explore the app's approach to user data protection, highlighting its policy of not requiring personal information for usage, ensuring user ownership of data, and the absence of ads, all contributing to a safe and private user experience. The conversation also touches on the involvement of mental health professionals, like Professor Dr. Jasper Smith and Dr. Erica Simon, in shaping the app's direction and content, emphasizing the importance of expert input in creating an inclusive and supportive digital environment. Finally, I describe how EarKick uses AI to detect subtle changes in voice and typing patterns as indicators of mental health issues, offering users insights into their emotional state without overwhelming them with clinical assessments.

(19:29) AI in Mental Health Treatment

This chapter explores the intricacies of artificial intelligence (AI) in mental health, specifically how personalized AI can effectively engage with users to provide tailored mental health support and advice. I share insights on how AI, much like a therapist or coach, can create an environment where individuals are more receptive to confronting uncomfortable truths by providing non-judgmental, patient, and context-aware feedback. We examine the importance of user input and the potential limitations of AI when faced with a lack of context. Additionally, I touch on the critical role of diverse, unbiased data in developing AI algorithms and how AI can support those with neurodivergence or individuals experiencing serious mental health crises such as suicidal ideation. Through the example of EarKick, a platform designed to detect early signs of crisis and intervene proactively, we consider the promise of AI in addressing complex mental health issues.

(29:00) Expanding Mental Health Care Through Technology

This chapter examines the myriad challenges individuals face when seeking mental health support, including negative office vibes, interpersonal conflicts, and preconceived notions about therapy. We explore how the AI-powered platform EarKick aims to provide a user-friendly, accessible tool for self-reflection and expression, helping users to articulate their feelings and desired support before seeking professional help. The platform's design counters the spread of misinformation, particularly around topics like narcissism, by offering scientifically-backed, non-judgmental guidance. Additionally, we address how EarKick tackles cultural disparities in mental health care accessibility, with free core services and efforts to have third parties cover costs, promoting wider reach and adoption through word-of-mouth.

 

 

JOIN US ON INSTAGRAM: @thelightinsidepodcast

SUBSCRIBE: pod.link/thelightinside

 

Featured Guest:

Karin Stephan

Music Score by Epidemic Sound

Executive Producer: Jeffrey Besecker

Mixing, Engineering, Production, and Mastering: Aloft Media Studio

Senior Program Director:  Anna Getz

 

Transcript

The Future of Well-Being: AI Chatbots and Accessible Mental Health Care

00:02 - Jeffrey Besecker (Host)
This is The Light Inside. I'm Jeffrey Besecker. Our mental health. For ages, the specter of our emotional and psychological well-being has presented significant challenges. The stigma surrounding mental health has often prevented individuals from seeking the treatment and support they need and deserve. Alongside this stigma, there is still a lack of understanding and empathy toward those suffering from mental health concerns, concerns common to us all. Imagine what would happen if artificial intelligence could change this course. Throughout our lives, we've all had those moments when our inner voice whispers a challenging yarn that at times seems overwhelming, invoking in us the desire for kindness, compassion and understanding, and an ear that listens and speaks with an endless supply of inspiring and uplifting wisdom. What if the challenges of sustaining our mental health were as accessible as a gentle panda? Today, we chat with Karin Stephan, co-founder of the emerging AI chatbot app known as EarKick. When it comes to addressing our mental health, at times we all need a friendly little kick in the pants and a compassionate ear that guides us. Tune in to find out how EarKick is disrupting the mental health space and shaping the future of our well-being, when we return to The Light Inside. When it comes to mobile service providers, with their high rate plans, extra fees and hidden costs or expenses,

01:39
Many of the big name networks leave a bad taste in your mouth. Mint Mobile is a new flavor of mobile network service, sharing all the same reliable features of the big name brands, yet at a fraction of the cost. I recently made the change to Mint Mobile and I can't believe the monthly savings, allowing me to put more money in my pocket for the things which truly light me up inside. Making the switch to Mint Mobile is easy. Hosted on the T-Mobile 5G network, Mint gives you premium wireless service on the nation's largest 5G network, with bulk savings on flexible plan options. Mint offers 3, 6, and 12-month plans. The more months you buy, the more you save. Plus, you can also keep your current phone or upgrade to a new one. Keep your current number or change to a new one as well, and all of your contacts, apps and photos will seamlessly and effortlessly follow you to your new low-cost Mint provider. Did I mention the best part? You keep more money in your pocket, and with Mint's referral plan, you can rescue more friends from big wireless bills while earning up to $90 for each referral. Visit our Mint Mobile affiliate link at thelightinsideus forward slash sponsors for additional mobile savings, or activate your plan in minutes with the Mint Mobile app.

03:08
It can feel scary to address our mental health at times. We've each experienced the grip of anticipatory anxiety, whether or not we've known it by that label. Much like addressing our mental health as we begin to feel our way through its newness, AI presents yet another opportunity to experience the pull of stress and anxiety. Although they may initially feel scary, AI tools are not monsters. Nevertheless, it's easy to get worried about the societal change they bring about.

03:40
The question remains: how do we reconcile the role of AI in the mind's eye and allow it to bring us closer to who and what we are and everything that we might become? Definitions matter, and the way we label and perceive things influences our perception of them, even going so far as to project the actual experiences that occur. For many of us, a clear and decisive definition of what AI truly means eludes us. Starting with a clear and simple definition, we begin to understand AI as smart automation and then branch out from there. AI is powerful and its potential is huge, but clarity is our first step.

04:22
AI products are tools and, like any other tool, their value lies in how we use them, especially when addressing complex issues like our mental health. In this episode, we chat with Karin Stephan, COO of EarKick, a chatbot app that utilizes AI to auto-detect your mental state, offering kind, compassionate feedback in response. Karin, I'm excited to learn more about how your platform, EarKick, is helping us address the specter of mental health care in a socially inclusive way. Thank you for joining us today to share your amazing concept.

04:59 - Karin Stephan (Guest)
Oh, thank you so much. It's such an honor to be in this podcast with you and I'm expecting very, very good questions from your side, as I know your show. 

05:09 - Jeffrey Besecker (Host)
Well, thank you. We're so excited to share this concept not only with our listeners, but with a larger public, seeing the great impact that your passion for mental health care and your compassion for others is sharing with the world. Let's start off today by sharing first a little bit of your background and what inspired you to create the app. Let's go there first, if we would.

05:32 - Karin Stephan (Guest)
Sure. So I grew up as a kid of a Latino mother and a Swiss father, with lots of nationalities in our family, and you know as colorful as this is and it makes you very susceptible to you know cultural differences, but for a kid it's very difficult sometimes to make sense of those, let's say, difficulties, and I grew up with a lot of misunderstandings, a lot of self-doubt, a lot of identity issues, a lot of anxieties too, and I in my little head, when I was eight, I wished for a little sidekick, a little man in my ear that would whisper encouraging things when my parents were fighting, or would tell me not to worry about my piano recital, or would tell me that school isn't all bad. You know these little encouragements and at the right time in the right place, just for me. That was my dream when I was a kid. It could have saved me a lot of problems growing up. 

06:36
Then, of course, I had to learn to manage anxieties and life at large. But I encountered that pattern again and again when I was leading teams, when I was building companies. I built my first company in my early 20s as a musician. Every time, I was so frustrated to see that if I had given someone the right tip at the right time and in the right portion, I could have spared them a lot of pain. And so when technology advanced and it became possible to build such a sidekick in the ear, such a sidekick for everyone, that's when I jumped at the opportunity.

07:20
I found a co-founder who's a computer vision wizard and AI genius, Dr. Robert Fay, and together we set out to create a companion that not only measures your mental health and your mood, anxiety, your emotions, but also gives you real-time suggestions, just like that little sidekick that I imagined. And that's also why EarKick is called EarKick, because you should have the impression that something is whispering in your ear just right when you need it.

07:53 - Jeffrey Besecker (Host)
We've all had that little voice on our shoulder, sometimes whispering things into our ears or, you know, into that stream of subconsciousness. I can see where having that encouragement can be of such value and create such positive momentum in our mental health. Karin, your platform is an AI-driven chatbot hybrid of sorts, correct?

08:17 - Karin Stephan (Guest)
It is. Now, what does that mean, right? So basically, it combines the best of three worlds. First of all, you are not going to change anything about your life, your lifestyle, your attitude, your thoughts, if you don't know where you're at. And that's where the measurement part, the AI-driven measurement part, comes in.

08:40
So we're all on the mental health spectrum. We all encounter challenges throughout life. Nobody is spared, regardless of their background. So what we need to know is where am I today or where was I last week? And for that we need measurement. Everyone knows you can't change what you can't measure and you can't manage what you can't measure, let alone improve. 

09:03
So the measurement part of the EarKick platform is that you lend your voice or your typing or a little video snippet. It's like checking in with an app where you say how you're doing or what's on your mind, and our AI then calculates that, assesses that and gets your emotions, gets your mood, gets your anxiety level, gets the context of what you've been saying and the reasons why you're feeling that way. It can even detect some of the symptoms that you may have, be it tiredness or irritation, and based on that measurement, then we can create, or the algorithm can create, a response. And the crucial thing here, this happens within seconds, almost in real time, that you get a response that says, oh, I'm sorry that you're feeling down this morning and that the weather is bad. It's been bad for three days, but here's what you can do. And by having this kind of measurement part and then the interaction part completely tied to your actual state and personalized to what's going on in your own life, nobody else's, just your own life, that makes people open. It makes me accept what is said much better.

10:22
And then, if I have that over time, not only do I get a very clear picture of where I'm at and how I'm progressing, but I'm also getting the opportunity to have a good start to the day. Because if I hear, hey, Karin, today you slept well, and remember that today is your big day, and I wish you all the best, and hey, don't forget to hydrate, or something like that, then the chances that I'm going to get a good start and then I'm going to be able to deal mentally and physically with what's thrown at me is much higher. And needless to say that all this data and all those stats that you're generating with your input is being connected to data like your movement, your sleep, the weather outside, what you've been doing, your breathing exercises, et cetera, to then let you know at any time where you're at in your journey. And we all know that change is difficult, but with a little helper it's easier.

11:21 - Jeffrey Besecker (Host)
Our daily activities and mental health provide several key data points, as you've mentioned, that we might track to learn more about our core habits in order to gain meaningful insight into the state of our emotional and psychological health. How does the app actively track the relevant personal data that accompanies our mental health?

11:42 - Karin Stephan (Guest)
Well, our mental health is a very complex thing, right? So there is, of course, how we think about ourselves, how we feel about ourselves. There are the bodily cues that we get, you know, whether we feel anxiety somewhere in the body or we tense up when we get stressed. Then there are these influences or stressors from outside: work, school, the weather even, or relationship problems that also play a role. And then there is what we do, like how we cope and how we react, how we respond to things.

12:21
And EarKick is able to take all of this together and sort of guide you in finding out what influences, what impacts your mental state most and how you can change that. So it's not just what you would say, okay, you go to a doctor and you fill out a questionnaire, and then you have this and this and this symptom and there you go, you have this depression, or there you go, you have a burnout. It's within your environment, with everything you've been given, the cards that you've been dealt and the situation that you are in, the way you respond, the way your body responds. That is who you are in mental health. So you're not just in a bucket of depression, anxiety or something else. You are a unique journey, you are a unique personality and you have some sort of agency, and we want you to know where you have agency and where you can do something about what you don't like.

13:23 - Jeffrey Besecker (Host)
With AI's new and emerging technological advances, many aspects of our daily lives are set to be significantly altered. Yet for many there remains an undercurrent of unfamiliarity and distrust when utilizing this new technology. What common practices has your team undertaken to build that sacred space of trust and inclusion into your programs? 

13:49 - Karin Stephan (Guest)
Well, first of all, in order to establish trust, we need to feel that our data is safe, that we're respected as individuals, and the way you can do that best is, first of all, by not asking people's names, not asking their emails, not asking any personal data when they want to use your tool. So when you go on the App Store or in Google Play and you download EarKick, you will notice that you don't have to register. There is no connection between what you're going to do on the app and your personal data, right? And so that's the first aha moment a lot of people get. Then they notice, oh, wait a minute, so I'm getting all of this for free, no ads, what's going on here?

14:36
So the way we do it is, we want you to experience that we take privacy seriously, that you own the data, that nobody can serve you ads, nobody can tell you anything, and if you lose your phone, your data is gone, so you are the owner of the data. You will experience that over time. That makes for a space where you can be very, very vulnerable, where you can be really honest and speak your mind, because nobody's watching you, this is not going anywhere, nobody's going to exploit you, and you will be able to experience it firsthand. Because a lot of people talk about privacy, a lot of people talk about, oh, this and that, but then they don't follow through, and we want our members to experience that we're radically private and that we put them and their journey first.

15:28 - Jeffrey Besecker (Host)
I found it refreshing and supportive to see that you've built a team of renowned mental health practitioners into your board of advisors. Would you share more with us about your advisors and the role they play when considering the developmental direction of your platform?

15:43 - Karin Stephan (Guest)
Yes. So we're very, very proud to have Professor Dr Jasper Smith on our advisory board. He not only is a renowned professor at the University of Texas, he's also an author of many books and countless papers that specialize in mood disorders, anxiety, depression and what you can do about it, especially, for example, habit changes and exercise. He's actually really into that. On the side, he also is a practitioner, so he has a clinic where people come to, and he sees people every day. So he's not just, you know, in the academic world, he's actually putting his knowledge and his experience to work and he's helping a lot of people.

16:26
Then we also have Dr Erica Simon. She is specialized in burnout, she's specialized in mental health, mood disorders, and she's very hands-on as well. But she also has a way to conceptualize her knowledge. She knows how to convey things in a way that it doesn't sound complicated and how to best serve it. So she's been very involved in how the panda, the little mascot that appears in the app, communicates with you. She's the person behind the personalities, where she says, okay, some people like more of a coach-style response, somebody may prefer a chummy friend that chats a lot, and other people would rather be more on the spiritual side and have more of a sage-like or guru-like conversation with the AI. So there are other personalities behind that, and we have regular meetings. We're very proud of those two advisors. There are other advisors as well, but they're not in the mental health or medical realm.

17:37 - Jeffrey Besecker (Host)
It's easy to see in that regard how you've placed such attention, care and compassionate concern in developing that inclusive environment. Thank you for that. I want to acknowledge that before we move forward. You know that's really refreshing to see that kind of interaction and I feel that builds that bond of trust that helps to bridge that gap. 

17:55 - Karin Stephan (Guest)
You can only build trust and lose it once. You don't get a second chance. 

18:01 - Jeffrey Besecker (Host)
That's debatable, but we'll leave that neither here nor there today. So, Karin, could you describe how the EarKick app identifies and targets common mental health patterns and how it can provide feedback to help its users understand what causes those behaviors?

18:22 - Karin Stephan (Guest)
It's a very good question. 

18:24
So, first of all, we have several AI models, and one of them only listens to the tone of your voice, like little jitters and little things that happen in your voice that are very subtle and kind of get lost on the human ear if you don't pay a lot of attention and you don't know the person very well.

18:44
So that part of the AI detects early signs of mental health issues and will keep them posted and will follow them, and that's how the AI can tell you whether you were very anxious or where you stand, on a scale of zero to nine, for example. The same goes for depression. The same goes for other markers in your voice, in the way you speak, how you type, for example, and all that is assessed in the background. Now we don't tell you, oh, you have anxiety, this and this much, and oh, my God, you're so depressed. That's not going to help anyone. We have that knowledge, or the algorithm has that knowledge, and will then provide the input and the recommendations, the weekly reports that you get, based on that knowledge, so that you are incentivized and you are motivated to work on it rather than somebody telling you, oh my God, you're anxious.

19:47
So it's more how a therapist or a counselor or a coach would do it. You engage in a conversation and you serve up the advice or the suggestion in an actionable way and in a relatable way, and you make the person who's listening to it feel that you mean them, exactly them, and not somebody else.

20:12
It's not one-size-fits-all advice that you get. So that's the AI that measures kind of the mental health, and then it's correlated with the other stuff, you know, like with biomarkers, with what you said, the context of what happened last week and the week before, and that all helps to generate these interactions, as though you were talking to someone else who has an endless memory and is a hundred percent listening to you, attentive to you. And that sort of creates the environment where people are willing to accept maybe something that is not good, or a truth that they have been blind to, or something they need to change that they have been suppressing.

21:00 - Jeffrey Besecker (Host)
Often our subconscious and unconscious patterns exist beyond our conscious awareness, and the effectiveness of AI might be seen as limited by the level of context provided by its users and their input. In this regard, with AI being a generative tool, how much feedback is limited by the context of what the user inputs, and how might this further exacerbate their mental health challenges rather than facilitate their treatment?

21:28 - Karin Stephan (Guest)
Well, as you rightly pointed out, if a member doesn't disclose anything, doesn't talk about what's really on their mind, in their heart, it's difficult for anyone, let alone an AI, to really get in deeply and provide help beyond something more general. Let me give you an example. I use EarKick every day, and I was about to demo it to a crowd of people, and these people were all CEOs and very accomplished people, right? And so, while I was demoing it, I said something to the tone of, oh, today I'm excited, and I have all these, you know, these intelligent people here and I want to give them a good workshop, something like that, you know. And the AI picked up on exactly that, because it seems to be something that I have with me, that I put intelligence, or what I think of other people, very high, right. And I had to read it to the crowd. It said, it's okay to be nervous, but remember, intelligence is not the worth of a person. Do not think of the people around you, because you believe they're more intelligent than you, that they're any better, or that, you know, their value is higher than yours. It was picking up on everything that I was kind of conveying, and it seemed that I had a pattern about that, some sort of underlying complex or something, and the way it served it up, I was able to accept it. If somebody had told me that in person, I'm not sure I would have accepted it. So it's these small conversations where you get a chance to say, oh, this is a stupid machine, or where you go back, maybe there's some truth to this. And if it comes up every so often, at some point you go like, okay, maybe let me try differently. Or it just works within you, right? And because you know it doesn't judge you. An AI never judges you, because it can't, right? It's easier to accept that you may have a flaw or a bias, so you expose your own bias towards the AI quite soon.

23:51
Now you were talking about the bias that the AI itself may have, and to tackle that, you have to have very good data, very clean data and very broad data. When we built EarKick, we were very aware of that. Not only do we have a very diverse team, so we have people from India, from, you know, Ukraine, we have Latinos, we have Swiss. The youngest person at the time was 14 and the oldest, you know, in their 60s. So we have a really broad range in the team itself, and we made sure that the way it's trained and the way the data is collected and used reflects that variety and that broad range. And thanks to big data, if you train well and if you train it thoroughly, with the right people, you can get very, very good, unbiased algorithms.

24:52 - Jeffrey Besecker (Host)
When we consider a more deeply contextualized mental health issue, say, for instance, suicidal ideation, or a more deeply ingrained condition, for instance something like neurodivergence, what modes or models does EarKick use that might help detect these rather than further masking them?

25:11 - Karin Stephan (Guest)
Well, you know, what I know is that we have quite a followership of neurodivergent people, and one of the reasons is that EarKick is an AI and is infinitely patient and doesn't mind basically repeating things over and over again with the same enthusiasm every day, you know, or reminding you of mundane things that a human would grow tired of.

25:41
Right. It also has a way of communicating with these people that they like. I know from them that they feel like they can be themselves and that they're not in the way, but then also that the AI is not in their way, so they feel a lot of ownership and agency when it comes to that. The other side that you just mentioned, about suicidal ideation or someone really going towards a crisis, we trained EarKick in a way that it detects it very early, and there the proactively trained behavior is important, because you should address, or the algorithm needs to address, any kind of suicidal ideation or any worsening trend head-on, proactively. And the way we trained it is that it just goes ahead and calls it what it is, encourages you to see a real person, serves you helplines and tells you, hey, here you can call, or look up this link, get some help.

26:52
You're not alone, all these things, and it doesn't get tired of doing that. So if somebody is going down that slippery slope, EarKick is going to really home in on that. Hey, connect to someone. Hey, there's no shame in doing that. And, you know, remember your son or your brother that made you laugh last night. So it's connected with the real world constantly, while then also telling you, here's who you can call, here's where you can go. I think that's a very good way and a very good use of technology, because very often we don't get in front of these people's eyes; there is no space for us to reach them, right? And technology does have a way, because a lot of the people, at least that we know of, you know, use their phones a lot or are online a lot, and that's where you can meet them, nudge them out of that world, back into the real world and, hopefully, to open up to a real person.

27:53 - Jeffrey Besecker (Host)
As you mentioned earlier in our conversation, the app guides us somewhat, based on the user's personality and preferences, toward various models and responses, with many different modalities existing within the mental health care and personal development space. To me, this might present a potential for confusion as individuals navigate their health care based on some of their preconceived notions. Considering that certain models of traditional practice often contain or create inhibiting factors, how does the app help bridge the gap between these traditional psychological approaches and new emergent practices that might culturally evolve?

28:38 - Karin Stephan (Guest)
Well, you know, if you think of traditional therapy or practice, there are so many barriers to that, unfortunately, and, you know, barriers on the side of the person who needs help, barriers on the side of the practitioner. There are so many, many ways this can go wrong: not finding the right person, a matching problem. Or you come into an office and just the vibe isn't good, or you don't click with the people there, or you have all these ideas about what this is. So I would say a million touch points are at risk before you actually get the right help or the help that you deserve, and I'm not even talking about availability, affordability and all that, right?

29:24
You can almost think of it as something you play with, right? It's very easy to use, it doesn't take much of your time, you can be very spontaneous, you can yell at it if you want, you can be funny or you can be angry. And because it just listens and lets you do your thing, you may hopefully come to the conclusion that, hey, the solution is here within me, in reach. And as you get into reflection mode, because you're allowed to be who you are over a period of time, you start understanding how to express yourself. You get used to speaking about your feelings without having all these triggers around you. You get used to reflecting on what it is that you actually want and what it is that you don't want. You get used to reflecting: so what's the help I'm expecting, and what would that look like? 

30:27
Because you've been able to discuss it over a certain time, by the time you're ready to go speak to a real person, whether it's a coach or your mother or a doctor, you've had this role play, you've had this training, and you've felt okay doing it. So that's one way you can bridge that gap, and it can also do away with some of the preconceptions, maybe prejudices, that people have towards therapy. And if you experience it as a totally normal thing, something that I talk about when I get up, something that I talk about before I go to bed, okay, then we're halfway there. Because every problem starts very small, and we can talk about it while it's small instead of having to talk about it once it's big. I prefer talking about problems before they evolve into avalanches, and I think EarKick is a good way to do that. 

31:28 - Jeffrey Besecker (Host)
Additionally, how might the platform safeguard against untested data, which tends to seep into the social aspect of collective discussion regarding mental health? For instance, there are many common misconceptions and fallacies about narcissism and how it surfaces in our daily life. What role is AI playing in diminishing the impact of false information? 

31:50 - Karin Stephan (Guest)
You mean false information within the app or false information outside. 

31:56 - Jeffrey Besecker (Host)
Within the app, specifically. You know, I know from our own interactions with AI in our practice that certain prompts will bring up a conditioned belief or conditioned response about narcissism that doesn't necessarily match the data-driven, scientifically substantiated information. There's a very common mislabeling of narcissism based purely on people's interactions, where we make a lot of core assumptions: he's being narcissistic. I've seen where our own use of other AI platforms has resurfaced some of those beliefs without the proper evidence backing them. 

32:38 - Karin Stephan (Guest)
Well, let me tell you that EarKick is trained on a very solid scientific basis, so it has the underlying knowledge of someone who's read all these books and been trained on many, many cases. Will it tell you that you're a narcissist? No, it won't, of course not; that's not going to help you at that point. It's going to point out, in a lovable way, what patterns you have. It's going to nudge you towards some of the maybe distorted thinking that you have, maybe some of the discrepancies between what you say and what you do. Maybe it's going to nudge you towards reframing things, right? It's not going to change you. No AI on earth can change you; you're the only one who can change yourself. But it doesn't get angry and it doesn't get insulted, the way a lot of people do when they're dealing with a narcissistic person. 

33:40
That's why, you know, a person who misbehaves can do that, listen back to it, and hopefully learn something from it, without carrying this label: oh, I'm this, or oh, I'm not this, right? So again, EarKick is about ownership, about agency, and about encouragement to do something to reach your goals. And if somebody uses the app only to vent into it and leave their frustration in there and doesn't care about anything else, well, at least it's a good way to release intense emotions. 

34:18 - Jeffrey Besecker (Host)
Finally, I want to look at that aspect of cultural availability. In studying the cultural impact of mental health care, we can often observe disparities in treatment availability based on intersectionality or cultural conditions: impoverished people being a prime example, or people in secluded rural areas without broad-based access to mental health care, or living conditions that might inhibit it. So let me frame it this way: being in part a paid premium platform of service, how is your organization addressing these cultural disparities, bridging this gap and increasing access to effective mental health care? 

35:04 - Karin Stephan (Guest)
It's a very good question. We thought a lot about this. So the app is free in the functionalities that are necessary for you to track your mental health and know your patterns and progress, right? The premium version has some extras that are not essential for you to get better. That's the first thing I have to say. Then we are working very, very hard to get someone other than the end user to pay. 

35:36
So this is on many levels, because we believe that's the best use of it: that an employer, or your school, your university, your YMCA, whatever, some third party, would pay for it. But the way it's structured, there's a lot of word of mouth. People hear of EarKick, and hopefully through this podcast more people will hear about it, and they can just download it with no strings attached. And when people are convinced that it's good for them, they'll pass it on to their friends. So we have lots of people from marginalized communities. We have lots of young people who we know don't have a credit card or the money to pay; they're using the free version, and it's absolutely fine for them. And again, we will leave no stone unturned until we find ways to make it available, even the paid version, to everyone, all two billion youth, whoever needs it. We want to get it into their hands. 

36:41 - Jeffrey Besecker (Host)
Here's an interesting thought that popped to mind for me as I'm sitting here: perhaps down the road we could transform that into buying someone a virtual mental-health coffee, paying it forward by sharing that love. 

36:59 - Karin Stephan (Guest)
Look, that's such a great idea. People pay for all kinds of things every day, and if they could pay a very small amount to have someone's mental health helped and supported, I mean, if that doesn't make your day, what does, right? 

37:18 - Jeffrey Besecker (Host)
Caring is sharing. That's an idea I could get behind. So the last question I'd like to ask today, Karin, is: how do you see your platform's growth arc extending in new and exciting ways that will collectively reshape our mental health care paradigm as we move into the future? 

37:39 - Karin Stephan (Guest)
The way we see the platform is that it's enabling, empowering. It's not designed to take people's jobs away. It's not designed to discriminate. It's actually quite the opposite. 

37:52
We are completely convinced that everyone not only needs a sidekick in their lives, but deserves one too. Right now we're dealing with the most vulnerable part of life, which is mental health. But we know that the platform can also include other things where we may need help, whether it's broader coaching or guidance in certain phases of life, or specialized niches. All these things can then be part of the platform once we solve mental health and make people understand that their mental health comes first, and that every problem they encounter in life can be tackled with good mental readiness, as we call it. So we want to get everyone on that platform, really everyone, and the sky is the limit. 

38:51
Not only can we learn more about good approaches by having millions of people use the platform; we can also imagine taking approaches that exist and work for a certain group of people and augmenting them. What we foresee for this platform is to empower, enable, and actually augment every human coach, every human therapist, every human counselor, mentor, you name it, to be present in their mentees' and patients' daily lives without having to be there in person. That is where we see this platform going, because we have so many amazing people doing amazing work who only have 24 hours in a day and cannot serve all the people who would benefit from their talent, their knowledge, their experience. So if we could take each of those heroes, each of those hero coaches and mentors, augment them, and make them available to everyone around the world, I think we would have a tremendous impact globally. 

40:04 - Jeffrey Besecker (Host)
I want to thank you from my heart for sharing your passion for mental health with our listeners and the world. Finally, Karin, where can our listeners go to reach out, make that initial contact and utilize your very valuable resource? 

40:21 - Karin Stephan (Guest)
Well, you can go to the website, it's earkick.com. You can also find Earkick on LinkedIn and on socials, but you can also write to me personally. I'd love to hear from your audience. My email is karin, K-A-R-I-N, at earkick, E-A-R-K-I-C-K, dot com. Let me know what you think of this and how I can answer your questions. 

40:50
And I want to say one more thing. It's very important to me that I'm not here to glorify AI. I am not here to make the case for anything. If people are cautious about AI, or if they have, you know, some problems with it, I think the only thing that shows is that they care. And to those people I want to say: it's good that you're cautious, it's good that you think deeper, but do get into the topic, do get yourself acquainted. 

41:23
What's out there? How can you use it? How is it good, how is it not? What are the challenges, the limitations, the benefits? That's the only way we can make good use of AI, and that's the only way EarKick is going to be of any help: if we really dig deep, get going, and use it in daily life. So, not glorifying it, not vilifying it, but getting our hands dirty with it. And yeah, reach out to me. I'd be more than happy to connect over LinkedIn, over email, wherever. I'm happy to give you a thumbs up and grow our circle of people who are willing to help others. 

42:02 - Jeffrey Besecker (Host)
I truly am appreciative and grateful that you are sharing that light with the world, Karin, and I am extremely excited that we're able to share this with our listeners. Namaste: the light in me acknowledges the light in you. Thank you. Bye-bye. 

42:24
We're so thankful you joined us. Today we discovered how EarKick is disrupting the mental health space with its innovative AI-driven platform. We learned how accessible health care is being made available to a broader spectrum of the public. Karin Stephan spoke with us about the power of personalized mental health guidance and the role EarKick is playing, as an AI chatbot, in reshaping the future by empowering individuals to take a more active role in their mental well-being. Try EarKick for free by visiting our sponsors tab at www.thelightinside.site/sponsors; just click on the link and take it for a free test drive, or access the app by visiting www.earkick.com. You can also find the app in your favorite operating system's app store. Please share this episode with a friend or loved one who might also benefit from EarKick. As always, we're grateful for you, our valued listening community. This has been The Light Inside. I'm Jeffrey Biesecker. Thank you. 

 

 


Karin Stephan

COO / Co-founder

Earkick was founded in 2021 by serial entrepreneurs Karin Andrea Stephan and Dr. Herbert Bay.

We aim to revolutionize mental health support, leveraging our passion for tackling global challenges and our backgrounds as serial entrepreneurs.

Herbert is a computer vision expert who has successfully built and sold computer vision/AI startups, and Karin’s thesis on the intersection of mental health and tech laid the foundation for Earkick’s mission.

Together with an exceptional team we are building an AI-powered mental health platform with a multi-modal LLM companion that measures, tracks, communicates and helps improve mental health in real time.