
For a Life-Saving Nonprofit, AI is a Game-Changer | SXSW 2021

For many organizations, the use of AI automates back-office tasks or scales a hiring process. For nonprofits like The Trevor Project, implementing AI means greater impact in saving young LGBTQ lives. Trevor seeks to optimize its services by using AI as an enhancement to the human touch. In partnership with Google.org, The Trevor Project explores how new AI models can support its mission by pairing highly trained humans who care with world-class technology. This session will address innovative and strategic use-cases for technology that expand the reach of Trevor's crisis services – in particular, new people-centric AI systems to assess risk and to scale crisis counselor training to save more lives.

About SXSW: SXSW dedicates itself to helping creative people achieve their goals. Founded in 1987 in Austin, Texas, SXSW is best known for its conference and festivals that celebrate the convergence of the interactive, film, and music industries. An essential destination for global professionals, this year's online event features sessions, showcases, screenings, exhibitions, professional development and a variety of networking opportunities. For more information, please visit sxsw.com.


Hi, South By. Thanks so much for joining us virtually today. I'm really excited to be here with you all. For a quick introduction, my name is Jen Carter. I use she/her pronouns, and I am the Global Head of Technology and Volunteering at Google.org. I'll share more about what that means in a minute. But first I am thrilled to be here today with Lena Ballantine, the incredible COO of The Trevor Project, an org that is near and dear to my heart, and that I've had the pleasure of working closely with
over the past couple of years. So I'll hand it over to Lena to introduce herself, and share more about Trevor's lifesaving work. Awesome. Thank you so much, Jen, I really appreciate it. And hello everybody from Austin, Texas. I actually currently reside here, and I'm super excited, and while we wish we could all be together in person, we'll make the best out of the 30-40 minutes we have with you, sharing more about the amazing work we did, Trevor and the Google team, over the last year, year and a
half. But before I dive into and share about the amazing outcomes, I would love to share a little bit more about me and what got me here today. I am Lena Ballantine, use she/her pronouns. I'm the Chief Operating Officer at The Trevor Project. In my role, I oversee finance, legal operations and technology. And in my technology function, I had the pleasure of working with Jen and the Google team over the course of the last 18 months. And what got me to Trevor? I was really excited. For the majorit
y of my career, I actually worked in corporate functions and spent most of the time in the closet and really was looking for an opportunity to share the experience that I had in a meaningful, impactful way. And I also grew up in the countryside of Northern Germany. So I knew really firsthand what it means to grow up disconnected from community, not really feeling comfortable coming out to your community, because there's not anybody who feels like you, or looks like you, or feels like a trustworthy per
son to share with. And so when I was looking for a new opportunity to bring all my talents and passions to the table, I came across The Trevor Project initially wanting to volunteer. And at the same time, The Trevor Project actually posted a role for the Chief Operating Officer function. And at that point I told my wife that I've found my new future job. And she said, "Do you actually know anybody "at The Trevor Project yet?" And I said, "No, not really, "but I'm determined to make it happen." A
nd to also share a little bit more with you all why I was so passionate about the role at The Trevor Project and in general about The Trevor Project, I'm gonna share more about The Trevor Project with you. And then I'll hand it over again to Jen to share more about our beautiful collaboration between Trevor and Google. So The Trevor Project is actually the world's largest suicide prevention and crisis intervention organization for LGBTQ young people. Our mission is to end suicide among gay, lesb
ian, bisexual, transgender, queer, and questioning young people. And how do we do that? We have five major programmatic areas. The one we lead with and engage with youth in crisis is Crisis Services through phone, text, and chat. We're there for you, it's 24/7 if you need us. We have an amazing peer support platform called TrevorSpace. You can imagine like a Facebook, safe Facebook environment for LGBTQ young people between 13 and 24. We have an amazing research department which helps us to real
ly make sure that we have data to prove that the work we're doing is actually meaningful and successful, and also dive into more community questions that we otherwise wouldn't have data to answer the question that we asked for. We have an education and public awareness area that helps us to go to corporations, schools, to do trainings and in general, raise awareness for issues of LGBTQ young people. And last but not least our advocacy department, that really is there to help work with policies a
nd regulations to prevent any kind of discrimination and protect LGBTQ youth. So a whole lot of amazing work we do. And the reason why we exist? I let the facts on this slide speak mostly for themselves, but we have a lot of evidence and we've done a lot of research that there are a lot of extra challenges for mental health of LGBTQ young people. And The Trevor Project is really there to support young people in crisis and to provide a safe environment where they can feel seen, supported, and kno
w that they have the same right to be loved as anybody else. And why is that important? Because one, just one, accepting adult for an LGBTQ young person can really make all the difference, and we work to be there, and be that person for LGBTQ young people. So all that being said, really excited to be here today. And I'm gonna hand it back over to Jen to share more about Google.org and the Google AI Impact Challenge. Thanks, Lena. Some really sobering stats, but I think also just a great remind
er of the extraordinary impact of Trevor's work. So for some quick background on Google.org, and actually how we got connected to Trevor. So Google.org is Google's philanthropy. We're giving the best of Google to innovative nonprofits and civic entities who are tackling the world's toughest challenges. And we do that in a few ways. First by providing our grant dollars. So 1% of Google's profits go to support nonprofits around the world. We also do that by providing our products and technology. S
o for example, installing wifi immediately, during, or following a crisis. And then finally we do that by providing what we think is our most valuable resource, which is our people, who volunteer their time and expertise with incredible organizations on the ground like The Trevor Project. But you know, just as an example of what that package of funding, and technology, and people actually looks like in practice and really how we first started working together with Trevor, we were hearing from a
lot of nonprofits that they wanted to use AI in their work, but they didn't necessarily know how to go about it. And so in late 2018, we launched the Google AI Impact Challenge, which was an open call to organizations around the world to submit their ideas for how they could use AI to help address societal challenges. Incredibly we received over 2,600 applications, 2,602 to be exact, from 119 countries across six continents. So it really just confirmed for us that the interest and the need was t
here. But then we faced the daunting task of narrowing those down to just 20 winners or less than 1% of total ideas, which we did with the help of countless experts both inside and outside the company. So at a high level, we wanted to empower nonprofits to apply AI in their work. So the winners received not only the funding, but also the chance to take part in an accelerator program. They received Cloud credits, coaching from Google's AI experts, and finally the opportunity to participate in a G
oogle.org Fellowship, which enables teams of Googlers, typically software engineers, product managers, UX researchers, designers, and more to work full-time with them to help accelerate their impact as they build out a new product or technology. And so The Trevor Project was one of those nonprofits that was interested in using technology and AI to help them achieve their mission. So of course they applied, and ultimately were selected as one of the 20 winners. So that's sort of just the brief hi
story from Google.org side. But Lena, I'm curious what that process looked like for you all in terms of Trevor's evolution with technology and what led you to apply. Yeah, absolutely. And thank you so much for sharing the history around how Google got to and creating and implementing something as meaningful as the Google AI Impact challenge, because it really set up and created a completely different trajectory for where we are as Trevor now. And I'm happy to do a bit of a time warp and go back
to where we started. So four years ago, when The Trevor Project hired Amit Paley as CEO, he knew that the way that we were operating as an organization was just not able to serve all the young people who need us. I mentioned we have our own research department, we did a research study not too long ago, that really also helped us understand the magnitude of the challenge that, you know, we were trying to solve for. And so right now there are 1.8 million LGBTQ young people in the US alone that ser
iously consider suicide each year. And right now, I can tell you that with all the great work that we're already doing, we're reaching about 200,000 of them. So to think about how we can grow and scale, we knew that we would need technology enablement and technology support, not to substitute for any of the really face-to-face work that our counselors are doing, but really to create a platform that gives them the space and the tools that they need to do that work thoughtfully, and with the
right amount of time spent. And so when John Callery joined as our VP of Technology, he always tells a story that his first laptop still had tape on it, and it was kind of taped together, because as a nonprofit, we would put all our dollars towards the programming side, which obviously makes a lot of sense. But the part that really helps you to grow these programs is technology, so we had to make a switch. And so thinking about the services, and text and chat in particular, we knew that those are the channels young people wanna use to reach out to us. And we wanted to make sure that we serve youth in the right order and make sure that we present services to the highest-risk youth first. And we knew we had a lot of data over the 20-plus years that we've been serving LGBTQ young people, but we didn't really know how to put that data in the right order and make sense of it. And that's how we got to reach out to you and really start our first endeavor together
to really understand what is the data that we have. We had a clinical understanding of it, but not necessarily an AI machine learning background. And so we brought a basket of data to you and started our journey. Yeah, definitely. A few of the things that you just mentioned. I just wanna highlight them, because they really resonated with us when we were looking at the project proposals. Again, if I take myself back to a couple of years ago now, but I think the first is impact. You know, there's
no question there, you shared the scope of the problem, but also the really ambitious goal that you all have to reach all of those youth by the end of 2023, which would have an incredible impact but also requires a significant increase in scale that really necessitates the use of technology. So that was definitely something that resonated with us. The second thing you mentioned was really the expertise. I mean, now I've had the chance and the privilege to work directly with you and the rest of t
he team at Trevor. And I could go on and on about how amazing everyone is. Please do (laughing) And I definitely will. But one of the things that really stood out, even just from reading your application before we met, was the multidisciplinary nature of the work and the team. So of course building out the tech side in more recent years, but really more importantly, having always used several fields of study in the approach to best serve youth from folks with PhDs in Clinical Psychology, to medi
cal doctors with expertise in youth psychiatry and really all of the other disciplines required to understand and maintain that laser focus on the needs of LGBTQ+ youth. So again, I definitely can say a ton about it, but just an incredible team at Trevor that I think was exceptionally well positioned to do this work. Yeah. And then maybe the final thing that I'll mention is feasibility. So certainly one thing we were looking at was is it even possible to use AI to solve this problem? And that's
certainly an important question, but maybe more interesting is the flip side of that question. Is it necessary to use AI? You know, AI is not always the right answer. Often some relatively straightforward heuristics can actually do the trick. And we saw a lot of applications for projects that probably could have benefited significantly from just basic data analysis and basic data analytics that would maybe get them 75% of the way there with a fraction of the effort. And so, frankly, those weren't necessarily a great fit for AI. But one of the things that really appealed to us about the project that you all were proposing is that you had actually tried a number of different approaches that were not successful, and thus determined that AI truly was necessary to help. Yeah. And we agreed. So, yeah, can you share maybe a little bit more about some of those other approaches? Yeah, I'll definitely share some of the things we tried, and then I think it would be really helpful to share a quick, you know, model version of what we actually came up with and created together. So I think you mentioned it a lot. You know, we obviously weren't the experts and didn't really know if AI machine learning was the answer, but we had a lot of information, in particular on the text and chat engagement that we have. When young people in crisis reach out to us, we have a very specific risk assessment model that helps us to categorize, depending on the information that the young people are sharing with us, whether they're at imminent risk or high risk, down to a low-risk case. And we've had that information over the years in an anonymized form. So we didn't have any personal data stored with that, but we didn't have enough information to say, if certain words are used, you know, what that correlates with from a risk perspective. And so again, we tried a more traditional path to really see if we can predict some of those risk scenarios based on the data that we had in the past.
And I think what was really amazing was to see that by even asking young people now, earlier in the process, even before they talk to a person, "What's going on?" And understanding over the course of the project, and also looking historically at all the data that we had, how answering that single question with the correlation to ultimately the risk assessment, helped us now to really make sure we serve the young people that are at highest risk first. And that was the beautiful collaboration bet
ween, you mentioned that, the clinical expertise and the data that we had, with the understanding that your team brought to the table from an AI and engineering perspective. And what that looks like? I'm happy to share. And then I would love, you know, while I pull that up, maybe you can share a little bit more about what that looked like on the Google team side to bring that to life. Yeah, definitely. I'm so glad you mentioned it, because Trevor just had this incredible trove of data about how
youth had responded to the prompt, "What's going on?" And you had not just their answer to that question, but also their suicide risk level, as determined by a crisis counselor who was using the clinical assessment model. And so not only did you have all of this data, but it was actually labeled, which was even better. It's fantastic for AI. Yeah. But so the main goal of the project was then to use natural language processing to determine suicide risk level, based solely on the youth's response to that question, "What's going on?" And so you can see some simple answers on the screen, and by doing that, we could ensure that youth at highest risk get connected to a counselor as quickly as possible. So yeah, I think the existence of that data was just really key and critical, and Trevor had been collecting it over the years, and it was so incredibly useful.
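(To make that concrete, here is a minimal sketch of the kind of intake triage being described: a text classifier trained on historical pairs of intake messages and counselor-assessed risk levels, used only to order the waiting queue. The example messages, label names, and scikit-learn model choice are illustrative assumptions, not details of Trevor's production system, which the talk does not go into.)

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical, de-identified training examples: the intake answer to
# "What's going on?" paired with the risk level a counselor later assigned.
train_texts = [
    "i just want to talk to someone about coming out at school",
    "my parents found out and i have nowhere to stay tonight",
    "i have a plan for tonight and i don't see another way out",
    "feeling kind of down lately and could use some support",
]
train_labels = ["low", "high", "imminent", "low"]

# Simple baseline: TF-IDF features plus logistic regression. A production
# system would likely use a stronger model and clinical validation.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(train_texts, train_labels)

# The prediction is used only to ORDER the waiting queue; every young person
# still reaches a human counselor.
PRIORITY = {"imminent": 0, "high": 1, "low": 2}

def triage(waiting_messages):
    """Sort waiting intake messages so the highest predicted risk is served first."""
    predicted = model.predict(waiting_messages)
    return sorted(zip(waiting_messages, predicted), key=lambda pair: PRIORITY[pair[1]])

for message, risk in triage(["i keep thinking about ending it", "how do i tell my friends i'm bi?"]):
    print(risk, "->", message)

Any model like this would of course need clinical validation, plus the subgroup fairness checks discussed next.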
Yeah, yeah. And then the other thing that I wanted to highlight was that you all had mentioned to us that you saw different communities had different identities and different ways of using language. And there's no shortage of documentation that shows how existing biases can creep into AI. And so another really important aspect of this project was that we worked really closely with the Trevor team to learn from past mistakes, and ensure that we were not perpetuating those biases as a result of any potential differences in language and vernacular when we implemented the AI model. And so together, we identified a number of different groups across race and ethnicity, and gender identity, and sexual orientation, and the intersections of those, and then measured the performance of our model to ensure that it was treating people fairly across groups. And I know that Trevor is continuing to track that, and has mechanisms in place to ensure that no anomalies arise there. So we were just very grateful to have Trevor as a thought partner on that.
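(A rough sketch of that kind of disaggregated evaluation is below. The demographic field, example rows, choice of recall as the metric, and the 0.1 gap threshold are all assumptions for illustration; the talk does not specify which metrics or thresholds were actually used.)

import pandas as pd
from sklearn.metrics import recall_score

# Hypothetical held-out evaluation rows: counselor-assigned label, model
# prediction, and a self-reported demographic field used for the breakdown.
results = pd.DataFrame({
    "y_true": ["high", "low", "high", "high", "low", "high"],
    "y_pred": ["high", "low", "low", "high", "low", "high"],
    "gender_identity": ["trans", "cis", "trans", "cis", "nonbinary", "nonbinary"],
})

def recall_by_group(frame, group_col, positive="high"):
    """Recall on the high-risk class, computed separately for each subgroup."""
    return frame.groupby(group_col)[["y_true", "y_pred"]].apply(
        lambda g: recall_score(g["y_true"], g["y_pred"], pos_label=positive)
    )

per_group = recall_by_group(results, "gender_identity")
print(per_group)

# Flag any subgroup whose recall falls noticeably below the best-performing
# group (the 0.1 gap is an arbitrary illustration, not a clinical standard).
gap = per_group.max() - per_group
print("groups needing review:", list(gap[gap > 0.1].index))

Intersections can be checked the same way by grouping on a combined column, for example gender identity crossed with race and ethnicity.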
I really, really appreciate that. I think especially the thoughtfulness around the community and serving the community equally, regardless of even, like, the frequency that they reach out to us, because obviously there's a lot of learning that comes with a bigger dataset. Making sure that we work with the data sets that we have to create the best outcomes, so that any young person in crisis reaching out to us is treated according to their needs and their current state. Not at all based on, like, how many young people
with similar demographics have reached out in the past, and how well do we think that model is actually working. And something that we've been particularly emphasizing since we finished this project together, was making sure that we maintain that insight and that knowledge. And that's something that I think for us as an organization, having come into AI machine learning a little bit more on the tail end, was actually really helpful because we already had so much information that we could help a
nd have those really critical conversations together to make sure that we're doing the best in our ability to serve the young people as well as a person would have done when it comes to prioritizing the stack. 'Cause at the end of the day, there is always going to be a person there for the young person that is in crisis and reaching out to Trevor. What we're just trying to do is really use AI and machine learning to get the order right, to make sure we really serve those most in need first. So beautiful, beautiful experience. And I have to say, when I think back even when I started a year and a half ago, the technology team was half the size of what it is right now. You really brought also a lot of train-the-trainer thoughtfulness into the work that we did together. We now have hired our first UX person to really help with the user experience. We expanded and built out our own internal machine learning team to really also maintain all the great work that we've done together, bu
t also build upon and elaborate on all the great insights that we're able to drive home. Yeah, definitely. I think we've talked a lot about I think that teamwork, that was really an incredible part of this collaboration. Both orgs brought extensive expertise to the table, but in terms of team effectiveness, we've found that what really matters is less about who's on the team, and more about sort of how the team works together. Or in other words, we say, you know, it's not just about hiring the b
est people, it's about creating the strongest team. But I know it can be challenging. You know, we had a pretty large cohort of Fellows coming in to join your org. And as you mentioned, your team was much smaller at that time. So what did that look like for you as these two teams were coming together? Yeah, I feel slightly nostalgic because it was back in 2019, when actually bringing 12 people into the Trevor physical office space that we still had back in pre-COVID times was quite a challenge.
And in my role, as the COO, I said in the beginning, I oversee operations as well, which includes our office spaces. So it was actually a challenge to think about where are we gonna put 12 amazing Google Fellows in a way that allows for collaboration, communication, and have the creative juices flowing. So we swapped around some meeting rooms, and made sure we had like an open co-working space, and allow for that. I miss those days. (laughs) Yeah, me too. I mean, I was a huge extrovert. So for m
e that that still hits a sore spot, but it was really, really amazing. And I have to say back in the days, I was still slightly concerned, I came from a more, you know I said that earlier, traditional background where, some of the consulting engagements or collaborations that I've worked and been part of, weren't always as harmonious, and as I think fruitful, and open, and in conversation and collaboration, as the work was that we were doing together. So bringing in 12 people from an outside par
tner to almost double the size of the team that we had, which comes with the cultural implications, the different leadership styles, communication styles, definitely was one of the first challenges that John and I, co-leading the tech team were facing, and we're trying to check in on. But it was really, really something that I think energized the team on both sides. When I say that, I mean that in all the best ways, because the Google team had such deep appreciation for the mission and the work
that we do at Trevor. Also for the clinical experience and expertise that we had, and also the tech team and the work that they've done today to really get us to the point where we knew we wanted this collaboration and working with AI, because we knew we would be able to drive really meaningful outcomes. You know, on our end really, being so excited about all the experience and expertise that your teams were bringing to the table, having done that work already, and helping us really see behind w
hat we thought was possible. And bringing on an entire organization that really historically wasn't necessarily so much tech forward to really see the possibilities, and really be creative in what that can look like, and how we can reach those 1.8 million young people that need us in a way that really enabled us to do the high level of quality and care that we provide right now, in a way that just feels less manual and feels less hands-on at times, to also really free up that mind and capacity f
or people on our teams. It was really, really fun, and I really miss those days. Yeah, definitely miss those as well. And I will say, since you mentioned sort of the excitement on both sides, I think we had more applications for The Trevor Project Fellowships than really any other project. (laughs) We snuck a couple in, they loved working for us though. Exactly, Google really is. It's been such an incredible collaboration. I know how much I, and really the entire team, has enjoyed working with
you all. Yeah. But the other thing I wanted to come back to that you mentioned earlier was that you all have now hired a UX-er and some other new resources. And I'm really glad you mentioned that because that's sort of a critical piece of the program for us. You know, a lot of nonprofits have had experiences with technologists coming in and developing maybe even a good solution to a problem, but then sort of leaving without a plan in place for who's gonna maintain that work going forward. And in that all too common scenario, I think off-boarding is sort of considered as an afterthought, just as the pro-bono engagement is maybe nearing its end. But we've really found that for a project to be successful, the plan for off-boarding should be in place before a project has even been selected. And then it should be kept in mind at every stage of the process after that. And so we bake it into our selection criteria that we have to establish in advance a clear path to sustainability. And so o
ften that means nonprofits are using our grant dollars to fund new roles. So for example how- We did too. (laughs) Exactly, so hiring your first machine learning engineer or something like that. Yeah. But sometimes it just means training folks. So maybe you don't have a UX resource, although again, I am thrilled that Trevor now does. But we can still share best practices to help the product owner or even the engineers ramp up in that area, to ensure that they're thinking about user centered desi
gn, even if there isn't a full-time UX-er on staff. And so that's really one goal of the program is certainly to help get new projects off the ground, but we know we're eventually gonna roll off the project. And so another focus is really on that capacity building piece. We often say we build with not for organizations, as it truly is a collaboration from day one. Speaking of rolling off, though, we didn't really let you go that easy. (laughs) That is true. And I think part of that was unfortuna
tely related to just the world changing and COVID really bringing different kinds of challenges, not only to individuals leaving office spaces behind and feeling a little upset about that, but also understanding, and at that point I believe we were also so far into the first Google Fellowship, about to wrap it up, and you and the team really were deeply invested in the work that we were doing. And also really understanding what the access to resources and not having access to resources for our y
oung people can mean. So somebody who in the past went to a community center, or had a group, or a theater group at school that was really helpful for them to feel seen and supported. Not having access to that, we had an inkling and we've seen it over time that that really increased the need for our services. We had times that the call volumes were double the size of pre-COVID. So something that in the course of, you know, slightly moving office space remote, thinking about the impact on what th
at has for our LGBTQ young people, was something that we started having conversations about. At the same time, you already had future thoughts on what other things Google and Trevor can do together. And I'm curious to hear more about what that thought process was like on your end. And then we can segue a little bit more into part two and what the next project was that we were working on together. Yeah, definitely. I remember it well. Our first cohort of Trevor Fellows was wrapping up and th
is was in March of 2020. And I think the Trevor team was in a really great position to carry the work forward. But then COVID happened. And soon after Trevor published a white paper, that I think did a really great job at capturing some of the reasons that LGBTQ+ youth might be disproportionately impacted. But one of them that really stuck out to me, essentially what you just shared was that, research suggests that among queer youth, one third experience parental acceptance, which is great, but
an additional one third experience parental rejection, while the final one-third don't disclose their identity until they're adults. And so the majority of youth may be spending their days confined to spaces that are unsupportive of their sexual orientation or gender identity. And so we felt the moment really called for us to double down on our commitment, as you were seeing two X the volumes, and help accelerate that next phase of Trevor's plan to support more youth in crisis. And so we re-inve
sted with another grant, and a second cohort of Fellows. And as you mentioned, we were sort of already having conversations about what was gonna come next. The initial risk assessment that we talked about was really critical in that, if you're able to serve a fixed number of youth, you want those at highest risk to get connected to a counselor first. But the ultimate goal as Lena mentioned, is to serve all 1.8 million youth in crisis. And so the real question was, how do we scale even further? A
nd after a lot of conversations, it turns out that one of the big bottlenecks was training new crisis counselors. So it was incredibly time intensive for Trevor's staff, because there were a lot of group training and virtual training aspects, but there was also a component of it where, I think, they were spending something like 14 hours one-on-one with each new individual volunteer, in order to facilitate role-plays, which were a critical part of the training process, and still
are, but essentially where the Trevor staff member would play the role of a youth in crisis, so that volunteers could then practice the skills they had learned during the rest of the training. And so the idea that we discussed with you all was to instead use natural language processing to simulate the youth side of that interaction. And that was really the focus of the second fellowship. I don't know if you can pull up the simulator demo. But yeah, I think the really nice thing is the volunteers
can still get direct feedback from Trevor staff about how they did, but now they have the opportunity to go through this simulated experience really as often as they want, which gives them more confidence when they're ready to interact with youth. And it can also provide them with a wider range of experiences, as the simulator can emulate youth language to make that feel more genuine, and really just to prepare them for the emotional experience of helping a young person in crisis. And so I thin
k it's just a really great example of tech not just maintaining, but potentially even improving the quality of training in a couple of different ways. But I'm obviously biased, so, you know- (laughs) We don't want bias here, we had that conversation. (laughs)
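(For a feel of the overall shape of such a simulator, here is a very rough sketch: a persona prompt plus a generative language model producing the youth's side of the role-play. The persona text, the off-the-shelf GPT-2 stand-in, and the decoding settings are assumptions for illustration only; a real training simulator would be fine-tuned on reviewed role-play material and wrapped in human oversight and safety filtering that a sketch like this leaves out.)

from transformers import pipeline, set_seed

set_seed(0)
# Plain GPT-2 as a stand-in; a real system would use a model fine-tuned on
# reviewed role-play transcripts, with human oversight and safety filtering.
generator = pipeline("text-generation", model="gpt2")

PERSONA = (
    "The following is a counselor training role-play. Riley is a 16-year-old "
    "reaching out to a crisis line because of trouble at school and at home.\n"
)

def youth_reply(history):
    """Generate Riley's next message given the conversation so far."""
    prompt = PERSONA + "\n".join(history) + "\nRiley:"
    full_text = generator(
        prompt,
        max_new_tokens=40,
        do_sample=True,
        top_p=0.9,
        pad_token_id=generator.tokenizer.eos_token_id,
    )[0]["generated_text"]
    # Keep only the newly generated portion, cut at the first line break.
    return full_text[len(prompt):].strip().split("\n")[0]

history = ["Counselor: Hi, this is Trevor. Thanks for reaching out. What's going on?"]
history.append("Riley: " + youth_reply(history))
print(history[-1])

The volunteer supplies the counselor turns, the model supplies Riley's turns, and Trevor staff can still review the transcript and give feedback afterward, which matches the feedback loop described above.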
But I can actually also share firsthand, and thank you so much for really sharing that journey and your experience of how we got to extending our partnership, and for being impacted by the white paper that we wrote, and really share also how much of a challenge it has been over the course of 2020 and beyond to be an LGBTQ young person right now, potentially stuck at home in a space that doesn't feel as safe. And I actually just went through counselor training. So this screen that you see here is something that I feel very familiar with. And I actually spoke to Riley, which is the first role-play that we converted into a Crisis Contact Simulator type of format. So I would be in this case the counselor,
and the youth responding would be Riley, in the form of a bot, so to speak. And I have to say, after having done this initial role-play, and then a few others following up on that that were engaging with a training coordinator and a training counselor, it really felt so organic and so natural, but I also know how much work it has been for the team to put that together. And I really remember also how much more at ease I was in the beginning, that my first role-play was actually in this safe space, so to speak. Because we've had so much training to even get to the point of having this first role-play. And it felt really, really amazing to have this in a space where I thought I could try. And I was really engaging with Riley in a very natural kind of form. But knowing that I could learn from this experience without being exposed to a person, even if it's a Trevor staff member, I still felt a little shy about it. So it was really, really amazing to have that experience and have our work, and even our training flows, be supported by that. You said that over the course of the training, there are 40 hours and more that we take for training, and in the past, I think we were able to graduate about 30 counselors per quarter. Right now we're actually able to graduate about 200 per month. That's incredible. And that's not just from that one initiative; we've also done work on asynchronous training and reworked a couple of other pieces. But when you look at the entire chain, to really serve 1.8 million young people we really are dependent on the volunteers, and on how thoughtfully, and at the same time how efficiently, we can get them through training so that they feel seen and supported over the course of that journey. Being able to replace some of the training modules with artificial intelligence, and having it feel natural, has actually been freeing up a lot of time that we can now spend working on other areas that still need as much love and support as Riley and the Crisis Contact Simulator did. So thank you so much for really all the work that you did on that, because that's been huge. And as I said, having just gone through training, I know firsthand how much of a joy it was to really go through that and see it in action. Yeah, definitely. I think maybe you touched on it a little bit, but what's next for Trevor? Oh man, I think there's so much. I just really finished a day of meetings with beautiful visions on what we are d
oing in 2021 and beyond. I think a lot of the work that we do, especially on the technology side, will continue to look into areas among the training journey, how can we further enhance the experience and continue to replace additional role-plays, look into other areas within the training journey, onboarding volunteer journey, and ongoing support and care, that we can utilize some of the knowledge that you brought on and, you know, brought and shared with Trevor. And then in addition, we're look
ing beyond the United States, the 1.8 million youth figure is just the US alone. We're trying to see what ways are appropriate for us to consider expanding, either by language or internationally. So that's a big undertaking that we're super excited about. And then continuing to hire a couple of Google Fellows. (laughing) But definitely continuing to build out and maintain the expertise that you've brought to the table. So the tech team is growing, the training team is growing, the volunteer team
is growing. So we're really ambitious and we'll maintain that appetite, and definitely are excited to see what this year is gonna bring. What's next for Google.org? Yeah, it's incredible. Thanks for sharing I think so much important work to scale Trevor's impact even further. And I'm definitely sure we'll continue to work closely together. Yeah, I'll keep calling. (laughing) Perfect. But yeah, in terms of what's next for us, I think one of our initial goals with the AI Impact Challenge was to h
elp not just individual organizations like Trevor to scale their impact, but really more broadly to identify these proof points within a given sector that can then be used to help similar organizations. So the technology that goes into the conversation simulator is pretty advanced, and definitely it could be of use to other hotlines that are maybe operating in different countries or serving different populations. So we're definitely thinking about how we can work together to share some of the
tech learnings from this engagement with other nonprofits in the space. But then relatedly, we were also thinking about how can we share some of the more operational learnings from this and really all of our previous fellowships in terms of how technologists and nonprofits can partner to deliver this lasting impact. You know, we wanna encourage other companies to create formal opportunities, like the Google Fellowship that enable their employees to volunteer their tech skills with nonprofits and
civic entities. So I think, and you know, I'm sure you'd agree, but there's a ton of potential in these partnerships between nonprofits like Trevor that have this incredibly deep sector expertise, and companies like Google who have the ability to help shape maybe the technical direction of a project. So yeah, in terms of what's next, we really hope to share best practices that we've learned over the years, both on the tech and operational side. Yeah, I'll say it's really been an amazing partner
ship and an absolute privilege personally, to work with you and the rest of the team on this truly lifesaving work. So we're incredibly grateful to The Trevor Project for this collaboration. Thank you, thank you, Jen. I mean, I can really only echo all that excitement and a mutual appreciation. I just also really wanna share with the audience that, that has been such a transformative partnership for us as Trevor. And over the course of the journey we've really felt as equal partners in this enga
gement. And you've really genuinely shared and very openly shared resources, knowledge, insights, that have completely transformed how we work and operate. And have enabled us to really be bold, and be ambitious, and think outside the box, even more so than we've done prior. And I can only highly encourage people who are listening, who are thinking about partnerships and collaborations, reaching out, to really try to push that thought further, because it's been really something that we didn't kn
ow was possible, and we still benefit from it. I mean, we're having this conversation together. You know, we'll continue to collaborate, and all the potential that this outcome has for people in crisis, period, is something that I think you can't value enough. You can't put a price tag on that. And I think that's been just absolutely fabulous, to have had that partnership. And definitely looking forward to the third Google Fellowship. So now I just have to come up with another idea of what that
would be, but I've got plenty. But I really wanna share the gratitude that we have for the collaboration and all the great work that we've done together. Well, we feel absolutely the same. We have learned so much from Trevor throughout the course of this collaboration, and it's really been such a privilege. So thank you so much. Awesome, and also thank you everybody who was listening in today and sharing thoughts. And we're here, so definitely check us out, we continue to grow at Trevor, and we're always happy to
answer questions and share insights. Thanks everyone for joining us. Thank you, bye.
