Girl Decoded: Humanizing Technology Before It Dehumanizes Us
View it on the Alumni YouTube Channel
View it on Rev.com with the transcription playback

Betsy Ludwig: Welcome to our last MILEs Masterclass of the spring 2020 season. This is our grand finale and I'm Betsy Ludwig, executive director of Women's Entrepreneurship at Northeastern University. And before I introduce our very special guests, I'd like to take a moment to thank everyone who's made this MILEs season a success. For those who don't know, MILEs is an acronym for Masterclasses in Innovation, Leadership, and Entrepreneurship. And when we conceived this masterclass series at the end of March, the world was filled with unprecedented uncertainty, disease, economic meltdown, and no one knew exactly what to do or how to react.

Betsy Ludwig: But in the chaos, our team saw some certainty and we saw the certainty of our community, and we immediately understood that we needed to support female founders now more than ever. We saw grit, we saw resilience and a new term I just learned, which is antifragility, the ability to come out of a crisis stronger than you were before it. We knew we had an amazing group of women doing fabulous things in their fields in and around Northeastern, and quickly realized we could leverage the huge Northeastern platform to give these women a voice to inspire the community and further their business goals.

Betsy Ludwig: Whatever the scenario, we know that no one's life was the same as it had been beforehand, and that the commonality of that experience has bonded us together. Over the last two months, we posted 10 masterclasses featuring over 20 amazing women in 2020. We covered topics ranging from global FinTech to opportunities in healthcare, to self compassion, to design thinking, how to build a side hustle, how to demystify financing your startup, the importance of building a community as a cornerstone of entrepreneurial resilience, and now today, humanizing technology.

Betsy Ludwig: I just want to thank everyone for your participation, either as a member of the audience, a speaker, a coordinator, or just a cheerleader. It takes a village and we really have one here. As always, this format is very informal. We encourage active participation from the audience with the chat. And please remember to mute your microphones. As an added bonus today, Rana is giving away some copies of her book to active chat participants. We really encourage you to ask your questions.

Betsy Ludwig: Without further ado, I'd like to introduce our speakers today. We have two amazing powerhouse women in the computer science field. I have to admit I'm completely out of my depth on this topic. We're joined by Rana el Kaliouby, who holds a PhD and is CEO and co-founder of Affectiva, and author of Girl Decoded. She's an academic turned entrepreneur, and she will share her take on being a founder in a white- and male-dominated field in her quest to humanize technology.

Betsy Ludwig: I want to just take a minute to read a review of her book. Here it is, I almost read it. It says, "Written with kindness, vulnerability and grace, Girl Decoded reveals the tour de force that is Rana el Kaliouby. Her must-read memoir spurs technologists to follow their conscience and emboldens women all over the globe to fight for their dreams." That's by Dr. Kate Darling, and I thought that was just a really nice review.

Betsy Ludwig: She's joined by Northeastern's amazing Dean of the Khoury College of Computer Sciences, Carla Brodley. Prior to joining Northeastern, Dean Brodley was a professor at Tufts University and Purdue University, and her interdisciplinary machine learning research has led to advances not only in computer and information science, but in areas including remote sensing, neuroscience, digital libraries, astrophysics, content-based image retrieval of medical images, computational biology, chemistry, evidence-based medicine and predictive medicine.

Betsy Ludwig: Please join me today in welcoming these awesome speakers for a nice chat. And I'll turn it over to you, Carla.

Carla Brodley: Thank you for that. I like to say maybe I have the attention span of a gnat, which is why I've needed to collaborate with so many different fields, but actually I believe that new machine learning algorithms are invented through the necessity of trying to make things work in a new domain. I have so many questions to ask you, but let's start with: just tell us about your company and what it does, and who's using the software, and why you started it. It's really three questions.

Rana el Kaliouby: First, I want to just start by thanking Betsy and you, Carla, for doing this. I'm very excited. I've gotten to know Northeastern in a number of ways. First, through collaborating with Professor Matthew Goodwin. We go all the way back to when I was at the MIT Media Lab, and I'll talk about some of the work we did together in autism. But also, we have a number of co-ops that have interned with us over the last few years, and they ended up... I think we have an amazing track record of hiring these people, including a few women in computer science too. I'm just very grateful for this collaboration and this partnership between my company and Northeastern.

Rana el Kaliouby: Yeah, so I'm in the field of humanizing technology, and basically at a high level, technology has a lot of cognitive intelligence, a lot of IQ, but very little emotional intelligence. When we think of AI, for example, artificial intelligence, it's all about automation and efficiency, and productivity, but nobody's really thinking about the human centric elements. Can technology help us be more connected, help us be more empathetic? And what would that world look like?

Rana el Kaliouby: My background is, I'm a computer scientist. I studied computer science as an undergraduate at the American University in Cairo, and from then on did my PhD at Cambridge University, focused on using computer vision and machine learning to build emotionally intelligent technology. And then I had the opportunity to join Professor Rosalind Picard's lab at the MIT Media Lab to apply this emotion recognition technology first to autism. And that's how I collaborated with Matthew Goodwin. And then later, that provided the impetus for starting Affectiva, so that we can apply this technology to many industries around the world, including automotive, mental health and other areas.

Rana el Kaliouby: It's been quite the journey of taking a core technology that I feel very passionate about and then bringing it to the world through Affectiva.

Carla Brodley: How do companies use the products? How do they, if they buy your software, can you give a couple of different use cases?

Rana el Kaliouby: Yeah, one of the very first use cases was around quantifying how people respond to content online. We're all consuming all sorts of content, and we are now able to quantify through people's facial expressions, moment by moment, what their experience is like. And that product is in 90 countries around the world and it's used by Fortune 500 companies. More recently, we've been really focused on the automotive industry, where our technology's being used to identify distracted drivers and drowsy drivers, and what's happening inside the car. A lot of computer vision and a lot of deep learning, and machine learning.

Rana el Kaliouby: And then another area that I'm super passionate about, which I think overlaps with your interests, Carla, as well, is this idea of a sensing platform to advance mental health, an objective, longitudinal quantification of mental health. We do a lot of work, or we partner with companies that are focused on autism, and we've done a little bit of work around depression and Parkinson's.

Carla Brodley: Oh, I have a question, and I don't know if this is possible. I've been thinking about your software, and it's so interesting and so cool. And we've all moved to an online environment. I'm wondering, let's say you're teaching and you have a whole bunch of little boxes of people. Would there be a way to use your software to tell who's confused and who's bored, and who's multitasking, so you could call on the student who's multitasking, or if somebody is confused, you could say, "So and so, do you have a question?" It would just help you as a person, and particularly if you're a faculty member who's not so good at reading facial expressions.

Rana el Kaliouby: I think there's a huge use case for this technology now that we've all migrated into this virtual universe. Right? Because as you said, when you're presenting... I've been doing a lot of these book talks, because my book tour had to pivot to a virtual book tour. I give a lot of these talks and often there are a hundred-plus people I can't see, which I ordinarily would. If you and I, Carla, were doing this live together physically, we would riff off of the audience's energy, but we can't do that right now.

Rana el Kaliouby: And so, we have no idea if our audience is engaged, are they bored to death? Are they rolling their eyes? Are they totally confused? Right? And I feel like it's broken that iterative feedback loop that is so necessary when you're engaging an audience, whether you're an instructor or a presenter. I think there's a lot of opportunity. And what's cool about it is you don't actually have to turn your camera on. You just need to give the algorithm access to the technology and it can just anonymize your emotional engagement and make that available to a teacher. I think [crosstalk 00:09:38]
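
To make that idea concrete, here is a minimal sketch of the kind of anonymized, aggregate feedback loop being described. This is not Affectiva's product; the per-student engagement scores are placeholder numbers that a real system would get from an emotion AI model, and the function and threshold below are hypothetical.

```python
# Minimal sketch: per-student engagement estimates are aggregated before they
# reach the instructor, so no individual student is identified.
# Scores are placeholders between 0 (disengaged) and 1 (fully engaged).
from statistics import mean

def classroom_summary(engagement_scores: list[float], confused_threshold: float = 0.3) -> dict:
    """Return only aggregate statistics, never per-student values."""
    return {
        "students": len(engagement_scores),
        "average_engagement": round(mean(engagement_scores), 2),
        "share_possibly_confused": round(
            sum(s < confused_threshold for s in engagement_scores) / len(engagement_scores), 2
        ),
    }

# Hypothetical scores for one moment of a lecture.
print(classroom_summary([0.9, 0.2, 0.75, 0.4, 0.1, 0.85]))
```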

Carla Brodley: It might be a little disconcerting when it says 55% of your class wishes they were anywhere but here, which is obviously not, I'm sure, happening here. I think this is really exciting. But moving in a slightly different direction, I think about you and your life of doing a startup. I know you're a mom. How on earth did you find the time and motivation to write your book?

Rana el Kaliouby: The real answer is I don't really... I feel like you make time for the things you're really passionate about. Yeah, so I obviously am the CEO of Affectiva. I'm a single mom. I have two kids. My daughter is 17 and my son is 11, and they're very engaged in it. I often talk about Affectiva as my third kid, so my work is very much integrated into our family. And I just really felt that this was a unique opportunity to tell my story and use it as a platform to inspire others.

Carla Brodley: What parts of your story would you like to share, since you can't share all of it? I know that you have a very unusual pathway into tech, certainly, and you've done many amazing things. Can you talk a little bit about how you got into tech? What is it about your story that you wanted to share with us?

Rana el Kaliouby: I would say the number one thing that I want to share through this book is that I have had a lot of inner doubt throughout my journey, this voice in my head, I call it the Debbie Downer voice, that kept saying over and over again, "You can't do this. You won't be able to raise money. You can't start a company. You're a woman, you can't do this, you can't do that." The voice is still there, but I learned how to negotiate with it and say, "Okay, I hear you." Right.

Rana el Kaliouby: I think one story in particular... I grew up in the Middle East, so a lot of cultural and societal norms are deeply ingrained in my brain around what a woman, for example, can and can't do. And then of course, I'm in the tech community, which is very, very male dominated. And so, as a result, I'm very passionate about diversity. And I want to talk to you about that, because I know that's something you care deeply about as well.

Rana el Kaliouby: But when we started Affectiva, Roz Picard, who's my co-founder and mentor, and role model, and I were raising our initial round of funding, and we agreed that we would hire a business executive to run the company. And I was the chief technology officer. A few years in, he moved on and the question became, "Okay, who's going to be the next CEO?" And a couple of our board members said, "Well, Rana should be, it's her company, it's her technology."

Rana el Kaliouby: And I was just too scared. I was like, "I've never been a CEO before. I don't know if I can do this." And I didn't raise my hand. At the same time, our head of sales at the time basically said, "Yeah, sure, I'll do it." And he took the job. And so, he became the CEO of the company.

Carla Brodley: I did not realize.

Rana el Kaliouby: Yeah, for a couple of years. And then in 2015, it was right after I gave my TED talk and I went back to Boston, and with a lot of nudging and help from mentors around me, I basically went to him and I said, "I actually want to be the CEO of this company." And I Googled like what are the roles and responsibilities of a CEO? And I found that I was already doing them. We had raised $50 million for the company. I was at the forefront of all of that. I led the majority of the team, so I was doing the job. And it took a lot of courage to get outside of my comfort zone and convince myself first.

Rana el Kaliouby: Once I convinced myself, it was so easy to convince the board and the rest of the team that I can do this. That story was four years ago. I've been CEO for the last four years. And yeah, my advice to other females on my team, but not just females, right? Don't let this voice in your head be your biggest obstacle. Don't be your own biggest obstacle.

Carla Brodley: I think that's just a wonderful example of how you need to overcome those voices. I'll just share a quick personal anecdote, which is when I took the job as Dean, I had the same feeling as a parent, which is sometimes I'd look at the kids and think, when's the real parent going to show up. And I remember when I took my job as Dean, at the end of the year, I was thanking all of my associate deans. And I shared with them that I had the same feeling sometimes: "When is the real Dean going to show up? Because I'm just acting like the Dean."

Carla Brodley: And then after a while, you grow into your role and you can't even imagine why those voices were talking to you. And as new challenges present themselves, you're just excited about them rather than scared. And so, I think it's lovely to share that when you've gotten to a position of leadership, so that the young people on the Zoom meeting today can realize that that voice is inside of all of us, whether we're male or female, telling us we're not sure that we're good enough for a particular role. And if that voice isn't there, maybe you're not excited enough about it.

Rana el Kaliouby: Yeah. That's interesting. I've never thought of it that way. I mean, you are also a minority as a Dean or even in the senior leadership of a computer science department. How have you navigated that? And I guess what does that mean in terms of your and the college's commitment to diversity and inclusion?

Carla Brodley: I'll say that when I started out as an undergraduate in computer science, the statistics were really different. 40% of my classes were female when I was an undergraduate. I went to McGill University and it was a joint program with math, and a lot of the math students were women, and they decided to do this new thing called computer science. And then I got to graduate school, and of course, the demographics changed remarkably.

Carla Brodley: And then when I went to become faculty at Purdue, I was the third woman to be hired out of 70 faculty in ECE, the 12th woman ever to have been hired in all of engineering out of 220, and the very first person to have children during the tenure process. I remember thinking very strategically, first of all, about what I would say yes to, because I was asked to be on every committee. And then I invented a statement which has served me well when I would be in large committee meetings, and I would be both an assistant professor and a woman.

Carla Brodley: And I learned a phrase which I share with graduate students, which is... and this works for men too. This isn't a gender-specific thing; it usually works for anyone who is not in a position of power in a group: "Thank you, Joe, for re-expressing my ideas such that everyone could appreciate them."

Rana el Kaliouby: Love it.

Carla Brodley: Yeah, and it has humor and it deals head on with the situation that has happened to you, and you don't sound complainy or whiny. But I would say that the main motivation for my becoming a Dean, and I shared this yesterday at an all-college forum where we had four wonderful students talk about their experiences in Khoury as students of color, and it was eye-opening. I talked a little bit in that panel about why I became a Dean.

Carla Brodley: I had been working on efforts to increase the diversity in tech as a volunteer for decades. And when I was offered the Dean opportunity at Northeastern, I thought to myself, "This is an opportunity to have a national platform with which to debug ideas and then bring them to the rest of the country." And that is what I've been focusing on since I became Dean of our college, and the tagline that goes with our university is Computer Science For Everyone. And that means everyone should feel welcome in computer science and computer science affects every other discipline that there is. And so, a lot of our programs are designed with both of those in mind.

Carla Brodley: Now I get to ask, if we get to take turns, because I have so many things I want to ask you, and I've heard my own stuff so many times that, if you could look at my face, you'd see I'm now bored talking about my own programs and I'm so much more interested in you. Tell me a little bit about how you have approached diversity in your company. Obviously, you don't have the problem that you can't read facial expressions of people of diverse backgrounds, because I'm pretty sure that you addressed that specifically, but how else do you address these issues and advocate?

Rana el Kaliouby: Yeah, I like to say that for us, diversity is not just the right thing to do, it's a business imperative. Because for me, ensuring that our algorithms aren't biased means ensuring our algorithms work. I mean, we're deployed in 90 countries around the world, so we cannot afford for our technology to be trained on a homogeneous set of older white guys and then you run it on somebody that looks like me, and it just falls flat on its face. It's not going to work. We have to make sure that our data is diverse and we have to make sure that our team is diverse, because we design for what we know and we all have our own blind spots. The more different voices you have around the table, the better the end solution is going to be.

Rana el Kaliouby: For us, diversity is obviously gender diversity, ethnic diversity, but even diversity of age. We have a really robust summer internship program and a commitment to bringing in co-ops. I can see Jordan from our team is on the call, and I can see him really acknowledging that. It's really important to bring these younger voices to the table because they obviously grew up with a very different experience of technology and it's going to affect them the most, everything we're doing today. And also, diversity of perspectives too. You don't have to be a machine learning scientist to contribute and be a productive team member. In fact, we welcome different voices.

Rana el Kaliouby: That's very important to us too. And yeah. And then, like you, because I'm very passionate about this, I carve out some of my time to volunteer. I'm on the All Raise Steering Committee for Boston. It's a nonprofit organization committed to supporting female founders, but also female investors. And we've been very active. And I feel very fortunate to be part of that organization as well.

Carla Brodley: That's really interesting. And I think, if anything, the age diversity is something that's very lacking in tech, in that you often don't have older people. The young people are easy, but when I walk into Google, I feel positively decrepit. There are all these 23-year-olds walking very importantly with their laptops across the Google campus.

Rana el Kaliouby: Yeah. I think the diversity is really critical, including age diversity and this inclusiveness too. Right? It's not enough to just be diverse. It's even more important to make sure that your diverse team feels that they have an equal voice. Right? And we've been talking a lot about that internally at the company too, how do we make sure we're an inclusive team and that every voice is welcomed.

Rana el Kaliouby: I mean, I was curious, I mean, you have a program, the way I internalize it, it's like democratizing access to computer science. You have a program where you don't have to be a computer scientist to enroll in this program. Can you say a little bit more about that?

Carla Brodley: Sure. We started a master's in computer science for people who did not study computer science, and Betsy, we are going to have room for you if you're ever interested. I know you will demystify how hard tech is if you come into the program. And it's basically two semesters of very accelerated undergraduate material. Although you're not with the undergraduates, because no 29-year-old wants to sit in a room with 18-year-olds. And the average age in the program is actually late 20s.

Carla Brodley: You do basically about two thirds of an undergraduate degree in those two semesters. That is pretty hard. And then you join our direct entry master's students to finish up the three semesters of classes that you need to take to complete the master's program. Often, you go out on one or two work experiences, hopefully at your company, and of the people who enroll, about one third come from STEM, one third come from business, and one third come from the liberal arts, which is really exciting.

Carla Brodley: And so, we are focused both on diversity of demographics, but also diversity of thought. You get people who maybe studied English or journalism, or sports medicine, and then they come and they get a tech degree. And it's called Align for two reasons. The first is you didn't make a mistake in what you studied as an undergraduate. Anything you studied is fine, let's align that to tech. And second, it's about aligning the tech industry to population norms. Our enrollment goals are to mirror the population statistics of who graduates from university in this country: 50% women, about 85% domestic students, and of the domestic students, 25% underrepresented minorities, which are all minorities except for Asian Americans.

Carla Brodley: And we're pretty close to those statistics. And we are really excited about the program. We want to make a master's in computer science like an MBA, something you can do after any undergraduate degree. And why did we create this program? Partially because I was an English major when I started at McGill University and then discovered computer science. And I thought about what my life would have been like if I hadn't discovered computer science. I would have been so sad, because I love this field so much.

Carla Brodley: But most importantly, and I apologize to anyone on this call who is 17, but we can't leave it in the hands of 17-year-olds to decide who goes into tech. Because a 17-year-old, like myself, might choose English as a rebellion because her parents suggested that she do math. And the rebellion was more important than acknowledging that maybe they were right, given that I ended up with a math and computer science degree. I did say, "You were right," to my wonderful parents.

Rana el Kaliouby: I think it's funny, because I have a 17-year-old daughter and I don't know if it's rebellion or what, but she refuses to think of herself as a STEM human being at all. She's very interested in international relations and political science, which is great. But I think there's room for this interdisciplinary state of the world where you combine different disciplines. I mean, I tell a lot of young people who come our way that there's a need for combining humanities and computer science, because we need to think about AI ethics. Right? How is Northeastern approaching that?

Carla Brodley: Over half of the majors in Khoury College of Computer Sciences, and we're at about 2,000, maybe actually 2,300, I think, with the newest class coming in, do what's called a combined major, which is not a double major. It's where they take about two thirds of the classes of, say, an English major and two thirds of the classes of a computer science major, and then there are some classes that put them together, for example in the digital humanities. We have 36 of these combined majors. We've combined with the obvious ones like biology, physics, and design, and the non-obvious like philosophy and journalism, and criminology with our cybersecurity program.

Carla Brodley: And I will say that the number of women is higher in the combined degrees. 36% of the combined degrees are women, versus 23% of those doing just one of the computer science degrees. And furthermore, your point about ethics is so important. We are infusing ethics curriculum into six of our undergraduate classes rather than having it be a standalone, because it really needs to be integrated into the material. That's really important.

Carla Brodley: We are also creating a new course with social sciences and humanities that will fulfill a general ed requirement, which we call any path, which looks at what is going on in terms of tech and diversity. There'll be things on bias in machine learning, things about the culture of tech right now, all of the things surrounding that that we're really excited about. And that was actually a suggestion from a student in my inbox in the last two weeks. And it was brilliant. We got right on it and we're working on it. Ideas can come from anywhere. I don't have to have them all. I just need to be able to recognize them.

Carla Brodley: Okay, I'm going to turn the tables and ask you another question. And then Betsy, I don't know when you want us to open up for questions. I know that there's quite a few. I would like to ask you one more question, which is, tell us a vignette from your book that really exemplifies why you wanted to write the book.

Rana el Kaliouby: Ooh, that is a hard question. Okay. The one story that comes to mind is when we were raising money for the company in the early days. This is 2009; Roz and I had just decided to spin Affectiva out of MIT. We had this trip lined up to do a Sand Hill Road show. Sand Hill Road, for those who don't know, is a road in the Bay Area, and it's basically just filled with venture capitalist after venture capitalist.

Carla Brodley: That's where the money is.

Rana el Kaliouby: It's where the money is, exactly. And we had lined up maybe 20-plus meetings with these VCs. And it took a while to line them up. Right? It was a big deal. Now, at the time, my son was about seven months old and I don't have any family in this country, so he had to join along. And I had lined up a babysitter for him so that I could drop him off in the morning, go do my pitches, and then pick him up. And then one day, she called and she said, "I'm not feeling well, I can't take your son." I was like, "What?" Quite a decision, right?

Rana el Kaliouby: I could have canceled the meetings, but these were really important, and so I decided to bring him along. I put him in his car seat, and here we are, Roz and I would walk into the VC's office, and I just would plop him... There's usually a really kind looking person sitting at the front desk and I'd say, "Here you go, take care of Adam. I'll just go pitch and come back." And so, I did that, and to me, this kind of perseverance and grit, and just finding a path through, dealing with obstacles as they come your way and finding a way through them, really in a way summarizes my life and why I wrote the book.

Rana el Kaliouby: There are so many examples of these obstacles in all of our lives, and we have a decision to either give up and turn back, or power through, and I just powered through.

Carla Brodley: I love that story. While you're looking for questions, Betsy, I'll just share that I don't have nearly as good a story, but I did have my 10-year-old come to my machine learning class on a snow day. And I told him that when I did this, I wanted him to raise his hand and ask the following question. His question was, why is the conjugate gradient an exponential? And it was just hilarious, because the entire class was like, "Wait, Professor Brodley's son understands what an exponential distribution and a conjugate gradient are?" And then all of a sudden, they got it and they all laughed. But so you can have more fun with your kids coming to work than you can imagine.

Rana el Kaliouby: That's awesome. Is he a machine learning scientist?

Carla Brodley: He is actually at Northeastern doing a double major in theater and computer science.

Rana el Kaliouby: Cool.

Carla Brodley: He'd be a great person for you to hire. All right, he gets emotion and he can do computer science. Betsy, do you want to start giving questions from anyone from the chat?

Betsy Ludwig: Yes, thank you so much. God, I love those stories, and I can't even spell what you just said your son said. We have amazing questions in the chat and they cover everything from the importance of EQ to privacy, ethics, diversity, data bias, how the COVID crisis factors in. And so, there are a couple of people I actually am just going to call on to ask their question. I think Bethany Edmonds might be joining us from the West Coast. Bethany, are you there, and do you want to ask your question?

Bethany Edmonds: Sure. My question is, have you encountered difficulty explaining the value of EQ to software developers? And if so, how have you approached it? Because I teach computer science and I find that sometimes not everybody sees the value of it. And how have you, especially when you're pitching it to VCs, how do they get it?

Rana el Kaliouby: Yeah. I've been doing this for 20 years, first in academia and now in business. When we were pitching the company, we avoided using the word emotions altogether. We called it the E word and we danced around it. This is why the company is called Affectiva, because affect is a synonym of emotion. In our first decks, we talked about sentiment and arousal, and valence, never the word emotion, but I think the world has changed. And especially now with everybody going through this global pandemic, I feel like there's a re-discovery of the importance of empathy and empathetic leadership, and an empathetic, human-centric approach to design.

Rana el Kaliouby: People are craving a human connection and they understand that our emotions drive a lot of our decision making, and our memory, and our health, and wellbeing. I think the world has changed since we were initially pitching. Yeah. But people of all sorts see the value of EQ.

Bethany Edmonds: Thank you.

Rana el Kaliouby: Thank you too.

Betsy Ludwig: And then we have a few questions that are around the use of the facial recognition software. And I'm trying to digest a few questions here without a lot of understanding of AI. But talk about the facial recognition and some of the ethics, some of the privacy concerns and then diversity issues of reading facial expressions of people with different backgrounds, races, ethnicities, and how you might be dealing with that from a technical standpoint.

Rana el Kaliouby: Yeah. We at Affectiva, we like to talk about the ethical development and the ethical deployment of AI. From an ethical development perspective, we're committed to ensuring that our algorithms are not biased against any particular sub-population. You may have seen in the news, especially recently, that a lot of the tech giants have made a decision not to apply facial recognition, specifically, for the government or in policing.

Rana el Kaliouby: And one reason is because a lot of these technologies are biased against people of color and especially women of color, because they're not trained with a very diverse data set. That's a big issue in the industry. At Affectiva, we have veered away from any industry or use case where people did not give explicit consent. We have turned away millions of dollars of business in the surveillance space and in the security space, because I think we just acknowledge that this data is very personal and it could be abused if it's put in the wrong hands, and we don't want to go there.

Betsy Ludwig: And there are some questions about bias in the recording of this data. Can you talk about that, and does that dovetail into this issue?

Rana el Kaliouby: It does. When I was a PhD student, for example, over 20 years ago, when you approached a machine learning problem, you just talked about a data set in its entirety, and you reported on accuracy and F1 scores at the dataset level. Unfortunately, that means that if the model is biased against a particular subset of the data, you won't see that in the aggregate accuracy score. Now the industry's moving towards an approach where you're very thoughtful about your training data sets and about sampling examples that cover age, gender, ethnicity, people with glasses, people wearing the hijab, right? The broader the data set, the better.

Rana el Kaliouby: But also, when we look at the validation or the test set, we look at accuracy scores within each category. We look at a breakdown of how our classifiers do based on gender, how they do based on different ethnic groups. And I won't pretend that it's a done deal. This is very state of the art. The industry's moving very fast and Affectiva is at the forefront of advocating for these best practices. And we have a lot of work to do, but we're committed to it. And we prioritize it as a team, right? We are putting some mind share to ensuring that we look at these diversity breakdowns for our accuracy. But the whole industry needs to move in that direction.
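
To illustrate the kind of per-subgroup breakdown described here, the sketch below computes accuracy by demographic group on a toy validation set. It is a minimal illustration, not Affectiva's evaluation pipeline; the column names, labels, and group values are hypothetical.

```python
# Minimal sketch of per-subgroup evaluation: instead of one aggregate accuracy
# number, break results down by demographic attributes of the validation set.
import pandas as pd

def subgroup_accuracy(results: pd.DataFrame, group_col: str) -> pd.Series:
    """Accuracy of predicted vs. true labels within each value of group_col."""
    correct = results["predicted_label"] == results["true_label"]
    return correct.groupby(results[group_col]).mean()

# Toy validation set annotated with gender and ethnicity (hypothetical data).
val = pd.DataFrame({
    "true_label":      ["smile", "smile", "neutral", "smile", "neutral", "smile"],
    "predicted_label": ["smile", "neutral", "neutral", "smile", "smile", "smile"],
    "gender":          ["F", "F", "M", "M", "F", "M"],
    "ethnicity":       ["A", "B", "A", "B", "B", "A"],
})

print("Overall accuracy:", (val.predicted_label == val.true_label).mean())
for col in ["gender", "ethnicity"]:
    print(f"\nAccuracy by {col}:")
    print(subgroup_accuracy(val, col))
```

An aggregate score can look acceptable while one of the per-group scores is much lower, which is exactly the failure mode the dataset-level reporting used to hide.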

Betsy Ludwig: Right. And just my own personal question. You mentioned that you're doing a lot in the automotive sector, which is on your website. What other applications do you see for your software? What are the next verticals you're going to go into?

Rana el Kaliouby: I'll talk about one that I'm very passionate about. Again, Matthew, who's on the line, will recognize this. The very first application of this technology that we explored was in autism, where we were able to prototype a Google Glass-like device that could give real time feedback to individuals on the autism spectrum to improve their nonverbal skills. Since then, we have partnered with a company called Brain Power, and they use Google Glass and our technology, and they're now in the midst of clinical trials.

Rana el Kaliouby: But to broaden it, the application of emotion AI in mental health is really potentially transformative. When you walk into a doctor's office today, the doctor doesn't ask you like, "Okay, Betsy, what's your temperature, or tell me your blood pressure." They just measure it. There are objective measures for these important health signals, but in emotions or mental health, the gold standard is still a survey. "Like on a scale from 1 to 10, how depressed are you or how suicidal are you?"

Rana el Kaliouby: We should just be able to bring data to this problem and quantify this longitudinally and remotely from people's homes. I think that's an area that's really ripe for a lot of innovation. Yeah. And it could be really transformative.

Betsy Ludwig: And that feeds into another question I've seen on the chat from Jessica Sue. I don't know if she's here; she wants to ask a question about the new roles that Affectiva might play with COVID. Jessica, are you there?

Jessica Sue: Hi. Yes. Thank you for this opportunity. I'm an incoming freshman at Northeastern. And my question is, will Affectiva's technology take on new roles in recovering from COVID-19, whether it comes in the form of helping people with trauma or anxiety, or addressing some of the concerns with the general sentiment of growing up with social distancing? Usually, people say that technology is bad for the social skills of the new generation. How does EQ come into context in COVID-19?

Rana el Kaliouby: Yeah, that's an excellent question, Jessica. And congratulations, too. I'll start with the first part of your question. I think there's a lot of opportunity, especially now that we've all been catapulted into this universe where a lot is happening virtually, including telemedicine. Again, I think there's a lot of opportunity to incorporate emotion recognition technology into telehealth to quantify anxiety and stress through just a camera sensor and remote physiological sensing. We can actually measure your heart rate and your breathing rate just using a webcam.
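
As a rough illustration of how a camera can yield a heart-rate signal, the sketch below applies the basic idea behind remote photoplethysmography: the average green intensity of a face region fluctuates slightly with the pulse, so the dominant frequency in a plausible heart-rate band gives beats per minute. This is a toy version on a synthetic signal, not Affectiva's method; a real system would also track the face and handle motion and lighting.

```python
# Toy remote heart-rate estimate from per-frame mean green intensity of a face ROI.
import numpy as np

def estimate_bpm(green_means: np.ndarray, fps: float) -> float:
    """Estimate beats per minute from the dominant frequency in the pulse band."""
    signal = green_means - green_means.mean()          # remove the DC component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # frequencies in Hz
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)             # ~42-240 bpm plausible range
    return float(freqs[band][np.argmax(power[band])] * 60)

# Synthetic 30 fps "video" signal: a 1.2 Hz (72 bpm) pulse buried in noise.
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
fake_green = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.2, t.size)
print(f"Estimated heart rate: {estimate_bpm(fake_green, fps):.0f} bpm")
```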

Rana el Kaliouby: And so, imagine if every time you're on your phone or on your device, with your permission, we have access to that data, and that data becomes available to your physician even outside of the small amount of time that you're connected to your doctor. I think there's a lot of use cases there, but even beyond that, again, we're all connecting virtually and I'm so grateful to technology for allowing us to do that, like yay Zoom, right?

Rana el Kaliouby: But it's really crude. I feel like this is V0.1 of this kind of technology, and we're going to see a lot of innovation that sits on top of it and provides analytics and real-time data around emotion engagement and student engagement, and will allow personalization of these experiences and, ultimately, create more of a sense of a shared experience. I think that's what we're all craving. I think we're going to see a lot of innovation. Maybe some of it will come from people like you, Jessica, because we need all creative minds around the table.

Carla Brodley: I just want to interject with one question I saw on the chat, Betsy, which is, are you still hiring co-ops for starting in July? I think that there's some interest in whether or not you have any more open co-op positions. Our next session, as you know, starts July 1st.

Rana el Kaliouby: We do not. However, we're just about to announce our summer training program. It's a five-week program that ends with a Make-a-Thon, where you will get assigned to project ideas that span telehealth, automotive, and other use cases like virtual live streaming and video conferencing. If you're interested, what is the best email to... yeah. Email [email protected] and I'll make sure that I point you to the right application.

Gabi Zijderveld: Rana, it's Gabi. Sorry to interject. I just posted our flyer about the program in the chat. [crosstalk 00:42:55] it's sitting there, and we'd love folks to consider it and apply if they're interested.

Rana el Kaliouby: And actually, we have a call for trainees, but we also have a call for mentors. If you're a computer science undergrad or graduate student, who's interested in mentoring some of the younger trainees, please reach out as well.

Betsy Ludwig: Excellent. I love this. Great. Now, we have lots of questions from Brianne McDonough, and I don't know if you're there and you want to ask some of your questions about diversity of thought.

Brianne McDonough: Yeah, I can. I'll ask that one. You talked a little bit about how this technology can help people on the autism spectrum. Can you also share how it accounts for differences in people? If I'm logging in and the technology is being applied to me, how does it account for people that have limited expression, or, for example, somebody with a stroke who may only be able to use half of the expressive muscles in their face? Can you just talk a little bit about that?

Rana el Kaliouby: This is an excellent question, and again, this is an example of how diversity of thought really is important. We do not have enough examples in our dataset, Brianne, around autistic individuals or people who have had strokes. We just don't. Interestingly, one of the applications of our technology is in the medical space. This doctor, Joe Dusseldorf, he's a facial surgeon and he helps patients who have facial palsy to quantify their rehab, right? Are they regaining their smiles and their facial expressions?

Rana el Kaliouby: And so ironically, we're applying the technology with autism and with facial palsy, and other types of strokes, but we do not have enough examples in our data sets. That's a blind spot. Yeah. Thank you for highlighting that. I will take that back to the team. A while back, we had a similar question around transgender individuals because we don't have gender fluidity in our data set. It's not accounted for right now, and that's probably an area for improvement. Yeah.

Betsy Ludwig: Excellent. I don't know if Rick Fisher is there to ask his question about robots. Rick, are you there?

Rick Fisher: I'm here.

Betsy Ludwig: Oh, good.

Rick Fisher: My question: there are some, oh, about eight-inch robots that come out of Japan that have been working with elderly folks doing exercises, and they've been relatively successful, but they don't have AI. Can you talk about giving them emotion AI?

Rana el Kaliouby: Yeah. Yeah. There is a lot of opportunity in the social robotics space. We are the emotion brains in a couple of social robots that are out there. One is a robot called Mabu. It's an MIT spinout and based in the Bay Area, and they send these robots with terminally ill patients. And the robot is supposed to be a companion, right? It ensures that the patient is adhering to medication. If the patient shows signs of depression, then it will flag that to an actual nurse or doctor. I believe there's a lot of opportunity for social robotics and it will never take... People often at this point say, "Oh, but it's going to take the place of a human." It's not, because humans aren't available to provide this type of care, in the first place. Right? Rick?

Rick Fisher: And in many cases, they don't care.

Rana el Kaliouby: Yeah. I mean, loneliness is one of the top issues... there's probably a global loneliness pandemic, but especially in Asia, often the number one ask for these types of technologies is because people are lonely and they want to be able to confide in these agents, whether they're embodied agents like a robot or even a chat bot. I mean, I don't know how many people here are familiar with Microsoft's chat bot called Xiaoice. It's extremely popular. And the types of conversations people have with this chatbot... I mean, they confide relationship statuses to it, job concerns. I find it really fascinating.

Rana el Kaliouby: And I'll just reference one more piece of work that I also find really interesting that was done at USC, where they had PTSD patients come in and some got to see a human doctor and some got to see this emotionally intelligent avatar. And they found that the patients were more forthcoming with the avatar than they were with the human doctor, because they perceived the avatar to be less judgmental. And I just find that it says more about humans, honestly, than it does about the technology. Right?

Rick Fisher: Well, no, it says a lot about both, and the robots and the technology have some advantages, in that they don't have to be judgmental. One of the chat comments was, what if I don't want my professor to know that I don't like what he's saying?

Betsy Ludwig: I saw that, that's a good comment.

Rana el Kaliouby: And I think you should have that choice. I feel like people should absolutely own and have control over this data. If you don't want to consent, then I think you should have the power to do that. We feel strongly about that.

Betsy Ludwig: I just want to move to another question in the chat. Jaana, are you there to ask your question about supporting women from the LGBTQIA+ community?

Jaana Dominique Tabalo: Yup. I'm here. Hi, my name is Jaana. I'm a rising third year CS and design major here at Khoury. And my question is, how are Affectiva and Khoury College taking visible steps to support women, BIPOC, and members of the LGBTQIA+ community in the tech industry? And how will you ensure that these individuals feel empowered as computer scientists?

Rana el Kaliouby: Carla, do you want to take that first?

Carla Brodley: I think we're not doing enough. And certainly, probably not enough explicitly. And I would welcome, you said Jaana, right?

Jaana Dominique Tabalo: Yes, Jaana.

Carla Brodley: Jaana, I would welcome the opportunity to talk with you about what more we can do. I know that our Dean of students, Ben Hescott, is passionate about these issues as well. Just because he's a wonderful person and also because he is a gay computer scientist. And has been for his life, and so has faced, I'm sure, interesting types of bias, but I don't have a good answer for you. The mission of Computer Science Is For Everyone really means that.

Carla Brodley: And certainly, one thing we've always done is that if students in some way have a negative co-op experience, we try to work with the company to make sure that no student will have that negative experience going forward, whether it's because of any kind of demographic or choice-in-life reason, or just because they gave you a really boring opportunity, so it didn't feel like an opportunity. We definitely work with our companies to make sure that each of our students is having a fruitful experience, and we don't keep posting with companies that are not. But please, reach out and let's have a talk and see if there's more that we could be doing.

Rana el Kaliouby: Yeah, and I would like to mirror that too on the Affectiva side and just in the business community. As I said, I work with All Raise, and I'm on their Steering Committee, and we're really committed to doing what we can to improve diversity in tech, especially. I mean, we're focused on the Boston community, but All Raise is a national organization, so we'd love your input. I will also share a more Affectiva-specific story. A few years ago, we had somebody on our team, Forest, who came out as a transgender individual. He became she, right? And I'm just very proud of how our team and our company responded to that.

Rana el Kaliouby: We're a very international team. We have a big... I mean, I'm originally from Cairo, Egypt, and we have a big Egyptian team, where it's not at all part of the norm to talk openly about these things. And so, it sparked a lot of dialogue. But also, we all came to it with a lot of empathy and a lot of openness. And I feel that Forest felt that she was really supported. We organized a coming out party for her, and her parents were there. Sometimes it just begins with supporting one person where you can.

Carla Brodley: [crosstalk 00:53:17] I would note that Ben, our Dean of students, tells me that we have the highest number of trans students in the university. I don't know what the right way to put those two words together is. I would hope that everybody feels welcome and supported, and if they don't, then there's work to be done.

Betsy Ludwig: Well, thank you guys for that. I'm sorry that the fire alarm in my house went off, my kids burned grilled cheese downstairs.

Rana el Kaliouby: Uh-oh.

Betsy Ludwig: That's her class. Well, this was a fascinating discussion. I did lose sight of the chat while I was trying to turn the fire alarm off. I really apologize for that. It's one for the books. Carla and Rana, do you have any last words? We're at the top of the hour with just a few minutes left, but do you have any final comments? Rana, especially, I would love to hear if you have any words of wisdom for aspiring female entrepreneurs on their quests, thinking about different career paths. If you have anything to say, that'd be great.

Rana el Kaliouby: I think it all starts with passion, find something that you're really, really passionate about and you want to change in this world, because that will ensure... It's going to be a tough journey, no matter what. But if it's something you really, really care about, you'll have the grit and perseverance to power through, and then support yourself with mentors and champions. I didn't do that initially. I mean, I've had a lot of mentors and people who have supported me throughout my career, but I didn't prioritize being part of a network and a community. I just worked hard.

Rana el Kaliouby: I worked a lot. My solution to everything was to work harder and harder, and I didn't really recognize the power of having a network. I would encourage young people to invest in that. It really does help. [crosstalk 00:55:16]

Betsy Ludwig: Excellent.

Carla Brodley: I'll just close this out by saying, Rana, wow, are you inspiring? And I just wish you the best on what is just such an incredible journey, and I'm certainly looking forward to when you and I can grab a coffee in person. And it's just such a pleasure to meet you and to hear everything that's going on. And of course, I'm very much delighted I'm getting my own signed copy of your book. I'm a fan of you and your book, and your company, and we're delighted to send you as many students as you can possibly absorb.

Rana el Kaliouby: Please do. And I found that this book is leading to a lot of... it's a conversation opener. I look forward to collaborating and partnering more. It's the beginning, so thank you for having me.

Betsy Ludwig: Well, thank you for being here. And actually, I have my copy here, as I pointed out at the beginning, and I'm actually thinking that we should host a book club with some students so we can all read it and then maybe have you back to talk a little bit about some of the topics in the book, and that would get more people interested in that. And maybe we can even teach it in some of our classes. But, Rana, you're such an inspiration. Carla, you're such an inspiration. I mean, you're all amazing. You do amazing things. And I think the world is changing so rapidly.

Betsy Ludwig: It's good to know that you guys are at the forefront of thinking about all of these big, global, complex problems that we need to solve. It makes me feel a little bit better. Thank you so much, again, for taking the time today to talk to us and share your wisdom. And we hope this is the beginning of a longer relationship.

Rana el Kaliouby: Absolutely. Thank you.

Betsy Ludwig: Thank you everyone. Thank you for joining us.