
Interior sensing solutions for vehicles | From KNOW-HOW to WOW Podcast

You’re driving in your car, you notice a new restaurant as you drive past and you wonder: What was that? A car of the not-so-distant future may be able to answer that question, because it knows what a driver (or passenger) is looking at - be it a landmark on the side of the road, the dashboard, or the road ahead. First and foremost, monitoring the driver's gaze and head position enables important safety features, says Bosch product manager Tyler Warga. Geoff and Shuko learn how cameras and radar sensors can pick up signals that people are subconsciously sending. This can help prevent drunk driving, reduce distractions, and save children's lives. Future autonomous cars could even benefit from a sense of the driver's emotional state: they could pick up on Duchenne smiles. As University of Missouri professor Kennon Sheldon explains, this type of smile signals honest amusement. So be amused and wowed when you join us for a sensor-packed ride!
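The drowsiness detection the episode walks through combines direct camera cues (eyelid closure, prolonged mouth opening) with the indirect steering-angle signal, correlating them to reduce false positives. Here is a rough, hypothetical sketch of that kind of signal fusion; all function names and thresholds are illustrative assumptions, not Bosch's actual algorithm.

```python
# Hypothetical sketch of multi-signal drowsiness fusion as discussed in the
# episode. Thresholds and names are illustrative, not Bosch's implementation.

def drowsiness_score(eyelid_closure_s: float,
                     mouth_open_s: float,
                     lane_departures_per_10min: int) -> float:
    """Combine camera and steering-derived cues into a 0..1 drowsiness score."""
    score = 0.0
    # Direct cue (camera): eyes closed or squinting for an extended period.
    if eyelid_closure_s > 1.0:
        score += 0.5
    # Direct cue (camera): mouth open for an extended time, i.e. a likely yawn.
    if mouth_open_s > 4.0:
        score += 0.3
    # Indirect cue (steering angle / lane keeping): unintended lane departures.
    if lane_departures_per_10min >= 2:
        score += 0.2
    return min(score, 1.0)

# A single cue alone stays below a hypothetical alert threshold of 0.6;
# correlated cues push the score over it, reducing false positives.
print(drowsiness_score(1.5, 0.2, 0))   # eyes only -> 0.5
print(drowsiness_score(1.5, 5.0, 2))   # all cues correlated -> 1.0
```

The point of the fusion is the one Tyler makes: any indirect sensor alone produces false positives, so correlating it with direct facial cues improves the signal before an alert is issued.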

Bosch Global


[FX] Nice car, Shuko! - Thank you. I just wish it wasn't virtual. - Popping that illusion button straight away. So, our sound designer created that. Good job, thank you! [FX] - Oh, get out of the way! Ah, do you think I did a little bit better now? - Very believable. - Hey, what was that? - What do you mean? - The restaurant we just passed. Isn't that a new restaurant? - Oh, that. No idea. - Well, my car definitely knows. Alexis, what was that restaurant? - You just drove past a new restau
rant called Melena's. It's German cuisine. Do you want me to book a table? - Absolutely. 7 p.m. For two, Geoff? - I'm in! [Music] From Know-How to Wow - the Bosch Global Podcast This option to get information about what you see at the side of the road, that we just simulated - that's something that cars will soon come with. But what's really fascinating is that comfort functions like this originate in safety features. So, let's start at the beginning. My name is Geoff,  welcome to our episode a
bout sensors and cameras monitoring the interior of vehicles. - And I'm Shuko. So yeah, it all starts with technology that we've had in our cars for many years. Technology that watches me as a driver and makes sure I'm fit to drive. - Because let's be honest: we're not always the best at judging our own capabilities. You might think that you can drive five hours without taking a break when in fact, you're already showing signs of drowsiness. I'm no exception and neither is our guest today, B
osch product manager Tyler Warga. - I typically drive across the country multiple times - and country being the United States - per year. And so I've driven to Utah and between Utah and Michigan five times in the last 12 months. So it's been quite a bit. - So, Geoff, I need a little bit of your help because I'm not that familiar with US geography, so let me map that. We're going from Utah to Michigan... so let's say something like Salt Lake City to Detroit? - That'll work! - Okay, so we're talking about - and I'm sorry, I have to give this in kilometers - but we're talking about around 2,670 km. I don't think I've ever driven this far before. - For our American colleagues: that is about 1,660 miles. Just so we're on the same page here. And yes, it does take a few days, that's for sure. And it's not always the best scenery, on top of that. - When you look at a typical driver - and I'm including myself - they think they know best, and people will say, I'm not distracted, I'm not drowsy. - Or so
they think. - And so it can be helpful, for your own safety and the safety of other road users, to be told that in fact you are drowsy and you should take a break. [Music starts] - Recently I got a vehicle that has an indirect driver drowsiness and distraction system. And it kind of gives these different reminders that give some sort of indication, and whether or not, you know, I continue driving or not, it still provides an alert that then makes you more mindful. And that's what's really important for it, is that from my personal experience, having some sort of visual audio alert that makes me more mindful will almost reposition me to think differently or to just be mindful: oh, maybe my driving performance isn't the best, or maybe my driving performance could be improved. And then it has nudged me to maybe take a quick gas station stop and get something to eat or drink or something like that. And then continue on. [Music stops] - So, can you explain how a car can actually detect how fit I
am as a driver? - Tyler mentioned that. That it's an indirect system. And maybe another indirect sound - I'm not sure if it's getting picked up in the microphone - my dog is upset about some test sirens right now. - He's test sirening himself! - He's sirening himself, exactly. So anyway, it is an indirect system. And that usually works using a steering angle sensor. So the system monitors your steering and from that it tries to derive if you might be tired. - So, let me get this straig
ht: If I'm over-steering and then correcting myself, that could be a sign. - Or, on the other hand, if you're not steering enough. When coupled with the lane-keeping assistant, it can also take into account how often you're leaving your lane without intending to do so. - So then a little coffee mug might light up on the dashboard, telling me that it's time for a break. - Well, these are indirect systems, so it's a little less direct than that. They don't measure directly how tired you are. Th
ey're trying to derive how tired you are from the way you're steering. - The steering angle sensor is an indirect sensor. And so with that can come false positives. And so this indirect sensing can be enhanced by a direct sensor. - If you could measure drowsiness more directly, that would make the measurement more reliable. - How do you measure drowsiness? I mean, in essence the car is not going to measure my brainwaves, right? - Actually, no need for that. Because it's actually possible to
determine how tired someone is just by looking at your face. - What do you mean, my face now? - No offense, Shuko! - If you must know, I slept like a baby, so you must have misread something. I don't have any signs of sleeping... - No, it's not just you. I meant the 'royal you'. Everyone's face sends drowsiness signals. And they can be captured simply by a camera. - What the camera can do is actually directly measure the driver's facial features to really understand if they are drowsy or not. - Does Tyler mean by facial features something like yawning? - Yes, yawning is certainly one way of determining that someone is tired. In engineer-speak this means that the algorithm detects mouth opening for an extended amount of time. - Okay, so in my case for example, I'm obviously not tired, I just opened my mouth for an extended amount of time. [Music starts] - And maybe you did. But the camera is also looking at your eyes. - It's trying to understand the state of the eyelid. So if you
consider like resting eyes when someone's tired, they often will squint their eyes for an extended period of time or blink very often. And so we're trying to figure out how fast, for example, the eyes are opening and closing, or if they're closed for an extended period of time. - Which actually makes sense. This is indeed more direct than just measuring somebody's steering. - Especially when you combine the signals! - We can actually understand if someone has their mouth open for a bit of tim
e, and then correlate that with the eyelid closure to improve the signal to then determine, yes this person is drowsy. - And perhaps correlate it with the steering angle as well to increase the level of certainty even more. - And you said this technology is already available in cars today?  - That's correct. - Could you maybe tell me where I would find that camera? - I'll let Tyler answer that. - That's actually a really good question  because in general, some people want to find the camera and
some people don't want to  find the camera, right? So typically a lot of cameras are located at the steering column or  the A pillar, for example. We also have some cameras positioned more in the middle of the  vehicle, so like on the display or the mirror. - I think it's an important point. Do we want to  see the camera all the time? Probably not. But should it be completely hidden, almost recording  us, like in secret? Also probably not. - Or maybe it should be as on a smartphone. You  know t
here's a camera there. You know where it sits but you don't think about it every time you look at your phone, because it's kind of camouflaged in that black area. - Yeah, I don't really feel comfortable about a camera watching me as a driver, I have to admit. But I'm probably not the only one feeling this way, right? - Probably not. But for sure no one is ever going to see those images. This system is there just to remind you to be a responsible driver. And there's actually a push globally to in
stall systems like these and make them even more capable to increase road safety. - Specifically in Europe, we do see regulations and also consumer testing requirements that actually are promoting the installation or requiring the installation. - Here in the US as well, there are efforts to make such systems mandatory. And regulators as well as the engineers at Bosch are thinking about more ways to leverage these cameras. Because, once they are in the car, you can do much more with them th
an just detecting drowsiness. - So I guess my face is giving away more information about my state than I maybe want it to. - Once again, your eyes specifically. I don't think the police do this in Germany, where you live. But here in the States, when they stop you, they might actually ask you to exit the car. And then they stand in front of you and they're moving a finger or a pen back and forth in front of your eyes. - They have the finger going back and forth and they ask you to follow th
eir finger. What an officer is subjectively measuring is the smooth pursuit of that individual's eyes. - But in this case we're actually trying to figure out if you're sober or not, right? - Precisely. They are looking for saccades. - So this time you're going to be able to do the research for me. But I have actually never heard this word in English. What is a saccade? - I'm so glad you asked because I had to look it up myself. A saccade is when your pupils almost jump from one position to the next. It's a rapid movement from right to left, as an example. - So I could probably compare it to somebody reading. - Or just looking around a room. The thing is, when you're intoxicated, that is, if you have alcohol in your system, your eyes have trouble following a movement smoothly and instead they are jumping from one point to the next. - Meaning: more saccades. - Yes, and it's important to note you don't have control over this. - To be honest, even in a sober state I don't tend to think about how much I move my eyes. It kind of just happens naturally. - And that's exactly why it's useful in detection. And it's another signal that a camera can pick up. - Believe it or not, the camera can actually track that as well. And then from there we determine the velocity, we determine the count, also where the pupils are fixated, and then we correlate that with the behavior. - So all that is to say, future cars might be able to help reduce drunk driving. - An
d I think that's amazing! So this is again a case of 'Invented for life'? Probably, because there are so many terrible crashes that I think could be avoided. And I mean, alcohol is only one drug that could impair a driver. - That's right. Tyler, what about other drugs? [Music starts] - We actually conducted a collaborative study focusing on alcohol detection. The reason being, there's been more of a regulatory push in this direction, and also a consumer testing push. And so this is the
direction we're going. But the potential to track other drugs is possible. So we are evaluating that as well. But it is believed that, you know, there are different involuntary behaviors associated with other drugs that also can be tracked, or, in some cases, sudden sickness events. You know, in a lot of cases where someone may have a sudden sickness event where they would pass out, there are different behaviors that may be detected prior to that, or, in the case that it is actually happening, at least understand that and then provide the signal, if the vehicle's capable, to execute a minimal risk maneuver, for example. [Music stops] - Which means, the car could pull over to the side and stop. - It sounds like there could be a lot more coming, though. A lot of functions that make roads safer, just by having an eye on drivers and making sure they're doing okay. - Isn't it fascinating that there is so much that you can read from someone's face? - Very fascinating! And there are al
so things you as a human might be able to read from someone's face. And you're not even aware of it. Like, you meet a new person and you instantly have a gut feeling about them. - Ah, something like, I don't trust this guy  or: I want to be friends with this person. - So, I did read up a little bit on this. And do you think you can distinguish a fake smile from a real smile? - I certainly would like to think so. But I have a feeling you're about to tell me that I can't. - Ah, let yourself be
surprised. So, I found this psychology professor, Kennon Sheldon, who has actually published about this. And he has a nice portrait picture on his University of Missouri staff profile page. And in it, you know, he looks very friendly to me, but I couldn't really tell: is it a fake smile or a real one? - Ah, yeah. I'm pretty good at Duchenne smiles actually, so it probably was a real Duchenne smile. - A Duchenne smile? - Geoff's getting a little bit nervous. So many French words. - This is testing
my skills! - This was named after Guillaume Duchenne, a French physician in the 19th century. [Music starts]  - A Duchenne smile is a smile that involves the entire face, especially the muscles around the eyes. And so they lift the eyes and the cheek and so it's not just the mouth moving in a smile, it's the whole face smiling. And, perceivers can tell that you're really feeling that positive emotion when you express a Duchenne smile. [Music stops] - So, basically it's really smiling with yo
ur eyes, and when someone does that, it's probably real. - It's not something that you can really do on purpose. You have to actually feel it, and then it's just a kind of symptom of that feeling. - Yeah, except he said that he is good at it and he did it on purpose for his profile picture. - True, and you'd also expect that actors can fake it, because otherwise you wouldn't really believe their emotions in films. [Music starts] - It probably just has to do with being able to bring to mind some
thing or some aspect of the situation that's really amusing to you. So you have to make yourself feel it at that moment. And so that's how you can fake it, is to get a real feeling that's not really about what the perceiver might think it's about. [Music stops] - See, Geoff, you've got tips for the next photos. But Kennon says that Duchenne smiles are universal. So all humans can produce them and read them, independent of what culture we grew up in. Because they're older than the culture. - S
o this is a really important idea in evolutionary biology that, you might evolve the ability to signal in an honest way that can't be faked, that you have certain attractive strengths as an animal. - So strengths that go beyond muscles and being able to dominate or, I don't know, build a nest. - I think we're talking more of social strength in this context. - Nest building is a social strength! - We think that the Duchenne smile evolved in part to give us a way to signal to each other that,
hey, I'm here. You know, things are good. I'm genuine, this is authentic.  I'm not trying to take advantage of you. - Okay, so dear listeners, I can see Shuko right now. Shuko, would you please smile at me... That's the fakest thing I've ever seen! - No it's not. No, I don't agree. I'm always happy when I see you Geoff! - All right let's try again. Try and  think of something genuinely amusing. - So, maybe like me doing jumping jacks. And if you don't know what I'm talking about, go to our pre
vious episode, listeners. Okay, how's this? - Oh, that's wonderful. Such a difference. Thank you very much! And honestly of  course you know that I love your smile. - Thank you! [Music starts] - If you think of somebody you know that's constantly having belly laughs, they just are really good humor, good spirits, you can see it in their face. And so we said that frequent Duchenne smiles are honest signals of what we call chronic positive affect or positive emotion. And those are the people th
at you really kind of wanna associate with. You wanna marry them, you want them on your team, you want them as friends, because they're gonna make you feel good. And it's not fake. They're feeling it and they help you feel it. [Music stops] - They help you feel it, absolutely! I think I need a couple more Duchenne smilers in my life. - Yeah, me too. I think I could use a few more Duchenne smilers in my life. But anyway, returning to technology: I wanted to know from Kennon if those signals couldn't also be picked up by cameras and image recognition systems. It would help them read our emotions, right? - I don't see any reason why they couldn't, and I would be surprised if it wasn't already happening. - Well, it just might be happening in a Bosch research lab. - If you're in a higher level vehicle or leveraging a higher level of automation, let's say in this case an SAE level four - - Dear listeners, for those of you who may not know, SAE refers to the Society of Automo
tive Engineers. - Maybe emotions become more important to then anticipate what the driver would like, or maybe anticipate what the driver is going to do next. - So he's saying automated cars should adjust to the emotional state of the driver - or the person in the driver's seat, because the car is doing the driving here? - Yeah, pretty much. Being able to see how people are feeling could make the experience of a self-driving car much more pleasant. - One of those examples is, let's say, there'
s an SAE level four system and it's behaving a little bit aggressive, you know, more than the person will want. And they have the little - you know, I can't do it 'cause we're on a podcast - but the frightened look in your face, if you will, where, you know, the eyes widen up a little bit and then the mouth, you know, closes, but kind of purses together. We can track that and we can indicate maybe that the virtual driving system needs to be toned down, if you will, you know, and mayb
e drive less aggressive. - A car reading and understanding my emotions.  That really does sound like the future. - Doesn't it, though? - Right. Yeah, it's - I would say - definitely futuristic, but also definitely realistic if you consider what we can already do today and how we can, again, enhance the overall  driving experience for the driver and occupants. - Let's talk more about that. Because so far,  we've talked about a camera monitoring the driver. But actually, what Tyler and his cowork
ers make consists of more components than that. - Right, we promised that this monitoring technology, which is rooted in enhancing safety, can also be used to increase comfort. So, let's get to it! - You know, maybe let's get back into our virtual car [FX] and see what Tyler's system can do for us. Imagine, sometime in the not-so-distant future, you and I go for some after-work drinks with some colleagues. [FX] We have a nice chat with our colleagues. The evening gets longer and longer. We're in a g
ood mood. Everyone drank well. But not the two of us. We're still sober. - And I would say: “Sure, we can take my car [FX] and I'm gonna give you a ride home.” - Sounds good. [FX] But then, we get into the car, you on the driver side, me on the passenger side, and it turns out your beer was in fact not alcohol-free. [FX] - This is how it could sound if your car tells you: You shouldn’t drive. But it depends on the manufacturer how the warning appears. Anyway: That’s what Tyler’s technology
reveals. - Then right away, the camera's able to detect, you know, within a certain duration, that the driver maybe had one too many drinks and is impaired. - So the car would basically tell me: No, Shuko, you know what? You absolutely should not drive. [FX] - Fortunately, I didn't actually feel like drinking. So we switch seats [FX] and now I get to drive your fancy car. - Lucky you! Because normally it wouldn't be that easy. So, with that said, Geoff, get us home safely. [FX] - From there, as you could imagine, maybe someone who's had a little too much to drink is maybe a little chattier than they'd normally be. And so from there, the passenger's talking back and forth to the driver. And so the driver continues to look back at the passenger to have, you know, a normal conversation as you would in the vehicle. - But, at one point, I look at you for too long, not having my eyes on the road for an extended amount of time. - And then the first distraction warning is issued. [FX] - Tha
t means, the camera system  can also determine when you're not looking. - Yes, but this time not just by looking at your eyes or your pupils, but also at your whole head. - The reason we do the head pose and the pupil direction is if you consider, it's actually considered owl and lizard. So in an owl, eyes are fixated, it rotates its head back and forth, whereas a lizard often rotates its eyes and has its head fixated. So if you consider someone driving and then maybe texting on a phone, their
head position is actually facing forward, but their eyes are looking downwards. And so we can track both of those to, again, enhance the signal. And that generally would indicate if someone's distracted or able to perform the driving task. - Owl (makes sound) - and lizard (makes sound). I'm not so sure I like the lizard bit. So you pretend to be an owl by having your head facing forward, but then you lizard-like direct your eyes towards your phone. That's a no-no. - Well, during our car
ride, what could happen is, my phone rings. [FX] So now I'm driving your car and my phone isn't connected to your system, so I might want to do what you should not do: I'm just gonna go ahead and answer the phone. [FX] Hello? - Now, we're gonna shift from a driver monitoring camera to an occupant monitoring camera. - That's a second camera, watching not only the driver's face but everyone's position inside the car. - And so when you consider like a phone call. Typically when someone takes a p
hone call, they lift the phone, they put it by their ear.  Right there, automatically we're detecting the bend of the arm, but also we can see the object  in the hand. And so those two things can help us determine the behavior in this case of phone call,  to understand again that the driver's distracted, but also more importantly that the driver can't  grab the wheel with their right hand. [FX] - That's important because you've been using my car's highway assist function. And with that, you're a
ctually allowed to take your hands off the wheel, but you must be able to grab it with both hands at any time, in case of emergency. - My bad. I pass the phone to you. I'll be a responsible driver, focusing on the driving task. You, on the other hand, seem to be very comfortable in the passenger seat. - Yes, I have my feet on the dashboard and I'm probably being the DJ in the car and just chilling. - It's your dashboard. - If someone's sitting in the seat and has like their legs up on the dashboard, for example, or maybe is in an unfavorable position, we can also provide that data or information to the passive safety system to then enhance overall safety, and maybe not deploy that airbag facing forward, or even provide alerts to take your feet off the dashboard. [FX] - If something were to happen, that airbag would hurt you much more in that position than it would protect you. So the system can automatically switch it off. That said, to make those kinds of decisions, i
t's important for the car to know where  passengers are in the first place. I might place a bag on the seat and that should have different consequences from a person sitting there. And that's why in addition to the driver monitoring  camera and the occupant monitoring camera, the system can also include a radar sensor. - It's a 60 GHz radar that's positioned in the interior of the vehicle and it behaves a lot like an exterior radar, but rather than detecting vehicles, it's detecting people. [FX
] - Radar, that's unexpected. Why is a radar better at detecting people than a camera? - Because of occlusions. Cameras can't see behind things, like seats for example. But a radar signal can go straight through the seat, so it can even detect someone in the third row of a van. - Sounds almost like X-ray, but without dangerous radiation. But, what does a radar actually see? It just measures distances, right? - It does. But imagine radar waves bouncing  off of your chest, the distance will sli
ghtly vary. Because you're breathing. Your chest is moving up and down, in and out, and radar actually can measure that. - And thereby determine that I'm a living human being. - And that you're an adult. That is to say, it can differentiate between adults and children. - When people exit the vehicle, one unfortunate situation that happens in all regions all over the world is they will leave their child behind, say they forget the child in different situations.  And so since the radar is capa
ble of detecting the overall breathing rate of adults, we can also detect the breathing rate of a child, which is typically a lot higher than an adult's, you know, 25 or up. So if the radar distinguishes or understands that there's still, you know, a sign of life, and in the case it's a child, we can also issue alerts to help prevent children from getting left in a vehicle. - Oh wow! - Yeah! That is really important technology. I actually looked it up. Here in the US alone, over the past 25 years, more than 950 children have died from heatstroke after they were left behind in a car. - That's shocking. - It is shocking! So, child-left-behind functionalities such as this are on their way to becoming mandatory in new cars. - That's really good to hear! Okay, shall we drive this episode home? - I think that's a great idea. For the final stretch, let's get back into our virtual car [FX] and explore one more feature that ties all of these technologies together. - And I'll be good, I'll tak
e my feet off of your virtual dashboard this time. - I still think it's your virtual dashboard but I appreciate it. And we’re going back to that restaurant that we drove past at the very top of the episode. Let’s see what was going on there. So we're driving through the city, familiar streets, familiar buildings, but then.... - Hey, what's that? Never seen this restaurant before! - Hey Alexis, what's that restaurant? - You just drove past a new restaurant called Melena’s. It's German cuisine.
Do you want me to book a table? - That's a pretty smart voice assistant. You didn't give it any specifics of what you were inquiring about. - All the information the camera and the radar and the other systems in the car collect is merged to figure out what exactly it is that we're looking at. And then provide information about it. - We do have the eye gaze tracking, GPS position and heading, and then this is all processed via a processing manager alongside our speech recognition engine. But then, on top of that, what we're doing is leveraging cloud processing related to the application, alongside interest and intent, and then also map data and POI. - POI as in Point of Interest. - Right. That data is coming from an external source. So, everything we've heard so far was relying on Bosch technology, our hardware, our software: the cameras, the radar and the steering angle sensor, the compute platform, and of course the algorithms. Now, with this feature we're making use of connectivity an
d bringing in Bosch's partner Amazon. - AWS in this case is providing in the background information on the map and the points of interest to help give back to the driver and occupants. - That's cool. Just from looking at me, the car knows what I want. And I think this is doubly cool because, let's be honest: nobody gets excited for safety features. And yes, it makes sense that a car can detect an intoxicated or distracted or tired driver. But do I care if my car has that feature? - You personally? Probably not. - If the same system also increases convenience, that gets me excited. I want it for the convenience, and I get safety as a bonus - even though it was developed to enhance safety in the first place. - I think the safety features are cool too. Because I believe it: this system actually does know better. I might think that I'm not too tired to keep going [FX] but when I get that drowsiness alert, I'd better take a break and grab that coffee. [FX] There it is! [Music starts] Fortuna
tely just as we're pulling into the driveway. Thanks for coming on this ride with me, Shuko. - It's me that needs to thank you because you've been driving. But you can definitely give me back the car keys now. - Yes ma'am. - Merci. - Listeners, don't forget to give us a like if you learned something. - And until next time, à la prochaine. From Know-How to Wow - the Bosch Global Podcast [Music stops] - Dear listeners, I'm Geoff's voice avatar and  I'll be hosting the next deep dive episode. Tyl
er will be our guest again. He will reveal more  features about the interior camera system. And he will talk about implementation challenges: Like sunglasses, which could obstruct a driver's eyes.
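As a closing illustration, the radar-based child-presence check described in the episode - chest motion reveals a breathing rate, and a child's rate is "typically a lot higher than an adult, you know, 25 or up" - could be sketched as follows. All names and thresholds here are hypothetical illustrations, not Bosch's implementation.

```python
# Hypothetical sketch of the interior-radar presence check from the episode:
# breathing detected from chest motion, with a child flagged by a higher rate.
# Thresholds and names are illustrative, not Bosch's actual system.

def classify_occupant(breaths_per_min: float) -> str:
    """Classify a breathing rate measured by the interior 60 GHz radar."""
    if breaths_per_min <= 0:
        return "no sign of life"      # e.g. a bag left on the seat
    if breaths_per_min >= 25:         # children breathe notably faster
        return "child"
    return "adult"

def child_left_behind_alert(vehicle_locked: bool, breaths_per_min: float) -> bool:
    """Alert only when the vehicle is locked and a child's breathing is detected."""
    return vehicle_locked and classify_occupant(breaths_per_min) == "child"

print(classify_occupant(14))              # adult
print(child_left_behind_alert(True, 32))  # True -> issue the alert
print(child_left_behind_alert(True, 14))  # False -> an adult stayed behind
```

Because radar penetrates seats, this check works even for an occupant a camera cannot see, such as a child in the third row of a van.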
