Self-Driving Vehicles
This doesn't address my point that the car's LIDAR should have spotted this woman even if she was in the bushes on the median, which goes to your point. Between LIDAR and cameras, the hardware and software should have figured out that this was a jaywalker intent on being in the middle of the road. More information will emerge, since the two main agencies and the companies are looking into it.
My bet is the other way. The car's LIDAR and radar should have spotted her from roughly 300 ft away, even if she had placed herself in the bushes on the median, and resolved that she was a pedestrian, especially with a bicycle, which is metal. Once she started moving, the algorithm should have told the car's brain that there was an object in motion moving into the car's path.
The car was moving at roughly 55 ft/sec, which means it should have braked without human intervention at a minimum of 120 ft from her, and likely much farther away than that. As others have said, the point is moot, because radar and LIDAR don't care whether it's dark outside or the noonday sun.
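A quick back-of-the-envelope check of those numbers. This sketch assumes dry-pavement braking at about 0.7 g and a half-second of system latency between detection and brake application; neither figure comes from the incident reports, they are just plausible round numbers.

```python
# Rough stopping-distance estimate for a car at the reported speed.
# Assumed (not from the reports): 0.7 g deceleration, 0.5 s latency.

MPH_TO_FPS = 5280 / 3600   # mph -> ft/s
G_FPS2 = 32.2              # gravity, ft/s^2

def stopping_distance_ft(speed_mph, decel_g=0.7, latency_s=0.5):
    """Distance from 'brake decision' to full stop, in feet."""
    v = speed_mph * MPH_TO_FPS          # speed in ft/s (~55.7 at 38 mph)
    a = decel_g * G_FPS2                # deceleration in ft/s^2
    braking = v * v / (2 * a)           # distance while decelerating
    reaction = v * latency_s            # distance covered during latency
    return braking + reaction

print(round(stopping_distance_ft(38)))  # ~97 ft
```

Even with generous latency, the stop completes in well under 120 ft, so a detection at ~300 ft leaves the system enormous margin.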
Your point is that a human driver would have outperformed the onboard systems because a human has better vision and detection capabilities. But humans are just as fallible: the average driver would not necessarily have reconciled what was going on in front of them any better than the Uber software and hardware did.
I'm not saying that a human driver always outperforms onboard systems, but in this particular scenario there was plenty of time for an attentive human driver to react. Maybe onboard systems could outperform a human driver if they were working properly, but in this particular case they did absolutely nothing, and they encouraged the driver not to pay attention to the road.
https://www.axios.com/uber-self-driv...85e12a145.html
It is clearly an Uber software issue, and the safety driver was simply not attentive at the time.
This is where the sensors should have detected the person in the street.
Uber is going to lose a lot of money over this.
Most safety systems out there today, with far lesser sensors, would have attempted to brake at some point...
Obviously the car should have stopped, and it did not. It is a very clear, wide street.
IIHS, Euro NCAP, and JNCAP run auto-brake tests on every new vehicle, and at these speeds the system in a midrange vehicle would autobrake to a full stop and not hit the woman.
In fact, those industry tests use much harder situations for the systems, because the pedestrian is partially hidden from view by another, stationary vehicle and then "jumps" out from that side into the street, to test how well the system works.
This, on the other hand, was a textbook situation; there should have been no problem here for the system in a Camry, let alone a system with a much better LIDAR. She was even crossing next to the street lights. I would guess it is the poor dashboard cam that makes it look so dark, since we can see that the crash occurs right next to the streetlights.
In any case, the system did not detect a problem at any point; it continued at 38 mph even after running over the person. It is clearly a system failure.
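The core decision those AEB tests exercise is simple. Here is a minimal sketch of a time-to-collision (TTC) trigger, the usual basis for automatic braking; the function name and the 2-second threshold are illustrative assumptions, not Uber's or any test body's actual values.

```python
# Minimal time-to-collision (TTC) trigger, the basic logic behind AEB.
# Threshold and names are illustrative, not any vendor's real values.

def should_brake(range_ft, closing_speed_fps, ttc_threshold_s=2.0):
    """Trigger braking when time-to-collision drops below the threshold."""
    if closing_speed_fps <= 0:
        return False                     # object not closing: no action
    ttc = range_ft / closing_speed_fps   # seconds until impact
    return ttc < ttc_threshold_s

# A pedestrian 300 ft ahead of a car closing at ~56 ft/s gives a TTC of
# about 5.4 s: no brake yet, but ample time to track the object.
print(should_brake(300, 56))   # False
# At 100 ft the TTC is ~1.8 s and the system should be braking hard.
print(should_brake(100, 56))   # True
```

The point of the comparison above is that even this crude check fires with plenty of margin at 38 mph; the Uber system apparently never reached the "brake" branch at all.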
And I can understand the human test driver not being attentive, since they just sit around doing nothing or look at their computer for info, which is very distracting...
If we lived in a serious world, there would be rules about autonomous testing: things like distractions, hands on the wheel, how long drivers can work before they have to rest, passing some kind of attention test, etc.
But we live in a silly world where lawmakers were trying to get a jump start on other states, and hence we have AZ, where there are no rules at all when it comes to autonomous testing. Now it looks very silly, doesn't it? But it is also very likely that if the state had attempted to add more rules, the media would have criticized it for holding back the technology.
Most manufacturers don't test in California anymore, since it requires them to report failures.