What's the difference? Apart from the legal requirement to have a driver ready to take over in test vehicles (which necessarily makes it Level 2), the fundamental difference is that you'd have to show a lot more than one demo to establish that you've achieved Level 4. Level 4s are supposed to be able to operate without human intervention at all within prescribed domains (e.g. downtown cities). That doesn't mean operating one trip or one day or one month without a disengagement -- that's still Level 2.
I'm super impressed by the demo but Cruise will have to show more data to back up a Level 4 claim.
when we see the streets of San Francisco conquered, then we know that self-driving is ready to come of age.
Seriously? And if you conquer the streets of Naples, what will come of age? If you've ever been to those countries, the first thing you notice as a westerner is that the streets are chaos, traffic rules are barely obeyed, and aggressiveness is required to get anywhere.
Meanwhile, Beijing and Delhi have extreme pollution problems partially caused by cars.
If any place was in need of point to point autonomous ride sharing, it's Asia. The reduction in cars would reduce congestion from the outset. But in the future, protocol guided right of way negotiation could reduce congestion even further.
Ideally, the AI should be able to figure out how the dynamics of such a situation work and not create a jam or something.
I still haven't seen a driverless car handle a single-lane road (e.g. with parked cars on the side). Something like this:
https://www.google.co.uk/maps/@51.4606316,-2.5562767,3a,75y,...
A car forcing a merge right in front of it, an armored car taking a lane and a half where it had to wait to go around (there were numerous instances of similar lane blockages that it navigated around), a pedestrian jaywalking in front of it, a bus picking up passengers, a bike cutting in front of it, and a car pulling out in front of it in heavy traffic.
I had no idea they could handle that many scenarios already.
Notes:
* There are frequent steering twitches to the left. This may be associated with passing parked cars. There are similar twitches to the right when in the left lane of a one-way street.
* Crosswalk behavior when turning needs some work. The vehicle enters the intersection, then stops in the intersection before the crosswalk with people in it. This is a hard problem, because the system needs to recognize people waiting to cross but not yet in the roadway. When the light turns green, both the pedestrians going straight and the turning vehicle can enter the intersection, the pedestrians having right of way. The pedestrians now block the vehicle, and the vehicle blocks the bike lane.
* Left turns into multi-lane streets are too wide and into the wrong lane.
* On two occasions, the vehicle is stuck behind a double-parked vehicle engaged in loading. The options are to wait or to cross a double yellow line. There's a delay of several seconds, then forward movement. Suspect manual intervention.
Entering the bike lane appears to be legally required [1] behavior for a driver turning right across a bike lane in California. Entering the intersection, however, I'm not sure about.
Illustrated here: https://www.sfbike.org/news/bike-lanes-and-right-turns/
[1] https://leginfo.legislature.ca.gov/faces/codes_displaySectio...
Near my home there are several intersections with bike lanes, and I estimate that perhaps one driver in 10 enters the bike lane to make the right turn as they are required to.
In fact, I sometimes get surprised or even dirty looks from drivers making illegal right turns when I do move into the bike lane to make the turn. I guess they think I'm trying to get ahead of them and cut them off, but I'm not; I'm simply following the law and improving bicycle safety.
Worth mentioning, 30 seconds would feel like an eternity inside a car.
Google had another crash last month.[1] As usual, it was probably the other driver's fault. The other driver apparently botched a left turn in a two-lane left-turn intersection at Rengstorff and El Camino. Google drives in traffic on the SF peninsula every day, and we know exactly how many times they've crashed. It's reassuring seeing all those miles with only the occasional fender-bender. It's not like Tesla slamming into something on a freeway at full speed. Three times.
Still waiting for the CA DMV to post the 2016 autonomous vehicle disengagement reports. Those cover December through November and are due January 1st, so the DMV should have them up by now.
[1] https://www.dmv.ca.gov/portal/wcm/connect/3d358211-3f0c-430e...
Would it, when the car is driving for you and you're free to putter about with a phone or laptop or book?
I always figured that this would be one of the greatest benefits of autonomous vehicles - perhaps the car isn't as aggressive as the human it's driving would be, or as quick to get "unstuck", but the human won't care too much because they're too busy on their phone. In most cases, what's a few extra minutes when you're no longer actually driving the car?
I would love to see how the self driving contraptions handle this city at rush hour.
That was nice and clean city driving in the video clips but nothing that distinguishes it from "Level 3" (human intervention may be required within ~15 seconds or so) or even "Level 2" (human intervention may be required within seconds, current state of the art).
The benefit of this is you notice where levels "poke through," e.g. Level 4 may work on a sunny day, but a small change in rain or road conditions could downgrade the system to Level 1. As time goes forward, the inner levels can be expected to radiate out.
EDIT: Never mind. The point of "Level 4" is it is competent in all reasonable operating domains.
This isn't the way it works. If a car claims it can do "Level 4 on a sunny day," it means that when you sit in the car and engage the autopilot (and it says it's safe to engage), no human intervention will be required during the course of the trip. If conditions change, the car will park itself without human assistance and wait for help (what happens next is outside of the scope). You could be sitting in the back seat, or your kids could be in the car alone on their way to school.
"Level 4 on a sunny day on some pre-certified highways" is ok. "Level 4 in San Francisco traffic" is also ok, and much harder. "Level 4 unless it starts raining and we'll deteriorate to Level 3" is Level 3, not 4.
This is the definition of autonomy levels from SAE, and they're pretty strictly defined.
Level 1: Driver must be ready to take control at any time. Automated system may include features such as Adaptive Cruise Control (ACC), Parking Assistance with automated steering, and Lane Keeping Assistance (LKA) Type II in any combination.
Level 2: The driver is obliged to detect objects and events and respond if the automated system fails to respond properly. The automated system executes accelerating, braking, and steering. The automated system can deactivate immediately upon takeover by the driver.
Level 3: Within known, limited environments (such as freeways), the driver can safely turn their attention away from driving tasks, but must still be prepared to take control when needed.
Level 4: The automated system can control the vehicle in all but a few environments such as severe weather. The driver must enable the automated system only when it is safe to do so. When enabled, driver attention is not required.
Level 5: Other than setting the destination and starting the system, no human intervention is required. The automated system can drive to any location where it is legal to drive and make its own decisions.
From the Society of Automotive Engineers
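As a toy illustration, the level summaries above can be encoded as a lookup; this is a hypothetical sketch based on the thread's wording, not the full SAE J3016 text:

```python
from enum import IntEnum

class SaeLevel(IntEnum):
    L1 = 1  # driver must be ready to take control at any time
    L2 = 2  # driver must detect objects/events and respond to failures
    L3 = 3  # attention optional within known, limited environments
    L4 = 4  # no attention required once safely enabled
    L5 = 5  # no human intervention beyond setting the destination

def attention_required(level: SaeLevel) -> bool:
    """Per the summaries above: only Levels 1-2 demand constant attention."""
    return level <= SaeLevel.L2
```

The point of the thread's argument falls out of the boundary: `attention_required` flips between Level 2 and Level 3, which is why a system that ever needs a safety driver within seconds is still Level 2 no matter how good the demo looks.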
I'm also curious how it would react to going up Mason and California, where there's a traffic light at the top of a steep hill. Last time I had to physically pull myself up via the steering wheel to see anything, and as a seasoned driver I was a bit worried.
Add to this complexity the weather conditions. Suppose the sun is shining straight at you and you need to squint and shade your eyes just to make out what the light is -- this happens to me frequently -- can the camera see the traffic light and distinguish its color clearly under such conditions?
What about when it's raining, misting or drizzling, snowing heavily, etc. and the traffic lights are these fragmented outlines that you, the human, can heuristically distinguish but a machine might not?
One last thought: suppose it's right turn on red and the first car in line is a self-driving vehicle. Can it really look left and safely determine there's enough time to beat the cross traffic? If it's highly conservative and just waits until green, there could be ten irate motorists behind it, guaranteed to honk and curse.
It's exciting technology but there are some very difficult problems to solve. I worry that if these machines can't demonstrate 110% of a human's ability to drive, they simply won't be implemented in many places except some very well-defined rigid routes that are free of problematic challenges and variations.
Well, how do you distinguish / identify a traffic light?
For me it's a combination of knowing the area, knowing what a traffic light looks like, and observing the behaviour of traffic around (mostly in front) of me (if the intersection isn't visible).
For an autonomous vehicle, they'd use the same methods plus they'd have the non-trivial added benefit of colleague robots feeding them updated information.
> Suppose the sun is shining straight at you and you need to squint ...
I'm sure the human eye + sunglasses + visor comes a poor second to what a CCD + mechanical shading + IR + (etc.) can do.
> What about when it's raining, misting or drizzling, snowing heavily, etc. and the traffic lights are these fragmented outlines that you, the human, can heuristically distinguish but a machine might not?
Interesting use of the word heuristic there. If you can determine the heuristics you are using, then an autonomous vehicle can use the same.
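To make that concrete: one crude heuristic a perception stack might start from is classifying a bright pixel's hue into a signal color. This is purely illustrative - real systems fuse map priors, shape detection, and temporal filtering, and the thresholds below are invented for the sketch, not production values:

```python
import colorsys

def classify_light_pixel(r: int, g: int, b: int):
    """Map an 8-bit RGB pixel to 'red', 'yellow', 'green', or None.

    Illustrative thresholds only. A dim or washed-out pixel (e.g. a
    glare-fragmented lamp outline) returns None rather than a guess.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v < 0.5 or s < 0.4:   # too dark or too desaturated to trust
        return None
    hue = h * 360.0
    if hue < 20 or hue > 340:
        return "red"
    if 40 <= hue <= 70:
        return "yellow"
    if 90 <= hue <= 160:
        return "green"
    return None
```

Note that the failure mode the parent describes (sun glare, heavy snow) shows up here as the `None` branch - which is exactly where the system has to fall back on the other cues you listed: knowing the area, and watching surrounding traffic.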
> One last thought: suppose it's right turn on red and first car in line is a self-driving vehicle. Can it really look left and safely determine there's enough time to beat the cross traffic? If it's highly conservative and just waits until green, there could be ten irate motorists behind it and guaranteed to honk and curse.
This, and variations, frequently come up in discussions on AVs. There's an implicit expectation that no one building these things has considered this problem (which is clearly false). There are two explicit expectations: that once a tipping point of AVs are out there, a) there'll be a regulatory push to massively accelerate adoption toward 100%, and b) inter-vehicle communication means this problem won't arise. In the short term these problems may occur, but to address your question: yes, I'd suggest a computer would be better able to predict whether there's enough time to safely turn than most humans.
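The right-on-red decision reduces to a gap-acceptance check: given the measured distance and speed of cross traffic, turn only if its time-to-arrival exceeds the time the turn takes plus a margin. All the numbers here are illustrative assumptions, not values from any deployed system:

```python
def safe_to_turn(gap_m: float, cross_speed_mps: float,
                 turn_time_s: float = 4.0, margin_s: float = 2.0) -> bool:
    """Accept the turn only if cross traffic arrives later than
    (time needed to complete the turn) + (safety margin).

    gap_m: distance to the nearest approaching cross-traffic vehicle.
    cross_speed_mps: its measured closing speed.
    """
    if cross_speed_mps <= 0:
        return True  # nothing approaching
    time_to_arrival = gap_m / cross_speed_mps
    return time_to_arrival > turn_time_s + margin_s
```

The point being: a radar/lidar range-and-velocity measurement feeds this inequality with far better precision than a human glancing left, even if the tuned margin makes the car look conservative to the drivers queued behind it.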
> I worry that if these machines can't demonstrate 110% of a human's ability ...
Which human would you pick?
Anyway, I wouldn't worry - none of these problems are intractable.
red, yellow, green, flashing red, flashing yellow,
no-turn-left red arrow, no-turn-right red arrow
You forgot flashing green, which does actually exist, at least in MA. You treat it like a green light, but it could go yellow and then red if a pedestrian presses the walk button. I couldn't find any real information on this, though. I think I picked it up from Reddit comments. Some info does suggest it's legally complicated: http://www.claimsjournal.com/news/national/2014/03/06/244965...
It's not an impossible problem to solve, but I have not heard of any efforts to create a car-to-car or car-to-road communications protocol and infrastructure that would be cross-manufacturer and/or internationally approved.
In my opinion, before such a mechanism exists, autonomous cars will only work as long as the majority of cars are driven by humans. I will change my opinion when I see a demonstration of a city (or city-like test site) traffic where all or the majority of cars are autonomous.
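For concreteness, here is a hypothetical sketch of what a minimal, manufacturer-neutral vehicle-to-vehicle broadcast might carry. The field names are invented for illustration and not drawn from any real standard; an actual cross-manufacturer protocol would also need signing, units, and versioning that are omitted here:

```python
import json

def make_status_message(vehicle_id: str, lat: float, lon: float,
                        speed_mps: float, heading_deg: float,
                        intent: str) -> str:
    """Serialize a minimal V2V status broadcast (illustrative schema)."""
    return json.dumps({
        "id": vehicle_id,
        "pos": [lat, lon],          # WGS84 latitude/longitude
        "speed_mps": speed_mps,
        "heading_deg": heading_deg,
        "intent": intent,           # e.g. "merge_left", "hold_lane"
    })
```

Even a schema this small shows why cross-manufacturer agreement is the hard part: every receiver has to interpret `intent` identically, or the negotiation the parent comments imagine can't happen.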
Nor do they have any presence I'm aware of on Github. This is in contrast to BMW, for example, who have made a number of contributions: https://github.com/bmwcarit
Anyway, just curious to what extent Cruise used (or still uses) ROS and open source software in their stack.
You might not be able to turn an aircraft carrier on a dime but when you do you've got an aircraft carrier.
GM (and the other automotive manufacturers for that matter) decided they wanted in on self driving and electric vehicles, had a few meetings, wrote a few checks and a few years later look at the result. Are Google and Tesla going to reply with similar videos?
I'm not sure what this means.
(Southern Michigan gets quite a lot of snow. Prevailing winds are westerly so lots of lake effect. Grand Rapids gets almost 2x the snow that Detroit gets. Cleveland also gets more snow than Detroit, lake effect from Erie)
Every time ... cars pass it, and dive into its lane (a typical reaction to slow moving vehicles).
Still, nice accomplishment.
Thought: Once we have 99% self driving cars it will be quite easy to convert a portion of roads to pedestrian only at times when traffic is light: bollards go up, lighting changes, cars informed to reroute.
And so you end up with posts like this trying to analyse a video frame by frame to assess the reality of the technology, and yet everyone, including the author, tries to guess where the catch is (is the green light really trustworthy? Why is the video accelerated? etc.).
I'm reasonably certain we can stop right there. It's a heavily regulated field and these things can't be sold until they've been rigorously tested. I know SV isn't known for its coziness with regulatory agencies, but in the field of safety, they'll have to play ball or take on all the liability the lawsuits will throw at them.