Why hasn't anyone come up with a self-driving snowmobile yet?
Backyard cat skiing anyone?
I think we're a long ways away from driverless vehicles that can make a trip to a remote location in variable weather conditions. I had a new car up in VT equipped with lane departure correction. Great feature on the interstate. On local roads, with snow on the sides, it kept steering out of the clear roadway in the middle and into the snow-covered roadway on the side.
How long is a long way away? Years? Decades? How many years? How many decades?
It is never going to happen without completely rebuilding every road on the way there from the gravel base on up. You think that is ever going to happen? Me neither.
Driverless cars use cameras to read the lines on the road. In anything other than perfect conditions, they don't roll. Without embedding cable in the roadway itself to guide these things, you will never see driverless cars in areas that receive any adverse weather conditions.
I agree fully self driving cars are still science fiction, but they can use a lot more than just cameras to see where the road is. Most of them have extremely detailed maps of the roads, so if they know their exact position, then they can know where the road is without even seeing it. Who knows what else they can come up with, like snow penetrating radar or something. So technically self driving cars are feasible, it's just going to take a while to make them safe and reliable.
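To make that concrete, here's a toy sketch (my own made-up example, not any carmaker's actual code) of the map idea: if the car has a good position fix, a stored centerline map can tell it where the lane is even when the paint is buried.

```python
import math

# Hypothetical pre-surveyed road centerline for one stretch of road: (x, y) in meters.
CENTERLINE = [(0.0, 0.0), (10.0, 0.5), (20.0, 1.5), (30.0, 3.0), (40.0, 5.0)]

def lateral_offset(car_x, car_y):
    """Distance from the car to the nearest mapped centerline point."""
    nearest = min(CENTERLINE, key=lambda p: math.hypot(p[0] - car_x, p[1] - car_y))
    return math.hypot(nearest[0] - car_x, nearest[1] - car_y), nearest

# Example: the car believes it is at (21.0, 3.5) and the cameras see only snow.
offset, point = lateral_offset(21.0, 3.5)
print(f"~{offset:.1f} m off the mapped centerline (nearest map point {point})")
# The hard part in real life is the position fix itself (GPS plus inertial plus
# lidar matching), especially when snow hides the landmarks the car matches against.
```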
The biggest barrier now is that the current generation of artificial intelligence isn't good at generalizing. It's very good at learning specific things, but if it encounters a variation on what it's learned, then it gets confused. So if you teach it what a pedestrian looks like and what a rabbit looks like, and then a person in a rabbit suit is walking across the street, the car might decide it's just a rabbit. The AI just lacks the common sense and depth of experience to say, "hmmm, that's too big to be a rabbit." The last two fatalities from self-driving cars happened when the AI thought a big white truck blocking the road was just the sky, and a woman crossing the street with a bicycle was just a bicycle.
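Here's a toy illustration of that forced choice (the numbers are invented; this isn't a real perception model): a classifier that only knows "pedestrian" and "rabbit" has to pick one of those two labels no matter what it's actually looking at.

```python
import math

def softmax(scores):
    """Turn raw class scores into probabilities that sum to 1."""
    total = sum(math.exp(s) for s in scores.values())
    return {label: math.exp(s) / total for label, s in scores.items()}

# Hypothetical raw scores for a fuzzy, rabbit-shaped, person-sized object:
# the fur-like texture dominates, and the size barely registers.
scores = {"pedestrian": 1.2, "rabbit": 3.4}
print(softmax(scores))  # roughly {'pedestrian': 0.10, 'rabbit': 0.90}
# Nothing in the model says "too big to be a rabbit"; there is no third option
# like "something unfamiliar, slow down" unless the engineers explicitly add one.
```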
So it's OK for the car to hit a bicycle as long as nobody is riding it?
Buses are cruising up to ski resorts in large numbers!
It also sucks that your Volvo hides the on/off button in the settings menus. I can turn that off as soon as I see something wrong on the road.
I think we will begin to see some highway capacity improvement pretty quickly from these new features. Adaptive cruise control helps reduce rubbernecking and increase throughput.
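For anyone curious, the usual textbook way to describe adaptive cruise control is a constant-time-gap follower, roughly like the sketch below (my assumption of the standard formulation, not any manufacturer's code). Holding a steady gap and matching the lead car's speed smoothly is what damps the stop-and-go waves that kill throughput.

```python
def acc_acceleration(gap_m, ego_speed, lead_speed,
                     time_gap_s=1.5, standstill_gap_m=5.0,
                     k_gap=0.25, k_speed=0.6):
    """Commanded acceleration (m/s^2) that holds a constant time gap to the lead car."""
    desired_gap = standstill_gap_m + time_gap_s * ego_speed
    gap_error = gap_m - desired_gap        # positive means we have room to spare
    speed_error = lead_speed - ego_speed   # positive means the lead car is pulling away
    return k_gap * gap_error + k_speed * speed_error

# Example: 45 m behind a car doing 29 m/s while we're doing 30 m/s -> the controller backs off.
print(acc_acceleration(gap_m=45.0, ego_speed=30.0, lead_speed=29.0))
```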
I think buses are more realistic for ski areas, except for a few ski areas near rail lines. Rail lines are very expensive to build, so they don't make sense if usage is low. I think ski areas should work harder to push buses as a way to access the resort. Most don't do anything of the sort.
So it's OK for the car to hit a bicycle as long as nobody is riding it?
Depends on what kind of bike it is.
I was kind of glossing over the details on that accident. Apparently they had tuned the software to try to reduce false positives, so that the car wouldn't slam on the brakes over phantom objects that weren't really a threat. The car detected that something was there, but it wasn't sure what it was, and decided to ignore it. I suspect if the woman had been crossing the street without the bicycle, then it would have stopped for her. If the bicycle had been parked in the middle of the road by itself, then the car would have driven around it. It would have also gone around her if she'd been riding the bike. But a woman pushing a bike across the street in the dark wasn't recognizable to the car, since it couldn't generalize from the objects it knew to understand a novel situation it wasn't trained on.
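In other words the decision probably came down to something like a confidence threshold, along the lines of this toy rule (all numbers invented for illustration): raise the bar and you brake less for phantom objects, but an unfamiliar real hazard can fall under the bar too.

```python
def should_brake(detection_confidence, threshold):
    """Brake only if the tracker is confident enough that the object is a real hazard."""
    return detection_confidence >= threshold

plastic_bag = 0.45            # hypothetical low-confidence roadside clutter
woman_pushing_bicycle = 0.55  # hypothetical: a real hazard, but an unfamiliar shape

for threshold in (0.4, 0.7):
    bag = "brake" if should_brake(plastic_bag, threshold) else "ignore"
    ped = "brake" if should_brake(woman_pushing_bicycle, threshold) else "ignore"
    print(f"threshold {threshold}: bag -> {bag}, pedestrian with bike -> {ped}")
# At the looser threshold the car brakes for both. At the stricter one it stops
# braking for the clutter, and it also stops braking for the hazard it couldn't
# classify cleanly. That's the trade-off being described above.
```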
I'm not aware of that part about reducing false positives.
But if that's true, that's truly disturbing. They're testing their software on the general public, using real people as guinea pigs!
They don't allow that in medical research. But I guess that's acceptable for computer software?
This is the only post so far that doesn't sound like pie-in-the-sky fantasy.
And the people who are trying to use the current "self driving" features on snowy roads... you folks scare me.
I don't mean to make light of any accidents or injuries caused by driverless technology. But it does interest me that these accidents become front-page news. The standard by which driverless technology should be judged is that of human driving. So far, every statistic that I've seen has suggested that while we still have a long way to go, driverless technology already outperforms humans in testing. In general, humans are bad drivers.
I drive all the time (don't have a choice) and I freely admit that I'm an average driver (skill-wise). People tend to want to judge driverless technology against the standard of perfect. And we all know that human drivers aren't perfect. My mother-in-law is a sweet, kind, caring person. And if you ask me, she's such a bad driver that she should have her license revoked. She consistently drives 10+ mph below the speed limit. Comes to a stop at most intersections regardless of whether there is a traffic light or stop sign. Bottom line, she's dangerous on the road. Give me today's (not even future versions of) driverless technology over her any day of the week.
This attitude is what terrifies me.
So driverless cars are preferable to cautious drivers who stay below the speed limit and follow all traffic directions? WOW!
Stopping at intersections with no stop sign or traffic light where you have the right of way is not a "cautious" driver following traffic directions. That's outright dangerous.
+1. Add to that drivers with the right-of-way who simply decide to stop to let opposing traffic make left turns, for no logical reason.
Edit: Don't get me wrong, I don't think it's acceptable either for driverless cars to have anything less than a near-perfect standard of safety and accuracy before they're allowed. Being "as good as humans" is not good enough.