Japanese automakers Nissan and Honda say they plan to share components for electric vehicles like batteries and jointly research software for autonomous driving.
Am I an old fart for being concerned about future generations growing up with self driving cars?
I guess I grew up with a lot of automated things, but I feel like driving is something that people should be able to do for safety reasons.
Have you not met “people”? People drive like absolute idiots. I had to emergency brake just yesterday because the woman driving the car to my right was drifting into my lane at highway speeds while she negotiated a large 24oz can of some beverage in her left driving hand while simultaneously holding a cigarette. Her right hand and both eyes were busy doing something on her cell phone.
There is very little safe about people driving.
Yeah, recently I’ve started paying attention to what other drivers are doing while stopped at intersections. The amount of people using their cell phones behind the wheel astonishes me.
Yeah people suck at driving, but they’ll just get worse while the automation is being ironed out and eventually they won’t know how to drive at all. What about an emergency situation that the automated cars and/or roads aren’t programmed for?
It’s almost like forcing a majority of the population to drive a 2 ton vehicle at over 60mph was a bad idea that has no real fix. Maybe we should make some changes so cars aren’t the default mode of transportation anymore?
I’m a driving instructor and I believe automated roads would be a godsend. I’d actually be happy the day I don’t have to do my job, not because I hate it - I actually love my job - but because automated roads would be so much safer overall.
I guess that’s fair. Machines would be much less accident prone than humans, but you can’t automate everything on the road (e.g. people, bikes, non-automated vehicles). People are going to have to be able to know how to get out of situations manually. What about emergencies where you have to do something that the automated roads aren’t programmed for?
My dad said the same thing about GPS vs. the ability to read a map.
Nearly drove us off the road a couple of times while trying to read a map lol
Haha funny comparison. I love GPS, but when it doesn’t work it’s pretty low stakes. The mechanism in which it works is completely different too.
I don’t know about your city, but I trust technology a lot more than the average driver. At least technology can tell a red light from a green light. I nearly got hit in broad daylight by the driver of a Ford mega-truck who thought the small green bicycle symbol was his cue to ignore the massive red “no left turn” signal across a protected bike lane. :P
I agree. There’s less margin for error, but it leaves people who depend on automation vulnerable. I just imagine lots of growing pains before we get to the ideal state.
I don’t. Technology can be subject to glitches, bugs, hacking, deciding to plow right through pedestrians (hello Tesla!), etc.
While the case can be made that human drivers are worse at reaction time and paying attention, at least a “dumb” car can’t be hacked, won’t be driven off the road due to a bug, won’t try to knock people over itself without stopping, etc.
A human, when they catch these things happening, can correct them (even if it is caused by them). But if a computer develops a fatal fault like that, or is hijacked, it cannot.
EDIT: It seems like this community is full of AI techbro yes-men. Any criticism or critical analysis of their ideas seems to be met with downvotes, but I’ve yet to get a reply justifying how what I said is wrong.
Plenty of dumb cars get recalls all the time for shitty parts or design. Remember that Prius with the brakes that would just decide to stop working?
Self-driving cars are no less prone to mechanical failures.
Yeah, but you said that already
No, I was talking about software issues.
And if you know that both non-self-driving cars and self-driving cars are both equally prone to mechanical issues, why bring it up as a counterpoint?
It wasn’t a counterpoint you silly goose, I was agreeing with you
Yes, you probably are. Please don’t forget that currently available technology constantly improves, and that we actually don’t see many good examples of self-driving cars yet - the most prominent displays are from Tesla, and they arguably build the worst cars we’ve seen since Ford came up with the assembly line.
The technology used in autonomous vehicles, e.g. sensors, has been used in safety-critical applications for decades in other contexts, and a machine is capable of completely different reaction times. Also, if autonomous vehicles cooperate in traffic, stick to their programmed behavior, observe traffic rules, etc., you will get less reckless driving, with traffic flow becoming more deterministic. These benefits will increase further once self-driving cars don’t have to share the road with human drivers.
I would always trust a well-engineered, self-driving car more than one driven by a human.
Disclaimer: I used to work on these things in a research lab. Also, we’re not quite there yet, so please have a little patience.
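The reaction-time point is easy to quantify with a back-of-envelope sketch. The numbers below are illustrative assumptions, not measured figures: human perception-reaction time is commonly quoted around 1.0-1.5 s, while a sensor-to-brake pipeline can plausibly react an order of magnitude faster.

```python
# Rough comparison of reaction distance: how far a car travels before
# braking even begins. Reaction times are illustrative assumptions.

def reaction_distance_m(speed_kmh: float, reaction_time_s: float) -> float:
    """Distance (in meters) covered during the reaction time."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * reaction_time_s

human = reaction_distance_m(100, 1.5)    # assumed human reaction time
machine = reaction_distance_m(100, 0.1)  # assumed machine reaction time
print(round(human, 1), round(machine, 1))  # prints: 41.7 2.8
```

At highway speed, even this modest difference in reaction time translates into tens of meters of extra travel before the brakes are applied, which is the core of the safety argument.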
What about things on the road that are not automated? There will be situations where a machine’s ethics might override a human driver’s ethics. It would be good for us to be able to override the system and know how to safely drive in emergencies.
It’s not about everything being automated. We also have to differentiate between early incarnations of autonomous vehicles and the desired, final state.
A manual override will of course be part of early models for the foreseeable future, but the overall goal is for the machine to make better decisions than a human could.
I don’t have any quarrel with life or death decisions being made by a machine if they have been made according to the morals and ethics of the people who designed the machine, and with the objective data that was available to the system at the time, which is often better than what would be available to a human in the same situation.
It’s the interpretation of said data that is still not fully there yet, and we humans will have to come to terms with the fact that a machine might indeed kill another human being in a situation where acting any different might cause more harm.
I don’t subscribe to the notion that a machine’s decision is always worth less than the decision of another entity participating in the same situation, just because it so happens that the second entity happens to be a human being.
Not having control of a vehicle in a life or death situation is terrifying to me. I probably trust my driving more than most, and trust my decisions over those decided by a corporation beholden to rich investors.
I’m worried about the growing pains before we get to the ideal state, which would have to be full autonomy of everything on the road, so that nothing entering the space can collide with anything else - or if it does, the collision isn’t dangerous.
But then guess what? People will be able to pay for the fast lane. Or a faster rate of speed. You make a whole economy out of trying to get to work, trying to go to a wedding, trying to go anywhere. I don’t trust it, but I get it.
Are you terrified of riding on trains or flying in planes? You don’t have control of the vehicle in those cases either and those are both considered far FAR safer than you driving a car.
Not having control of the plane in a life or death situation is terrifying. Train not so much.
Driving a car can’t be compared to those two. So much more traffic and people/objects in the way. Only comparable to something like a bus in a designated bus lane.
I think you’re falling into a bit of a trap here: perfect is the enemy of good. Not everything has to be automated; instead of growing pains, there can also be gains.
Remember, we are currently aiming to get these vehicles on the road, alongside regular drivers. They use sensors and computer vision to read street signs, detect people etc., all with the reaction speed of a machine. What if the in-between product is simply a better driver with faster reaction times? That is the current goal, really - no one wants to automate everything, simply because that wouldn’t be feasible anytime soon.
Yes, again, we’re not there yet and these things are far from perfect. But let’s first just aim to get them good enough, and then maybe just a little better than your average driver.
As for your proposed business model: we have capable drivers now, so why don’t these business models exist already? Why is there no fast lane that lets me pay to get to my destination faster? What would driverless-car technology introduce that would enable these regulations?
You’ve misunderstood me and we’re getting off topic. The main point is that “good” is where the eggs are cracked to get to a “great” omelette.
We have toll roads today. You pay for faster travel. Vehicle automation introduces much easier external control over your vehicle and a lot more variables that can be controlled.