Forum Discussion
- rlw999Explorer
Terryallan wrote:
rlw999 wrote:
Terryallan wrote:
What happens when the road isn't where it is supposed to be? Like in a construction zone where the road was moved over a few feet. I know. The car runs into the guard rail. Seen it happen.
Automated vehicles don't blindly follow a pre-mapped route down to the foot; they have cameras, radar, and lidar to help them see.
Might want to tell that to Tesla. I saw what happens in a construction zone where the barrier is moved over 2 feet. Tesla upside down.
Tesla's "autopilot", despite what their marketing says, is not a self-driving system. It's a driver assistance feature, but the driver still needs to pay attention to the road.
Tesla also saved money by not using LIDAR to detect obstacles, while most of the truck-based systems do use it (mostly because a $5,000 LIDAR sensor is a lot more affordable on a $200K truck than on a $50K car, especially when it's replacing a $75K/year driver).
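To put rough numbers on that, here is a back-of-the-envelope check using the post's own round figures (all illustrative):

```python
# Back-of-the-envelope on the (round, illustrative) numbers above.
lidar_cost = 5_000       # one-time sensor cost, per the post above
car_price = 50_000
truck_price = 200_000
driver_salary = 75_000   # per year

print(f"LIDAR as share of car price:   {lidar_cost / car_price:.0%}")    # 10%
print(f"LIDAR as share of truck price: {lidar_cost / truck_price:.1%}")  # 2.5%
print(f"Payback vs. one driver-year:   {lidar_cost / driver_salary:.2f} years")  # ~0.07
```

At those figures the sensor pays for itself in under a month of saved driver salary, which is the economic point being made.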
- TerryallanExplorer II
rlw999 wrote:
Terryallan wrote:
What happens when the road isn't where it is supposed to be? Like in a construction zone where the road was moved over a few feet. I know. The car runs into the guard rail. Seen it happen.
Automated vehicles don't blindly follow a pre-mapped route down to the foot; they have cameras, radar, and lidar to help them see.
Might want to tell that to Tesla. I saw what happens in a construction zone where the barrier is moved over 2 feet. Tesla upside down.
Terryallan wrote:
What happens when the road isn't where it is supposed to be? Like in a construction zone where the road was moved over a few feet. I know. The car runs into the guard rail. Seen it happen.
The autodrive systems currently being developed, like GM Super Cruise and Tesla Autopilot/FSD, use AI to make some decisions. We have FSD on our Tesla. Not the limited test beta that works in cities (there are fewer than 1,000 test cars with that test software at present) but the standard FSD available on any Tesla as an option. If FSD (Full Self-Driving) is on, on a road or highway it watches the lines and makes decisions accordingly. If the lines disappear it tries to do the same thing using the edges of the road. If at some point it can't do that, it notifies the driver that he or she must take over immediately. But these are level 2 systems, meaning a driver is present and must be able to take over at any time. The Tesla system requires the driver to have a hand on the wheel and actually checks every minute or so.
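A rough sketch of that fallback cascade might look like the following. This is my own simplification of the behavior described, not Tesla's actual control code:

```python
# Simplified sketch of a level-2 lane-keeping fallback cascade.
# Purely illustrative; not Tesla's actual control logic.
from typing import Optional, Tuple

def steering_reference(lane_lines: Optional[Tuple[float, float]],
                       road_edges: Optional[Tuple[float, float]]) -> Optional[float]:
    """Return a lateral target (lane center), degrading gracefully."""
    if lane_lines is not None:
        left, right = lane_lines      # normal case: follow the painted lines
        return (left + right) / 2
    if road_edges is not None:
        left, right = road_edges      # fallback: infer a lane from the road edges
        return (left + right) / 2
    return None                       # nothing usable: hand control back to the human

# Lines worn away but road edges still visible -> keep driving, centered.
print(steering_reference(None, (-3.5, 3.5)))        # 0.0
# Neither visible -> a level-2 system must demand an immediate takeover.
if steering_reference(None, None) is None:
    print("TAKE OVER IMMEDIATELY")
```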
But for the most part it works pretty well and interventions are seldom needed. It navigates and makes the turns for all the on-ramps and off-ramps if you have a destination programmed in, makes all the necessary lane changes, stops at all the stop signs and stop lights, etc. I find it a little paranoid and slow around crosswalks and bicycles, but maybe that is a good thing.
To give you an idea of its AI capabilities, the RV park where we store our class A decided to add some home-brew stop signs last year. Basically a stop sign on a pole set in a tire rim. I tend to drive the car with TACC (traffic-aware cruise control) on all the time, so just in case I miss a stop sign or stop light the car (which my wife named JARVIS) will see it and stop. Anyway, not expecting a non-compliant stop sign on a stick on a curb, I would have driven right through this one. But JARVIS saw it and came to a gentle stop. We were yakking and trying to figure out why the car stopped, and then saw the sign. Well, at some point later the park took them out because technically they weren't really compliant and people complained. Anyway, for at least the next half dozen times we noticed that JARVIS still slowed down to almost a stop, looking for the sign that he knew was there before. He finally adapted and now he doesn't stop anymore.
Here is a picture of that sign.
And this is the display the driver sees if there is a stop sign.
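That slow-to-forget behavior is consistent with the car keeping a mapped "prior" for the sign that fades each time the sign fails to reappear. I have no insight into Tesla's real map-update pipeline; this is a purely hypothetical sketch of that idea:

```python
# Hypothetical sketch of how a mapped stop-sign "memory" might fade.
# Not Tesla's actual pipeline; it only illustrates the slow-to-forget
# behavior described in the story above.

confidence = 1.0      # the sign was confidently observed at this spot
SEEN_GAIN = 0.5       # pull confidence back toward 1.0 on re-detection
MISS_DECAY = 0.8      # shrink confidence on each pass with no sign seen
ACT_THRESHOLD = 0.2   # below this, stop braking for the "ghost" sign

for trip in range(1, 10):
    sign_detected = False                          # the park removed the sign
    if sign_detected:
        confidence += SEEN_GAIN * (1.0 - confidence)
    else:
        confidence *= MISS_DECAY
    action = "slow almost to a stop" if confidence > ACT_THRESHOLD else "proceed normally"
    print(f"trip {trip}: confidence={confidence:.2f} -> {action}")
```

With these made-up constants the car keeps slowing for about seven passes before it "adapts," roughly matching the half dozen trips in the anecdote.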
We don't use the Summon feature much as it is a little slow, but the times we have used it, it has worked okay. Frankly I can walk to the car faster than summoning it from the door at Walmart, but the couple of times we have used it for kicks it has done okay: backing out of its parking spot, driving up the correct side of the parking lot lane, stopping for pedestrians, etc. Like I say, we don't use it much. People get freaked out when they see there is no one in the car while it's driving around the parking lot.
We do use the autopark-in-the-garage feature when it's tight in there because of projects, etc. Pull up to the garage door, the garage door opens, select autopark, get out of the car; as soon as the car door closes the car goes in gear and drives into the garage. It stops and then closes the garage door. I walk in the main house door and don't have to worry about only being able to open my car door halfway in the garage. Easy peasy.
Some manufacturers like Tesla are going real minimalist with their interiors, getting people used to not having buttons or knobs to play with. Most everything can be done by voice. I can literally say to the car, "my a$$ is cold," and it turns on my seat heater. I think my wife's car has like two buttons, and both are on the steering wheel. Even the brake pedal rarely gets used, as it is pretty much all one-pedal driving.
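A toy version of that utterance-to-action mapping might look like this; the phrases and action names are hypothetical, just to show the idea behind buttonless cabin controls:

```python
# Toy utterance-to-action matcher. Hypothetical phrases and action
# names, not any manufacturer's actual voice-command API.
INTENTS = {
    "seat_heater_on": ["my a$$ is cold", "warm my seat", "seat heater on"],
    "defrost_on":     ["windshield is foggy", "defrost on"],
}

def match_intent(utterance: str) -> str:
    text = utterance.lower()
    for action, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return action
    return "no_match"

print(match_intent("Jarvis, my a$$ is cold"))   # seat_heater_on
```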
Meet Jarvis. :)
Anyway, I'm sure there will be lots of AI developments in the next few years.
- rlw999Explorer
pnichols wrote:
Well ... there is one big and very important difference between a human truck driver and a technology truck driver: A human truck driver has an ultimately selfish reason to not want to get into, or cause, an accident involving what he's riding in. It's called self preservation.
If that were the only motivation that humans have, that might be comforting. But that same driver is also under pressure to get the job done even if he's tired or feeling ill; maybe he just drank half a bottle of cough medicine to help him finish the run. And when he's approaching his mandatory break, he may feel compelled to drive a little faster so he can get to the depot while he's still legally allowed to drive... or maybe he just wants to get home to his family faster. Driving 10% over the speed limit gets him home almost an hour earlier on a 10-hour drive.
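The arithmetic on that last claim roughly checks out:

```python
# Quick check of the "10% over the limit on a 10-hour drive" claim.
trip_hours_at_limit = 10
hours_at_10_pct_over = trip_hours_at_limit / 1.10   # same distance, 1.1x speed
print(f"{trip_hours_at_limit - hours_at_10_pct_over:.2f} hours saved")  # ~0.91, just under an hour
```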
He's also prone to distraction, perhaps paying more attention to his MP3 player than to the road, or getting into an argument with his wife on the phone. Or maybe he's one of those drivers who think they can watch a movie while driving.
And even humans are subject to malfunction; drivers have had accidents during medical incidents. It's much more expensive to have a redundant driver than to build redundancy into control systems.
A self-driving car doesn't need a sense of self-preservation; it just needs to know how to stay on the road and avoid obstacles.
- Quite the contrary. The truck and software people know it is the end of days if the truck does something that seems preventable. These are rational, professional people deciding as a group what to program.
On the contrary, I have seen many trucks driving needlessly aggressively.
Much lower cost for the automated truck to just pull over in bad weather and wait. Every winter we see groups of big rigs in a big pile-up. Why don't they just pull over and wait in the cab for two days? Risks are taken either way.
- Grit_dogNavigator
mkirsch wrote:
I place this on my "Things to Worry About and Lose Sleep Over" list just below Murder Hornets.
Best post of the month!!!
- pnicholsExplorer II
Hammerboy wrote:
pnichols wrote:
Scary is right!!
Imagine how full of integrated circuits, complex mechanical components, and communications equipment (for Internet and/or satellite connectivity) those trucks will be. All of that can, and will, fail here and there over time.
I hope that transportation regulations require those trucks to be clearly marked - including distinctive night lighting - so that the rest of us can stay well away from them on the highways.
P.S. Maybe I spent too many years working in the integrated circuits industry, and too many hours watching those cable reality shows about big-rig accident disasters in Alaska - most of which have nothing to do with human error, but can be blamed on 80,000 lbs. of freight interacting with the laws of physics.
Someday in the near future we will think it's scary when a human is behind the wheel. "What if the driver falls asleep?" "What if the driver is not paying attention or has a medical problem?"
Dan
Well ... there is one big and very important difference between a human truck driver and a technology truck driver: A human truck driver has an ultimately selfish reason to not want to get into, or cause, an accident involving what he's riding in. It's called self preservation.
I don't think a bundle of technology is self-aware enough to be concerned with preserving itself. Software, integrated circuits, engine control modules, accelerometers, braking systems, tire pressure monitoring systems, steering mechanisms, pattern recognition image capture systems, satellite/Internet antennas, etc., etc. ... couldn't care less about what happens to themselves as a result of "not paying attention" or "errors in judgement" that they might make.
- JIMNLINExplorer III
rlw999 wrote:
JIMNLIN wrote:
I doubt the warehouse/customer is going to spend millions of bucks setting up a system to handle drivers free trucks.
They will when they own the warehouses and the trucks -- like Amazon, Walmart, etc. And when other warehouses have to pay a premium to send a human-operated truck to Amazon's warehouses, since their drivers get queued up behind a driverless truck that doesn't mind waiting 10 hours to unload, they'll start to make accommodations for the driverless trucks. That could be as simple as a parking area: driverless trucks come in and park, and wait there for a human driver to take over.
Probably for those mega warehouses... but not for small customers with 2-3 docks, or no docks at all.
I could see a mega outfit like Walmart/Amazon/etc. using driverless trucks running between their large regional warehouses.
I don't see truckers losing their jobs any time soon.
- mkirschNomad II
I place this on my "Things to Worry About and Lose Sleep Over" list just below Murder Hornets.
- wapiticountryExplorer
valhalla360 wrote:
wapiticountry wrote:
One of the biggest hurdles to automated driving is ethical decision trees. What will the programming be when the vehicle is faced with multiple bad outcomes? Will it be programmed to protect the vehicle at all costs, keeping it out of the ditch and running into a child instead? Will it be programmed to go the other way and veer into the ditch, missing what it thought was a child but was actually a bag of trash in the middle of the highway, possibly careening out of control and crashing into homes, businesses, etc.? There is a lot of information available on this problem, and it is scary to realize there may actually be computer code that makes the decision to deliberately kill you in an effort to protect someone else.
And what happens when a truck driver makes the same mistakes?
Unless Asimov's 3 laws blow up and the trucks decide we are better off locked in our houses, it's not really a big worry.
When the truck driver makes a mistake it is an accident and liability insurance takes over. But when the computer makes an actual, provable decision to run over a 65-year-old instead of a 10-year-old, because the programming says the life of a young child is more valuable than that of a senior citizen, is that still an accident? I don't know, but I do know it will kind of suck to be at the bottom of the decision tree.
There is much more to automated vehicles than just letting them loose. The programming will require making determinations that will ultimately result in deaths and injuries, and those determinations will be etched onto those circuit boards for the world to see. I am not sure that auto makers or individual programmers should be allowed to make the ultimate decision on who should live and who should die when an automated vehicle is faced with a no-win scenario.
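For what it's worth, the trade-off being described really can be written down in a few lines of code, which is part of what makes it unsettling. This is a deliberately crude, entirely hypothetical sketch; I know of no real system that uses weights like these, and whether anyone should is exactly the policy question being raised:

```python
# Deliberately crude, entirely hypothetical sketch of the kind of
# hard-coded trade-off being debated above. No real system is known
# (to me) to use weights like these; that is the policy question.

def choose_trajectory(options):
    """Pick the trajectory the programmers scored as least harmful."""
    return min(options, key=lambda option: option["harm_cost"])

options = [
    {"name": "stay in lane",    "harm_cost": 9.0},  # likely strikes the pedestrian
    {"name": "swerve to ditch", "harm_cost": 4.0},  # risks the occupants and property
]
print(choose_trajectory(options)["name"])   # swerve to ditch
```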