Car Autonomy?


23/April/2018


Car Autonomy vs. (almost) perfectly educated, experienced, living human drivers?

Or, even better, vs.


The Syncro Pilots? 
(Scroll down to #5, please: "The Syncro Driver's profile")



Are you crazy?



By  on April 22, 2018
(article excerpt)

Thanks to the incredibly lax and voluntary guidelines outlined by the National Highway Traffic Safety Administration, automakers have had free rein to develop and test autonomous technology as they see fit. Meanwhile, the majority of states have seemed eager to welcome companies to their neck of the woods with a minimum of hassle. But things are beginning to change, as a handful of high-profile accidents force public officials to question whether the current approach to self-driving cars is the correct one.

The House of Representatives has already passed the SELF DRIVE Act. But its bipartisan companion piece, the AV START Act, has been hung up in the Senate for months now. The intent of the legislation is to remove potential barriers for autonomous development and fast-track the implementation of self-driving technology. But a handful of legislators and consumer advocacy groups have claimed AV START doesn’t place a strong enough emphasis on safety and cybersecurity. Interesting, considering SELF DRIVE appeared to be less hard on manufacturers and passed with overwhelming support.

Of course, it also passed before the one-two punch of vehicular fatalities in California and Arizona from earlier this year. Now some policymakers are admitting they probably don’t understand the technology as well as they should, and are becoming dubious that automakers can deliver on the multitude of promises being made. But the fact remains that some manner of legal framework needs to be established for autonomous vehicles, because it’s currently a bit of a confused free-for-all.

“It was not an issue that I knew a whole lot about, and I was just bombarded by all sides,” Indiana Republican Senator Michael Crider, who oversaw the state’s attempt to introduce new autonomous regulation, told Automotive News. “I’m sick of the whole topic.”

That’s an issue with a lot of legislators. They aren’t technical experts and are primarily being educated by the very groups that are advancing this technology and likely have a strong bias to have things their way. Meanwhile, the automotive and tech industries don’t want the government to pass laws that effectively neuter developments they’ve poured billions of dollars into already.

“There’s not a lot of trust [among the companies],” explained Crider. “They all have spent a lot of money to develop their technology.”

However, some public servants are feeling taken advantage of. Indiana Republican State Representative Ed Soliday, who authored the state’s failed self-driving car bill, said groups like the Self-Driving Coalition for Safer Streets and Alliance of Automobile Manufacturers were basically trying to put one over on government officials.

“They basically treated us like we were stupid,” Soliday said. “It’s a very frustrating experience. They need to change their attitude.”

With opposition to AV START growing, at least in its present form, automotive groups may be forced to make concessions soon. Will Wallace, Consumers Union’s senior policy analyst on self-driving, said manufacturers need to be more open about the level of uncertainty surrounding autonomous development. The group recently took Democratic Senator Richard Blumenthal (Connecticut) for a joyride in a Cadillac CT6 with its Super Cruise and Tesla Model 3 with Autopilot to illustrate how present-day semi-autonomous driving systems function on public roads.

Blumenthal remarked the experience was “frightening.” He also issued a statement in response to the fatal incident involving an Uber vehicle and a pedestrian in Tempe, Arizona, in March. “This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America’s roads,” Blumenthal said. “Congress must take concrete steps to strengthen the AV START Act with the kind of safeguards that will prevent future fatalities. In our haste to enable innovation, we cannot forget basic safety.”

In the same month, a group of 27 individuals representing bicycle safety, pedestrians, environmentalists, law enforcement, and the disabled community submitted a letter to Senators Mitch McConnell and Chuck Schumer — urging them to reconsider the deregulation on autonomous vehicle development that the AV START Act would ultimately authorize. Meanwhile, other consumer advocacy groups are pushing for assurances that driver data won’t be misused as vehicles become more connected.

“Do your homework,” said Soliday. “Everybody’s beginning to understand there’s a lot of hyperbole in the vision casting for autonomous vehicles.”


[Image: Ford Motor Company]
The Truth About Cars



Please watch:




(Yes, there is always an excuse.)

Yes, artificial intelligence is for humanity.

But using individuals, or whole societies, as the victims of an early adoption is not.

The Syncro Heresy


15/May/2018

Is there any case in which a Syncro pilot (or just a Syncro driver) would crash under these circumstances?





23/May/2018


The automation of a potential killer.

By  on May 23, 2018
Mobileye, the Israeli company that supplies camera-based driver assist technology to a host of automakers, just received a black eye.

While one of its Ford Fusion Hybrid testbeds cruised through the streets of Jerusalem to show off its autonomous driving abilities, the sedan, equipped with 12 cameras (three of them forward facing), four advanced EyeQ4 chips, and a television audience, drove merrily through a red light.

It’s technically not the car’s fault, Mobileye said. It’s the TV crew’s.

According to The Detroit News, the car was outfitted with news cameras from Israel’s Channel 10 before setting out for its journey. A safety driver sat behind the wheel. Despite an otherwise uneventful trip, the media event saw the Fusion sail through a red light without slowing or stopping. (See the video here.)

Technological hiccups, as we’ve seen, are par for the course in this wild and woolly era of early autonomous vehicle testing.

So what caused the test car, which eschews radar and lidar for a suite of cameras and Intel-developed chips, to ignore what it saw? Electromagnetic interference from the TV crew’s onboard cameras (from the cameras’ wireless transmitters, specifically) interfered with the signal from the traffic light transponder, Mobileye CEO Amnon Shashua claims. The cameras saw the red light, but that didn’t jibe with the transponder signals received by the car. Thus, the car ignored the cameras.

“It was a very unique situation,” he said in an interview. “We’d never anticipated something like this.”

Shashua claims his company has since fixed the problem, but without saying exactly how. He did add that Mobileye would better shield the car’s computers from electromagnetic interference in the future. The company, acquired by Intel for a ridiculous sum last year, unveiled its Level 4 autonomous vehicle in January.

Just last week, Mobileye secured a contract with an unnamed European manufacturer for its Level 3 “semi-autonomous” driving technology. According to Reuters, the company’s tech will appear in 8 million of that automaker’s vehicles over the life of the contract. Mobileye already has partnerships with General Motors, Fiat Chrysler, and many other automakers.

As for the Fusions and their Level 4 technology, Mobileye says those systems won’t be commercially available until 2021. In the past, Shashua has said that full autonomy requires a combination of all types of imaging systems: cameras, radar, and lidar.
(article excerpt)
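The failure mode described above — two sensing channels disagreeing, with the software trusting the wrong one — can be sketched in a few lines. This is purely a hypothetical illustration, not Mobileye's actual logic; the function names and readings are invented for the example:

```python
# Hypothetical sketch of traffic-light arbitration between two sources
# that disagree (camera vs. infrastructure transponder).

def light_state(camera_reading, transponder_reading):
    """Naive arbitration: when the sources disagree, trust the transponder."""
    if camera_reading != transponder_reading:
        return transponder_reading  # the camera gets overruled
    return camera_reading

def light_state_safe(camera_reading, transponder_reading):
    """Conservative arbitration: if either source reports red, treat it as red."""
    if "red" in (camera_reading, transponder_reading):
        return "red"
    return camera_reading

# With a garbled transponder signal and a camera that correctly sees red:
assert light_state("red", "green") == "green"      # car sails through
assert light_state_safe("red", "green") == "red"   # car stops
```

The point of the sketch: whichever channel is designated the tie-breaker becomes a single point of failure, which is why a conservative "worst reading wins" policy is the usual safety engineering instinct.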



Uber (alles?):

"Uber’s lidar supplier blames the company’s software — a claim backed by an anonymous source who says the Volvo did see the victim, but chose to do nothing."

By  on May 8, 2018
The fatal collision between an autonomous Volvo XC90 operated by Uber Technologies and 49-year-old Elaine Herzberg in March could have been prevented, had the vehicle’s software not dismissed what its sensors saw.

That’s what two sources briefed on the issue told The Information, but Uber isn’t divulging what led to the Tempe, Arizona collision. What it will admit to, however, is the hiring of a former National Transportation Safety Board chair to examine the safety of its self-driving vehicle program.

Uber suspended testing following the March crash. In the aftermath, a video showing the seconds leading up to the impact revealed a vehicle that didn’t react to the victim crossing its path on a darkened road. The Volvo’s headlights pick up Herzberg an instant before the collision, but it’s the forward-facing radar and 360-degree rooftop lidar unit that should have identified the woman as an object to be avoided.

Lidar supplier Velodyne denied any flaw in its product following the fatal crash.

Blame the software, the sources claim. Uber reportedly tuned its software to aggressively ignore what the sensors determined to be “false positives,” thus ensuring a smooth ride with fewer unnecessary course corrections or brake applications. An errant plastic shopping bag counts as a false positive. As far as the car’s software was concerned, Herzberg was just such an object, the sources stated.

Uber declined to comment on these claims. On Monday, the company announced the hiring of a safety expert to oversee its autonomous vehicle program.

“We have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture,” Uber said. “Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”
(article excerpt)
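The "false positive" tuning the sources describe amounts to a single threshold that trades ride smoothness against safety. A minimal sketch — hypothetical, not Uber's actual code, with invented names and confidence values — of how moving that threshold changes the outcome:

```python
# Hypothetical illustration of a "false positive" filter on obstacle
# detections: brake only for objects scored above a confidence threshold.

def should_brake(detections, ignore_threshold):
    """Return True if any detection's confidence exceeds the filter threshold."""
    return any(conf > ignore_threshold for _, conf in detections)

# A pedestrian mis-scored close to the confidence of an errant plastic bag:
frame = [("plastic_bag", 0.30), ("pedestrian", 0.45)]

assert should_brake(frame, ignore_threshold=0.25) is True   # cautious tuning: brakes
assert should_brake(frame, ignore_threshold=0.50) is False  # aggressive tuning: ignores both
```

Raising the threshold makes the ride smoother (fewer phantom brake events) at the cost of discarding genuine obstacles that the classifier happened to score low — exactly the trade-off the sources allege.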


28/May/2018

Autopiloted Tesla self-crashed in Greece (report from Facebook):


"... guys this is it. The road trip is over. I'm sorry.
Vehicle was engaged on Autopilot at 120 km/h. Car suddenly veered right without warning and crashed into the centre median (edit: divider at the exit fork). Both wheels (edit: one wheel) completely shattered, my door wouldn't even open correctly. I'm unharmed.
Update 00:29 - policeman responding on scene is a fan of the Model 3, go figure. I'm just stating the facts in my post. My insurance is third-party only, which means I will receive no compensation for this collision. I am now calling a tow truck driver who will tow the car to Thessaloniki as it is not drivable. 
I will make further plans, most likely to repatriate the car back to San Francisco from there. I'm trying to stay positive guys, I'm so lucky to have had this opportunity to represent the EV community and movement, and so unlucky to have had this happen to me after driving without an issue through 25 countries.

Update 02:47 - 

thank you very much to the local police for their help. The vehicle has been towed to a secure yard. Like I said in the previous update, it will be towed to Thessaloniki in the morning. 

Further arrangements about the car will be made there. The police also drove me to a local hotel (where I am now). 

My left hand was grasping the bottom of the wheel immediately before the collision. The vehicle jerked to the right (seemingly in an attempt to take the exit) with great strength and did so very suddenly, which was the immediate cause of the collision of the front left of the car with the divider. 

I have prepared a draft statement regarding the incident and will release it in the morning after I have had a chance to sleep and calm down. 

I am again disgusted by the various negative comments and personal attacks that have surfaced on the various Tesla Facebook groups, including suggestions that I am bringing the car back to the US to commit insurance fraud by pretending the collision happened there. 

Other comments suggest that this is not a big loss for me as I'm rich and that I've made enough money on this trip to pay for the entire car. 

This trip has been about seeing the world, meeting people along the way, and giving them the opportunity to see and experience this beautiful car. I understand the critics saying that at times that's not what this trip has appeared to be, but at the end of the day the trip must be taken and judged as a whole - a trip that I continue to stand by and am proud of."






30/May/2018

The Frantic Autopilots!
By  on May 30, 2018
Image: Laguna Beach PD, via Twitter
This won’t help our Pravda rating.

Police in Laguna Beach, California told the “media” that the driver of a Tesla Model S that collided with a parked Ford Police Interceptor Utility on Tuesday was operating in Autopilot mode. At least, that’s the driver’s claim.

Images released by Laguna Beach PD reveal a somewhat glancing rear impact, as it seems the police cruiser only slightly intruded into the driving lane. The cruiser, then unoccupied, was totalled, while the insurance company — if past collisions are any indicator — will probably declare the Tesla a write-off.

Right now, there’s no confirmation that Autosteer and traffic-aware cruise control were enabled on the Tesla.

If this turns out to be a false claim, you’ll see an update here. If it isn’t, well, you’ll still read about it here. As it stands, we don’t know any details of what occurred inside the vehicle leading up to the collision, how fast it was travelling, or whether the driver received any visual or audible warnings. Tesla hasn’t released a data-filled blog post (as it sometimes does when Autopilot pops up negatively in the news).

Sgt. Jim Cota, Laguna Beach PD’s public information officer, claims the Tesla driver received minor lacerations from his glasses in the collision.

Image: Laguna Beach PD
When contacted by The Guardian, a Tesla spokeswoman repeated the automaker’s safety instructions for the proper use of Autopilot features. “When using autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times,” she said. “Tesla has always been clear that autopilot doesn’t make the car impervious to all accidents, and before a driver can use autopilot, they must accept a dialogue box which states that ‘autopilot is designed for use on highways that have a center divider and clear lane markings’.”

Despite boastful pronouncements about Autopilot’s abilities in its infancy, Tesla maintains that drivers should only enable the features on proper, divided roadways. Drivers must also remain alert and keep their hands on the wheel. 

This crash brings to mind a recent rear-end collision in Utah that also involved a stopped municipal vehicle. In that incident, a Model S maintained a steady cruise speed as it approached a red light, colliding with a fire truck at 60 mph. Investigations are underway to pinpoint why the car’s forward-facing cameras and radar failed to detect the obstacle and warn the driver.
(article excerpt)
The Truth About Cars
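One widely discussed explanation for rear-endings of stopped vehicles under cruise systems is that automotive radar commonly filters out stationary returns (overpasses, signs, parked cars) to avoid constant phantom braking. The sketch below is a hedged illustration of that filtering idea only — hypothetical logic and numbers, not Tesla's implementation and not a claim about this investigation's findings:

```python
# Hypothetical sketch: a radar tracker that keeps only targets moving
# relative to the road, discarding stationary returns as clutter.

def radar_targets_to_track(returns, ego_speed):
    """Given (range_m, closing_speed_mps) returns, keep moving objects only."""
    tracked = []
    for rng, closing_speed in returns:
        object_speed = ego_speed - closing_speed  # target's speed over ground
        if abs(object_speed) > 1.0:  # near-zero ground speed -> clutter, dropped
            tracked.append((rng, object_speed))
    return tracked

# Ego car at 27 m/s (~60 mph); a stopped truck closes at exactly ego speed:
returns = [(120.0, 27.0),  # stopped truck: ground speed 0, filtered out
           (80.0, 5.0)]    # slower car ahead: ground speed 22, tracked
assert radar_targets_to_track(returns, ego_speed=27.0) == [(80.0, 22.0)]
```

If a system leans on such a filter without a reliable secondary check (vision, lidar, maps), a fire truck stopped in-lane looks no different from a roadside sign — which is why stationary-obstacle detection remains a hard case for radar-based cruise control.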



The Syncro Heresy


08/March/2019

Our future dreams/nightmares?