Yup, the Falcon 9, Falcon Heavy, the self-landing boosters, the Model S, Model X, PayPal, Boring Company, all cons. The man has no credibility whatsoever. Run along now and get back to your Uber shift you dolt.
Speaking of Uber, isn't it a con as well?
We're learning a lot in this thread.
Any attempt at new technology is a scam.
Any girl who dates a rich guy is a whore.
Any guy trying to innovate is a charlatan.
And any minor thing that happens at a company is worthy of national news and simultaneously causes bro1999 to cream himself
I hear the men's room at the Tesla plant is out of toilet paper.
Better drop your Model 3 reservations and short the stock.
Literally every Tesla crash is going to be investigated now, it seems.
That's exactly how I feel about autopilot
Can only read the first paragraph unless I pay. Pfft.
Even if you don't own a Tesla, you are, or might soon become, part of the company's massive experiment in automotive safety.
There are already more than 200,000 Teslas on the road, and all of them built after early 2015 are capable of Autopilot, that is, semiautonomous driving. This makes drivers, and anyone encountering these cars on the road, guinea pigs who are helping to train the artificial intelligence Tesla ultimately hopes to use for a fully autonomous driving system.
During this experiment, at least two people have died in the driver's seats of Teslas that crashed while Autopilot was engaged, but Chief Executive Elon Musk argues the system continues to improve and, overall, Teslas are safer than they would be without the technology.
Alphabet Inc.'s [US:GOOGL] Waymo and Uber Technologies Inc., among others, are also road testing on public streets. They're experimenting at much smaller scales, though an Uber autonomous vehicle struck and killed a pedestrian in March. Subsequently, Uber suspended its self-driving-car program. CEO Dara Khosrowshahi has said it will resume within a few months.
These experiments are based on a number of assumptions about the abilities of AI, and the compatibility of humans and partially autonomous driving systems. If automobile companies are wrong about any of them (and there's reason to believe they are), we'll almost certainly see more self-driving car accidents as semiautonomous technology becomes commonplace.
That isn't to say we shouldn't be on this path. Every year, some 40,000 people die in the U.S. in traffic-related accidents, a situation made worse by distracted driving.
We have established methods for responsibly rolling out life-saving new technologies before (think of clinical trials for new drugs, or seatbelts and airbags) and we can do it again. But it might mean pumping the brakes on the rollout of self-driving cars.

Tesla's dangerous game
When engaged, Autopilot keeps the car within its lane, can automatically change lanes, and maintains a safe distance from cars ahead and behind. When it senses a dangerous situation it alerts the driver, whether or not Autopilot is engaged. Sometimes, however, it's up to the driver to realize the Autopilot system isn't doing what it should.
Tesla says that its cars with autonomous driving technology are 3.7 times safer than the average American vehicle. It's true that Teslas are among the safest cars on the road, but it isn't clear how much of this safety is due to the driving habits of its enthusiast owners (for now, those who can afford Teslas) or other factors, such as build quality or the cars' crash avoidance technology, rather than Autopilot.
In the wake of a fatal 2016 crash, which happened when Autopilot was engaged, Tesla cited a report by the National Highway Traffic Safety Administration as evidence that Autopilot mode makes Teslas 40% safer. NHTSA recently clarified that the report was based on Tesla's own unaudited data and that it didn't take into account whether Autopilot was engaged. Complicating things further, Tesla rolled out an auto-braking safety feature, which almost certainly reduced crashes, shortly before it launched Autopilot.
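The auto-braking confound can be made concrete with a toy calculation. All crash rates below are invented for illustration (they are not Tesla's actual figures); the point is only that if auto-braking shipped first, a naive before/after comparison credits Autopilot with both improvements:

```python
# Invented rates, crashes per million miles, to illustrate the confound:
base = 1.00            # before auto-braking (assumed)
with_braking = 0.75    # after auto-braking alone (assumed)
with_autopilot = 0.60  # after Autopilot is added on top (assumed)

# Naive comparison against the pre-auto-braking baseline credits
# Autopilot with the whole reduction...
naive_credit = 1 - with_autopilot / base          # 0.40, i.e. "40% safer"

# ...while the correct baseline (auto-braking already present) shows a
# smaller effect attributable to Autopilot itself.
true_credit = 1 - with_autopilot / with_braking   # 0.20, i.e. 20%

print(round(naive_credit * 100), round(true_credit * 100))  # prints: 40 20
```

Under these made-up numbers, half of the headline "40%" would belong to auto-braking, not Autopilot, which is exactly why the baseline matters.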
There isn't enough data to independently verify that self-driving vehicles cause fewer accidents than human-driven ones. A Rand Corp. study concluded that traffic fatalities already occur at such relatively low rates (on the order of 1 per 100 million miles traveled) that determining whether self-driving cars are safer than humans could take decades.
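The Rand point can be sketched with the statistical "rule of three": under a Poisson model, observing zero fatalities over N miles bounds the rate below roughly 3/N at 95% confidence. Using the article's 1-per-100-million-miles figure (the only number taken from the text; the model choice is an assumption), merely matching the human fatality rate already demands on the order of 300 million fatality-free miles:

```python
import math

# Fatality rate cited in the article: ~1 per 100 million miles traveled.
human_rate = 1 / 100_000_000  # fatalities per mile

# Poisson model, zero observed fatalities over N miles:
#   P(0 events) = exp(-rate * N) <= 0.05  =>  N >= -ln(0.05) / rate
miles_needed = -math.log(0.05) / human_rate
print(f"{miles_needed:.3g} fatality-free miles")  # ~3e+08 miles
```

Demonstrating a meaningful *improvement* over the human rate, rather than mere parity, requires far more driving still, which is why the study talks in decades.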
What we do have is evidence, acknowledged in Tesla's own user manuals, that Tesla's semiautonomous driving system is easily fooled by bright sunlight, faded lane markings, seams in the road, etc. Researchers continue to document other ways to trick these systems, as well.
Tesla emphasizes its system is driver-assist technology, not full autonomy, and blamed the driver in the most recent crash that occurred when the system was engaged. Yet Tesla drivers and news reports suggest that in some cases, the only thing keeping drivers from getting into Autopilot-related accidents is their own reflexes.
The company promised a cross-country drive accomplished entirely by its self-driving tech sometime in 2017 but decided the system wasn't yet ready.

AI's limitations
None of this surprises experts who understand the AI at the heart of autonomous driving systems. Deep learning, the "intelligent" component of these systems, is "brittle, opaque and shallow," says Gary Marcus, a professor of psychology and neural science at New York University and the former head of Uber's AI lab.
AI is brittle because it can't carry over insights from one context to another, opaque because humans can't evaluate its neuron-like tangle of connections, and shallow because it's easy to fool. You can't just throw more deep learning at a problem and expect it to be as good as a human, says Dr. Marcus.
Decades of research on autopilot systems, whether in airplanes or automobiles, have shown that the most dangerous kind is that which requires the driver to take action when it fails. Less sophisticated semiautonomous driving systems, like adaptive cruise control and enhanced warnings, have been shown to increase safety. Full automation, where ultimately there's no steering wheel or gas pedal, has only begun to be road tested.
Alphabet's Waymo decided it was too dangerous to let drivers take control when needed, and skipped right to a fully self-driving ride-share service, Waymo CEO John Krafcik has said. According to the company, and many who research self-driving technology, a system that never asks a driver to take over is safer than making potentially tricky machine-human handoffs.
Tesla promised to release safety data on its self-driving tech regularly starting next quarter. It isn't clear what kind of data it will release, but experts say public sharing of data, from all makers of autonomous vehicles, is the only way to ensure proper evaluation of the safety of these new technologies. Given that we already evaluate the safety of every other part of a motor vehicle in this way, it just makes sense.
Why do you keep making these overblown claims?
Gasoline-fueled vehicles are much more dangerous when the fuel system is compromised.
You could well have less than a minute to exit the vehicle if a (now even-higher-pressure-with-direct-injection) fuel line under the hood leaks and sprays gasoline on a hot engine/exhaust.
Contrast the above with when road debris punctured a Tesla battery - the system warned the battery pack was compromised and it finally started to burn well after the driver had pulled over and exited the vehicle.
I don't think the site owners would like paywalled content being posted here. It could easily become a problem for them.
Well, they will let me know if that's the problem.
Another Tesla executive bails the sinking mothership.
I actually laughed out loud at the "going to these degenerate celebrities parties"
I want to go to those.
Especially if they are degenerate.
I don't know what qualifies a party as degenerate but I want to find out.
Apparently, the NTSB, NHTSA and the IIHS see things differently, as there are more and more "bombs" driving on the road every day.
He may talk out of his ass a lot to build hype, and never meets a deadline he sets, but I'd hardly call him a con artist. The 200k Teslas on the road today, along with various other endeavors, would also make this a silly argument.
Another day, another guinea pig... er... crash... hasn't been confirmed if autopilot was in use or not.
The NTSB is gonna need to do some hiring.
Who needs more sensors for safety?! Not Elon. $$$ >>>> safety!
Misleading people to obtain a goal is the definition of conning.
He's not misleading... he misses targets, yes, but he's producing.
As an engineer, this NEVER happens ANYWHERE with ANYTHING. Execs and people counting money pushing engineering to reduce the costs or eliminate parts? Wow, never ever happens.