
Jonathan Challinger claims his Cybertruck drove headlong into a streetlight while in Full Self-Driving mode. Elon Musk wants the technology to be ready for a June robotaxi launch.
The image is shocking: a wrecked Tesla Cybertruck lying motionless on the side of the road, wrapped around a pole with its right wheel dangling. Driver Jonathan Challinger posted the undated picture on Sunday. He claims Tesla’s automated Full Self-Driving (FSD) software caused his vehicle to crash into a light post while he wasn’t looking.
While Challinger escaped without harm, he warned others might not be so lucky. “Spread my message and help save others from the same fate or far worse,” he wrote.
The post received 2 million views, sparking fierce debate as to whether FSD is good enough to be used without humans behind the wheel.
It comes less than five months before Tesla CEO Elon Musk’s crucial launch of an autonomous driving robotaxi service, which is a core pillar supporting Tesla’s more than $1.1 trillion market cap.
According to Challinger’s account, the car failed to leave a lane that was ending, even though no vehicles would have impeded a merge into the adjacent lane, and it made no attempt to slow down or turn until it was too late.
Google Maps and Street View imagery show that the road layout matches the photo in Challinger’s post. An official for the Reno Police Department confirmed to Fortune that there was a crash involving a driver named Challinger on Feb. 6, but declined to give further details pending a full report being filed.
Challinger tagged Musk, AI director Ashok Elluswamy, Tesla’s entire AI team, and Cybertruck lead engineer Wes Morrill in the tweet.
The carmaker constantly collects data from FSD vehicles to train its autonomous driving systems. In the past, it has been quick to deny crash accounts it considered untrue.
At the time of publication, Tesla had not responded to Fortune’s request for comment. Fortune also contacted Challinger for comment but did not receive a reply.
‘Big fail on my part. Don’t make the same mistake I did’
Tesla only rolled out FSD to the Cybertruck in September, a full 10 months after the vehicle launched.
The pickup has larger dimensions, a higher stance on the road, and more complex engineering—it uses all four wheels to steer—than a Tesla saloon.
One of the best-known and most impartial Tesla FSD testers attested to the plausibility of Challinger’s account of the crash.
“The situation you describe is very common, where the planner or decision-making to get in the appropriate lane early enough often gets the vehicle in a bind, where it runs out of options,” replied Chuck Cook, who was tagged in the post by Challinger. “You are NOT the only one.”
Challinger was quick to admit negligence and accept ultimate responsibility for failing to supervise the system, as Tesla requires of all its owners who use FSD.
“Big fail on my part, obviously. Don’t make the same mistake I did. Pay attention. It can happen,” he warned, requesting a means to deliver dashcam footage to Tesla’s AI team for analysis.
Challinger, moreover, dismissed accusations that he was acting in bad faith by trying to capitalize on the scrutiny surrounding Musk, Tesla, and FSD ahead of the commercial launch in June.
“I just want to get the data to Tesla if I can. I tried everything I could think of to get in touch with them,” he said.
Doubt over date of reported crash
Challinger had previously acknowledged early last month—in a separate post he didn’t flag widely—that he had been involved in a serious accident.
Responding to a question about the Cybertruck’s structural ability to absorb energy in a frontal collision, he wrote in early January: “Having crashed mine, can confirm that it crumples just fine.”
He said he had repeatedly tried to get the dashcam footage to Tesla.
Challinger specified that the crash occurred while using FSD v13.2.4, a software version that had only been widely rolled out to all FSD users roughly a week after his earlier post.
Musk’s track record raises doubts about the technology
Just months ahead of a planned June launch, CEO Musk has yet to publish any independently verifiable data to back up his claim that FSD is ready to be used in an unsupervised, fully autonomous robotaxi.
By comparison, rivals such as Waymo report their disengagements to state regulators. Tesla, however, has used a legal loophole to avoid this transparency for years.
Musk has also repeatedly misstated facts.
Tesla’s AI director, Elluswamy, testified in court that Musk ordered him to doctor a marketing video to mislead consumers about Tesla’s FSD capabilities.
More recently, Musk admitted Teslas running on older AI3 inference computers have, in fact, failed to live up to his claim that all cars built after 2016 are capable of autonomous driving.
He plans to replace that hardware with the newest generation in those vehicles where customers purchased FSD. How exactly that can be done and at what cost is unclear.