
人工智能將產(chǎn)生海量碳排放,魚和熊掌如何兼得?

Jeremy Kahn
2021-04-23

算法在訓(xùn)練過程中能夠產(chǎn)生多少碳足跡,在很大程度取決于算法的具體設(shè)計、用于訓(xùn)練的計算機硬件類型,以及訓(xùn)練發(fā)生地的發(fā)電類型的影響。

文本設(shè)置
小號
默認
大號
Plus(0條)

現(xiàn)在的人工智能算法已經(jīng)可以完成一些非常復(fù)雜的任務(wù)了,比如撰寫非常自然的文章,或者根據(jù)文字描述生成圖片等等。不過訓(xùn)練這些程序需要大量的計算力,而計算力又跟電力成正比,這就讓很多人開始擔(dān)心起超大型人工智能系統(tǒng)的碳足跡問題,生怕它在環(huán)境上不具有可持續(xù)性。

那么,這些先進的人工智能系統(tǒng)的碳足跡究竟是多少?加州大學(xué)伯克利分校(University of California at Berkeley)和谷歌公司(Google)的科學(xué)家最近給出了迄今為止最準(zhǔn)確的估計——畢竟當(dāng)前市面上很多最流行的大型人工智能系統(tǒng)本身就是由谷歌開發(fā)的。

比如研究顯示,舊金山的人工智能公司OpenAI創(chuàng)建的一款功能強大的語言模型GPT-3,它在軟件的訓(xùn)練過程中產(chǎn)生了552噸的二氧化碳排放,相當(dāng)于120輛乘用車一年的排放量。谷歌的先進聊天機器人Meena也排放了96噸的二氧化碳,相當(dāng)于17個美國家庭一年的耗電量。

雖然這些數(shù)字大得嚇人,但它比研究人員此前的估算還是小了一些,畢竟研究人員始終沒有渠道獲得蘋果(Apple)和谷歌的內(nèi)部數(shù)據(jù)。這篇研究文章在4月21日發(fā)表于不用經(jīng)過同行審議的論文庫arxiv.org里。該論文同時也表明,人工智能對氣候造成的影響是能夠緩解的。

研究人員發(fā)現(xiàn),這些算法在訓(xùn)練過程中可以產(chǎn)生多少碳足跡,在很大程度取決于算法的具體設(shè)計、用于訓(xùn)練的計算機硬件類型,以及訓(xùn)練發(fā)生地的發(fā)電類型的影響。

谷歌的科學(xué)家們發(fā)現(xiàn),只要改變以上這三個因素,就能夠有效地降低這些超大型人工智能系統(tǒng)的碳排放,最多甚至有可能將排放量降低上千倍。例如一個用來訓(xùn)練人工智能程序的數(shù)據(jù)中心,如果它從煤炭聚集型的印度,搬到了可再生能源十分豐富的芬蘭,它的碳排放就可以降低10倍到100倍。

該論文的第一作者、谷歌的科學(xué)家大衛(wèi)·帕特森告訴《財富》雜志:“這就好比一個老笑話:‘搞房地產(chǎn)最重要的是什么?地段,地段,還是地段?!囟尾灰粯?,就會產(chǎn)生很大的區(qū)別?!?/p>

帕特森也是加州大學(xué)伯克利分校的名譽教授。他表示,這對人工智能行業(yè)來說最終是一個利好消息,因為大多數(shù)的人工智能算法是都是“在云端上”被訓(xùn)練的。訓(xùn)練的過程發(fā)生在數(shù)據(jù)中心里,而這個數(shù)據(jù)中心可能離該算法的創(chuàng)建者還有好幾千公里遠?!霸谠朴嬎阒校乩砦恢檬亲钊菀赘淖兊氖虑??!彼f。

不過,如果在對人工智能系統(tǒng)進行培訓(xùn)時,都要顧及系統(tǒng)的環(huán)保性的話,這最終還是會利好那些大型云服務(wù)提供商。因為微軟(Microsoft)、谷歌、IBM和亞馬遜(Amazon)等公司已經(jīng)在各地設(shè)立了幾十個數(shù)據(jù)中心,有些還設(shè)置在平均溫度較低的地區(qū),這樣就節(jié)省了服務(wù)器機架的冷卻成本,和使用清潔能源的成本。

用于語言處理的超大型人工智能系統(tǒng)究竟會給環(huán)境造成多大影響?在谷歌內(nèi)部,也有一群人工智能倫理專家對此提出了質(zhì)疑。也正因為如此,谷歌的人工智能倫理學(xué)家蒂姆尼特·格布魯才會從谷歌離職,之后,另一名人工智能倫理學(xué)家瑪格麗特·米切爾也被炒了魷魚,此二人都是谷歌人工智能倫理研究團隊的負責(zé)人。

這篇關(guān)于減少人工智能碳足跡的論文共有9名作者,其中之一是谷歌研發(fā)部門的高級副總裁杰夫·迪恩。他也因為涉嫌參與逼走格布魯而遭到不少詬病。迪恩之所以批判格布魯,其中的一條“罪狀”,就是格布魯之前的一篇論文,沒有談到如何減輕大型語言模型的負面?zhèn)惱碛绊憽?/p>

除了搬到擁有清潔電網(wǎng)的地方,另一種提高人工智能程序能耗效率的方法,就是采用專門為神經(jīng)網(wǎng)絡(luò)設(shè)計的芯片。這種芯片可以一定程度地模擬人腦,很適合近幾年人工智能的最新發(fā)展。目前,市面上的大多數(shù)人工智能程序所使用的芯片,都是原本用于電腦游戲的圖形芯片。不過現(xiàn)在,諸如谷歌、微軟和亞馬遜等大公司都在他們的數(shù)據(jù)中心里安裝了專門用于神經(jīng)網(wǎng)絡(luò)的計算機芯片。

研究發(fā)現(xiàn),告別圖形處理芯片,擁抱神經(jīng)網(wǎng)絡(luò)專用芯片,能夠使訓(xùn)練超大型人工智能算法的用電量減少5倍。另外,如果你使用的是最新一代的人工智能芯片,它又可以比最老一代的芯片節(jié)能整整一半。

如果重新設(shè)計神經(jīng)網(wǎng)絡(luò)算法,使其成為計算機科學(xué)家所說的“稀疏算法”,那么這些算法還將變得更節(jié)能,甚至能夠節(jié)省10倍的能耗。在這種架構(gòu)下,網(wǎng)絡(luò)里的大多數(shù)人工智能神經(jīng)元只需要與相對較少的其他神經(jīng)元相連,因此只需要較少的相關(guān)神經(jīng)元,就可以計算人工智能算法在訓(xùn)練中遇到的每個新案例的數(shù)據(jù)權(quán)重。

參與這項研究的另一位谷歌研究員莫德·特希爾表示,她希望這篇論文有助于整個行業(yè)認識到人工智能算法的標(biāo)準(zhǔn)能耗,以及人工智能算法的碳足跡。

不過她也強調(diào),這并不是一件容易的事情。要想精確計算人工智能算法的碳足跡,我們不光要知道某地的電網(wǎng)在多大程度上屬于綠色電網(wǎng),還要知道在算法的訓(xùn)練過程中,使用了多少比例的可再生能源和化石能源。雖然一些大型云服務(wù)商已經(jīng)開始向客戶提供詳細的二氧化碳排放信息了,但用戶真正要想獲得這些數(shù)據(jù),還是很困難的。(財富中文網(wǎng))

譯者:樸成奎

現(xiàn)在的人工智能算法已經(jīng)可以完成一些非常復(fù)雜的任務(wù)了,比如撰寫非常自然的文章,或者根據(jù)文字描述生成圖片等等。不過訓(xùn)練這些程序需要大量的計算力,而計算力又跟電力成正比,這就讓很多人開始擔(dān)心起超大型人工智能系統(tǒng)的碳足跡問題,生怕它在環(huán)境上不具有可持續(xù)性。

那么,這些先進的人工智能系統(tǒng)的碳足跡究竟是多少?加州大學(xué)伯克利分校(University of California at Berkeley)和谷歌公司(Google)的科學(xué)家最近給出了迄今為止最準(zhǔn)確的估計——畢竟當(dāng)前市面上很多最流行的大型人工智能系統(tǒng)本身就是由谷歌開發(fā)的。

比如研究顯示,舊金山的人工智能公司OpenAI創(chuàng)建的一款功能強大的語言模型GPT-3,它在軟件的訓(xùn)練過程中產(chǎn)生了552噸的二氧化碳排放,相當(dāng)于120輛乘用車一年的排放量。谷歌的先進聊天機器人Meena也排放了96噸的二氧化碳,相當(dāng)于17個美國家庭一年的耗電量。

雖然這些數(shù)字大得嚇人,但它比研究人員此前的估算還是小了一些,畢竟研究人員始終沒有渠道獲得蘋果(Apple)和谷歌的內(nèi)部數(shù)據(jù)。這篇研究文章在4月21日發(fā)表于不用經(jīng)過同行審議的論文庫arxiv.org里。該論文同時也表明,人工智能對氣候造成的影響是能夠緩解的。

研究人員發(fā)現(xiàn),這些算法在訓(xùn)練過程中可以產(chǎn)生多少碳足跡,在很大程度取決于算法的具體設(shè)計、用于訓(xùn)練的計算機硬件類型,以及訓(xùn)練發(fā)生地的發(fā)電類型的影響。

谷歌的科學(xué)家們發(fā)現(xiàn),只要改變以上這三個因素,就能夠有效地降低這些超大型人工智能系統(tǒng)的碳排放,最多甚至有可能將排放量降低上千倍。例如一個用來訓(xùn)練人工智能程序的數(shù)據(jù)中心,如果它從煤炭聚集型的印度,搬到了可再生能源十分豐富的芬蘭,它的碳排放就可以降低10倍到100倍。

該論文的第一作者、谷歌的科學(xué)家大衛(wèi)·帕特森告訴《財富》雜志:“這就好比一個老笑話:‘搞房地產(chǎn)最重要的是什么?地段,地段,還是地段?!囟尾灰粯?,就會產(chǎn)生很大的區(qū)別?!?/p>

帕特森也是加州大學(xué)伯克利分校的名譽教授。他表示,這對人工智能行業(yè)來說最終是一個利好消息,因為大多數(shù)的人工智能算法是都是“在云端上”被訓(xùn)練的。訓(xùn)練的過程發(fā)生在數(shù)據(jù)中心里,而這個數(shù)據(jù)中心可能離該算法的創(chuàng)建者還有好幾千公里遠。“在云計算中,地理位置是最容易改變的事情。”他說。

不過,如果在對人工智能系統(tǒng)進行培訓(xùn)時,都要顧及系統(tǒng)的環(huán)保性的話,這最終還是會利好那些大型云服務(wù)提供商。因為微軟(Microsoft)、谷歌、IBM和亞馬遜(Amazon)等公司已經(jīng)在各地設(shè)立了幾十個數(shù)據(jù)中心,有些還設(shè)置在平均溫度較低的地區(qū),這樣就節(jié)省了服務(wù)器機架的冷卻成本,和使用清潔能源的成本。

用于語言處理的超大型人工智能系統(tǒng)究竟會給環(huán)境造成多大影響?在谷歌內(nèi)部,也有一群人工智能倫理專家對此提出了質(zhì)疑。也正因為如此,谷歌的人工智能倫理學(xué)家蒂姆尼特·格布魯才會從谷歌離職,之后,另一名人工智能倫理學(xué)家瑪格麗特·米切爾也被炒了魷魚,此二人都是谷歌人工智能倫理研究團隊的負責(zé)人。

這篇關(guān)于減少人工智能碳足跡的論文共有9名作者,其中之一是谷歌研發(fā)部門的高級副總裁杰夫·迪恩。他也因為涉嫌參與逼走格布魯而遭到不少詬病。迪恩之所以批判格布魯,其中的一條“罪狀”,就是格布魯之前的一篇論文,沒有談到如何減輕大型語言模型的負面?zhèn)惱碛绊憽?/p>

除了搬到擁有清潔電網(wǎng)的地方,另一種提高人工智能程序能耗效率的方法,就是采用專門為神經(jīng)網(wǎng)絡(luò)設(shè)計的芯片。這種芯片可以一定程度地模擬人腦,很適合近幾年人工智能的最新發(fā)展。目前,市面上的大多數(shù)人工智能程序所使用的芯片,都是原本用于電腦游戲的圖形芯片。不過現(xiàn)在,諸如谷歌、微軟和亞馬遜等大公司都在他們的數(shù)據(jù)中心里安裝了專門用于神經(jīng)網(wǎng)絡(luò)的計算機芯片。

研究發(fā)現(xiàn),告別圖形處理芯片,擁抱神經(jīng)網(wǎng)絡(luò)專用芯片,能夠使訓(xùn)練超大型人工智能算法的用電量減少5倍。另外,如果你使用的是最新一代的人工智能芯片,它又可以比最老一代的芯片節(jié)能整整一半。

如果重新設(shè)計神經(jīng)網(wǎng)絡(luò)算法,使其成為計算機科學(xué)家所說的“稀疏算法”,那么這些算法還將變得更節(jié)能,甚至能夠節(jié)省10倍的能耗。在這種架構(gòu)下,網(wǎng)絡(luò)里的大多數(shù)人工智能神經(jīng)元只需要與相對較少的其他神經(jīng)元相連,因此只需要較少的相關(guān)神經(jīng)元,就可以計算人工智能算法在訓(xùn)練中遇到的每個新案例的數(shù)據(jù)權(quán)重。

參與這項研究的另一位谷歌研究員莫德·特希爾表示,她希望這篇論文有助于整個行業(yè)認識到人工智能算法的標(biāo)準(zhǔn)能耗,以及人工智能算法的碳足跡。

不過她也強調(diào),這并不是一件容易的事情。要想精確計算人工智能算法的碳足跡,我們不光要知道某地的電網(wǎng)在多大程度上屬于綠色電網(wǎng),還要知道在算法的訓(xùn)練過程中,使用了多少比例的可再生能源和化石能源。雖然一些大型云服務(wù)商已經(jīng)開始向客戶提供詳細的二氧化碳排放信息了,但用戶真正要想獲得這些數(shù)據(jù),還是很困難的。(財富中文網(wǎng))

譯者:樸成奎

Artificial intelligence algorithms that power some of the most cutting-edge applications in technology, such as writing coherent passages of text or generating images from descriptions, can require vast amounts of computing power to train. That in turn requires large amounts of electricity, leading many to worry that the carbon footprint of these increasingly popular ultra-large A.I. systems makes them environmentally unsustainable.

New research from scientists at the University of California at Berkeley and Google, which deploys many of these large A.I. systems, provides the most accurate estimates to date for the carbon footprint of some of these state-of-the-art systems.

For instance, GPT-3, a powerful language model created by the San Francisco-based A.I. company OpenAI, produced the equivalent of 552 metric tons of carbon dioxide during its training, according to the study. That’s the same amount that would be produced by driving 120 passenger cars for a year. Google’s advanced chatbot Meena produced the equivalent of 96 metric tons of carbon dioxide, or about the same as powering more than 17 homes for a year.

While those figures are frighteningly large, they are smaller than some previous estimates from researchers who did not have access to the same detailed information from inside Google and OpenAI. The research paper, which was posted to the non-peer-reviewed research repository arxiv.org on April 21, also shows that the climate impact of A.I. can be mitigated.

The researchers conclude that the carbon footprint of training these algorithms varies tremendously depending on the design of the algorithm, the type of computer hardware used to train it, and the nature of electricity generation where that training takes place.

Altering all three of these factors can reduce the carbon footprint of training one of these very large A.I. algorithms by a factor of up to 1,000, the Google scientists found. Simply changing the data center used to train the algorithm from a place where power generation is coal intensive, like India, to one where the electrical grid runs on renewable power, such as Finland, can reduce it by a factor of between 10 and 100, the study concluded.
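To make the scale of that location effect concrete, here is a minimal back-of-the-envelope sketch in Python. The formula (training energy times grid carbon intensity) follows the general shape of such estimates; every input, including the processor count, power draw, PUE, and the two grid intensities, is a hypothetical placeholder rather than a figure from the study.

```python
# Back-of-the-envelope estimate: emissions ~= training energy x grid intensity.
# All inputs below are hypothetical placeholders, not figures from the study.

def training_co2e_tonnes(processors: int, avg_power_watts: float,
                         hours: float, pue: float,
                         grid_kg_co2e_per_kwh: float) -> float:
    """Rough metric tons of CO2-equivalent for one training run."""
    energy_kwh = processors * (avg_power_watts / 1000.0) * hours * pue
    return energy_kwh * grid_kg_co2e_per_kwh / 1000.0  # kg -> metric tons

# The same hypothetical two-week run on a coal-heavy grid vs. a greener one:
coal_heavy = training_co2e_tonnes(1000, 300, 24 * 14, 1.1, 0.70)
greener    = training_co2e_tonnes(1000, 300, 24 * 14, 1.1, 0.07)
print(f"coal-heavy grid: {coal_heavy:.0f} t CO2e")  # ~78 t
print(f"greener grid:    {greener:.0f} t CO2e")     # ~8 t, a 10x reduction
```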

“It’s like that old joke about the three most important things in real estate: location, location, location,” David Patterson, the Google scientist who is lead author on the new paper, told Fortune. “Location made such a big difference.”

Patterson, who is also an emeritus professor at U.C. Berkeley, says that’s ultimately good news because most A.I. algorithms are trained “in the cloud,” with the actual processing taking place in data centers that can be hundreds or even thousands of miles away from where the person creating the system is sitting. “In cloud computing, location is the easiest thing to change,” he says.

But if environmental sustainability becomes a major consideration in training A.I. systems, it is also likely to further cement the market position of the largest cloud service providers. That’s because companies such as Microsoft, Google, IBM, and Amazon Web Services have dozens of data centers in many different places, including in areas with colder average temperatures, which reduces the cost of cooling all those server racks, and in places with greener energy.

The environmental impact of ultra-large A.I. systems designed for processing language was among the criticisms of such algorithms raised by a group of A.I. ethics specialists inside Google, and it played a role in the events that precipitated the ouster of Timnit Gebru and the subsequent firing of Margaret Mitchell, the two co-heads of the company’s A.I. ethics research team.

Jeff Dean, a senior vice president at Google who heads the company’s research division and has been faulted by Gebru and her supporters for his role in forcing her out, is one of the nine authors credited on the new research paper on reducing the carbon footprint of these A.I. systems. One of his alleged criticisms of Gebru’s earlier paper was that it did not discuss ways to mitigate the negative ethical impacts of large language models.

Besides shifting to a location with a greener electricity grid, another way to improve the energy consumption of these models is to use computer chips designed specifically for neural networks, a kind of machine learning software loosely modeled on the human brain that is responsible for most recent advances in A.I. Today, the majority of A.I. workloads are trained on computer chips that were originally designed for rendering the graphics in video games. But increasingly, new kinds of computer chips designed just for neural networks are being installed in data centers run by large cloud-computing companies such as Google, Microsoft, and Amazon Web Services.

Changing from graphics processing chips to these new neural network-specific chips can cut the energy needed to train an ultra-large algorithm by a factor of five, and it can be cut in half again by shifting from the earliest generation of these new A.I. chips to the latest versions of them, the researchers found.

An even bigger savings—a factor of 10—can be found by redesigning the neural network algorithms themselves so that they are what computer scientists call “sparse.” That means that most of the artificial neurons in the network connect to relatively few other neurons, and therefore need a smaller number of these neurons to update how they weight data for each new example the algorithm encounters during training.
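Taken together, the individual levers reported above compound multiplicatively. The toy arithmetic below is an illustration rather than a calculation from the paper: even the conservative end of the location range, combined with the chip and sparsity savings, reaches the roughly 1,000-fold ceiling the researchers cite.

```python
# Toy arithmetic combining the reduction factors quoted above. The levers are
# not fully independent in practice, so treat the product as an upper bound.
location_factor   = 10   # greener grid (conservative end of the 10x-100x range)
chip_factor       = 5    # graphics chips -> neural-network-specific chips
generation_factor = 2    # earliest A.I. chips -> latest generation
sparsity_factor   = 10   # dense -> sparse network design

combined = location_factor * chip_factor * generation_factor * sparsity_factor
print(f"combined reduction: up to {combined}x")  # 1000x
```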

Maud Texier, another Google researcher who worked on the study, says she hopes the paper helps drive the entire industry towards standardized benchmarks for measuring the energy consumption and carbon footprint of A.I. algorithms.

But she emphasized that this is not easy. To get an accurate estimate for the carbon footprint, it is important to know not just how green the electric grid in any particular location is in general, but exactly what the mix of renewable energy and fossil fuel-based electricity was during the specific hours when the A.I. algorithm was being trained. Obtaining this information from cloud service providers has been difficult, she says, although the large cloud service companies are starting to provide more detailed information on carbon dioxide emissions to customers.
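As a sketch of the hour-by-hour accounting Texier describes, the snippet below weights each training hour by that hour's grid carbon intensity instead of applying a single annual average. All the values are invented for illustration; real hourly intensity data would have to come from the grid operator or the cloud provider.

```python
# Hourly carbon accounting vs. a single annual-average grid intensity.
# All values below are invented for illustration.
hourly_power_kw = [300.0] * 6  # steady draw over a six-hour training window
hourly_intensity = [0.05, 0.05, 0.08, 0.30, 0.45, 0.40]  # kg CO2e/kWh; the
# mix gets dirtier in the evening as solar output fades

hourly_kg = sum(p * c for p, c in zip(hourly_power_kw, hourly_intensity))
naive_kg = sum(hourly_power_kw) * 0.25  # same run, annual-average intensity

print(f"hour-by-hour estimate:   {hourly_kg:.0f} kg CO2e")  # 399 kg
print(f"annual-average estimate: {naive_kg:.0f} kg CO2e")   # 450 kg
```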

財富中文網(wǎng)所刊載內(nèi)容之知識產(chǎn)權(quán)為財富媒體知識產(chǎn)權(quán)有限公司及/或相關(guān)權(quán)利人專屬所有或持有。未經(jīng)許可,禁止進行轉(zhuǎn)載、摘編、復(fù)制及建立鏡像等任何使用。
0條Plus
精彩評論
評論

撰寫或查看更多評論

請打開財富Plus APP

前往打開
熱讀文章
国产精品1000夫妇激情啪发布亚洲乱码| 久久久久久国产精品免费免费男同| 国产无码高清视频不卡| 日韩一区三区视频| 青草青草久热国产精品| 亚洲欧美色国产中文字幕在线| 亚洲欧美在线大香蕉| 亚洲精品熟女国产久久国产| 中字幕一区二区三区乱码| 精品一区二区三区无码免费视频| 女人被狂躁C到高潮视频| 91久久精品日日躁夜夜躁欧美| 99精品国产在热久久无毒不卡| 久久综合久久久久88| 国产A∨精品一区二区三区不卡| 无码人妻精品一区二区蜜桃| 久久国产精品波多野结衣AV| 黄色网页在线播放| (愛妃精選)午夜福利理论片高清在线观看| 五月天中文字幕MV在线| 国产在线精品一区亚洲毛片免费一级| 日韩欧美狼一区二区三区免费观看| 国产女人喷潮视频在线观看| 精品国产综合区久久久久久| 久久av免费天堂小草播放| 亚洲无码视频一区二区三区| 无码午夜人妻一区二区三区不卡视频| 国产免费av片在线观看与下载| 精品免费国偷自产| 亚洲中文字幕久久精品无码喷水| 国产精品日本一区二区在线播放| 国产女人高潮好舒服在线观看| 国产成人精品一区二区三区免费看| 亚洲中文久久久久久精品国产| 狠狠人妻久久久久久综合| 国产精品后入内射日本在线观看| 国产成人综合久久久久久| 国产免费AV片无码永久免费| 人人妻人人澡人人爽曰本| 大香蕉操逼色网视频| 大香蕉操逼色网视频|