大數(shù)據(jù)時(shí)代的隱私(中英雙語)
青島希尼爾翻譯公司(www.uxiaohua.cn)整理發(fā)布2016-01-12
Imagine being talked about
behind your back. Now picture that conversation taking place covertly in
your own sitting room, with you unable to hear it.
想象一下有人在你背后談?wù)撃恪,F(xiàn)在設(shè)想一下,這樣的談話就悄悄發(fā)生在你家客廳里,而你卻無法聽到。
That is the modus operandi of SilverPush, an Indian start-up that
embeds inaudible sounds in television advertisements. As the advert
plays, a high-frequency signal is emitted that can be picked up by a
mobile or other device installed with an app containing SilverPush
software. This “pairing” — currently targeted at Indian consumers — also
identifies users’ other nearby devices and allows the company to monitor
what they do across those. All without consumers hearing a thing.
這就是印度創(chuàng)業(yè)企業(yè)SilverPush的做法,該公司在電視廣告里嵌入聽不到的聲音。廣告播放時(shí),會(huì)發(fā)出一種高頻信號(hào),安裝有內(nèi)置SilverPush軟件的應(yīng)用的手機(jī)或其他設(shè)備可接收到這種信號(hào)。這種“配對(duì)”——目前是針對(duì)印度消費(fèi)者的——也會(huì)識(shí)別出用戶附近的其他設(shè)備,讓該公司得以監(jiān)控他們?cè)谶@些設(shè)備上做些什么。這一切都在消費(fèi)者無知無覺的情況下發(fā)生。
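The article does not detail SilverPush's implementation, but the mechanism it describes (an inaudible, near-ultrasonic tone embedded in the ad and picked up by an app listening through the microphone) can be sketched in a few lines of Python. The beacon frequency, window size and detection threshold below are illustrative assumptions, not SilverPush's actual parameters.

    import numpy as np

    # Illustrative parameters only: a beacon near 18.5 kHz, sampled at 44.1 kHz.
    SAMPLE_RATE = 44_100   # samples per second
    BEACON_FREQ = 18_500   # Hz, above most adults' hearing range
    WINDOW = 4096          # samples per analysis window
    THRESHOLD = 20.0       # energy ratio treated as "beacon present"

    def beacon_present(samples: np.ndarray) -> bool:
        """Return True if a strong tone near BEACON_FREQ appears in the window."""
        spectrum = np.abs(np.fft.rfft(samples[:WINDOW] * np.hanning(WINDOW)))
        freqs = np.fft.rfftfreq(WINDOW, d=1.0 / SAMPLE_RATE)
        band = (freqs > BEACON_FREQ - 100) & (freqs < BEACON_FREQ + 100)
        return bool(spectrum[band].max() / (np.median(spectrum) + 1e-9) > THRESHOLD)

    # Synthetic check: quiet background noise plus an inaudible 18.5 kHz tone.
    t = np.arange(WINDOW) / SAMPLE_RATE
    audio = 0.01 * np.random.randn(WINDOW) + 0.05 * np.sin(2 * np.pi * BEACON_FREQ * t)
    print(beacon_present(audio))   # True: the tone stands out in the spectrum

A real deployment would presumably key several such tones on and off to encode an advertiser identifier, but the privacy point is already visible here: the signal is inaudible to the person in the room.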
This “cross-device tracking technology”, being explored by other
companies including Adobe, is an emblem of a new era with which all of
us — governments, companies, charities and consumers — will have to
contend.
這種“跨設(shè)備跟蹤技術(shù)”——包括Adobe在內(nèi)的其他公司也在探索此技術(shù)——標(biāo)志著一個(gè)新時(shí)代的來臨。這個(gè)新時(shí)代是所有人——政府、公司、慈善機(jī)構(gòu)和消費(fèi)者——將不得不應(yīng)對(duì)的。
Last month, the Royal Statistical Society hosted a conference at
Windsor Castle to ponder the challenges of Big Data — an overused,
underexplained term for both the flood of information churned out by our
devices and the potential for this flood to be organised into revelatory
and predictive rivers of knowledge.
不久前,英國皇家統(tǒng)計(jì)學(xué)會(huì)(Royal Statistical
Society)在溫莎(Windsor)城堡召開了一次大會(huì),思考大數(shù)據(jù)帶來的挑戰(zhàn)。大數(shù)據(jù)是一個(gè)被濫用、內(nèi)涵解釋欠清楚的術(shù)語,既指我們的設(shè)備產(chǎn)生的海量信息流,也指把這些信息整理為分門別類的一股股具有揭示性和預(yù)見性的信息流的潛力。
The setting was apt: the ethics and governance surrounding the
growing use of data are a right royal mess. Public discussion about how
these vast quantities of information should be collected, stored,
cross-referenced and exploited is urgently needed. There is excitement
about how it might revolutionise healthcare — during outbreaks of
disease, for example, search data can be mined for the greater good.
Today, however, public engagement largely amounts to public outcry when
things go wrong.
這次大會(huì)的舉辦地點(diǎn)恰如其分:圍繞日益增加的數(shù)據(jù)使用的倫理和治理可謂一團(tuán)糟。目前迫切需要就這些海量數(shù)據(jù)應(yīng)當(dāng)如何收集、存儲(chǔ)、相互參照和利用展開公眾討論。有人對(duì)大數(shù)據(jù)可能催生醫(yī)療革命感到興奮:比如說,在疾病爆發(fā)時(shí),可以為了更高的利益挖掘搜索數(shù)據(jù)。然而,如今,當(dāng)出現(xiàn)糟糕情況時(shí),公眾參與很大程度上變成公眾的強(qiáng)烈聲討。
The extent to which tech shapes our lives — the average British
adult spends more than 20 hours a week online, according to a report by
UK media regulator Ofcom — means our behaviour, habits, desires and
aspirations can be revealed by our swipes and keystrokes.
英國媒體監(jiān)管機(jī)構(gòu)英國通信辦公室(Ofcom)的一份報(bào)告顯示,英國成年人平均每周在線時(shí)間超過20小時(shí)??萍紝?duì)我們生活的巨大影響,意味著我們的行為、習(xí)慣、欲望和抱負(fù)都可以通過觸摸屏和鍵盤操作顯露出來。
This has made analysis of online behaviour a new Klondike.
Personal data are like gold dust, and we surrender them every time we
casually click “OK” to a website’s terms and conditions.
這使得對(duì)在線行為的分析成為一座新的金礦。個(gè)人數(shù)據(jù)就像金砂,每次我們隨意對(duì)一家網(wǎng)站的條款與條件點(diǎn)擊“確定”時(shí),就把我們的個(gè)人數(shù)據(jù)交了出去。
And here is our first problem: most of us click unthinkingly (it
is usually impenetrable legalese, anyhow). It is thus questionable
whether we have given informed consent to all the ways in which our
personal data are subsequently used. To demonstrate this, a security
company set up a public WiFi spot in the City of London and inserted a
“Herod clause” committing users to hand over their firstborn for
eternity. Within a short period of time, several people unwittingly
bartered away their offspring in return for a free connection.
這是我們面臨的第一個(gè)問題:我們中大多數(shù)人都是不假思索地點(diǎn)擊的(不過,條款與條件通常是難懂的法律措辭)。那么,我們對(duì)自己的個(gè)人數(shù)據(jù)隨后被使用的各種情形是否行使了知情同意權(quán),就成了疑問。為了證明這一點(diǎn),一家安全公司在倫敦金融城(City
of London)設(shè)立了一個(gè)公共WiFi熱點(diǎn),并嵌入一個(gè)“希律條款”(Herod
Clause),要求用戶承諾永遠(yuǎn)放棄他們的第一個(gè)孩子。在很短時(shí)間內(nèi),就有不少人為了免費(fèi)上會(huì)兒網(wǎng)稀里糊涂地放棄了自己的孩子。
Legal challenges aside, there is rarely independent scrutiny of
what is a fair and reasonable relationship between an online company and
its consumers. Facebook fell foul of this when it manipulated the news
feeds of nearly 700,000 users for a psychology experiment. Users claimed
they had been duped by the study, which found that those exposed to
fewer positive news stories were more likely to write negative posts.
The company retorted that consent had already been given. Approval last
week of EU data protection rules permitting hefty fines for privacy
breaches may prevent a repetition; consent will no longer be the elastic
commodity it was.
除了法律挑戰(zhàn),關(guān)于網(wǎng)絡(luò)公司及其消費(fèi)者之間公平與恰當(dāng)?shù)年P(guān)系應(yīng)該是怎樣的,我們也很少進(jìn)行過獨(dú)立的審視。Facebook在這一點(diǎn)上便曾引起眾怒,因?yàn)樗鼮榱俗鲆粋€(gè)心理實(shí)驗(yàn),對(duì)近70萬用戶的動(dòng)態(tài)消息動(dòng)了手腳。用戶們聲稱,他們被那項(xiàng)研究給耍了,研究結(jié)果顯示,那些接收到更少積極消息的人更可能寫出消極的內(nèi)容。Facebook反駁稱,他們已獲得了用戶的同意。不久前,歐盟通過了數(shù)據(jù)保護(hù)規(guī)則,新規(guī)允許對(duì)侵犯隱私的行為處以高額罰款,這或許能阻止類似情況再次發(fā)生;用戶不再像以往那樣無論代價(jià)如何都只能被動(dòng)同意了。
A second challenge arises from the so-called internet of things,
when devices bypass humans and talk directly to one another. So my
depleted smart fridge could automatically email the supermarket
requesting replenishment. But it could also mean my gossiping gadgets
become a network of electronic spies that can paint a richly detailed
picture of my prandial and other proclivities, raising privacy concerns.
Indeed, at a robotics conference last month, technologists identified
the ability of robots to collect data, especially in private homes, as
the single biggest ethical issue in that field.
第二個(gè)挑戰(zhàn)源自各種設(shè)備繞過人類、直接彼此對(duì)話的所謂物聯(lián)網(wǎng)。所以,我的智能冰箱在存貨耗盡的時(shí)候可以自動(dòng)給超市發(fā)電郵,要求補(bǔ)貨。但這也可能意味著,我的那些“八卦”的設(shè)備構(gòu)成了一張電子間諜網(wǎng),它可以繪制出一幅有關(guān)我的飲食與其他癖性的極其詳盡的圖畫,令人擔(dān)心隱私暴露。實(shí)際上,在不久前的一個(gè)機(jī)器人學(xué)大會(huì)上,技術(shù)專家們把機(jī)器人收集數(shù)據(jù)(尤其是在私人住所里)的能力認(rèn)定為機(jī)器人學(xué)領(lǐng)域最大的單個(gè)倫理問題。
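The fridge example is easy to picture as code. Here is a toy sketch of the reorder logic only; the item names, the thresholds and the idea of turning the result into an email to the supermarket are all hypothetical, and a real appliance would speak some vendor's own API.

    from dataclasses import dataclass

    @dataclass
    class Item:
        name: str
        quantity: int
        reorder_level: int   # reorder when stock falls below this

    def build_restock_order(inventory: list[Item]) -> list[str]:
        """List the items whose stock has fallen below their reorder level."""
        return [item.name for item in inventory if item.quantity < item.reorder_level]

    fridge = [Item("milk", 0, 1), Item("eggs", 2, 6), Item("butter", 1, 1)]
    order = build_restock_order(fridge)
    if order:
        # In the scenario above this would become an email or API call to the
        # supermarket; the point is that no human needs to be in the loop.
        print("Restock needed:", ", ".join(order))

Point the same loop at a months-long log of what enters and leaves the fridge and you have the "richly detailed picture" of eating habits that the paragraph worries about.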
Alongside the new EU rules on data protection, we need something
softer: a body of experts and laypeople that can bring knowledge, wisdom
and judgment to this fast-moving field. There is already a Council for
Big Data, Ethics and Society in the US, comprising lawyers, philosophers
and anthropologists.
除了歐盟新的數(shù)據(jù)保護(hù)規(guī)則外,我們也需要更軟性的方式:一個(gè)由專家和非專業(yè)人員構(gòu)成的機(jī)構(gòu),為這一快速發(fā)展的領(lǐng)域帶來知識(shí)、智慧和判斷力。眼下,美國已有了一個(gè)由律師、哲學(xué)家和人類學(xué)家組成的大數(shù)據(jù)、倫理與社會(huì)委員會(huì)(Council
for Big Data, Ethics and Society)。
Europe should follow this example — because, as a stream of
anecdotes at the Windsor conference revealed, companies and academics
appear to be navigating this new data-rich world without a moral
compass. In 2012 a Russian company created Girls Around Me, an app that
pooled publicly available information to show the real-time locations
and pictures of nearby women, without their consent; the app, a
stalker’s dream, was withdrawn. High-tech rubbish bins in London’s
Square Mile, which captured information from smartphones to track
unwitting owners’ movements in order to target them with advertising,
were ditched on grounds of creepiness.
歐洲應(yīng)當(dāng)仿效美國的做法,因?yàn)檎鐪厣髸?huì)上的一連串趣聞所顯示的那樣,公司和學(xué)術(shù)界人士在這個(gè)數(shù)據(jù)豐富的新世界航行時(shí),似乎沒有帶上倫理指南針。2012年,一家俄羅斯公司推出了一款名為“Girls
Around
Me”的應(yīng)用(App),可以匯集公開可見的信息,在不經(jīng)使用者附近女性同意的情況下顯示她們的實(shí)時(shí)位置和照片。這款跟蹤騷擾者夢(mèng)寐以求的應(yīng)用被撤下了?!捌椒接⒗铩保⊿quare
Mile,即倫敦金融城,因面積正好1平方英里得名——譯者注)的高科技電子垃圾箱捕捉來自智能手機(jī)的信息,以跟蹤不知情的機(jī)主的行蹤,從而針對(duì)他們發(fā)布廣告,這些垃圾桶因令人毛骨悚然而被取締。
Meanwhile, a scientist has created software that combs Twitter
connections to infer a tweeter’s ethnicity and even religion, raising
the question of whether public posts can legitimately be used to deduce
private information. Do we, as one lawyer suggested, need laws against
misuse of our online personae?
同時(shí),一名科學(xué)家做了一款軟件,能夠通過徹底搜查推特(Twitter)人脈圖,推斷一名推特用戶的種族、甚至宗教,這引發(fā)了使用公開發(fā)言推斷私人信息是否合法的疑問。我們是否如一名律師所認(rèn)為的那樣,需要出臺(tái)防止個(gè)人在線角色被濫用的法律?
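The article does not say how that Twitter software works. One common and deliberately simple approach to inferring attributes from public connections is homophily: guess a user's group from the most frequent known label among the accounts they follow. The sketch below assumes that approach and uses invented accounts and abstract labels.

    from collections import Counter

    # Invented follow graph; a few followed accounts carry publicly known labels.
    follows = {
        "user_a": ["acct_1", "acct_2", "acct_3"],
        "user_b": ["acct_4", "acct_5", "acct_1"],
    }
    known_labels = {"acct_1": "group_x", "acct_2": "group_x",
                    "acct_4": "group_y", "acct_5": "group_y"}

    def infer_label(user: str) -> str | None:
        """Guess a user's group as the most common label among accounts they follow."""
        labels = [known_labels[a] for a in follows.get(user, []) if a in known_labels]
        return Counter(labels).most_common(1)[0][0] if labels else None

    print(infer_label("user_a"))   # group_x: two of three followed accounts carry it
    print(infer_label("user_b"))   # group_y wins the majority vote here

Every input to that guess is public, which is exactly the legal question the paragraph raises.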
We have wearable devices that, like Santa, see you when you are
sleeping and know when you’re awake. It is possible that a company will
find a way of deducing — through sentiment analysis of social media
postings, visits to charity websites, checks on your bank balance and
fitness tracking — if you’ve been bad or good.
我們有了可穿戴設(shè)備,這些設(shè)備像聖誕老人一樣,在你睡著時(shí)注視著你,也知道你何時(shí)是醒著的。某家公司有可能通過分析社交媒體發(fā)言所表現(xiàn)出的情緒、你對(duì)慈善網(wǎng)站的訪問、你的銀行存款余額以及健身追蹤數(shù)據(jù),找到辦法推斷出你近來的表現(xiàn)是好是壞。
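"Sentiment analysis" in that sentence can be as crude as counting words against positive and negative lists. A deliberately naive sketch follows; the word lists and the sample posts are invented for illustration, and production systems use trained models rather than fixed lexicons.

    # Naive lexicon-based sentiment scoring; the word lists are illustrative only.
    POSITIVE = {"great", "happy", "donated", "ran", "thanks"}
    NEGATIVE = {"awful", "angry", "hate", "broke", "tired"}

    def sentiment_score(posts: list[str]) -> float:
        """Average +1/-1 score per matched word across a list of posts."""
        score = hits = 0
        for post in posts:
            for word in post.lower().split():
                if word in POSITIVE:
                    score, hits = score + 1, hits + 1
                elif word in NEGATIVE:
                    score, hits = score - 1, hits + 1
        return score / hits if hits else 0.0

    posts = ["Ran 10k this morning feeling great",
             "Traffic was awful and I am tired"]
    print(sentiment_score(posts))   # 0.0: two positive hits balance two negative hits

Fold in charity-site visits, bank balances and fitness data, as the paragraph imagines, and the crude score starts to resemble a naughty-or-nice file, which is precisely the gap between what is technically possible and what should be done.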
This goes to show: just because big data makes it technically
possible to do something, does not mean we should.
這證明:并非僅僅因?yàn)榇髷?shù)據(jù)使某事在技術(shù)上具備可行性,就意味著我們應(yīng)該那么做。