Why does everyone want a search engine company to correct what's wrong with humanity?
Social problems exist. Sexism, social stigmatization, crime and other problems have always plagued humanity. Increasingly, however, organizations are calling on Google to solve or help solve these social problems, usually at the expense of the quality of the company's own products.
Here are a few examples.
Sexism
Carnegie Mellon University and the International Computer Science Institute created software called AdFisher, which analyzes Google ad targeting, including job postings. They found that male users were shown ads for high-paying executive jobs more often than women were.
The decisions about which ads to show to which users are based on two sets of data. One is the user information that Google collects. The other is the advertiser's data, which is a combination of harvested data and hand-entered demographic information.
Google's job is to faithfully take the user signals and the advertiser signals and enable communication between the two parties.
It's possible that sexism at Google determined or contributed to a sexist outcome for the targeting of the executive job postings. To the extent that that is true, Google must correct the problem as soon as possible.
However, it's more likely that Google's algorithm is faithfully conveying the sexism that in fact exists in society.
If that's true, what should Google do? Should it continue to offer "neutral" algorithms that deliver whatever biases and prejudices are contained in the inputs? Or should it counter society's biases and program corrective social engineering into its algorithms?
In other words, should Google take the inputs that skew toward sexism and deliberately weight them in the other direction to produce outputs that create the illusion that sexism doesn't exist?
Social stigmatization
The Internet has radically increased the scope and consequences of social stigma, or at least people believe it has (reading Nathaniel Hawthorne's The Scarlet Letter might disabuse you of that notion).
In any event, Europe recently enacted a set of rules under the concept of "the right to be forgotten." These rules require search engines, such as Google Search, to offer the European public a process for requesting the removal of specific search results that appear after specific search terms are entered. Specifically, stigmatizing information (that's not also in the public interest) must not appear when a search is conducted for the name of a person who has requested the removal.
For example, let's say a European man is photographed by the local newspaper while playing an Australian didgeridoo in the park. He's young, bearded and has long hair. Years later, he sells the instrument, shaves his beard, cuts his hair and becomes a respected financial adviser. But when prospective clients search Google for his name, they find the old newspaper photo. The man can request that Google remove the link to the newspaper when people search his name, and Google is required by law to comply. (If that sounds far-fetched, note that I'm describing an actual case under the right-to-be-forgotten law.)
Google has received hundreds of thousands of requests for right-to-be-forgotten removals, and it has granted nearly half of them.
The Russian parliament has approved an even stronger right-to-be-forgotten measure which, if signed by President Vladimir Putin, will become law next year.
And last week a group called Consumer Watchdog asked the Federal Trade Commission to enact right-to-be-forgotten rules in the U.S.
A search engine exists to faithfully index and enable the discovery of legal content posted online. But right-to-be-forgotten rules have the effect of selectively rendering search engines less accurate.
Internet content can stigmatize people, and that's a problem. The right-to-be-forgotten solution requires Google to fix that problem by sabotaging its search engine and offering an inferior product.
Crime
Criminals can use various methods to communicate with one another so police can't listen. For example, they can whisper in each other's ears, talk in code or use burner phones. And, of course, they can use end-to-end encryption for texting or emails.
While bad guys could and do use encrypted communication to jeopardize national security and to commit crimes, encryption is still very useful as a tool to protect national security and to prevent crime. Without encryption, it's much easier for hackers, including state-sponsored perpetrators of industrial espionage, to steal military, business and other secrets and use the information against U.S. organizations. Encryption also helps prevent a long list of crimes, including identity theft, extortion, blackmail and other kinds of fraud.
But the FBI and the U.S. Department of Justice recently announced that Google (among other companies) is a major threat to national security and law enforcement because it provides end-to-end encryption.
The solution, according to law enforcement organizations, is for Google and other companies to create back doors that would enable authorities to eavesdrop on encrypted communication.
The trouble is that any back door created for law enforcement could also be used by terrorists, criminals and hostile foreign governments for hacking, spying and stealing. Security experts know this; the FBI and the DOJ apparently don't.
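The security experts' point can be made concrete with a toy sketch. The code below is NOT real cryptography (it's a throwaway XOR stream cipher, and the "escrow database" is a hypothetical stand-in for a mandated key backdoor); it only illustrates the structural problem: a backdoor is just a second copy of the keys, and whoever obtains that copy — lawfully or not — can read everything it covers.

```python
# Toy illustration only -- NOT real cryptography. The "escrow database"
# stands in for a hypothetical law-enforcement backdoor: a stored copy
# of every conversation key.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic byte stream from a key (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; XORing twice restores the original."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

# Each pair of users has its own conversation key...
alice_bob_key = secrets.token_bytes(32)
ciphertext = encrypt(alice_bob_key, b"meet at noon")

# ...but a mandated backdoor means a copy of that key sits somewhere else.
escrow_database = {"alice<->bob": alice_bob_key}

# Anyone who breaches the escrow store -- not just law enforcement --
# can now decrypt the traffic of every user whose key it holds.
stolen_key = escrow_database["alice<->bob"]
print(decrypt(stolen_key, ciphertext))  # b'meet at noon'
```

The design point is that the eavesdropper never touched Alice's or Bob's devices; compromising the single escrow store was enough. That is why a backdoor "for law enforcement only" is a contradiction in terms.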
Yes, there is crime. And the solution proposed by federal law enforcement authorities is for Google to fight crime by making its products insecure. That solution prevents Google from meeting the public's demand for secure communication.
An unrelated situation involves Google's social mapping app, Waze. The free app is like Google Maps, except that it enables people to communicate with each other in specific ways. Drivers can use Waze to alert one another about things like road hazards and traffic jams. They can also use it to identify the location of police cars.
Police organizations recently called on Google to disable Waze's ability to note the location of police cruisers, saying it enables the stalking of police officers. They want Google to prevent people from communicating with each other about police cars they see while driving around (something the police and the government themselves cannot prevent because of the First Amendment to the U.S. Constitution).
In yet another case, Google is being called upon to address the problem of youth violence. Google's personalized advertising system on search and YouTube grabs "signals" to decide which ads to display. These signals come from both the content displayed and the search history and other personal information of the user.
Because rap music often talks about guns and violence, ads for gun-related companies and organizations are more likely to be displayed in the context of rap content and videos. The founder of an organization called the Hip-Hop Chess Federation demanded that Google intervene in its own ad-serving algorithm to remove ads for the U.S. Concealed Carry Association (USCCA) and other gun-related companies on rap-related content.
Pedestrian safety
In New York City last year, 20 people were killed by vehicles that were making legal left turns, according to a WNYC article published two months ago.
Because of the article, the New York City Council this month wrote a letter to Google asking the company to make two changes to Google Maps.
One of the council's requests is for a "stay on truck routes" feature for truck drivers. The other is for functionality that would give Google Maps users the ability to request turn-by-turn directions with fewer left turns.
Car accidents happen. It's a real problem, and it's tragic for victims and their families. Google is already doing as much as any organization to address the problem by inventing a self-driving car that's way safer than human-driven cars.
But the New York City Council now wants Google to change its Maps app in order to solve society's car accident problem.
Google's mission