[ICTs-and-Society] Blogpost about Google’s “New“ Terms of Use and Privacy Policy: Old Exploitation and User Commodification in a New Ideological Skin

James Losey losey at newamerica.net
Wed Feb 29 20:01:54 PST 2012


Hi Christian,

Thank you for sending this blog post. I think situating Google's new
policies within the context of EU regulation is particularly helpful. In a
globally networked world, the multiple jurisdictions that online services
face are both a challenge for companies and an opportunity to push for more
user control over the online public sphere.

However, I would like to push back against a couple of notions in your
piece. First, much as television and radio have been supported by
advertising, so too are many online services. The question we are grappling
with is not whether a service is supported by advertising but what
reasonable limits should be placed on the use of personally identifiable
information for advertising. Second, opting out is technically an option,
but from a behavioral economics standpoint it carries far higher costs than
most users will choose to bear. The question is really about what types of
controls should be available to users.

With respect to Google's recent change, you are absolutely correct in
noting the "large-scale economic surveillance" of users. After all, Google
is essentially an advertising company and earns 97%
<http://www.wired.com/epicenter/2011/07/google-revenue-sources/> of its
revenue from advertising. But it is worth defining that transaction.
First, Google profits from serving ads to users and naturally would profit
more from offering more effective advertising. However, Google also
potentially provides a better product to consumers by tailoring services.
For example, if I am able to use Google search to more quickly access the
information I want, then I am more likely to use Google as my primary
search engine. Another example is location data from a mobile device
enabling location-based services, such as locating me on a map or, to use
Google's example, telling me that I am 15 minutes away from a meeting that
starts in 15 minutes.

In this two-sided market we have two clear values of user data. First,
user data can lead to tailored services. Second, user data can also enable
better-tailored advertisements. Both support the service: one can lead to
more user value while the other provides more value per user to Google.
Essentially, user data is the currency that supports the transaction behind
otherwise "free" services online.

Now let's look at Google's new policy. Google is consolidating more than 60
privacy policies into a single one while at the same time noting that it
will be sharing data between its various services. This is Google's
eclosion - its transition from a variety of different services that share a
log-in, yet at times have different data collection practices, into a
single integrated service. Most notably, this will mean the sharing of data
between Google's web history and YouTube history, which is significant
because Google runs the largest search engine and the largest video hosting
website in the world. From my conversations with Google staff, I have
confirmed that this step will not include combining DoubleClick data. (In
fact, doing so would violate a Federal Trade Commission order here in the
U.S.)

From a business standpoint, this transition makes sense. Two of Google's
major competitors are striving for integrated experiences. Apple offers a
vertically integrated experience on iOS devices and is pushing the same App
Store restrictions onto OS X computers through Gatekeeper in the next
version of its operating system. Facebook is pushing to create the next
generation of the online portal - the very model that was rejected in the
90s - by offering applications and media within Facebook. Moreover, by
integrating a "like" button on a large number of websites, Facebook has
also made a major play to collect data on users' web browsing.

Google has responded to this competitive threat with its "+1" button
(which offers the same web browsing tracking as the "like" button) as well
as with its intent to become a company that offers an integrated
experience. The commercial pressures of the online marketplace are to
maximize, as you say, "economic surveillance," and Google's new policy
signals a clear intent to become the underlying platform through which
users interact with other online content.
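
To make this mechanism concrete, here is a minimal sketch in Python (the
endpoint and variable names are hypothetical, not Facebook's or Google's
actual code) of why an embedded "like" or "+1" button reveals browsing
activity: the visitor's browser fetches the button from the platform's
servers, and that request automatically carries the platform's cookie
(identifying the user) and the Referer header (identifying the page being
read). Logged across many embedding sites, those two values are enough to
reconstruct a browsing history.

    # Hypothetical sketch of a social-button endpoint (not any real
    # platform's code): serving the button is enough to log who read what.
    from wsgiref.simple_server import make_server

    def button_endpoint(environ, start_response):
        user_cookie = environ.get("HTTP_COOKIE", "")    # identifies the logged-in user
        visited_page = environ.get("HTTP_REFERER", "")  # identifies the embedding page
        # Collected across many sites, these pairs add up to a browsing history.
        print("user=%r viewed page=%r" % (user_cookie, visited_page))
        start_response("200 OK", [("Content-Type", "image/svg+xml")])
        return [b"<svg xmlns='http://www.w3.org/2000/svg'/>"]  # placeholder button image

    if __name__ == "__main__":
        make_server("", 8000, button_endpoint).serve_forever()

Any widget, analytics script, or ad pixel served from a third-party domain
works the same way, which is why the sheer reach of the "like" and "+1"
buttons matters so much.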

As I argue
<http://www.slate.com/articles/technology/future_tense/2012/02/google_privacy_policy_the_missing_opt_out_isn_t_the_only_problem_.html>
with my colleague Thomas Gideon, the problem here is not whether or not
Google offers an "opt-out." After all, telling users they must accept the
new policies or choose another service is an ultimatum rather than a
choice. Google does offer the ability to download all your data and leave -
a commendable step considering the other players in the space - but users
have considerable sunk costs in their log-ins for YouTube, email, and other
Google services. Additionally, once you log into a single Google service
you are automatically logged into the others; it is entirely unreasonable
to expect that someone will be chatting or emailing in one window and log
out in order to view a video in another. As we write, "forcibly bridging
services without the choice of a partial opt-out is an attempt by Google to
leverage user dependency on some services to increase the usage of
others—most notably Google+." In other words, Google depends on a user's
use of Gmail or YouTube to track *all* web browsing history. Because in the
analog world people interact with different spaces in different ways, what
Google should offer is the ability for a user to control their online
identity - i.e., the profile created by Google - across different services.
Users may decide that they would like to keep separate profiles for web
history vs. YouTube, or they may not. However, the current state of
Google's privacy controls is painfully service-specific while Google is
pushing a policy for an integrated service. I would argue that tools should
offer nuanced control of user identity within an integrated space.

Moving forward, I think it's worth accepting that the online model is
predicated on the exchange of user data for services. But there are some
components worth exploring.

One component is to question data collection itself. This includes
regulatory approaches to limiting the types of data that can be collected
by intermediaries, as well as the length of time for which this data can be
stored. Data has value for users as well as for services: location data can
aid mobile phone calls while travelling in a car, and search data allows
online services to be tailored to users' interests. However, these data
silos create concerns about users' ability to have past history forgotten,
and about the ability of law enforcement to easily access detailed history
without adequate due process or other limits.

A second component is to explore the transparency of the exchange. To what
extent are consumers aware of the transaction, and of the extent to which
data is collected? From a regulatory standpoint, some approaches clarify
this exchange without providing meaningful choices for consumers. This Joy
of Tech cartoon <http://www.joyoftech.com/joyoftech/joyarchives/1653.html>
lucidly illustrates one approach to consumer "protections" that obviously
falls short. Transparency is necessary but not sufficient.

A third component is the level of controls. In the United States, both
through initiatives of the Digital Advertising Alliance (DAA) and the W3C
standards-setting body, there are discussions of a "Do Not Track" header.
Theoretically, the header would allow a user to send a clear signal that
they would not like to be tracked. While there are technical methods for
circumventing this signal, it does provide an opportunity for regulatory
enforcement. Unfortunately, the DAA process, which recently received White
House support, would only limit the use of tracking for targeted
advertising, *not* the collection of web browsing data, which I think is
disingenuous. Rather, what I would like to see is an implementation of
something like Google Circles for online web tracking. Google Circles is an
interesting approach for social networks -- a clear way to define different
categories of social interaction and what types of data they will have
access to. I think it would be valuable for users to have similar controls
for their interactions with different websites or online spaces; a sketch
of how a service might honor the "Do Not Track" signal follows below.
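
As a minimal sketch of this third component, assuming a generic Python WSGI
service (the policy branches are hypothetical, since neither the DAA nor
the W3C has settled what compliance must mean), this is roughly what
honoring the header could look like on the server side; browsers that
support the proposal send the request header "DNT: 1".

    # Hypothetical sketch: a WSGI app branching on the "Do Not Track" header.
    def application(environ, start_response):
        # WSGI exposes the "DNT" request header as HTTP_DNT.
        do_not_track = environ.get("HTTP_DNT") == "1"

        if do_not_track:
            # Assumed policy: serve an untargeted ad and skip profile logging.
            body = b"untargeted ad placeholder"
        else:
            # Assumed policy: behavioural targeting permitted.
            body = b"targeted ad placeholder"

        start_response("200 OK", [("Content-Type", "text/plain")])
        return [body]

    if __name__ == "__main__":
        from wsgiref.simple_server import make_server
        make_server("", 8080, application).serve_forever()

The open question, as noted above, is what the non-tracking branch should
be required to do: merely stop showing targeted ads, or stop collecting
browsing data altogether.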

Finally, a fourth component is exploring the consumer harms of data
collection. In your email you linked to a paper in which you detail the
political economy of Google. However, I think it would be worth breaking
down where Google's revenue comes from. Based on this breakdown
<http://www.wired.com/epicenter/2011/07/google-revenue-sources/>,
two-thirds of Google's revenue comes from four sources: insurance, loans,
mortgages, and attorneys. At least in the United States, these are
industries in which some actors actively take advantage of different
communities. For example, the current recession was largely caused by
abuses in the mortgage industry. It is worth exploring whether
traditionally marginalized populations are likewise marginalized through
the personalization of advertisements online. Preliminary research suggests
<http://www.wired.com/epicenter/2011/07/google-revenue-sources/> that
racial minorities in the United States are indeed targeted with different
advertisements, but this is clearly an area that requires more research.

I look forward to exploring these issues further,

James

On Wed, Feb 29, 2012 at 6:27 PM, Christian Fuchs <christian.fuchs at uti.at> wrote:

> http://fuchs.uti.at/789/
>
> Google’s “New“ Terms of Use and Privacy Policy: Old Exploitation and
> Commodification in a New Ideological Skin
>
> On March 1st, 2012, Google changed its terms of use and privacy policy.
> What has changed? Has something changed?
>
> Google’s general terms of services that were valid from April 16, 2007,
> until the end of February 2012, applied to all of its services. It thereby
> enabled the economic surveillance of a diverse multitude of user data that
> was collected from various services and user activities for the purpose of
> targeted advertising: “Some of the Services are supported by advertising
> revenue and may display advertisements and promotions. These advertisements
> may be targeted to the content of information stored on the Services,
> queries made through the Services or other information”.
>
> Google specified in its old privacy policy (valid from October 20, 2011,
> until the end of February 2012) that the company “may collect the following
> types of information”: personal registration information, cookies that
> store “user preferences”, log information (requests, interactions with a
> service, IP address, browser type, browser language, date and time of
> requests, cookies that uniquely identify a user), user communications,
> location data, unique application number. Google said that it was using
> cookies for “improving search results and ad selection”, which is only a
> euphemism for saying that Google sells user data for advertising purposes.
> “Google also uses cookies in its advertising services to help advertisers
> and publishers serve and manage ads across the web and on Google services”.
> To “serve and manage ads” means to exploit user data for economic purposes.
> The Google ad preferences manager displays the user interests and
> preferences that are collected by the use of cookies and used for targeted
> advertising.
>
> Google’s old privacy policy specified that “Google uses the DoubleClick
> advertising cookie on AdSense partner sites and certain Google services to
> help advertisers and publishers serve and manage ads across the web”.
> Google used DoubleClick, a commercial advertising server owned by Google
> since 2007 that collects and networks data about usage behaviour on various
> websites, sells this data, and helps provide targeted advertising – for
> networking the data it holds about its users with data about these users’
> browsing and usage behaviour on other web platforms. There was only an
> opt-out option from this form of networked economic surveillance. Google’s
> privacy policy provided a link to this option. Opt-out options are always
> rather unlikely to be used because in many cases they are hidden inside of
> long privacy and usage terms and are therefore only really accessible to
> knowledgeable users. Many Internet corporations avoid opt-in advertising
> solutions because such mechanisms can drastically reduce the potential
> number of users participating in advertising. That Google helped
> advertisers to “serve and manage ads across the web” means that it used the
> DoubleClick server for collecting user behaviour data from all over the WWW
> and using this data for targeted advertising. Google’s exploitation of
> users is not limited to its own sites; its surveillance process is
> networked, spreads and tries to reach all over the WWW.
>
> The analysis shows that Google makes use of privacy policies and terms of
> service that enable the large-scale economic surveillance of users for the
> purpose of capital accumulation. Advertising clients of Google that use
> Google AdWords are able to target ads for example by country, exact
> location of users and distance from a certain location, language users
> speak, the type of device used (desktop/laptop computer, mobile device
> (specifiable)), the mobile phone operator used (specifiable), gender, or
> age group.
>
> On January 25, 2012, the EU released a proposal for a General Data
> Protection Regulation that defines a right of individuals not to be subject
> to profiling, which is understood as  “automated processing intended to
> evaluate certain personal aspects relating to this natural person or to
> analyse or predict in particular the natural person’s performance at work,
> economic situation, location, health, personal preferences, reliability or
> behaviour“ (article 20, 1). Targeted advertising is such a form of
> profiling. According to (the planned) article 20, 2 (c), profiling is
> allowed if the data subject consents according to the conditions of article
> 7, which says that if the consent is given as part of a written declaration
> (as e.g. a web site’s terms of use or privacy policy), the “consent must be
> presented distinguishable in its appearance from this other matter“
> (article 7, 2). The regulation furthermore proposes a right of citizens to
> be forgotten (article 17), which also requires that third parties be
> informed and asked to erase the same data (article 17, 2), and a right to
> data portability (article 18), which e.g. means that all personal data must
> be exportable from Facebook to other social networking sites. A further
> suggested regulation is that by default only the minimum of data that is
> necessary for obtaining the purpose of processing is collected and stored
> (article 23). Fines of up to 1 000 000 Euros and 2% of a company's annual
> worldwide turnover are foreseen (article 79). The EU regulation limits
> targeted advertising to a certain extent through the right to be forgotten
> and the special form in which consent must be given; it does, however, not
> make targeted advertising a pure opt-in option, which would be a more
> effective way of protecting consumers’ and users’ privacy.
>
> As a result of the announcement of the EU Data Protection Regulation,
> Google overnight announced the unification of all its privacy policies and
> the change of its terms of use. The use of targeted advertising is no
> longer defined in the terms of use, but in the privacy policy: “We use the
> information we collect from all of our
> services to provide, maintain, protect and improve them, to develop new
> ones, and to protect Google and our users. We also use this information to
> offer you tailored content – like giving you more relevant search results
> and ads”. Although Google presents its new policies as major privacy
> enhancement (“a simpler, more intuitive Google experience. […]  we’re
> consolidating more than 60 into our main Privacy Policy. Regulators
> globally have been calling for shorter, simpler privacy policies – and
> having one policy covering many different products is now fairly standard
> across the web” (http://googleblog.blogspot.**com/2012/01/updating-our-**
> privacy-policies-and-terms.**html<http://googleblog.blogspot.com/2012/01/updating-our-privacy-policies-and-terms.html>
> ).
>
> The core of the regulations – the automatic use of targeted advertising –
> has not changed. The European Union does not require Google to base
> targeted ads on opt-in. Google offers two opt-out options for targeted ads:
> one can opt-out from the basing of targeted ads on a) search keywords and
> b) visited websites that have Google ads (Ads Preferences Manager,
> https://www.google.com/settings/ads/preferences/).
>
> In the new privacy policy, “user communications” are no longer mentioned
> separately as collected user information. Rather, content is defined as
> part of log information: “Log information. When you use our services or
> view content provided by Google, we may automatically collect and store
> certain information in server logs. This may include: details of how you
> used our service, such as your search queries”.  Search keywords can be
> interpreted as the content of a Google search. The formulation that log
> information is how one uses a service is vague. It can be interpreted to
> also include all types of Google content, such as the text of a Gmail
> message or a Google+ posting.
>
> In the new privacy policy, Google says: “We may combine personal
> information from one service with information, including personal
> information, from other Google services – for example to make it easier to
> share things with people you know. We will not combine DoubleClick cookie
> information with personally identifiable information unless we have your
> opt-in consent”. This change is significant and reflects the EU data
> protection regulation’s third-party provision in the right to be forgotten
> (article 17, 2). Whether DoubleClick is used for Google’s targeted ads will
> largely depend on how extensively and aggressively Google tries to get
> users to opt in to DoubleClick. The
> effect is that Google will no longer be able to automatically use general
> Internet user data collected by DoubleClick. However, the unification of
> the privacy policies and the provision that information from all Google
> services and all Google ads on external sites can be combined allows Google
> to base targeted advertising on user profiles that contain a broad range of
> user data. The sources of user surveillance are now mainly Google services.
> As Google spreads its ad service all over the web, this surveillance is
> still networked and spread out. Google tries to compensate for the limited use
> of DoubleClick data for targeted advertising with an integration of the
> data that it collects itself.
>
> Concerning the use of sensitive data, both the old and the new privacy
> policy specify: “We require opt-in consent for the sharing of any sensitive
> personal information”.  In addition, the new policy says: “When showing you
> tailored ads, we will not associate a cookie or anonymous identifier with
> sensitive categories, such as those based on race, religion, sexual
> orientation or health”. Targeted ads use data from all Google services,
> including content data.
>
> The proposed EU Data Protection Regulation says that the processing of
> sensitive data (race, ethnicity, political opinions, religion, beliefs,
> trade-union membership, genetic data, health data, sex life, criminal
> convictions or related security measures) is forbidden, except if the data
> subject consents (article 9). Google continues to use content data (such as
> search queries) for targeted advertising that is based on algorithms that
> make an automatic classification of interests. By collecting a large number
> of search keywords by one individual, the likelihood that he or she can be
> personally identified increases. Search keywords are furthermore linked to
> IP addresses that make the computers of users identifiable. Algorithms can
> never perfectly analyze the semantics of data. Therefore, the use of sensitive
> data for targeted advertising cannot be avoided as long as search queries
> and other content are automatically analyzed. Google’s provision that it
> does not use sensitive data for targeted ads stands in contradiction with
> the fact that it says it uses “details of how you used our service, such as
> your search queries”.
>
> The overall changes introduced by Google’s new privacy policies and terms
> of use are modest; the fundamentals remain unchanged: Google uses targeted
> advertising as a default. DoubleClick is now less likely to be used for
> targeted advertising. Google has unified its privacy policies. Whereas
> Google presents this move as providing more transparency (“We believe this
> new, simpler policy will make it easier for people to understand our
> privacy practices as well as enable Google to improve the services we
> offer”, http://googleblog.blogspot.com/2012/01/updating-our-privacy-policies-and-terms.html),
> it also enables Google to base its targeted ads on a wide range of user
> data that stem from across all its services.
>
> Google claims that it does not use sensitive data for targeted ads, which
> is contradicted by the definition of content data as log data that can be
> used for targeted ads. Google’s old privacy terms (version from October 20,
> 2011) had 10 917 characters; the main privacy terms have since grown in
> length by about 30%, and thereby in complexity, although the number of
> privacy policies that apply to Google services was reduced from more than
> 70 to one.
>
> Google presents its updated terms of use and privacy policies as new,
> although no fundamental improvements of user privacy protection can be
> found. The “change” is an ideological marketing strategy aimed at
> maintaining the stability of the exploitation of the labour of users that
> generates value and generates Google’s profits that in 2011 amounted to
> $8.5 billion (http://www.forbes.com/global2000/#p_1_s_arank_ComputerServices_All_All).
> Google continues to automatically collect, analyse and commodify a
> multitude of user data that is generated by searches and the use of Google
> services. The Marxist communication scholar Dallas Smythe wrote in 1981:
> “For the great majority of the population […] 24 hours a day is work time.
> […] [Audiences] work to market […] things to themselves”. For the great
> majority of Internet users, most of Internet use is (value-generating)
> labour time. Internet users work on Google and other corporate platforms to
> market things to themselves and are transformed into an Internet commodity
> that is sold to targeted advertising clients in order to accumulate capital
> in the amount of billions of Euros.
>
> In a response letter to the EU Article 29 Data Protection Working Party
> (concerning Google’s updated policies and terms; see
> http://www.edri.org/book/**export/html/1225<http://www.edri.org/book/export/html/1225>),
> Google’s Global Privacy Counsel Peter Fleischer writes that “we are not
> selling our users’ data”. One wonders where Google’s US$ 8.5 billion in
> profits comes from, if not from the commodification of the data that
> results from users’ activities.
>
> The EU Article 29 Data Protection Working Party asked the French National
> Commission for Computing and Civil Liberties (CNIL) to analyse Google’s new
> policies. In a letter to Google, CNIL expressed deep concern and said that “our
> preliminary analysis shows that Google’s new policy does not meet the
> requirements of the European Directive on Data Protection […] Moreover,
> rather than promoting transparency, the terms of the new policy and the
> fact that Google claims publicly that it will combine data across services
> raises fears about Google’s actual practices. Our preliminary investigation
> shows that it is extremely difficult to know exactly which data is combined
> between which services for which purposes, even for trained privacy
> professionals. In addition, Google is using cookies (among other tools) for
> these combinations and in this regard, it is not clear how Google aims to
> comply with the principle of consent laid down in Article 5(3) of the
> revised ePrivacy Directive, when applicable. The CNIL and the EU data
> protection authorities are deeply concerned about the combination of
> personal data across services: they have strong doubts about the lawfulness
> and fairness of such processing, and about its compliance with European
> Data Protection legislation”. Big Brother Watch reports that only 12% of
> Google users have read the new policy and that 65% are not aware that
> the changes have now come into effect. The initiative says: “Google is
> putting advertiser’s interests before user privacy and should not be
> rushing ahead before the public understand what the changes will mean”.
>
> According to the proposed new EU Data Protection Regulation (
> http://ec.europa.eu/justice/newsroom/data-protection/news/120125_en.htm),
> Google’s exploitation of users is perfectly legal. That it is legal does
> not, however, mean that we cannot consider Google’s commodification a
> violation of user/consumer/Internet workers’ privacy, but rather that the
> EU’s suggested legal provisions do not provide enough protection for users.
> The only way forward is to require by law that all Internet companies (and
> companies in general) make targeted advertising an opt-in option, which
> would give users and consumers more control.
> Implementing such a provision requires not only courage but also a refusal
> to be intimidated by organised business interests. It is, however, the only
> way of putting privacy interests first. Today, profit stands over privacy
> protection and therefore over people. Google is one of the best examples
> of this circumstance. Google’s “new” privacy policy is not new at all and
> should consequently best be renamed to “privacy violation policy” or “user
> exploitation policy”.
>
> Related publication:
> Fuchs, Christian. 2011. A contribution to the critique of the political
> economy of Google. Fast Capitalism 8 (1).
> http://www.uta.edu/huma/agger/fastcapitalism/8_1/fuchs8_1.html
>
> _______________________________________________
> Discussion mailing list
> Discussion at lists.icts-and-society.net
> http://lists.icts-and-society.net/listinfo.cgi/discussion-icts-and-society.net
>

