Since my keynote on neo-informationalism regarding the Google-China saga, I started thinking that one of the blind spots of living in a neo-informationalist world is to see "free information" as a binary - either information is open or it's not, either you make your identity known or you don't (update - I develop the idea of neo-informationalism in my piece on Haystack censorship tech). This builds directly on danah boyd's point that we tend to treat privacy as a binary - either we have it or we don't. I'll come back to danah's work later.
So how is this blind spot built into our social media technologies, and how do people make sense of this? (Eszter Hargittai and danah boyd's recent research on facebook is a great example of how users are managing privacy settings.) I'm wondering how that changes the ways these technologies are used in places with different conceptions of privacy and information. How do people make decisions to share information with social technology applications? How can we understand privacy as a cultural practice? I've been thinking a lot about these questions as they relate to privacy, trust, and relationships as I prepare for my fieldwork in China.
In a country that is just beginning to create a rule of law based on individual rights and justice, the importance of maintaining anonymity in many contexts is critical because it means that one can put their idea(s) out there without the fear of personal retribution. So one of the most important priorities for online users in China is the ability to be anonymous.
A western approach of complete information openness wouldn't work in China because the anonymous user has an important role in maintaining information openness in a Chinese context. Countless online and offline stories in China have succeeded because of the mass participation of millions of anonymous users leaving comments, making posts, and participating in online discussions.* Privacy is critical for these individuals because it allows them to have a voice - a voice they wouldn't be able to have if they made their identity open. We have to recalibrate our expectations for places with different social-political contexts of information and privacy. I'm afraid that Western companies don't have a nuanced understanding of the cultural intricacies surrounding privacy in China (and, as many scholars have pointed out, in the West also).
How can companies design technologies with the understanding that anonymity is a right, not a privilege? Or even more relevant is to ask, how do companies design the right to privacy/publicness into our technologies?
Google Buzz, a product recently launched by Google in the US, ran into a lot of problems because Google misunderstood the importance of privacy for users and how users defined privacy. In her recent talk, danah boyd argued that Google understood privacy as a binary, private vs. public, and failed to see privacy as a spectrum. After danah's talk, the Buzz team admitted that they had screwed up. So even Google had to learn that privacy isn't always evil.
I think one of the interesting things to come out of this lesson that Google quickly learned from is that open-access to information cannot always be the default. This default works for some of their products because these services (such as search) tend to work best in an open-access free-information environment. Both searchers and search providers benefit from information non-scarcity. (There are unintended consequences to searching, but I’ll leave that alone for now.)
But social applications that serve to mediate personal ties do not operate in an open-access environment. No matter how much we design "openness" into our social technologies, social technologies operate under conditions of information scarcity because social ties are scarce. We value our ties because we have a limited number of them, whether it's our 2 best friends from childhood or 60,893 Twitter followers or 300 facebook friends. Social ties take time to create and nurture; they can be fragile, unpredictable, meaningful, and/or sensitive; and they are limited.
GUANXI and SOCIAL CONNECTIONS - To really understand anonymity, we have to explore the meaning of guanxi in China. Guanxi is the Chinese equivalent of social connections. Just like one's social connections in the US, a Chinese person's guanxi consists of people they know on a personal, familial, or professional basis. Guanxi also means that social connections carry a level of mutual obligation.
A lot of scholars and journalists have framed guanxi as a unique Chinese social phenomenon but I argue that they overemphasize practices of mutual obligation.
I just don't buy the argument that Chinese people value their social network that much more than other people. This argument implies that others, such as Americans, care less about their social connections or place less value on social obligations than Chinese people. That's simply not true. Look at our obsession with managing our social networks. If anything, Americans want to believe that success is purely based on the individual. But any sociologist can tell you that income, social networks, race, education, parents' education, and all that stuff that helps you meet other people does matter. A lot. And they also matter in China, but in different ways.
WHY CHINESE PEOPLE MIGHT HAVE DIFFERENT IDEAS ABOUT PRIVACY - So why might Chinese people have a different cultural orientation towards social connections? I need to explore this further, but my initial hypothesis is that Chinese ideas about privacy are connected to the recent historical period of repression, a different cultural historical experience, and different orientations towards social visibility.
1.) Chinese history is still rife with fresh memories of people who suffered because their social connections were made explicit. This is still true in mixed-market Communist China; however, it may change as people are no longer penalized for their social connections and as there is more temporal distance from the traumatizing events of the past. Social amnesia can present an opportunity for new practices to be born.
2.) Making social connections explicit can be seen as a form of bragging, which in general is not seen as a favorable trait in China. There is a cultural expectation that the more people you know, the more careful you are to not flaunt these social connections.
3.) People are much more judicious about making their social connections explicit. People don't always invite someone else to be their contact on a social media site because they sometimes aren't sure that the other person wants to be their contact or wants their connection to be made explicit. They fear that the other person will feel obligated to become their social contact and that, from then on, the actual real-life social connection could be ruined by this awkward dance of social media connections. In my research, adults and youth both expressed a lot of doubt, fear, and confusion about making someone a "contact." Many of them preferred to just keep chatting with their private list of contacts over QQ because it was easier and more comfortable to manage their social connections privately than to engage in a platform that made their networks more visible to other people.
PRIVACY AS CULTURAL - I find it more useful to think of privacy as a cultural practice than as an act of rational choice between private vs. public. As I stated earlier, danah boyd insightfully makes the point that privacy is not a binary - it's not just on or off - it's a spectrum of contexts that are a lot more complex than our online architectures are designed for right now. Following danah's point, I am going to start thinking of privacy as a cultural practice. 'Privacy as Cultural' means that we have to start asking what the multiple histories and narratives attached to various notions of privacy are in any one place/region. There are multiple notions of privacy at any one time competing, conforming, complementing, and cohering. Framing privacy as a cultural act means that we can observe it and describe it. Privacy is a process, it's negotiated, and it's constantly in flux.
HOW TO UNDERSTAND CULTURAL ASPECTS of PRIVACY - Making the case that privacy is cultural all of a sudden sounds kinda touchy-feely. It can be difficult to get a handle on culture, and it can be even more obscure to think about how companies could become more attuned to the nuances of privacy.
GUANXI, PRIVACY, and TECHNOLOGY - What technology companies designing for the Chinese market need to grasp is that cultural orientations towards privacy - especially around guanxi - matter. They matter because if the technologies that are designed for social networking in the US are simply re-launched in China, they will fail. They will fail because Chinese people do not share the same cultural orientation towards anonymity, privacy, and user preferences in online or offline social networks as Americans. Guanxi is something one holds near and dear, so close that one doesn't want to reveal it. Let me play with this analogy - social connections in China are like underwear, whereas social connections in America are like a jacket. Chinese people want to keep their social connections out of the public eye, while American people want to display their social connections. The difference here is that Americans and Chinese have different cultural orientations towards transparency, privacy, and anonymity.** In real life, social connections can be defined in more implicit or explicit terms, depending on how social connections are made known in the specific context.
For example, we can learn so much from Chinese people who have tried to replicate successful American social networks and failed. One example is Linkedin. Linkedin is a US online social networking site where users list all the jobs they have ever had and all the people they know or have worked with in the form of "connections." Around 2004-05, Lin Feng 林枫 copied Linkedin for the Chinese market. It was a total failure. Why? Because Chinese people didn't want to show off their underwear - the copy-cat failed because Chinese people didn't want to make their social connections explicit.
Take Kaixin, the Chinese equivalent of Facebook. If you talk to most people who use it, they will tell you that they use it to connect to friends. But if you actually observe what they are doing, you will see that they use it to look for music. Yes, music. It's kind of like myspace stripped of social connections. Underlying this supposed social media network that seems to be a copycat of myspace and facebook is an extensive music exchange network. That's definitely different from how we use social media here in the US. The music industry has instilled enough fear and guanxi throughout American-based social media companies to ensure that music does not become an easily shared commodity.
The stories of the Linkedin copy-cat and Kaixin show how cultural orientations towards privacy and social connections matter in how a technology is used. What companies and scholars have to understand is that:
1.) it's not that social connections matter more to Chinese people and less to American people; it's that they matter in different ways that we might not notice at first glance
2.) technologies are NOT neutral
3.) "free-information" narratives must be contextualized - free to what ends? what are the socio-political contexts for free? what do people expect of "openness"?
4.) social media apps are not universal in the ways they are used
SO WHAT'S NEXT? Understanding privacy as culture is an important lesson for tech companies that are increasingly focusing their design energy on software. Even once hardware-based companies, like Nokia, have to re-define material practices as linked to cultural understandings around social media applications. (I'll write another post on Nokia.)
Well, there is so much more to understand and explain, and I hope to contribute more to this dialogue. I would love to see more research that makes clear how the values of guanxi in China differ from the values of connections in the US, and how this difference can be turned into an awareness that is designed into technologies for the Chinese market. So one of the questions that I will be answering in my fieldwork is: how can services/apps be designed for communities with alternative orientations towards transparency?
So I've decided to dedicate a portion of my fieldwork in China to understanding the cultural aspects of privacy. I thought one way to really get at local notions of privacy is to spend time with local venture capitalists and entrepreneurs of failed or ongoing Web 2.0 technologies.
Research on failure offers many cultural insights for understanding how innovation takes place and how values are mis-read or mis-built into technologies. I am really excited to spend some time in Beijing and Shanghai with people who have created all these failed twitter-like copycats that the government has shut down. There's more to the story than that Chinese Web 2.0 land is just a pure copy of US web 2.0 apps. A recent techcrunch article portrayed Westerners rushing into China and licking their wounds over US-introduced technologies that have failed in China. The article doesn't mention all the exciting experimentation happening on the ground with Chinese VCs and entrepreneurs. For example, Farmville is modeled on a farming game invented in China.
The majority of my fieldwork will still involve making sense of how new users - rural-to-urban migrants in Wuhan - interact with these new online technologies. I'm going to be moving to Wuhan, China, and making frequent visits to Beijing and Shanghai for 1 year of ethnographic research starting March 2011. If you're in China and are interested in these topics, let's talk! Or if you are or know of any Chinese entrepreneurs or venture capitalists of the internets, I would love to chat with you!
(thanks Chun Xia for inspiring me to follow up on Chinese entrepreneurs!)
*Check out Min Jiang’s articles on online public deliberation in China. Her research suggests that the current limitations of speech online should also be examined alongside reforms being made on the ground in local citizen participation.
Jiang, Min. 2009. "Exploring Online Structures on Chinese Government Portals: Citizen Political Participation and Government Legitimation." Social Science Computer Review 27:174-195.
Jiang, Min. 2010. "Authoritarian Deliberation."
**I realize that I'm generalizing here, that there are millions of Americans who don't want to be online or even have their social connections documented, and that there are millions of Chinese people who would love to make all their connections public. But I do believe that social media technologies are designed for the greatest number of users, and there is no doubt that facebook, twitter, myspace, linkedin, and other online apps wouldn't be as successful in the US were it not for a larger social proclivity among users to make their social connections explicit.