Removing Trust

by Thomas Vander Wal


About two years ago I made a conscious effort not to use the term “trust” and encouraged those I was engaging for work and social interactions not to use the term either. The problem is not the concept of trust, but the use of the term, or more accurately its overuse. Trust gets used quite often, as it is a word that has high value in our society. There are roughly seven definitions or contextual uses of the term, which is problematic when trying to design, develop, or evaluate ways forward from understanding gaps and potential problems.

Initially, I started a deep dive into reading everything I could on trust to get a better grasp of the term and its underlying foundations. I thought this might provide better understanding and bring me back to using the term with more clarity. While this helped increase my understanding of trust as a term, it also confirmed the broad, fuzzy use of the term, even within attempts to clarify it.

Why the Use of the Term Trust is Problematic

When I was working with people to help improve their social software deployments or use of social sites, as well as in engagements in the B2B and B2C arenas, the term trust was used a lot. I would ask people to define “trust” as they were using it, and they would describe what they meant by trust, but within a sentence or two they had moved on to a different contextual definition. Sometimes I would point this out and ask them to redefine what they meant, noting the shift in usage. When I asked one group I was talking with to use other words as proxies for the term trust, things started moving forward with much more clarity and understanding. Also gone were the disagreements (often heated) between people whose disagreement was based on different uses of the term.

Once I started regularly asking people not to use trust, but proxies for the term, I began keeping rough track of the other words and concepts that were underlying trust. The rough list includes: respected, comfort, dependable, valued, honest, reliable, treasured, loved, believable, consistent, etc. Many found the terms they used to replace trust were more on target for what they actually meant than the word trust itself. There are some sets of terms that nicely overlap (dependable, reliable, consistent; valued, treasured), but one term that came up a lot and generated a lot of agreement in group discussions is comfort.

Social Comfort Emerges

Within a few months of stopping use of the term trust, comfort was the concept that came up often and seemed to be a good descriptor for social software environments. It was a social comfort with three underlying elements that helped clarify things. Social comfort for interacting in social software environments was required for: 1) People; 2) Tools; and 3) Content (subject matter). I will explain these briefly, but really need to come back to each one in more depth in later posts.

(A presentation to eXention last year turned what was publicly one slide on the subject into a full 60-minute-plus presentation.)

Social Comfort with People

Social comfort with people is essential for people interacting with others. Some of the key questions people bring up with regard to social comfort with people are: knowing who someone is, how they will interact with you, what they will do with information shared, the reliability of the information they share, whether they are safe, whether I can have reasonable interaction with them, and why I would interact with this person. One of the biggest issues is, “Who is this person and why would I connect or interact with them?” But most social software tools, particularly those for internal organizational use, rarely provide in their profiles the contextual information or depth needed to answer that question. Even in organizations where most people have relatively “complete” profiles, the information in the profiles is rarely information that helps answer the “Who is this person and why should I listen or interact with them?” question.

Social Comfort with Tools

Social comfort with tools is often hindered not only by ease of use, but by ease of understanding what social features and functionalities do, as well as with whom the resulting information is shared. There is an incredible amount of ambiguity in the contextual meaning (direct or conveyed) of many interface elements (ratings, stars, flags, etc.). This leads to the social reticence of a click, where people do not star, flag, rate, or annotate because the meanings of these actions are not clear (to the system or to other people), nor is it clear who sees these actions and what the actions mean to them. Nearly every organization has a handful, if not many, examples of these interactions being misunderstood in actual use. The problems are often compounded as sub-groups in organizations establish their own contextual understandings of these elements, which may carry the opposite meaning elsewhere (in one group a star may mark items a person is storing to come back to later, while in another it means a person likes the starred item and can be construed as light approval). Even in services where this is well defined and conveyed in the interface, this conflict in understandings occurs. (This is not to ward people off use, but to point out the lack of consistency of understanding that occurs, although the 5-star rating and its variations are nearly universally problematic and need a long explanation as to why.)

Social Comfort with Content

Social comfort with content or subject matter can hold people back from using social software. People may have constructive input, but their perceived lack of expertise may be (and often is) what inhibits them from sharing that information. A means for gathering this constructive feedback is needed, along with the ability for others to ask questions and interact, which usually rules out anonymous contributions. (Additionally, anonymous contributions rarely help mitigate this problem, as anonymity does not really provide comfort; inside most organizations it is also quite easy to work out who is behind any anonymous contribution, so it is false anonymity.) People often have contributions they believe are helpful, but that may not be fully fleshed out, or that need to be vetted for internal political reasons or put in context (terminology and constructs that are most easily understood and usable) through the vetting of others (with whom there is social comfort).

Improving Outcomes with Focal Shift

One of the outcomes of this shift from the term trust to other terms, including social comfort, is that areas needing attention are more easily seen, discussed, and considered, and potential solutions identified. The end results are often improved community management, improved interfaces and interactions in the services, better tools through iteration, and improved adoption.