The transparency cure modern tech so desperately needs


On 14 November 2022, the mighty tech giant Google agreed to pay $391.5 million to settle a lawsuit brought by a coalition of state prosecutors.

The legal action, filed by 40 state attorneys general, alleged that Google knowingly misled users about location tracking on their Google accounts. According to the plaintiffs, thousands of Google customers were fooled by ambiguous language and other techniques into believing location tracking was turned off when, in fact, it was running in the background of their devices.

The massive monetary settlement is, reportedly, the biggest data privacy settlement in US history.

Google, of course, tried to play down the issue, stating via a spokesman that the issues identified by prosecutors were “outdated product policies that [were] changed years ago.”

But the problem goes way beyond the issue of whether or not Google knows where you spent Thanksgiving this year. 

This case is just the latest episode underscoring a very real transparency problem in the world of information technology.

The extent of the problem

There’s an old saying in the TV industry, recently popularized by the hit documentary The Social Dilemma: “If you’re not paying for the product, you are the product.”

The recent court case involving Google is an excellent reminder of this fact. 

In addition to the massive fine, the settlement included another important clause Google must now abide by: starting in 2023, the company must make serious changes to its location tracking disclosures, making it clearer and easier for users to see when tracking is enabled.

This is, at the end of the day, the heart of the issue for which Google was sued in the first place. It’s not that they tracked people’s whereabouts. It’s that they didn’t tell users they were doing it. 

Google and every other major platform in media and IT make their money off of data: data on you and me. Business models like Google’s rely on information collection to sell highly targeted ads. Thus, there is a massive incentive to collect as much of that data as possible without telling users it’s being collected, lest they stop giving it.
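To get a feel for the size of that incentive, here is a toy back-of-the-envelope sketch in Python. Every number in it is invented for illustration, but the arithmetic shows why even a modest lift in ad click-through rates, the kind that richer user profiles buy, translates straight into revenue.

```python
# Toy calculation (all numbers invented) of why better targeting
# is worth money: more user data lifts click-through rates.
impressions = 1_000_000
price_per_click = 0.50          # dollars per click, hypothetical

ctrs = {
    "untargeted": 0.005,        # generic ads shown to everyone
    "targeted":   0.020,        # ads matched to location, history, interests
}

for label, ctr in ctrs.items():
    revenue = impressions * ctr * price_per_click
    print(f"{label:>10}: ${revenue:,.0f}")
# untargeted: $2,500 vs targeted: $10,000 -- every extra signal that
# lifts the click-through rate translates directly into revenue.
```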

To sum this up: Transparency is contrary to the business interests of our biggest tech and media companies.

From concealment to manipulation

Unfortunately, the transparency problem does not end with unwanted tracking or data disclosure.

From what I’ve seen over decades in corporate tech, I can tell you that the biggest problems go much deeper and are only starting to come into public awareness.

Companies that deploy tech are not just failing to disclose important information to users. They are deploying technology that is outright manipulative.

Luckily, in the past couple of years, there’s been much more awareness of this issue, especially when it comes to online social platforms. The powerful influence these tools exert on their users, to the point of producing clinical addiction, is now well known.

But I do think this precarious situation, and the steps necessary to resolve it, are still not widely understood.

The popular conception of the so-called “Big Tech” problem is that tech companies design their products with the explicit intention of manipulating their customers.

To be fair, there’s quite a bit of evidence to support this view. 

For years, tech executives have gone on record expressing regret for helping produce manipulative platforms. One of the milestone revelations came in a 2017 talk at Stanford by Facebook’s former vice president of user growth, Chamath Palihapitiya. During the address, Palihapitiya told the audience he felt “tremendous guilt” for helping create “tools that are ripping apart the social fabric of how society works.”

The knee-jerk response to this has been advocacy for more laws controlling what these technological tools are and aren’t allowed to do. Indeed, the drive to rein in tech has been one of the central motivations behind the sweeping data regulation of recent years, from GDPR, to California’s CCPA, to the more recent amendments to New York’s DFS cybersecurity rules.

Now, while it is true that Big Tech will sometimes create algorithms with manipulation in mind, and yes, regulation does have a role in curbing that, I think there’s something a bit off about this picture.

Well-meaning and dangerous

The truth is, most tech creators do not go about their work with evil intentions in mind.

Quite the contrary. 

In my experience, most developers actually want to create something that will have a positive impact on people’s lives.

The problem, at least most of the time, doesn’t come from some wicked plan on the part of creators. It comes from unintended consequences.

Essentially all of the major tech platforms that have become so ubiquitous in our modern world run on smart algorithms: programs that use artificial intelligence to learn about their environment and alter their own behavior. These programs are created to optimize products, enhance user experience, and improve the overall service being provided.

What we are discovering more and more, however, is that by instructing an AI engine to achieve a certain goal, we unleash a range of consequences we could not have anticipated. AI algorithms designed to increase productivity, for example, have been shown to produce deteriorating working conditions for employees, because the expectations of the AI do not comport with how actual humans operate. Similarly, AI models used to determine the best candidates for certain job roles have produced huge biases in hiring practices within the companies that use them.
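To make the mechanism concrete, here is a minimal sketch, with invented items and scores, of a recommender that greedily maximizes a single engagement metric. The point is not that any real platform works exactly this way; it is that an objective which measures only engagement gives the optimizer no way to see the harm it causes.

```python
# Minimal sketch of a single-metric optimizer. Items and scores are
# invented; "wellbeing" stands in for everything the objective ignores.
items = {
    "calm_documentary": {"engagement": 0.30, "wellbeing": +0.5},
    "hobby_tutorial":   {"engagement": 0.45, "wellbeing": +0.3},
    "outrage_clip":     {"engagement": 0.80, "wellbeing": -0.4},
    "insecurity_ad":    {"engagement": 0.90, "wellbeing": -0.6},
}

def recommend(catalog):
    # The objective is one proxy metric: predicted engagement.
    # Nothing here penalizes harm, so the optimizer never "sees" it.
    return max(catalog, key=lambda name: catalog[name]["engagement"])

choice = recommend(items)
print("recommended:", choice)                           # insecurity_ad
print("wellbeing effect:", items[choice]["wellbeing"])  # -0.6
# No one coded "harm users"; the harm falls out of a well-intended
# objective that measures only what was easy to measure.
```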

Similar, and in my opinion much worse, effects have taken shape when AI is used in personalized user platforms. The creator intended to give the user a more attractive product; what the system ends up doing is using manipulation to get the user hooked. This manipulation can take many forms, including personalized addictive strategies for the consumption of digital goods, or taking advantage of individuals’ emotionally vulnerable states to promote products and services. It can become powerful enough to inflict immense psychological harm on users, especially young people.

The important point to highlight here is that this was not an outcome that anyone planned. No nefarious cabal of tech creators sat around one day thinking up ways to emotionally torture millions of teenagers. 

It was the effect of instructing a powerful technology to achieve certain outcomes, outcomes you could argue are desirable in and of themselves, without knowing the consequences it would produce along the way.

The transparency cure

I’m a firm believer in the power of creativity to take on our most pressing challenges.

It was on the basis of this belief that I co-founded my company, and it’s what allows me today to support my clients and help their enterprises thrive.

I don’t want to see a world where we are afraid of innovation and technological advance. And while regulations and rules have their place, they cannot be the end-all solution. Too many laws governing tech can impede growth and reinforce an attitude of skepticism toward technology. 

What I believe is the cure to our data technology conundrum can be summed up in one word: transparency.

By bringing the value of transparency back into tech creation, we can revive a sense of trust between consumers and creators. We all understand that companies deploy platforms to profit from their use. But disclosing how those platforms are used (and how they are often using us) is essential if we are going to continue on the remarkable growth trajectory we’re currently riding.
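What might such disclosure look like in practice? Here is one hypothetical sketch, entirely my own invention rather than any existing standard or product, of a machine-readable data-use notice a platform could surface at the moment of collection.

```python
# Hypothetical sketch of a machine-readable data-use disclosure; the
# field names and values are invented for illustration, not a standard.
from dataclasses import dataclass, asdict
import json

@dataclass
class DataUseDisclosure:
    feature: str             # the product feature collecting data
    data_collected: list     # what is actually gathered
    purpose: str             # why it is gathered
    shared_with: list        # third parties receiving it, if any
    user_can_disable: bool   # whether collection can be turned off

notice = DataUseDisclosure(
    feature="Location History",
    data_collected=["GPS coordinates", "nearby Wi-Fi access points"],
    purpose="ad targeting and route suggestions",
    shared_with=["advertising partners"],
    user_can_disable=True,
)

# Surfacing something like this, in plain language, at the moment of
# collection is the kind of clarity the Google settlement now demands.
print(json.dumps(asdict(notice), indent=2))
```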

Some doubt the viability of integrating transparency into the datasphere, arguing that the cost of restructuring our current way of doing things is too high and that no one will want to undertake the challenge.

This is why we need to start deploying solutions that can help us carve a path forward.

In my experience, when people learn of the damaging effects a lack of transparency causes, they become genuinely concerned. When they realize how transparency and trust can add to their value as an enterprise, they become excited. And the minute they learn about the smart platforms that can make transparency a reality for them, they feel empowered.

Spreading this empowerment is Vendict’s mission.
