The Case for Optimism in Employee Listening
by Greg Moran
On Sunday, November 12, LinkedIn included in its Daily Rundown an article from The Guardian about employee listening technology. They then asked for feedback under the hashtag #bigbrotheremployer (nothing leading in that hashtag!).
It's been an interesting few days watching people weigh in on the topic, including Future of Work author Jacob Morgan.
Full disclosure: I'm the COO of Aware and I am quoted in The Guardian's article. I thought that it might be helpful to provide a perspective from the point of view of someone who has spent years contemplating the virtues, vices and slippery slopes of the employee listening landscape.
Context is King
Let me start by letting everyone know (sardonic spoiler alert!) that this horse has been out of the barn for a very, very long time, but it's a serious topic that we MUST continue to discuss. For various reasons, employers have monitored employees at least since the industrial revolution (and long before). We've always hated it… or at least mildly resented not being 'trusted' by our employers.
What we don't do as well is connect the dots between employee listening when it freaks us out and employee listening when it has protected us or someone we care about, or saved the company from a damaging incident.
So now that better technology is available, we have an opportunity to take the whole question of knowing what is going on inside a large enterprise to the next level.
Let's all agree that it's time to move beyond checking for bad words and assuming bad intent; we need to understand context and patterns, not just facts. Before we dive into where we should head with this tech, let's talk about why.
An Inconvenient Truth
It is an inconvenient truth that not everyone inside an enterprise is trustable, despite all efforts to hire trustworthy employees. Just ask the engineer at Uber who was propositioned by her boss on the company collaboration network.
Statistically, the proof is clear. Harvard Business Review reported that there were 80 million insider attacks in 2014, and IBM found in 2016 that 75% of those attacks were malicious.
It is an inconvenient truth that an employee who revealed his/her sexual identity really did fear reprisal and really did use private messaging to express concern to a friend (this is a case we shared in the article and it is a real case). Since August of this year Uber, Twitter, Google and Oracle are all facing lawsuits relating to harassment or bias.
It is an inconvenient truth that some corporate programs produce unintended consequences that would be better understood sooner rather than later (just ask the 5,300 people fired at Wells Fargo for defrauding customers… over a long period).
It is an inconvenient truth that people act one way in formal meetings and another way on their company's social network. It is further a reality that many companies operate in a highly regulated world and must positively prove that they are operating in compliance with those regulations, even in cases where there was no intent to be non-compliant.
Similarly, many employees do things with efficiency in mind that massively increase the risk surface area of a company to bad actors.
In the same way that the vast majority of us appreciate the value of community policing and cameras to keep us safe, the vast majority of us accept that some level of visibility is required in the workplace to ensure that we are safe from the various risks that we've mentioned.
If a young female engineer is propositioned on the company network (which was installed to spur collaboration and innovation) by her boss, how likely is she to feel inclined to participate in that network — which she knows for sure is being used to harass?
Risks are a Reality
The reality is that companies must provide a safe, compliant and secure environment for employees, customers and other stakeholders or risk paying dire consequences.
Travis Kalanick is no longer running the company he founded, and Uber now faces five active FBI investigations. Google has a divided workforce and a damaged reputation that will take a long time to repair.
Most companies and managers use monitoring technology to give their employees confidence that they work for a company that cares about protecting them and managing risk responsibly.
Curbing Misuse Through Thoughtful Design
Yes, the technology can also be weaponized and manipulated, but a well-designed monitoring program will catch that too and surface it for others to see.
We emerged from Plato's cave a long time ago and we live in a complex world. Rather than assume that all technology that could be used unthoughtfully will be used unthoughtfully, let's embrace that it can be used thoughtfully in a way that makes us better as a species and then thoughtfully design the controls we need to manage the risks.
We see this happening now in the EU with GDPR (the General Data Protection Regulation), and surely much of the world will follow its lead. We are headed to a place where, with machine learning, we'll have monitoring capabilities that can anticipate, protect, suggest, and create insight for both individuals and companies, helping them compete and win with high engagement.
That's our mission at Aware: one grounded in optimism and aspiration.