How Trust & Safety Processes Can Be Improved With GigCX Delivery

Everest Group has published a new research report titled ‘The Wisdom of The Crowd – The Gig Delivery Model in Trust & Safety’. The report explores the opportunity for gig delivery models – such as GigCX – to dramatically improve how Trust and Safety (T&S) services are delivered.

T&S is focused on content moderation. Any service that allows user-generated content needs to enforce safety rules that protect the entire community. Social networks are the most obvious example – anyone can upload almost anything to services such as YouTube and Facebook, but those services don’t publish every upload because some content may offend or harm other users.

However, T&S now extends far beyond social networks. Many companies encourage customers to post pictures, videos, and reviews of their products. Once end customers can publish their own content online, a content moderation filter is required to keep everyone safe.

Everest identified several advantages of using the GigCX model:

  • Improved profitability: around 70% of companies that engaged a gig workforce found their costs reduced by 30-40% – thanks to lower employee costs and reduced real estate spend.
  • Access to local talent: when moving into new locations, it is easy to find local people to deliver customer support in the local language.
  • Dynamic scalability: the ability to quickly scale resources up and down for seasonal or short-term peaks in business activity.
  • Focus: transactional tasks can be handled by the gig workforce, allowing full-time employees to concentrate on higher-value, more strategic work.
  • Diverse and flexible workforce: working hours and working days are flexible – you indicate when you need resources, and gig workers respond.

However, there is a kicker. Finding people who want to work full-time on content moderation is difficult, and for good reason. An Artificial Intelligence (AI) system will usually ensure that the most offensive or violent posts are never published, but when the AI is unsure, it escalates the decision to a human moderator. This means the employees in these roles are subjected to an endless stream of offensive and upsetting content.
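
To make that hand-off concrete, here is a minimal sketch in Python of how a confidence-threshold routing step might work. The names and threshold values are hypothetical, not any specific vendor’s API: clear-cut content is handled automatically, and only the uncertain middle band reaches a human moderator.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- real systems tune these per policy and content type.
AUTO_REMOVE_THRESHOLD = 0.95   # classifier is highly confident the content violates policy
AUTO_APPROVE_THRESHOLD = 0.05  # classifier is highly confident the content is safe

@dataclass
class ModerationItem:
    content_id: str
    violation_score: float  # 0.0 (clearly safe) to 1.0 (clearly violating), from the AI model

def route(item: ModerationItem) -> str:
    """Decide whether an item is handled automatically or queued for a human moderator."""
    if item.violation_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"        # the most offensive content never reaches the community
    if item.violation_score <= AUTO_APPROVE_THRESHOLD:
        return "auto_approve"       # clearly safe content is published without review
    return "human_review_queue"     # the uncertain middle band goes to a human moderator

# Example: only the ambiguous item is escalated to a person.
for item in [
    ModerationItem("post-001", 0.99),
    ModerationItem("post-002", 0.02),
    ModerationItem("post-003", 0.55),
]:
    print(item.content_id, "->", route(item))
```

The wider the uncertain band between the two thresholds, the safer the outcome for the community – but the more items land in front of human moderators, which is exactly the workload the gig model is meant to absorb.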

This is difficult to manage inside a traditional Monday-to-Friday work environment. Companies offer on-site psychologists and allow employees to take breaks whenever the content becomes overwhelming, but it remains an environment where employees need to step away and regain distance whenever it feels like too much.

This is why Everest believes a gig model is far more effective for content moderation. By building a bench of talent experienced in content moderation processes, companies can ensure that moderators only work when they feel ready to work.

Nobody forces them to work more hours than they want, and content moderation shifts can be kept short – allowing plenty of time to regain distance from the content.

The content that requires moderation isn’t pleasant, but moderating it is essential for any site that lets users upload their own content. AI can catch the most offensive material, but there will always be uncertain cases that require human intervention to protect the community.

Allowing humans to step in and out of the process whenever they feel comfortable is a perfect delivery model for T&S. The Everest report makes this point, and I entirely agree. Sometimes it is better to focus on a task for short periods and to be flexible around working hours.

Interested in learning more? Click here to read about the additional benefits of GigCX.

Terry Rybolt, CRO at LiveXchange, wrote this article. If you are interested in connecting with him and learning more about GigCX, click here.