Microsoft CEO Satya Nadella leaves the Elysee Palace after a meeting with the French President Emmanuel Macron in Paris on May 23, 2018.
Aurelien Morissard | IP3 | Getty Images
If Microsoft were to complete an acquisition of TikTok, it would gain a company with much potential for advertising revenue growth.
But with such a purchase, Microsoft would also take on an entirely new slate of problems.
Microsoft announced on Aug. 2 that it was in talks to purchase TikTok’s business in the U.S., Australia and New Zealand, with a deadline to complete the deal by Sept. 15. TikTok is currently owned by Chinese tech company ByteDance and has become a target of the Trump administration and other governments over privacy and security concerns. Trump also signed an executive order last week that would ban U.S. companies from doing business with TikTok, but it’s unclear how that order could affect a potential acquisition by Microsoft.
In the U.S., TikTok has grown to more than 100 million monthly users, many of whom are teens and young adults. Those users tune into TikTok to see full-screen videos uploaded to the app by others. These videos often feature lip syncing over songs, flashy video editing and eye-catching, augmented-reality visual effects.
To say that TikTok represents a business radically different from the enterprise software Microsoft specializes in would be an understatement.
For Microsoft, TikTok could become an advertising revenue powerhouse, but this potential is not without its own risks. Like other social apps, TikTok is a target for all kinds of problematic content that must be dealt with. This includes basic problems such as spam and scams, but more complicated content could also create headaches for Microsoft.
This could include content such as misinformation, hoaxes, conspiracy theories, violence, prejudice and pornography, said Yuval Ben-Itzhak, CEO of Socialbakers, a social media marketing company.
“Microsoft will need to deal with all of that and will be blamed and criticized when they fail to do so,” Ben-Itzhak said.
Microsoft declined to comment, and TikTok did not respond to a request for comment on this story.
These challenges can be overcome, but they require large investments of capital and technical prowess, two things Microsoft is capable of providing. And already, Microsoft has some experience when it comes to moderating online communities.
In 2016, Microsoft purchased LinkedIn for $26.2 billion, and although the career-focused professional service does not face content issues to the degree its peers do, it is still a social network. Microsoft has also run Xbox Live, the online gaming service, since its launch in 2002. Online gaming and social media are different beasts, but they do share similarities.
“Combating misinformation will need to be a mission critical priority. Microsoft will be new to this as it doesn’t have experience managing a high profile social network at this scale,” said Daniel Elman, an analyst at Nucleus Research. “That said, if any company can acquire or quickly develop the requisite skills and capabilities, it is Microsoft.”
But these are no small challenges, and these types of problems have become major issues for TikTok’s rivals.
Facebook, for example, was accused of not doing enough to curb fake news and Russian misinformation ahead of the 2016 U.S. election, and four years later, the company still consistently comes under criticism over whether it is doing enough to keep that type of content off its services. In July, hundreds of advertisers boycotted Facebook over its failure to contain the spread of hate speech and misinformation.
Twitter, meanwhile, began to lose key users, like comedian Leslie Jones, after the company let harassment run rampant on its social network. The company has spent the past couple of years building features to reduce the amount of hateful content users have to deal with in their mentions.
These types of issues have already flared up on TikTok. Far-right activists, white nationalists and neo-Nazis have previously been reported on the app, according to Motherboard and the Huffington Post, which found some users who had already been banned by Facebook and Twitter.
TikTok’s potential content problems, however, may be more similar to those of Google-owned YouTube. The two services depend on user-generated videos for content, and they both rely heavily on algorithms that learn a user’s behavior to determine what kind of content to suggest next.
“The issue with algorithm based content feeds is it generally degrades to the most salacious content that shows the highest engagement,” said Mike Jones, managing partner of Los Angeles venture capital firm Science. “There is no doubt that as creators further understand how to drive additional views and attention on the site through algorithm manipulation, the content will increase in its salaciousness and will be a consistent battle that any owner will have to deal with.”
Another similarity with YouTube is the amount of content available on TikTok that is focused on minors. Although TikTok does not allow users younger than 13 to post on the app, many of its users are between the ages of 13 and 18, and their content can be easily viewed by others.
For YouTube, the challenge of hosting content involving minors became a major issue in February 2019 when Wired discovered a network of pedophiles who were using the video service’s recommendation features to find videos of minors exposed or in their underwear.
With the number of young users on TikTok, it’s not hard to imagine that Microsoft could wind up with a problem similar to Google’s.
YouTube has also become a cesspool of conspiracy theories, such as the idea that the earth is flat. That too could become a problem on TikTok, and there is already evidence of this. The conspiracy theory that Wayfair uses its furniture for child trafficking gained particular momentum on TikTok this year.
To handle these problems, Microsoft would have to invest an immense amount of time and money on content moderation.
For Facebook, this problem has been handled through a two-pronged strategy. The company continually invests in artificial intelligence technology capable of detecting bad content, such as pornography, violence and hate speech, and removing it from its services before it is ever viewed by other users.
For more complicated content, Facebook also relies on thousands of human moderators. These moderators often work for Facebook as contractors through third-party vendors, and they are tasked with reviewing thousands of pieces of content per day in strenuous working conditions, at risk of developing PTSD. These working conditions have come under criticism on numerous occasions, creating public-relations headaches for Facebook.
If Microsoft acquired TikTok, it too would likely have to build up similar AI technology and build out a network of human moderators, all while avoiding negative headlines about poor working conditions.
TikTok offers Microsoft immense potential in the digital marketing sector, but along with all that upside come numerous new challenges and responsibilities that the company will have to take on.