Once you lose credibility, you lose trust, and trust is your most valuable asset.
Apple has taken a lot of criticism over its decision to incorporate the detection of child sexual abuse material (CSAM) for users who upload their photos to iCloud Photo Library. Much of that criticism, I wrote over the weekend, is a self-inflicted wound caused by Apple’s poor rollout of the information about precisely what it’s doing.
Some call the move a backdoor to your phone’s encryption and say Apple is violating user privacy. I was critical of the move not because anyone thinks eliminating CSAM isn’t a noble cause, but because of the precedent it sets for a company that has made “what happens on your iPhone stays on your iPhone” a core value.
Among the loudest voices in the firestorm is Will Cathcart, the CEO of WhatsApp. Cathcart posted to Twitter, calling this change “a wrong approach and a setback for people’s privacy all over the world.”
There’s a lot to unpack in Cathcart’s Twitter thread, and it deserves a look. Cathcart is a very smart guy, and he’s in charge of the world’s largest messaging platform. His view would seem to carry a lot of weight if it weren’t such a blatant misrepresentation of what Apple is actually doing. We’ll get to all of that in a minute.
First, however, I can’t help wondering whether Cathcart knows he works for Facebook.
I mean, he’s worked there for a while. He reports to Chris Cox, the head of product, who reports to CEO Mark Zuckerberg. Previously Cathcart led “product development for News Feed and Facebook’s introduction of advertising into News Feed and on mobile,” according to the company’s website, before later overseeing the entire Facebook app.
I think it’s fair to say he’s relatively familiar with Facebook’s business model. You know, the one where the company scoops up personal information and monetizes it through what it…