(Bloomberg View) — If you're on Facebook, then at some point over the past two weeks you got a little note from the company saying, "It's been a great year! Thanks for being a part of it." And then you saw your "Year in Review," a collection of things that the company's algorithms decided were your most important moments. The items were likely chosen because they showed up a lot in your stream, garnered a lot of comments, or were widely shared. And that single feature, as many users pointed out, led to "inadvertent algorithmic cruelty." Facebook pushed in our faces photos of loved ones who died and other heart-wrenching moments from 2014.

Anyone inclined to use the service as more than a carefully curated, fake, happy-all-the-time repository of our most emotionally manicured selves was put on notice. You can express genuinely sad things on Facebook, and they'll be handled with all the sensitivity that only a bot can provide. It's been a great year (when your family was torn apart)! Thanks for being a part of it.

Anyone who is human and who interacts with other humans could have seen this coming a mile away. But maybe no one at Facebook HQ has bad years? Or maybe this year was just particularly good? The stock did, after all, gain 47 percent. (The company of course apologized after the fact.)


This upsetting moment should give us pause, not just because it was idiotic and we continue to give our data to a company that continually does idiotic things, but because so much of our lives and our security is in the hands of algorithms that, like Facebook's, try to draw connections between the trails of data we leave online. Advertisers are trying to sell us products based on the profiles they can build from our online lives. Insurance companies are trying to find ways to alter their business models with the information they glean from our posts. The NSA has collected vast amounts of data on citizens who have been accused of no wrongdoing, in an effort to map out where on the spectrum of criminality we fall.

Formulas and humans try to infer meaning from those connections. Facebook's failure is insensitive, easy to ridicule and easy to forget. But that doesn't mean the data we leave behind, and the connections people infer from it, don't have real consequences when they're wrong.
