News of a Facebook social experiment examining the effect positive or negative posts have on users has prompted justifiable anger. What it shouldn't provoke is surprise. After all, online companies conduct similar "A/B tests" all the time to maximize usage and profits, writes David Weinberger at CNN. For instance, Amazon might put a banner on one side of the site for some users and the other side for others, with the goal of determining which placement generates more clicks. Such testing is done on everything from "font sizes to colors to the depth of the drop shadows."
The trouble with Facebook's newly revealed experiment is that it played with our emotions. But that's nothing new: Our Facebook feeds are always shaped by algorithms, and those are "based on what works for Facebook, Inc., and only secondarily based on what works for us as individuals and as a society," Weinberger writes. Facebook's goal is "happy customers," but that just means people who visit and click a lot. So what should be done about it? Perhaps filters could be more user-controlled or designed with "socially desirable aims." But we can't expect such changes when we're giving social data to "commercial entities that have as their primary interest not the health of our society and culture, but their bottom line." Click for his full piece.