This is my dumping ground for quotes and other stuff relating to the wonderful world of digital and communications technology.
the fetishization of “big data,” which is touted as a way to understand and control society without reference to the history (or patterns of thought) that gave rise to the data analyzed. A finance firm may say, for example, “we charge 15% interest to someone who had a past default, just because past patterns of data show that such people often default again,” in a process agnostic as to whether a defiant refusal to repay, or a family medical emergency, caused the prior default. The police may say, “we’re intensively policing this neighborhood because it had 10% more crime in the past.”
But what if defaults resulted from excessive interest rates in the past, caused by discriminatory lending practices? And what if the above-normal crime in the neighborhood simply reflected past patterns of intense policing that reflected racism? What if each decision makes future defaults, or excess crime rates, more likely? Then the “science of society” promised by big data morphs into a subjugation of certain parts of society. The algorithms behind such judgments become less “objective arbiters” of opportunity and punishment than ways of laundering subjective, biased decisions into ostensibly objective, fair scores. Those affected lose a chance at individualized treatment and understanding, as technical systems treat people as a mere collection of data points.
In short, if we forget the human origins and purpose of algorithmic judgments, we lose our chance to develop a humane (and intelligible) society.
We are more susceptible than we may think to the “dictatorship of data”—that is, to letting the data govern us in ways that may do as much harm as good. The threat is that we will let ourselves be mindlessly bound by the output of our analyses even when we have reasonable grounds for suspecting that something is amiss. Education seems on the skids? Push standardized tests to measure performance and penalize teachers or schools. Want to prevent terrorism? Create layers of watch lists and no-fly lists in order to police the skies. Want to lose weight? Buy an app to count every calorie but eschew actual exercise.
A new policy by CVS Pharmacy requires every one of its nearly 200,000 employees who use its health plan to submit their weight, body fat, glucose levels and other vitals or pay a monthly fine.
Employees who agree to this testing will see no change in their health insurance rates, but those who refuse will have to pay an extra $50 per month — or $600 per year — for the company’s health insurance program.
a program called GLEAM (Global Epidemic and Mobility Model) that divides the world into hundreds of thousands of squares. It models travel patterns between these squares (busy roads, flight paths and so on) using equations based on data as various as international air links and school holidays. The result is impressive. In 2009, for example, there was an outbreak of a strain of influenza called H1N1. GLEAM mimicked what actually happened with great fidelity. In most countries it calculated to within a week when the number of new infections peaked. In no case was the calculation out by more than a fortnight.
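To see the idea behind a gridded model like GLEAM, here is a toy two-cell sketch of my own (all parameters are assumed for illustration, nothing here is the real model): each cell runs simple SIR epidemic dynamics, and a small mobility matrix moves a fraction of each cell's infectious people to the other cell every day, which is how an outbreak seeded in one place spreads along travel links.

```python
# Toy gridded epidemic sketch (hypothetical parameters, not GLEAM itself):
# two cells, local SIR dynamics, plus daily travel between cells.

# mobility[i][j] = daily fraction of cell i's infectious people who
# travel to cell j (assumed values for illustration).
mobility = [[0.0, 0.02], [0.01, 0.0]]
beta, gamma = 0.3, 0.1          # infection and recovery rates (assumed)

S = [9999.0, 10000.0]           # susceptible: outbreak seeded in cell 0
I = [1.0, 0.0]                  # infectious
R = [0.0, 0.0]                  # recovered
N = [S[k] + I[k] + R[k] for k in range(2)]

for day in range(200):
    # Travel step: infectious people flow between cells (net flow sums to 0).
    flow = [sum(mobility[j][k] * I[j] for j in range(2))
            - sum(mobility[k][j] for j in range(2)) * I[k]
            for k in range(2)]
    I = [I[k] + flow[k] for k in range(2)]
    # Local SIR step in each cell.
    for k in range(2):
        new_inf = beta * S[k] * I[k] / N[k]
        new_rec = gamma * I[k]
        S[k] -= new_inf
        I[k] += new_inf - new_rec
        R[k] += new_rec

print(f"final recovered: cell 0 = {R[0]:.0f}, cell 1 = {R[1]:.0f}")
```

With these assumed rates the outbreak jumps from the seeded cell to the second one within days, which is the mechanism (scaled up to hundreds of thousands of cells and real air-travel data) that lets such models predict when infections peak in each country.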
As we acquire more data, we have the ability to find many, many more statistically significant correlations. Most of these correlations are spurious and deceive us when we’re trying to understand a situation. Falsity grows exponentially the more data we collect. The haystack gets bigger, but the needle we are looking for is still buried deep
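The point above is easy to demonstrate with a small sketch of my own (illustrative, not from the quoted source): generate many series of pure random noise and count how many pairs clear a standard "statistical significance" bar anyway.

```python
import random

# Spurious-correlation sketch: every series below is independent noise,
# yet some pairs still show "significant" sample correlations.
random.seed(0)

def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

n_vars, n_obs = 40, 30
data = [[random.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]

# |r| > 0.36 is roughly the p < 0.05 threshold for n = 30 observations.
n_pairs = n_vars * (n_vars - 1) // 2
spurious = sum(1 for i in range(n_vars) for j in range(i + 1, n_vars)
               if abs(corr(data[i], data[j])) > 0.36)
print(f"{spurious} 'significant' correlations out of {n_pairs} pairs of noise")
```

Adding more variables grows the number of pairs quadratically, so the count of false "discoveries" grows with it even though nothing real is there: the haystack gets bigger while the needle stays the same size.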
Society might be well served if the model makers pondered the ethical dimensions of their work as well as studying the math, according to Rachel Schutt, a senior statistician at Google Research.
“Models do not just predict, but they can make things happen,” says Ms. Schutt, who taught a data science course this year at Columbia. “That’s not discussed generally in our field.”
Models can create what data scientists call a behavioral loop. A person feeds in data, which is collected by an algorithm that then presents the user with choices, thus steering behavior.
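A minimal sketch of that loop (my own illustration; the names and numbers are hypothetical, not any real recommender): the system shows whichever category it currently scores highest, the user clicks what is shown, and the click raises that category's score, so the loop locks onto one category even though the user started out indifferent.

```python
# Behavioral-loop sketch: the model's output steers the very behavior
# that becomes its next input.
categories = ["news", "sports", "music"]
scores = {c: 1.0 for c in categories}       # the algorithm's belief

def recommend():
    # Show the highest-scoring category (ties broken by list order).
    return max(categories, key=lambda c: scores[c])

for step in range(100):
    shown = recommend()     # the algorithm presents a choice...
    scores[shown] += 1.0    # ...and the click feeds back into the data

top = recommend()
share = scores[top] / sum(scores.values())
print(f"'{top}' now holds {share:.0%} of the score mass")
```

Nothing about the user's underlying taste drove the outcome; the feedback structure alone did, which is exactly the "making things happen" that Schutt is pointing at.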
the role of the campaign pros in Washington who make decisions on hunches and experience is rapidly dwindling, being replaced by the work of quants and computer coders who can crack massive data sets for insight.
Online, the get-out-the-vote effort continued with a first-ever attempt at using Facebook on a mass scale to replicate the door-knocking efforts of field organizers. In the final weeks of the campaign, people who had downloaded an app were sent messages with pictures of their friends in swing states. They were told to click a button to automatically urge those targeted voters to take certain actions, such as registering to vote, voting early or getting to the polls. The campaign found that roughly 1 in 5 people contacted by a Facebook pal acted on the request, in large part because the message came from someone they knew.
Businesses avidly mine data to improve their efficiency. Non-profit groups have plenty of information, too. But they can rarely afford to hire number-crunchers. Now a bunch of philanthropic geeks at DataKind, a New York-based charity, are helping other do-gooders work more productively and quantify their achievements for donors, who like to see that their money is well spent. A typical DataKind two-day “hackathon” last month in London attracted 50 people who worked in three teams. One pored over the records of Place2Be, which offers counselling to troubled schoolchildren. Crunching the data showed that boys tend to respond better than girls, though girls who lived with only their fathers showed the biggest improvements of all. The charity did not know that.
“In this world of huge and big data, you won’t be able to program machines for everything they should know,” said Ms. Rometty. “These machines will have to learn what is right, what is wrong, what is a pattern.” It is the third wave of computing, she said. At first, computers could count. Today, they are programmed to follow “if this, then that.” Next they will need to discover and learn on their own, she said, not just as a search engine, but proactively.
Using data amassed from loyalty card usage, Kroger is now offering personalized discounts in-store to give shoppers money off their favorite brands in real time. The store has begun to offer the service to its Kroger Rewards scheme customers, providing a price for certain products depending on each member’s individual shopping history… There is no need to print off and scan the coupons in-store as the prices are automatically added to each user’s Rewards card and applied to the final bill at checkout.
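Mechanically, the checkout side of such a scheme is simple; here is a minimal sketch of my own (all card IDs, products, and prices are hypothetical, not Kroger's actual system): discounts derived from a member's purchase history sit against their loyalty-card record and are subtracted automatically when the bill is totalled.

```python
# Personalized-pricing sketch: per-card discounts applied at checkout.
shelf_prices = {"coffee": 7.99, "cereal": 4.49, "milk": 2.79}

# Discounts keyed by loyalty-card ID, derived (in a real system) from
# each member's shopping history; hard-coded here for illustration.
card_discounts = {
    "card-1001": {"coffee": 1.50},   # a frequent coffee buyer
    "card-1002": {"cereal": 0.75},
}

def checkout(card_id, basket):
    """Total a basket, applying the card's personalized discounts."""
    discounts = card_discounts.get(card_id, {})
    total = sum(shelf_prices[item] - discounts.get(item, 0.0)
                for item in basket)
    return round(total, 2)

print(checkout("card-1001", ["coffee", "milk"]))   # coffee discount applied
print(checkout("card-1002", ["coffee", "milk"]))   # no coffee discount
```

The interesting part of the real system is not this arithmetic but the history-mining that fills in the discount table, which is exactly where the earlier quotes about behavioral loops become relevant.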