
Are these algorithms real?

I’m teaching about algorithms in my Rhetoric, Media, and Civic Life course this week, and thought it might be useful to share with students the scope and influence of algorithms–but I wanted to gamify it a bit. So I asked the Facebook hive mind to suggest examples of “fake but plausibly true” and “true but sounds fake” algorithms that I could mix in among some more run-of-the-mill examples. I used Poll Everywhere for text voting–it worked marvelously. Here’s the script; I’ve deleted a few interpretive commentary notes, and anything in quotes is snagged and lightly edited from the provided link.

+ means it is true, – means it is false.

Let’s play “Is this algorithm real?” I’m going to read you a description of an algorithm in action; you’ll text true or false, we’ll take a poll, and then I’ll reveal whether or not it is real and talk about the broader implications.

+A fast food company—which wants to remain anonymous—is able to analyze the length of the drive-through line and adjust its digital menu accordingly. When the line is long, the menu shows meals that can be made quickly; when the line is short, the menu features meals that take more time to prepare.

+As we all know, Netflix makes recommendations of shows you might like. But are they using customers’ viewing habits to figure out what new shows they ought to make? Yes. Drawing on complex analysis of big data, Netflix figured out that its customers really liked 1) political thrillers, 2) movies with Kevin Spacey, and 3) shows and movies produced by David Fincher. And that is how they ended up making “House of Cards.”

-Helix is an app in development that some software analysts are calling “Tinder, for genetics.” You supply a swab of skin cells from your cheek and send it to the Helix lab for DNA analysis. Helix then uses its proprietary algorithm to identify optimal genetic matches with other Helix users–swipe right when you see a good genetic match at the bar and start having those super-babies!

+”Amazon sold a “KEEP CALM AND RAPE A LOT” t-shirt. How did such a thing come to pass? This is a weird outcome of an automated algorithm that just tries random variations on “KEEP CALM AND,” offering them for sale in Amazon’s third-party marketplace and printing them on demand if any of them manage to find a buyer. The algorithm that creates these t-shirts is not complex or powerful. This is how it works. 1) Start a sentence with the words KEEP CALM AND. 2) Pick a word from this long list of verbs. Any word will do. Don’t worry, I’m sure they’re all fine. 3) Finish the sentence with one of the following: OFF, THEM, IF, THEM or US. 4) Lay these words out in the classic Keep Calm style. 5) Create a mockup jpeg of a t-shirt. 6) Submit the design to Amazon using our boilerplate t-shirt description. 7) Go back to 1 and start again. It costs nothing to create the design, nothing to submit it to Amazon and nothing for Amazon to host the product. If no-one buys it then the total cost of the experiment is effectively zero. But if the algorithm stumbles upon something special, something that is both unique and funny and actually sells, then everyone makes money.”
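The seven steps above are simple enough to sketch in a few lines of code. This is a minimal illustration, not the seller’s actual script; the verb list here is a tiny hypothetical stand-in for the “long list of verbs” the article mentions, while the five endings are the ones the article names.

```python
import random

# Hypothetical stand-in for the article's "long list of verbs".
VERBS = ["CARRY", "SHOUT", "DANCE", "WAVE", "CLAP"]
# The sentence endings listed in the article (step 3).
ENDINGS = ["OFF", "THEM", "IF", "THEM", "US"]

def make_slogan():
    """Steps 1-3: start with KEEP CALM AND, pick any verb, pick an ending."""
    return f"KEEP CALM AND {random.choice(VERBS)} {random.choice(ENDINGS)}"

def submit_designs(n):
    """Steps 4-7 collapsed: in the real system each slogan would be laid out
    in the Keep Calm style, rendered as a mockup jpeg, and submitted to
    Amazon's marketplace -- all at effectively zero cost -- before looping
    back to step 1. Here we just generate n slogans."""
    return [make_slogan() for _ in range(n)]

slogans = submit_designs(3)
for s in slogans:
    print(s)
```

The point the example makes concrete: there is no editorial judgment anywhere in the loop, which is exactly how an offensive combination can reach the storefront unnoticed.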

-What’s the worst part of going to the library? So many books! How do you choose? The University of Michigan libraries have developed a new algorithm that addresses this problem. After you borrow 100 books, the library develops a profile of your interests. Then, much like Amazon, it begins to recommend books that it thinks you might be interested in—and even notifies you when a new book related to your research comes into the library. The libraries are now working on a “predictive algorithm” that will project, based on your current research, what your next research area might be. “This software is going to push University of Michigan researchers to a new level,” head of libraries Susan B. Wells claimed. (Pro tip: adding a quotation into the example increases veracity).

-Xbox has an algorithm that tracks how quickly you tap particular buttons, such as the right trigger, B, or A. Over time, this reveals your game preferences, since you tap different buttons at different rates depending on whether you are playing a first-person shooter, a strategy game, or a sports game. The online Xbox store then sorts games in the order it thinks you would be most likely to buy them based on this data.

-In select metropolitan areas, Subway has just rolled out “The Subway TrueSandwichMatch” program. Here’s how it works: scan your grocery store rewards card at any participating Subway, and a computer algorithm will design for you a personalized sandwich you’ll absolutely love, based on your weekly food purchases! If you have a track record of purchasing ham, muenster cheese, and yellow mustard—no need to say anything to the sandwich artist, for TrueSandwichMatch will develop a sandwich with those flavor profiles.

+”When Brooklyn-based computer programmer Jacky Alciné looked over a set of images that he had uploaded to Google Photos on Sunday, he found that the service had attempted to classify them according to their contents. Google offers this capability as a selling point of its service, boasting that it lets you “search by what you remember about a photo, no description needed.” In Alciné’s case, many of those labels were basically accurate: A photograph of an airplane wing had been filed under “Airplanes,” one of two tall buildings under “Skyscrapers,” and so on. Then there was a picture of Alciné and a friend. They’re both black. And Google had labeled the photo “Gorillas.” On investigation, Alciné found that many more photographs of the pair—and nothing else—had been placed under this literally dehumanizing rubric. Google immediately apologized and addressed the issue.”

+”The Chinese government has announced a new universal reputation score, tied to every person in the country’s national ID number and based on such factors as political compliance, hobbies, shopping, and whether you play videogames. The program will be administered by Alibaba (China’s answer to Amazon) and Tencent (the country’s huge, government-compliant social network). Your score will be generated not only by your activities, but by the activities of the friends in your social graph — the people you identify as friends on social media. Your score will be decreased for doing things like mentioning Tienanmen Square or speculating on official corruption, or for participating in activities that the state wishes to “nudge” you away from, like playing videogames. All scores are public to everyone, and high-scoring individuals will get privileges denied to their less fortunate peers, such as permits to visit (or live in) Singapore.”

-How do Dunkin’ Donuts, Krispy Kreme, and Tim Hortons know how many donuts to make? These donuts are made at huge plants and shipped to stores daily over long distances in a frozen state. So if they get it wrong it can cost millions. How do they know? There’s an algorithm in the register that matches the type of drink to donut purchases. If a store indicates an upswing in black coffee sales, more glazed donuts are sent there. Likewise, if there is an increase in orders with lots of cream and sugar, more chocolate frosted donuts are shipped. The algorithm is good at accommodating quick shifts in beverage preference, but even better over the long term: it knows that the time of year also influences sales of certain types of coffee, and thus allows these corporations to prepare shipments accordingly.

+”A handful of tech startups are using social data to determine the risk of lending to people who have a difficult time accessing credit. Traditional lenders rely heavily on credit scores like FICO, which look at payments history. They typically steer clear of the millions of people who don’t have credit scores. But some financial lending companies have found that social connections can be a good indicator of a person’s creditworthiness. One such company, Lenddo, determines if you’re friends on Facebook with someone who was late paying back a loan to Lenddo. If so, that’s bad news for you. It’s even worse news if the delinquent friend is someone you frequently interact with. ‘It turns out humans are really good at knowing who is trustworthy and reliable in their community,’ said Jeff Stewart, a co-founder and CEO of Lenddo. ‘What’s new is that we’re now able to measure through massive computing power.'”

+”You can buy Peter Lawrence’s The making of a fly—a classic book in developmental biology—for $2 million, from two different retailers. The book normally retails for $35. This happened because Amazon retailers are increasingly using algorithmic pricing, with a number of companies offering pricing algorithms/services to retailers. Both profnath and bordeebook, the two retailers, were clearly using automatic pricing – employing algorithms that didn’t have a built-in sanity check on the prices they produced. But the two retailers were employing different strategies. On the day we discovered the million dollar prices, the copy offered by bordeebook was 1.270589 times the price of the copy offered by profnath; the day before, bordeebook’s copy had again been 1.270589 times profnath’s price. So clearly at least one of the sellers was setting their price algorithmically in response to changes in the other’s price. I continued to watch carefully and the full pattern emerged. Once a day profnath set their price to be 0.9983 times bordeebook’s price. The prices would remain close for several hours, until bordeebook “noticed” profnath’s change and elevated their price to 1.270589 times profnath’s higher price.”
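The runaway price is easy to reproduce from the two multipliers the post reports. A minimal sketch of the (inferred) daily cycle: profnath undercuts bordeebook at 0.9983×, then bordeebook marks up profnath at 1.270589×. Since 0.9983 × 1.270589 ≈ 1.2684 > 1, each round multiplies both prices, so they climb without bound. The starting price of $35 and the number of days are illustrative assumptions.

```python
PROFNATH_FACTOR = 0.9983      # profnath undercuts bordeebook slightly
BORDEEBOOK_FACTOR = 1.270589  # bordeebook marks up profnath's price

def simulate(days, bordeebook=35.0):
    """Run the daily two-step pricing cycle described in the post.
    Neither rule has a sanity check, so prices compound by ~1.2684 per day."""
    profnath = bordeebook
    for _ in range(days):
        profnath = round(PROFNATH_FACTOR * bordeebook, 2)
        bordeebook = round(BORDEEBOOK_FACTOR * profnath, 2)
    return profnath, bordeebook

p, b = simulate(60)
print(p, b)  # after ~60 rounds both prices are in the millions
```

Two months of this feedback loop is enough to turn a $35 book into a multimillion-dollar listing, which matches the absurd prices the post describes.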

+”A team of researchers from Carnegie Mellon University claim Google displays far fewer ads for high-paying executive jobs… if you’re a woman. Datta and his colleagues built a tool, called Ad Fisher, that tracks how user behavior on Google influences the personalized Google ads that each user sees. The Ad Fisher team found that when Google presumed users to be male job seekers, they were much more likely to be shown ads for high-paying executive jobs. Google showed the ads 1,852 times to the male group — but just 318 times to the female group.”

+”Spoofing is a disruptive algorithmic trading practice employed by traders to outpace other market participants and to manipulate markets. Spoofers feign interest in trading futures, stocks and other products in financial markets, creating an illusion of pessimism when many offers are being cancelled or withdrawn, or false optimism or demand when many offers are being placed in bad faith. Spoofers bid or offer with intent to cancel before the orders are filled. The flurry of activity around the buy or sell orders is intended to attract other high-frequency traders (HFT) to induce a particular market reaction, such as manipulating the market price of a security. Spoofing is credited with causing the May 6, 2010, Flash Crash, also known as The Crash of 2:45, the 2010 Flash Crash or simply the Flash Crash, a United States trillion-dollar stock market crash, which started at 2:32 p.m. EDT and lasted for approximately 36 minutes. Stock indexes, such as the S&P 500, Dow Jones Industrial Average and Nasdaq Composite, collapsed and rebounded very rapidly. The Dow Jones Industrial Average had its biggest intraday point drop (from the opening) up to that point, plunging 998.5 points (about 9%), most within minutes.”

+”The National Security Agency’s SKYNET programme engages in mass surveillance of Pakistan’s mobile phone network, and then uses a machine learning algorithm on the cellular network metadata of 55 million people to try and rate each person’s likelihood of being a terrorist. Based on the classification date of “20070108” on one of the SKYNET slide decks leaked by Edward Snowden (which themselves appear to date from 2011 and 2012), the machine learning program may have been in development as early as 2007. In the years that have followed, thousands of innocent people in Pakistan may have been mislabelled as terrorists by that “scientifically unsound” algorithm, possibly resulting in their untimely demise.”
