The Engineer's Notebook is a shared blog for entries that don't fit into a specific CR4 blog. Topics may range from grammar to physics and could be research or an individual's thoughts - like you'd jot down in a well-used notebook.
I took a management course in grad school in 2009, and just about the only thing I remember from it was learning about the Hawthorne effect. This was research done at a Chicago factory from 1924-32, in which workers supposedly showed an uptick in productivity while participating in a study examining the effects of lighting on their work. A later analysis of the results claimed that the lighting had zero effect, and that the workers were more productive simply because they liked the idea of researchers taking an interest in their work. Some later critics dismissed all of this as anecdotal BS, but it was interesting nonetheless.
Tijmen Schep, a Dutch technology critic and privacy designer, recently identified a related effect in the digital realm. Schep calls the phenomenon “social cooling” (a riff on global warming) and describes it as the pattern of altered behavior exhibited when online users know they’re constantly being tracked by Big Data algorithms. And even though algorithms are mathematical, they’re built by humans whose biases can creep into the programming, so no algorithm is totally fair.
It’s safe to assume that nothing’s private online. Any user who clicks a link or searches on Google has an aggregate “digital reputation” that then leads to profiling. For example, third-party companies privy to a user’s data can accurately determine a user’s addictability, personality traits, economic stability, religion and reading habits simply by combing their data. Schep says that even though a company may claim they don’t sell your data, the patterns and labels they derive from it are legally theirs and they can sell them at will.
Schep worries that a society built entirely on digital reputation could have ill effects. Users may feel pressured to conform, avoid healthy risks to maintain high ratings, and eventually fall into rigid social labels and structures. Mistakes and imperfections are meant to be forgotten, but online records persist indefinitely and may affect a user over the course of their lifetime. Critics see the growing importance of online reputation as a sort of credit score for social interaction. China is already exploring a variant of a wide-reaching social credit score.
There are countless other examples of the possibility of social engineering through data. In November of last year, Data Alliance, the Big Data arm of advertising company WPP, signed an agreement with music service Spotify to acquire the listening preferences, moods and behaviors of 100 million of Spotify’s users in 60 countries. While the press release for this deal claims that this data will be used to connect with users through targeted ads, one could see this data leveraged by insurance companies to determine a user’s risk of depression and other mood disorders.
Schep identifies a similar effect he calls mathwashing, which more or less describes the bias hiding behind “unbiased” algorithms. (He borrows that term from former Kickstarter data scientist Fred Benenson.) Schep calls for algorithms to be treated more like law than like mere tools, as the design of algorithms is becoming more and more crucial to fair living, and algorithms are increasingly implemented in the justice system through predictive policing and other strategies.
We don’t yet live in a completely quantified society, but it’s easy to see how targeted advertising and Big Data are both nudging it in that direction. I’ll consider that next time I think about clicking on a questionable link.
When we install a new app, it often asks our permission to access other pieces of our personal information in order for it to function. For example, a messaging app will want to see your contacts. What we may not realize is that once the app has permission to collect that information, it can share your data with anyone.
If a GPS app sends your location to a server to help find you directions from your current coordinates, it can send that location elsewhere too.
I’ve often heard it said that Google knows so much of our personal information—our workplaces, our contact info, and sometimes even our credit card numbers—that it could take over the world. Or at least blackmail many of us.
The research conducted by IMDEA Networks Institute suggests that Google might not be the only one who views all of this information as power (or money), since information collected by individual apps is often grouped by unique identifiers pulled from our phones, compiling a mini database of our activity.
IMDEA is setting out to make each of us more aware with their own app, the Lumen Privacy Monitor (pictured left and below), which analyzes the traffic of each of the other apps on your phone. It tells you when Google Maps sends data to Google servers, but it also tells you when another app sends information somewhere completely unexpected.
Of course, Lumen also asks to share some of this data with the researchers at IMDEA, promising that the data shared won’t include any of the personal information being accessed by your other apps, just that the app shared something private and to where.
I wasn’t able to download and test Lumen on my phone, but the idea sounds pretty neat to me. At least then I would know who knows all about my life. I’m not alone in thinking that either. Since October 2015, more than 1,600 people have used Lumen and through them, the researchers have had access to 5,000 apps.
Researchers revealed that:
· 598 internet sites (think: Facebook, Google, Yahoo, and big-name internet service providers like Verizon Wireless) were tracking users for advertising purposes
· More than 70 percent of the apps studied shared information with at least one tracker
· Fifteen percent of apps shared information with five or more trackers
· One in four trackers pulled a “unique device identifier” used to link your information to you
Researchers also noted that information crossed country borders into locations with lax privacy laws, and that it included unique identifiers pulled from apps for children. IMDEA reminds readers that this is particularly problematic because these identifiers can sometimes be linked to a specific physical location. Not only would a third party know your son likes the latest Pixar movie, they could know where he was.
While Lumen and its researchers provide this information, they say that “it’s hard to know what users might do about this.” Short of abandoning the apps that mine the information, it’s hard to opt out of the tracking.
So, delete some of your less useful apps—or don’t—but at least now we can all know where our information is going.
You know that box of VHS tapes in your basement? The one you haven’t looked through in years? Well, you may want to dig it out because those memories may fade forever.
VHS tapes are slowly becoming unwatchable, as they have a 20-30 year life span. A group of archivists is racing against time to digitize the tapes. Researchers are calling this the “magnetic media crisis.”
Tapes store sounds and images by magnetizing particles on a strip of tape, using the same principle as rubbing a piece of metal with a magnet so that it retains the magnetism. But just as the metal slowly loses its magnetism once the magnet is taken away, the tape slowly loses its magnetic properties over time.
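The fading described above can be sketched as a simple exponential decay. This is an illustrative toy model only: real tape degradation also depends on binder chemistry, storage temperature and humidity, and the 30-year "half-life" is my assumption, chosen to line up with the 20-30 year life span quoted for VHS tapes.

```python
def remaining_signal(years, half_life_years=30.0):
    """Fraction of the original magnetic signal left after `years`,
    under a toy exponential-decay model (half-life is an assumption)."""
    return 0.5 ** (years / half_life_years)

# A tape recorded in 1990 would, under this model, retain about half
# its original signal by 2020, and roughly a quarter by 2050:
print(round(remaining_signal(30), 2))  # 0.5
print(round(remaining_signal(60), 2))  # 0.25
```

The point of the model is just that the loss compounds quietly: a tape that still plays today is not a tape that will still play in twenty years.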
"Once that magnetic field that's been imprinted into that tape has kind of faded too much, you won't be able to recover it back off the tape after a long period of time," Howard Lukk, director of standards at the Society of Motion Picture and Television Engineers, told NPR.
Most VHS tapes were recorded in the 1980s and '90s, when video cameras first became widely available to Americans, which puts even the best-quality, properly stored tapes at or near the end of their 20-30 year life span.
Groups like XFR Collective are working to make these memories permanent. Digitization is offered by many shops, some libraries, and online companies, but the cost deters many people, so the group hopes to make the service more affordable.
But digitization is a tricky thing to master, and many people don’t realize the switch is a matter of time, not just an upgrade away from outdated technology.
At XFR Collective, the staff is made up of volunteers, and the process is very time consuming. They often have to spend hours watching the entire tape from start to finish, and that doesn’t even account for troubleshooting like dropped frames and tracking issues.
The group works collaboratively to digitize tapes in order of importance. They see anything from wedding videos to public access TV archives that aren’t available in any other format. They put the transferred footage onto an online archive.
The volunteers are tasked with an important job. If they don’t convert this footage, it could be gone forever, thus losing bits and pieces of our history.
But what about your personal collection of home movies? Many drugstores will transfer them to DVD or Blu-Ray, and lots of websites will make DVDs for you as well. But time’s ticking!
Several years after I moved from Western New York, Yahoo built several unique data centers north of Buffalo. Local press hyped the fact that Yahoo modeled the data centers after chicken coops and used this shape to guide cool air into the buildings. Apparently the company saved money and energy by replacing chillers and air conditioners with cool wind, which is abundant between lakes Erie and Ontario.
For several years, Dutch startup Nerdalize has been working on doing the exact opposite: using excess heat from data servers to heat residences and other buildings. The company first produced the eRadiator (shown here), a data server that produced about half the heat of a conventional radiator. They installed eRadiators in five homes, but the units were found to be slow and unable to heat even a single room.
Nerdalize’s latest project looks to install servers to heat water supplies in Dutch homes. The company is planning on adding installations to 42 Dutch homes beginning this August, but over 3500 people have signed up expressing interest in their own installation. A Nerdalize business description calls the arrangement a “win-win-win:” companies can save 30-50% on their data services, customers will have free heat after the cost of installation, and each heater will supposedly prevent 3 metric tons of carbon dioxide emissions per year.
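That 3-tonne figure is roughly plausible from back-of-the-envelope numbers, since essentially all of a server's electrical draw ends up as heat. A quick sanity check, where the server power draw (1.3 kW), the full-time utilization, and the ~0.27 kg of CO2 per kWh for displaced gas heating are all my illustrative assumptions, not figures from Nerdalize:

```python
# Back-of-the-envelope check on the "3 metric tons of CO2 per year" claim.
# All inputs below are assumptions for illustration, not Nerdalize's numbers.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_heat_kwh(power_kw, utilization=1.0):
    """Heat delivered per year by a server at the given average load;
    essentially all electrical power dissipates as heat."""
    return power_kw * HOURS_PER_YEAR * utilization

def co2_avoided_tonnes(heat_kwh, kg_co2_per_kwh=0.27):
    """CO2 avoided by displacing heating of that carbon intensity."""
    return heat_kwh * kg_co2_per_kwh / 1000.0

heat = annual_heat_kwh(1.3)                # ~11,388 kWh of heat per year
print(round(co2_avoided_tonnes(heat), 1))  # ~3.1 tonnes, close to the claim
```

Under those assumptions, a single always-on 1.3 kW server lands right around the claimed 3 tonnes per year, which is what makes the "win-win-win" pitch at least arithmetically credible.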
While Nerdalize is one of the first companies attempting to commercialize server heat, they’re not the first to conceptualize the idea. In a 2011 paper, Microsoft proposed installing servers in homes and businesses for use as “data furnaces.” This idea proposed adding a metal server cabinet to a home’s existing ductwork and was more or less identical to the route pursued by Nerdalize.
While server heaters / data furnaces are a cool idea, there are still a host of questions and unexplored issues. Seeing as the average Joe’s home is far less secure than a conventional data center, how comfortable will companies be leasing data services hosted there? And how will a Nerdalize customer feel about having to schedule techs to come into their home to fix broken servers?
All told, using cold wind to cool servers and using server heat to warm houses are both pretty cool ideas. But time will tell if the latter is scalable or practical.
Are Alexa, Siri, and Roomba making up the members of your “Sex and the City”-like posse? Are you confiding in or laughing with them? Or are you treating them like pets?
Researchers from the University of Michigan, interested in the recently reported rise in feelings of loneliness and isolation, conducted a series of experiments to determine whether interactive products are capable of filling a social void for lonely people.
The results, published in the Journal of Consumer Research by Carolyn Yoon, professor of marketing, show that more and more people are relying on these devices for social interaction, in some cases treating them like pets (the Roomba, say) by naming them or referring to them as “he” or “she.”
In one of the experiments, participants were asked how many Facebook friends they had. When a participant reported the number to another person, it was exaggerated (a tell-tale sign of loneliness). However, the participant was more likely to confide the actual number of Facebook friends he or she had to a device.
Although researchers found that people were relying on the devices for social interaction, they also report that once the participant was reminded that the product was just a device, the effect vanished.
Concerned that people will stop pursuing relationships beyond the ones they have created with these products as the products grow more realistic, researchers are urging product designers to consider the harmful consequences of creating devices that mimic humans.
On the flip side, however, researchers do see a benefit to humanized devices in future healthcare scenarios, for example as a nurse, home health aide, or health monitor.
So while your imagined “Sex and the City” brunch with Siri, Roomba, and Alexa isn’t likely for now, it may soon be a reality as these devices become more and more realistic.
Do you talk to your devices? Do you treat them like a friend or a pet?