Reality Mining: Techno-utopian Fantasy or Totalitarian Nightmare?

At the MIT Technology Review, Nicholas Carr reviews the work of Alex Pentland, director of MIT’s Human Dynamics Laboratory.  Pentland's work apparently aims to help create a fully programmed social order that would be difficult to distinguish from your worst nightmare of a totalitarian surveillance society.  Excerpt:
Pentland describes a series of experiments that he and his associates have been conducting in the private sector. They go into a business and give each employee an electronic ID card, called a “sociometric badge,” that hangs from the neck and communicates with the badges worn by colleagues. Incorporating microphones, location sensors, and accelerometers, the badges monitor where people go and whom they talk with, taking note of their tone of voice and even their body language. The devices are able to measure not only the chains of communication and influence within an organization but also “personal energy levels” and traits such as “extraversion and empathy.” In one such study of a bank’s call center, the researchers discovered that productivity could be increased simply by tweaking the coffee-break schedule.

Pentland dubs this data-processing technique “reality mining,” and he suggests that similar kinds of information can be collected on a much broader scale by smartphones outfitted with specialized sensors and apps. Fed into statistical modeling programs, the data could reveal “how things such as ideas, decisions, mood, or the seasonal flu are spread in the community.” . . .
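The kind of "statistical modeling" Pentland invokes can be made concrete with a toy contagion simulation over a contact graph. This is only a sketch of the general idea, not Pentland's actual models; the network, names, and transmission probability below are all invented for illustration:

```python
import random

# Toy illustration of how something (an idea, a mood, the seasonal flu)
# might be modeled spreading through a community -- a sketch, not
# Pentland's actual models. The contact graph and probability are made up.

random.seed(42)

contacts = {            # who talks to whom (e.g. inferred from badge data)
    "ann": ["bob", "cam"],
    "bob": ["ann", "dee"],
    "cam": ["ann", "dee"],
    "dee": ["bob", "cam", "eve"],
    "eve": ["dee"],
}

def spread(seed_person, p_transmit=0.5, steps=10):
    """Simple susceptible/infected contagion over the contact graph."""
    infected = {seed_person}
    for _ in range(steps):
        newly = set()
        for person in infected:
            for neighbor in contacts[person]:
                if neighbor not in infected and random.random() < p_transmit:
                    newly.add(neighbor)
        if not newly:       # spread has died out
            break
        infected |= newly
    return infected

print(sorted(spread("ann")))
```

The point of such models, in Pentland's telling, is that once you can simulate the flow, you can also try to "tune" it, by injecting incentives at well-connected nodes.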
What really excites Pentland is the prospect of using digital media and related tools to change people’s behavior, to motivate groups and individuals to act in more productive and responsible ways. If people react predictably to social influences, then governments and businesses can use computers to develop and deliver carefully tailored incentives, such as messages of praise or small cash payments, to “tune” the flows of influence in a group and thereby modify the habits of its members. Beyond improving the efficiency of transit and health-care systems, Pentland suggests, group-based incentive programs can make communities more harmonious and creative.
Call me cynical, but it seems just plain deluded to assume, as Pentland apparently does, that such technology, if adopted widely by governments and businesses, would fulfill these techno-utopian fantasies rather than produce a dystopian nightmare of surveillance and control.

Google Expands Glass "Explorer" Program

From Yahoo:
Google will make a limited supply of its controversial Internet-linked Glass eyewear available for purchase in the United States beginning — and ending — on April 15.
Anyone in the United States with $1,500 to spend on Glass will be able to join the ranks of “Explorers” who have gotten to test out the devices prior to them hitting the market, the California-based Internet titan said Thursday in a post on the Google+ social network . . .
On April 15, starting at 9 a.m. Eastern, Google will commence what it billed as the biggest expansion of the Explorer program to date by letting anyone in the U.S. buy the eyewear online, noting that there would be a limited number of units available.
Glass users, who have not so affectionately become known as "Glassholes", have also become targets of ire and even violence.  But not to fear: Google has a solution . . . contact lens cameras. From CNET:
Google has a patent pending for a contact lens with a micro camera and sensors embedded on the surface controlled by blinking, which would enable you to take hands-free pictures and could help the blind navigate the everyday obstacles of the world.

FBI to Expand Facial Recognition Photo Database

From the EFF:
New documents released by the FBI show that the Bureau is well on its way toward its goal of a fully operational face recognition database by this summer.
EFF received these records in response to our Freedom of Information Act lawsuit for information on Next Generation Identification (NGI)—the FBI’s massive biometric database that may hold records on as much as one third of the U.S. population. The facial recognition component of this database poses real threats to privacy for all Americans.

NGI builds on the FBI’s legacy fingerprint database—which already contains well over 100 million individual records—and has been designed to include multiple forms of biometric data, including palm prints and iris scans in addition to fingerprints and face recognition data. NGI combines all these forms of data in each individual’s file, linking them to personal and biographic data like name, home address, ID number, immigration status, age, race, etc. This immense database is shared with other federal agencies and with the approximately 18,000 tribal, state and local law enforcement agencies across the United States.

The records we received show that the face recognition component of NGI may include as many as 52 million face images by 2015. By 2012, NGI already contained 13.6 million images representing between 7 and 8 million individuals, and by the middle of 2013, the size of the database increased to 16 million images. The new records reveal that the database will be capable of processing 55,000 direct photo enrollments daily and of conducting tens of thousands of searches every day . . .
In order to avoid the prying eyes of government pervs, you might want to consider a new look, such as those being developed by Adam Harvey, who is seeking to "create a growing catalog of designs and techniques that can be employed as camouflage against face detection."

Quantum Leap in Quantum Computing?

From Popular Mechanics:
It's a machine that could calculate solutions to problems so impossibly time-consuming that even the most powerful supercomputers could never handle them. And it would do so in an instant. This is the quantum computer, made possible by the bizarre nature of quantum mechanics. And though the idea is still in its infancy, it's no fantasy.

Two research teams, at Harvard University and the Max Planck Institute of Quantum Optics in Germany, have just announced that they have independently forged the building blocks for tomorrow's quantum computers. As they published today in the journal Nature (1, 2), the scientists discovered a way to hook up atoms and particles of light to create a new type of switch and logic gate: quantum versions of the connecting structures that link bits of data in modern computers.
Amusingly, the author of the article has apparently fielded some questions in the Reddit thread on the story and provided a rather tight summary:
New computing equipment allows info to be put into a fucking weird quantum state, which can do crazy shit. Like super fast computing. We've made similar things before, but can't build big computers with them. With this we think we can.
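For readers wondering what a "quantum version" of a logic gate even means: mathematically, a quantum gate is just a unitary matrix acting on a state vector. Here is a toy sketch of the concept using the standard two-qubit CNOT gate (ordinary linear algebra, nothing to do with the atom-photon hardware in the Nature papers):

```python
import numpy as np

# Toy illustration of a quantum logic gate as a unitary matrix -- a sketch
# of the general concept, not the experiments described above.

# Two-qubit basis states |00>, |01>, |10>, |11> as length-4 vectors.
def ket(bits):
    v = np.zeros(4)
    v[int(bits, 2)] = 1.0
    return v

# CNOT: flips the second (target) qubit iff the first (control) qubit is 1.
CNOT = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# On plain basis states it behaves like a classical controlled switch:
assert np.allclose(CNOT @ ket("10"), ket("11"))

# The "fucking weird quantum state" part: feed it a superposed control
# qubit and the output is entangled, which no classical switch can do.
h = (ket("00") + ket("10")) / np.sqrt(2)      # control in superposition
bell = CNOT @ h                               # (|00> + |11>) / sqrt(2)
print(bell)
```

The hard part, and what the two teams are reporting progress on, is building a physical device that actually performs such an operation reliably enough to chain many of them together.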
See also this handy infographic:


Anti-Surveillance: Police Dismantle Monitoring Gear on their Own Vehicles

It is difficult to find government officials who oppose mass surveillance, whether of the American people or of the public worldwide. These professional hypocrites are far more sensitive, however, when they themselves end up on the other side of the camera.  Police are all for mass surveillance, until you turn it on them. From Ars Technica:

The Los Angeles Police Commission is investigating how half of the recording antennas in the Southeast Division went missing, seemingly as a way to evade new self-monitoring procedures that the Los Angeles Police Department imposed last year.

According to the Los Angeles Times, an LAPD investigation determined that around half of the 80 patrol cars in one South LA division were missing antennas as of last summer, and an additional 10 antennas were unaccounted for. Citing a police source, the newspaper said that removing the antennas can reduce the range of the voice transmitters by as much as a third of the normal operating distance.
The Police Commission, an independent body that oversees LAPD policy, was only notified of the situation two months ago.

Inflight Wi-Fi Provider Goes Above and Beyond to Compromise Passenger Info Security

From Wired:
Gogo, the inflight Wi-Fi provider, is used by millions of airline passengers each year to stay connected while flying the friendly skies. But if you think the long arm of government surveillance doesn’t have a vertical reach, think again.
Gogo and others that provide Wi-Fi aboard aircraft must follow the same wiretap provisions that require telecoms and terrestrial ISPs to assist U.S. law enforcement and the NSA in tracking users when so ordered. But they may be doing more than the law requires.
According to a letter Gogo submitted to the Federal Communications Commission, the company voluntarily exceeded the requirements of the Communications Assistance for Law Enforcement Act, or CALEA, by adding capabilities to its service at the request of law enforcement.  The revelation alarms civil liberties groups, which say companies should not be cutting deals with the government that may enhance the ability to monitor or track users.
“CALEA itself is a massive infringement on user’s rights,” says Peter Eckersley of the Electronic Frontier Foundation. “Having ISP’s [now] that say that CALEA isn’t enough, we’re going to be even more intrusive in what we collect on people is, honestly, scandalous.”

Heartbleed: Critical OpenSSL Bug Exposes Secure Traffic

From Ars Technica:
Lest readers think "catastrophic" is too exaggerated a description for the critical defect affecting an estimated two-thirds of the Internet's Web servers, consider this: at the moment this article was being prepared, the so-called Heartbleed bug was exposing end-user passwords, the contents of confidential e-mails, and other sensitive data belonging to Yahoo Mail and almost certainly countless other services.
The two-year-old bug is the result of a mundane coding error in OpenSSL, the world's most popular code library for implementing HTTPS encryption in websites, e-mail servers, and applications. The result of a missing bounds check in the source code, Heartbleed allows attackers to recover large chunks of private computer memory that handle OpenSSL processes. The leak is the digital equivalent of a grab bag that hackers can blindly reach into over and over simply by sending a series of commands to vulnerable servers. The returned contents could include something as banal as a time stamp, or it could return far more valuable assets such as authentication credentials or even the private key at the heart of a website's entire cryptographic certificate.
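The shape of that missing bounds check can be shown in miniature. This is a toy sketch of the flaw, not OpenSSL's actual C code; the "memory" contents and function names are invented. A heartbeat request says "echo me back N bytes of payload," and the vulnerable handler trusts N instead of checking it against what actually arrived:

```python
# Toy illustration of a Heartbleed-style missing bounds check -- the shape
# of the flaw, not OpenSSL's actual C code. Memory contents are invented.

# Pretend this bytearray is the server process's memory: the received
# 2-byte heartbeat payload sits right next to unrelated secrets.
MEMORY = bytearray(b"hi" + b"|user:alice|password:hunter2|privkey:...")

def heartbeat_vulnerable(claimed_len, actual_len=2):
    """Echoes back claimed_len bytes, trusting the client's length field."""
    return bytes(MEMORY[:claimed_len])          # over-read past the payload

def heartbeat_fixed(claimed_len, actual_len=2):
    """Refuses requests whose claimed length exceeds the bytes received."""
    if claimed_len > actual_len:
        return None                             # drop the malformed request
    return bytes(MEMORY[:claimed_len])

# An honest 2-byte request gets "hi" back; a lying 40-byte request leaks
# adjacent memory from the vulnerable handler, and is rejected by the fix.
print(heartbeat_vulnerable(40))
print(heartbeat_fixed(40))
```

Repeating the lying request over and over is the "grab bag" Ars describes: each call returns whatever secrets happen to sit next to the payload at that moment.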