Showing posts with label data privacy. Show all posts

Blowing Up The Burners: New Bill To Make Anonymous Cell Phones Illegal?

Chances are, as a modern human being, you own at least one cell phone that you guard with a vigilance many bodyguards would envy.  It's as important as your wallet or keys, and maybe more so, given how hard it is to replace.  But what about the cell phones that are used specifically for their disposable nature?  Should they be illegal just because sometimes you have business to handle that you don't want mixing with the rest of your real life?

Burner phones:  will they be burned at the legal stake for their perceived sins?
(Image courtesy survivethewild.net.)

The Witness Of Fitness: Health Apps Want Your Hot Body (Of Data)

Do you have a smartphone, smartwatch, or otherwise e-enabled device that you use to keep track of your health and fitness habits?  That's good, in the sense that you care enough about yourself to hopefully not totally devolve into a donut-demolishing dumpster.  It's bad, however, that all of your fitness data might not be kept solely between you and your get-buff gadgetry.

"Sweet, I just beat my best 5K time!
But what's with all these ads to join the army?"
(Image courtesy lifefitness.com.)

X Marks The Spotted: Windows 10 Is Watching You

As citizens of the cyber-community, we've unfortunately become conditioned to seeing ads eerily targeted to things we say, emails from long-forgotten websites, and other evidence of deep data gathering put to work for moneymaking.  Now, with the launch of Windows 10 becoming a necessity for some users, Microsoft seems to have pulled out even more stops to speed up its spying...

Seriously, what ISN'T spying on us these days?
(Image courtesy hackread.com.)

Federal Appeals Court Rules NSA Wiretapping Illegal; NSA Turns Up The Volume, Puts Hands Over Ears, Says "La La La"

Of course, all privacy-prone American citizens have known this for some time:  the NSA's phone-call compendium is unnecessary, unaffiliated with capturing ANY terrorists EVER, and is overall downright creepy.  Thankfully, today, a federal appeals court ruled it illegal.

They listen to everything, but this is the only thing they need to hear.
(Image courtesy alan.com.)

Answers To All Of Your Searches: Now, You Can Download Your Entire Google History

As humans, it is an ineffable part of our nature to search.  Searching for meaning, searching for love, searching for a place to call our own...we do love a good quest - sometimes, even if we don't really know what we're searching for.  That fascination has only escalated with the advent of search engines, which require only mere, vague, and often strange commands to set us out onto our next journey.  And now, all that you've searched for can be found again...

Search from your perch...
(Image courtesy forum-politique.com.)

Bugs, Drugs, and Thugs: DEA Phone Tap Protocol From 1992 Onward Paved Way For Current NSA Programs

Many compelling arguments have been offered for stopping the NSA and other agencies from spying on Americans' (and others') phone calls.  Constitutional rights infringement, invasion of privacy, and the sheer waste of time and manpower are all noteworthy reasons the programs should be shut down.  However, nothing makes the case against this century's scaled-up security state better than history itself:  the powers-that-be have been monitoring calls for decades, and it didn't stop terrorists one bit...

It didn't really stop drug dealers that much, either.
(Image courtesy anyclip.com.)

Positively Fourth Amendment: The Department Of Justice Wants In On Your Info, Anywhere


Thanks to a new initiative to amend the Constitutionally-grounded rules regarding search and seizure, the United States Department of Justice seems intent on practicing anything but justice.  Currently, the D.O.J. is seeking the authority to hack computers anywhere in the world...

This isn't cool and never will be, regardless of what the Department of "Justice" thinks.
(Image courtesy watchdog.org.)

The Big Daddy Of Big Data: U.S. Appoints First-Ever "Chief Data Scientist"


Due to the vast influx of intelligence from many forms of modern media, stewarding the nation's data is now a job that warrants a major position in the United States government.  Meet America's chief cyber crusader, D.J. Patil...

The Safest Secrets In The World: Swiss Systems Allow For Super-Secure Data Storage

As privacy concerns escalate in our ever-observed lives, steps are now being taken to ensure that precious data can be held as securely as gold or other valuables.  Switzerland, a nation known for its strict privacy in the banking business, is at the forefront of this mission.

According to phys.org, Switzerland has some 61 data-banking centers that deal in information storage.  During the last five years, over a billion dollars have been invested by folks looking to keep their most important information safe from anyone else.

Even their pocketknife USB has a fingerprint scanner and major encryption technology.
No, seriously.
(Image courtesy gadling.com.)

The investments in data storage are surging despite Switzerland's ever-eroding laws concerning banking privacy. Thanks to its formerly ironclad banking secrecy, the nation was long known as a haven where shady dealings could be neatly numbered and accounted for, free from oversight by pesky things like the law. Although that is now changing, the same culture of pervasive privacy is being put to good use in data security.

Franz Grueter, the managing director of the data storage firm Green.ch, explained, "Clients need confidence, discretion, reliability and stability. These have been the country's hallmarks forever." He also noted that, "Data storage is the new Eldorado for Switzerland. It's a real boom." (Green.ch has posted 30% annual growth since its inception in 1995.)

Though Switzerland is Europe's fifth-largest data hub, it wants to be known as the nation that takes data security the most seriously. In Switzerland, personal data is legally classified as a "precious good" that requires a judge-issued order before it can be observed by any outsiders. Thus, digital assets, in the form of proprietary secrets, intellectual property, invention schematics, sensitive plans, or other critical data can be safely stashed with the Swiss.

Even email services established in Switzerland are more secure.
(Image courtesy totaldigitalsecurity.com.)

One such information cache, known as Deltalis, is situated in an underground Cold War-era bunker that's protected by biometric scanners, armed guards, and four-ton steel doors that were built to thwart a nuclear attack. Its exact location is not publicly known, and critical IT developments will be handled only by those who act in strict accordance with Swiss law. As far as privacy goes in the modern world, this is as safe as safe can be.

With leaks everywhere from government to Hollywood to personal cell phones occurring, it's good to know that somewhere, secrecy is being taken seriously. One big leak, from renowned whistleblower Edward Snowden, hinted that international spies had their eye on cracking into the Swiss system. They'll have to be the best in the world to make the attempt, though...digitally, physically, and legally, the Swiss have more layers of data protection in place than useful tools on one of their pocketknives.

Your weirdest nudies are safe here.
(Image courtesy photoromanzoitaliano.com.)

E-Emotional Rescue: Computer Programs That Deal In Your Feelings

Experts say that your computer is a better judge of your personality than even your closest family and friends.  It knows your preferences, correspondents, written words, tastes in imagery, secrets kept and deleted, and more.  But what happens in the possibly-near future when machines begin using all of this information to actually UNDERSTAND you?

When it comes to emotional intelligence and your computer, what constitutes too much information?
(Image courtesy singularityhub.com.)

According to the New Yorker, this may be happening more quickly than we expect.  Computers can already attempt to determine moods from vocal pitch and intensity, while simultaneously analyzing any attendant videos for evidence of micro-expressions or gestures that could reveal even more about an interaction.  Even the placement of words in a sentence can be taken to imply other things, indicating how angry, passionate, or spectacularly talented certain authors are.  Now, computers can not only be aware of these elements, but use them to temper their own responses or advice.

Rana el Kaliouby, an Egyptian scientist who runs the Boston-based company Affectiva, is at the forefront of this mecha-emotional leap.  Affectiva's most prominent software, Affdex, is trained to recognize four major emotions:  happy, confused, surprised, and disgusted.  Breaking down the user's face-image into deformable and non-deformable points, the software analyzes how far certain parts of one's face move (such as a smile or frown raising or lowering the corners of the mouth) in relation to other set points on the face (such as the tip of the nose.)  Things like skin texture (where wrinkles appear, or not) also factor in.  These metrics are then combined to compute what you feel.
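
To make the geometry a little more concrete, here is a toy sketch in Python (emphatically not Affectiva's actual code; the landmark names and coordinates are invented for illustration).  It measures how much the mouth corners rise relative to a fixed point like the nose tip, normalized so the answer doesn't depend on how close the face is to the camera:

# Toy illustration of landmark-based expression scoring (not Affectiva's code).
# Landmarks are (x, y) pixel coordinates; all names and values are made up.

def corner_lift(neutral, current):
    """How much the mouth corners rose between a neutral frame and the current one,
    measured relative to the nose tip and normalized by inter-eye distance."""
    def rel_y(lm, name):
        # Corner height relative to the (non-deformable) nose tip;
        # image y grows downward, so a smaller value means the corner sits higher.
        return lm[name][1] - lm["nose_tip"][1]
    lift = ((rel_y(neutral, "mouth_left") - rel_y(current, "mouth_left")) +
            (rel_y(neutral, "mouth_right") - rel_y(current, "mouth_right"))) / 2.0
    eye_span = abs(neutral["eye_right"][0] - neutral["eye_left"][0])
    return lift / eye_span

neutral = {"nose_tip": (100, 120), "mouth_left": (82, 155), "mouth_right": (118, 155),
           "eye_left": (85, 90), "eye_right": (115, 90)}
smiling = {"nose_tip": (100, 120), "mouth_left": (78, 148), "mouth_right": (122, 148),
           "eye_left": (85, 90), "eye_right": (115, 90)}
print("corner lift (smile-like):", round(corner_lift(neutral, smiling), 3))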

Based on the research of 1960s scientist Paul Ekman, the idea behind this technology stems from a simple, universal concept:  all humans, regardless of race, gender, age, or language barriers, have at least six specific facial expressions that register particular emotions.  Ekman broke these expressions down into their constituent movements and wrote a 500-page epic called FACS (Facial Action Coding System) on the subject.  The work has been considered the preeminent treatise on the topic for decades.

Other companies are on the e-emotional bandwagon too, with names like Emotient, Realeyes, and Sension.  Companies who rely on videoconferencing could now have a useful extra line on what their clients and associates are thinking.  Emotions, which have been found to be closely neurologically related to decision-making and common sense, now can be deduced from faces and choices with a degree of accuracy that seems like mind-reading.

We're less unique than anyone thinks.
(Image courtesy thewaylifeis.com.)

While useful (and now predominantly operational) in business, Kaliouby also spent time researching whether this kind of recognition could act as an "emotional hearing aid" for those with autism.  The National Science Foundation offered Kaliouby and her mentor nearly a million dollars to develop the idea.  This proved successful, but the concept was almost immediately seized upon by businesses from Pepsi to Toyota in the interest of learning more about their consumers' preferences.  These requests overwhelmed the scientists, leading to the creation of Affectiva.  The company, which claims to have refused requests to use the software for espionage (corporate and personal), wanted to generate revenue from investors to augment its autism-related research.

Thus Affdex began testing users' response to advertisements, giving the promotions industry a leg up on what consumers would be feeling when exposed to their sales pleas.  More than two million videos from eighty countries lent the program an unprecedented amount of information, all adding up to more accuracy in prediction from the program.  Affectiva now deals in these negotiations and improvements full-time.  In coming years, with more "smart" devices and internet-enabled items out there for our interaction, emotional electronics could use their ever-increasing knowledge to hopefully make our lives better.

These programs have our attention, which is a valuable resource.  Now, can that be used to hold our interest, connect us more completely, and/or improve our circumstances (even just by knowing we need the room temperature raised a little?)  Or will it simply serve as another metric to keep tabs on a passive populace?  Will we have the right to know when and where we are being emotionally analyzed, and will we be able to thwart such advances if desired?  Kaliouby maintains that there must be an overall altruistic tilt to the usage of the program, explaining to various advertisers that, “In our space, you could very easily be perceived as Big Brother, as opposed to the gatekeeper of your own emotional data—and it is two very different positions. If we are not careful, we can very easily end up on the Big Brother side.”

Whether we'll end up selling our attention to gain happiness points to sell for more happiness remains uncertain.  But the fact remains that the market for your emotions is vast and lucrative.  Companies will want to know you're happy if it makes them feel they're doing something right.  Other more insidious organizations may be tickled to learn that you're feeling deeply unsettled and on edge (right where some of them want you.)  Will the future be made of humans wearing constant poker faces, lest we be called out by computers?  Will there be surcharges for extra super-sized doses of happiness from certain places or products?  Or should we maybe turn the lens in on ourselves, and understand the nature of our own feelings, before we release them into the wild to be tagged and tracked...or hunted?

And remember, all of this information is taken from imagery alone.  We're not even really "plugged in" yet...
(Image courtesy rdn-consulting.com)






The Question Of Invisibility: Google's Yearly Content Removal Request Report

The thing about dealing with information supergiants is that they not only have power over who sees your secrets and how, but they'll also discuss them when comprehensively cataloguing the dirtier deeds that others have begged to have scrubbed from the internet.

Such is the nature of Google's semiannual transparency report, another installment of which was released today.  For the first time, this report included some 30 examples of material that had been expressly asked (mostly by government operatives) to vanish from the common knowledge, as though what was reported on was really bad enough to transcend the public's millisecond-length attention span.

There's a lot of skullduggery out there...removing the evidence is kind of a big job.
(Image courtesy betanews.com.)

According to newsweek.com, some of it really WAS bad enough that it could ruin lives simply by remaining in the public eye.  Prison inmate abuse, serious sexual accusations, and purported "defamation" of numerous police officers were all included in the materials requested for removal.  None of these requests were granted.

By Google's analysis, nearly 8,000 requests for information removal were made during 2013, covering the e-extraordinary rendition of some 14,367 pieces of information.  Requests to have one's secrets permanently kept that way (at least from the eyes of the internet) were up 60% from 2012.  The full transparency report, including cases and actions taken, is available for analysis.

The specific offending material varied, with governments making 1,066 requests for content to be removed from blogs, 841 requests for removal from Google searches, and 765 requests to never again grace the screens of YouTube. These figures cover the period between July and December 2013 alone.

Unsurprisingly, people said and did a bunch of dumb stuff caught on Twitter too.
(Image courtesy transparency.twitter.com.)


The most-cited reason for the prospective purge was "defamation" (36%), with nudity/obscenity (16%) and security (11%) also serving as justifications. Google was quick to admit that their report is not a full account of possible online censorship, but is a good metric in that "it does provide a lens on the things that governments and courts ask us to remove, underscoring the importance of transparency around the processes governing such requests.”

You can run, and you can rant, and sling all the press and televised mess you want, but you can't hide from the internet. Little Brother has just as many cameras and ears as Big Brother. The "embarrassing" (and maybe appropriately defamatory) results are more than elements of evidence: they are mirrors to our very society. A stark and honest appraisal of that image requires the full picture of our actions, no matter how ugly.

So that's how that works!  Thanks, transparency!
(Image courtesy watchdog.org.)

We See What You Did There: Edward Snowden Given Human Rights Award By Sweden

While the United States remains steadfast in putting Edward Snowden in the "whistleblower spy" archive of history, other nations consider his efforts a laudable fight against the subtle tyranny of the surveillance state.  This week in Sweden, Snowden was awarded the Right Livelihood award, a humanitarian recognition of his work to free Americans (and others) from the zoo of Big Brother's surveillance amusement.

And we, in good conscience, shouldn't let them.
(Image courtesy garymvasey.files.com.)

According to the Guardian UK, Snowden was not physically able to attend the ceremony, as he considered it a threat to his safety (he is wanted on charges under the Espionage Act in the United States, whose notorious record of "renditions" would have rightly worried Snowden.)  However, he spoke with the committee via teleconference from Moscow, where he is currently living in exile.  In a show of solidarity for Mr. Snowden's deplorably alienated circumstances, none of his family members would accept the award in his absence, noting only that someday Snowden himself should be able to do so.

Informed and angry.  He's not wrong.
(Image courtesy reddit.com.)

The award jury noted that Snowden was being commended “for his courage and skill in revealing the unprecedented extent of state surveillance violating basic democratic processes and constitutional rights."

No one cares you have nothing to hide.  Something can be used against you.
(Image courtesy car-memes.com.)

President Barack Obama, who did not comment on Snowden's award, had previously campaigned with a strong intent to protect American whistleblowers.


They spelled Obama's name wrong, but everything else about this is sadly correct.
(Image courtesy csnbbs.com.)



Grounding Big Brother: Amnesty International Releases Anti-Government-Spyware Detection Software

Are you a closet revolutionary, constantly aware of the deterioration of society and keeping informed on ways it can be fixed?  Are you a casual bystander who once googled a song by a band that prided itself on questioning authority?  Are you just paranoid as hell that the Man is out to get you?  Now you can stop governmental cyber-peeping for good, thanks to new technology released by Amnesty International.

As reported by the BBC, it is no secret that governments use "sophisticated spying tools that could grab images from webcams or listen via microphones to monitor people." Amnesty International knows how wrong that is, and has released the Detekt software to combat Big Brother's unsavory advances. Detekt scans your computer for government-grade spyware that might be missed (or intentionally looked over) by other more mainstream virus or malware detectors.

They're not this overt, but they are this unpleasant.
(Image courtesy wpremedy.com.)

Created through a collaboration between Amnesty International, the Electronic Frontier Foundation, Privacy International and Digitale Gesellschaft, the free software is designed to operate on Windows (the platform which most spied-on people are apparently using.) Its availability should be helpful in putting a damper on the $5 billion international government spyware market.

That's your tax money, getting spent to indiscriminately spy.  Kill the idea that this could ever be acceptable.
(Image courtesy betanews.com)

"People think the uses of spyware by governments are isolated cases. They are not," said Claudio Guarnieri, the German creator of Detekt. "Their discovery is isolated...Spyware is becoming the final solution for surveillance operations to overcome encryption.

"The real problem is nobody really asked the public whether that's acceptable and some countries are legitimizing their use without considering the consequences and inherent issues."

One of those inherent issues being that average civilians shouldn't be covertly spied on by their government.  Better fire up the Detekt, we probably just got put on a list.

There is nothing noble about blindly swinging a cyber bat at people's computers, hoping a piñata of prosecutable info will explode.  Even if it did, that candy is probably supposed to be helping the people.
(Image courtesy thehackernews.com.)



Pew Report: 90% Of Americans Feel They've "Lost Control" Over Data Privacy

It's no secret that most civilian information in the United States is not secret.  But just how bad has the encroachment on our privacy gotten?  A new Pew Research Center report suggests that Americans find this state of affairs deeply frustrating, but still don't feel ready to fix it...even though it is more critical now than ever to stop the erosion of our privacy.

It's not just a feeling.  It's their first move.
(Image courtesy mb.com.ph.)

The Washington Post reports that a recent study indicated Americans were very aware of the "privacy dystopia" they were living in, with 61% stating that they "would like to do more" to protect their online information.  Over 90% were aware that they had "lost control" over how private organizations were able to obtain and utilize their personal information.

Unfortunately, 55% were admittedly willing to trade personal information for free services online, which doesn't seem to be in line with most peoples' stated desires for privacy (yes, it DOES require sacrifice of some things, unfortunately, but perhaps someday with effort, that could be changed.  Cognizance of this is the first step to correcting it.)

Whatever this is, it isn't worth your security.
(Image courtesy news.softpedia.com.)

Other data from the report included some interesting findings:

-60% reported that revealing data to companies over the internet did not significantly improve their online experience

-88% did not trust advertisers the majority of the time

-82% did not trust the government all or most of the time

-Only 24% felt they could be easily anonymous online

-Perhaps most importantly, over 60% disagreed or strongly disagreed with the statement "it is a good thing for society if people believe that someone is keeping an eye on the things that they do online."

Cell phones, land lines, and social media site security were also assessed, but the overall results were clear:  the snooping needs to stop.  And until we cease squandering our own operational security or surrendering our data for the benefit of fleeting internet fun, this is going to be difficult to change.  It is no longer enough to disagree with privacy-violating practices - consumers and citizens must make the powers that be stop shamelessly snooping and selling our security.  Big Brother has become a bully, and it's time to fight back.

More technological shutters must be closed to block a variety of prying eyes.
(Image courtesy nypost.com.)

This (Text) Message Will Self-Destruct (Your Hard Drive's Data)

Security of your digital information now means security of the majority of your life.  Though it's possible to use a service that will release your valuable documents in the event of your untimely demise, what if you just need a complete destruction of data?

Look no further than the Autothysis128t.  According to ign.com, the 128-gigabyte hard drive is specially created for cautious compilers' control.  It's encrypted with a password, but for an unbeatable extra level of security, an onboard cell radio is standing by to eradicate your info with a mere text message.  

The Autothysis128t can be programmed to automatically go scorched-earth on your saved files in several ways, such as if it's unplugged from your computer, or if too many passwords are attempted to crack it.  But the real killswitch is the text-based execution order (textecution?) that you personally choose and fire off should the situation require it.

Thus, in event of theft or loss, the device will murder your data beyond all known recovery techniques as soon as you hit "send."  For $1600, it's an expensive security measure, but can you put a price on perfect privacy?

For all of your most sensitive materials.




Shrug Off "Atlas", Facebook's New Ad-Stalking Network

You are a target.  Your likes, dislikes, and desires, as manifested via the internet, make you prime material for directed advertising, and social media giant Facebook knows it.  That's why they're stepping in to make their ads follow you around the internet, like a lost dog of consumerism, or perhaps an overeager door-to-door salesman inside your screen.

According to mashable.com, the targeted ads will start following you immediately.  Your selections and mentions on Facebook help them direct material they think you will be prone to clicking on, and thus your creepily-pertinent ad distractions will appear when you visit other Facebook-affiliated sites (such as Amazon or various news outlets.)

Facebook's new ad network, Atlas, is responsible for this collection and dispersal.  A former Microsoft company that Facebook purchased for $100 million last year, Atlas tracks your verbiage and serves up what it deems appropriately topical consumer choices.  Atlas CEO Erik Johnson stated this is superior to the logging of your info by your computer's "cookies," writing in a blog post that, "Cookies don’t work on mobile, are becoming less accurate in demographic targeting and can’t easily or accurately measure the customer purchase funnel across browsers and devices or into the offline world."

Now that they've stepped up their game, so can you.  Services like Adblock, Ghostery, NoScript and Disconnect.me can help to combat the ever-encroaching e-eyeballs and protect your privacy.  So if you've ever had the sneaking suspicion that your paper trail needs to be burned, now you know how to fire it up.

You don't want to be on the shoulders of the Atlas that hefts the world wide web.

Dead Drop: Darknet Service Will Be Your Whistleblower If You Mysteriously Disappear

It's hard out there for a whistleblower.  With Bradley "Chelsea" Manning in extreme custody, Edward Snowden hiding out in Russia, and numerous other knowledge-droppers dead under sketchy circumstances, one would be deterred before breathing a word of any new top-secret info - no matter how damning.  However, if you do happen to have your hands on some hot intel, and fear for your safety because of it, a new service will release your documents if you end up disappearing or dead.

The service, called Dead Man Zero, is accessible only through the deep web.  According to vice.com, it costs around $120 (paid in bitcoin.)  Users upload their files to a secure cloud; the site then requires periodic password check-ins (at an interval chosen by the user), and if a check-in is missed, the documents are released to the user's designated outlets (lawyers, journalists, etc.)
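
Mechanically, this is a classic dead man's switch.  As a rough illustration only (this is not Dead Man Zero's actual code; the interval, names, and "release" step below are placeholder assumptions), the core logic amounts to something like this:

import time

CHECK_IN_INTERVAL = 7 * 24 * 3600   # user-chosen deadline, e.g. one week (assumption)

class DeadMansSwitch:
    def __init__(self, release_documents):
        # release_documents is a callback that sends the files to the chosen outlets
        self.release_documents = release_documents
        self.last_check_in = time.time()

    def check_in(self, password, expected_password):
        """Called whenever the user proves they are still around."""
        if password == expected_password:
            self.last_check_in = time.time()

    def tick(self):
        """Run periodically; if the user has gone silent past the deadline, release."""
        if time.time() - self.last_check_in > CHECK_IN_INTERVAL:
            self.release_documents()

switch = DeadMansSwitch(lambda: print("releasing documents to designated recipients"))
switch.tick()   # nothing happens yet; the deadline has not passed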

“So what if something happens to you?” Dead Man Zero's site ponders. "Especially if you're trying to do something good like blow the whistle on something evil or wrong in society or government. There should be consequences if you are hurt, jailed, or even killed for trying to render a genuine and risky service to our free society...Now you have some protection. If 'something happens' to you, then your disclosures can be made public regardless.”

It adds, "If events overtake you, you can still overtake your adversaries."

Of course, for anyone paranoid enough to use this service, a secondary dose of worry ensues.  Is the cloud secure enough?  Will the site sustain long enough to make certain my documents really do survive me?  Will they follow through with their promise despite what the intel may contain?  Yes, it is a gamble.  But so is possessing information worthy of this kind of necessity.  For true protection of what is too dangerous for public knowledge, it's either a service like this, or a buried chest full of documents and some keys distributed to your close associates...which do you feel is truly the safest?

You could always test their security by uploading a treasure map to the cloud and laying booby traps for anyone who comes after it.  Just an option.

Hacked Printer Shows How Lax Security Could "Doom" Your Company

We live in a world of instant gratification and hyper-connectivity. Unfortunately, the connections that bring us easy and immediately pleasant results can turn on us just as quickly as they work for us. Nowhere is this more true than in the field of technology. This was recently illustrated when a Canon office printer, connected to an outside computer server, was hacked to play "Doom."

According to pcgamesn.com, the security flaw was intentionally exploited to prove that the overly accessible printer posed a threat to office data security. The Canon Pixma printers have a web-accessible interface that required no authentication, enabling Context Information Security analyst Michael Jordon to sneak into the system and run a copy of "Doom" on the Pixma's LED screen. This was a playful but serious reminder that any party with unpleasant intent could create firmware to monitor or manipulate the printer's output, which could be instrumental in corporate espionage or sabotage.

As Jordon explained to The Guardian, “If you can run Doom on a printer, you can do a lot more nasty things...In a corporate environment, it would be a good place to be. Who suspects printers?”

Canon has assured its users that an update, requiring a username and password for the Pixma interface, will shut out rogue infiltration attempts on all models that had previously been at risk of compromise. Who says video games never teach you anything?

There are even worse things than these guys waiting to grab your office intel.  (Image courtesy cdn.bloodydisgusting.com.)

Hashing: How and Why to Check a File's Hash Value

Consider the following situation. You have been working for days on a PowerPoint presentation for work or school, and have been keeping the file on a shared computer, a network drive or even a personal flash drive. You put the final touches on your presentation the night before it’s due, save the file and get ready for a good night's sleep. The next day, you confidently begin your presentation. But imagine your surprise when you and your audience see the following image on your third slide:


You’ve been pranked. If you're lucky, everyone got a good laugh out of it. If not, there may be more serious consequences, depending on the situation. This sort of everyday scenario raises an obvious question. Short of opening the file and manually perusing each slide in the presentation, how could you be sure that it had not been modified by any of the pranksters you may share your computer or network with? More seriously, how can we verify the integrity of a file that may or may not have been modified by a malicious individual seeking to infect our computer or network with a dangerous piece of malware?

In this article, we’ll consider these questions and discuss the pros and cons of one simple means by which we can verify a file’s integrity to ensure that it has not been tampered with, namely, by verifying its hash value. We’ll conclude with a quick tutorial on how to verify a file’s hash value on Mac, Linux and Windows systems, and provide some links to a few lectures on cryptographic hash functions culled from the series of courses listed in our collection of free online computer science courses. Our primary sources along the way will be Everyday Cryptography by Keith M. Martin, and Applied Cryptography by Bruce Schneier.

Malware comes in many different guises. As the Electronic Frontier Foundation writes in their Surveillance Self-Defense Project, malware is frequently spread by "trick[ing] the computer user into running a software program that does something the user wouldn't have wanted." Let's say you decide to download a file from a website you know and trust, and from which you have safely downloaded files in the past. How do you know, for example, that the file you have downloaded onto your computer is in fact the one intended by the trusted website? How do you know it was not altered in transit? How do you know it was not swapped for another file by a malicious attacker? And how can you determine this without running the file first? 

One simple way to verify a file's integrity is by confirming its hash value. In Everyday Cryptography, Martin writes: “Hash functions can be used to provide checks against accidental changes to data and, in certain cases, deliberate manipulation of data . . . As such they are sometimes referred to as modification detection codes or manipulation detection codes” (emphasis in original, Martin, p. 188). In our opening example, a suitable hash function would have allowed you to detect that your presentation had been modified in some way without ever opening it.

So, what is a hash function? The primary practical property of a hash function is that it compresses arbitrarily long inputs into a fixed-length output (Martin, p. 189; Schneier, section 2.4). Furthermore, slight differences in the input data result in large differences in the output data. “A single bit change in the pre-image [i.e. the file you’re hashing] changes, on the average, half of the bits in the hash value,” (Schneier, section 2.4). Two of the most commonly used cryptographic hash functions are known as MD5 and SHA1. Schneier quotes NIST’s description of the SHA hash function as found in the Federal Register:
The SHA is called secure because it is designed to be computationally infeasible to recover a message corresponding to a given message digest, or to find two different messages which produce the same message digest. Any change to a message in transit will, with a very high probability, result in a different message digest. (Schneier, section 18.7.)
Here’s a simple example. I have created a plain text file named hello.txt on my Desktop. The file contains a single line that reads: “Hello there.” Applying the well-known sha1 hash function to the file produces the following hash value:
4177876fcf6806ef65c4c1a1abf464087bfbf337.

If I edit the file and remove the period from the end of the line so that it reads “Hello there”, the hash function now returns an entirely different value: 33ab5639bfd8e7b95eb1d8d0b87781d4ffea4d5d.

If I then return the file to its original state by adding the period back in to the end of the sentence, the hash value of the newly edited file will be the same as the original hash. And we would have seen much the same result (though it would have taken a good bit longer to compute!) if my original file had been a copy of the complete works of Shakespeare from which I then removed a period.  
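
You can try this experiment yourself in a few lines of Python using the standard hashlib module. A minimal sketch follows; the exact digests you get will depend on the exact bytes in your file (including any trailing newline), so they may not match the values quoted above:

import hashlib

original = b"Hello there."
edited   = b"Hello there"   # the same text with the final period removed

print(hashlib.sha1(original).hexdigest())
print(hashlib.sha1(edited).hexdigest())
# The two digests bear no resemblance to one another, and re-hashing the
# original bytes always reproduces the original digest exactly.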

Let’s consider a more practical example. The Electronic Frontier Foundation provides a number of recommendations on how to reduce your risk of malware infection in its Surveillance Self-Defense Project. At the top of their list, we read: “Currently, running a minority operating system [their examples are Linux and  MacOS -ed.] significantly diminishes the risk of infection because fewer malware applications have been targeted at these platforms. (The overwhelming majority of existing malware targets only a single particular operating system.)” This is more security through obscurity than anything else, but it’s still fun to try out new things, so after a bit of reading you decide to download a copy of the latest version of Ubuntu from an online repository.

How can you check to make sure that the file you’ve downloaded is the official one intended by Ubuntu’s developers and has not been manipulated or corrupted in transit? One way is to confirm that the file’s hash value is equivalent to the one provided by the developers. So you go to the page that lists the download’s hash value and make a note of it. Next, you run the hash function on the file you downloaded. If the resulting value is equivalent to the expected one, you have successfully verified the file’s hash.
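
The comparison itself is easy to script. Here is a minimal sketch using Python's hashlib; the filename and the expected checksum are placeholders for the values you would take from the download page, and SHA-256 is used here, though the same pattern works with whatever algorithm the developers publish:

import hashlib

expected = "paste-the-checksum-from-the-download-page-here"   # placeholder value

with open("ubuntu-release.iso", "rb") as f:                    # placeholder filename
    actual = hashlib.sha256(f.read()).hexdigest()

print("hash matches" if actual == expected else "MISMATCH: do not use this file")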

However, it is critical to note here that verifying a file’s hash value by itself can only establish a relatively weak form of data integrity, in comparison with more robust mechanisms such as digital signature schemes which can provide a stronger form of integrity verification and even authentication. (Martin, pp. 186-189.) This is because a hash value such as we are discussing here cannot tell us anything about the origin of a digital file. For example, assume that unbeknownst to you, the site you’ve downloaded your file from has itself been compromised, and the attacker has: 1) replaced the download file with a piece of malware, and 2) also replaced the corresponding hash value that you use to check the file’s integrity with the hash value of the malware.

If you then verify the hash value of your downloaded file, you have done nothing more than verify the integrity of the malware! And you’re none the wiser because the site itself was compromised! At the same time, however, if you found out through another source that the site and file were compromised, you could then identify the malicious file and distinguish it from the legitimate source file. In a digital signature scheme, as mentioned above, the developer could digitally sign the legitimate hash value with a trusted key. In this way, the question of trust is then displaced to the question of signature authentication.
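
For contrast, here is a toy sketch of the digital-signature idea using the third-party Python cryptography package (an assumption that it is installed; real schemes involve key distribution and certificate handling that are omitted here). The point is that verification now depends on a key the attacker does not hold, rather than on a checksum published right next to the download:

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

release = b"bytes of the release file"          # stand-in for the real download

# The developer signs the release with a private key only they control...
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(release)

# ...and you verify it against their published public key. A swapped file or
# a forged signature causes verify() to raise InvalidSignature.
public_key = private_key.public_key()
try:
    public_key.verify(signature, release)
    print("signature verified")
except InvalidSignature:
    print("file or signature has been tampered with")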

A second concern regarding this method of determining data integrity is the security of the hash functions themselves. There are known practical and theoretical vulnerabilities in two hash functions that are among the most common in use for these exact purposes on the web today: MD5 and SHA1. A discussion of these vulnerabilities is beyond the scope of the present article, but more information can be easily found online.

Still, as Bruce Schneier states, “we cannot use [one-way hash functions] to determine with certainty that the two strings are equal, but we can use them to get a reasonable assurance of accuracy” (Schneier, section 2.4). In other words, hash functions can help us establish a basic level of data integrity. In our opening example, simply making a note of the hash and then checking it the next day would have sufficed to establish that the file had been tampered with. But, of course, if the file had been secured or encrypted to begin with, it never would have been an issue in the first place.

Finally, how does one actually compute the hash value of a file? It is actually rather simple, but the specifics depend on your choice of operating system. MacOS and Linux systems come bundled with basic functionality to check any file’s hash value, while Microsoft Windows systems require you to download a piece of software to accomplish the task. Two of the most common functions used to verify file hashes are known as MD5 and SHA1. We’ll consider each in turn.

MacOS
1) Open up a command line Terminal.
2) Type “openssl md5 </path/to/file>” into the terminal and press enter.
2A) As an alternative to #2, you can also type “openssl md5 ” into the terminal, then drag and drop the target file into the Terminal window, and press enter.
3) The terminal will then return the MD5 hash value of the given file.

To compute the hash value of the file using a different hash function, type the name of that function into the terminal command in place of “md5”. For example, to compute the sha1 hash of a file, you would type: “openssl sha1 ” followed by the file path. To see a list of all the message digest commands available on your machine, type “openssl --help” into the command line terminal.

Linux (Debian-based)

1) Open up a command line Terminal.
2) Type: “md5sum </path/to/file>”. Then press enter.
3) The terminal will return the MD5 hash value of the given file.

To compute the hash value of the file using a different hash function, type the appropriate command into the terminal in front of the path to the target file. For example, “sha1sum </path/to/file>” will compute the file’s sha1 hash value. To see what other hash functions are available on your system, type “man dgst” into the terminal. 

Windows
Windows systems apparently do not come bundled with a built-in utility to check hash values. However, there are a number of different pieces of software you can download to accomplish the task. Microsoft Support lists the File Checksum Integrity Verifier, but warns that this is not supported by Microsoft and is only of use on Windows 2000, Windows XP and Windows Server 2003. This discussion at superuser provides a number of different extant options.
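
If you have Python installed, one cross-platform alternative is to skip the OS-specific tools entirely and use the standard hashlib module. A minimal sketch (the fallback filename is a placeholder):

import hashlib
import sys

def hash_file(path, algorithm="md5"):
    """Return the hex digest of the file at path using the named hashlib algorithm."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # Read in chunks so large files do not have to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "hello.txt"   # placeholder file
    print("md5: ", hash_file(path, "md5"))
    print("sha1:", hash_file(path, "sha1"))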

Video Lectures on Hash Functions
As always, comments, questions, suggestions and angry tirades are welcome below.