Was SHA-256 cracked? Don’t buy into the retraction!

SHA-256 is a one-way hashing algorithm. Cracking it would have tectonic implications for consumers, businesses and all aspects of government, including the military.
 
It’s not the purpose of this post to explain encryption, AES or SHA-256, but here is a brief description of what SHA-256 does. Normally, I place reference links in-line or at the end of a post, but let’s get this out of the way up front.
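In a nutshell: a one-way hash maps input of any size to a fixed-size digest, and the mapping cannot feasibly be run in reverse. A minimal sketch using Python’s standard hashlib (the input strings are arbitrary examples):

```python
import hashlib

# SHA-256 maps any input to a fixed 256-bit digest (64 hex characters).
d1 = hashlib.sha256(b"Hello, world").hexdigest()

# Changing even one character yields a completely different digest
# (the avalanche effect), and there is no way to run the function backward.
d2 = hashlib.sha256(b"Hello, world!").hexdigest()

print(d1)
print(d2)
print(len(d1))  # always 64 hex characters, regardless of input size
```

Everything that leans on SHA-256 (TLS certificates, Bitcoin, file integrity checks) depends on nobody being able to find an input that produces a chosen digest.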
 
One day after Treadwell Stanton DuPont claimed that a secret project cracked SHA-256 more than one year ago, they back-tracked. Rescinding the original claim, they announced that an equipment flaw caused them to incorrectly conclude that they had algorithmically cracked SHA-256.
 
“All sectors can still sleep quietly tonight,” said CEO Mike Wallace. “Preliminary results in this cryptanalytic research led us to believe we were successful, but this flaw finally proved otherwise.”
 
Yeah, sure! Why not sell me that bridge in Brooklyn while you backtrack.

The new claim makes no sense at all—a retraction of an earlier claim about a discovery by a crack team of research scientists (pun intended). The clues offered in the original claim, which was issued just one day earlier, cast suspicion on the retraction. Something fishy is going on here. Who pressured DuPont into making the retraction—and for what purpose? Something is rotten in the state of Denmark!
Let’s deconstruct this mess by reviewing the basic facts:

  • A Wall Street financial services firm proudly announces the solution to a de facto contest in math and logic
  • They succeeded in this achievement a year ago, but kept it secret until this week
  • One day later (with no challenge by outsiders),* they announce a flaw in the year-old solution

Waitacottonpickinsec, Mr. DuPont!! The flaw (an ‘equipment issue’) was discovered a year after the equipment was configured and used—but only one day after you finally decided to disclose the discovery? Poppycock!

I am not given to conspiracy theories (a faked moon landing, suppressed perpetual-motion technology, autism and vaccinations, etc.), but I recognize government pressure when I see it! Someone with guns and persuasion convinced DuPont to rescind the claim and point to a silly experimental error.

Consider the fallout, if SHA-256 were to suddenly lose public confidence…

  • A broken SHA-256 would wreak havoc on an entrenched market. SHA-256 is a foundational element in the encryption used by consumers & business
  • But for government, disclosing a crack to a ubiquitous standard that they previously discovered (or designed) would destroy a covert surveillance mechanism—because the market would move quickly to replace the compromised methodology.
I understand why DuPont would boast of an impressive technical feat. Cracking AES, SSL or SHA-256 has become an international contest with bragging rights. But, I cannot imagine a reason to wait one year before disclosing the achievement. This, alone, does not create a conundrum. Perhaps DuPont was truly concerned that it would undermine trust in everyday communications, financial transactions and identity/access verification…
 
But retracting the claim immediately after disclosing it makes no sense at all. There is only one rational explanation. The original claim undermines the interests of some entity that has the power or influence to demand a retraction. It’s difficult to look at this any other way.
 
What about the everyday business of TS DuPont?
 
If the purpose of the original announcement was to generate press for DuPont’s financial services, then they have succeeded. An old axiom says that any press is good press. In this case, I don’t think so! Despite the potential for increased name recognition (Who knew that any DuPont was into brokerage & financial services?) I am not likely to think positively of TS DuPont for my investment needs.
 

* The cryptographic community could not challenge DuPont’s original claim, because it was not accompanied by any explanation of tools, experimental technique or mathematical methodology. Because SHA-256 is baked into the global infrastructure of banking, commerce and communications, their opaque announcement was designed to protect the economy. Thank you, Mr. DuPont, for being so noble!

Is San Bernardino iPhone fully Encrypted?

Here is a question that keeps me up at night…

Is the San Bernardino iPhone just locked or is it properly encrypted?

Isn’t full encryption beyond the reach of forensic investigators? So we come to the real question: If critical data on the San Bernardino iPhone is properly encrypted, and if the Islamic terrorist who shot innocent Americans used a good password, then what is it that the FBI thinks that Apple can do to help crack this phone? Doesn’t good encryption thwart forensic analysis, even by the FBI and the maker of the phone?

In the case of Syed Rizwan Farook’s iPhone, the FBI doesn’t know if the shooter used a long and sufficiently unobvious password. They plan to try a rapid-fire dictionary attack and other predictive algorithms to deduce the password. But the content of the iPhone is protected by a closely coupled hardware feature that will disable the phone and even erase memory if it detects multiple attempts with the wrong password. The FBI wants Apple to help them defeat this hardware sentry, so that they can launch a brute-force hack, trying thousands of passwords each second. Without Apple’s help, the crack-detection hardware could automatically erase incriminating evidence, leaving investigators in the dark.

Mitch Vogel is an Apple expert. As both a former police officer and someone who has worked with Apple, he succinctly explains the current standoff between FBI investigators and Apple.


The iPhone that the FBI has is locked with a passcode and encrypted. If it were just locked with a passcode, like most iPhones, a passcode-removal tool could be used to bypass the lock and gain entry into the phone. However, the iPhone in question is encrypted, and this makes things somewhat more complicated. It can only be decrypted with the unique code, and not even Apple has that code or can decrypt it. Unlike what you see in the movies, it’s not possible for a really skilled hacker to mutter “It’s impossible” and then break through it anyway with enough motivation. Encryption really is that secure, and it really is impossible to break without the passcode.

What the FBI wants to do is brute force the passcode by trying every possible combination until they guess the right one. However, to prevent malicious people from using this exact technique, there is a security feature that erases the iPhone after 10 attempts or locks it for incrementally increasing time periods with each attempt. There is no way for the FBI (or Apple) to know if the feature that erases the iPhone after 10 tries is enabled or not, so they don’t even want to try and risk it.
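The arithmetic explains why that security feature is the whole ballgame. A 4-digit passcode has only 10,000 possibilities; the guess rate below is a hypothetical figure for illustration, not a measured iPhone number:

```python
# Without an erase-or-delay sentry, a short passcode falls almost instantly.
GUESSES_PER_SECOND = 1_000  # hypothetical rate for an unthrottled brute force

for digits in (4, 6):
    combos = 10 ** digits
    worst_case = combos / GUESSES_PER_SECOND
    print(f"{digits}-digit passcode: {combos:,} combinations, "
          f"exhausted in at most {worst_case:,.0f} seconds")

# The erase-after-10-tries feature caps an attacker at ten guesses total,
# which is why the FBI wants it disabled before trying anything at all.
```

Ten guesses against ten thousand combinations gives a 0.1% chance of success before the evidence vanishes; hence the standoff.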

So the FBI wants Apple to remove that restriction. That is reasonable. They should, if it is possible to do so without undue burden. The FBI should hand over the iPhone to Apple and Apple should help them to crack it.

However, this isn’t what the court order is asking Apple to do. The FBI wants Apple to create software that disables this security feature on any iPhone and give it to them. Even if it’s possible for this software to exist, it’s not right for the FBI to have it in their possession. They should have to file a court order every single time they use it. The FBI is definitely using this situation as an opportunity to create a precedent and give itself carte blanche to get into any iPhone without due process.

So the answer to your question is that yes it is that secure and yes, it’s a ploy by the FBI. Whether it’s actually possible for Apple to help or not is one question and whether they should is another. Either way, the FBI should not have that software.

Ex-NSA Boss says FBI is Wrong on Encryption

What happens if the National Park Service fences off scenic lookout points at the Grand Canyon’s south rim near the head of the Bright Angel trail? Would it prevent the occasional suicide jumper? Not a chance. (The National Park Service tried this in the mid-1980s.) People will either gore themselves on fences and posts or they will end their lives on the road in a high-speed automobile, putting others at risk. Either way, tourists will be stuck looking at the North Rim and the Colorado River through prison bars.

Let’s move from analogy to reality. What happens if you jam cell phone signals in tunnels and on bridges? Will it stop a terrorist from remotely detonating a bomb? No. But it will certainly thwart efforts to get rescue and pursuit underway. And what about personal encryption?…

Gadgets and apps are finally building encryption into their wares by default. These tools are highly beneficial for individuals who want to protect their data, but does a locked-down iPhone or the technology that businesses use to secure trade secrets and plan strategy among colleagues enable criminals? Not even close. But if the FBI criminalizes encryption, it cripples the entire American economy. After all, the genie is already out of the lamp.

Bear with me for just one more analogy (I’m still reaching for the right one): Criminalizing kitchen knives would make cooking impossible, and criminals would still have knives.

A Wild Duck has not previously linked to a media article. I am proud of our all-original content and clear statement of opinions. But in this case, I could not have said it better myself. (Actually, I have said it this all along: End-to-end encryption is a good thing for government, businesses and individuals alike. It is communications and storage empowerment.)

With this article, you will see that the former NSA director gets it. The current FBI director hasn’t a clue. Ah, well…That’s OK. Some concepts are subtle. For some politicians, an understanding of the practical, personal and sociological implications requires decades of exposure and post-facto reflection.

Memo to FBI director, Jim Comey: Get your head out of the sand and surround yourself with advisers who can explain cause and effect.



Encryption protects everyone’s communications, including terrorists. The FBI director wants to undermine that. The ex-NSA director says that’s a terrible idea.

The FBI director wants the keys to your private conversations on your smartphone to keep terrorists from plotting secret attacks.

But on Tuesday, the former head of the U.S. National Security Agency…

Read the full article at CNN Money
http://money.cnn.com/2016/01/13/technology/nsa-michael-hayden-encryption/

Big biz & Uncle Sam like Tor, sort of…

Oceans of Data
Try to visualize all the data about you that is recorded, stored or transmitted each day in one form or another. Consider every possible source, both public and private. What if it could all be put together, correlated with data about every other person on earth and sifted by detectives whose only task is to look for subtle patterns of behavior?

Let’s start with phone calls: In addition to the number dialed, the phone company knows your location, the caller ID of incoming calls, and even has access to the actual conversation. (Believe it or not, your government is listening.) Check the phone bill of both parties and we can figure out how often you call each other. If we then learn everything we can about the people that you talk to, we can probably learn a thing or two about you. And speaking of location, did you know that both iPhones and Android phones log your precise location every few seconds and then transmit your location history to Apple or Google several times each hour? An even more ominous program, discovered this week, is embedded in Android phones. It sends every keystroke to your carrier even if you opt out.

What about your health records, magazine subscriptions, tax filings, legal disputes, mortgage records, and banking transactions, including charge card purchases? Now add your internet use—not just the sites at which you are registered, but every site you have ever visited, each of which has recorded you in some way. Now suppose we add videos from convenience stores, traffic-enforcement cameras and every ATM that you pass. Don’t forget the snapshot at the toll booth. They have one camera pointed at your face and another at the license plate. Of course, there is also a log entry from the toll payment device on your windshield and the key-chain fob that you use when you buy gas.

What about the relationships revealed by your old high school yearbook, old newspaper articles, or that 4th-grade poetry contest your daughter was in? There was a handout that night, and so it counts as information related to you. How about that camera in the elevator at work? Suppose that it could recognize your face immediately and match it up with your fingerprints from your last international flight and your phone calls, web visits, hotel reservations and TV viewing habits.

Whew! That’s a lot of information to recognize or sift through in any meaningful way. But for a moment, ask yourself “What If”… What if all that data from every transaction record, GPS device, tax return and historical log could all be accurately attributed, correlated, matched and analyzed. What could be accomplished with all of this? Who wants it and for what purpose? Would their goals align with yours?

Person of Interest
In the CBS Television series, Person of Interest, a government computer looks for clues to the next terrorist event by monitoring virtually everyone and everything. The project doesn’t require its creators to build a new surveillance network. Massive amounts of data are already floating around us every day.

Of course, the data is fragmented. It was gathered for different reasons – mostly for private commerce (banking, medicine, safety). Few people consider it to impact privacy or personal freedoms, because we assume that it is too disparate and unwieldy for analysis by any single entity. Yet, in Person of Interest, the computer taps into all of these sources and mines the data for suspicious patterns.

As patterns emerge from all of this data, the computer finds converging threads based on individual behavior. Taken alone, the data points are meaningless – someone in Oregon signs for a package; someone using a different name in Rhode Island makes a plane reservation; someone in Pakistan fitting both descriptions checks into a motel and visits a convicted arms smuggler. The mobile phone carried by the last person accepts a phone call at a number previously used by one of the other individuals. Normally, no one could have ever fit these pieces together.

Eventually, the computer begins to identify suspicious activity. Depending on the programming and based on past findings, it even predicts events. But wait! Many of the patterns it finds are unrelated to terrorism. It finds clues to likely mob hits, crimes of passion, kidnapping, guns at school, and regional crime. The results are irrelevant to the machine’s purpose, and in this fictional drama, the government decides that the analysis would constitute illegal domestic spying. So they order the programmer to purge “irrelevant data” by adding a software routine to periodically delete extraneous results.

Of course, if the “personal” results were deleted, we wouldn’t have a new and exciting television series (my personal favorite). So the middle-aged geek who gave life to the analytics recasts himself as a vigilante. He teams up with a former special-ops agent (in the mold of Harrison Ford) and together they follow data-mined leads in hope of saving innocent individuals.

In the US, our government has such a program. In fact, there are many Total Information Awareness projects. Unlike the Hollywood version, there was never any intent to purge personal information; in fact, its collection and analysis is the whole point. Another difference from the television series is that our government is not satisfied to mine public data or even legally obtained data. Instead, the federal government adds new primary data-collection mechanisms every month and builds enormous enterprises to spy on individuals. This yields voluminous information daily, all of it available for future data mining without anyone’s knowledge or consent.

Of course, information and videos of individuals are routinely recorded wherever we go. But typically, we assume that this information is not centrally gathered, compared or analyzed. Most people assume that they are “off the radar” if they are not being actively tracked as part of an investigation. But with data mining techniques, no one is really off the radar. Machines make decisions about patterns that should be flagged and escalated for additional scrutiny.

Mixmaster: An Innocent Tool or Antiforensics?
In the 1990s, despite a background in cryptography and computer science, I wasn’t aware of these programs. In the fields of political science and sociology, I was a ninnyhammer. It is either coincidence or perhaps prescience that I proposed and then participated in a project called Mixmaster more than a decade ago…

The idea was simple: As you surf the web or send mail, your digital footprints are randomized so that an interloper or investigator cannot piece together the participants in an internet exchange, nor determine the habits of an individual user. Well, they’re not really random; rather, the IP address reported to the email service or web page you visit is replaced by one associated with another participant in the project. That’s because each packet of data leaving your PC is relayed through internet services associated with the others. We added a few simple facets to further obscure tracks:

  • Recognizing that a rogue participant might keep a log on the individuals who hand off data through his own relay (or may be compelled to do so in the future), our code automatically increased the number of ‘hops’ in relation to the number of available peers. Anonymity was enhanced, because an unfriendly investigator attempting to trace the source of a web visit or email would need cooperation from a larger pool of participants.
  • Data between participants was encrypted and randomized in length and even timing, to thwart possible forensic analysis.
  • A backward channel was added, but with very tight rules on expiration and purging. This allowed packet acknowledgement, web site navigation, and even two-way dialogue while still preserving anonymity.
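The hand-off scheme above can be sketched as nested envelopes: each relay peels one layer and learns only the next hop, never the full path. This toy uses plain nesting in place of the real per-hop public-key encryption that Mixmaster and Tor perform, so the structure is visible; the names `wrap`, `relay` and the relay addresses are invented for illustration:

```python
def wrap(message, path):
    # The innermost layer is the message; each hop wraps the previous layer.
    packet = message
    for hop in reversed(path):
        packet = {"next": hop, "payload": packet}
    return packet

def relay(packet):
    # A relay sees only its own address and an opaque inner payload.
    return packet["next"], packet["payload"]

path = ["relay-A", "relay-B", "relay-C"]
packet = wrap("hello", path)

hops_seen = []
while isinstance(packet, dict):
    hop, packet = relay(packet)
    hops_seen.append(hop)

print(hops_seen)  # ['relay-A', 'relay-B', 'relay-C'] — each relay saw one hop
print(packet)     # 'hello' — only the final recipient sees the message
```

With real encryption on every layer, even a rogue relay that logs traffic learns only its immediate neighbors, which is exactly why more hops meant more required cooperation from investigators.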

Privacy & Politics
For most of us involved in the project, there was no endgame or political agenda. We simply recognized that it is occasionally comforting to send email, browse the web or post to a public forum without leaving a traceable return address. To those who claimed that our work might aid money launderers, terrorists or child molesters, we explained that identification and authentication should be under the control of the parties involved in a conversation. The internet is a new communications medium, but it was not designed to undermine the privacy of every conversation for the purpose of facilitating future forensic investigation. Investigators – if their purpose is supported by judicial oversight – have many old-school methods and tools to aid their detective work. The growth of a new communication medium must not become a key to suppression or compromised privacy.

Vacuum-cleaner surveillance

Anonymous, but authenticated
There is a big difference between identification and authentication. In a democracy, citizens are authenticated at the polls. But they enter a private booth to cast their vote, and they turn in a ballot without a signature. They are identified (or, even better, authenticated without identification) for the purpose of verifying eligibility, but their identity is not carried over to their voting decision. The real business is effectively anonymous.

This isn’t to say that all authorized-entry systems should allow anonymous access. Of course not! Access systems typically ask “Who are you?” (your user ID) and then ask for proof (typically a password). Your identity is not always required, but proof of authorized access can come in three forms. Very secure systems (such as banks) require at least two of these before allowing access:

  • something you know: A password or challenge
  • something you have: Evidence that you have a token or card
  • something you are: A fingerprint, recognizable face, or voice match

In each case, it is the person behind the door that needs your identity or authorization and not your government.
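The two-of-three rule can be sketched in a few lines of Python, using only the standard library. The stored hash, the token handshake and the function names are all invented for illustration; real systems use salted password hashing and standardized token protocols such as TOTP:

```python
import hashlib
import hmac

STORED_PW_HASH = hashlib.sha256(b"correct horse").hexdigest()  # something you know
TOKEN_SECRET = b"shared-device-secret"                         # something you have

def check_password(pw):
    # Compare hashes in constant time; the server never stores the password.
    return hmac.compare_digest(hashlib.sha256(pw).hexdigest(), STORED_PW_HASH)

def check_token(challenge, response):
    # Proof of possession: only the real token can answer the challenge.
    expected = hmac.new(TOKEN_SECRET, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

def grant_access(factors_passed):
    # Bank-grade policy: at least two independent factors must pass.
    return sum(factors_passed) >= 2

# A legitimate user presents a password and a token response:
response = hmac.new(TOKEN_SECRET, b"nonce-42", hashlib.sha256).hexdigest()
print(grant_access([check_password(b"correct horse"),
                    check_token(b"nonce-42", response)]))       # True

# A password alone (one factor) is not enough:
print(grant_access([check_password(b"correct horse"), False]))  # False
```

Note that nothing in this exchange requires the user’s legal identity, only proof of authorization, which is precisely the voting-booth distinction above.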

Anonymity and encryption go hand in hand. Both technologies are used to ensure that internet communication is private and does not become the affair of your friends, employer, former spouse, or government overseers. So where, exactly, does your government stand on the use of internet encryption or anonymity? In most of the world, the answer is clear: governments stand for propaganda and crowd control. They are against any technology that enhances privacy. But this is not a universal axiom. In Germany, the government stands on the side of citizens: your data and your identity belong to you, and very little of your business is open to the government. But in the United States, the answer is very murky…

The NSA conducts vacuum-cleaner surveillance of all data crossing the Internet–email, web surfing… everything! –Mark Klein

Under George W. Bush, every bit of information was Uncle Sam’s business. With oversight by Dick Cheney (and hidden from legislative or judicial oversight), the executive branch concocted mechanisms of blatant domestic spying. Of course, the ringleaders realized that each mechanism violated the US Constitution’s protection against unreasonable search, and so it was ordered and implemented covertly until a technician working for AT&T blew the whistle. Suddenly, stories were surfacing that Uncle Sam was implementing a Reagan-era project that had been shelved during the Clinton era. This launched a scramble to win public support for the Patriot Act, an absurd euphemism which attempts to whitewash illegal snooping as the patriotic duty of each citizen (talk about ‘deceptive’! Our leaders must think that we are sheep. Not just your garden-variety grass-eating sheep, but really, really dumb sheep that feed on bull chips!).


Before the Obama administration, the answer was clear. These technologies were barely tolerated for banking, medicine and commerce, but they were to be weakened, subverted or thwarted when used by private citizens. In each case, the government sought to block the technology or insert a back door into the programming code (and into actual data centers) for use during any future investigation. Of course, in a bold era of predictive behavior modeling, authorized investigations often give way to fishing expeditions for the sole purpose of information gathering.

But something has changed in the past two years. As news spread about internet censorship in China, the Arab Spring, and covert schools for girls in Taliban-controlled regions of Afghanistan, the US government began to recognize that uncensored and even untraceable internet use sometimes coincided with foreign-policy objectives. Imagine the conundrum this revelation must have generated within the State Department! On the one hand, the Patriot Act sanctions blatant acts of domestic spying (including preemptive data mining with programs like Dick Cheney’s “total information awareness”), back doors built into encryption chips, “deep packet” data sniffing installed at major switching centers, satellite interception of phone calls, and national security letters (a euphemism for warrantless snooping). Yet they also support freedom of speech and privacy for anything that supports US policy amongst our friends.

-=-=-=-=-=-=-

Today, this model has been widely adopted and greatly enhanced by an open source project called Tor. In this blog, I won’t try to justify the need for robust anonymous relays. Better writers and social philosophers than I have explained why free and anonymous communications channels are central to a free and democratic society. Better writers than I have chronicled the abuses of the Patriot Act, Echelon, TIA and numerous other forms of government overreach. Better writers than I have explained how open and free communication leads to increased safety, even if it sometimes facilitates communications among terrorists, digital pirates or pornographers.

-=-=-=-=-=-=-

Turn of Events: Government as Advocate

  • Obama lends support to Tor
  • Tor to users: Use Amazon Cloud as bridge to anonymity (this section under development)

Additional Reading

  • Carrier IQ (CIQ): A secret routine embedded in Android phones sends every user keystroke to the network carrier, even when you opt out of every single connectivity feature. It cannot be uninstalled nor even shut down!
  • Surrounded by Surveillance: Is Everything Spying On You?

    Pigeons aren’t the only ones listening. The light pole itself broadcasts conversations.

    Even municipal light posts send conversations to government agencies, supposedly to aid first responders in an emergency. But wait! The manufacturer “proudly contacted DARPA” to suggest a more sinister use for the data collected from hidden microphones.

  • Wikipedia entry: Information Awareness Office (introduction & overview)
  • Official DARPA site: Information Awareness Office
  • The Smoking Gun: Discovery of Massive “Vacuum Sweep” Domestic Spying
    Leads to Patriot Act (euphemism for act of Profound Anti-Americanism)

Beijing to Impose Encryption Disclosure Rules

I originally wrote this piece in April 2010 as feedback to this article in The Wall Street Journal.
__________________________________________________________

Another reader, Felix Wyss, correctly points out that this WSJ article is unclear on the information demanded by Chinese authorities. Every open or public-key encryption standard is based on a disclosed algorithm. That’s the whole point: it is the computational difficulty of reversing that algorithm without the key that makes the encryption secure…
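Mr. Wyss’s point is Kerckhoffs’s principle: publish the algorithm, protect only the key. A small illustration using HMAC-SHA256 from Python’s standard library (the message text and key names are made up for the example):

```python
import hashlib
import hmac
import secrets

# The algorithm (HMAC-SHA256) is completely public; only the key is secret.
key = secrets.token_bytes(32)
message = b"wire transfer #8812"
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# An observer who knows the algorithm perfectly, but must guess the key,
# cannot produce a matching authentication tag:
forged = hmac.new(b"guessed key", message, hashlib.sha256).hexdigest()
print(tag == forged)  # False
```

Handing Beijing the algorithm changes nothing; handing them the key (an escrow or back door) changes everything.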

If the Chinese government wants more information about the algorithm used to encode data for transmission or archival – for example, to ensure that it is secure – then I say, “Absolutely! Get all the information you like.” But if they want a key escrow or back door, they are barking up a dead tree stump. Been there. Done that. Our own government tried this. In the words of Dana Carvey: “Not gonna happen.”

Even under communist rule, that wouldn’t be encryption at all. And with the growing clout of economic success, Chinese companies won’t stand for it. The Party would do better by demanding that routers or firewalls force users to create two keys: one for the end user and one for the network admin. That would promote good business practice. Of course, turning over the second key to anyone outside of the immediate work group or family would require active user consent and compliance.

Hey, China! Let’s face it: the days of suppressing free speech and forcing products to snitch on their users are coming to an ignoble end. The secret police of Romania, East Germany, the Soviet Union and Iraq have all disbanded – or at least been redirected to classic gumshoe detective work and intelligence gathering. Terrorism is the new enemy, not the private business and political activities of your own citizens. You have shown a remarkable ability to emerge as an economic superpower, making things that people like, exporting quality products, all while raising the standard of living for your population. Ultimately, this helps all countries. But now you have some very ugly skeletons to sweep out of your closet. Modern first-world countries cannot forever suppress political and religious freedom.

Grow up. We really do want you to enter the community of nations some day.

– Ellery Davies
Ellery Davies clarifies law and public policy. Feedback is always welcome.