Can Bitcoin transactions be made private?

The blockchain is public, yet a Bitcoin wallet can be created anonymously. So are Bitcoin transactions anonymous? Not at all…

Each transaction into and out of a wallet is a bread crumb, and following the trail is trivial. Every day, an army of armchair sleuths helps the FBI. That’s how Silk Road was brought down.

The problem is that some of that money eventually interacts with the real world (a dentist is paid, a package is shipped, or candy is purchased at a gas station). Even if the real-world transaction occurs 4 hops before or after hitting the “anonymous” wallet, it creates a forensic focal point. Next come the tax man, an ex-spouse or a goon.

The first article linked below addresses the state of tumblers (aka “mixers”). They anonymize an open network by obfuscating the trail of bread crumbs.

Mixers/tumblers aren’t the only way to add a layer of privacy to Bitcoin transactions. The Lightning Network spec includes optional 17-hop onion routing, much like Tor’s layered onion routing. I have not yet seen the feature expressed in wallets or services, but if implemented, it will be even more private and trustworthy than a mixer, because there is no middle party to trust (by you) or to squeeze (by investigators). It has the potential to make Bitcoin even more anonymous than cash.
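
To make the layering concrete, here is a toy sketch of onion routing in Python. It assumes the third-party cryptography package (pip install cryptography); the three-hop chain and payload are purely illustrative, and this is not the actual Lightning/Sphinx construction.

```python
from cryptography.fernet import Fernet

# Each relay node holds its own symmetric key (a stand-in for the
# per-hop keys negotiated in a real onion-routing protocol).
hop_keys = [Fernet.generate_key() for _ in range(3)]

def wrap_onion(payload: bytes, keys) -> bytes:
    """Encrypt the payload in layers; the first hop's layer ends up outermost."""
    for key in reversed(keys):
        payload = Fernet(key).encrypt(payload)
    return payload

def relay(onion: bytes, key: bytes) -> bytes:
    """Each hop peels exactly one layer. It learns the next hop,
    but neither the origin nor the remaining plaintext."""
    return Fernet(key).decrypt(onion)

onion = wrap_onion(b"pay invoice #123", hop_keys)
for key in hop_keys:            # the message traverses hops 0 -> 1 -> 2
    onion = relay(onion, key)
print(onion)                    # b'pay invoice #123'
```

The point of the exercise: no single relay ever sees both endpoints, which is why there is no middle party to trust or to squeeze.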

Certain cryptocurrencies (not Bitcoin) have anonymity baked in by design. Monero, ZCash and Dash are privacy tokens that use very different approaches to eliminate the bread crumbs. Monero appears to have one distinct advantage: Like the TOR network, it is trustless. But there are benefits to each approach.

Can I Check Web Sites Visited by my Kids/Staff?

Early this morning, I was asked this question at Quora. It’s a pretty basic request of network administrators, including parents, schools and anyone who administers a public, sensitive or legally exposed WiFi hot spot.

Is there a quick and easy way to view, log, or otherwise monitor the web sites visited by people on your home or office network?

Yes. It’s free, and it is pretty easy to do.

It gets a bit trickier if the individual on your network uses a VPN service that they have configured on their device. A VPN creates an encrypted connection to another network over the Internet; it can unlock region-restricted websites and shield browsing activity from snoops on public WiFi, which is precisely why so many people use one. A VPN does not stop you from logging their traffic, but all of their activity will point to the VPN’s address instead of the sites that they are actually visiting. In that case, there is another way to monitor their activity. See note #1, below.

Before getting into this, I should mention that I believe that using covert methods to monitor a family member’s online activity is a terrible method of parenting. In my opinion, there are better ways to deal with the issue: parenting techniques that don’t undermine trust as they address safety.

I can think of at least three methods for logging the websites that people on your network visit. In the explanation below, we will focus on #2. For more information, dig into the notes at the bottom of this answer.

You can either…

  1. Configure your router to store logs of visited IP addresses [2]
  2. Set your router to use the DNS server at opendns.com, instead of the default server offered by your internet service provider. This involves a simple setting available in all routers. (Replace default DNS server addresses with 208.67.222.222 and 208.67.220.220)
  3. Set up a proxy that redirects web traffic through one of the computers in your house or through a third-party service. This is how parental-monitoring software and custodial services monitor or block web traffic.

In the remainder of this quick tutorial, we focus on method #2.

Once you configure your router to use the two OpenDNS servers, create a free account at OpenDNS.com and enable the logging feature. It not only logs visited sites; it maps raw IP addresses into actual domain names and subdomains, making it easy to search, sort or analyze traffic.
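
If you want to confirm that the OpenDNS resolvers are reachable before relying on their logs, a quick sketch like this works (it assumes the third-party dnspython package, installed with pip install dnspython):

```python
import dns.resolver

# Query the two OpenDNS resolvers directly, bypassing the system default.
resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["208.67.222.222", "208.67.220.220"]

answer = resolver.resolve("example.com", "A")
for record in answer:
    print(record)   # the A record(s) returned by OpenDNS
```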

You can download a spreadsheet and sort it by number of visits or by the domains visited. Logs are maintained for only two weeks, so if you wish to maintain a history, you will need to visit OpenDNS and download them regularly. (Check their user forum: someone has created a safe, single-line DOS command that downloads these activity logs to your PC.)
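
Once you have a downloaded log, sorting it takes a few lines. This sketch assumes a CSV export with Domain and Total columns; the actual column names and file layout are hypothetical here, so adjust them to whatever the OpenDNS dashboard exports.

```python
import csv

with open("opendns_log.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Busiest domains first. "Total" is an assumed column name for visit count.
rows.sort(key=lambda r: int(r["Total"]), reverse=True)
for r in rows[:10]:
    print(r["Domain"], r["Total"])
```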


[1] VPN, Onion Routing and Encryption

If an individual in your home or office is using a Virtual Private Network [VPN], they are effectively covering their tracks with a variant of method #3, above. You can see their connection to the VPN service, but that service is either trusted to destroy its logs of visited web sites, or it anonymizes traffic by routing it through a chain of relays that have no way to back-trace and identify the requester’s address.

Since their traffic originates on your network, there are other things you can do to monitor their activities. For example, if they are not using end-to-end encryption, you can use method #3 yourself, to route data in and out through your own PC or service.

[2] Logging the IP address or domain of visited web sites is not a feature of all routers. I have three recent-model routers, and only one of them can log traffic in and out of the network.

[3] OpenDNS cannot distinguish which individual device in your home or office accessed the websites that it logs. The logs include all lookups that originate through your internet service subscription.

But here are some remarkable features of OpenDNS (other than it being completely free):

a) It speeds up your overall internet experience noticeably. Like Google’s free DNS service, its servers are more robust and more redundant than the default DNS servers recommended by your internet service provider.

b) It maps every IP address into a domain name. So when you log in to check your logs and statistics, you don’t need to figure out what the numbers mean. You view a list that makes sense, and you can even search for certain words or web sites.

c) It permits you to block websites based on a very rich set of criteria, including violence, adult content, hate speech, etc.

d) It offers graphs of your network access including overall volume. An example is shown here:

Online Privacy: Learn Tor, VPN, VeraCrypt, LastPass

I have a special request. Actually, this is a personal plea to my readers…

Next month, I host two evening privacy workshops near Boston. I could use a teaching assistant to run around and help newbies install software as I present to the class. But what I really need help with is getting the word out. Please help…

This time, it’s not about Bitcoin or the blockchain. It’s about taking control of your online identity and browsing activities. It’s about privacy and anonymity. It’s about your communications, your personal data and your disk or cloud storage.

All that data belongs to you and not to your ISP, employer, a hacker, the government, or marketers. And it is surprisingly easy to cover your tracks. In fact, with the proper tools, taking control of your identity and privacy is safe, simple and transparent.

In just 3 hours, attendees will learn to install and use Tor, a VPN, VeraCrypt and LastPass. They will also get an excellent feel for the function and benefits of a virtual machine.

Anyone attending can choose either Aug 8 (Marlboro) or Aug 22 (Natick). (Renting a presentation room in the Natick Library is expensive, which is one more reason I need a full house.)

Please help me promote an effective and exciting evening of learning. Get the word out. Check out these announcements: [Sign-up page]   [Meetup page]

Bonus Points: Do you recognize the photo on the left? Be the first to leave a comment with the name of the plastic privacy bubble and the 1960s TV series that featured it. The winner gets two free passes to our privacy workshop that can be transferred to anyone.

Diminishing Bitcoin Mining Rewards

By now, most Bitcoin and blockchain enthusiasts are aware of four looming issues that threaten Bitcoin’s conversion from an instrument of academics, criminal activity, and closed-circle communities into a broader instrument that is fungible, private, stable, ubiquitous and recognized as a currency, and not just an investment unit or a transaction instrument.

These are the elephants in the room:

  • Unleashing high-volume and speedy transactions
  • Governance and the concentration of mining influence among pools, geography or special interests
  • Privacy & Anonymity
  • Dwindling mining incentives (and the eventual end of mining). Bitcoin’s design eventually drops financial incentives for transaction validation. What then?

As an Op-Ed pundit, I value original content. But the article below on Bitcoin fungibility, and this one on the post-incentive era, are well-deserved nods to inspired thinking by other writers on issues that loom over the cryptocurrency community.

This article at Coinidol comes from an unlikely source: Jacob Okonya is a graduate student in Uganda. He is highly articulate and has a keen sense of market economics and the evolution of technology adoption. He is also a quick study and a budding columnist.

What Happens When Bitcoin Mining Rewards Diminish To Zero?

Jacob addresses this last issue with clarity and focus. I urge Wild Ducks to read it. My response, below, touches on issues #3 and #4 in the impromptu list above.


The sunset of mining incentives, and the absence of support for fully anonymous transactions, are two serious deficiencies in Bitcoin today. I am confident that both shortcomings will be successfully addressed and resolved.

Thoughts about Issues #3 and #4: [Disclosure] I sit on the board at CRYPSA and draft whitepapers and position statements.*

Blockchain Building: Dwindling Incentives

Financial incentives for miners can be replaced by non-financial awards, such as recognition, governance, gaming, stakeholder lotteries, and exchange reputation points. I am barely scratching the surface; others will come up with more creative ideas.

Last year, at the 2015 MIT Bitcoin Expo, keynote speaker Andreas Antonopoulos expressed confidence that Bitcoin will survive the sunset of miner incentives. He proposed some novel methods of ongoing validation incentives, most notably a game-theory replacement. Of course, another possibility is the use of very small transaction fees to continue the financial incentives.
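
The sunset is not speculative; it is written into the protocol. The block reward started at 50 BTC and halves every 210,000 blocks until it rounds down to zero, as this short calculation shows:

```python
def block_subsidy(height: int) -> int:
    """Block subsidy in satoshis (1 BTC = 100,000,000 satoshis)."""
    halvings = height // 210_000
    if halvings >= 64:               # avoid an over-long right shift
        return 0
    return (50 * 100_000_000) >> halvings

for era in range(8):
    h = era * 210_000
    print(f"block {h:>9,}: {block_subsidy(h) / 1e8:.8f} BTC")

# The subsidy rounds down to zero at the 33rd halving (block 6,930,000,
# circa 2140). From then on, validators earn transaction fees alone.
```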

Personally, I doubt that direct financial incentives, in the form of micro-cash payments, will be needed. Ultimately, I envision an ecosystem in which everyone who uses Bitcoin to buy, sell, gift, trade, or invest will avoid fees while creating fluidity, by sharing the CPU burden. All users would validate at least one blockchain transaction for every five transactions of their own.

Today, that burden is complex by design, because it reflects increasing competition to find a diminishing cache of unmined coins. But without that competition, the CPU overhead will be trivial. In fact, it seems likely that a validation mechanism could be built into every personal wallet and every mobile device app. The potential for massive crowd-sourced scrutiny has the added benefit of making the blockchain more robust: trusted, speedy, and resistant to attack.

Transaction Privacy & Anonymity

Bitcoin’s lack of rock-solid, forensic-thwarting anonymity is a weak point that must ultimately be addressed. It’s not about helping criminals; it’s about liberty and freedoms. Detectives and forensic labs have classic methods of pursuing criminals. It is not our job to offer interlopers an identity, a serial number and a traceable event for every transaction.

Anonymity can come in one of three ways. Method #3 is least desirable:

  1. Add complex, multi-stage, multi-party mixing to every transaction, including random time delays and the parsing out of fragments for real purchases and payments. To be successful, mixing must be ubiquitous. That is, it must be active with every wallet and every transaction by default. Ideally, it should even be applied to idle funds. This thwarts both forensic analysis and earnest but misguided attempts to create a registry of ‘tainted’ coins. (A toy sketch follows this list.)
  2. Fork by consensus: Add anonymizing technology by copying a vetted, open source alt-coin
  3. Migrate to a new coin with robust anonymizing tech at its core. To be effective, it must respect all BTC stakeholders, with no pre-mined or withheld distribution and no other special ownership. Of course, it must be open, transparent and permissionless, with an opportunity and incentive for all users to be miners or, more specifically, to be bookkeepers.
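
Here is the toy sketch promised in option #1: a payment is split into random fragments, each released after a random delay into a shared pool. It is conceptual only; real mixing protocols (CoinJoin and its kin) involve multiple parties, cryptographic blinding and on-chain transactions.

```python
import random

def mix(payment: int, pool: list, fragments: int = 4):
    """Split a payment into random fragments, each scheduled with a random
    delay, so no output amount or timing matches the input."""
    cuts = sorted(random.sample(range(1, payment), fragments - 1))
    amounts = [b - a for a, b in zip([0] + cuts, cuts + [payment])]
    schedule = [(random.uniform(0, 3600), amt) for amt in amounts]  # seconds
    pool.extend(amt for _, amt in schedule)  # fragments join the shared pool
    return sorted(schedule)

pool = []
print(mix(100_000, pool))   # e.g. [(412.7, 18204), (901.3, 40001), ...]
```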

That’s my opinion on the sunset of mining incentives and on transaction anonymity. What’s yours?


* Ellery Davies is co-chair of the Cryptocurrency Standards Association. He was host and MC for the Bitcoin Event in New York.

Bitcoin Fungibility: A Benefit of Privacy & Anonymity

I was pointed to this article by Jon Matonis, Founding Director, Bitcoin Foundation. I was sufficiently moved to highlight it here at AWildDuck.

On Fungibility, Bitcoin, Monero and ZCash … [backup]

This is among the best general introductions I have come across on traceability and the false illusion of privacy. The explanation of coin mixing provides an excellent, quick overview.

Regarding transaction privacy, a few alt-coins provide enhanced immunity or deniability from forensic analysis. But if your bet is on Bitcoin (as it must be), the future is headed toward super-mixing and wallet trading by design and by default. Just as the big email providers have added secure transit, Bitcoin will eventually be fully randomized and anonymized per trade, and even when assets are idle. It’s not about criminals; it’s about protecting business, government and individuals. It’s about liberty and our freedoms.


How to thwart forensic investigation: Fogify explains an advanced mixing process

The next section of the article explains the danger of losing fungibility through transaction tracing and blacklisting. I can see only ONE case for this, and it requires a consensus and a hard fork (preferably a consensus of ALL stakeholders and not just miners): for example, when a great amount of Ether was stolen in the DAO meltdown.

My partner, Manny Perez, and I take opposing views on blacklisting coins based on their ‘tainted’ history (according to “The Man”, of course!). I believe that blacklists must ultimately be rendered moot by ubiquitous mixing, random transaction-circuit delays, and multiple-transaction ‘washing’ (intentionally invoking a term that legislators and forensic investigators hate). Manny feels that there should be a “Law and Order” list of tainted coins. Last year, our pro-and-con views were published side-by-side in this whitepaper.

Finally, for Dogbert’s take on fungibility, click here. I bought the domain fungible.net many years ago, and I still haven’t figured out what to do with it. Hence this Dilbert cartoon. 🙂
____________
The author is co-chair of The Cryptocurrency Standards Association.
He also presents on privacy, anonymity, blind signaling & antiforensics.

Blind Signaling and Response presentation posted

Online services mine personal user data to monetize their processes. That’s the business model of “free” services. Even if the mining is consensual, policies and promises cannot guarantee privacy. They succumb to error, administrative malfeasance, hackers, malware and overreaching governments. Is there a technical solution? One that supports monetized data mining and manipulation, but only under predefined conditions, rather than by policies and promises?

Philip Raymond has spent the past 4 years searching for a privacy Holy Grail: The ability to facilitate data mining and manipulation in a way that protects user identity, restricts use to predefined purposes, and insulates results from other uses, even within the service that gathers and manipulates the data.

Prior to this week, there was scant public material on the Blind Signaling mechanism. A PowerPoint overview was accessible only to students at a few universities and to the French mathematician who is donating resources to the project.

This week, Université de Montréal posted a live video presentation that steps through the BSR PowerPoint slides. It was filmed at a computer privacy workshop hosted by the university math and encryption departments. Master of Ceremonies, Gilles Brassard, is recognized as an inventor of quantum cryptography, along with his colleague, Charles Bennett. [Brief History of QC]

Blind Signaling and Response, by Philip Raymond…

I am often asked about the algorithm or technical trick that enables data to be decrypted or manipulated only if the user’s intent is pure. That’s the whole point here, isn’t it? We claim that a system can be devised that restricts the interpretation and use of personal data (and even the identities of the individual users who generate it), based on the intended use.

The cover pulls back near the end of the video. Unfortunately, I was rushed through key PowerPoint slides because of poor timing, audience questions, and a lack of discipline. But I will present my theories directly to your screen if you are involved in custodial privacy of user data for any online service (Google, Yahoo, Bing, etc.), ISP, upstream provider, or Internet “fabric” service (for example, Akamai).

How it Works

The magic draws upon (and forms an offshoot of) Trusted Execution Technology [TXT], a means of attestation and authentication, closely related to security devices called Trusted Platform Modules. In this case, it is the purpose of execution that must be authenticated before data can be interpreted, correlated with users or manipulated.

Blind Signaling and Response is a combination of TXT with a multisig voting trust. If engineers implement a change to the processes through which data is manipulated (for example, within an ad-matching algorithm of Google AdWords), the input-data decryption keys will no longer work. When a programming change occurs, the process decryption keys must be regenerated by the voting trust, a panel of experts in different countries. They can be the same engineers who work on the project, and of course they work within an employer NDA. But they have a contractual and ethical imperative to the users (in fact, they are elected by users). Additionally, their vote is, collectively, beyond the reach of any government. This results in some very interesting dynamics…
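
The source does not specify the cryptography behind the voting trust, but a k-of-n threshold scheme captures the idea: no single engineer holds the process decryption key, and any quorum can regenerate it. Here is a minimal sketch using textbook Shamir secret sharing (Python 3.8+ for the modular inverse); treat it as illustrative, not as the actual BSR design.

```python
import random

PRIME = 2**127 - 1   # a Mersenne prime, large enough for a 16-byte key

def split(secret: int, quorum: int, members: int):
    """One share per trust member; any `quorum` of them can recover the key."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(quorum - 1)]
    def f(x):   # evaluate the random polynomial at x
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, members + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 reveals the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

key = random.randrange(PRIME)                 # the process decryption key
shares = split(key, quorum=3, members=5)      # a 3-of-5 voting trust
assert recover(shares[:3]) == key             # any three members suffice
assert recover(shares[2:]) == key
```

Withholding a quorum of shares is the trust’s veto: if a proposed process change smells wrong, the new key simply never comes into existence.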

  1. The TXT framework gives a voting trust the power to block process alteration. The trust can authenticate a rotating decryption key when changes to an underlying process are submitted for final approval. But if a prescribed fraction of members believes that user data is at risk of disclosure, or of manipulation in conflict with the EULA, the privacy statement and the expectations of all users, they can withhold the keys needed for in-process decryption. Because proposed changes may contain features and code that are proprietary to the custodian, members of the voting trust are bound by non-disclosure, but their vote and their ethical imperative belong to the end user.
  2. Blind Signaling and Response does not interfere with the massively successful Google business model. Google continues to rake in revenue for serving up relevant screen real estate to users, and for whatever else it does to match users with markets. Yet BSR yields two important benefits:
  • a) It thwarts hackers, internal spies and carelessness, and it completely undermines the process of government subpoenas, court orders and National Security Letters. After all, the data is meaningless even to in-house engineers. It is meaningful only when it is being used in the way the end users were promised.
  • b) Such a baked-in process methodology can be demonstrably proven, and doing so can dramatically improve user perception and trust in an online service, especially a large collection of “free” services that amasses personal data on interests, behavior and personal activities. When user trust is strengthened, users are not only more likely to use the services; they are less likely to thwart free services via VPNs, mixers or other anonymizers.

Incidentally, the idea of merging a TXT mechanism with a human factor (a geographically distributed voting trust accountable to end users) was first suggested by Steven Sprague, just hours before my presentation in the above video; I had been working on a very different method to achieve blind signaling. In addition to being insightful and lightning-quick to absorb, process and advise, Steven is a Trusted Platform expert, a director of Wave Systems and the CEO of Rivetz. Steven and I were classmates at Cornell University, but we had never met nor heard of each other until our recent affiliation as advisers to the Cryptocurrency Standards Association.

To learn more about Blind Signaling and Response—or to help with the project—use the contact link at the top of this page. Let me know if you watched the Montreal video.

Disclosure: The inventor/presenter publishes this Wild Duck blog under the pen name, “Ellery”.

Ex-NSA Boss says FBI is Wrong on Encryption

What happens if the National Park Service fences off the scenic lookout points at the Grand Canyon’s south rim, near the head of the Bright Angel trail? Would it prevent the occasional suicide jumper? Not a chance. (The National Park Service tried this in the mid-1980s.) People will either gore themselves on fences and posts, or they will end their lives on the road in a high-speed automobile, putting others at risk. Either way, tourists will be stuck looking at the North Rim and the Colorado River through prison bars.

Let’s move from analogy to reality. What happens if you jam cell phone signals in tunnels and on bridges? Will it stop a terrorist from remotely detonating a bomb? No. But it will certainly thwart efforts to get rescue and pursuit underway. And what about personal encryption?…

Gadgets and apps are finally building encryption into their wares by default. These tools are highly beneficial for individuals who want to protect their data. But does a locked-down iPhone, or the technology that businesses use to secure trade secrets and plan strategy among colleagues, enable criminals? Not even close. But if the FBI criminalizes encryption, it cripples the entire American economy. After all, the genie is already out of the lamp.

Bear with me for just one more analogy (I’m still reaching for the right one): criminalizing kitchen knives would make cooking impossible, and criminals would still have knives.

A Wild Duck has not previously linked to a media article; I am proud of our all-original content and clear statement of opinions. But in this case, I could not have said it better myself. (Actually, I have said it all along: end-to-end encryption is a good thing for government, businesses and individuals alike. It is communications and storage empowerment.)

With this article, you will see that the former NSA director gets it. The current FBI director hasn’t a clue. Ah, well…That’s OK. Some concepts are subtle. For some politicians, an understanding of the practical, personal and sociological implications requires decades of exposure and post-facto reflection.

Memo to FBI director, Jim Comey: Get your head out of the sand and surround yourself with advisers who can explain cause and effect.


(CNN Money, Jan 13, 2016)

Encryption protects everyone’s communications, including terrorists. The FBI director wants to undermine that. The ex-NSA director says that’s a terrible idea.

The FBI director wants the keys to your private conversations on your smartphone to keep terrorists from plotting secret attacks.

But on Tuesday, the former head of the U.S. National Security Agency…

Read the full article at CNN Money
http://money.cnn.com/2016/01/13/technology/nsa-michael-hayden-encryption/

Privacy –vs– Anonymity

My friend and business partner, Manny Perez, holds elective office. As New York State politicians go, he is an all-around decent guy! The first thing colleagues and constituents notice about him is that he is ethical and principled, has a backbone, and is compassionate about the causes he believes in.

Manny wears other hats. In one role, he guides an ocean freighter as founder and co-director of CRYPSA, the Cryptocurrency Standards Association. With the possible exceptions of Satoshi Nakamoto and Andreas Antonopoulos, Manny knows more about Bitcoin than anyone.

But Manny and I differ on the role of privacy and anonymity in financial dealings. While he is a privacy advocate, Manny sees anonymity, and especially civilian tools of anonymity, as a separate and potentially illegal concept. He is uneasy about even discussing the use of intentionally architected anonymity in any financial or communications network. He fears that our phone conversation may be parsed (I agree), that it may trigger a human review (I agree), and that it could be construed as evidence of promoting illegal technology. This is where we differ: I agree, but I don’t care how anyone who is not party to a private conversation construes it! I see anonymity as either synonymous with privacy or at least a constituent component. You can’t have one without the other.

Manny was raised in Venezuela, where he was schooled, held his first jobs, and was involved in the energy industry. He acknowledges that experience with a repressive and graft-prone government led to his belief in a more open approach: free markets coupled with a democratic government.

Perhaps this is a key source of our different viewpoints. Manny comes from a repressive land and has come to respect the rules-based structure within his comfort zones of banking, energy and government. He is a certified AML expert (anti-money laundering) and believes strongly in other financial oversight rules, like KYC (Know Your Customer) and RICO (Racketeer Influenced and Corrupt Organizations Act).

Because Manny is appreciative of the opportunity and benefits conveyed by his adoptive country, he may overlook a fact that whispers in the minds of other privacy advocates: we may one day need protection from our own government. After all, who but a conspiracy nut or a white supremacist could imagine the US government suppressing its populace? Sure, they engage in a little domestic spying; but if you have nothing to hide, why hide at all?!

This week, Manny posted an open letter to the cryptocurrency community. His organization, CRYPSA, sits at the intersection of that community with law, technology and politics. His letter addresses privacy, anonymity and transparency, but the title is “How can you report a stolen bitcoin?” For me, the issue is a non sequitur. You needn’t, you shouldn’t, the reporting superstructure shouldn’t exist, and in a well-designed system, you can’t.* More to the point, the imposition of any centralized reporting or recording structure would violate the principles of a decentralized, p2p currency.

To be fair, Manny is not a sheep, blindly falling into line. He is shrewd, independent and very bright. But in response to my exaggerated and one-dimensional Manny, I have assembled some thoughts…

1. Privacy, Anonymity and Crime

The debate about Bitcoin serving as a laundering mechanism for cyber-criminals is a red herring. Bitcoin does not significantly advance the art of obfuscation or anonymity. There have long been digital e-golds and stored-value debit cards that offer immunity from tracking, and they are just as easy to use over the Internet.

Moreover, it’s common for crime or vice to drive the early adoption of new technology, especially technology that ushers in a paradigm shift. The problem with linking Bitcoin to crime is that it drives a related debate on transparency, forensics and government oversight. This is a bad association. Transparency should be exclusively elective, being triggered only after a transaction—if and when one party seeks to prove that a payment was made or has a need to discuss a contractual term.

On the other hand, a good mechanism should render forensic analysis a futile effort if attempted by a third party without the consent of the parties to a transaction. We should always resist the temptation to build a “snitch” into our own tools. Such designs ultimately defeat their own purpose. They do not help to control crime; rather, they encourage an invasive government with its fingers in too many people’s private affairs.

CRYPSA is building tools that allow Bitcoin users to ensure that both parties can uncover a transaction completely, but only if a party to the transaction wishes to do so. For example, a parent making a tuition payment to a college can prove the date, amount and courses associated with that payment; a trucker or salesman with a daily expense account can demonstrate to his employer that a purchase was associated with food and lodging and not with souvenirs. And, of course, a taxpayer under audit can demonstrate whatever he wishes about each receipt or payment.

But in every case, the transaction is opaque (and, if properly secured, completely anonymous) until the sender or recipient chooses to expose details to scrutiny. I will never accept that anonymity is evil or evidence of illicit intent. Privacy is a basic tenet of a democracy and of a government responsible to its citizens. CRYPSA develops tools of transparency because commerce, businesses and consumers often need to invoke transparency, not because any entity demands it of them.
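
A salted hash commitment illustrates the principle of elective transparency. Nothing below is CRYPSA’s actual tool; it is a minimal sketch showing how a payer can publish an opaque record today and reveal the details to a chosen party later.

```python
import hashlib, json, os

def commit(details: dict):
    """Return (salt, commitment). Publish only the commitment."""
    salt = os.urandom(16)
    blob = json.dumps(details, sort_keys=True).encode()
    return salt, hashlib.sha256(salt + blob).hexdigest()

def verify(details: dict, salt: bytes, commitment: str) -> bool:
    """Anyone handed the details and salt can check them against the record."""
    blob = json.dumps(details, sort_keys=True).encode()
    return hashlib.sha256(salt + blob).hexdigest() == commitment

payment = {"date": "2016-09-01", "amount_btc": "2.5", "memo": "fall tuition"}
salt, c = commit(payment)          # c is public; the salt stays with the payer
assert verify(payment, salt, c)    # revealed only to the chosen auditor
```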

We are not required to place our telephone conversations on a public server for future analysis (even if our government saves the metadata or the complete conversation to its clandestine servers). Likewise, we should not expose our transactions to interlopers, no matter their interest or authority. The data should be private until the data generator decides to make it public.

2. Reporting a Transaction (Why not catalog tainted coins?)

Manny also wants to aid in the serialization and cataloging of tainted funds, much as governments do with the mass movement of cash into and out of the banking network. This stems from an earnest desire to help citizens, not to spy. For example, it seems reasonable that a mechanism to report the theft of currency should be embedded into Bitcoin technology. Perhaps stolen funds could be more easily identified if the digital coins themselves (or their transaction descendants) were fingered as rogue.

The desire to imbue government with the ability to trace the movement of wealth or corporate assets is a natural one. It is an outgrowth of outdated monetary controls and of our comfort with centralized, trust-endowed institutions. In fact, it is not even a necessary requirement for levying or enforcing taxes.

Look at it this way…

  1. Bitcoin transactions are irreversible without the identification and cooperation of the original payee (the one who received funds). Of course, identification is not a requisite for making a transaction, any more than identification is required for a cash purchase at a restaurant or a newsstand.
  2. There are all sorts of benefits to both anonymous transactions and secure, irrevocable transactions, or at least transactions that cannot be reversed without the consent of the payee. This is one of the key reasons that Bitcoin is taking off despite the start-up fluctuations in its exchange rate.
  3. Regarding the concern that senders occasionally wish to reverse a transaction (it was mistaken, unauthorized, or buyer’s remorse), the effort to report, reverse or rescind a transaction is very definitely barking up the wrong tree!

The solution to improper transactions is actually quite simple.

a) Unauthorized Transactions

Harden the system and educate users. Unauthorized transactions can be prevented BEFORE they happen. Even in the worst case, your money will be safer than paper bills in your back pocket, or even than an account balance at your local bank.

b) Buyer’s Remorse and Mistaken Transactions

Buyer beware. Think before you reach for your wallet! Think about what you are buying, from whom, and how you came to know them. And here is something else to think about (issues that are being addressed by CRYPSA)…

i.   Do you trust that the product will be shipped?
ii.  Did you bind your purchase to verifiable terms or conditions?
iii. Is a third party guarantor involved (like Amazon or eBay)?

All of these things are available to Bitcoin buyers, if they only educate themselves. In conclusion, “reporting” transactions that you wish to rescind is a red herring. It goes against a key tenet of cryptocurrency. It is certainly possible that a distributed reversal-and-revocation mechanism could be created and implemented. But if this happens, users will migrate to another platform (call it Bitcoin 2.0).

You cannot dictate oversight, rescission or rules to that which has come about through organic tenacity. Instead, we should focus on implementing tools that help buyers and businesses identify sellers who agree to these extensions up front. This, again, is what CRYPSA is doing. It is championing tools that link a transaction to business standards and to user-selective transparency. That is, a transaction is transparent if, and only if, the parties to the transaction agree to play by these rules, and one of them decides to trigger the transparency. For all other p2p transactions, there is no plan to tame the Wild West. It is what it is.

* When I say that you should not report a stolen coin, I really mean that you should not run to the authorities, because there is nothing that they can do. But this is not completely accurate.

1. There are mechanisms that can announce your theft back into a web of trust. Such a mechanism is at the heart of the certificate revocation method used by the encryption tool PGP (Pretty Good Privacy). CRYPSA plans to design a similar user-reporting mechanism to make the cryptocurrency community safer.

2. Authorities should still be alerted to theft or misuse of assets. They can still investigate a crime scene and follow a money trail, just as they do with cash transactions, embezzlement or property theft. They can search for motive and opportunity. They have tools and resources, and they are professionals at recovering assets.

Disclosure: Just like Manny, I am also a CRYPSA director and acting Co-Chairman. (Cryptocurrency Standards Association). This post reflects my personal opinion on the issue of “reporting” unintended, unauthorized or remorseful transactions. I do not speak for other officers or members.

Canary Watch deduces federal gag orders

The US government and its courts routinely demand personal user information from email services, banks, phone companies and other online, telecommunications or financial services. These demands compel services to disclose details about email, phone calls and financial transactions. They also grant access to troves of so-called “metadata”, which can be just as personal as a user’s phone calls: information about relationships, locations, browser configuration and even search history. Many of these demands, such as the infamous National Security Letter, stipulate that the service may not divulge that it was asked for the information in the first place. In fact, it can’t say anything about a de facto investigation!…

My friend, Michael, occasionally points out that skirting the law with Wink-Wink-Nod-Nod is still likely breaking the law. His point, of course, is that law is often based on intent. So with this in mind, what do you think about this clever reporting service?…

A service called Canary Watch lets online services like Verizon or Google send a continuous stream of data that repeatedly states, “We are not currently compelled to turn over any data on our users.” Naturally, if the service suddenly refrains from sending the statement, a reasonable person can infer that the government is demanding personal information with the usual gag order attached.

If you extrapolate this technique, a service like Verizon could continuously broadcast a massive list of usernames (or shorter hash codes representing individual users): the users who are not currently being investigated. Any change to the data stream would allow a third party to infer which users are the subject of an investigation, and to alert them.
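
Here is a sketch of that extrapolation, with everything hypothetical: suppose the service publishes a daily set of hashes for users not under compulsion, and a client checks for its own hash.

```python
import hashlib

def canary_list(usernames, date: str):
    """What the service would publish each day: hashes of unaffected users."""
    return {hashlib.sha256(f"{date}:{u}".encode()).hexdigest() for u in usernames}

def is_clear(username: str, date: str, published: set) -> bool:
    """A user (or a watchdog) checks for the expected hash."""
    return hashlib.sha256(f"{date}:{username}".encode()).hexdigest() in published

published = canary_list(["alice", "bob"], "2015-04-01")   # carol is omitted
print(is_clear("alice", "2015-04-01", published))   # True: no known order
print(is_clear("carol", "2015-04-01", published))   # False: draw an inference
```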

With the launch of this service, Canary Watch wins the 2015 Wild Duck Privacy Award. This is the type of cleverness that we like! Why? Because it enhances transparency and helps everyone to control their own privacy, if they choose to do so.

Wild Duck Privacy Award

Further reading: Activists create website to track & reveal NSA, FBI info requests

Erase Online Infamy: Lies, slander, binging, sexting

I wrote this article under contract to the leading European security magazine and Blog, at which I typically write under my real name. (Ellery is a pen name).

Upon submission, my editor haggled over the usual issues of regional euphemisms (not allowed), eclectic metaphors (encouraged) and brevity (my submissions tend to exceed the word limit by 3 or 4x). But this time, she also asked me to cut a section that I feel is critical to the overall thrust. That section is titled “Create Digital Chaff”.

I am neither a stakeholder nor an editor at that magazine. Their editors have every right to set the terms and tone of anything they publish. But sensing my qualms over personal ethics and reputation, my editor released the article from contract and suggested that I publish it in a venue with an edgy approach to privacy. I considered farming it out to Wired, CNet or PC Magazine, but it was written at a level and in a style intended for a very different audience. And so it appears here, in my own blog. The controversial section is intact and in red, below. Of course, Wild Ducks will see nothing controversial in a perfectly logical recommendation.

——————————-

The web is filled with tutorials on how to block tracking, hide purchases and credit history, and even shift your identity deep undercover.

But what about search results linked to your name, your organization or your past? What can be done?

Legal remedies are rarely effective. The Internet is untamed. Data spreads like wildfire, and it’s difficult to separate opinion from slander. But you can counter undesirable content.

Catalogue the Irritants

Start by listing and prioritizing your pain. Search your name in all the ways you are known. Try several search engines, not just Google. Check image results and Usenet (news groups).

Record disparaging search results in 7 columns:

  • Search terms: the query that yields the offending page
  • URL: the address of the unflattering material
  • Severity: damage to your reputation
  • Owner or Author: contact info for a traceable party
  • Role: author, site admin, or hosting service?
  • Inbound links: search on “Links:{Page_URL}”
  • Disposition: left msg, success, failure, etc.

Sort pages in descending order of severity. As you resolve issues, reduce severity to zero, but save the entries for follow-up. With just a few offensive pages, it may take a few hours to perform the tasks described. (A minimal sketch of such a log appears below.)
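
If a spreadsheet feels clumsy, the same log is easy to keep in a few lines of Python; the field names below simply mirror the seven columns, and the sample entries are invented.

```python
entries = [
    {"terms": "jane doe scandal", "url": "http://example.org/rant",
     "severity": 8, "contact": "admin@example.org", "role": "site admin",
     "inbound": "Links:example.org/rant", "disposition": "left msg"},
    {"terms": "jane doe review", "url": "http://example.net/post",
     "severity": 3, "contact": "author via form", "role": "author",
     "inbound": "Links:example.net/post", "disposition": ""},
]

# Work the worst damage first; resolved items (severity 0) drop from view.
for e in sorted(entries, key=lambda e: e["severity"], reverse=True):
    if e["severity"] > 0:
        print(e["severity"], e["url"], e["disposition"] or "todo")
```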

The Process

Most posts boil down to an individual author rather than a faceless organization. You can appeal, bargain, redirect, bury or discredit a source, sometimes employing several strategies at once. With reputation at stake, all is fair.

Removing or Correcting Content

First, determine whether you are more likely to influence the web developer, the site host, or the author who created the unflattering material (start with the author, if possible).

Ascertain whether the author is likely to influence readers who matter to you. After all, without cred, rants cannot do much damage.

If the source is credible, appeal directly. In some cases, remarks about you may be immaterial to his point. If it is impossible to find the source, or if there is no meeting of minds, contact the site owner or hosting service (daunting, but not impossible). GoDaddy, the largest hosting site,[1] often takes down entire sites in response to complaints.

Try pleading, bargaining or swapping favors. (But never pay! Extortion is best handled by courts). Negotiate these actions:

  • Change names, keywords and metatags. Avoid taking down the page for 2 weeks.
  • Point domain or URL to a different site
  • Post an apology or correction at searchable pages that contain offending material. (Avoid describing the slander. Use different phrases).
  • Add chaff (below). It reduces discovery.

Take Down the Search Cache

Check the page cache of each search result (click the arrow to the right of Google results). File takedown requests, especially if the material is obscene or you can argue that it is untruthful or slanderous.

Check referring sites; they may excerpt or echo the defamation. In the UK, freedom of expression is becoming a gray area. Nevertheless, in practice, the Internet gives everyone a soapbox. So our next technique employs misdirection and ‘noise’ rather than confrontation.

Create Digital Chaff

To protect against missiles, some aircraft eject ‘chaff’. The burning strips of foil lure guided munitions by presenting a target that is more attractive than the plane. Likewise, you can deploy digital “chaff” by planting information that overwhelms search results, leading viewers away from defamatory links via redirection or confusion. Your goal: change the search-result ranking.


US Air Force jet ejects burning chaff

Consider photos or events that you find untruthful or embarrassing. Ask friends with popular web pages to add content about the photo, associating fictitious names (not yours). Conversely, get them to post your name in articles that distance you from the activity, in ways that seemingly make it impossible for you to fit the offensive descriptions.

Use your imagination, but don’t make up lies. Eventually, they catch up with you. Instead, fill the web with news, trivia, reviews, and all things positive. Create content that pulls the focus of searches that previously led to pages you wish to suppress.

Finally, avoid SEO tricks to lure search engines,[2] such as cross-links, robot-generated hits, or invisible text and metadata not directly related to content. Search engines detect rigging. Manipulation or deceit will demote your page rank, negating hours of hard work.

Looking forward, consider ways to avoid letting your own data lead to future embarrassment…

Social Media

Facebook isn’t the only social media site. They include any site with ‘walls’, feeds or link sharing. Likewise, Blogs and Blog comments create a threat beacon.

Social media can ruin reputations as surely as it shares photos and connects friends. Searchable from within and outside, they spread online activities like vines. Your wall and timeline will easily outlive you!

Learn the privacy controls. Start by enabling all restrictions, and then isolate friends and colleagues into circles. This partitions friends and colleagues into venues associated with your various hats. Of course, friends can be in several of your circles, but circles give you the ability to restrict your wall or photos to individuals likely to appreciate the context and less likely to amplify your accidents and oversights.

Faced with privacy concerns, Facebook recently added granular controls and Google unified privacy policies across services. Most sites offer ways to enable and disable features or even temporarily suspend tracking. If yours doesn’t, find a more reputable service.

Scrutinize photo-tagging options. Facebook users can block the tagging of photos and remove their name from photos that others post. (Don’t take too much comfort: Facebook also uses facial recognition to encourage other users to ID you. In the universe of photos that include you, only a fraction was posted by you.)

Clean up Cloud Services

Do you use iCloud, Google Drive, or SkyDrive? How about Dropbox, SugarSync, or the backup services Carbonite and Mozy?

Cloud services are great collaboration tools. They backup folders and stream personal media. Like social networks, they present another leaky conduit for your private life.

Check that synced folders are encrypted or unlikely to convey personal or unflattering material. Review shared links: never grant access to everyone, and never let anyone access everything. Administer your friends, family and colleagues on a need-to-know basis. Your data needs a defense perimeter.

Create an Alter Ego

Ellery isn’t my real name. It is the alias under which I publish AWildDuck. But the fact that I acknowledge having another identity, and occasionally reference my career, geographic location and age, demonstrates that I am either very foolish or not making a serious effort to prevent discovery.

Archival Sites

Unfortunate news for anyone trying to erase history: the Wayback Machine at www.archive.org takes snapshots of the entire Internet every day. Visitors click a calendar to travel back in time and view a site or page as it once appeared.

Although archived content does not appear in search results, the comments you posted about the boss’ daughter are viewable to any visitor, no login required! Advice concerning archive sites: “Get past it!” They are not likely a threat, but they remind us that in the Internet age, history cannot be erased.

A Dose of Prevention

Putting anything online, even email, lets a cat out of a bag. You can corral it and hope that it doesn’t eat the neighbor’s daisies, but you cannot get it back into the bag. That’s why we focus on disguise, chaff and misdirection. If the neighbors are looking at your shiny car, and the cat looks more like a dog belonging to someone else, it is less likely to draw attention.

As you hunt the authors of unflattering detritus and implement containment, make a resolution to watch what you say online. Online content is never private. Cameras and recorders are everywhere, and they aren’t operated by friends. Your trail will easily outlive you.

_____________

[1] In April 2005, Go Daddy (aka Wild West Domains) surpassed Network Solutions as the largest ICANN-accredited registrar on the Internet [domain names registered].
Source: web-hosting-top.com. Stats as of 4/27/2005 and up to the date of this posting.

[2] SEO = Search Engine Optimization

Slippery Slope: Japan seeks to ban Tor

The Electronic Frontier Foundation (EFF) often finds itself on the opposite side of legislation that is initiated or supported by media rights owners. In fact, the Recording Industry Association of America (RIAA) and its Hollywood counterpart, the Motion Picture Association of America (MPAA), have thwarted every promising technology since the dawn of the photocopier and the 8-track tape cartridge.

We could list the delayed technologies, or those that were threatened with a use tax: the VCR, writable CDs, file-sharing networks, and DVD backup software. But the funny thing about grumbling rights owners is that they are, well, right. Sort of. After all, anyone who believes that it is OK to download a movie with BitTorrent or trade music with friends (while maintaining access in their own playlist) has a weak argument. They certainly can’t claim the moral high ground, unless they are the only person on earth who limits file copies to backups and playlists in strict conformity with the exceptions allowed under the DMCA.

But this week, it isn’t the RIAA or MPAA that seeks to squash the natural evolution of the Internet. This time, it is the government of Japan. Japan?!!

First, some background…

In July 2001, Napster was forced to shut down its servers by order of the US Ninth Circuit court. Despite legitimate uses for the service, the court agreed with a district court and the RIAA that Napster encouraged and facilitated intellectual property theft, mostly of music in that era.

The decision that halted Napster was directed at a specific company. Of course, it de-legitimized other file-swapping services. (Who remembers Limewire, Kazaa, Bearshare or WinMX?) But it was never intended to condemn the underlying technology. In fact, Napster was a pioneer in the emergence of ad hoc, peer-to-peer networks. It is the precursor of today’s BitTorrent, which merges distributed p2p storage with swarm technology to achieve phenomenal download speed and a robust, nearline storage medium. In fact, AWildDuck predicts that over the next few years, the big cloud services will migrate to a distributed p2p architecture.

Akamai has long used the power of distributed networks for storing data “at the fringe”, a technique that serves up web pages rapidly and conserves network resources. But a similar network, grown organically and distributed among the masses, strikes fear in the hearts of anyone who believes that power stems from identification and control.

In 2000 and 2001, p2p networks were perceived as a threat because they facilitated the sharing of files that might be held legally by only a few peers, or by none at all. Today, p2p networks are fundamental to the distribution of files and updates, and they are at the very core of the Internet.


Tor facilitates privacy. User identification is by choice.

Peer-to-peer networks are no more a tool of crime than telephones. Although both can be used for illegal purposes, no reasonable person advocates banning phones, and no one who understands the evolution and benefits of modern networks would advocate the regulation of peer-to-peer networks. (Please don’t add guns to this list; that issue has completely different considerations at play. With guns, there is a reasonable debate about widespread ownership, because few people use a gun as a tool for everyday activities, and because safe use requires training.)

But p2p networks are evolving. Their robust, distributed nature is enhanced by distributing the tables that point to files. In newer models, users control permissions as a membership criterion, rather than based on the individual source or content of files. For this reason, anonymity is a natural byproduct of the technology’s refinement.

Consider the individual users of a p2p network. They are nodes in a massive, geographically distributed storage network. As both a source of data and a repository for fragments from other data originators, they have no way to determine what is being stored on their drives or who created the data. Not knowing the packet details is a good thing for all parties, and not just to confound forensic analysis. It is a good thing every which way you evaluate a distributed network.*
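
Here is a toy model of that storage property, assuming the third-party cryptography package (pip install cryptography): each node stores an opaque slice of ciphertext, and only the originator, who keeps the key, can reassemble the file.

```python
from cryptography.fernet import Fernet

def shard(data: bytes, nodes: int, key: bytes):
    """Encrypt first, then split into fragments for distribution to peers."""
    blob = Fernet(key).encrypt(data)
    size = -(-len(blob) // nodes)          # ceiling division
    return [blob[i * size:(i + 1) * size] for i in range(nodes)]

key = Fernet.generate_key()                # never leaves the originator
fragments = shard(b"quarterly report", nodes=4, key=key)
print(fragments[0][:20])                   # a node sees only ciphertext bytes
restored = Fernet(key).decrypt(b"".join(fragments))
assert restored == b"quarterly report"
```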

The RE-Criminalization of P2P Networks


Japan’s National Police Agency (NPA) is much like America’s FBI. As a federal agency equipped with investigators, SWAT teams and forensic labs, their jurisdiction supersedes local authorities in criminal matters of national concern.

This weekend, the NPA became the first national agency in any country to call for a ban on anonymous web surfing. It wants Internet service providers to monitor and block attempts by their subscribers to use proxy servers, which relay internet traffic through remote servers, thereby anonymizing web traffic and even emboldening users to browse areas of the Internet that they might otherwise avoid.

But the Japanese NPA has a naïve and immature view of humanity. The use of proxy servers is not only fundamental to many legitimate purposes; many netizens consider web-surfing privacy to be a legitimate objective on its own merits. We could list a dozen non-controversial reasons for web-surfing privacy, but if we had to do that, you probably wouldn’t be reading this page.

* The statement that anonymity and encryption are good things for distributed p2p networks (not just for data thieves, but for all legal and business purposes) may not be self-evident to all readers. It will be the topic of a future discussion on this blog.

Ellery Davies is an author, privacy consultant and cloud storage architect. He is also editor at AWildDuck.com.

Chilling developments in domestic spying

The US government is obsessed with your phone calls, email, web surfing and a log of everywhere that you travel. The obsession has become so intense over the past few years that it has had to recast the definition of data gathering. After all, warrantless wiretapping and domestic spying are illegal. And so, once exposed, Uncle Sam now claims that massive public eavesdropping, archiving and data mining (including building cross-domain portfolios on every citizen) do not count as “spying” because a human analyst has not yet listened to a particular conversation. The way your government spins it, as long as no one has yet listened in on a private, domestic conversation, it can gather yottabytes of personal and business data without any judicial oversight.

The increasing pace of Big Brother’s appetite for wads of personal data is, at the very least, alarming, and it points toward nothing short of a police state. To learn about some of these events, check our recent articles on the topic of Uncle Sam’s proclivity for data gathering.

Whistle blower, William Binney, explains a secret NSA program to spy on U.S. citizens without warrants

I’m Not Doing Anything Illegal. Why Should I Care?

Here at AWildDuck, we frequently discuss privacy, government snooping, and projects that incorporate or draw upon warrantless interception. In just the USA, there are dozens of projects, past and present, with a specific mandate to violate the Foreign Intelligence Surveillance Act. How can the American government get away with it? In the past decade, as leaks began to surface, it tried to redefine the meaning of domestic surveillance to exclude sweeping acts of domestic surveillance. The Bush-era interpretation of USSID 18 is so farcical that it can be debunked by an elementary school pupil. As the ruse unraveled, the wholesale gathering of data on every citizen in America was ‘legitimized’ by coupling the Patriot Act with general amnesty for past acts of warrantless wiretapping. Dick Cheney invoked the specter of 9/11 and the urgent need to protect Americans from terrorism as justification for creating a more thorough and sweeping police mechanism than any totalitarian regime in history.

The programs go by many names, each with a potential to upend a democracy: Stellar Wind, The Patriot Act, TIA, Carnivore, Echelon, Shamrock, ThinThread, Trailblazer, Turbulence, Swift, and MINARET. Other programs thwart the use of privacy tools by businesses and citizens, such as Clipper Chip, Key Escrow and the classification of any secure web browsing as a munition that must be licensed and cannot be exported. The list goes on and on…

A myriad of dangers arise when governments ‘of-the-people and by-the-people’ engage in domestic spying, even if the motive is noble. Off the bat, I can think of four:

  • Justifications are ethereal. They are based on transient goals and principles. Even if motives are common to all constituents at the time a project is rolled out, the scope of data archived and the access scenarios inevitably change as personnel and administrations change.
  • Complex and costly programs are self-perpetuating by nature. After all, no one wants to waste billions of taxpayer dollars. Once deployed, a massive surveillance mechanism is very difficult to dismantle or thwart.
  • There is convincing research to suggest that domestic surveillance could aid terrorists, rather than protect civilians.
  • Perhaps most chilling is the insidious and desensitizing effect of such programs. Once it becomes acceptable for a government to spy on its citizens, it is a surprisingly small step for neighbors, co-workers and your own children to become patriotic partners in surveillance and reporting. After all, if your government has the right to preemptively look for dirt on your movements, Internet surfing, phone calls, cash transactions and sexual dalliances, then your neighbor can take refuge in the positive light of assisting law enforcement as they transmit an observation about an unusual house guest or the magazines you subscribe to.

What’s New in Domestic Spying?

This is a landmark week for anyone who values privacy and who understands that domestic spying is not a necessary tool of homeland security. This week, we are learning that US surveillance of its citizens is skyrocketing, and a court case is about to either validate the practice or slap a metaphorical wrist. Either way, each event brings us ever closer to the world depicted in Person of Interest. For now, I am citing breaking news. We’ll flesh out the details soon.

Articles on Privacy & Domestic Surveillance here at AWildDuck:

$1 Billion kick-starts Facial Recognition of Everyone

For access to a home or automobile, most people use a key. Access to accounts or transactions on the Internet usually requires a password. In the language of security specialists, these authentication schemes are referred to as using something that you have (a key) or something that you know (a password).

In some industries, a third method of identification is becoming more common: Using something that you are. This area of security and access is called ‘biometrics’. The word is derived from bio = body or biology and metrics = measurement.

The data center that houses computer servers for AWildDuck also houses valuable equipment and data for other organizations. When I visit to install a new router or tinker with my servers, I must first pass through a door that unlocks in the presence of my fob (a small radio-frequency ID tag on my key chain). But before I can get to the equipment cage that houses my servers, I must also identify myself by placing the palm of my hand on a scanner and speaking a code word into a microphone. I don’t know if my voice is analyzed as a biometric, but the use of a fob, a code word and a hand-scan demonstrates that the facility uses all three methods of identifying me: Something that I have, something that I know and something that I am.
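For readers who like to see ideas in code, here is a toy sketch of a gate that requires all three factors. Every identifier and secret below is hypothetical – my invention for illustration, not the facility’s actual system:

```python
# Toy three-factor gate. All identifiers and secrets are hypothetical,
# invented for illustration -- not any real facility's system.
import hashlib

ENROLLED_FOBS = {"FOB-1138"}                                 # something you have
CODE_WORD_HASH = hashlib.sha256(b"mallard").hexdigest()      # something you know
ENROLLED_PALM = hashlib.sha256(b"palm-template-042").hexdigest()  # something you are

def authenticate(fob_id: str, code_word: str, palm_template: bytes) -> bool:
    have = fob_id in ENROLLED_FOBS
    know = hashlib.sha256(code_word.encode()).hexdigest() == CODE_WORD_HASH
    are = hashlib.sha256(palm_template).hexdigest() == ENROLLED_PALM
    return have and know and are             # all three factors must pass

print(authenticate("FOB-1138", "mallard", b"palm-template-042"))  # True
print(authenticate("FOB-1138", "mallard", b"someone-else"))       # False
```

The point of the design is that the factors compound: a stolen fob is useless without the code word, and neither helps without the enrolled palm template.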

If you work with technology that is dangerous, secret, or that has investor involvement, then biometric identification or access seems reasonable. After all, something-that-you-are is harder to forge than something that you have. Because this technique is tied to part of your body, it also discourages the loaning of credentials to a spouse, friend, or blackmailer.

But up until now, biometric identification required the advance consent of the individuals identified. After all, before you can be admitted to a secure facility based on your hand print, you must have allowed your hand to be scanned at some time in the past. This also suggests that you understood the legitimate goals of those needing your identification in the future.

Few Americans have been compelled to surrender their biometrics without advance consent. There are exceptions, of course. Rapists and individuals applying to live in the United States are routinely fingerprinted. Two very different demographics, and yet both are compelled to surrender a direct link to their genetic makeup. But until now, we have never seen a non-consenting and unsuspecting population subjected to wholesale cataloging of personal biometrics. Who wants all of this data? What could they do with it?

Here at AWildDuck, we have written about the dogged persistence of conservatives in the American government to seek a state of Total Information Awareness. But now, Uncle Sam is raising the stakes to a new low: The Dick Cheneys and Karl Roves aren’t satisfied with compiling and mining data from that which is online, such as phone books, Facebook data, company web sites, etc. They want access to as much personal and corporate data as they can get their hands on: Bank records, credit card receipts, tax returns, library borrowing records, personal email, entire phone conversations & fax images, and the GPS history logged by your mobile phone.

Perhaps even creepier is the recent authorization for the use of high-altitude drones in domestic law enforcement. But wait! That development pales in comparison with a minor news bulletin today. The FBI has just funded a program of facial recognition. We’re not talking about identifying a repeat bank robber, a missing felon or an unauthorized entry across our borders. We are talking about scanning and parsing the entire population into a biometric fingerprint database. The project aims to cull and track facial images – and identify each one – from every Flickr account, every ATM, every 7-11…in fact, every single camera everywhere.

If you have a driver’s license, a Facebook account, or if you ever appeared in a college yearbook, it’s a certainty that you will soon surrender identifiable biometrics, just like a rapist or a registered alien. By 2014, we may arrive at 1984.

The one billion dollars set aside by the FBI for the facial recognition component of Project Über Awareness belies the truly invasive scope of body-cavity probing that the Yanks want to administer. The massively funded effort includes a data archival project buried within a Utah hillside that is brain-seizing in size and scope. Forget about Tera-, Peta- and Exabytes. Think instead of Yotta-, Zetta- and Halliburtabytes.

Engadget is a popular web site that reviews and discusses high-tech markets, media & gadgets. Below, they discuss the facial recognition component and its privacy implications. Just as with our past articles on this topic, Engadget begins with a still image from the CBS television series Person of Interest. The show depicts the same technology and its all-encompassing power. Whoever controls it has the power to manipulate life. But unlike Mr. Finch, a fictional champion of stalked heroines, the Big Brother version is not compelled by a concern for individual safety and security. Instead, the US government is using the specter of terrorism and public safety to bring the entire world one giant leap closer to a police state.

Do we really want our government – any government – to know every detail about our daily lives? Does the goal of securing public safety mean that we must surrender our individual freedoms and privacy completely? Are individuals who don’t care about privacy absolutely certain that they will trust their governments for all time and under all circumstances? Do they expect that the data will never be breached or used for purposes that were not originally sanctioned or intended? Is anyone that naïve?

________________________________________________________________________

FBI rolls out $1 billion public face recognition system in 2014.
Big Brother will be on to your evildoing everywhere

Reprint: Engadget.com, posted Sep 9th 2012

New York & Hawaii: Frightening bedfellows lacking perspective

New York and Hawaii are bookends to the 50 American states. Although separated by 8,000 km, each is rich in heritage, and each has a very different political and cultural perspective. Yet, despite the distance and political differences, they are embarking on an identical and ruinous path. Bills introduced in both states suggest that legislators lack fundamental knowledge of history, democracy, economics and, especially, the nature of the Internet. More importantly, they care not a whit for personal freedoms, privacy and individual rights.

NY & HI senate: Lacking historical perspective

I should end here with my favorite tag line, “So Sayeth Ellery”, but that would deny readers chilling facts. Facts that ought to shock the senses of every New Yorker and Hawaiian, and humiliate by association. Let’s cut to the chase: Lawmakers in the Aloha state want to criminalize anonymous internet posting while senators in the Empire State plan to create a database of every web site visited by each resident. Yes! They plan to track & archive your internet surfing history. I am not making this up!

A government DB of everyone’s web surfing… Now, isn’t that just special?!

With regrets to Dana Carvey, Isn’t that just special? After all, an individual concerned about being carded at the door is an individual with something to hide—most likely, guilty of a crime. Who else would object to registering a DNA sample before speaking on topics of the day? A law-abiding citizen doesn’t fear a government that tracks thought, medical history, private communication, bedroom fantasy, or corporate negotiation. Just what are those people afraid of?

Dear Wild Ducks: We are all those people. I am too blinded by disappointment and pity to name names or plow through the facts. (N.B. Names of the proponents are in the tags below this article.) So, I offer links to well-written summaries. Read along with me and weep. The US is already constructing the world’s biggest database of everything that you say, do and think. Perhaps New York and Hawaii feel left out. Or perhaps legislators in those states skipped out on high school history. More likely, they are decent individuals with good intentions, but simply poor stewards of liberty in an era of ecommerce, the Drudge Report and AWildDuck.

Does anyone not find this frightening? Forget about “confidential sources”. Want to comment on a breastfeeding blog? Sure. But first, register your fingerprints with an ISP and web host! I can think of three reasons that this won’t fly. More importantly, I am concerned that our legislators don’t see this:

Reasons to avoid suppressing a privacy technology

  • If a government bans free expression, the business of internet hosting & access simply migrates to jurisdictions that understand democracy. It’s the nature of any fungible medium.
  • Political restrictions on existing technologies or platforms create incentives for the rapid deployment of methods that circumvent or thwart the restrictions. This has the unintended effect of causing even more interference with legitimate investigations and forensic tools.
  • History demonstrates the dangers of surrendering free, anonymous speech to a government, no matter how ethical the current leaders. Governments are transient, though they try hard to be self-preserving. They do their best work when prodded by free and democratic constituents.

So sayeth Ellery.

Ellery Davies is not generally known as a liberal commentator.
But he is a political wonk, privacy advocate and editor of AWildDuck.

Enhancing Privacy: Blind Signaling and Response

A user-transparent privacy enhancement may allow online service providers like Google to provably shield personal data from prying eyes—even from themselves. Personal user data like search, email, doc and photo content, navigation and clicks would continue to support clearly defined purposes (advertising that users understand and agree to), but the data would be unintelligible if inspected for any other purpose.
In effect, the purpose and processes of data access and manipulation determine whether data can be interpreted or even associated with individual users. If data is inspected for any purpose apart from the original scope, it is unintelligible, anonymous and self-expiring. It is useless for any person or process beyond that which was disclosed to users at the time of collection. It cannot even be correlated to individual users who generate the data.

Blind Signaling and Response is not yet built into internet services. But as it crosses development and test milestones, it will attract attention and community scrutiny. A presentation at University of Montreal Privacy Workshop [video] gives insight into the process. The presenter can be contacted via the contact link at the top of this Blog page.

Can Internet services like Google protect user data from all threats—even from their own staff and processes—while still supporting their business model? If such commitment to privacy could be demonstrable, it could usher in an era of public trust. I believe that a modification to the way data is collected, stored and processed may prevent a breach or any disclosure of personal user information, even if compelled by a court order.

The goal of Blind Signaling and Response is to define a method of collecting and storing data that prevents anyone but the intended process from making sense of it. But this pet theory has quite a road ahead…

Before we can understand Blind Signaling and Response, it helps to understand classic signaling. When someone has a need, he can search for a solution.

When an individual is aware of their needs and problems, that’s typically the first step in marrying a problem to a solution. But in a marketing model, a solution (sometimes, one that a user might not even realize he would desire) reaches out to individuals.

Of course, the problem with unsolicited marketing is that the solution being hawked may be directed at recipients who have no matching needs. Good marketing is a result of careful targeting. The message is sent or advertised only to a perfect audience, filled with individuals who are glad that the marketer found them. Poor marketing blasts messages at inappropriate lists or posts advertisements in the wrong venue. For the marketer (or spam email sender), it is a waste of resources and sometimes a crime. For the recipient of untargeted ads and emails, it is a source of irritation and an involuntary waste of resources, especially of the recipient’s attention.

Consider a hypothetical example of a signal and its response:

Pixar animators consume enormous computing resources creating each minute of animation. Pixar founder, John Lasseter, has many CGI tools at his disposal, most of them designed at Pixar. As John plans a budget for Pixar’s next big film, suppose that he learns of a radical new animation theory called Liquid Flow-Motion. It streamlines the most complex and costly processes. His team has yet to build or find a practical application that benefits animators, but John is determined to search everywhere.

Method #1: A consumer in need searches & signals

Despite a lack of public news on the nascent technique, John is convinced that there must be some workable code in a private lab, a university, or even at a competitor. And so, he creates a web page and uses SEO techniques to attract attention.

The web page is a signal. It broadcasts to the world (and hopefully to relevant parties) that Pixar is receptive to contact from anyone engaged in Liquid Flow-Motion research. With Google’s phenomenal search engine and the internet’s reach, this method of signaling may work, but a successful match involves a bit of luck. Individuals engaged in the new art may not be searching for outsiders. In fact, they may not be aware that their early stage of development would be useful to anyone.

Method #2: Google helps marketers target relevant consumers

Let’s discuss how Google facilitates market-driven signaling and a relevant marketing response today and let us also determine the best avenue for improvement…

At various times in the past few weeks, John had Googled the phrase “Liquid Flow-Motion” and some of the antecedents that the technology builds upon. John also signed up for a conference in which there was a lecture unit on the topic (the lecture was not too useful. It was given by his own employee and covered familiar ground). He also mentioned the technology in a few emails.

Google’s profile for John made connections between his browser, his email and his searches. It may even have factored in location data from John’s Android phone. In the Czech Republic, a grad student studying Flow-Motion has created the first useful tool. Although he doesn’t know anything about Google AdWords, the university owns 75% of the rights to his research. They incorporate key words from research projects and buy up the Google AdWords phrase “Liquid Flow-Motion”.

Almost immediately, John Lasseter notices very relevant advertising on the web pages that he visits. During his next visit to eBay, he notices a home page photo of a product that embodies the technique. The product was created in Israel for a very different application. Yet it is very relevant to Pixar’s next film. John reaches out to both companies–or more precisely, they reach out in response to his signal, without even knowing to whom they are replying.

Neat, eh? What is wrong with this model?

For many users, the gradual revelation that an abundance of very personal or sensitive data is being amassed by Google–and the fact that it is being marketed to unknown parties–is troubling. Part of the problem is perception. In the case described above, and in most other cases in which Google is the arbiter, the result is almost always to the user’s advantage. But this fact, alone, doesn’t change the perception.

But consider Google’s process from input to output: the collection of user data from a vast array of free user services and the resulting routing of ads from marketing partners. What if data collection, storage and manipulation could be tweaked so that all personal data–including the participation of any user–were completely anonymized? Sounds crazy, right? If the data is anonymized, it’s not useful.

Wrong.

Method #3: Incorporate Blind Signaling & Response into AdWords
— and across the board

A signaling and response system can be constructed on blind credentials. The science is an offshoot of public key cryptography and is the basis of digital cash (at least, the anonymous form). It enables a buyer to satisfy a standard of evidence (the value of their digital cash) and also demonstrate that a fee has been paid, all without identifying the buyer or even the bank that guarantees cash value. The science of blind credentials is the brainchild of David Chaum, cryptographer and founder of DigiCash, a Dutch venture that made it possible to guarantee financial transactions without any party (including the bank) knowing any of the other parties.

The takeaway from DigiCash and the pioneering work of David Chaum is that information can be precisely targeted–even with a back channel–without storing or transmitting data that aids in identifying a source or target. (Disclosure: I am developing a specification for the back channel mechanism. This critical component is not in the DigiCash implementation). Even more interesting is that the information that facilitates replying to a signal can be structured in a way that is useless to both outsiders and even to the database owner (in this case, Google).
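To make the blind-credential idea concrete, here is a minimal Python sketch of a Chaum-style RSA blind signature – the primitive behind DigiCash’s anonymous cash. The toy key size and coin message are my inventions for illustration; real deployments use much larger keys and proper padding, and this sketch omits the back-channel mechanism mentioned above:

```python
# Chaum-style RSA blind signature, with toy parameters (NOT secure sizes).
import hashlib
from math import gcd
from secrets import randbelow

# Toy RSA key for the signer (the "bank"); real keys are 2048+ bits.
p, q = 999983, 1000003                 # two well-known primes
n = p * q
e = 65537                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent

def H(msg: bytes) -> int:
    """Hash the message and reduce it into the RSA modulus."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# --- User: blind the message so the signer never sees it ---------------
msg = b"one digital coin, serial 4711"   # hypothetical coin
h = H(msg)
while True:
    r = randbelow(n - 2) + 2             # random blinding factor
    if gcd(r, n) == 1:
        break
blinded = (h * pow(r, e, n)) % n         # this is all the signer sees

# --- Signer: signs blindly with the private key ------------------------
blind_sig = pow(blinded, d, n)

# --- User: strip the blinding factor to recover a normal signature -----
sig = (blind_sig * pow(r, -1, n)) % n    # equals h^d mod n

# Anyone can verify against the public key, yet the signer cannot
# link (msg, sig) to the blinded value it signed earlier.
assert pow(sig, e, n) == h
print("signature verifies:", pow(sig, e, n) == h)
```

The signer authorizes a message it never sees, and cannot later link the unblinded signature to the blinded value it signed. That unlinkability is what allows a response to reach a signaler without identifying either party.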

The benefits aren’t restricted to Internet search providers. Choose the boogeyman: The government, your employer, someone taking a survey, your grandmother. In each case, the interloper can (if they wish) provably demonstrate that meaningful use of individually identifiable data is, by design, restricted to a stated purpose or algorithm. No other person or process can find meaning in the data—nor even determine to whom it belongs.

The magic draws upon and forms an offshoot of Trusted Execution Technology (TXT), a means of attestation and authentication. In this case, it is the purpose of execution that must be authenticated before data can be interpreted, correlated with users or manipulated. This presentation at a University of Montreal privacy workshop pulls back the covers by describing a combination of TXT with a voting trust (the presenter rushes through key slides at the end of the video).
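For intuition, here is a rough mock-up of that idea – my own sketch, not Intel TXT’s actual API: the decryption key for user data is derived from a measurement (a hash) of the approved code, so only the attested purpose can unseal it:

```python
# A mock-up of "purpose-bound" data access (illustrative only; this is
# not Intel TXT's API). Data keys are sealed to a measurement -- a hash
# of the approved code -- so only that exact purpose can unseal them.
import hashlib
import hmac

MASTER_KEY = b"demo-master-secret (use sealed hardware storage in practice)"

def measure(code: str) -> str:
    """Stand-in for a TXT-style launch measurement of a process."""
    return hashlib.sha256(code.encode()).hexdigest()

# The one purpose users consented to: matching ads to a profile.
APPROVED_CODE = "def match_ads(profile): return rank_by_interest(profile)"
APPROVED_MEASUREMENT = measure(APPROVED_CODE)

def unseal_key(requesting_code: str) -> bytes:
    """Release the data key only to the attested, approved purpose."""
    m = measure(requesting_code)
    if not hmac.compare_digest(m, APPROVED_MEASUREMENT):
        raise PermissionError("attestation failed: purpose not approved")
    # Derive the key from the measurement itself, so the master secret
    # alone is useless without the exact approved code.
    return hmac.new(MASTER_KEY, m.encode(), hashlib.sha256).digest()

# The approved routine obtains a key; an analyst's ad-hoc query
# (different code, hence a different measurement) raises an error.
key = unseal_key(APPROVED_CODE)
print("key released to approved purpose:", key.hex()[:16], "...")
```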

It’s reasonable to assume that privacy doesn’t exist in the Internet age. After all, unlike a meeting at your dining table, the path from whisper to ear passes through a public network. Although encryption and IP re-routing ensure privacy for P2P conversations, it seems implausible to maintain privacy in everyday searches, navigation, and online email services, especially when services are provided at no cost to the user. Individuals voluntarily disgorge personal information in exchange for services, especially if the goal is to keep the service provider incented to offer the service. For this reason, winning converts to Blind Signaling and Response requires a thoughtful presentation.

Suppose that you travel to another country and walk into a bar. You are not a criminal, nor a particularly famous or newsworthy person. You ask another patron if he knows where to find a good Cuban cigar. When you return to your country, your interest in cigars will probably remain private and so will the fact that you met with this particular individual or even walked into that bar.

Gradually, the internet is facilitating at a distance the privileges and empowerment that we take for granted in a personal meeting. With end-to-end encryption, it has already become possible to conduct a private conversation at a distance. With a TOR proxy and swarm routing, it is also possible to keep the identities of the parties private. But today, Google holds an incredible corpus of data that reveals much of what you buy, think, and fantasize about. To many, it seems that this is part of the Faustian bargain:

  • If you want the benefits of Google services, you must surrender personal data
  • Even if you don’t want to be the target of marketing,* it’s the price that you pay for using the Google service (Search, Gmail, Drive, Navigate, Translate, Picasa, etc).

Of course, Google stores and acts on the data that it gathers from your web habits. But both statements above are false!

a)  When Google incorporates Blind Signaling into its services, you will get all the benefits of Google services without anyone ever discovering personal information. Yet, Google will still benefit from your use of their services and have even more incentive to continue offering you valuable, personalized services, just as they do now.

b)  Surrendering personal data in a way that does not anonymize particulars is not “the price that you pay for Google services”. Google is paid by marketers and not end users. More importantly, marketers can still get relevant, targeted messages to the pages you visit, while Google protects privacy in toto! Google can make your personal data useless to any other party and for any other purpose. Google and their marketing partners will continue to benefit exactly as they do now.

* This is also a matter of perception. You really do want targeted messaging. Even if you hate spam and, like me, prefer to search for a solution instead of having marketers push a solution to you. In a future article, I will demonstrate that every individual is pleased by relevant messaging, even if it is unsolicited, commercial or sent in bulk.

Will Google “Do No Evil”?

Google captures and keeps a vast amount of personal information about its users. What do they do with all that data? Despite some very persistent misconceptions, the answer is “Nothing bad”. But they could do a much better job ensuring that no one can ever do anything bad with that data—ever. Here is a rather simple but accurate description of what they do with what is gleaned from searches, email, browsing, documents, travel, photos, and more than 3 dozen other ways that they learn about you:

  • Increase the personal relevance of advertising as you surf the web
  • Earn advertising dollars–not because they sell information about you–but
    because they use that data to match and direct relevant traffic toward you

These aren’t bad things, even to a privacy zealot. With or without Google, we all see advertising wherever we surf. Google is the reason that so many of the ads appeal to our individual interests.

But what about all that personal data? Is it safe on Google’s servers? Can they be trusted? More importantly, can it someday be misused in ways that even Google had not intended?

I value privacy above everything else. And I have always detested marketing, especially the unsolicited variety. I don’t need unsolicited ‘solutions’ knocking on my door or popping up in web surfing. When I have needs, I will research my own solutions—thank you very much.

It took me years to come to terms with this apparent oxymoron, but the personalization brought about by information-exchange bargains is actually a very good bargain for all parties concerned, and if handled properly, it needn’t risk privacy at all! In fact, the things that Google does with our personal history and predilections really benefit us, but…

This is a pro-Google posting. Well, it’s ‘pro-Google’ if they “do no evil” (Yes—it’s the Google mantra!). First the good news: Google can thwart evil by adding a fortress of privacy around the vast corpus of personal data that they collect and process without weakening user services or the value exchange with their marketing partners. The not-so-good news is that I have urged Google to do this for over two years and so far, they have failed to act. What they need is a little urging from users and marketing partners. Doing no evil benefits everyone and sets an industry precedent that will permeate online businesses everywhere.

The CBS prime time television series, Person of Interest, pairs a freelance ‘James Bond’ with a computer geek. The geek, Mr. Finch, is the ultimate privacy hack. He correlates all manner of disparate data in seconds, including parking lot cameras, government records, high school yearbook photos and even the Facebook pages of third parties.

Mr. Finch & Eric Schmidt: Separated at birth?

It’s an eerie coincidence that Google Chairman, Eric Schmidt, looks like Mr. Finch. After all, they both have the same job! They find a gold mine of actionable data in the personal dealings of everyone.

Viewers accept the TV character. After all, Finch is fictional, he is one of the good guys, and his snooping ability (especially the piecing together of far-flung data) is probably an exaggeration of reality. Right?!

Of course, Eric Schmidt & Google CEO Larry Page are not fictional. They run the largest data gathering engine on earth. I may be in the minority. I believe that Google is “one of the good guys”. But let’s first explore the last assumption about Mr. Finch: Can any organization correlate and “mine” meaningful data from a wholesale sweep of a massive eavesdropping machine and somehow piece together a reasonable profile of your interests, behavior, purchasing history and proclivities? Not only are there organizations that do this today, but many of them act with our explicit consent and with a disclosed value exchange for all that personal data.

Data gathering organizations fall into three categories, which I classify based on the exchange of value with web surfers and, more importantly, whether the user is even aware of their role in collecting data. In this classification, Google has moved from the 2nd category to the first, and this is a good thing:

  1. Organizations that you are aware of–at least peripherally–and for which there is a value exchange (preferably, one that is disclosed). Google comes to mind, of course. Another organization with informed access to your online behavior is your internet service provider. If they wanted to compile a dossier of your interests, market your web surfing history to others, or comply with 3rd party demands to review your activities, it would be trivial to do so.
  2. Organizations with massive access to personal and individualized data, but which manage to “fly beneath the radar”. Example: Akamai Technologies operates a global network of servers that accelerate the web by caching pages close to users and optimizing the route of page requests. They are contracted by almost any company with a significant online presence. It’s safe to say that their servers and routers are inserted into almost every click of your keyboard and massively distributed throughout the world. Although Akamai’s customer relationship is not with end users, they provide an indirect service by speeding up the web experience. But because Internet users are not actively engaged with them (and are typically unaware of their role in caching data across the Internet), there are few checks on what they do with the click history of users, with whom they share data, and whether–or how–individualized data is retained, anonymized or marketed.
  3. National governments. There is almost never disclosure or a personal value exchange. Most often, the activity involves compulsory assistance from organizations that are forbidden from disclosing the privacy breach or their own role in acts of domestic spying.

The NSA is preparing to massively vacuum data from everyone, everywhere, at all times

The US is preparing to spy on everyone, everywhere, at all times. The massive & intrusive project stuns scientists involved.

I have written about domestic spying before. In the US, it has become alarmingly broad, arbitrary and covert. The über-secretive NSA is now building the world’s biggest data gathering site. It will gulp down everything about everyone. The misguided justification of their minions is alternatively “anti-terrorism” or an even more evasive “9/11”.

Regarding category #2, I have never had reason to suspect Akamai or Verizon of unfair or unscrupulous data mining. (As with Google, these companies could gain a serious ethical and market advantage by taking heed of today’s column.) But today, we focus on data gathering organizations in category #1—the ones with which we have a relationship and with whom we voluntarily share personal data.

Google is at the heart of most internet searches, and they are partnered with practically every major organization on earth. Forty-eight free services contain code that many malware labs consider to be a stealth payload. These doohickeys give Google access to a mountain of data regarding clicks, searches, visitors, purchases, and just about anything else that makes a user tick.

It’s not just searching the web that phones home. Think of Google’s 48 services as a marketer’s bonanza. Browser plug-ins phone home with every click and build a profile of user behavior, location and idiosyncrasies. Google Analytics, a web traffic reporting tool used by a great many web sites, reveals a wealth of data about both the web site and every single visitor. (Analytics is market-speak for assigning identity or demographics to web visits.) Don’t forget Gmail, Navigate, Picasa, Drive, Google Docs, Google+, Translate, and 3 dozen other projects that collect, compare and analyze user data. And what about Google’s project to scan everything that has ever been written? Do you suppose that Google knows who views these documents, and can correlate it with an astounding number of additional facts? You can bet Grandma Estelle’s cherry pie that they do!

How many of us ever wonder why all of these services are free to internet users everywhere? That’s an awful lot of free service! One might think that the company is very generous, very foolish, or very unprofitable. One would be wrong on all counts!

Google has mastered the art of marketing your interests, income stats, lifestyle, habits, and even your idiosyncrasies. Hell, they wrote the book on it!

But with great access to personal intelligence comes great responsibility. Does Google go the extra mile to protect user data from off-label use? Do they really care? Is it even reasonable to expect privacy when the bargain calls for data sharing with market interests?

At the end of 2009, Google Chairman Eric Schmidt made a major gaffe in a televised interview on CNBC. In fact, I was so convinced that his statement was toxic that I predicted a grave and swift consumer backlash. Referring to the billions of individuals using the Google search engine, investigative anchor Maria Bartiromo asked Schmidt why it is that users enter their most private thoughts and fantasies. She wondered if they are aware of Google’s role in correlating, storing & sharing data—and in the implicit role of identifying users and correlating their identities with their interests.

Schmidt seemed to share Bartiromo’s surprise. He suggested that internet users were naive to trust Google, because their business model is not driven by privacy and because they are subject to oversight by the Patriot Act. He said:

If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place. If you really need that kind of privacy, the reality is that search engines — including Google — do retain this information for some time and it’s important, for example, that we are all subject in the United States to the Patriot Act and it is possible that all that information could be made available to the authorities.

At the time, I criticized the statements as naive, but I have since become more sanguine. Mr. Schmidt is smarter than me. I recognize that he was caught off guard. But clearly, his response had the potential to damage Google’s reputation. Several Google partners jumped ship and realigned with Bing, Microsoft’s newer search engine. Schmidt’s response became a lightning rod–albeit briefly–for both the EFF (Electronic Frontier Foundation) and the CDT (Center for Democracy & Technology). The CDT announced a front-page campaign, Take Back Your Privacy.

But wait…It needn’t be a train wreck! Properly designed, Google can ensure individual privacy, while still meeting the needs of their marketing partners – and having nothing of interest for government snoops, even with a proper subpoena.

I agree with the EFF that such statements undermine Google’s mission. Despite his high position, Schmidt may not fully recognize that Google’s marketing objectives can coexist with an ironclad guarantee of personal privacy – even in the face of the Patriot Act.

Schmidt could have salvaged the gaffe quickly. I urged him to quickly demonstrate that he understands and defends user privacy. But I overestimated consumer awareness and expectations for reasonable privacy. Moreover, consumers may feel that the benefits of Google’s various services inherently trade privacy for productivity (email, taste in restaurants, individualized marketing, etc).

Regarding my prediction of a damning consumer backlash for whitewashing personal privacy, I was off by a few years, but in the end, my warnings will be vindicated. Public awareness of privacy and especially of internet data sharing and data mining has increased. Some are wondering if the bargain is worthwhile, while others are learning that data can be anonymized and used in ways that still facilitate user benefits and even the vendor’s marketing needs.

With massive access to personal data and the mechanisms to gather it (often without the knowledge and consent of users) comes massive responsibility. (His interview contradicts that message.) Google must rapidly demonstrate a policy of default protection and a very high bar for sharing data. In fact, Google can achieve all of its goals while fully protecting individual privacy.

Google’s data gathering and archiving mechanism needs a redesign (it’s not as big a task as it seems): Sharing data and cross-pollination should be virtually impossible – beyond a specified exchange between users and intended marketers. Even this exchange must be internally anonymous, useful only in aggregate, and self-expiring – without recourse for revival. Most importantly, it must be impossible for anyone – even a Google staffer – to make a personal connection between individual identities and search terms, Gmail users, ad clickers, voice searchers or navigating drivers! Voice search, in particular, has long been touted as a huge potential advance for Google’s data gathering, though many would argue that it has not yet had its expected impact. One such building block is sketched below.
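As an illustration of “internally anonymous, useful only in aggregate, and self-expiring” – a sketch of my own, not Google’s actual pipeline – identities can be pseudonymized with a per-epoch key that is destroyed on schedule:

```python
# Pseudonymize identities with a per-epoch HMAC key, then destroy the
# key on schedule. All names and values are hypothetical.
import hashlib
import hmac
import secrets
from collections import Counter

epoch_key = secrets.token_bytes(32)      # exists only during this epoch

def pseudonym(user_id: str) -> str:
    """Stable within the epoch, unlinkable once the key is destroyed."""
    return hmac.new(epoch_key, user_id.encode(), hashlib.sha256).hexdigest()[:12]

# Click logs store only pseudonyms; the raw identity is never written.
log = [
    (pseudonym("alice@example.com"), "animation tools"),
    (pseudonym("alice@example.com"), "animation tools"),
    (pseudonym("bob@example.com"), "travel"),
]

# Aggregate interest counts (what ad matching needs) still work:
print(Counter(topic for _, topic in log))

# End of epoch: destroy the key. Pseudonyms can no longer be recomputed,
# so no insider, subpoena or breach can re-identify the records.
del epoch_key
```

Within the epoch, aggregate ad relevance still works; after the key is destroyed, no staffer, subpoena or breach can map the records back to people.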

I modestly suggest that Google create a board-level privacy position and give it authority, filled by a visible, high-profile individual. (Disclosure: I have made a “ballsy” bid to fill such a position. There are plenty of higher-profile individuals that I could recommend.)

Schmidt’s statements have echoed for more than 2 years now. Have they faded at all? If so, it is because Google’s services are certainly useful and because the public has become somewhat inured to the creeping loss of privacy. But wouldn’t it be marvelous if Google seized the moment and reversed that trend? Wouldn’t it be awesome if someone at Google discovered that protecting privacy needn’t cripple the value of the information that they gather? Google’s market activity is not at odds with protecting their users’ personal data from abuse. What’s more, the solution does not involve legislation or even public trust. There is a better model!

Schmidt’s statements are difficult to contain or spin. As Asa Dotzler at Firefox wrote in his blog, the Google CEO simply doesn’t understand privacy.

Google’s not the only one situated at a data nexus. Other organizations fly below the radar, either because few understand their tools or because of government involvement. For example, Akamai probably has more access to web traffic data than Google. The US government has even more access because of an intricate web of programs that often force communications companies to plant data-sniffing tools at the junction points of massive international data conduits. We’ve discussed this in other articles, and I certainly don’t advocate that Wild Ducks be privacy zealots and conspiracy alarmists. But the truth is, the zealots have a leg to stand on and the alarmists are very sane.

U.S. Police Snoop Email, IMs, Phone Records

I originally wrote this in April 2011 as feedback to this article in PC World.
Even this reprint appeared before Edward Snowden broke similar news.
__________________________________________________________

The article linked above begins with these words:

Law enforcement organizations are making tens of thousands of requests for private electronic information from companies such as Sprint, Facebook and AOL.

Police and other agencies have “enthusiastically embraced” asking for e-mail, instant messages and mobile-phone location data.

Wiretapping with a court order is one thing. But this amounts to preemptive forensics. It reeks of unreasonable search…

Intercepting and reading private communications has no ethical leg to stand on, especially when initiated by a police force. It suggests that personal email (or data written to a disk) should have less protection than private thought. Personal communications must be rendered off limits to interlopers. I say “rendered” rather than “legislated”, because technology exists to foil overzealous acts of law enforcement. In security consulting, I rarely help courts to glean information that the author believed to be private. Applying forensic skills in this way puts blood on the hands of good technicians. (Quite literally, it had better involve a murder or bomb threat.) Instead, I am more likely to help individuals and organizations confound any attempt to reconstruct, trace or decode information, including content, history, ownership, origin, transfer (including asset transfer) or digital fingerprints.

I call this practice “Antiforensics”. More like-minded privacy advocates are heading in this direction. In almost every case in which forensics is employed without the consent of the creator or archivist (i.e., the person being investigated), the practice is unethical. I would never claim that the field lacks all legitimate purpose, but it is too often used by courts concerned with porn, drugs, your marriage, disputes between corporations, or the money in your mattress. At the drop of a hat, a forensic specialist will roll over and sing like a jaybird for any court in the land. Must we sell out? Where does basic privacy fit into the picture?

Cryptography and steganography not only belong in the hands of every human (Thank you, Philip Zimmermann), they should be inherent in every email, fax and phone conversation. They should be part of private communication and every save-to-disk. If “The Man” has a compelling reason to catch you with your pants down, he should have both a court order and a good gumshoe. One who resorts to conventional means at either end of the communication, rather than mining for data at a nexus in New Jersey (AT&T) or Virginia (NSA).
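Lest that sound exotic, encrypting every save-to-disk is a few lines of code today. Here is a minimal sketch using the third-party `cryptography` package; the filename and memo text are placeholders:

```python
# Encrypt-on-save in a few lines, using the third-party `cryptography`
# package (pip install cryptography). Filename and text are placeholders.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # guard this key; losing it loses the data
cipher = Fernet(key)

token = cipher.encrypt(b"draft memo: for my eyes only")
with open("memo.bin", "wb") as fh:
    fh.write(token)              # only ciphertext ever touches the disk

with open("memo.bin", "rb") as fh:
    print(cipher.decrypt(fh.read()))   # key holder recovers the plaintext
```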

As a security specialist for almost 30 years, I have seen “forensics” destroy families, lives and laudable civil movements. The art of a 3rd party using forensics for the improvement of society is far less prevalent than forensic activities that interfere with personal or political freedoms.

The spirit of prophylactic and preemptive antiforensics is embodied at Fungible.net, a data recovery lab in New England.* Mouse over the red words “Forensics” and “Security”. The lab uses the most sophisticated forensic tools, but they won’t sell out to a court unless someone has targeted the president.

Am I in the minority, practicing “antiforensics” with zeal and passion? My concern for privacy (before and during an investigation) exceeds my allegiance to political jurisdiction.

How about you? Give us your opinion about antiforensics.   —Ellery Davies

Ellery Davies clarifies law and public policy. He is a privacy champion, antiforensics expert, columnist to tech publications and inventor of Blind Signaling and Response. Here at A Wild Duck, Ellery dabbles in economics and law.

* Fungible.net is a data recovery service. But they also host Ellery’s Wild Duck blog.

Obama authorizes unique internet ID for all Americans

I originally wrote this piece in January 2011 as feedback to this article at Engadget.
__________________________________________________________

This is certainly the most frightening story of the past decade. It is more threatening to personal liberties (and even to national security) than any terrorist organization.

There will probably be a very short time between implementation of a national cyber ID infrastructure and implementation of an effective means to mix/proxy or otherwise obfuscate the digital trails of a casual web surfer.

It’s likely that an eventual law would criminalize anyone who covers their tracks. But even if the use of an accredited-ID channel is optional, your liberty is threatened by the very existence of a national identity program. Think of Germany rounding up the Jews. Is the metaphor too extreme? Think of a prosecutor questioning a witness: “Mr. Smith, I see that you did not allow your identity to be tracked when you posted this message. What are you trying to hide? Only criminals hide their Internet ID!”

It is critical to a democracy and to our personal freedoms that we are not identified each time that we speak. Not even as a default option that we can easily override. When you walk into a bar, you may be “carded” so that you can purchase alcohol (just as you must present a credit card for an on-line purchase). But your ID is not recorded and then correlated with your statements to other patrons or to the proprietor (in case you missed the metaphor, the proprietor is a government).

– Ellery Davies