Can I Check Web Sites Visited by my Kids/Staff?

Early this morning, I was asked this question at Quora. It’s a pretty basic request of network administrators, including parents, schools and anyone who administers a public, sensitive or legally exposed WiFi hot spot.

Is there a quick and easy way to view, log, or otherwise monitor the web sites visited by people on your home or office network?

Yes. It’s free, and it is pretty easy to do.

It gets a bit trickier if the individual on your network is using a VPN service that they have configured on their device. A VPN creates a secure connection to another network over the Internet, which lets a user access region-restricted websites and shield their browsing from anyone watching the local network. It’s no wonder that so many people use a VPN for privacy reasons, but it does make your analysis trickier. A VPN does not stop you from logging their browsing, but all of their activity will point to the VPN address instead of the site that they are actually visiting. In that case, there is another way to monitor their activity. See note #1, below.

Before getting into this, I should mention that I believe that using covert methods to monitor a family member’s online activity is a terrible method of parenting. In my opinion, there are better ways to deal with the issue: parenting techniques that don’t undermine trust as they deal with safety.

I can think of at least three methods for logging the websites that people on your network visit. In the explanation below, we will focus on #2. For more information, dig into the notes at the bottom of this answer.

You can either…

  1. Configure your router to store logs of visited IP addresses [2]
  2. Set your router to use the DNS servers at OpenDNS, instead of the default server offered by your internet service provider. This involves a simple setting, available in all routers. (Replace the default DNS server addresses with 208.67.222.222 and 208.67.220.220.)
  3. Set up a proxy that redirects web traffic through one of the computers in your house or a third-party service. This is how parental monitoring software and custodial services monitor or block web traffic.

In the remainder of this quick tutorial, we focus on method #2.

Once you configure your router to use the two DNS servers at OpenDNS.com, create a free account on their web site. Then, enable the logging feature. It not only shows you the IP addresses visited, it maps them into actual domain names and subdomains, making it easy to search, sort or analyze traffic.
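
If you want to confirm that a machine is actually resolving through OpenDNS, a quick script can compare the system’s configured nameservers against the two addresses above. This is a minimal sketch; it assumes the third-party dnspython package is installed.

```python
# Minimal check: is this machine's resolver pointed at OpenDNS?
# Assumes the third-party dnspython package (pip install dnspython).
import dns.resolver

OPENDNS = {"208.67.222.222", "208.67.220.220"}  # the addresses listed above

resolver = dns.resolver.Resolver()       # reads the system resolver settings
configured = set(resolver.nameservers)

if configured & OPENDNS:
    print("Resolving through OpenDNS:", sorted(configured & OPENDNS))
else:
    print("Not using OpenDNS. Configured servers:", sorted(configured))
```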

You can download spreadsheets and sort by number of visits or by the domains visited. Logs are maintained for only two weeks, so if you wish to maintain a history, you will need to visit OpenDNS and download them regularly. (Check their user forum. Someone has created a safe, single-line DOS command that downloads these activity logs to your PC.)
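
If you prefer a script to that DOS one-liner, the sketch below shows the general shape of a scheduled download. The URL and login step are hypothetical placeholders, not OpenDNS’s real endpoints; check their forum for the actual details.

```python
# Hedged sketch of scheduled log retrieval. CSV_URL and the login step are
# hypothetical placeholders; consult the OpenDNS forum for the real endpoint.
import datetime
import requests  # third-party: pip install requests

CSV_URL = "https://dashboard.example.com/stats/topdomains/{date}.csv"  # hypothetical

def fetch_daily_log(session: requests.Session, day: datetime.date) -> None:
    """Save one day's domain log before the two-week retention window closes."""
    resp = session.get(CSV_URL.format(date=day.isoformat()))
    resp.raise_for_status()
    with open(f"opendns-{day.isoformat()}.csv", "wb") as f:
        f.write(resp.content)

if __name__ == "__main__":
    with requests.Session() as s:
        # s.post(LOGIN_URL, data={...})   # authenticate first (details omitted)
        fetch_daily_log(s, datetime.date.today() - datetime.timedelta(days=1))
```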


[1] VPN, Onion Routing and Encryption

If an individual in your home or office is using a Virtual Private Network [VPN], they are effectively covering their tracks with method #3, above. You can see their connection to the VPN service, but that service is either trusted to destroy logs of visited web sites, or it anonymizes traffic by routing it through a chain of users that have no way to back-trace and identify the requester’s address.

Since their traffic originates on your network, there are other things you can do to monitor their activities. For example, if they are not using end-to-end encryption, you can use method #3 yourself, to route data in and out through your own PC or service.
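
To make that concrete, here is a minimal sketch of method #3: a tiny forward proxy that logs every URL requested through it. It handles plain HTTP only (HTTPS CONNECT tunneling is omitted for brevity), and it is a teaching toy rather than production monitoring software.

```python
# A minimal logging HTTP forward proxy (method #3). Plain HTTP only;
# HTTPS (CONNECT tunneling) is omitted to keep the sketch short.
# Point a device's proxy settings at this machine, port 8080.
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import urlopen

LOGFILE = "visited.log"

class LoggingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        with open(LOGFILE, "a") as log:
            log.write(self.path + "\n")         # record the full URL requested
        try:
            with urlopen(self.path, timeout=10) as upstream:  # fetch on the client's behalf
                body = upstream.read()
                self.send_response(upstream.status)
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
        except Exception:
            self.send_error(502)

if __name__ == "__main__":
    ThreadingHTTPServer(("0.0.0.0", 8080), LoggingProxy).serve_forever()
```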

[2] Logging the IP address or domain of visited web sites is not a feature of all routers. I have three recent model routers – and only one of them has a feature to log traffic in and out of the network.

[3] OpenDNS cannot identify the individual device in your home or office that accessed the websites it logs. The logs include the DNS lookups for all web access that originates through your internet service subscription.

But here are some remarkable features of OpenDNS (other than it being completely free):

a) It speeds up your overall internet experience noticeably. Like Google’s free DNS service, it is more robust and more redundant than the default DNS servers recommended by your internet service provider.

b) It maps every IP address into a domain name. So, when you log in to check your logs and statistics, you don’t need to figure out what the numbers mean. You view a list that makes sense. You can even search for certain words or web sites.

c) It permits you to block websites based on a rich set of about 100 criteria, including violence, adult content, hate speech, etc.

d) It offers graphs of your network access, including overall volume.

Online Privacy: Learn Tor, VPN, VeraCrypt, LastPass

I have a special request. Actually, this is a personal plea to my readers…

Next month, I host two evening privacy workshops near Boston. I could use a teaching assistant to run around and help newbies install software as I present to the class. But what I really need help with—is getting the word out. Please help…

This time, it’s not about Bitcoin or the blockchain. It’s about taking control of your online identity and browsing activities. It’s about privacy and anonymity. It’s about your communications, your personal data and your disk or cloud storage.

All that data belongs to you and not to your ISP, employer, a hacker, the government, or marketers. And it is surprisingly easy to cover your tracks. In fact, with the proper tools, taking control of your identity and privacy is safe, simple and transparent.

In just 3 hours, attendees will learn to install and use Tor, VPN, VeraCrypt and LastPass. They will also get an excellent feel for the function and benefits of a virtual machine.

Anyone attending can choose either Aug 8 (Marlboro) or Aug 22 (Natick). Renting a presentation room in the Natick Library is expensive.

Please help me promote an effective and exciting evening of learning. Get the word out. Check out these announcements: [Sign-up page]   [Meetup page]

Bonus Points: Do you recognize the photo on the left? Be the first to leave a comment with the name of the plastic privacy bubble and the 1960s TV series that featured it. The winner gets two free passes to our privacy workshop that can be transferred to anyone.

Blind Signaling and Response presentation posted

Online services mine personal user data to monetize processes. That’s the business model of “free” services. Even if mining is consensual, policies and promises cannot guarantee privacy. Privacy succumbs to error, administrative malfeasance, hackers, malware and overreaching governments. Is there a technical solution? One that supports monetized data mining and manipulation, but only under predefined conditions, rather than under policies and promises?

Philip Raymond has spent the past 4 years searching for a privacy Holy Grail: The ability to facilitate data mining and manipulation in a way that protects user identity, restricts use to predefined purposes, and insulates results from other uses, even within the service that gathers and manipulates the data.

Prior to this week, there was scant public material on the Blind Signaling mechanism. A PowerPoint overview was accessible only by students at a few universities and the French mathematician who is donating resources to the project.

This week, Université de Montréal posted a live video presentation that steps through the BSR PowerPoint slides. It was filmed at a computer privacy workshop hosted by the university math and encryption departments. Master of Ceremonies, Gilles Brassard, is recognized as an inventor of quantum cryptography, along with his colleague, Charles Bennett. [Brief History of QC]

Blind Signaling and Response  by Philip Raymond…

I am often asked about the algorithm or technical trick that enables data to be decrypted or manipulated—only if the user intent is pure. That’s the whole point here, isn’t it! We claim that a system can be devised that restricts interpretation and use of personal data (and even identities of individual users who generate data), based on the intended use.

The cover pulls back near the end of the video. Unfortunately, I was rushed through key PowerPoint slides, because of poor timing, audience questions, and a lack of discipline. But, I will present my theories directly to your screen, if you are involved in custodial privacy of user data for any online service (Google, Yahoo, Bing, etc) or ISP, or upstream provider, or an Internet “fabric” service (for example, Akamai).

How it Works

The magic draws upon (and forms an offshoot of) Trusted Execution Technology [TXT], a means of attestation and authentication, closely related to security devices called Trusted Platform Modules. In this case, it is the purpose of execution that must be authenticated before data can be interpreted, correlated with users or manipulated.

Blind Signaling and Response is a combination of TXT with a multisig voting trust. If engineers implement a change to the processes through which data is manipulated (for example, within an ad-matching algorithm of Google AdWords), the input data decryption keys will no longer work. When a programming change occurs, the process decryption keys must be regenerated by the voting trust, which is a panel of experts in different countries. They can be the same engineers who work on the project, and of course they work within an employer NDA. But they have a contractual and ethical imperative to the users. (In fact, they are elected by users.) Additionally, their vote is—collectively—beyond the reach of any government. A toy sketch of this quorum-based key release appears after the list below. This results in some very interesting dynamics…

  1. The TXT framework gives a Voting Trust the power to block process alteration. The trust can authenticate a rotating decryption key when changes to an underlying process are submitted for final approval. But, if a prescribed fraction of members believes that user data is at risk of disclosure or manipulation in conflict with the EULA, the privacy statement (and with the expectations of all users), they can withhold the keys needed for in-process decryption. Because proposed changes may contain features and code that are proprietary to the custodian, members of the voting trust are bound by non-disclosure—but their vote and their ethical imperative is to the end user.
  2. Blind Signaling and Response does not interfere with the massively successful Google business model. It continues to rake in revenue for serving up relevant screen real-estate to users, and whatever else Google does to match users with markets. Yet, BSR yields two important benefits:
  • a) It thwarts hackers, internal spies, carelessness, and completely undermines the process of government subpoenas, court orders and National Security Letters. After all, the data is meaningless even to in-house engineers. It is meaningful only when it is being used in the way the end users were promised.
  • b) Such a baked-in process methodology can be demonstrably proved. Doing so can dramatically improve user perception and trust in an online service, especially a large collection of “free” services that amasses personal data on interests, behavior and personal activities. When user trust is strengthened, users are not only more likely to use the services, they are less likely to thwart free services via VPN, mixers or other anonymizers.
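
As promised above, here is a toy sketch of the quorum idea using Shamir secret sharing: a process decryption key is split among n trust members, and any m of them can vote to regenerate it, while fewer than m learn nothing. This illustrates only the key-release half of the scheme; the actual BSR construction pairs it with TXT attestation and is not public.

```python
# Toy sketch of the voting trust: the process decryption key is split among
# n members; any m of them can regenerate it (Shamir secret sharing).
# Illustrative only; not the actual BSR construction.
import random
from functools import reduce

P = 2**127 - 1  # a Mersenne prime; all arithmetic is mod P

def split(secret: int, m: int, n: int):
    """Split `secret` into n shares, any m of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(m - 1)]
    def poly(x):  # evaluate the random degree-(m-1) polynomial at x (Horner)
        return reduce(lambda acc, c: (acc * x + c) % P, reversed(coeffs), 0)
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange-interpolate at x=0 to recover the secret."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = random.randrange(P)               # the in-process decryption key
shares = split(key, m=3, n=5)           # 5 trust members; any 3 form a quorum
assert reconstruct(shares[:3]) == key   # a quorum votes yes: key regenerated
assert reconstruct(shares[1:4]) == key  # any quorum works; 2 shares reveal nothing
```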

Incidentally, the idea to merge a TXT mechanism with a human factor (a geographically distributed voting trust accountable to end users) was first suggested by Steven Sprague (just hours before my presentation in the above video…I had been working on a very different method to achieve blind signaling). In addition to being insightful and lightning quick to absorb, process and advise, Steven is a Trusted Platform expert, director of Wave Systems and CEO of Rivetz. Steven and I were classmates at Cornell University, but we had never met nor heard of each other until our recent affiliation as advisers to The Cryptocurrency Standards Association.

To learn more about Blind Signaling and Response—or to help with the project—use the contact link at the top of this page. Let me know if you watched the Montreal video.

Disclosure: The inventor/presenter publishes this Wild Duck blog under the pen name, “Ellery”.

Is the San Bernardino iPhone Fully Encrypted?

Here is a question that keeps me up at night…

Is the San Bernardino iPhone just locked or is it properly encrypted?

Isn’t full encryption beyond the reach of forensic investigators? So we come to the real question: If critical data on the San Bernardino iPhone is properly encrypted, and if the Islamic terrorist who shot innocent Americans used a good password, then what is it that the FBI thinks that Apple can do to help crack this phone? Doesn’t good encryption thwart forensic analysis, even by the FBI and the maker of the phone?

In the case of Syed Rizwan Farook’s iPhone, the FBI doesn’t know if the shooter used a long and sufficiently unobvious password. They plan to try a rapid-fire dictionary attack and other predictive algorithms to deduce the password. But the content of the iPhone is protected by a closely coupled hardware feature that will disable the phone and even erase memory if it detects multiple attempts with the wrong password. The FBI wants Apple to help them defeat this hardware sentry, so that they can launch a brute force hack, trying thousands of passwords each second. Without Apple’s help, the crack detection hardware could automatically erase incriminating evidence, leaving investigators in the dark.

Mitch Vogel is an Apple expert. As both a former police officer and one who has worked with Apple, he succinctly explains the current standoff between FBI investigators and Apple.


The iPhone that the FBI has is locked with a passcode and encrypted. If it were just locked with a passcode, like most iPhones, then bypassing the passcode and gaining entry into the phone would be straightforward. However, the iPhone in question is encrypted, and this makes things somewhat more complicated. It can only be decrypted with the unique code, and not even Apple has that code or can decrypt it. Unlike what you see in the movies, a really skilled hacker cannot simply break through with enough motivation. Encryption really is that secure, and it really is impossible to break without the passcode.

What the FBI wants to do is brute force the passcode by trying every possible combination until they guess the right one. However, to prevent malicious people from using this exact technique, there is a security feature that erases the iPhone after 10 attempts or locks it for incrementally increasing time periods with each attempt. There is no way for the FBI (or Apple) to know if the feature that erases the iPhone after 10 tries is enabled or not, so they don’t even want to try and risk it.

So the FBI wants Apple to remove that restriction. That is reasonable. They should, if it is possible to do so without undue burden. The FBI should hand over the iPhone to Apple, and Apple should help them to crack it.

However, this isn’t what the court order is asking Apple to do. The FBI wants Apple to create software that disables this security feature on any iPhone and give it to them. Even if it’s possible for this software to exist, it’s not right for the FBI to have it in their possession. They should have to file a court order every single time they use it. The FBI is definitely using this situation as an opportunity to create a precedent and give itself carte blanche to get into any iPhone without due process.

So the answer to your question is that yes, it is that secure, and yes, it’s a ploy by the FBI. Whether it’s actually possible for Apple to help is one question, and whether they should is another. Either way, the FBI should not have that software.
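
Some back-of-envelope arithmetic shows why that erase-after-10 sentry is the whole ballgame. The ~80 ms per guess below is the widely reported hardware key-derivation time on the iPhone; treat it as an illustrative assumption rather than a measured value.

```python
# Why the FBI needs the sentry disabled: worst-case brute-force time,
# assuming the widely reported ~80 ms hardware key derivation per guess.
PER_GUESS_S = 0.080  # illustrative assumption, not a measured value

def worst_case(space: int) -> str:
    hours = space * PER_GUESS_S / 3600
    return f"{space:>14,} passcodes -> {hours:12,.1f} hours"

print(worst_case(10**4))   # 4-digit PIN: minutes, once the 10-try erase
                           # and escalating delays are out of the way
print(worst_case(10**6))   # 6-digit PIN: roughly a day
print(worst_case(36**8))   # 8-char alphanumeric: effectively forever
```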

Ex-NSA Boss says FBI is Wrong on Encryption

What happens if the National Park Service fences off scenic lookout points at the Grand Canyon’s south rim near the head of the Bright Angel trail? Would it prevent the occasional suicide jumper? Not a chance. (The National Park Service tried this in the mid 1980s). People will either gore themselves on fences and posts or they will end their lives on the road in a high speed automobile, putting others at risk. Either way, tourists will be stuck with looking at the North Rim and the Colorado River through prison bars.

Let’s move from analogy to reality. What happens if you jam cell phone signals in tunnels and on bridges? Will it stop a terrorist from remotely detonating a bomb? No. But it will certainly thwart efforts to get rescue and pursuit underway. And what about personal encryption?…

Gadgets and apps are finally building encryption into their wares by default. That is highly beneficial for individuals who want to protect their data. But does a locked-down iPhone, or the technology that businesses use to secure trade secrets and plan strategy among colleagues, enable criminals? Not even close. Yet if the FBI criminalizes encryption, they cripple the entire American economy. After all, the Genie is already out of the lamp.

Bear with me for just one more analogy (I’m still reaching for the right one): Criminalizing kitchen knives will make cooking impossible and the criminals will still have knives.

A Wild Duck has not previously linked to a media article. I am proud of our all-original content and clear statement of opinions. But in this case, I could not have said it better myself. (Actually, I have said this all along: End-to-end encryption is a good thing for government, businesses and individuals alike. It is communications and storage empowerment.)

With this article, you will see that the former NSA director gets it. The current FBI director hasn’t a clue. Ah, well…That’s OK. Some concepts are subtle. For some politicians, an understanding of the practical, personal and sociological implications requires decades of exposure and post-facto reflection.

Memo to FBI director, Jim Comey: Get your head out of the sand and surround yourself with advisers who can explain cause and effect.


From CNN Money (Jan 13, 2016):

Encryption protects everyone’s communications, including terrorists. The FBI director wants to undermine that. The ex-NSA director says that’s a terrible idea.

The FBI director wants the keys to your private conversations on your smartphone to keep terrorists from plotting secret attacks.

But on Tuesday, the former head of the U.S. National Security Agency…

Read the full article at CNN Money
http://money.cnn.com/2016/01/13/technology/nsa-michael-hayden-encryption/

Wild Duck Adds Quora Answers

I have posted as AWildDuck for 4½ years. I have also written for Lifeboat, Yahoo, Engadget & Sophos NakedSecurity. But in just the past 3 months, Quora.com has published more of my answers than all other venues combined. More than half of my Quora posts were solicited by editors or members.

This month, I became Most Viewed Writer for 7 Quora topics, and among the top 50 writers in many others. Writing as “Ellery Davies”, I am a top contributor for Bitcoin, Virtual Currency, Cryptocurrencies, Routers, Local Area Networks, Gravity and Digital Currency. Since these are topics discussed here, my Quora posts are now linked at top right, with the latest post on top.

FOR or AGAINST: Registry of Tainted Bitcoins

The founders and co-chairmen of the Cryptocurrency Standards Association have a friendly—but passionate—disagreement on the need for a public registry that lists tainted bitcoins. Arguments For and Against a registry are presented below. To view the arguments side-by-side, click the image below (same content):

Tainted Bitcoin Registry


Privacy –vs– Anonymity

My friend and business partner, Manny Perez, holds elective office. As New York State politicians go, he is an all-around decent guy! The first thing colleagues and constituents notice about him is that he is ethical, principled, has a backbone, and is compassionate about the causes he believes in.

Manny wears other hats. In one role, he guides an ocean freighter as founder and co-director of CRYPSA, the Cryptocurrency Standards Association. With the possible exceptions of Satoshi Nakamoto and Andreas Antonopoulos, Manny knows more about Bitcoin than anyone.

But Manny and I differ on the role of privacy and anonymity in financial dealings. While he is a privacy advocate, Manny sees anonymity—and especially civilian tools of anonymity—as a separate and potentially illegal concept. He is uneasy about even discussing the use of intentionally architected anonymity in any financial or communications network. He fears that our phone conversation may be parsed (I agree) and trigger a human review (I agree), and that it could be construed as evidence of promoting illegal technology. This is where we differ… I agree, but I don’t care how anyone who is not party to a private conversation construes it! I see anonymity as either synonymous with privacy or at least a constituent component. You can’t have one without the other.

Manny was raised in Venezuela, where he was schooled and held his first jobs, in the energy industry. He acknowledges that experience with a repressive and graft-prone government led to his belief in a more open approach: free markets coupled with a democratic government.

Perhaps this is a key source of our different viewpoints. Manny comes from a repressive land and has come to respect the rules-based structure within his comfort zones of banking, energy and government. He is a certified AML (anti-money laundering) expert and believes strongly in other financial oversight rules, like KYC (Know Your Customer) and RICO (the Racketeer Influenced and Corrupt Organizations Act).

Because Manny is appreciative of the opportunity and benefits conveyed by his adoptive country, he may overlook a fact that whispers in the minds of other privacy advocates: that we may one day need protection from our own government. After all, who but a conspiracy nut or white supremacist could imagine the US government suppressing its populace? Sure, they engage in a little domestic spying—but if you have nothing to hide, why hide at all?!

This week, Manny posted an open letter to the cryptocurrency community. His organization, CRYPSA, sits at the intersection of that community with law, technology and politics. His letter addresses privacy, anonymity and transparency, but the title is “How can you report a stolen bitcoin?” For me, the issue is a non sequitur. You needn’t, you shouldn’t, the reporting superstructure shouldn’t exist, and in a well designed system, you can’t.* More to the point, the imposition of any centralized reporting or recording structure would violate the principles of a decentralized, p2p currency.

To be fair, Manny is not a sheep, blindly falling into line. He is shrewd, independent and very bright. But in response to my exaggerated and one-dimensional Manny, I have assembled some thoughts…

1. Privacy, Anonymity and Crime

The debate about Bitcoin serving as a laundering mechanism for cyber-criminals is a red herring. Bitcoin does not significantly advance the art of obfuscation or anonymity. There have long been digital e-golds and stored-value debit cards that offer immunity from tracking. They are just as easy to use over the Internet.

Moreover, it’s common for crime or vice to drive the early adoption of new technology, especially technology that ushers in a paradigm shift. The problem with linking Bitcoin to crime is that it drives a related debate on transparency, forensics and government oversight. This is a bad association. Transparency should be exclusively elective, being triggered only after a transaction—if and when one party seeks to prove that a payment was made or has a need to discuss a contractual term.

On the other hand, a good mechanism should render forensic analysis a futile effort if attempted by a 3rd party without consent of the parties to a transaction. We should always resist the temptation to build a “snitch” into our own tools. Such designs ultimately defeat their own purpose. They do not help to control crime—Rather, they encourage an invasive government with its fingers in too many people’s private affairs.

CRYPSA is building tools that allow Bitcoin users to ensure that both parties can uncover a transaction completely, but only if a party to the transaction wishes to do so! For example, a parent making a tuition payment to a college can prove the date, amount and courses associated with that payment; a trucker or salesman with a daily expense account can demonstrate to his employer that a purchase was associated with food and lodging and not with souvenirs. And, of course, a taxpayer under audit can demonstrate whatever he wishes about each receipt or payment.

But in every case, the transaction is opaque (and if properly secured, it is completely anonymous) until the sender or recipient chooses to expose details to scrutiny. I will never accept that anonymity is evil or evidence of illicit intent. Privacy is a basic tenet of a democracy and a government responsible to its citizens. CRYPSA develops tools of transparency, because commerce, businesses and consumers often need to invoke transparency—and not because any entity demands it of them.

We are not required to place our telephone conversations on a public server for future analysis (even if our government saves the metadata or the complete conversation to its clandestine servers). Likewise, we should not expose our transactions to interlopers, no matter their interest or authority. The data should be private until the data generator decides to make it public.

2. Reporting a Transaction (Why not catalog tainted coins?)

Manny also wants to aid in the serialization and cataloging of tainted funds, much as governments do with the mass movement of cash into and out of the banking network. This stems from an earnest desire to help citizens, not to spy. For example, it seems reasonable that a mechanism to report the theft of currency should be embedded into Bitcoin technology. Perhaps the stolen funds can be more easily identified if the digital coins themselves (or their transaction descendants) are fingered as rogue.

The desire to imbue government with the ability to trace the movement of wealth or corporate assets is a natural one. It is an outgrowth of outdated monetary controls and our comfort with centralized, trust-endowed institutions. In fact, it is not even necessary for levying or enforcing taxes.

Look at it this way…

  1. Bitcoin transactions are irreversible without the identification and cooperation of the original payee (the one who received funds). Of course, identification is not a requisite for making a transaction, any more than identification is required for a cash purchase at a restaurant or a newsstand.
  2. There are all sorts of benefits to both anonymous transactions and secure, irrevocable transactions—or at least those that cannot be reversed without the consent of the payee. This is one of the key reasons that Bitcoin is taking off despite the start-up fluctuations in exchange rate.
  3. Regarding the concern that senders occasionally wish to reverse a transaction (it was mistaken, unauthorized, or buyer’s remorse), the effort to report, reverse or rescind a transaction is very definitely barking up the wrong tree!

The solution to improper transactions is actually quite simple.

a) Unauthorized Transactions

Harden the system and educate users. Unauthorized transactions can be prevented BEFORE they happen. Even in the worst case, your money will be safer than paper bills in your back pocket, or even than an account balance at your local bank.

b) Buyer’s Remorse and Mistaken transactions

Buyer beware. Think before you reach for your wallet! Think about what you are buying, from whom, and how you came to know them. And here is something else to think about (issues that are being addressed by CRYPSA)…

i.   Do you trust that the product will be shipped?
ii.  Did you bind your purchase to verifiable terms or conditions?
iii. Is a third party guarantor involved (like Amazon or eBay)?

All of these things are available to Bitcoin buyers, if they only educate themselves. In conclusion, “reporting” transactions that you wish to rescind is a red herring. It goes against a key tenet of cryptocurrency. It is certainly possible that a distributed reversal or revocation mechanism could be created and implemented. But if this happens, users will migrate to another platform (call it Bitcoin 2.0).

You cannot dictate oversight, rescission or rules to that which has come about from organic tenacity. Instead, we should focus on implementing tools that help buyers and businesses identify sellers who agree to these extensions up front. This, again, is what CRYPSA is doing. It is championing tools that link a transaction to business standards and to user-selective transparency. That is, a transaction is transparent if—and only if—the parties to a transaction agree to play by these rules, and if one of them decides to trigger the transparency. For all other p2p transactions, there is no plan to tame the Wild West. It is what it is.

* When I say that you should not report a stolen coin, I really mean that you should not run to the authorities, because there is nothing that they can do. But this is not completely accurate.

1. There are mechanisms that can announce your theft back into a web of trust. Such a mechanism is at the heart of the certificate revocation method used by the encryption tool PGP (Pretty Good Privacy). CRYPSA plans to design a similar user-reporting mechanism to make the cryptocurrency community safer.

2. Authorities should still be alerted to theft or misuse of assets. They can still investigate a crime scene, and follow a money trail in the same way that they do with cash transactions, embezzlement or property theft. They can search for motive and opportunity. They have tools and resources and they are professionals at recovering assets.


 

Disclosure: Just like Manny, I am also a director and acting co-chairman of CRYPSA (the Cryptocurrency Standards Association). This post reflects my personal opinion on the issue of “reporting” unintended, unauthorized or remorseful transactions. I do not speak for other officers or members.

Canary Watch deduces federal gag orders

The US government and its courts routinely demand personal user information from email services, banks, phone companies and other online, telecommunications or financial services. These demands compel services to disclose details about email, phone calls and financial transactions. They also gain access to troves of so-called “metadata”, which can be just as personal as a user’s phone calls. It includes information about user relationships, locations, browser configuration and even search history. Many of these demands, such as the infamous National Security Letter, stipulate that the service may not divulge that it was asked for the information in the first place. In fact, it can’t say anything about a de facto investigation!…

My friend, Michael, occasionally points out that skirting the law with Wink-Wink-Nod-Nod is still likely breaking the law. His point, of course, is that law is often based on intent. So with this in mind, what do you think about this clever reporting service?…

A service called Canary Watch lets online services like Verizon or Google send a continuous stream of data that repeatedly states “We are not currently compelled to turn over any data on our users.” Naturally, if the service suddenly refrains from sending the statement, a reasonable person can infer that the government is demanding personal information with the usual gag order attached.

If you extrapolate this technique, a service like Verizon could continuously broadcast a massive list of usernames (or shorter hash codes representing individual users). These are the users who are not currently being investigated. Any change to the data stream would allow a 3rd party to infer and alert users who are the subject of an investigation.
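
The inference is just a set difference. Here is a sketch, with hypothetical user-hash feeds standing in for the broadcast:

```python
# Sketch of the extrapolated canary: diff today's broadcast of
# "not under investigation" user hashes against yesterday's. Any hash
# that vanishes implies a gag-ordered demand. The feeds are hypothetical.
def newly_silenced(yesterday: set[str], today: set[str]) -> set[str]:
    """Hashes present yesterday but missing today."""
    return yesterday - today

yesterday = {"a1f3", "b274", "c9d0", "d44e"}  # toy canary list
today     = {"a1f3", "c9d0", "d44e"}

print(newly_silenced(yesterday, today))        # -> {'b274'}
```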

With the launch of this service, Canary Watch wins the 2015 Wild Duck Privacy Award. This is the type of cleverness that we like! Why? Because it enhances transparency and helps everyone to control their own privacy, if they choose to do so.

Wild Duck Privacy Award

Further reading: Activists create website to track & reveal NSA, FBI info requests

Erase Online Infamy: Lies, slander, binging, sexting

I wrote this article under contract to the leading European security magazine and Blog, at which I typically write under my real name. (Ellery is a pen name).

Upon submission, my editor haggled over the usual issues of regional euphemisms (not allowed), eclectic metaphors (encouraged) and brevity (my submissions tend to exceed the word limit by 3 or 4x). But this time, she also asked me to cut a section that I feel is critical to the overall thrust. That section is titled “Create Digital Chaff”.

I am neither a stakeholder nor an editor at that magazine. Their editors have every right to set the terms and tone of anything they publish. But sensing my qualms over personal ethics and reputation, my editor released the article from contract and suggested that I publish in a venue with an edgy approach to privacy. I considered farming it out to Wired, CNet or PC Magazine, but it was written at a level and style intended for a very different audience. And so, it appears here, in my own Blog. The controversial section is intact and in red, below. Of course, Wild Ducks will see nothing controversial in a perfectly logical recommendation.

——————————-

The web is filled with tutorials on how to block tracking, hide purchases and credit history, even how to shift your identity deep undercover.

But what about search results linked to your name, organization or your past? What can be done?

Legal remedies are rarely effective. The Internet is untamed. Data spreads like wildfire and it’s difficult to separate opinion from slander. But you can counter undesirable content.

Catalogue the Irritants

Start by listing and prioritizing your pain. Search your name in all the ways you are known. Try several search engines, not just Google. Check image results and Usenet (news groups).

Record disparaging search results in 7 columns:

  • Search terms: the query terms that yield the offending page
  • URL: the address of the unflattering material
  • Severity: damage to your reputation
  • Owner or Author: contact info for a traceable party
  • Role: author, site admin, or hosting service?
  • Inbound links: search on “Links:{Page_URL}”
  • Disposition: left msg, success, failure, etc.

Sort pages in descending order of severity. As you resolve issues, reduce severity to zero, but save entries for follow-up. With just a few offensive pages, it may take a few hours to perform the tasks described.
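
If a spreadsheet feels heavy, the same seven-column worklist fits in a few lines of code. A minimal sketch:

```python
# Minimal sketch of the seven-column worklist described above,
# sorted worst-first; set severity to 0 as items resolve.
from dataclasses import dataclass

@dataclass
class Irritant:
    search_terms: str        # terms that yield the offending page
    url: str                 # address of the unflattering material
    severity: int            # damage to reputation; 0 when resolved
    contact: str = ""        # owner or author, if traceable
    role: str = ""           # author, site admin, or hosting service
    inbound_links: str = ""  # results of a "Links:{Page_URL}" search
    disposition: str = ""    # left msg, success, failure, etc.

worklist = [
    Irritant("my name + forum", "http://example.com/rant", severity=8),
    Irritant("my name + photo", "http://example.net/pic", severity=3),
]
worklist.sort(key=lambda row: row.severity, reverse=True)  # worst first
```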

The Process

Most posts boil down to an individual author rather than a faceless organization. You can appeal, bargain, redirect, bury or discredit a source, sometimes employing several strategies. With reputation at stake, all is fair.

Removing or Correcting Content

First, determine whether you are more likely to influence the web developer, site host, or author who created the unflattering material (start with the author, if possible).

Ascertain if the author is likely to influence readers who matter to you. After all, without credibility, rants cannot do much damage.

If the source is credible, appeal directly. In some cases, remarks about you may be immaterial to his point. If it is impossible to find the source, or if there is no meeting of minds, contact the site owner or hosting service. This is daunting, but not impossible. GoDaddy, the largest hosting site[1], often takes down entire sites in response to complaints.

Try pleading, bargaining or swapping favors. (But never pay! Extortion is best handled by courts). Negotiate these actions:

  • Change names, keywords and metatags, but avoid taking down the page for 2 weeks (give search engines time to re-index the corrected page).
  • Point domain or URL to a different site
  • Post an apology or correction at searchable pages that contain offending material. (Avoid describing the slander. Use different phrases).
  • Add chaff (below). It reduces discovery.

Take Down the Search Cache

Check the page cache of each search result (click the arrow to the right of Google results). File takedown requests, especially if the material is obscene or you can argue that it is untruthful or slanderous.

Check referring sites. They may excerpt or echo defamation. In the UK, freedom of expression is becoming a gray area. Nevertheless, in practice, the Internet gives everyone a soap box. So our next technique employs misdirection and ‘noise’ rather than confrontation.

Create Digital Chaff

To protect against missiles, some aircraft eject ‘chaff’. The burning strips of foil lure guided munitions by presenting a target that is more attractive than the plane. Likewise, you can deploy digital “chaff” by planting information that overwhelms search results, leading viewers away from defamatory links via redirection or confusion. Your goal: change the search result ranking.


US Air Force jet ejects burning chaff

Consider photos or events that you find untruthful or embarrassing. Ask friends with popular web pages to add content about the photo, associating fictitious names (not yours). Conversely, get them to post your name in articles that distance you from the activity in ways that seemingly make it impossible for you to fit the offensive descriptions.

Use your imagination, but don’t make up lies. Eventually, they catch up with you. Instead, fill the web with news, trivia, reviews, and all things positive. Create content that pulls the focus of searches that previously led to pages you wish to suppress.

Finally, avoid SEO tricks to lure search engines,[2] such as cross-links, robot-generated hits, or invisible text and metadata not directly related to content. Search engines detect rigging. Manipulation or deceit will demote your page rank, negating hours of hard work.

Looking forward, consider ways to avoid letting your own data lead to future embarrassment…

Social Media

Facebook isn’t the only social media site. Social media includes any site with ‘walls’, feeds or link sharing. Likewise, blogs and blog comments create a beacon for threats.

Social media can ruin reputations as surely as it shares photos and connects friends. Searchable from within and outside, they spread online activities like vines. Your wall and timeline will easily outlive you!

Learn privacy controls. Start by enabling all restrictions, and then isolate friends and colleagues into circles. This partitions friends and colleagues into venues associated with your various hats. Of course, friends can be in several of your circles, but circles give you the ability to restrict your wall or photos to individuals likely to appreciate the context and less likely to amplify your accidents and oversights.

Faced with privacy concerns, Facebook recently added granular controls and Google unified privacy policies across services. Most sites offer ways to enable and disable features or even temporarily suspend tracking. If yours doesn’t, find a more reputable service.

Scrutinize photo tagging options. Facebook users can block tagging of photos and remove their name from photos that others post. (Don’t take too much comfort. They also use facial recognition to encourage other users to ID you. In the universe of photos that include you, only a fraction was posted by you.)

Clean up Cloud Services

Do you use iCloud, Google Drive, or SkyDrive? How about Dropbox, SugarSync, or backup services Carbonite or Mozy?

Cloud services are great collaboration tools. They backup folders and stream personal media. Like social networks, they present another leaky conduit for your private life.

Check that sync folders are encrypted or unlikely to convey personal or unflattering material. Review shared links: Never grant access to everyone or let anyone access everything. Administer your friends, family and colleagues on a need-to-know basis. Your data needs a defense perimeter.

Create an Alter Ego

Ellery isn’t my real name. It is the alias under which I publish AWildDuck. But the fact that I acknowledge another identity, and occasionally refer to my career, geographic location and age, demonstrates that I am either very foolish or not making a serious effort to prevent discovery.

Archival Sites

Unfortunate news for anyone trying to erase history: The Wayback Machine at www.archive.org takes snapshots of the entire Internet every day. Visitors click a calendar to travel back in time and view a web site or page as it once appeared.

Although content does not appear in search results, the comments you posted about the boss’ daughter are viewable to any visitor, no login required! Advice concerning archive sites: “Get past it!” They are not likely a threat, but they remind us that in the Internet Age, history cannot be erased.

A Dose of Prevention

Putting anything online, even email, lets a cat out of a bag. You can corral it and hope that it doesn’t eat the neighbor’s daisies, but you cannot get it back into the bag. That’s why we focus on disguise, chaff and misdirection. If the neighbors are looking at your shiny car, and the cat looks more like a dog belonging to someone else, it is less likely to draw attention.

As you hunt authors of unflattering detritus and implement containment, make a resolution to watch what you say online. Online content is never private. Cameras and recorders are everywhere and they aren’t operated by friends. Your trail will easily outlive you.

_____________

[1] In April 2005, Go Daddy (aka Wild West Domains) surpassed Network Solutions as the largest ICANN-accredited registrar on the Internet [domain names registered].
Source: web-hosting-top.com. Stats as of 4/27/2005, and up to the date of this posting.

[2] SEO = Search Engine Optimization

Update: NSA surveillance, Bitcoin, cloud storage

Just last month, Edward Snowden was honored with our first annual Wild Duck Privacy Award (we hope that he considers it an honor). The vigorous debate ignited by his revelations extends to the US Congress, which just voted on a defense spending bill to defund a massive NSA domestic spying program at the center of the controversy.

Although the bill was narrowly defeated, it is clear that Snowden has played a critical role in deliberative policy legislation at the highest level of a representative government. Even if this is the only fact in his defense, why then, we wonder, is Snowden a fugitive who must fear for his life and his freedom?

Snowden saw an injustice and acted to right a wrong. His error was to rely solely on his own judgment and take matters into his own hands, without deliberative process or oversight. But since it is the lack of these very same protective mechanisms for which he engaged in conscientious objection, the ethical dilemma presented a Catch-22.

—————————————————————————————

Regular readers know that we love Bitcoin. We covered the stateless currency in 2011 and 2013. Just as the internet decentralizes publishing and influence peddling, some day soon, Bitcoin will decentralize world monetary systems by obliterating the role of governments and banks in the control of money flow and savings. Why? Because math is more trustworthy than financial institutions and geopolitics. You needn’t be an anarchist to appreciate the benefits of a currency that is immune from political influence, inflation, and the potential for manipulation.

Now comes word of a Texas man charged with running a $60 million Bitcoin Ponzi scheme. The story is notable simply because it is the first skullduggery aimed at the virtual currency, other than internet hacking or other attacks on the still fragile infrastructure. Should we worry? Absolutely not. This story has little to do with Bitcoin and falls squarely under the category of Caveat Emptor. Widows and orphans beware!

—————————————————————————————

In February, we wrote about Bitcasa, the upstart cloud storage service with an edge over diversified competitors and other entrenched players: Dropbox, Google Drive, Microsoft SkyDrive, SugarSync, Apple iCloud, etc. WildDucks learned how to get truly unlimited cloud storage for just $49. Now they are launching unlimited cloud storage in Europe, starting at €60 per year.

Bitcasa still captures our attention and sets our pulse racing. While we are disappointed that it lacks the RDDC architecture that will eventually rule the roost, their Infinite Drive technology is a barn burner. More than ever, it is clear that Bitcasa is likely to displace or be acquired by their better known brethren.

—————————————————————————————

We also wrote about Dropbox, but that posting wasn’t really a review. It was our plea to CEO Drew Houston (shown at left) to adopt a fully distributed, reverse cloud architecture. That effort failed, but Dropbox is still our favorite of the entrenched players. More suited to pin-stripe corporate adoption, but in our opinion, not quite a Bitcasa.

In a previous article, we introduced lesser-known cloud startups with clever and unique architecture that yields subtle benefits: SpaceMonkey, Symform and Digital Lifeboat. That last one was in need of a life preserver. It flopped. But the IP that they created in the area of distributed p2p storage management will live on. We will all benefit.

—————————————————————————————

Finally, in May we ran down the benefits of cloud music players and the likely future of streaming your own personal library of movies. Now, Jeff Somogyi at Dealnews has created a nifty flowchart to help you decide among many vendors in a crowded market.

Of course, a discussion of Bitcasa, Dropbox, SpaceMonkey and RDDC wasn’t our first discussion of cloud storage. Shortly after AWildDuck launched back in 2011, we applauded PogoPlug and their ilk (TonidoPlug, DreamPlug, Shiva, and other genres of consumer-grade network-attached storage with internet access). They let you create personal cloud services and even stream media from a drive or RAID storage device attached to your home router.

 

Got a spare million? Groundbreaking antispam patent at auction

The Sender Bond patent that made waves just a few years ago is being auctioned on eBay. To one degree or another, it has been baked into various products: Ironport’s Bonded Sender Program, Goodmail, Cruelmail and Microsoft (who remembers the Penny Black project?). A pure implementation was rolled out by Vanquish Labs. They had a good 10-year run, 4 products, and even won PC Magazine Editors’ Choice (the publication that really counts).

Now Vanquish users have been transferred to Google services, and the clever IP that started it all is up for auction; the starting bid is $160,000. In the right hands, it’s worth north of $5 million. And that, dear reader, leaves plenty of room for the buyer to profit.

At the World Economic Forum in Davos, Switzerland, Bill Gates predicted that clever economics would ultimately defeat spam.

Economics is the holy grail, he promised. It will deter unwanted telemarketing calls, junk faxes, and of course, email offers for enhanced male anatomy, Nigerian dictators, hot teen coeds, and Rolex watches. (Perhaps if you buy the teen coeds, the anatomical enhancement is unnecessary).

Bill Gates was right, but a bit early. Google was in its infancy and Mark Zuckerberg was still in high school. The tiny Massachusetts start-up was ill equipped to boil the ocean. The world wasn’t ready for something that was difficult to explain, even if it was eminently transparent, practical and effective.

The “Sender Bond” mechanism is simple, natural and empowering. It won’t block unsolicited contact from the strangers that you want to greet: a new customer, a long-lost classmate, that refund you thought was lost forever. In fact, for thousands of satisfied Vanquish Labs users, this patent, and the other email security technology rolled into their IP divestiture, works like a champ.
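
For readers who have never seen the economics in action, here is a hedged sketch of the general bond idea behind schemes like Sender Bond and Penny Black. The numbers and flow are illustrative; the patented mechanism differs in its details.

```python
# Hedged sketch of bond-based spam economics: a stranger's message escrows
# a small bond; the recipient can refund it, or keep/burn it for spam.
# Illustrative only; the patented mechanism differs in its details.
from dataclasses import dataclass

BOND = 0.05  # dollars escrowed per unsolicited message (illustrative)

@dataclass
class Message:
    sender: str
    body: str
    bond: float = 0.0

def deliver(msg: Message, contacts: set, balances: dict) -> bool:
    """Known senders pass freely; strangers must escrow a bond."""
    if msg.sender in contacts:
        return True
    if msg.bond >= BOND and balances.get(msg.sender, 0.0) >= msg.bond:
        balances[msg.sender] -= msg.bond  # held until the recipient reacts
        return True
    return False                           # no bond, no delivery

def accept(msg: Message, balances: dict) -> None:
    balances[msg.sender] += msg.bond       # welcome stranger: bond refunded

# A spammer sending millions of messages bleeds a nickel per recipient who
# flags him; a legitimate stranger gets every bond back. Economics wins.
```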

But Vanquish users were serviced by a data center architecture designed to showcase technology, not to service a substantial number of clients and billions of messages. As the beta test ended, early adopters were added to the showcase server. Grappling with an impressive string of awards, the company should have virtualized and followed the SaaS industry into the cloud. But lacking resources, Vanquish attempted to scale their demo architecture. It worked for a while, but ultimately, that strategy delayed their entry into the era of cloud architecture. Because they could not scale, they failed to exploit and market clever antispam technology. In the hands of an established cloud security provider or a technology licensing trust, this bird has wings. Their loss will be an opportunity for someone else.

I’m not going to pontificate on this one. In fact, this is an unofficial article. I will be taking it down next month. As someone deeply involved with the technology, this one hits too close to home. It’s like seeing your son leave for war, or your daughter get married and fly the coop.

Feedback on this posting is closed. Feel free to comment on other posts or use the contact form to contact the author.

Slippery Slope: Japan seeks to ban Tor

The Electronic Frontier Foundation (EFF) often finds itself on the opposite side of legislation that is initiated or supported by media rights owners. In fact, the Recording Industry Association of America (RIAA) and its Hollywood counterpart, the Motion Picture Association of America (MPAA), have thwarted every promising technology since the dawn of the photocopier and the 8-track tape cartridge.

We could list delayed technologies or those that were threatened with a use tax, such as the VCR, writable CDs, file sharing networks, and DVD backup software. But the funny thing about grumbling rights owners is that they are, well, right. Sort of. After all, anyone who believes that it is OK to download a movie with BitTorrent or trade music with friends (while maintaining access in their own playlist) has a weak argument. They certainly can’t claim the moral high ground, unless they are the only person on earth who limits file copies to backups and playlists in strict conformity with the exceptions allowed under the DMCA.

But this week, it isn’t the RIAA or MPAA that seeks to squash the natural evolution of the Internet. This time, it is the government of Japan. Japan?!!

First, some background…

In July 2001, Napster was forced to shut down its servers by order of the US Ninth Circuit court. Despite legitimate uses for the service, the court agreed with a district court and the RIAA that Napster encouraged and facilitated intellectual property theft—mostly music in that era.

The decision that halted Napster was directed at a specific company. Of course, it de-legitimized other file swapping services. (Who remembers Limewire, Kazaa, Bearshare or WinMX?) But it was never intended to condemn the underlying technology. In fact, Napster was a pioneer in the emergence of ad hoc, peer-to-peer networks. It is the precursor of today’s BitTorrent, which merges distributed p2p storage with swarm technology to achieve phenomenal download speed and a robust, nearline storage medium. In fact, over the next few years, AWildDuck predicts that the big cloud services will migrate to a distributed p2p architecture.

Akamai has long used the power of distributed networks for storing data “at the fringe”, a technique that serves up web pages rapidly and conserves network resources. But a similar network, grown organically and distributed among the masses, strikes fear in the hearts of anyone who believes that power stems from identification and control.

In 2000 and 2001, p2p networks were perceived as a threat, because they facilitated the sharing of files that might be legally held by only a few peers, or by none at all. Today, p2p networks are fundamental to the distribution of files and updates, and they are at the very core of the Internet.


Tor facilitates privacy. User identification is by choice.

Peer-to-peer networks are no more a tool of crime than telephones. Although both can be used for illegal purposes, no reasonable person advocates banning phones, and no one who understands the evolution and benefit of modern networks would advocate the regulation of peer-to-peer networks. (Please don’t add guns to this list. That issue has completely different considerations at play. With guns, there is a reasonable debate about widespread ownership, because few people use it as a tool for everyday activities and because safe use requires training).

But p2p networks are evolving. Their robust, distributed nature is enhanced by distributing the tables that point to files. In newer models, users control permissions as a membership criterion, rather than based on the individual source or content of files. For this reason, anonymity is a natural byproduct of technology refinement.

Consider the individual users of a p2p network. They are nodes in a massive and geographically distributed storage network. As both a source of data and a repository for fragments from other data originators, they have no way to determine what is being stored on their drives or who created the data. Not knowing the packet details is a good thing for all parties—and not just to confound forensic analysis. It is a good thing every which way you evaluate a distributed network.*
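
A tiny sketch makes the point: once the originator encrypts and fragments a file, a storage node holds only opaque blobs. The XOR pad below stands in for real encryption; it is illustrative only.

```python
# Why a p2p storage node can't know what it holds: data is encrypted by
# the originator and scattered as opaque fragments. The XOR one-time pad
# stands in for real encryption; illustrative only.
import hashlib
import os

def fragment(data: bytes, size: int = 32):
    """Encrypt, then split into fragments keyed by an opaque content ID."""
    pad = os.urandom(len(data))                     # key stays with originator
    ciphertext = bytes(a ^ b for a, b in zip(data, pad))
    return [(hashlib.sha256(ciphertext[i:i + size]).hexdigest()[:8],
             ciphertext[i:i + size])
            for i in range(0, len(ciphertext), size)]

for frag_id, blob in fragment(b"only the originator holds the key material"):
    print(frag_id, blob.hex())   # a hosting node sees nothing but noise
```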

The RE-Criminalization of P2P Networks


Japan’s National Police Agency (NPA) is much like America’s FBI. As a federal agency equipped with investigators, SWAT teams and forensic labs, their jurisdiction supersedes local authorities in criminal matters of national concern.

This weekend, the NPA became the first national agency in any country to call for a ban on anonymous web surfing. It wants Internet service providers to monitor and block attempts by their subscribers to use proxy servers that relay internet traffic through remote servers, thereby anonymizing web traffic and even emboldening users to browse areas of the Internet that they might otherwise avoid.

But the Japanese NPA has a naïve and immature view of humanity. The use of proxy servers is not only fundamental to many legitimate purposes, many netizens consider web-surfing privacy to be a legitimate objective on its own merits. We could list a dozen non-controversial reasons for web surfing privacy—but if we had to do that, you probably wouldn’t be reading this page.

* The statement that anonymity and encryption are a good thing for distributed, p2p networks—not just for data thieves, but for all legal and business purposes—may not be self-evident to all readers. It will be the topic of a future discussion in this Blog.

Ellery Davies is an author, privacy consultant and cloud storage architect. He is also editor at AWildDuck.com.

Google switches Privacy honchos (Opportunity knocks)

After three years on the job, Google’s first-ever Director of Privacy is stepping down. Alma Whitten rode out a tumultuous period that saw several high-profile privacy snafus, not the least of which has become known as the WiFi drive-by data grab.

A changing of the guard in the office of chief privacy honcho presents a rare opportunity for Google. One wonders if Lawrence You will seize the moment…

Google has a privacy option that could propel them onto the moral high ground. A nascent science offers a way for Mr. You to radically demonstrate indisputable proof of respect for users. Unlike other potential announcements, policies or technologies, this one protects user privacy completely—while continuing to profitably direct traffic from marketing partners. In fact, it won’t interfere with revenues across any service, including Search, Docs, and all aspects of Android and Chrome.

Lawrence You steps in as Privacy Director: reason to keep smiling.

What could possibly anonymize individual user data while preserving individual benefits? I refer to Blind Signaling and Response. It is new enough that no major services incorporate the technique. That’s not surprising, because the math is still being worked out with help from a few universities. But with the resources and clout of the Internet juggernaut, Google needn’t wait until upstarts incorporate provable privacy and respect into every packet of data that flies across the ether.

What is Blind Signaling and Response? You’re Google! Google it and go to the source. You once brought the inventor to Mountain View. My 2¢: Get the project in house and grow it like a weed. When PMs and directors like Brad Bender, Anne Toth, Stephan Somogyi and Andrew Swerdlow get religion, the tailwind will grease a path toward rollout—and well-deserved bragging rights.

A bit of irony: VentureBeat says that Whitten is leaving the “hardest job in the world” and that Lawrence You will lose his smile as he takes the reins. Nonsense! With a technical solution to privacy, the world’s hardest job will transform into one of education and taking credit. Ultimately, it will be the prestige job that commands respect.

Perhaps just as important, Blind Signaling and Response will gut Bing’s Scroogled campaign like a stake through the heart. With Google pioneering user respect, the Scroogled campaign will turn from clever FUD into a colossal waste of cash.

Disclosure: Ellery Davies is very keen on the potential for BSR and wants very much to pull his favorite service provider along for the ride.

Chilling developments in domestic spying

The US government is obsessed with your phone calls, email, web surfing and a log of everywhere that you travel. The obsession has become so intense over the past few years that they have had to recast the definition of data gathering. After all, warrantless wiretapping and domestic spying are illegal. And so, once exposed, Uncle Sam now claims that massive public eavesdropping, archiving and data mining (including building cross-domain portfolios on every citizen) do not count as “spying”, because a human analyst has not yet listened to a particular conversation. The way your government spins it, as long as no one has listened in on a private, domestic conversation, it can gather yottabytes of personal and business data without any judicial oversight.

The increasing pace of Big Brother’s appetite for wads of personal data is–at the very least–alarming, and more to the point, unlikely to end in anything short of a police state. To learn about some of these events, check our recent articles on the topic of Uncle Sam’s proclivity for data gathering.

Whistleblower William Binney explains a secret NSA program to spy on U.S. citizens without warrants

I’m Not Doing Anything Illegal. Why Should I Care?

Here at AWildDuck, we frequently discuss privacy, government snooping, and projects that incorporate or draw upon warrantless interception. In the USA alone, there are dozens of projects–past and present–with a specific mandate to violate the Foreign Intelligence Surveillance Act. How does the American government get away with it? In the past decade, as leaks began to surface, it tried to redefine the meaning of domestic surveillance to exclude sweeping acts of domestic surveillance. The Bush-era interpretation of USSID 18 is so farcical that it can be debunked by an elementary school pupil. As the ruse unraveled, the wholesale gathering of data on every citizen in America was ‘legitimized’ by coupling the Patriot Act with general amnesty for past acts of warrantless wiretapping. Dick Cheney invoked the specter of 9/11 and the urgent need to protect Americans from terrorism as justification for creating a more thorough and sweeping police mechanism than any totalitarian regime in history.

The programs go by many names, each with a potential to upend a democracy: Stellar Wind, The Patriot Act, TIA, Carnivore, Echelon, Shamrock, ThinThread, Trailblazer, Turbulence, Swift, and MINARET. Other programs thwart the use of privacy tools by businesses and citizens, such as Clipper Chip, Key Escrow and the classification of any secure web browsing as a munition that must be licensed and cannot be exported. The list goes on and on…

A myriad of dangers arise when governments ‘of-the-people and by-the-people’ engage in domestic spying, even if the motive is noble. Off the bat, I can think of four:

  • Justifications are ethereal. They are based on transient goals and principles. Even if motives are common to all constituents at the time a project is rolled out, the scope of data archived and the access scenarios inevitably change as personnel and administrations change.
  • Complex and costly programs are self-perpetuating by nature. After all, no one wants to waste billions of taxpayer dollars. Once deployed, a massive surveillance mechanism is very difficult to dismantle or thwart.
  • There is convincing research to suggest that domestic surveillance could aid terrorists, rather than protect civilians.
  • Perhaps most chilling is the insidious and desensitizing effect of such programs. Once it becomes acceptable for a government to spy on its citizens, it is a surprisingly small step for neighbors, co-workers and your own children to become patriotic partners in surveillance and reporting. After all, if your government has the right to preemptively look for dirt in your movements, Internet surfing, phone calls, cash transactions and sexual dalliances, then your neighbor can take refuge in the positive light of assisting law enforcement as they transmit an observation about an unusual house guest or the magazines you subscribe to.

What’s New in Domestic Spying?

This is a landmark week for anyone who values privacy and who understands that domestic spying is not a necessary tool of homeland security. This week, we are learning that US surveillance of its citizens is skyrocketing, and a court case is about to either validate the practice or slap a metaphorical wrist. Either way, each event brings us closer to the world depicted in Person of Interest. For now, I am citing breaking news. We’ll flesh out the details soon.



$1 Billion kick-starts Facial Recognition of Everyone

For access to a home or automobile, most people use a key. Access to accounts or transactions on the Internet usually requires a password. In the language of security specialists, these authentication schemes are referred to as using something that you have (a key) or something that you know (a password).

In some industries, a third method of identification is becoming more common: Using something that you are. This area of security and access is called ‘biometrics’. The word is derived from bio = body or biology and metrics = measurement.

The data center that houses computer servers for AWildDuck also houses valuable equipment and data for other organizations. When I visit to install a new router or tinker with my servers, I must first pass through a door that unlocks in the presence of my fob (a small radio-frequency ID tag on my key chain). But before I can get to the equipment cage that houses my servers, I must also identify myself by placing the palm of my hand on a scanner and speaking a code word into a microphone. I don’t know if my voice is identified as a biometric, but the use of a fob, a code word and a hand-scan demonstrates that the facility uses all three methods of identifying me: something that I have, something that I know and something that I am.
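For the curious, the logic of that door reduces to something like the sketch below. The registry entries, code-word hash and palm “template” are invented stand-ins; a real facility stores enrolled biometrics and RFID registries far more carefully.

```python
# Toy illustration of three-factor access control: something you HAVE
# (a fob), something you KNOW (a code word), something you ARE (a palm scan).
import hashlib

REGISTERED_FOBS    = {"FOB-7741"}                                  # have
STORED_CODE_HASH   = hashlib.sha256(b"wild duck").hexdigest()      # know
ENROLLED_PALM_HASH = "palm-template-001"                           # are (stand-in)

def authenticate(fob_id: str, code_word: str, palm_hash: str) -> bool:
    """Grant access only when all three factors check out."""
    have = fob_id in REGISTERED_FOBS
    know = hashlib.sha256(code_word.encode()).hexdigest() == STORED_CODE_HASH
    are  = palm_hash == ENROLLED_PALM_HASH
    return have and know and are

print(authenticate("FOB-7741", "wild duck", "palm-template-001"))  # True
print(authenticate("FOB-7741", "wild duck", "someone-else"))       # False
```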

If you work with technology that is dangerous, secret, or that has investor involvement, then biometric identification or access seems reasonable. After all, something-that-you-are is harder to forge than something that you have. Because this technique is tied to part of your body, it also discourages the loaning of credentials to a spouse, friend, or blackmailer.

But up until now, biometric identification required the advance consent of the individuals identified. After all, before you can be admitted to a secure facility based on your hand print, you had to allow your hand to be scanned at some time in the past. This also suggests that you understood the legitimate goals of those needing your identification in the future.

Few Americans have been compelled to surrender their biometrics without advance consent. There are exceptions, of course. Rapists and individuals applying to live in the United States are routinely fingerprinted. Two very different demographics, and yet both are compelled to surrender a direct link to their genetic makeup. But until now, we have never seen a non-consenting and unsuspecting population subjected to wholesale cataloging of personal biometrics. Who wants all of this data? What could they do with it?

Here at AWildDuck, we have written about the dogged persistence of conservatives in the American government to seek a state of Total Information Awareness. But now, Uncle Sam is raising the stakes to a new low: The Dick Cheneys and Karl Roves aren’t satisfied with compiling and mining data from that which is online, such as phone books, Facebook data, company web sites, etc. They want access to as much personal and corporate data as they can get their hands on: Bank records, credit card receipts, tax returns, library borrowing records, personal email, entire phone conversations & fax images, and the GPS history logged by your mobile phone.

Perhaps even more creepy is the recent authorization of high-altitude drones for domestic law enforcement. But wait! That development pales in comparison with a minor news bulletin today. The FBI has just funded a program of facial recognition. We’re not talking about identifying a repeat bank robber, a missing felon or an unauthorized entry across our borders. We are talking about scanning and parsing the entire population into a biometric fingerprint database. The project aims to cull, track and identify facial images from every Flickr account, every ATM, every 7-11…in fact, every single camera everywhere.

If you have a driver’s license, a Facebook account, or if you ever appeared in a college yearbook, it’s a certainty that you will soon surrender identifiable biometrics, just like a rapist or a registered alien. By 2014, we may arrive at 1984.

The one billion dollars set aside by the FBI for the facial recognition component of Project Über Awareness belies the truly invasive scope of body-cavity probing that the Yanks want to administer. The massively funded effort includes a data archival project buried within a Utah hill that is brain-seizing in size and scope. Forget about tera, peta and exabytes. Think instead of yotta, zetta and Halliburtabytes.

Engadget is a popular web site that reviews and discusses high-tech markets, media & gadgets. Below, they discuss the facial recognition component and its privacy implications. Just as with our past articles on this topic, Engadget begins with a still image from the ABC television series Person of Interest. The show depicts the same technology and its all-encompassing power. Whoever controls it has the power to manipulate life. But unlike Mr. Finch, a fictional champion of stalked heroines, the Big Brother version is not compelled by a concern for individual safety and security. Instead, the US government is using the specter of terrorism and public safety to bring the entire world one giant leap closer to a police state.

Do we really want our government – any government – to know every detail about our daily lives? Does the goal of securing public safety mean that we must surrender our individual freedoms and privacy completely? Are individuals who don’t care about privacy absolutely certain that they will trust their governments for all time and under all circumstances? Do they expect that the data will never be breached or used for purposes that were not originally sanctioned or intended? Is anyone that naïve?

________________________________________________________________________

FBI rolls out $1 billion public face recognition system in 2014.
Big Brother will be on to your evildoing everywhere

Reprint: Engadget.com — posted Sep 9th, 2012

Can USA Assert Jurisdiction Over Assange?

Most Wild Ducks are aware that WikiLeaks is a rogue distributor of classified and secret documents from anonymous news sources, news leaks and whistle blowers. At the helm is the very charming self-promoter, Julian Assange. This man attracts controversy like honey attracts flies. Dozens of governments, banks and NGOs would gladly substitute honey with “horse manure” in that simile.

In the past two years, WikiLeaks has threatened—and then followed through on—the release of information troves containing copious numbers of memos, orders, private communications, and tactical analyses by governments, banks, charities, NGOs, and what-have-you. To generate buzz and prevent sabotage while they vet and compile controversial disclosures, WikiLeaks occasionally pre-releases an encrypted stash of secret documents that they call an “insurance file”—or, more accurately, an information bomb. Once out there, it can never be defused: the contents can be remotely detonated by anyone who releases the decryption key (which can be a short phrase that is easy to remember).
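The mechanics are easy to model. Below is one plausible construction of such an insurance file—my own assumption of the approach (using the Python cryptography package), not WikiLeaks’ actual tooling. The ciphertext is published far and wide in advance; releasing the passphrase later “detonates” it, because anyone can then derive the key.

```python
# Sketch of an "insurance file": pre-publish the ciphertext; releasing the
# short passphrase later lets anyone derive the key and decrypt.
# Requires: pip install cryptography
import base64, hashlib
from cryptography.fernet import Fernet

def key_from_passphrase(phrase: str) -> bytes:
    """Derive a Fernet key from a short, memorable phrase. The salt is fixed
    and public so that the phrase alone is sufficient to decrypt."""
    raw = hashlib.pbkdf2_hmac("sha256", phrase.encode(), b"insurance", 200_000)
    return base64.urlsafe_b64encode(raw)

# Publisher: encrypt the trove and release ONLY the ciphertext.
ciphertext = Fernet(key_from_passphrase("correct horse battery")).encrypt(b"<document trove>")

# Detonation: anyone who learns the phrase can now read the trove.
print(Fernet(key_from_passphrase("correct horse battery")).decrypt(ciphertext))
```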

During the past two years, WikiLeaks has been doing exactly what it threatened (or promised, depending upon your perspective). They have disseminated enormous troves of sensitive and sometimes embarrassing documents, phone calls, faxes, emails, and other private communications without permission from those who were party to the data. Among the infringed parties (think of each as the data ‘owner’ or originator) are the US government, Bank of America and just about anyone else that claims domain over sensitive material. WikiLeaks justifies its acts as a 21st-century watchdog agency with a calling higher than any government. Their PR spin conveys an ethical rudder that pushes for transparency in all affairs. The United States points out that outed documents sometimes reveal the names of spies, endangering agents and their families. Other documents reveal the number and location of weapon systems, or data about their capacity and range. But that’s not all…

For WikiLeaks, it doesn’t matter that a telephone transcript reveals personal information unrelated to the government or business affairs targeted for disclosure. For example, parties arranging a phone call reveal that a premier is delayed because he is with a young mistress or a Deputy of State can’t take a call, because she is in the midst of a fierce hangover. In effect, WikiLeaks says “Hey! These are public officials supported by their subjects or constituents. Transparency is always better than secrecy, no matter what’s in the pudding. Just throw it all out there and let the chips fall as they may.”

Of course, the US government, its allies, and many public and private organizations don’t see it that way! Just because a disgruntled employee or consultant has access to sensitive documents shouldn’t mean that a third-party organization can air it all on the bathroom wall. And so, Julian Assange is a wanted man.

For the past two months, Assange has been holed up at the Ecuadorian embassy in Great Britain. I mention “Great Britain” as a geographic footnote and not to imply ownership or jurisdiction. An embassy is sovereign territory no matter whose land surrounds it. Right?

…Well, not according to the British.

Today, Ecuador’s foreign minister announced that the country is granting asylum to Assange. But now, the British government threatens to storm the Ecuadorian Embassy and forcibly extradite Assange to Sweden, where he is wanted for questioning in cases of alleged rape and sexual molestation.

The US government seeks Julian Assange for trial in a US court on charges related to his role in the massive WikiLeaks disclosure of confidential documents and communications. Of course, the US considers these documents to be sensitive and they are each labeled at various levels of “Secret”. The US has laws that govern access, copying and disclosure. It’s safe to assume that the charge would be treason, conspiracy, theft, aiding the enemy, or something related to willful interference with process.

US Jurisdiction: How Can it be Asserted?

I understand all of that. But I have never seen an explanation of how the US could assert jurisdiction or request extradition. Assange is a foreigner, and his acts related to WikiLeaks took place in foreign countries. Does the US assert that anything labeled “secret” by its military is automatically secret everywhere on Earth? That would be a tough argument, because it would require a bilateral reciprocation agreement. Assange has lived in Nairobi since 2007. Does the US protect documents and extradite individuals over everything that the Kenyan government considers secret?

Of course, the United States is pursuing enablers within its ranks and serving in uniform, but Assange is not among them. His actions may have harmed US interests (this is certainly debatable)—but how can the US claim domain over the legality of his acts, or over his capture and punishment? Having an extradition agreement doesn’t mean that you can demand any individual that you seek. There has to be a reasonable basis for the extradition. Doesn’t a bench warrant need a viable basis in law?

Swedes: We Just Want to Try Him for Rape

The Swedes’ interest in Assange is ostensibly to charge him with a sex crime. That certainly sounds like a legitimate interest, unrelated to the beef with Uncle Sam. But the Swedish government refuses to guarantee safe passage to a region that is not party to a US extradition treaty. They claim that they are bound by law to turn Assange over to the US. The solution to this quagmire is not simple, but it is achievable. Assange claims that he is willing to face the charge. Why not try him in Ecuador (or whatever country becomes his safe harbor from American extradition)? If he refuses, he could be tried in absentia by a Swedish court, and the sentence could be negotiated with authorities in the safe-harbor country.

WikiLeaks: Is the Wholesale Release of Secret Communiques Ethical?

What about the 900-pound elephant in the room: the morality of what Assange has done vis-à-vis WikiLeaks? Can WikiLeaks claim that its mission, carried out in the current fashion, is moral or ethical? My own readers at awildduck.com have pressed for an editorial opinion on the whole affair. Has Assange harmed US interests? Does it matter outside of the US? Did he break an “international” law? Should he be held accountable? Should he be turned over to American authorities to stand trial?

I won’t weigh in on these issues here. The purpose of this posting is to question US jurisdiction and earnestly seek information & opinions on the basis for extradition. If you have knowledge of the law, the basis or the justification for that request, I invite your analysis and comment.

Could the Brits Really “Storm an Embassy”?

I certainly can’t imagine that the Brits would “storm the Ecuadorian embassy”. Good God, man! Regardless of treaties and acts, it is sovereign territory. In fact, I would think that the Ecuadorians could, at their discretion, grant Assange citizenship and then confer diplomatic status. This would compel the host country to guarantee safe passage to the airport. Isn’t that the whole idea of ambassadors and the exchange of territory? Storming an embassy would place the UK in the unenviable and undistinguished company of Egypt (2011) and Iran (1979–1981). Who can forget the hostage taking? That event spawned a nightly TV show in the US and the career of Ted Koppel.

         Ellery Davies clarifies the intersection of Technology, Law and Public
         Policy. He is a contributor to Yahoo, CNet, ABC News, PCWorld and
         The Wall Street Journal. He is also Chief Editor of A Wild Duck.

Photo Mural—Sam Spratt, Gizmodo

Enhancing Privacy: Blind Signaling and Response

A user-transparent privacy enhancement may allow online service providers like Google to provably shield personal data from prying eyes—even from themselves. Personal user data like search, email, document and photo content, navigation and clicks will continue to support clearly defined purposes (advertising that users understand and agreed to), but the data will be unintelligible if inspected for any other purpose.

In effect, the purpose and process of data access and manipulation determine whether data can be interpreted, or even associated with individual users. If data is inspected for any purpose apart from the original scope, it is unintelligible, anonymous and self-expiring. It is useless for any person or process beyond that which was disclosed to users at the time of collection. It cannot even be correlated to the individual users who generated it.

Blind Signaling and Response is not yet built into internet services. But as it crosses development and test milestones, it will attract attention and community scrutiny. A presentation at the University of Montreal Privacy Workshop [video] gives insight into the process. The presenter can be contacted via the contact link at the top of this blog page.

Can Internet services like Google protect user data from all threats—even from their own staff and processes—while still supporting their business model? If such commitment to privacy could be demonstrable, it could usher in an era of public trust. I believe that a modification to the way data is collected, stored and processed may prevent a breach or any disclosure of personal user information, even if compelled by a court order.

The goal of Blind Signaling and Response is to define a method of collecting and storing data that prevents anyone but the intended process from making sense of it. But this pet theory has quite a road ahead…

Before we can understand Blind Signaling and Response, it helps to understand classic signaling. When someone has a need, he can search for a solution.

When an individual is aware of their needs and problems, that’s typically the first step in marrying a problem to a solution. But in a marketing model, a solution (sometimes, one that a user might not even realize he would desire) reaches out to individuals.

Of course, the problem with unsolicited marketing is that the solution being hawked may be directed at recipients who have no matching need. Good marketing is a result of careful targeting. The message is sent or advertised only to a perfect audience, filled with individuals who are glad that the marketer found them. Poor marketing blasts messages at inappropriate lists or posts advertisements in the wrong venue. For the marketer (or spam email sender), it is a waste of resources and sometimes a crime. For the recipient of untargeted ads and emails, it is a source of irritation and an involuntary waste of resources, especially the recipient’s attention.

Consider a hypothetical example of a signal and its response:

Pixar animators consume enormous computing resources creating each minute of animation. Pixar founder, John Lasseter, has many CGI tools at his disposal, most of them designed at Pixar. As John plans a budget for Pixar’s next big film, suppose that he learns of a radical new animation theory called Liquid Flow-Motion. It streamlines the most complex and costly processes. His team has yet to build or find a practical application that benefits animators, but John is determined to search everywhere.

Method #1: A consumer in need searches & signals

Despite a lack of public news on the nascent technique, John is convinced that there must be some workable code in a private lab, a university, or even at a competitor. And so, he creates a web page and uses SEO techniques to attract attention.

The web page is a signal. It broadcasts to the world (and hopefully to relevant parties) that Pixar is receptive to contact from anyone engaged in Liquid Flow-Motion research. With Google’s phenomenal search engine and the internet’s reach, this method of signaling may work, but a successful match involves a bit of luck. Individuals engaged in the new art may not be searching for outsiders. In fact, they may not be aware that their early stage of development would be useful to anyone.

Method #2: Google helps marketers target relevant consumers

Let’s discuss how Google facilitates market-driven signaling and a relevant marketing response today and let us also determine the best avenue for improvement…

At various times in the past few weeks, John had Googled the phrase “Liquid Flow-Motion” and some of the antecedents that the technology builds upon. John also signed up for a conference in which there was a lecture unit on the topic (the lecture was not too useful. It was given by his own employee and covered familiar ground). He also mentioned the technology in a few emails.

Google’s profile of John made connections between his browser, his email and his searches. It may even have factored in location data from John’s Android phone. Meanwhile, in the Czech Republic, a grad student studying Flow-Motion has created the first useful tool. Although he doesn’t know anything about Google AdWords, the university owns 75% of the rights to his research. They incorporate key words from research projects and buy up the Google AdWords phrase “Liquid Flow-Motion”.

Almost immediately, John Lasseter notices very relevant advertising on the web pages that he visits. During his next visit to eBay, he notices a home page photo of a product that embodies the technique. The product was created in Israel for a very different application. Yet it is very relevant to Pixar’s next film. John reaches out to both companies–or more precisely, they reached out in response to his signal, without even knowing to whom they were replying.
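A toy model makes the mechanics plain. To be clear, this is not Google’s actual AdWords logic—the campaigns and matching rule below are invented purely to show how a marketer’s response can find a signaler without the marketer ever learning who responded.

```python
# Toy model of keyword-based signal/response matching (not Google's code).
# Marketers register keywords; a user's recent signals (searches, mail
# terms) are matched against them, and the ad is routed back to the user
# without revealing the user's identity to the marketer.
campaigns = {
    "liquid flow-motion": "Charles U. animation lab — research license",
    "swarm rendering":    "TLV Imaging — real-time fluid renderer",
}

def respond(signals: list[str]) -> list[str]:
    """Return ads whose keywords appear anywhere in the signal stream."""
    stream = " ".join(signals).lower()
    return [ad for keyword, ad in campaigns.items() if keyword in stream]

# John's recent activity is the signal; the relevant response follows.
print(respond(["Googled Liquid Flow-Motion papers", "CGI conference notes"]))
```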

Neat, eh? What is wrong with this model?

For many users, the gradual revelation that an abundance of very personal or sensitive data is being amassed by Google–and the fact that it is being marketed to unknown parties–is troubling. Part of the problem is perception. In the case described above, and in most other cases in which Google is the arbiter, the result is almost always to the user’s advantage. But this fact, alone, doesn’t change the perception.

But consider Google’s process from input to output: the collection of user data from a vast array of free user services and the resulting routing of ads from marketing partners. What if data collection, storage and manipulation could be tweaked so that all personal data–including the participation of any user–were completely anonymized? Sounds crazy, right? If the data is anonymized, it’s not useful.

Wrong.

Method #3: Incorporate Blind Signaling & Response into AdWords—and across the board

A signaling and response system can be constructed on blind credentials. The science is an offshoot of public-key cryptography and is the basis of digital cash (at least, the anonymous form). It enables a buyer to satisfy a standard of evidence (the value of their digital cash) and to demonstrate that a fee has been paid, all without identifying the buyer or even the bank that guarantees the cash’s value. The science of blind credentials is the brainchild of David Chaum, cryptographer and founder of DigiCash, a Dutch venture that made it possible to guarantee financial transactions without any party (including the bank) knowing any of the other parties.
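Chaum’s core primitive, the blind signature, is compact enough to show in full. The sketch below uses textbook RSA with toy numbers—utterly insecure, purely to expose the algebra: the bank signs a blinded message, and the unblinded signature still verifies, even though the bank never saw the message it signed.

```python
# Chaum-style blind signature with toy textbook-RSA numbers (NOT secure).
n, e, d = 3233, 17, 2753      # toy RSA modulus and key pair (p=61, q=53)
m = 1234                      # message, e.g. a digital coin's serial number
r = 7                         # user's random blinding factor, gcd(r, n) = 1

blinded = (m * pow(r, e, n)) % n          # user blinds m; bank never sees m
signed_blind = pow(blinded, d, n)         # bank signs blindly with key d
signature = (signed_blind * pow(r, -1, n)) % n   # user strips the blinding

# Anyone can verify the bank's signature on m, yet the bank cannot link
# the signature it issued to the message on which it finally appears.
assert pow(signature, e, n) == m
print("valid blind signature:", signature)
```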

The takeaway from DigiCash and the pioneering work of David Chaum is that information can be precisely targeted–even with a back channel–without storing or transmitting data that aids in identifying a source or target. (Disclosure: I am developing a specification for the back channel mechanism. This critical component is not in the DigiCash implementation). Even more interesting is that the information that facilitates replying to a signal can be structured in a way that is useless to both outsiders and even to the database owner (in this case, Google).

The benefits aren’t restricted to Internet search providers. Choose the boogeyman: The government, your employer, someone taking a survey, your grandmother. In each case, the interloper can (if they wish) provably demonstrate that meaningful use of individually identifiable data is, by design, restricted to a stated purpose or algorithm. No other person or process can find meaning in the data—not even to whom it belongs.

The magic draws upon and forms an offshoot of Trusted Execution Technology, a means of attestation and authentication. In this case, it is the purpose of execution that must be authenticated before data can be interpreted, correlated with users or manipulated. This presentation at a University of Montreal privacy workshop pulls back the covers by describing a combination of TXT with a voting trust (the presenter rushes through key slides at the end of the video).
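Hardware attestation cannot be reproduced in a few lines, but the control flow it enables can be mimicked. In the loose sketch below—my own conceptual rendering, not the workshop’s protocol—a key custodian releases the decryption key only to code whose measurement matches a disclosed purpose, so the same stored bytes are unintelligible to every other reader, including the database owner.

```python
# Loose conceptual sketch of purpose-bound data access. Real Intel TXT
# attestation is hardware-enforced; this only mimics the decision logic.
import hashlib

APPROVED_PURPOSES = {
    hashlib.sha256(b"route-relevant-ads-v1").hexdigest(),  # the disclosed purpose
}

def release_key(measured_code: bytes, vault_key: bytes) -> bytes:
    """Hand over the key only if the requester's measurement is approved."""
    if hashlib.sha256(measured_code).hexdigest() in APPROVED_PURPOSES:
        return vault_key
    raise PermissionError("attestation failed: purpose not approved")

vault_key = b"k" * 32
release_key(b"route-relevant-ads-v1", vault_key)           # succeeds
try:
    release_key(b"build-dossier-for-subpoena", vault_key)  # refused
except PermissionError as err:
    print(err)
```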

It’s reasonable to assume that privacy doesn’t exist in the Internet age. After all, unlike a meeting at your dining table, the path from whisper to ear passes through a public network. Although encryption and IP re-routing can ensure privacy for P2P conversations, it seems implausible to maintain privacy in everyday searches, navigation and online email services, especially when the services are provided at no cost to the user. Individuals voluntarily disgorge personal information in exchange for services, especially if the goal is to keep the service provider incented to offer the service. For this reason, winning converts to Blind Signaling and Response requires a thoughtful presentation.

Suppose that you travel to another country and walk into a bar. You are not a criminal, nor a particularly famous or newsworthy person. You ask another patron if he knows where to find a good Cuban cigar. When you return to your country, your interest in cigars will probably remain private and so will the fact that you met with this particular individual or even walked into that bar.

Gradually, the internet is facilitating, at a distance, the privileges and empowerment that we take for granted in a personal meeting. With end-to-end encryption, it has already become possible to conduct a private conversation at a distance. With a Tor proxy and swarm routing, it is also possible to keep the identities of the parties private. But today, Google holds an incredible corpus of data that reveals much of what you buy, think, and fantasize about. To many, this seems to be part of the Faustian bargain:

  • If you want the benefits of Google services, you must surrender personal data
  • Even if you don’t want to be the target of marketing,* it’s the price that you pay for using the Google service (Search, Gmail, Drive, Navigate, Translate, Picasa, etc).

Of course, Google stores and acts on the data that it gathers from your web habits. But both statements above are false!

a)  When Google incorporates Blind Signaling into its services, you will get all the benefits of Google services without anyone ever discovering personal information. Yet, Google will still benefit from your use of their services and have even more incentive to continue offering you valuable, personalized services, just as they do now.

b)  Surrendering personal data in a way that fails to anonymize the particulars is not “the price that you pay for Google services”. Google is paid by marketers, not end users. More importantly, marketers can still get relevant, targeted messages onto the pages you visit while Google protects privacy in toto! Google can make your personal data useless to any other party and for any other purpose. Google and their marketing partners will continue to benefit exactly as they do now.


* This, too, is a matter of perception. You really do want targeted messaging, even if you hate spam and, like me, prefer to search for a solution instead of having marketers push a solution to you. In a future article, I will demonstrate that every individual is pleased by relevant messaging, even if it is unsolicited, commercial or sent in bulk.

Will Google “Do No Evil”?

Google captures and keeps a vast amount of personal information about its users. What do they do with all that data? Despite some very persistent misconceptions, the answer is “Nothing bad”. But they could do a much better job ensuring that no one can ever do anything bad with that data—ever. Here is a rather simple but accurate description of what they do with what is gleaned from searches, email, browsing, documents, travel, photos, and more than 3 dozen other ways that they learn about you:

  • Increase the personal relevance of advertising as you surf the web
  • Earn advertising dollars–not because they sell information about you–but because they use that data to match and direct relevant traffic toward you

These aren’t bad things, even to a privacy zealot. With or without Google, we all see advertising wherever we surf. Google is the reason that so many of the ads appeal to our individual interests.

But what about all that personal data? Is it safe on Google’s servers? Can they be trusted? More importantly, can it someday be misused in ways that even Google had not intended?

I value privacy above everything else. And I have always detested marketing, especially the unsolicited variety. I don’t need unsolicited ‘solutions’ knocking on my door or popping up in web surfing. When I have needs, I will research my own solutions—thank you very much.

It took me years to come to terms with this apparent oxymoron, but information-exchange bargains that bring about personalization are actually a very good deal for all parties concerned, and if handled properly, they needn’t risk privacy at all! In fact, the things that Google does with our personal history and predilections really benefit us, but…

This is a pro-Google posting. Well, it’s ‘pro-Google’ if they “do no evil” (Yes—it’s the Google mantra!). First the good news: Google can thwart evil by adding a fortress of privacy around the vast corpus of personal data that they collect and process without weakening user services or the value exchange with their marketing partners. The not-so-good news is that I have urged Google to do this for over two years and so far, they have failed to act. What they need is a little urging from users and marketing partners. Doing no evil benefits everyone and sets an industry precedent that will permeate online businesses everywhere.

The CBS prime time television series, Person of Interest, pairs a freelance ‘James Bond’ with a computer geek. The geek, Mr. Finch, is the ultimate privacy hack. He correlates all manner of disparate data in seconds, including parking lot cameras, government records, high school yearbook photos and even the Facebook pages of third parties.

Mr. Finch & Eric Schmidt: Separated at birth?

It’s an eerie coincidence that Google Chairman, Eric Schmidt, looks like Mr. Finch. After all, they both have the same job! They find a gold mine of actionable data in the personal dealings of everyone.

Viewers accept the TV character. After all, Finch is fictional, he is one of the good guys, and his snooping ability (especially the piecing together of far-flung data) is probably an exaggeration of reality. Right?!

Of course, Eric Schmidt & Google CEO Larry Page are not fictional. They run the largest data gathering engine on earth. I may be in the minority. I believe that Google is “one of the good guys”. But let’s first explore the last assumption about Mr. Finch: Can any organization correlate and “mine” meaningful data from a wholesale sweep of a massive eavesdropping machine and somehow piece together a reasonable profile of your interests, behavior, purchasing history and proclivities? Not only are there organizations that do this today, but many of them act with our explicit consent and with a disclosed value exchange for all that personal data.

Data gathering organizations fall into three categories, which I classify based on the exchange of value with web surfers and, more importantly, whether the user is even aware of their role in collecting data. In this classification, Google has moved from the 2nd category to the first, and this is a good thing:

  1. Organizations that you are aware of–at least peripherally–and for which there is a value exchange (preferably, one that is disclosed). Google comes to mind, of course. Another organization with informed access to your online behavior is your internet service provider. If they wanted to compile a dossier of your interests, market your web surfing history to others, or comply with 3rd party demands to review your activities, it would be trivial to do so.
  2. Organizations with massive access to personal and individualized data that manage to “fly beneath the radar”. Example: Akamai Technologies operates a global network of servers that accelerate the web by caching pages close to users and optimizing the route of page requests. They are contracted by almost any company with a significant online presence. It’s safe to say that their servers and routers are inserted into almost every click of your keyboard and massively distributed throughout the world. Although Akamai’s customer relationship is not with end users, they provide an indirect service by speeding up the web experience. But because Internet users are not actively engaged with them (and are typically unaware of their role in caching data across the Internet), there are few checks on what they do with the click history of users, with whom they share data, and whether–or how–individualized data is retained, anonymized or marketed.
  3. National governments. There is almost never disclosure or a personal value exchange. Most often, the activity involves compulsory assistance from organizations that are forbidden from disclosing the privacy breach or their own role in acts of domestic spying.

The NSA is preparing to massively vacuum data from everyone, everywhere, at all times—a massive and intrusive project that stuns the scientists involved.

I have written about domestic spying before. In the US, it has become alarmingly broad, arbitrary and covert. The über-secretive NSA is now building the world’s biggest data gathering site. It will gulp down everything about everyone. The misguided justification of its minions is alternatively “anti-terrorism” or an even more evasive “9/11”.

Regarding category #2, I have never had reason to suspect Akamai or Verizon of unfair or unscrupulous data mining. (As with Google, these companies could gain a serious ethical and market advantage by taking heed of today’s column.) But today, we focus on data-gathering organizations in category #1—the ones with which we have a relationship and with whom we voluntarily share personal data.

Google is at the heart of most internet searches, and they are partnered with practically every major organization on earth. Forty-eight free services contain code that many malware labs would consider a stealth payload. These doohickeys give Google access to a mountain of data regarding clicks, searches, visitors, purchases, and just about anything else that makes a user tick.

It’s not just searching the web that phones home. Think of Google’s 48 services as a marketer’s bonanza. Browser plug-ins phone home with every click and build a profile of user behavior, location and idiosyncrasies. Google Analytics, a web-traffic reporting tool used by a great many web sites, reveals a mountain of data about both the web site and every single visitor. (Analytics is market-speak for assigning identity or demographics to web visits.) Don’t forget Gmail, Navigate, Picasa, Drive, Google Docs, Google+, Translate, and three dozen other projects that collect, compare and analyze user data. And what about Google’s project to scan everything that has ever been written? Do you suppose that Google knows who views these documents, and can correlate it with an astounding number of additional facts? You can bet Grandma Estelle’s cherry pie that they do!

How many of us ever wonder why all of these services are free to internet users everywhere? That’s an awful lot of free service! One might think that the company is very generous, very foolish, or very unprofitable. One would be wrong on all counts!

Google has mastered the art of marketing your interests, income stats, lifestyle, habits, and even your idiosyncrasies. Hell, they wrote the book on it!

But with great access to personal intelligence comes great responsibility. Does Google go the extra mile to protect user data from off-label use? Do they really care? Is it even reasonable to expect privacy when the bargain calls for data sharing with market interests?

At the end of 2009, Google Chairman Eric Schmidt made a major gaffe in a televised interview on CNBC. In fact, I was so convinced that his statement was toxic that I predicted a grave and swift consumer backlash. Referring to the billions of individuals using the Google search engine, investigative anchor Maria Bartiromo asked Schmidt why users enter their most private thoughts and fantasies. She wondered if they are aware of Google’s role in correlating, storing and sharing data—and in the implicit role of identifying users and correlating their identities with their interests.

Schmidt seemed to share Bartiromo’s surprise. He suggested that internet users were naive to trust Google, because their business model is not driven by privacy and because they are subject to oversight by the Patriot Act. He said:

If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place. If you really need that kind of privacy, the reality is that search engines — including Google — do retain this information for some time and it’s important, for example, that we are all subject in the United States to the Patriot Act and it is possible that all that information could be made available to the authorities.

At the time, I criticized the statements as naive, but I have since become more sanguine. Mr. Schmidt is smarter than I am, and I recognize that he was caught off guard. But clearly, his response had the potential to damage Google’s reputation. Several Google partners jumped ship and realigned with Bing, Microsoft’s newer search engine. Schmidt’s response became a lightning rod–albeit briefly–for both the EFF (Electronic Frontier Foundation) and the CDT (Center for Democracy & Technology). The CDT announced a front-page campaign, Take Back Your Privacy.

But wait…It needn’t be a train wreck! Properly designed, Google can ensure individual privacy, while still meeting the needs of their marketing partners – and having nothing of interest for government snoops, even with a proper subpoena.

I agree with the EFF that Schmidt’s statements undermine Google’s mission. Despite his high position, Schmidt may not fully recognize that Google’s marketing objectives can coexist with an ironclad guarantee of personal privacy – even in the face of the Patriot Act.

Schmidt could have salvaged the gaffe quickly, and I urged him to demonstrate promptly that he understands and defends user privacy. But I overestimated consumer awareness and expectations of reasonable privacy. Moreover, consumers may feel that the benefits of Google’s various services inherently trade privacy for productivity (email, taste in restaurants, individualized marketing, etc.).

Regarding a damning consumer backlash for whitewashing personal privacy concerns with the public, I was off by a few years, but in the end, my warnings will be vindicated. Public awareness of privacy–and especially of internet data sharing and data mining–has increased. Some are wondering if the bargain is worthwhile, while others are learning that data can be anonymized and used in ways that still facilitate user benefits and even the vendor’s marketing needs.

With massive access to public data and the mechanisms to gather it (often without the knowledge and consent of users) comes massive responsibility. (His interview contradicts that message.) Google must rapidly demonstrate a policy of “default protection” and a very high bar for sharing data. In fact, Google can achieve all its goals while fully protecting individual privacy.

Google’s data gathering and archiving mechanism needs a redesign (it’s not as big a task as it seems): Sharing data and cross-pollination should be virtually impossible – beyond a specified exchange between users and intended marketers. Even this exchange must be internally anonymous, useful only in aggregate, and self-expiring – without recourse for revival. Most importantly, it must be impossible for anyone – even a Google staffer – to make a personal connection between individual identities and search terms, Gmail users, ad clickers, voice searchers or navigating drivers! For a while now, voice search has been seen as a huge potential advancement in Google’s data gathering system, although many would argue that it has not yet had its desired impact.
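To show this is less far-fetched than it sounds, here is one plausible reading of “internally anonymous, aggregate-only and self-expiring”. The mechanism is my own assumption, not a Google design: identities are replaced with keyed pseudonyms, and destroying the key on schedule makes re-identification impossible while aggregates survive.

```python
# Sketch: self-expiring, pseudonymous event logging (illustrative only).
import hmac, hashlib, os
from collections import Counter

daily_key = os.urandom(32)          # rotated and DESTROYED at end of day

def pseudonym(user_id: str) -> str:
    """Stable within the day; unlinkable across days or after key destruction."""
    return hmac.new(daily_key, user_id.encode(), hashlib.sha256).hexdigest()[:12]

clicks = Counter()
for user, query in [("alice", "flights"), ("bob", "flights"), ("alice", "hotels")]:
    clicks[(pseudonym(user), query)] += 1    # no raw identity is ever stored

daily_key = None   # key destruction: pseudonyms can never be reversed or joined
print(clicks)      # aggregate counts remain useful for billing and relevance
```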

I modestly suggest that Google create a board-level position and give it authority, held by a visible and high-profile individual. (Disclosure: I have made a “ballsy” bid to fill such a position. There are plenty of higher-profile individuals that I could recommend.)

Schmidt’s statements have echoed for more than two years now. Have they faded at all? If so, it is because Google’s services are certainly useful and because the public has become somewhat inured to the creeping loss of privacy. But wouldn’t it be marvelous if Google seized the moment and reversed that trend? Wouldn’t it be awesome if someone at Google discovered that protecting privacy needn’t cripple the value of the information they gather? Google’s market activity is not at odds with protecting their users’ personal data from abuse. What’s more, the solution does not involve legislation or even public trust. There is a better model!

Schmidt’s statements are difficult to contain or spin. As Asa Dotzler of Firefox wrote in his blog, the Google CEO simply doesn’t understand privacy. Here in the USA, his statements became a lightning rod for both the EFF and the CDT (Center for Democracy & Technology). The CDT even launched a front-page campaign to “Take Back Your Privacy”.

Google’s not the only one situated at a data nexus. Other organizations fly below the radar, either because few understand their tools or because of government involvement. For example, Akamai probably has access to more web-traffic data than Google. The US government has even more access, because an intricate web of programs often forces communications companies to plant data-sniffing tools at the junction points of massive international data conduits. We’ve discussed this in other articles, and I certainly don’t advocate that Wild Ducks become privacy zealots and conspiracy alarmists. But the truth is, the zealots have a leg to stand on, and the alarmists are quite sane.