Surveillance State: Police CANNOT Track Your Cell Phone Without A Warrant Now


In a first, a Manhattan federal judge presiding over a narcotics case has decided that drug evidence obtained through cell phone surveillance technology called “Stingray” won’t be admissible in court.

StingRay (also known as “Hailstorm” or “TriggerFish”) is an “IMSI catcher”: it basically acts like a cell phone tower, sending out signals that force nearby cell phones to ping back with information revealing their owner’s location and other identifying details. If an agent is tracking a suspect, the pings work a bit like the game “hot or cold.” The closer the agent gets to the phone, the stronger (or hotter) the pings become.
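The “hot or cold” search described above can be sketched as a simple greedy algorithm: move in whichever direction makes the ping stronger, and stop when no direction is hotter. This is an illustrative toy model (the path-loss formula and step size are assumptions, not a description of any real device):

```python
import math

# Toy model: received signal strength falls off with distance
# (free-space path loss, expressed in dB).
def signal_strength_db(searcher, phone, tx_power_db=0.0):
    d = math.dist(searcher, phone)
    return tx_power_db - 20 * math.log10(max(d, 0.1))

def hot_or_cold_search(phone, start, step=1.0, iterations=50):
    """Greedy search: at each step, move in whichever direction
    makes the ping 'hotter' (stronger)."""
    pos = list(start)
    for _ in range(iterations):
        best = signal_strength_db(pos, phone)
        best_move = None
        for dx, dy in [(step, 0), (-step, 0), (0, step), (0, -step)]:
            candidate = [pos[0] + dx, pos[1] + dy]
            strength = signal_strength_db(candidate, phone)
            if strength > best:
                best, best_move = strength, candidate
        if best_move is None:   # no hotter direction: we're at the peak
            break
        pos = best_move
    return pos

phone_location = (12.0, 7.0)
found = hot_or_cold_search(phone_location, start=(0.0, 0.0))
print(found)  # converges on the phone's position
```

The DEA technician walking the hallways toward the strongest ping is, in effect, running this loop on foot.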

Judge William Pauley on Tuesday ruled that defendant Raymond Lambis’ rights were violated when DEA agents used a Stingray without a warrant to locate and search his Washington Heights apartment in Manhattan during a drug-trafficking investigation.

According to court documents, the DEA sent a technician with the StingRay to the area where Lambis lived. The technician walked around, sending out signals, until the strength of responding pings led him to Lambis’ apartment building. The technician entered the building and then walked up and down the hallways until he found the specific apartment where the pings were the strongest.

Later that evening, DEA agents knocked on Lambis’ door and asked his father for permission to search the apartment. In Lambis’ bedroom, agents recovered “narcotics, three digital scales, empty zip lock bags, and other drug paraphernalia.”

Privacy advocates say the use of the Stingray technology without a warrant encroaches on or even violates people’s constitutional rights. But despite concerns, the devices have become an increasingly common and popular item in law enforcement’s arsenal.

Research by the American Civil Liberties Union found that at least 13 federal agencies use StingRay technology, including the NSA, the Department of Homeland Security, the FBI and the Army. In New York (and in many other states), both state and local police are equipped with StingRays. An investigation by USA Today found that the technology was used even for routine crimes, like petty theft.

The ACLU found that the NYPD had used Stingrays more than 1,000 times between 2008 and May 2015 without any written policy on obtaining a warrant.

“If carrying a cell phone means being exposed to military grade surveillance equipment, then the privacy of nearly all New Yorkers is at risk,” Donna Lieberman, executive director of the ACLU’s New York branch said earlier this year.

Pauley’s ruling on Tuesday follows what was celebrated as a landmark decision by privacy advocates in April, when Maryland’s second highest court ruled that police need a probable cause warrant to track cell phones using StingRays.

After the decision, the Baltimore office of the public defender began reviewing hundreds of cases which hinged on StingRay technology, all of which could potentially be challenged as a result of the Maryland court ruling.

Read the Original Article at Vice News

Why You Should Side With Apple and Not the FBI in the San Bernardino iPhone Case

I have the utmost respect for Bruce. The man knows his stuff and is the final word in topics of this sort. -SF


By Bruce Schneier

Earlier this week, a federal magistrate ordered Apple to assist the FBI in hacking into the iPhone used by one of the San Bernardino shooters. Apple will fight this order in court.

The policy implications are complicated. The FBI wants to set a precedent that tech companies will assist law enforcement in breaking their users’ security, and the technology community is afraid that the precedent will limit what sorts of security features it can offer customers. The FBI sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.

The technology considerations are more straightforward, and shine a light on the policy questions.

The iPhone 5c in question is encrypted. This means that someone without the key cannot get at the data. This is a good security feature. Your phone is a very intimate device. It is likely that you use it for private text conversations, and that it’s connected to your bank accounts. Location data reveals where you’ve been, and correlating multiple phones reveal who you associate with. Encryption protects your phone if it’s stolen by criminals. Encryption protects the phones of dissidents around the world if they’re taken by local police.  It protects all the data on your phone, and the apps that increasingly control the world around you.

This encryption depends on the user choosing a secure password, of course. If you had an older iPhone, you probably just used the default four-digit password. That’s only 10,000 possible passwords, making it pretty easy to guess. If the user enabled the more-secure alphanumeric option, the password becomes far harder to guess.
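The arithmetic behind those claims is easy to check. The sketch below compares the password spaces, using an assumed guessing rate of 12.5 guesses per second (roughly one guess per 80 ms, a figure often cited for iPhone key derivation — treat it as an illustrative assumption, not an Apple specification):

```python
# Password-space arithmetic (illustrative, not Apple-specific).
four_digit = 10 ** 4        # 10,000 possible PINs
six_digit = 10 ** 6         # 1,000,000 possible PINs
alphanumeric_6 = 62 ** 6    # 6 chars from a-z, A-Z, 0-9: ~56.8 billion

guesses_per_second = 12.5   # assumed: ~80 ms per hardware-bound guess

minutes_for_4_digit = four_digit / guesses_per_second / 60
years_for_alnum_6 = alphanumeric_6 / guesses_per_second / 86400 / 365

print(f"4-digit PIN:   {four_digit:>14,} options, "
      f"~{minutes_for_4_digit:.0f} minutes to exhaust")
print(f"6-char alnum:  {alphanumeric_6:>14,} options, "
      f"~{years_for_alnum_6:.0f} years to exhaust")
```

Under these assumptions a four-digit PIN falls in minutes, while a six-character alphanumeric password takes on the order of a century — which is why the FBI’s real obstacle is not the encryption itself but the guess-rate limits around it.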

Apple added two more security features to the iPhone. First, a phone can be configured to erase its data after too many incorrect password guesses. Second, it enforces a delay between password guesses. This delay isn’t really noticeable to a user who types the wrong password and then retypes the correct one, but it’s a large barrier for anyone trying to guess password after password in a brute-force attempt to break into the phone.
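A toy model shows why the erase-after-too-many-guesses feature is so effective against brute force: a sequential attacker gets only a handful of tries before the data is gone. (The PIN values and the wipe threshold of 10 are illustrative assumptions.)

```python
def brute_force_with_wipe(correct_pin, wipe_after=10):
    """Toy model of the erase-after-too-many-guesses defense:
    an attacker trying 4-digit PINs in order gets only `wipe_after`
    attempts before the device wipes its data."""
    for attempt, guess in enumerate(f"{i:04d}" for i in range(10000)):
        if guess == correct_pin:
            return ("cracked", attempt + 1)
        if attempt + 1 >= wipe_after:
            return ("wiped", attempt + 1)

# Against a random PIN, a sequential attacker succeeds only if the
# PIN happens to fall in the first wipe_after guesses -- a 0.1% chance:
print(brute_force_with_wipe("0007"))   # lucky: PIN is among the first 10
print(brute_force_with_wipe("4831"))   # typical: device wipes first
```

Disabling this wipe behavior, and removing the inter-guess delay, is precisely what the FBI’s requested software would do.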

But that iPhone has a security flaw. While the data is encrypted, the software controlling the phone is not. This means that someone can create a hacked version of the software and install it on the phone without the consent of the phone’s owner and without knowing the encryption key. This is what the FBI — and now the court — is demanding Apple do: It wants Apple to rewrite the phone’s software to make it possible to guess possible passwords quickly and automatically.

The FBI’s demands are specific to one phone, which might make its request seem reasonable if you don’t consider the technological implications: Authorities have the phone in their lawful possession, and they only need help seeing what’s on it in case it can tell them something about how the San Bernardino shooters operated. But the hacked software the court and the FBI wants Apple to provide would be general. It would work on any phone of the same model. It has to.

Make no mistake; this is what a backdoor looks like. This is an existing vulnerability in iPhone security that could be exploited by anyone.

There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues. There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world. Have the Chinese, for instance, written a hacked Apple operating system that records conversations and automatically forwards them to police? They would need to have stolen Apple’s code-signing key so that the phone would recognize the hacked software as valid, but governments have done that in the past with other keys and other companies. We simply have no idea who already has this capability.
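The role of the code-signing key can be sketched in a few lines. The real scheme is asymmetric (Apple signs builds with a private key and phones verify with the corresponding public key); the HMAC below is a simplified stand-in used only to illustrate the gatekeeping role of the signature:

```python
import hashlib
import hmac
import secrets

# Stand-in for Apple's private signing key (assumption: real iOS
# signing is asymmetric, not HMAC -- this only models the gate).
signing_key = secrets.token_bytes(32)

def sign(firmware: bytes) -> bytes:
    """Produce a signature over a firmware image."""
    return hmac.new(signing_key, firmware, hashlib.sha256).digest()

def phone_will_install(firmware: bytes, signature: bytes) -> bool:
    """The phone recomputes the signature and installs only on a match."""
    expected = hmac.new(signing_key, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"legitimate iOS build"
hacked = b"surveillance build"

sig = sign(official)
print(phone_will_install(official, sig))   # True: signed build installs
print(phone_will_install(hacked, sig))     # False: modified software is rejected
```

This is why a stolen signing key matters so much: whoever holds it can make any modified operating system pass the phone’s validity check.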

And while this sort of attack might be limited to state actors today, remember that attacks always get easier. Technology broadly spreads capabilities, and what was hard yesterday becomes easy tomorrow. Today’s top-secret NSA programs become tomorrow’s PhD theses and the next day’s hacker tools. Soon this flaw will be exploitable by cybercriminals to steal your financial data. Everyone with an iPhone is at risk, regardless of what the FBI demands Apple do.

What the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm. Powerful governments, democratic and totalitarian alike, want access to user data for both law enforcement and social control. We cannot build a backdoor that only works for a particular type of government, or only in the presence of a particular court order.

Either everyone gets security or no one does. Either everyone gets access or no one does. The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smartphones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized. The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.

CORRECTION: An earlier version of this post incorrectly stated that the vulnerability the FBI wants Apple to exploit has been fixed in later models of the iPhone. In fact, according to Apple, that is not the case: There are some differences in the details of the attack, but all of its phones would be vulnerable to having their software updated in this manner.

Bruce Schneier is a security technologist and CTO of Resilient Systems, Inc. His latest book is Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World.

Read the Original Article at Washington Post