Clown World Bulletin: The Agenda to Normalize Crazy Continues

Major Airline Allows Male Staff to Wear Skirts

 

In a bid to become “the most inclusive airline in the skies,” British airline Virgin Atlantic scrapped rules on Wednesday requiring its staff to wear gender-specific uniforms. This means male personnel may now wear red skirt suits to work.

The company, which is owned by billionaire Richard Branson, announced that it would update its gender identity policies to “champion individuality,” enabling its employees to wear clothing that “expresses how they identify.”

In addition, Virgin Atlantic has updated its “trans inclusion policies,” entitling members of this community to time off for medical treatments related to gender transition, and allowing them to choose changing and shower facilities.

 

Have you noticed that every time one of these Woke Corporations makes a big announcement like this, without fail it ends with an overt violation of other people’s rights in the “fine print”?

I mean, allowing a mentally unstable person to choose which changing (bathroom) and shower facilities they use openly violates my Right to Privacy, but that’s OK because the rights of these people are now more important than those of the rest of us normal people, right?

Wrong.

This is Cultural Marxism in its most complete, underhanded form, and it needs to be stamped out.

 

 

The Surveillance State: Drones and The End of Society


The human race is on the brink of momentous and dire change. It is a change that potentially smashes our institutions and warps our society beyond recognition. It is also a change to which almost no one is paying attention. I’m talking about the coming obsolescence of the gun-wielding human infantryman as a weapon of war. Or to put it another way: the end of the Age of the Gun.

You may not even realize that you have been living in the Age of the Gun, because it’s been centuries since that age began. But imagine yourself back in 1400. In that century (and the 10 centuries before it), the battlefield was ruled not by the infantryman, but by the horse archer—a warrior-nobleman who had spent his whole life training in the ways of war. Imagine that guy’s surprise when he was shot off his horse by a poor no-count farmer armed with a long metal tube and just two weeks’ worth of training. Just a regular guy with a gun.
That day was the end of the Middle Ages and the beginning of modernity. For centuries after that fateful day, gun-toting infantry ruled the battlefield. Military success depended more and more on being able to motivate large groups of (gun-wielding) humans, instead of on winning the loyalty of the highly trained warrior-noblemen. But sometime in the near future, the autonomous, weaponized drone may replace the human infantryman as the dominant battlefield technology. And as always, that shift in military technology will cause huge social upheaval.

The advantage of people with guns is that they are cheap and easy to train. In the modern day, it’s true that bombers, tanks, and artillery can lay waste to infantry—but those industrial tools of warfare are just so expensive that swarms of infantry can still deter industrialized nations from fighting protracted conflicts. Look at how much it cost the United States to fight the wars in Afghanistan and Iraq, versus how much it cost our opponents. The hand-held firearm reached its apotheosis with the cheap, rugged, easy-to-use AK-47; with this ubiquitous weapon, guerrilla armies can still defy the mightiest nations on Earth.

Read the Remainder at Quartz

“Predictive Policing”: The Cyber Version of “Stop and Frisk”


Thanks America! How China’s Newest Software Could Track, Predict, and Crush Dissent

Armed with data from spying on its citizens, Beijing could turn ‘predictive policing’ into an AI tool of repression.

What if the Communist Party could have predicted Tiananmen Square? The Chinese government is deploying a new tool to keep the population from rising up. Beijing is building software to predict instability before it arises, based on volumes of data mined from Chinese citizens about their jobs, pastimes, and habits. It’s the latest advancement of what goes by the name “predictive policing,” where data is used to deploy law enforcement or even military units to places where crime (or, say, an anti-government political protest) is likely to occur. Don’t cringe: Predictive policing was born in the United States. But China is poised to emerge as a leader in the field.

Here’s what that means.

First, some background. What is predictive policing? Back in 1994, New York City Police Commissioner William Bratton led a pioneering and deeply controversial effort to pre-deploy police units to places where crime was expected to occur on the basis of crime statistics.

Bratton, working with deputy police commissioner Jack Maple, showed that the so-called CompStat system decreased crime by 37 percent in just three years. But it also fueled an unconstitutional practice called “stop-and-frisk,” wherein minority youth in the wrong place at the wrong time were frequently targeted and harassed by the police. Lesson: you can deploy police to hotspots before crime occurs, but you may cause more problems than you solve.
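The mechanics behind CompStat-style hotspot deployment are easy to sketch: count historical incidents per area and pre-deploy units to the top-ranked areas. A minimal illustration follows; the incident log, precinct names, and function are invented for illustration, not drawn from any actual system:

```python
from collections import Counter

def top_hotspots(incidents, k=2):
    """Rank areas by historical incident count and return the top k.

    `incidents` is a list of (area, incident_type) records; hotspot
    policing pre-deploys patrols to the highest-count areas.
    """
    counts = Counter(area for area, _ in incidents)
    return [area for area, _ in counts.most_common(k)]

# Toy incident log: area name paired with incident type.
log = [
    ("Precinct 7", "robbery"), ("Precinct 7", "assault"),
    ("Precinct 7", "theft"), ("Precinct 3", "theft"),
    ("Precinct 3", "robbery"), ("Precinct 12", "vandalism"),
]
print(top_hotspots(log))  # Precinct 7 first, then Precinct 3
```

Real systems layer time-of-day, incident type, and recency weighting on top of this counting, but the core ranking idea is the same.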

That was in New York.

Wu Manqing, a representative from China Electronics Technology, the company that the Chinese government hired to design the predictive policing software, described the newest version as “a unified information environment,” Bloomberg reported last week. Its applications go well beyond simply sending police to a specific corner. Because Chinese authorities face far fewer privacy limits on the sorts of information that they can gather on citizens, they can target police forces much more precisely. They might be able to target an individual who suddenly received and deposited a large payment to their bank account, or who reads pro-democracy news sites, or who is displaying a change in buying habits — purchasing more expensive luxury items, for instance. The Chinese government’s control over the Internet in that country puts it in a unique position to extend the reach of surveillance and data collection into the lives of citizens. Chinese authorities plan to deploy the system in places where relations between ethnic minorities and the Chinese Communist Party are particularly strained, according to Bloomberg.
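Stripped of scale, the per-individual targeting described here amounts to scoring people against a set of behavioral flags and reviewing anyone above a threshold. A toy sketch of that pattern, with every field name, weight, and threshold invented:

```python
# Hypothetical behavioral flags and weights; all values are invented.
RULES = {
    "large_deposit": 3,    # sudden large bank deposit
    "dissident_sites": 4,  # reads pro-democracy news sites
    "spending_shift": 1,   # abrupt change in buying habits
}

def risk_score(person):
    """Sum the weights of every flag a person's record trips."""
    return sum(weight for flag, weight in RULES.items() if person.get(flag))

def flag_for_review(people, threshold=4):
    """Return the names of everyone whose score meets the threshold."""
    return [p["name"] for p in people if risk_score(p) >= threshold]

population = [
    {"name": "A", "large_deposit": True, "spending_shift": True},
    {"name": "B", "dissident_sites": True},
    {"name": "C", "spending_shift": True},
]
print(flag_for_review(population))  # A (score 4) and B (score 4)
```

The point of the sketch is that nothing here is technically exotic; what changes the outcome is which flags a government is able, and willing, to collect.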

For all the talk in Washington casting China as a rising regional military threat, in 2011 the country began spending more on domestic security and stability maintenance, sometimes called wei-wen, than on building up its military. More recent numbers are harder to come by, but many China watchers believe the trend has continued.

After the Arab Spring in 2011, Chinese leaders increased internal security spending by 13 percent to 624 billion yuan, outpacing spending on the military, which was 601 billion yuan. That year, the Chinese government compelled 650 cities to improve their ability to monitor public spaces via surveillance cameras and other technologies. “Hundreds of Chinese cities are rushing to construct their safe city platforms by fusing Internet, video surveillance cameras, cell phones, GPS location data and biometric technologies into central ICT meta-systems,” reads the introduction to a 2013 report on Chinese spending on homeland security technologies from the Homeland Security Research Council, a market research firm in Washington.

China soon emerged as the world’s largest market for surveillance equipment. Western companies, including Bain Capital, the equity firm founded by former GOP presidential candidate Mitt Romney, wanted a piece of a pie worth a potential $132 billion by 2022.

But collecting massive amounts of data leads inevitably to the question of how to analyze it at scale. China is fast becoming a world leader in the use of machine learning and artificial intelligence for national security. Chinese scientists recently presented two papers at the Association for the Advancement of Artificial Intelligence conference, and each points to the future of Chinese research into predictive policing.

One explains how to more easily recognize faces by compressing a Deep Neural Network, or DNN, down to a smaller size. “The expensive computation of DNNs make their deployment difficult on mobile and embedded devices,” it says. Read that to mean: here’s a mathematical formula for getting embedded cameras to recognize faces without calling up a distant database.
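The paper itself isn’t reproduced here, but a standard way to shrink a deep neural network for embedded hardware is weight quantization: storing 32-bit float weights as 8-bit integers plus a scale factor. A toy sketch of that general idea (not the paper’s actual method):

```python
def quantize(weights):
    """Map float weights to signed 8-bit ints plus a scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(qweights, scale):
    """Recover approximate float weights at inference time."""
    return [q * scale for q in qweights]

w = [0.51, -1.27, 0.02, 0.9]
q, s = quantize(w)
approx = dequantize(q, s)
# Each recovered weight is within one quantization step of the original.
assert all(abs(a - b) <= s for a, b in zip(w, approx))
```

Storing one byte per weight instead of four cuts the model’s memory footprint roughly 4x, which is what makes on-device recognition, without a call to a distant database, plausible.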

The second paper proposes software to predict the likelihood of a “public security event” in different Chinese provinces within the next month. Defense One was able to obtain a short demonstration of the system. The “events” range from the legitimately terrifying (“campus attack,” “bus explosion”) to the more mundane-sounding (“strike event,” “gather event”), all rated on a severity scale of 1 to 5. To build it, the researchers relied on a dataset of more than 12,324 disruptive occurrences that took place across different provinces going back to 1998.
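Defense One doesn’t describe the model’s internals, but the simplest baseline for “likelihood of an event in a province next month” is a historical frequency estimate over such a dataset. A sketch of that baseline, with invented provinces and counts:

```python
from collections import Counter

def monthly_event_rate(events, months):
    """Estimate per-province events-per-month from a historical log.

    `events` is a list of (province, event_type) records spanning
    `months` months; the naive forecast for next month is simply the
    historical average rate.
    """
    counts = Counter(province for province, _ in events)
    return {p: c / months for p, c in counts.items()}

# Invented 24-month log: 12 incidents in one province, 6 in another.
history = ([("Xinjiang", "gather event")] * 12
           + [("Guangdong", "strike event")] * 6)
rates = monthly_event_rate(history, months=24)
print(rates)  # {'Xinjiang': 0.5, 'Guangdong': 0.25}
```

A production system would condition on many more signals than raw counts, but any such model starts from base rates like these.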

The research by itself is not alarming. What government doesn’t have an interest in stopping shootings or even predicting demonstrations?

It’s the Chinese government’s definition of “terrorism” that many in the West find troubling, since the government has used the phantom of public unrest to justify the arrests of peaceful dissidents, such as Uyghur rights advocate Rebiya Kadeer.

Those fears increased after the Chinese government passed new anti-terror legislation in December that expanded government surveillance powers and that compels foreign technology companies to assist Chinese authorities in data collection efforts against Chinese citizens. Specifically, the law says that telecommunication and technology companies “shall provide technical interfaces, decryption and other technical support and assistance to public security and state security agencies when they are following the law to avert and investigate terrorist activities.”

The U.S. objects, and State Department spokesman Mark Toner said the law “could lead to greater restrictions on the exercise of freedoms of expressions, association, and peaceful assembly.” The FBI’s push to compel Apple to provide a different technical interface into Syed Farook’s iPhone is one reason leaders in China are watching the FBI-versus-Apple debate so closely (and is the epitome of irony).

“Essentially, this law could give the authorities even more tools in censoring unwelcome information and crafting their own narrative in how the ‘war on terror’ is being waged,” human rights worker William Nee told the New York Times.

It could also compel foreign technology companies to assist the Chinese government in the acquisition of more data to train predictive policing software efforts. That’s where China’s predictive policing powers enter the picture.

Predictive policing efforts are rising around the United States, with programs in Memphis, Tennessee; Chicago, Illinois; Santa Cruz and Los Angeles, California; and elsewhere. Police departments implement them in a variety of ways, many not particularly controversial. Beijing has the resources, the will, and the data to turn predictive policing into something incredibly powerful and, possibly, quite dreadful.

Read the Original Article at Defense One

Speak Into the Microphone: Devices on Public Buses in Maryland Listening in on Public Conversations


The Maryland Senate on Tuesday delayed action on a bill that would clamp down on when public buses and trains can record the private conversations of their passengers.

Sen. Robert A. Zirkin (D-Baltimore County), chair of the Senate Judicial Proceedings Committee, which unanimously voted for the measure to move to the Senate floor, said he wanted the committee to address an amendment offered by some of those who are concerned about costs associated with the bill.

The bill is likely to be considered by the Senate on Wednesday, he said.

“What [the Maryland Transit Administration] is doing is mass surveillance,” Zirkin said.

“I find it outrageous,” he said. “I don’t want to overstate it, but this is the issue of our generation. As technology advances, it becomes easier and easier to encroach on people’s civil liberties.”

 While Zirkin and other proponents argue that the technology, which has been in use since 2012, is an infringement on civil liberties, the bill’s opponents say the recordings are a necessary tool for homeland security.

The bill, which would affect MTA buses in the Baltimore area, Ride On buses in Montgomery County and TheBus in Prince George’s County, creates guidelines for audio recordings and places limits on when they can be made.

MTA began using recording devices inside some of its buses in 2012, without seeking legislative approval. Nearly 500 of its fleet of 750 buses now have audio recording capabilities. Officials say the devices can capture important information in cases of driver error or an attack or altercation on a bus.

Under the bill, recording devices would have to be installed near a bus or train operator’s seat. The devices would be controlled by the driver and could be activated only in the event of a public-safety incident.

The legislation to limit the recordings came to the Senate floor last week, but a vote was delayed until Tuesday after several lawmakers raised questions about how much it would cost to retrofit or replace existing recording equipment to meet the bill’s requirements.

Some lawmakers raised the issue of security. Several asked for the delay to allow time to draft amendments.

 “I can make an argument to tape everybody, everywhere, everywhere they walk, everywhere they talk, and you can make the excuse for homeland security,” Zirkin said. “But that is not a valid reason to encroach this fundamentally on people’s privacy rights.”

This is the fourth time in four years that the bill to limit the recordings has been introduced. Previous pieces of legislation have never made it out of committee, but Zirkin’s committee unanimously approved it this year.

Senate President Thomas V. Mike Miller Jr. (D-Calvert) indicated last week that he doesn’t like the bill and would probably vote against it because he feels the limitations could compromise security, and he does not want to incur the cost of replacing existing equipment.

The Judicial Proceedings Committee will hear testimony Tuesday afternoon on a bill that would change the way police officers in Maryland are trained and the process they go through when they are accused of misconduct.

The legislation, which was heard in the House last week, was created after the spring’s riots in Baltimore and repeated calls from criminal justice advocates for police reform.

Also on Tuesday, the House Appropriations Committee is scheduled to hold a hearing on a proposal to ban firearms at public colleges and universities in the state, including community colleges. Under existing law, schools can set their own gun policies, as long as they comply with Maryland statutes. Some schools prohibit firearms outright, while others allow them with permission from campus police.

The gun legislation, sponsored by Sen. Richard S. Madaleno Jr. (D-Montgomery) and Del. Benjamin S. Barnes (D-Prince George’s), is partly a response to a wave of mass shootings across the nation in recent years. Schools including Virginia Tech and Oregon’s Umpqua Community College have experienced such deadly shootings, and Washington College on the Eastern Shore was shut down for a week in the fall while authorities tried to track down a student who had allegedly displayed a gun on campus.

House Speaker Michael E. Busch (D-Anne Arundel) and Senate President Miller joined other Democratic lawmakers in announcing support for the gun ban last month.

Read the Original Article at Washington Post

Why You Should Side With Apple and Not the FBI in the San Bernardino iPhone Case

I have the utmost respect for Bruce. The man knows his stuff and is the final word in topics of this sort. -SF


By Bruce Schneier

Earlier this week, a federal magistrate ordered Apple to assist the FBI in hacking into the iPhone used by one of the San Bernardino shooters. Apple will fight this order in court.

The policy implications are complicated. The FBI wants to set a precedent that tech companies will assist law enforcement in breaking their users’ security, and the technology community is afraid that the precedent will limit what sorts of security features it can offer customers. The FBI sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.

The technology considerations are more straightforward, and shine a light on the policy questions.

The iPhone 5c in question is encrypted. This means that someone without the key cannot get at the data. This is a good security feature. Your phone is a very intimate device. It is likely that you use it for private text conversations, and that it’s connected to your bank accounts. Location data reveals where you’ve been, and correlating multiple phones reveals who you associate with. Encryption protects your phone if it’s stolen by criminals. Encryption protects the phones of dissidents around the world if they’re taken by local police. It protects all the data on your phone, and the apps that increasingly control the world around you.

This encryption depends on the user choosing a secure password, of course. If you had an older iPhone, you probably just used the default four-digit password. That’s only 10,000 possible passwords, making it pretty easy to guess. If the user enabled the more-secure alphanumeric option, the password becomes much harder to guess.
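The gap between those two schemes is plain keyspace arithmetic. Assuming an attacker who can try passwords at a fixed rate (the guess rate and the alphanumeric length below are illustrative assumptions, not Apple's figures):

```python
def keyspace(alphabet_size, length):
    """Number of possible passwords of a given length."""
    return alphabet_size ** length

def worst_case_hours(space, guesses_per_second):
    """Hours needed to exhaust the whole keyspace at a given guess rate."""
    return space / guesses_per_second / 3600

pin_space = keyspace(10, 4)    # four-digit PIN: 10,000 options
alnum_space = keyspace(62, 8)  # 8 chars of a-z, A-Z, 0-9 (assumed length)

# At a hypothetical 12.5 guesses per second, with no enforced delay:
print(worst_case_hours(pin_space, 12.5))    # well under an hour
print(worst_case_hours(alnum_space, 12.5))  # billions of hours
```

The four-digit space falls in minutes; an eight-character alphanumeric password pushes the worst case out by roughly ten orders of magnitude.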

Apple added two more security features to the iPhone. First, a phone could be configured to erase the data after too many incorrect password guesses. Second, it enforced a delay between password guesses. This delay isn’t really noticeable to a user who types the wrong password and then retypes the correct one, but it’s a large barrier for anyone trying to guess password after password in a brute-force attempt to break into the phone.
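Those two protections, a wipe threshold plus an inter-guess delay, can be modeled as a small state machine. The delay schedule and wipe count below are invented for illustration and are not Apple’s actual values:

```python
class PasscodeGuard:
    """Toy model of wipe-after-N-failures plus escalating delays."""

    # Invented delay schedule (seconds) indexed by failure count.
    DELAYS = [0, 0, 0, 1, 5, 60, 300]
    WIPE_AFTER = 10

    def __init__(self, passcode):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False

    def attempt(self, guess):
        """Return (unlocked, delay_before_next_try_in_seconds)."""
        if self.wiped:
            raise RuntimeError("data erased")
        if guess == self._passcode:
            self.failures = 0
            return True, 0
        self.failures += 1
        if self.failures >= self.WIPE_AFTER:
            self.wiped = True  # too many failures: erase the data
            return False, 0
        delay = self.DELAYS[min(self.failures, len(self.DELAYS) - 1)]
        return False, delay

guard = PasscodeGuard("1234")
print(guard.attempt("0000"))  # (False, 0) -- early failures are free
print(guard.attempt("1234"))  # (True, 0)  -- correct guess resets the count
```

A legitimate user rarely hits the longer delays; a brute-force attacker hits them on nearly every guess, which is the whole point. The FBI’s request amounts to software that skips this logic entirely.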

But that iPhone has a security flaw. While the data is encrypted, the software controlling the phone is not. This means that someone can create a hacked version of the software and install it on the phone without the consent of the phone’s owner and without knowing the encryption key. This is what the FBI — and now the court — is demanding Apple do: It wants Apple to rewrite the phone’s software to make it possible to guess passwords quickly and automatically.

The FBI’s demands are specific to one phone, which might make its request seem reasonable if you don’t consider the technological implications: Authorities have the phone in their lawful possession, and they only need help seeing what’s on it in case it can tell them something about how the San Bernardino shooters operated. But the hacked software the court and the FBI wants Apple to provide would be general. It would work on any phone of the same model. It has to.

Make no mistake; this is what a backdoor looks like. This is an existing vulnerability in iPhone security that could be exploited by anyone.

There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues. There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world. Have the Chinese, for instance, written a hacked Apple operating system that records conversations and automatically forwards them to police? They would need to have stolen Apple’s code-signing key so that the phone would recognize the hacked software as valid, but governments have done that in the past with other keys and other companies. We simply have no idea who already has this capability.

And while this sort of attack might be limited to state actors today, remember that attacks always get easier. Technology broadly spreads capabilities, and what was hard yesterday becomes easy tomorrow. Today’s top-secret NSA programs become tomorrow’s PhD theses and the next day’s hacker tools. Soon this flaw will be exploitable by cybercriminals to steal your financial data. Everyone with an iPhone is at risk, regardless of what the FBI demands Apple do.

What the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm. Powerful governments, democratic and totalitarian alike, want access to user data for both law enforcement and social control. We cannot build a backdoor that only works for a particular type of government, or only in the presence of a particular court order.

Either everyone gets security or no one does. Either everyone gets access or no one does. The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized. The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.

CORRECTION: An earlier version of this post incorrectly stated that the vulnerability the FBI wants Apple to exploit has been fixed in later models of the iPhone. In fact, according to Apple, that is not the case: There are some differences in the details of the attack, but all of its phones would be vulnerable to having their software updated in this manner.

Bruce Schneier is a security technologist and CTO of Resilient Systems, Inc. His latest book is Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World.

Read the Original Article at Washington Post