Friday, August 19, 2016

OpenStack 08/20/2016 (a.m.)

  • Tags: surveillance state, NSA, ShadowBrokers, hack, seconddate

    • On Monday, a hacking group calling itself the “ShadowBrokers” announced an auction for what it claimed were “cyber weapons” made by the NSA. Based on never-before-published documents provided by the whistleblower Edward Snowden, The Intercept can confirm that the arsenal contains authentic NSA software, part of a powerful constellation of tools used to covertly infect computers worldwide.

      The provenance of the code has been a matter of heated debate this week among cybersecurity experts, and while it remains unclear how the software leaked, one thing is now beyond speculation: The malware is covered with the NSA’s virtual fingerprints and clearly originates from the agency.

      The evidence that ties the ShadowBrokers dump to the NSA comes in an agency manual for implanting malware, classified top secret, provided by Snowden, and not previously available to the public. The draft manual instructs NSA operators to track their use of one malware program using a specific 16-character string, “ace02468bdf13579.” That exact same string appears throughout the ShadowBrokers leak in code associated with the same program, SECONDDATE.
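
      The confirmation described above boils down to a string match: the 16-character identifier from the manual appears verbatim in files from the dump. As a minimal sketch (not The Intercept's actual tooling), a short Python script could scan a directory of leaked files for that identifier; the directory path below is hypothetical.

        # Sketch: scan a directory tree for the SECONDDATE identifier string.
        # Only the 16-character string comes from the article; the path is hypothetical.
        import os

        MARKER = b"ace02468bdf13579"

        def find_marker(root):
            hits = []
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        with open(path, "rb") as f:
                            if MARKER in f.read():
                                hits.append(path)
                    except OSError:
                        pass  # skip files that cannot be read
            return hits

        for match in find_marker("./shadowbrokers_dump"):
            print(match)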


Posted from Diigo. The rest of Open Web group favorite links are here.

OpenStack 08/19/2016 (p.m.)

  • Tags: surveillance state, NSA, cybersecurity, exploits, vulnerabilities, firewalls

    • To penetrate the computers of foreign targets, the National Security Agency relies on software flaws that have gone undetected in the pipes of the Internet. For years, security experts have pressed the agency to disclose these bugs so they can be fixed, but the agency hackers have often been reluctant.

      Now with the mysterious release of a cache of NSA hacking tools over the weekend, the agency has lost an offensive advantage, experts say, and potentially placed at risk the security of countless large companies and government agencies worldwide.

      Several of the tools exploited flaws in commercial firewalls that remain unpatched, and they are out on the Internet for all to see. Anyone from a basement hacker to a sophisticated foreign spy agency has access to them now, and until the flaws are fixed, many computer systems may be in jeopardy.

      The revelation of the NSA cache, which dates to 2013 and has not been confirmed by the agency, also highlights the administration’s little-known process for figuring out which software errors to disclose and which to keep secret.


Posted from Diigo. The rest of Open Web group favorite links are here.

Monday, August 15, 2016

OpenStack 08/16/2016 (a.m.)

  • Bruce Schneier on the insecurity of the Internet of Things, and possible consequences.

    Tags: computer-security, Internet-of-things, Schneier

    • Disaster stories involving the Internet of Things are all the rage. They feature cars (both driven and driverless), the power grid, dams, and tunnel ventilation systems. A particularly vivid and realistic one, near-future fiction published last month in New York Magazine, described a cyberattack on New York that involved hacking of cars, the water system, hospitals, elevators, and the power grid. In these stories, thousands of people die. Chaos ensues. While some of these scenarios overhype the mass destruction, the individual risks are all real. And traditional computer and network security isn’t prepared to deal with them.

      Classic information security is a triad: confidentiality, integrity, and availability. You’ll see it called “CIA,” which admittedly is confusing in the context of national security. But basically, the three things I can do with your data are steal it (confidentiality), modify it (integrity), or prevent you from getting it (availability).

    • So far, internet threats have largely been about confidentiality. These can be expensive; one survey estimated that data breaches cost an average of $3.8 million each. They can be embarrassing, as in the theft of celebrity photos from Apple’s iCloud in 2014 or the Ashley Madison breach in 2015. They can be damaging, as when the government of North Korea stole tens of thousands of internal documents from Sony or when hackers stole data about 83 million customer accounts from JPMorgan Chase, both in 2014. They can even affect national security, as in the case of the Office of Personnel Management data breach by—presumptively—China in 2015.

      On the Internet of Things, integrity and availability threats are much worse than confidentiality threats. It’s one thing if your smart door lock can be eavesdropped upon to know who is home. It’s another thing entirely if it can be hacked to allow a burglar to open the door—or prevent you from opening your door. A hacker who can deny you control of your car, or take over control, is much more dangerous than one who can eavesdrop on your conversations or track your car’s location.

      With the advent of the Internet of Things and cyber-physical systems in general, we've given the internet hands and feet: the ability to directly affect the physical world. What used to be attacks against data and information have become attacks against flesh, steel, and concrete.

      Today’s threats include hackers crashing airplanes by hacking into computer networks, and remotely disabling cars, either when they’re turned off and parked or while they’re speeding down the highway. We’re worried about manipulated counts from electronic voting machines, frozen water pipes through hacked thermostats, and remote murder through hacked medical devices. The possibilities are pretty literally endless. The Internet of Things will allow for attacks we can’t even imagine.


Posted from Diigo. The rest of Open Web group favorite links are here.

Saturday, August 13, 2016

OpenStack 08/13/2016 (p.m.)

  • It's not often that I disagree with EFF's positions, but on this one I do. The government should be prohibited from exploiting computer vulnerabilities and should be required to immediately report all vulnerabilities discovered to the relevant developers of hardware or software. It's been one long slippery slope since the Supreme Court first approved wiretapping in Olmstead v. United States, 277 US 438 (1928), https://goo.gl/NJevsr. Left undecided to this day is whether we have a right to whisper privately, a right that is undeniable. All communications intercept cases since Olmstead fly directly in the face of that right.

    Tags: surveillance state, cybersecurity, exploits, government, malware

    • In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. To give a simple example, even when chasing a fleeing murder suspect, the police have a duty not to endanger bystanders. The government should pay the same care to our safety in pursuing threats online, but right now we don’t have clear, enforceable rules for government activities like hacking and "digital sabotage." And this is no abstract question—these actions increasingly endanger everyone’s security.
    • The problem became especially clear this year during the San Bernardino case, involving the FBI’s demand that Apple rewrite its iOS operating system to defeat security features on a locked iPhone. Ultimately the FBI exploited an existing vulnerability in iOS and accessed the contents of the phone with the help of an "outside party." Then, with no public process or discussion of the tradeoffs involved, the government refused to tell Apple about the flaw. Despite the obvious fact that the security of the computers and networks we all use is both collective and interwoven—other iPhones used by millions of innocent people presumably have the same vulnerability—the government chose to withhold information Apple could have used to improve the security of its phones.

      Other examples include intelligence activities like Stuxnet and Bullrun, and law enforcement investigations like the FBI’s mass use of malware against Tor users engaged in criminal behavior. These activities are often disproportionate to stopping legitimate threats, resulting in unpatched software for millions of innocent users, overbroad surveillance, and other collateral effects. 

      That’s why we’re working on a positive agenda to confront governmental threats to digital security. Put more directly, we’re calling on lawyers, advocates, technologists, and the public to demand a public discussion of whether, when, and how governments can be empowered to break into our computers, phones, and other devices; sabotage and subvert basic security protocols; and stockpile and exploit software flaws and vulnerabilities.  

    • Smart people in academia and elsewhere have been thinking and writing about these issues for years. But it’s time to take the next step and make clear, public rules that carry the force of law to ensure that the government weighs the tradeoffs and reaches the right decisions.

      This long post outlines some of the things that can be done. It frames the issue, then describes some of the key areas where EFF is already pursuing this agenda—in particular formalizing the rules for disclosing vulnerabilities and setting out narrow limits for the use of government malware. Finally it lays out where we think the debate should go from here.   


Posted from Diigo. The rest of Open Web group favorite links are here.

Friday, August 12, 2016

OpenStack 08/12/2016 (p.m.)

  • Includes question about cyber-security and privacy.

    Tags: Election 2016, 20-questions, science, internet

    • A coalition of US groups representing more than 10 million scientists and engineers published 20 questions on Wednesday they want every US presidential candidate to answer ahead of November’s vote.

      The questions range from how to support vaccine science, to defining the scope of America’s goals in space, to the candidates’ views on climate change and what they would do about it.

      Stances on nuclear power, protecting the world’s oceans, reducing the human and economic costs of mental illness, and the controversy over visa programs that allow highly skilled immigrants into the United States also feature in the list, made public by the American Association for the Advancement of Science (AAAS).

    • The full list is available at ScienceDebate.org/20qs.

      The 56 groups that helped create the list by crowdsourcing the questions have asked the candidates to answer them by September 6.

      All are described by AAAS as non-partisan groups, including the National Academy of Sciences, the American Geophysical Union, the American Chemical Society and the Union of Concerned Scientists.


Posted from Diigo. The rest of Open Web group favorite links are here.

Sunday, August 07, 2016

OpenStack 08/07/2016 (p.m.)

  • But nobody is saying that the output of this technology can't be combined with the output of facial recognition technology to let them monitor you individually AND track your emotions. Fortunately, others are fighting back with knowledge and tech to block facial recognition. http://goo.gl/JMQM2W

    Tags: facial-recognition, emotion-recognition, cloud computing, Microsoft

    • On the 21st floor of a high-rise hotel in Cleveland, in a room full of political operatives, Microsoft’s Research Division was advertising a technology that could read each facial expression in a massive crowd, analyze the emotions, and report back in real time. “You could use this at a Trump rally,” a sales representative told me.

      At both the Republican and Democratic conventions, Microsoft sponsored event spaces for the news outlet Politico. Politico, in turn, hosted a series of Microsoft-sponsored discussions about the use of data technology in political campaigns. And throughout Politico’s spaces in both Philadelphia and Cleveland, Microsoft advertised an array of products from “Microsoft Cognitive Services,” its artificial intelligence and cloud computing division.

      At one exhibit, titled “Realtime Crowd Insights,” a small camera scanned the room, while a monitor displayed the captured image. Every five seconds, a new image would appear with data annotated for each face — an assigned serial number, gender, estimated age, and any emotions detected in the facial expression. When I approached, the machine labeled me “b2ff” and correctly identified me as a 23-year-old male.

    • “Realtime Crowd Insights” is an Application Programming Interface (API), or a software tool that connects web applications to Microsoft’s cloud computing services. Through Microsoft’s emotional analysis API — a component of Realtime Crowd Insights — applications send an image to Microsoft’s servers. Microsoft’s servers then analyze the faces and return emotional profiles for each one.

      In a November blog post, Microsoft said that the emotional analysis could detect “anger, contempt, fear, disgust, happiness, neutral, sadness or surprise.”

      Microsoft’s sales representatives told me that political campaigns could use the technology to measure the emotional impact of different talking points — and political scientists could use it to study crowd response at rallies.
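
      The request/response flow described above is an ordinary HTTPS POST of an image, with per-face attributes returned as JSON. The sketch below illustrates only that pattern; the endpoint URL, header name, and response fields are assumptions for illustration, not Microsoft's documented API.

        # Illustrative sketch of a send-image, get-per-face-emotions API call.
        # Endpoint, header name, and response shape are assumed, not Microsoft's documented API.
        import requests

        ENDPOINT = "https://example-cognitive-api.invalid/emotion/recognize"  # hypothetical
        API_KEY = "YOUR_KEY"

        def analyze_faces(image_path):
            with open(image_path, "rb") as f:
                image_bytes = f.read()
            resp = requests.post(
                ENDPOINT,
                headers={
                    "Ocp-Apim-Subscription-Key": API_KEY,      # Azure-style key header (assumed)
                    "Content-Type": "application/octet-stream",
                },
                data=image_bytes,
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json()  # assumed: one entry per detected face

        for face in analyze_faces("crowd.jpg"):
            print(face.get("faceRectangle"), face.get("scores"))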

    • Facial recognition technology — the identification of faces by name — is already widely used in secret by law enforcement, sports stadiums, retail stores, and even churches, despite being of questionable legality. As early as 2002, facial recognition technology was used at the Super Bowl to cross-reference the 100,000 attendees against a database of the faces of known criminals. The technology is controversial enough that in 2013, Google tried to ban the use of facial recognition apps in its Google Glass system.

      But “Realtime Crowd Insights” is not true facial recognition — it could not identify me by name, only as “b2ff.” It did, however, store enough data on each face that it could continuously identify it with the same serial number, even hours later. The display demonstrated that capability by distinguishing between the number of total faces it had seen, and the number of unique serial numbers.

      Photo: Alex Emmons

    • Instead, “Realtime Crowd Insights” is an example of facial characterization technology — where computers analyze faces without necessarily identifying them. Facial characterization has many positive applications — it has been tested in the classroom, as a tool for spotting struggling students, and Microsoft has boasted that the tool will even help blind people read the faces around them.

      But facial characterization can also be used to assemble and store large profiles of information on individuals, even anonymously.
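
      The persistent serial numbers described above imply that some template of each face is retained and compared against new detections. The toy illustration below (not Microsoft's implementation) shows the general idea: store one embedding vector per serial number and reuse the serial when a new face is similar enough. The embedding source and the similarity threshold are placeholders.

        # Toy sketch of persistent face IDs via embedding similarity; not Microsoft's implementation.
        # The embedding vector would come from any face-embedding model; the threshold is arbitrary.
        import uuid
        import numpy as np

        class FaceTracker:
            def __init__(self, threshold=0.8):
                self.templates = {}        # serial number -> stored embedding
                self.threshold = threshold

            def assign_serial(self, embedding):
                embedding = embedding / np.linalg.norm(embedding)
                for serial, stored in self.templates.items():
                    if float(np.dot(embedding, stored)) >= self.threshold:
                        return serial      # seen before: reuse the same serial
                serial = uuid.uuid4().hex[:4]  # short IDs like "b2ff"
                self.templates[serial] = embedding
                return serial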

    • Alvaro Bedoya, a professor at Georgetown Law School and expert on privacy and facial recognition, has hailed Microsoft’s code of conduct as evidence that the company is trying to do the right thing. But he pointed out that it leaves a number of questions unanswered — as illustrated in Cleveland and Philadelphia.

      “It’s interesting that the app being shown at the convention ‘remembered’ the faces of the people who walked by. That would seem to suggest that their faces were being stored and processed without the consent that Microsoft’s policy requires,” Bedoya said. “You have to wonder: What happened to the face templates of the people who walked by that booth? Were they deleted? Or are they still in the system?”

      Microsoft officials declined to comment on exactly what information is collected on each face and what data is retained or stored, instead referring me to their privacy policy, which does not address the question.

      Bedoya also pointed out that Microsoft’s marketing did not seem to match the consent policy. “It’s difficult to envision how companies will obtain consent from people in large crowds or rallies.”


Posted from Diigo. The rest of Open Web group favorite links are here.

Tuesday, July 12, 2016

OpenStack 07/13/2016 (a.m.)

  • Off we go for another trip to the European Court of Justice.

    Tags: surveillance state, EU, safe-harbor, privacy-shield, litigation

    • Privacy Shield is the proposed new deal between the EU and the US that is supposed to safeguard all personal data on EU citizens held on computer systems in the US from being subject to mass surveillance by the US National Security Agency. The data can refer to any transaction — web purchases, cars or clothing — involving an EU citizen whose data is held on US servers.

      Privacy groups say Privacy Shield — which replaces the Safe Harbor agreement ruled unlawful in October 2015 — does not meet strict EU standards on the use of personal data. Monique Goyens, Director General of the European Consumer Organization (BEUC), told Sputnik:

      “We consider that the shield is cracked beyond repair and is unlikely to stand scrutiny by the European Court of Justice. A fundamental problem remains that the US side of the shield is made of clay, not iron.”

    • The agreement has been under negotiation for months, ever since the European Court of Justice ruled in October 2015 that the previous EU-US data agreement — Safe Harbor — was invalid. The issue arises from the strict rights of EU citizens — enshrined in the Charter of Fundamental Rights of the European Union — to the privacy of their personal data.
    • The Safe Harbor agreement was a quasi-judicial understanding under which the US agreed to ensure that EU citizens’ data on US servers would be held and protected under the same restrictions as it would be under EU law and directives. The data covers a huge array of information — from Internet and communications usage, to sales transactions, imports and exports.
    • The case arose when Maximillian Schrems, a Facebook user, lodged a complaint with the Irish Data Protection Commissioner, arguing that — in the light of the revelations by ex-CIA contractor Edward Snowden of mass surveillance by the US National Security Agency (NSA) — the transfer of data from Facebook’s Irish subsidiary onto the company’s servers in the US does not provide sufficient protection of his personal data.

      The court ruled that: “the Safe Harbor Decision denies the national supervisory authorities their powers where a person calls into question whether the decision is compatible with the protection of the privacy and of the fundamental rights and freedoms of individuals.”


Posted from Diigo. The rest of Open Web group favorite links are here.

OpenStack 07/12/2016 (p.m.)

  • Tags: surveillance state, EU, safe-harbor, privacy-shield, NSA, State-Dept.

    • The EU has accepted a new version of the so-called Privacy Shield agreement that would allow US companies to transfer Europeans’ private data to servers across the ocean. The EU struck down the previously-reached agreement over US surveillance concerns.

    • The majority of EU members voted in support of the Privacy Shield pact with the US that had been designed to replace its predecessor, the Safe Harbor system, which the highest EU court ruled “invalid” in October 2015 following Edward Snowden’s revelations about mass US surveillance.
    • The newly-adopted agreement will come into force starting Tuesday.

      The deal, which is said to be aimed at protecting European citizens’ private data, defines the rules of how the sharing of information should be handled. It gives legal ground for tech companies such as Google, Facebook and MasterCard to move Europeans’ personal data to US servers, bypassing an EU ban on moving personal information out of the 28-nation bloc. The agreement covers everything from private data about employees to detailed records of what people do online.

      “For the first time, the US has given the EU written assurance that the access of public authorities for law enforcement and national security will be subject to clear limitations, safeguards and oversight mechanisms and has ruled out indiscriminate mass surveillance of European citizens' data,” the statement said.

    • The new deal now grants greater guarantees to European customers and provides “accessible and affordable redress mechanisms” in case any disputes concerning US spying arise. An ombudsman will also be created within the US State Department to review complaints filed by EU citizens.
    • Privacy Shield, however, has also faced sharp criticism. Concerns about extensive US spying activity were raised in Europe after whistleblower Edward Snowden released a trove of controversial material on Washington’s surveillance practices.

      Digital rights group Privacy International (PI) said the newly-adopted pact had been drawn up on a "flawed premise" and “remains full of holes and hence offers limited protection to personal data”. 


Posted from Diigo. The rest of Open Web group favorite links are here.

Sunday, July 03, 2016

OpenStack 07/03/2016 (p.m.)

  • Tags: surveillance state, NSA-hacker-interview

    • The message arrived at night and consisted of three words: “Good evening sir!”

      The sender was a hacker who had written a series of provocative memos at the National Security Agency. His secret memos had explained — with an earthy use of slang and emojis that was unusual for an operative of the largest eavesdropping organization in the world — how the NSA breaks into the digital accounts of people who manage computer networks, and how it tries to unmask people who use Tor to browse the web anonymously. Outlining some of the NSA’s most sensitive activities, the memos were leaked by Edward Snowden, and I had written about a few of them for The Intercept.

      There is no Miss Manners for exchanging pleasantries with a man the government has trained to be the digital equivalent of a Navy SEAL. Though I had initiated the contact, I was wary of how he might respond. The hacker had publicly expressed a visceral dislike for Snowden and had accused The Intercept of jeopardizing lives by publishing classified information. One of his memos outlined the ways the NSA reroutes (or “shapes”) the internet traffic of entire countries, and another memo was titled “I Hunt Sysadmins.” I felt sure he could hack anyone’s computer, including mine.

      Good evening sir!

    • I got lucky with the hacker, because he recently left the agency for the cybersecurity industry; it would be his choice to talk, not the NSA’s. Fortunately, speaking out is his second nature.
    • The Lamb’s memos on cool ways to hunt sysadmins triggered a strong reaction when I wrote about them in 2014 with my colleague Ryan Gallagher. The memos explained how the NSA tracks down the email and Facebook accounts of systems administrators who oversee computer networks. After plundering their accounts, the NSA can impersonate the admins to get into their computer networks and pilfer the data flowing through them. As the Lamb wrote, “sys admins generally are not my end target. My end target is the extremist/terrorist or government official that happens to be using the network … who better to target than the person that already has the ‘keys to the kingdom’?”

      Another of his NSA memos, “Network Shaping 101,” used Yemen as a theoretical case study for secretly redirecting the entirety of a country’s internet traffic to NSA servers.

    • In recent years, two developments have helped make hacking for the government a lot more attractive than hacking for yourself. First, the Department of Justice has cracked down on freelance hacking, whether it be altruistic or malignant. If the DOJ doesn’t like the way you hack, you are going to jail. Meanwhile, hackers have been warmly invited to deploy their transgressive impulses in service to the homeland, because the NSA and other federal agencies have turned themselves into licensed hives of breaking into other people’s computers. For many, it’s a techno sandbox of irresistible delights, according to Gabriella Coleman, a professor at McGill University who studies hackers. “The NSA is a very exciting place for hackers because you have unlimited resources, you have some of the best talent in the world, whether it’s cryptographers or mathematicians or hackers,” she said. “It is just too intellectually exciting not to go there.”
    • He agreed to a video chat that turned into a three-hour discussion sprawling from the ethics of surveillance to the downsides of home improvements and the difficulty of securing your laptop.
    • “If I turn the tables on you,” I asked the Lamb, “and say, OK, you’re a target for all kinds of people for all kinds of reasons. How do you feel about being a target and that kind of justification being used to justify getting all of your credentials and the keys to your kingdom?”

      The Lamb smiled. “There is no real safe, sacred ground on the internet,” he replied. “Whatever you do on the internet is an attack surface of some sort and is just something that you live with. Any time that I do something on the internet, yeah, that is on the back of my mind. Anyone from a script kiddie to some random hacker to some other foreign intelligence service, each with their different capabilities — what could they be doing to me?”

    • “You know, the situation is what it is,” he said. “There are protocols that were designed years ago before anybody had any care about security, because when they were developed, nobody was foreseeing that they would be taken advantage of. … A lot of people on the internet seem to approach the problem [with the attitude of] ‘I’m just going to walk naked outside of my house and hope that nobody looks at me.’ From a security perspective, is that a good way to go about thinking? No, horrible … There are good ways to be more secure on the internet. But do most people use Tor? No. Do most people use Signal? No. Do most people use insecure things that most people can hack? Yes. Is that a bash against the intelligence community that people use stuff that’s easily exploitable? That’s a hard argument for me to make.”
    • I mentioned that lots of people, including Snowden, are now working on the problem of how to make the internet more secure, yet he seemed to do the opposite at the NSA by trying to find ways to track and identify people who use Tor and other anonymizers. Would he consider working on the other side of things? He wouldn’t rule it out, he said, but dismally suggested the game was over as far as having a liberating and safe internet, because our laptops and smartphones will betray us no matter what we do with them.

      “There’s the old adage that the only secure computer is one that is turned off, buried in a box ten feet underground, and never turned on,” he said. “From a user perspective, someone trying to find holes by day and then just live on the internet by night, there’s the expectation [that] if somebody wants to have access to your computer bad enough, they’re going to get it. Whether that’s an intelligence agency or a cybercrimes syndicate, whoever that is, it’s probably going to happen.”

      • There are precautions one can take, and I did that with the Lamb. When we had our video chat, I used a computer that had been wiped clean of everything except its operating system and essential applications. Afterward, it was wiped clean again. My concern was that the Lamb might use the session to obtain data from or about the computer I was using; there are a lot of things he might have tried, if he was in a scheming mood. At the end of our three hours together, I mentioned to him that I had taken these precautions—and he approved.

        “That’s fair,” he said. “I’m glad you have that appreciation. … From a perspective of a journalist who has access to classified information, it would be remiss to think you’re not a target of foreign intelligence services.”

        He was telling me the U.S. government should be the least of my worries. He was trying to help me.



Posted from Diigo. The rest of Open Web group favorite links are here.

Friday, June 24, 2016

OpenStack 06/24/2016 (p.m.)


Posted from Diigo. The rest of Open Web group favorite links are here.

Sunday, May 29, 2016

OpenStack 05/29/2016 (p.m.)

  • Tags: W3C, web-standards, CDM, encrypted-media-extensions, EME, DRM

    • The World Wide Web Consortium (W3C), once the force for open standards that kept browsers from locking publishers to their proprietary capabilities, has changed its mission. Since 2013, the organization has provided a forum where today's dominant browser companies and the dominant entertainment companies can collaborate on a system to let our browsers control our behavior, rather than the other way around.

      This system, "Encrypted Media Extensions" (EME) uses standards-defined code to funnel video into a proprietary container called a "Content Decryption Module." For a new browser to support this new video streaming standard -- which major studios and cable operators are pushing for -- it would have to convince those entertainment companies or one of their partners to let them have a CDM, or this part of the "open" Web would not display in their new browser.

      This is the opposite of every W3C standard to date: once, all you needed to do to render content sent by a server was follow the standard, not get permission. If browsers had needed permission to render a page at the launch of Mozilla, the publishers would have frozen out this new, pop-up-blocking upstart. Kiss Firefox goodbye, in other words.

    • The W3C didn't have to do this. No copyright law says that making a video gives you the right to tell people who legally watch it how they must configure their equipment. But because of the design of EME, copyright holders will be able to use the law to shut down any new browser that tries to render the video without their permission.

      That's because EME is designed to trigger liability under section 1201 of the Digital Millennium Copyright Act (DMCA), which says that removing a digital lock that controls access to a copyrighted work without permission is an offense, even if the person removing the lock has the right to the content it restricts. In other words, once a video is sent with EME, a new company that unlocks it for its users can be sued, even if the users do nothing illegal with that video.

      We proposed that the W3C could protect new browsers by making its members promise not to use the DMCA to attack new entrants in the market, an idea supported by a diverse group of W3C members, but the W3C executive overruled us, saying the work would go forward with no safeguards for future competition.

      It's even worse than it first appears. The DMCA isn't limited to the USA: the US Trade Representative has spread DMCA-like rules to virtually every country that does business with America. Worse still: the DMCA is also routinely used by companies to threaten and silence security researchers who reveal embarrassing defects in their products. The W3C also declined to require its members to protect security researchers who discover flaws in EME, leaving every Web user exposed to vulnerabilities whose disclosure can only safely take place if the affected company decides to permit it.

    • The W3C needs credibility with people who care about the open Web and innovation in order to be viable. They are sensitive to this kind of criticism. We empathize. There are lots of good people working there, people who genuinely, passionately want the Web to stay open to everyone, and to be safe for its users. But the organization made a terrible decision when it opted to provide a home for EME, and an even worse one when it overruled its own members and declined protection for security research and new competitors.

      It needs to hear from you now. Please share this post, and spread the word. Help the W3C be the organization it is meant to be.

  • Tags: surveillance state, NSA, 702, FISA, legislation, backdoor-searches, stats

    • Last month, at the request of the Department of Justice, the Courts approved changes to the obscure Rule 41 of the Federal Rules of Criminal Procedure, which governs search and seizure. By the nature of this obscure bureaucratic process, these rules become law unless Congress rejects the changes before December 1, 2016.

      Today I, along with my colleagues Senators Paul from Kentucky, Baldwin from Wisconsin, and Daines and Tester from Montana, am introducing the Stopping Mass Hacking (SMH) Act (bill, summary), a bill to protect millions of law-abiding Americans from a massive expansion of government hacking and surveillance. Join the conversation with #SMHact.

    • For law enforcement to conduct a remote electronic search, they generally need to plant malware in — i.e. hack — a device. These rule changes will allow the government to search millions of computers with the warrant of a single judge. To me, that’s clearly a policy change that’s outside the scope of an “administrative change,” and it is something that Congress should consider. An agency with the record of the Justice Department shouldn’t be able to wave its arms and grant itself entirely new powers.
    • These changes say that if law enforcement doesn’t know where an electronic device is located, a magistrate judge will now have the authority to issue a warrant to remotely search the device, anywhere in the world. While it may be appropriate to address the issue of allowing a remote electronic search for a device at an unknown location, Congress needs to consider what protections must be in place to protect Americans’ digital security and privacy. This is a new and uncertain area of law, so there needs to be full and careful debate. The ACLU has a thorough discussion of the Fourth Amendment ramifications and the technological questions at issue with these kinds of searches.

      The second part of the change to Rule 41 would give a magistrate judge the authority to issue a single warrant that would authorize the search of an unlimited number — potentially thousands or millions — of devices, located anywhere in the world. These changes would dramatically expand the government’s hacking and surveillance authority. The American public should understand that these changes won’t just affect criminals: computer security experts and civil liberties advocates say the amendments would also dramatically expand the government’s ability to hack the electronic devices of law-abiding Americans if their devices were affected by a computer attack. Devices will be subject to search if their owners were victims of a botnet attack — so the government will be treating victims of hacking the same way they treat the perpetrators.

    • As the Center for Democracy and Technology has noted, there are approximately 500 million computers that fall under this rule. The public doesn’t know nearly enough about how law enforcement executes these hacks, and what risks these types of searches will pose. By compromising the computer’s system, the search might leave it open to other attackers or damage the machine being searched.

      Don’t take it from me that this will impact your security, read more from security researchers Steven Bellovin, Matt Blaze and Susan Landau.

      Finally, these changes to Rule 41 would also give some types of electronic searches different, weaker notification requirements than physical searches. Under this new Rule, they are only required to make “reasonable efforts” to notify people that their computers were searched. This raises the possibility of the FBI hacking into a cyber attack victim’s computer and not telling them about it until afterward, if at all.

  • The 4th Amendment: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and *particularly describing the place to be searched, and the persons or things to be seized.*" So much for the particularized description of the place to be searched and the things to be seized. Fah! Who needs a Constitution, anyway .... 

    Tags: surveillance state, NSA, 702, FISA, legislation, backdoor-searches, stats

    • The Senate Judiciary Committee held an open hearing today on the FISA Amendments Act, the law that ostensibly authorizes the digital surveillance of hundreds of millions of people both in the United States and around the world. Section 702 of the law, scheduled to expire next year, is designed to allow U.S. intelligence services to collect signals intelligence on foreign targets related to our national security interests. However—thanks to the leaks of many whistleblowers including Edward Snowden, the work of investigative journalists, and statements by public officials—we now know that the FISA Amendments Act has been used to sweep up data on hundreds of millions of people who have no connection to a terrorist investigation, including countless Americans.

      What do we mean by “countless”? As became increasingly clear in the hearing today, the exact number of Americans impacted by this surveillance is unknown. Senator Franken asked the panel of witnesses, “Is it possible for the government to provide an exact count of how many United States persons have been swept up in Section 702 surveillance? And if not the exact count, then what about an estimate?”

    • Elizabeth Goitein, the Brennan Center director whose articulate and thought-provoking testimony was the highlight of the hearing, noted that at this time an exact number would be difficult to provide. However, she asserted that an estimate should be possible for most if not all of the government’s surveillance programs.

      None of the other panel participants—which included David Medine and Rachel Brand of the Privacy and Civil Liberties Oversight Board as well as Matthew Olsen of IronNet Cybersecurity and attorney Kenneth Wainstein—offered an estimate.

      Today’s hearing reaffirmed that it is not only the American people who are left in the dark about how many people or accounts are impacted by the NSA’s dragnet surveillance of the Internet. Even vital oversight committees in Congress like the Senate Judiciary Committee are left to speculate about just how far-reaching this surveillance is. It's part of the reason why we urged the House Judiciary Committee to demand that the Intelligence Community provide the public with a number. 

    • The lack of information makes rigorous oversight of the programs all but impossible. As Senator Franken put it in the hearing today, “When the public lacks even a rough sense of the scope of the government’s surveillance program, they have no way of knowing if the government is striking the right balance, whether we are safeguarding our national security without trampling on our citizens’ fundamental privacy rights. But the public can’t know if we succeed in striking that balance if they don’t even have the most basic information about our major surveillance programs." 

      Senator Patrick Leahy also questioned the panel about the “minimization procedures” associated with this type of surveillance, the privacy safeguard that is intended to ensure that irrelevant data and data on American citizens is swiftly deleted.

      Senator Leahy asked the panel: “Do you believe the current minimization procedures ensure that data about innocent Americans is deleted? Is that enough?” 

      David Medine, who recently announced his pending retirement from the Privacy and Civil Liberties Oversight Board, answered unequivocally:

    • Senator Leahy, they don’t. The minimization procedures call for the deletion of innocent Americans’ information upon discovery to determine whether it has any foreign intelligence value. But what the board’s report found is that in fact information is never deleted. It sits in the databases for 5 years, or sometimes longer. And so the minimization doesn’t really address the privacy concerns of incidentally collected communications—again, where there’s been no warrant at all in the process… In the United States, we simply can’t read people’s emails and listen to their phone calls without court approval, and the same should be true when the government shifts its attention to Americans under this program.

      One of the most startling exchanges from the hearing today came toward the end of the session, when Senator Dianne Feinstein—who also sits on the Intelligence Committee—seemed taken aback by Ms. Goitein’s mention of “backdoor searches.” 

    • Feinstein: Wow, wow. What do you call it? What’s a backdoor search?

      Goitein: Backdoor search is when the FBI or any other agency targets a U.S. person for a search of data that was collected under Section 702, which is supposed to be targeted against foreigners overseas.

      Feinstein: Regardless of the minimization that was properly carried out.

      Goitein: Well the data is searched in its unminimized form. So the FBI gets raw data, the NSA, the CIA get raw data. And they search that raw data using U.S. person identifiers. That’s what I’m referring to as backdoor searches.

      It’s deeply concerning that any member of Congress, much less a member of the Senate Judiciary Committee and the Senate Intelligence Committee, might not be aware of the problem surrounding backdoor searches. In April 2014, the Director of National Intelligence acknowledged the searches of this data, which Senators Ron Wyden and Mark Udall termed “the ‘back-door search’ loophole in section 702.” The public was so incensed that the House of Representatives passed an amendment to that year's defense appropriations bill effectively banning the warrantless backdoor searches. Nonetheless, in the hearing today it seemed like Senator Feinstein might not recognize or appreciate the serious implications of allowing U.S. law enforcement agencies to query the raw data collected through these Internet surveillance programs. Hopefully today’s testimony helped convince the Senator that there is more to this topic than what she’s hearing in jargon-filled classified security briefings.


Posted from Diigo. The rest of Open Web group favorite links are here.

Wednesday, April 20, 2016

OpenStack 04/20/2016 (p.m.)

  • Tags: driverless-cars, Beverly-Hills

    • An uncontested vote by the Beverly Hills City Council could guarantee a chauffeur for all residents in the near future. However, instead of a driver, the newly adopted program foresees municipally-owned driverless cars ready to order via a smartphone app.

      Also known as autonomous vehicles, or AV, driverless cars would appear to be the next big thing not only for people, but for local governments as well – if the Beverly Hills City Council can get its AV development program past a few more hurdles, that is. The technology itself has some challenges ahead as well.

    • In the meantime, the conceptual shuttle service, which was unanimously approved at an April 5 city council meeting, is being celebrated.
    • Beverly Hills, which named Google and Tesla in its press release, must first develop a partnership with a manufacturer that can build it a fleet of unmanned cars. It will also need to bring in policy experts. All of these outside parties will have a chance to explore the program’s potential together at an upcoming community event.

      The Wallis Annenberg Center for the Performing Arts will host a summit this fall that will include expert lectures, discussions, and test drives. Er, test rides.

      Already in the works for Beverly Hills is a fiber optics cable network that will, in addition to providing high-speed internet access to all residents and businesses, one day be an integral part of a public transit system that runs on its users’ spontaneous desires.

      Obviously, Beverly Hills has some money on hand for the project, and it is also an ideal testing space as the city takes up an area of less than six square miles. Another positive factor is the quality of the city’s roads, which exceeds that of most in the greater Los Angeles area, not to mention California and the whole United States.

      “It can’t find the lane markings!” Volvo’s North American CEO, Lex Kerssemakers, complained to Los Angeles Mayor Eric Garcetti last month, according to Reuters. “You need to paint the bloody roads here!”

      Whether lanes are marked or signs are clear has made a big difference in how successfully the new technology works.

      Unfortunately, the US Department of Transportation considers 65 percent of US roads to be in poor condition, so AV cars may not be in the works for many Americans living outside of Beverly Hills quite as soon.


Posted from Diigo. The rest of Open Web group favorite links are here.

Saturday, April 16, 2016

OpenStack 04/16/2016 (p.m.)

  • The Fourth Amendment argument that people have a right to know when their property has been searched or seized is particularly interesting to me. If adopted by the Courts, that could spell the end of surveillance gag orders. 

    Tags: surveillance state, Microsoft, DoJ, litigation, 4th-Amendment, 1st Amendment, gag-orders

    • “WE APPRECIATE THAT there are times when secrecy around a government warrant is needed,” Microsoft President Brad Smith wrote in a blog post on Thursday. “But based on the many secrecy orders we have received, we question whether these orders are grounded in specific facts that truly demand secrecy. To the contrary, it appears that the issuance of secrecy orders has become too routine.”

      With those words, Smith announced that Microsoft was suing the Department of Justice for the right to inform its customers when the government is reading their emails.

      The last big fight between the Justice Department and Silicon Valley was started by law enforcement, when the FBI demanded that Apple unlock a phone used by San Bernardino killer Syed Rizwan Farook.

      This time, Microsoft is going on the offensive. The move is welcomed by privacy activists as a step forward for transparency — though it’s also driven by business considerations.

    • Secret government searches are eroding people’s trust in the cloud, Smith wrote — including large and small businesses now keeping massive amounts of records online. “The transition to the cloud does not alter people’s expectations of privacy and should not alter the fundamental constitutional requirement that the government must — with few exceptions — give notice when it searches and seizes private information or communications,” he wrote.

      According to the complaint, Microsoft received 5,624 federal demands for customer information or data in the past 18 months. Almost half — 2,576 — came with gag orders, and about two-thirds of those — 1,752 — had “no fixed end date” by which Microsoft would no longer be sworn to secrecy.

      These requests, though signed off on by a judge, qualify as unconstitutional searches, the attorneys argue. The practice “violates both the Fourth Amendment, which affords people and businesses the right to know if the government searches or seizes their property, and the First Amendment, which enshrines Microsoft’s rights to talk to its customers and to discuss how the government conducts its investigations — subject only to restraints narrowly tailored to serve compelling government interests,” they wrote.


Posted from Diigo. The rest of Open Web group favorite links are here.

Tuesday, April 12, 2016

OpenStack 04/13/2016 (a.m.)

  • Tags: Panama-Papers, technology, open-source

    • Then we put the data up, but the problem with Solr was it didn’t have a user interface, so we used Project Blacklight, which is open source software normally used by librarians. We used it for the journalists. It’s simple because it allows you to do faceted search—so, for example, you can facet by the folder structure of the leak, by years, by type of file. There were more complex things—it supports queries in regular expressions, so the more advanced users were able to search for documents with a certain pattern of numbers that, for example, passports use. You could also preview and download the documents. ICIJ open-sourced the code of our document processing chain, created by our web developer Matthew Caruana Galizia.

      We also developed a batch-searching feature. So say you were looking for politicians in your country—you just run it through the system, and you upload your list to Blacklight and you would get a CSV back saying yes, there are matches for these names—not only exact matches, but also matches based on proximity. So you would say “I want Mar Cabra proximity 2” and that would give you “Mar Cabra,” “Mar whatever Cabra,” “Cabra, Mar,”—so that was good, because very quickly journalists were able to see… I have this list of politicians and they are in the data!
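
      Under the hood, Blacklight's facets and the batch name search are just Solr query parameters. A rough sketch against Solr's standard HTTP select endpoint follows; the core name and field names ("content", "year", "file_type") are assumptions, and the proximity syntax ("Mar Cabra"~2) mirrors the example in the interview.

        # Sketch: a faceted Solr query plus a simple batch name search that writes matches to CSV.
        # The Solr URL, core name, and field names ("content", "year", "file_type") are assumptions.
        import csv
        import requests

        SOLR_SELECT = "http://localhost:8983/solr/leak/select"

        def search(query, extra_params=None):
            params = {"q": query, "rows": 10, "wt": "json"}
            params.update(extra_params or {})
            resp = requests.get(SOLR_SELECT, params=params, timeout=30)
            resp.raise_for_status()
            return resp.json()

        # Faceted search: hit counts broken down by year and file type.
        faceted = search("content:passport",
                         {"facet": "true", "facet.field": ["year", "file_type"]})
        print(faceted["facet_counts"]["facet_fields"])

        # Batch search: run a list of names through the index and record hit counts.
        names = ["Mar Cabra"]  # illustrative list of names to check
        with open("matches.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["name", "hits"])
            for name in names:
                result = search('content:"%s"~2' % name)  # ~2 is the proximity match from the interview
                writer.writerow([name, result["response"]["numFound"]])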

    • Last Sunday, April 3, the first stories emerging from the leaked dataset known as the Panama Papers were published by a global partnership of news organizations working in coordination with the International Consortium of Investigative Journalists, or ICIJ. As we begin the second week of reporting on the leak, Iceland’s Prime Minister has been forced to resign, Germany has announced plans to end anonymous corporate ownership, governments around the world launched investigations into wealthy citizens’ participation in tax havens, the Russian government announced that the investigation was an anti-Putin propaganda operation, and the Chinese government banned mentions of the leak in Chinese media.

      As the ICIJ-led consortium prepares for its second major wave of reporting on the Panama Papers, we spoke with Mar Cabra, editor of ICIJ’s Data & Research unit and lead coordinator of the data analysis and infrastructure work behind the leak. In our conversation, Cabra reveals ICIJ’s years-long effort to build a series of secure communication and analysis platforms in support of genuinely global investigative reporting collaborations.

    • For communication, we have the Global I-Hub, which is a platform based on open source software called Oxwall. Oxwall is a social network, like Facebook, which has a wall when you log in with the latest in your network—it has forum topics, links, you can share files, and you can chat with people in real time.
    • We had the data in a relational database format in SQL, and thanks to ETL (Extract, Transform, and Load) software Talend, we were able to easily transform the data from SQL to Neo4j (the graph-database format we used). Once the data was transformed, it was just a matter of plugging it into Linkurious, and in a couple of minutes, you have it visualized—in a networked way, so anyone can log in from anywhere in the world. That was another reason we really liked Linkurious and Neo4j—they’re very quick when representing graph data, and the visualizations were easy to understand for everybody. The not-very-tech-savvy reporter could expand the dots like magic, and more technically expert reporters and programmers could use the Neo4j query language, Cypher, to do more complex queries, like show me everybody within two degrees of separation of this person, or show me all the connected dots…
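
      The "two degrees of separation" query mentioned above maps directly onto a variable-length path match in Cypher. Below is a minimal sketch using the official neo4j Python driver; the connection details, node label (Person), and property name are assumptions about how the graph might be modeled, not ICIJ's actual schema.

        # Sketch: "everybody within two degrees of separation" as a Cypher query.
        # URI, credentials, the Person label, and the name property are assumptions.
        from neo4j import GraphDatabase

        driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

        QUERY = """
        MATCH (p:Person {name: $name})-[*1..2]-(connected)
        RETURN DISTINCT connected.name AS name
        """

        def two_degrees(name):
            with driver.session() as session:
                return [record["name"] for record in session.run(QUERY, name=name)]

        print(two_degrees("Mar Cabra"))
        driver.close()
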
    • We believe in open source technology and try to use it as much as possible. We used Apache Solr for the indexing and Apache Tika for document processing, and it’s great because it processes dozens of different formats and it’s very powerful. Tika interacts with Tesseract, so we did the OCRing on Tesseract.

      To OCR the images, we created an army of 30–40 temporary servers in Amazon that allowed us to process the documents in parallel and do parallel OCR-ing. If it was very slow, we’d increase the number of servers—if it was going fine, we would decrease because of course those servers have a cost.
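
      The same split-the-work idea can be sketched on a single machine: ICIJ parallelized OCR across a temporary fleet of Amazon servers behind Tika, while the toy version below just fans images out to a local process pool with pytesseract (it assumes the Tesseract binary, pytesseract, and Pillow are installed; the input directory is hypothetical).

        # Single-machine sketch of parallel OCR; ICIJ ran Tika/Tesseract across many Amazon servers.
        # Requires the tesseract binary plus the pytesseract and Pillow packages.
        import glob
        from multiprocessing import Pool

        import pytesseract
        from PIL import Image

        def ocr_one(path):
            return path, pytesseract.image_to_string(Image.open(path))

        if __name__ == "__main__":
            images = glob.glob("scans/*.png")      # hypothetical input directory
            with Pool(processes=8) as pool:        # more workers, like adding servers to the fleet
                for path, text in pool.imap_unordered(ocr_one, images):
                    with open(path + ".txt", "w", encoding="utf-8") as out:
                        out.write(text)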

    • For the visualization of the Mossack Fonseca internal database, we worked with another tool called Linkurious. It’s not open source, it’s licensed software, but we have an agreement with them, and they allowed us to work with it. It allows you to represent data in graphs. We had a version of Linkurious on our servers, so no one else had the data. It was pretty intuitive—journalists had to click on dots that expanded, basically, and could search the names.

Posted from Diigo. The rest of Open Web group favorite links are here.

Thursday, April 07, 2016

OpenStack 04/07/2016 (p.m.)

  • Tags: surveillance state, encryption, WhatsApp

    • For most of the past six weeks, the biggest story out of Silicon Valley was Apple’s battle with the FBI over a federal order to unlock the iPhone of a mass shooter. The company’s refusal touched off a searing debate over privacy and security in the digital age. But this morning, at a small office in Mountain View, California, three guys made the scope of that enormous debate look kinda small.

      Mountain View is home to WhatsApp, an online messaging service now owned by tech giant Facebook that has grown into one of the world’s most important applications. More than a billion people trade messages, make phone calls, send photos, and swap videos using the service. This means that only Facebook itself runs a larger self-contained communications network. And today, the enigmatic founders of WhatsApp, Brian Acton and Jan Koum, together with a high-minded coder and cryptographer who goes by the pseudonym Moxie Marlinspike, revealed that the company has added end-to-end encryption to every form of communication on its service.

    • This means that if any group of people uses the latest version of WhatsApp—whether that group spans two people or ten—the service will encrypt all messages, phone calls, photos, and videos moving among them. And that’s true on any phone that runs the app, from iPhones to Android phones to Windows phones to old school Nokia flip phones. With end-to-end encryption in place, not even WhatsApp’s employees can read the data that’s sent across its network. In other words, WhatsApp has no way of complying with a court order demanding access to the content of any message, phone call, photo, or video traveling through its service. Like Apple, WhatsApp is, in practice, stonewalling the federal government, but it’s doing so on a larger front—one that spans roughly a billion devices.
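
      WhatsApp's rollout is built on the Signal protocol, with per-message ratcheting and group keys; the toy sketch below is not that protocol, only an illustration of the end-to-end property itself using PyNaCl public-key boxes: the relay in the middle sees ciphertext it has no key to open.

        # Toy illustration of the end-to-end property: only the endpoints can decrypt.
        # This is NOT the Signal protocol WhatsApp deployed; it is a minimal PyNaCl example.
        from nacl.public import PrivateKey, Box

        alice_sk = PrivateKey.generate()
        bob_sk = PrivateKey.generate()

        # Each side holds only its own private key plus the other's public key.
        alice_to_bob = Box(alice_sk, bob_sk.public_key)
        bob_from_alice = Box(bob_sk, alice_sk.public_key)

        ciphertext = alice_to_bob.encrypt(b"meet at noon")   # all a server or relay would see
        print(bob_from_alice.decrypt(ciphertext))            # only Bob's keys can open it
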
    • The FBI and the Justice Department declined to comment for this story. But many inside the government and out are sure to take issue with the company’s move. In late 2014, WhatsApp encrypted a portion of its network. In the months since, its service has apparently been used to facilitate criminal acts, including the terrorist attacks on Paris last year. According to The New York Times, as recently as this month, the Justice Department was considering a court case against the company after a wiretap order (still under seal) ran into WhatsApp’s end-to-end encryption.

      “The government doesn’t want to stop encryption,” says Joseph DeMarco, a former federal prosecutor who specializes in cybercrime and has represented various law enforcement agencies backing the Justice Department and the FBI in their battle with Apple. “But the question is: what do you do when a company creates an encryption system that makes it impossible for court-authorized search warrants to be executed? What is the reasonable level of assistance you should ask from that company?”


Posted from Diigo. The rest of Open Web group favorite links are here.

Saturday, April 02, 2016

OpenStack 04/02/2016 (p.m.)

  • The article is wide of the mark, based on analysis of Executive Branch policy rather than the governing law such as the Freedom of Information Act. And I still find it somewhat ludicrous that a third party with knowledge of the defect could succeed in convincing a court that knowledge of a defect in a company's product is trade-secret proprietary information. "Your honor, my client has discovered a way to break into Mr. Tim Cook's house without a key to his house. That is a valuable trade secret that this Court must keep Mr. Cook from learning." Pow! The Computer Fraud and Abuse Act makes it a crime to access a computer that can connect to the Internet by exploiting a software bug. 

    Tags: surveillance state, Apple, FBI, iPhone, encryption, FOIA, litigation

    • The FBI may be allowed to withhold information about how it broke into an iPhone belonging to a gunman in the December San Bernardino shootings, despite a U.S. government policy of disclosing technology security flaws discovered by federal agencies.

      Under the U.S. vulnerabilities equities process, the government is supposed to err in favor of disclosing security issues so companies can devise fixes to protect data. The policy has exceptions for law enforcement, and there are no hard rules about when and how it must be applied.

      Apple Inc has said it would like the government to share how it cracked the iPhone security protections. But the Federal Bureau of Investigation, which has been frustrated by its inability to access data on encrypted phones belonging to criminal suspects, might prefer to keep secret the technique it used to gain access to gunman Syed Farook's phone.

      The referee is likely to be a White House group formed during the Obama administration to review computer security flaws discovered by federal agencies and decide whether they should be disclosed.

    • Stewart Baker, former general counsel of the NSA and now a lawyer with Steptoe & Johnson, said the review process could be complicated if the cracking method is considered proprietary by the third party that assisted the FBI.

      Several security researchers have pointed to the Israel-based mobile forensics firm Cellebrite as the likely third party that helped the FBI. That company has repeatedly declined comment.


Posted from Diigo. The rest of Open Web group favorite links are here.