Friday, June 24, 2016

OpenStack 06/24/2016 (p.m.)


Posted from Diigo. The rest of Open Web group favorite links are here.

Sunday, May 29, 2016

OpenStack 05/29/2016 (p.m.)

  • Tags: W3C, web-standards, CDM, encrypted-media-extensions, EME, DRM

    • The World Wide Web Consortium (W3C), once the force for open standards that kept browsers from locking publishers to their proprietary capabilities, has changed its mission. Since 2013, the organization has provided a forum where today's dominant browser companies and the dominant entertainment companies can collaborate on a system to let our browsers control our behavior, rather than the other way around.

      This system, "Encrypted Media Extensions" (EME), uses standards-defined code to funnel video into a proprietary container called a "Content Decryption Module." For a new browser to support this new video streaming standard -- which major studios and cable operators are pushing for -- it would have to convince those entertainment companies or one of their partners to let it have a CDM, or this part of the "open" Web would not display in the new browser.

      This is the opposite of every W3C standard to date: once, all you needed to do to render content sent by a server was follow the standard, not get permission. If browsers had needed permission to render a page at the launch of Mozilla, the publishers would have frozen out this new, pop-up-blocking upstart. Kiss Firefox goodbye, in other words.

    • The W3C didn't have to do this. No copyright law says that making a video gives you the right to tell people who legally watch it how they must configure their equipment. But because of the design of EME, copyright holders will be able to use the law to shut down any new browser that tries to render the video without their permission.

      That's because EME is designed to trigger liability under section 1201 of the Digital Millennium Copyright Act (DMCA), which says that removing a digital lock that controls access to a copyrighted work without permission is an offense, even if the person removing the lock has the right to the content it restricts. In other words, once a video is sent with EME, a new company that unlocks it for its users can be sued, even if the users do nothing illegal with that video.

      We proposed that the W3C could protect new browsers by having its members promise not to use the DMCA to attack new entrants in the market, an idea supported by a diverse group of W3C members, but the W3C executive overruled us, saying the work would go forward with no safeguards for future competition.

      It's even worse than it appears at first glance. The DMCA isn't limited to the USA: the US Trade Representative has spread DMCA-like rules to virtually every country that does business with America. Worse still: the DMCA is also routinely used by companies to threaten and silence security researchers who reveal embarrassing defects in their products. The W3C also declined to require its members to protect security researchers who discover flaws in EME, leaving every Web user exposed to vulnerabilities whose disclosure can only safely take place if the affected company decides to permit it.

    • The W3C needs credibility with people who care about the open Web and innovation in order to be viable. They are sensitive to this kind of criticism. We empathize. There are lots of good people working there, people who genuinely, passionately want the Web to stay open to everyone, and to be safe for its users. But the organization made a terrible decision when it opted to provide a home for EME, and an even worse one when it overruled its own members and declined protection for security research and new competitors.

      It needs to hear from you now. Please share this post, and spread the word. Help the W3C be the organization it is meant to be.

  • Tags: surveillance state, NSA, 702, FISA, legislation, backdoor-searches, stats

    • Last month, at the request of the Department of Justice, the Supreme Court approved changes to the obscure Rule 41 of the Federal Rules of Criminal Procedure, which governs search and seizure. By the nature of this obscure bureaucratic process, these rules become law unless Congress rejects the changes before December 1, 2016.

      Today I, along with my colleagues Senators Paul from Kentucky, Baldwin from Wisconsin, and Daines and Tester from Montana, am introducing the Stopping Mass Hacking (SMH) Act (bill, summary), a bill to protect millions of law-abiding Americans from a massive expansion of government hacking and surveillance. Join the conversation with #SMHact.

    • For law enforcement to conduct a remote electronic search, they generally need to plant malware in — i.e. hack — a device. These rule changes will allow the government to search millions of computers with the warrant of a single judge. To me, that’s clearly a policy change that’s outside the scope of an “administrative change,” and it is something that Congress should consider. An agency with the record of the Justice Department shouldn’t be able to wave its arms and grant itself entirely new powers.
    • These changes say that if law enforcement doesn’t know where an electronic device is located, a magistrate judge will now have the authority to issue a warrant to remotely search the device, anywhere in the world. While it may be appropriate to address the issue of allowing a remote electronic search for a device at an unknown location, Congress needs to consider what protections must be in place to protect Americans’ digital security and privacy. This is a new and uncertain area of law, so there needs to be full and careful debate. The ACLU has a thorough discussion of the Fourth Amendment ramifications and the technological questions at issue with these kinds of searches.

      The second part of the change to Rule 41 would give a magistrate judge the authority to issue a single warrant that would authorize the search of an unlimited number — potentially thousands or millions — of devices, located anywhere in the world. These changes would dramatically expand the government’s hacking and surveillance authority. The American public should understand that these changes won’t just affect criminals: computer security experts and civil liberties advocates say the amendments would also dramatically expand the government’s ability to hack the electronic devices of law-abiding Americans if their devices were affected by a computer attack. Devices will be subject to search if their owners were victims of a botnet attack — so the government will be treating victims of hacking the same way they treat the perpetrators.

    • As the Center for Democracy and Technology has noted, there are approximately 500 million computers that fall under this rule. The public doesn’t know nearly enough about how law enforcement executes these hacks, and what risks these types of searches will pose. By compromising a computer’s system, a search might leave it open to other attackers or damage the device being searched.

      Don’t take it from me that this will impact your security; read more from security researchers Steven Bellovin, Matt Blaze, and Susan Landau.

      Finally, these changes to Rule 41 would also give some types of electronic searches different, weaker notification requirements than physical searches. Under this new Rule, they are only required to make “reasonable efforts” to notify people that their computers were searched. This raises the possibility of the FBI hacking into a cyber attack victim’s computer and not telling them about it until afterward, if at all.

  • The 4th Amendment: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and *particularly describing the place to be searched, and the* persons or *things to be seized."* So much for the particularized description of the place to be searched and the things to be seized. Fah! Who needs a Constitution, anyway ....

    Tags: surveillance state, NSA, 702, FISA, legislation, backdoor-searches, stats

    • The Senate Judiciary Committee held an open hearing today on the FISA Amendments Act, the law that ostensibly authorizes the digital surveillance of hundreds of millions of people both in the United States and around the world. Section 702 of the law, scheduled to expire next year, is designed to allow U.S. intelligence services to collect signals intelligence on foreign targets related to our national security interests. However—thanks to the leaks of many whistleblowers including Edward Snowden, the work of investigative journalists, and statements by public officials—we now know that the FISA Amendments Act has been used to sweep up data on hundreds of millions of people who have no connection to a terrorist investigation, including countless Americans.

      What do we mean by “countless”? As became increasingly clear in the hearing today, the exact number of Americans impacted by this surveillance is unknown. Senator Franken asked the panel of witnesses, “Is it possible for the government to provide an exact count of how many United States persons have been swept up in Section 702 surveillance? And if not the exact count, then what about an estimate?”

    • Elizabeth Goitein, the Brennan Center director whose articulate and thought-provoking testimony was the highlight of the hearing, noted that at this time an exact number would be difficult to provide. However, she asserted that an estimate should be possible for most if not all of the government’s surveillance programs.

      None of the other panel participants—which included David Medine and Rachel Brand of the Privacy and Civil Liberties Oversight Board as well as Matthew Olsen of IronNet Cybersecurity and attorney Kenneth Wainstein—offered an estimate.

      Today’s hearing reaffirmed that it is not only the American people who are left in the dark about how many people or accounts are impacted by the NSA’s dragnet surveillance of the Internet. Even vital oversight committees in Congress like the Senate Judiciary Committee are left to speculate about just how far-reaching this surveillance is. It's part of the reason why we urged the House Judiciary Committee to demand that the Intelligence Community provide the public with a number. 

    • The lack of information makes rigorous oversight of the programs all but impossible. As Senator Franken put it in the hearing today, “When the public lacks even a rough sense of the scope of the government’s surveillance program, they have no way of knowing if the government is striking the right balance, whether we are safeguarding our national security without trampling on our citizens’ fundamental privacy rights. But the public can’t know if we succeed in striking that balance if they don’t even have the most basic information about our major surveillance programs." 

      Senator Patrick Leahy also questioned the panel about the “minimization procedures” associated with this type of surveillance, the privacy safeguard that is intended to ensure that irrelevant data and data on American citizens is swiftly deleted.

      Senator Leahy asked the panel: “Do you believe the current minimization procedures ensure that data about innocent Americans is deleted? Is that enough?” 

      David Medine, who recently announced his pending retirement from the Privacy and Civil Liberties Oversight Board, answered unequivocally:

    • Senator Leahy, they don’t. The minimization procedures call for the deletion of innocent Americans’ information upon discovery to determine whether it has any foreign intelligence value. But what the board’s report found is that in fact information is never deleted. It sits in the databases for 5 years, or sometimes longer. And so the minimization doesn’t really address the privacy concerns of incidentally collected communications—again, where there’s been no warrant at all in the process… In the United States, we simply can’t read people’s emails and listen to their phone calls without court approval, and the same should be true when the government shifts its attention to Americans under this program.

      One of the most startling exchanges from the hearing today came toward the end of the session, when Senator Dianne Feinstein—who also sits on the Intelligence Committee—seemed taken aback by Ms. Goitein’s mention of “backdoor searches.” 

    • Feinstein: Wow, wow. What do you call it? What’s a backdoor search?

      Goitein: Backdoor search is when the FBI or any other agency targets a U.S. person for a search of data that was collected under Section 702, which is supposed to be targeted against foreigners overseas.

      Feinstein: Regardless of the minimization that was properly carried out.

      Goitein: Well the data is searched in its unminimized form. So the FBI gets raw data, the NSA, the CIA get raw data. And they search that raw data using U.S. person identifiers. That’s what I’m referring to as backdoor searches.

      It’s deeply concerning that any member of Congress, much less a member of the Senate Judiciary Committee and the Senate Intelligence Committee, might not be aware of the problem surrounding backdoor searches. In April 2014, the Director of National Intelligence acknowledged the searches of this data, which Senators Ron Wyden and Mark Udall termed “the ‘back-door search’ loophole in section 702.” The public was so incensed that the House of Representatives passed an amendment to that year's defense appropriations bill effectively banning the warrantless backdoor searches. Nonetheless, in the hearing today it seemed like Senator Feinstein might not recognize or appreciate the serious implications of allowing U.S. law enforcement agencies to query the raw data collected through these Internet surveillance programs. Hopefully today’s testimony helped convince the Senator that there is more to this topic than what she’s hearing in jargon-filled classified security briefings.


Posted from Diigo. The rest of Open Web group favorite links are here.

Wednesday, April 20, 2016

OpenStack 04/20/2016 (p.m.)

  • Tags: driverless-cars, Beverly-Hills

    • An uncontested vote by the Beverly Hills City Council could guarantee a chauffeur for all residents in the near future. However, instead of a driver, the newly adopted program foresees municipally-owned driverless cars ready to order via a smartphone app.

      Also known as autonomous vehicles, or AV, driverless cars would appear to be the next big thing not only for people, but local governments as well – if the Beverly Hills City Council can get its AV development program past a few more hurdles, that is. The technology itself has some challenges ahead as well.

    • In the meantime, the conceptual shuttle service, which was unanimously approved at an April 5 city council meeting, is being celebrated.
    • Naming Google and Tesla in its press release, Beverly Hills must first develop a partnership with a manufacturer that can build it a fleet of unmanned cars. There will also be a need to bring in policy experts. All of these outside parties will have a chance to explore the program’s potential together at an upcoming community event.

      The Wallis Annenberg Center for the Performing Arts will host a summit this fall that will include expert lectures, discussions, and test drives. Er, test rides.

      Already in the works for Beverly Hills is a fiber optics cable network that will, in addition to providing high-speed internet access to all residents and businesses, one day be an integral part of a public transit system that runs on its users’ spontaneous desires.

      Obviously, Beverly Hills has some money on hand for the project, and it is also an ideal testing space as the city takes up an area of less than six square miles. Another positive factor is the quality of the city’s roads, which exceeds that of most in the greater Los Angeles area, not to mention California and the whole United States.

      “It can’t find the lane markings!” Volvo’s North American CEO, Lex Kerssemakers, complained to Los Angeles Mayor Eric Garcetti last month, according to Reuters. “You need to paint the bloody roads here!”

      Whether lanes are marked or signs are clear has made a big difference in how successfully the new technology works.

      Unfortunately, the US Department of Transportation considers 65 percent of US roads to be in poor condition, so driverless cars may not reach many Americans outside of Beverly Hills quite so soon.


Posted from Diigo. The rest of Open Web group favorite links are here.

Saturday, April 16, 2016

OpenStack 04/16/2016 (p.m.)

  • The Fourth Amendment argument that people have a right to know when their property has been searched or seized is particularly interesting to me. If adopted by the Courts, that could spell the end of surveillance gag orders. 

    Tags: surveillance state, Microsoft, DoJ, litigation, 4th-Amendment, 1st Amendment, gag-orders

    • “WE APPRECIATE THAT there are times when secrecy around a government warrant is needed,” Microsoft President Brad Smith wrote in a blog post on Thursday. “But based on the many secrecy orders we have received, we question whether these orders are grounded in specific facts that truly demand secrecy. To the contrary, it appears that the issuance of secrecy orders has become too routine.”

      With those words, Smith announced that Microsoft was suing the Department of Justice for the right to inform its customers when the government is reading their emails.

      The last big fight between the Justice Department and Silicon Valley was started by law enforcement, when the FBI demanded that Apple unlock a phone used by San Bernardino killer Syed Rizwan Farook.

      This time, Microsoft is going on the offensive. The move is welcomed by privacy activists as a step forward for transparency — though it also serves Microsoft’s business interests.

    • Secret government searches are eroding people’s trust in the cloud, Smith wrote — including large and small businesses now keeping massive amounts of records online. “The transition to the cloud does not alter people’s expectations of privacy and should not alter the fundamental constitutional requirement that the government must — with few exceptions — give notice when it searches and seizes private information or communications,” he wrote.

      According to the complaint, Microsoft received 5,624 federal demands for customer information or data in the past 18 months. Almost half — 2,576 — came with gag orders, and more than two-thirds of those — 1,752 — had “no fixed end date” by which Microsoft would no longer be sworn to secrecy.

      These requests, though signed off on by a judge, qualify as unconstitutional searches, the attorneys argue. It “violates both the Fourth Amendment, which affords people and businesses the right to know if the government searches or seizes their property, and the First Amendment, which enshrines Microsoft’s rights to talk to its customers and to discuss how the government conducts its investigations — subject only to restraints narrowly tailored to serve compelling government interests,” they wrote.


Posted from Diigo. The rest of Open Web group favorite links are here.

Tuesday, April 12, 2016

OpenStack 04/13/2016 (a.m.)

  • Tags: Panama-Papers, technology, open-source

    • Then we put the data up, but the problem with Solr was it didn’t have a user interface, so we used Project Blacklight, which is open source software normally used by librarians. We used it for the journalists. It’s simple because it allows you to do faceted search—so, for example, you can facet by the folder structure of the leak, by years, by type of file. There were more complex things—it supports queries in regular expressions, so the more advanced users were able to search for documents with a certain pattern of numbers that, for example, passports use. You could also preview and download the documents. ICIJ open-sourced the code of our document processing chain, created by our web developer Matthew Caruana Galizia.

      We also developed a batch-searching feature. So say you were looking for politicians in your country—you just run it through the system, and you upload your list to Blacklight and you would get a CSV back saying yes, there are matches for these names—not only exact matches, but also matches based on proximity. So you would say “I want Mar Cabra proximity 2” and that would give you “Mar Cabra,” “Mar whatever Cabra,” “Cabra, Mar,”—so that was good, because very quickly journalists were able to see… I have this list of politicians and they are in the data!
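    The proximity and pattern searches Cabra describes map directly onto Solr's query syntax, which Blacklight passes through to the index. A minimal sketch of what those query strings look like (illustrative only; ICIJ's actual Blacklight configuration isn't shown in the interview, and the field name `text` is an assumption):

```python
# Sketch of the Solr query syntax described in the interview.

def proximity_query(name: str, distance: int = 2) -> str:
    """Build a Solr proximity query: matches the quoted terms within
    `distance` words of each other, in any order."""
    return f'"{name}"~{distance}'

def regex_query(field: str, pattern: str) -> str:
    """Build a Solr regular-expression query against one field.
    Solr delimits regexes with forward slashes."""
    return f"{field}:/{pattern}/"

# "I want Mar Cabra proximity 2" from the interview:
print(proximity_query("Mar Cabra"))          # "Mar Cabra"~2

# A passport-like pattern: two letters followed by seven digits.
print(regex_query("text", "[A-Z]{2}[0-9]{7}"))
```

    A batch search is then just one such query per name on the uploaded list, with the hit counts written out to the CSV.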

    • Last Sunday, April 3, the first stories emerging from the leaked dataset known as the Panama Papers were published by a global partnership of news organizations working in coordination with the International Consortium of Investigative Journalists, or ICIJ. As we begin the second week of reporting on the leak, Iceland’s Prime Minister has been forced to resign, Germany has announced plans to end anonymous corporate ownership, governments around the world launched investigations into wealthy citizens’ participation in tax havens, the Russian government announced that the investigation was an anti-Putin propaganda operation, and the Chinese government banned mentions of the leak in Chinese media.

      As the ICIJ-led consortium prepares for its second major wave of reporting on the Panama Papers, we spoke with Mar Cabra, editor of ICIJ’s Data & Research unit and lead coordinator of the data analysis and infrastructure work behind the leak. In our conversation, Cabra reveals ICIJ’s years-long effort to build a series of secure communication and analysis platforms in support of genuinely global investigative reporting collaborations.

    • For communication, we have the Global I-Hub, which is a platform based on open source software called Oxwall. Oxwall is a social network, like Facebook, which has a wall when you log in with the latest in your network—it has forum topics, links, you can share files, and you can chat with people in real time.
    • We had the data in a relational database format in SQL, and thanks to ETL (Extract, Transform, and Load) software Talend, we were able to easily transform the data from SQL to Neo4j (the graph-database format we used). Once the data was transformed, it was just a matter of plugging it into Linkurious, and in a couple of minutes, you have it visualized—in a networked way, so anyone can log in from anywhere in the world. That was another reason we really liked Linkurious and Neo4j—they’re very quick when representing graph data, and the visualizations were easy to understand for everybody. The not-very-tech-savvy reporter could expand the docs like magic, and more technically expert reporters and programmers could use the Neo4j query language, Cypher, to do more complex queries, like show me everybody within two degrees of separation of this person, or show me all the connected dots…
    • We believe in open source technology and try to use it as much as possible. We used Apache Solr for the indexing and Apache Tika for document processing, and it’s great because it processes dozens of different formats and it’s very powerful. Tika interacts with Tesseract, so we did the OCRing on Tesseract.

      To OCR the images, we created an army of 30–40 temporary servers in Amazon that allowed us to process the documents in parallel and do parallel OCR-ing. If it was very slow, we’d increase the number of servers—if it was going fine, we would decrease because of course those servers have a cost.
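      The scaling pattern described here is a classic embarrassingly parallel fan-out: independent documents, independent workers, and a worker count you tune against the queue. A minimal sketch, with a thread pool standing in for the fleet of Amazon servers and a stub standing in for the real Tika/Tesseract worker:

```python
# Minimal sketch of the parallel OCR fan-out described above. The real
# pipeline ran Tesseract (via Tika) on 30-40 temporary Amazon servers;
# here a thread pool and a stub `ocr_page` stand in for both.
from concurrent.futures import ThreadPoolExecutor

def ocr_page(path: str) -> str:
    # Placeholder: a real worker would invoke Tesseract here, e.g.
    # subprocess.run(["tesseract", path, "stdout"], ...).
    return f"text extracted from {path}"

def ocr_batch(paths: list[str], workers: int = 8) -> list[str]:
    """OCR many documents in parallel. Raising `workers` is the
    thread-pool analogue of adding servers when the queue runs slow;
    lowering it is the analogue of shutting servers down to save cost."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(ocr_page, paths))

print(ocr_batch(["doc1.tif", "doc2.tif"], workers=2))
```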

    • For the visualization of the Mossack Fonseca internal database, we worked with another tool called Linkurious. It’s not open source, it’s licensed software, but we have an agreement with them, and they allowed us to work with it. It allows you to represent data in graphs. We had a version of Linkurious on our servers, so no one else had the data. It was pretty intuitive—journalists had to click on dots that expanded, basically, and could search the names.
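    The "two degrees of separation" query Cabra mentions is a natural fit for Cypher's variable-length path patterns. A sketch of what such a query might look like (the `Entity` label and untyped relationship are assumptions; the interview doesn't describe the Mossack Fonseca graph schema):

```python
# Build a Cypher query of the kind described in the interview:
# "show me everybody within two degrees of separation of this person."
def degrees_of_separation(name: str, degrees: int = 2) -> str:
    """Return a Cypher query matching every node reachable from the
    named entity over 1..`degrees` relationships of any type."""
    return (
        f'MATCH (p:Entity {{name: "{name}"}})-[*1..{degrees}]-(other) '
        f"RETURN DISTINCT other"
    )

print(degrees_of_separation("Mar Cabra"))
# MATCH (p:Entity {name: "Mar Cabra"})-[*1..2]-(other) RETURN DISTINCT other
```

    The point-and-click expansion in Linkurious amounts to running one-hop versions of the same pattern each time a reporter clicks a dot.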

Posted from Diigo. The rest of Open Web group favorite links are here.

Thursday, April 07, 2016

OpenStack 04/07/2016 (p.m.)

  • Tags: surveillance state, encryption, WhatsApp

    • For most of the past six weeks, the biggest story out of Silicon Valley was Apple’s battle with the FBI over a federal order to unlock the iPhone of a mass shooter. The company’s refusal touched off a searing debate over privacy and security in the digital age. But this morning, at a small office in Mountain View, California, three guys made the scope of that enormous debate look kinda small.

      Mountain View is home to WhatsApp, an online messaging service now owned by tech giant Facebook that has grown into one of the world’s most important applications. More than a billion people trade messages, make phone calls, send photos, and swap videos using the service. This means that only Facebook itself runs a larger self-contained communications network. And today, the enigmatic founders of WhatsApp, Brian Acton and Jan Koum, together with a high-minded coder and cryptographer who goes by the pseudonym Moxie Marlinspike, revealed that the company has added end-to-end encryption to every form of communication on its service.

    • This means that if any group of people uses the latest version of WhatsApp—whether that group spans two people or ten—the service will encrypt all messages, phone calls, photos, and videos moving among them. And that’s true on any phone that runs the app, from iPhones to Android phones to Windows phones to old school Nokia flip phones. With end-to-end encryption in place, not even WhatsApp’s employees can read the data that’s sent across its network. In other words, WhatsApp has no way of complying with a court order demanding access to the content of any message, phone call, photo, or video traveling through its service. Like Apple, WhatsApp is, in practice, stonewalling the federal government, but it’s doing so on a larger front—one that spans roughly a billion devices.
    • The FBI and the Justice Department declined to comment for this story. But many inside the government and out are sure to take issue with the company’s move. In late 2014, WhatsApp encrypted a portion of its network. In the months since, its service has apparently been used to facilitate criminal acts, including the terrorist attacks on Paris last year. According to The New York Times, as recently as this month, the Justice Department was considering a court case against the company after a wiretap order (still under seal) ran into WhatsApp’s end-to-end encryption.

      “The government doesn’t want to stop encryption,” says Joseph DeMarco, a former federal prosecutor who specializes in cybercrime and has represented various law enforcement agencies backing the Justice Department and the FBI in their battle with Apple. “But the question is: what do you do when a company creates an encryption system that makes it impossible for court-authorized search warrants to be executed? What is the reasonable level of assistance you should ask from that company?”
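      The reason WhatsApp cannot comply with such a wiretap order is the core property of end-to-end encryption: the key is derived independently on the two endpoints and never crosses the server. A toy Diffie-Hellman exchange shows the idea; this is a classroom-sized sketch, not WhatsApp's actual Signal Protocol, which adds authentication, larger standardized groups, and key ratcheting:

```python
# Toy illustration of why end-to-end encryption locks the provider out:
# each endpoint derives the same shared key, while the server only ever
# sees the public values, which are not enough to recover the key.
import hashlib

P = 2**127 - 1   # a Mersenne prime; real deployments use 2048-bit+ groups
G = 3

alice_secret, bob_secret = 0x1234, 0x5678   # private keys (toy values)
alice_public = pow(G, alice_secret, P)      # sent over the network
bob_public = pow(G, bob_secret, P)          # sent over the network

# Each side combines its own secret with the other's public value,
# then hashes the shared value down to a symmetric key.
alice_key = hashlib.sha256(str(pow(bob_public, alice_secret, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_public, bob_secret, P)).encode()).digest()

assert alice_key == bob_key   # same key on both endpoints
# The server relayed only alice_public and bob_public, so it cannot
# decrypt messages encrypted under the shared key -- and neither can
# anyone serving it with a court order.
```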


Posted from Diigo. The rest of Open Web group favorite links are here.

Saturday, April 02, 2016

OpenStack 04/02/2016 (p.m.)

  • The article is wide of the mark, based on analysis of Executive Branch policy rather than the governing law such as the Freedom of Information Act. And I still find it somewhat ludicrous that a third party with knowledge of the defect could succeed in convincing a court that knowledge of a defect in a company's product is trade-secret proprietary information. "Your honor, my client has discovered a way to break into Mr. Tim Cook's house without a key to his house. That is a valuable trade secret that this Court must keep Mr. Cook from learning." Pow! The Computer Fraud and Abuse Act makes it a crime to access a computer that can connect to the Internet by exploiting a software bug. 

    Tags: surveillance state, Apple, FBI, iPhone, encryption, FOIA, litigation

    • The FBI may be allowed to withhold information about how it broke into an iPhone belonging to a gunman in the December San Bernardino shootings, despite a U.S. government policy of disclosing technology security flaws discovered by federal agencies.

      Under the U.S. vulnerabilities equities process, the government is supposed to err in favor of disclosing security issues so companies can devise fixes to protect data. The policy has exceptions for law enforcement, and there are no hard rules about when and how it must be applied.

      Apple Inc has said it would like the government to share how it cracked the iPhone security protections. But the Federal Bureau of Investigation, which has been frustrated by its inability to access data on encrypted phones belonging to criminal suspects, might prefer to keep secret the technique it used to gain access to gunman Syed Farook's phone.

      The referee is likely to be a White House group formed during the Obama administration to review computer security flaws discovered by federal agencies and decide whether they should be disclosed.

    • Stewart Baker, former general counsel of the NSA and now a lawyer with Steptoe & Johnson, said the review process could be complicated if the cracking method is considered proprietary by the third party that assisted the FBI.

      Several security researchers have pointed to the Israel-based mobile forensics firm Cellebrite as the likely third party that helped the FBI. That company has repeatedly declined comment.


Posted from Diigo. The rest of Open Web group favorite links are here.