Wednesday, September 10. 2014

Project «SurPRISE – Surveillance, Privacy and Security: A large scale participatory assessment of criteria and factors determining acceptability and acceptance of security technologies in Europe»

About the Project

SurPRISE re-examines the relationship between security and privacy, which is commonly positioned as a ‘trade-off’. Where security measures and technologies involve the collection of information about citizens, questions arise as to whether and to what extent their privacy has been infringed. This infringement of individual privacy is sometimes seen as an acceptable cost of enhanced security. Similarly, it is assumed that citizens are willing to trade off their privacy for enhanced personal security in different settings. This common understanding of the security-privacy relationship, both at state and citizen level, has informed policymakers, legislative developments and best practice guidelines concerning security developments across the EU.

However, an emergent body of work questions the validity of the security-privacy trade-off. This work suggests that it has over-simplified how the impact of security measures on citizens is considered in current security policies and practices. Thus, the more complex issues underlying privacy concerns and public skepticism towards surveillance-oriented security technologies may not be apparent to legal and technological experts. In response to these developments, this project will consult with citizens from several EU member and associated states on the question of the security-privacy trade-off as they evaluate different security technologies and measures.

Next-gen HTTP Web acceleration protocol nears completion

Via PCWorld -----
When it comes to speeding up Web traffic over the Internet, sometimes too much of a good thing may not be such a good thing at all.

The Internet Engineering Task Force is putting the final touches on HTTP/2, the second version of the Hypertext Transfer Protocol (HTTP). The working group has issued a last-call draft, urging interested parties to voice concerns before it becomes a full Internet specification.

Not everyone is completely satisfied with the protocol, however. “There is a lot of good in this proposed standard, but I have some deep reservations about some bad and ugly aspects of the protocol,” wrote Greg Wilkins, lead developer of the open source Jetty server software, noting his concerns in a blog item posted Monday.

Others, however, praise HTTP/2 and say it is long overdue. “A lot of our users are experimenting with the protocol,” said Owen Garrett, head of products for server software provider NGINX. “The feedback is that generally, they have seen big performance benefits.”

First created by Web originator Tim Berners-Lee and associates, HTTP quite literally powers today’s Web, providing the language for a browser to request a Web page from a server.

SPDY is speediest

Version 2.0 of HTTP, based largely on the SPDY protocol developed by Google, promises to be a better fit for how people use the Web. “The challenge with HTTP is that it is a fairly simple protocol, and it can be quite laborious to download all the resources required to render a Web page. SPDY addresses this issue,” Garrett said.

While the first generation of Web sites were largely simple and relatively small, static documents, the Web today is used as a platform for delivering applications and bandwidth-intensive real-time multimedia content.

HTTP/2 speeds basic HTTP in a number of ways. HTTP/2 allows servers to send all the different elements of a requested Web page at once, eliminating the serial sets of messages that have to be sent back and forth under plain HTTP.
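The difference between serial delivery and HTTP/2-style interleaving can be sketched in a few lines of Python. This is an illustrative simulation, not a real protocol implementation: the resource names and the tiny frame size are made up for the demo.

```python
# Illustrative simulation of HTTP/2 frame interleaving (not a real client).
# Under plain HTTP each resource is delivered in full before the next starts;
# under HTTP/2 every response is split into frames tagged by stream, and the
# frames are interleaved on a single connection.

FRAME_SIZE = 4  # bytes per DATA frame, deliberately tiny for the demo

def serial_delivery(resources):
    """HTTP/1.x style: one complete response after another."""
    return [(name, body) for name, body in resources.items()]

def multiplexed_delivery(resources):
    """HTTP/2 style: chunk each body into frames and interleave streams."""
    streams = {name: [body[i:i + FRAME_SIZE]
                      for i in range(0, len(body), FRAME_SIZE)]
               for name, body in resources.items()}
    wire = []
    while any(streams.values()):
        for name, frames in streams.items():
            if frames:
                wire.append((name, frames.pop(0)))  # round-robin one frame
    return wire

page = {"index.html": b"<html>...</html>", "app.js": b"console.log(1)"}
serial = serial_delivery(page)
muxed = multiplexed_delivery(page)

# Reassembly: concatenating each stream's frames restores the original body.
rebuilt = {}
for name, chunk in muxed:
    rebuilt[name] = rebuilt.get(name, b"") + chunk
```

The point of the sketch is the ordering on the wire: in `serial`, the second resource cannot start until the first finishes, while in `muxed` frames from both streams alternate from the outset.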
HTTP/2 also allows the server and the browser to compress HTTP headers, which cuts the amount of data that needs to be communicated between the two. As a result, HTTP/2 “is really useful for organizations with sophisticated Web sites, particularly when its users are distributed globally or using slower networks—mobile users for instance,” Garrett said.

While enthusiastic about the protocol, Wilkins did have several areas of concern. For instance, HTTP/2 could make it more difficult to incorporate new Web protocols, most notably the communications protocol WebSocket, Wilkins asserted.

Wilkins noted that HTTP/2 blurs what were previously two distinct layers of HTTP—the semantic layer, which describes functionality, and the framing layer, which defines the structure of the message. The idea is that it is simpler to write protocols for a specification with discrete layers. The protocol also makes it possible to hide content, including malicious content, within the headers, bypassing the notice of today’s firewalls, Wilkins said.

Servers need more power

HTTP/2 could also put a lot more strain on existing servers, Wilkins noted, given that they will now be fielding many more requests at once. HTTP/2 “clients will send requests much more quickly, and it is quite likely you will see spikier traffic as a result,” Garrett agreed. As a result, a Web application, if it doesn’t already rely on caching or load balancing, may have to do so with HTTP/2, Garrett said.

The SPDY protocol is already used by almost 1 percent of all websites, according to an estimate from survey company W3Techs. NGINX has been a big supporter of SPDY and HTTP/2, not surprising given that the company’s namesake server software was designed for high-traffic websites. Approximately 88 percent of sites that offer SPDY do so with NGINX, according to W3Techs.
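The header compression mentioned above (HPACK in HTTP/2) works largely by indexing: a header sent once is entered into a table shared by both ends, and later repeats are sent as a small integer reference instead of the full name/value pair. A toy sketch of the indexing idea — greatly simplified, with real HPACK's static table and Huffman coding left out:

```python
# Toy sketch of index-based header compression in the spirit of HPACK.
# Repeated headers collapse to small integer references into a table that
# the encoder and decoder grow in lockstep.

class ToyEncoder:
    def __init__(self):
        self.table = []  # grows identically on both ends

    def encode(self, headers):
        out = []
        for pair in headers:
            if pair in self.table:
                out.append(("idx", self.table.index(pair)))  # tiny reference
            else:
                self.table.append(pair)
                out.append(("lit", pair))  # full literal, added to the table
        return out

class ToyDecoder:
    def __init__(self):
        self.table = []  # mirror of the encoder's table

    def decode(self, encoded):
        headers = []
        for kind, value in encoded:
            if kind == "idx":
                headers.append(self.table[value])
            else:
                self.table.append(value)
                headers.append(value)
        return headers

enc, dec = ToyEncoder(), ToyDecoder()
request = [("host", "example.com"), ("user-agent", "demo/1.0")]
first = enc.encode(request)   # all literals: the table starts empty
second = enc.encode(request)  # all small index references
```

On the second request every header is an index, which is why repeated requests to the same site — the common case on the Web — shrink dramatically.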
Yet NGINX has characterized SPDY to its users as “experimental,” Garrett said, largely because the technology is still evolving and hasn’t been nailed down yet by the formal specification. “We’re really looking forward to when the protocol is rubber-stamped,” Garrett said. Once HTTP/2 is approved, “we can recommend it to our customers with confidence,” Garrett said.
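For the NGINX users experimenting with the protocol, enabling SPDY at the time was essentially a one-parameter change to the `listen` directive. A hypothetical server block as a sketch — the names and paths are placeholders, and this assumes an nginx build with the SPDY module (the `spdy` parameter was later superseded by `http2`):

```nginx
# Hypothetical server block; server name and certificate paths are placeholders.
server {
    listen 443 ssl spdy;            # 'spdy' was later replaced by 'http2'
    server_name example.com;
    ssl_certificate     /etc/nginx/example.crt;
    ssl_certificate_key /etc/nginx/example.key;
}
```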
Tuesday, September 09. 2014

Mimosa Networks Launches Its First Gigabit Wireless Products

Via Tech Crunch -----

Mimosa Networks is finally ready to help make gigabit wireless technology a reality. The company, which recently came out of stealth, is launching a series of products that it hopes to sell to a new generation of wireless ISPs. Its wireless Internet products include its new B5 Backhaul radio hardware and its Mimosa Cloud Services planning and analytics offering. By using the two in combination, new ISPs can build high-capacity wireless networks at a fraction of the cost it would take to lay a fiber network.

The B5 backhaul radio is a piece of hardware that uses multiple-input and multiple-output (MIMO) technology to provide up to 16 streams and 4 Gbps of output when multiple radios are using the same channel. With a single B5 radio, customers can provide a gigabit of throughput for up to eight or nine miles, according to co-founder and chief product officer Jaime Fink. The longer the distance, the less bandwidth is available, of course. But Fink said the company is running one link of about 60 miles that still gets several hundred megabits of throughput along the California coast.

Not only does the product offer high data speeds on 5 GHz wireless spectrum, but it also makes that spectrum more efficient. It uses spectrum analysis and load balancing to optimize bandwidth, frequency, and power use based on historical and real-time data to adapt to wireless interference and other issues.

In addition to the hardware, Mimosa’s cloud services will help customers plan and deploy networks with analytics tools to determine how powerfully and efficiently their existing equipment is running. That will enable new ISPs to more effectively determine where to place new hardware to link up with other base stations. The product also is designed to support networks as they grow, and it makes sure that ISPs can spot problems as they happen.
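The distance/throughput tradeoff Fink describes falls out of basic radio math: received power drops with distance (free-space path loss), and Shannon capacity drops with the resulting signal-to-noise ratio. A rough, illustrative Python sketch — the transmit EIRP, receive antenna gain, and 80 MHz channel below are assumptions for the demo, not Mimosa's published figures:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (standard formula for km/MHz units)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def shannon_capacity_bps(distance_km, bandwidth_hz=80e6, freq_mhz=5000,
                         eirp_dbm=50, rx_gain_dbi=20):
    """Rough Shannon upper bound on link throughput at a given distance."""
    rx_dbm = eirp_dbm - fspl_db(distance_km, freq_mhz) + rx_gain_dbi
    noise_dbm = -174 + 10 * math.log10(bandwidth_hz)  # thermal noise floor
    snr_linear = 10 ** ((rx_dbm - noise_dbm) / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

short_link = shannon_capacity_bps(14.5)   # ~9 miles
long_link = shannon_capacity_bps(96.5)    # ~60 miles
```

With these assumed numbers the ~9-mile link lands near a gigabit and the ~60-mile link at several hundred megabits — the same ballpark as the figures quoted in the article, though a real link budget depends on antennas, interference, and modulation limits.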
The Cloud Services product is available now, but the backhaul radio will be available for about $900 later this fall. Mimosa is launching these first products after raising $38 million in funding from New Enterprise Associates and Oak Investment Partners, including a recently closed $20 million Series C round led by returning investor NEA. The company was founded by Brian Hinman, who had previously co-founded PictureTel, Polycom, and 2Wire, along with Fink, who previously served as CTO of 2Wire and SVP of technology for Pace after it acquired the home networking equipment company. Now they’re hoping to make wireless gigabit speeds available for new ISPs.
Monday, September 08. 2014

Touch+ transforms any surface into a multitouch gesture controller for your PC or Mac

Via PCWorld -----
After Leap Motion's somewhat disappointing debut, you'd be forgiven for wanting to wave off the idea of third-party gesture control peripherals. But wait! Unlike Leap, Reactiv isn't trying to revolutionize human-computer interactions with its Touch+ controller—there's no wizard-like finger waggling or Minority Report-style hand waving here. Instead, the Touch+'s dual cameras turn any surface into a multitouch input device.
Touch+ was born out of Haptix, a Kickstarter project that raised more than $180,000 from backers. Over the past year, Reactiv refined the Haptix vision to eventually become Touch+.

While Touch+ certainly won't be for everyone, Reactiv is positioning the multitouch PC controller as more than a mere tool for games and art projects. The video above shows the device being used in an office meeting, acting as a cursor control for a businessman's laptop before being repositioned on the fly to point at a projected display, instantly allowing the man to reach up with his hands to circle objects on the image. What's more, PCWorld sister site CITEWorld managed to snag a live demo with Touch+, and the founders focused on the potential productivity uses of the device: enabling mouse-free control of Excel and PowerPoint, naturally manipulating pictures in Photoshop, creating designs in CAD, the aforementioned presentation capabilities, and so forth.

[Image: The Touch+ motion controller connected to the top of a notebook.]

The Touch+ works with Windows PCs or Macs, connecting via USB 2.0 or 3.0. If you choose to point it at your keyboard, the device will temporarily suspend its multitouch capabilities while you type, then resume when your fingers stop bobbing up and down. Sound interesting? An alpha version of Touch+ is available now on the Reactiv website for $75. Until we get our own hands on the device, however, we won't know for sure how the device stacks up to competitors like the Leap Motion.
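The keyboard-suspend behavior described above amounts to simple idle detection: gesture tracking pauses whenever a keystroke is recent, and resumes after a quiet interval. A hypothetical sketch of that logic — the half-second threshold and class names are invented for illustration, not Reactiv's implementation:

```python
QUIET_PERIOD_S = 0.5  # assumed: how long after the last keystroke to wait

class GestureGate:
    """Suspends gesture input while the user is typing (illustrative only)."""

    def __init__(self):
        self.last_keystroke = float("-inf")  # no typing seen yet

    def on_keystroke(self, now):
        self.last_keystroke = now

    def gestures_enabled(self, now):
        # Gestures resume once the quiet period has elapsed since typing.
        return (now - self.last_keystroke) >= QUIET_PERIOD_S

gate = GestureGate()
idle_before = gate.gestures_enabled(now=0.0)   # True: no typing yet
gate.on_keystroke(now=1.0)
mid_typing = gate.gestures_enabled(now=1.2)    # False: still typing
after_quiet = gate.gestures_enabled(now=1.6)   # True: quiet period elapsed
```

The same pattern (a timestamp plus a timeout) shows up in palm-rejection and touchpad-disable-while-typing features across many input devices.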
Friday, September 05. 2014

The Internet of Things is here and there -- but not everywhere yet

Via PCWorld -----
The Internet of Things is still too hard. Even some of its biggest backers say so. For all the long-term optimism at the M2M Evolution conference this week in Las Vegas, many vendors and analysts are starkly realistic about how far the vaunted set of technologies for connected objects still has to go. IoT is already saving money for some enterprises and boosting revenue for others, but it hasn’t hit the mainstream yet. That’s partly because it’s too complicated to deploy, some say. For now, implementations, market growth and standards are mostly concentrated in specific sectors, according to several participants at the conference who would love to see IoT span the world. Cisco Systems has estimated IoT will generate $14.4 trillion in economic value between last year and 2022. But Kevin Shatzkamer, a distinguished systems architect at Cisco, called IoT a misnomer, for now. “I think we’re pretty far from envisioning this as an Internet,” Shatzkamer said. “Today, what we have is lots of sets of intranets.” Within enterprises, it’s mostly individual business units deploying IoT, in a pattern that echoes the adoption of cloud computing, he said.
In the past, most of the networked machines in factories, energy grids and other settings have been linked using custom-built, often local networks based on proprietary technologies. IoT links those connected machines to the Internet and lets organizations combine those data streams with others. It’s also expected to foster an industry that’s more like the Internet, with horizontal layers of technology and multivendor ecosystems of products.

What’s holding back the Internet of Things

The good news is that cities, utilities, and companies are getting more familiar with IoT and looking to use it. The less good news is that they’re talking about limited IoT rollouts for specific purposes. “You can’t sell a platform, because a platform doesn’t solve a problem. A vertical solution solves a problem,” Shatzkamer said. “We’re stuck at this impasse of working toward the horizontal while building the vertical.”

“We’re no longer able to just go in and sort of bluff our way through a technology discussion of what’s possible,” said Rick Lisa, Intel’s group sales director for Global M2M. “They want to know what you can do for me today that solves a problem.”

One of the most cited examples of IoT’s potential is the so-called connected city, where myriad sensors and cameras will track the movement of people and resources and generate data to make everything run more efficiently and openly. But now, the key is to get one municipal project up and running to prove it can be done, Lisa said.

[Image: ThroughTek, based in China, used a connected fan to demonstrate an Internet of Things device management system at the M2M Evolution conference in Las Vegas this week.]

The conference drew stories of many successful projects: A system for tracking construction gear has caught numerous workers on camera walking off with equipment and led to prosecutions. Sensors in taxis detect unsafe driving maneuvers and alert the driver with a tone and a seat vibration, then report it to the taxi company.
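The taxi example boils down to thresholding sensor readings: accelerometer samples above some limit trigger an in-cab alert and a report back to the company. A minimal illustrative sketch — the 0.4 g threshold and field names are assumptions, not details from any vendor's system:

```python
HARSH_BRAKE_G = 0.4  # assumed threshold for longitudinal deceleration, in g

def classify_samples(decel_g_samples):
    """Flag (timestamp, deceleration) samples that exceed the threshold."""
    events = []
    for t, g in decel_g_samples:
        if g >= HARSH_BRAKE_G:
            # In a real system this would sound a tone, vibrate the seat,
            # and queue a report; here we just record the event.
            events.append({"time": t, "decel_g": g, "action": "alert+report"})
    return events

trip = [(0.0, 0.05), (1.0, 0.12), (2.0, 0.55), (3.0, 0.08)]
events = classify_samples(trip)
```

Production systems add smoothing and context (speed, GPS) to avoid false alarms, but the core is the same threshold-and-report loop.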
Major League Baseball is collecting gigabytes of data about every moment in a game, providing more information for fans and teams. But for the mass market of small and medium-size enterprises that don’t have the resources to do a lot of custom development, even targeted IoT rollouts are too daunting, said analyst James Brehm, founder of James Brehm & Associates.

There are software platforms that pave over some of the complexity of making various devices and applications talk to each other, such as the Omega DevCloud, which RacoWireless introduced on Tuesday. The DevCloud lets developers write applications in the language they know and make those apps work on almost any type of device in the field, RacoWireless said. ThingWorx, Xively and Gemalto also offer software platforms that do some of the work for users.

But the various platforms on offer from IoT specialist companies are still too fragmented for most customers, Brehm said. There are too many types of platforms—for device activation, device management, application development, and more. “The solutions are too complex.” He thinks that’s holding back the industry’s growth. Though the past few years have seen rapid adoption in certain industries in certain countries, sometimes promoted by governments—energy in the U.K., transportation in Brazil, security cameras in China—the IoT industry as a whole is only growing by about 35 percent per year, Brehm estimates. That’s a healthy pace, but not the steep “hockey stick” growth that has made other Internet-driven technologies ubiquitous, he said.

What lies ahead

Brehm thinks IoT is in a period where customers are waiting for more complete toolkits to implement it—essentially off-the-shelf products—and the industry hasn’t consolidated enough to deliver them. More companies have to merge, and it’s not clear when that will happen, he said. “I thought we’d be out of it by now,” Brehm said.
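The "write once, run on almost any device" pitch of platforms like Omega DevCloud is, at heart, an adapter layer: application code talks to one interface, and per-vendor adapters translate to each device's protocol. A hypothetical sketch of that pattern — the class and method names are invented for illustration, not RacoWireless's actual API:

```python
from abc import ABC, abstractmethod

class DeviceAdapter(ABC):
    """The single interface application code targets, regardless of vendor."""

    @abstractmethod
    def send(self, command: str) -> str:
        """Translate a generic command into this device's wire format."""

class VendorAModem(DeviceAdapter):
    def send(self, command):
        return f"AT+{command}"            # e.g. an AT-command style modem

class VendorBTracker(DeviceAdapter):
    def send(self, command):
        return f'{{"cmd": "{command}"}}'  # e.g. a JSON-over-TCP device

def activate(device: DeviceAdapter):
    """Application logic stays identical across vendors."""
    return device.send("ACTIVATE")

wire_a = activate(VendorAModem())
wire_b = activate(VendorBTracker())
```

The fragmentation Brehm describes shows up when each platform defines its own version of `DeviceAdapter`, so apps become portable across devices but not across platforms.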
What’s hard about consolidation is partly what’s hard about adoption, in that IoT is a complex set of technologies, he said. And don’t count on industry standards to simplify everything. IoT’s scope is so broad that no single standard could cover all of it, analysts said. The industry is evolving too quickly for traditional standards processes, which are often mired in industry politics, to keep up, according to Andy Castonguay, an analyst at IoT research firm Machina.
Instead, individual industries will set their own standards while software platforms such as Omega DevCloud help to solve the broader fragmentation, Castonguay believes. Even the Industrial Internet Consortium, formed earlier this year to bring some coherence to IoT for conservative industries such as energy and aviation, plans to work with existing standards from specific industries rather than write its own. Ryan Martin, an analyst at 451 Research, compared IoT standards to human languages. “I’d be hard pressed to say we are going to have one universal language that everyone in the world can speak,” and even if there were one, most people would also speak a more local language, Martin said.
Posted by Christian Babski in Hardware, Innovation&Society, Network, Programming at 12:14
Defined tags for this entry: hardware, innovation&society, network, programming, sensors, smartobject
Google flushes out users of old browsers by serving up CLUNKY, AGED version of search

Via The Register -----
Google is attempting to shunt users away from old browsers by intentionally serving up a stale version of the ad giant's search homepage to those holdouts. The tactic appears to be falling in line with Mountain View's policy on its other Google properties, such as Gmail, which the company declines to fully support on aged browsers. However, it was claimed on Friday in a Google discussion thread that the multinational had unceremoniously dumped a past-its-sell-by-date version of the Larry Page-run firm's search homepage on those users who have declined to upgrade their Opera and Safari browsers. A user with the moniker DJSigma wrote on the forum:
In a later post, DJSigma added that there seemed to be a glitch on Google search.
The Opera user then said that the problem appeared to be "intermittent". Others flagged up similar issues on the Google forum and said they hoped it was just a bug. While someone going by the name MadFranko008 added:
Some Safari 5.1.x and Opera 12.x netizens were able to fudge the system by customising their browser's user agent. But others continued to complain about Google's "clunky", old search homepage. A Google employee, meanwhile, said that the tactic was deliberate in a move to flush out stick-in-the-mud types who insisted on using older versions of browsers. "Thanks for the reports. I want to assure you this isn't a bug, it's working as intended," said a Google worker going by the name nealem. She added:
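The workaround those users applied — presenting a different user agent so Google serves the modern page — can be illustrated with Python's standard library. The UA string below is an example of a then-current Chrome identifier, not the exact string those users set:

```python
import urllib.request

# Pretend to be a modern browser by overriding the User-Agent header.
SPOOFED_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 Chrome/37.0"

req = urllib.request.Request(
    "https://www.google.com/",
    headers={"User-Agent": SPOOFED_UA},
)
# Note: urllib normalizes header names to capitalized form internally,
# so the header is stored and looked up as "User-agent".
```

Browsers expose the same knob through a preference or developer-tools setting, which is how the Safari 5.1.x and Opera 12.x users "fudged the system" without any code.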
In a separate thread, as spotted by a Reg reader who brought this sorry affair to our attention, user MadFranko008 was able to show that even modern browsers - including the current version of Chrome - were apparently spitting out glitches on Apple Mac computers. Google then appeared to have resolved the search "bug" spotted in Chrome.