Many of you probably think that the National Security Agency (NSA) and open-source software get along like a house on fire. That's to say, flaming destruction. You would be wrong.
In partnership with the Apache Software Foundation, the NSA announced on Tuesday that it is releasing the source code for Niagarafiles (Nifi). The spy agency said that Nifi "automates data flows among multiple computer networks, even when data formats and protocols differ".
Details on how Nifi does this are scant at this point, while the ASF continues to set up the site where Nifi's code will reside.
In a statement, Nifi's lead developer Joseph L Witt said the software "provides a way to prioritize data flows more effectively and get rid of artificial delays in identifying and transmitting critical information".
The NSA is making this move because, according to Linda L Burger, director of the NSA's Technology Transfer Program (TTP), the agency's research projects "often have broad, commercial applications".
"We use open-source releases to move technology from the lab to the marketplace, making state-of-the-art technology more widely available and aiming to accelerate U.S. economic growth," she added.
The NSA has long worked hand-in-glove with open-source projects. Indeed, Security-Enhanced Linux (SELinux), which is used for top-level security in all enterprise Linux distributions — Red Hat Enterprise Linux, SUSE Linux Enterprise Server, and Debian Linux included — began as an NSA project.
More recently, the NSA created Accumulo, a NoSQL database store that's now supervised by the ASF.
More NSA technologies are expected to be open sourced soon. After all, as the NSA pointed out: "Global reviews and critiques that stem from open-source releases can broaden a technology's applications for the US private sector and for the good of the nation at large."
But the TechAeris report suggests copresence will do more than what
GCM does today. A source that scanned the latest Google Play
Services build found numerous references to copresence, along with
images alluding to peer-to-peer file transfers, similar to how Apple’s
AirDrop works between iOS and OS X. One of the images in the software
appears to be song sharing between an iPhone and an Android device.
Copresence could even be location-based, where the appropriate app
could tell you when other friends are nearby and in range for transfers
or direct communications. Obviously, without an official Google
announcement we don’t know what this project is about. All of the
evidence so far suggests, however, that a new cross-platform
communication method is in the works.
A two-stage attack could allow spies to sneak secrets out of the most
sensitive buildings, even when the targeted computer system is not
connected to any network, researchers from Ben-Gurion University of the
Negev in Israel stated in an academic paper describing the refinement of
an existing attack.
The technique,
called AirHopper, assumes that an attacker has already compromised the
targeted system and desires to occasionally sneak out sensitive or
classified data. Known as exfiltration, such occasional communication is
difficult to maintain, because government technologists frequently
separate the most sensitive systems from the public Internet for
security. Known as an air gap, such a defensive measure makes it much
more difficult for attackers to compromise systems or communicate with
infected systems.
Yet, by using a program to create a radio signal with a computer’s
video card—a technique known for more than a decade—and a smartphone
capable of receiving FM signals, an attacker could collect data from
air-gapped devices, a group of four researchers wrote in a paper
presented last week at the IEEE 9th International Conference on Malicious and Unwanted Software (MALCON).
“Such technique can be used potentially by people and organizations
with malicious intentions and we want to start a discussion on how to
mitigate this newly presented risk,” Dudu Mimran, chief technology
officer for the cyber security labs at Ben-Gurion University, said in a
statement.
For the most part, the attack is a refinement of existing techniques. Intelligence agencies have long known—since at least 1985—that
electromagnetic signals could be intercepted from computer monitors to
reconstitute the information being displayed. Open-source projects have turned monitors into radio-frequency transmitters.
And, from the information leaked by former contractor Edward Snowden,
the National Security Agency appears to use radio-frequency devices implanted in various computer-system components to transmit information and exfiltrate data.
AirHopper uses off-the-shelf components, however, to achieve the same
result. By using a smartphone with an FM receiver, the exfiltration
technique can grab data from nearby systems and send it to a waiting
attacker once the smartphone is again connected to a public network.
“This is the first time that a mobile phone is considered in an
attack model as the intended receiver of maliciously crafted radio
signals emitted from the screen of the isolated computer,” the group
said in its statement on the research.
The technique works at a distance of 1 to 7 meters, but can only send
data at very slow rates—less than 60 bytes per second, according to the
researchers.
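To put that ceiling in perspective, here is a quick back-of-the-envelope calculation (my arithmetic, not the researchers'; the example payload sizes are arbitrary) of how long exfiltration would take at 60 bytes per second:

```python
# Back-of-the-envelope exfiltration times at AirHopper's reported ceiling.
# The 60 B/s figure comes from the researchers; the payload sizes below are
# arbitrary examples chosen for illustration.

RATE_BPS = 60  # bytes per second

payloads = {
    "a 2 KB password file": 2_000,
    "a 100 KB document": 100_000,
    "a 5 MB spreadsheet": 5_000_000,
}

for label, size_bytes in payloads.items():
    minutes = size_bytes / RATE_BPS / 60
    print(f"Exfiltrating {label}: roughly {minutes:,.1f} minutes")
```

Even modest files take minutes to hours at that rate, which fits the researchers' framing of the channel as a way to occasionally leak small amounts of sensitive data rather than to steal it in bulk.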
Engineers at Stanford University have developed a tiny radio that’s
about as big as an ant and that’s cheap and small enough that it could
help realize the “Internet of things”—the world of everyday objects that
send and receive data via the Internet.
The radio is built on a piece of silicon that measures just a few
millimeters on each side. Several tens of them can fit on the top of a
U.S. penny and the radio itself is expected to cost only a few pennies
to manufacture in mass quantities.
Part of the secret to the radio’s size is its lack of a battery. Its
power requirements are sufficiently frugal that it can harvest the
energy it needs from nearby radio fields, such as those from a reader
device when it’s brought nearby.
RFID tags and contactless smartcards can get their power the same way,
drawing energy from a radio source, but Stanford’s radio has more
processing power than those simpler devices, a university representative
said. That means it could query a sensor for its data, for instance,
and transmit it when required.
The device operates in the 24GHz and 60GHz bands, suitable for communications over a few tens of centimeters.
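The behaviour described above (stay dormant until the reader's field has delivered enough energy, then answer a query by sampling a sensor and transmitting the reading) can be sketched as a tiny simulation. This is purely a conceptual illustration in Python; every name and number below is invented and does not describe the actual Stanford design.

```python
import random

# Conceptual simulation of a battery-less, query-driven radio node: it stays
# idle until a nearby reader's RF field has charged it, then answers a query
# by sampling a sensor and transmitting the reading. All names, numbers, and
# behaviour are invented for illustration.

WAKE_THRESHOLD_UJ = 5.0  # hypothetical energy budget for one query/reply cycle


class HarvestingNode:
    def __init__(self):
        self.stored_uj = 0.0  # energy harvested so far, in microjoules

    def harvest(self, field_power_uw, dt_s):
        """Accumulate energy while the reader's field is present."""
        self.stored_uj += field_power_uw * dt_s

    def handle_query(self):
        """Reply to the reader only if enough energy has been harvested."""
        if self.stored_uj < WAKE_THRESHOLD_UJ:
            return None                      # not enough energy: stay silent
        self.stored_uj -= WAKE_THRESHOLD_UJ  # spend the budget on this cycle
        reading = 20.0 + random.random()     # stand-in for a real sensor read
        return {"sensor": "temperature_c", "value": round(reading, 2)}


node = HarvestingNode()
for tick in range(10):
    node.harvest(field_power_uw=1.0, dt_s=1.0)  # reader field charging the node
    print(f"t={tick}s reply={node.handle_query()}")
```

The point of the sketch is the ordering: harvest first, respond only when the energy budget allows, which is what separates this class of device from a battery-powered sensor node.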
Engineers envisage a day when trillions of objects are connected via
tiny radios to the Internet. Data from the devices is expected to help
realize smarter and more energy-efficient homes, although quite how it
will all work is yet to be figured out. Radios like the one from
Stanford should help greatly expand the number of devices that can
collect and share data.
The radio was demonstrated by Amin Arbabian, an assistant professor of
electrical engineering at Stanford and one of the developers of the
device, at the recent VLSI Technology and Circuits Symposium in Hawaii.
Mimosa Networks is
finally ready to help make gigabit wireless technology a reality. The
company, which recently came out of stealth, is launching a series of
products that it hopes to sell to a new generation of wireless ISPs.
Its wireless Internet products include its new B5 Backhaul radio
hardware and its Mimosa Cloud Services planning and analytics offering.
By using the two in combination, new ISPs can build high-capacity
wireless networks at a fraction of the cost it would take to lay a fiber
network.
The B5 backhaul radio is a piece of hardware that uses multiple-input and multiple-output (MIMO) technology to provide up to 16 streams and 4 Gbps of output when multiple radios are using the same channel.
With a single B5 radio, customers can provide a gigabit of throughput
for up to eight or nine miles, according to co-founder and chief
product officer Jaime Fink. The longer the distance, the less bandwidth
is available, of course. But Fink said the company is running one link
of about 60 miles that still gets several hundred megabits of
throughput along the California coast.
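As a rough sanity check on those figures (my arithmetic, not Mimosa's spec sheet, and the long-link throughput is an assumption since the article only says "several hundred megabits"):

```python
# Quick arithmetic on the figures quoted above; mine, not Mimosa's spec sheet.

aggregate_gbps = 4.0   # "up to 16 streams and 4 Gbps of output"
streams = 16
print(f"Per-stream share of the aggregate: {aggregate_gbps * 1000 / streams:.0f} Mbps")

# ~1 Gbps at eight or nine miles versus "several hundred megabits" at ~60 miles
# (assume 300 Mbps for the long link, purely for illustration).
short_link_miles, short_link_mbps = 8.5, 1000
long_link_miles, long_link_mbps = 60, 300
print(f"The ~{long_link_miles / short_link_miles:.0f}x longer coastal link keeps "
      f"about {long_link_mbps / short_link_mbps:.0%} of the gigabit throughput")
```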
Not only does the product offer high data speeds on 5 GHz wireless
spectrum, but it also makes that spectrum more efficient. It uses
spectrum analysis and load balancing to optimize bandwidth, frequency,
and power use based on historical and real-time data to adapt to
wireless interference and other issues.
In addition to the hardware, Mimosa’s cloud services will help
customers plan and deploy networks with analytics tools to determine how
efficiently and effectively their existing equipment is running. That will
enable new ISPs to more effectively determine where to place new
hardware to link up with other base stations.
The product also is designed to support networks as they grow, and it
makes sure that ISPs can spot problems as they happen. The Cloud
Services product is available now, but the backhaul radio will be
available for about $900 later this fall.
Mimosa
is launching these first products after raising $38 million in funding
from New Enterprise Associates and Oak Investment Partners. That
includes a $20 million Series C round led by return investor NEA that
was recently closed.
The company was founded by Brian Hinman,
who had previously co-founded PictureTel, Polycom, and 2Wire, along
with Fink, who previously served as CTO of 2Wire and SVP of technology
for Pace after it acquired the home networking equipment company. Now
they’re hoping to make wireless gigabit speeds available for new ISPs.
The Internet of Things is still too hard. Even some of its biggest backers say so.
For all the long-term optimism at the M2M Evolution conference this week
in Las Vegas, many vendors and analysts are starkly realistic about how
far the vaunted set of technologies for connected objects still has to
go. IoT is already saving money for some enterprises and boosting
revenue for others, but it hasn’t hit the mainstream yet. That’s partly
because it’s too complicated to deploy, some say.
For now, implementations, market growth and standards are mostly
concentrated in specific sectors, according to several participants at
the conference who would love to see IoT span the world.
Cisco Systems has estimated
IoT will generate $14.4 trillion in economic value between last year
and 2022. But Kevin Shatzkamer, a distinguished systems architect at
Cisco, called IoT a misnomer, for now.
“I think we’re pretty far from envisioning this as an Internet,”
Shatzkamer said. “Today, what we have is lots of sets of intranets.”
Within enterprises, it’s mostly individual business units deploying IoT,
in a pattern that echoes the adoption of cloud computing, he said.
In the past, most of the networked machines in factories, energy grids
and other settings have been linked using custom-built, often local
networks based on proprietary technologies. IoT links those connected
machines to the Internet and lets organizations combine those data
streams with others. It’s also expected to foster an industry that’s
more like the Internet, with horizontal layers of technology and
multivendor ecosystems of products.
What’s holding back the Internet of Things
The good news is that cities, utilities, and companies are getting more
familiar with IoT and looking to use it. The less good news is that
they’re talking about limited IoT rollouts for specific purposes.
“You can’t sell a platform, because a platform doesn’t solve a problem. A
vertical solution solves a problem,” Shatzkamer said. “We’re stuck at
this impasse of working toward the horizontal while building the
vertical.”
“We’re no longer able to just go in and sort of bluff our way through a
technology discussion of what’s possible,” said Rick Lisa, Intel’s group
sales director for Global M2M. “They want to know what you can do for
me today that solves a problem.”
One of the most cited examples of IoT’s potential is the so-called
connected city, where myriad sensors and cameras will track the movement
of people and resources and generate data to make everything run more
efficiently and openly. But now, the key is to get one municipal project
up and running to prove it can be done, Lisa said.
The conference drew stories of many successful projects: A system for
tracking construction gear has caught numerous workers on camera walking
off with equipment and led to prosecutions. Sensors in taxis detect
unsafe driving maneuvers and alert the driver with a tone and a seat
vibration, then report it to the taxi company. Major League Baseball is
collecting gigabytes of data about every moment in a game, providing
more information for fans and teams.
But for the mass market of small and medium-size enterprises that don’t
have the resources to do a lot of custom development, even targeted IoT
rollouts are too daunting, said analyst James Brehm, founder of James
Brehm & Associates.
There are software platforms that pave over some of the complexity of
making various devices and applications talk to each other, such as the Omega DevCloud,
which RacoWireless introduced on Tuesday. The DevCloud lets developers
write applications in the language they know and make those apps work on
almost any type of device in the field, RacoWireless said. Thingworx,
Xively and Gemalto also offer software platforms that do some of the
work for users. But the various platforms on offer from IoT specialist
companies are still too fragmented for most customers, Brehm said. There
are too many types of platforms—for device activation, device
management, application development, and more. “The solutions are too
complex.”
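To make the "platform" idea concrete, what products in this space generally promise is an abstraction layer of roughly the following shape. The sketch below is hypothetical Python; it is not the Omega DevCloud API or any vendor's SDK, and every name in it is invented.

```python
from abc import ABC, abstractmethod

# Entirely hypothetical sketch of what an IoT "platform" abstraction offers:
# application code talks to one interface, and per-device adapters hide the
# transport and protocol differences underneath. Not the Omega DevCloud API
# or any real vendor SDK; every name here is invented.


class DeviceAdapter(ABC):
    """The one interface application code is written against."""

    @abstractmethod
    def read_telemetry(self) -> dict: ...


class CellularModemAdapter(DeviceAdapter):
    def read_telemetry(self) -> dict:
        # In a real platform this would speak to a carrier or M2M gateway.
        return {"transport": "cellular", "rssi_dbm": -71}


class EthernetGatewayAdapter(DeviceAdapter):
    def read_telemetry(self) -> dict:
        # In a real platform this would poll the device over HTTPS or MQTT.
        return {"transport": "ethernet", "latency_ms": 12}


def collect_fleet_status(devices: list[DeviceAdapter]) -> list[dict]:
    """Application logic written once, regardless of device type."""
    return [device.read_telemetry() for device in devices]


print(collect_fleet_status([CellularModemAdapter(), EthernetGatewayAdapter()]))
```

Brehm's complaint is that customers today need several such layers, for device activation, device management and application development, often from different vendors, which is where much of the complexity comes from.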
He thinks that’s holding back the industry’s growth. Though the past few
years have seen rapid adoption in certain industries in certain
countries, sometimes promoted by governments—energy in the U.K.,
transportation in Brazil, security cameras in China—the IoT industry as a
whole is only growing by about 35 percent per year, Brehm estimates.
That’s a healthy pace, but not the steep “hockey stick” growth that has
made other Internet-driven technologies ubiquitous, he said.
What lies ahead
Brehm thinks IoT is in a period where customers are waiting for more
complete toolkits to implement it—essentially off-the-shelf products—and
the industry hasn’t consolidated enough to deliver them. More companies
have to merge, and it’s not clear when that will happen, he said.
“I thought we’d be out of it by now,” Brehm said. What’s hard about
consolidation is partly what’s hard about adoption, in that IoT is a
complex set of technologies, he said.
And don’t count on industry standards to simplify everything. IoT’s
scope is so broad that there’s no way one standard could define any part
of it, analysts said. The industry is evolving too quickly for
traditional standards processes, which are often mired in industry
politics, to keep up, according to Andy Castonguay, an analyst at IoT
research firm Machina Research.
Instead, individual industries will set their own standards while
software platforms such as Omega DevCloud help to solve the broader
fragmentation, Castonguay believes. Even the Industrial Internet
Consortium, formed earlier this year
to bring some coherence to IoT for conservative industries such as
energy and aviation, plans to work with existing standards from specific
industries rather than write its own.
Ryan Martin, an analyst at 451 Research, compared IoT standards to human languages.
“I’d be hard pressed to say we are going to have one universal language
that everyone in the world can speak,” and even if there were one, most
people would also speak a more local language, Martin said.
Google is attempting to shunt users away from old browsers by
intentionally serving up a stale version of the ad giant's search
homepage to those holdouts.
The tactic appears to be falling in line with Mountain View's policy on its other Google properties, such as Gmail, which the company declines to fully support on aged browsers.
However, it was claimed on Friday in a Google discussion thread
that the multinational had unceremoniously dumped a past-its-sell-by-date
version of the Larry Page-run firm's search homepage on
those users who have declined to upgrade their Opera and Safari
browsers.
A user with the moniker DJSigma wrote on the forum:
A few minutes ago, Google's homepage reverted to the old version for
me. I'm using Opera 12.17. If I search for something, the results are
shown with the current Google look, but the homepage itself is the old
look with the black bar across the top. It seems to affect only the
Google homepage and image search. If I click on "News", for instance,
it's fine.
I've tried clearing cookies and deleting the browser cache/persistent
storage. I've tried disabling all extensions. I've tried masking the
browser as IE and Firefox. It doesn't matter whether I'm signed in or
signed out. Nothing works. Please fix this!
In a later post, DJSigma added that there seemed to be a glitch on Google search.
If I go to the Google homepage, I get served the old version of the
site. If I search for something, the results show up in the current
look. However, if I then try and type something into the search box
again, Google search doesn't work at all. I have to go back to the
homepage every time to do a new search.
The Opera user then said that the problem appeared to be
"intermittent". Others flagged up similar issues on the Google forum
and said they hoped it was just a bug.
Meanwhile, someone going by the name MadFranko008 added:
Phew ... thought it was just me and I'd been hacked or something ...
Same problem here everything "Google" has reverted back to the old style
of several years ago.
Tested on 3 different computers and different OS's & browsers ...
All have the same result, everything google has gone back to the old
style of several years ago and there's no way to change it ... Even the
copyright has reverted back to 2013!!!
Some Safari 5.1.x and Opera 12.x netizens were able to
fudge the system by customising their browser's user agent. But others
continued to complain about Google's "clunky", old search homepage.
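The user-agent workaround makes sense if you assume the server simply keys the page version off the User-Agent header. Here is a minimal sketch of that general pattern (my illustration, not Google's actual logic; the markers and template names are invented):

```python
# Minimal illustration of user-agent-based page serving: inspect the
# User-Agent header and fall back to a legacy page for browsers that are no
# longer fully supported. This is a sketch of the general pattern, not
# Google's implementation; the markers and template names are invented.

LEGACY_MARKERS = (
    "Opera/9.80",   # how Opera 12.x identifies itself
    "Version/5.1",  # simplified check for Safari 5.1.x
)


def choose_homepage(user_agent: str) -> str:
    """Return which homepage template to serve for a given User-Agent header."""
    if any(marker in user_agent for marker in LEGACY_MARKERS):
        return "homepage_legacy.html"
    return "homepage_current.html"


# Masking the browser as something modern sidesteps the check entirely, which
# is why spoofing the user agent "fixed" the page for some of the users above.
print(choose_homepage("Opera/9.80 (Windows NT 6.1) Presto/2.12.388 Version/12.17"))
print(choose_homepage("Mozilla/5.0 (Macintosh) AppleWebKit/537.36 Chrome/38.0 Safari/537.36"))
```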
A
Google employee, meanwhile, said that the tactic was deliberate in a
move to flush out stick-in-the-mud types who insisted on using older
versions of browsers.
"Thanks for the reports. I want to assure
you this isn't a bug, it's working as intended," said a Google worker
going by the name nealem. She added:
We’re continually making improvements to Search, so we can only
provide limited support for some outdated browsers. We encourage
everyone to make the free upgrade to modern browsers - they’re more
secure and provide a better web experience overall.
In a separate thread, as spotted by a Reg
reader who brought this sorry affair to our attention, user
MadFranko008 was able to show that even modern browsers - including the
current version of Chrome - were apparently spitting out glitches on
Apple Mac computers.
Google then appeared to have resolved the search "bug" spotted in Chrome.
When Google chief financial officer Patrick Pichette said the tech giant might bring 10 gigabits per second internet connections
to American homes, it seemed like science fiction. That’s about 1,000
times faster than today’s home connections. But for NASA, it’s downright
slow.
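A quick check of that multiplier (my arithmetic, not Wired's): if 10 Gbps is about 1,000 times faster than a typical home connection, the implied baseline is roughly 10 Mbps.

```python
# Working backwards from the article's multiplier; my arithmetic.
proposed_gbps = 10                # Google's mooted home connection speed
speedup_vs_home = 1_000           # "about 1,000 times faster"
home_mbps = proposed_gbps * 1000 / speedup_vs_home
print(f"Implied typical home connection: {home_mbps:.0f} Mbps")

esnet_demo_gbps = 91              # ESnet's demonstrated transfer (mentioned next)
print(f"ESnet demo vs. that baseline: ~{esnet_demo_gbps * 1000 / home_mbps:,.0f}x faster")
```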
While the rest of us send data across the public internet, the space agency uses a shadow network called ESnet, short for Energy Sciences Network, a set of private pipes that has demonstrated cross-country data transfers of 91 gigabits per second–the fastest of its type ever reported.
NASA isn’t going to bring these speeds to homes, but it is using this
super-fast networking technology to explore the next wave of computing
applications. ESnet, which is run by the U.S. Department of Energy, is
an important tool for researchers who deal in massive amounts of data
generated by projects such as the Large Hadron Collider and the Human
Genome Project. Rather than sending hard disks back and forth through the
mail, they can trade data via the ultra-fast network. “Our vision for
the world is that scientific discovery shouldn’t be constrained by
geography,” says ESnet director Gregory Bell.
In making its network as fast as it can possibly be, ESnet and
researchers at organizations like NASA are field-testing networking
technologies that may eventually find their way into the commercial
internet. In short, ESnet is a window into what our computing world will
eventually look like.
The Other Net
The first nationwide computer research network was the Defense
Department’s ARPAnet, which evolved into the modern internet. But it
wasn’t the last network of its kind. In 1976, the Department of Energy
sponsored the creation of the Magnetic Fusion Energy Network to connect
what is today the National Energy Research Scientific Computing Center
with other research laboratories. Then the agency created a second
network in 1980 called the High Energy Physics Network to connect
particle physics researchers at national labs. As networking became more
important, agency chiefs realized it didn’t make sense to maintain
multiple networks and merged the two into one: ESnet.
The nature of the network changes with the times. In the early days
it ran on land lines and satellite links. Today it uses fiber-optic
lines, spanning the DOE’s 17 national laboratories and many other sites,
such as university research labs. Since 2010, ESnet and Internet2—a
non-profit international network built in 1995 for researchers after the
internet was commercialized—have been leasing “dark fiber,” the excess
network capacity built up by commercial internet providers during the
late 1990s internet bubble.
An Internet Fast Lane
In November, using this network, NASA’s High End Computer Networking
team achieved its 91 gigabit transfer between Denver and NASA Goddard
Space Flight Center in Greenbelt, Maryland. It was the fastest
end-to-end data transfer ever conducted under “real world” conditions.
ESnet has long been capable of 100 gigabit transfers, at least in
theory. Network equipment companies have been offering 100 gigabit
switches since 2010. But in practice, long-distance transfers were much
slower. That’s because data doesn’t travel through the internet in a
straight line. It’s less like a super highway and more like an
interstate highway system. If you wanted to drive from San Francisco to
New York, you’d pass through multiple cities along the way as you
transferred between different stretches of highway. Likewise, to send a
file from San Francisco to New York on the internet—or over ESnet—the
data will flow through hardware housed in cities across the country.
A map of ESnet’s connected sites. Image: Courtesy of ESnet
NASA did a 98 gigabit transfer between Goddard and the University of Utah over ESnet in 2012. And Alcatel-Lucent and BT obliterated that record
earlier this year with a 1.4 terabit connection between London and
Ipswich. But in both cases, the two locations had a direct connection,
something you rarely see in real world connections.
On the internet and ESnet, every stop along the way creates the
potential for a bottleneck, and every piece of gear must be ready to
handle full 100 gigabit speeds. In November, the team finally made it
work. “This demonstration was about using commercial, off-the-shelf
technology and being able to sustain the transfer of a large data
network,” says Tony Celeste, a sales director at Brocade, the company
that manufactured the equipment used in the record-breaking test.
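The bottleneck point is easy to illustrate: an end-to-end transfer can only run as fast as the slowest piece of gear on the path. The hop names and capacities below are invented for illustration; they are not ESnet's actual topology.

```python
# End-to-end throughput is capped by the slowest hop on the path. The hop
# names and capacities below are invented; this is not ESnet's real topology.

hops_gbps = {
    "San Francisco edge router": 100,
    "Salt Lake City switch": 100,
    "Denver optical gear": 40,    # one under-provisioned hop caps the whole path
    "Chicago core router": 100,
    "New York edge router": 100,
}

bottleneck = min(hops_gbps, key=hops_gbps.get)
print(f"End-to-end ceiling: {hops_gbps[bottleneck]} Gbps, set by the {bottleneck}")
```

That is why the November demonstration mattered: every hop between Denver and Goddard had to sustain the full rate at once.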
Experiments for the Future
Meanwhile, the network is advancing the state of the art in other
ways. Researchers have used it to explore virtual network circuits
called “OSCARS,”
which can be used to create complex networks without complex hardware
changes. And they’re working on what are known as network “DMZs,” which can achieve unusually fast speeds by handling security without traditional network firewalls.
These solutions are designed specifically for networks in which a
small number of very large transfers take place–as opposed to the
commercial internet where lots of small transfers take place. But
there’s still plenty for commercial internet companies to learn from
ESnet. Telecommunications company XO Communications already has a 100 gigabit backbone, and we can expect more companies to follow suit.
Although we won’t see 10-gigabit connections—let alone 100 gigabit
connections—at home any time soon, higher capacity internet backbones
will mean less congestion as more and more people stream high-definition
video and download ever-larger files. And ESnet isn’t stopping there.
Bell says the organization is already working on a 400 gigabit network,
and the long-term goal is a terabit-per-second network, which is about
100,000 times faster than today’s home connections. Now that sounds like
science fiction.
Update 13:40 EST 06/17/14: This story has been updated to make it clear that ESnet is run by the Department of Energy.
Update 4:40 PM EST 06/17/14: This story has been updated to avoid
confusion between ESnet’s production network and its more experimental
test bed network.
Ever since covering Fliike,
a beautifully designed physical ‘Like’ counter for local businesses,
I’ve been thinking about how the idea could be extended, with a
fully-programmable, but simple, ticker-style Internet-connected display.
A few products along those lines do already exist, but I’ve yet to
find anything that quite matches what I had in mind. That is, until
recently, when I was introduced to LaMetric, a smart ticker being
developed by UK/Ukraine Internet of Things (IoT) startup Smart Atoms.
Launching
its Kickstarter crowdfunding campaign today, the LaMetric is aimed at
both consumers and businesses. The idea is you may want to display
alerts, notifications and other information from your online “life” via
an elegant desktop or wall-mountable and glance-able display. Likewise,
businesses that want an Internet-connected ticker, displaying various
business information, either publicly for customers or in an office, are
also a target market.
The
device itself has a retro, 8-bit-style desktop clock feel to it, thanks
to its ‘blocky’ LED display, which is part of its charm.
The display can output one icon and seven numbers, and is scrollable.
But, best of all, the LaMetric is fully programmable via the
accompanying app (or “hackable”) and comes with a bunch of off-the-shelf
widgets, along with support for RSS and services like IFTTT,
SmartThings, WigWag, and Ninja Blocks, so you can get it talking to other smart
devices or web services. Seriously, this thing goes way beyond what I
had in mind — try the simulator for yourself — and, for an IoT junkie like me, is just damn cool.
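For a feel of what "fully programmable" might look like in practice, here is a hypothetical sketch of pushing a single frame to a ticker-style display over HTTP. The host, endpoint, token, and payload schema are all invented for illustration; this is not LaMetric's documented API.

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical sketch of pushing one frame to a ticker-style display over a
# local HTTP API. The host, endpoint, token, and payload schema are invented;
# this is not LaMetric's documented API.

DISPLAY_HOST = "http://192.168.1.50:8080"  # assumed local address of the device
API_TOKEN = "example-device-token"         # placeholder credential

frame = {
    "icon": "mail",   # the single icon slot described above
    "text": "7 new",  # short enough for the seven-character area
}

resp = requests.post(
    f"{DISPLAY_HOST}/widgets/inbox/frame",  # invented endpoint path
    json=frame,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=5,
)
resp.raise_for_status()
print("Frame pushed:", resp.status_code)
```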
Examples of the kind of things you can track with the device include
time, weather, subject and time left till your next meeting, number of
new emails and their subject lines, CrossFit timings and fitness goals,
number of to-dos for today, stock quotes, and social network
notifications.
Or for businesses, this might include Facebook Likes, website
visitors, conversions and other metrics, app store rankings, downloads,
and revenue.
In addition to the display, the device has back and forward buttons
so you can rotate widgets (though these can be set to automatically
rotate), as well as an enter key for programmed responses, such as
accepting a calendar invitation.
There’s also a loudspeaker for audio alerts. The LaMetric is powered
by micro-USB and is also available in an optional, more expensive
battery-powered version.
Early-bird backers on Kickstarter can pick up the LaMetric for as
little as $89 (plus shipping) for the battery-less version, with
a range of other options and perks available at higher price points.