Researchers have succeeded in combining the power of quantum
computing with the security of quantum cryptography and have shown that
perfectly secure cloud computing can be achieved using the principles of
quantum mechanics. They have performed an experimental demonstration of
quantum computation in which the input, the data processing, and the
output remain unknown to the quantum computer. The international team of
scientists will publish the results of the experiment, carried out at
the Vienna Center for Quantum Science and Technology (VCQ) at the
University of Vienna and the Institute for Quantum Optics and Quantum
Information (IQOQI), in the forthcoming issue of Science.
Quantum computers are expected to play an important role in future
information processing since they can outperform classical computers at
many tasks. Considering the challenges inherent in building quantum
devices, it is conceivable that future quantum computing capabilities
will exist only in a few specialized facilities around the world – much
like today's supercomputers. Users would then interact with those
specialized facilities in order to outsource their quantum computations.
The scenario follows the current trend of cloud computing: central
remote servers are used to store and process data – everything is done
in the "cloud." The obvious challenge is to make globalized computing
safe and ensure that users' data stays private.
The latest research, to appear in Science, reveals that
quantum computers can provide an answer to that challenge. "Quantum
physics solves one of the key challenges in distributed computing. It
can preserve data privacy when users interact with remote computing
centers," says Stefanie Barz, lead author of the study. This newly
established fundamental advantage of quantum computers enables the
delegation of a quantum computation from a user who does not hold any
quantum computational power to a quantum server, while guaranteeing that
the user's data remain perfectly private. The quantum server performs
calculations, but has no means to find out what it is doing – a
functionality not known to be achievable in the classical world.
The scientists in the Vienna research group have demonstrated the
concept of "blind quantum computing" in an experiment: they performed
the first known quantum computation during which the user's data stayed
perfectly encrypted. The experimental demonstration uses photons, or
"light particles," to encode the data. Photonic systems are well-suited
to the task because quantum computation operations can be performed on
them, and they can be transmitted over long distances.
The process works in the following manner. The user prepares qubits –
the fundamental units of quantum computers – in a state known only to
himself and sends these qubits to the quantum computer. The quantum
computer entangles the qubits according to a standard scheme. The actual
computation is measurement-based: the processing of quantum information
is implemented by simple measurements on qubits. The user tailors
measurement instructions to the particular state of each qubit and sends
them to the quantum server. Finally, the results of the computation are
sent back to the user, who can interpret and utilize them. Even if the
quantum computer or an eavesdropper tries to read the qubits, they gain
no useful information without knowing the initial state; they are "blind."
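As a rough illustration only (not the authors' experimental implementation), the hiding step can be sketched in a toy single-qubit simulation. The user's secret preparation angle theta and secret flip bit r scramble the instruction angle delta sent to the server, so delta looks uniformly random no matter what measurement the user actually wants; the names and the Monte Carlo check below are my own.

```python
import math
import random

def blind_measure_instruction(phi, theta, r):
    """Instruction angle delta the user sends to the server.
    phi:   the measurement angle the user actually wants (secret),
    theta: the random angle the qubit was prepared in (secret),
    r:     a random bit that flips the reported outcome (secret).
    Because theta is uniform, delta is uniformly random to the server
    regardless of phi -- the server is "blind"."""
    return (phi + theta + r * math.pi) % (2 * math.pi)

def server_measure(theta, delta, rng):
    """Simulate the server measuring the qubit |+_theta> in the basis
    {|+_delta>, |-_delta>}.  The simulation needs theta to compute the
    physics, but the server itself never learns it.
    Outcome 0 occurs with probability cos^2((theta - delta) / 2)."""
    p0 = math.cos((theta - delta) / 2) ** 2
    return 0 if rng.random() < p0 else 1

rng = random.Random(0)
phi = math.pi / 4                        # the user's secret measurement angle
trials = 20000
zeros = 0
for _ in range(trials):
    theta = rng.uniform(0, 2 * math.pi)  # fresh secret preparation angle
    r = rng.randint(0, 1)                # fresh secret flip bit
    delta = blind_measure_instruction(phi, theta, r)
    outcome = server_measure(theta, delta, rng) ^ r  # user undoes the flip
    zeros += (outcome == 0)

# The user recovers the statistics of measuring at angle phi:
# P(outcome = 0) = cos^2(phi / 2), about 0.854 for phi = pi/4.
print(zeros / trials)
```

The decoded frequency matches cos²(φ/2) even though every angle the server ever sees is uniformly distributed.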
###
The research at the Vienna Center for
Quantum Science and Technology (VCQ) at the University of Vienna and at
the Institute for Quantum Optics and Quantum Information (IQOQI) of the
Austrian Academy of Sciences was undertaken in collaboration with the
scientists who originally invented the protocol, based at the University
of Edinburgh, the Institute for Quantum Computing (University of
Waterloo), the Centre for Quantum Technologies (National University of
Singapore), and University College Dublin.
Publication: "Demonstration of Blind Quantum Computing"
Stefanie Barz, Elham Kashefi, Anne Broadbent, Joseph Fitzsimons, Anton Zeilinger, Philip Walther.
DOI: 10.1126/science.1214707
Another day, another set of Android fragmentation stories. And while
there’s no doubt that there is wide fragmentation within the platform,
and no real solution in sight, I’m starting to wonder if Google
ever had a plan to prevent the platform from becoming a fragmented mess.
OS fragmentation, though, is an utter disaster. Ice Cream
Sandwich is by all accounts very nice; but what good does that do app
developers, when according to Google’s own stats, 30% of all Android
devices are still running an OS that is 20 months old?
…
More than two-thirds of iOS users had upgraded to iOS 5 a mere three
months after its release. Anyone out there think that Ice Cream Sandwich
will crack the 20% mark on Google’s platform pie chart by March?
He then goes on to deliver the killer blow:
OS fragmentation is the single greatest problem Android
faces, and it’s only going to get worse. Android’s massive success over
the last year means that there are now tens if not hundreds of millions
of users whose handset manufacturers and carriers may or may not allow
them to upgrade their OS someday; and the larger that number grows, the
more loath app developers will become to turn their back on them. That
unwillingness to use new features means Android apps will fall further
and further behind their iOS equivalents, unless Google manages – via
carrot, stick, or both – to coerce Android carriers and manufacturers to
prioritize OS upgrades.
OK, so Android is fragmented, and it’s a problem that Google doesn’t
seem willing to tackle. But the more I look at the Android platform and
the associated ecosystem, the more I wonder whether Google ever had any plan
(or, for that matter, any intention) to control platform fragmentation.
I disagree with Kindel that there’s nothing that Google can do
to at least try to discourage fragmentation. I believe that
one of Google’s strongest cards is Android users themselves. Look at
how enthusiastic iPhone and iPad owners are about iOS updates. They’re
enthusiastic because Apple tells them why they should be enthusiastic
about new updates. Compare this to Google’s approach to Android
customers. Google (or anyone else in the chain for that matter) doesn’t
seem to be doing much to get people fired up and enthusiastic about
Android. In fact, it seems to me the only message being given to Android
customers is ‘buy another Android handset.’
I understand that Google isn’t Apple and can’t seem to sway the
crowds in the same way, but it might start to help if the search giant
seemed to care about the OS. The absence of enthusiasm makes the company
seem Sphinx-like and uncaring. Why should anyone care about new Android
updates when Google itself doesn’t really seem all that excited? If
Google created a real demand for Android updates from the end users,
this would put pressure on the handset makers and the carriers to
get updates in a timely fashion to users.
Make the users care about updates, and the people standing in the way of those updates will sit up and pay attention to things.
Personal comment:
With Android, Google is now in a position similar to Microsoft's with Windows, and blaming Google for this disparity of OS versions would be like blaming Microsoft for the fact that Windows XP, Windows Vista, and Windows 7 still coexist today. One reason Android got so 'fragged' is that it has had to face rapid hardware evolution and new kinds of devices in a very short time, giving Google a somewhat Frankenstein-like experience with its Android creature. Many distinct hardware manufacturers adopt Android and develop their own GUI layer on top of it, making it nearly impossible for Google to directly control the spread of new Android versions, since each manufacturer may need to perform its own code update before offering a new version of Android on its devices.
The direct comparison with iOS is somewhat unfair, as Apple maintains a rapid update cycle by controlling every part of the overall mechanism: regular SDK updates push developers to adopt new features and leave old iOS versions behind, and new apps for iDevices require end users to upgrade to the latest iOS version before they can be installed. Meanwhile, Apple also controls hardware design, production, and evolution, making the propagation of new iOS versions much easier and much faster than it is for Google with Android.
Then, mobile devices (smartphones or tablets) have short lifetimes, and this was already true before Google and Apple entered this market. So whether your name is Google or Apple, not offering the very latest version of your OS on so-called 'old' or obsolete hardware is a fairly obvious decision. It is not even a 'choice' so much as a direct consequence of how fast technology evolves nowadays.
Now, smartphone and tablet hardware capabilities will reach a 'standard' level and these will become 'mature' products (all smartphones and tablets already have cameras, video capture, editing capabilities, etc.), which may make it easier for a single Android version to spread across all devices while hardware evolution pauses. Apple's latest innovations are already tied more to software than to real hardware (r)evolution, so Android may benefit from this to close the gap.
Microsoft announced that it will be launching silent updates for IE9 in January.
Despite the controversy over user control, Microsoft in particular has
reason to make this move: it needs to counter the browser "update
fatigue" that has resulted in virtually "stale" IE users who won't
upgrade their browsers unless they upgrade their operating system as well.
The most recent upgrade of Google's Chrome browser
shows just how well the silent update feature works. Within five days
of introduction, Chrome 15 market share fell from 24.06 percent to just
6.38 percent, while the share of Chrome 16 climbed from 0.35 percent to
19.81 percent, according to StatCounter.
Within five days, Google moved about 75 percent of its user base - more
than 150 million users - from one browser to another. Within three
days, Chrome 16 market share surpassed the market share of IE9
(currently at about 10.52 percent for this month), in four days it
surpassed Firefox 8 (currently at about 15.60 percent) and will be
passing IE8 today, StatCounter data indicates.
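A quick back-of-the-envelope check of the StatCounter numbers quoted above supports the "about 75 percent" claim; this is my own arithmetic, not StatCounter's:

```python
# StatCounter shares quoted above, in percentage points
chrome15_before, chrome15_after = 24.06, 6.38
chrome16_before, chrome16_after = 0.35, 19.81

# fraction of the Chrome 15 base that migrated within five days
migrated = (chrome15_before - chrome15_after) / chrome15_before

# the gain on the Chrome 16 side tells the same story
gained = chrome16_after - chrome16_before

print(round(migrated, 2))  # roughly three-quarters of the Chrome 15 base
```

Either way you slice it, nearly the whole Chrome 15 base moved to Chrome 16 within days.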
What makes this data so important is the fact that Google is
dominating HTML5 capability across all operating system platforms and
not just Windows 7, where IE9 has a slight advantage, according to
Microsoft (StatCounter does not break out data for browser share on
individual operating systems). IE9, introduced on March 14, 2011,
has captured only 10.52 percent market share and has followed a similarly
slow upgrade pattern to its predecessors. For example, IE8, which was
introduced in March 2009, reached its market share peak in the month IE9
was introduced - at 30.24 percent. Since then, the browser has declined
to only 22.17 percent and 57.52 percent of the IE user base still uses
IE8 today.
With the silent updates becoming available for IE8 and IE9,
Microsoft is likely to avoid another IE6 disaster with IE8. Even more
important for Microsoft is that those users who update to IE9 may be
less likely to switch to Chrome.
Everyone likes personal cloud services, like Apple’s iCloud, Google Music, and Dropbox.
But many of us aren’t crazy about the fact that our files, music, and
whatever are sitting on someone else’s servers without our control.
That’s where ownCloud comes in.
OwnCloud is an open-source cloud program. You use it to set up your
own cloud server for file sharing, music streaming, and calendar,
contact, and bookmark sharing. As a server program, it’s not that
easy to set up. OpenSUSE, with its Mirall installation program and desktop client, makes it easier to set up your own personal ownCloud, but it’s still not a simple operation. That’s going to change.
According to ownCloud’s business crew,
“OwnCloud offers the ease-of-use and cost effectiveness of Dropbox and
box.net with a more secure, better managed offering that, because it’s
open source, offers greater flexibility and no vendor lock in. This
makes it perfect for business use. OwnCloud users can run file sync and
share services on their own hardware and storage or use popular public
hosting and storage offerings.” I’ve tried it myself, and while setting
it up is still mildly painful, once up and running ownCloud works well.
OwnCloud enables universal access to files through a Web browser or WebDAV.
It also provides a platform to easily view and sync contacts, calendars
and bookmarks across all devices and enables basic editing right on the
Web. Programmers will be able to add features to it via its open
application programming interface (API).
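On the WebDAV side, listing a folder boils down to a single PROPFIND request. Here is a minimal sketch in Python, assuming ownCloud's usual `remote.php/webdav` endpoint; the server URL and credentials are placeholders:

```python
import base64
import urllib.request

def propfind_request(base_url, path, user, password):
    """Build a WebDAV PROPFIND request that lists a folder on an
    ownCloud server.  base_url, user, and password are placeholders;
    remote.php/webdav is ownCloud's usual WebDAV endpoint."""
    url = f"{base_url}/remote.php/webdav/{path}"
    req = urllib.request.Request(url, method="PROPFIND")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Depth", "1")  # list immediate children only
    return req

# Sending it with urllib.request.urlopen(req) would return an XML
# multistatus body describing each file; here we only build the request.
req = propfind_request("https://cloud.example.com", "Documents", "alice", "secret")
```

Any WebDAV-capable client (or file manager) performs essentially this exchange under the hood, which is what makes the "universal access" claim work.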
OwnCloud is going to become an easy-to-run, easy-to-use personal, private
cloud thanks to a new commercial company that’s going to take ownCloud
from interesting open-source project to end-user friendly program. This
new company will be headed by former SUSE/Novell executive Markus Rex.
Rex, who I’ve known for years and is both a business and technology
wizard, will serve as both CEO and CTO. Frank Karlitschek, founder of
the ownCloud project, will be staying.
To make this happen, the commercial side of this popular program
(350,000 users) is being funded by Boston-based General Catalyst, a
high-tech venture capital firm. In the past, General Catalyst has helped fund such
companies as online travel company Kayak and online video platform leader Brightcove.
General Catalyst came on board, said John Simon, Managing Director at
General Catalyst in a statement, because, “With the explosion of
unstructured data in the enterprise and increasingly mobile (and
insecure) ways to access it, many companies have been forced to lock
down their data–sometimes forcing employees to find less than secure
means of access, or, if security is too restrictive, risk having all
that data unavailable. When we saw the ease-of-use, security and flexibility
of ownCloud, we were sold.”
“In a cloud-oriented world, ownCloud is the only tool based on a
ubiquitous open-source platform,” said Rex, in a statement. “This
differentiator gives businesses complete, transparent, compliant
control over their data and data storage costs, while also allowing
employees simple and easy data access from anywhere.”
As a Linux geek, I already liked ownCloud. As the company releases
mass-market ownCloud products and services in 2012, I think many of you
are going to like it as well. I’m really looking forward to seeing where
this program goes from here.
How a little-known 1971 machine launched an industry.
Forty years ago, Nutting Associates released the world’s first
mass-produced and commercially sold video game, Computer Space. It was
the brainchild of Nolan Bushnell, a charismatic engineer with a creative
vision matched only by his skill at self-promotion. With the help of
his business partner Ted Dabney and the staff of Nutting Associates,
Bushnell pushed the game from nothing into reality only two short years
after conceiving the idea.
Computer Space pitted a player-controlled rocket ship against two
machine-controlled flying saucers in a space simulation set before a
two-dimensional star field. The player controlled the rocket with four
buttons: one for fire, which shoots a missile from the front of the
rocket ship; two directional rotation buttons (to rotate the ship
orientation clockwise or counterclockwise); and one for thrust, which
propelled the ship in whichever direction it happened to be pointing.
Think of Asteroids without the asteroids, and you should get the picture.
During play, two saucers would appear on the screen and shoot at the player while flying in a zig-zag formation. The player’s goal was to dodge the saucer fire and shoot the saucers.
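The original machine contained no software at all, as the story below explains, but purely as an illustration of the control scheme, the rotate-and-thrust mechanics map onto a few lines of vector math. The class name and step sizes here are invented for this sketch:

```python
import math

class Ship:
    """Toy model of the Computer Space control scheme
    (names and step sizes invented for this sketch)."""

    def __init__(self):
        self.angle = 0.0             # radians; 0 = pointing along +x
        self.x = self.y = 0.0        # position on the star field
        self.vx = self.vy = 0.0      # velocity components

    def rotate(self, direction, step=math.radians(10)):
        # direction +1 or -1: the two rotation buttons
        self.angle += direction * step

    def thrust(self, power=1.0):
        # accelerate in whichever direction the ship points
        self.vx += power * math.cos(self.angle)
        self.vy += power * math.sin(self.angle)

    def tick(self):
        # no friction in space: velocity persists between thrusts
        self.x += self.vx
        self.y += self.vy

ship = Ship()
ship.rotate(+1, step=math.pi / 2)  # turn 90 degrees
ship.thrust()                      # one burn
ship.tick()                        # ship now drifts along +y
```

The key feel of the game falls out of `tick()`: because nothing damps the velocity, the ship keeps drifting after each burn, exactly the inertia players later met again in Asteroids.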
Considering a game of this complexity playing out on a TV set, you
might think that it was created as a sophisticated piece of software
running on a computer. You’d think it, but you’d be wrong–and Bushnell
wouldn’t blame you for the mistake. How he and Dabney managed to pull it
off is a story of audacity, tenacity, and sheer force-of-will worthy of
tech legend. This is how it happened.
“Apps that are released under an Open Source
Initiative-recognised open source licence can, at least in the
pre-release version of the Windows Store, be distributed according to
terms that contradict Microsoft’s Standard Application License Terms if
this is required by the open source licence. Among other things, the
Standard Application License Terms prohibit the sharing of
applications.”
Microsoft officials shared more details about the coming Windows Store
earlier this week. Metro-style applications will be licensable,
marketable and downloadable from the Windows 8 Store. Non-Metro-style
Desktop Apps will only be marketable from inside the store, with links
provided to developers’ sites for sales/downloads.
I’ve had a few developers ask me whether Microsoft will allow the use
of open-source languages/development environments — like PHP, Ruby,
Python, Eclipse, etc. — to create Windows 8 apps. The Windows 8
architectural diagrams (from Microsoft and others) make me believe the
answer is no, even though HTML5/JavaScript/CSS are all supported (and
treated as better than first-class citizens in Windows 8)….Anyone know
otherwise?
Recent Google engineering intern Andrew Munn has launched into a detailed explanation on Google+
as to why many Android devices are significantly more sluggish and less
responsive in terms of user interface and experience than comparable
iOS and Windows Phone 7 devices. The root of the problem? Suboptimal
priority queuing in the Android OS. On one side, iOS renders graphics
at real-time priority, so user interaction always comes first and other
work is pushed to the background. On the flip side,
Android treats graphics rendering as a normal-priority task. As a result,
Android devices tend to become more sluggish when they’re trying to
perform other tasks simultaneously.
The gist of the problem boiled down by Munn:
It’s not GC pauses. It’s not because Android runs
bytecode and iOS runs native code. It’s because on iOS all UI rendering
occurs in a dedicated UI thread with real-time priority. On the other
hand, Android follows the traditional PC model of rendering occurring on
the main thread with normal priority.
Munn also broke it down in real-world terms by providing the example
that if you put your finger on the screen of an iPhone or iPad and move
it around when it’s halfway through loading a complex web page like
Facebook, all rendering stops instantaneously. The website will
literally never load until your finger is removed, and this all boils
down to the fact that the “UI thread is intercepting all events and
rendering the UI at real-time priority.”
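Munn's point can be captured in a toy latency model (my own simplification, not his code): on a shared main thread a frame waits behind whatever background work is queued ahead of it, while a dedicated real-time render thread preempts that work.

```python
def frame_latency_shared(render_ms, queued_background_ms):
    # Android model: rendering shares the main thread at normal priority,
    # so a frame waits behind every background job queued ahead of it
    return sum(queued_background_ms) + render_ms

def frame_latency_dedicated(render_ms, queued_background_ms):
    # iOS model: a real-time UI thread preempts background work,
    # so frame latency is just the render cost itself
    return render_ms

background = [8, 12, 5]  # ms of work queued on the main thread (e.g. page layout)
shared = frame_latency_shared(16, background)        # blows the 16 ms / 60 fps budget
dedicated = frame_latency_dedicated(16, background)  # stays within one frame
```

With these made-up numbers, the shared-thread frame takes 41 ms while the dedicated-thread frame takes 16 ms, which is why the same workload feels sluggish on one model and fluid on the other.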
There are also some other reasons, like suboptimal hardware. The
NVIDIA Tegra 2 CPUs ubiquitous in many Android 3.0 tablets and some
phones suffered from low memory bandwidth and lacked NEON media
instructions, both of which ultimately presented a bottleneck to the
Android user interface and experience. However, Android 4.0 remedies
this by having graphics hardware acceleration, although as long as
graphics aren’t given top priority (a la real-time), platforms like iOS
or Windows Phone 7 are always going to be more fluid.
The Kilobots are an inexpensive system for testing synchronized and collaborative behavior in a very large swarm of robots. Photo courtesy of Michael Rubenstein
The Kilobots are coming. Computer scientists and engineers at Harvard University have developed and licensed technology that will make it easy to test collective algorithms on hundreds, or even thousands, of tiny robots.
Called Kilobots, the quarter-sized, bug-like devices scuttle around on three toothpick-like legs, interacting and coordinating their own behavior as a team. A June 2011 Harvard technical report demonstrated a collective of 25 machines implementing swarming behaviors such as foraging, formation control, and synchronization.
Once up and running, the machines are fully autonomous, meaning there is no need for a human to control their actions.
The communicative critters were created by members of the Self-Organizing Systems Research Group led by Radhika Nagpal, the Thomas D. Cabot Associate Professor of Computer Science at the Harvard School of Engineering and Applied Sciences (SEAS) and a Core Faculty Member at the Wyss Institute for Biologically Inspired Engineering at Harvard. Her team also includes Michael Rubenstein, a postdoctoral fellow at SEAS; and Christian Ahler, a fellow of SEAS and the Wyss Institute.
Thanks to a technology licensing deal with the K-Team Corporation, a Swiss manufacturer of high-quality mobile robots, researchers and robotics enthusiasts alike can now take command of their own swarm.
One key to achieving high-value applications for multi-robot systems in the future is the development of sophisticated algorithms that can coordinate the actions of tens to thousands of robots.
"The Kilobot will provide researchers with an important new tool for understanding how to design and build large, distributed, functional systems," says Michael Mitzenmacher, Area Dean for Computer Science at SEAS.
The name "Kilobot" does not refer to anything nefarious; rather, it describes the researchers' goal of quickly and inexpensively creating a collective of a thousand bots.
Inspired by nature, such swarms resemble social insects, such as ants and bees, that can efficiently search for and find food sources in large, complex environments, collectively transport large objects, and coordinate the building of nests and other structures.
For reasons of time, cost, and simplicity, the algorithms being developed in research labs today are validated only in computer simulation or on a few dozen robots at most.
In contrast, the design by Nagpal's team allows a single user to easily oversee the operation of a large Kilobot collective, including programming, powering on, and charging all robots, all of which would be difficult (if not impossible) using existing robotic systems.
So, what can you do with a thousand tiny little bots?
Robot swarms might one day tunnel through rubble to find survivors, monitor the environment and remove contaminants, and self-assemble to form support structures in collapsed buildings.
They could also be deployed to autonomously perform construction in dangerous environments, to assist with pollination of crops, or to conduct search and rescue operations.
For now, the Kilobots are designed to provide scientists with a physical testbed for advancing the understanding of collective behavior and realizing its potential to deliver solutions for a wide range of challenges.
-----
Personal comment:
This reminds me of a project I worked on back in 2007, called "Variable Environment", which involved swarm-based robots called "e-pucks" developed at EPFL. The e-pucks reacted autonomously to human activity around them.
The rush to make computers smaller and smaller has been going on for some time now, but we may have a winner (at least for now) in the small-computer race. It's called the Cotton Candy from FXI Tech, and though it just looks like your standard USB thumb drive, it turns out it's packing an entire very small computer in its tiny packaging.
The specs, admittedly, aren't anything truly spectacular, offering up a dual-core ARM Cortex-A9 on the processor end, backed up by an ARM Mali-400MP GPU, wi-fi and Bluetooth connectivity, HDMI and USB plugs, and a microSD card slot, as well as its own Android operating system. But when you consider that it's all encased in a device that's the size of a basic key chain, well, suddenly the whole picture looks a lot more interesting.
What this is designed to do is hook into much larger displays, thanks to that HDMI plug, and allow you to perform many of your basic computer functions. You’ve got Bluetooth for the peripherals, microSD for the storage, cloud access from the Android app…it’s a very simple, very basic, but extremely portable setup. And, you can even hook it into another computer with the USB plug included, which in turn will let you borrow the peripherals hooked into that computer (great if you needed to print something, I’d say) to do the various jobs you want done.
And if you want an ultra-small computer to take with you most anywhere you go, Cotton Candy should be on hand in time for Christmas 2012, and the pricing is expected to land at the $200 mark, which isn't half bad. Though it does make me wonder why most wouldn't just buy a full-on laptop for not too much more, especially if they buy used.
Still though, an ultra-small PC for an ultra-small price tag is in the offing, so what do you guys think? Will the Cotton Candy catch on? Or will we be seeing these go for half that or less just to clear them out? No matter what you think, we love hearing from you, so head on down to the comments section and tell us what you think!