Choosing sides: Google’s new augmented-reality game,
Ingress, makes users pick a faction—Enlightened or Resistance—and run
around town attacking virtual portals in hopes of attaining world
domination
I’m not usually very political, but I recently joined the Resistance,
fighting to protect the world against the encroachment of a strange,
newly discovered form of energy. Just this week, in fact, I spent hours
protecting Resistance territory and attacking the enemy.
Don’t worry: this is just the gloomy sci-fi world depicted in Ingress, a new smartphone game
created by Google. Ingress is far from your normal gaming app,
though—it takes place, to some degree, in the real world; aspects of the
game are revealed only as you reach different real-world locations.
Ingress’s world is one in which the discovery of so-called
“exotic matter” has split the population into two groups: the
Enlightened, who want to learn how to harness the power of this energy,
and the Resistance, who, well, resist this change. Players pick a side,
and then walk around their city, collecting exotic matter to keep
scanners charged and taking control of exotic-matter-exuding portals in
order to capture more land for their team.
I found the game, which
is currently available only to Android smartphone users who have
received an invitation to play, surprisingly addictive—especially
considering my usual apathy for gaming.
What’s most interesting
about Ingress, though, is what it suggests about Google’s future plans,
which seem to revolve around finding new ways to extend its reach from
the browser on your laptop to the devices you carry with you at all
times. The goal makes plenty of sense when you consider that traditional
online advertising—Google’s bread and butter—could eventually be
eclipsed by mobile, location-based advertising.
Ingress was
created by a group within Google called Niantic Labs—the same team
behind another location-based app released recently (see “Should You Go on Google’s Field Trip?”).
Google
is surely gathering a treasure trove of information about where we’re
going and what we’re doing while we play Ingress. It must also see the
game as a way to explore possible applications for Project Glass, the
augmented-reality glasses-based computer that the company will start
sending out to developers next year. Ingress doesn’t require a
head-mounted display; it uses your smartphone’s display to show a map
view rather than a realistic view of your surroundings. Still, it is
addictive, and is likely to get many more folks interested in
location-based augmented reality, or at least in augmented-reality
games.
Despite its futuristic focus, Ingress sports a sort of
pseudo-retro look, with a darkly hued map that dominates the screen and a
simple pulsing blue triangle that indicates your position. I could see
only a few blocks in any direction, which meant I had to walk around
and explore in order to advance in the game.
For a while, I didn’t
know what I was doing, and it didn’t help that Ingress doesn’t include
any street names. New users complete a series of training exercises,
learning the basics of the game, which include capturing a portal,
hacking a portal to snag items like resonators (which control said
portals), creating links of exotic matter between portals to build a
triangular control field that enhances the safety of team members in the
area, and firing an XMP (a “non-polarized energy field weapon,”
according to the glossary) at an enemy-controlled portal.
Confused much? I sure was.
I forged ahead, though, hoping that the game would make more sense if I
kept playing. I started wandering around looking for portals. Portals are
found in public places—in San Francisco, where I was playing, this
includes city landmarks such as museums, statues, and murals. Resistance
portals are blue, Enlightened ones are green, and there are also some
gray ones out there that remain unclaimed.
I found a link to a larger map
of the Ingress world that I could access through my smartphone browser
and made a list of the best-looking nearby targets. Perhaps this much
planning goes against the exploratory spirit of the game, but it made
Ingress a lot less confusing for me (there’s also a website that doles out clues about the game and its mythology).
Once
I had a plan, I set out toward the portals on my list, all of which
were in the Soma and Downtown neighborhoods of San Francisco. I managed
to capture two new portals at Yerba Buena Gardens—one at a statue of
Martin Luther King, Jr. and another at the top of a waterfall—and link
them together.
Across the street, in front of the Contemporary
Jewish Museum, I hacked an Enlightened portal and fired an XMP at it,
weakening its resonators. I was then promptly attacked. I fled, figuring
I wouldn’t be able to take down the portal by myself.
A few hours
later, much of my progress was undone by a member of the Enlightened
(Ingress helpfully sends e-mail notifications about such things). I was
surprised by how much this pissed me off—I wanted to get those portals
back for the Resistance, but pouring rain and the late hour stopped me.
Playing
Ingress was a lot more fun than I expected, and from the excited
chatter in the game’s built-in chat room, it was clear I wasn’t the only
one getting into it.
On my way back from a meeting, I couldn’t
help but keep an eye out for portals, ducking into an alley to attack
one near my office. Later, I found myself poring over the larger map on
my office computer, looking at the spread of portals and control fields
around the Bay Area.
As it turns out, my parents live in an area
dominated by the Enlightened. So I guess I’ll be busy attacking enemy
portals in my hometown this weekend.
The future of augmented-reality technology is here - as long as you're a rabbit. Bioengineers have placed the first contact lenses containing electronic displays into the eyes of rabbits as a first step on the way to proving they are safe for humans. The bunnies suffered no ill effects, the researchers say.
The first version may only have one pixel, but higher resolution lens displays - like those seen in Terminator - could one day be used as satnav enhancers, showing you directional arrows, for example, or to flash up texts and emails - perhaps even video. In the shorter term, the breakthrough also means people suffering from conditions like diabetes and glaucoma may find they have a novel way to monitor their conditions.
In February, New Scientist revealed the litany of research projects underway in the field of contact lens enhancement. While one company has fielded a contact lens technology using a surface-mounted strain gauge to assess glaucoma risk, none have built in a display, or the lenses needed for focused projection onto the retina - and then tested it in vivo. They have now.
"We have demonstrated the operation of a contact lens display powered by a remote radiofrequency transmitter in free space and on a live rabbit," says a US and Finnish team led by Babak Parviz of the University of Washington in Seattle.
"This verifies that antennas, radio chips, control circuitry, and micrometre-scale light sources can be integrated into a contact lens and operated on live eyes."
The test lens was powered remotely using a 5-millimetre-long antenna printed on the lens to receive gigahertz-range radio-frequency energy from a transmitter placed ten centimetres from the rabbit's eye. To focus the light on the rabbit's retina, the contact lens itself was fabricated as a Fresnel lens - in which a series of concentric annular sections is used to generate the ultrashort focal length needed.
They found their lens LED glowed brightly up to a metre away from the radio source in free space, but needed to be within 2 centimetres when the lens was placed in a rabbit's eye, where wireless reception was affected by body fluids. All the 40-minute-long tests on live rabbits were performed under general anaesthetic and showed that the display worked well - and fluorescence tests showed no damage or abrasions to the rabbits' eyes after the lenses were removed.
While making a higher resolution display is next on their agenda, there are uses for this small one, say the researchers: "A display with a single controllable pixel could be used in gaming, training, or giving warnings to the hearing impaired."
"This is clearly way off in the future. But we're aware of the research that is ongoing in this field and we're watching the technology's potential for biosensing and drug delivery applications in particular," says a spokesperson for the British Contact Lens Association in London.
Back around July of last year, Qualcomm launched a software development kit for building Augmented Reality apps on Android. The idea was to allow Android developers to build all sorts of crazy AR stuff (like games and apps that render things in live 3D on top of a view pulled in through your device’s camera) without having to reinvent the wheel by coding up their own visual-recognition system. It is, for lack of a better word, awesome.
And now it’s available for iOS.
For those unfamiliar with Augmented Reality — or for those who just want to see something cool — check out this demo video I shot a year or so back:
Sometime in the past few hours, Qualcomm quietly rolled a beta release of the iOS-compatible SDK into their developer center. This came as a bit of a shock; Qualcomm had previously expressed that, while an iOS port would come sooner or later, their main focus was building this platform for devices running their Snapdragon chips (read: not Apple devices).
And yet, here we are. This first release of the SDK supports the iPhone 4, iPad 2, and fourth-generation iPod Touch — none of which have Snapdragon CPUs in them. Furthermore, this release supports Unity (a WYSIWYG-style rapid game development tool) right off the bat, whereas the Android release didn’t get Unity support until a few months later. Developers can also work directly in Xcode if they so choose.
This platform lowers the “You must be this crazy of a developer to ride this ride” bar considerably, so expect an onslaught of Augmented Reality apps in the App Store before too long.