Thursday, April 30. 2015
Via InformationWeek
Technology giant IBM announced two major breakthroughs toward building a practical quantum computer, the next evolution in computing that will be required as Moore’s Law runs out of steam.
Described in the April 29 issue of the journal Nature Communications, the breakthroughs include the ability to detect and measure both kinds of quantum errors simultaneously and a new kind of circuit design, which the company claims is "the only physical architecture that could successfully scale to larger dimensions."
The two innovations are interrelated: The quantum bit circuit, based on a square lattice of four superconducting "qubits" -- short for quantum bits -- on a chip roughly one-quarter-inch square, enables both types of quantum errors to be detected at the same time.
The IBM project, which was funded in part by the Intelligence Advanced Research Projects Activity (IARPA) Multi-Qubit Coherent Operations program, opts for a square-shaped design as opposed to a linear array, which IBM said prevents the detection of both kinds of quantum errors simultaneously.
Jerry M. Chow, manager of the Experimental Quantum Computing group at IBM’s T.J. Watson Research Center, and the primary investigator on the IARPA-sponsored Multi-Qubit Coherent Operations project, told InformationWeek that one area they are excited about is the potential for quantum computers to simulate systems in nature.
"In physics and chemistry, quantum computing will allow us to design new materials and drug compounds without the expensive trial-and-error experiments in the lab, dramatically speeding up the rate and pace of innovation," Chow said. "For instance, the effectiveness of drugs is governed by the precise nature of the chemical bonds in the molecules constituting the drug."
He noted that the computational chemistry required for many of these problems is out of the reach of classical computers, and that this is one area where quantum computers may be capable of solving problems that lead to better drug design.
The qubits, IBM said, could be designed and manufactured using standard silicon fabrication techniques, once a handful of superconducting qubits can be manufactured quickly and reliably with low error rates.
"Quantum information is very fragile, requiring the quantum elements to be cooled to near absolute zero temperature and shielded from its environment to minimize errors," Chow explained. "A quantum bit, the component that carries information in a quantum system, can be susceptible to two types of errors -- bit-flip and phase-flip. If either error occurs, the information is destroyed and it cannot carry out the operation."
He said it is important to detect and measure both types of errors in order to know what errors are present and how to address them, noting no one has been able to do this before in a scalable architecture.
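The two error types Chow describes correspond to the Pauli X (bit-flip) and Pauli Z (phase-flip) operators acting on a single-qubit state. The sketch below is a toy numerical illustration of why both must be watched for, not a model of IBM's detection circuit; it uses plain Python to represent a qubit as a 2-component amplitude vector:

```python
# Toy illustration of the two quantum error types. A qubit state is a
# 2-component complex vector: [amplitude of |0>, amplitude of |1>].
def apply(gate, state):
    """Multiply a 2x2 gate matrix by a 2-vector state."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Pauli X = bit-flip error: swaps the |0> and |1> amplitudes.
X = [[0, 1], [1, 0]]
# Pauli Z = phase-flip error: negates the |1> amplitude.
Z = [[1, 0], [0, -1]]

# An equal superposition (|0> + |1>) / sqrt(2)
s = [2 ** -0.5, 2 ** -0.5]

print(apply(X, s))  # bit-flip leaves this state unchanged (amplitudes just swap)
print(apply(Z, s))  # phase-flip turns it into (|0> - |1>) / sqrt(2)
```

Note that the bit-flip is invisible on the equal superposition while the phase-flip is not; in another basis the roles reverse. Detecting both therefore requires checking qubits in two different ways at once, which is the ability the square-lattice layout provides and a linear array does not.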
"We are at the stage of figuring out the building blocks of quantum computers -- a new paradigm of computing completely different than how computers are built today," Chow said. "In the arc of quantum computing progress, we are at the moment of time similar to when scientists were building the first transistor. If built, quantum computers have the potential to unlock new applications for scientific discovery and data analysis and will be more powerful than any supercomputer today."
Friday, March 20. 2015
Via PC World
Remember when the Internet was just that thing you accessed on your computer?
Today, connectivity is popping up in some surprising places: kitchen appliances, bathroom scales, door locks, clothes, water bottles… even toothbrushes.
That’s right, toothbrushes. The Oral-B SmartSeries is billed as the world’s first “interactive electric toothbrush” with built-in Bluetooth. Whatever your feelings on this level of connectivity, it’s undeniable that it’s a new frontier for data.
And let’s face it, we’re figuring it out as we go. Consequently, it’s a good idea to keep your devices secure - and that means leveraging a product like Norton Security, which protects mobile devices and can help keep intruders out of your home network. Because the last thing you want is a toothbrush that turns on you.
Welcome to the age of the Internet of Things (IoT for short), the idea that everyday objects - and everyday users - can benefit from integrated network connectivity, whether it’s a washing machine that notifies you when it’s done or a collar-powered tracker that helps you locate your runaway pet.
Some of these innovations are downright brilliant. Others veer into impractical or even unbelievable. And some can present risks that we’ve never had to consider before.
Consider the smart lock. A variety of companies offer deadbolt-style door locks you can control from your smartphone. One of them, the August Smart Lock, will automatically sense when you approach your door and unlock it for you, even going so far as to lock it again once you’ve passed through. And the August app not only logs who has entered and exited, but also lets you provide a temporary virtual key to friends, family members, a maid service, and the like.
That’s pretty cool, but what happens in the event of a dead battery - either in the user’s smartphone or the lock itself? If your phone gets lost or stolen, is there a risk a thief can now enter your home? Could a hacker “pick” your digital lock? Smart lock-makers promise safeguards against all these contingencies, but it raises the question of whether the conveniences outweigh the risks. Do we really need the Internet in all our things?
The latest that-can’t-possibly-be-a-real-product example made its debut at this year’s Consumer Electronics Show: The Belty, an automated belt/buckle combo that automatically loosens when you sit and tightens when you stand. A smartphone app lets you control the degree of each. Yep.
Then there’s the water bottle that reminds you to drink more water. The smart exercise shirt your trainer can use to keep tabs on your activity (or lack thereof). And who can forget the HAPIfork, the “smart” utensil that aims to steer you toward healthier eating by reminding you to eat more slowly?
Stop the Internet (of Things), I want to get off.
Okay, I shouldn’t judge. And it’s not all bad. There is real value in - and good reason to be excited about - a smart basketball that helps you perfect your jump shot. Or a system of smart light bulbs designed to deter break-ins. Ultimately, the free market will decide which ones are useful and which ones are ludicrous.
The important thing to remember is that with the IoT, we’re venturing into new territory. We’re linking more devices than ever to our home networks. We’re installing phone and tablet apps that have a direct line not just to our data, but also our very domiciles.
Wednesday, March 18. 2015
A year after it revealed another attempt to muscle in on the smartphone market, Canonical’s first Ubuntu-based smartphone is due to go on sale in Europe in the “coming days”, it said today. The device will be sold for €169.90 (~$190) unlocked to any carrier network, although some regional European carriers will be offering SIM bundles at the point of purchase. The hardware is an existing mid-tier device, the Aquaris E4.5, made by Spain’s BQ — with the Ubuntu version of the device known as the ‘Aquaris E4.5 Ubuntu Edition’. So the only difference here is it will be pre-loaded with Ubuntu’s mobile software, rather than Google’s Android platform.
Canonical has been trying to get into the mobile space for a while now. Back in 2013, the open source software maker failed to crowdfund a high end converged smartphone-cum-desktop-computer, called the Ubuntu Edge — a smartphone-sized device that would have been powerful enough to transform from a pocket computer into a fully fledged desktop when plugged into a keyboard and monitor, running Ubuntu’s full-fat desktop OS. Canonical had sought to raise a hefty $32 million in crowdfunding to make that project fly. Hence its more modest, mid-tier smartphone debut now.
On the hardware side, Ubuntu’s first smartphone offers pretty bog standard mid-range specs, with a 4.5 inch screen, 1GB RAM, a quad-core A7 chip running at “up to 1.3Ghz”, 8GB of on-board storage, 8MP rear camera and 5MP front-facing lens, plus a dual-SIM slot. But it’s the mobile software that’s the novelty here (demoed in action in Canonical’s walkthrough video, embedded below).
Canonical has created a gesture-based smartphone interface called Scopes, which puts the homescreen focus on a series of themed cards that aggregate content and which the user swipes between to navigate around the functions of the phone, while app icons are tucked away to the side of the screen, or gathered together on a single Scope card. Examples include a contextual ‘Today’ card which contains info like weather and calendar, or a ‘Nearby’ card for location-specific local services, or a card for accessing ‘Music’ content on the device, or ‘News’ for accessing various articles in one place.
It’s certainly a different approach to the default grid of apps found on iOS and Android but has some overlap with other, alternative platforms such as Palm’s WebOS, or the rebooted BlackBerry OS, or Jolla’s Sailfish. The problem is, as with all such smaller OSes, it will be an uphill battle for Canonical to attract developers to build content for its platform to make it really live and breathe. (It’s got a few third parties offering content at launch — including Songkick, The Weather Channel and TimeOut.) And, crucially, a huge challenge to convince consumers to try something different which requires they learn new mobile tricks. Especially given that people can’t try before they buy — as the device will be sold online only.
Canonical said the Aquaris E4.5 Ubuntu Edition will be made available in a series of flash sales over the coming weeks, via BQ.com. Sales will be announced through Ubuntu and BQ’s social media channels — perhaps taking a leaf out of the retail strategy of Android smartphone maker Xiaomi, which uses online flash sales to hype device launches and shift inventory quickly. Building similar hype in a mature smartphone market like Europe — for mid-tier hardware — is going to be a Sisyphean task. But Canonical claims to be in it for the long, uphill haul.
“We are going for the mass market,” Cristian Parrino, its VP of Mobile, told Engadget. “But that’s a gradual process and a thoughtful process. That’s something we’re going to be doing intelligently over time — but we’ll get there.”
Monday, February 02. 2015
The next generation of cloud servers might be deployed where the clouds can be made of alcohol and cosmic dust: in space. That’s what ConnectX wants to do with their new data visualization platform.
Why space? It’s not as though there isn’t room to set up servers here on Earth, what with Germans willing to give up space in their utility rooms in exchange for a bit of ambient heat and malls now leasing empty storefronts to service providers. But there are certain advantages.
The desire to install servers where there’s abundant, free cooling makes plenty of sense. Down here on Earth, that’s what’s driven companies like Facebook to set up shop in Scandinavia near the edge of the Arctic Circle. Space gets a whole lot colder than the Arctic, so from that standpoint the ConnectX plan makes plenty of sense. There’s also virtually no humidity, which can wreak havoc on computers.
They also believe that the zero-g environment would do wonders for the lifespan of the hard drives in their servers, since it could reduce the resistance they encounter while spinning. That’s the same reason Western Digital started filling hard drives with helium.
But what about data transmission? How does ConnectX plan on moving the bits back and forth between their orbital servers and networks back on the ground? Though something similar to NASA’s Lunar Laser Communication Demonstration — which beamed data to the moon 4,800 times faster than any RF system ever managed — seems like a decent option, they’re leaning on RF.
Mind you, it’s a fairly complex setup. ConnectX says they’re polishing a system that “twists” signals to reap massive transmission gains. A similar system demonstrated last year managed to push data over radio waves at a staggering 32Gbps, around 30 times faster than LTE.
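To put that figure in perspective, here's a quick back-of-the-envelope comparison. The 8 GB file size and the 1 Gbps comparison rate are illustrative round numbers, not figures from ConnectX:

```python
# Back-of-the-envelope: time to move an 8 GB file at 32 Gbps
# versus a 1 Gbps link. Sizes and rates are in bits / bits per second.
file_bits = 8 * 8 * 10 ** 9     # 8 GB expressed in bits
twisted_rate = 32 * 10 ** 9     # the 32 Gbps demonstrated rate
baseline_rate = 1 * 10 ** 9     # assumed ~1 Gbps peak for comparison

print(f"twisted-signal link: {file_bits / twisted_rate:.1f} s")
print(f"1 Gbps link:         {file_bits / baseline_rate:.1f} s")
```

Two seconds versus about a minute for the same file — the sort of headroom that makes a space-to-ground RF link plausible for serving data, if the real-world rate holds up at orbital distances.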
So ConnectX seems to have that sorted. The only real question is the cost of deployment. Can the potential reduction in long-term maintenance costs really offset the massive expense of actually getting their servers into orbit? And what about upgrading capacity? It’s certainly not going to be nearly as fast, easy, or cheap as it is to do on Earth. That’s up to ConnectX to figure out, and they seem confident that they can make it work.
Thursday, December 11. 2014
Google's moonshot group Google X is working on a pill that, when swallowed, will seek out cancer cells in your body. It'll seek out all sorts of diseases, in fact, pushing the envelope when it comes to finding and destroying diseases at their earliest stages of development. This system would face "a much higher regulatory bar than conventional diagnostic tools," so says Chad A. Mirkin, director of the International Institute for Nanotechnology at Northwestern University.
Word comes from the Wall Street Journal, where Andrew Conrad, head of the Life Sciences team at the Google X research lab, spoke during the WSJD Live conference. This system is called "The Nano Particle Platform," and it aims to "functionalize Nano Particles, to make them do what we want."
According to Conrad, Google X is working on two separate devices, more than likely. The first is the pill which contains their smart nanoparticles. The second is a wearable device that attracts the particles so that they might be counted.
"Our dream," said Conrad, "is that every test you ever go to the doctor for will be done through this system."
Sound like a good idea to you?
Friday, November 07. 2014
For an industry run according to logic and rationality, at least outwardly, the tech world seems to have a surprising weakness for hype and the 'next big thing'.
Perhaps that's because, unlike — say — in sales or HR, where innovation is defined by new management strategies, tech investment is very product driven. Buying a new piece of hardware or software often carries the potential for a 'disruptive' breakthrough in productivity or some other essential business metric. Tech suppliers therefore have a vested interest in promoting their products as vigorously as possible: the level of spending on marketing and customer acquisition by some fast-growing tech companies would turn many consumer brands green with envy.
As a result, CIOs are tempted by an ever-changing array of tech buzzwords (cloud, wearables and the Internet of Things [IoT] are prominent in the recent crop) through which they must sift in order to find the concepts that are a good fit for their organisations, and that match their budgets, timescales and appetite for risk. Short-term decisions are relatively straightforward, but the further you look ahead, the harder it becomes to predict the winners.
Tech innovation in a one-to-three year timeframe
Despite all the temptations, the technologies that CIOs are looking at deploying in the near future are relatively uncontroversial — pretty safe bets, in fact. According to TechRepublic's own research, top CIO investment priorities over the next three years include security, mobile, big data and cloud. Fashionable technologies like 3D printing and wearables find themselves at the bottom of the list.
A separate survey from Deloitte reported similar findings: many of the technologies that CIOs are piloting and planning to implement in the near future are ones that have been around for quite some time — business analytics, mobile apps, social media and big data tools, for example. Augmented reality and gamification were seen as low-priority technologies.
This reflects the priorities of most CIOs, who tend to focus on reliability over disruption: in TechRepublic's research, 'protecting/securing networks and data' trumps 'changing business requirements' for understandably risk-wary tech chiefs.
Another major factor here is money: few CIOs have a big budget for bets on blue-skies innovation projects, even if they wanted to. (And many no doubt remember the excesses of the dotcom years, and are keen to avoid making that mistake again.)
According to the research by Deloitte, less than 10 percent of the tech budget is ring-fenced for technology innovation (and CIOs that do spend more on innovation tend to be in smaller, less conservative, companies). There's another complication in that CIOs increasingly don't control the budget dedicated to innovation, as this is handed to other business units (such as marketing or digital) that are considered to have a more entrepreneurial outlook.
CIOs tend to blame their boss's conservative attitude to risk as the biggest constraint in making riskier IT investments for innovation and growth. Although CIOs claim to be willing to take risks with IT investments, this attitude does not appear to match up with their current project portfolios.
Another part of the problem is that it's very hard to measure the return on some of these technologies. Managers have been used to measuring the benefits of new technologies using a standard return-on-investment measure that tracks some very obvious costs — headcount or spending on new hardware, for example. But defining the return on a social media project or an IoT trial is much more slippery.
Tech investment: A medium-term view
If CIO investment plans remain conservative and hobbled by a limited budget in the short term, you have to look a little further out to see where the next big thing in tech might come from.
One place to look is in what's probably the best-known set of predictions about the future of IT: Gartner's Hype Cycle for Emerging Technologies, which tries to assess the potential of new technologies while taking into account the expectations surrounding them.
The chart grades technologies not only by how far they are from mainstream adoption, but also on the level of hype surrounding them, and as such it demonstrates what the analysts argue is a fundamental truth: that we can't help getting excited about new technology, but that we also rapidly get turned off when we realize how hard it can be to deploy successfully. The exotically-named Peak of Inflated Expectations is commonly followed by the Trough of Disillusionment, before technologies finally make it up the Slope of Enlightenment to the Plateau of Productivity.
"It was a pattern we were seeing with pretty much all technologies — that up-and-down of expectations, disillusionment and eventual productivity," says Jackie Fenn, vice-president and Gartner fellow, who has been working on the project since the first hype cycle was published 20 years ago. The pattern, she says, is an example of the human reaction to any novelty.
"It's not really about the technologies themselves, it's about how we respond to anything new. You see it with management trends, you see it with projects. I've had people tell me it applies to their personal lives — that pattern of the initial wave of enthusiasm, then the realisation that this is much harder than we thought, and then eventually coming to terms with what it takes to make something work."
According to Gartner's 2014 list, the technologies expected to reach the Plateau of Productivity (where they become widely adopted) within the next two years include speech recognition and in-memory analytics.
Technologies that might take two to five years until mainstream adoption include 3D scanners, NFC and cloud computing. Cloud is currently entering Gartner's trough of disillusionment, where early enthusiasm is overtaken by the grim reality of making this stuff work: "there are many signs of fatigue, rampant cloudwashing and disillusionment (for example, highly visible failures)," Gartner notes.
When you look at a 5-10-year horizon, the predictions include virtual reality, cryptocurrencies and wearable user interfaces.
Working out when the technologies will make the grade, and thus how CIOs should time their investments, seems to be the biggest challenge. Several of the technologies on Gartner's first-ever hype curve back in 1995 — including speech recognition and virtual reality — are still on the 2014 hype curve without making it to primetime yet.
These sorts of user interface technologies have taken a long time to mature, says Fenn. For example, voice recognition started to appear in very structured call centre applications, while the latest incarnation is something like Siri — "but it's still not a completely mainstream interface," she says.
Nearly all technologies go through the same rollercoaster ride, because our response to new concepts remains the same, says Fenn. "It's an innate psychological reaction — we get excited when there's something new. Partly it's the wiring of our brains that attracts us — we want to keep going around the first part of the cycle where new technologies are interesting and engaging; the second half tends to be the hard work, so it's easier to get distracted."
But even if they can't escape the hype cycle, CIOs can use concepts like this to manage their own impulses: if a company's investment strategy means it's consistently adopting new technologies when they are most hyped (remember a few years back when every CEO had to blog?) then it may be time to reassess, even if the CIO peer-pressure makes it difficult.
Says Fenn: "There is that pressure, that if you're not doing it you just don't get it — and it's a very real pressure. Look at where [new technology] adds value and if it really doesn't, then sometimes it's fine to be a later adopter and let others learn the hard lessons if it's something that's really not critical to you."
The trick, she says, is not to force-fit innovation, but to continually experiment and not always expect to be right.
Looking further out, the technologies labelled 'more than 10 years' to mainstream adoption on Gartner's hype cycle are the rather sci-fi-inflected ones: holographic displays, quantum computing and human augmentation. As such, it's a surprisingly entertaining romp through the relatively near future of technology, from the rather mundane to the completely exotic. "Employers will need to weigh the value of human augmentation against the growing capabilities of robot workers, particularly as robots may involve fewer ethical and legal minefields than augmentation," notes Gartner.
Where the futurists roam
Beyond the 10-year horizon, you're very much into the realm where the tech futurists roam.
Steve Brown, a futurist at chip-maker Intel argues that three mega-trends will shape the future of computing over the next decade. "They are really simple — it's small, big and natural," he says.
'Small' is the consequence of Moore's Law, which will continue the trend towards small, low-power devices, making the rise of wearables and the IoT more likely. 'Big' refers to the ongoing growth in raw computing power, while 'natural' is the process by which everyday objects are imbued with some level of computing power.
"Computing was a destination: you had to go somewhere to compute — a room that had a giant whirring computer in it that you worshipped, and you were lucky to get in there. Then you had the era where you could carry computing with you," says Brown.
"The next era is where the computing just blends into the world around us, and once you can do that, and instrument the world, you can essentially make everything smart — you can turn anything into a computer. Once you do that, profoundly interesting things happen," argues Brown.
With this level of computing power comes a new set of problems for executives, says Brown. The challenge for CIOs and enterprise architects is that once they can make everything smart, what do they want to use it for? "In the future you have all these big philosophical questions that you have to answer before you make a deployment," he says.
Brown envisages a world of ubiquitous processing power, where robots are able to see and understand the world around them.
"Autonomous machines are going to change everything," he claims. "The challenge for enterprise is how humans will work alongside machines — whether that's a physical machine or an algorithm — and what's the best way to take a task and split it into the innately human piece and the bit that can be optimized in some way by being automated."
The pace of technological development is accelerating: where we used to have a decade to make these decisions, these things are going to hit us faster and faster, argues Brown. All of which means we need to make better decisions about how to use new technology — and will face harder questions about privacy and security.
"If we use this technology, will it make us better humans? Which means we all have to decide ahead of time what do we consider to be better humans? At the enterprise level, what do we stand for? How do we want to do business?"
Not just about the hardware and software
For many organizations there's a big stumbling block in the way of this bright future — their own staff and their ways of working. Figuring out what to invest in may be a lot easier than persuading staff, and whole organisations, to change how they operate.
"What we really need to figure out is the relationship between humans and technology, because right now humans get technology massively wrong," says Dave Coplin, chief envisioning officer for Microsoft (a firmly tongue-in-cheek job title, he assures me).
Coplin argues that most of us tend to use new technology to do things the way we've always been doing them for years, when the point of new technology is to enable us to do things fundamentally differently. The concept of productivity is a classic example: "We've got to pick apart what productivity means. Unfortunately most people think process is productivity — the better I can do the processes, the more productive I am. That leads us to focus on the wrong point, because actually productivity is about leading to better outcomes." Three-quarters of workers think a productive day in the office is clearing their inbox, he notes.
Developing a better relationship with technology is necessary because of the huge changes ahead, argues Coplin: "What happens when technology starts to disappear into the background; what happens when every surface has the capability to have contextual information displayed on it based on what's happening around it, and who is looking at it? This is the kind of world we're heading into — a world of predictive data that will throw up all sorts of ethical issues. If we don't get the humans ready for that change we'll never be able to make the most of it."
Nicola Millard, a futurologist at telecoms giant BT, echoes these ideas, arguing that CIOs have to consider not just changes to the technology ahead of them, but also changes to the workers: a longer working life requires workplace technologies that appeal to new recruits as well as staff into their 70s and older. It also means rethinking the workplace: "The open-plan office is a distraction machine," she says — but can you be innovative in a grey cubicle? Workers using tablets might prefer 'perch points' to desks, those using gesture control may need more space. Even the role of the manager itself may change — becoming less about traditional command and control, and more about being a 'party host', finding the right mix of skills to get the job done.
In the longer term, not only will the technology change profoundly, but the workers and managers themselves will also need to upgrade their thinking.
Wednesday, November 05. 2014
Via Ars Technica
A two-stage attack could allow spies to sneak secrets out of the most sensitive buildings, even when the targeted computer system is not connected to any network, researchers from Ben-Gurion University of the Negev in Israel stated in an academic paper describing the refinement of an existing attack.
The technique, called AirHopper, assumes that an attacker has already compromised the targeted system and desires to occasionally sneak out sensitive or classified data. Known as exfiltration, such occasional communication is difficult to maintain, because government technologists frequently separate the most sensitive systems from the public Internet for security. Known as an air gap, such a defensive measure makes it much more difficult for attackers to compromise systems or communicate with infected systems.
Yet, by using a program to create a radio signal using a computer’s video card—a technique known for more than a decade—and a smartphone capable of receiving FM signals, an attacker could collect data from air-gapped devices, a group of four researchers wrote in a paper presented last week at the IEEE 9th International Conference on Malicious and Unwanted Software (MALCON).
“Such technique can be used potentially by people and organizations with malicious intentions and we want to start a discussion on how to mitigate this newly presented risk,” Dudu Mimran, chief technology officer for the cyber security labs at Ben-Gurion University, said in a statement.
For the most part, the attack is a refinement of existing techniques. Intelligence agencies have long known—since at least 1985—that electromagnetic signals could be intercepted from computer monitors to reconstitute the information being displayed. Open-source projects have turned monitors into radio-frequency transmitters. And, from the information leaked by former contractor Edward Snowden, the National Security Agency appears to use radio-frequency devices implanted in various computer-system components to transmit information and exfiltrate data.
AirHopper uses off-the-shelf components, however, to achieve the same result. By using a smartphone with an FM receiver, the exfiltration technique can grab data from nearby systems and send it to a waiting attacker once the smartphone is again connected to a public network.
“This is the first time that a mobile phone is considered in an attack model as the intended receiver of maliciously crafted radio signals emitted from the screen of the isolated computer,” the group said in its statement on the research.
The technique works at a distance of 1 to 7 meters, but can only send data at very slow rates—less than 60 bytes per second, according to the researchers.
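At that ceiling, AirHopper is useless for bulk theft but entirely viable for small, high-value secrets. A rough calculation makes the point; the payload sizes below are illustrative choices, not figures from the paper:

```python
# How long AirHopper-style exfiltration takes at the reported ceiling
# of 60 bytes per second, for a few illustrative payload sizes.
RATE = 60  # bytes per second (upper bound reported by the researchers)

payloads = {
    "256-bit key": 32,              # bytes
    "small credential file": 4_096,
    "one-page document": 50_000,
}

for name, size in payloads.items():
    print(f"{name}: {size / RATE:.0f} s")
```

A cryptographic key leaks in under a second and a page of text in about fourteen minutes — slow, but fast enough to matter when the data on the other side of the air gap is classified.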
Wednesday, October 29. 2014
Engineers at Stanford University have developed a tiny radio that’s about as big as an ant and that’s cheap and small enough that it could help realize the “Internet of things”—the world of everyday objects that send and receive data via the Internet.
The radio is built on a piece of silicon that measures just a few millimeters on each side. Several tens of them can fit on the top of a U.S. penny and the radio itself is expected to cost only a few pennies to manufacture in mass quantities.
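The "several tens per penny" claim checks out with simple area arithmetic. The sketch below assumes a 3 mm-square die (the article only says "a few millimeters on each side") and the US penny's standard 19.05 mm diameter:

```python
import math

# Rough packing check: how many few-millimeter dies fit on a US penny, by area.
penny_diameter_mm = 19.05   # standard US penny diameter
die_side_mm = 3.0           # assumed: "a few millimeters on each side"

penny_area = math.pi * (penny_diameter_mm / 2) ** 2  # about 285 mm^2
die_area = die_side_mm ** 2                          # 9 mm^2

print(f"~{penny_area / die_area:.0f} dies by area")  # on the order of 30
```

An area-based count ignores packing losses at the coin's edge, so the real number is a bit lower, but the order of magnitude matches the claim.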
Part of the secret to the radio’s size is its lack of a battery. Its power requirements are sufficiently frugal that it can harvest the energy it needs from nearby radio fields, such as those from a reader device when it’s brought nearby.
RFID tags and contactless smartcards can get their power the same way, drawing energy from a radio source, but Stanford’s radio has more processing power than those simpler devices, a university representative said. That means it could query a sensor for its data, for instance, and transmit it when required.
The device operates in the 24GHz and 60GHz bands, suitable for communications over a few tens of centimeters.
Engineers envisage a day when trillions of objects are connected via tiny radios to the Internet. Data from the devices is expected to help realize smarter and more energy-efficient homes, although quite how it will all work is yet to be figured out. Radios like the one from Stanford should help greatly expand the number of devices that can collect and share data.
The radio was demonstrated by Amin Arbabian, an assistant professor of electrical engineering at Stanford and one of the developers of the device, at the recent VLSI Technology and Circuits Symposium in Hawaii.
Tuesday, September 09. 2014
Via Tech Crunch
Mimosa Networks is finally ready to help make gigabit wireless technology a reality. The company, which recently came out of stealth, is launching a series of products that it hopes to sell to a new generation of wireless ISPs.
Its wireless Internet products include its new B5 Backhaul radio hardware and its Mimosa Cloud Services planning and analytics offering. By using the two in combination, new ISPs can build high-capacity wireless networks at a fraction of the cost it would take to lay a fiber network.
The B5 backhaul radio is a piece of hardware that uses multiple-input and multiple-output (MIMO) technology to provide up to 16 streams and 4 Gbps of output when multiple radios are using the same channel.
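The article's headline figures imply a simple per-stream budget. As a hedged sketch of that arithmetic (the 250 Mbps-per-stream figure is derived here, not quoted from Mimosa's spec sheet):

```python
# Arithmetic behind the B5's reported figures: 4 Gbps aggregate
# output across 16 MIMO streams sharing one channel.
AGGREGATE_GBPS = 4.0   # reported aggregate output
NUM_STREAMS = 16       # reported maximum number of MIMO streams

# Average throughput each stream would need to contribute.
per_stream_mbps = AGGREGATE_GBPS * 1000 / NUM_STREAMS
print(per_stream_mbps)  # 250.0 Mbps per stream
```

Real links would see this vary with modulation, interference, and distance, but the average gives a sense of what each spatial stream must carry.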
With a single B5 radio, customers can provide a gigabit of throughput over up to eight or nine miles, according to co-founder and chief product officer Jaime Fink. The longer the distance, the less bandwidth is available, of course. But Fink said the company is running one link of about 60 miles along the California coast that still gets several hundred megabits of throughput.
Not only does the product offer high data speeds on 5 GHz wireless spectrum, but it also makes that spectrum more efficient. It uses spectrum analysis and load balancing to optimize bandwidth, frequency, and power use based on historical and real-time data to adapt to wireless interference and other issues.
In addition to the hardware, Mimosa's cloud services will help customers plan and deploy networks, with analytics tools that show how well their existing equipment is performing. That will enable new ISPs to more effectively determine where to place new hardware to link up with other base stations.
The product is also designed to support networks as they grow, and to make sure that ISPs can spot problems as they happen. The Cloud Services product is available now, but the backhaul radio will be available for about $900 later this fall.
Mimosa is launching these first products after raising $38 million in funding from New Enterprise Associates and Oak Investment Partners, including a recently closed $20 million Series C round led by returning investor NEA.
The company was founded by Brian Hinman, who had previously co-founded PictureTel, Polycom, and 2Wire, along with Fink, who previously served as CTO of 2Wire and SVP of technology for Pace after it acquired the home networking equipment company. Now they’re hoping to make wireless gigabit speeds available for new ISPs.
Monday, September 08. 2014
After Leap Motion's somewhat disappointing debut, you'd be forgiven for wanting to wave off the idea of third-party gesture control peripherals. But wait! Unlike Leap, Reactiv isn't trying to revolutionize human-computer interactions with its Touch+ controller: there's no wizard-like finger waggling or Minority Report-style hand waving here. Instead, the Touch+'s dual cameras turn any surface into a multi-touch input device.
Touch+ was born out of Haptix, a Kickstarter project that raised more than $180,000 from backers. Over the past year, Reactiv refined the Haptix vision to eventually become Touch+.
While Touch+ certainly won't be for everyone, Reactiv is positioning the multitouch PC controller as more than a mere tool for games and art projects. The video above shows the device being used in an office meeting, acting as a cursor control for a businessman's laptop before being repositioned on the fly to point at a projected display, instantly allowing the man to reach up with his hands to circle objects on the image.
What's more, PCWorld sister site CITEWorld managed to snag a live demo with Touch+, and the founders focused on the potential productivity uses of the device: enabling mouse-free control of Excel and PowerPoint, naturally manipulating pictures in Photoshop, creating designs in CAD, the aforementioned presentation capabilities, and so forth.
The Touch+ works with Windows PCs or Macs, connecting via USB 2.0 or 3.0. If you choose to point it at your keyboard, the device will temporarily suspend its multitouch capabilities while you type, then resume when your fingers stop bobbing up and down.
Sound interesting? An alpha version of Touch+ is available now on the Reactiv website for $75. Until we get our own hands on the device, however, we won't know for sure how the device stacks up to competitors like the Leap Motion.