Thursday, January 05. 2012

Cognitive Computing: When Computers Become Brains

Via Forbes, by Roger Kay

-----

IBM researchers reverse-engineered a macaque brain as a start to engineering one of their own
The gnomes at IBM’s research labs were not content to make merely a genius computer that could beat any human at the game of Jeopardy!. They had to go and create a new kind of machine intelligence that mimics the actual human brain. Watson, the reigning Jeopardy! champ, is smart, but it’s still recognizably a computer. This new stuff is something completely different. IBM is setting out to build an electronic brain from the ground up.

Cognitive computing, as the new field is called, takes computing concepts to a whole new level. Earlier this week, Dharmendra Modha, who works at IBM’s Almaden Research Center, regaled a roomful of analysts with what cognitive computing can do and how IBM is going about making a machine that thinks the way we do. His own blog on the subject is here.

First, Modha described the challenges, which involve aspects of neuroscience, supercomputing, and nanotechnology. The human brain integrates memory and processing, weighs less than 3 lbs, occupies about a two-liter volume, and uses less power than a light bulb. It operates as a massively parallel, distributed processor. It is event-driven; that is, it reacts to things in its environment, using little power when active and even less while resting. It is a reconfigurable, fault-tolerant learning system. It is excellent at pattern recognition and teasing out relationships.

A computer, on the other hand, has separate memory and processing. It does its work sequentially for the most part and is run by a clock. The clock, like a drum majorette in a military band, drives every instruction and piece of data to its next location — musical chairs with enough chairs. As clock rates increase to drive data faster, power consumption goes up dramatically, and even at rest these machines need a lot of electricity. More importantly, computers have to be programmed. They are hard-wired and fault-prone. They are good at executing defined algorithms and performing analytics.
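The clocked-versus-event-driven contrast can be sketched in a few lines of Python. This is a toy illustration, not IBM's design: a clocked machine does bookkeeping on every tick, while an event-driven one only works when a stimulus actually arrives.

```python
import random

random.seed(0)

# A sparse stream of stimuli: most of the time, nothing is happening.
events = [random.random() < 0.05 for _ in range(10_000)]

# Clock-driven machine: the clock forces an operation on every single tick.
clocked_ops = len(events)

# Event-driven "brain": work happens only when a stimulus arrives.
event_ops = sum(1 for e in events if e)

print(clocked_ops, event_ops)  # the event-driven count is far smaller
```

With stimuli arriving only 5% of the time, the event-driven tally is roughly a twentieth of the clocked one, which is the intuition behind the brain's low idle power.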
With $41 million in funding from the Defense Advanced Research Projects Agency (DARPA), the scientists at the Almaden lab set out to make a brain in a project called Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE). The rough analogy between a brain and a computer posits roles for cell types — neurons, axons, and synapses — that correspond to machine elements — processors, communications links, and memory. The matches are not exact, as brain cells’ functions are less distinct from one another than the computer elements’. But the key is that the brain’s elements all reside near each other, and activity in any given complex is stimulated by activity in adjacent complexes. That is, thoughts stimulate other thoughts.

Modha and his team set out to map and synthesize a wiring diagram for the brain, no trivial task, as the brain has 22 billion neurons and 220 trillion synapses. In May 2009, the team managed to simulate a system with 1 billion neurons, roughly the brain of a lower mammal, except that it ran at one-thousandth of real time, not fast enough to perform what Modha called “the four Fs”: food, fight, flight, and mating.

But the structure of this machine is entirely different from today’s commercial computers. The memory and processing elements are built close together. It has no clock. Operations are asynchronous and event-driven; that is, they have no predetermined order or schedule. And instead of being programmed, it learns. Just like us. Part of getting the power down to brain-like levels is not storing temporary results (caching, in industry jargon): sensing stimulates action, which is sensed and acted upon further, and so on.

The team recently built a smaller hardware version of the brain simulation, one with just 256 neurons, 262,000 programmable synapses, and 65,000 learning synapses. The good news is that this machine runs within an order of magnitude of the power that a real brain consumes.
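The neuron model behind simulations of this kind can be illustrated with a minimal leaky integrate-and-fire sketch. This is a textbook toy, not IBM's actual circuit design: the "neuron" accumulates weighted input, leaks a little charge each step, and emits a spike when its potential crosses a threshold.

```python
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Minimal leaky integrate-and-fire neuron: integrate input with leak,
    spike when the potential crosses the threshold, then reset."""
    v, spikes = 0.0, []
    for t, x in enumerate(inputs):
        v = v * leak + x          # integrate input, with leakage
        if v >= threshold:        # threshold crossing: emit a spike
            spikes.append(t)
            v = 0.0               # reset the membrane potential
    return spikes

print(simulate_lif([0.4, 0.4, 0.4, 0.0, 0.6, 0.6]))  # → [2, 5]
```

Note how the same input value can or cannot trigger a spike depending on recent history; that state-dependence, multiplied across billions of neurons and trillions of synapses, is what the wiring-diagram work has to capture.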
With its primitive capabilities, this brainlette is capable of spatial navigation, machine vision, pattern recognition, and associative memory, and can do evidence-based hypothesis generation. It has a “mind’s eye” that can see a pattern, for example a badly written number, and generate a good guess as to what the actual number is. Already better than our Precambrian ancestors. Modha pointed out that this type of reasoning is a lot like that of the brain’s right hemisphere: intuitive, parallel, synthetic. Not content with half a brain, Modha envisions adding a typical von Neumann-type computer, which acts more like a reasoning left hemisphere, to the mix, and having the two share information, just like a real brain. When this brain is ready to go to market, I’m going to send my own on holiday and let Modha’s do my thinking for me.

Oh, and, by the way, in case you were wondering whether the SyNAPSE project has caused Watson to be put out to pasture, nothing could be further from the truth. Watson is alive and well and moving on to new, more practical applications. For example, since Jeopardy! contestants can’t “call a friend,” Watson was constrained to the data that could be loaded directly into the machine (no Internet searches). But in the latest application of Watson technology, medical diagnosis, the Internet is easily added to the corpus within the machine, allowing Watson to search a much wider range of unstructured data before rendering an answer. Watson had to hit the buzzer faster than the human contestants, but doctors seeking advice on a strange set of symptoms can easily wait a half hour or longer, so Watson can make more considered choices. Watson at work is a serious tool. All this genius is causing my brain to explode.

Disclosure: Endpoint has a consulting relationship with IBM.
Monday, December 19. 2011

Concept vending machine knows exactly what you want

Via Dig Info TV

-----
This concept model for a next-generation vending machine, which features a see-through display, is being developed by Sanden, a large manufacturer of vending machines, in conjunction with Okaya Electronics and Intel. The concept model has a vertical, 65-inch, Full HD transparent display. The products behind the display can be seen through the glass, and you can simultaneously see high-definition text, pictures, and Flash animations on the display.

"This vending machine uses an Intel Sandy Bridge Core processor. It features Audience Impression Metric, or AIM, and can do anonymous face recognition. So this machine can recognize whether customers are male or female, or old or young."

When there aren't any customers, the machine shows a large digital clock and animations to attract the attention of people passing by. If a customer stands in front of the machine, it estimates their attributes from anonymous video analysis and shows advertising content to match the customer's demographic.

"In this demo, we're suggesting that vending machines could be used to purchase luxury items, such as cosmetics and wine. The machine also has a public safety mode for times of emergency, which shows information such as evacuation routes."

"I think this machine could be used in lots of ways, depending on customers' imagination. It has a great many possibilities, so we'd like to get ideas from everyone, rather than just using it as a regular vending machine."
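The three behaviors described above (attract mode, demographic targeting, and a public safety mode) amount to a simple decision loop. The sketch below is purely hypothetical; the attribute names, categories, and promotions are illustrative, not Sanden's actual software.

```python
# Hypothetical content table keyed by estimated viewer attributes.
CONTENT = {
    ("female", "adult"): "cosmetics promotion",
    ("male", "adult"): "wine promotion",
}

def choose_content(viewer=None, emergency=False):
    """Pick what to show, mimicking the described modes."""
    if emergency:                      # public safety mode trumps everything
        return "evacuation routes"
    if viewer is None:                 # nobody nearby: attract mode
        return "digital clock and animations"
    key = (viewer.get("sex"), viewer.get("age_group"))
    return CONTENT.get(key, "general product ads")

print(choose_content())                                         # attract mode
print(choose_content({"sex": "female", "age_group": "adult"}))  # targeted ad
print(choose_content(emergency=True))                           # safety mode
```

The interesting design point is that the face-recognition step is anonymous: only coarse attributes ever reach the content selector, never an identity.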
Tuesday, December 13. 2011

Paris creates the largest Google Earth display ever

Tuesday, December 06. 2011

Senseg demonstrates new technology that creates textures on flat screens

Via gizmag

-----
What if you could feel what's on your television screen? Tech company Senseg is working on a way for you to someday be able to do just that, and recently demonstrated a prototype tablet that is already able to make that magic happen.
The tech is made possible using an electrostatic-field-based system that allows different parts of the screen to produce varying degrees of friction. So, while you're touching a flat screen, it feels like you're touching something textured instead. Your traditional screen is turned into what Senseg is calling a "Feel Screen," allowing you to feel the textures, contours, and edges of things displayed in front of you. Feel Screens don't rely on moving parts in the screen itself, and could be integrated into devices we use today such as smartphones, tablets, and televisions. Senseg's technology is still very much in prototype form, but could be headed our way in the next 24 months.

Thursday, November 24. 2011

Kinect for Windows Coming in 2012

Via Nexus

-----

The Kinect for Windows team has announced that Microsoft is releasing PC-specific Kinect hardware in 2012, along with a Kinect SDK that will enable developers to build all sorts of apps for Kinect. Will the emergence of motion-based interfaces eventually overtake touchscreens in terms of intuitiveness in computing user interfaces? Microsoft earlier announced the beta release of the Kinect for Windows SDK, which enabled hackers and enthusiasts to build programs that take advantage of the Kinect's motion-sensing capabilities. Just recently, Kinect for Windows GM Craig Eisler announced that the company's commercial program will launch in 2012, involving not only software but also new Kinect hardware specifically designed for personal computers. The new hardware will take advantage of computers' different capabilities and address the drawbacks of using the Kinect sensor meant for Microsoft's Xbox 360 console. In particular, the benefits of the upcoming hardware will include the following:
Microsoft is also launching a new Kinect Accelerator business incubator through Microsoft BizSpark, through which the company will actively assist 10 companies in using the Kinect as a development platform. Microsoft sees businesses and end users taking advantage of Kinect well beyond gaming. This can include developing special apps for people with disabilities, special needs, or injuries, and can also open the floodgates to other gesture- and motion-based UI engineering. Will Kinect be the next big thing in user experience engineering after multi-touch screens and accelerometers?

Wednesday, November 23. 2011

Kilobots - tiny, collaborative robots - are leaving the nest

Via PhysOrg

-----

The Kilobots are an inexpensive system for testing synchronized and collaborative behavior in a very large swarm of robots. Photo courtesy of Michael Rubenstein

The Kilobots are coming. Computer scientists and engineers at Harvard University have developed and licensed technology that will make it easy to test collective algorithms on hundreds, or even thousands, of tiny robots. Called Kilobots, the quarter-sized, bug-like devices scuttle around on three toothpick-like legs, interacting and coordinating their own behavior as a team. A June 2011 Harvard Technical Report demonstrated a collective of 25 machines implementing swarming behaviors such as foraging, formation control, and synchronization. Once up and running, the machines are fully autonomous, meaning there is no need for a human to control their actions.
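One of the swarm behaviors mentioned, synchronization, has a classic decentralized recipe: each robot repeatedly nudges its internal clock toward the average of what it hears from its immediate neighbors, with no central coordinator. The sketch below simulates 25 robots on a ring; it is an illustration of the general technique, not the Kilobot firmware.

```python
import random

random.seed(1)

# 25 robots start with randomly offset internal clocks.
clocks = [random.random() for _ in range(25)]
n = len(clocks)

for _ in range(200):
    # Each robot hears only its two ring neighbors and averages with them.
    clocks = [
        (clocks[i - 1] + clocks[i] + clocks[(i + 1) % n]) / 3
        for i in range(n)
    ]

spread = max(clocks) - min(clocks)
print(f"clock spread after 200 rounds: {spread:.6f}")  # near zero
```

Because each update uses only local information, the same rule scales from 25 robots to a thousand, which is exactly the property swarm researchers want to test on physical hardware.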
The communicative critters were created by members of the Self-Organizing Systems Research Group led by Radhika Nagpal, the Thomas D. Cabot Associate Professor of Computer Science at the Harvard School of Engineering and Applied Sciences (SEAS) and a Core Faculty Member at the Wyss Institute for Biologically Inspired Engineering at Harvard. Her team also includes Michael Rubenstein, a postdoctoral fellow at SEAS; and Christian Ahler, a fellow of SEAS and the Wyss Institute. Thanks to a technology licensing deal with the K-Team Corporation, a Swiss manufacturer of high-quality mobile robots, researchers and robotics enthusiasts alike can now take command of their own swarm. One key to achieving high-value applications for multi-robot systems in the future is the development of sophisticated algorithms that can coordinate the actions of tens to thousands of robots. "The Kilobot will provide researchers with an important new tool for understanding how to design and build large, distributed, functional systems," says Michael Mitzenmacher, Area Dean for Computer Science at SEAS. "Plus," he adds, "tiny robots are really cool!" The name "Kilobot" does not refer to anything nefarious; rather, it describes the researchers' goal of quickly and inexpensively creating a collective of a thousand bots.
Inspired by nature, such swarms resemble social insects, such as ants and bees, that can efficiently search for and find food sources in large, complex environments, collectively transport large objects, and coordinate the building of nests and other structures. For reasons of time, cost, and simplicity, the algorithms being developed in research labs today are validated only in computer simulation or on a few dozen robots at most. In contrast, the design by Nagpal's team allows a single user to easily oversee the operation of a large Kilobot collective, including programming, powering on, and charging all the robots, all of which would be difficult (if not impossible) with existing robotic systems.

So, what can you do with a thousand tiny little bots? Robot swarms might one day tunnel through rubble to find survivors, monitor the environment and remove contaminants, and self-assemble to form support structures in collapsed buildings. They could also be deployed to autonomously perform construction in dangerous environments, to assist with pollination of crops, or to conduct search-and-rescue operations. For now, the Kilobots are designed to provide scientists with a physical testbed for advancing the understanding of collective behavior and realizing its potential to deliver solutions for a wide range of challenges.

-----

Personal comment: This reminds me of a project I worked on back in 2007, called "Variable Environment", which involved swarm-based robots called "e-pucks" developed at EPFL. The e-pucks reacted autonomously to human activity around them.

Tuesday, November 22. 2011

Electronic contact lens displays pixels on the eyes

-----
(Image: Institute of Physics)

Monday, November 21. 2011

Cotton Candy Device Puts Small Android-Powered Computer In A Thumb Drive

Via nexus404

-----

The rush to make computers smaller and smaller has been going on for some time now, but we may have a winner (at least for now) in the small-computer race. It's called the Cotton Candy, from FXI Tech, and though it looks like your standard USB thumb drive, it turns out it's packing an entire, very small computer in its tiny casing.
The specs, admittedly, aren't anything truly spectacular: a dual-core ARM Cortex-A9 on the processor end, backed up by an ARM Mali-400MP GPU, Wi-Fi and Bluetooth connectivity, an HDMI plug, a USB plug, and a microSD card slot, as well as its own Android operating system. But when you consider that it's all encased in a device the size of a basic key chain, well, suddenly the whole picture looks a lot more interesting. What this is designed to do is hook into much larger displays, thanks to that HDMI plug, and allow you to perform many of your basic computer functions. You've got Bluetooth for the peripherals, microSD for the storage, cloud access from the Android apps... it's a very simple, very basic, but extremely portable setup. And you can even hook it into another computer with the USB plug, which in turn will let you borrow the peripherals hooked into that computer (great if you needed to print something, I'd say) to do the various jobs you want done. And if you want an ultra-small computer to take with you most anywhere you go, Cotton Candy should be on hand in time for Christmas 2012, with pricing expected to land at the $200 mark, which isn't half bad. Though it does make me wonder why most wouldn't just buy a full-on laptop for not too much more, especially if they buy used. Still, an ultra-small PC for an ultra-small price tag is in the offing, so what do you guys think? Will the Cotton Candy catch on? Or will we be seeing these go for half that or less just to clear them out? No matter what you think, we love hearing from you, so head on down to the comments section and tell us what you think!

Wednesday, November 09.
2011

NVIDIA Tegra 3 quad-core mobile processor revealed and detailed

Via SlashGear, by Chris Burns

-----

The next generation of mobile processors has arrived in the form of the NVIDIA Tegra 3, formerly known as Project Kal-El, a quad-core chipset with aspirations to dominate the Android landscape in 2012 the way the Tegra 2 dual-core processor dominated the majority of 2011. Though NVIDIA had already revealed many of the details of how Tegra 3 functions and is able to bring you, the consumer, more power, less battery consumption, and more effective workload distribution, today marks both the official naming of the chip and the official release of easy-to-digest videos on how Tegra 3 will affect the average mobile device user.
NVIDIA’s Tegra 3 chipset was covered in full detail by your humble narrator here on SlashGear just a few weeks ago in two posts: one on how there are actually [five cores, not just four], and another all about [Variable Symmetric Multiprocessing], aka vSMP. Note that NVIDIA had not yet revealed the final market name “Tegra 3” when those posts were published, instead still using the codename “Project Kal-El” to identify the chipset. The most important thing you should take away from those posts is this: your battery life will be better, and the power needed by your processor cores will be distributed more intelligently. NVIDIA has provided a few videos that explain, in rather easy-to-digest detail, what we’re dealing with here in the Tegra 3. The first of these videos shows visually which cores use what amount of power as several different tasks are performed. Watch as a high-powered game uses all four cores while browsing a webpage might only use a single core. This is the power of Variable Symmetric Multiprocessing in action. NVIDIA Tegra 3: Fifth Companion Core Next there’s a demonstration of an upcoming game that would never have been able to exist on a mobile platform if it hadn’t been for NVIDIA’s new chip architecture and the power of a quad-core chipset (along with NVIDIA’s twelve GPU cores, of course). We had a look at this game earlier this year in the first Glowball post; now we go underwater: Glowball Video 2: Tegra 3 goes underwater Finally there’s a lovely set of videos showing you exactly what it means for game developers and gamers to be working with the Tegra 3 chipset. The first video shows off how next-generation games are being made specifically for this chipset, with developers working hand in hand with NVIDIA to optimize their games for the Tegra 3 so that gamers can get the most awesome experience in mobile history.
Devour this, if you will: NVIDIA Tegra 3: Developers bring Next-Generation Games to Mobile You can also see several examples of the games in the video and how they’ve been improved in the Tegra 3 world. Riptide GP as well as Shadowgun have been reviewed and given hands-on videos by your humble narrator in the past (can’t wait for the enhanced versions!). Next have a look at these games running side-by-side with their original versions. Make sure you’re sitting down, because you’re going to get pumped up. Side-by-side Gameplay: Competition vs Tegra 3 Down to the frames per second, this new chipset will change the world you live in as far as gaming goes. Of course it doesn’t stop there, but given that gaming is one of the best ways to test a processor on this platform (one made with gaming in mind, of course), you’ve got to appreciate the power. Have a peek at this tiny chart to see what we mean. Then head over to the post from ASUS on what the very first hardware running the Tegra 3 will look like: the ASUS Eee Pad Transformer Prime, a 10.1-inch tablet from the makers of the original Transformer, a device made to pummel the competition and usher in a whole new age in mobile computing. We look forward to the future, NVIDIA; bring on another year of complete and total annihilation of the competition!

Monday, November 07. 2011

Introducing the 5-watt server that runs on cell phone chips

Via giagom

-----

Can ARM wrestle its way into the server market? Calxeda and Hewlett-Packard think so. On Tuesday Calxeda launched its EnergyCore ARM server-on-a-chip (SoC), which it says consumes as little as 1.5 watts (and idles at half a watt). And HP, the world’s largest server maker, committed to building EnergyCore-based servers that will consume as little as 5 watts when running all out. Compare that to the lowest-power x86 server chips from Intel, which consume about 20 watts but deliver higher performance.
Calxeda, backed in part by ARM Holdings, is banking that the success that ARM chips found in smartphones and mobile devices will carry over into data centers serving large, scale-out workloads. In that arena, it is facing off squarely against chip giant Intel and its x86-based architecture, which dominates the market for chips running in commodity servers. Said Calxeda in a statement:
EnergyCore targets web serving, big data apps

The small form factor and energy stinginess of EnergyCore, based on the ARM Cortex processor, suit an emerging and fast-growing class of web and cloud applications, but the chip lags in software support and does not yet meet the enterprise demand for 64-bit processors. Thus, for traditional data centers locked into the Intel x86 architecture, with lots of legacy software to run, Calxeda might be a stretch. But that might not matter. “For big cloud companies that buy gargantuan numbers of servers and for whom the power and space issues get linearly nasty as they build up the number of nodes, this is a good solution,” said analyst Roger Kay, the founder and president of Endpoint Technologies Associates. These sorts of transactions take on an almost consultative nature, where the server vendor works with the customer’s developers, he said. EnergyCore is 32-bit only, a fact that Intel will no doubt trumpet. “High-performance computing [HPC] needs 64 bits to deal with larger address space, but that doesn’t mean that 32-bit [processors] can’t address certain data center applications,” Kay said. “This new chip is designed to handle very large databases that in turn handle lots of queries from many end points. Think Google Earth, where there are lots of simple queries — ‘show me the bits in the X-Y grid specified.’” HP estimates that those big scale-out web and cloud data center scenarios represent a healthy 10 to 15 percent of the data center market, Kay noted. That’s certainly worth fighting for. Intel pegs that segment at 10 percent of the overall market.
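The 32-bit limitation Kay alludes to is simple arithmetic: a 32-bit pointer can name at most 2^32 bytes, i.e. 4 GiB of memory per process, while a 64-bit pointer can name 2^64 bytes. A quick back-of-the-envelope check:

```python
# Address-space ceilings implied by pointer width.
addr_32 = 2**32   # bytes addressable with a 32-bit pointer
addr_64 = 2**64   # bytes addressable with a 64-bit pointer

print(f"32-bit address space: {addr_32 // 2**30} GiB")  # 4 GiB
print(f"64-bit address space: {addr_64 // 2**60} EiB")  # 16 EiB
```

That 4 GiB-per-process ceiling is comfortable for serving many small, independent queries, but tight for the in-memory datasets that HPC and large enterprise databases demand, which is exactly the split Kay describes.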
Calxeda, like SeaMicro, which makes low-power servers running Intel Atom processors, also builds on a fabric that lets all the various SoC components communicate inside the box. Skeptics point out that big data center buyers tend to be a conservative lot, not likely to gamble on a new chip architecture. “Many CIOs will go with the devil they know. They have software that runs on Intel, so why move?” Kay said. But again, Calxeda and HP are seeking out the biggest of the big data center companies — those with a lot of in-house development talent that are not as bound by legacy software concerns.

What’s in the EnergyCore SoC?
Calxeda and HP will start rolling out sample products later this year, with volume ramping up in 2012. Calxeda is not alone in this arena: Marvell is already in the market with its own ARM-based servers, and NVIDIA is building an ARM-based processor, dubbed Denver, for HPC. While Calxeda is tiny compared to Intel, it also doesn’t manufacture its own silicon, which could hurt it in comparison with Intel, one of the last chip vendors to own its own manufacturing. Forrester analyst Richard Fichera notes that Calxeda has no control over the distribution and sales of what it designs: the server partner has all the leverage, and if someone else has a better SoC next year, Calxeda (or whichever SoC provider they use) could be gone.