Tuesday, November 01. 2011
Multi-Device Web Design: An Evolution
Via LUKEW, by Luke Wroblewski
-----
As mobile devices have continued to evolve and spread, so has the process of designing and developing Web sites and services that work across a diverse range of devices. From responsive Web design to future-friendly thinking, here's how I've seen things evolve over the past year and a half. If you haven't been keeping up with all the detailed conversations about multi-device Web design, I hope this overview and set of resources can quickly bring you up to speed. I'm only covering the last 18 months because it has been a very exciting time with lots of new ideas and voices. Prior to these developments, most multi-device Web design problems were solved with device detection, and many still are. But the introduction of Responsive Web Design really stirred things up.

Responsive Web Design

Responsive Web Design is a combination of fluid grids and images with media queries to change layout based on the size of a device viewport. It uses feature detection (mostly on the client) to determine available screen capabilities and adapt accordingly. RWD is most useful for layout, but some have extended it to interactive elements as well (although this often requires JavaScript). Responsive Web Design allows you to use a single URL structure for a site, thereby removing the need for separate mobile, tablet, desktop, etc. sites. For a short overview, read Ethan Marcotte's original article. For the full story, read Ethan Marcotte's book. For a deeper dive into the philosophy behind RWD, read over Jeremy Keith's supporting arguments. To see a lot of responsive layout examples, browse around the mediaqueri.es site.

Challenges

Responsive Web Design isn't a silver bullet for mobile Web experiences. Not only does client-side adaptation require a careful approach, but it can also be difficult to optimize source order, media, third-party widgets, URL structure, and application design within a RWD solution. Jason Grigsby has written up many of the reasons RWD doesn't instantly provide a mobile solution, especially for images. I've documented (with concrete examples) why we opted for separate mobile and desktop templates at my last startup, a technique that's also employed by many Web companies like Facebook, Twitter, Google, etc. In short, separation tends to give greater ability to optimize specifically for mobile.

Mobile First Responsive Design

Mobile First Responsive Design takes Responsive Web Design and flips the process around to address some of the media query challenges outlined above. Instead of starting with a desktop site, you start with the mobile site and then progressively enhance to devices with larger screens. The Yiibu team was one of the first to apply this approach and wrote about how they did it. Jason Grigsby has put together an overview and analysis of where Mobile First Responsive Design is being applied. Brad Frost has a more high-level write-up of the approach. For a more in-depth technical discussion, check out the thread about mobile-first media queries on the HTML5 Boilerplate project.

Techniques

Many folks are working through the challenges of designing Web sites for multiple devices. This includes detailed overviews of how to set up Mobile First Responsive Design markup, style sheet, and JavaScript solutions. Ethan Marcotte has shared what it takes for teams of developers and designers to collaborate on a responsive workflow, based on lessons learned on the Boston Globe redesign. Scott Jehl outlined what JavaScript is doing (PDF) behind the scenes of the Globe redesign (hint: a lot!).
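As a rough illustration of the client-side part of these techniques, here is a minimal mobile-first sketch in TypeScript. It is my own illustration, not code from any of the articles linked above: the page ships a small-screen baseline, and wide-screen behavior is switched on only when window.matchMedia reports a larger viewport. The breakpoint and the class name are assumptions made for the example.

```typescript
// Minimal mobile-first enhancement sketch (illustrative, not from the article).
// Assumption: the baseline markup and CSS already work on small screens;
// wide-screen extras are enabled only when the viewport justifies them.

const WIDE_QUERY = "(min-width: 48em)"; // illustrative breakpoint

function applyWideScreenEnhancements(matches: boolean): void {
  // The heavy lifting (multi-column layout, larger imagery) stays in CSS
  // behind the same media query; the script only gates JS-driven behavior.
  document.documentElement.classList.toggle("wide-screen", matches);
}

const wideScreen = window.matchMedia(WIDE_QUERY);

// Enhance once on load, then follow viewport or orientation changes.
applyWideScreenEnhancements(wideScreen.matches);
wideScreen.addEventListener("change", (e) => applyWideScreenEnhancements(e.matches));
// Note: browsers of the 2011 era exposed this as wideScreen.addListener(...) instead.
```

The same media query would normally also appear in the style sheet, so layout changes stay in CSS and the script only gates behavior that genuinely needs JavaScript.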
Stephanie Rieger assembled a detailed overview (PDF) of a real-world mobile-first responsive design solution for hundreds of devices. Stephen Hay put together a pragmatic overview of designing with media queries. Media adaptation remains a big challenge for cross-device design. In particular, images, videos, data tables, fonts, and many other "widgets" need special care. Jason Grigsby has written up the situation with images and compiled many approaches for making images responsive. A number of solutions have also emerged for handling things like videos and data tables.

Server Side Components

Combining Mobile First Responsive Design with server-side component (not full page) optimization is a way to extend client-side-only solutions. With this technique, a single set of page templates defines an entire Web site for all devices, but key components within that site have device-class-specific implementations that are rendered server side. Done right, this technique can deliver the best of both worlds without the challenges that can hamper each (a minimal sketch of this structure appears after the resource list at the end of this post). I've put together an overview of how a Responsive Design + Server Side Components structure can work, with concrete examples. Bryan Rieger has outlined an extensive set of thoughts on server-side adaptation techniques, and Lyza Gardner has a complete overview of how all these techniques can work together. After analyzing many client-side solutions to dynamic images, Jason Grigsby outlined why using a server-side solution is probably the most future friendly.

Future Thinking

If all the considerations above seem like a lot to take in to create a Web site, they are. We are in a period of transition and still figuring things out. So expect to be learning and iterating a lot. That's both exciting and daunting. It also prepares you for what's ahead. We've just begun to see the onset of cheap networked devices of every shape and size. The zombie apocalypse of devices is coming. And while we can't know exactly what the future will bring, we can strive to design and develop in a future-friendly way so we are better prepared for what's next.

Resources

I referenced lots of great multi-device Web design resources above. Here they are in one list. Read them in order and rock the future Web!
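As promised in the Server Side Components section, here is a rough sketch of the "one set of templates, device-class-specific components" idea, assuming some form of server-side device detection. The detection regexes and component names are illustrative assumptions of mine, not code from any of the linked write-ups.

```typescript
// Sketch: a single page template shared by all devices, with one component
// that has device-class-specific implementations rendered server side.
// Assumption: classifyDevice() stands in for whatever device detection the
// server actually uses (e.g. a User-Agent lookup service).

type DeviceClass = "mobile" | "tablet" | "desktop";

function classifyDevice(userAgent: string): DeviceClass {
  if (/mobile/i.test(userAgent)) return "mobile";
  if (/tablet|ipad/i.test(userAgent)) return "tablet";
  return "desktop";
}

// One component, three server-rendered implementations.
const photoGallery: Record<DeviceClass, () => string> = {
  mobile: () => '<ul class="gallery gallery--stacked"><!-- few, small images --></ul>',
  tablet: () => '<ul class="gallery gallery--grid"><!-- medium images --></ul>',
  desktop: () => '<ul class="gallery gallery--grid"><!-- large images, captions --></ul>',
};

// The page template is shared across all devices; only components vary.
function renderPage(userAgent: string): string {
  const gallery = photoGallery[classifyDevice(userAgent)]();
  return `<main><h1>Photos</h1>${gallery}</main>`;
}

// Example: a phone User-Agent gets the stacked, lightweight gallery markup.
console.log(renderPage("Mozilla/5.0 (Linux; Android 2.3; Mobile)"));
```

Layout and typography would still adapt client-side with media queries; the server only swaps the components that benefit from different markup or payloads per device class.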
Tuesday, October 18. 2011
Throwable Panoramic Ball Camera captures any moment in 3D

Jonas Pfeil, a recent graduate of the Technical University of Berlin, created the Throwable Panoramic Ball Camera as a way to instantly create panoramic images without the often tedious digital stitching process common today. Equipped with an accelerometer, the camera creates full spherical panoramas when thrown, displaying three-dimensional images of a moment captured in time. Although the device is not yet for sale, it will be on display at the upcoming SIGGRAPH Asia conference. You can see the ball's technology at work in the video below.
Thursday, October 13. 2011
Nettop Round-Up: Four Tiny PCs, Benchmarked And Reviewed
Via Tom's Hardware
-----

We're testing four nettops: Arctic Cooling's MC001-BD, ASRock's CoreHT 252B, Giada's i50, and Zotac's Zbox AD03BR-PLUS. All of these tiny, quiet systems take a very different approach to compact computing, and we fill you in on what makes them unique. Intel's Atom CPU might have been the driving force behind the popularization of the nettop form factor, but manufacturers are squeezing more powerful hardware into these tiny machines. Sometimes, more potent graphics performance is added via a mobile chipset. Sometimes, processing muscle is emphasized instead with a CPU designed for a powerful notebook. And now, APUs belonging to AMD's Fusion initiative are an option, serving up efficient power use and higher-performance graphics on a single processor die. Truly, the nettop is no longer a stripped-down machine barely capable of Web browsing and word processing. These tiny PCs are tailor-made to excel in specific applications, which often includes use in a home theater. Click here to get the full survey.

Saturday, October 08. 2011
Computer Virus Hits U.S. Drone Fleet
Photo courtesy of Bryan William Jones

A computer virus has infected the cockpits of America’s Predator and Reaper drones, logging pilots’ every keystroke as they remotely fly missions over Afghanistan and other warzones. The virus, first detected nearly two weeks ago by the military’s Host-Based Security System, has not prevented pilots at Creech Air Force Base in Nevada from flying their missions overseas. Nor have there been any confirmed incidents of classified information being lost or sent to an outside source. But the virus has resisted multiple efforts to remove it from Creech’s computers, network security specialists say. And the infection underscores the ongoing security risks in what has become the U.S. military’s most important weapons system. “We keep wiping it off, and it keeps coming back,” says a source familiar with the network infection, one of three sources who told Danger Room about the virus. “We think it’s benign. But we just don’t know.” Military network security specialists aren’t sure whether the virus and its so-called “keylogger” payload were introduced intentionally or by accident; it may be a common piece of malware that just happened to make its way into these sensitive networks. The specialists don’t know exactly how far the virus has spread. But they’re sure that the infection has hit both classified and unclassified machines at Creech. That raises the possibility, at least, that secret data may have been captured by the keylogger, and then transmitted over the public internet to someone outside the military chain of command.
Drones have become America’s tool of choice in both its conventional and shadow wars, allowing U.S. forces to attack targets and spy on its foes without risking American lives. Since President Obama assumed office, a fleet of approximately 30 CIA-directed drones have hit targets in Pakistan more than 230 times; all told, these drones have killed more than 2,000 suspected militants and civilians, according to the Washington Post. More than 150 additional Predator and Reaper drones, under U.S. Air Force control, watch over the fighting in Afghanistan and Iraq. American military drones struck 92 times in Libya between mid-April and late August. And late last month, an American drone killed top terrorist Anwar al-Awlaki — part of an escalating unmanned air assault in the Horn of Africa and southern Arabian peninsula. But despite their widespread use, the drone systems are known to have security flaws. Many Reapers and Predators don’t encrypt the video they transmit to American troops on the ground. In the summer of 2009, U.S. forces discovered “days and days and hours and hours” of the drone footage on the laptops of Iraqi insurgents. A $26 piece of software allowed the militants to capture the video. The lion’s share of U.S. drone missions are flown by Air Force pilots stationed at Creech, a tiny outpost in the barren Nevada desert, 20 miles north of a state prison and adjacent to a one-story casino. In a nondescript building, down a largely unmarked hallway, is a series of rooms, each with a rack of servers and a “ground control station,” or GCS. There, a drone pilot and a sensor operator sit in their flight suits in front of a series of screens. In the pilot’s hand is the joystick, guiding the drone as it soars above Afghanistan, Iraq, or some other battlefield. Some of the GCSs are classified secret, and used for conventional warzone surveillance duty. The GCSs handling more exotic operations are top secret. None of the remote cockpits are supposed to be connected to the public internet. Which means they are supposed to be largely immune to viruses and other network security threats. But time and time again, the so-called “air gaps” between classified and public networks have been bridged, largely through the use of discs and removable drives. In late 2008, for example, the drives helped introduce the agent.btz worm to hundreds of thousands of Defense Department computers. The Pentagon is still disinfecting machines, three years later. Use of the drives is now severely restricted throughout the military. But the base at Creech was one of the exceptions, until the virus hit. Predator and Reaper crews use removable hard drives to load map updates and transport mission videos from one computer to another. The virus is believed to have spread through these removable drives. Drone units at other Air Force bases worldwide have now been ordered to stop their use. In the meantime, technicians at Creech are trying to get the virus off the GCS machines. It has not been easy. At first, they followed removal instructions posted on the website of the Kaspersky security firm. “But the virus kept coming back,” a source familiar with the infection says. Eventually, the technicians had to use a software tool called BCWipe to completely erase the GCS’ internal hard drives. “That meant rebuilding them from scratch” — a time-consuming effort. The Air Force declined to comment directly on the virus. 
“We generally do not discuss specific vulnerabilities, threats, or responses to our computer networks, since that helps people looking to exploit or attack our systems to refine their approach,” says Lt. Col. Tadd Sholtis, a spokesman for Air Combat Command, which oversees the drones and all other Air Force tactical aircraft. “We invest a lot in protecting and monitoring our systems to counter threats and ensure security, which includes a comprehensive response to viruses, worms, and other malware we discover.” However, insiders say that senior officers at Creech are being briefed daily on the virus. “It’s getting a lot of attention,” the source says. “But no one’s panicking. Yet.”
Friday, September 30. 2011
Researchers install world's highest webcam to provide a view of Everest
Via gizmag
-----

After starting out producing security cameras, German-based Mobotix is now taking video surveillance to new heights - literally. One of the company's type-M12 cameras has been situated at an altitude of 5,643 meters (18,514 ft) on the Kala Patthar mountain to stream high definition images of the summit of the nearby 8,848-meter (29,029 ft) high Mount Everest. The solar-powered webcam takes the title of the world's highest webcam from the now second highest webcam in the world, located at the 4,389-meter (14,400 ft) high base camp of Mount Aconcagua in Argentina. The Kala Patthar location, which was chosen for its excellent view of the western side of Everest, including the north and southwest faces of the mountain and the West Ridge, exposes the webcam to some pretty harsh conditions, with high winds and temperatures as low as -30°C (-22°F). Images captured by the webcam are transmitted wirelessly to the Ev-K2-CNR Pyramid Laboratory/Observatory, which is located at an altitude of 5,050 meters (16,568 ft). Here, the video is analyzed before being sent on to Italy for further evaluation. "We spent months developing the perfect setup for the installation and invested a lot of time testing and verifying the system. And it inspired us on to set a record: operating the highest webcam in the world," said Giampietro Kohl, leader of the Ev-K2-CNR technical committee, who coordinated the webcam's installation by Italian engineers and the Nepalese Ev-K2-CNR team as part of the "Everest Share 2011" research project. The research project is taking place as part of the international "Share" climate and environmental monitoring conference, with researchers hoping the images, in conjunction with meteorological data gathered at the world's highest weather station sitting at an altitude of 8,000 meters (26,247 ft) on Mount Everest, will provide an insight into climate change. People can take in the view provided by the Mobotix webcam from the comfort of their homes by pointing their browsers here. The webcam is only active during the daylight hours of 6:00 a.m. to 6:00 p.m. Nepalese time, with images updated every five minutes to allow the researchers to track the movement of the clouds around the mountain summit. If you'd like to turn the view around and see what it looks like from the summit of Everest, there's also a nice 360-degree panorama here.

Thursday, September 22. 2011
Dr. Watson - Come Here - I Need You
Via big think, by Dominic Basulto
-----

The next time you go to the doctor, you may be dealing with a supercomputer rather than a human. Watson, the groundbreaking artificial intelligence machine from IBM that took on chess champions and Jeopardy! contestants alike, is about to get its first real-world application in the healthcare sector. In partnership with health benefits company WellPoint, Watson will soon be diagnosing medical cases – and not just the everyday cases, either. The vision is for Watson to be working hand-in-surgical-glove with oncologists to diagnose and treat cancer in patients.
While having super-knowledgeable medical experts on call is exciting, it also raises several thorny issues. At what point – if ever – would you ask for a “second opinion” on your medical condition from a human doctor? Will “Watson” ever be included among the physicians in your HMO listings? And, perhaps most importantly, can supercomputers ever provide the type of bedside manner that we are accustomed to in our human doctors?
Given that the cost of healthcare is simply too high, as a society we will need to accept some compromises. Once the healthcare industry is fully digitized, supercomputers like Watson could result in a more cost-effective way to sift through the ever-growing amount of medical information and provide real-time medical analysis that could save lives. If Watson also results in a significant improvement in patient treatment, it’s clear that the world of medicine will never be the same again. Right now, IBM envisions Watson supplementing – not actually replacing – doctors. But the time is coming when nurses across the nation will be saying, “Watson - Come Here - I Need You,” instead of turning to doctors whenever they need a sophisticated medical evaluation of a patient.
Posted by Christian Babski in Hardware, Innovation&Society, Technology at 17:21
Defined tags for this entry: artificial intelligence, hardware, innovation&society, super computer, technology
Monday, September 19. 2011
UNIVAC: the troubled life of America's first computer
Via ars technica, by Matthew Lasar
-----

It was November 4, 1952, and Americans huddled in their living rooms to follow the results of the Presidential race between General Dwight David Eisenhower and Adlai Stevenson, Governor of Illinois. We like to think that our time is a unique moment of technological change. But the consumers observing this election represented an unprecedented generation of early adopters, who watched rather than listened to the race on the radio. By that year they had bought and installed in their homes about 21 million copies of a device called the television—about seven times the number that existed just three years earlier. On that night they witnessed the birth of an even newer technology—a machine that could predict the election's results. Sitting next to the desk of CBS Anchor Walter Cronkite was a mockup of a huge gadget called a UNIVAC (UNIVersal Automatic Computer), which Cronkite explained would augur the contest. J. Presper Eckert, the UNIVAC's inventor, stood next to the device and explained its workings. The woman who actually programmed the mainframe, Navy mathematician Grace Murray Hopper, was nowhere to be seen; for days her team had input voting statistics from earlier elections, then wrote the code that would allow the calculator to extrapolate the contest based on previous races.

J. Presper Eckert and CBS's Walter Cronkite pondering the UNIVAC on election night, 1952.

To the disquietude of national pollsters expecting a Stevenson victory, Hopper's UNIVAC group predicted a huge landslide for Eisenhower, and with only five percent of the results in. CBS executives didn't know what to make of this bold finding. "We saw [UNIVAC] as an added feature to our coverage that could be very interesting in the future," Cronkite later recalled. "But I don't think that we felt the computer would become predominant in our coverage in any way." And so CBS told its audience that UNIVAC only foresaw a close race. At the end of the evening, when it was clear that UNIVAC's actual findings were spot on, a spokesperson for the company that made the machine was allowed to disclose the truth—that the real prediction had been squelched. "The uncanny accuracy of UNIVAC's prediction during a major televised event sent shock waves through the nation," notes historian Kurt W. Beyer, author of Grace Hopper and the Invention of the Information Age. "In the months that followed, 'UNIVAC' gradually became the generic term for a computer." That's putting it mildly. By the late 1950s the UNIVAC and its cousin, the ENIAC, had inspired a generic sobriquet for anyone with computational prowess—a "BRAINIAC." The term became so embedded in American culture that to this day your typical computer literacy quiz includes the following multiple choice poser: Which was not an early mainframe computer? But the fact that this question is even posed is testimony to the other key component of UNIVAC's history—its famous trajectory was cut short by a corporation with a much larger shadow: IBM. The turbulent life of UNIVAC offers interesting lessons for developers and entrepreneurs in our time.

The machines and their teams

During the Second World War, two teams in the United States were deployed to improve the calculations necessary for artillery firing and strategic bombing. Hopper worked with Harvard mathematician Howard Aiken, whose Mark I computer performed computations for the Navy.
John Mauchly and J. Presper Eckert's Electronic and Numerical Integrator and Computer (ENIAC) rolled out rocket firing tables for the Army. While both groups served extraordinarily during the war, their leaders could not have thought about these devices more differently. Aiken viewed them as scientific tools. Mauchly saw their potential as commercial instruments. After the conflict, Aiken obstinately lobbied against the commercialization of computing, inveighing against the "foolishness with Eckert and Mauchly" at computer conferences. Perhaps there was a need for five or six machines in the country, he told associates; no more. But Aiken's assistant Hopper was fascinated by the duo—the former a graduate student and the latter a professor of electronics. Eckert was "looking way ahead," Hopper recalled. "Even though he was a college professor he was visualizing the use of these computers in the business and industrial area." The University of Pennsylvania sided with Aiken. The college offered Eckert and Mauchly tenured positions, but only on the condition that they sign patent releases for all their work. Both inventors resigned from the campus and in the spring of 1946 formed the Electronic Control Company, which eventually became the Eckert-Mauchly Computer Corporation. Over the course of five years, the two developers rethought everything associated with computational machines. The result was a device that went way beyond the age of punch card calculators associated with IBM devices. The UNIVAC, unveiled in 1951, was the fruit of this effort. "No one who saw a UNIVAC failed to see how much it differed from existing calculators and punched card equipment," writes historian Paul E. Ceruzzi:
These characteristics would enable the UNIVAC to perform thousands more operations per second than its closest rival, the Harvard Mark II. And its adaptation of the entertainment industry's new tool—magnetic recording tape—would allow it to store vastly more data. UNIVAC was quickly picked up by the US Census Bureau in a $300,000 contract, which was followed by another deal via the National Bureau of Standards. Soon a racetrack betting odds calculator company called American Totalisator signed on, purchasing a 40 percent interest in the company.

You could see and hold it

But Eckert-Mauchly could not handle this volume of work on its own. Its principals drastically underbid on key contracts. After a plane crash killed the corporation's board president, the inventors and Totalisator clashed over the viability of the project. The duo then went to IBM for backing and met with Thomas Watson Junior and Senior, but could not convince the elder executive of the UNIVAC's viability. "Having built his career on punch cards," Watson Jr. later reflected, "Dad distrusted magnetic tape instinctively. On a punch card, you had a piece of information that was permanent. You could see it and hold it in your hand.... But with magnetic tape, your data were stored invisibly on a medium that was designed to be erased and reused." So EMCC turned to its second choice—the Remington Rand office equipment company, whose founder James Rand expressed outrage when he saw a reworked IBM typewriter rather than a Remington hooked up to the UNIVAC. "Take that label off that machine!" Rand declared on his first visit to an EMCC laboratory. "I don't want it seen in here!" The tenderness over an IBM logo aside, Remington Rand brought an important innovation to the UNIVAC—television advertisements. The longer infomercials came complete with symphony orchestra introductions and historical progress timelines that began with the Egyptian Sphinx. The shorter ones extolled the role that UNIVAC was playing in weather prediction. "Today UNIVAC is saving time and increasing efficiency for science, industry, business, and government," one ad concluded.

A Remington Rand UNIVAC commercial.
But while that was certainly true about what the machine did for its clients, historian Beyer notes that it didn't extend to Remington's management of EMCC. Most of the office company's top staff, like its founder, didn't understand the device, and related more to punch card machines. The man put in direct charge of EMCC, former Manhattan Project director Leslie Groves, tossed Mauchly to the sales department when he flunked a company security clearance test (apparently he had attended some Communist Party meetings in the 1930s). On top of that, new management did not sympathize with EMCC's female programmers, among them Grace Hopper, who by 1952 had written the UNIVAC's first software compiler. "There were not the same opportunities for women in larger corporations like Remington Rand," she later reflected. "They were older companies, and the jobs had been stereotyped." Then there was Groves' marketing strategy for the UNIVAC, which amounted to selling fewer of the devices, even as they were being hawked on TV as exemplars of technological progress. He ordered a fifty percent annual production quota drop. "With such low sales expectations, there was little incentive to educate Remington Rand's sizeable sales force about the new technology," Beyer explains. The biggest blow, however, came when IBM began to rethink its aversion to magnetic mainframe storage.

Left in the dust

Despite Remington/EMCC's internal chaos, interest in the UNIVAC exploded after the 1952 CBS demonstration. This created more problems. Hopper's programming staff was now besieged with attractive offers from companies using IBM gear, creating a brainiac drain within EMCC itself. "Some members of Dr. Grace Hopper's staff have already left for positions with users of IBM equipment," Mauchly noted in a memo, "and those of her staff who still remain are now expecting attractive offers from outside sources." Customer service and support became more and more of a challenge. Still, the UNIVAC was highly competitive with IBM equipment. The question was whether EMCC could beat Big Blue in government contract bidding, specifically for the Semi-Automatic Ground Environment (SAGE) defense communications network. The SAGE project amounted to an early-warning radar system designed to pick up enemy bomber activity around the nation's borders. It was the brainchild of Jay Forrester, director of MIT's Servomechanisms Laboratory, and central to the idea was a network of digital computers to integrate the network, dubbed "Project Whirlwind." In three years Forrester's team had pioneered real-time, high-capacity random-access memory for the mainframes. The government now offered a contract to build 50 Whirlwind computers. IBM quickly rallied its forces for the contest. "I thought it was absolutely essential to IBM's future that we win it," Thomas Watson Jr., who had none of Senior's allergies to digital computing, later explained. "The company that built those computers was going to be way ahead of the game, because it would learn the secrets of mass production." Forrester gave the matter some thought. Remington Rand had UNIVAC. And it had the prestige of Manhattan Project Director Leslie Groves. But Remington did not have IBM's scale of operation or its production capacity. Indeed, under Groves' direction, it had scaled that capacity down. In 1953, the government offered the contract to IBM. Historian Beyer explains the consequences of this decision:
IBM quickly integrated these discoveries into its next rollout of commercial computers. The market loved them and ordered thousands. "In a little over a year we started delivering those redesigned computers," Watson Jr. later boasted. "They made the UNIVAC obsolete and we soon left Remington Rand in the dust."

The UNIVAC universe of the 1950s—clockwise from top left: UNIVAC mathematician and programmer Grace Hopper; John W. Mauchly and John Presper Eckert; General Leslie Groves of Remington Rand; DC Comics' Brainiac confronting Superman in 1959; a Department of Commerce UNIVAC in the center.

Aftermath

Sensing the dust around it, in 1955 Remington merged with the Sperry Corporation and became Sperry Rand. No less than General Douglas MacArthur ran the new entity. This gave the UNIVAC a new lease on digital life, but one that operated in the shadow of the company that had once sworn it would stick to punch cards: IBM. In the meantime, a slew of firms jumped into the high-speed computing business, among them RCA, National Cash Register, General Electric, and Honeywell. "IBM and the Seven Dwarfs," they were dubbed. UNIVAC was now a dwarf. Grace Hopper continued her work. She became an advocate of the assumption inherent in her UNIVAC compiler, which she called "automatic" computing—the notion that programs should emphasize simple English words. Her compiler, later called FLOW-MATIC, understood 20 words. Her contemporaries patiently informed her that this number was enough. Hopper "was told very quickly that [she] couldn't do this because computers didn't understand English," she later noted. Happily, she did not believe this to be true, and advised a team that developed the COBOL programming language, which she championed and furthered through the 1960s and 1970s. US Navy Rear Admiral Grace Murray Hopper died in 1992. Having fattened on government grants for decades, IBM became the target of a Department of Justice antitrust suit in 1969. The initiative was suddenly withdrawn by the Reagan administration in 1982—as the company once again jolted the industry by jumping into the PC market. As for UNIVAC, its complex birth 60 years ago remains the moment when we discovered that computers were going to be part of our lives—that they were going to become integral to our work and collective imagination. It was also a moment when information systems developers and entrepreneurs learned that innovation and genius are not always a match for influence and organizational scale. "Howard Aiken was wrong," historian Paul Ceruzzi wrote in 2000. "There turned out to be a market for millions of electronic digital computers by the 1990s." Their emergence awaited advances in solid state physics. Nonetheless, "the nearly ubiquitous computers of the 1990s are direct descendants of what Eckert and Mauchly hoped to commercialize in the late 1940s."

Further reading

Most of the material in this essay comes from Kurt W. Beyer's must-read book, Grace Hopper and the Invention of the Information Age (MIT Press). Also essential is Paul E. Ceruzzi's History of Modern Computing.
Monday, September 12. 2011
Implantable sensor can monitor tumors constantly to sense growth
-----
Sensing oxygen: This implantable sensor measures the concentration of dissolved oxygen in tissue, an indicator of tumor growth. (Credit: Technical University of Munich)
Researchers hope to combine the sensor with a device to deliver targeted chemotherapy. A team of medical engineers in Germany has developed an implant to continuously monitor tumor growth in cancer patients. The device, designed to be implanted in the patient near the tumor site, uses chip sensors to measure oxygen levels in the blood, an indicator of growth. The data is then transmitted wirelessly to an external receiver carried by the patient and transferred to his or her doctor for remote monitoring and analysis. "We developed the device to monitor and treat slow-growing tumors that are difficult to operate on, such as brain tumors and liver tumors, and for tumors in elderly patients for whom surgery might be dangerous," said Helmut Grothe, head of the Heinz-Nixdorf Institute for Medical Electronics at the Technical University of Munich. The roughly two-centimeter-long device, dubbed the IntelliTuM (Intelligent Implant for Tumor Monitoring), includes a self-calibrating sensor, data measurement and evaluation electronics, and a transmitter. All the components are contained within a biocompatible plastic housing. The device sensor detects the level of dissolved oxygen in the fluid near the tumor; a drop in that measure suggests the metabolic behavior of the tumor is changing, often in a more aggressive way. So far, researchers have tested the device in tissue grown in culture. The next step is to test it in live animals. Most monitoring of tumor growth is currently done via CT scans, MRI, and other forms of external imaging. "The advantage of an implant over external imaging is that you can monitor the tumor on the go," says Sven Becker of the Technical University of Munich. "This means patients would have to pay fewer visits to the hospital for progression and postsurgery monitoring of tumors. They also wouldn't have to swallow contrast agents." While the device is currently calibrated to monitor oxygen, its chips can also be used to monitor other signs of tumor change or growth. "Oxygen levels are one of the primary indicators of tumor growth, but we have also found a way to activate the pH sensors by recalibrating the device from outside the body," says Grothe.

Tuesday, September 06. 2011
'Modular' 3D Printed Shoes by Objet on Display at London's Victoria and Albert Museum
Via Objet
-----
Marloes ten Bhömer is a critically acclaimed Dutch designer. She produces some incredible otherworldly shoe designs based on a unique combination of art and technological functionality. One of her most exciting new designs is called the 'Rapidprototypedshoe' – created on the Objet Connex multi-material 3D printer. Why did she use rapid prototyping? According to Marloes, this is because "rapid prototyping – adding material in layers – rather than traditional shoe manufacturing methods – could help me create something entirely new within just a few hours." And why Objet? Again, in her words: "Objet Connex printers make it possible to print an entire shoe – albeit a concept shoe – including a hard heel and a flexible upper in one build, which just isn't possible with other 3D printing technologies." The Objet Connex multi-material 3D printer allows the simultaneous printing of both rigid and rubber-like material grades and shades within a single prototype, which is why it's used by many of the world's largest shoe manufacturers. And of course, because it's 3D printing and not traditional manufacturing methods, there are no expensive set-up costs and no minimum quantities to worry about! This particular shoe design is based on a modular concept – with an interchangeable heel to allow for specific customizations as well as easy repairs (see the bottom photo, which shows the heel detached).
If you can't make it right at this moment, don't worry – the shoe and the exhibit will remain there until January 2nd. The Power of Making exhibition is created in collaboration with the Crafts Council. Curator Daniel Charny's aim is to encourage visitors to consider the process of making, not just the final results. For this the 3D printing process is particularly salient. For more details on this story, read the Press Release here.
-----
See also the first 'printed' plane.

Thursday, September 01. 2011
Raspberry Pi $25 Computer Running Quake III
Via TechCrunch
-----

You may recall the Raspberry Pi, a barebones PC for emerging markets that they hope to sell for $25. When we wrote it up earlier this year, there wasn't much in the way of demonstration: a few stills of the PCB and a video with founder David Braben describing his plan for the device. But today we have a demo that both captures the geek imagination and proves the device has legs: they've got it running Quake III. Not that it's some big accomplishment to run a game released in the last millennium, but it actually does pretty well. The device uses a 700MHz ARM processor and has 128MB of RAM enabled here, and lacking any on-device storage, it's running the OS (Debian CLI) and the game off an SD card. They could hit higher framerates, but wanted to show that 1920×1080 with 4xAA was possible. Naturally you could reduce this quite a bit and max out the refresh rate on your monitor; Q3A isn't exactly the most graphics-intensive game on the market. The game isn't being emulated; they actually compiled the open source version for their Debian build. They plan on networking a few together and playing a deathmatch soon. Now, the point of this isn't that now, impoverished children in Kazakhstan will be able to hone their all-important FPSing skills. It's more of a proof of concept showing that a (fairly) modern piece of software can be adapted to the hardware they've put together: the Raspberry Pi really is a full-on computer. And while there are Micro ATX boards and systems out there (very useful ones in fact), they don't come anywhere near the $25 mark. You still need an LCD, keyboard, SD card or USB drive, and so on, but the Raspberry Pi Foundation is all about lowering the entry barrier and providing everything that's needed in a basic computer for as low a price as possible. Keep up with the project here. They've still got a lot of work to do before they make this a viable product, but things seem to be moving along rapidly.