Monday, September 19. 2011
UNIVAC: the troubled life of America's first computer
Via ars technica By Matthew Lasar
-----

It was November 4, 1952, and Americans huddled in their living rooms to follow the results of the Presidential race between General Dwight David Eisenhower and Adlai Stevenson, Governor of Illinois. We like to think that our time is a unique moment of technological change. But the consumers observing this election represented an unprecedented generation of early adopters, who watched rather than listened to the race on the radio. By that year they had bought and installed in their homes about 21 million copies of a device called the television—about seven times the number that existed just three years earlier.

On that night they witnessed the birth of an even newer technology—a machine that could predict the election's results. Sitting next to the desk of CBS Anchor Walter Cronkite was a mockup of a huge gadget called a UNIVAC (UNIVersal Automatic Computer), which Cronkite explained would augur the contest. J. Presper Eckert, the UNIVAC's inventor, stood next to the device and explained its workings. The woman who actually programmed the mainframe, Navy mathematician Grace Murray Hopper, was nowhere to be seen; for days her team had input voting statistics from earlier elections, then wrote the code that would allow the calculator to extrapolate the contest based on previous races.

J. Presper Eckert and CBS's Walter Cronkite pondering the UNIVAC on election night, 1952.

To the disquietude of national pollsters expecting a Stevenson victory, Hopper's UNIVAC group predicted a huge landslide for Eisenhower, and with only five percent of the results counted. CBS executives didn't know what to make of this bold finding. "We saw [UNIVAC] as an added feature to our coverage that could be very interesting in the future," Cronkite later recalled. "But I don't think that we felt the computer would become predominant in our coverage in any way." And so CBS told its audience that UNIVAC only foresaw a close race. At the end of the evening, when it was clear that UNIVAC's actual findings were spot on, a spokesperson for the company that made the machine was allowed to disclose the truth—that the real prediction had been squelched.

"The uncanny accuracy of UNIVAC's prediction during a major televised event sent shock waves across the nation," notes historian Kurt W. Beyer, author of Grace Hopper and the Invention of the Information Age. "In the months that followed, 'UNIVAC' gradually became the generic term for a computer." That's putting it mildly. By the late 1950s the UNIVAC and its cousin, the ENIAC, had inspired a generic sobriquet for anyone with computational prowess—a "BRAINIAC." The term became so embedded in American culture that to this day your typical computer literacy quiz includes the following multiple-choice poser: Which was not an early mainframe computer?

But the fact that this question is even posed is testimony to the other key component of UNIVAC's history—its famous trajectory was cut short by a corporation with a much larger shadow: IBM. The turbulent life of UNIVAC offers interesting lessons for developers and entrepreneurs in our time.

The machines and their teams

During the Second World War, two teams in the United States were deployed to improve the calculations necessary for artillery firing and strategic bombing. Hopper worked with Harvard mathematician Howard Aiken, whose Mark I computer performed computations for the Navy. John Mauchly and J.
Presper Eckert's Electronic Numerical Integrator and Computer (ENIAC) rolled out rocket firing tables for the Army. While both groups served extraordinarily during the war, their leaders could not have thought about these devices more differently. Aiken viewed them as scientific tools. Mauchly saw their potential as commercial instruments. After the conflict, Aiken obstinately lobbied against the commercialization of computing, inveighing against the "foolishness with Eckert and Mauchly" at computer conferences. Perhaps there was a need for five or six machines in the country, he told associates; no more.

But Aiken's assistant Hopper was fascinated by the duo—the former a graduate student and the latter a professor of electronics. Mauchly was "looking way ahead," Hopper recalled. "Even though he was a college professor he was visualizing the use of these computers in the business and industrial area."

The University of Pennsylvania sided with Aiken. The college offered Eckert and Mauchly tenured positions, but only on the condition that they sign patent releases for all their work. Both inventors resigned from the campus and in the spring of 1946 formed the Electronic Control Company, which eventually became the Eckert-Mauchly Computer Corporation. Over the course of five years, the two developers rethought everything associated with computational machines. The result was a device that went way beyond the age of punch card calculators associated with IBM devices. The UNIVAC, unveiled in 1951, was the fruit of this effort. "No one who saw a UNIVAC failed to see how much it differed from existing calculators and punched card equipment," writes historian Paul E. Ceruzzi:
These characteristics would enable the UNIVAC to perform thousands more operations per second than its closest rival, the Harvard Mark II. And its adaptation of the entertainment industry's new tool—magnetic recording tape—would allow it to store vastly more data. UNIVAC was quickly picked up by the US Census Bureau in a $300,000 contract, which was followed by another deal via the National Bureau of Standards. Soon a racetrack betting odds calculator company called American Totalisator signed on, purchasing a 40 percent interest in the company.

You could see and hold it

But Eckert-Mauchly could not handle this volume of work on its own. Its principals drastically underbid on key contracts. After a plane crash killed the corporation's board president, the inventors and Totalisator clashed over the viability of the project. The duo then went to IBM for backing and met with Thomas Watson Junior and Senior, but could not convince the elder executive of the UNIVAC's viability. "Having built his career on punch cards," Watson Jr. later reflected, "Dad distrusted magnetic tape instinctively. On a punch card, you had a piece of information that was permanent. You could see it and hold it in your hand.... But with magnetic tape, your data were stored invisibly on a medium that was designed to be erased and reused."

So EMCC turned to its second choice—the Remington Rand office equipment company, whose founder James Rand expressed outrage when he saw a reworked IBM typewriter rather than a Remington hooked up to the UNIVAC. "Take that label off that machine!" Rand declared on his first visit to an EMCC laboratory. "I don't want it seen in here!"

The tenderness over an IBM logo aside, Remington Rand brought an important innovation to the UNIVAC—television advertisements. The longer infomercials came complete with symphony orchestra introductions and historical progress timelines that began with the Egyptian Sphinx. The shorter ones extolled the role that UNIVAC was playing in weather prediction. "Today UNIVAC is saving time and increasing efficiency for science, industry, business, and government," one ad concluded.

A Remington Rand UNIVAC commercial.
But while that was certainly true about what the machine did for its clients, historian Beyer notes that it didn't extend to Remington's management of EMCC. Most of the office company's top staff, like its founder, didn't understand the device, and related more to punch card machines. The man put in direct charge of EMCC, former Manhattan Project director Leslie Groves, tossed Mauchly to the sales department when he flunked a company security clearance test (apparently he had attended some Communist Party meetings in the 1930s). On top of that, new management did not sympathize with EMCC's female programmers, among them Grace Hopper, who by 1952 had written the UNIVAC's first software compiler. "There were not the same opportunities for women in larger corporations like Remington Rand," she later reflected. "They were older companies, and the jobs had been stereotyped."

Then there was Groves' marketing strategy for the UNIVAC, which amounted to selling fewer of the devices, even as they were being hawked on TV as exemplars of technological progress. He ordered a fifty percent annual production quota drop. "With such low sales expectations, there was little incentive to educate Remington Rand's sizeable sales force about the new technology," Beyer explains. The biggest blow, however, came when IBM began to rethink its aversion to magnetic mainframe storage.

Left in the dust

Despite Remington/EMCC's internal chaos, interest in the UNIVAC exploded after the 1952 CBS demonstration. This created more problems. Hopper's programming staff was now besieged with attractive offers from companies using IBM gear, creating a brainiac drain within EMCC itself. "Some members of Dr. Grace Hopper's staff have already left for positions with users of IBM equipment," Mauchly noted in a memo, "and those of her staff who still remain are now expecting attractive offers from outside sources." Customer service and support became more and more of a challenge.

Still, the UNIVAC was highly competitive with IBM equipment. The question was whether EMCC could beat Big Blue in government contract bidding, specifically for the Semi-Automatic Ground Environment (SAGE) defense communications network. The SAGE project amounted to an early-warning radar system designed to pick up enemy bomber activity around the nation's borders. It was the brainchild of Jay Forrester, director of MIT's Servomechanisms Laboratory, and central to the idea was a network of digital computers, dubbed "Project Whirlwind," to integrate the system. In three years Forrester's team had pioneered real-time, high-capacity random-access memory for the mainframes. The government now offered a contract to build 50 Whirlwind computers.

IBM quickly rallied its forces for the contest. "I thought it was absolutely essential to IBM's future that we win it," Thomas Watson Jr., who had none of Senior's allergies to digital computing, later explained. "The company that built those computers was going to be way ahead of the game, because it would learn the secrets of mass production." Forrester gave the matter some thought. Remington Rand had UNIVAC. And it had the prestige of Manhattan Project Director Leslie Groves. But Remington did not have IBM's scale of operation or its production capacity. Indeed, under Groves' direction, it had scaled that capacity down. In 1953, the government offered the contract to IBM. Historian Beyer explains the consequences of this decision:
IBM quickly integrated these discoveries into its next rollout of commercial computers. The market loved them and ordered thousands. "In a little over a year we started delivering those redesigned computers," Watson Jr. later boasted. "They made the UNIVAC obsolete and we soon left Remington Rand in the dust."

The UNIVAC universe of the 1950s—clockwise from top left: UNIVAC mathematician and programmer Grace Hopper; John W. Mauchly and John Presper Eckert; General Leslie Groves of Remington Rand; DC Comics' Brainiac confronting Superman in 1959; a Department of Commerce UNIVAC in the center.

Aftermath

Sensing the dust around it, in 1955 Remington merged with the Sperry Corporation and became Sperry Rand. No less than General Douglas MacArthur ran the new entity. This gave the UNIVAC a new lease on digital life, but one that operated in the shadow of the company that had once sworn that it would stick to punch cards: IBM. In the meantime, a slew of firms jumped into the high-speed computing business, among them RCA, National Cash Register, General Electric, and Honeywell. "IBM and the Seven Dwarfs," they were dubbed. UNIVAC was now a dwarf.

Grace Hopper continued her work. She became an advocate of the assumption inherent in her UNIVAC compiler, which she called "automatic" computing—the notion that programs should emphasize simple English words. Her compiler, later called FLOW-MATIC, understood 20 words. Her contemporaries patiently informed her that this could not work—Hopper "was told very quickly that [she] couldn't do this because computers didn't understand English," she later noted. Happily, she did not believe this to be true, and advised a team that developed the COBOL programming language, which she championed and furthered through the 1960s and 1970s. US Navy Rear Admiral Grace Murray Hopper died in 1992.

After fattening IBM on government contracts for decades, the federal government reversed course: the Department of Justice launched an antitrust suit against the corporation in 1969. This initiative was suddenly withdrawn by the Reagan administration in 1982—as the company once again jolted the industry by jumping into the PC market.

As for UNIVAC, its complex birth 60 years ago remains the moment when we discovered that computers were going to be part of our lives—that they were going to become integral to our work and collective imagination. It was also a moment when information systems developers and entrepreneurs learned that innovation and genius are not always a match for influence and organizational scale. "Howard Aiken was wrong," historian Paul Ceruzzi wrote in 2000. "There turned out to be a market for millions of electronic digital computers by the 1990s." Their emergence awaited advances in solid state physics. Nonetheless, "the nearly ubiquitous computers of the 1990s are direct descendants of what Eckert and Mauchly hoped to commercialize in the late 1940s."

Further reading

Most of the material in this essay comes from Kurt W. Beyer's must-read book, Grace Hopper and the Invention of the Information Age (MIT Press). Also essential is Paul E. Ceruzzi's A History of Modern Computing.
Friday, September 16. 2011
Data Analytics: Crunching the Future
The technicians at SecureAlert’s monitoring center in Salt Lake City sit in front of computer screens filled with multicolored dots. Each dot represents someone on parole or probation wearing one of the company’s location-reporting ankle cuffs. As the people move around a city, their dots move around the map. “It looks a bit like an animated gumball machine,” says Steven Florek, SecureAlert’s vice-president of offender insights and knowledge management. As long as the gumballs don’t go where they’re not supposed to, all is well. The company works with law enforcement agencies around the U.S. to keep track of about 15,000 ex-cons, meaning it must collect and analyze billions of GPS signals transmitted by the cuffs each day.

The more traditional part of the work consists of making sure that people under house arrest stay in their houses. But advances in the way information is collected and sorted mean SecureAlert isn’t just watching; the company says it can actually predict when a crime is about to go down. If that sounds like the “pre-cogs”—crime prognosticators—in the movie Minority Report, Florek thinks so, too. He calls SecureAlert’s newest capability “pre-crime” detection. Using data from the ankle cuffs and other sources, SecureAlert identifies patterns of suspicious behavior. A person convicted of domestic violence, for example, might get out of jail and set up a law-abiding routine. Quite often, though, SecureAlert’s technology sees such people backslide and start visiting the restaurants or schools or other places their victims frequent. “We know they’re looking to force an encounter,” Florek says. If the person gets too close for comfort, he says, “an alarm goes off and a flashing siren appears on the screen.”

The system doesn’t go quite as far as Minority Report, where the cops break down doors and blow away the perpetrators before they perpetrate. Rather, the system can call an offender through a two-way cellphone attached to the ankle cuff to ask what the person is doing, or set off a 95-decibel shriek as a warning to others. More typically, the company will notify probation officers or police about the suspicious activity and have them investigate. Presumably with weapons holstered. “It’s like a strategy game,” Florek says. (Before Bloomberg Businessweek went to press, Florek left the company for undisclosed reasons.)

It didn’t use to be that a company the size of SecureAlert, with about $16 million in annual revenue, could engage in such a real-world chess match. For decades, only Fortune 500-scale corporations and three-letter government agencies had the money and resources to pull off this kind of data crunching. Wal-Mart Stores (WMT) is famous for using data analysis to adjust its inventory levels and prices. FedEx (FDX) earned similar respect for tweaking its delivery routes, while airlines and telecommunications companies used this technology to pinpoint and take care of their best customers. But even at the most sophisticated corporations, data analytics was often a cumbersome, ad hoc affair. Companies would pile information in “data warehouses,” and if executives had a question about some demographic trend, they had to supplicate “data priests” to tease the answers out of their costly, fragile systems. “This resulted in a situation where the analytics were always done looking in the rearview mirror,” says Paul Maritz, chief executive officer of VMware (VMW).
“You were reasoning over things to find out what happened six months ago.” In the early 2000s a wave of startups made it possible to gather huge volumes of data and analyze it in record speed—à la SecureAlert. A retailer such as Macy’s (M) that once pored over last season’s sales information could shift to looking instantly at how an e-mail coupon for women’s shoes played out in different regions. “We have a banking client that used to need four days to make a decision on whether or not to trade a mortgage-backed security,” says Charles W. Berger, CEO of ParAccel, a data analytics startup founded in 2005 that powers SecureAlert’s pre-crime operation. “They do that in seven minutes now.”
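To make the “pre-crime” idea concrete, here is a minimal sketch of the kind of geofence check such a monitoring system must run against every incoming GPS fix. It is an illustration only, not SecureAlert’s actual code: the zone coordinates, radii, and function names are all invented.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 6371.0 * 2 * asin(sqrt(a))

# Hypothetical restricted zones, e.g. places a victim frequents:
# (latitude, longitude, radius in km)
RESTRICTED_ZONES = [
    (40.7580, -73.9855, 0.5),
    (40.7484, -73.9857, 0.3),
]

def check_fix(lat, lon):
    """Return the restricted zones an incoming GPS fix violates, if any."""
    return [zone for zone in RESTRICTED_ZONES
            if haversine_km(lat, lon, zone[0], zone[1]) <= zone[2]]

if __name__ == "__main__":
    violations = check_fix(40.7575, -73.9850)
    if violations:
        print(f"ALERT: offender inside {len(violations)} restricted zone(s)")
```

The hard part, as the article suggests, is not the distance arithmetic but deciding which zones and thresholds separate “crying wolf” from a serious situation.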
Now a second wave of startups is finding ways to use cheap but powerful servers to analyze new categories of data such as blog posts, videos, photos, tweets, DNA sequences, and medical images. “The old days were about asking, ‘What is the biggest, smallest, and average?’” says Michael Olson, CEO of startup Cloudera. “Today it’s, ‘What do you like? Who do you know?’ It’s answering these complex questions.”
The big bang in data analytics occurred in 2006 with the release of an open-source system called Hadoop. The technology was created by a software consultant named Doug Cutting, who had been examining a series of technical papers released by Google (GOOG). The papers described how the company spread tremendous amounts of information across its data centers and probed that pool of data for answers to queries. Where traditional data warehouses crammed as much information as possible on a few expensive computers, Google chopped up databases into bite-size chunks and sprinkled them among tens of thousands of cheap computers. The result was a lower-cost and higher-capacity system that lots of people can use at the same time. Google uses the technology throughout its operations. Its systems study billions of search results, match them to the first letters of a query, take a guess at what people are looking for, and display suggestions as they type. You can see the bite-size nature of the technology in action on Google Maps as tiny tiles come together to form a full map.

Cutting created Hadoop to mimic Google’s technology so the rest of the world could have a way to sift through massive data sets quickly and cheaply. (Hadoop was the name of his son’s toy elephant.) The software first took off at Web companies such as Yahoo! (YHOO) and Facebook and then spread far and wide, with Walt Disney (DIS), the New York Times, Samsung, and hundreds of others starting their own projects. Cloudera, where Cutting, 48, now works, makes its own version of Hadoop and has sales partnerships with Hewlett-Packard (HPQ) and Dell (DELL). Dozens of startups are trying to develop easier-to-use versions of Hadoop. For example, Datameer, in San Mateo, Calif., has built an Excel-like dashboard that allows regular business people, instead of data priests, to pose questions. “For 20 years you had limited amounts of computing and storage power and could only ask certain things,” says Datameer CEO Stefan Groschupf. “Now you just dump everything in there and ask whatever you want.” Top venture capital firms Kleiner Perkins Caufield & Byers and Redpoint Ventures have backed Datameer, while Accel Partners, Greylock Partners, and In-Q-Tel, the investment arm of the CIA, have helped finance Cloudera.

Past technology worked with data that fell neatly into rows and columns—purchase dates, prices, the location of a store. Amazon.com (AMZN), for instance, would use traditional systems to track how many people bought a certain type of camera and for what price. Hadoop can handle data that don’t fit into spreadsheets. That ability, combined with Hadoop’s speedy divide-and-conquer approach to data, lets users get answers to questions they couldn’t even ask before. Retailers can dig into not just what people bought but why they bought it. Amazon can (and does) analyze its website logs to see what other items people look at before they buy that camera, how long they look at them, whether certain colors on a Web page generate more sales—and synthesize all that into real-time intelligence. Are they telling their friends about that camera? Is some new model poised to be the next big hit? “These insights don’t come super easily, but the information is there, and we do have the machine power now to process it and search for it,” says James Markarian, chief technology officer at data specialist Informatica (INFA).
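The divide-and-conquer idea is easy to demonstrate in miniature. The toy Python sketch below splits a data set into chunks, counts words in each chunk in parallel (the “map” phase), and merges the partial tallies (the “reduce” phase). Real Hadoop does the same thing across thousands of machines and fault-tolerant distributed storage; this sketch only uses local processes.

```python
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def map_chunk(lines):
    """Map phase: count word occurrences in one chunk of the data set."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def word_count(lines, n_chunks=4):
    """Chop the data into chunks, map over them in parallel, reduce the results."""
    chunk_size = max(1, len(lines) // n_chunks)
    chunks = [lines[i:i + chunk_size] for i in range(0, len(lines), chunk_size)]
    total = Counter()
    with ProcessPoolExecutor() as pool:
        for partial in pool.map(map_chunk, chunks):  # map phase, in parallel
            total.update(partial)                    # reduce phase
    return total

if __name__ == "__main__":
    data = ["to be or not to be"] * 1000
    print(word_count(data).most_common(3))
```

Because each chunk is processed independently, adding more workers (or machines) scales the job almost linearly—the property that let Google, and then Hadoop users, trade a few expensive computers for many cheap ones.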
Take the case of U.S. Xpress Enterprises, one of the largest private trucking companies. Through a device installed in the cabs of its 10,000-truck fleet, U.S. Xpress can track a driver’s location, how many times the driver has braked hard in the last few hours, if he sent a text message to the customer saying he would be late, and how long he rested. U.S. Xpress pays particular attention to the fuel economy of each driver, separating out the “guzzlers from the misers,” says Timothy Leonard, U.S. Xpress CTO. Truckers keep the engines running and the air conditioning on after they’ve pulled over for the night. “If you have a 10-hour break, we want your AC going for the first two hours at 70 degrees so you can go to sleep,” says Leonard. “After that, we want it back up to 78 or 79 degrees.” By adjusting the temperature, U.S. Xpress has lowered annual fuel consumption by 62 gallons per truck, which works out to a total of about $24 million per year.

Less numerically, the company’s systems also analyze drivers’ tweets and blog posts. “We have a sentiment dashboard that monitors how they are feeling,” Leonard says. “If we see they hate something, we can respond with some new software or policies in a few hours.” The monitoring may come off as Big Brotherish, but U.S. Xpress sees it as key to keeping its drivers from quitting. (Driver turnover is a chronic issue in the trucking business.)

How are IBM (IBM) and the other big players in the data warehousing business responding to all this? In the usual way: They’re buying startups. Last year, IBM bought Netezza for $1.7 billion. HP, EMC (EMC), and Teradata (TDC) have also acquired data analytics companies in the past 24 months.
It’s not going too far to say that data analytics has even gotten hip. The San Francisco offices of startup Splunk have all the of-the-moment accoutrements you’d find at Twitter or Zynga. The engineers work in what amounts to a giant living room with pinball machines, foosball tables, and Hello Kitty-themed cubes. Weekday parties often break out—during a recent visit, it was a Mexican fiesta. Employees were wearing sombreros and fake moustaches while a dude near the tequila bar played the bongos.

Splunk got its start as a type of nuts-and-bolts tool in data centers, giving administrators a way to search through data tied to the low-level operations of computers and software. The company indexes “machine events”—the second-by-second records produced by computing devices to keep track of their actions. This could include records of every time a server stores information, or it could be the length of a cell phone call and what type of handset was used. Splunk helps companies search through this morass, looking for events that caused problems or stood out as unusual. “We can see someone visit a shopping website from a certain computer, see that they got an error message while on the ladies’ lingerie page, see how many times they tried to log in, where they went after, and what machine in some far-off data center caused the problem,” says Erik Swan, CTO and co-founder of Splunk. While it started as troubleshooting software for data centers, the company has morphed into an analysis tool that can be aimed at fine-tuning fraud detection systems at credit-card companies and measuring the success of online ad campaigns.

A few blocks away from Splunk’s office are the more sedate headquarters of IRhythm Technologies, a medical device startup. IRhythm makes a type of oversize, plastic band-aid called the Zio Patch that helps doctors detect cardiac problems before they become fatal. Patients affix the Zio Patch to their chests for two weeks to measure their heart activity. The patients then mail the devices back to IRhythm’s offices, where a technician feeds the information into Amazon’s cloud computing service. Patients typically wear rivals’ much chunkier devices for just a couple of days and remove them when they sleep or shower—which happen to be when heart abnormalities often manifest. The upside of the waterproof Zio Patch is the length of time that people wear it—but 14 days is a whole lot of data.
IRhythm’s Hadoop system chops the 14-day periods into chunks and analyzes them with algorithms. Unusual activity gets passed along to technicians who flag worrisome patterns to doctors. For quality control of the device itself, IRhythm uses Splunk. The system monitors the strength of the Zio Patch’s recording signals, whether hot weather affects its adhesiveness to the skin, or how long a patient actually wore the device. On the Zio Patch manufacturing floor, IRhythm discovered that operations at some workstations were taking longer than expected. It used Splunk to go back to the day when the problems cropped up and discovered a computer glitch that was hanging up the operation. Mark Day, IRhythm’s vice-president of research and development, says he’s able to fine-tune his tiny startup’s operations the way a world-class manufacturer like Honda Motor (HMC) or Dell could a couple of years ago. Even if he could have afforded the old-line data warehouses, they were too inflexible to provide much help. “The problem with those systems was that you don’t know ahead of time what problems you will face,” Day says. “Now, we just adapt as things come up.”

At SecureAlert, Florek says that despite the much-improved tools, extracting useful meaning from data still requires effort—and in his line of work, sensitivity. If some ankle-cuff-wearing parolee wanders out of bounds, there’s a human in the process to make a judgment call. “We are constantly tuning our system to achieve a balance between crying wolf and catching serious situations,” he says. “Sometimes a guy just goes to a location because he got a new girlfriend.”
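Under the hood, the machine-event searching that Splunk popularized (described above) boils down to parsing timestamped log lines into fields and filtering on them. Here is a toy sketch of that idea; the log format, host names, and field names are invented, and Splunk’s real indexing is of course far more elaborate.

```python
import re
from datetime import datetime

# Invented log format: "<timestamp> <host> <level> <message>"
LOG_PATTERN = re.compile(
    r"(?P<ts>\S+ \S+) (?P<host>\S+) (?P<level>\w+) (?P<msg>.*)")

def parse(line):
    """Turn one raw log line into a dict of named fields, or None."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None
    event = m.groupdict()
    event["ts"] = datetime.strptime(event["ts"], "%Y-%m-%d %H:%M:%S")
    return event

def search(lines, host=None, level=None):
    """Yield parsed events matching the given host and/or level filters."""
    for line in lines:
        event = parse(line)
        if event is None:
            continue
        if host and event["host"] != host:
            continue
        if level and event["level"] != level:
            continue
        yield event

logs = [
    "2011-09-16 10:00:01 web-7 INFO served /lingerie in 120ms",
    "2011-09-16 10:00:02 web-7 ERROR 500 on /lingerie (db timeout)",
]
for event in search(logs, host="web-7", level="ERROR"):
    print(event["ts"], event["msg"])
```

The value of a tool like Splunk is doing this at scale: indexing billions of such events so that an ad hoc filter like the one above returns in seconds rather than requiring a scan of raw files.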
Posted by Christian Babski in Innovation&Society, Software, Technology at 18:05
Defined tags for this entry: artificial intelligence, innovation&society, privacy, software, technology
Monday, September 12. 2011
Implantable sensor can monitor tumors constantly to sense growth
-----
Sensing oxygen: This implantable sensor measures the concentration of dissolved oxygen in tissue, an indicator of tumor growth. Credit: Technical University of Munich
Researchers hope to combine the sensor with a device to deliver targeted chemotherapy. A team of medical engineers in Germany has developed an implant to continuously monitor tumor growth in cancer patients. The device, designed to be implanted in the patient near the tumor site, uses chip sensors to measure oxygen levels in the blood, an indicator of growth. The data is then transmitted wirelessly to an external receiver carried by the patient and transferred to his or her doctor for remote monitoring and analysis.

"We developed the device to monitor and treat slow-growing tumors that are difficult to operate on, such as brain tumors and liver tumors, and for tumors in elderly patients for whom surgery might be dangerous," said Helmut Grothe, head of the Heinz-Nixdorf Institute for Medical Electronics at the Technical University of Munich.

The roughly two-centimeter-long device, dubbed the IntelliTuM (Intelligent Implant for Tumor Monitoring), includes a self-calibrating sensor, data measurement and evaluation electronics, and a transmitter. All the components are contained within a biocompatible plastic housing. The device's sensor detects the level of dissolved oxygen in the fluid near the tumor; a drop in that measure suggests the metabolic behavior of the tumor is changing, often in a more aggressive way. So far, researchers have tested the device in tissue grown in culture. The next step is to test it in live animals.

Most monitoring of tumor growth is currently done via CT scans, MRI, and other forms of external imaging. "The advantage of an implant over external imaging is that you can monitor the tumor on the go," says Sven Becker of the Technical University of Munich. "This means patients would have to pay fewer visits to the hospital for progression and postsurgery monitoring of tumors. They also wouldn't have to swallow contrast agents."

While the device is currently calibrated to monitor oxygen, its chips can also be used to monitor other signs of tumor change or growth. "Oxygen levels are one of the primary indicators of tumor growth, but we have also found a way to activate the pH sensors by recalibrating the device from outside the body," says Grothe.

Tuesday, September 06. 2011
It's Looking Up If You're Looking Down
Via big think By Dominic Basulto
-----

Far too many people are walking around with their heads immersed in their tiny mobile devices, or communicating affectionately with their tiny smart phones while out in public with perfectly acceptable human companions. The only problem, of course, is that humans are not evolutionarily equipped to act like this – and that inevitably leads to awkward scenes like people running into things on a city street or couples awkwardly texting with other people while having dinner "together." Tiny screens, while useful for monitoring the electronic minutiae of our daily lives, are not so useful for keeping our heads up and making eye contact with other humans. Fortunately, a number of tech companies are thinking of ways to make Looking Up the new Looking Down. Mobile device makers, encouraged by the rapid adoption of tablet technologies and people's embrace of post-PC screens, are busy developing new ways of interacting with these smaller screens that are not "inappropriately immersive."
Finland’s mobile phone giant Nokia, bowed and bruised after failing to keep up with Apple in the development of sleek new mobile devices and other objects of consumer lust, is exploring a new strategy to take on Apple: developing cleverly-designed phones that enable you to make eye contact and become aware of the environment around you. As Nokia's head designer Marko Ahtisaari explained to the Wall Street Journal, "When you look around at a restaurant in Helsinki, you'll see couples having their heads down instead of having eye contact and being aware of the environment they're in... Designing for true mobility... is an example of what people would not explicitly ask for but love when they get it." Nokia is still being mysterious about what it has in store for future mobile users, but most likely, a "Look Up mobile device" (for lack of a better word) would be designed to combine the viewing potential of big screens with the easy-to-operate interface of a smaller device.

This is actually harder than it sounds. According to usability expert Jakob Nielsen, there are five different screen experiences – what he refers to as TV, mobile, desktop, "very small" (i.e. screens no larger than an RFID chip) and "very big" (i.e. screens as large as buildings). It's not enough, though, simply to translate a "very large" screen experience to a "very small" screen -- the usability considerations change, according to the different screen experiences. That's why it's always been so frustrating to browse the Web on a mobile phone - there are very different usability characteristics once you shrink a screen.
The transmedia experience - formerly the exclusive domain of entertainment brands and Hollywood - is starting to blend over into every aspect of our lives. Transmedia – which refers to seamless storytelling across different online and offline platforms – has been re-interpreted by mobile designers to include surfaces and screens. When done right, this cross-surface storytelling leads to entirely new types of interactions and experiences. BERG London, in collaboration with Dentsu London, for example, has been experimenting with "incidental media" that transform everyday objects into interactive surfaces. One thing is certain -- the future is sure to turn a few heads - or at least, tilt them upward for a while.

Friday, August 26. 2011
StarCraft: Coming to a sports bar near you
Via The Wall Street Journal
-----

SAN FRANCISCO—One Sunday afternoon last month, a hundred boisterous patrons crowded into Mad Dog in the Fog, a British sports bar here, to watch a live broadcast. Half the flat-screen TVs were tuned to a blood-filled match between two Korean competitors, "MC" and "Puma." The crowd erupted in chants of "M-C! M-C!" when the favorite started a comeback. The pub is known for showing European soccer and other sports, but Puma and MC aren't athletes. They are 20-year-old professional videogame players who were leading computerized armies of humans and aliens in a science-fiction war game called "Starcraft II" from a Los Angeles convention center. The Koreans were fighting over a tournament prize of $50,000.

This summer, "Starcraft II" has become the newest barroom spectator sport. Fans organize so-called Barcraft events, taking over pubs and bistros from Honolulu to Florida and switching big-screen TV sets to Internet broadcasts of professional game matches happening often thousands of miles away.

Fans of 'Starcraft II' watch a live game broadcast in Washington, D.C.

As they root for their on-screen superstars, "Starcraft" enthusiasts can sow confusion among regular patrons. Longtime Mad Dog customers were taken aback by the young men fist-pumping while digital swarms of an insect-like race called "Zerg" battled the humanoid "Protoss" on the bar's TVs. "I thought I'd come here for a quiet beer after a crazy day at work," said Michael McMahan, a 59-year-old carpenter who is a 17-year veteran of the bar, over the sound of noisy fans as he sipped on a draught pint.

But for sports-bar owners, "Starcraft" viewers represent a key new source of revenue from a demographic—self-described geeks—they hadn't attracted before. "It was unbelievable," said Jim Biddle, a manager of Bistro 153 in Beaverton, Ore., which hosted its first Barcraft in July. The 50 gamers in attendance "doubled what I'd normally take in on a normal Sunday night."

For "Starcraft" fans, watching in bars fulfills their desire to share the love of a game that many watched at home alone before. During a Barcraft at San Francisco's Mad Dog in July, Justin Ng, a bespectacled 29-year-old software engineer, often rose to his feet during pivotal clashes of a match. "This feels like the World Cup," he said. "You experience the energy and screams of everyone around you when a player makes an amazing play."

Millions of Internet users already tune in each month on their PCs to watch live "eSports" events featuring big-name stars like MC, who is Jang Min Chul in real life, or replays of recent matches. In the U.S., fervor for "Starcraft II" is spilling into public view for the first time, as many players now prefer to watch the pros. In mid-July, during the first North American Star League tournament in Los Angeles, 85,000 online viewers watched Puma defeat MC in the live championship match on Twitch.tv, said Emmett Shear, who runs the recently-launched site.

The "Starcraft" franchise is more popular in Korea, where two cable TV stations, MBC Game and Ongamenet, provide dedicated coverage. The cable channels and Web networks broadcast other war games such as "Halo," "Counter-Strike," and "Call of Duty." But "Starcraft II" is often the biggest draw. The pros, mostly in their teens and 20s, get prize money and endorsements. Professional leagues in the U.S. and Korea have sprouted since "Starcraft II" launched last year. Pro-match broadcasts often include breathless play-by-play announcers who cover each move like a wrestling match. (A typical commentary: "It's a drone genocide! Flaming drone carcasses all over the place!").

Barcraft goers credit a Seattle bar, Chao Bistro, for launching the Barcraft fad this year. Glen Bowers, a 35-year-old Chao patron and "Starcraft" fan, suggested to owner Hyung Chung that he show professional "Starcraft" matches. Seeing that customers were ignoring Mariners baseball broadcasts on the bar's TVs, Mr. Chung, a videogame fan, OK'ed the experiment. In mid-May Mr. Bowers configured Chao's five TVs to show Internet feeds and posted an online notice to "Starcraft" devotees. About 150 people showed up two days later. Since then, Mr. Bowers has organized twice-a-week viewings; attendance has averaged between 40 and 50 people, including employees of Amazon.com Inc. and Microsoft Corp., he said. The trend ended up spreading to more than a dozen Barcrafts across the country, including joints in Raleigh, N.C., and Boston.
The "Starcraft II" game lends itself to sports bars because it "was built from the ground up as a spectator sport," said Bob Colayco, a publicist for the game's publisher, Activision Blizzard Inc. Websites like Twitch.tv helped "Starcraft's" spectator-sport appeal by letting players "stream" live games. Two University of Washington graduate students recently published a research paper seeking to scientifically pinpoint "Starcraft's" appeal as a spectator sport. The paper posits that "information asymmetry," in which one party has more information than the other, is the "fundamental source of entertainment."
Posted by Christian Babski in Innovation&Society at 11:01
Defined tags for this entry: game, innovation&society
Wednesday, August 24. 2011
The First Industrial Evolution
Via Big Think by Dominic Basulto
-----

If the first industrial revolution was all about mass manufacturing and machine power replacing manual labor, the First Industrial Evolution will be about the ability to evolve your personal designs online and then print them using popular 3D printing technology. Once these 3D printing technologies enter the mainstream, they could lead to a fundamental change in the way that individuals - even those without any design or engineering skills - are able to create beautiful, state-of-the-art objects on demand in their own homes. It is, quite simply, the democratization of personal manufacturing on a massive scale.

At the Cornell Creative Machines Lab, it's possible to glimpse what's next for the future of personal manufacturing. Researchers led by Hod Lipson created the website Endless Forms (a clever allusion to Charles Darwin’s famous last line in The Origin of Species) to "evolve" everyday objects and then bring them to life using 3D printing technologies. Even without any technical or design expertise, it's possible to create and print forms ranging from lamps to mushrooms to butterflies. You literally "evolve" printable, 3D objects through a process that echoes the principles of evolutionary biology. In fact, to create this technology, the Cornell team studied how living items like oak trees and elephants evolve over time.

3D printing capabilities, once limited to the laboratory, are now hitting the mainstream. Consider the fact that MakerBot Industries just landed $10 million from VC investors. In the future, each of us may have a personal 3D printer in the home, ready to print out personal designs on demand.
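The evolutionary loop behind a site like Endless Forms can be sketched in a few lines: a design is a vector of shape parameters, each generation breeds mutated variants, and the user's favorite becomes the next parent. The Python sketch below is illustrative only—the parameter vector and the stand-in "preference" function are invented, not Endless Forms' actual method.

```python
import random

def mutate(design, rate=0.2, scale=0.1):
    """Randomly perturb some parameters of a parent design."""
    return [p + random.gauss(0, scale) if random.random() < rate else p
            for p in design]

def evolve(parent, generations=10, brood=8, prefer=sum):
    """Each generation: breed mutated variants, keep the preferred one.
    `prefer` is a stand-in for a human clicking on a favorite; here we
    just score a design by the sum of its parameters."""
    for _ in range(generations):
        variants = [mutate(parent) for _ in range(brood)]
        parent = max(variants, key=prefer)
    return parent

# e.g. six control parameters describing a lamp silhouette (hypothetical)
seed = [0.5] * 6
print(evolve(seed))
```

The interesting design choice is that no fitness formula is needed at all: when a person picks the next parent, human taste *is* the selection pressure, which is exactly what lets non-designers "evolve" objects they could never have drawn.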
Wait a second, what's going on here? Objects using humans to evolve themselves? 3D Printers? Someone's been drinking the Kool-Aid, right?
What if that gorgeous iPad 2 you’re holding in your hand was actually “evolved” and not “designed”? What if it is the object that controls the design, and not the designer that controls the object? Hod Lipson, an expert on self-aware robots and a pioneer of the 3D printing movement, has claimed that we are on the brink of the second industrial revolution. However, if objects really do "evolve," is it more accurate to say that we are on the brink of The First Industrial Evolution?

The final frontier, of course, is not the ability of humans to print out beautifully-evolved objects on demand using 3D printers in their homes. (Although that’s quite cool). The final frontier is the ability of self-aware objects to independently “evolve” humans and then print them out as they need them. Sound far-fetched? Well, it’s now possible to print 3D human organs and 3D human skin. When machine intelligence progresses to a certain point, what’s to stop independent, self-aware machines from printing human organs? The implications – for both atheists and true believers – are perhaps too overwhelming even to consider.

Tuesday, August 23. 2011
Your Two Things
Via KK
-----

Ten years from now, how many gadgets will people carry? Apple would like you to carry three things today: the iPad, iPhone and MacBook. Once they would have been happy if you carried one. What do they have in mind for the next decade? Ten? I claim that what technology wants is to specialize, so I predict that for any device we have today, we'll have yet more specialized devices in the future. That means there will be hundreds of new devices in the coming years. Are we going to carry them all? Will we have a daypack full of devices? Will every pocket have its own critter?

I think the answer for the average person is two. We'll carry two devices in the next decade. Over the long term, say 100 years, we may carry no devices. The two devices we'll carry (on average) will be 1) a close-to-body handheld thingie, and 2) a larger tablet thingie at arm's length. The handheld will be our wallet, purse, camera, phone, navigator, watch, swiss army knife combo. The tablet will be a bigger screen and multi-sensor input. It may unfold, or unroll, or expand, or be just a plain plank. Different folks will have different sizes.

But there are caveats. First, we'll wear a lot of devices -- which is not the same as carrying them. We'll have devices built into belts, wristbands, necklaces, clothes, or more immediately into glasses or worn on our ears, etc. We wear a watch; we don't carry it. We wear necklaces, or Fitbits, rather than carry them. The main difference is that being attached, a device is harder to lose (or lose track of), and always intimate. This will be particularly true of quantified self-tracking devices. If we ask the question, how many devices will you wear in ten years, the answer may be ten.

Secondly, the two devices you carry may not always be the same devices. You may switch them out depending on the location, mode (vacation or work), or task at hand. Some days you may need a bigger screen than others. More importantly, the devices may depend on your vocation. Some jobs want a small text-based device (programmers), others may want a large screen (filmmakers), others a very blinding bright display (contractor), or others a flexible collapsible device (salesperson). The law of technology is that a specialized tool will always be superior to a general purpose tool. No matter how great the built-in camera in your phone gets, the best single purpose camera will be better.
No matter how great the navigator in your handheld combo gets, the best dedicated navigator will be a lot better. Professionals, or ardent enthusiasts, will continue to use the best tools, which will mean the specialized tools. Just to be clear, the combo is a specialized tool itself, just as a swiss army knife is a specialized knife -- it specializes in the combo. It does everything okay. So another way to restate the equation: the two devices each person will carry are one general purpose combination device, and one specialized device (per your major interests and style).

Of course, some folks will carry more than two, like the New York City police officer in the image above (taken in Times Square a few weeks ago). That may be because of their job, or vocation. But they won't carry them all the time. Even when they are "off" they will carry at least one device, and maybe two.

But I predict that in the longer term we will tend to not carry any devices at all. That's because we will have so many devices around us, both handheld and built-ins, and each will be capable of recognizing us and displaying to us our own personal interface, that they in effect become ours for the duration of our use. Not too long ago no one carried their own phone. You just used the nearest phone at hand. You borrowed it and did not need to carry your own personal phone around. Carrying your own would have seemed absurd in 1960. But of course not every room had a phone, not every store had one, not every street had one. So we wanted our own cell phones. But what if almost any device made could be borrowed and used as a communication device? You pick up a camera, or tablet, or remote and talk into it. Then you might not need to carry your own phone again. What if every screen could be hijacked for your immediate purposes? Why carry a screen of your own?

This will not happen in 10 years. But I believe in the goodness of time the highly evolved person will not carry anything. At the same time the attraction of a totem object, or something to hold in your hands, particularly a gorgeous object, will not diminish. We may remain with one single object that we love, that does most of what we need okay, and that in some ways comes to represent us. Perhaps the highly evolved person carries one distinctive object -- which will be buried with them when they die. At the very least, I don't think we'll normally carry more than a couple of things at once, on an ordinary day. The number of devices will proliferate, but each will occupy a smaller and smaller niche. There will be a long tail distribution of devices. Fifty years from now a very common ritual upon the meeting of old friends will be the mutual exchange and cross-examination of what lovely personal thing they have in their pocket or purse. You'll be able to tell a lot about a person by what they carry.

-----

Thursday, August 11. 2011
Does Facial Recognition Technology Mean the End of Privacy?
Via big think by Dominic Basulto
-----

At the Black Hat security conference in Las Vegas, researchers from Carnegie Mellon demonstrated how the same facial recognition technology used to tag Facebook photos could be used to identify random people on the street. This facial recognition technology, when combined with geo-location, could fundamentally change our notions of personal privacy. In Europe, facial recognition technology has already stirred up its share of controversy, with German regulators threatening to sue Facebook up to half-a-million dollars for violating European privacy rules.
But it's not only Facebook - both Google (with PittPatt) and Apple (with Polar Rose) are also putting the finishing touches on new facial recognition technologies that could make it easier than ever before to connect our online and offline identities. If the eyes are the window to the soul, then your face is the window to your personal identity. And it's for that reason that privacy advocates in both Europe and the USA are up in arms about the new facial recognition technology. What seems harmless at first - the ability to identify your friends in photos - could be something much more dangerous in the hands of anyone other than your friends, for one simple reason: your face is the key to linking your online and offline identities. It's one thing for law enforcement officials to have access to this technology, but what if your neighbor suddenly has the ability to snoop on you?

The researchers at Carnegie Mellon showed how a combination of simple technologies - a smart phone, a webcam and a Facebook account - was enough to identify people after only a three-second visual search. Once hackers can put together a face and the basics of a personal profile - like a birthday and hometown - they can start piecing together details like your Social Security number and bank account information.
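For a sense of how low the technical bar has become, the first step of any such pipeline—finding faces in an image—takes only a few lines with the open-source OpenCV library. This sketch stops at detection; actually matching a face to an identity, as in the Carnegie Mellon demo, would additionally require a trained recognition model and a photo database. The input filename is hypothetical, and the sketch assumes a modern opencv-python install that ships the Haar cascade files.

```python
import cv2  # OpenCV; assumes the opencv-python package is installed

# Load OpenCV's bundled frontal-face Haar cascade classifier.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("street_photo.jpg")  # hypothetical input file
if image is None:
    raise SystemExit("could not read street_photo.jpg")

# Detection runs on grayscale; tune scaleFactor/minNeighbors for recall.
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

print(f"found {len(faces)} face(s)")
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces_marked.jpg", image)
```

Everything past this step—cropping each rectangle and querying it against a database of tagged photos—is exactly what the commodity smartphone-plus-Facebook setup in the demo automated.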
Forget being fingerprinted; it could be far worse to be faceprinted. It's like the scene from The Terminator, where Arnold Schwarzenegger is able to identify his targets by employing a futuristic form of facial recognition technology. Well, the future is here. Imagine a complete stranger taking a photo of you and immediately connecting that photo to every element of your personal identity, and using that to stalk you (or your wife or your daughter). It happened to reality TV star Adam Savage - when he uploaded a photo to his Twitter page of his SUV parked outside his home, he didn't realize that it included geo-tagging meta-data. Within hours, people knew the exact location of his home. Or imagine walking into a store, and the sales floor staff doing a quick visual search using a smart phone camera, finding out what your likes and interests are via Facebook or Google, and then tailoring their sales pitch accordingly. It's targeted advertising, taken to the extreme.
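Extracting that geo-tagging meta-data is trivial. Here is a sketch using the Pillow imaging library's EXIF helpers; the filename is hypothetical, and `_getexif` is Pillow's legacy convenience method for JPEG EXIF data (an assumption worth checking against your Pillow version).

```python
from PIL import Image, ExifTags  # assumes the Pillow package is installed

def gps_info(path):
    """Return the raw GPSInfo EXIF block from an image, if present."""
    exif = Image.open(path)._getexif() or {}
    # Map numeric EXIF tag IDs to human-readable names.
    labeled = {ExifTags.TAGS.get(tag, tag): value for tag, value in exif.items()}
    return labeled.get("GPSInfo")  # degrees/minutes/seconds tuples, or None

info = gps_info("suv_photo.jpg")  # hypothetical uploaded photo
if info:
    print("This photo is geo-tagged:", info)
else:
    print("No GPS data found.")
```

Many cameras and phones write these tags by default unless the user disables location services, which is exactly how a casual upload can reveal a home address within hours.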
Which raises the important question: Is Privacy a Right or a Privilege? Now that we're all celebrities in the Internet age, it doesn't take much to extrapolate that soon we'll all have the equivalent of Internet paparazzi incessantly snapping photos of us and intruding into our daily lives. Cookies, spiders, bots and spyware will seem positively Old School by then. The people with money and privilege and clout will be the people who will be able to erect barriers around their personal lives, living behind the digital equivalent of a gated community. The rest of us? We'll live our lives in public.

Geeks Without Frontiers Pursues Wi-Fi for Everyone
Via OStatic
-----

Recently, you may have heard about new efforts to bring online access to regions where it has been economically nonviable before. This idea is not new, of course. The One Laptop Per Child (OLPC) initiative was squarely aimed at the goal until it ran into some significant hiccups. One of the latest moves on this front comes from Geeks Without Frontiers, which has a stated goal of positively impacting one billion lives with technology over the next 10 years. The organization, sponsored by Google and The Tides Foundation, is working on low cost, open source Wi-Fi solutions for "areas where legacy broadband models are currently considered to be uneconomical."

According to an announcement from Geeks Without Frontiers: "GEEKS expects that this technology, built mainly by Cozybit, managed by GEEKS and I-Net Solutions, and sponsored by Google, Global Connect, Nortel, One Laptop Per Child, and the Manna Energy Foundation, will enable the development and rollout of large-scale mesh Wi-Fi networks for at least half of the traditional network cost. This is a major step in achieving the vision of affordable broadband for all."

It's notable that One Laptop Per Child is among the sponsors of this initiative. The organization has open sourced key parts of its software platform, and could have natural synergies with a global Wi-Fi effort. "By driving down the cost of metropolitan and village scale Wi-Fi networks, millions more people will be able to reap the economic and social benefits of significantly lower cost Internet access," said Michael Potter, one of the founders of the GEEKS initiative.

The Wi-Fi technology that GEEKS is pursuing is mesh networking technology. Specifically, open80211s (o11s), which implements the AMPE (Authenticated Mesh Peering Exchange), enabling multiple authenticated nodes to encrypt traffic between themselves. Mesh networks are essentially widely distributed wireless networks based on many repeaters throughout a specific location. You can read much more about the open80211s standard here.

The GEEKS initiative has significant backers, and with sponsorship from OLPC, will probably benefit from good advice on the topic of bringing advanced technologies to disadvantaged regions of the world. The effort will be worth watching.

Tuesday, August 09. 2011
Help CERN in the hunt for the Higgs Boson
Via bit-tech
-----

The Citizen Cyberscience Centre, based at CERN, launched a new version of LHC@home today.