Image: Interfaculty Initiative in Information Studies, The University of Tokyo
In an interesting meshing of robotics and prosthetics development, Japanese researchers from the University of Tokyo, working in conjunction with Sony Corporation, have created an external forearm device capable of causing independent finger and wrist movement. Introduced on the Rekimoto Lab website, the PossessedHand, as it’s called, can be strapped to the wrist like a blood pressure cuff and fine-tuned to the individual wearing it. The device sends small doses of electricity to the muscles in the forearm that control movement, and can be “taught” to send preprogrammed signals that replicate normal wrist and finger movements, such as plucking the strings of a musical instrument.
Though the signals sent are too weak to actually cause string plucking, they are apparently strong enough for the user to understand which finger is supposed to move; thus, the device might be seen as more of a learning aid than an actual guitar accessory.
Current devices that do roughly the same thing rely on electrodes inserted into the skin or on gloves worn over the hand, both rather kludgy and perhaps somewhat painful. This new approach, in contrast, is said to feel more like a gentle hand massage.
Image: Interfaculty Initiative in Information Studies, The University of Tokyo
The original purpose of the PossessedHand seems to be as an aid to help people learn to play musical instruments, something that has drawn a bit of criticism from the musical community because nothing is actually learned while using the device: the hand essentially becomes an external part of the instrument while the brain remains passive. Even so, it seems clear the device could be used in many other ways. For example, it could help hearing people communicate with deaf sign-language users, help people type who have never learned how, or, perhaps more importantly, assist paralyzed people or those recovering from a stroke.
Image: Interfaculty Initiative in Information Studies, The University of Tokyo
In these instances it’s not always imperative that the user actually learn anything new, just that they are able to communicate when they want to. If the device could be programmed by the user to work in real time, its value would increase greatly. For example, if a person could speak into a microphone and have those words captured, translated into sign language, and transferred directly to their fingers, deaf people would instantly be able to communicate with anyone they meet who is willing to wear the cuff.
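To make that speculation concrete, here is a purely hypothetical sketch of how such a speech-to-fingers pipeline might be wired together in software. The PossessedHand’s actual control interface has not been published, so every class, channel number, and timing value below is invented for illustration only.

```java
// Purely hypothetical sketch: the PossessedHand's real control interface is not
// public. This only illustrates the speculative speech-to-fingerspelling pipeline
// described above; channel numbers and timings are invented.
import java.util.List;
import java.util.Map;

public class SpeechToFingersSketch {
    /** Imagined stimulation command: which electrode channel to pulse, and for how long. */
    record Pulse(int channel, int durationMs) {}

    // Imagined mapping from fingerspelled letters to electrode channels.
    // A real mapping would need one or more channels per finger/wrist muscle.
    static final Map<Character, Integer> LETTER_TO_CHANNEL = Map.of(
            'a', 0, 'b', 1, 'c', 2, 'd', 3, 'e', 4);

    /** Turn a recognized word into a sequence of stimulation pulses. */
    static List<Pulse> fingerspell(String word) {
        return word.toLowerCase().chars()
                .filter(c -> LETTER_TO_CHANNEL.containsKey((char) c))
                .mapToObj(c -> new Pulse(LETTER_TO_CHANNEL.get((char) c), 200))
                .toList();
    }

    public static void main(String[] args) {
        // In the speculative scenario the word would come from a speech recognizer;
        // only a handful of letters are mapped in this toy example.
        for (Pulse p : fingerspell("cab")) {
            System.out.println("pulse channel " + p.channel() + " for " + p.durationMs() + " ms");
        }
    }
}
```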
ASUS is a well-known Taiwanese computer components manufacturer that stepped into the tablet industry a couple of years ago. The ASUS Eee Pad Transformer is the company's latest Honeycomb tablet, with a unique form factor: it docks with a keyboard (purchased separately) that transforms it into a mini laptop or netbook. The tablet hasn't seen the wider market yet, with countries like India still awaiting its launch. Now it seems the company is already planning a successor powered by the NVIDIA Tegra 3 CPU and apparently running Android Ice Cream Sandwich, which could be launched in October or November this year. Previous rumours suggested that ASUS was planning a Windows 8 successor to the Eee Pad Transformer, but this new rumour seems more likely, and the fact that it comes from an ASUS supply chain source in Taiwan makes it more plausible still.
The Eee Pad Transformer With The Keyboard
The ASUS Eee Pad Transformer is a regular Honeycomb tablet with a 10.1" display. It stands out in the crowd of tablets because it comes with a keyboard (which can be bought separately), literally transforming it into a netbook. A similar feature is available with the Motorola XOOM, allowing it to be paired with a Bluetooth wireless keyboard. The Acer Iconia Tab is already available in India, and with the XOOM, Galaxy Tab 10.1 and Eee Pad Transformer launching soon, the tablet market in India could heat up in no time.
The Eee Pad Transformer runs on a 1 GHz dual-core NVIDIA Tegra 2 CPU, which is found in most Honeycomb tablets. The Tegra 3 processor, codenamed Kal-El, is a quad-core CPU capable of running each core at up to 1.5 GHz. Kal-El is said to be up to five times faster than the Tegra 2, so we can only imagine the snappiness of that beast.
-----
Personal comment: This kind of "hybrid" hardware demonstrates perfectly how "new" the so-called "concept" of the tablet really is! Start from a laptop, remove the keyboard and install a lightweight OS... only to attach a keyboard again less than two years later. Thanks to all the hardware manufacturers' tablets and hybrid models, one can now reasonably argue that any laptop can be "transformed" into a tablet and back, and, as end users, we may start requesting this kind of feature for all laptops without any power or functionality drawback, making mobile computing hardware converge again toward one global solution suitable for any kind of use.
If you want to feel something different from a classic mouse, meet the Amenbo mouse, a rather cool input device equipped with a touch sensor for each of the five fingers and shaped like a human hand. It was developed by the Japan Twin Research & Development Co., which has come up with something genuinely unusual.
The company has now officially shown off its creation, named Amenbo: a new input device that does not waste any finger but gives each of the five a touch sensor of its own. Each finger rests on its own sensor pad, and all five pads are connected to a single base by a flexible material that allows them to move. The device can detect pressure and finger movement, and lends itself to many uses.
“We put a sensor under each fingertip, so the coordinates of each finger, as well as the downward pressure from each finger, can be captured; actual hand movements are being fed into the PC,” says the Japan Twin Research & Development Co. One example application is manipulating 3D CAD data. Normally that requires a 3D mouse in one hand and a regular mouse in the other, but with the Amenbo both functions can be combined and operated with a single hand, and further uses depend on the user’s needs.
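The company has not published a driver or API, so here is a purely illustrative sketch of how per-finger data from such a device might be represented and folded into a single-handed 3D manipulation gesture; every type, field, and threshold below is an assumption.

```java
// Purely illustrative: the Amenbo's actual driver/API has not been published.
// This sketch only shows how five per-finger (x, y, pressure) samples might be
// represented and turned into a simple one-handed 3D manipulation gesture.
public class AmenboSketch {
    /** One sample from one fingertip pad: position on the pad plus downward pressure. */
    record FingerSample(double x, double y, double pressure) {}

    /** A full frame: one sample per finger, thumb through little finger. */
    record HandFrame(FingerSample[] fingers) {}

    /** Hypothetical rule: average pressure above a threshold means "grab" the 3D object. */
    static boolean isGrabbing(HandFrame frame) {
        double total = 0;
        for (FingerSample f : frame.fingers()) total += f.pressure();
        return total / frame.fingers().length > 0.5; // threshold is an arbitrary guess
    }

    /** Hypothetical rule: while grabbing, the index finger's motion rotates the object. */
    static double[] rotationDelta(HandFrame previous, HandFrame current) {
        FingerSample before = previous.fingers()[1]; // index finger
        FingerSample after = current.fingers()[1];
        return new double[] {after.x() - before.x(), after.y() - before.y()};
    }

    public static void main(String[] args) {
        FingerSample[] prev = new FingerSample[5];
        FingerSample[] curr = new FingerSample[5];
        for (int i = 0; i < 5; i++) {
            prev[i] = new FingerSample(0.0, 0.0, 0.6);
            curr[i] = new FingerSample(0.1, 0.0, 0.6);
        }
        HandFrame before = new HandFrame(prev);
        HandFrame after = new HandFrame(curr);
        System.out.println("grabbing: " + isGrabbing(after));
        double[] d = rotationDelta(before, after);
        System.out.println("rotate by dx=" + d[0] + " dy=" + d[1]);
    }
}
```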
As we can see from the demo video below, the Amenbo handles moving a 3D object smoothly. The mouse connects to the computer through a USB port; unfortunately, more complete information, including marketing plans, has not been disclosed.
The endoscope has been the go-to tool for doctors who need to look inside the digestive tract of a patient since the 1980s. These are generally long, black cables that are forced through the mouth or up the bum of people when doctors need to check for things like colon cancer or stomach issues. In 2008, we talked about a new development in endoscope tools that placed the camera into a small pill-shaped capsule that the person swallowed to get a look at their innards.
The problem with that camera capsule was that it could only drift along and take photos inside the body using the patient’s own muscle contractions. Some Japanese researchers have now taken that endoscope capsule to the next level and attached a small propulsion system that lets the camera swim through the digestive tract. The new capsule has what appears to be a flagellum-like mechanism on the back, driven magnetically.
The doctor is able to control the capsule with a joystick to get it to take photos of whatever part of the digestive tract is needed. The device has been nicknamed “mermaid”, and it would have to feel strange moving around inside the body. It is certainly better than a four-foot-long tube crammed down your throat.
Microsoft has released the Kinect for Windows SDK beta, as expected,
allowing PC developers to use the motion-tracking accessory. A free
100MB download, the SDK offers support for the depth sensor, color
camera and quad-microphone array, along with all the clever
skeletal-tracking systems that Xbox 360 game developers have had access
to.
There’s also integration with Windows’ speech recognition API. That
potentially means developers will be able to use the microphone array –
which can pinpoint which user is talking thanks to beamforming – to
transcribe speech to text, open and control applications, and more.
Microsoft has thrown in plenty of technical documentation, samples,
all the drivers you’ll need and support for C++, C#, or Visual
Basic. This current iteration of the SDK is only for non-commercial
purposes; Microsoft says it will be releasing a commercial version
later on. You’ll obviously need a Kinect sensor, too, which currently
costs around $140.
The iTunes Store continues to grow. The data that Apple published at its last event included the following:
15 billion iTunes song downloads
130 million book downloads
14 billion app downloads
$2.5 billion paid to developers
225 million accounts
425,000 apps
90,000 iPad apps
100,000 game and entertainment titles
50 million Game Center accounts
As this data is added to the existing data and cross-referenced, additional insight into the economics of iTunes emerges.
Since we know something about the average price of songs and apps, and we know the split between developers and Apple (and roughly between music labels and Apple), we can get a rough estimate of the amount Apple retains to run its store.
The following chart shows iTunes “content margin” by month. This margin is what Apple “keeps” after paying content owners but before paying for other costs like payment processing and delivery/fulfillment, which should be accounted for as variable costs. Strictly speaking, this margin is not “gross margin”.
If we add the content margins from music and apps and assume the store runs at break-even, we can get an idea of what it costs to operate the store. The latest number is $113 million per month (out of a total content income of $313 million per month), which implies over $1.3 billion per year.
Much of that cost does go into serving the content (traffic and payment processing). Some of it goes to curation and support. But it’s very likely that there is much left over to be invested in capacity increases.
I would like to hear alternative opinions, but my guess is that much of the capex that went into the new data centers Apple built came from the iTunes operating margin.
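To make the arithmetic concrete, here is a rough back-of-the-envelope sketch of the kind of estimate described above. The 70/30 developer split and the figures quoted in the post are taken as given; the average song price and Apple’s share of music revenue are assumptions, not figures from Apple.

```java
// Back-of-the-envelope estimate of Apple's cumulative iTunes "content margin".
// The 70/30 app split is public; the assumed average song price and Apple's
// share of music revenue are rough guesses, not figures from Apple.
public class ITunesMargin {
    public static void main(String[] args) {
        // Apps: $2.5B has been paid to developers, who receive 70% of app revenue.
        double paidToDevelopers = 2.5e9;
        double appGrossRevenue = paidToDevelopers / 0.70;      // ~ $3.6B
        double appContentMargin = appGrossRevenue * 0.30;      // ~ $1.1B kept by Apple

        // Music: 15B songs sold; assume ~$1.00 average price and ~30% kept by Apple.
        double songsSold = 15e9;
        double avgSongPrice = 1.00;        // assumption
        double appleMusicShare = 0.30;     // assumption
        double musicContentMargin = songsSold * avgSongPrice * appleMusicShare; // ~ $4.5B

        // Monthly run rate quoted above: $113M margin on $313M of content income.
        double monthlyMargin = 113e6;
        double impliedYearlyCost = monthlyMargin * 12;          // ~ $1.36B/year at break-even

        System.out.printf("App content margin (cumulative):   ~$%.1fB%n", appContentMargin / 1e9);
        System.out.printf("Music content margin (cumulative): ~$%.1fB%n", musicContentMargin / 1e9);
        System.out.printf("Implied yearly cost to run store:  ~$%.1fB%n", impliedYearlyCost / 1e9);
    }
}
```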
With the GreenChip, each bulb becomes a small networked device
NXP has just announced its GreenChip, which gives every light bulb
the potential of being connected to a TCP/IP network to provide
real-time information and receive commands, wirelessly. This feels a bit
like science-fiction talk, but NXP has managed to build a chip that is
low-cost enough to be embedded into regular light bulbs (and more in the
future) with an increase of about $1 in manufacturing cost. Obviously,
$1 is not small relative to the price of a bulb but, in absolute terms,
it’s not bad at all — and the cost is bound to fall steadily, thanks to
Moore’s law.
But what can you do with wirelessly connected bulbs? For one, you can dim them, or turn them on and off, using digital commands from any computer, phone or tablet.
You can also do it remotely: those chips have the potential of making
home automation much easier and more standard than anything that came
before. Better home automation can also mean smarter (and automated) energy and money savings. The bulbs are also smart enough to know how much energy they have consumed.
Although the bulbs use internet addresses, they are not connected
directly to the web. They don’t use Wi-Fi either, because that protocol is too expensive and not energy-efficient enough for this use. Instead, the bulbs are linked through a 2.4 GHz IEEE 802.15.4 network; in standby mode, the GreenChip consumes about 50 mW.
The network itself is a mesh network that is connected to a “box”
that will itself be connected to your home network. Computers and mobile
devices send commands to the box, which sends them to the bulbs.
Because it is a mesh network, every bulb acts as a “network extender”: as long as no two neighboring bulbs are more than 30 meters apart, the network can be extended across very large areas. In a typical house, that would mean no “dead spots”.
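NXP has not published the application-level protocol the bridge box will expose, so the following is a purely hypothetical sketch of what controlling a bulb from the home network might look like; the host, port, and JSON message format are all invented for illustration.

```java
// Hypothetical sketch only: NXP has not published the bridge's actual protocol.
// The host, port, and message format below are invented for illustration; the
// real GreenChip bulbs sit on an IEEE 802.15.4 mesh behind a bridge "box".
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class BulbCommandDemo {
    public static void main(String[] args) throws Exception {
        // Imagined bridge address on the home network.
        String bridgeHost = "192.168.1.50";
        int bridgePort = 8888;

        // Imagined command: dim the bulb with ID "kitchen-1" to 40% brightness.
        String command = "{\"bulb\":\"kitchen-1\",\"action\":\"dim\",\"level\":40}";

        try (Socket socket = new Socket(bridgeHost, bridgePort);
             OutputStream out = socket.getOutputStream()) {
            out.write(command.getBytes(StandardCharsets.UTF_8));
            out.flush();
            // The bridge would relay the command over the 802.15.4 mesh to the bulb.
        }
    }
}
```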
The first products will be manufactured by TCP, which manufactures
about one million efficient light bulbs (of all sorts) per day. TCP supplies other brands like Philips and GE. The prices of the final products have yet to be determined, but NXP expects them to be attractive to consumers.
Of course, we need to see what the applications will look like too.
This is an interesting first step in embedding low-cost smart chips in low-cost goods, and a critical one in creating a smarter local energy grid in our homes.
During today’s opening keynote at Google I/O, Google touched on all the major topics that folks were anticipating, including Android Ice Cream Sandwich, Google Music, and Google TV, but it also announced some unexpected developments. One of them is the new Android Open Accessory initiative, which will allow developers to create their own hardware accessories that can be controlled by Android.
The Android Open Accessory system is built on the open-source Arduino platform. It will allow external accessories to connect to an Android-powered device via USB and, eventually, Bluetooth. This could essentially mean the start of a whole slew of “Made for Android” devices such as docks, speakers, or even exercise bikes, and it opens up a whole world of possibilities for external accessories that work with Android.
Google is offering an Accessory Development Kit, or ADK, for developers. The kit comes with a sample implementation in the form of a USB accessory, along with all the hardware design files, the code for the accessory’s firmware, and the Android application that interacts with the accessory.
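As a rough sketch of what the Android side of such a connection looks like, here is a minimal fragment using the android.hardware.usb API introduced in Android 3.1 (the ADK demo also targets Android 2.3.4 through an add-on library). The activity name and the one-byte command are assumptions, and permission handling is reduced to a simple check rather than the full broadcast-receiver flow.

```java
// Minimal sketch (not the official ADK demo): opening an Android Open Accessory
// connection with the android.hardware.usb API from Android 3.1. The accessory
// filter XML, permission broadcasts, and error recovery are omitted for brevity.
import android.app.Activity;
import android.content.Context;
import android.hardware.usb.UsbAccessory;
import android.hardware.usb.UsbManager;
import android.os.ParcelFileDescriptor;

import java.io.FileDescriptor;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class AccessoryDemoActivity extends Activity {
    private ParcelFileDescriptor accessoryFd;
    private FileInputStream fromAccessory;
    private FileOutputStream toAccessory;

    @Override
    protected void onResume() {
        super.onResume();
        UsbManager usbManager = (UsbManager) getSystemService(Context.USB_SERVICE);
        UsbAccessory[] accessories = usbManager.getAccessoryList();
        if (accessories == null || accessories.length == 0) {
            return; // no accessory attached
        }
        UsbAccessory accessory = accessories[0];
        if (!usbManager.hasPermission(accessory)) {
            return; // a real app would call requestPermission() and wait for the broadcast
        }
        accessoryFd = usbManager.openAccessory(accessory);
        if (accessoryFd == null) {
            return;
        }
        FileDescriptor fd = accessoryFd.getFileDescriptor();
        fromAccessory = new FileInputStream(fd);   // data sent by the accessory firmware
        toAccessory = new FileOutputStream(fd);    // commands sent to the accessory
        try {
            // Hypothetical one-byte command; the actual message format is whatever
            // the accessory's (Arduino) firmware defines.
            toAccessory.write(new byte[] {0x01});
        } catch (IOException e) {
            // handle a broken connection
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        try {
            if (accessoryFd != null) accessoryFd.close();
        } catch (IOException ignored) {
        }
    }
}
```

The stream pair is simply a raw pipe between the phone and the accessory board; what the bytes mean is entirely up to the firmware on the Arduino side.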