These days, there are plenty of people out there
who like to code, either for money or for fun or for both. But,
apparently, there’s also a growing interest in watching other people
code. How else to explain the new site Watch People Code, which lets you, well, watch people code, live?
The
Watch People Code site simply makes it a little easier to watch streams
from the WatchPeopleCode subreddit that are currently live or coming
up. It’s generated some chatter among developers, and the feedback has
generally been positive, with most people feeling that it’s a good way
to learn and improve your own coding. For example, here are a few
comments written by developers on discussion forums such as Hacker News.
“I like this idea. It isn't the most common thing to shadow someone while they code, you can learn a lot from how people ‘flow’,” wrote nirkalimi.
“I am probably going to actually use this a lot. I find there's no better way to pick up new things than from pair coding,” wrote Itamar Kestenbaum.
“Watching someone code teaches many subtle techniques that people don't even realize they do. Especially the holistic set of techniques an individual uses and how they interact,” wrote endergen.
Live streams of people coding aren’t a totally new thing. Twitch.TV offers live streams of game development, as does Ludum Dare. Plus, you can also go directly to YouTube’s collection of live streams and find people programming.
It’s not only coders who may enjoy watching people program. A non-programmer friend of mine, for example, after I brought Watch People Code to her attention, found herself mesmerized by watching someone code. “This is oddly interesting,” she said.
The next generation of cloud servers might be deployed where the
clouds can be made of alcohol and cosmic dust: in space. That’s what
ConnectX wants to do with their new data visualization platform.
Why space? It’s not as though there isn’t room to set up servers here
on Earth, what with Germans willing to give up space in their utility
rooms in exchange for a bit of ambient heat and malls now leasing empty storefronts to service providers. But there are certain advantages.
The desire to install servers where there’s abundant, free cooling
makes plenty of sense. Down here on Earth, that’s what’s driven
companies like Facebook to set up shop in Scandinavia near the edge of
the Arctic Circle. Space gets a whole lot colder than the Arctic, so from that standpoint the ConnectX plan makes plenty of sense. There’s also virtually no humidity, which can wreak havoc on computers.
ConnectX also believes that the zero-g environment would do wonders for the lifespan of the hard drives in their servers, since it could reduce the resistance the drives encounter while spinning. That’s the same reason
Western Digital started filling hard drives with helium.
But what about data transmission? How does ConnectX plan on moving
the bits back and forth between their orbital servers and networks back
on the ground? Though something similar to NASA’s Lunar Laser Communication Demonstration — which beamed data to the moon 4,800 times faster than any RF system ever managed — seems like a decent option, they’re leaning on RF.
Mind you, it’s a fairly complex setup. ConnectX says they’re
polishing a system that “twists” signals to reap massive transmission
gains. A similar system demonstrated last year managed to push data over
radio waves at a staggering 32 Gbps, around 30 times faster than LTE.
So ConnectX seems to have that sorted. The only real question is the
cost of deployment. Can the potential reduction in long-term maintenance
costs really offset the massive expense of actually getting their
servers into orbit? And what about upgrading capacity? It’s certainly
not going to be nearly as fast, easy, or cheap as it is to do on Earth.
That’s up to ConnectX to figure out, and they seem confident that they
can make it work.
SAN DIEGO, Jan. 5, 2015 /PRNewswire/ -- Qualcomm Incorporated (NASDAQ: QCOM) today announced that its subsidiary, Qualcomm Life, Inc.,
has been selected by Novartis, a global pharmaceutical leader, as a
global digital health collaborator for its Trials of The Future program.
Qualcomm Life's 2net™ Platform will serve as a global connectivity
platform for collecting and aggregating medical device data during
clinical trials to improve the convenience and speed of capturing study
participant data and test results to ultimately gain more trial
efficiencies and connected experiences for participants.
The
Trials of The Future program is designed to leverage health care
technology to improve the experience of clinical trial participants and
patients using Novartis products, and provide connectivity with future
products marketed by Novartis. Novartis will combine the 2net Platform,
2net Hub and 2net Mobile technologies with designated medical devices to
automate the collection of vital patient data at patients' homes during
clinical trials.
"Novartis is a pioneer in putting technology to use in advancing pharmaceutical innovation," said Rick Valencia,
senior vice president and general manager, Qualcomm Life.
"Standardizing on the tech-agnostic 2net Platform and accessing the
robust ecosystem of integrated medical devices will provide them a great
range of flexibility and scalability, ultimately accelerating their
efforts to design more efficient, cost-effective clinical trials."
Novartis
is using the 2net Platform in a recently launched clinical study,
evaluating the use of mobile devices with chronic lung disease patients.
The study, which is observational in nature and does not involve any
Novartis pharmaceutical product, leverages 2net Mobile-enabled
smartphones and 2net Hubs to seamlessly collect and aggregate biometric
data from medical devices and transmit it to the cloud-based 2net Platform, which securely sends the data to the study coordinator.
Qualcomm Life to Have Significant Presence at CES 2015
Qualcomm Life will showcase chronic care and transitional care management demos at booth #8525 in Central Hall, Las Vegas Convention Center. James Mault,
MD, F.A.C.S., vice president and chief medical officer, Qualcomm Life,
will participate in a fireside chat on what is new in Quantified Self
from 3:30-4:30 p.m. in the Las Vegas Convention Center Room, N621 on Monday, January 5. Rick Valencia, senior vice president and general manager, Qualcomm Life, will participate in a fireside chat with Corinne Savill, head of business development and licensing, Novartis, titled "Pharma goes Techy and Consumers Score" from 9:40-10:04 a.m. at the Digital Health Summit at the Venetian level 2 Bellini Room 2004 on Thursday, January 8.
About Qualcomm Incorporated
Qualcomm Incorporated (NASDAQ: QCOM)
is a world leader in 3G, 4G and next-generation wireless technologies.
Qualcomm Incorporated includes Qualcomm's licensing business, QTL, and
the vast majority of its patent portfolio. Qualcomm Technologies, Inc., a
wholly-owned subsidiary of Qualcomm Incorporated, operates, along with
its subsidiaries, substantially all of Qualcomm's engineering, research
and development functions, and substantially all of its products and
services businesses, including its semiconductor business, QCT. For more
than 25 years, Qualcomm ideas and inventions have driven the evolution
of digital communications, linking people everywhere more closely to
information, entertainment and each other. For more information, visit
Qualcomm's website, OnQ blog, Twitter and Facebook pages.
Qualcomm and 2net are trademarks of QUALCOMM Incorporated, registered in the United States and other countries. Other product and brand names may be trademarks or registered trademarks of their respective owners.
I know very few open-source programmers, no matter how skilled they
may be with the intricacies of C++, who relish learning about the ins and
outs of open-source licenses. I can't blame them. Like it or not,
though, picking an open-source license is a necessity.
The GPLs are still the most popular licenses, but more permissive licenses are gaining fast. The Open Source Initiative
has long provided vital open-source licensing reference information,
but it still left programmers more puzzled than informed. Lately,
OSI-related sites, such as Choose a License and the open-source license FAQ on GitHub, have made it easier, but some programmers haven't been bothering with any license at all.
That's
a big mistake. If you don't have a license, you're leaving the door
open for people to fool with your code. You may think that's just
fine... just like the poor sods did who used a permissive Creative Commons license for "their" photographs that they'd stored on Yahoo's Flickr, only to discover that Yahoo had started selling prints of their photos ... and keeping all the money.
Maybe you're cool with that. I'm not.
Now, as Stephen O'Grady, co-founder of RedMonk,
a major research group, has observed, as software is shifting to being
used more in a service mode rather than deployed, you may not need to
protect your code with a more restrictive license such as the GPLv3 since "if the code is not a competitive advantage,
it is likely not worth protecting." Even in the case of code with
little intrinsic value, however, O'Grady believes that "permissive
licenses [such as the MIT License] are a perfect alternative."
So while O'Grady, using Black Duck data, found that "the GPL has been the overwhelmingly most selected license," these two licenses, GPLv2 and v3, are "no longer more popular than most of the other licenses put together."
Instead,
"The three primary permissive license choices (Apache/BSD/MIT) ...
collectively are employed by 42 percent. They represent, in fact, three
of the five most popular licenses in use today." These permissive
licenses have been gaining ground at the GPL's expense. The two biggest
gainers, the Apache and MIT licenses, were up 27 percent, while the GPLv2, Linux's license, has declined by 24 percent.
This
trend towards more permissive licensing is not, however, just the
result of younger programmers switching to thinking of code as a means to an end for cloud services such as Software-as-a-Service (SaaS). Instead, it's been moving that way since 2004, according to a 2012 study by Donnie Berkholz, a RedMonk analyst.
Berkholz learned, using data from Ohloh, an open-source code research project now known as Open Hub, that "Since 2010, this trend has reached a point where permissive is more likely than copyleft [GPL] for a new open-source project."
I'm
not sure that's wise, but this is 2014, not 1988. Many programs' functionality is now delivered as a service rather than from a program residing on your computer. What I do know, though, is that if you want some say in how your code will be used tomorrow, you still need to put it under some kind of license.
Yes, without any license, your code defaults to falling under copyright law.
In that case, legally speaking no one can reproduce, distribute, or
create derivative works from your work. You may or may not want that. In
any case, that's only the theory. In practice you'd find defending your
rights to be difficult.
You should also keep in mind that when
you "publish" your code on GitHub, or any other "public" site, you're
giving up some of your rights. Which ones? Well, it depends on the
site's Terms of Service. On GitHub, for example, if you choose to make
your project repositories public -- which is probably the case, or why would you be on GitHub in the first place? -- then you've agreed to allow others to view and fork your repositories. Notice the word "fork"? Good luck defending your copyright.
Seriously,
you can either figure out what you want to do with your code before you
start exposing it to the world, or you can do it after it's become a
problem. Me? I think figuring it out first is the smart play.
Diogo Mónica once wrote a short computer script that gave him a secret weapon in the war for San Francisco dinner reservations.
This was early 2013. The script would periodically scan the popular
online reservation service, OpenTable, and drop him an email anytime
something interesting opened up—a choice Friday night spot at the House
of Prime Rib, for example. But soon, Mónica noticed that he wasn’t
getting the tables that had once been available.
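Mónica hasn’t published his code, but the behavior he describes boils down to a simple polling loop: fetch the availability page, look for the slot you want, and send yourself an email the moment it appears. Here is a minimal Python sketch of that pattern; the URL, the search string, and the mail settings are placeholders, not anything from his actual script.

import smtplib
import time
import urllib.request
from email.message import EmailMessage

# Placeholders: the real script isn't public, so the URL, the search
# string, and the mail settings below are purely illustrative.
SEARCH_URL = "https://example.com/restaurant/availability?date=friday"
WANTED = "7:30 PM"
POLL_SECONDS = 60

def table_available() -> bool:
    """Fetch the availability page and look for the wanted time slot."""
    with urllib.request.urlopen(SEARCH_URL) as resp:
        page = resp.read().decode("utf-8", errors="replace")
    return WANTED in page

def notify(text: str) -> None:
    """Send a one-line alert email (assumes a local mail relay)."""
    msg = EmailMessage()
    msg["Subject"] = "Table opened up"
    msg["From"] = "bot@example.com"
    msg["To"] = "me@example.com"
    msg.set_content(text)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

while True:
    if table_available():
        notify(f"Slot matching '{WANTED}' just appeared: {SEARCH_URL}")
    time.sleep(POLL_SECONDS)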
By the time he’d check the reservation site, his previously open
reservation would be booked. And this was happening crazy fast. Like in a
matter of seconds. “It’s impossible for a human to do the three forms
that are required to do this in under three seconds,” he told WIRED last
year.
Mónica could draw only one conclusion: He’d been drawn into a bot war.
Everyone knows the story of how the world wide web made the internet
accessible for everyone, but a lesser known story of the internet’s
evolution is how automated code—aka bots—came to quietly take it over.
Today, bots account for 56 percent of all website visits, says Marc
Gaffan, CEO of Incapsula, a company that sells online security services.
Incapsula recently ran an analysis of 20,000 websites to get a snapshot
of part of the web, and on smaller websites, it found that bot traffic
can run as high as 80 percent.
People use scripts to buy gear on eBay and, like Mónica, to snag the
best reservations. Last month, the band Foo Fighters sold tickets for
their upcoming tour at box offices only, an attempt to strike back against the bots used by online scalpers.
“You should expect to see it on ticket sites, travel sites, dating
sites,” Gaffan says. What’s more, a company like Google uses bots to
index the entire web, and companies such as IFTTT and Slack give us ways to use bots for good, personalizing our internet and managing the daily informational deluge.
But, increasingly, a slice of these online bots are malicious—used to
knock websites offline, flood comment sections with spam, or scrape
sites and reuse their content without authorization. Gaffan says that
about 20 percent of the Web’s traffic comes from these bots. That’s up
10 percent from last year.
Often, they’re running on hacked computers. And lately they’ve become
more sophisticated. They are better at impersonating Google, or at
running in real browsers on hacked computers. And they’ve made big leaps
in breaking human-detecting captcha puzzles, Gaffan says.
“Essentially there’s been this evolution of bots, where we’ve seen it
become easier and more prevalent over the past couple of years,” says
Rami Essaid, CEO of Distil Networks, a company that sells bot-blocking
software.
But despite the rise of these bad bots, there is some good news for
the human race. The total percentage of bot-related web traffic is
actually down this year from what it was in 2013. Back then it accounted
for 60 percent of the traffic, 4 percent more than today.
BitTorrent, the peer-to-peer file sharing company, is today opening an alpha test for its
latest stab at disrupting — or at least getting people to rethink — how
users interact with each other and with content over the Internet. Project Maelstrom
is BitTorrent’s take on the web browser: it does away with centralised servers, and web content is instead shared through torrents on a distributed network.
BitTorrent years ago first made a name for itself as a P2P network
for illicit file sharing — a service that was often used to share
premium content for free at a time when it was hard to get legal content
elsewhere. More recently, the company has been applying its knowledge
of distributed architecture to tackle other modern file-sharing
problems, producing services like Sync, for sharing large files with others; Bundle, which gives content makers a way of distributing and selling content; and the Bleep messaging service.
These have proven attractive to people for a number of reasons. For
some, it’s about more efficient services — there is an argument to be
made for P2P transfers of large files being faster and easier than those
downloaded from the cloud — once you’ve downloaded the correct local
client, that is (a hurdle in itself for some).
For others, it is about security. Your files never sit on any cloud, and instead stay locally even when they are shared.
This latter point is one that BitTorrent has been playing up a lot
lately, in light of all the revelations around the NSA and what happens
to our files when they are put onto cloud-based servers. (The long and
short of it: they’re open to hacking, and they’re open to governments
and others’ prying fingers.)
In the words of CEO Eric Klinker, Maelstrom is part of that same line of thinking: that using P2P can help online content run more smoothly.
“What if more of the web worked the way BitTorrent does?” he writes
of how the company first conceived of the problem. “Project Maelstrom
begins to answer that question with our first public release of a web
browser that can power a new way for web content to be published,
accessed and consumed. Truly an Internet powered by people, one that
lowers barriers and denies gatekeepers their grip on our future.”
Easy enough to say, but also leaving the door open to a lot of questions.
For now, the picture you see above is the only one that BitTorrent
has released to give you an idea of how Maelstrom might look. Part of
the alpha involves not just getting people to sign up to use it, but
getting people signed up to conceive of pages of content to actually
use. “We are actively engaging with potential partners who would like to
build for the distributed web,” a spokesperson says.
Nor is it clear what form the project will take commercially.
Asked about advertising — one of the ways that browsers monetise
today — it is “too early to tell,” the spokesperson says. “Right now the
team is focused on building the technology. We’ll be evaluating
business models as we go, just as we did with Sync. But we treat web
pages, along with distributed web pages the same way other browsers do.
So in that sense they can contain any content they want.”
That being said, it won’t be much different from what we know today as “the web.”
“HTML on the distributed web is identical to HTML on the traditional
web. The creation of websites will be the same, we’re just providing
another means for distributing and publishing your content,” he adds.
In that sense, you could think of Maelstrom as a complement to what
we know as web browsers today. “We also see the potential that there is
an intermingling of HTTP and BitTorrent content across the web,” he
says.
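The key conceptual shift is addressing: on today’s web a page is named by the server that hosts it, while on a torrent-backed web it is named by a hash of its content, which any peer can serve. BitTorrent hasn’t detailed Maelstrom’s actual scheme, so the Python sketch below is only an illustration of that content-addressing idea; the directory name and the choice of SHA-1 are assumptions.

import hashlib
from pathlib import Path

# Illustrative only: Maelstrom's real addressing scheme isn't described here.
# The point is that content-addressed distribution names a site by a hash of
# its bytes (as a torrent's info-hash does) rather than by the server hosting it.

def content_id(site_dir: Path) -> str:
    """Hash every file in a static-site folder into a single identifier."""
    digest = hashlib.sha1()
    for path in sorted(site_dir.rglob("*")):
        if path.is_file():
            digest.update(path.relative_to(site_dir).as_posix().encode())
            digest.update(path.read_bytes())
    return digest.hexdigest()

site = Path("my-static-site")  # ordinary HTML/CSS/JS; the pages themselves don't change
if site.is_dir():
    print("content identifier:", content_id(site))
    # Any peer holding bytes that hash to this identifier can serve the site;
    # there is no canonical origin server to go down or be throttled.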
It sounds fairly radical to reimagine the entire server-based
architecture of web browsing, but it comes at a time when we are seeing a
lot of bumps and growing pains for businesses over more traditional
services — beyond the reasons that consumers may have when they opt for
P2P services.
BitTorrent argues that the whole net neutrality debate — where
certain services that are data hungry like video service Netflix
threaten to be throttled because of their strain on ISP networks — is
one that could be avoided if those data-hungry services simply opted for
different ways to distribute their files. Again, this highlights the
idea of Maelstrom as a complement to what we use today.
“As a distributed web browser, Maelstrom can help relieve the burden
put on the network,” BitTorrent says. “It can also help maintain a more
neutral Internet as a gatekeeper would not be able to identify where
such traffic is originating.”
False-color micrograph of Caenorhabditis elegans
(Science Photo Library/Corbis)
If the brain is a collection
of electrical signals, then, if you could catalog all those signals digitally, you might be able to upload your brain into a computer, thus achieving digital immortality.
While the plausibility—and ethics—of this upload for humans can be debated, some people are forging ahead in the field of whole-brain emulation. There are massive efforts to map the connectome—all the connections in
the brain—and to understand how we think. Simulating brains could lead
us to better robots and artificial intelligence, but the first steps
need to be simple.
So, one group of scientists started with the roundworm Caenorhabditis elegans, a critter whose genes and simple nervous system we know intimately.
The OpenWorm project
has mapped the connections between the worm’s 302 neurons and simulated
them in software. (The project’s ultimate goal is to completely
simulate C. elegans as a virtual organism.) Recently, they put that software program in a simple Lego robot.
The worm’s body parts and neural networks now have
LegoBot equivalents: The worm’s nose neurons were replaced by a sonar
sensor on the robot. The motor neurons running down both sides of the
worm now correspond to motors on the left and right of the robot, explains Lucy Black for I Programmer. She writes:
“It is claimed that the robot behaved in ways that are similar to observed C. elegans. Stimulation of the nose stopped forward motion. Touching the anterior and posterior touch sensors made the robot move forward and back accordingly. Stimulating the food sensor made the robot move forward.”
Timothy Busbice, a founder of the OpenWorm project, posted a video of the Lego-Worm-Bot stopping and backing up.
The simulation isn’t
exact—the program makes some simplifications to the thresholds needed to
trigger a "neuron" firing, for example. But the behavior is impressive
considering that no instructions were programmed into this robot. All it
has is a network of connections mimicking those in the brain of a
worm.
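To make that concrete, here is a toy Python sketch of the same idea: behaviour emerging from a fixed table of weighted connections rather than from programmed rules. The neuron names, weights, and threshold below are invented for illustration; OpenWorm’s real model covers all 302 neurons and their synapses.

# Toy connectome: a fixed table of connections, no programmed behaviour.
# Names and weights are invented for this sketch, not taken from OpenWorm.
CONNECTOME = {
    "nose_sensor":   {"interneuron_A": 1.0},
    "food_sensor":   {"interneuron_B": 1.0},
    "interneuron_A": {"motor_left": -1.0, "motor_right": -1.0},  # inhibits motion
    "interneuron_B": {"motor_left":  1.0, "motor_right":  1.0},  # drives forward
}
THRESHOLD = 0.5

def step(stimulus: dict[str, float]) -> dict[str, float]:
    """Propagate sensor activations through the table and return motor drive."""
    activation = dict(stimulus)
    for _ in range(2):  # two synaptic hops: sensor -> interneuron -> motor
        nxt: dict[str, float] = {}
        for neuron, level in activation.items():
            if level < THRESHOLD:
                continue  # below threshold, this "neuron" does not fire
            for target, weight in CONNECTOME.get(neuron, {}).items():
                nxt[target] = nxt.get(target, 0.0) + level * weight
        activation = nxt
    return {k: v for k, v in activation.items() if k.startswith("motor_")}

print(step({"food_sensor": 1.0}))   # both motors driven: move forward
print(step({"nose_sensor": 1.0}))   # motors inhibited: stop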
Of course, the goal of uploading our brains assumes that we aren’t already living in a computer simulation.
Hear out the logic: Technologically advanced civilizations will
eventually make simulations that are indistinguishable from reality. If
that can happen, odds are it has. And if it has, there are probably
billions of simulations making their own simulations. Work out that
math, and "the odds are nearly infinity to one that we are all living in
a computer simulation," writes Ed Grabianowski for io9.
Google's moonshot group Google X is working on a pill that, when
swallowed, will seek out cancer cells in your body. It'll seek out all
sorts of diseases, in fact, pushing the envelope when it comes to
finding and destroying diseases at their earliest stages of development.
This system would face "a much higher regulatory bar than conventional
diagnostic tools," so says Chad A. Mirkin, director of the International
Institute for Nanotechnology at Northwestern University.
Word comes from the Wall Street Journal
where they've got Andrew Conrad, head of the Life Sciences team at the
Google X research lab, speaking during its WSJD Live conference. This
system is called "The Nano Particle Platform," and it aims to
"functionalize Nano Particles, to make them do what we want."
According to Conrad, Google X is most likely working on two separate devices. The first is the pill, which contains their smart
nanoparticles. The second is a wearable device that attracts the
particles so that they might be counted.
"Our dream" said Conrad, is that "every test you ever go to the doctor for will be done through this system."
It turns out that a vital missing ingredient in the long-sought-after
goal of getting machines to think like humans—artificial
intelligence—has been lots and lots of data.
Last week, at the O’Reilly Strata + Hadoop World Conference in New York, Salesforce.com’s head of artificial intelligence, Beau Cronin, asserted that AI has gotten a shot in the arm from the big data movement.
“Deep learning on its own, done in academia, doesn’t have the [same]
impact as when it is brought into Google, scaled and built into a new
product,” Cronin said.
In the week since Cronin’s talk, we saw a
whole slew of companies—startups mostly—come out of stealth mode to
offer new ways of analyzing big data, using machine learning, natural
language recognition and other AI techniques that researchers have
been developing for decades.
One such startup, Cognitive Scale,
applies IBM Watson-like learning capabilities to draw insights from
vast amounts of what it calls “dark data,” buried either in the Web—Yelp
reviews, online photos, discussion forums—or on the company network,
such as employee and payroll files, noted KM World.
Cognitive
Scale offers a set of APIs (application programming interfaces) that
businesses can use to tap into cognitive-based capabilities designed to
improve search and analysis jobs running on cloud services such as IBM’s Bluemix, detailed the Programmable Web.
Cognitive Scale was founded by Matt Sanchez, who headed up IBM’s Watson Labs,
helping bring to market some of the first e-commerce applications based
on the Jeopardy-winning Watson technology, pointed out CRN.
Sanchez,
now chief technology officer for Cognitive Scale, is not the only
Watson alumnus who has gone on to commercialize cognitive technologies.
Alert reader Gabrielle Sanchez pointed out that another Watson alum, engineer Pete Bouchard, recently joined the team of another cognitive computing startup, Zintera,
as chief innovation officer. Sanchez, who studied cognitive
computing in college, found a demonstration of the company’s “deep
learning” cognitive computing platform to be “pretty impressive.”
AI-based deep learning with big data was certainly on the mind of senior Google executives. This week the company snapped up two Oxford University technology spin-off companies that focus on deep learning, Dark Blue Labs and Vision Factory.
The teams will work on image recognition and natural language understanding, Sharon Gaudin reported in Computerworld.
Sumo Logic
has found a way to apply machine learning to large amounts of machine data. An update to its analysis platform now allows the software to pinpoint causal relationships within sets of data, Inside Big Data concluded.
A company could, for instance, use the Sumo Logic cloud service to analyze log data to troubleshoot a faulty application.
While companies such as Splunk have long offered search engines for machine data, Sumo Logic moves that technology a step forward, the company claimed.
“The
trouble with search is that you need to know what you are searching
for. If you don’t know everything about your data, you can’t, by
definition, search for it. Machine learning became a fundamental part of
how we uncover interesting patterns and anomalies in data,” explained
Sumo Logic chief marketing officer Sanjay Sarathy, in an interview.
For
instance, the company, which processes about 5 petabytes of customer
data each day, can recognize similar queries across different users, and
suggest possible queries and dashboards that others with similar setups
have found useful.
“Crowd-sourcing intelligence around different
infrastructure items is something you can only do as a native cloud
service,” Sarathy said.
With Sumo Logic, an e-commerce company
could ensure that each transaction conducted on its site takes no longer
than three seconds to occur. If the response time is lengthier, then an
administrator can pinpoint where the holdup is occurring in the transactional flow.
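The article doesn’t show what that looks like in practice, but the underlying pattern is straightforward: group timing records by transaction, compare totals against the budget, and surface the slowest stage. The Python sketch below invents a trivial log format to illustrate it; Sumo Logic itself ingests whatever your application actually logs.

import re
from collections import defaultdict

# Invented log format for illustration only.
# Each line: <transaction id> <stage> <duration in ms>.
LOG = """\
tx-101 auth 120
tx-101 inventory 340
tx-101 payment 2900
tx-102 auth 95
tx-102 inventory 310
tx-102 payment 650
"""

THRESHOLD_MS = 3000  # the three-second budget mentioned above

stages = defaultdict(dict)
for line in LOG.splitlines():
    tx, stage, ms = re.split(r"\s+", line.strip())
    stages[tx][stage] = int(ms)

for tx, timings in stages.items():
    total = sum(timings.values())
    if total > THRESHOLD_MS:
        slowest = max(timings, key=timings.get)
        print(f"{tx}: {total} ms total, holdup in '{slowest}' ({timings[slowest]} ms)")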
One existing Sumo Logic customer, fashion retailer Tobi, plans to use the new capabilities to better understand how its customers interact with its website.
One-upping IBM on the name game is DataRPM, which crowned its own big data-crunching natural language query engine Sherlock (named after Sherlock Holmes, who, after all, employed Watson to execute his menial tasks).
Sherlock
is unique in that it can automatically create models of large data
sets. Having a model of a data set can help users pull together
information more quickly, because the model describes what the data is
about, explained DataRPM CEO Sundeep Sanghavi.
DataRPM can analyze
a staggeringly wide array of structured, semi-structured and
unstructured data sources. “We’ll connect to anything and everything,”
Sanghavi said.
The service can then look for ways that different data sets could be combined to provide more insight.
“We
believe that data warehousing is where data goes to die. Big data is
not just about size, but also about how many different sources of data
you are processing, and how fast you can process that data,” Sanghavi
said, in an interview.
For instance, Sherlock can pull together
different sources of data and respond with a visualization to a query
such as “What was our revenue for last year, based on geography?” The
system can even suggest other possible queries as well.
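Sanghavi doesn’t reveal how Sherlock parses the question, but a query like that ultimately resolves to an ordinary grouped aggregation over whatever sources the model has tied together. As a rough illustration, here is what that resolution might look like in Python; the records and field names are assumptions, not anything from DataRPM.

from collections import defaultdict

# Invented sample records, for illustration only.
sales = [
    {"year": 2013, "region": "EMEA",     "revenue": 1_200_000},
    {"year": 2013, "region": "Americas", "revenue": 2_500_000},
    {"year": 2013, "region": "APAC",     "revenue":   900_000},
    {"year": 2014, "region": "EMEA",     "revenue": 1_400_000},
]

def revenue_by_region(records, year):
    """Sum revenue per region for one year: the aggregation the question implies."""
    totals = defaultdict(int)
    for row in records:
        if row["year"] == year:
            totals[row["region"]] += row["revenue"]
    return dict(totals)

# "What was our revenue for last year, based on geography?"
print(revenue_by_region(sales, year=2013))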
Sherlock
has a few advantages over Watson, Sanghavi claimed. The training period
is not as long, and the software can be run on-premise, rather than as a
cloud service from IBM, for those shops that want to keep their
computations in-house. “We’re far more affordable than Watson,” Sanghavi
said.
Initially, DataRPM is marketing to the finance, telecommunications, manufacturing, transportation and retail sectors.
One company that certainly does not think data warehousing is going to die is a recently unstealthed startup run by Bob Muglia, called Snowflake Computing.
Publicly
launched this week, Snowflake aims “to do for the data warehouse what
Salesforce did for CRM—transforming the product from a piece of
infrastructure that has to be maintained by IT into a service operated
entirely by the provider,” wrote Jon Gold at Network World.
Founded
in 2012, the company brought in Muglia earlier this year to run the
business. Muglia was the head of Microsoft’s server and tools division,
and later, head of the software unit at Juniper Networks.
While Snowflake could offer its software as a product, it chooses to do so as a service, noted Timothy Prickett Morgan at Enterprise Tech.
“Sometime
either this year or next year, we will see more data being created in
the cloud than in an on-premises environment,” Muglia told Morgan.
“Because the data is being created in the cloud, analysis of that data
in the cloud is very appropriate.”
More and more, governments are using powerful spying software to target human rights activists and journalists, often the forgotten victims of cyberwar. Now, these victims have a new tool to protect themselves.
Called Detekt, it scans a person's computer for traces of surveillance software, or spyware. A coalition of human rights organizations, including Amnesty International and the Electronic Frontier Foundation launched Detekt on Wednesday, with the goal of equipping activists and journalists with a free tool to discover if they've been hacked.
"Our ultimate aim is for human rights defenders, journalists and civil society groups to be able to carry out their legitimate work without fear of surveillance, harassment, intimidation, arrest or torture," Amnesty wroteThursday in a statement.
The open-source tool was developed by Claudio Guarnieri, a security researcher who has been investigating government abuse of spyware for years. He often collaborates with other researchers at the University of Toronto's Citizen Lab.
During their investigations, Guarnieri and his colleagues discovered, for example, that the Bahraini government used software created by German company FinFisher to spy on human rights activists. They also found out that the Ethiopian government spied on journalists in the U.S. and Europe, using software developed by Hacking Team, another company that sells off-the-shelf surveillance tools.
Guarnieri developed Detekt from software he and the other researchers used during those investigations.
"I decided to release it to the public because keeping it private made no sense," he told Mashable. "It's better to give more people as possible the chance to test and identify the problem as quickly as possible, rather than keeping this knowledge private and let it rot."
Detekt only works with Windows, and it's designed to discover both malware developed by commercial firms and popular spyware used by cybercriminals, such as BlackShades RAT (Remote Access Tool) and Gh0st RAT.
The tool has some limitations, though: It's only a scanner, and doesn't remove the malware infection, which is why Detekt's official site warns that if there are traces of malware on your computer, you should stop using it "immediately," and look for help. It also might not detect newer versions of the spyware developed by FinFisher, Hacking Team and similar companies.
"If Detekt does not find anything, this unfortunately cannot be considered a clean bill of health," the software's "readme" file warns.
For some, given these limitations, Detekt won't help much.
"The tool appears to be a simple signature-based black list that does not promise it knows all the bad files, and admits that it can be fooled," John Prisco, president and CEO of security firm Triumfant, said. "Given that, it seems worthless to me, but that’s probably why it can be offered for free."
Joanna Rutkowska, a researcher who develops the security-minded operating system Qubes, said computers with traditional operating systems are inherently insecure, and that tools like Detekt can't help with that.
"Releasing yet another malware scanner does nothing to address the primary problem," she told Mashable. "Yet, it might create a false sense of security for users."
But Guarnieri disagrees, saying that Detekt is not a silver-bullet solution intended to be used in place of commercial anti-virus software or other security tools.
"Telling activists and journalists to spend 50 euros a year for some antivirus license in emergency situations isn't very helpful," he said, adding that Detekt is not "just a tool," but also an initiative to spark discussion around the government use of intrusive spyware, which is extremely unregulated.
For Mikko Hypponen, a renowned security expert and chief research officer for anti-virus vendor F-Secure, Detekt is a good project because its target audience — activists and journalists — don't often have access to expensive commercial tools.
“Since Detekt only focuses on detecting a handful of spy tools — but detecting them very well — it might actually outperform traditional antivirus products in this particular area,” he told Mashable.