Wired Gadget Labs had a neat little article, with pictures, about this fellow in England. He makes art photographs of fluorescent tubes standing in fields, lit by the overhead power lines. You have to see it to believe it, it’s incredible, but in a “Wow! But… what are they doing to my body if they can do that?” kind of way.

[Fluorescent bulbs lit by Magnetic Fields at Wired]



Six botnets are responsible for generating 85% of all spam emails, according to an article at Dark Reading today. I’ve known for a while that botnets are a serious threat, with an arsenal of naughty tricks that includes spam, vulnerability scanning, infecting other computers, stealing information from their host computers (you don’t keep your credit card numbers in Quicken, do you?), and distributed denial of service attacks. It was still a shock to find that so few botnets send all that spam cluttering up our inboxes.

Researchers with Marshal’s TRACE team have identified six botnets that together are currently responsible for distributing 85 percent of all spam, Dark Reading has learned.

This article came at an opportune time. I’ve been thinking lately that in addition to all these current problems, which involve fairly visible (and traceable) activities, botnets could be very useful in cyberwarfare. I’m thinking here of true cyberwarfare, where a foreign power decides to take out the information infrastructure of a rival country or group.

Such a militarily-oriented botnet would not need to draw attention to itself by sending out spam messages or participating in a DDOS attack, at least not until it was directed to spring into action. It would need minimal direction, and so much like a mole in an espionage novel, such a botnet could exist for years, dormant, waiting only for the command to attack the military, government or civilian IT infrastructure of the rival country.

Alternatively, you wouldn’t even really need to attack anyone. Can you imagine the economic damage to the US if every computer infected by a large botnet (these can number in the millions of machines) were suddenly directed to reformat its hard drive?



I’ve spent a lot of time driving the last couple of weeks, and living in the Midwest, it’s a constant battle to keep the windshield clean and a perennial question of how often to replace the wiper blades. You can imagine my interest when I read at Sparkingtech about a new prototype technology that uses nanotechnology to keep your windshield clean instead.

Italian car designer Leonardo Fioravanti (of Pininfarina fame) has developed a prototype car with a windshield that doesn’t need wipers. It can brush away water and dirt all by itself.

The article is short on details, and I don’t really pretend to understand those they do include, but if I can get a windshield that stays clear without wipers, sign me up, even if nanotechnology is morally questionable.



Newly discovered flaws in VMware allow malware applications running in VMware to escape the sandbox. These applications can view and modify files in the underlying host operating system, according to this article at Computerworld, although to be fair it’s popping up on a number of sites.

As of Sunday, there was no patch available for the flaw, which affects VMware’s Windows client virtualization programs, including Workstation, Player and ACE. The company’s virtual machine software for Windows servers and for Mac- and Linux-based hosts are not at risk.

“Why should we care?” you might ask. Over the last couple of years, there’s been a movement towards virtualized infrastructure in larger IT shops, and even some smaller ones. This allows companies to provide “part” of a server to an individual user, application, or team, but host that partial server on another, larger server, or even a cluster or grid. These virtual servers can also be scaled and reconfigured more easily than a real, physical server. This makes more efficient use of the computing infrastructure, and makes it very easy to set up and tear down new virtual servers. It’s a win-win situation.

The discovery of this flaw that allows applications to “see” the underlying host OS could be a significant problem for the budding trend towards virtualization if it’s not addressed soon. VMware isn’t currently telling us when a patch will be available, but they provide a work-around in the meantime:

“On Windows hosts, if you have configured a VMware host-to-guest shared folder, it is possible for a program running in the guest to gain access to the host’s complete file system and create or modify executable files in sensitive locations,” confirmed VMware.

VMware has not posted a fix, but has instead told users to disable shared folders.

So until a fix is available, if you’re running VMware on Windows, disable those shared folders, folks.
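Beyond the checkbox in the Workstation UI, shared folders can also be switched off per virtual machine by editing its .vmx configuration file. The option below is how I understand VMware’s shared folders (HGFS) setting to be spelled, so treat it as an assumption and verify against VMware’s own documentation before relying on it:

```
# Disable shared folders (HGFS) for this virtual machine.
# Add to the VM's .vmx file while the VM is powered off.
isolation.tools.hgfs.disable = "TRUE"
```

This only closes off the shared-folders channel for that one VM, of course; other VMs keep their own settings.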



With the growing popularity of agile development philosophies, UML modeling has taken a back seat in a lot of development shops. I still believe that modeling has its place, even though this puts me at odds with a number of practitioners I’ve run into in the last few years.

UML modeling allows me to organize and plan out my thoughts for large systems, and share them easily. I’m thinking of those systems where extensibility and maintainability are important. It creates artifacts that allow others who come after me (or, for that matter, me in 6 months when I’ve forgotten) to understand the concepts and structures in the systems I build without having to pore over the code to do so. I also find when I’m trying to understand existing code, and there exist no good documents to describe that code, creating models helps me put the whole enchilada down in front of me at once, where I can get a sense of it.

As I mentioned, a good number of practitioners I run into today think that modeling does not need to happen, ever, and this is just wrong-headed. But let’s assume for the moment that you think perhaps you want a model for one of the reasons I mention, or for one I didn’t. Which tool should you use? The answer depends on what you’re trying to accomplish. Just as you wouldn’t drive a nail with a screwdriver, you want to pick the appropriate tool for the job at hand. I’m going to talk about several popular modeling tools: IBM Rational Rose, IBM Rational Software Architect, Sparx Systems Enterprise Architect, and Borland Together Control Center.




Botnets, those pernicious threats to internet life, liberty and the pursuit of happiness, may have a new enemy. Researchers at Georgia Tech are using traffic analysis of IRC and HTTP to try to identify botnets in the wild. The theory is that botnets need to communicate with their command and control infrastructure, and that when they do so they tend to look like, well, botnets rather than people.
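As a toy illustration of the idea (this is my own sketch with made-up flow records, not the Georgia Tech researchers’ actual system): bots answering the same command tend to send near-identical traffic to a single server, while human chatter varies wildly, so even a crude statistic like payload-size spread can separate the two.

```python
from collections import defaultdict
from statistics import pstdev

# Toy flow records: (source_host, dest_server, payload_bytes).
# The four "pc*" hosts behave like bots: many machines, one server,
# near-identical payload sizes. The humans on chat.example.org vary.
flows = [
    ("pc1", "irc.example.net", 412), ("pc2", "irc.example.net", 410),
    ("pc3", "irc.example.net", 413), ("pc4", "irc.example.net", 411),
    ("alice", "chat.example.org", 52), ("bob", "chat.example.org", 980),
    ("carol", "chat.example.org", 4),
]

def suspicious_servers(flows, min_hosts=3, max_stddev=5.0):
    """Flag servers contacted by many hosts with near-uniform payloads."""
    by_dest = defaultdict(list)
    for host, dest, size in flows:
        by_dest[dest].append(size)
    return [dest for dest, sizes in by_dest.items()
            if len(sizes) >= min_hosts and pstdev(sizes) <= max_stddev]

print(suspicious_servers(flows))  # → ['irc.example.net']
```

A real detector would of course look at timing, message content, and group behavior over time, not just payload sizes, but the principle is the same: coordinated machines are statistically unlike people.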

While my description above is a bit tongue-in-cheek, botnets are a serious threat. There are estimates that they contribute up to 80% of spam email, and they’re regularly used in denial of service attacks like the one ongoing right now against WordPress.com blogs. They’re particularly difficult to identify, track, and combat, and there are documented instances of hundreds of thousands, even over a million, zombie computers under the control of a single individual.

While it’s too early to tell if this new approach will help significantly in the fight, any means we have to help combat botnets are welcome at the party.

[Georgia Tech via Ars Technica]



RIP, Backups

February 20, 2008 | 2 Comments

I recently installed a new 750 gigabyte hard drive in one of my systems. As I realized that I now had 3/4 of a terabyte of information on a single spindle, and chuckled with glee, I was also thinking about backups, and how they’re really no longer a fact of life for many people at home. This is disconcerting for me, because I understand the risks involved: if that spindle goes away for any of a number of reasons (power problems, mechanical failure, coffee in the chassis, etc.), then all that data is gone.

There are two primary reasons that home users, and in fact even a fair number of businesses, don’t back up their machines: the size of the media and the time involved.

Where is the typical home user going to find space to back up 750 GB? CDs and even DVDs are too small to serve as effective backup media, and too slow besides. Backing up to other drives is really the only viable approach, yet adding an additional drive simply for backups seems excessive.
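Some back-of-the-envelope arithmetic (using nominal single-layer capacities; the numbers are mine, not from any article) makes the point about optical media:

```python
import math

drive_gb = 750   # the new drive
dvd_gb = 4.7     # nominal single-layer DVD capacity
cd_gb = 0.7      # ~700 MB CD-R

dvds = math.ceil(drive_gb / dvd_gb)
cds = math.ceil(drive_gb / cd_gb)
print(dvds, cds)  # 160 DVDs, or 1072 CDs, for one full backup
```

Nobody is going to sit and swap 160 discs, which is why the rest of this post is about other drives and network storage.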

Microsoft has recently introduced Windows Home Server, which would be another logical place for such a backup. Network attached storage devices from manufacturers like D-Link, Netgear, Western Digital and Buffalo have been around for a while, and would also work. The typical home user doesn’t realize these tools exist, though, nor should they need to. Both of these solutions presume more home network infrastructure than the typical house has, and setting up a backup schedule requires users who understand they need one. Unfortunately, this doesn’t occur to most people until it’s too late.

The second reason is time: unless a backup is scheduled to run automatically, at a time when the machine is on and unused (or lightly loaded), it interferes with use of the computer. This creates an incentive to disable, reschedule or cancel backups, in turn decreasing the likelihood that they’re actually done.
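For the rare user who does set this up, the usual answer on Unix-like systems is a scheduled job that runs in the small hours. The host name and paths below are purely illustrative, and a reachable backup target and rsync are assumed:

```
# crontab entry: at 3:15 every morning, mirror the home directory
# to a NAS share over SSH
15 3 * * * rsync -a --delete /home/me/ backup-nas:/backups/me/
```

The catch, as noted above, is that the machine has to actually be on at 3:15 AM, which most home machines aren’t.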

Larger businesses tend to have network file servers that provide shared drives, which are backed up, and also to run backup software on their employees’ computers. This, however, is also an imperfect solution, unless employees are unable to store files anywhere other than on the shared drives. It also doesn’t address machines used by road warriors, or by employees working at home temporarily or telecommuting.

This is a problem without neat solutions, unfortunately. I’ve run for years without backups by using RAID to reduce my risk of device failure, though that doesn’t help me with accidental deletion. The rapid growth of primary storage capacity, which has not been matched by secondary storage capacity and speed, has created a situation where backups are no longer a fact of life for most people.



A California District Court judge has ordered WikiLeaks to be taken offline in response to a complaint from Swiss bank Julius Baer, we learn from CSO magazine. This explains why WikiLeaks disappeared off the internets earlier this week.

The order in the U.S. came after a Swiss bank, Julius Baer, earlier this month filed a complaint against the site and San Mateo, California-based Dynadot, Wikileaks’ domain-name registry, for posting several hundred of the bank’s documents.

Some of those documents allegedly reveal that Julius Baer was involved in offshore money laundering and tax evasion in the Cayman Islands for customers in several countries, including the U.S.

WikiLeaks serves an interesting and controversial role, providing a way for leakers to anonymously post information that they believe should be in the public eye. It’s difficult to argue that the site couldn’t be abused by someone posting confidential company information that really has no business in the public eye. Conversely, with the rampant secrecy shown by the government, even to the extent of refusing to allow citizens to know who is informing public policy (remember the Vice President’s energy policy meetings?) tools like WikiLeaks are essential if we are not to become a police state.

It would be disappointing if the service were to go away for the long term. WikiLeaks sites in other countries are still on line as of this writing.



A new study by University of Wisconsin-Madison professor Dietram Scheufele finds that fewer than a third of Americans consider nanotechnology morally acceptable, reports this article in Science Daily.

In a sample of 1,015 adult Americans, only 29.5 percent of respondents agreed that nanotechnology was morally acceptable.

What, pray tell, is moral or immoral, acceptable or unacceptable about nanotechnology? Have we become so mired in fanatical puritanism that we are no longer capable of thinking for ourselves?

The catch for Americans with strong religious convictions, Scheufele believes, is that nanotechnology, biotechnology and stem cell research are lumped together as means to enhance human qualities. In short, researchers are viewed as “playing God” when they create materials that do not occur in nature, especially where nanotechnology and biotechnology intertwine, says Scheufele.

To a certain extent, this reminds me of some of the things that happened in the Islamic world during the dark ages. While most of Western Europe was trying to remember how to read and write, the Islamic world had mathematics, algebra, medicine, and astronomy. Then something happened, some say it was a new strict interpretation of Sharia law, that caused their progress to stop, like it was frozen in amber. The West continued to learn and develop, while the Islamic world stagnated, and the majority of advancements in science no longer arose from that part of the world.

Now, we have a majority of arguably the most technologically advanced country on the planet saying they want to teach Intelligent Design in schools, and that nanotechnology is morally indefensible. How long, I wonder, before new concepts must be approved by the church before they can be investigated?



Robot that Replays Dreams

February 18, 2008 | 1 Comment

I don’t generally remember my dreams, and so I’m naturally curious about what my subconscious gets up to while I’m not looking, as it were. I’m not sure, however, that I’d want them reenacted by a robot that took my EEG and eye movements as input.

Fernando Orellana and Brendan Burns have collaborated on a new art work which investigates one of the possible human-robot relationships.

Using recorded brainwave activity and eye movements during REM (rapid eye movement) sleep to determine robot behaviors and head positions, “Sleep Waking” acts as a way to “play-back” dreams.

On the other hand, it would be really interesting to find out if watching the robot would cause me to remember what I don’t.
