Bad name, but awesome job. To think that you could rig a Windows look-alike on a Commodore 64 is just amazing.
Joel Runyon runs into Russell Kirsch in a coffeehouse in Portland. Reading the follow-up post is also something I recommend.
While at my forensics course a couple of weeks ago, the other guy in my class pulled out a USB stick and showed off some tools on there. He built it using YUMI, over at Pen Drive Linux, and I’ve taken some time to do the same. The process for creating a bootable USB stick is easy, and I’ve loaded my drive with goodies, including:
- Windows 7 (using a valid copy I own; you need to provide your own ISO)
- Backtrack 5
- Ubuntu 12.04
- Knoppix 6.7.1
- Peppermint Two
- Ultimate Boot CD
- Acronis anti-malware scanner
- Kaspersky virus scanner
- Trinity Rescue Kit
- AVG anti-virus scanner
- Memtest 86+
On top of that, I have some forensics and analysis tools on the stick, in case I’m somewhere without an internet connection and need to do some repair or analysis. I like the idea of having most of my tools available on one bootable USB stick, rather than a pack of DVDs.
I really wanted to put DEFT Linux on this stick, but YUMI does not support it at this time. Instead, you can use the Universal USB Installer to create a single-app bootable stick. After running that, DEFT 7 loaded just fine. Unfortunately, its ISO is just over 2 GB, meaning you’ll need a 4 GB stick to get the full kit on there.
Incidentally, if you get a weird syslinux error in YUMI stating "An error(1) occurred while executing syslinux. Your USB drive won't be bootable," make sure that your USB stick is formatted as FAT32. If in doubt, just let YUMI format the stick for you.
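If you'd rather format the stick yourself from a Linux box, something like the following should do it. This is just a sketch: /dev/sdX1 is a placeholder for your stick's first partition, and these commands will wipe everything on it, so double-check the device name first.

```shell
# List block devices so you can identify the USB stick
lsblk -o NAME,SIZE,MODEL

# Unmount the stick's partition if it auto-mounted (placeholder device)
sudo umount /dev/sdX1

# Create a FAT32 filesystem, labeled MULTIBOOT, on that partition
sudo mkfs.vfat -F 32 -n MULTIBOOT /dev/sdX1
```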
Here is an interesting post on design problems inherent in touchscreen-based technology (via Brent Ozar PLF’s weekly list). There is something to be said for tactile interfaces: typing on an old IBM keyboard provides much better feedback than trying to type on a similarly-sized touchscreen keyboard (not to mention a much smaller touchscreen keyboard).
The other problem I see is that most of our daily experiences are three-dimensional in nature. The relative thickness of a book, as pointed out, tells you a good deal on its own: how far along you are, roughly how much is left to go, and how big the book is compared to other books. Without that third dimension, you need page numbers or you're lost. The lack of a notable third dimension certainly keeps devices portable and light (I'm not complaining about being able to store hundreds of books on my nook, and I can lay my nook flat and read from it without holding the thing open), but it comes with tradeoffs. When it comes to something like a keyboard, or any other device where constant visual observation is a bad thing, the model falls apart, leaving us to cope with subpar design.
Installing Ubuntu Linux on my Asus laptop required a few tweaks to get things working correctly. Previously, I had used Ubuntu 9 and 10 and did not have nVidia driver support. With 11, jockey-gtk (the Additional Drivers tool) told me that I could use the nVidia drivers, but after I installed them from that tool, it reported that the drivers were installed but not activated. I tried a number of things but could not get my 3D card to work: the Intel card worked fine, and lsmod showed the nVidia module loaded, but changing my xorg.conf did not bring the nVidia card up. What ended up fixing it was upgrading my BIOS firmware to the latest version. After doing that, the driver worked fine, and now I have 3D support. The battery life is pretty crappy (far worse than in Windows), but at least my video card works now. In addition, I can run Unity in 3D.
My quick thought on Unity is that I like it. I like the fact that it uses the Windows key to pop up a gnome-do style box. That’s a rather convenient feature. It also doesn’t really get in the way for me, which is vital for a menu. Fancy-looking menus which get in the way aren’t progress…
In addition, when updating from 11.04 to 11.10, my touchpad stopped working. When I ran xinput list in a console, it showed the "ETPS/2 Elantech Touchpad" as active with id=12, but it just didn't work. It turns out that Mark Pointing has the correct answer for me: the touchpad was renamed in 11.10, but one of the script files was not updated to match this change. Specifically, in /etc/acpi/asus-touchpad.sh, change

XINPUTNUM=`xinput list | grep 'SynPS/2 Synaptics TouchPad' | sed -n -e's/.*id=\([0-9]\+\).*/\1/p'`

to

XINPUTNUM=`xinput list | grep 'ETPS/2 Elantech Touchpad' | sed -n -e's/.*id=\([0-9]\+\).*/\1/p'`

Note that there are some apostrophes in there, but the entire command is bracketed by backticks (`, which shares its key with ~ on the US keyboard, just above Tab). After changing that line and restarting X, my touchpad worked just fine.
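To see what that pipeline actually does, here is a small sketch that runs the same grep/sed extraction against a canned line of xinput list output (the sample line is made up to resemble mine; your actual output will vary):

```shell
# A simulated line of `xinput list` output; the real script runs xinput itself
sample_output='ETPS/2 Elantech Touchpad                  id=12   [slave  pointer  (2)]'

# Pull out the numeric device id the same way asus-touchpad.sh does:
# grep finds the touchpad's line, sed captures the digits after "id="
XINPUTNUM=$(printf '%s\n' "$sample_output" \
  | grep 'ETPS/2 Elantech Touchpad' \
  | sed -n -e 's/.*id=\([0-9]\+\).*/\1/p')

echo "$XINPUTNUM"   # prints 12
```

The -n flag suppresses sed's default output, and the trailing p prints only lines where the substitution matched, so a non-matching line yields an empty result rather than garbage.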
If the touchpad still does not work, you can also check its settings in dconf. First, install the dconf tools:

sudo apt-get install dconf-tools

Then run dconf-editor. This does require a mouse, unfortunately, so get a USB mouse to tide you over (or make the shell script change first). In the configuration editor, go to org –> gnome –> settings-daemon –> peripherals –> touchpad. In there, touchpad-enabled may be unchecked; if it is, check it. You can also switch the touchpad to two-finger scrolling (which I've really gotten accustomed to) and turn on tap-to-click, another thing I like in Asus touchpads. Horizontal scrolling is another setting I've turned on.
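Since dconf-editor needs a mouse, the same keys can be flipped from a terminal with gsettings. This is a sketch based on the schema and key names from my install; they may differ on other GNOME versions:

```shell
# Re-enable the touchpad
gsettings set org.gnome.settings-daemon.peripherals.touchpad touchpad-enabled true

# Two-finger scrolling and tap-to-click
gsettings set org.gnome.settings-daemon.peripherals.touchpad scroll-method 'two-finger-scrolling'
gsettings set org.gnome.settings-daemon.peripherals.touchpad tap-to-click true

# Horizontal scrolling
gsettings set org.gnome.settings-daemon.peripherals.touchpad horiz-scroll-enabled true
```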
Over the past week, I haven’t really posted very much and kind of fell behind on everything. The reason for that is, when I got to work on Monday, my computer was dead as a doornail. Unfortunately, we don’t do backups of physical machines. Most of my important files were still fine—I save documents to a network drive and all code is regularly checked into source control—but I had to rebuild my machine from scratch. Due to this, err, opportunity, I decided to volunteer for virtualization. A good percentage of people at work are already on thin clients, so it’s not exactly treading new ground.
The developers and I had been resistant to the idea, though, due to our insatiable resource requirements. I am the worst about it: I usually keep three or four instances of Visual Studio, a couple of SQL Server Management Studio windows, several diagnostic tools, one or two Powershell windows, Excel, a few windows of the three major browsers, and a bit more open at a time. So I figured that if I could succeed in a virtual environment without major headaches, everybody else could adapt pretty easily.
It took me a few days to get my computer up and running (I also spent a good bit of time training a new developer who just joined), so I didn't really have a computer until sometime late on Thursday. That limited how much time I could spend testing, but after running roughly 75% of my normal stress load, I noticed that it wasn't appreciably slower than before. Part of this is that I've moved up to a 64-bit machine with 6 GB of RAM (getting more RAM was one of the carrots I demanded in return for being a guinea pig), but it seems that virtualization has reached a point of mass acceptance in a business environment. I remember the hubbub about Network Computers back in the mid-to-late '90s and mocked it mercilessly (and, I believe, deservedly) back then, but at this point, major bandwidth improvements have made it so that we really can do all of this work on servers, streaming across gigabit (or better) connections. Most importantly, I have a base image now, so if something happens to my installation, I'm back up and running in 15 minutes. We tested that out on Friday and it worked pretty well: recomposition took about 15 minutes, and then I spent another 45 minutes or so rebuilding my profile.