A USB Pen Drive

While at my forensics course a couple of weeks ago, the other guy in my class pulled out a USB stick and showed off some tools on there.  He built it using YUMI, over at Pen Drive Linux, and I’ve taken some time to do the same.  The process for creating a bootable USB stick is easy, and I’ve loaded my drive with goodies, including:

  • Windows 7 (using a valid copy I own; you need to provide your own ISO)
  • Backtrack 5
  • Ubuntu 12.04
  • Knoppix 6.7.1
  • Peppermint Two
  • Ophcrack
  • Ultimate Boot CD
  • Acronis anti-malware scanner
  • Kaspersky virus scanner
  • Trinity Rescue Kit
  • Gparted
  • Clonezilla
  • AVG anti-virus scanner
  • Memtest86+
  • FreeDOS
  • Tails

On top of that, I have some forensics and analysis tools on the stick, in case I’m somewhere without an internet connection and need to do some repair or analysis.  I like the idea of having most of my tools available on one bootable USB stick, rather than a pack of DVDs.

I really wanted to put DEFT Linux on this stick, but YUMI does not support it at this time.  Instead, you can use the Universal USB Installer to create a single-app bootable stick.  After running that, DEFT 7 loaded just fine.  Unfortunately, its ISO is just over 2 GB, meaning you’ll need a 4 GB stick to get the full kit on there.

Another difficulty I’ve had is getting additional bootable ISOs up and running.  For example, I wanted to add BartPE and a Windows 7 repair disc as well, but neither worked quite the way I’d have liked.

Incidentally, if you get a weird syslinux error in YUMI stating “An error(1) occurred while executing syslinux. Your USB drive won’t be bootable,” make sure that your USB stick is formatted as FAT32.  If in doubt, just let YUMI format the stick for you.
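
If you want to check the stick’s filesystem yourself before letting YUMI near it, something like this works on a Linux box.  The device name /dev/sdb1 here is an assumption — confirm which device is actually your stick with lsblk before running anything destructive:

```shell
# List block devices so you can identify the stick (size is the easiest clue)
lsblk -o NAME,SIZE,FSTYPE,MOUNTPOINT

# Assuming the stick's first partition is /dev/sdb1, this should print "vfat"
lsblk -no FSTYPE /dev/sdb1

# If it isn't vfat, unmount and reformat as FAT32 (this destroys everything on it!)
sudo umount /dev/sdb1
sudo mkfs.vfat -F 32 -n MULTIBOOT /dev/sdb1
```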

Design Problems

Here is an interesting post on design problems inherent in touchscreen-based technology (via Brent Ozar PLF’s weekly list). There is something to be said for tactile interfaces:  typing on an old IBM keyboard provides much better feedback than trying to type on a similarly-sized touchscreen keyboard (not to mention a much smaller touchscreen keyboard).

The other problem that I see is that most of our daily experiences are three-dimensional in nature.  The relative thickness of a book, as pointed out, tells you a good amount on its own:  how far along you are, roughly how much is left to go, and how big the book is compared to other books.  Without that third dimension, you need page numbers, or you’re lost.  The lack of a notable third dimension certainly keeps devices portable and light (I’m not complaining about being able to store hundreds of books on my nook, and I can lay my nook flat and expect to read from it without holding the thing open), but it comes with tradeoffs.  When it comes to something like a keyboard, or some other device where constant visual observation is a bad thing, the model falls apart, leaving us to cope with subpar design.

Ubuntu 11.10 On An Asus UL30Vt

Installing Ubuntu Linux on my Asus laptop required a few tweaks to get things working correctly.  Previously, I had used Ubuntu 9 and 10 and did not have nVidia driver support.  With 11, jockey-gtk (the Additional Drivers tool) told me that I could use the nVidia drivers, but after I installed them from that tool, it reported that the drivers were installed but not activated.  I tried a number of things but was unable to get the 3D card working; the Intel card worked fine and lsmod showed the nVidia module loaded, but changing my xorg.conf did not get the nVidia card working.  What finally fixed it was upgrading my BIOS firmware to the latest version.  After doing that, the driver worked fine, and now I have 3D support.  The battery life is pretty crappy—far worse than in Windows—but at least my video card works now.  In addition, I can run Unity in 3D.

My quick thought on Unity is that I like it.  I like the fact that it uses the Windows key to pop up a gnome-do style box.  That’s a rather convenient feature.  It also doesn’t really get in the way for me, which is vital for a menu.  Fancy-looking menus which get in the way aren’t progress…

In addition, when updating from 11.04 to 11.10, my touchpad stopped working.  When I ran xinput list in a console, it showed the “ETPS/2 Elantech Touchpad” as active and on id=12, but it just didn’t work.  It turns out that Mark Pointing has the correct answer for me:  the touchpad was renamed in 11.10, but one of the script files was not updated to match this change.  Specifically, change /etc/acpi/asus-touchpad.sh from

XINPUTNUM=`xinput list | grep 'SynPS/2 Synaptics TouchPad' | sed -n -e's/.*id=\([0-9]\+\).*/\1/p'`

to

XINPUTNUM=`xinput list | grep 'ETPS/2 Elantech Touchpad' | sed -n -e's/.*id=\([0-9]\+\).*/\1/p'`

Note that there are some apostrophes in there, but the entire line is bracketed by backticks (`, which shares its key with ~ on the US keyboard, and is above the tab key).  After changing that line and restarting X, my touchpad worked just fine.
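
As a sanity check, you can run the same grep/sed pipeline against a sample line of xinput output and confirm it pulls out the device id.  The sample line below is made up (your id may differ), but it matches the shape of what xinput list prints:

```shell
# Sample line mimicking `xinput list` output for the renamed touchpad
line='    ETPS/2 Elantech Touchpad                 id=12   [slave  pointer  (2)]'

# The same extraction asus-touchpad.sh performs: grab the number after "id="
XINPUTNUM=$(echo "$line" | grep 'ETPS/2 Elantech Touchpad' | sed -n -e 's/.*id=\([0-9]\+\).*/\1/p')

echo "$XINPUTNUM"   # prints 12
```

If that prints the id and the real script still comes up empty, the device name in the grep probably doesn’t match what xinput list reports on your machine.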

Another helpful hint from that thread comes from martincasco and Emily Strickland.  Install dconf-tools:

sudo apt-get install dconf-tools

Then run dconf-editor.  This does require a mouse, unfortunately, so grab a USB mouse to tide you over (or make the shell script change first).  In the configuration editor, go to org –> gnome –> settings-daemon –> peripherals –> touchpad.  In here, touchpad-enabled may be unchecked; if it is, check it.  You can also change the touchpad to use two-finger scrolling (which I’ve really gotten accustomed to) and turn on tap-to-click, another thing I like in Asus touchpads.  Horizontal scrolling is another setting I’ve turned on.
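
If you’d rather skip the GUI entirely, the same settings can be written from a terminal with dconf.  The key paths below are what dconf-editor shows under org –> gnome –> settings-daemon –> peripherals –> touchpad on this release; treat them as an assumption and verify them in dconf-editor if a write doesn’t take effect:

```shell
# Enable the touchpad (the boolean that may be unchecked in dconf-editor)
dconf write /org/gnome/settings-daemon/peripherals/touchpad/touchpad-enabled true

# Two-finger scrolling, tap-to-click, and horizontal scrolling
dconf write /org/gnome/settings-daemon/peripherals/touchpad/scroll-method "'two-finger-scrolling'"
dconf write /org/gnome/settings-daemon/peripherals/touchpad/tap-to-click true
dconf write /org/gnome/settings-daemon/peripherals/touchpad/horiz-scroll-enabled true
```

Note the nested quoting on scroll-method: dconf expects a GVariant string, so the single quotes go inside the double quotes.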

One Computer Down

Over the past week, I haven’t really posted very much and kind of fell behind on everything.  The reason for that is, when I got to work on Monday, my computer was dead as a doornail.  Unfortunately, we don’t do backups of physical machines.  Most of my important files were still fine—I save documents to a network drive and all code is regularly checked into source control—but I had to rebuild my machine from scratch.  Due to this, err, opportunity, I decided to volunteer for virtualization.  A good percentage of people at work are already on thin clients, so it’s not exactly treading new ground.

The developers and I had been resistant to the idea, though, due to our insatiable resource requirements.  I am the worst about it:  I usually keep three or four instances of Visual Studio, a couple of instances of SQL Server Management Studio, several diagnostic tools, one or two Powershell windows, Excel, a few windows of each of the three major browsers, and a bit more open at a time.  So I figured that if I could succeed in a virtual environment without major headaches, everybody else could adapt pretty easily.

It took me a few days to get my computer up and running—I also spent a good bit of time training a new developer who just joined—and so I didn’t really have a computer until sometime late on Thursday.  That limited the amount of time I could spend testing, but after running roughly 75% of my normal stress load, I noticed that it wasn’t appreciably slower than before.  Part of this is that I’ve moved up to a 64-bit machine with 6 GB of RAM (getting more RAM was one of the carrots I demanded in return for being a guinea pig), but it seems that virtualization has reached a point of mass acceptance in a business environment.  I remember the hubbub about Network Computers back in the mid-to-late ’90s and mocked it mercilessly (and, I believe, deservedly) back then, but at this point, major bandwidth improvements have made it so that we really can do all of this work on servers, streaming across gigabit (or better) connections.  Most importantly, I have a base image now, so if something happens to my installation, I’m back up and running quickly.  We tested that out on Friday and it worked pretty well—recomposition took about 15 minutes and then I spent another 45 minutes or so rebuilding my profile.

The PC is dead! Long live the PC!

I finally have a functioning PC again! I thought I’d share my story with you, the reader, so that you may benefit.

This computer (the one I’m typing on now) was purchased in October of 2009, after my attempts to build my first PC went horribly, horribly wrong. I still blame the motherboard manufacturer for making me improvise with my screws. For the most part, it’s this model, but with an older monitor I’ve had for even longer.

It worked pretty well for the first year or so. Shortly after it came out, I got Starcraft II, which was something I’d been waiting on for over ten years.

Then I discovered Starcraft II needed an internet connection, which I lacked at the time. Grrr. In August, I got the internet turned back on, which was nice. I couldn’t wait to play Starcraft II!

Then the computer died. At this point, I was cursing and threatening to destroy everyone and everything. Like when I get up in the morning, but much worse.

After troubleshooting, I discovered a problem with my hard drive, a Seagate 750 GB model. I’d formatted it three times, but was unable to install Windows on it. I decided to buy a new hard drive, and selected the Western Digital 1 TB Caviar Black. With my new hard drive, I installed Windows XP and Starcraft II. I finished most of the first mission, and went to bed.

The computer did not turn back on the next day. Worse, it seemed this hard drive had also gone bad. I called Western Digital, they sent a new hard drive. This one worked even worse. More anger and fury ensued. I tried booting Ubuntu from a USB drive, and it worked, but I couldn’t do much. Ubuntu could see the hard drive, though, which Windows could not.

I called WD tech support (a very good group) and they had me troubleshoot everything under the sun. I even switched SATA cables. They concluded my SATA controller had failed.

Fast forward a few months. I finally got the board I needed, installed it — and nothing happened. See, Windows 7 couldn’t see it and neither could Windows XP. The Rosewill RC-222 had no Windows 7 drivers that worked! XP wanted a floppy disk, so after consulting with Kevin, I built a slipstreamed copy of Windows XP with the appropriate drivers. That got me to a Blue Screen of Death.

I was about to throw this piece of shit in the garbage, when Kevin recommended checking the error code out online. Using my laptop, I found an interesting suggestion — change the BIOS setting from AHCI to IDE. It didn’t fool Windows 7 — but it did fool Windows XP. I can’t use Windows 7 until Rosewill comes up with less crappy drivers. But I can play Starcraft II. Huzzah!!!

Wait, what’s that? There’s a new semester starting? Winter break is over? Son of a …

Nook Not Recognized

I tried plugging in my Nook to recharge it and remove some PDFs I had finished, but I kept getting a message saying that the USB device was not recognized.  I tried various USB ports (sometimes devices only want to use the rear ports on my computer), but that didn’t fix it.  I even checked on my Linux PC to see if it could find the Nook; no dice.

After checking out this very informative post on Nook basics, I gave the hard reboot a try, and now it works again.  This fix wasn’t specified anywhere in the post, but there was a comment about a hard reboot (holding down the power button for 12-20 seconds) fixing software issues, and that’s apparently what this was.

Are Smartphones The Future?

Eric S. Raymond says yes.  Even though I don’t personally have a smartphone, I can see this being the case.  Yesterday, a few guys and I were having a discussion about phones.  All but one of them said that they hadn’t imagined a smartphone would be all that important to them, but as soon as they started playing around with one, even their non-techie wives were interested.  There was only one person who really wasn’t impressed with smartphones—or at least couldn’t find a reasonable use for one.  I’m kind of in that camp as well, at least for now (that and I don’t want to pay the bandwidth fees…).  But I do think Raymond has a good point:  you can get a somewhat-decent (and improving) camera, GPS device, various sensors, small gaming platform, telephone, MP3 player, video player, electronic book reader, data storage device, and more in just one device.

But the big complaint may be that none of these functions is quite as good as the distinct device.  The camera, for example, isn’t anywhere near a point-and-shoot, much less a DSLR.  Each of the other functions is also lacking in various ways.  So it’s nice to have a jack-of-all-trades device, but there are good use cases for the dedicated devices.  The major question, then, is to what extent the smartphone market will eat the individual device markets.  Raymond seems to argue that it will be extensive; I’m not quite as sure.  Though I should mention that I do agree with him on the substance of his argument, and would only differ in (perceived) market size.