Once again, I find myself singing the praises of SSH. Seriously, is there much of a reason to have any other ports open anymore? The latest trick I have added to my list of things SSH can do is presenting a remote filesystem, securely. Now, I’m sure most of us are aware that you can transfer files over SSH using a protocol called SFTP. What you may not be aware of is that you can mount this remote filesystem locally using a nifty little tool called SSHFS. This is incredibly useful in a number of situations, allowing you to access remote files in a way that is easy for the user (as easy as a local filesystem), easier to set up than solutions such as NFS, and as secure as SSH itself.

All you have to do on the remote machine you wish to access is have OpenSSH listening somewhere. For the client machine, you need to make sure you have SSHFS installed. To do this on Ubuntu, simply run:

sudo apt-get install sshfs

Now, to mount the filesystem locally, we first need to create a mount point for the filesystem:

mkdir /path/to/mountpoint
chown user /path/to/mountpoint

Where user is your username and /path/to/mountpoint is the location of the mount point. Now, to go ahead and mount the remote filesystem, simply execute this command with your own information inserted:

sshfs remote-username@address.of.server:/remote/folder/to/mount /path/to/mountpoint

Enter your password, and that’s it! Your remote filesystem should now be mounted.

Well, that’s pretty cool in itself, but what if we want to go further and have it mount at startup without any interaction from us? No problem, thanks to another cool feature of SSH called public key authentication. This feature allows us to log in to a system without providing the password of the user we are authenticating as, instead authenticating based on RSA keys. If you trust me that this is secure, you can skip the next paragraph, but if you don’t, or you are curious how this works, read on.

The initial key exchange that SSH does is secured using an asymmetric encryption algorithm called RSA. The goal of this key exchange is to agree on a symmetric key (AES, DES, whatever you want) over RSA, because RSA itself is too slow to handle the large amount of data that needs to be encrypted to secure all of the SSH traffic. It is ideal, however, for keeping a key exchange secure. The way it works is that each participant, both client and server, has a public key and a private key, and you give out the public key to anyone you want to be able to send you data. Once data is encrypted with the public key, the only way to decrypt it is with the private key, which never leaves the local computer. This technique has the useful property of providing both confidentiality and, as long as the private key is kept secret, authenticity. That means you can authenticate to a system based solely on key pairs: no one without the private key can complete the challenge the server poses, so possession of the key stands in for a password. If you would like more explanation than this incredibly brief overview, go check out the Wikipedia articles on RSA and on SSH; they should give you all the information you want.

Now that you don’t feel like you’re doing something incredibly dangerous (or maybe you still do, and you just like danger…:P ), follow these steps provided by OpenSSH on how to set up public key authentication between two hosts.  Once done, all that’s left to do is add the sshfs command that we used earlier to mount the remote filesystem to a startup script somewhere. To do this in Ubuntu/GNOME, you can simply go to System->Preferences->Startup Applications and add a new entry that uses our command from earlier as the command to be executed at login. If you are not on Ubuntu or using GNOME, you should be able to find documentation somewhere on how to make something run on startup.
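Just to sketch what those steps boil down to, here is the key-generation half in script form. This is a sketch, not the official OpenSSH instructions: it writes to a scratch directory so nothing in ~/.ssh gets clobbered, and the server address is the placeholder from earlier.

```shell
#!/bin/sh
# Generate a passwordless RSA key pair (passwordless so the startup
# mount needs no interaction; keep the private key readable only by you).
# Using a scratch directory here; for real use, target ~/.ssh/id_rsa.
keydir=$(mktemp -d)
ssh-keygen -q -t rsa -b 4096 -N "" -f "$keydir/id_rsa"
ls "$keydir"
# Copying the public key to the server would then be something like:
#   ssh-copy-id -i "$keydir/id_rsa.pub" remote-username@address.of.server
```

Once the public key is on the server (in ~/.ssh/authorized_keys), the sshfs command from earlier should mount without asking for a password.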

That’s all there is to it; I hope someone finds it useful. Just a short note: if you need to unmount the share, simply execute sudo umount /path/to/mountpoint and you’ll be fine. Enjoy!
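One more aside: if you’d rather have the system mount the share at boot instead of at login, an /etc/fstab entry can do the job once key authentication is in place. This is only a sketch; the fuse.sshfs type and these option names assume a reasonably recent sshfs, and the uid/gid values and key path are placeholders you’ll need to adjust:

```
# /etc/fstab entry (all one line; a sketch)
remote-username@address.of.server:/remote/folder/to/mount  /path/to/mountpoint  fuse.sshfs  _netdev,reconnect,IdentityFile=/home/user/.ssh/id_rsa,uid=1000,gid=1000  0  0
```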


Hey all, long time no post yet again. Exams can do that to you…but if you are much more fortunate than I and have some time to kill, I would highly suggest taking the pre-alpha of Chromium (the open source base of Google Chrome) for a spin. For a long while now I’ve been looking for a decent replacement for Firefox on my netbook, where it is almost unbearably slow. Thankfully for my mobile browsing experience, Chromium seems to be shaping up to be that browser. It still has tons of bugs and crashes occasionally, but for pre-alpha it’s really polished and *really* fast. I will be writing a more thorough review once I get the time, but until then you all can just see for yourself.

A big fat warning before we begin: EXPECT THINGS TO BREAK. This isn’t even in alpha yet, so there are no guarantees as to your experience. That said, I’ve had a pretty good experience with it so far.

All you really need to do to get Chromium installed is to add the nightly PPA repository that the developers were kind enough to set up for all of us Ubuntu users and install the chromium-browser package. To do this, simply do the following:

Open up a terminal and execute the following command (if you’d rather use the Alt+F2 run dialog, use gksudo instead of sudo so you get a graphical password prompt):

sudo gedit /etc/apt/sources.list

Now go to the PPA site to get the correct lines to add into the file.  To do this, simply select your version of Ubuntu and it will tell you what lines you need. It should look something like this (the lines for Intrepid):

deb http://ppa.launchpad.net/chromium-daily/ppa/ubuntu intrepid main
deb-src http://ppa.launchpad.net/chromium-daily/ppa/ubuntu intrepid main

Add these to the end of the file, save, then exit.
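If you’d rather script the edit than open gedit, appending the lines works too. The sketch below writes to a scratch file so it’s safe to try as-is; for real use you’d target /etc/apt/sources.list with sudo, using the lines for your own release:

```shell
#!/bin/sh
# Append the PPA lines to a sources file.
sources=$(mktemp)   # stand-in for /etc/apt/sources.list
cat >> "$sources" <<'EOF'
deb http://ppa.launchpad.net/chromium-daily/ppa/ubuntu intrepid main
deb-src http://ppa.launchpad.net/chromium-daily/ppa/ubuntu intrepid main
EOF
tail -n 2 "$sources"   # show what was added
```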

Now you need to add the repository key. Simply execute this command:

sudo apt-key adv --recv-keys --keyserver keyserver.ubuntu.com fbef0d696de1c72ba5a835fe5a9bf3bb4e5e17b5

Great! The repository is now installed and verified. Now, simply update the repositories and install the package by running the following commands:

sudo apt-get update
sudo apt-get install chromium-browser

That’s it! Chromium should now be installed on your system, ready for you to play around with. Enjoy.

As a follow-up to my previous post on using Handbrake to rip DVDs, I wanted to do a short write-up on how to use a program called DeVeDe to restore those MKV, AVI, and MP4 files that you ripped earlier back to a DVD that you can use on any DVD player.

Before finding DeVeDe, I had been looking for a good DVD-creation solution on Linux for a while, but nothing had really impressed me. The options generally had clunky, bloated UIs and didn’t support a wide range of file formats. DeVeDe changes all that: it uses the same mencoder backend that Handbrake does, allowing it to support a wide range of files (pretty much anything mencoder supports). It also sports a simple but powerful UI, letting you customize the menu however you like and build very complex DVD title structures, all while not being overly complex for entry-level users, and pretty enough that it doesn’t burn your retinas to look at it.

Sound good? Then let’s get started. First, you of course need to install it. To do this on Ubuntu (Hardy/Intrepid/Jaunty, probably others as well), simply open up a terminal and execute the following command:

sudo apt-get install devede

That’s it! Alternatively, you can install it through Synaptic by searching for devede and installing the package. But what fun is that? 😛

Now that DeVeDe is installed, let’s open it up and take a look.

Select Disc Type - DeVeDe

As you can see, you’ll first be prompted for what kind of CD/DVD you want to make. For this tutorial, we will assume you’re making a normal DVD, but there are a lot of other options you can follow if you wish.

Main Screen - DeVeDe

Now we are presented with the home screen, the place where all the magic happens. You start out with the simplest DVD possible: a single DVD title, generically named, with a simple default menu. From here, you can do pretty much anything you want. In the interest of keeping this simple, we will assume that you just want to burn a backup of a single movie. First things first: let’s name the title. To do this, simply click on Properties.

Title Properties - DeVeDe

Here, simply enter whatever you want the title to be named, and select the action you want taken after it’s finished (I would suggest just going to the menu afterwards). After you’re done, click OK.

We now need to add a video file to the title. To do this, simply click the Add button under the Files box on the right.

File Properties - DeVeDe

Click the file dialog button and select your video file. I would also suggest changing the format from PAL to NTSC if you live in the U.S., since most DVD players here expect NTSC content. If you know yours handles PAL, or it can handle both, then don’t worry about it. If you do need to change to NTSC and you’re adding a lot of video files, you can make this the default on the home screen. From the add file dialog, you can also choose which audio track you want to use (if there are multiple), and you can add your own custom subtitle files simply by clicking the add button next to the subtitle box and selecting the sub file. There are also a number of very useful advanced settings that you can mess around with if you feel so inclined (the defaults have worked for me, though). Before you finish, I would advise clicking the Preview button as well. It will encode a sample of the video with your settings and play it back, so that you can preview what the DVD will look like when finished and make sure everything is in sync (a very handy feature!). Once you are satisfied with your settings, simply click OK.

Now, you need to configure your menu. For me, I really don’t care what the menu looks like, so I just leave the default in. However, I’m sure there are many out there who don’t share my thoughts, and would like to customize away. If so, simply click the Menu Options button at the bottom of the home screen.

Menu Options - DeVeDe

From here, you can make pretty much any change you want to. Add music, add a custom background, title the Menu, change the font, everything. I won’t go through this in depth, but you can play around with it and see what happens! You can also preview the menu from here, so you can see what it looks like as you’re making it.

You’re almost done now! The last thing you need to check is under the Advanced Options tab at the bottom. If you have a multicore CPU, I would advise selecting the Use Optimizations For Multicore CPUs option. This will greatly speed up your disc creation time. Once you’ve checked this, go ahead and click Forward.

Final Disc Structure - DeVeDe

You will now be prompted with where to save the ISO image of the DVD. An ISO image, for those who don’t know, is basically a bit for bit copy of a DVD, and we will use it to actually burn our DVD.

Save ISO - DeVeDe

Once done, just click OK and go get a cup of coffee. It will be a little while, as DeVeDe needs to encode your video into the proper format.

After it finishes, get a DVD and insert it into your DVD burner. Open up the folder where you saved the ISO, double click the file (right click->Disk Burner on Jaunty), and click Burn. Wait for it to finish, and then you’re done! Go plug it into any DVD player, and it should work like any other disc.
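If you’d rather burn from the terminal, growisofs (from the dvd+rw-tools package) can write DVD ISOs directly. The sketch below is shown in dry-run form with a hypothetical ISO path; remove the echo and point it at your real device and image to actually burn:

```shell
#!/bin/sh
# Dry-run sketch: print the burn command instead of running it.
# Adjust /dev/dvd and the ISO path for your system, then drop the echo.
iso=/tmp/movie.iso   # hypothetical path to the ISO DeVeDe produced
echo growisofs -dvd-compat -Z "/dev/dvd=$iso"
```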

Image Burning

And that’s it! I hope this was helpful to some of you out there wondering how to create DVDs in Ubuntu, feel free to ask if you need help or clarification.

A few days ago, AT&T announced the specifics of a trial of their new pricing program and, in true AT&T fashion, continued their rape of the American consumer in another attempt to keep us in their profitable technological dark age. I suppose that may be a little harsh, but hey, I am not so happy right now. I’m sure you will forgive me my moment of rage. So what is their new, creative pricing plan? Make you pay more (a lot more, of course) to get the service you have today.

Apparently, AT&T has decided that they are tired of people’s access to “unlimited” broadband services (godless freeloaders), so they have decided to start running trials in which users are charged by the gigabyte for Internet access. What this essentially means is that without any discernible increase in the cost of providing their service, they have taken it upon themselves to greatly increase its cost to the average consumer. To get the same unlimited access that you’re paying 20-35 dollars a month for now, you will have to pay (at least) 150 dollars a month to AT&T in the future. Let me repeat that: AT&T is implementing a roughly 500% price hike for no apparent reason. Well, other than greed, that is.

Now, to be fair, AT&T has put forth a few arguments on why this price hike is necessary. The first and foremost among these is that people are actually using the bandwidth that they paid for. And they can’t have that. They first tried to bump up their profit margins by trying to force web companies like Google and Yahoo to pay more to have their traffic prioritized, a potentially disastrous proposal for the internet as a whole and a definitive death blow to the cause of net neutrality. Unfortunately, a do-nothing Congress rejected the net neutrality bill that would have prevented such a thing from taking place, but at least it restrained itself from making the telecoms’ brilliant idea law. Because of this setback, AT&T and the telecoms were forced to go back to the drawing board. Looks like they’ve decided that if they can’t take from the provider side, they’re going to take from the consumer. And they want it all.

The second most quoted reason for the price hike is much more sickening, however. AT&T has gone around proudly declaring that they need the money to keep pace with technological innovation, so as to continue to provide their customers with “superior service.” Hrm…you mean like the $200 billion in taxpayer money you and your telecom buddies were given back in the ’90s to achieve the goal of 86 million U.S. homes with symmetrical 46 Mbps internet connections by 2006? Or was that not quite enough for you? While they’ve sat counting the money they robbed from the American people, we have quickly slid from 1st worldwide in broadband penetration to 25th. And please, spare me the “we’re too large of a country, Japan has it so easy” rhetoric. In case you didn’t realize, Japan is a country about the size of California, and I don’t think any Californians have yet been blessed with the 100 Mbit/s internet connections that most Japanese citizens enjoy. Oh, and telecom companies? I’m still waiting for my $2,000 refund check (or, preferably, my faster internet connection). And don’t think I’ll forget.

Their third line of reasoning is just silly. AT&T argues that because it worked in Europe (an arguable point) and on cell phones (a ridiculous point), it should now be the rule rather than the exception. I can’t help but wonder at how they decided that Europeans liked paying more for their internet connections. My guess is that their definition of “worked” is that people didn’t storm their corporate offices with pitchforks. Well, either that or the entire continent is comprised of masochists. You can decide which is the more likely scenario. But believe me, if I or any of my friends could have an unlimited 3G connection that didn’t cost an arm and a leg, we would subscribe in a heartbeat. However, providers just will not do that, regardless of the MASSIVE consumer interest in such a service. Why? They make more money per kilobyte when they charge by the kilobyte than when they give people an unlimited pass. It has nothing to do with need, or an increase in traffic, or a better way of thinking about providing internet service for consumers. It is about them padding their already enormous profits with more of your hard earned money.

I mentioned in the title of this post that this was a cautionary tale. I want to clarify what I mean by that. First, I want to caution the American people: if we continue to let the large corporations in this country dictate the progress of technological innovation for their own gain, we will fall further and further behind the rest of the world. Technology, with all its benefits, has made this country the great place that it is, and to let that slip away for the short-term profit of a wealthy few would be one of the worst decisions we could make. The hard economic times that we are now in would devolve into something much worse without our technological upper hand. Second, I want to caution the telecom companies, specifically AT&T: be careful on the ground on which you tread. You’ve already been lucky that the U.S. government has not taken action against you for your monopolistic business practices now and your blatant fraud back in the ’90s. Price gouging your customers to the point of ridiculousness while simultaneously stealing their tax dollars is not going to win you any friends. Eventually, your misdeeds and lies will come into the public light (probably after you’ve pushed people’s pocketbooks just a bit too far), and people will be calling for heads to roll. When that happens, you’re going to need all the friends you can get.

</rant>

It seems that not a week goes by anymore that I don’t find some new, fun trick to do with SSH. A few weeks ago, I found one that has been especially useful to me.

I was sitting in the Tulsa International Airport, once again wishing that airports would just suck it up and provide free wireless access throughout their terminals. It’s a real pet peeve of mine, as layovers become so much more painful when I can’t waste away my time stumbling about the internet. I might even have to do something *shudder* productive…

Anyway, there I was, sipping some coffee and working on a project, when I noticed that there was an open wireless network available that was not one of those god-forsaken Boingo hotspots. Being the curious person that I am, I decided to see if I could connect. Sure enough, it let me right on. Being the cautious person I am, I went to an HTTPS-secured site to see what would happen. And sure enough, the normally valid certificate was invalid, pretty much guaranteeing someone was trying to listen in. I was still happy though; at least I still had internet access and could keep myself mildly entertained with that.

However, I was feeling especially curious that day, so I decided to try to tunnel my traffic over SSH to a box back in my apartment, keeping my oh-so-precious personal data away from prying eyes. Besides, it beats working. After a little digging through man pages, this task, to my surprise, turned out to be much simpler than I had expected. All you need is one SSH command and an SSH server that you have access to and that has forwarding enabled (the default OpenSSH installation on Ubuntu does).

If you don’t have an SSH server set up and you’re using Ubuntu at home, simply execute this on your home machine:

sudo apt-get install openssh-server

This will install and start the service. Make sure that (a) your user password is of decent strength (SSH is a common target for password bruteforcing) and (b) you have port 22 forwarded on your router if you are behind a NAT, so that you can access it from outside of your local network. The SSH client should already be installed on a default Ubuntu install (you can also do this using PuTTY on Windows).
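As an aside, if you later set up key-based logins, you can take the password-bruteforcing worry off the table entirely with a couple of sshd_config tweaks. This is a sketch only: make sure your key login actually works before disabling passwords (or you’ll lock yourself out), restart the SSH service afterwards, and forward whichever port you pick on your router:

```
# Additions to /etc/ssh/sshd_config (a sketch)
PasswordAuthentication no    # key-only logins; password guessing gets nothing
Port 2222                    # optional: moves sshd off the default port
```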

Once you have these two things ready, just open up terminal on your laptop/netbook/mobile device and type the following:

ssh -Nf -D randPortNum remote-username@ssh.server.com

Replace randPortNum with a port number of your choosing (something above 1024 if you are not root, which is probable), remote-username with your username on the remote system, and ssh.server.com with the hostname or IP address of your SSH server. If you are using your home server, I’d suggest using DynDNS to get a simple domain name to access it with. If you do not feel very comfortable with the command line, or you are lazy like me (I hate having to close the window after I’m done…), you can execute this command using Alt+F2, and the SSH client will prompt you for your password.

Now let me explain what exactly this command is doing. The f flag tells SSH to fork into the background once you’ve authenticated, and the N flag tells it not to run any remote command; together they let the tunnel sit quietly while you do whatever you want after you execute it. Close the terminal, keep using it for something else, anything you please (just not killall ssh!). The D flag is the one doing the really interesting stuff: the OpenSSH developers decided it would be cool to put SOCKS proxy functionality straight into the client, and the D flag is how you access it. Basically, you are just telling SSH to start “local dynamic application-level port forwarding” (a SOCKS proxy) on the specified port on your local machine. Now, any program on your computer that supports SOCKS proxies will be able to connect to that port on your machine and have its traffic automagically forwarded (and encrypted!) across the internet to your remote machine, where it will then go out to its destination.
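A quick way to sanity-check the tunnel from a terminal: curl speaks SOCKS natively. The sketch below is in dry-run form (it just prints the command); drop the echo once your ssh -D tunnel is actually running, and swap in your own port. As a bonus, --socks5-hostname also pushes DNS lookups through the proxy rather than the local network.

```shell
#!/bin/sh
# Dry-run sketch: print a curl invocation that would fetch a page
# through the local SOCKS proxy. Remove the echo to actually test.
port=1080   # whatever you picked for randPortNum
echo curl --socks5-hostname "localhost:$port" http://example.com/
```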

To add to it, tons of programs do support SOCKS proxies, more than you might think. Firefox, Opera, Pidgin, Deluge, Transmission (Tracker only), the list goes on. On top of that, using some programs (like tsocks) you can actually use any TCP based program over it. Very cool stuff.
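For the tsocks route, the configuration is tiny. A minimal /etc/tsocks.conf pointing at the tunnel might look like the sketch below (key names as in the tsocks man page; the port is assumed to match your randPortNum):

```
# Minimal /etc/tsocks.conf (a sketch)
server = 127.0.0.1
server_port = 1080
server_type = 5
```

With that in place, prefixing a command with tsocks should route its TCP traffic through the tunnel.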

To go ahead and encrypt your web traffic, open up Firefox (if you need Opera instructions, they’re probably very similar). Go to Edit->Preferences->Advanced->Network->Settings (Configure How Firefox Connects To The Internet). Select “Manual proxy configuration”, enter “localhost” for your SOCKS host and the port number you chose earlier as your port. Either SOCKS 4 or 5 should work (I use 5). It should now look similar to the picture below:

An Example Configuration

Now just click OK, close out the Settings dialog, and you’re done! Go here and check it out: your IP is now the same as the remote host’s. If you’re really paranoid, you can also make Firefox tunnel your DNS queries over the proxy. This prevents the local network’s nameserver from feeding you bad DNS information or keeping tabs on what you are viewing (you are still relying on the remote nameserver being trustworthy, though :P). To do this, open up a tab, enter the address “about:config”, search for “network.proxy.socks_remote_dns”, and set it to true. And that’s it!

This trick can be immensely useful in many situations, from securing your traffic across untrusted local networks, to getting around packet shaping/filtering, to remaining anonymous online. I now use it all the time on my laptop, and very rarely trust the local network. A word of warning before I sign off, though: I was lucky on that hotspot because the attacker was not trying to launch a MITM attack against my SSH traffic. If they had, the keys would not have matched my previous connections to my SSH server, the SSH client would have warned me in big bold letters that I was being listened in on, and it would have quit. In that situation, securing your traffic may be more difficult, but not impossible. I may post later on how one might go about this.

Anyway, hope someone else finds this as useful and interesting as I do. As always, feel free to ask if you have any questions.

UPDATE 04/15/2010: I have done a follow-up post to this article describing how you can use proxychains to allow any program that uses TCP sockets to tunnel traffic over SOCKS proxies, not just ones that have built-in proxy support. I also show how to chain multiple proxies together.

Wow, you know you haven’t posted in a while when the intro paragraph to your next post talks about how Christmas went. In case anyone still cares now that it’s almost Easter, it went well. Very well. I still want to take this time to thank Santa for his enormous generosity this past year, as he was kind enough to get me the netbook that had been dancing around in my dreams for a while: the Eee PC 1000.

I’ve spent the past few months playing around with my shiny new Eee PC, and I am duly impressed. Wireless N, an 8GB SSD plus 32GB of built-in flash, 7 (yes, count them, 7) hours of battery life, Bluetooth, a webcam and mic; the list goes on and on. All of this technological goodness kept within a sleek, 12-inch-wide frame that even Steve Jobs might not deem “junk”. Oh, and did I mention that all of this wonderful hardware has native Linux driver support? Can you say “portable hackstation”?

Yes, it was a good Christmas for this Linux user, and judging from my experience with the Eee PC 1000, it’s been a good year for Linux users in general. With netbooks being the fastest growing segment in the computing arena, Linux’s superior memory and power management, combined with its endless configurability and ever-improving usability, is starting to make Microsoft fear the penguin more than usual. This is not without reason: Ubuntu 8.10 has completed my netbook.

Now, before you all cry out in unison that I can get netbooks with Linux preinstalled, I know. In fact, mine came that way. However, the distribution that shipped with my Eee PC made it feel less like a computer and more like a toy, and a very useless one at that. I really hope that Asus wises up and starts shipping something that isn’t intentionally crippled for some misguided notion of usability. I am thoroughly convinced that an install of Ubuntu would have been easier for anyone to use than the worthless POS that came preinstalled.

However, as great a fit as the Ubuntu/Eee PC union is, it was not without some small hurdles to overcome first. The following is a short documentation of how to take your nifty new Eee PC and install the latest release of Ubuntu, Intrepid Ibex.

As I’m sure you’ve figured out, installing from CD isn’t going to work so well without a CD drive, so we first need to find another way to get Ubuntu onto the netbook. The easiest way to do this is with a flash drive. There are many ways to get Ubuntu on a flash drive, as documented here, but I will only be covering how I did it, using the installation tool built into Ubuntu. If you don’t have a flash drive, well, buy one. Seriously, it’s like 5 bucks.

Once you’ve gotten a hold of a flash drive, make sure you’ve backed up any important files, because we’re going to wipe it and put Ubuntu onto it. You are also going to need to get an ISO of the latest version of Ubuntu, 8.10 (32-bit), from here. While that’s downloading, you might run off and get an ethernet cable if you don’t have one; you’ll need it later.

It should be mentioned at this point that there are lots of ready-made distros out there specifically for the Eee PC, including a number based off of Ubuntu. In addition, a default installation of Ubuntu does not have driver support enabled for all of the Eee PC’s components. However, these ready-made distributions strip out a lot of kernel features that you may need at some point, so for most users it’s a better idea to install the standard edition and then add a custom kernel. After all, it would be rather annoying if, for all the Eee PC’s portable goodness, you plugged in some device that normally works under a standard Ubuntu 8.10 install only to find out that support for it has been removed. It’s better to at least have a backup of the original kernel, with all of its driver support, and then run a slimmed-down version with the Eee PC drivers compiled in for day-to-day use. Now, I know what you’re thinking to yourself right now: “I have to replace my kernel just to get this working? What is this, Gentoo?” Do not fear, the Ubuntu community has your back, and has made this process a piece of cake.

Now that you have the ISO downloaded, we can move on to the fun part – installing it on a USB drive. If you already have Ubuntu installed on your desktop/laptop, then you’re all set to start. If not, you need to burn the ISO to a CD, and then boot into it before you can start. Once you have Ubuntu up and running, go to System -> Administration -> Create A USB Startup Disk. This will look slightly different on the Live CD, as you don’t have to select an ISO (it uses itself), but the concept is the same:

Now, simply select the ISO file that you downloaded and the USB drive that you want to install to, and click “Make Startup Disk”. Go get yourself something to eat, as this can take a while, depending on the speed of the drive.

You should now have a bootable USB drive with Ubuntu 8.10 installed; congratulations! You’re well on your way to having it up and running. Now, go ahead and plug it into your Eee PC and power it up. You may need to set the USB drive as the default boot device in the BIOS, so it’s best to check; F2 at the bootup screen does the trick. For some reason, my Eee PC reports USB drives as hard drives, so I would check to make sure that USB is first in the Hard Disk boot priority list.

Once you’ve booted into Ubuntu from the USB drive, simply install Ubuntu as you normally would, by clicking the Install icon on the desktop and following the prompts. Make sure that the 8GB drive is the one your root partition is installed to; installing it elsewhere can result in slow performance and possibly data loss later on.

Restart, and you’re almost done! Hook up your Eee PC to a wired connection (your wireless most likely won’t work), and follow these instructions to install the custom Eee kernel.

That’s it! I hope you all have found this informative, and I know you will all enjoy Ubuntu on your Eee PC as much as I have.

If you want some tips on configuring your Ubuntu install to deal with the small screen, please see the Ubuntu wiki. Its tips really helped me, and I’m sure they will be of use to all of you as well.

And they said it would never pay off…:P . But in all seriousness, welcome to all who have now blindly stumbled upon my tiny blog in the middle of the vast sea of cyberspace. Because of the copious amounts of schoolwork and research I’ve had on my plate the past few months, I haven’t set nearly enough time aside to update this blog. This saddens me greatly, so I’m going to begin a renewed effort to start posting my musings concerning technology and such again, and hopefully some of you might be able to glean a few pieces of advice and wisdom out of my incessant babbling. Now for something to write about… I guess I’ll have to see what sets my fingers typing next.

Until then, peace.