Category: Ubuntu


My Ubuntu file server is almost complete. With multiple terabyte hard drives in place and software for sharing files among the computers on the network, it provides nearly all the services a modern home network requires. However, one thing it is still missing is PVR functionality, which MythTV provides.

MythTV is one of the premier software packages to come out of the open-source movement. It has been developed by hundreds of individuals who work in their free time to create software that is useful to an even greater number of people. Built for the Linux platform, it is very robust and feature-filled. This power comes at a price, however, and MythTV is famous for being difficult to install and maintain. Originally I wanted to write a blog post about how I installed a TV tuner card and conquered MythTV to create an amazing home server package, but instead I need help.

After adding a Hauppauge HVR-1600 to a PCI slot in my mid-tower, I installed the drivers and firmware and set about installing MythTV. This has proven impossible, because I cannot run the setup program. As shown by the image at the top, whenever I run mythtv-setup through an X server session on my MacBook Pro, no video is output and the interface becomes unbearably slow. I’ve consulted many different forums and no one has been able to offer advice, so now I want to get help from the internet at large. Has anyone seen this while installing MythTV, and do you know how to solve it?

Here’s my hardware setup to clarify things.

  • OS: Ubuntu 9.04 (CLI only)
  • TV tuner: Hauppauge HVR-1600
  • Network: Gigabit Ethernet
  • Remote terminal OS: Mac OS X 10.6.4
  • Remote terminal hardware: MacBook Pro 13″ (Dec 2009, GeForce 9400M)

When running a website, it is very important to have a backup of all site files, to prepare for any event that may require reloading data (file corruption, moving hosts, etc.). After building my Ubuntu file server, I knew that I had to find a way to mirror this website so files could be recovered if necessary.

I looked at rsync and curlftpfs, but that combination was complex to set up. Soon after, I stumbled on ftpcopy and found it to be a good solution.

Here is a bash script to automate the process and create a daily mirror of whatever FTP server you want to back up.

  1. Download and install ftpcopy from the repositories.
    sudo apt-get install ftpcopy
  2. Change to a directory that will store the script and open a new text file.
    cd /path/to/directory
    vi ftpbackup
  3. Press i, then copy and paste the following text.
    #!/bin/bash

    USER=username
    HOST="website"
    PASS=password
    REMOTE="public_html/"
    DIR="/path/to/backup/$HOST"
    cd "$DIR" || exit 1

    # Issue ftpcopy command
    ftpcopy --no-delete -l 1 -u "$USER" -p "$PASS" "$HOST" "$REMOTE" .

  4. Be sure to change the values for the host, username, password, remote directory and backup directory.
  5. Save and exit the text file by typing :wq
  6. Make the script executable by typing chmod a+x /path/to/script
  7. Add the script to the crontab so it will be executed on a regular basis. I use Webmin for this type of administration work, but it is possible to use the command line. Use this example to sort out the format. Mine runs daily at 12 AM.
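For the command-line route, the crontab entry is a single line. As a sketch of the format (assuming the script lives at /path/to/ftpbackup and should run daily at midnight):

```
# m h dom mon dow  command
0 0 * * * /path/to/ftpbackup
```

Edit your crontab with crontab -e to add a line like this.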

To clarify, this script changes into the backup folder, then issues the ftpcopy command. The remote directory public_html is common on many web servers, but be sure to confirm yours before running the script. The --no-delete option means that files are not removed from the backup when they’ve been removed from the web server. The -l 1 option sets the log level so that the files being transferred are reported; this output can be viewed in your user mail.

After the first scheduled run, check the folder where your backups reside to make sure files are being added as planned.


Since getting my PS3, then setting up my Ubuntu file server, I’ve really enjoyed watching movies on my TV. Getting media from the computer to the PlayStation requires software that employs the UPnP protocol, in the form of a DLNA server.

If there are too many acronyms in there, just remember the name MediaTomb. MediaTomb is an open-source, cross-platform DLNA server that streams a variety of media formats across a local network to whatever compatible device you happen to have running at the end. It can stream video, music and photos in numerous formats, and will even transcode others so that they can stream as well. All this flexibility comes at the expense of user-friendliness, though. In most cases, the regular binaries for each operating system will do most of the tricks I mention here; to get the most out of the system, however, you need to compile from source.

As mentioned in my previous file server post, I’m running Ubuntu 9.04 Jaunty Jackalope on a “headless” Intel server, which I control via the terminal. Since the computer rarely restarts, I wanted MediaTomb to run as a daemon, which is where I ran into a problem. For some reason, Jaunty didn’t play nice with the standard daemon package, so I had to do a little digging to find the solution.

Steps

  1. Create a temporary working directory by issuing this command
    $ mkdir temp
  2. Install the ffmpegthumbnailer libraries with the command sudo apt-get install libffmpegthumbnailer and enter your admin password when prompted.
  3. Compile ffmpeg using the tutorial at Juliensimon.blogspot.com, but include the configure flag --enable-libffmpegthumbnailer. Don’t move on until the configure output confirms thumbnailer support.
  4. Compile and install the Mediatomb binaries from source — again, I used the excellent tutorial at Juliensimon.blogspot.com
  5. Check the functionality of MediaTomb by issuing the command $ mediatomb, then opening a web browser to http://ip_of_server:49152/
  6. To make the daemon work, first download the daemon package by issuing this command (one line)
    $ wget http://mirrors.kernel.org/ubuntu/pool/universe/m/mediatomb/mediatomb-daemon_0.11.0-3ubuntu2_all.deb
  7. Now extract the files in the package to the temporary directory created earlier
    $ dpkg-deb -x mediatomb-daemon_0.11.0-3ubuntu2_all.deb temp

    As you can see, the daemon package is just a collection of configuration files, so installing it properly is just a matter of copying the files back.

  8. Change to the temporary directory with the files by typing
    $ cd temp_directory_name
  9. Type these commands one at a time to copy the files back to their rightful place. Commands that wrap over two lines should be entered as a single command.
    $ sudo cp etc/mediatomb/config.xml /etc/mediatomb/config.xml

    $ sudo cp etc/default/mediatomb /etc/default/mediatomb

    $ sudo cp etc/init.d/mediatomb /etc/init.d/mediatomb

    $ sudo cp etc/logrotate.d/mediatomb /etc/logrotate.d/mediatomb

    $ sudo cp usr/share/doc/mediatomb-daemon/README.Debian /usr/share/doc/mediatomb-daemon/README.Debian

    $ sudo cp usr/share/doc/mediatomb-daemon/changelog.Debian.gz /usr/share/doc/mediatomb-daemon/changelog.Debian.gz

    $ sudo cp usr/share/doc/mediatomb-daemon/changelog.gz /usr/share/doc/mediatomb-daemon/changelog.gz

    $ sudo cp usr/share/doc/mediatomb-daemon/copyright /usr/share/doc/mediatomb-daemon/copyright

    If the copy comes back with errors about missing directories, use mkdir -p to create the requested folders first.

  10. Now the important step: setting the proper permissions on the folder /var/lib/mediatomb. Change into its parent directory by issuing
    $ cd /var/lib/
  11. The folder /var/lib/mediatomb should contain 3 files:
    $ ls mediatomb
    mediatomb.html
    sqlite3.db
    sqlite3.db-journal
  12. Change the ownership of the folder and its contents.
    $ sudo chown -R mediatomb:mediatomb mediatomb
  13. Change the permissions of the HTML file.
    $ sudo chmod 666 mediatomb/mediatomb.html
  14. Change the permissions of the remaining two files:
    $ sudo chmod 644 mediatomb/sqlite3.db
    $ sudo chmod 644 mediatomb/sqlite3.db-journal
  15. Make the script run at startup.
    $ sudo update-rc.d mediatomb defaults
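Incidentally, the eight cp commands in step 9 can be collapsed into a single loop. The sketch below builds a mock extracted tree (a stand-in for the dpkg-deb -x output) and copies into a scratch directory, so it can be tried without sudo; for the real install you would run the loop from the temp directory with the destination set to / and the commands prefixed with sudo.

```shell
# Mock extracted tree standing in for the dpkg-deb -x output (hypothetical files)
SRC=$(mktemp -d)
mkdir -p "$SRC/etc/mediatomb" "$SRC/etc/init.d"
touch "$SRC/etc/mediatomb/config.xml" "$SRC/etc/init.d/mediatomb"

# Scratch destination; set DEST=/ (and use sudo) for the real install
DEST=$(mktemp -d)

# Copy every extracted file into place, creating missing directories as we go
(cd "$SRC" && find . -type f | while read -r f; do
  mkdir -p "$DEST/$(dirname "$f")"
  cp "$f" "$DEST/$f"
done)

ls "$DEST/etc/mediatomb"   # → config.xml
```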

To start the server, simply issue the command sudo /etc/init.d/mediatomb start. If the server doesn’t start, view the MediaTomb log file to see what is happening.

$ vi /var/log/mediatomb.log

Exit with :q. I’ve been running this setup since I first built the server, and it has worked exceptionally well.


After planning and building a Ubuntu-based file server, it’s time to add software to the mix, so that the computer does the work it was originally meant for. In the case of my command-line-only server, the software I added was for file sharing and media serving.

The software I’m explaining here was all installed via the command line. To do any of this with the GUI of Ubuntu, simply open a terminal window.

Mediatomb

The priority of my project was that the media on the computer would be accessible on my PS3 via wireless network. To do that, I researched a number of software projects, and settled on MediaTomb because I had heard good things about it. MediaTomb has lots of great features including, but not limited to: web administration, daemon operation, transcoding and instantaneous library additions.

After finding an excellent tutorial on adding all the necessary and optional libraries to get every feature working, I had a good setup. I highly recommend reading that blog post and others on it to find out about all that MediaTomb can do. Essentially, the task involves downloading and compiling MediaTomb to use all the features. While initially the MediaTomb daemon did not run, I was able to fix the problem and got everything working. I’ll be sharing the process in an upcoming post.

Webmin

Using the command line doesn’t really bother me, but I wanted another way to administer the server over the network, through a browser. For that, I used Webmin, which lets users change many aspects of the system without using the terminal.

Installing Webmin is quite easy, because it is included in the Ubuntu repositories. Simply open a terminal window and type

sudo apt-get install webmin

Enter your admin password when prompted, and Webmin will be downloaded and installed automatically. Once it is installed, open a new browser window and navigate to https://ip_of_server:10000 then enter your user credentials. Once logged in, you can do many operations without using the terminal. I have found the best benefit so far to be local disk management — after all, mess that up, and your data vanishes.

ffmpeg

FFmpeg is a suite of libraries and applications for operating on video and multimedia files. Once installed with all the add-ons, nearly every type of video can be modified and converted. Once again, the blog of Julien Simon has an excellent tutorial on making the program work with Ubuntu. Following his instructions, you’ll have a fully operational FFmpeg installation for other applications to use.

The only thing I’ll add to his tutorial is a fix for ffmpegthumbnailer, which is necessary for MediaTomb thumbnails. Install ffmpegthumbnailer by running

sudo apt-get install libffmpegthumbnailer2

then make an addition to the ffmpeg configure step. Add --enable-libffmpegthumbnailer to the configure flags, and you should see ffmpegthumbnailer: yes when the configuration has finished.

Netatalk

Netatalk is an open-source file server that can be configured to use the Apple File Protocol. To do this, I followed these instructions, and was up and running in no time. Once installed, Netatalk is very easy to administer from the command line. I was able to create multiple shares for guest and user access and can now access my files from any Mac in the house.

To enable guest access, an additional element must be added to the /etc/netatalk/afpd.conf file. Add the following text

uams_guest.so

so that the entire line is

- -transall -uamlist uams_randnum.so,uams_dhx.so,uams_guest.so -nosavepassword -advertise_ssh

This will enable you to access shares without entering a password, provided the shares themselves are open to the user nobody.
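If you’d rather script that edit, a sed one-liner can append the module to the -uamlist value. This is only a sketch: it operates on a throwaway copy of the default line, but the same sed expression can be pointed at /etc/netatalk/afpd.conf (with sudo, after making a backup).

```shell
# Work on a throwaway copy of the default afpd.conf line
conf=$(mktemp)
echo '- -transall -uamlist uams_randnum.so,uams_dhx.so -nosavepassword -advertise_ssh' > "$conf"

# Append uams_guest.so to whatever module list follows -uamlist
sed -i 's/-uamlist \([^ ]*\)/-uamlist \1,uams_guest.so/' "$conf"

cat "$conf"
```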

rtorrent + screen

Torrents are a convenient way to download large files like Linux distributions, and a media server is a perfect platform for unattended downloads. rTorrent is a powerful but easy-to-use command-line client that can be customized for any situation. On its own, rTorrent will not run as a daemon, but coupled with screen, it can run in the background.

Install rTorrent and screen by executing the following commands in the Terminal.

sudo apt-get install rtorrent
sudo apt-get install screen

Once installed, start screen and attach rTorrent to it.

screen -S torrents
rtorrent

Detach the screen session by pressing ctrl + a, d and you’ll be back to the main prompt. Now rtorrent will remain running even when you disconnect from SSH.

To rejoin the rTorrent session, you must attach to it.

screen -r torrents

avidemux

Avidemux is a set of audio/video tools for the command line that can perform many operations. So far I’ve used it to shift audio inside video tracks. Like most other programs here, it simply requires a few arguments to do whichever job you need.

sudo apt-get install avidemux

mencoder

Slightly different from ffmpeg, mencoder is another suite of video conversion tools supporting a variety of formats. It is excellent for converting MKV or OGM files to AVI (the video stream does not need to be re-encoded). Again, it takes basic command-line arguments, and can be used in a screen session to work in the background.

sudo apt-get install mencoder

lm-sensors

An important part of running a headless media server is making sure the hardware is operating within its temperature constraints. A tool to monitor that is lm-sensors, which must be set up for the specific hardware in your computer. I could explain the system, but a post on the Ubuntu forums does a much better job. Once installed, simply issue the command sensors to see the temperature of various components and, if your motherboard supports it, the chassis fan speed.

Of course, this only begins to scratch the surface of possible software for a Ubuntu fileserver, but I think the applications shown are important for running a system without local input. Set them up, and enjoy central storage for all the computers in your house.


With the planning of a new computer done, it’s time to begin buying the parts and assembling them. The more you read about compatibility and user satisfaction, the better your final product will be. I highly suggest perusing Newegg.com (or .ca in Canada) to read about other users’ experiences with a prospective part. I used the site to read about parts, then sourced them locally for instant gratification.

The components

While I won’t talk about physically assembling the computer (there are hundreds of articles about that across the web), I will explain what parts are necessary and which ones I bought.

The Choices

I purchased these parts because each met very specific needs for a file server. The case includes 4 internal 3.5″ drive bays, with 3 × 5.25″ and 2 × 3.5″ external bays, leaving plenty of space for expansion. Additionally, the case includes a 120 mm chassis fan, meaning it moves a lot of air while remaining quiet. Cooling and noise are especially important for computers that will be on constantly.

The Intel E5200 was picked because its 45 nm manufacturing process means it requires less power and cooling than comparable processors. The fairly high clock speed is just a bonus, but one that allows this computer to work as a video-processing station. Remember, a few extra dollars spent at the outset means your system will likely satisfy your needs for much longer.

The hard drive choice is largely a matter of budget and ambitions, but I highly recommend a separate drive for the OS and main storage. Originally I didn’t really want a 500 GB boot drive, but my local store had a sale. The separate drives mean you can upgrade or even replace the operating system without touching your media files. It also means you can create filesystems like RAID or LVM without modifying your home folder.

The motherboard is perhaps the most important component in the build, and requires the most research. As previously mentioned in the planning post, the motherboard will make or break the connectivity of your machine, both to internal components and the network. The ASUS unit I chose has 4 SATA connectors and Gigabit Ethernet. It was one of the few — if not the only — motherboard I found that has both of these features and a MicroATX form factor. The gigabit connection means I can transfer data across the network at speeds of 40 MB/s!

A central storage database can make using multiple computers much simpler and convenient, and with properly chosen components, it can be built with a fairly small investment.

Disclosure
Newegg provides a small affiliate payout for items purchased through these links. I recommend Newegg because of their rapid shipping, low prices and excellent customer service.