Tag: Ubuntu


I’m an inquisitive person. Maybe it’s because I’m an engineer, but I like learning about new things, whether they’re related to cars, computers or technology in general. So recently when looking through the logs of my Ubuntu server, I discovered many, many failed login attempts for users that didn’t exist. Since my system is accessible via the internet through SSH port 22, I get lots of attempts from bots, or less likely, real people.

In the past I’ve looked up the recorded IP address on the internet and found the offending location (or at least an approximation) but now, with my recent affinity for Python, I decided to automate the process and start a log. Using the free API service from IPInfoDB.com, I wrote a Python script that runs hourly and parses the log file, looks up the location online, then stores everything in a MySQL database. Then I wrote a small PHP page to output the data to a browser in the form of the pretty maps and charts you see on this page. Now I can check the site occasionally and see where the activity is.
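The attached Python script does the real work, but the parsing idea can be sketched with standard command-line tools. This is a rough approximation, not the attached script; Ubuntu’s sshd records failed logins in /var/log/auth.log:

```shell
#!/bin/bash
# Rough sketch of the parsing stage: pull the source IPs of failed
# SSH logins out of auth.log and count the repeat offenders.
# (The full script also looks up each IP's location and logs to MySQL.)
grep 'Failed password' /var/log/auth.log \
  | grep -oE 'from ([0-9]{1,3}\.){3}[0-9]{1,3}' \
  | awk '{print $2}' \
  | sort | uniq -c | sort -rn | head
```

Swap 'Failed password' for 'invalid user' to tally attempted usernames instead of addresses.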

What does it all mean?

By looking at the map at the top, you’ll notice that most of the login attempts are concentrated in the United States and eastern Asia. I can’t really say I’m that surprised.

The sample is made up of 352 attempts since February 2nd. You’ll see in the chart that the top 5 countries are:

  • China – 126
  • United States – 77
  • Republic of Korea – 32
  • Turkey – 18
  • India – 13

It’s an interesting list, and I’m curious to see how the numbers will change over time.

Additionally, here is a chart of the top usernames used in the attempts. Not really a surprise here, either. Obviously “attackers”, for lack of a better word, want to gain full access to a machine by using the root username, and most of the other names are for services that typical servers have running (e.g. oracle, mysql, postgres). What I found most interesting are the one-off attempts: brian, richard, fluffy, stud, gnats and taylour, to name a few (really!). The top 5, once again, are:

  • root – 255
  • bin – 21
  • nagios – 14
  • oracle – 9
  • test – 4

So what can you do?

If you must have an open SSH connection to the internet, there are a few things you can do to protect your system. There are lots of websites that describe the steps in more detail (like this one, this one, or even this one). All of those sites share a few simple tips:

  1. Disable root access
  2. Use keys to access the system instead of passwords
  3. Drop the connection after x login attempts
  4. Use a non-standard port for SSH
  5. Enforce secure passwords
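For reference, here is roughly how tips 1 through 4 map onto /etc/ssh/sshd_config. These are standard OpenSSH options; the values shown are just examples, so pick what suits your setup:

```
# /etc/ssh/sshd_config (excerpt)
PermitRootLogin no          # tip 1: no direct root logins
PasswordAuthentication no   # tip 2: require keys instead of passwords
PubkeyAuthentication yes
MaxAuthTries 3              # tip 3: limit attempts per connection
Port 2222                   # tip 4: a non-standard port
```

Restart the SSH service after editing (sudo /etc/init.d/ssh restart) and test the new settings from a second session before closing your current one.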

I follow a few of those, mostly because my system isn’t important enough to warrant super-hardened measures. Disabling root access alone removes 99% of the threats, and requiring a key to log in removes the remaining 1%. I’m not particularly worried about users actually accessing my computer, and by following some of the measures above, that possibility is basically eliminated.

Update — September 23, 2012
A commenter requested the source code for the above system, so I’ve attached it as a zip file. It contains the Python script and web front end. I’ve written instructions as best I can, but please remember it’s a bit of a hack and therefore I can’t provide much support. Follow the readme and it should make sense.


When running a website, it is very important to have a backup of all site files, to prepare for any event that may require reloading data (file corruption, moving hosts, etc.). After building my Ubuntu file server, I knew that I had to find a way to mirror this website so files could be recovered if necessary.

I looked at rsync and curlftpfs, but that combination was complex to set up. Soon after, I stumbled on FTPcopy and have found a good solution.

Here is a bash script to automate the process and create a daily mirror of whatever FTP server you want to back up.

  1. Download and install FTPcopy from the repositories.
    sudo apt-get install ftpcopy
  2. Change to a directory that will store the script and open a new text file.
    cd /path/to/directory
    vi ftpbackup
  3. Press i, then copy and paste the following text.
    #!/bin/bash

    USER=username
    HOST="website"
    PASS=password
    REMOTE="public_html/"
    DIR="/path/to/backup/$HOST"
    cd "$DIR" || exit 1

    # Issue FTPcopy command (variables quoted in case of spaces)
    ftpcopy --no-delete -l 1 -u "$USER" -p "$PASS" "$HOST" "$REMOTE" .

  4. Be sure to change the values for website, storage directory, remote directory, host, username and password.
  5. Save and exit the text file by typing :wq
  6. Make the script executable by typing chmod a+x /path/to/script
  7. Add the script to the crontab so it will be executed on a regular basis. I use Webmin for this type of administration work, but it is possible to use the command line. Use this example to sort out the format. Mine runs daily at 12 AM.
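For the command-line route, the crontab entry for a daily midnight run would look something like the following (assuming the script was saved as /path/to/directory/ftpbackup; edit your crontab with crontab -e):

```
# m h dom mon dow  command
0 0 * * *          /path/to/directory/ftpbackup
```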

To clarify, this code changes into the backup directory, then issues the ftpcopy command. The remote directory public_html is common on many web servers, but be sure to confirm yours before running the script. The --no-delete option means that files are not removed from the backup if they’ve been removed from the web server. The -l 1 option sets the log level so that ftpcopy reports which files are being transferred — this output can be viewed in your user mail.

After the first scheduled run, check the folder where your backups reside to make sure files are being added as planned.


Earlier I wrote an AppleScript that goes online to TV.com and finds the episode titles for TV show video files. While that seemed to work properly, TV.com changed their format and my AppleScript went kaput. Since I really wanted to have this process automated, I wrote a bash script to do the same thing with the command line.

The result is a Ubuntu bash script that renames all the formatted files in a folder with the actual episode titles. Right now it requires Linux because it uses wget and XMLStarlet to download the file data, but I may release an additional script that works with other systems.

The entire script is made possible by the excellent XML feed service by TVRage.com.

Download the script

Prerequisite

XMLStarlet
XMLStarlet is a small command-line utility that can process XML files and text. It is required to traverse the XML structure of the TVRage.com data. To download this utility in Ubuntu, simply use the repositories.

sudo apt-get install xmlstarlet

Usage

Change paths where appropriate.

  1. Save the script to a known folder, change into that folder, and make it executable by issuing the following command
    chmod a+x ./tvrenamer.sh
  2. Change the current directory to the folder that contains the video files.
    cd Television/Season\ 1
  3. Rename all the files in the folder to use the format SxxExx.extension
    S08E01.avi
    S08E02.avi
    S08E03.avi
    S08E04.avi
    S08E05.avi
    S08E06.avi
    S08E07.avi
    S08E08.avi
    S08E09.avi
    S08E10.avi
  4. Call the script and append the name of the show to the end of the command.
    /path/to/script/tvrenamer.sh Simpsons
  5. Watch as the shows all magically change their name.
    Downloading show data for 'Simpsons'...
    Downloading episode guide...
    Simpsons - S08E01 - Treehouse of Horror VII.avi
    Simpsons - S08E02 - You Only Move Twice.avi
    Simpsons - S08E03 - The Homer They Fall.avi
    Simpsons - S08E04 - Burns, Baby Burns.avi
    Simpsons - S08E05 - Bart After Dark.avi
    Simpsons - S08E06 - A Milhouse Divided.avi
    Simpsons - S08E07 - Lisa's Date with Density.avi
    Simpsons - S08E08 - Hurricane Neddy.avi
    Simpsons - S08E09 - El Viaje Misterioso de Nuestro Jomer (The Mysterious Voyage of Homer).avi
    Simpsons - S08E10 - The Springfield Files.avi

If you wish to run the script simply by typing its name (tvrenamer, for example), issue the following two commands:

sudo cp /path/to/script/tvrenamer.sh /usr/local/bin/tvrenamer
sudo chmod a+x /usr/local/bin/tvrenamer

From this point, you simply need to use tvrenamer "TV Show".

The script reads all files in the folder, but will only rename files that are in the S**E** format. TV show titles must have escaped spaces to search properly, or be surrounded in quotes.
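For the curious, the filename matching at the heart of the script can be sketched in a few lines of bash. This illustrates only the SxxExx pattern test, not the full renamer:

```shell
#!/bin/bash
# Illustration only: detect files in SxxExx form and extract the
# season and episode numbers with bash's =~ regex operator.
for f in *.avi; do
  if [[ $f =~ ^S([0-9]{2})E([0-9]{2})\. ]]; then
    echo "season ${BASH_REMATCH[1]}, episode ${BASH_REMATCH[2]}: $f"
  fi
done
```

Files that don’t match the pattern (notes.txt, say) fall through the test and are left untouched.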


Since getting my PS3, then setting up my Ubuntu file server, I’ve really enjoyed watching movies on my TV. To get the media from computer to Playstation requires the use of software that employs the UPnP protocol, in the form of a DLNA server.

If there are too many acronyms in there, just remember the name Mediatomb. Mediatomb is an open-source, cross-platform DLNA server that streams a variety of media formats across a local network to whatever compatible device you happen to have running at the end. It can stream video, music, and photos in numerous formats, and will even transcode others so that they can stream as well. All this flexibility comes at the expense of user-friendliness, though. In most cases, the regular binaries for each operating system will do most of the cool tricks I mention here. To get the most out of the system, however, requires you to compile from source.

As mentioned in my previous file server post, I’m running Ubuntu 9.04 Jaunty Jackalope on a “headless” Intel server, which I control via the terminal. Since the server stays on around the clock, I wanted Mediatomb to run as a daemon, which was where I ran into a problem. For some reason, Jaunty didn’t play nice with the standard daemon package, so I had to do a little digging to find the solution.

Steps

  1. Create a temporary working directory by issuing this command
    $ mkdir temp
  2. Install the ffmpegthumbnailer libraries. Use the command sudo apt-get install libffmpegthumbnailer and enter your admin password when prompted.
  3. Compile ffmpeg using the tutorial at Juliensimon.blogspot.com but include the configure tag --enable-libffmpegthumbnailer. Don’t move on until the configuration confirms thumbnailer installation.
  4. Compile and install the Mediatomb binaries from source — again, I used the excellent tutorial at Juliensimon.blogspot.com
  5. Check the functionality of Mediatomb by issuing the command $ mediatomb then opening a web browser to http://ip_of_server:49152/
  6. To make the daemon work, first download the daemon package by issuing this command (one line)
    $ wget http://mirrors.kernel.org/ubuntu/pool/universe/m/mediatomb/mediatomb-daemon_0.11.0-3ubuntu2_all.deb
  7. Now extract the files in the package to the temporary directory created earlier
    $ dpkg-deb -x mediatomb-daemon_0.11.0-3ubuntu2_all.deb temp

    As you can see, the daemon package is just a collection of configuration files, so installing it properly is just a matter of copying the files back.

  8. Change to the temporary directory with the files by typing
    $ cd temp_directory_name
  9. Type these commands one at a time to copy the files back to their rightful place. Enter each command on a single line, even if it wraps on screen.
    $ sudo cp etc/mediatomb/config.xml /etc/mediatomb/config.xml

    $ sudo cp etc/default/mediatomb /etc/default/mediatomb

    $ sudo cp etc/init.d/mediatomb /etc/init.d/mediatomb

    $ sudo cp etc/logrotate.d/mediatomb /etc/logrotate.d/mediatomb

    $ sudo cp usr/share/doc/mediatomb-daemon/README.Debian /usr/share/doc/mediatomb-daemon/README.Debian

    $ sudo cp usr/share/doc/mediatomb-daemon/changelog.Debian.gz /usr/share/doc/mediatomb-daemon/changelog.Debian.gz

    $ sudo cp usr/share/doc/mediatomb-daemon/changelog.gz /usr/share/doc/mediatomb-daemon/changelog.gz

    $ sudo cp usr/share/doc/mediatomb-daemon/copyright /usr/share/doc/mediatomb-daemon/copyright

    If the copy comes back with errors about missing directories, you’ll likely have to use mkdir to create the requested folders.

  10. Now the important step is setting the proper permissions of the folder /var/lib/mediatomb. Change into that directory by issuing
    $ cd /var/lib/
  11. The folder /var/lib/mediatomb should contain 3 files:
    $ ls mediatomb
    mediatomb.html
    sqlite3.db
    sqlite3.db-journal
  12. Change the ownership of the folder and its contents.
    $ sudo chown -R mediatomb:mediatomb mediatomb
  13. Change the permissions of the HTML file.
    $ sudo chmod 666 mediatomb/mediatomb.html
  14. Change the permissions of the remaining two files:
    $ sudo chmod 644 mediatomb/sqlite3.db
    $ sudo chmod 644 mediatomb/sqlite3.db-journal
  15. Make the script run at startup.
    $ sudo update-rc.d mediatomb defaults
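As an aside, the copy commands in step 9 can be condensed into a single command using GNU cp’s --parents flag, which recreates each file’s directory structure under the destination. A sketch, run from inside the extracted temp directory:

```shell
#!/bin/bash
# Sketch: copy the extracted daemon files back onto the root
# filesystem in one pass. --parents recreates each etc/... path
# under the destination ("/").
sudo cp --parents \
  etc/mediatomb/config.xml \
  etc/default/mediatomb \
  etc/init.d/mediatomb \
  etc/logrotate.d/mediatomb \
  /
```

The documentation files under usr/share/doc can be copied back the same way.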

To start the server, simply issue the command sudo /etc/init.d/mediatomb start. If the server doesn’t start, view the Mediatomb log file to see what else is happening.

$ vi /var/log/mediatomb.log

Exit with :q. I’ve been running this setup since I first built the server, and it has worked exceptionally well.


After planning and building a Ubuntu-based fileserver, it’s time to add software to the mix, so that the computer does the work it was originally meant for. In the case of my command-line-only server, the software I added was for file sharing and media serving.

The software I’m explaining here was all installed via the command line. To do any of this with the GUI of Ubuntu, simply open a terminal window.

Mediatomb

The priority of my project was that the media on the computer would be accessible on my PS3 via wireless network. To do that, I researched a number of software projects, and settled on MediaTomb because I had heard good things about it. MediaTomb has lots of great features including, but not limited to: web administration, daemon operation, transcoding and instantaneous library additions.

After finding an excellent tutorial on adding all the necessary and optional libraries to get every feature working, I had a good setup. I highly recommend reading that blog post and others on it to find out about all that MediaTomb can do. Essentially, the task involves downloading and compiling MediaTomb to use all the features. While initially the MediaTomb daemon did not run, I was able to fix the problem and got everything working. I’ll be sharing the process in an upcoming post.

Webmin

Using the command line doesn’t really bother me, but I wanted another way to administer the server over the network through a browser. To do that, I used Webmin, which enables users to change many aspects of the system without using the terminal.

Installing Webmin is quite easy, because it is included in the Ubuntu repositories. Simply open a terminal window and type

sudo apt-get install webmin

Enter your admin password when prompted, and Webmin will be downloaded and installed automatically. Once it is installed, open a new browser window and navigate to https://ip_of_server:10000 then enter your user credentials. Once logged in, you can do many operations without using the terminal. I have found the best benefit so far to be local disk management — after all, mess that up, and your data vanishes.

ffmpeg

Ffmpeg is a suite of libraries and applications for operating on videos and multimedia. Once installed with all the add-ons, nearly every type of video can be modified and converted. Once again, the blog of Julien Simon has an excellent tutorial about making the program work with Ubuntu. Following his instructions, you’ll have a fully operational ffmpeg installation for other applications to use.

The only thing I’ll add to his tutorial is a fix for ffmpegthumbnailer, which is necessary for MediaTomb thumbnails. Install ffmpegthumbnailer by running

sudo apt-get install libffmpegthumbnailer2

then make an addition to the ffmpeg configure command. Add the text --enable-libffmpegthumbnailer to the configure stage, and you should see ffmpegthumbnailer — yes when the configuration has finished.

Netatalk

Netatalk is an open-source file server that can be configured to use the Apple File Protocol. To do this, I followed these instructions, and was up and running in no time. Once installed, Netatalk is very easy to administer from the command line. I was able to create multiple shares for guest and user access and can now access my files from any Mac in the house.

To enable guest access, an additional element must be added to the /etc/netatalk/afpd.conf file. Add the following text

uams_guest.so

so that the entire line is

- -transall -uamlist uams_randnum.so,uams_dhx.so,uams_guest.so - nosavepassword -advertise_ssh

This will enable you to access shares without entering a password, provided the shares themselves are open to the user nobody.

rtorrent + screen

Torrents are a convenient way to download large files like Linux distributions, and a media server is a perfect platform for unattended downloads. rTorrent is a powerful but easy to use command line client that can be customized for any situation. On its own, rTorrent will not run as a daemon, but coupled with screen, it can run in the background.

Install rTorrent and screen by executing the following commands in the Terminal.

sudo apt-get install rtorrent
sudo apt-get install screen

Once installed, start screen and attach rTorrent to it.

screen -S torrents
rtorrent

Detach the screen session by pressing ctrl + a, d and you’ll be back to the main prompt. Now rtorrent will remain running even when you disconnect from SSH.

To rejoin the rTorrent session, you must attach to it.

screen -r torrents

avidemux

Avidemux is a set of audio/video tools for the command line that can perform many operations. So far I’ve used it to shift audio tracks inside video files. Like most other programs, it simply requires a few arguments to do whichever job you need.

sudo apt-get install avidemux

mencoder

Slightly different from ffmpeg, mencoder is another suite of video-conversion tools for a variety of formats. It is excellent for converting MKV or OGM files to AVI (the video stream does not need to be re-encoded). Again, it requires basic command line arguments, and can be used in a screen session to work in the background.

sudo apt-get install mencoder

lm-sensors

An important part of running a headless media server is making sure the hardware is operating within its temperature limits. A tool to monitor this is lm-sensors, which must be set up for the specific hardware in your computer. I could explain the system, but a post on the Ubuntu forums does a much better job. Once installed, simply issue the command sensors to see the temperature of various components, and — if your motherboard supports it — the chassis fan speed.

Of course, this only begins to scratch the surface of possible software for a Ubuntu fileserver, but I think the applications shown are important for running a system without local input. Set them up, and enjoy central storage for all the computers in your house.
