One of the first posts I wrote for this blog, back in 2007, was about how to write and compile C programs using Mac OS X. That post was a result of me doing C programming in my first year of university, and since then plenty of things have changed. Not only have I finished university, but the basic procedure for installing Xcode has changed as well.
I thought I’d write a new tutorial updated for the more recent versions of OS X.
- Install Xcode from the Mac App Store
- Install the extra Command Line Tools from within Xcode by navigating to Xcode > Preferences > Downloads
- Create a new Xcode project by navigating to File > New > Project
- From the template list, under OS X, select Command Line Tool and choose Next
- Fill out the required forms, and under Type choose C
- Save the project to your computer
- Open main.c and write!
By default the toolbar with the Run button may be hidden, so you can press Cmd+R to compile and run directly.
The method in the original post continues to work, as long as Xcode is installed as described above. Comments on that post provide additional information, like this one from mvdhoef:
instead of resorting to the default a.out you can use gcc the way it was meant to be used!!!
gcc -Wall -o name file.c
where name is the name of the program gcc will create,
file.c is the file and perhaps extension of the code you have created.
-o sets the output file, in this case name (*if name already existed, it would be overwritten without a second thought)
-Wall is another option which tells gcc to show all warnings it encounters during compiling. (*this is optional)
Don’t forget to slack off while your code is compiling.
When running a website, it is very important to have a backup of all site files, to prepare for any event that may require reloading data (file corruption, moving hosts, etc.). After building my Ubuntu file server, I knew that I had to find a way to mirror this website so files could be recovered if necessary.
I looked at rsync and curlftpfs, but that combination was complex to set up. Soon after, I stumbled upon FTPcopy, which turned out to be a good solution.
Here is a bash script to automate the process and create a daily mirror of whatever FTP server you want to back up.
- Download and install FTPCopy from the repositories.
sudo apt-get install ftpcopy
- Change to a directory that will store the script and open a new text file in vi. Press i to enter insert mode, then copy and paste the following text.
# Change into the backup folder
cd /path/to/backup

# Issue FTPcopy command
ftpcopy --no-delete -l 1 -u $USER -p $PASS $HOST $REMOTE .
- Be sure to change the values for website, storage directory, remote directory, host, username and password.
- Save and exit the text file by pressing Esc and typing :wq
- Make the script executable by typing
chmod a+x /path/to/script
- Add the script to the crontab so it will be executed on a regular basis. I use Webmin for this type of administration work, but it is possible to use the command line. Use this example to sort out the format. Mine runs daily at 12 AM.
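For the crontab format, an entry that runs the backup script every day at 12 AM would look something like this (the script path is a placeholder):

```
0 0 * * * /home/user/scripts/ftpmirror.sh
```

The five fields are minute, hour, day of month, month, and day of week; 0 0 means midnight every day.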
To clarify, this code changes directories into the backup folder, then issues the ftpcopy command. The remote directory of public_html is common on many web servers, but be sure to confirm before running the script. The --no-delete option means that files are not removed from the backup if they’ve been removed from the web server. The -l option sets the log level so FTPcopy reports which files are being moved; this output can be viewed in your user mail.
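Putting the pieces together, the whole script might look something like this sketch (every value shown is a placeholder to replace with your own details):

```shell
#!/bin/bash
# FTP mirror script sketch -- all values below are placeholders
USER=ftpuser
PASS=secret
HOST=ftp.example.com
REMOTE=public_html
BACKUP=/home/user/backups/mysite

# Change into the backup folder
cd "$BACKUP" || exit 1

# Issue FTPcopy command
ftpcopy --no-delete -l 1 -u "$USER" -p "$PASS" "$HOST" "$REMOTE" .
```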
After the first scheduled run, check the folder where your backups reside to make sure files are being added as planned.
Earlier I wrote an AppleScript that goes online to TV.com and finds the episode titles for TV show video files. While that seemed to work properly, TV.com changed their format and my AppleScript went kaput. Since I really wanted to have this process automated, I wrote a bash script to do the same thing with the command line.
The result is an Ubuntu bash script that renames all the formatted files in a folder with the actual episode titles. Right now it requires Linux because it uses wget and XMLStarlet to download the file data, but I may release an additional script that works with other systems.
The entire script is made possible by the excellent XML feed service by TVRage.com.
XMLStarlet is a small command-line utility that can process XML files and text. It is required to traverse the XML structure of the TVRage.com data. To download this utility in Ubuntu, simply use the repositories.
sudo apt-get install xmlstarlet
Change paths where appropriate.
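The core of the script boils down to three steps: look up the show’s ID, download the episode guide, and rename any file matching the season/episode pattern. Below is a rough sketch of that logic, assuming TVRage’s search.php and episode_list.php feeds; the URLs and XML element names here are from memory, so verify them against the actual feed before relying on this.

```shell
#!/bin/bash
# tvrenamer.sh (sketch) -- the TVRage endpoints and XML paths below are assumptions
SHOW="$1"
BASE="http://services.tvrage.com/feeds"

echo "Downloading show data for '$SHOW'..."
SID=$(wget -qO- "$BASE/search.php?show=${SHOW// /%20}" \
      | xmlstarlet sel -t -v "//show[1]/showid")

echo "Downloading episode guide..."
GUIDE=$(wget -qO- "$BASE/episode_list.php?sid=$SID")

for f in *; do
    # Only rename files that contain an SxxEyy pattern
    if [[ $f =~ S([0-9][0-9])E([0-9][0-9]) ]]; then
        season=$((10#${BASH_REMATCH[1]}))
        episode=$((10#${BASH_REMATCH[2]}))
        title=$(echo "$GUIDE" | xmlstarlet sel -t \
                -v "//Season[@no='$season']/episode[seasonnum='$episode']/title")
        if [ -n "$title" ]; then
            mv "$f" "$SHOW - S${BASH_REMATCH[1]}E${BASH_REMATCH[2]} - $title.${f##*.}"
        fi
    fi
done
```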
- Save the script to a known folder, change into that folder, and make it executable by issuing the following command
chmod a+x ./tvrenamer.sh
- Change the current directory to the folder that contains the video files.
cd Television/Season\ 1
- Rename all the files in the folder to use the SxxEyy format (for example, S08E01.avi)
- Call the script and append the name of the show to the end of the command.
- Watch as the shows all magically change their name.
Downloading show data for 'Simpsons'...
Downloading episode guide...
Simpsons - S08E01 - Treehouse of Horror VII.avi
Simpsons - S08E02 - You Only Move Twice.avi
Simpsons - S08E03 - The Homer They Fall.avi
Simpsons - S08E04 - Burns, Baby Burns.avi
Simpsons - S08E05 - Bart After Dark.avi
Simpsons - S08E06 - A Milhouse Divided.avi
Simpsons - S08E07 - Lisa's Date with Density.avi
Simpsons - S08E08 - Hurricane Neddy.avi
Simpsons - S08E09 - El Viaje Misterioso de Nuestro Jomer (The Mysterious Voyage of Homer).avi
Simpsons - S08E10 - The Springfield Files.avi
If you wish to access the script simply by typing its name (tvrenamer, for example), issue the following two commands:
sudo cp /path/to/script/tvrenamer.sh /usr/local/bin/tvrenamer
sudo chmod a+x /usr/local/bin/tvrenamer
From this point, you simply need to use
tvrenamer "TV Show"
The script reads all files in the folder, but will only rename files that are in the SxxEyy format. TV show titles must have escaped spaces to properly search for the show, or be surrounded in quotes.
Building websites is a rather evolutionary thing. Code is reused and improved continuously, and after a few sites that used a similar code structure, I decided to create a separate subdomain of my site to sell these scripts for $1.
The wesg Script Directory contains all the various scripts and programs I’ve created, and I will add to it whenever a new one is made. Each script has a demo for you to try before purchasing, and payments are handled through PayPal.
Right now I only have a PHP/MySQL-based events organizer, but in the coming days/weeks I’ll have a login script there too.
Need a base to start your website? Check it out!
If you’ve worked on a website, had someone build a website for you, or even purchased web hosting, you’ve probably heard the “buzz-phrase” Search Engine Optimization. What exactly does it mean?
SEO, as it is commonly known, is the practice of editing and building a website so that traffic can be increased. At its foundation, it is based on modifying and building code so that search engines such as Google, Yahoo, MSN, and Ask can better crawl your site in order to properly place it in search results. The easier it is for search engines to determine what content your site has, the better your chance of a favourable placement in keyword results.
Things you can do for SEO
- Strong titles — Make sure your site has a good title for the browser. Usually this is simply the title for your website, plus the title of the specific page. If possible, make sure the website title is somewhere in the title of every page on your site.
- Clear metadata — Metadata is the text in the code of your site that search engines display next to your search result. The most important piece of code to include is the description tag: `<meta name="description" content="A short summary of this page">`. Put it in the HTML head of each page. If possible, have a different description for each page; if that isn’t possible, just use the tag on the homepage.
- Use descriptive URLs — When designing the URL layout of your site, try to use .htaccess rewrite rules to turn ugly, number-based URLs into worded ones. You’ll notice on this site that each post has a URL that includes the post title. That is generally favoured over the basic variable + number approach (i.e. /item/watches is better than ?i=3403).
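For example, a rewrite rule that maps a worded URL onto a plain query string might look like this .htaccess sketch (the script name and parameter are hypothetical):

```apache
RewriteEngine On
# Map /item/watches to item.php?i=watches
RewriteRule ^item/([a-z0-9-]+)/?$ item.php?i=$1 [L]
```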
- Use a sitemap when starting — While their merit for established sites is often questioned, submitting a sitemap to Google when your site is first starting out is a great way to get all your pages indexed so people can find them. Do that by signing up for Google Webmaster Tools. A sitemap can be as simple as a single list of individual pages, or an automatically generated XML sheet. View my site’s sitemap here.
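For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
</urlset>
```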
Things to watch
- Don’t pay to gain search ranking — When you sign up for webhosting, you may be intrigued by the company’s “get into every search engine” offer. Never pay for services to immediately boost your Google rank. Google makes it clear that they will never charge users to increase their search results, and no company can offer surefire ways to gain rank. Gaining search rank takes time, and can be done with no money spent at all.
Instead of paying money to gain rank, look at every way to gain incoming links. Respected sites that link to you make your site more respected, and will likely boost your rank for general keywords. Do this by finding website directories related to your topic, or by joining a related forum and putting your URL in your post signature. Over time, search engines will move you up, and you should see an increase in traffic coming from them.