dannyman.toldme.com


FreeBSD, Linux, Mac OS X, Technical

De-duplicating Files with jdupes!

Link: https://dannyman.toldme.com/2023/03/23/de-duplicating-files-with-jdupes/

Part of my day job involves looking at Nagios and checking up on systems that are filling their disks. I was looking at a system with a lot of large files, which are often duplicated, and I thought this would be less of an issue with de-duplication. There are filesystems that support de-duplication, but I recalled the fdupes command, a tool that “finds duplicate files” … if it can find duplicate files, could it perhaps hard-link the duplicates? The short answer is no.

But there is a fork of fdupes called jdupes, which supports de-duplication! I had to try it out.

It turns out your average Hadoop release ships with a healthy number of duplicate files, so I used that as a test corpus.

> du -hs hadoop-3.3.4
1.4G hadoop-3.3.4
> du -s hadoop-3.3.4
1413144 hadoop-3.3.4
> find hadoop-3.3.4 -type f | wc -l
22565

22,565 files in 1.4G, okay. What does jdupes think?

> jdupes -r hadoop-3.3.4 | head
Scanning: 22561 files, 2616 items (in 1 specified)
hadoop-3.3.4/NOTICE.txt
hadoop-3.3.4/share/hadoop/yarn/webapps/ui2/WEB-INF/classes/META-INF/NOTICE.txt

hadoop-3.3.4/libexec/hdfs-config.cmd
hadoop-3.3.4/libexec/mapred-config.cmd

hadoop-3.3.4/LICENSE.txt
hadoop-3.3.4/share/hadoop/yarn/webapps/ui2/WEB-INF/classes/META-INF/LICENSE.txt

hadoop-3.3.4/share/hadoop/common/lib/commons-net-3.6.jar

There are some duplicate files. Let’s take a look.

> diff hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
> ls -l hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
-rwxr-xr-x 1 djh djh 1640 Jul 29 2022 hadoop-3.3.4/libexec/hdfs-config.cmd
-rwxr-xr-x 1 djh djh 1640 Jul 29 2022 hadoop-3.3.4/libexec/mapred-config.cmd

They look identical to me, yes.

> jdupes -r -m hadoop-3.3.4
Scanning: 22561 files, 2616 items (in 1 specified)
2859 duplicate files (in 167 sets), occupying 52 MB

Here, jdupes says it can consolidate the duplicate files and save 52 MB. That is not huge, but I am just testing.

> jdupes -r -L hadoop-3.3.4|head
Scanning: 22561 files, 2616 items (in 1 specified)
[SRC] hadoop-3.3.4/NOTICE.txt
----> hadoop-3.3.4/share/hadoop/yarn/webapps/ui2/WEB-INF/classes/META-INF/NOTICE.txt

[SRC] hadoop-3.3.4/libexec/hdfs-config.cmd
----> hadoop-3.3.4/libexec/mapred-config.cmd

[SRC] hadoop-3.3.4/LICENSE.txt
----> hadoop-3.3.4/share/hadoop/yarn/webapps/ui2/WEB-INF/classes/META-INF/LICENSE.txt

[SRC] hadoop-3.3.4/share/hadoop/common/lib/commons-net-3.6.jar

How about them duplicate files?

> diff hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
> ls -l hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
-rwxr-xr-x 2 djh djh 1640 Jul 29 2022 hadoop-3.3.4/libexec/hdfs-config.cmd
-rwxr-xr-x 2 djh djh 1640 Jul 29 2022 hadoop-3.3.4/libexec/mapred-config.cmd

In the ls output, the “2” in the second column is the number of hard links to the file. Before we ran jdupes, each file had just one link, its own name. Afterward, these two names point at the same data on disk.
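The mechanics are easy to see with plain ln. A minimal sketch, with made-up file names, run in a scratch directory:

```shell
# Demonstrate a hard link: two names, one inode.
cd "$(mktemp -d)"
echo "hello" > a.txt
ln a.txt b.txt               # hard link b.txt to a.txt's inode

ls -l a.txt b.txt            # both now show a link count of 2
stat -c '%h %i' a.txt b.txt  # GNU stat: link count and inode are identical
                             # (on BSD/macOS, use: stat -f '%l %i')
```

Both names report the same inode number and a link count of 2, which is exactly what jdupes -L leaves behind.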

> du -s hadoop-3.3.4
1388980 hadoop-3.3.4
> find hadoop-3.3.4 -type f | wc -l
22566

The directory uses slightly less space, but the file count is essentially unchanged: hard-linking removes no files, only redundant copies of their data.

But, be careful!

If you have a filesystem that de-duplicates data, that’s great: change the contents of a de-duplicated file, and the filesystem stores the new data for the changed file while keeping the old data for the unchanged one. If you de-duplicate with hard links, though, editing a de-duplicated file edits every file that links to that location on disk. For example:

> ls -l hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
-rwxr-xr-x 2 djh djh 1640 Jul 29 2022 hadoop-3.3.4/libexec/hdfs-config.cmd
-rwxr-xr-x 2 djh djh 1640 Jul 29 2022 hadoop-3.3.4/libexec/mapred-config.cmd
> echo foo >> hadoop-3.3.4/libexec/hdfs-config.cmd
> ls -l hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
-rwxr-xr-x 2 djh djh 1644 Mar 23 16:16 hadoop-3.3.4/libexec/hdfs-config.cmd
-rwxr-xr-x 2 djh djh 1644 Mar 23 16:16 hadoop-3.3.4/libexec/mapred-config.cmd

Both files are now 4 bytes longer! Maybe this is desired, but in plenty of cases, this could be a problem.

Of course, the nature of how you “edit” a file is very important. A file copy utility might replace a file, or it might rewrite it in place. You need to experiment and check your documentation. Here is an experiment.

> ls -l hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
-rwxr-xr-x 2 djh djh 1640 Mar 23 16:19 hadoop-3.3.4/libexec/hdfs-config.cmd
-rwxr-xr-x 2 djh djh 1640 Mar 23 16:19 hadoop-3.3.4/libexec/mapred-config.cmd
> cp hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
cp: 'hadoop-3.3.4/libexec/hdfs-config.cmd' and 'hadoop-3.3.4/libexec/mapred-config.cmd' are the same file

The cp command is not having it. What if we replace one of the files?

> cp hadoop-3.3.4/libexec/mapred-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd.orig
> echo foo >> hadoop-3.3.4/libexec/hdfs-config.cmd
> ls -l hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
-rwxr-xr-x 2 djh djh 1644 Mar 23 16:19 hadoop-3.3.4/libexec/hdfs-config.cmd
-rwxr-xr-x 2 djh djh 1644 Mar 23 16:19 hadoop-3.3.4/libexec/mapred-config.cmd
> cp hadoop-3.3.4/libexec/mapred-config.cmd.orig hadoop-3.3.4/libexec/mapred-config.cmd
> ls -l hadoop-3.3.4/libexec/hdfs-config.cmd hadoop-3.3.4/libexec/mapred-config.cmd
-rwxr-xr-x 2 djh djh 1640 Mar 23 16:19 hadoop-3.3.4/libexec/hdfs-config.cmd
-rwxr-xr-x 2 djh djh 1640 Mar 23 16:19 hadoop-3.3.4/libexec/mapred-config.cmd

When I run the cp command to replace one file, it writes the data in place rather than breaking the link, so the “replacement” shows up in both files.
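Whether an edit hits both names comes down to whether the tool writes the existing inode in place or replaces the name with a new file. A sketch of the difference, with made-up names in a scratch directory:

```shell
cd "$(mktemp -d)"
echo "original" > a.txt
ln a.txt b.txt                  # a.txt and b.txt now share one inode

# Appending writes the shared inode in place: both names see the change.
echo "appended" >> a.txt
grep -c appended b.txt          # -> 1

# Replacing by rename points the name at a brand-new inode: the link breaks.
echo "replacement" > new.txt
mv new.txt a.txt
stat -c %h b.txt                # -> 1 (GNU stat; b.txt is on its own now)
grep -c replacement b.txt       # -> 0 ... b.txt kept the old contents
```

Tools that write a temporary file and rename it over the target (many editors, rsync without --inplace) quietly undo the de-duplication; tools that write in place propagate the change to every linked name.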

Back at work, I found I could save a lot of disk space on the system in question with jdupes -L, but I am also wary of unintended consequences of linking files together. If we pursue this strategy in the future, it will be with considerable caution.

Feedback Welcome


FreeBSD, JIRA, Linux, Mac OS X, Technical

Strip Non-Ascii From a File

Link: https://dannyman.toldme.com/2012/05/10/mysql-not-configured-for-utf8/

I have had bad luck trying to coax this out of Google, so here’s a Perl one-liner:

perl -pi -e 's/[\x80-\xEF]//g' file.txt

Where file.txt is a file you want to clean up.
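If Perl is not handy, the same cleanup can be approximated with tr. A sketch, assuming GNU or BSD tr; note this strips the entire 0x80-0xFF range, slightly more than the Perl one-liner does:

```shell
# Sample input: "abc", an e-acute (two UTF-8 bytes), then "def".
printf 'abc\303\251def\n' > file.txt

# Delete (-d) every byte outside (-c) the 7-bit ASCII range.
# LC_ALL=C makes tr operate on raw bytes rather than multibyte characters.
LC_ALL=C tr -cd '\000-\177' < file.txt > file.clean.txt

cat file.clean.txt    # -> abcdef
```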

This comes up because we have a web application that hits a MySQL database which is incorrectly configured to store text as ASCII instead of UTF-8. The application assumes that all text is Unicode and that the database is configured correctly, and every week or two someone asks me why they are getting a weird, gnarly error. Typically they have pasted in some UTF-8 whitespace character sent to us from Ukraine.

Eventually the database will be reloaded as UTF-8 and the problem will be solved for good. Until then, I can tell folks to use the Perl command above: it looks for bytes in the 0x80 through 0xEF range (most of the high-bit territory) and strips them out.

Feedback Welcome


FreeBSD, Linux, Mac OS X, Technical

Avoiding Concurrent Crons: Easy File Locking!

Link: https://dannyman.toldme.com/2010/09/20/lockf-flock-cron/

Old SysAdmin tip: keep your frequent-but-long-running cron jobs from running concurrently by adding some lightweight file locking to your cron entry. For example, if you have:

* 15 * * * /usr/local/bin/db-backup.sh

On FreeBSD you could use:

* 15 * * * /usr/bin/lockf -t 0 /tmp/db-backup.lock /usr/local/bin/db-backup.sh

Or on Linux:

* 15 * * * /usr/bin/flock -w 0 /tmp/db-backup.lock /usr/local/bin/db-backup.sh

Read up on the lockf or flock man pages before you put this in. It can be a bit tricky, because both names also exist as system calls; try “man 1 lockf” or the like to nail down the manual page for the user-level command.

1 Comment


Featured, Mac OS X, Sundry, Technical, Technology

Week of 20 December, 2009

Link: https://dannyman.toldme.com/2009/12/27/week-of-20-december-2009/

Sunday, December 20

So, it is weird sleeping in when you expected to be on a train. New York City was a winter wonderland, very pleasant to walk around when the cars are driving slow, and the streets are filled with people shoveling snow. A Winter snow storm the weekend before Christmas hits the spot for people to rub shoulders with strangers in a friendly manner.

The snow also means no parking enforcement on Monday. It looks like we will have to move the car before Thursday, as Christmas Eve is not a parking holiday.

We went to brunch, then some light shopping, and back home for a relaxing afternoon. Mei has one last night shift this evening, and since the car is well and buried, I escorted her to the hospital on the train.

I like going out in the snow. Must be that Viking blood. On my way back I noted that in the working class neighborhood surrounding the hospital, there was less commercial activity, because there is less money to spend. Without a critical mass of people with sufficient disposable income, you don’t get the retail services opening up which help employ the working class, and that is why modern small towns tend to be somewhat dead. I started thinking about how in SimCity 4, commercial development always lagged in a new town until a certain point . . .

Later that night I looked up the new MMO city simulator, Cities XL. For $10 / 30 days I thought I would give it a try. I didn’t go to bed until 5am, though to be sure I didn’t get the game running until 3am due to download issues. The game feels pretty “beta” but from what I have seen the interface is pretty slick, and the graphics are beautiful. It seems pretty close to the idea of a game I have been wanting to play for years, where you build your city on a planet with other cities, and cities have effects on each other. The first two things I have seen that have been missing from SimCity are that the very first thing you need is a road coming in from outside, and then a consideration for local natural resources, which give your new town a back story and a context, a more satisfying start than an abstract sandbox.

Monday, December 21

Brian: Okay, cats riding Roomba pretty much justifies Google’s purchase of YouTube.
Me: Amen! It is all about . . . the long tail!

Tuesday, December 22

Brunch with Mei. We ate at Tom’s which is this famous place that is never open. I ate there once before and enjoyed their French Toast, but this time through we found the food quality somewhat lacking.

After a relaxed day at home, it was up to Penn Station, and on to Chicago. Mei accompanied me to Penn Station to see me off, but as I was concerned with finding the Amtrak check-in kiosks and then a good place to wait for the track announcement I kept speeding off ahead of her. She wasn’t too pleased about that but was gracious enough in saying goodbye. I got a nice seat on the train and a Japanese Literature Post-grad named Steve sat next to me.

The train was running a little late, and they never did go through coach for dinner reservations, so as the train pulled out of Albany at 7:30 I walked back to the dining car, where a long line of confused and uninformed guests had gathered, knowing that they typically stop serving dinner at 8pm. I had a lamb shank, sitting across from a guy who had been in computer sales for the past half century or so. Right now he is retired but helping some guys in nano-fabrication get running in business. Cool stuff.

There was a fair amount of talk of politics. The guy was Republican who had voted for Obama, and the lady sitting next to me said her husband was a Tea Party protester. I started to laugh in sympathy then realized that hey, sometimes you have sat down to eat with Republicans. I listened as these business folks tried to make sense of the role of government in the modern world. They disdained the crazy right-wing types who oppose all government programs.

I slept better than I had the first time I rode the train in November.

Usenet’s big “problem” is that nobody ever wrote a user-friendly web interface for it. Instead, the people who really wanted to chat found it easier to hack up web forums filled with animated emoticons using PHP and MySQL, rather than figure out some bitchin’ gateway into the great gray world, ruled by curmudgeons content to seal themselves off from the hoi polloi.

Wednesday, December 23

We were repeatedly woken in the morning by loud announcements regarding the fact that breakfast could be had in the dining car. I took the L home through a landscape I most remember from high school. In the evening I showed Machinarium to the family, which everyone found to be adorable and engaging. I ended up playing the game until 3:30am.

Thursday, December 24

We headed down to Grandma’s house for Christmas Eve. There was less family around than other years but neighbors dropped by. A lighter year than usual, so we had a lot of leftovers.

Around 10pm we opened presents. I went to set up the webcam I had gotten Grandma, but when I plugged it in to her Mac nothing happened. Further investigation revealed that the UVC feature that enables webcam support was introduced in OS X 10.4 and that if you have 10.3.9 you’re just a sorry twat who can not use webcam software. Okay, so how much to upgrade? Well, the latest and greatest is only $30! That’s not so bad, let us do this! Woah there pardner, you can’t have the new Mac OS unless you have 1GB of RAM and an Intel processor. Your vintage Mac Mini just isn’t going to do! Uhhh, okay. How about 10.4? Well, Apple doesn’t publish that any more, that is a collector’s item, you see. The current market rate for a used copy of the old Mac OS on the resale market is around $150.

I guess if you keep spending money on upgrading your Mac everything will be dandy but if you’re the sort of human trash who only upgrades her computer maybe twice a decade then Fuck You, Grandma! If this were Windows or Linux someone would have figured out how to support a nice webcam. Hell, on Linux I can even use the cheaper “Windows” webcam because, unlike Mac OS, someone figured out how to get the auto-focus working . . . the fact that Microsoft can only manage to squeeze out a potentially mandatory OS upgrade once or twice a decade begins to seem more virtuous. Apple really should let you easily upgrade components of their OS without much hassle, but selling computers is how they make money.

Fuck you, Apple. Well, I’ll find her an upgrade to OS X 10.4 for non-Intel computers on CD-not-DVD and there may even be a store around that will happily get her a memory upgrade, because something tells me that even if the Apple Store has a Genius who could, by appointment only, fill out the form to mail the computer off for a memory upgrade because woah basic maintenance on a Mac Mini is effing rocket science I suspect that when they find out it is an old computer stained by a half decade of tobacco that they will just condescendingly laugh at my horribly backward Grandmother and I’d finally snap and go in there and beat the crap out of some wannabe-hipster douchebags.

Next time Grandma gets a PC.

Friday, December 25

Cleaning up Grandma’s house. Uncle John started to explore the netbook that we got him for Christmas. Janice came by, and we were all glad. John set up an old-fashioned 120mm “dual lens reflex” box camera on a tripod and some lights and took some family Christmas photos. We also looked over some rifles that had been sitting around in Grandma’s house from the previous owner, before heading back home.

Saturday, December 26

Mom treated me to brunch, and Jessica brought the posters she got me for Christmas to her shop to frame them. Then, Mom drove me down to Union Station, for my 9PM train back towards New York.

Feedback Welcome


Featured, FreeBSD, Linux, Mac OS X, Technical

HOWTO: Random Number in Shell Script

Link: https://dannyman.toldme.com/2008/07/04/shell-sh-bash-random-splay/

The other day I was working on a shell script to be run on several hundred machines at the same time. Since the script was going to download a file from a central server, and I did not want to overwhelm the central server with hundreds of simultaneous requests, I decided that I wanted to add a random wait time. But how do you conjure a random number within a specific range in a shell script?

Updated: Due to much feedback, I now know of three ways to do this . . .

1) On BSD systems, you can use jot(1):
sleep `jot -r 1 1 900`

2) If you are scripting with bash, you can use $RANDOM:
sleep `echo $RANDOM%900 | bc`

3) For portability, you can resort to my first solution:
# Sleep up to fifteen minutes
sleep `echo $$%900 | bc`

$$ is the process ID (PID), which on most systems is a value between 1 and 65,535; here it serves as a cheap “random seed.” Fifteen minutes is 900 seconds. % is modulo: like division, but it gives you the remainder, so $$ % 900 yields a result between 0 and 899. In bash, $RANDOM provides the same utility, except it produces a different value every time you reference it.

Updated yet again . . . says a friend:
nah it’s using `echo .. | bc` that bugs me, 2 fork+execs, let your shell do the math, it knows how
so $(( $$ % 900 )) should work in bsd sh

For efficiency, you could rewrite the latter two solutions:
2.1) sleep $(( $RANDOM % 900 ))
3.1) sleep $(( $$ % 900 ))

The revised solution will work in sh-derived shells: sh, bash, ksh. My original “portable” solution will also work if you’re scripting in csh or tcsh.
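A quick sanity check of the arithmetic; either expansion should always land in the 0 to 899 range:

```shell
# POSIX sh arithmetic; $RANDOM additionally needs bash, ksh, or zsh.
n=$(( $$ % 900 ))
echo "would sleep $n seconds"
[ "$n" -ge 0 ] && [ "$n" -lt 900 ] && echo "in range"
```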

2 Comments


FreeBSD, Linux, Mac OS X, Technical

Mini-HOWTO: What Time is UTC?

Link: https://dannyman.toldme.com/2008/05/06/what-time-utc/

I wanted to know what time it was in UTC, but I forgot my local offset. (It changes twice a year!) I figured I could look in the date man page, but I came up with an “easier” solution. Simply fudge the time zone and then ask.

0-20:57 djh@noneedto ~$ env TZ=UTC date
Tue May  6 03:57:07 UTC 2008

The env bit is not needed in bash, but it makes tcsh happy.

Update: Mark points out an easier solution:
date -u

Knowing you can set TZ= is still useful in case you ever need to contemplate an alternate timezone.
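The TZ trick generalizes to any zone the system knows about. A sketch, assuming a zoneinfo database with the usual names:

```shell
date -u +'%H:%M %Z'              # the easy way; prints UTC
TZ=UTC date +'%H:%M %Z'          # same answer via the environment
TZ=America/Chicago date +'%Z'    # any zoneinfo name works: CST or CDT
```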

(Thanks, Saul and Dave for improving my knowledge.)

3 Comments


Featured, Free Style, FreeBSD, Linux, Mac OS X, Sundry, Technical, Technology

Trendspotting: “The Amiga Line”

Link: https://dannyman.toldme.com/2008/01/26/deader-than-amiga/

I have been playing with Google Trends, which will be happy to generate a pretty graph of keyword frequency over time. A rough gauge to the relative popularity of various things. This evening, I was riffing off a post from the Royal Pingdom, regarding the relative popularity of Ubuntu and Vista, among other things.

I got started graphing various Linux distributions against each other, XP versus Vista, and trying to figure out the best keyword for OS X. Then, I wondered about FreeBSD. Against Ubuntu, it was a flatline. So, I asked myself: what is the threshold for a dead or dying Operating System?

Amiga vs FreeBSD:
Google Trends: Amiga versus FreeBSD

Ouch! Can we get deader?

Amiga vs FreeBSD vs BeOS:
Google Trends: Amiga versus FreeBSD versus BeOS

To be fair, the cult of Amiga is still strong . . . BeOS is well and truly dead. But how do the BSDs fare?

Amiga vs FreeBSD vs BeOS vs NetBSD vs OpenBSD:
Google Trends: *BSD versus Amiga, BeOS

NetBSD has been sleeping with the BeOS fishes for a while, and OpenBSD is on its way. And that’s a league below Amiga!

In Red Hat land, only Fedora beats “the Amiga Line”. For Unix in general, nothing stops the Ubuntu juggernaut. But there’s a long way to go to catch up with Uncle Bill.

(Yes, it is a rainy night and the girlfriend is out of town.)

Postscript: Ubuntu versus Obama

3 Comments


About Me, FreeBSD, Linux, Mac OS X, Technical

SysAdmin OpEd: Where to Keep the Crons

Link: https://dannyman.toldme.com/2008/01/11/etc-crontab-or-die/

This is just a note which I contributed to a thread on sage-members, to get something off my chest, as to where people should maintain their crontab entries. I sincerely doubt that reading what I have to say will bring you any great illumination.

I’d say, any reasonable SysAdmin should default to /etc/crontab because every other reasonable SysAdmin already knows where it is. If anything is used in addition to /etc/crontab, leave a note in /etc/crontab advising the new guy who just got paged at 3:45am where else to look for crons.

For production systems, I strongly object to the use of per-user crontabs. I’m glad to hear I’m not alone. One of the first things I tend to do in a new environment is write a script that sniffs out all the cron entries.
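Such a sniffing script need not be fancy. A minimal sketch; the paths are typical Linux locations (Debian keeps user crons in /var/spool/cron/crontabs, Red Hat in /var/spool/cron) and are assumptions to adjust per system:

```shell
# Gather cron entries from the usual places into one report.
# An optional prefix argument lets you point it at a copied or fake tree.
scan_crons() {
    root=${1:-}
    for f in "$root/etc/crontab" "$root"/etc/cron.d/* \
             "$root"/var/spool/cron/crontabs/* "$root"/var/spool/cron/*; do
        [ -f "$f" ] || continue    # skip unmatched globs and directories
        echo "== $f =="
        cat "$f"
    done
}
```

Run scan_crons with no argument against the live system, or scan_crons /some/root to inspect a mounted image.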

And then there was the shop that used /etc/crontab, user crons, and fcron to keep crons from running over each other. This frustrated me enough that I did a poor job of explaining that job concurrency could easily be ensured by executing a command through (something like) the lockf utility, instead of adding a new layer of system complexity.

Yes, I am a cranky old SysAdmin.

2 Comments


About Me, Mac OS X, News and Reaction, Sundry, Technical, Technology

Goodbye, Bill Gates!

Link: https://dannyman.toldme.com/2008/01/08/bill-gates-last-day/

I was startled by this YouTube video, where we discover that Bill Gates can make fun of himself. Or, at least, his people can assemble a video where Bill Gates makes fun of himself. Good for Bill! I was then reassured at the consistency of the universe, when it was revealed that Bill really can’t make fun of himself without at least a dozen star cameos to reassure us that it is not so much that he is poking fun at himself, but that he is “acting”.

It is telling that Al Gore has the funniest line.

I hope Bill’s foundation does much good in the world. I almost feel sorry for Microsoft that, after all the effort, Vista has proven to be a turkey. For what it’s worth, from a UI and performance perspective, I prefer Windows XP to Mac OS X. Though I’m not sure that this is praise for Microsoft as much as it is an aversion to the Smug Cult of Apple.

(Yes, I am a contrarian. People hate contrarians. Especially Mac people, who think they have the contrarian cred: the last thing a contrarian wants to encounter is a contradicting contrarian!)

Feedback Welcome


About Me, Mac OS X, Technology

Evidence I’m Not a Mac Person . . .

Link: https://dannyman.toldme.com/2007/08/03/the-genius-of-apple/

I just completed a feedback form regarding my AppleCare warranty experience. Question 12a gave me a chance to bitch. Question 12b made me smile at my ridiculous expectations:


12a Is there anything else you would like to tell Apple about your recent in-store repair experience at the Apple Retail Store? (NOTE: 2000 character limit)

Replacing the optical drive on a Mac Mini is a simple procedure that takes fifteen minutes, requiring a screwdriver and a putty knife. That I should have to drive to a God damned mall and explain to a “genius” that he doesn’t actually need my password to log in to OS X, wait for twenty minutes as the “genius” engages in manual data entry, then wait “seven to ten business days” for the part to be replaced is FUCKING SAD.

(Note: Hold down command+s during boot, run to the appropriate init level and type “passwd” to reset the password. Even someone who isn’t a “genius” can pull that off!)

12b The comment above is a

Compliment

Complaint

Suggestion


5 Comments


Mac OS X, Technical, WordPress

Mac OS X and per-user Support for .htaccess

Link: https://dannyman.toldme.com/2007/06/21/mac-os-x-sites-htaccess-allowoverride/

Problem

I just spent a fair amount of time wrestling with Apache on my Macintosh. The problem is that it simply refused to read the .htaccess file in my user directory.

My First Approach

I took the “Unix Guy” approach and edited /etc/httpd/httpd.conf to ensure that Apache was configured to consult my user’s .htaccess file. I changed this bit:

<Directory /Users/*/Sites>
    AllowOverride FileInfo AuthConfig Limit
    Options MultiViews Indexes FollowSymLinks IncludesNoExec
    [ . . . ]

To read:

<Directory /Users/*/Sites>
    # AllowOverride FileInfo AuthConfig Limit
    AllowOverride All
    Options MultiViews Indexes FollowSymLinks IncludesNoExec
    [ . . . ]

But . . . nada.

Feedback Welcome


Mac OS X, Technical

Newbie Mac Hobo

Link: https://dannyman.toldme.com/2007/05/14/hopping-a-train/

So, I have a few gazillion things I would like to do, and some free time to play with. In terms of “professional development” I am looking for work, but more interesting to me is to have some time for education. I know an awful lot of things, especially about systems administration, and there are plenty of things that I don’t know, some things you would think I have done, but haven’t, and then there’s adding new stuff like learning Ruby on Rails, which could be danged handy if I choose to pursue contracting. To that end, I’ve had the “Agile Web Development with Rails” book collecting dust for a while . . . and a Macintosh desktop, to keep Unix confusing.

I’m into Chapter 3: Installing Rails, and the instructions for me don’t quite cut it, and a link they provide to Lucas Carlson’s blog no longer exists. But that needn’t stop us, because Google found me a very handy tutorial from Hivelogic: Building Ruby, Rails, Subversion, Mongrel, and MySQL on Mac OS X. It is from February and it does an excellent job of walking through the steps to fetch, build, and install the various pieces on a Macintosh, in /usr/local, like a real Unix system, and explaining all the important bits like fixing your $PATH and installing Xcode, and what the heck is sudo anyway. Stepping through the article is a breeze and you are left with a working Rails server, backed by MySQL, and the beginnings of a clue as to contemporary “best practices” like deployment-via-Capistrano. Huzzah!

In my case, I had to complete one other hurdle along the way. The “gem install rails” bit was erroring out for me when I followed the book and again with the Hivelogic article, with an error like: “Could not find rails (> 0) in any repository” . . . Google again found me an answer from Army of Evil Robots that basically boils down to:

gem update

Anyway, now I have a working Ruby setup on this computer, and I can hitch along to the next chapter, where I’ll learn to hop a freight.

I’m not sure what I might want to do with Rails, but “re-implement lnk.to” seems sort of obvious. If you are reading this and happen to have feelings about lnk.to, or link-shortening services, I welcome any thoughts, suggestions, wishes . . . thanks! -d

2 Comments


Site Archive