- To hell with the pig... I'm going to Switzerland.

Protecting My Bandwidth (Sunday, February 26, 2006)

So two more site updates. I recently discovered that someone was hotlinking one of the images on my site, embedding it in their page rather than making a local copy of it. So I sent them a note, kindly asking them not to. They eventually did as I asked, which was nice of them; however, I decided I should come up with a technical solution anyway. So I've created a new script for serving images that checks for a local referrer and serves up lots of garbage if there isn't one... I want people to know that I'm serious. And it won't prevent people from downloading their own copies of the files. I should put a note on my Terms page explicitly authorizing folks to copy that sort of file for that purpose.
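The core of a referrer check like that is simple. Here's a minimal sketch of the idea in Python (the host name, paths, and garbage payload are illustrative assumptions, not the actual script):

```python
# Sketch of referrer-based hotlink protection: serve the real image
# only when the request came from a page on this site, otherwise
# serve random garbage bytes. Host name and sizes are assumptions.
import os
from urllib.parse import urlparse

LOCAL_HOST = "www.example.com"  # assumption: the site's own host name

def is_local_referrer(referrer: str) -> bool:
    """True if the Referer header points back to this site."""
    if not referrer:
        return False
    return urlparse(referrer).hostname == LOCAL_HOST

def serve_image(path: str, referrer: str) -> bytes:
    """Return the real image for local referrers, garbage otherwise."""
    if is_local_referrer(referrer):
        with open(path, "rb") as f:
            return f.read()
    # Hotlinkers get noise instead of the image.
    return os.urandom(1024)
```

Note that requests with no Referer header at all (some proxies and privacy tools strip it) also get garbage under this scheme, which is one of its rough edges.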

Then, of course, there's the problem of feed readers like Bloglines serving up my content, which breaks the image references. So I modified my feed again so that it not only rewrites relative URLs, but also changes the image references so that they aren't protected. I'm pretty sure Google isn't indexing my feed, so the image search should only return links to protected images. It's not a foolproof plan, but I think it's good enough, and we'll see how it goes.
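The rewrite pass amounts to two string transformations over the feed's HTML. A rough sketch, assuming a hypothetical base URL and an unprotected image directory (the real paths aren't shown in the post):

```python
# Sketch of the feed rewrite: make relative src/href URLs absolute,
# then point image URLs at an unprotected path so feed readers can
# fetch them. BASE and the directory names are assumptions.
import re

BASE = "http://www.example.com"  # assumption: the site's base URL

def rewrite_feed_html(html: str) -> str:
    # Make root-relative src/href attributes absolute.
    html = re.sub(r'(src|href)="/([^"]*)"',
                  lambda m: f'{m.group(1)}="{BASE}/{m.group(2)}"',
                  html)
    # Swap the protected image path for a direct, unprotected one.
    html = html.replace(f"{BASE}/images/", f"{BASE}/feed-images/")
    return html
```

Doing the absolute-URL rewrite first means the path swap only needs to match one canonical form of the image URLs.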

In other news, I've been having a lot of fun writing a parser for a Lisp-like language for my graduate compilers class. I wasn't content to just write a simple recursive descent parser. Noooo. That would have been too easy, and also too tedious. So I wrote a full "compiler" that reads in a BNF file with semantic routines, and generates a full recursive descent parser with error recovery. It's largely equivalent to Yacc, except for the type of parser it generates (recursive descent rather than Yacc's table-driven LALR). Very fun stuff.
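For flavor, here's a tiny hand-written recursive descent parser for s-expressions, the style of parser such a generator would emit. This is an illustrative sketch, not the actual class project; the grammar is just `expr ::= ATOM | '(' expr* ')'`:

```python
# Minimal recursive descent parser for a Lisp-like language.
# Grammar: expr ::= ATOM | '(' expr* ')'
# Atoms come back as strings, lists as Python lists.
import re

TOKEN = re.compile(r"\(|\)|[^\s()]+")

def tokenize(src: str) -> list[str]:
    return TOKEN.findall(src)

def parse(src: str):
    tokens = tokenize(src)
    pos = 0

    def expr():
        nonlocal pos
        if pos >= len(tokens):
            raise SyntaxError("unexpected end of input")
        tok = tokens[pos]
        pos += 1
        if tok == "(":
            items = []
            while pos < len(tokens) and tokens[pos] != ")":
                items.append(expr())
            if pos >= len(tokens):
                raise SyntaxError("missing ')'")
            pos += 1  # consume ')'
            return items
        if tok == ")":
            raise SyntaxError("unexpected ')'")
        return tok  # atom

    tree = expr()
    if pos != len(tokens):
        raise SyntaxError("trailing input after expression")
    return tree
```

Each grammar rule becomes one function (here just `expr`), which is exactly why generating such a parser from a BNF file is mechanical enough to automate.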

—Brian (2/26/2006 9:09 PM)



Disclaimer: Opinions on this site are those of Brian Ziman and do not necessarily
reflect the views of any other organizations or businesses mentioned.