Planet PHP Sources now on GitHub

After Lukas asked me for the current Planet sources, I realized that I didn't really maintain the sources in the previously mentioned SVN repository anymore (for various reasons). So I decided to finally move them to GitHub. It makes much more sense there, since the most sensible thing to do if you want to set up your own planet is to fork it, and git (plus GitHub) makes that painlessly easy. Nevertheless, I'm of course still interested in patches :)

Please be aware that the code for the planets is pretty old: it uses a framework which has been declared deprecated, and the code sometimes evolved a little too organically (i.e. some refactoring wouldn't be too bad). You may also find files in the repository which don't belong there :) But on the other hand, it has been working day in, day out for ages and does what it's supposed to do.

The sources for Planet PHP are here:
The “fork” for Planet Switzerland is located in the blogug branch at


Do not steal content from Planet PHP …

… and especially not with no links to the original authors, no attribution and too many Google ads; otherwise this happens to you:


Update: They removed the feed this morning, here’s a small screenshot:



blogug in today’s “heute”

Last Friday, I gave a phone interview to Thomas Benkö from heute about blog aggregators in general and Planet Blogug in particular. The result can be read in today's edition (5.6 MB PDF) or as a single page here (156 KB PDF :) )


Is automated headline-linking to blogs content stealing?

On Friday evening, Marcel from made a little post to check who's stealing his content. I actually saw that post on Planet Blogug and didn't think much of it, but when a few hours later appeared as a "content-stealer", I was quite surprised…

What have "we" done? On the so-called city pages, gets the latest posts from bloggers of that city from Planet Blogug and displays them with the title and a direct link back to the original post (see for example here).

I of course quickly removed from the index, made a comment on, and all was fine again, also for Marcel (later Dorian also made a statement).

But the general question remains (independent of the case): is taking the headlines of a site and putting them on your own site with a direct link back to the original post already a copyright infringement? Isn't that just what Google, Technorati et al. do? What would the NZZ or the Tagi say if I did that? (I'm btw not talking about the actual content of the posts, but just the headlines.)

There are some legal cases on that question: the Shetland Times vs. Shetland News case and, especially for Switzerland, the federal court case 4C.336/2004, which was about spidering classified pages (real estate in that case) and ruled that this is allowed. With all that in mind, I highly doubt that one would win such a case in court.

But the problem in the jobblog case is also that the "latest posts from a city" are taken from, where one can add one's own blog. And adding a blog there of course doesn't imply that everyone else may use that feed (legal issues aside). To give some options, has two checkboxes, "Non-commercial sites may use this feed" and "Commercial sites may use this feed", to declare what one wants. IMHO this classification is somewhat too broad. I for example don't mind if the headlines of my blogs appear on commercial sites (as long as they directly link back to me), but if they also took the content, I might have a problem with it. So maybe blogug should add an option like "Linking just the headline to the original post is always fine". But then again, only a small percentage actually claim their entry at and maintain it, so in the end it's of limited use anyway (unless blogug leaves all boxes unchecked by default, so that it's really clear).

The other solution for providers like would be to ask each and every blog owner if they're fine with being included. As the whole thing is an automated process, this would imply a pretty big administrative overhead.

Anyway, and blogug are thinking about better solutions for all of this. Until then, if you're not happy with your headlines being included, just leave us a note and we will remove your blog.

There's also an older post by me on a similar topic (but that one was about the actual content, not just the headlines).


Search by language on Planet Switzerland

Inspired by a post on Bertrand's blog, and especially a comment by Stephanie there, I implemented a language detection feature for blog entries on Planet Switzerland. If you're only interested in e.g. German posts, you can search for lang:de (or French, Italian or English) from now on.

I used the PEAR Text_LanguageDetect class for this feature, and so far it works pretty well if you limit the detection to those four languages. If you take all 51 available languages into consideration, then maybe 10% of the posts get "funny" languages assigned (like Azeri, Cebuano, Hausa, Hawaiian, Tagalog, Pidgin or some other European language). Even limited to those four languages it's still not perfect, but it's mostly short texts that get assigned wrongly, and the error rate is well below 5%.
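To illustrate why restricting the candidate set helps, here's a minimal Python sketch. It's my own illustration, not the PEAR Text_LanguageDetect code (which compares n-gram profiles); it just scores a text against small stopword lists for the four languages in question, and the word lists are assumptions of mine.

```python
# Minimal language guesser restricted to four candidate languages.
# Real detectors compare n-gram frequency profiles; counting common
# function words is enough to show why a small candidate set avoids
# "funny" matches against exotic languages.

STOPWORDS = {
    "de": {"der", "die", "das", "und", "ich", "nicht", "ist", "ein", "mit"},
    "fr": {"le", "la", "les", "et", "je", "pas", "est", "un", "avec"},
    "it": {"il", "la", "che", "e", "non", "di", "un", "con", "per"},
    "en": {"the", "and", "is", "not", "a", "with", "of", "to", "in"},
}

def detect_language(text, candidates=("de", "fr", "it", "en")):
    words = text.lower().split()
    scores = {lang: sum(w in STOPWORDS[lang] for w in words)
              for lang in candidates}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(detect_language("ich bin nicht sicher ob das ein guter Test ist"))  # de
```

With only four candidates, a spurious match against, say, Hawaiian simply can't happen, which is the whole point of limiting the language list.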

And because some people just like statistics, here's the breakdown of how many posts are written in which language:

it: 692
fr: 4’165
en: 8’856
de: 23’411
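For scale, those counts can be turned into percentages with a quick back-of-the-envelope calculation:

```python
# Share of posts per detected language, from the counts above.
counts = {"it": 692, "fr": 4165, "en": 8856, "de": 23411}
total = sum(counts.values())  # 37124
for lang, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{lang}: {100 * n / total:.1f}%")
```

So roughly two thirds of all posts are in German, and Italian is barely represented.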

With the many, many search options available on Planet Switzerland in the meantime, it was time to collect and write them down. So here it is: Search Options for Planet Switzerland.


Planet Switzerland goes local

For all to whom Planet Switzerland is not local enough, we just added a new feature: search by city or canton. Some examples:

city:Sankt Gallen
canton:Sankt Gallen

and also by country: country:Switzerland

And you can see the geo information for each blog if you click on "More Info on this post".

Technically, we check the blog's HTML start page for <meta name="ICBM" content="" /> or name="geo.position" and insert that into the DB. After that, we look up the appropriate city with the help of OpenGeoDB, which should have coordinates for all Swiss (and German) cities, and save that in the DB as well.

This means that only blogs which have those meta tags on their homepage show up in a location search, even if you search for "country:Switzerland". And blogs located in foreign countries don't show up either, since we don't have data for those (except Germany, but there's no blog up north with geodata), not even for Peru :). There are currently 167 blogs with geodata, and 156 of them were "located".

Since OpenGeoDB only has one "point" per city, there are cases where the assigned city is wrong (e.g. if you live nearer to the centre of the neighboring town than to your own). If that's the case for your blog, please report it to us and we can fix it (we just add another point to the GeoDB). As mentioned above, you can check the town we assigned to you via "More Info on this post".
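The two steps described above (reading the geo meta tag, then finding the closest OpenGeoDB point) could be sketched like this in Python; the regex and the tiny city list are illustrative assumptions, not our actual code:

```python
# Sketch: 1) pull lat/lon out of an ICBM or geo.position meta tag,
#         2) map the coordinates to the closest known city point.
import math
import re

META_RE = re.compile(
    r'<meta\s+name="(?:ICBM|geo\.position)"\s+content="([\d.+-]+)[,;]\s*([\d.+-]+)"',
    re.IGNORECASE)

def extract_position(html):
    m = META_RE.search(html)
    return (float(m.group(1)), float(m.group(2))) if m else None

def haversine_km(a, b):
    # Great-circle distance between two (lat, lon) pairs in kilometres.
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

# A few OpenGeoDB-style "one point per city" entries (made-up subset).
CITIES = {"Zürich": (47.3769, 8.5417), "Bern": (46.9480, 7.4474),
          "Sankt Gallen": (47.4245, 9.3767)}

def nearest_city(pos):
    return min(CITIES, key=lambda c: haversine_km(pos, CITIES[c]))

pos = extract_position('<meta name="ICBM" content="47.38, 8.54" />')
print(nearest_city(pos))  # Zürich
```

The "one point per city" limitation mentioned above is visible here: the closest point wins, even if your house is administratively in the next town over.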

We also currently don't parse the RSS feed for geolocation data. That will come later, and then each post can be in a different place.

How to add the meta data to your blog is described in a blog post by Alain. And another use of this data can be seen on the SLUG – Blog Map.

And as always, if you have more ideas, what could be done with that, just tell us :)


New Planet Switzerland features

Bored with reading blog posts that no one else is interested in? Just want to read what everyone else is talking about and how they do it? Then look no further than (or the Atom feed) :) It just shows all the recent blog posts which link to the "Top Links last 7 days" from the right side (and the top linked-to posts themselves) and leaves out the boring rest :)

On a more serious note, I added "and" and "or" support to the search queries. For example, shows all the blog posts which link to something with or in their links. Parentheses don't work yet, btw…

If you do a fulltext search, you don't have to add "or" and "and"; e.g. "bitflux blogug" searches for all posts containing the word "bitflux" or "blogug", while "+bitflux +blogug" searches for all posts which contain both terms. "and/or" works too, but is much slower on the DB server side.
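As a rough sketch of those fulltext semantics (plain words are OR-ed together, "+"-prefixed words are required; this is an illustration, not the actual Planet code):

```python
# Minimal model of the fulltext query semantics described above:
# bare words: at least one must appear; "+word": must appear.

def matches(query, text):
    words = set(text.lower().split())
    required = [t[1:].lower() for t in query.split() if t.startswith("+")]
    optional = [t.lower() for t in query.split() if not t.startswith("+")]
    if any(r not in words for r in required):
        return False
    return not optional or any(o in words for o in optional)

post = "new release of the bitflux editor"
print(matches("bitflux blogug", post))    # True: either word is enough
print(matches("+bitflux +blogug", post))  # False: both are required
```

This "+" syntax is also what MySQL's boolean fulltext mode uses, which is probably why it's the cheaper query on the DB side.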

Hope this is useful for your ego-surfing needs. And as always, add "atom/" just before "search/" and you get the Atom feed of the search.



Why including remote JavaScript is sometimes a bad idea

And I'm not talking about XSS (the danger of including JavaScript from untrusted sources should be obvious…)

This morning, I went to BloggingTom, and the first thing I was greeted with was a coComment "popup" (a div-based one). As it looked like a standard OS X window, my reflex was to hit Apple+W to close it. Naaah, that closed the whole tab, of course. Here's how it looked:


On the next try, clicking on the actual and very small close button, another popup came up:


And then I had to click "Cancel" to agree that it won't be "cocommented". Usability^3 :)

Even better on Win/IE:


What happened? The usual and (by coComment) recommended way to make a coComment-enabled post is to click the bookmarklet they deliver before you post a comment on a blog. Then the above popup and the warning make some sense (as I, the commenter, clicked my bookmarklet and want to cocomment-enable the comment). BloggingTom now just included that script by default, to avoid having to click the bookmarklet. A nice idea for everyone who has a coComment account, very bad for everyone else :) seems to have a nicer solution to the problem with his coComment WordPress plugin: it doesn't enable the script by default, but adds a "toggle" next to the "Save Comment" button, so you can manually cocomment-enable the comment without having to use the bookmarklet (you can't install bookmarklets in all browsers, for example not in NetNewsWire).

And why exactly is it bad to include remote JavaScript? First, as BloggingTom's example showed, you don't have any control over that JS. It may have worked differently when Tom first included it, and then coComment simply changed the behaviour (or maybe Tom just didn't test it without being logged in :) ). Even if it wasn't changed since Tom integrated it, who guarantees that coComment won't make adjustments later which break Tom's site again? (It doesn't have to be with bad intentions…) I wouldn't integrate something like that and enable it by default (with no way for the user to turn it off) on something as important as the comment function of your blog :)

Furthermore, what happens if is down? We had some similar issues with the gravatar site and the feeds. Servers go down, are slow to respond, have network problems, etc., and when that happens, your site either goes down with it, breaks its layout or is just damn slow to respond. We solved those problems by caching the gravatars and the feeds locally, so if the remote site is down, it won't really affect us; we just show old data.
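The caching idea is simple: refresh from the remote site when the cached copy is older than some TTL, and silently keep the stale copy when the fetch fails. A minimal Python sketch (illustrative, not our actual implementation):

```python
# Sketch of "cache locally, serve stale on failure": the remote site
# being down degrades to old data instead of taking our site with it.
import time

class RemoteCache:
    def __init__(self, fetch, ttl=300):
        self.fetch = fetch      # callable doing the remote request
        self.ttl = ttl          # refresh interval in seconds
        self.data = None
        self.fetched_at = 0.0

    def get(self):
        if self.data is None or time.time() - self.fetched_at > self.ttl:
            try:
                self.data = self.fetch()
                self.fetched_at = time.time()
            except Exception:
                pass  # remote down or slow: keep serving the old copy
        return self.data

cache = RemoteCache(lambda: "gravatar bytes", ttl=300)
print(cache.get())  # fetched once, then served from the local copy
```

The worst case then becomes "slightly outdated gravatars" instead of a broken or hanging page.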

PS. This is (again) not a bashing of coComment or BloggingTom or whoever. It was just the perfect example of why you should be careful about including remote JavaScript (or similar functionality, like remote RSS feeds) on your site. Both parties certainly did everything with good intentions; how it is right now on Tom's site is just a bad combination.

PPS. I'm still not convinced by the way coComment works (collects comments) in general, and I doubt that it will work in the longer term. But I'm sure they have some nice ideas in their "stealth mode" which should improve that. I also had some ideas (before coComment went public, btw :) ) of how to make following discussions in comments easier with the help of and, but those ideas have their flaws too, would only be Switzerland-centered and, most importantly, I currently don't find the time to actually implement them, so I'll shut up for now :)

Update: BloggingTom has now turned it off until a better solution is found.


classifieds via blogs

Classifieds via blogs are the latest buzz. Others have already written about Edgeio et al. (the namics blog, Bernhard Seefeld and Andreas Göldi (from December and in German, but still interesting)), so I don't have to repeat it here.

While those services are not yet public and don't have much to show right now, I assume they won't be much more than focused aggregators (with some goodies, of course). Edgeio for example said that they look for the tag "listing" and aggregate those posts. So if this takes off and kind of becomes a standard, it would be easy for others to piggyback on it, for example Planet Switzerland (search for tag/listing). Throw in the hListing microformat, some georeferencing and maybe some more commonly used tags, and the classifieds aggregator for blogs is done.

As others pointed out when discussing Edgeio, there's a big spam problem in this idea (for which I don't have a good solution either) and the usual chicken-and-egg problem. A big part of the blogger crowd cares about neither tagging nor microformats, so either you have to have a pretty smart aggregator which recognizes blog classifieds without those tags, or you won't get many classifieds at all. Furthermore, I don't know how many people would try to sell their stuff via blogs instead of going to ricardo or eBay with their much larger audience. On the other hand, the blog community, especially the Swiss one, is something like a walled garden: people know each other, and that makes it a lot easier and more trusted to sell/buy/find something.

Maybe I'll extend Planet Switzerland some day with more features in that direction (an hListing parser, for example, and some georeferencing magic). Anyone interested at all?


Comment feeds on

After my suggestion to Patrice that each blog on should have a comment feed field, he (as usual) promptly added it. I have some plans for using those feeds on the Planet; more about that later :)

Now it's your turn, go and add your comment feed there. And as most of the most-used blogging tools in Switzerland support comment feeds, that should give us a nice list (not sure if and the Sixapart people support comment feeds out of the box; at least WordPress, Kaywa, Serendipity and Flux CMS do, didn't check the others).
