
Rio de Janeiro as a smart city

04 Mar 2012 | 499 words | brazil rio smart cities streetart urbanism technology

The New York Times has a longish article portraying the Operations Center of the City of Rio that has been built by IBM’s Smarter Cities unit.

In the article both the city of Rio de Janeiro and IBM portray the operations center as some kind of magic wand that enables the benevolent city government to steer the daily life of the city’s population using video feeds and text messages:

City employees in white jumpsuits work quietly in front of a giant wall of screens — a sort of virtual Rio, rendered in real time. Video streams in from subway stations and major intersections. A sophisticated weather program predicts rainfall across the city. A map glows with the locations of car accidents, power failures and other problems.

[…] Rio represents a grand challenge. A horizontal city sprawled between mountains and the Atlantic Ocean, it is at once a boomtown, a beach town, a paradise, an eyesore, a research center and a construction site. Oil-industry giants like Halliburton and Schlumberger have been rushing to build research centers here to help develop massive oil and gas fields off the coast.

Special police units have moved into about 20 slums, called favelas, in an effort to assert government control and combat crime. Rio is also reconstructing major arenas and building a rapid-bus system ahead of the 2014 World Cup and the 2016 Summer Olympics.

This is a city where some of the rich live in gated communities while some of the poor in the favelas pirate electricity from the grid. And where disasters, natural and otherwise, sometimes strike. Rainstorms can cause deadly landslides. Last year, a historic streetcar derailed, killing five people. Earlier this year, three buildings collapsed downtown, killing at least 17.

[…] In real flood conditions, the operations center decides when to set off the sirens. That decision is based on I.B.M.’s system, which uses computer algorithms to predict how much rain will fall in a given square kilometer — a far more precise forecast than standard weather systems provide. When the program predicts heavy rain, the center sends out text messages to different departments so they can prepare.
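
To make the mechanics a bit less magical: stripped down, the alerting part of such a system amounts to a threshold check over gridded rain forecasts. A minimal sketch of what that could look like (the grid cells, the threshold and the messaging function are all invented for illustration; nothing here reflects IBM’s actual system):

```python
# Hypothetical sketch of a per-square-kilometer rain alert pipeline.
# The threshold, grid data and SMS gateway are invented for illustration;
# the NYT article does not describe IBM's actual implementation.

RAIN_ALERT_THRESHOLD_MM = 40.0  # assumed mm/h that counts as 'heavy rain'

def forecast_rainfall(grid):
    """Return predicted rainfall (mm/h) per 1 km^2 cell.

    Stands in for the proprietary forecasting model; here we simply
    read the prediction already attached to each cell.
    """
    return {cell_id: cell["predicted_mm"] for cell_id, cell in grid.items()}

def send_sms(recipient, message):
    # Placeholder for whatever messaging gateway the operations center uses.
    print(f"SMS to {recipient}: {message}")

def dispatch_alerts(predictions, departments):
    """Text every department for each grid cell that exceeds the threshold."""
    for cell_id, mm in predictions.items():
        if mm >= RAIN_ALERT_THRESHOLD_MM:
            for dept in departments:
                send_sms(dept, f"heavy rain predicted in {cell_id}: {mm:.0f} mm/h")

grid = {
    "copacabana-01": {"predicted_mm": 55.0},
    "centro-07": {"predicted_mm": 12.0},
}
dispatch_alerts(forecast_rainfall(grid), ["civil defence", "traffic", "drainage"])
```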

The article lists a number of criticisms of this surveillance-based approach to smart cities:

Some wonder if it is all for show, to reassure Olympic officials and foreign investors. Some worry that it will benefit well-off neighborhoods more than the favelas. Others fear that all this surveillance has the potential to curb freedoms or invade privacy. Still others view the center as a stopgap that does not address underlying infrastructure problems.

All of which seems perfectly summarized by this graffiti that i came across in central Rio last January (two days after the building collapse mentioned in the NYT article above, which took place within 10 minutes’ walking distance of the graffiti).

Smart city graffiti

Makes me wonder if the graffiti artist was referring to a general tendency or to the Operations Center of the City of Rio in particular.

Sweet memories of lego

03 Mar 2011 | 208 words | lego memories technology

In the last couple of days Kevin has been posting an amazing series of images of mid-70s lego sets, which evoke lots of memories for me (and show how ridiculously non-standard lego has become over the last 3.5 decades).

Yesterday Kevin published an ode to the magic of lego and the first set his father gave him in 1975:

lego payloader

[…] Then there were others. Not too many, but others, enough to get the gag. The gag was that it wasn’t a puzzle to solve, after all. The gag was that the puzzle was more interesting unsolved, growing in complexity over time. It’s so fucking corny to write about Legos that I can barely commit to posting this. But these things are true, that I still don’t know what a Payloader does, that even blindfolded I could still construct this vehicle with these pieces, and that discovering that anything can become anything else was a lesson that would be reinterpreted six years later, when a TRS-80 arrived at public school in the Bronx. Most everything in the interim was tv, play, or schoolwork. […]

Nothing to add here, except maybe that i have fond memories of chewing on those rubber tires and still remember how they tasted…

The revolution was televised (and projected from a plastic chair)

12 Feb 2011 | 186 words | egypt photos revolution technology

As one could have expected, the Big Picture has some impressive pictures from the events in Egypt on Thursday and Friday. My favorite picture (love the chair) is this one showing a crowd watching Mubarak’s Thursday night television address:

Also really like the mosaic tile facade of the state television building (especially in combination with the Muntadhar al-Zaidi references) in this shot:

They also have a photo that perfectly illustrates those silly twitter/facebook revolution claims. If this picture says anything in this regard it is that the revolution was largely dependent on mobile phone chargers:

this revolution is powered by mobile phone chargers

And finally this picture gives a pretty good illustration of why the army had no real choice but to side with the people (or at least not to move against them), which is even more true since all those people are the army’s customers.

Nevertheless it should be remembered that Mubarak was an Army (or rather Air Force) man himself before he became president. In this light it might not be the best idea to put all your eggs in the army’s basket.

On risks & rewards (related to sharing metadata)

15 Oct 2010 | 825 words | copyright europeana metadata technology work

In the light of yesterday’s rather confusing and unconstructive discussion about ‘risks and rewards’ at the Europeana ‘open culture 2010‘ conference, i thought that it might be useful to clarify a number of things. If you take a step back from your favourite grievance about copyright/public funding and look at the larger picture, the whole risks and rewards discussion is actually quite simple:

Say you are a cultural heritage institution that wants to make digital representations of cultural artifacts (your ‘content’) from your collection available online. First you will need to ensure that you are actually allowed to offer these digital representations online (for example because the artifacts are out of copyright (in the public domain) or because you have managed to obtain permission from the copyright holders). Once you have succeeded in this you will probably make some descriptive metadata about the objects available alongside the digital representations (if you don’t, it will be very hard for users to find and make sense of them).

Now say you want (or have to1) to work with Europeana. What does that mean for your content and your descriptive metadata? You work with Europeana by contributing descriptions of the digital representations that you want to make accessible via www.europeana.eu. In order to be able to point people to your content Europeana needs these descriptions. In contrast to your metadata, Europeana does not need or want your content2.

Now what happens if you give your descriptive metadata to Europeana? Europeana will use it to point its users to your content. In order to do so, Europeana will transform, combine and enrich your metadata with relevant metadata contributed by other heritage institutions. In order to fully leverage the possibilities offered by the web it also needs to make your metadata available online without restrictions (this is called linked open data, and if you want to understand why this is amazing you should go read the excellent primer on the next-generation Europeana Data Model by Stefan Gradmann et al).
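
For those who have never seen it, here is a deliberately simplified sketch of what such an openly published description might look like. The property names loosely echo Dublin Core and the Europeana Data Model but are illustrations, not the actual schema, and all identifiers are made up:

```python
# Hypothetical sketch of a descriptive-metadata record published as
# linked open data. Property names loosely echo Dublin Core / the
# Europeana Data Model; they are simplified illustrations, not the
# actual schema, and all identifiers are invented.
import json

record = {
    "@id": "http://example.org/object/12345",           # assumed identifier
    "dc:title": "Portrait of an Unknown Woman",
    "dc:creator": "Anonymous",
    "dc:date": "1640",
    "edm:isShownAt": "http://example.org/view/12345",   # points back to *your* site
    "edm:rights": "http://creativecommons.org/publicdomain/mark/1.0/",
}

# Because the record is plain, openly licensed data, anyone (including
# Europeana) can fetch it, merge it with records from other institutions
# and enrich it, while the content itself stays on your own servers.
print(json.dumps(record, indent=2))
```

Note how the record only describes and points to the content; it never contains it, which is the whole point of the content/metadata distinction made above.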

Now that sounds scary: ‘your metadata available without restrictions’. So let’s break down the risks: first of all you lose some control over your metadata since others can work with it as well. This risk squarely falls into the category of a known unknown, since you won’t be able to tell right now if that loss of control will have positive or negative consequences. Secondly there is a risk of losing imaginary revenue3: other parties might somehow generate revenue based on your metadata (this is more of an unknown unknown).

But just as there are risks to making your metadata available without restrictions, there are also rewards: if your metadata is not available on Europeana, users cannot find your content via Europeana. Being findable via Europeana will bring more users to your content, and making your metadata available to Europeana will also result in Europeana enriching your metadata (and you are then free to incorporate the enriched metadata into your own systems or not).

Now all you have to do yourself is decide whether the risks outweigh the rewards. If they do, then you should not make metadata about your content available to Europeana and you will not have to face these risks. On the other hand, if the rewards outweigh the risks, then you should probably make your metadata available to Europeana (and of course you can always experiment with a small portion of your metadata first to see if your cost-benefit analysis is correct; you can also exclude your most valuable metadata or provide subsets).

In the end this really comes down to this: Europeana is a search engine that will help people find your content based on the descriptions of the content that you provide to Europeana. If you don’t provide descriptions your works can’t be found via Europeana and you do not have to face any of the risks described above4.


  1. This depends a bit on how you look at this: if you have accepted funds to digitize your content under the condition that you make it available via Europeana, then of course you had the choice not to take those funds. ↩︎

  2. Your content is very similar to the secret code that comes with your debit card: your bank has no reason to ever ask you for your secret number, and Europeana has no reason to ever ask you for your content. If they do, something fishy is going on and you should alert the competent authorities. ↩︎

  3. Imaginary revenue is always bigger than actual revenue. Imaginary revenue is created by fantasizing about a yet undefined customer showing up and offering lots of revenue for something that so far you have failed to monetize yourself. ↩︎

  4. Depending on how Europeana grows, not making your descriptive metadata available might also carry a risk: you might become less relevant as an institution. ↩︎

Stock trades, art and algorithms

26 Sep 2010 | 686 words | algorithms art economy future modernity technology

If you ask me, one of the more fascinating things going on out there right now is high-frequency trading. High-frequency trading (HFT) occurs when traders program computers to buy and sell stocks (or other financial products) in quick succession under certain pre-defined circumstances (a good starting point to learn more about HFT is this Planet Money episode or this ai500 article by Joe Flood).
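
To get a feel for what ‘pre-defined circumstances’ means in practice, here is a toy sketch of a rule-based trading bot. Everything about it is invented for illustration: real HFT systems operate on microsecond timescales with far more sophisticated strategies and infrastructure.

```python
# Toy illustration of rule-based automated trading: buy when the price
# dips noticeably below a moving average, sell when it rises above it.
# Entirely invented; real HFT strategies look nothing like this.
from collections import deque

class MeanReversionBot:
    def __init__(self, window=20, edge=0.002):
        self.prices = deque(maxlen=window)  # rolling window of recent prices
        self.edge = edge                    # deviation required before acting

    def on_tick(self, price):
        """Return a trading decision for the latest price tick."""
        self.prices.append(price)
        if len(self.prices) < self.prices.maxlen:
            return "hold"  # not enough history yet
        avg = sum(self.prices) / len(self.prices)
        if price < avg * (1 - self.edge):
            return "buy"   # price looks 'too low' relative to recent average
        if price > avg * (1 + self.edge):
            return "sell"  # price looks 'too high' relative to recent average
        return "hold"

bot = MeanReversionBot()
ticks = [100.0, 100.1, 99.9] * 7 + [99.5, 100.7]  # fabricated price feed
decisions = [bot.on_tick(p) for p in ticks]
print(decisions[-2:])  # ['buy', 'sell'] once the dip and the spike arrive
```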

Apparently high-frequency trading enables successful trading firms to skim an enormous surplus off these transactions (up to 1 million USD per day according to the Planet Money episode mentioned above). Not surprisingly this behavior can also act as a destabilizing factor wreaking havoc on stock markets. It was one of the contributing factors to the ‘flash crash‘ which saw the Dow Jones index plunge nearly 1,000 points in a matter of minutes on the 6th of May 2010.

If you believe Wikipedia (which of course you should not), high-frequency trading is currently responsible for 70% of the equity trading volume in the US. Needless to say, the practice is generating a fair share of controversy among economists.

At the core of this controversy are the merits of HFT: does it make macro-economic sense (because it ensures the liquidity of markets and limits market volatility) or is it detrimental to the economy at large (because it extracts value from markets based on no other fact than that prices tend to move)?

While this debate is going on, it appears that there are even stranger things occurring in the field of high-frequency trading: in August the Atlantic reported on research undertaken by a market data firm called Nanex that unearthed trading patterns that do not seem to make sense even by the high obfuscation standards of HFT. The article in the Atlantic claims that these strange patterns are the result of ‘mysterious and possibly nefarious trading algorithms’ whose ways and reasons of operation are known to no-one:

Unknown entities for unknown reasons are sending thousands of orders a second through the electronic stock exchanges with no intent to actually trade. Often, the buy or sell prices that they are offering are so far from the market price that there’s no way they’d ever be part of a trade. The bots sketch out odd patterns with their orders, leaving patterns in the data that are largely invisible to market participants.

When you visualize this you get something like this (graphs by Nanex):

According to the Atlantic it is unclear what exactly causes these patterns to emerge. The Nanex researchers have come to the conclusion that these algorithms are most likely an attempt by trading firms to introduce noise into the marketplace in order to realize a competitive advantage:

Other firms have to deal with that noise, but the originating entity can easily filter it out because they know what they did. Perhaps that gives them an advantage of some milliseconds. In the highly competitive and fast HFT world, where even one’s physical proximity to a stock exchange matters, market players could be looking for any advantage.
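
If this noise theory is true, the tell-tale sign of such orders is that they are priced where they can never execute. A crude sketch of how one could flag them in an order feed (the data, the reference price and the tolerance are all made up; Nanex’s actual analysis of exchange feeds is far more involved):

```python
# Crude sketch of flagging 'noise' quotes: orders priced so far from the
# market that they could never plausibly execute. All data and thresholds
# are invented; Nanex's actual feed analysis is far more involved.

MARKET_PRICE = 100.0
MAX_PLAUSIBLE_DEVIATION = 0.05  # assumed: >5% away from market never fills

orders = [
    {"id": 1, "price": 100.01},
    {"id": 2, "price": 142.50},  # far above market: noise candidate
    {"id": 3, "price": 61.00},   # far below market: noise candidate
    {"id": 4, "price": 99.98},
]

def is_noise(order, market=MARKET_PRICE, tol=MAX_PLAUSIBLE_DEVIATION):
    """True if the order's price deviates implausibly far from the market."""
    return abs(order["price"] - market) / market > tol

print([o["id"] for o in orders if is_noise(o)])  # -> [2, 3]
```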

On the other hand there are much more poetic explanations for the emergence of these patterns, which abandon the idea that they serve a purpose altogether:

On the quantitative trading forum, Nuclear Phynance, the consensus on the patterns seemed to be that they simply just emerged. They were the result of “a dynamical system that can enter oscillatory/unstable modes of behaviour,” as one member put it. If so, what you see here really is just the afterscent of robot traders gliding through the green-on-black darkness of the financial system on their way from one real trade to another.

Whatever they are, these patterns are also outright beautiful. The above visualizations remind me of the work of German artist Jorinde Voigt, whose stunning drawings (pdf) often rely on algorithms as a source:

Konstellation Algorithmus Adlerflug (100 Adler, Strom, Himmelsrichtung, Windrichtung, Windstärke) - Jorinde Voigt, Berlin, October 2007

p.s.: Sara says that these stealth trading bots remind her of the tiger in Jonathan Lethem’s Chronic City instead. p.p.s.: Also just finished reading ‘The Fires‘ by the aforementioned Joe Flood. Brilliant book, highly recommended.

Apparently the Dutch had video telephony well before i was even born...

15 Apr 2010 | 95 words | design history modernity technology media

Last year i spent a fair amount of energy getting the open video platform openimages.eu off the ground, but so far the videos that have been uploaded there (mainly from the Polygoon collection of the Institute for Sound and Vision) have utterly failed to impress me.

Now, after 4 months of operation, there is finally a video that i can approve of. It has the perfect combination of techno-optimism, cutting-edge design and sideburns:


‘first test with videophone’ on openimages.eu / cc-by-sa

(UK National Portrait Gallery vs. Wikimedia) vs. the Public Domain

26 Jul 2009 | 1720 words | copyright public domain technology

You might have heard that the Wikimedia Foundation and the National Portrait Gallery are having a bit of a row these days. At the core of the dispute lies the fact that in March an English Wikipedia administrator by the name of Derrick Coetzee uploaded more than 3000 high-resolution images of paintings held by the National Portrait Gallery to the Wikimedia Commons.

The images uploaded by Coetzee were not simply taken from the NPG’s website and re-uploaded to the Wikimedia Commons, as the NPG does (and did) not provide high-resolution image files on its website. While the NPG website only offers relatively low-resolution images (see this page for a typical image provided by the NPG and this page for the high-resolution version uploaded by Coetzee), Coetzee managed to use the website’s Zoomify feature (now disabled) to obtain the high-resolution files and subsequently uploaded them to the Wikimedia Commons.
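
For those wondering how this works technically: a Zoomify-style viewer serves a large image as a pyramid of small JPEG tiles, which means the full-resolution image can be reassembled by fetching every tile at the deepest zoom level and pasting them onto one canvas. A hedged sketch of the general technique (the URL, zoom level and grid dimensions are invented; the NPG’s actual setup may have differed):

```python
# Sketch of reassembling a tiled 'zoomable' image. Zoomify-style viewers
# name tiles "{zoom}-{column}-{row}.jpg" inside TileGroup folders; the
# base URL, zoom level and grid size below are invented for illustration.
import io
import urllib.request

from PIL import Image  # pip install Pillow

BASE = "http://example.org/zoomify/portrait"  # hypothetical tile root
TILE = 256                                    # common Zoomify tile size
ZOOM, COLS, ROWS = 5, 12, 16                  # assumed deepest zoom level

canvas = Image.new("RGB", (COLS * TILE, ROWS * TILE))
for row in range(ROWS):
    for col in range(COLS):
        # TileGroup numbering is simplified away here; real Zoomify
        # spreads tiles over TileGroup0, TileGroup1, ... in blocks of 256.
        url = f"{BASE}/TileGroup0/{ZOOM}-{col}-{row}.jpg"
        with urllib.request.urlopen(url) as resp:
            tile = Image.open(io.BytesIO(resp.read()))
        canvas.paste(tile, (col * TILE, row * TILE))
canvas.save("portrait_full.jpg")
```

In other words: the high-resolution files were already being served to every visitor’s browser, just in small pieces.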

While the NPG does not dispute that the original paintings are in the public domain, it argues that Coetzee’s actions violate a number of legal regimes that give the National Portrait Gallery the exclusive right to determine how these reproductions are distributed. According to an email sent by the NPG’s law firm to Coetzee, his actions constitute an infringement of the NPG’s copyright in those images as well as an infringement of the NPG’s database right (in the database populated by these works). In addition the NPG argues that Coetzee’s actions constitute an unlawful circumvention of technical protection measures (even though Zoomify clearly states that Zoomify is not an image security system) and a breach of contract. While all of these are serious allegations (the last one is a bit silly if you ask me), the current debate very much centers on the question whether the National Portrait Gallery should have a copyright in these images. In a post on boingboing.net Cory Doctorow lays out why this is such a fundamental question:

In Britain, copyright law apparently gives a new copyright to someone who produces an image full of public domain material, effectively creating perpetual copyright for a museum that owns the original image, since they can decide who gets to copy it and then set terms on those copies that prevent them being treated as public domain.

Regardless of the fact that this is obviously problematic, the general consensus seems to be that under British copyright law the NPG does indeed hold a copyright in the photographic reproductions (because making the reproductions of these paintings required a significant expenditure of labour), while under US law (the Wikimedia Foundation is based in the US) it does not.

So on one side we have the NPG claiming that its copyrights have been violated and that Coetzee/Wikimedia should therefore remove the high-res images from the Wikimedia Commons, and on the other side we have Coetzee (backed by the Wikimedia Foundation, many Wikipedians and Creative Commons) claiming that these images belong to the public domain and do not need to be removed. The Wikimedia Foundation’s Erik Möller has outlined this position on the Wikimedia Foundation’s blog:

The Wikimedia Foundation has no reason to believe that the user in question has violated any applicable law, and we are exploring ways to support the user in the event that NPG follows up on its original threat. We are open to a compromise around the specific images, but our position on the legal status of these images is unlikely to change. Our position is shared by legal scholars and by many in the community of galleries, libraries, archives, and museums. In 2003, Peter Hirtle, 58th president of the Society of American Archivists, wrote:

“The conclusion we must draw is inescapable. Efforts to try to monopolize our holdings and generate revenue by exploiting our physical ownership of public domain works should not succeed. Such efforts make a mockery of the copyright balance between the interests of the copyright creator and the public.” [source]

Some in the international GLAM [pk: Galleries, Libraries, Archives and Museums] community have taken the opposite approach, and even gone so far as to suggest that GLAM institutions should employ digital watermarking and other Digital Restrictions Management (DRM) technologies to protect their alleged rights over public domain objects, and to enforce those rights aggressively.

The Wikimedia Foundation sympathizes with cultural institutions’ desire for revenue streams to help them maintain services for their audiences. And yet, if that revenue stream requires an institution to lock up and severely limit access to its educational materials, rather than allowing the materials to be freely available to everyone, that strikes us as counter to those institutions’ educational mission. It is hard to see a plausible argument that excluding public domain content from a free, non-profit encyclopedia serves any public interest whatsoever.

I completely agree with the position taken by the Wikimedia Foundation here. It should not be possible to monopolize public domain works by obtaining copyrights in simple (or even complicated) reproductions of these works. Once the copyrights in the original works have expired, those who formerly held the copyright or those who happen to own the physical works should not have any exclusive right to determine what third parties can do with reproductions of these works. As far as i am concerned this is one of the fundamental principles of the public domain, which cannot be pushed aside by museums in search of online business models1.

However i have the feeling that this principle is not the only thing that should be considered in the current dispute. It is likely that in this particular case the National Portrait Gallery did not knowingly publish the high-resolution photos of these portraits:

Assuming a certain level of technological ignorance on the part of the NPG, it is fairly safe to assume that they thought they were only making available 500 * 400 pixel images and allowing users of the website to see 500 * 400 px sections of the paintings in high resolution. Before Coetzee proved them otherwise the NPG probably never realized that this meant that the entire high-res files needed to be on a web server somewhere2.

Does the public domain status of the original paintings require the NPG to make the high-res photos available? As far as i can see it does not. The public domain status of these paintings means that nobody has the right to control their reproduction or the publication of reproductions anymore, but it does not mean that all reproductions of these pictures must be freely distributed. Just as i can take a photo of a public domain work and keep it for myself, the NPG can decide to take these pictures and then do with them whatever they please3: there simply is no right of access to public domain works or their reproductions.

If you consider this, it is a little bit easier to understand the position of the NPG. They never knowingly published the high-res versions of these images, and all of a sudden they appear on Wikipedia and there does not seem to be a way to control their distribution anymore. At this point it is very much a theoretical question whether the NPG has a right in these images or not, because the images are out on the net and there is absolutely no way for anyone to regain control over them ever again (regardless of how the legal dispute will end).

However it is important to note that before these images got out onto the net the NPG did not try to control their distribution by asserting copyright but simply by not making them available, knowing (one assumes) that once they were available their copyright claims would be without much effect, no matter how strongly they are backed by British law.

Given all of this i do think that it would be counterproductive to use this particular case to defend the principle that there should be no right of exclusive control over the distribution of reproductions of public domain works (as the blog post by Erik Möller implies). Instead this dispute is really about access control to these files.

If one is really interested in getting as many good-quality reproductions of public domain works online as possible, then it is necessary to work with cultural heritage institutions by convincing them that making these files available without restrictions is in their best interest (as a number of Wikipedia volunteers argue in this excellent open letter).

Working with cultural heritage institutions means that contributing to repositories of freely licensed and public domain works such as the Wikimedia Commons should always be based on conscious and voluntary decisions by those in a position to make material available4. There is a growing number of examples of such behavior, and it is probably only a question of time (and hard work on the part of Wikipedians) before more cultural heritage institutions recognize that making their collections available, rather than keeping them locked away in search of marginal income from licensing, is more likely to strengthen their position in the digital environment.

It might very well be counterproductive to insist that the images obtained by Coetzee are not ‘protected’ by copyright, as this is likely to make cultural heritage institutions feel even more threatened by public domain advocates. Instead this energy should be focussed on convincing cultural heritage institutions that it is in their best interest to make their collections available as openly as possible.


  1. I am currently working on a Public Domain manifesto that outlines this and other principles. This partially explains why i think that this is such an important dispute. ↩︎

  2. Which once again demonstrates that DRM cannot work, that there is no real difference between downloading and streaming, and that if you really want to keep something for yourself the most stupid thing you can do is to store it on a device connected to the web, no matter how much ‘security’ may be involved. ↩︎

  3. I do however think that the NPG has the moral (and statutory) obligation to make these images available as they are a public institution funded with public money, but that is an entirely different line of argumentation. ↩︎

  4. Again, publicly funded institutions have the moral (and should have the statutory) obligation to make public domain works available, but that is a different line of argumentation. ↩︎

The end is VERY near...

07 Jul 2009 | 36 words | books technology stupidity

This is by far the most repulsive patent application i have ever seen. Apparently amazon.com has filed for a patent on showing contextual ads in the margins (also called white space for a good reason!) of e-books…

print/pixel

16 May 2009 | 830 words | media publishing technology

Spent the last two evenings at the print/pixel international conference on the current shifts in print and online media production (warning: most dysfunctional conference website ever), organized by the research project Communication in a Digital Age at the PZI in Rotterdam. Most of the discussions were (not surprisingly) about business models for print publications and the role of print journalism in the current technological and economic climate. Unfortunately the whole thing did not really result in many new insights (& none of the speakers i really wanted to hear showed up).

At times the discussion reminded me very much of the discussions around file sharing of music from a couple of years ago. At that time the music industry was claiming that it needed to be in control of distribution (and thus eliminate file sharing) because it had a god-given task to do the (expensive) work of discovering new artists and transforming some of them into stars, which could only be done as long as it was able to extract enough surplus from distributing recorded music.

Now the print people are structurally repeating the same argument: self-distribution via the internet must be limited because (small) publishing houses use part of their revenue to filter the quality texts from the much bigger reservoir of general text production. The argument goes that in order for this quality filter to survive, authors must continue to publish through publishers instead of relying on internet distribution.

Of course this line of reasoning is not only stupid but also incredibly arrogant. If i were a publisher i would rather invest my energies into figuring out what i have to offer authors and readers once the perfect shitstorm of cheap generic electronic paper reading devices1 and wide availability of e-books on the file-sharing platforms breaks loose. Judging from the discussions at print/pixel it appears that very few people are seriously preparing for this inevitable scenario.

The entire event reminded me of a passage in the first half of David Foster Wallace’s ‘Infinite Jest‘ (which i have almost finished reading by now). It describes a fictional future scenario in which the digital revolution has re-structured the economics of entertainment (= news) in a completely different way from the situation as it was discussed at print/pixel2. When it was published in 1996, Infinite Jest was situated in a near future that roughly corresponds with the current present (predictably, Wikipedia has an entire section of the ‘Infinite Jest’ article devoted to the question of which year in the book corresponds to which year in the Gregorian calendar).

In the scenario described in Infinite Jest, television has been replaced by a distribution mechanism for entertainment products (called cartridges) called InterLace. InterLace is also the name of the company that has complete control (i.e. a monopoly) over the distribution of entertainment cartridges. People pay for the on-demand consumption of cartridges (and those who can’t afford it re-watch old cartridges all the time). In the book the emergence of InterLace is described in a term paper written by the main character Hal Incandenza, a marijuana-addicted junior tennis player (pages 410-418 in the Back Bay 10th anniversary paperback edition).

According to Hal’s paper, the shift from network television to cable television and then to InterLace as the dominant entertainment medium started when advertisements on network television got so repulsive that viewers abandoned network television in favor of cable television, which offered more choice and less advertisement. While the cable TV stations rested on their newfound popularity, a video rental chain owner called Noreen Lace-Forché bought up the production facilities of the bankrupt TV networks and used them to produce content for InterLace, which offers pre-produced on-demand entertainment. From the consumer perspective the appeal of InterLace over cable television is that one has complete control over the choice of programming (i.e. one can select from a huge library) without any advertisement. In the scenario described in Infinite Jest this is a business proposition that people are happy to pay for, which not only puts broadcast and cable television out of business but also eliminates most of the advertising industry.

Now the obvious difference with current reality is that today there is no monopoly provider of entertainment (or news) products that can afford to set a price for access. Instead we have the open internet offering free access to almost all entertainment (or news), and one of the very few business models that offers some relief to producers is online advertisement. Makes me wonder if we would not be better off in the Infinite Jest scenario, paying for our entertainment and news and being spared the advertising onslaught…


  1. In the short term this probably means jail-broken Kindles that display standard .pdf and .epub files. ↩︎

  2. The free availability of news in real time all over the internets undermines the very structures involved in producing this news, and in the long run will affect the quality and diversity of the news available. ↩︎

Dark fibre

23 Apr 2009 | 323 words | copyright india film media photos piracy technology

In March we spent a week in Bangalore with Jamie and the Dark Fibre crew. We had flown there to take pictures of them while they were shooting for Dark Fibre (more pictures will become available later).

Dark Fibre crew at work on the rooftop terrace of an IT office building in South Bangalore

It was fun and extremely interesting to watch the production from behind the scenes, and i am really looking forward to the film (Jamie has promised that there will be a trailer on the 13th of May). In the meanwhile there is an interview with Jamie and his co-director Peter Mann on the website of the Centre for Internet and Society in Bangalore:

‘Dark Fibre’ is set amongst the cablewallahs of Bangalore, and uses the device of cabling to traverse different aspects of informational life in the city. It follows the lives of real cablewallahs and examines the political status of their activities. The fictional elements arrive in the form of a young apprentice cablewallah who attempts to unite the disparate home-brew networks in the city into a grassroots, horizontal ‘people’s network’. Some support the activity and some vehemently oppose it — but what no one expects is the emergence of a seditious, unlicensed and anonymous new channel which begins to transform people’s imaginations in the city. Our young cable apprentice is tasked with tracking down the channel, as powerful political forces array themselves against it. Not only the ‘security’ of the city, but his own wellbeing depend on whether he finds it, and whether it proves possible to stop its distribution. Meanwhile, mysterious elements from outside India — possibly emissaries of a still-greater power — are appearing on the scene. This quest for the unknown channel is reminiscent of a modern-day ‘Moby Dick’, with the city of Bangalore as the high seas and our cable apprentice a reluctant Ahab. The action is a combination of verite, improvisation and scripted action.

meanwhile... is the personal weblog of Paul Keller. I am currently policy director at Open Future and President of the COMMUNIA Association for the Public Domain. This weblog is largely inactive but contains an archive of posts (mixing both work and personal) going back to 2005.

I also maintain a collection of cards from African mediums (which is the reason for the domain name), a collection of photos on flickr and a website collecting my professional writings and appearances.
