Have a read of this article and try to put aside the fact that a picture of Auschwitz was wrongly labelled as "sport".
Actually look at what Flickr has tried to achieve. It's built a system that can look at an image and generalise its overall theme. For humans this is such a simple task, but an analytical machine is essentially working from something like a large collection of binary decision trees. Imagine a game of 20 questions where you're only playing with a finite set of images, and where concepts like colour are things you don't really understand.
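To make the 20-questions analogy concrete, here's a toy sketch of that kind of decision tree. Everything in it is made up for illustration: the feature names and tags are hypothetical, not anything Flickr actually uses, and a real system would learn thousands of such questions from data rather than have them hand-written.

```python
# Toy "20 questions" tagger: a hand-built decision tree over crude,
# hypothetical yes/no image features. Each branch is one "question".
def guess_tag(features):
    # features: dict of made-up properties a machine might extract
    if features.get("mostly_green"):
        if features.get("has_straight_lines"):
            return "jungle gym"  # greenery plus man-made structure
        return "jungle"          # greenery alone
    if features.get("has_faces"):
        return "people"
    return "unknown"             # ran out of questions

print(guess_tag({"mostly_green": True, "has_straight_lines": True}))
```

You can see how such a tree goes wrong: a sombre photo that happens to contain greenery and climbing-frame-like structures falls down the same branch as a playground, because the machine only has its crude questions, not the context a human brings.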
I think it's quite impressive that the majority of images have been labelled, depending on your opinion, more or less correctly. This article doesn't focus on that, though, and that is a fair enough point. The pictures should never have been labelled with such tags, but this system is still in testing, and the way the system learns is from its mistakes. Each time the system is told it was wrong, it learns, and the next time a similar image is shown it will have a better chance of labelling it correctly.
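The learn-from-corrections loop can be sketched very simply. This is an assumed mechanism for illustration only, nothing like Flickr's actual pipeline: a tagger that keeps vote counts linking crude (hypothetical) features to tags, and strengthens an association each time a user corrects it.

```python
from collections import defaultdict

class CorrectableTagger:
    """Toy tagger that improves when users correct its mistakes."""

    def __init__(self, default="unknown"):
        self.default = default
        # feature -> tag -> number of times users confirmed that pairing
        self.votes = defaultdict(lambda: defaultdict(int))

    def predict(self, features):
        # Sum the votes for every tag across the image's features.
        scores = defaultdict(int)
        for f in features:
            for tag, count in self.votes[f].items():
                scores[tag] += count
        return max(scores, key=scores.get) if scores else self.default

    def correct(self, features, right_tag):
        # A user supplies the right tag; strengthen that association.
        for f in features:
            self.votes[f][right_tag] += 1

tagger = CorrectableTagger()
img = ["grey_buildings", "railway_tracks"]  # hypothetical features
tagger.predict(img)               # "unknown" before any feedback
tagger.correct(img, "memorial")   # user fixes the label
tagger.predict(img)               # now "memorial"
```

The point of the sketch is just the loop: wrong guess, human correction, better guess next time, which is why pulling the system entirely would also cut off the feedback it needs.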
I hope that Flickr doesn't take the system down completely, but it may want to remove it from the public eye for the time being and let a smaller group of people help maintain it and help it learn.
Instead of shouting at a product that made some mistakes, people should help push this field forward. Image recognition research will be badly hurt if Flickr is pressured into stopping completely because of this publicity.
Flickr is facing a user revolt after a new auto-tagging system labelled images of black people with tags such as “ape” and “animal” as well as tagging pictures of concentration camps with “sport” or “jungle gym”. The system, which was introduced in early May, uses what Flickr describes as “advanced image recognition technology” to automatically categorise photos into a number of broad groups.