Revision as of 14:56, 21 June 2013
General
- There's a minor typo: "free-content remit than that that of supporting". Mohamed CJ (talk) 00:34, 21 June 2013 (UTC)
- Fixed, thank you! Don't be afraid to fix obvious typos like that yourself. :-) Ed 01:07, 21 June 2013 (UTC)
- MichaelMaggs's article doesn't really tell us his/her opinion on sexual images, while Mattbuck is basically supporting the current status quo. My main interest in Commons is uploading images for use on Wikipedia, but I also upload images knowing that they may never be used here. Although I !voted for Muhammad images to be used in the article, I find the excessive collection of seemingly useless sexual content on Commons disruptive, repellent and pointy. Mohamed CJ (talk) 01:04, 21 June 2013 (UTC)
- No one is supporting the status quo. The theme of both responses is: "If you have a problem with Commons, engage with us instead of sniping from afar." Which is good advice. Powers 01:22, 21 June 2013 (UTC)
- I can see how "Come talk to us" is a definite vote for the status quo, since it seems the only well-developed policy on Commons is "Your problem isn't our problem". These responses consider my op-ed scathing for Commons, and maybe I did use harsh language in places, but ultimately, none of them address how Commons can continue to assert policy autonomy while still serving the inter-wiki media sharing function. My op-ed offered a solution that in my eyes is win-win, allowing autonomy for Commons, and removing the repercussions of Commons-local policy or lack thereof from the other projects Commons serves. Gigs (talk) 02:36, 21 June 2013 (UTC)
- "Us" indeed--Commons is marked by cliquishness and instinctive antipathy. Drmies (talk) 04:18, 21 June 2013 (UTC)
- We all have home wikis, mine is Commons. I was posting as a Commons administrator, about a Commons issue, to an audience of people who are not Commons users. What other pronoun would I use? -mattbuck (Talk) 12:16, 21 June 2013 (UTC)
- I'm pleased to see these two responses to the original op-ed. I won't comment on the sexual images debate, but otherwise the responses reflect my long-held view that Commons is not just a source of images for Wikimedia projects, but also a useful library of free-licence images, etc, that anyone can use. I have uploaded many of my own images to Commons. Although I often immediately add an image I have uploaded to an appropriate Wikipedia article, that is not always the case, as I also frequently upload images on the basis that someone might find them useful somewhere at some time in the future. If I have one criticism of Commons, it is that many of the images there are not of particularly high quality. But I suspect that that problem is merely one of many reasons for us to upload better images to Commons than many of the ones that are already there. Bahnfrend (talk) 02:03, 21 June 2013 (UTC)
Contentious images
I do not see any issue with contentious images being filtered by choice, as it is all personal preference. One technical way to resolve this is to tag those photos and let users decide what photos they want to see. For example, tag the "contentious images" in broad categories, e.g. sexually explicit, violence, gory, viewer's discretion. Ordinary users would have to explicitly tick an option for search results to include photos that fall under such a category; otherwise only untagged photos would be displayed. This is not censorship, because it is the users' own decision what they want to see, without being forced to see photos that they do not want to see in the first place. Users have the right to choose. Why should a group of admins decide on behalf of users what photos they must or cannot see? User power. Yosri (talk) 01:26, 21 June 2013 (UTC)
- Unfortunately, this would be censorship, because it enables entities other than the user to manipulate the classification system and forcibly impose it through technical and/or punitive means. — C M B J 01:32, 21 June 2013 (UTC)
- The tagging must not be done arbitrarily or by a single person; it must follow guidelines set by committee or voting. Users can still choose to see or ignore. Why should somebody force me to see things that I do not want to see? Yosri (talk) 01:42, 21 June 2013 (UTC)
- Consider the following scenario. The New Foo State Education Agency (NFSEA) boasts a promise of zero tolerance for prohibited activities and commissions a task force to implement the policy across all educational institutions in New Foo. NFSEA's task force then concludes that content-control software will be necessary to enforce a provision that forbids, among other things, accessing online pharmacies. Accordingly, the task force recommends acquisition of a compliant software suite; one such suite, "Foo Filter", is noted as being a government off-the-shelf product made available through an NFSEA-approved vendor, FuTek. The NFSEA then negotiates with FuTek and procures a license to use Foo Filter over the next ten fiscal years. The NFSEA deploys Foo Filter at all educational institutions across New Foo, including institutions ranging from elementary schools to public research universities, then concludes that the implementation is complete. Everything seems to be in order and life goes on as usual. Several weeks later, class is back in session at Foo University, and Joseph, a sophomore at FU, is at the computer science laboratory reading Wikipedia articles pertaining to an upcoming assignment he has on human rights. He is particularly moved by Abu Ghraib torture and prisoner abuse and plans to make his presentation on the subject. However, upon visiting that article, he soon realizes that all of the article's twelve images are inaccessible except one: File:Navy consolidated brig -- Mirimar CA.jpg. He raises the point with a member of the laboratory's staff and asks her why students aren't allowed to access these images. "I'm sorry," she says, "these images are restricted because Foo Filter automatically blocks all images classified as offensive in nature." Joseph replies, "But doesn't that go against the idea of free speech?"
"Yes," she says, "but Foo Filter's use is mandated on all state campuses and there are stiff penalties for noncompliance." "So you're saying that I'm going to have to walk back to my dorm if I want to use one of these images in my presentation?" "No," she says, "the dormitories actually use the same network, so you won't be able to access it there, either." "Ridiculous," Joseph says. "Rules are rules," she says with a sigh. — C M B J 13:37, 21 June 2013 (UTC)
- That was really long and unhelpful. -mattbuck (Talk) 14:16, 21 June 2013 (UTC)
- Sorry that it did not resonate well with you. The point was to illustrate the concept that I outlined above, which is that the practical effects of an optional filter extend beyond that of user choice. — C M B J 14:27, 21 June 2013 (UTC)
- If someone wanted to prevent others from seeing parts of Wikipedia, they could simply block all of WP or block all images from WP. You describe a scenario in which only specific images are blocked - which is worse? Delicious carbuncle (talk) 14:56, 21 June 2013 (UTC)
- Don't Google and other large image-hosting websites do the same? It's common sense. Mohamed CJ (talk) 06:57, 21 June 2013 (UTC)
- What is amazing is just how little interest there is in technical mechanisms to allow users to control what they find offensive. Knowing very little about JavaScript, I wrote up a tiny little script that actually hid all the images in Muhammad. This was a proof of principle of an idea I had gone on about at considerable length in User:Wnt/Personal image blocking. We could allow people who are offended to form networks and transclude together huge lists of blacklisted images, doing so collaboratively without requiring any Official View of what is a Bad Image. It doesn't seem like that is of any interest to anyone, though. Despite talk of people being offended, the cause seems to be more about trying to win power to affect what other people see. If you don't have personal choice of what to block and whose blocklists to transclude into your own - if you have a project-wide set of categories to include and exclude content - then inevitably people will disagree on those categories, and someone has to so regretfully place himself in charge of saying who is right and who has to be banned to keep him from disagreeing with the others. Wnt (talk) 07:54, 21 June 2013 (UTC)
- Decentralizing this sort of scheme is actually the best suggestion I've heard yet, although it does still worry me that if lists gain enough popularity they will be used to do harm. If China catches wind of such a scheme, for example, they could exploit our work to easily block access to more legitimate content than would otherwise be feasible. There are also considerations in places like Iran, where homophobic lists, for example, could theoretically contribute to persecution efforts. — C M B J 13:42, 21 June 2013 (UTC)
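As a rough illustration of the decentralized scheme Wnt describes, each reader would merge whichever blocklists they personally subscribe to and filter only their own view of a page. This is a hypothetical sketch of the idea only, not Wnt's actual script or any real blocklist format; the function names and file titles are invented for illustration:

```javascript
// Hypothetical sketch of decentralized personal image blocking: a reader
// subscribes to blocklists maintained by other users, merges them
// client-side, and hides matches from their own view only.

function mergeBlocklists(lists) {
  // Union of all subscribed lists; no central authority decides
  // which images count as "bad".
  const blocked = new Set();
  for (const list of lists) {
    for (const title of list) {
      blocked.add(title);
    }
  }
  return blocked;
}

function visibleImages(pageImages, blocked) {
  // Keep only images this particular reader has not opted to block.
  return pageImages.filter((title) => !blocked.has(title));
}

// Two community-maintained lists a reader has chosen to transclude:
const listA = ["File:Example1.jpg", "File:Example2.jpg"];
const listB = ["File:Example2.jpg", "File:Example3.jpg"];
const blocked = mergeBlocklists([listA, listB]);

// Images used on the page being viewed:
const page = ["File:Example1.jpg", "File:Example4.jpg"];
console.log(visibleImages(page, blocked)); // ["File:Example4.jpg"]
```

In an actual user script the last step would hide the matching image elements in the rendered page rather than filter an array, but the design point is the same: both the lists and the decision to apply them stay entirely in the reader's hands.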
Still historically inaccurate
One of the truly great tragedies of medieval England was not so much the tragedy of the commons in its original sense but the forcible enclosure by powerful outside interests of the historic common land that had for centuries been available as a free resource for all. - no, that's still wrong. But whatever. Volunteer Marek 03:24, 21 June 2013 (UTC)
Opinion from Dcoetzee
I came to this party a bit late so I didn't submit an op-ed, but wanted to give my thoughts briefly. I'm a long-time administrator on both Commons and English Wikipedia, and I refer to both as home wikis. There is substantial overlap between us in the area of image curation and dealing with media licensing issues - I have seen a lot of great work going into Wikipedia:Possibly unfree files here, and Commons itself is quite reliant upon the excellent table at Wikipedia:Non-U.S. copyrights. I believe many of the image admins here on En would be great Commons admins, and vice versa. On the other hand, Commons' understanding of copyright law, U.S. and international, and the policies surrounding it is in many ways more nuanced than En's, with extensive pages on issues like non-copyright restrictions, de minimis, and freedom of panorama, and as such it's no surprise that not everyone who excels here has the specialist understanding to administrate content on Commons.
But the main thrust of the original essay was, as these responses suggest, about scope. I want to emphasize what MichaelMaggs referred to as the "small proportion of our holdings that relate to sexual imagery and to privacy/the rights of the subject". Commons does receive a lot of low-quality penis uploads by white first-world males, for whatever reason, and we purge these without prejudice; this inspired the part of the scope policy reading: "poor or mediocre files of common and easy to capture subjects may have no realistic educational value, especially if Commons already hosts many similar or better quality examples." At the same time, Commons struggles to acquire a variety of high-quality and/or distinctive media on sex, anatomy, and pornography topics, such as medical images, images of non-whites or women, documentary photographs and videos of sexual acts, portraits of porn stars, and so on. Contrary to the moral panic that frequently surrounds the presence of sexual content at Commons, we actually need a lot more of it, just the right kind.
Our policy on photographs of identifiable people addresses many of the typical cases where a person's image may be used unethically, particularly images taken in a private setting without consent. In addition to this, there is a de facto policy that persons who request deletion of an image of themselves, which is not in use or is easily replaced by another image, typically have their request honored (we call this "courtesy deletion"). We also provide courtesy deletion in some cases when users make it clear that they didn't understand the meaning of the free license at the time they used it. Photos of people online can damage reputations and be very disturbing, so we take these kinds of issues very seriously, and always weigh the benefit of works to the public carefully against the risk to the individual.
That said, much of this practice is encoded only as folk knowledge gained through experience, and deserves more thorough documentation as official policies and guidelines. Policy development on Commons can be a struggle, with a small number of users split among a huge number of tasks, and in many cases practice shifts before policy comes along to document it, as has happened with the more aggressive deletion of URAA-violating images, or with the 2257 tag for sexually explicit works. But when policy development founders, it is not through lack of attention so much as because new policies have to be effective at carving out a new area that is not adequately addressed by our core policies. As anyone who's frequented Wikipedia talk:Criteria for speedy deletion would know, rules that seem intuitive are often found to have important exceptions. As an international project, it's also important that policies on Commons are culturally neutral and guided by the common needs of all projects.
Part of the misunderstanding between Commons and other projects arises because of poor communication: we sometimes delete files without warning users on local projects, or without fully explaining to them the intricacies of the laws that require the deletion; we sometimes do not delete works that one project finds inappropriate for its local culture but others find useful. I think an important part of our mission going forward should be to communicate our intentions and rationales to all affected parties at all times. Dcoetzee 04:41, 21 June 2013 (UTC)
Good only in speech
Although they (Commons admins) are great philosophers in speech, they never miss a chance to humiliate someone, neglecting all personality rights. JKadavoor Jee 11:46, 21 June 2013 (UTC)
- Your judgement is quite indiscriminate: neither of the two op-ed writers voted to keep that image. Of the admins participating in the deletion discussion, 4 voted to delete and 5 voted to keep. --Túrelio (talk) 12:20, 21 June 2013 (UTC)
- It was a contentious DR, and was decided on the grounds that the subject is a public figure and the media are not easily replaceable. I'm sure if I tried I could find many XfDs on en.wp that I disagree with, but that just shows that my own opinions are not the consensus ones. -mattbuck (Talk) 12:22, 21 June 2013 (UTC)
- I said "Commons admins", not the admins above. The DR was closed by an admin; am I right? Please appoint admins who have the common sense to read and understand what is written in our policies: "While some aspects of ethical photography and publication are controlled by law, there are moral issues too. They find a reflection in the wording of the Universal Declaration of Human Rights, Article 12: "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation." Common decency and respect for human dignity may influence the decision whether to host an image above that required by the law. The extent to which an image might be regarded as "unfairly obtained" or to be "intrusive", for example, is a matter of degree and may depend on the nature of the shot, the location, and the notability of the subject." JKadavoor Jee 12:31, 21 June 2013 (UTC)
Communication is the key
The original op-ed and the two responses, as well as many of the comments above, point to a lack of communication among parties as the source of contention. Perhaps a way to forestall future disagreements is to make sure the lines of communication are always open. Even though we're all focused on the projects, we have to go the extra distance and focus a bit more on individuals' perceived displeasure in order to reach that sometimes-elusive consensus. -- kosboot (talk) 12:51, 21 June 2013 (UTC)