Is approval scheme working?
It was recently agreed on Wikipedia:Village pump that any large-scale automated or semi-automated article creation task should require BRFA. One concern was that it would be impossible to follow up; has this been the case? Take for instance Sasata's recent large creation of fungi articles, or Fergananim's very short articles on medieval Irish abbots, like Gillabhrenainn Ua hAnradhain. According to the new regulations these should both require approval, but I can't see that this has been done. Lampman (talk) 14:57, 28 September 2009 (UTC)
- With the exception of the content creation bot BRfA which is open atm, there haven't been many (if any) recent BRfAs for page creation. But then, this proposal hasn't been widely "advertised", and I'm willing to bet that the users mentioned above aren't even aware of it. - Kingpin (talk) 15:08, 28 September 2009 (UTC)
- No, this was an issue brought up at the discussion: that it would be just another layer of guidelines and regulations that nobody would care about or even know about. I should perhaps bring it up at the Village Pump to figure out how it can be better advertised and implemented, otherwise it's rather pointless to have such a rule at all. Lampman (talk) 15:22, 28 September 2009 (UTC)
- Since the community has decided that mass content creation must be done by an authorized bot, it can be enforced as part of the existing "no unauthorized bots" rule. Although it would probably be better to just warn the user first for the moment. Anomie⚔ 20:31, 28 September 2009 (UTC)
Most likely approve
Why is it bot policy that a bot will most likely be approved after a community discussion? Isn't it that a decision to approve a trial will be made after discussion?
After a reasonable amount of time has passed for community input, an approvals group member will most likely approve a trial for your bot and move the request to this section.
What? --69.225.5.4 (talk) 18:23, 29 September 2009 (UTC)
- The statement is accurate from a historical standpoint, mostly because people rarely ask for approval of a controversial task. --ThaddeusB (talk) 00:15, 30 September 2009 (UTC)
- It's not a statement about the history of the policy; and the policy isn't just the type of tasks, it's the code, the feasibility, what the community desires. There are plenty of bots that are not approved, so it's inaccurate.
- To say the task "will most likely be approved" smacks of a lack of regard for community input.
- BAG members should approve trials only if there is support for the bot and the bot is in accordance with policy.
- I'm going to reword it, barring input from BAG saying that, indeed, they "most likely" approve trials for bots. --69.225.5.4 (talk) 18:09, 1 October 2009 (UTC)
- Since August 1, there have been 4 bots denied, 3 requests expired, and 6 withdrawn by the operator. A good deal of the expired/withdrawn ones were approved for trial at the time of closing. In that time, there have been 33 approved bots. So "most likely" does seem accurate. Mr.Z-man 18:22, 1 October 2009 (UTC)
- It should outline policy, not predict the results of a request. --69.225.5.4 (talk) 20:12, 1 October 2009 (UTC)
Hypotheticals.
So, suppose someone wanted to run a bot that did admin activities, but was not themselves an admin. (I can't program, so no, this isn't asking about me.) Is this allowed? If it's not been considered, should it be allowed? Also, what about someone who is an admin, runs a useful bot, and is desysopped? (Or what about the case where a regular user who runs a bot is banned?) I'm just curious how such issues are approached, so I can better know policy. Irbisgreif (talk) 08:10, 15 October 2009 (UTC)
- It's not allowed, as it would allow someone access to the admin tools who didn't go through RFA. If an admin is desysopped, their bots would be desysopped too; if a user is banned, their bots would be banned (and deflagged) too. If a user is just blocked for a short time (e.g. a 31-hour block), I don't know whether their bots would be blocked too or if they would be left alone as long as they weren't used for block evasion. A bot's activities (or the operator's activities relating to being a bot operator, e.g. communication) can also be brought to this page for review, which could result in the bot being deapproved. Anomie⚔ 11:15, 15 October 2009 (UTC)
Quick approvals without community consensus or discussion
There was a remark that this bot was approved without community consensus. ("Bots seem to get approved based on a technical evaluation rather than on whether they conform to bot policy by only making edits that have consensus, as happened here.")
The bot was approved in two days with no input from anyone else in the RFBA process other than the bot operator and the single BAG member, who approved the bot for editing thousands of mainspace articles after examining some trial edits the bot had made before it was approved for a trial, thereby also eliminating the opportunity for community input on the trial run.
This bot only edits pages where the parameter is already blank, but a blank parameter does not show up in the article (| oclc = | dewey = | congress = ), and whether it should be filled in by a bot should be discussed with the wider community to gain consensus for the bot task.
I would like to see some time pass before bots that impact article space widely are approved.
Bot policy requires that a bot "performs only tasks for which there is consensus," which means that bots should not be approved for mainspace tasks without community input. One BAG member does not constitute community input, imo.
Link to discussion calling OCLC linkspam-controversy.
This link is not about CobraBot. I include it because a quick search shows that OCLCs are something that generates a lot of discussion on en.wiki. This discussion mentions, for instance, that consensus shows "OCLCs are considered superfluous when ISBNs are present." This discussion suggests that, rather than being approved, the CobraBot task perhaps should have been denied, as there may be no community consensus for the task at all. Consensus is required by bot policy. None was sought in this approval. No time for community input was allowed before approval. A prior bot was stopped from doing this task by the community. Maybe this bot task should not have been approved against community consensus.
--69.225.5.183 (talk) 07:33, 18 October 2009 (UTC)
- I assume that Kingpin was working under the assumption that this was an uncontroversial task, relatively easy to undo, that wouldn't be irritating to the community. The question of gaining community consensus for a bot task is tricky - do we need to actively seek it out for every single task? If it affects a specific set of articles, it's a good idea to go to the relevant Wikiprojects to ask for input first if it's going to cause a major change. But if we went and asked for every single little cleanup bot then we'd get no input at all - the community is strangely uninterested in bot operations and would quickly tire of our requests. So, not to dwell on the specific case as you request, the general case is that many tasks will, in practice, be approved when there is no consensus against the task rather than a positive consensus in its favour. Fritzpoll (talk) 09:27, 20 October 2009 (UTC)
- Unfortunately in this specific case the task is a controversial task. If the time on the RFBA board had been longer than 2 days, I might have been able to point this out because I've seen discussions on wikipedia about OCLC.
- So, if the task is going to impact article space, and it's going to impact a lot of articles, and it is adding something new, this was not the time, imo, to make a quick solo decision that the task was not controversial.
- Just because a parameter is available in an article information box doesn't mean adding it via a bot is non-controversial. The organism article boxes have dozens of parameters that could be filled in by a bot; if you started filling all of them, or even some of the hierarchies in all organisms, the bot would be blocked immediately by one of the biology editors with admin privileges.
- Yes, there is a lot of community disinterest in bots. But I don't think this was a good situation for assuming that the bot would be uncontroversial and could be quickly approved, because a search by the BAG member in non-main space would have revealed discussions about the issue, the BOT is adding information to an infobox, the BOT is working in article space, and the BOT is impacting thousands of articles.
- In addition, the bot owner was notified that his task was controversial and should have stopped the bot at that point and revisited its operating parameters at the time, since the flag was given without any community input other than rapid approval by a single BAG member.
- I think that, when dealing with editing thousands of mainspace articles, a default position that no word against something in less than two days amounts to tacit approval is a poor operating practice in a community that works on consensus. --69.225.5.183 (talk) 16:05, 20 October 2009 (UTC)
- Well, I'm going to follow your initial suggestion that this was not specifically about CobraBot and not comment on it, since I have no insider knowledge about what Kingpin's reasoning was. As a BAG member myself, I'm also not going to revisit it and try to second-guess what I would have done. The question of community consensus is a witty one in all walks of Wikipedia - what constitutes a consensus, etc. That clause exists not to force us to race out and gather consensus, but to avoid approving things where there is doubt. Sometimes that goes wrong, as may have happened here: perhaps we need to be more thorough and that's certainly a comment I'll take on board. I'm looking at the dates in this particular case, however, and not seeing how we'd have found the main discussion, which occurred after approval. Will re-examine your links to check what I've probably missed. Fritzpoll (talk) 16:40, 20 October 2009 (UTC)
- I'd have to say there was little to indicate that this task would be controversial: The only discussion linked above that dates to before the BRFA was Template talk:Citation/Archive 3#Why OCLC?, which didn't actually generate a lot of discussion and was located on an unrelated template's talk page. The documentation for the template in question does mention "use OCLC when the book has no ISBN", but I missed it at first when checking just now because I paged straight to the documentation for the oclc parameter itself rather than reading through the introductory text. And Cybercobra did stop the bot once it became clear that the task was in fact controversial (although it does seem most of the controversy is due to just one very vocal editor on a crusade).
- Sure, this might have been avoided by insisting on arbitrary waiting periods before bot approval and other bureaucracy. But is more bureaucracy really either necessary or desired by the community as a whole? Anomie⚔ 17:08, 20 October 2009 (UTC)
- Exactly - a system like this will always allow some number of mistakes to occur - and all that tightening the rules too strongly will do is to make the 99.99% of never-controversial bots that BAG handle much slower to process. Fritzpoll (talk) 17:12, 20 October 2009 (UTC)
- If anything is to come of this, I think it should be that unless the task has positive consensus, the bot operator should be willing to stop it at the first sign of opposition and take part in the discussion. Too often it seems that operators refuse to stop their bot until someone really starts screaming. Franamax (talk) 17:56, 20 October 2009 (UTC)
- Whatever lack of indicators there were beforehand about the controversial nature of this task, I think it should be considered that the task in general added a lot of edits to article space, added a visible parameter to articles, and this was done without community input. What's important to me is that a bot not be approved so quickly when it is editing main space, when it is editing it in a way that human editors haven't (it's not changing a template that has been deprecated, for example, but adding something that meat editors didn't put in the articles).
- In this instance, and in future instances, where the bot is adding something to articles that will appear in mainspace, and adding it to a lot of articles, and, most particularly when the addition is a link to an external site, the task should not be considered non-controversial as a default value. --69.225.5.183 (talk) 02:57, 21 October 2009 (UTC)
- That (like 1RR) is probably a good rule to follow in general, although it needs to be balanced against the ignorance and pointiness factors. Anomie⚔ 18:35, 20 October 2009 (UTC)
1RR is a good rule to follow with bots.
Adding thousands of links without community input is a major concern. However, in the case of mainspace edits that contribute to thousands of article additions or changes, I would like to see community input at least given a chance in the future, and anything that makes this explicit to BAG members would be a way of addressing the situation.
At this point, however, I would also like community input about rolling back the bot edits, since they were made without community input, and they link externally. This should not have been done without major community input. And, in the case of future editing mainspace with a bot adding external links, I think the default value should be to not do so if the community has not positively spoken for adding the link. --69.225.5.183 (talk) 02:57, 21 October 2009 (UTC)
- I actually meant 1RR is generally a good rule for human editors to follow. Most bots should be following something akin to 0RR, although there are a number of exceptions (e.g. bots implementing CFD decisions). As for CobraBot supposedly adding external links, you're making the same mistaken assumption User:Gavin.collins insisted on repeatedly making. Anomie⚔ 03:03, 21 October 2009 (UTC)
- Oh, on the 1RR. I checked, and there was no link before the bot added the OCLC, and there is one afterwards. So, the bot added a link; it's through a template, I suppose, but it is still a link. If the links aren't there, maybe you could link to a CobraBot addition whose before-and-after shows no link being added. --69.225.5.183 (talk) 03:51, 21 October 2009 (UTC)
- The bot added the oclc parameter to the template, what the template decides to do with it is beside the point. As mentioned elsewhere, if the community decides that external links to WorldCat are inappropriate the template would be changed with no alteration to what the bot did. Anomie⚔ 03:53, 21 October 2009 (UTC)
- It's fine to put functionality in a template that may or may not be used at the time the template is created. Again, this is the case with taxoboxes, for example. There are multiple parameters that are not necessarily used. It's the community that decides, or the individual editors, to use that parameter in the template. In this case, because the community was not consulted, the decision to activate all these links was made without a single mention of it in the BRFA or anywhere. Neither the bot creator nor Kingpin mentioned that this bot would be creating active links where none existed before by editing this parameter. So, it's another point, I guess, for bot instructions: when the edit that is being made will be making links, this should be explicitly stated in the BRFA. And it's another good reason for requiring proactive community support rather than just a lack of disapproval for that particular bot. --69.225.5.183 (talk) 04:20, 21 October 2009 (UTC)
- Nearly every bot contributes to thousands of articles. What is or is not trivial/non-controversial is always a judgment call, and it is therefore impossible to ensure 100% accuracy. In this case there was no reason to believe filling in an existing template parameter would be controversial. (And in all reality most controversial requests are unlikely to draw objections until they go live, as 99.9% of Wikipedia pays zero attention to BRFAs.) --ThaddeusB (talk) 03:55, 21 October 2009 (UTC)
- Yes, we've kinda moved on to how to handle it in the future though. --69.225.5.183 (talk) 04:20, 21 October 2009 (UTC)
- Even if the application is filed and quickly approved, bots still have more oversight than most users. They have a bot owner who (almost always) takes responsibility for errors, and while many edits may be made, they can be very easily stopped with no repercussions (through blocking or shutoff buttons). More analysis on "pre-editing" in the human wiki world would lead to claims of WP:CREEP. tedder (talk) 04:17, 21 October 2009 (UTC)
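For reference, the "shutoff button" mentioned above is usually nothing more than a wiki page the bot reads before it edits; blanking or changing that page halts the bot without needing a block. Below is a minimal Python sketch of that pattern using the standard MediaWiki API. The bot name and control-page title are hypothetical, and this is not any particular bot's actual code.

import requests

API = "https://en.wikipedia.org/w/api.php"
SHUTOFF_PAGE = "User:ExampleBot/Run"  # hypothetical control page any editor can change

def bot_may_run(session):
    # Fetch the current wikitext of the control page.
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "titles": SHUTOFF_PAGE,
        "format": "json",
    }
    data = session.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    if "missing" in page:
        # Control page deleted: fail safe and stop.
        return False
    text = page["revisions"][0]["*"]
    return text.strip().lower() == "true"

if __name__ == "__main__":
    with requests.Session() as s:
        s.headers["User-Agent"] = "ExampleBot/0.1 (shutoff sketch)"
        if not bot_may_run(s):
            raise SystemExit("Shutoff page is not 'true'; exiting without editing.")
        # ...proceed with the approved task here...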
I think that "say no to linkspam" says it all, no matter what the age. There was no consensus to actively link to this site, the bot move forward without gaining any community consensus, making en.wiki the "feeder site" to thousands of links to worldcat. The community should decide whether or not the infoboxes provide links to this particular website, not BAG, particularly since BAG's fallback is to generally approve a trial, then approve the bot for flagging based only on technical issues.
BAG itself seems to indicate there is no design for community input: a trial is "most likely" approved, without regard to community input, then the bot is approved solely on technical issues. Linking thousands of wikipedia pages to worldcat required community consensus, not rapid approval. If this is done here it could be an easy avenue for vandalism. --69.226.111.130 (talk) 21:05, 23 October 2009 (UTC)
- I'm afraid I'm in the camp that says that the bot is not adding the link to WorldCat - the template is. If there's no consensus to link to WorldCat, then the template needs to be changed - if Cobra went through and did it all manually, it would still be overlinked. The bot may have highlighted a controversial template setting, but it didn't create it. Fritzpoll (talk) 21:39, 23 October 2009 (UTC)
- The bot used a functionality of the template without community input. As I said above, there are dozens of fields in wikipedia taxoboxes. If you assign a bot to fill in every one of them, the bot will simply be blocked by a responsible admin. Even if you approved a bot to fill in only a specific field on every single taxobox the bot would be blocked unless it went forward with community consensus.
- The spirit of the policy for community consensus at wikipedia is that the community gets some say in the matter. Find all the technicalities you like ("the field exists, therefore it should be filled in"); it doesn't amount to community consensus. BAG approved a bot to do a task without gaining community consensus.
- It's for the community to decide whether or not that field should be filled in on every article, not for an individual BAG member. I've already linked to a discussion showing there appears to be no consensus for that field to be filled in. Bots policy doesn't say "if there's no consensus against something a bot can do it."
- What it says is, "In order for a bot to be approved, its operator should demonstrate that it: performs only tasks for which there is consensus." So, let's run with bots policy. There is no link to community consensus for this task, because there was no attempt to establish community consensus for the task.
- Other templates have blank fields. If there is consensus by the community to have a bot fill it in, fine. But there isn't. So, rollback of an unapproved bot task is a simple and straight-forward means to take care of the issue. --69.226.111.130 (talk) 01:19, 24 October 2009 (UTC)
- I would have to agree. There seems to be no oversight or governance policy in operation here. Bots and automated tools seem to be performing many tasks without consideration of the wider issues involved or the consequences of bot actions. Blaming the template and washing of hands is frankly a cheap way to sidestep the issue of linkspamming. Obviously the principle of duty of care has not permeated down to BAG yet. In theory the bots should be accountable to the wider community. In practice, the legitimate issues brought to the attention of the bot operators are ignored. It's "one rule for you, one rule for me" as far as I can see. --Gavin Collins (talk|contribs) 11:03, 29 October 2009 (UTC)
- Here's the Relevant ANI thread Gavin is referring to. tedder (talk) 11:13, 29 October 2009 (UTC)
Another speedy approval, and speedy approvals and community input in general
Can we give more than 3 minutes for interested users to examine trial runs? There seem to be many excuses for why community consensus is not needed, not given, or not allowed time. In this particular bot case, the task is straightforward, there is a responsible and responsive bot owner, it is dealing with deprecated code, etc., etc. But sometimes I want to examine the trial runs after they have been run, but before the final approval, to see if there are problems that show up during the trial. A good reason for doing trials in the first place is to examine the results.
3 minutes is not enough time, and I don't see the urgency in approving this bot in 3 minutes. A couple of days for interested users to examine the trial run is not unreasonable, imo, no matter what the task.
One reason for instruction creep, by the way, is that editors seem to other editors to be overlooking common courtesies and common sense. I don't see why the instructions should say wait 2 days or wait more than 3 minutes, except that it is apparently not obvious that waiting more than 3 minutes gives time for community input.
There was no urgency in approving this bot, so allowing more than 3 minutes for the trial run to be examined by interested parties would have been a simple courtesy. --IP69.226.103.13 (talk) 20:58, 22 October 2009 (UTC)
- The trial is a technical check. Mr Z-Man allowed time prior to the trial for community input on the task itself. Once the trial was proven to be technically functional - which is the reason BAG members are chosen, because of technical proficiency - there was no reason for further delay. Fritzpoll (talk) 23:04, 22 October 2009 (UTC)
- I would also note that in this case, there was also a bot request open for several days. (Oddly enough, WP:BOTREQ tends to get significantly higher traffic than BRFA). Mr.Z-man 23:36, 22 October 2009 (UTC)
- Lots of people want things done for them, but few probably care about unrelated tasks that aren't theirs; there's also probably a discouragement factor for non-programmers, who might be uncertain if their non-technical input is wanted/needed/helpful/relevant (Answer: Yes, it is!). --Cybercobra (talk) 00:39, 23 October 2009 (UTC)
So, it boils down to: after community input, a BAG member "will most likely approve a trial for your bot," (without any reference to the community input), then based entirely on technical functionality, the bot will be quickly approved after the trial. The bot owner is solely responsible for the actions of the bot.
So, BAG does nothing but allow for a community input board, then fast-forward bots to be flagged by bureaucrats, or whoever flags bots... Interesting.
I will then move forward with this understanding of BAG's role on en.wiki. --69.226.111.130 (talk) 20:49, 23 October 2009 (UTC)
- If you do so, you will be moving forward with an incorrect understanding. BAG checks if a task seems to have sufficient consensus for how controversial it seems to be (and yes, sometimes something turns out to be controversial that didn't seem like it would be); at times, and increasingly often lately, BAG will insist that wider community consensus must be sought before the request may proceed. BAG also considers the technical aspects of the proposed task; among other things, this includes reviewing the code (when available) for errors, making suggestions on how to do things more efficiently, and pointing out situations to watch out for. If both of those seem good, a BAG member will approve a trial. It turns out that this has historically happened more often than not, hence the wording "will most likely approve a trial for your bot" that you continue to insist on misinterpreting. If the trial is completed without issues and it seems unlikely that there will be further community reaction as a result of the trial edits, BAG approves the bot. Then a bureaucrat comes along, checks the BRFA again for consensus and such, and actually grants the bot flag. Anomie⚔ 21:05, 23 October 2009 (UTC)
- "Insist on misinterpreting" because the history isn't included in the policy? How should someone interpret "most likely" to be approved without knowing the history? It's the bots policy, not a predictor of the outcome of a request, and it should be written clearly and cleanly as a policy that invites both the casual and familiar editor of wikipedia to learn about how bots work on wikipedia. But if you're going to insist upon describing the history of occurrences here instead of giving a clean and clear statement of policy you're going to wind up with people misinterpreting what you mean.
- It also doesn't seem that BAG's ability to predict community response is particularly good, since recent incidents of community consensus were wrong, and, again, there's no point in a group wherein individuals are expected to predict the community response. The community can simply give their response. That makes it easy on everyone. BAG members aren't expected to be mind readers. When they're unwilling to do a search and find out the community consensus, they can just wait a reasonable amount of time. Again, community inclusiveness would rule decision making rather than mind-reading skills or some other BAG ability to know what the community wants without asking. There's no other place on wikipedia where editors are expected to individually gauge the community consensus without input from the community.
- Asking for sufficient time greater than 3 minutes to be able to look over a trial and input a response is not unreasonable.
- I'm not insisting on misinterpreting, I'm simply not making the effort to read the entire history of BAG to learn what a single paragraph actually means, when its meaning is not obvious. Policies in a dynamic community should be written in a way that allows someone to gather the correct policy, not the history of the policy, from reading the page.
- Why not be courteous to the spirit of the community of wikipedia as it actually is: people come and go, and the policy could be written out for interested editors who drop by and don't know and don't want to read the history. Then it's clear, also, to others how it is intended. That I'm misinterpreting is also not so obvious without reading the history, by the way. That's what your policy says. --69.226.111.130 (talk) 01:09, 24 October 2009 (UTC)
- "Insist on misinterpreting" because it has been explained to you before, but instead of doing anything to try to "fix" it you just continue to bring up the misinterpretation. Do you have a better suggested wording to describe the typical bot approvals process?
- The wording is not that tricky. How about "After a reasonable amount of time has passed for community input, a member of the Bot Approvals Group may approve a short trial during which the bot is monitored to ensure that it operates correctly" like it says on the policy page?
- Sure, the community can give their response. But besides you, no one bothers. What are we supposed to do, wait around forever because the community generally doesn't care about bots until after the fact? Also, I suspect you're mistaken in your comment that "There's no other place on wikipedia where editors are expected to individually gauge the community consensus without input from the community". WP:RFR doesn't seem to have much community input, mostly someone requests and then an admin either grants or denies. People doing WP:New pages patrol don't seem to seek much community input before slapping on a CSD tag, and in many cases it doesn't seem like the admins doing the deletion do either. And we have WP:BRD to describe a good way to go about it in relation to general edits.
- Have I anywhere asked you to "wait around forever?" Is that comment a necessary part of this discussion?
- Permissions don't generally impact 1000s of articles at a time, do they? If they do, they should not be granted without community input. I think that in this case, most of the various permissions become part of other wikipedia editors monitoring/stalking each other.
- I think the new pages patrollers are also out of control, but one battle at a time.
- WP:BRD deals with individual edits.
- Just out of curiosity, do you actually have any comments on that particular trial, or is it just the principle of the thing? Anomie⚔ 01:22, 24 October 2009 (UTC)
- I made my position on this particular trial clear in the third sentence above. --IP69.226.103.13 (talk) 04:09, 24 October 2009 (UTC)
- "Insist on misinterpreting" because it has been explained to you before, but instead of doing anything to try to "fix" it you just continue to bring up the misinterpretation. Do you have a better suggested wording to describe the typical bot approvals process?
- Why not be courteous to the spirit of the community of wikipedia as it actually is: people come and go, and the policy could be written out for interested editors who drop by and don't know and don't want to read the history. Then it's clear, also, to others how it is intended. That I'm misinterpreting is also not so obvious without reading the history, by the way. That's what your policy says. --69.226.111.130 (talk) 01:09, 24 October 2009 (UTC)
- "some other BAG ability to know what the community wants without asking" - We are asking the community, that's what the BRFA is; the community just doesn't respond. You point to a couple of requests that were processed quickly as "evidence" that we don't give the community time to respond, but I could point to plenty that are/were open for weeks with little to no input from outside the bot community. Except for major policy changes, discussions on Misplaced Pages typically don't last for more than a week. The LawBot BRFA, while it was approved only a few minutes after the trial, was open for 15 days. That's twice as long as we give for deletion and adminship discussions. Mr.Z-man 01:36, 24 October 2009 (UTC)
- Community consensus is the policy, and that requires time for community input, and that requires more than 3 minutes. There's no policy on moving forward without community consensus, so, until there is, the lack of community input doesn't matter. What matters is the failure to provide time within which the community can comment. --IP69.226.103.13 (talk) 04:09, 24 October 2009 (UTC)
- 2 weeks is more than enough time. There's no minimum requirement for what constitutes consensus. It's based on whoever cares enough to show up for the discussion. At WP:FFD for instance, files are routinely deleted with no comments from anyone other than the nominator simply because no one cares enough to comment. BRFAs should not have to take a month just so that we can prove that no one really cares. See also Wikipedia:Silence and consensus. If you want to try to get more community input into bot approvals, that would be great, but requiring some arbitrary waiting period after each step for a process that already takes way too long is not the solution. Mr.Z-man 04:26, 24 October 2009 (UTC)
- It's not "each step." Is it necessary to exaggerate what I'm saying to disagree with me?
- Trial runs are an important part of creating and debugging programs. I should not have to point this out, or counter that I'm not asking for a month after each step.
- Three minutes is not enough time for the community to view and comment upon a trial run. It wasn't two weeks, it was 3 minutes. Trial runs are not just one of many steps in the process of writing programs, even little scripts on wikipedia.
- This is not WP:FFD, which deals with single files, but WP:BOTS, which deals with programs that can impact hundreds and thousands of articles. WP:FFD has specific policies listed to deal with this issue, and they are not the same as WP:BOTS policy. The former has a policy, one assumes made by community consensus, like one assumes the wikipedia bots policy is made. The FFD policy says, "Files that have been listed here for more than 7 days are eligible for deletion if there is no clear consensus in favour of keeping them or no objections to deletion have been raised," whereas the bots policy says, "In order for a bot to be approved, its operator should demonstrate that it performs only tasks for which there is consensus."
- BOTS should use BOTS policy. And, meanwhile, FFD can go on using FFD policy. Both, until community consensus changes.
- Who asked you to take a month? Where? Quote me on asking you to wait a month, or forever, or two weeks after a trial run. Can you stick with the topic without exaggeration, please?
- I'm not "requiring some arbitrary waiting period after each step." So, back to the topic, please.
- BOTS policy requires community consensus. Trial runs are a major and important part of programming, not simply one of many steps in the process. The completion of a trial run is a major step in checking the viability of a bot and may reveal problems or additional requirements for the bot. This is a part of the process where it would be appropriate for the community to be allowed enough time to comment. 3 minutes is not long enough. --IP69.226.103.13 (talk) 04:53, 24 October 2009 (UTC)
- (e/c)There's really only 2 steps on a normal bot approval - before the trial and after the trial - so yes, requiring a wait for consensus before trial approval and before final approval would be a required wait before each step. Or, to put it another way, is there any step to bot approval that you don't think needs time for community input?
- So, adding a waiting period after the only other step is onerous? If there are only two steps, then waiting after each step is not that big of a deal. So, I stand corrected. You objected to adding a waiting period after "each step," and it seemed like this was much more than just adding a waiting period after the single other step. Yes, no good argument has been offered for why there should be time for community consensus after one step and not the other, and bot policy is simple in this respect: it requires community consensus, and consensus requires time for community input. If there are only two steps, consensus can be gained after each by simply allowing some time for community input. 3 minutes is not sufficient for either step.
- 2 weeks was referring to how long the community had to comment on the request between the time it opened and the time it was approved. 2.5 if you include the WP:BOTREQ thread as well.
- If it shouldn't take a month, how long should it take? 2 weeks is obviously not sufficient to get community input. Wikipedia:Bots/Requests for approval/SDPatrolBot 4 has been open for more than 6 weeks, it's been 3 weeks since the trial ended, and quelle surprise, no community input yet. Mr.Z-man 05:06, 24 October 2009 (UTC)
- No matter how many times the community fails to comment it isn't an argument for disallowing community input, so I'm not sure why BAG members keep bringing this up in this discussion.
- Bot policy requires community input, so time for it should be allowed. 3 minutes is not sufficient time. What is the time usually allowed for community input? Some fraction of that time for community input on the trial run would be reasonable. The RFBA has been up for the pre-trial time, interested parties can watchlist it, they've already had some time to comment, and they're aware of the trial run. --IP69.226.103.13 (talk) 05:19, 24 October 2009 (UTC)
- I'm not saying we should disallow community input, I'm saying that we shouldn't wait unreasonable amounts of time for input. Do you have any actual suggestions on how to either increase participation, or how long to wait before we can assume no one has any complaints? Mr.Z-man 06:47, 24 October 2009 (UTC)
- I don't believe I've suggested an unreasonable amount of time for waiting. In fact, I've only suggested longer than 3 minutes, or some fraction of the time allowed before the trial. You don't have to assume no one has any complaints. That's not part of policy. I asked how long you usually wait for comments in the pre-trial period, as I think less time than that is necessary. How about a week? A couple of weeks on an inactive RFBA seems like a good amount of time, more than generous for community input. There are some RFBAs that require less time, such as those editing only outside article space, or for other reasons. But a couple of weeks seems reasonable for an RFBA before trial, and a week seems reasonable for a post-trial RFBA. --69.226.111.130 (talk) 07:17, 24 October 2009 (UTC)
We are not a bureaucracy; we can just take action without putting everything up for discussion first. If the community objects then we can act on it; otherwise we can just get the work done. If something goes wrong with a bot, then that is what the revert button is for. Chillum 04:58, 24 October 2009 (UTC)
BAGBot replacement?
BAGbot seems to be mostly dead lately and ST47 doesn't seem to be around anymore, so Wikipedia:BAG/Status isn't being updated and users aren't being notified when {{OperatorAssistanceNeeded}} is used (it did a couple other things, but these 2 were probably the most important). I was going to make a replacement, but can't seem to find the time to finish it (i.e. de-crappify my hastily thrown together code). If someone wants to write a replacement for it, that would be much appreciated. If someone wants my code to start with (written in Python, using my framework), I don't recall if the current version actually works or not. Mr.Z-man 04:56, 27 October 2009 (UTC)
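A rough sketch of how the {{OperatorAssistanceNeeded}} half of that task could be covered with the plain MediaWiki API is shown below; it is written in Python but is not BAGBot's code and does not use Mr.Z-man's framework. It only lists the pages that transclude the template; actually leaving a note for each operator would additionally need a logged-in session and an edit token, which is omitted here.

import requests

API = "https://en.wikipedia.org/w/api.php"

def pages_needing_operator(session):
    # List pages that transclude the template, i.e. BRFAs waiting on the operator.
    params = {
        "action": "query",
        "list": "embeddedin",
        "eititle": "Template:OperatorAssistanceNeeded",
        "einamespace": 4,   # the Wikipedia: namespace, where BRFA subpages live
        "eilimit": 50,      # first batch only; continuation is not handled in this sketch
        "format": "json",
    }
    data = session.get(API, params=params).json()
    return [page["title"] for page in data["query"]["embeddedin"]]

if __name__ == "__main__":
    with requests.Session() as s:
        s.headers["User-Agent"] = "BAGStatusSketch/0.1 (example only)"
        for title in pages_needing_operator(s):
            print("Operator input requested on:", title)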
- Hmm... I feel I could probably write something to do this, and I'd be keen to give it a go. DotNetWikiBot is currently refusing to load page history, so it would have to wait until that is fixed. Also, the main problem would be that I don't currently have a computer which I'm comfortable with running 24hrs, so either I could send the exe to someone who does (it wouldn't use much bandwidth, as it's only running once every 30 or so minutes), or I could just run it in the British daytime. As you can see, there are a few complications, so if someone else wants to program it, that's fine with me. - Kingpin (talk) 09:37, 27 October 2009 (UTC)
- I'll look at having AnomieBOT do it. Anomie⚔ 11:16, 27 October 2009 (UTC)