- The following discussion is an archived debate. Please do not modify it. Subsequent comments should be made in a new section. The result of the discussion was Approval withdrawn.
BetacommandBot
Operator: User:Betacommand
Automatic or Manually Assisted: Auto
Programming Language(s): python
Function Summary: Anti-spam work
Edit period(s) (e.g. Continuous, daily, one time run):
Edit rate requested: Unknown; edits are made as needed
Already has a bot flag: yes
Function Details: Currently BCbot does linksearches and posts the results to subpages under Misplaced Pages:WPSPAM (originally under my own userspace). These anti-spam statistical tasks may slowly branch out to cover more, but they will stay in Misplaced Pages:WPSPAM subpages.
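To make "linksearch results posted to subpages" concrete, here is a small illustrative Python sketch of turning linksearch hits into wikitext for such a subpage. The function name and page layout are assumptions for illustration, not BCbot's actual code or WPSPAM's actual report format.

    def format_report(domain, hits):
        """Build wikitext for a hypothetical WPSPAM linksearch subpage.

        `hits` is a list of (page_title, spammed_url) pairs, e.g. the output
        of a linksearch; the layout below is a guess, not the real template.
        """
        lines = [f"== Linksearch results for {domain} ==", ""]
        for title, url in sorted(hits):
            lines.append(f"* [[{title}]] : {url}")
        lines += ["", f"Total: {len(hits)} links as of the last run"]
        return "\n".join(lines)

    # Example usage with made-up data:
    print(format_report("example.com", [("Some Article", "http://example.com/page")]))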
Discussion
Sorry, I can't really understand what you're saying. Can you clarify? —METS501 (talk) 18:17, 10 March 2007 (UTC)
- See Misplaced Pages:WikiProject Spam/LinkSearch, Misplaced Pages:WikiProject Spam/Report and Misplaced Pages:WikiProject Spam/LinkSearch/List. Betacommand 07:35, 12 March 2007 (UTC)
- Personally, I'm as wise as before. What's it currently doing? What extension are you proposing? How's it going to do it? At first sight this looks like you're requesting approval for a spider to exhaustively search for spam links, but I'm hoping that's not the case. Alai 07:34, 13 March 2007 (UTC)
- It's not a spider. Users who are part of Misplaced Pages:WPSPAM identify spammed domains; BCbot tracks the spammed domains, assists in identifying common targets for a given domain, and provides a linksearch history for a given spammer. Betacommand 18:00, 13 March 2007 (UTC)
- I'm still struggling to follow your explanation (if explanation that was). Looking at the history of Misplaced Pages:WikiProject Spam/LinkSearch/zazzle.com, I see that a) BCB created the initial list, b) periodically deletes items, and c) periodically adds new items. Are all three of these activities covered by this BRFA? If so, how are you doing each? The b) part seems clearest, as I assume it's done by periodically checking to see if each page on the list has been deleted or despammed, but if a) and c) were to be automated, I'd be guessing as to how. (Come to that, I'm rather guessing here in general, and would prefer "Function Details" that wasn't so unclear, if not open-ended, in the first place.) Alai 03:19, 14 March 2007 (UTC)
- Step one: anyone in #wikipedia-spam who finds a spammer requests that BCbot do a linksearch. If BCbot has a hit, it A) adds the name to Misplaced Pages:WikiProject Spam/LinkSearch/List, B) creates a subpage with the linksearch results, and C) updates Misplaced Pages:WikiProject Spam/Report.
- Step 2: as a daily task it reads /List and updates all reports. If it finds that a report has zero links, it adds that domain to /Holding 1; before adding to /Holding 1, BCbot does the same check for /Holding 2 and moves its still-clean entries to /Old. /Old contains all websites that haven't been spammed within the last 72 hours. (See the sketch below.)
- BCBot is used to track spammed domains and assist in identifying them and removing their spam. Betacommand 03:29, 14 March 2007 (UTC)
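As a reading aid, here is a speculative Python sketch of the daily "Step 2" pass described above. It is not BCbot's actual source; the stage names come from the subpages Betacommand mentions, and the `tracked` mapping and `link_count` callback are assumptions introduced for illustration.

    # Speculative sketch of the daily maintenance pass; not BCbot's real code.
    STAGES = ["List", "Holding 1", "Holding 2", "Old"]

    def daily_pass(tracked, link_count):
        """Advance each domain one stage per clean day.

        `tracked` maps a domain to its current stage name; `link_count(domain)`
        is assumed to return the number of live external links (e.g. from a
        linksearch). Three consecutive clean daily runs (~72 hours) move a
        domain from /List through the holding pages to /Old.
        """
        for domain, stage in list(tracked.items()):
            if stage == "Old":
                continue  # already retired: unspammed for 72+ hours
            if link_count(domain) == 0:
                tracked[domain] = STAGES[STAGES.index(stage) + 1]
            else:
                tracked[domain] = "List"  # fresh spam: back to the active list
        return tracked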
- I'm awarding myself some points, since my "b)" seems to correspond fairly closely to your "Step 2". That part seems fair enough. But I'm still struggling with the "do a linksearch" bit. How's this to be generated? From the contribs of a given spammer, extracting all additions of a particular domain? Or what? Alai 03:37, 14 March 2007 (UTC)
- It uses Special:Linksearch to generate that data, as it is the least taxing method for our servers. Betacommand 19:48, 14 March 2007 (UTC)
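For readers unfamiliar with Special:Linksearch: the same data is exposed through the MediaWiki API as the exturlusage list. The snippet below is only an illustrative sketch of that query against the current API endpoint (assumed here), not a description of how BCbot actually fetched its data in 2007.

    import requests

    API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint for illustration

    def linksearch(domain, limit=500):
        """Return (title, url) pairs for pages linking to `domain`,
        roughly what Special:Linksearch displays, via list=exturlusage."""
        params = {
            "action": "query",
            "list": "exturlusage",
            "euquery": domain,      # domain or URL fragment to search for
            "eulimit": limit,
            "euprop": "title|url",
            "format": "json",
        }
        data = requests.get(API, params=params, timeout=30).json()
        return [(hit["title"], hit["url"]) for hit in data["query"]["exturlusage"]]

    # Example usage with a made-up domain:
    # print(linksearch("example.com")[:5])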
- OK, that was a key piece of data I hadn't secured on my own recognizance; thank you. Aside from the meta-issue of the description still being less than crystal clear, and getting additional information being rather like pulling teeth, I'm not seeing any problems here. Alai 03:13, 17 March 2007 (UTC)
- I've noticed these pages already; presumably this application covers that activity which is already happening? If not, then I must have missed the point too.
- Once a URL is down to 0 external links, is it still necessary to have a page for it (and thus another external link), e.g. Misplaced Pages:WikiProject_Spam/LinkSearch/panthercarclub.com?
- What's the data going to be used for and who decides what is spam? I notice that for example google is listed, and yet google actually has an interwiki link google:test. --kingboyk 19:17, 17 March 2007 (UTC)
- True, google is listed; that is because there was a request for a report of that data. Spam is identified by several methods: mass additions, identification of domains/sites flagged as possible spam whose links should be monitored, and other methods by which humans identify spam. The bot does not identify spam; all the bot does is track the usage of the domain over time. In regard to deleting the subpage, that issue has not arisen so as to need deletion; having the site in the title is not a link. Betacommand 02:07, 18 March 2007 (UTC)
All seems solid, flag away (looks for b'crat) -- Tawker 03:40, 21 March 2007 (UTC)
He's already got a flag; presumably you meant this? :) Approved. --kingboyk 11:56, 21 March 2007 (UTC)
- Denied. For now, I have withdrawn the approval of this bot. When everything is in order again, if you still want to do a similar task please make a new bot request. —METS501 (talk) 00:26, 22 March 2007 (UTC)
- NOTE: Permission for this bot was removed "Due to the huge controversy surrounding Betacommand's removal of external links". (See this page (at the bottom) and this page for the controversy itself). -- RM 13:00, 22 March 2007 (UTC)
- UPDATE: This specific request has been rejected permanently, but may be superseded by a future request of the same nature. -- RM 14:20, 28 March 2007 (UTC)
- The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made in a new section.