== Dusty articles within all subcategories of a parent category ==
Is this possible? I'd like to get a list like ] but for anything within any subcategory of ]. It would make quite a nice little to-do list for times I feel like doing some research and writing but don't have a particular bee in my bonnet that very minute. Thanks for any help! ] (]) 15:16, 17 November 2024 (UTC)
:What is "dusty"? Neither ] nor ] says what it does. Is it a sort by timestamp of last edit?
:In direct subcategories only, include the handful of pages directly in the category, or the whole tree? If the last, to what depth? Examples: ]→]→]→] is depth 2, and ]→]→]→]→] is depth 3; neither the page itself nor the root category count. —Cryptic 16:51, 17 November 2024 (UTC)
::Yes, it's a list of articles by date of most recent edit.
::Hm, on the second question. Ideally what I'd end up with is a list of, say, food items that hadn't been edited in ten years. Or chefs, or restaurants, or food animals or whatever. Maybe I need to choose a more specific subcategory? ] (]) 17:24, 17 November 2024 (UTC)
:::Well, ok,
:::*] is in ], but, strictly speaking, isn't in "any subcategory of ]". Should it be included in the list?
:::*] is in ], which is a subcategory of ], so it should be. (It's also directly in ], but never mind that.)
:::*] isn't in ] or any of its immediate subcategories, but it ''is'' in ], a subcategory of ]; so the article's in a sub-subcategory of ]. Include or not?
:::*] isn't in Food or drink, its immediate subcategories, or (I think) any of ''their'' subcategories, but it's in ], a subcat of ], so it's at least in a sub-sub-subcategory of ]. Same question.
:::The reason I need a maximum depth is because - like almost all reasonably broad categories - ] eventually includes a significant portion of ''all'' categories. Depth 10, for example, has 122639 different categories in the tree, out of 2.4 million total categories (including maintenance categories, category redirects, and so on), and you really quickly start getting unrelated pages like ] → ] → ] → ] → ] → ] → ] → ].
:::Or, if you like, you can give me a list of categories to pull from. Even if it's a large list, or something like "Anything in any direct subcategory of ], ], ], ], ". —Cryptic 18:33, 17 November 2024 (UTC)
::::Oh, and do you want non-mainspace pages in the list or not? What about redirects? —Cryptic 18:38, 17 November 2024 (UTC)
:::::lol...clearly in over my head here. :D Thank you for your patience.
:::::So, no to feed a cold, starve a fever. Yes to recipe, dulce de leche and ice milk.
:::::I think maybe start with something that's likely to contain fewer extraneous things. ] in a way that will allow me to see, for instance, the articles that are in ] > ] > ] > ] that haven't been edited in the last ten years. ] (]) 18:50, 17 November 2024 (UTC)
::::::Oh, no non-mainspace pages, no redirects. ] (]) 19:13, 17 November 2024 (UTC)
:::::::None quite that old in either tree. ] for ] depth 3 (oldest is ], 2015-11-16 18:36:35 - see what I meant about unrelated pages?), ] for ] depth 4 (oldest is ], 2019-12-16T04:47:03). —Cryptic 19:19, 17 November 2024 (UTC)
::::::::Well, thank you for your work, and sorry to waste your time! ] (]) 19:35, 17 November 2024 (UTC)
:::::::::Not wasted at all. Not your fault the category system is terrible for datamining.
:::::::::There might be some value in finding the latest revision that wasn't marked minor, and maybe excluding ones made by bots too, but that's going to be harder and a lot slower. Would definitely need to cut the set of articles to look at to something on the order of a couple thousand before looking at the edits, rather than the tens of thousands in that first tree. —Cryptic 20:14, 17 November 2024 (UTC)
::::::::::Thanks. And I've actually already found an article that needs attention from your 87976 query, so win!
::::::::::The point for me here is looking for categories that have many articles that haven't been updated since before sourcing started modernizing. It's a bit tricky because the articles that were created first -- probably in any category -- are also likely the articles that get updated often, have multiple watchers, etc. So it's possible there just aren't huge numbers of food articles that need this kind of attention. ] (]) 21:18, 17 November 2024 (UTC)
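For reference, a depth-1 version of this kind of query can be sketched as follows. This is an illustrative sketch only, assuming the enwiki replica schema and 'Food_and_drink' as the root category; categorylinks matches members by category title, so each additional depth level needs another self-join like the inner one here:

```sql
-- Oldest mainspace non-redirect articles in Food_and_drink or one of its
-- direct subcategories, ordered by the timestamp of their latest revision.
SELECT p.page_title, r.rev_timestamp
FROM page p
JOIN revision r ON r.rev_id = p.page_latest
JOIN categorylinks cl ON cl.cl_from = p.page_id
WHERE p.page_namespace = 0
  AND p.page_is_redirect = 0
  AND cl.cl_to IN (
        SELECT 'Food_and_drink'
        UNION
        SELECT sub.page_title            -- direct subcategories (namespace 14)
        FROM categorylinks cl2
        JOIN page sub ON sub.page_id = cl2.cl_from
        WHERE cl2.cl_to = 'Food_and_drink'
          AND sub.page_namespace = 14
      )
ORDER BY r.rev_timestamp
LIMIT 100;
```

As the thread notes, walking deeper than a level or two quickly pulls in unrelated pages, so a hand-picked list of categories is usually the better input.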
== Counting ordinary articles ==
There are {{NUMBEROFARTICLES}}, but AIUI that includes disambig pages, stand-alone lists, and outlines, and maybe even portals (i.e., all content namespaces, not just the mainspace) but excludes redirects. Is there a way to get a count of the number of just plain old ordinary articles, excluding the other types? (A percentage from a sample set is good enough; I'd like to be able to write a sentence like "Of the 6.9 million articles, 6.2 million are regular articles, 0.45 million are lists, and 0.2 million are disambig pages.") WhatamIdoing (talk) 22:46, 17 November 2024 (UTC)
: {{re|WhatamIdoing}} according to ], there are 362,957 of those. ] ] 23:29, 17 November 2024 (UTC)
::] suggests that there are about a thousand of those, which will not have a material effect on the numbers.
::] says they've tagged 131K pages. There are about 123,700 pages with "List of" or "Lists of" at the start of the title. WhatamIdoing (talk) 00:37, 18 November 2024 (UTC)
:There is no clear definition of what a "regular article" is. Also, many pages are not correctly marked and categorized. Don't forget ], which look like ordinary articles, or might be, depending. -- GreenC 00:43, 18 November 2024 (UTC)
:<nowiki>{{NUMBEROFARTICLES}}</nowiki> seems to be mainspace non-redirect pages. I'd thought it used other heuristics, too; I remember needing at least one link, and less confidently requiring a period? But it plainly doesn't anymore; I'm getting 6912240 for ns0 !page_is_redirect on the replicas now.
:There's only 362201 non-redirects in ] and mainspace. Most of the difference is in other namespaces, probably legitimately, though I'm surprised to see 208 in user:, 44 total in various talk namespaces, 9 mainspace redirects, and a single redirect in draftspace.
:114253 mainspace non-redirects in ], though 64 of those are in the disambig cat as well.
:Lists are less certain; there's no ]. I could try to count pages that are in any category starting with "Lists " or ending with " lists", but - not having done that before - don't have any idea how many it would miss, and how many it would miscount. Ditto with pages starting with "List of " or "Lists of " (which is easy - 120653, not counting any redirs or pages in the dabs or set index cats). —Cryptic 01:00, 18 November 2024 (UTC)
::Oh, and 11193167 redirects (so 18105407 total mainspace pages), if you care. —Cryptic 01:03, 18 November 2024 (UTC)
::So 6,912,240 non-redirect pages, of which 362,201 are dab pages and 120,653 are Lists (per title), and the rest (e.g., Outlines, Index pages) is immaterial. A good SIA looks like an article and an underdeveloped one looks like a dab page, which takes us back to GreenC's point about it not being a clear-cut question.
::All of this suggests that if you count SIAs as 'articles', then there are 6.429 million articles (93%) and if you count them as lists/dabs, then there are 6.315 million articles (91%).
::Thanks, all. WhatamIdoing (talk) 01:15, 18 November 2024 (UTC)
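The breakdown discussed above can be sketched in one query. This is a sketch under assumptions: the enwiki replica, the category name 'All_disambiguation_pages' as the dab marker, and "List of"/"Lists of" title prefixes as a proxy for lists, with all the SIA caveats already noted (note the escaped underscores, since `_` is a LIKE wildcard):

```sql
SELECT
  (SELECT COUNT(*) FROM page
    WHERE page_namespace = 0 AND page_is_redirect = 0)   AS mainspace_pages,
  (SELECT COUNT(*) FROM page
    JOIN categorylinks ON cl_from = page_id
    WHERE page_namespace = 0 AND page_is_redirect = 0
      AND cl_to = 'All_disambiguation_pages')            AS dab_pages,
  (SELECT COUNT(*) FROM page
    WHERE page_namespace = 0 AND page_is_redirect = 0
      AND (page_title LIKE 'List\_of\_%'
        OR page_title LIKE 'Lists\_of\_%'))              AS list_titles;
```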
== Age of active extended-confirmed accounts ==
Hello again, generous satisfiers of curiosity:
Today's question is how old the typical currently active ] account is. The requirements are:
* is currently active (perhaps made at least one edit during the last 30 days? Any plausible definition of active works for me, so pick whatever is easiest and cheapest to run)
* meets EXTCON (all of which will have the EXTCON permission)
I don't care whether it's date of first edit vs registration date. I also don't care whether it's all ~73K of them or if it's a sample of 5K–10K. I am looking for an answer to the nearest year ("Most active EXTCON editors started editing before 2014" or "If you see someone editing an article under EXTCON, they probably started editing more than 10 years ago").
Thank you, WhatamIdoing (talk) 17:14, 19 November 2024 (UTC)
:Hmm. user_touched has been scrubbed because it is private data. So I guess LEFT JOIN recentchanges to see who is active? This should only get us users who have made an edit in the last 30 days. Then run MEDIAN() on it. Let's see if ] finishes. If the count is 72,000ish, then I also need to add a WHERE to filter out the editors who aren't in recentchanges. –Novem Linguae (talk) 18:33, 19 November 2024 (UTC)
::That's going to get you not just every user with the user right - the whole point of a left join is that you get a result whether there's a row in the joined table or not - but a row in your resultset for ''every'' row they have in recentchanges. And you're leaving out admins, who have extended confirmed implicitly. Plus, even if it worked, it would be a dump of ~25k values.
::MariaDB has MEDIAN(), but I can't get it to work on user_registration no matter how I try to preprocess it first - it gives me "Numeric datatype is required for percentile_cont function" when I call it directly on the column, which is reasonable, but always 100000000 if I cast it to any kind of numeric value, which isn't. (Anyone know what I'm doing wrong? ]. And I've never really grokked window funcs or how to get them to behave like normal, sane, grouped-by aggregate funcs anyway.) But doing it longhand works just fine. ]: 28 May 2013. —Cryptic 19:37, 19 November 2024 (UTC)
:::user_registration is ASCII-encoded binary rather than a binary number, which is why you're getting nonsense when casting it and trying to do operations on it. ] (]) 21:28, 19 November 2024 (UTC)
::::Casting it seems to get me a numeric, and doing normal arithmetic on it (user_registration + 1, division, and so on) coerces it to a numeric; it doesn't get me nonsense until I try to pass it through MEDIAN(). And UNIX_TIMESTAMP() in particular is documented to return an unsigned int. <s>Current theory is that MEDIAN() can't deal with large numbers (]; dividing by numbers smaller than ten gets me 100 million again), which is boggling.</s> No, a cast or operation on the result of ''MEDIAN()'' is what fixes it. Still boggling. ]. Thanks for the prod. —Cryptic 21:55, 19 November 2024 (UTC)
:::::Also, comparing the results reminded me that user_registration is NULL for some users who registered before mid-2005ish, which I hadn't corrected for. 2013-06-15 19:42:14, though I doubt the two weeks' inaccuracy is going to matter much to WAID. —Cryptic 22:30, 19 November 2024 (UTC)
:::::TIL CAST seems to convert to the proper numeric representation if the binary string contains only numeric ASCII characters. Glad you were able to get it working though. ] (]) 22:34, 19 November 2024 (UTC)
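The "longhand" median mentioned above can be sketched in two steps, since MariaDB's LIMIT ... OFFSET takes only a literal number. This is a sketch under assumptions: the enwiki replica, the 'extendedconfirmed' group name, "active" approximated as having any row in recentchanges, and (as noted above) implicitly extended-confirmed admins left out:

```sql
-- Step 1: count active extended-confirmed accounts with a known registration.
SELECT COUNT(*)
FROM user
JOIN user_groups ON ug_user = user_id AND ug_group = 'extendedconfirmed'
JOIN actor ON actor_user = user_id
WHERE user_registration IS NOT NULL
  AND EXISTS (SELECT 1 FROM recentchanges WHERE rc_actor = actor_id);

-- Step 2: plug roughly half of that count in as the OFFSET by hand.
SELECT user_registration
FROM user
JOIN user_groups ON ug_user = user_id AND ug_group = 'extendedconfirmed'
JOIN actor ON actor_user = user_id
WHERE user_registration IS NOT NULL
  AND EXISTS (SELECT 1 FROM recentchanges WHERE rc_actor = actor_id)
ORDER BY user_registration
LIMIT 1 OFFSET 12500;   -- replace 12500 with count/2 from step 1
```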
== List of Revision IDs by edit summaries ==
Can someone write a SQL query that isolates all edits made with the edit summary <code><nowiki>Disambiguating links to ] (intentional link to DAB) using ].</nowiki></code> where XYZ is any combination of letters, numbers or symbols? There is a bug in the script that causes edits with this summary to target a wrong link, see ]. Thanks! —CX Zoom 14:24, 10 December 2024 (UTC)
:]. —Cryptic 17:43, 10 December 2024 (UTC)
::Thank you very much. It helped me in fixing the pages that had wrong links due to the script bug. —CX Zoom 19:50, 10 December 2024 (UTC)
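The summary match itself can be sketched with a LIKE pattern over the comment table. Assumptions here: the enwiki replica, where revision comments are reached via rev_comment_id; and since comment text is unindexed, the scan should be narrowed by rev_timestamp to the window in which the script was buggy (the date below is a placeholder):

```sql
SELECT rev_id, rev_page, comment_text
FROM revision
JOIN comment ON comment_id = rev_comment_id
WHERE rev_timestamp >= '20241101000000'   -- placeholder: narrow to the affected window
  AND comment_text LIKE 'Disambiguating links to %(intentional link to DAB) using%';
```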
Latest revision as of 16:45, 25 December 2024
Page for requesting database queries
This page has archives. Sections older than 14 days may be automatically archived by Lowercase sigmabot III when more than 4 sections are present. |
This is a page for requesting one-off database queries for certain criteria. Users who are interested and able to perform SQL queries on the projects can provide results from the Quarry website.
You may also be interested in the following:
- If you are interested in writing SQL queries or helping out here, visit our tips page.
- If you need to obtain a list of article titles that meet certain criteria, consider using PetScan (user manual) or the default search. PetScan can generate lists of articles in subcategories, articles which transclude some template, etc.
- If you need to make changes to a number of articles based on a particular query, you can post to the bot requests page, depending on how many changes are needed.
- For long-term review and checking, database reports are available.
Quarry does not have access to page content, so queries which require checking wikitext cannot be answered with Quarry. However, someone may be able to assist by using Quarry in another way (e.g. checking the table of category links rather than the "Category:" text) or suggest an alternative tool.
Draftifications by month
Hi everyone. Cryptic kindly created this query which shows how many draftifications took place between 2021-07 and 2022-08. Could someone please fork it to show dates from 2016 to 2024? If it's easier, I'm fine with seeing the number of draftifications by year instead of by month. Many thanks and best wishes, Clayoquot (talk | contribs) 03:38, 14 December 2024 (UTC)
- I've rerun the query in-place. —Cryptic 14:19, 14 December 2024 (UTC)
- Beautiful, thank you so much Cryptic! Clayoquot (talk | contribs) 16:29, 14 December 2024 (UTC)
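A fork along the lines requested above can be sketched from the move log. This is a sketch under assumptions: the enwiki replica, and "draftification" approximated as a logged move out of mainspace whose serialized target in log_params mentions Draft: — a heuristic that over- and undercounts in the usual ways:

```sql
SELECT LEFT(log_timestamp, 6) AS month, COUNT(*) AS draftifications
FROM logging
WHERE log_type = 'move'
  AND log_namespace = 0                  -- moved from mainspace
  AND log_params LIKE '%Draft:%'         -- target title is serialized in log_params
  AND log_timestamp BETWEEN '2016' AND '2025'
GROUP BY month
ORDER BY month;
```

Swapping LEFT(log_timestamp, 6) for LEFT(log_timestamp, 4) gives the by-year variant.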
List of all file redirects that are in use in mainspace
I wrote a query that lists all file redirects, at quarry:query/88966. Can this query be expanded to only list file redirects that are used in mainspace somewhere? –Novem Linguae (talk) 22:26, 19 December 2024 (UTC)
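One way to sketch the mainspace-usage filter is through the imagelinks table. Assumptions here: the older imagelinks schema keyed by target filename (il_from/il_to); if the replica has since migrated to il_target_id, the join would go through linktarget instead:

```sql
SELECT rd.page_title
FROM page rd
WHERE rd.page_namespace = 6              -- File:
  AND rd.page_is_redirect = 1
  AND EXISTS (
        SELECT 1
        FROM imagelinks
        JOIN page u ON u.page_id = il_from
        WHERE il_to = rd.page_title      -- some page embeds the redirect's title
          AND u.page_namespace = 0       -- ...and that page is in mainspace
      );
```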
Update to NPP reports
Is it possible to add a link to the # column at Wikipedia:New_pages_patrol/Reports#Unreviewed_new_redirects_by_creator_(top_10) with an XTools redirects-created link? It can target xtools:pages/en.wikipedia.org/USERNAME/0/onlyredirects

Similarly for Wikipedia:New_pages_patrol/Reports#Unreviewed_new_articles_by_creator_(top_10), targeting xtools:pages/en.wikipedia.org/USERNAME/all. Thanks! ~/Bunnypranav:<ping> 15:49, 20 December 2024 (UTC)
- Done and done. —Cryptic 18:56, 20 December 2024 (UTC)
- Thanks a lot ~/Bunnypranav:<ping> 04:06, 21 December 2024 (UTC)
Measuring the number of source links to each domain for a given article/set of articles
Command denied
I keep getting the error, "execute command denied to user 's52788'@'%' for routine 'enwiki_p.count'". I was using the page database, but even after I modified my query to only use the externallinks database (meaning I need to input a numerical page ID instead of using the title), I'm still getting the denial. What am I doing wrong here? Am I just not allowed to aggregate? Here's my query, simplified as much as possible and still not working:
SELECT count (el_to_domain_index)
FROM externallinks
WHERE el_from = 37198628
GROUP BY el_to_domain_index;
Safrolic (talk) 23:14, 21 December 2024 (UTC)
- Remove the space between count and the open paren. —Cryptic 23:21, 21 December 2024 (UTC)
  - (Or set sql_mode = 'IGNORE_SPACE'; first. —Cryptic 23:24, 21 December 2024 (UTC))
  - Wow. Thank you. Safrolic (talk) 23:29, 21 December 2024 (UTC)
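With the default sql_mode, a function name followed by a space before its parenthesis is parsed as a call to a stored routine (here, a nonexistent enwiki_p.count) rather than the built-in aggregate, which is exactly what the "execute command denied ... routine 'enwiki_p.count'" error is complaining about. The fixed query is just:

```sql
SELECT count(el_to_domain_index)
FROM externallinks
WHERE el_from = 37198628
GROUP BY el_to_domain_index;
```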
Lag, no results returned
Now I'm trying to get counts for all the external links from all the pages in a category. I want to do this for each separate page, and get lists of all the actual URLs, but y'know, baby steps. I used this query: https://quarry.wmcloud.org/query/89031
USE enwiki_p;
SELECT el_to_domain_index,
       count(el_to_domain_index)
FROM externallinks
JOIN categorylinks ON cl_from = el_from
WHERE cl_to = 11696843
GROUP BY el_to_domain_index
ORDER BY count(el_to_domain_index) DESC;
I'm not getting any results and it takes ages to not get them. What am I doing wrong now? Also, how do I include pages in any subcategories, or does this include them automatically? Safrolic (talk) 00:57, 22 December 2024 (UTC)
I figured out that I need to use page despite the slowness it'll cause, because cl_to uses a name instead of an ID. So here is my new query, now also running on simplewiki for easier testing. https://quarry.wmcloud.org/query/89032
USE simplewiki_p
SELECT page_title,
       el_to_domain_index,
       count(el_to_domain_index)
FROM externallinks
JOIN categorylinks ON cl_from = el_from
JOIN page ON cl_from = page_id
WHERE cl_to = Canada
GROUP BY page_title, el_to_domain_index;
This query though has a syntax error on line 2.
I also think I might be in the wrong place to ask for step-by-step help like this. If there's a better place for me to go, I'd appreciate the direction. Safrolic (talk) 02:18, 22 December 2024 (UTC)
- You don't need the USE statement on Quarry since you have to select a database there separately (since most are on different servers now); but if you keep it, you need to terminate it with a semicolon. Next error you'd get is that you need to quote 'Canada'. At least that one has a useful error message ("Unknown column 'Canada' in 'where clause'"). The reason your first query took forever is because SELECT * FROM categorylinks WHERE cl_to = 11696843; does a full table scan - it tries to coerce each row's cl_to (a string value) into a number, and then does a numeric comparison. There's no correct way to use the index on cl_to since many different strings compare equal to that number, in particular ones starting with whitespace. SELECT * FROM categorylinks WHERE cl_to = '11696843'; on the other hand finishes instantly with no results (since Category:11696843 has no members). Categories are only loosely tied to the page at their title anyway. You won't get members of subcategories like that - you have to go far out of your way to do so, similar to quarry:query/87975. You would get the direct subcategories like simple:Category:Canada stubs themselves, if any happened to have any external links. Distinguish them by selecting page_namespace too, if you're not already filtering by it. —Cryptic 02:56, 22 December 2024 (UTC)
  - It sounds like I'm better off doing a multipart kludge: getting all the relevant page titles with Massviews or Petscan, running a single query to turn them into IDs, then using those IDs as el_froms so I only need the externallinks database. Thank you for your help! Safrolic (talk) 05:59, 22 December 2024 (UTC)
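Folding the fixes from this thread together (drop the USE statement or terminate it with a semicolon, quote 'Canada', and filter by page_namespace so subcategory pages don't sneak in), the simplewiki query becomes the following sketch:

```sql
SELECT page_title,
       el_to_domain_index,
       count(el_to_domain_index)
FROM externallinks
JOIN categorylinks ON cl_from = el_from
JOIN page ON cl_from = page_id
WHERE cl_to = 'Canada'
  AND page_namespace = 0       -- mainspace members only
GROUP BY page_title, el_to_domain_index;
```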
Orphaned editnotices
When a page is moved, its editnotice is not moved with it. There is a post-move warning for it, but users would need to move it separately. That too can only be done by template editors, page movers and admins. I believe that there are plenty of editnotices that have become orphaned from their target page. I need a query to list such pages. If there is already a regularly updated database, that will work too. Thanks! —CX Zoom 07:53, 25 December 2024 (UTC)
- Here's mainspace only to get you started: quarry:query/89138. You or someone else can fork and improve this if you need additional namespaces. Making this a database report somewhere using {{Database report}} might be a good idea. Hope this helps. –Novem Linguae (talk) 08:42, 25 December 2024 (UTC)
- I suspect it's much worse than that. It's certainly more complex. There's plenty of mainspace titles with colons in them, and it's conceivable that some of those have orphaned editnotices; there's really no way around parsing for the namespace name, and that's going to be ugly and complex, and I haven't tried it yet. (It being Christmas morning and all. Maybe tomorrow.) But I wouldn't estimate that to result in more than a handful of other hits. Much more likely is the case that CX Zoom mentions directly: a page is moved but the editnotice isn't, leaving it attached to the remnant redirect. There's going to be false positives looking for those whether we do it the "correct" way and look in the move log (since there might be an editnotice intentionally attached to a page that had another page moved from it in the past), or whether we include editnotices attached to pages that are currently redirects. The latter's easier, and especially easier to combine with the query looking for pages that don't exist; I've done it at quarry:query/89148. That'll also miss editnotices that unintentionally weren't moved with their page, where the resulting redirect was turned into a different page, though. —Cryptic 15:30, 25 December 2024 (UTC)
- Thank you very much, both of you... —CX Zoom 16:45, 25 December 2024 (UTC)
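The redirect-based variant described above can be sketched like this. Assumptions: the enwiki replica; mainspace editnotices living at Template:Editnotices/Page/<full pagename>; and titles containing colons skipped, since (as noted) they would need real namespace parsing:

```sql
SELECT en.page_title
FROM page en
LEFT JOIN page target
       ON target.page_namespace = 0
      AND target.page_title = SUBSTRING(en.page_title, LENGTH('Editnotices/Page/') + 1)
WHERE en.page_namespace = 10                           -- Template:
  AND en.page_title LIKE 'Editnotices/Page/%'
  AND en.page_title NOT LIKE 'Editnotices/Page/%:%'    -- skip likely non-mainspace targets
  AND (target.page_id IS NULL                          -- target page gone entirely
       OR target.page_is_redirect = 1);                -- or now just a remnant redirect
```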