
Wikipedia:Request a query: Difference between revisions

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.
Revision as of 22:01, 19 November 2024

Page for requesting database queries


Archives
Archive 1 · Archive 2 · Archive 3 · Archive 4 · Archive 5


This page has archives. Sections older than 14 days may be automatically archived by Lowercase sigmabot III when more than 4 sections are present.

This is a page for requesting one-off database queries for certain criteria. Users who are interested and able to perform SQL queries on the projects can provide results from the Quarry website.

You may also be interested in the following:

  • If you are interested in writing SQL queries or helping out here, visit our tips page.
  • If you need to obtain a list of article titles that meet certain criteria, consider using PetScan (user manual) or the default search. PetScan can generate lists of articles in subcategories, articles that transclude a given template, etc.
  • If you need to make changes to a number of articles based on a particular query, you can post to the bot requests page, depending on how many changes are needed.
  • For long-term review and checking, database reports are available.

Quarry does not have access to page content, so queries which require checking wikitext cannot be answered with Quarry. However, someone may be able to assist by using Quarry in another way (e.g. checking the table of category links rather than the "Category:" text) or suggest an alternative tool.
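For example, instead of scanning wikitext for category tags, a query can read the categorylinks table directly. A minimal sketch against the MediaWiki schema (category and row limit chosen arbitrarily for illustration):

```sql
-- Pages in Category:Living people, found via the categorylinks table
-- rather than by searching for "[[Category:...]]" in wikitext.
SELECT page_title
FROM page
JOIN categorylinks ON cl_from = page_id
WHERE cl_to = 'Living_people'  -- category title: underscores, no "Category:" prefix
  AND page_namespace = 0       -- mainspace only
LIMIT 10;
```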

List of articles likely to have one or no sources

While making this edit recently, it occurred to me that we ought to have a way of at least semi-automatically identifying and tagging articles with either a single source or none. I'd like to be able to do an AWB run of likely such articles.

Given that there are many different ways to do sources, I'd like to start with a conservative query, which lists all articles that contain none of the following strings:

  • <ref
  • http://
  • Notes
  • cite
  • Reference
  • Sources
  • Citation
  • Bibliography
  • sfn

I don't know how to construct a RegEx query with a negative (the internet seems to have some ideas, but I struggle to convert this into Wikipedia's flavor), so I'd appreciate some help. Could anyone help me generate this list? Cheers, Sdkb 05:14, 14 November 2024 (UTC)

No access to article text. —Cryptic 06:19, 14 November 2024 (UTC)
this regex search is a start. It gives 10000 results then times out. * Pppery * it has begun... 06:24, 14 November 2024 (UTC)
You'll want to at least make that case-insensitive, anchor "ref" and maybe "cite" to word boundaries, and match "https://" too. But still, WP:Request a query isn't WP:Request a search. —Cryptic 06:34, 14 November 2024 (UTC)
...holy crap, it is. It shouldn't be. —Cryptic 06:35, 14 November 2024 (UTC)
The underlying ElasticSearch cluster has a read-only replica on Toolforge, which can be queried. So I'd say this page is the right place for such requests. – SD0001 (talk) 07:41, 14 November 2024 (UTC)
If someone comes here looking for help with Elasticsearch's middle-end, they're going to be very, very disappointed. —Cryptic 08:13, 14 November 2024 (UTC)
Thanks, @Pppery! After expanding the query to -insource:/(ef|http|otes|ite|ources|itation|ibliography|sfn|list of|lists of|link|further reading|Wiktionary redirect)/ -intitle:list -deepcategory:"Set index articles" it's starting to turn up mostly useful results. Cheers, Sdkb 07:17, 14 November 2024 (UTC)
You can get more results before it times out by adding more non-regex filters. For instance, adding -hastemplate:"Module:Citation/CS1" gives 15k results instead of just 2k. – SD0001 (talk) 07:39, 14 November 2024 (UTC)
Anyway, the sort of things this page can do to answer your original question are to give you lists of pages with zero, or zero or one, external links, or that don't transclude any of a set of templates, or both; and as a bonus filter out redirects (which I'm fairly sure search does whether you like it or not), disambigs, and - to some extent - list pages. —Cryptic 07:16, 14 November 2024 (UTC)
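The first of those - pages with zero external links - can be sketched as an anti-join against the externallinks table (MediaWiki schema; the page_props check for the disambiguation flag is one common way to filter out disambigs):

```sql
-- Mainspace non-redirect pages with no rows in externallinks,
-- excluding pages flagged as disambiguations.
SELECT p.page_title
FROM page p
LEFT JOIN externallinks el ON el.el_from = p.page_id
WHERE p.page_namespace = 0
  AND p.page_is_redirect = 0
  AND el.el_from IS NULL          -- no matching externallinks row at all
  AND NOT EXISTS (
    SELECT 1 FROM page_props pp
    WHERE pp.pp_page = p.page_id
      AND pp.pp_propname = 'disambiguation'
  );
```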
Maybe rename this page to WP:Request a SQL query. * Pppery * it has begun... 20:13, 14 November 2024 (UTC)
Or we could ask people to read past the page title to the first two sentences. —Cryptic 02:55, 15 November 2024 (UTC)

Syntax error due to using a reserved word as a table or column name in MySQL

https://quarry.wmcloud.org/query/87911

https://stackoverflow.com/questions/23446377/syntax-error-due-to-using-a-reserved-word-as-a-table-or-column-name-in-mysql

It isn't handling the `user` table right as "user" is an SQL reserved word, I think.

The syntax highlighter was showing "user" in red, so I surrounded it with backticks `user`, then it was showing in light blue.

I think it needs to be highlighted in white to work correctly. But how? wbm1058 (talk) 18:47, 14 November 2024 (UTC)

Unrelated to the reserved word. `WHERE IS NULL(u.user_name)` should be `WHERE u.user_name IS NULL`. But see prior noise at User talk:Primefac/Archive 32#U2 deletions if you want to continue this. * Pppery * it has begun... 20:12, 14 November 2024 (UTC)
https://www.w3schools.com/sql/sql_isnull.asp indicates that my syntax should be valid. Two alternative ways to do the same thing? Regarding the "prior noise", I'm a more competent administrator who's checking page histories, and leaving redirects within user space alone. My current focus is on cross-namespace redirects from user pages of nonexistent users to outside of userspace. My recent deletion log will give you an idea; I'm trying to make a more specific query to reduce the noise level in the query results I've been working from. – wbm1058 (talk) 20:53, 14 November 2024 (UTC)
Wikimedia uses MySQL (actually MariaDB which uses MYSQL-ish syntax), not SQL server where your link says ISNULL (not IS NULL which the query uses) is valid. * Pppery * it has begun... 21:06, 14 November 2024 (UTC)
MariaDB supports ISNULL(), and it works the way Wbm1058 was trying to use it (modulo the misplaced space). SQL Server's ISNULL() is a synonym of COALESCE() instead. x IS NULL is generally safer precisely because of that incompatibility. —Cryptic 21:29, 14 November 2024 (UTC)
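The difference, sketched (MariaDB semantics; the nullable column here is hypothetical, chosen only for illustration):

```sql
-- MariaDB: ISNULL(expr) is a test returning 1 if expr is NULL, else 0,
-- so these two filters are equivalent there:
SELECT user_name FROM `user` WHERE ISNULL(user_email);
SELECT user_name FROM `user` WHERE user_email IS NULL;

-- SQL Server: ISNULL(expr, fallback) instead substitutes a value,
-- behaving like COALESCE(expr, fallback) -- a different function,
-- which is why "x IS NULL" is the portable spelling.
```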
I tried just changing the syntax of the "IS NULL" statement as suggested. It was cooking on that for a while, and then:
"Error
This web service cannot be reached. Please contact a maintainer of this project.
Maintainers can find troubleshooting instructions from our documentation on Wikitech."
Hopefully my query didn't just crash the server. – wbm1058 (talk) 21:55, 14 November 2024 (UTC)
It just ran to completion, so simply changing the "IS NULL" statement fixed the syntax error. Now on to figure out the results, and tweak the query to do what I really want it to do. Thanks for your help. wbm1058 (talk) 22:09, 14 November 2024 (UTC)

FYI, I'm now feeling the joy. User:Wbm1058/Userpages of nonexistent users is my report of 400 pages which I think may all be safely speedy-deleted under U2: Userpage or subpage of a nonexistent user. This report was culled from a report of 1960 pages, by INTERSECT with the user table SELECT. This is indicative of the poor page-move interface design, which leads editors who think they're publishing user drafts to keep pages in userspace when they really wanted to move to mainspace, because they neglected the namespace dropdown in the move-page user interface. – wbm1058 (talk) 14:11, 15 November 2024 (UTC)

Dusty articles within all subcategories of a parent category

Is this possible? I'd like to get a list like Special:AncientPages but for anything within any subcategory of Category:Food and drink. It would make quite a nice little To Do list for times I feel like doing some research and writing but don't have a particular bee in my bonnet that very minute. Thanks for any help! Valereee (talk) 15:16, 17 November 2024 (UTC)

What is "dusty"? Neither Special:AncientPages nor Wikipedia talk:Special:AncientPages says what it does. Is it a sort by timestamp of last edit? In direct subcategories only, include the handful of pages directly in the category, or the whole tree? If the last, to what depth? Examples: Category:Food and drink → Category:Dairy → Category:Dairy industry → Mark Ezell is depth 2, and Category:Food and drink → Category:Dairy → Category:Dairy industry → Category:Dairy farming → Goatherd is depth 3; neither the page itself nor the root category count. —Cryptic 16:51, 17 November 2024 (UTC)
Yes, it's a list of articles by date of most recent edit.
Hm, on the second question. Ideally what I'd end up with is a list of, say, food items that hadn't been edited in ten years. Or chefs, or restaurants, or food animals or whatever. Maybe I need to choose a more specific subcategory? Valereee (talk) 17:24, 17 November 2024 (UTC)
Well, ok,
The reason I need a maximum depth is because - like almost all reasonably broad categories - Category:Food and drink eventually includes a significant portion of all categories. Depth 10, for example, has 122639 different categories in the tree, out of 2.4 million total categories (including maintenance categories, category redirects, and so on), and you really quickly start getting unrelated pages like Category:Food and drink → Category:Food and drink by country → Category:Agriculture by country → Category:Agriculture in Europe by country → Category:Agriculture in Romania → Category:Forestry in Romania → Category:Romanian woodworkers → Constantin Brâncuși. Or, if you like, you can give me a list of categories to pull from. Even if it's a large list, or something like "Anything in any direct subcategory of Category:Food and drink, Category:Cuisine, Category:Chefs, Category:Poultry, ". —Cryptic 18:33, 17 November 2024 (UTC)
Oh, and do you want non-mainspace pages in the list or not? What about redirects? —Cryptic 18:38, 17 November 2024 (UTC)
lol...clearly in over my head here. :D Thank you for your patience.
So, no to feed a cold, starve a fever. Yes to recipe, dulce de leche and ice milk.
I think maybe start with something that's likely to contain fewer extraneous things. Category:Chefs in a way that will allow me to see, for instance, the articles that are in Category:Chefs by nationality > Category:Women chefs by nationality > Category:British women chefs > Category:Women chefs from Northern Ireland that haven't been edited in the last ten years. Valereee (talk) 18:50, 17 November 2024 (UTC)
Oh, no non-mainspace pages, no redirects. Valereee (talk) 19:13, 17 November 2024 (UTC)
None quite that old in either tree. quarry:query/87975 for Category:Food and drink depth 3 (oldest is Land reform in the Austrian Empire, 2015-11-16 18:36:35 - see what I meant about unrelated pages?), quarry:query/87976 for Category:Chefs depth 4 (oldest is Richard Ekkebus, 2019-12-16T04:47:03). —Cryptic 19:19, 17 November 2024 (UTC)
Well, thank you for your work, and sorry to waste your time! Valereee (talk) 19:35, 17 November 2024 (UTC)
Not wasted at all. Not your fault the category system is terrible for datamining. There might be some value in finding the latest revision that wasn't marked minor, and maybe excluding ones made by bots too, but that's going to be harder and a lot slower. Would definitely need to cut the set of articles to look at to something on the order of a couple thousand before looking at the edits, rather than the tens of thousands in that first tree. —Cryptic 20:14, 17 November 2024 (UTC)
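A depth-limited walk of a category tree can be sketched with a recursive CTE (supported by MariaDB 10.2+; table and column names per the MediaWiki schema, with the depth cap and category chosen for illustration - and note page_touched is only a rough proxy for last-edit time; the real thing needs the revision table):

```sql
-- Subcategories of Category:Chefs to depth 4, then the mainspace
-- non-redirect pages in any of them, least recently touched first.
WITH RECURSIVE subcats (cat_title, depth) AS (
  SELECT CAST('Chefs' AS CHAR), 0
  UNION ALL
  SELECT p.page_title, s.depth + 1
  FROM subcats s
  JOIN categorylinks cl ON cl.cl_to = s.cat_title AND cl.cl_type = 'subcat'
  JOIN page p ON p.page_id = cl.cl_from
  WHERE s.depth < 4              -- bound the recursion
)
SELECT DISTINCT art.page_title, art.page_touched
FROM subcats s
JOIN categorylinks cl ON cl.cl_to = s.cat_title AND cl.cl_type = 'page'
JOIN page art ON art.page_id = cl.cl_from
WHERE art.page_namespace = 0
  AND art.page_is_redirect = 0
ORDER BY art.page_touched
LIMIT 100;
```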
Thanks. And I've actually already found an article that needs attention from your 87976 query, so win!
The point for me here is looking for categories that have many articles that haven't been updated since before sourcing started modernizing. It's a bit tricky because the articles that were created first -- probably in any category -- are also likely the articles that get updated often, have multiple watchers, etc. So it's possible there just aren't huge numbers of food articles that need this kind of attention. Valereee (talk) 21:18, 17 November 2024 (UTC)

Number of articles that are actually articles

There are 6,930,490, but AIUI that includes disambig pages, stand-alone lists, and outlines, and maybe even portals (i.e., all content namespaces, not just the mainspace) but excludes redirects. Is there a way to get a count of the number of just plain old ordinary articles, excluding the other types? (A percentage from a sample set is good enough; I'd like to be able to write a sentence like "Of the 6.9 million articles, 6.2 million are regular articles, 0.45 million are lists, and 0.2 million are disambig pages.") WhatamIdoing (talk) 22:46, 17 November 2024 (UTC)

@WhatamIdoing: according to Category:All disambiguation pages, there are 362,957 of those. BD2412 T 23:29, 17 November 2024 (UTC)
Category:WikiProject Outlines articles suggests that there are about a thousand of those, which will not have a material effect on the numbers.
Wikipedia:WikiProject Lists/Assessment#Statistics says they've tagged 131K pages. There are about 123,700 pages with "List of" or "Lists of" at the start of the title. WhatamIdoing (talk) 00:37, 18 November 2024 (UTC)
There is no clear definition of what a "regular article" is. Also, many pages are not correctly marked and categorized. Don't forget WP:INDEXES, which look like ordinary articles, or might be, depending. -- GreenC 00:43, 18 November 2024 (UTC)
{{NUMBEROFARTICLES}} seems to be mainspace non-redirect pages. I'd thought it used other heuristics, too; I remember needing at least one link, and less confidently requiring a period? But plainly it doesn't anymore; I'm getting 6912240 for ns0 !page_is_redirect on the replicas now. There's only 362201 non-redirects in Category:All disambiguation pages and mainspace. Most of the difference are in other namespaces, probably legitimately, though I'm surprised to see 208 in user:, 44 total in various talk namespaces, 9 mainspace redirects, and a single redirect in draftspace. 114253 mainspace non-redirects in Category:All set index articles, though 64 of those are in the disambig cat as well. Lists are less certain; there's no Category:All list pages. I could try to count pages that are in any category starting with "Lists " or ending with " lists", but - not having done that before - don't have any idea how many it would miss, and how many it would miscount. Ditto with pages starting with "List of " or "Lists of " (which is easy - 120653, not counting any redirs or pages in the dabs or set index cats). —Cryptic 01:00, 18 November 2024 (UTC)
Oh, and 11193167 redirects (so 18105407 total mainspace pages), if you care. —Cryptic 01:03, 18 November 2024 (UTC)
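The tallies above are straightforward aggregates over the page table; a sketch (MediaWiki schema):

```sql
-- Mainspace pages split by redirect status.
SELECT page_is_redirect, COUNT(*) AS pages
FROM page
WHERE page_namespace = 0
GROUP BY page_is_redirect;

-- Mainspace non-redirect disambiguation pages.
SELECT COUNT(*) AS dabs
FROM page
JOIN categorylinks ON cl_from = page_id
WHERE cl_to = 'All_disambiguation_pages'
  AND page_namespace = 0
  AND page_is_redirect = 0;
```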
So 6,912,230 non-redirect pages, of which 362,201 are dab pages and 120,653 are Lists (per title), and the rest (e.g., Outlines, Index pages) is immaterial. A good SIA looks like an article and an underdeveloped one looks like a dab page, which takes us back to GreenC's point about it not being a clear-cut question.
All of this suggests that if you count SIAs as 'articles', then there are 6.429 million articles (93%) and if you count them as lists/dabs, then there are 6.315 million articles (91%).
Thanks, all. WhatamIdoing (talk) 01:15, 18 November 2024 (UTC)

Median account age for EXTCON

Hello again, generous satisfiers of curiosity:

Today's question is how old the typical currently active WP:EXTCONFIRMED account is. The requirements are:

  • is currently active (perhaps made at least one edit during the last 30 days? Any plausible definition of active works for me, so pick whatever is easiest and cheapest to run)
  • meets EXTCON (all of which will have the EXTCON permission)

I don't care whether it's date of first edit vs registration date. I also don't care whether it's all ~73K of them or if it's a sample of 5K–10K. I am looking for an answer to the nearest year ("Most active EXTCON editors started editing before 2014" or "If you see someone editing an article under EXTCON, they probably started editing more than 10 years ago").

Thank you, WhatamIdoing (talk) 17:14, 19 November 2024 (UTC)

Hmm. user_touched has been scrubbed because it is private data. So I guess LEFT JOIN recentchanges to see who is active? This should only get us users who have made an edit in the last 30 days. Then run MEDIAN() on it. Let's see if quarry:query/88037 finishes. If the count is 72,000ish, then I also need to add a WHERE to filter out the editors who aren't in recentchanges. –Novem Linguae (talk) 18:33, 19 November 2024 (UTC)
That's going to get you not just every user with the user right - the whole point of a left join is that you get a result whether there's a row in the joined table or not - but a row in your resultset for every row they have in recentchanges. And you're leaving out admins, who have extended confirmed implicitly. Plus, even if it worked, it would be a dump of ~25k values. MariaDB has a MEDIAN() window function, but I can't get it to work on user_registration no matter how I try to preprocess it first - it gives me "Numeric datatype is required for percentile_cont function" when I call it directly on the column, which is reasonable, but always 100000000 if I cast it to any kind of numeric value, which isn't. (Anyone know what I'm doing wrong? Sample query. And I've never really grokked window funcs or how to get them to behave like normal, sane, grouped-by aggregate funcs anyway.) But doing it longhand works just fine. quarry:query/88039: 28 May 2013. —Cryptic 19:37, 19 November 2024 (UTC)
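The longhand median amounts to sorting the values and picking the middle row. A sketch (MediaWiki schema; this matches only the explicit extendedconfirmed group, so implicitly-extended-confirmed admins are left out, as noted above; the OFFSET would be computed first from a COUNT(*) over the same filter rather than hardcoded):

```sql
-- Median registration timestamp, done longhand.
SELECT user_registration
FROM `user`
JOIN user_groups ON ug_user = user_id
WHERE ug_group = 'extendedconfirmed'
ORDER BY user_registration
LIMIT 1 OFFSET 36500;  -- illustrative: ~half of the ~73k qualifying rows
```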
user_registration is ASCII-encoded binary rather than a binary number which is why you're getting nonsense when casting it and trying to do operations on it. Uhai (talk) 21:28, 19 November 2024 (UTC)
Casting it seems to get me a numeric, and doing normal arithmetic on it (user_registration + 1, division, and so on) coerces it to a numeric; it doesn't get me nonsense until I try to pass it through MEDIAN(). And UNIX_TIMESTAMP() in particular is documented to return an unsigned int. Current theory is that MEDIAN() can't deal with large numbers (see resultset 4; dividing by numbers smaller than ten gets me 100 million again), which is boggling. No, a cast or operation on the result of MEDIAN() is what fixes it. Still boggling. Cleaner query. Thanks for the prod. —Cryptic 21:55, 19 November 2024 (UTC)
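A sketch of the window-function form with the fix described above applied - MEDIAN() in MariaDB is shorthand for PERCENTILE_CONT(0.5) as a window function, so it takes an OVER clause and assigns every row the same value, and the cast goes on its result rather than its input:

```sql
-- MEDIAN() as a window function, with the cast on its *result*.
SELECT DISTINCT
  CAST(MEDIAN(UNIX_TIMESTAMP(user_registration)) OVER () AS UNSIGNED)
    AS median_reg_unixtime
FROM `user`
JOIN user_groups ON ug_user = user_id
WHERE ug_group = 'extendedconfirmed';
```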