Module talk:Find sources
Module:Find sources is permanently protected from editing because it is a heavily used or highly visible module. Substantial changes should first be proposed and discussed here on this page. If the proposal is uncontroversial or has been discussed and is supported by consensus, editors may use {{edit template-protected}} to notify an administrator or template editor to make the requested edit.
To help centralize discussions and keep related topics together, all talk subpages of this page, as well as Template talk:Find sources, Template talk:Find general sources, and Template talk:Find biographical sources, redirect here.
This module was considered for deletion on 14 May 2017. The result of the discussion was "no consensus".
This module does not require a rating on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
This page has archives. Sections older than 180 days may be automatically archived by Lowercase sigmabot III when more than 4 sections are present.
Number of transclusions of Template:Find general sources
Template:Find general sources has "869860 transclusion(s) found" as of 06:36, 30 November 2023 (UTC), as I just checked https://templatecount.toolforge.org/index.php?lang=en&namespace=10&name=Find_general_sources#bottom.
(Appreciate it if another user can reply below and confirm this.)
This is written here because Template talk:Find general sources redirects here. RZuo (talk) 06:36, 30 November 2023 (UTC)
- Quoting from what appears right now on the template page, this is "roughly 1% of all pages" on English Wikipedia. RZuo (talk) 06:42, 30 November 2023 (UTC)
- Also, quoting from Module_talk:Find_sources/Archive_1#Wikipedia_Library_text, "this is an 800k-transclusion template" in September 2021. RZuo (talk) 06:58, 30 November 2023 (UTC)
Edit request 12 January 2024
This edit request to Module:Find sources/templates/Find general sources has been answered. Set the |answered= or |ans= parameter to no to reactivate your request.
A discussion at Wikipedia:Village pump (proposals) resulted in consensus supporting the proposal to "remove all individual news outlets" from Module:Find sources/templates/Find general sources (closure: Special:Diff/1195065906). This means removal of the following lines:
{ code = 'new york times', display = "''NYT''", tooltip = 'The New York Times', },
{ code = 'ap', display = 'AP', tooltip = 'Associated Press', },
Adumbrativus (talk) 05:26, 12 January 2024 (UTC)
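For context, a minimal sketch of how such entries sit in a link list of the kind the /templates subpages return. Only the two commented-out lines come from the request above; the outer table shape, the links key, and the neighbouring 'google' entries are assumptions for illustration, not a copy of the live module:

-- Hypothetical excerpt of a Module:Find sources/templates/... configuration.
-- Only the two commented-out entries are taken from the request above; the
-- surrounding structure is an illustrative guess, not the module's contents.
return {
	links = {
		{ code = 'google', display = 'Google', },
		{ code = 'google books', display = 'books', },
		-- Entries removed per the consensus linked above:
		-- { code = 'new york times', display = "''NYT''", tooltip = 'The New York Times', },
		-- { code = 'ap', display = 'AP', tooltip = 'Associated Press', },
	},
}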
- Done by Sdkb in Special:Diff/1195169672 SWinxy (talk) 22:16, 12 January 2024 (UTC)
Proposed removal from 'Find video game sources' template
Remove both VG/RL and WPVG/Talk (neither seems to link anywhere) from Template:Find video game sources (these are the last two links). Superb Owl (talk) 18:16, 24 June 2024 (UTC)
- Unclear what change you are requesting. Please clarify. Mathglot (talk) 09:27, 25 June 2024 (UTC)
Trim "list of" from argument?
Can the logic here convert "List of foo" as a parameter to become a search for "foo"? e.g. "List of fictional rutabagas" becomes "fictional rutabagas" for search purposes? This would help correctly identify relevant sources in AfDs, and those who actually meant to search for "List of foo" can add the prefix back if desired. My experience is that "list of foo" consistently fails to produce any relevant sources, while "foo" will produce more sources, with some arguably relevant to the discussion. Jclemens (talk) 05:59, 20 August 2024 (UTC)
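A minimal sketch of the kind of normalisation being proposed, in plain Lua; the helper's name and where it would hook into the module are assumptions for illustration, not part of the current code:

-- Hypothetical helper: strip a leading "List of " (or "Lists of ") so that
-- "List of fictional rutabagas" is searched as "fictional rutabagas".
-- Editors who really do want the full phrase could still re-add the prefix.
local function trimListPrefix(searchTerm)
	-- gsub returns the modified string and a replacement count; keep the string.
	local trimmed = searchTerm:gsub('^[Ll]ists? of ', '', 1)
	return trimmed
end

-- trimListPrefix('List of fictional rutabagas')  --> 'fictional rutabagas'
-- trimListPrefix('Fictional rutabagas')          --> 'Fictional rutabagas' (unchanged)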
Chatbots as valid sources or identifiers of them
Awesome Aasim, can you please elaborate on your intentions with this sandbox edit? I believe that it would be a perversion of this module and the associated template to admit any notion of AI bots into the module configuration either as 1) a reliable source, or as 2) a good way to find reliable sources (their hallucinations are legion). Hence, I would be against porting your changes to the module or to the template without consensus achieved at an RfC on this page advertised at the main venues where AI bots are being discussed, as well as at WP:VPR. Thanks, Mathglot (talk) 04:51, 16 October 2024 (UTC)
- I kind of agree as well. However, Google and Bing also index unreliable sources. I was initially going to put in a query "find reliable sources for $1" but then decided against it for some reason. Awesome Aasim 14:00, 16 October 2024 (UTC)
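Purely to illustrate what is being described above, and not as an endorsement of adding it, a sandbox entry of that kind would presumably have looked something like the following; the url field, its $1 placeholder for the article title, and the example.org address are all assumptions, not the module's real schema or a real service:

-- Hypothetical link definition of the kind discussed above. The 'url' field,
-- its $1 placeholder for the article title, and the example.org address are
-- illustrative assumptions only.
local chatbotLink = {
	code = 'chatbot',
	display = 'chatbot',
	tooltip = 'Ask a chatbot to find reliable sources',
	url = 'https://example.org/search?q=find+reliable+sources+for+%22$1%22',
}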
- That is, of course, true. However, I have never seen Google or Bing invent a very convincing-looking source out of whole cloth, creating authors who do not exist, or very complex and scholarly-looking titles that look real but do not exist (but are quite similar to pieces of other titles that do exist), and so on. If the task assigned were to add sources to an article that do not exist but would rarely get challenged, AI bots are definitely the way to go. It's quite possible vandals or lazy or clueless editors are doing this already, and it is a problem that will have to be addressed at some point. Mathglot (talk) 15:34, 16 October 2024 (UTC)
- When a chatbot is powered by a search engine, it is less likely to make stuff up; but that doesn't mean it doesn't pull from unreliable sources. I have tried Copilot before (not necessarily for Wikipedia tasks, but for personal tasks like clarifying math concepts) and it has not really failed me. On the other hand, ChatGPT has occasionally made stuff up, especially when it does not query from the web. Awesome Aasim 15:45, 16 October 2024 (UTC)
- Imho, this page is not the place for Wikipedia editors to debate whether chatbots are more or less likely to hallucinate under this or that circumstance, and, pardon me, but anecdotal evidence about failure to fail in casual use by non-experts is close to worthless. Please use the AI discussion venues for that. Here we should debate whether a find sources module should use the results of AI, however triggered, and imho the answer to that is a slam-dunk 'no'. I will shut up now, and hopefully others will chime in. Mathglot (talk) 16:26, 16 October 2024 (UTC)
- Agreed entirely with Mathglot here. * Pppery * it has begun... 16:53, 16 October 2024 (UTC)
- Google's top-level "AI overview" results are rarely totally accurate, but they do give the source links, which in aggregate end up being significantly more reliable for us than their raw top search results, which, often being shit like Quora or low-quality zines, do not provide sources at all.
- I don't know if linking to another engine prompting something like ChatGPT would get better AI-enhanced results than Google or Bing in that respect, since they are putting a good deal of effort into making it give back real online-accessible sources. That seems to align with our goal here.
- Of course the other question is whether it's more enticing for the novice editor to have a shiny link saying "ChatGPT" or whatever latest AI tool is out there, instead of just clicking "Google", even if the result is the same. Also, calling exclusively the Google AI overview results does not appear possible right now as it's still considered an experimental feature, and it doesn't appear to be available in private browsing either. But that's my thought going forward. SamuelRiv (talk) 17:52, 16 October 2024 (UTC)
Listed at: Wikipedia:Village pump (miscellaneous). Mathglot (talk) 16:35, 16 October 2024 (UTC)
- I think that even if chatbots were 100% accurate in their output, we would still want to avoid using them for use-cases like this because of how heavily these technologies rely on Wikipedia itself as an information source. We need to stay upstream of LLMs to avoid circular referencing. signed, Rosguill talk 16:47, 16 October 2024 (UTC)
- I wonder if with the right prompt engineering we can get these chatbots to actually spit out reliable sources. We could maybe base our entire prompt on something like WP:RSPS, although it probably would overfill the query parameter. Awesome Aasim 17:25, 16 October 2024 (UTC)
- Are you proposing to engineer a solution that operates without human intervention? Because if you are successful, you should quit your day job and launch the next AI start-up, or become CTO of one of the existing ones. Mathglot (talk) 17:30, 16 October 2024 (UTC)
- No, a person will still need to click on and then review what is provided, and provide follow-up queries. The queries I pass into the ?q=... parameter may be a good start, but I don't think they are a good end. We Wikipedians should know this already; the first page of Google/Bing search results can likewise be littered with stuff like WP:DAILYFAIL and WP:NEWSMAX, depending on the query and on previous searches.
- I have mostly just toyed with ChatGPT and found it is not always good. Web-based LLMs like Copilot and Gemini are a bit better, although I remember they had a bumpy start, sometimes pulling nonsense from places like Reddit and Facebook.
- I do not agree that we should just copy and paste the exact output of an LLM. I think AI is good only for assisting humans; in practice it can never replace them. Awesome Aasim 17:38, 16 October 2024 (UTC)
- I think that including AI-oriented links is likely to cause more problems than it solves. WhatamIdoing (talk) 19:41, 17 October 2024 (UTC)