Talk:MDPI/Archive 5


Times Higher Education article

A new article dated March 15 may be a useful reference. ScienceFlyer (talk) 07:08, 17 March 2023 (UTC)

This is a very good article. It perfectly captures the issues people have with MDPI, and how its business model works. Kenji1987 (talk) 08:02, 17 March 2023 (UTC)
Great addition that should be used to bolster some citations that are possibly debatable in quality. For instance: nice that it uses the figure from the Crosetto blog, which I added previously as an expert WP:SPS. Lends that decision further credibility beyond its notoriety as a high-profile analysis of MDPI. -- Crawdaunt (talk) 10:33, 18 March 2023 (UTC)
The article is paywalled. What does it say? Banedon (talk) 10:45, 18 March 2023 (UTC)
Not sure about copy/pasting the whole article (bypassing paywall and pasting an article to Wikipedia in any form seems... sketchy). But here are snippets:
The Swiss open access publisher MDPI has rejected criticisms that its rapid expansion of journal “special issues” poses a threat to quality, despite having almost 56,000 such issues receiving submissions this year.
...
While some scholars have praised its rapid turnaround of papers (with a median submission-to-publication window of 38 days) and its transparency (22 per cent of articles use open peer review), it has faced questions over its use of special issues – for which journals invite scholars to propose and run publications on distinct themes, with submissions remaining open for months.
...
In addition to concerns about how peer review can be conducted effectively at this scale, questions have been raised about the relevance of some special issues to the journal’s main theme: one Sustainability special issue on “gender issues in sport and leisure”, which closed last year, included papers on doping in sport and how anxiety levels in female Chinese PhD students were affected by “funny running”.
...
Focusing on 98 MDPI journals with an impact factor, there were 55,985 special issues with a closing date in 2023, as of 23 February, Dr Crosetto told Times Higher Education. That compares with 39,587 open special issues identified at the end of March 2021, although only 10,504 of these eventually published anything. In 2022, 17,777 special issues published content.
[Figure from the Crosetto blog: number of MDPI open special issues per year. Source: Paolo Crosetto, scraped from the MDPI website]
...
While Dr Crosetto did not begrudge MDPI’s success, its growth raised questions about both demand and quality, he said. “Sustainability sports 3,512 open special issues – this is just shy of 10 per day. If each special issue plans to host six to 10 papers, this is 60 to 100 papers per day. At some point, you are bound to touch a limit – there possibly aren’t that many papers around,” he said.
Cheers
-- Crawdaunt (talk) 11:47, 18 March 2023 (UTC)
Thanks. To be honest, my first reaction was laughter, because the description closely matches my experience with MDPI. The idea that there aren't enough papers around is also interesting, and the first time I've seen it. I would nonetheless like to see this elaborated on because it's obvious that the number of papers published every year worldwide is increasing (c.f. [1]), and furthermore some articles are written because they are invited articles and would not have existed otherwise.
However, I cannot see how we can make use of this article as a source, unless it's as a source for rapid turnaround, open peer review, and how a substantial number of special issues have not published anything. Maybe in the History section? Banedon (talk) 01:29, 20 March 2023 (UTC)
I think the biggest take-away is that MDPI's business model is primarily based on the special-issue model, which confuses a lot of people, as it makes publishing less exclusive and more accessible (though we should not confuse them for anything else but a commercial company whose prime interest is making profit). The fact that Crosetto mentions that there aren't enough papers around clearly highlights this. Yes, there aren't enough papers within Western universities, but the rest of the world has plenty of papers to publish. Oviedo-García's problematic paper on MDPI states that a characteristic of a predatory journal is: "predominance of editorial board members from developing countries", which fits into this presumption as well. But anyway, to keep things short: we could add that MDPI has been criticised for its special-issues model (add that they had 54,000 special issues or so), which some scholars argue might jeopardize the quality of peer review, whereas MDPI contended that it does not. Kenji1987 (talk) 11:42, 20 March 2023 (UTC)
IMO this is just a better reference than some of the current references on the page per Wikipedia standards on referencing. e.g. one can now cite the Crosetto blogpost as the primary source, and this THE as the affirming reference that shows that blogpost is written by a subject expert. -- Crawdaunt (talk) 21:06, 20 March 2023 (UTC)
Interestingly, Jeffrey Beall wrote about MDPI special issues in 2015. ScienceFlyer (talk) 22:35, 20 March 2023 (UTC)
Aye, there's a whole section on Beall's list within the article, as MDPI was a major subject surrounding that controversial list. Crawdaunt (talk) 08:05, 21 March 2023 (UTC)

Clarivate de-lists MDPI flagship journal

See the announcement post by Clarivate here: https://clarivate.com/blog/supporting-integrity-of-the-scholarly-record-our-commitment-to-curation-and-selectivity-in-the-web-of-science/ And MDPI's response here: https://www.mdpi.com/about/announcements/5536

Among the ~50 journals dropped from Web of Science are the MDPI flagship journal International Journal of Environmental Research and Public Health (IJERPH) and the Journal of Risk and Financial Management (JRFM). In other words, IJERPH and JRFM can no longer have an impact factor...

Don't really know how this fits into the page yet (this is quite a big move by Web of Science/Clarivate), but thought I'd bring this info to the awareness of the page. IJERPH began in 2004 and is among MDPI's largest journals by publication output, being second only to Scientific Reports in terms of total journal article output among biomedicine-related journals (see: https://scontent-lhr8-2.xx.fbcdn.net/v/t39.30808-6/337128097_3317100025267257_2316774039826351253_n.jpg?stp=dst-jpg_p526x296&_nc_cat=1&ccb=1-7&_nc_sid=5cd70e&_nc_ohc=HmJUS2hW_4UAX9nQQka&_nc_ht=scontent-lhr8-2.xx&oh=00_AfDHGh-avvYdAWg4-tAAEEVH4eY6VV2c9ACrNPPm6MNLRg&oe=64238754).

Big news. -- Crawdaunt (talk) 15:19, 24 March 2023 (UTC) - edited

New article from Science. ScienceFlyer (talk) 21:12, 29 March 2023 (UTC)
I don't see anything in that article that is relevant? Banedon (talk) 00:28, 4 May 2023 (UTC)
  • You have to look in the history; two IPs insist on adding this issue to the article on the journal, even though the sources don't mention this journal. I've put a "see also" link to the relevant section here, but some extra eyes of people who have thought more about this issue than I myself are welcome. --Randykitty (talk) 05:56, 4 May 2023 (UTC)
Oh, you're referring to the history of the Wikipedia page. I thought it was the history of the journal, and my comment reflected there being no history section in that article. Considering the claim the source is supporting is "[Symmetry's] scientific quality is considered as poor by many institutions and mathematicians", I think Igor Pak is fine as a source, but I've not read the source, and furthermore any criticism of MDPI should clearly be in this article and not the one on Symmetry. Banedon (talk) 08:27, 4 May 2023 (UTC)

Corrections published in Oviedo-García's Research Evaluation article

Link to paper [2]. It looks like a correction has been published. The correction says "The original version of this article has been retracted and a revised version has been published as a replacement, addressing concerns about conclusions drawn in the article. The conclusions in the updated article are reached based on cited sources." Does anyone know if the version given in the first link is the corrected paper? If the first link is the original uncorrected paper, where can I find the corrected paper? If it is the corrected paper, where can I find the changes? Banedon (talk) 09:48, 11 July 2023 (UTC)

Yeah, they did something weird there... Normally a DOI is a permanent identifier and a correction gets a separate DOI. Here they've removed the original and not even left a statement of what was changed... Very odd for a journal whose purpose is "Research Evaluation". I could, in fact, find the original article (thanks RetractionWatch, below). There's a web archived version here.
RetractionWatch reports on a few changes and even provides a Microsoft Word document where they did a Track Changes comparison between the original and the revised version. Note: my browser said "this cannot be downloaded securely," but... I guess I live life on the edge as I did it anyways >: ) If you want to do the same, see the "(See a comparison we created here.)" line in the RetractionWatch page above.
My quickie scan says the new version has ~1000 fewer words than the original. I see many edits toning language down, changing "are in fact" to "could be considered", etc., or adding large blocks of text to soften conclusions. Ex:
ADDED: "Further research is needed to compare the JCR-indexed MPDI journals to similar journals in their respective fields in order to understand whether the level of self-citation is significantly different for MDPI published journals. Due to the breadth of MDPI journals assessed in this paper it was not possible to conduct in-depth work to compare each journal. As such a single proxy was used in the form of the non-MDPI leading journal. However, since these journals are all the top journals by Impact Factor in their respective subject categories it is not possible to know whether the differences observed between the two groups are meaningful, a wider sample is necessary to draw definitive conclusions." <- toned down conclusion
But some changes seem to actually just buckle down and reinforce previous conclusions:
ADDED: "Regarding journal names, aforementioned Principles state they “shall be unique and not be one that is easily confused with other journal or might mislead potential authors and readers about the journal’s origin or association with other journals”. However, as previously shown, some MDPI journals’ name are very similar to other publishers journals breaching the required condition of not been easily confused with another journal." <- reinforced conclusion
Cheers,
--Crawdaunt (talk) 05:50, 12 July 2023 (UTC)
I added a section on this to the article. I didn't add Giulia Stefelleni's criticism of the revised article, but that could reasonably be added too. Banedon (talk) 01:23, 27 July 2023 (UTC)
Added the special issue thing. That's what stood out from this paper to me. 800+ special issues per year is nuts. There's no way a full-time EiC could even keep track of the emails involved with that number of issues. Headbomb {t · c · p · b} 02:04, 27 July 2023 (UTC)
Not a fan of the section for several reasons. First, it's not the thrust of the paper (it's not given in the abstract). Second, I don't see why you can't have 800+ special issues per year. Sure, a single EiC cannot keep track of all the emails involved with that many special issues, but that's why special issues have guest editors. It does imply that the EiC is not the person setting up the SI, but that doesn't say anything about the peer review of the SI papers. More pragmatically, the only cited criticism of the SIs is by Jack Grove, but the criticism doesn't make sense: there are millions of articles published every year (Elsevier alone, for example, publishes more papers than MDPI), so it's clearly possible to conduct peer review on this scale. Finally, if the number of SIs is a concern, then the section is in the wrong place; it should be a subsection within the "Evaluation and controversies" section. If the number of SIs is not a concern, then the section is COATRACKed. If kept, I would move it to the History section, mention that at some point MDPI adopted the SI model (which also drove growth at Hindawi & Frontiers), and leave out Jack Grove's "criticism" (if it can be called criticism, since he only raised questions). Banedon (talk) 03:56, 27 July 2023 (UTC)
There are a million Elsevier articles across thousands of journals, on their regular schedule. When your journal has 12 regular issues and 3,000 special issues, something is amiss. This is both something odd (it doesn't have to be the focus of the Oviedo-García paper, but it is something she points out as extraordinary) and something criticized. Headbomb {t · c · p · b} 04:01, 27 July 2023 (UTC)
I don't see anything amiss with that. You can't do it with a hybrid or subscription journal (your subscription prices would be thrown out of whack, plus you can't exactly print an issue with 0-1 papers for physical reasons), but there's no fundamental reason an open access journal can't do it. Oviedo-Garcia's paper doesn't focus on SIs - it doesn't for example discriminate between self-citations from SI papers and non-SI papers - so it's at best a tangential source. I remain opposed to leaving the section as it is and prefer my edit described above. Banedon (talk) 04:11, 27 July 2023 (UTC)
@Crawdaunt: wondering what you think about the above (also curious what other watchers of this article think). Banedon (talk) 06:32, 1 August 2023 (UTC)
This is no longer about the Oviedo section, but rather this new subsection called proliferation of special issues?
I think that's been a topic in news before, and Oviedo comments on it certainly. This is currently built in as the last small paragraph of the History section, citing refs [7][8], although perhaps the following might be a good reference in addition to [8] as a way to legitimize [7] as an authority analysis: https://www.science.org/content/article/fast-growing-open-access-journals-stripped-coveted-impact-factors
Actually... I now realize this section is re-citing refs [7][8] again as refs 100+.
That all said, I don't think this deserves the current heading status. I think there could be a discussion about this being its own subsection within controversies? But also it's a bit redundant with what's already in the history section. Maybe the solution there is to move this from the History section into its own controversy subheading?
As it is, there's a weird focus on Oviedo-Garcia, who was one of many who identified this, but the in-depth analysis in ref [7] by Crosetto really started that whole conversation in Oviedo-Garcia, and gave rise to the recent articles in Times Higher Education and Sciencemag on the topic. Crawdaunt (talk) 14:47, 1 August 2023 (UTC)
There's no focus on OG, we just report what they did. If Crosetto precedes OG, then we should start with Crosetto. Headbomb {t · c · p · b} 16:43, 1 August 2023 (UTC)
Expanded with Crosetto. Headbomb {t · c · p · b} 17:26, 1 August 2023 (UTC)
Better in terms of balance, but I would say this is currently much longer than it ought to be, and again I'd merge it into the controversies section above as a subsection rather than give it its own section heading. I might also remove the last paragraph of the History section and port it to this new section. I'd actually say that's a better starting point than providing a full 4 short paragraphs on the topic...
This is a WP:DUE concern. Just because one can write further on something doesn't mean one should. The length of a section should be balanced with its overall importance to the present article. MDPI does have a high use of Special Issues, and that's a relevant controversial topic... But it's not more important than every other controversy covered, so to make it 2x as long and also under its own section header is a bit much.
Answering from phone atm so can look this over later to make edits in-line with my feedback if those aren't made already. Crawdaunt (talk) 17:54, 1 August 2023 (UTC)
I don't buy the length/due argument. It's also a rather bad fit in the 'controversies' section as it stands, given those are all about individual events, while this one is systematic. Headbomb {t · c · p · b} 18:23, 1 August 2023 (UTC)
@Banedon, thoughts? Crawdaunt (talk) 20:06, 1 August 2023 (UTC)
It's a good point that this is no longer related to the Oviedo-Garcia section, but rather about special issues; accordingly, I will start a new section on this talk page. Banedon (talk) 03:20, 2 August 2023 (UTC)

Proliferation of Special Issues section might not be appropriately placed

This is about what is currently the "Proliferation of special issues" section. I think the section is inappropriate in the current article because it deals with the special issue model for managing a journal (also called the "guest editor model" by Scholarly Kitchen [3]). The special issue model is a model, first and foremost. It is independent of MDPI. Per Scholarly Kitchen, the same model is used by Frontiers, Hindawi, and Springer Nature.

Because the special issue model is a business model, it is not directly relevant to MDPI, and I think the section is inappropriate. It's like writing a section saying that MDPI uses the open access model - it's not wrong, it's just inappropriate. If we have the section at all, it should be in the history section and say that MDPI started adopting the model on ___ date, and that it drove significant growth at potential reputational cost. For the last statement we can cite Crosetto/Scholarly Kitchen, but not Oviedo-García, since the latter did not suggest anything wrong with the special issues. (Truth be told, Crosetto is not a very good source either, because he only suggests there could be something wrong; he does not provide any evidence. If there's something similar for MDPI to Retraction Watch on paper mills @ Hindawi, then we could add it.) Banedon (talk) 03:40, 2 August 2023 (UTC)

"Because the special issue model is a business model, it is not directly relevant to MDPI"
It is directly relevant to MDPI because they (ab)use this model, discussed in multiple independent reliable sources. This is not a section about special issues in general, it's a section about special issues as used by MDPI. Headbomb {t · c · p · b} 07:01, 2 August 2023 (UTC)
I don't see anything in the section right now that applies to MDPI as opposed to the special issue model as a whole. Everything there could equally apply (with different numbers, of course) to the other three publishers listed above. Can you point something out? Banedon (talk) 07:05, 2 August 2023 (UTC)
3000+ special issues per journal, vs 3-4 special issue per journal is a major difference. It's three orders of magnitude greater. Headbomb {t · c · p · b} 07:11, 2 August 2023 (UTC)
Did you read what I wrote above in the Oviedo-Garcia section? The special issue model is applicable only to OA journals. It is not possible to run 3000+ special issues in a hybrid or subscription journal. Saying "there is a difference of three orders of magnitude" is like saying "there is a difference of an infinity orders of magnitude in the fees required to publish in these journals". It doesn't seem like you understand the topic. I won't be responding further until Crawdaunt (or someone else) responds. Banedon (talk) 07:33, 2 August 2023 (UTC)
This is nowhere near unique to open access publishers. Nearly every journal publisher out there publishes special/thematic issues. Headbomb {t · c · p · b} 08:03, 2 August 2023 (UTC)
On this, I agree with @Headbomb: it's simply not true that MDPI's use of special issues is similar to other publishers. There is a clear extreme use of Special Issues by MDPI. Those data reported in Times Higher Education and Science Magazine (news dept) originally come from the Crosetto analysis. To emphasize the notion that there genuinely is controversy about the sheer magnitude of special issues, see:
"Carlos Peixeira Marques of the University of Trás-os-Montes and Alto Douro, for example, says MDPI sent him multiple invitations to serve as a guest editor, in agriculture, animal science, and engineering—but never in his field of business and tourism. “The absolutely insane number of [MDPI] special issues has made it impossible to guarantee minimum peer-review standards,” he says.
The speed with which MDPI’s special issue manuscripts are reviewed and published is also a concern, Crosetto says. In 2022, MDPI’s median time from submission to acceptance was 37 days, well below the 200 days at the PLOS family of journals, another large, open-access publisher he examined for comparison. For about one in three papers, MDPI’s turnaround was 1 month or less. Considering the time it can take to recruit reviewers and revise manuscripts, “this looks just impossible,” he says."
https://www.science.org/content/article/fast-growing-open-access-journals-stripped-coveted-impact-factors
My concern is just a WP:DUE thing. As @Banedon says, this is not 100% unique to MDPI (although as above, MDPI is a flagbearer in growth through special issues, not without criticism from significant sources). So to have its own defined section, when it's just one among a long list of controversies, is unjustified to me. I don't grant the idea that, because it's longer-running, and not a single event, it somehow deserves its own section. MDPI's high rates of self-citation highlighted in the Oviedo article are long-running, but a specific source reported them, and so made it into a public controversy. I don't see anything different here.
So IMO it shouldn't be longer than one paragraph, and it shouldn't be in its own special section.
-- Crawdaunt (talk) 08:25, 2 August 2023 (UTC)
No, the point is that you must compare special issues from OA publishers using the guest editor model to other OA publishers using the guest editor model. The sheer number of MDPI special issues might sound unique, but it is not when you compare with Hindawi/Frontiers/Springer Nature. The Scholarly Kitchen article even gives links - here are 52 pages of "collections" (aka special issues) for Scientific Reports: [4].
The sources on there being a controversy because of the number of SIs are also very weak. E.g. Crosetto's quote basically boils down to "this looks impossible", but it looks to me that it's clearly possible and the evidence is staring him in the face, he just refuses to believe it. (See Argument from incredulity.) I for example have seen a lot of complaints about how MDPI asks for reviews in 10 days and also requests revisions from authors in 5-10 days, implying these things are happening, they're just happening at a very fast pace. If Crosetto has got any actual evidence that the papers were not properly reviewed, he hasn't produced it. He could go around emailing the authors of all the papers (not just by MDPI, but also Hindawi/Frontiers/Springer Nature) asking if they actually received reviews, and to share them. He could also approach the editors listed and ask if they actually made the decision. Absent this kind of evidence, I'd actually favor not using the source at all. Banedon (talk) 08:48, 2 August 2023 (UTC)
I don't agree that they're weak. Dozens of scientists have voiced concerns over the rigour of the MDPI peer review process, cited in the Science Magazine article (beyond just Crosetto, who contributed data), and also the Times Higher Education article published around the same time.
I don't think your notion of having to request info from every publisher under the sun stands... Wikipedia is a place to report what is part of the public conversation, not to question how valid one side of that conversation is. Followers, not leaders, and all that jazz.
There is a widespread view that MDPI's peer review is too lax, and its proliferation of special issues (possible thanks to relying on variable guest editors, not consistent staff editors), is part of that. Analyses by Petrou, an ardent defender of MDPI, nevertheless find MDPI's review process is bizarrely fast, requiring him to censor MDPI from the dataset to get a meaningful interpretation of the data (ref4 of the present page, publishing fast and slow (Petrou)).
...
Interesting position we all find ourselves in though... @Headbomb is pushing for a stronger focus on criticism of the SI proliferation, I am pushing for some middle ground, @Banedon is pushing for this to be dropped entirely(?).
@Randykitty or @Karlaz1 do you want to comment? Crawdaunt (talk) 19:17, 2 August 2023 (UTC)
"Dozens of scientists have voiced concerns over the rigour of the MDPI peer review process" Right, but these are concerns, not evidence. An analogous concern like "I don't see how Wikipedia can maintain editorial standards over millions of articles when their decision-making council (Arbcom) is only 10ish editors" does not show that Wikipedia has lax editorial standards. Sources like the Oviedo-García article talk about the number of special issues because that data is easy to acquire, compared to data on the rigour of the peer review process, which is much harder (aside from emailing the authors/editors I don't see how one can acquire the data, unless MDPI publishes it). Absent that data I am not even sure if we should imply a link. It would surely fail WP:BLP, but we're not a BLP.
I prefer adding information about the SIs to the history section. Something to the tune of "MDPI adopted the Guest editor model in ___, and the model drove rapid growth, but there are a lot of SIs that never published anything" would be appropriate. One could add "dozens of scientists have voiced concerns over the rigour of the peer review of SIs", but that's a general critique of the model. If we include that, then we could also reasonably include "dozens of scientists have voiced concerns over the gold OA model which incentivizes publishers to accept articles regardless of their quality" into this article, but then one could go around adding this claim to all OA publishers (which is pretty much all publishers at this point, there aren't many journals that don't have an OA option), and then it becomes silly. I would favor leaving it out; if kept it shouldn't be more than a couple of lines.
Your preference of a section under controversies would be my second choice. Banedon (talk) 01:08, 3 August 2023 (UTC)
I see your point. Again I think I differ in that MDPI is clearly a special case in its use of SIs, and also anomalous growth (alongside this use of SIs), and has a peer review turnaround half that of literally all other publishers or less. There is no objective definition of "good" peer review, only a definition of peer review. So when it becomes a public conversation about the quality of MDPI's science, its peer review, by virtue of over-reliance on the special issue model, I do see that as a controversy of MDPI, not of special issues. The controversy is over the frequency of special issues, not the fact that they are used.
-- Crawdaunt (talk) 06:47, 3 August 2023 (UTC)
I'm not sure if MDPI pioneered SIs (i.e. the Guest editor model). It's possible they did. If they did, then yeah it's clearly a special case, but only because they're the pioneer. I'm not convinced MDPI shows anomalous growth due to the model. Hindawi, Frontiers and Scientific Reports all use the model and show comparable growth. In fact, the evidence seems to suggest that the SI model is very good at driving growth, which is why all these publishers/journals rely on it. Finally, given the model involves making as many SIs as possible, if there is a controversy over the frequency of the SIs, then that's a controversy over the model and not the publisher.
(Also not convinced that there's a peer review turnaround time of half or less than all other publishers, since IEEE Access is even faster, but this does not seem relevant to the SIs.) Banedon (talk) 05:41, 4 August 2023 (UTC)
Pioneered, no, of course not. SIs have been around for over a century. What they pioneered was publishing the majority of their articles, across all their journals, through SIs. This is what makes MDPI unique among all publishers.
It's not true that all these publishers rely on SIs for growth. Definitely Frontiers and Hindawi have begun using SIs (not to the same extent as MDPI). Sci Reports is nothing like MDPI re: SIs; it is the Nature dumping-ground journal, like PLOS ONE but in the Nature portfolio. The overwhelming majority of its articles come from non-invited routes.
The 'model' here is business-oriented, not peer review-oriented. The controversy is over the practice of making SIs the majority way articles are published. One ought to appreciate that making "special issues" the default route for publication is an oxymoron.
The IEEE point is fair. Although I think IEEE notes their acceptance rate is ~30%, whereas MDPI journals publish their acceptance rates and they're more akin to 50-60% iirc. Finding the blog that analysed this now... IEEE link and MDPI reject rate chart included below.
https://ieeeaccess.ieee.org/about-ieee-access/frequently-asked-questions/#:~:text=On%20average%2C%20the%20IEEE%20Access,they%20receive%20the%20acceptance%20notification.
https://danielbrockington.files.wordpress.com/2022/11/screenshot-2022-11-10-at-22.41.30.png
In general, there are a ton of factors going on here. What is true through the murk, to me, is that there is a controversy quite specifically about MDPI's use of SIs, and that is the point we should be debating, not its merit. The tone of the section is determined by the validity of the content, but the proposal of a section itself is certainly justified in my view.
-- Crawdaunt (talk) 08:07, 4 August 2023 (UTC)
It sounds like we're not thinking of the same thing. When I think of "SI model", I am thinking of a publishing model where you try to get as many guest editors to lead as many special issues as possible. This is not doable by subscription journals. The reason is that the prices of subscription journals are intimately related to the number of issues they publish - you cannot suddenly have an extra 100 special issues in your 8 issues/year journal without throwing your subscription prices out of whack. In the same way, a subscription journal cannot have an "issue" with no papers, but OA journals have no trouble (although it's still not ideal, of course). Because only OA journals can implement the SI model, it means the SI model is recent.
If Frontiers and Hindawi have only just begun using SIs (I'm pretty sure Hindawi is a latecomer to the model, haven't checked if Frontiers started before MDPI though), then MDPI is the pioneer. That should be worth including in the article. Still, if MDPI is not the only publisher using the model, then I still don't see the controversy over the model as a controversy about MDPI. As for Sci Reports - what makes you say it's nothing like MDPI? It looks very similar, except the special issues are called "collections". BioMed Central are doing the same thing, as well. Rejection rate is a fair thing to mention in the article in my opinion, but that would be a separate topic to SIs, and any comparison would have to involve more publishers than just IEEE Access vs. MDPI. Banedon (talk) 09:51, 4 August 2023 (UTC)
Topical special issues existed in other journals before MDPI was founded, I think more generally focused on bringing in high-quality content than on boosting author-fee revenues. An example: [5] David Eppstein (talk) 12:18, 4 August 2023 (UTC)
I'm beginning to wonder how many times I have to repeat myself. When I think of "SI model", I am thinking of a publishing model where you try to get as many guest editors to lead as many special issues as possible. This is NOT doable by subscription journals, and NOT the same as the (few) special issues that subscription journals can do. Banedon (talk) 12:34, 4 August 2023 (UTC)
Yeah, I'm sorry to say the feeling's mutual. I'm not convinced by your arguments, you're not convinced by mine (or @Headbomb's). I really do understand your idea. Agree to disagree and just leave a vote open?
1. per Headbomb: a separate section devoted to the SI controversy.
2. per me: the SI controversy is folded in as a subsection of controversies, and reduced to similar length as other controversies.
3. per Banedon: the SI controversy is not an MDPI-specific controversy and doesn't deserve special mention on the page.
Does that seem like a fair summary?
-- Crawdaunt (talk) 16:25, 4 August 2023 (UTC)
I really don't understand this need to cull the length. It's 300 words, all backed by relevant sources. It's about 5% of the article lengthwise. That is not excessive. Headbomb {t · c · p · b} 16:44, 4 August 2023 (UTC)
Looks reasonable to me, too. XOR'easter (talk) 17:39, 4 August 2023 (UTC)
Same here. --Randykitty (talk) 08:07, 5 August 2023 (UTC)
That's 3:1:1, I think we both just got overruled @Banedon.
Thanks for input @Headbomb, @XOR'easter, @Randykitty.
-- Crawdaunt (talk) 06:03, 6 August 2023 (UTC)
I have to say it's refreshing to have a civil disagreement, compared to the rest of the often inane discussions that surround MDPI. Headbomb {t · c · p · b} 07:59, 6 August 2023 (UTC)