
Wikipedia:WikiProject Conservatism

From Wikipedia, the free encyclopedia


    Welcome to WikiProject Conservatism! Whether you're a newcomer or a regular, you'll receive encouragement and recognition for your achievements on conservatism-related articles. This project does not extol any point of view, political or otherwise, other than that of a neutral documentarian. Partly for this reason, the project's scope has long been conservatism broadly construed, taking in a healthy periphery of related (e.g., more academic) articles for context.

    Major alerts

    A broad collection of discussions that could lead to significant changes to related articles

    Did you know

    Articles for deletion

    • 14 Oct 2024 – Jessica Reed Kraus (talk · edit · hist) was AfDed by Ibjaja055 (t · c); see discussion (1 participant)
    • 02 Oct 2024 – Democratic Center Party (Turkey) (talk · edit · hist) was AfDed by Chidgk1 (t · c); see discussion (4 participants; relisted)
    • 06 Oct 2024 – Dominic Foppoli (talk · edit · hist) was AfDed by Russ Woodroofe (t · c) and closed as delete by Star Mississippi (t · c) on 13 Oct 2024; see discussion (7 participants)
    • 23 Sep 2024 – New Federalism (talk · edit · hist) was AfDed by Choucas Bleu (t · c) and closed as keep by Asilvering (t · c) on 14 Oct 2024; see discussion (11 participants; relisted)

    Proposed deletions

    Categories for discussion

    Redirects for discussion

    (69 more...)

    Good article nominees

    Requests for comments

    Requested moves

    Articles to be merged

    Articles to be split

    Articles for creation

    Watchlists

    WatchAll (Excerpt)
    Excerpt from watchlist concerning all the articles in the project's scope
    Note that your own edits, minor edits, and bot edits are hidden in this tab

    List of abbreviations (help):
    D
    Edit made at Wikidata
    r
    Edit flagged by ORES
    N
    New page
    m
    Minor edit
    b
    Bot edit
    (±123)
    Page byte size change


    For this watchlist at about 3X the length, visit: Wikipedia:WikiProject Conservatism/All recent changes
    WatchHot (Excerpt)
    The 10 related articles with the most edits in recent days
    394 edits Petre Pandrea
    91 edits List of Donald Trump 2024 presidential campaign endorsements
    88 edits 2024 Conservative Party leadership election
    72 edits Endorsements in the 2024 Conservative Party leadership election
    63 edits Fianna Fáil
    57 edits Shigeru Ishiba
    52 edits Grooming gang moral panic in the United Kingdom
    46 edits Robert Zoellick
    40 edits JD Vance
    40 edits Imran Khan

    These are the articles that have been edited the most within the last seven days. Last updated 13 October 2024 by HotArticlesBot.




    For this watchlist at about 5X the length, visit: Wikipedia:WikiProject Conservatism/Hot articles recent changes
    WatchPop (Excerpt)
    The 500 related articles with the most recent views in total

    This is a list of pages in the scope of Wikipedia:WikiProject Conservatism along with pageviews.

    To report bugs, please write on the Community tech bot talk page on Meta.

    List

    Period: 2024-09-01 to 2024-09-30

    Total views: 61,226,443

    Updated: 00:33, 6 October 2024 (UTC)

    Rank Page title Views Daily average Assessment Importance
    1 Laura Loomer 2,260,824 75,360 C Low
    2 Donald Trump 1,443,275 48,109 B High
    3 Project 2025 839,546 27,984 B Mid
    4 JD Vance 827,889 27,596 B Mid
    5 Ronald Reagan 669,141 22,304 FA Top
    6 Dick Cheney 617,188 20,572 GA Mid
    7 Vladimir Putin 482,155 16,071 B High
    8 Benjamin Netanyahu 468,220 15,607 B Mid
    9 George W. Bush 458,147 15,271 B High
    10 George H. W. Bush 357,287 11,909 B High
    11 Charlie Kirk 332,331 11,077 C Low
    12 Winston Churchill 324,408 10,813 GA Top
    13 Theodore Roosevelt 319,383 10,646 B High
    14 Richard Nixon 313,786 10,459 FA High
    15 Republican Party (United States) 311,857 10,395 B Top
    16 Family of Donald Trump 310,434 10,347 B Low
    17 Rupert Murdoch 288,851 9,628 B Low
    18 Gerald Ford 280,740 9,358 C High
    19 John McCain 280,103 9,336 FA Mid
    20 Viktor Orbán 274,764 9,158 C Mid
    21 Zionism 273,323 9,110 B Low
    22 Dwight D. Eisenhower 265,139 8,837 B High
    23 Alternative for Germany 264,961 8,832 C Low
    24 Matt Walsh (political commentator) 262,454 8,748 C Low
    25 Liz Cheney 257,864 8,595 B High
    26 Tucker Carlson 248,224 8,274 B High
    27 Candace Owens 239,496 7,983 B Low
    28 Red states and blue states 220,177 7,339 C Mid
    29 Am I Racist? 217,530 7,251 Start Mid
    30 Margaret Thatcher 213,363 7,112 GA Top
    31 Jon Voight 203,862 6,795 C Low
    32 Bharatiya Janata Party 196,262 6,542 GA Low
    33 French Revolution 192,143 6,404 B Unknown
    34 Cold War 188,977 6,299 C Top
    35 Rishi Sunak 188,705 6,290 B High
    36 List of Donald Trump 2024 presidential campaign endorsements 186,519 6,217 List Low
    37 John Wayne 186,335 6,211 B Low
    38 Woke 184,803 6,160 B Top
    39 Kelsey Grammer 183,439 6,114 B Low
    40 Bing Crosby 183,234 6,107 B Low
    41 Springfield pet-eating hoax 181,720 6,057 B Low
    42 Jordan Peterson 180,931 6,031 C Low
    43 Fyodor Dostoevsky 180,841 6,028 B Low
    44 Robert Jenrick 173,254 5,775 C Unknown
    45 Taliban 165,870 5,529 B High
    46 Chuck Norris 163,719 5,457 B Low
    47 Bangladesh Nationalist Party 163,649 5,454 C High
    48 Robert Duvall 158,528 5,284 B Low
    49 Patricia Heaton 158,164 5,272 C Low
    50 Charles de Gaulle 155,089 5,169 B Mid
    51 Shinzo Abe 151,951 5,065 B Mid
    52 William McKinley 151,944 5,064 FA Low
    53 Shirley Temple 151,874 5,062 B Low
    54 Constitution of the United States 151,123 5,037 B High
    55 Mitt Romney 149,800 4,993 FA High
    56 Stephen Baldwin 149,551 4,985 B Low
    57 QAnon 148,432 4,947 GA Mid
    58 Linda McMahon 142,443 4,748 B Low
    59 Nick Fuentes 140,965 4,698 B Low
    60 Shigeru Ishiba 140,877 4,695 Start Low
    61 Mike Pence 140,752 4,691 B Mid
    62 Dan Quayle 139,749 4,658 B Mid
    63 Nancy Reagan 139,627 4,654 B Mid
    64 James Caan 137,200 4,573 C Low
    65 James Stewart 132,034 4,401 GA Low
    66 John Kennedy (Louisiana politician) 131,232 4,374 C Low
    67 Rudy Giuliani 131,148 4,371 B Mid
    68 1964 United States presidential election 130,926 4,364 C Mid
    69 James A. Garfield 129,738 4,324 FA Low
    70 Javier Milei 129,266 4,308 B Mid
    71 Grover Cleveland 127,256 4,241 FA Mid
    72 Boris Johnson 126,727 4,224 B High
    73 Marjorie Taylor Greene 126,345 4,211 GA Low
    74 Herbert Hoover 125,050 4,168 B Mid
    75 Condoleezza Rice 124,949 4,164 B Mid
    76 Lara Trump 122,632 4,087 C Low
    77 Ted Cruz 122,063 4,068 B Mid
    78 Hillbilly Elegy 119,292 3,976 B Low
    79 Greg Gutfeld 118,849 3,961 C Low
    80 Imran Khan 115,224 3,840 B Low
    81 William Howard Taft 114,651 3,821 FA Mid
    82 Warren G. Harding 113,351 3,778 FA Low
    83 John Malkovich 112,997 3,766 C Low
    84 Ben Shapiro 112,684 3,756 C Mid
    85 Calvin Coolidge 111,143 3,704 FA High
    86 Lindsey Graham 109,085 3,636 C Low
    87 Mary Matalin 108,813 3,627 C Low
    88 Francisco Franco 108,483 3,616 C Mid
    89 Ayn Rand 107,715 3,590 GA Mid
    90 Liz Truss 107,542 3,584 FA Mid
    91 Sarah Palin 107,230 3,574 C Mid
    92 Chiang Kai-shek 106,097 3,536 C Low
    93 Mike DeWine 103,622 3,454 B Low
    94 Pat Sajak 103,369 3,445 C Low
    95 Sarah Huckabee Sanders 102,919 3,430 C Low
    96 Anthony Scaramucci 102,373 3,412 C Low
    97 Jeanine Pirro 101,750 3,391 B Low
    98 John Locke 99,660 3,322 C Top
    99 Chester A. Arthur 98,435 3,281 FA Low
    100 James Woods 96,743 3,224 Start Low
    101 Nikki Haley 95,867 3,195 B Low
    102 Otto von Bismarck 94,810 3,160 B High
    103 Muhammad Ali Jinnah 94,596 3,153 FA High
    104 Libertarianism 94,572 3,152 B High
    105 Clark Gable 93,833 3,127 B Low
    106 Agenda 47 93,751 3,125 C Top
    107 Generation 93,479 3,115 B Mid
    108 The Heritage Foundation 92,893 3,096 B High
    109 Ron DeSantis 92,309 3,076 B Mid
    110 Ben Carson 92,191 3,073 C Low
    111 Conservative Party (UK) 91,775 3,059 C High
    112 2024 Tenet Media investigation 91,762 3,058 C Low
    113 Thomas Sowell 91,526 3,050 C Mid
    114 Far-right politics 91,521 3,050 B Low
    115 Angela Merkel 91,078 3,035 GA High
    116 Mike Johnson 90,126 3,004 C Mid
    117 Fox News 89,720 2,990 C Mid
    118 Nancy Mace 89,223 2,974 B Low
    119 Iran–Contra affair 87,524 2,917 GA Low
    120 John Major 86,432 2,881 B High
    121 Anders Behring Breivik 86,403 2,880 C Low
    122 Mike Lindell 86,037 2,867 C Low
    123 Donald Trump 2024 presidential campaign 85,550 2,851 B Low
    124 2024 Liberal Democratic Party (Japan) presidential election 85,016 2,833 C Unknown
    125 Recep Tayyip Erdoğan 83,805 2,793 B High
    126 Charlton Heston 83,092 2,769 B Low
    127 Clarence Thomas 82,880 2,762 B Mid
    128 George Santos 82,779 2,759 B Low
    129 Kellyanne Conway 82,402 2,746 B Low
    130 Gadsden flag 81,567 2,718 B Low
    131 False or misleading statements by Donald Trump 81,380 2,712 B Low
    132 George Wallace 81,075 2,702 B Mid
    133 Mitch McConnell 80,414 2,680 B Mid
    134 Douglas Murray (author) 80,344 2,678 C Low
    135 Darryl Cooper 80,116 2,670 Redirect Low
    136 Adam Kinzinger 79,864 2,662 B Low
    137 Gary Sinise 79,669 2,655 C Low
    138 Spiro Agnew 79,635 2,654 FA Mid
    139 The Republicans (France) 79,004 2,633 Start Low
    140 Bo Derek 78,905 2,630 Start Low
    141 Matt Gaetz 78,385 2,612 C Low
    142 Neoliberalism 78,291 2,609 B Top
    143 Dave Mustaine 78,059 2,601 C Low
    144 Arthur Wellesley, 1st Duke of Wellington 77,714 2,590 B Low
    145 Benjamin Harrison 77,568 2,585 FA Low
    146 Deng Xiaoping 77,117 2,570 B Low
    147 Mullah Omar 76,674 2,555 B High
    148 Björn Höcke 75,852 2,528 Start Low
    149 Rich Lowry 75,000 2,500 Start Unknown
    150 Gretchen Carlson 74,727 2,490 B Low
    151 Brett Cooper (commentator) 74,016 2,467 Stub Low
    152 Charles Lindbergh 73,836 2,461 B Low
    153 Whig Party (United States) 73,816 2,460 C Low
    154 Kari Lake 73,685 2,456 C Low
    155 Paul von Hindenburg 72,965 2,432 C Mid
    156 Sean Hannity 72,511 2,417 B Mid
    157 Donald Rumsfeld 72,458 2,415 B Mid
    158 McCarthyism 72,009 2,400 C High
    159 Nigel Farage 72,006 2,400 B Mid
    160 Bill O'Reilly (political commentator) 71,959 2,398 B Mid
    161 Gary Cooper 71,627 2,387 FA Mid
    162 David Cameron 71,243 2,374 B Top
    163 Critical race theory 70,724 2,357 C Low
    164 Dmitry Medvedev 70,598 2,353 C High
    165 Herbert Kickl 69,878 2,329 Stub Mid
    166 Proud Boys 69,719 2,323 C Low
    167 Laura Ingraham 69,679 2,322 C Mid
    168 Dave Rubin 69,629 2,320 C Low
    169 Theresa May 69,521 2,317 B Mid
    170 House of Bourbon 68,862 2,295 B High
    171 David Duke 68,821 2,294 B Mid
    172 Tom Tugendhat 68,790 2,293 B Low
    173 Reform UK 68,271 2,275 C High
    174 Left–right political spectrum 68,187 2,272 C Top
    175 Lauren Boebert 67,318 2,243 B Low
    176 Liberal Democratic Party (Japan) 67,239 2,241 C High
    177 Right-wing politics 66,947 2,231 C Top
    178 2024 Solingen stabbing 66,577 2,219 C Low
    179 Falun Gong 66,186 2,206 B Mid
    180 Neoconservatism 65,929 2,197 C Top
    181 Conservative Party of Canada 65,596 2,186 B High
    182 Dan Bongino 65,491 2,183 C Mid
    183 Anna Paulina Luna 65,427 2,180 B Low
    184 Neville Chamberlain 65,355 2,178 FA Mid
    185 John Roberts 65,184 2,172 B High
    186 Malik Obama 65,132 2,171 Start Low
    187 Newt Gingrich 64,889 2,162 GA High
    188 Angie Harmon 64,873 2,162 C Low
    189 Dana Perino 64,677 2,155 C Low
    190 Chris Christie 64,661 2,155 C Low
    191 Roger Ailes 64,522 2,150 C Mid
    192 Michael Reagan 64,152 2,138 C Low
    193 Lauren Chen 63,871 2,129 Start Low
    194 Craig T. Nelson 63,826 2,127 Start Unknown
    195 Paul Ryan 63,694 2,123 C Mid
    196 Tom Cotton 63,332 2,111 C Low
    197 Scott Baio 62,925 2,097 Start Low
    198 Austrian People's Party 62,659 2,088 Start High
    199 T. S. Eliot 62,401 2,080 B Low
    200 Alice Weidel 62,106 2,070 C Low
    201 Great Replacement 62,099 2,069 C Top
    202 Presidency of Donald Trump 60,241 2,008 B Low
    203 Rush Limbaugh 60,122 2,004 B High
    204 Barry Goldwater 60,043 2,001 B High
    205 Jair Bolsonaro 59,906 1,996 B Mid
    206 Effects of pornography 59,896 1,996 C Low
    207 Pat Buchanan 59,835 1,994 B Mid
    208 Steve Bannon 59,711 1,990 B Mid
    209 The Daily Wire 59,656 1,988 C Low
    210 Capitalism 59,529 1,984 C Top
    211 Lauren Southern 59,395 1,979 Start Mid
    212 2024 United Kingdom riots 59,392 1,979 B Low
    213 Jeb Bush 59,060 1,968 B Low
    214 Truth Social 58,577 1,952 B Low
    215 Ted Nugent 58,384 1,946 C Low
    216 Conservatism 58,352 1,945 B Top
    217 Stephen Miller (political advisor) 58,211 1,940 B Low
    218 Mark Levin 57,897 1,929 Start High
    219 Likud 57,731 1,924 C Low
    220 Rutherford B. Hayes 57,604 1,920 FA Low
    221 What Is a Woman? 57,511 1,917 B Low
    222 Bob Dole 57,367 1,912 B Low
    223 Deep state in the United States 57,035 1,901 Start Low
    224 Milton Friedman 56,965 1,898 GA High
    225 Tom Clancy 56,794 1,893 C Low
    226 Tim Scott 56,716 1,890 C Low
    227 Ann Coulter 56,607 1,886 B Mid
    228 Curtis Yarvin 56,288 1,876 C High
    229 James Cleverly 56,120 1,870 C Low
    230 Anthony Eden 55,780 1,859 B Mid
    231 Barbara Stanwyck 55,617 1,853 B Low
    232 Bob Hope 55,608 1,853 B Low
    233 Manosphere 55,470 1,849 Start Low
    234 Milo Yiannopoulos 55,362 1,845 C Low
    235 Strom Thurmond 55,020 1,834 B Mid
    236 W. B. Yeats 54,087 1,802 FA Low
    237 John Layfield 53,560 1,785 B Low
    238 Kayleigh McEnany 53,399 1,779 C Low
    239 Trump derangement syndrome 53,290 1,776 C Mid
    240 Stacey Dash 52,975 1,765 C Low
    241 Liberty University 52,936 1,764 B Mid
    242 Christian Democratic Union of Germany 52,875 1,762 C High
    243 Greg Abbott 52,817 1,760 B Mid
    244 Benjamin Disraeli 52,416 1,747 FA Top
    245 Kristi Noem 52,298 1,743 B Low
    246 1924 United States presidential election 52,086 1,736 C Low
    247 Laura Bush 51,915 1,730 GA Low
    248 Amy Coney Barrett 51,695 1,723 C Low
    249 Itamar Ben-Gvir 51,592 1,719 C Mid
    250 Patriots for Europe 51,531 1,717 C Low
    251 Kelly Ayotte 51,328 1,710 C Low
    252 Rick Scott 51,296 1,709 C Low
    253 Make America Great Again 51,024 1,700 B Low
    254 Roger Stone 50,958 1,698 C Low
    255 Robert Davi 50,934 1,697 Start Low
    256 Daily Mail 50,885 1,696 B Mid
    257 The Fountainhead 50,786 1,692 FA Low
    258 Jeff Flake 50,786 1,692 C Mid
    259 Brett Kavanaugh 50,663 1,688 B High
    260 Melissa Joan Hart 50,231 1,674 B Low
    261 Shiv Sena 50,226 1,674 C Unknown
    262 Jack Kemp 50,218 1,673 GA Mid
    263 Reform Party of the United States of America 49,952 1,665 C Low
    264 Nicolas Sarkozy 49,000 1,633 B High
    265 Denis Leary 48,635 1,621 C NA
    266 Glenn Beck 48,304 1,610 B Mid
    267 Dinesh D'Souza 48,280 1,609 B Mid
    268 Blaire White 47,943 1,598 Start Low
    269 James Cagney 47,664 1,588 GA Low
    270 National Rally 47,604 1,586 GA High
    271 Larry Hogan 47,488 1,582 B Low
    272 Trumpism 47,317 1,577 B Mid
    273 Riley Gaines 47,296 1,576 B Mid
    274 Kataeb Party 47,241 1,574 B Low
    275 Don King 47,201 1,573 B Low
    276 The Wall Street Journal 47,097 1,569 B Mid
    277 Pat Boone 46,967 1,565 C Low
    278 Kevin McCarthy 46,834 1,561 C Low
    279 Stephen Harper 46,767 1,558 GA High
    280 White supremacy 46,674 1,555 B Low
    281 Benny Johnson (columnist) 46,435 1,547 Start Low
    282 The Epoch Times 46,249 1,541 B Low
    283 Dave Ramsey 46,104 1,536 C Unknown
    284 Federalist Party 45,944 1,531 C Low
    285 T. D. Jakes 45,930 1,531 C Unknown
    286 Ron Paul 45,820 1,527 C Mid
    287 Marco Rubio 45,759 1,525 B Mid
    288 Frank Luntz 45,675 1,522 B Low
    289 Michael Steele 45,598 1,519 B Low
    290 William F. Buckley Jr. 45,580 1,519 C Top
    291 Meghan McCain 45,367 1,512 C Low
    292 Jacobitism 45,263 1,508 B High
    293 Menachem Begin 45,181 1,506 B Mid
    294 Karl Malone 44,841 1,494 Start Low
    295 Rumble (company) 44,796 1,493 Start Low
    296 Lynne Cheney 44,335 1,477 C Low
    297 Antonin Scalia 43,864 1,462 FA High
    298 Joe Scarborough 43,642 1,454 B Low
    299 Mahathir Mohamad 43,421 1,447 GA High
    300 Free Democratic Party (Germany) 43,126 1,437 C Mid
    301 Groypers 42,328 1,410 B Low
    302 Tradwife 42,052 1,401 B Low
    303 Martin Heidegger 41,993 1,399 C Low
    304 Harold Macmillan 41,911 1,397 B High
    305 Redneck 41,871 1,395 C Low
    306 Trump wall 41,677 1,389 C Low
    307 Tommy Tuberville 41,564 1,385 B Low
    308 Right-wing populism 41,261 1,375 C Low
    309 Conservatism in the United States 41,220 1,374 B Top
    310 Edward Teller 40,938 1,364 FA Low
    311 Islamophobia 40,919 1,363 C Mid
    312 Last Man Standing (American TV series) 40,763 1,358 B Low
    313 Joe Kent 40,659 1,355 C Low
    314 Mike Huckabee 40,588 1,352 B Mid
    315 Trump fake electors plot 40,386 1,346 B High
    316 The Daily Telegraph 40,294 1,343 C Low
    317 Ginger Rogers 40,272 1,342 C Unknown
    318 Booker T. Washington 40,020 1,334 B Low
    319 Tea Party movement 39,854 1,328 C Mid
    320 Thomas Massie 39,823 1,327 B Low
    321 Brothers of Italy 39,745 1,324 B Mid
    322 John C. Calhoun 39,707 1,323 FA Top
    323 Fred Thompson 39,660 1,322 B Low
    324 Jane Russell 39,660 1,322 B Low
    325 Katie Britt 39,638 1,321 C Low
    326 Laissez-faire 39,623 1,320 C Top
    327 Taft–Hartley Act 39,510 1,317 B Low
    328 Oliver North 39,368 1,312 C Mid
    329 Turning Point USA 39,331 1,311 C Low
    330 New York Post 39,157 1,305 C Low
    331 Leonard Leo 38,912 1,297 C Mid
    332 United Russia 38,707 1,290 B High
    333 Jackson Hinkle 38,683 1,289 B Low
    334 The Times of India 38,397 1,279 C Mid
    335 John Birch Society 38,127 1,270 C Low
    336 Bret Stephens 38,001 1,266 C Low
    337 Rand Paul 37,949 1,264 GA Mid
    338 2008 California Proposition 8 37,709 1,256 B Mid
    339 The Gateway Pundit 37,588 1,252 C Unknown
    340 James O'Keefe 37,575 1,252 C Low
    341 Sheldon Adelson 37,554 1,251 C Low
    342 Nuclear family 37,455 1,248 Start Low
    343 Naftali Bennett 37,425 1,247 B Mid
    344 Curtis Sliwa 37,325 1,244 C Unknown
    345 William Barr 37,325 1,244 B Unknown
    346 Rachel Campos-Duffy 37,156 1,238 Start Low
    347 Dan Crenshaw 37,039 1,234 B Low
    348 Karl Rove 36,994 1,233 B Mid
    349 Friedrich Hayek 36,817 1,227 B Top
    350 Edward Heath 36,809 1,226 B High
    351 Fred MacMurray 36,699 1,223 C Low
    352 Michael Farmer, Baron Farmer 36,631 1,221 C Low
    353 Christopher Luxon 36,614 1,220 B Unknown
    354 Trey Gowdy 36,606 1,220 C Mid
    355 Terri Schiavo case 36,560 1,218 GA Low
    356 D. H. Lawrence 36,452 1,215 B Unknown
    357 Lee Hsien Loong 36,442 1,214 C Mid
    358 Park Chung Hee 36,442 1,214 C Low
    359 Lil Pump 36,346 1,211 B Low
    360 Breitbart News 36,201 1,206 C Mid
    361 Classical liberalism 35,858 1,195 B Top
    362 Jack Posobiec 35,838 1,194 C Low
    363 Alt-right 35,774 1,192 C Mid
    364 Murphy Brown 35,721 1,190 C Low
    365 Jim Jordan 35,505 1,183 B Low
    366 Mike Gabbard 35,233 1,174 Start Unknown
    367 Christian nationalism 35,083 1,169 Start High
    368 InfoWars 35,060 1,168 C Low
    369 Edmund Burke 35,042 1,168 B Top
    370 United National Party 35,022 1,167 C Low
    371 Kalergi Plan 35,018 1,167 Start Mid
    372 GypsyCrusader 34,889 1,162 C Low
    373 John Bolton 34,842 1,161 C Mid
    374 Richard B. Spencer 34,825 1,160 C Low
    375 Jacob Rees-Mogg 34,798 1,159 C Low
    376 Samuel Alito 34,749 1,158 C Mid
    377 Bible Belt 34,621 1,154 C Low
    378 Corey Lewandowski 34,617 1,153 C Low
    379 Rasmussen Reports 34,081 1,136 Start Low
    380 Johnny Ramone 33,878 1,129 C Low
    381 Enoch Powell 33,828 1,127 B High
    382 Neil Gorsuch 33,707 1,123 B Mid
    383 Drudge Report 33,700 1,123 B Mid
    384 Elise Stefanik 33,458 1,115 B Low
    385 Presidency of Ronald Reagan 33,435 1,114 C High
    386 Marine Le Pen 33,215 1,107 B Low
    387 Muhammad Zia-ul-Haq 33,092 1,103 B High
    388 Mark Rutte 33,064 1,102 C High
    389 Priti Patel 32,965 1,098 C Unknown
    390 Reince Priebus 32,825 1,094 Start Low
    391 Alessandra Mussolini 32,720 1,090 B Unknown
    392 Greg Locke 32,719 1,090 Start Low
    393 Fianna Fáil 32,703 1,090 B Low
    394 Bourbon Restoration in France 32,668 1,088 C High
    395 David Mamet 32,573 1,085 C Low
    396 Tom Wolfe 32,522 1,084 B Low
    397 Victor Davis Hanson 32,289 1,076 B Mid
    398 Flannery O'Connor 32,197 1,073 A Low
    399 António de Oliveira Salazar 32,179 1,072 B Unknown
    400 Honoré de Balzac 32,149 1,071 FA High
    401 2016 Republican Party presidential primaries 32,089 1,069 B Mid
    402 Jemima Goldsmith 32,086 1,069 C Unknown
    403 Loretta Young 32,081 1,069 C Low
    404 Marsha Blackburn 31,964 1,065 C Low
    405 Constitution Party (United States) 31,883 1,062 C Low
    406 Gavin McInnes 31,812 1,060 C Low
    407 First impeachment of Donald Trump 31,765 1,058 B High
    408 Libs of TikTok 31,606 1,053 C Low
    409 Thomas Mann 31,578 1,052 C Mid
    410 Moshe Dayan 31,509 1,050 B Mid
    411 Betsy DeVos 31,346 1,044 C Mid
    412 Newsmax 31,037 1,034 Start Low
    413 Ustaše 31,001 1,033 C High
    414 Walter Brennan 30,985 1,032 C Low
    415 Frank Bruno 30,891 1,029 Start Unknown
    416 Franz von Papen 30,856 1,028 B Low
    417 Islamism 30,725 1,024 B High
    418 Éamon de Valera 30,685 1,022 B High
    419 Political spectrum 30,659 1,021 C Top
    420 Morgan Ortagus 30,451 1,015 C Unknown
    421 Illegal immigration to the United States 30,377 1,012 B Low
    422 Nawaz Sharif 30,373 1,012 B Unknown
    423 Chuck Grassley 30,170 1,005 C Mid
    424 Steele dossier 30,055 1,001 B Low
    425 Tomi Lahren 30,008 1,000 Start Low
    426 Byron Donalds 29,841 994 C Low
    427 Fine Gael 29,721 990 B High
    428 Steven Crowder 29,519 983 C Mid
    429 Dennis Miller 29,228 974 Start Low
    430 Christopher Rufo 29,228 974 C Low
    431 Mike Lee 28,983 966 C Low
    432 Progressivism 28,964 965 C Mid
    433 Rick Perry 28,890 963 B Mid
    434 Facebook–Cambridge Analytica data scandal 28,661 955 C Unknown
    435 Samuel Taylor Coleridge 28,567 952 C Top
    436 Natural law 28,564 952 C Top
    437 Lillian Gish 28,441 948 C Low
    438 Fidesz 28,352 945 C Unknown
    439 Economic policy of the Donald Trump administration 28,255 941 Start Low
    440 Phil Robertson 28,047 934 C Low
    441 Jeff Sessions 28,016 933 Start Unknown
    442 Hillsdale College 27,974 932 C Low
    443 BC United 27,927 930 C Mid
    444 Southern strategy 27,923 930 B High
    445 Michel Houellebecq 27,847 928 C Low
    446 Koch family 27,845 928 Start High
    447 Party for Freedom 27,800 926 C Mid
    448 Winsome Sears 27,670 922 C Low
    449 Cultural Marxism conspiracy theory 27,632 921 B Low
    450 Project Veritas 27,611 920 B Low
    451 Frankfurt School 27,552 918 B Low
    452 Dark Enlightenment 27,352 911 Start Mid
    453 Rule of law 27,291 909 C Top
    454 Austrian school of economics 27,242 908 B Mid
    455 Andrea Tantaros 27,221 907 C Low
    456 Gary Johnson 27,177 905 GA High
    457 National Review 27,065 902 C High
    458 Brian Mulroney 27,008 900 B High
    459 David Koch 27,003 900 C Mid
    460 Hope Hicks 26,992 899 C Low
    461 Madison Cawthorn 26,984 899 C Low
    462 Moms for Liberty 26,921 897 B Low
    463 Social stratification 26,871 895 C High
    464 Presidency of George W. Bush 26,865 895 C High
    465 Otzma Yehudit 26,734 891 B Mid
    466 Norma McCorvey 26,651 888 C Unknown
    467 White movement 26,638 887 B Mid
    468 William Rehnquist 26,566 885 B High
    469 Profumo affair 26,318 877 FA Mid
    470 CDU/CSU 26,302 876 C Low
    471 American Independent Party 26,283 876 C Low
    472 Ward Bond 26,262 875 C Low
    473 White genocide conspiracy theory 26,073 869 B Low
    474 John Connally 25,851 861 B Mid
    475 Jerry Falwell 25,850 861 B High
    476 Broken windows theory 25,683 856 C Low
    477 Reaganomics 25,635 854 B Mid
    478 Race and crime in the United States 25,625 854 C Mid
    479 John O'Hurley 25,589 852 Start Low
    480 Bill Kristol 25,522 850 B High
    481 Alec Douglas-Home 25,473 849 FA Low
    482 Bezalel Smotrich 25,466 848 C Mid
    483 Grey Wolves (organization) 25,435 847 B Mid
    484 Penny Mordaunt 25,374 845 B Low
    485 Boogaloo movement 25,288 842 B Low
    486 Tories (British political party) 25,273 842 C High
    487 Anarcho-capitalism 25,157 838 B Low
    488 Nippon Kaigi 25,138 837 C Mid
    489 Enrique Peña Nieto 25,108 836 B Low
    490 Julius Evola 24,931 831 B Low
    491 Elaine Chao 24,921 830 B Low
    492 12 Rules for Life 24,914 830 B Mid
    493 History of the Republican Party (United States) 24,890 829 B High
    494 Meir Kahane 24,829 827 B High
    495 Doug Ford 24,799 826 C Low
    496 Irene Dunne 24,669 822 GA Low
    497 Ben Sasse 24,630 821 B Low
    498 Charles Koch 24,494 816 B Low
    499 Pan-Islamism 24,352 811 C High
    500 Orson Scott Card 24,316 810 B Low



    For this watchlist at about 3X the length, see: Wikipedia:WikiProject Conservatism/Recent changes
    Alternative watchlist prototypes (Excerpts)
    See also: Low-importance recent changes
    See also: Mid-importance recent changes
    See also: High-importance recent changes
    See also: Top-importance recent changes
    See also: Preconfigured recent vandalism shortlist

    Publications watchlist prototype beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Publications recent changes

    Watchlist of journalists, bloggers, commentators etc., beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Journalism recent changes

    Organizations watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Organizations recent changes

    Prototype political parties watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Political parties recent changes

    Prototype politicians watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Politicians recent changes

    Prototype MISC (drafts, templates etc.) watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/MISC recent changes

    New articles

    A list of semi-related articles that were recently created

    This list was generated from these rules. Questions and feedback are always welcome! The search is being run daily with the most recent ~14 days of results. Note: Some articles may not be relevant to this project.

    Rules | Match log | Results page (for watching) | Last updated: 2024-10-13 19:59 (UTC)

    Note: The list display can now be customized by each user. See List display personalization for details.

    In The Signpost

    One of various articles to this effect
    The Right Stuff
    July 2018
    DISCUSSION REPORT
    WikiProject Conservatism Comes Under Fire

    By Lionelt

    WikiProject Conservatism was a topic of discussion at the Administrators' Noticeboard/Incident (AN/I). Objective3000 started a thread expressing concern about the number of RFC notices posted on the Discussion page, suggesting that such notices "could result in swaying consensus by selective notification." Several editors participated in the relatively brief six-hour discussion. The assertion that the project is a "club for conservatives" was countered by editors listing examples of users who "profess no political persuasion." It was also noted that notifying WikiProjects of ongoing discussions is explicitly permitted by the WP:Canvassing guideline.

    At one point the discussion segued to feedback about The Right Stuff. Member SPECIFICO wrote: "One thing I enjoy about the Conservatism Project is the handy newsletter that members receive on our talk pages." Atsme praised the newsletter as "first-class entertainment...BIGLY...first-class...nothing even comes close...it's amazing." Some good-natured sarcasm was offered with Objective3000 observing, "Well, they got the color right" and MrX's followup, "Wow. Yellow is the new red."

    Admin Oshwah closed the thread with the result "definitely not an issue for ANI" and directed editors to the project Discussion page for any further discussion. Editor's note: the design and color of The Right Stuff were originally chosen to mimic an old paper newspaper.

    Add the Project Discussion page to your watchlist for the "latest RFCs" at WikiProject Conservatism Watch (Discuss this story)

    ARTICLES REPORT
    Margaret Thatcher Makes History Again

    By Lionelt

    Margaret Thatcher is the first article promoted at the new WikiProject Conservatism A-Class review. Congratulations to Neveselbert. A-Class is a quality rating ranked higher than GA (Good article), though its criteria are less rigorous than those for FA (Featured article). WikiProject Conservatism is one of only two WikiProjects offering A-Class review, the other being WikiProject Military History. Nominate your article here. (Discuss this story)
    RECENT RESEARCH
    Research About AN/I

    By Lionelt

    Reprinted in part from the April 26, 2018 issue of The Signpost; written by Zarasophos

    Out of over one hundred editors surveyed, only twenty-seven (27%) are happy with the way reports of conflicts between editors are handled on the Administrators' Incident Noticeboard (AN/I), according to a recent survey. The survey also found that the dissatisfaction has varied causes, including "defensive cliques" and biased administrators, as well as fear of a "boomerang effect" due to the lack of a rule governing the scope of AN/I reports. The survey also included an analysis of available quantitative data about AN/I. Some notable takeaways:

    • 53% avoided making a report due to fearing it would not be handled appropriately
    • "Otherwise 'popular' users often avoid heavy sanctions for issues that would get new editors banned."
    • "Discussions need to be clerked to keep them from raising more problems than they solve."

    In the wake of Zarasophos' article, editors discussed the AN/I survey at The Signpost and also at AN/I. Ironically, a portion of the AN/I thread was hatted due to "off-topic sniping." To follow up on the problems identified by the research project, the Wikimedia Foundation Anti-Harassment Tools team and the Support and Safety team initiated a discussion. You can express your thoughts and ideas here.

    (Discuss this story)

    Delivered: ~~~~~



    Is Wikipedia Politically Biased? Perhaps


    A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.


    Report by conservative think-tank presents ample quantitative evidence for "mild to moderate" "left-leaning bias" on Wikipedia

    A paper titled "Is Wikipedia Politically Biased?"[1] answers that question with a qualified yes:

    [...] this report measures the sentiment and emotion with which political terms are used in [English] Wikipedia articles, finding that Wikipedia entries are more likely to attach negative sentiment to terms associated with a right-leaning political orientation than to left-leaning terms. Moreover, terms that suggest a right-wing political stance are more frequently connected with emotions of anger and disgust than those that suggest a left-wing stance. Conversely, terms associated with left-leaning ideology are more frequently linked with the emotion of joy than are right-leaning terms.
    Our findings suggest that Wikipedia is not entirely living up to its neutral point of view policy, which aims to ensure that content is presented in an unbiased and balanced manner.

    The author (David Rozado, an associate professor at Otago Polytechnic) has published ample peer-reviewed research on related matters before, some of which was featured e.g. in The Guardian and The New York Times. In contrast, the present report is not peer-reviewed and was not posted in an academic venue, unlike most of the research we usually cover here. Rather, it was published (and possibly commissioned) by the Manhattan Institute, a conservative US think tank, which presumably found its results not too objectionable. (Also, some – broken – URLs in the PDF suggest that Manhattan Institute staff members were involved in the writing of the paper.) Still, the report indicates an effort to adhere to various standards of academic research publications, including some fairly detailed descriptions of the methods and data used. It is worth taking more seriously than, for example, another recent report that alleged a different form of political bias on Wikipedia, which had likewise been commissioned by an advocacy organization and authored by an academic researcher, but was met with severe criticism by the Wikimedia Foundation (who called it out for "unsubstantiated claims of bias") and volunteer editors (see prior Signpost coverage).

    That isn't to say that there can't be some questions about the validity of Rozado's results, and in particular about how to interpret them. But let's first go through the paper's methods and data sources in more detail.

    Determining the sentiment and emotion in Wikipedia's coverage

    The report's main results regarding Wikipedia are obtained as follows:

    "We first gather a set of target terms (N=1,628) with political connotations (e.g., names of recent U.S. presidents, U.S. congressmembers, U.S. Supreme Court justices, or prime ministers of Western countries) from external sources. We then identify all mentions in English-language Wikipedia articles of those terms.

    We then extract the paragraphs in which those terms occur to provide the context in which the target terms are used and feed a random sample of those text snippets to an LLM (OpenAI’s gpt-3.5-turbo), which annotates the sentiment/emotion with which the target term is used in the snippet. To our knowledge, this is the first analysis of political bias in Wikipedia content using modern LLMs for annotation of sentiment/emotion."
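The mention-extraction step described in the quote can be sketched as follows. This is a minimal illustration of the general technique, not code from the paper; `find_snippets` is a hypothetical helper name:

```python
import re

def find_snippets(article_text, target_terms):
    """Return (term, paragraph) pairs for every paragraph that
    mentions one of the target terms (case-insensitive, whole-word)."""
    snippets = []
    for paragraph in article_text.split("\n\n"):
        for term in target_terms:
            if re.search(rf"\b{re.escape(term)}\b", paragraph, re.IGNORECASE):
                snippets.append((term, paragraph))
    return snippets

article = (
    "Tony Abbott served as Prime Minister of Australia.\n\n"
    "An unrelated paragraph about geography.\n\n"
    "Critics of Tony Abbott pointed to his climate policy."
)
print(find_snippets(article, ["Tony Abbott"]))  # two matching paragraphs
```

In the paper's pipeline, a random sample of such snippets would then be sent to the LLM for per-term annotation.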

    The sentiment classification rates the mention of a term as negative, neutral, or positive. (For the purpose of forming averages, this is converted into a quantitative scale from -1 to +1.) See the end of this review for some concrete examples from the paper's published dataset.
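The label-to-score conversion and per-term averaging can be reconstructed as a short sketch (a plausible reading of the paper's description, not its actual code):

```python
# Map each annotated mention onto the paper's -1..+1 scale.
LABEL_TO_SCORE = {"negative": -1, "neutral": 0, "positive": 1}

def mean_sentiment(labels):
    """Average sentiment for one term across all its annotated mentions."""
    scores = [LABEL_TO_SCORE[label] for label in labels]
    return sum(scores) / len(scores)

# e.g. the seven "Yankee Institute" snippets shown at the end of this
# review: four positive, two neutral, one negative
labels = ["positive", "neutral", "positive", "neutral",
          "negative", "positive", "positive"]
print(round(mean_sentiment(labels), 3))  # → 0.429
```

Averages like this, computed per term and then compared between the left- and right-leaning groups, are the basis of the paper's headline findings.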

    The emotion classification uses "Ekman’s six basic emotions (anger, disgust, fear, joy, sadness, and surprise) plus neutral."

    The annotation method used appears to be an effort to avoid the shortcomings of popular existing sentiment analysis techniques, which often only rate the overall emotional stance of a given text without determining whether it actually applies to a specific entity mentioned in it (or in some cases even fail to handle negations, e.g. by classifying "I am not happy" as a positive emotion). Rozado justifies the "decision to use automated annotation" (which presumably rendered considerable cost savings, also by resorting to OpenAI's older GPT 3.5 model rather than the more powerful but more expensive GPT-4 API released in March 2023), citing "recent evidence showing how top-of-the-rank LLMs outperform crowd workers for text-annotation tasks such as stance detection." This is indeed becoming a more widely used choice for text classification. But Rozado appears to have skipped the usual step of evaluating the accuracy of this automated method (and possibly improving the prompts it used) against a gold-standard sample from (human) expert raters.
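The validation step that appears to be missing can be as simple as measuring agreement between the LLM's labels and a human-annotated gold sample. This is a generic accuracy check for illustration, with made-up labels, not anything from the paper:

```python
def agreement(llm_labels, human_labels):
    """Fraction of snippets where the LLM label matches the human gold label."""
    assert len(llm_labels) == len(human_labels)
    matches = sum(a == b for a, b in zip(llm_labels, human_labels))
    return matches / len(llm_labels)

# Hypothetical example: the LLM disagrees with the human rater on one
# of five snippets.
llm   = ["positive", "neutral", "negative", "neutral", "positive"]
human = ["positive", "neutral", "negative", "positive", "positive"]
print(agreement(llm, human))  # → 0.8
```

Reporting a figure like this (ideally with a chance-corrected statistic such as Cohen's kappa) is the usual way to establish that an automated annotator is trustworthy before drawing conclusions from its output.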

    Selecting topics to examine for bias

    As for the selection of terms whose Wikipedia coverage to annotate with this classifier, Rozado does a lot of due diligence to avoid cherry-picking: "To reduce the degrees of freedom of our analysis, we mostly use external sources of terms [including Wikipedia itself, e.g. its list of members of the 11th US Congress] to conceptualize a political category into left- and right-leaning terms, as well as to choose the set of terms to include in each category." This addresses an important source of researcher bias.

    Overall, the study arrives at 12 different groups of such terms:

    • 8 of these refer to people (e.g. US presidents, US senators, UK members of parliament, US journalists).
    • Two are about organizations (US think tanks and media organizations).
    • The other two groups contain "Terms that describe political orientation", i.e. expressions that carry a left-leaning or right-leaning meaning themselves:
      • 18 "political leanings" (where "Rightists" receives the lowest average sentiment and "Left winger" the highest), and
      • 21 "extreme political ideologies" (where "Ultraconservative" scores lowest and "radical-left" has the highest – but still slightly negative – average sentiment)

    What is "left-leaning" and "right-leaning"?

    As discussed, Rozado's methods for generating these lists of people and organizations seem reasonably transparent and objective. It gets a bit murkier when it comes to splitting them into "left-leaning" and "right-leaning", where the chosen methods remain unclear and/or questionable in some cases. Of course there is a natural choice available for US Congress members, where the confines of the US two-party system mean that the left-right spectrum can be mapped easily to Democrats vs. Republicans (disregarding a small number of independents or libertarians).

    In other cases, Rozado was able to use external data about political leanings, e.g. "a list of politically aligned U.S.-based journalists" from Politico. There may be questions about construct validity here (e.g. it classifies Glenn Greenwald or Andrew Sullivan as "journalists with the left"), but at least this data is transparent and determined by a source not invested in the present paper's findings.

    But for example the list of UK MPs used contains politicians from 14 different parties (plus independents). Even if one were to confine the left vs. right labels to the two largest groups in the UK House of Commons (Tories vs. Labour and Co-operative Party, which appears to have been the author's choice judging from Figure 5), the presence of a substantial number of parliamentarians from other parties to the left or right of those would make the validity of this binary score more questionable than in the US case. Rozado appears to acknowledge a related potential issue in a side remark when trying to offer an explanation for one of the paper's negative results (no bias) in this case: "The disparity of sentiment associations in Wikipedia articles between U.S. Congressmembers and U.K. MPs based on their political affiliation may be due in part to the higher level of polarization in the U.S. compared to the U.K."

    Tony Abbott.
    Most negative sentiment among Western leaders: Former Australian PM Tony Abbott
    Scott Morrison.
    Most positive sentiment among Western leaders: Former Australian PM Scott Morrison

    This kind of question becomes even more complicated for the "Leaders of Western Countries" list (where Tony Abbott scored the most negative average sentiment, and José Luis Rodríguez Zapatero and Scott Morrison appear to be in a tie for the most positive average sentiment). Most of these countries do not have a two-party system either. Sure, their leaders usually (as in the UK case) hail from one of the two largest parties, one of which is more to the left and the other more to the right. But it certainly seems to matter for the purpose of Rozado's research question whether that major party is more moderate (center-left or center-right, with other parties between it and the far left or far right) or more radical (i.e. extending all the way to the far-left or far-right spectrum of elected politicians).

    What's more, the analysis for this last group compares political orientations across multiple countries. Which brings us to a problem that Wikipedia's Jimmy Wales had already pointed to back in 2006, in response to a conservative US blogger who had argued that there was "a liberal bias in many hot-button topic entries" on English Wikipedia:

    "The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don't, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. ... The idea that neutrality can only be achieved if we have some exact demographic matchup to [the] United States of America is preposterous."

    We already discussed this issue in our earlier reviews of a notable series of papers by Greenstein and Zhu (see e.g.: "Language analysis finds Wikipedia's political bias moving from left to right", 2012), which had relied on a US-centric method of defining left-leaning and right-leaning (namely, a corpus derived from the US Congressional Record). Those studies form a large part of what Rozado cites as "[a] substantial body of literature [that]—albeit with some exceptions—has highlighted a perceived bias in Wikipedia content in favor of left-leaning perspectives." (The cited exception is a paper[2] that had found "a small to medium size coverage bias against [members of parliament] from the center-left parties in Germany and in France", and identified patterns of "partisan contributions" as a plausible cause.)

    Similarly, 8 out of the 10 groups of people and organizations analyzed in Rozado's study are from the US (the two exceptions being the aforementioned lists of UK MPs and leaders of Western countries).

    In other words, one potential reason for the disparities found by Rozado might simply be that he is measuring an international encyclopedia with a (largely) national yardstick of fairness. This shouldn't lead us to dismiss his findings too quickly. But it is a bit disappointing that this possibility is nowhere addressed in the paper, even though Rozado diligently discusses some other potential limitations of the results. For example, he notes that "some research has suggested that conservatives themselves are more prone to negative emotions and more sensitive to threats than liberals", but points out that the general validity of those research results remains doubtful.

    Another limitation is that a simple binary left vs. right classification might be hiding factors that can shed further light on bias findings. Even in the US with its two-party system, political scientists and analysts have long moved to less simplistic measures of political orientations. A widely used one is the NOMINATE method which assigns members of the US Congress continuous scores based on their detailed voting record, one of which corresponds to the left-right spectrum as traditionally understood. One finding based on that measure that seems relevant in context of the present study is the (widely discussed but itself controversial) asymmetric polarization thesis, which argues that "Polarization among U.S. legislators is asymmetric, as it has primarily been driven by a substantial rightward shift among congressional Republicans since the 1970s, alongside a much smaller leftward shift among congressional Democrats" (as summarized in the linked Wikipedia article). If, for example, higher polarization was associated with negative sentiments, this could be a potential explanation for Rozado's results. Again, this has to remain speculative, but it seems another notable omission in the paper's discussion of limitations.

    What does "bias" mean here?

    A fundamental problem of this study (one which, to be fair, it shares with much fairness and bias research, in particular on Wikipedia's gender gap, where many studies similarly focus on binary comparisons that appeal to an intuitive sense of fairness) is justifying its answers to the following two basic questions:

    1. What would be a perfectly fair baseline, a result that makes us confident to call Wikipedia unbiased?
    2. If there are deviations from that baseline (often labeled disparities, gaps or biases), what are the reasons for that – can we confidently assume they were caused by Wikipedia itself (e.g. demographic imbalances in Wikipedia's editorship), or are they more plausibly attributed to external factors?

    Regarding 1 (defining a baseline of unbiasedness), Rozado simply assumes that this should imply statistically indistinguishable levels of average sentiment between left and right-leaning terms. However, as cautioned by one leading scholar on quantitative measures of bias, "the 'one true fairness definition' is a wild goose chase" – there are often multiple different definitions available that can all be justified on ethical grounds, and are often contradictory. Above, we already alluded to two potentially diverging notions of political unbiasedness for Wikipedia (using an international instead of US metric for left vs right leaning, and taking into account polarization levels for politicians).

    But yet another question, highly relevant for Wikipedians interested in addressing the potential problems reported in this paper, is how much its definition lines up with Wikipedia's own definition of neutrality. Rozado clearly thinks that it does:

    Wikipedia’s neutral point of view (NPOV) policy aims for articles in Wikipedia to be written in an impartial and unbiased tone. Our results suggest that Wikipedia’s NPOV policy is not achieving its stated goal of political-viewpoint neutrality in Wikipedia articles.

    WP:NPOV indeed calls for avoiding subjective language and expressing judgments and opinions in Wikipedia's own voice, and Rozado's findings about the presence of non-neutral sentiments and emotions in Wikipedia articles are of some concern in that regard. However, that is not the core definition of NPOV. Rather, it refers to "representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic." What if the coverage of the terms examined by Rozado (politicians, etc.) in those reliable sources, in their aggregate, were also biased in the sense of Rozado's definition? US progressives might be inclined to invoke the snarky dictum "reality has a liberal bias" by comedian Stephen Colbert. Of course, conservatives might object that Wikipedia's definition of reliable sources (having "a reputation for fact-checking and accuracy") is itself biased, or applied in a biased way by Wikipedians. For some of these conservatives (at least those that are not also conservative feminists) it may be instructive to compare examinations of Wikipedia's gender gaps, which frequently focus on specific groups of notable people like in Rozado's study. And like him, they often implicitly assume a baseline of unbiasedness that implies perfect symmetry in Wikipedia's coverage – i.e. the absence of gaps or disparities. Wikipedians often object that this is in tension with the aforementioned requirement to reflect coverage in reliable sources. For example, Wikipedia's list of Fields medalists (the "Nobel prize of Mathematics") is 97% male – not because of Wikipedia editors' biases against women, but because of a severe gender imbalance in the field of mathematics that is only changing slowly, i.e. factors outside Wikipedia's influence.

    All this brings us to question 2. above (causality). While Rozado uses carefully couched language in this regard ("suggests" etc, e.g. "These trends constitute suggestive evidence of political bias embedded in Wikipedia articles"), such qualifications are unsurprisingly absent in much of the media coverage of this study (see also this issue's In the media). For example, the conservative magazine The American Spectator titled its article about the paper "Now We've Got Proof that Wikipedia is Biased."

    Commendably, the paper is accompanied by a published dataset, consisting of the analyzed Wikipedia text snippets together with the mentioned term and the sentiment or emotion identified by the automated annotation. For illustration, below are the sentiment ratings for mentions of the Yankee Institute for Public Policy (the last term in the dataset, as a non-cherry-picked example), with the term bolded:

    Dataset excerpt: Wikipedia paragraphs with sentiment for "Yankee Institute for Public Policy"
    positive "Carol Platt Liebau is president of the Yankee Institute for Public Policy.Liebau named new president of Yankee Institute She is also an attorney, political analyst, and conservative commentator. Her book Prude: How the Sex-Obsessed Culture Damages Girls (and America, Too!) was published in 2007."
    neutral "Affiliates

    Regular members are described as ""full-service think tanks"" operating independently within their respective states.

    Alabama: Alabama Policy Institute
    Alaska: Alaska Policy Forum
    [...]
    Connecticut: Yankee Institute for Public Policy
    [...]
    Wisconsin: MacIver Institute for Public Policy, Badger Institute, Wisconsin Institute for Law and Liberty, Institute for Reforming Government
    Wyoming: Wyoming Liberty Group"
    positive "The Yankee Institute for Public Policy is a free market, limited government American think tank based in Hartford, Connecticut, that researches Connecticut public policy questions. Organized as a 501(c)(3), the group's stated mission is to ""develop and advocate for free market, limited government public policy solutions in Connecticut."" Yankee was founded in 1984 by Bernard Zimmern, a French entrepreneur who was living in Norwalk, Connecticut, and Professor Gerald Gunderson of Trinity College. The organization is a member of the State Policy Network."
    neutral "He is formerly Chairman of the Yankee Institute for Public Policy. On November 3, 2015, he was elected First Selectman in his hometown of Stonington, Connecticut, which he once represented in Congress. He defeated the incumbent, George Crouse. Simmons did not seek reelection in 2019."
    negative "In Connecticut the union is closely identified with liberal Democratic politicians such as Governor Dannel Malloy and has clashed frequently with fiscally conservative Republicans such as former Governor John G. Rowland as well as the Yankee Institute for Public Policy, a free-market think tank."
    positive "In 2021, after leaving elective office, she was named a Board Director of several organizations. One is the Center for Workforce Inclusion, a national nonprofit in Washington, DC, that works to provide meaningful employment opportunities for older individuals. Another is the William F. Buckley Program at Yale, which aims to promote intellectual diversity, expand political discourse on campus, and expose students to often-unvoiced views at Yale University. She also serves on the Board of the Helicon Foundation, which explores chamber music in its historical context by presenting and producing period performances, including an annual subscription series of four Symposiums in New York featuring both performance and discussion of chamber music. She is also a Board Director of the American Hospital of Paris Foundation, which provides funding support for the operations of the American Hospital of Paris and functions as the link between the Hospital and the United States, funding many collaborative and exchange programs with New York-Presbyterian Hospital. She is also a Fellow of the Yankee Institute for Public Policy, a research and citizen education organization that focuses on free markets and limited government, as well as issues of transparency and good governance."
    positive "He was later elected chairman of the New Hampshire Republican State Committee, a position he held from 2007 to 2008. When he was elected he was 34 years old, making him the youngest state party chairman in the history of the United States at the time. His term as chairman included the 2008 New Hampshire primary, the first primary in the 2008 United States presidential election. He later served as the executive director of the Yankee Institute for Public Policy for five years, beginning in 2009. He is the author of a book about the New Hampshire primary, entitled Granite Steps, and the founder of the immigration reform advocacy group Americans By Choice."

    Briefly


    Other recent publications

    Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.

    How English Wikipedia mediates East Asian historical disputes with Habermasian communicative rationality

    From the abstract: [3]

    "We compare the portrayals of Balhae, an ancient kingdom with contested contexts between [South Korea and China]. By comparing Chinese, Korean, and English Wikipedia entries on Balhae, we identify differences in narrative construction and framing. Employing Habermas’s typology of human action, we scrutinize related talk pages on English Wikipedia to examine the strategic actions multinational contributors employ to shape historical representation. This exploration reveals the dual role of online platforms in both amplifying and mediating historical disputes. While Wikipedia’s policies promote rational discourse, our findings indicate that contributors often vacillate between strategic and communicative actions. Nonetheless, the resulting article approximates Habermasian ideals of communicative rationality."

    From the paper:

    "The English Wikipedia presents Balhae as a multi-ethnic kingdom, refraining from emphasizing the dominance of a single tribe. In comparison to the two aforementioned excerpts [from Chinese and Korean Wikipedia], the lead section of the English Wikipedia concentrates more on factual aspects of history, thus excluding descriptions that might entail divergent interpretations. In other words, this account of Balhae has thus far proven acceptable to a majority of Wikipedians from diverse backgrounds. [...] Compared to other language versions, the English Wikipedia forthrightly acknowledges the potential disputes regarding Balhae's origin, ethnic makeup, and territorial boundaries, paving the way for an open and transparent exploration of these contested historical subjects. The separate 'Balhae controversies' entry is dedicated to unpacking the contentious issues. In essence, the English article adopts a more encyclopedic tone, aligning closely with Wikipedia's mission of providing information without imposing a certain perspective."

    (See also excerpts)

    Facebook/Meta's "No Language Left Behind" translation model used on Wikipedia

    From the abstract of this publication by a large group of researchers (most of them affiliated with Meta AI):[4]

    "Focusing on improving the translation qualities of a relatively small group of high-resource languages comes at the expense of directing research attention to low-resource languages, exacerbating digital inequities in the long run. To break this pattern, here we introduce No Language Left Behind—a single massively multilingual model that leverages transfer learning across languages. [...] Compared with the previous state-of-the-art models, our model achieves an average of 44% improvement in translation quality as measured by BLEU. By demonstrating how to scale NMT [neural machine translation] to 200 languages and making all contributions in this effort freely available for non-commercial use, our work lays important groundwork for the development of a universal translation system."

    "Four months after the launch of NLLB-200 [in 2022], Wikimedia reported that our model was the third most used machine translation engine used by Wikipedia editors (accounting for 3.8% of all published translations) (https://web.archive.org/web/20221107181300/https://nbviewer.org/github/wikimedia-research/machine-translation-service-analysis-2022/blob/main/mt_service_comparison_Sept2022_update.ipynb). Compared with other machine translation services and across all languages, articles translated with NLLB-200 has the lowest percentage of deletion (0.13%) and highest percentage of translation modification kept under 10%."

    "Which Nigerian-Pidgin does Generative AI speak?" – only the BBC's, not Wikipedia's

    From the abstract:[5]

    "Naija is the Nigerian-Pidgin spoken by approx. 120M speakers in Nigeria [...]. Although it has mainly been a spoken language until recently, there are currently two written genres (BBC and Wikipedia) in Naija. Through statistical analyses and Machine Translation experiments, we prove that these two genres do not represent each other (i.e., there are linguistic differences in word order and vocabulary) and Generative AI operates only based on Naija written in the BBC genre. In other words, Naija written in Wikipedia genre is not represented in Generative AI."

    The paper's findings are consistent with an analysis by the Wikimedia Foundation's research department that compared the number of Wikipedia articles to the number of speakers for the top 20 most-spoken languages, where Naija stood out as one of the most underrepresented.

    "[A] surprising tension between Wikipedia's principle of safeguarding against self-promotion and the scholarly norm of 'due credit'"

    From the abstract:[6]

    "Although Wikipedia offers guidelines for determining when a scientist qualifies for their own article, it currently lacks guidance regarding whether a scientist should be acknowledged in articles related to the innovation processes to which they have contributed. To explore how Wikipedia addresses this issue of scientific "micro-notability", we introduce a digital method called Name Edit Analysis, enabling us to quantitatively and qualitatively trace mentions of scientists within Wikipedia's articles. We study two CRISPR-related Wikipedia articles and find dynamic negotiations of micro-notability as well as a surprising tension between Wikipedia's principle of safeguarding against self-promotion and the scholarly norm of "due credit." To reconcile this tension, we propose that Wikipedians and scientists collaborate to establish specific micro-notability guidelines that acknowledge scientific contributions while preventing excessive self-promotion."

    See also coverage of a different paper that likewise analyzed Wikipedia's coverage of CRISPR: "Wikipedia as a tool for contemporary history of science: A case study on CRISPR"

    "How article category in Wikipedia determines the heterogeneity of its editors"

    From the abstract:[7]

    " [...] the quality of Wikipedia articles rises with the number of editors per article as well as a greater diversity among them. Here, we address a not yet documented potential threat to those preconditions: self-selection of Wikipedia editors to articles. Specifically, we expected articles with a clear-cut link to a specific country (e.g., about its highest mountain, "national" article category) to attract a larger proportion of editors of that nationality when compared to articles without any specific link to that country (e.g., "gravity", "universal" article category), whereas articles with a link to several countries (e.g., "United Nations", "international" article category) should fall in between. Across several language versions, hundreds of different articles, and hundreds of thousands of editors, we find the expected effect [...]"

    "What do they make us see:" The "cultural bias" of GLAMs is worse on Wikidata

    From the abstract:[8]

    "Large cultural heritage datasets from museum collections tend to be biased and demonstrate omissions that result from a series of decisions at various stages of the collection construction. The purpose of this study is to apply a set of ethical criteria to compare the level of bias of six online databases produced by two major art museums, identifying the most biased and the least biased databases. [...] For most variables the online system database is more balanced and ethical than the API dataset and Wikidata item collection of the two museums."

    References

    1. ^ Rozado, David (June 2024). "Is Wikipedia Politically Biased?". Manhattan Institute. Dataset: https://doi.org/10.5281/zenodo.10775984
    2. ^ Kerkhof, Anna; Münster, Johannes (2019-10-02). "Detecting coverage bias in user-generated content". Journal of Media Economics. 32 (3–4): 99–130. doi:10.1080/08997764.2021.1903168. ISSN 0899-7764.
    3. ^ Jee, Jonghyun; Kim, Byungjun; Jun, Bong Gwan (2024). "The role of English Wikipedia in mediating East Asian historical disputes: the case of Balhae". Asian Journal of Communication: 1–20. doi:10.1080/01292986.2024.2342822. ISSN 0129-2986. Closed access icon (access for Wikipedia Library users)
    4. ^ Costa-jussà, Marta R.; Cross, James; Çelebi, Onur; Elbayad, Maha; Heafield, Kenneth; Heffernan, Kevin; Kalbassi, Elahe; Lam, Janice; Licht, Daniel; Maillard, Jean; Sun, Anna; Wang, Skyler; Wenzek, Guillaume; Youngblood, Al; Akula, Bapi; Barrault, Loic; Gonzalez, Gabriel Mejia; Hansanti, Prangthip; Hoffman, John; Jarrett, Semarley; Sadagopan, Kaushik Ram; Rowe, Dirk; Spruit, Shannon; Tran, Chau; Andrews, Pierre; Ayan, Necip Fazil; Bhosale, Shruti; Edunov, Sergey; Fan, Angela; Gao, Cynthia; Goswami, Vedanuj; Guzmán, Francisco; Koehn, Philipp; Mourachko, Alexandre; Ropers, Christophe; Saleem, Safiyyah; Schwenk, Holger; Wang, Jeff; NLLB Team (June 2024). "Scaling neural machine translation to 200 languages". Nature. 630 (8018): 841–846. Bibcode:2024Natur.630..841N. doi:10.1038/s41586-024-07335-x. ISSN 1476-4687. PMC 11208141. PMID 38839963.
    5. ^ Adelani, David Ifeoluwa; Doğruöz, A. Seza; Shode, Iyanuoluwa; Aremu, Anuoluwapo (2024-04-30). "Which Nigerian-Pidgin does Generative AI speak?: Issues about Representativeness and Bias for Multilingual and Low Resource Languages". arXiv:2404.19442 [cs.CL].
6. ^ Simons, Arno; Kircheis, Wolfgang; Schmidt, Marion; Potthast, Martin; Stein, Benno (2024-02-28). "Who are the "Heroes of CRISPR"? Public science communication on Wikipedia and the challenge of micro-notability". Public Understanding of Science. doi:10.1177/09636625241229923. ISSN 0963-6625. PMID 38419208. (see also the accompanying blog post)
    7. ^ Oeberst, Aileen; Ridderbecks, Till (2024-01-07). "How article category in Wikipedia determines the heterogeneity of its editors". Scientific Reports. 14 (1): 740. Bibcode:2024NatSR..14..740O. doi:10.1038/s41598-023-50448-y. ISSN 2045-2322. PMC 10772120. PMID 38185716.
8. ^ Zhitomirsky-Geffet, Maayan; Kizhner, Inna; Minster, Sara (2022-01-01). "What do they make us see: a comparative study of cultural bias in online databases of two large museums". Journal of Documentation. 79 (2): 320–340. doi:10.1108/JD-02-2022-0047. ISSN 0022-0418. (closed access; a freely accessible version is available)


    ToDo List

    Miscellaneous tasks

    Categories to look through

    (See also this much larger list of relevant articles without a lead image)

    Translation ToDo

    A list of related articles that are good and notable enough to merit a solid translation effort

    Merging ToDo

    A list of related articles that may have resulted from a WP:POVFORK, or that at least look like the functional equivalent of one
    Note that the exact target of a potential merge need not be specified here, and that other approaches (e.g. generous use of Template:Excerpt) might accomplish the same result