Project:Support desk

About this board

Welcome to the MediaWiki Support desk, where you can ask MediaWiki questions!


Post a new question

  1. To help us answer your questions, please indicate which versions you are using, as found on your wiki's Special:Version page:
    • MediaWiki version
    • PHP version
    • Database type and version
  2. Please include the web address (URL) to your wiki if possible. It's often easier for us to identify the source of the problem if we can see the error directly.
  3. To start a new thread, click "Start a new topic".
Can someone help me?

Mztourist (talkcontribs)

Hi,

I know nothing about programming.

I would like a bot to transfer all of the photos in https://catalog.archives.gov/id/532388 (Black and White Photographs of Marine Corps Activities in Vietnam, 1962-1975) to the Wikimedia Commons category Category:United States Marine Corps in the Vietnam War.

Is this something that Python can do? If so, how do I start? Or can someone create this bot for me?

Any help gratefully appreciated.

Malyacko (talkcontribs)
Mztourist (talkcontribs)

Noted, thanks

Reply to "Can someone help me?"

localization file not working

JJBBSS (talkcontribs)

Hey,

I was trying to write a little extension for myself to do some checks on user emails during registration. Most of that is working now, so I wanted to add localization, as the docs require. But I simply can't get it to work.

I added the

"MessagesDirs": {
 	"SomeNameExtension": [
 		"i18n"
 	]
 },

to extension.json

and put two JSON files (de.json and en.json) in the i18n folder of the extension.

The content of both these files is

{
	"SomeName-desc": "<some words>",
	"SomeName-fail": "<some other words>"
}

I tried to use the -desc message with "descriptionmsg" in extension.json, and the -fail message with a $value->fatal( wfMessage( 'SomeName-fail' ) );

But in both cases it just shows ⧼SomeName-desc⧽ instead of the actual string that should be pulled from the JSON.

The general language of the wiki is German, so I assume it should pull from de.json by default? I tried force-regenerating the localization cache, but no change.
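For reference, the regeneration I tried was the maintenance script (this assumes shell access):

php maintenance/rebuildLocalisationCache.php --force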

Maybe I am just making an obvious mistake. I am pretty fresh to the whole wiki thing.

Bawolff (talkcontribs)

It might be some localization cache issue. Try removing the extension from LocalSettings.php, saving, viewing a page, and then re-adding it and viewing a new page. Normally this isn't necessary, but it should flush out the caches.

JJBBSS (talkcontribs)

Hmm, I tried that but there was no change. Any other ideas?

JJBBSS (talkcontribs)

ok,

I found the solution.

I read the doc again here and noticed the sentence about message IDs being lowercase. I tried that and things magically worked.

I did a little bit of limited testing, and it seems to only break when the first character is a capital. It seems to work fine with any of the other letters as capitals.
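For reference, this is what the working version looks like for me (names are placeholders), with only the first letter lowercased:

{
	"someName-desc": "<some words>",
	"someName-fail": "<some other words>"
}

plus "descriptionmsg": "someName-desc" in extension.json and wfMessage( 'someName-fail' ) in the PHP code.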


I am not really sure if that is intended behavior, but it seems kind of arbitrary to only enforce lowercase in the first letter.

But if it is intended behavior, it should probably be made clear in the doc, because as currently written it reads like an optional recommendation.


Ok, I dug into the logs a little bit and I found this log entry:

[DBQuery] LCStoreDB::get [0s] localhost: SELECT  lc_value  FROM `l10n_cache`    WHERE lc_lang = 'de' AND lc_key = 'messages:someName-desc'  LIMIT 1

So when testing with the different capitalizations, it seems that the first letter of lc_key is always converted to lowercase, but the others are not.


So messages:SomeName-desc becomes messages:someName-desc. This means that the query will not get a result, because the value is stored under SomeName while the query searched under someName.

And when the key doesn't start with a capital, no conversion happens, so the query finds the value just fine.

Kinda inconsistent.

I don't know where in the code that query is created, so I don't know why this happens.

Ciencia Al Poder (talkcontribs)

This happens because MediaWiki always ignores the case of the first character of a page title by design (there are some exceptions to this). Since those texts can be modified directly on the wiki by editing the MediaWiki:xxxx page, this limitation ensures there cannot be conflicting messages whose names differ only in the case of the first character. For example, MediaWiki:someName-desc and MediaWiki:SomeName-desc resolve to the same page.

JJBBSS (talkcontribs)

Alrighty,

Thanks for the clarification.


Though it might still be good to document this a little more clearly in the doc.

Bawolff (talkcontribs)

It would probably be nice if we threw an exception for keys that start with an uppercase letter.

JJBBSS (talkcontribs)

So you mean something like this? It seems to at least catch my specific problem:

--- Message_original.php	2022-05-12 03:51:45.545078000 -0400
+++ Message.php	2022-05-12 04:23:22.717145290 -0400
@@ -230,6 +230,10 @@
 			$key = $key->getKey();
 		}
 
+		if ( is_string( $key ) && preg_match( '~^\p{Lu}~u', $key ) ) {
+			throw new InvalidArgumentException( '$key can not begin with capital letter' );
+		}
+
 		if ( !is_string( $key ) && !is_array( $key ) ) {
 			throw new InvalidArgumentException( '$key must be a string or an array' );
 		}
Reply to "localization file not working"

caching nonexistent pages and images

Nicole Sharp (talkcontribs)

I uninstalled and reinstalled MediaWiki (deleting all original files and the original MySQL database and user, then uploading a newly downloaded copy of MediaWiki and creating a new MySQL database and user) but pages that do not exist on the newly installed wiki are visible in "/w/uploads/cache/history/" and images that do not exist on the wiki are visible in "/w/uploads/thumb/".

My theory is that webcrawler bots are trying to re-crawl the pages on the wiki to update their indices for search engines, despite the fact that the pages were removed from the sitemap when the wiki was deleted. None of the cached HTML files for the nonexistent pages appear to have any content written by me (since the original wiki was deleted). So is it possible that MediaWiki creates a cached HTML file every time a page title is requested that does not exist on the wiki? That seems like an extremely poor use of storage and system resources, compared to just returning a 404 error.

What is more confusing, though, is why or how thumbnails are being generated for images that do not exist on the wiki. I have InstantCommons disabled, and am instead using ForeignFileRepos so that thumbnails are stored locally without needing to hotlink to Wikimedia Commons. Could webcrawler bots also be requesting images in such a way that MediaWiki is generating local thumbnails from images on Wikimedia Commons, despite the fact that the images are not in use anywhere on the wiki? This is especially troublesome, since it could be used as an attack vector by a bad crawler bot to overload the server, by requesting that MediaWiki generate many local thumbnails for images that don't exist on the wiki.

Other than disabling file caching or enabling InstantCommons, any suggestions on how to fix this, or explanations as to what might be happening with MediaWiki to generate the HTML files and image thumbnails for content that does not exist on the wiki?

Nicole Sharp (talk) 23:05, 24 April 2022 (UTC)

Bawolff (talkcontribs)

> Could webcrawler bots also be requesting images in such a way that MediaWiki is generating local thumbnails from images on Wikimedia Commons, despite the fact that the images are not in use anywhere on the wiki?

Do you have 404 thumbnailing set up (it is not the default)? If yes, then that could happen; otherwise, no.

Are these thumbnails of images on your wiki, or just random images?

Similarly, do the contents of the cached HTML items correspond to pages in your new wiki or your old wiki?


Are you sure you deleted everything when you reinstalled?

Nicole Sharp (talkcontribs)
$wgLocalFileRepo['transformVia404'] = true;
$wgGenerateThumbnailOnParse = false;

gives a MediaWiki Internal Error Fatal Exception.

The /w/ directory was deleted, as were both the MySQL database and the MySQL user, as well as the sitemap. The Cloudflare cache was also purged. This is a completely new installation, including a fresh download from MediaWiki.org, a new MySQL database, and a new MySQL user. Nothing from the previous installation should exist on the new installation. The only explanation I can think of is webcrawler bots which indexed the previous wiki installation and are now looking for the missing URLs, apparently for both pages and images.

As far as I can tell, all of the thumbnailed images are either from the current wiki installation or the previous wiki installation. Ditto for the cached HTML files. They do not appear to be random and are presumably from a systematic crawl based on a third-party sitemap generated from a previous crawl of the old wiki installation. It is not good for the bots either if they are not getting 404 errors for the content that was deleted, since they may think the content still exists otherwise.

Nicole Sharp (talk) 05:19, 25 April 2022 (UTC)

Nicole Sharp (talkcontribs)
$wgThumbnailScriptPath = "{$wgScriptPath}/thumb.php";
$wgGenerateThumbnailOnParse = false;

allows the wiki to continue functioning normally. I don't see anything different in the browser. I am confused by the GenerateThumbnailOnParse manual page though. Is the change then that local thumbnails will no longer be cached? I don't want the wiki to dynamically generate new thumbnails for every pageview. The point of the local thumbnail cache is to reduce server load while also limiting cross-domain requests to Wikimedia Commons.

Nicole Sharp (talk) 06:10, 25 April 2022 (UTC)

Bawolff (talkcontribs)

I wasn't necessarily suggesting enabling that setting. If it is enabled, MediaWiki will automatically create things on crawling, sort of like what you described, so I was wondering if you had it turned on. The default (off) is probably what you want.

Nicole Sharp (talkcontribs)

I think I see the problem. If I just go to the URL of (for example) "/wiki/file:RandomImageFromWikimediaCommons", the wiki shows the Wikimedia Commons metadata, but the thumbnail for the image is stored locally, despite the fact that the random image from Wikimedia Commons is not used anywhere on the local wiki. "Copy Image Address" then gives the URL of the thumbnail as "/w/uploads/thumb/d/dd/RandomImageFromWikimediaCommons". I think the only way to solve this problem is to enable InstantCommons and use the hotlinks to the thumbnails from Wikimedia Commons instead. Then if I really want a local copy of the thumbnail, I will have to download the file from Wikimedia Commons and manually upload it to the local wiki under a different filename. 06:21, 25 April 2022 (UTC)

Bawolff (talkcontribs)

I'm confused: is this a local image or one from Wikimedia Commons?

> the thumbnail for the image is stored locally, despite the fact that the random image from Wikimedia Commons is not used anywhere on the local wiki. "Copy Image Address" then gives the URL of the thumbnail as "/w/uploads/thumb/d/dd/RandomImageFromWikimediaCommons".

If this is an image from Wikimedia Commons (via InstantCommons or ForeignApiRepo), this should not happen. It is possible to set MediaWiki up to do that, but it's not the default.

If this is a file you uploaded locally that happens to be the same as a file from Commons, MediaWiki won't know the file on Commons is the same, so it will naturally make a local thumbnail.

Note that local files will still have some thumbnails even if they aren't used anywhere, as they have to be displayed on the image page, which requires thumbnails.

Nicole Sharp (talkcontribs)

What I would like, though, is to configure MediaWiki so that it only saves local thumbnails when a page on the wiki using that image is saved. Any other image thumbnails would then be hotlinked to Wikimedia Commons, including for page previews while editing (before the page is saved). Is there any easy way to configure MediaWiki to do this? Nicole Sharp (talk) 06:27, 25 April 2022 (UTC)

Bawolff (talkcontribs)

MediaWiki can be configured to hotlink foreign images or download them locally, but it can't mix and match. Can I ask why you want this?

Nicole Sharp (talkcontribs)

These are all images from Wikimedia Commons, with no locally uploaded images. I think the problem is that I want both local thumbnails and hotlinked thumbnails. I want a hotlink to the Wikimedia Commons thumbnail for a page preview while editing, and then to save the local thumbnail only when the page edit is saved. If that is not currently supported, then maybe it is a job for a future extension or version of MediaWiki.

I don't want local thumbnails for images not used on the wiki; for those, hotlinks to Wikimedia Commons are preferred. For saved pages, I would like local thumbnails of the Wikimedia Commons images. That reduces cross-domain requests during general browsing, and allows the thumbnails from Wikimedia Commons to be displayed offline if necessary.
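From what I can tell from the documentation, the local-caching behavior hinges on the foreign repo's apiThumbCacheExpiry setting, but it is all-or-nothing per repo rather than per page. A sketch of an entry in LocalSettings.php (values illustrative):

$wgForeignFileRepos[] = [
	'class' => ForeignAPIRepo::class,
	'name' => 'commonswiki',
	'apibase' => 'https://commons.wikimedia.org/w/api.php',
	// 0 hotlinks thumbnails from the foreign site;
	// a positive value caches them locally for that many seconds.
	'apiThumbCacheExpiry' => 86400,
];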

For now, I re-enabled InstantCommons to avoid MediaWiki creating lots of thumbnails for images that don't exist on the wiki. There is still the other issue that MediaWiki is generating cached HTML files for pages that don't exist. Any suggestions for that? That is less problematic, since the HTML files for the nonexistent pages are much smaller and don't use as many server resources to generate as thumbnails do. But it could still be an attack vector if a bad bot requested thousands of nonexistent page titles and MediaWiki had to generate a new HTML file for each one. It would be better if MediaWiki returned 404 errors instead.

Nicole Sharp (talk) 16:14, 25 April 2022 (UTC)

Nicole Sharp (talkcontribs)

I think I figured out what is going on with the cached HTML files for nonexistent pages. If you go to

https://www.mediawiki.org/wiki/page_that_does_not_exist

MediaWiki does not give a 404 error for the nonexistent page but instead generates a new page with the content

"There is currently no text in this page. You can search for this page title in other pages, search the related logs, or create this page."

or on a private wiki:

"There is currently no text in this page. You can search for this page title in other pages, or search the related logs, but you do not have permission to create this page."

So I think that is what the HTML files are for. A webcrawler bot is looking for the page titles that used to exist but no longer do, and MediaWiki is generating a blank HTML page for each new page request (all of the HTML files in /uploads/cache/history/ appear to be approximately the same size, with the same text).

I am guessing the only way to stop this is to disable file caching? If a 404 error were returned, it would have to be only for anonymous or logged-out users, since logged-in users should be able to link to nonexistent page titles in order to create new pages from redlinks.
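If so, the relevant switch appears to be the file cache setting in LocalSettings.php (sketch; the cache is written under whatever $wgFileCacheDirectory points at):

$wgUseFileCache = false; // stop writing cached HTML for page views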

Nicole Sharp (talk) 19:54, 25 April 2022 (UTC)

Nicole Sharp (talkcontribs)

Even after enabling InstantCommons (but leaving file caching enabled) and deleting the thumbnail directory, MediaWiki recreated the same thumbnails from Wikimedia Commons for the images not used on the wiki in the "uploads/thumb" directory. I can try to disable file caching as well, since that might be part of the problem. Nicole Sharp (talk) 05:03, 28 April 2022 (UTC)

Bawolff (talkcontribs)

InstantCommons in its default config should not create local thumbnails for images from Commons. The file cache should not affect this.

For the file cache, I would suggest using Varnish instead, if possible.

Reply to "caching nonexistent pages and images"

Is it considered acceptable to mirror wikipedia?

AlgorithmGG (talkcontribs)

Hi,

I've been playing with MediaWiki for a while now, and I am considering mirroring Wikipedia in English. Is this still an option? And what sort of issues would I run into? Perhaps there is a good thread you could recommend?


Regards

Bawolff (talkcontribs)

Yes, of course. Please follow the license requirements (basically, make sure to maintain credit, and don't mislead people into thinking your site is the real Wikipedia).

Mirroring English Wikipedia is high effort due to its sheer size.

A good starting place is meta:Data dumps. The actual dumps are at https://dumps.wikimedia.org/backup-index.html

AlgorithmGG (talkcontribs)

Thanks Bawolff, long time no speak. I didn't realise it is over 25 TB; I've got space, but not that much. Is having specific extensions going to be an issue if the wiki is running OK prior to the dump?

Bawolff (talkcontribs)

The bigger issue than space is CPU time. The default importDump.php parses pages to figure out categories, whatlinkshere, etc. This is quite slow at Wikipedia scale and might give incorrect results if the wrong extensions are installed.

There used to be scripts that imported without parsing (e.g. mwdumper), which were much faster, but I don't think they work anymore with the most recent MediaWiki. Instead of parsing, they relied on the downloadable .sql files for auxiliary data.
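If you do go the importDump.php route, the basic run looks roughly like this (paths illustrative; rebuildrecentchanges.php is normally run afterwards):

php maintenance/importDump.php /path/to/enwiki-latest-pages-articles.xml.bz2
php maintenance/rebuildrecentchanges.php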

AlgorithmGG (talkcontribs)

Cheers Bawolff. It is tempting, although I haven't needed to upgrade from 1.34. I was hoping to use mwdumper! If I am to upgrade, what wiki version do you recommend?

Bawolff (talkcontribs)

I mean, I generally recommend the latest, unless you have a reason to use an earlier version (mwdumper might be a very good reason to use an earlier version, unless someone fixed it).

AlgorithmGG (talkcontribs)

Thanks again. I am currently getting an error connecting to Elasticsearch, on line 90 of the Elastica connection class:

public_html/extensions/Elastica/includes/ElasticaConnection.php

The error on line 90 says "client not found"; the line is:

$this->client = new \Elastica\Client( [ 'servers' => $servers ],
	/**
	 * Callback for \Elastica\Client on request failures.
	 * @param \Elastica\Connection $connection The current connection to elasticasearch

Do you recall this issue? Is it something that can be updated manually?

Bawolff (talkcontribs)

You should update extensions when you update MediaWiki, so that the versions match.

Beyond that, what's the error message?

Reply to "Is it considered acceptable to mirror wikipedia?"

Issue Upgrading from 1.33.4 to 1.34.0

Ljkennedy2000 (talkcontribs)

Error when upgrading from MediaWiki 1.33.4 to 1.34 on main page:


Hi all, I wonder if anyone can help?


When trying to upgrade from 1.33.4 to 1.34 I get the following error:


[YizDPmwd53OgEa32QAw-tQAAAAU] /index.php/Main_Page UnexpectedValueException from line 462 of /var/www/mediawiki-1.34.0/includes/libs/rdbms/loadbalancer/LoadBalancer.php: Invalid server index index #DB_SLAVE

Backtrace:

#0 /var/www/mediawiki-1.34.0/includes/libs/rdbms/loadbalancer/LoadBalancer.php(896): Wikimedia\Rdbms\LoadBalancer->getConnectionIndex(string, array, string)

#1 /var/www/mediawiki-1.34.0/includes/libs/rdbms/loadbalancer/LoadBalancer.php(1043): Wikimedia\Rdbms\LoadBalancer->getConnection(string, array, string, integer)

#2 /var/www/mediawiki-1.34.0/includes/GlobalFunctions.php(2576): Wikimedia\Rdbms\LoadBalancer->getMaintenanceConnectionRef(string, array, string)

#3 /var/www/mediawiki-1.34.0/extensions/DynamicArticleList/DynamicArticleList_body.php(31): wfGetDB(string)

#4 /var/www/mediawiki-1.34.0/includes/parser/Parser.php(4293): DynamicArticleList::renderTag(string, array, Parser, PPFrame_Hash)

#5 /var/www/mediawiki-1.34.0/includes/parser/PPFrame_Hash.php(328): Parser->extensionSubstitution(array, PPFrame_Hash)

#6 /var/www/mediawiki-1.34.0/includes/parser/Parser.php(3330): PPFrame_Hash->expand(PPNode_Hash_Tree, integer)

#7 /var/www/mediawiki-1.34.0/includes/parser/Parser.php(1489): Parser->replaceVariables(string)

#8 /var/www/mediawiki-1.34.0/includes/parser/Parser.php(593): Parser->internalParse(string)

#9 /var/www/mediawiki-1.34.0/includes/content/WikitextContent.php(368): Parser->parse(string, Title, ParserOptions, boolean, boolean, integer)

#10 /var/www/mediawiki-1.34.0/includes/content/AbstractContent.php(555): WikitextContent->fillParserOutput(Title, integer, ParserOptions, boolean, ParserOutput)

#11 /var/www/mediawiki-1.34.0/includes/Revision/RenderedRevision.php(264): AbstractContent->getParserOutput(Title, integer, ParserOptions, boolean)

#12 /var/www/mediawiki-1.34.0/includes/Revision/RenderedRevision.php(236): MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached(WikitextContent, boolean)

#13 /var/www/mediawiki-1.34.0/includes/Revision/RevisionRenderer.php(215): MediaWiki\Revision\RenderedRevision->getSlotParserOutput(string)

#14 /var/www/mediawiki-1.34.0/includes/Revision/RevisionRenderer.php(152): MediaWiki\Revision\RevisionRenderer->combineSlotOutput(MediaWiki\Revision\RenderedRevision, array)

#15 [internal function]: MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}(MediaWiki\Revision\RenderedRevision, array)

#16 /var/www/mediawiki-1.34.0/includes/Revision/RenderedRevision.php(198): call_user_func(Closure, MediaWiki\Revision\RenderedRevision, array)

#17 /var/www/mediawiki-1.34.0/includes/poolcounter/PoolWorkArticleView.php(196): MediaWiki\Revision\RenderedRevision->getRevisionParserOutput()

#18 /var/www/mediawiki-1.34.0/includes/poolcounter/PoolCounterWork.php(125): PoolWorkArticleView->doWork()

#19 /var/www/mediawiki-1.34.0/includes/page/Article.php(791): PoolCounterWork->execute()

#20 /var/www/mediawiki-1.34.0/includes/actions/ViewAction.php(63): Article->view()

#21 /var/www/mediawiki-1.34.0/includes/MediaWiki.php(511): ViewAction->show()

#22 /var/www/mediawiki-1.34.0/includes/MediaWiki.php(302): MediaWiki->performAction(Article, Title)

#23 /var/www/mediawiki-1.34.0/includes/MediaWiki.php(900): MediaWiki->performRequest()

#24 /var/www/mediawiki-1.34.0/includes/MediaWiki.php(527): MediaWiki->main()

#25 /var/www/mediawiki-1.34.0/index.php(44): MediaWiki->run()

#26 {main}


I've tried adding new extensions for 1.34, upgrading to different versions (up to the latest 1.37.1), and made sure that LoadBalancer.php has the DB_REPLICA setting. This only seems to appear on the main page from what I can see.


Thanks

Bawolff (talkcontribs)

Are you sure DynamicArticleList is compatible with your version of MediaWiki?

Ljkennedy2000 (talkcontribs)

Hi Bawolff,

Thanks for the response.

Unfortunately my experience with MediaWiki is limited and I've been tasked with updating our current system. Is that an extension of some sort, and how would I check compatibility?

Thanks again.

Ljkennedy2000 (talkcontribs)

Hi Bawolff,


I've found the DynamicArticleList.php file in extensions and noticed the following, starting on line 24:

// The callback function for converting the input text to HTML output
function DynamicArticleList( $input ) {
	require_once ('CategoryUtil.php');
	$dbr =& wfGetDB( DB_SLAVE );
	// INVALIDATE CACHE
	global $wgTitle;
	$wgTitle->invalidateCache();
	// Default Values
	$listTitle = false;
	$listType = 'new';
	$listCount = 5;
	$categoryRoot = false;

Should this be set to DB_REPLICA instead of DB_SLAVE?
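That is, something like this (untested):

$dbr = wfGetDB( DB_REPLICA );

(also dropping the obsolete =& reference assignment)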


Thanks

Ljkennedy2000 (talkcontribs)

Also, it's Dynamic Article List v2.0.

Jonathan3 (talkcontribs)
Ljkennedy2000 (talkcontribs)

Hi Jonathan3,


Thanks for the response. To disable Dynamic Article List, is this done in LocalSettings.php?


Thanks for your help.

Bawolff (talkcontribs)

Yes. There should be a line that starts with "require". Find the one for this extension and remove it. (Keep backups, just in case.)
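The line will look something like this (the exact path may differ):

require_once "$IP/extensions/DynamicArticleList/DynamicArticleList.php";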

Ljkennedy2000 (talkcontribs)

Hi,

I upgraded to 1.35.5 and disabled Dynamic Article List in LocalSettings.php, which seems to let the main page work now, apart from one of our side menus, which shows the raw tags instead of the lists:

<dynamicarticlelist>
title=Recently Update Articles
type=update
count=5
</dynamicarticlelist>

<dynamicarticlelist>
title=Most Popular Articles
type=hot
count=5
</dynamicarticlelist>

Is there something else, other than Dynamic Article List, that could be used to get this working, or maybe an update for it?

I then tried upgrading to MediaWiki 1.36.0 to see if this would resolve it, but it didn't work: the whole page seems to have lost images and styling, and it's showing text for everything (although it lets me log in etc.). The images were copied over and the permissions looked okay. I did have to install php-intl to get to 1.36.0, so I'm not sure if this caused it?

Thanks

Ljkennedy2000 (talkcontribs)

Hi Bawolff,


Thanks for your input I will give it a try :)


Bawolff (talkcontribs)

There are lots of similar extensions, like DynamicPageList.
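For example, the "recently updated" list above might translate to something roughly like this in DynamicPageList (parameters vary between the DPL variants, so check the extension's manual; the category name is illustrative):

<DynamicPageList>
category = Articles
count = 5
ordermethod = lastedit
</DynamicPageList>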

Ljkennedy2000 (talkcontribs)

Hi Bawolff,


I've managed to get to version 1.36 now :) but my next issue is that all the Main_Page entries are in text format; I cannot see any of the images or headers etc. The image files have been copied over into the correct folder. Could this be a PHP issue?


Thanks

Bawolff (talkcontribs)

What do you mean by headers?


For images, are their image description pages working?

Ljkennedy2000 (talkcontribs)

Hi Bawolff,


Thanks for the reply I will try and give that a go.

Reply to "Issue Upgrading from 1.33.4 to 1.34.0"

JsonConfig Extension Support Question about public static function onCanonicalNamespaces( array &$namespaces ) { } lucky best man In the world

Jasonkhanlar (talkcontribs)
Bawolff (talkcontribs)

What's the actual question?


It sounds like you are declaring the namespace twice, so you are getting a warning.
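Normally a namespace is registered in exactly one place, e.g. in extension.json rather than both there and in an onCanonicalNamespaces hook. A sketch (the id, constant, and name are illustrative):

"namespaces": [
	{
		"id": 3000,
		"constant": "NS_EXAMPLE",
		"name": "Example"
	}
]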

This post was hidden by Bawolff (history)
Reply to "JsonConfig Extension Support Question about public static function onCanonicalNamespaces( array &$namespaces ) { } lucky best man In the world"

RSync Alarm wikipedia requests

99.102.84.25 (talkcontribs)
Bawolff (talkcontribs)

What is your actual concern?

That page is not really meant for public consumption.

99.102.84.25 (talkcontribs)

We are starting to see a large increase in the number of our requests to the Wikipedia API timing out, starting yesterday.

99.102.84.25 (talkcontribs)

It seems to align with when the two rsync alarms began to go into alarm state.

Ciencia Al Poder (talkcontribs)
Malyacko (talkcontribs)

Also, what's your user agent used for your requests?

Bawolff (talkcontribs)

It is very unlikely that the rsync alarm has anything to do with that.

75.172.125.42 (talkcontribs)

Is there any other change that went online yesterday that might possibly be causing the issue? Like an overall service connection issue?

Malyacko (talkcontribs)

Impossible to say without answers to all the currently unanswered questions in this thread.

75.172.125.42 (talkcontribs)

We are still checking our user agent against the API Etiquette. But our code base has been out for a few years, and the HTTP requests have suddenly been failing since May 03. What might be causing the issue?

Malyacko (talkcontribs)

Yes, see above: Ignoring the rate limits, for example.

75.172.125.42 (talkcontribs)

Could you please add more information about "ignoring the rate limits"?

By the way, after some investigation of the API Etiquette, here are some results:

  1. We usually just query 2 articles in the loop, and the titles are not piped; one of the titles could be a transformation with the spaces removed.
  2. We do not send follow-up requests based on the result of another request, so we do not use generators in our use case.
  3. We are still investigating the gzip part. Has anything happened recently with gzip requests? Like it becoming mandatory?

And we are only seeing the issue intermittently; not every request we make fails.

Bawolff (talkcontribs)

Gzip is not mandatory.

Which wiki are the requests being made to? What type of API requests (do you have an example)? What is the timestamp in UTC when you started noticing the issue? Are you logged in (if so, what username makes the requests; if not, what IP)? What precisely do you mean by "timeout"? Are you getting an HTTP response that is an error (if so, what error)? Is your connection just not connecting? Do you simply not receive an HTTP response after some time (how long)? Something else?

Bawolff (talkcontribs)

There was database maintenance around this time. While it wasn't supposed to affect anything, there were some reports that it was causing temporary slowness. It may be related to your issue.

97.113.61.16 (talkcontribs)

Which wiki are the requests being made to?

What type of api requests (do you have an example)?

It is a GET request sent to en.wikipedia.org/w/api.php.

What is the timestamp in utc when you started noticing the issue?

Between 05/03/2022 9am to 10am UTC

Are you logged in (if so what username makes request? If not, what IP?)

No login with a username; trying to get the IP.

What precisely do you mean by "timeout" (are you getting an http response that is an error,if so what error, is your connection just not connecting?

Sometimes, for some queries, the HTTP request does not succeed: no wiki response is returned, just a request-aborted exception. We are continuing to check for the specific HTTP code.

Do you not just not recieve an http response after some time (how long), something else?

No response for 3 s in the US.

97.113.61.16 (talkcontribs)

By the way, is the database maintenance still going on? We are still seeing the issue on our side.

Bawolff (talkcontribs)

No.

Also, no response for 3 seconds sounds more like you should just increase your timeouts. A problem on the WMF end would look more like getting a 503 error. Most API endpoints in normal times should respond within 3 seconds, but that is not true of all of them.

97.113.61.16 (talkcontribs)

Actually, we have a retry. The first call has a 3 s timeout, while the second call has 5 s.

For the user agent we found mostly they are "Java/1.8.0_211-ea", and several of them are "Java/phoneme_advanced-Core-1.3-b16 sjmc-b111".

97.113.61.16 (talkcontribs)

And with the 5 s retry we are still seeing failures.

Bawolff (talkcontribs)

> For the user agent we found mostly they are "Java/1.8.0_211-ea", and several of them are "Java/phoneme_advanced-Core-1.3-b16 sjmc-b111".

Per WMF's user agent policy, this user agent isn't allowed and could potentially be blocked (you are probably not blocked, as you would get an error message). Your user agent must have a contact email address in it and should have a descriptive name for your tool.
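For example, something like this (name and addresses illustrative):

User-Agent: ExampleWikiFetcher/1.0 (https://example.org/fetcher; devteam@example.org) Java/1.8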


Anyway, I would suggest a timeout of 60 seconds.

Reply to "RSync Alarm wikipedia requests"
signing up

96.245.117.142 (talkcontribs)

I tried my other email address and STILL cannot get a verification email.

Bawolff (talkcontribs)

It's unlikely anyone is going to help you if you don't even include what website you are talking about, what username, and what email.

Reply to "signing up"

Internal error - Mediawiki

FrViPofm (talkcontribs)

Hi,

I'm re-installing MediaWiki 1.37 on a fresh Ubuntu 22.04 with Apache and PHP 8.1. I get the first config screen (language), I click through to the following step, and... fail!

/w/mw-config/index.php?page=Language Error: Class "FormatJson" not found

Backtrace:
from /var/www/wiki/w/includes/exception/MWExceptionHandler.php(754)
 #0 /var/www/wiki/w/includes/exception/MWExceptionHandler.php(291): MWExceptionHandler::logError()
 #1 /var/www/wiki/w/includes/AutoLoader.php(117): MWExceptionHandler::handleError()
 #2 /var/www/wiki/w/includes/AutoLoader.php(117): require(string)
 #3 /var/www/wiki/w/includes/cache/localisation/LocalisationCache.php(594): AutoLoader::autoload()
 #4 /var/www/wiki/w/includes/cache/localisation/LocalisationCache.php(929): LocalisationCache->readJSONFile()
 #5 /var/www/wiki/w/includes/cache/localisation/LocalisationCache.php(496): LocalisationCache->recache()
 #6 /var/www/wiki/w/includes/cache/localisation/LocalisationCache.php(370): LocalisationCache->initLanguage()
 #7 /var/www/wiki/w/includes/cache/localisation/LocalisationCache.php(311): LocalisationCache->loadItem()
 #8 /var/www/wiki/w/includes/language/LanguageFallback.php(106): LocalisationCache->getItem()
 #9 /var/www/wiki/w/includes/language/LanguageFactory.php(158): MediaWiki\Languages\LanguageFallback->getAll()
 #10 /var/www/wiki/w/includes/language/LanguageFactory.php(116): MediaWiki\Languages\LanguageFactory->newFromCode()
 #11 /var/www/wiki/w/mw-config/index.php(75): MediaWiki\Languages\LanguageFactory->getLanguage()
 #12 /var/www/wiki/w/mw-config/index.php(40): wfInstallerMain()
 #13 {main}

What is wrong?

Bawolff (talkcontribs)

You are missing MediaWiki files or your install is corrupt. I would suggest re-installing.

FrViPofm (talkcontribs)

I tried the installation several times from the 1.37.2 .zip or .tar.gz archive. I moved the files into place with a <code>cp -r /path/to/source/* /path/to/target</code> and changed the owner with a <code>chown -R www-data:www-data /path/to/target</code>. I doubt I forgot a file each time, and always the same one, so as to get the same error. Maybe a PHP file. But which? My phpinfo says that JSON is enabled.

Bawolff (talkcontribs)

The missing file would be includes/json/FormatJson.php

FrViPofm (talkcontribs)

<code>ls /var/www/wiki/w/includes/json/</code>

returns:

<code>
FormatJson.php     JsonSerializer.php           JsonUnserializer.php
JsonCodec.php      JsonUnserializable.php
JsonConstants.php  JsonUnserializableTrait.php
</code>

The FormatJson.php file is here.

Bawolff (talkcontribs)

Check that it has the same contents as https://github.com/wikimedia/mediawiki/blob/REL1_37/includes/json/FormatJson.php (especially lines 1 and 26, but the whole thing should be the same), and also check that line 539 of autoload.php is the same as https://github.com/wikimedia/mediawiki/blob/REL1_37/autoload.php#L539

PHP 8.1 isn't officially supported by MediaWiki 1.37, but it feels really unlikely that any incompatibility would manifest like this. However, it seems some other people have the same error (https://phabricator.wikimedia.org/T307816), which is very weird. (Edit: I didn't realize you were the same person.)

Bawolff (talkcontribs)
FrViPofm (talkcontribs)
FrViPofm (talkcontribs)

OK. Wiki installed.

Thanks.

Bawolff (talkcontribs)

That's still super weird, as an empty line at the end should affect nothing. Maybe it was some sort of opcode cache issue, and editing the file forced a cache clear.
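If it was OPcache, forcing a flush should confirm it, for example (illustrative, depending on the setup):

sudo systemctl restart php8.1-fpm    # restarting PHP-FPM clears the opcode cache

(or calling opcache_reset() from a script served by the same PHP process)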

217.132.25.116 (talkcontribs)

FYI, this happened to me also; adding a newline at the end of FormatJson.php solves the issue.

FrViPofm (talkcontribs)
Reply to "Internal error - Mediawiki"

[Newbie] Pictures not shown after upload

2A02:8070:782:3C00:68DE:DCE2:814F:FAEB (talkcontribs)

Hello,

I need some help with my problem. I found the same issue already here: Topic:Tvl9fsap0q0j1v9e, but I can't get the solution running for me, and it is from 4 years ago.

But the problem is the same: I am running a wiki on a hosted webspace (no command line), and when I upload an image, it looks like this:

http://i.imgur.com/XInqC7g.jpg


When I click on the Image I receive the following:

Forbidden

You don't have permission to access this resource. Server unable to read htaccess file, denying access to be safe


What can I do? Thanks for the help in advance. Unfortunately I only have a basic understanding of programming and none of SQL (I don't even know how to look into my SQL database).

Malyacko (talkcontribs)

Hi, see "Post a new question" in the sidebar and check the server logs.

2A02:8070:782:3C00:68DE:DCE2:814F:FAEB (talkcontribs)

Hi @Malyacko sorry for that


Software Version

MediaWiki 1.37.2

PHP 7.4.28 (fpm-fcgi)

MariaDB 10.5.15-MariaDB-1:10.5.15+maria~focal-log

ICU 66.1

Pygments 2.10.0


Giving the web address doesn't help unfortunately, as I restricted access to members only (it will be a private wiki for me and a few friends, and we don't want the content available on the web).

2A02:8070:782:3C00:68DE:DCE2:814F:FAEB (talkcontribs)

Ok, I played around a bit more, and it looks like it was due to the access rights of the image folder. I changed them and it is working now. I had hoped that those images could be displayed without allowing everyone (who knows the link) to access the images.


If there is such a solution, I am happy to hear about it. Otherwise, this topic is resolved.
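From what I have read so far, serving uploads through img_auth.php (which checks wiki permissions) instead of letting the web server serve them directly might be what I want; a sketch for LocalSettings.php, assuming direct web access to the uploads directory is blocked:

$wgUploadPath = "$wgScriptPath/img_auth.php";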

Bawolff (talkcontribs)
Reply to "[Newbie] Pictures not shown after upload"