Project:Support desk

About this board

Welcome to the MediaWiki Support desk, where you can ask MediaWiki questions!


Post a new question

  1. To help us answer your questions, please indicate which versions you are using, as found on your wiki's Special:Version page:
    • MediaWiki version
    • PHP version
    • Database type and version
  2. Please include the web address (URL) to your wiki if possible. It's often easier for us to identify the source of the problem if we can see the error directly.
  3. To start a new thread, click "Start a new topic".

Error 504 when trying to submit CI Form data and no Email Notifications

1
Tarunjoseph93 (talkcontribs)

Hi,

Just to give you some context before I lay out the issues: our organisation has enabled the CIForms extension so users can submit feedback through a feedback form. Composer updates are run on all folders within the extensions/ folder before the site is deployed. This MediaWiki site uses MySQL for its database. The composer.local.json file contains the latest PHPMailer version (^6.5) and a few other dependencies. The way I run Composer: a shell script downloads Composer 2, runs self-update, then update --no-dev -o, then install --no-dev -o, and then removes the Composer instance. Every extension uses the REL1_39 package, because this site runs MediaWiki 1.39.2 on PHP 8.1. The site is also set up to email the admin when users request accounts, and that works; it even sends a temporary password back to the user once the admin approves the request. So SMTP is set up properly as well.

1. Issue 1

Currently, when testing out the feedback form as admin, I fill out the form and hit submit. The page takes a long time to load and then times out with a "504 Error". When I check the Special:CIFormsManage page, I can see that the form has been submitted. I'm able to view and download the PDF as well. But I'm not sure why the page times out when Submit is clicked. I've done a Network inspect, but Special:CIFormsSubmit just seems to be in a pending state before throwing a 504 Gateway Timeout Nginx error. I've checked the Nginx logs and the error on Timeout is as follows: [error] 35#35: *226301113 upstream timed out (110: Operation timed out) while reading response header from upstream, client: <Masked_IP_Address>, server: <Masked_DNS_Address>, request: "POST /index.php/Special:CIFormsSubmit HTTP/1.1".

2. Issue 2

Although the feedback form data is present on the Special:CIFormsManage page and can be downloaded, no email is sent to the admin. As stated above, emails for account requests work, so SMTP is configured correctly in LocalSettings.php, and $wgEnableEmail = true as well as $wgEnableUserEmail = true. Under the wfLoadExtension( 'CIForms' ) line, $wgCIFormsSenderEmail is set to the admin's email, as is $wgCIFormsEmailTo. PHPMailer is also at its latest version. I'm not sure why neither the admin nor I receive emails on submission of these feedback forms. Any help would be appreciated. Thanks in advance!
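For reference, a minimal sketch of the configuration described above, as it might appear in LocalSettings.php (the addresses are placeholders; this restates the post's settings rather than a verified working setup):

```php
<?php
// Sketch of the settings described in the post; addresses are placeholders.
$wgEnableEmail = true;     // global email switch
$wgEnableUserEmail = true; // user-to-user email

wfLoadExtension( 'CIForms' );
// Per the post, both of these point at the admin's address:
$wgCIFormsSenderEmail = 'admin@example.org';
$wgCIFormsEmailTo = 'admin@example.org';
```

One hedged observation: a submit request that hangs until a 504, combined with the emails never arriving, is consistent with the mail step blocking on an unreachable SMTP host, which may be worth ruling out from the web server itself.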

Reply to "Error 504 when trying to submit CI Form data and no Email Notifications"

Where to place 'ads.txt' file in Bitnami Mediawiki on Google Cloud?

4
The.machine.preacher (talkcontribs)

I'm running Bitnami MediaWiki on Google Cloud. As stated in the title, where do I place the 'ads.txt' file: in the '/root' folder or in the main directory '/'?

Bawolff (talkcontribs)

Hi. This question seems unrelated to MediaWiki. You may have better luck in a different support forum.

2001:8003:269F:4D00:C80D:F5DE:E15C:995F (talkcontribs)

Why is it unrelated to MediaWiki? All I wished to know was where the root directory is.

TheDJ (talkcontribs)

Because we don't know anything about things specific to Bitnami on Google Cloud.

Reply to "Where to place 'ads.txt' file in Bitnami Mediawiki on Google Cloud?"

Extracting user lists

2
82.8.224.56 (talkcontribs)

Hi,


Very new to this, but I can't seem to find a way to export the user lists.

Jonathan3 (talkcontribs)
Reply to "Extracting user lists"

Migrating from 1.25.2 to 1.39.2

2
MattRobl (talkcontribs)

I need to upgrade from MediaWiki 1.25.2 to 1.39.2. I understand I need to upgrade from 1.25.2 to 1.35.9 first, and then upgrade to 1.39.2. I also need to migrate the application server from Windows Server 2012 R2 to Windows Server 2019. My question is how much of the upgrade should be done on the Windows 2012 server before moving to the Windows 2019 server. I am guessing that since I can't get 1.25.2, I should upgrade to 1.35.9, move to the Windows 2019 server, and then upgrade to 1.39.2. Thoughts?

Jonathan3 (talkcontribs)

I've never used a Windows server, but I imagine that as far as MediaWiki is concerned, it's still the PHP and database versions that matter. As long as you can transfer the files and database to a server on which the update.php script will run, you'll be all right, I think.

Reply to "Migrating from 1.25.2 to 1.39.2"

Revision 0 does not exist

19
Rebastion2 (talkcontribs)

So my wiki has gone through a few decades of upgrades, and I'd consider its database surely ripe for optimization. I noticed this because I am unable to upgrade to the latest LTS, so now I am trying to isolate and fix as many problems with the database as I can.


One problem is shoddy files and other corruption. Most maintenance scripts don't seem to really fix this.


One problem I have is similar to this one https://stackoverflow.com/questions/34143056/main-page-error-of-mediawiki

I have a file that displays properly but has no revision 0 registered for some reason, and I cannot edit it either. None of the "fixes" I can find here and elsewhere on the web really help. I'd appreciate any pointers to maintenance scripts, hacks, or other tips to really (really) fix flawed database structures.

Bawolff (talkcontribs)

Can you post the entry in the page table for that page, along with anything in the revision table where rev_page is equal to the page_id for that page?

What version of MediaWiki are you using?

Common causes of this are referential integrity errors with the page_latest field in page, the rev_comment_id field in revision, or the rev_actor field in revision (the last being the most common, as the actor migration upgrade script is fragile).

Rebastion2 (talkcontribs)

First of all, thanks for your interest in helping me. OK, let's see (I would love a maintenance script that just goes through my entire database and either fixes everything or gives me a very detailed report of what needs fixing).


"Entry in the page table for that page":

page_namespace 6 pagetitle 4A61636B70616C616E63652E6A7067 is_redirect 0 is_new 1 page_touched 20230118090223 page_latest 5734 page_len page_links_updated 3230313530343032313635383433


"anything in the revision table where rev_page is equal to page_id"

rev_id 5734 rev_page 2524 rev_comment_id 0 rev_actor 0 rev_timestamp 20060208174920 rev_minor_edit 0 rev_deleted 0 rev_len 0 rev_parent_id 0 rev_sha1 70686F6961633968346D383432787134357370377336753231657465657131 (this is not unique; I see a few adjacent revisions that have the same hash. I could think of reasons why that may be so, but I'm not sure if it's supposed to be non-unique)

Mediawiki 1.38.2

I may have used the deleteOldRevisions script many, many years ago. And one thing I tried very recently is the migrateActors.php script, which helps in assigning content that has no actor assigned.

Bawolff (talkcontribs)

So rev_actor being 0 and rev_comment_id being 0 indicates that update.php didn't do the migration properly.

The actor migration script generally doesn't work anymore. It's best to run it around 1.33. If you don't have backups from that point, you would probably have to fix it manually by updating rev_actor to point to some actor.
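A minimal sketch of that manual fix, assuming direct MySQL access, the default (empty) table prefix, and that rows with actor_id 1 and comment_id 1 actually exist in the actor and comment tables. All of these are assumptions, so back up the database before running anything like this:

```php
<?php
// Hypothetical connection values; adjust host, database name and credentials.
// Points orphaned revision rows (rev_actor = 0 / rev_comment_id = 0) at
// existing actor/comment rows so referential integrity holds again.
$pdo = new PDO( 'mysql:host=localhost;dbname=wikidb', 'wikiuser', 'secret' );
$pdo->exec( 'UPDATE revision SET rev_actor = 1 WHERE rev_actor = 0' );
$pdo->exec( 'UPDATE revision SET rev_comment_id = 1 WHERE rev_comment_id = 0' );
```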

Rebastion2 (talkcontribs)

What do I point it to? This all means very little to me, as I'm not familiar with the product's SQL architecture. Going back to 1.33 is not an option, of course.

Rebastion2 (talkcontribs)

Bawolff, I have a very concrete question; maybe you can point me in the right direction. From all I can gather, once restoring to a pre-1.35 backup is off the table, there is no real way to "fix" this. However, if it only affects a manageable number of pages/files, I wonder if the following could be achieved: is there a way to manually remove all traces of these pages/files from the database, so that they become "nonexistent" again and can be re-created or re-uploaded (in the case of the files), thereby creating fresh, correct database entries for these content items? That would be a manual way of fixing this so the next upgrade to 1.39.2 doesn't throw any errors :)

Bawolff (talkcontribs)

You could delete the entry in the page table, I suppose. YMMV.
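If the goal is to re-create the page from scratch, a sketch of that deletion, assuming direct MySQL access, the default table prefix, and the page_id 2524 quoted earlier in the thread (again, back up the database before trying anything like this):

```php
<?php
// Hypothetical connection values. Removes all traces of the broken page so it
// can be re-created (or re-uploaded), per the approach discussed above.
$pdo = new PDO( 'mysql:host=localhost;dbname=wikidb', 'wikiuser', 'secret' );
$pdo->exec( 'DELETE FROM revision WHERE rev_page = 2524' );
$pdo->exec( 'DELETE FROM page WHERE page_id = 2524' );
```

For a file, the corresponding row in the image table would presumably need the same treatment.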

Rebastion2 (talkcontribs)

Hm, I can't even find it. Nothing in the database seems to be in plain text; as far as I understand it, it's all hashed, but I can't find one of the files in question even by looking for its hashed value... so weird.

Bawolff (talkcontribs)

They aren't hashed. Some DB tools will display page titles in hexadecimal form.
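As a quick illustration, the hex strings in the table dump earlier in this thread decode straight back to text, e.g. with PHP's hex2bin():

```php
<?php
// Decode the hex-displayed values from the earlier page/revision dump.
echo hex2bin( '4A61636B70616C616E63652E6A7067' ), "\n";  // page_title
echo hex2bin( '3230313530343032313635383433' ), "\n";    // page_links_updated
echo hex2bin( '70686F6961633968346D383432787134357370377336753231657465657131' ), "\n"; // rev_sha1
```

This prints Jackpalance.jpg, 20150402165843 and phoiac9h4m842xq45sp7s6u21eteeq1; the last appears to be the base-36 SHA-1 MediaWiki records for empty content, which would also explain the identical hashes across adjacent zero-length revisions.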

Rebastion2 (talkcontribs)

How weird, but indeed the SQL query the DB tool shows at the top of the screen when opening the page table is:

SELECT *, HEX(`page_title`) AS `page_title`, HEX(`page_content_model`) AS `page_content_model`, HEX(`page_links_updated`) AS `page_links_updated`, HEX(`page_lang`) AS `page_lang` FROM `swdb_page` LIMIT 50
Rebastion2 (talkcontribs)

Talking to my hoster now about why that is the database-wide default. So weird.

Rebastion2 (talkcontribs)

It seems to be default behavior for Adminer to show values as hex if the column type is binary. Now of course it's unfortunate that the hoster uses Adminer, but before I go to further trouble I just want to make sure and ask this: binary is the correct type for the table column(s), right?

Bawolff (talkcontribs)

Yes. We basically use it to tell the DB to be hands off about the value.

Simpsonspedia.net (talkcontribs)
Rebastion2 (talkcontribs)

Hi Bawolff. I appreciate your input. If my problem is limited to just a handful of pages/files, you mentioned they could be pointed to "some actor". Any advice as to which?

The alternative would be to delete all mention of this content and create it anew, thereby correctly establishing that content and its relationships in the database like any newly created content.

Bawolff (talkcontribs)

It doesn't really matter which actor. You could create a new user for that purpose.

Rebastion2 (talkcontribs)

I could just assign all of these to me, but what values do rev_actor and rev_comment_id etc. get? Forgive me, but it's all a bit overwhelming without prior in-depth knowledge of the database structure. My goal would be for none of these to be 0, right? And rev_minor_edit 0, rev_deleted 0, rev_len 0, rev_parent_id 0 aren't a problem?

Rebastion2 (talkcontribs)

Sorry to be a bother; I'm still looking for a step-by-step sort of instruction on how to possibly "fix" this manually (at least as a "hack"; I don't care who these revisions are assigned to, as long as the database is no longer practically corrupted and these files are no longer uneditable).

Reply to "Revision 0 does not exist"

Malformed UTF-8 characters, possibly incorrectly encoded

11
Delta5768 (talkcontribs)

Got stuck on Connect Database;


[92c66aff7b86e27481dece49] /mywiki/mw-config/index.php?page=DBConnect Exception: preg_match_all error 4: Malformed UTF-8 characters, possibly incorrectly encoded

Backtrace:

from C:\xampp\htdocs\mywiki\includes\MagicWordArray.php(319)

#0 C:\xampp\htdocs\mywiki\includes\parser\Parser.php(4116): MagicWordArray->matchAndRemove(string)

#1 C:\xampp\htdocs\mywiki\includes\parser\Parser.php(1636): Parser->handleDoubleUnderscore(string)

#2 C:\xampp\htdocs\mywiki\includes\parser\Parser.php(724): Parser->internalParse(string)

#3 C:\xampp\htdocs\mywiki\includes\language\MessageCache.php(1374): Parser->parse(string, MediaWiki\Page\PageReferenceValue, ParserOptions, boolean)

#4 C:\xampp\htdocs\mywiki\includes\Status.php(331): MessageCache->parse(string, MediaWiki\Page\PageReferenceValue, boolean, boolean, NULL)

#5 C:\xampp\htdocs\mywiki\includes\installer\WebInstaller.php(1044): Status->getHTML()

#6 C:\xampp\htdocs\mywiki\includes\installer\WebInstallerDBConnect.php(41): WebInstaller->showStatusBox(Status)

#7 C:\xampp\htdocs\mywiki\includes\installer\WebInstaller.php(271): WebInstallerDBConnect->execute()

#8 C:\xampp\htdocs\mywiki\mw-config\index.php(82): WebInstaller->execute(array)

#9 C:\xampp\htdocs\mywiki\mw-config\index.php(40): wfInstallerMain()

#10 {main}

Bawolff (talkcontribs)

Certainly weird. It could perhaps be caused by the database returning an error that was not encoded as valid UTF-8.

Delta5768 (talkcontribs)

I don't know. I followed everything exactly, step by step, from the XAMPP manual for MediaWiki.

Bawolff (talkcontribs)

As a hack, you could try the following: in includes/installer/WebInstaller.php, at around line 4042, replace the line

$html = $status->getHTML();

with:

$html = UtfNormal\Validator::cleanUp( $status->getHTML() );

If that actually fixes it, you should get a more descriptive message. Please let me know if that works, and what the more descriptive error message is.

Delta5768 (talkcontribs)

There are only ~1200 lines in WebInstaller.php, but I found the line of code and replaced it; it didn't do anything.

Bawolff (talkcontribs)

Oh, I guess it should be line 1042.

I suppose you could try doing: $html = htmlspecialchars( UtfNormal\Validator::cleanUp( print_r( $status->getErrorsArray(), true ) ) );

instead. Not sure if that will work.

Delta5768 (talkcontribs)

Array ( [0] => Array ( [0] => config-connection-error [1] => Cannot access the database: :real_connect(): (HY000/2002): php_network_getaddresses: getaddrinfo for local/localhost failed: ���� ���� ����������. ) )

Bawolff (talkcontribs)

Super weird. Some sort of locale issue in the error, I guess.

But for actual cause, what did you use as your db server?

Delta5768 (talkcontribs)

MySQL

Ciencia Al Poder (talkcontribs)

The error message "getaddrinfo for local/localhost failed" is concerning... Are you using "local/localhost" (with a literal /) as the hostname for the database connection?

Delta5768 (talkcontribs)
Reply to "Malformed UTF-8 characters, possibly incorrectly encoded"

Strange formatting on MediaWiki site

2
Summary by TimeWisely

I fixed it. I downloaded all of the dependencies that I didn't have, as well as all associated templates. I had to enable a toggle in LocalSettings.php: $wgJsonConfigEnableLuaSupport = true.

I installed the JsonConfig extension, and dumped the settings into my LocalSettings.php file (the ones listed in the common settings.php file.)

None of that fixed my issues at first. I then took the following, exported from Wikipedia:

MediaWiki:Common.css

MediaWiki:Common.js

and imported them into my wiki. That seems to have fixed my problems.

If I had to guess, all of these fixed it, so if you forgot to do any of these, please feel free to do so.

I found another user talking about the same problems I had. I hadn't realized that I was doing it wrong and was forgetting a lot of essentials.

Please see this link: Manual:Importing Wikipedia infoboxes tutorial for a comprehensive tutorial.

TimeWisely (talkcontribs)

Hello there, I am trying to set up another MediaWiki website. I am calling it "SkyeWiki" because that's a fancy name. After many attempts, first for some reason using WordPress, then attempting to write my own, I finally learned what the heck a Docker container was, and here we are: I finally got past the first stage of making a wiki...


However, I would be remiss if I didn't say I was struggling a little bit. I got through everything from the start, but I am having one strange issue, and I don't know what to do about it. You see, I'm not getting an error message, nor am I getting warning messages or anything.


I am for some reason just getting really strange formatting issues. I tried to install the Infobox template (who doesn't, am I right?). However, when I compare it to the Wikipedia version, I find it a lot messier, and a lot of the code is not formatted. I don't know if this is the wiki, a configuration problem, or maybe a plugin, but this concerns me because I'm unsure whether the infobox is going to work.


I have what they mention. For the infobox specifically, I believe it was only one thing: I have "wgUseInstantCommons" set to true, "Scribunto" enabled, and "luastandalone" in the configs. However, this doesn't seem to help with my issue.
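For context, the settings mentioned above would look something like this in LocalSettings.php (a sketch; 'luastandalone' as the Scribunto engine name is the usual value, but verify it against your own configuration):

```php
<?php
// Sketch of the settings the post refers to.
$wgUseInstantCommons = true;     // fetch images from Wikimedia Commons
wfLoadExtension( 'Scribunto' );  // Lua support, required by Template:Infobox
$wgScribuntoDefaultEngine = 'luastandalone';
```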


Here is my infobox documentation: https://wiki.timewisely.net/index.php/Template:Infobox. My version is 1.39.2, and everything else is stock. I did make a few changes; for example, I created a volume for extensions. To my knowledge it shouldn't affect anything, but if I don't do that, the container deletes the "Sanitized CSS" plugin when it restarts. It's not mounted to a directory, so that shouldn't be a problem.


If anyone wouldn't mind taking a look at it and maybe delivering some expert opinions my way, I would be very grateful for the help. Thank you.


Bluehost and MediaWiki

5
Euroexp (talkcontribs)

Hello everybody,

Has anyone had experience using an individual Semantic MediaWiki installation on Bluehost hosting? If so, please advise how best to install MediaWiki for individual use.

Yesterday, I asked Bluehost's help desk whether it was possible to install MediaWiki. I was told that MediaWiki is not available for public use on Bluehost.

Does anyone have experience using individual MediaWiki on Bluehost?

Thanks in advance for your help.

Malyacko (talkcontribs)

Hi, there are general installation instructions linked from the mediawiki.org main page. What have you tried already, and were any parts unclear?

Euroexp (talkcontribs)

Thanks for the answer.


I tried to install MediaWiki according to the recommendations in:

Help:Using Composer


But I do not have access to the command line. Bluehost doesn't let me do it. Therefore, I took advantage of the opportunity, which is described as follows:

If, due to some configuration restrictions on the part of the hosting company or missing command-line access, Composer cannot be run on the server, it is suggested to run a local copy and upload the generated files together with the /vendor directory to the target destination.

1. Copy the following file to you local computer: "IndividualFileRelease.sh"

2. Change permissions of the file with:..

3. Run the script with:...

But I have Windows installed, not Linux, so I can't do it.


Therefore, I asked if someone had experience of installing a MediaWiki on Bluehost.


What do I need to do?

Malyacko (talkcontribs)

Help:Using Composer does not exist. If you followed some instructions somewhere, please link to them.

Bawolff (talkcontribs)

SMW is a pain to install. They also have their own help forum, which might have people specifically knowledgeable about it.

As an aside, it is possible to use Composer from the command line on Windows; it's just a lot more annoying.

Reply to "Bluehost and MediaWiki"

how can i set multiples wiki under only one dns name by example https://mediawikis.domain.com/wiki1 https://mediawikis.domain.com/wiki2

2
Mhetru (talkcontribs)

Hello,


I'm looking for a way to install MediaWiki so that there is only one master source folder:


/var/www/mediawikis.domain.com/htdocs/

and the folders of the individual wikis are under:

/var/www/mediawikis.domain.com/wikis

For example, wiki1 would be this folder:

/var/www/mediawikis.domain.com/wikis/wiki1


Thanks for your help


Mathieu

Malyacko (talkcontribs)
Reply to "how can i set multiples wiki under only one dns name by example https://mediawikis.domain.com/wiki1 https://mediawikis.domain.com/wiki2"

Adding CSRF token

3

Reception123 (talkcontribs)

I know this might be a bit of a basic question, but I'm not really a developer, just trying to adapt something. I'm trying to connect to an API endpoint (which I'm sure works), but I'm a bit confused about how I'm supposed to get a valid CSRF token for the user executing it. Here is the relevant part of the code so far, but I can't seem to figure out the token part:

public function updateServer( string $data ) {
	if ( $this->getStatus() === 'complete' ) {
		$data = [
			'action' => 'modifyserver',
			'format' => 'json',
			'mwaction' => 'modifyserver',
			'wiki' => $this->getTarget(),
			'server' => $this->getCustomDomain(),
			'token' => ??,
		];

		$httpRequestFactory = MediaWikiServices::getInstance()->getHttpRequestFactory();
		$httpRequestFactory->post( 'https://example.org/w/api.php', [ 'postData' => $data ] );
	}
}

Bawolff (talkcontribs)

You need to include cookies, or the CSRF token won't be valid.

E.g. first you have to get a token via an api.php?action=query&meta=tokens request. You need to use the ->create() method so you have an MWHttpRequest object. After the request goes through, you call ->getCookieJar() on the request object. Before sending the second request, you call ->setCookieJar() to set the cookies for that request.
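Putting those steps together, a rough sketch of what the code from the question could look like (the URL and field names are carried over from the question; error handling is omitted, and this is an untested outline rather than a definitive implementation):

```php
<?php
use MediaWiki\MediaWikiServices;

$factory = MediaWikiServices::getInstance()->getHttpRequestFactory();

// 1. Fetch a CSRF token; ->create() returns an MWHttpRequest object,
//    which (unlike ->post()) lets us capture the session cookies.
$tokenReq = $factory->create(
	'https://example.org/w/api.php?action=query&meta=tokens&format=json',
	[ 'method' => 'GET' ]
);
$tokenReq->execute();
$cookieJar = $tokenReq->getCookieJar();
$token = json_decode( $tokenReq->getContent(), true )['query']['tokens']['csrftoken'];

// 2. Send the actual request with the token, reusing the same cookies
//    so the token is recognised as belonging to this session.
$postReq = $factory->create( 'https://example.org/w/api.php', [
	'method' => 'POST',
	'postData' => [
		'action' => 'modifyserver',
		'format' => 'json',
		'token' => $token,
	],
] );
$postReq->setCookieJar( $cookieJar );
$postReq->execute();
```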

Reception123 (talkcontribs)

Thanks! I'll give that a try

Reply to "Adding CSRF token"