Rollover for the manual

''20:44, 4 July 2017''
{{stub}}<!--check instructions-->

Technical details on how the Gramps wiki [[User manual|user manual]] is updated for each release.

== Rolling over the Gramps user manual ==

On this page, <code>3.x</code> refers to the 'old' (current) version of the manual, and <code>3.z</code> refers to the new, to-be-created, version...
# Use phpmyadmin to back up all of the wiki tables!
# Locate all pages on the wiki that have <code>3.x</code> in the title: from the wiki toolbox, use Special pages, All pages to get a full list of wiki pages.
# Copy/paste the list into a text editor - it may be in three columns separated by tabs. If so, use a regular expression search/replace to replace tabs with newlines. In GEDIT, it is best to replace <code>\t</code> with <code>\n</code> rather than <code>\r</code>. You now have a list of all pages on the wiki, one title per line. Save it as a text file (I'll call it <code>myfile.txt</code> for this example).
# Use grep to find only the pages with <code>3.x</code> in the title: <code>grep "3\.x" myfile.txt > mynewfile.txt</code> - the backslash makes the full stop a literal character rather than a regular expression wildcard.
# Use a text editor to view the new file, and delete any pages that you don't want to roll over (that is, some pages may have <code>3.x</code> in the title but are not pages that we want duplicated into <code>3.z</code>...).
# Copy/paste the new list of page titles into the wiki export (toolbox, Special pages, Export pages) and create the XML to screen. {{man menu|Be sure the box for "Include only the current revision, not the full history" is ticked.}}
# Copy/paste the XML into a text editor and use search/replace to replace <code>3.x</code> with <code>3.z</code>.
# Save the file (I'll call it <code>export.xml</code> for this example).

For the change to <code>3.4</code>, and then from <code>3.4</code> to <code>4.0</code>, I used a script in the maintenance directory to import the XML from the command line.
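The text-processing steps above (tab-to-newline conversion, grep filtering, and the version search/replace) can also be done entirely from the command line. A minimal sketch, with invented file contents and <code>3.1</code> → <code>3.2</code> standing in for <code>3.x</code> → <code>3.z</code>:

```shell
# Sample "All pages" paste: three titles separated by tabs (invented examples).
printf 'Gramps 3.1 Wiki Manual\tGramps FAQ\tGramps 3.1 Wiki Manual - Reports\n' > allpages.txt

tr '\t' '\n' < allpages.txt > myfile.txt       # one page title per line
grep '3\.1' myfile.txt > mynewfile.txt         # keep only the 3.1 pages
sed 's/3\.1/3.2/g' mynewfile.txt > retitled.txt  # preview the rollover rename
```

The same <code>sed</code> substitution applied to the exported <code>export.xml</code> performs step 7 in one pass.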
I was successful with the following (though I got an error in <code>importDump.php</code> and had to edit the file to ensure that <code>Maintenance.php</code> was included properly!):
'''-bash-3.02$ <code>php importDump.php 34.xml</code>'''

followed by:

'''-bash-3.02$ <code>php rebuildrecentchanges.php</code>'''

That should now have created all of the new pages.

Alternatively, you can import the new pages through the wiki itself: toolbox, Special pages, Restricted special pages (available only to Sysop members of the wiki), Import pages, then Browse to find the .xml file and Upload file. There are serious restrictions on both file size and processing time for that route, however - I had to split the .xml file into 8 pieces to get it all uploaded.
''New pages created by this sort of import do not automatically get added into the search index of the wiki. Use phpmyadmin from the cpanel and use the repair tool for the searchindex table.''
Finally, if you want to set all the previous version pages to be protected, so only sysop mediawiki users are able to make changes (so all changes will be forced into the next set of pages...), you can use a series of sql statements in the form:
<pre>UPDATE `grampswiki2`.`page` SET `page_restrictions` = 0x656469743d7379736f703a6d6f76653d7379736f70 WHERE `page`.`page_title` = 'Title_of_the_page_in_the_wiki' collate utf8_unicode_ci LIMIT 1;</pre>
({{man menu|You'll need to replace the spaces in the page title with underscores...}})
I do that concatenation in a spreadsheet, creating the (long) SQL statements by combining a first part, the page names, and the last part. You'll need to search for any single quotes in the page names and put a backslash in front of them first, too...
That updates a blob field to force protection without having to do the pages individually. You need to run the above sql statement for each page that you want to protect...
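As an alternative to the spreadsheet concatenation, a small shell sketch can emit the same UPDATE statements, handling both the underscore and the single-quote substitutions. The sample title in <code>titles.txt</code> is invented; the hex blob and database name are taken from the statement above:

```shell
# Hypothetical generator for the per-page protection statements: reads page
# titles (one per line), swaps spaces for underscores, backslash-escapes
# single quotes, and wraps each title in the UPDATE template.
printf "Gramps 3.1 Wiki Manual - What's New\n" > titles.txt

while IFS= read -r title; do
  # "What's New" -> "What\'s_New": underscores first, then quote escaping
  t=$(printf '%s' "$title" | sed "s/ /_/g; s/'/\\\\'/g")
  printf "UPDATE \`grampswiki2\`.\`page\` SET \`page_restrictions\` = 0x656469743d7379736f703a6d6f76653d7379736f70 WHERE \`page\`.\`page_title\` = '%s' collate utf8_unicode_ci LIMIT 1;\n" "$t"
done < titles.txt > protect.sql
```

The resulting <code>protect.sql</code> can then be pasted into phpmyadmin's SQL tab in one go.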
== Screenshots ==

As there is no [[Screenshots|naming scheme]], we are trying to follow a rule for the manual upgrade:
* <code>filename-{number of version}-{lang}.extension</code>
for example:
* <code>Edit-person-40-en.png</code>
We need to decide whether we keep the {number of version} or use the new one on migration (3.1->3.2->4.0->15.0...)?
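If the version number is bumped on migration, the local copies of the screenshots can be renamed in bulk before re-upload. A sketch under that assumption (the file names and the 40 → 50 bump are invented examples following the scheme above):

```shell
# Create two sample screenshot files following filename-{version}-{lang}.ext
touch Edit-person-40-en.png Edit-family-40-de.png

# Rename every "-40-" file to "-50-" for the new manual version
for f in *-40-*.png; do
  mv "$f" "$(printf '%s' "$f" | sed 's/-40-/-50-/')"
done
```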
== See also ==
* [[User manual]]
* [[Manual Generation 3.0]]
[[Category:Developers/General]]