Rollover for the manual

From Gramps
Notes for the [email protected]

Used in the maintenance of the Gramps website

Technical details on how the Gramps wiki User manual is updated for each release.

Rolling over the Gramps user manual

On this page, 5.x refers to the 'old' (current) version of the manual, and 5.z refers to the new version to be created...

Backup wiki

Use phpMyAdmin to back up all of the wiki tables (if phpMyAdmin times out, use the command line)!

  • Command line: mysqldump -p -h hostname -u username database > dbname.sql


  1. Locate all pages on the wiki that have 5.x in the title - from the wiki menu: Tools > Special pages > All pages, then copy the text page by page to get a full list of wiki pages.
  2. Copy/paste it into a text editor - it may be in three columns separated by tabs. If so, use a regular-expression search/replace to replace the tabs with line breaks. In gedit, it is best to replace \t with \n rather than \r. You now have a list of all pages on the wiki, one title per line. Save it as a text file (I'll call it myfile.txt for this example).
  3. Use grep to find only the pages with 5.x in the title: grep "5\.x" myfile.txt > mynewfile.txt The backslash makes the full stop a literal character rather than a regular-expression wildcard.
  4. Use a text editor to view the new file, and delete any pages that you don't want to roll over (That is, some pages may refer to 5.x in the title, but are not pages that we want duplicated into 5.z...)
  5. Go to the wiki menu: Tools > Special pages > Export pages. Copy/paste the new list of page titles into the Add pages manually: section, make sure the checkbox Include only the current revision, not the full history is ticked, and then press the Export button to create the XML on screen.
  6. Copy/paste the xml into a text processor and use search/replace to replace 5.x with 5.z
  7. Save the file (I'll call it export.xml for this example).
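Steps 3 and 6 above can be sketched from the shell; the file names and the 5.1 -> 5.2 version strings below are only placeholders for the real 5.x and 5.z:

```shell
# Hypothetical sample of the page list saved from All pages (step 2)
printf '%s\n' 'Gramps_5.1_Wiki_Manual_-_Main_Page' 'Download' > myfile.txt

# Step 3: keep only the titles containing "5.1" (the backslash makes the dot literal)
grep '5\.1' myfile.txt > mynewfile.txt

# Step 6: rewrite every 5.1 as 5.2 in the exported XML
printf '<title>Gramps 5.1 Wiki Manual</title>\n' > export-old.xml
sed 's/5\.1/5.2/g' export-old.xml > export.xml
```

Doing the version replacement with sed rather than interactively in an editor makes the change repeatable if you have to re-export.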

Import xml from the command line

For the change to 5.4, and then from 5.4 to 6.0, I used the MediaWiki script in the maintenance directory to import the XML from the command line. I was successful with the following (though I got an error in importDump.php and had to edit the file to ensure that Maintenance.php was included properly!):

-bash-prompt$ php importDump.php export.xml

followed by:

-bash-prompt$ php rebuildrecentchanges.php

That should now have created all of the new pages.

Update searchindex table

Before doing this, back up the database.


New pages created by this sort of import do not automatically get added into the search index of the wiki.

  • Use phpMyAdmin from the cPanel and run the repair tool on the searchindex table (on the Operations tab, under Table maintenance > Repair table).

Set previous version pages to be protected

Finally, if you want to set all of the previous (older) user manual version pages to be protected, so that only sysop MediaWiki users are able to make changes (forcing all changes into the new set of pages...), you can use a series of SQL statements of the form:

UPDATE `grampswiki`.`page` SET `page_restrictions` = 0x656469743d7379736f703a6d6f76653d7379736f70 WHERE `page`.`page_title` ='Title_of_the_page_in_the_wiki' collate utf8_unicode_ci LIMIT 1;

(You'll need to replace the spaces in the page title with underscores...)
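For reference, the hex literal in that statement is just the ASCII restriction string edit=sysop:move=sysop, which you can confirm from a shell (assuming xxd is installed):

```shell
# Decode the page_restrictions hex blob back to ASCII
printf '656469743d7379736f703a6d6f76653d7379736f70' | xxd -r -p
# edit=sysop:move=sysop
```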

Beginning with MediaWiki 1.10

Page protection controls were moved to the page_restrictions table, so this field will be empty in databases generated by more recent versions of MediaWiki. However, this field is still honoured by current versions of MediaWiki for rows generated by older versions of MediaWiki!

Generate the (long) SQL statements by concatenation in a spreadsheet (call it protectoldpages.xls), combining a fixed first part, the page names, and a fixed last part. You'll first need to search the page names for any single quotes and put a backslash in front of them, too...

This updates a blob field to force protection without having to edit the pages individually. You need to run the above SQL statement for each page that you want to protect...

  • Copy the resulting SQL statement to a text file protectoldpages.txt
  • check the syntax of the SQL statements
  • Then run the SQL statements using either phpMyAdmin (the "SQL" tab to run the query) or the command line.
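As an alternative to the spreadsheet, a shell loop can generate the same statements from a file of page titles (the file names and the sample title are hypothetical; the hex blob is the restriction string from the statement above):

```shell
# titles.txt: one old-manual page title per line (hypothetical sample)
printf '%s\n' "Gramps 5.1 Wiki Manual - What's new?" > titles.txt

while IFS= read -r title; do
  # backslash-escape single quotes, then turn spaces into underscores
  esc=$(printf '%s' "$title" | sed "s/'/\\\\'/g; s/ /_/g")
  printf "UPDATE \`grampswiki\`.\`page\` SET \`page_restrictions\` = 0x656469743d7379736f703a6d6f76653d7379736f70 WHERE \`page\`.\`page_title\` = '%s' collate utf8_unicode_ci LIMIT 1;\n" "$esc"
done < titles.txt > protectoldpages.txt
```

This produces one UPDATE statement per title, with the quoting and underscore substitution already applied.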

Back up the database.


As there is no fixed naming scheme, we try to follow this rule for file names when upgrading the manual:

  • filename-{number of version}-{lang}.extension


  • Edit-person-50-en.png

We need to decide whether we keep the old {number of version} or use the new one on migration (3.1->3.2->4.0->5.0...).

Regenerate Version-Specific Content

Update the file dependency list: the documentation assumes the default (English-US) language on Linux.

Edit the manual section:

Copy (or pipe) the output of the updated Gramps installation's command-line Version option:

gramps -v

Replace any user-specific directory information with <~username>
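A sketch of masking the path, shown on a hypothetical sample line rather than real gramps -v output (the sed pattern assumes Linux-style /home/... paths):

```shell
# In practice: gramps -v > raw.txt 2>&1
printf 'Family Tree: /home/alice/.gramps/grampsdb\n' > raw.txt

# Replace the user-specific home directory with <~username>
sed 's|/home/[^/ ]*|<~username>|g' raw.txt > gramps-version.txt
```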

If appropriate, please reference the Gramps "User Directory" Glossary term entry instead of redundantly explaining the directory paths under any of the multiuser OSes. This glossary entry includes a drill-down link to the Appendix describing how to find the directory for a particular fork of Gramps.

[[Gramps_Glossary#user_directory|User Directory]]

Gramps user directory available on various operating systems.

Remove "STUB" template from the previous User manual

For the Gramps user manual only the latest version should be listed on Category:Stub.

Version manual template

Update the current ( 5.2 ) version number used on Template:Version_manual for pages outside of the user manual.

Also manually update any REDIRECT pages, e.g. the "Third-party Addons" link redirects to 5.1 Addons, etc.

Email announcement

Send the following email announcement (see embedded email) to the mailing list.

External Site changes

  • Some of the Discourse forum's automatic linking of key phrases points to sections in the current manual. A user with a "Leader" or higher trust level should review and update the Admin -> Customize -> Watched Words -> Link list. This is most easily accomplished by:
  1. using the discourse Download button to acquire a CSV of the list (which is an unqualified comma delimited list with a "phrase,url" on each line)
  2. edit it in a text editor
  3. replace the versions in the list
  4. save the file
  5. use the "Clear All" discourse button
  6. use the "Add from file" discourse button to upload the modified list
  7. refresh the Discourse page manually as that seems to be quirky
  8. use the "Test" discourse button to verify a couple of linkified phrases

See also