Progress in 2.2 tree

In recent weeks, several fairly big changes have happened in HEAD. First of all, the Berkeley DB storage now uses DB transactions (Txn for short). Transactions should ensure that any change to the database file either happens completely (i.e., is atomic and durable) or not at all, protecting us from corrupted data.
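The all-or-nothing guarantee can be illustrated with a small stdlib sketch. Note that this is only an analogy built on an atomic file rename, not the actual Berkeley DB Txn API, and the file name and data are hypothetical:

```python
import json
import os
import tempfile

def atomic_save(path, data):
    """Update a file so it is either fully rewritten or left untouched.

    This mimics the all-or-nothing behavior of a database transaction:
    a crash mid-write leaves the old file intact.
    """
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
            f.flush()
            os.fsync(f.fileno())   # make the new contents durable on disk
        os.replace(tmp, path)      # atomic rename: reader sees old or new, never half
    except BaseException:
        os.unlink(tmp)             # roll back: discard the partial write
        raise

atomic_save("people.json", {"I0001": "John Doe"})
```

A real Txn goes further (it also covers multi-record updates and recovery from a log), but the atomicity property is the same.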

We added support for repositories and for source references to repositories. Event references were also added, enabling events to be shared among different people. These and other changes led to additions to, and restructuring of, our XML format; the old format will continue to be supported for reading, though. Both XML versions are formally described at http://gramps-project.org/xml (version 1.0.0 is for stable gramps, 2.0.x, while 1.1.0 is for the development tree, 2.2.x).
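A reader that supports both formats can branch on the declared format version. The sketch below shows the idea only; the element and attribute names are illustrative, not the actual gramps XML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical snippets; the real schemas are described at
# http://gramps-project.org/xml.
OLD = '<database version="1.0.0"></database>'
NEW = '<database version="1.1.0"><repositories/></database>'

def format_version(xml_text):
    """Return the declared format version so the reader can pick a parser."""
    return ET.fromstring(xml_text).get("version")

assert format_version(OLD) == "1.0.0"
assert format_version(NEW) == "1.1.0"
```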

The importers for GEDCOM and XML were then optimized. We now import the same XML or GEDCOM file faster than we do in 2.0.x, despite the fact that we write more data (repositories and references, event references, etc.) in a journaling manner (Txn). The storage of the data has also been converted from saving Gramps-defined objects to saving native Python objects: text, numbers, lists, etc. This gives smaller files and faster storage and retrieval, as well as a clean separation between the code and the data.
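The size difference is easy to see with a toy comparison. The `Person` class below is a hypothetical stand-in for a Gramps-defined object, not the real class:

```python
import pickle

class Person:
    """Illustrative stand-in for an application-defined object."""
    def __init__(self, handle, name, birth_year):
        self.handle = handle
        self.name = name
        self.birth_year = birth_year

p = Person("I0001", "John Doe", 1900)
as_object = pickle.dumps(p)
as_native = pickle.dumps(("I0001", "John Doe", 1900))  # plain tuple of builtins

# The tuple pickle omits the class/module bookkeeping and the attribute
# dictionary, so it is smaller, and it can be unpickled without importing
# any application code.
assert len(as_native) < len(as_object)
```

Decoupling the stored form from the classes also means the on-disk data survives refactoring of the object model.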

On the interface front, much profiling and optimizing has been done by Don and Richard. The different views are now built on request, extra drawing steps have been removed, and the sorting has been optimized. Richard has almost finished modifying the TreeView so that it can draw a view with 40 thousand rows in a fraction of a second.
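Building a view only when it is first requested amounts to simple lazy construction. The following is an illustrative Python sketch of that pattern, not the actual Gramps or GTK TreeView code:

```python
class LazyViews:
    """Build each view on first request and cache it afterwards."""

    def __init__(self, builders):
        self._builders = builders   # view name -> zero-argument factory
        self._built = {}            # views constructed so far

    def get(self, name):
        if name not in self._built:                  # first request: pay the cost now
            self._built[name] = self._builders[name]()
        return self._built[name]                     # later requests: reuse

views = LazyViews({
    "people": lambda: ["<PeopleView>"],
    "pedigree": lambda: ["<PedigreeView>"],
})
people = views.get("people")         # built here, on first use
assert views.get("people") is people # subsequent calls reuse the cached view
```

The payoff is at startup: views the user never opens are never constructed at all.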

Finally, as the core database changes stabilize, work is being done on bringing the new and improved UI elements back to life. This includes the editing dialogs for person, relationship, event/event reference, etc. In short, there are a lot of exciting things to look forward to and to help with. Please chime in on gramps-devel at lists.sourceforge.net if you would like to help, and we will let you know how :-)
