
Version 10 (modified by Roberto Longobardi, 11 years ago)


Test Manager for Trac - Generic Persistent Class Framework

The Test Manager plugin is made up of four plugins, one of which is a Generic Persistent Class framework, allowing plugin programmers to easily build and manage persistent objects in the Trac database.

Inheriting from the AbstractVariableFieldsObject base class provided with this plugin is a matter of a few lines of code, and will provide your objects with:

  • Persistence, in the Trac database,
  • Custom properties, which your users can specify in their trac.ini files. Automatic Web user interface support is also provided,
  • Change history tracking, recording the user, the timestamp and the before/after values of every property change,
  • Object lifecycle listeners, allowing your plugins or others to register for feedback on object creation, deletion and modification,
  • Custom authorization, allowing custom permissions to control any access to your objects,
  • Integrated search, both via the Trac general Search box and programmatically, using pattern matching.


The Generic Workflow Engine and the Test Manager plugins leverage this plugin for their data models.

Works with both Trac 0.11 and 0.12.

The basic building blocks of the framework are:

  • The AbstractVariableFieldsObject base class, providing most object features. Inheriting from this class is a matter of a few lines of code, and will provide your objects with the features outlined above.
  • The IConcreteClassProvider interface, allowing your plugin to register your classes with the framework, participate in the framework's objects factory and provide custom security.
  • The GenericClassModelProvider component, providing the objects factory, managing concrete class providers and handling metadata about custom classes and their fields.
  • The GenericClassSystem component, providing the framework logic for displaying and updating custom properties in the Web user interface, and providing support for the Trac search box.
  • The need_db_upgrade() and upgrade_db() utility methods, providing the ability for plugins to declaratively create and upgrade their database tables to match the custom classes provided.
  • The IGenericObjectChangeListener interface, an extension point interface for components that require notification when objects of any particular type are created, modified, or deleted.


AbstractVariableFieldsObject - The data model

The plugin provides a base Python class, AbstractVariableFieldsObject, originally derived from the base Trac Ticket class, which brings a rich set of features to any custom class that inherits from it.

It represents a persistent object whose fields are declaratively specified.

Each subclass will be identified by a type name, the "realm", which is provided at object initialization time. This name must also match the name of the database table storing the corresponding objects, and is used as the base name for the custom fields table and the change tracking table (see below), if needed.

Features:

  • Support for custom fields, specified in the trac.ini file with the same syntax as for custom Ticket fields. Custom field values are kept in a "<schema>_custom" table.

  • Keeping track of all changes to any field, into a separate "<schema>_change" table.
  • A set of callbacks to allow subclasses to control and perform actions before and after any operation pertaining to the object's lifecycle. Subclasses may grant or deny permission for any operation to be performed.
  • Registering listeners, via the IGenericObjectChangeListener interface, for object creation, modification and deletion.
  • Searching for objects matching any set of valued fields (even non-key fields), applying the "dynamic record" pattern. See the list_matching_objects method.
  • Well-commented code, rich in debug messages.
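
Custom field declarations follow the same trac.ini syntax as custom Ticket fields. As a hedged sketch, assuming the section name follows a "<realm>-custom" pattern, with a hypothetical realm named sample_realm (remember that only 'text' custom fields are currently supported):

```ini
[sample_realm-custom]
priority = text
priority.label = Priority
priority.value = medium
```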


A set of special fields help subclasses to implement their logic:

  • self.exists: always tells whether the object currently exists in the database.
  • self.resource: points to a Resource, in the trac environment, corresponding to this object. This is used, for example, in the workflow implementation.
  • self.fields: points to an array of dictionary objects describing name, label, type and other properties of all of this object's fields.
  • self.metadata: points to a dictionary object describing further meta-data about this object.


Note: database tables for specific realms are supposed to already exist: this object does not create any tables. See the GenericClassModelProvider section below for how to make the framework create the required tables declaratively.


How to create a custom persistent object

This section will guide you with step-by-step instructions on how to create a custom object, including declaring the object's class, providing the fields metadata, and specifying the DB tables.

Note: The following examples are taken from the TracGenericWorkflow plugin. Refer to the corresponding code if you want more context.

The following is the basic set of imports you will need to create a custom object. The first one is needed to let your class inherit from the basic abstract class, the others to define your concrete model provider.

from tracgenericclass.model import AbstractVariableFieldsObject, IConcreteClassProvider, need_db_upgrade, upgrade_db


Then we must declare the concrete class. See the description below the code box.

class ResourceWorkflowState(AbstractVariableFieldsObject):
    # Fields that have no default, and must not be modified directly by the user
    protected_fields = ('id', 'res_realm', 'state')

    def __init__(self, env, id=None, res_realm=None, state='new', db=None):
        self.values = {}

        self.values['id'] = id
        self.values['res_realm'] = res_realm
        self.values['state'] = state

        key = self.build_key_object()
    
        AbstractVariableFieldsObject.__init__(self, env, 'resourceworkflowstate', key, db)

    def get_key_prop_names(self):
        return ['id', 'res_realm']
        
    def create_instance(self, key):
        return ResourceWorkflowState(self.env, key['id'], key['res_realm'])

The first statement defines the concrete class as a subclass of AbstractVariableFieldsObject.

The protected_fields attribute declares which fields have no default and must not be modified directly by the user. It sees little use, but it is still required.

Then comes the constructor. The only requirement on the constructor signature is that it must have room for the env and db parameters, which you will need to pass to the superclass. The other parameters in this example are specific to the workflow object.

Since we are talking about the constructor, let's look at the different uses you can make of it.

In fact, you can:

  • Create an empty object, and also try to fetch it from the database, if an object with a matching key is found. To fetch an existing object from the database and modify or delete it:
    1. specify a key at construction time: the object will be filled with all of the values from the database,
    2. modify any other property via the obj['fieldname'] = value syntax, including custom fields. This syntax is the only one that keeps track of changes to any field,
    3. call the save_changes() or delete() method on the object.
  • Create a new object to be later stored in the database. In this case, the steps required are the following:
    1. specify a key at construction time,
    2. set any other property via the obj['fieldname'] = value syntax, including custom fields,
    3. call the insert() method on the object.
  • Create an empty, template object, used for pattern-matching search. To do this, do not specify a key at initialization time.

In the body of the constructor, there are a few important things you must do:

  1. Clear the self.values dictionary field, which holds all of the object's property values,
  2. Set the parameters received in the constructor into self.values. Use the syntax where you assign values directly to the self.values dictionary here, not the one like self['fieldname'] = value,
  3. Build the object's key (by using the provided self.build_key_object() method, if you wish),
  4. Call the superclass constructor, passing as the third argument the unique type name that will identify objects of your custom class. Remember that this name must match the name of the database table providing persistence to your objects.

Two more methods are required to make a functional subclass.

The get_key_prop_names() method is vital: it tells the framework which properties make up your subclass's key. As you can see, you return an array of property names here. Also note that the build_key_object() method uses the information returned from this method to build the object's key.
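
Although the actual implementation lives in the framework, build_key_object() presumably amounts to collecting the key properties out of self.values, along these lines (a hypothetical sketch, not the real framework code):

```python
def build_key_object(values, key_prop_names):
    # Keep only the key properties; non-key fields like 'state' are ignored.
    return dict((name, values[name]) for name in key_prop_names)

# With the ResourceWorkflowState example above, where the key is
# ('id', 'res_realm'), the resulting key would be a two-entry dict.
key = build_key_object({'id': 'W1', 'res_realm': 'wiki', 'state': 'new'},
                       ['id', 'res_realm'])
```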

The create_instance() method is a factory method used to create an instance of this custom class, given a key dictionary object.


That's it! Our new class is ready to be used, with all the features outlined at the top of this page.



Well, actually before being able to use our new class, we must:

  • Define our concrete class provider, implementing the IConcreteClassProvider interface, and
  • Declare the data model to be created into the Trac DB. The database will be created (and updated) by the framework.

As you will see, this is far from complex. See the description below the code box.

class GenericWorkflowModelProvider(Component):
    """
    This class provides the data model for the generic workflow plugin.
    
    The actual data model on the db is created starting from the
    SCHEMA declaration below.
    For each table, we specify whether to create also a '_custom' and
    a '_change' table.
    
    This class also provides the specification of the available fields
    for each class: both the standard fields and the custom fields
    specified in the trac.ini file.
    The custom field specification follows the same syntax as for
    Tickets.
    Currently, only 'text' type of custom fields are supported.
    """

    implements(IConcreteClassProvider, IEnvironmentSetupParticipant)

    SCHEMA = {
                'resourceworkflowstate':  
                    {'table':
                        Table('resourceworkflowstate', key = ('id', 'res_realm'))[
                              Column('id'),
                              Column('res_realm'),
                              Column('state')],
                     'has_custom': True,
                     'has_change': True,
                     'version': 1}
             }

    FIELDS = {
                'resourceworkflowstate': [
                    {'name': 'id', 'type': 'text', 'label': N_('ID')},
                    {'name': 'res_realm', 'type': 'text', 'label': N_('Resource realm')},
                    {'name': 'state', 'type': 'text', 'label': N_('Workflow state')}
                ]
            }
            
    METADATA = {
                'resourceworkflowstate': {
                        'label': "Workflow State", 
                        'searchable': False,
                        'has_custom': True,
                        'has_change': True
                    },
                }

            
    # IConcreteClassProvider methods
    def get_realms(self):
        yield 'resourceworkflowstate'

    def get_data_models(self):
        return self.SCHEMA

    def get_fields(self):
        return self.FIELDS
        
    def get_metadata(self):
        return self.METADATA
        
    def create_instance(self, realm, key=None):
        obj = None
        
        if realm == 'resourceworkflowstate':
            if key is not None:
                obj = ResourceWorkflowState(self.env, key['id'], key['res_realm'])
            else:
                obj = ResourceWorkflowState(self.env)
        
        return obj

    def check_permission(self, req, realm, key_str=None, operation='set', name=None, value=None):
        pass


    # IEnvironmentSetupParticipant methods
    def environment_created(self):
        self.upgrade_environment()

    def environment_needs_upgrade(self, db=None):
        for realm in self.SCHEMA:
            realm_metadata = self.SCHEMA[realm]

            if need_db_create_for_realm(self.env, realm, realm_metadata, db) or \
                need_db_upgrade_for_realm(self.env, realm, realm_metadata, db):
                
                return True
                
        return False

    def upgrade_environment(self, db=None):
        # Create or update db
        @self.env.with_transaction(db)
        def do_upgrade_environment(db):
            for realm in self.SCHEMA:
                realm_metadata = self.SCHEMA[realm]

                if need_db_create_for_realm(self.env, realm, realm_metadata, db):
                    create_db_for_realm(self.env, realm, realm_metadata, db)

                elif need_db_upgrade_for_realm(self.env, realm, realm_metadata, db):
                    upgrade_db_for_realm(self.env, 'tracgenericworkflow.upgrades', realm, realm_metadata, db)


First of all, this should be a component, because it must implement Trac interfaces. The interfaces to implement are the following:

  • IConcreteClassProvider, to provide custom classes to the framework,
  • IEnvironmentSetupParticipant, to be queried by Trac about any needs regarding database creation or upgrade.

Then comes the declaration of the database schema, class fields and class metadata, to be later returned in the corresponding methods of the interface.


Upgrading the Database

If you need to upgrade the database schema in a subsequent release, the framework helps you with that.

The call to upgrade_db_for_realm, in the code above, specifies as the second parameter the subdirectory where all of the DB upgrade scripts will be provided.

This directory will contain simple python files, one for each table and for each DB version upgrade, with a structure similar to the following. See a detailed explanation after the code box. Refer to the TestManager plugin upgrade directory for a more complete example.

These upgrade scripts must follow a specific naming convention: db_<table name>_<version>.

The following script from the TestManager plugin, for example, is named "db_testcaseinplan_2", meaning that it will upgrade the "testcaseinplan" table from version 1 to version 2.

from trac.db import Table, Column, Index, DatabaseManager
from tracgenericclass.util import *
from testmanager.model import TestManagerModelProvider

def do_upgrade(env, ver, db_backend, db):
    """
    Add 'page_version' column to testcaseinplan table
    """
    cursor = db.cursor()
    
    realm = 'testcaseinplan'
    cursor.execute("CREATE TEMPORARY TABLE %(realm)s_old AS SELECT * FROM %(realm)s" % {'realm': realm})
    cursor.execute("DROP TABLE %(realm)s" % {'realm': realm})

    table_metadata = Table('testcaseinplan', key = ('id', 'planid'))[
                              Column('id'),
                              Column('planid'),
                              Column('page_name'),
                              Column('page_version', type='int'),
                              Column('status')]

    env.log.info("Updating table for class %s" % realm)
    for stmt in db_backend.to_sql(table_metadata):
        env.log.debug(stmt)
        cursor.execute(stmt)

    cursor = db.cursor()

    cursor.execute("INSERT INTO %(realm)s (id,planid,page_name,page_version,status) "
                   "SELECT id,planid,page_name,-1,status FROM %(realm)s_old" % {'realm': realm})

    cursor.execute("DROP TABLE %(realm)s_old" % {'realm': realm})


In case you need to update a table's structure (e.g. add columns), you will:

  1. Create a temporary table with the same structure as the old table
  2. Copy all the contents of the current table into the temporary table
  3. Drop the original table
  4. Recreate the original table with the new structure
  5. Copy back all the contents from the temporary table into the updated original table
  6. Drop the temporary table
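
Under the assumption that the upgrade runs plain SQL, the six steps above can be sketched standalone with an in-memory SQLite database and a hypothetical "sample" table gaining a page_version column (names here are illustrative, not the framework's):

```python
import sqlite3

db = sqlite3.connect(':memory:')
cursor = db.cursor()
cursor.execute("CREATE TABLE sample (id TEXT, status TEXT)")
cursor.execute("INSERT INTO sample VALUES ('TC1', 'new')")

# 1-2. Create a temporary table and copy the current contents into it
cursor.execute("CREATE TEMPORARY TABLE sample_old AS SELECT * FROM sample")
# 3. Drop the original table
cursor.execute("DROP TABLE sample")
# 4. Recreate the original table with the new structure
cursor.execute("CREATE TABLE sample (id TEXT, page_version INT, status TEXT)")
# 5. Copy the contents back, filling the new column with a default
cursor.execute("INSERT INTO sample (id, page_version, status) "
               "SELECT id, -1, status FROM sample_old")
# 6. Drop the temporary table
cursor.execute("DROP TABLE sample_old")

rows = cursor.execute("SELECT id, page_version, status FROM sample").fetchall()
```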


Following is the IConcreteClassProvider interface documentation, which is pretty self-explanatory about the format and meaning of the required data.

    def get_realms():
        """
        Return class realms provided by the component.

        :rtype: `basestring` generator
        """

    def get_data_models():
        """
        Return database tables metadata to allow the framework to create the
        db schema for the classes provided by the component.

        :rtype: a dictionary whose keys are schema names and whose values 
                are dictionaries with table metadata, as in the following example:
                return {'sample_realm':
                            {'table':
                                Table('samplerealm', key = ('id', 'otherid'))[
                                      Column('id'),
                                      Column('otherid'),
                                      Column('prop1'),
                                      Column('prop2'),
                                      Column('time', type='int64')],
                             'has_custom': True,
                             'has_change': True},
                       }
        """

    def get_fields():
        """
        Return the standard fields for classes in all the realms 
        provided by the component.

        :rtype: a dictionary whose keys are realm names and whose values 
                are arrays of field metadata, as in the following example:
                return {'sample_realm': [
                            {'name': 'id', 'type': 'text', 'label': N_('ID')},
                            {'name': 'otherid', 'type': 'text', 'label': N_('Other ID')},
                            {'name': 'prop1', 'type': 'text', 'label': N_('Property 1')},
                            {'name': 'prop2', 'type': 'text', 'label': N_('Property 2')},
                            {'name': 'time', 'type': 'time', 'label': N_('Last Change')}
                        ]
                       }
        """
        
    def get_metadata():
        """
        Return a set of metadata about the classes in all the realms 
        provided by the component.

        :rtype: a dictionary whose keys are realm names and whose values 
                are dictionaries of properties.
                
                Available metadata properties are:
                    'label': A user-friendly name for the objects in this class.
                    'searchable': If present and equal to True, indicates the class
                                  participates in the Trac search framework, and
                                  must implement the get_search_results() method.
                    'has_custom': If present and equal to True, indicates the class
                                  supports custom fields.
                    'has_change': If present and equal to True, indicates the class
                                  supports property change history.
                    
                See the following example:
                return {'sample_realm': {
                                'label': "Sample Realm", 
                                'searchable': True
                            }
                       }
        """

    def create_instance(realm, props=None):
        """
        Return an instance of the specified realm, with the specified properties,
        or an empty object if props is None.

        :rtype: `AbstractVariableFieldsObject` sub-class instance
        """

    def check_permission(req, realm, key_str=None, operation='set', name=None, value=None):
        """
        Checks whether the logged in User has permission to perform
        the specified operation on a resource of the specified realm and 
        optionally with the specified key.
        
        Raise an exception if authorization is denied.
        
        Possible operations are:
            'set': set a property with a value. 'name' and 'value' parameters are required.
            'search': search for objects of this class.
        
        :param key_str: optional, the object's key, in the form of a string representing 
                        a dictionary. To get a dictionary back from this string, use the 
                        get_dictionary_from_string() function in the
                        tracgenericclass.util package.
        :param operation: optional, the operation to be performed on the object.
        :param name: optional property name, valid for the 'set' operation type
        :param value: optional property value, valid for the 'set' operation type
        """

As for the IEnvironmentSetupParticipant interface implementation, it is really straightforward: the framework provides utility methods that, reading the schema declared in the section above:

  • Determine whether a database upgrade is needed, checking the availability of the tables and the columns as declared. This is achieved through the utility method need_db_upgrade().
  • Perform the database upgrade if required. This is achieved through the utility method upgrade_db(). Note: only creating all the tables is currently supported, not altering existing tables (for example due to your plugin database schema changing between versions).


Keeping control of your objects

We have seen how easy it is to define a class and its properties, and below we will see how easy it is to then use these classes in client code.

The framework will handle all of the burden of fetching the database rows, populating the object, keeping track of the changes, saving it when you want to, and so on.

However, there may be cases where you want to keep control of what's being done, and perhaps prevent some of these operations from occurring, based on your own logic.

This can be easily achieved by overriding some methods:

  • The "pre" methods get called before the corresponding operation is performed. Return True to allow the operation, or False to deny it.
  • The "post" methods get called after the corresponding operation has been performed, but before the transaction is committed. They give you the opportunity to perform further custom work in the context of the same operation.

The following is the list of such methods, from the AbstractVariableFieldsObject class documentation.

    def pre_fetch_object(self, db):
        """ 
        Use this method to perform initialization before fetching the
        object from the database.
        Return False to prevent the object from being fetched from the 
        database.
        """
        return True

    def post_fetch_object(self, db):
        """
        Use this method to further fulfill your object after being
        fetched from the database.
        """
        pass
        
    def pre_insert(self, db):
        """ 
        Use this method to perform work before inserting the
        object into the database.
        Return False to prevent the object from being inserted into the 
        database.
        """
        return True

    def post_insert(self, db):
        """
        Use this method to perform further work after your object has
        been inserted into the database.
        """
        pass
        
    def pre_save_changes(self, db):
        """ 
        Use this method to perform work before saving the object changes
        into the database.
        Return False to prevent the object changes from being saved into 
        the database.
        """
        return True

    def post_save_changes(self, db):
        """
        Use this method to perform further work after your object 
        changes have been saved into the database.
        """
        pass
        
    def pre_delete(self, db):
        """ 
        Use this method to perform work before deleting the object from 
        the database.
        Return False to prevent the object from being deleted from the 
        database.
        """
        return True

    def post_delete(self, db):
        """
        Use this method to perform further work after your object 
        has been deleted from the database.
        """
        pass
        
    def pre_save_as(self, old_key, new_key, db):
        """ 
        Use this method to perform work before saving the object with
        a different identity into the database.
        Return False to prevent the object from being saved into the 
        database.
        """
        return True
        
    def post_save_as(self, old_key, new_key, db):
        """
        Use this method to perform further work after your object 
        has been saved into the database.
        """
        pass
        
    def pre_list_matching_objects(self, db):
        """ 
        Use this method to perform work before finding matches in the 
        database.
        Return False to prevent the search.
        """
        return True
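
To make the contract concrete, here is a minimal standalone sketch (a toy stand-in, not the real AbstractVariableFieldsObject, which runs these callbacks around actual database operations) showing how a "pre" callback can veto an operation:

```python
class ToyBase(object):
    # Toy base class modeling only the callback control flow.
    def __init__(self):
        self.values = {}
        self.exists = True

    def delete(self):
        if not self.pre_delete(None):
            return False          # the "pre" callback vetoed the operation
        self.exists = False       # the operation itself
        self.post_delete(None)    # "post" runs before the commit
        return True

    def pre_delete(self, db):
        return True

    def post_delete(self, db):
        pass

class GuardedObject(ToyBase):
    def pre_delete(self, db):
        # Deny deletion while the object is in the 'locked' state.
        return self.values.get('state') != 'locked'

obj = GuardedObject()
obj.values['state'] = 'locked'
denied = obj.delete()             # vetoed, object survives
obj.values['state'] = 'obsolete'
allowed = obj.delete()            # permitted, object is gone
```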


A sample use

Now that we have our new class and provider in place, let's see how to use them!

Creating a new, non-existing object

To create a new, non-existing object of our new class, we follow the steps outlined above:

  1. specify a key at construction time,
  2. set any other property via the obj['fieldname'] = value syntax, including custom fields,
  3. call the insert() method on the object.

See the following code.

# Let's first import our new class definition. 
# Note that you don't have to deal with the framework in any way; the class may be defined on its own.
from tracgenericworkflow.model import ResourceWorkflowState

    # The following statement will create an empty object with a specific key, and immediately 
    # try to fetch an object with the same key from the database. 
    # If it is found, then the object's properties will be filled with the corresponding values 
    # from the database, and the internal field "exists" set to True.
    rws = ResourceWorkflowState(self.env, id, sometext)
    
    # We can check here whether the object was found in the database
    if rws.exists:
        # The object already exists! So we can get its property values
        print rws['state']
    else: 
        # Here we decide to create the object, so we fill in some other properties and then call the "insert()" method.
        # The object will be stored into the database, and all registered listeners will be called.
        rws['state'] = 'new'
        rws.insert()

Reading an existing object, and changing it

Here we want to read an existing object of our new class, alter some of its properties and save it back.

We follow the steps outlined above:

  1. specify a key at construction time: the object will be filled with all of the values from the database,
  2. modify any other property via the obj['fieldname'] = value syntax, including custom fields. This syntax is the only one that keeps track of changes to any field,
  3. call the save_changes() method on the object.

See the code below.

    rws = ResourceWorkflowState(self.env, id, sometext)
    
    if rws.exists:
        # The object already exists. 
        # Now we also want to modify the object's 'state' property, and save the updated object.
        rws['state'] = 'ok'

        # The following code will modify the object in the database and also call all 
        # of the registered listeners.
        try:
            rws.save_changes(author, "State changed")
        except:
            self.log.info("Error saving the resource with id %s" % rws['id'])

Saving changes will also record the change history in a separate table, named like the base table with an "_change" suffix. Note: Currently there is no explicit programmatic API to access these change tables; you must query them directly with SQL if you need to.
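
As a hedged sketch of such direct SQL access (the column layout of the "_change" tables is an assumption here, modeled on Trac's ticket_change table; check your actual schema before relying on it), using an in-memory SQLite database in place of the Trac connection:

```python
import sqlite3

db = sqlite3.connect(':memory:')
cursor = db.cursor()
# Assumed layout, mirroring ticket_change: key, time, author, field, oldvalue, newvalue
cursor.execute("CREATE TABLE resourceworkflowstate_change "
               "(key TEXT, time INT, author TEXT, field TEXT, "
               "oldvalue TEXT, newvalue TEXT)")

# One recorded change: 'state' went from 'new' to 'ok'
key_str = "{'id': 'W1', 'res_realm': 'wiki'}"
cursor.execute("INSERT INTO resourceworkflowstate_change VALUES (?, ?, ?, ?, ?, ?)",
               (key_str, 1000, 'john', 'state', 'new', 'ok'))

# Read the change history for one object, oldest change first
cursor.execute("SELECT time, author, field, oldvalue, newvalue "
               "FROM resourceworkflowstate_change WHERE key = ? ORDER BY time",
               (key_str,))
history = cursor.fetchall()
```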

Delete an object

Deleting an object is as simple as calling the delete() method on the object instance.

See the following code.

    rws = ResourceWorkflowState(self.env, id, sometext)
    
    if rws.exists:
        # The object already exists. 
        # Now we want to delete the object from the database. The following code will delete the
        # object and also call all of the registered listeners.
        try:
            rws.delete()
        except:
            self.log.info("Error deleting the resource with id %s" % rws['id'])

Use pattern-matching to find a particular set of objects

You can get a list of objects matching a particular set of properties - i.e. pattern matching - very easily:

  1. Create an empty, template object, without specifying a key,
  2. Give values to any properties you want the objects in the database to match,
  3. Call the list_matching_objects() method on the object.

See the following code.

    # We create a template object here, i.e. not providing a key.
    rws_search = ResourceWorkflowState(self.env)

    # Let's say we want to list all the objects in the database having a 'state' of 'new'.
    # We set the desired property values in the template object
    rws_search['state'] = 'new'

    # We then start the search
    for rws in rws_search.list_matching_objects():
        # At each cycle, rws will hold another result matching the search pattern, in the
        # form of a fully fetched object of the ResourceWorkflowState type.
        print rws['id']

Saving an object with a new name (key)

Sometimes all your Users want is to use an existing object, modify a couple of things and save it under another name.

To help your code with this task, the abstract class provides the save_as() method.

The previous object is not deleted, so if needed it must be deleted explicitly.

See the following code:

    rws = ResourceWorkflowState(self.env, id, sometext)
    
    if rws.exists:
        # The object already exists. 
        # Now we want to duplicate it, by giving the object a different key and saving it.
        # The following code will save the new object into the database and also call all 
        # of the registered listeners.
        try:
            new_key = {'id': '123456', 'res_realm': rws['res_realm']}
            rws['state'] = 'new'
            rws.save_as(new_key)
        except:
            self.log.info("Error saving the resource with id %s" % new_key['id'])

Getting or setting multiple values at the same time

A couple of utility methods in the base class allow clients to get or set multiple values at the same time.

See the following code for details.

    def get_values(self, prop_names):
        """ 
        Returns a list of the values for the specified properties,
        in the same order as the property names.
        """
                
    def set_values(self, props):
        """
        Sets multiple properties into this object.
        props must be a dictionary object with the names and values to set.        
        
        Note: this method does not keep history of property changes.
        """
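
The semantics can be sketched with a plain dict standing in for an object's self.values store (illustrative only, not the framework code):

```python
values = {'id': 'W1', 'res_realm': 'wiki', 'state': 'new'}

def get_values(prop_names):
    # Values come back in the same order as the requested names
    return [values[name] for name in prop_names]

def set_values(props):
    # Bulk update; note that the real method does not record change history
    values.update(props)

set_values({'state': 'ok'})
result = get_values(['id', 'state'])
```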


Trac Resource coupling

Every object managed within this framework is paired with a Trac Resource.

TODO Fill in this section.


Providing specific results to the Trac search engine

Subclasses can participate in the Trac search engine, if they wish to. It is a matter of a few lines of code.

First of all, the concrete class provider must explicitly state this support in the class metadata, by setting the 'searchable' metadata property to True. For more details about class metadata, refer to the IConcreteClassProvider interface documentation.

Then, the subclass must override the get_search_results() method.

For details about how to implement this method, refer to the base Trac documentation about the ISearchSource interface in the trac.search package.


IGenericObjectChangeListener - Registering listeners to objects lifecycle events

Every class inheriting from AbstractVariableFieldsObject supports a listener interface for components interested in objects' lifecycle events.

To register a listener for any particular class type, i.e. "realm", your component must implement the IGenericObjectChangeListener interface from the tracgenericclass.api package.

Following is the documentation from the interface itself.

    def object_created(g_object):
        """Called when an object is created."""

    def object_changed(g_object, comment, author, old_values):
        """Called when an object is modified.
        
        `old_values` is a dictionary containing the previous values of the
        fields that have changed.
        """

    def object_deleted(g_object):
        """Called when an object is deleted."""


The object that has just been created, modified or deleted is passed to the methods of the interface.

You can extract basic information about the object as follows:

  • The object's type can be retrieved from the realm field:
    object_realm = g_object.realm
    
  • The information in the "special fields" as described above.

Note: The object being passed along here is the actual object, not a copy. A listener is thus able to alter the object and save it, affecting the system. I didn't want to prevent this (and it probably wouldn't even be possible), because it can be a powerful mechanism for further extending the environment with higher-level functionality.