
Wednesday, January 18, 2017

DQL to retrieve all the non-inherited attributes of an object type

How can one retrieve all the attributes of an object type, both inherited and non-inherited? The quick way is to use DQL:

select * from dm_type  where  name='testtype' ENABLE(ROW_BASED);

If only the type's own attributes are needed, without the inherited ones, the DQL has to be slightly elaborated to exclude the inherited attributes:

select * from dm_type  where  name='testtype' and attr_identifier>start_pos order by attr_name ENABLE(ROW_BASED);

Obviously, one would replace * with the list of needed columns, most likely attr_name, attr_type, attr_repeating and attr_length.
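Putting the two points together, a query listing only the non-inherited attributes with the most commonly needed columns might look like this (testtype is a placeholder for the actual type name):

```sql
select attr_name, attr_type, attr_length, attr_repeating
from dm_type
where name = 'testtype'
  and attr_identifier > start_pos
order by attr_name
ENABLE(ROW_BASED);
```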

Tuesday, January 17, 2017

How to copy all versions of a documentum object if the root version in the object version hierarchy is deleted?

Suppose a documentum object has many versions. It is easy to copy such an object while preserving its version tree; probably the easiest possible approach is described here. The described method will work even if the root version of the source object has previously been deleted, but the copies will not be entirely valid. For example, they will be undeletable by normal means.

When the root version is deleted, the chronicle id of all its descendants remains unchanged even though the root version itself becomes inaccessible. To produce valid objects, I treat the oldest existing version as the root version: the id of its copy serves as the chronicle id in the produced copies of the source versions.

    Map<IDfId, IDfId> copyAllObjectsWithChronicleId(IDfId chronicleId, IDfSession sourceSession) throws DfException {
        // list ordered by r_creation_date and r_object_id
        List<IDfId> idsWithChronicleId = getAllVersionIdsWithChronicleId(sourceSession, chronicleId);
        Map<IDfId, IDfId> missingSourceObjIdCopyObjIds = new HashMap<IDfId, IDfId>();

        // the root version is missing from the list when it has been deleted
        if (idsWithChronicleId.contains(chronicleId)) {
            logger.debug("the root version is intact");
        } else {
            // the root object is deleted, so its oldest existing version will substitute for it
            logger.debug("the root version is missing");
            missingSourceObjIdCopyObjIds.put(chronicleId, idsWithChronicleId.get(0));
        }
        // ... copy data, contents and version-related attributes
    }
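The substitution step can be exercised in isolation. Below is a self-contained sketch in which plain strings stand in for IDfId values (the ids and the helper name are made up for illustration):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ChronicleSubstitution {

    // Maps a source id to the id that replaces it in the copies; here the
    // deleted root's id is mapped to the oldest surviving version.
    static Map<String, String> substitutions(String chronicleId, List<String> existingVersionIds) {
        Map<String, String> map = new HashMap<>();
        if (!existingVersionIds.contains(chronicleId)) {
            // root version was deleted: the oldest existing version
            // (first in the creation-date-ordered list) plays its role
            map.put(chronicleId, existingVersionIds.get(0));
        }
        return map;
    }

    public static void main(String[] args) {
        // the root "0900000000000001" was deleted; two later versions survive
        List<String> existing = Arrays.asList("0900000000000002", "0900000000000003");
        Map<String, String> map = substitutions("0900000000000001", existing);
        System.out.println(map.get("0900000000000001")); // prints 0900000000000002
    }
}
```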

I also use this map-based approach for exact replication of virtual documents, and of objects together with all their parent folder paths, whereby existing paths are reused and missing paths are created.

Friday, January 13, 2017

Saving the current Git commit hash in the manifest

Sometimes it can be quite handy to save the commit hash in the manifest of the built jar. If development is intense and many successive versions of a jar are installed in many locations, it might be difficult to determine which commit corresponds to the code in a particular jar, unless the project version, and thereby the filename, is updated with each commit. A commit hash included in the jar allows easy recovery of the corresponding source code.

The Build Number Maven Plugin generates Git-related properties such as the commit hash or branch. Those properties can then be saved to a file with the help of another plugin. The following lines in pom.xml save the Git hash into MANIFEST.MF:

    <scm>
        <connection>scm:git:https://test@test.git.beanstalkapp.com/test.git</connection>
        <developerConnection>scm:git:https://test@test.git.beanstalkapp.com/test.git</developerConnection>
        <tag>HEAD</tag>
    </scm>
    <build>
        <plugins>             
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>buildnumber-maven-plugin</artifactId>
                <version>1.3</version>
                <executions>
                    <execution>
                        <phase>validate</phase>
                        <goals>
                            <goal>create</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <doCheck>false</doCheck>
                    <doUpdate>false</doUpdate>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>2.6</version>
                <configuration>                  
                    <archive> 
                        <manifest>
                            <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
                        </manifest>
                        <manifestEntries>
                            <Commit>${buildNumber}</Commit>
                            <Branch>${scmBranch}</Branch>
                            <Build-Time>${maven.build.timestamp}</Build-Time>
                        </manifestEntries> 
                    </archive>                  
                </configuration>
            </plugin>   
        </plugins>
    </build>

The manifests in jars built with these settings will contain attributes that facilitate debugging: commit hash, branch and build time.
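At runtime the added entries can be read back with java.util.jar. A minimal sketch (the manifest text below is built in memory instead of being loaded from a jar, and the values are made up):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.jar.Manifest;

public class ManifestReadDemo {

    // Reads the custom Commit entry written by maven-jar-plugin's manifestEntries
    static String commitOf(String manifestText) {
        try {
            Manifest manifest = new Manifest(new ByteArrayInputStream(manifestText.getBytes("UTF-8")));
            return manifest.getMainAttributes().getValue("Commit");
        } catch (IOException ex) {
            return null;
        }
    }

    public static void main(String[] args) {
        // a manifest as maven-jar-plugin would produce it
        String text = "Manifest-Version: 1.0\r\nCommit: 1a2b3c4\r\nBranch: master\r\n\r\n";
        System.out.println(commitOf(text)); // prints 1a2b3c4
    }
}
```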

Invoking the dm_method from an action plugin to access the manifest of the jar encoding the method

Suppose a jar implementing some functionality is placed somewhere in WEB-INF/lib of a war deployed on some server, and the jar needs to know the contents of its own MANIFEST.MF. Standard methods such as Class.getResource("/META-INF/MANIFEST.MF") will return the right URL only by chance; usually they return the manifest of whatever file the classloader loaded first.

To illustrate how the right manifest can be recovered, I will consider a setup where a dm_method is installed as a jar on the Documentum Java Method Server. This jar is supposed to implement an important method. Additionally, administrators need easy access to its description contained in the manifest, e.g. the build number or time. For example, this information can be displayed in a popup invoked by clicking a custom menu item in D2.

When a method jar depends on other classes in the classpath, it should be placed in WEB-INF/lib of DmMethods.war. Some people put method jars into dba/java_methods; that works only if the jars have no dependencies in the classpath.

The following class loads the manifest specifically from the hosting jar. Obviously, the manifest attributes should be filtered so that only pertinent values, such as the commit hash or build time, are returned.

public class ManifestLoader {

    // escaped new line so that new lines can be passed to javascript
    public static String NEW_LINE = "\\n";
    private static ManifestLoader instance = new ManifestLoader();

    public static ManifestLoader getInstance() {
        return instance;
    }

    public String getVersionInfo() {
        Attributes manifestAttrs=loadManifestAttributes();
        StringBuilder sb = new StringBuilder();
        // return all attributes, but normally here should be some filter for the pertinent attributes
        for (Object o : manifestAttrs.keySet()) {
            sb.append(o + ": " + manifestAttrs.get(o) + NEW_LINE);
        }
        return sb.toString();
    }
    
    Attributes manifestAttrs;

    Attributes loadManifestAttributes() {
        // load manifest only once per runtime
        if (manifestAttrs == null) {
            manifestAttrs = getPersonalManifestInJBoss().getMainAttributes();
        }
        return manifestAttrs;
    }

    Manifest getPersonalManifestInJBoss() { // works only in JBOSS
        Manifest manifest = new Manifest();
        try {
            // determine the url of this jar
            URL thisJarUrl = getClass().getResource(getClass().getSimpleName() + ".class");
            // convert the url into the filename
            String jarFileName = thisJarUrl.toString().replaceFirst("vfs:/", "jar:file:/");
            String jarExtension = ".jar/";
            jarFileName = jarFileName.substring(0, jarFileName.indexOf(jarExtension) + jarExtension.length()) + "!/";
            // open the jar to extract its contents
            URL jarUrl = new URL(jarFileName);
            JarURLConnection jarConnection = (JarURLConnection) jarUrl.openConnection();
            // here we need only manifest
            manifest = jarConnection.getManifest();
        } catch (IOException ex) {
            // fall back to the empty manifest created above
        }
        return manifest;
    }
}

Below is a simplified class implementing IDmMethod so that the jar can be registered as a Documentum dm_method. Note that, unlike this manifest retrieval, the long-running functions skipped here should be executed asynchronously in a dm_method. When the method is invoked by a D2 custom action plugin, it returns the contents of the manifest as an error message. Throwing an exception is the only way to pass a message from the invoked dm_method back to the invoking DQL statement; in the DQL result collection the message is stored in the error_message attribute.

public class InvokeMethodFromPlugin implements IDmMethod {

    public static final String INFO_KEY = "info";

    public void execute(Map params, OutputStream out) throws Exception {
        String[] infos = (String[]) params.get(INFO_KEY);
        if (infos != null) {
            throw new RuntimeException(ManifestLoader.getInstance().getVersionInfo());
        }
    }
}
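The exception-as-message trick can be shown in isolation, stripped of all Documentum machinery (the class and payload below are purely illustrative; in the real setup the Content Server stores the thrown message in the collection's error_message attribute):

```java
public class MessageViaException {

    // the "method": its only channel back to the caller is a thrown exception
    static void execute() {
        throw new RuntimeException("Commit: 1a2b3c4");
    }

    // the "caller": recovers the payload from the error message
    static String invoke() {
        try {
            execute();
            return null;
        } catch (RuntimeException ex) {
            return ex.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(invoke()); // prints Commit: 1a2b3c4
    }
}
```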

Last, let's consider a simplified D2 action plugin that invokes the method and relays the message to D2 where it can be displayed in javascript alert.

public class LaunchMethodPlugin implements IPluginAction, ID2fsPlugin {

    public static final String INFO_KEY = "info";

    public List<Attribute> getInfo(D2fsContext context) throws D2fsException, DfException {
        IDfSession curSession = context.getSession();
        String dql = "EXECUTE do_method WITH METHOD='MethodName', ARGUMENTS='-" + INFO_KEY + " true'";
        String msg = invokeMethod(dql, curSession);
        List<Attribute> result = new ArrayList<Attribute>();
        result.add(AttributeUtils.createAttribute("result", msg));
        return result;
    }

    String invokeMethod(String dql, IDfSession session) throws DfException {
        String msg = null;
        IDfCollection col = null;
        try {
            col = new DfQuery(dql).execute(session, DfQuery.DF_EXEC_QUERY);
            while (col.next()) {
                msg = col.getString("error_message");
            }
        } finally {
            if (col != null) {
                col.close();
            }
        }
        return msg;
    }
}

Monday, January 9, 2017

Decompiling jars obfuscated with AspectJ (e.g. D2FS4DCTM-WEB-4.5.0.jar or dfc.jar)

It is much easier to develop dfc.jar-based applications when the dfc source code is available. Unless a jar is deliberately obfuscated, it can be easily decompiled. Unfortunately, most methods in dfc.jar include AspectJ expressions that generate logging. During compilation AspectJ introduces lots of artificial try-catch blocks, if conditions, and synthetic methods and classes, which roughly triples the size of the source code and effectively obfuscates it. In the decompiled code, all AspectJ constructs can be eliminated with a simple Java application. Unfortunately, with most decompilers, some methods, particularly synchronized ones or those containing synchronized blocks, are transformed by the AspectJ compiler into byte code so complex that ordinary decompilers fail to decompile them.
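What such a cleanup application might look like is sketched below. This is only a crude line filter dropping lines that mention the ajc$ markers and JoinPoint machinery AspectJ leaves behind; a real cleaner would also have to handle the multi-line closures and around-body methods discussed further down:

```java
import java.util.Arrays;
import java.util.stream.Collectors;

public class AspectJLineFilter {

    // Removes decompiled lines that reference AspectJ runtime artifacts
    static String strip(String source) {
        return Arrays.stream(source.split("\n"))
                .filter(line -> !line.contains("ajc$"))
                .filter(line -> !line.contains("JoinPoint"))
                .collect(Collectors.joining("\n"));
    }

    public static void main(String[] args) {
        String decompiled = "int x = 1;\n"
                + "JoinPoint joinPoint = Factory.makeJP(ajc$tjp_0, this, this);\n"
                + "return x;";
        // the makeJP line is removed, the real logic remains
        System.out.println(strip(decompiled));
    }
}
```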

Development of D2 listener plugins is also easier when the source code of the D2 services is available. The D2 services are packaged in D2FS4DCTM-WEB-4.5.0.jar. If you try to decompile it, you will notice that in all the service classes the original methods encoding the service logic are missing from the decompiled code; in fact, only artificial AspectJ methods remain visible.

For example, let's look at the decompiled source of the short service class D2DetailsService:

public class D2DetailsService extends D2fsAbstractService implements IDetailsService {

    public static Set<String> s_redirectedRefDetail;

    static {
        s_redirectedRefDetail = new HashSet<String>();
        s_redirectedRefDetail.add("Renditions");
        D2DetailsService.s_redirectedRefDetail.add("Audits");
    }

    @InjectSession(redirectReference = RedirectReferenceType.NONE)
    public DocItems getDetailContent(final Context context, final String id, final String detailName, final List<Attribute> parameters) throws Exception {
        return (DocItems) InjectSessionAspect.aspectOf().process(new D2DetailsService$AjcClosure1(new Object[]{this, context, id, detailName, parameters, Factory.makeJP(D2DetailsService.ajc$tjp_0, (Object) this, (Object) this, new Object[]{context, id, detailName, parameters})}).linkClosureAndJoinPoint(69648));
    }

    public class D2DetailsService$AjcClosure1 extends AroundClosure {

        public D2DetailsService$AjcClosure1(final Object[] array) {
            super(array);
        }

        public Object run(final Object[] array) {
            final Object[] state = super.state;
            return D2DetailsService.getDetailContent_aroundBody0((D2DetailsService) state[0], (Context) state[1], (String) state[2], (String) state[3], (List) state[4], (JoinPoint) state[5]);
        }
    }

    public static ID2Detail getD2DetailInstance(final Context context, String detailName) throws Exception {
        ID2Detail result = null;
        Class detailClass = null;
        try {
            detailName = StringUtil.getJavaName(detailName);
            detailClass = Class.forName(String.valueOf(ID2Detail.class.getPackage().getName()) + '.' + detailName);
            result = detailClass.newInstance();
        } catch (ClassNotFoundException ex) {
        }
        return result;
    }
}

The @InjectSession annotation marks methods as targets for transformation by AspectJ. The annotated method getDetailContent is indeed totally twisted by AspectJ: the original method is replaced by a substitute that invokes a bizarre inner class, which in turn calls getDetailContent_aroundBody0 containing the slightly mutilated code of the original getDetailContent method. The problem is that the synthetic getDetailContent_aroundBody0 is missing from the decompiled code. Try any decompiler if you doubt this phenomenon.

The best decompiler for obfuscated and crippled Java classes is cfr. It is an extraordinary tool that decompiles everything. However, some manual editing is often necessary for the methods, particularly those containing many blocks, that other decompilers fail to decompile. Let's see what cfr can recover from the D2DetailsService class that was partially decompiled above.

public class D2DetailsService extends D2fsAbstractService implements IDetailsService {

    public static Set<String> s_redirectedRefDetail;

    static {
        s_redirectedRefDetail = new HashSet<String>();
        s_redirectedRefDetail.add("Renditions");
        s_redirectedRefDetail.add("Audits");
    }

    @InjectSession(redirectReference = RedirectReferenceType.NONE)
    public DocItems getDetailContent(Context context, String id, String detailName, List<Attribute> parameters) throws Exception {
        Context context2 = context;
        String string = id;
        String string2 = detailName;
        List<Attribute> list = parameters;
        Object[] arrobject = new Object[]{context2, string, string2, list};
        JoinPoint joinPoint = Factory.makeJP((JoinPoint.StaticPart) ajc$tjp_0, (Object) this, (Object) this, (Object[]) arrobject);
        Object[] arrobject2 = new Object[]{this, context2, string, string2, list, joinPoint};
        return (DocItems) InjectSessionAspect.aspectOf().process(new D2DetailsService$AjcClosure1(arrobject2).linkClosureAndJoinPoint(69648));
    }

    public class D2DetailsService$AjcClosure1 extends AroundClosure {

        public D2DetailsService$AjcClosure1(final Object[] array) {
            super(array);
        }

        public Object run(final Object[] array) {
            final Object[] state = super.state;
            return D2DetailsService.getDetailContent_aroundBody0((D2DetailsService) state[0], (Context) state[1], (String) state[2], (String) state[3], (List) state[4], (JoinPoint) state[5]);
        }
    }

    public static ID2Detail getD2DetailInstance(Context context, String detailName) throws Exception {
        ID2Detail result;
        result = null;
        Class detailClass = null;
        try {
            detailName = StringUtil.getJavaName((String) detailName);
            detailClass = Class.forName(String.valueOf(ID2Detail.class.getPackage().getName()) + '.' + detailName);
            result = (ID2Detail) detailClass.newInstance();
        } catch (ClassNotFoundException classNotFoundException) {
        }
        return result;
    }

    static final /* synthetic */ DocItems getDetailContent_aroundBody0(D2DetailsService ajc$this, Context context, String id, String detailName, List parameters, JoinPoint joinPoint) {
        ID2Detail detailInstance;
        DocItems result;
        D2fsContext d2fsContext;
        result = new DocItems();
        d2fsContext = (D2fsContext) context;
        d2fsContext.setParameterParser(parameters);
        if (id != null) {
            d2fsContext.getParameterParser().setParameter("id", (Object) id);
        }
        if (detailName != null && s_redirectedRefDetail.contains(detailName) && !d2fsContext.getParameterParser().getBooleanParameter("redirectedReference", false)) {
            D2fsContext sourceContext = null;
            try {
                sourceContext = ReferenceUtils.getSourceContext(d2fsContext, true);
                if (sourceContext != null) {
                    IDfId sourceId = sourceContext.getFirstId();
                    DocItems docItems = new D2DetailsService().getDetailContent((Context) sourceContext, sourceId.toString(), detailName, parameters);
                    return docItems;
                }
            } catch (Exception exception) {
                if (result.getUpperItem() == null) {
                    ContentBuilder.addUpperItem(result, d2fsContext, id, detailName, null);
                }
                DocItems docItems = result;
                return docItems;
            } finally {
                if (sourceContext != null) {
                    sourceContext.release(false);
                }
            }
        }
        if ((detailInstance = D2DetailsService.getD2DetailInstance(context, detailName)) != null) {
            result = detailInstance.getDetailContent(d2fsContext, id);
        }
        if (result.getUpperItem() == null) {
            ContentBuilder.addUpperItem(result, d2fsContext, id, detailName, null);
        }
        return result;
    }
}

In addition to code analogous to what we saw above, we get the nicely decompiled synthetic getDetailContent_aroundBody0 method, which essentially contains the untouched code of the original getDetailContent method. Unlike the original, however, it carries an irrelevant extra argument ajc$this added by AspectJ.

To sum up, if you develop Documentum applications based on dfc.jar, or if you develop plugins for D2, the cfr decompiler is a must-have tool!

Thursday, January 5, 2017

Using an external dfc.properties file

Normally, dfc.jar-based client applications load dfc.properties from the classpath. That usually means the file has to be included in the application archive; for example, when D2 or DA is deployed to a server, dfc.properties in WEB-INF/classes of the exploded web application has to be adjusted.

However, dfc.jar allows specifying the path to an external dfc.properties file to be used instead of the internal one. The path should be assigned to the system property dfc.properties.file before any dfc.jar method is invoked.

    final static String DFC_PROPERTIES_FILE_NAME = "settings/dfc.properties";

    void checkIfExternalDFCPropertiesFileIsAvailable() {
        // check whether the external dfc.properties exists
        File dfcPropertiesFile = new File(DFC_PROPERTIES_FILE_NAME);
        if (dfcPropertiesFile.exists()) {
            // dfc.properties.file system property points to the file that is to be used by dfc.jar
            System.setProperty("dfc.properties.file", dfcPropertiesFile.getAbsolutePath());
            logger.debug("dfc.properties at {} will be loaded", dfcPropertiesFile.getAbsolutePath());
        } else {
            logger.debug("dfc.properties from the classpath will be used");
        }
    }

Tuesday, January 3, 2017

Multithreaded access to Documentum with DFC

Execution of some tasks, e.g. tasks involving communication over TCP, can be faster when multiple tasks are executed simultaneously. Suppose an application has to make 250 calls to a web server that supports concurrent users. Making the 250 calls at the same time will take less time than making them one after another. The example below mimics such an application and compares the time required to download the yahoo.com web page 250 times in a row with the time needed to download it as many times using 10 threads.

The MyConcurrency class contains the main method. It performs 250 sequential calls followed by 250 parallel calls and then calculates how much faster the concurrent execution is. These two steps are repeated five times to get a reliable average.

public class MyConcurrency {
    int callNumber;
    int numberOfThreads;

    public MyConcurrency(int callNumber, int numberOfThreads) {
        this.numberOfThreads = numberOfThreads;
        this.callNumber = callNumber;
    }

    public static void main(String... args) throws DfException, InterruptedException, ExecutionException {
        int callNumber = 250; 
        int numberOfThreads = 10; // number of threads used to execute the number of calls defined on the line above
        for (int c = 0; c < 5; c++) {
            MyConcurrency i = new MyConcurrency(callNumber, numberOfThreads);
            i.executeBothSequentialAndConcurrentCalls(new MyHTTPRequest());       
        }
    }

    public double executeBothSequentialAndConcurrentCalls(MyRequest req) throws DfException, InterruptedException, ExecutionException {

        // measure how much time it takes to execute calls sequentially 
        long begin = System.currentTimeMillis();
        sequential(req);
        long end = System.currentTimeMillis();
        double sequentialDuration = end - begin;

        // measure how much time it takes to make calls simultaneously 
        begin = System.currentTimeMillis();
        concurrent(req);
        end = System.currentTimeMillis();
        double concurrentDuration = end - begin;
        double ratio = sequentialDuration / concurrentDuration;
        System.out.println("ratio total duration sequential/concurrent=" + ratio);
        return ratio;
    }

    // executes calls one by one
    List<Long> sequential(MyRequest req) throws DfException, InterruptedException {
        List<Long> durations = new ArrayList<>(); 
        for (int i = 0; i < callNumber; i++) {
            durations.add(req.request("S", i));
        }
        return durations; // individual durations are not discussed here to simplify the text
    }

    // tries to execute all call simultaneously
    List<Long> concurrent(final MyRequest req) throws InterruptedException, ExecutionException {
        final List<Callable<Long>> partitions = new ArrayList<>();
        for (int i = 0; i < callNumber; i++) {
            final int j = i;
            partitions.add(new Callable<Long>() {
                @Override
                public Long call() throws Exception {
                    return req.request("C", j);
                }
            });
        }

        ExecutorService executorPool = Executors.newFixedThreadPool(numberOfThreads);
        List<Future<Long>> results = executorPool.invokeAll(partitions);
        // a list storing execution time of each single task, in parallel execution some threads have to wait long time for available resources
        List<Long> durations = new ArrayList<>();
        for (Future<Long> r : results) {
            durations.add(r.get());
        }
        executorPool.shutdown();
        return durations; // individual durations are not discussed here to simplify the text
    }
}

Abstract class MyRequest will serve as the interface for subclasses implementing calls to websites or documentum.

public abstract class MyRequest {

    // execute a call to documentum or website, return the spent time
    long request(String objectNamePrefix, Integer objectNumber ) throws DfException, InterruptedException {
        long start = System.currentTimeMillis();

        execute(objectNamePrefix, objectNumber );
        long end = System.currentTimeMillis();

        return end - start;
    }

    abstract void execute(String objectNamePrefix, Integer objectNumber ) throws DfException, InterruptedException;
}

The subclass downloading yahoo website.

public class MyHTTPRequest extends MyRequest {

    @Override
    void execute(String objectNamePrefix, Integer objectNumber) throws DfException, InterruptedException {
        // download a webpage to see how concurrency should work
        downloadWebsite();
    }
    
    // do something that can be executed faster concurrently 
    void downloadWebsite() {
        try {
            final URL url = new URL("http://www.yahoo.com");
            BufferedReader reader = null;
            String l;

            reader = new BufferedReader(new InputStreamReader(url.openConnection().getInputStream()));
            while ((l = reader.readLine()) != null) {
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}

As expected, the output of the example above demonstrates that downloading yahoo 250 times is on average 8 times faster when executed in 10 concurrent threads.
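The same pattern can be reproduced without the network: replacing the download with a short sleep already shows the effect. A self-contained sketch (call counts, sleep duration and thread counts below are arbitrary):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SleepConcurrencyDemo {

    // Runs the given number of 50 ms "calls" on the given number of threads
    // and returns the total wall-clock time in milliseconds.
    static long run(int calls, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Callable<Void>> tasks = new ArrayList<>();
        for (int i = 0; i < calls; i++) {
            tasks.add(() -> { Thread.sleep(50); return null; });
        }
        long begin = System.currentTimeMillis();
        try {
            pool.invokeAll(tasks); // blocks until all tasks have completed
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
        }
        pool.shutdown();
        return System.currentTimeMillis() - begin;
    }

    public static void main(String[] args) {
        long sequential = run(20, 1);  // one thread: roughly 20 * 50 ms
        long concurrent = run(20, 10); // ten threads: roughly 2 * 50 ms
        System.out.println("sequential=" + sequential + "ms concurrent=" + concurrent + "ms");
    }
}
```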

What HTTP requests and DFC calls to Documentum have in common is that the communication occurs over TCP. So let's adapt the example above to evaluate whether multithreading could speed up an application interacting with a Documentum repository. To compare serial and parallel calls to Documentum, the main method has to be further elaborated and some auxiliary methods added:

public static void main(String... args) throws DfException, InterruptedException, ExecutionException {
        int callNumber = 250;
        int numberOfThreads = 10;
        for (int c = 0; c < 5; c++) {
            System.out.println(">>>Http Control ");
            MyConcurrency i = new MyConcurrency(callNumber, numberOfThreads);
            i.executeBothSequentialAndConcurrentCalls(new MyHTTPRequest());

            System.out.println(">>> Documentum operations using the same session");
            String targetFolderId = createNewFolder(); // all objects will be linked to this folder
            IDfSession commonSession = getNewSession();
            i.executeBothSequentialAndConcurrentCalls(new MyDocumentumRequestUsingOneSession(targetFolderId, commonSession));
            commonSession.disconnect();

            System.out.println(">>> Documentum operations using sessions from pool");
            int numberOfSessionsInPool = numberOfThreads;
            BlockingQueue<IDfSession> sessionPool = createSessionPool(numberOfSessionsInPool);
            i.executeBothSequentialAndConcurrentCalls(new MyDocumentumRequestUsingSessionsFromPool(targetFolderId, sessionPool));
            releaseSessionsInPool(sessionPool);

            System.out.println(">>> Documentum operations using a new session for each call");
            i.executeBothSequentialAndConcurrentCalls(new MyDocumentumRequestUsingNewSessionEachTime(targetFolderId));
        }
    }

    // creates a folder to contain created documentum objects
    static String createNewFolder() throws DfException {
        IDfSession session = getNewSession();
        IDfFolder folder = (IDfFolder) session.newObject("dm_folder");
        folder.setObjectName("Test" + System.currentTimeMillis());
        folder.link(baseFolderPath);
        folder.save();
        String targetFolderId = folder.getObjectId().getId();
        session.disconnect();
        return targetFolderId;
    }

    static BlockingQueue<IDfSession> createSessionPool(int numberOfSessionsInPool) throws DfException, InterruptedException {
        BlockingQueue<IDfSession> sessionPool = new LinkedBlockingQueue<>();
        for (int i = 0; i < numberOfSessionsInPool; i++) {
            sessionPool.put(getNewSession());
        }
        return sessionPool;
    }

    static void releaseSessionsInPool(BlockingQueue<IDfSession> sessionPool) throws DfException {
        try {
            while (true) {
                IDfSession session = sessionPool.remove();
                session.disconnect();
            }
        } catch (NoSuchElementException ex) {
            // all sessions released
        }
    }

    static IDfSession getNewSession() throws DfException {
        return connection.ConnectionFactory.getSession();
    }

    static String baseFolderPath = "/Formation";

Now, serial and concurrent calls to a Documentum repository will be compared when Documentum sessions are obtained in three different ways:

  • the same one session will be used for all the calls
  • each call will wait for an available free session in a pool of premade sessions (the size of the pool equals the number of threads)
  • each call will create a new personal session

The first option is implemented in the class below:

public class MyDocumentumRequestUsingOneSession extends MyRequest {

    String targetFolderId;
    IDfSession commonSession;

    MyDocumentumRequestUsingOneSession(String targetFolderId, IDfSession commonSession) {
        this.targetFolderId = targetFolderId;
        this.commonSession = commonSession;
    }

    @Override
    void execute(String objectNamePrefix, Integer objectNumber ) throws DfException, InterruptedException {
        accessDocumentum(objectNamePrefix, objectNumber, commonSession);
    }

    // do anything in a docbase
    void accessDocumentum(String objectNamePrefix, int objectNumber, IDfSession session) throws DfException {
        String objectName =objectNamePrefix + System.currentTimeMillis() + "; object=" + objectNumber + "; thread=" + Thread.currentThread().getName() + "; session=" + session.getSessionId();
        IDfSysObject targetObject = (IDfSysObject) session.newObject("dm_document");
        targetObject.setObjectName(objectName);
        targetObject.link(targetFolderId);
        targetObject.save();
        targetObject.fetch(null);
        targetObject.setTitle("current time " + new Date().getTime());
        targetObject.save();
    }
}

The second case is implemented by the class below:

public class MyDocumentumRequestUsingSessionsFromPool extends MyDocumentumRequestUsingOneSession {

    BlockingQueue<IDfSession> sessionPool;

    public MyDocumentumRequestUsingSessionsFromPool( String targetFolderId, BlockingQueue<IDfSession> sessionPool) {
        super(targetFolderId, null);
        this.sessionPool = sessionPool;
    }

    @Override
    void execute(String objectNamePrefix, Integer objectNumber) throws DfException, InterruptedException {
        IDfSession sessionFromPool = getNextPooledSession();
        // do something timeconsuming in the docbase
        accessDocumentum(objectNamePrefix, objectNumber, sessionFromPool);
        returnSessionIntoPool(sessionFromPool);

    }

    IDfSession getNextPooledSession() throws InterruptedException {
        return sessionPool.take();
    }

    void returnSessionIntoPool(IDfSession session) throws InterruptedException {
        sessionPool.put(session);
    }
}
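The take/put cycle above does not depend on DFC at all; a minimal plain-Java sketch (with strings standing in for IDfSession instances, purely for illustration) shows the same pooling pattern:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PoolSketch {
    public static void main(String[] args) throws InterruptedException {
        // pre-made "sessions" (plain strings stand in for IDfSession here)
        BlockingQueue<String> pool = new ArrayBlockingQueue<>(2);
        pool.put("session-1");
        pool.put("session-2");

        // take() blocks until a session is free; put() returns it to the pool
        String s = pool.take();
        System.out.println("using " + s);
        pool.put(s);
        System.out.println("pool size after return: " + pool.size());
    }
}
```

If all sessions are taken, the next take() simply blocks until another thread returns one, which is exactly the back-pressure behavior the pooled variant relies on.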

And creating a new session for every call to Documentum is implemented as follows:

public class MyDocumentumRequestUsingNewSessionEachTime extends MyDocumentumRequestUsingOneSession {

    public MyDocumentumRequestUsingNewSessionEachTime(String targetFolderId) {
        super(targetFolderId, null);
    }

    @Override
    void execute(String objectNamePrefix, Integer objectNumber) throws DfException, InterruptedException {
        IDfSession newSession = getNewSession();
        // do something time-consuming in the docbase
        accessDocumentum(objectNamePrefix, objectNumber, newSession);
        newSession.disconnect();
    }

    IDfSession getNewSession() throws DfException {
        return connection.ConnectionFactory.getSession();
    }
}

Results from 10 consecutive executions of the three options listed above:

I tested the application with two different Documentum servers. The absolute times and the ratios varied, but the pattern was the same. The results above were obtained with the Documentum 7.2 developer edition image, which has two processors and little memory. With a production-quality server, the maximal acceleration the application demonstrated was over 3-fold.

Using the same session gives no performance advantage for concurrent calls. This is expected, because all DFC methods interacting with Documentum are synchronized (the exception is subclasses of IDfOperation: if you try to execute, for example, IDfCopyOperation concurrently, at best you will receive an error, and usually the docbase strangely stops without leaving anything in the logs). So even if many threads share the same session, they mostly wait until the session object's lock is released.

In contrast, concurrent use of sessions from a pool and creating a new session every time both noticeably speed up the execution. Clearly, the fastest execution is achieved by using multiple premade pooled sessions.
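The contention argument can be illustrated with a toy benchmark unrelated to DFC: a hypothetical Session class whose synchronized method simulates a docbase round-trip. Two threads sharing one instance serialize on its lock, while two separate instances run in parallel:

```java
public class SharedLockDemo {

    // stand-in for a DFC session: every call holds the object's monitor
    static class Session {
        synchronized void call() throws InterruptedException {
            Thread.sleep(200); // simulate a docbase round-trip
        }
    }

    // run two concurrent calls and measure the elapsed wall-clock time in ms
    static long timeTwoCalls(Session a, Session b) throws InterruptedException {
        long start = System.nanoTime();
        Thread t1 = new Thread(() -> { try { a.call(); } catch (InterruptedException ignored) { } });
        Thread t2 = new Thread(() -> { try { b.call(); } catch (InterruptedException ignored) { } });
        t1.start(); t2.start();
        t1.join(); t2.join();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws InterruptedException {
        Session shared = new Session();
        long sharedMs = timeTwoCalls(shared, shared);                  // calls serialize: roughly 400 ms
        long separateMs = timeTwoCalls(new Session(), new Session());  // truly parallel: roughly 200 ms
        System.out.println("shared session is slower: " + (sharedMs > separateMs));
    }
}
```

The shared run takes roughly twice as long, which mirrors why sharing one DFC session across threads buys nothing.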

Using custom subclasses of DfSysObject

Sometimes it might be desirable to replace IDfSysObject with a subclass in which some methods are disabled or replaced by custom logic.

Suppose a method has a documentum object as an input argument. The method directly and indirectly passes the object to many other methods, each of which invokes some methods exposed by the IDfSysObject interface. You want all methods to execute successfully but, depending on the settings or an input argument, e.g. saveEnabled, you do not want linking, saving, checking out or deleting the input object to occur.

One solution would be to add an additional boolean argument saveEnabled to all the involved methods and insert if blocks before each invocation of the save, link, checkout or destroy methods on the input object.

A less intrusive solution is to use as the argument an instance of a DfSysObject subclass in which specifically the link, destroy, checkout and save methods are empty and do nothing. Then you need to add only one if condition verifying whether saving is enabled, and all the involved methods remain unaltered. You can start with a method optionally generating such objects:

  IDfSysObject createNewObject(IDfSession session, String objectType, boolean saveEnabled) throws DfException {
    IDfSysObject obj;
    if (saveEnabled) {
      obj = (IDfSysObject) session.newObject(objectType);
    } else {
      obj = newCustomObject(session, objectType);
    }
    return obj;
  }

If the saveEnabled argument is true, the method produces a regular IDfSysObject; otherwise it produces an instance of the custom subclass with some methods disabled.

Custom IDfSysObjects are produced by the following method:

  IDfSysObject newCustomObject(IDfSession sessionIDf, String typeName) throws DfException {
    ISession session = (ISession) sessionIDf;

    ILiteType type = session.getLiteType(typeName, null);

    IDfId objectId = session.getDocbase().getObjectIdManager().getNextId(session, type);
    ITypedData typedData = new TypedData(type, objectId);

    return makeCustomObject(session, typedData);
  }

  IDfSysObject makeCustomObject(ISession session, ITypedData typedData) throws DfException {
    DfSysObject object = new CustomDfSysObject();
    object.initialize(session.getObjectManager().getObjectFactory(), typedData, session, session, true);
    return object;
  }

Objects produced this way are fully valid and capable, and retain their connection to the docbase.

Finally, the custom subclass CustomDfSysObject has to be defined:

// it's up to you which methods to override
public class CustomDfSysObject extends DfSysObject {

  Logger logger = LoggerFactory.getLogger(getClass().getName());

  @Override
  public void save() throws DfException {
    logger.info("fake save");
  }

  @Override
  public IDfId checkoutEx(final String versionLabel, final String compoundArchValue, final String specialAppValue) throws DfException {
    logger.info("fake checkout");
    return DfId.DF_NULLID;
  }

  @Override
  public void destroy() throws DfException {
    logger.info("fake destroy");
  }

  @Override
  public void link(String folderSpec) throws DfException {
    logger.info("fake link");
  }
}

The objects generated as described above are empty. If you need an object filled with data, i.e. an exact copy of an object stored in the docbase, you can use a similar approach whereby the data is loaded from the docbase.

  IDfSysObject getCustomObject(IDfSession sessionIDf, String objectId) throws DfException {
    ISession session = (ISession) sessionIDf;
    final ITypedData data = session.getDataManager().getData(new DfId(objectId), new DfGetObjectOptions(), true, false);
    return makeCustomObject(session, data);
  }

Now let's test the generated objects:

public class CustomObjectFactoryTest {

  @BeforeClass
  public static void setUp() throws DfException {
    session = ConnectionFactory.getSession();
  }

  @AfterClass
  public static void tearDown() throws DfException {
    session.disconnect();
  }
  CustomObjectFactory i = new CustomObjectFactory();
  static IDfSession session;

  @Test
  public void test() throws DfException {

    String typeName = "testtype";
    // create an empty object of custom class
    IDfSysObject newCustomObj = i.newCustomObject(session, typeName);
    assertEquals(newCustomObj.getClass().getName(), CustomDfSysObject.class.getName());
    // test that the overridden methods do nothing
    newCustomObj.destroy();
    newCustomObj.save();
    newCustomObj.link(null);
    newCustomObj.checkout();

    // now let's retrieve existing object so that it is instantiated as the custom class
    String existingObjectId = "090f4241800c2507";
    // retrieve regular object
    IDfSysObject existingObj = (IDfSysObject) session.getObject(new DfId(existingObjectId));
    // retrieve custom object
    IDfSysObject existingCustomObj = i.getCustomObject(session, existingObjectId);
    // assert that their types are different
    assertEquals(existingCustomObj.getClass().getName(), CustomDfSysObject.class.getName());
    assertFalse(existingCustomObj.getClass().getName().equals(existingObj.getClass().getName()));
    // assert that their values are the same
    compareAllAttributeValues(existingObj, existingCustomObj);
    // test that the overridden methods do nothing
    existingCustomObj.destroy();
    existingCustomObj.save();
    existingCustomObj.link(null);
    existingCustomObj.checkout();

    // proof that the custom objects retain connection to the docbase
    String newName = "NewName " + System.nanoTime();
    existingObj.setObjectName(newName);
    existingObj.save();
    assertFalse(existingCustomObj.getObjectName().equals(newName));
    existingCustomObj.fetch(null);
    assertTrue(existingCustomObj.getObjectName().equals(newName));
  }

  void compareAllAttributeValues(IDfSysObject obj1, IDfSysObject obj2) throws DfException {
    for (int i = 0; i < obj1.getAttrCount(); i++) {
      IDfAttr attr = obj1.getAttr(i);
      String attrName = attr.getName();
      String attrValue1, attrValue2;

      if (attr.isRepeating()) {
        attrValue1 = obj1.getAllRepeatingStrings(attrName, "|");
        attrValue2 = obj2.getAllRepeatingStrings(attrName, "|");
      } else {
        attrValue1 = obj1.getString(attrName);
        attrValue2 = obj2.getString(attrName);
      }
      assertEquals(attrValue1, attrValue2);
    }
  }
}

The techniques described above allow optionally generating subclasses of DfSysObject. If you wanted to always produce instances of some class for a particular documentum type, the solution would be even simpler. There is a registry mapping all the documentum types to the corresponding java classes. You would register your custom class as the java class corresponding to the particular documentum type, so that documentum instantiates your class whenever an object of that type is created or loaded from the docbase.
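The registration idea can be sketched with a simple map-based registry (the real DFC registry is configured rather than coded, so the classes and map below are purely hypothetical stand-ins):

```java
import java.util.HashMap;
import java.util.Map;

public class TypeRegistrySketch {
    // hypothetical stand-ins for DfSysObject and a custom subclass
    static class BaseObject { }
    static class CustomObject extends BaseObject { }

    // maps documentum type names to the java classes to instantiate
    static final Map<String, Class<? extends BaseObject>> REGISTRY = new HashMap<>();

    static BaseObject newInstance(String typeName) throws ReflectiveOperationException {
        // fall back to the base class when no custom class is registered
        Class<? extends BaseObject> cls = REGISTRY.getOrDefault(typeName, BaseObject.class);
        return cls.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws ReflectiveOperationException {
        REGISTRY.put("testtype", CustomObject.class);
        System.out.println(newInstance("testtype").getClass().getSimpleName());
        System.out.println(newInstance("dm_document").getClass().getSimpleName());
    }
}
```

Once "testtype" is registered, every lookup for it yields the custom class, while unregistered types fall back to the default.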

Documentum D2 custom action plugins

The administrators of D2 can create custom menu items invoking your custom service classes. Menu items are created and set up in D2 config.

Invoking a native D2 service
But first, let's consider how a native D2 service can be invoked. Suppose we want to display the value of some attribute of the selected object. For this, the getProperties method of the D2 PropertyService can be employed. The target property has to be specified in the message field. The returned value can be passed to some javascript function, for example, to be displayed in a javascript alert popup.
Invoking a custom service class

Custom service classes must implement the marker interface IPluginAction, and their methods must have the common signature:

public class ActionServiceTemplate implements IPluginAction {

    public List<Attribute> someMethods(D2fsContext d2context) throws Exception {
        ParameterParser d2parameterparser = d2context.getParameterParser();

        // all the parameters received by method
        for (Attribute a : d2parameterparser.getParameters()) {
            System.out.println("paramName/value: " + a.getName() + " " + a.getValue());
        }

        // shortcut method to access the selected object id
        IDfId objectId = d2context.getFirstId();

        // ParameterParser method to retrieve values of received attributes
        String contentType = d2parameterparser.getStringParameter("aContentType");
        String containingFolderObjectId = d2parameterparser.getStringParameter("parentId");

        // the method returns a list containing an arbitrary number of key-value pairs
        List<Attribute> result = new ArrayList<>();
        result.add(AttributeUtils.createAttribute("result", "test"));
        return result;
    }
}

Depending on the menu item clicked, the D2 user interface sends various named values to the method (e.g. aContentType, parentId). The selected object id is always included. The method has only one argument, of type D2fsContext. This type includes a ParameterParser, which is a container for the list of name-value pairs, each enclosed in an Attribute instance holding a name and a value. The values can be accessed directly using the ParameterParser method getStringParameter. Some additional examples of how the received values can be used are included in the article on listener plugins.

D2fsContext also contains shortcut methods for directly accessing some parameters, for example getFirstId, which returns the id of the first selected object. The selection might include a single object or multiple objects.

As demonstrated above, the method might optionally return a list of results in the same form, i.e. as Attributes. In the D2 user interface the returned results become available as javascript variables with the same names (for example, above we used alert(object_name)), which can be further processed by javascript.

The example below shows how to create a custom "Copy link to clipboard" action. The standard "Copy link to clipboard" menu item publishes the D2_ACTION_COPY_LINK_IN_CLIPBOARD event, which is processed by a Clipboard service that eventually puts the url of the selected object into the clipboard.

If custom urls are needed, for example with different hostname and some object-dependent parameters, the custom action plugin is the solution.

public class CopyLink implements IPluginAction {

    public List<Attribute> copyLink(D2fsContext d2context) throws DfException, D2fsException, IOException {
        IDfId objectId = d2context.getFirstId();
        
        ParameterParser d2parameterparser = d2context.getParameterParser();
        String contentType = d2parameterparser.getStringParameter("aContentType");
        String url = "https://www.instagram.com/get?id=" + objectId.getId() + "&type=" + contentType;

        List<Attribute> result = new ArrayList<>();
        result.add(AttributeUtils.createAttribute("result", url));
        return result;
    }
}

The setting in D2 config should be adjusted as follows:

The plugin returns the custom url as the javascript variable named result. The result is passed to the native D2 javascript method pasteInClipboard, which in turn calls the applet to put the value into the clipboard.

Action plugin executing javascript before executing a D2 event

Unfortunately, D2 config does not allow executing javascript and then publishing a D2 event to the ajaxHub. After a service executes, either javascript is executed (when JS is selected in the Type list) or an event is published (when EVENT is selected in the Type list). When NATIVE is selected, the service results are ignored.

Suppose a confirmation popup dialog should appear when a user clicks some standard action that publishes an event, for example "Cancel checkout". If the user clicks Yes in the popup, the event is published; if No is selected, nothing happens.

I propose a working workaround that allows executing javascript code before sending a D2 event. When a user triggers an action that normally publishes an event, a custom service is invoked instead. The service does nothing except return javascript that executes arbitrary code, such as displaying the standard confirm popup, and then publishes an arbitrary event directly to the openAjaxHub.

public class Relay implements IPluginAction {

    static String EVENT_NAME = "myActionName";
    static String CONFIRMATION_MESSAGE = "myActionMessage";

    public List<Attribute> relay(D2fsContext d2context) throws DfException, D2fsException, IOException {
        Map<String, String> parametersMap = new HashMap<>();
        ParameterParser pp = d2context.getParameterParser();

        // forward the original openAjax message together with the event to be published
        for (Attribute a : pp.getParameters()) {
            parametersMap.put(a.getName(), a.getValue());
        }
        Parameters parameters = new Parameters(parametersMap);

        String confirmationMessage = pp.getStringParameter(CONFIRMATION_MESSAGE);
        String eventName = pp.getStringParameter(EVENT_NAME);

        // return a string with an immediately-invoked javascript function expression
        String js = "(function(){if(confirm('" + confirmationMessage + "')){var myPluginContainer=new OpenAjax.hub.InlineContainer(managedHub,'myPluginContainer',{Container:{onSecurityAlert:function(){},onConnect:function(){},onDisconnect:function(){}}}); var myPluginContainerClient=new OpenAjax.hub.InlineHubClient({HubClient:{onSecurityAlert:function(){}},InlineHubClient:{container:myPluginContainer}});myPluginContainerClient.connect(function(hubClient,success){if(success){hubClient.publish('" + eventName + "','" + parameters.toString() + "');console.log('connected and sent');managedHub.removeContainer(myPluginContainer);}else{console.log('failed to connect');}});}})()";

        List<Attribute> result = new ArrayList<>();
        result.add(AttributeUtils.createAttribute("result", js));
        return result;
    }
}

The plugin needs two input parameters: the message for the confirm dialog and the event to publish. The script returned by the service is executed by the eval function. Now, when a user clicks "Cancel checkout", he will additionally have to press OK in the confirm popup. If the user clicks Cancel, nothing happens.

Action plugin handling multiple selected objects

Action plugins can handle multiple selections. The class template below demonstrates how to access all the selected objects using the shortcut method getObject. The class additionally demonstrates that a plugin does not need to return any results. To set up a plugin that does not return any values, the option NATIVE can be selected in the D2 config field Type.

public class WorkflowActions implements IPluginAction {

    public List<Attribute> startWorkflows(D2fsContext d2context) throws DfException, D2fsException {

        // Loop through all the selected documents
        int i = d2context.getObjectCount();

        for (int j = 0; j < i; j++) {
            IDfSysObject obj = (IDfSysObject) d2context.getObject(j);
            // do something to the object, for example start some workflow
            startWorkflow(obj);
        }

        return new ArrayList<>();
    }
    // a very oversimplified method starting a workflow on the input object
    void startWorkflow(IDfSysObject obj) throws DfException {
        String workflowName = obj.getTypeName() + " Workflow";
        D2SdkWorkflowLauncher workflowLauncher = new D2SdkWorkflowLauncher(obj.getSession(), workflowName);
        IDfWorkflow workflow = workflowLauncher.startWorkflow(obj, workflowName);
        workflow.setSupervisorName(obj.getSession().getLoginUserName());
    }
}

Unit testing Documentum client application using mock IDfSysObjects or serialized IDfSysObjects detached from documentum session

There are two easy options to test methods that take IDfSysObject arguments without creating a connection to the docbase: creating mock objects that implement the DFC interfaces, or using serialized IDfSysObjects detached from the documentum session.

The former is the quickest, but the mock objects can hardly mimic real documentum objects carrying many values. Nevertheless, if the tested method does not access attribute values using IDfSysObject methods such as getString, this can be the best option. The latter option is particularly suitable if the tests require objects prefilled with values, so that the tested method can freely manipulate all the attributes. Below I describe two practical variations of this option:

In the first case, the target IDfSysObjects are loaded from the docbase, saved to the filesystem, and then loaded from disk before the test execution. In the second case, the type is loaded from the docbase, serialized, and then deserialized before creating objects. The generated disconnected objects preserve all behaviors that do not require interaction with the docbase; e.g. they cannot be saved by calling the save method.

Creating mock objects implementing DFC interfaces

First create a simple mock class implementing the interface used by the tested methods:

public class IDfSysObjectMock implements IDfSysObject {
}

Then, in your IDE, click the option "Implement all abstract methods". A class implementing all the methods will be generated.

public class IDfSysObjectMock implements IDfSysObject {
  @Override
  public IDfId saveAsNew(boolean shareContent) throws DfException {
    throw new UnsupportedOperationException("Not supported yet."); //To change body of generated methods, choose Tools | Templates.
  }

  @Override
  public boolean areAttributesModifiable() throws DfException {
    throw new UnsupportedOperationException("Not supported yet."); //To change body of generated methods, choose Tools | Templates.
  }
  // …
}

Now, if a method in your application takes IDfSysObject as an argument but invokes only a few IDfSysObject methods, you can use modified IDfSysObjectMock instances as arguments. In the instantiated IDfSysObjectMock objects, the IDfSysObject methods invoked inside the tested methods should be overridden as appropriate. For example:

IDfSysObject mock = new IDfSysObjectMock() {
  @Override
  public String getString(String attributeName) throws DfException {
    return "test";
  }
};

This technique is particularly useful when you need to test the exception handling in your methods. Even though exceptions normally do not occur, the try-catch blocks must be there; if they are left untested, the test coverage report will reflect it. The following mock object throws DfException when its save method is invoked.

IDfSysObject failingMock = new IDfSysObjectMock() {
 @Override
 public String getLogEntry() throws DfException {
   return "test";
 }

 @Override
 public void setString(String attributeName, String value) throws DfException {
 }

 @Override
 public void save() throws DfException {
      throw new DfException("test");
 }      
};

The same technique can be used with any DFC types. For example, mock classes implementing IDfSession, IDfId or IDfType can be used in tests as well. Below is an example of a method producing a mock IDfSession instance:

public IDfSession getTargetSession() {
  return new IDfSessionMock() {
    @Override
    public IDfType getType(String typeName) throws DfException {
      return new IDfTypeMock() {
        @Override
        public boolean isSubTypeOf(String typeName) throws DfException {
          return !typeName.equals("dm_sysobject");
        }
      };
    }
  };
}
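As an alternative to letting the IDE generate the stub class, java.lang.reflect.Proxy can implement an interface at runtime, so only the methods a test actually needs get behavior. The toy SysObject interface below is a hypothetical stand-in for IDfSysObject:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class ProxyMockSketch {
    // toy interface standing in for IDfSysObject
    interface SysObject {
        String getString(String attributeName);
        void save();
    }

    public static void main(String[] args) {
        // canned behavior for getString, everything else is unsupported
        InvocationHandler handler = (proxy, method, methodArgs) -> {
            if (method.getName().equals("getString")) {
                return "test";
            }
            throw new UnsupportedOperationException(method.getName());
        };
        SysObject mock = (SysObject) Proxy.newProxyInstance(
                SysObject.class.getClassLoader(), new Class<?>[]{SysObject.class}, handler);

        System.out.println(mock.getString("object_name"));
        try {
            mock.save();
        } catch (UnsupportedOperationException e) {
            System.out.println("save not supported");
        }
    }
}
```

The same handler pattern would work against the real IDfSysObject interface, since Proxy only needs the interface's Class object.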
Generating copies of once-serialized IDfSysObjects stored in the docbase

Essentially, we first save an object filled with values to disk, then load it every time before executing the test. The generated objects will have exactly the same values as the original object. Object serialization and deserialization is illustrated in the ObjectSerialization class.

public class ObjectSerialization {

  MyObjectStream os = new MyObjectStream();

  void saveObject(IDfSysObject objFromDocbase, String fileName) throws DfException, IOException, ClassNotFoundException {
    // save all the object data
    ITypedData originalData = ((ITypedObject) objFromDocbase).getData(false);
    os.write(originalData, fileName);
  }
   
  // load and instantiate a serialized IDfSysObject. note documentum session is irrelevant
  IDfSysObject loadObject(String fileName) throws DfException, IOException, ClassNotFoundException {

    ITypedData loadedData = (ITypedData) os.read(fileName);
    DfSysObject serializedObject = new DetachedDfSysObject();
    serializedObject.initialize(null, loadedData, null, null, true);
    return serializedObject;
  }
  // …
}

A simple auxiliary class encapsulating the serialization:

public class MyObjectStream {

  public void write(Object o, String fileName) throws IOException {
    try (ObjectOutputStream out = new ObjectOutputStream(new BufferedOutputStream(new FileOutputStream(fileName)))) {
      out.writeObject(o);
    }
  }

  public Object read(String fileName) throws IOException, ClassNotFoundException {
    try (ObjectInputStream in = new ObjectInputStream(new BufferedInputStream(new FileInputStream(fileName)))) {
      return in.readObject();
    }
  }
}
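Since MyObjectStream accepts any Serializable object, its write/read cycle can be exercised with a plain HashMap standing in for the ITypedData payload:

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

public class RoundTripDemo {
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        Map<String, String> data = new HashMap<>();
        data.put("object_name", "test");

        File f = File.createTempFile("serialized", ".bin");
        f.deleteOnExit();

        // write — same pattern as MyObjectStream.write
        try (ObjectOutputStream out = new ObjectOutputStream(
                new BufferedOutputStream(new FileOutputStream(f)))) {
            out.writeObject(data);
        }
        // read back — same pattern as MyObjectStream.read
        Object loaded;
        try (ObjectInputStream in = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream(f)))) {
            loaded = in.readObject();
        }
        System.out.println("round trip equal: " + data.equals(loaded));
    }
}
```

The deserialized copy is value-equal to the original, which is exactly what the ITypedData round trip relies on.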

In the mock DfSysObject subclass, several methods, namely initialize, init and setDirty, have to be overridden so that the class can be instantiated. Optionally, you can override any other methods invoked in your tested methods, e.g. save or link, so that they do not fail, as the instances are disconnected from the docbase.

public class DetachedDfSysObject extends DfSysObject {

  @Override
  public void save() throws DfException {
    System.out.println("fake save");
  }

  @Override
  public void destroy() throws DfException {
    System.out.println("fake destroy");
  }

  @Override
  public void link(String folderSpec) throws DfException {
    System.out.println("fake link");
  }

  @Override
  public void initialize(final IPersistentObjectFactory factory, final ITypedData data, final ISession session, final ISession originalSession, final boolean isNew) throws DfException {
    this.initData(data);
    this.init();
  }

  @Override
  protected void init() throws DfException {
  }

  @Override
  public synchronized void setDirty(final boolean dirty) throws DfException {
  }
}

A junit test proving that the original and deserialized object contain the same data:

public class ObjectSerializationTest {

  ObjectSerialization i = new ObjectSerialization();
  IDfSession session;

  @Before
  public void setUp() throws DfException {
    session = ConnectionFactory.getSession();
  }

  @After
  public void tearDown() throws DfException {
    session.disconnect();
  }

  @Test
  public void testSerializingAndLoadingDfSysObject() throws Exception {
    // load an object from the docbase
    String existingObjectId = "090f4241800c2507";
    IDfSysObject existingObj = (IDfSysObject) session.getObject(new DfId(existingObjectId));

    String fileName = "Serialized" + existingObjectId;

    // save all the object data
    i.saveObject(existingObj, fileName);

    // now load serialized IDfSysObject
    IDfSysObject loadedSerializedObject = i.loadObject(fileName);
    compareAllAttributeValues(existingObj, loadedSerializedObject);
  }

  void compareAllAttributeValues(IDfSysObject obj1, IDfSysObject obj2) throws DfException {
    for (int i = 0; i < obj1.getAttrCount(); i++) {
      IDfAttr attr = obj1.getAttr(i);
      String attrName = attr.getName();
      String attrValue1, attrValue2;

      if (attr.isRepeating()) {
        attrValue1 = obj1.getAllRepeatingStrings(attrName, "|");
        attrValue2 = obj2.getAllRepeatingStrings(attrName, "|");
      } else {
        attrValue1 = obj1.getString(attrName);
        attrValue2 = obj2.getString(attrName);
      }
      System.out.println(">>> " + attrName);
      System.out.println("\t" + attrValue1);
      System.out.println("\t" + attrValue2);
      assertEquals(attrValue1, attrValue2);
    }
  }
}
Saving a type definition to the filesystem and generating empty IDfSysObject objects of this type

Essentially, we first load the target type definition from the docbase and save it; then, before the tests, we load it and generate detached but valid empty IDfSysObjects.

public class ObjectSerialization {

  MyObjectStream os = new MyObjectStream();
  // methods shown above …

  // serialize a type definition
  void saveType(IDfSession session, String typeName, String fileName) throws DfException, IOException, ClassNotFoundException {
    // load the type definition from the docbase
    ILiteType liteType = ((ISession) session).getDocbaseConnection().getLiteType(typeName);
    // serialize the type definition
    os.write(liteType, fileName);
  }

  // create an empty object of a particular type without using documentum session
  IDfSysObject loadTypeAndCreateEmptyObject(String fileName) throws DfException, IOException, ClassNotFoundException {
    // load the serialized type definition     
    ILiteType type = (ILiteType) os.read(fileName);

    // create data container
    final ITypedData typedData = new TypedData(type, null);

    // now instantiate an empty IDfSysObject 
    final DfSysObject emptyObject = new DetachedDfSysObject();
    emptyObject.initialize(null, typedData, null, null, true);
    return emptyObject;
  }
}

The following junit test asserts that a generated object has all the attributes of the target type and all the attributes are empty.

public class ObjectSerializationTest {

  // declarations shown above …

  @Test
  public void testSerializingTypeAndGeneratingEmptyDfSysObject() throws Exception {
    // load an object from the docbase
    String typeName = "testtype";
    String fileName = "Serialized" + typeName;

    // serialize the type definition
    i.saveType(session, typeName, fileName);

    // now create an empty IDfSysObject of the serialized type
    IDfSysObject generatedObject = i.loadTypeAndCreateEmptyObject(fileName);
    assertThatObjectHasAllAttributesAndTheyAreEmpty(typeName, generatedObject);
  }

  void assertThatObjectHasAllAttributesAndTheyAreEmpty(String typeName, IDfSysObject obj) throws DfException {
    IDfType existingType = session.getType(typeName);
    for (int i = 0; i < existingType.getTypeAttrCount(); i++) {
      IDfAttr attr = existingType.getTypeAttr(i);
      String attrName = attr.getName();
      String attrValue;
      int attrType = attr.getDataType();
      if (attr.isRepeating()) {
        attrValue = obj.getAllRepeatingStrings(attrName, "|");
      } else {
        attrValue = obj.getString(attrName);
      }

      System.out.println(">>> " + attrName + "; " + attrType + "; " + attrValue);
      switch (attr.getDataType()) {
        case IDfAttr.DM_TIME:
          assertTrue(attrValue.equals("nulldate") || attrValue.isEmpty());
          break;
        case IDfAttr.DM_BOOLEAN:
          assertTrue(attrValue.equals("F") || attrValue.isEmpty());
          break;
        case IDfAttr.DM_INTEGER:
          assertTrue(attrValue.equals("0") || attrValue.isEmpty());
          break;
        case IDfAttr.DM_DOUBLE:
          assertTrue(attrValue.equals("0") || attrValue.isEmpty());
          break;
        case IDfAttr.DM_ID:
          assertTrue(attrValue.equals(DfId.DF_NULLID_STR) || attrValue.isEmpty());
          break;
        default:
          assertTrue(attrValue.isEmpty());
      }
    }
  }
}

Sunday, January 1, 2017

Documentum D2 external widget. How to nicely use the current user's session

The code of many sample D2 widgets is available on the EMC website. Widgets accessing documentum need a session, or credentials to create one. The provided samples use tickets generated by D2 to create a session each time the user activates the widget.

I used a simpler way: I reused the current user's session already used by D2. To do this, the widget must be included in the D2 application, which is not a problem. As with plugins, the widget jar should be placed into the D2/WEB-INF/lib folder. Then both the static resources and the servlet have the same context root as the D2 application and, therefore, can access the httpSession of the current user. In D2, all the documentum sessions created for the current user are stored in the http session attribute "CONTEXT". The attribute value is a map with D2 session ids as keys and the credentials for the corresponding documentum sessions as values.

Project file layout

My sample widget comprises static resources, such as an html template, javascript and a css stylesheet, plus a servlet together with the auxiliary classes required to generate the output.

myWidget.html

It is a simple file with a DIV placeholder for the dynamically generated content. Additionally, the file loads two standard scripts enabling interaction with the OpenAjaxHub, which acts as an event bus in the D2 application. The third script, myWidget.js, listens to D2 events and updates the html with the content generated by the servlet MyWidgetServlet.java.

<html>
  <head>
    <title>My widget</title>
    <script language='javascript' src="container/external-api/OpenAjaxManagedHub-all.js"></script>
    <script language='javascript' src="container/external-api/D2-OAH.js"></script>
    <script src="myWidget.js"></script>
    <link rel="stylesheet" type="text/css" href="myWidgetStyles.css">
  </head>
  <body  onload="myWidget.loadEventHandler()">    
    <div id="myPlaceHolderForHTMLContent"></div>
  </body>
</html>
myWidget.js

The script myWidget.js connects to the ajaxHub, subscribes to the D2_EVENT_SELECT_OBJECT event, and calls the servlet whenever the user selects an object. Additionally, when the user selects an object visualized in the widget, the script issues the event D2_ACTION_LOCATE_OBJECT so that this object is also selected in D2. Essentially, the script is a relay ensuring two-way communication between the D2 ajaxHub and the widget. When the servlet is accessed, it is passed two parameters: the selected object id and the D2 session id (merely the D2-specific id of the documentum session used by the current user). The servlet response is inserted into the placeholder div. Then, in the method attachEventListeners, onClick listeners are attached to the displayed objects and the new content is additionally styled and positioned (this code is not shown).

var myWidget = {
  clickedObjectId: "", // last selected object id
  widgetIsOn: false, // widget is active

  // Application initializes in response to document load event  
  loadEventHandler: function () {
    console.log("Iframe loaded ");
    myWidget.ajaxHub = new D2OpenAjaxHub();
    myWidget.ajaxHub.connectHub(myWidget.connectCompleted, myWidget.onInitWidget, myWidget.onActiveWidget);
  },
  connectCompleted: function (hubClient, success, error) {
    if (success) {
      console.log("Hub client connected");
      myWidget.subscribeEvents();
    } else
      console.log("Failed to connect");
  },
  // Callback that is invoked upon widget activation 
  onActiveWidget: function (bActiveFlag) {
    console.log("onActiveWidget: " + bActiveFlag);
    // set the internal flag
    myWidget.widgetIsOn = bActiveFlag;
  },
  onInitWidget: function (message) {
    console.log("onWidgetInit");
  },
  // the widget will react to selection of an object in D2, selectObjectCallback will be invoked
  subscribeEvents: function () {
    console.log("subscribeEvents");
    myWidget.ajaxHub.subscribeToChannel("D2_EVENT_SELECT_OBJECT", myWidget.selectObjectCallback, false);
  },
  // invoked when an object is selected in D2
  selectObjectCallback: function (name, msg) {
    var id=msg.get("oam_id");
    console.log("selectObjectCallback id: " + id);
    // check that the widget is active
    if (!myWidget.widgetIsOn) {
      return;
    }
    // react only if the newly selected object is not the same as the currently selected 
    if (myWidget.clickedObjectId !== id) {
      console.log("selectObjectCallback processing: " + id);
      var xmlhttp = new XMLHttpRequest();
      xmlhttp.onreadystatechange = function () {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
          console.log("selectObjectCallback received response " + xmlhttp.status);
          // display the generated html
          document.getElementById("myPlaceHolderForHTMLContent").innerHTML = xmlhttp.responseText;
          console.log("selectObjectCallback inserted response");
          // the html can display some objects that are related to the selected object. For example,
          // if versions or objects linked by relations are visualized, then one would expect that clicking
          // on an object would trigger something, for example, selecting the clicked object in D2.
          myWidget.attachEventListeners();
          console.log("selectObjectCallback attached listeners");
        }
      };

      // send not only the object id but also the D2 session id so that the servlet can recover the documentum session
      xmlhttp.open("GET", "myWidgetServlet?id=" + id + "&uid=" + msg.get("oam_cuid"), true);
      xmlhttp.send();
      console.log("selectObjectCallback sent ajax request");
      // remember the id so that repeated selections of the same object are ignored
      myWidget.clickedObjectId = id;
    }
  },
  // optionally attach listeners to your generated html or modify html or do anything else
  attachEventListeners: function () {
  },
  // a method that can be used together with the method above to trigger selection of the object in D2;
  // it sends the D2_ACTION_LOCATE_OBJECT event together with the object id to the D2 AjaxHub
  displayInD2ObjectSelectedInWidget: function (id) {
    console.log("displayInD2ObjectSelectedInWidget: " + id);
    var messageToSend = new OpenAjaxMessage();
    messageToSend.put("oam_id", id);
    myWidget.ajaxHub.sendMessage("D2_ACTION_LOCATE_OBJECT", messageToSend);
    return messageToSend;
  }
};
MyWidgetServlet.java

Note that this servlet works only in servlet containers supporting the Servlet specification 3.0 and above, as it is registered via the @WebServlet annotation instead of a web.xml entry.

The servlet receives the selected object id and the D2-specific documentum session id. The documentum session is then extracted from the http session, which is shared by the D2 application and the widget. The session and the selected object id are passed to the auxiliary method createPage, which renders the html. For example, the navigable version tree of the object could be rendered.

import java.io.IOException;
import java.io.PrintWriter;
import java.util.Map;

import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.documentum.fc.client.IDfSession;
import com.documentum.fc.common.DfException;
// Context, D2fsContext and D2fsException come from the D2FS libraries shipped with D2;
// their imports are omitted here.

@WebServlet("/myWidgetServlet")
public class MyWidgetServlet extends HttpServlet {

  @Override
  protected void service(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/html;charset=UTF-8");
    PrintWriter out = response.getWriter();
    try {
      // the selected object id
      String selectedObjectId = request.getParameter("id");

      // the map of the current user's documentum sessions maintained by D2
      Map<String, Object> contextMap = (Map<String, Object>) request.getSession().getAttribute("CONTEXT");

      // get the session of the current user
      Context c = (Context) contextMap.get(request.getParameter("uid"));
      D2fsContext d2fsContext = new D2fsContext(c, false);
      IDfSession session = d2fsContext.getSession();

      // use the session and selected object id to create the HTML page;
      // for example you can visualize the version tree of the object
      String html = createPage(session, selectedObjectId);
      out.println(html);
    } catch (DfException | D2fsException ex) {
      ex.printStackTrace(out);
    } finally {
      out.close();
    }
  }

  String createPage(IDfSession session, String selectedObjectId) throws DfException {
    String html = ""; // generate the html displayed in the widget here
    return html;
  }
}
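As an illustration of what createPage could produce, here is a minimal, hypothetical sketch (VersionTreeRenderer and its method are my own names, not part of D2 or DFC) that turns a list of version labels into an html fragment. In the real servlet the labels would be fetched with the IDfSession, for example via a DQL query over the version tree of the selected object.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical helper sketching the html that createPage might return.
// In the servlet the version labels would come from a DQL query executed
// with the IDfSession instead of being passed in directly.
public class VersionTreeRenderer {

    static String render(List<String> versionLabels) {
        StringBuilder sb = new StringBuilder("<ul class=\"versionTree\">");
        for (String label : versionLabels) {
            // minimal escaping so labels cannot break the markup
            String safe = label.replace("&", "&amp;").replace("<", "&lt;");
            sb.append("<li>").append(safe).append("</li>");
        }
        return sb.append("</ul>").toString();
    }

    public static void main(String[] args) {
        System.out.println(render(Arrays.asList("1.0", "1.1", "CURRENT")));
    }
}
```

Clicking such a list item is where attachEventListeners and displayInD2ObjectSelectedInWidget from myWidget.js come into play, relaying the selection back to D2.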
Installation

After the project is built into jar file and placed into D2/WEB-INF/lib folder, the widget has to be enabled in D2 config:

  • create a new widget entry in D2 config
  • select ExternalWidget option in the Widget type list
  • check Bidirectional communication
  • enter myWidget.html?anything=11 into the Widget url text input (without an arbitrary dummy parameter the relative url is misinterpreted by D2)
  • click Save
  • enable the widget in the context matrix of D2 config
  • open D2 and select the widget in the widget gallery