Tag Archives: dfc

Enable DFC Tracing

Add the following properties in file.

dfc.tracing.enable = true
dfc.tracing.include_rpcs = true
dfc.tracing.include_rpc_count = true
dfc.tracing.max_stack_depth = 6
dfc.tracing.include_session_id = true
dfc.tracing.log.level = DEBUG
dfc.tracing.date_format=yyyy-MM-dd HH:mm:ss.SSS
dfc.tracing.include_session_id = true


Tags: ,

BOF – What is Documentum Business Objects Framework?

Documentum provides framework for applying business logic and is termed as Business Objects Framework.  Documentum provides a clear separation of business logic from the core content services so that the business logic can be executed irrespective of client accessing the repository. Using BOF, business logic can be changed without effecting the UI code.
Business objects encapsulates business logic in hot deployable package that is stored in the repository.  This package will be deployed to the client application on demand.  Business objects are used to extend the functionality of Documentum. We may need to validate the document before saving into the repository. To achieve this we can override the default behaviour of save event of document to execute the custom validation before save event was fired on that object.  Code for the custom logic must be deployed to the repository and this code will be stored in the System cabinet. There are 3 types of business objects available in this framework.  We will look into each type in detail


TBO stands for Type Based Objects.  These are the most common types of business objects.  These are used to change or add behaviour to all instances of an object type. This requires no changes in the UI code.  A TBO can be defined to create new actions on an object type.  However client applications should have customized/configured to call these new actions.

The following diagram shows how the TBO is invoked by client application

TBOs must be developed using java only.  Documentum provides composer tool to deploy the TBO and related jar (if required) into repository.


SBO stands for Service Based Object.  These are not tied to any object types and these objects can be used to perform custom action on all repository objects.  They are generally used to define business logic that can be utilized by several applications.  Content server has many SBOs out of the box like Inbox SBO, Workflow Reporting SBO, and Subscription SBO etc. We can write our custom SBO for common business logic and implementing Documentum Foundation Services is recommended approach than using SBOs.

Leave a comment

Posted by on August 24, 2012 in BOF, Developer Resource, DFS, Java, Java, SBO, TBO, WDK, Webtop


Tags: , , , , , ,

Troubleshooting UCF

UCF stand for ‘Unified Content Facilities’ which is a lightweight applet that is responsible for transferring content between the content server, application server and the client machine during operations such as check-in, checkout, import etc. UCF is a Java-based client application that is installed at run-time.

Since WDK application uses UCF content transfer it downloads a lightweight applet to the browser when the client uses the Documentum operations for the first time. UCF has many advantages that extend beyond simple FTP (file transfer protocol), such as:

- Recoverability in case of brief network interruption.

- It supports transfer of large files over network and optimizes the transfers by content compression capabilities.

- Awareness of locally downloaded files, and the ability to avoid downloading the same content twice

- Support for complex documents, such as XML files

- Registry support, to allow checkout from one application and check in from another. The diagram below in high level describes the process for a simple ucf operation.

When an UCF operation is requested, a Java applet is downloaded. The UCF client is invoked which requests for a UCF Session ID (UID) from the corresponding UCF servlet process. The UID sent via the browser as a request for content back to the application server. The content is then transferred between the UCF servlet and the UCF client as a direct HTTP connection. On completion of the process the UCF servlet send the response to the application that it is completed and the browser receives a signal to terminate the UCF client.

UCF performs the following functions:

1. Standardizes content handling across infrastructure and applications

2. Simplifies XML and compound document processing in a Web environment by decoupling DFC and WDK and by providing an open framework for content analysis.

3.Improves maintainability, reliability and performance of WDK content transfer.


Whenever a user gets an UCF related errors, the following are the approaches which should be checked before going into troubleshooting of the issue. This will confirm that the error is not user or machine specific.

1. Closing all browsers and try invoking the UCF operation again. 2. Try login on other user’s machine with your login details to test. Also try doing a test with admin and non-admin privileged users. 3. Try connecting to another UCF client for e.g. DA.

If the error still persists in all the above scenarios, the best option is to reinstall UCF on your client as explained in the section below.


Particularly for developers using multiple applications and making configuration changes it’s sometimes worth re-installing UCF client on your machine.  The steps described below will normally require admin access depending on original UCF configuration.

1. Shutdown down all Browser sessions

2. Delete the ucf folder on your client located here(if this does not work you may need to reboot machine and try)

3. C:\Documents and Settings\<USER>\Documentum\ucf

4. Clear your browser cache

5. Close down any Java Consoles Running and any javaw.exe processes running

6. Clear your Browser Java Plug-in Cache

MS JVM – Delete all files in Temporary Internet Files

Sun JVM – From the Java Control Panel è  General Tab-> Temporary Internet Files -> Delete Files

7. Restart your UCF Test (e.g. Perform the import operation which invokes UCF operation)

8. Check the following folder is created (this confirms the plug-in has downloaded UCF)

C:\Documents and Settings\<user>\Documentum\ ucf\config

9. Check the ucf.installs.config.xml contains the correct paths to valid JRE’s on your desktop.  A common problem is it may point to a JRE that isn’t on your machine (typically due to a corrupt registry on workstation).

10. You must ensure in the file below that the appropriate paths and Java versions are all valid.

<xml version=”1.0″ encoding=”UTF-8″?>

<? dctm fileVersion=”″ compatibilityVersion=”″?>
<ucfInstall appId=”shared” version=”″ host=”WLDN0179847″ home=”C:\Documents and Settings\<user>\Documentum\ucf”><java version=”1.4.2″ minVersion=”1.4.2″ exePath=”C:\Program Files\Java\j2re1.4.2_11\bin\javaw.exe” classpath=”C:\Documents and Settings\<user> \Documentum\ucf\WLDN0XXXXX\shared\bin\ucf-client-api.jar;C:\Documents and Settings\<user>\Documentum\ucf\WLDN0XXXX\shared\bin\ucf-client-impl.jar”>
<option value=”-Djava.library.path=C:\Documents and Settings\<USER>\Documentum\ucf\XXXXXX\shared\bin”/>
<option value=”-Djava.util.logging.config.class=com.documentum.ucf.client.logging.impl.ConfigureLogger”/>
<option value=”-Duser.home=C:\Documents and Settings\<USER>”/>
</ucfInstalls>11. You can also check that the correct versions of the UCF runtimes have been downloaded from the application server. C:\Documents and Settings \<user> \Documentum \ucf \<computer_name> \shared \bin

You can check the versions of the dll by viewing properties. The version should match the version of UCF you are using and what has been installed on the application server for example for WDK 5.3sp2.


With UCF there are two types of logging available for diagnostics. There are client side (browser) and server side (application server) logging available.Client Side Logging:To enable client side logging, 2 files will need to be modified. Both of these files can be found in the folder:C:\Documents and Settings\<USERNAME>\Documentum\ucf\<HOSTNAME>\shared\config\Where,USERNAME is the windows user IDHOSTNAME is the machine hostname where the testing is being done.The first file to be edited is ucf.client.config.xml. The following entry should be changed to true:
<option name=”tracing.enabled”>
The following entry will need to be added as well:
<option name=”debug.mode”> <value>true</value>
The next file to be edited is and set the .level entries to ALL (all caps). The debugging levels available are (from least informative to most):
  • SEVERE (highest value)
  • INFO
  • FINE
  • ALL
Once these changes are made, UCF will begin logging on the client side starting with the next invocation. All output will be logged to: C:\Documentum\logs\ucf.client.*.log.App Server Logging:To enable UCF tracing, the following file in the WDK application will need to be edited:  ../WEB-INF/classes/ucf.server.config.xmlIn this file, the following entry will need to be set to true:
<option name=”tracing.enabled”> <value>true</value> </option>
UCF will log on the app server to $DOCUMENTUM_SHARED/logs/trace.log and/or wdk.log (depending on how log4j is configured). These changes will require a restart of the app server.

Tags: , , , , ,

BOF 101

BOF is a deployment mechanism we use for a number of different component types, like TBOs, SBOs, modules, aspects. This is how it works from a 20,000 ft view.

BOF modules are packaged using Composer into DARs (or docapps in the old days). They are deployed to a docbase. In general, with the exception of SBOs, they have docbase affinity (you might get a different implementation from a different docbase). SBOs are deployed to a global registry docbase and so have a single implementation.

Once deployed to the docbase, any DFC instance that communicates to that docbase will be sensitive to any changes made to the BOF module. If it detects a change (via an optimized polling strategy) it will download the new/updated module definition to its file system cache. You can find these caches in $DOCUMENTUM/cache/<release>/bof/<docbase>. In these directories you will find downloaded jars and other downloadable resources (the jar names are the object ids). You will also find a content.xml file which is a manifest. These caches can be deleted at will and will be automatically reconstructed as necessary by a running DFC instance. Zipping up this cache can be very useful collateral in debugging a BOF problem.

The BOF runtime (within DFC) runs entirely from the file system. Class loaders are constructed which point to the jar files on the file system, so there is no performance penalty in deploying to the docbase (except for the initial reference).

BOF supports hot deployment. If a DFC instance recognizes an updated BOF module it will download it and build a new class loader for that module which points to the updated jars. The next request for that module will use the new implementation. Any preexisting objects remain using the old implementation and will be aged out as they are garbage collected. This is great for updating an implementation, but you must never update a public interface or you will cause havoc (you would need to reload the web app and ensure application compatibility first).

The class loader and packaging implications of BOF will be addressed in a subsequent post.


Tags: , , , , , , ,

Developing BOF applications with Composer tutorial

1 Understanding BOF class loading

Before you begin BOF development, it is important to understand how Composer categorizes JAR files, the BOF class loader, and the problems that you might encounter when developing BOF applications.

1.1. The BOF class loader hierarchy

Composer categorizes JAR files into three categories: Implementation, Interface, and Mixed. A different class loader is used to load the classes depending on the type of JAR file. The following diagram shows the class loader hierarchy:


Implementation JAR files are loaded into module-specific class loaders. Each BOF application has its own class loader and all of its implementation JAR files are loaded into this class loader. Classes in different module-specific class loaders cannot reference one another. This class loader is parent-last, meaning classes within the class loader take precedence over the same classes in its parent class loaders (the shared BOF class loader and the application class loader).

Interface JAR files are loaded in a class loader that is shared across a DFC instance. All interface JAR files for all BOF applications are loaded into the shared BOF class loader. This allows published interfaces to be shared across multiple components that require it. This class loader is parent-first, meaning classes that are loaded by its parent class loader (the application class loader) take precedence over its own classes. Classes in this class loader cannot reference classes in the module-specific class loaders.

Mixed JAR files are deprecated and should not be used.

The application class loader is typically where your client or web application is loaded. This class loader cannot reference any of the classes that are loaded by the shared BOF class loader or the module-specific class loaders. You must package any interface JAR files that are needed by your client application with the client application, so it is aware of your BOF application’s API.

1.2. What to package in JAR files

The different types of JAR files and their class loading behavior in a repository are more complex than in your development environment. Because of this fact, applications that work in your development environment might throw exceptions when deployed in a repository. It is important to understand what to package in implementation and interface JAR files before developing your BOF application. As a general rule, you should not use mixed JAR files.

Implementation JAR files typically contain the logic of your application and a primary class that serves as the entry point to your BOF module. Implementation JAR files are loaded into a module-specific class loader and classes in different BOF modules cannot reference one another.

Interface JAR files do not need to include all interfaces, but all published interfaces that are needed by your clients should be packaged. Any exception classes or factory classes that appear in your interfaces should also be packaged in interface JAR files. Ensure that your interfaces do not transitively expose classes that are not packaged in the interface JAR file, which would lead to a NoClassDefFoundException.

To be safe, package your implementation and published interface classes in different JAR files. Implementation JAR files can contain non-published interfaces (the interface is not needed by the client or any other BOF module). It is recommended that you also separate your implementation and published interface source files into separate directories so that the packaging process is less error prone.

1.3. Common exceptions caused by incorrect JAR packaging

If you do not package your JAR files correctly, the different layers of class loaders can cause exceptions to occur. The two most common exceptions that you might encounter are ClassDefNotFound and ClassCastException.

The ClassCastException occurs when you try to cast an object to an interface or class that it does not implement. In most cases, you will be sure that the object you are casting implements the interface that you are trying to cast it to, so there is another point to consider when encountering this error. Java considers a class to be the same only when it has the same fully qualified name (package and class name) and if it is loaded by the same class loader. If you accidentally package a published interface within an implementation JAR file, the exception occurs if you try to cast an object to that interface in your client application.

For instance, say you created a BOF module that implements an interface and package the interface in an implementation JAR file:

  • The interface that is packaged in the implementation JAR resolves to the module-specific class loader because it is parent last.
  • Your client application instantiates the BOF module and tries to cast it to the interface. It uses the interface that is loaded by the client application class loader, because there is no way for your application to reference the interface in the module class loader (the parent class loader cannot see children class loaders).
  • Java throws ClassCastException, because it expects the interface that was loaded by the module-specific class loader to be used to cast the BOF module, but you are using the one that was loaded by your application class loader. Alternatively, if you correctly package the interface inside an interface JAR file, it is loaded by the shared BOF class loader, which is parent-first. The interface resolves to its parent class loader first (your application’s class loader), and no exception is thrown.

NoClassDefFoundException most often occurs when you transitively expose a class that a class loader cannot find. A common example is when you accidentally package an implementation class inside an interface JAR file, and that implementation class references another class in an implementation JAR file. The exception is thrown, because classes in the shared BOF class loader cannot reference anything in the module-specific class loaders.

For instance, say you created a BOF module and accidentally packaged an implementation class inside an interface JAR file:

  • You call a method that references the implementation class that you accidentally packaged inside the interface JAR file.
  • The method runs and in turns references another class inside the implementation JAR file.
  • Java throws the NoClassDefFoundException, because the classes that are loaded in the shared BOF class loader cannot see any classes that are loaded by the module-specific class loaders (classes in parent class loaders cannot see classes in child class loaders). This problem can manifest itself in other scenarios, but the basic problem involves referencing classes that do not exist (either through not packaging the class at all or packaging the class at a level that is hidden from the referencing class).

Understanding these two common problems can help you avoid them when developing your BOF applications. Because of these problems, it is recommended that you separate your implementation source files and published interface source files into different folders to begin with, so that packaging the JAR files is a simpler process.

2 Building the Hello World BOF module

Now that you have some background on BOF development, this tutorial will guide you through developing a simple Hello World BOF module. You will create the following items when developing the Hello World BOF module:

  • An interface for the Hello World BOF module
  • An implementation class for the Hello World BOF module
  • An Ant script that builds the interface and implementation classes into separate JAR files
  • A JAR Definition artifact that defines the implementation and interface JAR files as a Documentum artifact
  • A Module artifact that defines the BOF module
  • An Ant script that automatically updates the JAR Definition with the most recent version of the implementation JAR file and builds and installs the project

2.1. Creating the Hello World Java project

To begin, you will first create a Java project that contains the code for the Hello World BOF module. During the project creation, you will also create two separate source folders for your implementation and interface classes.

To create the Java project:

  1. Start Composer and click File > New > Java Project…. The New Java Project wizard appears.
  2. Type HelloWorldBOFModule in the Project Name field and click Next. The Java Settings screen appears.
  3. Check the Allow output folders for source folders checkbox.create_source_folders.png 
  4. Click Create new source folder. The Source Folder window appears.
  5. Type src-impl in the Folder name field and click Finish.create_source_impl.png


  6. Click Configure Output Folder Properties. The Source Folder Output Location window appears.
  7. Click the Specific output folder radio button and click Browse….source_output_folder_impl.png 
  8. Click the HelloWorldBOFModule node and click Create New Folder…. The New Folder window appears.
  9. In the Folder name field, type bin-impl and click OK three times to return to the Java Settings window.
  10. Repeat steps 4–9, but create a source folder named “src-int” with an output folder of “bin-int”.
  11. Click on the Libraries tab and click Add External JARs….
  12. Browse for <Composer_root>\plugins\com.emc.ide.external.dfc_1.0.0\lib\dfc.jar and click Open.
  13. Click Finish to create the project. If you are prompted to switch to the Java Perspective, do so.

2.1.1. Creating the IHello interface

The IHello interface defines one method: sayHello. This interface will be packaged in a JAR file that is designated as an Interface JAR when defining the JAR Definition artifact.

To create the IHello interface:

  1. In the Package Explorer view, right click the src-int folder and select New > Interface…. The New Java Interface window appears.
  2. Specify the following values for the following fields and click Finish:
    • Package — com.emc.examples.helloworld
    • Name — IHello



  3. Replace the code in the file with the following code and save the file:
    package com.emc.examples.helloworld;
         public interface IHello {
              public void sayHello();

2.1.2. Creating the Hello World implementation class

The HelloWorld class implements the IHello interface and prints out a “Hello, World” string when its sayHello method is called. The HelloWorld class will be packaged in a JAR file that will be designated as an Implementation JAR when defining the JAR Definition artifact.

To implement the IHello interface:

  1. In the Package Explorer view, right click the src–impl folder and select New > Class…. The New Java Class window appears.
  2. Specify the following values for the following fields and click Finish:
    • Package — com.emc.examples.helloworld.impl
    • Name — HelloWorldcreate_impl_class.png
  3. Replace the code in the file with the following code and save the class:
    package com.emc.examples.helloworld.impl;
    import com.documentum.fc.client.IDfModule;
    import com.emc.examples.helloworld.IHello;
    public class HelloWorld implements IHello, IDfModule{
         public void sayHello() {     
              System.out.println("Hello World!");

2.1.3. Creating the JAR Ant Builder

You will now create an Ant Builder to automatically build the interface and implementation classes into two separate JAR files: hello-api.jar and hello.jar. When you make changes to any of your code, the Ant Builder automatically rebuilds the JAR files.

To create the Ant Builder:

  1. Right click the HelloWorldBOFModule node in the Package Explorer view and select New > File. The New File window appears.
  2. In the File name field, type jarBuilder.xml and click Finish.new_jarbuilder_file.pngThe jarBuilder.xml file appears in the Package Explorer view and is opened in an editor.
  3. Click on the Sourcetab in the XML file editor, copy and paste the following code into the editor, and save the file:
    <project name="JARBuilder" default="main"> 
         <target name="main">
              <delete file="bin-impl/hello.jar" />
              <delete file="bin-int/hello-api.jar" />
              <jar destfile="bin-impl/hello.jar" basedir="bin-impl"/>
              <jar destfile="bin-int/hello-api.jar" basedir="bin-int"/>
              <eclipse.refreshLocal resource="HelloWorldBOFModule"/>
  4. Right click the HelloWorldBOFModule node in the Package Explorer view and select Properties. The Properties for HelloWorldBOFModule window appears.builders_screen.png 
  5. Select Builders on the left and click New…. The Choose configuration type window appears.
  6. Select Ant Builder from the list and click OK. The Edit Configuration window appears.
  7. Specify the following values for the fields listed and click OK:
    • Main tab > Name — JAR_Builder
    • Main tab > Buildfile — Click Browse workspace and select HelloWorldBOFModule > jarBuilder.xml
    • Targets tab > Auto Build — Click Set Targets…, select main, and click OK.
  8. Select JAR_Builder and click Down to move JAR_Builder below Java Builder and click OK. The Ant Builder builds the JAR files and outputs them to the bin-impl and bin-int folders.

2.2. Creating the Composer project

The Composer project contains the Documentum artifacts that are needed for your BOF module. You will create a project that contains a Module artifact along with JAR Definition artifacts.

To create the Composer project:

  1. Click File > New Project, select Documentum Project > Documentum Project from the New Project wizard, and click Next. The New Documentum Project window appears.
  2. In the Project Name field, type HelloWorldArtifacts and click Finish. Composer takes a few minutes to create the project. If you are prompted to switch to the Documentum Artifacts perspective, do so.

2.2.1. Creating the JAR Definition artifacts

Before you create the BOF module artifact, you must create JAR Definition artifacts that reference your implementation and interface JAR files. The BOF module cannot reference JAR files directly.

To create the JAR Definition artifacts:

  1. In the Documentum Navigator view, right click the HelloWorldArtifacts > Artifacts > JAR Definitions folder and select New > Other….The New Wizard appears.
  2. Select Documentum Artifact > JAR Definition and click Next.new_jardef.png 
  3. In the Artifact name field, type hello and click Finish. The hello editor opens.
  4. In the JAR Content section, click Browse… and select the hello.jar file that is located in the <workspace>\HelloWorldBOFModule\bin–impl directory.
  5. In the Type drop down menu, select Implementation and save the JAR definition.
  6. In the Documentum Navigator view, right click the HelloWorldArtifacts > Artifacts > JAR Definitions folder and select New > Other…. The New Wizard appears.
  7. Select Documentum Artifact > JAR Definition and click Next.
  8. In the Artifact name field, type hello-api and click Finish. The hello-api editor opens.
  9. In the JAR Content section, click Browse… and select the hello-api.jar file that is located in the <workspace>\HelloWorldBOFModule\bin–int folder.
  10. In the Type drop down menu, select Interface and save the JAR definition.
  11. In the Documentum Navigator view, right-click HelloWorldArtifacts > Artifacts > JAR Definitions > hello.jardef and select Properties.
  12. Select Documentum Arfitact on the left, select Ignore Matching Objects for the Upgradeoption field and click Apply.hello_properties_ignore.png 
  13. For the Upgrade option field, re-select the Create New Version of Matching Objects option and click OK. Composer does not set the Create New Version of Matching Objects option unless you set it to something else first. This bug will be addressed in future releases. This option allows the client to detect new changes in JAR files in the repository. If you do not set the JAR Definition to this property, updated JAR files will not get downloaded to the client unless the BOF cache is cleared.

The HelloWorld.jar and IHello.jar files are now associated with a JAR definition and can be used within a module. If you decide to modify any code within these JAR files, you must remove the JAR file from the JAR definition and re-add it. You must do this, because Composer does not use the JAR file in the location that the Ant builder outputs it to. Composer actually copies that JAR file to another location and uses that copy. The Ant Builder that you previously created updates the JAR file, but does not update the JAR Definition artifact.

You can update the artifact manually by clicking the Remove button, clicking the Browse… button, and reselecting the appropriate JAR file whenever the JAR file is updated. Later on in this tutorial, you will learn how to automate this requirement with another Ant script and headless Composer.

2.2.2. Creating the BOF module artifact

Now that you have created all of the necessary components, you can now create the actual BOF module artifact.

To create the BOF module artifact:

  1. In the Documentum Navigator view, right click the HelloWorldArtifacts > Artifacts > Modules folder and select New > Other…. The New Wizard appears.
  2. Select Documentum Artifacts  > Module and click Next.
  3. In the Artifact name field, type HelloWorldModule and click Finish. The HelloWorldModule editor opens.create_module.png
  4. In the Implementation JARs section, click the Add… button, select hello from the list that appears, and click OK.
  5. For the Class name field, click Select…, select com.emc.examples.helloworld.impl.HelloWorld from the list, and click OK. This sets the HelloWorld class as the entry point for the module.
  6. In the Interface JARs section, click the Add… button, select hello-api from the list that appears, and click OK.
  7. Save the module. The hello.jar file and the hello-api.jar file are now associated with the module. You can now install the module to a repository.

2.3. Installing the BOF module

Now that you have created all of the needed artifacts, you can install the BOF module to the repository. Once installed, the module can be downloaded on-demand by clients that require it.

To install the BOF module:

  1. Ensure that your <Composer_root>\plugins\com.emc.ide.external.dfc_1.0.0\documentum.config is properly configured with the correct Docbroker information.
  2. In the Documentum Navigator view, right-click the HelloWorldArtifacts node and select Install Documentum Project…. The Install Wizard appears.
  3. Select the repository that you want to install the BOF module, enter the credentials for that repository, and click Login. If the login is successful, the Finish button is enabled.
  4. Click Finish to install the project.

2.4. Creating the HelloWorld BOF module client

Once the BOF module is installed, you can write a client to test its functionality. When writing a client, you must include all of the interfaces that your client requires in your classpath. In this case, the client requires the hello-api.jar interface JAR file. If you do not package the interface, the client is unaware of the API for the BOF module.

To create the client:

  1. Create the project:
    1. In Composer, click File > New Project….
    2. Select Java Project and click Next.
    3. In the Project Name field, type HelloWorldClient and click Finish. If prompted to switch to the Java perspective, do so.
  2. Add hello-api.jar and dfc.jar to the build path:
    1. Copy the hello-api.jar file from the <workspace>\HelloWorldBofModule\bin-int directory to the <workspace>\HelloWorldClient directory.
    2. Right click the HelloWorldClient node in the Documentum Navigator view and select Properties.
    3. Select Java Build Path from the left and click on the Libraries tab.
    4. Click Add JARs…, select HelloWorldClient > hello-api.jar to add the JAR file, and click OK.add_jar_client.png 
    5. Click Add External JARs…, select <Composer>\plugins\com.emc.ide.external.dfc_1.0.0\lib\dfc.jar, and click OK.
    6. Click OK again to close the Properties for HelloWorldClient window.
  3. Create the file for the client:
    1. In the Documentum Navigator view, right click the HelloWorldClient > src folder and select New > File. The New File window appears.
    2. In the File name field, type and click Finish.
    3. Specify values for the file as follows:[0]=<Docbroker host>
      dfc.docbroker.port[0]=<Docbroker port>
      # Global registry settings are optional, 
      # but the client will throw an exception if not specified
      dfc.globalregistry.repository=<global registry repository name>
      dfc.globalregistry.username=<global registry repository user>
      dfc.globalregistry.password=<global registry repository password>
  4. Optional: Create the file for the client. If you do not have this file, the log4j logger will use a default configuration, but will post warnings to the console.
    1. In the Documentum Navigator view, right click the HelloWorldClient > src folder and select New > File. The New File window appears.
    2. In the File name field, type and click Finish.
    3. Specify values for the file as follows:
      # ***** Set root logger level to WARN and its two appenders to stdout and R.
      log4j.rootLogger=warn, stdout, R
      # ***** stdout is set to be a ConsoleAppender.
      # ***** stdout uses PatternLayout.
      # ***** Pattern to output the caller's file name and line number.
      log4j.appender.stdout.layout.ConversionPattern=%5p [%t] (%F:%L) - %m%n
      # ***** R is set to be a RollingFileAppender.
      # ***** Max file size is set to 100KB
      # ***** Keep one backup file
      # ***** R uses PatternLayout.
      log4j.appender.R.layout.ConversionPattern=%p %t %c - %m%n
  5. Create the client class file:
    1. Right click the src folder and select New > Class. The New Java Class window appears.
    2. Specify the following values for the fields and click Finish:
      • Package — com.emc.examples.helloworld.client
      • Name — HelloWorldClientnew_client_class.png 
    3. Copy and paste the following code for the and save the file:
      package com.emc.examples.helloworld.client;
      import com.documentum.fc.client.IDfSession;
      import com.documentum.fc.client.DfClient;
      import com.documentum.fc.client.DfServiceException;
      import com.documentum.fc.client.IDfModule;
      import com.documentum.fc.client.IDfSessionManager;
      import com.documentum.fc.common.DfException;
      import com.documentum.fc.common.DfLoginInfo;
      import com.documentum.fc.common.IDfLoginInfo;
      import com.emc.examples.helloworld.IHello;
      public class HelloWorldClient {
           public void run(String repository, String user, 
                String password, String module){
              System.err.println("Connecting to repository [" + repository + 
                        "]     as user [" + user + "]");
              IDfSession session = null;
              IDfSessionManager sm = null;
            try {
                 sm = newSessionManager(repository, user, password);
              session = sm.getSession(repository);
              run(sm, session, module, repository);
           catch (Exception e) {
           finally {
              if (sm != null) {
                 if (session != null)
         private IDfSessionManager newSessionManager(String docbase, String user, 
                String password) throws DfException
            IDfSessionManager sm = DfClient.getLocalClient().newSessionManager();
            IDfLoginInfo info = new DfLoginInfo();
            sm.setIdentity(docbase, info);
            return sm;
         public void run(IDfSessionManager manager, IDfSession session, 
                String module, String repository) 
                throws Exception
           IDfModule idfModule = null;
                   idfModule = new DfClientX().getLocalClient().newModule(repository, 
                          module, manager);
           catch (DfServiceException e) 
           catch (DfException e) 
              IHello hello = (IHello)idfModule;
           public static void main(String args[]){
              HelloWorldClient client = new HelloWorldClient();
    "GlobalRegistry", "Administrator", "emc", "HelloWorldModule");
  6. In the Documentum Navigator view, right click the file and select Run As > Java application. The Composer console should output the message “Hello, World!” if everything runs correctly.

Automating the updating, building, and installation process with headless Composer and Ant

Now that you have a working BOF module and client, it is useful to have a process in place to update the BOF module in the repository automatically. Previously, you learned that when updating code in JAR files, you had to also remove and re-add the JAR file to the appropriate JAR definition if you wanted the JAR definition to pick up the new changes. You can automate this step with headless Composer, a command line driven version of Composer that is used for build and deployment. The Ant scripts that you will create automatically update the hello JAR Definition with the most recent hello.jar implementation JAR file, build the project, and install it to a repository.

To create the headless Composer Ant scripts:

  1. Extract the headless Composer package to a location of your choice. The extraction process unzips the package to a ComposerHeadless directory. In our examples, it is assumed headless Composer is unzipped to the C:\ drive.
  2. Modify the ComposerHeadless\plugins\com.emc.ide.external.dfc_1.0.0\documentum.config\ to specify the correct DocBroker information. You can also copy your existing file from UI Composer if you want to use the same settings.
  3. Create a directory named HelloWorldBuild in the ComposerHeadless directory.
  4. Create a batch file, ComposerHeadless\HelloWorldBuild\run.bat, that defines the necessary environment variables and runs the Ant scripts. An example batch file is shown in the following sample. You can modify the strings in bold to meet your environment needs:
    REM Set environment variables to only apply to this command prompt
    REM Sets the root location of headless Composer
    SET ECLIPSE="C:\ComposerHeadless"
    REM Sets the location of your source projects. This location gets copied into 
    REM your build workspace directory
    SET PROJECTSDIR="C:\My\Projects"
    REM Sets the workspace directory where Composer builds the projects that you 
    REM want to install to a repository
    SET BUILDWORKSPACE="C:\ComposerHeadless\HelloWorldBuild\build_workspace"
    REM Sets the workspace directory where Composer extracts built DAR files before 
    REM installing them to a repository
    SET INSTALLWORKSPACE="C:\ComposerHeadless\HelloWorldBuild\install_workspace"
    REM Sets the Ant script that builds your projects
    SET BUILDFILE="C:\ComposerHeadless\HelloWorldBuild\build.xml"
    REM Sets the Ant script that installs your projects
    set INSTALLFILE="C:\ComposerHeadless\HelloWorldBuild\install.xml"
    REM Delete old build and installation workspaces
    REM Copy source projects into build workspace
    REM Run Ant scripts to build and install the projects
    JAVA -cp %ECLIPSE%\startup.jar org.eclipse.core.launcher.Main -data %BUILDWORKSPACE% -application org.eclipse.ant.core.antRunner -buildfile %BUILDFILE%
    JAVA -cp %ECLIPSE%\startup.jar org.eclipse.core.launcher.Main -data %INSTALLWORKSPACE% -application org.eclipse.ant.core.antRunner -buildfile %INSTALLFILE%

    The JAVA commands above should be on one line each in your batch file. When running the XCOPY command in the batch file, Windows might return an error displaying “Insufficient memory.” This might occur if the filepaths during the copy process are too long. If this occurs, try changing your build workspace to a directory with a shorter name or download the Windows 2003 Resource Kit Tool, which contains ROBOCOPY, a robust version of the COPY command that can handle longer filepaths.

  5. Copy and paste the following Ant script to a file named ComposerHeadless\HelloWorldBuild\build.xml. This Ant script updates the HelloWorld JAR definition artifact with the most recent HelloWorld.jar file and builds and packages the project.
    <?xml version="1.0"?>
    <project name="HelloWorldBuild" default="package-project">
         <target name ="import-project" description="
         Must import a project before updating, building, or installing it">
         <target name="update-jardef" depends="import-project" description="
         Update JARDef with most current JAR file">
                   artifactpath="Artifacts/JAR Definitions/hello.jardef" 
                   contentfile="build_workspace/HelloWorldBOFModule/bin-impl/hello.jar" /> 
         <target name="build-project" depends="update-jardef" 
              description="Build the project">
         <target name="package-project" depends="build-project" description="
          Package the project into a DAR for installation">
              <delete file="HelloWorld.dar" />
                        dar="HelloWorld.dar" />
  6. Create an Ant script, ComposerHeadless\HelloWorldBuild\install.xml, that installs the HelloWorldArtifacts project to a repository. An example Ant script is shown in the following sample. You can modify the strings in bold to meet your environment needs:
    <?xml version="1.0"?>
    <project name="headless-install" default="install-project">     
         <target name="install-project" description="Install the project to
              the specified repository. must be configured">              
                   domain="" />   
  7. Go back to UI Composer and modify the file to print out “Goodbye World!”: Save the file.
    package com.emc.examples.helloworld.impl;
    import com.documentum.fc.client.IDfModule;
    import com.emc.examples.helloworld.IHello;
    public class HelloWorld implements IHello, IDfModule{
         public void sayHello() {     
              System.out.println("Goodbye World!");
  8. Run the run.bat file to begin the build and installation process.
  9. Run the HelloWorld client in UI Composer. The Composer console output should display, “Goodbye World!”. This verifies that the HelloWorldArtifacts project was updated and installed correctly to a repository.

Congratulations, you have successfully built a BOF module, built a client to access the BOF module, and automated the updating, build, and deployment of the module with headless Compose


Tags: , , , , ,

Object Fetch Vs DQL Fetch Performance

A Object Fetch call would retrieve all attribute information of the object from server, this information will then be cached on the client side DMCL cache.

A DQL query will only retrieve the attributes specified in the “select” statement of the query.

A dm_document object has around 70+ attributes. If you are only interested in few attributes of an object you should use a DQL statement to avoid retrieving unnecessary information, this becomes significant especially in a low bandwidth environment.

Object Fetch should used when most attributes of an object are wanted, and/or that attribute information is repeatedly needed in multiple places.

Note that a DFC Session.getObject() call is effectively a fetch call, you should avoid creating a IDfSysobject with session.getObject() just to look at a couple attributes of the object, use query.execute(iDfSession, IDfQuery_DF_READ_QUERY) and specify the attributes of interest in the select statement of the query instead.


Tags: , , , ,

Xense Profiler for DFC now open source

I’m really excited to announce that from today Xense Profiler for DFC – the Documentum 6.x performance profiling tool – is now an open source project (DFCprof).

The DFCprof project is hosted on Sourceforge where you can download fully-functional binaries or build your own copy from the source code. The software is totally free – that’s free as in ‘free beer’ as well as ‘free speech’!

DFCprof can be used in a number of different ways. Most people will be interested in using it as a standalone application to process a DFC trace file and create a performance analysis report.DFCprof basic architecture

Just download the application from sourceforge, extract the files and you are ready to go.

Alternatively you can embed the dfcprof-x.x.x.jar library into your java project and use the trace parsing facility from there. I’ll be posting more details on the DFCprof parser API in due course. I’ll also be talking about the roadmap for future DFCprof features. Feel free to drop me a line in the comments if there are particular things you would like the project to do.


Tags: , ,

How to create two session to two different CS docbases?

Include both the docbroker in the

public static IDfSession connectToDocbase1(String docbaseName,
     String username,
     String passwd,
     String domain,
     String docBrokerHost,
     String docBrokerPort) throws DfException {
 try {
  IDfSession session = null;
  IDfClientX clientX = new DfClientX();
  IDfClient client = clientX.getLocalClient();
  IDfLoginInfo loginInfo = new DfLoginInfo();

  if (docBrokerHost != null && docBrokerHost.length() > 0) {
   IDfTypedObject apiconfig = client.getClientConfig();
   String primaryHost = apiconfig.getString(“primary_host”);
   String primaryPort = apiconfig.getString(“primary_port”);
   apiconfig.setString(“primary_host”, docBrokerHost);
   apiconfig.setString(“primary_port”, docBrokerPort);
   try {
    IDfTypedObject serverMap = client.getServerMapEx(
    docbaseName, null, docBrokerHost, docBrokerPort);

    session = client.newSession(docbaseName + “@”
    + serverMap.getString(“i_host_name”), loginInfo);
   catch (DfException e) {
    if (primaryHost != null) {
     apiconfig.setString(“primary_host”, primaryHost);
    if (primaryPort != null) {
     apiconfig.setString(“primary_port”, primaryPort);
    throw e;
   apiconfig.setString(“primary_host”, primaryHost);
   apiconfig.setString(“primary_port”, primaryPort);
  } else {
   session = client.newSession(docbaseName, loginInfo);
  return session;
 catch (DfException ex) {
  System.out.println(“failed to connect ” + ex.getMessage());
  throw ex;

Leave a comment

Posted by on August 11, 2012 in DFC, Java


Tags: ,


Get every new post delivered to your Inbox.