Unit Testing Business Processes in Oracle BPM Suite 11g

One of the fundamental challenges that every individual or team experiences when implementing business processes using Oracle BPM Suite 11g is the ability to run unit tests. Managers and testers want unit tests from developers to ensure code quality and stability. Another important requirement is coverage, i.e. how to ensure that all possible scenarios represented by the business processes are properly unit tested. A stringent unit testing methodology, and a framework for executing the tests every time functionality changes or the business processes are deployed, is a much desired and essential feature. Unfortunately there is a dearth of information available for developers to create a complete and comprehensive unit testing framework. This lack of knowledge is a big deterrent, and this blog will cover, in detail, the soup to nuts of planning, creating and executing a unit testing methodology that provides almost hundred per cent coverage of all scenarios in a business process.

The Problem Statement

The credibility of a business process lies not only in the way it has been implemented but also in it being diligently tested for all outcomes and scenarios. Business processes implemented with BPMN 2.0 pose a greater challenge, as processes tend to be fairly unstructured and involve many components, such as decision functions, workflow tasks, events and services, to name a few. It is therefore very challenging to ensure that the processes being implemented are fully tested and hundred per cent compliant with how they are intended to function. It is no surprise that without concrete unit testing the QA phase of the project is marred with issues, and this is when you start to feel that many bugs could have been spotted and fixed during the development phase if there had been some mechanism to test the various routes of a process map. To add to the woes, I have seen many projects either performing these tests manually as a series of steps per test case or trying desperately to create a complex framework of APIs to initiate a process instance, obtain and complete task items, stub service components, etc. The problem with both approaches is that they are time consuming and relatively prone to errors. Because every project is usually on a tight schedule, these tests are not executed and the issues are not spotted during the development cycle. Another option is to use one of the various off-the-shelf testing suites available in the market, but they tend to be expensive, have a steep learning curve and may not always fit the bill.

This article takes a unique approach that employs the existing unit testing framework available with Oracle BPM Suite 11g together with proven scientific software testing techniques to achieve almost full unit test coverage for business processes.

The Sample Business Process

In order to demonstrate the philosophies expressed earlier and the testing techniques that follow, this article considers a simple employee expense approval process that has all the ingredients of any complex business process, i.e. it is fairly unstructured and has business rules, human workflow components, arbitrary cycles, gateways and events.

The process is fairly easy to understand. There are two actors in this process, viz. an Expense Approver (typically a manager) and a Finance Approver. An expense approval request event is evaluated against a business rule to determine whether it can be automatically approved. If so, it is sent to a finance admin to review and disburse payment if everything is in order. If the expenses exceed the auto approval limits, or if the finance admin rejects them, they go to the manager, who then has to approve or reject them. Only the manager can reject an expense for it to be recorded as rejected; the finance approver can only refer it back to the manager if he finds it to be incorrect. This may not be the most comprehensive expense management process, but for the purpose of this demonstration it is fairly suitable, given the commonalities.


With the problem statement laid out and the sample business process being considered, the challenge is how can we guarantee that the different routes in this process can be thoroughly unit tested. 


The modelling workspace and the version of Oracle BPM Suite used in this demonstration is 11gR1 PS5, although the concepts discussed are generic and applicable to all versions. I have also used MS Visio for the simplified block diagrams, but any diagramming tool can be used instead.

Testing Methodology

The testing methodology discussed here is holistically categorized into five different steps, with each step characterized by a distinct set of functions and objectives.


The first step is the structuring step wherein the process map needs to have a finalized structure with all scenarios being identified and modeled including the exception paths. The structuring step may begin with the definition of the highest level business process diagrams such as process choreographies or collaboration diagrams, etc. (https://beatechnologies.wordpress.com/2011/10/17/choreography-collaboration-and-oracle-bpm-11g/) but it is to be considered complete when the BPMN models are finalized and approved.


This step is generally required if the business process is not straightforward and plain vanilla, which it rarely is. A reconstruction of the BPMN model is necessary to evaluate the possible number of routes in the process map. The processes are simplified into block diagrams that uniquely identify all actors, events and sequence flows, with each of them marked generically. A couple of guidelines for the reconstruction:

  1. Represent each activity, event and sequence flow with a distinct name that conveys the process flow (RULE for a decision activity, SERVICE for sync/async communications with external systems, initial trigger and final outcome for events, approval actions for sequences, etc.). Use letters as suffixes if there is more than one similar activity.
  2. Optionally use colour coding for sequence flows to represent happy and alternative paths in a business process.
  3. Use markers on sequence flows to indicate their execution semantics (parallel, conditional, inclusive, etc.) with respect to other sequence flows from the same activity.

The following diagram represents a simplified skeleton of the expense approval business process with bare minimum semantics. As you can see, the model generalizes the name of each activity/event/sequence in the business process. Default and conditional markers are also used to represent the execution flow path from a given node.


There is just one more step to reconstruct the diagram further. As much as possible, try to group similarly marked activities together to make the diagram simpler. The final reconstructed diagram should ensure that each activity has uniquely different sequence flows originating out of it. If more than one similar sequence flow originates from an activity, group them as one. For instance, there are two default sequence flows originating out of the FINANCE activity and leading to the service activities SERVICE C and SERVICE D. These activities have to be combined/grouped into a single activity, with just one default sequence flow leading out of the FINANCE activity into it. The simplified and final reconstructed model is shown below.
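The grouping rule above can be sketched programmatically. The sketch below (hypothetical class and field names, not part of BPM Suite) deduplicates sequence flows that share the same source activity and marker, which is exactly what collapsing the two default flows out of FINANCE amounts to:

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class FlowGrouping {
    // A sequence flow is identified by its source activity and its marker
    // (e.g. "default", "conditional"). Record equality is component-based.
    record Flow(String source, String marker) {}

    // Keep only one flow per (source, marker) pair, preserving order.
    public static Set<Flow> group(List<Flow> flows) {
        return new LinkedHashSet<>(flows);
    }
}
```

Applied to the example, the two `("FINANCE", "default")` flows collapse into one entry, leaving a single default sequence flow out of the FINANCE activity.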



Having a simplified and reconstructed model is a significant milestone, as it is required to measure the number of linearly independent routes in a business process. This is an essential piece of information to arrive at the minimum number of test cases required to unit test the business process with full coverage. To determine this minimum for the expense approval business process, McCabe's cyclomatic complexity algorithm can be applied. Cyclomatic complexity is computed from the reconstructed control flow diagram of the business process. Basis path testing, first proposed by McCabe, tests each linearly independent path in the business process; in this case, the number of test cases will equal the computed cyclomatic complexity.

The cyclomatic complexity, M, is evaluated as:

M = S – A + 2E

where:

S = the total number of sequences (edges) in the restructured process diagram
A = the number of events and activities (nodes) in the process
E = the number of end events in the business process (the connected components)

Looking at the restructured process map for expense approval reveals that there are 10 sequences, 9 nodes (6 activities, 3 events) and 2 end events. Hence the number of test cases required for full test coverage can be determined as:

M = 10-9+2*2 = 5
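As a sanity check, the computation can be sketched as a tiny helper (illustrative only, not part of any Oracle API):

```java
public class CyclomaticComplexity {
    // M = S - A + 2E, with S = sequences (edges), A = nodes,
    // E = end events, as defined above.
    public static int mccabe(int sequences, int nodes, int endEvents) {
        return sequences - nodes + 2 * endEvents;
    }

    public static void main(String[] args) {
        // Expense approval process: 10 sequences, 9 nodes, 2 end events
        System.out.println(mccabe(10, 9, 2)); // prints 5
    }
}
```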

It is now easy to determine the test cases to be executed by a casual inspection of the reconstructed process graph. The following test cases show the various routes a process can take from the conditional nodes.

Test Case 1: Submit, Auto Approve, Finance Approve, Approve
Test Case 2: Submit, Auto Approve, Finance Reject, Refer
Test Case 3: Submit, Refer, Manager Approve, Finance Approve, Approve
Test Case 4: Submit, Refer, Manager Approve, Finance Reject, Refer
Test Case 5: Submit, Refer, Manager Reject, Reject

Another important use of McCabe's number (the cyclomatic complexity calculation) is to understand and limit the complexity of a business process. It is recommended that the complexity of a business process not exceed 10; if it does, it is advisable to split it into multiple processes.

Process Implementation

The implementation of the expense approval business process is not covered in this blog, but readers are free to create an expense business object, define the conditional logic, auto generate UI for the task forms and so forth. If you want to spare yourself that effort, an implemented process can be downloaded from here. The implemented process is shown below in the studio canvas.


Test Suite Creation

From a unit testing perspective, there are two things that should predominantly be tested. One is the business process itself and all its possible paths; this has been explained already and a strategy defined. The other is the business rule component. The approach to take depends upon the nature of the rules and their effect on the business processes. If the business rule merely determines a condition that defines the process path (followed by an exclusive gateway), then testing the business process should cover it. However, if the business rule executes conditions to create dynamic process data, then testing the business process is not sufficient; a more thorough unit testing of the business rule must also be carried out.

The three blog posts below cover testing of business rules extensively through a wide variety of means.




This blog intends to build upon the strategies covered in the above posts and refine them further to create automated tests that can be executed using Ant or Maven, with a detailed JUnit-style report published for them.

Unit Testing the Business Process

There may be many strategies to test a business process in an end-to-end manner, but this post focuses primarily on unit testing. This is more appropriate in scenarios where testing has to be carried out by stubbing out the behaviour of external systems that the business process deals with. A readily available, out-of-the-box tool to test a business process is the composite test framework. The interesting challenge, and the aim here, is how this framework can be used to test a business process with complex human tasks, events, business rules, etc.

  • Begin by creating a test suite by right-clicking on the testsuites folder under SOA Content in JDeveloper. Name this test suite ExpenseApprovalTest. This will create a folder by this name with some auto-generated content within it.


  • Next right click on the generated test suite and click Create Test to create a unit test case for the composite. It is advisable to put an apt name and description for the test case so that it explains the scenario being tested. This will generate an .xml file by the test case name in the tests sub-directory.


  • Double click the .xml file to open it in JDeveloper. The editor opens the test view of the composite with all the services, components and references. This test view allows you to create quite sophisticated unit tests by providing the following options.
    • Define Initiate Messages to start the business process if it has a message-based start. Otherwise, initiate events, or use the Initiate task to invoke a manually triggered business process.
    • Each wire leading from a service, component or reference can be configured with Wire Actions. Wire actions can emulate the response from the target component if the component is not implemented, and can also assert expected response(s) against the actual or emulated response from the output message of target components. Multiple assertions can be configured on the same wire action.


  • Double click on the binding element at the ApproveEmployeeExpense service. This launches the Initiate Messages wizard. Generate a sample message that conforms to the request type that this service expects.


  • The generated sample has dummy values which can be replaced with actual test values. In this case, use the following sample request. This message is used to invoke the submit expense application start event message in the business process.

Initiate Message

<submitExpenseApplication xmlns="http://www.rubiconred.com/ApproveEmployeeExpense">
  <Expense xmlns="http://www.rubiconred.com/bpmtest/empl">
    <Justification>Travel on Business</Justification>
    <!-- remaining employee and expense item elements elided -->
  </Expense>
</submitExpenseApplication>

  • As the message arrives in the business process, an expense record is created in the database with an initial status. The operation to insert a message in the database is one-way and hence from the testing perspective, it is good enough to assert what is going in. A valid assertion for unit testing for this test case is to validate the expenseTotal element in the input request and match it to the sum of all expenses in the expense list.
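For clarity, the consistency condition this assertion checks can be sketched in plain Java (class and method names here are illustrative, not part of the composite test framework):

```java
import java.util.List;

public class ExpenseTotalCheck {
    // The wire assertion's logical condition: the expenseTotal element in
    // the request must equal the sum of the individual expense amounts.
    public static boolean isConsistent(double expenseTotal, List<Double> amounts) {
        double sum = amounts.stream().mapToDouble(Double::doubleValue).sum();
        // Tolerate sub-cent rounding when comparing monetary values
        return Math.abs(sum - expenseTotal) < 0.005;
    }
}
```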


  • The next step in the business process is the business rule task, and the response (output) from the business rule can be asserted by modifying the wire action between the business process and the business rule. At the moment the business rule is configured to respond with a pre-approval status, i.e. whether an approval from the manager is required or not. This test case assumes that manager approval is not required, as the values in the expense item list determine the rule output. This can be asserted against the expected response. The business rule is configured to auto approve expenses of less than 10k for employees in the Sales business unit (determined by the integer value of 1).
  • At this point the business rule component can be opened to see how it processes the employee expense fact and what is the response action.
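The decision the rule encodes for this example can be sketched as follows. This is only an illustration of the logic described above, not the actual Oracle Business Rules artifact; the thresholds for business units other than Sales are omitted, as they vary:

```java
public class AutoApprovalRule {
    static final int SALES_BU = 1;            // assumed code for the Sales business unit
    static final double SALES_LIMIT = 10000;  // auto-approval threshold for Sales

    // Returns the pre-approval status the rule would produce.
    public static String evaluate(int businessUnit, double expenseTotal) {
        if (businessUnit == SALES_BU && expenseTotal < SALES_LIMIT) {
            return "Approved";   // auto-approved, goes straight to finance
        }
        return "Referred";       // routed to the manager first
    }
}
```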


  • As mentioned before, it is recommended to have a detailed unit testing suite created solely for business rules, as they may or may not determine the process path. The business process testing should be based on the rule outcome, which is either Approved or Referred, but the business rule outcome can depend upon many factors. For the example considered here, the business rule vets the outcome differently for different business units, i.e. the auto approval threshold depends upon the business unit the employee belongs to. Hence it is advisable to test the rules separately. A quick approach to do so is discussed shortly.

  • The next activity executed in this test case is the Administer Expense Payment human task assigned to the finance admin. Up to this point, all the activities being asserted against automatically produce a result/outcome. A human task, however, is a manual step in the business process performed by a process participant, so emulating a human task response in a test case is tricky. Fortunately the composite test framework allows emulating call-backs for human tasks that can mock task execution.

  • Begin by double-clicking on the wire between the business process and the FinanceApproval human task component. This, as usual, launches the wire action wizard. Select initiateTask from the list of operations to emulate its call-back. The call-back should be emulated when the task has been completed, and hence the callback operation to choose is onTaskCompleted.

Generating a sample for the call-back operation creates a workflow task request based on the task schema. It is complicated to create values based on this schema from scratch, so it is advisable to start from a generated sample when creating the test case. All values in the task XML are unimportant except those within the payload element under the root and the outcome element under systemAttributes. The payload content corresponds to the data object(s) defined for the human task and is used for passing values to the process variables, whereas the outcome determines the business process flow beyond the human task.


  • Repeat the same steps for the updateTaskOutcome operation. Behind the scenes, when a human actor logs into the task workspace the following things happen.
    • Depending upon the workspace preference and role, a list of tasks is displayed to the user. The queryTasks operation is used with the proper predicate for this purpose.
    • When a task item is selected the getTaskDetailsById function is executed to retrieve the details specific to the selected task.
    • All updates, captures and saves to the task in the workspace are persisted in the dehydration store via the updateTask operation.
    • When the task is finally actioned, its outcome and state are updated via the updateTaskOutcome operation, which sends a callback to the waiting business process instance about task completion.
  • The operations discussed here are just a high level overview of what happens to a human task in the general scenario. There are countless other possibilities for the stages a human task may go through in terms of assignments, renewals, expiries, etc. However, the business process does not care about all that; it simply waits for a callback from the human task, which happens either when the task is actioned or when it expires.
  • Emulating the callback for updateTaskOutcome will pass control back to the business process, which then proceeds depending upon the task outcome. The task outcome for this test case is set to Approved.


  • The following request can be used to emulate the task responses for this project. The outcome determines the action performed on the task, whereas the payload corresponds to what the user sees on the screen.

Task Response Payload

<task xmlns="http://xmlns.oracle.com/bpel/workflow/task">
  <title>Approve Expenses for Arun Pareek</title>
  <payload>
    <ExpenseApproval xmlns="http://www.rubiconred.com/bpmtest/empl">
      <ns2:Employee xmlns:ns2="http://www.rubiconred.com/bpmtest/empl">
        <!-- employee details elided -->
      </ns2:Employee>
      <ns2:ExpenseList xmlns:ns2="http://www.rubiconred.com/bpmtest/empl">
        <!-- first expense item elided -->
      </ns2:ExpenseList>
      <ns2:ExpenseList xmlns:ns2="http://www.rubiconred.com/bpmtest/empl">
        <!-- second expense item elided -->
      </ns2:ExpenseList>
      <ApprovalStatus xmlns="http://www.rubiconred.com/bpmtest/empl"><!-- status elided --></ApprovalStatus>
      <Comments>Expenses Approved</Comments>
    </ExpenseApproval>
  </payload>
  <systemAttributes>
    <outcome>Approved</outcome>
    <taskViewContext>Action Required</taskViewContext>
    <activityName>Administer Expense Payment</activityName>
  </systemAttributes>
</task>

  • After approval by the finance admin, the last step in the process is saving the payment record in the database. This, too, is a one-way operation, so from the unit testing perspective add a test check to ensure that the payment amount being saved equals the sum of all amounts in the payment list.
  • This is done by adding an assertion on the wire between the RecordEmployeeExpense component and the PaymentRegisterService reference in the test composite view.


  • Business processes ending with a message end event are (in most cases) asynchronous, whereas ones ending with an empty end event are one-way. The ExpenseApprovalProcess is an asynchronous business process, as it has a message-based end event that sends the final status of the approval request. This status can also be asserted to test whether clients invoking the business process get a valid business response; this is the final assertion in the test case.
  • The response from the business process can be validated by asserting the payload of the call-back operation expensesPaid. The expected value of approvalStatus for this test case is FINANCE APPROVE, which can be entered in the assert value text field.


Apart from the unit testing steps covered, there is no harm in adding more test conditions. This has to be determined on a use-case basis and by the quality of unit testing to be produced. Also add as many test cases as derived from McCabe's number.

Similarly test cases have to be added to test each possible path in the business process as determined by the cyclomatic complexity algorithm. Alternatively download the composite to see all the test cases created for this exercise.

Unit Testing the Business Rule Components

As explained earlier, it is recommended not to mix business rule testing and business process testing together. As you can see, there are only two possible approval statuses from the cascaded business rules below. The business process transitions based only on the business rule outcome, i.e. the approval status. However, the approval status itself can be derived in many ways depending upon the business unit of the employee submitting the expense and the sum of the expense amount.


To create unit tests for business rules, I would recommend going through my previous blog, which explains this in great detail. The blog can be accessed here:


One missing piece of the puzzle, however, is to plug the testing framework into Ant or Maven so that the tests can run as part of the build and deployment cycle. This is briefly discussed in this blog too.

Test Execution

Using Enterprise Manager

The easiest way to invoke the unit test suite is by deploying the composite and then initiating the testing from the enterprise manager console.

  • Start the SOA/BPM managed server and deploy the composite to the default partition.
  • Login to the enterprise manager and then navigate to the ExpenseApprovalComposite under the deployed partition.
  • Clicking on the Unit Tests tab will show all the created test cases under the test suite. Select all or some of the test cases and then click the Execute button.
  • This will prompt for naming the test run. Provide a name and then click OK to start the testing.


  • Once the tests have finished executing, the Test Runs panel displays the final status of the test run along with the status of each test case. A detailed report is also available in the trail showing how each assertion fared. The Enterprise Manager offers a great view for determining what the actual value of a given step was against its expected value. It also does a full XML compare to show the difference between expected and actual XML structures.
  • This is a great way to test/smoke test on demand to determine that changes made to the business process don't affect its core flow and functionality.


Using JDeveloper Studio

Business rules created as part of the project can also be unit tested using the out-of-the-box test features. A test suite XML can be created for a given decision function and may contain multiple test cases. Each test case comprises an input structure for the business rule and an expected output structure, which is compared against the actual output of the rule.

Running the test from the studio also creates an inline test report with all the details about each test case and the overall test suite.


These approaches are good and provide a means to ensure that proper checks and balances are executed before deploying any changes, incremental or otherwise. However, more rigorous quality checks can be implemented by automating the execution of these tests with each build and deployment cycle.

Test Automation

In order to industrialize the entire process of unit testing so that it runs with each deployment/release, it is essential that the test execution step be automated. It is highly desirable that with each minor/major deployment an Ant or Maven task can be executed to test the business process and the business rules and produce a release-specific test report. The composite test framework provides this capability out of the box through an Ant file (ant-sca-test.xml in the MW_HOME/Oracle_SOA1/bin directory), and it is very convenient to use. However, the problem with it is that it cannot test the business rules. This is a big limitation as far as total test coverage is concerned.

Overcoming the limitation is easy, however, and is explained in the steps that follow. I managed to create a utility framework in Java using the business rules APIs, with custom Ant tasks, to test business rules (once they have been created by following the approach covered above) and produce JUnit-style reports. The composite test framework can also produce HTML reports with the test execution statuses; however, these reports can be configured to follow JUnit standards too. Hence it makes sense to make all reports JUnit-based, particularly as tools such as Hudson can be configured to point to the report directories and automatically read the test report files, doing the rest in terms of embedding the test report per build/deployment.

  • Right click on the project in studio and create a new Ant build file from the project.
  • This generates two files (build.xml and build.properties) with all the classpaths and dependencies that the project is based on. This is a good starting point; modify and customize these files to add custom targets and properties to execute both the composite and rule tests.


The customizations to the build.properties file are specific to the middleware installation folder. The property file also needs access to the BPM server to deploy and unit test the composites. A sample property file from my project is copied below.

Configuring build.properties


# Environment Homes

# Output Directories

# SCA Test Parameters

The entire build.xml file can be downloaded from here. The two targets of interest for executing and reporting the composite tests are scatest and scareport. The scatest Ant target is executed by specifying a bunch of properties that are sourced from the build.properties file. An important parameter is the xsl attribute, which determines the format of the generated report. The default is overridden with the value junit to create JUnit-style reports.

Creating and Configuring the ANT build/property files

<property name="scatest.tasks.class.path" refid="classpath"/>
<taskdef name="scatest" classname="oracle.integration.platform.testfwk.ant.ScaTestTask">
  <classpath>
    <pathelement path="${scatest.tasks.class.path}"/>
  </classpath>
</taskdef>

<target name="scatest" description="Run the unit test cases in the Business Process" depends="compile,copy">
  <echo message="Classpath = ${classpath}"/>
  <echo message="Middleware Home = ${MW.HOME}"/>
  <echo message="Running scatest using ORACLE HOME = ${JDEVELOPER.HOME} ${WL.HOME}"/>
  <echo message="Using context = build.properties"/>
  <scatest compositedn="${scatest.partition}/${scatest.input}" timeout="${scatest.timeout}"
           xsl="${scatest.format}" runname="${scatest.runName}" resultsdir="${scatest.result}"
           context="${jndi.properties.input}"/>
</target>

<target name="scareport" description="Generate JUnit type report for composite test suites" depends="scatest">
  <echo message="Generate report to ${scatest.result}"/>
  <junitreport todir="${scatest.result}">
    <fileset dir="${scatest.result}">
      <include name="*.xml"/>
    </fileset>
    <report format="frames" todir="${scatest.result}/html"/>
  </junitreport>
  <!-- open the generated report in a browser (Windows) -->
  <exec executable="cmd.exe">
    <arg line="/C start ${scatest.result}/html/index.html"/>
  </exec>
</target>
These targets can, in the most basic way, be invoked by right-clicking on the build file and selecting Run Ant Target > scareport. The wider possibility, in the greater scheme of things, is to execute these targets from a build-control tool such as Hudson. Instead of Ant, these targets can even be mavenized.


The composite test report is currently generated in the SCA-INF/out/scatests directory and automatically launched when running the scatest target. It shows the execution status of each test case and detailed failure report.


Integrating the Rule Test Framework

In order to execute the testing of business rules from a continuous integration standpoint, I have developed a rule test framework wrapped in an Ant command to run the tests as well as generate JUnit-based reports. The test framework is generic: a business rule can specify any number of test suites, and the framework can read any number of test suite XMLs, unit test the business rule component against them and generate a report. The business rules test framework is a set of utility classes that in turn leverage the business rule APIs.
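The reporting half of such a framework boils down to emitting JUnit-format XML, which is what lets junitreport and Hudson consume the results. A minimal sketch of that idea (illustrative names, not the actual framework classes) could look like:

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class JUnitReportWriter {
    // Emits a bare-bones JUnit-style <testsuite> element; a real
    // implementation would also write one <testcase> per rule test.
    public static String report(String suiteName, int tests, int failures) {
        StringWriter out = new StringWriter();
        PrintWriter w = new PrintWriter(out);
        w.printf("<testsuite name=\"%s\" tests=\"%d\" failures=\"%d\">%n",
                 suiteName, tests, failures);
        w.println("</testsuite>");
        return out.toString();
    }
}
```

Because the output conforms to the JUnit XML shape, the same rulereport/junitreport machinery shown earlier can render it into HTML frames without any special handling.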

Creating Rule test and report targets

<target name="ruletest" description="Execute unit tests for business rules" depends="compile,copy">
  <echo message="Using path = ${classpath}"/>
  <java classname="com.rxr.ruletest.RuleTestRunner" classpathref="classpath" logerror="on" failonerror="true"/>
</target>

<target name="rulereport" description="Generate JUnit type report for business rule tests">
  <echo message="Generate report to ${ruletest.result}"/>
  <junitreport todir="${ruletest.result}">
    <fileset dir="${ruletest.result}">
      <include name="*.xml"/>
    </fileset>
    <report format="frames" todir="${ruletest.result}/html"/>
  </junitreport>
  <exec executable="cmd.exe">
    <arg line="/C start ${ruletest.result}/html/index.html"/>
  </exec>
</target>

The utility classes take care of report generation as well thus allowing robust testing that can be included as part of the build automation.



In closing, the benefits of establishing automation practices for every component developed as part of a business process hardly need restating. My colleague, Craig Barr, pointed out in his recent blog about SOA and BPM industrialization the importance of unit testing in a software delivery cycle. Craig mentions:

“Unit test everything! It is a myth that vendor tools don’t cut it when it comes to testing. You just need to know how!”

All files, resources and projects used in this blog can be downloaded from this location. As always, I would love to have your feedback, so please don't hesitate to provide any suggestions.


2012 in review

The WordPress.com stats helper monkeys prepared a 2012 annual report for this blog.

Here’s an excerpt:

19,000 people fit into the new Barclays Center to see Jay-Z perform. This blog was viewed about 100,000 times in 2012. If it were a concert at the Barclays Center, it would take about 5 sold-out performances for that many people to see it.

Click here to see the complete report.

Open a Workflow Task or Instance Trail from Oracle BAM 11g

An interesting requirement that I have recently seen many clients demand in their BAM dashboards is the ability to action or monitor process instances. Oracle BAM 11g typically provides the means to create real time dashboards of process metrics for business process stakeholders. However, many organizations are beginning to also use it as a starting point for more insight into instance details (such as in-flight status, flow trail, fault information, etc.). One recurring requirement in the past was to provide a mechanism to open actionable task forms directly from BAM. Some time back I was asked a similar question: how to open a process instance trail directly from Oracle BAM 11g to view its fault details. This is a natural demand from process owners, typically in a support role, who are alerted to fault notifications from BAM and would rather not go back and forth between the BAM dashboard and the Enterprise Manager console to correlate instances. This eventually led me to believe that it is in fact a sought-after requirement, and hence this article aims to provide a detailed, step-by-step guide to define and implement this use case.

In a nutshell, this article will demonstrate the creation of actionable links in a BAM dashboard to view the instance trail of a composite instance and from there dig into its audit trail or flow trace. Another useful feature demonstrated will be the use of Action Buttons in an Oracle BAM 11g dashboard to launch instances of human workflow tasks of a BPMN composite awaiting action from designated process users. When such an instance is clicked, the user is taken to the BPM Workspace UI and can take an action on the selected task.

This allows business users to seamlessly monitor business applications and processes and to correlate them with instances appearing across different UIs of the Oracle SOA Suite 11g product suite, such as BPM Workspace and the EM Fusion Middleware Control console. It is especially useful for organizations that intend to use Oracle BAM as an operational dashboard for process users, in addition to using it as a real-time business metrics monitoring and alerting dashboard.

Take a Sneak Peek at the Use Case

The following video shows a sneak peek of the use case to be demonstrated in the course of this article.

During the course of implementing the use case, the article will also describe the following activities in detail:

  • An Overview of creating and assigning values to a business indicator and publishing it to Oracle BAM data objects at runtime.

  • Configuration changes involved in sending process metrics data from a BPM process to BAM data objects, such as activating sampling points, enabling data targets, configuring the OracleBAMAdapter and altering BPMN engine MBeans.

  • Create streaming action lists in an Oracle BAM reporting view to track in-flight instances of a composite that are assigned to process owners and waiting to be actioned, as well as a list of completed instances.

  • Create an action list of completed instances and define an action to launch their audit trail in the Enterprise Manager console. This is useful as it can be extended/customized to instead create a list of faulted instances for users to gain detailed insight into fault details from the Enterprise Manager Fusion Middleware console.

  • Configure External Data Objects and create lookup fields in BAM.

  • Create and configure action buttons in Oracle BAM to open an external URL and pass the value(s) of a column in an action list for a selected instance to the URL term.

To gather process metrics and activity instance data in BAM to be able to build the reports, an existing BPMN based composite will be used. The composite project can be downloaded from the example demonstrated in an earlier BPM post on event based gateways. That article also describes in detail the use case the project tries to address.


The download link for the application workspace is provided in the aforementioned article. The soapui project to create sample instances and thus populate the BAM reports is provided at the end of this article.


The modelling workspace and the version of Oracle BPM and BAM Suite used in this demonstration is 11gR1 PS5, although the concepts discussed are generic and applicable to all versions.

Creating the Business Indicator(s)

Import the application workspace containing the modelled process into JDeveloper. To begin with, the article describes adding a business indicator to capture the credit card approval status as the instance progresses through multiple activities.

In the JDeveloper menu, click on View > BPM Project Navigator and open the AccountOpeningProcess business process. You will notice a Structure pane in the bottom left-hand side of the editor. If the Structure pane is missing, click on View > Structure to make it visible. Expand Business Indicators > Dimension and right-click on it to add a new dimension of type String. Name the dimension cardApplicationStatus. For simplicity this article uses just a single business indicator, though in real business processes you may end up having multiple KPI-collecting indicators.


Once an indicator has been defined, it has to be assigned values that depict the business-friendly state the process instance is currently in. Have a look at the red information rectangles for the AccountOpeningProcess in the screenshot below. They may well represent the various business-friendly stages of the application status. For example, when a New Application request is received by the process, the business indicator may be set to “APPLICATION IN PROGRESS”, indicating that it is currently being processed. If the customer for any reason wants to terminate his application midway while his initial request is in-flight, the application status may be transitioned to “REQUESTED CANCELLATION”. Similarly, there may be other intermediate statuses that can be captured in a long-running business process and presented to business stakeholders or process owners.


Populating a business indicator with a pre-determined value is straightforward; it is done the same way a value is assigned to any other process variable. Double-click on the New Application start message event and then on the Data Associations link. Simply drag the Expression icon onto the cardApplicationStatus metric variable under OnlineAccountOpening > Data Objects on the right-hand side of the Data Association wizard and assign it a static value of “APPLICATION IN PROGRESS”.


Similarly, double-click the Cancel Application intermediate event and assign REQUEST FOR CANCELLATION to the same business indicator from the Data Associations link. When the business process receives an intermediate request for cancellation while the task is still in progress, the Cancel Application intermediate event is invoked and the latest status of the process instance (held by the cardApplicationStatus business indicator) is replaced. Use the same technique to assign other possible approval states to the business indicator for other activities.
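Taken together, these data associations form a small state model for the indicator. The following Python sketch illustrates that model; the status strings come from this article, but the event names and the transition helper are purely hypothetical, to visualise how each event overwrites the latest status:

```python
# Hypothetical event-to-status map; status values are taken from the article,
# event names are illustrative only.
TRANSITIONS = {
    "NewApplication": "APPLICATION IN PROGRESS",
    "CancelApplication": "REQUEST FOR CANCELLATION",
    "Approve": "APPROVED",
}

def apply_event(instance, event):
    """Replace the instance's latest status, as the data association does."""
    instance["cardApplicationStatus"] = TRANSITIONS[event]
    return instance

instance = apply_event({}, "NewApplication")
instance = apply_event(instance, "CancelApplication")
# instance["cardApplicationStatus"] is now "REQUEST FOR CANCELLATION"
```

Each event simply replaces the indicator's value, which is why BAM always sees only the most recent business-friendly state for an instance.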


Configuring Sampling Points and Data Targets

Now that the business indicators are assigned appropriate values describing meaningful process states, the next step is to publish these indicators to BAM. Readers familiar with Oracle SOA Suite will know that sensors can be defined for the BPEL engine to automatically publish data to BAM data objects in real time as soon as values in the business process change. Unfortunately, as of now, sensors cannot be defined in Oracle BPM Suite. Instead there is the concept of sampling points. The BPMN Service Engine uses the defined sampling point configuration to decide when and where to publish process analytics information. Process analytics sampling points can be created for all activities, for just interactive activities, or disabled altogether. The process metrics can be published internally to the in-built cube star tables in the SOAINFRA schema or externally to BAM data objects. Sampling points can be configured by right-clicking on the BPM process in the BPM Project Navigator view and then clicking Project Preferences. This article needs sampling points to be generated for all activities in the business process, hence Generate for All Activities is selected from the Project Sampling Points radio options. In addition to the values of the defined business indicators, sampling points capture a host of other information such as performance and workload metrics. Also note that choosing Generate for All Activities will generate a lot of data to be published to BAM. More information about process analytics and sampling points can be found in the official Oracle documentation.


By default, sampling points only publish process metrics to the internal CUBE_* tables in the SOAINFRA schema. To publish these analytics data externally to BAM, click on the Data Targets tab and check the Enable BAM option. Also change the default Data Object Path to something like /Customer/AccountInitiation. This ensures that all data objects pertaining to a project are created in the folder hierarchy entered in the data object path field. The external publishing of BPMN engine data to BAM data objects in the active data cache is taken care of by the OracleBAMAdapter. This adapter has two predefined connection factories: it allows publishing the data to the BAM engine either through an RMI interface or through a web service interface. The BAM adapter JNDI names for these interfaces are eis/bam/rmi and eis/bam/soap respectively. The eis/bam/rmi factory provides better performance as it doesn't employ the BAM web service interface, and is preferable when the BAM and BPMN engines reside in the same JVM.


Configuring the OracleBAMAdapter

The integration glue that lets instance metrics flow from the BPMN engine to the BAM active data cache is the OracleBAMAdapter. The BAM adapter needs to have the JNDI connection factory selected in the previous step configured to point to the correct instance of the BAM active data server. The Oracle WebLogic Server administration console provides the mechanism to view and edit BAM adapter configuration properties. Log in to the administration console and navigate to Summary of Deployments > OracleBamAdapter > Configuration > Outbound Connection Pools and click on the eis/bam/rmi connection pool to view its default configuration properties. These properties may have to be changed to point to the correct instance of the BAM active data cache server. It is also important to note that any change in these adapter properties takes effect only when the changes are saved and the adapter configuration is updated. This typically happens when the adapter is either redeployed with the changes in the plan file or updated with the saved changes. Proper configuration of all of the above settings is important for the BPMN engine to be able to connect to the BAM server.


On Oracle BPM Suite 11g PS3 and above, with only these changes in place, simply deploying the BPM project will create the necessary data objects in the BAM active data cache. Open the Oracle BAM console and click on the BAM Architect view to see the list of data objects existing in the BAM active data cache server. Notice the AccountInitiation subfolder created under the DataObjects > Customer folder. Under this hierarchy will be the data objects containing the instance metrics data published by the BPMN engine. Later parts of this article will show how these data objects are used to create interactive reporting views in BAM.


Altering the BPMN Engine MBean Configuration to Allow the Engine to Post Data to BAM

As discussed earlier, the above set of configuration changes is sufficient to create the data objects when the business process composite is deployed to the application server. When an instance is created and executed, it is expected that these data objects will have process metric values populated in near real time. However, the data objects are not populated unless the action that suppresses process measurements from being sent to the BAM engine is turned off. Log in to the Oracle Enterprise Manager Fusion Middleware console and navigate to soa-infra > SOA Administration > BPMN Properties > More BPMN Configuration Properties to view the configurable BPMN engine MBean properties. Scroll down the list to the DisableActions property. The default value for this property is BAMCommand, which suppresses measurement actions from being sent to BAM. Clear this field (leaving it blank) and save the settings. If another instance of the BPMN process is now executed, the BAM data objects will begin to be populated with measurements obtained for the various process activities.


Creating the Reporting Dashboard in BAM

With the data objects able to receive measurement data from the business processes, the next step is to create the BAM reporting dashboard. The dashboard developed in this article has two horizontal regions, each displaying a streaming action list of its own. The idea is to have one of the horizontal views display recently completed instances, with a hyperlink defined to launch the detailed audit trail in a pop-up, while the other displays the list of in-flight instances. The view showing the in-flight instances will also provide an action button to launch the approval task form for the selected instance. As a desirable extension, a similar streaming list displaying faulted instances can also be created, leading process users enlisted as support stakeholders to the flow trail to learn the exact point and reason of failure.

Log in to the Oracle BAM 11g console and from the landing page click on Active Studio (this is where reports and alerts are created in BAM). Click on the Create a New Report link to open the tiled report template selection screen. Select the template titled Two equal horizontal tiles with thin separator. From the list of report types in each of the horizontal tiles select Action List, and title the report Credit Card Approval Scoreboard. Each of the report views will prompt for a data object from which the view will source its data content.

The upper horizontal tile is used to display a list of completed instances. For this view select BI_default_OnlineAccountOpening_AccountOpeningProcess as the data object and then click on the Fields tab. This lists all the fields in the data object. From a reporting standpoint, it may not necessarily make sense to display all the fields in the data object, only a few important ones. For the report in this article, the selected fields are COMPOSITE_INSTANCE_ID, BI_NAME, COMPONENT_START_TIME, COMPONENT_END_TIME, COMPONENT_INSTANCE_STATUS, COMPOSITE_NAME, COMPOSITE_REVISION, DOMAIN_NAME and METRIC_cardApplicationStatus. These fields, however, will appear in the report in alphabetical order. To change this, check the select boxes corresponding to these fields and use the Arrange icon to display them in a custom order. Optionally, the report can also be pre-sorted on the values of a particular field by moving the field under the Sorted Fields panel in the Sort tab.


Another important thing worth mentioning here is that if the view configuration is left just as is, a whole lot of seemingly duplicate instances will appear in the list. They are not actually duplicates: process measurement data is published for every activity, so the same composite instance will have as many rows in the data object as there are executed activities in the original business process. There will, however, be only one row with the latest snapshot of the process. In order to show only a unique, non-duplicated list of composite instances in the streaming list, the Filter tab can be used to add a filter expression that discards the non-latest snapshots of an instance. Add a new filter entry, select the LATEST field from the data object fields, choose the comparison type is equal to and equate it to a value of Y. For the report in this article, yet another filter is added to display only the approved instances, equating the METRIC_cardApplicationStatus field to a value of APPROVED. The list can be narrowed further by adding more filters.
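The effect of these two filters can be sketched in a few lines of Python. The field names follow the data object described above; the sample rows themselves are invented for illustration:

```python
# One row per executed activity; only the row flagged LATEST = 'Y' is the
# current snapshot of a composite instance. Sample data is made up.
rows = [
    {"COMPOSITE_INSTANCE_ID": "1001", "LATEST": "N",
     "METRIC_cardApplicationStatus": "APPLICATION IN PROGRESS"},
    {"COMPOSITE_INSTANCE_ID": "1001", "LATEST": "Y",
     "METRIC_cardApplicationStatus": "APPROVED"},
    {"COMPOSITE_INSTANCE_ID": "1002", "LATEST": "Y",
     "METRIC_cardApplicationStatus": "REQUEST FOR CANCELLATION"},
]

# LATEST = 'Y' de-duplicates the instance list; the status filter then
# keeps only the approved ones, exactly as the two BAM filter entries do.
approved_latest = [
    r for r in rows
    if r["LATEST"] == "Y" and r["METRIC_cardApplicationStatus"] == "APPROVED"
]
# approved_latest now holds a single, de-duplicated row for instance 1001
```

Without the LATEST filter, instance 1001 would appear twice in the list, once per published activity snapshot.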


The Properties tab allows changing the view name, fonts and other visual aspects of the streaming list. The final streaming list view should resemble the screenshot below.


Hyperlinking a Column in the Report with an Action

Well, the teaser video above showed that the composite instances in the view were hyperlinked, and clicking one pops up the audit trail (as seen from the composite dashboard of the Enterprise Manager Fusion Middleware Console) for the selected instance. Now for the creamy part of this article: creating and implementing actions in Oracle BAM and using them in a report. Select the Completed Instances streaming list tile and then click on the Change Report Properties link under the report Actions menu. This opens the Report Properties popup dialog. Click the Actions tab and then the New button to create and configure a new action for the report. Provide a name and an optional description for the action and hit Next.


The Action Type screen lists the various actions that can be performed on data in a report, such as executing a read/write action on data input or selections from an action list or form, or opening an external URL. From the options, select the Open a URL radio button and then click Next to open the Action Editor. The action editor provides a very convenient way to construct a URL as a combination of terms, which are concatenated in the end to form the final URL. These terms can be configured either to have constant values or to represent dynamic column values from any selected view in a report.


Before configuring the URL terms, it is necessary to know how to form the link to the audit trail of an instance. The audit trail of any composite instance can be directly viewed in the browser by using the link below.


The link, however, is made up of a lot of instance-specific information such as the host, port and partition of the SOA server where the composite is deployed, its version, and the instance id. Interestingly, all but the host name, port and the WebLogic server domain name can be figured out from the default data object itself. The remaining values can either be hard coded for each instance of the BAM server, or, more ingeniously, stored in a database table and read from there. The instance-specific information can be derived directly from the values of the selected row in the list view. Entering constant values for a URL term is pretty simple: choose the Mapping Type as Constant Value to enter the hard-coded part of the URL.


The dynamic part of the URL can be added in a two-step process. First select the Value from a List View radio button in the Mapping Type, and then select the view from the report skeleton containing the streaming list; this populates a drop-down with the data object columns used in the list. Select the column value that is needed in the URL. For instance, COMPOSITE_NAME, COMPOSITE_REVISION, DOMAIN_NAME (this is the partition, not the WebLogic domain name) and COMPOSITE_INSTANCE_ID may be required as dynamic values in the URL. There is no restriction on the number of terms that can be created as a combination of constant and lookup values to form the complete URL.


The final Open a URL configuration screen will contain multiple entries that are a combination of constant hard coded values and dynamic values from the list. These are concatenated together to form the actual URL for the hyperlink to be launched when the composite instance column is clicked in the list view. The wizard also allows editing the features of the popup. Click on the Click here to edit Windows features link and choose Full screen for both the width and height of the popup window to display it in full screen mode.
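The term-concatenation mechanism can be sketched as follows. The base URL below is a placeholder, not the real Enterprise Manager URL format, and the helper function is purely illustrative:

```python
# Sketch of how BAM assembles the hyperlink: an ordered list of terms, each
# either a constant string or a column value taken from the selected row.
def build_url(terms, row):
    parts = []
    for kind, value in terms:
        # "column" terms are looked up in the selected row; everything else
        # is treated as a constant string.
        parts.append(row[value] if kind == "column" else value)
    return "".join(parts)

terms = [
    # Hypothetical EM base URL -- replace with your server's actual link.
    ("constant", "http://emhost:7001/em/flowtrace?instance="),
    ("column", "COMPOSITE_INSTANCE_ID"),
]
row = {"COMPOSITE_INSTANCE_ID": "1001"}
url = build_url(terms, row)
# url == "http://emhost:7001/em/flowtrace?instance=1001"
```

In BAM the same assembly happens declaratively in the Action Editor: each term you define is one entry in that ordered list.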


Now that the action is defined, the last remaining bit is to configure the action button itself. Click on Next to be presented with the Button Formatting screen. Select the Display the button in a view radio button and, from the visual report skeleton, select the streaming list view to which the URL action needs to be added. As a hyperlink is desired on the composite instance id column, check the Display as a column of formatted links in the selected view option and from the Column to format dropdown choose COMPOSITE_INSTANCE_ID. Click OK on all the screens to save the changed report properties. With this, the configuration changes required for the Completed Instances view of the reporting dashboard are complete. The article will shortly demonstrate what the final view looks like and what the configured action does.


Adding the SOAINFRA Schema as an External Data Source

The following part of the article demonstrates another interesting aspect of the use case: viewing the task form of an instance that is awaiting an action from a process user. But before creating the report view, there is something else to take care of. In order to open a task form corresponding to a particular instance id, its taskId is required. Unfortunately, the default business indicator metrics data object does not contain the task id. The easiest way to look up the taskId for a composite instance is to query the WFTASK table in the SOAINFRA schema.

select taskid,state from wftask where compositeinstanceid=?;

The problem, however, is that composite instance id and taskId share a one-to-many relationship. The query to retrieve taskId from compositeInstanceId may return more than one row, and not all of them can be used to open the task form. Only the rows whose state column has the value ASSIGNED indicate the current task(s) pending action from process users. This is achieved by simply appending a filter on the state column.

select taskid,instanceid,ecid from wftask where compositeinstanceid=? and state='ASSIGNED';

It is therefore required that some mechanism is available in BAM to look up these fields from the database table. This is facilitated in Oracle BAM using external data sources. To begin, open the BAM Architect view from the landing page and click on External Data Sources. Click Create to configure an external data source, provide a suitable name, say SOA_INFRA, and connect to the [PREFIX]_SOAINFRA schema. Check out the screenshot below to see how to provide the schema name, password and connection string. Verify that the connection to the schema is established by clicking the Test button at the bottom.


Creating an External Data Object with the WFTASK table

The external data source can now be used to create an external data object. Browse to the AccountInitiation subfolder and click on the Create Data Object link to create a new data object. Name it TaskDetails. Check the option box beside External Data Object, indicating that this data object sources its content from a database table instead. Select the external data source created in the previous step and then select WFTASK from the dropdown under External Table Name. The columns of this database table are now available to be added as fields for the data object by clicking on the Add a field link. The fields added from the database table are COMPOSITEINSTANCEID, TASKID, PROCESSDUEDATE, ACQUIREDBY, ASSIGNEES, ASSIGNEEUSERS, CREATOR, STATE, INSTANCEID and ECID. Note that only a few of these fields will be needed to configure the report view. After the fields are added, click on Save Changes to save the data object. View the Contents tab of this data object to see the selected columns for all the rows in the database.


Adding Lookup Fields in the Business Indicator Data Object from the External Data Object

As discussed earlier, the streaming action list view displaying the in-flight composite instances with human tasks will source its reporting data from the BI_default_OnlineAccountOpening_AccountOpeningProcess data object. However, this data object also needs the additionally required fields, TASKID and STATE. These can be looked up from the TaskDetails data object. This is straightforward, as Oracle BAM data objects allow creating lookup fields that pull data from other data objects. Open the BI_default_OnlineAccountOpening_AccountOpeningProcess data object and click on Layout > Edit Layout to add lookup fields to it. Once in edit mode, click on the Add one or more lookup fields link at the bottom to open the lookup field definition popup. Select the TaskDetails data object and from the list of Lookup Fields click on the TASKID field. This lookup is based on fields common to both data objects. Select COMPOSITEINSTANCEID from the list under Fields to match from Lookup Data Object and COMPOSITE_INSTANCE_ID from the list under Fields to match from this Data Object. Click Add to define the lookup criterion. More than one such lookup criterion can be added if need be. Similarly, add another lookup field, STATE, from the TaskDetails data object, keeping the lookup criterion based on COMPOSITEINSTANCEID and COMPOSITE_INSTANCE_ID. Now the BI_default_OnlineAccountOpening_AccountOpeningProcess data object will also have the values of TASKID and STATE for a given composite instance.
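The lookup semantics amount to a simple join between the two data objects on the composite instance id. A minimal Python sketch (field names as above, sample data invented):

```python
# TaskDetails rows come from the WFTASK-backed external data object;
# bi_rows come from the business indicator data object. Sample data only.
task_details = [
    {"COMPOSITEINSTANCEID": "1001", "TASKID": "t-42", "STATE": "ASSIGNED"},
]
bi_rows = [
    {"COMPOSITE_INSTANCE_ID": "1001", "BI_NAME": "AccountOpeningProcess"},
]

# For each BI row, pull TASKID and STATE from the matching TaskDetails row,
# matching COMPOSITEINSTANCEID against COMPOSITE_INSTANCE_ID.
for bi in bi_rows:
    match = next(
        (t for t in task_details
         if t["COMPOSITEINSTANCEID"] == bi["COMPOSITE_INSTANCE_ID"]),
        None,
    )
    bi["TASKID"] = match["TASKID"] if match else None
    bi["STATE"] = match["STATE"] if match else None
```

After the loop, each BI row carries the task id and state of its in-flight task, which is exactly what the report view needs.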


Configuring the Streaming Action List for Inflight Instances

Make sure everything is saved for the moment. Edit the original report template and for the second horizontal tile also select the view type as Action List. The streaming list view now has to be configured to show the relevant in-flight instances. Open the report in Edit mode. For the data object, select the same one used in the previous view, i.e. BI_default_OnlineAccountOpening_AccountOpeningProcess. Go to the Fields tab to select and arrange the following fields in this order: COMPOSITE_INSTANCE_ID, BI_NAME, COMPONENT_NAME, METRIC_cardApplicationStatus, STATE, TASKID. From the Sort tab add BI_NAME and METRIC_cardApplicationStatus under Sorted Fields.


As stated earlier, if no filters are applied to this list, it will end up displaying rows for all executed activities in the business process. In order to view a non-duplicated list of composite instances, only the latest rows, and just those assigned to process users, should be retained. Adding such filters is easy. Click on the Filter tab and click Add Entry to open the Filter Expression wizard. Add a filter for the STATE field having a value of ASSIGNED. Similarly, add another filter for the LATEST field having a value of Y. This will do the trick.


The display settings of a report are configured from the menu tabs under the Properties button. Click on the Properties button and under the General tab enter Live Instances for the View Title. By default, the streaming action list shows checkboxes against all the rows in the report. Change these to radio buttons so that only one instance can be selected at a time: go to the Actions tab and select the Single Select (radio buttons) option.


Once these settings are applied and saved, the streaming list will show the in-flight instances along with their task identifiers and pending actions. Optionally, the task creators, assignees, task expiration times, priorities etc. can also be shown in this list to give better visibility as to which tasks are assigned to which process users and which tasks need urgent attention. Also note that the column titles displayed in the screenshot below are different from the data object fields. This can be done by overriding their default names in the Text and Align tab under the Properties button.


Define an Action Button for the Selected Instances in the List

The final piece of the puzzle is to define an action button for a selected row in the report view. What is desired here is that when a row is selected, the action button should take the user to the approval form for the selected instance. This effectively means launching the business process workspace view of the task form, allowing it to be actioned if need be. Opening the task approval form is easy: the URL used to launch the task form simply needs the HOST and PORT on which the business process workspace is running and the task id of the composite instance (which has already been obtained and is displayed in the list).


The same set of steps discussed earlier to create an action button can be executed once again. Open the report in Edit mode, select the view and click on Change Report Properties. Click on the Actions tab and then the New button to open a new action editor wizard. Name this action button View Approval Page. In the next screen define the type of action as Open a URL. The URL for the action button can be configured by adding constant or dynamic strings (here referred to as terms) which, when concatenated, give the complete URL string.


Create a New Term with a constant value containing the entire URL string except the TASKID. Be sure to replace the values of HOST and PORT with those of the server instance running the business process workspace user interface. Add another New Term, but this time choose Value from a List View as the mapping type. The next screen displays the report skeleton; select the Live Instances view in the report. This shows all the data object fields configured for the list as dropdowns under the Select a column from this view column. Select TASKID and then click OK.
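The same constant-plus-column pattern applies here, with only the TASKID coming from the selected row. A minimal sketch; the workspace URL below is a placeholder, not the real BPM Workspace URL format, so substitute your own host, port and path:

```python
# Hypothetical workspace URL -- HOST, PORT and path are assumptions and must
# be replaced with those of your BPM Workspace installation.
workspace_base = "http://bpmhost:8001/bpm/workspace/taskDetail?taskId="

# The TASKID of the selected row (populated earlier via the lookup field).
selected_row = {"TASKID": "t-42"}

# Concatenating the constant term with the dynamic TASKID term gives the
# final URL launched by the View Approval Page action button.
task_form_url = workspace_base + selected_row["TASKID"]
```

Only the task id varies per row, which is why a single dynamic term is enough for this action button.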


In the following Button Formatting screen, simply select the Display the button in a view radio button under the Button Location panel. This adds the action button to the same view. For the rest of the screens simply click OK to save the report property changes.


Appreciating the End Result

Well, that’s it! After all these changes your BAM users will be able to see two streaming lists of completed/historical instances and in-flight instances. Users will also be able to view an instance audit trail by clicking on an instance in the list and, if they happen to be process users, action in-flight instances by selecting one and clicking the View Approval Page action button. This also allows you to create a seamlessly correlated view of instances across the multiple dashboards that Oracle SOA Suite 11g has to offer. It is important to note that not everyone in the organization will be able to view the task form for an instance; only users assigned to the process roles authorized to action the task will be able to do so. This ensures that the security aspects of this implementation methodology are also covered.


Downloadable Resources for the Project

You can import these data objects and reports into your BAM server by running the ICommand utility. The easiest way is to copy all the data objects and reports in the DataObjects.zip file to the <MIDDLEWARE_HOME>\Oracle_SOA1\bam\bin directory and then run icommand -cmdfile import.xml on the command line. Before running the ICommand utility, make sure that the BAMICommandConfig.xml file under the <MIDDLEWARE_HOME>\Oracle_SOA1\bam\config directory is configured to point to the correct instance of the BAM server. A detailed explanation of working with the ICommand utility can be read at http://docs.oracle.com/cd/E14571_01/integration.1111/e10224/bam_app_icommand.htm

Feel free to post comments, suggestions and possible improvements, and I will be glad to incorporate them here. If you have a similar requirement and are struggling to create a reporting dashboard similar to what is proposed, send me a message.

Hope this article helps!


Oracle SOA Suite 11g: Advanced Administration Topics

A few days ago, I had the pleasure of announcing the release of my upcoming book, the Oracle SOA Suite 11g Administration Handbook. It is my privilege once again to let everyone know that Chapter 10 of our book, which deals with some advanced administration topics, is available as a free download from the publisher's website. Here is the link for the chapter.


In addition to this chapter, the sample code for the book, containing a handful of administration scripts, can also be downloaded from


The entire table of contents of the book can be seen here


I am thrilled by the number of emails and accolades I have received so far regarding the contents of the book. I am sure it will be useful to Oracle SOA Suite 11g Administrators and Developers alike.

Oracle SOA Suite 11g Administrator’s Handbook

Your blogger has, for the past few months, been involved in what I am sure many will agree is a nightmarish activity. As a result, the blog posts have dried up a little: blogging requires both time and novelty, and I have been a little short on the first lately. What has consumed me all the while? Co-authoring a book on Oracle SOA Suite 11g administration and ensuring that it met our vision of providing readers with a definitive administration reference handbook. The book was announced a month ago and is finally out and available.


My fellow co-author, Ahmed Aboulnaga, has been quick to mention this on his widely popular blog, and you can read about the book here: http://blog.ipnweb.com/2012/08/oracle-soa-suite-11g-administrators.html

Ahmed, in his blog post, has already described in detail what this book is about, who it is for, and its contents. I will therefore refrain from repeating the same things here. I would, however, like to share a few insights that went into making the book and why we think it will be useful for readers.

Oracle SOA Suite 11g is the backbone of messaging and application integration in a service-oriented/event-driven architecture. Due to its sheer size, understanding the underlying components, services, configuration and their relationships can be a daunting task. But it doesn’t have to be! Faced with the many challenges and common pitfalls involved in real-world Oracle SOA Suite implementations, we realized an opportunity to provide a comprehensive yet practical solutions guide to both reactively and proactively manage the infrastructure. We have ensured that many of our real-world experiences of working with Oracle SOA Suite 11g infrastructure are captured in the book and that we deliver a comprehensive guide filled with grounded solutions, best practices, and pragmatic instructions.

We begin by providing an introduction to SOA and quickly move on to concise, useful information on managing SOA composite applications. We feel it is important to know how to manage composite applications, their deployments and their lifecycles to be successful in administering them.

First and foremost, this book is an attempt to equip the reader with everything they need to know about the core administrative and management activities around Oracle SOA Suite 11g. Our attempt throughout has been to give readers administration tips straight from the trenches, i.e. from our real-world experience. Among the many things covered are automation techniques to manage the infrastructure, troubleshooting the root causes of errors, devising strategies of reuse with respect to code, configuration and customizations, and performance tuning of each component individually or the infrastructure as a whole, apart from routine administration activities. The book has a wealth of information that can be applied to real-world scenarios and helps you understand the best ways to tackle core administration tasks in Oracle SOA Suite 11g. It has been structured to allow readers to open any chapter of interest, with earlier chapters covering more basic administrative concepts and the latter half moving to more advanced topics. Right from Chapter 1, we have made our best effort to take the reader through a step-by-step, tutorial-based approach offering a unique insight into the what, why and how of all administrative activities involved with Oracle SOA Suite 11g. Our aim throughout was to put forward a definitive guide to real-world administration of Oracle SOA Suite 11g: one that provides a monitoring, troubleshooting and tuning methodology, explains what and how to back up and restore in your environment, shows how to configure the infrastructure to implement a policy-based security framework, offers reusable scripts for common administration tasks such as deployments, tuning and migration, and delves into advanced topics such as silent installs, cloning, backup and recovery, and provisioning highly available infrastructure.

The instructions and guidelines provided in the book offer plenty of handy tips and best practices, all based on real-world problems and solutions, that Oracle SOA Suite 11g Administrators can benefit from to significantly increase their productivity. We are certain you will feel armed with all the “to-dos” and “know-hows” of basic and advanced administration after reading this book. The skills and insights learnt from each chapter can readily be applied to real-world administration.

Part of our writing style in this book draws heavily on the philosophy of reuse, and as such we provide ample executable SQL queries and WLST scripts that administrators can reuse and extend to perform most administration tasks, such as monitoring instances, processing times and instance states, and performing automatic deployments, tuning, migration and installation. These scripts are spread over the chapters of the book and can also be downloaded from http://www.packtpub.com/oracle-soa-suite-11g-administrators-handbook/book

Finally, let me make a brave and best attempt to summarize in brief the key learnings from the book:

  • An overview of the consoles and key administration areas
  • Monitoring instances, messages, and composites
  • Configuring audit levels and achieving end-to-end monitoring through the use of extended logging
  • Administering and configuring components within Oracle SOA Suite 11g such as the BPEL, Mediator, BAM and EDN engines, services, and references
  • Monitoring and performance tuning of service engines, the WebLogic server, threads and timeouts, file systems, and composite applications
  • Interacting with and configuring the infrastructure MBeans using both Oracle Enterprise Manager Fusion Middleware Control and WLST-based scripts
  • Troubleshooting approaches to identify faults and exceptions through extended logging and thread dumps, plus solutions to common start-up problems and deployment issues
  • Securing the components deployed to the infrastructure by leveraging Oracle Web Services Manager along with the supporting infrastructure configurations
  • Managing the Metadata Services Repository, the dehydration store, and backup and recovery (covered in the later chapters)
  • A use case depicting the entire set of steps involved in upgrading from one patchset to another and migrating from SOA Suite 10g to 11g
  • Advanced topics such as silent installations, cloning, and high availability installations

Finally, I will reiterate the statement made by my co-author: this book is not a re-hash of Oracle’s documentation on Oracle SOA Suite 11g. It offers insights into SOA Suite administration that you will not find elsewhere. If you are an Oracle SOA Suite Administrator, or aspire to be one, then this book is your bible. It gives you everything you need to know about all your tasks and lets you apply what you learn in your everyday work right from the first chapter.

The book is available at the following websites:

Paperback and eBook versions:

Kindle version:

Any feedback, comments and inquiries are most welcome!

Arun Pareek


Custom XPath Functions in OSB 11g to Lookup Shared DVMs

The Problem Statement(s)

I was recently working on a big OSB implementation for one of our clients using the Oracle SOA Suite 11g platform. Interestingly, we had to use both OSB and components of the SOA Suite 11g platform to cater to different integration needs and business scenarios. One common thing we needed across both, however, was DVMs and XREFs. It would be naive to highlight this as a problem, as it is well known and well blogged about; there have been numerous blog posts showing workarounds to implement DVM-like functionality in OSB. However, none of them worked for our implementation. To highlight a few reasons, here were our guidelines and the challenges thereof:

  • All DVMs were deployed as a shared and reusable project in the MDS. Needless to explain why we did this: once DVMs are deployed as a shared project to the MDS, they become available from the SOA Composer and can be edited/changed at runtime.
  • These DVMs were to be used across both OSB and BPEL/Mediator to look up values. Now, there is obviously no direct way to access DVMs deployed as a shared project, or within a composite, from OSB, so we relied on creating a composite, exposed as a service, that is invoked by OSB, has a Mediator look up the DVM, and returns the response.
  • This workaround worked for us, but with the baggage of hundreds of calls being made between OSB and a SOA Suite service just for DVM lookups.

Well, I am sure everyone will say that DVM-like functionality will be introduced in future releases of OSB. But what about until then? And once Oracle releases a new version of OSB, say 12c, do we really expect every 11g implementation in the world to migrate over to it? The question that led me to ponder, and thereby resulted in this article, was: “Is it that difficult to reuse the existing DVM XPath functionality (and the DVMs that are deployed to the SOA infrastructure) in OSB?”. It turned out that it is difficult, but with a little effort I was able to put the pieces together, and I will explain in this blog, step by step, how it was all done.


  • Oracle SOA Suite 11gR1 (PS2 and higher)
  • Oracle Service Bus 11gR1 (PS2 and higher)

Creating and Deploying the Shared DVM Project

Let me start by creating a very basic composite containing just a DVM (CountryCode.dvm). This is good enough to test all three DVM lookup functions available in Oracle SOA Suite 11g. The DVM just stores a couple of records of country names and their country and currency codes. The SharedDVMProject is then deployed to the SOA infrastructure so that the DVM is available in the MDS.
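For illustration, a minimal CountryCode.dvm could look like the sketch below. The column names (CountryName, CountryCode, CurrencyCode) and the rows are assumptions for this walkthrough, not the actual project file; verify the exact schema against a DVM generated by JDeveloper in your environment.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical CountryCode.dvm: maps country names to country/currency codes -->
<dvm name="CountryCode" xmlns="http://xmlns.oracle.com/dvm">
  <description>Maps country names to country and currency codes</description>
  <columns>
    <column name="CountryName"/>
    <column name="CountryCode"/>
    <column name="CurrencyCode"/>
  </columns>
  <rows>
    <row>
      <cell>United Kingdom</cell>
      <cell>GBR</cell>
      <cell>GBP</cell>
    </row>
    <row>
      <cell>United States</cell>
      <cell>USA</cell>
      <cell>USD</cell>
    </row>
  </rows>
</dvm>
```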


Once the resource gets deployed to the SOA infrastructure the easiest way to locate it externally is by accessing it over http. The deployed DVM should be available at http://<host>:<port>/soa-infra/services/<partition>/<compositeName>/CountryCode.dvm. Replace these values of host, port, partition and the composite name with values belonging to your environment and open the address in a browser to see the DVM xml.


Oracle SOA Suite 11g supports primarily three ways to look up DVMs. For a full read on working with DVMs refer to the link http://docs.oracle.com/cd/E14571_01/integration.1111/e10224/med_dvm.htm.

The three available XPath DVM functions are

  • dvm:lookupValue(dvmMetadataURI as string, sourceColumnName as string, sourceValue as string, targetColumnName as string, defaultValue as string) as string

The above function returns the value of the target column in a domain value map for the row where the source column contains the given source value. If the target value is not found, the default value is returned.

  • dvm:lookupValue(dvmMetadataURI as string, sourceColumnName as string, sourceValue as string, targetColumnName as string, defaultValue as string, (qualifierSourceColumn as string, qualifierSourceValue as string)*) as string

This function also retrieves the value of the target column in a domain value map for the row where the source column contains the given source value. If, however, your DVM contains qualifier columns, you can additionally pass multiple qualifier columns and their values to the lookup.

  • dvm:lookupValue1M(dvmMetadataURI as string, sourceColumnName as string, sourceValue as string, (targetColumnName as string)? ) as nodeset

This one-to-many DVM function returns an XML document fragment containing values for multiple target columns of a domain value map, where the value of the source column is equal to the source value. You can provide all the target columns for which values need to be retrieved as comma-separated string names.

The arguments, as explained on the Oracle documentation site, for these various functions are:

  • dvmMetadataURI – The domain value map URI. Here in our example you could simply pass CountryCode.dvm
  • sourceColumnName – The source column name.
  • sourceValue – The source value (an XPath expression bound to the source document of the XSLT transformation).
  • targetColumnName – The target column name.
  • defaultValue – If the value is not found, then the default value is returned.
  • qualifierSourceColumn – The name of the qualifier column.
  • qualifierSourceValue – The value of the qualifier.
  • targetColumnName – The name of the target columns. At least one column name should be specified. The question mark symbol (?) indicates that you can specify multiple target column names.
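To make the semantics concrete, here is a hypothetical snippet showing how the first function could be called from an XSLT transformation in a Mediator or BPEL component. The payload, column names and default value are illustrative assumptions; the dvm function namespace shown is the one typically generated by JDeveloper, so verify it against your own transformation map.

```xml
<!-- Illustrative XSLT usage of dvm:lookupValue; assumes a CountryCode.dvm
     with CountryName and CurrencyCode columns deployed to the MDS -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:dvm="http://www.oracle.com/XSL/Transform/java/oracle.tip.dvm.LookupValue">
  <xsl:template match="/">
    <currency>
      <!-- Returns "N/A" when the country has no row in the DVM -->
      <xsl:value-of
          select="dvm:lookupValue('CountryCode.dvm', 'CountryName',
                                  'United Kingdom', 'CurrencyCode', 'N/A')"/>
    </currency>
  </xsl:template>
</xsl:stylesheet>
```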

Importing the Custom DVM XPath Library into OSB

Having said this, we would probably like to see support for all three DVM functions if we are building a custom XPath library for them in OSB. The custom DVM XPath library that I have created, and that you can use, provides an implementation of all three functions. However, the usage semantics are a little different, and I shall explain how each function can be used. Before proceeding further, if you want to know how custom XPath libraries can be added to both OSB design time and runtime, refer to these blogs that contain some useful demonstrations.



Once you know the basics of how custom XPath functions can be made available to OSB, download the osb-dvm-xpath.jar available from the link here and copy it into the <MW_HOME>\Oracle_OSB1\config\xpath-functions folder.

You will also need an .xml file (here custom-osb-dvm-xpath.xml), as shown below, to describe the usage of these functions. The XML file basically describes how you would like to name each function and which class in the package has a method implementing it.

<xpf:xpathFunctions xmlns:xpf="http://www.bea.com/wli/sb/xpath/config">
<xpf:category id="DVM XPath Functions">
<xpf:comment>Lookup Value DVM Function based on dvmLocation,source column, source value, target column and target value</xpf:comment>
<xpf:method>java.lang.String lookupValue(java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String)</xpf:method>
<xpf:comment>Lookup Value DVM Function based on dvmLocation, source column, source value, target column, target value and qualifiers</xpf:comment>
<xpf:method>java.lang.String lookupValueWQ(java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String,org.apache.xmlbeans.XmlObject)</xpf:method>
<xpf:comment>Lookup Value DVM Function based on dvmLocation, source column, source value and multiple target column qualifiers</xpf:comment>
<xpf:method>org.apache.xmlbeans.XmlObject lookupValue1M(java.lang.String, java.lang.String, java.lang.String,org.apache.xmlbeans.XmlObject)</xpf:method>
</xpf:category>
</xpf:xpathFunctions>

As you can see, the XML file declares three functions, namely lookupValue, lookupValueWQ and lookupValue1M, that mimic the ones available in Oracle SOA Suite 11g, albeit with a little difference.
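To give a feel for what such a library method might do internally, here is a simplified, self-contained Java sketch of a lookupValue-style implementation. This is not the actual code inside osb-dvm-xpath.jar: the class name, the inlined sample DVM and the parsing approach are all assumptions for illustration, and the real function would fetch the DVM XML from the soa-infra URL rather than take it as a string.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Illustrative sketch only -- NOT the contents of osb-dvm-xpath.jar.
public class DvmLookupSketch {

    // Sample DVM document; column names and rows are demo assumptions.
    static final String SAMPLE_DVM =
        "<dvm xmlns=\"http://xmlns.oracle.com/dvm\" name=\"CountryCode\">"
      + "<columns><column name=\"CountryName\"/><column name=\"CountryCode\"/>"
      + "<column name=\"CurrencyCode\"/></columns>"
      + "<rows><row><cell>United Kingdom</cell><cell>GBR</cell><cell>GBP</cell></row>"
      + "<row><cell>United States</cell><cell>USA</cell><cell>USD</cell></row></rows></dvm>";

    public static String lookupValue(String dvmXml, String sourceColumn, String sourceValue,
                                     String targetColumn, String defaultValue) {
        try {
            DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
            dbf.setNamespaceAware(true);
            Document doc = dbf.newDocumentBuilder()
                .parse(new ByteArrayInputStream(dvmXml.getBytes(StandardCharsets.UTF_8)));

            // Resolve the positional index of the source and target columns
            NodeList columns = doc.getElementsByTagNameNS("*", "column");
            int srcIdx = -1, tgtIdx = -1;
            for (int i = 0; i < columns.getLength(); i++) {
                String name = ((Element) columns.item(i)).getAttribute("name");
                if (name.equals(sourceColumn)) srcIdx = i;
                if (name.equals(targetColumn)) tgtIdx = i;
            }
            if (srcIdx < 0 || tgtIdx < 0) return defaultValue;

            // Scan rows for a cell matching the source value in the source column
            NodeList rows = doc.getElementsByTagNameNS("*", "row");
            for (int r = 0; r < rows.getLength(); r++) {
                NodeList cells = ((Element) rows.item(r)).getElementsByTagNameNS("*", "cell");
                if (srcIdx < cells.getLength()
                        && sourceValue.equals(cells.item(srcIdx).getTextContent())
                        && tgtIdx < cells.getLength()) {
                    return cells.item(tgtIdx).getTextContent();
                }
            }
            return defaultValue;
        } catch (Exception e) {
            return defaultValue; // on any parse failure fall back to the default
        }
    }

    public static void main(String[] args) {
        System.out.println(lookupValue(SAMPLE_DVM, "CountryName",
                "United Kingdom", "CurrencyCode", "N/A"));
    }
}
```

The same skeleton extends naturally to the qualifier and one-to-many variants: qualifiers just add extra column/value equality checks per row, and lookupValue1M collects several target cells into a result fragment instead of returning a single string.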

The three available XPath DVM functions are

  • dvmG:lookupValue(dvmLoc as string, sourceColumnName as string, sourceValue as string, targetColumnName as string, defaultValue as string) as string

The above function is exactly the same as the one available in the Oracle SOA Suite library. It returns the looked-up value for the target column.

  • dvmG:lookupValueWQ(dvmLoc as string, sourceColumnName as string, sourceValue as string, targetColumnName as string, defaultValue as string, qualifierArray element(*)) as string

This function enables a DVM lookup by providing the necessary qualifiers, only in this case, rather than passing multiple qualifier columns and their values as string arguments, the function accepts a document containing a list of qualifiers.

  • dvmG:lookupValue1M(dvmMetadataURI as string, sourceColumnName as string, sourceValue as string, targetColumnArray as element(*) ) as element(*)*This one to many dvm function returns an XML document fragment containing multiple matched values for all the queried columns for a particular source column/value pair. All the target column names should be passed as an element array instead of repeating string arguments instead.

The arguments for these custom functions are:

  • dvmLocation – This is similar to the domain value map URI. However, since the DVM will not be present in the OSB runtime but will instead be deployed to the SOA infrastructure, the URI has to be http://<host>:<port>/soa-infra/services/<partition>/<compositeName>/CountryCode.dvm instead of just CountryCode.dvm, i.e. the DVM name.
  • sourceColumnName – The source column name.
  • sourceValue – The source value (an XPath expression bound to the source document of the XSLT transformation).
  • targetColumnName – The target column name.
  • defaultValue – If the value is not found, then the default value is returned.
  • qualifierArray – An XML document containing repeating pairs of qualifier name and value, similar to the fragment shown here: <qualifiers><qualifierName>CurrencyCode</qualifierName><qualifierValue>GBR</qualifierValue></qualifiers>. Note that you may use any names for the XML elements, but the structure needs to be similar.
  • targetColumnArray – An XML document containing a repeating list of the target columns. For example, the input should look like <targetColumns><column1>CountryCode</column1><column2>CurrencyCode</column2></targetColumns>. Here also the element names can be anything, but the structure of the element array has to match the above.

Once the custom-osb-dvm-xpath.xml and osb-dvm-xpath.jar are placed in the <MW_HOME>\Oracle_OSB1\config\xpath-functions folder, start/restart your Eclipse IDE and the OSB server to see the custom DVM XPath library made available among the OSB functions. The following image also shows an XQuery file, dvmLookup.xq, that uses the custom library function dvmG:lookupValue by making all the arguments required for the function externally passed values from users.


Once done, execute this on the server by right-clicking on the file and choosing Run As –> Run on Server. This brings up the OSB Test Console for XQuery resource testing. Pass some valid values to look up a particular DVM target column value, as shown below, to see the response.


Similarly, create two more XQueries to test the other two functions. Alternatively, import this soa.osb.dvm.project.jar that has all the XQueries created for you to test. Look at how the qualifiers are passed in the image below while testing the lookupValueWQ function. The qualifier document is convenient to create, as you can pass as many qualifier name/value pairs as you need under the root document.


The lookupValue1M function is a bit different from the ones above, as it returns a document fragment containing values for multiple target columns all at once. This is useful in cases where you need to retrieve several target column values in a single call. The only tricky part is that the target columns should be placed inside repeating elements under a root-level document when passed as an argument.


All was good so far, as I could use already deployed DVMs from OSB itself through just a few XPath functions providing functionality similar to the ones in Oracle SOA Suite. All was good, that is, until I further realized that sending a complete URL for the MDS is a pain. For instance, when we use the DVM functions in SOA we just pass the DVM name, as in CountryCode.dvm, but to use the corresponding functions in OSB I had to use http://<host>:<port>/soa-infra/services/<partition>/<compositeName>/CountryCode.dvm instead. This turned out to be another problem, as the environments keep changing, and so do the partitions that these DVMs are deployed to. There were also a few hundred DVMs in use, so using the full URL to access a DVM was not acceptable and was ruled out.

In the next step, I refined the custom XPath functions to take just the DVM name instead of the entire URL. But the question was: how would the OSB runtime locate the DVMs that are deployed somewhere in the SOA MDS? Well, it turns out all I needed to do was write another utility to look for and extract the DVM from the MDS based on the DVM name.

The custom DVM classes just need the MDS database connection information to connect at runtime and retrieve the DVM metadata. I simply created a db.properties file, something like what is shown below, that holds some name-value pairs describing the database connection details. This file can be edited to replace the values with the ones corresponding to your environment and copied to the <MW_HOME>\user_projects\domains\<Domain_Home> folder of your OSB domain.

username=<MDS User Name>
password=<MDS Password>
partitionName=<Partition Name>
connName=<Any arbitrary Name for the MDS Connection>
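At runtime the library can read a file like this with the standard java.util.Properties API. The sketch below is illustrative only: the class name and the key validation are assumptions, and parsing from a string stands in for reading db.properties from the domain home.

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Hypothetical sketch of how the custom XPath jar could read db.properties.
// The key names mirror the properties file above; the rest is illustrative.
public class MdsConnectionProps {

    static final String[] REQUIRED_KEYS = {"username", "password", "partitionName", "connName"};

    // Parse properties from raw text and verify all expected keys are present.
    public static Properties parse(String text) {
        Properties p = new Properties();
        try {
            p.load(new StringReader(text));
        } catch (IOException e) {
            throw new IllegalStateException("Unable to read db.properties", e);
        }
        for (String key : REQUIRED_KEYS) {
            if (p.getProperty(key) == null) {
                throw new IllegalArgumentException("db.properties is missing key: " + key);
            }
        }
        return p;
    }

    public static void main(String[] args) {
        String sample = "username=mds_user\npassword=secret\n"
                      + "partitionName=default\nconnName=mdsConn";
        Properties p = parse(sample);
        // The partition name would later help locate the DVM document in the MDS
        System.out.println(p.getProperty("partitionName"));
    }
}
```

Validating the keys up front gives a clear error at server startup instead of an obscure failure on the first DVM lookup.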


So the next time you test any of the above DVM functions, you can either specify the complete URL (the HTTP address where the DVM is located) or just the DVM name.


Feel free to post comments, suggestions, and possible improvements, and I will be glad to incorporate them here. If you have a similar requirement, or are facing problems similar to the ones I faced, and would like the binaries of the jar, send me a message. Hope this helps!

Once again, the files used in this example can be downloaded from the links below