Tuesday, April 22, 2014

Facilitating Core Banking Transformation Through BPM and Rules

This is a guest blog entry by Scott Simmons of IBM. Scott is the BPM Solutions Architect for Banking and Financial Markets.

Faced with aging systems, banks are reaching decision points: buy a packaged solution, or build on and extend their existing core banking solutions.  Banks often find that packaged solutions will not meet their needs and opt for a transformation approach based on their existing solutions.  Using BPM and rules, organizations can extend existing core banking applications without sacrificing the intellectual capital embedded in their current solution.  We often find this done in conjunction with a service middle tier, normally implemented with an ESB.  As BPM and rules/events technologies have evolved over the last few years, this approach has become a common and well-tested pattern, enabling banks to iteratively replace portions of the system while still providing key core banking functions to the business.

To see how process and rules can extend an existing solution, let's take retail lending as a key application.  In the legacy situation, retail lending is normally a set of COBOL programs residing on the mainframe.  In the current regulatory climate, this is an impediment to change, as 3GL solutions are not very flexible.  In short, it becomes increasingly difficult to modify code on a continual basis.  Equally important, introducing new functionality such as mobile, social or analytics often requires code modifications as well.

So let's consider another approach.  First, understand that retail lending provides a number of high-level business functions comprising multiple processes, such as loan origination, loan servicing and loan closure.  We can model and simulate these functions quite easily without direct code development.  The process flow logic and underlying validation/calculation activities can be modeled as decision services (e.g., rules and events).  The process flows can invoke atomic and composite services such as receive loan application, check credit and approve loan, which in turn provide interfaces to the operational systems.  As a result, a process and decision management approach (based on a loosely coupled service design) provides a foundation both to extend current core banking applications (e.g., to deal with regulatory change) and to introduce new business functionality more easily and rapidly than traditional 3GL development.  We see this approach being adopted as a key pattern for addressing the challenges of aging core systems in the face of ongoing market and technology changes.
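As a rough illustration of the externalized-decision idea, here is a minimal sketch of a loan-approval decision service, written in plain Python rather than a rules engine. The field names and thresholds are invented for illustration and are not IBM ODM syntax; the point is only that the policy lives in data that can change without touching the process flow that invokes it.

```python
# Approval policy expressed as declarative rules (data), not 3GL code,
# so it can change without modifying the invoking process flow.
# Thresholds and field names are illustrative assumptions.

RULES = [
    # (condition, decision) pairs evaluated in order; first match wins.
    (lambda app: app["credit_score"] < 580, "decline"),
    (lambda app: app["loan_amount"] > 5 * app["annual_income"], "refer"),
    (lambda app: app["credit_score"] >= 700, "approve"),
]

def decide(application, rules=RULES, default="refer"):
    """Return the decision for a loan application dict."""
    for condition, decision in rules:
        if condition(application):
            return decision
    return default

# The process flow invokes the decision service as one atomic step:
print(decide({"credit_score": 720, "loan_amount": 200_000,
              "annual_income": 80_000}))  # approve
```

Changing the bank's risk appetite then means editing the `RULES` table, not redeploying the loan-origination flow itself.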

We see our clients normally initiating these types of transformation projects through top-down process and rules discovery. The discovery process enables business architects to use process modeling tools and techniques to identify and decompose key business functions and requirements specifications.  One of our clients uses IBM Blueworks Live to map thousands of key processes, many of which are implemented as functions on existing core banking solutions.  Although business process and rules management can be undertaken as strictly a discovery exercise with a tool like Blueworks Live, many tools provide the ability to develop and implement solutions.  These development tools often provide rapid prototyping and solution implementation, facilitating collaborative design and development between business and IT stakeholders.  This capability is a key feature of the IBM BPM and Decision Management offerings and a key reason for their use in numerous banking transformation projects.

Interested in hearing more?
Next week at IBM Impact, Scott will be hosting a session titled: Raise Expectations for the Always-on Enterprise
Session Number: BTA-1221
Date/Time: Thu May 1; 10:30 a.m. – 11:30 a.m.
Location: Marcello 4503

Scott Simmons is the Banking/Financial Markets Solutions Architect for IBM’s NA Business Process Management Solution Architect team. Scott specializes in the design, development and implementation of BPM solution architectures with a focus on banking/financial markets customers. Scott has deep experience in the areas of Service Oriented Architecture and Enterprise Modernization and is a Certified SOA Solution Designer.

Monday, April 21, 2014

The Five “Rights” of Business Process Management

Recently, while working with a client in the medical field, I was made aware of an interesting
adage used by those in the nursing profession. It’s known as The Five “Rights” of Patient Care
and is embraced by those responsible for ensuring that accurate and appropriate care is given to
patients. A mistake in their world could result in a life threatening situation. It is critical, therefore,
that mistakes are avoided. The Five “Rights” of Patient Care are:

  • Right Patient: Make certain you treat the right person. 
  • Right Drug: Check that you provide the right medication to the right person. 
  • Right Dose: Dispense the right amount of the right medication to the right person. 
  • Right Route: Ensure the medication is given the right way (e.g., orally, topically, intravenously). 
  • Right Time: If the medication is going to make the person go to sleep, it is probably best to give it to them at bedtime. 

The power of this adage is in its simplicity, and it made me think again about that fine book by
Atul Gawande, where he brings into focus the importance of checklists. The Five “Rights” is indeed a
checklist and one that, with only five points, is easy to commit to memory and easy to apply in
practice. When it is followed, it is very difficult to screw something up.

So I thought about how one might take this simple patient care checklist and adapt it for the BPM
profession: The Five “Rights” of Business Process Management, since “screwing things up” is
alarmingly commonplace in the world of process improvement.

We approach new initiatives with all the best intentions but too often fail to execute, or execute
then fail to show any real, verifiable benefits (ROI) from our work. Too often, businesses fall
victim to poorly planned, poorly executed process improvement efforts. They are left with less
money and greater frustrations when yet another attempt to repair their troubled, inefficient
processes has been unsuccessful. What then? Recommending reorganization or maybe buying
some new technologies are options. Those usually take the heat off for a while. Or, we could
focus on the basics with five simple checks. Let’s see what the Five “Rights” of Business Process
Management might look like.

Right Process 

Obviously, we have to consider process; we’re process professionals after all. That means Right
number one would be “Right Process.” For any number of reasons (politics, the squeaky wheel,
easy-to-work-with users), the wrong process is often targeted to be fixed. We should always ask the
question, “Is this process really the place we should be focusing our efforts?” Consider the
medical scenario a bit further. When a patient comes in to the emergency room complaining of
chest pains and a sore ankle, where does the medical staff focus their efforts? Most likely, they’ll
make certain the patient is not having a heart attack before tending to the ankle, even if the
patient may be complaining more loudly about the ankle.

It’s amazing how often process teams go to work on some back office process improvement
because they have support from departmental management or it’s just easier, while their critical
customer-facing processes are in cardiac arrest!

Another thing the emergency room staff does is very quickly hook the patient up to monitors,
checking vital signs, and so on. They do this for the obvious reason that without current
information (metrics), they can't possibly make an intelligent decision as to what course of
treatment is most appropriate.

Here again, we usually fail miserably in the “process” emergency room because we simply do not spend enough time or effort measuring and monitoring performance before we begin our
treatment. The result of this haste is often the death of the project, death of the process, or, in
extreme cases, death of the business.

Selection of the right process, especially in an organization where process thinking is not the
norm, is absolutely critical. It takes some intelligence and research to know what the “right” first
process is. Take time for some business triage before determining what processes are in need of treatment, which processes are likely to yield business value if treated, and which ones have the
greatest chance of recovery from the treatment. There’s little point in saving a process that really
should be left to die in the first place, or just left on life support. If a process does not add
tangible business value, don’t waste your time improving it.

Right Solution 

There are usually multiple ways to treat business issues. Too often, we take the easy route and
throw technology at the problem. If we again liken this to patient care, we have Right number
two, “Right Solution.” More and more, health care providers are pointing us toward natural
remedies; exercise and diet vs. medication, for example. If we consider how much business
value can be realized simply by modifying how we do what we do (i.e., true process
improvement), we may be able to either avoid investments in new technology or better leverage
the technology we already have.

When technology is necessary, we should be certain how it will be of benefit. If a healthcare
provider gave us a pill and said, “Not really sure what impact this is going to have on your
condition, but let’s give it a go,” it wouldn't instill much confidence. We want the provider to tell us
what we can expect.

Yet process and technology professionals will plead their cases for new technology not really
knowing how or if it will address the business problems it is purported to solve.

Applying technology to a business model with limited understanding of what can be expected is
simply unprofessional and should not be done. Granted, we can’t always know exactly how a
certain technology will perform in different environments, but we should be able to predict with a
high degree of confidence the business value we expect to realize. And if we are monitoring
performance as we should, we’ll know very early if we are realizing the desired results. Process
teams that are uncertain of outcomes quite possibly are looking at the wrong solution, or lack the
skills and analysis needed before prescribing an expensive and perhaps unnecessary treatment.

Right Scope 

Scope creep is nothing new to those of us who have come from the IT world. Software
development projects routinely struggle with scope management. Scope creep is expensive, time
consuming, and increases the risk of project failure. Right number three is “Right Scope.”

IT project scope usually creeps from added requirements and functionality, resulting in further
analysis and more lines of code to be written and tested. Business process projects can creep
for the same reasons, but requirements and functionality can be dramatically broader. Where an
IT project may need to add a field to a screen, or code some additional validation checks, a
process team may be directed to look at an additional line of business, or include a supply chain
with several business partners, or document process flows for branch operations at multiple
locations. These kinds of change requests can add months of effort to a project and cast the
team into a downward spiral of endless analysis and process modeling.

Prescribing the right dosage (the right scope) for an improvement effort can mean the difference
between success and failure. Taking two antibiotic pills a day can kill an infection in seven days;
taking four pills a day won’t kill the infection twice as fast. Teams that try to take on a
process that is too large will most likely become frustrated, lose management support, and fail to
deliver much, if any, value to the business.

Manage scope and don’t attempt a radical, cross-enterprise treatment unless it is absolutely
necessary. Smaller scope efforts can show value quickly and are more likely to be well received.
Process improvement will be recognized as a credible approach to treating business problems.

Right Message 

Process improvement is about change. People, in general, don’t like change. That’s just the way
it is. As process professionals, we need to adopt a bedside manner that will communicate
confidence and assurance and set people at ease with change. We need the “Right Message.”

How we position, market, and communicate the message of change will, in large measure,
determine how successful we are in effecting change. Do not take shortcuts or underestimate the effort necessary to develop a solid message.

Change and culture go hand in glove. Cultural impact must be assessed as part of process
improvement. Processes cannot mature independent of cultural change. Crafting the right
message means understanding the context in which that message will be received. Is the culture
one of rapid growth and frequent change? Has the organization been around for decades,
continuing to do things pretty much the way they have always done them? Does management
support and embrace change? Are people rewarded for creative thinking? Knowing the answers
to questions like these will help guide what messages need to be crafted and shared.

Don’t underestimate the need to market new ideas and concepts. Since change is hard and
makes people uncomfortable, we need to share information about new ideas more than just once
or twice via email. We need to allow people time to get comfortable with new ideas, to allay their
fears, and market the value of the change. Everyone needs to know “What’s in it for me?” We
may watch TV commercials over and over again and still not be able to name the product being sold.
Start marketing change well before implementation, crafting messages for both global audiences
and individual, one-on-one, water cooler conversations. Every change needs an “elevator pitch.”
If we want people to embrace change, we need to equip them with the appropriate information
and right message for them to do so.

Right Time 

Timing can be a friend or foe when it comes to process change. Consider all the activities that
are taking place in your organization at any given point in time; year end closings, end-of-quarter
sales pushes for quotas, reorganizations, moves to new locations, launching new products, the
list goes on.

As with understanding culture, it is important to understand what activities or events are occurring
in an organization that may be captivating everyone’s attention. If I’m being asked to deliver a
significant month end report, or push to close a pending deal, now might not be the right time to
hit me up with process change, no matter how wonderful that change might be. I’m simply not in
a position or in the right frame of mind to deal with it.

We need to know our audience, know what is going on in their world, and be empathetic to their
situations. Great ideas introduced at the wrong time can quickly become really bad ideas. Our
good work could be all for naught. We may not get another chance to introduce it again because
the idea, the process change, is now tainted. More than once I’ve heard it said, “No, we tried that
once before and it didn’t work. We’re not going to do that again.” In reality, it wasn’t a bad idea
at all, it was just bad timing.


No matter how long we have worked in a given profession, we can always benefit from refocusing on the basics to improve our performance by following some simple checks. Am I targeting the
right process, with the right solution? Is my scope too big? Will it take too long to succeed? Is
my message on target and informative? Is this really the right time, or should I wait until
conditions are more favorable?

  • Right Process  
  • Right Solution 
  • Right Scope 
  • Right Message 
  • Right Time 

Next week, Prolifics will be a Platinum sponsor at IBM Impact 2014, where our team will be showcasing our BPM solutions and recent successes. If you are planning to attend IBM Impact, be sure to visit Prolifics or schedule a one-on-one meeting with one of our experts.

To learn more about Prolifics' BPM solutions, visit www.prolifics.com or connect with Howard Webb:

Howard Webb is the BPM Advisory Services Director at Prolifics. Howard and his team provide consulting and guidance to clients transitioning to highly efficient Process Managed business models, and equip them for success in their BPM initiatives. For over 25 years he has been a consultant, trainer, facilitator, and speaker on the topics of Business Process Management (BPM), data architecture, and project management. He founded the Midwest BPM Users Group and has published articles on BPM and enterprise architecture. Prior to coming to Prolifics, Howard was founder and partner of Bizappia, a consulting and services firm focused on business agility, performance and innovation. Prior to Bizappia, he was a Sr. BPM Technical Specialist with IBM.

*This article originally appeared on www.bptrends.com.

IBM Impact Session Preview: Extending IBM WebSphere Service Registry and Repository to Legacy IBM WebSphere Application Server

Almost all organizations today have embraced the concept of SOA and have aligned it with the path to application agility. Yet bottlenecks remain; one comes in the form of legacy applications where the service provider's endpoint is stored either in a constant or in a property file. This restricts the ability to make changes at both the service provider and consumer ends.

Applications integrated with IBM WebSphere Service Registry and Repository (WSRR) have the ability to select endpoints based on policy. Not only are they insulated from changes at the service provider, but this also eases the overall governance process.
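The contrast with a hardcoded property-file endpoint can be sketched roughly as follows. This is a minimal illustration of the registry idea in Python, not the WSRR API (the session below covers the real Java/Spring integration); the registry structure, service name and endpoints here are all invented.

```python
# Instead of reading a fixed endpoint from a property file, the consumer
# asks a registry at call time, so the provider can move without any
# consumer code change. Registry contents are illustrative assumptions.

REGISTRY = {
    "CreditCheckService": [
        {"endpoint": "http://host-a:9080/credit", "environment": "prod"},
        {"endpoint": "http://host-b:9080/credit", "environment": "test"},
    ],
}

def resolve_endpoint(service, environment):
    """Select an endpoint by policy (here the policy is: match the environment)."""
    for entry in REGISTRY.get(service, []):
        if entry["environment"] == environment:
            return entry["endpoint"]
    raise LookupError(f"no {environment} endpoint registered for {service}")

print(resolve_endpoint("CreditCheckService", "prod"))  # http://host-a:9080/credit
```

In the real pattern the lookup goes to WSRR through its APIs, and "policy" can mean far richer selection criteria than an environment label.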

At IBM Impact, I will be hosting a session on this topic, in which we will focus on the integration of legacy Java applications with WSRR using its APIs.

Session Abstract:
Almost all organizations today have embraced the concept of SOA and have aligned it with the path to application agility. Bottlenecks can come in the form of legacy applications where the service provider's endpoint is stored either in a constant or in a property file. This restricts the ability to make changes at both the service provider and consumer ends. Applications integrated with WSRR have the ability to select endpoints based on policy. Not only are they insulated from changes at the service provider, but this also eases the overall governance process. In this session, we will focus on: a pattern for integrating WSRR with legacy Java applications based on the Spring Framework, the different scenarios in which it should be used, a deep dive into the development and maintenance aspects, and extending this pattern to make it framework-independent.

Session Number: INO-2740
Date/Time: Wednesday, April 30th; 2:15pm - 3:15pm
Location: Venetian-Marcello 4405

Add this session to your calendar!

To see a full list of Prolifics sessions, visit www.prolifics.com/impact-2014.htm

Animesh Jain is a solution architect at Prolifics with over 10 years of experience in application development and maintenance. He has worked on projects of varying complexity and magnitude and has served in various capacities as Technical Architect, SOA Lead and Senior Developer. Specializing in integration, Animesh takes a deep interest in understanding business issues, which helps him guide clients toward the best approach. He has also been involved in numerous pre-sales projects and enablement activities. He has mentored several colleagues and has created enablement documents on web services, SOA, BPM and WESB.

Thursday, April 17, 2014

Thinking Out-of-the-Box: A Unique Implementation of BPM Batch Process

One of a consultant’s biggest challenges is to fulfill our clients’ requirements entirely. Some requirements are straightforward and easy to implement, while others are quite complex and force us to think out-of-the-box. It is those complex requirements that bring out our best by making us venture into the unknown, ultimately resulting in innovative solutions.

One of my recent client engagements posed quite an interesting requirement. Below is a full description of how I solved it, with advice from other team members.

Client Requirement:
In a recent BPM project, we had a requirement to implement a batch process that runs every day at a certain time. Another requirement alongside that was that the time at which the batch process runs should be changeable. For example, a batch process is set to run at 12:00 a.m. every day per the present requirement; however, after three months we might need to run it at 3:00 a.m. every day instead of 12:00 a.m.

For this project, we were using IBM Business Process Manager V8.0.1 Advanced.

Challenges in Designing Solution:
  1. One of the obvious solutions I considered was a Time-Elapsed UCA, assuming it would allow us to change the trigger time at runtime. However, there is no way to change the trigger time of a Time-Elapsed UCA at runtime. It has to be changed using Process Designer, and deployment then needs to happen across all environments before the change is finally promoted to the Production environment.
  2. Another option was to use the WAS Scheduler to run the batch jobs. However, it involved a learning curve to develop a solution and would have jeopardized the timeline. Another consideration was that the client wanted a solution built with IBM BPM (Process Designer) capabilities, which would be easy for them to support.

Solution Description:
Here is the high-level design for implementing the above requirement for the batch process.

  • A batch process will run automatically once every day at a specific time. 
  • Use the Singleton design pattern such that at any point in time one and only one batch process instance is running in BPM, firing every day at the set time.
  • At any point, if the time for the batch process needs to be changed, we should be able to do it with the following steps:
    • Check if there is an existing instance of the BPM Batch Process and terminate/delete it.
    • Create a new instance with the new time to run the batch processes.
  • At the specified time, this process instance will invoke multiple batch process use cases, for example Batch Process UseCase 1, Batch Process UseCase 2 and so on, as in the high-level diagram below.
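The control flow described in the bullets above can be sketched roughly as follows. This is only an illustration of the logic: in the real solution it is a BPD with a timer event inside IBM BPM, and the timing helper and use-case names below are invented for the sketch.

```python
# One long-lived singleton instance waits until the configured time each
# day, runs every registered use case, then waits for the next day.
import datetime

def seconds_until(run_time, now):
    """Seconds from `now` until the next daily occurrence of `run_time`."""
    target = datetime.datetime.combine(now.date(), run_time)
    if target <= now:
        target += datetime.timedelta(days=1)  # time already passed today
    return (target - now).total_seconds()

def run_batch(use_cases):
    for use_case in use_cases:  # e.g. update statuses, send emails
        use_case()

# e.g. with the clock at 23:00, a 03:00 batch time is four hours away:
now = datetime.datetime(2014, 4, 17, 23, 0)
print(seconds_until(datetime.time(3, 0), now))  # 14400.0
```

In the BPD, the equivalent of `seconds_until` is the timer event's trigger calculation, and `run_batch` corresponds to the sequence of use-case invocations that follows it.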

Advantages of this approach:
  1. It is a generic solution which can be implemented once; batch process use cases can then be added at any time.
  2. It is not performance-heavy. Even though there is always an instance in the background, it consumes resources only while executing at the batch process time and releases them while in the wait state.
  3. It uses Process Designer capabilities, with no Java components, schedulers or properties files.
  4. It avoids the complications of running multiple schedulers at different server locations and on multiple nodes.

Limitation:
  1. A new instance needs to be created by the Admin when a change in time is proposed, either by running scripts, using the Human Service URL or via Process Portal.

Implementation Details using Process Designer

“BPM Batch Process” Business Process Diagram: 
Here is the implementation of the “BPM Batch Process” BPD, which is exposed for access by Admins. This BPD gets the time for the batch job execution using the SetBatchProcessTime Human Service and then invokes multiple batch process use cases at that time every day.

This implementation is based on the Singleton design pattern, i.e., at any given time only one process instance of this BPD should be active in BPM. If we want to change the time of the batch job, we call the SetBatchProcessTime human service, which terminates the old instance and then creates a new one with the new time.

“SetBatchProcessTime” Human Service:
The “SetBatchProcessTime” human service is used to set the batch process time so that the batch runs every day at the same time. The human service is started with “batchDateTime” as its input, which makes it possible to change the batch process time whenever required.

Since there can be only one instance of BPM Batch Process running at any point, when a new time is being set the service first makes sure no instance is already running; if one is, it terminates it before creating a new instance of BPM Batch Process with the new time.
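The terminate-then-recreate logic can be sketched in miniature like this. The instance store is a plain dict standing in for BPM's process instance list; in the real solution these are process instances managed through Process Designer artifacts, so everything here is illustrative.

```python
# Enforce the singleton: terminate any running batch instance before
# starting a new one with the new time.

instances = {}  # process name -> batch time of the single active instance

def set_batch_process_time(batch_time, store=instances):
    """Terminate the old BPM Batch Process instance (if any), create a new one."""
    previous = store.pop("BPM Batch Process", None)  # terminate old instance
    store["BPM Batch Process"] = batch_time          # create new instance
    return previous

set_batch_process_time("00:00")
previous = set_batch_process_time("03:00")
print(previous, instances["BPM Batch Process"])  # 00:00 03:00
```

Whatever invokes the service (URL, REST or Process Portal), the invariant is the same: after the call there is exactly one active instance, carrying the new time.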

Here is the implementation of the Human Service “SetBatchProcessTime”

Batch Process Use Cases:
Batch process use cases can be different implementations of business-relevant goals, such as changing the status of certain business cases, terminating business process instances, sending emails, and so on. These use cases can be any type of implementation: a BPD, a General System Service, an Integration Service or any other Process Designer artifact.

Invoking Batch Process

There are multiple ways to invoke this batch process via the SetBatchProcessTime Human Service.

The SetBatchProcessTime Human Service accepts the trigger time at which the batch process should be triggered every day. It will in turn instantiate the BPM Batch Process BPD.

This Human Service can be invoked in a number of ways, as listed below.

1. Using SetBatchProcessTime Human Service URL

SetBatchProcessTime can be invoked by administrators who are part of the “Admins” group using the URL shown below:



SPA = Process App Acronym
SetBatchProcessTime = Name of Human Service

2. Using SetBatchProcessTime REST URL

The REST URL which can be used for invoking SetBatchProcessTime Human Service using POST method call is:



SPA = Process App Acronym
SetBatchProcessTime = Name of Human Service
batchDateTime = Batch Process Time parameter (%22 is the URL-encoded double quote)
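For use in a deployment script, building that REST call might look roughly like the sketch below. The URL shape and parameter encoding are assumptions to be checked against the REST API reference for your IBM BPM version; the host, port, acronym and date value are placeholders.

```python
# Hedged sketch of building the REST start URL for the human service.
import json
import urllib.parse

def build_start_url(base, acronym, service, params):
    """Assemble a service-start URL; the path layout is an assumption."""
    query = urllib.parse.urlencode({
        "action": "start",
        "params": json.dumps(params),  # quotes become %22 in the query string
    })
    return f"{base}/rest/bpm/wle/v1/service/{acronym}@{service}?{query}"

url = build_start_url("https://bpmhost:9443", "SPA", "SetBatchProcessTime",
                      {"batchDateTime": "2014-04-17T03:00:00"})
print(url)
# An actual deployment script would POST this URL (e.g. with
# urllib.request) using credentials of an "Admins" group member.
```

The value of scripting it this way is exactly what the post describes: the instance creation becomes a repeatable deployment step rather than a manual Portal action.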

3. Using Process Portal
The SetBatchProcessTime Human Service can be invoked using Process Portal by exposing it as a Startable Service.

Process Portal:

For this particular solution, I exposed it as a REST URL invoked via the REST API in the deployment scripts, to automate creation of the process instance instead of doing it manually.


Yes, I was able to successfully meet the client requirement by designing and implementing a solution, but the real significance of this experience is how thinking out-of-the-box shaped the entire analysis, design and implementation. Not only did the solution use the Singleton design pattern, it was also generic and flexible enough to allow the batch process trigger time to change in the future, thereby sticking to good-practice guidelines, which is equally important in a solution.

I would like to thank the following individuals:
  • Soundarajan Mannathan for his advice. 
  • Salem Hadim for his advice and approvals. 
  • Eva Khalil for her valuable inputs on this blog piece.

Want to learn more about Prolifics?

Jimit Sukkawala is a Consultant in the BPM and Connectivity LoB at Prolifics, specializing in Service Oriented Architecture, System Integration and Business Process Management. He has been with Prolifics for the past 5 years and has worked across different industry domains, such as Financial Services and Leisure & Hospitality. He has worked on projects through the different phases of the software development cycle, most of them making it to production. His main expertise is in WebSphere Process Server, WebSphere ESB and IBM BPM. He holds a master’s degree in Management Information Systems from the University at Buffalo and a bachelor’s degree in Information Technology from the University of Mumbai.

Monday, April 14, 2014

Migrating IBM WebSphere Message Broker v6.1, 7, 8 to IBM Integration Bus v9 on Distributed Platforms - Checklist for a Successful Migration

IBM’s endeavor to offer one integration product capable of supporting .NET, Java, and various heterogeneous integration scenarios resulted in the launch of IBM Integration Bus (IIB). IIB v9 offers a standardized and flexible integration backbone. It allows placement of WebSphere Enterprise Service Bus (WESB), the Rules Engine, and WebSphere Message Broker (WMB) under one software umbrella. One can expect significant benefits, including enhanced support for .NET, an improved Global Cache, a new real-time performance monitoring tool, and many other new features and upgrades such as policy-based WLM and BPM integration.

The checklist below is a compilation of IBM Integration Bus Information Center v9 instructions and offers a guideline for a successful migration of the WMB v6.1, 7, and 8 Broker component to IIB v9 on distributed platforms. The checklist comprises the following topics: Migration of Broker Components and Migration of Development Artifacts.

Migration of Broker Components

The earliest version of WMB that one can migrate resources from is WMB v6.1. Where migration from an earlier version is required, one must migrate to WMB v6.1, 7 or 8 first; however, earlier-version migrations are not in the scope of this document.

Migration from the Windows x86 product to the x86-64 product is not supported, nor is migration from the Linux x86 product to Linux x86-64.

In addition, there are restrictions on migrating earlier versions to the IIB v9 Developer Edition. Supported operating modes are Full Edition, Remote Adapter Deployment, Express Edition and Standard Edition.

Finally, IIB v9.0 requires a minimum of WebSphere MQ (WMQ) v7.5.0.1 on distributed platforms. If publish/subscribe functionality is used by WMB v6.1, the WMQ queue manager must be migrated first, followed by the subscriptions and then the Broker component.

Remember: back up all of your resources prior to migration!

Migrating the WMB v6.1, WMB v7.0, and WMB v8.0 Broker Component

  • Back up all resources: configuration repository (WMB v6.1), Broker database (WMB v6.1), and development artifacts.
  • If using ODBC connections, back up the ODBC connection files.
  • If using configurable services, back up the configurable services properties.
  • Install WMQ v7.5.0.1 or later.
In-Place Broker Component Migration

  • Install at least the Integration Bus component in a new location on the same machine.
  • Remove the WMB v6.1, 7.0, or 8.0 Broker from the domain configuration.
  • Stop all channels connected to the WMB v6.1, 7.0, or 8.0 Broker.
  • Stop the WMB v6.1, 7.0, or 8.0 Broker.
  • Migrate the WMB v6.1 Broker's queue manager to the WMQ version appropriate for IIB v9.
  • Migrate pub/sub components if required.
  • Update ODBC connection files to the IIB v9.0 format if required.
  • Set up the IIB v9.0 command environment.
  • Run the mqsimigratecomponents command to migrate the WMB v6.1, 7.0, or 8.0 Broker. (mqsimigratecomponents migrates configuration data both forward and backward; use the “t” option for rollback. Broker databases, queues and the broker registry are migrated.)
  • Copy environment settings if required.
  • Start the IIB Broker.
  • Deploy migrated artifacts.

Parallel Broker Component Migration

  • Install at least the Integration Bus component in a new location on the same machine or on another machine.
  • Update ODBC connection files to the IIB v9.0 format if required.
  • Set up the IIB v9.0 command environment.
  • Create the IIB v9.0 queue manager on the appropriate WMQ version.
  • Create the IIB v9.0 Broker.
  • Copy environment settings if required.
  • Start the IIB Broker.
  • Create integration servers (recreate the WMB v6.1, 7.0, 8.0 execution groups).
  • Deploy migrated artifacts.

Migration of Development Artifacts

  • Development artifacts from previous versions cannot be shared with IIB v9.0.
  • Artifacts developed on IIB v9.0 cannot be deployed to earlier versions.
  • User-defined node JARs developed in earlier versions cannot be migrated to IIB v9.0.
  • User-defined nodes require extra diligence during migration.
  • Resources imported into IIB v9.0 cannot be used in earlier versions of the product.
  • Message flow projects are replaced with integration projects.
  • Message sets are read-only once imported and are not modifiable. (Modify message sets by enabling message set development under IBM Integration Toolkit -> Preferences.)
  • WMB v6.1 and v7.0 message maps (.msgmap) can be viewed, compiled and deployed, but they are read-only. To modify a .msgmap, convert it to a .map file.

Migrating WMB v6.1, WMB v7.0, and WMB v8.0 Development Artifacts

  • Development resources can be imported into IIB v9.0 as Project Interchange (PI) files.
  • Workspaces can be migrated into IIB v9.0.

PI Import:
  • Export development artifacts from the previous WMB version into a PI file.
  • Import the PI file into an IIB v9.0 workspace.
  • Convert integration projects into applications or libraries if required.

Workspace migration:
  • Point the IIB v9.0 IBM Integration Toolkit to the location of an existing workspace.
  • Click OK.
  • If errors occur during migration, apply a Quick Fix: right-click the warning or error in the Problems view, then click Quick Fix.

*Figure 1, 2: Migrating to IBM® Integration Bus v9.0
*All Content: IBM Integration Bus Information Center

Eva Khalil is a middleware solutions designer with more than 12 years of experience in systems and applications integration.
She has been working with Prolifics for the past five years.
Eva has successfully led many integration projects across industries including financial services, government, healthcare, insurance and utilities.
She is especially fond of WMQ and IIB.

Tuesday, April 8, 2014

BPM, Rules and Analytics: A Business Trilogy

It’s an exciting time to be a business process professional. The management discipline of BPM is mature, process modeling standards are stable and the technologies are at a point that allows us to provide the kinds of process innovation and agility that we have long asserted, but were unable to easily deliver.

We can now look at broadening our horizons beyond process workflow and the monitoring of basic operational metrics. Today’s process professionals can deliver incredible business solutions that include, not only process workflow, but also the powerful execution and management of business rules, broader, more complex business decisions and advanced business analytics.

This business ‘Trilogy’ of capabilities (Process, Decisions, Analytics) is changing the landscape and makeup of business solutions, business process design and the ability of organizations to rapidly change, adapt and respond to the volatile nature of today’s global market.

Not only are these capabilities available to the process professional, they are essential. Process efficiency alone is no longer sufficient to ensure business success. Consider the history of process improvement from the early 1900s, with Frederick Taylor's work on process standardization and Henry Ford's groundbreaking assembly-line automation, and then the 1980s, which brought total quality management, Six Sigma and Dr. Michael Hammer's business process reengineering.

These initiatives, revolutionary in their time, were focused essentially on improving process efficiency. In today’s ‘customer-centric’ world where the customer has greater power than ever before with mobility, the cloud and ubiquitous Internet access at their fingertips, process efficiency alone is simply not enough.

We must today look at creating, in IBM’s terminology, ‘Smarter Processes’. These 21st century processes must certainly be streamlined and efficient, almost a given, and be able to orchestrate process workflow across systems and data sources enterprise wide. However, modern business solutions cannot stop there.

Any process improvement initiative must consider the following:

  • How process execution can be dynamically changed through business rules management
  • How business decisions can be made both independent of business processes and in support of business process workflow
  • How processes can support customers' mobility demands
  • How big data and social media can be leveraged to not only respond to customer demands, but begin to predict future customer behavior

All this means we must go well beyond traditional business process improvement in developing business solutions. We’ve touched on process improvement to some degree.  Now let’s examine the other components of a business trilogy: Decisions and Analytics.

Business Decisions
A business decision is a collection of business rules, encapsulated to ‘hide’ the complexity of that decision from the user or recipient of that decision. A user could be a business process, system, web portal or other medium that may need an answer, a decision, without needing to understand the complexity that underlies that decision.

Business decisions, and the practice of Business Decision Management (BDM), can dramatically simplify and change how we go about business process improvement. Not only can BDM simplify business processes and process models, it can also enable greater agility and flexibility than ever before.

It is common process modeling practice to extract business rules from processes, leaving only simple decision points that determine navigation, or which path to take through the process (if 'Gold Customer' go one direction, if 'Platinum Customer' go another).

The process needs only to know what type of customer it is. The complex business rules that determine the customer type (Gold vs. Platinum) can be modeled in a business decision. The process can then query the business decision, much like a business service (SOA), to get the answer.
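As a toy illustration of this separation (the tier names, thresholds, and function are invented for this sketch, not taken from any real rules engine), the process sees only the decision's answer while the rules stay encapsulated:

```python
# A toy decision service: the process asks only "what tier is this customer?"
# All thresholds and tier names below are illustrative, not from a real system.

def customer_tier(annual_spend, years_as_customer):
    """Encapsulates the business rules; callers never see them."""
    if annual_spend >= 50_000 and years_as_customer >= 5:
        return "Platinum"
    if annual_spend >= 10_000:
        return "Gold"
    return "Standard"

# The process flow only branches on the answer, not on the rules themselves:
tier = customer_tier(annual_spend=60_000, years_as_customer=7)
if tier == "Platinum":
    next_step = "expedited-approval"
else:
    next_step = "standard-approval"
```

The point of the sketch: when the thresholds change, only `customer_tier` is edited; the process flow that branches on the answer is untouched, which is exactly the agility argument the article makes.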

We have been modeling business processes for some time; the Business Process Modeling Notation (BPMN) standard is quite mature, and tools like IBM's Blueworks Live support and enforce it.

Decision modeling is an emerging discipline also with standards, just not yet as mature as BPMN. A decision model provides a means of capturing very detailed business rules that support a business decision, and presents them in a standard model that can be communicated, implemented and managed. These business decisions can be shared, reused and leveraged across multiple business processes, systems and business scenarios.

Business processes become very simple indeed and extremely stable. Processes only need to be changed if a business model changes…like offering a new service to customers. The dynamicity of the business is managed outside the process in business rules and higher-level business decisions. Rules can change dynamically with little or no impact on the business processes that use them.

Advanced Analytics
With the onslaught of information, big data, from a myriad of sources, it is critical that organizations consider how that information will be captured, analyzed and leveraged to both gain a greater understanding of their customers and improve their customers experience.

Analytics can be grouped into three categories: descriptive, predictive and prescriptive. Each has a specific focus and purpose.

Perhaps the most common and familiar, descriptive analytics focuses on real-time, operational metrics associated with, for example, process activity monitoring. Descriptive analytics are typically displayed on dashboards, operational reports and scorecards, providing immediate insight into business performance.

Predictive analytics takes a deeper perspective on the data to uncover patterns and trends. Grounded in historical information, this analysis helps organizations better understand customer buying patterns and habits, and anticipate what customers are likely to do next.

Taking a more forward-thinking perspective, prescriptive analytics strives to provide insight and guidance into what an organization should be doing in the future. This what-if analysis helps support future planning and resource allocation for what may be required or demanded by customers or the competition.
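As a toy, language-agnostic illustration (the numbers and the naive method here are invented, not from the article), the move from describing what happened to estimating what may happen next can be sketched as:

```python
# Hypothetical monthly order history -- purely illustrative data.
monthly_orders = [100, 110, 121, 133, 146]

# Descriptive: summarize what happened.
average = sum(monthly_orders) / len(monthly_orders)

# Forward-looking: a naive extrapolation from the average month-to-month step.
steps = [b - a for a, b in zip(monthly_orders, monthly_orders[1:])]
avg_step = sum(steps) / len(steps)
forecast_next = monthly_orders[-1] + avg_step
```

Real predictive and prescriptive analytics rely on far richer statistical models, of course; the sketch only shows the conceptual shift from reporting on history to reasoning about the future.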

The Trilogy
Process consistency, agile decision and rules management and analysis of critical business data are the three characteristics of a modern, agile and dynamic organization. Process management alone is no longer sufficient. Business decisions are becoming more complex and data intensive.

Decision Management is the essential element for reducing process complexity and enabling rapid adaptation to a changing landscape driven by customer demand, competition and regulatory pressures.

Likewise, advanced analytics provide the necessary visibility into historical, current and potential future business operations. This visibility, coupled with efficient, well-orchestrated business processes and state-of-the-art Decision Management, will be a powerful advantage for any enterprise that wishes to remain competitive in the age of big data, mobility, cloud, and the proliferation of social media.

Attending IBM Impact?
Prolifics will be taking a deeper dive into this topic at IBM Impact 2014. Ryan Trollip, Decision Management Practice Director at Prolifics, has been selected to host a conference session that is not to be missed!

Session Number: BDL-3320
Date/Time: Mon, Apr 28; 12:00 pm - 12:30 pm
Location: EXPO Theater

Howard Webb is a Director of BPM Advisory Services at Prolifics. Howard and his team provide consulting and guidance to clients in transitioning to highly efficient process-managed business models, and equip them for success in their BPM initiatives. For over 25 years he has been a consultant, trainer, facilitator, and speaker on the topics of Business Process Management (BPM), data architecture, and project management. He founded the Midwest BPM Users Group and has published articles on BPM and enterprise architecture. Prior to coming to Prolifics, Howard was founder and partner of Bizappia, a consulting and services firm focused on business agility, performance and innovation. Prior to Bizappia, he was a Sr. BPM Technical Specialist with IBM.

Friday, March 28, 2014

Prolifics Employee Showcase: Handly Cameron

Handly Cameron is a Software Architect for IBM Collaboration Solutions (ICS) and Mobile at Prolifics and an IBM Champion for ICS. He has over 20 years of experience in enterprise collaboration and application development and is passionate about helping companies leverage social and collaboration technologies to drive business value.

We recently caught up with Handly to learn more about his time at Prolifics and some of the exciting initiatives he has worked on.

Joining Prolifics
Handly joined the Prolifics team over two years ago, stating that he saw great potential to grow in the IBM space and was impressed with Prolifics’ depth and breadth of experience across the IBM portfolio.

Technical Expertise and Becoming an IBM Champion
Handly was named an IBM Champion for ICS in the first group of Champions in June 2011. Since then, he has been selected as an IBM Champion each year and has earned 43 IBM certifications. In fact, he even helped the IBM team write three certification exams, the most recent of which was the Worklight and WebSphere Portal developer exam. Currently, he holds 11 active certifications and continues to update them as product versions progress and expire.

Handly sees a lot of value in the IBM Champion program and has remained an advocate of the program since it was created. An IBM Champion is defined as an individual who leads and mentors others in making the best use of IBM solutions and services. Champions are typically evangelists who run user groups, are active in the blog community, help others in online spaces such as Skype or Stack Overflow, present at conferences, and generally participate in the broader IBM community. Some of the benefits you can receive as an IBM Champion include:
  • Access to IBM Executives 
  • Opportunities to participate in community activities (e.g., Team Social and interviews at IBM Connect)
  • Free invitation to the ICS Leadership Conference
Handly encourages anyone interested in becoming an IBM Champion to talk with him or another of Prolifics' Champions to find out how to get involved!

Going Mobile: The Prolifics Connect App
Before joining Prolifics Software Group last year, Handly led the team that developed the Prolifics Connect mobile app that is used to support activities at large IBM conferences such as Connect and Impact. This particular project included helping several members of the Prolifics team learn mobile app development with IBM Worklight and integrating the app with back-end systems such as IBM’s sessions data, Twitter, and our blog feeds. Handly is currently updating the app for iOS7 and the upcoming Impact conference in late April. Click here to see the full story of the app.

Client Success
When asked about a recent client engagement he is most proud of, Handly recounted the planning and implementation of IBM Connections Content Manager (CCM) at Mizuno. Handly explains, “We were able to take a lead from IBM, demonstrate how the solution helped solve specific business needs, and sell the software and implementation services. Mizuno became the first customer to install CCM, with Mike McReady standing by to download the software the minute that IBM released the eGA. Following from this success, the customer was so happy with the results that they joined us in an IBM webinar last September that has been seen by hundreds of people. New leads are still coming in for customers who saw that webinar and want to hear more about how they can get similar results with Prolifics”. To listen to a Webcast replay, click here.

Interested in Hearing More from Handly?
He was recently interviewed by Sandy Carter at IBM Connect 2014. At the start of the video, you can see the blazing yellow Mizuno shoes Handly is wearing.

Connect with Handly Today!

Handly Cameron is an Enterprise Architect in the IBM Software Group at Prolifics and an IBM Champion for IBM Collaboration Solutions. He has over 20 years of experience in enterprise collaboration and application development and loves to help companies leverage social and collaboration technologies to drive business value. Handly is the Leader of the Atlanta WebSphere Users Group, recently finished 5 years as the president of Agile Atlanta, and holds 25+ industry technical certifications. He blogs at http://handly.org and is on Twitter as @Handly.