Business Process Management : Ten Trends in Ten Years



“The IRM UK BPM Europe Conference is celebrating its tenth anniversary in 2015. It is scheduled to take place 15-18 June 2015 in London. This long-standing BPM conference is presented and produced by IRM UK, co-produced by BPTrends and chaired by Roger Burlton, a pioneering BPM thought leader recognized for his innovative contributions to the field.

For over a decade, IRM UK and Roger have brought together practitioners, thought leaders and experts from around the world, presenting innovative strategies, products and services to the BPM market. We congratulate IRM UK and Roger Burlton on their contribution to the BPM market and, in recognition of their tenth anniversary, we have produced a short, informal review of ten significant events that have shaped BPM during the past decade.”

Celia Wolf, Founder and President, BPTrends

Business Process Before 2004

Before considering the major trends of the past decade, it’s worth spending a moment laying the groundwork by describing the process environment in the first years of the new millennium.

Process work, as we know it today, began over a hundred years ago with the publication of Frederick Winslow Taylor’s Principles of Scientific Management, which generated the first process improvement movement. In the 1920s and 30s “work simplification” became popular and gradually evolved into industrial engineering and quality control. This movement got a huge boost during World War II, when process practitioners were able to demonstrate what they had learned while improving US war production. For example, at peak production the Kaiser Shipyards were able to produce transport ships in a little over two weeks.

In the post-war years, quality control techniques enabled the creation of new and more efficient factories in countries devastated by the war. In the Eighties, movements like Lean, rooted in Japanese manufacturing practice, and Six Sigma became popular in the US. During this same period, the process guru Geary Rummler developed new process improvement techniques by applying evolving principles of psychology to human performance problems. In the Nineties, gurus like Michael Hammer, Tom Davenport and James Champy applied insights about the use of computers in process improvement to create the Business Process Reengineering (BPR) movement. This evolved into many new approaches, including the Capability Maturity Model (CMM), which helped managers evaluate their organizations’ process capabilities, and Enterprise Resource Planning (ERP) software, which provided companies with pre-packaged, process-oriented systems for managing routine processes.

In the years immediately following 2000, there was a lull in interest in process improvement, as companies regrouped to absorb all the new information technology that had become available. Then, in 2003, Howard Smith and Peter Fingar published Business Process Management: The Third Wave, a book that created lots of interest in the potential of process improvement by focusing on how new software tools could automate process management.
Based on this development, we will consider the publication of Business Process Management: The Third Wave the first important trend in our ten-year review.


1. Business Process Management: The Third Wave

In essence, BPM: The Third Wave proposed a new model for the development of workflow software. Workflow software had first been used in the Nineties and was associated with BPR. The basic idea was to create a business process flow model of a set of activities, and then use a computer application to manage the flow of information between activities. When an application for an insurance policy arrived at an insurance company, for example, it would be scanned. The “virtual document” would be sent to an application analyst’s computer, and he or she would be asked to approve or decline the application. If the analyst approved the application, it would be forwarded, by the software application, to an actuary, who would price the insurance, and so on. The idea was to replace the flow of paper from one employee to the next by moving only data, which was maintained in a virtual file in the company’s database. In the year 2000, workflow systems were popular, but they weren’t easy to build. Problems with proprietary software, data locked in incompatible databases, and other issues made the software sound exciting but made its actual development an expensive proposition.
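To make the routing idea concrete, here is a minimal sketch, in Python, of the kind of insurance-application workflow described above. All of the names, steps and the approval rule are invented for illustration; a real workflow engine would route tasks to people and systems rather than call placeholder functions.

```python
from dataclasses import dataclass, field

@dataclass
class Application:
    """A 'virtual document': the data record that replaces the paper file."""
    applicant: str
    status: str = "received"
    history: list = field(default_factory=list)

def analyst_review(app: Application) -> bool:
    # Placeholder for the human step: the analyst approves or declines.
    return len(app.applicant) > 3  # illustrative rule only

def actuary_price(app: Application) -> float:
    # Placeholder for the pricing step performed by the actuary.
    return 1200.0

def run_workflow(app: Application) -> Application:
    """Route the application through the sequence of worker tasks."""
    app.history.append("scanned and stored in the database")
    if analyst_review(app):
        app.status = "approved"
        app.history.append("forwarded to actuary")
        premium = actuary_price(app)
        app.history.append(f"priced at {premium:.2f}")
    else:
        app.status = "declined"
        app.history.append("applicant notified of decline")
    return app

if __name__ == "__main__":
    print(run_workflow(Application("Jane Example")))
```

The point of the sketch is simply that the flow of work is driven by data in a shared record, not by paper moving from desk to desk.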

At the same time, the development of the Internet, the Web, and Internet languages like Java, HTML and XML made it much easier to weave data and applications together into functioning systems. Smith and Fingar proposed developing “modern” workflow applications using Web and XML techniques. These new applications would be capable of using Internet techniques to blend multiple software applications, employee tasks undertaken on specific computers, and data from more than one database together into a more or less seamless whole. Moreover, the same software tools would allow business managers to monitor process-based activities and even make changes in processes to improve their functions.

In hindsight, we can see that much of the vision outlined by Smith and Fingar has proved difficult or impossible to implement. Certainly BPM software applications have been built, and in some cases data is used by managers to make better decisions. Little has been done, however, to eliminate the need for software developers, and very few managers have BPMS applications that they can actually modify on a daily basis. On the other hand, as we suggested, the book created lots of excitement, many vendors rushed to develop new software applications in hopes of implementing Smith and Fingar’s vision, and the process movement, in general, was greatly advanced.


2. BPM Software Tools

The second important trend I’d identify would be the BPMS movement that was stimulated by the combination of new technology and Smith and Fingar’s book, BPM: The Third Wave.

At about the same time that he published his book, Howard Smith began to work with a group of like-minded individuals to found the Business Process Management Initiative (BPMI), which dedicated itself to creating a software language that could generate BPMS applications, and a process notation that business people could use to describe the business processes that would be automated in those applications. While BPMI was working on one version of a BPMS language, IBM, Microsoft and Sun evolved their own alternative, which they called the Business Process Execution Language (BPEL). Both were published at about the same time, and most of those involved in BPMI realized that their version of the language wouldn’t be able to compete in the marketplace with a language backed by IBM and Microsoft, so they shifted their attention to BPEL. The BPMI group, after some floundering, merged with the Object Management Group (OMG), a standards body that agreed to sponsor BPMI’s process notation, BPMN (Business Process Modeling Notation).

In hindsight, one can describe all of the activities involved in the development of BPMS languages and notations in a few sentences, but the work and the politics surrounding these developments occupied several years in the mid-Zeros, and hundreds of articles were written debating the merits of the alternative approaches. Today, the idea of a BPMS language has evolved so that it is incorporated in the semantics of the BPMN notation supported by the OMG. Software tools that fully support BPMN allow business people to describe processes in a notation that can then generate code. In theory, this will allow a business person to change a process flow chart and thereby change the software supporting the process. In practice, few business people are capable of understanding all of the notational intricacies of today’s BPMN.

What’s more important, from our perspective, is that a whole new generation of software tools has been created that allows for the development of modern workflow systems. During this period BPM conferences and publications emerged, creating widespread interest in all aspects of business process improvement. And several major universities developed academic BPM programs that are creating a whole new generation of process practitioners.

It’s probably not too much to say that the advent of BPM Europe in 2004 owed its origin to the new interest in BPM generated by Smith and Fingar.


3. Business Process Change and BPTrends

At the same time that Smith and Fingar were writing BPM: The Third Wave, I was engaged in writing a book, Business Process Change, which was also published in 2003. Once BP Change was published, Celia Wolf and I launched a website, Business Process Trends (www.bptrends.com) to provide an unbiased source of information on all aspects of process improvement. We updated the site each month, adding new columns, articles and book reviews on all aspects of process change. Today, we continue to publish BPTrends and there are over 1000 articles archived and available FREE of charge to all our members and readers.

The book, Business Process Change, did not have the immediate impact on the public that BPM: The Third Wave had. Over the longer term, however, BP Change has done well, and is now in its third edition. The strength of BP Change, in contrast to BPM: The Third Wave, is that it focuses on the basic concepts and historical trends in process improvement, and offers a methodology that allows those engaged in process improvement to integrate older tools and techniques, like Six Sigma and Lean, with newer tools and techniques, like the SCOR framework, business intelligence (BI) and the new BPM software. This broader perspective is important because the initial BPMS tools proved too limited to do what managers require, which is a more integrated approach that combines the old with the new.


4. Business Rules

A good case in point, and another important development in the way process people think today, is illustrated by the incorporation of business rule technology into BPM. The original BPMS formulation offered by Smith and Fingar depended on underlying semantics (Pi Calculus) that could not accommodate business rules. In effect, the semantics assumed that work was done in activities and then flowed to other activities. What happens, however, when the work being done involves making a decision? Say the activity in question is termed: Price International Loan, and the resulting outcome is a complex contract that describes how a bank will interact with a major international corporation or a sovereign state. An instance of this activity involves a loan team that debates for hours as they establish one clause after another and finally put a price on a specific loan. This is a long way from the type of activities that Taylor described when he studied workers on production lines and wrote down each movement they made as they assembled things.

One can certainly imagine decomposing the activity Price International Loan into smaller activities, but ultimately that isn’t going to help much as you try to figure out how to define the process in a manner that assures new employees can perform it correctly. A much more useful approach to defining decisions derives from the work of expert systems theorists. Expert systems were software systems that first appeared in the 1980s. Their developers sought to capture the knowledge of human experts, and found they could do it by carefully defining the terms used by the experts and then writing rules that captured expert reasoning. In essence, an expert loan officer might say something like: “Whenever I see X and Y, then I check to see if Z is also present, and if it is, then I usually proceed to recommend M.” (Imagine each letter replaced by a specific financial test to get the gist of this rule.) Once one begins to study expert behavior, one finds that experts rely on hundreds or thousands of rules like this to deal with the problems they face.
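To give a feel for how such rules can be encoded, here is a minimal forward-chaining sketch in Python. The facts X, Y and Z and the recommendation M are the placeholders from the example above, not real underwriting criteria, and production rule engines are considerably more sophisticated.

```python
# A tiny forward-chaining rule sketch. Each rule pairs a condition over the
# known facts with a conclusion to assert when the condition holds.
rules = [
    (lambda f: f.get("X") and f.get("Y"), ("check_Z", True)),
    (lambda f: f.get("check_Z") and f.get("Z"), ("recommend", "M")),
]

def apply_rules(facts: dict) -> dict:
    """Repeatedly fire rules until no new facts are added."""
    changed = True
    while changed:
        changed = False
        for condition, (key, value) in rules:
            if condition(facts) and facts.get(key) != value:
                facts[key] = value
                changed = True
    return facts

print(apply_rules({"X": True, "Y": True, "Z": True}))
# -> {'X': True, 'Y': True, 'Z': True, 'check_Z': True, 'recommend': 'M'}
```

Scaled up to hundreds or thousands of rules, and with the conditions and conclusions maintained by business analysts rather than programmers, this is essentially the capability that business rule engines added to BPMS products.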

Building rule-based decision management systems isn’t trivial, but it provides a useful way for process practitioners to approach the problems they face when they try to describe activities that involve complex decision making tasks. By the end of the first decade, most BPMS tools had been extended to include a rule capability, and the market was as interested in human decision making as it was in linking ERP modules together.


5. Business Process Frameworks

Another development that has received quite a bit of attention in the past 10 years emphasizes the difference between an IT-led process initiative and a business management-led initiative. In this case, I refer to the development of business process frameworks like SCOR (the Supply Chain Council’s Supply Chain Operations Reference model), eTOM (the TeleManagement Forum’s enhanced Telecom Operations Map) and the UK Office of Government Commerce’s ITIL (a framework that describes how IT delivers and supports services for the rest of an organization).

Let’s look at SCOR to get an idea of what’s involved. The Supply Chain Council (SCC) is made up of supply chain executives from Fortune 1000 companies. They work together to make it easier to design, measure and manage large supply chains. In the course of their work they created a process notation system for modeling large supply chains. At the top level, for example, they describe a supply chain as made up of a combination of Source, Make, Deliver and Return processes. Each of these processes is subdivided into subprocesses and then into sub-subprocesses. There are also standard metrics that can be used for each process and subprocess. This system was developed over the course of several years by committees of the SCC. Today, a group of companies planning on working together can sit down and quickly agree on a map and metrics for their proposed supply chain. Similar results have been achieved by telecom executives, who can design major telco processes that link multiple phone companies using a similar process-focused approach.

Equally impressive is the power demonstrated by the use of frameworks to target opportunities for improvement. By submitting information to a common data manager, an organization like the Supply Chain Council is able to publish very reliable metrics for various supply chain activities. A company new to SCOR can use this data, appropriately abstracted, to model and then evaluate its own supply chain. By comparing local results to international averages, the new company might immediately realize that it is significantly below average in its Source activities, or on some specific subprocess of Sourcing, and focus an improvement effort on that subprocess. This ability to find problems without detailed examinations, simply as a result of having a standard way of defining and modeling a supply chain and a set of internationally valid metrics, represents a major breakthrough in the way many executives have thought about processes, and it emphasizes the power of a good business process architecture.

This is work that might otherwise be undertaken by a process design team, but is, in this case, done by supply chain managers, efficiently and effectively. Information about this approach, and much similar work became widely discussed in the mid-10s and did a lot to encourage some business executives to pay more attention to process work.
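As a rough illustration of the kind of benchmarking just described, the following Python sketch compares a company’s SCOR-style metrics against industry averages and flags the processes that fall noticeably below the benchmark. Both the metric names and all of the numbers are invented for illustration; they are not actual SCOR metrics or published averages.

```python
# Hypothetical SCOR-style metrics: process -> (our value, benchmark average).
# For these cycle-time metrics, lower is better.
metrics = {
    "Source: supplier lead time (days)":     (21.0, 14.0),
    "Make: manufacturing cycle time (days)": (6.0, 7.5),
    "Deliver: order fulfilment time (days)": (4.0, 3.5),
}

def flag_gaps(metrics, threshold=0.10):
    """Flag processes where we are more than `threshold` worse than the benchmark."""
    gaps = []
    for name, (ours, benchmark) in metrics.items():
        if ours > benchmark * (1 + threshold):
            gaps.append((name, ours, benchmark))
    return gaps

for name, ours, benchmark in flag_gaps(metrics):
    print(f"Below average on {name}: {ours} vs benchmark {benchmark}")
```

The value, of course, lies not in the trivial comparison but in the fact that the framework gives every participating company the same process definitions and the same metrics to compare.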

Similar stories could be told about the impact of the other major frameworks deployed in the late Zeros. In one sense, these various frameworks were used by industry group managers for their own process work. In a broader sense, however, they did a lot to encourage others to reconsider a whole range of business process architecture issues. They also emphasized the strong interest in process architecture among business managers, independent of IT concerns.


6. Business Process Architecture

The process field has always been roughly divided between those who advocate a top-down approach and those who favor a bottom-up approach. Rummler and Hammer, for example, both advocated starting with a broad view, defining a business process architecture organized around value chains, and then drilling down to see where problems occurred. Six Sigma and IT, with some notable exceptions, have advocated a bottom-up approach in which one focuses on a specific problem and introduces incremental improvements. The problem with a top-down approach is that one often seems to waste a lot of time modeling the entire organization before arriving at the specific problems that need attention. The problem with the bottom-up approach is that one sometimes “fixes” a specific problem only to find that one has made several other problems worse as an indirect result.

In 2004-2005, when people began to get excited about BPM software, the emphasis seemed to be on targeting specific problems. By the end of that decade, the emphasis was more evenly divided. The demonstrated power of top-down approaches like SCOR, combined with a renewed interest in holistic process change methodologies, caused many practitioners to focus on architecture.

An event that did a lot to change the focus from specific applications to architecture occurred when the US Congress passed a law requiring that all US government departments and agencies have an enterprise architecture. There has been, of course, a long-standing interest in Enterprise Architecture (EA), which, in most organizations, is advocated by IT and is really just a euphemism for defining an organization’s IT resources. In most organizations, the head of EA was located in the IT organization.

In the mid-Zeros, when the US Congress passed laws that required the Executive Branch of the US government to develop detailed Enterprise Architectures, they required, among other things, that each agency develop a detailed Business Architecture that would define the processes that the agencies’ IT resources were meant to support. This requirement, and subsequent efforts to implement it, did a lot to expose the lack of good models for business architecture development among existing EA approaches.

In the past decade a lot of work has been undertaken to better define a business architecture, and to describe how to implement such an architecture.


7. Customer Processes

Another development that has changed how people think of process work involves the new emphasis on how customers interact with business processes. The importance of this relationship was implicit in the “swimlane” diagrams that Geary Rummler created in the 1980s, and got added support from a book titled Lean Solutions, published in 2005 and written by Lean gurus James Womack and Daniel Jones. It received even more attention as IT developers began to use swimlane concepts in BPMN. In the BPTrends methodology, we teach students to draw swimlane diagrams that always place the customer process (or processes where there are multiple groups of customers) at the top of the diagram when it is drawn in a vertical format. (See Figure 1.)

Figure 1. A BPMN swimlane diagram with the customer process in the top lane.

Once one gets into the habit of developing diagrams like this, one begins to think more about what a customer goes through to interact with a company process. From that it’s a short step to diagramming customer processes by themselves and trying to imagine better customer processes. In a given situation, for example, it might be easier if the customer didn’t have to take his car to the dealer, but could remain at home or work while the auto repair company came to him to pick up the car.

Once one accepts that one of the main goals of process redesign is to create improved experiences and more value for customers, then redesigning customer processes first, and subsequently modifying business processes to support those better customer processes, is a logical strategy. Some have termed this emphasis on looking at what the customer does before considering what the business does an “Outside-In” strategy. Whatever it’s called, it is a good approach for those who want to transform their business processes and assure happy customers.


8. Academic BPM Programs and the International BPM Conference

Another development that was initially stimulated by the publication of Smith and Fingar’s BPM book but which has acquired a life of its own, is the creation of BPM programs in universities. Some programs are simply offered as specialties that one can take in conjunction with an MBA, or a graduate IT degree, while others offer MAs or PhDs in Business Process Management in their own right. In all cases, these programs represent a commitment on the part of university researchers to learn more about the ways that processes are used in organizations, and to develop strategies to improve the use of processes. Some, as we suggest, are narrowly focused on how IT can be used to improve processes, but, increasingly, these programs have become comprehensive research programs looking into all aspects of process change.

The various academic practitioners have also created their own international BPM conference where academic researchers meet annually to share their latest research. These various academic programs suggest that the process field will increasingly be able to rely on a body of trained professionals who will gradually capture and refine knowledge of process work for future generations.


9. Analytics, Big Data and Process Mining

A recent process trend that is accumulating momentum rather quickly is tied to the use of IT techniques that facilitate the capture and analysis of large bodies of software data. This, in turn, is driven by the revolution in communication that has taken place in the past two decades. Today, rather than speaking face to face or using a telephone, a growing number of people rely on email, on websites, or on intelligent personal digital assistants of all kinds to send electronic messages to one another. All of these messages can be captured and represent a mass of data. For example, an organization may run an ad to promote a new product. No sooner does the product begin to sell than people begin to exchange messages about the value, the advantages and the problems associated with the new product. If the vendor can capture and analyze this data, it has a massive databank that it can search to refine its business models, modify its products and services, target its messaging and improve its customer relationships.

Software tools to capture and analyze massive amounts of data (sometimes called “Big Data”) are rapidly becoming available, and business processes in many organizations are being redesigned to take advantage of an organization’s ability to use this data to modify their subsequent activities.
An example specific to process work is a technique termed “Process Mining.” When an organization executes a process, it typically retrieves and updates databases as the work is done. Using software tools, the metadata created by these database access activities can be used to create a model that shows how work flows between specific activities. If one approaches this in a trivial way, one can use data on new accounts created to determine how many new accounts are being added in a given period. If one uses lots of data from lots of different database accesses, however, one can actually determine the flow of activities across a large, complex business process. One can track instances in which new applications flow smoothly from initiation to completion, compare them with instances in which new applications encounter roadblocks, and identify the problems that prevented the smooth flow of work to a conclusion. Using process mining tools and the historical databases associated with existing business processes, one can automatically create diagrams of the process underlying the data activities and highlight places where changes would improve the flow.
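As a rough sketch of the core idea behind process mining, the following Python fragment builds a “directly-follows” table from a made-up event log. Real mining tools start from relations like this and then do far more to discover, visualize and check complete process models.

```python
from collections import Counter, defaultdict

# A made-up event log: (case id, activity), already ordered by timestamp.
event_log = [
    ("c1", "receive"), ("c1", "review"), ("c1", "price"), ("c1", "issue"),
    ("c2", "receive"), ("c2", "review"), ("c2", "decline"),
    ("c3", "receive"), ("c3", "review"), ("c3", "price"), ("c3", "issue"),
]

def directly_follows(log):
    """Count how often one activity directly follows another within a case."""
    traces = defaultdict(list)
    for case, activity in log:
        traces[case].append(activity)
    edges = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return edges

for (a, b), n in directly_follows(event_log).items():
    print(f"{a} -> {b}: {n}")
# The counts sketch the process map: frequent paths versus rare deviations.
```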

This work is only beginning, and only early versions have been incorporated into existing BPMS tools, but this approach will grow rapidly and we will soon find that another generation of process tools will be able to support analytic and redesign tasks that are much more complex and precise than those we routinely tackle today.

10. Case Management

Finally, the topic that is currently enjoying the most attention at today’s conferences and in blogs is Case Management (or adaptive case management, or eBPM, or dynamic processes, as you prefer). In a strange sense, it’s as if we have come full circle and are about to finish this decade and start the next by focusing on software tools and notations all over again.

In 2004 process people were excited about BPM software tools that would let business managers define, monitor and change business processes. At the time, no one said much about processes themselves, assuming everyone knew what they were. Now, a decade later, it’s as if we have spent the decade figuring out what we hadn’t known. In hindsight, the processes we were talking about in 2004 were manufacturing processes, more or less routine processes that were basically activity-specific. Since then, we’ve spent a lot of time talking about service processes in which business people interact with customers to create value, about decision-based processes undertaken by knowledge workers, about processes that a team may not be able to structure until they are actually undertaking them, and about processes that rely on tailoring to generate a unique output, or on the analysis of massive amounts of data in real time. We have, in other words, identified many of the limitations of the BPM software tools and notations that we embraced in 2004, and we are now working to create a new generation of BPM tools and, perhaps, a whole new notation that will be much more flexible.

Case Management Software (CMS), the new BPMS, is going to emphasize tailoring outputs for customers and empowering employees to respond to changing circumstances. CMS will emphasize dynamic planning that will allow teams to switch activities and sequences as circumstances change. CMS will emphasize the use of knowledge that will be captured from massive data sources, embedded as rules and partially automated. And, most important, CMS will emphasize the need to constantly alter assumptions as new technologies continue to drive changes in everything we do.

Some will be quick to note that there is nothing new in all this, and that Hammer and Rummler would have understood it – and that’s true. Those who work closer to business people have always been much more flexible in their ideas about the nature of processes. Most of the interest in the new approaches comes from those closer to the IT side of the process movement who have tended to define process work in terms of types of problems that their current software tools can deal with. Thus, we ended the Nineties with ERP software and limited workflow software tools. We spent the Zeros creating BPMS applications which could make ERP and early workflow tools much more flexible. Today with new techniques, massive data, cloud architectures and knowledge-based approaches, we are ready to up the ante again, with a newer generation of CM software tools that will support much more flexible applications.

It may be the case that process theorists have imagined more flexible approaches all along – but they haven’t been able to engineer them very well. Soon we will have both the vision and the technology to capture and structure a wide variety of new processes. With any luck we’ll soon see more conferences and articles about how a new generation of process tools – a fourth wave, perhaps – is about to revolutionize the way we do business.

Since 2004, IRM UK’s BPM Europe has been an important event for those interested in promoting, discussing and networking around business processes. In 2015, IRM UK BPM Europe celebrates its 10th year, and is committed to continuing to deliver the same consistent value to the business process community for the next ten years. For more information on the 2015 conference taking place in London, please click here.

About the Author

Paul Harmon is a Co-Founder, Executive Editor and Market Analyst at BPTrends (www.bptrends.com), the most trusted source of information and analysis on trends, directions and best practices in business process management. He is also a Co-Founder, Chief Methodologist and Principal Consultant of BPTrends Associates, a professional services company providing executive education, training and consulting services for organizations interested in understanding and implementing business process management.

Paul is the Co-Author and Editor of the BPTrends Product Reports, the most widely read reports available on BPM software products, and the author of the best-selling book Business Process Change, 2nd edition: A Guide for Business Managers and BPM and Six Sigma Professionals. He is an acknowledged BPM thought leader and noted consultant, educator, author and market analyst concerned with applying new technologies and methodologies to real-world business problems. He is a widely respected keynote speaker and has developed and delivered executive seminars, workshops, briefings and keynote addresses on all aspects of BPM to conferences and major organizations throughout the world.
