Sunday, December 6, 2009

Exact Acquires Vanguard Solutions Group

Exact not only expects Vanguard to continue to sell GPS through these existing partners, but also expects Vanguard to expand its partner network. Balio, president and general manager of the new division, defines the new relationship as "co-opetition", which, as the name suggests, involves both cooperation and competition. He explains that while "Vanguard's ERP partners will compete with Exact for new accounts; they will also cooperate with Vanguard in enhancing their product offerings. It has been and continues to be our objective to provide our partners with a complete, integrated analytics offering. Our partners use our analytics to enhance the value of their products to both their existing customers and new accounts. We are dedicated to maintaining the partnership relationships to the benefit of all involved."

Prior to the Vanguard acquisition, in mid-2005, Exact acquired other companies. In August, it took a majority interest in Modulair Easy Access B.V., a logistics and warehouse software firm, while in June it acquired Kooijman Software, a software provider to construction and education organizations, and AllSolutions, a software provider to non-profit and service organizations. In May 2004, Exact acquired the Treadstone Group, a Cincinnati, Ohio-based (US) IT consulting and systems integration firm. All these acquisitions contributed to an incremental, rather than dramatic, revenue increase, and also to a gradual expansion of geographic coverage.

The acquisitions follow the appointment of Rajesh Patel as corporate CEO on July 1, 2005. In a keynote address at the Engage 2005 annual user conference in early October, Patel said that Exact was firmly focused on the mid-market and that a core part of the company's growth strategy is not to be "distracted by [the] digestion or integration issues" caused by rampant acquisitions for market share purposes. He also stated that Exact has no ambition to move upmarket.

The Wizardry of Business Process Management

The business process management (BPM) market is sizzling hot, with Gartner Dataquest estimating its compound annual growth rate (CAGR) at 13 percent in 2009. In fact, almost all leading BPM vendors have been buzzing about their unprecedented growth and profitability, especially amidst the ongoing economic drought.

It is truly difficult to argue against the need for companies from all walks of life to improve their business processes. Doing “better, faster, and cheaper” is the “slogan du jour.”

In his keynote presentation during the recent Lombardi Driven Online virtual conference, Lombardi Software’s CEO Rod Favaron referred to BPM as “Business Pressure Management.” That pretty much says it all. Logically, to the end of managing business pressures, Lombardi offers its broad BPM suite called TeamWorks Enterprise Edition [evaluate this product].

I also recently attended a Webcast by Appian Corporation, possibly the first BPM company to deliver process, knowledge, content, collaboration, and analytics capabilities in a comprehensive suite, Appian Enterprise [evaluate this product] and its software as a service (SaaS) counterpart Appian Anywhere. I particularly liked one slide in the presentation deck wherein the eight bullet points’ first letters cleverly spelled out the mnemonic REMEMBER (why to deploy BPM now) as follows:

* Retain customers
* Enhance standardization (and consistency)
* Measure business performance
* Evaluate components of processes (subprocesses)
* Manage all elements of the business
* Bottom line improvements
* Eliminate bottlenecks
* Rapidly deploy new services (and processes).

Indeed, these are some of the typical benefits of deploying BPM systems, but the trouble (namely, a lack of clarity and consensus) starts with the quandary about what exactly constitutes BPM, and which parts and capabilities of BPM help achieve those benefits. In other words, are some BPM suites and/or components more important and better than others?

In plain English, BPM entails all methodologies and tools that help businesses improve their processes. Depending on the context, BPM can be regarded as a management discipline, a technology, or even both. If one defines BPM as an approach to methodically design, implement, execute, control, and improve business processes, then one can argue that it is a management discipline. In addition, there exists a raft of accompanying IT tools to support this discipline in all of its abovementioned stages.

Extreme BPM Definitions

Simple as that, right? Well, not really, and my concern with how the term BPM is used is one of semantics. Namely, many people might still mean “business process modeling tools to help us with radical business process re-engineering (BPR),” which was all the rage back in the early 1990s. Indeed, this erstwhile people-centric approach of managing the overall business, independent of the specific technology or tools used to support it, has since gone out of fashion.

Namely, the problem is that the abstract world modeled in modeling tools often has not much to do with real-life business processes and typically cannot be implemented. If a business process analyst models a company all the way in, e.g., IDS Scheer AG’s ARIS tools, by the time he or she is done the model might already be obsolete.

And by the time the company implements the model, the dynamic economic environment will have already changed. Yet the model on paper (a bunch of flowcharts) must be deployable, usable, and maintainable in production. Business processes are about dynamics (or business agility, if you will) and a drawn flowchart is anything but dynamic.

On the other hand, the purely integration-centric approach of providing a way for software to communicate and execute automated workflow to accomplish discrete tasks via integration and process orchestration is not nirvana either. Many other vendors might mean that BPM is “an effective way for us to re-package our traditional enterprise application integration (EAI) tools under the guise of a service oriented architecture (SOA) orchestration project.”

IBM’s BPM suite, Fujitsu Interstage, TIBCO Software, Software AG (which acquired webMethods), SAP NetWeaver BPM, and Oracle BPM come to mind here. These integration-centric BPM providers have often been accused by pure-play BPM suite providers of primarily targeting IT departments in trying to sell BPM as a matter of service orchestration.

Yet, the true value of BPM should be to empower business users. To be fair, these larger companies have recently acquired companies that provide BPM software aimed at business users, and I imagine the competition from larger companies will only intensify.

For instance, zooming in on Oracle’s BPM product strategy, the idea here is to offer a complete and integrated BPM platform that caters to system-centric, human-centric, document-centric, and decision-centric business processes (workflows) in a single runtime environment. The suite is aimed at enabling business owners and developers to collaborate, define processes across systems and lines of business (LoBs), and improve business process efficiency by modeling, executing, monitoring, analyzing, simulating, visualizing, and optimizing business processes.

But, by digging deeper into the Oracle SOA Governance suite and Oracle BPM suite, it is possible to note so many identical components, differing mainly in the fact that Oracle BPM has to also accommodate human interventions. The Oracle BPM suite can be bolstered optionally with the business process analysis (BPA), design, and modeling capabilities via the partnering IDS Scheer’s ARIS tool (e.g., for achieving Six Sigma compliance).

IDS Scheer’s blog post recently tried to demarcate the line between BPA and BPM suites as follows:

“The primary purpose of Business Process Analysis (BPA) tools is to visualize, analyze and improve business processes. BPA tools help translate every day business complexity into structured models (scope: from business to model). They provide insight into an enterprise’s structure – i.e. how strategy, products and services, processes, roles, information and systems are related and influence one another. By creating a single point of truth, BPA tools strive to improve the communication between various stakeholders in a company, safeguard corporate knowledge and support decision-making and change management. Most notable user groups are business managers, process owners, quality managers, business analysts, risk & compliance officers and enterprise architects. BPA tools have rich semantics in order to fulfill a broad information need. They enable users to visualize and analyze the enterprise from different point of views, e.g. from a performance-, risk & compliance- or architecture perspective.

Business Process Management Suites (BPMS) on the other hand serve a different purpose and target a different audience. While they do offer modeling capabilities, their primary purpose is to automate, execute and monitor business processes based on technical models (scope: from model to execution). Notable user groups are business- and information analysts, process engineers, software developers and system administrators. BPMS do not offer such rich semantics as BPA tools in the sense that their metamodel does not comprise concepts for performance management, risk & compliance management or architecture management. Then again, these concepts are not required to automate processes.”

A Fragmented and Crowded Market

Thus, the market for BPM software and related implementation, consulting, and training services is intensely competitive, rapidly changing, and highly fragmented. Every BPM aspirant likely encounters competition from internal IT departments of potential or existing customers that may seek to modify existing systems or develop proprietary systems in a do-it-yourself (DIY) fashion.

Process improvement adoption has lately started in many IT departments via the implementation of sets of concepts and policies for managing IT infrastructure, development, and operations. Some of these frameworks and disciplines are the Information Technology Infrastructure Library (ITIL), IT Service Management (ITSM), and Control Objectives for Information and related Technology (COBIT).

Moreover, there are a number of enterprise-wide initiatives around process improvement disciplines such as Lean manufacturing, Six Sigma, Total Quality Management (TQM), etc. These frameworks of concepts and policies often require IT support in order to make best-practice workflows operational, while the linkage to business users is critical.

The market consists of a number of established BPM suite providers such as Appian, Ascentn Corporation, Cordys, Global 360, Lombardi, Metastorm, Savvion, Pegasystems, and Ultimus, to name some. In addition to the abovementioned SOA middleware and enterprise architecture (EA) providers, internal IT departments, and BPA and process mining vendors (e.g., IDS Scheer or Pallas-Athena respectively), BPM suite vendors also compete with companies that target the customer interaction and workflow markets, and companies focused on Business Rules Engine (BRE) such as Corticon Technologies, FICO (formerly Fair Isaac Corporation), and the ILOG division of IBM.

Competition additionally comes from professional service organizations that develop custom BPM software in conjunction with rendering consulting services. To further muddle the picture, there are a number of Enterprise Content Management (ECM)-based vendors such as the Documentum division of EMC Corporation, FileNet (now a division within IBM’s Information Management group), Adobe LiveCycle, Oracle Stellent, and Autonomy Interwoven, to name but a few.

What Constitutes a Full-fledged BPM Suite?

BPM suites’ scope can be sliced and diced in many ways. For one, Part II of my 2008 five-part blog series entitled “It’s About Process (or Ability to be Responsive)” outlined the necessary BPM suite components. But if one is to look at BPM suites through the lens of the Plan-Do-Check-Act (PDCA) loop, one could think of covering the following three necessary activities (within the feedback loop):

1. Business process modeling and analysis, which were explained earlier on;
2. Business process automation or execution; and
3. Business activity monitoring (BAM), measuring, process mining, and so on, as parts of continuous business process improvement efforts.

Most of the abovementioned contemporary BPM suites have a comparable basic functionality set, which has been specified as the desired capabilities of a BPM suite in Gartner’s 2008 report entitled “Four Paths Characterize BPMS Market Evolution.” These capabilities are:

* Model-driven development environment (with model-driven process execution rather than a source code-based one). Processes can be changed bi-directionally, either in the design or execution stage (each impacting the other), and with an audit trail of changes;
* Process component registry or repository management;
* Document management and ECM systems;
* User and group collaboration;
* System inter-connectivity;
* Business event management, business intelligence (BI), and BAM;
* Online and offline process simulation and process optimization;
* Business rules management system (BRMS);
* System management and system administration; and
* Process execution and a state management engine.

While most of the leading BPM suites address this prescribed broad functionality set by and large, they have some intrinsic differences that make them more suitable for one usage scenario versus the others. This difference often comes from the underlying architecture and the genesis of a particular suite. It may also be a consequence of the key customer segments that the vendor targets.

BPM practitioners understand some of the common usage types of BPM systems, often referred to as “human-centric business processes,” “system-centric (integration) processes,” and “document-centric processes.” Most real-life business processes have all three of these elements in them, but some are heavier on one versus the other two.

In its whitepaper entitled ”Understanding Usage Scenarios An Enterprise BPMS Must Support,” Savvion identifies and describes four other equally important usage scenarios that are not very well understood. These are: case management, rule-based (decision-intensive) processes, project-oriented processes, and event-centric process management. Savvion claims to currently be the only BPM provider that can accommodate all of these seven usage scenarios.

What About Accommodating Change (We Can Believe In)?

But, after all this discussion, do BPM suites necessarily mean “build for change?” Do they by default mean easy automation, rapid iteration, and execution?

The bar has to be set higher, since pragmatic buyers are increasingly looking for proven fixed-cost business-driven enterprise-wide BPM deployments. An astute BPM suite should take a process-centric approach to managing business operations that can deal with any business workflow with high impact on overall customer service operations.

Metastorm’s [evaluate this product] white paper “Building a Business Case for BPM” asserts that there are three fundamental characteristics of BPM that make this technology the game-changer:

It’s About Process (or Ability to be Responsive) – Part I

After several years (if not decades, even) of painstakingly corralling and setting up all their custom data, objects, tables and whatnot, and making sure that these static and/or dynamic transactional data are secure, many enterprise applications users have realized that the time is long overdue for them to start looking at ways to make their applications more process-savvy.

Companies are increasingly trying to adopt and implement standardized (and yet flexible and easily modifiable) business processes to help their operations run more consistently and smoothly. For example, the chief executive officer (CEO) might decide that as of, say, next month “All customer service cases must be resolved within 24 to 48 hours,” or, “We are going to institute a new sales process for all deals worth over US$100,000.”

However, these business processes often get communicated to employees in an ad hoc and unregulated manner. A process document with instructions may exist on a network file share, but people have not the foggiest idea that it’s there. And some employees might rely on word-of-mouth information from co-workers (so called “tribal knowledge”) to learn the processes for their jobs.

Consequently, standardizing and instituting new business processes can prove challenging for most companies, particularly larger organizations.

Indeed, until recently most enterprise applications have hardly been anything more than glorified databases — they could hold all of the information users may need and allow users to search for records based on various criteria, but they could not really help users to perform the functions of their daily jobs more effectively.

There’s still often no native automation and agility within the system that lets, e.g., a recruiter instantly know when the status of a candidate has changed or when a new position requisition has been entered into the system.

Indeed, when any changes are made somewhere in the organization, users have to remember to notify one another of the change or else rely on others finding the updates on their own. Neither solution is practical in the long term and invites the possibility that the software solution or best practice will not be adopted consistently by all employees at the company.

How can one then build processes into enterprise applications so that users won’t need to, time and again, rely on manual (pedestrian) methods of communication to inform others of changes, increasing the risk that many issues will fall through the cracks?

Introducing Workflow Automation

To that end, a built-in or an external standalone add-on tool (or capability) that can be used to solve the process automation problem is called workflow automation (or workflow management). Some will refer to it as business process management (BPM), and we will shortly try to point out the differences between the two – i.e., workflow and BPM.

Traditional enterprise applications, such as a human resource management system (HRMS) or a procurement application, typically feature some built-in functionality, with some capability to tailor the base functionality through parametric configuration options (e.g., via “order types” that entail different mandatory and optional “order steps”) that users have to learn by heart.

To be fair, some enterprise applications have introduced workflow capability into their products to give users some ability to control the process behavior of documents such as an invoice or an engineering specification. But in most enterprise applications workflow is implemented through hard-coding, which means that programmers must develop and maintain the code.

In addition, workflow automation of the typical enterprise application is generally limited to a single document or task routing. This usually means that companies implementing an enterprise application must choose between accepting the vendor’s pre-built business process behavior or paying the vendor dearly to make expensive modifications to accommodate more complex processes, which will then make upgrades either costly or impossible.

In contrast, a specialized workflow tool enhances a single task and/or document routing by providing an integrated capability to include rich user interfaces (UIs), system integration, rule processing and event handling.

Rules are necessary to determine which path users should take next in a process that has multiple possible paths, e.g., an order worth less than US$1,000 does not need manager approval, but over that amount it does. On its part, an example of event handling would be a necessary step after a product recall: a “pull from shelves” notification must be sent throughout the distribution channels.

These capabilities can be pretty powerful, since in general, if users can come up with a standard rule that specifies when a particular event should happen, they can make it happen automatically with workflow. In other words, workflow becomes the magic ingredient that transforms many traditional transactions-capturing applications from a glorified database into fully functional tools that basically everyone in the company should find useful.
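
To make the rule and event ideas above concrete, here is a minimal, purely illustrative sketch (hypothetical function and field names, not tied to any particular workflow product) of a rule deciding an order's approval path and an event handler reacting to a product recall:

```python
def route_order(order):
    """Decide the next step for an order based on a simple approval rule."""
    # Orders under US$1,000 skip manager approval; larger orders require it.
    if order["amount"] < 1000:
        return "auto_approve"
    return "manager_approval"


def handle_event(event):
    """React to a business event with a predefined follow-up action."""
    if event["type"] == "product_recall":
        # A recall triggers a "pull from shelves" notice to the distribution channels.
        return f"Notify distribution channels: pull {event['product_id']} from shelves"
    return "no action required"


print(route_order({"id": 42, "amount": 750}))      # -> auto_approve
print(route_order({"id": 43, "amount": 12500}))    # -> manager_approval
print(handle_event({"type": "product_recall", "product_id": "SKU-981"}))
```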

Workflow Components

The individual components that make up workflow are rules and associated actions — tasks, field updates, and alerts.

In general, a workflow rule is the main container for a set of workflow instructions. It includes the criteria for when the workflow should be activated, as well as the particular actions that should take place when the criteria for that rule are met. Every workflow rule must be based on a single object that users will choose when they define the rule, as this object then influences the fields that are available for setting workflow activation criteria.

For example, if a user defines a workflow rule for the “Job Application” object in an HR application, he/she will be able to set workflow activation criteria based on the values of fields like “Job Application Number” and “Status”. Users can also set workflow activation criteria based on standard fields, like “Record Owner” or “Created Date”, as well as fields based on the currently active user when a rule is evaluated, such as their “Role” or “Time Zone”.

When a workflow rule is triggered, there are many types of actions that can occur, starting with a workflow task (or step), which assigns a task to a user according to a particular template. Just as in Microsoft Outlook, tasks include information about something that needs to be done by a certain time, such as making a telephone call, creating an order, shipping goods, or paying an invoice. Typically, assigned tasks appear in a user’s “My Tasks” related list on their home tab (or page) and generate reminder messages that pop up when a user logs in.

When an administrator defines a workflow task, he/she provides default values for data fields like “Assignee”, “Subject”, “Status”, “Priority”, and “Due Date” for tasks that are generated by its associated workflow rule. Administrators can also make sure that a notification email is sent to the assignee when a task is automatically generated.

In addition, a workflow field update changes the value of a particular field on the record that initially triggered the workflow rule, while a workflow alert sends an email according to a specified email template. Unlike workflow tasks, which can only be assigned to users of the application, workflow alerts can be sent to any user or contact, as long as they have a valid email address.

A workflow rule can include any combination of these actions when the rule is triggered. For example, one rule might send out an alert and update two fields on a particular record. The action that one workflow rule takes can also trigger the execution of another workflow rule.
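
As a rough illustration of these components, the sketch below (hypothetical names and structure, not any particular vendor's API) bundles activation criteria with a combination of actions -- a task assignment, a field update, and an email alert -- into a single rule; a second rule whose criteria match the updated field value would then fire in turn, which is the chaining behavior described above.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class WorkflowRule:
    name: str
    object_type: str                         # the single object the rule is based on
    criteria: Callable[[dict], bool]         # when should the rule fire?
    actions: List[Callable[[dict], None]] = field(default_factory=list)

    def evaluate(self, record: dict) -> None:
        if record.get("object") == self.object_type and self.criteria(record):
            for action in self.actions:
                action(record)


def assign_task(record):      # workflow task
    print(f"Task created: review {record['object']} #{record['id']}")


def update_status(record):    # workflow field update
    record["Status"] = "Under Review"
    print(f"Field updated: Status -> {record['Status']}")


def send_alert(record):       # workflow alert
    print(f"Email alert sent to recruiting@example.com about record #{record['id']}")


rule = WorkflowRule(
    name="New application received",
    object_type="Job Application",
    criteria=lambda r: r.get("Status") == "New",
    actions=[assign_task, update_status, send_alert],
)

rule.evaluate({"object": "Job Application", "id": 1007, "Status": "New"})
```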

Workflow-enabled Applications

Many enterprise applications today come with built-in workflow management capabilities, such as the Salesforce.com Enterprise Edition on-demand customer relationship management (CRM) suite and its on-demand Force.com (formerly Apex) platform, Agresso Business World (ABW) or Exact E-Synergy, to name only some.

Microsoft Dynamics CRM too includes a workflow module that users can use to automate their business processes based on the rules, logic, and actions that they design. Microsoft has revamped the workflow functionality in Microsoft Dynamics CRM 4.0 so that it now uses the Microsoft Windows Workflow Foundation (WF), whereas previous versions of Microsoft Dynamics CRM used their own proprietary workflow engine.

The result of the revised workflow functionality is that users, administrators, and developers can design and create business processes using the workflow tools with new features and a new UI for creating and monitoring the workflow processes.

Windows WF provides a comprehensive programming model, run-time engine, and tools to manage workflow logic and applications. The Microsoft Dynamics CRM workflow UI relieves users and administrators from the need to interact with WF directly. Therefore, users do not necessarily have to understand the underlying workflow technology to create workflow logic in Microsoft Dynamics CRM.

As a recap, a built-in workflow provides a tool to help companies set up and define business process activities (including the proper sequencing) that involved employees can use when working with the enterprise system’s data. Conceptually, one should think of a workflow as an application or service that runs in the background, 24 hours a day, 7 days a week, constantly evaluating the data and the multiple workflow rules in the company’s deployment.

When the workflow service encounters a trigger event, it activates the appropriate workflow rules to run the workflow actions. Typical workflow actions include sending an e-mail message, creating a task, and updating a data field on a record.
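
The following sketch (heavily simplified and hypothetical) illustrates that "always-on" evaluation idea: a service drains a queue of trigger events and runs the actions of every rule whose criteria match.

```python
import queue


def run_workflow_service(rules, events):
    """Evaluate each incoming trigger event against all deployed workflow rules."""
    pending = queue.Queue()
    for event in events:
        pending.put(event)

    while not pending.empty():
        event = pending.get()
        for rule in rules:
            if rule["matches"](event):
                for action in rule["actions"]:
                    action(event)


# One example rule: route new service requests immediately upon record creation.
rules = [{
    "matches": lambda e: e["type"] == "record_created" and e["object"] == "Service Request",
    "actions": [lambda e: print(f"Service request {e['id']} assigned to the support queue")],
}]

run_workflow_service(
    rules,
    [{"type": "record_created", "object": "Service Request", "id": 555}],
)
```

In a real deployment the loop would, of course, run continuously against live data rather than over a finite list of events.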

By implementing workflow processes in the enterprise resource planning (ERP), supply chain management (SCM) or CRM systems deployments, users can enjoy many benefits, such as:

1. Ensuring that users track and manage their customer data and processes in a consistent fashion — instead of relying on users to remember the appropriate steps for processing data, managers or administrators can create workflow rules that will automatically determine the next required steps and assign activities as necessary;
2. Processing the customer data more quickly so that, for example, new sales leads or customer service requests are assigned and routed immediately upon record creation; and
3. Allowing users to focus on more value adding activities — instead of having to perform a large number of manual repetitive steps.

Workflow vs. BPM

Both Workflow and BPM are systematic approaches and technologies to improve a company’s business processes (and performance). From a business perspective, they are ways to make people, information and computers work together more consistently and efficiently to produce needed results.

For example, a workflow/BPM-enabled application could monitor receiving systems for missing or defective items, or walk an employee through the steps to troubleshoot why an order arrived late or not at all.

Both technologies foster ongoing collaboration between information technology (IT) and business users to jointly build applications that effectively integrate people, processes and information. They provide organizations with the ability to define, execute, manage and refine processes that:

* Involve human interaction (such as placing or approving orders);
* Integrate and work with multiple diverse applications; and
* Handle dynamic process rules and changes, not just simple static flows (i.e., flows that enable tasks with multiple choices and contingencies/escalations).

The market for workflow and BPM applications is highly stratified and fragmented, in part because the currently available products stem from different origins. Namely, there are former pure integration vendors or document management/enterprise content management (ECM) vendors that have meanwhile encroached into the BPM space.

The difference between workflow tools and BPM suites is largely a semantic distinction, and the gist of the matter is that a workflow engine is at the heart of BPM suites with process execution capabilities. Also, in most cases vendors who sell applications labeled as BPM are aiming at a bigger scope and more complex projects, with elaborate software supporting even more elaborate methodologies, process definition and modeling, collaboration methods, and so on.

Features and capabilities are not necessarily the only differences between tools, since usually the products aimed at simpler processes focus strongly on “ease of use.” The designers’ assumption is generally that the users are non-IT experts within the company. Such workflow products might be built around the concept of an intelligent form. Basically, the user develops the workflow by filling in a familiar-looking form (e.g., a “tasks vs. actions” matrix), including the business rules.
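
As a purely hypothetical illustration of such a form-driven definition, the "matrix" below is just plain data a non-IT user might fill in, which the tool then interprets as a simple workflow:

```python
# Each task names the notification to send and the next task to queue (if any).
workflow_matrix = {
    "Submit expense report": {"next": "Manager review",  "notify": "manager@example.com"},
    "Manager review":        {"next": "Finance posting", "notify": "finance@example.com"},
    "Finance posting":       {"next": None,              "notify": "employee@example.com"},
}


def advance(task: str) -> None:
    """Look up the current task in the matrix and carry out its actions."""
    step = workflow_matrix[task]
    print(f"Email sent to {step['notify']}")
    if step["next"]:
        print(f"Next task queued: {step['next']}")


advance("Submit expense report")
```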

Yet the limitations of the simpler workflow tools become evident when they attempt to manage inter-process dependencies amongst several applications, handle complex database integration and handle tasks that partake in larger, more complex processes.

For more information on BPM, see TEC’s earlier articles entitled “Business Process Management: How to Orchestrate Your Business”, “Giving a Business Process Management Edge to Enterprise Resource Planning” and “Business Process Analysis versus Business Process Management.”

Special credit also goes to CIO Magazine’s articles entitled “ABC: An Introduction to Business Process Management (BPM)” and “Making Workflow Work and Flow for You.” Some useful concepts and examples were also adapted from the Salesforce.com’s AppExchange Developer Network (ADN) book entitled “Creating On-Demand Applications with AppExchange: An Introduction” and from the Microsoft Press book entitled “Working with Microsoft Dynamics CRM 4.0.”

Business Process Management: How to Orchestrate Your Business

Companies used to coordinate activities throughout the organization manually. This resulted in inefficiency and errors in the operational process and often led to difficulties in improving the process itself. Organizations are increasingly focusing on the implementation of business process management (BPM) solutions for the purpose of improving functional efficiency and effectiveness in their core business processes.

Evolution of BPM

Approximately ten to fifteen years ago, organizations began assimilating their legacy systems in specific industries or divisions by integrating enterprise applications via data transformation and routing, event triggering, process automation, and adapters. Enterprise resource planning (ERP), customer relationship management (CRM), and supply chain management (SCM) vendors were flourishing at this time. Companies automated their transaction systems with ERP software while bringing in customer-facing information systems through CRM software. Five years later, business process integration (BPI) solutions, namely business process modeling, business-to-business (B2B) connectivity, and vertical industry process templates, were built on top of these enterprise application integration (EAI) systems.

Today, the market offers BPM solutions that incorporate both the EAI and BPI functionality in addition to functionality such as workflow, business activity monitoring, web services, rule engines, and portal capability.

What Is BPM?

Business process management (BPM) was recognized by the academic world in the fifties and sixties as an important ingredient in the quality management approach. In the early nineties, authors Hammer and Champy drew the attention of business managers to process management, process (re-)engineering, and workflow management. Today, BPM is continually gaining ground. Many companies have learned from experience that BPM is a strong asset when facing the rapidly changing requirements that are typical of today's dynamic world.

The acronym BPM has been the cause of some confusion in the past. It can be mistaken for business process modeling, which is a subset of the more "evolved" business process management. It is important to note the distinction between the two.

Business process modeling is used solely for the graphical representation of the workflow, which can be either information or an actual document in a business process. Business process management is the definition of the process as a whole, including EAI, business process modeling, workflow, and even B2B transport capabilities. Furthermore, BPM should not be confused with business performance management, which belongs to the world of business intelligence (BI) and data warehousing.
Organizations regularly implement CRM, SCM, and ERP applications. As a result, key business functions such as inventory management, warehouse management, or product lifecycle management are highly integrated. All these applications focus on a specific function or area within the company and are vertically managed.

What companies are looking to do these days is to (1) achieve horizontal integration in order to cater to cross-functional business processes, and (2) achieve true process automation to enhance the processing efficiency of company transactions.

What Are the Different Components in BPM?

BPM encompasses several disciplines intended for use across different divisions and areas within organizations. Some of these disciplines are

Business Process Modeling. "Defines" the process (usually in graphical format). As explicitly modeled processes are required for all subsequent BPM disciplines, process modeling is often perceived as the starting point of BPM. Defined with the use of a process modeler (not to be confused with graphical editors such as Visio or PowerPoint), the resulting model is composed of objects that the BPM engines can interpret and relate to one another. Composed of different diagrams (to represent different dimensions of the organization), the model is stored in a structured repository.
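
A minimal sketch (hypothetical structure) of what such a repository of related objects can look like, as opposed to a purely graphical drawing, is shown below; because each task and flow is an addressable object rather than a shape in a picture, the other BPM disciplines can later query and reuse it.

```python
process_repository = {
    "order_to_cash": {
        "tasks": {
            "T1": {"name": "Receive order", "role": "Sales"},
            "T2": {"name": "Check credit",  "role": "Finance"},
            "T3": {"name": "Ship goods",    "role": "Warehouse"},
        },
        "flows": [("T1", "T2"), ("T2", "T3")],
    }
}

# The same structured model can feed documentation, automation, or analysis.
model = process_repository["order_to_cash"]
for src, dst in model["flows"]:
    print(f"{model['tasks'][src]['name']} -> {model['tasks'][dst]['name']}")
```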

Business Process Documentation. Responsible for the process-enhanced documentation. It complements the process diagrams by providing, through graphics, the what-to-do description and sequence of steps. It also adds the extended documentation by providing the how-to-do of business tasks to the model "skeleton". Items such as the work instructions, standard operating procedures, master templates, training components, etc. are added to the diagrams to create a documented process.

Business Process Certification. Takes care of the process's ability to comply either with industry documentation standards such as ISO or with an internal "gating process". It confirms that the processes have been approved or certified in a proper manner before their internal deployment.

Business Process Collaboration. Deploys processes (intranet or extranet publication) on the one hand, and provides users with the ability to leverage the process know-how into enhanced productivity via user and task collaboration, on the other hand. This BPM discipline addresses corporate-wide knowledge management (KM) by not only making documented and certified processes readily available to all employees and associates, but by also providing employees collaboration functions, which enable them to manage projects, tasks, or transactions in a work team approach.

Business Process Compliance. Establishes the process's readiness to comply with internal and external regulations (such as Sarbanes-Oxley [SOX]). The compliant, certified processes are then used to achieve governance certification, audits, or both.

Business Process Optimization. Responsible for continuous process improvement (CPI), including tools to assess the performance of the actual process against internal norms or industry benchmarks. The integrated quantitative analysis capability is used to identify bottlenecks and estimate throughput times and cost saving opportunities. This often includes a simulation engine to perform "what-if" analyses to locate process issues in a proactive manner.
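
A back-of-the-envelope sketch of that quantitative analysis, using entirely hypothetical numbers, might look like this: given per-task processing times and staffing, the bottleneck task caps the throughput of the whole process.

```python
tasks = {
    "Register claim": {"minutes_per_item": 5,  "workers": 2},
    "Assess damage":  {"minutes_per_item": 30, "workers": 3},
    "Approve payout": {"minutes_per_item": 8,  "workers": 1},
}

# Effective capacity of each task in items per hour.
capacity = {name: 60 / t["minutes_per_item"] * t["workers"] for name, t in tasks.items()}

bottleneck = min(capacity, key=capacity.get)
print(f"Capacities (items/hour): {capacity}")
print(f"Bottleneck: {bottleneck}; process throughput is roughly {capacity[bottleneck]:.1f} items/hour")
```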

Business Process Automation. Responsible for the integration between users, processes, and related applications, resulting in the system automation of the process tasks. Driven by a workflow management engine, the BPM process information, as modeled, can be used for automated transaction execution and routing, including task execution triggered by previous events, evolved task scheduling and user notification, real time monitoring of task execution, ad hoc execution, etc.
Organizations use BPM systems to improve the effectiveness of their core operations. BPM specifically coordinates interactions between systems, business processes, and human interaction. The expected results include

Saving money by automating the routing of activities and tasks to employees, taking away the non-value-adding activities such as routine decisions, transfer of data or forms etc., and providing users with tailored task lists.

Saving time by changing business processes as per technology, government, or competition requirements. With today's tight integration of process definitions and underlying applications, the changes in the definition can be deployed and communicated virtually immediately.

Adding value by opening up a range of functions that can be leveraged in a truly BPM-minded company. Value can be added in several areas—process (quantitative) analysis and optimization, quality certification (e.g., ISO)—requiring procedures to be created and published. Another area is compliance management (e.g., SOX) which is imposed on many organizations.

By implementing BPM, companies are able to orchestrate and leverage cross-functional business processes that are used over multiple systems, divisions, people, and partners.

The beneficiary of BPM systems is actually the customer. The customer will receive information sooner and products faster, which results in an improved level of customer satisfaction. This will translate into more revenue for the company.

Friday, December 4, 2009

Do You Need a Content Management System?

The ongoing need to save time and money drives organizations to look into content management. With the costs of software and implementation ranging from almost free to millions of dollars, and with the choice of the right vendor or system being vital, the decision can be daunting.

The term content management: What does it mean?

Content management is a phrase you hear everywhere these days. Companies claim they "do content management" and vendors say that they sell content management software. People who hear about content management often think about how to create a web site. The text, images, movies, etc., that are shown on web sites are the actual content indeed, but content management entails more than meets the eye.

Prior to explaining what content management is, it is useful to define the word content. Content is essentially any type or "unit" of digital information that is used to populate a page. It can be text, images, graphics, video, sound—or in other words—anything that is likely to be published across an intranet, extranet, or the Internet.

Where does content management come from?

Information, communication, and digital networks have made a major impact on today's society, and a vast amount of information is now available. A company needs to acquire and structure information that exists both within and outside of its own four walls.

Where does this need for information or this need for content come from?

It can be said that the buzzword of this era is content. Before content, the hype of the late eighties and early nineties surrounded documents. As companies were producing large volumes of information by the end of the eighties, and while business boomed for products like Word, WordPerfect, Excel, and Lotus 1-2-3, organizations faced an increasing need to organize documentation. Rather than printing and storing hard copies, the documents required digital storage. The market responded with the creation of powerful software tools to manage this process. These solutions became known as document management systems (DMS).

By the end of the nineties, the terms changed from document management systems to content management systems (CMS). A lot of DMS vendors suddenly called themselves CMS vendors since the main difference between document management and content management is the fact that document management deals with the document in its entirety, while content management focuses on the individual parts that make up a document or even a web page.

Both systems follow the same basic rules, workflow and processes. However, due to the evolution of the Internet, companies began to be more focused on managing the web site at the content rather than document level. This caused the market to shift from document management systems to content management systems.

Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times -- A Panel Discussion Analyzed Part Six: Custom Development and Single-Vendor versus Multi-Vendor Solutions

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, senior vice president, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability", which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as The Manufacturing Systems; and Predrag Jakovljevic, research director, TEC. In preparation for the event, we polled the thoughts and opinions of our experts and contributors: Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, given they were unable to attend the event in person.

Below are the questions and consolidated thoughts and answers that transpired from the panel discussion. We also took the liberty to expand with a few pertinent questions and thoughts that were not discussed at the panel per se (due to the time limit), but transpired from many other interactions and presentations at the conference. Also, some pertinent articles published previously on our site, which may shed more light on the respective topics, are mentioned as further recommended reading.

The questions are

Q1. What is the one piece of new software or technology that will be a must-have in the next five years? (see Part One)

Q2. Some pundits say the future of enterprise software lies in service-oriented architectures and component applications. True? False? (see Part One)

Q3. How does the development of new business processes and business process modeling fit in? (see Part Two)

Q4. What are applications hosting and other service models? (see Part Three)

Q5. Radio frequency identification (RFID) is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID. (see Part Four)

Q6. Technology aside for a moment, what can we say about its impact on profitability? (see Part Five)

Q7. With all this new technology, the question is what happens to existing applications and technology. Nobody wants to start over, but how much will existing IT systems have to change? (see Part Five)

Q8. Will the newest and greatest only come from packaged software? What about custom development? What does the build versus buy equation look like in the near future? (see Part Six)

Q9. How will the latest improvements in software flexibility and agility play in the single-vendor versus multi-vendor solution equation at multi-division corporations? (see Part Six)

Leveraging Technology to Maintain a Competitive Edge during Tough Economic Times -- A Panel Discussion Analyzed Part Five: Profitability and Changing Existing Systems

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, senior vice president, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability", which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as The Manufacturing Systems; and Predrag Jakovljevic, research director, TEC. In preparation for the event, we polled the thoughts and opinions of our experts and contributors: Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, given they were unable to attend the event in person.

Below are the questions and consolidated thoughts and answers that transpired from the panel discussion. We also took the liberty to expand with a few pertinent questions and thoughts that were not discussed at the panel per se (due to the time limit), but transpired from many other interactions and presentations at the conference. Also, some pertinent articles published previously on our site, which may shed more light on the respective topics, are mentioned as further recommended reading.

Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times--A Panel Discussion Analyzed Part Four: RFID Software Issues

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, senior vice president, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability", which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as The Manufacturing Systems; and Predrag Jakovljevic, research director, TEC. In preparation for the event, we polled the thoughts and opinions of our experts and contributors: Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, given they were unable to attend the event in person.

Below are the questions and consolidated thoughts and answers that transpired from the panel discussion. We also took the liberty to expand with a few pertinent questions and thoughts that were not discussed at the panel per se (due to the time limit), but transpired from many other interactions and presentations at the conference. Also, some pertinent articles published previously on our site, which may shed more light on the respective topics, are mentioned as further recommended reading.

The questions are

Q1. What is the one piece of new software or technology that will be a must-have in the next five years? (see Part One)

Q2. Some pundits say the future of enterprise software lies in service-oriented architectures and component applications. True? False? (see Part One)

Q3. How does the development of new business processes and business process modeling fit in? (see Part Two)

Q4. What are applications hosting and other service models? (see Part Three)

Q5. Radio frequency identification (RFID) is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID. (see Part Four)

Q6. Technology aside for a moment, what can we say about its impact on profitability? (see Part Five)

Q7. With all this new technology, the question is what happens to existing applications and technology. Nobody wants to start over, but how much will existing IT systems have to change? (see Part Five)

Q8. Will the newest and greatest only come from packaged software? What about custom development? What does the build versus buy equation look like in the near future? (see Part Six)

Q9. How will the latest improvements in software flexibility and agility play in the single-vendor versus multi-vendor solution equation at multi-division corporations? (see Part Six)
Q5. RFID is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID.

A5: Well, we have all likely heard of some concrete examples (or imagined ideas) of expensive (and thus highly pilfered) retail items (such as razors, prescription drugs, apparel, and DVDs) packaged with pin-sized chips and tiny antennae that send retailers and manufacturers information about their use, and even about those who buy (or attempt to steal) them. Or the stories of grocery clerks immediately knowing when perishable items on the shelf have expired and replacing them before the items are purchased. We've also heard of a consumer ordering the latest "hot item" and tracking it in real time through the entire supply chain right up to the time when it is ready to be picked up. How about the idea of tracking employees and their labor with an RFID chip embedded in their ID badges to automatically record their transactions and even control their authorizations for a given area to detect security issues?

These futuristic-sounding scenarios (though not necessarily of the future, given such technology was employed decades ago, but only where its price was justified, like in the defense industry or to track the movements of precious pets) are being touted as the applications of an automatic identification and data capture technology named radio frequency identification (RFID). RFID uses low-powered radio transmitters to read data stored in smart tags embedded with minuscule chips and antennae. The tags are attached to packaged goods that can communicate with electronic reading devices and deliver a message to a computer that alerts retailers, suppliers, and manufacturers when a product's state has changed and requires action.

While the potential of RFID technology is indisputable (for example, unlike bar codes, RFID requires no direct contact or line-of-sight scanning, and it provides streams of data that can be differentiated and interpreted before being passed to an enterprise application), much more is required to move RFID from a lab to a live environment. RFID has the potential to be a new technology inflection point, and it can be a missing piece in the long-lasting puzzle of squeezing excess inventory out of supply chains. It will only be that piece, however, when (and if) it reaches a critical mass of adoption and maturity over the next several years. Nowadays, the market is still in a "chicken-and-egg" conundrum—until more companies commit to RFID, the cost of tags and other infrastructure will remain prohibitively high for mass deployment. A few years ago, typical smart-label tags were between $1 and $2 (USD) each, while today we may be looking at production volumes in the millions, costing 30 to 40 cents (USD) per tag. Volumes are further projected to reach billions of tags on individual items in the future, which should ideally cause the cost to fall to five cents (USD) or so. Eventually, in the long term, the price might fall to a penny or less, with new technology and even greater volumes. Still, while the tag price might seem like a major barrier now, it will likely become a minor issue down the track, when many companies start grappling with RFID deployments in earnest.
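
A rough, purely illustrative calculation (hypothetical volumes) shows why the per-tag price dominates the economics of moving from case-level to item-level tagging:

```python
def annual_tag_cost(units_per_year: int, price_per_tag_usd: float) -> float:
    return units_per_year * price_per_tag_usd


# Case/pallet-level tagging at roughly 35 cents per tag:
print(annual_tag_cost(1_000_000, 0.35))      # 350,000 USD per year

# Item-level tagging only becomes palatable at far lower tag prices:
print(annual_tag_cost(100_000_000, 0.05))    # 5,000,000 USD per year at 5 cents
print(annual_tag_cost(100_000_000, 0.01))    # 1,000,000 USD per year at a penny
```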

Over that time, many companies will begin to deliver and potentially receive a higher proportion of goods with RFID tags and, thus, they will have a better understanding of the technology and its potential for broader business improvements per se, rather than only pursuing it for the mandated Wal-Mart, (US) Department of Defense (DoD), or Target compliance. Namely, as the world's largest retailer, with over 5,000 outlets worldwide, Wal-Mart currently uses traditional bar-coding and UPCs (universal product codes) to identify items and cases or pallets of goods as they move through the supply chain and out to the stores. By 2005, Wal-Mart envisioned having live implementations of RFID tagging using new EPCs (electronic product codes, which can carry more useful data than UPCs), with a mandate to the top 100 suppliers to provide RFID tags on cases and pallets at distribution centers, followed by item-level tagging at a much later date. EPCs on tags should be easier and quicker to read than bar codes, since there is supposedly no need to unpack pallets to check contents, as RFID readers, unlike bar-code scanners, do not require line-of-sight, which should all result in less labor, fewer errors, and better management of inventory.

However, companies implementing RFID should expect increased labor in the first year or so, because vendors have yet to perfect solutions for automating tagging and embedding RFID in packaging material. The current state of RFID technologies also revolves around label creation and production, plastic chip development, and intelligent shelving and packaging, to name but a few areas. Furthermore, to gain benefits such as product tracking, supply chains should logically begin RFID implementation at the manufacturing level, rather than at the distribution center, which is one step closer to the retailer in the supply chain. Still, "source tagging" cases at the manufacturer is too disruptive for most companies to implement.

Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times -- A Panel Discussion Analyzed Part Three: Applications Hosting

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, senior vice president, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability", which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as Manufacturing Systems; and Predrag Jakovljevic, research director, TEC. In preparation for the event, we polled the thoughts and opinions of our experts and contributors Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, given that they were unable to attend the event in person.

Below are the questions and the consolidated thoughts and answers that transpired from the panel discussion. We have also taken the liberty of expanding with a few pertinent questions and thoughts that were not discussed at the panel per se (due to the time limit), but that transpired from many other interactions and presentations at the conference. Pertinent articles published previously on our site, which may shed more light on the respective topics, are mentioned as further recommended reading.

The questions are:

Q1. What is the one piece of new software or technology that will be a must-have in the next five years? (see Part One)

Q2. Some pundits say the future of enterprise software lies in service-oriented architectures and component applications. True? False? (see Part One)

Q3. How does the development of new business processes and business process modeling fit in? (see Part Two)

Q4. What are applications hosting and other service models? (see Part Three)

Q5. Radio frequency identification (RFID) is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID. (see Part Four)

Q6. Technology aside for a moment, what can we say about its impact on profitability? (see Part Five)

Q7. With all this new technology, the question is what happens to existing applications and technology. Nobody wants to start over, but how much will existing IT systems have to change? (see Part Five)

Q8. Will the newest and greatest only come from packaged software? What about custom development? What does the build versus buy equation look like in the near future? (see Part Six)

Q9. How will the latest improvements in software flexibility and agility play in the single-vendor versus multi-vendor solution equation at multi-division corporations? (see Part Six)

Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times -- A Panel Discussion Analyzed Part Two: Business Process Modeling

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, senior vice president, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability", which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as Manufacturing Systems; and Predrag Jakovljevic, research director, TEC. In preparation for the event, we polled the thoughts and opinions of our experts and contributors Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, given that they were unable to attend the event in person.

Below are the questions and the consolidated thoughts and answers that transpired from the panel discussion. We have also taken the liberty of expanding with a few pertinent questions and thoughts that were not discussed at the panel per se (due to the time limit), but that transpired from many other interactions and presentations at the conference. Pertinent articles published previously on our site, which may shed more light on the respective topics, are mentioned as further recommended reading.

The questions are:

Q1. What is the one piece of new software or technology that will be a must-have in the next five years? (see Part One)

Q2. Some pundits say the future of enterprise software lies in service-oriented architectures and component applications. True? False? (see Part One)

Q3. How does the development of new business processes and business process modeling fit in? (see Part Two)

Q4. What are applications hosting and other service models? (see Part Three)

Q5. Radio frequency identification (RFID) is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID. (see Part Four)

Q6. Technology aside for a moment, what can we say about its impact on profitability? (see Part Five)

Q7. With all this new technology, the question is what happens to existing applications and technology. Nobody wants to start over, but how much will existing IT systems have to change? (see Part Five)

Q8. Will the newest and greatest only come from packaged software? What about custom development? What does the build versus buy equation look like in the near future? (see Part Six)

Q9. How will the latest improvements in software flexibility and agility play in the single-vendor versus multi-vendor solution equation at multi-division corporations? (see Part Six)

Leveraging Technology to Maintain a Competitive Edge during Tough Economic Times—A Panel Discussion Analyzed Part One: Introduction

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, SVP, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability," which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as Manufacturing Systems; and Predrag Jakovljevic, research director at TEC. In preparation for the event, we polled the thoughts and opinions of our experts and contributors Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, given that they were unable to attend the event in person.

Below are the questions and the consolidated thoughts and answers that transpired from the panel discussion. We have also taken the liberty of expanding with a few pertinent questions and thoughts that were not discussed at the panel per se (due to the time limit), but that transpired from many other interactions and presentations at the conference. Pertinent articles published earlier on our site, which may shed more light on the respective topics, are mentioned here as further recommended reading.

The questions are:

Q1. What is the one piece of new software or technology that will be a must-have in the next five years? (see Part One)

Q2. Some pundits say the future of enterprise software lies in service-oriented architectures and component applications. True? False? (see Part One)

Q3. How does the development of new business processes and business process modeling fit in? (see Part Two)

Q4. What are applications hosting and other service models? (see Part Three)

Q5. Radio frequency identification (RFID) is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID. (see Part Four)

Q6. Technology aside for a moment, what can we say about its impact on profitability? (see Part Five)

Q7. With all this new technology, the question is what happens to existing applications and technology. Nobody wants to start over, but how much will existing IT systems have to change? (see Part Five)

Q8. Will the newest and greatest only come from packaged software? What about custom development? What does the build versus buy equation look like in the near future? (see Part Six)

Q9. How will the latest improvements in software flexibility and agility play in the single-vendor versus multi-vendor solution equation at multi-division corporations? (see Part Six)

Business Process Management: How to Orchestrate Your Business

Companies used to coordinate activities across the organization manually, which resulted in inefficiency and errors in operational processes and often made it difficult to improve the processes themselves. Organizations are now increasingly focusing on implementing business process management (BPM) solutions in order to improve the efficiency and effectiveness of their core business processes.
Approximately ten to fifteen years ago, organizations began assimilating their legacy systems in specific industries or divisions by integrating enterprise applications via data transformation and routing, event triggering, process automation, and adapters. Enterprise resource planning (ERP), customer relationship management (CRM), and supply chain management (SCM) vendors flourished at this time, as organizations automated their transaction systems with ERP software and complemented them with information from CRM systems. Five years later, business process integration (BPI) solutions, namely business process modeling, business-to-business (B2B) connectivity, and vertical industry process templates, were built on top of these enterprise application integration (EAI) systems.

Today, the market offers BPM solutions that incorporate both the EAI and BPI functionality in addition to functionality such as workflow, business activity monitoring, web services, rule engines, and portal capability.
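To make the combination of workflow, rules, and business activity monitoring less abstract, the following sketch shows, in miniature, how a BPM engine typically routes a piece of work through a process. It is a generic illustration under assumed names (order_to_cash, needs_manager_approval, and the monitoring hook are invented for this example), not any vendor's actual API.

```python
# Miniature illustration of BPM building blocks: a process defined as ordered
# steps, a business rule that routes work, and a monitoring hook for dashboards.
# Generic and illustrative; not modeled on any particular BPM product.

from typing import Callable

Order = dict  # e.g. {"amount": 12000, "customer": "ACME"}

def needs_manager_approval(order: Order) -> bool:
    """A simple business rule a rule engine might evaluate."""
    return order["amount"] > 10_000

def monitor(step: str, order: Order) -> None:
    """Business activity monitoring hook: record each step as it happens."""
    print(f"[BAM] step={step} customer={order['customer']}")

def order_to_cash(order: Order, approve: Callable[[Order], bool]) -> str:
    """An order-handling process with rule-driven routing."""
    monitor("received", order)
    if needs_manager_approval(order) and not approve(order):
        monitor("rejected", order)
        return "rejected"
    monitor("fulfilled", order)
    return "fulfilled"

outcome = order_to_cash({"amount": 12_000, "customer": "ACME"},
                        approve=lambda order: True)
print("outcome:", outcome)
```

A full BPM suite adds graphical process modeling, versioned deployment, and integration adapters on top of this core loop, but the routing logic is conceptually the same.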

Friday, November 6, 2009

The Duet Architecture Outlined

Despite the seemingly simple processes that Duet enables, the product exemplifies how serious an undertaking a commercially available composite application can be. When the goals of Project Mendocino (now Duet) were first formulated, it reportedly[1] became clear that two very different architectures needed to be brought together. On one side was the ubiquitous client application, which required local data storage, while on the other side was the proverbially complex SAP ERP environment. The different technologies involved reportedly made it quite easy to select Web services as the interface technology, since both camps had added standards-based Web services support in their latest releases. In this case, however, simply connecting the two worlds using Web services did not offer a comprehensive enough solution. Namely, the goals required more extensibility, because SAP wanted to enable a model-driven environment on the client side, which would allow Duet to push additional screens and updates to the user without the need to run through installation and reinstallation cycles whenever business needs changed.

On top of that, Microsoft Office works in both online and offline modes. This capability had to be maintained in Duet as well, since users had to be able to trigger activities while offline, which would then be automatically resynchronized once their machines went back online. At the time, Microsoft and SAP also realized that the disparate system components involved (i.e., Microsoft Exchange Server, Microsoft Office, and SAP ERP systems) were of such high value to customers that massively updating those environments (and exposing the existing system landscape to unnecessary disruption) would be unacceptable. SAP and Microsoft weighed these requirements carefully and realized that there was a need for a communications hub that would sit between the two existing environments and "mediate" communication and processes. The hub collects various configurations from the back-end system, determines the objects that should be exposed, and decides which activities the user can trigger and how all of that ties together within the user screen. The communications hub is also referred to as the Duet Extensions, which connect the Microsoft Office client and the SAP ERP system. The result is that there are three primary parts to Duet's architecture: 1) the Duet Extensions; 2) the Microsoft Office Add-On; and 3) the SAP ERP foundation.
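The mediation and offline-resynchronization ideas described above can be sketched in a few lines of Python. This is emphatically not Duet's actual code; the class names (DesktopClient, CommunicationsHub, BackEndService) and the leave-request example are hypothetical stand-ins for the Office add-on, the Duet Extensions, and a back-end web service, respectively.

```python
# Conceptual sketch of the "communications hub" pattern: the client queues
# actions while offline and the hub replays them against the back end once
# connectivity returns. All names are hypothetical; this is not Duet code.

from collections import deque

class BackEndService:
    """Stand-in for a back-end web service (e.g., a leave request service)."""
    def submit(self, action: dict) -> None:
        print(f"back end received: {action}")

class CommunicationsHub:
    """Mediates between the desktop client and the back-end system."""
    def __init__(self, backend: BackEndService) -> None:
        self.backend = backend

    def forward(self, action: dict) -> None:
        self.backend.submit(action)

class DesktopClient:
    """Caches actions locally while offline; resynchronizes when back online."""
    def __init__(self, hub: CommunicationsHub) -> None:
        self.hub = hub
        self.online = False
        self.outbox = deque()

    def trigger(self, action: dict) -> None:
        if self.online:
            self.hub.forward(action)
        else:
            self.outbox.append(action)      # stored locally while offline

    def go_online(self) -> None:
        self.online = True
        while self.outbox:                  # automatic resynchronization
            self.hub.forward(self.outbox.popleft())

client = DesktopClient(CommunicationsHub(BackEndService()))
client.trigger({"type": "leave_request", "days": 2})  # queued while offline
client.go_online()                                     # replayed to the back end
```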

More Than Meets the Eye

In addition to disseminating useful SAP data among knowledge workers (beyond its traditionally limited power-user dispatch list), Duet has been crucial as a "proof of concept" illustrating the potential development and adoption of composite applications, especially as the result of a joint collaboration between two software giants and market influencers.

Indeed, Duet is one of the first examples of a tangible SOA-based composite application product. While several tools use SOA conceptually, in ways that are sometimes hard to grasp, this tool is based on consuming services in concrete ways that benefit almost every information worker. Duet showed how SOA can be applied to the user experience through familiar desktop applications, and for some users, it delivers functionality that eliminates the need to work directly with any line-of-business (functional department) or back-office enterprise applications. By exposing functionality and giving even the most casual users an easier way to update data that normally resides only in the back-office system, Duet embraced the innovative potential of SOA services. It exposes features from underlying ERP systems in new ways that create more value, and these services can be used together, even though they were probably written for a system that was designed before SOA was even a figment of anyone's imagination.

This fulfills one of SAP's short-term goals for ESA (SAP's variant of the SOA blueprint) adoption: to create simple services (software components, if you will) that work on top of the legacy applications organizations already use. In the future, the entire stack, which encompasses ERP, CRM, and all other SAP Business Suite solutions, will eventually evolve to use business objects as its underlying application model. Instead of having a rigid and unwieldy monolithic set of applications, SAP is creating a collection of business objects that can be applied in more flexible ways. By late 2007, there will be more services to choose from than the ones used to support Duet, since ESA follows the SOA format of "model once, run anywhere". Namely, instead of hard-coding multiple solutions that apply to different domains, ESA employs business objects or services that are modeled in a way that allows them to handle different solutions. Duet is just one of many client-side solutions that ESA will enable.

To understand how Duet is in tune with SOA, it is important to become familiar with the new stack defined by SAP ESA, and to understand what a composite application is. Webopedia defines a composite application as an application that consists of more than one type of service delivered from an SOA environment. It can range from functionality to entire applications. Services are generated through "local" application logic that controls how services interact with each other. For more information, see Understanding SOA, Web Services, BPM, and BPEL.
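In other words, the composition logic lives outside the individual services. A toy example in that spirit is shown below; the two services and the data they return are hypothetical stand-ins (not actual SAP or Microsoft endpoints), and the point is only that local logic consumes more than one service and presents a single view.

```python
# A toy composite application: local logic that stitches together two
# independent services into one view. Service names and data are hypothetical.

def time_reporting_service(employee_id: str) -> dict:
    """Stand-in for an ERP-side enterprise service."""
    return {"employee": employee_id, "hours_this_week": 38}

def calendar_service(employee_id: str) -> list:
    """Stand-in for a desktop/groupware-side service."""
    return ["Mon 09:00 project review", "Fri 14:00 time sheet due"]

def weekly_summary(employee_id: str) -> str:
    """The 'local' composition logic: consume both services, show one view."""
    hours = time_reporting_service(employee_id)["hours_this_week"]
    next_event = calendar_service(employee_id)[0]
    return f"{employee_id}: {hours}h booked; next: {next_event}"

print(weekly_summary("E1001"))
```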

As a composite application, Duet overlaps with nearly every part of the new SOA stack:

  • User screens. Duet uses the familiar Microsoft Office desktop interface, which is achieved not by hard-coding the UI, but by modeling screens on the back end and deploying them to the client.

  • Process orchestration. Duet uses a communications hub referred to as the Duet Extensions to route data to and within the ERP system.

  • Process integration. Using the aforementioned extensions, Duet translates data from Microsoft Office applications such as Excel into a format that is easily understood by existing ERP tools and their respective enterprise services.

  • Process workflow. All of the usual workflow processes within SAP ERP take place within the context of Microsoft Office's desktop tools.

  • Distributed data. The ability to cache data for working online or offline also plays an important part in the functionality.

Application Giants in Duel—and Duet—for Users' Hearts, Minds … and Wallets

The relationship between the two software powerhouses, Microsoft and SAP, has been intriguing, to put it mildly, at least since Microsoft's entry into the enterprise applications arena in late 2000 (see Microsoft 'The Great' Poised to Conquer Mid-market, Once and Again ). While the relationship has been depicted by many through a myriad of antonyms, such as "on-off", "hot-cold", or "love-hate", currently it can best be described as "mutually civil". One can even find some uncanny similarities between the two, such as the occasional involvement in intellectual property lawsuits (whether as plaintiffs or defendants) or the relatively recent, almost coinciding departures of technological visionaries, Satya Nadella and Shai Agassi, respectively (although Nadella was merely transferred within Microsoft, to the search and ad group that hopes to fend off Google's undeniable threat).

Microsoft and SAP then entered a "strange bedfellows" or co-opetition phase in their relationship by dallying in business applications. Acting like two high-profile on-again, off-again celebrities, the two were dismissive of questions from the press and analysts about the inevitable competition this partnership would create (i.e., responding "We do target different sizes of companies"). Nonetheless, this stance became moot owing to SAP's forays into small business via SAP Business One and Microsoft's propping up of Microsoft Dynamics AX as an upper mid-market solution. Then came a perceived snub, when SAP opted for Java 2 Enterprise Edition (J2EE) as the primary development environment for its infrastructure and development platform (while nonetheless offering some less valuable interface options for the counterpart Microsoft .NET Framework environment). However, SAP's move was quite logical given the still lingering perception of Java's better fit for larger enterprises (see Understand J2EE and .NET Environments Before You Choose ).

Any hard feelings between SAP and Microsoft were short lived, as we found out in 2004, when the two engaged in secret (and startling) merger talks that were quickly shelved before the news broke (whether for good remains to be seen). For most of that year, both vendors had to spend time explaining their separate forays into developing next-generation, service-oriented architecture (SOA)-enabled products. Then 2005 seemed to be the year of bliss, when the two expressed mutual respect and even worked jointly on a commercially available product featuring the best of both worlds. Specifically, SAP and Microsoft joined together to leverage the openness of the SAP NetWeaver and Enterprise Service Architecture (ESA) blueprint (see Multipurpose SAP NetWeaver ) with the .NET-based architecture of the Microsoft Office desktop applications suite (see Subtle [or Not-so-subtle] Nuances of Microsoft .NET Enablement ). The result was the joint product code-named Project Mendocino (the name of a town halfway between the companies' respective US headquarters), which promised to deliver familiar Microsoft Office desktop management and productivity tools as the façade for the heavy-duty lifting of SAP's enterprise applications. In other words, Project Mendocino extended and automated selected business processes from SAP ERP (enterprise resource planning) through the familiar Microsoft Office user interface (UI), by providing role-relevant displays of information while retaining SAP applications' process context and the necessary collaboration and analytic tools.

Wednesday, October 21, 2009

Agresso + CODA, VITA + Link (+ CODA 2go): What’s the Sum? – Part 2

Enter CODA Link Architecture

For its part, CODA’s value proposition is in being a best-in-class financial management solution with possibly unmatched connectivity (i.e., it plays nicely with others, if not almost everyone in the yard). By the very nature of its narrow functional scope, CODA’s financial management software provides a stand-alone solution that simply must fit into customers’ existing IT infrastructure to work with other business systems without negatively impacting them.

CODA focuses on solutions targeted at chief financial officers (CFOs) and controllers. The “best-in-class” financial management designation comes from the single, Web browser-based general ledger design that accommodates the “multi-everything” mantra (i.e., multi-currency, multi-country, multi-dimension, multi-subledger, etc.). This way, CODA is able to meet both local and global requirements, and the system is compliant with the Sarbanes-Oxley Act (SOX), Generally Accepted Accounting Principles (GAAP), and International Financial Reporting Standards (IFRS).

CODA’s customers have been raving about the vendor meeting their needs for consistent and accurate data, and an up-to-date “single version of the truth.” In addition, they often talk about improved financial processes (e.g., purchase-to-pay, invoice-to-collection, record-to-report, etc.), more streamlined and effective financial period closing practices, complete audit trails, and flexible enterprise reporting and analysis capabilities.

CODA-Financials is targeted at midsize and large companies across all public and commercial sectors, while CODA Dream targets small and medium enterprises (SMEs), primarily in the UK. Both products have a long heritage, and CODA certainly has a remarkable reputation in the UK’s CFO/controller community.

CODA-Financials has a similar number of customers as Agresso, and has customers in all geographies (about 2,800 customers in over 100 countries). CODA has local sales and service & support hubs in the US, Europe, and Singapore.

The current Release 11 of CODA Financials (code-named Neon) has seen significant research and development (R&D) investment (the vendor estimates around 300 person years) to meet its customers’ changing needs. These are along the lines of helping organizations to achieve superior finance processes and improve business visibility (e.g., performance by company, location, product, line of business, etc.), regulatory compliance, and corporate governance.

In recent years, CODA has expanded its offerings beyond accounting transactions into other areas relevant to CFOs, such as financial analysis, financial consolidation, cash management (through the acquisition of OCRA), and financial governance solutions/business process control. This has enabled CODA to cross-sell these solutions to existing customers and even to organizations that do not necessarily use CODA’s financial management applications. However, the effort has not realized significant increases in revenue. For more on these events, see TEC’s 2005 series entitled “Best-of-breed Approach to Finance and Accounting.”

Superior Connectivity

The underlying Link architecture provides the backbone for CODA’s sophisticated and interoperable enterprise financials solution. Link gives financial executives change management capabilities for their applications, in terms of fast implementation, “low-impact” integration, and pain-free upgrading.

CODA’s standalone specialist financial management software has been designed to work with other surrounding IT systems, thanks to its notable system compatibility and easy integration. For one, the vendor’s stand-alone components fit into existing infrastructures due to their support for the following platforms (per each architectural layer):

* Hardware – Windows/Intel (Wintel); HP-UX servers; SUN SPARC; IBM pSeries (formerly RS/6000); and IBM System i (formerly iSeries and AS/400)
* Operating systems (OS) – Microsoft Windows; UNIX; Linux; and OS/400
* Databases – Microsoft SQL Server; Oracle; Sybase; and IBM DB2
* Web servers – Internet Information Services (IIS) and Apache HTTP Server
* Web browsers – Windows Internet Explorer (IE) and Mozilla Firefox
* Integrated development environments (IDEs) – Microsoft .NET Framework and Java Platform Enterprise Edition (formerly J2EE).

Moreover, integration with other key business systems can take place either via CODA 2Link, a user-friendly integration tool (with more structure to it), or simply via Web Services that capitalize on SOA principles. CODA 2Link offers a choice of appropriate toolsets for integration, starting with table-based batch loading. Integration can also be done via online remote procedure calls (RPCs), via extensible markup language (XML) in a distributed manner, or via a combination of XML interfaces and Web Services.

That is to say, this architecture blueprint provides both simple integration (via Microsoft Excel uploads) and advanced integration options. The latter include Structured Query Language (SQL) batch uploads, and Visual Basic .NET and/or C-language application programming interfaces (APIs) for legacy systems. Last but not least, as mentioned above, integration can also be programmatic via XML and Web Services.
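As a rough illustration of what the programmatic (XML plus Web Services) route looks like compared with a simple spreadsheet upload, the sketch below posts an XML journal document over HTTP. The endpoint URL, payload layout, and field names are hypothetical and do not represent CODA’s actual Web Services interface.

```python
# Illustrative only: posting an XML journal document to a finance web service.
# The endpoint, payload layout, and field names are hypothetical; this is not
# CODA's actual Web Services interface.

import urllib.request
import xml.etree.ElementTree as ET

def build_journal_xml(reference: str, amount: str, currency: str) -> bytes:
    doc = ET.Element("JournalPosting")
    ET.SubElement(doc, "Reference").text = reference
    ET.SubElement(doc, "Amount").text = amount
    ET.SubElement(doc, "Currency").text = currency
    return ET.tostring(doc, encoding="utf-8")

def post_journal(endpoint: str, payload: bytes) -> int:
    """POST the XML payload and return the HTTP status code."""
    request = urllib.request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/xml"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

xml_payload = build_journal_xml("INV-1001", "250.00", "GBP")
print(xml_payload.decode())
# post_journal("https://finance.example.com/journals", xml_payload)  # hypothetical endpoint
```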

User access is via a pure Web client or embedded within Microsoft Office (CODA also has its own implementation of AJAX called APE). Moreover, personal digital assistant (PDA) devices and mobile delivery of personalized reports are also supported.

The Web-based deployment and infrastructure for effective data management provides secure and personalized access to an up-to-date “single version of the truth.” The system ensures that everyone is “on the same page” by keeping functional updates “in sync” for all users, and by gathering, unifying, and analyzing data from systems across the entire organization, in a timely manner.

The system offers wide-ranging automation capabilities for data entry and processing, reconciliation, reporting, and financial processes. Personalization capabilities are also at the core of the architecture, with users driving configuration and tailoring of forms, inquiries, reports, and so on. There is a single graphical user interface (GUI) and look-and-feel for all CODA products, and users can redesign CODA’s processes and screens that come “out of the box.”

But the “Future Proofing” feature is the ability to decouple users’ personalized interfaces from the underlying CODA version on the server side, which provides for minimal impact on interfaces when moving to the latest CODA release. In other words, all user-driven customization and integration is preserved and protected through the upgrade process.

So, How is CODA’s SOA Better Than Other SOAs?

In addition to the aforementioned support for multiple platforms and personalization capabilities, I was wondering whether the CODA 2Link SOA-based architecture is any different from, and better than, other SOA counterparts, and if so, how. In other words, SOA is known for plugging pieces together, and most SOA platforms are fairly evenly matched in that regard.

If there is something that differentiates CODA 2Link’s performance from, say, the connection capabilities of SAP NetWeaver, IBM WebSphere, or Oracle Fusion Middleware (OFM), then users need to know that, and why it is better. CODA believes that its unique selling proposition (USP) with CODA Link is the extent of its coverage. Namely, all of the granular functions within CODA are available as Web Services, which means that anything users can do within CODA’s finance system can easily be integrated with and accessed from another application (a front-end business system or another back-end system, for example).

This also means that users can achieve a greater level of integration than with other systems, and avoid the usual pitfalls of enterprise application integration (EAI) and middleware products, such as having to duplicate customer records or other data. We should also note that OFM, WebSphere, and NetWeaver are really middleware offerings that require extensive certification processes for best results. What CODA provides is standards-based (per the Web Services Interoperability organization, WS-I) service entry points into its business applications.

The vendor is not supplying a middleware solution per se, but rather a finance engine that can sit at the heart of an enterprise-wide, integrated, best-of-breed application suite that meets the unique requirements of the customer in a way that no broad, homogeneous application suite could. The organic use of Web Services provides

* the ability to upgrade CODA but not have to change users’ screens;
* the ability to use the same development methods irrespective of the underlying hardware and software platform;
* integration with other systems independent of location (i.e., intranet or outside the firewall, at subsidiaries, affiliates, business partners, etc.);
* a single point of maintenance (repository) for financial business rules and security.

Furthermore, the entire Link infrastructure has full version support, so that CODA can guarantee that any integrations made using any of the technologies it offers within its architecture will continue to work through future upgrades of CODA. This versioning support, which enables consumers to maximize their investment in R&D around solutions they build even over multiple upgrades from CODA, is possibly a unique proposition that should resonate with some prospective customers.

Back to Agresso + CODA

The merger with Agresso has certainly given CODA a safer harbor from less friendly acquirers, while Agresso now has a two-pronged product strategy along the “change” theme. On a somewhat negative note, despite Agresso’s ongoing success, its revenue is still centered on Europe, with only 9 percent of 2008 revenue coming from outside the region.