Sunday, December 6, 2009

Exact Acquires Vanguard Solutions Group

Exact not only expects Vanguard to continue to sell GPS through these existing partners, but also expects Vanguard to expand its partner network. Balio, president and general manager of the new division, defines the new relationship as "co-opetition", which, as the name suggests, involves both cooperation and competition. He explains that while "Vanguard's ERP partners will compete with Exact for new accounts, they will also cooperate with Vanguard in enhancing their product offerings. It has been and continues to be our objective to provide our partners with a complete, integrated analytics offering. Our partners use our analytics to enhance the value of their products to both their existing customers and new accounts. We are dedicated to maintaining the partnership relationships to the benefit of all involved."

Prior to the Vanguard acquisition, in mid-2005, Exact acquired several other companies. In August, it took a majority interest in Modulair Easy Access B.V., a logistics and warehouse software firm, while in June it acquired Kooijman Software, a software provider to construction and education organizations, and AllSolutions, a software provider to non-profit and service organizations. In May 2004, Exact acquired the Treadstone Group, a Cincinnati, Ohio-based (US) IT consulting and systems integration firm. All these acquisitions contributed to an incremental, rather than dramatic, revenue increase, and also to a gradual expansion of geographic coverage.

The acquisitions follow the appointment of Rajesh Patel as corporate CEO on July 1, 2005. In a keynote address at the Engage 2005 annual user conference in early October, Patel said that Exact was firmly focused on the mid-market and that a core part of the company's growth strategy is not to be "distracted by [the] digestion or integration issues" caused by rampant acquisitions for market share purposes. He also stated that Exact has no ambition to move upmarket.

The Wizardry of Business Process Management

The business process management (BPM) market is sizzling hot, with Gartner Dataquest estimating its compound annual growth rate (CAGR) at 13 percent in 2009. In fact, almost all leading BPM vendors have been buzzing about their unprecedented growth and profitability, especially amidst the ongoing economic drought.

It is truly difficult to argue against the need for companies from all walks of life to improve their business processes. Doing “better, faster, and cheaper” is the “slogan du jour.”

In his keynote presentation during the recent Lombardi Driven Online virtual conference, Lombardi Software’s CEO Ron Favaron referred to BPM as “Business Pressure Management.” That pretty much says it all. Logically, to the end of managing business pressures, Lombardi offers its broad BPM suite called TeamWorks Enterprise Edition [evaluate this product].

I also recently attended a Webcast by Appian Corporation, possibly the first BPM company to deliver process, knowledge, content, collaboration, and analytics capabilities in a comprehensive suite, Appian Enterprise [evaluate this product] and its software as a service (SaaS) counterpart Appian Anywhere. I particularly liked one slide in the presentation deck wherein the eight bullet points’ first letters cleverly spelled out the mnemonic REMEMBER (why to deploy BPM now) as follows:

* Retain customers
* Enhance standardization (and consistency)
* Measure business performance
* Evaluate components of processes (subprocesses)
* Manage all elements of the business
* Bottom line improvements
* Eliminate bottlenecks
* Rapidly deploy new services (and processes).

Indeed, these are some of the typical benefits of deploying BPM systems, but the trouble (namely, a lack of clarity and consensus) starts with the quandary about what exactly constitutes BPM, and which parts and capabilities of BPM help achieve those benefits. In other words, are some BPM suites and/or components more important and better than others?

In plain English, BPM entails all methodologies and tools that help businesses improve their processes. Depending on the context, BPM can be regarded as a management discipline, a technology, or even both. If one defines BPM as an approach to methodically design, implement, execute, control, and improve business processes, then one can argue that it is a management discipline. In addition, there exists a raft of accompanying IT tools to support this discipline in all of its abovementioned stages.

Extreme BPM Definitions

Simple as that, right? Well, not really, and my concern with the use of the term BPM is semantics. Namely, many people might still mean “business process modeling tools to help us with radical business process re-engineering (BPR),” which was all the rage back in the early 1990s. Indeed, this erstwhile people-centric approach to managing the overall business, independent of the specific technology or tools used to support it, has since gone out of fashion.

Namely, the problem is that the abstract world modeled in modeling tools often has not much to do with real-life business processes and typically cannot be implemented. If a business process analyst models a company in full detail in, e.g., IDS Scheer AG’s ARIS tools, by the time he or she is done the model might already be obsolete.

And by the time the company implements the model, the dynamic economic environment will have already changed. Yet the model on paper (a bunch of flowcharts) must be deployable, usable, and maintainable in production. Business processes are about dynamics (or business agility, if you will) and a drawn flowchart is anything but dynamic.

On the other hand, the purely integration-centric approach of providing a way for software to communicate and execute automated workflow to accomplish discrete tasks via integration and process orchestration is not nirvana either. Many other vendors might mean that BPM is “an effective way for us to re-package our traditional enterprise application integration (EAI) tools under the guise of a service oriented architecture (SOA) orchestration project.”

IBM’s BPM suite, Fujitsu Interstage, TIBCO Software, Software AG (which acquired webMethods), SAP NetWeaver BPM, and Oracle BPM come to mind here. These integration-centric BPM providers have often been accused by pure-play BPM suite providers of primarily targeting IT departments and trying to sell BPM as a matter of service orchestration.

Yet, the true value of BPM should be to empower business users. To be fair, these larger companies have recently acquired companies that provide BPM software aimed at business users, and I imagine the competition from larger companies will only intensify.

For instance, zooming in on Oracle’s BPM product strategy, the idea here is to offer a complete and integrated BPM platform that caters to system-centric, human-centric, document-centric, and decision-centric business processes (workflows) in a single runtime environment. The suite is aimed at business owners and developers to collaborate, to define processes across systems and lines of business (LoBs), and to improve business process efficiency by modeling, executing, monitoring, analyzing, simulating, visualizing, and optimizing business processes.

But, by digging deeper into the Oracle SOA Governance suite and Oracle BPM suite, it is possible to note so many identical components, differing mainly in the fact that Oracle BPM has to also accommodate human interventions. The Oracle BPM suite can be bolstered optionally with the business process analysis (BPA), design, and modeling capabilities via the partnering IDS Scheer’s ARIS tool (e.g., for achieving Six Sigma compliance).

A recent IDS Scheer blog post tried to demarcate the line between BPA and BPM suites as follows:

“The primary purpose of Business Process Analysis (BPA) tools is to visualize, analyze and improve business processes. BPA tools help translate every day business complexity into structured models (scope: from business to model). They provide insight into an enterprise’s structure – i.e. how strategy, products and services, processes, roles, information and systems are related and influence one another. By creating a single point of truth, BPA tools strive to improve the communication between various stakeholders in a company, safeguard corporate knowledge and support decision-making and change management. Most notable user groups are business managers, process owners, quality managers, business analysts, risk & compliance officers and enterprise architects. BPA tools have rich semantics in order to fulfill a broad information need. They enable users to visualize and analyze the enterprise from different point of views, e.g. from a performance-, risk & compliance- or architecture perspective.

Business Process Management Suites (BPMS) on the other hand serve a different purpose and target a different audience. While they do offer modeling capabilities, their primary purpose is to automate, execute and monitor business processes based on technical models (scope: from model to execution). Notable user groups are business- and information analysts, process engineers, software developers and system administrators. BPMS do not offer such rich semantics as BPA tools in the sense that their metamodel does not comprise concepts for performance management, risk & compliance management or architecture management. Then again, these concepts are not required to automate processes.”

A Fragmented and Crowded Market

Thus, the market for BPM software and related implementation, consulting, and training services is intensely competitive, rapidly changing, and highly fragmented. Every BPM aspirant likely encounters competition from internal IT departments of potential or existing customers that may seek to modify existing systems or develop proprietary systems in a do-it-yourself (DIY) fashion.

Process improvement adoption has lately started in many IT departments via the implementation of sets of concepts and policies for managing IT infrastructure, development, and operations. Some of these frameworks and disciplines are the Information Technology Infrastructure Library (ITIL), IT Service Management (ITSM), and Control Objectives for Information and related Technology (COBIT).

Moreover, there are a number of enterprise-wide initiatives around process improvement disciplines such as Lean manufacturing, Six Sigma, Total Quality Management (TQM), etc. These frameworks of concepts and policies often require IT support in order to make best-practice workflows operational, while the linkage to business users is critical.

The market consists of a number of established BPM suite providers such as Appian, Ascentn Corporation, Cordys, Global 360, Lombardi, Metastorm, Savvion, Pegasystems, and Ultimus, to name some. In addition to the abovementioned SOA middleware and enterprise architecture (EA) providers, internal IT departments, and BPA and process mining vendors (e.g., IDS Scheer or Pallas-Athena respectively), BPM suite vendors also compete with companies that target the customer interaction and workflow markets, and companies focused on Business Rules Engine (BRE) such as Corticon Technologies, FICO (formerly Fair Isaac Corporation), and the ILOG division of IBM.

Competition additionally comes from professional service organizations that develop custom BPM software in conjunction with rendering consulting services. To further muddle the picture, there are a number of Enterprise Content Management (ECM)-based vendors, such as EMC Corporation’s Documentum division, FileNet (now part of IBM’s Information Management group), Adobe LiveCycle, Oracle Stellent, and Autonomy Interwoven, to name but a few.

What Constitutes a Full-fledged BPM Suite?

BPM suites’ scope can be sliced and diced in many ways. For one, Part II of my 2008 five-part blog series entitled “It’s About Process (or Ability to be Responsive)” outlined the necessary BPM suite components. But if one is to look at BPM suites through the lens of the Plan-Do-Check-Act (PDCA) loop, one could think of covering the following three necessary activities (within the feedback loop):

1. Business process modeling and analysis, which were explained earlier on;
2. Business process automation or execution; and
3. Business activity monitoring (BAM), measuring, process mining, and so on, as parts of continuous business process improvement efforts.

Most of the abovementioned contemporary BPM suites have a comparable basic functionality set, which has been specified as the desired capabilities of a BPM suite in Gartner’s 2008 report entitled “Four Paths Characterize BPMS Market Evolution.” These capabilities are:

* Model-driven development environment (with model-driven process execution rather than a source code-based one). Processes can be changed bi-directionally, either in the design or the execution stage (each impacting the other), and with an audit trail of changes;
* Process component registry or repository management;
* Document management and ECM systems;
* User and group collaboration;
* System inter-connectivity;
* Business event management, business intelligence (BI), and BAM;
* Online and offline process simulation and process optimization;
* Business rules management system (BRMS);
* System management and system administration; and
* Process execution and a state management engine.

While most of the leading BPM suites address this prescribed broad functionality set by and large, they have some intrinsic differences that make them more suitable for one usage scenario versus the others. This difference often comes from the underlying architecture and the genesis of a particular suite. It may also be a consequence of the key customer segments that the vendor targets.

BPM practitioners understand some of the common usage types of BPM systems, often referred to as “human-centric business processes,” “system-centric (integration) processes,” and “document-centric processes.” Most real-life business processes contain all three elements, but some are heavier on one than on the other two.

In its whitepaper entitled “Understanding Usage Scenarios An Enterprise BPMS Must Support,” Savvion identifies and describes four other equally important usage scenarios that are not very well understood. These are: case management, rule-based (decision-intensive) processes, project-oriented processes, and event-centric process management. Savvion claims to currently be the only BPM provider that can accommodate all of these seven usage scenarios.

What About Accommodating Change (We Can Believe In)?

But, after all this discussion, do BPM suites necessarily mean “build for change?” Do they by default mean easy automation, rapid iteration, and execution?

The bar has to be set higher, since pragmatic buyers are increasingly looking for proven, fixed-cost, business-driven, enterprise-wide BPM deployments. An astute BPM suite should take a process-centric approach to managing business operations, one that can deal with any business workflow that has a high impact on overall customer service operations.

Metastorm’s [evaluate this product] white paper “Building a Business Case for BPM” asserts that there are three fundamental characteristics of BPM that make this technology a game-changer.

It’s About Process (or Ability to be Responsive) – Part I

After several years (if not decades, even) of painstakingly corralling and setting up all their custom data, objects, tables and whatnot, and making sure that these static and/or dynamic transactional data are secure, many enterprise applications users have realized that the time is long overdue for them to start looking at ways to make their applications more process-savvy.

Companies are increasingly trying to adopt and implement standardized (and yet flexible and easily modifiable) business processes to help their operations run more consistently and smoothly. For example, the chief executive officer (CEO) might decide that as of, say, next month “All customer service cases must be resolved within 24 to 48 hours,” or, “We are going to institute a new sales process for all deals worth over US$100,000.”

However, these business processes often get communicated to employees in an ad hoc and unregulated manner. A process document with instructions may exist on a network file share, but people have not the foggiest idea that it’s there. And some employees might rely on word-of-mouth information from co-workers (so called “tribal knowledge”) to learn the processes for their jobs.

Consequently, standardizing and instituting new business processes can prove challenging for most companies, particularly larger organizations.

Indeed, until recently most enterprise applications have hardly been anything more than glorified databases — they could hold all of the information users may need and allow users to search for records based on various criteria, but they could not really help users to perform the functions of their daily jobs more effectively.

There’s still often no native automation and agility within the system that lets, e.g., a recruiter instantly know when the status of a candidate has changed or when a new position requisition has been entered into the system.

Indeed, when any changes are made somewhere in the organization, users have to remember to notify one another of the change or else rely on others finding the updates on their own. Neither solution is practical in the long term and invites the possibility that the software solution or best practice will not be adopted consistently by all employees at the company.

How can one then build processes into enterprise applications so that users won’t need to, time and again, rely on manual (pedestrian) methods of communication to inform others of changes, increasing the risk that many issues will fall through the cracks?

Introducing Workflow Automation

To that end, the built-in capability (or external standalone add-on tool) that can be used to solve the process automation problem is called workflow automation (or workflow management). Some will refer to it as business process management (BPM), and we will shortly try to point out the differences between the two – i.e., workflow and BPM.

Traditional enterprise applications, such as a human resource management system (HRMS) or a procurement application, typically feature some built-in process functionality, with some capability to tailor the base behavior through parametric configuration options (e.g., via “order types” that entail different mandatory and optional “order steps”) that users have to learn by heart.

To be fair, some enterprise applications have introduced workflow capability into their products to give users some ability to control the process behavior of documents such as an invoice or an engineering specification. But in most enterprise applications workflow is implemented through hard-coding, which means that programmers must develop and maintain the code.

In addition, workflow automation of the typical enterprise application is generally limited to a single document or task routing. This usually means that companies implementing an enterprise application must choose between accepting the vendor’s pre-built business process behavior or paying the vendor dearly to make expensive modifications to accommodate more complex processes, which will then make upgrades either costly or impossible.

In contrast, a specialized workflow tool enhances a single task and/or document routing by providing an integrated capability to include rich user interfaces (UIs), system integration, rule processing and event handling.

Rules are necessary to determine which path users should take next in a process that has multiple possible paths, e.g., an order worth less than US$1,000 does not need manager approval, but over that amount it does. On its part, an example of event handling would be a necessary step after a product recall: a “pull from shelves” notification must be sent throughout the distribution channels.
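As a rough illustration, the rule-based routing and event handling just described could be sketched as follows. The threshold, function names, and notification format are hypothetical, not any vendor's actual API:

```python
# Hypothetical sketch of rule-based routing and event handling in a
# workflow engine; names and thresholds are illustrative only.

APPROVAL_THRESHOLD = 1_000  # orders at or above this need manager approval


def route_order(order_amount: float) -> str:
    """Pick the next step in a multi-path process based on a business rule."""
    if order_amount < APPROVAL_THRESHOLD:
        return "auto_approve"
    return "manager_approval"


def on_product_recall(product_id: str, channels: list[str]) -> list[str]:
    """Event handler: fan a 'pull from shelves' notice out to every channel."""
    return [f"pull-from-shelves:{product_id}:{ch}" for ch in channels]


print(route_order(250))      # auto_approve
print(route_order(5_000))    # manager_approval
print(on_product_recall("X1", ["retail", "wholesale"]))
```

The point of the sketch is that both branching (rules) and reactions (events) are declared once and then applied uniformly, rather than re-decided by each user.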

These capabilities can be pretty powerful, since in general, if users can come up with a standard rule that specifies when a particular event should happen, they can make it happen automatically with workflow. In other words, workflow becomes the magic ingredient that transforms many traditional transactions-capturing applications from a glorified database into fully functional tools that basically everyone in the company should find useful.

Workflow Components

The individual components that make up workflow are rules and associated actions — tasks, field updates, and alerts.

In general, a workflow rule is the main container for a set of workflow instructions. It includes the criteria for when the workflow should be activated, as well as the particular actions that should take place when the criteria for that rule are met. Every workflow rule must be based on a single object that users will choose when they define the rule, as this object then influences the fields that are available for setting workflow activation criteria.

For example, if a user defines a workflow rule for the “Job Application” object in an HR application, he/she will be able to set workflow activation criteria based on the values of fields like “Job Application Number” and “Status”. Users can also set workflow activation criteria based on standard fields, like “Record Owner” or “Created Date”, as well as fields based on the currently active user when a rule is evaluated, such as their “Role” or “Time Zone”.
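A minimal sketch of such a rule definition, with the rule bound to a single object and its activation criteria expressed over that object's fields, might look like this (the class and field names are illustrative, not any real product's schema):

```python
from dataclasses import dataclass, field
from typing import Callable

# Illustrative model of a workflow rule: each rule is bound to one
# object type and carries activation criteria over that object's fields.


@dataclass
class WorkflowRule:
    name: str
    object_type: str                   # e.g. "Job Application"
    criteria: Callable[[dict], bool]   # evaluated against a record's fields
    actions: list = field(default_factory=list)

    def matches(self, record: dict) -> bool:
        """A rule only fires for records of its object type that meet the criteria."""
        return record.get("object_type") == self.object_type and self.criteria(record)


rule = WorkflowRule(
    name="Notify on rejected application",
    object_type="Job Application",
    criteria=lambda rec: rec.get("Status") == "Rejected",
)

record = {"object_type": "Job Application", "Status": "Rejected"}
print(rule.matches(record))  # True
```

Binding the rule to one object type is what makes only that object's fields available as criteria, as described above.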

When a workflow rule is triggered, there are many types of actions that can occur, starting with a workflow task (or step), which assigns a task to a user according to a particular template. Just as in Microsoft Outlook, tasks include information about something that needs to be done by a certain time, such as making a telephone call, creating an order, shipping goods, or paying an invoice. Typically, assigned tasks appear in a user’s “My Tasks” related list on their home tab (or page) and generate reminder messages that pop up when a user logs in.

When an administrator defines a workflow task, he/she provides default values for data fields like “Assignee”, “Subject”, “Status”, “Priority”, and “Due Date” for tasks that are generated by its associated workflow rule. Administrators can also make sure that a notification email is sent to the assignee when a task is automatically generated.
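The template-with-defaults idea could be sketched like this; the field names mirror the text above, while everything else (class names, the notification format) is hypothetical:

```python
from dataclasses import dataclass

# Hypothetical task template: the administrator supplies defaults that
# every task generated by the associated workflow rule inherits.


@dataclass
class TaskTemplate:
    assignee: str
    subject: str
    status: str = "Not Started"
    priority: str = "Normal"
    due_in_days: int = 2
    notify_by_email: bool = True


def create_task(template: TaskTemplate, record_id: str) -> dict:
    """Instantiate a concrete task from the template for a triggering record."""
    task = {
        "record_id": record_id,
        "assignee": template.assignee,
        "subject": template.subject,
        "status": template.status,
        "priority": template.priority,
        "due_in_days": template.due_in_days,
    }
    if template.notify_by_email:
        # stand-in for sending the assignee a notification email
        task["notification"] = f"email:{template.assignee}"
    return task


tmpl = TaskTemplate(assignee="recruiter@example.com", subject="Phone screen")
print(create_task(tmpl, "APP-42")["notification"])  # email:recruiter@example.com
```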

In addition, a workflow field update changes the value of a particular field on the record that initially triggered the workflow rule, while a workflow alert sends an email according to a specified email template. Unlike workflow tasks, which can only be assigned to users of the application, workflow alerts can be sent to any user or contact, as long as they have a valid email address.

A workflow rule can include any combination of these actions when the rule is triggered. For example, one rule might send out an alert and update two fields on a particular record. The action that one workflow rule takes can also trigger the execution of another workflow rule.
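Combining actions and chaining rules might look like the following sketch, in which one rule sends an alert and updates two fields, and the field update in turn triggers a second rule (all names and thresholds are illustrative):

```python
# Sketch of combined workflow actions and rule chaining: rule 1 sends
# an alert and updates two fields; its Stage update fires rule 2.
# All names and thresholds are illustrative only.


def run_followup_rule(record: dict, log: list) -> None:
    """Rule 2: triggered by the Stage change that rule 1 makes."""
    if record.get("Stage") == "Executive Review":
        log.append(f"task:prepare-briefing:{record['Id']}")


def run_rule(record: dict, log: list) -> None:
    """Rule 1: on a high-value new opportunity, alert and update two fields."""
    if record.get("Amount", 0) > 100_000 and record.get("Stage") == "New":
        log.append(f"alert:sales-manager:{record['Id']}")  # action 1: alert
        record["Priority"] = "High"                        # action 2: field update
        record["Stage"] = "Executive Review"               # action 3: field update...
        run_followup_rule(record, log)                     # ...which triggers rule 2


record = {"Id": "OPP-7", "Amount": 250_000, "Stage": "New"}
log: list = []
run_rule(record, log)
print(log)  # ['alert:sales-manager:OPP-7', 'task:prepare-briefing:OPP-7']
```

Real engines typically guard this kind of chaining against infinite loops; the sketch omits that for brevity.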

Workflow-enabled Applications

Many enterprise applications today come with built-in workflow management capabilities, such as the Salesforce.com Enterprise Edition on-demand customer relationship management (CRM) suite and its on-demand Force.com (formerly Apex) platform, Agresso Business World (ABW) or Exact E-Synergy, to name only some.

Microsoft Dynamics CRM too includes a workflow module that users can use to automate their business processes based on the rules, logic, and actions that they design. Microsoft has revamped the workflow functionality in Microsoft Dynamics CRM 4.0 so that it now uses the Microsoft Windows Workflow Foundation (WF), whereas previous versions of Microsoft Dynamics CRM used their own proprietary workflow engine.

The result of the revised workflow functionality is that users, administrators, and developers can design and create business processes using the workflow tools with new features and a new UI for creating and monitoring the workflow processes.

Windows WF provides a comprehensive programming model, run-time engine, and tools to manage workflow logic and applications. The Microsoft Dynamics CRM workflow UI relieves users and administrators from the need to interact with WF directly. Therefore, users do not necessarily have to understand the underlying workflow technology to create workflow logic in Microsoft Dynamics CRM.

As a recap, a built-in workflow provides a tool to help companies set up and define business process activities (including the proper sequencing) that involved employees can use when working with the enterprise system’s data. Conceptually, one should think of a workflow as an application or service that runs in the background, 24 hours a day, 7 days a week, constantly evaluating the data and the multiple workflow rules in the company’s deployment.

When the workflow service encounters a trigger event, it activates the appropriate workflow rules to run the workflow actions. Typical workflow actions include sending an e-mail message, creating a task, and updating a data field on a record.
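The always-on evaluation loop described above can be reduced to a toy dispatcher that matches trigger events against registered (criteria, action) rule pairs. This is purely illustrative, not any vendor's engine:

```python
from typing import Callable

# A rule is a (criteria, action) pair over trigger events; illustrative only.
Rule = tuple[Callable[[dict], bool], Callable[[dict], str]]

rules: list[Rule] = [
    # new Lead record created -> notify the sales team
    (lambda e: e["type"] == "record_created" and e["object"] == "Lead",
     lambda e: f"email:sales-team:new lead {e['id']}"),
    # Status field changed -> update a timestamp field on the record
    (lambda e: e["type"] == "field_changed" and e.get("field") == "Status",
     lambda e: f"update:{e['id']}:LastStatusChange=now"),
]


def handle_event(event: dict) -> list[str]:
    """Run the action of every rule whose criteria match the trigger event."""
    return [action(event) for criteria, action in rules if criteria(event)]


print(handle_event({"type": "record_created", "object": "Lead", "id": "L-1"}))
# ['email:sales-team:new lead L-1']
```

In a real deployment the service would receive these events continuously from the database layer; here `handle_event` stands in for one iteration of that loop.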

By implementing workflow processes in the enterprise resource planning (ERP), supply chain management (SCM) or CRM systems deployments, users can enjoy many benefits, such as:

1. Ensuring that users track and manage their customer data and processes in a consistent fashion — instead of relying on users to remember the appropriate steps for processing data, managers or administrators can create workflow rules that will automatically determine the next required steps and assign activities as necessary;
2. Processing the customer data more quickly so that, for example, new sales leads or customer service requests are assigned and routed immediately upon record creation; and
3. Allowing users to focus on more value adding activities — instead of having to perform a large number of manual repetitive steps.

Workflow vs. BPM

Both Workflow and BPM are systematic approaches and technologies to improve a company’s business processes (and performance). From a business perspective, they are ways to make people, information and computers work together more consistently and efficiently to produce needed results.

For example, a workflow/BPM-enabled application could monitor receiving systems for missing or defective items, or walk an employee through the steps to troubleshoot why an order arrived late or not at all.

Both technologies foster ongoing collaboration between information technology (IT) and business users to jointly build applications that effectively integrate people, processes and information. They provide organizations with the ability to define, execute, manage and refine processes that:

* Involve human interaction (such as placing or approving orders);
* Integrate and work with multiple diverse applications; and
* Handle dynamic process rules and changes, not just simple static flows (i.e., flows that enable tasks with multiple choices and contingencies/escalations).

The market for workflow and BPM applications is highly stratified and fragmented, in part because the currently available products stem from different origins. Namely, there are former pure integration vendors or document management/enterprise content management (ECM) vendors that have meanwhile encroached into the BPM space.

The difference between workflow tools and BPM suites is largely a semantic distinction, and the gist of the matter is that a workflow engine is at the heart of BPM suites with process execution capabilities. Also, in most cases vendors who sell applications labeled as BPM are aiming at a bigger scope and more complex projects, with elaborate software supporting even more elaborate methodologies, process definition and modeling, collaboration methods, and so on.

Features and capabilities are not necessarily the only differences between tools, since usually the products aimed at simpler processes focus strongly on “ease of use.” The designers’ assumption is generally that the users are non-IT experts within the company. Such workflow products might be built around the concept of an intelligent form. Basically, the user develops the workflow by filling in a familiar-looking form (e.g., a “tasks vs. actions” matrix), including the business rules.

Yet the limitations of the simpler workflow tools become evident when they attempt to manage inter-process dependencies amongst several applications, handle complex database integration and handle tasks that partake in larger, more complex processes.

For more information on BPM, see TEC’s earlier articles entitled “Business Process Management: How to Orchestrate Your Business”, “Giving a Business Process Management Edge to Enterprise Resource Planning” and “Business Process Analysis versus Business Process Management.”

Special credit also goes to CIO Magazine’s articles entitled “ABC: An Introduction to Business Process Management (BPM)” and “Making Workflow Work and Flow for You.” Some useful concepts and examples were also adapted from the Salesforce.com’s AppExchange Developer Network (ADN) book entitled “Creating On-Demand Applications with AppExchange: An Introduction” and from the Microsoft Press book entitled “Working with Microsoft Dynamics CRM 4.0.”

Business Process Management: How to Orchestrate Your Business

Companies used to coordinate activities across the organization manually. This resulted in inefficiencies and errors in operational processes and often made it difficult to improve the processes themselves. Organizations are increasingly focusing on the implementation of business process management (BPM) solutions for the purpose of improving functional efficiency and effectiveness in their core business processes.

Evolution of BPM

Approximately ten to fifteen years ago, organizations began assimilating their legacy systems in specific industries or divisions by integrating enterprise applications via data transformation and routing, event triggering, process automation, and adapters. Enterprise resource planning (ERP), customer relationship management (CRM), and supply chain management (SCM) vendors were flourishing at this time. Organizations automated their transaction systems with ERP software while drawing on the information systems of CRM software. Five years later, business process integration (BPI) solutions (namely business process modeling, business-to-business (B2B) connectivity, and vertical industry process templates) were built on top of these enterprise application integration (EAI) systems.

Today, the market offers BPM solutions that incorporate both the EAI and BPI functionality in addition to functionality such as workflow, business activity monitoring, web services, rule engines, and portal capability.

What Is BPM?

Business process management (BPM) was recognized by the academic world in the fifties and sixties as an important ingredient in the quality management approach. In the eighties, authors Hammer and Champy drew the attention of business managers to process management, process (re-)engineering, and workflow management. Today, BPM is continually gaining ground. Many companies have learned from experience that BPM is a strong asset when facing the rapidly changing requirements that are typical of today's dynamic world.

The acronym BPM has been the cause of some confusion in the past. It can be mistaken for business process modeling, which is a subset of the more "evolved" business process management. It is important to note the distinction between the two.

Business process modeling is used solely for the graphical representation of the workflow, which can route either information or an actual document in a business process. Business process management is the definition of the process as a whole, including EAI, business process modeling, workflow, and even B2B transport capabilities. Furthermore, BPM should not be confused with business performance management, which belongs to the world of business intelligence (BI) and data warehousing.
Organizations regularly implement CRM, SCM, and ERP applications. As a result, key business functions such as inventory management, warehouse management, or product lifecycle management are highly integrated. All these applications focus on a specific function or area within the company and are vertically managed.

What companies are looking to do these days is to (1) achieve horizontal integration in order to cater to cross-functional business processes, and (2) achieve true process automation to enhance the processing efficiency of company transactions.
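
The horizontal integration described above can be sketched as a thin orchestration layer that stitches one cross-functional process across several vertically managed applications. The adapter, process, and operation names below (order_to_cash, create_sales_order, and so on) are illustrative assumptions, not any specific vendor's API:

```python
class Adapter:
    """Wraps one vertically managed application (ERP, CRM, SCM)."""
    def __init__(self, system: str):
        self.system = system
        self.log = []  # operations invoked on this system, in order

    def invoke(self, operation: str, payload: dict) -> dict:
        self.log.append(operation)
        return {"system": self.system, "operation": operation, **payload}

def order_to_cash(erp: Adapter, crm: Adapter, scm: Adapter, order: dict) -> dict:
    """One horizontal, cross-functional process spanning three
    vertical applications (a hypothetical order-to-cash flow)."""
    crm.invoke("record_opportunity", order)   # customer-facing step
    erp.invoke("create_sales_order", order)   # transactional step
    scm.invoke("reserve_inventory", order)    # fulfillment step
    return erp.invoke("issue_invoice", order) # billing closes the process

erp, crm, scm = Adapter("ERP"), Adapter("CRM"), Adapter("SCM")
result = order_to_cash(erp, crm, scm, {"order_id": "SO-100", "qty": 3})
```

The point of the sketch is that no single vertical application owns the process; the orchestration function does, which is precisely the layer BPM suites supply.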

What Are the Different Components in BPM?

BPM encompasses several disciplines intended for use across different divisions and areas within organizations. Some of these disciplines are

Business Process Modeling. "Defines" the process (usually in graphical format). Because explicitly modeled processes are required by all subsequent BPM disciplines, process modeling is often perceived as the starting point of BPM. Defined with the use of a process modeler (not to be confused with graphical editors such as Visio or PowerPoint), the resulting model is composed of objects that the BPM engines can reference. Composed of different diagrams (representing different dimensions of the organization), the model is stored in a structured repository.

Business Process Documentation. Responsible for process-enhanced documentation. It complements the process diagrams by providing, in graphical form, the what-to-do description and sequence of steps, and it adds extended documentation, the how-to-do of business tasks, to the model "skeleton". Items such as work instructions, standard operating procedures, master templates, and training components are added to the diagrams to create a documented process.

Business Process Certification. Ensures the process's ability to comply either with industry documentation standards such as ISO or with an internal "gating process". It confirms that processes have been properly approved or certified before their internal deployment.

Business Process Collaboration. Deploys processes (intranet or extranet publication) on the one hand, and provides users with the ability to leverage the process know-how into enhanced productivity via user and task collaboration, on the other hand. This BPM discipline addresses corporate-wide knowledge management (KM) by not only making documented and certified processes readily available to all employees and associates, but by also providing employees collaboration functions, which enable them to manage projects, tasks, or transactions in a work team approach.

Business Process Compliance. Establishes the process's readiness to comply with internal and external regulations (such as Sarbanes-Oxley [SOX]). The compliant, certified processes are then used to achieve governance certification, audits, or both.

Business Process Optimization. Responsible for continuous process improvement (CPI), including tools to assess the performance of the actual process against internal norms or industry benchmarks. The integrated quantitative analysis capability is used to identify bottlenecks and estimate throughput times and cost saving opportunities. This often includes a simulation engine to perform "what-if" analyses to locate process issues in a proactive manner.

Business Process Automation. Responsible for the integration between users, processes, and related applications, resulting in the system automation of process tasks. Driven by a workflow management engine, the process information, as modeled, can be used for automated transaction execution and routing, including task execution triggered by previous events, advanced task scheduling and user notification, real-time monitoring of task execution, ad hoc execution, and so on.
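
As a rough illustration of the automation discipline, the following sketch shows a minimal workflow engine that routes tasks to role-based work lists and triggers successor tasks when a task completes. The task names, roles, and the engine itself are hypothetical, a sketch of the mechanism rather than any product's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    role: str                                 # who the task is routed to
    next_tasks: list = field(default_factory=list)

class WorkflowEngine:
    """Routes each task to a role's work list and, when a task
    completes, dispatches its successors (event-triggered execution)."""
    def __init__(self):
        self.worklists = {}                   # role -> pending task names

    def dispatch(self, task: Task):
        self.worklists.setdefault(task.role, []).append(task.name)

    def complete(self, task: Task):
        self.worklists[task.role].remove(task.name)
        for nxt in task.next_tasks:           # completion event fires routing
            self.dispatch(nxt)

# A hypothetical three-step order process: enter -> approve -> ship.
approve = Task("approve order", "manager")
ship = Task("ship goods", "warehouse")
enter = Task("enter order", "clerk", [approve])
approve.next_tasks = [ship]

engine = WorkflowEngine()
engine.dispatch(enter)
engine.complete(enter)   # clerk finishes; approval lands on the manager's list
```

Completing "enter order" automatically places "approve order" on the manager's work list, which is the routing behavior the paragraph above describes.
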
Organizations use BPM systems to improve the effectiveness of their core operations. BPM specifically coordinates the interactions among systems, business processes, and people. The expected results include

Saving money by automating the routing of activities and tasks to employees, eliminating non-value-adding activities such as routine decisions and the transfer of data or forms, and providing users with tailored task lists.

Saving time by changing business processes as technology, government, or competitive requirements dictate. With today's tight integration of process definitions and underlying applications, changes to a definition can be deployed and communicated virtually immediately.

Adding value by opening up a range of functions that can be leveraged in a truly BPM-minded company. Value can be added in several areas, such as process (quantitative) analysis and optimization, and quality certification (e.g., ISO), which requires procedures to be created and published. Another area is compliance management (e.g., SOX), which is imposed on many organizations.

By implementing BPM, companies are able to orchestrate and leverage cross-functional business processes that are used over multiple systems, divisions, people, and partners.

The beneficiary of BPM systems is actually the customer. The customer will receive information sooner and products faster, which results in an improved level of customer satisfaction. This will translate into more revenue for the company.

Friday, December 4, 2009

Do You Need a Content Management System?

The ongoing drive to save time and money leads organizations to look into content management. As the costs of software and implementation range from almost free to millions of dollars, and as choosing the right vendor or system is vital, this decision can be daunting.

The term content management: What does it mean?

Content management is a phrase you hear everywhere these days. Companies claim they "do content management" and vendors say that they sell content management software. People who hear about content management often think about how to create a web site. The text, images, movies, etc., that are shown on web sites are the actual content indeed, but content management entails more than meets the eye.

Prior to explaining what content management is, it is useful to define the word content. Content is essentially any type or "unit" of digital information that is used to populate a page. It can be text, images, graphics, video, sound—or in other words—anything that is likely to be published across an intranet, extranet, or the Internet.

Where does content management come from?

Information, communication, and digital networks have made a major impact on today's society, in which an enormous amount of information is available. A company needs to acquire and structure information that exists both within and outside of its own four walls.

Where does this need for information or this need for content come from?

It can be said that the buzzword of this era is content. Before content, the hype of the late eighties and early nineties surrounded documents. As companies were producing large volumes of information by the end of the eighties, and as business boomed for products like Word, WordPerfect, Excel, and Lotus 1-2-3, organizations faced an increasing need to organize their documentation. Rather than printing and storing hard copies, documents required digital storage. The market responded with powerful software tools to manage this process. These solutions became known as document management systems (DMS).

By the end of the nineties, the terminology changed from document management systems to content management systems (CMS). Many DMS vendors suddenly called themselves CMS vendors, since the main difference between document management and content management is that document management deals with a document in its entirety, while content management focuses on the individual parts that make up a document or even a web page.

Both types of systems follow the same basic rules, workflows, and processes. However, with the evolution of the Internet, companies began to manage their web sites at the content level rather than the document level, and the market shifted from document management systems to content management systems.
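
The content-level versus document-level distinction can be illustrated with a minimal sketch: instead of storing whole documents, a CMS stores individual units (a headline, a paragraph, an image reference) and assembles pages from them, so one unit can be reused on many pages. The class and field names are illustrative, not drawn from any particular CMS:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ContentUnit:
    """One managed unit of content: a headline, paragraph, image, etc."""
    unit_id: str
    kind: str      # "text", "image", "video", ...
    body: str

class ContentRepository:
    def __init__(self):
        self._units = {}

    def store(self, unit: ContentUnit):
        self._units[unit.unit_id] = unit

    def assemble(self, unit_ids):
        """A 'page' is just an ordered list of unit references,
        so the same unit can appear on many pages."""
        return [self._units[u].body for u in unit_ids]

repo = ContentRepository()
repo.store(ContentUnit("hd-1", "text", "Quarterly results"))
repo.store(ContentUnit("p-1", "text", "Revenue grew this quarter."))
page = repo.assemble(["hd-1", "p-1"])
```

A document management system, by contrast, would store the finished page as a single opaque artifact; the repository sketch above is what makes content-level reuse possible.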

Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times -- A Panel Discussion Analyzed Part Six: Custom Development and Single-Vendor versus Multi-Vendor Solutions

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, senior vice president, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability", which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as Manufacturing Systems; and Predrag Jakovljevic, research director, TEC. In preparation for the event, we polled the thoughts and opinions of our experts and contributors Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, who were unable to attend the event in person.

Below are the questions and the consolidated thoughts and answers that transpired from the panel discussion. We have also taken the liberty of expanding on a few pertinent questions and thoughts that were not discussed at the panel per se (due to the time limit), but that transpired from many other interactions and presentations at the conference. Pertinent articles published previously on our site, which may shed more light on the respective topics, are mentioned as further recommended reading.

The questions are

Q1. What is the one piece of new software or technology that will be a must-have in the next five years? (see Part One)

Q2. Some pundits say the future of enterprise software lies in service-oriented architectures and component applications. True? False? (see Part One)

Q3. How does the development of new business processes and business process modeling fit in? (see Part Two)

Q4. What are applications hosting and other service models? (see Part Three)

Q5. Radio frequency identification (RFID) is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID. (see Part Four)

Q6. Technology aside for a moment, what can we say about its impact on profitability? (see Part Five)

Q7. With all this new technology, the question is what happens to existing applications and technology. Nobody wants to start over, but how much will existing IT systems have to change? (see Part Five)

Q8. Will the newest and greatest only come from packaged software? What about custom development? What does the build versus buy equation look like in the near future? (see Part Six)

Q9. How will the latest improvements in software flexibility and agility play in the single-vendor versus multi-vendor solution equation at multi-division corporations? (see Part Six)

Leveraging Technology to Maintain a Competitive Edge during Tough Economic Times -- A Panel Discussion Analyzed Part Five: Profitability and Changing Existing Applications


Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times--A Panel Discussion Analyzed Part Four: RFID Software Issues

Q5. RFID is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID.

A5: Well, we have all likely heard some concrete examples (or imagined ideas) of expensive (and thus highly pilfered) retail items (such as razors, prescription drugs, apparel, and DVDs) packaged with pin-sized chips and tiny antennae that send retailers and manufacturers information about their use, and even about those who buy (or attempt to steal) them. Or the stories of grocery clerks immediately knowing when perishable items on the shelf have expired and replacing them before the items are purchased. We have also heard of a consumer ordering the latest "hot item" and tracking it in real time through the entire supply chain right up to the moment it is ready to be picked up. How about the idea of tracking employees and their labor with an RFID chip embedded in their ID badges to automatically record their transactions and even control their authorizations for a given area to detect security issues?

These futuristic-sounding scenarios (though not necessarily of the future, given that such technology was employed decades ago where its price was justified, as in the defense industry or to track the movements of precious pets) are being touted as applications of an automatic identification and data capture technology named radio frequency identification (RFID). RFID uses low-powered radio transmitters to read data stored in smart tags embedded with minuscule chips and antennae. The tags are attached to packaged goods and communicate with electronic reading devices, which deliver a message to a computer that alerts retailers, suppliers, and manufacturers when a product's state has changed and requires action.

While the potential of RFID technology is indisputable (for example, unlike bar codes, RFID requires no direct contact or line-of-sight scanning, and it provides streams of data that can be differentiated and interpreted before being passed to an enterprise application), much more is required to move RFID from the lab to a live environment. RFID has the potential of a new technology inflection point, and it can be the missing piece in the long-standing puzzle of squeezing excess inventory out of supply chains. It will only be that piece, however, when (and if) it reaches a critical mass of adoption and maturity over the next several years. For now, the market is still in a "chicken-and-egg" conundrum: until more companies commit to RFID, the cost of tags and other infrastructure will remain prohibitively high for mass deployment. A few years ago, typical smart-label tags cost between $1 and $2 (USD) each, while today, with production volumes in the millions, they cost 30 to 40 cents (USD). With billions of tags on individual items in the future, the cost should ideally fall to five cents (USD) or so. Eventually, in the long term, the price might fall to a penny or less, with new technology and even greater volumes. Still, while the tag price might seem like a major barrier now, it will likely become a minor issue down the track, when many companies start grappling with RFID deployments in earnest.
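
The point that RFID "provides streams of data that can be differentiated and interpreted before being passed to an enterprise application" is typically handled by middleware that filters the raw read stream, since a reader reports the same tag many times per second. A minimal sketch, assuming a simple time-window deduplication rule (the tag IDs, timestamps, and window length are hypothetical):

```python
class ReadFilter:
    """Collapses the raw stream of tag reads from an RFID reader into
    single 'tag seen' events before handing off to an enterprise app."""
    def __init__(self, window_s: float = 5.0):
        self.window_s = window_s
        self.last_seen = {}      # tag_id -> time of last emitted event

    def observe(self, tag_id: str, now: float):
        prev = self.last_seen.get(tag_id)
        if prev is not None and now - prev < self.window_s:
            return None          # duplicate read within the window: suppress
        self.last_seen[tag_id] = now
        return {"event": "tag_seen", "tag": tag_id, "at": now}

# Four raw reads: the second is a duplicate within the 5-second window.
f = ReadFilter(window_s=5.0)
raw_reads = [("EPC-001", 0.0), ("EPC-001", 0.2), ("EPC-001", 6.0), ("EPC-002", 6.1)]
emitted = [e for e in (f.observe(t, ts) for t, ts in raw_reads) if e is not None]
```

Real RFID middleware layers more on top of this (reader health, filtering by antenna, EPC decoding), but the windowed deduplication above is the core of turning raw reads into interpretable events.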

Over that time, many companies will begin to deliver, and potentially receive, a higher proportion of goods with RFID tags, and thus will develop a better understanding of the technology and its potential for broader business improvements, rather than pursuing it only because of mandated Wal-Mart, US Department of Defense (DoD), or Target compliance. As the world's largest retailer, with over 5,000 outlets worldwide, Wal-Mart currently uses traditional bar coding and UPCs (universal product codes) to identify items, cases, and pallets of goods as they move through the supply chain and out to the stores. By 2005, Wal-Mart envisioned having live implementations of RFID tagging using new EPCs (electronic product codes, which can carry more useful data than UPCs), with a mandate for its top 100 suppliers to provide RFID tags on cases and pallets at distribution centers, followed by item-level tagging at a much later date. EPCs on tags should be easier and quicker to read than bar codes: since RFID readers, unlike bar-code scanners, do not require line of sight, there is supposedly no need to unpack pallets to check contents, which should result in less labor, fewer errors, and better management of inventory.

However, companies implementing RFID should expect increased labor in the first year or so, because vendors have yet to perfect solutions for automating tagging and embedding RFID in packaging material. The current state of RFID technology also still revolves around label creation and production, plastic chip development, and intelligent shelving and packaging, to name but a few areas. Furthermore, to gain benefits such as product tracking, supply chains should logically begin RFID implementation at the manufacturing level, rather than at the distribution center, which is one step closer to the retailer in the supply chain. Still, "source tagging" cases at the manufacturer is too disruptive for most companies to implement.

Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times -- A Panel Discussion Analyzed Part Three: Applications Hosting


Leveraging Technology to Maintain a Competitive Edge During Tough Economic Times -- A Panel Discussion Analyzed Part Two: Business Process Modeling


Leveraging Technology to Maintain a Competitive Edge during Tough Economic Times—A Panel Discussion Analyzed Part One: Introduction

At the IFS Executive Forum, which took place on March 29 and 30 in Orlando, Florida (US), leading research analysts and industry experts discussed how companies can still leverage technology to maintain their competitive edge, even during tough economic times. The event was held in conjunction with IFS World Conference 2004, and it included six panel discussions, with each panel including top executives, analysts, and journalists. Some of the renowned panelists were Geoff Dodge, vice president, Business Week; Dave Caruso, SVP, AMR Research; Barry Wilderman, vice president, Meta Group; Leo Quinn, vice president of operations, Global Manufacturing Solutions, Rockwell Automation; Dave Brousell, editor-in-chief, Managing Automation; David Berger, Western Management Consultants; and Josh Greenbaum, principal, Enterprise Applications Consulting. Breakout sessions explored such topics as turning global competitive threats into opportunities, increasing the bottom line through operational efficiency, complying with the Sarbanes-Oxley Act of 2002, and using enterprise software to prepare for future challenges.

Technology Evaluation Centers (TEC) was represented at the executive panel titled "The Future of Enterprise Software and How It Impacts Your Profitability," which was aimed at helping companies find out where enterprise software is going in the next five years, and how it can make or break their profitability and market share. The panel, which was moderated by Josh Greenbaum, included the following participants: Barry Wilderman; Peggy Smedley, president and editorial director, Start Magazine; Dave Turbide, an independent consultant and renowned columnist for magazines such as Manufacturing Systems; and Predrag Jakovljevic, research director at TEC. In preparation for the event, we also polled the thoughts and opinions of our experts and contributors Olin Thompson, Jim Brown, Joseph Strub, Kevin Ramesan, and Lou Talarico, since they were unable to attend the event in person.

Below are the questions and the consolidated thoughts and answers from the panel discussion. We have also taken the liberty of expanding on a few pertinent questions and thoughts that were not discussed at the panel itself (due to time limits), but which emerged from many other interactions and presentations at the conference. Pertinent articles published earlier on our site, which may shed more light on the respective topics, are mentioned here as further recommended reading.

The questions are:

Q1. What is the one piece of new software or technology that will be a must-have in the next five years? (see Part One)

Q2. Some pundits say the future of enterprise software lies in service-oriented architectures and component applications. True? False? (see Part One)

Q3. How does the development of new business processes and business process modeling fit in? (see Part Two)

Q4. What are applications hosting and other service models? (see Part Three)

Q5. Radio frequency identification (RFID) is on everyone's mind these days. Let's discuss the software issues around RFID and what kind of software solutions will be taking advantage of RFID. (see Part Four)

Q6. Technology aside for a moment, what can we say about its impact on profitability? (see Part Five)

Q7. With all this new technology, the question is what happens to existing applications and technology. Nobody wants to start over, but how much will existing IT systems have to change? (see Part Five)

Q8. Will the newest and greatest only come from packaged software? What about custom development? What does the build versus buy equation look like in the near future? (see Part Six)

Q9. How will the latest improvements in software flexibility and agility play in the single-vendor versus multi-vendor solution equation at multi-division corporations? (see Part Six)

Business Process Management: How to Orchestrate Your Business

Companies used to coordinate activities across the enterprise manually. This resulted in inefficiencies and errors in operational processes, and often made the processes themselves difficult to improve. Organizations are therefore increasingly implementing business process management (BPM) solutions to improve the efficiency and effectiveness of their core business processes.
Approximately ten to fifteen years ago, organizations began assimilating the legacy systems of specific industries or divisions by integrating enterprise applications via data transformation and routing, event triggering, process automation, and adapters. Enterprise resource planning (ERP), customer relationship management (CRM), and supply chain management (SCM) vendors were flourishing at this time: companies automated their transaction systems with ERP software and captured customer information with CRM software. Five years later, business process integration (BPI) solutions, namely business process modeling, business-to-business (B2B) connectivity, and vertical industry process templates, were built on top of these enterprise application integration (EAI) systems.
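The EAI pattern described above (adapters that transform each application's native data into a common format, plus routing that triggers handlers in subscribing systems) can be sketched in a few lines of Python. This is a minimal illustrative sketch, not any real product's API; all class, event, and field names are invented for the example:

```python
# Minimal sketch of an EAI-style message broker: an adapter transforms an
# application's native record into a canonical message, and the broker
# routes the resulting event to whichever handlers subscribe to it.
# All names here are illustrative, not from any real integration product.

class Adapter:
    """Wraps a source application with a data-transformation function."""
    def __init__(self, source, transform):
        self.source = source
        self.transform = transform  # native record -> canonical message

class Broker:
    def __init__(self):
        self.subscribers = {}  # event type -> list of handler callables

    def subscribe(self, event_type, handler):
        self.subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, adapter, record):
        message = adapter.transform(record)                   # data transformation
        for handler in self.subscribers.get(event_type, []):  # routing
            handler(message)                                  # event triggering

# Example: an ERP "order created" event routed to a CRM-side handler.
erp_adapter = Adapter("ERP", lambda r: {"customer": r["cust_id"], "total": r["amt"]})
broker = Broker()
received = []
broker.subscribe("order.created", received.append)
broker.publish("order.created", erp_adapter, {"cust_id": "C42", "amt": 99.0})
print(received)  # [{'customer': 'C42', 'total': 99.0}]
```

The point of the adapter layer is that each application only needs one transform into the canonical format, rather than a point-to-point mapping to every other system.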

Today, the market offers BPM solutions that incorporate both EAI and BPI functionality, in addition to capabilities such as workflow, business activity monitoring, web services, rule engines, and portals.
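Two of the capabilities just mentioned, workflow and rule engines, fit together naturally: a process is an ordered set of activities, and rules decide which branch to take next. The sketch below shows that idea in the simplest possible form, under the assumption of a single approval rule; the function names and the $1,000 threshold are invented for illustration:

```python
# Minimal sketch of a BPM-style process: a fixed workflow step (credit check)
# followed by a rule-engine dispatch that picks the next activity.
# All names and thresholds are illustrative.

def credit_check(order):
    order["approved"] = order["total"] < 1000  # example business rule
    return order

def fulfill(order):
    order["status"] = "shipped"
    return order

def escalate(order):
    order["status"] = "manual review"
    return order

# Rule engine as a table of (condition, next activity) pairs.
RULES = [
    (lambda o: o["approved"], fulfill),
    (lambda o: not o["approved"], escalate),
]

def run_process(order):
    order = credit_check(order)          # workflow step
    for condition, activity in RULES:    # rule-engine dispatch
        if condition(order):
            return activity(order)

print(run_process({"total": 250}))   # {'total': 250, 'approved': True, 'status': 'shipped'}
print(run_process({"total": 2500}))  # {'total': 2500, 'approved': False, 'status': 'manual review'}
```

Externalizing the branching logic into a rule table, rather than hard-coding it inside the process, is what lets BPM suites change business behavior without rewriting the underlying integration code.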