Product Configuration in SAP: A Retrospective - Configuration Environments - Knowledge-based Configuration: From Research to Business Cases, FIRST EDITION (2014)

Part V. Configuration Environments

Chapter 27. Product Configuration in SAP: A Retrospective

Albert Haag, SAP AG, Walldorf, Germany


This chapter gives a retrospective of the development of the SAP Variant Configurator and the concepts it is based on. The SAP OS/2 configurator that preceded it and the IPC configurator that followed it share this legacy and are also covered. As these three configurators all have their roots in expert systems development projects of the 1980s, the beliefs and motivation of that time are reviewed.


Keywords: Knowledge-based Configuration; Variant Configuration; SAP Variant Configurator; SAP IPC; Expert Systems; Constraints; Decision Support


Acknowledgments. Many of the challenges of making a commercially successful configurator remain unpublished and unsung. Therefore, I would like to acknowledge the contributions of a few key people who are not otherwise visible through citations.

Axel Kurka put together and led an AI development group in the SAP Basis development that built both SAP PROLOG and the SAP OS/2 configurator. Axel was a “real programmer” in the sense of the article Real Programmers Don’t Use Pascal (Post, 1983). He suffered an untimely death. Peter Zencke was the manager in charge of developing the supply chain applications in R/3. The foresight of Peter in sponsoring configuration as an important topic and ensuring its integration into the relevant business processes was a major factor in the success of the SAP VC. Richard Knierim is a “real programmer” in his own right. Besides writing formidable parts of the SAP VC code himself, he had a clear sense of how and where to integrate it with the business processes. Andreas Krämer had a central role in the development of all three configurators. He remains one of the core architects of the SAP configurator engines to this day. Raimond Buchholz first worked on the SAP VC with Richard Knierim and later contributed to the IPC. He was active in the configuration space for more than two decades. I should also mention that the SCE (now the IPC configurator) was developed in Palo Alto, CA with the help of a great team from IntelliCorp. Robert Noble was the project lead of that team and is another “real programmer.” He eased the transition of the implementation to the then still nascent Java. Of course there have been and still continue to be many contributions by a multitude of colleagues who have worked on this topic over time. A history of the SAP configurators would have to include them. Unfortunately, this is out of scope here.

27.1 Introduction

The SAP Variant Configurator (SAP VC) is a prominent example of a commercially relevant configurator. It has been part of the SAP ERP (Enterprise Resource Planning) systems since R/3 3.0 and is thus available to all customers as part of the SAP ERP license. Based on analysis of consulting requests, problem reports, and other available data, it seems that around 2000 companies employ it, making it one of the most used commercial configurators since its inception around 20 years ago. For many companies their business depends on it.

A dedicated book, Variant Configuration with SAP (Blumöhr et al., 2012), delineates not only the features and business integration of the SAP VC, but also looks at the surrounding ecosystem and customer case studies, and briefly covers other related SAP configurators. This chapter is not a condensed version of that material, but a retrospective of the development of the SAP VC and the principles from which it is derived. It also uses the development of the configurators at SAP as an example of the commercial side of the evolution of configuration technology, because publications about this are sparse. This retrospective is based on personal involvement with SAP (since 1992) and, before that, on participation in the TEX-K and TEX-B expert systems projects mentioned later (at the Battelle Institute, Frankfurt, Germany).

The retrospective covers a succession of three SAP product configurators. The first was a PROLOG-based sales configurator with very direct roots in expert systems. It was called the SAP configurator at the time, being the only one. As it was deployed on the OS/2 platform it was also informally referred to as the SAP OS/2 configurator, a designation used here, although not officially correct. The next configurator is the SAP VC, already mentioned (see also Höfling, 2014). The third SAP configurator is what is now known as the IPC configurator. Each configurator is based on its predecessor. Many of the features originally developed in the expert system paradigm are still visible in these configurators. Hence, understanding expert system projects is key to understanding how and why they operate as they do.

27.2 Expert Systems (XPS)

The XCON system (McDermott, 1982) was deemed an expert system (XPS; see also Hotz et al., 2014a). Its success showed that an XPS can solve real business problems where classical implementations fail. At the time this cemented the belief that XPS provide a good way of dealing with configuration problems, which in turn provided ample funding both for corresponding research and for commercial projects in the 1980s.

The basic architecture of an XPS as depicted in Figure 27.1 is very simple. The task of implementing an application is reduced to capturing and representing the knowledge of human experts. This knowledge is then stored in the knowledge base (KB) of an otherwise out-of-the-box XPS. A user may then query the XPS, which uses predefined inference techniques to answer the query using the expertise stored in the KB. The task of acquiring the expert’s knowledge and representing it adequately in the KB is termed knowledge engineering (as opposed to classical software engineering).

FIGURE 27.1 General architecture of an expert system (XPS).

The XPS idea has some definite conceptual advantages. Since the XPS is to emulate the cognitive processes of the human expert, the system is ideally able to do whatever the expert would have been able to do, provided the KB encodes the relevant expertise and the system has adequate inference techniques. Knowledge engineers are freed from worrying about the exact nature of the underlying problem; their task is just to encode the relevant knowledge correctly. They also do not have to worry about consistency of the KB with the business master data in a company’s databases, as they can assume that this will follow automatically if the expert’s knowledge is correct.

In the beginning, the commercial expectations attached to configuration XPS were not a direct return on investment (ROI). Rather, the goal of an XPS was perceived mainly as risk mitigation regarding the availability of human experts and in alleviating the expertise bottleneck in general. We may imagine that the verification of orders in XCON required expertise of a kind that was not easily scalable as demand for the DEC servers increased.

The German TEX-K project (Strecker, 1989) is typical of the zeitgeist and later influenced the SAP configurators. TEX-K was an R&D endeavor funded by the German government (BMFT) from 1986 to 1989 to develop an XPS shell specialized to deal with configuration in technical domains. It was underpinned with six commercially relevant applications, all denoted by a system name indicating their XPS or configurator heritage. The first three were conceived as direct replacements for human experts: their goal was to configure (parameterize) a system for automatic processing of visual information (XVIS and XRAY) or chemical analysis (XCHEM). The fourth was to support a human expert in automation engineering (MMC-KON). The fifth was in a then as yet undefined space between sales support and engineer-to-order for electrical components such as a transformer (KONEX). And the last one, APLEX, was an XPS to derive a route sheet for the machining of parts (see also Neumann, 1988).

The central system implemented within the TEX-K project was called PLAKON. This is a German acronym combining the words for planning (PLAnung) and configuration (KONfiguration). As this suggests, it was to generically cover both planning and configuration (synthesis) tasks. There was no thought that different applications or different goals might pose different problems requiring different implementations.

PLAKON and the TEX-K project and its aspirations are described in a post-project book Das PLAKON Buch (referred to subsequently as the PLAKON book; Cunis et al., 1991). This book was one source of inspiration when first developing the SAP configurators. It has not been translated into English but does contain a number of references to TEX-K project reports that were published in English.

27.3 Declarative Knowledge Representation and Constraints in XPS in the 1980s

Notwithstanding the simple general architecture of an XPS, there was an intense quest in the 1980s to develop powerful and adequate representation and inference techniques, both as general research into AI methods and as dedicated development within XPS projects such as TEX-K. This quest was motivated as much by needing to be able to emulate human reasoning as by more abstract insights into interesting algorithms.

The rules in XCON may seem like a very natural way for experts to explain themselves in the chosen domain, but drawbacks ensue because the outcome is dependent on the particular order in which the rules are applied (Soloway et al., 1987). It is difficult to predict the effect of local changes to the rule base on the correctness of the overall outcome. Testing is tedious and costly.

To avoid this problem, many of the subsequent approaches to configuration in the 1980s have a basis in predicate logic. Among these are Description Logics (Brachman et al., 1991) and Frames (Hotz, 2009), as well as the tendency to move from rules to constraints (Mackworth, 1977). Because logical formulae state “what is” rather than “what should be done,” these approaches are termed declarative as opposed to procedural. An XPS that operates on a declarative knowledge representation in its KB is termed a model-based configurator. The declarative representation as such is called a (configuration) model (see Hotz et al., 2014a).

27.3.1 (Conceptual) Constraints

One primary concern of the knowledge engineer is to represent dependencies known to the human domain expert. In a model-based XPS, this representation should be declarative wherever possible. A (declarative) constraint, as opposed to a (procedural) rule, is a relation between entities in the model and their properties.

For example, an expert might tell the knowledge engineer that for a (straight) staircase of equidistant steps the height of each stair post (h) depends on the position of the post on the step (x), the height of the handrail (r), and the angle of ascension of the staircase (α), and that it is calculated using the following formula:

h = r + x · tan(α)

This is a geometric constraint, which would allow calculating the value of any of the variables in the formula given the other three by solving for that variable. (Normally, though, it makes no sense to solve for the angle α.) Therefore, without changing anything, it could also be written in a more neutral form closer to the geometric meaning of the expression as:

h − r = x · tan(α)

The knowledge engineer must realize that this equality shall apply for all pairs (s, p) of instances in the configuration, where s is a staircase, p is a stair post, and p is a part of s.

The knowledge engineer may model this information in various ways. One way might be as a snippet of PROLOG code. Figure 27.2 shows how it could be expressed in the SAP constraint syntax.

FIGURE 27.2 (Conceptual) Constraint in SAP syntax.

The constraint in Figure 27.2 is not a CSP constraint. If there were three instances of staircases in the configuration of a building with 50 posts each, it would need to be applied to 150 staircase/post pairs. This corresponds to 150 CSP constraints. Moreover, it uses real numbers and is prepared for interval domains of the variables. This is outside the scope of classical CSP. A constraint formulated at an abstract level as in Figure 27.2 is referred to as a conceptual constraint in PLAKON when there is danger of confusion with the CSP term. But this confusion does not occur in the context of the configurators discussed here, because CSP constraints are not explicit there. The variables of a constraint refer to entities in the model and their properties. When the constraint is applied, its variables are bound to the corresponding properties (values or domains) of object instances in the configuration. The constraint can then be evaluated using those bindings.
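To make the distinction concrete, the following minimal Python sketch (illustrative names only; this is not code from any SAP system) expands one conceptual constraint into its concrete CSP-style applications, one per matching staircase/post pair:

```python
# Sketch: expanding one conceptual constraint over all matching
# (staircase, post) instance pairs into concrete per-pair constraints.

class Instance:
    def __init__(self, cls, **props):
        self.cls = cls
        self.props = props

def expand(conceptual, staircases, part_of):
    """Yield one concrete constraint application per (staircase, post) pair."""
    for s in staircases:
        if s.cls != "staircase":
            continue
        for p in part_of.get(id(s), []):
            if p.cls == "stair_post":
                yield (conceptual, s, p)

staircases = [Instance("staircase", angle=45.0) for _ in range(3)]
part_of = {id(s): [Instance("stair_post") for _ in range(50)] for s in staircases}
apps = list(expand("post_height_eq", staircases, part_of))
print(len(apps))  # 3 staircases x 50 posts = 150 concrete constraints
```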

27.3.2 Local Constraint Propagation and Arc Consistency

The approach to reasoning with constraints taken in the TEX-K project was fairly simple. The primary inference technique associated with constraints is local constraint propagation to achieve arc consistency (see Hotz et al., 2014b, Chapter 6); that is, a constraint is used to filter out nonallowable values and reduce the domains of its variables wherever possible. The search for an actual solution, perhaps requiring the formulation of suitable heuristics, was considered as a separate problem and not a topic of constraints themselves.

The approach to arc consistency was influenced by work of Hans-Werner Güsgen (Güsgen, 1988; Ho et al., 1994) in the TEX-B project, a sister project of TEX-K devoted to developing basic techniques useful in technical XPS for diagnosis or configuration. Local propagation can also be applied in the case of linear numeric inequalities/equalities with real-valued variables with interval domains (Haag, 1989).

The equality in the staircase example is linear in the three variables h, x, and r. Thus, assuming the angle α is known, the constraint can be used to ensure arc consistency of the other three variables (for all staircase/post pairs). To illustrate these concepts: if the angle of ascension α and the height of the handrail r are fixed, and the steps are d deep, then the variable x (position of the post on the step) is restricted (without considering further constraints) to the interval [0, d] and, consequently, the constraint can restrict the variable h (the height of the post) to the interval [r, r + d·tan(α)].
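The interval filtering described above can be sketched in Python, assuming the reconstructed relation h = r + x · tan(α); the numeric domains below are illustrative only:

```python
import math

# Sketch: arc-consistency filtering by interval arithmetic for the
# (assumed) staircase relation h = r + x * tan(alpha).

def propagate(h, x, r, alpha_deg):
    """Each domain is a (lo, hi) interval; returns narrowed (h, x, r)."""
    t = math.tan(math.radians(alpha_deg))
    # h = r + x*t
    h = (max(h[0], r[0] + x[0] * t), min(h[1], r[1] + x[1] * t))
    # x = (h - r) / t
    x = (max(x[0], (h[0] - r[1]) / t), min(x[1], (h[1] - r[0]) / t))
    # r = h - x*t
    r = (max(r[0], h[0] - x[1] * t), min(r[1], h[1] - x[0] * t))
    return h, x, r

# Illustrative numbers (assumed): 45-degree ascent, handrail fixed at
# 90 cm, post anywhere on a 25 cm deep step.
h, x, r = propagate(h=(0.0, 1000.0), x=(0.0, 25.0), r=(90.0, 90.0), alpha_deg=45.0)
print(round(h[0]), round(h[1]))  # 90 115
```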

There were several implementations for processing constraints in TEX-K (Cunis et al., 1991). One of them was implemented for the KONEX and APLEX applications; it is referred to here simply as the PLAKON subset, as it focused on only a subset of the functionality envisioned for PLAKON. It is described in Haag (1991) and more completely in Haag (1995). It was later adopted by SAP and is still in use today. In short, it consists of three modules: the so-called dynamic database (DDB), a pattern matching system (PMS), and a truth maintenance system (TMS).

The DDB is a repository of facts (predicate logic atoms) describing the configuration. Mainly, a fact states the assignment of either a value or a domain to an observable property of an instance. The PMS identifies all tuples of facts that allow a particular constraint to be applied. The implementation of the PMS is based on the RETE Algorithm used for forward chaining in XCON and part of the OPS programming languages (Forgy, 1982).
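A naive Python sketch of the DDB/PMS interplay follows (illustrative only; the real PMS uses RETE-style incremental matching rather than full rescans of the fact base):

```python
# Sketch: the DDB as a set of facts (predicate-logic atoms), and a naive
# pattern matcher that finds all fact tuples enabling a constraint to fire.

ddb = {
    ("instance", "S1", "staircase"),
    ("instance", "P1", "stair_post"),
    ("part_of", "P1", "S1"),
    ("value", "S1", "angle", 45.0),
}

def match_post_height(ddb):
    """Find (staircase, post) pairs for which the height constraint applies."""
    hits = []
    staircases = [f for f in ddb if f[0] == "instance" and f[2] == "staircase"]
    posts = [f for f in ddb if f[0] == "instance" and f[2] == "stair_post"]
    for (_, s, _scls) in staircases:
        for (_, p, _pcls) in posts:
            if ("part_of", p, s) in ddb:
                hits.append((s, p))
    return hits

print(match_post_height(ddb))  # [('S1', 'P1')]
```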

The TMS records justifications for all facts in the DDB, in particular those due to constraints. A justification is an implication f1 ∧ … ∧ fn → f stating that the fact f is a valid part of the configuration if the facts f1, …, fn are valid. The TMS hides/screens uninteresting facts and is able to efficiently invalidate all facts that depended on a revoked decision. The latter is referred to as dependency-directed backtracking (see also Hotz and Wolter, 2013).
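A minimal Python sketch of such a TMS (illustrative, not the actual implementation): each derived fact records its justification, and revoking a decision invalidates everything derived from it:

```python
# Sketch: a minimal TMS where each fact records a justification (the facts
# it depends on); revoking a fact invalidates everything downstream,
# illustrating dependency-directed backtracking.

class TMS:
    def __init__(self):
        self.justifications = {}   # fact -> list of antecedent facts
        self.valid = set()

    def add(self, fact, antecedents=()):
        self.justifications[fact] = list(antecedents)
        self.valid.add(fact)

    def revoke(self, fact):
        self.valid.discard(fact)
        # invalidate all facts whose justification mentions an invalid fact
        changed = True
        while changed:
            changed = False
            for f in list(self.valid):
                if any(a not in self.valid for a in self.justifications[f]):
                    self.valid.discard(f)
                    changed = True

tms = TMS()
tms.add("angle=45")                      # a user decision
tms.add("x=25")                          # another decision
tms.add("tan=1.0", ["angle=45"])         # derived from the first decision
tms.add("h=115", ["tan=1.0", "x=25"])    # derived from both
tms.revoke("angle=45")
print(sorted(tms.valid))  # ['x=25'] -- derived facts fell with the decision
```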

27.3.3 Balances, Checks, and Resources

Sometimes a dependency cannot be expressed as a constraint in the model with a given finite set of variables. One example would be the “constraint” that the summed width w of all cabinets in a kitchen should fill the allotted space W. The “constraining” relation is w = W. If this is violated because w > W, the result is an inconsistency that can only be corrected by taking something back (a cabinet or the specified value of W). In this way it behaves like a real constraint. But if it is violated because w < W, then this merely indicates incompleteness, which might be corrected by adding something (another cabinet). Since (at least in interactive configuration) the XPS must make the distinction between inconsistency and incompleteness clear to the user, it is opportune to distinguish such dependencies from constraints; they were referred to as checks in Haag (1991).
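The behavior of such a check can be sketched in a few lines of Python (widths in cm, values illustrative):

```python
# Sketch of a "check" (as opposed to a constraint): the summed cabinet
# widths must fill the allotted wall space exactly; overshoot is an
# inconsistency, undershoot is merely incompleteness.

def kitchen_check(cabinet_widths, wall_space):
    total = sum(cabinet_widths)
    if total > wall_space:
        return "inconsistent"   # something must be taken back
    if total < wall_space:
        return "incomplete"     # something may still be added
    return "complete"

print(kitchen_check([60, 60, 90], 240))      # incomplete: 30 cm still free
print(kitchen_check([60, 60, 90, 30], 240))  # complete
print(kitchen_check([60, 60, 90, 60], 240))  # inconsistent: 30 cm too much
```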

Dependencies involving sums are often related to the concept of resources (see Hotz et al., 2014b). A resource (such as space or electrical power) is consumed by some components and provided by others. In the preceding example, each cabinet consumes space equal to its width and the overall kitchen provides the amount of available space W. SAP followed the approach of Bernhard Neumann, who developed a concept of balances for resources and/or functionality in hierarchical configurations in his dissertation (Neumann, 1989). The two sides of a balance were called offers and requires.
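A balance in this sense can be sketched as follows (Python, with an illustrative electrical-power resource):

```python
# Sketch of a balance with offers/requires sides: each component offers
# or requires some amount of a resource; the per-resource balance should
# not be negative in a consistent, complete configuration.

def balance(components):
    totals = {}
    for kind, resource, amount in components:
        sign = 1 if kind == "offers" else -1
        totals[resource] = totals.get(resource, 0) + sign * amount
    return totals

config = [
    ("offers", "power_w", 500),     # power supply
    ("requires", "power_w", 120),   # mainboard
    ("requires", "power_w", 250),   # graphics card
]
print(balance(config))  # {'power_w': 130} -- 130 W still offered
```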

27.3.4 Soft Constraints

Not all constraints are created equal. Some constraints encode physical or legal laws that must not be violated. These cannot be questioned and are termed hard constraints. Other constraints encode defaults or relations that are nice to have. These are typically fulfilled in good solutions, but might be dropped if violated, and are termed soft constraints. Soft constraints are instrumental in interactive configuration in guiding the user toward good solutions. Generally, any constraint can be marked as soft in the model. Whether a constraint is deemed hard or soft may also be the result of a conscious decision: for example, constraints based purely on marketing decisions might be classified as either hard or soft, depending on a company’s policy.

Soft constraints may arise naturally in knowledge engineering. Let us consider the following two constraints in the model of the APLEX application given in the formulation of the expert:

• The operation rough machining must precede the operation fine machining for each feature to be machined.

• (All) rough machining operations should precede (all) fine machining operations.

The latter is a global statement that expresses a good strategy. When necessary, it can be violated. The former is a hard constraint. Rough machining a feature after fine machining it would be pointless and would destroy the result of the fine machining.

Another example, famous in AI, is the statement “all birds fly”: that is, if an object instance is known to be a bird, this constraint would set its property of “being able to fly” to the value true. Obviously, an individual bird may not be able to fly (due to a broken wing, say). In a configuration involving birds, the soft constraint will apply in some situations, but may not apply in others. In the PLAKON subset it is possible to selectively remove some inferences of a soft constraint while leaving others in place, because inferences due to soft constraints are represented as soft justifications. Resolving an inconsistency is then a matter of discarding soft justifications, not soft constraints as a whole.

The question of whether birds can fly is a typical problem dealt with in the discipline of nonmonotonic reasoning. This was an R&D topic in the TEX-B project, where various versions of the Assumption-Based Truth Maintenance System (ATMS; de Kleer, 1986) were developed (Dressler, 1988). The TMS in the PLAKON subset was one of these (Haag, 1991, 1995). The soft justifications were treated as ATMS assumptions. In the case of an inconsistent configuration, the TMS can calculate minimal sets of conflicting soft justifications. This functionality has been incorporated into the SAP IPC configurator (see the dedicated section later) and, with further extensions, into the SAP-internal “Scope Selection” application (Haag and Riemann, 2011).
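The enumeration of minimal conflicting sets of soft assumptions can be sketched as follows (Python; the consistency test is a stand-in for the constraint engine, and the example conflicts are hypothetical):

```python
from itertools import combinations

# Sketch: enumerate minimal sets of conflicting soft assumptions, in the
# spirit of the ATMS functionality described above. Subsets are tried in
# order of increasing size, so every conflict found is minimal.

def minimal_conflicts(assumptions, consistent):
    found = []
    for size in range(1, len(assumptions) + 1):
        for subset in combinations(sorted(assumptions), size):
            s = set(subset)
            if not consistent(s) and not any(c <= s for c in found):
                found.append(s)
    return found

# Hypothetical example: "color=red" conflicts with "trim=sport", and
# "engine=small" conflicts with "towing=yes".
bad = [{"color=red", "trim=sport"}, {"engine=small", "towing=yes"}]
consistent = lambda s: not any(b <= s for b in bad)

conflicts = minimal_conflicts(
    {"color=red", "trim=sport", "engine=small", "towing=yes"}, consistent)
print(len(conflicts))  # 2 minimal conflict sets
```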

27.3.5 Problem Solving

The question now is: What is the task of the XPS in supporting problem solving? If the underlying problem is equivalent to a CSP (or some extension of CSP) that is hard to solve because it is heavily constrained, then finding a solution using CSP techniques is key. On the other hand, typical interactive configuration problems (such as sales configuration) are usually underconstrained with regard to the hard constraints. This means that a multitude of solutions exists and it is not hard to find one. But the problem is overconstrained with respect to the overall set of hard and soft constraints. This means it is not possible to find a solution that satisfies all these constraints. Resolving the inconsistencies by deciding which derivations from soft constraints (and user choices) to ignore is now key and defines the actual search problem.

There is another way to limit the applicable constraints besides marking them as soft. In an engineer-to-order scenario there may be a preferred way of configuring a needed component such as the core of a transformer using standard components. Constraints can be formulated for this task. However, if this fails, the component must be engineered using nonstandard components. A different set of constraints then applies. Thus, the set of constraints to be considered is based on the tasks being performed.

The upshot is that in order to support finding a solution, the XPS may need to do different things in different situations. If the problem is finding one or more solutions that satisfy the applicable constraints, the XPS must implement a strategy of exploring the search space systematically and efficiently. In TEX-K this was not directly part of the constraint processing, but rather a topic considered under the heading of controlling the configuration process (Günter et al., 1990; see Hotz and Günter, 2014).

In the case that the problem is overconstrained due to soft constraints (and has no solution that satisfies everything), the sought solution takes the form of a maximal set of soft constraints such that the resulting overall set of constraints can be satisfied and provides an optimal solution. Finding this autonomously may be the task of the XPS. In this case, the preferences or optimality criteria need to be defined somehow. One way of doing this (explored with the PLAKON subset in APLEX) is to assign a numeric strength to each soft constraint and perform a best-first search. The complexity of this is high in practice and exponential in theory. Thus, the approach was not adopted by SAP.
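Such a best-first search over soft constraints can be sketched as follows (Python; the strengths and the inconsistency are illustrative). Its worst case visits every subset of the soft constraints, which illustrates the exponential complexity mentioned above:

```python
import heapq

# Sketch: best-first search over which soft constraints to drop, each soft
# constraint carrying a numeric strength; states are explored in order of
# total strength dropped, so the first consistent state found is optimal.

def best_first(softs, consistent):
    """softs: dict name -> strength; consistent(kept) -> bool."""
    start = frozenset(softs)
    heap = [(0, sorted(start))]   # (total strength dropped, kept constraints)
    seen = {start}
    while heap:
        cost, kept_list = heapq.heappop(heap)
        kept = frozenset(kept_list)
        if consistent(kept):
            return kept, cost
        for s in kept:
            nxt = kept - {s}
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(heap, (cost + softs[s], sorted(nxt)))
    return frozenset(), sum(softs.values())

softs = {"a": 3, "b": 1, "c": 2}
# Hypothetical inconsistency: soft constraints "a" and "b" cannot both hold.
result, cost = best_first(softs, lambda kept: not {"a", "b"} <= set(kept))
print(sorted(result), cost)  # ['a', 'c'] 1 -- drop the weakest conflicting one
```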

However, if the user or an agent on the user’s behalf performs the search interactively, they will make the creative decisions themselves and any complexity is their responsibility. The role of the XPS is then to provide decision support. This means the XPS will be able to calculate the root causes of inconsistency and guide the user in resolving them. It will also support a user in juggling alternatives; that is, let them switch back and forth between alternate configuration states, and aid in comparing the differences between these states. The TMS in the PLAKON subset supports this.

Finally, the XPS may need to reason about which tasks to apply, and when. A PROLOG-like mechanism was provided in the PLAKON subset for KONEX that allows breaking a task into subtasks and formulating alternative tasks to start upon the failure of a task. Although the need to be able to switch to a configuration with nonstandard components is a common problem in engineer-to-order settings, this approach was taken up very late in SAP configurators and has not been completely implemented to date. Independently of this development, reasoning about tasks also played a role in the XVIS application (Kühne and Meyer-Fujara, 1991).

27.4 The Manufacture of Variants: A Configuration Problem

The PLAKON applications are diverse, but by no means span the space of all business relevant configuration problems. One central process that is not addressed in TEX-K stems from the manufacture of product variants. Often many products a company manufactures and sells are variations on a basic scheme. For example, the same ballpoint pen may come in several different colors. Classically, a standard (nonconfigurable) product is defined for each pen/color combination. An alternate approach would be to treat all of these as product variants of a central product, “pen.”

For manufacturing a nonconfigurable product, it suffices to know the bill of materials (BoM) and the material master data. The former defines the parts that are needed to build the product. The latter records standard properties of the product including dimensions (e.g., weight, length).

In the example of the ballpoint pen, the only variance is in the choice of color. The BoMs of all pens are alike, except that the ink cartridge and lower part of the outer casing differ in color. Nevertheless, the material master data and a BoM must be maintained separately for each variant. As further variance is introduced, it becomes more and more tedious to maintain such partially redundant data for each product variant. For products such as cars with a potentially huge number of variants, it is no longer even possible.

A solution to this is, of course, to create a BoM (along with the product master data) that covers all variants of a given product; that is, encompasses the union of all potentially needed parts and properties. A mechanism for correctly selecting the needed parts for a particular variant is then needed when processing an order. Ideally, the system will ensure consistency and completeness. Consistency states that the selected parts will work together and completeness means that all parts needed to build the product are chosen.

SAP was one of the first companies to offer standard business software off the shelf. The SAP R/2 system first supported variants by allowing a BoM to cover a fixed number (16) of variants. A part in the BoM was either fixed (i.e., part of all variants) or listed the quantity in which the part was to appear for each variant. A quantity of zero denoted that the part was not needed in that variant. For manufacturing, the variant to use was identified by its index (a number between 1 and 16).
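The early R/2 scheme can be sketched as a simple data structure (Python; the part names are illustrative, reusing the ballpoint pen example):

```python
# Sketch of the early R/2 scheme: a BoM item is either fixed (part of all
# variants) or carries a quantity per variant index (1..16), with
# quantity 0 meaning "not in this variant".

bom = [
    {"part": "barrel_upper", "fixed": True, "qty": 1},
    {"part": "cartridge_red", "qty_per_variant": {1: 1}},   # variant 1 only
    {"part": "cartridge_blue", "qty_per_variant": {2: 1}},  # variant 2 only
]

def parts_for_variant(bom, idx):
    out = []
    for item in bom:
        if item.get("fixed"):
            out.append((item["part"], item["qty"]))
        else:
            q = item["qty_per_variant"].get(idx, 0)
            if q > 0:
                out.append((item["part"], q))
    return out

print(parts_for_variant(bom, 2))  # [('barrel_upper', 1), ('cartridge_blue', 1)]
```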

By 1988 SAP R/2 had improved on this and offered so-called Open Variants functionality. It was possible to define characteristics for those materials with variants; these were designated as KBAU materials. The characteristics were defined in a dedicated (business) database table. It was possible to reference these characteristics in selection conditions attached to the BoM items. It was also possible to define transformations on the characteristics and thus calculate dimensions dynamically. A roughly concurrent publication on this subject outside of SAP is Schönsleben (1985).

Subsequent research (concurrent with the TEX-K project) on the variant problem is given in Kleine Büning and Schmitgen (1988) and Neumann (1987). Bernhard Neumann joined SAP in 1990 and this work was incorporated into a first SAP product configurator, which was then being developed for SAP R/2, the SAP (OS/2) configurator (see later). The important kinds of dependencies proposed were (as later implemented in the SAP VC) selection conditions, preconditions, and procedures. Selection conditions and preconditions can be attached to BoM items, characteristics, or characteristic values. A selection condition is a sufficient condition. If it is fulfilled, the BoM item is selected, the characteristic becomes required, or the value is chosen. A precondition is a necessary condition. If it is not fulfilled, an instance should not be selected for this item, the characteristic should not be assigned a value, or the value is disallowed. Preconditions can signal inconsistency. Selection conditions can cause incompleteness (when a required characteristic does not yet have an assigned value).

Procedures are nonforward chaining rules that can be used to set characteristic values. Each of these dependencies has access to the characteristics of at most three objects: $SELF (the object to which the dependency is attached), $PARENT (the parent object of a BoM item), and most importantly, $ROOT (the unique root object of the configuration).
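The semantics of selection conditions and preconditions can be sketched as follows (Python; the encoding and names are hypothetical and not the SAP dependency syntax):

```python
# Sketch: evaluating the two condition kinds on BoM items against the
# characteristics of $ROOT. A selection condition is sufficient
# (fulfilled -> item selected); a precondition is necessary
# (violated -> item disallowed).

def evaluate_item(item, root):
    if "precondition" in item and not item["precondition"](root):
        return "disallowed"   # necessary condition violated: inconsistency
    if "selection" in item and item["selection"](root):
        return "selected"     # sufficient condition fulfilled
    return "open"

root = {"color": "red"}   # characteristic values of the root instance
items = [
    {"part": "cartridge_red", "selection": lambda r: r.get("color") == "red"},
    {"part": "metal_clip", "precondition": lambda r: r.get("finish") != "basic"},
]
print([evaluate_item(i, root) for i in items])  # ['selected', 'open']
```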

Configuration can be a multilevel process; that is, the parts of a configurable material might themselves be configurable. These mechanisms are suited to a top-down mode of configuration, moving from an instance to its parts and configuring them in turn. There are, of course, still two ways this process may evolve: breadth first or depth first. Arguments can be made for either.

27.5 A Productively Used XPS: The SAP (OS/2) Configurator

SAP brought elements of various XPS developments together with a pragmatic touch that was instrumental in creating a configurator that saw sustained productive use. The business purpose was to offer sales configuration capabilities to the SAP R/2 system. This included sales support, functioning as an electronic catalog for managing variants and as an electronic form of a sales handbook.

The thinking at the beginning of the 1990s was still very much in line with that of XPS inherited from AI. It was taken for granted that an XPS development could not be done in a classic programming language. Therefore, this configurator was implemented in PROLOG. For this, SAP created the SAP PROLOG environment, which consisted of a PROLOG engine written in C and an interface to user interface (UI) technology of the IBM OS/2 operating system. The configurator saw productive use starting sometime in 1992 on the OS/2 platform. Hence, it was known internally as the OS/2 configurator. A few customers used it productively well into the new millennium.

One of the major influences on the OS/2 configurator was the set of concepts Bernhard Neumann brought with him: selection conditions, preconditions, procedures, and balances. The other major influence was the PLAKON book (Cunis et al., 1991). Early OS/2 configurator manuals had sections directly devoted to the major features of PLAKON. However, many were never completely implemented. Figure 27.3 gives an early version of the constraint syntax. This is the same constraint as that depicted in Figure 27.2, which gives the current syntax.

FIGURE 27.3 Constraint in early SAP (OS/2) configurator syntax.

A standalone modeling environment was also implemented in PROLOG on OS/2. This allowed defining classes with characteristics (influenced by FRESKO; Cunis, 1992). A class could have multiple superclasses. Materials, which needed to correspond to materials in the host R/2 system, were assigned to classes and inherited characteristics, value domains, and defaults from these. Inheritance was monotonic, except for defaults. Touretzky’s concept of minimal inferential distance was used to control default inheritance (Touretzky, 1986). The partonomy was BoM-like and represented sales BoMs. A syntax was defined for maintaining the different forms of dependencies: selection conditions, preconditions, procedures, balances, and constraints. The syntax of the SAP ABAP programming language may have had an influence on the style of the dependency syntax, with the addition of a smattering of LISP taken from the PLAKON constraint syntax (see Figure 27.3). An early manual mentions control knowledge similar to that in TEX-K, but there was never a way to maintain this.

Constraint processing was based on the imagePLAKON architecture. The three modules DDB, PMS, and TMS were implemented in PROLOG. A simpler handling of defaults than that of the image was implemented in the DDB. The basic distinction between inconsistency (something decided on needs to be revised) and incompleteness (might be fixed by adding something) was made. If a constraint was violated, this was an inconsistency. Constraints were permanently active in the configuration, so the user was immediately alerted to an inconsistency. Completeness checks (including the calculation of the balances) were performed only on user demand. The other dependencies were executed either on demand, or automatically linked to certain events in the UI.
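The inconsistency/incompleteness distinction described above can be sketched as follows. All names here are illustrative, not the actual SAP implementation: a violated constraint is an inconsistency (a decision must be revised), while a missing required value is merely incompleteness (something can still be added).

```python
def check_configuration(assignments, constraints, required):
    """Classify a configuration as 'inconsistent', 'incomplete', or 'ok'.

    assignments: dict characteristic -> value (None means unassigned)
    constraints: list of (predicate, needed-characteristics) pairs; a
                 constraint is evaluated only once all characteristics it
                 mentions carry a value
    required:    characteristics that must be assigned for completeness
    """
    # Constraints are permanently active: a violation means something
    # already decided must be revised.
    for holds, needed in constraints:
        if all(assignments.get(c) is not None for c in needed):
            if not holds(assignments):
                return "inconsistent"
    # Completeness (checked only on demand in the OS/2 configurator):
    # something required is still missing and might simply be added.
    if any(assignments.get(c) is None for c in required):
        return "incomplete"
    return "ok"


# Illustrative constraint: a fast CPU requires active cooling.
constraints = [
    (lambda a: a["cpu"] != "fast" or a["cooling"] == "active",
     ("cpu", "cooling")),
]
required = ("cpu", "cooling")

assert check_configuration(
    {"cpu": "fast", "cooling": None}, constraints, required) == "incomplete"
assert check_configuration(
    {"cpu": "fast", "cooling": "passive"}, constraints, required) == "inconsistent"
```

Note how the unfinished configuration is not yet inconsistent: the constraint stays silent until all of its characteristics are set.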

Every configuration had one unique root instance. All other components had a unique parent instance in the configuration. Thus the set of object instances in the configuration was always a tree linked in a BoM structure. There were three types of characteristics: single-valued characteristics that can be assigned a value in the configuration; multivalued characteristics that can be assigned more than one value; and restrictable characteristics. The latter behave like single-valued characteristics, but are assigned domain restrictions in the configuration. A singleton restriction was taken as being equivalent to a value assignment. A restriction to the empty domain was an inconsistency. All characteristics could be declared required or optional. The configuration was not complete if a required characteristic had not been assigned a value.
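A minimal sketch of the restrictable-characteristic semantics just described, using hypothetical Python names rather than anything from the actual implementation: restrictions intersect the domain, a singleton domain counts as a value assignment, and an empty domain is an inconsistency.

```python
class RestrictableCharacteristic:
    """Illustrative model of a restrictable characteristic."""

    def __init__(self, name, domain):
        self.name = name
        self.domain = set(domain)

    def restrict(self, allowed):
        """Intersect the current domain with a new restriction."""
        self.domain &= set(allowed)
        if not self.domain:
            # A restriction to the empty domain is an inconsistency.
            raise ValueError(f"{self.name}: empty domain (inconsistency)")

    @property
    def assigned_value(self):
        """A singleton restriction is equivalent to a value assignment."""
        if len(self.domain) == 1:
            return next(iter(self.domain))
        return None


color = RestrictableCharacteristic("color", {"red", "green", "blue"})
color.restrict({"red", "green"})   # still undecided
assert color.assigned_value is None
color.restrict({"red"})            # singleton -> treated as an assignment
assert color.assigned_value == "red"
```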

Except for singleton domains (equivalent to assigning values) the user was not able to formulate domain restrictions. This simplified the configuration logic and the UI, particularly with respect to taking user decisions back. The possible interactive configuration decisions a user could make were to

• Assign a characteristic value (or take an assignment back)

• Manually select a component from the BoM, causing an instance to be instantiated (or delete an instance from the configuration)

• Specialize the type of an instance; for example, decide that an instance of the class personal computer actually is an instance of the class laptop (or take a specialization of an instance back)

It was possible to search variants by their characteristic values. This was implemented as a fuzzy search to ease finding similar variants. A facility for generating explanations was implemented, as was a scheme for pricing that emulated that in use in SAP sales. Ideas for an automatic search for a solution as part of the configurator itself were discarded: any overly complex algorithm could cause performance problems for the customers and would land at SAP’s doorstep as maintenance requests or bug reports.

The system was able to export a configuration to R/2 (which was also stored there) and to reimport a configuration from R/2. The same mode of operations was then also offered for the then nascent R/3 system, but this was more or less immediately supplanted by the development of the SAP VC. There was a “dark” (noninteractive) mode in which the configurator merely checked and completed configurations similar to the task of XCON. The intended primary mode of operations, however, was interactive.

Like XCON, the SAP OS/2 configurator showed that the XPS approach works in practice.18 Projects were also underway for configuring non-tangible goods such as financial packages (insurance, mortgages) and for supporting sales of the then upcoming R/3 system.19

The SAP OS/2 configurator was also SAP’s entry ticket into the realm of real product configuration. It was used to showcase this functionality to customers (primarily in the United States) who were already thinking about configuration or had a configurator project history (such as DEC and HP), and it facilitated the discussion with them on how to further this topic.

27.6 Making It Mainstream: The SAP Variant Configurator (SAP VC)

SAP saw configuration as an important business topic and perceived the need for a smoother integration of a configurator with the business processes of the logistics supply chain and PLM (Product Lifecycle Management) than the SAP OS/2 configurator could offer. As a result a new configurator, the SAP VC,20 was developed as a direct part of R/3 and implemented in the SAP ABAP development environment.21 The concept of balances was not implemented, and an ATMS was not attempted. Additional work needed to be done on utilities to provide a PROLOG-like evaluation of conditions, and to provide explanations. A major task was also writing a parser in SAP ABAP for the dependency syntax including constraints. The discrimination nets used in the PMS module to execute constraints (following the RETE approach; Forgy, 1982; Haag, 1991, 1995) had to be compiled.

The configuration model was composed of R/3 master data. It was possible to define classes and characteristics that were of general business use directly as R/3 master data. This R/3 class system was partly based on the object system of the SAP OS/2 configurator, and partly on experiences with other business processes.22 The partonomy was again the BoM. It was possible to have BoM items refer to classes rather than materials. An instance corresponding to such a position in the BoM needed to be specialized to a material during the configuration process. The dependencies were compiled into a form that allowed fast run-time evaluation. One of the things learned from the SAP OS/2 configurator was that as the number of constraints gets larger, the compilation of the associated discrimination nets is slowed down. Consequently, constraints were organized into constraint nets of limited size that were compiled individually. All of these modeling entities were subject to the normal ECM (Engineering Change Management) of the ERP system (granularity of the changes: one day). ECM is a process familiar to engineers maintaining master data.

A configurable material (of type KMAT) was always the entry point (root instance) for a configuration (the SAP OS/2 configurator also allowed a class as an entry point). A KMAT had one or more configuration profiles. A configuration profile controlled the level to which the subcomponents should be configured and other aspects related to the business scenario. Constraint nets were attached to configuration profiles. When a configuration was started, the business scenario determined which profile to use, and the needed model data (classes, dependencies, BoMs, and anything referenced by these) were collected; that is, the model was created dynamically by reading the database. For better performance, it was possible to store the dynamically assembled model of a KMAT in a proprietary format in temporary storage on a daily basis.

Integration of SAP (sales) pricing with configuration was provided as follows: in the configuration, a special multivalued pricing characteristic could be set to hold one or more pricing condition keys. These in turn were intelligible to pricing, which used them to calculate surcharges and the like. A persistent storage for configurations was created that was intelligible to other business processes. It records all facts in the DDB that are simple properties, mainly value assignments to properties of the instances in the configuration. Among these are also all user choices. The author of the fact is also recorded. The author encodes the main reason for a fact’s existence; that is, whether the fact is due to a user choice, a constraint inference, a statement in the model, and so on. Recording the author has turned out to be important for supporting reconfiguration; that is, being able to modify the configuration further after reloading it. When loading the configuration again, the justifications due to dependencies can be reconstructed in the TMS. Those due to the user are recreated based on the author. For example, when an inconsistent configuration is reloaded, the user should be informed which of their choices conflict. Without the author information it would be impossible to distinguish user input from other sources of derivation.
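The author bookkeeping can be illustrated with a small sketch (all field names are hypothetical, not the actual persistence format): on reload, only user-authored facts are re-asserted as user choices, while dependency-derived facts are left for the engine to rederive and rejustify.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fact:
    """One persisted fact of a configuration (illustrative fields)."""
    instance: str        # object instance the fact belongs to
    characteristic: str
    value: str
    author: str          # 'user', 'constraint', 'model', ...

def user_choices(stored_facts):
    """Facts to re-assert as user decisions when reloading a configuration.

    Facts with other authors are rederived by the engine (e.g., constraint
    inferences get their justifications rebuilt in the TMS), so re-asserting
    only user-authored facts keeps user input distinguishable from
    derivations -- which is what makes conflict reporting possible.
    """
    return [f for f in stored_facts if f.author == "user"]

stored = [
    Fact("pc-1", "cpu", "fast", "user"),
    Fact("pc-1", "cooling", "active", "constraint"),  # rederived on reload
    Fact("pc-1", "color", "grey", "model"),           # default from the model
]
assert [f.characteristic for f in user_choices(stored)] == ["cpu"]
```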

What are the major lessons learned? Very generally, that business integration is surprisingly difficult. Good integration is more important than advanced configurator features. Configuration in itself is not a business process; it is a step in various business processes. These processes need access to the configuration and to the master data of the configuration model and must be prepared to process this information.

But there are also some more concrete insights. Foremost is the realization that the unity of configuration model and general master data is a singular advantage. It saves customers a tremendous amount of time and money if they can be assured that the configuration model is in sync with the business master data. This in itself would be considered a unique selling point (USP) for the SAP VC.

Next is the realization that sales configuration (whether quote or sales order) is not viable without both reliable pricing and availability information. Pricing in turn may be based on costing. To provide accurate availability and costing the entire MRP run (Materials Requirements Planning) must be performed based on the configuration, perhaps reserving materials in the chosen manufacturing plant. For a configurator to do this is only possible in a tight integration such as that of the SAP VC (another USP).

Furthermore, it is essential that configurations, which are flagged as consistent and complete by the SAP VC, can be manufactured and delivered at the promised price and date. Again, this is greatly facilitated by having the same master data as source for both the configuration model and the manufacturing processes.

Finally, it was discerned that sales configuration is a three-tier approach. At the bottom is the low-level BoM explosion of potentially tens of thousands of parts (e.g., in the MRP run). This is noninteractive and best handled with a separate implementation that is freed from all overhead regarding possible explanations. The middle tier is called manufacturing completion. This, as in XCON, takes a sales configuration, verifies that it is correct, and completes it by deriving values for additional characteristics and perhaps instantiating additional components. The result of manufacturing completion is input to the low-level BoM explosion. The top tier is the sales configuration itself. Unlike the other two tiers, it normally does not depend on the manufacturing plant. The top two tiers both allow the use of constraints; the low-level task allows only selection conditions and procedures. The SAP VC actually handles the top two tiers together. Only the low-level BoM explosion is implemented separately. Nevertheless, it would be a boon to formally separate the top-level interactive configuration from the completion configuration, making it easier to manage the overall performance of the configuration. In XCON-like scenarios the SAP VC only performs the manufacturing completion task (based on sales orders obtained through other means) and triggers the low-level BoM explosion.

27.7 The SAP IPC (Internet Pricing and Configuration)

From the perspective of many customers that configure both in the SAP CRM (Customer Relationship Management) and the SAP ERP systems, the SAP IPC configurator is just a reimplementation of the SAP VC in Java with less integration and an annoying list of small differences (the so-called IPC delta list). The IPC is functionally part of the CRM system, can be attached to the ERP system, has a web-suitable interface, and can operate detached from the ERP system. It also supports customers who have configuration processes in CRM that do not have configuration models in ERP.

The motivation for its creation was based on specific customer requirements. These customers wanted a standalone configurator for sales that was not as BoM-oriented as the SAP OS/2 and VC configurators were and that supported further relations between objects, such as connection relations and containment relations, among others.23

Based on these requirements, a configurator engine was designed and implemented in Java called the SAP SCE (Standalone/Sales Configurator Engine).24 The SCE is now part of the IPC. Model data is extracted from R/3 (SAP ERP). All model data the SAP VC normally collects dynamically for a specific date or ECM state is stored in a repository called a knowledge base run-time version (KBRT). A KBRT can potentially store model data for more than one root material. For example, data for all materials under a given class could be combined into one KBRT. KBRTs can be transported to a local database or to the CRM system. For those customers using the CRM system without the ERP system (a definite minority), it is possible to maintain models in the form of a KBRT directly in CRM.

Maintenance in R/3 (SAP ERP) was enhanced to allow advanced features such as ADT (abstract data type) characteristics, which can be assigned another object instance as a value at run-time (to express relations between instances). Constraints were enhanced to deal with ADTs and to be able to do the job of selection conditions and preconditions. Also, one of the limitations of the SAP VC was removed. With a classical BoM, each instance in the configuration corresponds uniquely to a BoM item in the BoM of its parent instance. In the advanced models it was possible to specify a minimum and maximum number of allowed instances for each item in a BoM. Many instances could now belong to one BoM item. This feature is important when modeling products where the number of parts of an object instance cannot be foreseen. An example is the BoM of an elevator: it is tedious or impractical to model explicitly for the maximum number of floors an elevator may have to service.
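The min/max cardinality idea can be sketched with a hypothetical helper (not SCE code): each BoM item carries a minimum and maximum number of allowed instances, and a check reports items that are still incomplete or overfilled.

```python
def check_item_cardinality(bom_items, instance_counts):
    """Report per-item problems against min/max instance cardinalities.

    bom_items:       dict item -> (min_allowed, max_allowed)
    instance_counts: dict item -> number of instances created so far
    Returns a dict item -> 'incomplete' (too few) or 'too many'.
    """
    problems = {}
    for item, (lo, hi) in bom_items.items():
        n = instance_counts.get(item, 0)
        if n < lo:
            problems[item] = "incomplete"
        elif n > hi:
            problems[item] = "too many"
    return problems

# One 'floor_stop' item covers any number of floors within its bounds,
# instead of modeling one BoM item per possible floor (invented numbers).
elevator_bom = {"cabin": (1, 1), "floor_stop": (2, 60)}
assert check_item_cardinality(elevator_bom, {"cabin": 1, "floor_stop": 1}) == {
    "floor_stop": "incomplete"
}
```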

A nonhierarchical concept for summation was introduced: so-called aggregating rules. These were implemented using the same technology as the constraint processing. Nonhierarchical balances were supported. These subsumed two aggregations under one name, one for the offers side and one for the requires side of the balance, with a built-in comparison of the two sides. ATMS functionality was finally implemented to support the conflict solver, an aid to dealing with inconsistency. In its current state it has been improved to use a concept of choices adapted from the Scope Selection application (Haag and Riemann, 2011). A choice is a user decision to support or forgo a particular property of the configuration. If the user chooses to forgo such a property, then all soft justifications for this property are simultaneously inactivated/discarded. If they then choose to support the property again, all soft justifications are activated/reinstantiated.
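A balance of this kind can be sketched as two aggregations under one name with a built-in comparison (all names and numbers are invented for illustration):

```python
def balance(name, facts):
    """Aggregate the offers and requires sides of one named balance.

    facts: iterable of (balance_name, side, amount) triples, with side in
           {'offers', 'requires'}. The sides may come from anywhere in the
           configuration -- the aggregation is not tied to the BoM hierarchy.
    Returns (offers_total, requires_total, balanced), where an imbalance
    (requires exceeding offers) signals incompleteness.
    """
    offers = sum(a for n, side, a in facts if n == name and side == "offers")
    requires = sum(a for n, side, a in facts if n == name and side == "requires")
    return offers, requires, offers >= requires

facts = [
    ("power", "offers", 450),    # power supply
    ("power", "requires", 300),  # CPU board
    ("power", "requires", 200),  # graphics card
]
offers, requires, ok = balance("power", facts)
assert (offers, requires, ok) == (450, 500, False)  # imbalance -> incomplete
```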

Further notes: A concept of tasks, similar to that in imagePLAKON, was proposed in the design document, but has not been implemented to date. For non-monotonic inheritance of defaults, the notion of Touretzky’s minimal inferential distance (Touretzky, 1986) was replaced by that of a class precedence list as defined for Common LISP (Steele, 1990). This still proved too complex and there are proposals for a pragmatic simplification. Hooks for integration with 2D/3D rendering of configured products using third-party viewers were provided, but this functionality was not heavily used due to its associated effort and cost.
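The precedence-list approach to defaults can be illustrated in a few lines. Here Python's own C3-based MRO stands in for the CLOS-like class precedence list, and all class names and defaults are invented for the example: the first default found along the linearized superclasses wins.

```python
# Hypothetical default declarations per class (not SAP model data).
DEFAULTS = {
    "Computer": {"color": "grey"},
    "Laptop":   {"color": "black", "battery": "li-ion"},
}

class Computer: pass
class Laptop(Computer): pass

def default_for(cls, characteristic):
    """First default along the class precedence list wins.

    Python's MRO linearizes the superclasses (C3, as in Common LISP's
    class precedence list), so a more specific class shadows defaults
    of its superclasses without any minimal-inferential-distance logic.
    """
    for c in cls.__mro__:                 # the class precedence list
        if characteristic in DEFAULTS.get(c.__name__, {}):
            return DEFAULTS[c.__name__][characteristic]
    return None

assert default_for(Laptop, "color") == "black"    # Laptop shadows Computer
assert default_for(Laptop, "battery") == "li-ion"
assert default_for(Computer, "battery") is None
```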

The first use of the SCE was in a project configuring office furniture solutions in 1998, which also encompassed integration with a CAD system; this was pioneering work. Such developments were not mainstream projects due to their complexity and integration issues. The number of customers involved in them was small.

Meanwhile, many customers using the SAP VC wanted to profit from a standalone configurator without needing the new advanced functionality. For this reason, a compatibility mode was added to the SCE. Selection conditions, preconditions, and procedures were reintroduced. However, because the SCE had been designed to improve on the specification of the SAP VC, and because it operated outside of the ERP environment, there remained a list of differences between the SCE and the SAP VC (the IPC delta list). Efforts have been made to reduce this list over time by moving the SCE closer to the SAP VC. The original mode, for which the SCE had been designed, was subsequently referred to as the SCE advanced mode.

The advanced mode suffered from a lack of standard integration to ERP. For example, the multiple instances for a single BoM item had to somehow be converted to an order BoM. This kind of integration was done in the individual projects. These projects became increasingly costly, and SAP stopped supporting them without additional remuneration. More recently, the SAP Solution Configurator (SSC), based on the advanced-mode functionality, has been offered as a standard product.

The SCE was incorporated into SAP mobile sales and was also adapted for use on the Internet. Pricing followed suit with the SPE (Standalone Pricing Engine), also implemented in Java. SCE and SPE were joined in a component named the SAP IPC (Internet Pricing and Configuration). The IPC is part of SAP CRM. It is also described in Blumöhr et al. (2012) as far as it is relevant for customers using the SAP VC.

From the point of business integration, there were new challenges. I mention three examples in closing this section: purchasing configurable products, contracts for configurable products, and incorporating configurable products in a catalog.

Purchasing might be considered a sales configuration from the perspective of the supplier. However, the buyer is faced with the problem of having to store the configuration in a form that is intelligible to them; for example, the configuration must be enriched with texts and the like that would not be stored with an internal configuration because they could be read from the database. Purchasing could also result in a tender, which invites competing suppliers to submit quotes. In this case the subject of the tender is a configuration that describes the product sought (a magazine to be printed, say). The buyer owns the model, but the suppliers must be able to understand and process the configuration.

A contract between two companies might state what products are offered from one to the other. For example, a company might allow their employees to directly order a personal computer from another company, but only when conforming to conditions agreed on with the supplier in a contract. In the case of a configurable product like a computer, this takes the form of a partial configuration (restricted domains, etc.). In the SAP world nonsingleton domain restrictions are normally not stored in a persisted configuration. Thus, the storage and associated logic must be modified accordingly. Another approach might be to define a restricted model for the contract. In practice, the first approach proved simpler.

An electronic catalog that contains configurable products must support configuration. Some customers wanted to mass distribute such an electronic catalog on a CD. The CD would then have to contain the configurator and the model, ideally requiring no installation by the users. One question is how to display the product and its price in the catalog before the product has been configured. An answer is to always display a variant with its price and then adapt the product and price as the user reconfigures his/her solution based on this variant.

In all three aforementioned examples, a central question is the ownership of the model and the pricing data. Generally, this is sensitive proprietary information that is exposed when access to it is granted through use of the configurator. If the KB is distributed on a CD, say, it may contain only information that is public.

27.8 Conclusion

The SAP Variant Configurator (SAP VC) has been in stable use for two decades (the SAP IPC configurator for more than one). Both configurators are hybrid in that they allow both procedural and declarative modeling. They support both (conceptual/generic) constraints and BoM-oriented processing. The constraints are used to ensure arc consistency. They can be applied dynamically based on a condition and they allow setting defaults. The configurators provide explanations and decision support in resolving inconsistency, the IPC somewhat more so than the SAP VC. The SAP VC is tightly integrated with the business processes in the logistics supply chain of the underlying SAP ERP system.


1. Blumöhr U, Münch M, Ukalovic M. Variant Configuration with SAP. second ed. Galileo Press 2012.

2. Brachman RJ, McGuinness DL, Patel-Schneider PF, Resnick LA, Borgida A. Living with CLASSIC: when and how to use a KL-ONE-like language. In: Sowa J, ed. Principles of Semantic Networks: Explorations in the Representation of Knowledge. San Mateo, CA: Morgan-Kaufmann; 1991:401–456.

3. Cunis, R., 1992. The Three Level Frame Representation Schema – A Multidimensional Modular Basis for Developing Expert System Kernels. Dissertation, University of Hamburg, Germany (in German: Das 3-stufige Frame-Repräsentationsschema – eine mehrdimensional modulare Basis für die Entwicklung von Expertensystemkernen).

4. Cunis R, Günter A, Strecker H, eds. The PLAKON-Book No 266 in Informatik-Fachberichte. Heidelberg: Springer; 1991; (in German: Das PLAKON-Buch).

5. DeKleer J. An assumption-based TMS. Artificial Intelligence. 1986;28(2):127–162.

6. Dressler O. An extended basic ATMS. In: Reinfrank M, Kleer J, Ginsberg M, Sandewall E, eds. Non-monotonic Reasoning, Second International Workshop, Proceedings. Heidelberg: Springer; 1988:143–163. Lecture Notes in Computer Science vol. 346.

7. Forgy C. RETE: a fast algorithm for the many pattern/many object pattern matching problem. Artificial Intelligence. 1982;19:17–37.

8. Günter A, Cunis R, Syska I. Separating control from structural knowledge in construction expert systems. In: Proceedings of the Third International Conference on Industrial and Engineering Applications of AI and Expert Systems (IEA/AIE-90), Charleston, SC, vol. 2. 1990:601–610.

9. Güsgen, H.-W., 1988. CONSAT – A System for Constraint Satisfaction. Ph.D. thesis, Universität Kaiserslautern.

10. Haag, A., 1989. Some Remarks on the Global Behavior of Local Propagation of Intervals in Linear Constraint Nets. Tech. Rep. 32, TEX-B, Battelle Institute, Frankfurt, Germany.

11. Haag A. A practical ATMS-based problem solving method. In: Cunis R, Günter A, Strecker H, eds. The PLAKON-Book (Das PLAKON Buch). Heidelberg: Springer; 1991:212–237. (in German: Konzepte zu einer praktischen Handhabbarkeit einer ATMS-basierten Problemlösung).

12. Haag, A., 1995. The image – An Assumption Based Problem Solving Architecture Utilizing Specialization Relations. Dissertation, University of Kaiserslautern, Kaiserslautern, Germany.

13. Haag A. Sales configuration in business processes. IEEE Intelligent Systems. 1998;13(4):78–85.

14. Haag A, Riemann S. Product configuration as decision support: the declarative paradigm in practice. Artificial Intelligence for Engineering Design, Analysis and Manufacturing (AI EDAM). 2011;25(2):131–142.

15. Ho K, Güsgen HW, Hilfinger PN. Consat: a parallel constraint satisfaction system. LISP and Symbolic Computation. 1994;7(2–3):195–210.

16. Höfling B. encoway: from ERP-based to sales-oriented configuration. In: Felfernig A, Hotz L, Bagley C, Tiihonen J, eds. Knowledge-based Configuration – From Research to Business Cases. Waltham, MA: Morgan Kaufmann Publishers; 2014:219–227. (Chapter 18).

17. Hotz, L., 2009. Frame-based Knowledge Representation for Configuration, Analysis, and Diagnoses of Technical Systems. Dissertation, University of Hamburg, Hamburg, Germany (in German: Frame-basierte Wissensrepräsentation für Konfigurierung, Analyse und Diagnose technischer Systems).

18. Hotz L, Günter A. KONWERK. In: Felfernig A, Hotz L, Bagley C, Tiihonen J, eds. Knowledge-based Configuration – From Research to Business Cases. Waltham, MA: Morgan Kaufmann Publishers; 2014:281–295. (Chapter 24).

19. Hotz L, Wolter K. Beyond physical product configuration – configuration in unusual domains. AI Communications. 2013;26(1):39–66.

20. Hotz L, Felfernig A, Günter A, Tiihonen J. A short history of configuration technologies. In: Felfernig A, Hotz L, Bagley C, Tiihonen J, eds. Knowledge-based Configuration – From Research to Business Cases. Waltham, MA: Morgan Kaufmann Publishers; 2014a:9–19. (Chapter 2).

21. Hotz L, Felfernig A, Stumptner M, Ryabokon A, Bagley C, Wolter K. Configuration knowledge representation and reasoning. In: Felfernig A, Hotz L, Bagley C, Tiihonen J, eds. Knowledge-based Configuration – From Research to Business Cases. Waltham, MA: Morgan Kaufmann Publishers; 2014b:41–72. (Chapter 6).

22. Kleine Büning H, Schmitgen S. Concepts for solving the variant problem in bill of material processing. CIM Management. 1988;2:60–70 (in German: Konzepte zur Lösung des Variantenproblems in der Stücklistenverarbeitung).

23. Kühne A, Meyer-Fujara J. Task construction in expert systems. In: Cunis R, Günter A, Strecker H, eds. The PLAKON-Book (Das PLAKON Buch). Heidelberg: Springer; 1991:238–254. (in German: Aufgabenerstellung in Expertensystemen).

24. Mackworth A. Consistency in networks of relations. Artificial Intelligence. 1977;8(1):99–118.

25. McDermott J. R1: a rule-based configurer of computer systems. Artificial Intelligence. 1982;19(1):39–88.

26. Neumann B. Configuration expert systems: a case study and tutorial. In: Bunke HO, ed. Proceedings of the 1988 SGAICO Conference on Artificial Intelligence in Manufacturing, Assembly, and Robotics, Oldenbourg, Munich. 1988;27–68.

27. Neumann, Bh., 1987. Expert systems support for order and bill of materials processing in manufacturing variants. Diploma Thesis, University of Karlsruhe (TH), Germany (in German: Expertensysteme in der Auftrags- und Stücklistenverarbeitung bei variantenreicher Fertigung).

28. Neumann, Bh., 1989. An approach for knowledge-based verification of orders in mass customization for technical equipment. Dissertation, University of Duisburg-Essen, Fachbereich 11/Mathematik, Duisburg, Germany (in German: Ein Ansatz zur wissensbasierten Auftragsprüfung für technische Anlagen des Breitengeschäfts).

29. Post E. Real programmers don’t use pascal. Datamation. 1983;29(7):263–265 (Reader’s forum).

30. Schönsleben, P., 1985. Flexible Production Planning and Controlling with a Computer. CW-Publikationen (in German: Flexible Produktionsplanung und -steuerung mit dem Computer).

31. Soloway E, Bachant J, Jensen K. Assessing the maintainability of XCON-in-RIME: coping with the problem of very large rule-bases. In: Proceedings of the Sixth National Conference on Artificial Intelligence (AAAI-87), Seattle, WA. 1987:824–829. (July 13–17).

32. Steele GLJ. COMMON LISP: the Language. second ed. Digital Press 1990.

33. Strecker H. Configuration using PLAKON – an applications perspective. In: Wissensbasierte Systeme, Informatik-Fachberichte, vol. 227. Heidelberg: Springer; 1989:352–362.

34. Touretzky DS. The Mathematics of Inheritance Systems. London, Great Britain: Pitman; 1986.

1Chapter 18.

2Chapter 2.

3Chapter 2.

4In this chapter, the terms image are used interchangeably. Characteristic is the preferred term in SAP ERP. Attribute is the preferred term in AI. The term feature used in sales would be another synonym, not used here. Each of these terms may refer to an actual property of an object instance in the configuration, such as “height = 75 cm.” Or it may refer to an observable of an abstract concept, such as a class. For example, all stair posts have the observable property height.

5See SAP documentation for details on this syntax. Examples of SAP constraints with explanations are also given in Haag (1998). The angle image in the equation is taken to be in radians whereas the variable image in Figure 27.2 is taken to be in degrees.

6See Hotz et al. (2014b), Chapter 6.

7A generic constraint is similar to a conceptual constraint (see Hotz and Günter, 2014, Chapter 24).

8In the sequel I shall simply use the term constraint to refer to conceptual constraint.

9In SAP constraint syntax the variables that are subjects of constraint propagation can be declared by adding the following to the constraint in Figure 27.2: INFERENCES:?hp,?pos. This means that the other variables must be given before the constraint can be applied. (image was consciously omitted as an inference, because it makes business sense to specify its value up front.)

10Both PLAKON and imagePLAKON were implemented in LISP.

11Multiple facts each restricting the domain of the same variables may be valid at the same time. For brevity such facts are denoted as domain restrictions. A domain restriction is screened if there is another valid domain restriction that represents a smaller domain.

12Chapter 6.

13Chapter 24.

14See SAP documentation on how the BoM explosion is performed. It varies over time and by configurator.

15SAP PROLOG was never offered as a standalone product.

16Balances were computed at designated instances in the BoM hierarchy and signaled incompleteness in case of an imbalance in the part of the hierarchy below this point.

17This was dropped in the later SAP configurators for performance reasons.

18The SAP OS/2 configurator was used productively by companies whose business came to depend on it. An important group of these were mid-sized machining companies.

19Interestingly, the need to support R/3 sales by configuration capabilities vanished when SAP dramatically simplified the pricing approach. This shows that sometimes the problem itself should be reviewed, not only the way to solve it.

20See Blumöhr et al. (2012) for a comprehensive description of the SAP VC.

21PROLOG was ported to R/3 for purposes of porting the configurator engine, but the developers decided to use SAP ABAP instead. The success of this effort was an early vindication that relevant XPS techniques could be implemented in conventional programming environments with reasonable effort.

22One concept introduced due to business consideration was that of class type. Each class type defines its own separate space of classes. Apart from this, the class type controls the business functionality associated with the objects assigned to it. In Figure 27.2 the two occurrences of the string “(300)” denote the class type image, which denotes the normal class type for configurable materials in SAP ERP.

23They turned to SAP because they wanted to keep the benefits of the SAP VC (the unity of model and master data and the assurance that correct configurations could be manufactured as promised). They organized in the American Configurator Work Group (ACWG). Since then, the group has become international and is now simply called the CWG. There is a section devoted to the CWG in Blumöhr et al. (2012).

24Initially, there was no plan to create a UI, as it was thought that for sales all customers would prefer to create their own. However, it soon became apparent that without a UI nothing could be tested or demoed. Subsequently, the quest for a generic UI that could be easily customized by customers was a perennial topic in the further development of this configurator.