Knowledge Area: Systems Fundamentals

Systems Fundamentals
Lead Author: Rick Adcock, Contributing Authors: Janet Singer, Duane Hybertson

This knowledge area (KA) provides a guide to some of the most important knowledge about a system, which forms
part of systems thinking and acts as a foundation for the related worlds of integrative systems science and systems
approaches to practice.
This is part of the wider systems knowledge, which can help to provide a common language and intellectual
foundation and make practical systems concepts, principles, patterns and tools accessible to systems engineering
(SE) as discussed in Part 2: Foundations of Systems Engineering.

Topics
Each part of the SEBoK is divided into KAs, which are groupings of information with a related theme. The KAs, in
turn, are divided into topics. This KA contains the following topics:
• Introduction to System Fundamentals
• Types of Systems
• Complexity
• Emergence
• Fundamentals for Future Systems Engineering

Introduction
The word system is used in many areas of human activity and at many levels. But what do systems researchers and
practitioners mean when they use the word system? Is there some part of that meaning common to all applications?
The following diagram summarizes the ways in which this question is explored in this KA.

Figure 1. System Fundamentals and Engineered Systems. (SEBoK Original)

The concepts of open system and closed system are explored. Open systems, described by a set of elements and
relationships, are used to describe many real world phenomena. Closed systems have no interactions with their
environment. Two particular aspects of systems, complexity and emergence, are described in this KA. Between
them, these two concepts represent many of the challenges which drive the need for systems thinking and an
appreciation of systems science in SE.
Some systems classifications, characterized by type of element or by purpose, are presented.
An engineered system is defined within the SEBoK as encompassing combinations of technology and people in the
context of natural, social, business, public or political environments, created, used and sustained for an identified
purpose. The application of the Systems Approach Applied to Engineered Systems requires the ability to position
problems or opportunities in the wider system containing them, to create or change a specific engineered
system-of-interest, and to understand and deal with the consequences of these changes in appropriate wider systems.
The concept of a system context allows all of the system elements and relationships needed to support this to be
identified.
The discussion of engineered system contexts includes the general idea of groups of systems to help deal with
situations in which the elements of an engineered system are themselves independent engineered systems. To help
provide a focus for the discussions of how SE is applied to real world problems, four engineered system contexts are
introduced in the KA:
1. product system context
2. service system context
3. enterprise system context
4. system of systems (SoS) context
The details of how SE is applied to each of these contexts are described in Part 4: Applications of Systems
Engineering.

References

Works Cited
None.

Primary References
Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications, rev. ed. New York,
NY, USA: Braziller.
Magee, C. L., O.L. de Weck. 2004. "Complex system classification." Proceedings of the 14th Annual International
Council on Systems Engineering International Symposium, Toulouse, France, 20-24 June 2004.
Rebovich, G., and B.E. White (eds.). 2011. Enterprise Systems Engineering: Advances in the Theory and Practice.
Boca Raton, FL, USA: CRC Press.
Sheard, S.A. and A. Mostashari. 2009. "Principles of complex systems for systems engineering." Systems
Engineering, vol. 12, no. 4. pp. 295-311.
Tien, J.M. and D. Berg. 2003. "A case for service systems engineering." Journal of Systems Science and Systems
Engineering, vol. 12, no. 1, pp. 13-38.

Additional References
None.


Introduction to System Fundamentals


Lead Author: Rick Adcock, Contributing Authors: Brian Wells, Scott Jackson, Janet Singer, Duane Hybertson

This article forms part of the Systems Fundamentals knowledge area (KA). It provides various perspectives on
systems, including definitions, scope, and context.
This article provides a guide to some of the basic concepts of systems developed by systems science and discusses
how these relate to the definitions to be found in systems engineering (SE) literature. The concept of an engineered
system is introduced as the system context of critical relevance to SE.

Overview
In the System Fundamentals KA we will define some terms and ideas which are foundational to the understanding
and practice of Systems Engineering (SE). In particular, a number of views of system are explored; these are
summarized below and described in more detail with links to relevant references in the rest of this article.
• A simple definition of System is any set of related parts for which there is sufficient coherence between the
parts to make viewing them as a whole useful. If we consider more complex situations in which the parts of a
system can also be viewed as systems, we can identify useful common systems concepts to aid our understanding.
This allows the creation of systems theories, models and approaches useful to anyone trying to understand, create
or use collections of related things, independent of what the system is made of or the application domain
considering it.
• Many of these common systems ideas relate to complex networks or hierarchies of related system elements. A
System Context is a set of system interrelationships associated with a particular system of interest (SoI)
within a real world environment. One or more views of a context allow us to focus on the SoI but not lose sight
of its broader, holistic relationships and influences. Context can be used for many kinds of system but is
particularly useful for scoping problems and enabling the creation of solutions which combine people and
technology and operate in the natural world. These are referred to as socio-technical system contexts.
• Systems Engineering is one of the disciplines interested in socio-technical systems across their whole life. This
includes where problems come from and how they are defined, how we identify and select candidate solutions,
how to balance technology and human elements in the wider solution context, how to manage the complex
organizational systems needed to develop new solutions, and how developed solutions are used, sustained and
disposed of. To support this, we define an Engineered System as a socio-technical system which is the focus of
a Systems Engineering life cycle.
• While SE is focused on the delivery of an engineered system of interest, the systems engineer should consider the full
Engineered System Context so that the necessary understanding can be reached and the right systems
engineering decisions can be made across each Life Cycle.

A General View of Systems


The idea of a system whole can be found in both Western and Eastern philosophy. Many philosophers have
considered notions of holism, the concept that ideas, people or things must be considered in relation to the things
around them to be fully understood (M’Pherson 1974).
One influential systems science definition of a system comes from general system theory (GST):
A System is a set of elements in interaction. (Bertalanffy 1968)
The parts of a system may be conceptual organizations of ideas in symbolic form or real objects. GST considers
abstract systems to contain only conceptual elements and concrete systems to contain at least two elements that are
real objects, e.g. people, information, software, and physical artifacts, etc.
Similar ideas of wholeness can be found in systems engineering literature. For example:
We believe that the essence of a system is 'togetherness', the drawing together of various parts and the
relationships they form in order to produce a new whole… (Boardman and Sauser 2008).
The cohesive interactions between a set of parts suggest a system boundary and define what membership of the
system means. For closed systems, all aspects of the system exist within this boundary. This idea is useful for
abstract systems and for some theoretical system descriptions.
The boundary of an open system defines the elements and relationships which can be considered part of the system and
describes how these elements interact across the boundary with related elements in the environment. The
relationships among the elements of an open system can be understood as a combination of the system's structure and
behavior. The structure of a system describes a set of system elements and the allowable relationships between them.
System behavior refers to the effects or outcomes produced when an instance of the system interacts with its
environment. An allowable configuration of the relationships among elements is referred to as a system state. A
stable system is one which returns to its original, or another stable, state following a disturbance in the environment.
System wholes often exhibit emergence, behavior which is meaningful only when attributed to the whole, not
to its parts (Checkland 1999).
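A minimal sketch, assuming nothing beyond the concepts just introduced, may help make these ideas concrete. The Python fragment below represents a system's structure as a set of allowable relationships, its state as the current configuration of those relationships, and a whole-system property (connectedness) that stands in for emergence; all names are illustrative assumptions, not SEBoK terminology.

```python
# Minimal illustrative sketch of the open-system ideas above.
# All class and attribute names are hypothetical, chosen for illustration only.

class OpenSystem:
    def __init__(self, elements, allowed_relationships):
        self.elements = set(elements)              # elements inside the boundary
        self.allowed = set(allowed_relationships)  # structure: allowable relationships
        self.state = frozenset()                   # state: current configuration of relationships

    def configure(self, relationships):
        """Move the system into a new state (an allowable configuration of relationships)."""
        rels = frozenset(relationships)
        if not rels <= self.allowed:
            raise ValueError("configuration uses relationships outside the system structure")
        self.state = rels

    def is_connected(self):
        """A whole-system property: meaningful for the whole, not for any single element."""
        if not self.elements:
            return False
        reached, frontier = set(), [next(iter(self.elements))]
        while frontier:
            node = frontier.pop()
            if node in reached:
                continue
            reached.add(node)
            frontier.extend(b for a, b in self.state if a == node)
            frontier.extend(a for a, b in self.state if b == node)
        return reached == self.elements

# Example: connectedness belongs to the configuration of the whole, not to "A", "B" or "C".
s = OpenSystem({"A", "B", "C"}, {("A", "B"), ("B", "C"), ("A", "C")})
s.configure({("A", "B"), ("B", "C")})
print(s.is_connected())  # True
```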
The identification of a system and its boundary is ultimately the choice of the observer. This may be through
observation and classification of sets of elements as systems, through an abstract conceptualization of one or more
possible boundaries and relationships in a given situation, or a mixture of this concrete and conceptual thinking. This
underlines the fact that any particular identification of a system is a human construct used to help make better sense
of a set of things and to share that understanding with others if needed.
Many natural, social and man made things can be better understood by viewing them as open systems. One of the
reasons we find the idea of systems useful is that it is possible to identify shared concepts which apply to many
system views. These recurring concepts or isomorphies can give useful insights into many situations, independently
of the kinds of elements of which a particular system is composed. The ideas of structure, behavior, emergence and
state are examples of such concepts. The identification of these shared system ideas is the basis for systems thinking,
and their use in developing theories and approaches in a wide range of fields of study is the basis for the systems sciences.
Systems Engineering (SE), and a number of other related disciplines use systems concepts, patterns and models in
the creation of useful outcomes or things. The concept of a network of open systems created, sustained and used to
achieve a purpose within one or more environments is a powerful model that can be used to understand many
complex real world situations and provide a basis for effective problem solving within them.

System Context
Bertalanffy (1968) divided open systems into nine real world types ranging from static structures and control
mechanisms to socio-cultural systems. Other similar classification systems are discussed in the article Types of
Systems.
The following is a simple classification of system elements which we find at the heart of many of these
classifications:
• Natural system elements, objects or concepts which exist outside of any practical human control. Examples: the
real number system, the solar system, planetary atmosphere circulation systems.
• Social system elements, either abstract human types or social constructs, or concrete individuals or social groups.
• Technological System elements, man-made artifacts or constructs; including physical hardware, software and
information.
While the above distinctions can be made as a general abstract classification, in reality there are no hard and fast
boundaries between these types of systems: e.g., natural systems are operated by, developed by, and often contain
social systems, which depend on technical systems to fully realize their purpose. Systems which contain technical
and either human or natural elements are often called socio-technical systems. The behavior of such systems is
determined both by the nature of the technical elements and by their ability to integrate with or deal with the
variability of the natural and social systems around them.
Many of the original ideas upon which GST and other branches of system study are based come from the study of
systems in the natural and social sciences. Many natural and social systems are initially formed as simple structures
through the inherent cohesion among a set of elements. Once formed, they will tend to stay in this structure, as well
as combine and evolve further into more complex stable states to exploit this cohesion in order to sustain themselves
in the face of threats or environmental pressures. Such complex systems may exhibit specialization of elements, with
elements taking on roles which contribute to the system purpose, but losing some or all of their separate identity
outside the system. Such roles might include management of resources, defense, self-regulation or problem solving,
and control. Natural and social systems can be understood through an understanding of this wholeness, cohesion and
specialization. They can also be guided towards the development of behaviors which not only enhance their basic
survival, but also fulfill other goals of benefit to them or the systems around them. In The Architecture of
Complexity, Simon (1962) has shown that natural or social systems which evolve via a series of stable “hierarchical
intermediate forms” will be more successful and resilient to environmental change.
Thus, it is often true that the environment in which a particular system sits and the elements of that system can
themselves be considered as open systems. It can be useful to consider collections of related elements as both a
system and a part of one or more other systems. For example, a “holon” or system element was defined by Koestler
as something which exists simultaneously as a whole and as a part (Koestler 1967). At some point, the nature of the
relationships between elements within and across boundaries in a hierarchy of systems may lead to complex
structures and emergent behaviors which are difficult to understand or predict. Such complexity can often best be
dealt with not only by looking for more detail, but also by considering the wider open system relationships.

Figure 1: General description of System Context (SEBoK Original)

A system context describes all of the external elements which interact across the boundary of a particular system of
interest (SoI) and a sufficient view of the elements within its boundary, to allow the SoI to be better understood as
part of a wider systems whole. To fully understand the context, we also need to identify the environment in which
the SoI and wider system sit and the systems in the environment which influence them.
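One rough way to capture such a context description is sketched below: the structure separates the elements inside the SoI boundary from the wider-system elements the SoI interacts with and from the environmental influences on both. The class and field names are assumptions made for this sketch, not a SEBoK data model.

```python
# Hypothetical sketch of a system context description; all names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class SystemContext:
    soi_elements: set = field(default_factory=set)       # elements inside the SoI boundary
    external_elements: set = field(default_factory=set)  # wider-system elements the SoI interacts with
    interactions: set = field(default_factory=set)       # (soi_element, external_element) pairs
    environment: set = field(default_factory=set)        # influences on the SoI and the wider system

    def boundary_crossings(self):
        """Interactions that cross the SoI boundary, which a context view must make visible."""
        return {(a, b) for a, b in self.interactions
                if a in self.soi_elements and b in self.external_elements}

# Example: a navigation unit (SoI) viewed in the context of a vehicle and its driver.
ctx = SystemContext(
    soi_elements={"receiver", "processor"},
    external_elements={"vehicle bus", "driver"},
    interactions={("processor", "vehicle bus"), ("processor", "driver")},
    environment={"urban canyon", "traffic regulations"},
)
print(ctx.boundary_crossings())
```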
Many man-made systems are designed as networks and hierarchies of related system elements to achieve desirable
behaviors and the kinds of resilience seen in natural systems. While such systems can be deliberately created to
take advantage of system properties such as holism and stability, they must also consider system challenges such as
complexity and emergence. Considering different views of a SoI and its context over its life can help enable this
understanding. Considering systems in context allows us to focus on a SoI while maintaining the necessary wider,
holistic systems perspective. This is one of the foundations of the Systems Approach described in SEBoK part 2,
which forms a foundation of systems engineering.

Systems and Systems Engineering


Some of the systems ideas discussed above form part of the systems engineering body of knowledge. Systems
engineering literature, standards and guides often refer to “the system” to characterize a socio-technical system with a
defined purpose as the focus of SE, e.g.
• “A system is a value-delivering object” (Dori 2002).
• “A system is an array of components designed to accomplish a particular objective according to plan” (Johnson,
Kast, and Rosenzweig 1963).
• “A system is defined as a set of concepts and/or elements used to satisfy a need or requirement" (Miles 1973).
The International Council on Systems Engineering Handbook (INCOSE 2015) generalizes this idea, defining system
as “an interacting combination of elements to accomplish a defined objective. These include hardware, software,
firmware, people, information, techniques, facilities, services, and other support elements." While these definitions
cover the socio-technical systems created by SE, it is also necessary to consider the natural or social problem
situations in which these systems sit, the social systems which develop, sustain and use them, and the
commercial or public enterprises in which these all sit as systems (Martin 2004).
Hence, while many SE authors talk about systems and systems ideas, they are often based on a particular world view
which relates to engineered artifacts. It would also be useful to take a broader view of the context in which these
artifacts sit, and to consider through-life relationships as part of that context. To help promote this, the SEBoK will
attempt to be more precise with its use of the word system, and distinguish between general systems principles and
the specific socio-technical systems created by SE.
The term socio-technical system is used by many in the systems community and may have meanings outside of that
relevant to SE. Hence, we will define an engineered system as a socio-technical system which forms the primary focus or
system of interest (SoI) for an application of SE. A SE life cycle will consider an engineered system context, from
initial problem formulation through to final safe removal from use (INCOSE 2015). A more detailed discussion of
engineered system context and how it relates to the foundations of systems engineering practice can be found below.

Introduction to Engineered Systems


An engineered system defines a context containing both technology and social or natural elements, developed for a
defined purpose by an engineering life cycle.
Engineered system contexts:
• are created, used and sustained to achieve a purpose, goal or mission that is of interest to an enterprise, team, or
an individual.
• require a commitment of resources for development and support.
• are driven by stakeholders with multiple views on the use or creation of the system, or with some other stake in
the system, its properties or existence.
• contain engineered hardware, software, people, services, or a combination of these.
• exist within an environment that impacts the characteristics, use, sustainment and creation of the system.
Engineered systems typically:
• are defined by their purpose, goal or mission.
• have a life cycle and evolution dynamics.
• may include human operators (interacting with the systems via processes) as well as other social and natural
components that must be considered in the design and development of the system.
• are part of a system-of-interest hierarchy.
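As a minimal illustration of the characteristics listed above, the hypothetical Python record below captures purpose, stakeholders, elements, environment and life cycle stage, and reports which characteristics of an engineered system description are still missing; the field names are assumptions for this sketch, not SEBoK terminology.

```python
# Illustrative record of the engineered-system characteristics listed above.
# Field names are assumptions made for this sketch, not a SEBoK schema.
from dataclasses import dataclass, field

@dataclass
class EngineeredSystemDescription:
    purpose: str = ""                                 # goal or mission the system is created for
    stakeholders: list = field(default_factory=list)  # parties with views on its creation, use or existence
    elements: list = field(default_factory=list)      # hardware, software, people, services, or a mix
    environment: str = ""                             # context shaping use, sustainment and creation
    life_cycle_stage: str = "concept"                 # engineered systems have a life cycle and evolve

    def missing_characteristics(self):
        """Report which of the characteristics above have not yet been described."""
        checks = {
            "purpose": self.purpose,
            "stakeholders": self.stakeholders,
            "elements": self.elements,
            "environment": self.environment,
        }
        return [name for name, value in checks.items() if not value]

desc = EngineeredSystemDescription(purpose="provide door-to-door passenger transport")
print(desc.missing_characteristics())  # ['stakeholders', 'elements', 'environment']
```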
Open systems are a useful way to understand many complex situations. Traditional engineering disciplines have
become very good at building up detailed models and design practices to deal with the complexity of tightly
integrated collections of elements within a technology domain. It is possible to model the seemingly random
integration of lots of similar elements using statistical approaches. Systems Engineering makes use of both these
aspects of system complexity, as discussed in the Complexity article.
SE also considers the complexity of relatively small numbers of elements taken from a range of design disciplines
together with people who may not always be experienced or have detailed training in their use. Such engineered
systems may be deployed in uncertain or changing environments and be used to help people achieve a number of
loosely defined outcomes. Relatively small changes in the internal working of these engineered systems’ elements, or
in how those elements are combined, may lead to the emergence of complex or unexpected outcomes. It can be
difficult to predict and design for all such outcomes during an engineered system’s creation, or to respond to them
during its use. Iterative life cycle approaches which explore the complexity and emergence over a number of cycles
of development and use are needed to deal with this aspect of complexity. The ways in which systems engineering deals
with these aspects of complexity in the definition of the life cycle and life cycle processes applied to an engineered
system context are fully explored in Part 3.

Life Cycle Definitions


As well as being a kind of system, an engineered system is also the focus of a life cycle and hence part of a
commercial transaction. Historically,
Economists divide all economic activity into two broad categories, goods and services.
Goods-producing industries are agriculture, mining, manufacturing, and construction; each of them
creates some kind of tangible object. Service industries include everything else: banking,
communications, wholesale and retail trade, all professional services such as engineering, computer
software development, and medicine, nonprofit economic activity, all consumer services, and all
government services, including defense and administration of justice.... (Encyclopedia Britannica 2011).
The following diagram defines some terms related to an engineered system life cycle and the development of goods
(products) and services.

Figure 2: Life Cycle Terminology (modified from Henshaw et al. 2011, "Capability Engineering – An Analysis of Perspectives", used with permission)

In the above figure, the capability needed to enable an enterprise to achieve its goals is delivered by the synchronized
use of services. Those services are provided by a service system, which is created, sustained and deployed by one or
more organizations. A service system is composed of people, technology, information, and access to related services
and other necessary resources. Some of these resources are provided by enabling services and the technological
elements may be developed and supplied as product systems. An enterprise system describes a collection of related
capabilities and associated services which together enable the achievement of the overall purpose of an enterprise as
a government, business or societal entity. Measurement and review of enterprise goals may define needs for change
which require an organization to acquire or modify, and integrate the elements needed to evolve its service systems.
The general terminology above is described briefly in the associated glossary definitions and expanded in related
articles in Part 4: Applications of Systems Engineering.
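The relationships summarized in Figure 2 can be loosely illustrated in code. The sketch below encodes only the containment and dependency chain described above (capabilities delivered by service systems, which use people, products and enabling services, all within an enterprise system); the class and field names are assumptions made for this illustration rather than SEBoK definitions.

```python
# Rough sketch of the containment/dependency chain described for Figure 2.
# All class and field names are assumptions of this illustration, not SEBoK definitions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProductSystem:
    name: str                                                    # technological element supplied as a product

@dataclass
class ServiceSystem:
    name: str
    people: List[str] = field(default_factory=list)
    products: List[ProductSystem] = field(default_factory=list)  # technological elements it relies on
    enabling_services: List[str] = field(default_factory=list)   # resources provided by other services

@dataclass
class Capability:
    goal: str
    services: List[ServiceSystem] = field(default_factory=list)  # delivered by synchronized services

@dataclass
class EnterpriseSystem:
    purpose: str
    capabilities: List[Capability] = field(default_factory=list)

# Example: a delivery enterprise whose "same-day delivery" capability is delivered by a
# logistics service system that uses a routing product, drivers, and an enabling training service.
routing = ProductSystem("route planner")
logistics = ServiceSystem("logistics service", people=["drivers"], products=[routing],
                          enabling_services=["driver training"])
enterprise = EnterpriseSystem("parcel delivery",
                              capabilities=[Capability("same-day delivery", [logistics])])
print(enterprise.capabilities[0].services[0].products[0].name)  # route planner
```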

Engineered System Context


Engineered systems are developed as combinations of products and services within a life cycle. The figure below
gives a general view of the full context for any potential application of a SE life cycle.

Figure 3: General Engineered System Context (SEBoK original)

In this view a service system related directly to a capability need sets the overall boundary. This need establishes the
problem situation or opportunity which encapsulates the starting point of any life cycle. Within this service system
are the related services, products and people (or intelligent software agents) needed to fully deliver a solution to that
need. The environment includes any people, organizations, rules or conditions which influence or constrain the
service system or the things within it. The SoI for a particular SE life cycle may be defined at any level of this
general context. While the focus of the context will vary for each life cycle it is important that some version of this
general context is considered for all SE life cycles, to help maintain a holistic view of problem and solution. This is
discussed in Types of Systems.
An engineered system context describes the context for a SoI so that the necessary understanding can be reached and
the right systems engineering decisions can be made across the life of that SoI. This will require a number of
different views of the context across a SE life cycle, both to identify all external influences on the SoI and to guide
and constrain the systems engineering of the elements of the SoI. A full engineered system context will include the
problem situation from which a need for a SoI is identified, one or more socio-technical solutions, the organizations
needed to create and sustain new solutions, and the operational environment within which those solutions must be
integrated, used and eventually disposed of. The kinds of views which can be used to represent a SoI context over its
life, and how those views can be combined into models, are discussed in the Representing Systems with Models KA in
Part 2. The activities which use those models are described conceptually in the Systems Approach Applied to
Engineered Systems KA in Part 2 and related to more formal SE life cycle processes in Part 3.

References

Works Cited
Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications, rev. ed. New York:
Braziller.
Boardman, J. and B. Sauser. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL, USA:
Taylor & Francis.
Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY, USA: Wiley and Sons, Inc.
Dori, D. 2002. Object-Process Methodology – A Holistic Systems Paradigm. New York, NY, USA: Springer.
Henshaw, M., D. Kemp, P. Lister, A. Daw, A. Harding, A. Farncombe, and M. Touchin. 2011. "Capability
engineering – An analysis of perspectives." Presented at International Council on Systems Engineering (INCOSE)
21st International Symposium, Denver, CO, USA, June 20-23, 2011.
Hitchins, D. 2009. “What are the general principles applicable to systems?” INCOSE Insight, vol. 12, no. 4, pp.
59-63.
INCOSE. 2015. Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, version
4.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.
Johnson, R.A., F.W. Kast, and J.E. Rosenzweig. 1963. The Theory and Management of Systems. New York, NY,
USA: McGraw-Hill Book Company.
Koestler, A. 1990. The Ghost in the Machine, 1990 reprint ed. Sturgis, Michigan, USA: Penguin Group.
Martin, J. 2004. "The seven samurai of systems engineering: Dealing with the complexity of 7 interrelated systems."
Proceedings of the 14th Annual International Council on Systems Engineering International Symposium, Toulouse,
France, 20-24 June 2004.
Miles, R.F. (ed). 1973. System Concepts. New York, NY, USA: Wiley and Sons, Inc.
M’Pherson, P.K. 1974. "A perspective on systems science and systems philosophy." Futures. Vol. 6, no. 3, pp.
219-39.
Simon, H.A. 1962. "The architecture of complexity." Proceedings of the American Philosophical Society. Vol. 106,
no. 6 (Dec. 12, 1962), pp. 467-482.

Primary References
Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications, rev. ed. New York,
NY, USA: Braziller.
INCOSE. 2015. Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, version
4.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

Additional References
Hybertson, Duane. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and
Complex Systems. Boca Raton, FL, USA: CRC Press.
Hubka, Vladimir, and W. E. Eder. 1988. Theory of Technical Systems: A Total Concept Theory for Engineering
Design. Berlin: Springer-Verlag.
Laszlo, E., ed. 1972. The Relevance of General Systems Theory: Papers Presented to Ludwig von Bertalanffy on His
Seventieth Birthday. New York, NY, USA: George Brazillier.


Types of Systems
Lead Author: Rick Adcock, Contributing Authors: Brian Wells, Scott Jackson

This article forms part of the Systems Fundamentals knowledge area (KA). It provides various perspectives on
system classifications and types of systems, expanded from the definitions presented in What is a System?.
The modern world has numerous kinds of systems that influence daily life. Some examples include transport
systems; solar systems; telephone systems; the Dewey Decimal System; weapons systems; ecological systems; space
systems; etc. Indeed, it seems there is almost no end to the use of the word “system” in today’s society.
This article considers the different classification systems which some systems science authors have proposed in an
attempt to extract some general principles from these multiple occurrences. These classification schemes look at
either the kinds of elements from which the system is composed or its reason for existing.
The idea of an engineered system is expanded. Four specific types of engineered system context are generally
recognized in systems engineering: product system, service system, enterprise system and system of systems.

System Classification
A taxonomy is "a classification into ordered categories" (Dictionary.com 2011). Taxonomies are useful ways of
organizing large numbers of individual items so their similarities and differences are apparent. No single standard
systems taxonomy exists, although several attempts have been made to produce a useful classification taxonomy, e.g.
(Bertalanffy 1968) and (Miller 1986).
Kenneth Boulding (1956), one of the founding fathers of general system theory, developed a systems
classification which has been the starting point for much of the subsequent work. He classifies systems into nine
types:
1. Structures (Bridges)
2. Clock works (Solar system)
3. Controls (Thermostat)
4. Open (Biological cells)
5. Lower organisms (Plants)
6. Animals (Birds)
7. Man (Humans)
8. Social (Families)
9. Transcendental (God)
These approaches also highlight some of the subsequent issues with these kinds of classification. Boulding implies
that physical structures are closed and natural while social ones are open. However, a bridge can only be understood
by considering how it reacts to traffic crossing it, and it must be sustained or repaired over time (Hitchins 2007).
Boulding also separates humans from animals, which would not fit into more modern thinking.
Peter Checkland (Checkland 1999, 111) divides systems into five classes: natural systems, designed physical
systems, designed abstract systems, human activity systems and transcendental systems. The first two classes are
self-explanatory.
• Designed abstract systems – These systems do not contain any physical artifacts but are designed by humans to
serve some explanatory purpose.
• Human activity systems – These systems are observable in the world of innumerable sets of human activities
that are more or less consciously ordered in wholes as a result of some underlying purpose or mission. At one
extreme is a system consisting of a human wielding a hammer. At the other extreme lies international political
systems.
• Transcendental systems – These are systems that go beyond the aforementioned four systems classes, and are
considered to be systems beyond knowledge.
Checkland refers to these five systems as comprising a “systems map of the universe”. Other, similar categorizations
of system types can be found in (Aslaksen 1996), (Blanchard 2005) and (Giachetti 2009).
Magee and de Weck (2004) provide a comprehensive overview of sources on system
classification such as (Maier and Rechtin 2009), (Paul 1998) and (Wasson 2006). They cover some methods for
classifying natural systems, but their primary emphasis and value to the practice of systems engineering is in their
classification method for human-designed, or man-made, systems. They examine many possible methods that
include: degree of complexity, branch of the economy that produced the system, realm of existence (physical or in
thought), boundary, origin, time dependence, system states, human involvement / system control, human wants,
ownership and functional type. They conclude by proposing a functional classification method that sorts systems by
their process (transform, transport, store, exchange, or control), and by the entity on which they operate (matter,
energy, information and value).
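The functional scheme proposed by Magee and de Weck can be expressed compactly in code. The sketch below simply pairs the five processes with the four operands quoted above; everything beyond those two lists (the enum and function names, the examples) is an assumption of this illustration.

```python
# Illustrative encoding of the functional classification quoted above: a system is
# classified by its process and by the operand on which it operates. Only the two
# lists of values come from the text; the names and helper are assumptions.
from enum import Enum

class Process(Enum):
    TRANSFORM = "transform"
    TRANSPORT = "transport"
    STORE = "store"
    EXCHANGE = "exchange"
    CONTROL = "control"

class Operand(Enum):
    MATTER = "matter"
    ENERGY = "energy"
    INFORMATION = "information"
    VALUE = "value"

def classify(process: Process, operand: Operand) -> str:
    """Return the functional class of a system as a process/operand pair."""
    return f"{process.value} {operand.value}"

# Examples: a freight railway transports matter; a bank exchanges value; an archive stores information.
print(classify(Process.TRANSPORT, Operand.MATTER))   # transport matter
print(classify(Process.EXCHANGE, Operand.VALUE))     # exchange value
print(classify(Process.STORE, Operand.INFORMATION))  # store information
```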

Types of Engineered System


The figure below is a general view of the context for any potential application of an engineered system life cycle.

Figure 1: General types of Engineered System of Interest (SoI) (SEBoK original)



Figure 1 shows four general cases of system of interest (SoI) which might be the focus of a life cycle.
• A technology-focused product system SoI embedded within one or more integrated products;
• An integrated multi-technology product system SoI used directly to help provide a service;
• An enabling service system SoI supporting multiple service systems;
• A service system SoI created and sustained to directly deliver capability.

Products and Product Systems


The word product is defined as "a thing produced by labor or effort; or anything produced" (Oxford English
Dictionary). In a commercial sense a product is anything which is acquired, owned and sustained by an organization
and used by an enterprise (hardware, software, information, personnel, etc.).
A product system is an engineered system in which the focus of the life cycle is to develop and deliver products to an
acquirer for internal or external use to directly support the delivery of services needed by that acquirer.
A product system life cycle context will describe a technology-focused SoI plus the related products, people and
services with which the SoI is required to interact. Note that the people associated with a product system over its life
(e.g., operators, maintainers, producers, etc.) sit outside of the product SoI, since they are not delivered as part of the
product. However, to develop a successful product, it is essential to fully understand its human interfaces and
influences as part of its context. The product context will also define the service systems within which it will be
deployed to help provide the necessary capability to the acquiring enterprise.
In a product life cycle, this wider context defines the fixed and agreed relationships within which the SoI must
operate, and the environmental influences within which the life cycle must be delivered. This gives the product
developer the freedom to make solution choices within that context and to ensure these choices fit into and do not
disrupt the wider context.
A product life cycle may need to recommend changes to enabling services such as recruitment and training of
people, or other infrastructure upgrades. Appropriate mechanisms for the implementation of these changes must be
part of the agreement between acquirer and supplier and be integrated into the product life cycle. A product life cycle
may also suggest changes in the wider context which would enhance the product’s ownership or use, but those
changes need to be negotiated and agreed with the relevant owners of the systems they relate to before they can be
added to the life cycle outputs.
A more detailed discussion of the system theory associated with product systems can be found in History of Systems
Science and an expansion of the application of systems engineering to product systems in the Product Systems
Engineering KA in Part 4.

Services and Service Systems


A service can be simply defined as an act of help or assistance, or as any outcome required by one or more users
which can be defined in terms of outcomes and quality of service without detail to how it is provided (e.g., transport,
communications, protection, data processing, etc.). Services are processes, performances, or experiences that one
person or organization does for the benefit of another, such as custom tailoring a suit; cooking a dinner to order;
driving a limousine; mounting a legal defense; setting a broken bone; teaching a class; or running a business’s
information technology infrastructure and applications. In all cases, service involves deployment of knowledge and
skills (competencies) that one person or organization has for the benefit of another (Lusch and Vargo 2006), often
done as a single, customized job. To be successful, service requires substantial input from the client and related
stakeholders, often referred to as the co-creation of value (Sampson 2001). For example, how can a steak be
customized unless the customer tells the waiter how the customer wants the steak prepared?
A service system (glossary) is an engineered system created and sustained by an organization that provides outcomes
for clients within an enterprise. A service system context contains the same kinds of system elements as a product
system context but allows greater freedom for what can be created or changed to deliver the required service.
A service system life cycle may deliver changes to how existing products and other services are deployed and used.
It may also identify the need to modify existing products or create new products, in which case it may initiate a
related product life cycle. In most cases the service developer will not have full freedom to change all aspects of the
service system context without some negotiation with related system element owners. In particular, people and
infrastructure are part of the service context and changes to how system elements are used to provide desired
outcomes are part of the service life cycle scope.
The description of product system context above might be viewed as a special case of a service system context in
which a specific product is created and integrated into a fixed service system by an organization and used by an
enterprise directly related to the organization to provide a capability.
In a general service system context, it is not necessary to deliver all hardware or software products to the service
provider. In some cases, some of the hardware, software or human elements may be owned by a third party who is
not responsible for the service directly but provides enabling outputs to a number of such services. In other cases, the
whole service may be provided by an organization that is completely separate from the enterprise which needs the
service. Nor is it necessary for the exact versions of products or enabling services to be defined and integrated prior
to service delivery. Some service system elements can be selected and integrated closer to the point of use. To allow
for this late configuration of a service system, it will contain some method of discovery by which appropriate
available elements can be found, and an overall service management element to implement and direct each instance
of the service system. The use of a service system approach gives greater freedom for acquirers in how they obtain
and support all of the capital equipment, software, people, etc. in order to obtain the capabilities needed to satisfy
users.
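The "discovery" and "service management" ideas above can be illustrated with a small sketch: a registry through which available elements can be found close to the point of use, and a management element that assembles an instance of the service from whatever is currently available. All names and the trivial selection policy are assumptions of this sketch, not a defined service system architecture.

```python
# Minimal sketch of the 'discovery' and 'service management' ideas described above.
# All names and the trivial selection policy are assumptions of this illustration.

class ServiceRegistry:
    def __init__(self):
        self._providers = {}  # capability name -> list of available providers

    def register(self, capability, provider):
        self._providers.setdefault(capability, []).append(provider)

    def discover(self, capability):
        """Find elements currently able to provide the requested capability."""
        return self._providers.get(capability, [])

class ServiceManager:
    """Implements and directs one instance of the service from what is available now."""
    def __init__(self, registry):
        self.registry = registry

    def instantiate(self, required_capabilities):
        instance = {}
        for capability in required_capabilities:
            candidates = self.registry.discover(capability)
            if not candidates:
                raise LookupError(f"no element available for capability: {capability}")
            instance[capability] = candidates[0]  # simplistic selection policy for the sketch
        return instance

registry = ServiceRegistry()
registry.register("payment", "third-party payment provider")
registry.register("routing", "in-house routing product v2")
manager = ServiceManager(registry)
print(manager.instantiate(["payment", "routing"]))
```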
Services have been part of the language of systems engineering (SE) for many years, either as a way to describe the
context of a product-focused life cycle or to describe commercial arrangements for the 'outsourcing' of product
ownership and operation to others. The use of the term service system in more recent times is often associated with
software configurable and information intensive systems, i.e.,
...unique features that characterize services – namely, services, especially emerging services, are
information-driven, customer-centric, e-oriented, and productivity-focused. (Tien and Berg 2003, 13)
A more detailed discussion of the system theory associated with service systems can be found in History of Systems
Science and an expansion of the application of systems engineering to service systems in the Service Systems
Engineering KA in Part 4.

Enterprises and Enterprise Systems


An enterprise is one or more organizations or individuals sharing a definite mission, goals, and objectives to offer an
output such as a product or service.
An enterprise system consists of a purposeful combination (network) of interdependent resources (e.g., people;
processes; organizations; supporting technologies; and funding) that interact with each other (e.g., to coordinate
functions; share information; allocate funding; create workflows; and make decisions) and their environment(s), to
achieve business and operational goals through a complex web of interactions distributed across geography and time
(Rebovich and White 2011).
Both product and service systems require an enterprise system to create them and an enterprise to use the product
system to deliver services, either internally to the enterprise or externally to a broader community.
Enterprise systems are unique, compared to product and service systems, in that they are constantly evolving; they
rarely have detailed configuration controlled requirements; they typically have the goal of providing shareholder
value and customer satisfaction, which are constantly changing and are difficult to verify; and they exist in a context
(or environment) that is ill-defined and constantly changing.

While an enterprise system cannot be described using the general system context above, an enterprise may wish to
create a model of the capabilities and services it needs to achieve its strategy and goals. Such a model can be
extended to describe a baseline of service system and product system contexts related to its current capabilities, and
to proposed future capabilities. These are referred to as enterprise architectures or enterprise reference architectures.
A more detailed discussion of the system theory associated with enterprise systems can be found in History of Systems
Science and an expansion of the application of systems engineering to enterprise systems in the Enterprise Systems
Engineering KA in Part 4. The notion of enterprises and enterprise systems also permeates Part 5 Enabling Systems
Engineering.

Systems of Systems
A product, service or enterprise context can be defined as a hierarchy of system elements, with the additional
definition of which elements are part of a SoI solution, which form the related problem context and which influence
any life cycle associated with that context.
The additional concepts of Systems of Systems (SoS) or Federations of Systems (FoS) are used for some contexts. In
terms of the general description in Figure 1 above, this would apply to any life cycle context in which elements
within the SoI have independent life cycle relationships. This concept could apply to any of the life cycle contexts
above, although it is of particular relevance to the service and enterprise contexts.
It is important to understand that the term SoS is an addition to the general concept of system hierarchy that applies
to all systems. Maier examined the meaning of System of Systems in detail and used a characterization approach
which emphasizes the independent nature of the system elements (Maier 1998, 268). Maier describes both
independence in how a system element operates (e.g. an element in the SoI also has its own separate mission or is
part of another SoI) and in how an element is developed or sustained (e.g. an element is made available, modified or
configured by a different organization to the one responsible for the rest of the SoI).
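Maier's characterization can be illustrated with a simple check over the constituents of a candidate SoI: a grouping is treated as a system of systems here when at least one constituent retains both operational independence (its own mission) and managerial independence (separately developed or sustained). The data structure and the "any constituent" rule are simplifying assumptions of this sketch, not Maier's formal definition.

```python
# Illustrative check based on Maier's characterization: constituents with both operational
# and managerial independence suggest a system of systems. The data structure and the
# "any constituent" rule are simplifying assumptions of this sketch.
from dataclasses import dataclass
from typing import List

@dataclass
class ConstituentSystem:
    name: str
    operationally_independent: bool  # has its own separate mission or serves another SoI
    managerially_independent: bool   # developed, sustained or configured by a different organization

def looks_like_sos(constituents: List[ConstituentSystem]) -> bool:
    """True if any constituent keeps both kinds of independence within the SoI."""
    return any(c.operationally_independent and c.managerially_independent for c in constituents)

air_transport = [
    ConstituentSystem("air traffic control", True, True),
    ConstituentSystem("airline scheduling", True, True),
    ConstituentSystem("cockpit display unit", False, False),
]
print(looks_like_sos(air_transport))  # True
```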
There are advantages to being able to have elements shared across a number of engineered systems and to being able
to quickly create solutions to problems by combining existing engineered systems. As the technology to enable
integration of independent systems becomes more common, this SoS approach becomes a common aspect of many
SE life cycles.
Wherever system elements in an engineered system context have any degree of independence from the SoI life cycle,
this adds a further complexity; specifically, by constraining how the resulting engineered system can be changed or
controlled. This dimension of complexity affects the management and control aspects of the systems approach.
A more detailed discussion of the different system grouping taxonomies developed by systems science can be found
in Part 4 Applications of Systems Engineering and an expansion of the ways we deal with SoS complexity can be
found in the Systems of Systems KA in Part 4.

Applying Engineered System Contexts


From the discussions of product and service contexts above, it should be clear that they require similar systems
understanding to be successful and that the difference between them is more about the scope of life cycle choices and
the authority to make changes than about what kinds of systems they are.
These contexts are presented here as generalizations of the systems engineering approach. Any real project may have
both product and service system dimensions to it. In this general view of engineered systems, there is always an
enterprise system directly interested in the service system context and either directly owning and operating any
product systems and enabling services or gaining access to them as needed. This enterprise system may be explicitly
involved in initiating and managing an engineered system life cycle or may be implicit in the shared ownership of a
problem situation. Any engineered system context may have aspects of the SoS independence discussed above. This
may be part of the context in the wider system or environment or it may be related to the choice of elements within
the SoI.
A real SE life cycle typically combines different aspects of these general contexts into a unique problem and solution
context and associated acquirer and supplier commercial relationships. These must be identified by that life cycle as
part of its SE activities. More details of these different life cycle contexts are given in part 2 and their applications to
SE practice are expanded upon in Part 4.
A good example of a general description of the above is given by Ring (1998), who defines the overall context as the
Problem Suppression System and describes a cycle by which an enterprise explores its current needs, uses these to
identify one or more life cycle interventions and relevant organizations, then conducts and delivers those life cycles
and integrates their outputs into the PSS; the enterprise can then review the results in the environment and begin the
cycle again.
This general systems approach is described in part 2 and used as a focus to identify areas of foundational knowledge.
The current practices of SE described in the rest of the SEBoK reference these foundations as appropriate.

References

Works Cited
Aslaksen, E.W. 1996. The Changing Nature of Engineering. New York, NY, USA: McGraw-Hill.
Bertalanffy, L. von. 1968. General System Theory. New York, NY, USA: Brazillier.
Blanchard, B.S., and W.J. Fabrycky. 2005. Systems Engineering and Analysis, 4th ed. Prentice-Hall International
Series in Industrial and Systems Engineering. Englewood Cliffs, NJ, USA: Prentice-Hall.
Boulding, K. 1956. “General systems theory: The skeleton of science.” Management Science, vol. 2, no. 3 (April
1956), pp. 197-208; reprinted in General Systems, Yearbook of the Society for General Systems Research, vol. 1, 1956.
Checkland, P.B. 1999. Systems Thinking, Systems Practice. Chichester, UK: John Wiley & Sons Ltd.
Dictionary.com, s.v. "Taxonomy." Accessed 3 December 2014. Available at: http://dictionary.reference.com/browse/taxonomy.
Encyclopedia Britannica, s.v. "Service Industry." Accessed 3 December 2014. Available at: http://www.britannica.com/EBchecked/topic/535980/service-industry.
DeRosa, J. K. 2005. “Enterprise systems engineering.” Air Force Association, Industry Day, Day 1, 4 August 2005,
Danvers, MA, USA.
Giachetti, R.E. 2009. Design of Enterprise Systems: Theory, Architectures, and Methods. Boca Raton, FL, USA:
CRC Press.
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: Wiley.
Lusch, R.F. and S. L. Vargo (Eds). 2006. The Service-Dominant Logic of Marketing: Dialog, Debate, and
Directions. Armonk, NY, USA: ME Sharpe Inc.
Magee, C.L. and O.L. de Weck. 2004. "Complex System classification". Proceedings of the 14th Annual
International Symposium of the International Council on Systems Engineering, Toulouse, France, 20-24 June, 2004.
Maier, M. W. 1998. "Architecting principles for systems-of-systems". Systems Engineering, vol. 1, no. 4, pp. 267-84.
Maier, M., and E. Rechtin. 2009. The Art of Systems Architecting, 3rd Ed. Boca Raton, FL, USA: CRC Press.
Miller, J.G. 1986. "Can Systems Theory Generate Testable Hypothesis?: From Talcott Parsons to Living Systems
Theory." Systems Research, vol. 3, pp. 73-84.
Paul, A.S. 1998. "Classifying systems." Proceedings of the 8th Annual International Council on Systems
Engineering International Symposium, Vancouver, BC, Canada, 26-30 July, 1998.
Rebovich, G., and B.E. White (eds.). 2011. Enterprise Systems Engineering: Advances in the Theory and Practice.
Boca Raton, FL, USA: CRC Press.
Ring, J. 1998. "A value seeking approach to the engineering of systems." Proceedings of the IEEE Conference on
Systems, Man, and Cybernetics, pp. 2704-2708.
Sampson, S.E. 2001. Understanding Service Businesses. New York, NY, USA: John Wiley.
Tien, J.M. and D. Berg. 2003. "A case for service systems engineering." Journal of Systems Science and Systems
Engineering, vol. 12, no. 1, pp. 13-38.
Wasson, C.S. 2006. System Analysis, Design and Development. Hoboken, NJ, USA: John Wiley and Sons.

Primary References
Checkland, P. B. 1999. Systems Thinking, Systems Practice. Chichester, UK: John Wiley & Sons.
Magee, C. L., O.L. de Weck. 2004. "Complex system classification." Proceedings of the 14th Annual International
Council on Systems Engineering International Symposium, Toulouse, France, 20-24 June 2004.
Rebovich, G., and B.E. White (eds.). 2011. Enterprise Systems Engineering: Advances in the Theory and Practice.
Boca Raton, FL, USA: CRC Press.
Tien, J.M. and D. Berg. 2003. "A case for service systems engineering". Journal of Systems Science and Systems
Engineering, vol. 12, no. 1, pp. 13-38.

Additional References
None.


Complexity
Lead Author: Rick Adcock, Contributing Authors: Hillary Sillitto, Sarah Sheard

This article is part of the Systems Fundamentals knowledge area (KA). It gives the background of and an indication
of current thinking on complexity and how it influences systems engineering (SE) practice.
Complexity is one of the most important and difficult to define system concepts. Is a system's complexity in the eye
of the beholder, or is there inherent complexity in how systems are organized? Is there a single definitive definition
of complexity and, if so, how can it be assessed and measured? This topic will discuss how these ideas relate to the
general definitions of a system given in What is a System?, and in particular to the different engineered system
contexts. This article is closely related to the emergence topic that follows it.

Defining System Complexity


Complexity has been considered by a number of authors from various perspectives; some of the discussions of
complexity relevant to systems are described in the final section of this article. Sheard and Mostashari (2011)
synthesize many of these ideas to categorize complexity as follows:
1. Structural Complexity looks at the system elements and relationships. In particular, structural complexity looks
at how many different ways system elements can be combined. Thus, it is related to the potential for the system to
adapt to external needs.
2. Dynamic Complexity considers the complexity which can be observed when systems are used to perform
particular tasks in an environment. There is a time element to dynamic complexity. The ways in which systems
interact in the short term is directly related to system behavior; the longer-term effects of using systems in an
environment is related to system evolution.
3. Socio-Political Complexity considers the effect of individuals or groups of people on complexity. People-related
complexity has two aspects. One is related to the perception of a situation as complex or not complex, due to
multiple stakeholder viewpoints within a system context and social or cultural biases which add to the wider
influences on a system context. The other involves either the “irrational” behavior of an individual or the swarm
behavior of many people behaving individually in ways that make sense; however, the emergent behavior is
unpredicted and perhaps counterproductive. This latter type is based on the interactions of the people according to
their various interrelationships and is often graphed using systems dynamics formalisms.
Thus, complexity is a measure of how difficult it is to understand how a system will behave or to predict the
consequences of changing it. It occurs when there is no simple relationship between what an individual element does
and what the system as a whole will do, and when the system includes some element of adaptation or problem
solving to achieve its goals in different situations. It can be affected by objective attributes of a system, such as the
number, types, and diversity of its elements and relationships, or by the subjective perceptions of system
observers due to their experience, knowledge, training, or other sociopolitical considerations.
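As a purely illustrative example of how the objective attributes mentioned above might be turned into a rough indicator, the sketch below combines the counts of elements and relationships with the diversity of their types. The formula is an assumption made for this sketch, not a published complexity metric.

```python
# Toy structural-complexity indicator based on the objective attributes mentioned above:
# counts of elements and relationships, and the diversity of their types.
# The formula is an assumption made for illustration, not a published metric.
import math

def structural_complexity(element_types, relationship_types):
    """element_types / relationship_types: one type label per element / relationship."""
    size = len(element_types) + len(relationship_types)
    diversity = len(set(element_types)) + len(set(relationship_types))
    # More parts, more connections and more variety all push the indicator up.
    return size * math.log2(1 + diversity)

uniform_assembly = structural_complexity(
    element_types=["bracket"] * 6,
    relationship_types=["bolted"] * 5)
heterogeneous_system = structural_complexity(
    element_types=["sensor", "processor", "actuator", "operator", "display"],
    relationship_types=["data", "power", "command", "procedure"])
print(round(uniform_assembly, 1), round(heterogeneous_system, 1))  # the mixed system scores higher
```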
This view of complex systems provides insight into the kind of system for which systems thinking and a systems
approach is essential.

Complexity and Engineered Systems


The different perspectives on complexity are not independent when considered across a systems context. The
structural complexity of a system-of-interest (SoI) may be related to dynamic complexity when the SoI also
functions as part of a wider system in different problem scenarios. People are involved in most system contexts, as
part of the problem situation, as system elements and part of the operating environment. The human activity systems
which we create to identify, design, build and support an engineered system and the wider social and business
systems in which they sit are also likely to be complex and affect the complexity of the systems they produce and
use.
Sheard and Mostashari (2011) show the ways different views of complexity map onto product system, service
system and enterprise system contexts, as well as to associated development and sustainment systems and project
organizations. Ordered systems occur as system components and are the subject of traditional engineering. It is
important to understand the behaviors of such systems when using them in a complex system. One might also need to consider truly random or chaotic natural and social systems as part of the context of an engineered system. The
main focus for systems approaches is organized complexity (see below). This kind of complexity cannot be dealt
with by traditional analysis techniques, nor can it be totally removed by the way we design or use solutions. A
systems approach must be able to recognize and deal with such complexity across the life of the systems with which
it interacts.
Sillitto (2014) considers the link between the types of system complexity and system architecture. The ability to
understand, manage and respond to both objective and subjective complexity in the problem situation, the systems
we develop or the systems we use to develop and sustain them is a key component of the Systems Approach Applied
to Engineered Systems and hence to the practice of systems engineering.

Origins and Characteristics of Complexity


This section describes some of the prevailing ideas on complexity. Various authors have used different language to
express these ideas. While a number of common threads can be seen, some of the ideas take different viewpoints and
may be contradictory in nature.
One of the most widely used definitions of complexity is the degree of difficulty in predicting the properties of a
system if the properties of the system's parts are given (generally attributed to Weaver). This, in turn, is related to the
number of elements and connections between them. Weaver (Weaver 1948) relates complexity to types of elements
and how they interact. He describes simplicity as problems with a finite number of variables and interactions, and identifies two kinds of complexity:
1. Disorganized Complexity is found in a system with many loosely coupled, disorganized and equal elements,
which possesses certain average properties such as temperature or pressure. Such a system can be described by
“19th Century” statistical analysis techniques.
2. Organized Complexity can be found in a system with many strongly coupled, organized and different elements
which possess certain emergent properties and phenomena such as those exhibited by economic, political or social
systems. Such a system cannot be described well by traditional analysis techniques.
Weaver's ideas about this new kind of complex problem are some of the foundational ideas of systems thinking. (See
also Systems Thinking.)
Later authors, such as Flood and Carson (1993) and Lawson (2010), expand organized complexity to systems which
have been organized into a structure intended to be understood and thus amenable to engineering and life cycle
management (Braha et al. 2006). They also suggest that disorganized complexity could result from a heterogeneous
complex system evolving without explicit architectural control during its life (complexity creep). This is a different use of the terms "organized" and "disorganized" from that used by Weaver, and care should be taken when mixing these ideas.
Complexity should not be confused with "complicated". Many authors make a distinction between ordered and disordered collections of elements.
Ordered systems have fixed relationships between elements and are not adaptable. Page (2009) cites a watch as an
example of something which can be considered an ordered system. Such a system is complicated, with many
elements working together. Its components are based on similar technologies, with clear mapping between form and
function. If the operating environment changes beyond prescribed limits, or one key component is removed, the
watch will cease to perform its function.
In common usage, chaos is a state of disorder or unpredictability characterized by elements which are not
interconnected and behave randomly with no adaptation or control. Chaos Theory (Kellert 1993) is applied to certain
dynamic systems (e.g., the weather) which, although they have structure and relationships, exhibit unpredictable
behavior. These systems may include aspects of randomness but can be described using deterministic models from which their behavior can be predicted given a set of initial conditions. However, their structure is such that
(un-measurably) small perturbations in inputs or environmental conditions may result in unpredictable changes in
behavior. Such systems are referred to as deterministically chaotic or, simply, chaotic systems. Simulations of
chaotic systems can be created and, with increases in computing power, reasonable predictions of behavior are
possible at least some of the time.
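The sensitivity of deterministically chaotic systems to initial conditions can be made concrete with a small numerical sketch. The example below is not from the cited references; it uses the logistic map, a standard textbook model of deterministic chaos, and the parameter value r = 3.9 is simply a value in the map's chaotic regime.

```python
# Illustrative sketch only (not from the cited references): the logistic map,
# a standard textbook example of deterministic chaos. Two trajectories that
# start almost identically diverge completely after a few dozen steps, even
# though the update rule is fully deterministic.

def logistic_map(x0, r=3.9, steps=50):
    """Iterate x -> r * x * (1 - x), starting from the initial condition x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_map(0.200000)
b = logistic_map(0.200001)  # a perturbation of one part in a million

for step in (0, 10, 25, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}  "
          f"difference {abs(a[step] - b[step]):.6f}")
```

The model is fully deterministic, yet beyond a limited horizon its behavior is practically unpredictable, which is the point made above about chaotic systems.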
On a spectrum of order to complete disorder, complexity is somewhere in the middle, with more flexibility and
change than complete order and more stability than complete disorder (Sheard and Mostashari 2009).
Complex systems may evolve "to the edge of chaos," resulting in systems which can appear deterministic but which exhibit counterintuitive behavior compared to that of more ordered systems. The statistics of chance events in a complex system are often characterized by a power-law distribution, the "signature of complexity" (Sheard 2005). The power-law distribution is found in a very wide variety of natural and man-made phenomena, and it means that the probability of a low-probability, large-impact event is much higher than a Gaussian distribution would suggest.
Such a system may react in a non-linear way to exhibit abrupt phase changes. These phase changes can be either
reversible or irreversible. This has a major impact on engineered systems in terms of the occurrence, impact and
public acceptance of risk and failure.
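To make the "fat tail" implication of the power-law distribution concrete, the sketch below compares how quickly tail probabilities fall off under a Gaussian model and under a power-law (Pareto) model. It is illustrative only; the shape parameter alpha = 2 and the scale x_m = 1 are arbitrary assumptions, not values taken from the cited sources, and the comparison is meant only to show how fast each tail decays as the event size grows.

```python
# Illustrative sketch only: tail probabilities of a Gaussian versus a
# power-law (Pareto) distribution. The parameters (alpha = 2, x_m = 1) are
# arbitrary assumptions chosen for illustration.
import math

def gaussian_tail(k):
    """P(X > mean + k*sigma) for a normal distribution."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(x, alpha=2.0, x_m=1.0):
    """P(X > x) for a Pareto distribution with scale x_m and shape alpha."""
    return (x_m / x) ** alpha if x >= x_m else 1.0

for size in (3, 5, 10):
    print(f"event size {size:2d}: Gaussian tail {gaussian_tail(size):.1e}, "
          f"power-law tail {pareto_tail(size):.1e}")
```

The Gaussian tail collapses towards zero extremely quickly, while the power-law tail shrinks only polynomially, so a large event that a Gaussian model would treat as effectively impossible remains a realistic prospect under a power law.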
Objective complexity is an attribute of complex systems and is a measure of where a system sits on this spectrum. It
is defined as the extent to which future states of the system cannot be predicted with certainty and precision,
regardless of our knowledge of current state and history. Subjective complexity is a measure of how easy it is for an
observer to understand a system or predict what it will do next. As such, it is a function of the perspective and
comprehension of each individual. It is important to be prepared to mitigate subjective complexity with consistent,
clear communication and strong stakeholder engagement (Sillitto 2009).
The literature has evolved to a fairly consistent definition of the characteristics of system elements and relationships
for objective systems complexity. The following summary is given by Page (2009):
1. Independence: Autonomous system elements which are able to make their own decisions, influenced by information from other elements and by the adaptability algorithms the autonomous elements carry within themselves (Sheard and Mostashari 2009).
2. Interconnectedness: System elements connect via a physical connection, shared data or simply a visual awareness of where the other elements are and what they are doing, as in the case of a flock of geese or a squadron of aircraft.
3. Diversity: System elements which are either technologically or functionally different in some way. For example,
elements may be carrying different adaptability algorithms.
4. Adaptability: Self-organizing system elements which can change their own behavior to support themselves or the entire system in response to their environment (Sheard and Mostashari 2009). Adaptability is often achieved by human elements but can also be achieved with software. Pollock and Hodgson (2004) describe how this can be done in a variety of complex system types, including power grids and enterprise systems. A small illustrative sketch of these four attributes follows this list.
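The sketch below is a deliberately simple, hypothetical construction (not taken from the cited authors): each agent decides for itself (independence), sees the other agents (interconnectedness), carries a different adjustment rate (diversity), and moves toward the group in response to what it sees (adaptability).

```python
# Hypothetical illustration (not from the cited authors) of the four
# attributes: independent decisions, interconnected awareness, diverse
# adjustment rates and adaptive movement toward the group.
import random

class Agent:
    def __init__(self, position, pull):
        self.position = position
        self.pull = pull  # diversity: each agent adapts at its own rate

    def step(self, neighbours):
        # interconnectedness: the agent is aware of where the others are
        centre = sum(n.position for n in neighbours) / len(neighbours)
        # independence and adaptability: it applies its own rule, moving
        # part-way toward the group with some individual variation
        self.position += self.pull * (centre - self.position) + random.uniform(-0.1, 0.1)

random.seed(1)
agents = [Agent(random.uniform(0.0, 10.0), random.uniform(0.1, 0.5)) for _ in range(5)]

for _ in range(20):
    for agent in agents:
        others = [a for a in agents if a is not agent]
        agent.step(others)

# the emergent, system-level outcome: the initially scattered agents cluster
print(sorted(round(a.position, 2) for a in agents))
```

No single agent contains a "clustering" behavior; the cohesion appears only at the level of the whole set of agents.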
Due to the variability of human behavior as part of a system and the perceptions of people outside the system, the inclusion of people in a system is often a factor in its complexity. People may be viewed as observing systems or as system elements which contribute to the other types of complexity (Axelrod and Cohen 1999). The rational or irrational behavior of individuals in particular situations is a vital factor with respect to complexity (Kline 1995). Some of this complexity can be reduced through education, training and familiarity with a system. Some is irreducible and must be managed as part of a problem or solution. Checkland (1999) argues that the stakeholders in a group will each hold their own world views, which lead them to form different, but equally valid, understandings of a system context. These
differences cannot be explained away or analyzed out, and must be understood and considered in the formulation of
problems and the creation of potential solutions.
Warfield (2006) developed a powerful methodology for addressing complex issues, particularly in the
socio-economic field, based on a relevant group of people developing an understanding of the issue in the form of a
set of interacting problems - what he called the “problematique”. The complexity is then characterized via several
measures, such as the number of significant problems, their interactions and the degree of consensus about the nature
of the problems. What becomes clear is that how, why, where and by whom a system is used may all contribute to its
perceived complexity.
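As a hypothetical illustration of the kind of measures Warfield describes, the sketch below computes the size, interaction density and average stakeholder consensus of a small, invented problematique; the problem names, interaction pairs and consensus scores are all assumptions made for illustration.

```python
# Hypothetical illustration of simple "problematique" measures in the spirit
# of Warfield: the problems, their interactions and the consensus scores are
# all invented for this example.

problems = ["funding", "staffing", "regulation", "public trust"]
interactions = {("funding", "staffing"), ("funding", "regulation"),
                ("regulation", "public trust"), ("staffing", "public trust")}
consensus = {"funding": 0.8, "staffing": 0.4, "regulation": 0.6, "public trust": 0.3}

n = len(problems)
possible_links = n * (n - 1) // 2
density = len(interactions) / possible_links        # how densely the problems interact
mean_consensus = sum(consensus.values()) / n        # how much stakeholders agree

print(f"{n} problems, {len(interactions)} interactions "
      f"(density {density:.2f}), mean consensus {mean_consensus:.2f}")
```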
Sheard and Mostashari (2011) sort the attributes of complexity into causes and effects. Attributes that cause
complexity include being non-linear; emergent; chaotic; adaptive; tightly coupled; self-organized; decentralized;
open; political (as opposed to scientific); and multi-scale; as well as having many pieces. The effects of those attributes, which lead a system to be perceived as complex, include being uncertain; difficult to understand; unpredictable; uncontrollable; unstable; unrepairable; unmaintainable and costly; having unclear cause and effect; and taking too long to build.

References

Works Cited
Axelrod, R. and M. Cohen. 1999. Harnessing Complexity: Organizational Implications of a Scientific Frontier. New
York, NY, USA: Simon and Schuster.
Braha, D., A. Minai, and Y. Bar-Yam (eds.). 2006. Complex Engineered Systems: Science Meets Technology. New
York, NY, USA: Springer.
Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY, USA: John Wiley & Sons.
Flood, R. L., and E.R. Carson. 1993. Dealing with Complexity: An Introduction to The Theory and Application of
Systems Science, 2nd ed. New York, NY, USA: Plenum Press.
Lawson, H. W. 2010. A Journey Through the Systems Landscape. Kings College, UK: College Publications.
Kellert, S. 1993. In the Wake of Chaos: Unpredictable Order in Dynamical Systems, Chicago, IL, USA: University
of Chicago Press.
Kline, S. 1995. Foundations of Multidisciplinary Thinking. Stanford, CA, USA: Stanford University Press.
Page, Scott E. 2009. Understanding Complexity. Chantilly, VA, USA: The Teaching Company.
Pollock, J.T. and R. Hodgson. 2004. Adaptive Information. Hoboken, NJ, USA: John Wiley & Sons.
Senge, P.M. 1990. The Fifth Discipline: The Art & Practice of The Learning Organization. New York, NY, USA:
Doubleday/Currency.
Sheard, S.A. 2005. "Practical applications of complexity theory for systems engineers". Proceedings of the Fifteenth
Annual International Council on Systems Engineering, vol. 15, no. 1.
Sheard, S.A. and A. Mostashari. 2009. "Principles of complex systems for systems engineering." Systems
Engineering, vol. 12, no. 4, pp. 295-311.
Sheard, S.A. and A. Mostashari. 2011. "Complexity types: From science to systems engineering." Proceedings of the 21st Annual International Council on Systems Engineering (INCOSE) International Symposium, Denver, Colorado, USA, 20-23 June 2011.
Sillitto, H. 2014. "Architecting Systems - Concepts, Principles and Practice", London, UK: College Publications.
Warfield, J.N. 2006. An Introduction to Systems Science. London, UK: World Scientific Publishing.
Weaver, W. 1948. "Science and complexity." American Scientist, vol. 36, pp. 536-544.
Primary References
Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to The Theory and Application of
Systems Science, 2nd ed. New York, NY, USA: Plenum Press.
Page, Scott E. 2009. Understanding Complexity. Chantilly, VA, USA: The Teaching Company.
Sheard, S.A. and A. Mostashari. 2009. "Principles of complex systems for systems engineering". Systems
Engineering, vol. 12, no. 4, pp. 295-311.

Additional References
Ashby, W.R. 1956. An Introduction to Cybernetics. London, UK: Chapman and Hall.
Aslaksen, E.W. 2004. "System thermodynamics: A model illustrating complexity emerging from simplicity".
Systems Engineering, vol. 7, no. 3. Hoboken, NJ, USA: Wiley.
Aslaksen, E.W. 2009. Engineering Complex Systems: Foundations of Design in the Functional Domain. Boca Raton,
FL, USA: CRC Press.
Aslaksen, E.W. 2011. "Elements of a systems engineering ontology". Proceedings of SETE 2011, Canberra,
Australia.
Eisner, H. 2005. Managing Complex Systems: Thinking Outside the Box. Hoboken, NJ, USA: John Wiley & Sons.
Jackson, S., D. Hitchins, and H. Eisner. 2010. “What is the Systems Approach?” INCOSE Insight, vol. 13, no. 1,
April, pp. 41-43, 2010.
MITRE. 2011. "Systems engineering strategies for uncertainty and complexity." Systems Engineering Guide.
Accessed 9 March 2011. Available at: http://www.mitre.org/work/systems_engineering/guide/enterprise_engineering/comprehensive_viewpoint/sys_engineering_strategies_uncertainty_complexity.html.
Ryan, A. 2007. "Emergence is coupled to scope, not level." Complexity, vol. 13, no. 2. A condensed version appeared in INCOSE Insight, vol. 11, no. 1, January, pp. 23-24, 2008.
Sillitto H.G. 2009. "On systems architects and systems architecting: Some thoughts on explaining the art and science
of system architecting." Proceedings of the 19th Annual International Council on Systems Engineering (INCOSE)
International Symposium, Singapore, 20-23 July 2009.

Emergence
Lead Author: Rick Adcock, Contributing Authors: Scott Jackson, Dick Fairley, Janet Singer, Duane Hybertson

This topic forms part of the Systems Fundamentals knowledge area (KA). It gives the background to some of the
ways in which emergence has been described, as well as an indication of current thinking on what it is and how it
influences systems engineering (SE) practice. It will discuss how these ideas relate to the general definitions of
systems given in What is a System?; in particular, how they relate to different engineered system contexts. This topic
is closely related to the complexity topic that precedes it.
Emergence is a consequence of the fundamental system concepts of holism and interaction (Hitchins 2007, 27).
System wholes have behaviors and properties arising from the organization of their elements and their relationships,
which only become apparent when the system is placed in different environments.
Questions that arise from this definition include: What kinds of systems exhibit different kinds of emergence and
under what conditions? Can emergence be predicted, and is it beneficial or detrimental to a system? How do we deal
with emergence in the development and use of engineered systems? Can it be planned for? How?
There are many varied and occasionally conflicting views on emergence. This topic presents the prevailing views
and provides references for others.

Overview of Emergence
As defined by Checkland, emergence is “the principle that entities exhibit properties which are meaningful only
when attributed to the whole, not to its parts.” (Checkland 1999, 314). Emergent system behavior can be viewed as a
consequence of the interactions and relationships between system elements rather than the behavior of individual
elements. It emerges from a combination of the behavior and properties of the system elements and the system's structure, or allowable interactions between the elements, and may be triggered or influenced by a stimulus from the system's environment.
Emergence is common in nature. The pungent gas ammonia results from the chemical combination of two odorless
gases, hydrogen and nitrogen. As individual parts, feathers, beaks, wings, and gullets do not have the ability to
overcome gravity; however, when properly connected in a bird, they create the emergent behavior of flight. What we
refer to as “self-awareness” results from the combined effect of the interconnected and interacting neurons that make
up the brain (Hitchins 2007, 7).
Hitchins also notes that technological systems exhibit emergence. We can observe a number of levels of outcome
which arise from interaction between elements in an engineered system context. At a simple level, some system
outcomes or attributes have a fairly simple and well defined mapping to their elements; for example, center of
gravity or top speed of a vehicle result from a combination of element properties and how they are combined. Other
behaviors can be associated with these simple outcomes, but their value emerges in complex and less predictable
ways across a system. The single lap performance of a vehicle around a track is related to center of gravity and
speed; however, it is also affected by driver skill, external conditions, component wear, etc. Getting the 'best'
performance from a vehicle can only be achieved by a combination of good design and feedback from real laps
under race conditions.
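The difference between the simple, well-defined mapping and the less predictable one can be seen in a small sketch. Computing a vehicle's center of gravity from its elements is a direct calculation (the masses and positions below are invented for illustration); predicting lap performance would require simulating the vehicle, the driver and the environment together.

```python
# Illustrative sketch with invented values: center of gravity is a system
# property with a simple, predictable mapping from element properties.

elements = {
    # element: (mass in kg, longitudinal position in m) -- assumed values
    "engine":  (180.0, 0.8),
    "chassis": (300.0, 2.0),
    "driver":  ( 75.0, 1.6),
    "fuel":    ( 50.0, 2.8),
}

total_mass = sum(mass for mass, _ in elements.values())
center_of_gravity = sum(mass * pos for mass, pos in elements.values()) / total_mass

print(f"total mass {total_mass:.0f} kg, center of gravity at {center_of_gravity:.2f} m")
```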
There are also outcomes which are less tangible and which come as a surprise to both system developers and users.
How does lap time translate into a winning motor racing team? Why is a sports car more desirable to many than
other vehicles with performances that are as good or better?
Emergence can always be observed at the highest level of system. However, Hitchins (2007, 7) also points out that to
the extent that the system elements themselves can be considered as systems, they also exhibit emergence. Page
(2009) refers to emergence as a “macro-level property.” Ryan (2007) contends that emergence is coupled to scope
rather than system hierarchical levels. In Ryan’s terms, scope has to do with spatial dimensions (how system
elements are related to each other) rather than hierarchical levels.
Abbott (2006) does not disagree with the general definition of emergence as discussed above. However, he takes
issue with the notion that emergence operates outside the bounds of classical physics. He says that “such higher-level
entities…can always be reduced to primitive physical forces.”
Bedau and Humphreys (2008) and Francois (2004) provide comprehensive descriptions of the philosophical and
scientific background of emergence.

Types of Emergence
A variety of definitions of types of emergence exists. See Emmeche et al. (1997), Chroust (2003) and O’Connor and
Wong (2006) for specific details of some of the variants. Page (2009) describes three types of emergence: "simple",
"weak", and "strong".
According to Page, simple emergence is generated by the combination of element properties and relationships and
occurs in non-complex or “ordered” systems (see Complexity) (2009). To achieve the emergent property of
“controlled flight” we cannot consider only the wings, or the control system, or the propulsion system. All three must
be considered, as well as the way these three are interconnected with each other and with all the other parts of
the aircraft. Page suggests that simple emergence is the only type of emergence that can be predicted. This view of
emergence is also referred to as synergy (Hitchins 2009).
Page describes weak emergence as expected emergence which is desired (or at least allowed for) in the system
structure (2009). However, since weak emergence is a product of a complex system, the actual level of emergence
cannot be predicted just from knowledge of the characteristics of the individual system components.
The term strong emergence is used to describe unexpected emergence; that is, emergence not observed until the
system is simulated or tested or, more alarmingly, until the system encounters in operation a situation that was not
anticipated during design and development.
Strong emergence may be evident in failures or shutdowns. For example, the US-Canada Blackout of 2003 as
described by the US-Canada Power System Outage Task Force (US-Canada Power Task Force 2004) was a case of
cascading shutdown that resulted from the design of the system. Even though there was no equipment failure, the
shutdown was systemic. As Hitchins points out, this example shows that emergent properties are not always
beneficial (Hitchins 2007, 15).
Other authors make a different distinction between the ideas of strong, or unexpected, emergence and unpredictable
emergence:
• Firstly, there are the unexpected properties that could have been predicted but were not considered in a systems
development: "Properties which are unexpected by the observer because of his incomplete data set, with regard to
the phenomenon at hand" (Francois, C. 2004, 737). According to Jackson et al. (2010), a desired level of
emergence is usually achieved by iteration. This may occur as a result of evolutionary processes, in which
element properties and combinations are "selected for", depending on how well they contribute to a system’s
effectiveness against environmental pressures or by iteration of design parameters through simulation or build/test
cycles. Taking this view, the specific values of weak emergence can be refined, and examples of strong
emergence can be considered in subsequent iterations so long as they are amenable to analysis.
• Secondly, there are unexpected properties which cannot be predicted from the properties of the system’s
components: "Properties which are, in and of themselves, not derivable a priori from the behavior of the parts of
the system" (Francois, C. 2004, 737). This view of emergence is a familiar one in social or natural sciences, but
more controversial in engineering. We should distinguish between a theoretical and a practical unpredictability
(Chroust 2002). The weather is theoretically predictable, but beyond a certain limited accuracy forecasting it is practically impossible due to its chaotic nature. The emergence of consciousness in human beings cannot be deduced from
the physiological properties of the brain. For many, this genuinely unpredictable type of complexity has limited
value for engineering. (See Practical Considerations below.)
A type of system particularly subject to strong emergence is the system of systems (SoS). The reason for this is that
the SoS, by definition, is composed of different systems that were designed to operate independently. When these
systems are operated together, the interaction among the parts of the system is likely to result in unexpected
emergence. Chaotic or truly unpredictable emergence is likely for this class of systems.

Emergent Properties
Emergent properties can be defined as follows: “A property of a complex system is said to be ‘emergent’ [in the case
when], although it arises out of the properties and relations characterizing its simpler constituents, it is neither
predictable from, nor reducible to, these lower-level characteristics” (Honderich 1995, 224).
All systems can have emergent properties which may or may not be predictable or amenable to modeling, as
discussed above. Much of the literature on complexity includes emergence as a defining characteristic of complex
systems. For example, Boccara (2004) states that “The appearance of emergent properties is the single most
distinguishing feature of complex systems.” In general, the more ordered a system is, the easier its emergent
properties are to predict. The more complex a system is, the more difficult predicting its emergent properties
becomes.
Some practitioners use the term “emergence” only when referring to “strong emergence”. These practitioners refer to
the other two forms of emergent behavior as synergy or “system level behavior” (Chroust 2002). Taking this view,
we would reserve the term "Emergent Property" for unexpected properties, which can be modeled or refined through
iterations of the systems development.
Unforeseen emergence causes nasty shocks. Many believe that the main job of the systems approach is to prevent
undesired emergence in order to minimize the risk of unexpected and potentially undesirable outcomes. This review
of emergent properties is often specifically associated with identifying and avoiding system failures (Hitchins 2007).
Good SE isn't just focused on avoiding system failure, however. It also involves maximizing opportunity by
understanding and exploiting emergence in engineered systems to create the required system level characteristics
from synergistic interactions between the components, not just from the components themselves (Sillitto 2010).
One important group of emergent properties includes properties such as agility and resilience. These are critical
system properties that are not meaningful except at the whole system level.

Practical Considerations
As mentioned above, one way to manage emergent properties is through iteration. The requirement to iterate the design of an engineered system to achieve desired emergence results in a design process that is lengthier than the one needed to design an ordered system. Creating an engineered system capable of such iteration may also require a more configurable or modular solution. The result is that complex systems may be more costly and time-consuming to develop than ordered ones, and the cost and time to develop are inherently less predictable.
Sillitto (2010) observes that “engineering design domains that exploit emergence have good mathematical models of
the domain, and rigorously control variability of components and subsystems, and of process, in both design and
operation.” The iterations discussed above can be accelerated by using simulation and modeling, so that not all the
iterations need to involve building real systems and operating them in the real environment.
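The iterate-simulate-adjust pattern described above can be sketched as a simple loop. The example is purely illustrative: the design parameter, the stand-in simulation and its peak value are all assumptions, and a real development would use a validated model of the system in its environment.

```python
# Purely illustrative sketch of the iterate-simulate-adjust pattern. The
# design parameter and the stand-in "simulation" are invented; the point is
# the shape of the loop, not the model inside it.

def simulate(parameter):
    """Stand-in for a model of the system operating in its environment."""
    return 100.0 - (parameter - 3.2) ** 2  # emergent performance, peaking near 3.2

parameter, step = 1.0, 0.5
best = simulate(parameter)

for _ in range(20):
    trial = parameter + step
    performance = simulate(trial)
    if performance > best:        # keep design changes that improve emergent behavior
        parameter, best = trial, performance
    else:
        step = -step / 2          # otherwise back off and search in the other direction

print(f"converged on parameter {parameter:.2f} with simulated performance {best:.1f}")
```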
The idea of domain models is explored further by Hybertson in the context of general models or patterns learned
over time and captured in a model space (Hybertson 2009). Hybertson states that knowing what emergence will
appear from a given design, including side effects, requires hindsight. For a new type of problem that has not been
solved, or a new type of system that has not been built, it is virtually impossible to predict emergent behavior of the
solution or system. Some hindsight, or at least some insight, can be obtained by modeling and iterating a specific
system design; however, iterating the design within the development of one system yields only limited hindsight and
often does not give a full sense of emergence and side effects.
True hindsight and understanding comes from building multiple systems of the same type and deploying them, then
observing their emergent behavior in operation and the side effects of placing them in their environments. If those
observations are done systematically, and the emergence and side effects are distilled and captured in relation to the
design of the systems — including the variations in those designs — and made available to the community, then we
are in a position to predict and exploit the emergence.
Two factors are discovered in this type of testing environment: what works (that is, what emergent behavior and side
effects are desirable); and what does not work (that is, what emergent behavior and side effects are undesirable).
What works affirms the design. What does not work calls for corrections in the design. This is why multiple systems,
especially complex systems, must be built and deployed over time and in different environments - to learn and
understand the relations among the design, emergent behavior, side effects, and environment.
These two types of captured learning correspond respectively to patterns and “antipatterns,” or patterns of failure,
both of which are discussed in a broader context in the Principles of Systems Thinking and Patterns of Systems
Thinking topics.
The use of iterations to refine the values of emergent properties, either across the life of a single system or through
the development of patterns encapsulating knowledge gained from multiple developments, applies most easily to the
discussion of strong emergence above. In this sense, those properties which can be observed but cannot be related to
design choices are not relevant to a systems approach. However, they can have value when dealing with a
combination of engineering and managed problems which occur in system of systems contexts (Sillitto 2010). (See
Systems Approach Applied to Engineered Systems.)

References

Works Cited
Abbott, R. 2006. "Emergence explained: Getting epiphenomena to do real work". Complexity, vol. 12, no. 1
(September-October), pp. 13-26.
Bedau, M.A. and P. Humphreys (eds.). 2008. Emergence: Contemporary Readings in Philosophy and Science. Cambridge, MA, USA: The MIT Press.
Boccara, N. 2004. Modeling Complex Systems. New York, NY, USA: Springer-Verlag.
Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY, USA: John Wiley & Sons.
Chroust, G. 2002. "Emergent properties in software systems." 10th Interdisciplinary Information Management Talks;
Hofer, C. and Chroust, G. (eds.). Verlag Trauner Linz, pp. 277-289.
Chroust, G., C. Hofer, and C. Hoyer (eds.). 2005. "The concept of emergence in systems engineering." The 12th Fuschl Conversation, April 18-23, 2004, Institute for Systems Engineering and Automation, Johannes Kepler University Linz, pp. 49-60.
Emmeche, C., S. Koppe, and F. Stjernfelt. 1997. "Explaining emergence: Towards an ontology of levels." Journal
for General Philosophy of Science, vol. 28, no. 1, pp. 83-119. Accessed 3 December 2014. Available at: http://www.nbi.dk/~emmeche/coPubl/97e.EKS/emerg.html.
Francois, C. 2004. International Encyclopedia of Systems and Cybernetics, 2nd edition, 2 volumes. Munich,
Germany: K.G.Saur Verlag.
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley &
Sons.
Honderich, T. 1995. The Oxford Companion to Philosophy. New York, NY, USA: Oxford University Press.
Hybertson, D. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and
Complex Systems. Boca Raton, FL, USA: Auerbach/CRC Press.
Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight. 13(1) (April
2010): 41-43.
O’Connor, T. and H. Wong. 2006. "Emergent Properties," in Stanford Encyclopedia of Philosophy. Accessed
December 3 2014: Available at: http://plato.stanford.edu/entries/properties-emergent/.
Page, S.E. 2009. Understanding Complexity. The Great Courses. Chantilly, VA, USA: The Teaching Company.
Ryan, A. 2007. "Emergence is coupled to scope, not level." Complexity, vol. 13, no. 2, November-December.
Sillitto, H.G. 2010. "Design principles for ultra-large-scale systems". Proceedings of the 20th Annual International
Council on Systems Engineering (INCOSE) International Symposium, July 2010, Chicago, IL, USA, reprinted in
“The Singapore Engineer,” April 2011.
US-Canada Power System Outage Task Force. 2004. Final Report on the August 14, 2003 Blackout in the United
States and Canada: Causes and Recommendations. Washington-Ottawa. Accessed 3 December 2014. Available at: http://energy.gov/oe/downloads/blackout-2003-final-report-august-14-2003-blackout-united-states-and-canada-causes-and

Primary References
Emmeche, C., S. Koppe, and F. Stjernfelt. 1997. "Explaining emergence: Towards an ontology of levels." Journal
for General Philosophy of Science, vol. 28, no. 1, pp. 83-119. Available at: http://www.nbi.dk/~emmeche/coPubl/97e.EKS/emerg.html.
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley &
Sons.
Page, S. E. 2009. Understanding Complexity. The Great Courses. Chantilly, VA, USA: The Teaching Company.

Additional References
Sheard, S.A. and A. Mostashari. 2009. "Principles of complex systems for systems engineering." Systems Engineering, vol. 12, no. 4, pp. 295-311.

Fundamentals for Future Systems Engineering


Lead Author: Rick Adcock, Contributing Author: Duane Hybertson

This article forms part of the Systems Fundamentals knowledge area (KA). It considers future trends in SE and how
these might influence the evolution of future fundamentals.
The SEBoK contains a guide to generalized knowledge about the practice of SE. It does not pass judgement on that
knowledge. However, it can be useful in some cases to indicate which parts of the knowledge are rooted in existing
practice and which point towards the future evolution of SE.
This article provides a sketch of how SE is changing and suggests how these changes may affect the future of
systems engineering, the SEBoK, and the foundations in Part 2.

INCOSE Vision
The INCOSE Vision 2025 statement (INCOSE 2014) depicts some future directions in:
Broadening SE Application Domains
• SE relevance and influence will go beyond traditional aerospace and defense systems and extend into the broader
realm of engineered, natural and social systems
• SE will be applied more widely to assessments of socio-physical systems in support of policy decisions and other
forms of remediation
More Intelligent and Autonomous Systems
• Systems of the future need to become smarter, self-organized, sustainable, resource-efficient, robust and safer
• The number of autonomous vehicles and transportation systems needs to increase
• Systems become more “intelligent” and dominate human-safety critical applications
Theoretical Foundations
• SE will be supported by a more encompassing foundation of theory and sophisticated model-based methods and
tools allowing a better understanding of increasingly complex systems and decisions in the face of uncertainty
• Challenge: A core body of systems engineering foundations is defined and taught consistently across academia
and forms the basis for systems engineering practice
In this article we will consider how the fundamentals of SE might need to evolve to support this vision.

How will SE Change?


In Types of Systems, we describe three general contexts in which an SE life cycle can be applied. In a product system
context, the outputs of SE focus on the delivery of technological systems. While such systems are designed to be
used by people and fit into a wider problem-solving context, this context has been seen as largely fixed and external
to SE. The service system context allows SE to consider all aspects of the solution system as part of its
responsibility. This is currently seen as a special case of SE application largely focused on software intensive
solutions. The enterprise system context offers the potential for a direct application of SE to tackle complex
socio-technical problems, by supporting the planning, development and use of combinations of service systems.
Where this is done, it can be difficult to connect it to the product-focused life cycles of many SE projects.
The role of the systems engineer has already begun to change somewhat due to the first two of the future trends
above. Changes to the scope of SE application and the increased use of software-intensive, reconfigurable and
autonomous solutions will make the service system context the primary focus of most SE life cycles. To enable this,
most product systems will need to become more general and configurable, allowing them to be used in a range of
service systems as needed. These life cycles are increasingly initiated and managed as part of an enterprise portfolio
of related life cycles.


In this evolution of SE, the systems engineer cannot consider as many aspects of the context to be fixed, making the
problem and possible solution options more complex and harder to anticipate. This also means the systems engineer
has greater freedom to consider solutions which combine existing and new technologies and in which the role of
people and autonomous software can be changed to help deliver desired outcomes. For such systems to be
successful, they will need to include the ability to change, adapt and grow both in operation and over several
iterations of their life cycle. This change moves SE to be directly involved in enterprise strategy and planning, as part
of an ongoing and iterative approach to tackling the kinds of societal problems identified in the INCOSE vision.
This evolution of both the role and scope of SE will also see the system of systems aspects of all system contexts
increase. We can expect system of systems engineering to become part of the systems engineering of many, if not
most, SE life cycles.

Evolution of Fundamentals
These ongoing changes to SE place more emphasis on the role of autonomous agents in systems engineering, and
agency will be an area of increased emphasis in the systems engineering and SEBoK of the future. Hybertson (2019)
spells out in more detail the increased role of agents and agency in future SE. Moving from a total control model to a
shared responsibility model changes the nature of engineering to something more like collective actualization, as
proposed by Hybertson (2009 and 2019). Systems will represent a combination and interplay of technology and
social factors, and they can range from technical product to service provider to social entity. In many cases they will
be a socio-technical combination or hybrid.
The above trends have an impact on SE foundations, including technical aspects, social aspects, and ethical aspects.
Inclusion of people in systems implies significant expansion of foundation sciences, to provide principles, theories,
models, and patterns of the human, biological, social, and agent realm as well as the technical and physical realm.
Emphasis on agents implies a revised conceptualization of system change, from the traditional model of mechanistic
and controlled fixes and upgrades to a more organic change model that involves growth, self-learning,
self-organizing, and self-adapting. Ethical considerations will include how to allocate responsibility for a system in a
shared responsibility model. Further discussion of the expanded foundation and a list of foundation disciplines for
future SE are presented in (Hybertson 2009 and 2019).

References

Works Cited
Hybertson, D. (2009). Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and
Complex Systems, Boca Raton, FL, USA: Auerbach/CRC Press.
Hybertson, D. (2020 forthcoming). Systems Engineering Science. Chapter in G. S. Metcalf, H. Deguchi, and K.
Kijima (editors in chief). Handbook of Systems Science. Tokyo: Springer.
INCOSE (2014). “A world in motion: Systems engineering vision 2025.” International Council on Systems
Engineering.
Primary References
None.

Additional References
None.
