buffers and margins are eroded in ways that limit
the system's ability to maintain and sustain
adaptability when confronted with uncertainty and
surprise, thereby making the system less effective.
Additionally, increasing the distance between the
practitioner and the system reduces the
practitioner's ability to intervene when unexpected
events occur.
Work changes. When work changes, there are
consequences for the practitioner's ability to create
strategies that exploit the system's characteristics
of agility and flexibility, in other words its
adaptive capacity. Boy (2020) refers to this as a
form of smart integration: designing innovative
complex systems that exploit the ability to
understand increasing complexity. This means
embracing complexity. What, then, are we designing
for?
A design that embraces complexity rejects the
reductionist view, in which the effects of
complexity are reduced or eliminated by reducing or
eliminating the role of the human. Instead, such a
design embraces and designs for complexity by
matching emergent system behaviours with creative,
emergent human responses in real time.
5 CONCLUSIONS
In this paper we argue that we need to move towards
designing a socio-cognitive system. This is proposed
as a way forward to reduce the distance between
practitioners and designers so that designs
incorporate joint activity that supports common
ground.
To make that possible, we must embrace
complexity, uncertainty and surprises rather than
trying to eliminate them. In doing so, the role of the
human practitioner is recognised and sustained,
which permits more efficient and effective operation
in real time. Furthermore, such an approach can help
maintain job satisfaction, practitioner
involvement, and the real-time learning and
adjustment of patterns of activity associated with
complexity, uncertainty and surprises.
One of the means to achieve a constructive
approach to the design of effective and meaningful
human-system integration is through new ways of
working together. These need to be institutionalised
and embedded by the Regulator. In the recent Boeing
episode, the manufacturer was doing the regulator's
job (Nicas et al., 2019).
Further areas remain for consideration. A coherent
transition plan should be derived to identify the
needs of management and the human practitioner in
complex socio-cognitive systems. Another question
is whether we are deceived by optimistic predictions
of the costs saved by tools and methods of operation
that exclude the human practitioner.
REFERENCES
Bainbridge, L. (1983). Ironies of automation. Automatica,
19, 775-779.
Baxter, G., & Sommerville, I. (2011). Socio-technical
systems: From design methods to systems engineering.
Interacting with Computers, 23, 4-17.
Boy, G. (2020). Human-systems integration. CRC Press.
Bradshaw, J., Feltovich, P., & Johnson, M. (2011).
Human-agent interaction. In G. Boy (Ed.), The handbook
of human-machine interaction (pp. 283-303). Ashgate.
Cilliers, P. (2000). Complexity and postmodernism:
Understanding complex systems. Routledge.
Eisenberg, D., Seager, T., & Alderson, D.L. (2019).
Rethinking resilience analytics. Risk Analysis. DOI:
10.1111/risa.13328
Flach, J. (2012). Complexity: Learning to muddle through.
Cognition, Technology & Work, 14, 187-197. DOI:
10.1007/s10111-011-0201-8
Heylighen, F., Cilliers, P., & Gershenson, C. (2007).
Complexity and philosophy. In J. Bogg & R. Geyer
(Eds.), Complexity, science and society (1st ed.).
CRC Press.
Hollnagel, E., & Woods, D.D. (2005). Joint cognitive
systems: Foundations of cognitive systems engineering.
CRC Press, Taylor and Francis.
Inagaki, T. (2014). Human-machine co-agency for
collaborative control. In Y. Sankai, K. Suzuki, &
Y. Hasegawa (Eds.), Cybernics: Fusion of human,
machine and information systems. Springer.
Klein, G. (2022). The war on expertise. Novellus.
https://novellus.solutions/insights/podcast/war-on-expertise-how-to-prepare-and-how-to-win/
Klein, G., Feltovich, P.J., Bradshaw, J.M., & Woods, D.D.
(2005). Common ground and coordination in joint
activity. In W.B. Rouse & K.R. Boff (Eds.),
Organizational simulation. Wiley.
Lanir, Z. (1983). Fundamental surprises. Tel Aviv: Centre
for Strategic Studies.
McDaniel, R.R., Jr., & Driebe, D.J. (2006). Uncertainty
and surprise in complex systems: Questions on working
with the unexpected. Springer.
Nicas, J., Kitroeff, N., Gelles, D., & Glanz, J. (2019).
Boeing built deadly assumptions into 737 Max, blind to
a late design change [Internet]. New York: The New York
Times; 2019 Jun 1 [cited 2019 Oct 17]. Available from
https://www.nytimes.com/2019/06/01/business/boeing-737-max-crash.html
Woods, D.D. (2010). Fundamental surprise. In D.D. Woods,
S.W.A. Dekker, R. Cook, L. Johannesen, & N. Sarter
(Eds.), Behind human error (2nd ed.). CRC Press.