Hoc, J.-M., Cacciabue, P. C. & Hollnagel, E. (Eds.) (1995). Expertise and Technology: Cognition and Human-Computer Cooperation. Hillsdale, NJ: Lawrence Erlbaum Associates.
EDITORS’ FOREWORD
Technological development has changed the nature of industrial production so that today it is no longer a question of a human working with a machine, but rather of a joint human-machine system performing the task. This development, which started in the 1940s, has become even more pronounced with the proliferation of computers and the invasion of digital technology into all walks of working life. The first area where radical changes took place was administrative and clerical work - including such trades as typesetting and technical drawing. We are now seeing a similarly drastic change in the industrial process domain. Process automation has long been used to improve the efficiency and safety of industrial production, but the availability of cheap yet powerful computers is now producing fundamental changes in the very nature of work. It may look as if the importance of human work has been reduced compared to what can be achieved by intelligent software systems. In reality the opposite is the case: the more complex a system is, the more vital is the human operator's task. The conditions, however, have changed. Whereas people used to be in control of their own tasks, today they have become supervisors of tasks which are shared between humans and machines.
Engineers and technical specialists have often had as their goal the complete automation of systems, i.e. making the human operator redundant. This goal has rarely been achieved, for two main reasons. Firstly, it is not always technically possible to automate every function in a system. Secondly, and more importantly, it is impossible to anticipate the range of situations and conditions that can occur in the life of a system, except in very trivial cases. Human operators therefore still have an important role to play in providing the adaptation that is necessary when systems go beyond their normal operating conditions. It follows that a proper understanding of the operator's cognition is at least as important as mastering the technical side of the systems. A thorough understanding and appreciation of human cognition is essential both to develop efficiently and safely functioning systems and to train the human operators to fulfil their assignments.
To attain these goals, multi-disciplinary interaction and collaboration are essential. A considerable amount of effort has been devoted to the domain of administrative and clerical work, and has led to the establishment of an internationally based human-computer interaction (HCI) community, at the level of research as well as at the level of application. The HCI community, however, has paid more attention to static environments, where the human operator is in complete control of the situation, than to dynamic environments, where changes may occur independently of human intervention and actions. A typical case is that of industrial process control, but many other situations share the same feature, for instance air traffic control, aircraft piloting, ship manoeuvring, intensive care in hospitals, crisis management, and electronic trading.
In 1986, an international working group was established with the financial support of the French National Scientific Research Centre (CNRS). Two multi-disciplinary research programs, on Work and on Cognitive Science, enabled psychologists, ergonomists, computer scientists, and control engineers to work together. As a result, a series of bi-annual meetings has been organized since 1987 (CSAPC: Cognitive Science Approaches to Process Control). In addition, a more focused working group has held a number of meetings (CADES: Cognitive Approaches to Dynamic Environment Supervision), coordinated by J.-M. Hoc with the contribution of R. Amalberti, P. C. Cacciabue, J. Patrick, B. Pavard, and R. Samurcay; the aim of this working group was to produce an overview in the form of a book, taking stock of recent research developments on the topic. The result is the present volume. Although the CADES group was initially created on a European basis, it has progressively been expanded to include several colleagues from outside Europe.
The basic philosophy of this book is the conviction that human operators remain the unchallenged experts, even in the worst cases where their working conditions have been impoverished by senseless automation. They maintain this advantage through their ability to learn and to build up, in the course of their work, a high level of expertise founded on operational knowledge. This expertise must be taken into account in the development of efficient human-machine systems, in the specification of training requirements, and in the identification of needs for specific computer support for human actions.
The book is divided into an introductory chapter (Ch. 1), which organizes the common concepts used throughout the book, three main sections, and a conclusion (Ch. 17).
Section 1 deals with the main features of cognition in dynamic environments, combining issues from empirical approaches to human cognition and from cognitive simulation. Ch. 2 integrates three key activities - diagnosis, decision-making, and time management - in a common framework, since they are always interacting in dynamic environment supervision. Ch. 3 reviews the main cognitive architectures used in computer-based cognitive modelling. Ch. 4 presents the four domains of application of cognitive simulation: system design, analysis and evaluation, human operator training, and on-line support. Ch. 5 explores the ability of computer simulation to model the dynamics of human-machine systems considered as joint cognitive systems.
Section 2 addresses the question of the development of competence and expertise. Ch. 6 analyzes errors in a medical context to capture novice/expert differences and suggests some ways to improve the development of expertise. Ch. 7 explores diverse conceptual models of technological processes for training process operators and examines the crucial question of how beginners decompose complex systems into manageable knowledge units. Ch. 8 proposes a methodology for designing training situations from real work situations, thereby improving knowledge transfer. Ch. 9 stresses the need to develop expertise for cooperation in highly coupled systems.
Section 3 proposes some ways to take up the main challenge in this domain - the design of genuine cooperation between human experts and the computers of the next century. Ch. 10 deals with the design of cooperative systems in complex dynamic environments. Ch. 11 shows that human intervention depends on operators' trust in automated systems and their self-confidence in their abilities as manual controllers. Ch. 12 stresses that human-machine cooperation has to handle human error, which is conceived as a product of conflict between the human and the physical or social artefacts. Ch. 13 combines the human engineering approach and the ergonomic approach to tackle the problem of cooperation between humans and intelligent support systems. Ch. 14 stresses the crucial role adaptation plays in the coupling between human and machine. Ch. 15 deals with the paradigm of "human-like" systems, which could improve human-machine cooperation. Ch. 16 presents a methodology to support human operator planning activities, taking the intentionality of industrial environments into consideration.
Jean-Michel Hoc Pietro C. Cacciabue Erik Hollnagel
CONTENTS

SERIES FOREWORD
EDITORS' FOREWORD
LIST OF CONTRIBUTORS

Chapter 1. Hollnagel, E., Cacciabue, P. C. & Hoc, J.-M.: Work with Technology: Some Fundamental Issues

Section 1: Cognition and Work with Technology
Chapter 2. Hoc, J.-M., Amalberti, R. & Boreham, N.: Human Operator Expertise in Diagnosis, Decision-Making, and Time Management
Chapter 3. Kjaer-Hansen, J.: Unitary Theories of Cognitive Architectures
Chapter 4. Cacciabue, P. C. & Hollnagel, E.: Simulation of Cognition: Applications
Chapter 5. Woods, D. D. & Roth, E.: Symbolic AI Computer Simulations as Tools for Investigating the Dynamics of Joint Cognitive Systems

Section 2: Development of Competence and Expertise
Chapter 6. Boreham, N.: Error Analysis and Expert-Novice Differences in Medical Diagnosis
Chapter 7. Samurcay, R.: Conceptual Models for Training
Chapter 8. Rogalski, J.: From Real Situations to Training Situations: Conservation of Functionalities
Chapter 9. Norros, L.: An Orientation-Based Approach to Expertise

Section 3: Cooperation between Humans and Computers
Chapter 10. Benchekroun, H., Pavard, B. & Salembier, P.: Design of Cooperative Systems in Complex Dynamic Environments
Chapter 11. Moray, N. P., Hiskes, D., Lee, J. & Muir, B. M.: Trust and Human Intervention in Automated Systems
Chapter 12. Rizzo, A., Ferrante, D. & Bagnara, S.: Handling Human Error
Chapter 13. Millot, P. & Mandiau, R.: Man-Machine Cooperative Organizations: Formal and Pragmatic Implementation Methods
Chapter 14. Hollnagel, E.: The Art of Efficient Man-Machine Interaction: Improving the Coupling Between Man and Machine
Chapter 15. Boy, G.: "Human-Like" System Certification and Evaluation
Chapter 16. Lind, M. & Larsen, M. N.: Planning Support and the Intentionality of Dynamic Environments

Chapter 17. Hollnagel, E., Hoc, J.-M. & Cacciabue, P. C.: Expertise and Technology: "I Have a Feeling We Are Not in Kansas Anymore"

SUBJECT INDEX