Dept. of Computer and Information Science, Linköping University

## IDA Technical Reports, 1992

Last updated: Tue, 02 Dec 1997 11:02:31

Andersson, N. and Fritzson, P. (1992). Comparative Evaluation and Industrial Application of Code Generator Generators. Technical Report LiTH-IDA-R-92-06, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: The past ten to fifteen years have seen active research in the area of automatically generating the code generator part of compilers from formal specifications. However, less work has been done on evaluating and applying these systems in an industrial setting. This paper attempts to fill this gap. Three systems for automatic generation of code generators are evaluated in this paper: CGSS, BEG and TWIG. CGSS is an older Graham-Glanville style system based on pattern matching through parsing, whereas BEG and TWIG are more recent systems based on tree pattern matching combined with dynamic programming. An industrial-strength code generator previously implemented for a special-purpose language using the CGSS system is described and compared in some detail to our new implementation based on the BEG system. Several problems of integrating local and global register allocation within automatically generated code generators are described, and some solutions proposed. We finally conclude that current technology of automatically generating code generators is viable in an industrial setting. However, further research needs to be done on the problem of properly integrating register allocation with instruction selection, when both are generated from declarative specifications.
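The tree-pattern-matching-plus-dynamic-programming idea behind systems like BEG and TWIG can be sketched in a few lines. This is an illustrative toy, not the algorithm or rule format of any of the three systems: the rule set, costs and names (`Node`, `RULES`, `select`) are invented.

```python
# Instruction selection by tree pattern matching with dynamic programming:
# each rule matches a small tree shape, and a bottom-up pass keeps the
# cheapest rule covering each node of the expression tree.
from dataclasses import dataclass, field

@dataclass
class Node:
    op: str                      # e.g. '+', 'const'
    kids: list = field(default_factory=list)
    best: tuple = None           # (cost, rule name), filled in by select()

# Hypothetical rule set: (name, root op, kid pattern, cost);
# None in the kid pattern means "any subtree, covered recursively".
RULES = [
    ("LOADI", "const", [], 1),
    ("ADD",   "+", [None, None], 1),      # add two register operands
    ("ADDI",  "+", [None, "const"], 1),   # add register + immediate
]

def select(n: Node) -> int:
    """Return the minimal cost of covering the subtree rooted at n."""
    for k in n.kids:
        select(k)
    best = None
    for name, op, kid_pat, cost in RULES:
        if op != n.op or len(kid_pat) != len(n.kids):
            continue
        total, ok = cost, True
        for pat, kid in zip(kid_pat, n.kids):
            if pat is None:
                total += kid.best[0]      # pay for covering the subtree
            elif pat != kid.op:
                ok = False                # pattern literal does not match
        if ok and (best is None or total < best[0]):
            best = (total, name)
    n.best = best
    return best[0]
```

An emitter would then walk the tree top-down reading off `best` at each node; the dynamic-programming pass guarantees a minimum-cost cover for this kind of cost model.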

Auguston, M. and Fritzson, P. (1992). Parforman - an Assertion Language for Specifying Behaviour when Debugging Parallel Applications. Technical Report LiTH-IDA-R-92-16, Department of Computer and Information Science, Linköping University, Sweden. Has been accepted to the EuroMicro workshop on Parallel and Distributed Processing, January 1993. (bibtex),

Abstract: PARFORMAN (PARallel FORMal ANnotation language) is a high-level specification language for expressing intended behaviour or known types of error conditions when debugging or testing parallel programs. The high-level debugging approach which is supported by PARFORMAN is model-based, similar to the EBBA system, or to Algorithmic Debugging. Models of intended or faulty behaviour can be succinctly specified in PARFORMAN. These models are then compared with the actual behaviour in terms of execution traces of events, in order to localize possible bugs. PARFORMAN can also be used as a general language for expressing computations over execution histories. PARFORMAN is based on a precise axiomatic model of target program behavior. This model, called H-space (History-space), is formally defined through a set of general axioms about three basic relations, which may or may not hold between two arbitrary events: they may be sequentially ordered (SEQ), they may be parallel (PAR), or one of them might be included in another composite event (IN). The general notion of composite event is exploited systematically, which makes possible more powerful and succinct specifications. The high-level notion of event-grammar is introduced to describe allowed event patterns over a certain application domain or language. Auxiliary composite events such as Snapshots are introduced to be able to define the notion "occurred at the same time" at suitable levels of abstraction. Finally, patterns and aggregate operations on events are introduced to make possible short and readable specifications. In addition to debugging and testing, PARFORMAN can also be used to specify profiles and performance measurements.

Bäckström, C. (1992). Planning with Partial States in O(n²) Time: The SAS+-PUS Planning Problem. Technical Report LiTH-IDA-R-92-05, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: We generalize the previously presented, tractable SAS-PUS planning problem to the SAS+-PUS planning problem. This generalization allows partial initial and goal states as well as actions changing a state variable from the undefined value to some defined value. We also present a sound and complete algorithm for finding minimal, maximally parallel plans for instances of the SAS+-PUS planning problem. Although the algorithm solves a more general problem it---somewhat surprisingly---improves over the earlier algorithms by running in O(n²) time in the worst case.

Busch, D. (1992). Sequent Formalizations of Three-valued Logic. Technical Report LiTH-IDA-R-92-13, Department of Computer and Information Science, Linköping University, Sweden. A revised version of "An Intuitionistic Three-valued Logic", delivered at the Logic Programming in AI First Compulog-Net Workshop, 23-24 March 1992, Imperial College London. To be presented at the Nordic Workshop and Summer School on Partial Semantics and Non-Monotonic Reasoning for Knowledge Representation, May 25-29 at Linköping. (bibtex),

Abstract: No abstract available

Dahlbäck, N. and Jönsson, A. (1992). An Empirically Based Computationally Tractable Dialogue Model. Technical Report LiTH-IDA-R-92-32, Department of Computer and Information Science, Linköping University, Sweden. Also in the proceedings of the 14th Annual Conference of the Cognitive Science Society (COG SCI-92), Bloomington, Indiana, July 29th to August 1st. (bibtex),

Abstract: No abstract available

Dahlbäck, N., Jönsson, A., and Ahrenberg, L. (1992). Wizard of Oz-studies - why and how. Technical Report LiTH-IDA-R-92-19, Department of Computer and Information Science, Linköping University, Sweden. Has been accepted to the International Workshop on Intelligent User Interfaces, Orlando, Florida, January 4-7, 1993. (bibtex),

Abstract: We first argue that Wizard-of-Oz studies have a special significance for the development of natural language dialogue systems and the understanding of the specific genre such systems induce. We then discuss the practical execution of Wizard-of-Oz experiments, drawing on experiences from some successful and some less successful studies of this kind. Finally, we discuss some potential drawbacks of the method, concluding that they can be avoided if the experiments are designed and carried out with sufficient care.

Doherty, P. (1992). A Constraint-Based Approach to Proof Procedures for Multi-Valued Logics. Technical Report LiTH-IDA-R-92-02, Department of Computer and Information Science, Linköping University, Sweden. Also published in 1st World Conference on the Fundamentals of Artificial Intelligence, Paris, France, 1-5 July 1991. (bibtex),

Abstract: In this article, we present a tableau-based theorem-prover for KL*, a three-valued logic, based on Kleene's strong definitions (KL). KL* is derived from KL by extending it with an additional weak negation connective denoting absence of truth. The resulting logic appears to be quite useful in the natural language area, but has had little application in artificial intelligence until recently. Although we are aware of a number of axiomatizations for KL*, the technique described in this article, which is based on generalizing the notion of a signed formula with a single sign to one with a set of signs ordered in a constraint lattice, has, to our knowledge, not yet been used in proof techniques for KL*. The use of a constraint lattice on truth-values emphasizes the reading of multiple truth-values as constraints on sentences rather than denotations. Once this view is taken, any set of elements with an appropriate constraint ordering may be incorporated into the technique, a very useful notion in the context of knowledge representation in AI.

Doherty, P., Driankov, D., and Hellendoorn, H. (1992). Fuzzy If-Then-Unless Rules and their Implementation. Technical Report LiTH-IDA-R-92-21, Department of Computer and Information Science, Linköping University, Sweden. Extended version of a paper accepted at the International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems, IPMU'92, Palma de Mallorca, July 6-10, 1992. (bibtex),

Abstract: We consider the possibility of generalizing the notion of a fuzzy If-Then rule to take into account its context dependent nature. We interpret fuzzy rules as modeling a forward directed causal relationship between the antecedent and the conclusion, which applies in most contexts, but on occasion breaks down in exceptional contexts. The default nature of the rule is modeled by augmenting the original If-Then rule with an exception part. We then consider the proper semantic correlate to such an addition and propose a ternary relation which satisfies a number of intuitive constraints described in terms of a number of inference rules. In the rest of the paper, we consider implementational issues arising from the unless extension and propose the use of reason maintenance systems, in particular TMS's, where a fuzzy If-Then-Unless rule is encoded into a dependency net. We verify that the net satisfies the constraints stated in the inference schemes and conclude with a discussion concerning the integration of qualitative IN-OUT labelings of the TMS with quantitative degree of membership labelings for the variables in question. A generalization to the TMS relabeling algorithm is proposed which would permit the modeling of reasoning with "generalized modus ponens".

Doherty, P., Driankov, D., and Tsoukias, A. (1992). Partiality, Para-consistency and Preference Modeling: Preliminary Version. Technical Report LiTH-IDA-R-92-18, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Existing non-monotonic formalisms are generally designed to capture a single criterion for determining preference among competing models. As pointed out by Jon Doyle, one must aggregate individual preference orders into a global order to obtain unified non-monotonic formalisms. As long as one considers preference modeling in terms of classical decision theory, any attempt at aggregating different criteria will violate certain commonsense postulates about preference orderings. Fortunately, recent advances in multiple-criteria decision theory permit the use of preference orderings not limited by the classical constraints of complete comparability and transitivity. The advantage is that one is able to model decision scenarios where information is incomplete and inconsistent. In addition, non-conventional preference modeling may provide a potential solution to the aggregation problem discussed by Doyle. One obstacle towards applying non-conventional preference modeling has been the lack of a sufficient axiomatization for non-conventional preference structures. The main contribution of this paper is to propose the use of a multi-valued para-consistent logic, WL4, as a means of providing a robust semantic theory for non-conventional preference modeling.

Doherty, P. and Lukaszewicz, W. (1992). FONML3 - A First-Order Non-monotonic Logic with Explicit Defaults. Technical Report LiTH-IDA-R-92-20, Department of Computer and Information Science, Linköping University, Sweden. This is an extended version of a paper published in the Proceedings of the 10th European Conference on Artificial Intelligence (ECAI 92), Vienna, Austria, Aug 3-7, 1992. (bibtex),

Abstract: In this article, we present FONML3, a first-order version of NML3, a propositional non-monotonic logic with explicit defaults, characterized by the following features: (1) The use of the strong Kleene three-valued logic as a basis. (2) The addition of an explicit default operator which permits distinguishing {\em tentative} conclusions from ordinary conclusions in the object language. (3) The use of preferential entailment to generate non-monotonic behavior. We also show that FONML3, besides having a number of attractive formal properties such as a cumulative consequence relation, also avoids a number of counter-intuitive limitations associated with existing non-monotonic formalisms which are due to the tight coupling between the minimization of abnormalities and the maximization of normalities. The tight coupling leads to a number of paradoxes which are absent in FONML3.

Eles, P., Kuchcinski, K., Peng, Z., and Minea, M. (1992). Compiling VHDL into a High-Level Synthesis Design Representation. Technical Report LiTH-IDA-R-92-04, Department of Computer and Information Science, Linköping University, Sweden. Also accepted for presentation at EURO-DAC (European Design Automation Conference), Hamburg, Germany, September 7-10, 1992. Selected as the Best Paper of the EURO-DAC/VHDL conference in Hamburg, Germany. (bibtex),

Abstract: The paper discusses the problem of extending the use of VHDL to the field of hardware synthesis. The main difficulty lies in the fact that the semantics of standard VHDL is defined strictly in terms of simulation. We present a synthesis-oriented compiler based on a broad subset of VHDL. We describe the language subset, the internal design representation (based on an extended timed Petri net model), and the implementation of the front end as part of the CAMAD Synthesis System. Because CAMAD supports the design of hardware with concurrency and asynchrony, our VHDL subset includes the concurrent features of the language. We state some important conclusions concerning how to deal with signals, wait statements, structured data, and subprograms from the specific point of view of synthesis. We also discuss the aspects of VHDL semantics that are strictly simulation oriented and should be redefined or ignored when dealing with synthesis.

Fagerström, J., Fritzson, P., and Pettersson, J. R. M. (1992). A Data-Parallel Language and its Compilation to a Formally Defined Intermediate Language. Technical Report LiTH-IDA-R-92-15, Department of Computer and Information Science, Linköping University, Sweden. In Proceedings of the 4th International Conference on Computing and Information, ICCI'92, Toronto, May 28-30, 1992. (bibtex),

Abstract: In this paper we present some work on language design and compilation techniques for data parallel languages, exemplified by a simple data parallel language called Predula. The goal is to provide users with a basically serial language in which inherent parallelism is expressed precisely enough for the compiler to generate efficient code for massively parallel machines. We also present the automatic generation of a compiler for a subset of Predula to data parallel vector code (VCode) from a denotational specification of a subset of the language. A formal specification of the intermediate form used by this specification is partially described using a two-level approach: the denotational semantics of Predula is expressed in terms of intermediate primitives from which final code is generated, and these primitives are defined using structured operational semantics. The denotational specification is expressed in DML (Denotational Meta Language), a language and system for generating practical compilers from denotational specifications. The results from the two-level semantic approach are encouraging; a small compiler has been generated and used to compile a small example program that was executed on a Sparcstation and a Cray Y-MP. However, we have found that the current stack-based VCode abstract machine is unsuitable as a target for compiling imperative languages for efficiency reasons. A more general abstract machine could be designed based on the vector primitives behind VCode.

Fahl, G. (1992). Integration of Heterogeneous Databases in a Mediator Architecture. Technical Report LiTH-IDA-R-92-23, Department of Computer and Information Science, Linköping University, Sweden. Also in the Proceedings of the Third Annual IDA Conference on Computer and Information Science, Linköping University, Sweden, March 1992. (bibtex),

Abstract: Future information systems will require techniques to integrate data from heterogeneous data sources and provide shared use of operations operating on the data. This paper describes the idea of a mediator architecture to accomplish this. A mediator is a sharable software component placed in a layer between data sources and the applications using them. We briefly describe four classes of mediators and then concentrate on one of them: integration mediators. We give an overview of how integration is usually achieved in current research systems, and how the mediator approach relates to this. Integration mediators have a lot in common with tightly coupled federated database systems. Finally, our plans for a research project to investigate various issues of the mediator approach are outlined. We will initially concentrate on building integration mediators.

Goldkuhl, G. (1992). Information Systems Design as Argumentation - An Investigation into Design Rationale as a Conceptualization of Design. Technical Report LiTH-IDA-R-92-40, Department of Computer and Information Science, Linköping University, Sweden. Accepted to the 14th IRIS (Information Systems Research Seminar in Scandinavia), Umeå, 11-14 August 1991. (bibtex),

Abstract: Design Rationale (DR) is an approach to justifying design decisions. Different design alternatives are evaluated according to criteria. In this process different arguments behind options are described. The paper investigates one approach to Design Rationale developed at Xerox EuroPARC. This approach (concepts and notation) is critically studied. The paper also contains proposals for modification and extension of this DR approach. The critical analysis is performed from a communicative action perspective.

Goldkuhl, G. (1992). Stöd och Struktur i Systemutvecklingsprocess. Technical Report LiTH-IDA-R-92-41, Department of Computer and Information Science, Linköping University, Sweden. Accepted to "Systemutveckling i praktisk belysning", Dataföreningen, Norrköping, 19 November 1991. (bibtex),

Abstract: Various aids are needed when developing computer-based information systems; models, methods and computer tools are often used. The report provides a frame of reference for these aids and for how they can interact in the systems development process. Several historical trends are reviewed: the emergence of different development methods, the increased use of tools, and consulting models that combine different methods. The report also discusses some possible future trends: the growing importance of perspectives, oscillation between standardization and innovation, situational adaptation of methods, self-assembled method packages, and adaptation of CASE tools.

Goldkuhl, G. and Röstlinger, A. (1992). Att bygga in verksamhetskvalitet i informationssystem. Technical Report LiTH-IDA-R-92-39, Department of Computer and Information Science, Linköping University, Sweden. Accepted to the conference "Sundsvall 42", Dataföreningen, Sundsvall, 15-17 October 1991. (bibtex),

Abstract: The report deals with quality assurance during systems development. A comparison is made with quality assurance in the manufacturing industry. The report argues for the importance of building in quality from the start. The concept of business quality (verksamhetskvalitet) is discussed, and the report describes how contextual business analysis according to the SIM method can be used to build business quality into information systems.

Hirsch, R. (1992). About Belief: Representation of Content, Change and Development. Technical Report LiTH-IDA-R-92-29, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: The paper gives a concise overview and discussion of the subject-area of belief, representation of belief content, change of belief, and development of belief in communication as may be relevant for the construction of models of human cognitive activity as, for instance, in theory building behind models of the user in man-machine interaction. The paper consists of six major sections: 1) What is a belief? 2) Content of belief, 3) Representation of content of belief, 4) Relations between beliefs, 5) Change of belief, and 6) Communication and belief.

Holmgren, H., Timpka, T., Goldkuhl, G., Nyce, J., and Sjöberg, C. (1992). Argumentative Design of Medical Software: an Essential Part of Action Design. Technical Report LiTH-IDA-R-92-28, Department of Computer and Information Science, Linköping University, Sweden. Accepted to MEDINFO'92, Geneva, Switzerland, September 1992. (bibtex),

Abstract: Action Design is a method for participatory development of information systems for health care. Like all system development methods that rely strongly on user participation it has been criticized for lack of rigor. As a response to this criticism an elaboration within the Action Design framework, Argumentative Design, is introduced. This method makes the argumentation and justification aspects of design discussions explicit through a notational system which is used to represent the design space of a system. This representation, the design rationale for the system, is seen as a co-product of the design process. Action Design supports all aspects of participative system development, but in this paper the main concern is with the later, iterative phases of development. Activity theory is introduced as a theoretical frame of reference for these phases, which are crucial for meeting extreme demands on the performance of information systems in medical settings.

Lambrix, P. (1992). Management of Historical Information of Composite Objects (Master's Thesis No. 1). Technical Report LiTH-IDA-R-92-11, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: An important aspect of object-oriented database systems is the ability to build up composite objects from object parts. This allows modularity in the representation of objects and reuse of parts where appropriate. It is also generally accepted that object-oriented database systems should be able to handle temporal data. However, little theoretical work has been done on the temporal behavior of composite objects, and only relatively few systems attempt to incorporate both historical information and composite objects in a multi-user environment. In this work we make a first step in formalizing historical information of composite objects. We identify different kinds of compositions and give formal synchronization rules between a composition and its components to induce the desired behavior of these compositions in a database setting. Further, we argue that the support for handling temporal information provided by other systems addresses only one of two important kinds of historical information. We describe the notions of temporal history and edit history. Finally we address the problem of maintaining consistent historical information of a composition using the historical information of its components. This problem occurs as a result of sharing objects between several compositions. We propose a solution in the LINCKS system.

Larsson, T. I. and Vainio-Larsson, A. A. (1992). Software Producers as Software Users. Technical Report LiTH-IDA-R-92-08, Department of Computer and Information Science, Linköping University, Sweden. Presented at the NATO Workshop on Software Requirements for Software Environments, Toulouse, France, 5-11 September 1991. (bibtex),

Abstract: Within the framework of a large industrial software development project, designers' needs and requirements on software engineering environments have been studied. The study focussed on software designers as being not only software producers but also users of software. The results show that in order to improve productivity and quality, improvements of the methodologies and tools that support co-work and document traceability are often more essential than many other improvements. In the paper, the concept of 'user-centered requirements' and its implications for design work are developed from a bifocal perspective: the designers' and the potential customers'. From the standpoint that design should be requirement-driven, methodological implications for productive, user-centered software engineering environments are traced and discussed.

Litwin, W. and Risch, T. (1992). Main Memory Oriented Optimization of OO Queries using Typed Datalog with Foreign Predicates. Technical Report LiTH-IDA-R-92-24, Department of Computer and Information Science, Linköping University, Sweden. Accepted for publication in IEEE Transactions on Knowledge and Data Engineering, in a special section on main memory databases. (bibtex),

Abstract: Object-oriented DBMSs (OODBs) have created a demand for relationally complete, extensible, and declarative object-oriented (OO) query languages. Until now, run time performance of such languages was far behind that of procedural OO interfaces. One reason is the internal use of a relational engine with magnetic disk resident databases. We address the processing of the declarative OO language WS-OSQL, provided by the fully operational prototype OODB called WS-IRIS. A WS-IRIS database is MM resident. The system architecture, data structures, and optimization techniques are designed accordingly. WS-OSQL queries are compiled into an OO extension of Datalog called ObjectLog, providing for objects, typing, overloading, and foreign predicates for extensibility. We present cost based optimizations in WS-IRIS using ObjectLog. Performance tests show that WS-IRIS is about as fast as current OODBs with procedural interfaces only and is much faster than known relationally complete systems. These results would not be possible for a traditional disk based implementation. However, MM residency of a database appears only a necessary condition for better performance. An efficient optimization proves of crucial importance as well.

Malec, J. (1992). Complex Behavior Specification for Autonomous Systems. Technical Report LiTH-IDA-R-92-14, Department of Computer and Information Science, Linköping University, Sweden. Accepted to the IEEE International Symposium on Intelligent Control '92, Glasgow, Scotland, August 11-13, 1992. (bibtex),

Abstract: No abstract available

Malec, J. (1992). Process Transition Networks: The Final Report. Technical Report LiTH-IDA-R-92-07, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Process Transition Networks (PTNs), introduced by Sandewall [San90b], are a graphical language to describe complex qualitative processes: processes which are difficult to express quantitatively even with the help of causal dependencies. Road or air traffic monitoring, process control, or computer systems maintenance are some examples of possible application areas for PTNs. The language primitives are introduced informally, followed by a formal definition of the syntax and semantics of the language. The relation between PTNs and other representation languages (Petri nets, and elementary feature logic) is considered, and it is shown that although PTNs have no more expressive power than Petri nets, their flexibility as a representation language for complex processes justifies their existence. This report contains the results published as [Mal91c] and [Mal91d].

Merkel, M. (1992). Recurrent Patterns in Technical Documentation. Technical Report LiTH-IDA-R-92-31, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: This paper addresses some of the problems involved in the production and translation of technical documentation. The techniques and methods developed within Natural Language Processing in general and Machine Translation in particular still have a long way to go before we can see any commercial products general enough to automatically translate unrestricted text. Instead of merely aiming for the perfect MT system, we should also focus on how to make use of existing and simple techniques and the capacity of today's hardware to make the production of technical documentation faster, better and cheaper. Even a twenty per cent gain in efficiency compared to manual translation is considerable by any industry standard. In this paper I describe a tool that pre-processes the source text and provides various kinds of information to support the decision whether translation tools should be applied at all. Examples from analyses show that up to 43 per cent of a text can be repetitious, and that this should be utilised before the translator starts translating. If we consider both repetitions within one document and repeated patterns across documents, there is evidence in the corpus that 55 per cent of the text in one document can be regarded as recurring. The tool has been run on several real handbook texts from major computer software companies and a summary of the results is presented.
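Repetition statistics of the kind quoted above (43 and 55 per cent) can be computed with a simple pass over sentence-segmented text. A hedged sketch, not the report's tool: `recurrence_ratio` is an invented name, and a real pre-processor would normalise much more aggressively.

```python
def recurrence_ratio(sentences, seen=None):
    """Fraction of sentences that repeat an earlier sentence.

    `seen` may carry sentences from previously processed documents,
    so cross-document recurrence can be measured as well.
    """
    seen = set(seen or ())
    repeated = 0
    for s in sentences:
        key = " ".join(s.lower().split())   # normalise case and whitespace
        if key in seen:
            repeated += 1
        else:
            seen.add(key)
    return repeated / len(sentences)
```

Feeding the output of one manual into the `seen` set of the next captures the "repeated patterns across documents" case the abstract distinguishes from within-document repetition.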

Näslund, T. (1992). Evaluation as a Means for Improving Development of Decision Support Systems. Technical Report LiTH-IDA-R-92-01, Department of Computer and Information Science, Linköping University, Sweden. Presented at the Doctoral Consortium, 11th International Conference on Information Systems, Copenhagen, Denmark, December 17-19, 1990. (bibtex),

Abstract: The aim of decision support systems is to support a decision-maker, rather than replicate or interfere with him or her. This is often hard to achieve at once, and iterative development is often advocated as a remedy. We argue that it is important to investigate the role of evaluation in this development paradigm. It is the evaluation activity that helps us reduce uncertainty about the effectiveness of the system, and gives us information for continuous adaptation of the system to its intended use. In the paper we describe a framework for the development of decision support systems, and discuss our intended research activities for establishing the use of a proper evaluation methodology in this framework.

Nilsson, H. and Fritzson, P. (1992). Algorithmic Debugging for Lazy Functional Languages. Technical Report LiTH-IDA-R-92-17, Department of Computer and Information Science, Linköping University, Sweden. In Proceedings of PLILP'92, Symposium on Programming Language Implementation and Logic Programming, LNCS, Springer Verlag, Leuven, Belgium, August 26-28, 1992. Accepted to the Journal of Functional Programming, 1993. (bibtex),

Abstract: Lazy functional languages have non-strict semantics and are purely declarative, i.e. they support the notion of referential transparency and are devoid of side effects. Traditional debugging techniques are, however, not suited for lazy functional languages since computations generally do not take place in the order one might expect. Since algorithmic debugging allows the user to concentrate on the declarative aspects of program semantics, and will semi-automatically find functions containing bugs, we propose to use this technique for debugging lazy functional programs. In this paper we present an algorithmic debugger for a lazy functional language and some experience in using it. Because of the non-strict semantics of lazy functional languages, arguments to functions are in general partially evaluated expressions. The user is, however, usually more concerned with the values that these expressions represent. We address this problem by providing the user with a strictified view of the execution trace whenever possible. We believe, supported by [2], that this is the first algorithmic debugger for a lazy functional language.
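The algorithmic debugging loop the abstract builds on can be sketched independently of any particular language: record the tree of function calls made during a run, then search it with an oracle (normally the user) for the topmost call whose result is wrong while all of its children's results are right. A toy version with invented names; the actual debugger works on a strictified trace of a lazy program, which this sketch does not model.

```python
def find_bug(call, oracle):
    """Locate the buggy function in an execution tree.

    call   = (name, args, result, children), children being calls made
             while computing this result.
    oracle = function deciding whether a recorded result is correct.
    Returns the name of the topmost function whose result is wrong
    although all of its children behaved correctly, or None.
    """
    name, args, result, children = call
    if oracle(name, args, result):
        return None                     # this subtree behaves correctly
    for child in children:
        culprit = find_bug(child, oracle)
        if culprit is not None:
            return culprit              # bug is deeper in the tree
    return name                         # wrong result, correct children
```

The semi-automatic character of the technique comes from the oracle: the debugger chooses which questions to ask, the user only judges individual results.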

Persson, J. (1992). Generation of Multi-Resolution Maps. Technical Report LiTH-IDA-R-92-33, Department of Computer and Information Science, Linköping University, Sweden. Awarded the prize "Lilla Polhemspriset" 1992. (bibtex),

Abstract: In geographical information systems (GIS), the need for hierarchically organized data structures has evolved. The reason for this stems from the fact that the map information usually is extremely large and that low resolution maps in most applications require less heavy computations than maps represented in higher resolutions. Hierarchical spatial data structures for generation of maps in lower resolutions already exist, e.g. quad-trees and resolution pyramids. Many other spatial data structures that are non-hierarchical do not permit the generation of resolution hierarchies like the two above. One such structure is the run-length code (RLC), which has many powerful advantages that make the structure feasible in a GIS. For that reason a method for generation of resolution hierarchies based on RLC is needed. In this work an approach to the problem of generating a resolution hierarchy from RLC is described and discussed. A generalisation method based on recursive polygon approximation is described in detail.
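Two of the ingredients mentioned, run-length coding a raster and deriving a lower-resolution version of it, can be illustrated on a single row of binary pixels. A minimal sketch, not the report's generalisation method (which works on polygons, not rows):

```python
def rle(row):
    """Encode a row of 0/1 pixels as (value, run length) pairs."""
    runs = []
    for p in row:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1            # extend the current run
        else:
            runs.append([p, 1])         # start a new run
    return [tuple(r) for r in runs]

def halve(row):
    """Half-resolution row: each output pixel ORs a 2-pixel block,
    so any feature present in the block survives downsampling."""
    return [max(row[i], row[i + 1]) for i in range(0, len(row) - 1, 2)]
```

A resolution pyramid repeats the `halve` step per level; the point of the report is to obtain such hierarchies without leaving the compact RLC representation.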

Loborg, P., Holmbom, P., Sköld, M., and Törne, A. (1992). A Model for the Execution of Task Level Specifications for Intelligent and Flexible Manufacturing Systems. Technical Report LiTH-IDA-R-92-22, Department of Computer and Information Science, Linköping University, Sweden. Accepted at the V International Symposium on Artificial Intelligence, ISAI92, Cancun, Mexico, Dec 7-11, 1992. Accepted for publication in Integrated Computer-Aided Engineering (special issue on AI in Manufacturing and Robotics). (bibtex),

Abstract: No abstract available

Pettersson, M. (1992). A Term Pattern-Match Compiler Inspired by Finite Automata Theory. Technical Report LiTH-IDA-R-92-09, Department of Computer and Information Science, Linköping University, Sweden. Also in the proceedings of The 4th International Conference on Compiler Construction (CC'92), Paderborn, Germany, October 5-7, 1992, LNCS-641, Springer Verlag. (bibtex),

Abstract: This paper presents a new algorithm for compiling term pattern-matching for functional languages. Earlier algorithms may produce duplicated code, and redundant or sub-optimal discrimination tests for certain combinations of patterns, in particular when a pattern column contains a mixture of constructors and variables. This algorithm, which was inspired by finite automata theory, addresses these problems and solves them to some extent. It does so by viewing patterns as regular expressions, and optimising the finite automaton that is built to recognise them. It also makes checking patterns for exhaustiveness and irredundancy cheap operations.
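The key idea in the abstract, replacing sequential per-pattern tests with a single discrimination on the head constructor, can be sketched in miniature. The rule format, constructor names, and flat (non-nested) patterns are invented for illustration; the paper's algorithm builds a full finite automaton over nested term patterns.

```python
# Hedged sketch of one idea from the abstract: dispatch on the head
# constructor once, instead of testing each pattern row in turn.
# Flat patterns only; nested patterns and automaton minimisation
# (the paper's actual contribution) are omitted.

WILDCARD = "_"

def compile_dispatch(rules):
    """Group rules by head constructor; wildcard rows apply everywhere."""
    table = {}
    default = []
    for pattern, action in rules:
        if pattern == WILDCARD:
            default.append(action)
        else:
            table.setdefault(pattern, []).append(action)
    return table, default

def match(term, table, default):
    """One table lookup selects the applicable rules for this constructor."""
    actions = table.get(term, []) + default
    return actions[0] if actions else None

rules = [("Nil", "empty"), ("Cons", "nonempty"), ("_", "fallback")]
table, default = compile_dispatch(rules)
```

Exhaustiveness checking then amounts to asking whether `default` is non-empty or `table` covers every constructor of the type, which hints at why the automaton view makes such checks cheap.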

Risch, T. and Sköld, M. (1992). Active Rules based on Object-Oriented Queries. Technical Report LiTH-IDA-R-92-35, Department of Computer and Information Science, Linköping University, Sweden. To be published in special issue on Active Databases in IEEE Data Engineering. (bibtex),

Abstract: We present a next-generation object-oriented database with active properties, obtained by introducing rules into OSQL, an Object-Oriented Query Language. The rules are defined as Condition Action (CA) rules and can be parameterized, overloaded and generic. The condition part of a rule is defined as a declarative OSQL query and the action part as an OSQL procedure body. The action part is executed whenever the condition becomes true. Rule execution is supported by a rule compiler that installs log-screening filters and uses incremental evaluation of the condition part. The action part is executed in a check phase, which can occur after any OSQL command in a transaction or at the end of the transaction. Rules are first-class objects in the database, which makes it possible to pose queries over rules. We present some examples of rules in OSQL, some implementation issues, some expected results and some future work such as temporal queries and real-time support.
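The condition-action pattern with a deferred check phase can be illustrated by a toy engine. The class names, rule syntax, and the naive re-evaluation of conditions (instead of the paper's incremental, log-filtered evaluation) are assumptions made for the sketch; this is not OSQL.

```python
# Illustrative sketch of CA rules with a check phase, loosely mirroring
# the abstract. Conditions are re-evaluated naively here; the described
# system instead screens the transaction log and evaluates incrementally.

class Rule:
    def __init__(self, name, condition, action):
        self.name = name
        self.condition = condition  # callable: data -> bool (the "query")
        self.action = action        # callable: data -> None (the "procedure")

class ActiveDB:
    def __init__(self):
        self.data = {}
        self.rules = []

    def put(self, key, value):
        self.data[key] = value

    def check_phase(self):
        """Run after any command or at end of transaction:
        fire every rule whose condition currently holds."""
        for rule in self.rules:
            if rule.condition(self.data):
                rule.action(self.data)

db = ActiveDB()
db.rules.append(Rule(
    "low_stock",
    condition=lambda d: d.get("stock", 0) < 5,
    action=lambda d: d.__setitem__("reorder", True)))
db.put("stock", 3)
db.check_phase()   # condition holds, so the action sets d["reorder"]
```

Because rules here are ordinary objects in a list, querying over them (e.g. filtering by name) falls out for free, echoing the abstract's "rules are first-class objects" point.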

Sandewall, E. (1992). Causal Qualification and Structure-Based Ramification. Technical Report LiTH-IDA-R-92-42, Department of Computer and Information Science, Linköping University, Sweden. Accepted to The Second Symposium on Logical Formalizations of Commonsense Reasoning, Austin, TX, USA, January 11-13, 1993. (bibtex),

Abstract: No abstract available

Sandewall, E. (1992). Features and Fluents. A Systematic Approach to the Representation of Knowledge about Dynamical Systems. Technical Report LiTH-IDA-R-92-30, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: The report proposes and uses a systematic methodology for the presentation, analysis, and comparison of logics of action and change, in particular logics that deal with the "frame problem". The methodology is characterized by three major aspects: the use of an underlying semantics for characterizing systems with inertia (temporal persistence); the use of an ontological taxonomy which is strictly defined in terms of the underlying semantics, and which characterizes classes of dynamical systems with specific restrictions on them; and formally proven assessments of the range of applicability of various (non-monotonic) entailment criteria. The main results in the report are assessment results for a number of logics of time and action, including both logics that have been proposed before and logics defined here. The report is a second review version of a forthcoming book on the logic of reasoning about action and time, and on various aspects of the frame problem. The report corresponds to the first part of the intended book.

Shu, H. (1992). A Preferential Logic for Reasoning with Goals and Intentions. Technical Report LiTH-IDA-R-92-34, Department of Computer and Information Science, Linköping University, Sweden. Accepted to the LOGIC & CHANGE workshop at GWAI'92, Bonn, September 3rd, 1992. (bibtex),

Abstract: This paper presents a preferential logic to be used for reasoning with goals and intentions. A preferential-model approach provides a proper framework to integrate both logical and subjective criteria adopted by a rational agent reasoning with incomplete information. However, the subjective criteria considered in the previous non-monotonic logics are rather limited. The intent of this work is the recognition of additional subjective criteria, in particular those relevant to goals and intentions. The preferential logic defined in this paper is based on a frame structure where, apart from three accessibility relations about beliefs, goals and intentions, three preferential relations on the respective accessible models are specified. Three preferential consequence relations are defined, with different degrees of integration of the preferential relations, and thus reflect different levels of subjectivism. The properties of restricted monotonicity provide some formal principles for a reactive planner to improve performance by minimizing plan revision. The development of this logic is an attempt to enrich the formal methods needed to model reasoning activities behind reactive planning.

Sjöberg, C., Timpka, T., Nyce, J. M., Peolsson, M., and af Klercker, T. (1992). From Clinical Literature to Medical Hypermedia: Procedures and Experiences. Technical Report LiTH-IDA-R-92-25, Department of Computer and Information Science, Linköping University, Sweden. Also accepted to MEDINFO'92, Geneva, Switzerland, September 1992. (bibtex),

Abstract: This paper describes a procedure for converting clinical literature into a non-sequential corpus for hypermedia. It describes experiences with creating a corpus of information to support interprofessional work at a primary health care centre. The paper looks at the relationship between information resources clinicians use and the implications this has for the structure of a hypermedia resource designed to support clinical decision-making. This work raises a number of issues about how knowledge and text have been represented in the medical informatics literature.

Strömberg, J.-E., Söderman, U., and Top, J. (1992). Bond Graph Supported Estimation of Discretely Changing Parameters. Technical Report LiTH-IDA-R-92-37, Department of Computer and Information Science, Linköping University, Sweden. Accepted to the 1993 International Conference on Bond Graph Modeling and Simulation (ICBGM '93), San Diego, USA, January 1993. (bibtex),

Abstract: A key factor in parameter estimation is the selection of model structure within which the fit to observed data is to be achieved. For linear systems, bond graphs have proven to give excellent support in this respect. However, for systems undergoing abrupt transitions between different (linear) modes of behaviour, the situation is far more complex. To handle this, we exploit the features of the ideal switching element as introduced by Strömberg, Top and Söderman. This solves the question of physical modelling as well as how to derive the supporting information. The approach is demonstrated on a non-trivial example using a multiple Kalman filter estimator.

Strömberg, J.-E., Top, J., and Söderman, U. (1992). Modelling Mode Switching in Dynamical Systems. Technical Report LiTH-IDA-R-92-38, Department of Computer and Information Science, Linköping University, Sweden. Accepted to ECC-93, the 2nd European Control Conference, Groningen, The Netherlands. (bibtex),

Abstract: Finding uniform formalisms to represent mode switching in models of dynamical systems is a goal that has long been striven for, not least within the automatic control community. Here we claim that the main reason this goal has not been reached is the tendency to confound modelling and computation. We conjecture that the level of computation, i.e. the mathematical representation, is too low a level to yield any hope of a solution. Thus we have chosen to seek the solution at the physical level of representation, here by means of the bond graph formalism. By carefully defining a new ideal bond graph switching element, we have ensured a clear distinction between the physical and computational levels. With this minor addition to the original formalism, we maintain the abstraction level of bond graphs and hence keep all their conceptual properties, providing powerful guidelines to systematise the modelling process and tools to perform model validation already at the conceptual level.

Strömberg, J.-E., Top, J., and Söderman, U. (1992). Variable Causality in Bond Graphs Caused by Discrete Effects. Technical Report LiTH-IDA-R-92-36, Department of Computer and Information Science, Linköping University, Sweden. Accepted to the 1993 International Conference on Bond Graph Modeling and Simulation (ICBGM '93), San Diego, USA, January 1993. (bibtex),

Abstract: Current approaches to the problem of switching between modes in continuous dynamic system models tend to confound modelling and computation. Here we introduce an alternative approach that ensures a clear distinction between the physical and computational levels. Our method is centered around the idea that the variability of causal directions should be accepted rather than alleviated. Hence, we define an ideal bond graph switching element to deal with this variability in a uniform and systematic way within bond graphs. We hereby maintain the abstraction level of the bond graph language as well as the possibility to perform initial model validation directly in the graph.

Timpka, T. and Nyce, J. M. (1992). Towards a Pragmatics for Medical Hypermedia Systems. Technical Report LiTH-IDA-R-92-26, Department of Computer and Information Science, Linköping University, Sweden. Accepted to the MEDINFO'92 Geneva, Switzerland, September 1992. (bibtex),

Abstract: No abstract available

Timpka, T., Nyce, J. M., Sjöberg, C., Hedblom, P., and Lindblom, P. (1992). Developing a Clinical Hypermedia Corpus: Experiences from the use of a Practice-centered Method. Technical Report LiTH-IDA-R-92-27, Department of Computer and Information Science, Linköping University, Sweden. Accepted to SCAMC'92, Baltimore, Maryland, November 1992. (bibtex),

Abstract: This paper outlines a practice-centered method for creation of a hypermedia corpus. It also describes experiences with creating such a corpus of information to support interprofessional work at a Primary Health Care Centre. From these experiences, a number of basic issues regarding information systems development within medical informatics will be discussed.

Ubar, R. (1992). Functional Level Testability Analysis for Digital Circuits. Technical Report LiTH-IDA-R-92-03, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: The paper presents a general approach for calculating controllability and observability measures. These measures are applicable to sequential and combinational circuits represented at the functional level. The developed methods and algorithms are based on alternative graphs (AGs), an extension of binary decision diagrams in which integer as well as Boolean variables and functions are used. The algorithms developed for AGs are general and can easily be adjusted to calculate different types of testability measures. It is shown that controllability and observability cannot be treated as absolute measures because they are tightly related to the testing method used. Testability measures for different testing environments are defined, and their calculation is supported by corresponding algorithms. A formula for calculating combinational controllability allows initiability and different heuristic controllability measures to be regarded as components of the probabilistic controllability.
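One common reading of probabilistic combinational controllability, the probability that a node can be driven to 1 under uniformly random inputs, can be shown by exhaustive enumeration on tiny gates. This brute-force definition is an illustration only; the report's contribution is computing such measures efficiently on alternative graphs, not by enumeration.

```python
# Illustrative sketch (not the report's AG-based algorithms):
# 1-controllability of a gate output, i.e. the fraction of input
# vectors that drive the output to 1, enumerated exhaustively.

from itertools import product

def controllability_1(func, n_inputs):
    """Fraction of all 2^n input vectors driving func's output to 1."""
    hits = sum(func(*bits) for bits in product((0, 1), repeat=n_inputs))
    return hits / 2 ** n_inputs

and_gate = lambda a, b: a & b   # 1 on exactly one of four input vectors
or_gate = lambda a, b: a | b    # 1 on three of four input vectors

c_and = controllability_1(and_gate, 2)  # 0.25
c_or = controllability_1(or_gate, 2)    # 0.75
```

Enumeration is exponential in the number of inputs, which is precisely why decision-diagram representations such as the AGs described above are attractive for computing these measures on real circuits.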