Dept. of Computer and Information Science, Linköping University

## IDA Technical Reports, 1995

Last updated: Tue, 02 Dec 1997 11:02:50

Axelsson, J. (1995). Analysis and Improvement of Task Schedulability in Hardware/Software Codesign. Technical Report LiTH-IDA-R-95-05, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Many real-time systems have timing requirements that are difficult to fulfil if the system is implemented by software running on a microprocessor. One way to remedy this problem is to implement the most time-critical parts in application-specific integrated circuits instead. Hardware/software codesign aims at providing support for the designer of such a heterogeneously implemented system, and especially at finding ways to determine what parts should be implemented in what technology. In this report, we discuss an approach to codesign which has the objective of implementing a real-time system so that it meets its deadlines. The main result presented is a schedulability evaluation method, and we describe how it can be used to guide the partitioning of the system behaviour onto the different components of a hardware architecture.

Axelsson, J. (1995). Analysis and Improvement of Task Schedulability in Hardware/Software Codesign. Technical Report LiTH-IDA-R-95-24, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Many real-time systems have timing requirements that are difficult to fulfil if the system is implemented by software running on a microprocessor. One way to remedy this problem is to implement the most time-critical parts in application-specific integrated circuits instead. Hardware/software codesign aims at providing support for the designer of such a heterogeneously implemented system, and especially at finding ways to determine what parts should be implemented in what technology. In this report, we discuss an approach to codesign which has the objective of implementing a real-time system so that it meets its deadlines. The main result presented is a schedulability evaluation method, and we describe how it can be used to guide the partitioning of the system behaviour onto the different components of a system hardware architecture.

Axelsson, K. (1995). Centralized or Decentralized Responsibility for Information Systems? Consequences of two Different Strategies. Technical Report LiTH-IDA-R-95-29, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: When structuring their information systems, some organizations choose to follow an explicit information systems strategy. An important question during the strategy realization is how the responsibility for the systems should be distributed within the organization. Two strategies, with different approaches to this issue, have been examined in six case studies. One of the strategies suggests that the information system responsibility should be held by a central data function separated from the business functions, while the other strategy proposes that each business function should be responsible for its own information systems. The result shows that both strategies have problems when it comes to distribution of responsibility, according to the theoretical description of each strategy. In reality, the responsibility often ends up at the data function or in a similar department, regardless of what the intentions were. One reason is that both users and managers are playing very passive parts in the realization of the strategy. In the conclusions of the paper factors are presented that constrain as well as enable the distribution of responsibility within organizations.

Axelsson, K. (1995). Realization of a Decentralized IS-Strategy. Technical Report LiTH-IDA-R-95-28, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Today almost all organizations depend on information technology. It is therefore of great importance to structure the information systems in a satisfactory way. Different strategies for structuring information systems in order to create an information systems architecture have been used for several years. This paper discusses an IS-strategy that enables each business function of a decentralized organization to co-operate with other functions without losing its autonomy. The IS-strategy could be described as an activity-based, functional approach. Three case studies in different organizations, with the strategy in common, have been conducted. Experiences of successes and failures of the strategy realization in these organizations are presented and analysed in the paper. The conclusions show that the IS-strategy can give substantial support to decentralized organizations. To achieve this result, user participation during system development cannot be disregarded. Otherwise problems with distribution of responsibility for the information systems almost inevitably occur. In decentralized organizations it is of great importance to take such problems into serious consideration.

Bäckström, C. (1995). Expressive Equivalence of Planning Formalisms. Technical Report LiTH-IDA-R-95-03, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: A concept of expressive equivalence for planning formalisms based on polynomial transformations is defined. It is argued that this definition is reasonable and useful both from a theoretical and from a practical perspective; if two languages are equivalent, then theoretical results carry over and, more practically, we can model an application problem in one language and then easily use a planner for the other language. In order to cope with the problem of exponentially sized solutions for planning problems an even stronger concept of expressive equivalence is introduced, using the novel {\em ESP-reduction}. Four different formalisms for propositional planning are then analyzed, namely two variants of STRIPS, ground TWEAK and the SAS$^+$ formalism. Although these may seem to exhibit different degrees of expressive power, it is proven that they are, in fact, expressively equivalent under ESP-reduction. This means that neither negative goals, partial initial states nor multi-valued state variables increase the expressiveness of `standard' propositional STRIPS.

Bäckström, C. (1995). Five Years of Tractable Planning. Technical Report LiTH-IDA-R-95-32, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: We summarize the results from the first five years of a project aiming at identifying tractable classes of planning problems and at investigating sources of computational difficulty in planning. The paper is a non-formal survey, also including historical remarks on the background of the project as well as discussion and motivation of the underlying assumptions, the methodology and the intended applications.

Bäckström, C. and Jonsson, P. (1995). Planning with Abstraction Hierarchies Can Be Exponentially Less Efficient. Technical Report LiTH-IDA-R-95-12, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: It is well-known that state abstraction can speed up planning exponentially, under ideal conditions. We add to the knowledge by showing that state abstraction may likewise slow down planning exponentially, and even result in generating an exponentially longer solution than necessary. This phenomenon can occur for abstraction hierarchies which are generated automatically by the ALPINE and HIGHPOINT algorithms. We further show that there is little hope of any drastic improvement upon these algorithms: it is computationally difficult to generate abstraction hierarchies which allow finding good approximations of optimal plans.

Coradeschi, S. (1995). Reasoning with Unreliable Observations in the Features and Fluents Framework. Technical Report LiTH-IDA-R-95-44, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: In this work we consider a way to deal with the problem of unreliable observations when an agent is reasoning about dynamical systems as they are formalized and systematically studied in Sandewall's approach to reasoning about action and change. The presence of an incorrect observation can be detected when it generates a contradiction. In this case a revision function resolves the inconsistency by constructing alternative consistent descriptions of the system. However, revision is in general expensive, and we therefore define a delayed revision that postpones revision for some steps; we prove that it gives results similar to immediate revision. We also examine some forms of preferential revision that reduce the number of alternative descriptions of the system. Finally, we consider the relations between our work and Gärdenfors' approach to belief revision.

Cronholm, S. (1995). Why CASE Tools in Information Systems Development? - An Empirical Study Concerning Motives for Investing in CASE Tools. Technical Report LiTH-IDA-R-95-37, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Nowadays, different kinds of tools exist for developing computer-based information systems (IS). This study deals with tools for the earlier phases (analysis, design), so-called upper-CASE tools. CASE tools usually contain support for one or more ISD-methods. Working with CASE tools means that the ISD-work is both method-based and computer-based. This paper is concerned with the motives for investments in CASE tools. Why do organizations invest in CASE tools? What objectives are the organizations aiming at? This study is also interested in whether the stated objectives are reached. Are the motives for using CASE tools fulfilled?

Degerstedt, L. (1995). Declarative Development of Abstract Machines -- A Case Study in Bottom-up Execution for Logic Databases. Technical Report LiTH-IDA-R-95-45, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: We propose a new declarative method for development of abstract machines. Using this method the abstract machine can be designed by means of stepwise refinement of both control and data representation. The approach is declarative in the sense that each refinement of the initial computational model respects the declarative reading of its predecessor. The paper suggests to use fixed point characterizations based on a variant of chaotic iteration as computational models. Two classes of transformations of such models, called splittings and restrictions, are identified as the cornerstones of the method. We give sufficient criteria for when these transformations may be used. Ordinary bottom-up computation of logic databases is used as an illustrative example throughout the paper.

Doherty, P., Lukaszewicz, W., and Salas, A. (1995). A Characterization Result for Circumscribed Normal Logic Programs. Technical Report LiTH-IDA-R-95-20, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Circumscription has been perceived as an elegant mathematical technique for modeling nonmonotonic and commonsense reasoning, but difficult to apply in practice due to the use of second-order formulas. One proposal for dealing with the computational problems is to identify classes of first-order formulas whose circumscription can be shown to be equivalent to a first-order formula. In previous work, we presented an algorithm which reduces certain classes of second-order circumscription axioms to logically equivalent first-order formulas. The basis for the algorithm is an elimination lemma due to Ackermann. In this paper, we capitalize on the use of a generalization of the Ackermann lemma in order to deal with a subclass of universal formulas used in normal logic programs. Our results subsume previous results by Kolaitis and Papadimitriou regarding a characterization of circumscribed definite logic programs which are first-order expressible. The means for distinguishing which programs are reducible is based on a boundedness criterion. The approach we use is to first reduce circumscribed normal logic programs to a fixpoint formula which is reducible if the program is bounded, otherwise not. In addition to a number of other extensions, we also present a fixpoint calculus which is shown to be sound and complete for bounded fixpoint formulas.

Eles, P., Peng, Z., Kuchcinski, K., and Doboli, A. (1995). Performance Guided System Level Hardware/Software Partitioning with Iterative Improvement Heuristics. Technical Report LiTH-IDA-R-95-26, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: This paper presents two heuristics for automatic hardware/software partitioning of system level specifications. Partitioning is performed at the granularity of loops, subprograms, and processes with the objective of performance optimization with a limited hardware and software cost. We define the metric values for partitioning and develop a cost function that guides partitioning towards the desired objective. We consider minimization of communication cost and improvement of the overall parallelism as essential criteria during partitioning. The two heuristics for hardware/software partitioning, formulated as a graph partitioning problem, are presented: one based on simulated annealing and the other on tabu search. Results of extensive experiments, including real life examples, show the clear superiority of the tabu search based algorithm.
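The report's actual metric values and cost function are not reproduced here, but the shape of a tabu-search partitioner is easy to illustrate. The following is a minimal sketch under assumed simplifications: a two-way software/hardware partition, and a stand-in cost combining communication (edges crossing the cut) with a crude imbalance penalty in place of real hardware/software cost limits.

```python
import random

def tabu_partition(nodes, edges, iters=300, tenure=3, seed=0):
    """Two-way partitioning by tabu search: at each step move the best
    non-tabu node to the other side, remembering the best partition seen."""
    rng = random.Random(seed)
    part = {n: rng.randint(0, 1) for n in nodes}   # 0 = software, 1 = hardware

    def cost(p):
        # stand-in cost: communication across the cut plus an imbalance
        # penalty standing in for limited hardware/software resources
        cut = sum(1 for a, b in edges if p[a] != p[b])
        imbalance = abs(2 * sum(p.values()) - len(nodes))
        return cut + imbalance

    best, best_cost = dict(part), cost(part)
    tabu = {}                                      # node -> iteration it is frozen until
    for it in range(iters):
        candidates = [n for n in nodes if tabu.get(n, -1) <= it]
        if not candidates:
            continue

        def after_move(n):                         # cost if n changed side
            part[n] ^= 1
            c = cost(part)
            part[n] ^= 1
            return c

        n = min(candidates, key=after_move)        # best admissible move, even uphill
        part[n] ^= 1
        tabu[n] = it + tenure                      # forbid moving n back for a while
        c = cost(part)
        if c < best_cost:
            best, best_cost = dict(part), c
    return best, best_cost
```

The tabu list is what distinguishes this from plain hill-climbing: recently moved nodes may not move back for `tenure` iterations, forcing the search out of local minima.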

Fahl, G. and Risch, T. (1995). Query Processing over Object Views of Relational Data. Technical Report LiTH-IDA-R-95-11, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: This paper presents an approach to object view management for relational databases. Such a view mechanism makes it possible for users to transparently work with data in a relational database as if it was stored in an object-oriented (OO) database. A query against the object view is translated to one or several internal queries against the relational database from which the answer is formed. The object view can also store its own data and therefore it must be possible to process queries that combine local data residing in the object view with data retrieved from the relational database. We discuss the key issues when object views of relational databases are developed, namely: how to map relational structures to subtype/supertype hierarchies in the view; how to represent relational database access in OO query plans; how to provide the concept of object identity in the view; how to handle the fact that the extension of types in the view depends on the state of the relational database; and how to optimize queries against the object view. The results are based on experiences from a running prototype implementation.

Falkenroth, E., Risch, T., and Törne, A. (1995). Using an Embedded Active Database in a Control System Architecture. Technical Report LiTH-IDA-R-95-39, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Typical data management problems arise in control applications when the controlled environment becomes complex and large volumes of data are involved. This paper addresses this problem by embedding an active object-relational database in a control system architecture. The database stores an abstract model of the controlled environment. A control application language, tightly integrated with the query processor, is used for high-level specification of operations. A set of control algorithms running on a separate real-time kernel performs the actual closed loop control of the external environment. The primitive operation of the control application language is an update of the database that will trigger the control algorithms. In this way our architecture combines cyclic control algorithms and an event driven operation language with data management capabilities. The integrated architecture makes extensive use of the active rule facilities in the database, both in the execution of the control application language and to initiate appropriate control algorithms in the real-time kernel. We discuss practical experiences of building a unified control system with tightly integrated queries and active rules.

Flodin, S. and Risch, T. (1995). Processing Object-Oriented Queries with Invertible Late Bound Functions. Technical Report LiTH-IDA-R-95-10, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: New demands are put on query processing in Object-Oriented (OO) databases to provide efficient and relationally complete query languages. A flexible OO data model requires overloading and late binding of function names. Relational completeness requires capabilities to handle queries where functions are inverted, i.e. where it is possible to select those objects y that satisfy fn(y)=x where x is known. A system that supports both late binding and inverted functions must be able to solve fn(y)=x for a given x and unknown y when fn is late bound, i.e. the resolvent (implementation of a function name) to apply on y is selected based on the type of y. This combination of late binding and inverted function calls requires novel query processing capabilities to fully utilize indexes referenced in late bound function calls. This paper presents an approach to the management of late binding in query processing. The main result is a query processing method where late bound function calls are efficiently executed and optimized for both inverted and regular execution. The proposed solution is based on substituting each late bound function call in the execution plan with a special function, DTR, which dynamically selects the actual resolvent to call. We define the inverse of DTR and prove its correctness. We show a dramatic execution time improvement by making DTR invertible and by defining its cost model for query optimization. The improvements are verified by performance measurements.
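The DTR idea from the abstract can be sketched in a few lines. Everything below other than the name DTR is invented for illustration (the types, the `income`-style functions, and dict-based "indexes" standing in for database indexes): a forward call dispatches on the runtime type, while the inverted call unites per-resolvent index lookups instead of scanning all objects.

```python
class Resolvent:
    def __init__(self, fwd, inv):
        self.fwd = fwd          # y -> x
        self.inv = inv          # x -> iterable of y (an index lookup)

resolvents = {}                 # runtime type -> Resolvent

def register(typ, fwd, inv):
    resolvents[typ] = Resolvent(fwd, inv)

def dtr(y):
    """Late-bound call: select the resolvent from y's runtime type."""
    return resolvents[type(y)].fwd(y)

def dtr_inverse(x, candidate_types):
    """Inverted call: enumerate the objects y with dtr(y) == x by
    uniting each candidate resolvent's inverse."""
    for typ in candidate_types:
        yield from resolvents[typ].inv(x)

# two hypothetical types overloading an `income` function
class Student:
    def __init__(self, grant): self.grant = grant

class Employee:
    def __init__(self, salary): self.salary = salary

student_idx, employee_idx = {}, {}     # value -> objects, stand-ins for indexes

s1, s2, e1 = Student(100), Student(200), Employee(200)
for s in (s1, s2):
    student_idx.setdefault(s.grant, []).append(s)
employee_idx.setdefault(e1.salary, []).append(e1)

register(Student, lambda y: y.grant, lambda x: student_idx.get(x, []))
register(Employee, lambda y: y.salary, lambda x: employee_idx.get(x, []))
```

With this setup, `set(dtr_inverse(200, [Student, Employee]))` yields exactly `{s2, e1}` via two index probes, which is the point of making the late-bound call invertible.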

Goldkuhl, G. (1995). Information as Action and Communication. Technical Report LiTH-IDA-R-95-09, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: The paper investigates information concepts. The elementary message (e-message) of Langefors is analysed. An e-message (with its three parts: object, property, time) is characterized as a contents-view of information. From a communicative action perspective (speech act theory) it is claimed that the e-message should be expanded with action concepts. An e-message should also include "action type" and "communicator". This communicative action view on information is taken as a starting point for analysis of business action and business processes. A generic business model is presented. The business process is considered to consist of four phases: 1) inquiry and negotiation stage, 2) contractual stage, 3) fulfilment stage, and 4) satisfaction stage. A comparison is made with the business process conception in BPR (Business Process Reengineering) and the value chain concept of Porter.

Goldkuhl, G. (1995). Processtänkande vid verksamhetsutveckling. Technical Report LiTH-IDA-R-95-41, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: In connection with the development of businesses and organizations, a so-called process view is now often applied. This is prominent in Business Process Reengineering (BPR), which is oriented towards radical business renewal, and in Total Quality Management (TQM), which is more oriented towards continuous improvement. A process view is also applied in other current change concepts, implying a horizontal view of the business and its activities together with a clear customer focus. The report studies a number of definitions of the process concept, made by authors such as Hammer & Champy, Willoch, Steneskog, Davenport and Harrington. A critical analysis of these definitions is made, identifying similarities and differences. On the basis of this analysis, alternative definitions of the concepts operative business process and business process are presented. These definitions are based on a communication and action view of business processes. A generic business model is presented together with an analysis of different actor relations and economic forms of governance in connection with business processes. The inspiration for this comes from speech act theory and transaction cost theory.

Hallberg, J. and Peng, Z. (1995). Synthesis under Local Timing Constraints in the CAMAD High-Level Synthesis System. Technical Report LiTH-IDA-R-95-34, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: This paper describes a technique to use local timing constraints to drive the high-level synthesis process, which has been implemented in the CAMAD system. A design is represented with an Extended Timed Petri Net (ETPN), and local timing constraints are introduced as special arcs in the control part of the ETPN. A method for checking the consistency of a given set of local timing constraints is described, as well as an algorithm that schedules the operations in such a way that the timing constraints are fulfilled. Together with the possibility to compile behavioral VHDL to ETPN, this work is a step towards a system that allows high-level behavioral specifications including timing constraints to be efficiently compiled into silicon.

Hoffner, T. (1995). Evaluation and Comparison of Program Slicing Tools. Technical Report LiTH-IDA-R-95-01, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: This report presents an evaluation and comparison of implementations of program slicing, which is a technique for extracting parts of computer programs by tracing the programs' control and data flow related to some data item. The technique has found uses in various application areas such as debugging, program integration and testing, where it is useful to be able to focus on relevant parts of large programs. Static program slicing, which is a compile-time version of the analysis, was first introduced in 1982, whereas run-time based dynamic slicing systems appeared around 1988. However, previously there has not been any comprehensive evaluation of the state of the art regarding slicing system implementations. This is an attempt to partially fill that need, by evaluating five implementations. Not surprisingly, it was observed that dynamic slicing systems often give smaller and more precise slices than static slicing systems, since in the dynamic case an actual flow of control is known. All systems can be regarded as first generation systems, in that they are mainly developed to support research. They have some performance problems and in several cases support rather small language subsets. Since there are no established criteria for how to evaluate program slicing yet, this report also discusses how to compare slicing tools for different application areas and formulates a method for doing so. The method is then used to perform the evaluation of the tools that provide program slicing, with a particular emphasis on Mariam Kamkar's dynamic interprocedural program slicing tool.

Hua, S. (1995). A New Approach to Cumulative Default Logic. Technical Report LiTH-IDA-R-95-22, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Cumulativity has been widely recognized as a desirable property of the logics for defeasible reasoning. The previous approaches to cumulative default logics are mainly concerned with the applicability conditions for defaults, rather than with the underlying deductive system of the default extensions. This paper presents a new approach to cumulative default logic, by identifying more constructive underlying deductive systems. We show that cumulativity can be guaranteed if the semantic structures of the default extensions satisfy the properties of coherence spaces. Based on this result, we investigate two variants of cumulative default logic. The first variant is based on the conjunctive logic, a non-classical (two-valued) propositional logic that can be viewed as intuitionistic logic without the disjunction and implication connectives. The second, which is the more interesting one from the proof-theoretic point of view, is based on linear logic, another non-classical logic that can be viewed as a sequent calculus without the structural rules of weakening and contraction [7]. The semantics for the latter variant of cumulative default logic seems to coincide rather nicely with the semantics for cumulative default logic based on coherent semantic structures presented in this paper. This brings us closer to a more constructive formalization of cumulative default reasoning, and eventually to an implementation of the cumulative default logic.

Johansson, O. (1995). Using an Extended ER-Model Based Data Dictionary to Automatically Generate Product Modeling Systems. Technical Report LiTH-IDA-R-95-38, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: A product modeling system (PMS) is a computer integrated development environment for a specific class of advanced products. A well integrated PMS consists of a product model database which is interfaced with CAD-applications that support graphical design of various engineering models. For power plant design, there are functional models, mechanical models, electrical models etc. This paper describes a successful approach to manage the development of a product modeling system for power plant design. The key idea is to store a high-level PMS design specification in the form of an extended entity relationship model in a data dictionary. Most of the source code for the PMS implementation is then generated automatically, using SQL-based source code generators which are easy to develop. Our PMS-development system generates product model database schemas and user interfaces. It also generates high-level database schema related interface modules in the native application development language of a CAD-system. Through these, a CAD application developer has high-level access to the object structures in the product model database. Using the described approach, we have developed a power plant PMS which is in production at the turbine manufacturer ABB STAL and the power plant engineering company ABB Carbon. The data dictionary design and SQL-based code generation technique seem to be generally applicable and have been used for generating source code implementations in C++, LISP, SQL, and various textual form description languages. The architecture of our PMS-development system is described together with the data dictionary schema and examples of generated source code. We estimate that this software engineering approach reduces systems development costs by a factor of 5 to 10. NOTE: This version of the report has a few important updates compared to the original paper! See page 17 for details.

Johansson, O. (1995). Utvecklingsmiljö för Produktmodelleringssystem. Technical Report LiTH-IDA-R-95-40, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: The complexity of high-technology products such as power plants, aircraft, etc. is constantly increasing. Product development times must be reduced while design and quality are improved in a cost-effective way. A product modeling system (PMS) is a computer-integrated design environment for a specific class of advanced products. There is great potential for improving the product development process with the help of modern information technology. However, a long-term and well-considered effort is required to develop a PMS that is flexible enough to be quickly adapted to new product requirements. The system must also be portable to new generations of computer technology during the decades that make up the product's life cycle. The article describes a number of aspects of the software development environment that should be taken into account in order to succeed with a long-term investment in a PMS. The architecture of the development environment and an iterative development method are described, which ensure that the product modeling system can be put into operation early and evolve towards the functionality prioritized by management and users. Development trends and standards for commercial base software suitable for use in a PMS are described. The article is based on experiences from the successful development of a PMS called ProcessCAD. It is used for systems design of turbine and power plant installations at ABB STAL and ABB Carbon. The development project started in the autumn of 1991. After 10 prototypes with gradually increasing functionality, the system was put into operation in the autumn of 1993. During the autumn of 1995, some 50 plants are under construction or have been designed with the help of the system. The delivery time for the most common plants is about one year.

Jonsson, P. and Bäckström, C. (1995). Complexity Results for State-Variable Planning under Mixed Syntactical and Structural Restrictions. Technical Report LiTH-IDA-R-95-17, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Most tractable planning problems reported in the literature have been defined by syntactical restrictions. To better exploit the inherent structure of problems, however, it is probably necessary to study also structural restrictions on the state-transition graph. We present an exhaustive map of complexity results for state-variable planning under all combinations of our previously analysed syntactical (P, U, B, S) and structural (I, A, O) restrictions in combination with two new restrictions (A+, A-). The complexity map considers both optimal and non-optimal plan generation.

Jonsson, P. and Bäckström, C. (1995). Incremental Planning. Technical Report LiTH-IDA-R-95-31, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: Ambros-Ingerson and Steel suggested interleaving planning and execution through incremental planning, i.e. using a planner that can output valid prefixes of the final plan before it has finished planning. This method could considerably bring down the time lost in planning, especially in dynamic domains, where replanning has to occur frequently. We improve on the basic idea, avoiding certain problems, by presenting an incremental planner with provable properties for a restricted class of planning problems, the 3S class. Finding out whether a 3S instance is solvable or not is computationally tractable, despite the fact that generating a plan is inherently intractable. By first testing whether an instance is solvable or not, we can avoid outputting prefixes of invalid plans in the latter case. Furthermore, making the reasonable assumption that natural problems have non-exponential-size solutions, we can also plan efficiently in practice, since we need not waste time on non-solvable instances.

Jonsson, P. and Bäckström, C. (1995). Tractable Planning with State Variables by Exploiting Structural Restrictions. Technical Report LiTH-IDA-R-95-16, Department of Computer and Information Science, Linköping University, Sweden.

Abstract: So far, tractable planning problems reported in the literature have been defined by syntactical restrictions. To better exploit the inherent structure in problems, however, it is probably necessary to study also structural restrictions on the state-transition graph. Such restrictions are typically computationally hard to test, though, since this graph is of exponential size. We take an intermediate approach, using a state-variable model for planning and restricting the state-transition graph implicitly by restricting the transition graph for each state variable in isolation. We identify three such restrictions which are tractable to test and we present a planning algorithm which is correct and runs in polynomial time under these restrictions.

Karlsson, J. S., Litwin, W., and Risch, T. (1995). LH*LH: A Scalable High Performance Data Structure for Switched Multicomputers. Technical Report LiTH-IDA-R-95-25, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: LH*LH is a new data structure for scalable high-performance hash files on the increasingly popular switched multicomputers, i.e., MIMD multiprocessor machines with distributed RAM memory and without shared memory. An LH*LH file scales up gracefully over the available processors and distributed memory, easily reaching Gbytes. The address calculus does not require any centralized component that could become a hot-spot. Access times to the file can be under a millisecond, and the file can be used in parallel by several client processors. We present the LH*LH design and report on its performance analysis, including experiments on the Parsytec GC/PowerPlus multicomputer with up to 128 PowerPC processors and 32 MB of distributed RAM per node. We prove the efficiency of the method and justify various algorithmic choices that were made. LH*LH opens a new perspective for high-performance applications, especially for the database management of new types of data and in real-time environments.
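
The decentralized address calculus mentioned in the abstract builds on Litwin's linear hashing (LH). As a rough single-process sketch (the distributed file-image adjustment of LH* and the second, per-node level of LH*LH are omitted), a bucket address is computed from a hash level and a split pointer:

```python
# Linear hashing (LH) address calculation, the basis of the LH* family.
# Single-process sketch only; no client/server image adjustment.

def lh_bucket(key, level, split):
    """Map a key to a bucket with h_i(k) = hash(k) mod 2**i.

    Buckets below the split pointer have already been split in the
    current round, so they are re-addressed with the next-level
    hash function."""
    b = hash(key) % (2 ** level)
    if b < split:
        b = hash(key) % (2 ** (level + 1))
    return b

# With level 3 and no splits yet, integer key 10 lands in bucket 10 mod 8 = 2;
# once the split pointer has passed bucket 2, the key is re-addressed mod 16.
```

Because every client can evaluate this function locally, no central directory is consulted on access, which is what avoids the hot-spot mentioned above.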

Lindvall, M. and Sandahl, K. (1995). Exempel på spårbarhet i en industriell tillämpning av Objectory. Technical Report LiTH-IDA-R-95-08, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Traceability, defined as the ability to trace dependent components within a model and the ability to trace corresponding components in other models, is a desirable property of software systems. We have observed an industrial project in which the analysis and design method Objectory was used, and we have documented it with emphasis on a number of examples of traceability. We generated the examples by taking the role of a system developer tasked with further developing the system, who must therefore also understand how it is structured and how it works. Four representative examples and a categorisation of traceability are the main contributions of this report. The purpose of the examples is to create a concrete empirical basis for applying traceability to object-oriented systems.

Löwgren, J. (1995). Perspectives on Usability. Technical Report LiTH-IDA-R-95-23, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: One of the central concepts in human-computer interaction (HCI) is usability. Interestingly, in spite of its brief history as a scientific and applied discipline, HCI has already produced several different views on usability. These views are, in turn, interrelated with how research and systems development are seen. This paper identifies five different perspectives on usability: general theory, usability engineering, subjectivity, flexibility and sociality. Their interrelations and implications for usability-oriented systems development are discussed.

Nadjm-Tehrani, S. (1995). A Study of Decompositional Verification of Hybrid Systems. Technical Report LiTH-IDA-R-95-30, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: This paper is a study of decompositional proof techniques applied to the verification of a model of a real-world hybrid system, an aircraft landing gear. We present a formal description of these techniques (taken from Halbwachs et al. [5]) and look at two ways of applying them. We discover, and correct, a flaw in the theory, but conclude ultimately that when dealing with a plant-controller combination there is often little to be gained by adopting a decompositional approach to verification. Moreover, we argue that in these cases the composed system can be even simpler than its components, and thus it is most expedient to prove properties of the system directly.

Nilsson, A. (1995). KLASSISK INFORMATIONSTEORI: FOKUS PÅ BÖRJE LANGEFORS INFOLOGISKA TEORIBILDNING. Technical Report LiTH-IDA-R-95-14, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: During the autumn of 1994 and the spring of 1995, a doctoral course on Classical Information Systems Theory was held at the Department of Computer and Information Science (IDA), Linköping University. Anders G. Nilsson was responsible for the course, and 21 doctoral students from different research areas participated. The course focused on Professor Börje Langefors' infological theory. Infology can be described as the study of how we develop and use information systems in organisations so that the users' needs are met. The course proved very rewarding for the doctoral students in terms of knowledge, and it led to a large number of interesting results and conclusions about Langefors' infological theory. We therefore decided to summarise our most important impressions and publish a scientific report for dissemination to other researchers in Scandinavia. The report contains a number of comparisons between Langefors' theories and other research in information processing and business administration. Each of the 21 doctoral students presents a knowledge contribution of their own in the report. Taken together, these contributions give rise to a wealth of interesting ideas, dimensions and perspectives for continued studies of Langefors' theories. An overall conclusion from the course is that Börje Langefors' infological theory still appears to hold for many different situations and contexts within our field. It is our hope that this report will provide new impulses and ideas for managing our historical heritage in the best possible way.

Nilsson, A. (1995). UTVECKLING AV METODER FÖR SYSTEMARBETE - ETT HISTORISKT PERSPEKTIV. Technical Report LiTH-IDA-R-95-13, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: In this report I attempt to shed light on how methods for systems work have evolved over time. Systems work here refers to both systems development and systems maintenance. My survey of methods takes a historical perspective, with a special focus on Swedish conditions. The investigation is strongly coloured by personal experience, but I attempt to structure the analysis of methods along six dimensions: the constituents of methods, focus on aspects, life-cycle coverage, pre-specification in systems work, form of support tools, and the perspectives underlying methods.

Peng, Z. (1995). High-Level Test Synthesis Using Design Transformations. Technical Report LiTH-IDA-R-95-35, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: A transformation-based approach to high-level test synthesis is presented. It utilizes a sequence of design-improvement transformations to generate a register-transfer level design from a VHDL behavioral specification. Selection of transformations is based on a performance-driven optimization strategy as well as a testability analysis algorithm which determines the testability-improvement techniques to be used. The main testability-improvement techniques are controllability/observability balance allocation, partial scan insertion and condition scan insertion. One important feature of our approach is that the testability-improvement transformations are not carried out in a separate synthesis step. Instead, they are performed at the same time as operation scheduling, data path allocation and control allocation.

Peppas, P. (1995). Epistemic Entrenchment and the Possible Models Approach. Technical Report LiTH-IDA-R-95-02, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: The Possible Models Approach (PMA), introduced by Winslett, proposes, among other things, a domain-independent criterion for measuring the similarity between the different states of a dynamic system. Based on this criterion of similarity, the PMA makes inferences about the effects of events in the system. The concept of similarity between states (possible worlds) also appears in the area of Belief Revision, in the form of a system of spheres. Systems of spheres are in turn connected to epistemic entrenchments by means of the AGM revision functions that the two structures induce. In view of these connections, in this article we identify and study the properties of the family of epistemic entrenchments corresponding to systems of spheres that comply with the PMA's criterion of similarity. Using the results established herein, we then outline a version of the PMA based on epistemic entrenchments.

Ragnemalm, E. L. (1995). Student Diagnosis in Practice: Bridging a Gap. Technical Report LiTH-IDA-R-95-36, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: This paper presents a novel framework for looking at the problem of diagnosing a student's knowledge in an Intelligent Tutoring System. It is indicated that the input and the conceptualisation of the student model are significant for the choice of modelling technique. The framework regards student diagnosis as the process of bridging the gap between the student's input to the tutoring system and the system's conception and representation of correct knowledge. The process of bridging the gap can be subdivided into three issues: data acquisition, transformation, and evaluation, which are studied further. A number of published student modelling techniques are examined with respect to how they bridge the gap.

Röstlinger, A. (1995). Verksamhetsanalys för förändringar med kvalitet. Technical Report LiTH-IDA-R-95-42, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Change often involves investment in IT solutions. New and modified IT solutions do not always result in the business support that was planned. By diagnosing the organisation we can build readiness for change. Performing a business analysis (Verksamhetsanalys) according to the SIM method is one way of acquiring diagnostic knowledge about the organisation ahead of a possible change. The report aims to show the need to work with methods when changing organisations, and to show the possibilities of using business analysis and action graphs (Handlingsgrafer) as tools for diagnosing organisations, with the aim of achieving good changes and quality in municipal organisations.

Sandewall, E. (1995). Reasoning about Actions and Change with Ramification. Technical Report LiTH-IDA-R-95-21, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: The systematic approach to nonmonotonic logics of actions and change attempts to identify, for each proposed logic, its range of applicability relative to a precise definition of intended models. The article describes the concepts and methods used in this approach, and gives examples of the assessed range of applicability for a few of the previously proposed logics in the case of strict inertia. The article also presents some new results, in which the same systematic approach is applied to the case of chronicles with inertia and ramification.

Sandewall, E. (1995). Systematic Comparison of Approaches to Ramification using Restricted Minimization of change. Technical Report LiTH-IDA-R-95-15, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Most approaches to the ramification problem are based on the principle of minimization of change. It turns out, however, that this principle cannot be applied uniformly, and many modern approaches use a classification of the fluents whereby change is minimized only in some of the fluents. The present article reviews these approaches and their underlying motivations. It also presents a unified formal framework in which it is possible to compare, across the different approaches, both the set of selected models and the range of correct applicability. Finally, it discusses the applicability of the Katsuno-Mendelzon postulates to these approaches.

Sjödin, M. (1995). A WWW Front End to an OODBMS. Technical Report LiTH-IDA-R-95-19, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: We have created a prototype World Wide Web (WWW) front end to an OODBMS, LINCKS. LINCKS is a multi-user object store with support for composite objects, database histories, multiple views, information sharing by linking, and parallel editing notification. The prototype generates HTML documents on the fly from information pieces in the database. A simple grammar describes a document's logical structure, as well as where in the database the different parts of that structure should be retrieved. We can use LINCKS to track the owner, source, and creation date of information pieces and documents. Instead of using "copy-and-paste" for information duplication, we use links to the real source and thus maintain traceability. Moreover, by applying different views to the same document root object we can provide: the abstract; a table of contents (at different levels); a chapter, a section, or a subsection; the full article; with or without footnotes; or any combination of the above. Using any of the current crop of servers (CERN, NCSA, Plexus, etc.), we would have to create several different files for each of these views, either by hand or with a program like LaTeX2HTML, which makes the documents yet more difficult to maintain.
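
The grammar-driven, on-the-fly HTML generation described in the abstract can be sketched as a recursive expansion: grammar rules describe the logical structure, and leaves are fetched from the store. The grammar format and fetch interface below are illustrative assumptions, not the LINCKS API:

```python
# Sketch of generating an HTML document on the fly from database pieces,
# driven by a simple grammar. Non-terminals expand to tagged children;
# symbols without a rule are leaves fetched from the database.

def render(node, grammar, fetch):
    """Expand a grammar symbol into HTML, pulling leaf content via fetch()."""
    rule = grammar.get(node)
    if rule is None:                       # leaf: retrieve from the database
        return fetch(node)
    tag, children = rule
    body = "".join(render(c, grammar, fetch) for c in children)
    return f"<{tag}>{body}</{tag}>"

# Hypothetical grammar and stored pieces:
grammar = {"article": ("html", ["title", "abstract"]),
           "title": ("h1", ["title-text"])}
pieces = {"title-text": "Perspectives on Usability", "abstract": "<p>...</p>"}
html = render("article", grammar, pieces.get)
```

Different "views" of the same root object then correspond simply to different grammars applied to the same stored pieces, with no extra files to maintain.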

Sköld, M., Falkenroth, E., and Risch, T. (1995). Rule Contexts in Active Databases - A Mechanism for Dynamic Rule Grouping. Technical Report LiTH-IDA-R-95-27, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Engineering applications that use Active DBMSs (ADBMSs) often need to group activities into modes that are shifted during the execution of different tasks. This paper presents a mechanism for grouping rules into contexts that can be activated and deactivated dynamically. The ADBMS monitors only those events that affect rules of activated contexts. With dynamic rule grouping, the set of monitored rules can be changed during a transaction; with static rule grouping, by contrast, rules are associated with specific objects at schema definition time. A rule is always activated into a previously defined context. The same rule can be activated with different parameters and into several different contexts. Rules in a context are not enabled for triggering until the context is activated. However, rules can be activated directly into a previously activated context. When a rule context is deactivated, all the rules in that context are disabled from triggering. User-defined contexts can be checked at any time in a transaction. Rule contexts can be used as a representation of coupling modes, where the ADBMS has built-in contexts for immediate, deferred, and detached rule processing. These built-in coupling modes are always active and are checked automatically by the ADBMS. Contexts and rules are first-class objects in the ADBMS. Database procedures can be defined that dynamically activate and deactivate contexts and rules to support the dynamically changing behaviour of complex applications. The context mechanism has been implemented in the AMOS ADBMS. The paper concludes with an example of a manufacturing control application that highlights the need for rule contexts.
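
A minimal sketch of the context idea, assuming a toy in-memory rule store (the names below are illustrative, not the AMOS interface): rules are grouped into contexts and only trigger while their context is active.

```python
# Toy model of dynamic rule grouping: rules live in named contexts, and
# only rules in currently active contexts are monitored for events.

class RuleContexts:
    def __init__(self):
        self.contexts = {}            # context name -> set of rules
        self.active = set()           # names of currently active contexts

    def activate_rule(self, rule, context):
        """Place a (event, action) rule into a context."""
        self.contexts.setdefault(context, set()).add(rule)

    def activate(self, context):
        self.active.add(context)

    def deactivate(self, context):
        self.active.discard(context)  # all its rules stop triggering

    def triggered(self, event):
        """Return the rules monitored for this event in active contexts."""
        return {r for c in self.active
                  for r in self.contexts.get(c, ())
                  if r[0] == event}

db = RuleContexts()
db.activate_rule(("temp-high", "open-valve"), "production-mode")
db.activate("production-mode")
fired = db.triggered("temp-high")
```

Deactivating `"production-mode"` would silence the rule again without deleting it, which is the mode-shifting behaviour the abstract describes.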

Söderman, U. and Strömberg, J.-E. (1995). Switched Bond Graphs: multiport switches, mathematical characterization and systematic composition of computational models. Technical Report LiTH-IDA-R-95-07, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Classical bond graphs are in principle restricted to the modelling of continuous physical systems. In previous work we extended classical bond graphs to systems involving abrupt changes as well. This extension is centered around the introduction of an ideal primitive switch concept. In this paper we continue this work and extend it in a number of important directions. We present the multiport generalization of the previously introduced primitive one-port switch. We elaborate on the mathematical semantics of individual switch elements as well as of complete switched bond graphs, i.e. bond graphs involving one or more switch elements. We discuss the systematic composition of computational models for switched bond graphs and, for this purpose, introduce a constructive composition operator. Finally, we also discuss some ideas for dealing with model complexity and 'non-physical' modes, where the multiport switch plays an important role. For the representation of the composed computational models of switched bond graphs we introduce a mathematical structure related to state automata, referred to as mode transition systems. For the mathematical characterization of individual switch elements a simplified version of this structure, referred to as switch transition systems, is introduced.
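
A mode transition system, as referenced in the abstract, pairs each mode with its own set of equations and switches modes via guarded transitions. The following is an illustrative toy encoding for a single ideal electrical switch, not the authors' formal definition:

```python
# Toy encoding of a mode transition system for one ideal switch element:
# each mode carries its active equation (as a placeholder string), and
# guarded transitions move the system between modes.

# mode name -> the equation active in that mode
modes = {"on": "v = 0", "off": "i = 0"}

# (source mode, guard) -> destination mode
transitions = {("on", "i < 0"): "off",
               ("off", "v > 0"): "on"}

def step(mode, observations):
    """Fire the first transition whose guard is observed to hold."""
    for (src, guard), dst in transitions.items():
        if src == mode and observations.get(guard, False):
            return dst
    return mode  # no guard holds: stay in the current mode

m = step("on", {"i < 0": True})   # current reverses, the switch opens
```

Composing several such elements yields the product of their mode sets, which is where the composition operator and the complexity-reduction ideas of the report come in.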

Söderman, U., Strömberg, J.-E., and Top, J. (1995). Mode Invariant Modelling. Technical Report LiTH-IDA-R-95-04, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: Modelling of physical systems is a difficult task, in particular if mode switching is allowed. Automated support can be supplied if two conditions are fulfilled. First, a set of suitable qualitative concepts has to be available as a tool box for constructing adequate models. Second, powerful computational machinery must handle model complexity. In this paper we introduce both a conceptual basis and a computational algorithm for dealing with mode-switching physical systems. Our approach is more effective than current AI-based methods since it is based on general physical principles at the proper level of abstraction, expressed in terms of bond graphs. The latter formalism is extended in order to deal with mode switching. All meaningful causal modes and transitions between modes are derived automatically. A power electronic circuit serves as a guiding example.

Stoy, E. and Peng, Z. (1995). Inter-Domain Movement of Functionality as a Repartitioning Strategy for Hardware/Software Co-Design. Technical Report LiTH-IDA-R-95-33, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: This paper describes how initial hardware/software partitions can be modified by moving functionality between hardware and software. This inter-domain movement of functionality, combined with a transformation-based design methodology, enables the use of repartitioning as an optimization technique for co-design. This is made possible by a unified co-design representation in which Petri nets are used to represent controllers of both hardware and software modules. To model data manipulation we use datapaths in hardware and instruction dependence graphs in software. We demonstrate in this paper how datapaths can be mapped into instruction dependence graphs and vice versa. Using this technique, systems may be repartitioned without the complex and time-consuming resynthesis of the functionality to be moved or of the whole design.

Strömberg, J.-E. and Söderman, U. (1995). Modular Modelling of Mode-switching Systems. Technical Report LiTH-IDA-R-95-06, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: By a mode-switching system we mean a physical engineering system containing fast switching devices, such as diodes, check-valves and free-wheeling devices, or other similar effects. For the proper conceptualisation of such systems, we have earlier presented switched bond graphs, an extension of the classical bond graph language with an ideal primitive switch concept. In this paper we study the potential of systematically deriving computational models from switched bond graphs. The underlying requirement is to preserve the modularity provided by classical bond graphs. The resulting computational model is hybrid in the sense that it combines systems of continuous differential and algebraic equations with discrete mode transitions, and is suitable for control, simulation and design.

Stys, M. and Zemke, S. (1995). Incorporating Discourse Aspects in English-Polish MT: Towards Robust Implementation. Technical Report LiTH-IDA-R-95-18, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: The main aim of translation is an accurate transfer of meaning, so that the result is not only grammatically and lexically correct but also communicatively adequate. This paper stresses the need for discourse analysis in order to preserve the communicative meaning in English-Polish machine translation. Unlike English, a positional language whose word order is grammatically determined, Polish displays a strong tendency to order constituents according to their degree of salience, so that the most informationally salient elements are placed towards the end of the clause regardless of their grammatical function. The Centering Theory, developed for tracking given information units in English, and the Theory of Functional Sentence Perspective for Slavic languages provide the theoretical background. The notion of center is then extended to accommodate not only pronominalization and exact reiteration but also definiteness and various center-pointing constructs. Center information is additionally graded and applicable to all primary constituents in an utterance. Finally, this information, along with some syntactic clues and statistical regularities, is used to correctly order post-transfer constituents.

Wallgren, L. (1995). Kvalitets- och prestationsuppföljning i forskarutbildningen. Handledningens roll för genomströmningen. Technical Report LiTH-IDA-R-95-43, Department of Computer and Information Science, Linköping University, Sweden. (bibtex),

Abstract: This study describes Swedish doctoral education and supervision on the basis of a research model grounded in an expectancy-valence paradigm. The aims of the study are to describe factors that may affect completion rates in doctoral education and to survey previous research on supervision in doctoral education. The nominal study period for a doctoral degree is four years, but in practice half of those who complete a doctorate have been enrolled in doctoral education for more than seven years. The study is largely based on material from the National Swedish Board of Universities and Colleges (UHÄ) and on data from the individual university units. One conclusion is that a number of factors may affect completion rates: finances, study funding, recruitment, the merit value of doctoral education and labour-market demand, supervision, the study and research environment, the planning of doctoral education, the research orientation of undergraduate education, and the design of the thesis. It can further be noted that supervision plays a central role in the outcome of doctoral education, and that its role depends on the stage the doctoral student has reached and on the people involved in the supervision. The study also shows the importance of the relationship between the doctoral student and the supervisor. Overall, the study shows that both doctoral education and supervision depend to a high degree on the conditions prevailing in the study and research environment and at the department where the supervisor and the doctoral student work.
