
SaS Seminars 2008

Software and Systems Research Seminar Series

Autumn 2008

On Improving Real-Time Observability in Silicon Debug

Date: Tuesday, Nov 11, 2008 Place: Alan Turing Time: 15:15

Nicola Nicolici, McMaster University, Canada


To identify design errors that escape pre-silicon verification, silicon debug is becoming an important step in the implementation flow of digital integrated circuits. Embedded logic analysis is emerging as an alternative to scan chains for improving real-time observability during in-system silicon debugging. In this talk we discuss several basic techniques used in silicon debug, as well as some recent research on embedded logic analysis.
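
The contrast the abstract draws between scan chains and embedded logic analysis can be made concrete with a toy software model (an illustrative assumption, not material from the talk): an on-chip logic analyzer samples selected signals into a small circular trace buffer every clock cycle and freezes the buffer shortly after a trigger condition fires, giving real-time observability of the cycles around an event of interest. All names below are hypothetical.

```python
from collections import deque

class TraceBuffer:
    """Toy model of an embedded logic analyzer's trace buffer."""
    def __init__(self, depth, trigger, post_trigger):
        self.samples = deque(maxlen=depth)  # circular on-chip trace memory
        self.trigger = trigger              # predicate over one sample
        self.post = post_trigger            # cycles to keep sampling after trigger
        self.countdown = None               # None until the trigger fires

    def clock(self, sample):
        """Advance one clock cycle; return True once the buffer is frozen."""
        if self.countdown == 0:
            return True                     # capture complete; buffer frozen
        self.samples.append(sample)
        if self.countdown is None:
            if self.trigger(sample):
                self.countdown = self.post
        else:
            self.countdown -= 1
        return self.countdown == 0

# Trigger on the "error" value 99; keep one more cycle; depth-4 window.
tb = TraceBuffer(depth=4, trigger=lambda s: s == 99, post_trigger=1)
for value in [1, 2, 99, 3, 4, 5]:
    if tb.clock(value):
        break
print(list(tb.samples))  # [1, 2, 99, 3]
```

Unlike a scan chain, which halts the circuit to shift state out, this style of capture runs at speed and only the window around the trigger is extracted afterwards.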

Speaker's Profile:

Nicola Nicolici is an Associate Professor in the Department of Electrical and Computer Engineering at McMaster University, Canada. His research interests are in VLSI test and silicon debug.

MVC: 30 years later, the other shoe drops

Date: Friday, Oct 3, 2008 Place: Alan Turing Time: 10:15

Jim Coplien, Gertrud&Cope, Mørdrup, Denmark


To its end user, software is not a product, but a service. Procedural programming made it possible to reason about these services and their logic, in which most problems could be found in low-cost but dutiful desk checks. The main correlation-like entity was the procedure, which could be assembled to collect large numbers of activation-record instances into a few archetypical structures. In 1967, the ability to do this was taken away by object-oriented programming, which encouraged a style of programming in which this user-focused structure was subordinated to the user's cognitive model of the static world: of its objects. The algorithmic view was further muddled by inclusion polymorphism. The Smalltalk anthropomorphic view and ever-smaller method sizes made it necessary to understand dozens of atomic algorithms to understand even the simplest functionality. Progress in methodologies reduced this static view to an even more over-simplified and more static view in classes, the almost-final step in removing our ability to reason about the end-user system model. The final step was Agile methods, which focus on the customer -- the middleman -- instead of the end user, enabling a product focus instead of a service focus. This is even celebrated as a good thing.

Piecemeal, technology has slowly staggered roughly in the direction of the more primordial object view, and AOP has struggled to restore some of the algorithmic view. Roles and role-based modeling have brought back a somewhat more dynamic view of the system; we find their incarnation in Java and C# interfaces. There is renewed interest in dynamic programming languages and in the kind of flexibility one finds in traits. Trygve Reenskaug has combined these techniques and brought us full circle in the DCI paradigm. The "D" is for data modeling: what we know as traditional objects, though bereft of knowledge about scenarios; it captures the static structure of objects and their references on the heap. The "C" is for context: the mapping of roles onto objects on a per-use-case basis, implemented as a dictionary. The "I" is for interaction: an algorithm of a stateless role, written only in terms of other roles, that defines in readable terms what the system does. Object dynamics can be captured in interaction roles and melded with classes that use the roles as traits; interaction dynamics appear in a context object generated anew for every use case; and structural dynamics appear as references between elements of object data.

This design approach expresses several important correlations that have long been missing in object orientation. Rather than unifying all algorithmic cross-cutting into Aspects, it teases out important facets into the context and interaction, with the object model a third correlation that is usually presumed to be the base partitioning. These correlations conceivably compose in uniform and predictable ways because of their grounding in simple object concepts such as interfaces and classes, rather than pointcuts or wrappers and whoppers. It is an extended subset of multi-paradigm design, incorporating important elements of the procedural and object paradigms.
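
The D/C/I split described above can be sketched in a few lines. This is an illustrative assumption in Python, not Reenskaug's or Coplien's own code; the money-transfer use case and all names are hypothetical.

```python
class Account:
    """Data ("D"): a dumb domain object; state, but no use-case logic."""
    def __init__(self, balance):
        self.balance = balance

class TransferSource:
    """Interaction ("I"): a stateless role whose algorithm mentions only roles."""
    def withdraw(self, amount):
        self.balance -= amount

class TransferSink:
    def deposit(self, amount):
        self.balance += amount

class TransferContext:
    """Context ("C"): maps roles onto objects for one enactment of a use case."""
    def __init__(self, source, sink):
        self.roles = {"source": source, "sink": sink}  # role-name -> object

    def execute(self, amount):
        # The use-case algorithm, readable in terms of roles alone.
        TransferSource.withdraw(self.roles["source"], amount)
        TransferSink.deposit(self.roles["sink"], amount)

a, b = Account(100), Account(10)
TransferContext(a, b).execute(30)
print(a.balance, b.balance)  # 70 40
```

The context's dictionary exists only for the duration of one use-case enactment, matching the abstract's point that interaction dynamics live in a context object generated anew for every use case.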

Speaker's Profile:

Jim ("Cope") Coplien is a Software Architecture and Agile Consultant at Gertrud&Cope in Denmark. He has a 25-year history as an "early adopter" and innovator behind several strategic innovations in software: his C++ Idioms book was one of the major sources for Design Patterns; his work on Organizational Patterns was one of the foundations of the structural components of XP and was the inspiration for Scrums. His books cover areas as diverse as C++ programming, software design, and organizational design. He has started writing a book on Agile software development. His current professional focus areas include Lean software architecture, highlighting the challenges of test-driven development, and Scrum process improvement using Organizational Patterns. His current day-to-day work includes architecture reviews, coding, and helping organizations work more effectively in lean economic conditions through process improvement and reduction of waste. His current hobby is creating advanced (housing) architecture CAD tools based on pattern languages. He lives with his wife and son in Mørdrup, Denmark. When he grows up, he wants to be an anthropologist.

Spring 2008

Multi-Paradigm Modelling, and the quest for tool support

Date: Wednesday, June 4, 2008 Place: John von Neumann Time: 15:15

Prof. Dr. Hans Vangheluwe, McGill University, Montréal, Québec, Canada


Models are invariably used in Engineering (for design) and Science (for analysis) to precisely describe structure as well as behaviour of systems. Models may have components described in different formalisms, and may span different levels of abstraction. In addition, models are frequently transformed into domains/formalisms where certain questions can be easily answered. We introduce the term "multi-paradigm modelling" to denote the interplay between multi-abstraction modelling, multi-formalism modelling and the modelling of model transformations.

The presentation will start with some anecdotal evidence of the need for multi-paradigm modelling. Subsequently, the foundations of multi-paradigm modelling will be presented. It will be shown how all aspects of multi-paradigm modelling can be explicitly (meta-)modelled, enabling the efficient synthesis of (possibly domain-specific) multi-paradigm (visual) modelling environments. We have implemented our ideas in the tool AToM^3 (A Tool for Multi-formalism and Meta-Modelling). AToM^3 will be introduced by means of a simple example. Finally, an overview will be given of current and future challenges of multi-paradigm modelling.
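
As background for the claim that all aspects of multi-paradigm modelling can themselves be (meta-)modelled, here is a deliberately tiny sketch (an illustrative assumption, not AToM^3 code) of a meta-model represented as explicit data, together with a conformance check over models; all names are hypothetical.

```python
# A toy meta-model for a state-machine formalism: which node and edge
# types a conforming model may use. Because the meta-model is ordinary
# data, an editor or checker for the formalism can be generated from it.
METAMODEL = {
    "node_types": {"state"},
    "edge_types": {("state", "transition", "state")},
}

def conforms(model, mm):
    """Check that every node and edge uses types the meta-model allows."""
    nodes_ok = all(t in mm["node_types"] for t in model["nodes"].values())
    edges_ok = all(
        (model["nodes"][src], lbl, model["nodes"][dst]) in mm["edge_types"]
        for src, lbl, dst in model["edges"]
    )
    return nodes_ok and edges_ok

model = {
    "nodes": {"s0": "state", "s1": "state"},
    "edges": [("s0", "transition", "s1")],
}
print(conforms(model, METAMODEL))  # True
```

A model transformation in this setting is then a function from models conforming to one meta-model to models conforming to another, which is what makes the transformations themselves modellable.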

Speaker's Profile:

Hans Vangheluwe is an Associate Professor in the School of Computer Science at McGill University, Montréal, Canada. He holds a D.Sc. degree, as well as M.Sc. degrees in Computer Science and in Theoretical Physics, all from Ghent University in Belgium. He has been a Research Fellow at the Centre de Recherche Informatique de Montréal, Canada, at the Concurrent Engineering Research Center, WVU, Morgantown, WV, USA, at the Delft University of Technology, The Netherlands, and at the Supercomputing and Education Research Center of the Indian Institute of Science (IISc), Bangalore, India. At McGill University, he teaches Modelling and Simulation, as well as Software Design. He also heads the Modelling, Simulation and Design Lab (MSDL). He has been the Principal Investigator of a number of research projects focused on the development of a multi-formalism theory for Modelling and Simulation. Some of this work has led to the WEST++ tool, which was commercialised for use in the design and optimization of bioactivated sludge Waste Water Treatment Plants. His current interests are in domain-specific modelling and simulation. The MSDL's tool AToM^3 (A Tool for Multi-formalism and Meta-Modelling), developed in collaboration with Prof. Juan de Lara, uses meta-modelling and graph grammars to specify and generate domain-specific environments. Recently, he has applied model-driven techniques in a variety of areas such as modern computer games, dependable and privacy-preserving systems (the Belgian electronic ID card), embedded systems, and the design and synthesis of advanced user interfaces.

Scalable Publish/Subscribe Infrastructures

Date: Monday, April 14, 2008 Place: Alan Turing Time: 10:15

Prof. Roberto Baldoni, Univ. of Rome La Sapienza, Italy


This talk will address recent advances in building scalable publish/subscribe communication systems as infrastructures for mature and novel large-scale distributed applications (e.g., Internet-based applications, scalable QoS applications and enterprise data center monitoring). In particular, event routing schemes and p2p overlay networks will be analyzed as technology enablers for the construction of such publish/subscribe systems. Three publish/subscribe systems will be outlined and compared with respect to their event routing and p2p overlay layers. The talk will also look at future application scenarios where these infrastructures are expected to be employed.
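
As background, the essence of topic-based publish/subscribe can be sketched in a few lines (an illustrative assumption, not one of the systems discussed in the talk): subscribers register interest in a topic, and publishers send events to the topic without knowing who the subscribers are. In the systems the talk covers, the single broker below is replaced by distributed event routing over a p2p overlay. All names are hypothetical.

```python
from collections import defaultdict

class Broker:
    """Minimal centralized topic-based pub/sub broker (illustrative only)."""
    def __init__(self):
        self.subscriptions = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscriptions[topic].append(callback)

    def publish(self, topic, event):
        # In a distributed system this lookup becomes an event-routing
        # problem across the overlay; here it is a local table lookup.
        for deliver in self.subscriptions[topic]:
            deliver(event)

received = []
broker = Broker()
broker.subscribe("stock/ACME", received.append)
broker.publish("stock/ACME", {"price": 42})
broker.publish("stock/OTHER", {"price": 7})   # no subscribers: dropped
print(received)  # [{'price': 42}]
```

The decoupling shown here (publishers and subscribers never reference each other) is what the infrastructure must preserve while scaling the routing layer.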

Reasoning on the Web: Why and how it is changing the making of computer and other science

Date: Wednesday, February 20, 2008 Place: Alan Turing Time: 13:15

Prof. Francois Bry, Ludwig Maximilian University of Munich, Germany


Many researchers candidly admit, if asked, that they no longer use traditional libraries or professional address books. Instead, they rely on search engines. Indeed, with a bit of practice, one can follow the developments in one's field better on the web than in a reading room. The reasons for this are threefold: (1) "the long tail", i.e., on the web, goods, software, and ideas with small, fragmented and/or scattered audiences are marketable; (2) on the web, "everything is miscellaneous", i.e., one can search according to one's own mental model; (3) on the web, access to information is very cheap, easy and extremely fast. This already has a considerable impact on the making of science. In this talk, the impact still to come is investigated, and novel research issues induced by emerging web-based tools for science are discussed: social software and specialized search for the sciences. Reasoning and semantics are core aspects of these novel web-based tools. Outcomes of the Network of Excellence "Reasoning on the Web with Rules and Semantics" (REWERSE) are presented and their relevance to the aforementioned systems is discussed. Finally, open issues of reasoning for social software and search are presented.

Page responsible: Christoph Kessler
Last updated: 2012-08-17