ELECTRONIC WORKSHOPS IN COMPUTING
Series edited by Professor C.J. van Rijsbergen

He Jifeng, Oxford University Computing Laboratory, UK, John Cooke, Loughborough University, UK, and Peter Wallis, University of Bath, UK (Eds)

BCS-FACS 7th Refinement Workshop
Proceedings of the BCS-FACS 7th Refinement Workshop, Bath, 3-5 July 1996

Chopping a Point
Zhou Chaochen and Michael R. Hansen

Published in Collaboration with the British Computer Society
©Copyright in this paper belongs to the author(s)
ISBN 3-540-76104-7
Chopping a Point

Zhou Chaochen
Department of Information Technology, Technical University of Denmark, DK-2800 Lyngby, Denmark

Michael R. Hansen
Department of Information Technology, Technical University of Denmark, DK-2800 Lyngby, Denmark

Emails: {zcc, mrh}@it.dtu.dk
Abstract

This paper introduces a super-dense chop modality into the Duration Calculi. The super-dense chop can be used to specify a super-dense computation, where a number of operations happen simultaneously, but in a specific order. With this modality, the paper defines a real-time semantics for an OCCAM-like language. In the semantics, assignments and the passing of messages in communications are assumed to be timeless operations.
1 Introduction
In a digital control system, a piece of program, hosted in a computer, acts as a controller: it periodically receives sampled outputs of a plant, and calculates and sends controls to the plant. The program may be written in an OCCAM-like language as a loop of

    sensor?x; CAL(x, u); actuator!u; wait T
where CAL(x, u) stands for a program segment which decides a current control u from the current sampled data x and the previous control u, and T is the sampling period. Typically, the time spent in the calculation and change of controls is negligible compared with the sampling period T. So control engineers make the comfortable assumption that the calculation program (i.e. CAL(x, u)) becomes a sequence of statements which are executed one by one, but consume no time. Similarly, the receiving and the sending (i.e. sensor?x and actuator!u) are assumed to be a pair of consecutive operations which can happen instantaneously, if the partners (i.e. sensor and actuator) are willing to communicate. A computation of a sequence of operations which is assumed to be timeless is called a super-dense computation. A semantic model of super-dense computations is discussed e.g. in [10, Example 6.4.2] and [11], and is e.g. adopted by the languages Esterel [3] and Statecharts [9].

A super-dense computation is an abstraction of a real-time computation within a context with a grand time granularity. For instance, in the digital control system, the cycle time of the computer may be nanoseconds, while the sampling period of the plant may be seconds. In other words, the calculation (CAL(x, u)) and the communications sensor?x and actuator!u may take micro- or milliseconds, while the sampling period T (in wait T) may take seconds. A computation time with fine time granularity is only negligible for computations not having infinite loops. Otherwise it is known as the Zeno phenomenon, or finite divergence [5]. In this paper, finite divergence will not be taken into account.
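As a sketch of the controller loop above, one might model the OCCAM channels with Go channels, whose synchronous send/receive gives the same rendezvous behaviour as sensor?x and actuator!u. The paper leaves CAL abstract, so the control law, the sensor reading, and the names below are made-up stand-ins; the wait T delay is also abstracted away, matching the assumption that the computation between samples is timeless.

```go
package main

import "fmt"

// CAL is a hypothetical control law standing in for the paper's CAL(x, u):
// it computes a new control from the sampled data x and the previous control u.
func CAL(x, u float64) float64 {
	return u + 0.5*(1.0-x) // toy proportional correction towards setpoint 1.0
}

// runLoop models n rounds of the OCCAM-like loop
//   sensor?x; CAL(x, u); actuator!u; wait T
// with unbuffered Go channels playing the role of the OCCAM channels.
func runLoop(n int) float64 {
	sensor := make(chan float64)
	actuator := make(chan float64)

	// Plant: supplies sampled outputs and consumes controls.
	go func() {
		for i := 0; i < n; i++ {
			sensor <- 0.8 // a constant sampled output, for illustration
			<-actuator    // accept the control
		}
	}()

	u := 0.0
	for i := 0; i < n; i++ {
		x := <-sensor // sensor?x
		u = CAL(x, u) // assumed to consume no time
		actuator <- u // actuator!u
		// wait T omitted: real time is abstracted away in this sketch
	}
	return u
}

func main() {
	fmt.Printf("final control u = %.2f\n", runLoop(3))
}
```

Because the channels are unbuffered, each send blocks until the partner is ready, so the receive, calculation, and send form exactly the kind of ordered but (by assumption) instantaneous sequence that the super-dense chop is meant to capture.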