Temporal logic via recursion, part VII. Synthesis

http://chaource.livejournal.com/84503.html Prologue. Specification and synthesis in Boolean logic
http://chaource.livejournal.com/84914.html Part I. Temporal logic. Basic definitions
http://chaource.livejournal.com/85196.html Part II. Temporal fixed points
http://chaource.livejournal.com/85401.html Part III. A catalog of operations
http://chaource.livejournal.com/85714.html Part IV. Examples
http://chaource.livejournal.com/86203.html Part V. Normal forms
http://chaource.livejournal.com/86279.html Part VI. Satisfiability

In the prologue post I described the solution to the Boolean synthesis problem. If we write a specification as a Boolean formula, we can compute the required values of the unknown variables as explicit Boolean functions of the given parameters.

For example, given a specification

φ(a,b,x,y) = (a -> x)(b -> x'y) = (a' + x)(b' + x'y),

we demand that x and y be such that φ(a,b,x,y)=1. Here the values of a, b are treated as given parameters. The point of synthesis is to avoid guessing and instead to derive the rules of computation for x and y. Ideally, we merely state the specification, and the computer automatically deduces how to compute x, y so that the specification is always satisfied.

In propositional Boolean logic, the synthesis problem is solved by a calculation. For the example above, we get the formulas

x=b', y=a'+b'.

These formulas can be seen as a "Boolean program" that calculates x and y given a,b. We have thus "synthesized" a program from the Boolean specification. The synthesized program is correct by construction; we can make a mistake only if we start with a wrong specification. (Incidentally, this is what happens in the example above: the specification is unsolvable if a=b=1.)
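This small example can be verified exhaustively. Here is a quick Python sketch (the helper names phi and program are ours, introduced only for illustration) that runs the synthesized program x = b', y = a' + b' against the specification for all four parameter combinations:

```python
# A quick exhaustive check of the synthesized Boolean program
# x = b', y = a' + b' against the specification phi.

def phi(a, b, x, y):
    # (a -> x)(b -> x'y) = (a' + x)(b' + x'y), over 0/1 values
    return int(((not a) or x) and ((not b) or ((not x) and y)))

def program(a, b):
    # the synthesized "Boolean program" from the text
    x = int(not b)               # x = b'
    y = int((not a) or (not b))  # y = a' + b'
    return x, y

for a in (0, 1):
    for b in (0, 1):
        x, y = program(a, b)
        print(f"a={a} b={b}: x={x} y={y} phi={phi(a, b, x, y)}")
```

As expected, the program satisfies φ in every case except a=b=1, where no choice of x, y satisfies the specification at all.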

Can we find a similar procedure that allows us to synthesize "temporal Boolean programs" (also called reactive programs) from temporal logic specifications? If so, we will have a powerful tool for writing programs that are correct by construction.

The advantage of propositional Boolean logic is that no heuristic search is involved in solving the synthesis problem. A typical situation in automatic theorem proving is that we have a logic where several derivation rules can be applied to an initial set of axioms, and it is not clear how to apply the derivation rules so that a desired theorem can be proved. In propositional Boolean logic, we do not have to guess which derivation rules or axioms to use. We just calculate the normal form of an expression and find the answer. To be sure, we need to simplify the result using the rules of Boolean algebra; but we merely keep applying a fixed set of simplification rules until no more rules can be applied. We do not have to guess or to prove any non-obvious theorems.

We have seen how to compute normal forms of temporal logic expressions. These normal forms are similar to those in ordinary Boolean logic, except that temporal normal forms are recursive because they are infinite expression trees. We have found a way of denoting these infinite trees compactly as a system of recursive equations. Each equation defines a formula that recursively refers to itself and/or other formulas at the next step in time.

Another difference between temporal logic and ordinary Boolean logic is that the solution to the Boolean synthesis problem is a Boolean formula such as x=b', while the solution to the temporal synthesis problem cannot be a temporal formula such as x = ALL(a+SOME(b)). Such a formula would require knowing the future values of the parameters "a" and "b" in order to compute the present value of "x"!

The solution of the temporal synthesis problem must be a program for computing the value of x at every moment of time, given the knowledge of previous values of the parameters "a" and "b". The program for computing x is called reactive to emphasize that it must "react" to the values of the parameters "a", "b", as these values change with time. Unlike a Boolean program that computes a single value of x and stops, a reactive program must keep running indefinitely, computing the next values of x as the next values of the parameters "a", "b" become available. By definition, temporal variables are infinite sequences of Boolean values. So a reactive program never really stops; it is always ready to process further parameter values.
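The shape of such a program is easy to sketch. The following Python fragment shows only the structure: an endless loop that consumes one pair of parameter values per time step and emits the corresponding value of x. (The update rule x = a or b is a placeholder of our own, not derived from any specification in this post.)

```python
# A minimal sketch of the *shape* of a reactive program: it never
# terminates on its own, but consumes one (a, b) input pair per
# time step and emits the corresponding value of x.
# The rule x = a or b is a placeholder, for illustration only.

def reactive(inputs):
    for a, b in inputs:   # one iteration = one time step
        x = a or b        # placeholder update rule
        yield x

# feed in three time steps' worth of parameter values
print(list(reactive([(0, 0), (1, 0), (0, 1)])))
```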

Rather than trying to write a reactive program in a specific programming language, let us see how we could represent a reactive program using Boolean equations. This approach was successful for synthesizing Boolean programs: In order to synthesize a Boolean program that computes "x" from "a" and "b", it is sufficient to specify a formula ψ(a,b,x) and to require that, for any given values of "a" and "b", the value of "x" must be such that ψ(a,b,x)=1. As we have already seen, this is a fully adequate specification of a Boolean program.

A reactive program must compute an infinite sequence of values of "x". At each time step, the current value of "x" must be computed using the current values of "a" and "b". In our approach, computations are specified through Boolean equations. So, in order to compute the current value of "x", we need to know which Boolean equation ψ1(a,b,x)=1 needs to be solved at this time step. At the next step, there will perhaps be another Boolean equation ψ2(a,b,x)=1. So we also need to know which new Boolean equation is to be selected for the next step. (The choice of the equation will perhaps depend on the values of "a" and "b" at this step!)

This kind of step-by-step solution procedure is quite similar to the way we checked temporal formulas by a finite automaton. However, now the parameters "a","b" are known while the value of "x" is unknown. We need to modify the procedure so that somehow we find an (ordinary) Boolean equation for "x" at each step, and also decide the next step. (As before, we will be defining a non-deterministic automaton.)

Example of synthesis

Let us find out how the synthesis could work by considering an example specification:

"Whenever the boss comes by my office, I will start working. Once I start working, I will keep working until the telephone rings."

Denote B = "The boss comes by", W = "I am working", R = "the telephone rings". The specification is

φ = ALL( (B -> W)(W -> W TU R) )

where the temporal operation TU ("true until") is defined by

x TU y := y + x N+(x TU y).

(We are using the optimistic recursion since we do not require that the telephone ever rings.) We consider B and R as given parameters and W as an unknown temporal value. Our goal is to synthesize a program that computes W given B and R.
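To get a feeling for the TU operation, here is a small Python sketch (ours) of its recursion evaluated on a finite prefix of a trace. When the trace runs out, we let the optimistic N+ default to "true", which over-approximates the infinite semantics; this is only a finite-prefix approximation, not the full definition.

```python
# Finite-prefix sketch of the optimistic "true until":
# x TU y = y + x N+(x TU y).  On a finite trace, x TU y holds at
# time t if y holds now, or x holds now and x TU y holds at the
# next step; running off the end counts as success (optimistic N+).

def tu(xs, ys, t=0):
    if t == len(xs):
        return True  # optimistic default: the empty future succeeds
    return bool(ys[t]) or (bool(xs[t]) and tu(xs, ys, t + 1))

print(tu([1, 1, 0], [0, 0, 1]))  # True: x holds until y becomes true
print(tu([1, 1, 1], [0, 0, 0]))  # True only via the optimistic default
print(tu([1, 0, 0], [0, 0, 0]))  # False: x fails before y ever holds
```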

The normal form for φ consists of two equations:

φ = (B'+W)(W' + W TU R) N+(φ) = ((B'+W)(W'+R) + (B'+W)W N+(W TU R)) N+(φ)
= (B'W'+B'R+WR) N+(φ) + W N+(φ1);
φ1 = (W TU R)φ = (B'+W)R N+(φ) + W N+(φ1).

Let us try to satisfy the specification φ step by step. At the initial time t=1, we can have φ=1 only if B'W'+B'R+WR=1 or if W=1. In other words, the value of W at the present time must be such that

B'W'+B'R+WR+W = 1.

We can regard this equation as the "present value" of φ; it is computed by setting N(...)=1 in the normal form for φ (heuristically, assuming that all parts of the specification will continue to hold at the next time). This "present value" is a non-temporal formula, since we have omitted all N(...) operators. Let us denote this "present value" formula by PV(φ). In any case, PV(φ) is a necessary condition that W should satisfy at the present time.

In our example, we compute:

PV(φ) = B'W'+B'R+WR+W = B'W'+B'R+W = B'+B'R+W = B'+W.

PV(φ1) = B'R+WR+W = B'R+W.
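These simplifications are easy to verify by brute force over the eight valuations of B, W, R. A quick Python check (the function names pv_phi and pv_phi1 are ours):

```python
# Brute-force check of the "present value" simplifications:
# PV(phi)  = B'W' + B'R + WR + W  should equal  B' + W,
# PV(phi1) = B'R  + WR  + W       should equal  B'R + W.
from itertools import product

def pv_phi(B, W, R):
    nB, nW = 1 - B, 1 - W
    return (nB & nW) | (nB & R) | (W & R) | W   # B'W' + B'R + WR + W

def pv_phi1(B, W, R):
    return ((1 - B) & R) | (W & R) | W          # B'R + WR + W

for B, W, R in product((0, 1), repeat=3):
    assert pv_phi(B, W, R)  == ((1 - B) | W)         # = B' + W
    assert pv_phi1(B, W, R) == (((1 - B) & R) | W)   # = B'R + W
print("both simplifications verified")
```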

We now interpret the equation PV(φ)=1 as the specification for a program that computes the unknown W with known B and R. In our example, this equation has two solutions: W=B (the minimal solution) and W=1 (the maximal solution). The first solution means that we start working only if the boss comes by; the second solution means that we start working right away, whether or not the boss comes by.

For now, let's not commit to a choice of the solution for W, but keep in mind that there may be several solutions. In any case, we have succeeded in computing the present value of W, and it is time to move on to the next time step.

In order to move on to the next time step, we need to determine which of the two N() terms is needed. Suppose that presently the boss is not in, so B=0, and suppose we choose the minimal solution W=B. Then the present values yield

φ(t=1) = N+(φ).

So in this case we select the term N+(φ): we go on to the next step and again try to satisfy φ.

What if we had B=1 at the present, and the telephone does not ring (R=0)? Then our solution for W is unique: W=1 at the present time. The evaluation of φ gives

φ(t=1) = N+(φ1).

So in this case we go on checking φ1 at the next step.

What if we had B=1 at the present time, and the telephone rings (R=1)? In that case, W=1 is still the only solution. However, now the evaluation of φ at the present time gives

φ(t=1) = N+(φ) + N+(φ1).

So in this case we have a non-deterministic choice of φ or φ1 at the next step.

Whatever our choice, at the next step we have the same situation: we need to compute the value of W at the present time (which we obtain by solving the present-value equation) and then decide which formula to select for the next step.

In effect, we are building a non-deterministic finite automaton whose "states" are the formulas in our normal form, φ and φ1. In each "state" (φ or φ1) there is a non-temporal equation (PV(φ) or PV(φ1)), which needs to be solved for the unknown variable W given the known parameter values B,R at the current time. The solution W may or may not be unique; if it is not unique, it has to be chosen in some way. After obtaining W, we evaluate all terms of the present formula and compute the set of "candidate" next-step formulas N() that will have to be satisfied in the next step. We then non-deterministically select the next state among these candidates and repeat the synthesis at the next time step.
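The whole procedure for the boss/phone example fits in a few lines of Python. This is only a sketch: the present-value equations and the N+() coefficients come from the normal form above, but the way we resolve the two choice points is our own arbitrary policy (always take the minimal solution for W, and prefer φ when the next state is non-deterministic).

```python
# A sketch of the synthesized reactive program for the boss/phone
# example, run deterministically: minimal W, and state phi preferred
# whenever the next-state choice is non-deterministic.

def run(inputs):
    state = "phi"                     # start by trying to satisfy phi
    for B, R in inputs:
        nB = 1 - B
        if state == "phi":
            W = B                     # minimal solution of PV(phi) = B' + W = 1
        else:
            W = 0 if nB & R else 1    # minimal solution of PV(phi1) = B'R + W = 1
        yield W
        nW = 1 - W
        # coefficients of the N+() terms in the current normal form
        if state == "phi":
            c_phi, c_phi1 = (nB & nW) | (nB & R) | (W & R), W
        else:
            c_phi, c_phi1 = (nB | W) & R, W
        state = "phi" if c_phi else "phi1"   # prefer phi when both are possible

# the boss comes by at t=2, the telephone rings at t=4
print(list(run([(0, 0), (1, 0), (0, 0), (0, 1)])))
```

On this input the program idles, starts working when the boss appears, keeps working, and stops when the telephone rings.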

In principle, this completes the synthesis procedure! All that remains are the details...

Solvability and realizability

So far we have looked at only one example of a temporal specification, and several questions are still open at this point. Our procedure could fail in several ways:

1. What if there is no solution for W given some particular values B,R at a given time step?

We have to check for this! Namely, we check whether either of the formulas ∃W: PV(φ) and ∃W: PV(φ1) fails to be identically 1. If this happens, the specification is unsolvable: it can be satisfied only when the input parameters themselves satisfy certain conditions.

More generally, we need a way of computing the temporal formula

ψ(B,R) := ∃W : φ(B,R,W).

(Here ∃ W is quantifying an entire temporal sequence, not just the value of W at one time.) Normally we would expect ψ(B,R) to be identically TRUE, meaning that the solution W exists for any values of the input parameters B,R. It is easy to check this by computing the normal form of ψ' and checking its satisfiability; we expect that ψ' is unsatisfiable. If, contrary to our expectation, ψ' is satisfiable, then there exist some cleverly chosen temporal sequences B, R such that ψ(B,R) is false at least at one time moment, say t=n. At the time t=n, we would not be able to find a value W that satisfies the specification, if B and R were given by these cleverly chosen sequences. So these sequences provide a counterexample, showing us directly how we could "break" the specification.
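For comparison, in ordinary Boolean logic the existential quantifier is computed by substitution: ∃x: φ = φ[x:=0] + φ[x:=1]. A small Python sketch (names ours) applies this to the specification from the prologue and reproduces the fact that it is solvable exactly when not both a=1 and b=1:

```python
# Boolean quantifier elimination by substitution:
# exists x : phi  =  phi[x:=0] + phi[x:=1].

def phi(a, b, x, y):
    # (a' + x)(b' + x'y), the prologue's example specification
    return ((1 - a) | x) & ((1 - b) | ((1 - x) & y))

def exists_xy(a, b):
    # psi(a, b) = exists x, y : phi(a, b, x, y)
    return max(phi(a, b, x, y) for x in (0, 1) for y in (0, 1))

print([(a, b, exists_xy(a, b)) for a in (0, 1) for b in (0, 1)])
```

The temporal analogue is harder, since ∃W quantifies over an entire infinite sequence, but the role of ψ(B,R) as a solvability condition is the same.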

Alternatively, the temporal formula ψ(B,R) may be considered as an additional restriction on the input parameters B and R. We may add this condition to the specification, or we may assume that this condition holds (e.g. if we can make sure that the input parameters violating the condition ψ will never be given to our synthesized program).

In any case, it is necessary to decide how the synthesis procedure should handle this case.

2. Suppose there is a solution W for given B,R, but when we evaluate φ with this W, we get 0. Then no terms N() can be selected for the next step. Can this be?

(No. Since PV(φ)=1, it is guaranteed that at least one term in the normal form remains nonzero.)

3. It is not clear how to convert a non-deterministic automaton into a deterministic one, if at each step we have complicated Boolean conditions that need to be evaluated, and a Boolean equation to be solved! (What does it mean that we non-deterministically solve several Boolean equations at once?)

Good question!

4. In our example, only the optimistic recursion was used. If our temporal specification contains pessimistic recursion, we need to enforce the condition that the pessimistic recursion does not loop indefinitely. How do we implement this in our synthesis procedure?

For example, the specification φ = ALL(SOME x) has the normal form

φ = x N+(φ) + N-(φ).

The interpretation of this specification is that "x = 1 happens infinitely many times in the future".

When we run the synthesis procedure for this specification, we come to solving the present-value equation PV(φ)=x+1=1. This equation has two solutions, x=0 and x=1. If we choose x=0, we are selecting the pessimistic recursion leading to the same formula. Nothing prevents us from choosing x=0 also in the next step, and so on indefinitely. However, the pessimistic recursion prohibits keeping x=0 forever; the sequence x = {0,0,...} does not satisfy the specification. The specification ALL(SOME x) requires that x becomes true at some point in the future (and even infinitely many times). Somehow we need to implement this requirement in the synthesis procedure!
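We can see the difficulty concretely in a couple of lines of Python (ours): the present-value equation is identically true, so the local check never rules out the all-zero choice, no matter how long we run.

```python
# With phi = x N+(phi) + N-(phi), the present value PV(phi)
# replaces both N+ and N- by 1, giving x + 1, which equals 1
# for either value of x: x = 0 is locally allowed at every step.
for x in (0, 1):
    assert (x | 1) == 1        # PV(phi) = x + 1 is identically 1

# Choosing x = 0 forever is therefore never rejected locally, yet
# the sequence {0,0,...} violates ALL(SOME x): no finite amount of
# present-value checking detects the broken pessimistic recursion.
trace = [0] * 10               # always take the locally allowed x = 0
print("SOME x on this prefix:", any(trace))
```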

A simple-minded approach to solving this problem would be to prohibit pessimistic recursions when computing the value of x. In other words, we define the "optimistic present value" OPV(φ) as the formula φ where all N+(...) are replaced by 1 and all N-(...) are replaced by 0.

If we use this approach with the above example, we will now have OPV(φ)=x. The equation OPV(φ)=1 has only one solution, x=1. Thus, we will immediately satisfy the pessimistic requirement. However, this approach will only generate the solution x = {1,1,...}; in other words, we will never have x=0 at any time, even though this is permitted by the specification. It may not always be desirable to satisfy pessimistic recursions immediately.

What would be a more flexible way of treating pessimistic recursions?

5. What if we choose, on purpose, a specification that cannot be realized by any program, because it requires knowing the future ("clairvoyance")? The simplest example of such a specification is

ψ(p,x) = ALL(x == SOME p), where x is unknown and p is a parameter.

Here the Boolean operation x == y (equivalence) is defined as

x==y := (x->y)(y->x) = xy + x'y'.

The specification ψ(p,x) expresses the requirement that the present value of x must be true if and only if p=1 at some time in the future. Obviously, the present value of x cannot be computed by any program that knows only the past values of p and x. This specification is unrealizable because it requires clairvoyance.

Of course, we do not expect such specifications to be necessary for practical programming. However, we could by mistake write such a specification and attempt to synthesize a program from it. How does the synthesis procedure react to such specifications? Can we detect unrealizable specifications, and can we point out where the problem is?

Very good question!

(To be continued.)

Some exercises

1. Optimism and pessimism

Suppose we define a temporal operation F using the pessimistic recursion,

x F y = a(x,y) + b(x,y)N-(x F y).

Show that x F y can be expressed through a different temporal operation, x G y, which uses the optimistic recursion, and the SOME() operation, as x F y = (x G y)SOME(a(x,y)). Find out how (x G y) must be defined.

Solution:

x F y = (x G y)SOME(a(x,y))

where

x G y = a(x,y) + b(x,y)N+(x G y).

2. Negation
Suppose we define a temporal operation F using the pessimistic recursion,

x F y = a(x,y) + b(x,y)N-(x F y).

Show that the negation of (x F y) is equivalent to another temporal operation G, so (x F y)' = x G y. Derive the normal form for (x G y).

Solution:

x G y = a'b' + a'N+(x G y)

3. Simplification

a) Show that ALL(ALL(x)) = ALL(x) and SOME(SOME(x)) = SOME(x).

b) Show that SOME(x+y)=SOME(x)+SOME(y) and ALL(xy)=ALL(x)ALL(y).

c)* Show that ALL(x)+ALL(xy)=ALL(x) and SOME(x)SOME(x+y)=SOME(x).

Prove these statements informally (by using the semantics of ALL and SOME) as well as by using normal forms.

For c), it is necessary to compute a canonical form rather than a normal form.
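For a) and b) (and, semantically, also for c)), one can run a brute-force check over finite traces, using the suffix semantics: ALL(x) at time t means "x holds on the whole suffix", and SOME(x) at time t means "x holds somewhere on the suffix". The Python below is ours, and since it only tests finite prefixes it is evidence rather than a proof:

```python
# Finite-trace sanity check of the exercise identities, using
# ALL(x)[t] = all of x[t:] and SOME(x)[t] = any of x[t:].
from itertools import product

def ALL(x):  return [int(all(x[t:])) for t in range(len(x))]
def SOME(x): return [int(any(x[t:])) for t in range(len(x))]
def AND(x, y): return [a & b for a, b in zip(x, y)]
def OR(x, y):  return [a | b for a, b in zip(x, y)]

for n in (1, 2, 3):
    for bits in product((0, 1), repeat=2 * n):
        x, y = list(bits[:n]), list(bits[n:])
        assert ALL(ALL(x)) == ALL(x)                      # (a)
        assert SOME(SOME(x)) == SOME(x)                   # (a)
        assert SOME(OR(x, y)) == OR(SOME(x), SOME(y))     # (b)
        assert ALL(AND(x, y)) == AND(ALL(x), ALL(y))      # (b)
        assert OR(ALL(x), ALL(AND(x, y))) == ALL(x)       # (c), semantically
        assert AND(SOME(x), SOME(OR(x, y))) == SOME(x)    # (c), semantically
print("all identities hold on traces of length up to 3")
```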

d) Simplify the formula φ = SOME(ALL(x)+ALL(y))ALL(xy)

4. Satisfiability

Show that the formula φ = ALL( (x->ALL z')(y -> SOME x)(z -> SOME y)(SOME z) ) is unsatisfiable.