Contexts and the Concept of Mild Context-Sensitivity

Citation: M. Kudlek, C. Martín-Vide, A. Mateescu and V. Mitrana, "Contexts and the Concept of Mild Context-Sensitivity", Linguistics and Philosophy, Vol. 26, No. 6 (Dec., 2003), pp. 703-725. Published by Springer. Stable URL: http://www.jstor.org/stable/25001907. Accessed: 14/06/2014 12:19.


This content downloaded from 185.2.32.49 on Sat, 14 Jun 2014 12:19:14 PM. All use subject to JSTOR Terms and Conditions.

M. KUDLEK, C. MARTIN-VIDE, A. MATEESCU and V. MITRANA

CONTEXTS AND THE CONCEPT OF MILD CONTEXT-SENSITIVITY *

ABSTRACT. We introduce and study a natural extension of Marcus external contextual grammars. This mathematically simple mechanism generates a proper subclass of the simple matrix languages, which are known to be mildly context-sensitive, and is itself still mildly context-sensitive. Furthermore, we get an infinite hierarchy of mildly context-sensitive families of languages. Then we attempt to fill a gap regarding the linguistic relevance of these mechanisms, which consists in defining a tree structure on the strings generated by many-dimensional external contextual grammars, and investigate some related issues. Several open problems are finally discussed.

1. PRELIMINARIES

Soon after Chomsky's seminal work on generative grammars, a long debate started about the place of the grammars of natural languages in the Chomsky hierarchy. This debate, which lasted for more than twenty years, focused on the context-freeness of natural languages. Many arguments supported context-freeness (see, e.g., Pullum 1985; Rounds et al. 1987), whereas other authors refuted them by exhibiting constructions in natural languages that are noncontext-free (Pullum and Gazdar 1982; Gazdar and Pullum 1985; Partee et al. 1990; Manaster Ramer 1999).

The Chomsky hierarchy does not provide a specific demarcation of language families having the desired properties. Whereas the family of context-free languages has good computational properties, it does not contain some important formal languages that appear in human languages. On the other hand, the family of context-sensitive languages contains all important constructions that occur in natural languages, but the membership problem for languages in this family cannot be solved in deterministic polynomial time.

The idea of finding a grammatical framework for natural language constructions has led to the concept of mildly context-sensitive generative devices (see Joshi 1985 and Joshi and Schabes 1997 for linguistic motivations).

* This work has been partially supported by Direcció General de Recerca, Generalitat de Catalunya (programme PIV). All correspondence to Carlos Martín-Vide.

Linguistics and Philosophy 26: 703-725, 2003. © 2003 Kluwer Academic Publishers. Printed in the Netherlands.

Mildly context-sensitive languages should include the most significant formal languages derived from the noncontext-free constructions that were found in the syntax of natural languages. Actually, there exists only a small amount of material on such constructions. The reader interested in these aspects may wish to consult Bresnan et al. (1987), Culy (1987), Shieber (1987), or Savitch et al. (1987) for an overview. We next briefly list a few such noncontext-free structures, which will turn out to be covered by our grammatical model. The first example is from Dutch (Bresnan et al. 1987):

... dat Jan Piet Marie de kinderen zag helpen laten zwemmen.

(that Jan saw Piet help Marie make the children swim.)

showing a duplication-like structure {ww̄ | w ∈ {a, b}*}, where w̄ is the word obtained from w by replacing each letter with its barred copy. This is only weakly noncontext-free, i.e., only in the deep structure.

Another example is from the African language Bambara (Culy 1987). In its vocabulary, a duplication structure is found demonstrating a strong noncontext-freeness, i.e., on the surface too:

malonyininafilela o malonyininafilela o

(one who searches for rice watchers - one who searches for rice watchers = whoever searches for rice watchers).

This has the structure {wcw | w ∈ {a, b}*}. But the crossed agreement structure {a^m b^n c a^m b^n | m, n ≥ 0} can also be inferred.

The last example, also strongly noncontext-free, is from Swiss German (Zurich) (Shieber 1987), showing again crossed agreement:

Jan säit das mer d'chind em Hans es huus haend wele laa hälfe aastriiche.

(Jan said that we wanted to let the children help Hans paint the house.)

This has the structure x a^m b^n y c^m d^n z, where a, b stand for dative and accusative noun phrases, respectively, and c, d for the corresponding dative and accusative verb phrases.

On the other hand, languages in the mildly context-sensitive family must be computationally feasible, i.e., the membership problem for them must be solvable in deterministic polynomial time, and should have the bounded growth property (the length difference of any two consecutive words in the language must be bounded). The last condition will be replaced here by a stronger one, that of semilinearity.

In this paper, by a mildly context-sensitive family of languages we mean a family 𝓛 of languages that satisfies the following conditions:

(i) each language in 𝓛 is semilinear,

(ii) for each language in 𝓛 the membership problem is solvable in deterministic polynomial time, and

(iii) 𝓛 contains the following three noncontext-free languages:

- multiple agreements: L1 = {a^n b^n c^n | n ≥ 0},
- crossed agreements: L2 = {a^n b^m c^n d^m | n, m ≥ 0}, and
- duplication: L3 = {ww | w ∈ {a, b}*}.
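The three benchmark languages are easy to recognize mechanically; the following sketch (the helper names are ours, not the paper's) makes condition (iii) concrete:

```python
import re

# L1 = {a^n b^n c^n | n >= 0}: multiple agreements.
def in_L1(w):
    m = re.fullmatch(r"(a*)(b*)(c*)", w)
    return m is not None and len(m.group(1)) == len(m.group(2)) == len(m.group(3))

# L2 = {a^n b^m c^n d^m | n, m >= 0}: crossed agreements.
def in_L2(w):
    m = re.fullmatch(r"(a*)(b*)(c*)(d*)", w)
    return (m is not None and len(m.group(1)) == len(m.group(3))
            and len(m.group(2)) == len(m.group(4)))

# L3 = {ww | w in {a, b}*}: duplication.
def in_L3(w):
    n = len(w)
    return n % 2 == 0 and set(w) <= {"a", "b"} and w[:n // 2] == w[n // 2:]
```

Each check runs in linear time, far below the polynomial bound required by condition (ii).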

Note that it is often considered that such a family contains all context-free languages and/or some other noncontext-free languages: the k multiple agreements L1^(k) = {a1^n a2^n ... ak^n | n ≥ 0}, where k ≥ 3, and the marked duplications L3' = {wcw | w ∈ {a, b}*}. In the sequel, we will take these variants into account too, although we do not want to enter into this discussion very deeply.

Why pay attention to yet another mechanism to fabricate mildly context-sensitive families? There are many such mechanisms: tree adjoining grammars (Joshi et al. 1975, 1997), head grammars (Pollard 1984; Roach 1987), combinatory categorial grammars (Steedman 1985), linear indexed grammars (Aho 1968; Gazdar 1985), control grammars (Weir 1992), simple matrix grammars (Ibarra 1970), sewing grammars (Martín-Vide and Mateescu 1999, 2000b, 2001), etc. Some of them were proved to be equivalent (see Joshi et al. 1991). Why consider the one in this paper? Because, as we expect to show below, the mechanism presented here is (technically) extremely simple. Although these mechanisms generate a proper subclass of the simple matrix languages, they are still mildly context-sensitive. An important feature and advantage of the models we are going to present is that they are comparatively elementary. These models do not involve nonterminals. Moreover, they have no rules of derivation except one general rule: to adjoin contexts. These models are clearly simpler than any other models found in the literature on mildly context-sensitive families of languages. One may argue that the absence of nonterminals results in the absence in the formalism of any way of structuring the grammar. An attempt to overcome this drawback will be discussed in the last section, where a tree structure will be associated with the strings generated by these grammars.

Contextual grammars were first considered in Marcus (1969) with the aim of modelling some natural aspects of descriptive linguistics, like, for instance, the acceptance of a word (construction) only in certain contexts. For a detailed introduction to the topic, the reader is referred to the monograph Păun (1997). We start by presenting the formal definition of a Marcus external contextual grammar. Later on, these grammars are extended to the many-dimensional case: we investigate them and prove the existence of an infinite hierarchy of mildly context-sensitive families of languages. This result does not contradict the results in Joshi et al. (1991), since all these families actually converge to another family of mildly context-sensitive languages. In a certain sense, our hierarchy is similar to the one in Weir (1992). Then, we present incomparability relationships with the families of regular and context-free languages and discuss some closure properties. One may say that our grammatical formalism, generating a class of languages occupying an eccentric position with respect to the Chomsky hierarchy, holds a so-called "transversal" property.

Finally, we shall discuss tree structures, open problems and further directions for research.

2. FORMAL LANGUAGE THEORY PREREQUISITES

Let Σ be an alphabet and let Σ* be the free monoid generated by Σ, with the identity denoted by λ. The free semigroup generated by Σ is Σ+ = Σ* − {λ}. Elements of Σ* (Σ+) are referred to as words (nonempty words); λ is the empty word. Assume that a ∈ Σ and w ∈ Σ*; the length of w is denoted by |w|, while the number of occurrences of a in w is denoted by |w|_a. A context is a pair of words, i.e., (u, v), where u, v ∈ Σ*.

The families of regular, minimal linear, linear, context-free, and context-sensitive languages are denoted by REG, MinLIN, LIN, CF, and CS, respectively. A minimal linear language is a language generated by a minimal linear grammar, that is, a linear grammar with just one nonterminal.

Assume that Σ = {a1, a2, ..., ak}. The Parikh mapping, denoted by Ψ, is:

Ψ : Σ* → N^k, Ψ(w) = (|w|_a1, |w|_a2, ..., |w|_ak).


If L is a language, then the Parikh set of L is defined by:

Ψ(L) = {Ψ(w) | w ∈ L}.

A linear set is a set M ⊆ N^k such that M = {v0 + x1v1 + ... + xmvm | x1, ..., xm ∈ N}, for some v0, v1, ..., vm ∈ N^k. A semilinear set is a finite union of linear sets, and a semilinear language is a language L such that Ψ(L) is a semilinear set.
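As a small illustration (our own sketch), the Parikh mapping simply counts letters:

```python
from collections import Counter

def parikh(w, alphabet):
    """Parikh vector Psi(w): occurrence counts of each letter, in alphabet order."""
    counts = Counter(w)
    return tuple(counts[a] for a in alphabet)

def parikh_set(language, alphabet):
    """Parikh set Psi(L) for a finite language L given as an iterable of words."""
    return {parikh(w, alphabet) for w in language}
```

Note that Ψ forgets the order of letters: parikh_set({"ab", "ba"}, ("a", "b")) collapses both words to the single vector (1, 1).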

The reader is referred to Rozenberg and Salomaa (1997) or Salomaa (1973) for the basic notions on formal languages we use in the sequel.

A Marcus external contextual grammar is G = (Σ, B, C), where Σ is the alphabet of G, B is a finite subset of Σ* called the base of G, and C is a finite set of contexts, i.e., a finite set of pairs of words over Σ. C is called the set of contexts of G.

Let G = (Σ, B, C) be a Marcus external contextual grammar. The direct derivation relation with respect to G is a binary relation between words over Σ, denoted ⇒_G, or ⇒ if G is understood from the context. By definition, x ⇒_G y, where x, y ∈ Σ*, iff y = uxv for some (u, v) ∈ C. The derivation relation with respect to G, denoted ⇒*_G, or ⇒* if G is understood from the context, is the reflexive and transitive closure of ⇒_G.

Let G = (Σ, B, C) be a Marcus external contextual grammar. The language generated by G, denoted L(G), is defined as:

L(G) = {y ∈ Σ* | there exists x ∈ B such that x ⇒*_G y}.

One can verify that the language generated by G = (Σ, B, C) is the smallest language L over Σ such that:

(i) B ⊆ L,

(ii) if x ∈ L and (u, v) ∈ C, then uxv ∈ L.
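Operationally, L(G) can be enumerated up to any length bound by closing the base under context adjoining; the breadth-first sketch below is ours (λ is rendered as the empty Python string):

```python
from collections import deque

def enumerate_ec(base, contexts, max_len):
    """All words of L(G), G = (Sigma, base, contexts), of length <= max_len.

    base: iterable of words; contexts: iterable of (u, v) pairs.
    One derivation step rewrites x into u x v for some context (u, v).
    """
    seen = {w for w in base if len(w) <= max_len}
    queue = deque(seen)
    while queue:
        x = queue.popleft()
        for u, v in contexts:
            y = u + x + v
            if len(y) <= max_len and y not in seen:
                seen.add(y)
                queue.append(y)
    return seen
```

For instance, the grammar with base {λ} and single context (a, b) generates {a^n b^n | n ≥ 0}, a minimal linear language.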

The family of all Marcus external contextual languages is denoted by EC.

REMARK 2.1. Obviously, EC = MinLIN, which is a strict subfamily of LIN, incomparable with REG (see Păun 1997).

There are many variants of Marcus contextual grammars, all of them based on context adjoining. The differences consist in the way of adjoining contexts, the sites where contexts are adjoined, the use of selectors, etc. The reader interested in this topic is referred to the monograph Păun (1997).


3. THE MANY-DIMENSIONAL CASE

In this section we extend external contextual grammars to Marcus many-dimensional external contextual grammars, which work with vectors of words and vectors of contexts. Let p ≥ 1 be a fixed integer and let Σ be an alphabet. A p-word x over Σ is a p-dimensional vector whose components are words over Σ, i.e., x = (x1, x2, ..., xp), where xi ∈ Σ*, 1 ≤ i ≤ p.

A p-context c over Σ is a p-dimensional vector whose components are contexts over Σ, i.e., c = [c1, c2, ..., cp], where ci = (ui, vi), ui, vi ∈ Σ*, 1 ≤ i ≤ p. We denote vectors of words with round brackets, whereas vectors of contexts are denoted with square brackets.

Let p ≥ 1 be an integer. A Marcus p-dimensional external contextual grammar is G = (Σ, B, C), where Σ is the alphabet of G, B is a finite set of p-words over Σ called the base of G, and C is a finite set of p-contexts over Σ. C is called the set of contexts of G.

Let G = (Σ, B, C) be a Marcus p-dimensional external contextual grammar. The direct derivation relation with respect to G is a binary relation between p-words over Σ, denoted ⇒_G, or ⇒ if G is understood from the context. Let x = (x1, x2, ..., xp) and y = (y1, y2, ..., yp) be two p-words over Σ. By definition, x ⇒_G y iff y = (u1x1v1, u2x2v2, ..., upxpvp) for some p-context c = [(u1, v1), (u2, v2), ..., (up, vp)] ∈ C. The derivation relation with respect to G, denoted ⇒*_G, or ⇒* if no confusion is possible, is the reflexive and transitive closure of ⇒_G.

Let G = (Σ, B, C) be a Marcus p-dimensional external contextual grammar. The language generated by G, denoted L(G), is defined as:

L(G) = {y ∈ Σ* | there exists (x1, x2, ..., xp) ∈ B such that (x1, x2, ..., xp) ⇒*_G (y1, y2, ..., yp) and y = y1y2...yp}.

We denote by ECp the family of all Marcus p-dimensional external contextual languages.
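The same enumeration idea as in the one-dimensional case extends to p dimensions; in the following sketch (ours), a p-word is a Python tuple of strings and a p-context a tuple of (u, v) pairs:

```python
from collections import deque

def enumerate_ecp(base, contexts, max_len):
    """Words of L(G) for a p-dimensional external contextual grammar G,
    restricted to total length <= max_len.

    A step maps (x1, ..., xp) to (u1 x1 v1, ..., up xp vp); the word
    contributed to L(G) is the concatenation x1 x2 ... xp.
    """
    def size(x):
        return sum(len(component) for component in x)

    seen = {x for x in base if size(x) <= max_len}
    queue = deque(seen)
    while queue:
        x = queue.popleft()
        for c in contexts:
            y = tuple(u + xi + v for xi, (u, v) in zip(x, c))
            if size(y) <= max_len and y not in seen:
                seen.add(y)
                queue.append(y)
    return {"".join(x) for x in seen}
```

With base {(λ, λ)} and contexts [(a, λ), (a, λ)] and [(b, λ), (b, λ)], the two components always stay equal, so the enumeration yields exactly the duplication words ww.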

REMARK 3.1. Note that EC = EC1 and ECp ⊆ ECq for all 1 ≤ p ≤ q. Note also that the trivial (empty) context [(λ, λ), (λ, λ), ..., (λ, λ)] is not necessary; therefore we will not consider it in the sequel.

We now show that any family ECp for p ≥ 2 is a subfamily of the linear simple matrix languages. A simple matrix grammar of degree n (see Ibarra 1970) is a construct G = (N1, ..., Nn, Σ, P, S), where Ni, 1 ≤ i ≤ n, are finite sets of nonterminals, Σ is a terminal alphabet, S is the start symbol:

S ∉ Σ ∪ ⋃_{i=1}^{n} Ni,

and P is a finite set of n-dimensional vectors of rules (r1, ..., rn), where each rule ri is a context-free rule over the alphabet Ni ∪ Σ such that for all pairs of rules ri : Ai → xi, rj : Aj → xj it follows that |xi|_Ni = |xj|_Nj, 1 ≤ i, j ≤ n.

Furthermore, P contains rules of the form (S → u), with u ∈ Σ*, and also rules of the form (S → A1A2...An), where Ai ∈ Ni, 1 ≤ i ≤ n.

Let G be a simple matrix grammar of degree n. G defines a relation of direct derivation as follows:

S ⇒_G v iff (S → v) ∈ P,

and

u0X1u1...Xnun ⇒_G u0v1u1...vnun iff (X1 → v1, ..., Xn → vn) ∈ P,

where uj ∈ (Σ ∪ ⋃_{i=1}^{n} Ni)*, j = 0, 1, ..., n, Xi ∈ Ni, i = 1, ..., n, and, additionally, the derivation is leftmost on each of the n substrings in (Ni ∪ Σ)* of the current string.

The derivation relation induced by G, denoted ⇒*_G, is the reflexive and transitive closure of ⇒_G. The language generated by a simple matrix grammar G of degree n is:

L(G) = {w ∈ Σ* | S ⇒*_G w}.

A linear simple matrix grammar of degree n, where n ≥ 1, is a simple matrix grammar of degree n, G = (N1, ..., Nn, Σ, P, S), such that all the rules occurring as components in the n-dimensional vectors from P, excepting the rules starting with S, are Chomsky linear rules.

THEOREM 3.1. Let L be a language and p an integer, p ≥ 1. The following statements are equivalent:

(i) L ∈ ECp.

(ii) L is generated by a linear simple matrix grammar G = (N1, ..., Np, Σ, P, S) of degree p such that, for all 1 ≤ i ≤ p, card(Ni) = 1.


Proof. (i) ⇒ (ii). Let G' = (Σ, B, C) be a Marcus p-dimensional external contextual grammar such that L(G') = L. Define the linear simple matrix grammar of degree p, G = (N1, ..., Np, Σ, P, S), such that Ni = {Xi}, 1 ≤ i ≤ p, where the Xi are pairwise different nonterminals, also different from S. The set P of rules contains the rule (S → X1X2...Xp) and, for each p-context c = [(u1, v1), (u2, v2), ..., (up, vp)] from C, P contains the p-vector of rules (X1 → u1X1v1, X2 → u2X2v2, ..., Xp → upXpvp). Moreover, for each p-word (y1, y2, ..., yp) from B, P contains the p-vector of rules (X1 → y1, X2 → y2, ..., Xp → yp). From this definition of G, it is easy to see that L(G) = L.

(ii) ⇒ (i). Let G be a linear simple matrix grammar of degree p with L(G) = L. Assume that G = (N1, ..., Np, Σ, P, S) with Ni = {Xi}, 1 ≤ i ≤ p, where the Xi are pairwise different nonterminals, also different from S. We define a Marcus p-dimensional external contextual grammar G' = (Σ, B, C) such that L(G') = L. The set C of p-contexts contains c = [(u1, v1), (u2, v2), ..., (up, vp)] whenever P contains the p-vector of rules (X1 → u1X1v1, X2 → u2X2v2, ..., Xp → upXpvp). Furthermore, B contains the p-word (y1, y2, ..., yp) whenever P contains the p-vector of rules (X1 → y1, X2 → y2, ..., Xp → yp), where yi ∈ Σ*, 1 ≤ i ≤ p. One can easily verify that L(G') = L. □
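The direction (i) ⇒ (ii) is a purely mechanical translation; the sketch below (our own encoding: a rule A → w is a pair of strings, a rule vector a tuple of such pairs) builds the rule set P of the theorem:

```python
def contextual_to_simple_matrix(base, contexts, p):
    """Theorem 3.1, (i) => (ii): rule vectors of a linear simple matrix grammar
    of degree p, with one nonterminal Xi per component, equivalent to the
    p-dimensional external contextual grammar (base, contexts)."""
    X = [f"X{i}" for i in range(1, p + 1)]
    P = [("S", "".join(X))]  # the start rule S -> X1 X2 ... Xp
    # one rule vector (X1 -> u1 X1 v1, ..., Xp -> up Xp vp) per p-context
    for c in contexts:
        P.append(tuple((X[i], c[i][0] + X[i] + c[i][1]) for i in range(p)))
    # one terminating rule vector (X1 -> y1, ..., Xp -> yp) per base p-word
    for y in base:
        P.append(tuple((X[i], y[i]) for i in range(p)))
    return P
```

Applied to the grammar for {a^n b^n c^n} used later in Theorem 3.3 (base (λ, λ), context [(a, b), (c, λ)]), it produces S → X1X2, the rule vector (X1 → aX1b, X2 → cX2), and the terminating vector (X1 → λ, X2 → λ).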

We now present a further characterization of each family ECp which will be useful in what follows.

For an alphabet Σ and a natural number p, we define the new alphabet [Σ, p] = Σ × {1, 2, ..., p} and the two morphisms:

(1) τp : [Σ, p]* → (Σ*)^p defined by:

(i) τp(λ) = (λ, λ, ..., λ),

(ii) τp((a, i)) = (λ, ..., λ, a, λ, ..., λ),

where a is the ith component of τp((a, i)), for any 1 ≤ i ≤ p, and the operation in the monoid (Σ*)^p is componentwise concatenation, and

(2) fp : (Σ*)^p → Σ* defined by fp(x1, x2, ..., xp) = x1x2...xp.
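A direct sketch of the two morphisms (ours; a letter (a, i) of [Σ, p] is encoded as a Python pair):

```python
def tau_p(word, p):
    """tau_p : [Sigma, p]* -> (Sigma*)^p.

    The letter (a, i) contributes a to the i-th component and lambda to all
    others; a word is mapped by componentwise concatenation.
    """
    components = [""] * p
    for a, i in word:  # word is a sequence of (letter, index) pairs, 1 <= i <= p
        components[i - 1] += a
    return tuple(components)

def f_p(vector):
    """f_p : (Sigma*)^p -> Sigma*, concatenating the p components."""
    return "".join(vector)
```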

THEOREM 3.2. Let L be a language over an alphabet Σ. L ∈ ECp iff there exists a language L' ⊆ [Σ, p]*, L' ∈ EC, such that L = fp(τp(L')).

Proof. Let G = (Σ, B, C) be a p-dimensional external contextual grammar. We construct the following external contextual grammar G' = ([Σ, p], B', C'), where τp^{-1}(x1, ..., xp) denotes a fixed word in the τp-preimage of (x1, ..., xp), and:

B' = {τp^{-1}(x1, x2, ..., xp) | (x1, x2, ..., xp) ∈ B},

C' = {(τp^{-1}(u1, u2, ..., up), τp^{-1}(v1, v2, ..., vp)) | [(u1, v1), (u2, v2), ..., (up, vp)] ∈ C}.

Clearly, L(G) = fp(τp(L(G'))).

Conversely, for an external contextual grammar G = ([Σ, p], B, C), we construct a p-dimensional external contextual grammar G' = (Σ, B', C') with B' and C' defined by:

B' = {τp(x) | x ∈ B},

C' = {[(u1, v1), (u2, v2), ..., (up, vp)] | τp(u) = (u1, u2, ..., up), τp(v) = (v1, v2, ..., vp), (u, v) ∈ C}.

One can easily check that fp(τp(L(G))) = L(G') holds. □

Based on the last two results, we can state:

THEOREM 3.3. For every integer p ≥ 2, the family ECp is a mildly context-sensitive family of languages.

Proof. (1) Each language in ECp, where p ≥ 1, is a semilinear language, since each external contextual language is semilinear and the mappings fp and τp from the previous theorem preserve semilinearity.

(2) The membership problem means the following: given a language L ⊆ Σ* (defined by a certain type of grammar, automaton, etc.) and a word w ∈ Σ*, decide algorithmically whether w is in L or not. Since the construction in Theorem 3.1 is effective and the membership problem is polynomially decidable for simple matrix grammars (Ibarra 1970), it follows that each family ECp, p ≥ 1, is parsable in polynomial time.

(3) Now, let Σ be an alphabet. Clearly, all languages ∅, Σ*, and F ⊆ Σ* with F finite are in ECp for every p ≥ 1. The following languages:

- multiple agreements: L1 = {a^n b^n c^n | n ≥ 0},
- crossed agreements: L2 = {a^n b^m c^n d^m | n, m ≥ 0}, and
- duplication: L3 = {ww | w ∈ {a, b}*}

are in ECp for every p ≥ 2.

Indeed, L1 is generated by the grammar G1 = ({a, b, c}, B, C), where B = {(λ, λ)} and C = {[(a, b), (c, λ)]}.

L2 is generated by the grammar G2 = ({a, b, c, d}, B, C), where B = {(λ, λ)} and C = {[(a, λ), (c, λ)], [(λ, b), (λ, d)]}.

Finally, L3 is generated by the grammar G3 = ({a, b}, B, C), where B = {(λ, λ)} and C = {[(a, λ), (a, λ)], [(b, λ), (b, λ)]}. □
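These three grammars can be checked experimentally by a bounded breadth-first enumeration of their derivations (the enumeration code is our sketch, with λ rendered as the empty string):

```python
from collections import deque

def words_up_to(base, contexts, max_len):
    """Enumerate a p-dimensional external contextual language up to a length bound."""
    seen, queue = set(base), deque(base)
    while queue:
        x = queue.popleft()
        for c in contexts:
            y = tuple(u + xi + v for xi, (u, v) in zip(x, c))
            if sum(map(len, y)) <= max_len and y not in seen:
                seen.add(y)
                queue.append(y)
    return {"".join(x) for x in seen if sum(map(len, x)) <= max_len}

G1 = ({("", "")}, [(("a", "b"), ("c", ""))])                          # L1
G2 = ({("", "")}, [(("a", ""), ("c", "")), (("", "b"), ("", "d"))])   # L2
G3 = ({("", "")}, [(("a", ""), ("a", "")), (("b", ""), ("b", ""))])   # L3
```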

COMMENT. One can easily prove that the language of k multiple agreements L1^(k) = {a1^n a2^n ... ak^n | n ≥ 0}, where k ≥ 3, is in ECp if k = 2p, and in ECp+1 if k = 2p + 1. In other words, p-dimensional external contextual grammars can count up to 2p. Also, it can be proved that the language of marked duplications L3' = {wcw | w ∈ {a, b}*} is in EC2 (in the grammar associated with L3, replace the set B with B = {(c, λ)}).

The following necessary conditions for a language to be a many-dimensional external contextual language are direct consequences of Theorem 3.2.

LEMMA 3.1. Let L ⊆ Σ* be a language in ECp, p ≥ 1. There exist two integers n ≥ 1 and k ≥ 1 such that:

(i) (pumping an arbitrary context) If w ∈ L with |w| ≥ n, then w has a decomposition w = x1u1y1v1x2u2y2v2...xpupypvpxp+1, with 0 < |u1| + |v1| + |u2| + |v2| + ... + |up| + |vp| ≤ k, such that, for all i ≥ 0, the following words are in L:

wi = x1 u1^i y1 v1^i x2 u2^i y2 v2^i ... xp up^i yp vp^i xp+1.

(ii) (pumping an innermost context) If w ∈ L with |w| ≥ n, then w has a decomposition w = x1u1y1v1x2u2y2v2...xpupypvpxp+1, with 0 < |u1| + |v1| + |u2| + |v2| + ... + |up| + |vp| ≤ k and |y1| + |y2| + ... + |yp| ≤ n, such that, for all i ≥ 0, the following words are in L:

wi = x1 u1^i y1 v1^i x2 u2^i y2 v2^i ... xp up^i yp vp^i xp+1.

(iii) (pumping an outermost context) If w ∈ L with |w| ≥ n, then w has a decomposition w = u1y1v1u2y2v2...upypvp, with 0 < |u1| + |v1| + |u2| + |v2| + ... + |up| + |vp| ≤ k, such that, for all i ≥ 0, the following words are in L:

wi = u1^i y1 v1^i u2^i y2 v2^i ... up^i yp vp^i.

(iv) (pumping all occurring contexts) If w ∈ L with |w| ≥ n, then w has a decomposition w = u1y1v1u2y2v2...upypvp, with 0 ≤ |y1| + |y2| + ... + |yp| ≤ n, such that, for all i ≥ 0, the following words are in L:

wi = u1^i y1 v1^i u2^i y2 v2^i ... up^i yp vp^i.

(v) (interchanging contexts) If w, w' ∈ L with |w| ≥ n and |w'| ≥ n, then w and w' have decompositions w = x1u1y1v1x2u2y2v2...xpupypvpxp+1 and w' = x'1u'1y'1v'1x'2u'2y'2v'2...x'pu'py'pv'px'p+1, with 0 < |u1| + |v1| + ... + |up| + |vp| ≤ k and 0 < |u'1| + |v'1| + ... + |u'p| + |v'p| ≤ k, such that also the following two words z and z' are in L:

z = x1u'1y1v'1x2u'2y2v'2...xpu'pypv'pxp+1,

z' = x'1u1y'1v1x'2u2y'2v2...x'pupy'pvpx'p+1.

Proof. All these properties hold for external contextual grammars. For instance, the first statement follows immediately for an external contextual grammar G = (Σ, B, C) if one takes:

n = max{|x| | x ∈ B} and k = max{|uv| | (u, v) ∈ C}. □
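For the one-dimensional case, statement (i) can be seen operationally: in any derivation, a chosen context may be repeated an arbitrary number of times and the result is still derivable. A sketch (ours):

```python
def derive(base_word, applied):
    """Apply a sequence of contexts innermost-first: x becomes u x v per step."""
    w = base_word
    for u, v in applied:
        w = u + w + v
    return w

def pump(base_word, applied, j, i):
    """Replace the j-th applied context by i copies of itself; by construction
    the result is again in L(G), as in Lemma 3.1 (i) with p = 1."""
    pumped = applied[:j] + [applied[j]] * i + applied[j + 1:]
    return derive(base_word, pumped)
```

For the grammar with base c and context (a, b), pumping turns acb into a^i c b^i for every i ≥ 0.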

Lemma 3.1 can be used to show that certain languages are not in a family ECp, p ≥ 1.

THEOREM 3.4. If p, q ≥ 1 such that p < q, then ECp ⊆ ECq and the inclusion is strict.

Proof. Clearly, ECp ⊆ ECq (see Remark 3.1). It remains to be shown that the inclusion is strict. Consider the language:

L = {a1^n b1^n a2^n b2^n ... aq^n bq^n | n ≥ 0}.

Let G = (Σ, B, C) be a Marcus q-dimensional external contextual grammar where:

Σ = {a1, a2, ..., aq, b1, b2, ..., bq},

B = {(λ, λ, ..., λ)} (the q-word with q empty components),

and

C = {[(a1, b1), (a2, b2), ..., (aq, bq)]}.

It is easy to see that L(G) = L and hence L ∈ ECq. On the other hand, L is not in ECp, since L does not satisfy the condition from Lemma 3.1 (i). Therefore, ECp is strictly included in ECq. □

By combining Theorem 3.3 and Theorem 3.4, we obtain:


THEOREM 3.5. The families ECp, p ≥ 2, define an infinite hierarchy of mildly context-sensitive languages. □

Now, we investigate the interrelationships between the families ECp, p ≥ 1, and the families of languages in the Chomsky hierarchy.

THEOREM 3.6. (1) ECp ⊂ CS, for every p ≥ 1.

(2) Each family ECp, p ≥ 2, is incomparable with the family CF. The family EC1 is strictly contained in CF.

(3) Each family ECp, p ≥ 1, is incomparable with the family REG.

Proof. (1) Since no deletion is observed in the derivation process of a string in a many-dimensional external contextual grammar, the first statement follows.

(2) From Theorem 3.3 it follows that every family ECp, p ≥ 2, contains noncontext-free languages. Consider now the context-free language L = {a^n b^n | n ≥ 0}*. Assume that L can be generated by a Marcus p-dimensional external contextual grammar G = (Σ, B, C). Consider the following word from L:

w = a^{i1} b^{i1} a^{i2} b^{i2} ... a^{iq} b^{iq},

where p < q. One can easily see that w does not satisfy Lemma 3.1 (iv), and hence L is not in ECp, for any p ≥ 2. The second part of this statement follows from Remark 2.1.

(3) Note that each family ECp, p ≥ 1, contains nonregular languages. Now consider the regular language L = a* ∪ b*. One can verify that L does not satisfy Lemma 3.1 (v), i.e., the property of interchanging the contexts, and thus L is not in ECp, for any p ≥ 1. □

Let EC* be the family:

EC* = ⋃_{p≥1} ECp.

REMARK 3.2. One can easily see that EC* is also a mildly context-sensitive family of languages. Additionally, EC* is strictly contained in the family CS and is incomparable with the families CF and REG.

We finish this section with a brief discussion of some closure properties of the families ECp, p ≥ 1, and EC*. It is easy to observe that all these families are closed under morphisms and λ-free morphisms, and that none of them is closed under union, Kleene *, Kleene +, intersection with regular languages, or inverse morphisms. Each family ECp fails to be closed under concatenation, while EC* does have this property. Indeed, given two languages L1 ∈ ECp and L2 ∈ ECq, one can easily construct a (p + q)-dimensional external contextual grammar generating L1L2.

The emptiness problem and the finiteness problem are decidable, whereas the emptiness of the intersection is undecidable. The decidability of the inclusion and the decidability of the equivalence are still open problems, even for the case p = 1 (see Păun 1997).
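The concatenation closure of EC* rests on a direct construction: pad each p-context of the first grammar with q empty contexts, and each q-context of the second with p empty contexts. A sketch (ours, with the tuple encoding used in our earlier examples):

```python
from itertools import product

def concat_grammars(base1, contexts1, base2, contexts2):
    """Given a p-dimensional grammar for L1 and a q-dimensional grammar for L2,
    build a (p + q)-dimensional external contextual grammar generating L1 L2."""
    p = len(next(iter(base1)))
    q = len(next(iter(base2)))
    empty = ("", "")  # the empty context (lambda, lambda), used only as padding
    base = {b1 + b2 for b1, b2 in product(base1, base2)}
    contexts = ([c + (empty,) * q for c in contexts1]
                + [(empty,) * p + c for c in contexts2])
    return base, contexts
```

Components 1, ..., p evolve exactly as in the first grammar and components p + 1, ..., p + q as in the second, so every generated word is a word of L1 followed by a word of L2.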

Extending the above considerations, it should be emphasized that, although closure under operations is a very natural and elegant mathematical property, nonclosure properties seem to be natural too. One may argue that it is possible to construct a grammatical formalism for a language in the following way. One looks for a decomposition of the language into some independent fragments, constructs auxiliary grammars for these fragments, and then obtains the desired grammar as the union of the auxiliary grammars. Following this idea, the grammatical model should satisfy closure under union, which is fulfilled by almost all mildly context-sensitive devices (but not ours). Along the same lines, we can imagine two other ways of constructing a grammatical formalism for a language L.

The first one consists of identifying a set of conditions c1, c2, ..., cn which define the correctness of L. Then, one tries to define a grammar generating the set L(ci) of all words observing the condition ci and ignoring the other conditions. It is expected that the intersection ⋂_{i=1}^{n} L(ci) gives exactly the language L. Now, the closure property which should be verified by the model is closure under intersection, which is not verified by many mildly context-sensitive mechanisms, among them the one investigated in this paper.

Another strategy might be the following one. The conditions from above are ranked in accordance with a scale of quality degrees, and the construction starts with a grammar satisfying the lowest conditions and goes up to the highest one through a convergence process. An infinite hierarchy of devices might be successfully used in such a process.

Let us consider a specific linguistic operation, which is closely related to that of concatenation, namely the operation of coordination.

Following the model of going from:

John eats an apple

John eats a banana

to

John eats an apple and a banana,


one may infer that closure under prefix coordination, defined by:

PCoord(x, y) = {ux'y' | x = ux', y = uy'},

should be necessary for a class of generative devices. However, by applying this operation to two perfectly grammatical sentences one may get ungrammatical constructions. One example is zeugma, in which a verb governs two objects under different senses, such as:

On his ski vacation, John got chapped lips and lost.

The two component sentences:

On his ski vacation, John got chapped lips.
On his ski vacation, John got lost.

are grammatical, but the combination generated by the prefix coordination is marginal at best. This is because the senses of "got" in the two component sentences are distinct and take different kinds of objects.

More extreme cases can be found by playing on gross syntactic ambiguity of the prefixes. Consider these two sentences, in which the phrase "the dog races" is syntactically completely different. In the former sentence, "races" is a verb, while in the latter it is a plural noun:

The dog races down the street.

The dog races are over.

By applying the prefix coordination, we get the completely ungrammatical sentence:

The dog races down the street and are over.
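The formal operation itself is easy to compute: PCoord(x, y) contains one word for each common prefix u of x and y. A sketch (ours, over plain strings; the coordinating "and" of the linguistic examples is not part of the formal definition):

```python
def pcoord(x, y):
    """PCoord(x, y) = {u x' y' | x = u x', y = u y'}: one result word per
    common prefix u of x and y (including the empty prefix)."""
    result = set()
    n = 0
    while n <= min(len(x), len(y)) and x[:n] == y[:n]:
        result.add(x[:n] + x[n:] + y[n:])  # u + x' + y'
        n += 1
    return result
```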

Similar examples can be provided for the operations of union, Kleene *, Kleene +, intersection with regular languages, and inverse morphisms.

In our opinion, nonclosure under these operations cannot be viewed as a

major drawback of a mathematical formalism for natural language syntax.

4. ANOTHER INFINITE HIERARCHY OF MILDLY CONTEXT-SENSITIVE FAMILIES OF LANGUAGES

Most frequently, it is required that all regular languages or all context-free languages are included in a mildly context-sensitive family of languages.


                                        CS
                                        ^
                                        |
CF + EC*     <---  REG + EC*     <---  EC*
    ^                  ^                ^
    |                  |                |
CF + ECp+1   <---  REG + ECp+1   <---  ECp+1
    ^                  ^                ^
    |                  |                |
CF + ECp     <---  REG + ECp     <---  ECp
    ^                  ^                ^
    |                  |                |
CF + EC2     <---  REG + EC2     <---  EC2

Figure 1. (Each arrow points from a family to a strictly larger one.)

An infinite hierarchy going beyond the context-free languages, but including them, has been reported in Weir (1992).

Take the following notations and definitions:

CF + ECp = CF ∪ ECp, p ≥ 2, and CF + EC* = CF ∪ EC*,

and

REG + ECp = REG ∪ ECp, p ≥ 2, and REG + EC* = REG ∪ EC*.

Note that all languages in CF (and REG) are semilinear and, for all these languages, the membership problem is solvable in deterministic polynomial time.

Summarizing our results, we obtain the following main result:

THEOREM 4.1. All families depicted in Figure 1, except CS, are mildly context-sensitive families of languages. Every arrow in Figure 1 denotes a strict inclusion.


5. AN ATTEMPT TO DEFINE A STRUCTURE ON MANY-DIMENSIONAL EXTERNAL CONTEXTUAL GRAMMARS

Despite the simplicity of the machinery discussed in the sections above, it might happen that possible linguistic applications lead to rather complex grammars with a large amount of redundancy, due to a complete absence of any structure in these grammars. As we have seen, the way of producing a string in a many-dimensional external contextual grammar does not directly provide us with any structural description of the string (tree, dependence relation, pattern interpretation, etc.). We shall define the so-called fully bracketed strings, with which we associate a tree structure in a very natural and simple fashion. Then, we shall define many-dimensional external contextual grammars generating fully bracketed strings. The working mode of these grammars naturally suggests the aforementioned structure.

Let V = {(, )} and DV be the Dyck language over V, that is, the context-free language generated by the grammar with the set of productions {S → SS, S → (S), S → λ}. Given two alphabets Σ, Δ, the projection of (Σ ∪ Δ)* onto Δ* is the morphism prΔ : (Σ ∪ Δ)* → Δ* defined by prΔ(a) = λ for all a ∈ Σ, and prΔ(b) = b for all b ∈ Δ.

A bracketed word x ∈ (Σ ∪ V)* is a word satisfying the following conditions:

(i) prV(x) ∈ DV,
(ii) x can be reduced to a word over Σ by iteratively applying reduction rules of the form (w) → w, w ∈ Σ+.

A fully bracketed word x ∈ (Σ ∪ V)* is a word that can be written as x = (y), where y is a bracketed word that cannot itself be written in the form (z) for any z ∈ (Σ ∪ V)*. For example, (ab)(b) is a bracketed but not a fully bracketed word, whereas ((ab)(b)) is a fully bracketed word.

We associate a tree T(x) with each nonempty fully bracketed string

x ∈ (Σ ∪ V)+, rooted at the node denoted by p(x), in the following way. If x = (y) such that

y = y_{0,1} y_{0,2} ... y_{0,q_0} x_1 y_{1,1} y_{1,2} ... y_{1,q_1} x_2 ... x_r y_{r,1} y_{r,2} ... y_{r,q_r}

for some r ≥ 0, with y_{i,j} ∈ Σ, 0 ≤ i ≤ r, 1 ≤ j ≤ q_i, and x_1, x_2, ..., x_r fully bracketed words, then the tree T(x) depicted in Figure 2 results.

Clearly, the frontier of a tree associated with a fully bracketed string x over Σ ∪ V is exactly prΣ(x). This can be written as yield(T(x)) = prΣ(x).


Figure 2. The tree T(x): its root p(x) has, from left to right, the children y_{0,1}, y_{0,2}, ..., y_{0,q_0}, p(x_1), y_{1,1}, y_{1,2}, ..., y_{1,q_1}, p(x_2), ..., p(x_r), y_{r,1}, y_{r,2}, ..., y_{r,q_r}, where each p(x_i) is the root of the subtree T(x_i).

For example, the tree associated with the fully bracketed word x = (a(bc)((a)b)) is represented in Figure 3.

Figure 3. The root p(x) has the children a, p((bc)) and p(((a)b)); p((bc)) has the children b and c; p(((a)b)) has the children p((a)) and b; p((a)) has the single child a.
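The association between fully bracketed words and trees can be made concrete with a short sketch. The following assumes, for illustration only, that terminal symbols are single characters; the names parse and frontier are ours:

```python
def parse(x):
    """Parse a fully bracketed word into a nested list: each list is an
    internal node p(.), each character a leaf labelled by a terminal."""
    pos = 0
    def node():
        nonlocal pos
        pos += 1                      # consume '('
        children = []
        while x[pos] != ')':
            if x[pos] == '(':
                children.append(node())     # subtree; node() advances pos
            else:
                children.append(x[pos])     # terminal leaf
                pos += 1
        pos += 1                      # consume ')'
        return children
    tree = node()
    assert pos == len(x), "not a single fully bracketed word"
    return tree

def frontier(tree):
    """yield(T(x)): concatenation of the leaves, left to right."""
    return ''.join(frontier(c) if isinstance(c, list) else c for c in tree)

t = parse('(a(bc)((a)b))')            # the word of Figure 3
print(t)                              # ['a', ['b', 'c'], [['a'], 'b']]
print(frontier(t))                    # abcab
```

Observe that frontier(parse(x)) equals the projection of x onto the terminal alphabet, in agreement with yield(T(x)) = prΣ(x).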

The set of all fully bracketed words over Σ ∪ V is denoted by FBS(Σ).

A p-dimensional bracketed external contextual grammar (p-BEC grammar, for short), p ≥ 2, is a construct Γ = (Σ ∪ V, B, C), where B is a finite set of vectors of size p having components in FBS(Σ), and C is a finite set of p-vectors of contexts [(u_1, v_1), (u_2, v_2), ..., (u_p, v_p)] such that, for each context (u_i, v_i), the string u_i v_i is in FBS(Σ). The derivation in a p-BEC grammar is defined in the same way as in a p-dimensional external contextual grammar. We define the following two languages:

the fully bracketed language of Γ: Lb(Γ) = {(y) | y ∈ L(Γ)},
the string language of Γ: Ls(Γ) = prΣ(L(Γ)) = {prΣ(y) | y ∈ L(Γ)}.
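A derivation step wraps the i-th context around the i-th component, simultaneously for all components. The following sketch illustrates this for a hypothetical 2-BEC grammar; the particular base vector, context vector, and the convention that the string language concatenates the components before projecting are our assumptions for illustration:

```python
def apply_context(vec, ctx):
    """One derivation step: componentwise wrapping x_i -> u_i x_i v_i."""
    return tuple(u + x + v for x, (u, v) in zip(vec, ctx))

def pr_sigma(w):
    """Projection onto the terminal alphabet: erase the brackets."""
    return ''.join(c for c in w if c not in '()')

base = ('(a)', '(b)')                 # a base vector from B (hypothetical)
ctx = (('(a', ')'), ('(b', ')'))      # a context vector from C; note that
                                      # '(a' + ')' and '(b' + ')' are in FBS

vec = base
for _ in range(2):                    # two derivation steps
    vec = apply_context(vec, ctx)

word = '(' + ''.join(vec) + ')'       # a word of the fully bracketed language
print(word)                           # ((a(a(a)))(b(b(b))))
print(pr_sigma(word))                 # aaabbb
```

Iterating the same context n times yields the terminal string a^{n+1} b^{n+1}, so the string language of this toy grammar is {a^n b^n | n ≥ 1}, a standard non-context-free-grammar-free example of what external contexts can pump in parallel.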


REMARK 5.1. The fully bracketed language of Γ contains fully bracketed words only. Indeed, this claim follows from the definition of Γ and the fact that uxv ∈ FBS(Σ) provided that both x and uv are in FBS(Σ).

The set Tree(Γ) = {T(x) | x ∈ Lb(Γ)} is called the tree language associated with the p-dimensional bracketed external contextual grammar Γ.

It is known (Thatcher 1967) that the yield of any regular tree language is a context-free language, and the set of all derivation trees of a context-free language is a regular tree language. By combining this result with Theorem 3.6, we get:

THEOREM 5.1. The families of regular tree languages and tree languages associated with p-BEC grammars are incomparable.

Let Γ = (Σ ∪ V, B, C) be a p-BEC grammar, for some p ≥ 2. We construct the following sets:

- S(Γ) is a singleton set containing only the initial tree, which has p substitution nodes, each identified by the sign ↓;

- C(Γ) is the set of all tree vectors of size p, each tree having exactly one substitution node, of the form [u_1 ↓ v_1, u_2 ↓ v_2, ..., u_p ↓ v_p], where [(u_1, v_1), (u_2, v_2), ..., (u_p, v_p)] is a context vector in C;

- B(Γ) is the set of all tree vectors of size p associated with the fully bracketed string vectors in B. These trees do not contain any substitution node.

We say that a tree vector is derived by a parallel tree substitution if it is an element of B(Γ), or if it is obtained from a tree vector in C(Γ) in which the substitution node of each component is replaced by the corresponding component of a derived tree vector. A derivation tree in Γ is a tree built


from the unique tree in S(Γ), where each substitution node is replaced by the corresponding component of a derived tree vector.

It is easy to note that, for a p-BEC grammar as above, the set of derivation trees in Γ coincides with the tree language Tree(Γ).

The structure introduced above offers the possibility to define the ambiguity of a BEC grammar. We say that a p-BEC grammar Γ = (Σ ∪ V, B, C) is ambiguous if there exist two different strings x, y ∈ Lb(Γ) such that prΣ(x) = prΣ(y). Clearly, x and y are different if, and only if, the trees associated with them are different. A string language L is p-inherently ambiguous in the class of p-BEC grammars if every p-BEC grammar Γ with Ls(Γ) = L is ambiguous. If L is p-inherently ambiguous for every p ≥ 2, then L is said to be inherently ambiguous.

The following result can be easily proved by reducing the problem to the Post Correspondence Problem.

THEOREM 5.2. Given a p-BEC grammar Γ, it is undecidable whether Γ is ambiguous or not.

THEOREM 5.3. For any p ≥ 2, there are p-inherently ambiguous languages in the class of p-BEC grammars. There are also inherently ambiguous languages.

Proof. We take the language L = {a^n b a^m b a^p | n, m, p ≥ 1}. Clearly, L is the string language of some 2-BEC grammar. Let Γ = ({a, b, (, )}, B, C) be such a 2-BEC grammar. Since a+baba ⊆ L and aba+ba ⊆ L, it follows that C must contain the vectors of contexts c_1 = [(x, λ), (λ, λ)] and c_2 = [(λ, y), (z, λ)], where x, y and z are fully bracketed words over V ∪ {a}. As ababa is in L, we infer that there exists (w_1, w_2) ∈ B such that pr_{a,b}(w_1 w_2) = ababa. By applying c_1 to (w_1, w_2) and then c_2 to the resulting vector, we get (α_1, α_2). By applying c_2 to (w_1, w_2) and then c_1 to the resulting vector, we get (β_1, β_2). Obviously, α_1 α_2 ≠ β_1 β_2 and pr_{a,b}(α_1 α_2) = pr_{a,b}(β_1 β_2), which proves the first statement of the theorem.

The proof is complete as soon as we notice that any p-BEC grammar, p ≥ 2, which generates L has to be ambiguous. □
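The commutation argument in the proof can be illustrated on a small hypothetical 2-BEC grammar: applying two context vectors in either order produces distinct fully bracketed words (hence distinct trees) with the same terminal projection, which is exactly the definition of ambiguity. The base vector and contexts below are our assumptions, chosen only to make the two orders visibly diverge:

```python
def apply_context(vec, ctx):
    """Componentwise wrapping, as in a p-BEC derivation step."""
    return tuple(u + x + v for x, (u, v) in zip(vec, ctx))

def pr(w):
    """Projection onto {a, b}: erase the brackets."""
    return ''.join(c for c in w if c in 'ab')

base = ('(a)', '(b)')                 # hypothetical base vector
c1 = (('(a', ')'), ('(b', ')'))       # pumps symbols on the left of each component
c2 = (('(', 'a)'), ('(', 'b)'))       # pumps symbols on the right of each component

v12 = apply_context(apply_context(base, c1), c2)   # c1 then c2
v21 = apply_context(apply_context(base, c2), c1)   # c2 then c1

w12 = '(' + ''.join(v12) + ')'
w21 = '(' + ''.join(v21) + ')'
print(w12 != w21)            # True: different bracketed words, different trees
print(pr(w12) == pr(w21))    # True: the same terminal string
```

Both derivations yield the terminal string aaabbb, but with different bracketings, so any grammar containing both context vectors is ambiguous in the sense defined above.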

6. CONCLUSIONS

The grammars introduced and studied in this paper are natural extensions of Marcus external contextual grammars. These grammars lead to a new infinite hierarchy of mildly context-sensitive languages.


Note that Weir (1988) introduced an infinite hierarchy of mildly context-sensitive families of languages, too. However, that hierarchy is based on control grammars, a formalism considerably more complicated than the one presented here.

One can find in the literature many other variants of Marcus contextual grammars (see Paun 1997). Those grammars could also be investigated in the many-dimensional case.

Of special interest in the future will be to investigate contextual grammars with contexts shuffled on trajectories in the many-dimensional case. Such a variant provides the most general framework for contextual grammars (see Martin-Vide and Mateescu 2000a for more details on this topic).

It may be argued that we are indirectly suggesting in this paper that the Chomsky hierarchy is not the appropriate place for locating natural languages. Yes, we are definitely doing so. We strongly believe that natural languages could occupy an eccentric position with respect to the Chomsky hierarchy. Though this is simply a belief, several reasons support it: among others, it is clear that not all regular languages over the alphabet of, say, English are subsets of English. Certainly, some authors have pointed this out in the past in a more or less informal way (see, for example, Manaster Ramer 1999). Thus, we need a new hierarchy, one holding strong relationships with Chomsky's but not coinciding with it: in a certain sense, it should be incomparable with the Chomsky hierarchy and cut across it. This paper has to be regarded as a step in that long-term research direction. Since the families of languages generated by many-dimensional external contextual grammars have the property of transversality, they appear to be appropriate candidates to model natural language syntax. Unfortunately, when we compare our families with Chomsky's and see that they are incomparable, we cannot say much more for the time being that is empirically motivated about what lies in each of the respective difference sets.

Something similar should be said with regard to closure properties. Closure under different operations is surely very fruitful in mathematics or computer science but, as suggested above, it should be carefully reconsidered as far as mathematical linguistics issues are addressed. Thus, the fact that our framework is not closed under most of the usual language operations does not seem to us to be a problem. On the contrary, it could give support to our approach to mild context-sensitivity.

Finally, whether or not our machinery runs efficiently in a processing environment, though a point of great importance in computational linguistics, was not a matter of our concern here.


ACKNOWLEDGEMENTS

The authors express their thanks to S. M. Shieber and H. Bordihn for their useful remarks on an earlier version. Thanks are due also to the anonymous referees for their valuable criticisms and suggestions, which made the improvement of this paper possible.

REFERENCES

Aho, A. V.: 1968, 'Indexed Grammars - An Extension of Context-Free Grammars', Journal of the ACM 15, 647-671.

Bresnan, J., R. M. Kaplan, S. Peters and A. Zaenen: 1987, 'Cross-Serial Dependencies in Dutch', in W. J. Savitch, E. Bach, W. Marsh and G. Safran-Naveh (eds.), The Formal Complexity of Natural Language, D. Reidel, Dordrecht, pp. 286-319.

Culy, C.: 1987, 'The Complexity of the Vocabulary of Bambara', in W. J. Savitch, E. Bach, W. Marsh and G. Safran-Naveh (eds.), The Formal Complexity of Natural Language, D. Reidel, Dordrecht, pp. 349-357.

Ehrenfeucht, A., G. Rozenberg and Gh. Paun: 1997, 'Contextual Grammars and Formal Languages', in G. Rozenberg and A. Salomaa (eds.), Handbook of Formal Languages, Vol. 2, Springer Verlag, Berlin, pp. 237-294.

Gazdar, G.: 1985, 'Applicability of Indexed Grammars to Natural Languages', Technical report CSLI-85-34, Center for the Study of Language and Information, Stanford University, CA.

Gazdar, G. and G. K. Pullum: 1985, 'Computationally Relevant Properties of Natural Languages and Their Grammars', New Generation Computing 3, 273-306.

Ibarra, O.: 1970, 'Simple Matrix Languages', Information and Control 17, 359-394.

Joshi, A. K.: 1985, 'How Much Context-Sensitivity is Required to Provide Reasonable Structural Descriptions: Tree Adjoining Grammars', in D. Dowty, L. Karttunen and A. Zwicky (eds.), Natural Language Parsing: Psychological, Computational and Theoretical Perspectives, Cambridge University Press, Cambridge, pp. 206-250.

Joshi, A. K., L. S. Levy and M. Takahashi: 1975, 'Tree Adjunct Grammars', Journal of Computer and System Sciences 10(1), 136-163.

Joshi, A. K. and Y. Schabes: 1997, 'Tree-Adjoining Grammars', in G. Rozenberg and A. Salomaa (eds.), Handbook of Formal Languages, Vol. 3, Springer Verlag, Berlin, pp. 69-123.

Joshi, A. K., K. Vijay-Shanker and D. Weir: 1991, 'The Convergence of Mildly Context-Sensitive Grammatical Formalisms', in P. Sells, S. Shieber and T. Wasow (eds.), Foundational Issues in Natural Language Processing, MIT Press, Cambridge, MA, pp. 31-81.

Manaster Ramer, A.: 1999, 'Some Uses and Abuses of Mathematics in Linguistics', in C. Martin-Vide (ed.), Issues in Mathematical Linguistics, John Benjamins, Amsterdam, pp. 73-130.

Marcus, S.: 1969, 'Contextual Grammars', Revue Roumaine des Mathématiques Pures et Appliquées 14(10), 1525-1534.

Marcus, S.: 1997, 'Contextual Grammars and Natural Languages', in G. Rozenberg and A. Salomaa (eds.), Handbook of Formal Languages, Vol. 2, Springer Verlag, Berlin, pp. 215-236.


Martin-Vide, C. and A. Mateescu: 1999, 'Sewing Grammars', in Gh. Paun and G. Ciobanu (eds.), Fundamentals of Computation Theory, 12th International Symposium FCT'99, Lecture Notes in Computer Science 1684, Springer Verlag, Berlin, pp. 398-408.

Martin-Vide, C. and A. Mateescu: 2000a, 'Contextual Grammars with Trajectories', in G. Rozenberg and W. Thomas (eds.), Developments in Language Theory: Foundations, Applications and Perspectives, World Scientific, Singapore, pp. 362-374.

Martin-Vide, C. and A. Mateescu: 2000b, 'Special Families of Sewing Languages', Journal

of Automata, Languages and Combinatorics 5(3), 279-286.

Martin-Vide, C. and A. Mateescu: 2001, 'Sewing Contexts and Mildly Context-Sensitive

Languages', in C. Martin-Vide and V. Mitrana (eds.), Where Mathematics, Computer Science, Linguistics and Biology Meet, Kluwer, Dordrecht, pp. 75-84.

Partee, B. H., A. ter Meulen and R. E. Wall: 1990, Mathematical Methods in Linguistics, Kluwer, Dordrecht.

Pollard, C. J.: 1984, 'Generalized Phrase Structure Grammars, Head Grammars, and Natural Language', PhD dissertation, Stanford University, CA.

Pullum, G. K.: 1985, 'On Two Recent Attempts to Show That English is Not a CFL', Computational Linguistics 10, 182-186.

Pullum, G. K. and G. Gazdar: 1982, 'Natural Languages and Context-Free Languages', Linguistics and Philosophy 4, 471-504.

Paun, Gh.: 1997, Marcus Contextual Grammars, Kluwer, Dordrecht.

Roach, K.: 1987, 'Formal Properties of Head Grammars', in A. Manaster Ramer (ed.), Mathematics of Language, John Benjamins, Amsterdam, pp. 293-348.

Rounds, W. C., A. Manaster Ramer and J. Friedman: 1987, 'Finding Natural Languages a Home in Formal Language Theory', in A. Manaster Ramer (ed.), Mathematics of Language, John Benjamins, Amsterdam, pp. 349-360.

Rozenberg, G. and A. Salomaa (eds.): 1997, Handbook of Formal Languages, 3 vols, Springer Verlag, Berlin.

Salomaa, A.: 1973, Formal Languages, Academic Press, New York.

Savitch, W. J., E. Bach, W. Marsh and G. Safran-Naveh (eds.): 1987, The Formal Complexity of Natural Language, D. Reidel, Dordrecht.

Shieber, S. M.: 1987, 'Evidence Against the Context-Freeness of Natural Languages', in W. J. Savitch, E. Bach, W. Marsh and G. Safran-Naveh (eds.), The Formal Complexity of Natural Language, D. Reidel, Dordrecht, pp. 320-334.

Steedman, M.: 1985, 'Dependency and Coordination in the Grammar of Dutch and English', Language 61, 523-568.

Thatcher, J. W.: 1967, 'Characterizing Derivation Trees of Context-Free Grammars Through a Generalization of Finite Automata Theory', Journal of Computer and System Sciences 1, 317-322.

Vijay-Shanker, K. and D. Weir: 1994, 'The Equivalence of Four Extensions of Context-Free Grammars', Mathematical Systems Theory 27, 511-546.

Weir, D.: 1988, 'Characterizing Mildly Context-Sensitive Grammar Formalisms', PhD dissertation, University of Pennsylvania.

Weir, D.: 1992, 'A Geometric Hierarchy Beyond Context-Free Languages', Theoretical Computer Science 104, 235-261.

Manfred Kudlek
Fachbereich Informatik
Universität Hamburg, Germany


kudlek@informatik.uni-hamburg.de

Carlos Martin-Vide
Research Group on Mathematical Linguistics (GRLMC)
Universitat Rovira i Virgili, Tarragona, Spain
cmv@astor.urv.es

Alexandru Mateescu
Faculty of Mathematics
University of Bucharest, Romania
alexmate@pcnet.ro

Victor Mitrana
Faculty of Mathematics
University of Bucharest, Romania
mitrana@funinf.cs.unibuc.ro
