Copyright notice. The documents referenced below are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright.
This details certain problems with the model of computation implicit in Adleman's initial work on DNA-based computing.
This surveys some of the early work that followed up on Adleman's paper on DNA-based computing, sketches some of the biological background for this work, and speculates about how one might base computers on RNA.
This paper builds on the Bellantoni-Cook and Leivant characterizations of Ptime to obtain a kind of type-theoretic characterization of the type-2 basic feasible functionals.
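Background sketch: Bellantoni & Cook characterize poly-time by splitting a function's arguments into normal ones (which may drive recursion) and safe ones (whose values may only be used shallowly). The Haskell fragment below is a rough, illustrative sketch of that discipline; the Normal/Safe wrappers and the example function are hypothetical, and Haskell's types do not actually enforce the restriction.

```haskell
-- Illustrative only: the Normal/Safe wrappers are hypothetical markers,
-- not an enforcement of Bellantoni & Cook's typing discipline.
newtype Normal a = Normal a   -- arguments that may drive recursion
newtype Safe   a = Safe   a   -- arguments that may only be used "shallowly"

-- A safe basic operation: prepend one bit, touching only safe data.
consBit :: Bool -> Safe [Bool] -> Safe [Bool]
consBit b (Safe bs) = Safe (b : bs)

-- Safe recursion on notation: recursion is on a normal argument, and the
-- recursive result occurs only in a safe argument position of the step.
appendBits :: Normal [Bool] -> Safe [Bool] -> Safe [Bool]
appendBits (Normal [])       ys = ys
appendBits (Normal (b : bs)) ys = consBit b (appendBits (Normal bs) ys)
```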
For a quick overview, here is a set of slides from a talk on this work.
FORMATS: [US/ps]
This concerns characterizations of the basic feasible functionals at type-levels three and above. In contrast to Part I, in which the focus was on types, the focus here is on semantics – a rather humble and gritty sort of semantics, but still semantics.
FORMATS: [US/ps] [A4/ps] [US/pdf] [A4/pdf]
For a quick overview, here are two sets of slides from talks on this work and an extended abstract.
- Separating Notions of Higher-Type Polynomial-Time: Extended Abstract. This is a draft of an extended abstract that sketches the part of the story from Part II that leads up to the D!=BFF result.
FORMATS: [US/ps] [A4/ps]
Also, here are slides from a talk presented at ICC'00.
FORMATS: [ps] [pdf]
- The Why and How of Higher-Type Complexity Theory. These are slides from another talk presented at ICC'00 that, in part, tell a different portion of the story.
FORMATS: [ps] [pdf]
The Kreisel-Lacombe-Shoenfield Theorem from recursion theory says (very roughly) that the functionals definable through computing over syntax (i.e., actual programs) are the same as the functionals definable through computing over semantics (i.e., the graphs of the functions computed by programs). This paper investigates whether an analogue of KLS holds for the type-2 feasible functionals. A fairly natural direct analogue of KLS is shown to involve complexity-theoretic conundrums. E.g., if this direct analogue holds, then P differs from NP.
A weaker analogue of KLS – in which "computing over syntax" is replaced by "computing over computations (of programs)" – is shown to hold.
The sequentially realizable functionals (denoted SR) form a class of ``sequentially computable'' higher-type functionals that includes the PCF-computable functionals along with some noncontinuous elements. SR has very strong and natural mathematical properties. In particular, there are senses in which SR is a maximal class of sequential functionals. John Longley exhibited a functional H in SR with the property that if one adds H to the language PCF, then the resulting language, PCF+H, computes exactly SR. Longley hesitated to recommend PCF+H as the basis of a practical programming language, as the only known ways of computing H seemed to involve high computational complexity.
In this simple note we show that if P is different from NP, then H (and similar SR-functionals) is inherently infeasible to compute.
Here are slides from a talk at ICC'00 based on this paper.
FORMATS: [ps] [pdf]
There are now a number of things called ``higher-type complexity classes.'' The most prominent of these is the class of basic feasible functionals, a fairly conservative higher-type analogue of the (type-1) polynomial-time computable functions. There is, however, currently no satisfactory general notion of what a higher-type complexity class should be. In this paper we propose one such notion for type-2 functionals and begin an investigation of its properties. The most striking difference between our type-2 complexity classes and their type-1 counterparts is that, because of topological constraints, the type-2 classes have a much more rigid structure. Example: It follows from McCreight and Meyer's Union Theorem that the (type-1) polynomial-time computable functions form a complexity class. The analogous result fails for the class of type-2 basic feasible functionals.
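For context, here is the time-bound instance of the Union Theorem being invoked, roughly stated (standard background, not a claim taken from this paper):

```latex
% Roughly stated (time-bound instance of the McCreight--Meyer Union Theorem):
\[
  t_0 \le t_1 \le t_2 \le \cdots \ \text{uniformly computable}
  \;\Longrightarrow\;
  \exists\, t\ \text{computable with}\ \bigcup_{i}\mathrm{DTIME}(t_i) = \mathrm{DTIME}(t).
\]
% Taking $t_k(n) = n^k + k$ shows that the (type-1) poly-time functions form a
% single complexity class; the paper shows the analogous statement fails for
% the type-2 basic feasible functionals.
```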
Here are slides from a talk at ICC'01 based on this paper.
FORMATS: [ps] [pdf]
This paper investigates what is essentially a call-by-value version of PCF under a complexity-theoretically motivated type system. The programming formalism, ATR, has its first-order programs characterize the poly-time computable functions, and its second-order programs characterize the type-2 basic feasible functionals of Mehlhorn and Cook & Urquhart. (The ATR-types are confined to levels 0, 1, and 2.) The type system comes in two parts, one that primarily restricts the sizes of values of expressions and a second that primarily restricts the time required to evaluate expressions. The size-restricting part is motivated by Bellantoni & Cook's and Leivant's implicit characterizations of poly-time. The time-restricting part is an affine version of Barber & Plotkin's DILL.
Two semantics are constructed for ATR. The first is a pruning of the naive denotational semantics for ATR. This pruning removes certain functions that cause otherwise feasible forms of recursion to go wrong. The second semantics is a model for ATR's time complexity relative to a certain abstract machine. This model provides a setting for complexity recurrences arising from ATR recursions, the solutions of which yield second-order polynomial time bounds. The time-complexity semantics is also shown to be sound relative to the costs of interpretation on the abstract machine.
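For orientation, the second-order polynomials referred to here follow the standard Kapron & Cook style, generated over a length variable and a length-function variable; the grammar below is background notation, not necessarily the paper's own.

```latex
% Second-order polynomials over a first-order variable $n$ and a
% second-order variable $L$ (standard Kapron--Cook-style grammar):
\[
  P \;::=\; c \;\mid\; n \;\mid\; P + P \;\mid\; P \cdot P \;\mid\; L(P)
\]
% Example: $L(L(n)) + n \cdot L(n) + 3$.  Such expressions serve as runtime
% bounds for machines computing type-2 basic feasible functionals.
```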
Origins of the title: The title is borrowed from the classic 1946 anthology Adventures in Time and Space edited by Healy and McComas. It is also based on Ian Hacking's categorization of experiments in Representing and Intervening in which an adventure is an experiment that is not about hypothesis testing, but more of the gee-what-happens-if-we-do-this variety.
OTHER VERSIONS:
[POPL'06 pdf]
[POPL'06 ps]
Slides from a talk at GEOCAL'06
The authors' ATR programming formalism is a version of call-by-value PCF under a complexity-theoretically motivated type system. ATR programs run in type-2 polynomial-time and all standard type-2 basic feasible functionals are ATR-definable (ATR types are confined to levels 0, 1, and 2). A limitation of the original version of ATR is that the only directly expressible recursions are tail-recursions. Here we extend ATR so that a broad range of affine recursions are directly expressible. In particular, the revised ATR can fairly naturally express the classic insertion- and selection-sort algorithms, thus overcoming a sticking point of most prior implicit-complexity-based formalisms. The paper's main work is in refining the original time-complexity semantics for ATR to show that these new recursion schemes do not lead out of the realm of feasibility.
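To give a feel for the kind of program at issue, here is insertion sort in ordinary Haskell (an analogue only; ATR's actual syntax and types differ). Note that each recursive call's result is used exactly once per branch, which is the affineness the revised time-complexity semantics must account for.

```haskell
-- A rough Haskell analogue of an affine structural recursion; not ATR syntax.
insert :: Ord a => a -> [a] -> [a]
insert x []       = [x]
insert x (y : ys)
  | x <= y        = x : y : ys          -- recursive call not needed here
  | otherwise     = y : insert x ys     -- recursive result used exactly once

insertionSort :: Ord a => [a] -> [a]
insertionSort []       = []
insertionSort (x : xs) = insert x (insertionSort xs)
```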
An earlier version of this appeared as: Time-complexity semantics for feasible affine recursions, by Norman Danner and James S. Royer in Computation and Logic in the Real World (Proceedings of Computability in Europe 2007, Siena), S.B. Cooper, B. Löwe, and A. Sorbi (eds.), LNCS vol. 4497, Springer-Verlag, 2007, pp. 205-217.
We investigate feasible computation over a fairly general notion of data and codata. Specifically, we present a direct Bellantoni-Cook-style normal/safe typed programming formalism, RS1, that expresses feasible structural recursions and corecursions over data and codata specified by polynomial functors. (Lists, streams, finite trees, infinite trees, etc. are all directly definable.) A novel aspect of RS1 is that it embraces structure-sharing as in standard functional-programming implementations. As our data representations use sharing, our implementations of structural recursions are memoized to avoid the possibly exponentially many repeated subcomputations a naive implementation might perform. We introduce notions of size for representations of data (accounting for sharing) and codata (using ideas from type-2 computational complexity) and establish that type-level 1 RS1-functions have polynomial-bounded runtimes and satisfy a polynomial-time completeness condition. Also, restricting RS1 terms to particular types produces characterizations of some standard complexity classes (e.g., ω-regular languages, linear-space functions) and some less-standard classes (e.g., log-space streams).
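As a rough illustration of the data/codata distinction at play (in Haskell, not RS1; the type and function names are hypothetical):

```haskell
-- Illustrative Haskell sketch only; RS1's syntax and typing differ.
-- Data: finite values, consumed by structural recursion.
data BitList = Nil | Cons Bool BitList

-- Codata: potentially infinite values, produced by structural corecursion.
data BitStream = BitStream Bool BitStream

-- Structural recursion over data: count the True bits.
countOnes :: BitList -> Int
countOnes Nil         = 0
countOnes (Cons b bs) = (if b then 1 else 0) + countOnes bs

-- Structural corecursion into codata: the stream b, not b, b, not b, ...
alternate :: Bool -> BitStream
alternate b = BitStream b (alternate (not b))
```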
A previous version appeared in: Learning Theory and Kernel Machines, Proceedings of the 16th Annual Conference on Learning Theory (COLT) and the 7th Kernel Workshop, Lecture Notes in Artificial Intelligence, Number 2777, Springer-Verlag, pp. 684–698, August 2003.
FORMATS: [US/pdf] [US/ps]
The paper investigates tradeoffs between the generality of an algorithmic learning device and the quality of the programs it learns successfully. For example:
- There are results to the effect that, thanks to small increases in generality of a learning device, the computational complexity of some successfully learned programs is provably unalterably suboptimal.
- There are also results in which the complexity of successfully learned programs is asymptotically optimal and the learning device is general, but, still thanks to the generality, some of those optimal, learned programs are provably unalterably information deficient. In some cases, this deficiency concerns the safe, algorithmic extractability/provability of the fact that these programs are even approximately optimal.
This is a technical investigation of several questions about complexity theory relative to random oracles. The motivation for this work comes out of studying the power of one-way functions – random oracles provide a computational world in which very strong one-way functions exist.
We show that the following are equivalent:
- Every two ptime 1-equivalent sets are p-isomorphic.
- Every two p-invertible equivalent sets are p-isomorphic.
- P = PSPACE.
Proving this involves showing that certain complexity-theoretic versions of the Cantor-Bernstein Theorem succeed and showing that, under the assumption that P is different from PSPACE, certain other versions fail.
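For reference, the notions above are the standard Berman-Hartmanis-style ones, paraphrased here (not taken from the paper itself):

```latex
% Standard notions, paraphrased:
\[
  A \le^{p}_{1} B \iff \exists f\ \text{one-one, poly-time computable, with}\
  x \in A \Leftrightarrow f(x) \in B\ \text{for all}\ x,
\]
\[
  A \cong^{p} B \iff \exists f\ \text{a poly-time computable, poly-time
  invertible bijection of}\ \Sigma^{*}\ \text{with}\ A = f^{-1}(B).
\]
% "Ptime 1-equivalent" means both $A \le^{p}_{1} B$ and $B \le^{p}_{1} A$ hold;
% "p-invertible equivalent" additionally requires the reductions to be
% poly-time invertible.
```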
This is a study of the NPkV classes, which are classes of multivalued functions computed by nondeterministic, poly-time transducers; they are roughly the function analogues of the UP(k) language classes.
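Paraphrasing the standard definition (background only, using Selman-style set-$f$ notation for the set of values of $f$ on an input):

```latex
\[
  f \in \mathrm{NP}k\mathrm{V}
  \iff
  f\ \text{is computed by a nondeterministic poly-time transducer and}\
  |\mathrm{set}\text{-}f(x)| \le k\ \text{for all}\ x.
\]
```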
The title says it all.
A study of closure properties related to probabilistic complexity classes.
A survey of work inspired by the Berman-Hartmanis Isomorphism Conjecture and the seemingly innocent question: What does an NP-complete set look like? Several key open questions mentioned in the paper have since been answered. For example, see:
- The Isomorphism Conjecture Holds Relative to an Oracle by Stephen Fenner, Lance Fortnow, and Stuart Kurtz, SIAM Journal on Computing 25 (1996), 193-206.
- The Isomorphism Conjecture Holds and One-way Functions Exist Relative to an Oracle by John Rogers, Journal of Computer and System Sciences 54 (1997), 412-423.
(Just to be clear, I am not an author of either of the above two papers.)