Building thinking machines

Our mission is to advance both fundamental research and practical applications in the fields of artificial intelligence and machine learning. Our fundamental research focuses on so-called Artificial General Intelligence (AGI), which refers to humanity's long-term dream of constructing thinking machines that can solve a wide range of tasks without being specifically programmed for any of them.

About us

OCCAM is a private non-profit research laboratory founded by
Dr. Arthur Franz and Michael Löffler.

Contact Us

Research

Agenda

It is no coincidence that finding short explanations for observations is central to research at OCCAM. After all, that is what William of Occam's razor demands, and it is what Ray Solomonoff formalized several decades ago in his theory of universal induction.

In a nutshell, research at OCCAM consists of trying to make universal induction tractable by building efficient data compression algorithms. Formally, algorithmic information theory is used to provide a sound and general mathematical foundation. The representations gained by the compression algorithms are then to be used to guide grounded reasoning, concept acquisition and communication through language, in order to build an agent with common-sense reasoning abilities. In our view, solving these problems would be a big step toward AGI.
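
To make the compression view concrete, here is a minimal sketch in Python. It is a toy illustration of the general idea, not the actual algorithms from our papers: induction is cast as picking, among candidate descriptions that regenerate the data exactly, the shortest one. The candidate set and the use of source length as a proxy for description length are deliberate simplifications, since true Kolmogorov complexity is uncomputable.

    # A toy illustration of induction as compression (not our actual code):
    # among candidate descriptions of the data, prefer the shortest one that
    # regenerates the data exactly. Source length stands in for Kolmogorov
    # complexity, which is uncomputable in general.

    def best_explanation(data, candidates):
        """candidates: (source_text, generator) pairs; return the shortest match."""
        matches = [src for src, gen in candidates if gen() == data]
        return min(matches, key=len) if matches else None

    data = "abababababababab"  # 8 repetitions of "ab"
    candidates = [
        ("'ab' * 8", lambda: "ab" * 8),            # short, regular explanation
        (repr(data), lambda: "abababababababab"),  # verbatim copy: much longer
    ]
    print(best_explanation(data, candidates))      # -> 'ab' * 8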

Research projects

  • Incremental Compression as a theory of Deep Learning

    We have developed a theory of incremental compression, which can efficiently find short representations for arbitrary strings that can be generated by a composition of functions/features. In a sense, this is exactly the sort of representation employed by many deep learning approaches. Our theory provides an explanation of why deep learning works and sheds light on how deep learning can be improved even further. A first toy sketch of the idea follows after this list.

  • Theory of Hierarchical Compression

    Incremental compression is an efficient way to find short representations for data generated by a composition of functions/features. Even though it is much more efficient than universal search, it is not enough for practical applications, since we do not know how to find those functions. However, real-world data usually exhibit local correlations, and this circumstance can be exploited by using a branching compression hierarchy; the second sketch after this list illustrates the idea. See this publication for more on that issue.

  • Efficient bias-optimal search

    Search in a Turing-complete space of algorithms is very hard. It would be very helpful if there were an approach using mere parameter optimization rather than a search through all programs. In a sense, that is what artificial neural networks (ANNs) do. After all, it is well known that ANNs are universal approximators. The trouble is that universal approximators can approximate arbitrary functions, but they pay the price of a complex representation, whose complexity exceeds that of the simplest equivalent function in a Turing-complete space. How can we make neural networks respect symmetries and regularities in data? See this blog article for a lengthier discussion.

    Further, efficient induction should be bias-optimal, i.e. simple representations should be tried first. How can such a simplicity bias be introduced into universal approximators? The third sketch after this list shows a length-ordered search of this kind.

  • Specialization of algorithms

    A data scientist's job is usually to select and tune an algorithm to perform a specific task, or even to come up with a new algorithm. How can this process itself be performed by a general algorithm? Is there general algorithmic specialization? Can we build a general algorithm that takes an arbitrary task and derives a specialized algorithm suited to solving the narrow class of tasks to which that task belongs?

  • Complexity in physics

    Why does natural data seem to be hierarchically compressible? Phenomena like scale invariance and self-organized criticality seem to be closely related to hierarchical compression and power laws. Is there a deeper physical reason for this?

    Physics does not have a formalization of simplicity/complexity. Only the number of possible states in a system is formalized, by the Boltzmann entropy. Algorithmic entropy should be introduced into statistical mechanics.
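
The following toy sketch illustrates the incremental idea from the first project above; it is a deliberate simplification, not the formal construction from the paper. One regularity-capturing "feature" is peeled off the data at a time, leaving a shorter residual, so that the data ends up represented as a composition of the peeled features applied to a short seed. The single repeat feature stands in for a richer feature set.

    # A toy sketch of incremental compression (a simplified illustration,
    # not the formal construction from the paper). One "feature" is peeled
    # off the data at a time, leaving a shorter residual; the data is then
    # represented as the composition of the peeled features applied to it.

    def strip_repeat(s):
        """If s consists of n > 1 repetitions of a shorter unit, return (unit, n)."""
        for k in range(1, len(s)):
            if len(s) % k == 0 and s[:k] * (len(s) // k) == s:
                return s[:k], len(s) // k
        return None

    def compress(s):
        program = []                       # peeled features, outermost first
        while (hit := strip_repeat(s)) is not None:
            s, n = hit
            program.append(("repeat", n))
        return program, s                  # residual = the innermost "seed"

    def decompress(program, seed):
        for op, n in reversed(program):
            if op == "repeat":
                seed *= n
        return seed

    program, seed = compress("ab" * 8)
    print(program, repr(seed))             # [('repeat', 8)] 'ab'
    assert decompress(program, seed) == "ab" * 8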
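
The second sketch loosely illustrates the branching hierarchy from the hierarchical compression project, again in simplified form: local windows are described independently, and the sequence of window descriptions is then compressed one level up. The period finder stands in for a general feature search.

    # A toy sketch of branching hierarchical compression (our simplification,
    # not the scheme from the paper): exploit local correlations by describing
    # fixed-size windows independently, then compress the resulting sequence
    # of descriptions one level higher.

    def smallest_period(xs):
        """Return (unit, n) with xs == unit * n and the unit as short as possible."""
        for k in range(1, len(xs) + 1):
            if len(xs) % k == 0 and xs[:k] * (len(xs) // k) == xs:
                return xs[:k], len(xs) // k

    def hierarchical_compress(data, window=8):
        # Level 1: each local window gets its own (branching) description.
        level1 = [smallest_period(data[i:i + window])
                  for i in range(0, len(data), window)]
        # Level 2: the sequence of window descriptions is often regular
        # itself, so it is compressed as well.
        return smallest_period(level1)

    print(hierarchical_compress("aabbaabb" * 3))
    # -> ([('aabb', 2)], 3): three windows, each two repetitions of "aabb"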
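
The third sketch shows the simplicity bias in its bluntest form, loosely in the spirit of Levin search; it uses a made-up three-instruction language and omits Levin's management of running times. Candidate programs are enumerated in order of increasing length, and the first one that reproduces the target wins.

    # A toy sketch of simplicity-biased program search, loosely in the spirit
    # of Levin search (an illustration; a real implementation would search a
    # Turing-complete language and manage running times). Programs are strings
    # over three made-up ops acting on an initially empty string.
    from itertools import product

    OPS = {
        "A": lambda s: s + "a",   # append "a"
        "B": lambda s: s + "b",   # append "b"
        "D": lambda s: s + s,     # double the string built so far
    }

    def run(program):
        s = ""
        for op in program:
            s = OPS[op](s)
        return s

    def shortest_program(target, max_len=10):
        for length in range(1, max_len + 1):       # shortest first: the bias
            for program in product(OPS, repeat=length):
                if run(program) == target:
                    return "".join(program)
        return None

    print(shortest_program("abababab"))  # -> "ABDD": write "ab", then double twice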

Publications

The publications listed below give an impression of the science done at OCCAM. To get started, we recommend reading the position paper.

On hierarchical compression and power laws in nature

Since compressing data incrementally by a non-branching hierarchy has resulted in substantial efficiency gains for performing induction in previous work, we now explore branching hierarchical...

Franz A. On Hierarchical Compression and Power Laws in Nature // International Conference on Artificial General Intelligence. – Springer, Cham, 2017. – pp. 77–86.

July 13, 2017

View publication
Some Theorems on Incremental Compression

The ability to induce short descriptions of, i.e. compressing, a wide class of data is essential for any system exhibiting general intelligence. In all generality,...

Franz A. Some Theorems on Incremental Compression // International Conference on Artificial General Intelligence. – Springer International Publishing, 2016. – pp. 74–83.

June 25, 2016

View publication
Toward Tractable Universal Induction Through Recursive Program Learning

Since universal induction is a central topic in artificial general intelligence (AGI), it is argued that compressing all sequences up to a complexity threshold should...

Franz A. Toward tractable universal induction through recursive program learning // International Conference on Artificial General Intelligence. – Springer, Cham, 2015. – pp. 251–260.

July 15, 2015

View publication
Artificial general intelligence through recursive data compression and grounded reasoning: a position paper

This paper presents a tentative outline for the construction of an artificial, generally intelligent system (AGI). It is argued that building a general data compression...

Franz A. Artificial general intelligence through recursive data compression and grounded reasoning: a position paper // arXiv preprint arXiv:1506.04366. – 2015.

January 06, 2015

View publication
Will super-human artificial intelligence (AI) be subject to evolution?

There has been much speculation about the future of humanity in the face of super-humanly intelligent machines. Most of the dystopian scenarios seem to be...

September 06, 2013

View publication

Blog

Building thinking machines is hard. Very hard. This blog discusses some of the issues involved in our approach.

Our approach to artificial general intelligence

Our presentation at GoodAI (Prague). Topic: Our approach to artificial general intelligence.

Recent lectures on AGI by Dr. Arthur Franz (in Russian)

For our laboratory, the year began auspiciously with a succession of introductory lectures held in Odessa. Those of our readers who speak Russian can...

Emergence of attention mechanisms during compression

It just dawned on me. When we want to compress, we have to do it in one incremental fashion or another, arriving at description...

Hierarchical epsilon machine reconstruction

Having read this long paper by James P. Crutchfield (1994), “Calculi of emergence”, we have to admit that it is very inspiring. Let’s think about...

Universal approximators vs. algorithmic completeness

Finally, it has dawned on us. A problem that we had trouble conceptualizing is the following. On the one hand, for the purposes of universal...

The merits of indefinite regress

The whole field of machine learning, and artificial intelligence in general, is plagued by a particular problem: the well-known curse of dimensionality. In a...

Using features for the specialization of algorithms

A widespread sickness of present “narrow AI” approaches is the almost irresistible urge to set up rigid algorithms that find solutions in an as large...

AGI-16 Arthur Franz - Some theorems on incremental compression

Arthur Franz presents his talk “Some theorems on incremental compression” at the Ninth Conference on Artificial General Intelligence (AGI-16) in New York.

The physics of structure formation

The entropy in equilibrium thermodynamics is defined as S = k_B ln W, which always increases in closed systems. It is clearly a special case of the Shannon entropy H = -Σ_i p_i log p_i ...

Incremental compression

A problem with the incremental approach is obviously local minima in compression. Is it possible that the probability of ending up in a local minimum...

Scientific progress and incremental compression

Why is scientific progress incremental? Clearly, the construction of increasingly unified theories in physics and elsewhere is an example of incremental compression of experimental data, of...

Recursive unpacking of programs

One idea that we have been following is the idea of simple but deep sequences. Simple in terms of Kolmogorov complexity and deep in terms...

Learning spatial relations

The big demonstrator that we have in mind is the ability to talk about some line drawing scene, after having extracted various objects from it...

Extraction of orthogonal features

The lesson from those considerations is the need for features. Each of those relations does not fix the exact position of the object but rather...

Extending the function network compression algorithm

So, what are the next steps? We should expand our attempts with the function network. That, it seems, is the best path. One thing that...

Hiring

Interested in working on inspiring research projects?

If you are a scientist in one of the related areas, or a student of computer science, mathematics or physics in Odessa, you are welcome to contact us. We have several open positions in both fundamental research and applied data science. Below are our vacancies for AI engineers and researchers.

Current jobs:

AI Researcher
  • specializing in algorithmic information theory or related areas of discrete mathematics
AI Engineer/Programmer
  • proficient in Python, experienced with algorithms and large projects