Building thinking machines

Our mission is to advance both fundamental research and practical application in the fields of artificial intelligence and machine learning. The fundamental research is focused on so-called Artificial General Intelligence (AGI), which refers to humanity's long-term dream of constructing thinking machines that can solve a wide range of tasks without being specifically programmed for any of them.

About us

OCCAM is a private research laboratory and consultancy in artificial intelligence and machine learning, founded by Dr. Arthur Franz and Michael Löffler.

Contact Us

Research

Agenda

It is not a coincidence that finding short explanations for observations is central to research at OCCAM. After all, this is what William of Ockham's razor demands, and it is what Ray Solomonoff formalized in his theory of universal induction several decades ago.
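
For reference, this is the standard textbook form of Solomonoff's universal prior (our addition here, not a formula specific to OCCAM's work): a sequence x receives the summed weight of all programs p that make a universal prefix machine U output something beginning with x, each weighted by its length,

    M(x) = \sum_{p : U(p) = x*} 2^{-|p|}

so short programs, i.e. short explanations, dominate the sum, and the prediction of a continuation y follows from the ratio M(xy)/M(x).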

In a nutshell, research at OCCAM consists of trying to make universal induction tractable by building efficient data compression algorithms. Formally, algorithmic information theory is used to create a sound and general mathematical foundation. The representations gained by the compression algorithms are then to be used to guide grounded reasoning, concept acquisition and communication through language, in order to build an agent with common-sense reasoning abilities. In our view, solving these problems would be a big step toward AGI.
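
As a minimal illustration of the link between compression and induction (our sketch; off-the-shelf zlib stands in for the far more general compressors this research aims at), the compressed length of a string is a crude, computable upper bound on its Kolmogorov complexity, so regular data admits a short explanation while random data does not:

    import os
    import zlib

    def description_length(data: bytes) -> int:
        # Length of a zlib encoding: a crude, computable upper bound
        # on the Kolmogorov complexity of `data`.
        return len(zlib.compress(data, 9))

    regular = bytes(range(16)) * 64   # highly regular, 1024 bytes
    random_ = os.urandom(1024)        # incompressible, 1024 bytes

    print(description_length(regular))  # small: a short explanation exists
    print(description_length(random_))  # ~1024 or more: no short explanation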

Research projects

  • Theory of Incremental Compression

    We have developed a theory of incremental compression [1,2], which can efficiently find short representations of arbitrary strings generated by a composition of functions/features. Since data compression is crucial for AGI, this is a major step forward.

  • Developing a practical compression algorithm

    Incremental compression is an efficient way to find short representations for data generated by a composition of functions/features. Even though it is much more efficient than universal search, it is not yet enough for practical applications, since we do not know how to find those functions. However, real-world data usually exhibit local correlations, and this circumstance can be exploited by using a branching compression hierarchy. See this publication for more on that issue. A toy sketch of the incremental procedure follows this list.
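
To make the incremental idea concrete, the following toy sketch (ours, not the published algorithm; a single run-length feature stands in for the much richer function spaces the theory considers) greedily peels off one invertible feature at a time and recurses on the shorter residual, so that composing the recorded features on the final residual regenerates the data:

    import re

    def rle_enc(x):
        # Run-length encode, e.g. 'aaab' -> '3a1b'.
        return "".join(f"{len(m.group(0))}{m.group(1)}"
                       for m in re.finditer(r"(.)\1*", x))

    def rle_dec(r):
        # Invert rle_enc, e.g. '3a1b' -> 'aaab'.
        return "".join(ch * int(n) for n, ch in re.findall(r"(\d+)(\D)", r))

    FEATURES = [("rle", rle_enc, rle_dec)]  # toy stand-in for a feature space

    def compress_incrementally(x):
        # Greedily apply any feature that both shortens the current
        # description and remains invertible; stop at a fixed point.
        stack = []
        improved = True
        while improved:
            improved = False
            for name, enc, dec in FEATURES:
                r = enc(x)
                if len(r) < len(x) and dec(r) == x:
                    stack.append(name)
                    x, improved = r, True
                    break
        return x, stack  # residual plus the composed features

    print(compress_incrementally("a" * 50 + "b" * 50))  # ('50a50b', ['rle'])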

Publications

The publications listed below give an impression of the science done at OCCAM. To get started, we recommend reading the position paper.

Experiments on the generalization of machine learning algorithms

The inductive programming system WILLIAM is applied to machine learning tasks, in particular, centralization, outlier detection, linear regression, linear classification and decision tree classification. These...

Franz, A. (2022). Experiments on the Generalization of Machine Learning Algorithms. In: Goertzel, B., Iklé, M., Potapov, A. (eds.) Artificial General Intelligence. AGI 2021. Lecture Notes in Computer Science, vol. 13154. Springer, Cham. https://doi.org/10.1007/978-3-030-93758-4_9

November 22, 2023

A theory of incremental compression

The ability to find short representations, i.e. to compress data, is crucial for many intelligent systems. We present a theory of incremental compression showing that...

Arthur Franz, Oleksandr Antonenko, Roman Soletskyi. A theory of incremental compression. Information Sciences, vol. 547, 2021, pp. 28-48, ISSN 0020-0255. https://doi.org/10.1016/j.ins.2020.08.035. ArXiv: https://arxiv.org/abs/1908.03781. The PDF of the published paper is available upon request.

September 14, 2020

WILLIAM: A monolithic approach to AGI

We present WILLIAM – an inductive programming system based on the theory of incremental compression. It builds representations by incrementally stacking autoencoders made up of...

Arthur Franz, Victoria Gogulya, and Michael Löffler. "WILLIAM: A monolithic approach to AGI." In International Conference on Artificial General Intelligence, pp. 44-58. Springer, Cham, 2019.

November 08, 2019

Introducing WILLIAM: a system for inductive inference based on the theory of incremental compression

We introduce WILLIAM — a new system for data compression that is based on a formal mathematical theory of incremental compression. The theory promises to...

August 30, 2018

On hierarchical compression and power laws in nature

Since compressing data incrementally by a non-branching hierarchy has resulted in substantial efficiency gains for performing induction in previous work, we now explore branching hierarchical...

Franz, A. On Hierarchical Compression and Power Laws in Nature. In International Conference on Artificial General Intelligence, pp. 77-86. Springer, Cham, 2017.

July 13, 2017

Some Theorems on Incremental Compression

The ability to induce short descriptions of, i.e. to compress, a wide class of data is essential for any system exhibiting general intelligence. In all generality,...

Franz, A. Some Theorems on Incremental Compression. In International Conference on Artificial General Intelligence, pp. 74-83. Springer International Publishing, 2016.

June 25, 2016

Toward Tractable Universal Induction Through Recursive Program Learning

Since universal induction is a central topic in artificial general intelligence (AGI), it is argued that compressing all sequences up to a complexity threshold should...

Franz, A. Toward Tractable Universal Induction Through Recursive Program Learning. In International Conference on Artificial General Intelligence, pp. 251-260. Springer, Cham, 2015.

July 15, 2015

Artificial general intelligence through recursive data compression and grounded reasoning: a position paper

This paper presents a tentative outline for the construction of an artificial, generally intelligent system (AGI). It is argued that building a general data compression...

Franz, A. Artificial General Intelligence Through Recursive Data Compression and Grounded Reasoning: A Position Paper. arXiv preprint arXiv:1506.04366, 2015.

January 06, 2015

Will super-human artificial intelligence (AI) be subject to evolution?

There has been much speculation about the future of humanity in the face of super-humanly intelligent machines. Most of the dystopian scenarios seem to be...

September 06, 2013


Blog

Building thinking machines is hard. Very hard. This blog discusses some of the issues involved in our approach.

Talk at AI Journey 2019 in Moscow

Our approach to artificial general intelligence

Our presentation at GoodAI (Prague). Topic: Our approach to artificial general intelligence

Recent lectures on AGI by Dr. Arthur Franz (in Russian)

For our laboratory the year began auspiciously with a succession of introductory lectures held in Odessa. Those of our readers who speak Russian can watch...

Emergence of attention mechanisms during compression

It just dawned on me. When we want to compress, we have to do it in one incremental fashion or another, arriving in description...

Approximating the theory of universal intelligence by means of incremental information compression

On February 7, 2018, at 2:30 pm, the latest session of the Odessa seminar on discrete mathematics took place in the main building of ONU (Dvoryanska St. 2), room 47 (1st floor), organized jointly by Odessa National University...

Toward superintelligence

On January 22, Terminal 42 hosted a unique event: an introduction to the private non-profit research laboratory OCCAM and a lecture in which OCCAM's founder, Arthur Franz, told the audience...

Hierarchical epsilon machine reconstruction

Having read this long paper by James P. Crutchfield (1994), “Calculi of emergence”, we have to admit that it is very inspiring. Let’s think about...

Universal approximators vs. algorithmic completeness

Finally, it has dawned on us. A problem that we had trouble conceptualizing is the following. On the one hand, for the purposes of universal...

The merits of indefinite regress

The whole field of machine learning, and artificial intelligence in general, is plagued by a particular problem: the well-known curse of dimensionality. In a...

Using features for the specialization of algorithms

A widespread sickness of present “narrow AI” approaches is the almost irresistible urge to set up rigid algorithms that find solutions in an as large...

The physics of structure formation

The entropy in equilibrium thermodynamics is defined as $S = k_B \ln W$, which always increases in closed systems. It is clearly a special case of Shannon entropy $H(p) = -\sum_i p_i \log p_i$...
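
For reference (our addition), the one-step derivation behind the “special case” claim: Shannon entropy evaluated at the uniform distribution over W microstates reduces to Boltzmann entropy in units of k_B,

    S = k_B \ln W                          (Boltzmann entropy, W microstates)
    H(p) = -\sum_i p_i \ln p_i             (Shannon entropy, natural units)
    p_i = 1/W  =>  H = \ln W  =>  S = k_B H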

Incremental compression

A problem of the incremental approach is obviously local minima in compression. Is it possible that the probability of ending up in a local minimum...

Scientific progress and incremental compression

Why is scientific progress incremental? Clearly, the construction of increasingly unified theories in physics and elsewhere is an example of incremental compression of experimental data, of...

Recursive unpacking of programs

One idea that we have been following is the idea of simple but deep sequences. Simple in terms of Kolmogorov complexity and deep in terms...

Learning spatial relations

The big demonstrator that we have in mind is the ability to talk about some line drawing scene, after having extracted various objects from it...

Extraction of orthogonal features

The lesson from those considerations is the need for features. Each of those relations does not fix the exact position of the object but rather...

Extending the function network compression algorithm

So, what are the next steps? We should expand our attempts with the function network. That, it seems, is the best path. One thing that...

Hiring

Interested in working on inspiring research projects?
If you are a scientist in one of the related areas, or a student of computer science, mathematics or physics in Odessa, you are welcome to contact us.

Current jobs:

to be announced