Discovering neural nets with low Kolmogorov complexity and high generalization capability

Research output: Contribution to journal › Article › peer-review

95 Scopus citations

Abstract

Some basic concepts of algorithmic complexity theory relevant to machine learning are reviewed, along with the Solomonoff-Levin distribution (or universal prior), which addresses the problem of choosing appropriate priors. The universal prior leads to a probabilistic method for finding algorithmically simple problem solutions with high generalization capability. The method is based on Levin complexity (a time-bounded generalization of Kolmogorov complexity) and inspired by Levin's optimal universal search algorithm. For a given problem, solution candidates are computed by efficient self-sizing programs that influence their own runtime and storage size. At least on certain toy problems where it is computationally feasible, the method can lead to generalization results unmatched by previous neural network algorithms.
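
For readers unfamiliar with the search procedure summarized above: the universal prior assigns a program p a weight proportional to 2^(-|p|), and the Levin complexity of a solution is roughly the minimum over programs of |p| + log t(p), where t(p) is the program's runtime. The following Python sketch is a minimal illustration of a Levin-style search over a deliberately tiny program space (bitstrings read as parity functions). The interpret and levin_search functions and the parity encoding are hypothetical choices made here for illustration only; they are not the paper's self-sizing program scheme.

    from itertools import product

    def interpret(program_bits, x_bits, max_steps):
        """Toy interpreter (illustrative only): the program selects a subset of
        input bits and outputs their parity; each selected bit costs one step."""
        steps, acc = 0, 0
        for p_bit, x_bit in zip(program_bits, x_bits):
            if p_bit:
                steps += 1
                if steps > max_steps:
                    return None, steps  # time budget exhausted
                acc ^= x_bit
        return acc, steps

    def levin_search(train, n_inputs, max_phase=20):
        """Levin-style search: in phase T, every program p of length <= T gets a
        step budget of about 2**(T - len(p)), so total effort per phase is on the
        order of 2**T and programs are effectively tried in order of increasing
        len(p) + log2(runtime). Returns the first program consistent with all
        training pairs."""
        for phase in range(1, max_phase + 1):
            for length in range(1, min(phase, n_inputs) + 1):
                budget = 2 ** (phase - length)
                for program in product((0, 1), repeat=length):
                    if all(interpret(program, x, budget)[0] == y for x, y in train):
                        return program
        return None

    # Usage: recover the parity of input bits 0 and 2 from labeled examples.
    target = lambda x: x[0] ^ x[2]
    train = [(x, target(x)) for x in product((0, 1), repeat=5)]
    print(levin_search(train, n_inputs=5))  # -> (1, 0, 1), the shortest consistent program

Because shorter and faster programs are tried first, the sketch returns an approximately minimal-Levin-complexity program consistent with the data, mirroring the Occam's razor bias described in the abstract.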
Original language: English (US)
Pages (from-to): 857-873
Number of pages: 17
Journal: Neural Networks
Volume: 10
Issue number: 5
DOIs
State: Published - Jan 1 1997
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2022-09-14

ASJC Scopus subject areas

  • Artificial Intelligence
  • Cognitive Neuroscience

