Searching for Minimal Neural Networks in Fourier Space
- DOI: 10.2991/agi.2010.28
The principle of minimum description length suggests looking for the simplest network that works well on the training examples, where simplicity is measured by network description size based on a reasonable programming language for encoding networks. Previous work used an assembler-like universal network encoding language (NEL) and Speed Prior-based search (related to Levin's Universal Search) to quickly find low-complexity nets with excellent generalization performance. Here we define a more natural and often more practical NEL whose instructions are frequency-domain coefficients. Frequency coefficients may be encoded in few bits, hence huge weight matrices may just be low-complexity superpositions of patterns computed by programs with few elementary instructions. On various benchmarks this weight matrix encoding greatly accelerates the search. The scheme was tested on pole balancing, the long-term dependency T-maze, and ball throwing. Some of the solutions turn out to be unexpectedly simple, as they are computable by fairly short bit strings.
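The core idea above can be sketched in code: a handful of low-frequency coefficients, expanded by an inverse discrete cosine transform, yields a full weight matrix. The sketch below is illustrative only and makes assumptions not spelled out in the abstract: the `decode_weights` helper is hypothetical, and it fills coefficients into the low-frequency corner in simple row-major order rather than whatever ordering the paper actually uses.

```python
import numpy as np
from scipy.fft import idctn  # inverse n-dimensional discrete cosine transform

def decode_weights(coeffs, shape):
    """Hypothetical decoder: place a few low-frequency coefficients in the
    top-left (low-frequency) corner of a zero matrix, then invert the DCT
    to obtain a dense weight matrix."""
    c = np.zeros(shape)
    k = 0
    # Row-major fill of the low-frequency corner (an assumption for
    # illustration; the actual coefficient ordering may differ).
    for i in range(shape[0]):
        for j in range(shape[1]):
            if k >= len(coeffs):
                break
            c[i, j] = coeffs[k]
            k += 1
    return idctn(c, norm='ortho')

# Three numbers describe an entire 8x16 weight matrix:
W = decode_weights([1.0, -0.5, 0.25], (8, 16))
print(W.shape)  # (8, 16)
```

Because the description length is the number of coefficients rather than the number of weights, a search over such encodings explores a far smaller space than direct weight search, which is the acceleration the abstract reports.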
- © 2010, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Jan Koutnik
AU  - Faustino Gomez
AU  - Jürgen Schmidhuber
PY  - 2010/06
DA  - 2010/06
TI  - Searching for Minimal Neural Networks in Fourier Space
BT  - Proceedings of the 3rd Conference on Artificial General Intelligence (2010)
PB  - Atlantis Press
SP  - 128
EP  - 133
SN  - 1951-6851
UR  - https://doi.org/10.2991/agi.2010.28
DO  - 10.2991/agi.2010.28
ID  - Koutnik2010/06
ER  -