Quantifying AI Reasoning via Systematic Neural Network Representations of Algorithms
A central open question in contemporary AI research is to quantify the forms of reasoning neural networks can perform when perfectly trained. We answer this question by interpreting reasoning tasks as circuit emulation, where the gates determine the type of reasoning: Boolean gates for predicate logic, tropical gates for dynamic programming, and analytic gates for symbolic mathematical representation.
We will present a systematic meta-algorithm that converts essentially any circuit into a feedforward neural network (FF NN) with ReLU activations. The number of neurons in the resulting network (its parametric complexity) scales with the circuit's complexity, and the network's computational graph mirrors that of the emulated circuit. This formalizes the folklore that neural networks trade a circuit's run-time for space complexity, i.e., number of neurons.
We will discuss corollaries of the algorithm, including new size, depth, and width bounds for FF NNs representing a general Boolean function, solving the all shortest paths (ASP) problem, and more. Finally, we will prove a kind of Kolmogorov–Arnold representation theorem in the digital setting; namely, that our construction emulates the circuit exactly, without approximation or rounding.
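To give a flavor of the gate-level exactness claim, here is a minimal toy sketch (not the construction from the talk) showing that individual Boolean and tropical (min-plus) gates are exactly representable by single-ReLU computations on their intended inputs:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Boolean gates, exact on {0, 1} inputs (no approximation, no rounding).
def AND(x, y):
    return relu(x + y - 1.0)

def OR(x, y):
    return 1.0 - relu(1.0 - x - y)

def NOT(x):
    return relu(1.0 - x)

# Tropical (min-plus) gate used in dynamic-programming recursions
# such as shortest-path updates: min is exactly one ReLU.
def MIN(a, b):
    return b - relu(b - a)

# Exactness checks on all Boolean inputs and on sample tropical inputs.
for x in (0.0, 1.0):
    for y in (0.0, 1.0):
        assert AND(x, y) == min(x, y)
        assert OR(x, y) == max(x, y)
    assert NOT(x) == 1.0 - x
assert MIN(3.0, 5.0) == 3.0 and MIN(5.0, 3.0) == 3.0
```

Composing such gates along a circuit's computational graph is what yields a network whose size tracks the circuit's complexity; the identities above illustrate why no rounding is needed at any gate.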
Bio: Dennis is a PhD student working jointly in the mathematical foundations of artificial intelligence and model theory, supervised by Bradd Hart and Anastasis Kratsios.

