Information is encoded and processed in the brain by the spatio-temporal collective dynamics of large neural circuits. Cortical circuits show asynchronous, irregular activity that can be explained by a dynamical balance of excitatory and inhibitory input currents. While the statistical properties of this balance are described by a mean-field theory that is independent of single-neuron properties, the dynamics remains poorly understood. Here we show that chaos and dynamical entropy production depend on the details of action potential (AP) initiation. In exact numerical simulations of a novel neuron model, we found that increasing the action potential onset rapidness decreases chaos and entropy production. Based on an analytical expression for the governing dynamics, we numerically calculated all Lyapunov exponents and derived the Kaplan-Yorke attractor dimension and Kolmogorov-Sinai entropy production rate of the entire network. Gradually increasing the spike onset rapidness monotonically decreased the attractor dimensionality and entropy production, which vanished at a critical value. These findings in random networks were confirmed by numerical simulations of networks with more realistic cortical topology. Our results demonstrate that the dynamics of single-neuron spike initiation critically determines the occurrence and strength of chaos in balanced neural networks, and suggest that cortical neurons, with their fast AP onsets, might be tuned to reduce information loss caused by chaotic network dynamics.
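To make the two derived quantities concrete: given a full Lyapunov spectrum, the Kaplan-Yorke dimension and the Kolmogorov-Sinai entropy rate (via Pesin's identity, the sum of positive exponents) follow from standard textbook definitions. The sketch below uses those standard formulas on a hypothetical example spectrum; it is not the paper's simulation code.

```python
def kaplan_yorke_dimension(spectrum):
    """Kaplan-Yorke dimension D_KY = k + (sum of first k exponents) / |lambda_{k+1}|,
    where k is the largest index for which the cumulative sum is non-negative."""
    lam = sorted(spectrum, reverse=True)  # exponents in decreasing order
    cum = 0.0
    for k, l in enumerate(lam):
        if cum + l < 0:
            # cumulative sum would turn negative here: interpolate
            return k + cum / abs(l)
        cum += l
    return float(len(lam))  # sum never turns negative: dimension = phase-space dimension

def ks_entropy_rate(spectrum):
    """Kolmogorov-Sinai entropy rate via Pesin's identity: sum of positive exponents."""
    return sum(l for l in spectrum if l > 0)

# Hypothetical three-exponent spectrum for illustration
spectrum = [0.5, 0.0, -1.0]
print(kaplan_yorke_dimension(spectrum))  # → 2.5
print(ks_entropy_rate(spectrum))         # → 0.5
```

In the network context described above, the spectrum would contain one exponent per phase-space dimension; shrinking the positive part of the spectrum (as faster AP onset does) lowers both quantities until they vanish at the critical rapidness.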