In an earlier deep learning article, we talked about how inference workloads (using already-trained neural networks to analyze data) can run on fairly cheap hardware, but the training workloads that teach those networks to "learn" are orders of magnitude more expensive. In particular, the more potential inputs an algorithm has, the faster its problem space balloons, and the more out of control your scaling problem gets when you try to analyze that space.
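To make that scaling claim concrete, here is a minimal sketch (with illustrative numbers that are not from the article) of how a discrete input space explodes: assuming each of n inputs can take k distinct values, exhaustively analyzing the problem space means considering k**n combinations.

```python
# Illustrative sketch of combinatorial input-space growth.
# Assumption (not from the article): each of n inputs can take
# K distinct values, so an exhaustive sweep visits K**n combinations.
K = 10  # assumed number of distinct values per input

for n in (4, 8, 16, 32):
    combos = K ** n
    print(f"{n:>2} inputs x {K} values each -> {combos:,} combinations")
```

Even with these modest assumptions, the count goes from 10,000 combinations at 4 inputs to a 33-digit number at 32 inputs, which is why adding inputs makes exhaustive analysis intractable so quickly.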