C# math is fast

I was reading an article on neural networks that mentioned the usual sigmoid activation function, used when the inputs are real numbers in the [0, 1) interval:

    1 / (1 + e^(-x))

The article mentions that this is probably where the program would spend at least half of its time, so I thought: why not pre-compute a bunch of values and trade memory and precision for time? It turns out that C# math is quite fast, and the gain might not be worth it. (I haven't tested it with a neural network yet.)

This is the code I wrote to benchmark the two options, using LINQPad 5:

    void Main()
    {
        const int STEPS = 1 * 1000 * 1000;
        Func<double, double> activation = x => 1.0 / (1.0 + Math.Exp(-x));
        var cache = Precompute(0.0, 1.0, STEPS, activation);

        Benchmark("Using the cache", x => cache[(int)Math.Truncate(x * STEPS)]);
        Benchmark("Calling the function each time", activation);
    }

    double[] Precompute(double lower, double upper, int steps, Func<double, double> f)
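
Precompute fills an array by sampling the function at evenly spaced points over [lower, upper), and Benchmark times a delegate over a large batch of inputs. A minimal sketch of the two helpers, assuming ten million timing iterations and uniformly random inputs in [0, 1) (both are arbitrary choices for the sketch):

    double[] Precompute(double lower, double upper, int steps, Func<double, double> f)
    {
        // Sample f at `steps` evenly spaced points covering [lower, upper),
        // so that (int)Math.Truncate(x * steps) indexes the right slot for x in [0, 1).
        var values = new double[steps];
        var width = (upper - lower) / steps;
        for (int i = 0; i < steps; i++)
            values[i] = f(lower + i * width);
        return values;
    }

    void Benchmark(string label, Func<double, double> f)
    {
        // Time f over pseudo-random inputs in [0, 1).
        // The iteration count is an arbitrary choice for this sketch.
        const int ITERATIONS = 10 * 1000 * 1000;
        var random = new Random(42);

        var watch = Stopwatch.StartNew();
        double sum = 0.0; // accumulate the results so the calls cannot be optimised away
        for (int i = 0; i < ITERATIONS; i++)
            sum += f(random.NextDouble());
        watch.Stop();

        Console.WriteLine(label + ": " + watch.ElapsedMilliseconds + " ms (checksum " + sum + ")");
    }

Generating a random input each iteration adds the same overhead to both measurements, so the comparison between the cache lookup and the direct Math.Exp call still holds.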