Posts

A Whole New Dimension of Optimization
Numerical Optimization: Optimizing multivariate objectives.

Take the Rough With the Smooth
Numerical Optimization: How helpful are smooth objectives? This post explores strategies for incorporating derivatives in optimization algorithms.

The Roots of No Evil
The first post in a planned series about numerical optimization algorithms. This one covers extrema and roots.

Random Integers
Efficient sampling from a discrete distribution is a useful yet nontrivial algorithmic building block, which involves some interesting and clever ideas.

Machine Learning. Literally.
Arguably, the most successful application of machine learning is largely unknown to most practitioners. Appropriately, it literally involves machines that learn.

Laws, Sausages and ConvNets
The nuts and bolts of Convolutional Neural Networks: algorithms, implementations and optimizations.

The Generative-Discriminative Fallacy
Machine learning algorithms are often categorized as either discriminative or generative. While this dichotomy can be instructive, it is often misleading.

Learning Dynamical Systems
Machine Learning meets Differential Equations.

The Name of The Rose
Convergent evolution is a common phenomenon in machine learning: many dissimilar scenarios lead to similar algorithms. When it comes to generalizations, though, distinctive underlying ideas could be fundamental.

Pointless Topology via Abstract Nonsense
Trigger Warning: pure mathematics. Stone Duality gives a rigorous meaning to the slogan “Geometry is dual to Algebra”.

Meta-Sequences in C++
With the introduction of variadic templates to C++, meta-sequences became a central idiom in metaprogramming. The standard implementation is not always the best choice.

Random Bitstreams
Controlling the entropy of pseudorandom bits in Python when performance matters.