I will post here some stochastic simulations I have been running, with the ultimate goal of sharing some ideas and codes on Simulated Annealing. A few remarks before we start:

Some notes and codes

For the following, it is helpful if, besides Python, you know a little bit of TensorFlow (the code is "almost self-explanatory", but it always helps if you have been previously exposed to TensorFlow's ideas).

The next notebook is a continuation of the previous one. Instead of using classical backpropagation, we use a probabilistic rule to choose new weights.
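
The notebook has the actual update rule; as a rough, hedged illustration of what "probabilistic weight choice" can look like, here is a minimal sketch of a Metropolis-style accept/reject step on randomly perturbed weights. The toy regression problem, step size, and temperature schedule below are my own assumptions, not the notebook's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (placeholder for whatever supervised problem the notebook uses).
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

def loss(w):
    return np.mean((X @ w - y) ** 2)

w = rng.normal(size=3)   # current weights
current = loss(w)
T = 1.0                  # "temperature" controlling how often worse moves are accepted

for step in range(2000):
    proposal = w + 0.1 * rng.normal(size=3)   # random perturbation, no gradients involved
    new = loss(proposal)
    # Always accept improvements; accept worse moves with probability exp(-dE/T).
    if new < current or rng.random() < np.exp(-(new - current) / T):
        w, current = proposal, new
    T *= 0.999                                 # slowly cool down, as in Simulated Annealing

print(w, current)
```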

In the next (short) notebook we run a Convolutional Neural Network (CNN) in order to do predictions in a supervised learning problem. Later on, through dimensionality reduction (which will be explained in the notebook), we run a Neural Network prediction model on a lower-dimensional manifold which, due to its "high quality", ends up providing good information for a prediction algorithm as accurate as, or even more efficient than, the CNN.

For this program we use Keras and (very little) TensorFlow.
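
To give a flavor of the Keras side, here is a minimal sketch of the kind of setup involved: a small CNN trained on a supervised problem, whose last hidden layer can then serve as a lower-dimensional representation for a second model. The dataset (MNIST as a stand-in), layer sizes, and training settings are placeholders I chose, not the notebook's actual architecture.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder data: MNIST stands in for the notebook's supervised problem.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0   # add a channel axis, scale to [0, 1]
x_test = x_test[..., None] / 255.0

# A small CNN; the notebook's actual architecture may differ.
model = keras.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),   # a candidate low-dimensional representation
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))

# Extract the 64-dimensional features, on which a second, simpler model can be trained.
embedder = keras.Model(model.input, model.layers[-2].output)
features = embedder.predict(x_test)
```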

The following note concerns LASSO, Ridge regression, and Least squares regression. It can also be seen, and that's the perspective I adopt in the notes, as a 1-layer NN where one "forgets" to use an activation function. There is also an interesting issue regarding symmetry, labeling, and penalization. For this study we only use standard libraries (sklearn and numpy).
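
In that one-layer-NN view, all three methods fit y ≈ Xw; Ridge adds an L2 penalty on w and LASSO an L1 penalty, which tends to zero out coefficients. A minimal sketch comparing the three with standard sklearn calls (the synthetic sparse data and the alpha values are my own choices for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso, Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic data with a sparse ground truth: only three active features.
X = rng.normal(size=(200, 10))
coef = np.zeros(10)
coef[:3] = [2.0, -1.0, 0.5]
y = X @ coef + 0.1 * rng.normal(size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("least squares", LinearRegression()),
                    ("ridge", Ridge(alpha=1.0)),
                    ("lasso", Lasso(alpha=0.1))]:
    model.fit(X_tr, y_tr)
    # R^2 on held-out data, plus the fitted coefficients
    # (LASSO should recover the sparsity pattern).
    print(name, round(model.score(X_te, y_te), 3), model.coef_.round(2))
```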

I recently read an interesting (old) paper by Daniel Hillis, Co-evolving parasites improve simulated evolution as an optimization procedure, on the idea of co-evolving parasites applied to an optimization problem. I decided to play a bit with it. This is somehow related to the above post on weight evolution and mass shuffling, but its heuristics for parameter search are way more interesting than the ones I had designed therein. Nevertheless, they are parallel in the sense that both are stochastic algorithms, and "Backpropagation-free".
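
The core idea in Hillis's paper is to evolve a population of candidate solutions ("hosts") against a co-evolving population of test cases ("parasites") that score points where the hosts fail, so the tests keep probing the solutions' weak spots. The notebook has its own setup; as a hedged, toy illustration of the scheme only, here is a sketch on a hidden bit-string target, where all parameters and the pairing of hosts with parasites are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N_BITS, POP = 32, 40
target = rng.integers(0, 2, N_BITS)            # hidden solution to be found

hosts = rng.integers(0, 2, (POP, N_BITS))      # candidate solutions
parasites = rng.integers(0, 2, (POP, N_BITS))  # masks choosing which bits get tested

def step(hosts, parasites):
    # Host fitness: bits matching the target, counted only where its parasite probes.
    matches = (hosts == target) & (parasites == 1)
    host_fit = matches.sum(axis=1)
    # Parasite fitness: probed bits where the host fails.
    para_fit = parasites.sum(axis=1) - host_fit

    def evolve(pop, fit):
        # Keep the fitter half, then clone it with a small mutation rate (assumption: 2%).
        order = np.argsort(fit)[::-1]
        parents = pop[order[: POP // 2]]
        children = parents.copy()
        flips = rng.random(children.shape) < 0.02
        children[flips] ^= 1
        return np.vstack([parents, children])

    return evolve(hosts, host_fit), evolve(parasites, para_fit)

for gen in range(200):
    hosts, parasites = step(hosts, parasites)

best = hosts[(hosts == target).sum(axis=1).argmax()]
print("bits correct:", (best == target).sum(), "/", N_BITS)
```

Like the weight-shuffling experiment above, nothing here computes a gradient: selection pressure from the adversarial test cases does all the work.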