# PatWie - Blog

### Introduction TensorPack - The training has been finished - Now what?

June 5, 2017

The GPUs were running for several weeks, and you got angry emails from your colleagues about the occupied computing machines. At this moment your mind struggles: ‘Did I contribute to global climate change?’. Nonetheless, the training error decreased, and the cross-validation results put you in a happy mood. Your model surpassed all competitors’ methods. Then it is time to deploy your model!

### Introduction TensorPack - Knitting a Model

June 4, 2017

In the last post, I covered a nice way of producing training data on-the-fly. In this post, we are going to create a model with trainable parameters to predict high-resolution MNIST digits.

### Introduction TensorPack - Data Prefetching

March 1, 2017

There are many libraries and wrappers for TensorFlow claiming to be easy to apply to your problems. Most of them oversimplify the usage to the point that they are no longer flexible enough. Just take a look at Keras, PrettyTensor, TfSlim, sugarTensor, tflearn, … . There are probably many more libraries doing the same thing. But they have been looking at the wrong problem all along. It is not hard to write a Conv2D layer or a ReLU layer (yes, people actually wrap tf.nn.relu). So writing a layer is not the issue; efficient training is! Yet so many people think they need to provide another TF wrapper for common layers, making the same mistake again and again. I care about speed during training.
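The core idea behind data prefetching is simple: produce the next batch in the background while the GPU is busy with the current one, so the training loop never stalls on I/O or preprocessing. As a minimal sketch of that idea in plain Python (not the TensorPack API itself; `slow_loader` and `prefetched` are hypothetical names for illustration):

```python
import queue
import threading
import time

def slow_loader(num_items, out_queue):
    """Simulate an expensive data source (disk reads, decoding, augmentation)."""
    for i in range(num_items):
        time.sleep(0.01)  # pretend this is costly preprocessing
        out_queue.put(i)
    out_queue.put(None)  # sentinel: no more data

def prefetched(num_items, buffer_size=8):
    """Yield items while a background thread keeps the buffer filled."""
    q = queue.Queue(maxsize=buffer_size)
    t = threading.Thread(target=slow_loader, args=(num_items, q), daemon=True)
    t.start()
    while True:
        item = q.get()
        if item is None:
            break
        yield item

batches = list(prefetched(5))
print(batches)  # [0, 1, 2, 3, 4]
```

While the consumer processes item `i`, the loader thread is already filling the queue with the following items, which is exactly the overlap of computation and data production that makes training fast.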

### Introduction TensorFlow - Optimization of Objective Functions

January 4, 2017

As ‘deep learning’ is simply a nicer slogan for ‘non-linear optimization’, we will now look at how to optimize objective functions using TensorFlow without manually computing all the derivatives by hand.
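TensorFlow's contribution here is automatic differentiation: you state the objective, and the framework derives the gradients. As a minimal sketch of what that automation replaces (plain Python, no TensorFlow), here is gradient descent on the toy objective f(x) = (x − 3)², with the gradient derived by hand:

```python
def f(x):
    """Toy objective with its minimum at x = 3."""
    return (x - 3.0) ** 2

def grad_f(x):
    """Derivative of f, computed by hand -- the step TensorFlow automates."""
    return 2.0 * (x - 3.0)

x = 0.0   # starting point
lr = 0.1  # learning rate
for _ in range(100):
    x -= lr * grad_f(x)  # one gradient-descent update

print(round(x, 4))  # converges to the minimizer 3.0
```

For a single scalar function this is easy, but a deep network is a composition of thousands of such functions; writing `grad_f` manually for each one is exactly the bookkeeping that automatic differentiation spares you.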