The following posts are intended for readers who want to understand very basic TF workflows.

TensorFlow and C++ - Once And For All

Looking at Stack Overflow questions tagged tensorflow and c++, one might get the impression that combining the two is a direct road to hell. But it is quite simple to use both without ending up in the loony bin.

Read more

Introduction TensorFlow - Understanding the Computation Graph

Most introductory guides simply rephrase the official MNIST example from the TensorFlow documentation without adding any information. And the official guide is definitely not the best way to start working with TensorFlow (TF). To understand what happens behind the scenes, you need to grasp the core concept of TensorFlow, its symbolic computation graph, and begin with very basic examples.
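The graph/session split can be illustrated without TensorFlow at all. This is a plain-Python sketch of the idea, not TF's API: building a node only records an operation, and nothing is computed until the graph is evaluated with a feed, mirroring TF's deferred execution.

```python
# Conceptual sketch (no TensorFlow required): a symbolic graph defers work.
# Constructing nodes records operations; eval() with a feed dict computes,
# analogous to building a tf.Graph and calling session.run(feed_dict=...).

class Node:
    def __init__(self, op, inputs=()):
        self.op, self.inputs = op, inputs

    def eval(self, feed):
        vals = [n.eval(feed) for n in self.inputs]
        return self.op(feed, vals)

def placeholder(name):
    return Node(lambda feed, _: feed[name])

def constant(c):
    return Node(lambda feed, _: c)

def add(a, b):
    return Node(lambda feed, v: v[0] + v[1], (a, b))

def mul(a, b):
    return Node(lambda feed, v: v[0] * v[1], (a, b))

# y = 3 * x + 2 -- nothing is computed while the graph is built
x = placeholder("x")
y = add(mul(constant(3), x), constant(2))
print(y.eval({"x": 4}))  # 14
```

The same graph can be evaluated repeatedly with different feeds, which is exactly why TF separates graph construction from execution.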

Read more

Introduction TensorFlow - Optimization of Objective Functions

As ‘deep learning’ is simply a nicer slogan for ‘non-linear optimization + data’, we will now look at how to optimize ‘things’ using TensorFlow without computing all the derivatives by hand.
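To see what an optimizer actually iterates, here is a minimal gradient-descent sketch in plain Python. The derivative is written out by hand here purely for illustration; the point of TF is that it derives this gradient for you.

```python
# Conceptual sketch: gradient descent on f(w) = (w - 3)^2.
# TensorFlow computes df/dw automatically via the graph; here we hard-code
# the derivative 2*(w - 3) to show the bare optimization loop.

def minimize(lr=0.1, steps=100):
    w = 0.0                      # initial parameter value
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)   # df/dw, derived by hand
        w -= lr * grad           # one optimizer step
    return w

w = minimize()
print(round(w, 4))  # 3.0 -- the minimizer of f
```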

Read more

Introduction TensorPack - Data Prefetching

There are many libraries and wrappers for TensorFlow that claim to be easy to apply to your problems. Most of them oversimplify the interface to the point of not being flexible enough. Just take a look at Keras, PrettyTensor, TfSlim, sugarTensor, tflearn, and the many others doing the same thing. They have all been looking at the wrong problem. It is not hard to write a Conv2D layer or a ReLU layer (yes, people actually wrap tf.nn.relu). So writing layers is not the issue; efficient training is: the entire interplay between CPU and GPU processing. You do not want either unit to sit idle waiting for the other. Yet so many people think they need to provide yet another TF wrapper for common layers, making the same mistake again and again. I care about speed during training.
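The CPU/GPU interplay boils down to the classic producer/consumer pattern. The sketch below is plain Python, not TensorPack's API: a background thread preloads batches into a bounded queue so the consumer (standing in for the GPU) never waits for data.

```python
# Conceptual sketch: a producer thread fills a bounded queue with batches
# while the consumer drains it -- the pattern behind data prefetching.
# Everything here is illustrative; it is not the TensorPack API.

import queue
import threading

def producer(q, n_batches):
    for i in range(n_batches):
        q.put([i] * 4)   # stand-in for expensive CPU-side preprocessing
    q.put(None)          # sentinel: no more data

q = queue.Queue(maxsize=8)   # bounded: the producer blocks instead of eating RAM
threading.Thread(target=producer, args=(q, 3), daemon=True).start()

batches = []
while (batch := q.get()) is not None:
    batches.append(batch)    # stand-in for a training step consuming the batch
print(len(batches))
```

Because the queue is bounded, the producer naturally throttles itself, and because it runs in its own thread, preprocessing overlaps with "training" instead of alternating with it.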

Read more

Introduction TensorPack - Knitting a Model

In the last post, we looked at a nice way of producing training data on the fly. In this one, we are going to create a model with trainable parameters to predict high-resolution MNIST digits.
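"A model with trainable parameters" is, at its core, just a forward function plus parameters that get updated from data. A hedged plain-Python sketch of that idea (a one-dimensional linear model, nothing TensorPack-specific):

```python
# Conceptual sketch: a "model" = parameters + forward pass; training nudges
# the parameters to reduce the error. TensorPack wires the same idea into a
# TF graph; the names and toy data here are illustrative only.

def forward(w, b, x):
    return w * x + b           # linear model with trainable w and b

def train(data, lr=0.05, steps=500):
    w, b = 0.0, 0.0            # trainable parameters, initialized to zero
    for _ in range(steps):
        for x, y in data:
            err = forward(w, b, x) - y
            w -= lr * err * x  # gradient of 0.5*err^2 w.r.t. w
            b -= lr * err      # gradient of 0.5*err^2 w.r.t. b
    return w, b

# Learn y = 2x + 1 from four samples.
w, b = train([(0, 1), (1, 3), (2, 5), (3, 7)])
```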

Read more

Introduction TensorPack - The Training Is Finished - Now What?

The GPUs have been running for several weeks, and you got angry emails from your colleagues about the occupied machines. At this moment your mind struggles: ‘Did I contribute to global climate change?’ Nonetheless, the training error decreased, and the cross-validation results put you in a happy mood. Your model surpassed all competitors’ methods. Then it is time to deploy your model!

Read more