This is my comment on a GitHub issue.

My solution is somewhat tricky: build your network using only the standard **Keras** backend API, without calling TensorFlow or Theano functions directly.

**During development,** use TensorFlow as the backend, because compiling a model in Theano takes longer than you can endure, especially with the optimization flags turned on. You may have to put up with a smaller batch size and slower computation under TensorFlow, but at this stage all you need to verify is that your network structure is correct.

**After that**, simply switch to the Theano backend in Keras's configuration file. Since you built the network against the standard Keras backend, every function call is automatically dispatched to the corresponding Theano API. Compiling the model in Theano may take a while, but you can then use a larger batch size and enjoy faster computation.
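For reference, the switch is a one-word change in Keras's configuration file, `~/.keras/keras.json` by default. A minimal sketch (field names vary slightly by Keras version; Keras 1 uses `image_dim_ordering` as shown, while Keras 2 uses `image_data_format` with values `channels_first`/`channels_last`):

```json
{
    "backend": "theano",
    "image_dim_ordering": "th",
    "floatx": "float32",
    "epsilon": 1e-07
}
```

Setting `"backend"` back to `"tensorflow"` reverses the switch without touching your model code.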

**Notice that** Theano and TensorFlow differ in their default data layouts, such as the position of the filter-channel axis in image tensors (Theano puts channels first, TensorFlow puts them last). I use Theano's layout in both development and deployment, because that is the layout Theano's computation is optimized for.
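To make the layout difference concrete, here is a small NumPy sketch (the array is a hypothetical batch of 8 RGB images of size 32×32) showing the transpose that converts between the two conventions:

```python
import numpy as np

# Theano convention ("th"): (batch, channels, height, width)
batch_th = np.zeros((8, 3, 32, 32))

# TensorFlow convention ("tf"): (batch, height, width, channels)
# Move the channel axis from position 1 to the end.
batch_tf = batch_th.transpose(0, 2, 3, 1)

print(batch_th.shape)  # (8, 3, 32, 32)
print(batch_tf.shape)  # (8, 32, 32, 3)
```

If you stick to one convention throughout, as suggested above, this transpose is only needed when exchanging data with code that assumes the other layout.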

**Note also that** Theano does not have tooling as versatile as TensorFlow's TensorBoard. So if you want to visualize or analyze the trained model, feel free and safe to switch back to TensorFlow.

**Finally,** if some function is not implemented in the Keras backend, you can write a simple conditional on the active backend and call the underlying library directly. For example, eigenvalue decomposition of a self-adjoint matrix:

```python
import keras.backend as K

# K.eig() does not exist in the Keras backend API, so branch on the
# active backend and call the underlying library directly.
if K.backend() == 'theano':
    from theano.tensor import nlinalg
    # nlinalg.eigh also exists and matches the self-adjoint case
    eig, eigv = nlinalg.eig(matrix)
elif K.backend() == 'tensorflow':
    import tensorflow as tf
    # renamed tf.linalg.eigh in newer TensorFlow versions
    eig, eigv = tf.self_adjoint_eig(matrix)
else:
    raise NotImplementedError('eig not supported for backend %s'
                              % K.backend())
```