15

Let's say I have a simple neural network with an input layer and a single convolutional layer, programmed in TensorFlow:

  # Input Layer
  input_layer = tf.reshape(features["x"], [-1, 28, 28, 1])

  # Convolutional Layer #1
  conv1 = tf.layers.conv2d(
      inputs=input_layer,
      filters=32,
      kernel_size=[5, 5],
      padding="same",
      activation=tf.nn.relu)

I leave out the rest of the network definition and the definition of the features.

If I wanted to add an LSTM layer after this convolutional layer, I would have to make the convolutional layer TimeDistributed (in the language of Keras) and then feed the output of the TimeDistributed layer into the LSTM, roughly as in the sketch below.
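For reference, a minimal pure-Keras sketch of that idea (the sequence length of 10 and the LSTM width of 64 are arbitrary placeholder values, not part of my actual code):

  import keras

  model = keras.models.Sequential([
      # Apply the same 5x5 convolution to each of the 10 timesteps
      keras.layers.TimeDistributed(
          keras.layers.Conv2D(32, (5, 5), padding='same', activation='relu'),
          input_shape=(10, 28, 28, 1)),
      # Flatten each timestep's feature map into a vector
      keras.layers.TimeDistributed(keras.layers.Flatten()),
      # Process the resulting sequence of vectors with an LSTM
      keras.layers.LSTM(64),
  ])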

TensorFlow offers access to the Keras layers in tf.keras.layers. Can I use these Keras layers directly in TensorFlow code? If so, how? Could I also use tf.keras.layers.LSTM for the implementation of the LSTM layer?

So, in general: is a mixture of pure TensorFlow code and Keras code possible, and can I use tf.keras.layers for it?

Merlin1896
  • Possible duplicate of [How to set the input of a Keras layer with a Tensorflow tensor?](https://stackoverflow.com/questions/42441431/how-to-set-the-input-of-a-keras-layer-with-a-tensorflow-tensor) – ldavid Nov 08 '17 at 17:02

1 Answer

17

Yes, this is possible.

Import both TensorFlow and Keras, and register your TensorFlow session as the Keras backend session:

import tensorflow as tf
import keras
from keras import backend as K

# Create one TF session and register it with the Keras backend so that
# both frameworks build ops into the same graph and session
tf_sess = tf.Session()
K.set_session(tf_sess)

Now, in your model definition, you can mix TF and Keras layers like so:

# Input Layer
input_layer = tf.reshape(features["x"], [-1, 28, 28, 1])

# Convolutional Layer #1
conv1 = tf.layers.conv2d(
    inputs=input_layer,
    filters=32,
    kernel_size=[5, 5],
    padding="same",
    activation=tf.nn.relu)

# Flatten conv output
flat = tf.contrib.layers.flatten(conv1)

# Fully-connected Keras layer
layer2_dense = keras.layers.Dense(128, activation='relu')(flat)

# Fully-connected TF layer (output)
output_preds = tf.layers.dense(layer2_dense, units=10)
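Since the original question asks about an LSTM, here is one hypothetical way to continue from `conv1` instead of flattening it: treat the 28 rows of the feature map as a sequence of 28 timesteps with 28 * 32 features each, and feed that to `keras.layers.LSTM` (the layer width of 64 is an arbitrary placeholder):

# Hypothetical LSTM variant: interpret the conv output of shape
# (batch, 28, 28, 32) as 28 timesteps with 28 * 32 features each
seq = tf.reshape(conv1, [-1, 28, 28 * 32])

# Keras LSTM applied directly to a TF tensor; returns the final hidden state
lstm_out = keras.layers.LSTM(64)(seq)

# Output logits, back in plain TF
lstm_preds = tf.layers.dense(lstm_out, units=10)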

This answer is adapted from a Keras blog post by François Chollet.
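For completeness, a minimal sketch of a training step for such a mixed graph, assuming `features["x"]` is a `tf.placeholder` and that `batch_x` and `batch_y` are NumPy arrays from your input pipeline (both are assumptions, not part of the code above):

# Pure-TF loss and optimizer on top of the mixed graph
labels = tf.placeholder(tf.int64, [None])
loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=output_preds))
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

# Initialize all variables (TF and Keras) and run one training step
tf_sess.run(tf.global_variables_initializer())
tf_sess.run(train_op, feed_dict={features["x"]: batch_x, labels: batch_y})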

collector
  • So then `preds` would be a keras layer, right? Is there a way to put this back into a tensorflow layer/operation? – Merlin1896 Mar 09 '18 at 13:40
  • @Merlin1896 You can mix and match any layers you want. I have updated the answer so that the final layer is a standard tensorflow layer – collector Mar 09 '18 at 14:37
  • @Merlin1896 if this answered your question you can mark it as answered so that others may find it as well – collector Mar 13 '18 at 11:22
  • Could you provide a complete minimal example for future reference? The current code won't run :) – Merlin1896 Mar 13 '18 at 19:11
  • @Merlin1896 I changed the example to fit in with your convolutional example. It should run now! In the example I used a Keras dense layer since I'm not sure how exactly you are going to implement the LSTM – collector Mar 14 '18 at 11:22