Keras issue: LSTM blocks and units
I found my answers here: fchollet#7600 (https://github.com/keras-team/keras/issues/7600)
And here are my questions, with the answers I found:
Is it correct that model.add(LSTM(32)) means that there are 32 nodes in the hidden layer?
Yes. The 32 is the units argument, i.e. the dimensionality of the hidden state (and of the layer's output) for each sample.
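To make this concrete, here is a minimal sketch (assuming TensorFlow's Keras; the shapes of 10 timesteps and 8 input features are toy values, not from the issue) showing that the 32 in LSTM(32) is the size of the hidden/output vector per sample:

```python
import numpy as np
from tensorflow.keras.layers import Input, LSTM
from tensorflow.keras.models import Model

# Toy shapes: sequences of 10 timesteps with 8 features each (not from the original issue).
inputs = Input(shape=(10, 8))
outputs = LSTM(32)(inputs)      # units=32 -> hidden state has 32 components
model = Model(inputs, outputs)

x = np.random.rand(4, 10, 8).astype("float32")   # a batch of 4 sequences
print(model.predict(x).shape)   # (4, 32): one 32-dimensional vector per sequence
```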
Are these 32 nodes equivalent to 32 LSTM Blocks?
The answer here is definitely no.
Is it correct to assume that in one timestep the input x_t goes through 32 LSTM blocks?
x_1 --> 32 LSTM blocks (input x_1 goes 32 times through a structure like "A")
x_2 --> 32 LSTM blocks (input x_2 goes 32 times through a structure like "A")
x_3 --> ...
No.
Or does the number of units (32) have nothing to do with the number of LSTM blocks?
Yes.
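A hedged way to see this in code (again assuming TensorFlow's Keras and the same toy shapes): the layer holds one set of gate weights whose shapes scale with units, not 32 separate blocks.

```python
import numpy as np
from tensorflow.keras.layers import LSTM

layer = LSTM(32)
_ = layer(np.zeros((1, 10, 8), dtype="float32"))  # build the layer on a dummy batch (toy shapes)

kernel, recurrent_kernel, bias = layer.get_weights()
print(kernel.shape)            # (8, 128)  = (input_dim, 4 * units)
print(recurrent_kernel.shape)  # (32, 128) = (units, 4 * units)
print(bias.shape)              # (128,)    = (4 * units,)
```

There is exactly one kernel, one recurrent kernel, and one bias; the factor 4 comes from the input, forget, cell-candidate, and output transformations of a single LSTM block, and 32 only sets the width of the vectors they operate on.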
Is it possible that there is just one LSTM block per timestep, like in the picture from http://colah.github.io/posts/2015-08-Understanding-LSTMs/, and that 32 means the internal cell state c_t is a vector with 32 components that passes through the LSTM block ("A") just once per timestep?
Yes, exactly.
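This can also be checked directly (TensorFlow's Keras assumed, toy shapes as above): with return_state=True the layer returns the final hidden state h_t and cell state c_t, and both are 32-dimensional vectors per sample.

```python
import numpy as np
from tensorflow.keras.layers import LSTM

x = np.random.rand(4, 10, 8).astype("float32")   # 4 sequences, 10 timesteps, 8 features (toy values)
output, h_t, c_t = LSTM(32, return_state=True)(x)

print(output.shape)  # (4, 32)
print(h_t.shape)     # (4, 32)  final hidden state
print(c_t.shape)     # (4, 32)  final cell state: one 32-component vector per sequence
```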
Or did I mix something up here?
Yes, I did. I mixed up "LSTM block" and "units".