GAN convergence problem when applying variable reuse in TensorFlow
I am building a GAN, and when I started calling my discriminator twice using reuse, it began to diverge. I first created my discriminator as follows:
def discriminator(self, x_past, x_future, gen_future):
    os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
    with tf.variable_scope("disc") as disc:
        gen_future = tf.concat([gen_future, x_past], 2)
        x_future = tf.concat([x_future, x_past], 2)
        x_in = tf.concat([gen_future, x_future], 0)
        conv1 = tf.layers.conv1d(inputs=x_in, filters=20, kernel_size=3, strides=1,
                                 padding='same', activation=tf.nn.relu)
        max_pool_1 = tf.layers.max_pooling1d(inputs=conv1, pool_size=2, strides=2, padding='same')
        conv2 = tf.layers.conv1d(inputs=max_pool_1, filters=3, kernel_size=2, strides=1,
                                 padding='same', activation=tf.nn.relu)
        max_pool_2 = tf.layers.max_pooling1d(inputs=conv2, pool_size=2, strides=2, padding='same')
        # Flatten and add dropout
        flat = tf.reshape(max_pool_2, (-1, 9))
        flat = tf.nn.dropout(flat, keep_prob=self.keep_prob)
        # Predictions
        logits = tf.layers.dense(flat, 2)
        y_true = logits[:self.batch_size]
        y_gen = logits[self.batch_size:]
        return y_true, y_gen
And I was calling it like this:
y_true, y_gen = self.discriminator(x_past, x_future, gen_future)
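For reference, the inputs are rank-3 tensors of shape [batch, time, channels] (the axis-2 concat and conv1d require that). My concrete sizes aren't shown above, so here is a minimal shape walk-through with toy sizes assumed purely for illustration: a time length of 12 is halved by each of the two stride-2 poolings down to 3, and conv2 has 3 filters, which is where the (-1, 9) flatten comes from.

import tensorflow as tf

# Toy shapes, assumed for illustration only (not the real data):
# two past channels and one future channel over 12 time steps.
x_past = tf.placeholder(tf.float32, [None, 12, 2])
x_future = tf.placeholder(tf.float32, [None, 12, 1])

x_in = tf.concat([x_future, x_past], 2)                          # [batch, 12, 3]
conv1 = tf.layers.conv1d(x_in, filters=20, kernel_size=3,
                         padding='same', activation=tf.nn.relu)  # [batch, 12, 20]
pool1 = tf.layers.max_pooling1d(conv1, pool_size=2, strides=2,
                                padding='same')                  # [batch, 6, 20]
conv2 = tf.layers.conv1d(pool1, filters=3, kernel_size=2,
                         padding='same', activation=tf.nn.relu)  # [batch, 6, 3]
pool2 = tf.layers.max_pooling1d(conv2, pool_size=2, strides=2,
                                padding='same')                  # [batch, 3, 3]
print(pool2.shape)  # (?, 3, 3) -- flattens to (-1, 9)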
With that, I was able to train the GAN properly. Now I need reuse so I can call the discriminator without having to feed real and fake data at once. I changed it to:
def discriminator(self, x_past, x_future, reuse=False):
    os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
    with tf.variable_scope("disc", reuse=reuse) as disc:
        x_in = tf.concat([x_future, x_past], 2)
        conv1 = tf.layers.conv1d(inputs=x_in, filters=20, kernel_size=3, strides=1,
                                 padding='same', activation=tf.nn.relu)
        max_pool_1 = tf.layers.max_pooling1d(inputs=conv1, pool_size=2, strides=2, padding='same')
        conv2 = tf.layers.conv1d(inputs=max_pool_1, filters=3, kernel_size=2, strides=1,
                                 padding='same', activation=tf.nn.relu)
        max_pool_2 = tf.layers.max_pooling1d(inputs=conv2, pool_size=2, strides=2, padding='same')
        # Flatten and add dropout
        flat = tf.reshape(max_pool_2, (-1, 9))
        flat = tf.nn.dropout(flat, keep_prob=self.keep_prob)
        # Predictions
        logits = tf.layers.dense(flat, 2)
        return logits
And calling it like this:
y_true = self.discriminator(x_past, x_future)
y_gen = self.discriminator(x_past, gen_future, reuse=True)
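To make sure the reuse=True call actually picks up the weights created by the first call (rather than building a second discriminator), a quick sanity check can be run after both calls. This is a minimal TF 1.x sketch; the optimizer line is only illustrative:

# Every kernel/bias under the "disc" scope should be listed exactly once;
# a duplicated set of names would mean the second call created new variables.
disc_vars = [v for v in tf.trainable_variables()
             if v.name.startswith('disc/')]
print([v.name for v in disc_vars])

# The discriminator's optimizer should also update only these variables,
# e.g. (names here are illustrative, not from the original code):
# d_train_op = tf.train.AdamOptimizer(1e-4).minimize(d_loss, var_list=disc_vars)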
With this second version, training started to diverge. Any idea why that is?
Tags: python, tensorflow, generative-adversarial-network
asked Nov 10 at 22:43 by Rafael Reis