Hi,
During GAN training I'm running into an issue: the KL divergence loss between the real-image logits and the fake-image logits becomes NaN after a couple of steps. I'm printing the KL divergence after every step; already at the first step it is huge, and at the following step it becomes negative.
After Step 1:
G: 23949.90 | D: 30.51 | GP: 11.63 | Rec: 4.70 | KL: 2703973204759576721869805868380323840.00
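For reference, this is roughly how I understand the KL term between the two sets of discriminator logits would be computed (a minimal PyTorch sketch, not the repo's actual code; the function name and shapes are just illustrative):

```python
import torch
import torch.nn.functional as F

def kl_between_logits(real_logits, fake_logits):
    # Sketch (assumed, not the repo's exact implementation):
    # KL(P_real || P_fake) between the softmax distributions of the logits.
    # log_softmax keeps the log term in a stable range; computing
    # torch.log(torch.softmax(...)) instead can hit log(0) = -inf and
    # produce NaN in the backward pass.
    log_p_real = F.log_softmax(real_logits, dim=-1)
    log_p_fake = F.log_softmax(fake_logits, dim=-1)
    # F.kl_div takes (input=log Q, target=P); with log_target=True the
    # target is also given in log space.
    return F.kl_div(log_p_fake, log_p_real, log_target=True,
                    reduction="batchmean")

# If the logits themselves blow up (e.g. tens of thousands), the softmax
# saturates to a one-hot vector and the KL value explodes, which would
# match the huge value printed after step 1.
real = torch.randn(4, 10) * 1e4   # deliberately large logits
fake = torch.randn(4, 10)
print(kl_between_logits(real, fake))
```

If the actual loss takes the log of a softmax directly (or the logits are already saturating), that could explain both the enormous value and the later NaN.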
Any idea why this is the case? @NoahVl