KL divergence NaN issue #13

@harshvardhan96

Description

Hi,

During GAN training I'm running into an issue: the KL divergence loss between the real-image logits and the fake-image logits becomes NaN after a couple of steps. I'm printing the KL divergence after every step, and already in the first step it is huge; in the following step it turns negative.

After Step 1:
G: 23949.90 | D: 30.51 | GP: 11.63 | Rec: 4.70 | KL: 2703973204759576721869805868380323840.00

Any idea why this is the case? @NoahVl
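For reference, here is a minimal sketch (assuming the loss is implemented in PyTorch; the function and variable names below are mine, not taken from the repo) of a KL term between the two sets of logits computed entirely in log-space, which should stay finite and non-negative even when the raw logits are large:

```python
import torch
import torch.nn.functional as F

def kl_between_logits(real_logits: torch.Tensor, fake_logits: torch.Tensor) -> torch.Tensor:
    # Hypothetical helper: KL(p_real || p_fake) from raw discriminator logits.
    # log_softmax keeps everything in log-space, so large logits never overflow.
    log_p_real = F.log_softmax(real_logits, dim=-1)  # target distribution (log-space)
    log_p_fake = F.log_softmax(fake_logits, dim=-1)  # input distribution (log-space)
    # F.kl_div expects the input in log-space; log_target=True keeps the target
    # in log-space too, avoiding an exp() that could produce inf/NaN.
    return F.kl_div(log_p_fake, log_p_real, reduction="batchmean", log_target=True)
```

If the current code exponentiates the logits (or applies softmax and then takes a log of values that can underflow to zero) before computing the KL term, that would explain both the astronomically large value at step 1 and the negative/NaN values afterwards.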
