Conversation

AhmetCanSolak (Collaborator)

This PR implements the original U-Net for Noise2Self.

@AhmetCanSolak self-assigned this on Jul 21, 2022
elif self.normalization == 'batch':
    x = self.batch_normalization(x)

x = self.activation_function(x)
Collaborator

If an invalid activation key is passed, will this become x = None(x)?

AhmetCanSolak (Collaborator, Author) Jul 21, 2022

No, it will raise an error within __init__.

Collaborator

We might want to add an option to skip the activation layer entirely, just like with the normalization layer.
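
A minimal sketch of how that could look, validating the key in __init__ and treating 'none' as an identity (the class and key names here are illustrative assumptions, not the PR's actual code):

```python
from torch import nn

class ConvBlock(nn.Module):  # hypothetical block, for illustration only
    def __init__(self, num_channels, activation='relu', normalization='batch'):
        super().__init__()
        # Validate the key eagerly so a bad value fails here in __init__,
        # not later as x = None(x) inside forward().
        activations = {
            'relu': nn.ReLU(),
            'swish': nn.SiLU(),
            'none': nn.Identity(),  # opt out, mirroring normalization=None
        }
        if activation not in activations:
            raise ValueError(f"Unknown activation: {activation!r}")
        self.activation_function = activations[activation]
        self.normalization = normalization
        if normalization == 'batch':
            self.batch_normalization = nn.BatchNorm2d(num_channels)

    def forward(self, x):
        if self.normalization == 'batch':
            x = self.batch_normalization(x)
        return self.activation_function(x)
```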


# assert result.shape == input_image.shape
# assert result.dtype == input_image.dtype


def test_masking_2D():
    input_array = torch.zeros((1, 1, 64, 64))


@pytest.mark.parametrize("nb_unet_levels", [2, 3, 5, 8])
Collaborator

8 layers?!

AhmetCanSolak (Collaborator, Author)

Might be worth testing, don't you think? :)
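
For context on why 8 stands out: in a typical U-Net, each level below the first halves the spatial resolution, so a 64x64 test input like the one above supports at most 7 levels before the bottleneck shrinks past one pixel. A rough sketch of that arithmetic (the helper name is hypothetical, and the exact level convention may differ in this implementation):

```python
def max_unet_levels(side: int) -> int:
    # Hypothetical helper: count how many levels a square input of the given
    # side length supports, assuming each level below the first halves it.
    levels = 1
    while side > 1 and side % 2 == 0:
        side //= 2
        levels += 1
    return levels

assert max_unet_levels(64) == 7  # nb_unet_levels=8 would need a 128px side
```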

li-li-github (Collaborator) left a comment

Overall it looks good to me!

@AhmetCanSolak merged commit a4454b8 into royerlab:master on Jul 21, 2022
@AhmetCanSolak deleted the torch-original-unet branch on July 21, 2022 at 20:15