
To adopt or not to adopt TensorBoard native hyperparameter recording #184

@vwxyzjn

Description


Problem Description

The current way of adding hyperparameters is via markdown text:

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(f"runs/{run_name}")
# Render the hyperparameters as a markdown table in TensorBoard's TEXT tab
writer.add_text(
    "hyperparameters",
    "|param|value|\n|-|-|\n%s" % ("\n".join([f"|{key}|{value}|" for key, value in vars(args).items()])),
)
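With illustrative args of, say, learning_rate=0.00025 and seed=1 (not the actual defaults of any script), the logged text renders in TensorBoard's TEXT tab as a table like:

|param|value|
|-|-|
|learning_rate|0.00025|
|seed|1|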

It is, however, possible to use the TensorBoard native hyperparameters plugin, which unfortunately also requires ugly code... see pytorch/pytorch#37738 (comment)

from torch.utils.tensorboard import SummaryWriter
from torch.utils.tensorboard.summary import hparams

writer = SummaryWriter(f"runs/{run_name}")
# Register the hyperparameters and the metric to display in the HPARAMS tab
exp, ssi, sei = hparams(vars(args), metric_dict={"charts/episodic_return": 0})
writer.file_writer.add_summary(exp)
writer.file_writer.add_summary(ssi)
writer.file_writer.add_summary(sei)

This produces the following demo, which I guess is pretty cool:

(demo video: Screen.Recording.2022-05-11.at.11.45.43.PM.mov)
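For the HPARAMS view to show actual metric values, the tag listed in metric_dict still has to be logged as a scalar during training. Below is a minimal self-contained sketch, assuming a couple of illustrative argparse flags and a dummy loop in place of the real training code:

import argparse
import random

from torch.utils.tensorboard import SummaryWriter
from torch.utils.tensorboard.summary import hparams

# Illustrative args; real scripts parse many more flags than this
parser = argparse.ArgumentParser()
parser.add_argument("--learning-rate", type=float, default=2.5e-4)
parser.add_argument("--seed", type=int, default=1)
args = parser.parse_args()

run_name = "hparams-demo"
writer = SummaryWriter(f"runs/{run_name}")

# Register hyperparameters plus the metric the HPARAMS tab should track
exp, ssi, sei = hparams(vars(args), metric_dict={"charts/episodic_return": 0})
writer.file_writer.add_summary(exp)
writer.file_writer.add_summary(ssi)
writer.file_writer.add_summary(sei)

# The plugin picks up the metric from scalars logged under the same tag,
# so keep logging it as usual during training
for global_step in range(100):
    episodic_return = random.random()  # stand-in for the real episodic return
    writer.add_scalar("charts/episodic_return", episodic_return, global_step)

writer.close()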

Checklist

CC @araffin and @Miffyli and anyone else who might be interested in this hidden feature.
