Fig 1 | output_diff vs predicted_output_diff with the provided coefficients (log-log scatter; no obvious relationship)
Fig 2 | modulated_input_diff vs output_diff across the 400 prompts (stable, close relationship)
I used 400 prompts from https://huggingface.co/datasets/k-mktr/improved-flux-prompts to generate 400 pairs of (modulated_input_diff, output_diff). Each pair has shape 49 (one value per step-to-step transition out of 50 steps), since I use the following hyperparameters:
```python
num_inference_steps=50,
guidance_scale=3.5,
max_sequence_length=256,
generator=torch.Generator(device).manual_seed(42)
```
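
For reference, here is a minimal sketch of how I set up the generation runs with diffusers' `FluxPipeline`. The checkpoint name and the dataset split/column name (`"prompt"`) are assumptions for illustration, and the forward hooks that actually record modulated_input_diff / output_diff inside the transformer are omitted:

```python
# Sketch of the generation loop (assumptions: checkpoint
# "black-forest-labs/FLUX.1-dev", dataset split "train" with a
# "prompt" column). The hooks that record the diffs are omitted.
import torch
from datasets import load_dataset
from diffusers import FluxPipeline

device = "cuda"
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to(device)

prompts = load_dataset("k-mktr/improved-flux-prompts", split="train")["prompt"][:400]

for prompt in prompts:
    image = pipe(
        prompt,
        num_inference_steps=50,  # 50 steps -> 49 step-to-step diffs
        guidance_scale=3.5,
        max_sequence_length=256,
        generator=torch.Generator(device).manual_seed(42),
    ).images[0]
```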
The result is satisfying in that modulated_input_diff and output_diff show a stable and close relationship across different prompts in my 400 generated pairs (Fig 2). However, I run into some problems when I use the 4th-order coefficients provided in ./TeaCache4FLUX/teacache_flux.py:
- I don't see an obvious relationship in either log(output_diff) vs log(predicted_output_diff) or output_diff vs predicted_output_diff with my own data (Fig 1).
- When I do the 4th-order polynomial fitting on my own data, I get different coefficients, [-34.84608751, -10.79323838, 16.39479138, -1.21976726, 0.12762022], but they also show a poor relationship.
- I find that the L1 loss between output_diff and predicted_output_diff decreases as the order of the fit increases (I tried orders 1 to 10); a sketch of this sweep follows the list.
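
For reproducibility, this is a minimal sketch of the order sweep described above, using the same `x` and `y` as in the plotting snippet further down (variable names assumed to match):

```python
# Sketch of the fitting sweep over polynomial orders 1..10,
# assuming x, y are the per-step means from the plotting snippet below.
import numpy as np

for order in range(1, 11):
    coeffs = np.polyfit(x, y, order)     # least-squares polynomial fit
    y_pred = np.poly1d(coeffs)(x)        # evaluate the fit at the same points
    l1 = np.mean(np.abs(y - y_pred))     # L1 loss between true and predicted
    print(f"order={order:2d}  L1={l1:.6f}")
```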
The plotting code is shown below; I wonder if I am doing something wrong? (BTW, the TeaCache speed-up and quality are marvelous in both FLUX and HunyuanVideo!!!)
```python
import numpy as np
import matplotlib.pyplot as plt

# input_diff / output_diff: the saved .csv data as DataFrames of shape
# (400, 49); .mean() averages over the 400 prompts, giving one value
# per step (a length-49 Series).
x = input_diff.mean()
y = output_diff.mean()

# 4th-order coefficients from ./TeaCache4FLUX/teacache_flux.py
coefficients = [4.98651651e+02, -2.83781631e+02, 5.58554382e+01,
                -3.82021401e+00, 2.64230861e-01]
rescale_func = np.poly1d(coefficients)
ypred = rescale_func(x)

plt.figure(figsize=(8, 8))
plt.plot(np.log(x), np.log(y), '*', label='log original values', color='green')
plt.plot(np.log(x), np.log(ypred), '.', label='log polyfit values', color='blue')
plt.xlabel('4th order true fitting')
plt.legend(loc=4)
```
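
For comparison, my understanding from reading the upstream repo is that teacache_flux.py feeds the polynomial the *relative* L1 change of the modulated input, not a raw difference. Below is a paraphrased sketch of that caching decision, with names abbreviated and the threshold default being an illustrative assumption; please consult the actual file for the exact code:

```python
# Paraphrased sketch of how the upstream teacache_flux.py appears to
# apply rescale_func (names abbreviated; threshold value is assumed).
# modulated_inp / prev_modulated_inp are torch tensors.
import numpy as np

coefficients = [4.98651651e+02, -2.83781631e+02, 5.58554382e+01,
                -3.82021401e+00, 2.64230861e-01]
rescale_func = np.poly1d(coefficients)

def should_recompute(modulated_inp, prev_modulated_inp, state, rel_l1_thresh=0.25):
    # Polynomial input: relative L1 change of the modulated input.
    rel_l1 = ((modulated_inp - prev_modulated_inp).abs().mean()
              / prev_modulated_inp.abs().mean()).item()
    state["accumulated"] += rescale_func(rel_l1)
    if state["accumulated"] < rel_l1_thresh:
        return False            # reuse the cached residual, skip the transformer
    state["accumulated"] = 0.0  # recompute and reset the accumulator
    return True
```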