Calling a pipeline method that replaces the attention processors after loading an IP-Adapter can lead to confusing errors when running the pipeline (see e.g. https://github.com/huggingface/diffusers/issues/8863). For example:

```python
pipe.load_ip_adapter(
    "h94/IP-Adapter", subfolder="models", weight_name="ip-adapter-plus_sd15.bin"
)
pipe.set_ip_adapter_scale(0.7)
pipe.enable_xformers_memory_efficient_attention()
```

fails at inference time with:

```
AttributeError: 'tuple' object has no attribute 'shape'
```

Perhaps we could add a warning message when a pipeline that has already loaded the IP-Adapter attention processors attempts to replace them?

cc: @yiyixuxu
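A minimal sketch of what such a guard could look like. The processor classes and the `set_attn_processors` helper below are stand-ins for illustration only (the real check would live inside diffusers and inspect `unet.attn_processors`, which maps layer names to processor instances):

```python
import warnings


class IPAdapterAttnProcessor:
    """Stand-in for diffusers' IP-Adapter attention processor (mock)."""


class XFormersAttnProcessor:
    """Stand-in for the xformers attention processor (mock)."""


def set_attn_processors(attn_processors, new_processor_cls):
    """Hypothetical guard: warn before overwriting IP-Adapter processors.

    `attn_processors` mirrors the dict shape of `unet.attn_processors`.
    """
    if any("IPAdapter" in type(p).__name__ for p in attn_processors.values()):
        warnings.warn(
            "This pipeline has IP-Adapter attention processors loaded; "
            "replacing them (e.g. via enable_xformers_memory_efficient_attention) "
            "will break IP-Adapter inference."
        )
    # Replace every processor, as the existing set-processor paths do.
    return {name: new_processor_cls() for name in attn_processors}
```

With this in place, the snippet above would still run, but the user would at least see a warning explaining why IP-Adapter inference subsequently fails.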