XiaojieGu/UltraEdit

If our project helps you, please give us a star ⭐ on GitHub to support us. 😉😉


🔥 News

📦 Data & Model Preparation

1️⃣ Download the files from Google Drive and place them under UltraEdit/data/raw.

2️⃣ Download the UltraEditBench and save it under UltraEdit/data/raw/ultraeditbench.

3️⃣ Specify the path to model weights by setting the name_or_path field in UltraEdit/config/model/model.yaml.

If you need to use locate-then-edit methods, we provide precomputed covariance matrices on Hugging Face for several models: GPT-J 6B, Qwen2.5-7B-Instruct, Mistral-7B-v0.3, LLaMA-3-8B-Instruct, and LLaMA-2-7B-hf.
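Step 3 boils down to pointing the config at your local weights. A minimal sketch of what that field might look like (the exact schema of `model.yaml` may differ in your checkout; only the `name_or_path` field is referenced above, and the path shown is a placeholder):

```yaml
# UltraEdit/config/model/model.yaml (illustrative; other fields omitted)
name_or_path: /path/to/Mistral-7B-v0.3   # local checkpoint dir or Hugging Face model ID
```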

🚀 Setup

Create the environment and install dependencies:

```shell
conda create -n ultraedit python=3.10
conda activate ultraedit
pip install torch==2.3.0+cu121 --index-url https://download.pytorch.org/whl/cu121
pip install -r requirements.txt
```

💡 If you want to try editing a Mistral-7B model, even a 24GB consumer GPU is enough — model editing for everyone!

🧪 Run

Run the main experiment with:

```shell
sh run.sh
```

The run.sh script includes a sample command like:

```shell
# num_seq : number of editing turns
# dataset.n_edits : number of edits per turn
python main.py dataset=zsre model=mistral-7b editor=ultraedit \
    num_seq=200 \
    editor.cache_dir=cache \
    dataset.batch_size=10 \
    dataset.n_edits=100 \
    model.edit_modules="[model.layers.29.mlp.down_proj, model.layers.30.mlp.down_proj]"
```
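The flags compose as follows: the run performs `num_seq` editing turns, each applying `dataset.n_edits` edits processed in batches of `dataset.batch_size`. A small sketch of that arithmetic (variable names are ours for illustration, not identifiers from `main.py`):

```python
# Illustrative arithmetic for the sample command above (names are ours, not main.py's).
num_seq = 200      # number of editing turns
n_edits = 100      # edits applied per turn
batch_size = 10    # samples per batch within a turn

total_edits = num_seq * n_edits           # total samples edited over the run
batches_per_turn = n_edits // batch_size  # batches processed each turn

print(total_edits)       # 20000 -- the "20K samples" figure quoted for Mistral-7B
print(batches_per_turn)  # 10
```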

💡 Just try editing 20K samples on Mistral-7B in under 5 minutes — ultra-efficient!

🙏 Acknowledgements

Our work builds upon several excellent model editing frameworks. We sincerely thank the authors of RLEdit for their valuable contributions to the field.

🌟 Star History

Star History Chart

📫 Contact

For any inquiries or possible collaborations, feel free to reach out at peettherapynoys@gmail.com.

📑 Citation

If you find UltraEdit useful for your research and applications, please cite using this BibTeX:

```bibtex
@article{gu2025ultraedit,
  title={UltraEdit: Training-, Subject-, and Memory-Free Lifelong Editing in Large Language Models},
  author={Gu, Xiaojie and Chen, Guangxu and Li, Jungang and Gu, Jia-Chen and Hu, Xuming and Zhang, Kai},
  journal={arXiv preprint arXiv:2505.14679},
  year={2025}
}
```
