This repository contains the official implementation for the paper:
**Reparameterized LLM Training via Orthogonal Equivalence Transformation** ([paper](https://arxiv.org/abs/2506.08001)).
🚧 Work in Progress 🚧
This codebase is currently under development. The implementation will be made available soon.
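Until the official code is released, the snippet below is a minimal, illustrative PyTorch sketch of the idea named in the title: a weight matrix `W0` is frozen at initialization, and training instead updates two orthogonal matrices `R` and `Q` that form the effective weight `R @ W0 @ Q`. Since orthogonal transformations preserve singular values, the effective weight keeps the spectrum of `W0`. All class and variable names here are hypothetical placeholders and do not reflect the released POET API.

```python
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import orthogonal


class ReparameterizedLinear(nn.Module):
    """Illustrative linear layer with weight reparameterized as R @ W0 @ Q.

    W0 is frozen; only the orthogonal factors R and Q (and the bias) are
    trained, so the singular values of the effective weight match W0's.
    NOTE: a sketch of the general idea, not the official POET implementation.
    """

    def __init__(self, in_features: int, out_features: int, bias: bool = True):
        super().__init__()
        # Frozen base weight W0, initialized like a standard nn.Linear weight.
        w0 = torch.empty(out_features, in_features)
        nn.init.kaiming_uniform_(w0, a=5 ** 0.5)
        self.register_buffer("w0", w0)

        # Trainable square factors R (left) and Q (right), kept orthogonal
        # via PyTorch's built-in orthogonal parametrization.
        self.R = orthogonal(nn.Linear(out_features, out_features, bias=False))
        self.Q = orthogonal(nn.Linear(in_features, in_features, bias=False))

        self.bias = nn.Parameter(torch.zeros(out_features)) if bias else None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Effective weight R @ W0 @ Q; gradients flow only into R, Q, and bias.
        weight = self.R.weight @ self.w0 @ self.Q.weight
        return nn.functional.linear(x, weight, self.bias)


if __name__ == "__main__":
    layer = ReparameterizedLinear(16, 32)
    out = layer(torch.randn(4, 16))
    print(out.shape)  # torch.Size([4, 32])
```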
If you find this work useful in your research, please cite our paper:
@article{qiu2025poet,
  title={Reparameterized LLM Training via Orthogonal Equivalence Transformation},
  author={Qiu, Zeju and Buchholz, Simon and Xiao, Tim Z. and Dax, Maximilian and Sch\"olkopf, Bernhard and Liu, Weiyang},
  journal={arXiv preprint arXiv:2506.08001},
  year={2025}
}