feat: AdaLoRA and other improvements
- Fixed flash attention (VRAM savings: training now fits on 4xA100 instead of 8xA100)
- Implemented custom trainer with AdaLoRA
- Migrated to the new GCP
- General bug fixes and improvements
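
For reviewers unfamiliar with AdaLoRA: the core idea is an SVD-style low-rank update ΔW = PΛQ whose singular values are pruned by an importance score until a target rank budget is met. The sketch below is a minimal NumPy illustration of that pruning step only, with a stand-in importance score; it is not the custom trainer implemented in this change.

```python
import numpy as np

def adalora_prune(lam, importance, target_rank):
    """Keep the target_rank most important singular values; zero out the rest."""
    keep = np.argsort(importance)[-target_rank:]
    mask = np.zeros_like(lam)
    mask[keep] = 1.0
    return lam * mask

rng = np.random.default_rng(0)
r_init, target_r = 8, 4                      # start wide, prune down to budget
P = rng.normal(size=(64, r_init))            # left singular directions
lam = rng.normal(size=r_init)                # learnable singular values
Q = rng.normal(size=(r_init, 64))            # right singular directions

# Stand-in sensitivity score; the real method uses a smoothed |param * grad|.
importance = np.abs(lam * rng.normal(size=r_init))

lam_pruned = adalora_prune(lam, importance, target_r)
delta_W = P @ np.diag(lam_pruned) @ Q        # low-rank weight update
print(int(np.count_nonzero(lam_pruned)))
```
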
Related #512373
Edited by Ekaterina Nikonova