Steven Palma
f0d2b37beb
chore(dependencies): bump transformers v5 (#2964)
* chore(dependencies): upgrade transformers + huggingface-hub + peft + scipy
* chore(dependencies): bump pi0 family to transformers v5
* chore(dependencies): bump wall x to transformers v5
* chore(dependencies): bump gr00t to transformers v5
* chore(style): fix pre-commit
* fix(policy): xvla forced_bos_token missing
* test(rl): skip ci tests for resnet10
* Fix: full pi models support for transformers v5 (#2967)
* fix(pi): remove loss truncation
* fix(pi): remove state padding before tokenization
* fix(pi): fix image padding value
* fix from_pretrained
* add transformers v5 changes
* remove reference
* more fixes
* make it work
* add support for rest of pi family
* add pifast work
* more changes
* more changes
* more cleanup
* fix torch params
* dtype fix
* torch compile
* embed mismatch fix
* revert groot
* more nit fixes
* remove unused classes
* more fixes
* revert
* nit
* torch dtype warning fix
* put back dynamic renaming
* add tie embedding
---------
Co-authored-by: Yufei Sun <skieyfly@gmail.com>
* chore: fix XVLA in transformers v5 (#3006)
* test(policies): enable wall x CI testing
* style(test): pre-commit check
* style(test): pre-commit
* fix wall x for transformers v5 (#3008)
* tv5 fix
* various wall x fixes
* Delete tests/policies/pi0_pi05/print_pi05_output_logits.py
Signed-off-by: Jade Choghari <chogharijade@gmail.com>
* sync modeling_florence2.py with chore/bump_transformers_v5
* more
* more fixes
* more
* remove comment
* more
---------
Signed-off-by: Jade Choghari <chogharijade@gmail.com>
* chore(dependencies): adjust dependencies versioning after transformers v5 (#3034)
* chore(dependencies): adjust dependencies versioning after transformers v5
* fix(policies): remove deprecated input_embeds
* fix(policies): dict _tied_weights_keys
* chore(dependencies): common qwen-vl-utils
* chore(dependencies): bump transformers to 5.2
* Fix policy testing for tv5 (#3032)
* fix ci logger
* other fix
* fix mypy
* change logits to torch2.10
* skip wallx
* remove logging
---------
Co-authored-by: Steven Palma <imstevenpmwork@ieee.org>
* feat(ci): log into HF to unblock some CI tests (#3007)
* feat(ci): log into HF to unblock some CI tests
* chore(ci): change hf call + secret name
* fix(ci): temp fix for pi0 rtc test
* test(policies): require_cuda for unblocked tests
* test(policies): require_cuda wall_x
* fix(tests): require_cuda outermost for pi0
* fix(test): return instead of yield
---------
Signed-off-by: Steven Palma <imstevenpmwork@ieee.org>
* style(test): fix pre-commit
* chore(deps): upgrade transformers (#3050)
* chore(test): use lerobot model
* fix(policies): change default action tokenizer for wall x
* sample on cpu
* Revert "Merge branch 'chore/bump_transformers_v5' of https://github.com/huggingface/lerobot into chore/bump_transformers_v5"
This reverts commit d9b76755f7, reversing
changes made to 89359cb0b6.
* Reapply "Merge branch 'chore/bump_transformers_v5' of https://github.com/huggingface/lerobot into chore/bump_transformers_v5"
This reverts commit c9914db78b.
---------
Signed-off-by: Jade Choghari <chogharijade@gmail.com>
Signed-off-by: Steven Palma <imstevenpmwork@ieee.org>
Co-authored-by: Jade Choghari <chogharijade@gmail.com>
Co-authored-by: Yufei Sun <skieyfly@gmail.com>
Co-authored-by: Pepijn <pepijn@huggingface.co>
2026-03-05 09:25:26 +01:00