Steven Palma
85de893fa7
fix(ci): skip HF login (and tests) in forks and community PRs (#3097)
...
* fix(ci): skip HF login (and tests) in forks and community PRs
* chore(test): remove comment about test meant to be only run locally
* fix(tests): no hf login in decorator for xvla
* fix(test): no decorator in yield
2026-03-06 16:33:43 +01:00
Steven Palma
f0d2b37beb
chore(dependencies): bump transformers v5 (#2964)
...
* chore(dependencies): upgrade transformers + huggingface-hub + peft + scipy
* chore(dependencies): bump pi0 family to transformers v5
* chore(dependencies): bump wall x to transformers v5
* chore(dependencies): bump gr00t to transformers v5
* chore(style): fix pre-commit
* fix(policy): xvla forced_bos_token missing
* test(rl): skip ci tests for resnet10
* Fix: full pi models support for transformers v5 (#2967)
* fix(pi): remove loss truncation
* fix(pi): remove state padding before tokenization
* fix(pi): fix image padding value
* fix from_pretrained
* add transformer v5 changes
* remove reference
* more fixes
* make it work
* add support for rest of pi family
* add pifast work
* more changes
* more changes
* more cleanup
* fix torch params
* dtype fix
* torch compile
* embed mismatch fix
* revert groot
* more nit fixes
* remove unused classes
* more fixes
* revert
* nit
* torch dtype warning fix
* put back dynamic renaming
* add tie embedding
---------
Co-authored-by: Yufei Sun <skieyfly@gmail.com>
* chore: fix XVLA in transformers v5 (#3006)
* test(policies): enable wall x CI testing
* style(test): pre-commit check
* style(test): pre-commit
* fix wall x for transformers v5 (#3008)
* tv5 fix
* various wall x fixes
* Delete tests/policies/pi0_pi05/print_pi05_output_logits.py
Signed-off-by: Jade Choghari <chogharijade@gmail.com>
* sync modeling_florence2.py with chore/bump_transformers_v5
* more
* more fixes
* more
* remove comment
* more
---------
Signed-off-by: Jade Choghari <chogharijade@gmail.com>
* chore(dependencies): adjust dependencies versioning after transformers v5 (#3034)
* chore(dependencies): adjust dependencies versioning after transformers v5
* fix(policies): remove deprecated input_embeds
* fix(policies): dict _tied_weights_keys
* chore(dependencies): common qwen-vl-utils
* chore(dependencies): bump transformers to 5.2
* Fix policy testing for tv5 (#3032)
* fix ci logger
* other fix
* fix mypy
* change logits to torch2.10
* skip wallx
* remove logging
---------
Co-authored-by: Steven Palma <imstevenpmwork@ieee.org>
* feat(ci): log into HF to unblock some CI tests (#3007)
* feat(ci): log into HF to unblock some CI tests
* chore(ci): change hf call + secret name
* fix(ci): temp fix for pi0 rtc test
* test(policies): require_cuda for unblocked tests
* test(policies): require_cuda wall_x
* fix(tests): require_cuda outermost for pi0
* fix(test): return instead of yield
---------
Signed-off-by: Steven Palma <imstevenpmwork@ieee.org>
* style(test): fix pre-commit
* chore(deps): upgrade transformers (#3050)
* chore(test): use lerobot model
* fix(policies): change default action tokenizer for wall x
* sample on cpu
* Revert "Merge branch 'chore/bump_transformers_v5' of https://github.com/huggingface/lerobot into chore/bump_transformers_v5"
This reverts commit d9b76755f7, reversing changes made to 89359cb0b6.
* Reapply "Merge branch 'chore/bump_transformers_v5' of https://github.com/huggingface/lerobot into chore/bump_transformers_v5"
This reverts commit c9914db78b.
---------
Signed-off-by: Jade Choghari <chogharijade@gmail.com>
Signed-off-by: Steven Palma <imstevenpmwork@ieee.org>
Co-authored-by: Jade Choghari <chogharijade@gmail.com>
Co-authored-by: Yufei Sun <skieyfly@gmail.com>
Co-authored-by: Pepijn <pepijn@huggingface.co>
2026-03-05 09:25:26 +01:00
Pepijn
6106a8136c
Fix invalid syntax (#2752)
...
* fix invalid syntax
* also skip for torchdiffeq
* fix patch for gpu tests
2026-01-05 12:13:42 +01:00
Tong Wu
17c5a0774f
feat: support wallx model (#2593)
...
* support wallx
* fix bugs in flow
* incorporate wallx model into lerobot
* update the policy methods
* reduce to least config and params & pass lerobot basic test
* fixed dtype bugs
* add wallx dependencies
* update
* remove flash-attn requirement && fix bug in inference and fast mode
* fix bug for inference
* add some small modifications
* fix pre-commit errors
* remove lerobot[wallx]
* fix ci
* fix precommit issues
* fix: exclude wallx extra properly in CI workflows
* fix: add uv conflicts for wallx transformers version
* fix: peft test import
* pre-commit
* only export WallXConfig from wall_x package to avoid peft import in CI
* remove torch dep
* precommit
* add import
---------
Co-authored-by: vincentchen <chenlufang@x2robot.com>
Co-authored-by: Geoffrey19 <sympathischmann35@gmail.com>
Co-authored-by: Pepijn <138571049+pkooij@users.noreply.github.com>
Co-authored-by: Pepijn <pepijn@huggingface.co>
2025-12-22 10:12:39 +01:00