# Running Tests

## Quick commands

```sh
# Run everything
pytest tests/

# Stop on first failure
pytest tests/ -x

# Run a single file
pytest tests/data/test_data_loader.py -v

# Run a single test
pytest tests/data/test_data_loader.py::TestMicroBatchCollator::test_micro_batch_splitting -v
```
## By category

```sh
pytest tests/data/    # Data pipeline
pytest tests/models/  # MoE and LoRA model logic
pytest tests/ops/     # Low-level ops (GPU)
pytest tests/qlora/   # QLoRA (GPU)
pytest tests/server/  # Server infrastructure
pytest tests/e2e/     # End-to-end (GPU + torchrun)
```
## By marker

```sh
pytest tests/ -m cpu        # Only CPU tests
pytest tests/ -m gpu        # Only GPU tests
pytest tests/ -m "not slow" # Skip slow tests
pytest tests/ -m collator   # Only collator tests
pytest tests/ -m server     # Only server tests
```
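Markers like `gpu` are commonly paired with collection-time skip logic so a CPU-only machine can still run the full suite. A minimal `conftest.py` sketch of that pattern — the `pytest_collection_modifyitems` hook is standard pytest, but whether this repo gates its GPU tests exactly this way is an assumption:

```python
# conftest.py sketch: auto-skip gpu-marked tests when no CUDA device is
# available. Hypothetical; the repo's real conftest may gate differently.
import pytest


def cuda_available() -> bool:
    """True if torch is importable and reports at least one CUDA device."""
    try:
        import torch
        return torch.cuda.is_available()
    except ImportError:
        return False


def pytest_collection_modifyitems(config, items):
    """Standard pytest hook: attach a skip marker to gpu tests on CPU-only hosts."""
    if cuda_available():
        return
    skip_gpu = pytest.mark.skip(reason="no CUDA device available")
    for item in items:
        if "gpu" in item.keywords:
            item.add_marker(skip_gpu)
```

With this in place, `pytest tests/ -m gpu` on a CPU-only box reports the GPU tests as skipped rather than erroring out.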
## Distributed tests

Distributed tests work with plain pytest: tests that require multiple processes spawn torchrun internally as a subprocess (the same pattern the e2e tests use):
```sh
pytest tests/distributed/ -v
```
## End-to-end tests

E2E tests call torchrun internally per test, so they run with plain pytest; no wrapper is needed.
Use the helper script from the repo root:
```sh
# All e2e tests
./tests/e2e/run_e2e.sh

# One model suite
./tests/e2e/run_e2e.sh qwen3_8b

# One file
./tests/e2e/run_e2e.sh qwen3_8b/test_lora.py
```

Or invoke pytest directly:
```sh
pytest tests/e2e/ -m e2e -v
pytest tests/e2e/qwen3_8b/test_pp.py -v
```

Make sure the environment is set up per the Installation guide.