Added easy venv creation using uv. Also updated PyTorch version. #431
Conversation
- Added `tool.uv.extra-build-dependencies` to `pyproject.toml` for easier installation of flash attention with no build isolation.
- Updated PyTorch version to 2.10.*.
- Updated flash-attn version to 2.8.3.
- Updated README.md.
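As a rough sketch, a `pyproject.toml` stanza using uv's extra-build-dependencies feature could look like the following (the exact keys and values here are assumptions based on uv's documentation, not copied from this PR's diff):

```toml
# Hypothetical pyproject.toml excerpt: inject torch and ninja into
# flash-attn's build environment so it can compile without build isolation.
[tool.uv.extra-build-dependencies]
flash-attn = ["torch", "ninja"]
```

This lets `uv sync` build flash-attn in one step instead of requiring users to pre-install torch and pass `--no-build-isolation` manually.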
Pull request overview
This PR modernizes the installation process by adding native uv support with improved flash-attention build configuration, updates dependency versions, and reorganizes the README installation instructions. The changes aim to simplify the developer experience by leveraging uv's ability to handle complex build dependencies automatically.
Changes:
- Added uv-specific configuration (`tool.uv.extra-build-dependencies`) to automate flash-attention installation with proper build dependencies
- Updated dependency version constraints: PyTorch to <2.11.0, flash-attn to 2.8.3, transformers to <5.0.0, and Python to <3.14
- Restructured README installation instructions with uv as the recommended Option 1, followed by pip-based manual installation
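The updated version bounds listed above can be checked mechanically with the `packaging` library. A small sketch (the constraint strings are mirrored from this PR's description; treat them as illustrative, not authoritative):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Version bounds from the PR description.
constraints = {
    "torch": SpecifierSet("<2.11.0"),
    "transformers": SpecifierSet("<5.0.0"),
    "flash-attn": SpecifierSet("==2.8.3"),
}

print(Version("2.10.1") in constraints["torch"])      # accepted
print(Version("2.11.0") in constraints["torch"])      # rejected by the upper bound
print(Version("2.8.3") in constraints["flash-attn"])  # exact pin matches
```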
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| pyproject.toml | Added uv build configuration for flash-attn, updated Python/PyTorch/transformers version constraints, added ninja to main dependencies |
| README.md | Reorganized installation instructions to prioritize uv installation, updated dependency versions in manual installation steps, reformatted feature tables |
```toml
"matplotlib",
"wandb",
"einops>=0.7.0",
"flash-attn==2.8.3; platform_system != 'Darwin' and platform_machine != 'aarch64'",
```
The platform markers "platform_system != 'Darwin' and platform_machine != 'aarch64'" exclude flash-attn installation on macOS and ARM64 architectures. While this is likely intentional due to flash-attn's CUDA dependencies, consider adding a comment explaining this constraint to help users understand why flash-attn won't be installed on these platforms.
Suggested change:
```toml
"flash-attn==2.8.3; platform_system != 'Darwin' and platform_machine != 'aarch64'", # Exclude macOS (Darwin) and ARM64 because flash-attn requires NVIDIA CUDA and wheels are not available on these platforms
```
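The PEP 508 environment marker in the dependency line can be evaluated against arbitrary platforms with the `packaging` library, which is a quick way to confirm which machines skip flash-attn:

```python
from packaging.markers import Marker

# The marker from the flash-attn dependency line.
marker = Marker("platform_system != 'Darwin' and platform_machine != 'aarch64'")

# Evaluate against explicit environments instead of the current machine.
linux_x86 = {"platform_system": "Linux", "platform_machine": "x86_64"}
macos_arm = {"platform_system": "Darwin", "platform_machine": "arm64"}
linux_arm = {"platform_system": "Linux", "platform_machine": "aarch64"}

print(marker.evaluate(linux_x86))  # True: flash-attn is installed
print(marker.evaluate(macos_arm))  # False: skipped on macOS
print(marker.evaluate(linux_arm))  # False: skipped on Linux ARM64
```

Note that macOS Apple Silicon reports `platform_machine == 'arm64'`, so it is excluded by the Darwin clause rather than the aarch64 one.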
What does this PR do?
Breaking Changes
Checklist before submitting final PR
- Ran the test suite (`python tests/tests.py`)
- Updated the changelog (`CHANGELOG_DEV.md`)