
Added easy venv creation using uv. Also updated PyTorch version. #431

Merged
le1nux merged 2 commits into main from easy_flash_attention_install_with_uv on Feb 18, 2026

Conversation

@BlueCrescent (Member) commented Feb 14, 2026

What does this PR do?

  • Added tool.uv.extra-build-dependencies to pyproject.toml for easier installation of flash-attn with no build isolation (see the sketch after this list).
  • Updated PyTorch version to 2.10.*.
  • Updated flash-attn version to 2.8.3.
  • Limited the transformers version to <5.0.0 until compatibility has been checked.
  • Updated README.md
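
A minimal sketch of what such a configuration can look like, assuming uv's tool.uv.extra-build-dependencies table from recent uv releases; the build dependencies listed here (torch, ninja) are illustrative and may differ from the exact entries in this PR:

```toml
# pyproject.toml (sketch): let uv inject build-time dependencies for flash-attn.
# flash-attn's setup imports torch and uses ninja to compile, which otherwise forces
# users to install those packages first and pass --no-build-isolation by hand.
[tool.uv.extra-build-dependencies]
flash-attn = ["torch", "ninja"]
```

With a table like this, uv can resolve and build flash-attn as part of a normal environment sync instead of requiring a separate manual install step.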

Breaking Changes

  • None

Checklist before submitting final PR

  • My PR is minimal and addresses one issue in isolation
  • I have merged the latest version of the target branch into this feature branch
  • I have reviewed my own code w.r.t. correct implementation, missing type hints, proper documentation, etc.
  • I have run a sample config for model training
  • I have checked that all tests run through (python tests/tests.py)
  • I have updated the internal changelog (CHANGELOG_DEV.md)

Copilot AI (Contributor) left a comment


Pull request overview

This PR modernizes the installation process by adding native uv support with improved flash-attention build configuration, updates dependency versions, and reorganizes the README installation instructions. The changes aim to simplify the developer experience by leveraging uv's ability to handle complex build dependencies automatically.

Changes:

  • Added uv-specific configuration (tool.uv.extra-build-dependencies) to automate flash-attention installation with proper build dependencies
  • Updated dependency version constraints: PyTorch to <2.11.0, flash-attn to 2.8.3, transformers to <5.0.0, and Python to <3.14 (a sketch of the resulting pyproject.toml constraints follows after this list)
  • Restructured README installation instructions with uv as the recommended Option 1, followed by pip-based manual installation
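
As a rough illustration, these constraints could appear in the project metadata roughly as follows; the lower bounds and the exact dependency selection are assumptions, not the file's actual contents:

```toml
# pyproject.toml (sketch): version constraints described in this PR.
[project]
requires-python = ">=3.10,<3.14"   # upper bound from the PR; lower bound assumed
dependencies = [
    "torch>=2.10.0,<2.11.0",       # "PyTorch 2.10.*" per the PR description
    "flash-attn==2.8.3; platform_system != 'Darwin' and platform_machine != 'aarch64'",
    "transformers<5.0.0",          # capped until compatibility with v5 is checked
    "ninja",                       # added to the main dependencies
]
```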

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 5 comments.

| File | Description |
| --- | --- |
| pyproject.toml | Added uv build configuration for flash-attn, updated Python/PyTorch/transformers version constraints, added ninja to main dependencies |
| README.md | Reorganized installation instructions to prioritize uv installation, updated dependency versions in manual installation steps, reformatted feature tables |


"matplotlib",
"wandb",
"einops>=0.7.0",
"flash-attn==2.8.3; platform_system != 'Darwin' and platform_machine != 'aarch64'",
Copilot AI commented Feb 14, 2026


The platform markers "platform_system != 'Darwin' and platform_machine != 'aarch64'" exclude flash-attn installation on macOS and ARM64 architectures. While this is likely intentional due to flash-attn's CUDA dependencies, consider adding a comment explaining this constraint to help users understand why flash-attn won't be installed on these platforms.

Suggested change:

-    "flash-attn==2.8.3; platform_system != 'Darwin' and platform_machine != 'aarch64'",
+    "flash-attn==2.8.3; platform_system != 'Darwin' and platform_machine != 'aarch64'",  # Exclude macOS (Darwin) and ARM64 because flash-attn requires NVIDIA CUDA and wheels are not available on these platforms

@le1nux (Member) left a comment


LGTM :)

@le1nux merged commit 0596085 into main on Feb 18, 2026
3 checks passed
@le1nux deleted the easy_flash_attention_install_with_uv branch on February 18, 2026 at 16:13