Installation Guide
This guide will help you install and set up InfinityStar on your system. Follow these steps carefully to ensure proper installation and configuration.
System Requirements
Before installing InfinityStar, ensure your system meets the following requirements (a quick pre-check script is sketched after the list):
- Python 3.8 or higher
- PyTorch 2.5.1 or higher (required for FlexAttention support)
- CUDA-capable GPU with sufficient memory (recommended for inference)
- At least 40GB of free disk space for model checkpoints
- Sufficient RAM for model loading and inference
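A quick way to pre-check two of these items is the standard-library sketch below; it verifies the Python version and the free disk space in the current directory (the 40 GB threshold simply mirrors the requirement above).

```python
import shutil
import sys

# InfinityStar requires Python 3.8 or higher.
if sys.version_info < (3, 8):
    raise SystemExit(f"Python 3.8+ required, found {sys.version.split()[0]}")

# Model checkpoints need roughly 40 GB of free disk space.
free_gb = shutil.disk_usage(".").free / 1024**3
print(f"Python {sys.version.split()[0]} OK, {free_gb:.1f} GB free on this volume")
if free_gb < 40:
    print("Warning: less than 40 GB free; the model checkpoints may not fit.")
```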
Step 1: Install PyTorch
InfinityStar uses FlexAttention to speed up training, which requires PyTorch version 2.5.1 or higher. Install PyTorch with CUDA support if you have a compatible GPU:
```bash
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```

For CPU-only installation (not recommended for inference):

```bash
pip install torch torchvision torchaudio
```

Verify your PyTorch installation by checking the version:

```bash
python -c "import torch; print(torch.__version__)"
```

Ensure the reported version is 2.5.1 or higher.
Step 2: Install Required Packages
Clone the InfinityStar repository and navigate to the project directory:
```bash
git clone https://github.com/FoundationVision/InfinityStar.git
cd InfinityStar
```

Install all required Python packages using pip:

```bash
pip3 install -r requirements.txt
```

This installs all necessary dependencies, including transformers, datasets, and other required libraries.
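If you want to confirm that everything listed in requirements.txt actually made it into your environment, a rough check like the one below can help. It is a sketch only: it strips version specifiers, skips comments and pip options, and reports any distribution that importlib.metadata cannot find.

```python
import re
from importlib.metadata import PackageNotFoundError, version

missing = []
with open("requirements.txt") as f:
    for line in f:
        line = line.strip()
        # Skip blank lines, comments, and pip options such as "-r" or "--extra-index-url".
        if not line or line.startswith(("#", "-")):
            continue
        # Keep only the distribution name, dropping version specifiers and extras.
        name = re.split(r"[<>=!~;\[ ]", line, maxsplit=1)[0]
        try:
            print(f"{name}=={version(name)}")
        except PackageNotFoundError:
            missing.append(name)

if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All listed packages appear to be installed.")
```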
Step 3: Download Model Checkpoints
Download the appropriate model checkpoints based on your needs:
- 720p Model: For high-quality 5-second video generation at 720p resolution. This model is optimized for text-to-video tasks.
- 480p Model: For variable-length video generation (5 or 10 seconds) at 480p resolution. This model works well for image-to-video and video-to-video tasks.
Model checkpoints are available from the official repository. The full model size is approximately 35 gigabytes, so ensure you have sufficient storage space and bandwidth.
Place the downloaded checkpoints in the appropriate directory as specified in the repository structure.
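If the checkpoints are published on the Hugging Face Hub, they can also be fetched programmatically with huggingface_hub. The repository id and target directory below are placeholders, not confirmed names; substitute the checkpoint ids and directory layout given in the official repository.

```python
from huggingface_hub import snapshot_download

# Placeholder repo_id and local_dir -- replace both with the values documented
# in the official InfinityStar repository.
local_path = snapshot_download(
    repo_id="FoundationVision/InfinityStar",    # assumption, not a confirmed model id
    local_dir="checkpoints/infinitystar_720p",  # assumption, match the expected layout
)
print("Checkpoints downloaded to:", local_path)
```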
Step 4: Verify Installation
After installation, verify that all components are working correctly:
python -c "import torch; print('PyTorch version:', torch.__version__)"
python -c "import transformers; print('Transformers installed')"If all commands execute without errors, your installation is complete.
Step 5: Prepare for Inference
Before running inference, ensure you have (a pre-flight check is sketched after this list):
- Downloaded the appropriate model checkpoint
- Placed the checkpoint in the correct directory
- Prepared your input (text description, image, or video)
- Ensured sufficient GPU memory is available
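As a concrete pre-flight check, the sketch below verifies that a checkpoint directory exists and reports how much GPU memory is currently free; the checkpoint path is a placeholder to adjust to your own layout.

```python
from pathlib import Path

import torch

# Placeholder path -- point this at wherever you placed the downloaded checkpoint.
ckpt = Path("checkpoints/infinitystar_720p")
if not ckpt.exists():
    raise SystemExit(f"Checkpoint not found at {ckpt}; download it first (Step 3).")

if torch.cuda.is_available():
    free, total = torch.cuda.mem_get_info()  # returns bytes
    print(f"GPU: {torch.cuda.get_device_name(0)}, "
          f"{free / 1024**3:.1f} of {total / 1024**3:.1f} GB free")
else:
    print("No CUDA device detected; GPU inference is not available.")
```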
Running Inference
For 720p video generation, use the following command:
```bash
python3 tools/infer_video_720p.py
```

For 480p variable-length video generation, use:

```bash
python3 tools/infer_video_480p.py
```

You can specify parameters such as text prompts, image paths, or video paths in the script or through command-line arguments, depending on the implementation.
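If you need to pin a run to a particular GPU, one option is to set CUDA_VISIBLE_DEVICES before launching the script. The wrapper below is only a convenience sketch and passes no extra arguments, since the exact command-line options accepted by the inference scripts depend on the implementation.

```python
import os
import subprocess

# Expose only GPU 0 to the run; adjust the index for your machine.
env = dict(os.environ, CUDA_VISIBLE_DEVICES="0")

# Launch the 720p inference script. Add prompt/image/video arguments here only
# as actually supported by the script itself.
subprocess.run(["python3", "tools/infer_video_720p.py"], env=env, check=True)
```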
Training Setup
If you plan to train or fine-tune the model, additional setup is required:
- Organize your training data according to the structure specified in data/README.md
- Extract features from your data using the provided feature extraction scripts
- Configure training parameters in the training scripts
- Ensure you have sufficient computational resources for training
For detailed training instructions, refer to the data/README.md file in the repository.
Troubleshooting
If you encounter issues during installation (an environment summary script follows this list):
- Ensure PyTorch version is 2.5.1 or higher
- Check that all dependencies are installed correctly
- Verify CUDA compatibility if using GPU
- Check available disk space for model checkpoints
- Review error messages for specific dependency issues
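When reporting a problem (or narrowing one down yourself), it helps to collect a short environment summary; a minimal sketch:

```python
import platform

import torch

# Gather the details most installation issues hinge on.
print("OS:", platform.platform())
print("Python:", platform.python_version())
print("PyTorch:", torch.__version__)
print("CUDA build:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```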
Additional Resources
For more information, refer to:
- Official repository documentation
- Training scripts and data organization guides
- Inference script documentation
- Research paper for technical details
Note: This installation guide provides general instructions. For the most up-to-date and detailed information, please refer to the official InfinityStar repository and documentation.