# vllm-project/vllm-omni

A framework for efficient model inference with omni-modality models.

Language: Python · Stars: 1550 · Forks: 200