Category
LLMs
High-throughput LLM serving and inference optimization.
LLMs
curated
repo
unknown
Local AI (On-Prem / Edge)
n/a
publishable
unknown
no
n/a
open_source_ai_hub_catalog
learning_paths, open_source_ai_hub_catalog
ba10cc4c361a3e4b