
Popular repositories

  1. Smaug-72B (Public)

    Smaug-72B topped the Hugging Face LLM leaderboard and is the first model with an average score of 80, making it the world’s best open-source foundation model.

    Python · 16 stars · 5 forks

  2. RMBG-1.4 (Public template)

    RMBG v1.4 is our state-of-the-art background removal model, designed to effectively separate foreground from background in a range of categories and image types.

    Python · 10 stars · 5 forks

  3. triton-co-pilot (Public)

    Generate glue code in seconds to simplify your NVIDIA Triton Inference Server deployments.

    Python · 9 stars · 2 forks

  4. whisper-large-v3 (Public template)

    Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours of labelled data, Whisper models demonstrate a strong ability to generalise to ma… A minimal transcription sketch follows this list.

    Python · 7 stars · 8 forks

  5. stable-video-diffusion (Public template)

    (SVD) Image-to-Video is a latent diffusion model trained to generate short video clips from a conditioning image. This model was trained to generate 25 frames at resolution 576x1024 given a contex… A minimal generation sketch follows this list.

    Python · 6 stars · 4 forks

  6. TensorRT-LLM (Public)

    6 stars
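
As noted in the whisper-large-v3 entry above, below is a minimal transcription sketch using the Hugging Face transformers pipeline. The openai/whisper-large-v3 checkpoint id and the sample.wav path are assumptions for illustration; the Inferless template presumably wraps comparable inference in its own handler code.

    # Minimal sketch (assumed checkpoint and audio path), not the template's exact code.
    from transformers import pipeline

    asr = pipeline(
        "automatic-speech-recognition",
        model="openai/whisper-large-v3",  # assumed upstream checkpoint
    )

    result = asr("sample.wav")  # hypothetical local audio file
    print(result["text"])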
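
Likewise, for the stable-video-diffusion entry, below is a minimal image-to-video sketch with the diffusers StableVideoDiffusionPipeline. The stabilityai/stable-video-diffusion-img2vid-xt checkpoint id and the input image path are assumptions; the Inferless template presumably serves similar inference through its own endpoint.

    # Minimal sketch (assumed checkpoint and image path), not the template's exact code.
    import torch
    from diffusers import StableVideoDiffusionPipeline
    from diffusers.utils import load_image, export_to_video

    pipe = StableVideoDiffusionPipeline.from_pretrained(
        "stabilityai/stable-video-diffusion-img2vid-xt",  # assumed upstream checkpoint
        torch_dtype=torch.float16,
    ).to("cuda")

    image = load_image("conditioning_frame.png").resize((1024, 576))  # hypothetical input frame
    frames = pipe(image, num_frames=25).frames[0]  # 25 frames, as described above
    export_to_video(frames, "generated.mp4", fps=7)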

Repositories

Showing 10 of 124 repositories
  • inferless/bge-base-en-v1.5
    Python · 0 stars · 2 forks · 0 issues · 0 pull requests · Updated Aug 5, 2024
  • inferless/Llama-3.1-8B-Instruct-GGUF
    Python · 0 stars · 3 forks · 0 issues · 0 pull requests · Updated Aug 5, 2024
  • inferless/CodeLlama-70B
    Python · 0 stars · 1 fork · 0 issues · 0 pull requests · Updated Jul 26, 2024
  • inferless/Phi-2 (Public template)

    Phi-2 is a Transformer with 2.7 billion parameters. It was trained using the same data sources as Phi-1.5, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (for safety and educational value). A minimal usage sketch follows this list.

    Python · 0 stars · 4 forks · 0 issues · 0 pull requests · Updated Jul 26, 2024
  • inferless/TenyxChat-8x7B-v1
    Python · 0 stars · 0 forks · 0 issues · 0 pull requests · Updated Jul 26, 2024
  • inferless/Mixral-8x7B (Public, forked from rbgo404/Mixral-8x7B)

    Mixtral is a large language model developed by Mistral AI, a French artificial intelligence company. It is a sparse Mixture of Experts (MoE) model with 8 experts per MLP, totaling 45 billion parameters. Mixtral is designed to handle contexts of up to 32,000 tokens.

    Python · 0 stars · 1 fork · 0 issues · 0 pull requests · Updated Jul 26, 2024
  • inferless/DeciLM-7B
    Python · 0 stars · 1 fork · 0 issues · 0 pull requests · Updated Jul 26, 2024
  • inferless/inferless_tutorials
    Python · 1 star · 3 forks · 0 issues · 0 pull requests · Updated Jul 26, 2024
  • inferless/Parler-tts-streaming
    Python · 0 stars · 0 forks · 0 issues · 0 pull requests · Updated Jul 24, 2024
  • inferless/Llama-3.1-8B-Instruct
    Python · 0 stars · 0 forks · 0 issues · 0 pull requests · Updated Jul 24, 2024
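
The usage sketch referenced in the Phi-2 entry above: a minimal text-generation example with transformers. The microsoft/phi-2 checkpoint id and the prompt are assumptions for illustration; the Inferless template presumably serves comparable generation behind its own API.

    # Minimal sketch (assumed checkpoint and prompt), not the template's exact code.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/phi-2"  # assumed upstream checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )

    prompt = "Explain what a sparse Mixture of Experts model is."  # hypothetical prompt
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))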

People

This organization has no public members.
