LSP-AI for Ace-linters

Azat Alimov edited this page Jun 18, 2024 · 2 revisions

Integrating lsp-ai Language Server with Ace Linters

Introduction

The LSP-AI language server provides AI-assisted features such as code completion, enhancing the Ace Editor environment for a wide range of programming languages.

Installation

  1. Prerequisites:

    • Ensure you have Rust installed on your system. You can install Rust using rustup. This will include Cargo, the Rust package manager.
    • ace-linters ^1.2.3
  2. Install LSP-AI Language Server: Use Cargo to install the lsp-ai language server. Depending on your system requirements, you can choose to install it with different features:

    • Basic Installation:

      cargo install lsp-ai
    • With llama_cpp Support: Needed if you want to run local models via llama.cpp for code completion.

      cargo install lsp-ai --features llama_cpp
    • With metal and llama_cpp for macOS: Recommended for users on macOS with Metal support.

      cargo install lsp-ai --features "llama_cpp metal"
    • With cuda and llama_cpp for Nvidia GPUs: Recommended for users with Nvidia GPUs on Linux.

      cargo install lsp-ai --features "llama_cpp cuda"

    For detailed installation options and troubleshooting, visit the installation guide.

  3. Running the Language Server: Ensure the LSP-AI language server is running and reachable on the WebSocket port you configure below. To expose lsp-ai (and, if needed, multiple language servers) over WebSocket, you can use a tool like Language Server WebSocket Bridge.
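The bridge's core job is re-framing: LSP over stdio wraps each JSON-RPC message in a Content-Length header, while WebSocket transports usually carry the bare JSON body. The sketch below illustrates that framing logic; the helper names are illustrative and not part of lsp-ai or ace-linters, and a real bridge would also handle partial reads and non-ASCII payloads.

```javascript
// Sketch of LSP stdio framing, which a WebSocket bridge must apply and strip.
// Helper names are illustrative; real bridges also buffer partial chunks.

// Wrap a JSON-RPC message for the language server's stdin.
function frameLspMessage(message) {
  const body = JSON.stringify(message);
  return `Content-Length: ${Buffer.byteLength(body, "utf8")}\r\n\r\n${body}`;
}

// Extract the JSON-RPC message from a framed chunk read from stdout.
// Assumes an ASCII body, so byte length equals character length.
function parseLspFrame(frame) {
  const match = /Content-Length: (\d+)\r\n/.exec(frame);
  const headerEnd = frame.indexOf("\r\n\r\n") + 4;
  const body = frame.slice(headerEnd, headerEnd + Number(match[1]));
  return JSON.parse(body);
}
```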

Configuration

To integrate the lsp-ai language server with Ace Linters, configure your Ace Editor environment to connect to the lsp-ai language server via WebSocket. Here’s an example configuration:

// defaultGenerationConfiguration - see https://github.com/SilasMarvin/lsp-ai/wiki/Configuration
// You will need OPENAI_API_KEY set in your environment, or use "auth_token"
// instead of "auth_token_env_var_name"
const defaultServerConfiguration = {
        "memory": {
            "file_store": {}
        },
        "models": {
            "model1": {
                "type": "open_ai",
                "chat_endpoint": "https://api.openai.com/v1/chat/completions",
                "model": "gpt-3.5-turbo-0125",
                "auth_token_env_var_name": "OPENAI_API_KEY"
            }
        },
        "completion": {
            ...defaultGenerationConfiguration
        }
    }
const serverData = {
    module: () => import("ace-linters/build/language-client"),
    modes: "javascript",
    type: "socket",
    socket: new WebSocket("ws://localhost:3030/lsp-ai"),
    initializationOptions: defaultServerConfiguration
}
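With serverData defined as above, it can be handed to ace-linters. A minimal wiring sketch, assuming the AceLanguageClient entry point exported by ace-linters and an existing Ace editor instance (check the names against your ace-linters version):

```javascript
// Sketch: wire the lsp-ai server description into an Ace editor.
// Assumes `editor` is an existing ace.edit(...) instance.
import { AceLanguageClient } from "ace-linters/build/ace-language-client";

const languageProvider = AceLanguageClient.for(serverData);
languageProvider.registerEditor(editor);
```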

Refer to the lsp-ai configuration documentation for details on defaultGenerationConfiguration and defaultServerConfiguration, or see the Full example.
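For orientation, defaultGenerationConfiguration for an OpenAI-style model typically pairs a model name with generation parameters. The shape below is an assumption based on the lsp-ai wiki's examples; verify the exact keys against that documentation before relying on them.

```javascript
// Illustrative only - confirm key names against the lsp-ai configuration wiki.
const defaultGenerationConfiguration = {
    "model": "model1",          // should match a key under "models"
    "parameters": {
        "max_tokens": 128,      // forwarded to the chat completions API
        "messages": [
            {
                "role": "system",
                "content": "You are a code completion assistant."
            }
        ]
    }
};
```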

Usage Examples

Full example

Troubleshooting

  • Connection Issues: Ensure the WebSocket server for the lsp-ai language server is set up correctly and running. Verify the port and endpoint configurations.

  • Feature Limitations: Some advanced features may experience delays or reduced performance due to network latency or server configuration.

  • Error Logs: Check the language server’s logs for any error messages or warnings that might indicate misconfiguration or compatibility issues.

Additional Resources

This guide provides the necessary steps and configuration details for integrating the lsp-ai language server with Ace Linters. Adjust the configuration parameters to match your specific environment and requirements.