
🐛 BUG: LSP completions return incorrect ranges in some cases #664

Closed
Trildar opened this issue Oct 11, 2023 · 5 comments
Labels
needs response Need a response from the original author

Comments

@Trildar

Trildar commented Oct 11, 2023

Describe the Bug

Under some circumstances, responses to completionItem/resolve calls seem to have the wrong range.

---
const test = "TEST";
---

{(<div id={test} />)}

For example, in a file like the above, replacing the test in id and inserting a completion for test results in the following completionItem/resolve request and response:

Request

{
    commitCharacters = {".", ",", ";", "("},
    data = {
        original = {
            data = {
                fileName = "d:/User/Documents/project/src/pages/test.astro.tsx",
                offset = 56,
                originalItem = {name = "test"},
                uri = "file:///d%3A/User/Documents/project/src/pages/test.astro.tsx"
            },
            textEdit = {
                insert = {["end"] = {character = 22, line = 4}, start = {character = 21, line = 4}},
                newText = "test",
                replace = {["end"] = {character = 22, line = 4}, start = {character = 21, line = 4}}
            }
        },
        serviceId = "typescript",
        uri = "file:///D:/User/Documents/project/src/pages/test.astro",
        virtualDocumentUri = "file:///d%3A/User/Documents/project/src/pages/test.astro.tsx"
    },
    insertTextFormat = 1,
    kind = 6,
    label = "test",
    sortText = "11",
    textEdit = {
        insert = {["end"] = {character = 12, line = 4}, start = {character = 11, line = 4}},
        newText = "test",
        replace = {["end"] = {character = 12, line = 4}, start = {character = 11, line = 4}}
    }
}

Response

{
    commitCharacters = {".", ",", ";", "("},
    data = {
        fileName = "d:/User/Documents/project/src/pages/test.astro.tsx",
        offset = 56,
        originalItem = {name = "test"},
        uri = "file:///d%3A/User/Documents/project/src/pages/test.astro.tsx"
    },
    detail = 'const test: "TEST"',
    documentation = {kind = "markdown", value = ""},
    insertTextFormat = 1,
    kind = 6,
    label = "test",
    sortText = "11",
    textEdit = {
        insert = {["end"] = {character = 22, line = 4}, start = {character = 21, line = 4}},
        newText = "test",
        replace = {["end"] = {character = 22, line = 4}, start = {character = 21, line = 4}}
    }
}

The response gives a range starting at character 21 instead of character 11 as it should be.
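As a sanity check, the correct start character can be confirmed by locating the test identifier on the template line (positions are 0-indexed, per the LSP spec). The helper below is purely illustrative and not part of the language server:

```typescript
// Illustrative only: find an identifier's column span on a line to see
// which start character a correct textEdit should report.
function identifierSpan(line: string, name: string): { start: number; end: number } | null {
  const start = line.indexOf(name);
  return start === -1 ? null : { start, end: start + name.length };
}

// Line 4 (0-indexed) of the example file.
const templateLine = "{(<div id={test} />)}";
const span = identifierSpan(templateLine, "test");
// "test" occupies characters 11..15, so an edit inside it should start at
// character 11. The line is only 21 characters long (indices 0..20), so a
// range starting at character 21 falls past the end of the line entirely.
console.log(span);
```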

Additional Notes

Testing was done in Neovim; refer to the linked nvim-cmp issue below for a reproduction config. I have not tested this myself, but apparently VS Code, and Neovim's omnifunc completions instead of nvim-cmp, do not exhibit the same insertion issue, perhaps because they do not use the range from completionItem/resolve for the insert (the ranges returned for textDocument/completion are correct), or because of some other difference in how they interact with the language server.

EDIT: The LSP messages do appear to be similar in VS Code, so it is likely that VS Code simply does not use the ranges from completionItem/resolve when inserting the completion.
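If a client-side workaround is needed, one option (and plausibly what VS Code effectively does) is to ignore the range coming back from completionItem/resolve and keep the textEdit from the original textDocument/completion item, taking only the lazily resolved fields. A minimal sketch with simplified types; this is not actual vscode-languageclient or nvim-cmp code:

```typescript
// Simplified LSP-like types for illustration; real clients use the
// vscode-languageserver-protocol definitions.
interface Position { line: number; character: number }
interface Range { start: Position; end: Position }
interface InsertReplaceEdit { newText: string; insert: Range; replace: Range }
interface CompletionItem {
  label: string;
  textEdit?: InsertReplaceEdit;
  detail?: string;
  documentation?: { kind: string; value: string };
}

// Merge a completionItem/resolve response into the original item, but keep
// the original (correct) edit range from textDocument/completion.
function mergeResolved(original: CompletionItem, resolved: CompletionItem): CompletionItem {
  return { ...resolved, textEdit: original.textEdit ?? resolved.textEdit };
}
```

With the items from this issue, the merged result would keep the original range starting at character 11 while still gaining the resolved detail ('const test: "TEST"').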

Some additional cases that produce incorrect ranges can also be found in the nvim-cmp issue below.

Some prior discussion for reference:
hrsh7th/nvim-cmp#1705
LazyVim/LazyVim#1455

Steps to Reproduce

  1. npm init astro using the sample files template
  2. Replace src/pages/index.astro with the below

---
const test = "TEST";
---

{(<div id={test} />)}

  3. Open index.astro in Neovim with the config from hrsh7th/nvim-cmp#1705 (Astro LSP completions sometimes write over incorrect ranges), or a similar config using nvim-cmp and the Astro language server
  4. Replace the test in id and try to insert a completion for test
  5. The text ends up something like the below instead of a correctly inserted completion

---
const test = "TEST";
---

{(<div id={ttest
@Princesseuh
Member

Princesseuh commented Oct 11, 2023

Hmm, we don't touch completion ranges (apart from frontmatter edits) in our code, so if there is an issue, it's highly likely to be upstream. (The only chance it's on our side is if it's related to mappings, but I doubt it is.)

I'd say that I typically consider VS Code's implementation to be the "reference implementation" for LSP, and if another editor exhibits wrong behavior, it's generally that editor's problem. Nonetheless, if there is something we can fix, I'd definitely love to do it.

@yaegassy
Contributor

At this time, the only way to address this issue on the client side may be to disable the completionItem/resolve feature when editing Astro files.

However, disabling completionItem/resolve may be inconvenient, as I think it will stop features like auto-import from working...

@Princesseuh added the "- P4: important" label (violates documented behavior or significantly improves performance) and removed the "bug" label on Nov 12, 2023
@hrsh7th

hrsh7th commented Mar 24, 2024

IMO, this issue is already solved in the latest @astrojs/language-server.

@Trildar Could you test with the latest @astrojs/language-server?

@Princesseuh added the "needs response" label (need a response from the original author) and removed the "ecosystem: upstream" and "- P4: important" labels on Mar 26, 2024
@Trildar
Author

Trildar commented Apr 3, 2024

This does appear to be fixed now. Checked on 2.8.4.

@Princesseuh
Member

Awesome!
