
Support gRPC over websocket to allow direct access from browsers #450

Open

arya2 opened this issue Aug 18, 2023 · 8 comments

Comments

@arya2

arya2 commented Aug 18, 2023

Motivation

Using lightwalletd from the browser currently requires a proxy server to provide an http/1.1 transport or gRPC over websocket, since browsers cannot speak native gRPC over HTTP/2 directly.

Integrating gRPC over websocket support would give web developers the same direct access to lightwalletd that native developers currently enjoy.
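A minimal sketch of one way the server side could look, assuming the improbable-eng/grpc-web Go wrapper (which lightwalletd does not use today); the listen port and the permissive origin check are placeholders for illustration only:

```go
package main

import (
	"log"
	"net/http"

	"github.com/improbable-eng/grpc-web/go/grpcweb"
	"google.golang.org/grpc"
)

func main() {
	// The existing lightwalletd gRPC server; registering the actual
	// CompactTxStreamer service implementation is omitted here.
	grpcServer := grpc.NewServer()

	// Wrap the gRPC server so browsers can reach it via grpc-web and,
	// with the websocket option, via gRPC over websocket, which also
	// carries the server-streaming RPCs.
	wrapped := grpcweb.WrapServer(grpcServer,
		grpcweb.WithWebsockets(true),
		// Accepting every origin is for illustration only; a real
		// deployment should restrict this.
		grpcweb.WithWebsocketOriginFunc(func(req *http.Request) bool { return true }),
	)

	handler := http.HandlerFunc(func(w http.ResponseWriter, req *http.Request) {
		if wrapped.IsGrpcWebSocketRequest(req) || wrapped.IsGrpcWebRequest(req) {
			wrapped.ServeHTTP(w, req)
			return
		}
		http.NotFound(w, req)
	})

	// Placeholder port; lightwalletd would presumably reuse or extend
	// its existing listener configuration.
	log.Fatal(http.ListenAndServe(":9068", handler))
}
```

With websockets enabled, the wrapper tunnels gRPC frames over a single websocket connection, so browser clients should be able to call the service, including streaming RPCs, without running a separate proxy.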

@LarryRuane
Collaborator

I asked chatGPT about this -- did I ask the question correctly? https://chat.openai.com/share/02a1ba5b-217f-42a4-986d-d191f70866c3

It seems to suggest supporting websocket but maybe not http/1.1. Is that acceptable? (I really don't understand this area very well.)

@LarryRuane
Collaborator

Can you provide some motivation for this? I would have to justify spending time on this (if possible), thanks.

@arya2
Author

arya2 commented Aug 19, 2023

It seems to suggest supporting websocket but maybe not http/1.1. Is that acceptable? (I really don't understand this area very well.)

That would be ideal. I'm not sure whether http/1.1 would support the streaming RPCs, and websockets offer more flexibility and lower latency.

@arya2 changed the title from "Support http/1.1 or gRPC over websocket to allow direct access from browsers" to "Support gRPC over websocket to allow direct access from browsers" on Aug 19, 2023
@borngraced

Hey @LarryRuane how is this going?

@LarryRuane
Collaborator

I'm just getting around to this finally, sorry for the delay, I'll try to get this done within the next week or so.

@arya2
Author

arya2 commented Nov 16, 2023

It seems to suggest supporting websocket but maybe not http/1.1. Is that acceptable? (I really don't understand this area very well.)

That would be ideal, I'm not sure if http/1.1 would support the streams, and websockets offer more flexibility / lower latency.

@LarryRuane There may also be interest in calling these RPCs from a serverless environment where websockets don't work well, so adding regular http/1.1 REST endpoints would be valuable too.
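One hedged option for the REST side is grpc-ecosystem's grpc-gateway, which generates an http/1.1 + JSON reverse proxy from the proto definitions. The sketch below assumes lightwalletd's service.proto would gain google.api.http annotations; the generated registration function name and the lightwalletd address are assumptions, not existing code:

```go
package main

import (
	"context"
	"log"
	"net/http"

	"github.com/grpc-ecosystem/grpc-gateway/v2/runtime"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	ctx := context.Background()

	// The gateway mux translates REST+JSON requests into gRPC calls
	// against a running lightwalletd instance.
	mux := runtime.NewServeMux()
	opts := []grpc.DialOption{grpc.WithTransportCredentials(insecure.NewCredentials())}

	// Hypothetical: this registration function would be emitted by
	// protoc-gen-grpc-gateway once the proto carries google.api.http
	// annotations; the name follows the generator's convention.
	//
	//   err := walletrpc.RegisterCompactTxStreamerHandlerFromEndpoint(
	//       ctx, mux, "127.0.0.1:9067", opts)
	//   if err != nil {
	//       log.Fatal(err)
	//   }
	//
	// ctx and opts are only consumed by the commented-out registration.
	_ = ctx
	_ = opts

	log.Fatal(http.ListenAndServe(":8080", mux))
}
```

Unary RPCs map naturally onto REST; grpc-gateway exposes server-streaming RPCs as newline-delimited JSON responses, which may or may not suit serverless callers, so the streaming endpoints might be better left to the websocket path.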

@teor2345
Contributor

I asked chatGPT about this -- did I ask the question correctly? chat.openai.com/share/02a1ba5b-217f-42a4-986d-d191f70866c3

It seems to suggest supporting websocket but maybe not http/1.1. Is that acceptable? (I really don't understand this area very well.)

ChatGPT prioritises giving confident answers over correct answers, so I'd encourage you to check those answers with StackOverflow or the Go library reference, and test that all the RPCs work.

Here's some background on ChatGPT's lack of reliability - a study found that ChatGPT got code questions wrong 52% of the time:
https://www.theregister.com/2023/08/07/chatgpt_stack_overflow_ai/

@ec2

ec2 commented Sep 19, 2024

Is this still being worked on? @LarryRuane
