Fancy remote debugging (aka tab completion and editor controls) #130

Open · goodboy opened this issue Jul 24, 2020 · 2 comments
Labels: debugger · discussion · enhancement (New feature or request) · experiment (Exploratory design and testing) · help wanted (Extra attention is needed)

goodboy commented Jul 24, 2020

As part of the journey in #129 I discovered that no one seems to have solved the problem of getting the fancy features of a modern Python debugger working in remote debugging applications. Though I haven't tested them all, the list in #113 seems to mostly contain systems which rely on telnet servers (or other network IPC magic), but none of them actually solve the issue of how to get, in the local client, the features that would normally only be possible when the remote process is connected to a tty.

The problem

Standard fancy (read: human-enhanced) debugger REPLs (including the stdlib's pdb, which uses rlcompleter, and pdb++) rely on libraries such as GNU readline to get things like completion and CLI "editing controls". There seems to be no way to get these features with readline-based systems in a remote debugging context, since Python's use of readline requires that the process is launched under a tty/pty. Ideally these features would be available in such use cases to make debugging of remote systems sane and efficient for the user.
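For context, a minimal sketch of the local, tty-backed setup these debuggers build on: the stdlib's rlcompleter registered with GNU readline. The hooks below only give you completion and line editing when the process is attached to a real tty/pty (and when the readline module is available at all), which is exactly the assumption that breaks in a remote setting.

```python
# Local-only sketch: this is roughly how pdb/rlcompleter get tab completion.
# readline may not exist on all platforms, and even where it does, it only
# provides editing/completion when the process is attached to a tty/pty.
import readline
import rlcompleter

readline.set_completer(rlcompleter.Completer().complete)
readline.parse_and_bind("tab: complete")  # no effect without a real tty
```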

Evidence

In #129 I was able to verify that launching subprocesses with stdin as a unix pipe indeed prevents any readline machinery from being loaded. I haven't been able to find a remote debugger that supports completion and editing in this situation either (but of course hopefully someone will prove me wrong!).
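A small self-contained reproduction of that observation (not tractor code): spawn a child with stdin as a pipe and have it report whether it thinks it's attached to a tty.

```python
import subprocess
import sys

# The child just reports whether its stdin looks like a tty; with
# stdin=subprocess.PIPE it will say False, which is why readline-based
# completion/editing never activates in the child.
child_code = """
import sys
print("stdin isatty:", sys.stdin.isatty())
try:
    import readline  # noqa: F401
    print("readline importable, but line editing still needs a tty")
except ImportError:
    print("readline not available on this platform")
"""

proc = subprocess.run(
    [sys.executable, "-c", child_code],
    stdin=subprocess.PIPE,   # stdin is a pipe, not the parent's tty
    capture_output=True,
    text=True,
)
print(proc.stdout, end="")
```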

Possible solutions

  • in the near term: we can avoid spawning with stdin as a pipe and instead let child processes stay connected to the parent's tty (this is actually what trip does and it works):

    • this should allow for nested actor access to stdin since the parent won't have to figure out which pipe to write to.
    • we'll need locking around access to the root actor's tty (which was already part of the plan); a rough sketch of that idea follows this list.
    • this will of course only work for host-local actors.
  • consider a debugger that doesn't use the stdlib's readline

  • work with debuggers that want to move to ptk so that this functionality is supported in their initial integration, e.g. pdbpp in pdbpp/pdbpp#362 ("Edit debugger line with editor")
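For the near-term option above, here's a minimal sketch (my own illustration, not tractor's actual API) of what FIFO locking around the root actor's tty could look like using a plain trio.Lock, which is fair (FIFO); `debug_here()` is a hypothetical stand-in for wherever pdb++ actually gets entered:

```python
import trio

# trio.Lock is fair, so waiting actors get the tty in arrival order.
_root_tty_lock = trio.Lock()

async def debug_here(actor_name: str) -> None:
    async with _root_tty_lock:
        # Only one actor reads/writes the shared tty + stdin at a time;
        # this is where a real pdb++ session would be entered.
        print(f"{actor_name}: acquired the root tty, debugging...")
        await trio.sleep(0.1)  # stand-in for an interactive session
    print(f"{actor_name}: released the root tty")

async def main() -> None:
    async with trio.open_nursery() as nursery:
        for name in ("actor-a", "actor-b", "actor-c"):
            nursery.start_soon(debug_here, name)

if __name__ == "__main__":
    trio.run(main)
```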

Other notes

Remote debugging possible hacks or solutions

  • python-remote-pdb offers a readline hack using nc or socat which may be usable in the near term (a usage sketch follows this list)
  • somehow reusing some of what ptyprocess has done, from within the tractor spawning machinery, while keeping compatibility with the public trio API
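A hedged usage sketch of the python-remote-pdb hack, based on that project's documented RemotePdb API; note the line editing here comes from the client-side socat process, not from the Python process being debugged:

```python
from remote_pdb import RemotePdb

def buggy() -> None:
    answer = 41
    # Opens a TCP listener and blocks until a client attaches, e.g. with:
    #   socat readline tcp:127.0.0.1:4444   # socat supplies the line editing
    #   nc 127.0.0.1 4444                   # bare connection, no editing
    RemotePdb("127.0.0.1", 4444).set_trace()
    print(answer + 1)

if __name__ == "__main__":
    buggy()
```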

Ideally

  • the correct solution to me is getting ptk to support all its features without requiring a tty whatsoever and then being able to simply talk to a tractor actor running it (see the sketch below).
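To illustrate what "ptk without a tty" could look like, here's a minimal sketch assuming a recent prompt_toolkit release (where create_pipe_input() is a context manager); the pipe input stands in for whatever IPC channel would carry keystrokes from a remote client, which is exactly the part tractor would have to provide:

```python
from prompt_toolkit.input import create_pipe_input
from prompt_toolkit.output import DummyOutput
from prompt_toolkit.shortcuts import PromptSession

def prompt_without_a_tty() -> str:
    with create_pipe_input() as pipe_input:
        # Pretend a remote client typed this line followed by <enter>.
        pipe_input.send_text("print('hello from afar')\n")
        session = PromptSession(
            input=pipe_input,
            output=DummyOutput(),  # swap in a real vt100 output to render remotely
        )
        return session.prompt("(remote-pdb-ish) ")

if __name__ == "__main__":
    print(prompt_without_a_tty())
```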
goodboy added the discussion, enhancement, experiment and help wanted labels on Jul 24, 2020
goodboy commented Jul 24, 2020

Relevant links from ptk:

goodboy added a commit that referenced this issue Jul 30, 2020
This is the first step in addressing #113 and the initial support
of #130. Basically this allows (sub)processes to engage the `pdbpp`
debug machinery which read/writes the root actor's tty but only in
a FIFO semaphored way such that no two processes are using it
simultaneously. That means you can have multiple actors enter a trace or
crash and run the debugger in a sensible way without clobbering each
other's access to stdio. It required adding some "tear down hooks" to
a custom `pdbpp.Pdb` type such that we release a child's lock on the
parent on debugger exit (in this case when either of the "continue" or
"quit" commands are issued to the debugger console).

There's some code left commented in anticipation of full support for
issue #130, where we'll need to actually capture and feed stdin to the
target (remote) actor, which won't necessarily be running on the same
host.
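A rough sketch of the "tear down hooks" idea described in that commit message (my illustration, not the actual tractor code, and using the stdlib's pdb.Pdb rather than pdbpp.Pdb to stay self-contained); `release_tty_lock` is a hypothetical callback standing in for tractor's real lock-release machinery:

```python
import pdb
from typing import Callable

class TeardownPdb(pdb.Pdb):
    """pdb subclass that releases a shared tty lock when the session ends."""

    def __init__(self, release_tty_lock: Callable[[], None], *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._release_tty_lock = release_tty_lock

    def do_continue(self, arg: str):
        # Resume the program, then hand the tty back to the root actor.
        result = super().do_continue(arg)
        self._release_tty_lock()
        return result

    do_c = do_cont = do_continue

    def do_quit(self, arg: str):
        result = super().do_quit(arg)
        self._release_tty_lock()
        return result

    do_q = do_exit = do_quit
```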
goodboy commented Aug 19, 2020

Follow up from prompt-toolkit/python-prompt-toolkit#1204:

goodboy changed the title from Fancy stuff (aka tab completion and editor controls) in a "remote debugger" to Fancy debugging (aka tab completion and editor controls) in a "remote debugger" on Sep 28, 2020
goodboy changed the title from Fancy debugging (aka tab completion and editor controls) in a "remote debugger" to Fancy remote debugging (aka tab completion and editor controls) on Oct 5, 2021
goodboy added a commit that referenced this issue Mar 20, 2022
This code was originally written (with much thanks) by
@mikenerone:matrix.org. It adds a `tractor.trionics.ipython_embed()` which
is `trio` compatible and allows straight-up `await async_func()` calls
in the REPL with the expected default blocking semantics. More refinements
are to come, including user config loading and eventually a foundation for
what will be a console REPL + %magics for shipping work off to actor
clusters, plus manual respawn controls, thus probably eventually
obsoleting all the "parallel" stuff built into `ipython` B)

Probably pertains to #130
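Purely as a hypothetical usage sketch: only the dotted name `tractor.trionics.ipython_embed()` comes from the commit text above; whether it is awaited, and from what context, is an assumption here, not documented behaviour:

```python
import trio
import tractor

async def main() -> None:
    async with tractor.open_nursery():
        # Hypothetical call style: drop into an IPython REPL that tolerates
        # `await async_func()` calls with blocking semantics, per the commit
        # text. The await itself is an assumption about the API.
        await tractor.trionics.ipython_embed()

if __name__ == "__main__":
    trio.run(main)
```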
goodboy added the debugger label and removed the debugging label on Aug 1, 2022