Getting ipynb to open as a notebook in GitHub codespace with openFiles? #58399
-
**Also posted in the Jupyter extension repo: microsoft/vscode-jupyter#13723**

I am trying to create a dev container that automatically opens a particular notebook file using the Jupyter extension. All of my attempts result in the notebook rendering as JSON instead of as a notebook. Example devcontainer.json:
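(The original example was not captured in this thread. A minimal devcontainer.json using the `openFiles` customization might look like the following; the notebook path and extension list are placeholders, not the poster's actual values:)

```jsonc
{
  "customizations": {
    // Codespaces-specific customizations
    "codespaces": {
      // Files to open automatically when the codespace starts
      "openFiles": ["notebooks/example.ipynb"]
    },
    "vscode": {
      // Install the Jupyter extension so .ipynb files get the notebook UI
      "extensions": ["ms-toolsai.jupyter"]
    }
  }
}
```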
Here's another repo that demonstrates the issue: https://github.com/janosh/pymatviz. If you create a Codespace from it, you'll notice it opens 4 ipynb files, but they render as JSON, not as notebooks. If you then close and re-open them, they open as expected, with the notebook UI. I'm not sure whether this is an extension issue or a Codespaces issue, as I assume it's related to the timing of the customizations and extension activation. It would be really fantastic if there were a way to make this work, since this use case would be very popular with data science teachers.
-
Also filed here: microsoft/vscode-jupyter#13723 (comment)
-
@pamelafox I see in your other discussion that it might be an issue with VS Code core, but I'm also not sure that it's a supported scenario in Codespaces, since the file will be opened before the extensions (specifically the Notebooks extension) in your codespace are activated. One workaround that should work is to remove the `openFiles` customization in your `devcontainer.json` and add `postAttachCommand: "code /workspaces/materials-fa22/materials/fa22/project/project1/project1.ipynb"`. Let me know if you are still running into the same issue after giving that a try!
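(As a sketch of that workaround, the relevant devcontainer.json fragment might look like this. `postAttachCommand` runs after VS Code attaches to the container, by which point extensions have activated, so `code <file>` opens the notebook with the notebook UI. The path is the one from this thread; adjust it to your repo:)

```jsonc
{
  // Runs after the editor attaches, i.e. after extensions are active,
  // so the .ipynb opens as a notebook rather than raw JSON.
  "postAttachCommand": "code /workspaces/materials-fa22/materials/fa22/project/project1/project1.ipynb"
}
```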