Build a GPU-accelerated Docker container with jetson-inference, Python 3.10, and ROS 2 Humble for Jetson Nano 4G #1839
Comments
I think you can use the container mentioned here, by running this:
Thanks for your reply.
Hi, jetson-inference has a suitable container for JetPack 4.6.1 as well; check here. Just add the
Thanks. I checked there, but L4T R32.7.1 (based on JetPack 4.6.1) has Ubuntu 18.04 and Python 3.6. I used the image timongentzsch/l4t-ubuntu20-ros2-desktop; it has Ubuntu 20.04, Python 3.8, and ROS 2 Foxy, and it also supports the GPU via CUDA 10.2. After everything I tried, I couldn't build jetson-inference inside the container, but at least that image supports the rest of what I need.
Did it work? I plan to use mine in the same way and continue with GPU support.
Hello.
I have a Jetson Nano 4G and must use its GPU.
I need Python 3.8 (or newer) and ROS 2 Humble or Foxy.
I know they cannot be installed natively on JetPack 4.6.1, so I thought I might run them in a Docker container. I also need to use
the jetson-inference package inside the container.
Is that scenario possible?
(Note that ROS 2 Foxy can be installed on Ubuntu 20.04, but Humble needs Ubuntu 22.04. It would be great if I could use Humble.)
Can I have a Docker container which uses the GPU and has Python 3.10, ROS 2 Humble or Foxy, and jetson-inference?
This is a vital question for me, and I would appreciate it if anyone could help me with it.
Thanks
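
As a rough sketch of the kind of container being asked about, something like the Dockerfile below could be a starting point. This is an assumption-laden sketch, not a verified recipe: the base image tag, package list, and build steps are illustrative, and ROS 2 Humble itself is not installable from apt on the Ubuntu 18.04 userspace that JetPack 4.6.x provides (it would have to be built from source, or a prebuilt image from the dusty-nv/jetson-containers project used instead).

```dockerfile
# Sketch only: the tag and build steps below are assumptions, not tested on a Nano.
# Start from an L4T base matching the JetPack version on the host (JetPack 4.6.x
# here), so the container can access the GPU when run with --runtime nvidia.
FROM nvcr.io/nvidia/l4t-base:r32.7.1

# Basic build tooling for compiling jetson-inference from source.
RUN apt-get update && apt-get install -y --no-install-recommends \
        git cmake build-essential python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Build jetson-inference inside the container; CUDA comes from the base image.
RUN git clone --recursive https://github.com/dusty-nv/jetson-inference /opt/jetson-inference \
    && mkdir /opt/jetson-inference/build \
    && cd /opt/jetson-inference/build \
    && cmake ../ \
    && make -j"$(nproc)" \
    && make install

# ROS 2 would be layered on top here: either a source build (slow on a Nano,
# and Humble targets Ubuntu 22.04) or by basing this Dockerfile on one of the
# prebuilt ros:* images from dusty-nv/jetson-containers instead of l4t-base.
```

Whatever image is used, the GPU is exposed to the container on Jetson by launching it with `docker run --runtime nvidia`, which is what the stock `docker/run.sh` script in the jetson-inference repository does.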