ML Framework Support and Commanding NPU RK3588 #14100
Replies: 1 comment
-
@samarth1232 hello! The RK3588's NPU is a neural processing unit designed to accelerate machine learning inference. To get the most out of it, you'll want to use frameworks that Rockchip's tooling supports. Models from the following commonly used frameworks can be converted to run on the RK3588 NPU via Rockchip's RKNN-Toolkit2:

- TensorFlow / TensorFlow Lite
- PyTorch
- ONNX
- Caffe
- Darknet
To have the NPU run models from these frameworks, you typically need to:

1. Export your trained model from its original framework (e.g. to ONNX or TFLite).
2. Convert it to the `.rknn` format with RKNN-Toolkit2 on an x86 host PC.
3. Deploy the `.rknn` model on the board and run it with the RKNN runtime (the `rknn-toolkit-lite2` Python package or the C API), which dispatches execution to the NPU.
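As a rough sketch of the conversion step, a host-side script using RKNN-Toolkit2 might look like the following. The model path and settings are placeholders, not taken from your setup; check the examples in Rockchip's RKNN-Toolkit2 repository for your specific model type.

```python
# Hypothetical conversion sketch using Rockchip's RKNN-Toolkit2 (runs on an x86 host).
# The paths and options below are placeholders.

def convert_onnx_to_rknn(onnx_path: str, rknn_path: str) -> None:
    """Convert an ONNX model to .rknn format for the RK3588 NPU."""
    from rknn.api import RKNN  # provided by the rknn-toolkit2 package

    rknn = RKNN()
    # target_platform must match the board's SoC
    rknn.config(target_platform="rk3588")
    rknn.load_onnx(model=onnx_path)
    # do_quantization=True would additionally require a calibration dataset
    rknn.build(do_quantization=False)
    rknn.export_rknn(rknn_path)
    rknn.release()
```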
For reference, TensorFlow Lite can offload inference to an accelerator through a delegate. Note that the delegate library is hardware-specific: `libedgetpu.so.1`, which often appears in examples like this one, is for Google's Coral Edge TPU, not the RK3588; on the RK3588 the usual route is the RKNN runtime rather than a TFLite delegate. A generic TFLite delegate example looks like this:

```python
import numpy as np
import tensorflow as tf

# Load the TFLite model; the delegate .so below is a placeholder —
# use the delegate library your accelerator vendor provides
interpreter = tf.lite.Interpreter(
    model_path="model.tflite",
    experimental_delegates=[tf.lite.experimental.load_delegate("libvendor_delegate.so")],
)

# Allocate tensors
interpreter.allocate_tensors()

# Get input and output tensor metadata
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run inference (input_data must match the model's input shape and dtype)
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]["index"])
```

For more detailed guidance, check the official documentation of the respective frameworks and the resources Rockchip provides for the RK3588. If you encounter any issues or need further assistance, please provide a reproducible example of your setup and the specific problem you're facing; this will help us diagnose and address your concerns more effectively. You can find more information on creating a minimum reproducible example here. Best of luck with your project! 🚀
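On the RK3588 itself, a minimal inference sketch with the `rknn-toolkit-lite2` Python API might look like this. The model path is a placeholder, and it assumes you already have a converted `.rknn` model on the board:

```python
# Hypothetical on-device inference sketch with rknn-toolkit-lite2 (runs on the RK3588 board).
# "model.rknn" and the input array are placeholders.
import numpy as np


def run_on_npu(rknn_path: str, input_data: np.ndarray):
    """Run one inference on the RK3588 NPU and return the raw outputs."""
    from rknnlite.api import RKNNLite  # provided by the rknn-toolkit-lite2 package

    rknn_lite = RKNNLite()
    rknn_lite.load_rknn(rknn_path)
    # init_runtime() targets the NPU; a core_mask argument can pin specific NPU cores
    rknn_lite.init_runtime()
    outputs = rknn_lite.inference(inputs=[input_data])
    rknn_lite.release()
    return outputs
```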
-
I'm exploring running machine learning models on NPU RK3588. Could anyone share which ML frameworks are supported for this NPU? Additionally, I'd like to know how I can command the NPU to exclusively run models using these frameworks. Any insights or resources would be greatly appreciated!