Add TFLite Metadata to TFLite and Edge TPU models #9903
Conversation
@paradigmn thanks for the PR! Do the TFLite models retain their *.tflite suffix? Do they unpack on inference, or does the file always stay a single file? For edge_tpu handling I'm not sure, maybe just pass a metadata argument to the tflite export function.
@paradigmn BTW, is this metadata format applicable to any other TF formats like pb or saved_model?
@glenn-jocher the model stays a valid tflite model. It is part of the flatbuffer standard that metadata can have associated files, which are embedded after the model weights. Hence, the model can be used for inference as usual. The associated files can be retrieved with the tflite-support library, or by unpacking the model file like a zip archive. The edgetpu-compiler seems to erase the metadata from the model, so it must be added again after compilation (this should work without problems, as the compiled model is still a valid flatbuffer). TensorFlow uses protobuf to save weights and model structure instead of the tflite flatbuffer format, so unfortunately this method won't work there.
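The "unpack like a zip archive" behaviour described above can be illustrated with a minimal stdlib-only sketch (file names and contents here are made up for the demo): appending a zip archive to arbitrary leading bytes still yields a readable archive, because `zipfile` locates the central directory at the end of the file.

```python
import io
import zipfile
from pathlib import Path

# Simulate a model file: stand-in flatbuffer bytes followed by an
# appended zip archive containing the associated metadata file.
model_path = Path("demo_model.tflite")  # hypothetical file name
weights = b"\x00" * 64                  # stand-in for real model content

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("meta.txt", "stride: 32\nnames: ['person', 'car']")

model_path.write_bytes(weights + buf.getvalue())

# zipfile scans for the end-of-central-directory record from the end of
# the file, so the prepended model bytes do not interfere with reading.
with zipfile.ZipFile(model_path) as zf:
    meta = zf.read("meta.txt").decode("utf-8")
print(meta)

model_path.unlink()  # clean up the demo file
```

The same idea is why an exported model with embedded metadata remains a single, valid `.tflite` file: standard TFLite runtimes ignore the trailing zip bytes.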
@paradigmn ok got it. Then let's refactor the metadata code into a new file.
@glenn-jocher I just refactored the code into a separate function. I also added a test to avoid accidentally overwriting an existing meta.txt file.
@paradigmn I've updated the meta file path from meta.txt to /tmp/meta.txt. I think this should work on Linux and macOS, but it might break on Windows. I'll see what the tmp directory looks like there, and probably migrate to Pathlib.
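A portable alternative to a hard-coded /tmp path is the stdlib `tempfile` module, which resolves the platform's temporary directory on Linux, macOS, and Windows alike (the file name and contents below are just an example):

```python
import tempfile
from pathlib import Path

# Resolve the platform-appropriate temp directory instead of assuming /tmp
tmp_dir = Path(tempfile.gettempdir())
meta_file = tmp_dir / "meta.txt"  # hypothetical file name from this PR

meta_file.write_text("stride: 32")
print(meta_file.read_text())
```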
@paradigmn I tried
@glenn-jocher yes, tflite-support must be installed via pip.
@paradigmn ok got it. I installed these in Colab, but there seem to be conflicts unfortunately. I guess we can leave the code as is, so that if tflite-support is installed it will be used.
@paradigmn ok I think the PR is all set. What do you think?
@glenn-jocher interesting, I didn't encounter any dependency conflicts. It's probably an issue with old Python versions. The code looks good and works as intended on my side. I think this can be merged without further changes :)
@paradigmn Hi, thanks for the commit, I really appreciate it. As far as I understand, this PR reads the metadata from a meta.txt file and embeds it into the tflite model. Can you let me know how I can write that metadata? I can't seem to find any guidance on it. I searched the TensorFlow Lite documentation, and this is what I found: https://www.tensorflow.org/lite/models/convert/metadata_writer_tutorial but it seems we need a tflite file to create the metadata that the tflite file itself needs..? That doesn't seem to make sense.
@jinmc Hi, the TFLite metadata format is a bit complicated compared to e.g. ONNX. First of all, you have to distinguish between structural metadata (e.g. input name, output description, etc.) and custom metadata. The former is part of the TFLite flatbuffer format, while the latter consists more or less of text files that are attached to the end of the model as a zip archive. To make things even more complicated, Google tries to impose its own idea of how a model's input and output should be defined (e.g. a detection model: one image input and four output tensors). Since this format is not compatible with the Ultralytics models, you will run into problems.

My code is based on a somewhat hidden tutorial that uses a more low-level approach. Basically, you create a metadata graph structure for the model. Inputs and outputs are defined as subgraphs in this structure. They are mandatory and must define the correct number of input and output tensors! For the main graph and all subgraphs you can attach a list of text files. With these you can add arbitrary information to the model. However, since there is no specified format, you are responsible for writing and parsing the embedded information correctly. Especially when working with multiple embedded files, you should name each of them carefully so that the information can be extracted correctly later.
This PR embeds metadata into exported tflite models. The code uses the tflite-support library for the model export; for inference, this dependency is not necessary. The model can be unpacked as a zipfile, where the metadata is stored as an embedded text file. While this procedure appears rather convoluted, it is the intended method according to Google's documentation.
It seems the metadata does not survive the edgetpu-compiler. Should I refactor the code into a separate function which can be called at the end of both the tflite and edgetpu exports?
🛠️ PR Summary
Made with ❤️ by Ultralytics Actions
🌟 Summary
Enhanced TensorFlow Lite exports with metadata addition and improved model metadata handling.
📊 Key Changes
- Added an `add_tflite_metadata` function to append metadata to `.tflite` models.
- Imported `contextlib`, `ast`, and `zipfile` for handling the new metadata functionality.

🎯 Purpose & Impact
- `.tflite` models can now benefit from integrated metadata, enhancing model portability and interoperability across different platforms and tools. This change could enhance the user experience by providing richer model information directly within the `.tflite` files. 📱💼