offline_eval_map_corloc: TypeError (expected str instance, bytes found) #3252
Comments
Could you please try using TensorFlow 1.5 and let us know if you still see the problem?
Thanks for your answer, I will try TF 1.5 today and report back. Just a general question: is this the appropriate way to obtain evaluation results from a frozen graph — take a frozen graph from the model zoo -> run inference with the graph over the eval images -> run the evaluation? Update: I just installed TF 1.5.0 and executed the same command. Unfortunately I get the identical error:
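For reference, the general shape of that pipeline (frozen graph -> inference over a tfrecord -> offline evaluation), using the flag names documented for these tools in the object_detection repo. All paths and config file names here are placeholders, and the commands are only echoed, not executed, in this sketch:

```shell
# Placeholder paths -- substitute your own files.
FROZEN_GRAPH=ssd_inception_v2_coco/frozen_inference_graph.pb
EVAL_RECORD=eval.record
DETECTIONS=detections.tfrecord

# 1) Run the frozen graph over the eval tfrecord to produce detections.
echo python object_detection/inference/infer_detections.py \
  --input_tfrecord_paths="${EVAL_RECORD}" \
  --output_tfrecord_path="${DETECTIONS}" \
  --inference_graph="${FROZEN_GRAPH}" \
  --discard_image_pixels

# 2) Compute mAP / CorLoc from the detections record.
echo python object_detection/metrics/offline_eval_map_corloc.py \
  --eval_dir=eval_metrics \
  --eval_config_path=validation_eval_config.pbtxt \
  --input_config_path=validation_input_config.pbtxt
```

Drop the leading `echo`s to actually run the two steps once the paths point at real files.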
Hello, at first I took the raccoon train.record and the graph of ssd_inception_v2_coco (directly from the model zoo). I ran infer_detections.py and received the detections tfrecord. Next I tried to evaluate with offline_eval_map_corloc.py, but the same error (TypeError: sequence item 0: expected str instance, bytes found) showed up. After that I replaced ssd_inception_v2_coco with the ssd_mobilenet_v1_coco graph and repeated the steps -> the same TypeError occurred. Then I ran infer_detections.py with the raccoon train.record and the raccoon output_inference_graph.pb from the repo -> offline_eval_map_corloc.py -> the same TypeError. If anyone else has an idea about what I could try, let me know.
@tombstone Can you take a look at this?
I had the same error with Python 3.5 and TF 1.5. The cause is that tf_example.features.feature[self.field_name].bytes_list.value returns bytes instead of str in metrics/tf_example_parser.StringParser. So I changed tf_example_parser.StringParser as below.
I got no errors after that. But this is an interim solution; I think there is a better one.
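The code block from the comment above was not captured in this export. The following is a minimal self-contained sketch of the kind of change described: join the `bytes_list.value` entries as bytes and then decode to `str` (the helper name `parse_string_field` is mine, not from the repo):

```python
def parse_string_field(values):
    """Sketch of the fix to StringParser.parse: under Python 3 the
    entries of bytes_list.value are bytes, so joining them with "" raises
    "TypeError: sequence item 0: expected str instance, bytes found".
    Joining as bytes and decoding to str avoids that."""
    if not values:
        return None
    return b"".join(values).decode("utf-8")

# Under Python 3, tf_example bytes_list values arrive as bytes:
print(parse_string_field([b"image_001", b".jpg"]))  # image_001.jpg
```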
I tried @ohnabe's solution and it fixed the error for me.
Great. @ohnabe thank you for your help!
I've used your method, except with:
to handle bytes which can't be decoded.
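The snippet @varun19299 inserted here was also lost in this export. A sketch of the usual lenient-decoding variant: pass an `errors` argument to `decode` so invalid byte sequences are dropped instead of raising (`errors="ignore"` is my assumption here; `"replace"` is another common choice). The function name is mine:

```python
def parse_string_field_lenient(values):
    """Joins the bytes_list values and decodes them to str, silently
    dropping byte sequences that are not valid UTF-8 instead of
    raising UnicodeDecodeError."""
    if not values:
        return None
    return b"".join(values).decode("utf-8", errors="ignore")

# The invalid byte 0xff is dropped; the rest decodes normally:
print(parse_string_field_lenient([b"img", b"\xffX"]))  # imgX
```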
@varun19299 Thanks! Great!
Thanks a lot. This worked.
Hello, I am using this script (https://www.shiftedup.com/2018/10/10/confusion-matrix-in-object-detection-api-with-tensorflow) and generated the confusion matrix. However, I have only two classes and the matrix has a size of 3x3, which seems wrong. Could someone explain this? Thank you. Confusion Matrix: category ... recall_@0.5IOU [2 rows x 3 columns]
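On the 3x3 question: detection confusion-matrix scripts commonly reserve one extra row and column beyond the class count for unmatched boxes (a ground-truth box no detection matched, or a detection no ground-truth box matched), so two classes yield a 3x3 matrix. I have not verified that the linked script does exactly this, so treat the sketch below as an illustration of that convention, not as its actual code:

```python
import numpy as np

NUM_CLASSES = 2
# One extra row/column for "unmatched": last row collects spurious
# detections, last column collects missed ground-truth boxes.
confusion = np.zeros((NUM_CLASSES + 1, NUM_CLASSES + 1), dtype=int)

confusion[0, 0] += 1             # class 0 detected as class 0 (correct)
confusion[1, NUM_CLASSES] += 1   # class 1 ground truth never detected
confusion[NUM_CLASSES, 0] += 1   # detection of class 0 with no ground truth

print(confusion.shape)  # (3, 3)
```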
System information
tf.VERSION = 1.4.0
tf.GIT_VERSION = v1.4.0-0-gd752244
tf.COMPILER_VERSION = v1.4.0-0-gd752244
Describe the problem
Previous steps:
My aim is to calculate the AP (for the class 'car') on the KITTI dataset for models from the zoo. To that end I used create_kitti_tf_record.py (dataset_tools); see point A in the logs below. While generating the tfrecord, I faced the same issue as #3239. The solution SamDon87 mentioned there worked for me (see A3).
For the inference (see point B) I successfully processed the smaller val tfrecord (200 images).
Error:
After executing offline_eval_map_corloc.py (see "Exact command to reproduce"; further information at points C1 and C2 in the logs) I get the following error message:
What I tried
To ensure that the error is not caused by the tfrecord I generated, I used another (non kitti) tfrecord. Unfortunately, the same error occurred.
Now I'm stuck and I don't know how to solve this issue.
I am grateful for any help.
Source code / logs
A1 - Command to create tfrecord
A2 - kitti_label_map.pbtxt
A3 - line 61 to 64 of the modified create_kitti_tf_record_mod.py
B - infer_detections command
C1 - validation_input_config_KITTI.pbtxt
C2 - validation_eval_config.pbtxt
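The contents of the two pbtxt files above were not captured in this export. For reference, a sketch of the fields such files typically carry, using field names from object_detection's input_reader.proto and eval.proto; the paths and the metrics_set value are placeholders (the registered metric names vary across releases, so check eval_util.py in your checkout):

```
# validation_input_config_KITTI.pbtxt (sketch, placeholder paths)
label_map_path: "data/kitti_label_map.pbtxt"
tf_record_input_reader: {
  input_path: "data/kitti_val.record"
}

# validation_eval_config.pbtxt (sketch, placeholder metric name)
metrics_set: "pascal_voc_detection_metrics"
```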