Issue with the "area" data in the dataset #2
Comments
@Qidian213 I ran into this problem too. How did you solve it?
@ccnankai Verified, it works.
@Qidian213 Hi, what does points refer to in def annotation(self, points, label, num)? I get an error when running it.
It's the points from the annotation file. It runs fine for me, and the training results are good.
@Qidian213 I'm using Faster R-CNN, which only has bounding boxes and no points data. I changed the area computation to w*h and it worked.
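For box-only datasets, a minimal sketch of that w*h workaround (the [x, y, w, h] bbox layout is assumed from the COCO convention):

    # Minimal sketch: area from a COCO-style [x, y, w, h] bounding box
    def bbox_area(bbox):
        x, y, w, h = bbox
        return float(w * h)

    print(bbox_area([10.0, 20.0, 30.0, 40.0]))  # 1200.0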
Thanks for everyone's enthusiasm. The area field has been added; I hope it helps everyone learn better. Thank you!
Regarding poly = Polygon(points):
https://nbviewer.jupyter.org/github/aleju/imgaug-doc/blob/master/notebooks/B03%20-%20Augment%20Polygons.ipynb
My conversion worked fine before the area field was added, but after adding it …
Is the points in Polygon(points) a single point from the annotation file or multiple points? Is it a list containing tuples?
It should be multiple points.
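A quick self-contained check of what shapely's Polygon expects (the coordinates are made up for illustration):

    from shapely.geometry import Polygon

    # points must be a sequence of (x, y) vertices, e.g. a list of tuples;
    # a single point cannot form a polygon (at least 3 vertices are needed)
    points = [(0, 0), (4, 0), (4, 3), (0, 3)]
    poly = Polygon(points)
    print(poly.area)  # 12.0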
---- Original message ----
Subject: Re: [wucng/TensorExpand] Issue with the "area" data in the dataset (#2)
import numpy as np
from shapely.geometry import Polygon

def annotation(self, points, label, num):  # method of the converter class
    # Build one COCO-style annotation dict from a labelme polygon.
    annotation = {}
    # COCO segmentation: one flat [x1, y1, x2, y2, ...] list per polygon
    annotation['segmentation'] = [list(np.asarray(points).flatten())]
    # points is a list of (x, y) vertices; shapely gives the polygon area
    poly = Polygon(points)
    annotation['area'] = round(poly.area, 6)
    annotation['iscrowd'] = 0
    annotation['image_id'] = num + 1
    # An earlier version stored the bbox as str(self.getbbox(points)) because
    # saving it as a list raised an error when writing the JSON file (cause
    # unknown); converting every coordinate to a plain float avoids that.
    annotation['bbox'] = list(map(float, self.getbbox(points)))
    annotation['category_id'] = self.getcatid(label)
    annotation['id'] = self.annID
    return annotation
Thank you for the answer!
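For anyone unsure where points comes from: in a standard labelme JSON, each entry of the top-level shapes list carries its vertices under shape['points'] as [x, y] pairs. A minimal sketch of reading them and computing the area (the file name is hypothetical):

    import json
    from shapely.geometry import Polygon

    # Standard labelme layout:
    # {"shapes": [{"label": ..., "points": [[x1, y1], [x2, y2], ...]}, ...]}
    with open('example_labelme.json') as f:   # hypothetical file
        data = json.load(f)

    for shape in data['shapes']:
        points = shape['points']              # list of [x, y] vertex pairs
        print(shape['label'], round(Polygon(points).area, 6))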
I'm using your labelme2coco code to convert data in order to train the official Mask R-CNN, but your code does not compute the annotation['area'] field, which Mask R-CNN training requires. Is there reference code for this calculation that I could learn from?
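If a shapely dependency is undesirable, the polygon area can also be computed directly with the shoelace formula; a minimal sketch:

    # Shoelace formula: area of a simple polygon from its (x, y) vertices
    def polygon_area(points):
        area = 0.0
        n = len(points)
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            area += x1 * y2 - x2 * y1
        return abs(area) / 2.0

    print(polygon_area([(0, 0), (4, 0), (4, 3), (0, 3)]))  # 12.0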