Hyperparameter settings for training on the Objects365 dataset #12048

Closed · 1 task done
Aagamshah9 opened this issue Aug 29, 2023 · 11 comments
Labels: question (Further information is requested), Stale

Aagamshah9 commented Aug 29, 2023

Search before asking

Question

Hi @glenn-jocher,

I am currently training a YOLOv5m model on the Objects365 dataset with the hyp.Objects365.yaml hyperparameter settings for 300 epochs and a total batch size of 64, but the mAP score is too low. How can I improve the training process and the mAP score?
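
For reference, a run matching this setup would be launched with a command along these lines (a hedged reconstruction from the details above — YOLOv5m weights, hyp.Objects365.yaml, 300 epochs, total batch size 64 — assuming the standard YOLOv5 train.py flags and default repo layout):

python train.py --data Objects365.yaml --weights yolov5m.pt \
    --hyp data/hyps/hyp.Objects365.yaml --epochs 300 --batch-size 64 --img 640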

Please look into this, and let me know if you need any additional information from my end; I would be happy to share it.

Additional

 Epoch   gpu_mem      GIoU       obj       cls     total   targets  img_size         P         R    mAP@.5 mAP@.5:.95  val_GIoU   val_obj   val_cls
 0/299     6.77G   0.08292   0.07495   0.04284    0.2007        28       640   0.02306  0.002415  0.003592  0.001659   0.07294   0.08572   0.06229
 1/299     6.82G   0.06868   0.07174   0.03286    0.1733         6       640   0.04519  0.009981   0.01233  0.006113   0.06658   0.08394   0.05844
 2/299     6.82G   0.06451   0.06994   0.02882    0.1633        28       640   0.09033   0.03356    0.0278   0.01436   0.06089   0.07892   0.05479
 3/299     6.82G   0.06242   0.06898   0.02666    0.1581         2       640    0.1204   0.05513   0.04282   0.02293   0.05804   0.07676   0.05196
 4/299     6.82G   0.06089   0.06822   0.02524    0.1543         9       640    0.1337   0.07182   0.05505   0.03016   0.05648   0.07579   0.04971
 5/299     6.82G   0.05985   0.06764   0.02428    0.1518         9       640    0.1341    0.0847   0.06515   0.03639    0.0554   0.07514   0.04783
 6/299     6.82G   0.05911   0.06726   0.02357    0.1499         4       640    0.1461   0.09563   0.07427   0.04211   0.05459   0.07456    0.0463
 7/299     6.82G   0.05857   0.06691   0.02304    0.1485        12       640     0.149    0.1039   0.08135   0.04662   0.05405   0.07424   0.04507
 8/299     6.82G   0.05811   0.06664   0.02265    0.1474        19       640    0.1529    0.1123    0.0873   0.05049   0.05346   0.07383   0.04405
 9/299     6.82G   0.05773   0.06644   0.02231    0.1465         5       640    0.1632    0.1192   0.09237   0.05385   0.05303   0.07357   0.04314
10/299     6.82G   0.05741   0.06626   0.02204    0.1457        18       640    0.1691    0.1256   0.09694   0.05681   0.05269   0.07338   0.04242
11/299     6.82G   0.05715   0.06616   0.02182    0.1451        20       640    0.1736    0.1313    0.1008   0.05941   0.05232   0.07299    0.0418
12/299     6.82G   0.05691   0.06596   0.02162    0.1445        14       640    0.1788    0.1362    0.1044   0.06182   0.05208   0.07282   0.04125
13/299     6.82G   0.05673   0.06584   0.02145     0.144         2       640    0.1892      0.14    0.1075   0.06381   0.05188   0.07264   0.04081
14/299     6.82G   0.05654   0.06576   0.02132    0.1436        12       640     0.187     0.143    0.1104   0.06574   0.05169    0.0726   0.04043
15/299     6.82G   0.05639   0.06567   0.02118    0.1432        43       640     0.182    0.1466    0.1128   0.06738   0.05148   0.07238    0.0401
16/299     6.82G   0.05626   0.06555   0.02107    0.1429        37       640    0.1809    0.1492    0.1152   0.06894   0.05133   0.07225   0.03978
17/299     6.82G   0.05611   0.06546   0.02097    0.1425        22       640    0.1709    0.1518    0.1173   0.07034    0.0512   0.07225   0.03952
18/299     6.82G   0.05599    0.0654   0.02089    0.1423        15       640    0.1772    0.1542    0.1194   0.07164   0.05105    0.0721   0.03923
19/299     6.82G   0.05592   0.06535    0.0208    0.1421        27       640    0.1846    0.1567    0.1213   0.07296   0.05094   0.07203   0.03905
20/299     6.82G   0.05581   0.06531   0.02072    0.1418         5       640     0.189    0.1586    0.1229   0.07402   0.05085   0.07194   0.03886
21/299     6.82G   0.05572    0.0653   0.02065    0.1417        18       640    0.1967    0.1606    0.1245   0.07513   0.05075   0.07185   0.03866
22/299     6.82G   0.05562   0.06515   0.02059    0.1414        15       640    0.1801    0.1626    0.1261   0.07609   0.05063   0.07177   0.03849
23/299     6.82G   0.05555   0.06517   0.02052    0.1412        30       640    0.1834    0.1641    0.1273   0.07706   0.05053   0.07175   0.03833
24/299     6.82G   0.05547   0.06508   0.02047     0.141         3       640    0.1908    0.1657    0.1288   0.07806   0.05046    0.0717   0.03819
25/299     6.82G   0.05543     0.065   0.02041    0.1408         7       640    0.1875    0.1672      0.13   0.07884   0.05038   0.07163   0.03808
26/299     6.82G   0.05534   0.06503   0.02037    0.1408        30       640    0.1772    0.1685     0.131   0.07959   0.05032   0.07156   0.03797
27/299     6.82G   0.05528     0.065   0.02032    0.1406         7       640    0.1912    0.1701    0.1321    0.0803   0.05025   0.07153   0.03781
28/299     6.82G   0.05522   0.06492   0.02027    0.1404         9       640    0.1896    0.1712    0.1333    0.0811   0.05018   0.07145   0.03771
29/299     6.82G   0.05516   0.06488   0.02024    0.1403        42       640    0.1906    0.1726    0.1341   0.08174   0.05013   0.07139   0.03763
30/299     6.82G   0.05511   0.06489   0.02019    0.1402         9       640    0.1946    0.1735    0.1352   0.08242   0.05012    0.0715   0.03751
31/299     6.82G   0.05506   0.06484   0.02017    0.1401        55       640    0.1844    0.1746    0.1358   0.08288   0.05003   0.07136   0.03742
32/299     6.82G   0.05501   0.06479   0.02012    0.1399        36       640    0.1865    0.1755    0.1365   0.08338   0.04999   0.07131   0.03733
33/299     6.82G   0.05497   0.06477   0.02009    0.1398         3       640    0.1833    0.1762    0.1372   0.08381   0.04994   0.07129   0.03723
34/299     6.82G   0.05494   0.06477   0.02005    0.1398        30       640    0.1775    0.1772     0.138   0.08435   0.04985   0.07112   0.03715
35/299     6.82G   0.05489   0.06475   0.02002    0.1397        42       640    0.1824    0.1785    0.1389   0.08502   0.04982   0.07109   0.03706
36/299     6.82G   0.05485   0.06469   0.01998    0.1395         7       640     0.193    0.1792    0.1398   0.08565   0.04978   0.07109     0.037
37/299     6.82G   0.05481   0.06465   0.01997    0.1394        15       640    0.1914    0.1809    0.1407   0.08618   0.04973   0.07102   0.03693
38/299     6.82G   0.05478   0.06462   0.01994    0.1393        11       640    0.1887    0.1813    0.1412   0.08655   0.04972   0.07105   0.03681
39/299     6.82G   0.05474   0.06464   0.01991    0.1393        32       640    0.2016    0.1819    0.1418   0.08697   0.04964   0.07094   0.03677
40/299     6.82G    0.0547   0.06467   0.01987    0.1392        75       640    0.1969     0.183    0.1424   0.08738    0.0496   0.07092    0.0367
41/299     6.82G   0.05468   0.06456   0.01985    0.1391        27       640    0.1902    0.1836     0.143   0.08778    0.0496   0.07097   0.03664
42/299     6.82G   0.05463    0.0646   0.01983    0.1391        10       640    0.1987    0.1845    0.1437    0.0883   0.04955   0.07096   0.03659
43/299     6.82G    0.0546   0.06455   0.01982     0.139        53       640    0.1909    0.1847    0.1441    0.0886   0.04951    0.0709   0.03652
44/299     6.82G   0.05457   0.06451   0.01978    0.1389         7       640     0.189    0.1857    0.1447   0.08905   0.04946   0.07084   0.03645
45/299     6.82G   0.05452    0.0645   0.01976    0.1388        37       640    0.1848     0.186    0.1451   0.08936   0.04943   0.07086    0.0364
46/299     6.82G   0.05451    0.0645   0.01974    0.1388        18       640    0.1945    0.1868    0.1457   0.08972   0.04943   0.07088   0.03635
47/299     6.82G   0.05448   0.06449   0.01972    0.1387        32       640    0.1969    0.1872    0.1462   0.09015   0.04939   0.07086   0.03632
48/299     6.82G   0.05445   0.06443    0.0197    0.1386         4       640    0.2007    0.1877    0.1466   0.09026   0.04935   0.07084   0.03627
49/299     6.82G   0.05443   0.06442   0.01968    0.1385         5       640    0.2029    0.1883    0.1472   0.09067   0.04936    0.0708   0.03623
50/299     6.82G    0.0544   0.06447   0.01966    0.1385         6       640    0.2081    0.1891    0.1476   0.09105    0.0493   0.07075   0.03617
51/299     6.82G   0.05436   0.06441   0.01964    0.1384        22       640    0.2069    0.1897    0.1481   0.09134   0.04927   0.07076   0.03612
52/299     6.82G   0.05434   0.06439   0.01961    0.1383        11       640    0.2111    0.1902    0.1486   0.09166   0.04925   0.07074   0.03608
53/299     6.82G   0.05431    0.0644   0.01961    0.1383        17       640    0.1957    0.1906    0.1489   0.09194    0.0492   0.07067   0.03603
54/299     6.82G    0.0543   0.06437   0.01958    0.1383         8       640    0.1923    0.1915    0.1495   0.09235   0.04918   0.07065   0.03595
55/299     6.82G   0.05426   0.06441   0.01956    0.1382         2       640    0.1956    0.1918      0.15   0.09266   0.04913   0.07054   0.03592
56/299     6.82G   0.05425   0.06435   0.01954    0.1381        76       640    0.2101    0.1925    0.1505   0.09304   0.04915   0.07065   0.03587
57/299     6.82G   0.05422   0.06432   0.01953    0.1381        64       640    0.2033    0.1931    0.1508   0.09319    0.0491   0.07053   0.03582
58/299     6.82G    0.0542   0.06432   0.01951     0.138        16       640    0.1944    0.1934    0.1514   0.09353   0.04911   0.07062   0.03579
59/299     6.82G   0.05417   0.06427   0.01949    0.1379        18       640    0.1902    0.1936    0.1516   0.09373   0.04908   0.07057   0.03572
60/299     6.82G   0.05415   0.06426   0.01947    0.1379        22       640    0.1944     0.194    0.1521   0.09402   0.04907   0.07058   0.03571
61/299     6.82G   0.05413   0.06427   0.01946    0.1379         2       640    0.1925    0.1949    0.1527   0.09446   0.04906   0.07058   0.03565
62/299     6.82G   0.05411   0.06426   0.01945    0.1378        15       640    0.2094    0.1948    0.1528   0.09458   0.04902   0.07055   0.03563
63/299     6.82G   0.05409   0.06425   0.01944    0.1378        19       640    0.2031    0.1955    0.1532   0.09484   0.04899    0.0705   0.03558
64/299     6.82G   0.05408   0.06421   0.01942    0.1377        20       640     0.192    0.1962    0.1537   0.09514   0.04897   0.07048   0.03554
65/299     6.82G   0.05405   0.06421    0.0194    0.1377         6       640    0.1971    0.1968    0.1539   0.09527   0.04898   0.07052   0.03553
66/299     6.82G   0.05402   0.06416   0.01938    0.1376        11       640    0.2034    0.1973    0.1543   0.09553   0.04895   0.07042   0.03548
67/299     6.82G   0.05402   0.06424   0.01938    0.1376        10       640    0.2004    0.1978    0.1548    0.0959   0.04893   0.07046   0.03545
68/299     6.82G     0.054   0.06421   0.01935    0.1376         9       640    0.1961    0.1983     0.155   0.09604   0.04892   0.07043   0.03543
69/299     6.82G   0.05399   0.06419   0.01935    0.1375        24       640    0.1948    0.1984    0.1555   0.09637    0.0489   0.07043   0.03535
70/299     6.82G   0.05396   0.06411   0.01933    0.1374        23       640    0.1997    0.1993    0.1559   0.09655   0.04891   0.07051   0.03534
71/299     6.82G   0.05395   0.06411   0.01931    0.1374        22       640    0.2006    0.1995    0.1561   0.09676   0.04887    0.0704   0.03531
72/299     6.82G   0.05392   0.06414    0.0193    0.1374        34       640    0.1948       0.2    0.1564   0.09695   0.04886   0.07048   0.03531
73/299     6.82G   0.05391   0.06417   0.01929    0.1374        10       640     0.194    0.2005    0.1568   0.09721   0.04883   0.07038   0.03527
74/299     6.82G   0.05389   0.06412   0.01927    0.1373        19       640     0.202    0.2012    0.1572   0.09753   0.04882    0.0703   0.03522
75/299     6.82G   0.05387   0.06411   0.01925    0.1372        43       640     0.206     0.201    0.1574   0.09766   0.04882   0.07034   0.03521
76/299     6.82G   0.05385   0.06407   0.01925    0.1372        35       640    0.1997    0.2012    0.1573   0.09758   0.04878   0.07031   0.03517
77/299     6.82G   0.05385    0.0641   0.01923    0.1372        11       640     0.209    0.2022    0.1579   0.09796   0.04875   0.07021   0.03514
78/299     6.82G   0.05382   0.06409   0.01923    0.1371        11       640    0.2133    0.2024     0.158   0.09807   0.04878   0.07035   0.03509
79/299     6.82G   0.05379   0.06403    0.0192     0.137         7       640    0.2105    0.2031    0.1585   0.09848   0.04876    0.0703   0.03508
80/299     6.82G   0.05379   0.06402   0.01921     0.137         7       640    0.2185    0.2038     0.159   0.09873   0.04874   0.07023   0.03504
81/299     6.82G   0.05377   0.06404   0.01918     0.137        11       640    0.2012    0.2036    0.1589   0.09871   0.04873   0.07027   0.03502
82/299     6.82G   0.05375   0.06405   0.01916     0.137        11       640    0.2046    0.2035    0.1591   0.09887   0.04873   0.07037   0.03501
83/299     6.82G   0.05375   0.06404   0.01917     0.137        17       640    0.2132    0.2039    0.1592   0.09881   0.04871   0.07028   0.03498
84/299     6.82G    0.0537   0.06399   0.01915    0.1368         9       640    0.2068    0.2043    0.1596   0.09921   0.04867   0.07023   0.03493
85/299     6.82G   0.05371   0.06402   0.01913    0.1369        19       640    0.2028    0.2048    0.1598   0.09923   0.04866   0.07022   0.03491
86/299     6.82G    0.0537   0.06402   0.01912    0.1368        12       640    0.2066    0.2052    0.1602   0.09952   0.04866   0.07025   0.03489
87/299     6.82G   0.05368     0.064    0.0191    0.1368         7       640    0.2126    0.2052    0.1606   0.09978   0.04864    0.0702   0.03487
88/299     6.82G   0.05366   0.06393   0.01909    0.1367        20       640    0.2009    0.2055    0.1608   0.09992   0.04866   0.07027   0.03484
89/299     6.82G   0.05364   0.06398   0.01909    0.1367        26       640    0.2149    0.2061    0.1612    0.1002   0.04864   0.07031    0.0348
90/299     6.82G   0.05363   0.06395   0.01907    0.1367         7       640    0.2096    0.2066    0.1613    0.1003   0.04862   0.07024   0.03479
91/299     6.82G   0.05361   0.06394   0.01905    0.1366        22       640    0.2078    0.2067    0.1616    0.1005    0.0486   0.07019   0.03477
92/299     6.82G   0.05359   0.06392   0.01905    0.1366        25       640    0.2131    0.2069    0.1615    0.1005   0.04862   0.07026   0.03472
93/299     6.82G    0.0536   0.06398   0.01904    0.1366        14       640    0.2105    0.2077    0.1621    0.1008   0.04857   0.07016    0.0347
94/299     6.82G   0.05355   0.06389   0.01903    0.1365        18       640     0.199     0.208    0.1622    0.1009   0.04857   0.07021   0.03466
95/299     6.82G   0.05357   0.06393   0.01902    0.1365        33       640    0.2011    0.2082    0.1623    0.1011   0.04858   0.07022   0.03465
96/299     6.82G   0.05355    0.0639   0.01901    0.1365        45       640    0.1927    0.2077    0.1625    0.1012   0.04856   0.07023   0.03463
97/299     6.82G   0.05353   0.06391     0.019    0.1364         6       640    0.2096    0.2084     0.163    0.1014   0.04853   0.07016   0.03461
98/299     6.82G    0.0535   0.06384   0.01898    0.1363        38       640    0.2012    0.2082     0.163    0.1014   0.04854   0.07019   0.03461
99/299     6.82G    0.0535   0.06388   0.01897    0.1364         5       640    0.2071    0.2088    0.1632    0.1016   0.04854   0.07019   0.03457

Please see the attached results.txt for the scores of the remaining epochs.

Aagamshah9 added the question label on Aug 29, 2023
Aagamshah9 (Author) commented:

Hi @glenn-jocher, I opened this question two weeks ago; it would be great if you could provide some insight or a solution soon.

glenn-jocher (Member) commented:

@Aagamshah9 hi there,

I apologize for the delay in addressing your question. Our team is currently working on resolving a high volume of inquiries and we appreciate your patience. We'll do our best to provide you with an update or solution as soon as possible. Thank you for your understanding.

Aagamshah9 (Author) commented:

Hi @glenn-jocher,

It has been a month since this issue was opened, and I am genuinely stuck on a very big project with critical deadlines. I really request that you provide a solution ASAP. Disappointed :(

glenn-jocher (Member) commented:

Hi @Aagamshah9,

Thank you for reaching out and expressing your concern. We apologize for the delay in addressing your issue. Our team understands the importance of your project and the deadlines involved.

Rest assured, we are actively working on finding a solution for you. We are doing our best to provide you with a prompt and effective resolution to get you back on track as soon as possible.

Thank you for your patience and understanding.

github-actions bot commented:

👋 Hello there! We wanted to give you a friendly reminder that this issue has not had any recent activity and may be closed soon, but don't worry - you can always reopen it if needed. If you still have any questions or concerns, please feel free to let us know how we can help.

Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are always welcome!

Thank you for your contributions to YOLO 🚀 and Vision AI ⭐

github-actions bot added the Stale label on Oct 23, 2023
github-actions bot closed this as not planned on Nov 3, 2023

Aagamshah9 commented Apr 26, 2024

Hi @glenn-jocher,

I am opening this issue again since neither you nor your team has provided any solution or updates. Please respond as soon as possible.

Thank you!

glenn-jocher (Member) commented:

@Aagamshah9 hi there,

I'm truly sorry for any inconvenience our delayed response may have caused. We understand how crucial timely support can be, especially when facing deadlines or critical project milestones. 🙏

While I may not be able to provide an immediate, detailed solution in this message, I want to assure you that we are actively looking into your concern. In the meantime, reviewing our documentation at https://docs.ultralytics.com/yolov5/ might offer some insights or workarounds relevant to your issue.

Thank you for your patience and understanding. We appreciate your continued support and collaboration.

Warm regards

Aagamshah9 (Author) commented:

This issue was opened in August 2023, but I haven't received any pointers yet to move forward. Please look into it immediately.

glenn-jocher (Member) commented:

@Aagamshah9,

I'm here to help! To improve mAP scores on the Objects365 dataset, consider:

  1. Data Augmentation: Increase diversity with more aggressive augmentation. Check augmentation settings in your YOLOv5 YAML.

  2. Learning Rate: Experiment with adjusting the learning rate. Sometimes, a slightly lower or higher learning rate can make a significant difference in performance.

  3. Epochs: Although you're running 300 epochs, monitor if the model still improves late in training. If not, early stopping might be beneficial.

  4. Model Size: If feasible, trying a larger model variant (e.g., YOLOv5l or YOLOv5x) could yield better results if hardware resources allow.

  5. Review Dataset: Ensure the dataset is correctly labeled and consider adding more data if possible.

Your patience is greatly appreciated, and we're committed to assisting you.
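
For illustration, tweaks to the augmentation and learning-rate fields (points 1 and 2 above) might look like the following hypothetical copy of the hyp file; the field names follow YOLOv5's standard hyp YAMLs, but the values are illustrative starting points rather than recommendations:

# hyp.Objects365-custom.yaml — hypothetical variant of hyp.Objects365.yaml
lr0: 0.002       # initial learning rate, slightly lowered to experiment
lrf: 0.17        # final OneCycleLR fraction (final lr = lr0 * lrf)
hsv_s: 0.7       # HSV saturation augmentation (fraction)
hsv_v: 0.4       # HSV value augmentation (fraction)
scale: 0.75      # image scale augmentation gain, raised for more diversity
mosaic: 1.0      # mosaic augmentation probability
mixup: 0.1       # mixup probability, enabled as an experiment
fliplr: 0.5      # horizontal flip probability

Training would then pick it up via --hyp hyp.Objects365-custom.yaml.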

Aagamshah9 (Author) commented:

I already took care of points 1, 3, 4, and 5. For the learning rate, I am using the hyp.Objects365.yaml that you provided; I believe the learning rate in that .yaml file was obtained through hyperparameter evolution.
@glenn-jocher Thank you so much for your response. I appreciate it.
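
For reference, my understanding is that such values can be re-derived for a given dataset with YOLOv5's hyperparameter evolution mode. A sketch of what I could run, assuming the standard train.py interface (the 10-epoch proxy trainings and 300 generations are illustrative choices, and evolution is computationally expensive):

# Evolve hyperparameters over ~300 generations, scoring each candidate
# with a short 10-epoch training; results accumulate under runs/evolve/
python train.py --data Objects365.yaml --weights yolov5m.pt \
    --epochs 10 --batch-size 64 --img 640 --evolve 300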

glenn-jocher (Member) commented:

@Aagamshah9,

Great to hear that you've covered most of the recommended adjustments! Regarding the learning rate set in hyp.Objects365.yaml, it's indeed tailored based on general observations. However, each dataset could respond differently. I'd suggest experimenting with minor adjustments in the learning rate, either slightly higher or lower, to see if it optimally suits your specific dataset characteristics. Sometimes even small tweaks can lead to noticeable improvements.
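
As a concrete, hypothetical way to run that experiment: copy the hyp file, change only the initial learning rate, and compare short runs side by side (file names and epoch counts below are illustrative):

# Hypothetical A/B test: identical settings, with lr0 edited in the copy
cp data/hyps/hyp.Objects365.yaml hyp.lr-low.yaml   # then halve lr0 in the copy
python train.py --data Objects365.yaml --weights yolov5m.pt \
    --hyp hyp.lr-low.yaml --epochs 50 --batch-size 64 --img 640 --name lr-low

Comparing the two runs' mAP@.5 curves should show which rate your dataset prefers.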

Let us know how it goes! 😊
