Using model weights on own dataset #227
Hey @CSteele97,

For the simple case of predicting on some new data, prepare sources and environment, then follow this section: https://github.com/neptune-ai/open-solution-mapping-challenge/blob/master/REPRODUCE_RESULTS.md#predict-on-new-data

Hope this helps.
Hi @kamil-kaczmarek, thank you for your reply. For the "Predict on new data" section of REPRODUCE_RESULTS, would the pipeline_name therefore be unet, since that is the trained model? Thank you
Hey @CSteele97,

There is a full command provided in the aforementioned section. It looks like this:

```
python main.py predict_on_dir \
    --pipeline_name unet_tta_scoring_model \
    --chunk_size 100 \
    --dir_path data/paper_images \
    --prediction_path data/paper_images_predictions.json
```

There is a pipeline name provided: `unet_tta_scoring_model`.

Cheers,
Thanks @kamil-kaczmarek. I have been trying to run the command you mentioned, but I get the error 'No module named neptune'. I have followed all the previous steps (without a Neptune registration) and am not sure why I am getting this error or how to resolve it. I appreciate your time in helping me figure all of this out! Thank you
Did you install neptune? That will be the simplest workaround.
I have managed to solve the neptune issue using `pip install neptune-cli`, thanks.
I have tried to run the above command, however I am now receiving `Error: No such command 'predict_on_dir'`.
I see that you installed `neptune-cli`. The best solution here is to create an environment using conda. Here is the full specification of the conda environment: https://github.com/neptune-ai/open-solution-mapping-challenge/blob/master/environment.yml

This should also help with the `Error: No such command 'predict_on_dir'` issue.

Hope this helps :)
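A minimal sketch of that environment setup (the environment name `mapping` is an assumption taken from the error path that appears later in this thread; adjust to taste):

```shell
# Sketch only: assumes the repo has been cloned and conda is installed.
# The env name "mapping" matches the anaconda path seen in the error message.
conda env create -f environment.yml -n mapping
conda activate mapping
# sanity check that neptune is importable inside the new environment
python -c "import neptune; print('neptune ok')"
```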
Thanks Kamil, I have updated my environment, which now seems to be working. I have been running the command from the open-solution-mapping-challenge directory - is this correct? Thank you
Hey @CSteele97, Yep, it should work. |
Thanks Kamil, I've tried running the command again from the aforementioned directory but it's still giving the `No such command 'predict_on_dir'` error.
Hey, can you paste the full error message?
```
/anaconda3/envs/mapping/lib/python3.6/site-packages/sklearn/externals/joblib/__init__.py:15: FutureWarning: sklearn.externals.joblib is deprecated in 0.21 and will be removed in 0.23. Please import this functionality directly from joblib, which can be installed with: pip install joblib. If this warning is raised when loading pickled models, you may need to re-serialize those models with scikit-learn 0.21+.
Error: No such command 'predict_on_dir'.
```
Great, thanks. Can you also paste the full command that you used?
`python main.py predict_on_dir`
Hi @CSteele97, I have just successfully run:

```
python main.py predict_on_dir \
    --pipeline_name unet_tta_scoring_model \
    --chunk_size 100 \
    --dir_path data/paper_images \
    --prediction_path data/paper_images_predictions.json
```

Perhaps you didn't use the full command with all the arguments?
Hi, I got a different error when I ran the above command. Any idea?
I'm actually not sure where I should put the released checkpoints.
I'm also wondering which parts of the pipeline you've released checkpoints for.
Hi @asahi417, the transformers that don't have any state are created on the fly, so you only need the two trained models: `unet` and `scoring_model`. Both of those trained models should be placed in the experiment directory, where the pipeline looks for saved transformers. I tried to explain it in the Reproduce Results section but I am not sure if it is clear.

I hope this helps.
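To make the placement above concrete, here is a small sketch that checks whether the expected checkpoints are where the pipeline will look for them. The directory layout and model names are assumptions pieced together from this thread, not the repo's documented structure:

```python
from pathlib import Path

# Assumed names of the two released trained models mentioned above.
EXPECTED_MODELS = ("unet", "scoring_model")

def find_missing_checkpoints(experiment_dir):
    """Return the expected model names whose saved transformers are absent.

    Assumes (hypothetically) that trained transformers live under
    <experiment_dir>/transformers/<model_name>.
    """
    root = Path(experiment_dir) / "transformers"
    return [name for name in EXPECTED_MODELS if not (root / name).exists()]
```

Running this against your experiment directory before predicting would tell you immediately which checkpoint the pipeline cannot find.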
Also, I'm wondering if it is possible to fine-tune the released checkpoint on my own dataset.
Hi there, I think there may be something wrong with the indices of your images in the prediction file - it seems those predictions belong to different images, right? You can easily fine-tune by overriding (or simply pasting) a snippet that loads weights when you train in `steps/pytorch/models.py`.
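The fine-tuning idea above - load the released weights before training starts, then continue training on your own data - can be sketched like this, assuming PyTorch (`warm_start` is a hypothetical helper, not a function from the repo):

```python
import torch

def warm_start(model, checkpoint_path, device="cpu"):
    # Hypothetical helper: load released weights into an existing model so
    # training continues from them instead of from random initialisation.
    state_dict = torch.load(checkpoint_path, map_location=device)
    model.load_state_dict(state_dict)
    return model
```

In the repo this would roughly correspond to calling such a loader right before the training loop, as suggested above.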
@jakubczakon Hi, thanks for your feedback. I've tried exporting the segmentation for a single image, but still obtained similar results... Could you take a look at my code, where I export a segmentation map from the COCO-formatted prediction file produced by your pipeline? https://github.com/asahi417/open-solution-mapping-challenge-script
I solved this error in a different way. Inside the main.py script you will find a line just before the function definition (the decorator that registers the command); that is where I looked.
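The "No such command" failure generalises: a CLI subcommand exists only if something registers it before dispatch. Here is a small argparse analogy (the repo's main.py uses decorator-based registration instead; the command and option names below mirror the ones used in this thread):

```python
import argparse

def build_parser():
    parser = argparse.ArgumentParser(prog="main.py")
    subcommands = parser.add_subparsers(dest="command", required=True)
    # "predict_on_dir" is only a valid command because it is registered here;
    # without this block the parser rejects it, much like the error above.
    predict = subcommands.add_parser("predict_on_dir")
    predict.add_argument("--pipeline_name")
    predict.add_argument("--chunk_size", type=int)
    predict.add_argument("--dir_path")
    predict.add_argument("--prediction_path")
    return parser
```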
Hi @jakubczakon,
I would like to use the model weights to detect buildings in my own imagery, but I'm not entirely sure how to do this. I notice there are two files at the following link (https://ui.neptune.ai/neptune-ai/Mapping-Challenge/e/MC-1057/artifacts), but I am not sure which file contains the model weights or how to apply it to my own imagery. I have also seen the 'Predict on new data' section of REPRODUCE_RESULTS, but I do not know what the pipeline_name or the prediction_path would be.
I hope this makes sense - I am very new to machine learning, so I do not yet understand a lot of things.
I would really appreciate it if you could provide some instructions on how I can achieve this. Thank you.