
Suggestions for micro-sam tool #406

Closed
anwai98 opened this issue Mar 1, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@anwai98
Contributor

anwai98 commented Mar 1, 2024

I received some feedback and requests:

  • Possibility to store and export annotation metadata / macro recordings (the steps taken during automatic / interactive instance segmentation), to ensure reproducibility and to allow citing the annotation workflow in publications where relevant
  • Explanation of the "embedding file" - a short explanation of what is stored in / passed via this file would help users understand it better.
    • One more possible addition: users are not prompted correctly on what to do when they use tiling in micro_sam.annotator without passing an embedding file (the precomputed image embeddings in .zarr format). It would be nice to point users towards how to precompute embeddings (via a link, or a brief mention of the relevant scripts).

Some observations from my side:

  • micro_sam.precompute_embeddings expects the -o argument for storing the image embeddings. It would be nice (and more consistent) to expose this as -e instead, matching annotator_2d etc., or vice versa.
  • (Minor) micro_sam.annotator_2d only accepts --model_type for choosing the model. It would be nice to also expose -m as a short form.
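A hedged sketch of the inconsistency described above. The flag names (-o, -e, --model_type, -m) are taken from this issue; the input flag and the exact CLI syntax are assumptions and may differ between micro_sam versions:

```shell
# Precompute embeddings: currently takes the output path via -o.
# (-i as the input flag is an assumption for illustration.)
micro_sam.precompute_embeddings -i image.tif -o embeddings.zarr --model_type vit_b

# The annotators expect the embedding path via -e instead:
micro_sam.annotator_2d -i image.tif -e embeddings.zarr --model_type vit_b

# Suggested: unify on one flag name, and expose -m as a short form of --model_type.
```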
anwai98 added the enhancement (New feature or request) label on Mar 4, 2024
@anwai98
Contributor Author

anwai98 commented Mar 4, 2024

Updates based on discussions today:

  • Save the information from interactive annotations (if possible, expose a one-click option to export it)
  • Advanced addition: store the projected prompts (preferably in an additional layer that is not shown by default; ideally users would opt in via some option) to aid interactive improvement of volumetric segmentation.

@anwai98
Contributor Author

anwai98 commented Mar 4, 2024

Providing a demo notebook for:

  • AMG / AIS (automatic segmentation methods)
  • Interactive segmentation (using box and/or points)

Hint: something like this - https://github.com/facebookresearch/segment-anything/blob/main/notebooks/automatic_mask_generator_example.ipynb
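A minimal sketch of what such a notebook could cover, following the linked segment-anything example. The checkpoint path and the test image are placeholders, and micro_sam wraps these models with its own helpers, which the notebook would use instead:

```python
import numpy as np
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator, SamPredictor

# Placeholder checkpoint path: the SAM weights must be downloaded beforehand.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")

# Placeholder for a real RGB image (HxWx3, uint8).
image = np.zeros((512, 512, 3), dtype=np.uint8)

# Automatic mask generation (AMG): returns one dict per predicted instance.
mask_generator = SamAutomaticMaskGenerator(sam)
masks = mask_generator.generate(image)

# Interactive segmentation with point and box prompts.
predictor = SamPredictor(sam)
predictor.set_image(image)
point_masks, scores, _ = predictor.predict(
    point_coords=np.array([[256, 256]]),  # (x, y) of a positive click
    point_labels=np.array([1]),
    multimask_output=True,
)
box_masks, _, _ = predictor.predict(
    box=np.array([100, 100, 400, 400]),  # x0, y0, x1, y1
    multimask_output=False,
)
```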

@constantinpape
Contributor

I will close this since most points here are either solved or covered by other issues.

  • Possibility to store and export annotation metadata / macro recordings (the steps taken during automatic / interactive instance segmentation), to ensure reproducibility and to allow citing the annotation workflow in publications where relevant

This is being implemented as part of #408.

  • Explanation of the "embedding file" - a short explanation of what is stored in / passed via this file would help users understand it better.

    • One more possible addition: users are not prompted correctly on what to do when they use tiling in `micro_sam.annotator` without passing an embedding file (the precomputed image embeddings in .zarr format). It would be nice to point users towards how to precompute embeddings (via a link, or a brief mention of the relevant scripts).

This is a bit outdated: it is now possible to compute tiled embeddings without specifying an embedding file. In any case, we should improve the documentation on this, but that is already covered by other issues.

  • micro_sam.precompute_embeddings expects the -o argument for storing the image embeddings. It would be nice (and more consistent) to expose this as -e instead, matching annotator_2d etc., or vice versa.

    • (Minor) micro_sam.annotator_2d only accepts --model_type for choosing the model. It would be nice to also expose -m as a short form.

Both changes are now implemented, together with better documentation of the CLI script parameters.

  • Save the information from interactive annotations (if possible, expose a one-click option to export it)

    • Advanced addition: store the projected prompts (preferably in an additional layer that is not shown by default; ideally users would opt in via some option) to aid interactive improvement of volumetric segmentation.

Also covered by #408.
