
Distance to closest semantic wall #526

Closed
eriksandstroem opened this issue Mar 12, 2020 · 3 comments
Labels
question Further information is requested

Comments


eriksandstroem commented Mar 12, 2020

❓ Questions and Help

Hi,
Thanks for providing Habitat! Great work!

I am building an agent that walks around a scene by traversing randomly sampled navigable points and collecting sensor data from the Replica dataset.

I have that working, but I would like better control over the random sampling. I don't want points too close to a wall, since that makes the sensor data less reliable; being close to, say, a chair is not a problem. Currently I use a threshold as follows:

traverseList = []
randPt = sim.pathfinder.get_random_navigable_point()
# distance_to_closest_obstacle caps its return value at max_search_radius,
# so the check below accepts a point only when no obstacle lies within that radius.
dist_to_obs = sim.pathfinder.distance_to_closest_obstacle(randPt, max_search_radius)
if dist_to_obs >= max_search_radius:
    traverseList.append(randPt)

That is, a point is added only if its distance to every obstacle is at least max_search_radius. But this also keeps the agent away from furniture, which I don't want.
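For reference, the rejection loop described above can be sketched end-to-end in plain Python. `StubPathfinder` is a made-up stand-in for `sim.pathfinder` (its distances are invented for the demo); only the accept/reject logic mirrors the snippet above:

```python
import random

class StubPathfinder:
    """Hypothetical stand-in for sim.pathfinder; distances are made up for the demo."""

    def get_random_navigable_point(self):
        return (random.uniform(0.0, 10.0), 0.0, random.uniform(0.0, 10.0))

    def distance_to_closest_obstacle(self, pt, max_search_radius):
        # Pretend the nearest obstacle is a wall along x = 0; like Habitat,
        # the reported distance is capped at max_search_radius.
        return min(pt[0], max_search_radius)

def sample_points(pathfinder, n, max_search_radius):
    """Keep only points whose nearest obstacle is at least max_search_radius away."""
    traverse_list = []
    while len(traverse_list) < n:
        pt = pathfinder.get_random_navigable_point()
        d = pathfinder.distance_to_closest_obstacle(pt, max_search_radius)
        if d >= max_search_radius:  # no obstacle found within the search radius
            traverse_list.append(pt)
    return traverse_list

pts = sample_points(StubPathfinder(), 5, max_search_radius=2.0)
```

As the stub makes visible, the filter rejects any point near any obstacle, wall or furniture alike, which is exactly the limitation raised here.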

My question: given a surface point (e.g. the output of the function "closest_obstacle_surface_point"), is there any way to find its semantic class, so that I can focus only on walls?

Thanks a lot

@erikwijmans
Contributor

> My question: given a surface point (e.g. the output of the function "closest_obstacle_surface_point"), is there any way to find its semantic class, so that I can focus only on walls?

In theory, yes: you can use the bounding boxes in the semantic annotations to approximate this. In practice, however, it still won't give you what you want, because the closest-obstacle search simply finds the closest obstacle regardless of whether it is furniture or a wall, so you may still end up near walls.

Perhaps a better way is to iterate over all the wall objects in the scene (https://aihabitat.org/docs/habitat-sim/habitat_sim.scene.SemanticScene.html#objects, reached from https://aihabitat.org/docs/habitat-sim/habitat_sim.simulator.Simulator.html#semantic_scene) and check whether the distance to each wall's bounding box exceeds some threshold (https://aihabitat.org/docs/habitat-sim/habitat_sim.geo.OBB.html#distance). I believe the walls in Replica are annotated such that they are well approximated by a bounding box, so this should work.
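This wall-filtering idea can be sketched with plain NumPy. For a self-contained demo the boxes are treated as axis-aligned (the real `habitat_sim.geo.OBB.distance` handles oriented boxes); the Habitat calls in the trailing comments are untested assumptions about the API linked above:

```python
import numpy as np

def distance_to_aabb(point, box_min, box_max):
    """Euclidean distance from a point to an axis-aligned box (0 if inside)."""
    point = np.asarray(point, dtype=float)
    # Per-axis overshoot beyond the box, clamped at zero inside the box.
    delta = np.maximum(np.maximum(box_min - point, point - box_max), 0.0)
    return float(np.linalg.norm(delta))

def far_from_walls(point, wall_boxes, min_dist):
    """Accept the point only if it is at least min_dist from every wall box."""
    return all(distance_to_aabb(point, lo, hi) >= min_dist
               for lo, hi in wall_boxes)

# Hypothetical Habitat usage (sketch only, not tested):
# walls = [obj for obj in sim.semantic_scene.objects
#          if obj.category.name() == "wall"]
# wall_boxes = [(w.aabb.center - w.aabb.sizes / 2,
#                w.aabb.center + w.aabb.sizes / 2) for w in walls]
# pt = sim.pathfinder.get_random_navigable_point()
# if far_from_walls(pt, wall_boxes, min_dist=0.5):
#     traverseList.append(pt)
```

Unlike the closest-obstacle threshold, this filter only measures distance to walls, so points near furniture are still accepted.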

@dhruvbatra
Contributor

> I am building an agent that walks around a scene by traversing randomly sampled navigable points and collecting sensor data from the Replica dataset.

Sounds interesting.

One pointer -- @mpiseno is building native "data/sensor acquisition" capability in Habitat-Sim, i.e. the ability to extract images/semantics from various viewpoints across 3D scenes. You can follow his progress here: #518

@mathfac mathfac added the question Further information is requested label Jan 18, 2021
@mathfac
Contributor

mathfac commented Jan 18, 2021

Closing as the question was answered.

@mathfac mathfac closed this as completed Jan 18, 2021