
Feat/update archive #293

Merged: 119 commits, Sep 5, 2024

Commits (119)
27eda37
Started adding an update method to mission, to allow for easy updatin…
DavidT3 Aug 31, 2024
903e9a7
Trying to make the update method of BaseMission re-run the filtering …
DavidT3 Sep 2, 2024
e2923a1
Made sure the update method of BaseMission resets the locked attribut…
DavidT3 Sep 2, 2024
c246002
Moved the locked attribute reset to before the reset filter call
DavidT3 Sep 2, 2024
a7a92a5
I was daft and was iterating through filtered_obs_info rather than fi…
DavidT3 Sep 2, 2024
2c960d5
The iterating through filtering operations in update doesn't seem to …
DavidT3 Sep 2, 2024
ffb6e35
Switched update() of BaseMission to make a copy of the filtering oper…
DavidT3 Sep 2, 2024
c95580b
The mission update method can now download data if requested.
DavidT3 Sep 2, 2024
97d73d6
Added more docstring to update()
DavidT3 Sep 2, 2024
1a660f6
Didn't change anything that matters but having this commit so I can r…
DavidT3 Sep 2, 2024
6a76d89
Made the mission save method save the science and proprietary usable …
DavidT3 Sep 2, 2024
d53c2c3
Added internal attributes to BaseMission to store the saved usable st…
DavidT3 Sep 2, 2024
c5a1415
Changed the loading in of usable flags from save state to be in dicti…
DavidT3 Sep 2, 2024
f0cb9d9
Started to add checking for any data selection changes to MissionBase…
DavidT3 Sep 2, 2024
b7b0300
Forgot the .all() call when checking to see if all previously selecte…
DavidT3 Sep 2, 2024
11ef784
Testing out whether the finding of ObsIDs that have been removed is w…
DavidT3 Sep 2, 2024
c8f6b3a
Testing out whether the finding of ObsIDs that have been removed is w…
DavidT3 Sep 2, 2024
a5d1a18
The arrays of new and removed ObsIDs was inverted from what it should…
DavidT3 Sep 2, 2024
443bc00
Now checking to see if any previously unusable observations are now u…
DavidT3 Sep 2, 2024
483484a
Check array is coming out formatted weird because of how I was using …
DavidT3 Sep 2, 2024
4f1dca2
Trying a different approach, comparing dictionaries
DavidT3 Sep 2, 2024
2cb4b45
Was comparing to the wrong saved usable dictionary at one point
DavidT3 Sep 2, 2024
24bc07d
Made sure that I wouldn't be comparing dictionaries that include an O…
DavidT3 Sep 2, 2024
8d7197d
Added a bunch of comments to the update() method, as well an update m…
DavidT3 Sep 2, 2024
0c554c1
Made the update() method of BaseMission record which ObsIDs have had …
DavidT3 Sep 2, 2024
31a17ca
Had my boolean logic flipped again wrt checking if any usable states …
DavidT3 Sep 2, 2024
526f8bb
Wasn't comparing proprietary usable state to the correct dictionary w…
DavidT3 Sep 2, 2024
57208d6
Added a little extra to the update() docstring in BaseMission
DavidT3 Sep 2, 2024
0ba7ab9
Started the update() method of Archive. For issue #69
DavidT3 Sep 2, 2024
e708a06
Made sure all the randint calls are taking an integer as an argument …
DavidT3 Sep 2, 2024
9cf34eb
Added a new exception
DavidT3 Sep 2, 2024
4de8fa9
Started to implement function configuration capture for storage by ar…
DavidT3 Sep 2, 2024
16c8737
Started to implement function configuration capture for storage by ar…
DavidT3 Sep 2, 2024
aa8c9ef
Attempting to understand why the config-capture code can't currently …
DavidT3 Sep 2, 2024
5c268e8
Should have altered the config-capture code so that any kwargs that a…
DavidT3 Sep 2, 2024
fbe28ca
Added a process_configurations property getter (and accompanying attr…
DavidT3 Sep 2, 2024
9570a03
Added a property setter for process_configurations
DavidT3 Sep 2, 2024
3cab4f3
Hopefully XMM processing should now store the processing configuratio…
DavidT3 Sep 2, 2024
3838eb6
Misunderstood what was being captured by the sas_call decorator again…
DavidT3 Sep 2, 2024
5a90e3c
The process_configurations properties' attribute wasn't being setup w…
DavidT3 Sep 2, 2024
c542e84
Diagnostics added to figure out why fit confs aren't being stored lik…
DavidT3 Sep 2, 2024
8f3855a
I am silly and was not writing process confs to the archive class bec…
DavidT3 Sep 2, 2024
f9098bd
Removed some diagnostics.
DavidT3 Sep 2, 2024
8f3f0bd
Made process run configurations be saved with the rest of the informa…
DavidT3 Sep 2, 2024
a3a121b
Converting the process run configuations dictionary so that quantitie…
DavidT3 Sep 2, 2024
9011725
Hopefully made sure that any quantities are loaded back in (when load…
DavidT3 Sep 2, 2024
b420be1
Was trying to convert the name of the key into a string rather than t…
DavidT3 Sep 2, 2024
c8fa2d9
Needed to check if parameter type was string when loading back in the…
DavidT3 Sep 2, 2024
a39834e
Changed how process configurations are stored, and we now retain the …
DavidT3 Sep 2, 2024
5264262
Started adding the proc_lookup dictionary so that processing step nam…
DavidT3 Sep 3, 2024
5387ff7
Started adding the proc_lookup dictionary so that processing step nam…
DavidT3 Sep 3, 2024
bdc532a
Caused some circular imports and trying to fix them.
DavidT3 Sep 3, 2024
4de376e
Changed PROC_LOOKUP to a local import in sas_call - SO MANY circular …
DavidT3 Sep 3, 2024
d371b19
Restored the cif_build entry in PROC_LOOKUP xmm_pointed.
DavidT3 Sep 3, 2024
ef31458
Added comments to the update() method of Archive.
DavidT3 Sep 3, 2024
0130ac4
Made the gauss_fit_lims be able to be a list rather than just a tuple…
DavidT3 Sep 3, 2024
9b263e3
The cleaned_evt_lists (as called by full xmm process) processing conf…
DavidT3 Sep 3, 2024
bdedf42
The cleaned_evt_lists (as called by full xmm process) processing conf…
DavidT3 Sep 3, 2024
704d236
Think I may have fixed the problem of positional argument values bein…
DavidT3 Sep 3, 2024
de1bd6b
Removed some diagnostics
DavidT3 Sep 3, 2024
cc6b9a6
Hopefully added the process configuration saving to the eROSITA esass…
DavidT3 Sep 3, 2024
8c3a8a4
Added entries to the PROC_LOOKUP for all the missions, though they do…
DavidT3 Sep 3, 2024
17f0443
Added the eROSITA mappings of function name to actual function in PRO…
DavidT3 Sep 3, 2024
fb578d6
Started adding checks for if a process has already been run for any o…
DavidT3 Sep 3, 2024
10cb8c4
Figuring out if my checks for previous runs of a processing step are …
DavidT3 Sep 3, 2024
8e608e2
Added better checks for if processes have been run before - to all XM…
DavidT3 Sep 3, 2024
1d4a604
Removed the code I started to add to sas_call to check if the process…
DavidT3 Sep 3, 2024
7bb46af
Added previous process run checking to emanom. Indirectly for issue #69
DavidT3 Sep 3, 2024
5798ce5
Ensured that espfilt will not run on ObsID-inst-subexp combos that it…
DavidT3 Sep 3, 2024
f951de0
Ensured that espfilt will not run on ObsID-inst-subexp combos that it…
DavidT3 Sep 3, 2024
30049de
Made sure that cif build and odf ingest shouldn't run on observations…
DavidT3 Sep 3, 2024
1c499f6
Mispelled the function name in the process success check in cleaned_e…
DavidT3 Sep 3, 2024
0de3f9b
cif_build (and any other first process) needs a special try except wh…
DavidT3 Sep 3, 2024
929c367
Added another part to the odf_ingest check of process_success, to ens…
DavidT3 Sep 3, 2024
4430975
Added checks to all functions in xmm.assemble to ensure that the proc…
DavidT3 Sep 3, 2024
b728458
Added checks to all functions in xmm.check to ensure that the process…
DavidT3 Sep 3, 2024
5a4ec57
Added checks to all functions in xmm.clean to ensure that the process…
DavidT3 Sep 3, 2024
58a461e
Changed how the process_success setter adds info to the storage dicti…
DavidT3 Sep 3, 2024
de02440
Changed how the process_errors and process_warnings setters work, so …
DavidT3 Sep 3, 2024
402cf98
Changed how the raw_process_errors setter work, so it now adds entrie…
DavidT3 Sep 3, 2024
5b33d03
Changed how the process_logs setter work, so it now adds entries on a…
DavidT3 Sep 3, 2024
06eb9a9
Changed how the process_extra_info setter work, so it now adds entrie…
DavidT3 Sep 3, 2024
d0cbab7
Made the process_configurations property not issue a warning in case…
DavidT3 Sep 3, 2024
bb54284
Messed up indenting in the new version of process_extra_info setter -…
DavidT3 Sep 3, 2024
fa90ce8
Added diagnostics to check why updated processing isn't working now t…
DavidT3 Sep 3, 2024
106023a
Added diagnostics to check why updated processing isn't working now t…
DavidT3 Sep 3, 2024
86973f5
Think the mission update method isn't downloading the new data?
DavidT3 Sep 3, 2024
258356e
The mission update method WASN'T downloading the new data - this may …
DavidT3 Sep 3, 2024
04f480b
The mission update method WASN'T downloading the new data - this may …
DavidT3 Sep 3, 2024
553a9be
Trying to double check the new obsid selection when updating a mission
DavidT3 Sep 3, 2024
776a903
Hopefully avoided another key error in odf_ingest
DavidT3 Sep 3, 2024
9831695
Made the reloading of archives from save re-run the download() method…
DavidT3 Sep 3, 2024
5008569
Maybe ironed out the kinks in updating and successfully running the p…
DavidT3 Sep 3, 2024
1114ca7
Hopefully added previous-process-run checking to the cleaned_evt_list…
DavidT3 Sep 3, 2024
d6abe84
Hopefully added previous-process-run checking to the flaregti functio…
DavidT3 Sep 3, 2024
24a4933
merge_subexposures for XMM was not identifying when it had already be…
DavidT3 Sep 3, 2024
18900d4
merge_subexposures for XMM was not identifying when it had already be…
DavidT3 Sep 3, 2024
7c214a0
merge_subexposures for XMM was not identifying when it had already be…
DavidT3 Sep 3, 2024
dd44a3b
Being cheeky and removing a warning from final_process_success
DavidT3 Sep 3, 2024
4588e3e
Added a save command to the update() method of Archive. For issue #69
DavidT3 Sep 3, 2024
402f771
I am a fool and was double copying the failed ObsIDs to the failed da…
DavidT3 Sep 3, 2024
b4a2105
Trying to fix an error I get when testing these features with eROSITA…
DavidT3 Sep 4, 2024
79b529b
Still trying to diagnose why the error is popping up
DavidT3 Sep 4, 2024
5ada97e
REALLY Still trying to diagnose why the error is popping up
DavidT3 Sep 4, 2024
0d5eaaa
It is proprietary usable's fault in the eROSITA missions but I don't …
DavidT3 Sep 4, 2024
5bb1d79
It is proprietary usable's fault in the eROSITA missions but I don't …
DavidT3 Sep 4, 2024
214abde
Think it should be fixed now, using the array method tolist() actuall…
DavidT3 Sep 4, 2024
8b2f88f
Made the eRASS:1DE mission work for tiles that are shared between MPE…
DavidT3 Sep 4, 2024
64ddb31
Changed the path exists check in the last_process_function, it wasn't…
DavidT3 Sep 4, 2024
a5b4153
Altered how _prepare_erosita_info is called in esass_call - now will …
DavidT3 Sep 4, 2024
9fc33f0
Changed how preprocessed_missions works - checks for specific backend…
DavidT3 Sep 4, 2024
e3e1ee3
Moved the import of find_esass in Archive to avoid circular imports
DavidT3 Sep 4, 2024
b4274ee
Changed mention of the 'update_filtering' method of mission to 'updat…
DavidT3 Sep 5, 2024
0cc5d87
Added an 'Updating an existing Archive' section to the Archives tutor…
DavidT3 Sep 5, 2024
3c86dde
Added _version and _last_version attributes to the Archive class, as …
DavidT3 Sep 5, 2024
7f3a082
Added a version property to Archive. Indirectly for issue #69
DavidT3 Sep 5, 2024
2f56109
Added a version print out to the info() method of archive.
DavidT3 Sep 5, 2024
4df41ca
Added a v to the starting 0 version
DavidT3 Sep 5, 2024
136a087
Now have made sure that any version change will save the missions, bu…
DavidT3 Sep 5, 2024
350 changes: 282 additions & 68 deletions daxa/archive/base.py

Large diffs are not rendered by default.
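The archive/base.py diff is not rendered, but the commit history above records that the Archive class gained `_version` and `_last_version` attributes, a `version` property whose starting value is "v0", and an `update()` method that bumps the version (and saves) when any mission reports changes. A minimal standalone sketch of that version bookkeeping — the class and attribute names here mirror the commit messages but are illustrative, not the actual DAXA API:

```python
class MiniMission:
    """Stand-in for a DAXA mission exposing its post-update metadata dictionary."""
    def __init__(self, meta):
        self.updated_meta_info = meta


class MiniArchive:
    """Illustrative sketch of the version bookkeeping described in the commit log."""

    def __init__(self, missions):
        self._missions = missions
        # Commit 4df41ca notes the starting version gained a 'v' prefix
        self._version = "v0"
        self._last_version = None

    @property
    def version(self):
        return self._version

    def update(self):
        # If any mission reports a change in its update metadata, bump the version
        changed = any(m.updated_meta_info.get("sel_obs_change", False)
                      or m.updated_meta_info.get("proprietary_usable_change", False)
                      for m in self._missions)
        if changed:
            self._last_version = self._version
            self._version = "v{}".format(int(self._version[1:]) + 1)


arc = MiniArchive([MiniMission({"sel_obs_change": True})])
arc.update()
print(arc.version)  # a changed mission bumps v0 -> v1
```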

21 changes: 20 additions & 1 deletion daxa/exceptions.py
@@ -1,5 +1,5 @@
# This code is a part of the Democratising Archival X-ray Astronomy (DAXA) module.
# Last modified by David J Turner (turne540@msu.edu) 16/04/2024, 19:47. Copyright (c) The Contributors
# Last modified by David J Turner (turne540@msu.edu) 02/09/2024, 16:59. Copyright (c) The Contributors


class DAXAConfigError(Exception):
@@ -467,3 +467,22 @@ def __str__(self):
else:
return 'PreProcessedNotAvailableError has been raised'


class DAXADeveloperError(Exception):
def __init__(self, *args):
"""
Raised when an error has occurred that needs the attention of developers.

:param args: Optional positional arguments; the first is treated as the error message.
"""
if args:
self.message = args[0]
else:
self.message = None

def __str__(self):
if self.message:
return '{0} '.format(self.message)
else:
return 'DAXADeveloperError has been raised'
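The new exception stores an optional message from its first positional argument. A self-contained usage sketch (the class is reproduced here so the snippet runs standalone; the trailing space in the formatted message follows the implementation above):

```python
class DAXADeveloperError(Exception):
    """Raised when an error has occurred that needs the attention of developers."""

    def __init__(self, *args):
        # The first positional argument, if given, is stored as the message
        self.message = args[0] if args else None

    def __str__(self):
        if self.message:
            return '{0} '.format(self.message)
        else:
            return 'DAXADeveloperError has been raised'


try:
    raise DAXADeveloperError("process lookup entry missing")
except DAXADeveloperError as err:
    print(str(err))
```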
181 changes: 178 additions & 3 deletions daxa/mission/base.py
@@ -1,5 +1,5 @@
# This code is a part of the Democratising Archival X-ray Astronomy (DAXA) module.
# Last modified by David J Turner (turne540@msu.edu) 24/04/2024, 11:38. Copyright (c) The Contributors
# Last modified by David J Turner (turne540@msu.edu) 04/09/2024, 12:09. Copyright (c) The Contributors
import inspect
import json
import os.path
@@ -267,6 +267,16 @@ def __init__(self):
# the missions that need to set it to True (e.g. Chandra)
self._one_inst_per_obs = False

# These are used if the mission is reinstated from a save, and let us know what the usability states were
# when the mission was saved - useful for the update() method, as we can see if anything has changed
self._saved_science_usable = None
self._saved_prop_usable = None

# This dictionary is for any meta data (i.e. what observations changed, has anything flipped from proprietary
# to non-proprietary etc.) related to updating a mission (the update() method). This will be useful for
# an archive instance containing this mission, as it will be used to update the archive version
self._update_meta_info = {}

# Defining properties first
@property
@abstractmethod
@@ -756,6 +766,18 @@ def one_inst_per_obs(self) -> bool:
"""
return self._one_inst_per_obs

@property
def updated_meta_info(self) -> dict:
"""
This property returns a dictionary containing information about what changed during the last update of this
mission, populated only after running the update() method. This is useful for Archives containing this mission
as they can use it to update their version.

:return: The dictionary containing information about the update to this mission.
:rtype: dict
"""
return self._update_meta_info

# Then define internal methods
def _load_state(self, save_file_path: str):
"""
@@ -833,6 +855,13 @@ def _load_state(self, save_file_path: str):
# Finally, we store the restored dictionary in the filtering operations attribute
self._filtering_operations = reinstated_filt_ops

# These simply store the 'usable' states of the ObsIDs that were selected in the save state we're loading
# in - we primarily want these so that if the mission is updated, we know what changed.
self._saved_science_usable = {obs_id: save_dict['science_usable'][ind]
for ind, obs_id in enumerate(save_dict['selected_obs'])}
self._saved_prop_usable = {obs_id: save_dict['proprietary_usable'][ind]
for ind, obs_id in enumerate(save_dict['selected_obs'])}

def _obs_info_checks(self, new_info: pd.DataFrame):
"""
Performs very simple checks on new inputs into the observation information dataframe, ensuring it at
@@ -2271,15 +2300,29 @@ def save(self, save_root_path: str, state_file_name: str = None):
# by re-running the stored filtering steps, rather than comparing a stored list of ObsIDs to a newly
# downloaded one
sel_obs = self.filtered_obs_ids
# We also wish to save which ObsIDs were considered 'usable' - the first usable type is scientifically
# usable, for which every mission has a column - some of them are all True, but some missions do define
# criteria for things that aren't scientifically usable
science_usable = self.filtered_obs_info['science_usable']
# Now we also need to define whether the ObsID is currently in a proprietary period - not every mission has
# such a concept, so they don't all have that column. In the case where that column doesn't exist we'll make
# it all True, otherwise we'll extract the values from the filtered obs info dataframe
if 'proprietary_usable' in self.filtered_obs_info.columns:
prop_usable = self.filtered_obs_info['proprietary_usable']
else:
prop_usable = np.full(len(sel_obs), True)

# It is possible, if someone isn't paying attention, that the save method could be triggered when there aren't
# actually any observations left - that doesn't really make sense to me, so we'll throw an error
if len(sel_obs) == 0:
raise NoObsAfterFilterError("There are no observations associated with this {mn} mission after "
"filtering, so the mission state cannot be saved.".format(mn=self.pretty_name))

# Make sure to add the sel_obs dictionary into the overall one we're hoping to store
# Make sure to add the sel_obs list into the overall one we're hoping to store (as well as the usable
# flag lists)
mission_data['selected_obs'] = list(sel_obs)
mission_data['science_usable'] = science_usable.tolist()
mission_data['proprietary_usable'] = prop_usable.tolist()

# We can now store the filtering operations (and their configurations), as well as the order they were run in,
# which means a reinstated mission can re-run the same filtering on an updated data set. HOWEVER, there is
@@ -2326,12 +2369,144 @@
filt_op['arguments'][arg_name] = {'skycoord': {'ra': ra, 'dec': dec, 'frame': frame}}

mission_data['filtering_operations'] = filt_ops

# Now we write the required information to the state file path
with open(miss_file_path, 'w') as stateo:
json_str = json.dumps(mission_data, indent=4)
stateo.write(json_str)

def update(self, download_new: bool = True):
"""
This method is meant to update the selected observations of a mission which has been loaded in from the
save state. The filtering operations from the saved state will be re-applied in the same order (and with the
same configurations) as they were originally. This is designed to allow mission data selections to be easily
updated to reflect newly available observations; particularly useful for large samples of objects.

NOTE - THIS METHOD WILL NOT AUTOMATICALLY CALL THE save() METHOD.

:param bool download_new: Controls whether any newly selected data from the update should be downloaded
automatically by this method. Default is True, the download type (i.e. with products or without) will
be defined by what was originally downloaded by this mission. If no data was downloaded in the original
form of this mission then the download() method will have to be run after this method.
"""
if len(self.filtering_operations) == 0:
# If no filtering operations at all (or only filtering on ObsID, which isn't recorded because it can't be
# updated) have been run, we just warn the user and do nothing else
warn("No updatable filtering operations have been run for {pn}.".format(pn=self.pretty_name), stacklevel=2)
else:
# In this case there ARE filtering operations that we want to re-apply to the updated observation
# database

# We need to reset the locked attribute, otherwise the mission isn't going to let us re-run
# anything. This must be done through altering the attribute, rather than the property setter, as the
# property setter only allows a change from False -> True, not the other way
self._locked = False

# We need to make a copy of the filtering operations before the reset_filter method is called (as it wipes
# the operation history)
filt_op_copy = deepcopy(self.filtering_operations)

# Now that we've unlocked the mission instance, and copied the filtering operations, we can reset the
# filter - this will allow us to again select from the entire stock of observations for the current
# mission
self.reset_filter()
# Now we can work through the stored history of filtering operations - in the order they were used
for cur_filt in filt_op_copy:
cl_meth = getattr(self, cur_filt['name'])
cl_meth(**cur_filt['arguments'])

# The ObsIDs that were selected in the save state that was loaded in, we need to compare to these
og_sel_obs = np.array(list(self._saved_prop_usable.keys()))

# Now we want to determine if the observation selection has changed AND/OR whether any of the previously
# selected observations have become usable (most likely because they've come out of a proprietary period)
# First, lets just see if the selected observations are different in any way from the saved selected obs
if set(self.filtered_obs_ids) != set(og_sel_obs):
# This describes whether the selected observations have changed at all
obs_sel_change = True

# Now we want to know if there are any ObsIDs selected NOW that weren't there in the save state
cur_in_save_obs_arr = np.isin(self.filtered_obs_ids, og_sel_obs)
new_obs_ids = self.filtered_obs_ids[~cur_in_save_obs_arr]
# One bool summary of if there are new ObsIDs
obs_sel_add = not cur_in_save_obs_arr.all()

# We also want to know if there are any ObsIDs in the save state but AREN'T selected anymore - this
# can happen as some of the missions are 'live' and are having their datasets constantly altered
save_in_cur_obs_arr = np.isin(og_sel_obs, self.filtered_obs_ids)
rem_obs_ids = og_sel_obs[~save_in_cur_obs_arr]
# One bool summary of if there are removed ObsIDs
obs_sel_rem = not save_in_cur_obs_arr.all()
# In this case the selected ObsIDs (current and in the save state) are identical
else:
obs_sel_change = False
new_obs_ids = np.array([])
obs_sel_add = False

obs_sel_rem = False
rem_obs_ids = np.array([])

# This is a dictionary of ObsIDs and their science usable values, but only of the ObsIDs that are not
# newly selected as we want to do a like for like comparison with the save state science usable dict
oi_sc_dict = {row['ObsID']: row['science_usable']
for row_ind, row in self.filtered_obs_info.iterrows() if row['ObsID'] not in new_obs_ids}
# We do the comparison, making sure to get rid of any removed ObsIDs in the save state dict that are no
# longer present in the filtered dataset (otherwise we would get an artificial mismatch between the
# science usable dictionaries
saved_sc_us = {oi: us for oi, us in self._saved_science_usable.items() if oi not in rem_obs_ids}
sc_us_ch = saved_sc_us != oi_sc_dict
if sc_us_ch:
# You could argue that we should have just done this from the start, but I think the dict
# comparisons are a better way to identify whether anything has changed at first.
# This dictionary contains the ObsIDs of those observations that have had their science-usable state
# changed as keys, and the new usable values as values
which_sc_us_ch = {oi: oi_sc_dict[oi] for oi, save_us in saved_sc_us.items()
if save_us != oi_sc_dict[oi]}
else:
which_sc_us_ch = {}

# We repeat that same process (see above) with the proprietary usable column (much more likely to have
# changed than the science usable column) - though we only do that check if there IS a proprietary usable
# column. Remember that not every mission has a proprietary period.
# We create this empty dictionary that will be overwritten if it needs to be - it's just neater here
which_pr_us_ch = {}
if 'proprietary_usable' in self.filtered_obs_info.columns:
# This is all the exact same process as above - see those comments
oi_pr_dict = {row['ObsID']: row['proprietary_usable']
for row_ind, row in self.filtered_obs_info.iterrows() if row['ObsID'] not in new_obs_ids}
saved_pr_us = {oi: us for oi, us in self._saved_prop_usable.items() if oi not in rem_obs_ids}
pr_us_ch = saved_pr_us != oi_pr_dict

if pr_us_ch:
which_pr_us_ch = {oi: oi_pr_dict[oi] for oi, save_us in saved_pr_us.items()
if save_us != oi_pr_dict[oi]}
else:
# If the mission does not have a proprietary period, then of course it will never have changed for any
# of our ObsIDs
pr_us_ch = False

self._update_meta_info['sel_obs_change'] = obs_sel_change
# These contain degenerate info, but might as well provide the option of not using a len check on the
# new/removed ObsID arrays
self._update_meta_info['any_obs_add'] = obs_sel_add
self._update_meta_info['new_obs_ids'] = new_obs_ids
self._update_meta_info['any_obs_removed'] = obs_sel_rem
self._update_meta_info['removed_obs_ids'] = rem_obs_ids
# Now we can store whether the usability state of anything has changed - again this is degenerate info
self._update_meta_info['science_usable_change'] = sc_us_ch
self._update_meta_info['which_changed_science_usable'] = which_sc_us_ch
self._update_meta_info['proprietary_usable_change'] = pr_us_ch
self._update_meta_info['which_changed_proprietary_usable'] = which_pr_us_ch

# This runs the download process for any newly selected observations, if the update method was
# called with the download_new argument set to True. We try to match the downloaded data to the type
# that was originally downloaded
if download_new:
self._download_done = False
try:
self.download(download_products='preprocessed' in self.downloaded_type)
except DAXANotDownloadedError:
self.download()

def info(self):
print("\n-----------------------------------------------------")
print("Number of Observations - {}".format(len(self)))
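The change detection at the heart of the new update() method boils down to ObsID membership tests plus like-for-like dictionary comparisons on the usable flags, as in the diff above. A condensed standalone sketch of that logic, with made-up ObsIDs (an observation exiting its proprietary period flips its flag from False to True):

```python
import numpy as np

# Saved state: ObsID -> proprietary-usable flag (as stored by _load_state)
saved_prop_usable = {"0001": False, "0002": False, "0003": True}
# Current filtered selection after re-running the stored filtering operations
current_obs = np.array(["0002", "0003", "0004"])
current_prop_usable = {"0002": True, "0003": True, "0004": True}

og_sel_obs = np.array(list(saved_prop_usable.keys()))

# New ObsIDs: selected now but absent from the save state
new_obs_ids = current_obs[~np.isin(current_obs, og_sel_obs)]
# Removed ObsIDs: in the save state but no longer selected
rem_obs_ids = og_sel_obs[~np.isin(og_sel_obs, current_obs)]

# Like-for-like comparison: drop new ObsIDs from the current dict and removed
# ObsIDs from the saved dict, so membership changes don't masquerade as flag changes
cur_cmp = {oi: us for oi, us in current_prop_usable.items() if oi not in new_obs_ids}
sav_cmp = {oi: us for oi, us in saved_prop_usable.items() if oi not in rem_obs_ids}
prop_usable_changed = sav_cmp != cur_cmp

print(new_obs_ids.tolist(), rem_obs_ids.tolist(), prop_usable_changed)
```

Here "0004" is newly selected, "0001" has dropped out of the dataset, and "0002" has come out of its proprietary period, so the comparison reports a usability change.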
13 changes: 8 additions & 5 deletions daxa/mission/erosita.py
@@ -1,5 +1,5 @@
# This code is a part of the Democratising Archival X-ray Astronomy (DAXA) module.
# Last modified by David J Turner (turne540@msu.edu) 23/08/2024, 11:42. Copyright (c) The Contributors
# Last modified by David J Turner (turne540@msu.edu) 04/09/2024, 12:38. Copyright (c) The Contributors

import gzip
import os
@@ -919,10 +919,13 @@ def __init__(self, insts: Union[List[str], str] = None, save_file_path: str = No

# We set up the eROSITA file name templates, so that the user (or other parts of DAXA) can retrieve paths
# to the event lists, images, exposure maps, and background maps that can be downloaded
self._template_evt_name = "EXP_010/em01_{oi}_020_EventList_c010.fits"
self._template_img_name = "EXP_010/em01_{oi}_02{eb}_Image_c010.fits"
self._template_exp_name = "DET_010/em01_{oi}_02{eb}_ExposureMap_c010.fits"
self._template_bck_name = "DET_010/em01_{oi}_02{eb}_BackgrImage_c010.fits"
# The wildcards are needed because that second character describes the 'owner' of the tile - m=MPE,
# c=calibration, b=MPE+IKE - guess we could have populated that from the observation info table but ah well
# if it works...
self._template_evt_name = "EXP_010/e*01_{oi}_020_EventList_c010.fits"
self._template_img_name = "EXP_010/e*01_{oi}_02{eb}_Image_c010.fits"
self._template_exp_name = "DET_010/e*01_{oi}_02{eb}_ExposureMap_c010.fits"
self._template_bck_name = "DET_010/e*01_{oi}_02{eb}_BackgrImage_c010.fits"

# Call the name property to set up the name and pretty name attributes
self.name
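The switch from a literal "em01" prefix to "e*01" means file names are matched by wildcard, since the second character encodes the tile owner (m=MPE, c=calibration, b=MPE+IKE). A quick sketch of how such a template resolves against candidate file names — `fnmatchcase` is used here for illustration; DAXA's actual file lookup may differ:

```python
from fnmatch import fnmatchcase

# Event-list template from the diff above, with the ObsID slot filled in
template_evt_name = "EXP_010/e*01_{oi}_020_EventList_c010.fits"
pattern = template_evt_name.format(oi="134135")

candidates = [
    "EXP_010/em01_134135_020_EventList_c010.fits",  # MPE-owned tile
    "EXP_010/eb01_134135_020_EventList_c010.fits",  # shared MPE+IKE tile
    "EXP_010/em01_999999_020_EventList_c010.fits",  # different ObsID
]
matches = [c for c in candidates if fnmatchcase(c, pattern)]
print(matches)  # only the two files for ObsID 134135 match
```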
43 changes: 42 additions & 1 deletion daxa/process/__init__.py
@@ -1,2 +1,43 @@
# This code is a part of the Democratising Archival X-ray Astronomy (DAXA) module.
# Last modified by David J Turner (turne540@msu.edu) 23/11/2022, 16:42. Copyright (c) The Contributors
# Last modified by David J Turner (turne540@msu.edu) 03/09/2024, 10:23. Copyright (c) The Contributors
from daxa.process.erosita import flaregti
from daxa.process.erosita.assemble import cleaned_evt_lists as ecleaned_evt_lists
from daxa.process.xmm import epchain, emchain, rgs_events, rgs_angles, cleaned_rgs_event_lists, cleaned_evt_lists, \
merge_subexposures, emanom, espfilt, cif_build, odf_ingest

PROC_LOOKUP = {'xmm_pointed': {'epchain': epchain,
'emchain': emchain,
'rgs_events': rgs_events,
'rgs_angles': rgs_angles,
'cleaned_rgs_event_lists': cleaned_rgs_event_lists,
'cleaned_evt_lists': cleaned_evt_lists,
'merge_subexposures': merge_subexposures,
'emanom': emanom,
'espfilt': espfilt,
'cif_build': cif_build,
'odf_ingest': odf_ingest},

'xmm_slew': {'epchain': epchain,
'emchain': emchain,
'rgs_events': rgs_events,
'rgs_angles': rgs_angles,
'cleaned_rgs_event_lists': cleaned_rgs_event_lists,
'cleaned_evt_lists': cleaned_evt_lists,
'merge_subexposures': merge_subexposures,
'emanom': emanom,
'espfilt': espfilt,
'cif_build': cif_build,
'odf_ingest': odf_ingest},
'erosita_calpv': {'cleaned_evt_lists': ecleaned_evt_lists,
'flaregti': flaregti},
'erosita_all_sky_de_dr1': {'cleaned_evt_lists': ecleaned_evt_lists,
'flaregti': flaregti},
'nustar_pointed': {},
'chandra': {},
'rosat_all_sky': {},
'rosat_pointed': {},
'swift': {},
'suzaku': {},
'asca': {},
'integral_pointed': {}
}
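PROC_LOOKUP lets a processing-step name stored in a saved archive (a string) be resolved back to the callable that implements it when the archive is updated. A standalone sketch of that dispatch, with stub functions standing in for the real DAXA processing steps:

```python
def epchain(archive):
    # Stub standing in for daxa.process.xmm.epchain
    return "epchain ran on {}".format(archive)

def cif_build(archive):
    # Stub standing in for daxa.process.xmm.cif_build
    return "cif_build ran on {}".format(archive)

# Mirrors the structure above: mission name -> {step name -> function};
# missions without implemented processing map to empty dictionaries
PROC_LOOKUP = {"xmm_pointed": {"epchain": epchain, "cif_build": cif_build},
               "chandra": {}}

# Re-running a stored processing configuration by name
mission, step = "xmm_pointed", "epchain"
func = PROC_LOOKUP[mission][step]
print(func("my_archive"))
```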