diff --git a/.github/workflows/testing.yml b/.github/workflows/testing.yml index 347d73f7..1d3d292c 100644 --- a/.github/workflows/testing.yml +++ b/.github/workflows/testing.yml @@ -22,8 +22,8 @@ jobs: - name: oldest supported versions os: ubuntu-latest - python-version: '3.8' - toxenv: py38-test-oldest + python-version: '3.9' + toxenv: py39-test-oldest - name: macOS latest os: macos-latest diff --git a/docs/bandpass.rst b/docs/bandpass.rst deleted file mode 100644 index 747f04dd..00000000 --- a/docs/bandpass.rst +++ /dev/null @@ -1,11 +0,0 @@ -Bandpass (CMB) -======== - -Bandpass computation --------------------- - -.. automodule:: soliket.bandpass - :exclude-members: initialize - :members: - :show-inheritance: - :private-members: diff --git a/docs/developers.rst b/docs/developers.rst index 8298654e..09983400 100644 --- a/docs/developers.rst +++ b/docs/developers.rst @@ -66,13 +66,8 @@ Basic ingredients ^^^^^^^^^^^^^^^^^ Your theory calculator must inherit from the cobaya theory class. It must have 3 main blocks of functions: inizialization (``initialize``); requirements (``get_requirement``, ``must_provide``); calculations (``get_X``, ``calculate``). -In what follows, we will use the structure of ``mflike`` as a concrete example of how to build these 3 blocks. The new version of ``mflike`` in SOLikeT splits the original mflike in 4 blocks: one cobaya-likelihood component (``mflike``); three cobaya-theory components. - -The three theory components are: - 1. ``TheoryForge``: this is where raw theory (CMB spectra) is mixed and modified with instrumental and non-cosmological effects - 2. ``Foreground``: this is where the foreground (fg) spectra are computed - 3. ``BandPass``: this is where bandpasses are built (either analytically or read from file) - +In what follows, we will use the structure of ``mflike`` as a concrete example of how to build these 3 blocks. mflike splits the original mflike into 2 blocks: +one cobaya-likelihood component (``mflike``), and a cobaya-theory component ``BandpowerForeground`` to calculate the foreground model. Initialization ^^^^^^^^^^^^^^ @@ -81,37 +76,29 @@ You can either assign params in initialize or do that via a dedicated yaml. You Requirements ^^^^^^^^^^^^ Here you need to write what external elements are needed by your theory block to perform its duties. These external elements will be computed and provided by some other external module (e.g., another Theory class). -In our case, ``mflike`` must tell us that it needs a dictionary of cmb+fg spectra. This is done by letting the get_requirement function return a dictionary which has the name of the needed element as a key. For example, if the cmb+fg spectra dict is called ``cmbfg_dict``, the get_requirement function should:: +In our case, ``mflike`` must tell us that it needs a dictionary of cmb and fg spectra. This is done by letting the get_requirement function return a dictionary which has the name of the needed element as a key. For example, if the cmb+fg spectra dict is called ``fg_totals``, the get_requirement function should be:: - return {"cmbfg_dict":{}} + return {"fg_totals":{}} -The key is a dict itself. It can be empty, if no params need to be passed to the external Theory in charge of computing cmbfg_dict. -It might be possible that, in order to compute ``cmbfg_dict``, we should pass to the specific Theory component some params known by ``mflike`` (e.g., frequency channel). This is done by filling the above empty dict:: +The key is a dict itself. 
It can be empty, if no params need to be passed to the external Theory in charge of computing fg_totals. +It might be possible that, in order to compute ``fg_totals``, we should pass to the specific Theory component some params known by ``mflike`` (e.g., frequency channel). This is done by filling the above empty dict:: - {"cmbfg_dict": {"param1": param1_value, "param2": param2_value, etc}} + {"fg_totals": {"param1": param1_value, "param2": param2_value, etc}} -If this happens, then the external Theory block (in this example, ``TheoryForge``) must have a ``must_provide`` function. ``must_provide`` tells the code: +If this happens, then the external Theory block (in this example, ``BandpowerForeground``) must have a ``must_provide`` function. ``must_provide`` tells the code: 1. The values which should be assigned to the parameters needed to compute the element required from the Theory block. The required elements are stored in the ``**requirements`` dictionary which is the input of ``must_provide``. - In our example, ``TheoryForge`` will assign to ``param1`` the ``param1_value`` passed from ``mflike`` via the ``get_requirement`` in ``mflike`` (and so on). For example: + In our example, ``BandpowerForeground`` will assign to ``param1`` the ``param1_value`` passed from ``mflike`` via the ``get_requirement`` in ``mflike`` (and so on). For example: :: must_provide(self, **requirements): - if "cmbfg_dict" in requirements: - self.param1 = requirements["cmbfg_dict"]["param1"] + if "fg_totals" in requirements: + self.param1 = requirements["fg_totals"]["param1"] if this is the only job of ``must_provide``, then the function will not return anything - 2. If required, what external elements are needed by this specific theory block to perform its duties. In this case, the function will return a dictionary of dictionaries which are the requirements of the specific theory block. These dictionaries do not have to necessarily contain content (they can be empty instances of the dictionary), but must be included if expected. Note this can be also done via ``get_requirement``. However, if you need to pass some params read from the block above to the new requirements, this can only be done with ``must_provide``. For example, ``TheoryForge`` needs ``Foreground`` to compute the fg spectra, which we store in a dict called ``fg_dict``. We also want ``TheoryForge`` to pass to ``Foreground`` ``self.param1``. This is done as follows: - :: - - must_provide(self, **requirements): - if “cmbfg_dict” etc etc - ... - return {“fg_dict”: {“param1_fg”: self.param1}} - - Of course, ``Foreground`` will have a similar call to ``must_provide``, where we assign to ``self.param1_fg`` the value passed from ``TheoryForge`` to ``Foreground``. + 2. If required, what external elements are needed by this specific theory block to perform its duties. In this case, the function will return a dictionary of dictionaries which are the requirements of the specific theory block. These dictionaries do not have to necessarily contain content (they can be empty instances of the dictionary), but must be included if expected. Note this can be also done via ``get_requirement``. However, if you need to pass some params read from the block above to the new requirements, this can only be done with ``must_provide``. 
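As a minimal sketch of the requirement handshake described above (the class names, ``param1`` and its value are placeholder assumptions; only the Cobaya hooks ``get_requirements``, ``must_provide``, ``logp`` and ``self.provider`` are real API)::

    from cobaya.likelihood import Likelihood
    from cobaya.theory import Theory

    class MyLike(Likelihood):
        def get_requirements(self):
            # request the foreground model and pass along a parameter it needs
            return {"fg_totals": {"param1": 145}}

        def logp(self, **params_values):
            # fetch the product computed by the theory component
            fg_totals = self.provider.get_fg_totals()
            # ... combine fg_totals with the CMB theory and the data ...
            return 0.0  # placeholder log-likelihood

    class MyForeground(Theory):
        def must_provide(self, **requirements):
            # pick up the parameter passed by the likelihood via get_requirements
            if "fg_totals" in requirements:
                self.param1 = requirements["fg_totals"]["param1"]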
Calculation ^^^^^^^^^^^ @@ -123,7 +110,7 @@ In each Theory class, you need at least 2 functions: get_X(self, any_other_param): return self.current_state[“X”] - where "X" is the name of the requirement computed by that class (in our case, it is ``cmbfg_dict`` in ``TheoryForge``, ``fg_dict`` in ``Foreground``). ``any_other_param`` is an optional param that you may want to apply to ``current_state["X"]`` before returning it. E.g., it could be a rescaling amplitude. This function is called by the Likelihood or Theory class that has ``X`` as its requirement, via the ``self.provider.get_X(any_other_param)`` call. + where "X" is the name of the requirement computed by that class (in our case, it is ``fg_totals`` in ``BandpowerForeground``). ``any_other_param`` is an optional param that you may want to apply to ``current_state["X"]`` before returning it. E.g., it could be a rescaling amplitude. This function is called by the Likelihood or Theory class that has ``X`` as its requirement, via the ``self.provider.get_X(any_other_param)`` call. 2. A calculate function: :: @@ -182,7 +169,7 @@ To see if codes you have written when developing SOLikeT are valid and will pass If you are using conda, the easiest way to run tests (and the way we run them) is to use tox-conda:: - pip install tox-conda + pip install tox tox -e test This will create a fresh virtual environment replicating the one which is used for CI then run the tests (i.e. without touching your current environment). Note that any args after a '--' string will be passed to pytest, so:: diff --git a/docs/foreground.rst b/docs/foreground.rst deleted file mode 100644 index cb793701..00000000 --- a/docs/foreground.rst +++ /dev/null @@ -1,12 +0,0 @@ -Foreground (CMB) -========== - -Foreground computation ----------------------- - -.. automodule:: soliket.foreground - :members: - :show-inheritance: - :private-members: - - diff --git a/docs/mflike.rst b/docs/mflike.rst index bebea98c..0f1a0c17 100644 --- a/docs/mflike.rst +++ b/docs/mflike.rst @@ -1,27 +1,4 @@ MFLike (Primary CMB) ====== -.. automodule:: soliket.mflike.mflike - -Multi Frequency Likelihood --------------------------- - -.. autoclass:: soliket.mflike.MFLike - :exclude-members: initialize - :members: - :private-members: - :show-inheritance: - -Application of foregrounds and systematics ------------------------------------------- - -.. automodule:: soliket.mflike.theoryforge_MFLike - -TheoryForge_MFLike ------------------- - -.. autoclass:: soliket.mflike.TheoryForge_MFLike - :exclude-members: initialize - :members: - :show-inheritance: - :private-members: +Install the "mflike" package to use the SO primary CMB likelihoods. diff --git a/docs/soliket-docs.yml b/docs/soliket-docs.yml index 4b865b8e..a1c18883 100644 --- a/docs/soliket-docs.yml +++ b/docs/soliket-docs.yml @@ -3,7 +3,7 @@ channels: - conda-forge - nodefaults dependencies: - - python>=3.8,<3.12 + - python>=3.9 - pip - pytest-cov - compilers diff --git a/docs/theory-component-guidelines.rst b/docs/theory-component-guidelines.rst index 573872b7..7c7b08c5 100644 --- a/docs/theory-component-guidelines.rst +++ b/docs/theory-component-guidelines.rst @@ -8,81 +8,63 @@ Basic ingredients ^^^^^^^^^^^^^^^^^ Your theory calculator must inherit from the cobaya theory class. It must have 3 main blocks of functions: -inizialization (``inizialize``); +initialization (``initialize``); -requirements (``get_requirement``, ``must_provide``); +requirements (``get_requirements``, ``must_provide``); calculations (``get_X``, ``calculate``). 
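Before the sections below walk through each block in detail, here is a bare skeleton of a theory component with the three blocks just listed (the class name and the requirement name ``X`` are placeholders; the method names are the real Cobaya hooks)::

    from cobaya.theory import Theory

    class MyCalculator(Theory):
        # block 1: initialization
        def initialize(self):
            pass  # one-off setup, e.g. reading templates or fixed settings

        # block 2: requirements
        def get_requirements(self):
            return {}  # what this component needs from other components

        def must_provide(self, **requirements):
            pass  # react to what other components request from this one

        # block 3: calculations
        def calculate(self, state, want_derived=False, **params_values_dict):
            state["X"] = None  # store the computed product under its requirement name

        def get_X(self):
            return self.current_state["X"]  # fetched via self.provider.get_X()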
-In what follows, we will use the structure of ``mflike`` as a concrete example of how to build these 3 blocks. The new version of ``mflike`` in SOLikeT splits the original mflike in 4 blocks: one cobaya-likelihood component (``mflike``); three cobaya-theory components. - -The 3 theory components are: - 1. ``TheoryForge``: this is where raw theory (CMB spectra) is mixed and modified with instrumental and non-cosmological effects - 2. ``Foreground``: this is where the foreground (fg) spectra are computed - 3. ``BandPass``: this is where bandpasses are built (either analytically or read from file) +In what follows, we will use the structure of ``mflike`` as a concrete example of how to build these 3 blocks. mflike splits the original mflike into 2 blocks: +one cobaya-likelihood component (``mflike``), and a cobaya-theory component ``BandpowerForeground`` to calculate the foreground model. Initialization ^^^^^^^^^^^^^^ -You can either assign params in inizialize or do that via a dedicated yaml. You can in general do all the calculations that need to be done once for all. +You can either assign params in initialize or do that via a dedicated yaml. You can in general do all the calculations that need to be done once for all. Requirements ^^^^^^^^^^^^ Here you need to write what external elements are needed by your theory block to perform its duties. These external elements will be computed and provided by some other external module (e.g., another Theory class). -In our case, ``mflike`` must tell us that it needs a dictionary of cmb+fg spectra. This is done by letting the get_requirement function return a dictionary which has the name of the needed element as a key. For example, if the cmb+fg spectra dict is called ``cmbfg_dict``, the get_requirement function should - -:: +In our case, ``mflike`` must tell us that it needs a dictionary of cmb and fg spectra. This is done by letting the get_requirement function return a dictionary which has the name of the needed element as a key. For example, if the cmb+fg spectra dict is called ``fg_totals``, the get_requirement function should be:: - return {"cmbfg_dict":{}} + return {"fg_totals":{}} -The key is a dict itself. It can be empty, if no params need to be passed to the external Theory in charge of computing cmbfg_dict. -It might be possible that, in order to compute ``cmbfg_dict``, we should pass to the specific Theory component some params known by ``mflike`` (e.g., frequency channel). This is done by filling the above empty dict: +The key is a dict itself. It can be empty, if no params need to be passed to the external Theory in charge of computing fg_totals. +It might be possible that, in order to compute ``fg_totals``, we should pass to the specific Theory component some params known by ``mflike`` (e.g., frequency channel). This is done by filling the above empty dict:: -:: + {"fg_totals": {"param1": param1_value, "param2": param2_value, etc}} - {"cmbfg_dict": {"param1": param1_value, "param2": param2_value, etc}} +If this happens, then the external Theory block (in this example, ``BandpowerForeground``) must have a ``must_provide`` function. ``must_provide`` tells the code: -If this happens, then the external Theory block (in this example, ``TheoryForge``) must have a ``must_provide`` function. -``must_provide`` tells the code - 1. what values should be assigned to the parameters needed to compute the element required from the Theory block. 
The required elements are stored in the ``**requirements`` dictionary which is the input of ``must_provide``. - In our example, ``TheoryForge`` will assign to ``param1`` the ``param1_value`` passed from ``mflike`` via the ``get_requirement`` in ``mflike`` (and so on). For example: + 1. The values which should be assigned to the parameters needed to compute the element required from the Theory block. The required elements are stored in the + ``**requirements`` dictionary which is the input of ``must_provide``. + In our example, ``BandpowerForeground`` will assign to ``param1`` the ``param1_value`` passed from ``mflike`` via the ``get_requirement`` in ``mflike`` (and so on). For example: :: - must_provide(self, **requirements): - if "cmbfg_dict" in requirements: - self.param1 = requirements["cmbfg_dict"]["param1"] + must_provide(self, **requirements): + if "fg_totals" in requirements: + self.param1 = requirements["fg_totals"]["param1"] if this is the only job of ``must_provide``, then the function will not return anything - - 2. if needed, what external elements are needed by this specific theory block to perform its duties. In this case, the function will return a dictionary of (empty or not) dictionaries which are the requirements of the specific theory block. Note this can be also done via ``get_requirement``. However, if you need to pass some params read from the block above to the new requirements, this can only be done with ``must_provide``. For example, ``TheoryForge`` needs ``Foreground`` to compute the fg spectra, which we store in a dict called ``fg_dict``. We also want ``TheoryForge`` to pass to ``Foreground`` ``self.param1``. This is done as follows: - :: - - must_provide(self, **requirements): - if “cmbfg_dict” etc etc - ... - return {“fg_dict”: {“param1_fg”: self.param1}} - - Of course, ``Foreground`` will have a similar call to ``must_provide``, where we assign to ``self.param1_fg`` the value passed from ``TheoryForge`` to ``Foreground``. + 2. If required, what external elements are needed by this specific theory block to perform its duties. In this case, the function will return a dictionary of dictionaries which are the requirements of the specific theory block. These dictionaries do not have to necessarily contain content (they can be empty instances of the dictionary), but must be included if expected. Note this can be also done via ``get_requirement``. However, if you need to pass some params read from the block above to the new requirements, this can only be done with ``must_provide``. Calculation ^^^^^^^^^^^ In each Theory class, you need at least 2 functions: - 1. + 1. A get function: :: get_X(self, any_other_param): return self.current_state[“X”] - where "X" is the name of the requirement computed by that class (in our case, it is ``cmbfg_dict`` in ``TheoryForge``, ``fg_dict`` in ``Foreground``). - "any_other_param" is an optional param that you may want to apply to ``current_state["X"]`` before returning it. E.g., it could be a rescaling amplitude. - This function is called by the Likelihood or Theory class that has "X" as its requirement, via the ``self.provider.get_X(any_other_param)`` call. + where "X" is the name of the requirement computed by that class (in our case, it is ``fg_totals`` in ``BandpowerForeground``). ``any_other_param`` is an optional param that you may want to apply to ``current_state["X"]`` before returning it. E.g., it could be a rescaling amplitude. 
This function is called by the Likelihood or Theory class that has ``X`` as its requirement, via the ``self.provider.get_X(any_other_param)`` call. - 2. + 2. A calculate function: :: - calculate(self, **state, want_derived=False/True, **params_values_dict): - do actual calculations, that could involve the use of some of the **params_value_dict, and might also compute derived params (if want_derived=True) + calculate(self, **state, want_derived=False, **params_values_dict): + state[“X”] = result of above calculations - state[“X”] = result of above calculations \ No newline at end of file + which will do actual calculations, that could involve the use of some of the ``**params_value_dict``, and might also compute derived params (if ``want_derived=True``). diff --git a/examples/example_1.yaml b/examples/example_1.yaml index f80022c9..d66ce742 100644 --- a/examples/example_1.yaml +++ b/examples/example_1.yaml @@ -11,28 +11,17 @@ output: output/example_1 # Specify which Likelihoods we would like to use # Any options for the Likelihood will use their default values unless specified here -# In this case the options and defaults are specified in soliket/mflike/MFLike.yaml # Note that MFLike is a Cobaya `installable likelihood`. # When running this yaml file, or calling `cobaya-install example_1.yaml` the required # installable components will automatically be downloaded and installed. -# Note that for the soliket MFLike likelihood we are required to calculate: +# Note that for the mflike.MFLike likelihood we are required to calculate: # - CMB theory power spectra (from CAMB theory below) -# - Multi-frequency bandpass calibrations (from soliket.BandPass theory below) -# - Multi-frequency foregrounds (from soliket.Foreground theory below) -# - The combination of the above components (from soliket.TheoryForge_MFLike theory below) +# - Multi-frequency passband-integrated foregrounds (from mflike.BandpowerForeground theory below) likelihood: - soliket.MFLike: - data_folder: MFLike/v0.8 - input_file: LAT_simu_sacc_00000.fits + mflike.MFLike: + package_install: pip + input_file: LAT_simu_sacc_00044.fits cov_Bbl_file: data_sacc_w_covar_and_Bbl.fits - defaults: - polarizations: ['TT', 'TE', 'ET', 'EE'] - scales: - TT: [30, 9000] - TE: [30, 9000] - ET: [30, 9000] - EE: [30, 9000] - symmetrize: False # Specify the Theory codes which will compute observables to be compared with the data # in the Likelihood. @@ -55,17 +44,7 @@ theory: share_delta_neff: True stop_at_error: True - soliket.BandPass: - stop_at_error: True - - soliket.Foreground: - stop_at_error: True - - soliket.TheoryForge_MFLike: - spectra: - polarizations: ['tt', 'te', 'ee'] - lmin: 2 - lmax: 9050 + mflike.BandpowerForeground: stop_at_error: True # Specify the parameter values at which to compute the likelihood diff --git a/pyproject.toml b/pyproject.toml index aef145bc..73dae974 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -2,12 +2,12 @@ name = "soliket" dynamic = ['version'] authors = [ - {name = "Simons Observatory"} + { name = "Simons Observatory" } ] description = "Likelihood and Theory codes for the Simons Observatory." 
readme = "README.rst" -requires-python = ">=3.8, <3.12" -license = {text = "MIT"} +requires-python = ">=3.9" +license = { text = "MIT" } dependencies = [ "requests", "numpy", @@ -20,10 +20,10 @@ dependencies = [ "fgspectra >= 1.1.0", "pyccl >= 3.0; platform_system!='Windows'", "pyhalomodel", - "scikit-learn", "camb", "getdist", "syslibrary>=0.2.0", + "mflike>=1.0.0" ] [project.urls] @@ -55,7 +55,7 @@ include = [ "soliket.poisson", "soliket.ps", "soliket.xcorr", - ] +] [tool.setuptools.package-data] "*" = ['*.yaml', '*.fits', '*.txt', '*.pkl', '*.gz'] @@ -72,11 +72,14 @@ docs = [ "sphinx", "sphinx_rtd_theme", ] +tests = [ + "scikit-learn", +] [tool.flake8] -select = ["E713","E704","E703","E714","E741","E10","E11","E20","E22","E23","E25","E262","E27","E301","E302","E304","E9","F405","F406","F5","F6","F7","F8","E501","W191","F401","W1","W292","W293","W3"] +select = ["E713", "E704", "E703", "E714", "E741", "E10", "E11", "E20", "E22", "E23", "E25", "E262", "E27", "E301", "E302", "E304", "E9", "F405", "F406", "F5", "F6", "F7", "F8", "E501", "W191", "F401", "W1", "W292", "W293", "W3"] max-line-length = 90 -exclude = [".tox","build","cobaya_packages","test",".eggs"] +exclude = [".tox", "build", "cobaya_packages", "test", ".eggs"] [tool.coverage.run] omit = [ diff --git a/requirements.txt b/requirements.txt index 083788e8..84dd9e3c 100644 --- a/requirements.txt +++ b/requirements.txt @@ -18,4 +18,4 @@ pyccl >= 3.0; platform_system!='Windows' sacc fgspectra>=1.1.0 syslibrary -pyhalomodel \ No newline at end of file +pyhalomodel diff --git a/soliket-tests.yml b/soliket-tests.yml index cffa78fd..10734e9b 100644 --- a/soliket-tests.yml +++ b/soliket-tests.yml @@ -3,7 +3,7 @@ channels: - conda-forge - nodefaults dependencies: - - python>=3.8,<3.12 + - python>=3.8 - pip - pytest - pytest-cov diff --git a/soliket/__init__.py b/soliket/__init__.py index f58c1e77..be95c420 100644 --- a/soliket/__init__.py +++ b/soliket/__init__.py @@ -6,24 +6,18 @@ # package is not installed pass -from .bandpass import BandPass from .bias import Bias, Linear_bias from .ccl import CCL from .clusters import ClusterLikelihood from .cosmopower import CosmoPower, CosmoPowerDerived from .cross_correlation import (CrossCorrelationLikelihood, GalaxyKappaLikelihood, ShearKappaLikelihood) -from .foreground import Foreground from .gaussian import GaussianLikelihood, MultiGaussianLikelihood from .lensing import LensingLikelihood, LensingLiteLikelihood -from .mflike import MFLike, TheoryForge_MFLike -# from .studentst import StudentstLikelihood from .ps import BinnedPSLikelihood, PSLikelihood from .xcorr import XcorrLikelihood __all__ = [ - # bandpass - "BandPass", # bias "Bias", "Linear_bias", @@ -38,19 +32,12 @@ "CrossCorrelationLikelihood", "GalaxyKappaLikelihood", "ShearKappaLikelihood", - # foreground - "Foreground", # gaussian "GaussianLikelihood", "MultiGaussianLikelihood", # lensing "LensingLikelihood", "LensingLiteLikelihood", - # mflike - "MFLike", - "TheoryForge_MFLike", - # studentst - # "StudentstLikelihood", # ps "BinnedPSLikelihood", "PSLikelihood", diff --git a/soliket/bandpass/BandPass.yaml b/soliket/bandpass/BandPass.yaml deleted file mode 100644 index 541cd2bd..00000000 --- a/soliket/bandpass/BandPass.yaml +++ /dev/null @@ -1,51 +0,0 @@ -data_folder: MFLike/v0.8 - -# Three options for passband construction (top_hat_band, external_bandpass, -# read_from_sacc). Fill the corresponding dictionary to select one of the -# options. The other dictionaries have to be left empty. Default is -# read_from_sacc. 
- -# Parameters to build a top-hat band: -# - nsteps sets the number of frequencies used in the band integration -# - bandwidth sets the relative width of the band wrt the central frequency -# the central frequency of each band is set from the bands stored in the sacc file -# - with nstep: 1, bandwidth must be 0 -# Dirac delta bandpass, no band integration -# useful if you don't want to do band integration -# when the bandpasses in the sacc file are multifrequency -# the freq used is the effective frequency from the bandpass -# - if nstep > 1, bandwidth must be > 0 -# bandwidth can be a list if you want a different width for each band -# e.g. bandwidth: [0.3,0.2,0.3] for 3 bands -# when top_hat_band is a null dict: no top-hat band is built and -# bandpasses read from external path. Band integration is performed accordingly -# (if bandpass in the file is a single freq, no band integration) -# Bandpass has to be divided by nu**2 if measured with respect to a RJ source -top_hat_band: -# nsteps: 1 -# bandwidth: 0 - - -# If passband read from an external file, provide the field -# "path" with the directory path under the data_folder -# this directory would store several subdirectories with the -# name of experiment/array and the frequency -external_bandpass: - #path: path - - -# The mflike default is to read passbands from the sacc file -# storing the data. If you are not using mflike, make this -# dictionary empty and fill one of the other two dictionaries -read_from_sacc: True - -params: - bandint_shift_LAT_93: - value: 0 - latex: \Delta_{\rm LAT}^{93} - bandint_shift_LAT_145: - value: 0 - latex: \Delta_{\rm LAT}^{145} - bandint_shift_LAT_225: - value: 0 - latex: \Delta_{\rm LAT}^{225} diff --git a/soliket/bandpass/__init__.py b/soliket/bandpass/__init__.py deleted file mode 100644 index 4fb3089b..00000000 --- a/soliket/bandpass/__init__.py +++ /dev/null @@ -1,5 +0,0 @@ -from .bandpass import BandPass - -__all__ = [ - "BandPass", -] diff --git a/soliket/bandpass/bandpass.py b/soliket/bandpass/bandpass.py deleted file mode 100644 index 27d85580..00000000 --- a/soliket/bandpass/bandpass.py +++ /dev/null @@ -1,328 +0,0 @@ -r""" -.. module:: bandpass - -This module computes the bandpass transmission based on the inputs from -the parameter file ``BandPass.yaml``. There are three possibilities: - * reading the passband :math:`\tau(\nu)` stored in a sacc file - (which is the default now, being the mflike default) - * building the passbands :math:`\tau(\nu)`, either as Dirac delta or as top-hat - * reading the passbands :math:`\tau(\nu)` from an external file. - -Fore the first option, the ``read_from_sacc`` option in ``BandPass.yaml`` -has to be set to ``True``: - -.. code-block:: yaml - - read_from_sacc: True - -Otherwise, it has to be left empty. The frequencies and passbands are passed in a -``bands`` dictionary, which is passed from ``Foregrounds`` through ``TheoryForge``. - - -For the second option, the ``top_hat_band`` dictionary in ``BandPass.yaml`` -has to be filled with two keys: - - * ``nsteps``: setting the number of frequencies used in the band integration - (either 1 for a Dirac delta or > 1) - * ``bandwidth``: setting the relative width :math:`\delta` of the band with respect to - the central frequency, such that the frequency extrems are - :math:`\nu_{\rm{low/high}} = \nu_{\rm{center}}(1 \mp \delta/2) + - \Delta^{\nu}_{\rm band}` (with :math:`\Delta^{\nu}_{\rm band}` - being the possible bandpass shift). ``bandwidth`` has to be 0 - if ``nstep`` = 1, > 0 otherwise. 
``bandwidth`` can be a list - if you want a different width for each band - e.g. ``bandwidth: [0.3,0.2,0.3]`` for 3 bands. - -The effective frequencies are read from the ``bands`` dictionary as before. If we are not -using passbands from a sacc file, ``bands`` is filled in ``Foreground`` using the default -``eff_freqs`` in ``Foreground.yaml``. In this case it is filled assuming a Dirac delta. - -.. code-block:: yaml - - top_hat_band: - nsteps: 1 - bandwidth: 0 - - -For the third option, the ``external_bandpass`` dictionary in ``BandPass.yaml`` -has to have the the ``path`` key, representing the path to the folder with all the -passbands. - -.. code-block:: yaml - - external_bandpass: - path: "path_of_passband_folder" - - -The path has to be relative to the ``data_folder`` in ``BandPass.yaml``. -This folder has to have text files with the names of the experiment or array and the -nominal frequency of the channel, e.g. ``LAT_93`` or ``dr6_pa4_f150``. No other extensions -to be added to the name! These files will contain the frequencies as first column and -the passband as second column. - -To avoid the options you don't want to select, the corresponding dictionary has to be -``null``. -If all dictionaries are ``null``, there will be an error message inviting to choose -one of the three options. - -The bandpass transmission is built as - -.. math:: - \frac{\frac{\partial B_{\nu+\Delta \nu}} - {\partial T} \tau(\nu+\Delta \nu)}{\int d\nu\frac{\partial - B_{\nu+\Delta \nu}}{\partial T} \tau(\nu+\Delta \nu)} - -where - -.. math:: - &\frac{\partial B_{\nu}}{\partial T} \propto \frac{x^2 e^x \nu^2}{(e^x-1)^2} \\ - & x \equiv \frac{h \nu}{k_B T_{CMB}} - -which converts from CMB thermodynamic temperature to differential source intensity -(see eq.8 of https://arxiv.org/abs/1303.5070). -The passband :math:`\tau(\nu)` has to be divided by :math:`\nu^2` if it has been -measured with respect to a Rayleigh-Jeans (RJ) source and :math:`\Delta \nu` is the -possible bandpass shift for that channel. - -""" - -import os -from typing import List, Optional, Union - -import numpy as np -from cobaya.log import LoggedError -from cobaya.theory import Theory -from cobaya.tools import are_different_params_lists - -from soliket.constants import T_CMB, h_Planck, k_Boltzmann - -# Converts from cmb units to brightness. -# Numerical factors not included, it needs proper normalization when used. - - -def _cmb2bb(nu: np.ndarray) -> np.ndarray: - r""" - Computes the conversion factor :math:`\frac{\partial B_{\nu}}{\partial T}` - from CMB thermodynamic units to differential source intensity. - Passbands measured with respect to a RJ source have to be divided by a - :math:`\nu^2` factor. - - Numerical constants are not included, which is not a problem when using this - conversion both at numerator and denominator. - - :param nu: frequency array - - :return: the array :math:`\frac{\partial B_{\nu}}{\partial T}`. See note above. 
- """ - # NB: numerical factors not included - x = nu * h_Planck * 1e9 / k_Boltzmann / T_CMB - return np.exp(x) * (nu * x / np.expm1(x)) ** 2 - - -class BandPass(Theory): - # attributes set from .yaml - data_folder: Optional[str] - read_from_sacc: bool - top_hat_band: Optional[dict] - external_bandpass: Optional[Union[dict, list]] - params: dict - - enforce_types: bool = True - - def initialize(self): - self.expected_params_bp = ["bandint_shift_LAT_93", - "bandint_shift_LAT_145", - "bandint_shift_LAT_225"] - - self.exp_ch: Optional[List[str]] = None - # self.eff_freqs: Optional[list] = None - self.bands: dict = None - - # To read passbands stored in the sacc files - # default for mflike - self.read_from_sacc = bool(self.read_from_sacc) - - # Parameters for band integration - self.use_top_hat_band = bool(self.top_hat_band) - if self.use_top_hat_band: - self.bandint_nsteps = self.top_hat_band["nsteps"] - self.bandint_width = self.top_hat_band["bandwidth"] - - self.bandint_external_bandpass = bool(self.external_bandpass) - if self.bandint_external_bandpass: - path = os.path.normpath(os.path.join(self.data_folder, - self.external_bandpass["path"])) - arrays = os.listdir(path) - self._init_external_bandpass_construction(path, arrays) - - if (not self.read_from_sacc and not self.use_top_hat_band - and not self.bandint_external_bandpass): - raise LoggedError( - self.log, "fill the dictionaries in the yaml file for" - "either reading the passband from sacc file (mflike default)" - "or an external passband or building a top-hat one!" - ) - - def initialize_with_params(self): - # Check that the parameters are the right ones - differences = are_different_params_lists( - self.input_params, self.expected_params_bp, - name_A="given", name_B="expected") - if differences: - raise LoggedError( - self.log, "Configuration error in parameters: %r.", - differences) - - def must_provide(self, **requirements): - # bandint_freqs is required by Foreground - # and requires some params to be computed - # Assign those from Foreground - if "bandint_freqs" in requirements: - self.bands = requirements["bandint_freqs"]["bands"] - self.exp_ch = [k.replace("_s0", "") for k in self.bands.keys() - if "_s0" in k] - - def calculate(self, state: dict, want_derived=False, **params_values_dict: dict): - r""" - Adds the bandpass transmission to the ``state`` dictionary of the - BandPass Theory class. - - :param *params_values_dict: dictionary of nuisance parameters - """ - - nuis_params = {k: params_values_dict[k] for k in self.expected_params_bp} - - # Bandpass construction - if self.bandint_external_bandpass: - self.bandint_freqs = self._external_bandpass_construction(**nuis_params) - if self.read_from_sacc or self.use_top_hat_band: - self.bandint_freqs = self._bandpass_construction(**nuis_params) - - state["bandint_freqs"] = self.bandint_freqs - - def get_bandint_freqs(self) -> dict: - """ - Returns the ``state`` dictionary of bandpass transmissions - """ - return self.current_state["bandint_freqs"] - - # Takes care of the bandpass construction. It returns a list of nu-transmittance for - # each frequency or an array with the effective freqs. 
- def _bandpass_construction( - self, **params - ) -> Union[np.ndarray, List[np.ndarray]]: - r""" - Builds the bandpass transmission - :math:`\frac{\frac{\partial B_{\nu+\Delta \nu}}{\partial T} - \tau(\nu+\Delta \nu)}{\int d\nu - \frac{\partial B_{\nu+\Delta \nu}}{\partial T} \tau(\nu+\Delta \nu)}` - using passbands :math:`\tau(\nu)` (divided by :math:`\nu^2` if - measured with respect to a RJ source, not read from a txt - file) and bandpass shift :math:`\Delta \nu`. If ``read_from_sacc = True`` - (the default), :math:`\tau(\nu)` has been read from the sacc file - and passed through ``Foreground`` from ``TheoryForge``. - If ``use_top_hat_band``, :math:`\tau(\nu)` is built as a top-hat - with width ``bandint_width`` and number of samples ``nsteps``, - read from the ``BandPass.yaml``. - If ``nstep = 1`` and ``bandint_width = 0``, the passband is a Dirac delta - centered at :math:`\nu+\Delta \nu`. - - :param *params: dictionary of nuisance parameters - :return: the list of [nu, transmission] in the multifrequency case - or just an array of frequencies in the single frequency one - """ - - data_are_monofreq = False - bandint_freqs = [] - for ifr, fr in enumerate(self.exp_ch): - bandpar = 'bandint_shift_' + str(fr) - bands = self.bands[f"{fr}_s0"] - nu_ghz, bp = np.asarray(bands["nu"]), np.asarray(bands["bandpass"]) - if self.use_top_hat_band: - # checks on the bandpass input params - if not hasattr(self.bandint_width, "__len__"): - self.bandint_width = np.full_like( - self.exp_ch, self.bandint_width, dtype=float - ) - if self.bandint_nsteps > 1 and np.any(np.array(self.bandint_width) == 0): - raise LoggedError( - self.log, "One band has width = 0, \ - set a positive width and run again" - ) - # Compute central frequency given bandpass - fr = nu_ghz @ bp / bp.sum() - if self.bandint_nsteps > 1: - bandlow = fr * (1 - self.bandint_width[ifr] * .5) - bandhigh = fr * (1 + self.bandint_width[ifr] * .5) - nub = np.linspace(bandlow + params[bandpar], - bandhigh + params[bandpar], - self.bandint_nsteps, dtype=float) - tranb = _cmb2bb(nub) - tranb_norm = np.trapz(_cmb2bb(nub), nub) - bandint_freqs.append([nub, tranb / tranb_norm]) - if self.bandint_nsteps == 1: - nub = fr + params[bandpar] - data_are_monofreq = True - bandint_freqs.append(nub) - if self.read_from_sacc: - nub = nu_ghz + params[bandpar] - if len(bp) == 1: - # Monofrequency channel - data_are_monofreq = True - bandint_freqs.append(nub[0]) - else: - trans_norm = np.trapz(bp * _cmb2bb(nub), nub) - trans = bp / trans_norm * _cmb2bb(nub) - bandint_freqs.append([nub, trans]) - - if data_are_monofreq: - bandint_freqs = np.asarray(bandint_freqs) - self.log.info("bandpass is delta function, no band integration performed") - - return bandint_freqs - - def _init_external_bandpass_construction(self, path: str, exp_ch: List[str]): - """ - Initializes the passband reading for ``_external_bandpass_construction``. 
- - :param exp_ch: list of the frequency channels - :param path: path of the passband txt file - """ - self.external_bandpass = [] - for expc in exp_ch: - print(path, expc) - nu_ghz, bp = np.loadtxt(path + "/" + expc, usecols=(0, 1), unpack=True) - self.external_bandpass.append([expc, nu_ghz, bp]) - - def _external_bandpass_construction( - self, **params - ) -> Union[np.ndarray, List[np.ndarray]]: - r""" - Builds bandpass transmission - :math:`\frac{\frac{\partial B_{\nu+\Delta \nu}}{\partial T} - \tau(\nu+\Delta \nu)}{\int d\nu - \frac{\partial B_{\nu+\Delta \nu}}{\partial T} \tau(\nu+\Delta \nu)}` - using passbands :math:`\tau(\nu)` (divided by :math:`\nu^2` if measured - with respect to a RJ source) read from - an external txt file and - possible bandpass shift parameters :math:`\Delta \nu`. - - :param *params: dictionary of nuisance parameters - - :return: the list of [nu, transmission] or array of effective freqs - if the passbands read are monofrequency. - """ - bandint_freqs = [] - for expc, nu_ghz, bp in self.external_bandpass: - bandpar = "bandint_shift_" + expc - nub = nu_ghz + params[bandpar] - if not hasattr(bp, "__len__"): - bandint_freqs.append(nub) - bandint_freqs = np.asarray(bandint_freqs) - self.log.info("bandpass is delta function, no band integration performed") - else: - trans_norm = np.trapz(bp * _cmb2bb(nub), nub) - trans = bp / trans_norm * _cmb2bb(nub) - bandint_freqs.append([nub, trans]) - - return bandint_freqs diff --git a/soliket/cosmopower/cosmopower.py b/soliket/cosmopower/cosmopower.py index bb569f54..f981f2c3 100644 --- a/soliket/cosmopower/cosmopower.py +++ b/soliket/cosmopower/cosmopower.py @@ -1,4 +1,4 @@ -""" +r""" .. module:: soliket.cosmopower :Synopsis: Simple CosmoPower theory wrapper for Cobaya. diff --git a/soliket/foreground/Foreground.yaml b/soliket/foreground/Foreground.yaml deleted file mode 100644 index d7a41cbe..00000000 --- a/soliket/foreground/Foreground.yaml +++ /dev/null @@ -1,116 +0,0 @@ -# spectra can be either assigned from yaml -# or passed from like/theory component requiring Foreground -spectra: - polarizations: ["tt", "te", "ee"] - lmin: 2 - lmax: 9000 - exp_ch: ["LAT_150"] - eff_freqs: [150.] 
- -foregrounds: - normalisation: - nu_0: 150.0 - ell_0: 3000 - T_CMB: 2.725 - components: - tt: - - kSZ - - tSZ_and_CIB - - cibp - - dust - - radio - te: - - radio - - dust - ee: - - radio - - dust - -params: - # Foregrounds - a_tSZ: - prior: - min: 3.0 - max: 3.6 - proposal: 0.05 - latex: a_\mathrm{tSZ} - a_kSZ: - prior: - min: 1.4 - max: 1.8 - proposal: 0.1 - latex: a_\mathrm{kSZ} - a_p: - prior: - min: 6.2 - max: 7.6 - proposal: 0.075 - latex: a_p - beta_p: - prior: - min: 1.8 - max: 2.2 - proposal: 0.015 - latex: \beta_p - a_c: - prior: - min: 4.4 - max: 5.4 - proposal: 0.12 - latex: a_c - beta_c: - prior: - min: 2.0 - max: 2.4 - proposal: 0.03 - latex: \beta_c - a_s: - prior: - min: 2.8 - max: 3.4 - proposal: 0.01 - latex: a_s - a_gtt: - prior: - dist: norm - loc: 2.79 - scale: 0.45 - proposal: 0.4 - latex: a_\mathrm{dust}^\mathrm{TT} - a_gte: - prior: - dist: norm - loc: 0.36 - scale: 0.04 - proposal: 0.04 - latex: a_\mathrm{dust}^\mathrm{TE} - a_gee: - prior: - dist: norm - loc: 0.13 - scale: 0.03 - proposal: 0.03 - latex: a_\mathrm{dust}^\mathrm{EE} - a_psee: - prior: - min: 0 - proposal: 0.05 - latex: a_\mathrm{ps}^\mathrm{EE} - a_pste: - prior: - min: -1 - max: 1 - proposal: 0.05 - latex: a_\mathrm{ps}^\mathrm{TE} - xi: - prior: - min: 0 - max: 0.2 - proposal: 0.05 - latex: \xi - T_d: - prior: - min: 8.60 - max: 10.60 - proposal: 0.6 - latex: T_d diff --git a/soliket/foreground/__init__.py b/soliket/foreground/__init__.py deleted file mode 100644 index 7d668e53..00000000 --- a/soliket/foreground/__init__.py +++ /dev/null @@ -1,5 +0,0 @@ -from .foreground import Foreground - -__all__ = [ - "Foreground", -] diff --git a/soliket/foreground/foreground.py b/soliket/foreground/foreground.py deleted file mode 100644 index 2b8faa97..00000000 --- a/soliket/foreground/foreground.py +++ /dev/null @@ -1,367 +0,0 @@ -r""" -.. module:: foreground - -The ``Foreground`` class initialized the foreground components and computes -the foreground spectra for each component and each channel. The information -on the arrays to use come from the ``TheoryForge`` class by default, through -the dictionary ``bands``. This is a dictionary - -.. code-block:: python - - {"experiment_channel": {{"nu": [freqs...], - "bandpass": [...]}}, ...} - -which is filled by ``MFLike`` using the information from the sacc file. -This dictionary is then passed to ``Bandpass`` to compute the bandpass -transmissions, which are then used for the actual foreground spectra computation. - - -If one wants to use this class as standalone, the ``bands`` dictionary is -filled when initializing ``Foreground``. The name of the channels to use -are read from the ``exp_ch`` list in ``Foreground.yaml``, the effective -frequencies are in the ``eff_freqs`` list. Of course the effective frequencies -have to match the information from ``exp_ch``, i.e.: - -.. code-block:: yaml - - exp_ch: ["LAT_93", "LAT_145", "ACT_145"] - eff_freqs: [93, 145, 145] - - -The foreground spectra in this case can be computed by calling the -function - -.. code-block:: python - - Foreground._get_foreground_model(requested_cls, - ell, - exp_ch, - bandint_freqs=None, - eff_freqs, - **fg_params): - -which will have -``bandint_freqs=None`` (no passbands from ``BandPass``). The spectra will be computed -assuming just a Dirac delta at the effective frequencies ``eff_freqs``. 
- - -""" - -import os -from typing import Dict, List, Optional - -import numpy as np -from cobaya.log import LoggedError -from cobaya.theory import Provider, Theory -from cobaya.tools import are_different_params_lists - - -class Foreground(Theory): - spectra: dict - foregrounds: dict - params: dict - - enforce_types: bool = True - - eff_freqs: Optional[list] - exp_ch: Optional[list] - provider: Provider - - # Initializes the foreground model. It sets the SED and reads the templates - def initialize(self): - """ - Initializes the foreground models from ``fgspectra``. Sets the SED - of kSZ, tSZ, dust, radio, CIB Poisson and clustered, - tSZxCIB, and reads the templates for CIB and tSZxCIB. - """ - from fgspectra import cross as fgc - from fgspectra import frequency as fgf - from fgspectra import power as fgp - - self.expected_params_fg = ["a_tSZ", "a_kSZ", "a_p", "beta_p", - "a_c", "beta_c", "a_s", "a_gtt", "a_gte", "a_gee", - "a_psee", "a_pste", "xi", "T_d"] - - self.requested_cls: List[str] = self.spectra["polarizations"] - self.lmin: int = self.spectra["lmin"] - self.lmax: int = self.spectra["lmax"] - self.exp_ch: List[str] = self.spectra["exp_ch"] - self.eff_freqs: List[float] = self.spectra["eff_freqs"] - - if hasattr(self.eff_freqs, "__len__"): - if not len(self.exp_ch) == len(self.eff_freqs): - raise LoggedError( - self.log, "list of effective frequency has to have" - "same length as list of channels!" - ) - - # self.bands to be filled with passbands read from sacc file - # if mflike is used - self.bands: Dict[str, Dict[str, List[float]]] = { - f"{expc}_s0": {'nu': [self.eff_freqs[iexpc]], 'bandpass': [1.]} - for iexpc, expc in enumerate(self.exp_ch) - } - - template_path: str = os.path.join( - os.path.dirname(os.path.abspath(fgp.__file__)), 'data' - ) - cibc_file: str = os.path.join(template_path, 'cl_cib_Choi2020.dat') - - # set pivot freq and multipole - self.fg_nu_0: float = self.foregrounds["normalisation"]["nu_0"] - self.fg_ell_0: float = self.foregrounds["normalisation"]["ell_0"] - - # We don't seem to be using this - # cirrus = fgc.FactorizedCrossSpectrum(fgf.PowerLaw(), fgp.PowerLaw()) - self.ksz = fgc.FactorizedCrossSpectrum(fgf.ConstantSED(), fgp.kSZ_bat()) - self.cibp = fgc.FactorizedCrossSpectrum(fgf.ModifiedBlackBody(), fgp.PowerLaw()) - self.radio = fgc.FactorizedCrossSpectrum(fgf.PowerLaw(), fgp.PowerLaw()) - self.tsz = fgc.FactorizedCrossSpectrum(fgf.ThermalSZ(), fgp.tSZ_150_bat()) - self.cibc = fgc.FactorizedCrossSpectrum(fgf.CIB(), - fgp.PowerSpectrumFromFile(cibc_file)) - self.dust = fgc.FactorizedCrossSpectrum(fgf.ModifiedBlackBody(), fgp.PowerLaw()) - self.tSZ_and_CIB = fgc.SZxCIB_Choi2020() - - self.components: List[str] = self.foregrounds["components"] - - def initialize_with_params(self): - # Check that the parameters are the right ones - differences = are_different_params_lists( - self.input_params, self.expected_params_fg, - name_A="given", name_B="expected") - if differences: - raise LoggedError( - self.log, "Configuration error in parameters: %r.", - differences) - - # Gets the actual power spectrum of foregrounds given the passed parameters - def _get_foreground_model( - self, - requested_cls: Optional[List[str]] = None, - ell: Optional[np.ndarray] = None, - exp_ch: Optional[List[str]] = None, - bandint_freqs: Optional[np.ndarray] = None, - eff_freqs: Optional[np.ndarray] = None, - **fg_params: dict - ) -> dict: - r""" - Gets the foreground power spectra for each component computed by ``fgspectra``. 
- The computation assumes the bandpass transmissions from the ``BandPass`` class - and integration in frequency is performed if the passbands are not Dirac delta. - - :param requested_cls: the fields required. If ``None``, - it uses the default ones in the - ``Foreground.yaml`` - :param ell: ell range. If ``None`` the default range - set in ``Foreground.yaml`` is used - :param exp_ch: list of strings "experiment_channel" used to indicate the - foreground components computed for a particular array - of an experiment. - If ``None``, it uses the default ones in the ``Foreground.yaml`` - :param bandint_freqs: the bandpass transmissions. If ``None`` it is built as an - array of frequencies stored in the ``eff_freqs`` argument, - which in this case has to be not ``None``. If - ``bandint_freqs`` is not ``None``, it is - the transmissions computed by the ``BandPass`` class - :param eff_freqs: list of the effective frequencies for each channel - used to compute the foreground components (assuming a Dirac - delta passband at these frequencies) if the - ``bandint_freqs`` argument is not provided - :param *fg_params: parameters of the foreground components - - :return: the foreground dictionary - """ - - if not requested_cls: - requested_cls = self.requested_cls - # if ell = None, it uses ell from yaml, otherwise the ell array provided - # useful to make tests at different l_max than the data - if not hasattr(ell, '__len__'): - ell = self.ell - - ell_0 = self.fg_ell_0 - nu_0 = self.fg_nu_0 - - # Normalisation of radio sources - ell_clp = ell * (ell + 1.) - ell_0clp = ell_0 * (ell_0 + 1.) - - # Set component spectra - self.fg_component_list = {s: self.components[s] for s in requested_cls} - - # Set exp_ch list - if not hasattr(exp_ch, '__len__'): - exp_ch = self.exp_ch - - # Set array of freqs to use if bandint_freqs is None - if not hasattr(bandint_freqs, '__len__'): - if hasattr(eff_freqs, '__len__'): - bandint_freqs = np.asarray(eff_freqs) - else: - raise LoggedError( - self.log, "no frequency list provided to compute the passbands" - ) - - model = {} - model["tt", "kSZ"] = fg_params["a_kSZ"] * self.ksz({"nu": bandint_freqs}, - {"ell": ell, - "ell_0": ell_0}) - - model["tt", "cibp"] = fg_params["a_p"] * self.cibp({"nu": bandint_freqs, - "nu_0": nu_0, - "temp": fg_params["T_d"], - "beta": fg_params["beta_p"]}, - {"ell": ell_clp, - "ell_0": ell_0clp, - "alpha": 1}) - - model["tt", "radio"] = fg_params["a_s"] * self.radio({"nu": bandint_freqs, - "nu_0": nu_0, - "beta": -0.5 - 2.}, - {"ell": ell_clp, - "ell_0": ell_0clp, - "alpha": 1}) - - model["tt", "tSZ"] = fg_params["a_tSZ"] * self.tsz({"nu": bandint_freqs, - "nu_0": nu_0}, - {"ell": ell, - "ell_0": ell_0}) - - model["tt", "cibc"] = fg_params["a_c"] * self.cibc({"nu": bandint_freqs, - "nu_0": nu_0, - "temp": fg_params["T_d"], - "beta": fg_params["beta_c"]}, - {'ell': ell, - 'ell_0': ell_0}) - - model["tt", "dust"] = fg_params["a_gtt"] * self.dust({"nu": bandint_freqs, - "nu_0": nu_0, - "temp": 19.6, - "beta": 1.5}, - {"ell": ell, - "ell_0": 500., - "alpha": -0.6}) - - model["tt", "tSZ_and_CIB"] = \ - self.tSZ_and_CIB({'kwseq': ({'nu': bandint_freqs, 'nu_0': nu_0}, - {'nu': bandint_freqs, 'nu_0': nu_0, - 'temp': fg_params['T_d'], - 'beta': fg_params["beta_c"]})}, - {'kwseq': ({'ell': ell, 'ell_0': ell_0, - 'amp': fg_params['a_tSZ']}, - {'ell': ell, 'ell_0': ell_0, - 'amp': fg_params['a_c']}, - {'ell': ell, 'ell_0': ell_0, - 'amp': - fg_params['xi'] \ - * np.sqrt(fg_params['a_tSZ'] * - fg_params['a_c'])})}) - - model["ee", "radio"] = 
fg_params["a_psee"] * self.radio({"nu": bandint_freqs, - "nu_0": nu_0, - "beta": -0.5 - 2.}, - {"ell": ell_clp, - "ell_0": ell_0clp, - "alpha": 1}) - - model["ee", "dust"] = fg_params["a_gee"] * self.dust({"nu": bandint_freqs, - "nu_0": nu_0, - "temp": 19.6, - "beta": 1.5}, - {"ell": ell, - "ell_0": 500., - "alpha": -0.4}) - - model["te", "radio"] = fg_params["a_pste"] * self.radio({"nu": bandint_freqs, - "nu_0": nu_0, - "beta": -0.5 - 2.}, - {"ell": ell_clp, - "ell_0": ell_0clp, - "alpha": 1}) - - model["te", "dust"] = fg_params["a_gte"] * self.dust({"nu": bandint_freqs, - "nu_0": nu_0, - "temp": 19.6, - "beta": 1.5}, - {"ell": ell, - "ell_0": 500., - "alpha": -0.4}) - - fg_dict = {} - for c1, f1 in enumerate(exp_ch): - for c2, f2 in enumerate(exp_ch): - for s in requested_cls: - fg_dict[s, "all", f1, f2] = np.zeros(len(ell)) - for comp in self.fg_component_list[s]: - if comp == "tSZ_and_CIB": - fg_dict[s, "tSZ", f1, f2] = model[s, "tSZ"][c1, c2] - fg_dict[s, "cibc", f1, f2] = model[s, "cibc"][c1, c2] - fg_dict[s, "tSZxCIB", f1, f2] = ( - model[s, comp][c1, c2] - - model[s, "tSZ"][c1, c2] - - model[s, "cibc"][c1, c2] - ) - fg_dict[s, "all", f1, f2] += model[s, comp][c1, c2] - else: - fg_dict[s, comp, f1, f2] = model[s, comp][c1, c2] - fg_dict[s, "all", f1, f2] += fg_dict[s, comp, f1, f2] - return fg_dict - - def must_provide(self, **requirements) -> dict: - # fg_dict is required by theoryforge - # and requires some params to be computed - # Assign those from theoryforge - # otherwise use default values - # Foreground requires bandint_freqs from BandPass - # Bandint_freqs requires some params to be computed - # Passing those from Foreground - if "fg_dict" in requirements: - req = requirements["fg_dict"] - self.requested_cls = req.get("requested_cls", self.requested_cls) - self.ell = req.get("ell", None) - self.bands = req.get("bands", self.bands) - self.exp_ch = req.get("exp_ch", self.exp_ch) - return {"bandint_freqs": {"bands": self.bands}} - return {} - - def get_bandpasses(self, **params) -> np.ndarray: - """ - Gets bandpass transmissions from the ``BandPass`` class. - """ - return self.provider.get_bandint_freqs() - - def calculate(self, state: dict, want_derived=False, **params_values_dict: dict): - """ - Fills the ``state`` dictionary of the ``Foreground`` Theory class - with the foreground spectra, computed using the bandpass - transmissions from the ``BandPass`` class and the sampled foreground - parameters. 
- - :param state: ``state`` dictionary to be filled with computed foreground - spectra - :param *params_values_dict: dictionary of parameters from the sampler - """ - - # compute bandpasses at each step only if bandint_shift params are not null - # and bandint_freqs has been computed at least once - if np.all( - np.array([params_values_dict[k] for k in params_values_dict.keys() - if "bandint_shift_" in k]) == 0.0 - ): - if not hasattr(self, "bandint_freqs"): - self.log.info("Computing bandpass at first step, no shifts") - self.bandint_freqs = self.get_bandpasses(**params_values_dict) - else: - self.bandint_freqs = self.get_bandpasses(**params_values_dict) - - fg_params = {k: params_values_dict[k] for k in self.expected_params_fg} - state["fg_dict"] = self._get_foreground_model(requested_cls=self.requested_cls, - ell=self.ell, - exp_ch=self.exp_ch, - bandint_freqs=self.bandint_freqs, - **fg_params) - - def get_fg_dict(self) -> dict: - """ - Returns the ``state`` dictionary of fogreground spectra - """ - return self.current_state["fg_dict"] diff --git a/soliket/gaussian/gaussian.py b/soliket/gaussian/gaussian.py index 73c963af..e9df199f 100644 --- a/soliket/gaussian/gaussian.py +++ b/soliket/gaussian/gaussian.py @@ -8,8 +8,7 @@ from cobaya.typing import empty_dict from soliket.utils import get_likelihood - -from .gaussian_data import GaussianData, MultiGaussianData +from soliket.gaussian.gaussian_data import GaussianData, MultiGaussianData class GaussianLikelihood(Likelihood): @@ -37,7 +36,10 @@ def _get_cov(self) -> np.ndarray: def _get_theory(self, **kwargs) -> np.ndarray: raise NotImplementedError - def logp(self, **params_values) -> float: + def _get_gauss_data(self): + return self.data + + def logp(self, **params_values): theory = self._get_theory(**params_values) return self.data.loglike(theory) @@ -75,7 +77,7 @@ def __init__(self, info: dict = empty_dict, **kwargs): def initialize(self): self.cross_cov: Optional[CrossCov] = CrossCov.load(self.cross_cov_path) - data_list = [like.data for like in self.likelihoods] + data_list = [like._get_gauss_data() for like in self.likelihoods] self.data = MultiGaussianData(data_list, self.cross_cov) self.log.info('Initialized.') diff --git a/soliket/mflike/MFLike.yaml b/soliket/mflike/MFLike.yaml deleted file mode 100644 index 28954fb9..00000000 --- a/soliket/mflike/MFLike.yaml +++ /dev/null @@ -1,58 +0,0 @@ -# A simple cobaya likelihood for SO/LAT - -data_folder: MFLike/v0.8 -# Path to the input SACC file, containing, minimally, -# information about the different tracers (i.e. frequency -# bands) and the set of power spectra. -input_file: null -# If cov_Bbl_file is null, then the previous file should -# also contain bandpower window functions and covariance -# matrix. Otherwise they'll be read from this file. -# (The logic here is that you may have many different -# realizations that share the same bandpowers and covariance) -cov_Bbl_file: null - -# Maximum multipole value up to compute theory Cl -# If set to null i.e. None, the program will set the value to 9000 -lmax_theory: null - -# Specify default set of spectra and scale cuts -# to be used -defaults: - # Which spectra? - polarizations: ['TT', 'TE', 'ET', 'EE'] - # Scale cuts (in ell) for each spectrum - # these scales should be used when marginalizing over foregrounds! - scales: - TT: [30, 9000] - TE: [30, 9000] - ET: [30, 9000] - EE: [30, 9000] - # If True, TE' = (TE + ET) / 2 will be used - # instead of TE and ET separately. 
- symmetrize: False - -data: - # List the names and frequencies of all the - # relevant experiments. - experiments: - - LAT_93 - - LAT_145 - - LAT_225 - # PlanckHFI: - - spectra: - # Here, list all the different cross-correlations - # between experiments and bands you want to - # analyse. - # For each of them, you can specify which spectra - # and scale cuts you'd like to use. If you don't - # specify anything, the defaults will be used. - - experiments: [LAT_93, LAT_93] - - experiments: [LAT_93, LAT_145] - - experiments: [LAT_93, LAT_225] - - experiments: [LAT_145, LAT_145] - - experiments: [LAT_145, LAT_225] - - experiments: [LAT_225, LAT_225] - - diff --git a/soliket/mflike/TheoryForge_MFLike.yaml b/soliket/mflike/TheoryForge_MFLike.yaml deleted file mode 100644 index 94d87709..00000000 --- a/soliket/mflike/TheoryForge_MFLike.yaml +++ /dev/null @@ -1,60 +0,0 @@ -data_folder: MFLike/v0.6 - -# freqs can be either assigned from yaml -# or passed from like/theory component requiring theoryforge -exp_ch: ["LAT_93", "LAT_145", "LAT_225"] -eff_freqs: [93, 145, 225] - -spectra: - # Which spectra? - # polarizations can be either assigned from yaml - # or passed from like/theory component requiring theoryforge - polarizations: ["tt", "te", "ee"] - # Which lrange? - lmin: 2 - lmax: 9000 - -systematics_template: - #rootname: "test_template" - -params: - # Systematics - cal_LAT_93: - value: 1 - latex: \mathrm{Cal}_{\rm LAT}^{93} - cal_LAT_145: - value: 1 - latex: \mathrm{Cal}_{\rm LAT}^{145} - cal_LAT_225: - value: 1 - latex: \mathrm{Cal}_{\rm LAT}^{225} - calT_LAT_93: - value: 1 - latex: \mathrm{Cal}_{\rm LAT, T}^{93} - calE_LAT_93: - value: 1 - latex: \mathrm{Cal}_{\rm LAT, E}^{93} - calT_LAT_145: - value: 1 - latex: \mathrm{Cal}_{\rm LAT, T}^{145} - calE_LAT_145: - value: 1 - latex: \mathrm{Cal}_{\rm LAT, E}^{145} - calT_LAT_225: - value: 1 - latex: \mathrm{Cal}_{\rm LAT, T}^{225} - calE_LAT_225: - value: 1 - latex: \mathrm{Cal}_{\rm LAT, E}^{225} - calG_all: - value: 1 - latex: \mathrm{Cal}_{\rm G}^{\rm All} - alpha_LAT_93: - value: 0 #deg - latex: \alpha_{\rm LAT}^{93} - alpha_LAT_145: - value: 0 #deg - latex: \alpha_{\rm LAT}^{145} - alpha_LAT_225: - value: 0 #deg - latex: \alpha_{\rm LAT}^{225} diff --git a/soliket/mflike/__init__.py b/soliket/mflike/__init__.py deleted file mode 100644 index 040bc7ab..00000000 --- a/soliket/mflike/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -from .mflike import MFLike, TestMFLike -from .theoryforge_MFLike import TheoryForge_MFLike - -__all__ = [ - # mflike - "MFLike", - "TestMFLike", - # theoryforge_MFLike - "TheoryForge_MFLike", -] diff --git a/soliket/mflike/mflike.py b/soliket/mflike/mflike.py deleted file mode 100644 index 49f41056..00000000 --- a/soliket/mflike/mflike.py +++ /dev/null @@ -1,439 +0,0 @@ -r""" -.. module:: mflike - -:Synopsis: Multi frequency likelihood for TTTEEE CMB power spectra for Simons Observatory -:Authors: Thibaut Louis, Xavier Garrido, Max Abitbol, - Erminia Calabrese, Antony Lewis, David Alonso. - -MFLike is a multi frequency likelihood code interfaced with the Cobaya -sampler and a theory Boltzmann code such as CAMB, CLASS or Cosmopower. -The ``MFLike`` likelihood class reads the data file (in ``sacc`` format) -and all the settings -for the MCMC run (such as file paths, :math:`\ell` ranges, experiments -and frequencies to be used, parameters priors...) -from the ``MFLike.yaml`` file. 
- -The theory :math:`C_{\ell}` are then summed to the (possibly frequency -integrated) foreground power spectra and modified by systematic effects -in the ``TheoryForge_MFLike`` class. The foreground power spectra are -computed by the ``soliket.Foreground`` class, while the bandpasses from -the ``soliket.BandPass`` one; the ``Foreground`` class is required by -``TheoryForge_MFLike``, while ``BandPass`` is requires by ``Foreground``. -This is a scheme of how ``MFLike`` and ``TheoryForge_MFLike`` are interfaced: - -.. image:: images/mflike_scheme.png - :width: 400 -""" -import os -from typing import List, Optional, Tuple - -import numpy as np -from cobaya.likelihoods.base_classes import InstallableLikelihood -from cobaya.log import LoggedError -from cobaya.theory import Provider - -from soliket.gaussian import GaussianData, GaussianLikelihood - - -class MFLike(GaussianLikelihood, InstallableLikelihood): - _url = "https://portal.nersc.gov/cfs/sobs/users/MFLike_data" - _release = "v0.8" - install_options = {"download_url": "{}/{}.tar.gz".format(_url, _release)} - - # attributes set from .yaml - input_file: Optional[str] - cov_Bbl_file: Optional[str] - data: dict - defaults: dict - lmax_theory: Optional[int] - data: dict - provider: Provider - - def initialize(self): - # Set default values to data member not initialized via yaml file - self.l_bpws: Optional[np.ndarray] = None - self.spec_meta: Optional[list] = [] - - # Set path to data - if ((not getattr(self, "path", None)) and - (not getattr(self, "packages_path", None))): - raise LoggedError(self.log, - "No path given to MFLike data. " - "Set the likelihood property " - "'path' or 'packages_path'" - ) - # If no path specified, use the modules path - data_file_path = os.path.normpath(getattr(self, "path", None) or - os.path.join(self.packages_path, - "data")) - - self.data_folder = os.path.join(data_file_path, self.data_folder) - if not os.path.exists(self.data_folder): - if not getattr(self, "path", None): - self.install(path=self.packages_path) - else: - raise LoggedError( - self.log, - "The 'data_folder' directory does not exist. " - "Check the given path [%s].", - self.data_folder, - ) - - self.requested_cls: List[str] = [ - p.lower() for p in self.defaults["polarizations"] - ] - for x in ["et", "eb", "bt"]: - if x in self.requested_cls: - self.requested_cls.remove(x) - - # Read data - self.prepare_data() - self.lmax_theory: int = self.lmax_theory or 9000 - self.log.debug(f"Maximum multipole value: {self.lmax_theory}") - - self.log.info("Initialized!") - - def get_requirements(self) -> dict: - r""" - Passes the fields ``ell``, ``requested_cls``, ``lcuts``, - ``exp_ch`` (list of array names) and ``bands`` - (dictionary of ``exp_ch`` and the corresponding frequency - and passbands) inside the dictionary ``requirements["cmbfg_dict"]``. 
- - :return: the dictionary ``requirements["cmbfg_dict"]`` - """ - # mflike requires cmbfg_dict from theoryforge - # cmbfg_dict requires some params to be computed - reqs = dict() - reqs["cmbfg_dict"] = {"ell": self.l_bpws, - "requested_cls": self.requested_cls, - "lcuts": self.lcuts, - "exp_ch": self.experiments, - "bands": self.bands} - return reqs - - def _get_theory(self, **params_values) -> np.ndarray: - cmbfg_dict = self.provider.get_cmbfg_dict() - return self._get_power_spectra(cmbfg_dict) - - def logp(self, **params_values) -> float: - cmbfg_dict = self.provider.get_cmbfg_dict() - return self.loglike(cmbfg_dict) - - def loglike(self, cmbfg_dict: dict) -> float: - r""" - Computes the gaussian log-likelihood - - :param cmbfg_dict: the dictionary of theory + foregrounds - :math:`D_{\ell}` - - :return: the exact loglikelihood :math:`\ln \mathcal{L}` - """ - ps_vec = self._get_power_spectra(cmbfg_dict) - delta = self.data_vec - ps_vec - logp = -0.5 * (delta @ self.inv_cov @ delta) - logp += self.logp_const - self.log.debug( - "Log-likelihood value computed " - "= {} (Χ² = {})".format(logp, -2 * (logp - self.logp_const))) - return logp - - def prepare_data(self, verbose=False): - """ - Reads the sacc data, extracts the data tracers, - trims the spectra and covariance according to the ell scales - set in the input file. It stores the ell vector, the deta vector - and the covariance in a GaussianData object. - If ``verbose=True``, it plots the tracer names, the spectrum name, - the shape of the indices array, lmin, lmax. - """ - import sacc - data = self.data - # Read data - input_fname = os.path.join(self.data_folder, self.input_file) - s = sacc.Sacc.load_fits(input_fname) - - # Read extra file containing covariance and windows if needed. - cbbl_extra = False - s_b = s - if self.cov_Bbl_file: - if self.cov_Bbl_file != self.input_file: - cov_Bbl_fname = os.path.join(self.data_folder, - self.cov_Bbl_file) - s_b = sacc.Sacc.load_fits(cov_Bbl_fname) - cbbl_extra = True - - try: - default_cuts = self.defaults - except AttributeError: - raise KeyError("You must provide a list of default cuts") - - # Translation betwen TEB and sacc C_ell types - pol_dict = {"T": "0", - "E": "e", - "B": "b"} - ppol_dict = {"TT": "tt", - "EE": "ee", - "TE": "te", - "ET": "te", - "BB": "bb", - "EB": "eb", - "BE": "eb", - "TB": "tb", - "BT": "tb", - "BB": "bb"} - - def get_cl_meta(spec: dict) -> Tuple[str, str, List[str], dict, bool]: - """ - Lower-level function of `prepare_data`. - For each of the entries of the `spectra` section of the - yaml file, extracts the relevant information: channel, - polarization combinations, scale cuts and - whether TE should be symmetrized. 
- - :param spec: the dictionary ``data["spectra"]`` - """ - # Experiments/frequencies - exp_1, exp_2 = spec["experiments"] - # Read off polarization channel combinations - pols = spec.get("polarizations", - default_cuts["polarizations"]).copy() - # Read off scale cuts - scls = spec.get("scales", - default_cuts["scales"]).copy() - - # For the same two channels, do not include ET and TE, only TE - if exp_1 == exp_2: - if "ET" in pols: - pols.remove("ET") - if "TE" not in pols: - pols.append("TE") - scls["TE"] = scls["ET"] - symm = False - else: - # Symmetrization - if ("TE" in pols) and ("ET" in pols): - symm = spec.get("symmetrize", - default_cuts["symmetrize"]) - else: - symm = False - - return exp_1, exp_2, pols, scls, symm - - def get_sacc_names(pol: str, exp_1: str, exp_2: str) -> Tuple[str, str, str]: - r""" - Lower-level function of `prepare_data`. - Translates the polarization combination and channel - name of a given entry in the `spectra` - part of the input yaml file into the names expected - in the SACC files. - - :param pol: temperature or polarization fields, i.e. 'TT', 'TE' - :param exp_1: experiment of map 1 - :param exp_2: experiment of map 2 - - :return: tracer name 1, tracer name 2, string for :math:`C_{\ell}` - type - """ - tname_1 = exp_1 - tname_2 = exp_2 - p1, p2 = pol - if p1 in ["E", "B"]: - tname_1 += "_s2" - else: - tname_1 += "_s0" - if p2 in ["E", "B"]: - tname_2 += "_s2" - else: - tname_2 += "_s0" - - if p2 == "T": - dtype = "cl_" + pol_dict[p2] + pol_dict[p1] - else: - dtype = "cl_" + pol_dict[p1] + pol_dict[p2] - return tname_1, tname_2, dtype - - # First we trim the SACC file so it only contains - # the parts of the data we care about. - # Indices to be kept - indices = [] - indices_b = [] - # Length of the final data vector - len_compressed = 0 - for spectrum in data["spectra"]: - (exp_1, exp_2, pols, scls, symm) = get_cl_meta(spectrum) - for pol in pols: - tname_1, tname_2, dtype = get_sacc_names(pol, exp_1, exp_2) - lmin, lmax = scls[pol] - ind = s.indices(dtype, # Power spectrum type - (tname_1, tname_2), # Channel combinations - ell__gt=lmin, ell__lt=lmax) # Scale cuts - indices += list(ind) - - # Note that data in the cov_Bbl file may be in different order. - if cbbl_extra: - ind_b = s_b.indices(dtype, - (tname_1, tname_2), - ell__gt=lmin, ell__lt=lmax) - indices_b += list(ind_b) - - if symm and pol == "ET": - pass - else: - len_compressed += ind.size - - self.log.debug(f"{tname_1} {tname_2} {dtype} {ind.shape} {lmin} {lmax}") - - # Get rid of all the unselected power spectra. - # Sacc takes care of performing the same cuts in the - # covariance matrix, window functions etc. - s.keep_indices(np.array(indices)) - if cbbl_extra: - s_b.keep_indices(np.array(indices_b)) - - # Now create metadata for each spectrum - len_full = s.mean.size - # These are the matrices we'll use to compress the data if - # `symmetrize` is true. - # Note that a lot of the complication in this function is caused by the - # symmetrization option, for which SACC doesn't have native support. 
- mat_compress = np.zeros([len_compressed, len_full]) - mat_compress_b = np.zeros([len_compressed, len_full]) - - self.lcuts = {k: c[1] for k, c in default_cuts["scales"].items()} - index_sofar = 0 - - for spectrum in data["spectra"]: - (exp_1, exp_2, pols, scls, symm) = get_cl_meta(spectrum) - for k in scls.keys(): - self.lcuts[k] = max(self.lcuts[k], scls[k][1]) - for pol in pols: - tname_1, tname_2, dtype = get_sacc_names(pol, exp_1, exp_2) - # The only reason why we need indices is the symmetrization. - # Otherwise all of this could have been done in the previous - # loop over data["spectra"]. - ls, cls, ind = s.get_ell_cl(dtype, tname_1, tname_2, return_ind=True) - if cbbl_extra: - ind_b = s_b.indices(dtype, - (tname_1, tname_2)) - ws = s_b.get_bandpower_windows(ind_b) - else: - ws = s.get_bandpower_windows(ind) - - if self.l_bpws is None: - # The assumption here is that bandpower windows - # will all be sampled at the same ells. - self.l_bpws = ws.values - - # Symmetrize if needed. - if (pol in ["TE", "ET"]) and symm: - pol2 = pol[::-1] - pols.remove(pol2) - tname_1, tname_2, dtype = get_sacc_names(pol2, - exp_1, exp_2) - ind2 = s.indices(dtype, - (tname_1, tname_2)) - cls2 = s.get_ell_cl(dtype, tname_1, tname_2)[1] - cls = 0.5 * (cls + cls2) - - for i, (j1, j2) in enumerate(zip(ind, ind2)): - mat_compress[index_sofar + i, j1] = 0.5 - mat_compress[index_sofar + i, j2] = 0.5 - if cbbl_extra: - ind2_b = s_b.indices(dtype, - (tname_1, tname_2)) - for i, (j1, j2) in enumerate(zip(ind_b, ind2_b)): - mat_compress_b[index_sofar + i, j1] = 0.5 - mat_compress_b[index_sofar + i, j2] = 0.5 - else: - for i, j1 in enumerate(ind): - mat_compress[index_sofar + i, j1] = 1 - if cbbl_extra: - for i, j1 in enumerate(ind_b): - mat_compress_b[index_sofar + i, j1] = 1 - # The fields marked with # below aren't really used, but - # we store them just in case. - self.spec_meta.append({"ids": (index_sofar + - np.arange(cls.size, - dtype=int)), - "pol": ppol_dict[pol], - "hasYX_xsp": pol in ["ET", "BE", "BT"], # For symm - "t1": exp_1, - "t2": exp_2, - "leff": ls, - "cl_data": cls, - "bpw": ws}) - index_sofar += cls.size - if not cbbl_extra: - mat_compress_b = mat_compress - # Put data and covariance in the right order. - self.data_vec = np.dot(mat_compress, s.mean) - self.cov = np.dot(mat_compress_b, - s_b.covariance.covmat.dot(mat_compress_b.T)) - self.inv_cov = np.linalg.inv(self.cov) - self.logp_const = np.log(2 * np.pi) * (-len(self.data_vec) / 2) - self.logp_const -= 0.5 * np.linalg.slogdet(self.cov)[1] - - self.experiments = data["experiments"] - self.bands = { - name: {"nu": tracer.nu, "bandpass": tracer.bandpass} - for name, tracer in s.tracers.items() - } - - # Put lcuts in a format that is recognisable by CAMB. 
- self.lcuts = {k.lower(): c for k, c in self.lcuts.items()} - if "et" in self.lcuts: - del self.lcuts["et"] - - ell_vec = np.zeros_like(self.data_vec) - for m in self.spec_meta: - i = m["ids"] - ell_vec[i] = m["leff"] - self.ell_vec = ell_vec - - self.data = GaussianData("mflike", self.ell_vec, self.data_vec, self.cov) - - def _get_power_spectra(self, cmbfg: dict) -> np.ndarray: - r""" - Get :math:`D_{\ell}` from the theory component - already modified by ``theoryforge_MFLike`` - - :param cmbfg: the dictionary of theory+foreground :math:`D_{\ell}` - - :return: the binned data vector - """ - ps_vec = np.zeros_like(self.data_vec) - DlsObs = dict() - # Rescale l_bpws because cmbfg spectra start from first element of l_bpws (l=2) - ell = self.l_bpws - self.l_bpws[0] - - for m in self.spec_meta: - p = m["pol"] - i = m["ids"] - w = m["bpw"].weight.T - - if p in ['tt', 'ee', 'bb']: - DlsObs[p, m['t1'], m['t2']] = cmbfg[p, m['t1'], m['t2']][ell] - else: # ['te','tb','eb'] - if m['hasYX_xsp']: # not symmetrizing - DlsObs[p, m['t2'], m['t1']] = cmbfg[p, m['t2'], m['t1']][ell] - else: - DlsObs[p, m['t1'], m['t2']] = cmbfg[p, m['t1'], m['t2']][ell] - # - if self.defaults['symmetrize']: # we average TE and ET (as for data) - DlsObs[p, m['t1'], m['t2']] += cmbfg[p, m['t2'], m['t1']][ell] - DlsObs[p, m['t1'], m['t2']] *= 0.5 - - dls_obs = DlsObs[p, m["t2"], m["t1"]] if m["hasYX_xsp"] \ - else DlsObs[p, m["t1"], m["t2"]] - - clt = w @ dls_obs - ps_vec[i] = clt - - return ps_vec - - -class TestMFLike(MFLike): - _url = "https://portal.nersc.gov/cfs/sobs/users/MFLike_data" - filename = "v0.1_test" - install_options = {"download_url": f"{_url}/{filename}.tar.gz"} diff --git a/soliket/mflike/theoryforge_MFLike.py b/soliket/mflike/theoryforge_MFLike.py deleted file mode 100644 index fe7e3900..00000000 --- a/soliket/mflike/theoryforge_MFLike.py +++ /dev/null @@ -1,349 +0,0 @@ -r""" -.. module:: theoryforge - -The ``TheoryForge_MFLike`` class applies the foreground spectra and systematic -effects to the theory spectra provided by ``MFLike``. To do that, ``MFLike`` -provides ``TheoryForge_MFLike`` with the appropriate list of channels, the -requested temperature/polarization fields, the -:math:`\ell` ranges and a dictionary of the passbands read from the ``sacc`` file: - -.. code-block:: python - - bands = {"experiment_channel": {{"nu": [freqs...], - "bandpass": [...]}}, ...} - -This dictionary is then passed to ``Bandpass`` (through ``Foreground``) -to compute the bandpass -transmissions, which are then used for the actual foreground spectra computation. - - -If one wants to use this class as standalone, the ``bands`` dictionary is -filled when initializing ``TheoryForge_MFLike``. The name of the channels to use -are read from the ``exp_ch`` list in ``TheoryForge_MFLike.yaml``, the effective -frequencies are in the ``eff_freqs`` list. Of course the effective frequencies -have to match the information from ``exp_ch``, i.e.: - -.. code-block:: yaml - - exp_ch: ["LAT_93", "LAT_145", "ACT_145"] - eff_freqs: [93, 145, 145] - -This class applies three kinds of systematic effects to the CMB + foreground -power spectrum: - * calibrations (global ``calG_all``, per channel ``cal_exp_nu``, per field - ``calT_exp_nu``, ``calE_exp_nu``) - * polarization angles effect (``alpha_exp_nu``) - * systematic templates (e.g. T --> P leakage). In this case the dictionary - ``systematics_template`` has to be filled with the correct path - ``rootname``: - - .. 
code-block:: yaml - - systematics_template: - rootname: "test_template" - - If left ``null``, no systematic template is applied. - - -The bandpass shifts are applied within the ``Bandpass`` class. - -The values of the systematic parameters are set in ``TheoryForge_MFLike.yaml``. -They have to be named as ``cal/calT/calE/alpha`` + ``_`` + experiment_channel string -(e.g. ``LAT_93/dr6_pa4_f150``). - -""" - -from typing import List, Optional, Union - -import numpy as np -from cobaya.log import LoggedError -from cobaya.theory import Provider, Theory -from cobaya.tools import are_different_params_lists - - -class TheoryForge_MFLike(Theory): - # attributes set from .yaml - data_folder: Optional[str] - exp_ch: List[str] - eff_freqs: List[Union[int, float]] - spectra: dict - systematics_template: dict - params: dict - provider: Provider - - enforce_types: bool = True - - def initialize(self): - - self.lmin: int = self.spectra["lmin"] - self.lmax: int = self.spectra["lmax"] - self.ell = np.arange(self.lmin, self.lmax + 1) - - # State requisites to the theory code - # Which lmax for theory CMB - # Note this must be greater than lmax above to avoid approx errors - self.lmax_boltzmann: int = self.lmax + 500 - - # Which lmax for theory FG - # This can be larger than lmax boltzmann - self.lmax_fg: int = self.lmax + 500 - - # Which spectra to consider - self.requested_cls: List[str] = self.spectra["polarizations"] - - # Set lmax for theory CMB requirements - self.lcuts = {k: self.lmax_boltzmann for k in self.requested_cls} - - if hasattr(self.eff_freqs, "__len__"): - if not len(self.exp_ch) == len(self.eff_freqs): - raise LoggedError( - self.log, "list of effective frequency has to have" - "same length as list of channels!" - ) - - # self.bands to be filled with passbands read from sacc file - # if mflike is used - self.bands = {f"{expc}_s0": {'nu': [self.eff_freqs[iexpc]], 'bandpass': [1.]} - for iexpc, expc in enumerate(self.exp_ch)} - - self.expected_params_nuis = ["cal_LAT_93", "cal_LAT_145", "cal_LAT_225", - "calT_LAT_93", "calE_LAT_93", - "calT_LAT_145", "calE_LAT_145", - "calT_LAT_225", "calE_LAT_225", - "calG_all", - "alpha_LAT_93", "alpha_LAT_145", - "alpha_LAT_225", - ] - - self.use_systematics_template = bool(self.systematics_template) - - if self.use_systematics_template: - self.systematics_template = self.systematics_template - # Initialize template for marginalization, if needed - self._init_template_from_file() - - def initialize_with_params(self): - # Check that the parameters are the right ones - differences = are_different_params_lists( - self.input_params, self.expected_params_nuis, - name_A="given", name_B="expected") - if differences: - raise LoggedError( - self.log, "Configuration error in parameters: %r.", - differences) - - def must_provide(self, **requirements) -> dict: - # cmbfg_dict is required by mflike - # and requires some params to be computed - # Assign required params from mflike - # otherwise use default values - if "cmbfg_dict" in requirements: - req = requirements["cmbfg_dict"] - self.ell = req.get("ell", self.ell) - #self.log.info('%d', self.ell[0]) - #self.log.info('%d', self.ell[-1]) - self.requested_cls = req.get("requested_cls", self.requested_cls) - self.lcuts = req.get("lcuts", self.lcuts) - self.exp_ch = req.get("exp_ch", self.exp_ch) - self.bands = req.get("bands", self.bands) - - # theoryforge requires Cl from boltzmann solver - # and fg_dict from Foreground theory component - # Both requirements require some params to be computed - # Passing those from 
theoryforge - reqs = dict() - # Be sure that CMB is computed at lmax > lmax_data (lcuts from mflike here) - reqs["Cl"] = {k: max(c, self.lmax_boltzmann + 1) for k, c in self.lcuts.items()} - reqs["fg_dict"] = {"requested_cls": self.requested_cls, - "ell": self.ell, - "exp_ch": self.exp_ch, "bands": self.bands} - return reqs - - def get_cmb_theory(self, **params) -> dict: - return self.provider.get_Cl(ell_factor=True) - - def get_foreground_theory(self, **params) -> dict: - return self.provider.get_fg_dict() - - def calculate(self, state, want_derived=False, **params_values_dict): - Dls = self.get_cmb_theory(**params_values_dict) - #self.Dls = {s: Dls[s][self.ell] for s, _ in self.lcuts.items()} - Dls_cut = {s: Dls[s][self.ell] for s, _ in self.lcuts.items()} - params_values_nocosmo = {k: params_values_dict[k] for k in ( - self.expected_params_nuis)} - fg_dict = self.get_foreground_theory(**params_values_nocosmo) - state["cmbfg_dict"] = self.get_modified_theory(Dls_cut, - fg_dict, **params_values_nocosmo) - - def get_cmbfg_dict(self) -> dict: - return self.current_state["cmbfg_dict"] - - def get_modified_theory(self, Dls: dict, fg_dict: dict, **params) -> dict: - r""" - Takes the theory :math:`D_{\ell}`, sums it to the total - foreground power spectrum (possibly computed with bandpass - shift and bandpass integration) and applies calibration, - polarization angles rotation and systematic templates. - - :param Dls: CMB theory spectra - :param fg_dict: total foreground spectra, provided by - ``soliket.Foreground`` - :param *params: dictionary of nuisance and foregrounds parameters - - :return: the CMB+foregrounds :math:`D_{\ell}` dictionary, - modulated by systematics - """ - - nuis_params = {k: params[k] for k in self.expected_params_nuis} - - cmbfg_dict = {} - # Sum CMB and FGs - for f1 in self.exp_ch: - for f2 in self.exp_ch: - for s in self.requested_cls: - cmbfg_dict[s, f1, f2] = (Dls[s] + - fg_dict[s, 'all', f1, f2]) - - # Apply alm based calibration factors - cmbfg_dict = self._get_calibrated_spectra(cmbfg_dict, **nuis_params) - - # Introduce spectra rotations - cmbfg_dict = self._get_rotated_spectra(cmbfg_dict, **nuis_params) - - # Introduce templates of systematics from file, if needed - if self.use_systematics_template: - cmbfg_dict = self._get_template_from_file(cmbfg_dict, **nuis_params) - - return cmbfg_dict - - def _get_calibrated_spectra(self, dls_dict: dict, **nuis_params: dict) -> dict: - r""" - Calibrates the spectra through calibration factors at - the map level: - - .. math:: - - D^{{\rm cal}, TT, \nu_1 \nu_2}_{\ell} &= \frac{1}{ - {\rm cal}_{G}\, {\rm cal}^{\nu_1} \, {\rm cal}^{\nu_2}\, - {\rm cal}^{\nu_1}_{\rm T}\, - {\rm cal}^{\nu_2}_{\rm T}}\, D^{TT, \nu_1 \nu_2}_{\ell} - - D^{{\rm cal}, TE, \nu_1 \nu_2}_{\ell} &= \frac{1}{ - {\rm cal}_{G}\,{\rm cal}^{\nu_1} \, {\rm cal}^{\nu_2}\, - {\rm cal}^{\nu_1}_{\rm T}\, - {\rm cal}^{\nu_2}_{\rm E}}\, D^{TT, \nu_1 \nu_2}_{\ell} - - D^{{\rm cal}, EE, \nu_1 \nu_2}_{\ell} &= \frac{1}{ - {\rm cal}_{G}\,{\rm cal}^{\nu_1} \, {\rm cal}^{\nu_2}\, - {\rm cal}^{\nu_1}_{\rm E}\, - {\rm cal}^{\nu_2}_{\rm E}}\, D^{EE, \nu_1 \nu_2}_{\ell} - - It uses the ``syslibrary.syslib_mflike.Calibration_alm`` function. 
- - :param dls_dict: the CMB+foregrounds :math:`D_{\ell}` dictionary - :param *nuis_params: dictionary of nuisance parameters - - :return: dictionary of calibrated CMB+foregrounds :math:`D_{\ell}` - """ - from syslibrary import syslib_mflike as syl - - cal_pars = {} - if "tt" in self.requested_cls or "te" in self.requested_cls: - cal = (nuis_params["calG_all"] * - np.array([nuis_params[f"cal_{exp}"] * nuis_params[f"calT_{exp}"] - for exp in self.exp_ch])) - cal_pars["t"] = 1 / cal - - if "ee" in self.requested_cls or "te" in self.requested_cls: - cal = (nuis_params["calG_all"] * - np.array([nuis_params[f"cal_{exp}"] * nuis_params[f"calE_{exp}"] - for exp in self.exp_ch])) - cal_pars["e"] = 1 / cal - - calib = syl.Calibration_alm(ell=self.ell, spectra=dls_dict) - - return calib(cal1=cal_pars, cal2=cal_pars, nu=self.exp_ch) - - ########################################################################### - # This part deals with rotation of spectra - # Each freq {freq1,freq2,...,freqn} gets a rotation angle alpha_LAT_93, - # alpha_LAT_145, etc.. - ########################################################################### - - def _get_rotated_spectra(self, dls_dict: dict, **nuis_params: dict) -> dict: - r""" - Rotates the polarization spectra through polarization angles: - - .. math:: - - D^{{\rm rot}, TE, \nu_1 \nu_2}_{\ell} &= \cos(\alpha^{\nu_2}) - D^{TE, \nu_1 \nu_2}_{\ell} - - D^{{\rm rot}, EE, \nu_1 \nu_2}_{\ell} &= \cos(\alpha^{\nu_1}) - \cos(\alpha^{\nu_2}) D^{EE, \nu_1 \nu_2}_{\ell} - - It uses the ``syslibrary.syslib_mflike.Rotation_alm`` function. - - :param dls_dict: the CMB+foregrounds :math:`D_{\ell}` dictionary - :param *nuis_params: dictionary of nuisance parameters - - :return: dictionary of rotated CMB+foregrounds :math:`D_{\ell}` - """ - from syslibrary import syslib_mflike as syl - - rot_pars = [nuis_params[f"alpha_{exp}"] for exp in self.exp_ch] - - rot = syl.Rotation_alm(ell=self.ell, spectra=dls_dict) - - return rot(rot_pars, nu=self.exp_ch, cls=self.requested_cls) - - ########################################################################### - # This part deals with template marginalization - # A dictionary of template dls is read from yaml (likely to be not efficient) - # then rescaled and added to theory dls - ########################################################################### - - # Initializes the systematics templates - # This is slow, but should be done only once - def _init_template_from_file(self): - """ - Reads the systematics template from file, using - the ``syslibrary.syslib_mflike.ReadTemplateFromFile`` - function. - """ - - if not self.systematics_template.get("rootname"): - raise LoggedError(self.log, "Missing 'rootname' for systematics template!") - - from syslibrary import syslib_mflike as syl - - # decide where to store systematics template. - # Currently stored inside syslibrary package - templ_from_file = \ - syl.ReadTemplateFromFile(rootname=self.systematics_template["rootname"]) - self.dltempl_from_file = templ_from_file(ell=self.ell) - - def _get_template_from_file(self, dls_dict: dict, **nuis_params: dict) -> dict: - r""" - Adds the systematics template, modulated by ``nuis_params['templ_freq']`` - parameters, to the :math:`D_{\ell}`. 
- - :param dls_dict: the CMB+foregrounds :math:`D_{\ell}` dictionary - :param *nuis_params: dictionary of nuisance parameters - - :return: dictionary of CMB+foregrounds :math:`D_{\ell}` - with systematics templates - """ - # templ_pars=[nuis_params['templ_'+str(fr)] for fr in self.exp_ch] - # templ_pars currently hard-coded - # but ideally should be passed as input nuisance - templ_pars = {cls: np.zeros((len(self.exp_ch), len(self.exp_ch))) - for cls in self.requested_cls} - - for cls in self.requested_cls: - for i1, f1 in enumerate(self.exp_ch): - for i2, f2 in enumerate(self.exp_ch): - dls_dict[cls, f1, f2] += (templ_pars[cls][i1][i2] * - self.dltempl_from_file[cls, f1, f2]) - - return dls_dict diff --git a/soliket/tests/data/test_bandpass/LAT_145 b/soliket/tests/data/test_bandpass/LAT_145 deleted file mode 100644 index f8bd4b8b..00000000 --- a/soliket/tests/data/test_bandpass/LAT_145 +++ /dev/null @@ -1,3 +0,0 @@ -1.445000000000000000e+02 9.699999999999999734e-01 -1.450000000000000000e+02 1.000000000000000000e+00 -1.455000000000000000e+02 9.300000000000000488e-01 diff --git a/soliket/tests/data/test_bandpass/LAT_225 b/soliket/tests/data/test_bandpass/LAT_225 deleted file mode 100644 index 2f1715ba..00000000 --- a/soliket/tests/data/test_bandpass/LAT_225 +++ /dev/null @@ -1,3 +0,0 @@ -2.230000000000000000e+02 9.699999999999999734e-01 -2.250000000000000000e+02 9.799999999999999822e-01 -2.280000000000000000e+02 9.599999999999999645e-01 diff --git a/soliket/tests/data/test_bandpass/LAT_93 b/soliket/tests/data/test_bandpass/LAT_93 deleted file mode 100644 index 28a778c8..00000000 --- a/soliket/tests/data/test_bandpass/LAT_93 +++ /dev/null @@ -1,3 +0,0 @@ -9.200000000000000000e+01 9.899999999999999911e-01 -9.300000000000000000e+01 9.799999999999999822e-01 -9.400000000000000000e+01 1.000000000000000000e+00 diff --git a/soliket/tests/test_bandpass.py b/soliket/tests/test_bandpass.py deleted file mode 100644 index 84aae82e..00000000 --- a/soliket/tests/test_bandpass.py +++ /dev/null @@ -1,192 +0,0 @@ -import copy -import numpy as np -from cobaya.model import get_model -import pytest - -from soliket.constants import T_CMB, h_Planck, k_Boltzmann - -bandpass_params = { - "bandint_shift_LAT_93": 0.0, - "bandint_shift_LAT_145": 0.0, - "bandint_shift_LAT_225": 0.0 -} - -bands = {"LAT_93_s0": {"nu": [93], "bandpass": [1.]}, - "LAT_145_s0": {"nu": [145], "bandpass": [1.]}, - "LAT_225_s0": {"nu": [225], "bandpass": [1.]}} -exp_ch = [k.replace("_s0", "") for k in bands.keys()] - - -def _cmb2bb(nu): - # NB: numerical factors not included - x = nu * h_Planck * 1e9 / k_Boltzmann / T_CMB - return np.exp(x) * (nu * x / np.expm1(x)) ** 2 - - -# noinspection PyUnresolvedReferences -def test_bandpass_import(): - from soliket.bandpass import BandPass # noqa F401 - - -def test_wrong_types(): - from soliket.bandpass import BandPass - - base_case = { - "data_folder": "valid_path", - "read_from_sacc": True, - "top_hat_band": {}, - "external_bandpass": {}, - "params": {} - } - - wrong_type_cases = { - "data_folder": 12345, - "read_from_sacc": "not_a_bool", - "top_hat_band": "not_a_dict", - "external_bandpass": "not_a_dict", - "params": "not_a_dict" - } - - for key, wrong_value in wrong_type_cases.items(): - case = copy.deepcopy(base_case) - case[key] = wrong_value - with pytest.raises(TypeError): - _ = BandPass(**case) - - -def test_bandpass_model(evaluate_one_info): - from soliket.bandpass import BandPass - - evaluate_one_info["params"] = bandpass_params - evaluate_one_info["theory"] = {"bandpass": { - "external": 
BandPass, - }, - } - model = get_model(evaluate_one_info) # noqa F841 - - -def test_bandpass_read_from_sacc(evaluate_one_info): - from soliket.bandpass import BandPass - - # testing the default read_from_sacc - evaluate_one_info["params"] = bandpass_params - evaluate_one_info["theory"] = { - "bandpass": {"external": BandPass}, - } - - model = get_model(evaluate_one_info) - model.add_requirements({"bandint_freqs": {"bands": bands} - }) - - model.logposterior(evaluate_one_info['params']) # force computation of model - - lhood = model.likelihood['one'] - - bandpass = lhood.provider.get_bandint_freqs() - - bandint_freqs = np.empty_like(exp_ch, dtype=float) - for ifr, fr in enumerate(exp_ch): - bandpar = 'bandint_shift_' + fr - bandint_freqs[ifr] = ( - np.asarray(bands[fr + "_s0"]["nu"]) + evaluate_one_info["params"][bandpar] - ) - - assert np.allclose(bandint_freqs, bandpass) - - -def test_bandpass_top_hat(evaluate_one_info): - from soliket.bandpass import BandPass - - # now testing top-hat construction - evaluate_one_info["params"] = bandpass_params - evaluate_one_info["theory"] = { - "bandpass": {"external": BandPass, - "top_hat_band": { - "nsteps": 3, - "bandwidth": 0.5}, - "external_bandpass": {}, - "read_from_sacc": False, - }, - } - - model = get_model(evaluate_one_info) - model.add_requirements({"bandint_freqs": {"bands": bands} - }) - model.logposterior(evaluate_one_info['params']) # force computation of model - - lhood = model.likelihood['one'] - - bandpass = lhood.provider.get_bandint_freqs() - - bandint_freqs = [] - nsteps = evaluate_one_info["theory"]["bandpass"]["top_hat_band"]["nsteps"] - bandwidth = evaluate_one_info["theory"]["bandpass"]["top_hat_band"]["bandwidth"] - for ifr, fr in enumerate(exp_ch): - bandpar = 'bandint_shift_' + fr - bd = bands[f"{fr}_s0"] - nu_ghz, bp = np.asarray(bd["nu"]), np.asarray(bd["bandpass"]) - fr = nu_ghz @ bp / bp.sum() - bandlow = fr * (1 - bandwidth * .5) - bandhigh = fr * (1 + bandwidth * .5) - nub = np.linspace(bandlow + evaluate_one_info["params"][bandpar], - bandhigh + evaluate_one_info["params"][bandpar], - nsteps, dtype=float) - tranb = _cmb2bb(nub) - tranb_norm = np.trapz(_cmb2bb(nub), nub) - bandint_freqs.append([nub, tranb / tranb_norm]) - - assert np.allclose(bandint_freqs, bandpass) - - -def test_bandpass_external_file(request, evaluate_one_info): - import os - - from soliket.bandpass import BandPass - - filepath = os.path.join(request.config.rootdir, - "soliket/tests/data/") - # now testing reading from external file - evaluate_one_info["params"] = bandpass_params - evaluate_one_info["theory"] = { - "bandpass": {"external": BandPass, - "data_folder": f"{filepath}", - "top_hat_band": {}, - "external_bandpass": { - "path": "test_bandpass"}, - "read_from_sacc": False, - }, - } - - model = get_model(evaluate_one_info) - model.add_requirements({"bandint_freqs": {"bands": bands} - }) - - model.logposterior(evaluate_one_info['params']) # force computation of model - - lhood = model.likelihood['one'] - - bandpass = lhood.provider.get_bandint_freqs() - - path = os.path.normpath(os.path.join( - evaluate_one_info["theory"]["bandpass"]["data_folder"], - evaluate_one_info["theory"]["bandpass"]["external_bandpass"]["path"])) - - arrays = os.listdir(path) - external_bandpass = [] - for a in arrays: - nu_ghz, bp = np.loadtxt(path + "/" + a, usecols=(0, 1), unpack=True) - external_bandpass.append([a, nu_ghz, bp]) - - bandint_freqs = [] - for expc, nu_ghz, bp in external_bandpass: - bandpar = "bandint_shift_" + expc - nub = nu_ghz + 
evaluate_one_info["params"][bandpar] - if not hasattr(bp, "__len__"): - bandint_freqs.append(nub) - bandint_freqs = np.asarray(bandint_freqs) - else: - trans_norm = np.trapz(bp * _cmb2bb(nub), nub) - trans = bp / trans_norm * _cmb2bb(nub) - bandint_freqs.append([nub, trans]) - - assert np.allclose(bandint_freqs, bandpass) diff --git a/soliket/tests/test_bandpass.yaml b/soliket/tests/test_bandpass.yaml deleted file mode 100644 index d10c09d2..00000000 --- a/soliket/tests/test_bandpass.yaml +++ /dev/null @@ -1,9 +0,0 @@ -params: - bandint_shift_LAT_93: 0.0, - bandint_shift_LAT_145: 0.0, - bandint_shift_LAT_225: 0.0 -likelihood: - one: None -sampler: - evaluate: None -debug: True diff --git a/soliket/tests/test_foreground.py b/soliket/tests/test_foreground.py deleted file mode 100644 index 751c232c..00000000 --- a/soliket/tests/test_foreground.py +++ /dev/null @@ -1,304 +0,0 @@ -import copy -import os - -import numpy as np -from cobaya.model import get_model -import pytest - -foreground_params = { - "a_tSZ": 3.3044404448917724, - "a_kSZ": 1.6646620740058649, - "a_p": 6.912474322461401, - "beta_p": 2.077474196171309, - "a_c": 4.88617700670901, - "beta_c": 2.2030316332596014, - "a_s": 3.099214100532393, - "T_d": 9.60, - "a_gtt": 0, - "a_gte": 0, - "a_gee": 0, - "a_psee": 0, - "a_pste": 0, - "xi": 0, -} - - -def test_foreground_import(): - from soliket.foreground import Foreground # noqa F401 - - -def test_wrong_types(): - from soliket.foreground import Foreground - - base_case = {"spectra": { - "polarizations": ["a", "b", "c"], - "lmin": 1, - "lmax": 2, - "exp_ch": ["A", "B", "C"], - "eff_freqs": [1, 2, 3], - }, "foregrounds": { - "components": {"a": ["b", "c"], "d": ["e", "f"]}, - "normalisation": {"nu_0": 1.0, "ell_0": 2, "T_CMB": 3.0}, - }, "params": {}} - - wrong_main_cases = { - "spectra": "not_a_dict", - "foregrounds": "not_a_dict", - "params": "not_a_dict", - } - - wrong_spectra_cases = { - "polarizations": "not_a_list", - "lmin": "not_an_int", - "lmax": "not_an_int", - "exp_ch": "not_a_list", - "eff_freqs": "not_a_list", - } - - wrong_foregrounds_cases = { - "components": "not_a_dict", - "normalisation": "not_a_dict", - } - - wrong_normalization_cases = { - "nu_0": "not_a_float", - "ell_0": "not_an_int", - "T_CMB": "not_a_float", - } - - for key, wrong_value in wrong_main_cases.items(): - case = copy.deepcopy(base_case) - case[key] = wrong_value - with pytest.raises(TypeError): - _ = Foreground(**case) - - for key, wrong_value in wrong_spectra_cases.items(): - case = copy.deepcopy(base_case) - case["spectra"][key] = wrong_value - with pytest.raises(TypeError): - _ = Foreground(**case) - - for key, wrong_value in wrong_foregrounds_cases.items(): - case = copy.deepcopy(base_case) - case["foregrounds"][key] = wrong_value - with pytest.raises(TypeError): - _ = Foreground(**case) - - for key, wrong_value in wrong_normalization_cases.items(): - case = copy.deepcopy(base_case) - case["foregrounds"]["normalisation"][key] = wrong_value - with pytest.raises(TypeError): - _ = Foreground(**case) - - -def test_foreground_model(evaluate_one_info): - from soliket.foreground import Foreground - - evaluate_one_info["params"] = foreground_params - evaluate_one_info["theory"] = { - "foreground": {"external": Foreground}, - } - model = get_model(evaluate_one_info) # noqa F841 - - -def test_foreground_compute(evaluate_one_info): - from soliket.bandpass import BandPass - from soliket.foreground import Foreground - - evaluate_one_info["params"] = foreground_params - - evaluate_one_info["theory"] = { - 
"foreground": {"external": Foreground}, - "bandpass": {"external": BandPass}, - } - - evaluate_one_info["foregrounds"] = { - "normalisation": {"nu_0": 150.0, "ell_0": 3000, "T_CMB": 2.725}, - "components": { - "tt": ["kSZ", "tSZ_and_CIB", "cibp", "dust", "radio"], - "te": ["radio", "dust"], - "ee": ["radio", "dust"], - }, - } - - evaluate_one_info["spectra"] = { - "polarizations": ["tt", "te", "ee"], - "lmin": 2, - "lmax": 9000, - "exp_ch": ["LAT_93", "LAT_145", "LAT_225"], - "eff_freqs": [93, 145, 225], - } - - nu_0 = evaluate_one_info["foregrounds"]["normalisation"]["nu_0"] - ell_0 = evaluate_one_info["foregrounds"]["normalisation"]["ell_0"] - ell = np.arange( - evaluate_one_info["spectra"]["lmin"], evaluate_one_info["spectra"]["lmax"] + 1 - ) - requested_cls = evaluate_one_info["spectra"]["polarizations"] - components = evaluate_one_info["foregrounds"]["components"] - exp_ch = evaluate_one_info["spectra"]["exp_ch"] - eff_freqs = np.asarray(evaluate_one_info["spectra"]["eff_freqs"]) - bands = { - f"{expc}_s0": {"nu": [eff_freqs[iexpc]], "bandpass": [1.0]} - for iexpc, expc in enumerate(exp_ch) - } - - model = get_model(evaluate_one_info) - model.add_requirements( - { - "fg_dict": { - "requested_cls": requested_cls, - "ell": ell, - "exp_ch": exp_ch, - "bands": bands, - }, - } - ) - - model.logposterior(evaluate_one_info["params"]) # force computation of model - - lhood = model.likelihood["one"] - - fg_model = lhood.provider.get_fg_dict() - fg_model_test = get_fg( - exp_ch, eff_freqs, ell, ell_0, nu_0, requested_cls, components, evaluate_one_info - ) - - for k in fg_model_test.keys(): - assert np.allclose(fg_model[k], fg_model_test[k]) - - -def get_fg( - freqs, bandint_freqs, ell, ell_0, nu_0, requested_cls, components, evaluate_one_info -): - from fgspectra import cross as fgc - from fgspectra import frequency as fgf - from fgspectra import power as fgp - - template_path = os.path.join(os.path.dirname(os.path.abspath(fgp.__file__)), "data") - cibc_file = os.path.join(template_path, "cl_cib_Choi2020.dat") - - ksz = fgc.FactorizedCrossSpectrum(fgf.ConstantSED(), fgp.kSZ_bat()) - cibp = fgc.FactorizedCrossSpectrum(fgf.ModifiedBlackBody(), fgp.PowerLaw()) - radio = fgc.FactorizedCrossSpectrum(fgf.PowerLaw(), fgp.PowerLaw()) - tsz = fgc.FactorizedCrossSpectrum(fgf.ThermalSZ(), fgp.tSZ_150_bat()) - cibc = fgc.FactorizedCrossSpectrum(fgf.CIB(), fgp.PowerSpectrumFromFile(cibc_file)) - dust = fgc.FactorizedCrossSpectrum(fgf.ModifiedBlackBody(), fgp.PowerLaw()) - tSZ_and_CIB = fgc.SZxCIB_Choi2020() - - ell_clp = ell * (ell + 1.0) - ell_0clp = ell_0 * (ell_0 + 1.0) - fg_component_list = {s: components[s] for s in requested_cls} - - model = {} - model["tt", "kSZ"] = evaluate_one_info["params"]["a_kSZ"] * ksz( - {"nu": bandint_freqs}, {"ell": ell, "ell_0": ell_0} - ) - - model["tt", "cibp"] = evaluate_one_info["params"]["a_p"] * cibp( - { - "nu": bandint_freqs, - "nu_0": nu_0, - "temp": evaluate_one_info["params"]["T_d"], - "beta": evaluate_one_info["params"]["beta_p"], - }, - {"ell": ell_clp, "ell_0": ell_0clp, "alpha": 1}, - ) - - model["tt", "radio"] = evaluate_one_info["params"]["a_s"] * radio( - {"nu": bandint_freqs, "nu_0": nu_0, "beta": -0.5 - 2.0}, - {"ell": ell_clp, "ell_0": ell_0clp, "alpha": 1}, - ) - - model["tt", "tSZ"] = evaluate_one_info["params"]["a_tSZ"] * tsz( - {"nu": bandint_freqs, "nu_0": nu_0}, {"ell": ell, "ell_0": ell_0} - ) - - model["tt", "cibc"] = evaluate_one_info["params"]["a_c"] * cibc( - { - "nu": bandint_freqs, - "nu_0": nu_0, - "temp": 
evaluate_one_info["params"]["T_d"], - "beta": evaluate_one_info["params"]["beta_c"], - }, - {"ell": ell, "ell_0": ell_0}, - ) - - model["tt", "dust"] = evaluate_one_info["params"]["a_gtt"] * dust( - {"nu": bandint_freqs, "nu_0": nu_0, "temp": 19.6, "beta": 1.5}, - {"ell": ell, "ell_0": 500.0, "alpha": -0.6}, - ) - - model["tt", "tSZ_and_CIB"] = tSZ_and_CIB( - { - "kwseq": ( - {"nu": bandint_freqs, "nu_0": nu_0}, - { - "nu": bandint_freqs, - "nu_0": nu_0, - "temp": evaluate_one_info["params"]["T_d"], - "beta": evaluate_one_info["params"]["beta_c"], - }, - ) - }, - { - "kwseq": ( - { - "ell": ell, - "ell_0": ell_0, - "amp": evaluate_one_info["params"]["a_tSZ"], - }, - {"ell": ell, "ell_0": ell_0, "amp": evaluate_one_info["params"]["a_c"]}, - { - "ell": ell, - "ell_0": ell_0, - "amp": -evaluate_one_info["params"]["xi"] - * np.sqrt( - evaluate_one_info["params"]["a_tSZ"] - * evaluate_one_info["params"]["a_c"] - ), - }, - ) - }, - ) - - model["ee", "radio"] = evaluate_one_info["params"]["a_psee"] * radio( - {"nu": bandint_freqs, "nu_0": nu_0, "beta": -0.5 - 2.0}, - {"ell": ell_clp, "ell_0": ell_0clp, "alpha": 1}, - ) - - model["ee", "dust"] = evaluate_one_info["params"]["a_gee"] * dust( - {"nu": bandint_freqs, "nu_0": nu_0, "temp": 19.6, "beta": 1.5}, - {"ell": ell, "ell_0": 500.0, "alpha": -0.4}, - ) - - model["te", "radio"] = evaluate_one_info["params"]["a_pste"] * radio( - {"nu": bandint_freqs, "nu_0": nu_0, "beta": -0.5 - 2.0}, - {"ell": ell_clp, "ell_0": ell_0clp, "alpha": 1}, - ) - - model["te", "dust"] = evaluate_one_info["params"]["a_gte"] * dust( - {"nu": bandint_freqs, "nu_0": nu_0, "temp": 19.6, "beta": 1.5}, - {"ell": ell, "ell_0": 500.0, "alpha": -0.4}, - ) - - fg_dict = {} - for c1, f1 in enumerate(freqs): - for c2, f2 in enumerate(freqs): - for s in requested_cls: - fg_dict[s, "all", f1, f2] = np.zeros(len(ell)) - for comp in fg_component_list[s]: - if comp == "tSZ_and_CIB": - fg_dict[s, "tSZ", f1, f2] = model[s, "tSZ"][c1, c2] - fg_dict[s, "cibc", f1, f2] = model[s, "cibc"][c1, c2] - fg_dict[s, "tSZxCIB", f1, f2] = ( - model[s, comp][c1, c2] - - model[s, "tSZ"][c1, c2] - - model[s, "cibc"][c1, c2] - ) - fg_dict[s, "all", f1, f2] += model[s, comp][c1, c2] - else: - fg_dict[s, comp, f1, f2] = model[s, comp][c1, c2] - fg_dict[s, "all", f1, f2] += fg_dict[s, comp, f1, f2] - - return fg_dict diff --git a/soliket/tests/test_foreground.yaml b/soliket/tests/test_foreground.yaml deleted file mode 100644 index 7a0ea3d9..00000000 --- a/soliket/tests/test_foreground.yaml +++ /dev/null @@ -1,7 +0,0 @@ -params: - T_d: 9.60, -likelihood: - one: None -sampler: - evaluate: None -debug: True diff --git a/soliket/tests/test_mflike.py b/soliket/tests/test_mflike.py index 5f190d9d..fd69db53 100644 --- a/soliket/tests/test_mflike.py +++ b/soliket/tests/test_mflike.py @@ -1,285 +1,7 @@ -""" -Make sure that this returns the same result as original mflike.MFLike from LAT_MFlike repo -""" -import copy -import os +def test_import(): -import camb -import numpy as np -import pytest -from cobaya.tools import resolve_packages_path -from packaging.version import Version + import mflike # noqa: F401 -import soliket -from soliket.mflike import TestMFLike + from mflike import BandpowerForeground, Foreground # noqa: F401 -packages_path = resolve_packages_path() - -nuisance_params = { - "a_tSZ": 3.3044404448917724, - "a_kSZ": 1.6646620740058649, - "a_p": 6.912474322461401, - "beta_p": 2.077474196171309, - "a_c": 4.88617700670901, - "beta_c": 2.2030316332596014, - "a_s": 3.099214100532393, - "T_d": 9.60, - "a_gtt": 
0, - "a_gte": 0, - "a_gee": 0, - "a_psee": 0, - "a_pste": 0, - "xi": 0, - "bandint_shift_LAT_93": 0, - "bandint_shift_LAT_145": 0, - "bandint_shift_LAT_225": 0, - "cal_LAT_93": 1, - "cal_LAT_145": 1, - "cal_LAT_225": 1, - "calT_LAT_93": 1, - "calE_LAT_93": 1, - "calT_LAT_145": 1, - "calE_LAT_145": 1, - "calT_LAT_225": 1, - "calE_LAT_225": 1, - "calG_all": 1, - "alpha_LAT_93": 0, - "alpha_LAT_145": 0, - "alpha_LAT_225": 0, -} - - -if Version(camb.__version__) >= Version('1.4'): - chi2s = {"tt": 544.9017, - "te": 136.6051, - "ee": 166.1897, - "tt-te-et-ee": 787.9529} -else: - chi2s = {"tt": 544.8797, - "te-et": 151.8197, - "ee": 166.2835, - "tt-te-et-ee": 787.9843} - -pre = "test_data_sacc_" - - -class Test_mflike: - - @classmethod - def setup_class(cls): - from cobaya.install import install - - install( - {"likelihood": {"soliket.mflike.TestMFLike": None}}, - path=packages_path, - skip_global=False, - force=True, - debug=True, - no_set_global=True, - ) - - def test_mflike_with_wrong_types(self): - - base_case = { - "data_folder": "data", - "input_file": "input.txt", - "cov_Bbl_file": "cov.txt", - "data": { - "experiments": ["exp1"], - "spectra": [{}], - }, - "defaults": { - "polarizations": ["pol1"], - "scales": {"scale1": [1, 2]}, - "symmetrize": True, - }, - "lmax_theory": 2500, - } - - wrong_main_cases = { - "data_folder": 123, - "input_file": 123, - "cov_Bbl_file": 123, - "data": "not_a_dict", - "defaults": "not_a_dict", - "lmax_theory": "not_an_int", - } - - wrong_data_cases = { - "experiments": "not_a_list", - "spectra": "not_a_list", - } - - wrong_defaults_cases = { - "polarizations": "not_a_list", - "scales": "not_a_dict", - "symmetrize": "not_a_bool", - } - - for key, wrong_value in wrong_main_cases.items(): - case = copy.deepcopy(base_case) - case[key] = wrong_value - with pytest.raises(TypeError): - _ = TestMFLike(**case) - - for key, wrong_value in wrong_data_cases.items(): - case = copy.deepcopy(base_case) - case["data"][key] = wrong_value - with pytest.raises(TypeError): - _ = TestMFLike(**case) - - for key, wrong_value in wrong_defaults_cases.items(): - case = copy.deepcopy(base_case) - case["defaults"][key] = wrong_value - with pytest.raises(TypeError): - _ = TestMFLike(**case) - - - def test_theoryforge_with_wrong_types(self): - from soliket.mflike import TheoryForge_MFLike - - base_case = { - "data_folder": "data", - "exp_ch": ["exp1"], - "eff_freqs": (1, 2), - "spectra": { - "lmin": 30, - "lmax": 2000, - "polarizations": ["pol1"] - }, - "systematics_template": {}, - "params": {} - } - - wrong_type_cases_main = { - "data_folder": 123, - "exp_ch": "not_a_list", - "eff_freqs": "not_a_tuple", - "spectra": "not_a_dict", - "systematics_template": "not_a_dict", - "params": "not_a_dict" - } - - wrong_type_cases_spectra = { - "lmin": "not_an_int", - "lmax": "not_an_int", - "polarizations": "not_a_list" - } - - for key, wrong_value in wrong_type_cases_main.items(): - case = copy.deepcopy(base_case) - case[key] = wrong_value - with pytest.raises(TypeError): - _ = TheoryForge_MFLike(**case) - - for key, wrong_value in wrong_type_cases_spectra.items(): - case = copy.deepcopy(base_case) - case["spectra"][key] = wrong_value - with pytest.raises(TypeError): - _ = TheoryForge_MFLike(**case) - - - @pytest.mark.usefixtures("test_cosmology_params") - def test_mflike(self, test_cosmology_params): - - # As of now, there is not a mechanism - # in soliket to ensure there is .loglike that can be called like this - # w/out cobaya - - lmax = 9000 - test_cosmology_params.update({"lmax": lmax, 
"lens_potential_accuracy": 1}) - pars = camb.set_params(**test_cosmology_params) - results = camb.get_results(pars) - powers = results.get_cmb_power_spectra(pars, CMB_unit="muK") - cl_dict = {k: powers["total"][:, v] for - k, v in {"tt": 0, "ee": 1, "te": 3}.items()} - - - BP = soliket.BandPass() - FG = soliket.Foreground() - TF = soliket.TheoryForge_MFLike() - - bands = TF.bands - exp_ch = TF.exp_ch - - requested_cls = TF.requested_cls - BP.bands = bands - BP.exp_ch = [k.replace("_s0", "") for k in bands.keys() - if "_s0" in k] - - bandpass = BP._bandpass_construction(**nuisance_params) - - for select, chi2 in chi2s.items(): - - my_mflike = TestMFLike( - { - "external": TestMFLike, - "packages_path": packages_path, - "data_folder": "TestMFLike", - "input_file": pre + "00000.fits", - "defaults": { - "polarizations": select.upper().split("-"), - "scales": { - "TT": [2, 179], - "TE": [2, 179], - "ET": [2, 179], - "EE": [2, 179], - }, - "symmetrize": False, - }, - } - ) - - ell_cut = my_mflike.l_bpws - dls_cut = {s: cl_dict[s][ell_cut] for s, _ in my_mflike.lcuts.items()} - fg_dict = FG._get_foreground_model(requested_cls=requested_cls, - ell=ell_cut, - exp_ch=exp_ch, - bandint_freqs=bandpass, - **nuisance_params) - dlobs_dict = TF.get_modified_theory(dls_cut, fg_dict, **nuisance_params) - - loglike = my_mflike.loglike(dlobs_dict) - - assert np.isclose( - -2 * (loglike - my_mflike.logp_const), chi2, atol=1e-2, rtol=0.0 - ) - - @pytest.mark.usefixtures("test_cosmology_params") - def test_cobaya(self, test_cosmology_params): - - info = { - "likelihood": { - "soliket.mflike.TestMFLike": { - "datapath": os.path.join(packages_path, "data/TestMFLike"), - "data_folder": "TestMFLike", - "input_file": pre + "00000.fits", - "defaults": { - "polarizations": ["TT", "TE", "ET", "EE"], - "scales": { - "TT": [2, 179], - "TE": [2, 179], - "ET": [2, 179], - "EE": [2, 179], - }, - "symmetrize": False, - }, - }, - }, - "theory": {"camb": {"extra_args": {"lens_potential_accuracy": 1}, - "stop_at_error": True}}, - "params": test_cosmology_params, - "modules": packages_path, - "debug": True, - } - - info["theory"]["soliket.TheoryForge_MFLike"] = {'stop_at_error': True} - info["theory"]["soliket.Foreground"] = {'stop_at_error': True} - info["theory"]["soliket.BandPass"] = {'stop_at_error': True} - from cobaya.model import get_model - - model = get_model(info) - my_mflike = model.likelihood["soliket.mflike.TestMFLike"] - chi2 = -2 * (model.loglikes(nuisance_params)[0] - my_mflike.logp_const) - - assert np.isclose(chi2[0], chi2s["tt-te-et-ee"], atol=1e-2, rtol=0.0) + from mflike import TTTEEE, TT, EE, TE # noqa: F401 diff --git a/soliket/tests/test_mflike.yaml b/soliket/tests/test_mflike.yaml deleted file mode 100644 index 77307a9f..00000000 --- a/soliket/tests/test_mflike.yaml +++ /dev/null @@ -1,243 +0,0 @@ -# A simple cobaya likelihood for SO - -debug: True - -likelihood: - soliket.mflike.TestMFLike: - data_folder: TestMFLike - input_file: test_data_sacc_00000.fits - defaults: - # Which spectra? 
- polarizations: ['TT', 'TE', 'ET', 'EE'] - # Scale cuts (in ell) for each spectrum - scales: - TT: [2, 6002] - TE: [2, 6002] - ET: [2, 6002] - EE: [2, 6002] - symmetrize: False - data: - experiments: - - LAT_93 - - LAT_145 - - LAT_225 - - - spectra: - - experiments: ['LAT_93', 'LAT_93'] - polarizations: ['TT','TE','EE'] - - experiments: ['LAT_93', 'LAT_145'] - - experiments: ['LAT_93', 'LAT_225'] - - experiments: ['LAT_145', 'LAT_145'] - polarizations: ['TT','TE','EE'] - - experiments: ['LAT_145', 'LAT_225'] - - experiments: ['LAT_225', 'LAT_225'] - polarizations: ['TT','TE','EE'] - -params: - # Sampled - cosmomc_theta: - prior: - min: 0.0103 - max: 0.0105 - proposal: 1.5e-6 - latex: \theta_\mathrm{MC} - logA: - prior: - min: 2.6 - max: 3.5 - proposal: 0.0036 - drop: True - latex: \log(10^{10} A_\mathrm{s}) - As: - value: "lambda logA: 1e-10*np.exp(logA)" - latex: A_\mathrm{s} - ns: - prior: - min: 0.9 - max: 1.1 - proposal: 0.0033 - latex: n_\mathrm{s} - ombh2: - prior: - min: 0.017 - max: 0.027 - proposal: 6.5e-5 - latex: \Omega_\mathrm{b}h^2 - omch2: - prior: - min: 0.09 - max: 0.15 - proposal: 0.0011 - latex: \Omega_\mathrm{c}h^2 - Alens: - prior: - min: 0.5 - max: 1.5 - proposal: 0.022 - latex: A_\mathrm{L} - tau: - prior: - dist: norm - loc: 0.0544 - scale: 0.0073 - proposal: 0.0073 - latex: \tau_\mathrm{reio} - H0: - latex: H_0 - sigma8: - latex: \sigma_8 - - # Sampled nuisance params - a_tSZ: - prior: - min: 3.0 - max: 3.6 - proposal: 0.05 - latex: a_\mathrm{tSZ} - a_kSZ: - prior: - min: 1.4 - max: 1.8 - proposal: 0.1 - latex: a_\mathrm{kSZ} - a_p: - prior: - min: 6.2 - max: 7.6 - proposal: 0.075 - latex: a_p - beta_p: - prior: - min: 1.8 - max: 2.2 - proposal: 0.015 - latex: \beta_p - a_c: - prior: - min: 4.4 - max: 5.4 - proposal: 0.12 - latex: a_c - beta_c: - prior: - min: 2.0 - max: 2.4 - proposal: 0.03 - latex: \beta_c - a_s: - prior: - min: 2.8 - max: 3.4 - proposal: 0.01 - latex: a_s - a_gtt: - prior: - dist: norm - loc: 2.79 - scale: 0.45 - proposal: 0.4 - latex: a_\mathrm{dust}^\mathrm{TT} - a_gte: - prior: - dist: norm - loc: 0.36 - scale: 0.04 - proposal: 0.04 - latex: a_\mathrm{dust}^\mathrm{TE} - a_gee: - prior: - dist: norm - loc: 0.13 - scale: 0.03 - proposal: 0.03 - latex: a_\mathrm{dust}^\mathrm{EE} - a_psee: - prior: - min: 0 - proposal: 0.05 - latex: a_\mathrm{ps}^\mathrm{EE} - a_pste: - prior: - min: -1 - max: 1 - proposal: 0.05 - latex: a_\mathrm{ps}^\mathrm{TE} - xi: - prior: - min: 0 - max: 0.2 - proposal: 0.05 - latex: \xi - T_d: - prior: - min: 8.60 - max: 10.60 - proposal: 0.6 - latex: T_d - # Systematics - bandint_shift_LAT_93: - value: 0 - latex: \Delta_{\rm band}^{93} - bandint_shift_LAT_145: - value: 0 - latex: \Delta_{\rm band}^{145} - bandint_shift_LAT_225: - value: 0 - latex: \Delta_{\rm band}^{225} - cal_LAT_93: - value: 1 - latex: \mathrm{Cal}^{93} - cal_LAT_145: - value: 1 - latex: \mathrm{Cal}^{145} - cal_LAT_225: - value: 1 - latex: \mathrm{Cal}^{225} - calT_LAT_93: - value: 1 - latex: \mathrm{Cal}_{\rm T}^{93} - calE_LAT_93: - value: 1 - latex: \mathrm{Cal}_{\rm E}^{93} - calT_LAT_145: - value: 1 - latex: \mathrm{Cal}_{\rm T}^{145} - calE_LAT_145: - value: 1 - latex: \mathrm{Cal}_{\rm E}^{145} - calT_LAT_225: - value: 1 - latex: \mathrm{Cal}_{\rm T}^{225} - calE_LAT_225: - value: 1 - latex: \mathrm{Cal}_{\rm E}^{225} - calG_all: - value: 1 - latex: \mathrm{Cal}_{\rm G}^{\rm All} - alpha_LAT_93: - value: 0 #deg - latex: \alpha^{93} - alpha_LAT_145: - value: 0 #deg - latex: \alpha^{145} - alpha_LAT_225: - value: 0 #deg - latex: \alpha^{225} - 
-theory: - camb: - stop_at_error: False - extra_args: - lens_potential_accuracy: 1 - soliket.TheoryForge_MFLike: - stop_at_error: True - soliket.Foreground: - stop_at_error: True - soliket.BandPass: - stop_at_error: True - -sampler: - evaluate: - -output: chains/test_mflike diff --git a/soliket/tests/test_multi.py b/soliket/tests/test_multi.py index a00d66cb..2a079481 100644 --- a/soliket/tests/test_multi.py +++ b/soliket/tests/test_multi.py @@ -1,18 +1,49 @@ import numpy as np from cobaya.tools import resolve_packages_path - -from soliket.tests.test_mflike import nuisance_params +from cobaya.model import get_model packages_path = resolve_packages_path() +nuisance_params = { + "a_tSZ": 3.3044404448917724, + "a_kSZ": 1.6646620740058649, + "a_p": 6.912474322461401, + "beta_p": 2.077474196171309, + "a_c": 4.88617700670901, + "beta_c": 2.2030316332596014, + "a_s": 3.099214100532393, + "T_d": 9.60, + "a_gtt": 0, + "a_gte": 0, + "a_gee": 0, + "a_psee": 0, + "a_pste": 0, + "xi": 0, + "bandint_shift_LAT_93": 0, + "bandint_shift_LAT_145": 0, + "bandint_shift_LAT_225": 0, + "cal_LAT_93": 1, + "cal_LAT_145": 1, + "cal_LAT_225": 1, + "calT_LAT_93": 1, + "calE_LAT_93": 1, + "calT_LAT_145": 1, + "calE_LAT_145": 1, + "calT_LAT_225": 1, + "calE_LAT_225": 1, + "calG_all": 1, + "alpha_LAT_93": 0, + "alpha_LAT_145": 0, + "alpha_LAT_225": 0, +} + def test_multi(test_cosmology_params): lensing_options = {"theory_lmax": 5000} - pre = "test_data_sacc_" mflike_options = { - "input_file": pre + "00000.fits", - "data_folder": "TestMFLike", + "input_file": "LAT_simu_sacc_00044.fits", + "cov_Bbl_file": "data_sacc_w_covar_and_Bbl.fits", "stop_at_error": True, } @@ -20,43 +51,36 @@ def test_multi(test_cosmology_params): fg_params = {"a_tSZ": {"prior": {"min": 3.0, "max": 3.6}}, "a_kSZ": {"prior": {"min": 1.4, "max": 1.8}}} - mflike_params = {**test_cosmology_params, **nuisance_params} - mflike_params.update(fg_params) + mflike_params = test_cosmology_params | nuisance_params | fg_params - lensing_params = {**test_cosmology_params} + lensing_params = test_cosmology_params info = { "likelihood": { "soliket.gaussian.MultiGaussianLikelihood": { - "components": ["soliket.mflike.TestMFLike", "soliket.LensingLikelihood"], + "components": ["mflike.TTTEEE", "soliket.LensingLikelihood"], "options": [mflike_options, lensing_options], "stop_at_error": True, } }, "theory": {"camb": camb_options, - "soliket.TheoryForge_MFLike": {'stop_at_error': True}, - "soliket.Foreground": {"stop_at_error": True}, - "soliket.BandPass": {"stop_at_error": True}}, - "params": {**mflike_params}, + "mflike.BandpowerForeground": {"stop_at_error": True}}, + "params": mflike_params, } info1 = { - "likelihood": {"soliket.mflike.TestMFLike": mflike_options}, + "likelihood": {"mflike.TTTEEE": mflike_options}, "theory": {"camb": camb_options, - "soliket.TheoryForge_MFLike": {'stop_at_error': True}, - "soliket.Foreground": {"stop_at_error": True}, - "soliket.BandPass": {"stop_at_error": True}}, - "params": {**mflike_params}, + "mflike.BandpowerForeground": {"stop_at_error": True}}, + "params": mflike_params, } info2 = { "likelihood": {"soliket.LensingLikelihood": lensing_options}, "theory": {"camb": camb_options}, - "params": {**lensing_params}, + "params": lensing_params, } - from cobaya.model import get_model - model = get_model(info) model1 = get_model(info1) model2 = get_model(info2) @@ -72,7 +96,7 @@ def test_multi(test_cosmology_params): logp_a = model.loglikes(fg_values_a, cached=False)[0].sum() logp_b = model.loglikes(fg_values_b, 
cached=False)[0].sum() d_logp = logp_b - logp_a - assert np.isclose(d_logp, 0.0052275, rtol=1e-5) + assert np.isclose(d_logp, -503.395, rtol=1e-4) model1_logp_a = model1.loglikes(fg_values_a, cached=False)[0].sum() model2_logp_a = model2.loglikes({}, cached=False)[0].sum() diff --git a/soliket/tests/test_multi.yaml b/soliket/tests/test_multi.yaml index a54a6b78..7534e87b 100644 --- a/soliket/tests/test_multi.yaml +++ b/soliket/tests/test_multi.yaml @@ -5,37 +5,11 @@ debug: True likelihood: soliket.gaussian.MultiGaussianLikelihood: components: - - soliket.mflike.TestMFLike + - mflike.TTTEEE - soliket.LensingLikelihood options: - - input_file: test_data_sacc_00000.fits - data_folder: "TestMFLike" - defaults: - # Which spectra? - polarizations: ['TT', 'TE', 'ET', 'EE'] - # Scale cuts (in ell) for each spectrum - scales: - TT: [2, 6002] - TE: [2, 6002] - ET: [2, 6002] - EE: [2, 6002] - symmetrize: False - data: - experiments: - - LAT_93 - - LAT_145 - - LAT_225 - - spectra: - - experiments: ['LAT_93', 'LAT_93'] - polarizations: ['TT','TE','EE'] - - experiments: ['LAT_93', 'LAT_145'] - - experiments: ['LAT_93', 'LAT_225'] - - experiments: ['LAT_145', 'LAT_145'] - polarizations: ['TT','TE','EE'] - - experiments: ['LAT_145', 'LAT_225'] - - experiments: ['LAT_225', 'LAT_225'] - polarizations: ['TT','TE','EE'] + - input_file: LAT_simu_sacc_00044.fits + cov_Bbl_file: data_sacc_w_covar_and_Bbl.fits - theory_lmax: 5000 stop_at_error: true stop_at_error: True @@ -177,7 +151,7 @@ params: latex: \xi T_d: prior: - min: 8.60 + min: 8.60 max: 10.60 proposal: 0.6 latex: T_d @@ -212,6 +186,12 @@ params: calG_all: value: 1 latex: \mathrm{Cal}_{\rm G}^{\rm All} + cal_LAT_93: + value: 1 + cal_LAT_145: + value: 1 + cal_LAT_225: + value: 1 alpha_LAT_93: value: 0 #deg latex: \alpha^{93} @@ -227,11 +207,7 @@ theory: stop_at_error: False extra_args: lens_potential_accuracy: 1 - soliket.TheoryForge_MFLike: - stop_at_error: True - soliket.Foreground: - stop_at_error: True - soliket.BandPass: + mflike.BandpowerForeground: stop_at_error: True sampler: diff --git a/soliket/tests/test_runs.py b/soliket/tests/test_runs.py index e3cd5c70..85674772 100644 --- a/soliket/tests/test_runs.py +++ b/soliket/tests/test_runs.py @@ -9,8 +9,7 @@ @pytest.mark.parametrize("lhood", - ["mflike", - "lensing", + ["lensing", "lensing_lite", "multi", # "galaxykappa", @@ -29,8 +28,7 @@ def test_evaluate(lhood): @pytest.mark.parametrize("lhood", - ["mflike", - "lensing", + ["lensing", "lensing_lite", "multi", # "galaxykappa", diff --git a/tox.ini b/tox.ini index 9eaf82cc..bd13e2cd 100644 --- a/tox.ini +++ b/tox.ini @@ -4,7 +4,7 @@ requires = setuptools_scm >= 8 pip >= 19.3.1 envlist = - py{38,39,310,311}-test{,-all}{,-latest,-oldest}{,-cov} + py{39,310,311}-test{,-all}{,-latest,-oldest}{,-cov} codestyle [testenv] @@ -39,11 +39,13 @@ description = cov: and test coverage extras = + tests all: all commands = conda info pip freeze + cobaya-install mflike.TTTEEE --no-set-global all: cobaya-install planck_2018_highl_plik.TTTEEE_lite_native --no-set-global !cov: pytest -vv --rootdir={toxinidir} --pyargs {toxinidir}/soliket/ {posargs} cov: pytest -vv --rootdir={toxinidir} --pyargs {toxinidir}/soliket/ --cov soliket --cov-report=xml --cov-config={toxinidir}/pyproject.toml {posargs}
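
A minimal sketch of the post-migration wiring, mirroring the updated ``soliket/tests/test_multi.py``: the external ``mflike`` package now supplies both the likelihood (``mflike.TTTEEE``) and a single foreground theory component (``mflike.BandpowerForeground``), in place of the removed ``soliket.mflike``, ``soliket.TheoryForge_MFLike``, ``soliket.Foreground`` and ``soliket.BandPass`` blocks. The input-file names below are the ones used in that test; the empty ``params`` block is a placeholder for the cosmology and nuisance parameters it defines, so this is illustrative rather than a drop-in script::

    from cobaya.model import get_model

    info = {
        "likelihood": {
            # replaces soliket.mflike.TestMFLike
            "mflike.TTTEEE": {
                "input_file": "LAT_simu_sacc_00044.fits",
                "cov_Bbl_file": "data_sacc_w_covar_and_Bbl.fits",
                "stop_at_error": True,
            },
        },
        "theory": {
            "camb": {"extra_args": {"lens_potential_accuracy": 1}},
            # single component replacing TheoryForge_MFLike + Foreground + BandPass
            "mflike.BandpowerForeground": {"stop_at_error": True},
        },
        # placeholder: fill with the cosmology and nuisance parameters
        # (test_cosmology_params | nuisance_params) listed in test_multi.py
        "params": {},
    }

    model = get_model(info)  # builds the model once "params" is populated

The required data are fetched by ``cobaya-install mflike.TTTEEE --no-set-global``, which the updated ``tox.ini`` now runs before the test suite; ``MultiGaussianLikelihood`` can then combine ``mflike.TTTEEE`` with ``soliket.LensingLikelihood`` exactly as in the revised test.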