fix set_up of norm in TOF ProjData gradient
Due to my misunderstanding of the meaning of `add_sensitivity` (and some
wrong comments), we did not set up the normalisation object when
computing the gradient.

Fixes #1431
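
For context, a minimal, self-contained C++ sketch of the corrected logic; `GradientHelper`, `norm_is_set_up` and `compute_gradient` are hypothetical names used for illustration only and are not STIR's API. The point is that the normalisation ("eff") object is needed precisely when `add_sensitivity` is false, because that branch back-projects y/ybar - eff*1:

// Hypothetical, simplified sketch of the corrected set-up logic (not STIR code).
#include <iostream>
#include <stdexcept>

class GradientHelper
{
public:
  void ensure_norm_is_set_up()
  {
    // In the real code this would construct/check the normalisation ("eff") object.
    norm_is_set_up = true;
  }

  // add_sensitivity == true  : back-project y/ybar only (no norm needed here).
  // add_sensitivity == false : back-project y/ybar - eff*1, so the norm must be set up.
  void compute_gradient(bool add_sensitivity)
  {
    if (!add_sensitivity) // the fix: previously this tested `add_sensitivity`
      this->ensure_norm_is_set_up();

    if (!add_sensitivity && !norm_is_set_up)
      throw std::runtime_error("normalisation not set up before gradient computation");

    std::cout << (add_sensitivity ? "backproj[ y/ybar ]\n" : "backproj[ y/ybar - eff*1 ]\n");
  }

private:
  bool norm_is_set_up = false;
};

int main()
{
  GradientHelper helper;
  helper.compute_gradient(false); // with the old condition, this path lacked the norm set-up
  helper.compute_gradient(true);
  return 0;
}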
KrisThielemans committed May 16, 2024
1 parent 7460c36 commit 74cfcdf
Showing 1 changed file with 3 additions and 3 deletions.
@@ -706,7 +706,7 @@ PoissonLogLikelihoodWithLinearModelForMeanAndProjData<TargetT>::actual_compute_s
   if (!this->distributable_computation_already_setup)
     error("PoissonLogLikelihoodWithLinearModelForMeanAndProjData internal error: setup_distributable_computation not called "
           "(gradient calculation)");
-  if (add_sensitivity)
+  if (!add_sensitivity)
     this->ensure_norm_is_set_up();
   distributable_compute_gradient(this->projector_pair_ptr->get_forward_projector_sptr(),
                                  this->projector_pair_ptr->get_back_projector_sptr(),
@@ -1225,7 +1225,7 @@ distributable_compute_gradient(const shared_ptr<ForwardProjectorByBin>& forward_
 {
   if (add_sensitivity)
     {
-      // Within the RPC process, subtract ones before to back projection ( backproj[ y/ybar - 1] )
+      // Within the RPC process, only do div/truncate ( backproj[ y/ybar ] )
       distributable_computation(forward_projector_sptr,
                                 back_projector_sptr,
                                 symmetries_sptr,
@@ -1250,7 +1250,7 @@ distributable_compute_gradient(const shared_ptr<ForwardProjectorByBin>& forward_
   }
   else if (!add_sensitivity)
     {
-      // Within the RPC process, only do div/truncate ( backproj[ y/ybar ] )
+      // Within the RPC process, subtract ones before to back projection ( backproj[ y/ybar - eff*1] )
      distributable_computation(forward_projector_sptr,
                                back_projector_sptr,
                                symmetries_sptr,
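
For reference, the swapped comments above correspond, schematically, to the following integrands. This is a hedged sketch of standard Poisson log-likelihood algebra, not copied from the STIR source; p_ij denotes projector elements and eff_i the normalisation efficiencies.

% add_sensitivity == false: back-project the full gradient integrand (needs eff)
%   \frac{\partial L}{\partial \lambda_j} \;\sim\; \sum_i p_{ij}\left(\frac{y_i}{\bar y_i} - \mathrm{eff}_i\right)
%   \quad\text{i.e. } backproj[\, y/ybar - eff*1 \,]
%
% add_sensitivity == true: back-project only the data-dependent part,
%   \sum_i p_{ij}\,\frac{y_i}{\bar y_i}
%   \quad\text{i.e. } backproj[\, y/ybar \,],
%   with the sensitivity term accounted for separately.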
