keep ofdm signal from being much larger than num_iq_samples #252
Currently, when generating IQ samples for OFDM signals, the generated signal can be much longer than `num_iq_samples`. This is handled by truncating the IQ samples before returning them, but depending on the bandwidth of the OFDM signal it can cause unnecessary memory/CPU use during generation: since the pre-truncation length scales roughly with the inverse of the bandwidth, the current hardcoded minimum bandwidth of 0.2 means the OFDM signal generator can use up to 5x more memory than required. I've been experimenting with generating signals with a much larger number of IQ samples and smaller relative bandwidths, and in that case the excess memory/CPU use is much larger.
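For a rough sense of scale, here is a minimal sketch of the arithmetic behind the 5x figure (the numbers are illustrative, not taken from the generator itself):

```python
# Illustrative arithmetic only; not the actual torchsig generator code.
num_iq_samples = 4096      # requested output length (hypothetical value)
bandwidth = 0.2            # current hardcoded minimum relative bandwidth

# If the pre-truncation signal length scales with 1/bandwidth, the generator
# produces roughly num_iq_samples / bandwidth samples before truncating:
pre_truncation_len = int(num_iq_samples / bandwidth)
print(pre_truncation_len)  # 20480 -> ~5x the 4096 samples actually returned
```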
Here I have fixed the issue similarly to how it is avoided in other places (like `ChirpSSDataset._generate_samples()` or `FSKBasebandModulator()`): by limiting the number of symbols generated to a value that yields a final signal length only slightly larger than `num_iq_samples`.
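A minimal sketch of the approach, assuming a hypothetical `samples_per_symbol` derived from the OFDM parameters (the function and variable names below are illustrative, not the actual torchsig code):

```python
import numpy as np

# Illustrative sketch of the symbol-count cap; not the actual torchsig code.
def num_symbols_needed(num_iq_samples: int, samples_per_symbol: int) -> int:
    """Smallest whole number of OFDM symbols whose combined length covers
    num_iq_samples, so the generated signal is only slightly longer than needed."""
    return int(np.ceil(num_iq_samples / samples_per_symbol))

def generate_ofdm_iq(num_iq_samples: int, samples_per_symbol: int) -> np.ndarray:
    n_symbols = num_symbols_needed(num_iq_samples, samples_per_symbol)
    # Stand-in for the real per-symbol OFDM modulation: generate only the
    # symbols actually required, then truncate to the requested length.
    symbols = (np.random.randn(n_symbols, samples_per_symbol)
               + 1j * np.random.randn(n_symbols, samples_per_symbol))
    return symbols.reshape(-1)[:num_iq_samples]
```

With a cap like this, the final truncation discards at most one symbol's worth of samples rather than a bandwidth-dependent multiple of `num_iq_samples`.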