
keep ofdm signal from being much larger than num_iq_samples #252

Open
wants to merge 1 commit into main

Conversation

dustinlagoy

Currently, during the generation of IQ samples for OFDM signals, the generated signal can be much longer than num_iq_samples. This is handled by truncating the IQ samples before returning them. However, depending on the bandwidth of the OFDM signal, this can lead to unnecessary memory/CPU use during generation. Given the current hardcoded minimum bandwidth of 0.2, the OFDM signal generator can use up to 5x more memory than required.

I've been experimenting with generating signals with a much larger number of IQ samples and smaller relative bandwidths, and in that case the excess memory/CPU use is much larger.

Here I have fixed the issue similarly to how it is avoided in other places (like ChirpSSDataset._generate_samples() or FSKBasebandModulator()): by limiting the number of symbols generated to a value that yields a final signal only slightly longer than num_iq_samples.
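To illustrate the idea, here is a minimal sketch of the symbol-count cap. The function name and parameters (`limit_ofdm_symbols`, `fft_size`, `cp_len`) are hypothetical and not the actual TorchSig API; the sketch assumes each OFDM symbol contributes `fft_size + cp_len` output samples:

```python
import math

def limit_ofdm_symbols(num_iq_samples: int, fft_size: int, cp_len: int) -> int:
    """Smallest number of OFDM symbols whose total length covers
    num_iq_samples, so the generated signal exceeds the target by
    at most one symbol before truncation.

    Hypothetical helper illustrating the approach in this PR, not
    the actual TorchSig implementation.
    """
    samples_per_symbol = fft_size + cp_len  # IFFT output plus cyclic prefix
    return math.ceil(num_iq_samples / samples_per_symbol)

# e.g. for 4096 target samples with a 64-point FFT and 16-sample CP,
# each symbol is 80 samples, so 52 symbols (4160 samples) suffice,
# leaving only 64 excess samples to truncate.
```

With a cap like this, the pre-truncation excess is bounded by one symbol duration regardless of bandwidth, instead of growing with the ratio of the generated length to num_iq_samples.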
