Feature Req/Progress: Interpolation within FVCOM mesh triangles #7
Perhaps a dumb question, but why not rasterize onto a regular grid?
Not a dumb question. I/We keep getting dragged into somewhat unproductive discussions about what resolution to rasterize to and how the regridding should be performed. We have a project working at both nearshore and offshore scales, and there are tradeoffs: a higher resolution nearshore takes advantage of the denser node coverage there but poorly represents the sparser coverage offshore. That, and file size or computation speed if we make a 4 km resolution product for the whole shelf.

Then there was a second roundabout discussion on which method to use for interpolating between the points: IDW, kriging, what data to use to fit those, whether that introduces biases inconsistent with the FVCOM data itself, etc.

I asked the Chen lab what methods they recommend/use when regridding, and they responded by asking why I would even want to do that. They shared that they use linear interpolation methods, and that they would interpolate at GOM3 node locations for time periods covered by newer FVCOM versions to create a consistent timeseries: https://github.com/SiqiLiOcean/matFVCOM/blob/main/interp_2d_calc_weight.m

So I embarked on this journey to recreate what they use, and similar triangle-based methods, to avoid overthinking and needing to defend complex interpolation solutions. Interpolation from the nearest nodes seems most true to the model outputs (it doesn't prescribe a spatial covariance structure) and should scale well to many points. That said, I don't expect it to give meaningfully different results than rasterizing to a thoughtful resolution based on project needs.
Point of clarification: the primary utility of this or other interpolation methods would be to extract a value or values that do not fall directly on the mesh. This can be extended to recreate a regularly spaced grid, as an alternative to rasterizing, but primarily it would be for getting point values directly. This approach avoids the need to first fill each polygon or rasterize.
So, using
Exactly, I have some messy code for doing either linear interpolation or barycentric interpolation that I need to clean up. The lead-in to either is the same. Then, when applying the interpolation, you could ideally select the relevant file or timestep and variable(s) from the nc connection to get the interpolated values. These names are a mess, sorry:
Then for interpolation:
For barycentric I was using
My function for this is a needlessly confusing mess ATM
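For reference, the barycentric weights themselves are a small calculation. A minimal sketch in Python for illustration (the repo targets R, but the arithmetic carries over directly); the function name and array layout are assumptions, not the repo's API:

```python
import numpy as np

def barycentric_weights(p, a, b, c):
    """Barycentric coordinates of point p within triangle (a, b, c).

    Each argument is a length-2 array of x/y coordinates. The weights
    sum to 1, and are all non-negative when p lies inside the triangle;
    the interpolated value is then w1*v_a + w2*v_b + w3*v_c, where
    v_a, v_b, v_c are the node values.
    """
    v0, v1, v2 = b - a, c - a, p - a
    d00 = v0 @ v0
    d01 = v0 @ v1
    d11 = v1 @ v1
    d20 = v2 @ v0
    d21 = v2 @ v1
    denom = d00 * d11 - d01 * d01  # zero only for a degenerate triangle
    w2 = (d11 * d20 - d01 * d21) / denom
    w3 = (d00 * d21 - d01 * d20) / denom
    w1 = 1.0 - w2 - w3
    return np.array([w1, w2, w3])
```

A quick sanity check: the triangle's centroid gets equal weights of 1/3, and a point sitting exactly on a vertex gets weight 1 for that node and 0 for the other two.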
Oh, nice! Let's build on what you have and consider adding a new file in the R directory with perhaps the following functions.
The casual user would probably only call
Do you have a fork of the repo?
I'm trying to sort out the best way to code up handling of matching time steps, and I'm not sure how to make it general enough to handle however different people may have downloaded the FVCOM data. To use an example: the dataset I acquired for surface and bottom temperature from the Chen lab does not contain the
It is also organized into annual files, which is convenient for me, but that won't always be the case, as size constraints may force people to store/access monthly files. Here is the function I'm using for date-wise interpolation, just looping over yearly files:
The linear interpolation method is hard-coded in. The full workflow is:
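The hard-coded linear time interpolation step amounts to a weighted average of the two bracketing timesteps. A sketch in Python for illustration (the math is identical in R); the function name and signature are hypothetical:

```python
from datetime import datetime

def interp_in_time(target, t0, t1, v0, v1):
    """Linearly interpolate between two bracketing timesteps.

    target, t0, t1 : datetimes, with t0 <= target <= t1
    v0, v1         : the variable's values at t0 and t1 (scalars or arrays)
    """
    if not (t0 <= target <= t1):
        raise ValueError("target time is not bracketed by t0 and t1")
    # timedelta / timedelta division yields a plain float in [0, 1]
    frac = (target - t0) / (t1 - t0)
    return v0 + frac * (v1 - v0)
```

For example, a target time exactly midway between two daily timesteps returns the mean of the two values.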
So, for each row of
Almost.
I'm using the time index here:
p1, p2, & p3 are the three FVCOM nodes that surround the point I want to interpolate at. Their values within
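Once the three surrounding nodes (p1, p2, p3) and their weights are staged per point, applying them across many points is a single vectorized operation. A sketch in Python for illustration (NumPy fancy indexing; the equivalent in R would be matrix indexing and `rowSums`); the function name and array layout are assumptions:

```python
import numpy as np

def apply_weights(node_values, node_index, weights):
    """Interpolate at many points at once from precomputed weights.

    node_values : (n_nodes,) values at FVCOM nodes for one variable/timestep
    node_index  : (n_points, 3) integer indices of the three enclosing
                  nodes (p1, p2, p3) for each point
    weights     : (n_points, 3) interpolation weights, each row summing to 1
    """
    # Gather the three node values per point, weight them, and sum per row
    return (node_values[node_index] * weights).sum(axis=1)
```

This keeps the expensive step (finding the enclosing triangle and computing weights) separate from the cheap, repeatable step (applying the weights at each timestep).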
So, I think your approach looks right.
The question I'm asking myself is whether to keep these staging steps as they are, to work through many points efficiently, or to package all the steps into some "interpolate_at_date_and_place" function that takes lat/lon/time and does the rest. Potentially very inefficient for many points, but more hands-off.
Well, both!
The broken bit is really the time-index part. You kind of need to know ahead of time whether a date is in the NetCDF file to even bother.
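One way to handle this is to read just the file's time coordinate and check whether the target date is bracketed before doing anything else. A sketch in Python for illustration, on an already-decoded, sorted list of datetimes (the function name is hypothetical; in practice the times would come from the nc connection):

```python
from datetime import datetime
from bisect import bisect_right

def bracket_time(times, target):
    """Return the pair of indices (i, i + 1) of the timesteps that
    bracket target, or None if target falls outside the file's range.

    times : sorted list of datetimes decoded from the file's time axis
    """
    if not times or target < times[0] or target > times[-1]:
        return None  # this file can't serve the request; skip it
    i = bisect_right(times, target) - 1
    i = min(i, len(times) - 2)  # if target == the last timestep, use the final pair
    return (i, i + 1)
```

A None result means the date isn't covered by this file, so the caller can move on to the next annual/monthly file without opening any variables.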
Yeah - that is a problem |
Would it be helpful to set up a time when we could sit down and work our way through this use case? I could come down and we could hack it out if that would be helpful (it would be for me).
Yes, let's do that. I could do something this Friday at the earliest or sometime next week before Thursday. |
How about Tuesday morning at 9? |
I gave this a thumbs up but it's actually not a great window of time now that it's here. I could do a quick check in if you wanted to start. |
Yeah - let's take this to email |
Hi all!
I met with some of the Chen lab postdocs last week briefly and am now working on implementing some workflows in R based on what we chatted about. They're primarily a MATLAB group, so I have some MATLAB code that I'd like to implement in R/Python.
You may have already explored this functionality, but I am looking to add the ability to estimate/interpolate point values within triangles based on distance weights to the correct nodes. Ideally in a vectorized way.
They use this approach to reproject data from the newer meshes to the older GOM3 meshes, when looking to create a cohesive full timeline. It also could be applied for point extractions, or for re-projecting to a regular grid given equally spaced coordinates.
So far I am leveraging get_mesh_geometry() and st_join() to handle the triangle-to-point matching, and I'm working on calculating barycentric coordinates for the relevant nodes.
In my head, the triangle matching and weight calculation could/should probably be separate from pulling the proper nc variables at the correct time steps, since the mesh coordinates don't change over time. Then maybe a second function does the date/time node lookups and applies the weighted interpolation.
You may have already done this, so I wanted to flag it for potential collaboration/team problem-solving.
Cheers!