FIX Gaps in frames are not ignored anymore #293

Merged
merged 3 commits into soft-matter:master from caspervdw:linking-gaps
Apr 24, 2016

Conversation

caspervdw
Member

Fixes #292, but not yet for link_df_iter. That one is an exception because I only changed the levels generator, and in link_df_iter the generator is zipped together with the original iterable, which is unchanged.

Do you agree on the approach?

Another approach would be to have link_iter check the frame numbers itself.

@nkeim
Contributor

nkeim commented Oct 13, 2015

Is it possible to change the behavior of link_df() only? I think there was more support for that approach. You could replace the groupby in link_df() with a custom generator.
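For illustration, a minimal sketch of what such a custom generator might look like (hypothetical code, assuming a pandas DataFrame with a 'frame' column; this is not trackpy's actual implementation):

```python
def iter_levels(features):
    """Like iterating over features.groupby('frame'), but also yield an
    empty level for every frame number that has no features, instead of
    silently skipping it the way groupby does."""
    grouped = dict(iter(features.groupby('frame')))
    first, last = int(features['frame'].min()), int(features['frame'].max())
    for frame_no in range(first, last + 1):
        # .iloc[0:0] gives an empty DataFrame with the same columns
        yield frame_no, grouped.get(frame_no, features.iloc[0:0])
```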

link_df_iter is unchanged compared to master
@caspervdw
Member Author

On second thought, I agree: the generators link_iter and link_df_iter should just take the frame order with which they are supplied.

So link_df_iter is unchanged. link_iter and the hashtables can now deal with empty frames, and link_df has a custom generator instead of groupby. There already was a test with 1 blank frame; I changed the expected results.

@nkeim
Contributor

nkeim commented Oct 14, 2015

The code changes look good. Would it be OK if I added a warning when a skipped frame is encountered? Or just a debug message? My worry is that if the user ever has floating-point frame numbers, and they are integer + epsilon, the resulting problem will be a nightmare to debug.

@caspervdw caspervdw closed this Oct 15, 2015
@caspervdw caspervdw reopened this Oct 15, 2015
@tacaswell
Member

@nkeim What is the use-case for floating-point frame numbers?

@nkeim
Contributor

nkeim commented Oct 15, 2015

@tacaswell Besides making mischief? The main use case is if you've read the data directly from a format that makes it hard to mix types within an array (like vanilla HDF5).

@caspervdw
Member Author

This can be fixed by adding 0.5 to the expected frame number.
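A hedged sketch of that tolerance check (illustrative names only, not the actual trackpy code):

```python
def fill_gaps(levels):
    """Yield (frame_no, features) pairs, inserting an empty level whenever
    the next frame number exceeds the expected one by more than 0.5, so a
    float frame number like 5137 + 1e-7 still counts as frame 5137 rather
    than triggering a spurious gap."""
    expected = None
    for frame_no, features in levels:
        while expected is not None and frame_no > expected + 0.5:
            yield expected, []  # empty level for the skipped frame
            expected += 1
        yield frame_no, features
        expected = frame_no + 1
```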

Speaking of column types: the particle label type is now float (because of the np.nan initialization). I would vote for an integer type here.

@nkeim
Contributor

nkeim commented Oct 15, 2015

@caspervdw Fabulous! Much better than a warning.

I agree that integer particle IDs are desirable. Maybe we can do int64 and initialize with -1. Will create an issue.
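A tiny illustration of why the initialization value determines the dtype (plain numpy/pandas, not trackpy internals):

```python
import numpy as np
import pandas as pd

n = 4
nan_ids = pd.Series(np.full(n, np.nan))              # dtype: float64
int_ids = pd.Series(np.full(n, -1, dtype=np.int64))  # dtype: int64
print(nan_ids.dtype, int_ids.dtype)
```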

@caspervdw
Member Author

Fixed. Tests pass locally.

Travis keeps failing, btw, because of pims. We should make the PIL/Pillow and scipy dependencies completely optional (see soft-matter/pims#186).

@caspervdw caspervdw closed this Oct 24, 2015
@caspervdw caspervdw reopened this Oct 24, 2015
@caspervdw
Member Author

As of the latest pims fix, the tests pass again. Ready for merge.

@@ -1009,8 +1049,9 @@ def link(self, levels):
             p.forward_cands = []

         # Sort out what can go to what.
-        assign_candidates(cur_level, prev_hash, self.search_range,
-                          self.neighbor_strategy)
+        if len(cur_level) > 0 and len(prev_hash) > 0:
Member
Does assign_candidates not deal with empty levels gracefully on its own?

@nkeim
Contributor

nkeim commented Oct 26, 2015

Great! Merge for v0.3, or v0.3.1? This seems like a non-critical bug for the vast majority of existing users, and the fix could break existing code. So I think that we should either issue v0.3.0rc2, or hold until v0.3.1. I vote for the latter.

@danielballan
Member

Same.


@caspervdw
Member Author

Agreed

@caspervdw caspervdw added this to the 0.3.1 milestone Oct 27, 2015
@caspervdw
Member Author

@nkeim, shall we merge this first and then have a look at #333?

@ceciirdz31

ceciirdz31 commented Apr 18, 2016

Hi!
I had this same problem with trackpy and I was advised to update my trackpy version by doing the following:
pip install --upgrade https://github.com/caspervdw/trackpy/archive/linking-gaps.zip

I did this a week ago, but I still have the problem. The number of gaps seems lower, but I still have some.

Looking forward to any comments :)

@caspervdw
Member Author

That must be a separate issue then. Could you post the portion of the dataframe that has this unexpected gap?

@ceciirdz31

ceciirdz31 commented Apr 21, 2016

Sure, here it is:

frame x y mass size ecc signal raw_mass ep frame
5133 108.1 11.791 4823.3 5.9078 0.073306 103.55 7.2169e+05 NaN 5133
5134 109.26 12.78 5534.1 5.9348 0.23518 106.02 7.2317e+05 NaN 5134
5135 108.43 15.184 3992.8 5.6977 0.1098 95.915 7.2164e+05 NaN 5135
5137 109.01 12.801 5199.4 5.7499 0.14993 62.39 7.2293e+05 NaN 5137
5138 109.05 11.783 5453.3 6.115 0.20996 68.547 7.219e+05 NaN 5138
5139 109.25 10.609 6789.2 4.9075 0.085875 150.78 7.2574e+05 NaN 5139
5140 109.33 11.469 7103.8 5.1782 0.1432 155.49 7.2522e+05 NaN 5140
5141 110.79 10.415 4743 5.325 0.1873 103.16 7.2194e+05 NaN 5141
5142 107.91 11.782 6380.9 5.2972 0.15076 133.21 7.2445e+05 NaN 5142
5143 109.1 10.756 5158 5.3906 0.20743 120.9 7.2192e+05 NaN 5143
5144 110 10.706 3905.9 5.912 0.28583 78.549 7.2061e+05 NaN 5144
5145 109.73 10.977 4890.5 5.3721 0.10857 91.334 7.2222e+05 NaN 5145
5146 110.54 10.791 3279.5 5.1958 0.21242 59.211 7.1944e+05 NaN 5146
5147 109.5 10.811 5808.7 5.509 0.26575 170.35 7.2322e+05 NaN 5147
5148 109.6 9.1864 4944.5 4.5681 0.14422 92.616 7.2155e+05 NaN 5148
5150 111.05 11.025 5044.5 5.1451 0.11271 97.429 7.2151e+05 NaN 5150
5151 111.65 11.599 4603.5 5.7472 0.25981 119.95 7.2107e+05 NaN 5151

You can find two gaps: one between frames 5135 and 5137, and a second between frames 5148 and 5150.

I'm also adding an image so you can see the information in a more orderly way:

[image: dataframe]

@caspervdw
Member Author

So is the issue that these features are missing? Or that the features are linked into one trajectory while they actually should not be (with memory=0)? I don't see a particle column.

@ceciirdz31

That's right, the issue was that the features were linked into one trajectory when they should not be, because memory=0.
BUT I was just running some more tests and it seems the issue is completely gone. There are no more gaps at all, so I guess we don't have a problem anymore.
Thanks for your help!

@caspervdw
Member Author

That's good. @nkeim, I think this is ready to merge. Also, we should consider doing a minor bugfix release soon.

@nkeim
Contributor

nkeim commented Apr 23, 2016

Sorry, I don't know how I missed that this was ready. I should be able to look at it tomorrow. Is there anything else for 0.3.1? I see that #286 is also tagged.

@nkeim
Contributor

nkeim commented Apr 24, 2016

@caspervdw Apologies again for the long delay. I think @tacaswell has a good point. The BTree hash has no problem with empty levels; it is just an idiosyncrasy of KDTree, so I think it is better to have the workaround at the lowest layer of abstraction. I have created caspervdw#2 in case you would prefer to do it that way. But I could be very easily convinced to merge as-is 😄.
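For context, a hedged sketch of the kind of guard being discussed (function and variable names here are illustrative, not trackpy's):

```python
import numpy as np
from scipy.spatial import cKDTree

def neighbor_candidates(cur_points, prev_points, search_range):
    # Skip the KD-tree entirely when either level is empty; building or
    # querying a tree with zero points is where the KDTree idiosyncrasy
    # bites, whereas a BTree-based hash handles empty levels fine.
    if len(cur_points) == 0 or len(prev_points) == 0:
        return [[] for _ in range(len(cur_points))]
    tree = cKDTree(np.asarray(prev_points, dtype=float))
    return tree.query_ball_point(np.asarray(cur_points, dtype=float),
                                 search_range)
```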

@nkeim
Contributor

nkeim commented Apr 24, 2016

Never mind — my PR does not have all the recent Travis fixes. I will instead merge and create a separate PR against master.

@nkeim nkeim merged commit f3660c5 into soft-matter:master Apr 24, 2016
@ceciirdz31

Hi Casper,

I'm the one that posted about the problem with the gaps in trackpy and then posted that the problem was suddenly solved. I'm writing you because I was using trackpy today and the problem with the gaps is back. I'm attaching one document with one of the tracks that has many gaps.


frame x y mass size ecc signal raw_mass ep frame particle
2872 184,7847837 133,1168551 14378,45113 4,171569113 0,211527342 264,2620271 92361 2872 233
2873 186,4303627 132,5569072 14284,70762 4,038235949 0,200629704 260,5207012 90936 0,076848305 2873 233
2874 184,3694516 133,3091605 14551,33637 4,419118171 0,152017604 284,8493624 92510 0,17479598 2874 233
2880 184,7788753 133,0131805 15299,00161 4,261353382 0,137085497 260,8988863 92573 1,773830496 2880 233
2883 183,8859831 132,7837758 14315,64531 4,391898109 0,196542611 191,6012319 91919 2883 233
2884 184,2114865 133,0219726 15320,274 4,08922309 0,159250998 293,4050421 93593 2884 233
2890 184,8591339 132,1235689 14331,67546 3,945041181 0,103852359 285,6130858 91287 0,277195051 2890 233
2891 184,5369029 132,2134412 14182,99896 4,001453509 0,145349518 298,3814034 91118 0,186989607 2891 233
2893 183,0272942 131,9090987 14331,02364 4,261936171 0,158440061 230,6356452 91018 0,110085609 2893 233
2894 184,3494628 132,3294808 14326,66331 4,16600784 0,099572582 215,6911793 92086 0,014372659 2894 233
2897 184,0183795 131,6162522 14021,37037 4,282611075 0,218791014 239,9783411 91801 2897 233
2898 184,2419476 131,5728658 14392,23191 4,160576312 0,215108759 278,211109 92111 2898 233
2900 185,2690758 131,2204939 14911,78911 4,003887006 0,158353817 265,8784174 92700 0,101177157 2900 233
2907 185,4742495 130,9200299 14913,47389 4,155195547 0,240204952 271,860644 91900 2907 233
2908 184,2515303 130,9100221 14425,86519 3,979835876 0,093288963 260,3296435 91478 0,083653415 2908 233
2909 185,0822397 130,9203982 15536,96143 4,298762697 0,133731547 233,3740087 93505 0,124259042 2909 233
2913 184,0934428 131,4020289 14345,61556 4,146067083 0,04722448 250,2772382 91858 0 2913 233
2914 184,8147827 130,9242899 15684,8199 4,371767603 0,143043556 241,326676 94623 0,145148138 2914 233
2915 184,5380486 131,4644554 14333,96638 4,498320467 0,102527458 215,6793903 92355 0,058075135 2915 233
2917 185,4253813 130,5081864 14630,85965 4,297797763 0,15393567 241,672873 92892 2917 233
2920 184,6706892 130,6325072 15780,12762 4,19218452 0,1210735 245,2854257 92893 2920 233
2927 184,8440632 129,9935733 14410,50481 4,179542742 0,111471448 253,7089469 91171 2927 233
2929 183,8114329 130,1997901 14601,86842 4,396925771 0,068623823 227,5228636 92626 2929 233
2930 184,7467785 130,1371209 14347,96228 4,49773661 0,146862022 251,635308 92910 0 2930 233
2931 184,1610139 130,539168 14819,84854 4,180867236 0,134005476 238,7268803 92051 0,027446209 2931 233
2932 183,7193737 130,7085534 14484,63527 4,24336626 0,224628175 199,6427091 91673 0,050054118 2932 233
2934 182,5562737 131,366018 14298,77096 4,253714737 0,147708296 227,8656046 92033 0,002444391 2934 233
2935 183,769613 131,092856 15017,36703 4,40312383 0,136030395 249,0216384 92119 0,063392078 2935 233
2936 184,6460988 131,1707644 14587,59352 4,390260889 0,124508634 205,7975988 91514 0,114411687 2936 233
2937 183,8555126 131,4257778 15273,71306 4,265040001 0,048557511 243,0260429 93182 2937 233
2938 183,588542 131,4066166 14220,62543 4,192292574 0,227546102 252,9960612 91251 0,183069345 2938 233
2939 183,9230393 131,4954451 14468,17257 4,137431857 0,217356955 261,7385427 90831 0,200239372 2939 233
2940 184,0156891 131,9195469 16838,33139 4,039337981 0,100441365 328,2312626 93991 0,028981794 2940 233
2943 183,7115485 131,724666 16992,05322 4,188362295 0,155845287 299,010875 94892 2943 233
2944 183,1427877 132,5236049 14940,06879 3,960397913 0,223852764 295,5392823 91903 2944 233
2945 183,4752603 132,0595254 16314,3815 3,901883124 0,191017595 340,2222314 93690 0 2945 233
2946 184,0092861 132,2485451 15572,56076 3,893078691 0,174605736 405,0127032 93031 0,038558774 2946 233
2947 184,2433182 132,1833647 16604,2345 3,914003272 0,071497915 372,9069475 93980 2947 233
2948 183,0507962 132,1926834 19549,53987 3,886663256 0,108826965 436,0724437 97140 0 2948 233
2949 183,7539287 132,4114238 16060,47461 3,874601668 0,087415046 421,5500458 93139 2949 233
2950 183,8297274 131,9583017 15906,66111 3,803055967 0,174418479 328,9001281 92488 0,220067663 2950 233
2951 183,687317 132,0070126 15890,63714 3,869333059 0,061241973 312,7371 93488 2951 233
2952 183,7413465 132,171594 14168,28809 3,851986975 0,194236493 337,3383543 90976 2952 233
2953 183,5785273 131,9579405 15254,40141 3,916912661 0,172537489 455,7056509 92929 0,069424498 2953 233
2954 183,4484213 131,5159313 15769,75163 3,935729167 0,028991402 348,3345608 93108 0,122459962 2954 233
2955 183,9996172 132,045515 15749,03681 4,034702362 0,161757981 351,161727 93352 0 2955 233
2956 183,5821166 131,5079123 15350,85214 3,944739964 0,133927911 344,1098064 92513 2956 233
2957 183,7752878 131,7794369 15909,47726 4,047089415 0,178531603 333,3876683 93420 2957 233
2958 183,3100299 131,9527195 15006,95924 3,734904247 0,185304711 412,3694417 91508 0,144891515 2958 233
2959 183,8418665 132,0453891 15870,38384 4,032200692 0,155937819 384,5055534 93643 2959 233
2960 184,5114844 131,3973235 15696,92656 4,062370722 0,123241784 320,4629985 93111 0,158571508 2960 233
2961 183,662083 131,5347575 16880,63331 3,931026968 0,153599207 302,3844509 94253 0 2961 233
2962 183,8178341 132,8543405 16501,82451 3,834942963 0,178745305 384,3439672 94243 0,05439786 2962 233
2963 184,5718184 132,069656 15749,32079 3,935464065 0,179433191 287,9557883 92257 2963 233
2964 184,4394434 131,5139489 15406,95232 4,187392299 0,107198131 255,2507485 92247 0,334285932 2964 233
2965 184,2978229 131,8380374 17181,09216 3,883100553 0,153816372 376,1216359 94657 2965 233
2967 182,9105361 131,699364 16521,90876 4,11836115 0,202956204 264,4629192 94340 2967 233
2968 182,9366338 132,1267674 15578,65985 3,969346748 0,035456644 248,0183071 93190 0,063392152 2968 233
2969 183,7248163 132,2010681 15872,61539 4,05776081 0,18005257 376,9657931 93993 2969 233
2970 183,6766333 131,9646621 15746,63154 3,992264845 0,192053032 361,8309077 93719 2970 233
2972 183,8993791 131,7348078 15462,11422 4,188022899 0,0958488 231,4719483 92332 0,108674919 2972 233
2977 184,4824823 132,4709035 16429,00478 4,292121397 0,156361082 283,8415406 95274 0,001334759 2977 233
2978 185,9019055 131,892716 15978,69125 4,153030032 0,067907611 257,0827338 93635 0,153006642 2978 233
2981 185,7574888 132,8296734 14278,54167 4,428286352 0,168035314 192,5516129 91921 0,055395474 2981 233
2985 184,6654031 132,1247052 14296,16979 4,481700528 0,113717692 213,9124498 91552 2985 233

@caspervdw
Member Author

That's strange; the tests confirm that the code works correctly. Is it possible that you started using trackpy v0.3 again and not the dev version (check trackpy.__version__)? Or could the feature dataframe be badly sorted? (Try your_dataframe.sort(['particle', 'frame']).)

If this is all OK, could you do some debugging to narrow down the issue? Or give me a recipe for how to reproduce the problem?

Please open a new issue if the problem persists, as this PR is closed.
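For anyone running the same checks, a minimal snippet (the .sort call above matches the pandas API of that era; current pandas spells it sort_values):

```python
import pandas as pd
import trackpy

print(trackpy.__version__)  # should report the dev version, not '0.3.0'

# Toy stand-in for a linked feature DataFrame, sorted as suggested:
df = pd.DataFrame({'particle': [1, 0, 0], 'frame': [2, 2, 1]})
df = df.sort_values(['particle', 'frame'])
print(df)
```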

@ceciirdz31

Hi Casper,
You are right: I somehow started to use version 0.3.0 again. I did the upgrade once more and it is now working properly. Thank you very much!
Have a nice week!

Cecilia Rodriguez

