
Eval_coco error: Problem in running contour metrics #29

Open
whumqy opened this issue Dec 29, 2021 · 1 comment
whumqy commented Dec 29, 2021

Error Description

When running eval_coco mode on the CrowdAI mapping dataset, I encounter the same error on both pre-trained UNet-ResNet101 models, whether the frame field is computed or not. A fragment of the log is as follows:

INFO: Running contour metrics
TopologyException: unable to assign free hole to a shell at 236 299
Contour metrics:   0%|                                                                                             | 16/60317 [00:02<2:28:21,  6.77it/s]
Traceback (most recent call last):
  File "main.py", line 400, in <module>
    main()
  File "main.py", line 396, in main
    launch_eval_coco(args)
  File "main.py", line 381, in launch_eval_coco
    eval_coco(config)
  File "/file/Polygonization-by-Frame-Field-Learning/eval_coco.py", line 71, in eval_coco
    eval_one_partial(annotation_filename)
  File "/file/Polygonization-by-Frame-Field-Learning/eval_coco.py", line 139, in eval_one
    max_angle_diffs = contour_eval.evaluate(pool=pool)
  File "/file/Polygonization-by-Frame-Field-Learning/eval_coco.py", line 210, in evaluate
    measures_list.append(compute_contour_metrics(args))
  File "/file/Polygonization-by-Frame-Field-Learning/eval_coco.py", line 156, in compute_contour_metrics
    fixed_gt_polygons = polygon_utils.fix_polygons(gt_polygons, buffer=0.0001)  # Buffer adds vertices but is needed to repair some geometries
  File "/file/Polygonization-by-Frame-Field-Learning/lydorn_utils/lydorn_utils/polygon_utils.py", line 1649, in fix_polygons
    polygons_geom = shapely.ops.unary_union(polygons)  # Fix overlapping polygons
  File "/opt/conda/lib/python3.7/site-packages/shapely/ops.py", line 161, in unary_union
    return geom_factory(lgeos.methods['unary_union'](collection))
  File "/opt/conda/lib/python3.7/site-packages/shapely/geometry/base.py", line 73, in geom_factory
    raise ValueError("No Shapely geometry can be created from null value")
ValueError: No Shapely geometry can be created from null value

Locating the bug

The error occurs while computing the contour metrics, after the COCO stats have already been computed correctly. Processing is interrupted on the 16th image (counting from 0), while executing shapely.ops.unary_union(polygons) inside polygon_utils.fix_polygons(gt_polygons, buffer=0.0001).

Digging further, the code fails on the 4th polygon (counting from 0) of the ground-truth annotation of the 16th image. Visualizing that polygon shows it has a topology error, i.e. a self-intersection (circled in blue in the image below). For this polygon, polygon.is_valid returns False, while it returns True for all the others.

[Image 16_5: the invalid ground-truth polygon, with the self-intersection circled in blue]
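As a side note, such invalid geometries can be diagnosed directly with Shapely. A minimal sketch; the "bowtie" coordinates below are made up for illustration, not taken from the dataset:

```python
from shapely.geometry import Polygon
from shapely.validation import explain_validity

# Hypothetical "bowtie" ring whose edges cross, like the GT polygon above
bowtie = Polygon([(0, 0), (2, 2), (2, 0), (0, 2)])

print(bowtie.is_valid)           # False
print(explain_validity(bowtie))  # reports the self-intersection and its location
```

Running explain_validity over all GT polygons of an image is a quick way to find which annotation triggers the crash.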

Attempt to fix the bug

The change is made in the function fix_polygons.

def fix_polygons(polygons, buffer=0.0):

    #### added by myself ####
    for i in range(len(polygons)):
        polygons[i] = polygons[i].buffer(0)  # Repair invalid (e.g. self-intersecting) polygons
    #### adding done ####

    polygons_geom = shapely.ops.unary_union(polygons)  # Fix overlapping polygons
    polygons_geom = polygons_geom.buffer(buffer)  # Fix self-intersecting polygons and other things
    ...

The idea is to repair each self-intersecting polygon with buffer(0) before the union, so that unary_union then fixes overlapping polygons and the final buffer(buffer) handles the remaining cases, as in the original code.

This change enables the code to continue metrics computation.
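For what it's worth, buffer(0) is a common Shapely trick for repairing self-intersecting polygons. A minimal sketch with the same made-up bowtie polygon as above:

```python
from shapely.geometry import Polygon
from shapely.ops import unary_union

# Hypothetical self-intersecting "bowtie" polygon
bowtie = Polygon([(0, 0), (2, 2), (2, 0), (0, 2)])
assert not bowtie.is_valid

fixed = bowtie.buffer(0)  # re-nodes the ring into a valid geometry
print(fixed.is_valid)     # True

# With valid inputs, unary_union no longer fails
merged = unary_union([fixed])
print(merged.is_valid)    # True
```

Note that buffer(0) can change the geometry (it may drop one lobe of a bowtie or alter the area), which is why question 2 below matters: the repaired GT polygons are not guaranteed to match the originals exactly.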

My Issue

  1. Have you run into the same problem while evaluating the models on the CrowdAI mapping dataset? And what's your solution?
  2. Will my change influence the evaluation results?

Co4AI commented Mar 24, 2022

Good job. I ran into the same error.
