
Conversation

smathermather
Contributor

No description provided.

@smathermather smathermather marked this pull request as ready for review August 5, 2025 01:45
@NathanMOlson

NathanMOlson commented Aug 9, 2025

We should change the following to point away from my personal account before merging this PR. Should I open PRs to each of these repos? If so, should I target them to the main branch, or something else?

  • Entwine
  • mvs-texturing
  • OpenSfM
  • PDAL
  • pdal-python (upstream)
  • draco
  • untwine (upstream)

For python dependencies, I've used pip install with --break-system-packages. Is this acceptable? Does anyone know how to do this better?

@smathermather
Contributor Author

smathermather commented Aug 9, 2025

We should change the following to point away from my personal account before merging this PR. Should I open PRs to each of these repos? If so, should I target them to the main branch, or something else?

Yes, this pull request isn't ready to merge as-is, but I wanted to bundle it as such because it gives us some nominal testing of the default build as well as a place to track changes and discussion.

  • Entwine
  • mvs-texturing
  • OpenSfM
  • PDAL
  • pdal-python
  • draco (currently pointing upstream, which works for my use case but will break some things for others I think)
  • untwine (currently pointing upstream, which works for my use case but will break some things for others I think)

Each of these either needs the ODM-specific mods upstreamed where appropriate (e.g. Entwine / Untwine), as Piero indicated, or, where that isn't appropriate (e.g. OpenSfM / mvs-texturing), needs to be accompanied by pull requests against OpenDroneMap's forks.

It is also appropriate to keep this scoped to only pull requests against OpenDroneMap's forks, with an eye to upstreaming as a next step.

For python dependencies, I've used pip install with --break-system-packages. Is this acceptable? Does anyone know how to do this better?

Not sure. Since we discourage use outside a container, this might be ok. Although until we have an ODM repo maintainer recruited, that's a change I wouldn't endorse without someone more knowledgeable weighing in on the implications. Were you unable to get it to build without?

@smathermather smathermather marked this pull request as draft August 9, 2025 18:33
@NathanMOlson

Not sure. Since we discourage use outside a container, this might be ok. Although until we have an ODM repo maintainer recruited, that's a change I wouldn't endorse without someone more knowledgeable weighing in on the implications. Were you unable to get it to build without?

I was unable to get it to build without this flag. This is related to how newer versions of Python prefer to relate to system-installed files. https://stackoverflow.com/questions/75608323/how-do-i-solve-error-externally-managed-environment-every-time-i-use-pip-3
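For reference, the PEP 668 "externally-managed-environment" error can also be avoided without --break-system-packages by installing into a virtual environment. A minimal sketch (the venv path is an illustrative placeholder, not ODM's actual layout):

```shell
# Create a venv and install Python dependencies into it instead of the
# system interpreter; pip inside a venv is not "externally managed".
python3 -m venv "$HOME/odm-venv"
"$HOME/odm-venv/bin/python" -m pip --version
# "$HOME/odm-venv/bin/pip" install -r requirements.txt   # network step, shown only
```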

@NathanMOlson

I just attempted to open a PR to the ODM fork of PDAL, but was rejected with this message:

Pull request creation failed. Validation failed: must be a collaborator.

Any guidance on how I can contribute?

@smathermather
Contributor Author

Any guidance on how I can contribute?

I will be offline until Monday, but will check back in then. Just to verify: you did the pull request from your fork of a PDAL repo?

I added classic branch protection rules to the relevant repos (either to main or master) in the meantime. Hopefully this resolves it. Otherwise I'll dive in Monday.

@NathanMOlson

I will be offline until Monday, but will check back in then. Just to verify: you did the pull request from your fork of a PDAL repo?

I added classic branch protection rules to the relevant repos (either to main or master) in the meantime. Hopefully this resolves it. Otherwise I'll dive in Monday.

Thanks! I originally created a PR from PDAL/PDAL (which does not work), but have now successfully created a PR from NathanMOlson/PDAL.

I'm not sure I understand the branching strategy for all these repos; a pointer to documentation or a brief description of the nominal workflow for making changes would be helpful.

@NathanMOlson NathanMOlson left a comment

Re: 9806d91: This pattern is an option when there are only a few minor changes. We use the upstream repository and apply a patch.

This seems like an improvement to me, because I can bump the upstream version, or apply changes to it, without having to send PRs to multiple repos. When the changes are simple, it also seems easier to see exactly what has changed.

I don't think this is a good pattern when the changes are large, like OpenSfM or mvs-texturing.

@smathermather
Contributor Author

Re: 9806d91: This pattern is an option when there are only a few minor changes. We use the upstream repository and apply a patch.

I agree this is a good approach when the changes are small.

@NathanMOlson

@smathermather Do you have a preferred pattern for PRs to the ODM repos that fork outside projects?

My preference would be to have a main branch that tracks the upstream main branch, and an ODM (or develop) branch that tracks main plus whatever modifications ODM requires. Then ODM would pull from these forks using a git tag or commit hash, rather than branch name as seems to be the current pattern.

For this effort, I think it would make most sense to create the ODM branch from upstream main, so we can review the ODM-specific changes rather than all the changes that have happened upstream.

If we go this way, I think the workflow for each repo would be:

  1. Sync main branch with upstream main. (@smathermather or someone else with write privileges)
  2. Create ODM branch from main. (@smathermather or someone else with write privileges)
  3. Port ODM-specific changes, issue pull request to ODM branch (@NathanMOlson)
  4. Review changes and merge (@smathermather or someone else with write privileges)
  5. Update this PR to point to the commit hash of ODM branch after the merge (@NathanMOlson)

This will need to happen for Entwine, OpenSfM, mvs-texturing, and draco.

This is just one approach and I'm happy to do something different. @smathermather Let me know if there's a different approach I should take. If we follow my approach, the first two steps in each repo are on you :)
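Steps 1-2 of the proposal look roughly like the sketch below, simulated here with throwaway local repositories ("upstream" and "fork" are stand-ins for e.g. PDAL/PDAL and OpenDroneMap/PDAL; the real commands just swap in the GitHub remotes):

```shell
set -e
work=$(mktemp -d)

# Stand-in upstream repo with a main branch (git >= 2.28 for `init -b`).
git init -q -b main "$work/upstream"
git -C "$work/upstream" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "upstream change"

# Step 1: sync the fork's main branch with upstream main.
git clone -q "$work/upstream" "$work/fork"
git -C "$work/fork" checkout -q -B main origin/main

# Step 2: create the ODM branch from main. Steps 3-5 (port ODM-specific
# changes, review, pin the merged commit hash in ODM) continue from here.
git -C "$work/fork" checkout -q -b ODM main
git -C "$work/fork" rev-parse --abbrev-ref HEAD
```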

…y v1.26. Now they are built using the venv. fix missing packages. Now making an orthophoto
@NathanMOlson

@smathermather can you post a log from the failed run?

@smathermather
Contributor Author

@smathermather can you post a log from the failed run?

https://gist.github.com/smathermather/13a87ce38432eeb8ed81a6f58a330434

@NathanMOlson

The boundary is not created because GSD detection fails:

[WARNING] Cannot compute boundary (GSD cannot be estimated)

This suggests that gsd.opensfm_reconstruction_average_gsd() is returning None.

@Saijin-Naib
Contributor

Yeah, testing should be done by the weekend. Sorry for the delay!

@smathermather
Contributor Author

smathermather commented Oct 2, 2025

This suggests that gsd.opensfm_reconstruction_average_gsd() is returning None.

I thought so too, but the log for current ODM also has that warning:
https://gist.github.com/smathermather/de27a72ef564705ccab5a8e3932f6539

Looks like it might be at the georeferencing step:

[INFO]    Georeferencing point cloud                                                                                                                                                   |  [INFO]    Georeferencing point cloud
  [INFO]    las scale calculated as the minimum of 1/10 estimated spacing or 0.001, which ever is less.                                                                                  |  [INFO]    las scale calculated as the minimum of 1/10 estimated spacing or 0.001, which ever is less.
  [INFO]    Embedding GCP info in point cloud                                                                                                                                            |  [INFO]    Embedding GCP info in point cloud
  [INFO]    running pdal translate -i "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_filterpoints/point_cloud.ply" -o "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_ge|  [INFO]    running pdal translate -i "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_filterpoints/point_cloud.ply" -o "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_ge
  [INFO]    Calculating cropping area and generating bounds shapefile from point cloud                                                                                                   |  [INFO]    Calculating cropping area and generating bounds shapefile from point cloud
  [INFO]    running pdal translate -i "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz" -o "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d|  [INFO]    running pdal translate -i "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz" -o "/var/www/data/5eaaf721-8fc3-4272-bde4-da245
  [INFO]    running pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenc|  [INFO]    running pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenc
  [WARNING] Cannot calculate crop bounds! We will skip cropping                                                                                                                          |  [INFO]    running pdal info --summary "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz" > "/var/www/data/5eaaf721-8fc3-4272-bde4-da24
  ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|  [INFO]    running ogr2ogr -overwrite -f GPKG -a_srs "+proj=utm +zone=11 +datum=WGS84 +units=m +no_defs" /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_geore
  [INFO]    Creating Entwine Point Tile output                                                                                                                                           |  [INFO]    Creating Entwine Point Tile output
  [INFO]    running entwine build --threads 24 --tmp "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/entwine_pointcloud-tmp" -i /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/od|  [INFO]    running entwine build --threads 24 --tmp "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/entwine_pointcloud-tmp" -i /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/od
  1/1: /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz                                                                                 |  1/1: /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz                                                                                 
  Dimensions: [                                                                                                                                                                          |  Dimensions: [
  X:int32, Y:int32, Z:int32, Intensity:uint16, ReturnNumber:uint8,                                                                                                                       |  X:int32, Y:int32, Z:int32, Intensity:uint16, ReturnNumber:uint8,
  NumberOfReturns:uint8, ScanDirectionFlag:uint8, EdgeOfFlightLine:uint8,                                                                                                                |  NumberOfReturns:uint8, ScanDirectionFlag:uint8, EdgeOfFlightLine:uint8,
  Classification:uint8, Synthetic:uint8, KeyPoint:uint8, Withheld:uint8,                                                                                                                 |  Classification:uint8, ScanAngleRank:float32, UserData:uint8,                                                                                                                           
  Overlap:uint8, ScanAngleRank:float32, UserData:uint8, PointSourceId:uint16,                                                                                                            |  PointSourceId:uint16, GpsTime:float64, Red:uint16, Green:uint16, Blue:uint16                                                                                                           
  GpsTime:float64, ScanChannel:uint8, Red:uint16, Green:uint16, Blue:uint16,                                                                                                             |  ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  OriginId:uint32                                                                                                                                                                        |  ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  ]                                                                                                                                                                                      |  ]
  Points: 1,196,554                                                                                                                                                                      |  Points: 1,203,420                                                                                                                                                                      
  Bounds: [(235245, 3811192, -5), (235282, 3811232, 18)]                                                                                                                                 |  Bounds: [(235245, 3811192, -5), (235282, 3811229, 18)]                                                                                                                                 
  Scale: 0.001                                                                                                                                                                           |  Scale: 0.001
  SRS: PROJCS["unknown",GEOGCS["unknown",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,...                                                                                                  |  SRS: EPSG:32611                                                                                                                                                                        
                                                                                                                                                                                         |  
  Adding 0 - /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz                                                                           |  Adding 0 - /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz                                                                           
  Joining                                                                                                                                                                                |  Joining
  Done 0                                                                                                                                                                                 |  Done 0
  Saving                                                                                                                                                                                 |  Saving
  Wrote 1,196,554 points.                                                                                                                                                                |  Wrote 1,203,420 points.                                                                                                                                                                
  [INFO]    Finished odm_georeferencing stage                                                                                                                                            |  [INFO]    Finished odm_georeferencing stage
  [INFO]    Running odm_dem stage                                                                                                                                                        |  [INFO]    Running odm_dem stage
  [INFO]    Create DSM: True                                                                                                                                                             |  [INFO]    Create DSM: True
  [INFO]    Create DTM: False                                                                                                                                                            |  [INFO]    Create DTM: False
  [INFO]    DEM input file /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz found: True                                                 |  [INFO]    DEM input file /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz found: True                                                 
  [INFO]    running renderdem "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz" --outdir "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92|  [INFO]    running renderdem "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz" --outdir "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb
  Reading points from /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz                                                                  |  Reading points from /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz                                                                  
  Number of points: 1196554                                                                                                                                                              |  Number of points: 1203420                                                                                                                                                              
  Classification dimension: Classification                                                                                                                                               |  Classification dimension: Classification
  Point cloud bounds are [minx: 235245.676, maxx: 235281.623, miny: 3811192.52, maxy: 3811231.261]                                                                                       |  Point cloud bounds are [minx: 235245.763, maxx: 235281.607, miny: 3811192.396, maxy: 3811228.964]                                                                                      
  DEM resolution is (719, 775), max tile size is 4096, will split DEM generation into 1 tiles                                                                                            |  DEM resolution is (717, 732), max tile size is 4096, will split DEM generation into 1 tiles                                                                                            
  r0.02_x0_y0.tif                                                                                                                                                                        |  r0.02_x0_y0.tif
  r0.0282843_x0_y0.tif                                                                                                                                                                   |  r0.0282843_x0_y0.tif
  r0.04_x0_y0.tif                                                                                                                                                                        |  r0.04_x0_y0.tif
  [INFO]    Generated 3 tiles                                                                                                                                                            |  [INFO]    Generated 3 tiles
  [INFO]    running gdalbuildvrt -input_file_list "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_dem/tiles_list.txt" "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_dem|  [INFO]    running gdalbuildvrt -input_file_list "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_dem/tiles_list.txt" "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_dem
  0...10...20...30...40...50...60...70...80...90...100 - done.                                                                                                                           |  0...10...20...30...40...50...60...70...80...90...100 - done.
  [INFO]    running gdal_translate -co NUM_THREADS=24 -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 46.2% -outsize 10% 0 "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d9|  [INFO]    running gdal_translate -co NUM_THREADS=24 -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 46.3% -outsize 10% 0 "/var/www/data/5eaaf721-8fc3-4272-bde4-da245c
  100 - done.                                                                                                                                                                            |  100 - done.

@smathermather
Contributor Author

 [WARNING] Cannot calculate crop bounds! We will skip cropping 

From stages/odm_georeferencing.py:

                if args.crop > 0:
                    log.ODM_INFO("Calculating cropping area and generating bounds shapefile from point cloud")
                    cropper = Cropper(tree.odm_georeferencing, 'odm_georeferenced_model')

                    if args.fast_orthophoto:
                        decimation_step = 4
                    else:
                        decimation_step = 40

                    # More aggressive decimation for large datasets
                    if not args.fast_orthophoto:
                        decimation_step *= int(len(reconstruction.photos) / 1000) + 1
                        decimation_step = min(decimation_step, 95)

                    try:
                        cropper.create_bounds_gpkg(tree.odm_georeferencing_model_laz, args.crop,
                                                    decimation_step=decimation_step)
                    except:
                        log.ODM_WARNING("Cannot calculate crop bounds! We will skip cropping")
                        args.crop = 0

We should probably be printing that exception:

                if args.crop > 0:
                    log.ODM_INFO("Calculating cropping area and generating bounds shapefile from point cloud")
                    cropper = Cropper(tree.odm_georeferencing, 'odm_georeferenced_model')

                    if args.fast_orthophoto:
                        decimation_step = 4
                    else:
                        decimation_step = 40

                    # More aggressive decimation for large datasets
                    if not args.fast_orthophoto:
                        decimation_step *= int(len(reconstruction.photos) / 1000) + 1
                        decimation_step = min(decimation_step, 95)

                    try:
                        cropper.create_bounds_gpkg(tree.odm_georeferencing_model_laz, args.crop,
                                                    decimation_step=decimation_step)
                    except Exception as e:
                        log.ODM_WARNING("Cannot calculate crop bounds! We will skip cropping:")
                        print(e)
                        args.crop = 0
Which yields:

[WARNING] Cannot calculate crop bounds! We will skip cropping:
Received a NULL pointer.
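As a further tweak, logging the full traceback makes these failures easier to localize than the message alone. A stand-alone sketch, not ODM's actual code (`create_bounds` here is a hypothetical stand-in for `cropper.create_bounds_gpkg`):

```python
import traceback

def create_bounds():
    # Hypothetical stand-in that fails the way the real call did.
    raise RuntimeError("Received a NULL pointer.")

try:
    create_bounds()
except Exception as e:
    # Equivalent of log.ODM_WARNING(...), plus the traceback for context.
    print("Cannot calculate crop bounds! We will skip cropping: %s" % e)
    traceback.print_exc()
```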

A little under the weather here, so I'll leave this for today and return to troubleshoot tomorrow.

@Saijin-Naib
Contributor

Saijin-Naib commented Oct 2, 2025

ODM Setup.exe testing from GitHub Action:

 ____________________________
/   ____    _____    __  __  \
|  / __ \  |  __ \  |  \/  | |
| | |  | | | |  | | | \  / | |
| | |  | | | |  | | | |\/| | |
| | |__| | | |__| | | |  | | |
|  \____/  |_____/  |_|  |_| |
\____________________________/
       version: 3.5.6

Traceback (most recent call last):
  File "C:\ODM\\run.py", line 15, in <module>
    from opendm.utils import get_processing_results_paths, rm_r
  File "C:\ODM\opendm\utils.py", line 9, in <module>
    from opendm.photo import find_largest_photo_dims, find_mean_utc_time
  File "C:\ODM\opendm\photo.py", line 19, in <module>
    from opensfm.sensors import sensor_data
  File "C:\ODM\SuperBuild\install/bin/opensfm\opensfm\sensors.py", line 4, in <module>
    from opensfm import context
  File "C:\ODM\SuperBuild\install/bin/opensfm\opensfm\context.py", line 27, in <module>
    OPENCV5: bool = int(cv2.__version__.split(".")[0]) >= 5
                        ^^^^^^^^^^^^^^^
AttributeError: module 'cv2' has no attribute '__version__'

(venv) C:\ODM>

Not functional. Similar/same failure under Linux/WINE host as well. Are we dropping a depends, or is this a path isolation issue?

Edge/Defender are also blocking the download of the Runner artifact, but it is clean:
https://www.virustotal.com/gui/file/03c79c8a66b2badd1f1a908f5f9bbd4ddcc7cc7911b8473826e91f05db5bd1e9/detection

@smathermather
Contributor Author

Testing with Coal Oil Point Reserve from Michele M. Tobias & Alex Mandel (first ODM dataset), first issue found: auto-boundary doesn't appear to be respected:

This is a problem for all datasets, including referenced ones. Per Nathan, I'll poke at the GSD estimate as a potential issue.
[screenshot: toledo]

@smathermather
Contributor Author

[WARNING] Cannot calculate crop bounds! We will skip cropping:
Received a NULL pointer.

Looks like we are getting to the point of writing the tmp-bounds.geojson:
odm_georeferencing/odm_georeferenced_model.tmp-bounds.geojson

        # Write bounds to GeoJSON
        tmp_bounds_geojson_path = self.path('tmp-bounds.geojson')
        with open(tmp_bounds_geojson_path, "w") as f:
            f.write(json.dumps({
                "type": "FeatureCollection",
                "features": [{
                    "type": "Feature",
                    "geometry": pc_geojson_boundary_feature
                }]
            }))

But not all the way to cleaning it up in opendm/cropper.py.

@smathermather
Contributor Author

smathermather commented Oct 3, 2025

But not all the way to cleaning it up in opendm/cropper.py.

It appears that's due to getting a bounding box instead of a feature boundary. That tmp file looks like this in current:

{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Polygon", "coordinates": [[[235276.71, 3811187.58949192], [2352
77.21, 3811190.18756814], [235281.21, 3811191.91961894], [235282.71, 3811194.51769515], [235283.21, 3811198.84782217], [235279.71, 3811204.91], [235
275.71, 3811203.17794919], [235275.71, 3811204.91], [235278.71, 3811206.64205081], [235276.71, 3811208.37410162], [235277.21, 3811210.97217783], [23
5274.21, 3811212.70422863], [235272.21, 3811217.90038106], [235269.71, 3811217.03435565], [235269.71, 3811220.49845727], [235266.71, 3811222.2305080
8], [235264.71, 3811227.4266605], [235262.21, 3811226.56063509], [235255.71, 3811230.89076211], [235253.21000001, 3811230.02473671], [235251.2100000
1, 3811233.48883833], [235248.21000001, 3811235.22088913], [235238.21000001, 3811230.02473671], [235236.21000001, 3811226.56063509], [235238.2100000
1, 3811223.09653348], [235237.71000001, 3811218.76640646], [235240.71000001, 3811218.76640646], [235242.71000001, 3811217.03435565], [235245.2100000
1, 3811217.90038106], [235252.71000001, 3811213.57025404], [235254.71000001, 3811210.10615242], [235252.71000001, 3811208.37410162], [235250.2100000
1, 3811209.24012702], [235249.71000001, 3811203.17794919], [235251.71000001, 3811201.44589838], [235254.21000001, 3811202.31192379], [235254.2100000
1, 3811198.84782217], [235257.71, 3811192.78564435], [235267.71, 3811192.78564435], [235276.71, 3811187.58949192]]]}}]}

And like this in 24.04:

{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": "{\"bbox\":[235245.99000000, 3811192.59000000, 235281.49000000,3811231.21
000000]}"}]}

Looks like a failure at this step which appears to be the source of the bounding box:

run('pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 "{0}" > "{1}"'.format(decimated_pointcloud_path,  boundary_file_path))

Needs to be changed to:

run('pdal info --boundary --filters.hexbin.edge_length=1 --filters.hexbin.threshold=0 "{0}" > "{1}"'.format(decimated_pointcloud_path,  boundary_file_path))

per the current PDAL hexbin filter options.

This at least improves odm_georeferenced_model.boundary.json:

{
  "boundary":
  {
    "boundary": "POLYGON ((235245.99000000 3811192.59000000, 235245.99000000 3811231.21000000, 235281.49000000 3811231.21000000, 235281.49000000 381
1192.59000000, 235245.99000000 3811192.59000000))",
    "boundary_json": "{\"bbox\":[235245.99000000, 3811192.59000000, 235281.49000000,3811231.21000000]}"
  },
  "file_size": 1078459,
  "filename": "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.decimated.las",
  "now": "2025-10-03T16:09:33+0000",
  "pdal_version": "2.9.0 (git-version: 831631)",
  "reader": "readers.las"
}

There's still something broken after that though, as we still get a bounding box in our tmp json.
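One way to pin down where it regresses: check whether the boundary_json that pdal info wrote is a real polygon or the bbox fallback. A sketch (`is_bbox_fallback` is a hypothetical helper, not ODM code; the sample strings mirror the two shapes seen above):

```python
import json

def is_bbox_fallback(boundary_json_str):
    """True if pdal's boundary_json is only a bounding box, not a polygon."""
    data = json.loads(boundary_json_str)
    return set(data.keys()) == {"bbox"}

fallback = '{"bbox":[235245.99, 3811192.59, 235281.49, 3811231.21]}'
polygon = ('{"type": "Polygon", "coordinates": [[[235276.71, 3811187.59], '
           '[235277.21, 3811190.19], [235276.71, 3811187.59]]]}')
print(is_bbox_fallback(fallback))  # → True
print(is_bbox_fallback(polygon))   # → False
```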

@NathanMOlson

Not functional. Similar/same failure under Linux/WINE host as well. Are we dropping a depends, or is this a path isolation issue?

I don't have any idea where to look for this issue. Can anybody provide any clues?

@Saijin-Naib
Contributor

Usually we see issues with Python when the user has a system-wide install and the Python folder in %AppData% has site-packages that conflict with what we ship.

@NathanMOlson

Ah. The Docker version is now using a Python virtual environment. Maybe that would be a solution for Windows as well?

@NathanMOlson

Hmm, it looks like Windows is already using a venv...

@Saijin-Naib
Contributor

But somehow, the existing %AppData%\Python interferes, so something isn't using the venv fully.

@smathermather
Contributor Author

@Saijin-Naib -- were you able to do thermal and/or multi-spectral testing?

@smathermather
Contributor Author

There's still something broken after that though, as we still get a bounding box in our tmp json.

I'll dive in today. The auto-boundary is still broken, though at a later step, so that's progress.

@Saijin-Naib
Contributor

@smathermather very limited.

@smathermather
Contributor Author

Ok, this:

run('pdal info --boundary --filters.hexbin.edge_length=1 --filters.hexbin.threshold=0 "{0}" > "{1}"'.format(decimated_pointcloud_path,  boundary_file_path))

Needs a slight change to this:

run('pdal info --boundary --filters.hexbin.edge_length=1 --filters.hexbin.threshold=1 "{0}" > "{1}"'.format(decimated_pointcloud_path,  boundary_file_path))

In order to get stats on output.

@NathanMOlson

@smathermather Thanks for tracking that down!

@smathermather
Contributor Author

@smathermather very limited.

If you've got data to test with, feel free to upload or send my way and I can test, if time is a factor.

If we don't have enough test data, I'll get a post started on the forum.

@Saijin-Naib
Contributor

@NathanMOlson, I think the next step is to fully remove WebODM and the Python AppData folder from the test machine and re-run.

If it works, the years-old path pollution/escape bug lives on 😤

@Saijin-Naib
Contributor

Saijin-Naib commented Oct 6, 2025

@NathanMOlson, I think the next step is to fully remove WebODM and the Python AppData folder from the test machine and re-run.

If it works, the years-old path pollution/escape bug lives on 😤

@NathanMOlson
Good news and bad!

No change when removing WebODM and nuking the AppData\Python folder, so the issue isn't path pollution (yay!)

But it must be something else. Perhaps the OpenCV build in the current release doesn't actually have the __version__ attribute, as stated in the debug log, and we need to call the version check from cv2 differently?

@Saijin-Naib
Contributor

It seems this error might stem from mixed Python 2 / Python 3 runtimes?
https://forum.opencv.org/t/problem-installing-on-jetson-nano-attributeerror-module-cv2-has-no-attribute-version/3408/2

@NathanMOlson

But it must be something else. Perhaps the OpenCV build in the current release doesn't actually have the __version__ attribute, as stated in the debug log, and we need to call the version check from cv2 differently?

In the docker build, cv2.__version__ works as expected:

>>> import cv2
>>> cv2.__version__
'4.12.0'

The same version of OpenCV is built for Windows, so I don't think this is an OpenCV problem.

@smathermather
Contributor Author

As Windows build and Python isolation issues aren't a new problem, we don't have to get that resolved in this pull request. It's already better from a Windows build perspective, and hopefully what Sam proposes in #1936 can get us to 100%.

Thermal testing is done, according to Brett. So now we're just waiting on some multispectral testing. We have a MAPIR dataset and some DJI data. Will loop back soon on that.


4 participants