24.04 #1904
Conversation
We should change the following to point away from my personal account before merging this PR. Should I open PRs to each of these repos? If so, should I target them to the main branch, or something else?
For Python dependencies, I've used
Yes, this pull request isn't ready to merge as-is, but I wanted to bundle it this way because it gives us some nominal build testing by default, as well as a place to track changes and discussion.
Each of these either needs the ODM-specific mods upstreamed where appropriate (e.g. Entwine / Untwine), as Piero indicated, or, where that isn't appropriate (e.g. OpenSfM / mvs-texturing), it needs to be accompanied by pull requests against OpenDroneMap's forks. It is also reasonable to keep this scoped to only pull requests against OpenDroneMap's forks, with an eye to upstreaming as a next step.
Not sure. Since we discourage use outside a container, this might be OK. Although until we have an ODM repo maintainer recruited, that's a change I wouldn't endorse without someone more knowledgeable weighing in on the implications. Were you unable to get it to build without?
I was unable to get it to build without this flag. This is related to how newer versions of Python treat system-installed packages (PEP 668, the "externally managed environment" error): https://stackoverflow.com/questions/75608323/how-do-i-solve-error-externally-managed-environment-every-time-i-use-pip-3
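For context, a minimal sketch of the venv-based alternative to overriding PEP 668, assuming the flag in question is pip's `--break-system-packages` (the escape hatch the externally-managed-environment error mentions); the venv path and requirements file here are illustrative placeholders, not necessarily what the ODM build uses:

```python
# Sketch only: create an isolated virtual environment and install Python
# dependencies into it, instead of passing --break-system-packages to pip.
import subprocess
import venv

venv_dir = "/code/venv"  # hypothetical install location
venv.create(venv_dir, with_pip=True)
subprocess.run(
    [f"{venv_dir}/bin/pip", "install", "-r", "requirements.txt"],
    check=True,  # raise if the install fails
)
```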
I just attempted to open a PR to the ODM fork of PDAL, but was rejected with this message:
Any guidance on how I can contribute?
I will be offline until Monday, but will check back in then. Just to verify: you made the pull request from your fork of the PDAL repo? In the meantime, I added classic branch protection rules to the relevant repos (either to main or master). Hopefully this resolves it. Otherwise I'll dive in Monday.
Thanks! I originally created a PR from PDAL/PDAL (which does not work), but have now successfully created a PR from NathanMOlson/PDAL. I'm not sure I understand the branching strategy for all these repos; a pointer to documentation or a brief description of the nominal workflow for making changes would be helpful.
Re: 9806d91: This pattern is an option when there are only a few minor changes. We use the upstream repository and apply a patch.
This seems like an improvement to me, because I can bump the upstream version, or apply changes to it, without having to send PRs to multiple repos. When the changes are simple, it also seems easier to see exactly what has changed.
I don't think this is a good pattern when the changes are large, like OpenSfM or mvs-texturing.
I agree this is a good approach when the changes are small.
@smathermather Do you have a preferred pattern for PRs to the ODM repos that fork outside projects? My preference would be to have a dedicated ODM branch in each fork. For this effort, I think it would make the most sense to create that branch in each repo. If we go this way, I think the workflow for each repo would be:
This will need to happen for entwine, OpenSfM, mvs-texturing, and draco. This is just one approach, and I'm happy to do something different. @smathermather Let me know if there's a different approach I should take. If we follow my approach, the first two steps in each repo are on you :)
…y v1.26. Now they are built using the venv. fix missing packages. Now making an orthophoto
@smathermather can you post a log from the failed run?
https://gist.github.com/smathermather/13a87ce38432eeb8ed81a6f58a330434
The boundary is not created because GSD detection fails:
This suggests that
Yeah, testing should be done by the weekend. Sorry for the delay!
I thought so too, but the log for current ODM also has that warning. Looks like it might be at the georeferencing step:
[INFO] Georeferencing point cloud | [INFO] Georeferencing point cloud
[INFO] las scale calculated as the minimum of 1/10 estimated spacing or 0.001, which ever is less. | [INFO] las scale calculated as the minimum of 1/10 estimated spacing or 0.001, which ever is less.
[INFO] Embedding GCP info in point cloud | [INFO] Embedding GCP info in point cloud
[INFO] running pdal translate -i "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_filterpoints/point_cloud.ply" -o "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_ge| [INFO] running pdal translate -i "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_filterpoints/point_cloud.ply" -o "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_ge
[INFO] Calculating cropping area and generating bounds shapefile from point cloud | [INFO] Calculating cropping area and generating bounds shapefile from point cloud
[INFO] running pdal translate -i "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz" -o "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d| [INFO] running pdal translate -i "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz" -o "/var/www/data/5eaaf721-8fc3-4272-bde4-da245
[INFO] running pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenc| [INFO] running pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenc
[WARNING] Cannot calculate crop bounds! We will skip cropping | [INFO] running pdal info --summary "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz" > "/var/www/data/5eaaf721-8fc3-4272-bde4-da24
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| [INFO] running ogr2ogr -overwrite -f GPKG -a_srs "+proj=utm +zone=11 +datum=WGS84 +units=m +no_defs" /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_geore
[INFO] Creating Entwine Point Tile output | [INFO] Creating Entwine Point Tile output
[INFO] running entwine build --threads 24 --tmp "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/entwine_pointcloud-tmp" -i /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/od| [INFO] running entwine build --threads 24 --tmp "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/entwine_pointcloud-tmp" -i /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/od
1/1: /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz | 1/1: /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz
Dimensions: [ | Dimensions: [
X:int32, Y:int32, Z:int32, Intensity:uint16, ReturnNumber:uint8, | X:int32, Y:int32, Z:int32, Intensity:uint16, ReturnNumber:uint8,
NumberOfReturns:uint8, ScanDirectionFlag:uint8, EdgeOfFlightLine:uint8, | NumberOfReturns:uint8, ScanDirectionFlag:uint8, EdgeOfFlightLine:uint8,
Classification:uint8, Synthetic:uint8, KeyPoint:uint8, Withheld:uint8, | Classification:uint8, ScanAngleRank:float32, UserData:uint8,
Overlap:uint8, ScanAngleRank:float32, UserData:uint8, PointSourceId:uint16, | PointSourceId:uint16, GpsTime:float64, Red:uint16, Green:uint16, Blue:uint16
GpsTime:float64, ScanChannel:uint8, Red:uint16, Green:uint16, Blue:uint16, | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
OriginId:uint32 | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
] | ]
Points: 1,196,554 | Points: 1,203,420
Bounds: [(235245, 3811192, -5), (235282, 3811232, 18)] | Bounds: [(235245, 3811192, -5), (235282, 3811229, 18)]
Scale: 0.001 | Scale: 0.001
SRS: PROJCS["unknown",GEOGCS["unknown",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,... | SRS: EPSG:32611
|
Adding 0 - /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz | Adding 0 - /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz
Joining | Joining
Done 0 | Done 0
Saving | Saving
Wrote 1,196,554 points. | Wrote 1,203,420 points.
[INFO] Finished odm_georeferencing stage | [INFO] Finished odm_georeferencing stage
[INFO] Running odm_dem stage | [INFO] Running odm_dem stage
[INFO] Create DSM: True | [INFO] Create DSM: True
[INFO] Create DTM: False | [INFO] Create DTM: False
[INFO] DEM input file /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz found: True | [INFO] DEM input file /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz found: True
[INFO] running renderdem "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz" --outdir "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92| [INFO] running renderdem "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz" --outdir "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb
Reading points from /var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.laz | Reading points from /var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_georeferencing/odm_georeferenced_model.laz
Number of points: 1196554 | Number of points: 1203420
Classification dimension: Classification | Classification dimension: Classification
Point cloud bounds are [minx: 235245.676, maxx: 235281.623, miny: 3811192.52, maxy: 3811231.261] | Point cloud bounds are [minx: 235245.763, maxx: 235281.607, miny: 3811192.396, maxy: 3811228.964]
DEM resolution is (719, 775), max tile size is 4096, will split DEM generation into 1 tiles | DEM resolution is (717, 732), max tile size is 4096, will split DEM generation into 1 tiles
r0.02_x0_y0.tif | r0.02_x0_y0.tif
r0.0282843_x0_y0.tif | r0.0282843_x0_y0.tif
r0.04_x0_y0.tif | r0.04_x0_y0.tif
[INFO] Generated 3 tiles | [INFO] Generated 3 tiles
[INFO] running gdalbuildvrt -input_file_list "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_dem/tiles_list.txt" "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_dem| [INFO] running gdalbuildvrt -input_file_list "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_dem/tiles_list.txt" "/var/www/data/5eaaf721-8fc3-4272-bde4-da245cb5582a/odm_dem
0...10...20...30...40...50...60...70...80...90...100 - done. | 0...10...20...30...40...50...60...70...80...90...100 - done.
[INFO] running gdal_translate -co NUM_THREADS=24 -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 46.2% -outsize 10% 0 "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d9| [INFO] running gdal_translate -co NUM_THREADS=24 -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 46.3% -outsize 10% 0 "/var/www/data/5eaaf721-8fc3-4272-bde4-da245c
100 - done. | 100 - done.
From stages/odm_georeferencing.py:

```python
if args.crop > 0:
    log.ODM_INFO("Calculating cropping area and generating bounds shapefile from point cloud")
    cropper = Cropper(tree.odm_georeferencing, 'odm_georeferenced_model')

    if args.fast_orthophoto:
        decimation_step = 4
    else:
        decimation_step = 40

    # More aggressive decimation for large datasets
    if not args.fast_orthophoto:
        decimation_step *= int(len(reconstruction.photos) / 1000) + 1
        decimation_step = min(decimation_step, 95)

    try:
        cropper.create_bounds_gpkg(tree.odm_georeferencing_model_laz, args.crop,
                                   decimation_step=decimation_step)
    except:
        log.ODM_WARNING("Cannot calculate crop bounds! We will skip cropping")
        args.crop = 0
```

We should probably be printing that exception:

```python
if args.crop > 0:
    log.ODM_INFO("Calculating cropping area and generating bounds shapefile from point cloud")
    cropper = Cropper(tree.odm_georeferencing, 'odm_georeferenced_model')

    if args.fast_orthophoto:
        decimation_step = 4
    else:
        decimation_step = 40

    # More aggressive decimation for large datasets
    if not args.fast_orthophoto:
        decimation_step *= int(len(reconstruction.photos) / 1000) + 1
        decimation_step = min(decimation_step, 95)

    try:
        cropper.create_bounds_gpkg(tree.odm_georeferencing_model_laz, args.crop,
                                   decimation_step=decimation_step)
    except Exception as e:
        log.ODM_WARNING("Cannot calculate crop bounds! We will skip cropping:")
        print(e)
        args.crop = 0
```
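Possibly even more useful for tracking this down: logging the full traceback rather than just the exception message, so the log shows where inside create_bounds_gpkg it fails. A hedged sketch using the standard-library traceback module, on the same code as above:

```python
import traceback

try:
    cropper.create_bounds_gpkg(tree.odm_georeferencing_model_laz, args.crop,
                               decimation_step=decimation_step)
except Exception:
    # Capture the full stack trace in the ODM log, not just str(e) on stdout
    log.ODM_WARNING("Cannot calculate crop bounds! We will skip cropping:\n%s"
                    % traceback.format_exc())
    args.crop = 0
```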
A little under the weather here, so I'll leave this for today and return to troubleshoot tomorrow.
ODM Setup.exe testing from the GitHub Action:
Not functional. Similar/same failure under a Linux/WINE host as well. Are we dropping a dependency, or is this a path-isolation issue? Edge/Defender are also blocking the download of the Runner artifact, but it is clean:
This is a problem for all datasets, including referenced ones. Per Nathan, I'll poke at the GSD estimate as a potential issue.
Looks like we are getting to the point of writing the tmp-bounds.geojson:

```python
# Write bounds to GeoJSON
tmp_bounds_geojson_path = self.path('tmp-bounds.geojson')
with open(tmp_bounds_geojson_path, "w") as f:
    f.write(json.dumps({
        "type": "FeatureCollection",
        "features": [{
            "type": "Feature",
            "geometry": pc_geojson_boundary_feature
        }]
    }))
```

But not all the way to cleaning it up in
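Given the failure mode discussed in the next comment (the geometry arriving as a bbox string instead of a Polygon dict), a hedged guard before this write would make the problem fail loudly instead of producing a bad tmp file. `pc_geojson_boundary_feature` is the variable from the snippet above; the check itself is my addition, not ODM's:

```python
# Sketch: refuse to write the tmp GeoJSON if pdal handed back a bbox
# string rather than a real GeoJSON geometry object.
if not (isinstance(pc_geojson_boundary_feature, dict)
        and pc_geojson_boundary_feature.get("type") == "Polygon"):
    raise ValueError("Expected a Polygon geometry dict, got: %r"
                     % (pc_geojson_boundary_feature,))
```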
It appears that's due to getting a bounding box instead of a feature boundary. That tmp file looks like this in current:

```json
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Polygon", "coordinates": [[[235276.71, 3811187.58949192], [235277.21, 3811190.18756814], [235281.21, 3811191.91961894], [235282.71, 3811194.51769515], [235283.21, 3811198.84782217], [235279.71, 3811204.91], [235275.71, 3811203.17794919], [235275.71, 3811204.91], [235278.71, 3811206.64205081], [235276.71, 3811208.37410162], [235277.21, 3811210.97217783], [235274.21, 3811212.70422863], [235272.21, 3811217.90038106], [235269.71, 3811217.03435565], [235269.71, 3811220.49845727], [235266.71, 3811222.23050808], [235264.71, 3811227.4266605], [235262.21, 3811226.56063509], [235255.71, 3811230.89076211], [235253.21000001, 3811230.02473671], [235251.21000001, 3811233.48883833], [235248.21000001, 3811235.22088913], [235238.21000001, 3811230.02473671], [235236.21000001, 3811226.56063509], [235238.21000001, 3811223.09653348], [235237.71000001, 3811218.76640646], [235240.71000001, 3811218.76640646], [235242.71000001, 3811217.03435565], [235245.21000001, 3811217.90038106], [235252.71000001, 3811213.57025404], [235254.71000001, 3811210.10615242], [235252.71000001, 3811208.37410162], [235250.21000001, 3811209.24012702], [235249.71000001, 3811203.17794919], [235251.71000001, 3811201.44589838], [235254.21000001, 3811202.31192379], [235254.21000001, 3811198.84782217], [235257.71, 3811192.78564435], [235267.71, 3811192.78564435], [235276.71, 3811187.58949192]]]}}]}
```

And like this in 24.04:

```json
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": "{\"bbox\":[235245.99000000, 3811192.59000000, 235281.49000000,3811231.21000000]}"}]}
```

Looks like a failure at this step, which appears to be the source of the bounding box:

```python
run('pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 "{0}" > "{1}"'.format(decimated_pointcloud_path, boundary_file_path))
```

This needs to be changed to:

```python
run('pdal info --boundary --filters.hexbin.edge_length=1 --filters.hexbin.threshold=0 "{0}" > "{1}"'.format(decimated_pointcloud_path, boundary_file_path))
```

per the current pdal function signature. This at least improves odm_georeferenced_model.boundary.json:

```json
{
  "boundary":
  {
    "boundary": "POLYGON ((235245.99000000 3811192.59000000, 235245.99000000 3811231.21000000, 235281.49000000 3811231.21000000, 235281.49000000 3811192.59000000, 235245.99000000 3811192.59000000))",
    "boundary_json": "{\"bbox\":[235245.99000000, 3811192.59000000, 235281.49000000,3811231.21000000]}"
  },
  "file_size": 1078459,
  "filename": "/var/www/data/04a6da71-4b4d-4ba1-a2f0-4c99d92c14e0/odm_georeferencing/odm_georeferenced_model.decimated.las",
  "now": "2025-10-03T16:09:33+0000",
  "pdal_version": "2.9.0 (git-version: 831631)",
  "reader": "readers.las"
}
```

There's still something broken after that, though, as we still get a bounding box in our tmp json.
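For reference, a small sketch (file path hypothetical, check logic mine) of how the bbox fallback could be detected when parsing pdal's boundary output as shown above: a true hexbin boundary carries polygon coordinates, while the fallback only carries a `bbox` key.

```python
import json

# Hypothetical path to the `pdal info --boundary` output shown above
with open("odm_georeferenced_model.boundary.json") as f:
    info = json.load(f)

# In the output above, boundary_json is itself a JSON-encoded string
boundary_json = json.loads(info["boundary"]["boundary_json"])
if "bbox" in boundary_json and "coordinates" not in boundary_json:
    print("pdal fell back to a bounding box; no hexbin boundary was computed")
```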
I don't have any idea where to look for this issue. Can anybody provide any clues?
Usually we see issues with Python when the user has a system-wide install and the Python folder in %AppData% has site-packages that conflict with what we ship.
Ah. The Docker version is now using a Python virtual environment. Maybe that would be a solution for Windows as well?
Hmm, it looks like Windows is already using a venv...
But somehow, the existing %AppData%\Python interferes, so something isn't using the venv fully.
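One hedged way to test that theory on an affected machine: run the shipped interpreter and dump its import state. If the user-site directory under %AppData% shows up, user site-packages are leaking past the venv (CPython's `-s` flag and the PYTHONNOUSERSITE environment variable disable it). This diagnostic is a sketch, not part of ODM:

```python
import site
import sys

print("sys.prefix:", sys.prefix)                     # should point inside the venv
print("user site enabled:", site.ENABLE_USER_SITE)   # True means user site-packages can leak in
print("user site dir:", site.getusersitepackages())  # the %AppData% location on Windows
for p in sys.path:
    print(p)  # look for AppData paths that shouldn't be here
```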
@Saijin-Naib -- were you able to do thermal and/or multi-spectral testing?
I'll dive in today. The auto-boundary is still broken, though at a later step, so that's progress.
@smathermather very limited.
Ok, this:

```python
run('pdal info --boundary --filters.hexbin.edge_length=1 --filters.hexbin.threshold=0 "{0}" > "{1}"'.format(decimated_pointcloud_path, boundary_file_path))
```

needs a slight change to this:

```python
run('pdal info --boundary --filters.hexbin.edge_length=1 --filters.hexbin.threshold=1 "{0}" > "{1}"'.format(decimated_pointcloud_path, boundary_file_path))
```

in order to get stats on output.
@smathermather Thanks for tracking that down!
If you've got data to test with, feel free to upload it or send it my way and I can test, if time is a factor. If we don't have enough test data, I'll get a post started on the forum.
@NathanMOlson, I think the next step is to fully remove WebODM and the Python AppData folder from the test machine and re-run. If it works, the years-old path pollution/escape bug lives on 😤
@NathanMOlson No change after removing WebODM and nuking the AppData\Python folder, so the issue isn't path pollution (yay!). But it must be something else. Perhaps the OpenCV of the current release doesn't actually have the
It seems this error might stem from mixed Python 2/Python 3 runtimes?
In the docker build,
The same version of OpenCV is built for Windows, so I don't think this is an OpenCV problem.
Since Windows build and Python isolation issues aren't a new problem, we don't have to get that resolved in this pull request. It's already better from a Windows build perspective, and hopefully what Sam proposes in #1936 can get us to 100%. Thermal testing is done, according to Brett. So now we're just waiting on some multispectral testing. We have a MAPIR dataset and some DJI data. Will loop back soon on that.