POST
|
Sooo... I noticed that the Extract Multi Values to Points tool creates temporary files named input.xml and Output.xml under appdata\local\temp\[insert unpredictable location] and deletes them on completion of processing. It looks like the Output.xml file (attached) is a temporary version of the data before it got turned into a shapefile (there's an input.xml file too), but it's not standard XML - there are no tags and lots of non-ASCII data.

[backstory] I ask because I deleted some stuff that took me most of a day to make, and I shudder at the thought of doing it again. It was a point file with a lot of points I picked very carefully in areas that were unchanged between two elevation models from before and after a recent flood, so I could make an error surface in the model where all of my control was washed away. So I stared and stared at two orthophotos, two hillshades, and a difference surface while picking these points.

Long story short, I was trying to export a subset of the points and the export failed with an unintelligible error. So I changed some stuff and tried again. I thought I was clicking on the previously created export, but I went too fast and really it was the source file; it failed again, and my source data disappeared. I tried Recuva on the source-file directory and got almost everything back, but the .shp file was nowhere to be seen. I found the input.xml and Output.xml files while looking forlornly in the recycle bin. They have coordinate system information in plaintext and look an awful lot like copies of my data, but I'm not sure how to get them back. I tried the Import from XML tool in Arc and crashed ArcMap, and the internets seem woefully immune to my carefully crafted googling.

Any help would be super-appreciated. I'll even PayPal you enough for a six-pack/4-pack/bottle of your favorite (non-crazy-expensive) refreshment. Thanks for any help. Andy
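Since the recovered Output.xml isn't valid XML and is full of non-ASCII bytes, one low-risk first look might be pulling the printable-text runs out of it (like the Unix strings utility) to see how much of the coordinate system info and attribute data survives in plaintext. A minimal sketch, not anything based on Esri's actual temp-file format; the 4-character minimum is an arbitrary choice:

```python
import re

def extract_strings(path, min_len=4):
    """Pull runs of printable ASCII out of a binary file,
    similar to the Unix `strings` utility."""
    with open(path, "rb") as f:
        data = f.read()
    # printable ASCII bytes (space through tilde), min_len or longer
    runs = re.findall(rb"[ -~]{%d,}" % min_len, data)
    return [r.decode("ascii") for r in runs]

# Example: scan the recovered temp file for readable fragments
# for s in extract_strings("Output.xml"):
#     print(s)
```

If the WKT and coordinate values show up intact, that at least tells you the data are in there and worth reverse-engineering further.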
Posted 01-19-2018 04:36 PM
DOC
|
Thanks for the quick reply. I unfortunately found my way into root before I saw your message and fixed my issues with conda update conda followed by conda install spyder. Hopefully Python isn't broken now in ArcGIS Pro. Andy
Posted 12-21-2016 06:36 PM
DOC
|
OK, found what's hopefully the first part of the answer - 'deactivate' actually gets me into the root environment.
Posted 12-21-2016 04:04 PM
DOC
|
I just installed ArcGIS Pro 1.4 (USGS build - sorry if this is off-topic) and I am trying to install Spyder to work with conda, like Robert. I get the same error. When I try conda update conda I get:

Error: package 'conda' is not installed in c:\ArcGIS\Pro\bin\Python\envs\arcgispro-py3

Looks like maybe I need root, but I can't activate it:

>conda install -f conda
Fetching package metadata: .....
Solving package specifications: .........
Error: 'conda' can only be installed into the root environment

>activate root
No environment named "root" exists in c:\ArcGIS\Pro\bin\Python\envs\, or is not a valid conda installation directory.

>conda info --envs
# conda environments:
#
arcgispro-py3  *  c:\ArcGIS\Pro\bin\Python\envs\arcgispro-py3
root              c:\ArcGIS\Pro\bin\Python

So confused.
Posted 12-21-2016 03:56 PM
POST
|
The ArcMap dialog for ASCII 3D to Feature Class says point spacing is optional. The error message says otherwise. Also, I can't figure out where to report a bug, and this doesn't warrant harassing the USGS GIS IT folks. The pic is self-explanatory.
Posted 06-23-2015 06:57 PM
POST
|
I am using ArcScene to visualize and edit four shapefiles of points (channel-bottom soundings). It's a useful way to spot blunders from beam divergence or flyers/sinkers, except it appears to be an inefficient editing environment. Each edit (select and delete a single point or a few points) takes a surprisingly long time: about 6-8 seconds of hourglass when I select a point (more if a few points) and about 16-20 seconds of hourglass to delete after I hit the delete button. Each shapefile is 5 km long or less with 12k to 60k points. I am running ArcGIS 10.2.1 on a dual quad-core Xeon workstation with 192 GB of RAM and dual AMD Radeon HD 7970 GPUs. Data are locally stored on SATA III 6 Gb/s drives. Other than the obvious question of "Is there a way to improve editing speed (dramatically) in ArcScene?", I recognize that I may be going a bit beyond what ArcScene is capable of and/or intended for, so I am wondering if anyone has a recommendation for another editing tool that would allow me to visualize and edit the data in 3D, preferably still in shapefile form.
Posted 04-03-2014 11:15 AM
POST
|
Here's my experience so far, working with a .las file exported from Agisoft PhotoScan and cleaned with las2las because of an import bug ArcMap has with VLRs in the LAS GeoKey:

I initially imported a ~300 million point LAS file, then generated a multipoint (LAS to Multipoint) and attempted to run Point Statistics with a cell size of 0.5 m and a 1x1 rectangular neighborhood for each of median, mean, min, max, std, range, and variety. Std and mean were the only statistics that successfully generated. Min, max, and median all failed with 1-bit rasters and all nodata. Range and variety also gave me 1-bit rasters, but range had 0 in cells that gave me a std < ~0.4 and 1 where std > 0.4, and variety gave me 0 in all cells with data and 1 only in nodata cells from the original pointcloud (also nodata in the std raster).

Point to Raster gave me the same result for std and mean as Point Statistics with a 1x1 neighborhood, and I was able to generate min, max, and range with that tool (it can also do std), and it takes about 2 minutes/raster instead of 20. Unfortunately, Point to Raster doesn't generate median, so I am still striking out on the most important function I am trying to perform from a noise-removal standpoint.

I figured I'd see if size was an issue, so I clipped a small (9 million point) area of the original pointcloud and attempted median statistics on it. That still failed (produced a 1-bit depth raster with all nodata), so I clipped it down to 780,000 points - still failed the same way. I also tried min and max, and I tried with 1- and 3-cell rectangular neighborhoods. All have the same results: 1-bit pixel depth, all-nodata output. Std still produces a valid raster.
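Since the median statistic keeps failing inside Arc, one workaround is to compute the per-cell median outside it with a few lines of numpy. A sketch under the assumption that the points have already been pulled out as x/y/z arrays (e.g. dumped to text with las2txt first); the 0.5 m cell size matches the run described above:

```python
import numpy as np

def gridded_median(x, y, z, cell=0.5):
    """Median z per grid cell - like point-to-raster with a median rule.
    Cells with no points stay NaN."""
    x, y, z = map(np.asarray, (x, y, z))
    col = np.floor((x - x.min()) / cell).astype(int)
    row = np.floor((y - y.min()) / cell).astype(int)
    ncols = col.max() + 1
    grid = np.full((row.max() + 1, ncols), np.nan)
    # group points by flattened cell index, then take one median per group
    flat = row * ncols + col
    order = np.argsort(flat)
    flat_sorted, z_sorted = flat[order], z[order]
    starts = np.flatnonzero(np.r_[True, np.diff(flat_sorted) > 0])
    for s, e in zip(starts, np.r_[starts[1:], len(flat_sorted)]):
        grid.flat[flat_sorted[s]] = np.median(z_sorted[s:e])
    return grid
```

The resulting array can be written back to a GeoTIFF (e.g. with GDAL) and used in Arc like any other raster.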
Posted 10-31-2013 10:24 AM
POST
|
... it is worth for corporate organizations to invest time/money onto other vendors' LAS solution, like ERDAS/ENVI or Leica (XPro).

Thanks for the quick reply! Unfortunately I'm not free to spend ~$10-$20K on other software, especially given that I am not using LiDAR-derived LAS pointclouds, and nothing works quite the same with SfM clouds. I have demo'd QT Modeler and am in the process of demoing I-Site Studio, but haven't found an "ideal" tool yet. CloudCompare can handle 300-400 million points no problem but doesn't have editing tools. lastools is useful but doesn't include a median filter. I am trying GRASS now as ArcMap chugs through generating stats on the multipoint I exported from the lidar data. Would be nice if I didn't have to go through the step of exporting to multipoint, though, since Arc now kind of natively supports lidar data.

I should note that I was able to generate a 300 million point multipoint file with ArcMap (about 2 hrs), and it appears that I can generate some stats with the Point Statistics tool (median and minimum statistics failed, generating grids with pixel depth of 1 bit and apparently all nodata, but std worked). It looks like it takes about 15 minutes per stat run, and I am not clear whether, when I specify a cell size of 0.5 and a rectangular neighborhood of 1x1, I am just analyzing the points in that cell, which is my goal (essentially like doing point-to-raster, but it allows me to calculate median).
Posted 10-30-2013 11:14 AM
POST
|
I am trying to figure out if ArcGIS is even worth working with to process my point data, or if I should look for another solution and just work with the finished rasters in Arc. I am generating 1.5-billion-point clouds of a 30 km strip of river every few weeks, and I chop the river into 5 chunks for processing. I have noise issues with specular reflection from the water (I'm using SfM to generate the pointclouds), so I have a mess where the water is. The best way to deal with the mess, I think, is to (1) clean up spikes with a median filter, and (2) eliminate areas of high noise/low returns with a combination of statistics. So ideally I would be able to generate a grid for each of the following from the .las file: mean, max, min, count, and stdev. My .las file stores everything as 1st (or last) return (I forget), and basically only has xyzrgb for each point.

I tried to convert a <300 million point cloud to multipoint (actually I am still trying, and it's been hours), but I am wondering if I should give up. I can get just about everything (except median, which is actually really important) from lastools, and I think I'll be able to get median, and maybe everything, with GRASS. I'd love to use ArcMap, but I am at a loss where to even start. The .las file is 8 GB.
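For the mean/max/min/count/stdev grids, one route outside Arc is scipy's binned_statistic_2d, which bins scattered points onto a regular grid and evaluates a statistic per cell (median included). A sketch, assuming the points are available as x/y/z numpy arrays and using the 0.5 m cell size mentioned later in this thread:

```python
import numpy as np
from scipy.stats import binned_statistic_2d

def point_stats(x, y, z, cell=0.5,
                stats=("mean", "min", "max", "count", "std", "median")):
    """One raster-like 2D array per statistic, binned on a regular grid.
    Empty cells come back NaN (0 for 'count')."""
    # cell edges spanning the data extent
    xedges = np.arange(x.min(), x.max() + cell, cell)
    yedges = np.arange(y.min(), y.max() + cell, cell)
    out = {}
    for stat in stats:
        result = binned_statistic_2d(x, y, z, statistic=stat,
                                     bins=[xedges, yedges])
        out[stat] = result.statistic  # shape (n_xbins, n_ybins)
    return out
```

For 1.5 billion points you would want to process the 5 river chunks separately (or stream the file), but per chunk this is a single pass per statistic.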
Posted 10-29-2013 07:21 PM
POST
|
I had the same problem. I have attached a sample dataset that throws the error. The dataset is viewable in PointVue LE, but generates the following error when I run CheckLAS:

WARNING 1: Failed to import spatial reference
Failed to read LAS linear unit geo-key.

Screenshot and workaround are below. Pointcloud as a ZIP along with logfile results as attachment in case ESRI wants to fix the bug. It looks like PointVue is more forgiving of GeoKeys than ArcGIS is, and so can import the data successfully. [ATTACH=CONFIG]25672[/ATTACH]

--EDIT-- Thanks Joel @ USGS for pointing me to a post on the lastools Google Group that led to a workaround. I was able to strip out the offending VLRs (in my case userbytes were fine) and add correct projection information with las2las. lasinfo gave me a good look at what the offending data were.

Here's the lasinfo report on the VLRs before I ran las2las:

GeoKeyDirectoryTag version 1.1.0 number of keys 4
key 1024 tiff_tag_location 0 count 1 value_offset 1 - GTModelTypeGeoKey: ModelTypeProjected
key 1025 tiff_tag_location 0 count 1 value_offset 1 - GTRasterTypeGeoKey: RasterPixelIsArea
key 3072 tiff_tag_location 0 count 1 value_offset 2285 - ProjectedCSTypeGeoKey: look-up for 2285 not implemented
key 3073 tiff_tag_location 34737 count 32 value_offset 0 - PCSCitationGeoKey: NAD83 / Washington North (ftUS)

My las2las command line options were:

las2las -i GLI_Sparse.las -remove_extra -remove_all_vlrs -o GLI_Sparse_reproc.las -sp83 WA_N -feet -elevation_feet

and the resulting (working) keys were:

GeoKeyDirectoryTag version 1.1.0 number of keys 4
key 1024 tiff_tag_location 0 count 1 value_offset 1 - GTModelTypeGeoKey: ModelTypeProjected
key 3072 tiff_tag_location 0 count 1 value_offset 32148 - ProjectedCSTypeGeoKey: PCS_NAD83_Washington_North
key 3076 tiff_tag_location 0 count 1 value_offset 9002 - ProjLinearUnitsGeoKey: Linear_Foot
key 4099 tiff_tag_location 0 count 1 value_offset 9002 - VerticalUnitsGeoKey: Linear_Foot

Hopefully this is useful to folks.
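For anyone poking at these headers without lastools: the GeoKeyDirectoryTag that lasinfo prints is, per the GeoTIFF spec, just an array of unsigned 16-bit integers - a 4-value header (version, revision, minor revision, number of keys) followed by 4 values per key (KeyID, TIFFTagLocation, Count, ValueOffset). A minimal sketch of a decoder for that raw VLR payload (error handling omitted):

```python
import struct

def parse_geokey_directory(payload: bytes):
    """Decode a raw GeoKeyDirectoryTag VLR payload into key records."""
    # header: KeyDirectoryVersion, KeyRevision, MinorRevision, NumberOfKeys
    version, rev, minor, nkeys = struct.unpack_from("<4H", payload, 0)
    keys = []
    for i in range(nkeys):
        key_id, tag_loc, count, value = struct.unpack_from(
            "<4H", payload, 8 + 8 * i)
        keys.append({"key": key_id, "tiff_tag_location": tag_loc,
                     "count": count, "value_offset": value})
    return (version, rev, minor), keys
```

Printing the decoded keys side by side with the lasinfo report above is a quick way to confirm which GeoKeys a tool is choking on.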
Posted 07-02-2013 03:46 PM
POST
|
Thank you gerickecook!!! This was driving me nuts. I tried every possible Snap Raster setting and finally came to the forums. I didn't find your post until I googled "site:forums.arcgis.com snap raster not working" and limited my results to the past year <sigh>. Turning off background geoprocessing worked for me, and I was able to do the processing from within ArcMap and get a snap. Unfortunately (for me), turning off background geoprocessing looks like it breaks the "overwrite geoprocessing outputs" option. When I try to run the tool and overwrite a dataset, I get two errors: (1) the table already exists, and (2) no spatial reference exists. Renaming the output worked fine (I was working with file geodatabase rasters). Seems like this would be a big deal; hope it's fixed soon.
Posted 03-12-2012 11:13 AM
POST
|
<sigh> Answered my own question again. Shapefiles can't store time unless it's a string - time fields get truncated to the day. If you create a geodatabase feature class from the xy data, all the date fields will come in fine. If you need them in a shapefile, export the feature class to a shapefile, delete the DateTimeUTC and TimeUTC fields, re-create them as strings, then join the geodatabase feature class to the shapefile (I create an index field in case there are duplicate entries) and calculate the fields in the shapefile from the geodatabase. In detail:

1) Get your NavData into the format you want in Excel. I put individual days into sheets and combined them into a final nav sheet, but you could do it by day for long cruises.
1a) I add an index field to make sure that there are no hiccups from duplicate lat/long/time entries.
2) Open the Excel table in ArcGIS, create an xy event layer, and export to GEODATABASE (all the date fields should come in OK).
3) Export the geodatabase feature class to shapefile. This should break the TimeUTC and DateTime fields.
4) Delete the DateTime and TimeUTC fields in the shapefile, and re-add them as text. The TimeUTC field should be 11 characters (XX:XX:XX AM) and DateTime should be 22 (XX/XX/XXXX XX:XX:XX PM).
5) Join the geodatabase to the shapefile on the index field (you might be able to get away with Lat or Lon).
6) Calculate the TimeUTC and DateTime fields with the field calculator from the date fields in the geodatabase.

Andy
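The field widths in step 4 line up with Python's strftime output, so if you script the field calculation (or just want to sanity-check the sizes), something like this shows the two text formats. A sketch only - the function name is mine, and the field names are from the steps above:

```python
from datetime import datetime

def shapefile_time_strings(dt):
    """Format a datetime into the two text fields described above:
    an 11-char time string and a 22-char date-time string."""
    time_utc = dt.strftime("%I:%M:%S %p")            # e.g. 03:52:00 PM
    date_time = dt.strftime("%m/%d/%Y %I:%M:%S %p")  # e.g. 07/21/2011 03:52:00 PM
    return time_utc, date_time
```

The same strftime expressions should work in the field calculator's Python parser when calculating from the joined geodatabase date fields.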
Posted 07-21-2011 03:52 PM
POST
|
I have an Excel file (xlsx) with the following fields: NavIDX, ShipLat, ShipLon, DateUTC, TimeUTC, DateTimeUTC. I can read the file in ArcGIS just fine, and when I display xy data, all of the fields with time values come through. But when I export to .shp, the TimeUTC field goes to 12:00:00 AM and the DateTimeUT field reads only the date (the C in the field name is truncated). I tried a join with the table based on the NavIDX field, then calculating the TimeUTC field based on the TimeUTC field from the table, but all that does is freeze ArcGIS. There are something like 224K records.
Posted 07-21-2011 01:40 PM
POST
|
I created a mosaic dataset from 59 DEMs, edited the minPS and maxPS for each to be 0 and 100 respectively, set the properties to mosaic by the LowPS attribute, and bumped up the defaults for Maximum Size of Requests, Maximum Number of Rasters per Mosaic, Maximum Number of Records Returned, and Maximum Number of Items Downloadable Per Request. The resulting mosaic dataset and a referenced hillshade mosaic dataset display fine in ArcGIS Desktop, and the mosaic dataset works fine for elevation in ArcGlobe, but it doesn't appear - doesn't even try to appear - in ArcGlobe as imagery, whether I use the DEM and add it a second time as imagery, or add the referenced hillshade mosaic. If I clip the mosaic and save it as a tif, it displays just fine. Any thoughts?
Posted 06-21-2011 08:54 AM
POST
|
I think you want to use the [make raster] tool with a SQL query, selecting a single band in the tool properties. You should be able to do this directly from the dataset. I'm only working with single-band imagery right now, but I have used the tool to separate image bands, and SQL should allow you to limit the values.
Posted 06-20-2011 03:46 PM