Warp Scripting Tool Degrades Image

11-06-2017 03:36 PM
RyanGoh
New Contributor

Hi, I am using ArcGIS 9.3 and have a question about the Warp (Data Management) tool. Whenever I warp an image with it, it reduces the contrast of my pixels and reduces the black intensities of my panchromatic image, even though I have forced the CellSize. This is my code (it takes about 1 min to run):

import arcgisscripting

gp = arcgisscripting.create(9.3)

# Force the cell size environment (decimal degrees) before warping
gp.CellSize = 0.00001

gp.Warp_management(inputImage, sourcePoints, targetPoints, outputImage, "POLYORDER1", "CUBIC")

Map Algebra, on the other hand, works fine, but the code takes much longer to run (4-5 min per image):

import arcgisscripting

gp = arcgisscripting.create(9.3)
gp.CheckOutExtension("Spatial")

# Map Algebra warp expression: input raster, links file, polynomial order 1,
# cubic resampling, default extent (#), cell size 0.00001
warp = "warp(inputImage, linkFile, 1, CUBIC, #, 0.00001)"
gp.SingleOutputMapAlgebra_sa(warp, outputImage)

Is there somewhere I can force some environment variables that would prevent Warp (Data Management) from losing pixels/degrading my image? Much help would be appreciated as I have been banging my head a long time on this. Thanks!
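For anyone reading along: "POLYORDER1" means a first-order polynomial (affine) transform fitted to the source/target control points by least squares. A minimal pure-Python sketch of that fit, purely illustrative (this is not Esri's implementation, and the control points below are made up for the demo):

```python
def fit_affine(src, dst):
    """Least-squares fit of x' = a0 + a1*x + a2*y (and the same for y')."""
    rows = [(1.0, x, y) for x, y in src]

    def solve(ts):
        # Normal equations M c = v for one output coordinate.
        M = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
        v = [sum(r[i] * t for r, t in zip(rows, ts)) for i in range(3)]
        # Gaussian elimination with partial pivoting.
        for c in range(3):
            p = max(range(c, 3), key=lambda k: abs(M[k][c]))
            M[c], M[p] = M[p], M[c]
            v[c], v[p] = v[p], v[c]
            for k in range(c + 1, 3):
                f = M[k][c] / M[c][c]
                for j in range(c, 3):
                    M[k][j] -= f * M[c][j]
                v[k] -= f * v[c]
        out = [0.0, 0.0, 0.0]
        for c in (2, 1, 0):
            out[c] = (v[c] - sum(M[c][j] * out[j] for j in range(c + 1, 3))) / M[c][c]
        return out

    a = solve([p[0] for p in dst])   # coefficients for x'
    b = solve([p[1] for p in dst])   # coefficients for y'
    return a, b

# Control points that happen to fit an exact affine map (made up for the demo)
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(2, 1), (5, 0), (2.5, 3), (5.5, 2)]
a, b = fit_affine(src, dst)
print(a)  # [2.0, 3.0, 0.5] -> x' = 2 + 3x + 0.5y
print(b)  # [1.0, -1.0, 2.0] -> y' = 1 - x + 2y
```

The geometric fit itself does not change pixel values; only the resampling step (NEAREST/BILINEAR/CUBIC) does.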

7 Replies
DanPatterson_Retired
MVP Emeritus

From the help files I would check the transformation types (you have used the default) and make sure that you aren't changing the cell size (I presume your file is in geographic coordinates). Environment settings can be controlled as specified in the help, so you can set those in your script as necessary.

RyanGoh
New Contributor

Hi Dan! Thanks for your reply. However, as I mentioned, I am using ArcGIS 9.3, which does not have a WarpFromFile tool. The help file I am referring to is the ArcGIS Desktop Help 9.3 page for Warp (Data Management).

Even if I use all the default values, such as:

gp.CellSize = 0.00001
gp.Warp_management(inputImage, sourcePoints, targetPoints, outputImage, "POLYORDER1", "NEAREST")

The outputImage still loses intensity and degrades, and the CellSize does not get restricted to the value I set either. Does anyone know why that is so?


DanPatterson_Retired
MVP Emeritus

Whether 10.X or 9.3, it seems that the only environment settings honored are the following:

The following environments affect this tool: current workspace, scratch workspace, output coordinate system, output extent, snap raster, output CONFIG keyword, pyramid, raster statistics, compression, and tile size.

Have you confirmed that the loss of intensity and the degradation are not the result of the symbology used to view the raster? Perhaps you can show the before and after histograms (through the symbology Classify section) for comparison.

EDIT: Warp apparently ignores the expected number of rows and columns setting (a logged bug):

http://support.esri.com/en/bugs/nimbus/TklNMDU0NDYz

and there were a whole load of other 'issues' in older versions on the support site.

XanderBakker
Esri Esteemed Contributor

In both examples that you posted I notice that you are using CUBIC:

Cubic—Performs a cubic convolution and determines the new value of a cell based on fitting a smooth curve through the 16 nearest input cell centers. It is appropriate for continuous data, although it may result in the output raster containing values outside the range of the input raster. It is geometrically less distorted than the raster achieved by running the nearest neighbor resampling algorithm. The disadvantage of the Cubic option is that it requires more processing time. In some cases, it can result in output cell values outside the range of input cell values. If this is unacceptable, use Bilinear instead.

It is normal that higher and lower values will be smoothed out when using Cubic resampling...
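For illustration, here is a minimal 1-D sketch of cubic convolution using the commonly cited Keys kernel (a = -0.5). This is not Esri's code, but it shows the effect the help text warns about: near an edge, the interpolated value can fall outside the input range.

```python
def cubic_weight(s, a=-0.5):
    """Keys cubic convolution kernel, support |s| < 2."""
    s = abs(s)
    if s < 1:
        return (a + 2) * s**3 - (a + 3) * s**2 + 1
    if s < 2:
        return a * s**3 - 5 * a * s**2 + 8 * a * s - 4 * a
    return 0.0

def cubic_interp(values, x):
    """Interpolate at x in [1, 2] from 4 samples at positions 0..3."""
    return sum(v * cubic_weight(x - i) for i, v in enumerate(values))

# A dark-to-bright edge: the interpolated value dips below the input minimum.
edge = [0.0, 0.0, 0.0, 1.0]
print(cubic_interp(edge, 1.75))  # -0.0703125: outside the 0..1 input range
```

Nearest neighbor, by contrast, only ever copies an existing pixel value, so it cannot shift the histogram on its own.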

RyanGoh
New Contributor

As mentioned in my comment above, I get degradation even when I use the default values, such as POLYORDER1 with nearest neighbor resampling.

XanderBakker
Esri Esteemed Contributor

Can you share a screenshot of input and output using Nearest Neighbor? It would help a lot to see how large the degradation is. Please use the same stretch for both images, specifying minimum and maximum values (not standard deviation or other methods, which do not let you compare the exact difference):

  • Select stretch type "Minimum - Maximum"
  • Switch on the option "Edit High/Low Values"
  • Specify a relevant low and high value according to your data

Please do this for both the input and the output raster and paste the screenshots in this thread.
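For reference, a fixed "Minimum - Maximum" stretch is just a linear remap of [low, high] to the display range [0, 255], which is why using the same low/high on both rasters makes them directly comparable. A small sketch of that mapping (illustrative only; the pixel values are made up):

```python
def minmax_stretch(pixels, low, high):
    """Linearly map [low, high] to display values 0..255, clamping outside."""
    span = high - low
    out = []
    for v in pixels:
        t = (v - low) / span
        t = min(max(t, 0.0), 1.0)    # clamp values outside [low, high]
        out.append(round(t * 255))
    return out

print(minmax_stretch([50, 100, 150, 200], 50, 200))  # [0, 85, 170, 255]
```

A standard-deviation stretch, by contrast, derives low/high from each raster's own statistics, so the two images would be remapped differently and the comparison would be meaningless.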

RyanGoh
New Contributor

Wow thanks Dan - I will try to play around with these environment variables and see if it fixes anything. 

Yes, I can confirm that the degradation is not an artifact of the viewer. I don't have the exact histograms right now, but I recall that after warping my histogram is shifted to the right, and I have a whole block of intensities that I never had before.
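A quick, library-free way to quantify a shift like that is to compare summary statistics of the pixel values before and after warping. How the pixels are read out of the rasters is left open here (any raster reader will do); the lists below are placeholders:

```python
def summarize(pixels):
    """Minimum, maximum, and mean of a flat list of pixel values."""
    return {"min": min(pixels), "max": max(pixels), "mean": sum(pixels) / len(pixels)}

before = [10, 20, 30, 40]   # placeholder pixel values from the input raster
after = [15, 25, 35, 45]    # placeholder pixel values from the warped output
print(summarize(before))    # {'min': 10, 'max': 40, 'mean': 25.0}
print(summarize(after))     # a rightward histogram shift shows as a higher mean
```

With nearest neighbor resampling, min and max should be unchanged by the warp; if they differ, something other than resampling is altering the values.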

Is there another tool I can use to make a transformation like this other than warp?
