POST
|
Hi Aron, unfortunately it was not really solved. The problem seemed to be that the table did not have a simple key but a composite one. I solved it by adding a view to the database that joins both tables; then I was able to use TableToTable on the view. I hope this is an option for you too. Best regards, Andreas
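For anyone landing here later, the shape of that workaround can be sketched with Python's built-in sqlite3 standing in for SQL Server. Only the link-table name is from this thread; the parent tables, columns, and view name below are invented for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
-- A link table resolving an m:n relationship, with a composite primary key.
CREATE TABLE bkMassnahmen (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE bkZiele (ziel_id INTEGER PRIMARY KEY, ziel TEXT);
CREATE TABLE bkMassnahmenbkMassnahmenZiele (
    massnahme_id INTEGER,
    ziel_id INTEGER,
    PRIMARY KEY (massnahme_id, ziel_id)
);
INSERT INTO bkMassnahmen VALUES (1, 'A');
INSERT INTO bkZiele VALUES (10, 'Z');
INSERT INTO bkMassnahmenbkMassnahmenZiele VALUES (1, 10);
-- The view joins the link table to its parents, giving the exporter a
-- simple flat structure to read instead of the composite-key table.
CREATE VIEW vMassnahmenZiele AS
SELECT m.id, m.name, z.ziel
FROM bkMassnahmenbkMassnahmenZiele l
JOIN bkMassnahmen m ON m.id = l.massnahme_id
JOIN bkZiele z ON z.ziel_id = l.ziel_id;
""")
rows = cur.execute("SELECT * FROM vMassnahmenZiele").fetchall()
print(rows)  # [(1, 'A', 'Z')]
```

The exporter then only ever sees the view, which behaves like a simple table with one row per link.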
11-05-2019 02:06 AM | 1 | 0 | 150
POST
|
Dear Joshua, I must perform several independent clips on the incoming data and then generate an output like: "The geometry interacts with: a water body, a national park and two woods". So performing these clips all together via multiprocessing seems obvious to me. Of course you are right that the SE thread matches my problem in a way, but I don't see how I can influence this pickling. Thank you, Andreas Ruloffs
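A hedged sketch of how that output sentence could be assembled once the independent clip results are in; the `describe` helper and the labels are invented for illustration, not taken from the actual service:

```python
def describe(hits):
    # hits: mapping of human-readable label -> number of intersecting geometries
    parts = []
    for label, count in hits.items():
        if count == 1:
            parts.append("a " + label)
        elif count > 1:
            parts.append("%d %ss" % (count, label))
    if not parts:
        return "The geometry interacts with nothing."
    if len(parts) == 1:
        return "The geometry interacts with: %s" % parts[0]
    # Join all but the last item with commas, the last with "and".
    return "The geometry interacts with: %s and %s" % (", ".join(parts[:-1]), parts[-1])

print(describe({"water body": 1, "national park": 1, "wood": 2}))
# The geometry interacts with: a water body, a national park and 2 woods
```

Each parallel clip only has to report its label and count; the sentence is built afterwards in the main process.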
03-18-2019 07:36 AM | 0 | 0 | 1285
POST
|
Hello, thank you for your answers, but I don't see how they fit my problem. The program works fine on the desktop, but it fails after I have published it as a geoprocessing service on ArcGIS for Server. It seems to me that the server has a different runtime environment than the desktop arcpy. I don't even know whether multiprocessing is possible on the server at all. Thank you, Andreas Ruloffs
03-18-2019 04:57 AM | 0 | 2 | 1285
POST
|
Hello everybody, I have an arcpy program that performs a series of different clips on an incoming geometry. I am only interested in whether the clip's result contains any geometries. To use this tool in a web application, I have published it as a geoprocessing service. For the sake of performance I want to run the several clips simultaneously. I found this tutorial: Parallel Python: Multiprocessing with ArcPy - YouTube. So I came to this solution (compressed to the basics; I hope I have not cut anything important):

import time
import sys, os
from multiprocessing import Pool
import logging
import arcpy

logging.basicConfig(filename=os.path.join(arcpy.env.scratchWorkspace, "../", "20181204_BQ1_I.log"),
                    level=logging.INFO,
                    format='%(created)f %(asctime)s %(message)s')

# Some constants are set to link in the databases; schwellwertMeter and
# the sde connection are defined in the full script.
out_path = "in_memory"
arcpy.env.workspace = out_path
arcpy.env.scratchWorkspace = out_path
out_name_geometry = "JustAName"
spatial_reference = arcpy.SpatialReference(2056)

def clipfun(result, kind, out_geometry, timeString, sde):
    # the clip data is chosen according to the kind
    arcpy.analysis.Clip(kind,
                        arcpy.env.scratchGDB + "\\" + out_name_geometry + timeString,
                        out_geometry + "_Clip_" + kind)
    rows = arcpy.da.SearchCursor(out_geometry + "_Clip_" + kind,
                                 ["SHAPE@AREA"])
    result[kind + "_res"] = False
    for row in rows:
        if row[0] >= schwellwertMeter:
            result[kind + "_res"] = True
            break
    del rows
    return result[kind + "_res"]

if __name__ == '__main__':
    geometry_as_json = arcpy.GetParameter(0)
    # Set local variables
    timeString = str(time.time()).replace(".", "_")
    out_geometry = out_path + "\\" + out_name_geometry + timeString
    # Execute CreateFeatureclass
    arcpy.CreateFeatureclass_management(arcpy.env.scratchGDB,
                                        out_name_geometry + timeString,
                                        "POLYGON",
                                        "#",
                                        "DISABLED",
                                        "DISABLED",
                                        spatial_reference)
    c = arcpy.da.InsertCursor(arcpy.env.scratchGDB + "\\" + out_name_geometry + timeString,
                              ["SHAPE@JSON"])
    c.insertRow([geometry_as_json])
    del c
    res = {"LV_res": "",
           "NkB_res": "",
           "NkBW_res": "",
           "DG_res": ""}
    funcs = [(res, "LV", out_geometry, timeString, sde),
             (res, "NkB", out_geometry, timeString, sde),
             (res, "NkBW", out_geometry, timeString, sde),
             (res, "DG", out_geometry, timeString, sde)]
    p = Pool(processes=4)
    erg = p.map(clipfun, funcs)
    arcpy.SetParameter(1, res["LV_res"])
    arcpy.SetParameter(2, res["NkB_res"])
    arcpy.SetParameter(3, res["NkBW_res"])
    arcpy.SetParameter(4, res["DG_res"])
    logging.info('Ende')

I can run this tool in an ArcGIS for Desktop toolbox and I can publish it to ArcGIS for Server as a geoprocessing service. But when I try to run the new service I get this error:

Fehler bei der Ausführung des Werkzeuges. [Error executing the tool.]
ITest Job ID: j0cc0f30e501247dba8e7877929cc45ec :
Traceback (most recent call last):
  File "D:\arcgisserver\directories\arcgissystem\arcgisinput\I2.GPServer\extracted\v101\lokal\Inventare_bestaende_ITest.py", line 255, in <module>
    erg = p.map(clipfun, funcs)
  File "C:\Python27\ArcGISx6410.5\Lib\multiprocessing\pool.py", line 251, in map
    return self.map_async(func, iterable, chunksize).get()
  File "C:\Python27\ArcGISx6410.5\Lib\multiprocessing\pool.py", line 567, in get
    raise self._value
PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed
Fehler beim Ausführen von (I2). [Failed to execute (I2).]
Fehler beim Ausführen von (ITest). [Failed to execute (ITest).]

Has anybody done something like that and can help me with this? Thank you, Andreas Ruloffs
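As a side note on the error itself: `Pool.map` pickles the mapped callable and its arguments to ship them to the worker processes, so the callable must be a plain module-level function that the workers can look up by name. A minimal, ArcPy-free Python 3 sketch of that constraint (the original service runs Python 2.7; `clip_one` is an invented stand-in for `clipfun`):

```python
import pickle
from multiprocessing import Pool

def clip_one(args):
    # Stand-in for one clip job; a module-level function like this can
    # be pickled by reference, which is what Pool.map requires.
    kind, area = args
    return kind, area > 0.0

if __name__ == "__main__":
    # A lambda (or any nested function) cannot be pickled by name;
    # that is the kind of failure the traceback above shows.
    try:
        pickle.dumps(lambda x: x)
        lambda_picklable = True
    except (pickle.PicklingError, AttributeError):
        lambda_picklable = False
    print("lambda picklable:", lambda_picklable)  # False

    with Pool(processes=2) as pool:
        res = dict(pool.map(clip_one, [("LV", 5.0), ("DG", 0.0)]))
    print(res)  # {'LV': True, 'DG': False}
```

On a server the extra wrinkle is that the worker processes must be able to import the module containing the function, which is not guaranteed in a geoprocessing-service runtime.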
02-18-2019 01:57 AM | 0 | 6 | 1586
POST
|
bkEingriffart is just a simple table; bkMassnahmenbkMassnahmenZiele is a table that resolves an m:n relationship. Therefore it has a composite primary key.
06-08-2018 02:58 AM | 0 | 2 | 1231
POST
|
If I comment out the first T2T conversion, then the second one fails anyway.
06-05-2018 12:54 AM | 0 | 4 | 1231
POST
|
Thank you Joe, some new lines went missing; I have corrected that. I do not understand the part about the parentheses: TableToTable_conversion works with a single table.
06-04-2018 08:46 AM | 0 | 2 | 1231
POST
|
Hi there, I am facing a strange problem. In the following script, two tables in a SQL Server database should be exported. The first one works, the second one does not (I am sure it is spelled right).

import tempfile
import arcpy

outLocation = tempfile.mkdtemp()
arcpy.TableToTable_conversion("../bamo.sde/bko.bkEingriffart", outLocation, "bkEingriffart.dbf")
arcpy.TableToTable_conversion("../bamo.sde/bko.bkMassnahmenbkMassnahmenZiele", outLocation, "bkMassnahmenbkMassnahmenZiele.dbf")

I get the error:

Traceback (most recent call last):
  File "C:\Users\xxx\.p2\pool\plugins\org.python.pydev_5.7.0.201704111357\pysrc\pydevd.py", line 1546, in <module>
    globals = debugger.run(setup['file'], None, None, is_module)
  File "C:\Users\xxx\.p2\pool\plugins\org.python.pydev_5.7.0.201704111357\pysrc\pydevd.py", line 982, in run
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Repositories\Python\bkonline2-deploy\gp-services\bko\exportSHP.py", line 13, in <module>
    arcpy.TableToTable_conversion("../bamo.sde/bko.bkMassnahmenbkMassnahmenZiele", outLocation, "bkMassnahmenbkMassnahmenZiele.dbf")
  File "C:\Program Files (x86)\ArcGIS\Desktop10.3\ArcPy\arcpy\conversion.py", line 2133, in TableToTable
    raise e
arcgisscripting.ExecuteError: Failed to execute. Parameters are not valid.
ERROR 000732: Input Rows: Dataset ../bamo.sde/bko.bkMassnahmenbkMassnahmenZiele does not exist or is not supported
Failed to execute (TableToTable).

I use ArcGIS 10.3.1 for developing, but the error also occurs when using ArcGIS 10.5.1. Later the script shall run as a geoprocessing task on ArcGIS for Server. Has anybody an idea? Best regards, Andreas
06-04-2018 05:58 AM | 0 | 10 | 1861
POST
|
Hi, if the schemas of both layers are equal, then you can simply do:
secondLayer.applyEdits(firstLayer.selectedFeatures, null, null, new AsyncResponder(copyDone, copyFailed))
firstLayer and secondLayer are both FeatureLayers; copyDone and copyFailed are methods that handle the edit-succeeded and edit-failed events. Bye, Andreas
04-02-2014 08:56 AM | 0 | 0 | 147
POST
|
Hi, I don't like the new deployment either. Our normal workflow is that we develop the tools on our system and later publish the fully developed and well-tested tools at our customer's site. Especially when deploying at the customer, a workflow that is too complicated and too long does not make us look good. But I have another ugly issue with publishing. Our scripts normally consist of several files: the script itself, a config script and a general helper script; the latter two are imported by the first. Publishing copies only the script itself but not the other files, so after publishing, executing the script fails. I have to copy the files manually into the path Kevin mentioned earlier. I have registered the path the scripts come from as a data store, but this does not prevent the copy. Bye, Andreas
04-02-2014 05:47 AM | 0 | 1 | 553
POST
|
Hi, you can use the FeatureLayerEvent.EDITS_STARTING event. This event is fired before the edit is made permanent, so you can manipulate the features before they are written to the database. If you only want to add a single value, working with templates may be the easiest solution. Hope this helps you, Andreas Ruloffs
private function beforeEdit(event:FeatureLayerEvent):void
{
// here you can manipulate the features you want to.
event.featureLayer.applyEdits(event.adds,event.updates, event.deletes);
}
03-06-2014 11:05 PM | 0 | 0 | 179
POST
|
Thank you Robert, you are right, this is not the answer I want to hear, but it seems to be the only right one. Bye, Andreas Ruloffs
03-04-2014 02:30 AM | 0 | 0 | 362
POST
|
Hello, I have tried to use these events. They are all fired just once, when the GraphicsLayer is created, and then never again. And even that first time they are fired far too early, when the features are not yet visible.
this.shapefileGL = new GraphicsLayer();
baseWidget.map.addLayer( this.shapefileGL);
this.shapefileGL.addEventListener(LayerEvent.UPDATE_START, updateStart);
this.shapefileGL.addEventListener(LayerEvent.UPDATE_END, updateEnd);
this.shapefileGL.addEventListener(LayerEvent.LOAD, loaded);
private function updateStart(event:LayerEvent):void
{
LogUtil.getLogger(ShowShapefileMediator).info("updateStart");
}
private function updateEnd(event:LayerEvent):void
{
LogUtil.getLogger(ShowShapefileMediator).info("updateEnd");
}
private function loaded(event:LayerEvent):void
{
LogUtil.getLogger(ShowShapefileMediator).info("loaded");
}
Bye, Andreas Ruloffs
02-25-2014 01:15 AM | 0 | 0 | 362
POST
|
Hello, what you can do is a subselect in the WHERE clause, like this: OBJECTID IN (SELECT TOP 20 OBJECTID FROM [SAME_TABLE]) The inner select picks the first 20 OBJECTIDs from the table the layer gets its data from; the outer select then limits the query to the results you want. Be careful: the inner select may be a problem if your feature class is versioned. Hope this will help you, Andreas Ruloffs
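The same subselect pattern can be tried outside ArcGIS with Python's built-in sqlite3; note that SQLite's dialect uses LIMIT where SQL Server uses TOP, and the table and column names here are made up for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE parcels (OBJECTID INTEGER PRIMARY KEY, name TEXT);
INSERT INTO parcels (name) VALUES ('a'), ('b'), ('c'), ('d');
""")
# Same idea as the definition query above: the inner select picks the
# first rows, the outer select restricts the result to those IDs.
rows = cur.execute("""
SELECT OBJECTID, name FROM parcels
WHERE OBJECTID IN (SELECT OBJECTID FROM parcels ORDER BY OBJECTID LIMIT 2)
""").fetchall()
print(rows)  # [(1, 'a'), (2, 'b')]
```

In a layer's definition query you would only supply the WHERE clause itself, in the SQL dialect of the underlying database.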
02-17-2014 11:15 PM | 0 | 0 | 206
Title | Kudos | Posted
---|---|---
1 | 11-05-2019 02:06 AM

Online Status | Offline
Date Last Visited | 11-11-2020 02:24 AM