POST
Just create a new project in each user's share folder; each one includes a gdb and toolbox along with all the other project items. That said, treating Pro like ArcMap is going to lead to endless friction. If you structure your work around what you're doing rather than who's doing it, you can create projects for each discrete task. Client wants a new set of maps? Make a project! Need to process a big folder of soil samples or road network updates? Make a project! Projects also work for long-running jobs, like keeping all your web service sources in one place.
12-22-2023 08:39 AM

IDEA
This would be a massive boon for our org; our #1 source of user issues is failing to log in correctly. The two biggest features we'd like to see are a customizable header that includes our logo and instructions, and the ability to make one of the login options much less prominent (in our case, the ArcGIS Login section). The current login layout treats both methods as equally valid, which is unsuitable for 99% of our users and leads to confusion; an option to make that section harder to interact with would be great.
12-19-2023 09:24 AM

POST
You can do this by: (1) looking for a text file with config options in a known location and populating those values in the tool's validation section if they exist, and (2) writing the user's chosen values to said config file if no default value exists. For the location of the file, the best practice on Windows is to use a folder with your team's name in the %APPDATA% folder, which is set per user. You can get the config file like so:

import os
from os import path

folder = path.expandvars("%APPDATA%\\AlfredSoft")
if not path.exists(folder):
    os.mkdir(folder)

config_path = path.join(folder, "tool_config.json")  # Or whatever format works for you
if path.exists(config_path):
    with open(config_path) as config:
        pass  # Work with the file here!

As for creating or editing the config file, you should do that in the tool itself or in the validator's updateParameters method, depending on how you want to lay your code out. If you get reading working, then writing shouldn't be much more work.
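For the write side, a minimal sketch along the same lines (the parameter names and values here are made up for illustration, not from your tool):

import json

# Hypothetical values the user just picked in the tool dialog
chosen_values = {"workspace": "C:\\Data\\Working.gdb", "buffer_distance": "100 Meters"}

# Write them back out to the same config_path built above
with open(config_path, "w") as config:
    json.dump(chosen_values, config, indent=2)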
12-01-2023 03:15 PM

POST
One last place to look is the map itself; there may be an option set there that the designer doesn't override.
12-01-2023 12:56 PM

POST
It might be worth a quick trip to the official Python docs for file objects. In short, the "open" function returns a file object configured by the arguments you pass, including the file's encoding. The "writer" object from the "csv" module is just a wrapper around the file object that translates between Python data types and raw CSV text; you can write a CSV file without the wrapper, but it makes the job easier.
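For example (the file name here is just a placeholder): the encoding belongs to the file object returned by open, and the csv writer simply formats rows through it:

import csv

# The file object owns the encoding; csv.writer just formats rows through it
with open("output.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "city"])
    writer.writerow(["Ana", "São Paulo"])  # Non-ASCII text is fine with utf-8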
11-28-2023 01:49 PM

POST
You open the file on line 27; try adding encoding="utf-8" as a parameter and see if that fixes things.
11-27-2023 12:38 PM

POST
Based on the encoding the CSV writer pulled in, you're trying to write the data to a file that isn't Unicode compatible. Specify a suitable encoding when you open the file ("utf-8" works in virtually every case) and you should make more progress.
11-27-2023 11:27 AM

POST
Here's a summary of Shapefiles from the docs: link. In short, a shapefile is made of multiple files that all sit in the same folder and share the same base filename. ArcGIS applications show these files as a single ".shp" item in the catalog, since there's no reason to work with each file separately in the apps. Just make sure you always keep the files grouped together in the same folder and it should all work out. Each file is modified as required by the processing you do, so you don't have to babysit anything.
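If you ever do need to move the pieces around outside of ArcGIS (copying a shapefile to a backup folder, say), here's a quick sketch; the paths are examples only:

import glob
import os
import shutil

src = "C:\\Data\\parcels.shp"   # Example shapefile
dest_folder = "C:\\Backup"      # Example destination
base, _ = os.path.splitext(src)

# Grab every component that shares the base name (.shp, .shx, .dbf, .prj, ...)
for part in glob.glob(base + ".*"):
    shutil.copy(part, dest_folder)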
11-24-2023 09:33 AM

POST
This is a solid script; one minor tweak that can be handy with massive datasets is:

fc_dataframe = DataFrame((row for row in SearchCursor(input_fc, final_fields, query)), columns=final_fields)

This avoids creating a list for the data before it hits the DataFrame, saving a decent chunk of memory. It might even be faster in some cases.
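For context, a self-contained version of that pattern; the feature class, field list, and query below are placeholders:

from arcpy.da import SearchCursor
from pandas import DataFrame

input_fc = "C:\\Data\\Working.gdb\\Parcels"    # Placeholder feature class
final_fields = ["OBJECTID", "OWNER", "ACRES"]  # Placeholder fields
query = "ACRES > 10"                           # Placeholder where clause

# The generator streams rows straight into the DataFrame, no intermediate list
fc_dataframe = DataFrame(
    (row for row in SearchCursor(input_fc, final_fields, query)),
    columns=final_fields,
)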
11-15-2023 05:21 PM

POST
From the SQL end, any registered table that's branch versioned will have some special fields to track the branch version info. In my environment the GDB_IS_DELETE field is a solid indicator, though your EGDB configuration may create different fields. I'm not a SQL wizard, but if you can get the schema for every table programmatically, that should do you.
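As a rough sketch of that programmatic check, assuming a SQL Server EGDB and an existing .sde connection file (both assumptions on my part; adjust for your environment):

import arcpy

# Connection file path is a placeholder
sde_conn = arcpy.ArcSDESQLExecute("C:\\Connections\\gis.sde")

# Standard information schema query; works on SQL Server and PostgreSQL
sql = """
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'GDB_IS_DELETE'
"""
print(sde_conn.execute(sql))  # Tables that carry the branch versioning field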
11-01-2023 02:03 PM

POST
Ah, looks like I was beaten to the punch! As Josh pointed out, names with more than one space in them will lead to issues. My code dumps everything but the last token into the first-name slot; with some tweaking you can get everything but the first token into the last-name slot.
11-01-2023 08:33 AM

POST
The output is an array of strings. Here's how to safely extract that data assuming arbitrary input:

var tokens = Split("Your Data", " ");
var count = Count(tokens);
var first = null;
var last = null;
if (count == 1) {
    first = tokens[0];
} else if (count == 2) {
    first = tokens[0];
    last = tokens[1];
} else if (count > 2) {
    // Everything except the last token goes into the first name
    var first_array = [];
    for (var i in tokens) {
        if (i == count - 1) {
            break;
        }
        Push(first_array, tokens[i]);
    }
    first = Concatenate(first_array, " ");
    last = tokens[count - 1]; // The last token is the last name
}
// Do what you need with first and last
11-01-2023 08:29 AM

POST
If pyscripter doesn't pick up on an import (pretty common with complex libraries like arcgis), you'll have to tweak the hidden imports for your build. Start with something high up in the import path (e.g. "arcgis.gis") and then get more specific until it works.
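Assuming the build is done with PyInstaller (hidden imports are its mechanism), the command looks roughly like this; the script name is a placeholder, and the flag can be repeated with more specific submodules:

pyinstaller --hidden-import arcgis.gis your_script.py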
10-31-2023 04:21 PM

POST
The traditional archiving method? Nope, you're out of luck with the data table alone; you might be able to correlate the GDB_TO_DATE with other database or server logs, but no guarantees. Branch versioned tables have their own schema that includes the GDB_DELETED_BY field, which should list the culprit; if you can work out a migration plan, I'd recommend switching.
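If you do migrate, the lookup would be something along these lines; the connection file and table name are placeholders, and the exact field names and flag values may differ in your setup:

import arcpy

# Placeholders: point these at your own connection file and table
sde_conn = arcpy.ArcSDESQLExecute("C:\\Connections\\gis.sde")

sql = """
SELECT OBJECTID, GDB_DELETED_BY
FROM dbo.Parcels
WHERE GDB_IS_DELETE = 1
"""
for row in sde_conn.execute(sql):
    print(row)  # Who deleted each archived row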
10-30-2023 11:42 AM

POST
SQL Expression parameters require a "Dependency" on the parameter you want to filter so they can populate the dialog; set that up and your parameter should work.
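In a Python toolbox, that dependency looks something like this (the tool and parameter names are just examples):

import arcpy

class Tool(object):
    def __init__(self):
        self.label = "Example Tool"  # Hypothetical tool

    def getParameterInfo(self):
        in_features = arcpy.Parameter(
            displayName="Input Features",
            name="in_features",
            datatype="GPFeatureLayer",
            parameterType="Required",
            direction="Input",
        )
        expression = arcpy.Parameter(
            displayName="Expression",
            name="expression",
            datatype="GPSQLExpression",
            parameterType="Optional",
            direction="Input",
        )
        # The dependency lets the SQL Expression control read field names
        # from the layer parameter when the dialog opens
        expression.parameterDependencies = [in_features.name]
        return [in_features, expression]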
10-30-2023 08:49 AM