POST
This may help: json - Python Script to Convert CSV to GeoJSON - Stack Overflow. If it is a one-shot deal, then this online converter may help: CSV To GeoJSON Converter. If you do it all the time, you may want to modify one of these techniques: GitHub - conmolloy/Geojson-Script: Python script to convert CSV files to GEOJSON file, or Convert CSV to Geojson in Python | Chenyu's Script.
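The core of any of those converters boils down to a few lines of standard-library Python. Here is a minimal sketch, assuming the CSV has point coordinates in columns named "lon" and "lat" (both column names are assumptions; adjust them to match your file):

```python
import csv
import io
import json

def csv_to_geojson(csv_text, lon_field="lon", lat_field="lat"):
    """Turn CSV text with point coordinates into a GeoJSON FeatureCollection.
    lon_field/lat_field are illustrative defaults, not fixed names."""
    features = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Pull the coordinates out; everything left over becomes attributes
        lon = float(row.pop(lon_field))
        lat = float(row.pop(lat_field))
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": row,
        })
    return {"type": "FeatureCollection", "features": features}

sample = "name,lon,lat\nSiteA,-77.03,38.90\n"
print(json.dumps(csv_to_geojson(sample), indent=2))
```

For a real file you would read the text with `open(...)` and write the result out with `json.dump`.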
06-12-2019 07:23 AM | 1 | 0 | 2944

POST
I will take a pot shot at the question. I would run a Python script that removes duplicate lines before you need to use the CSV; then your model would not have to worry about duplicate values. In the example below, 1.csv contains duplicate lines; the script creates a 2.csv with the duplicates removed. You would use 2.csv in your model.

# Copy 1.csv to 2.csv, keeping only the first occurrence of each line
seen = set()
with open('1.csv', 'r') as in_file, open('2.csv', 'w') as out_file:
    for line in in_file:
        if line not in seen:
            out_file.write(line)
            seen.add(line)
06-12-2019 07:20 AM | 1 | 2 | 938

POST
What you are describing are multi-part lines. You will have to decide, when creating an LRS, how to handle both directions; there are a few options available (I believe you want to preserve overlapping segments). See: Creating a complex (looping) route—Help | ArcGIS Desktop

Complex routes: The process of creating a complex route is very similar to that of creating a simple route. The only real difference is that you must build a complex route in pieces. Once the pieces are created, they can be merged. Take note that in creating a complex (looping) route in the task, care has been taken to set the measures appropriately for each of the two halves of the route that are eventually merged. If this is not possible in your situation, you can still merge the pieces; then, at a later time, the measure values can be reset.

Personally, I avoid complex dynamic segmentation. I have ALWAYS found a way to create a simple route, even in complex situations (sometimes you have to think outside the box!).
06-07-2019 08:40 AM | 0 | 0 | 1052

POST
No, I do not use Pro, nor am I currently willing to migrate. Pro does not recognize an Access personal geodatabase; until it does, Pro is a no-go for me. I am struggling through learning QGIS as my fallback. This is a perfect use case demonstrating the time-saving ability of Access: even though it performs geometry operations slowly, its flexibility and ease of use FAR OUTWEIGH the geometry processing speed. Enable ArcGIS Pro to access ESRI Personal Geodatabases
06-04-2019 07:04 AM | 0 | 0 | 889

POST
This is a conflation process. I use MS Access as my database and perform attribute change detection with the following methodology, using database SQL (not ArcGIS).
There are three kinds of differences:
-New records in tblDataToday
-New records in tblDataYesterday or lost records in tblDataToday
-Changed records that are both present in tblDataToday and tblDataYesterday
1. To browse new records in tblDataToday:
SELECT tblDataToday.*
FROM tblDataToday LEFT JOIN tblDataYesterday ON tblDataToday.ID = tblDataYesterday.ID
WHERE (((tblDataYesterday.ID) Is Null))
2. To browse new records in tblDataYesterday (i.e., records lost from tblDataToday):
SELECT tblDataYesterday.*
FROM tblDataToday RIGHT JOIN tblDataYesterday ON tblDataToday.ID = tblDataYesterday.ID
WHERE (((tblDataToday.ID) Is Null))
3. To browse records that are present in both tblDataToday and tblDataYesterday but have changed:
3.1 Create a query qChangedData:
SELECT tblDataToday.*,"tblDataToday" as ChangedIn
FROM tblDataToday
UNION ALL SELECT tblDataYesterday.*, "tblDataYesterday" as ChangedIn
FROM tblDataYesterday
3.2 Create a query qChangedIDs:
SELECT All_IDs.ID
FROM (SELECT DISTINCT * FROM (SELECT tblDataToday.* FROM tblDataToday union all SELECT tblDataYesterday.* FROM tblDataYesterday ) AS uAll) AS All_IDs
GROUP BY All_IDs.ID
HAVING (((Count(All_IDs.ID))=2));
3.3 Create the final query joining the two:
SELECT qChangedData.*
FROM qChangedIDs INNER JOIN qChangedData ON qChangedIDs.ID = qChangedData.ID
ORDER BY qChangedData.ID, qChangedData.ChangedIn
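If the data can be pulled out of the database, the same three-way comparison is easy to sketch in Python. This is a minimal, hypothetical version assuming each table is a list of dicts keyed by an "ID" field:

```python
def diff_tables(today, yesterday, key="ID"):
    """Three-way change detection between two row sets: rows only in today,
    rows only in yesterday, and rows present in both (by key) whose
    contents differ."""
    t = {row[key]: row for row in today}
    y = {row[key]: row for row in yesterday}
    added = [t[k] for k in sorted(t.keys() - y.keys())]
    removed = [y[k] for k in sorted(y.keys() - t.keys())]
    changed = [(y[k], t[k]) for k in sorted(t.keys() & y.keys()) if t[k] != y[k]]
    return added, removed, changed

today = [{"ID": 1, "v": "a"}, {"ID": 2, "v": "b2"}]
yesterday = [{"ID": 2, "v": "b"}, {"ID": 3, "v": "c"}]
print(diff_tables(today, yesterday))
```

The three returned lists correspond to queries 1, 2, and 3 respectively.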
06-03-2019 07:57 AM | 0 | 2 | 889

POST
If you are starting with an actual datetime field [dt]:

import datetime
dt = datetime.datetime(2019, 5, 31, 0, 0)
dt.strftime('%Y%m%d')

or

'{:%Y%m%d}'.format(dt)

For completeness' sake: you can also directly access the attributes of the object, but then you have to assemble the numbers yourself:

'%d%02d%02d' % (dt.year, dt.month, dt.day)

For reference, here are the codes used in the format string:
%a Weekday as locale’s abbreviated name.
%A Weekday as locale’s full name.
%w Weekday as a decimal number, where 0 is Sunday and 6 is Saturday.
%d Day of the month as a zero-padded decimal number.
%b Month as locale’s abbreviated name.
%B Month as locale’s full name.
%m Month as a zero-padded decimal number. 01, ..., 12
%y Year without century as a zero-padded decimal number. 00, ..., 99
%Y Year with century as a decimal number. 1970, 1988, 2001, 2013
%H Hour (24-hour clock) as a zero-padded decimal number. 00, ..., 23
%I Hour (12-hour clock) as a zero-padded decimal number. 01, ..., 12
%p Locale’s equivalent of either AM or PM.
%M Minute as a zero-padded decimal number. 00, ..., 59
%S Second as a zero-padded decimal number. 00, ..., 59
%f Microsecond as a decimal number, zero-padded on the left. 000000, ..., 999999
%z UTC offset in the form +HHMM or -HHMM (empty if naive), +0000, -0400, +1030
%Z Time zone name (empty if naive), UTC, EST, CST
%j Day of the year as a zero-padded decimal number. 001, ..., 366
%U Week number of the year (Sunday is the first) as a zero padded decimal number.
%W Week number of the year (Monday is first) as a decimal number.
%c Locale’s appropriate date and time representation.
%x Locale’s appropriate date representation.
%X Locale’s appropriate time representation.
%% A literal '%' character.
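Putting a few of the codes above to work, here is a short runnable example:

```python
import datetime

dt = datetime.datetime(2019, 5, 31, 0, 0)

compact = dt.strftime('%Y%m%d')          # e.g. '20190531'
readable = dt.strftime('%A, %B %d, %Y')  # weekday and month spelled out
manual = '%d%02d%02d' % (dt.year, dt.month, dt.day)  # from the attributes

print(compact)
print(readable)
print(manual)
```

Both `strftime` routes and the attribute route produce the same compact date string.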
05-31-2019 07:09 AM | 1 | 0 | 4996

POST
Calculating a bearing distance—Help | ArcGIS Desktop; How to Calculate Azimuth in Excel | It Still Works. If you can script in Excel, this is a custom function I use for determining azimuths. I use it in Excel like this: =Azimuth(cell with Latitude1, cell with Longitude1, cell with Latitude2, cell with Longitude2), e.g. =Azimuth(A1,B1,A2,B2)

Function Azimuth(lat1 As Single, lon1 As Single, lat2 As Single, lon2 As Single) As Single
    Dim X1 As Single, X2 As Single, Y As Single, dX As Single, dY As Single
    With Application.WorksheetFunction
        X1 = .Radians(lat1)
        X2 = .Radians(lat2)
        Y = .Radians(lon2 - lon1)
    End With
    dX = Math.Cos(X1) * Math.Sin(X2) - Math.Sin(X1) * Math.Cos(X2) * Math.Cos(Y)
    dY = Math.Cos(X2) * Math.Sin(Y)
    With Application.WorksheetFunction
        ' Excel's ATAN2 takes (x, y), so this yields the bearing in degrees
        Azimuth = .Degrees(.Atan2(dX, dY))
    End With
End Function
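For anyone working outside Excel, the same great-circle initial-bearing formula translates directly to Python (this is a sketch of the standard formula, with the result normalized to 0-360 degrees, which the Excel version does not do):

```python
import math

def azimuth(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees from point 1 to point 2 on a sphere."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    y = math.cos(p2) * math.sin(dlon)
    # atan2(y, x) gives the bearing; % 360 folds negatives into 0-360
    return math.degrees(math.atan2(y, x)) % 360

print(azimuth(0, 0, 90, 0))   # due north -> 0
print(azimuth(0, 0, 0, 90))   # due east  -> 90
```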
05-24-2019 07:18 AM | 0 | 0 | 3903

POST
Answering your second question (why some values are blank while others are interpreted as NULL): it depends on how Excel determines data types on the cut/paste or export. My research on MSDN reveals the following: the Excel source component determines the input data types by itself, based on the first 8 rows of the Excel file. "Missing values. The Excel driver reads a certain number of rows (by default, 8 rows) in the specified source to guess at the data type of each column. When a column appears to contain mixed data types, especially numeric data mixed with text data, the driver decides in favor of the majority data type, and returns null values for cells that contain data of the other type. (In a tie, the numeric type wins.) Most cell formatting options in the Excel worksheet do not seem to affect this data type determination. You can modify this behavior of the Excel driver by specifying Import Mode. To specify Import Mode, add IMEX=1 to the value of Extended Properties in the connection string of the Excel connection manager in the Properties window." This works if you are programming the transfer, but the behavior you're experiencing carries through on a cut/paste operation as well.
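For reference, a typical ACE OLE DB connection string with Import Mode enabled looks like this (the provider version and file path are illustrative; IMEX=1 is the setting the MSDN quote above describes):

```text
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\workbook.xlsx;
Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1";
```

With IMEX=1, columns with mixed types are read as text instead of having minority-type cells silently nulled.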
05-23-2019 07:42 AM | 0 | 0 | 5268

POST
The problem lies within Excel. Excel DOES NOT have or recognize a "Null" value. The only exception I have found so far is that SQL Server will reliably treat the result of the NA() function as a null value (but this is very specific to that database and not very intuitive).

The workaround I use is to place a known value as a replacement for NULL. For example, if I wish to export a string column containing NULLs to a database, I create a VB function or macro in Excel that places the text string "NULL" in all the blank cells (this can be enhanced to cover NA values as well), something like this:

Sheet1.UsedRange.SpecialCells(xlCellTypeBlanks).Value = "NULL"

Then, on the database side, I create import scripts that convert that text string to a DBNull value. For numbers, instead of a NULL text string, I assign a number that will never occur in my data, e.g. "-10013", and then script the database side to replace that number with nulls. Not pretty, a pain in the behind, but workable when you fully script it.
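The database side of that workaround is just a pair of UPDATE statements. Here is a minimal sketch using Python and an in-memory SQLite table (the table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE imported (name TEXT, value REAL)")
# Rows as they might arrive from the Excel export, with placeholder "nulls"
conn.executemany("INSERT INTO imported VALUES (?, ?)",
                 [("A", 1.5), ("NULL", 2.0), ("B", -10013)])
# Convert the placeholder values back into real NULLs
conn.execute("UPDATE imported SET name = NULL WHERE name = 'NULL'")
conn.execute("UPDATE imported SET value = NULL WHERE value = -10013")
rows = conn.execute("SELECT name, value FROM imported ORDER BY rowid").fetchall()
print(rows)  # [('A', 1.5), (None, 2.0), ('B', None)]
```

The same two UPDATEs work in SQL Server or Access SQL with the DBMS's own null syntax.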
05-23-2019 07:30 AM | 0 | 0 | 5268

POST
Well said! Don't forget to vote here if you have not already: Enable ArcGIS Pro to access ESRI Personal Geodatabases
05-17-2019 07:16 AM | 0 | 0 | 1730

POST
What about looking into an LRS (Linear Referencing System)? Your sewer lines could be linearly referenced and your inspection points "event mapped." Then, if you move your referenced line, all you would have to do is re-event-map your inspection points, and they will fall into roughly the same distribution as at the previous location. This is not a perfect solution, but one you should consider if/when your back is against the wall!
05-13-2019 07:12 AM | 0 | 1 | 1091

IDEA
SQLite does not allow for editing. To update a SQLite DB you must overwrite the table for each update, unless you want to pay for additional licensing. ESRI has not allowed SQLite to replace the personal geodatabase; if they did, that would at least be a compromise. Additionally, there are workable 64-bit drivers for Access, so there is no excuse on that front.
05-06-2019 08:09 AM | 0 | 0 | 856

POST
I don't know if this means anything for Pro, but I can access .xlsx spreadsheets from both UNC paths and mapped network drives with no issues in ArcMap or its environment. This may be a Pro thing, or could it be a security thing in your network?
04-26-2019 07:21 AM | 0 | 1 | 1339

POST
I agree. In my case, my data resides in a personal geodatabase, and ESRI has decided to drop support. If I convert and upgrade to the ESRI data formats, 99% of my work process is lost. This affects all of our annotations! And there is no viable upgrade path moving forward. You may not use Access, but the concept of yanking support is still the same. Microsoft Access (.accdb) Support in ArcGIS Pro https://community.esri.com/ideas/12662
04-23-2019 07:49 AM | 3 | 0 | 183

IDEA
Rober -- Forgive me, I do understand that you are an instructor, and I am not picking on you; I was just using your answers to shed light on this issue. In my discussions with ESRI, the reasons they gave me were 64-bit drivers and speed in geoprocessing. In counter-argument, I pointed out that there IS a WORKING 64-bit driver for Access, and that speed is not the issue. Any speed gained using the FGDB is lost in debugging and the agonizing futility of trying to warp the data into something useful. When all is said and done, the enormous gains in geoprocessing speed are lost when factoring in the start-to-end product. I can guarantee that I can get a formatted report out to meet a deadline EXPONENTIALLY faster using an Access database than an FGDB, even though the geoprocessing in the FGDB may be 100 times faster. Now, to add insult to injury, ESRI is keeping compatibility with the shapefile, which has NO 64-bit drivers. GO FIGURE!
04-22-2019 07:13 AM | 6 | 0 | 856