BLOG | 02-11-2022 11:04 AM | Kudos 0 | Replies 0 | Views 662
Thanks, I'm so deep in the weeds I forget to demystify jargon.

BLOG | 02-10-2022 01:05 PM | Kudos 0 | Replies 0 | Views 1129
Hi Gab, let's take this off the blog thread. If you can email me at bharold at esri dot com and attach a file, I'll take a look.

BLOG | 02-07-2022 07:19 AM | Kudos 1 | Replies 0 | Views 1193
The .markup file will have to be readable somehow to make any progress; I would have to ask around about what it is built from. Can you open it with Notepad++ and paste in a few lines? Your challenge may be a good topic for a question on the experts forum: https://community.safe.com/s/knowledge-base

BLOG | 02-07-2022 06:36 AM | Kudos 0 | Replies 0 | Views 1200
Did the custom formats in this blog post fail, or one of your own? If it is one of yours, please open a support call and share the .fds file and error details so we can take a look. If it is the ones in this post, please share the error details and confirm the release of ArcGIS Pro you are using - thanks.

IDEA | 02-07-2022 05:47 AM | Kudos 0 | Replies 0 | Views 5213
That is correct. https://docs.safe.com/fme/2021.1/html/DataInterop_Documentation/FME_ReadersWriters/ifc/ifc.htm

BLOG | 02-04-2022 11:00 AM | Kudos 3 | Replies 9 | Views 2341
If your team's job requires repeated custom data ingest, you'll be interested in making it a one-step process - the proverbial Easy Button. The best way to share an ETL integration is to make the destination dataset a feature service - feature services are ideal for sharing, and of course you can make web tools in Enterprise. Sometimes, though, you cannot use those options: you need a simple way to ingest a dataset just for yourself, your colleagues, or like-minded people who are not in your organization or even known to you. You just want anyone doing the same job to make data the same way. Your data may be sensitive or time-bound, or you may be off any network. You just want data loaded into a geodatabase in one step.

A couple of teams here at Esri came to me recently with this problem, and in both cases the solution was the same - ArcGIS Data Interoperability custom formats. Data Interoperability comes with an 'easy button' called the Quick Import geoprocessing tool. This tool lets you pick any non-raster format from its gallery of 500 or so and convert the data to file geodatabase. To add new formats, all you need to do is create them! Relax, it's a no-code experience. Then you can share them with anyone. In my colleagues' cases they were dealing with special 'dialects' of XML and CSV that needed specific logic to create features correctly. Custom formats are built on top of existing formats (e.g. XML, CSV and hundreds more) and encapsulate all the logic to create feature classes or tables however you need.

Data like the current US weather can be queried at a location:

[Image: National Weather Service Report]

Anyone can do this with Quick Import if the weather site response is made a custom format. I did this, so the Quick Import input dataset format gallery lets me pick the US Point Location Weather Forecast format and set its parameters (any address, location or POI in the US will work; under the covers it uses the World Geocode Service):

[Image: US Weather Format]

[Image: Weather Format Properties]

Then a Forecast feature is available to map and query - the green hexagon is a weather feature:

[Image: Boston Weather Feature]

This is not relevant to the story of custom formats, but the feature is a hexagon because I used an Uber H3 encoding of the XY value geocoded from the input. This complies with the terms for non-storage use of the geocode service - all I know is that the geocoded XY is somewhere inside the hexagon. The hexagon size (i.e. H3 index scale) also agrees with the coverage NOAA's NWS service returns for a point forecast.

So how do you make a custom format? All that is required is a normal standalone Workbench document (.fmw) that satisfies two conditions:

- The workspace must have an input dataset
- The workspace must write to a file geodatabase

Once you have configured your workspace and verified it creates data correctly, the File menu in Workbench has the choice Export as Custom Format. This saves the workspace as a format definition in a default location in your profile directory, with a short name and long description you can see and search on in the formats gallery. The format definition has the file extension '.fds' but it remains editable with Workbench. These fds files can be shared with anyone, either by copying them into their profile directory or by putting them on a file share they can see - the Tools > FME Options > Default Paths dialog lets you specify Shared FME Folders where your team can share fds files and other resources like credentials.
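Because Quick Import is a geoprocessing tool, the one-step ingest can also be scripted. Below is a minimal Python sketch, assuming ArcGIS Pro with the Data Interoperability extension; the source string for a custom format is illustrative only - build the real one interactively in the tool dialog and copy it from there.

```python
# Minimal sketch: scripted one-step ingest with the Quick Import tool.
# Assumes ArcGIS Pro with the Data Interoperability extension; verify the
# tool's parameter details against its documentation.
import arcpy

arcpy.CheckOutExtension("DataInteroperability")

# Illustrative source string only - the format short name (NWSFORECAST, from
# this post) plus its parameters; compose the real string in the tool dialog.
source = "NWSFORECAST,<location or address parameter>"
output_gdb = r"C:\Temp\forecast.gdb"

# Quick Import converts the picked format to a file geodatabase in one step.
arcpy.interop.QuickImport(source, output_gdb)
```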
Here is my profile directory with a few custom formats in it:

C:\Users\<username>\Documents\FME\Formats

Two of these files are in the blog download, NWSFORECAST.fds and CASOLAR.fds - the file names come from the short name you give the format at export time. If you copy them into your C:\Users\<username>\Documents\FME\Formats folder (make it if it isn't there already) you'll have two new formats! NWSFORECAST is the weather forecast format and CASOLAR is California Distributed Generation Interconnected Project Sites data. This is a tabular monthly digest of electrical generation installations (those that are not off grid) in California; it has a lot of fields describing generation projects (mostly solar). Here is the data at the time of writing, summarized as total system kilowatts per ZIP code and extruded at 1 kW = 1 m:

[Image: California Generation Kw November 2021]

If you want your own copy, look for this format:

[Image: California Generation Format]

I chose to make these example custom formats from underlying XML for the weather format and CSV for the power generation format, partly because XML and CSV (along with JSON) are frequently encountered, but also because in this case they are delivered via HTTP, not a dataset reader, and I wanted to show you how to do that. To meet the condition that a custom format workspace must have an input dataset, I used a special format called NULL, which as the name suggests does nothing. If you edit the fds files you'll see nothing special.

[Image: Weather format workbench]

[Image: Solar projects workbench]

If you want to test drive the custom formats you will need the ArcGIS Data Interoperability extension installed for Pro. I used Pro 2.9 to create the formats and have not tested earlier releases. Be aware the power generation data will take about 40 minutes to download and process (nearly 1.3M rows), and you will need to find California ZIP code boundaries to map the data. The weather report format takes a few seconds.

My real message here is that one person with Data Interoperability (or FME) skills can create custom formats for big-button ingest and share the functionality with anyone using the extension. The person using the format only needs to know about the Quick Import tool and does not need Data Interoperability training.
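Since installing a shared format is just a file copy, that step can be scripted too. A minimal sketch, assuming the .fds files sit in a hypothetical Downloads folder:

```python
# Minimal sketch: install shared custom formats into the FME Formats folder.
import shutil
from pathlib import Path

formats_dir = Path.home() / "Documents" / "FME" / "Formats"
formats_dir.mkdir(parents=True, exist_ok=True)  # make it if it isn't there already

# "Downloads" is a hypothetical folder holding the .fds files from the blog download.
downloads = Path.home() / "Downloads"
for fds in ("NWSFORECAST.fds", "CASOLAR.fds"):
    shutil.copy2(downloads / fds, formats_dir / fds)
    print(f"Installed {fds} -> {formats_dir}")
```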

POST | 02-01-2022 05:10 AM | Kudos 0 | Replies 0 | Views 477
No, it's just a slogan, a bit like "The Geographic Approach" you'll hear from Esri, where the definite article doesn't literally mean there is a single path to nirvana. "Building a Data Driven Organization" is just a set of ideas and approaches you can look at and judge for yourself whether each is relevant to your organization.

POST | 01-31-2022 05:32 AM | Kudos 0 | Replies 0 | Views 1015
Hi Sietse, try making an ETL tool using the basic XML reader. When you add the reader you can set the reader parameters after picking the input file. Go to the 'Elements to Match' setting; this lets you select the quaylocationdata element. Its rd-x and rd-y values will then be available as attributes to use in a VertexCreator to make geometry (see the sketch below for the same extraction in plain Python). Ask Esri NL for assistance; they have some specialists.
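For anyone who wants to sanity-check the source data outside Workbench, here is a plain-Python sketch of the same extraction, not the ETL tool itself. The element and value names (quaylocationdata, rd-x, rd-y) come from this thread; the file name is hypothetical, and whether the coordinates are child elements or attributes should be verified against the actual XML.

```python
# Plain-Python sketch of the extraction described above.
# Assumes rd-x and rd-y are child elements of quaylocationdata; adjust if they
# are attributes in your schema.
import xml.etree.ElementTree as ET

tree = ET.parse("quays.xml")  # hypothetical input file name
for elem in tree.getroot().iter("quaylocationdata"):
    x = elem.findtext("rd-x")
    y = elem.findtext("rd-y")
    if x is not None and y is not None:
        # These are the coordinate pairs a VertexCreator would turn into point geometry.
        print(float(x), float(y))
```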

BLOG | 01-25-2022 07:28 AM | Kudos 1 | Replies 2 | Views 1807
Let's expand the scope of the 'data driven organization' theme just a little to include every organization in the world, or at least all those you care about in their B2B interactions. If you're an agency responsible for promoting business efficiency, you might like to curate data like the map below in your SDI. It shows all businesses in a country, with all their functional locations uniquely keyed; the map view is centered on a city industrial zone (there is an estuary in the middle - it's a very scenic industrial zone):

[Image: Company Locations]

The dots on the map are coming from an ArcGIS Knowledge graph service - I'll get to the details in a bit - but more importantly, what is the big picture? It is standardized, authoritative, shared access to entity location, in a way you can build into your B2B processes and trust its validity and persistence. Using a standard, in other words.

The dots are GS1 Global Location Number (GLN) locations. Any legal, physical, functional or even digital location can get a GLN and, where it has a location, be used for logistics - i.e. supply chain analysis. When combined with other operational data like purchase orders, inventory value, goods in transit between GLN points and so on, a graph analysis opens up many more analytic opportunities to improve near real time business decision making. Even just adopting GLN as a shared foreign key in an economy will enable efficiencies and reduce errors.

The dots on the map above are where registered businesses identify their primary location to be. Well, more correctly, I geocoded the locations from addresses contained in a bulk download of a day's snapshot of GLN data. I used the ArcGIS Online World Geocode Service and I worked in ArcGIS Pro, using the ArcGIS Data Interoperability extension to handle the JSON data in the download. I didn't write any code.

If you read up on the GLN, you'll see it's a 13-digit number and does not in itself encode location. To create geometry from GLN data you will need supporting address or coordinate data. Let's discuss GLN a little before having some fun with the data. There are many approaches to encoding 2D geometry in a single value - geohash, Google's S2 library, Uber's H3 library, several discrete global grid system (DGGS) hashing schemes - and at first I wondered why GS1 didn't use one. On reflection, businesses can relocate or even be mobile, locations may be 3D, and digital GLNs (web domains etc.) aren't physically located at all, so a plain number was chosen as the 'encoding'. Thirteen digits is a little awkward: it overflows a 32-bit long integer type, and using a double precision type runs the risk of some storage or transport formats turning the number into a decimal (adding '.0'). So for my purposes I handle GLN as char(13) and generate geometry as 3D WGS84 points, with 0 as the default elevation above the ellipsoid. I suppose if I were doing this for real I would store the epoch too, so tectonically fast-moving places like Australia could maintain 2D accuracy over long time periods. Some businesses are hundreds of years old!

Anyway, on to a scenario! The simplest use case for keying B2B transactions with GLN is logistics - shipping goods. Things like sale and purchase agreements would come along for the ride, recording any transaction's GLN details. My (fake) scenario is that the International Space Station needs a new dishwasher; your transactions may be more mundane.
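A quick aside before the scenario plays out: one nice thing about handling GLN as char(13) is that validity is easy to check, because GS1 numbers carry a standard mod-10 check digit. A minimal sketch in Python - the algorithm is the published GS1 one, and the sample value is a GLN used later in this post:

```python
# Minimal sketch: validate a GLN using the published GS1 mod-10 check digit.
def gln_is_valid(gln: str) -> bool:
    """True if gln is 13 digits with a correct GS1 check digit."""
    if len(gln) != 13 or not gln.isdigit():
        return False
    # Counting from the left, odd positions weigh 1 and even positions weigh 3;
    # a valid number (check digit included) sums to a multiple of 10.
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(gln))
    return total % 10 == 0

print(gln_is_valid("9429040747378"))  # GLN used later in this post -> True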
The purchasing folks for the ISS are very smart (after all, they are rocket scientists) and looked for an innovative appliance manufacturer located close to a launch provider. After a little research they found a good pair of suppliers and organized delivery of a custom zero-gravity dishwasher. The agency where the companies are located made an enlightened decision to base their corporate registration system on GLN, so this high value (or any) transaction can be driven by GLN!

In the blog download is a locator; unzip it into an ArcGIS Pro project folder and add it to your project. Make two maps in the Web Mercator coordinate system, make the basemap Imagery Hybrid, and arrange them side by side. In the left map use the Locate tool to go to the location 9429040747378, and in the right map go to 9429034019108. I built the locator to return a company name in a field EntityName. When you geocode to the GLN locations, the candidate popup will reveal the suppliers I'm talking about. They are real companies that really operate in the industries I mention. A human interest factoid: the CEO of the rocket company used to work at the appliance manufacturer, and so did I. Here is how things should look with the maps at 1:2000 (you will not have the blue dots); obviously the companies would use their GLN locations for shipping arrangements in their contract.

[Image: Appliance manufacturer and launch provider locations via GLN locator]

So that's a shout out to GLN as a hook to hang B2B processes on. It is simple, official, and extensible - organizations can have any number of GLNs for delivery points, including ones they might keep private for internal purposes. All that remains is for you good folks to build it out. Ask your local Esri representative for help building out GLN as part of your SDI.

I said I created a Knowledge graph with the data. The source dataset was ~6GB of zipped JSON; I got it into the graph in three steps (without unzipping):

1. Extract Company Data
2. Load Entities
3. Load Relationships

In hindsight I could have done less flattening of the JSON and retained more expansion tables, but it doesn't matter for this exercise. The graph has these object counts:

[Image: Graph Investigation Contents]

That's the whole database, including historic records. I might not retain everything if I were doing this for real. I mentioned I flattened everything in the JSON, so I ended up with very much a star schema, not a snowflake one. The Company entity is the sole fact table; everything else is a dimension with a relationship starting at Company, hence the names I gave them beginning with "co". You can go either way with a graph.

Let's see a couple of things the graph can tell us. In the map of GLN locations I noticed a lot of business locations in amongst residential housing, to the east of the map. What's going on there?

[Image: Businesses in a residential area]

I sent this query to the graph:

match (ra:RegisteredAddress {postCode:'2016'})<-[:coHasRegisteredAddress]-(co:Company {entityStatusDescription:'Registered'})
with co
match (co)-[:coHasIndustryClassification]->(ic:IndustryClassification)
return ic.classificationDescription, count(*) as icCount
order by icCount desc

Here is what is returned:

[Image: Business Classifications in Postcode 2016]

They are a bunch of property developers! Well, the data has a long tail; there are 689 business classifications represented, making fascinating reading, right down to motor wreckers, milliners and milkers of cows. Back to the graph theme.
I can see 10 construction carpentry businesses and can get them into a link chart:

match (ra:RegisteredAddress {postCode:'2016'})<-[:coHasRegisteredAddress]-(co:Company {entityStatusDescription:'Registered'})
with co
match (co)-[:coHasIndustryClassification]->(ic:IndustryClassification {classificationCode:'E324220'})
return co

[Image: Construction Carpenters in Postcode 2016]

At this point I run out of steam with my analytical abilities with graphs, so I'll stop while I'm ahead. That's a personal limitation, not the technology's; I just haven't spent time refining my skills. The point I really wanted to make is that the world's businesses and transactions can be modelled in ArcGIS, and there is a handy linking key available to you in the GS1 GLN. Talk to the responsible people in your organization, province, state or country and help get everyone on the map!
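As a footnote for anyone who wants to run these queries outside the investigation UI: the ArcGIS API for Python exposes Knowledge graphs too. A minimal sketch, assuming the arcgis package with its arcgis.graph module is available and your portal credentials resolve; the service URL and credentials below are placeholders:

```python
# Minimal sketch: run the postcode/classification query against an
# ArcGIS Knowledge graph service with the ArcGIS API for Python.
# The service URL and credentials are placeholders.
from arcgis.gis import GIS
from arcgis.graph import KnowledgeGraph

gis = GIS("https://www.arcgis.com", "username", "password")
kg = KnowledgeGraph(
    "https://example.com/server/rest/services/Hosted/GLN/KnowledgeGraphServer",
    gis=gis,
)

cypher = """
match (ra:RegisteredAddress {postCode:'2016'})<-[:coHasRegisteredAddress]-
      (co:Company {entityStatusDescription:'Registered'})
with co
match (co)-[:coHasIndustryClassification]->(ic:IndustryClassification)
return ic.classificationDescription, count(*) as icCount
order by icCount desc
"""

# query() submits an openCypher query and returns the result rows.
for row in kg.query(cypher):
    print(row)
```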

POST | 01-24-2022 06:42 AM | Kudos 0 | Replies 0 | Views 3642
Hi, we attempted to reproduce your issue but could not. We recommend opening a support call so the analyst can see your data and ETL tool - thanks.

POST | 01-21-2022 05:23 AM | Kudos 1 | Replies 0 | Views 3658
Thanks for the workflow thumbnail - we'll chase it here. In the meantime, please use an AttributeRounder.

POST | 01-20-2022 09:45 AM | Kudos 1 | Replies 0 | Views 3665
Hi, please make sure the feature service you are writing to doesn't define the target field as double. Data Interop will happily write numeric strings into a double. Are you creating the service with the writer or does it already exist? We would like to reproduce the problem here so the workflow is important.
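As a hedged sketch of how to confirm the target field type before writing, using the ArcGIS API for Python - the layer URL below is a placeholder for your own service:

```python
# Minimal sketch: inspect a target feature service's field types before writing.
# The layer URL is a placeholder for your own service.
from arcgis.features import FeatureLayer

layer = FeatureLayer(
    "https://services.arcgis.com/<org>/arcgis/rest/services/MyLayer/FeatureServer/0"
)
for field in layer.properties.fields:
    # e.g. esriFieldTypeDouble vs esriFieldTypeString
    print(field["name"], field["type"])
```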

POST | 01-18-2022 06:24 AM | Kudos 0 | Replies 0 | Views 2551
You can import Elasticsearch documents (including those with GeoJSON or LatLon geometry) with Data Interoperability, but not as a stream layer - only on demand or on a schedule. There are other options for handling the data in real time; if you contact your Esri representative and start a discussion, please ask to include me (Bruce Harold) and we can take it from there. Thanks.