Got stuck in creating and exporting an ENC(S-57 file)

10-27-2010 02:36 AM
Hsing-JuChang
New Contributor
I would like to produce an ENC product from my existing shapefiles. I've followed the PLTS Desktop Help 'ENC Production Procedures'. In short:

  1. Set up the product library and nautical properties.

  2. Add series, product, instance, and AOI to the ENC product class.

  3. Implement instance and check out product.

  4. Import data from existing shapefiles to the checked-out geodatabase and edit attributes.

  5. Run the Update Primitives tool. At this point I got the following error: 'Invalid Name:Error while initializing Feature cache hr:-2147467259'


What does the error message mean? I don't understand the overall ENC production procedures very well. Am I on the right track? Since I'm importing data other than an ENC cell (no cell info can be imported into my geodatabase), how should I define the S-57 cell metadata?

In addition, is it possible to modify the Agency/FIDS list (e.g. add an item to the list)?

The software version is PLTS Nautical 9.3.1. Hope I explained my question clearly.
Thanks in Advance,
Amy
8 Replies
NancyEl-Zeaiter
New Contributor III
Amy,

You must run Desktop Populate on your product to associate ENC metadata. The process will also version and archive your database in support of ER (ENC Revision) exports and geodatabase history. When Desktop Populate is run, you will be asked for your producer code. Since you asked how you can extend the current list of producers, I will address that question first.

Producer Codes:

At the 9.3.1 release the agency list comprised hydrographic agencies/organizations from S-62 as well as a snapshot of the OpenECDIS.org list of non-hydrographic-agency S-57 data producers. The OpenECDIS.org list continues to evolve; at the 10 release we updated the agency list to reflect its most recent state.

To Extend the list:

When installed, the Esri Nautical solution includes a number of configuration files. The one that contains the producer (agency) codes is called 'products_config.xml'. The default install location for this file is C:\Program Files\PLTS\Nautical\Common. Using an XML editor or a text editor, locate the tags for producer codes (<PRODUCERAGENCY> </PRODUCERAGENCY>). Enter the correct agency code, acronym, and company/organization description for your agency, following the XML syntax.

Example:

<AGENCY code="550" value="US" description="Office of Coast Survey National Ocean Service" />

That should do it; make sure you start a new ArcMap session to see the changes.
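If you have many codes to add, the same edit can be scripted. The sketch below is only an illustration based on the element names quoted above (<PRODUCERAGENCY> and <AGENCY> with code/value/description attributes); the actual layout of products_config.xml on your machine may differ, so back the file up and verify the result in an XML editor.

```python
# Sketch: append a producer-code entry to products_config.xml.
# Element and attribute names are taken from the example in this post;
# treat the structure as an assumption, not a documented schema.
import xml.etree.ElementTree as ET

def add_agency(config_path, code, value, description):
    tree = ET.parse(config_path)
    parent = tree.getroot().find(".//PRODUCERAGENCY")
    if parent is None:
        raise ValueError("No <PRODUCERAGENCY> section found")
    # Skip the insert if the code is already present.
    if any(a.get("code") == code for a in parent.findall("AGENCY")):
        return
    ET.SubElement(parent, "AGENCY",
                  code=code, value=value, description=description)
    tree.write(config_path, encoding="utf-8")
```

Remember that ArcMap reads the file at startup, so restart your session after the edit.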


Desktop Populate:

First, let's make sure you have the correct populate option selected.

1) At the class-level node (e.g. ENC) in the PL tree view, right-click and choose Properties.

2) On the Class Properties dialog, choose Product Instances on the left, and on the Population drop-down list choose Nautical Desktop Populate.

Now you can associate ENC metadata with your product by running the Populate command. Make sure your product is currently checked into PL, then right-click your product instance and select Populate.

1) Select a location where you will check out your product.

2) When the New Product form appears, select the product type you are producing.

See NewProduct Attachment

3) Note that for producer code you should see the acronym and description you added.

4) Complete the New Product form and click OK.

5) This will bring up the Metadata dialog, where you can set the DSID, DSSI, and DSPM values for your product.

See Metadata Attachment

6) After this dialog is populated and you click OK, wait until your product is populated.


If your data is missing the LNAM and DSNM values for the features you brought in from the shapefile source, you have the option of running the Product Refresher command to populate those fields.


7) Run the Product Refresher tool, which you can find under the Customize/Commands dialog, and make sure you check the radio buttons for Populate DSNM and LNAM. Note: additional processes can be run depending on your data content and needs.


As a side note, without knowing more about the work you are doing and the data you are compiling, it is worth mentioning that there are many other required features that S-57 data depends on for the creation of a valid ENC file (such as an M_COVR feature or other contiguous SOE features). I hope this information helps you get on your way.
Hsing-JuChang
New Contributor
Nancy,

Thanks for your reply!
I finally got my 'Geodatabase to S-57' export to work!

With a little problem...
I can convert COALNE, LNDARE, and M_COVR features. However, after adding the depth contour data, PLTS fails to export my S-57 cell. I get the following error: 'Error while writing to S57 File-2147467259'.

As an experiment, I tried drawing a depth contour line myself and exporting it to S-57, and it worked. So maybe the problem is coming from my depth contour data. I need some advice about how to check my data and fix it. BTW, the depth contours were generated from grid data by Contour Analysis (Surface Analysis).

Thanks for your effort and help.

Best Regards,

Amy
NancyEl-Zeaiter
New Contributor III
Amy,

Since you are not using typical S-57 data, and are loading varying types of data from various sources, I would consider running the Generalize tool on the DepthsL features that you have brought into the database, to minimize the number of vertices that conversion methods tend to increase. Also make sure that the data you have imported into your database is attributed correctly for export to S-57.

I hope that this helps.
GeoffreyGomez
Esri Contributor
Hi Amy,
I work with Nancy and wanted to find out some more information about your issue. Specifically, I wish to learn more about the depth contour data that is causing problems. You mentioned that you created the depth contour data using the Surface Analysis tools (Contour). Can you describe the workflow you used? By that I mean a high-level overview, so I can see whether there is potential for an issue.

Example: I preprocessed the raster data using the Neighborhood Focal Statistics operation with the Mean option; generated contours using an interval of 2 (meters) and a base contour of 0 (no factor was necessary); post-processed the resulting contours using the Smooth geoprocessing tool found in the Generalization toolset; then used the Simple Data Loader in ArcCatalog to bring the data into the Nautical data model.

What I am looking for in my example are the potential issues I want to uncover.


  1. If you post-processed and smoothed the contours, both of the interpolation methods can result in topological errors. These errors can introduce mathematically generated curves (no vertices at every point of inflection), or self-intersecting lines that are auto-corrected into multipart features, neither of which is supported by S-57. http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?id=1845&pid=1837&topicname=Smooth_Line_%28Data_M...


  2. The Simple Data Loader runs outside an edit session, which means some fields important to making the S-57 file may go unpopulated (e.g. LNAM, DSNM, NAME).


Did Update Primitives show any errors in its log file (missing LNAM, DSNM, etc.)?

I hope this helps flesh out some thoughts so we can narrow in on where the issue may be occurring. I look forward to learning more and getting you to your goal. If you can provide the data I can look into the issues directly.

Geoff
Hsing-JuChang
New Contributor
Hi Geoff,
Thanks for your reply.

I did use the Smooth tool on my depth contour data. I found that the resulting depth contours have topological errors (see attachment), which may be causing the problem.

As far as I know, these errors are created during the smoothing process. Is it possible to prevent or fix them automatically?

Best Regards,

Amy
GeoffreyGomez
Esri Contributor
Hi Amy,

With the workflow I have inferred from our communication your best option may be to pre-process your raster dataset prior to generating contours.
[INDENT]I noticed the hyperlink I included in the last post did not work, so I will add the paths within the help as well. Sorry about that.[/INDENT] The help mentions either FocalMean or the Filter tool with the LOW option. Both tools are found in the Neighborhood toolset within Spatial Analyst.

Path: Extensions > Spatial Analyst > Spatial Analyst functional reference > Neighborhood (Spatial Analyst)

As I mentioned in the previous post, the smooth process (Path: Geoprocessing tool reference > Data Management toolbox > Generalization toolset > Tools) in itself has the potential to introduce mathematically generated curves (no vertices at every point of inflection), or self-intersecting lines that are auto-corrected into multipart features, neither of which is supported by S-57.

If pre-processing the raster dataset is not a viable option, I can recommend a means of reducing the topological errors. If they continue to occur, I can at least give you a way of identifying and resolving the issues.

Smoothing options:
Prior to smoothing (and generalizing) your contour data, I would consider running a GIS Data ReViewer check to identify portions of linework like the image you sent. The check I have in mind is the Cutback check. (Path: Extensions > PLTS > Validating data with GIS Data ReViewer > Configuring checks) This may help you identify and fix portions of linework that could cause issues once smoothed.

I am not sure which tool you used to smooth your linework, whether a GP tool or a PLTS tool. I suggest using the PLTS Generalize and Smooth tool (Path: Extensions > PLTS > Editing data in PLTS > Editing existing features). Using this tool has a number of advantages:

  • It works within an edit session on existing data (Ability to undo)

  • You continue to have the ability to set a tolerance (Maximum Offset)

  • You can define the units

  • You can select the algorithm of choice

  • You can preview the results


I have attached an image (GSG.png) showing how I would suggest using the tool. That said, you may want to use a different tolerance for the smooth process than for the generalize process. Since the tool works on a selected set, you may be able to tailor the amount you smooth to the area you are working in and what it warrants. Also, make sure that you always finish your processing by generalizing the results.

For generalize, I set the maximum offset to 0.00000001, which is the resolution of S-57 data. This goes along with the units being decimal degrees and serves to remove Bézier curves while not affecting the character of the features.
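To make the effect of that 0.00000001 maximum offset concrete, here is a plain-Python sketch of the classic Douglas-Peucker generalization that tools of this kind are commonly built on (I am not claiming the PLTS tool uses exactly this algorithm): every vertex that is dropped lies within the tolerance of the simplified line, so curve residue below the S-57 resolution disappears while real bends survive.

```python
# Sketch: Douglas-Peucker line generalization on (x, y) vertex tuples.
import math

def _point_line_dist(p, a, b):
    """Perpendicular distance from p to the line through a and b."""
    if a == b:
        return math.hypot(p[0] - a[0], p[1] - a[1])
    num = abs((b[0] - a[0]) * (a[1] - p[1]) - (a[0] - p[0]) * (b[1] - a[1]))
    return num / math.hypot(b[0] - a[0], b[1] - a[1])

def generalize(vertices, tol=0.00000001):
    """Drop vertices closer than tol to the chord between the endpoints."""
    if len(vertices) < 3:
        return list(vertices)
    # Find the interior vertex farthest from the endpoint chord.
    dists = [_point_line_dist(v, vertices[0], vertices[-1])
             for v in vertices[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [vertices[0], vertices[-1]]
    # Keep that vertex and recurse on both halves.
    left = generalize(vertices[:i + 1], tol)
    right = generalize(vertices[i:], tol)
    return left[:-1] + right
```

With the default tolerance, a micro-deviation of 1e-12 degrees is removed while a 1-degree deflection is kept, which matches the intent described above: strip curve artifacts without changing the character of the feature.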

Identifying and Resolving:
Once you have smoothed and generalized your data (especially if using the GP tools), it may be difficult to identify multipart features or Bézier curves. I again suggest using GIS Data ReViewer to help you identify these issues.

Both checks I recommend are under the default checks. (Path: Extensions > PLTS > Validating data with GIS Data ReViewer > Configuring checks) Configure and run the Multipart Line and Non-Linear Segment checks to ensure you have not produced any topological issues related to S-57 data.

If you have multipart features, you can resolve the issue by selecting the errant geometries and using the Explode Multi-part Feature tool found on the Advanced Editing toolbar.

If you have non-linear segments, you will need to select them and run Generalize based on the guidance above.

Hope this helps 🙂
Hsing-JuChang
New Contributor
Hi Geoffrey,

Your suggestion is really helpful. Thanks!!:D

Amy