POST
OK, so you want to go into your Control Panel and check which version of .NET is installed. On this machine it is 4.6.1. You may need to update it, as it's something that is easily overlooked. As for how to open the .dmp files: if you are using Microsoft Visual Studio, right-click the .dmp file and click Open. You should get a pop-up where you mark "Select a program..." and click OK. A new window pops up; locate your Microsoft Visual Studio install and choose it. Once you have selected Visual Studio, it should open the dump and look like this. For this .dmp it appears there was a permissions issue, based on what is in the "Exception Information" section. If you don't have Visual Studio, you could grab one of these free tools to dig into the .dmp: 3 Ways to Analyze Memory Dump (.dmp) File • Raymond.CC
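As a side note, before handing a .dmp file to any of those tools, it can be worth a quick sanity check that the file really is a Windows minidump: minidumps begin with the 4-byte signature "MDMP". Here's a minimal sketch of that check in Python (the file path and function name are just examples, not part of any particular tool):

```python
def is_minidump(path):
    """Return True if the file begins with the Windows minidump signature 'MDMP'."""
    with open(path, "rb") as f:
        return f.read(4) == b"MDMP"

# Example: is_minidump(r"C:\crashdumps\app.dmp")
```

If this returns False, the file was probably truncated or isn't a minidump at all, which would explain a debugger refusing to open it.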
Posted 08-10-2016 06:28 AM

POST
Hmmm, what version of Microsoft .NET do you have installed, and what version of ArcServer are you using?
Posted 08-10-2016 05:13 AM

POST
1) I've always used Microsoft Visual Studio to open and read them. I know lots of people may not have it, and hopefully others can offer up what they use for reading them.

2) Not certain, but it sounds like maybe you are wondering about the OS-level logs? If so, and IF you are using a Windows environment, you want to go to Server Manager > Diagnostics > Event Viewer > Windows Logs (this is on Windows Server 2008). I believe in 2012 you just go to the Events area under the Local Server, but I don't have any instances to confirm the exact path; a Google search would give you where to go.

3) Haven't seen this error before, but perhaps look at what Jose says here: How to avoid this error: XML Parsing Error: no element found. He suggests it's a permissions issue.
Posted 08-10-2016 04:30 AM

POST
Sure thing, glad it helped. You should be able to reconcile and post all of them back to your default and bring in data from all the various versions. It just sounds like you might have some duplication across versions that would need to be cleaned up using conflict resolution. GL
Posted 08-03-2016 05:33 AM

POST
This sounds like it might be due to versioning. Have you brought up the database administration view to see if there are one or more versions pending? Probably the fastest way to get an answer would be to launch ArcToolbox > Data Management Tools > Versions > Reconcile Versions. Once you provide the DB connection, the tool should pop a list of all pending versions into the "Edit Versions" box. I'm betting that once you perform a reconcile, it will move all your data from the edited versions to base, and you will then be able to see your data.
Posted 08-03-2016 04:42 AM

POST
I'd agree with Dan. I can say we also use a shared UNC location for the config store, server job folders, etc. I keep a backup copy of all of our DB connection files, ETLs, etc. in a folder at the same location, but it's not the true source for any of our work. Thus far it's never caused any issues. From an architectural standpoint, I only see two downsides. First, you create a single point of failure, so if you are backing up this share location in its entirety elsewhere, you have a restore option for not just servers but data as well. The only other concern, depending on what and how you are leveraging the server and the data for desktop needs in simultaneous workflows, is that you could push up against some heavy disk I/O, giving performance headaches to users on both ends (desktop and server).
Posted 08-02-2016 08:13 AM

POST
Ah lol, it's been over a year since we did these configs. We did this as well, although I don't believe it was what resolved the issue for us; still, I agree with Jade that this is a good change to make.
Posted 07-28-2016 08:08 AM

POST
Knew this looked familiar, and I'm fairly certain this was an issue we had:

<Msg time="2015-05-17T15:33:10,141" type="SEVERE" code="9003" source="Rest" process="5208" thread="32" methodName="" machine="XXX" user="XXX" elapsed="">Unable to process request. For input string: "18446744073709551614"</Msg>

If this is what I am thinking, then for us it was a RAM resource issue. Not knowing your environment and configuration, I can't tell you exactly where to look. We run the Web Adaptors out on their own IIS servers and load-balance onto the ArcServers on their own servers. When we saw this, the RAM on our web servers was pegging out at 100% usage, and users would wait to sync and get errors on the mobile device. We took this to be a timeout issue: the lack of resources prevented the data from getting through. Easy fix; we scaled out another 4 GB of RAM and all was well again. What's interesting is that you say you can pass them through via ArcGIS Online, which one would think should generate the same results as Collector. Perhaps consider leveraging the newer releases of Collector that allow for photo compression, and verify file size on the mobile devices vs. what you are pushing into AGO.
Posted 07-28-2016 06:55 AM

POST
Currently there isn't an ESRI-designed method for creating required fields. We have brought this up with them several times over the last year, and they say it is in the works. In the meantime we have found an option that does work, but ESRI tells us it is not "by design". It will make the fields required, though, and the submit button is grayed out until all required fields are populated. I have a write-up on this workaround; if you want to e-mail me at scott.fierro@dot.ohio.gov, I can pass it along to you. Just understand that at this time ESRI states this is a BUG workaround, so something as simple as a future update may break it, or they may decide to adopt this as the workflow for creating required fields.
Posted 07-22-2016 04:16 AM

POST
Due to some of our areas in southern Ohio where cellular data connectivity is essentially non-existent, we have set a business rule that for most of our Collector projects the work is done in offline mode. We did some testing with triggers and never took the time to dig into all the underlying causes, but we did have issues with the triggers and assumed it's tied to the way the "sync" process is designed. As a result we do everything as stored procedures instead, and based on our field workers' daily routines it's a simple matter of running the SP once nightly. Their workflow: when they come in, they do a fresh download for offline use of any maps needed that day, go out all day to do field work, and upon returning at night perform a sync, then drop the iPads in docking stations for charging and for us to push updates as needed. So a single nightly process has been effective for our needs, but if your workflow were different, you could set the SP to run every 30 minutes or so. It really comes down to how often you have users syncing and downloading maps for offline use, if you need to ensure no new download contains records that haven't had the SP run against them.
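To make the "one nightly batch instead of per-row triggers" idea concrete, here's a rough sketch using sqlite3 as a stand-in database. The table and column names are made up for illustration only; the point is that the batch touches every synced-but-unprocessed row in a single pass, doing the same work a trigger would have done row by row:

```python
import sqlite3

# In-memory stand-in for the enterprise geodatabase table Collector syncs into.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE inspections (
        guid TEXT PRIMARY KEY,
        raw_value REAL,
        clean_value REAL,          -- filled in by the nightly batch
        processed INTEGER DEFAULT 0
    );
    INSERT INTO inspections (guid, raw_value) VALUES
        ('a1', 10.0), ('b2', 12.5), ('c3', 7.25);
""")

def nightly_batch(conn):
    """Post-process every row the field crews synced since the last run."""
    cur = conn.execute("""
        UPDATE inspections
        SET clean_value = raw_value * 2,   -- placeholder for the real cleanup logic
            processed = 1
        WHERE processed = 0
    """)
    conn.commit()
    return cur.rowcount  # number of newly processed rows

rows_touched = nightly_batch(conn)
```

Running it again immediately touches zero rows, which is what makes it safe to schedule more often (every 30 minutes, say) if your sync cadence demands it.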
Posted 07-21-2016 04:33 AM

POST
There's a series of options here you can use, depending on need and intent, but since you created feature classes for both, it sounds like you want more than just X/Y; you want geometry as well:

1) From the database side, create an on-insert trigger that extracts X/Y from the geometry you are collecting into fields in that table, then performs a copy into the other table based on a GUID join.

2) Create a stored procedure that performs these tasks on some scheduled routine (hourly, every 4 hours, nightly, etc.).

3) Create an ESRI-based ETL (a model in ModelBuilder) that performs the same tasks, but additionally you could create geometry on the 2nd table using the X/Y data being copied into it. This requires ArcServer to publish the model as a service and then schedule it to run routinely using Windows Task Scheduler, or it could be run manually if there's no need for automation.

4) Create a custom Python script that performs the same tasks the ETL does; it can be put into Windows Task Scheduler without needing ArcServer, using the .py as an argument to python.exe.

Those are some of the options, but I am sure there are several more if you wanted to go down the path of FME or other tools. Most of what I offered is focused on automation, but it can all be done manually as well.
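The core of options 1 and 2 is just a GUID-keyed copy of X/Y between two tables. Here's a minimal illustration of that join logic in plain Python so you can see it without a database; the field names (GlobalID, POINT_X, POINT_Y) are examples, not your actual schema:

```python
def copy_xy_by_guid(source_rows, target_rows):
    """Copy X/Y from source rows into target rows matched on GlobalID."""
    xy_by_guid = {r["GlobalID"]: (r["POINT_X"], r["POINT_Y"]) for r in source_rows}
    for row in target_rows:
        if row["GlobalID"] in xy_by_guid:
            row["POINT_X"], row["POINT_Y"] = xy_by_guid[row["GlobalID"]]
    return target_rows

# Hypothetical sample rows: one collected point, one target record awaiting X/Y.
source = [{"GlobalID": "g1", "POINT_X": -82.99, "POINT_Y": 39.96}]
target = [{"GlobalID": "g1", "POINT_X": None, "POINT_Y": None}]
copy_xy_by_guid(source, target)
```

In a trigger or stored procedure this same logic becomes a single UPDATE...JOIN on the GUID column, which is why the database-side options tend to be the least moving parts.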
Posted 07-20-2016 03:45 AM

POST
Interesting that it's setting the hint to that. While it's not a supported or intended option for Collector, there is currently a way to enforce required fields. We have brought it to ESRI and it is deemed more of a BUG than a working solution, but it is currently a repeatable process. One of our staff, Anthony Clark, did a great write-up on it that is far too long to post here, but if you want to email me at scott.fierro@dot.ohio.gov, we can pass it along for you to try.
Posted 07-14-2016 07:07 AM

POST
Veronica, so there is a not-so-pretty way, which takes a little work, that we discovered to force a "required fields" scenario. We have brought it up with ESRI, and it's deemed more of a BUG than an intended scenario or solution. So at the moment this does work and is a repeatable process, but I can't guarantee how long it will continue to work. Some of our staff put together a good write-up on it, but it's far too long, with too many screenshots, to paste in here. If you would like it, you can email me at scott.fierro@dot.ohio.gov and I can pass you the Word document.
Posted 07-14-2016 07:04 AM

POST
Jolynn Becker, great question; this is referred to as "Skip Logic" (Skip Logic – Survey Response Path Based on Answers | SurveyMonkey). We have done monthly or bi-monthly meetings with a PM for all of ESRI's mobile app solutions and raised this issue a while ago. As @Kylie Donia said above, they are looking into providing this functionality in a future release. By all means, though, keep posting and pushing this stuff on the Ideas pages to help stress the high user need for it.
Posted 07-14-2016 06:57 AM

IDEA
Good catch, Matt; I forgot about making that change as well (I did it so long ago), but yes, that's the config setup we use. I'd agree ESRI might not condone it, but there are a lot of pretty useful and beneficial DB-side tools that exist across multiple (or single) DB platforms, and ESRI doesn't condone or script for them either. We have gone so far as to reverse engineer some of the ESRI processes in order to modify them to leverage a specific DB platform's native capabilities.
Posted 07-08-2016 10:00 AM