POST
That's really nice for the basemaps. I'm curious though: if you completely lost internet connectivity to the outside world (but maintained your internal network), would your applications continue to work? I feel like it would be a bit of an undertaking to create all the web maps, then create all the web apps that consume them, then go into the source code for every map and every app and redirect it to a local copy of the JavaScript API, as well as any other ESRI-specific items in the source code, like the AppID and webmap ID. I'm sure there is a way, but I'm curious whether, in the situation I outlined (losing complete connectivity to the outside world), your Portal apps and maps would still function as-is. Have you tested that?
Posted 01-27-2017 09:07 AM · 0 kudos · 4 replies · 1819 views

POST
I have, but not extensively. I thought it still referenced the ESRI JavaScript API, which is hosted on their servers. Is there an option to download the API and host it internally, then point all web maps and web apps to the locally hosted API?
Posted 01-27-2017 08:20 AM · 0 kudos · 10 replies · 1819 views

POST
Hey Joe - to answer your questions... yes, we do have unlimited data. The caveat: during the last major hurricane my area received, one that brought not just heavy rain and flooding but severe damage to critical infrastructure, with 80% of all homes damaged, the entire city was completely without power for two weeks. I can't speak much for cellular service, because at the time I was in high school and smartphones didn't exist yet; we were still using those old Nokia phones. But I'm fairly certain cellular data and internet were non-existent for those two weeks as well. So obviously, when a very serious hurricane hits, all bets are off and you have to rely on paper maps or, like you said, lightweight, locally hosted apps.

I haven't dug too deeply into third-party lightweight apps. I could probably create an HTML viewer that would let some of our reporters record public relief calls, consuming the REST endpoints of our ArcGIS Server; host the app internally, and download the ArcGIS JavaScript API to host internally for the app to use. It's feasible. We have very serious redundancy plans for our power and data restores in these events, but when it comes to communicating with the outside world, we are limited by whatever damage may or may not have hit the FPL or Comcast fiber lines. It's definitely an interesting subject, one I'm not too familiar with but am quickly immersing myself in.

Seems like you would need essentially three plans:

1. Minor hurricane - minor damage, flooding, internet or cellular connectivity still functional. In this situation you could use your primary response applications driven by ArcGIS Online.

2. Medium hurricane - average damage, flooding, some damage to critical infrastructure (pump houses, substations, bridges, etc.), intermittent power outages, and intermittent internet/cellular. In this situation, I suppose you could start with the basic internal HTML viewer, since it doesn't rely on anything other than your internal network. As power/internet/cellular is restored, migrate to the ArcGIS Online applications, which consume the same REST endpoints as the viewer; when we go back online, ArcGIS Online will pick up all the edits and updates we've been making internally, so we don't lose a step in the recovery and response process.

3. Large hurricane - this is where things get interesting. You can't rely on cellular service. You can't rely on internet. You can't even guarantee that our diesel back-up generator on the roof didn't sustain major damage and isn't out of commission for a few days, or didn't get blown off entirely when it was hit by a boat picked up out of the harbor. In this situation it would very obviously be paper maps until power comes back up. Once power is back, our data and IT infrastructure will be restored, which means our internal network will be operational and we can migrate to the internal application. As infrastructure is repaired, we continue using that application until connectivity with ArcGIS Online is restored, then migrate to ArcGIS Online as in the second scenario.

I think I may have just thought through my beginning proposal for damage recovery in this response hahaha. If you have any suggestions for those three situations, I would really appreciate the feedback. Thanks Joe
Posted 01-27-2017 05:31 AM · 0 kudos · 2 replies · 1819 views

POST
Hello... I am quite familiar with the majority of ESRI's software, but am now venturing into the somewhat unfamiliar territory of Emergency Operations. My biggest question, I suppose, is: how are other municipalities that utilize ArcGIS Online and Emergency Response Solutions accessing the resources provided by ESRI during a major natural disaster, such as a hurricane? We are in Florida, and hurricanes are unfortunately a regular occurrence between July and October. If we lose internet capability, our internal network will be fine and operational; but if we are using some of the solutions provided, we rely on access to ArcGIS Online, as well as the JavaScript API the applications are built on, which is hosted by ESRI. If we go "black," so to speak, and are disconnected from the outside world, wouldn't that render a lot of these solutions useless, unless you have a contingency plan such as cellular data or possibly satellite data as a back-up mode of communication to ESRI's servers? I know you can host your ArcGIS Online apps locally, but those HTML/JavaScript/CSS3 files will still reference ArcGIS Online web maps, and the JavaScript API used to build those apps. Does anyone have any suggestions or remarks on how their organization handles this situation? Thanks!
Posted 01-26-2017 01:06 PM · 1 kudo · 24 replies · 5280 views

POST
I am on Server 10.3.1 and am experiencing the same issue when trying to download a map in Collector for offline use. All parameters and features are correct, but my ArcGIS Online username has a "." in it, and the credentials I use to access the layer (server authentication - token - with our Windows credentials) have a "-" in them. Would this be the same issue?
Posted 01-04-2017 11:41 AM · 0 kudos · 0 replies · 1086 views

POST
This makes sense. It didn't dawn on me, but I keep all my SDE connection files in the same folder. I could essentially set the full file path to that folder as the workspace, right? Alternatively, in this case I'm not sure I need to set a workspace at all. If I specify each geodatabase in the parameters of arcpy.SynchronizeChanges_management, a workspace shouldn't be necessary, since the geodatabases are essentially "hardcoded" into the script. I think so?
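As a sketch of what I mean (the folder, connection-file names, and replica name below are all hypothetical, and the actual tool parameters should be checked against your replica setup):

```python
import os

# Hypothetical folder holding all of the .sde connection files
CONNECTIONS = r"C:\GIS\Connections"

# Hypothetical connection files, one per SQL Server
PARENT_GDB = os.path.join(CONNECTIONS, "ProdServer_EDIT.sde")
CHILD_GDB = os.path.join(CONNECTIONS, "FieldServer_EDIT.sde")


def sync_replica(replica_name="MyReplica"):
    """Synchronize one two-way replica; requires an ArcGIS install."""
    import arcpy  # deferred so the module can load without ArcGIS

    # No arcpy.env.workspace needed: both geodatabases are passed explicitly.
    arcpy.SynchronizeChanges_management(
        PARENT_GDB,          # geodatabase holding the parent replica
        replica_name,        # replica name as shown under Manage Replicas
        CHILD_GDB,           # geodatabase holding the child replica
        "BOTH_DIRECTIONS",   # send and receive edits
        "IN_FAVOR_OF_GDB1",  # conflict resolution policy
        "BY_OBJECT",         # conflict definition
        "DO_NOT_RECONCILE",
    )
```

Since each call names both geodatabases in full, the script shouldn't care that they live on different SQL Servers.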
Posted 12-22-2016 10:33 AM · 0 kudos · 2 replies · 1299 views

POST
I've attached the script I am working on. It is a basic script set up to sync changes between two replicated SDE geodatabases. They are replicated fine, and a manual sync works fine, but I've been wanting to automate it. The thing I'm unsure of is: what do I set the environment to if these two databases exist on different SQL Servers? The only Python scripts I have created deal with reconciling versions or compressing an SDE, but for all of those, the environment is just a single SDE database. What do I do if the script needs to interact with two databases that are not in the same location?
Posted 12-22-2016 08:40 AM · 0 kudos · 4 replies · 2498 views

POST
We have a production EDIT SDE geodatabase with multiple QA/QC versions, with check-out replicas of those and other versions. Everything is working fine. Occasionally I need to make a schema change for a department. I usually make the change to the DEFAULT version and push the schema change out via the import/export schema change geoprocessing tools (I do this every Friday, off-hours). Lately, a main GIS user in one of our departments has been demanding admin privileges to the full EDIT database because they need to be able to make schema changes whenever they like (they really don't; they need to make them once a week at best). I've tried working with them to establish a change-management-type system where changes are made Fridays off-hours, and they are still hesitant. I told them it is really not best practice for end users to make schema changes to any version, given that they would be making them during business hours, which locks all other departments out of all versions and replicas, and that they don't know what to do after the schema change to make sure all versions and replicas continue to sync properly. Plus, no one has the ability to do anything to DEFAULT other than the gdb admin, since it is protected anyway. They're not going to get elevated privileges, because they will mess something up and it will ultimately fall on me. Curious what everyone else is doing? I think if she gives me the changes she needs, I can add them to a standard script that calls the AddValueToDomain or TableToDomain (append) tools and schedule it to run Saturday nights. What do you think?
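If I go the scripted route, it could look something like the sketch below. Everything here is hypothetical (connection path, domain names, and values), and it assumes the weekly requests are coded-value domain additions; the tool's full name is AddCodedValueToDomain:

```python
def apply_domain_changes(workspace, changes):
    """Add requested coded values to existing domains; requires ArcGIS."""
    import arcpy  # deferred so the module can load without ArcGIS

    for domain, code, description in changes:
        arcpy.AddCodedValueToDomain_management(workspace, domain, code, description)


# Hypothetical batch of requests collected from the department during the week
PENDING_CHANGES = [
    ("PipeMaterial", "HDPE", "High-density polyethylene"),
    ("PipeMaterial", "DI", "Ductile iron"),
]

if __name__ == "__main__":
    # Hypothetical connection file; schedule this to run Saturday nights
    apply_domain_changes(r"C:\GIS\Connections\ProdServer_EDIT.sde", PENDING_CHANGES)
```

The department just hands me the list of (domain, code, description) tuples each week, and the scheduled task applies them off-hours.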
Posted 12-21-2016 10:46 AM · 0 kudos · 0 replies · 1475 views

BLOG
Great article! I love the use of the three icons as containers for Applications, Open Data, and Story Maps. I'm curious, though: are these just images used as hyperlinks to take the user to the map and app gallery supplied with the Local Government model? If so, that is a really clever way of making the homepage streamlined and simple. How were you able to use images as hyperlinks? Is that common functionality, or did it take some HTML, CSS, and JavaScript work in the text box to make it all function? Great read!
Posted 11-28-2016 07:45 AM · 1 kudo · 0 replies · 6788 views

POST
I was curious if there is a way, other than copy/pasting, to import a file geodatabase into a blank SQL database. Something that would wrap up installing the SDE system tables and configuration, and import the data, in one swoop instead of SDE creation followed by copy/pasting the data. Just curious.
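The closest thing I can picture is scripting the two steps back-to-back so it behaves like one operation. A rough sketch (the instance, database, paths, and authorization file are all hypothetical placeholders):

```python
import os


def fgdb_to_sql(fgdb, instance, database, auth_file, sde_out):
    """Create an enterprise geodatabase in SQL Server, then bulk-load
    every feature class from a file geodatabase; requires ArcGIS."""
    import arcpy  # deferred so the module can load without ArcGIS

    # Step 1: install the SDE system tables into the SQL Server database
    arcpy.CreateEnterpriseGeodatabase_management(
        "SQL_SERVER", instance, database,
        "OPERATING_SYSTEM_AUTH",
        authorization_file=auth_file,
    )

    # Step 2: create a connection file and copy the feature classes across
    arcpy.CreateDatabaseConnection_management(
        os.path.dirname(sde_out), os.path.basename(sde_out),
        "SQL_SERVER", instance, "OPERATING_SYSTEM_AUTH",
        database=database,
    )
    arcpy.env.workspace = fgdb
    arcpy.FeatureClassToGeodatabase_conversion(
        arcpy.ListFeatureClasses(), sde_out
    )
```

It's still two tools under the hood, but wrapped in one function it reads like a single import.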
Posted 11-23-2016 08:18 AM · 0 kudos · 3 replies · 6626 views

POST
Situation: we are using editor tracking with our Windows domain Active Directory. It works just fine for edits made by users in ArcGIS Desktop. Problem: when field users use Collector with feature services coming from our ArcGIS for Server, Collector does not pass credentials for editor tracking; we get "ArcGIS_Anonymous" or whatever it is. I understand this is an inherent issue. So what are the workarounds? Hosting the feature layers in ESRI's cloud? That's fine, it will keep track of editors, albeit with their AGOL usernames rather than our Active Directory accounts, but it will still give an indication of who edited what. The question: is there a way to host those layers in ESRI's cloud while maintaining a replicated copy locally? It seems like a pain to have to continuously download the data to get a local copy, which quickly becomes out of date and requires another download, and then another, etc. It just seems easier to host the data in the cloud and have a replicated version sitting locally, so data can be reconciled to ESRI's cloud and vice versa. The whole point is to maintain editor tracking from field users while avoiding having to download copies from AGOL all the time. Would the ArcGIS Online Collector app pass ArcGIS for Server authentication?
Posted 11-17-2016 01:44 PM · 2 kudos · 4 replies · 5809 views

POST
Joe, thanks for the solace. How did you go about handling your situation? Just adding additional fields? I think LGM and its accompanying map and app templates can still function when additional fields are added. Things start to break when names of feature classes are changed, or when fields that are queried are deleted. The addition of new fields shouldn't screw things up too badly. I don't think, at least?
Posted 11-17-2016 12:16 PM · 0 kudos · 1 reply · 1510 views
| Title | Kudos | Posted |
|---|---|---|
| | 1 | 11-07-2016 02:13 PM |
| | 1 | 06-02-2017 05:23 AM |
| | 1 | 01-26-2017 01:06 PM |
| | 1 | 01-27-2017 09:30 AM |
| | 2 | 11-17-2016 01:44 PM |
Online Status: Offline
Date Last Visited: 11-11-2020 02:24 AM