Best Practices for ArcPro Projects

04-01-2021 01:08 PM
RobertStevens
Occasional Contributor III

About two years ago I first attempted to migrate from ArcMap to ArcGIS Pro. I had about 24 different maps, and I put them all into one Pro project.


The consequence was that Pro took forever to start and sometimes crashed. According to Esri, when Pro starts it verifies, or at least attempts to access, every dataset in every map, whether or not that map is the one currently open in Pro. Not good.

I am once again going to try to migrate to Pro.

So, my first question is: what is considered best practice for a project? How many maps is it reasonable to contain in one project?

Second question: all my maps use a common file geodatabase. Is it considered wise to move that geodatabase into the project, or is it best left where it is now? If I move it, I will have to change the paths of all the layers in my maps that use data from that one file geodatabase.


Thanks

Rob Stevens

8 Replies
jcarlson
MVP Esteemed Contributor

Regarding the second question, you'll get a bunch of "Repair Data Source" warnings, but if the layers all come from the same database, fixing one automatically fixes the others.

There's no real need to move the database into the project folder if it's more convenient to leave it where it is now. The only difference you might notice is if the FGDB is on a network drive somewhere instead of local. If it's just one local directory versus another, do whatever you personally prefer.

You can just add the database to your Catalog pane, or even add it to your Favorites so that it's easily accessible in any project.
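If you do decide to move the geodatabase, the repair can also be scripted rather than clicked through layer by layer. A minimal sketch using the `arcpy.mp` module's `ArcGISProject.updateConnectionProperties` method (all paths here are placeholders, and this only runs inside ArcGIS Pro's Python environment):

```python
import arcpy

# Hypothetical paths -- substitute your own project and geodatabase locations.
aprx = arcpy.mp.ArcGISProject(r"C:\GIS\MyProject\MyProject.aprx")

# Repoint every layer and table that references the old FGDB
# to the copy in the new location, in one call.
aprx.updateConnectionProperties(
    r"C:\OldLocation\Common.gdb",
    r"C:\GIS\MyProject\Common.gdb",
)

aprx.save()
```

Because the call operates on the whole project, every map in the .aprx gets repaired at once, which matches the "fix one, fix all" behavior of the Repair Data Source dialog.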

- Josh Carlson
Kendall County GIS
JoeBorgione
MVP Emeritus

Best practices can be subjective. One practice I do like to follow, though, is staying up to date with the latest version of ArcGIS Pro; I understand that version 2.8 will be out in May, and the focus of this release is performance, especially with large projects. The number tossed around to define "large" was 20-30 maps. Personally, 20 to 30 maps in a single project seems gigantic, but I'm more of a database guy than a cartographer. (Hence the subjectivity of best practices...)

With respect to the use of one file geodatabase, you may get several different perspectives on that as well. How big is your organization, and how large is your spatial data collection? How many people access the data in that one FGDB, and how many of them edit any of it? How do you manage that one FGDB? Is it on a local C drive or a shared network drive? Do you back it up regularly?

The one best practice I'm sure everyone can agree on is to store your data such that it gets backed up on a regular basis. Don't tempt fate!

As you know, when you create a project in Pro, a file geodatabase is created as well. If you are going to create data specific to that project, from clipping, appending, and so on, I would say use that to your advantage. However, if your single FGDB is the source of truth, leave it as is; the last thing you want is copies of authoritative data floating about in different databases. It's bad practice to store the same data again and again, and you'll lose track of what is really authoritative in no time.


That should just about do it....
JustinJohnston
Occasional Contributor II

I am interested in this question, but for different reasons. I only know ArcGIS Pro, since I never used ArcMap. It is not optimized for the way I need to use it, and I think that often results in extremely slow performance.

The biggest problem I find is that, as a consultant with many clients and many different projects for each client, juggling data to and from multiple sources requires me to have multiple GDBs in each project. For simple projects, or projects where my company is in charge of 100% of the data in and out, a single GDB works. I have heard of governments having a single GDB shared by many staff, groups, and organizations within their government realm, and that just seems crazy to me; but if it is mainly a collection of relatively stable reference information, it probably works well. Constantly changing, dynamic data going in and out doesn't seem to work well with a single GDB, so I end up with lots of them on a single complex project just to keep all the information straight.

Versioning within a single GDB results in a mess of feature classes to sort through to find what you are looking for, even if you have good naming procedures. I am not sure whether this is a good practice, much less a best practice. It seems to me to depend on the data.

The other issue I have is that project data needs to be independent for each project and saved with all of the data for that project. I don't mean just all the GIS data; I mean ALL of the data for a project needs to be saved in that project's folder. Say, for instance, I am working with a client to design, plan, and permit a wind farm. The project is the wind farm. I am going to have all sorts of document files for permit applications, studies, wetland delineations, utility information, civil design from CAD, and maps from GIS. It is all interrelated, and in the end it is all owned by the client.

My company uses SharePoint to save ALL of the data for each project, so there is a SharePoint site, with a GIS folder on that site where all the GIS data for the project is housed. This is the biggest failure of ArcGIS Pro. It is so dead set against saving files to SharePoint or other "network" or "cloud" drives that it won't even let you browse to them. You have to go into the folder in Windows Explorer, copy the address from the address bar, then paste it into the address bar of ArcGIS Pro's open and save dialogs just to open or save anything in the SharePoint folder.

To make things worse, it has real issues trying to sync files. Files take forever to open and close, and if you don't close out of ArcGIS Pro often, especially before your computer enters sleep mode, it won't sync at all and you have to open the file from the backup on your computer. Not a huge deal, unless a different user needs to work on the same project and your changes didn't sync to SharePoint. Once another user opens the never-synced file from SharePoint, you have an issue I don't even know how to resolve, because there are now two different updated files that conflict. You just need to pick which version survives, I guess.
I get that ESRI wants us all to be using their cloud service, but that just isn't workable for consultants in the AEC world, where data needs to stay with the project as a whole, not just the "GIS project".

Cheers,

Justin

JoeBorgione
MVP Emeritus

I'm not so sure that ArcGIS Pro is the only software that is limited by SharePoint.  It sounds like you have a very diverse business model with respect to data.  Could publishing data services be an option?  Have you looked into branch versioning over traditional?

That should just about do it....
JustinJohnston
Occasional Contributor II

Other software has difficulty with SharePoint, yes, but ArcGIS Pro (and ArcMap as well, I suspect) seems particularly problematic. I personally know nothing about publishing data services or branch versioning. Is there a good primer on those that you can point me to?

Cheers,

Justin

Mtwitchell
New Contributor

Hi Justin,

You mentioned quite a few issues that I also have with the company I work for. We have multiple GIS users across a geographically dispersed environment, and we have to keep all of the data together with the related files you touched on; for us it's CEQA permits, contracts, Excel sheets, log tracking, and maps with multiple data display/layout options, all of which keep changing as timber harvests commence. I was wondering how your company has things set up: are you working from a remote desktop, or are you running ArcGIS Pro on a local machine and just making sure to migrate/back up the data often? I'm trying to figure out some good practices here to make things smoother and less sketchy when it comes to possible data loss when it's not being backed up every night. Any info is helpful; thank you for your post!

-Mandy

marymekk
New Contributor

Hi Justin, I worked in a private firm with similar project-folder and data-storage requirements and an overall GDB for general reference data. I'm not sure what the technical solution is for working with spatial data in SharePoint, since it is your company policy to store all project data there; and I agree that no data should ever reside (however temporarily) on your local drives.

If a technical solution cannot be achieved with the help of your IT or ESRI, the practical solution might be to implement a company spatial-data workflow policy that lets you work efficiently while meeting the company's data-integrity and SharePoint storage requirements. Find a network location that is backed up that you can work from, and have project data products uploaded/updated to SharePoint on a daily or weekly basis, or (preferably) at the completion of the project as final deliverables. The former isn't ideal and results in duplication of data, something the workflow must address so it doesn't lead to losing track of authoritative data (as JoeBorgione mentioned); the latter avoids this issue.

There is nothing more frustrating for analysts, project managers, and clients than to see the project budget blown due to efficiency issues with hardware and/or software. Tasks that should be completed in 30 minutes do not need to take 1 or 1.5 hours, particularly when your time is billable to the project or otherwise placed on company overhead. It boils down to time efficiency: you can spend a lot of it waiting on routine tasks to complete and getting burned out, or a fraction of that time implementing an alternate workflow. Hope this helps. I'm new to SharePoint as well, but only use it as a DMS.

M.
