IDEA
Esri, this basic function, the capacity to auto-name photos from application settings, is found throughout the industry. There is an overriding theory at Esri that attachments are the only way to link photos to feature attributes. We get that, but even with the new capability in 3.3, we as professionals want better photo handling, and that starts with what seems simple to us in this forum: building names from a project name plus a YYYYMMDDHHMMSS timestamp to keep data independent. Take a Trimble surveyor workflow in Access: a job within a project, plus a code. I can take a photo tagged to the feature before it as KATMUtilitySurvey_FireHydrant_20240621143533.jpg. Like Trimble Access, the programmatic capacity inside any of your apps could extract (pull) any of these attributes and, at photo storage time, assign the name from the app on the device. We think this is a good idea. Even better, give us as GIS professionals the choice of how to string attributes into a photo name. We would love to see movement on this.
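The naming scheme being asked for fits in a few lines. A minimal sketch, assuming the app can expose the relevant attributes; `photo_name` and its parameters are hypothetical, not any existing Esri or Trimble API:

```python
from datetime import datetime

def photo_name(project, feature_code, when=None, ext="jpg"):
    """Compose a photo file name from feature attributes plus a timestamp.

    'project' and 'feature_code' stand in for whatever attribute values
    the form exposes; the YYYYMMDDHHMMSS stamp keeps names unique.
    """
    stamp = (when or datetime.now()).strftime("%Y%m%d%H%M%S")
    return f"{project}_{feature_code}_{stamp}.{ext}"

name = photo_name("KATMUtilitySurvey", "FireHydrant",
                  datetime(2024, 6, 21, 14, 35, 33))
# name == "KATMUtilitySurvey_FireHydrant_20240621143533.jpg"
```

Letting users reorder or drop the pieces (the "choose how to string attributes" request) would just mean making the list of components configurable.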
06-20-2024 07:04 PM | 0 | 0 | 119

BLOG
Regarding the comment above from GISAdminSHN: "The article puts a lot of emphasis on documenting and displaying this kind of information, but once inside the AGOL system it gets pretty buried, forcing a user to dig into REST links, click admin, click JSON, squint at the tiny code and try to dig out the details. This experience could be vastly improved! While we are on the subject, online maps having their spatial reference controlled by the basemap layer needs to die already. Your users want to publish and maintain maps in their preferred coordinate system AND use your fantastic AGOL basemap collection. And they should be able to. Easily. Just do it already (pretty please <3)". This belongs on Ideas: AGOL buries the PCS/GCS, and the mere fact that the basemap controls the spatial reference has to go away. There is no reason that augmented or autonomous data fed into a hosted feature layer has to be shifted, sometimes twice, up and down through the cycle just because of the controlling nature of the basemap's PCS definition. Currently this is crushing people who assume Survey123/QuickCapture handles GCS the way Field Maps does. Datum handling needs to be fixed BEFORE 2025, or there is not a hope in heaven that folks can do the right thing when NAD83/NAVD88 is abandoned. One thing a GIS professional can do is set control at their location using OPUS Share; only then can one fight the datum shifts found throughout our industry. If a person submits that measurement, not only can one determine shifts today, but after 2025 that coordinate will move and one can verify the future. "Projections are easy. Datums are hard," and it will get worse before it gets better (part of a quote from Michael Dennis, NGS geodesist).
06-10-2024 07:57 PM | 1 | 0 | 97

POST
mfarfad, did you solve this, or get assistance? I get this occasionally: features with exactly the same geometry and the same GNSS timestamp stored in the feature service. I would think that the first second you press the button "seals" the storage time, so that after entering, say, a project value and a photo, you have stored a unique feature on the device. I use AutoSend OFF, return to the office from the field, and submit my features once on Wi-Fi. That process should never duplicate positions. Your issue may be that you have AutoSend ON, but duplicate geometries with the same timestamp leave me stumped as well. My tests used Android 13 on a Samsung Tab3, a Trimble R1/R12i with mock location on, an NTRIP connection via TMM (latest version), and the Internal provider set in QC.
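One way to confirm the symptom before chasing the cause is a client-side duplicate check: flag any records that share both an identical geometry and an identical capture timestamp. A minimal sketch; the field names ("geometry", "captured") and sample coordinates are assumptions for illustration, not a feature service schema:

```python
from collections import Counter

def find_duplicates(records):
    """Return (geometry, timestamp) pairs that occur more than once."""
    counts = Counter((r["geometry"], r["captured"]) for r in records)
    return [key for key, n in counts.items() if n > 1]

rows = [
    {"geometry": (-134.41972, 58.30194), "captured": "2024-03-30T18:22:05Z"},
    {"geometry": (-134.41972, 58.30194), "captured": "2024-03-30T18:22:05Z"},
    {"geometry": (-134.42010, 58.30201), "captured": "2024-03-30T18:23:11Z"},
]
dupes = find_duplicates(rows)
# dupes holds the one repeated (geometry, timestamp) pair
```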
03-31-2024 09:41 AM | 0 | 0 | 180

IDEA
Segmenting line features in FM would be huge. Yes, TerraSync can do this with two taps (Option/Segment). It would make sense if QC could do this too, as mentioned, carrying the previous segment's attribution forward into the next segment while allowing changes to one or more attributes. Couldn't this be simple? Build into the interface the knowledge in FM that the feature is a line, and provide a Segment button (not Sync). Fundamental to GIS linework is change at vertices, and there is no better way to directly observe those changes than when mapping in the field (Field Maps), right?
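The segment-at-vertex behavior described above can be sketched as splitting a polyline at a chosen vertex and copying the prior segment's attributes forward, with optional overrides. Names here are illustrative only, not a Field Maps or QuickCapture API:

```python
def segment_line(vertices, split_index, attributes, overrides=None):
    """Split a polyline at vertex 'split_index', carrying attributes forward.

    The split vertex is shared: it ends the first segment and starts the
    second. The new segment inherits all attributes, with any overrides
    applied on top.
    """
    first = vertices[: split_index + 1]
    second = vertices[split_index:]
    carried = {**attributes, **(overrides or {})}
    return (first, attributes), (second, carried)

line = [(0, 0), (10, 0), (10, 10), (20, 10)]
attrs = {"surface": "gravel", "width_m": 4}
(seg1, a1), (seg2, a2) = segment_line(line, 2, attrs, {"surface": "paved"})
# seg2 keeps width_m=4 from the prior segment but now has surface="paved"
```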
07-04-2023 11:07 AM | 0 | 0 | 305

POST
Colin, correct me if I'm wrong, but the basic premise is that Field Maps must have a basemap for any data collection to occur in the field, with or without real-time access to an NTRIP stream. NTRIP provides second-by-second real-time positions that are stored in the map. Intermittent cellular broadband connectivity is the Achilles heel of field collection. Therefore, going into the field without an offline map prepared ahead of time means that losing NTRIP also means losing the map, and data collection ceases. Going into the field with an offline map allows Greg to continue to store positions offline (though without the precision of NTRIP, and with the resulting storage of positions in two different datums, an issue that can only be corrected in a post-processing environment). No offline map = no data collected when internet in the field breaks down.
07-04-2023 09:15 AM | 0 | 1 | 1511

IDEA
Like PhilippNagel1, we also work across large geographic regions. Depending on the locality, we have either WAAS (SBAS) augmentation at submeter, which is of course in ITRF2014, or single-base RTK solutions based on NAD83 (2011) epoch 2010. This datum switch is currently maintained on each device by each user via two location profiles, both of which also depend on the map coordinate system (which can come in two "types", WGS84 and NAD83 (2011)). Right there we have a four-way permutation, rife with errors for field personnel. So controlling at least a portion of the location profile (the incoming datum stream) is crucial, and could eliminate the number-one error in mobile GIS: datum mismatch.
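The two-profiles-by-region setup above can be expressed as a small lookup that an app could apply automatically instead of asking each field user to switch profiles by hand, plus a check for the stream-versus-map mismatch. Profile names, fields, and the selection rule are illustrative assumptions, not any existing location-profile API:

```python
# Hypothetical location profiles for the two augmentation regimes described.
PROFILES = {
    "wide-area": {"augmentation": "WAAS (SBAS)", "datum": "ITRF2014"},
    "rtk":       {"augmentation": "single-base RTK", "datum": "NAD83 (2011)"},
}

def profile_for(has_base_station):
    """Pick a profile by locality: RTK where a base is available, else SBAS."""
    return PROFILES["rtk" if has_base_station else "wide-area"]

def datums_match(profile, map_datum):
    """Flag the number-one error: incoming stream datum vs. map datum."""
    return profile["datum"] == map_datum

p = profile_for(has_base_station=False)
ok = datums_match(p, "NAD83 (2011)")
# ok is False: a WAAS/ITRF2014 stream feeding a NAD83 (2011) map is the
# exact mismatch the four-way permutation invites.
```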
11-12-2022 12:06 PM | 0 | 0 | 2330

POST
May I suggest the ultimate way to solve this dilemma; otherwise it's called Whack-a-Mole. Unless you occupy survey control that is in the datum you seek to be tied to, you will chase this till the cows come home. Surveyors, the expert measurers, will tell a GIS person to check in. This means seeing what effect your system, software, incoming augmentation, and all the datum gates (Pro/ArcGIS/map/feature service/location profile/quality of GPS) have on becoming "accurate". Check in on a nearby benchmark using OPUS Shared (Google it) or the NGS data explorer (Google it). Find a point that reports its coordinates in, I'm guessing in your situation, NAD83 (2011). Set up on that point with your system, do what you think is best, and see how close you get to that point using best practices. Look up the specs of the device: if it says 20 cm or subfoot, you should get there. If you're not, then one or more of the datum gates you selected is wrong. Good luck.
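The "check in" is just arithmetic once you have a published control coordinate: compare what your system reports against the benchmark and see whether the offset is inside the device spec. The coordinates and the 20 cm spec below are invented numbers for illustration; a real check would use projected (easting/northing) coordinates in the target datum, as here, not raw lat/long:

```python
import math

def horizontal_offset_m(measured_en, control_en):
    """Horizontal distance in metres between measured and published E/N."""
    de = measured_en[0] - control_en[0]
    dn = measured_en[1] - control_en[1]
    return math.hypot(de, dn)

control = (521134.220, 6461217.905)   # published NAD83 (2011) E/N, hypothetical
measured = (521134.310, 6461217.850)  # what the receiver stored on the point

off = horizontal_offset_m(measured, control)
within_spec = off <= 0.20             # e.g. a 20 cm device spec
# off is about 0.105 m here, so this (made-up) check-in passes; a result of
# a metre or more would point at a wrong datum gate, not the receiver.
```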
01-16-2021 10:55 PM | 0 | 0 | 661

POST
Kyle, okay, you have got me hooked. You state in the original thread (last sentence of the first paragraph): "All of my GIS map documents and data are in NAD83(1986)." What is the source of these original map documents and data? Let's start there. Deploying modern GPS receivers bound to the latest realization of WGS84, and selecting "correct realtime" in PFO, means you are assuming the latest reference frame, which starts the game of shifts. Let's start over: what is your data really "in" that you need to get back to? Thanks, Joel
01-08-2020 08:40 PM | 0 | 1 | 3000

POST
Kyle, you're not kidding, it gets thick. Frances gives the guidance, while I'll take a stab at your original workflow, i.e., using a Geo7X and post-processing against a CORS station. This is my expertise, and I have what I think is a workflow that won't get you to 1986 (original NAD83), but will hold a CORS published position through the workflow when collecting positions with that bomber system you have. Assumptions: you are using TerraSync; you are using Trimble Pathfinder Office 5.4 or higher (ideally, today, both TS and PFO at version 5.85/5.86 or 5.9, which ensures the differential correction wizard is optimized); and you are using a CORS station, not a UNAVCO or local base station.

Okay, Step 3 above has to be explained further, and I'll take the opposite tack: holding "NAD83" is paramount to getting data into NAD83 (in its highest form). I worked with Trimble years ago on the issue of "Use reference position from base provider" vs. "Use reference position from base files". Trimble has not corrected this error and continues to propagate a NON-published position, hindcasting to a version of non-NAD83 that, when converted to NAD83, is going to be wrong. Understand that the former ("provider") is NOT the published position of a CORS station, but a calculated shift that frankly should never be used; it was intended by Trimble as a stop-gap shifter when the CORS96 datum moved to a new reference frame in 2002. NGS would state: always use the published position for a CORS. I agree, and hence, when using that Geo7X, the best thing you can do (the right thing) is use the modern, published position of the ARP for that antenna as certified by NGS. That position is what differential correction "holds": the CORS is held fixed, and your rover's "difference" is calculated between the coordinates you stored in TerraSync and what the CORS stored during the same epoch at the base.

You should always use the latter option, "Use reference position from base file". Today, that is NAD83 (2011) epoch 2010.0. You get that coordinate from the seeded RINEX file; CORS elegantly stores it in the RINEX header of the zip file you retrieve. Run the PFO wizard, pause at the confirmation step, call up the CORS station's Position and Velocity page, and open the ARP coordinates for that station. Scroll to NAD83 (2011) and compare the coordinates in the confirmation box with CORS: you will see nearly exact coordinates (any difference is usually out at the fifth decimal of a second, way under the resolution of your GPS). Proceed through differential correction, and your COR file is now "in" NAD83 (2011) epoch 2010.0. By doing this, you skip around what Frances says about being "in 1997". Your COR file is now at today's epoch (the day you collected the data) while holding NAD83 (2011) at epoch 2010.0. You do have nine years between the epoch and your time of collection, but that is the premise of NAD83 (holding at an epoch). I know this doesn't get you to NAD83 (1986), but you have to exclusively use "Use reference position from base file" to use the published coordinate of a CORS. That is the key to putting GPS data collected today into the most current form of NAD83. This is a serious datum gate many users skip past, and Trimble has corrected neither the naming convention nor the shifting behavior. Since this issue started in 2002, we are now two reference frames past that construct, now in ITRF2014. Here is a link to the coordinates for a CORS in AK; do this for the CORS you are using. You don't see ITRF97, do you? https://www.ngs.noaa.gov/cgi-cors/CorsSidebarSelect.prl?site=tsea&option=Coordinates

Once you have done these steps, use the HTDP tool. I would take the top choice, hindcasting your coordinate back to 1986. That would be as close as you can get to assigning "plain vanilla" NAD83. Then shift that coordinate from NAD83 to WGS84 using a bookkeeping Molodensky transform like WGS_1984_To_NAD_1983_1. Joel
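The confirmation-step comparison above is simple arithmetic. A sketch of the check, with invented coordinates; this is not a PFO API, just the comparison a user does by eye between the wizard's confirmation box and the CORS Position and Velocity page:

```python
def seconds_difference(a_dms, b_dms):
    """Difference in arc-seconds between two (degrees, minutes, seconds) tuples."""
    to_sec = lambda d, m, s: d * 3600 + m * 60 + s
    return abs(to_sec(*a_dms) - to_sec(*b_dms))

# Hypothetical latitude of a CORS ARP: NGS-published vs. wizard-confirmed.
published = (58, 18, 7.00001)
confirmed = (58, 18, 7.00003)

diff = seconds_difference(published, confirmed)
ok = diff < 0.001   # agreement well under the resolution of the receiver
# A gap only at the fifth decimal of a second, as here, confirms the wizard
# is holding the published NAD83 (2011) position from the base file.
```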
12-29-2019 10:00 AM | 3 | 1 | 3000

POST
Hi Keith, bookkeeping transformations (ones with null values for parameters) are used quite often. We did not have HARN up here, but for states with HARN there is extra pain involved. From doing the kind of work you're doing, here are some suggestions after being in the trenches more times than I can count. 1) Ignore accuracies (the way Esri puts a transformation on top). Often, concatenated transforms are put on top, and I tell people NOT to do double transforms unless they have thoroughly solved one at a time; keep to single transforms, and carefully jump from one to another. It is insanely complex to combine transforms in one step. 1a) Get comfortable with data frame on-the-fly shifts and their effect on data assigned different projections. That can really sting when you're shifting data, say in ArcCatalog, and the data frame is shifting too. 2) Get onto survey control (modern NAD83 (2011)). Conduct tests, if you're using GPS, on a bullseye, because once you figure out what is "moving", you're going to face the real question: which one do I use? Finally, find a surveyor near you who knows all about HARN; he or she will have sage advice. You're down in the weeds, man, where GIS is just mapping, what surveyors would call "guessing". Joel
10-16-2019 01:14 PM | 1 | 0 | 1814

POST
I fear to tread on this topic, but from sitting in so many NGS presentations on it, my understanding is that the phrase above, "geoid12b should be used with NAD83 (2011)", would be better written as "GEOID12B must be used with NAD83 (2011)". The datum tag and epoch of NAD83 (2011) (epoch 2010.0) are tied to the orthometric height H via GEOID12A/12B. I'm not a surveyor, but I have heard that these two MUST be associated with each other; that is, of course, if the data stored in this table was derived from, say, an OPUS solution, or post-processed with (or RTN-streamed from) a service bound to NAD83 (2011). Again, I'm not a surveyor, nor am I a geodesist. I'm merely a biologist by training who tries to tie all my federal data to the NSRS. Joel
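The reason the pairing matters is one line of arithmetic: the orthometric height H comes from the ellipsoid height h minus the geoid undulation N from the geoid model (H = h - N), so an NAD83 (2011) ellipsoid height must be paired with the matching geoid model. The numbers below are invented for illustration:

```python
def orthometric_height(h_ellipsoid_m, geoid_undulation_m):
    """H = h - N: orthometric height from ellipsoid height and geoid undulation."""
    return h_ellipsoid_m - geoid_undulation_m

h = 25.432   # ellipsoid height from a NAD83 (2011) solution (hypothetical)
N = 8.114    # GEOID12B undulation at the same point (hypothetical)
H = orthometric_height(h, N)
# H is 17.318 m; interpolating N from a geoid model tied to a different
# datum realization would bias H by the datum mismatch.
```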
03-13-2019 12:20 PM | 4 | 0 | 990

POST
John, I'm pretty familiar with PFO/TBC and have both down cold, depending on what you want your final export to be "in". Then there is the issue of what your input datum is and how you are correcting the data. Your screenshots from PFO are absolutely correct; TBC has exactly the same ones. But, and I'll emphasize it again, there is no way around testing on control set to the output you want to be "in". I've trained hundreds, and every student by the third day doesn't get "it" until they plop their data onto a point in ArcMap and find they are 1.5 meters away while the other students are 10 cm away. You're getting bitten by the most common confusion in GIS today. Thank god we got Melita, but you can imagine her inbox from all over the world. Your original question is specific to handling in TBC and PFO, so I don't even want to guess why you want to be in "WGS84". If you're in State Plane, you're obviously in the US, so most commonly you want to be in NAD83, and since you're using top-dollar GPS devices, you most specifically want to be "in" NAD83 (2011) epoch 2010.0. That's what you get when you run OPUS, or PFO with a CORS base station using the reference position from the base file. In each software you have two datum gates you must pass through, and thirdly there is what you set in ArcMap; that is a 3x3 matrix, or nine possible combinations. You can't do this correctly without TRUTH. You must calibrate your GPS work; accuracy means doing the right thing with the datum, and you can't evaluate accuracy without comparing to truth. It is so easy these days to set a nail, occupy it with GPS for two hours, and compare. Oh, and on your last comment about KMZ production: nothing in Google Earth is ever to be trusted. I wouldn't export data from $2,000 software just to be compliant with GE. Export to NAD83 (2011) and let others suffer with any supposed shifts in GE. Just some thoughts. Joel
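The 3x3 matrix mentioned above can be enumerated directly: one datum setting chosen in the processing software against one chosen in ArcMap, with three plausible options at each. The option names are illustrative; the point is that most cells of the matrix quietly mix datums:

```python
from itertools import product

# Three datum choices a user might face at each gate (illustrative labels).
choices = ["WGS84", "NAD83 (1986)", "NAD83 (2011)"]

# Processing-software setting x ArcMap setting: the 3x3 matrix.
combos = list(product(choices, choices))
mismatched = [c for c in combos if c[0] != c[1]]
# 9 combinations in the 3x3 matrix; 6 of them mix datums, which is exactly
# the 1.5 m surprise a control-point test exposes.
```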
12-11-2018 02:39 PM | 0 | 0 | 1482

POST
May I make a suggestion for times like these. Much like our tidal flats up here in Alaska, the more you struggle the deeper you get sucked into a place that is very hard to climb out of. Here is the way out: occupy high-quality survey control. You mentioned TBC, so you likely have survey-grade GPS. Drop a nail outside your door, occupy it for two or four hours, run OPUS, and when you have that sheet, occupy the same point with any GPS. In ArcGIS, start a fresh data frame in your reference frame. Add to that frame an unadulterated coordinate from OPUS (you have lat/long in both reference frames, and UTM/State Plane in NAD83 (2011)). Then drop your exported data into that frame and quietly test your on-the-fly transformations. You will know very quickly what is biting you and what is not. Joel
12-10-2018 05:33 PM | 3 | 1 | 7876

POST
Henry, may I chime in on the first layer of the onion that I see in this original question. I'm experienced in Trimble workflows, including Pathfinder Office, and have fought my way out of the datum tidal flats more times than I can mention. So, if it's not too late, or too far back, I have two comments on the precise terms you used when describing what you're doing in PFO. If the PFO output (the shapefiles you mention) is considered "truth", then I have a suggestion for doing the right thing when it comes to "holding" a CORS solution, so that when you get your PFO shapefile, you know you did everything you could prior to loading into ArcGIS. True accuracy is assessed only when you occupy truth better than your system: a modern, OPUS-observed benchmark. Try to find one nearby using the NGS data explorer or OPUS Shared solutions; you'll see the NAD83 (2011) coordinate (which I assume you are trying to "get to") posted. That's truth. The second level of truth is your correction settings. In PFO 5.2 or later, pick the closest CORS station. Next, when stepping through the differential correction wizard, your two radio-button choices let you either hold what Trimble has produced ("Use reference position from base provider") or "Use reference position from base files". The former is NOT a published CORS coordinate, and I never use it. The second option, "base files", is the actual NAD83 (2011) coordinate embedded in the RINEX file by NGS. Use that option at all times. Then the COR file, at the conclusion of differential correction, is in NAD83 (2011). On export, you do not shift: you use a Trimble "bookkeeping" shifter (NAD83 (CONUS)) which does nothing. Define your shapefile with the correct PRJ (provided by Esri, with the 2011 datum tag), and your exported shapefile is in the highest-quality form. I never use Esri tools to shift data coming from PFO. Why? Because you can harvest data directly from the CORS published position; that is the best approach in my view.

So, this is going to give different results (likely about 90 cm, depending on where you are in the world) relative to the WGS84 output you were exporting before, but you're ready to load with no shifting in GIS. Good luck; I hope this helped, or thoroughly confused you. Just know that Trimble has a mistake in the CBS list which is often overlooked, and one path down PFO will lead you to holding not the IGS08 position for a CORS but, in reality, the ITRF00 (1997) position.
03-15-2018 05:43 PM | 0 | 0 | 2630
Title | Kudos | Posted
---|---|---
 | 1 | 06-10-2024 07:57 PM
 | 1 | 04-11-2016 03:24 PM
 | 1 | 10-16-2019 01:14 PM
 | 1 | 01-13-2016 03:39 PM
 | 3 | 12-29-2019 10:00 AM