Yes please. It's so dumb that we can't do this anymore. I have a Python script that grabs attributes from one layer and updates the attribute tables of a bunch of other layers. I have this script in a geoprocessing tool and it runs as a task. It all works, but it requires the user to go to List By Data Source, find the right data connection (we have data from an enterprise geodatabase version, a different enterprise geodatabase, portal services and image services - so heaps of data source connections) and then right-click and choose Refresh.
We are using tasks because most of these people aren't GIS people. GIS is a small part of their job and this palaver is confusing - even though we have attempted to explain how to do it as part of the task.
The major problem with this is that the next task looks for attributes that were updated in the previous task, but since ArcGIS Pro refuses to acknowledge that these have actually been changed, the next task doesn't work unless the above palaver is executed. So frustrating.
From everything I've read, Pro is supposed to update itself when things change if the script is run from inside ArcGIS Pro. I'm not referencing the data directly in our enterprise geodatabase from the script; I'm finding and using the layers from the Pro project itself, and I thought that would allow Pro to know something had changed and refresh itself, but it doesn't.
I can't even put it in as a hidden task, as I found suggested... somewhere on the interweb... that simply doesn't work.
This is a major annoyance. Please can we get this sorted soon?
@Clubdebambos @Michele_Hosking I want to clarify that you are asking specifically for a method to refresh the contents of the Catalog Pane rather than the Contents Pane/Map View/Layer.
To clarify, arcpy.RefreshLayer will cause a layer in the map view (present in the Contents Pane) and its table to update if you have made changes to it. For example, you have a layer symbolized a certain way based on the values in a particular field, and you change some of those values with an arcpy.da.UpdateCursor. Calling arcpy.RefreshLayer(<layer name>) will update the layer displayed in the map view and table to reflect the new values.
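As a rough sketch of that pattern (the layer name "Parcels" and field "Status" are just placeholders; it assumes ArcGIS Pro 3.3 or later and the script running inside Pro with a map view active):

import arcpy

# Edit a field on a layer in the active map, then ask Pro to redraw it
aprx = arcpy.mp.ArcGISProject("CURRENT")
lyr = aprx.activeMap.listLayers("Parcels")[0]

with arcpy.da.UpdateCursor(lyr, ["Status"]) as cursor:
    for row in cursor:
        row[0] = "Reviewed"
        cursor.updateRow(row)

# Refresh the layer in the map view and its open attribute table
arcpy.RefreshLayer("Parcels")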
arcpy.PauseDrawing will pause updating the map view. For example, you are repeatedly running a tool in Python Window or a Notebook in Pro that repeatedly produces an output and adds it to the Map View, causing a lot of rapidly succeeding refreshes of the map and slowing down the overall operation. Using arcpy.PauseDrawing will defer the refresh until after all the outputs have been produced.
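For example, something along these lines (assuming PauseDrawing is used as a context manager; the Buffer call and output names are placeholders):

import arcpy

# Hold off map redraws until the whole block finishes, then draw once
with arcpy.PauseDrawing():
    for i in range(10):
        # Each output is added to the active map, but no redraw happens yet
        arcpy.analysis.Buffer("Parcels", r"memory\buf_" + str(i), "{0} Meters".format(i + 1))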
Neither of these functions affect the contents of Catalog Pane. Is your request specifically regarding the Catalog Pane, or do the above methods provide what you are looking for?
@HannesZiegler - What I believe this original posting is asking for is an ArcPy equivalent to right-clicking a folder or file geodatabase within the Catalog pane and clicking Refresh. In many scenarios I don't need to load feature class output as layers into the map, so RefreshLayer() would not apply to those scenarios.
Example: if A) I'm actively connected to my Pro project's default file geodatabase (e.g. Counties.gdb) through the Catalog pane, and B) I use a geoprocessing tool within my custom Python toolbox that saves the feature class output to my Pro project's default geodatabase, then C) the new output will not appear in that file geodatabase until I manually right-click it and choose Refresh within the Catalog pane. Does that clear things up? It seems like such a trivial piece of ArcPy functionality, and it existed in ArcMap as RefreshCatalog(). We'd assume there'd be a RefreshCatalogPane() or something like that.
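For reference, the ArcMap-era call I'm describing looked roughly like this (the path is just an example):

import arcpy

# ArcMap 10.x only - refresh the Catalog window's view of a folder or geodatabase
arcpy.RefreshCatalog(r"C:\GIS\Counties.gdb")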
Pretty much as @anonymous_geographer stated above. A manual refresh is required to see the newly created feature class.
With ArcGIS Pro, when I have used ArcPy to alter a feature class - say, the schema - a manual refresh doesn't reflect the change; I have to close down and reopen Pro to see the change take effect. This was never the case with ArcMap and the RefreshCatalog() function.
To answer your question, this request relates specifically to the Catalog Pane and removing the need to refresh manually, by implementing the RefreshCatalog() function that was available with ArcMap.
Thank you all for the clarification. Based on your feedback and discussions with the team, I've moved this idea to "Under Consideration". We encourage the community to continue voting and sharing feedback, which will help us gauge user demand.
I must be doing something wrong but as far as I can see arcpy.RefreshLayer() doesn't work. I deliberately upgraded to 3.3.0 to get this functionality.
I've included my script below in case there are any clues in it as to why it's not working.
This script is set up as a tool in a toolbox. It has validation set up so a person chooses the layer to run the script for from a drop-down list of available layers in the current map, and then a list of available agreement numbers is populated to choose from.
I've put the refresh layer statement at the end of the script as it might loop through a number of times depending on how many agreement numbers are chosen to run it for. I thought one layer refresh would be fine at the end.
I've tried arcpy.RefreshLayer(futureLayer) which is looking at the layer as chosen by the user from the current map.
I've tried arcpy.RefreshLayer(futureDataEntry) which is the same layer but is the variable I actually use in the script (you can see how it's defined below). Not sure if I still need to do this as I have pulled the script directly across from an ArcMap plugin but a million years ago when I created this I obviously thought I needed it....
Anyway - neither of these refresh the layer or attribute table in the current map.
See below the Python code for images of the results (or non-results).
Not sure why this is so hard.
P.S. - this is a different application to that I described in a previous post on this topic but RefreshLayer doesn't work for me in that other application either.
# Set up variables
futureLayer = arcpy.GetParameterAsText(0)
agreementNumberList = arcpy.GetParameterAsText(1)
agreementNumber = agreementNumberList.split(";")
desc = arcpy.Describe(futureLayer)
futureDataEntry = str(desc.path) + "\\" + str(desc.name)
# There are some other setup variables here but I've taken them out
# Definitions
# There are a bunch of definitions that do stuff - don't know that they are required for this purpose.
# Run Script
arcpy.AddMessage("%s" %os.path.basename(futureDataEntry))
CreateWorkingFGDB(workingFGDB)
for number in agreementNumber:
    # Update LandUseID's in NDMS_FutureAllocation_DataEntry
    arcpy.AddMessage("Update LandUseID's for %s" %number)
    maxID = max(FindMaxID(futureDataEntry, activated))
    arcpy.MakeFeatureLayer_management(futureDataEntry, "FutureLayer")
    # Update LandUseID for repeated LandUseID's in selected PIN Numbers
    if CheckSpace(number) == True:
        selectionExpression = "AgreementNumber <> " + number + " AND FA_LandUseID is not null"
    else:
        selectionExpression = "AgreementNumber <> '" + number + "' AND FA_LandUseID is not null"
    arcpy.SelectLayerByAttribute_management("FutureLayer", "NEW_SELECTION", selectionExpression)
    uniqueList = UniqueList("FutureLayer", "FA_LandUseID")
    if CheckSpace(number) == True:
        selectionExpression = "AgreementNumber = " + number
    else:
        selectionExpression = "AgreementNumber = '" + number + "'"
    arcpy.SelectLayerByAttribute_management("FutureLayer", "NEW_SELECTION", selectionExpression)
    cursor = arcpy.UpdateCursor("FutureLayer")
    checkDouble = []
    for row in cursor:
        if row.getValue("FA_LandUseID") in uniqueList or row.getValue("FA_LandUseID") in checkDouble:
            maxID = maxID + 1
            row.setValue("FA_LandUseID", maxID)
            cursor.updateRow(row)
        checkDouble.append(row.getValue("FA_LandUseID"))
    # Update LandUseID for null or 0 values for selected Agreement Numbers
    maxID = max(FindMaxID(futureDataEntry, activated))
    if CheckSpace(number) == True:
        selectionExpression = "AgreementNumber = " + number + " AND (FA_LandUseID is null OR FA_LandUseID = 0)"
    else:
        selectionExpression = "AgreementNumber = '" + number + "' AND (FA_LandUseID is null OR FA_LandUseID = 0)"
    arcpy.SelectLayerByAttribute_management("FutureLayer", "NEW_SELECTION", selectionExpression)
    expression = "autoIncrement(" + str(maxID) + ")"
    codeblock = """rec=0
def autoIncrement(maxID):
    global rec
    pStart = maxID + 1
    pInterval = 1
    if (rec == 0):
        rec = pStart
    else:
        rec += pInterval
    return rec"""
    calculateFieldList = ["FA_LandUseID", expression, codeblock]
    CalculateField("FutureLayer", calculateFieldList)
    # ---------------------------------------------------------------------------
    # Export AgreementNumber to working file for the rest of the processing
    CheckExists(f01)
    arcpy.AddMessage("Select Agreement Number %s" %number)
    if CheckSpace(number) == True:
        selectionExpression = "AgreementNumber = " + number
    else:
        selectionExpression = "AgreementNumber = '" + number + "'"
    arcpy.SelectLayerByAttribute_management("FutureLayer", "NEW_SELECTION", selectionExpression)
    arcpy.AddMessage("Export %s to %s" %(number, os.path.basename(f01)))
    arcpy.CopyFeatures_management("FutureLayer", f01)
    arcpy.SelectLayerByAttribute_management("FutureLayer", "CLEAR_SELECTION")
    # Update FA_LandUse and FA_Sector
    arcpy.AddMessage("Calculate FA_LandUse and FA_Sector for %s" %number)
    arcpy.MakeFeatureLayer_management(f01, "F01")
    arcpy.AddJoin_management("F01", "FA_BlockType", blockTypeLookup, "BlockType", "KEEP_ALL")
    calculateFieldList = [
        [os.path.basename(f01) + ".FA_LandUse", "!" + os.path.basename(blockTypeLookup) + ".LandUse!", ""],
        [os.path.basename(f01) + ".FA_Sector", "!" + os.path.basename(blockTypeLookup) + ".Sector!", ""]
    ]
    for calcField in calculateFieldList:
        CalculateField("F01", calcField)
    arcpy.RemoveJoin_management("F01")
    if f01 not in cleanUpList:
        cleanUpList.append(f01)
    # ---------------------------------------------------------------------------
    # Create mean slope for each block in the Agreement Number
    CheckExists(t03)
    arcpy.AddMessage("Calculate slope statistics for %s" %number)
    arcpy.sa.ZonalStatisticsAsTable(f01, "FA_LandUseID", slope, t03, "DATA", "MEAN")
    arcpy.MakeFeatureLayer_management(f01, "F01")
    arcpy.AddJoin_management("F01", "FA_LandUseID", t03, "FA_LandUseID", "KEEP_ALL")
    calculateFieldList = [os.path.basename(f01) + ".FA_SlopeMean", "round(!" + os.path.basename(t03) + ".Mean! / 100, 2)", ""]
    CalculateField("F01", calculateFieldList)
    arcpy.RemoveJoin_management("F01")
    # Calculate slope category
    selectionExpression = "FA_SlopeMean < 8"
    arcpy.SelectLayerByAttribute_management("F01", "NEW_SELECTION", selectionExpression)
    arcpy.CalculateField_management("F01", "FA_SlopeCategory", "\"Flat\"","PYTHON_9.3")
    selectionExpression = "FA_SlopeMean >= 8 AND FA_SlopeMean < 16"
    arcpy.SelectLayerByAttribute_management("F01", "NEW_SELECTION", selectionExpression)
    arcpy.CalculateField_management("F01", "FA_SlopeCategory", "\"Rolling\"","PYTHON_9.3")
    selectionExpression = "FA_SlopeMean >= 16 AND FA_SlopeMean < 26"
    arcpy.SelectLayerByAttribute_management("F01", "NEW_SELECTION", selectionExpression)
    arcpy.CalculateField_management("F01", "FA_SlopeCategory", "\"Easy Hill\"","PYTHON_9.3")
    selectionExpression = "FA_SlopeMean >= 26"
    arcpy.SelectLayerByAttribute_management("F01", "NEW_SELECTION", selectionExpression)
    arcpy.CalculateField_management("F01", "FA_SlopeCategory", "\"Steep Hill\"","PYTHON_9.3")
    arcpy.SelectLayerByAttribute_management("F01", "CLEAR_SELECTION")
    if t03 not in cleanUpList:
        cleanUpList.append(t03)
    # ---------------------------------------------------------------------------
    # Create latitude and longitude for each block in the Agreement Number
    CheckExists(f02)
    CheckExists(f03)
    arcpy.AddMessage("Calculate latitude and longitude for %s" %number)
    arcpy.FeatureToPoint_management(f01, f02, "INSIDE")
    spatialRef = arcpy.Describe(spatialRefWGS84).spatialReference
    arcpy.Project_management(f02, f03, spatialRef, "NZGD_2000_To_WGS_1984_1")
    calculateFieldList = [
        ["FA_Latitude", "round(!shape.centroid.Y!,4)", ""],
        ["FA_Longitude", "round(!shape.centroid.X!,4)", ""]
    ]
    for calcField in calculateFieldList:
        CalculateField(f03, calcField)
    arcpy.AddJoin_management("F01", "FA_LandUseID", f03, "FA_LandUseID", "KEEP_ALL")
    calculateFieldList = [
        [os.path.basename(f01) + ".FA_Latitude", "!" + os.path.basename(f03) + ".FA_Latitude!", ""],
        [os.path.basename(f01) + ".FA_Longitude", "!" + os.path.basename(f03) + ".FA_Longitude!", ""]
    ]
    for calcField in calculateFieldList:
        CalculateField("F01", calcField)
    arcpy.RemoveJoin_management("F01")
    if f02 not in cleanUpList:
        cleanUpList.append(f02)
    if f03 not in cleanUpList:
        cleanUpList.append(f03)
    # ---------------------------------------------------------------------------
    # Update SoilData
    arcpy.AddMessage("Update Soil data for %s" %number)
    CheckExists(t01)
    CheckExists(t02)
    arcpy.TabulateIntersection_analysis(f01, ["FA_LandUseID"], soil, t01, "SmapName")
    addFieldList = ["JoinField", "TEXT", "", "", 20]
    AddField(t01, addFieldList)
    calculateFieldList = ["JoinField", "str(!FA_LandUseID!) + \" \" + str(round(!PERCENTAGE!, 4))", ""]
    CalculateField(t01, calculateFieldList)
    arcpy.Statistics_analysis(t01, t02, [["PERCENTAGE", "MAX"]], "FA_LandUseID")
    addFieldList = ["JoinField", "TEXT", "", "", 20]
    AddField(t02, addFieldList)
    calculateFieldList = ["JoinField", "str(!FA_LandUseID!) + \" \" + str(round(!MAX_PERCENTAGE!, 4))", ""]
    CalculateField(t02, calculateFieldList)
    arcpy.JoinField_management(t02, "JoinField", t01, "JoinField", ["SmapName"])
    arcpy.AddJoin_management("F01", "FA_LandUseID", t02, "FA_LanduseID", "KEEP_ALL")
    calculateFieldList = [
        [os.path.basename(f01) + ".FA_SmapName", "!" + os.path.basename(t02) + ".SmapName!", ""],
        [os.path.basename(f01) + ".FA_DominantSoilPercent", "round(!" + os.path.basename(t02) + ".MAX_PERCENTAGE!, 0)", ""]
    ]
    for calcField in calculateFieldList:
        CalculateField("F01", calcField)
    arcpy.RemoveJoin_management("F01")
    if t01 not in cleanUpList:
        cleanUpList.append(t01)
    if t02 not in cleanUpList:
        cleanUpList.append(t02)
    # ---------------------------------------------------------------------------
    # Update F01 for soil data
    arcpy.AddJoin_management("F01", "FA_SmapName", soilTable, "SmapName_SiblingName", "KEEP_ALL")
    arcpy.SelectLayerByAttribute_management("F01", "CLEAR_SELECTION")
    calculateFieldList = [
        [os.path.basename(f01) + ".FA_FamilyName", "!" + os.path.basename(soilTable) + ".FamilyName!", ""],
        [os.path.basename(f01) + ".FA_SoilOrder", "!" + os.path.basename(soilTable) + ".SoilOrder!", ""],
        [os.path.basename(f01) + ".FA_MaximumRootingDepth_cm", "!" + os.path.basename(soilTable) + ".MaximumRootingDepth_cm!", ""],
        [os.path.basename(f01) + ".FA_DepthToImpededDrainage_m", "!" + os.path.basename(soilTable) + ".DepthToImpededDrainage_m!", ""],
        [os.path.basename(f01) + ".FA_WiltingPoint_0_30_cm", "!" + os.path.basename(soilTable) + ".WiltingPoint_0_30cm!", ""],
        [os.path.basename(f01) + ".FA_WiltingPoint_30_60_cm", "!" + os.path.basename(soilTable) + ".WiltingPoint_30_60cm!", ""],
        [os.path.basename(f01) + ".FA_WiltingPoint_60_cm", "!" + os.path.basename(soilTable) + ".WiltingPoint_60cm!", ""],
        [os.path.basename(f01) + ".FA_FieldCapacity_0_30_cm", "!" + os.path.basename(soilTable) + ".FieldCapacity_0_30cm!", ""],
        [os.path.basename(f01) + ".FA_FieldCapacity_30_60_cm", "!" + os.path.basename(soilTable) + ".FieldCapacity_30_60cm!", ""],
        [os.path.basename(f01) + ".FA_FieldCapacity_60_cm", "!" + os.path.basename(soilTable) + ".FieldCapacity_60cm!", ""],
        [os.path.basename(f01) + ".FA_Saturation_0_30_cm", "!" + os.path.basename(soilTable) + ".Saturation_0_30cm!", ""],
        [os.path.basename(f01) + ".FA_Saturation_30_60_cm", "!" + os.path.basename(soilTable) + ".Saturation_30_60cm!", ""],
        [os.path.basename(f01) + ".FA_Saturation_60_cm", "!" + os.path.basename(soilTable) + ".Saturation_60cm!", ""],
        [os.path.basename(f01) + ".FA_NaturalDrainageClass", "!" + os.path.basename(soilTable) + ".NaturalDrainageClass!", ""],
        [os.path.basename(f01) + ".FA_AnionStorageCapacityPercent", "!" + os.path.basename(soilTable) + ".AnionStorageCapacityPercent!", ""],
        [os.path.basename(f01) + ".FA_SubsoilClayPercent", "!" + os.path.basename(soilTable) + ".SubsoilClayPercent!", ""],
        [os.path.basename(f01) + ".FA_BulkDensity_KgPerM3", "!" + os.path.basename(soilTable) + ".BulkDensityKgPerM3!", ""],
        [os.path.basename(f01) + ".FA_ClayPercent", "!" + os.path.basename(soilTable) + ".ClayPercent!", ""],
        [os.path.basename(f01) + ".FA_SandPercent", "!" + os.path.basename(soilTable) + ".SandPercent!", ""]
    ]
    for calcField in calculateFieldList:
        CalculateField("F01", calcField)
    arcpy.RemoveJoin_management("F01", os.path.basename(soilTable))
    arcpy.CalculateField_management("F01", "FA_AreaHA", "!shape.area@hectares!", "PYTHON_9.3", "")
    # ---------------------------------------------------------------------------
    # Update NDMS_FutureAllocation_DataEntry for Slope, Latitude, Longitude, Area, SmapName and Dominant Soil Percent
    cursorFields = ["FA_LandUseID", #0
                    "FA_SlopeMean", #1 [0,0]
                    "FA_SlopeCategory", #2
                    "FA_LandUse", #3
                    "FA_Sector", #4
                    "FA_Latitude", #5
                    "FA_Longitude", #6
                    "FA_SmapName", #7
                    "FA_DominantSoilPercent", #8
                    "FA_FamilyName", #9
                    "FA_SoilOrder", #10
                    "FA_MaximumRootingDepth_cm", #11
                    "FA_DepthToImpededDrainage_m", #12
                    "FA_WiltingPoint_0_30_cm", #13
                    "FA_WiltingPoint_30_60_cm", #14
                    "FA_WiltingPoint_60_cm", #15
                    "FA_FieldCapacity_0_30_cm", #16 [0,15]
                    "FA_FieldCapacity_30_60_cm", #17
                    "FA_FieldCapacity_60_cm", #18
                    "FA_Saturation_0_30_cm", #19
                    "FA_Saturation_30_60_cm", #20
                    "FA_Saturation_60_cm", #21
                    "FA_NaturalDrainageClass", #22
                    "FA_AnionStorageCapacityPercent", #23
                    "FA_SubsoilClayPercent", #24
                    "FA_BulkDensity_KgPerM3", #25
                    "FA_ClayPercent", #26
                    "FA_SandPercent", #27 [0,26]
                    "FA_AreaHA"] #28
    searchDict = {}
    with arcpy.da.SearchCursor(f01, cursorFields) as sCursor:
        for row in sCursor:
            searchDict.update({row[0]:[row[1],row[2],row[3],row[4],row[5],row[6],row[7],row[8],row[9],row[10],
                                       row[11],row[12],row[13],row[14],row[15],row[16],row[17],row[18],row[19],
                                       row[20],row[21],row[22],row[23],row[24],row[25],row[26],row[27],row[28]]})
    with arcpy.da.UpdateCursor(futureDataEntry, cursorFields) as uCursor:
        for row in uCursor:
            if row[0] in searchDict:
                row[1] = searchDict[row[0]][0]
                row[2] = searchDict[row[0]][1]
                row[3] = searchDict[row[0]][2]
                row[4] = searchDict[row[0]][3]
                row[5] = searchDict[row[0]][4]
                row[6] = searchDict[row[0]][5]
                row[7] = searchDict[row[0]][6]
                row[8] = searchDict[row[0]][7]
                row[9] = searchDict[row[0]][8]
                row[10] = searchDict[row[0]][9]
                row[11] = searchDict[row[0]][10]
                row[12] = searchDict[row[0]][11]
                row[13] = searchDict[row[0]][12]
                row[14] = searchDict[row[0]][13]
                row[15] = searchDict[row[0]][14]
                row[16] = searchDict[row[0]][15]
                row[17] = searchDict[row[0]][16]
                row[18] = searchDict[row[0]][17]
                row[19] = searchDict[row[0]][18]
                row[20] = searchDict[row[0]][19]
                row[21] = searchDict[row[0]][20]
                row[22] = searchDict[row[0]][21]
                row[23] = searchDict[row[0]][22]
                row[24] = searchDict[row[0]][23]
                row[25] = searchDict[row[0]][24]
                row[26] = searchDict[row[0]][25]
                row[27] = searchDict[row[0]][26]
                row[28] = searchDict[row[0]][27]
                uCursor.updateRow(row)
    del sCursor
    del uCursor
# ---------------------------------------------------------------------------
CleanUp(cleanUpList)
# ---------------------------------------------------------------------------
arcpy.RefreshLayer(futureDataEntry)
arcpy.AddMessage("*****ALL DONE*****")
This is the attribute table before the script has run and as it still appears after the script has run - and yes ok - but you have to have some fun when you're pulling your hair out!
I've also tried pushing the useless refresh buttons...
...on both the map and the attribute table - no idea why they are there.
The second image below is the attribute table after I have clicked the following
Just a thought - does this RefreshLayer not work when a layer is registered as versioned? If not - why not?
Just a thought - does this RefreshLayer not work when a layer is registered as versioned? If not - why not?
Firstly, thank you for the write-up - nothing inherently wrong stood out to me in your script. Secondly, with regard to your quote above, that is correct; have a look at Branch version scenarios—ArcGIS Pro | Documentation. Per our expert in this area:
Pro is a consistent state/moment client for multi-user enterprise geodatabase environments. What that means is that it maintains a consistent view of the data until refreshing the version moves that state/moment forward.
For layers registered as traditional versioned (stateID) or branch versioned (moment), this pattern should hold true.
For layers not registered as versioned, the RefreshLayer workflow should show changes made by other users since the last call.
I would expect this pattern to hold true for branch versioned feature services as well.
In short, due to the data model, refreshing layers from a multi-user enterprise geodatabase requires explicitly refreshing their data source; this is as designed. arcpy.RefreshLayer is not designed to work for these data sources.
OK - in this particular case the feature layer is traditionally versioned, but I'm working in the default version (although in other cases I am working in an actual version of the data). And I'm not after seeing changes from other users - I'm after seeing changes made by me, in the default version, via a script running from a toolbox using layers currently in the project. How do I get these changes to show straight away in my current Pro session?
Do I have to work in an actual version and then - reconcile? Does the reconcile do the required refresh even though I'm not trying to pull down changes from other users?
It would be so much easier if we could just have an arcpy command to do the refresh that is required. Please?
We really need a way to programmatically refresh the data shown so the changes made by the script show up in the project without the manual refresh. That manual refresh just confuses the non-GIS people I am trying to enable via all this automation.