Copy Diagnostic Monitor logs as Excel table, not as vertical tags

04-12-2023 10:16 PM
Status: Open
by Bud (Notable Contributor)

ArcGIS Pro 3.1.1

In the Diagnostic Monitor —> Log tab:

I can right-click the logs —> Copy selected:

Bud_5-1681362717059.png

When I paste the copied values into Excel, the data looks like this:

Bud_3-1681362471555.png

The log values are squished into a single column as <tags>.

It would be better if the data was formatted as a table, similar to how it looks in the Diagnostic Monitor Log tab:

Bud_4-1681362650278.png

That would facilitate tabular analysis of the log data.

Could the Diagnostic Monitor logs be enhanced so that the data gets copied as a table, not as tags?

4 Comments
Comment by Bud:

As a last resort, the following workaround might be useful. It's pretty involved; I wouldn't want to do it very often.

1. Copy the logs as described in the original idea. Paste into Excel.

2. Add a header to the logs column: LOG_TAGS.

3. Select all values. Create an Excel table from the data (CTRL+T).

Option #1 - Use a SQL query in a database (go to step #4 below).

Option #2 - Use Power Query in Excel (go to the next comment in this post).

4. Create an additional Excel table column: ID. Populate with sequential numbers.
Reason for field: I figure it might be unwise to rely on the OBJECTID field as a sequential ID field.

5. The Excel table should look like this:

Bud_0-1681618303531.png

6. Save the Excel spreadsheet as Logs.xlsx.

7. In ArcGIS Pro, create a new mobile geodatabase called LOGS.geodatabase.

8. Use the Excel To Table GP tool to import the spreadsheet into the mobile GDB as a table (for a scripted alternative, see the arcpy sketch after these sub-steps).

  1. Use these parameters:
    Bud_2-1681619276323.png
  2. Click Run.
  3. The table has been created.
    Bud_3-1681619318379.png
  4. Rename the table from main_LOGS to LOGS.  Reason: there's a bug.
  5. Bud_4-1681619457084.png
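
If you'd rather script steps 7-8 than run the tools from the Pro UI, something along these lines should work. It's an untested sketch; the folder path and file names are placeholders.

import os
import arcpy

folder = r"C:\temp"                          # placeholder output folder
xlsx = os.path.join(folder, "Logs.xlsx")     # the spreadsheet from step 6

# Step 7: create the mobile geodatabase.
gdb = arcpy.management.CreateMobileGDB(folder, "LOGS")[0]

# Step 8: import the spreadsheet into the mobile GDB as a table.
arcpy.conversion.ExcelToTable(xlsx, os.path.join(gdb, "LOGS"))

# Step 8.4: if the output landed as main_LOGS (the bug mentioned above), rename it.
if arcpy.Exists(os.path.join(gdb, "main_LOGS")):
    arcpy.management.Rename(os.path.join(gdb, "main_LOGS"), os.path.join(gdb, "LOGS"))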

9. Run the following SQL query using a SQL client like DBeaver.

WITH cte AS (  --Related Oracle post: https://stackoverflow.com/questions/76024662/fill-in-rows-per-group-for-pivoting
    select sum(case when log_tags like '<Event%' then 1 else 0 end) over (order by id) group_id,  --https://stackoverflow.com/a/76021834/5576771
           case
             when substr(log_tags,1,13) = '<Event time="'                      then 'col_1'
             when substr(log_tags,1, 9) = 'Database:'                          then 'col_2'
             when substr(log_tags,1,10) = '      SQL:'                         then 'col_3'
             when substr(log_tags,1,16) = '      Number of '                   then 'col_4'
             when substr(log_tags,1, 8) = '</Event>'                           then 'col_5'
             else 'col_6' --Sometimes the values in cols 2-4 are different from what I'm querying for above. Col_6 is a catch-all for any unexpected values.
                          --However, col_6 will only grab the first unexpected value it comes across; all others will be ignored. I haven't found a better way to do it.
           end as type,    
           log_tags  --substr(log_tags,1,100)
      from logs
     where log_tags is not null)
--     
  select group_id,
         max(case when  type  = 'col_1' then      log_tags  end) col_1,
         max(case when  type  = 'col_2' then      log_tags  end) col_2,
         max(case when  type  = 'col_3' then trim(log_tags) end) col_3,
         max(case when  type  = 'col_4' then trim(log_tags) end) col_4,
         max(case when  type  = 'col_5' then      log_tags  end) col_5,
         max(case when  type  = 'col_6' then trim(log_tags) end) col_6
    from cte
group by group_id
order by group_id
  1. Note: I tried using the SQL above to create a database view in the mobile geodatabase. However, ArcGIS Pro didn't display the data in the attribute table correctly (see: Mobile GDB database view — CASE expression value shown incorrectly in attribute table), so I abandoned that idea.
  2. If you do try to create a database view, be aware that SQL comments are buggy in mobile GDBs when using the Create Database View GP tool (example: SQL comment in WITH clause in mobile GDB database view), so you might need to remove the comments or create the view directly from a SQL client like DBeaver.

10. The result set of the SQL query in DBeaver looks like this:

Bud_5-1681622285033.png

That's the desired result.

We can use that data to do tabular analysis on the logs when troubleshooting issues. We'd likely need to parse values from the columns to make the data useful.
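
For example, here's a rough Python sketch of pulling usable values out of one pivoted row. The sample strings are made up for illustration; real logs vary (see Note A below).

# Illustrative only: parse one pivoted row into usable values.
import re

col_1 = '<Event time="2023-04-12T22:16:00" elapsed="0.123">'       # made-up sample
col_3 = '      SQL: SELECT objectid FROM parcels WHERE city = :1'   # made-up sample
col_4 = '      Number of features returned: 42'                     # made-up sample

elapsed = float(re.search(r'elapsed="([^"]+)"', col_1).group(1))
query = col_3.strip()[len("SQL: "):]
row_count = int(col_4.strip().rsplit(":", 1)[1])

print(elapsed, row_count, query)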


Notes:

 

A. Be aware that not all logs/rows are structured the same way. Some logs have SQL queries in them like the ones shown in the above screenshots.

But other logs are different:

Bud_7-1681622526893.png

Bud_8-1681622729820.png

That makes it difficult to query the tag data using SQL. Cases/columns 2-4 work well most of the time, but not always; that's where case/column 6 comes in, although it has its own limitations.

case
  when substr(log_tags,1,13) = '<Event time="'                      then 'col_1'
  when substr(log_tags,1, 9) = 'Database:'                          then 'col_2'
  when substr(log_tags,1,10) = '      SQL:'                         then 'col_3'
  when substr(log_tags,1,34) = '      Number of features returned:' then 'col_4'
  when substr(log_tags,1, 8) = '</Event>'                           then 'col_5'
  else 'col_6' --Sometimes the values in cols 2-4 are different from what I'm querying for above. Col_6 is a catch-all for any unexpected values.
               --However, col_6 will only grab the first unexpected value it comes across; all others will be ignored. I haven't found a better way to do it.
end as type, 

 


B. I tried skipping the Excel step and just pasting the log data from the Diagnostic Monitor right into an empty mobile geodatabase table. However, that seemed buggy. Sometimes the data would paste as expected. Other times, it seemed to omit or rearrange some rows. I didn't spend much time trying to find the root cause. I just played it safe and used Excel as an intermediate step instead.  

 

C. The SQL query uses generic SQL. I've only tested it in SQLite (mobile GDB) and Oracle. But it should work in other DBs too.
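
Since a mobile geodatabase is just a SQLite database file, the query can also be run without DBeaver, e.g. with Python's built-in sqlite3 module. A rough sketch (placeholder path; paste the full pivot query from step 9 in place of the simple SELECT):

# Open the mobile GDB directly with sqlite3 and run SQL against the LOGS table.
import sqlite3

con = sqlite3.connect(r"C:\temp\LOGS.geodatabase")   # placeholder path to the mobile GDB
for row in con.execute("select id, log_tags from logs limit 5"):
    print(row)
con.close()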

Comment by Bud:

Option #2 - Use Power Query in Excel


10. Create a TAG_START_END column in the Excel table:

=IF(  LEFT([@[LOG_TAGS]],6)="<Event","start",  IF(LEFT([@[LOG_TAGS]],7)="</Event","end",""))

11. Create a TYPE column in the Excel table (see the CASE expression in the SQL query above for a more readable version):

=IF(LEFT([@[LOG_TAGS]],12)="<Event time=","col_1",
      IF(LEFT([@[LOG_TAGS]],9)="Database:","col_2",
          IF(LEFT([@[LOG_TAGS]],10)="      SQL:","col_3",
              IF(LEFT([@[LOG_TAGS]],34)="      Number of features returned:","col_4",
                  IF(LEFT([@[LOG_TAGS]],8)="</Event>","col_5",
                      IF([@[LOG_TAGS]]<>"","col_6",""
  ))))))

Bud_1-1681750151894.png

12. Create an Excel Power Query:
Click a cell in the Excel table —> Data tab in ribbon —> From Table/Range.

13. Click a cell to deselect any selected columns.
Home tab —> Remove Rows —> Remove Blank Rows.

14. Trim the leading spaces in the LOG_TAGS column:
Right-click the LOG_TAGS column —> Transform —> Trim.

15. Create a Group column in the Power Query that groups the tags using a common ID (needed for pivoting later).

In the Power Query —> Advanced Editor window, replace the existing M-code with the following M-code:

let
    Source = Excel.CurrentWorkbook(){[Name="Table"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LOG_TAGS", type text}, {"TYPE", type text}, {"TAG_START_END", type text}}),
    #"Removed Blank Rows" = Table.SelectRows(#"Changed Type", each not List.IsEmpty(List.RemoveMatchingItems(Record.FieldValues(_), {"", null}))),
    #"TrimSpaces" = Table.TransformColumns(#"Removed Blank Rows",{{"LOG_TAGS", Text.Trim, type text}}),
    Lines = Table.ToRecords(TrimSpaces),
    Result = List.Accumulate(Lines, [table={}, group=0, seq=0],
        (state, current) => if Text.StartsWith(current[LOG_TAGS],"<Event")
            then let g = state[group]+1, s = 1 in
                 [ table= state[table] & {[group=g,seq=s] & current},
                   group=g, seq=s] 
            else let g = state[group], s = state[seq]+1 in
                 [ table= state[table] & {[group=g,seq=s] & current},
                   group=g, seq=s] ),
    Table1 = Result[table],
    Table2 = Table.FromRecords(Table1)
in
    Table2
  1. Hint: Change  Table  to the name of your Excel table.
  2. M-code Source: Stack Exchange Superuser - Assign group ID to rows (tags) 

Result:
A Group column has been successfully created.

16. In the Power Query, remove these fields:

  1. Seq
  2. TAG_START_END

17. Pivot the data:

  1. Select the TYPE column.
  2. Transform tab —> Pivot Column.
  3. Values Column: LOG_TAGS.
  4. In the Advanced section, choose Don't aggregate.
    Bud_3-1681751095273.png
  5. Click OK.

18. The Power Query should now look like this:

Bud_0-1681765038460.png

19. Click "Close & Load".

20. The resulting Excel worksheet should look like this:

Bud_1-1681765092934.png

21. I manually filled column C so that the long values in column B don't visually spill over into the adjacent blank cells.

Each cell in column C contains an apostrophe followed by a space ("' "), which displays as blank but keeps the cell from being empty.

 

22. Complete.

 

Note:

With certain log data, I get an error from Power Query:

Evaluation resulted in stack overflow and cannot continue.

Bud_0-1681766936032.png

I haven't figured out what specific log data causes that issue.

 

It might have something to do with the "Result" Power Query step:

Bud_0-1681767087481.png

Comment by Bud:

Here's a specialized SQL query that extracts information about query logs that returned more than one row:

--https://community.esri.com/t5/arcgis-pro-questions/mobile-gdb-database-view-case-expression-value/td-p/1279170
WITH cte AS (  
    select sum(case when log_tags like '<Event%' then 1 else 0 end) over (order by objectid) group_id,
           case
               when substr(log_tags,1,13) = '<Event time="'                      then 'col_1'
               when substr(log_tags,1, 9) = 'Database:'                          then 'col_2'
               when substr(log_tags,1,10) = '      SQL:'                         then 'col_3'
               when substr(log_tags,1,16) = '      Number of '                   then 'col_4'
               when substr(log_tags,1, 8) = '</Event>'                           then 'col_5'
               else 'col_6'
           end as type,    
           log_tags
      from logs
     where log_tags is not null)
--     
select 
    substr(xml, instr(xml, 'elapsed=') + length('elapsed=') + 1) + 0 as elapsed,  --https://stackoverflow.com/a/76078619/5576771
    row_count+0 as row_count,
    lower(query) || '; --Excel rownum: ' || (row_number() over() + 2) as query
    --lower(query) as query
from (
          select group_id,
                 max(case when  type  = 'col_1' then             log_tags      end) xml,    
               --max(case when  type  = 'col_2' then             log_tags      end) col_2,
                 max(case when  type  = 'col_3' then substr(trim(log_tags), 6) end) query,
                 max(case when  type  = 'col_4' then substr(trim(log_tags),30) end) row_count
               --max(case when  type  = 'col_5' then             log_tags      end) col_5,
               --max(case when  type  = 'col_6' then        trim(log_tags)     end) col_6
            from cte
        group by group_id
        order by group_id
     ) 
where 
    row_count+0 > 1

Bud_0-1682145025536.png

 

Related: Open table with join (sorted) — Logs show redundant queries sent to DB

Comment by Bud:

A Python script for parsing the XML tags and inserting into a table (untested): 
https://sqlite.org/forum/forumpost/cedad1130c0e828f
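
For reference, here's a rough, untested sketch of that kind of script (not the one from the link). It groups the pasted lines into one row per <Event>...</Event> block and loads them into a SQLite table, assuming the copied log lines were saved to a text file named logs.txt.

import re
import sqlite3

rows, current = [], None
with open("logs.txt", encoding="utf-8") as f:
    for raw in f:
        line = raw.rstrip("\n")
        if line.startswith("<Event"):
            current = {"event": line, "body": []}         # start a new event block
        elif line.startswith("</Event"):
            if current is not None:
                m = re.search(r'elapsed="([^"]+)"', current["event"])
                rows.append((current["event"],
                             float(m.group(1)) if m else None,
                             "\n".join(current["body"]).strip()))
                current = None
        elif current is not None:
            current["body"].append(line.strip())          # lines inside the event

con = sqlite3.connect("logs_parsed.db")
con.execute("create table if not exists events (event_tag text, elapsed real, body text)")
con.executemany("insert into events values (?, ?, ?)", rows)
con.commit()
con.close()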