Query Performance

12-20-2018 11:23 AM
New Contributor II

Hello,

I'm working on some offline development and I'm hitting some pretty abysmal performance using the .NET Runtime API. I've put together the sample code below to demonstrate the issue. I executed it against a table with 67,000 records, with a where clause of simply "1=1" to return the entire contents of the table. The performance numbers are below. Is there something I'm missing here? Obviously this sort of performance is unacceptable when querying large numbers of features in an offline database. I'm currently testing with the 100.4.0 API.

Direct Sqlite Connection

11.04 Seconds to iterate over all of the records.

.NET Runtime API

4 Minutes and 53 Seconds to iterate over all of the records.

Here is the sample code.

Stopwatch directSqlSW = new Stopwatch();
Stopwatch runtimeAPISW = new Stopwatch();
int featuresFound = 0;

directSqlSW.Start();
// Find the features using a direct SQLite connection.
SQLiteConnection directSqliteConnection = new SQLiteConnection(string.Format("Data Source={0};Version=3;", ((GeodatabaseFeatureTable)SearchAgainstFeatureLayer.FeatureTable).Geodatabase.Path));
directSqliteConnection.Open();

SQLiteCommand selectCommand = new SQLiteCommand(string.Format("SELECT * FROM {0} WHERE {1}", SearchAgainstFeatureLayer.FeatureTable.TableName, WhereClause), directSqliteConnection);
SQLiteDataReader reader = selectCommand.ExecuteReader();
while (reader.Read())
{
   // Read attributes.
   Dictionary<string, object> attributes = new Dictionary<string, object>();
   foreach (Field field in SearchAgainstFeatureLayer.FeatureTable.Fields)
   {
      attributes.Add(field.Name, reader[field.Name]);
   }
   featuresFound++;
}
reader.Close();
directSqliteConnection.Close();
directSqlSW.Stop();

featuresFound = 0;
runtimeAPISW.Start();
// Find the features using the .NET Runtime API.
QueryParameters queryParams = new QueryParameters();
queryParams.WhereClause = WhereClause;
FeatureQueryResult results = await SearchAgainstFeatureLayer.FeatureTable.QueryFeaturesAsync(queryParams);

foreach (Feature feature in results)
{
   // Read attributes.
   Dictionary<string, object> attributes = new Dictionary<string, object>();
   foreach (Field field in SearchAgainstFeatureLayer.FeatureTable.Fields)
   {
      attributes.Add(field.Name, feature.GetAttributeValue(field.Name));
   }
   featuresFound++;
}
runtimeAPISW.Stop();

10 Replies
Esri Frequent Contributor

We have made some significant improvements to this for the next release.

New Contributor II

Hi Morten,

Fantastic news! Thank you very much!

Kind Regards,

Kris Foster

Occasional Contributor

Kris,

I'm curious to know if the "significant improvements" in v100.5 have helped at all with your query performance.  I'm experiencing similar issues when loading large non-spatial tables even with minimum columns and data.

Thanks,

Tim

MVP Regular Contributor

Have you looked at where the real bottleneck is? Is it the query (i.e., the actual time to return the records) or the iteration? One thing to consider is that the result set from the Runtime approach returns Feature objects, which I suspect are more heavyweight than a SQLite reader row.

Once you have the result set from the Runtime query, you can likely get anything you need through LINQ, so the iteration doesn't seem to show much other than the time to loop over a table.

[That said, I have seen other performance issues myself in Runtime, but I don't see this test evaluating a query; it evaluates the iteration.]
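To separate the two costs, the query and the iteration can be timed independently. A hedged sketch, reusing `SearchAgainstFeatureLayer` and `WhereClause` from the poster's sample and assuming the surrounding async method:

```csharp
// Time the Runtime query separately from the iteration over its results.
// Illustrative only, not a benchmark harness.
Stopwatch queryOnly = Stopwatch.StartNew();
QueryParameters queryParams = new QueryParameters { WhereClause = WhereClause };
FeatureQueryResult results =
    await SearchAgainstFeatureLayer.FeatureTable.QueryFeaturesAsync(queryParams);
queryOnly.Stop();

Stopwatch iterationOnly = Stopwatch.StartNew();
int featuresFound = 0;
foreach (Feature feature in results)
{
    featuresFound++; // enumerate each Feature, but read no attributes
}
iterationOnly.Stop();

Debug.WriteLine($"Query: {queryOnly.Elapsed}, Iteration: {iterationOnly.Elapsed}");
```

Note that if the Runtime materializes Feature objects lazily during enumeration, some per-feature cost will still land in the iteration timer, so the split is indicative rather than exact.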

Esri Frequent Contributor

Yes, the comparison isn't completely fair, as the above example isn't parsing the geometry, which is one of the more expensive pieces.

Btw did you see this go a lot faster with Update 5?

New Contributor II

I haven't had a chance to test the performance yet as I've been working on other things. Parsing the geometry is no longer possible via direct SQLite queries (as far as I'm aware) since it was changed from a JSON string. If there is a way to build the geometry like I could from the JSON string that used to be stored there, please let me know. I wasn't able to figure it out from the way the shape is now stored when I put together this test.

Occasional Contributor

With 100.5 I'm finding it 10x faster to query a .geodatabase table with System.Data.Sqlite.

Knowing the shape binary format would be greatly appreciated.

See also https://gis.stackexchange.com/questions/283670/esri-runtime-geodatabase-shape-storage

Esri Frequent Contributor

That's a LOT better than the 1100x it was before 🙂

As mentioned, you'll never be able to match SQLite's direct read, as you're skipping a bunch of important steps; and if you were to parse all the binary geometry fields into valid geometry shapes, I'd expect you'd be down to comparable numbers anyway.

Occasional Contributor

For my particular use case, I can hang onto the byte[] array and lazy-load the Geometry only if the user clicks on a particular feature. Most of my geometries are points, and my polylines typically have fewer than 5 vertices, so I'm guessing the read time should be tolerable.
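One way to do that lazy load without knowing the shape's binary format is to keep only the row's object ID from the direct SQLite read and ask the Runtime for that single feature on click. A sketch under those assumptions (the helper name and the `OBJECTID` column name are illustrative):

```csharp
// Sketch: fetch one feature's parsed Geometry on demand via the Runtime,
// identified by the object ID captured during the direct SQLite read.
// GetGeometryOnClickAsync is a hypothetical helper name.
async Task<Geometry> GetGeometryOnClickAsync(GeodatabaseFeatureTable table, long objectId)
{
    QueryParameters single = new QueryParameters
    {
        WhereClause = $"OBJECTID = {objectId}"
    };
    FeatureQueryResult result = await table.QueryFeaturesAsync(single);
    Feature feature = result.FirstOrDefault(); // at most one match expected
    return feature?.Geometry;                  // geometry parsed by the Runtime
}
```

This keeps the bulk read cheap while paying the geometry-parsing cost only for the feature the user actually clicks.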
