
Status: Submitted
Categories: Database
Created by: Guest
Created on: Jun 24, 2020

Performance problem with compound index using doubles, with range filtering

It looks like there may be a problem when using doubles in a compound index and applying range-test filters against all of them in a query. The explain plans show that far more keys are being examined than there should be, which leads to poor performance. If we switch to a similar setup using integers instead, we don't see the problem. See the second part of this ticket for full details: https://support.mongodb.com/case/00659614

We have a document that looks like this:

    {
        "_id" : ObjectId("5e2ab24eca314f10b486d827"),
        "attributes" : [
            {
                "attributeCode" : "attributes_itemsGeometry",
                "boundingBox" : {
                    "lonMin" : -1.93181854783035,
                    "latMin" : 52.1718902305398,
                    "lonMax" : -1.9244498979136,
                    "latMax" : 52.1749938832085
                }
            }
        ]
    }

With a compound index like this:

    {
        "v" : 2,
        "key" : {
            "attributes.boundingBox.lonMin" : 1,
            "attributes.boundingBox.lonMax" : 1,
            "attributes.boundingBox.latMin" : 1,
            "attributes.boundingBox.latMax" : 1
        },
        "name" : "itemVersions_designCode_collection_attributeCode_boundingBox_date",
        "ns" : "customerBlah.itemVersions"
    }

And we make queries with range filters like this:

    "attributes" : {
        "$elemMatch" : {
            "$and" : [
                { "boundingBox.latMin" : { "$lte" : 52.0751028974155 } },
                { "boundingBox.lonMin" : { "$lte" : -1.92557202436271 } },
                { "boundingBox.latMax" : { "$gte" : 52.0729000493818 } },
                { "boundingBox.lonMax" : { "$gte" : -1.92931078813729 } }
            ]
        }
    }
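For anyone wanting to observe the symptom described above, a minimal shell sketch along these lines runs the same $elemMatch range query with execution statistics and compares keys examined against documents returned. It assumes the customerBlah database and itemVersions collection from the index definition above; the filter values are the ones from the example query.

    // Sketch: run the range query with executionStats and report the
    // keys-examined vs documents-returned gap.
    db = db.getSiblingDB("customerBlah")

    var result = db.itemVersions.find({
        "attributes" : {
            "$elemMatch" : {
                "$and" : [
                    { "boundingBox.latMin" : { "$lte" : 52.0751028974155 } },
                    { "boundingBox.lonMin" : { "$lte" : -1.92557202436271 } },
                    { "boundingBox.latMax" : { "$gte" : 52.0729000493818 } },
                    { "boundingBox.lonMax" : { "$gte" : -1.92931078813729 } }
                ]
            }
        }
    }).explain("executionStats")

    // totalKeysExamined far above nReturned is the behaviour being reported:
    // the index scan touches many more keys than the result set needs.
    printjson({
        keysExamined : result.executionStats.totalKeysExamined,
        docsExamined : result.executionStats.totalDocsExamined,
        returned     : result.executionStats.nReturned
    })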