Currently I am using MongoDB time series collections to store IoT data. The data is then retrieved via multiple REST APIs. In most cases these APIs fetch the last entry for a specified metadata field.
Unfortunately, as the time series collections grow (10 collections with around 4 million entries each), I am getting "Slow Query" warnings in the db logs, and queries take unreasonably long (> 10 seconds) to return a value. Currently NO secondary index is set up.
My query (Go code) looks like this:
// FindLast returns the newest document for the given node name.
// ctx is passed in by the caller so the query can be cancelled.
func (mh *MongoHandler) FindLast(ctx context.Context, collection string, nodeName string, exEmpty bool) ([]TimeSeriesData, error) {
	coll := mh.client.Database(mh.database).Collection(collection)
	filter := bson.D{
		{Key: "meta.nodeName", Value: nodeName},
	}
	if exEmpty {
		// Optionally exclude documents whose value field is missing or empty.
		filter = append(filter, primitive.E{Key: "value", Value: bson.D{
			{Key: "$exists", Value: true},
			{Key: "$ne", Value: ""},
		}})
	}
	// Sort descending on the time field and keep only the newest document.
	opts := options.Find().SetSort(bson.D{{Key: "ts", Value: -1}}).SetLimit(1)

	cursor, err := coll.Find(ctx, filter, opts)
	if err != nil {
		return nil, err
	}
	var res []TimeSeriesData
	if err := cursor.All(ctx, &res); err != nil {
		return nil, err
	}
	return res, nil
}
Can you help me improve this query and speed it up? Would a secondary index on the timestamp field help here?
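For reference, the index I was considering is a compound secondary index on the metadata subfield plus the time field, created in mongosh roughly like this (a sketch, assuming meta is the collection's metaField and ts its timeField; the collection name is a placeholder):

    db.myTimeSeriesColl.createIndex({ "meta.nodeName": 1, "ts": -1 })

As far as I understand, this should let the sort-by-ts-descending-with-limit-1 query walk the index instead of scanning the whole collection, but I am not sure whether this is the right index shape for time series collections.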