Announcing Qwak Feature Store support for MongoDB data sources
Qwak Feature Store provides a unified store for features during training and real-time inference, without the need to write additional code or create manual processes to keep features consistent.
Qwak Feature Store supports several ways to ingest features, including batch, streaming, and non-materialized feature sets, and we recently added support for MongoDB as a batch data source to pull data from.
The Data Source connectors provide you with a consistent data source interface for any database, and they create a standard way to combine stream and batch data sources for Feature Transformations.
Integrating MongoDB and Qwak Feature Store
Defining a Feature Set enables you to create features from your analytical data: when calculating feature values, Qwak will simply read from the underlying data source.
For example, in a fraud detection model use case, we might have two values to pull from the MongoDB data source:
- Average transaction per customer - avg_amount
- Standard deviation of a transaction per customer - stddev_amount
And two from streaming events via Kafka:
- Last transaction amount
- Last transaction time
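To make the two batch features concrete, here is a minimal, framework-free sketch of how avg_amount and stddev_amount could be computed per customer. This is plain Python, independent of the Qwak SDK; the record shape and customer IDs are hypothetical.

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical raw transaction records, as they might be read from MongoDB.
transactions = [
    {"customer_id": "c1", "amount": 100.0},
    {"customer_id": "c1", "amount": 140.0},
    {"customer_id": "c1", "amount": 120.0},
    {"customer_id": "c2", "amount": 50.0},
    {"customer_id": "c2", "amount": 70.0},
]

# Group transaction amounts per customer.
amounts = defaultdict(list)
for tx in transactions:
    amounts[tx["customer_id"]].append(tx["amount"])

# Compute the two batch features for each customer.
features = {
    customer: {
        "avg_amount": mean(values),
        # stdev needs at least two samples; fall back to 0.0 otherwise.
        "stddev_amount": stdev(values) if len(values) > 1 else 0.0,
    }
    for customer, values in amounts.items()
}

print(features["c1"])  # {'avg_amount': 120.0, 'stddev_amount': 20.0}
```

In production, a feature set transformation would express the same aggregation declaratively, and Qwak would execute it against the registered data source.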
Architecture
How to configure MongoDB as a Data Source
MongoDB data source connector definition:
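As an illustration, a MongoDB batch source definition could look roughly like the sketch below. The class name, import path, and parameter names here are assumptions for illustration only; consult the Qwak SDK reference for the exact API.

```python
# Illustrative sketch: names below are assumptions, not the verified Qwak API.
from qwak.feature_store.data_sources import MongoDbSource  # hypothetical import path

mongo_source = MongoDbSource(
    name="transactions_mongo",
    description="Customer transactions stored in MongoDB",
    hosts="mongodb.example.com:27017",     # hypothetical connection details
    database="payments",
    collection="transactions",
    username_secret_name="mongo_user",     # credentials typically referenced as secrets
    password_secret_name="mongo_password",
)
```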
Register batch feature using the MongoDB connector:
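A batch feature set over that source might then be registered along these lines. Again, the decorator, transformation class, and SQL dialect shown are illustrative assumptions; the aggregation itself mirrors the avg_amount and stddev_amount features described above.

```python
# Illustrative sketch: decorator and class names are assumptions.
from qwak.feature_store.feature_sets import batch
from qwak.feature_store.feature_sets.transformations import SparkSqlTransformation

@batch.feature_set(
    name="customer-transaction-aggregates",
    key="customer_id",
    data_sources=["transactions_mongo"],  # the MongoDB source defined earlier
)
def transaction_aggregates():
    # Aggregate per customer: average and standard deviation of amounts.
    return SparkSqlTransformation(
        """
        SELECT customer_id,
               AVG(amount)    AS avg_amount,
               STDDEV(amount) AS stddev_amount
        FROM transactions_mongo
        GROUP BY customer_id
        """
    )
```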
Qwak Feature Store helps ensure that models make accurate predictions by making the same features available for both training and inference.