
284 VOTE
Status Submitted
Categories Atlas
Created by Guest
Created on Nov 21, 2019

Ability to stream logs to Cloudwatch Logs/Datadog/Splunk

There's no way to stream logs from MongoDB on Atlas right now. I should be able to stream logs, either to Datadog or Cloudwatch or _something_!
  • Guest
    Apr 18, 2025
    I am also looking for this feature in Datadog: not only metrics but also the mongod logs.
  • Guest
    Jan 7, 2025
    Streaming logs to Datadog is a feature that we need.
  • Guest
    Jan 6, 2025
    Waiting for this feature soon!
  • Guest
    Nov 18, 2024
    Splunk connectivity for compliance review is a critical part of our production operations. This functionality is therefore critical for us.
  • Guest
    Aug 20, 2024
    It is needed to monitor Production instances.
  • Guest
    Aug 20, 2024
    Since we have a lot of production infra assets hosted in AWS, many of our tech support teams rely heavily on AWS CloudWatch. It would be great if MongoDB Atlas could create an integration with AWS so that some or all of the KPIs available in the Atlas UI could be ingested and/or presented in an AWS CloudWatch dashboard, particularly for Atlas clusters that are hosted in AWS.
  • Guest
    Aug 1, 2024
    The idea is to connect a MongoDB Atlas organization with a CM solution like Wazuh (or other solutions), where we can set triggers and monitor logs in real time for security enhancement, monitoring, etc.
  • Guest
    Jul 10, 2024
    It's possible to stream auditing from Atlas to AWS S3. Please implement the same for GCP block storage. Best would be the possibility to stream all audit logs (Atlas, project, RS) to GCP block storage.
  • Guest
    May 30, 2024
    We need to integrate mongod logs with Exabeam to view the audit logs, and we need these logs for PCI compliance.
  • Guest
    Apr 11, 2024
    How about streaming to Logstash and Kibana? Logs from multiple nodes could also be combined into a single portal, and users could apply whatever filter they want to search.
  • Guest
    Apr 9, 2024
    We have introduced a capability to push cluster logs to a customer-owned S3 bucket. This push happens every five minutes. This capability is not the same as streaming, but it is a step in that direction. https://www.mongodb.com/docs/atlas/push-logs/
  • Guest
    Feb 22, 2024
    The ability to stream logs to CloudWatch Logs would eliminate a lot of complexity and dependencies.
  • Guest
    Feb 14, 2024
    At least document a workaround to either feed the logs to Datadog via the API, or stream them to S3 and then to Datadog.
  • Guest
    Dec 20, 2023
    Why can't we dump audit logs to a Log Analytics workspace when we are using Azure private endpoints to connect to Atlas?
  • Guest
    Dec 7, 2023
    We are standardizing on Datadog and would like to have logs in a centralized location. It would be great to have logs streamed to Datadog.
  • Guest
    Sep 25, 2023
    Yes, I confirm the need to stream logs to the cloud provider's streaming service (for example, Azure Event Hubs).
  • Guest
    Aug 10, 2023
    Any movement on this one that was raised back in 2019?
  • Guest
    May 30, 2023
    Our requirement is to feed these audit logs into a central system, which will help us report on elevated or unauthorized changes to the database services. Currently, the only way to pull these logs through the API is to download each file from each node within a cluster as a .gz file (https://www.mongodb.com/docs/atlas/reference/api-resources-spec/#tag/Monitoring-and-Logs/operation/downloadHostLogs). What we'd like to request is a way to pull the project (or cluster) audit logs that acts like the Events API endpoint, where it is a paged list of events that have occurred for a project (https://www.mongodb.com/docs/atlas/reference/api-resources-spec/#tag/Events/operation/listProjectEvents).
  • Guest
    Apr 6, 2023
    Logs, and activity feed also (like alerts, as datadog events)!
  • Guest
    Mar 13, 2023
    We believe shipping logs to S3-compatible object storage APIs such as AWS S3 or GCP Cloud Storage is the highest priority and would drive the most value. Many log routing frameworks support pulling logs from S3 or GCP, so you could cover more customer needs this way. We would like to see database logs (all of them) prioritized first, so that our teams can easily view database status across all clusters and projects using our log provider (it's not Datadog/CloudWatch/GCP). The Atlas team should also consider https://feedback.mongodb.com/forums/924145-atlas/suggestions/43971369-send-atlas-logs-to-s3 when thinking about this functionality. Shipping to S3/GCP Storage would also allow Big Data frameworks such as GCP BigQuery or AWS Datalake to succeed, thereby serving the needs of internal data analysts as well as developers who need to view logs.
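
For anyone scripting the per-node pull workflow described in the comments above, here is a minimal sketch using only the Python standard library. The URL shape follows the downloadHostLogs operation linked in the comments; the project ID, hostname, and API key values are placeholders, and the exact path and authentication details should be verified against the current Atlas Administration API spec.

```python
import urllib.request

# Base URL for the v1.0 Atlas Administration API -- verify against current docs.
ATLAS_BASE = "https://cloud.mongodb.com/api/atlas/v1.0"


def log_url(group_id: str, hostname: str, log_name: str = "mongodb.gz") -> str:
    """Build the per-node log download URL (shape per the downloadHostLogs docs).

    group_id is the Atlas project ID; hostname is one node of the cluster;
    log_name may also be e.g. "mongodb-audit-log.gz" for audit logs.
    """
    return f"{ATLAS_BASE}/groups/{group_id}/clusters/{hostname}/logs/{log_name}"


def download_log(public_key: str, private_key: str,
                 group_id: str, hostname: str, dest_path: str) -> None:
    """Download one node's compressed log file using an Atlas API key.

    Atlas programmatic API keys authenticate with HTTP digest auth.
    """
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, ATLAS_BASE, public_key, private_key)
    opener = urllib.request.build_opener(urllib.request.HTTPDigestAuthHandler(mgr))
    with opener.open(log_url(group_id, hostname)) as resp, open(dest_path, "wb") as out:
        out.write(resp.read())  # response body is the gzip-compressed log file
```

Until true streaming exists, running this per node and per log type on a schedule, then pointing a forwarder (for example, an agent tailing the decompressed files) at the output, is one possible workaround along the lines suggested in the comments.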