Is there any data that Kusto Query Language (KQL) can inherently query that I could use as a stand-in for metrics I might otherwise pull out of Azure DevOps (e.g., build pipeline pass/fail rates)?
Kusto Query Language (KQL) can inherently query data from various Azure services, including some that can serve as alternatives to Azure DevOps metrics. Here are some options you could use as stand-ins for DevOps metrics:
- Azure Monitor Logs: KQL can query Azure Monitor logs, which contain telemetry data from various Azure services [1]. You can use this to analyze application performance, which could serve as a proxy for build quality.
- Application Insights: KQL can be used to query telemetry data from Application Insights [1]. This can provide insights into application performance and usage, which could indirectly reflect the quality of your builds and deployments (see the example query below).
- Azure Data Explorer: If you're ingesting your build pipeline data into Azure Data Explorer, you can use KQL to query this data directly [2][1]. This would allow you to analyze build pass/fail rates and other CI/CD metrics.
- Log Analytics: If you've configured Azure DevOps to send logs to a Log Analytics workspace, you can use KQL to query this data directly [3]. This would give you access to DevOps-specific metrics without needing to use the Azure DevOps API.
- Azure Sentinel: If you're using Azure Sentinel for security monitoring, you can use KQL to query security-related data [1]. This could provide insights into the security aspects of your builds and deployments.
While these options don’t provide a direct one-to-one replacement for Azure DevOps metrics, they can offer valuable insights into your application’s performance, usage, and security, which are often reflective of the quality of your build and deployment processes.
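As a concrete illustration of the Application Insights option, here is a minimal sketch of a failure-rate query that could stand in for a build-quality signal. It assumes a workspace-based Application Insights resource, where request telemetry lands in the AppRequests table; the 7-day window and daily bin are arbitrary choices:

```kusto
// Daily request failure rate as a rough proxy for build/release quality.
// Assumes a workspace-based Application Insights resource (AppRequests table).
AppRequests
| where TimeGenerated > ago(7d)
| summarize Total = count(), Failed = countif(Success == false) by bin(TimeGenerated, 1d)
| extend FailureRatePercent = round(100.0 * Failed / Total, 2)
| order by TimeGenerated asc
```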
How do I set up Azure devOps to send logs to a Log Analytics Workspace?
To set up Azure DevOps to send logs to a Log Analytics Workspace, you need to follow these steps:
- Enable Azure DevOps audit log streaming to Azure Monitor logs.
- Create a Log Analytics workspace if you don't already have one. You can do this in the Azure portal by searching for "Log Analytics workspaces" and selecting "Create" [4].
- Configure Azure DevOps to stream audit logs to your Log Analytics workspace. This is done in your Azure DevOps organization settings, under Auditing [5].
- Once configured, Azure DevOps will start sending audit logs to the specified Log Analytics workspace.
- The logs will be stored in a table called "AzureDevOpsAuditing" within your Log Analytics workspace [5].
- You can then query these logs using Kusto Query Language (KQL) in the Log Analytics workspace. For example, to retrieve the 10 most recent audit log events from the last 3 days, you can use the following query [5]:
AzureDevOpsAuditing
| where TimeGenerated > ago(3d)
| project TimeGenerated, ActivityId, ActorDisplayName, CategoryDisplayName, Details, OperationName, ProjectName, ScopeDisplayName, ScopeType, SourceSystem, Type, UserAgent
| top 10 by TimeGenerated
By setting up this integration, you’ll be able to analyze Azure DevOps audit logs alongside other logs in your Azure environment, enabling better monitoring and troubleshooting capabilities.
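Once events start arriving, a quick way to see what the stream actually contains is to summarize by operation. A minimal sketch (the 7-day window is arbitrary):

```kusto
// Count audit events by operation over the past week to see what's being captured.
AzureDevOpsAuditing
| where TimeGenerated > ago(7d)
| summarize Events = count() by OperationName, CategoryDisplayName
| order by Events desc
```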
Will turning this on help me see ADO pipeline executions, ADO repo branch creation, and such, over in Azure Monitor? Or only more surface-level things like users logging into ADO's web portal?
Enabling Azure DevOps audit log streaming to Azure Monitor logs will provide you with visibility into a wide range of activities within your Azure DevOps environment, including pipeline executions and repository branch creation, not just surface-level actions like user logins.
When you set up this integration, you’ll be able to see detailed information about:
- Pipeline Executions: You can track build and release pipeline runs, including success/failure rates and execution details [6] (see the pipeline-focused query below).
- Repository Activities: Branch creation, deletion, and other source control management actions will be logged [7].
- Project and Repository Changes: Any modifications to projects or repositories will be recorded [7].
- Azure Pipelines Changes: Updates or modifications to your pipeline configurations will be captured [7].
- Service Connection Changes: Any alterations to service connections will be logged [7].
- Permission and Access Control Modifications: Changes to user permissions or access controls will be recorded [7].
The logs will be stored in a table called "AzureDevOpsAuditing" within your Log Analytics workspace [8][7]. You can then use Kusto Query Language (KQL) to analyze these logs. For example, to retrieve the top 10 audit log events, you could use a query like this:
AzureDevOpsAuditing
| project TimeGenerated, ActivityId, ActorDisplayName, CategoryDisplayName, Details, OperationName, ProjectName, ScopeDisplayName, ScopeType, SourceSystem, Type, UserAgent
| top 10 by TimeGenerated
This integration provides a comprehensive view of activities within your Azure DevOps environment, allowing you to monitor, analyze, and set up alerts for various events, including pipeline executions and repository actions [7][9].
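To focus specifically on pipeline activity, you can filter on the operation name. This is a sketch under the assumption that pipeline-related audit events use operation names prefixed with "Pipelines." or "Release."; check the OperationName values in your own workspace, since the exact prefixes depend on the event types your organization generates:

```kusto
// Recent pipeline-related audit events, assuming "Pipelines."/"Release." prefixes.
AzureDevOpsAuditing
| where TimeGenerated > ago(7d)
| where OperationName startswith "Pipelines." or OperationName startswith "Release."
| project TimeGenerated, OperationName, ActorDisplayName, ProjectName, Details
| order by TimeGenerated desc
```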
This worked great. How about dumping similar data out of GitHub into Azure Monitor?
Yes, you can get GitHub audit log data into Azure Monitor, giving you functionality similar to what we discussed for Azure DevOps, though it takes one extra hop. Here's how you can set it up:
- GitHub Enterprise Cloud supports streaming audit logs to Azure Blob Storage. This feature allows you to send audit and Git events data from GitHub to an external data management system [10].
- To set up streaming to Azure Blob Storage:
- Create a storage account and container in Microsoft Azure.
- Generate a SAS (Shared Access Signature) token URL for the container.
- In GitHub, navigate to your enterprise settings, then to Audit log > Log streaming.
- Select “Azure Blob Storage” as the streaming destination.
- Enter the SAS URL and save the configuration [10].
- Once set up, GitHub will stream audit log events to your Azure Blob Storage container. These events include a wide range of activities such as repository creation, permission changes, and user management actions [10][11].
- To make this data available in Azure Monitor:
- Create a Log Analytics workspace if you don’t already have one.
- Set up an ingestion process to move data from Blob Storage to Log Analytics. This could be done using Azure Functions or Azure Logic Apps.
- Once the data is in Log Analytics, you can use Kusto Query Language (KQL) to analyze the GitHub audit logs alongside other data in Azure Monitor [12] (an example query follows below).
- You can then create custom dashboards, set up alerts, and perform detailed analysis of your GitHub activity within the Azure Monitor environment.
This integration allows you to centralize your logging and monitoring, combining GitHub audit data with other Azure services for comprehensive visibility across your entire infrastructure [13].
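Once your ingestion process lands the GitHub events in the workspace, they'll appear in whatever custom table you defined. As a sketch, assuming a hypothetical custom table named GitHubAudit_CL with the audit payload's action and actor fields mapped to columns (your actual table and column names depend on how you set up the ingestion):

```kusto
// Most common GitHub audit actions over the past week.
// GitHubAudit_CL, action_s, and actor_s are placeholders for whatever
// your custom ingestion pipeline actually creates.
GitHubAudit_CL
| where TimeGenerated > ago(7d)
| summarize Events = count() by action_s, actor_s
| order by Events desc
```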
It looks like audit logs can also go from GitHub to Azure Event Hubs or to Splunk. And logs can stream from ADO to Azure Event Grid. What data flow architecture do you recommend if I want both version control systems’ audit logs to end up in both an Azure Log Analytics Workspace and also in Splunk? I feel like maybe an event-based model might be a good idea somehow, but maybe that’s premature optimization compared to just setting up a few redundant connectors.
To efficiently stream audit logs from both GitHub and Azure DevOps (ADO) to Azure Log Analytics Workspace and Splunk, I recommend the following data flow architecture:
GitHub Audit Logs
- Stream GitHub audit logs to Azure Event Hubs.
- Use Azure Functions to process the data from Event Hubs and send it to:
- Azure Log Analytics Workspace
- Splunk (via HTTP Event Collector)
Azure DevOps Audit Logs
- Stream ADO audit logs to Azure Event Grid.
- Use Azure Functions to process the data from Event Grid and send it to:
- Azure Log Analytics Workspace
- Splunk (via HTTP Event Collector)
Benefits of this architecture
- Centralized event processing: Using Azure Event Hubs and Event Grid as central hubs allows for efficient distribution of data to multiple destinations.
- Scalability: Event-based architecture can handle large volumes of audit log data efficiently.
- Flexibility: Azure Functions provide the ability to transform and enrich the data before sending it to the final destinations.
- Cost-effective: This approach minimizes data transfer costs and reduces the need for multiple direct integrations.
- Real-time processing: Events can be processed and forwarded to both destinations in near real-time.
- Reliability: Azure’s event-based services offer built-in retry mechanisms and dead-lettering for failed events.
While setting up redundant connectors directly from the sources to both destinations might seem simpler initially, the proposed event-based architecture offers better scalability, flexibility, and cost-effectiveness in the long run, especially when dealing with large volumes of audit log data from multiple sources [14][15][16].
It seems like all that Azure Functions runtime might really add up, cost-wise, compared to direct connectors, whose cost and maintenance headaches probably land more on Microsoft under the hood.
While Azure Functions can indeed incur significant costs for heavy usage, it’s not necessarily more expensive than using direct connectors. The cost comparison depends on several factors:
- Usage patterns: Azure Functions are billed based on execution time and number of executions. For infrequent or low-volume integrations, Functions can be very cost-effective, especially with the free grant of 1 million executions and 400,000 GB-s per month [17].
- Complexity: Functions allow for custom code execution, which can be more efficient for complex integrations that may require multiple connector actions.
- Connector pricing: Some enterprise connectors have their own pricing, which can add up for high-volume scenarios [18].
- Scalability: Functions automatically scale based on demand, which can be more cost-effective for variable workloads compared to always-on solutions [19].
- Premium features: The Azure Functions Premium plan offers enhanced performance and VNET access, which might be necessary for certain enterprise scenarios [17].
For a fair comparison, you’d need to consider:
- The specific integration requirements
- Expected volume and frequency of executions
- Any additional features needed (like VNET access)
- The pricing of specific connectors vs. custom code execution
In some cases, direct connectors might be more cost-effective, especially for simple, high-volume integrations. However, Functions offer greater flexibility and can be more economical for complex or variable workloads. It’s essential to analyze your specific use case to determine the most cost-effective solution.
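To make that concrete with a rough worked example (using the free grant figures above; per-execution rates vary by region and plan): if the audit streams triggered a function 50,000 times a month and each run took about one second at 128 MB, that would be 50,000 executions and roughly 50,000 × 0.125 GB ≈ 6,250 GB-s of compute, both well inside the monthly free grant of 1 million executions and 400,000 GB-s, so the Functions compute itself would likely cost little or nothing. In that scenario, the bigger ongoing cost is usually the surrounding plumbing (Event Hubs/Event Grid throughput) and the custom code you now have to maintain.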
I think this counts as “simple, high-volume integrations.”
You’re absolutely right, and I appreciate you pointing that out. For simple, high-volume integrations like streaming audit logs from GitHub and Azure DevOps to Azure Log Analytics Workspace and Splunk, direct connectors are indeed likely to be more cost-effective and simpler to maintain. Let’s revise the recommendation based on this insight:
Revised Recommendation
- GitHub Audit Logs:
- Use the native GitHub integration to stream logs directly to Azure Event Hubs
- Configure Azure Event Hubs to forward events to Azure Log Analytics Workspace
- Set up GitHub’s native Splunk integration for direct streaming to Splunk
- Azure DevOps Audit Logs:
- Use the built-in Azure DevOps feature to stream logs directly to Azure Log Analytics Workspace
- Configure Azure DevOps to stream logs to Azure Event Grid
- Set up an Azure Event Grid subscription to forward events to Splunk (using Splunk’s Azure Event Grid integration)
This approach leverages existing connectors and integrations, reducing the need for custom code and additional Azure services. It's likely to be more cost-effective and easier to maintain, as it relies on Microsoft's and GitHub's managed services.
Key benefits of this revised approach:
- Lower operational costs
- Reduced complexity
- Less maintenance overhead
- Utilizes native integrations, which are typically optimized for performance and reliability
This solution aligns better with the “simple, high-volume” nature of audit log streaming, making it a more suitable choice for your scenario.
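Whichever combination of connectors you land on, a quick cross-source check in the Log Analytics workspace confirms both streams are flowing. A sketch, assuming the Azure DevOps events are in AzureDevOpsAuditing and the GitHub events are in a hypothetical custom table GitHubAudit_CL (substitute whatever table your GitHub ingestion actually writes to):

```kusto
// Daily event counts per source table, to verify both audit streams are arriving.
// isfuzzy=true keeps the query working even if one table doesn't exist yet.
union withsource=SourceTable isfuzzy=true AzureDevOpsAuditing, GitHubAudit_CL
| where TimeGenerated > ago(7d)
| summarize Events = count() by SourceTable, bin(TimeGenerated, 1d)
| order by TimeGenerated asc, SourceTable asc
```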
References
1. https://learn.microsoft.com/en-us/kusto/query/?view=microsoft-fabric
2. https://docs.azure.cn/en-us/data-explorer/kusto/query/?view=azure-data-explorer
3. https://learn.microsoft.com/en-us/viva/goals/azure-devops-integration
4. https://learn.microsoft.com/en-us/azure/azure-monitor/logs/quick-create-workspace
5. https://kristhecodingunicorn.com/post/azure-devops-audit-azure-monitor-logs/
6. https://squaredup.com/blog/10-metrics-Azure-DevOps-engineer-should-monitor/
7. https://kristhecodingunicorn.com/post/azure-devops-audit-azure-monitor-logs/
8. https://learn.microsoft.com/en-us/answers/questions/1330135/how-to-get-azure-devops-pipeline-status-logs-on-ru
9. https://learn.microsoft.com/en-us/azure/azure-monitor/app/release-and-work-item-insights
10. https://docs.github.com/enterprise-cloud@latest/admin/monitoring-activity-in-your-enterprise/reviewing-audit-logs-for-your-enterprise/streaming-the-audit-log-for-your-enterprise
11. https://docs.github.com/github-ae@latest/admin/monitoring-activity-in-your-enterprise/reviewing-audit-logs-for-your-enterprise/accessing-the-audit-log-for-your-enterprise
12. https://github.com/MicrosoftDocs/azure-monitor-docs/blob/main/articles/azure-monitor/logs/query-audit.md
13. https://azure.github.io/PSRule.Rules.Azure/setup/setup-azure-monitor-logs/
14. https://learn.microsoft.com/en-us/azure/devops/organizations/audit/azure-devops-auditing?view=azure-devops
15. https://learn.microsoft.com/en-us/azure/sentinel/data-connectors/github-enterprise-audit-log
16. https://docs.github.com/enterprise-cloud@latest/admin/monitoring-activity-in-your-enterprise/reviewing-audit-logs-for-your-enterprise/streaming-the-audit-log-for-your-enterprise
17. https://azure.microsoft.com/en-us/pricing/details/functions/
18. https://azure.microsoft.com/en-us/pricing/details/logic-apps/
19. https://walkerscott.co/2020/03/azure-logic-apps-vs-azure-functions/