When you use `EventId` in .NET logs, both the `Id` (an integer) and the `Name` are sent to Application Insights as part of `customDimensions`. However, `EventId.Id` is stored as a string, which affects how you can filter it.
The Application Insights UI filter panel only supports string operations such as `equals` or `contains`; you can't use greater-than or less-than filters directly in the UI.
To filter numerically, use the Logs (Analytics) tab with Kusto Query Language (KQL):
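The original query isn't shown here, so the following is a minimal sketch; it assumes `EventId.Id` lands in `customDimensions` under the key `EventId`:

```kusto
traces
| extend eventId = toint(customDimensions.EventId)   // cast string -> int
| where eventId between (1000 .. 1999)               // e.g., all Auth events
| order by timestamp desc
```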
This converts the string to an integer so you can filter properly.
Use numeric ranges for `EventId` to categorize logs (e.g., 1000–1999 = Auth, 2000–2999 = Payments) and filter easily with KQL.
Tracking custom events with `TrackEvent()` in Azure Application Insights helps you monitor how users interact with your app. However, to obtain meaningful data, it's essential to name your events clearly and consistently.
Here are some best practices:
Write event names using a Verb-Noun format like `UserLoggedIn`, `FileDownloaded`, or `OrderSubmitted`. This makes them easy to understand.
Use PascalCase (each word starts with a capital letter), and avoid:

- `User Logged In` (spaces)
- `User-Logged-In` (hyphens)
- `user_logged_in` (snake_case)

Use: `UserLoggedIn`
Keep user IDs, file names, or other changing values out of the event name.
Instead, put that info in custom properties or metrics.
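As a sketch of that separation (the method, the injected `TelemetryClient`, and the property names are illustrative, not from the original):

```csharp
using System.Collections.Generic;
using Microsoft.ApplicationInsights;

// The event name stays stable; changing values go into custom properties.
void TrackDownload(TelemetryClient telemetryClient, string userId, string fileName)
{
    var properties = new Dictionary<string, string>
    {
        ["UserId"] = userId,        // variable data -> property, not name
        ["FileName"] = fileName
    };
    telemetryClient.TrackEvent("FileDownloaded", properties);
}
```

Filtering on `customDimensions.UserId` in KQL then works without fragmenting the event-name space.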
Follow the same naming style across your app. This makes it easier to search and analyze data later.
If similar actions happen in different contexts, make your event name more specific: `ContactFormSubmitted` vs. `FeedbackFormSubmitted`.
Use a naming template like `<Action><Entity>[<Qualifier>]`. Example: `AddedItemToCart`.
A clean, consistent naming strategy makes your telemetry easier to work with, both now and in the future.
Application Insights doesn't store the original .NET `LogLevel` (like `Debug` or `Trace`); it only stores `SeverityLevel`, which combines them. To make your logs easier to filter and analyze, you can add `LogLevel` as a custom property using a `TelemetryInitializer`.
Example:
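The initializer code itself isn't included above, so here is a minimal sketch of what a `LogLevelInitializer` could look like. Note one caveat: inside an initializer only the already-mapped `SeverityLevel` is available, so `Trace` and `Debug` both surface as `Verbose`; capturing the exact original level would have to happen at the logging call site.

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class LogLevelInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        // Only trace telemetry carries a severity; skip requests, dependencies, etc.
        if (telemetry is TraceTelemetry trace &&
            !trace.Properties.ContainsKey("LogLevel"))
        {
            // Stamp a LogLevel key so it appears in customDimensions.
            trace.Properties["LogLevel"] =
                trace.SeverityLevel?.ToString() ?? "Unknown";
        }
    }
}
```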
Register the initializer in your app:
services.AddSingleton<ITelemetryInitializer, LogLevelInitializer>();
Now, every trace will include a `LogLevel` key in `customDimensions`.
Application Insights uses a fixed enum called `SeverityLevel` with five levels: `Verbose`, `Information`, `Warning`, `Error`, and `Critical`.
When logging in .NET using `ILogger`, the log level (such as `Debug`, `Information`, or `Error`) is internally mapped to Application Insights' `SeverityLevel`. However, the mapping isn't one-to-one, and by default you can lose detail.
Application Insights uses this enum:
| Application Insights SeverityLevel | Typical .NET LogLevel |
|---|---|
| Verbose (0) | Trace / Debug |
| Information (1) | Information |
| Warning (2) | Warning |
| Error (3) | Error |
| Critical (4) | Critical |
Both `Trace` and `Debug` are treated as `Verbose`, which means they can't be distinguished once sent to Application Insights.
LogLevel as a Custom Property

To retain the original `LogLevel`, consider using a `TelemetryInitializer` to add it manually, so you can later filter logs by both `SeverityLevel` and the original `LogLevel`.
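Once such a custom property exists, filtering on both dimensions is straightforward in KQL. A hedged sketch, assuming the property is named `LogLevel`:

```kusto
traces
| where severityLevel == 0                           // Verbose: Trace or Debug
| where tostring(customDimensions.LogLevel) == "Debug"
| project timestamp, message, severityLevel, customDimensions
```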
Are you just starting your cloud journey or looking for ways to upgrade your knowledge in specific areas? Azure Charts is a web-based application that allows you to see what Azure consists of and how it evolves.
References
Source app settings from key vault
Complete reference:
@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/)
Alternatively:
@Microsoft.KeyVault(VaultName=myvault;SecretName=mysecret)
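A Key Vault reference is supplied as an ordinary app setting. For example, this JSON fragment (names are placeholders) follows the bulk format accepted by `az webapp config appsettings set --settings @file.json`:

```json
[
  {
    "name": "MySecret",
    "value": "@Microsoft.KeyVault(VaultName=myvault;SecretName=mysecret)",
    "slotSetting": false
  }
]
```

At runtime, App Service resolves the reference and hands your app the secret value as if it were a plain setting.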
Source: Use Key Vault references - Azure App Service | Microsoft Learn
Microsoft has announced that window sharing of the Microsoft Teams application is now generally available on the Azure Virtual Desktop for Windows users.
Application window sharing now allows users to select a specific window from their desktop screen to share. Previously, users could only share their entire desktop window or a Microsoft PowerPoint Live presentation. Application window sharing helps reduce the risk of showing sensitive content during meetings/calls and keeps meetings focused by directing participants to specific content.
Read more at Azure Daily 2022
The Change Tracking and Inventory service tracks changes to Files, Registry, Software, Services, and Daemons and uses the MMA (Microsoft Monitoring Agent)/OMS (Operations Management Suite) agent. This preview adds support for the new AMA (Azure Monitor Agent) and enhances the following:
Large language models are quickly becoming an essential platform for people to innovate, apply AI to solve big problems, and imagine what’s possible. Today, we are excited to announce the general availability of Azure OpenAI Service as part of Microsoft’s continued commitment to democratizing AI, and ongoing partnership with OpenAI.
With Azure OpenAI Service now generally available, more businesses can apply for access to the most advanced AI models in the world—including GPT-3.5, Codex, and DALL•E 2—backed by the trusted enterprise-grade capabilities and AI-optimized infrastructure of Microsoft Azure, to create cutting-edge applications. Customers will also be able to access ChatGPT—a fine-tuned version of GPT-3.5 that has been trained and runs inference on Azure AI infrastructure—through Azure OpenAI Service soon.
We debuted Azure OpenAI Service in November 2021 to enable customers to tap into the power of large-scale generative AI models with the enterprise promises customers have come to expect from our Azure cloud and computing infrastructure—security, reliability, compliance, data privacy, and built-in Responsible AI capabilities.
Since then, one of the most exciting things we’ve seen is the breadth of use cases Azure OpenAI Service has enabled our customers—from generating content that helps better match shoppers with the right purchases to summarizing customer service tickets, freeing up time for employees to focus on more critical tasks.
Customers of all sizes across industries are using Azure OpenAI Service to do more with less, improve experiences for end-users, and streamline operational efficiencies internally. From startups like Moveworks to multinational corporations like KPMG, organizations small and large are applying the capabilities of Azure OpenAI Service to advanced use cases such as customer support, customization, and gaining insights from data using search, data extraction, and classification.
“At Moveworks, we see Azure OpenAI Service as an important component of our machine learning architecture. It enables us to solve several novel use cases, such as identifying gaps in our customer’s internal knowledge bases and automatically drafting new knowledge articles based on those gaps. This saves IT and HR teams a significant amount of time and improves employee self-service. Azure OpenAI Service will also radically enhance our existing enterprise search capabilities and supercharge our analytics and data visualization offerings. Given that so much of the modern enterprise relies on language to get work done, the possibilities are endless—and we look forward to continued collaboration and partnership with Azure OpenAI Service."—Vaibhav Nivargi, Chief Technology Officer and Founder at Moveworks.
“Al Jazeera Digital is constantly exploring new ways to use technology to support our journalism and better serve our audience. Azure OpenAI Service has the potential to enhance our content production in several ways, including summarization and translation, selection of topics, AI tagging, content extraction, and style guide rule application. We are excited to see this service go to general availability so it can help us further contextualize our reporting by conveying the opinion and the other opinion.”—Jason McCartney, Vice President of Engineering at Al Jazeera.
“KPMG is using Azure OpenAI Service to help companies realize significant efficiencies in their Tax ESG (Environmental, Social, and Governance) initiatives. Companies are moving to make their total tax contributions publicly available. With much of these tax payments buried in IT systems outside of finance, massive data volumes, and incomplete data attributes, Azure OpenAI Service finds the data relationships to predict tax payments and tax type—making it much easier to validate accuracy and categorize payments by country and tax type.”—Brett Weaver, Partner, Tax ESG Leader at KPMG.
The general availability of Azure OpenAI Service is not only an important milestone for our customers but also for Azure.
Azure OpenAI Service provides businesses and developers with high-performance AI models at production scale with industry-leading uptime. This is the same production service that Microsoft uses to power its own products, including GitHub Copilot, an AI pair programmer that helps developers write better code, Power BI, which leverages GPT-3-powered natural language to automatically generate formulae and expressions, and the recently-announced Microsoft Designer, which helps creators build stunning content with natural language prompts.
All of this innovation shares a common thread: Azure’s purpose-built, AI-optimized infrastructure.
Azure is also the core computing power behind OpenAI API’s family of models for research advancement and developer production.
Azure is currently the only global public cloud that offers AI supercomputers with massive scale-up and scale-out capabilities. With a unique architecture design that combines leading GPU and networking solutions, Azure delivers best-in-class performance and scale for the most compute-intensive AI training and inference workloads. It’s the reason the world’s leading AI companies—including OpenAI, Meta, Hugging Face, and others—continue to choose Azure to advance their AI innovation. Azure currently ranks in the top 15 of the TOP500 supercomputers worldwide and is the highest-ranked global cloud services provider today. Azure continues to be the cloud and compute power that propels large-scale AI advancements across the globe.
Source: TOP500 The List: TOP500 November 2022, Green500 November 2022.
As an industry leader, we recognize that any innovation in AI must be done responsibly. This becomes even more important with powerful, new technologies like generative models. We have taken an iterative approach to large models, working closely with our partner OpenAI and our customers to carefully assess use cases, learn, and address potential risks. Additionally, we’ve implemented our own guardrails for Azure OpenAI Service that align with our Responsible AI principles. As part of our Limited Access Framework, developers are required to apply for access, describing their intended use case or application before they are given access to the service. Content filters uniquely designed to catch abusive, hateful, and offensive content constantly monitor the input provided to the service as well as the generated content. In the event of a confirmed policy violation, we may ask the developer to take immediate action to prevent further abuse.
We are confident in the quality of the AI models we are using and offering customers today, and we strongly believe they will empower businesses and people to innovate in entirely new and exciting ways.
The pace of innovation in the AI community is moving at lightning speed. We’re tremendously excited to be at the forefront of these advancements with our customers, and look forward to helping more people benefit from them in 2023 and beyond.
Azure Data Explorer (ADX) now supports managed ingestion from Azure Cosmos DB.
This feature enables near real-time analytics on Cosmos DB data in a managed setting (ADX data connection). Since ADX supports Power BI direct query, it enables near real-time Power BI reporting. The latency between Cosmos DB and ADX can be as low as sub-seconds (using streaming ingestion).
This brings the best of both worlds: fast, low-latency transactional workloads with Azure Cosmos DB and fast, ad hoc analytics with Azure Data Explorer.
Only Azure Cosmos DB for NoSQL is supported.
Source: Public Preview: Azure Cosmos DB to Azure Data Explorer Synapse Link