Microsoft has announced that window sharing in the Microsoft Teams application is now generally available on Azure Virtual Desktop for Windows users.
Application window sharing allows users to select a specific window from their desktop screen to share. Previously, users could only share their entire desktop or a Microsoft PowerPoint Live presentation. Sharing a single application window reduces the risk of exposing sensitive content during meetings and calls, and keeps meetings focused by directing participants to specific content.
Read more at Azure Daily 2022
As the central authentication repository used by Azure, Azure Active Directory (Azure AD) stores objects such as users, groups, and service principals as identities, and lets you use those identities to authenticate with different Azure services. Azure AD authentication is already supported for Azure SQL Database, Azure SQL Managed Instance, SQL Server on Azure Windows VMs, and Azure Synapse Analytics, and now we are bringing it to SQL Server 2022.
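As a minimal sketch of how a client opts into Azure AD authentication, the Microsoft ODBC driver accepts an `Authentication` keyword in the connection string. The server and database names below are placeholders, and `ActiveDirectoryInteractive` is one of several Azure AD modes (service principal and managed identity modes also exist):

```python
# Sketch: a connection string for Azure AD authentication with the
# Microsoft ODBC driver. Server and database names are placeholders.
def build_connection_string(server: str, database: str) -> str:
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Authentication=ActiveDirectoryInteractive;"  # Azure AD auth mode
        "Encrypt=yes;"
    )

conn_str = build_connection_string("sql2022.contoso.com", "SalesDB")
# A client such as pyodbc would then use it:
#   import pyodbc; conn = pyodbc.connect(conn_str)
print(conn_str)
```

The same keyword-based approach works from any ODBC-based client, so existing connection code only needs the `Authentication` keyword swapped in.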
General Availability: Azure Active Directory authentication for exporting and importing Managed Disks
Azure already supports locking disk imports and exports to a trusted Azure Virtual Network (VNet) using Azure Private Link. For greater security, we are launching integration with Azure Active Directory (Azure AD) for exporting and importing data to Azure Managed Disks. This feature enables the system to validate the identity of the requesting user in Azure AD and verify that the user has the permissions required to export or import that disk.
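For orientation, disk export goes through an ARM call that grants temporary SAS access to the disk; the identity check described above happens when that call is authorized. The sketch below only builds the request URL; the subscription, resource group, disk name, and `api-version` are placeholder assumptions, and a real request would be a POST carrying an Azure AD bearer token in the `Authorization` header:

```python
# Sketch: building the ARM REST URL that grants export access to a
# managed disk. All identifiers below are placeholders.
def grant_access_url(subscription: str, resource_group: str,
                     disk: str, api_version: str = "2021-12-01") -> str:
    return (f"https://management.azure.com/subscriptions/{subscription}"
            f"/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Compute/disks/{disk}/beginGetAccess"
            f"?api-version={api_version}")

url = grant_access_url("0000-sub", "my-rg", "my-disk")
print(url)
```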
The Change Tracking and Inventory service tracks changes to files, registry keys, software, services, and daemons, and currently uses the MMA (Microsoft Monitoring Agent)/OMS (Operations Management Suite) agent. This preview adds support for the new Azure Monitor Agent (AMA) and brings several enhancements.
Public Preview: Azure Automation Visual Studio Code Extension
You can now use the Azure Automation extension for Visual Studio Code to quickly create and manage runbooks. All runbook creation and management operations, such as editing runbooks, triggering jobs, tracking recent jobs, linking a schedule or webhook, asset management, and local debugging, are supported in the extension, making it easier for you to work in the IDE-like interface provided by VS Code.
Large language models are quickly becoming an essential platform for people to innovate, apply AI to solve big problems, and imagine what’s possible. Today, we are excited to announce the general availability of Azure OpenAI Service as part of Microsoft’s continued commitment to democratizing AI, and ongoing partnership with OpenAI.
With Azure OpenAI Service now generally available, more businesses can apply for access to the most advanced AI models in the world—including GPT-3.5, Codex, and DALL•E 2—backed by the trusted enterprise-grade capabilities and AI-optimized infrastructure of Microsoft Azure, to create cutting-edge applications. Customers will also be able to access ChatGPT—a fine-tuned version of GPT-3.5 that has been trained and runs inference on Azure AI infrastructure—through Azure OpenAI Service soon.
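As a minimal sketch of how an application reaches these models, the Azure OpenAI REST API addresses a model through a deployment inside your own Azure resource. The resource name, deployment name, and `api-version` below are placeholder assumptions; a real request would POST a JSON body containing the prompt and pass the resource key in an `api-key` header:

```python
# Sketch: composing the Azure OpenAI completions endpoint URL.
# Resource name, deployment name, and api-version are placeholders.
def completions_url(resource: str, deployment: str,
                    api_version: str = "2022-12-01") -> str:
    return (f"https://{resource}.openai.azure.com/openai/"
            f"deployments/{deployment}/completions"
            f"?api-version={api_version}")

url = completions_url("my-resource", "text-davinci-003")
print(url)
```

Routing requests through a per-customer resource and deployment is what lets the service apply the enterprise controls (networking, keys, content filtering) described in this post.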
We debuted Azure OpenAI Service in November 2021 to enable customers to tap into the power of large-scale generative AI models with the enterprise promises customers have come to expect from our Azure cloud and computing infrastructure—security, reliability, compliance, data privacy, and built-in Responsible AI capabilities.
Since then, one of the most exciting things we’ve seen is the breadth of use cases Azure OpenAI Service has enabled for our customers—from generating content that better matches shoppers with the right purchases to summarizing customer service tickets, freeing up time for employees to focus on more critical tasks.
Customers of all sizes across industries are using Azure OpenAI Service to do more with less, improve experiences for end-users, and streamline operational efficiencies internally. From startups like Moveworks to multinational corporations like KPMG, organizations small and large are applying the capabilities of Azure OpenAI Service to advanced use cases such as customer support, customization, and gaining insights from data using search, data extraction, and classification.
“At Moveworks, we see Azure OpenAI Service as an important component of our machine learning architecture. It enables us to solve several novel use cases, such as identifying gaps in our customer’s internal knowledge bases and automatically drafting new knowledge articles based on those gaps. This saves IT and HR teams a significant amount of time and improves employee self-service. Azure OpenAI Service will also radically enhance our existing enterprise search capabilities and supercharge our analytics and data visualization offerings. Given that so much of the modern enterprise relies on language to get work done, the possibilities are endless—and we look forward to continued collaboration and partnership with Azure OpenAI Service."—Vaibhav Nivargi, Chief Technology Officer and Founder at Moveworks.
“Al Jazeera Digital is constantly exploring new ways to use technology to support our journalism and better serve our audience. Azure OpenAI Service has the potential to enhance our content production in several ways, including summarization and translation, selection of topics, AI tagging, content extraction, and style guide rule application. We are excited to see this service go to general availability so it can help us further contextualize our reporting by conveying the opinion and the other opinion.”—Jason McCartney, Vice President of Engineering at Al Jazeera.
“KPMG is using Azure OpenAI Service to help companies realize significant efficiencies in their Tax ESG (Environmental, Social, and Governance) initiatives. Companies are moving to make their total tax contributions publicly available. With much of these tax payments buried in IT systems outside of finance, massive data volumes, and incomplete data attributes, Azure OpenAI Service finds the data relationships to predict tax payments and tax type—making it much easier to validate accuracy and categorize payments by country and tax type.”—Brett Weaver, Partner, Tax ESG Leader at KPMG.
The general availability of Azure OpenAI Service is not only an important milestone for our customers but also for Azure.
Azure OpenAI Service provides businesses and developers with high-performance AI models at production scale with industry-leading uptime. This is the same production service that Microsoft uses to power its own products, including GitHub Copilot, an AI pair programmer that helps developers write better code; Power BI, which leverages GPT-3-powered natural language to automatically generate formulae and expressions; and the recently announced Microsoft Designer, which helps creators build stunning content with natural language prompts.
All of this innovation shares a common thread: Azure’s purpose-built, AI-optimized infrastructure.
Azure is also the core computing power behind OpenAI API’s family of models for research advancement and developer production.
Azure is currently the only global public cloud that offers AI supercomputers with massive scale-up and scale-out capabilities. With a unique architecture design that combines leading GPU and networking solutions, Azure delivers best-in-class performance and scale for the most compute-intensive AI training and inference workloads. It’s the reason the world’s leading AI companies—including OpenAI, Meta, Hugging Face, and others—continue to choose Azure to advance their AI innovation. Azure currently ranks in the top 15 of the TOP500 supercomputers worldwide and is the highest-ranked global cloud services provider today. Azure continues to be the cloud and compute power that propels large-scale AI advancements across the globe.
As an industry leader, we recognize that any innovation in AI must be done responsibly. This becomes even more important with powerful, new technologies like generative models. We have taken an iterative approach to large models, working closely with our partner OpenAI and our customers to carefully assess use cases, learn, and address potential risks. Additionally, we’ve implemented our own guardrails for Azure OpenAI Service that align with our Responsible AI principles. As part of our Limited Access Framework, developers are required to apply for access, describing their intended use case or application before they are given access to the service. Content filters uniquely designed to catch abusive, hateful, and offensive content constantly monitor the input provided to the service as well as the generated content. In the event of a confirmed policy violation, we may ask the developer to take immediate action to prevent further abuse.
We are confident in the quality of the AI models we are using and offering customers today, and we strongly believe they will empower businesses and people to innovate in entirely new and exciting ways.
The pace of innovation in the AI community is moving at lightning speed. We’re tremendously excited to be at the forefront of these advancements with our customers, and look forward to helping more people benefit from them in 2023 and beyond.
Generally available: Azure Ultra Disk Storage in Switzerland North and Korea South
Azure Ultra Disk Storage is now available in one zone in Switzerland North and with Regional VMs in Korea South. Azure Ultra Disk Storage offers high throughput, high IOPS, and consistent low latency disk storage for Azure Virtual Machines (VMs). Ultra Disk Storage is well-suited for data-intensive workloads such as SAP HANA, top-tier databases, and transaction-heavy workloads.
GitHub Actions: OpenID Connect token now supports more claims for configuring granular cloud access
OpenID Connect (OIDC) support in GitHub Actions enables secure cloud deployments using short-lived tokens that are automatically rotated for each deployment.
Each OIDC token includes standard claims such as the audience, issuer, and subject, as well as many custom claims that uniquely identify the workflow job that generated the token. These claims can be used to define fine-grained trust policies that control access to specific cloud roles and resources.
These changes enable developers to define more advanced access policies using OpenID Connect and perform more secure cloud deployments at scale with GitHub Actions.
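To see what such a token carries, the payload of a JWT can be decoded without verification (signature verification is the cloud provider's job when it evaluates the trust policy). The sketch below decodes the middle, base64url-encoded segment of a token; the claims shown are hypothetical examples of the kind GitHub issues:

```python
import base64
import json

# Sketch: inspecting the claims of an OIDC token (a JWT).
# This decodes only; it does NOT verify the signature.
def decode_jwt_payload(jwt: str) -> dict:
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Hypothetical claims, modeled on GitHub's issuer/subject/audience format.
claims = {"iss": "https://token.actions.githubusercontent.com",
          "sub": "repo:octo-org/octo-repo:ref:refs/heads/main",
          "aud": "api://AzureADTokenExchange"}
fake_jwt = ".".join([
    base64.urlsafe_b64encode(json.dumps({"alg": "none"}).encode()).decode().rstrip("="),
    base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("="),
    "signature"])
print(decode_jwt_payload(fake_jwt)["sub"])
```

A trust policy in the cloud provider would then match claims like `sub` against a pattern (for example, a specific repository and branch) before issuing a short-lived cloud credential.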
Azure Data Explorer (ADX) now supports managed ingestion from Azure Cosmos DB.
This feature enables near real-time analytics on Azure Cosmos DB data in a managed setting (an ADX data connection). Because ADX supports Power BI DirectQuery, it also enables near real-time Power BI reporting. The latency between Cosmos DB and ADX can be sub-second (using streaming ingestion).
This brings the best of both worlds: fast, low-latency transactional workloads with Azure Cosmos DB, and fast, ad hoc analytics with Azure Data Explorer.
Only Azure Cosmos DB for NoSQL accounts are supported.
Delta Lake is open-source software that extends Parquet data files with a file-based transaction log for ACID transactions and scalable metadata handling. The Stream Analytics no-code editor now provides the easiest way, via a drag-and-drop experience, to capture your Event Hubs data into ADLS Gen2 in Delta Lake format without writing any code. A predefined canvas template is available to further speed up capturing your data in this format.
To access this capability, go to your event hub in the Azure portal, then select Features -> Process data or Capture.
With the "Featured Clothing" insight, you can identify key items of clothing worn by individuals in videos. This enables high-quality in-video contextual advertising by matching relevant clothing ads to the specific time in the video at which the items appear. "Featured Clothing" is now available through the Azure Video Indexer advanced preset, and with this new insight information you can enable more targeted ad placement.
The new Azure Cosmos DB connector V2 for Power BI now allows you to import data into your dashboards using the DirectQuery mode, in addition to the previously available Import mode.
The DirectQuery mode in the V2 connector is helpful in scenarios where the data volume in an Azure Cosmos DB container is too large to be imported into the Power BI cache via Import mode, and in scenarios where real-time reporting with the latest ingested data is a requirement. DirectQuery can also reduce data movement between Azure Cosmos DB and Power BI, since filtering and aggregations are pushed down to Azure Cosmos DB, and it includes performance optimizations related to query pushdown and data serialization.
We are certifying the IT Service Management Connector (ITSMC) on the Tokyo version of ServiceNow.
Azure services like Log Analytics and Azure Monitor provide tools to detect, analyze, and troubleshoot issues with your Azure and non-Azure resources, and enable work-item integration with IT Service Management (ITSM) products. The ITSM connector provides a bi-directional connection between Azure and ITSM tools to help you track and resolve issues faster.
Azure Database for PostgreSQL – Flexible Server encrypts data at rest using service-managed encryption keys. Data, including backups, is encrypted on disk; this encryption is always on and can't be disabled. The encryption uses a FIPS 140-2 validated cryptographic module and an AES 256-bit cipher for Azure Storage encryption. Currently, this feature is available in the Switzerland North, Switzerland West, Canada East, Canada Central, Southeast Asia, East Asia, and Brazil South regions.
Infrastructure encryption with customer-managed keys (CMK) adds a second layer of protection by encrypting the service-managed keys with customer-managed keys. It also uses a FIPS 140-2 validated cryptographic module, but with a different encryption algorithm, providing an additional layer of protection for your data at rest. The customer-managed key used to encrypt the service-managed key is stored in Azure Key Vault, which provides additional security, high availability, and disaster recovery features.
Azure Dedicated Host gives you more control over the hosts you have deployed by giving you the option to restart any host. During a restart, the host and its associated VMs restart while staying on the same underlying physical hardware.
With this new capability, now generally available, you can take troubleshooting steps at the host level. This feature is currently available only in the Azure public cloud; we will soon be launching it in sovereign clouds as well.
Today, customers and partners manage hundreds or even thousands of active databases. For each of these databases, it is essential to be able to create an accurate map of the active configurations, whether for inventory or reporting purposes. Centralizing this database inventory in Azure using Azure Arc gives you a unified view of all your databases in one place, regardless of the infrastructure they run on – in Azure, in your datacenter, at edge sites, or even in other clouds.