Wednesday was officially day 2 of the Ignite conference, but since I participated in the pre-event sessions, it was my third day on site. The first session of the day was Advanced Capabilities and Innovation of Azure Storage Solutions, which covered the improvements and advancements across Azure storage. Here are the key insights from this session:
- In line with the general theme of AI and the benefits the Microsoft platform offers, they announced the opening of a new data center region in Fairwater that is dedicated to OpenAI. Two of the key benefits of this location are that customers can get over an exabyte of Blob storage and I/O throughput of 1 TB per second. They also now offer Azure Managed Lustre, a pay-as-you-go file system for high-performance and AI workloads. It offers 25 PiB in a single namespace and automatic import from Blob storage.
- NVIDIA DGX on Azure Storage is a fully managed AI training platform offered as a service. It provides accelerated computing and ready-to-go dedicated clusters for customers. With this platform, customers get a high-performance parallel file system with consistent throughput, low latency, and seamless integration with orchestration tools. Basically, AI training as a service.
- A new tool, the Azure Service Discovery Dashboard, gives customers and administrators a view of their storage across all environments. It shows usage and performance not only for an overall region but down to individual types like blob and container. It can show up to ten accounts and what is accessed most often, down to the individual file.
- Azure now offers storage at scale for cloud-native apps across blob, block, and file. Elastic SAN, coming later this year, will be the block storage for scale that makes multi-tenancy simple. Blob Smart Tiering, also available later this year, will provide automatic cost savings by moving infrequently accessed data to lower-cost storage tiers. This will eliminate the need for customers to maintain their own tiering scripts (for context, a sketch of the do-it-yourself lifecycle policy it replaces follows this list). There are also new enhancements for Azure NetApp Files, including flexible service levels, short-term clones for test and development, object-level REST APIs, and single-file restores from backup.
- Ultra Disk is also now available in Azure Storage. This storage reduces latency by 30% and is used for financial and other high-transaction workloads. It also reduces outlier latency by 80%. Beyond raw performance, this storage offers instant access to snapshots and improves restore times from hours to minutes. Customers can now be confident they can quickly perform upgrade rollbacks, scale out, and test safely.
- If you run products like VMware on-premises, there have been storage improvements for Azure VMware Solution to improve the performance and functionality for customers looking to do a lift-and-shift move to Azure. Microsoft announced that Azure Elastic SAN and Azure NetApp Files will be available as storage for this environment.
- When it comes to NAS solutions running on-premises, moving them has proven complicated. However, two solutions are now available in Public Preview to simplify these migrations: one for Pure Storage and one for Dell OneFS. This new functionality will ease migrations and remove the reliance on on-premises domain controllers by moving authentication to Entra.
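No code was shown for Smart Tiering itself, but for context, this is roughly the do-it-yourself approach it is meant to replace: a Blob lifecycle management policy that demotes stale data on a schedule. A minimal sketch in Python, with an illustrative rule name, prefix, and day thresholds:

```python
# Build the kind of Blob lifecycle management policy that customers script or
# hand-maintain today, and that Blob Smart Tiering is meant to automate.
# The rule name, prefix, and day thresholds below are illustrative.
import json

policy = {
    "rules": [
        {
            "enabled": True,
            "name": "move-cold-data",
            "type": "Lifecycle",
            "definition": {
                "filters": {"blobTypes": ["blockBlob"], "prefixMatch": ["logs/"]},
                "actions": {
                    "baseBlob": {
                        # Move blobs untouched for 30 days to the Cool tier...
                        "tierToCool": {"daysAfterModificationGreaterThan": 30},
                        # ...and to Archive after 180 days.
                        "tierToArchive": {"daysAfterModificationGreaterThan": 180},
                    }
                },
            },
        }
    ]
}

with open("policy.json", "w") as f:
    json.dump(policy, f, indent=2)
```

The saved file can then be applied with `az storage account management-policy create --account-name <account> --resource-group <rg> --policy @policy.json`; the pitch for Smart Tiering is that this kind of babysitting goes away entirely.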
With each of these new features and functions, there were a number of demos and deeper discussions of the products. If any of this is relevant to your current or future Azure environments, I highly recommend looking deeper into these offerings.
After the session on Azure Storage, I participated in the lab session Make More Informed Decisions with Geospatial Intelligence. Working for a manufacturing company, I found the description of the lab interesting. Basically, it takes Earth observation data, the kind of imagery you would see in Google Earth, and combines it with AI to produce impactful analytics in real time. What can be done with this information? That depends on the industry, but for manufacturing it can help drive decisions around supply chains or product deliveries. For example, if a huge snowstorm hitting the Midwest may impact trucking routes, the information gathered can suggest alternate routes for trucks to ensure deliveries continue and avoid delays. For governments, the information can be used during disaster response by ingesting data from drones and satellites to better assess overall damage. It was interesting to see how all of this data could be collected and used for better decision making and planning.
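The lab used Microsoft's tooling, but the underlying routing idea is easy to sketch: represent the storm footprint as a polygon and test each delivery route against it. A toy example using the shapely library, with entirely made-up coordinates and routes:

```python
# Toy illustration of the routing idea from the geospatial lab: given a storm
# footprint (e.g., derived from satellite or weather data) as a polygon, flag
# any delivery route that crosses it. Coordinates and routes are made up.
from shapely.geometry import LineString, Polygon

# Hypothetical storm footprint over the Midwest (lon, lat pairs).
storm = Polygon([(-93.5, 41.0), (-87.0, 41.0), (-87.0, 44.5), (-93.5, 44.5)])

# Hypothetical trucking routes as simple polylines between depots.
routes = {
    "chicago-to-minneapolis": LineString([(-87.6, 41.9), (-93.3, 44.98)]),
    "stlouis-to-kansascity": LineString([(-90.2, 38.6), (-94.6, 39.1)]),
}

for name, route in routes.items():
    if route.intersects(storm):
        print(f"{name}: crosses the storm footprint, reroute or delay")
    else:
        print(f"{name}: clear")
```

In the real lab the footprint and routes come from live Earth observation feeds rather than hard-coded shapes, but the core decision logic is this intersection test.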
The first session of the afternoon was Securing Data Across Microsoft Environments with Microsoft Purview. As organizations look to migrate more data to Azure and implement AI like Copilot, securing sensitive data is a top priority. Securing data has been an issue for some time: Microsoft acknowledged that customers have complained for years about inconsistent or missing classification, governance, and security for their data. They feel Purview has now become the best overall integrated platform for securing data. With 90% of the world's data having been created in the last two years, Purview has been built as the single, unified control plane for data.
As part of their initiative to secure data, Public Preview was announced for Data Security and Secure AI. Here are the benefits of each:
- Data Security
  - Unified Data Security Posture Management
  - OneDrive for Business security reports and simulations
  - M365 scanning and auto-labeling for both SharePoint and OneDrive for Business
- Secure AI
  - Data risk reports and data security
  - Inline DLP for Copilot, Chat, and Agents
  - DLP for Windows Recall
These new features not only report on possible oversharing or insecure data, but also offer a remediation plan, including step-by-step instructions. These security features can block products like Copilot from accessing sensitive data in queries. However, Copilot is not the only access concern: Microsoft has found that over 40% of stored data is also accessed via Fabric, and Purview works to secure that access as well. Purview also integrates to monitor, report on, and secure data stored and accessed in OneLake and Power BI. With the Power BI integration, users are notified when they try to access sensitive data, and red dots in the interface show them which data was considered sensitive.
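None of Purview's internals were shown, but the inline DLP pattern itself, inspecting a prompt for sensitive content before it reaches the assistant and blocking on a match, can be sketched in a few lines. The regexes and block action below are crude illustrations, not Purview's actual classifiers or policies:

```python
# Toy sketch of the inline-DLP pattern described for Copilot: inspect a prompt
# before it reaches the assistant and block it if sensitive data is detected.
# These regexes are crude illustrations, not Microsoft Purview's classifiers.
import re

SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return the sensitive-info types found in the prompt."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(prompt)]

def submit_to_assistant(prompt: str) -> str:
    hits = check_prompt(prompt)
    if hits:
        # A DLP policy match can block the prompt or strip the sensitive content.
        return f"Blocked by DLP policy: detected {', '.join(hits)}"
    return "Prompt forwarded to the assistant"  # placeholder for the real call

print(submit_to_assistant("Summarize this quarter's sales numbers"))
print(submit_to_assistant("My SSN is 123-45-6789, can you file this form?"))
```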
Two new AI security roles/features have also been added to Purview: the Data Security Posture Agent and Data Security Investigations. Here is how they work to better secure data:
- Data Security Posture Agent
  - Summarizes all data
  - Makes recommendations on how to better secure it
  - Understands data context
- Data Security Investigations
  - Can assist in security investigations like data exfiltration, determining which sensitive files or data were not only accessed but actually exfiltrated
Overall, Microsoft seems to have been listening to customer concerns, not only about securing sensitive data overall, but also about how to better classify and report on it. However, much of this is either new, in Public Preview, or only recently Generally Available, so it will be interesting to see whether Purview can deliver long term on what has been promised.
The last session of the day was titled No Passwords, No Phishing, No Problem and presented a more secure way of accessing systems and data. While many organizations today have implemented MFA, there is a plethora of approaches and products for this type of security, many of which still rely in part on a password and possibly an SMS text message. However, if you can totally eliminate the password from the equation, you also eliminate credential phishing from the threat landscape. The future of both access and Zero Trust authentication is through features like Entra Conditional Access policies and modern, passwordless authentication: biometrics integrated into Windows Hello, one-time codes via the Microsoft Authenticator application, or even passkeys.
Besides these options, there was another, hardware-based option presented by Yubico. In case you were not aware, Yubico is the manufacturer of YubiKeys: hardware authenticators that you carry with you. Rather than having to receive an SMS text message or look up that six-digit (or longer) code in your authenticator application, you simply plug in your YubiKey when prompted. Over the years, Yubico has made a lot of advancements in supporting the enterprise environment. If you eliminate the password from the user, phishing attacks are no longer a concern, since there is no password to be scraped via a fake login page, better securing your environment.
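To make the phishing-resistance point concrete: passkeys and FIDO2 keys sign the server's challenge bound to the origin the browser reports, so a credential registered for the real site cannot produce a valid login for a look-alike domain. A stripped-down sketch of that idea (real WebAuthn adds attestation, counters, and more), with a made-up domain:

```python
# Stripped-down sketch of why FIDO2/passkey logins resist phishing: the
# authenticator signs the server's challenge bound to the origin the browser
# reports, so a signature produced for a look-alike domain never verifies.
import secrets
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Registration: the authenticator (e.g., a YubiKey) creates a key pair for
# this site; the server stores only the public key.
authenticator_key = Ed25519PrivateKey.generate()
server_public_key = authenticator_key.public_key()

def sign_login(origin: str, challenge: bytes) -> bytes:
    """The authenticator signs the challenge bound to the reported origin."""
    return authenticator_key.sign(origin.encode() + challenge)

def server_verify(signature: bytes, challenge: bytes) -> bool:
    """The server only accepts signatures bound to its own origin."""
    try:
        server_public_key.verify(signature, b"https://login.contoso.com" + challenge)
        return True
    except InvalidSignature:
        return False

challenge = secrets.token_bytes(32)

# Legitimate login: the browser reports the real origin. Prints True.
print(server_verify(sign_login("https://login.contoso.com", challenge), challenge))

# Phishing attempt: a fake page can relay the challenge, but the signature is
# bound to the fake origin, so verification fails. Prints False.
print(server_verify(sign_login("https://contoso-support.net", challenge), challenge))
```

There is nothing for a fake login page to capture and replay: no password, and a signature that is only valid for the genuine origin.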
Overall, day three was very informative, and most of my sessions were around how to better secure and monitor storage and data. There is a lot of continued innovation and improvement in the Azure environment that can benefit companies from the smallest mom-and-pop shop to massive enterprise environments.