Festive Tech Calendar: Adopt AI like a PRO with Azure Essentials

Another year, and another fantastic Festive Tech Calendar. While it wasn’t the first event I participated in, I do think it is my longest-running annual event. I have been a fan since its inception and am delighted to see it continue. This year, the team are raising funds for Beatson Cancer Charity. Donations are appreciated via the Just Giving page.

Now, this post is all about Azure AI adoption via the new Azure Essentials offering. So, we will start off by explaining what that is and why it matters. Over the years, Microsoft has introduced new ways of doing things, new approaches or methods; sometimes these have been simple renames, and sometimes they have been a completely different vision of Azure. Either way, they can often be confusing. This post aims to help you understand Azure Essentials better, using the “tech of the moment”, Azure AI.

So – let’s get started. What exactly is Azure Essentials? As we’re working with AI, let’s set the scene using Copilot…

Copilot for M365 in Teams (please don’t @ me about the structure of that name, I cannot keep up with how to reference Copilot!) was helpful:

Copilot in Azure…not so much:

What you need to take away at this stage is that, rather than being an entirely new thing, Azure Essentials has consolidated existing good work so that consuming it is simpler and more refined. At a theory level this seems to make sense; however, we all know the implementation of these things can be very tricky.

With this in mind, how you use or approach Azure Essentials shifts a bit. The first point that struck me was that this is most useful for people new to Azure. That is not to say it is not useful for those with experience. But if you are new: we make a lot of assumptions that people will know about and make use of offerings like CAF and WAF, will reference the Architecture Center for design guidance, etc., when that is likely not the case.

Centralising core guidance as Azure Essentials is a great idea in my opinion. However, it hasn’t just centralised guidance. (Disclosure at this point: I work for a Microsoft Partner.) Essentials also includes recommendations for finding a partner, leveraging funding programs, which products are useful, and customer testimonials. This is nice for companies like mine as a marketing/contact channel, but I am not sure I would define it as “essential”.

What is essential, though, is how it frames guidance and, in my opinion, puts customers into the right frame of approach. The site is a touch confusing on this point. The new resource kit is right at the top; it’s the first link on the page. But scenario or use-case guidance is further down and brings you elsewhere. Sticking with our original idea regarding AI adoption, there is a use case listed, and it brings you to an Azure Architecture blog post from July – this is not what we want…

Whereas if we open the Resource Kit, then check its contents, we get a ‘common scenario’ with click-through links.

Now, before we dig in there, one item I noted when researching this was that some messaging implies, or potentially confuses, elements of this with changes to, or improvements upon, the Cloud Adoption Framework (CAF). In my opinion, Azure Essentials doesn’t change CAF; it’s not even listed on the What’s New page. However, it is an improvement to how people may be guided to CAF. And anything that brings more people to CAF and allows for efficient, better-governed deployments is a positive to me!

So, what exactly does Essentials recommend as the ideal level of detail required for AI adoption? Six steps and some learning material. I am delighted to see the inclusion of learning material; it’s becoming more and more important as the rate of change increases. Let’s have a look at the six steps:

  1. Assess your Azure AI readiness
  2. Explore Azure AI pricing
  3. Prepare your AI environment
  4. Design your AI workloads
  5. Develop Well-Architected AI workloads
  6. Deploy, Manage, and operate your AI workloads

At first glance this looks like a good set to me. I don’t think I would have ranked pricing as high in the sequence, but perhaps it’s important to get that out of the way early! 🙂

The first ask here is to take an assessment. The Azure AI readiness assessment focusses on core areas of adoption strategy within your business. It can be a lengthy process – it notes 45 minutes, but if you choose all of the areas available, it will give you 100+ pages of questions to complete to attain your score. Anyone familiar with Azure Well-Architected Reviews, or the old Governance Assessment, will see the immediate similarities here and understand the usefulness of having something that asks people to think about things in the correct way and offers a score to guide expectations.

Next, it’s pricing. Again, this is tricky for me. To be remotely accurate with pricing, I think you need to have some form of design to dictate resources, which then lead to a price. You are then happy, or shocked, and rework your design. Rinse and repeat to get to where you need to be. Unfortunately, the link in the resource kit lands you on the default pricing page for Azure, nothing AI-specific. So you really are starting at the bottom. Some more AI-specific guidance here would be a great inclusion for the next version. For example, the placement of this link could bring you to the AI menu item on that pricing page – a small but helpful direction.
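
To make the design-first point concrete, here is a minimal back-of-the-envelope sketch of the kind of estimate you end up doing for, say, an Azure OpenAI workload. Every rate and volume below is a hypothetical placeholder, not real Azure pricing – swap in the figures from the pricing page for your chosen model and region.

```python
# Rough Azure OpenAI cost estimate.
# All rates and volumes are hypothetical placeholders, not actual Azure pricing.
input_rate_per_1k = 0.0025   # placeholder: cost per 1,000 input tokens
output_rate_per_1k = 0.01    # placeholder: cost per 1,000 output tokens

avg_input_tokens = 1_500     # prompt plus grounding context per call
avg_output_tokens = 400      # typical completion length
calls_per_day = 2_000
days_per_month = 30

cost_per_call = (
    (avg_input_tokens / 1000) * input_rate_per_1k
    + (avg_output_tokens / 1000) * output_rate_per_1k
)
monthly_cost = cost_per_call * calls_per_day * days_per_month

print(f"Estimated cost per call: ${cost_per_call:.4f}")
print(f"Estimated monthly cost:  ${monthly_cost:,.2f}")
```

Change the design (model, context size, call volume) and that number moves significantly, which is exactly why pricing this early in the sequence feels odd to me.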

Next, we’re onto preparation. There is a good note on a Landing Zone, but as this is Azure Essentials, I would have expected it to link through to some guidance on Landing Zones. We then get two links to design architectures for Azure AI in the Architecture Center. This could be more confusing than helpful, and it’s not the preparation guidance I would expect. This is Azure Essentials, and here is the first AI architecture Visio you see…

My concern here is complexity. I know people may have more interest in using OpenAI models and the whole chat functionality, but I would have gone a different route here. Most likely document-based: something that uses one of the more mature services, like Document Intelligence, and a simpler architecture for guidance. Make it easier to see the objective rather than the mountain that is presented above. I don’t think there is actually a perfect set of links here; there are too many variables and too much information, depending on where the user’s perception of AI is. It will be very interesting to see how this progresses, and it may always require further expertise and information to be properly impactful.
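
To show the kind of simpler, document-based starting point I mean, here is a minimal sketch using the Document Intelligence (Form Recognizer) Python SDK with a prebuilt model. The endpoint, key and file name are placeholders, and this is a sketch of the approach rather than anything from the resource kit.

```python
# Minimal Document Intelligence example - endpoint, key, and file are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
key = "<your-key>"

client = DocumentAnalysisClient(endpoint, AzureKeyCredential(key))

# Analyse a local document with a prebuilt model - no training,
# no vector store, no orchestration layer required.
with open("sample-document.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-document", document=f)
result = poller.result()

print(f"Pages analysed: {len(result.pages)}")
for kv in result.key_value_pairs:
    if kv.key and kv.value:
        print(f"{kv.key.content}: {kv.value.content}")
```

One resource, one SDK call, and a result you can reason about – a much gentler first objective than the architecture above.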

Next, design, one of my favourite areas. No other aspect of Azure excites me like creating the solution design. With a vast platform, you start with everything and toil away until you have what works for what is needed. Here we get a note to choose from reference architectures – good point, but which ones? No links are provided, but having said that, there is no single link that works here; the reference architectures are spread out amongst the different products. Next, we get a great link to the AI architecture design overview page. I think I might have switched steps 3 and 4 here, actually. Doing this first, I believe, gives people a much better starting point to learn from and then understand step 3 more comprehensively. Bookmark this page for your AI adoption journey; it’s like a table of contents of what to read for which service/product.

The penultimate step guides us to well-architected workloads. The note is simply a note; the point is valid, but I think it should have included this link as the starting point for this step. It’s really useful and helps you quickly jump to where you need to be within the Well-Architected Framework (can anyone else just not call it WAF? Too confusing for me with Web Application Firewall). The included link, which focusses on Azure OpenAI, is good though. It has the expected pillar guidance for Well-Architected, and it has a comprehensive set of accurate click-through links. I think this step is important and placed correctly too, so it flows well at this point of the resource kit.

Finally, we have a deploy and manage step. This feels a little like the weakest of the six steps. First of all, the title is repeated as the first bullet point – not great.

Then it notes we should use best practice – again, with no guidance as to what that means, or how that works practically when it comes to deployment and management. Finally, it links to a guide page regarding responsible use of AI. Responsible use is incredibly important, and it is valid when operating AI workloads, but it is useless as the single link for this step. There is a literal AI management page on CAF already that could be used. I have waited until this step to link to that area of CAF, as it hasn’t been updated since the start of 2024, but it has a lot of detail this kit should include and, with an update, would make much more sense than some of the links included.

In conclusion, I think the kit needs some work, a revision so to speak. First, I would tweak the steps to be as follows:

  1. Assess your Azure AI readiness
  2. Develop Well-Architected AI workloads
  3. Design your AI workloads
  4. Prepare your AI environment
  5. Deploy, Manage, and operate your AI workloads
  6. Explore Azure AI pricing

Next, I would rely more heavily on CAF and the Architecture Center, with context for links, or link to overview pages with a note to use the links within – like a ‘further reading’ note or similar. I know it is meant to be Essentials, but let’s give essential guidance rather than the minimum, perhaps?

Finally, if you want to adopt AI like a Pro – I think Essentials is useful as a sanity check, but you are better off investing your time in the already existing material on Learn, CAF and WAF.

Microsoft Copilot in Azure – Networking Edition

Welcome all from Azure Back to School, another year and another excellent community event from the guys behind the scenes. And thanks to the event sponsors, Captain Hyperscaler and Trace3.

For this year, I have decided to combine my favourite tech – Azure Networking – with the buzziest tech of the moment – Copilot. Specifically of course, Microsoft Copilot in Azure.

For those not familiar with this, or with Copilot in any form, it is essentially an AI assistant. Microsoft are aiming to make these services as integrated as possible, so you see Copilot logos, chats, prompts, etc. built into portals and applications to help make engagement with the service as seamless as possible.

Screengrab of Copilot for Azure chat window

Copilot in Azure is exactly what it sounds like: an AI assistant focussed on Azure. It has mixed capabilities depending on what you are trying to do. It is currently in Public Preview, at no additional cost, so I would recommend making use of it for assessment purposes if it is of interest to you.

There is a base set of use cases, as below, so I want to explore how practical these are across some common networking services.

Let’s start with Virtual Network!

Design – I’ve actually already covered an attempt at this here – How to – Design a Virtual Network with Microsoft Azure Copilot

Operate – I tried some basic queries, and they worked quite well actually. It defaults to Resource Graph queries to convert your ask to something tangible.

What I like here, and where this service has improved since launch, is that the follow-up suggestions are now based on capabilities and aligned to previous asks, so I now get:

Choosing the subnets ask, it outputs a nice list for me via Resource Graph; however, I was expecting it to include the address ranges, not just the names. A follow-up ask displays them no problem.
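
For anyone curious, the sort of Resource Graph query it lands on for that follow-up looks roughly like the sketch below. This is my own approximation run via the azure-mgmt-resourcegraph SDK, not Copilot’s exact output, and the subscription ID is a placeholder.

```python
# Approximation of a Resource Graph query listing subnets with their address ranges.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest, QueryRequestOptions

query = """
Resources
| where type =~ 'microsoft.network/virtualnetworks'
| mv-expand subnet = properties.subnets
| project vnetName = name,
          subnetName = tostring(subnet.name),
          addressPrefix = tostring(subnet.properties.addressPrefix)
| order by vnetName asc
"""

client = ResourceGraphClient(DefaultAzureCredential())
result = client.resources(
    QueryRequest(
        subscriptions=["<subscription-id>"],
        query=query,
        options=QueryRequestOptions(result_format="objectArray"),
    )
)

for row in result.data:
    print(row["vnetName"], row["subnetName"], row["addressPrefix"])
```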

Optimise – This one is trickier. A limitation here is that I am working within my demo environments, which either have limited functionality or are configured exactly to best practice. Here is the set of questions I tried and the answers I got:

  • Are there any active recommendations for my virtual networks
    • There are no active alerts in resource group rg_network for resource type microsoft.network/virtualnetworks
  • Can you show me the metrics for my virtual networks?
    • Responded with a table of all possible metrics, but no details linked to my resources (see the sketch after this list)
  • are there any reliability improvements I could make to my virtual networks
    • Responded with a list of best-practice recommendations and reference links, again not related to my resources.
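
On the metrics question, what came back is essentially the list of metric definitions available for the resource type. A hedged sketch of pulling that same list yourself with the azure-mgmt-monitor SDK is below; the subscription ID and VNet name are placeholders (rg_network is my demo resource group from above).

```python
# List the metric definitions available for a virtual network.
# Subscription ID and VNet name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

subscription_id = "<subscription-id>"
vnet_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/rg_network"
    "/providers/Microsoft.Network/virtualNetworks/<vnet-name>"
)

monitor_client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

for definition in monitor_client.metric_definitions.list(vnet_id):
    print(definition.name.value, "-", definition.unit)
```

That list tells you what can be collected, not what your VNets are actually doing – which is essentially the gap in Copilot’s answer.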

I think one of the challenges here is the prompt and the possible output. There isn’t really enough information or intelligence to be able to respond. For example, when I phrased a question similar to “are there any improvements I could make to my virtual network address ranges”, it didn’t give anything specific to my virtual networks, just accurate best-practice advice.

Troubleshoot – I don’t have a specific issue to ask it about, so I looked for what might be useful, maybe something you don’t know about!

Neither is a great response, to be honest, as at least the second was a question I thought would allow for query generation. I couldn’t find a useful prompt for this use case, which is a shame, but my guess would be this improves over time, perhaps as it is better able to work with Azure Advisor.

Next, let’s take a look at a Public IP

Design – I know this won’t take information from my own resources, so this just helps with best-practice guidance. I went for a question that I think most people, even some who have worked with Azure for a while, aren’t sure about, and I was impressed with the response. Good examples, excellent awareness and detail, in my opinion.

Operate – For this use case, I tried some knowledge-gathering queries. I was most impressed with the below. Clever query creation, accurate result, clear (enough) presentation. Exactly what you need for at-scale work like this. Not sure why it added Location, but no harm done!

Optimise – It starts getting tricky here. I know there is little that can be done for, say, Cost or Performance, and there are so many contextual questions that could be phrased better with more context, like asking for ‘orphaned’ IPs instead of the below.
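
As a hedged example of what I mean by asking for ‘orphaned’ IPs directly, a Resource Graph query along these lines does the job. This is my own sketch rather than Copilot’s output; the subscription ID is a placeholder, and the ipConfiguration check is only a reasonable first pass at ‘orphaned’ (associations such as NAT Gateway may need extra checks).

```python
# Sketch: find public IPs with no attached IP configuration (likely orphaned).
from azure.identity import DefaultAzureCredential
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest, QueryRequestOptions

query = """
Resources
| where type =~ 'microsoft.network/publicipaddresses'
| where isnull(properties.ipConfiguration)
| project name, resourceGroup, location,
          sku = tostring(sku.name),
          allocation = tostring(properties.publicIPAllocationMethod)
"""

client = ResourceGraphClient(DefaultAzureCredential())
result = client.resources(
    QueryRequest(
        subscriptions=["<subscription-id>"],
        query=query,
        options=QueryRequestOptions(result_format="objectArray"),
    )
)

for ip in result.data:
    print(ip["name"], ip["resourceGroup"], ip["location"], ip["sku"], ip["allocation"])
```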

I tried a security configuration check and recommendation prompt, but it somehow lost its way and prompted me to choose a Storage Account. I did, and it gave accurate recommendations for that. It’s confusing how that happened, but the output is what I wanted, so kinda half correct?

Troubleshoot – Basic but effective, C minus.

I think I started to crack the best prompt methods at this point in the research for this article. I quite like this format and output, but I am aware it requires advanced knowledge of the resource and its options in advance of prompting. It also got the associated resource part wrong; that’s an orphaned IP I have been working with.

Finally, let’s look at Network Security Group

Design – This is difficult in one way. You can build an NSG with nearly no configuration, just a name and location, and that ticks a Defender for Cloud box if you attach it to a subnet etc. But generally there is more configuration, so I thought, how could this help me? Well, what if I give it my needs and see if it can give the right logical output…

Nice! Now, can it help me build it?

Colour me impressed – this was my best interaction to date. Clear, accurate and usable.
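
For comparison with what Copilot produced, here is a minimal sketch of building that sort of NSG with the azure-mgmt-network SDK. The rule set reflects my stated needs (HTTPS in from a known range, an explicit inbound deny from the internet) rather than Copilot’s exact output, and the names, region and address ranges are placeholders.

```python
# Sketch: create an NSG with an HTTPS allow rule and an explicit inbound deny.
# Names, region, ranges and subscription ID are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import NetworkSecurityGroup, SecurityRule

subscription_id = "<subscription-id>"
network_client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

nsg = NetworkSecurityGroup(
    location="northeurope",
    security_rules=[
        SecurityRule(
            name="Allow-HTTPS-From-Corp",
            priority=100,
            direction="Inbound",
            access="Allow",
            protocol="Tcp",
            source_address_prefix="203.0.113.0/24",  # placeholder corporate range
            source_port_range="*",
            destination_address_prefix="*",
            destination_port_range="443",
        ),
        SecurityRule(
            name="Deny-All-Inbound-Internet",
            priority=4000,
            direction="Inbound",
            access="Deny",
            protocol="*",
            source_address_prefix="Internet",
            source_port_range="*",
            destination_address_prefix="*",
            destination_port_range="*",
        ),
    ],
)

result = network_client.network_security_groups.begin_create_or_update(
    "rg_network", "nsg-app-frontend", nsg
).result()
print(f"Created NSG: {result.name}")
```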

Operate / Optimise / Troubleshoot – A triple whammy, as I started to drift across the lot at this point in terms of use case. I wanted to try queries that would help me work with NSGs both day to day and in a potential high-stress situation. So I started with this:

So it took my context and decided that a rule with access set to allow and a direction of inbound would be insecure – fair enough, or at least worth checking in on. It comes back with the correct answer! So, I switched up a few rules on my NSGs to allow all inbound traffic from any source etc. The Portal flags this as a dumb decision; let’s see if Copilot spots it.

Nope – odd result there. So I tried it a different way. Again, this means I have to know more advanced detail, but nothing you wouldn’t pick up quickly as you upskill.

Output correct! They are the three rules I switched up. It didn’t directly get my port element right, but that just needs a more accurate prompt. I think one logical approach for actual operational queries is to think in pseudocode, and in steps, allowing it to work out your meaning quicker. Essentially, avoid prompts like ‘any rules giving off bad vibes in my NSGs? Respond in iambic pentameter’ – they don’t work and, let’s be honest, are weird.
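
If you want to skip the prompt engineering entirely, a Resource Graph query roughly equivalent to what Copilot generated here (my own reconstruction, not its exact output; subscription ID is a placeholder) looks like this:

```python
# Sketch: find inbound Allow rules open to any/Internet sources.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest, QueryRequestOptions

query = """
Resources
| where type =~ 'microsoft.network/networksecuritygroups'
| mv-expand rule = properties.securityRules
| where rule.properties.direction == 'Inbound'
    and rule.properties.access == 'Allow'
    and tostring(rule.properties.sourceAddressPrefix) in ('*', 'Internet', '0.0.0.0/0')
| project nsgName = name, resourceGroup,
          ruleName = tostring(rule.name),
          destinationPorts = tostring(rule.properties.destinationPortRange)
"""

client = ResourceGraphClient(DefaultAzureCredential())
result = client.resources(
    QueryRequest(
        subscriptions=["<subscription-id>"],
        query=query,
        options=QueryRequestOptions(result_format="objectArray"),
    )
)

for rule in result.data:
    print(rule["nsgName"], rule["ruleName"], rule["destinationPorts"])
```

Because the ports are projected explicitly, the port element comes back without any prompt tuning.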

To wrap up – I like Copilot in Azure now. I have found multiple use cases that would actually help me work day-to-day. However, would that work be quicker? I am not sure. I feel like I would need to build up a prompt library, and if I was doing that, why would I not just use Resource Graph queries instead? Quicker, more accurate, etc. Also, the level of knowledge required doesn’t allow it to be most useful to the people I think it should be useful to – Azure newbies. Design and advice, sure; actual hands-on resource work appears to require more contextual knowledge.

Some helpful links to hit up for Copilot in Azure:

Overview

Responsible AI FAQ

Example prompts

Manage access – this will become more important depending on your use cases, the cost when it hits GA, etc.

As always, get in touch if you have any questions, or even if you have prompts you want to chat about! And don’t worry, I reverted those terrible rule changes right after testing 🙂

Don’t forget to check out all of the other content throughout the month over on Azure Back to School!

wedoAI 2024

Something a little different…

Head over to https://wedoai.ie to check out a new online event that just launched on August 22nd.

The idea of this event is to promote learning and sharing of knowledge within the Microsoft AI community. To achieve this, we have community-driven articles that highlight best practices, lessons learned, and help with some of the more difficult topics of Microsoft AI.

For anyone familiar with Azure Spring Clean – you will see some similarities!

How to – Design a Virtual Network with Microsoft Azure Copilot

Having access to Microsoft Azure Copilot has been really interesting. On one hand, the use cases are almost limitless – essentially a choice of what you want to try to do with it. On the other, there is still work to be done to maximise its potential (acknowledged by Microsoft throughout use, in fairness).

Working with any of the ‘Copilots’, one important element for me is to get a grounded understanding of what it is capable of, based on something I am an expert on. I cannot tell how good it is if I am asking it for help with something I don’t arguably know better than it does. So – I decided to push it with a Virtual Network.

My objective when starting this post was to hopefully reach the point where one single, detailed prompt would spit out an acceptable VNET design statement, perhaps even the code to build it, though that part was less important to me right now. Anyone can create a good Azure design, right? 🙂
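
As a reference point for the ‘code to build it’ part, the kind of output I would eventually want looks something like the sketch below, using the azure-mgmt-network SDK. The subscription, resource group, names, address space and subnet layout are hypothetical stand-ins for whatever the design statement lands on.

```python
# Sketch: build a small VNET from a design statement.
# Subscription ID, resource group, names and address ranges are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import VirtualNetwork, AddressSpace, Subnet

subscription_id = "<subscription-id>"
network_client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

vnet = VirtualNetwork(
    location="northeurope",
    address_space=AddressSpace(address_prefixes=["10.10.0.0/22"]),
    subnets=[
        Subnet(name="snet-app", address_prefix="10.10.0.0/24"),
        Subnet(name="snet-data", address_prefix="10.10.1.0/24"),
        Subnet(name="AzureBastionSubnet", address_prefix="10.10.2.0/26"),
    ],
)

result = network_client.virtual_networks.begin_create_or_update(
    "<resource-group>", "vnet-workload-prod", vnet
).result()

print(result.name, [s.name for s in result.subnets])
```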

I am first going to lay out my thinking with respect to a VNET: its purpose, my security posture, connectivity requirements, and likely workloads. I will then reword this into a statement that is aligned to the Cloud Adoption Framework and Azure network architecture guidance.

To get a baseline of a basic prompt, I started with the below. I believe this helps work towards the ‘best’ prompt.

So this jumps all over the place. We have perimeter references, AVS and App Gateway all mentioned. Not ideal. But I did ask for an example, and it does provide links. So let’s tighten our prompt.

This is much better – proper sequential statements – however, that third link to hybrid with Citrix is irrelevant. Now, as Copilot functions in a chat format, let’s use this prompt and response to expand the detail.

So this approach doesn’t work. When you select the (perhaps) relevant items, the output is not aligned to the original ask.

So – let’s try this another way. We know the first recommended prompt returned good results. Rather than continue in a chat-response format, let’s try one very specific prompt. To ensure no confusion, I started a new chat for this.

This is better, but to be honest – I am not looking for design principles like ‘zero trust’. So we need to adjust the wording. Again, I have started a new chat for this.

Now we are getting somewhere. If this had included Bastion I would have ranked it 9/10. The first link is good, the second is not, so this scores a 7/10 for me. It is a great improvement on previous asks, and I am trying to ask as few leading questions as possible. I tried another follow-up response to get some more detail.

Again, the general detail is good, but the links are hit and miss. This could introduce some confusion. I tried another follow-on from this, but again it went a different route based on my existing subscription services.

Rather than say this didn’t work, I think I set out with a task that isn’t really achievable at present. There are so many elements that require consideration – some sequential, some overlapping, some interdependent – that a single chat response is going to be very difficult, if not impossible. At the same time, repeated responses are also challenging, especially when you’re not looking for something relevant to what you currently have, but something aligned to best practice.

Overall, I think Copilot for Azure is improving every month, and the use cases are constantly expanding. However, I don’t believe, based on current functionality, that it will be able to fully assist with design guidance and decisions, beyond providing principles and guided links. For the real design work – you will still need an expert 😉

Exploring: Microsoft Copilot for Azure

Recently, I was lucky enough to gain access to Microsoft Copilot for Azure as part of a limited preview. For anyone who missed the announcement at Ignite, here is what Microsoft describe it as:

Microsoft Copilot for Azure (preview) is an AI-powered tool to help you do more with Azure. With Microsoft Copilot for Azure (preview), you can gain new insights, discover more benefits of the cloud, and orchestrate across both cloud and edge. Copilot leverages Large Language Models (LLMs), the Azure control plane, and insights about your Azure environment to help you work more efficiently.

So – what does that mean in practice? For me, this means reading the docs, then getting stuck into actually trying elements of this out. To be transparent, I had low expectations for this service. I am not 100% sure whether it is aimed at me, or someone with less Azure experience. I was also conscious that this is the limited preview I am working with, so there will be some oddities.

First up, the integration into the Portal UX – I like it. It’s simple and consistent. As it is a tenant-level service, it stays in place as you jump around the Portal, from a Subscription to a Resource to Entra ID, for example.

Next, what can I use this for that is quicker than me doing this myself? I will be honest, I struggled a bit here. This is for two reasons. One, this is enabled in my MVP tenant, so I have very little Production or day-to-day work to be done. Two, I was looking for something interesting rather than ‘tell me how to build a VM’.

So, I started with a question I know the answer to, but anyone who follows #AzNet knows we are all dying for progress on…

Imagine my surprise with how confident that response is! OH MY GOD I FOUND A THING. Well no, it doesn’t work. And I have no idea what it means in Step 3. If you find out – please let me, Aidan and Karl know, thanks 🙂 But I do like that it attempts to back up its answer with links to documentation.

As you make requests, it dynamically updates the text to tell you what it is ‘thinking’ which I really like.

And that ability to write queries is a real winner for me. It saves a lot of time, though you need to be quite specific with the ask and the detail, but that’s no real surprise at this stage.

I do like its ability to take quite a non-specific question and offer a decent and useful output in response.

However, I am finding myself trying to find things for it to do. This is OK during preview, where there is no additional cost; however, it’s not clear what pricing will actually be just yet, and vague language on the landing site makes me think this will be charged for.

Overall, I think it’s a welcome addition to the AI assistant space from Microsoft. I think those of us working with Azure would feel quite left behind otherwise. But I do think that as the platform is so vast and each environment is unique, the core use case will vary for different people, and that could significantly impact whether this is used widely or not. Having said that, I am looking forward to seeing how this progresses, and more people having access can only mean improvements.