Lumagate NA: Our corporate blog
While you’re here, check out Simon, our AI-powered bot built on the Microsoft Bot Framework and designed for IT operations. Learn more at http://lumagatena.com/simon.
At this point, admins at most companies I work with are familiar enough with Azure Automation. Often, I find they just don’t have a great use case in mind. In general, there are a few ground rules for getting started:
- Start small. Look for a quick win. Preferably, a project you can knock out on a Friday afternoon.
- Start in Test/Dev. The quick win needs to be a win…not a black eye for the IT department because you took something down in production.
- Pick something with an ROI story. Management loves to share ROI stories with corporate leadership. Get a win you can evangelize to your IT management org to fuel desire for additional automation work.
With that out of the way, here are four use cases where you can get started quickly (and safely) with Azure Automation.
Start and stop VMs on a schedule
We’re talking about Azure here, so this is the most obvious application of Azure Automation, and it can save a lot of money! You can run VMs only during working hours, shutting them down and deallocating them at the end of the day, or you can start VMs on demand: when a certain job starts, the VM boots, processes the data, then shuts down when it’s no longer needed. This is great for end-of-period reporting when there’s a lot of data to process, and for developer environments. There are lots of ready-to-go runbooks that can filter VMs based on tags, allowing you to schedule some, but not all, VMs to shut down. This can be useful for applications that need many servers at certain times of the day but fewer at off-peak times – provided you cannot set up the application in VM Scale Sets, of course.
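In Azure Automation, the tag filtering would typically live in a PowerShell runbook using Get-AzVM and Stop-AzVM. As a minimal, language-neutral sketch of the logic, here is the filtering step in Python; the tag name (`AutoShutdown`) and VM inventory are hypothetical, not an Azure convention:

```python
# Decide which VMs are opted in to scheduled shutdown via a tag.
# The tag key/value and the inventory shape are illustrative assumptions.

def vms_to_shut_down(vms, tag_key="AutoShutdown", tag_value="true"):
    """Return the names of VMs whose tag opts them in to scheduled shutdown."""
    return [vm["name"] for vm in vms
            if vm.get("tags", {}).get(tag_key) == tag_value]

inventory = [
    {"name": "dev-web-01",  "tags": {"AutoShutdown": "true"}},
    {"name": "prod-sql-01", "tags": {"AutoShutdown": "false"}},
    {"name": "dev-build-01", "tags": {"AutoShutdown": "true"}},
]

print(vms_to_shut_down(inventory))  # ['dev-web-01', 'dev-build-01']
```

A real runbook would loop over the filtered names and deallocate each VM; the point is that opt-in via tags keeps production VMs safely out of scope by default.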
Replace SQL Agent Jobs
Moving to Azure SQL Database is a great way to reduce server management tasks and costs when migrating to Azure. Unfortunately, the service doesn’t include a SQL Agent, due to its nature as a managed platform. You can migrate your SQL Agent jobs to Azure Automation, which can connect to your SQL databases and easily run automated tasks. If a SQL Agent job needs to access data on a VM somewhere in your network, you can use Hybrid Runbook Workers to let Azure run the job locally, like a Task Scheduler job. The benefit of Hybrid Runbook Workers over Task Scheduler is that the job can still run when a server is offline – provided you install more than one hybrid worker!
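A migrated SQL Agent job is usually just a scheduled query. In Azure Automation this would be a PowerShell runbook (for example, Invoke-Sqlcmd against Azure SQL Database); as a runnable sketch of the shape of such a job, here is a hypothetical retention-cleanup task in Python, with sqlite3 standing in for the database connection. The table and column names are made up for illustration:

```python
import sqlite3

def purge_stale_rows(conn, days=30):
    """Delete audit rows older than the retention window; return rows removed."""
    cur = conn.execute(
        "DELETE FROM audit_log WHERE created_at < date('now', ?)",
        (f"-{days} days",),
    )
    conn.commit()
    return cur.rowcount

# Stand-in database with two old rows and one future-dated row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit_log (id INTEGER, created_at TEXT)")
conn.executemany("INSERT INTO audit_log VALUES (?, ?)",
                 [(1, "2000-01-01"), (2, "2000-06-15"), (3, "2999-01-01")])
deleted = purge_stale_rows(conn)
print(deleted)  # 2
```

The runbook equivalent would be scheduled in Azure Automation the same way the SQL Agent job was scheduled, with the connection string stored as an Automation credential or variable rather than hard-coded.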
Update Management
Did you know that Azure Automation has a native Update Management capability? You can manage your VM updates from a single pane of glass, without needing to invest in a complex configuration management solution. You can quickly onboard VMs, see which patches are missing, triage the most critical ones, and apply them – all without logging into the VM itself. You can schedule regular installation of updates, or apply specific updates just once. A rich reporting system shows you at a glance the current compliance across your servers, with a quick view into whether each installation was successful. The solution can even estimate how long a given patch will take to install, based on other agents that have installed the same update – tremendously helpful when setting expectations with your business!
Windows is, naturally, supported – using either WSUS or the public Microsoft Update service – and several flavors of Linux are supported as well.
Respond to an event alert
Azure Automation can start a runbook when a webhook is called. When an event occurs, the source makes an HTTP request to the webhook’s URL, including some data in the body, and Azure Automation can be configured to parse that data and make a change based on it. One example (returning to the first point) is starting a VM that an infrequent task depends on: when a user starts the process, a webhook call containing the VM name and the action (Start) is sent to Azure Automation, and the VM boots up within a few minutes. Another example, with more direct application for InfoSec and IT pros, is responding to an alert: if a critical service stops on a VM, Azure Automation can reboot the VM or restart the service; if CPU use gets too high, it can scale up the application one server at a time. This isn’t limited to Azure VMs; Hybrid Runbook Workers can run scripts inside AWS or on premises just as easily (though the capabilities are limited to what’s possible on the system you’re connecting to).
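The parsing step is simple: the runbook receives the POST body and pulls out the fields it needs. Azure Automation hands a PowerShell runbook the request via a WebhookData parameter; here is a hedged Python sketch of the same logic, where the payload fields (VMName, Action) are assumptions chosen by whoever designed the webhook, not a fixed Azure schema:

```python
import json

def handle_webhook(request_body: str) -> str:
    """Parse the webhook body and decide what action to take on which VM."""
    payload = json.loads(request_body)
    vm, action = payload["VMName"], payload["Action"]
    if action not in ("Start", "Stop"):
        raise ValueError(f"Unsupported action: {action}")
    # A real runbook would call Start-AzVM / Stop-AzVM here.
    return f"{action} requested for {vm}"

body = json.dumps({"VMName": "report-vm-01", "Action": "Start"})
print(handle_webhook(body))  # Start requested for report-vm-01
```

Validating the action against a small allow-list, as above, is worth the extra two lines: a webhook URL is effectively a credential, and you don’t want a leaked URL to be able to trigger arbitrary operations.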
That’s it for this installment. Have a “win” from your Azure Automation journey you’d like to share? Tell us in the comments below.
We live in a world where Troy Hunt announces a new breach almost every week. There are millions of leaked credentials on the internet. Your spam mailbox is full of Nigerian princes asking the same questions your mortgage application asks, and the media’s recommendations on how to secure your online life seem to keep changing. Here are five tips that can help ensure your private information remains private.
1. Use a Password Manager
Password managers like 1Password and KeePass are a great way to have a unique password for every site – which in turn means you don’t have to change your password everywhere when one site announces a breach. Not only do they save you from having to remember passwords, they let you forget your usernames too. Password managers make it easy to keep a unique, complex password for each website you sign in to, so that if a site’s password list is breached, you don’t need to worry. Reusing the same password everywhere is really common, simply because there are too many things to remember in your life.
There are a few password managers out there, so do your research and use one that’s well known. The better ones usually cost money (1Password), but there are free options that are pretty good (LastPass). Some people think it’s a bad idea to keep a notebook of usernames and passwords, but as long as you keep that notebook secure (in a safe place at home – no visits to the coffee shop!) you’re probably OK – unless someone is targeting you specifically.
2. Stay up to date
Updates can interrupt your work, which sucks, but they are really important for keeping your computer safe from hackers. Attackers move quickly: as soon as an update is released, they work to reverse-engineer it, find out what it fixes, and discover ways to use the bug to their advantage. Major updates, like upgrading from Windows 7 to Windows 10, are also really important. On the surface, Windows 10 may look like an update for appearances’ sake, but there are major architectural changes that go a long way toward making your computer more secure.
3. Use an ad blocker
By far the easiest way to install malware accidentally is by clicking on an ad related to something you’re looking for, or through a drive-by download. Installing an ad blocker is recommended by most security experts as a basic first line of defense. Advertising networks are massive and complex, which makes it hard to keep malicious ads out. Browsers such as Chrome offer ad blockers through their marketplace of browser extensions.
4. Be wary of unexpected communications – email, phone calls, texts
Be wary of emails from people you don’t frequently talk to, or links you’re not expecting. There’s a multitude of ways you can be targeted through your communications, from a phone call pretending to be your bank, to an email that looks like it’s from a company you do business with demanding you PAY THIS OVERDUE INVOICE that you don’t remember receiving. The common thread in handling all of these attacks is: trust, but verify. If your bank calls with a fraud alert and asks you to validate your information, politely tell them you’ll call them back – then call the phone number on the back of your card. If your CEO texts asking for an urgent wire transfer (assuming that’s a normal part of your day), call and verify, every time. If you receive an email with a PDF or Office document, preview it in Office Online instead of opening it in the full Office suite. There’s a lot of backwards compatibility built into Office, which makes its documents an excellent way into your computer.
5. Enable Windows Defender
Windows Defender is a great antivirus product included with Windows 10 for free. It’s quite capable of detecting both obvious and less-obvious malware. Like every other home antivirus product, it won’t be much use against a determined attacker, but it’s far better than nothing. Windows Defender is also quite reasonable about letting you use your computer while it works, which has historically been a problem with AV tools. Similar to the “stay up to date” recommendation above: it’s there, it works well, and there’s really no reason to mess with the defaults.
Have tips to share on how you secure personal data online and protect your privacy? Share in the comments below.
Recently, I was playing with the Investigation feature in Azure Security Center, which allows you to visualize the scope of a security event, triage it, and track down the root cause of potential security incidents. It reminded me a lot of Microsoft Advanced Threat Analytics (ATA), which we recommend for customers in their own datacenters. A companion feature is security playbooks: collections of procedures that can be executed from Security Center when a playbook is triggered by a selected alert. In effect, Microsoft is offering intelligent detection of potential security events with the capability to automate incident response.
- Additional reading. Notice the many hyperlinks in this article, which lead to additional reading on topics directly related to these features and to reproducing this scenario in your own lab.
- Cost. Bear in mind that this feature only comes with the Standard tier of Security Center, which enables advanced threat detection capabilities, including advanced analytics that leverage the Microsoft Intelligent Security Graph…the source of intelligence through analysis of signals via machine learning. Security playbooks leverage Azure Logic Apps, which carries a separate charge.
Exploring on your own
Executing Mimikatz on a test VM spawned a security alert in the Security Center portal.
You could also easily trigger similar events with AppLocker bypass.
When you drill into the Security alerts tile in the Detection section of the dashboard above, you can see a list of events.
Clicking on an event in the list shows the details of the event, including the description, severity, resource type, and details of the action. Notice the Investigate button at the bottom of the event details window.
Clicking the Investigate button launches the Investigation Dashboard. The investigation consists of a graph, which is always focused on a specific entity and presents the entities related to it. An entity could be a security alert, user, computer, or incident. In this case, the “specific entity” is the suspicious process…Mimikatz.exe.
If you select the Suspicious process executed entity, you’ll see guidance on how to proceed with the investigation. Most of the steps I found in various scenarios were fairly rudimentary.
If you click on the Playbooks tab at the right of the window, you’ll find the option to run playbooks based on this alert.
I did not find many (if any) existing sample playbooks, but there is a tutorial with an example of creating your own security playbook in response to a suspicious process execution. As the example points out, since more than 90% of malicious activities are observed once and never again, you may find your playbooks are primarily driving expedited notification to the appropriate channels in your org.
If you have not yet spent time with the advanced features of Security Center present only in the Standard tier, it is worth a look. The story grows more compelling every month.
Lumagate is excited to welcome Azure MVP and Azure specialist Nicholas (Nick) Romyn to its North American team! As a deep technical specialist, he has helped enterprise customers progress in their cloud journey with confidence, providing design and implementation guidance for high availability, security, and efficiency in delivering business-critical services. At Lumagate, Nick will work as a Senior Consultant for North America, reporting to Pete Zerger.
“I’m very excited to be working with Pete and Wes again,” Nick says about joining Lumagate. “I believe Lumagate has a unique, exciting combination of skill sets and reputation across Microsoft cloud offerings that enables a holistic approach for customers. Lumagate’s breadth of Microsoft MVPs is a testament to its vision and understanding of the big picture for customers.”
“He’s excited about driving strategic initiatives to enable and secure our customers’ enterprises, and to help build a strong Azure practice in North America. Nick’s skills mesh well with our focus here at Lumagate,” says Managing Partner Pete Zerger. At Lumagate, part of the recipe for success is sharing knowledge and helping your team grow, and Nick will help continue to grow the knowledge and experience within Lumagate.
When asked about the opportunities the cloud offers, Nick says, “I think there’s tremendous opportunity to increase our customers’ agility through the transition to the cloud, and Lumagate’s focus on providing a holistic information security solution is critical. Business economics are driving customers to the cloud; availability and security must be implemented holistically to enable this agility.” Leveraging the cloud itself for security and agility – from Microsoft, one of the largest cloud providers in the world – gives every company an economic opportunity to compete in its markets, Nick concludes.
We’re happy to have you on board, Nick. Welcome!
I am excited to report that the first in a series of courses I am developing for LinkedIn Learning (Lynda.com) is now available! The first installment, “Microsoft Cybersecurity Stack: Identity and Endpoint Protection Basics”, is the introductory module, written at an intermediate level. The course is approximately two hours in length, composed of “bite-sized” video installments, each 3–5 minutes long (to accommodate learning on a busy schedule), and easily viewed from any device…even your mobile phone!
Course Table of Contents
- Chapter 1: Azure Active Directory Premium Setup
- Chapter 2: Enabling Multi-Factor Authentication (MFA)
- Chapter 3: Setting Conditions for Secure Access
- Chapter 4: Managing Mobile Devices with Intune
- Chapter 5: Publishing Applications with Azure AD App Proxy
Get your free trial
If you are interested in ramping up on Microsoft Cybersecurity Stack from the ground up, I hope you will give it a try! You can sign up for a free trial of LinkedIn Learning at https://www.lynda.com/trial/PeteZerger
In today’s world, businesses are driving cloud service adoption. At Lumagate, we find that the adoption of cloud services impacts operational procedures, especially the types of tasks traditional IT roles have performed (perhaps a blog for another day!). One area of particular significance is the transition from manual execution of tasks to automation. Microsoft has built a number of toolsets to accommodate this need, including Flow, Azure Logic Apps, Azure Functions, and Azure Automation. Automated tools need something to execute against, however, and so Microsoft is working to surface its cloud service APIs through the Microsoft Graph.
As a Microsoft Partner with Gold competencies in Cloud Productivity and Enterprise Mobility (among others), we see many use cases for Graph. However, Graph is delivered via a REST API, which many IT pros are unfamiliar with. In addition, on our Professional Services side, we have a particular need to ensure our implementations are consistent, repeatable, and reliable. As such, PowerShell is our tool of choice during delivery. This blog series (which will be broken into a couple of parts) describes how to leverage the Graph to configure Microsoft Intune via PowerShell.
Challenges to Solve
There are a couple of PowerShell wrapper modules for Graph available via GitHub and other repositories, in addition to a set of PowerShell samples published by the Intune team. However, all of the samples I’ve come across have deficiencies in at least one of three areas:
- They require the Azure AD PowerShell module to be installed (as is the case with the Intune team’s PowerShell samples for Graph). The module tends to be required only for the auth process – acquiring the OAuth token used to talk to Graph – which makes this a heavy-handed (and arguably lazy) approach.
- They require the ADAL DLLs to be installed. These, too, are used only for the auth process, acquiring the OAuth token from Azure AD. Dropping DLLs into a customer’s environment is not always appreciated, and is something we like to avoid.
- They build their own OAuth connection, but require the admin to register an application with Azure AD and then store the application ID and, most egregiously, the client secret (as is the case with the PSMSGraph module). Storing a client secret is only appropriate for web applications, where it can be protected in the context of a service – not for a portable PowerShell module, where untrusted personnel may be using the application.
So how do we solve these challenges? The answer lies in the Azure AD v2.0 authentication endpoint.
Modern Authentication with Azure AD v2.0
Azure AD v2.0 provides a number of features, but most interesting is the ability to leverage the OAuth 2.0 authorization code flow to authorize applications. This allows us to launch a native application (in this case, PowerShell), after which the following steps occur:
- The native application makes a request to the authorization endpoint for a code, supplying a string of permissions (also known as scopes) in the URL request, and the resource (i.e. service / API) which it wants to access.
- The authorization endpoint prompts the user to sign-in. After a successful authentication, the user is presented with a list of the permissions supplied in the URL request, and asked to authorize the application for the resource.
- The user authorizes the application, at which point the authorization endpoint returns an authorization code which can be redeemed for an access token with those permissions. The authorization code expires in 10 minutes.
- The application presents the authorization code to the token endpoint. In the body of the request, it supplies the list of permission scopes (or a subset) that was used to obtain the authorization code, along with the resource to which it is requesting access.
- The token endpoint responds with an access token, valid for one hour. If the offline_access permission scope was requested, it also issues a refresh token, which can be valid for an extended period (often 14 days, depending on context). The refresh token can be used to redeem new access tokens every hour without requiring the user to restart the whole authentication process.
- The native client makes a call to the resource (in this case Graph), bearing (i.e. supplying) the access token in the header to prove it is authorized to make the request.
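The first and fourth steps above are just URL and form-body construction, which we can sketch without making any live calls. The endpoints below are the Azure AD v2.0 endpoints; the client_id, redirect_uri, and scope values are placeholders for illustration. The series will implement this in PowerShell, but the shape of the requests is the same in any language:

```python
from urllib.parse import urlencode

TENANT = "common"
AUTHORIZE = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/authorize"
TOKEN = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token"

def authorization_url(client_id, redirect_uri, scopes):
    """Step 1: the URL the native app opens to request an authorization code."""
    params = {
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
    }
    return f"{AUTHORIZE}?{urlencode(params)}"

def token_request_body(client_id, redirect_uri, code, scopes):
    """Step 4: the form body POSTed to the token endpoint to redeem the code.
    Note that no client_secret is included for a native (public) client."""
    return {
        "client_id": client_id,
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
    }

# Placeholder values; offline_access is requested to receive a refresh token.
scopes = ["https://graph.microsoft.com/.default", "offline_access"]
url = authorization_url("00000000-0000-0000-0000-000000000000",
                        "https://localhost", scopes)
print(url)
```

Note what is absent: `token_request_body` contains no client_secret field, which is exactly the property that makes this flow safe to ship in a portable module.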
A diagram from the Azure AD v2.0 Authorization Code flow documentation helps illustrate this concept.
In all the steps above, we never need to provide a client secret, allowing us to build a native application that can be used by anyone in any environment, provided they are authorized to perform the actions in that environment. This solves one of our challenges. In the next post, we will look at how to implement this concept in PowerShell without DLLs and without requiring the Azure AD module to be installed.