Tuesday, May 21, 2019

O365 Spam Remover Script - now with a GUI and supports MFA

Problem: A spam campaign has hit your company and you want to remove the email from all inboxes in the tenant to help prevent people from clicking bad links, freaking out, etc.

Solution: If you have fewer than 50k mailboxes, use the Office 365 Compliance Center's Search and Purge feature. If not, you can use its discovery tools to generate a search, but then you can't see the progress and it'll be a bit slow. So, if the campaign is less than 10 days old, here's a script that collects as many Exchange Admin creds as you can supply (now supporting MFA and non-MFA), tries to load a GUI for you (falling back to interactive command-line prompts), and uses multiple PowerShell windows to run the necessary mailbox searches while you watch the progress. As with any script you get from the internet, no warranty is expressed or implied, so test it and tweak it to your environment. I have tried to make it use UTC and avoid hard-coding any regional settings, but your mileage may vary.
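The core of the approach boils down to a message trace followed by a per-mailbox search-and-delete. Here is a minimal sketch, assuming you are already connected to Exchange Online PowerShell - the sender address and date range are placeholders, and the full script at the link below adds the MFA handling, credential pooling, and parallel windows:

```powershell
# Sketch only - assumes an existing Exchange Online PowerShell session.
# "evil@spammer.example" and the date range are placeholder values.
$start = (Get-Date).AddDays(-2).ToUniversalTime()
$end   = (Get-Date).ToUniversalTime()

# Message trace: find every mailbox the evil sender reached
# (message trace data only goes back about 10 days, hence the caveat above)
$recipients = Get-MessageTrace -SenderAddress "evil@spammer.example" `
    -StartDate $start -EndDate $end |
    Select-Object -ExpandProperty RecipientAddress -Unique

# Search each affected mailbox and hard-delete the matching messages
foreach ($mbx in $recipients) {
    Search-Mailbox -Identity $mbx `
        -SearchQuery "from:evil@spammer.example" `
        -DeleteContent -Force
}
```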

Update on May 22, 2019 - I have added support to attempt to auto-load the Exchange Online PowerShell module and prefer it over basic authentication.


Update 6/17/2019 - Moved the code to GitHub for easier updating. DO NOT WORRY - my GitHub does not look like some giant mess of folders with cryptic things...the PowerShell files are right there on the screen and you can click any of them to view them in their entirety. Here is the link to the script: https://github.com/hornerit/powershell/blob/master/O365-SPAM-REMOVER-GUI-Public.ps1

Monday, April 15, 2019

Office 365 Spam Remover - Now supports MFA

Problem: A spam campaign has hit your tenant and affected more mailboxes than can be processed by the Search and Purge option in Exchange Online.

Resolution: Adjust this script to replace CONTOSO with your domain (if you don't, it will prompt you). The script will prompt you for your Exchange Admin credentials, offer you the chance to add more Exchange admin accounts to run it under, and then ask for the evil sender(s), the date and time the spam campaign hit, and optionally the subject line(s) of the evil email messages so you don't accidentally remove too many messages. It runs a message trace of all email sent to your tenant by the evil senders during the time frame specified, then searches those mailboxes to find the message(s) and removes them. It uses multiple PowerShell windows to perform this work so that you can watch it in real time and see quicker progress.
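The multi-window trick is roughly this: split the recipient list into chunks and launch a separate PowerShell process per chunk. A hedged sketch of the idea (Search-Chunk.ps1 and the file paths are hypothetical stand-ins for the worker logic in the real script):

```powershell
# Illustration only - fan a recipient list out to several PowerShell
# windows so the mailbox searches run in parallel and each is visible.
# C:\Temp\recipients.txt and Search-Chunk.ps1 are hypothetical names.
$recipients  = Get-Content "C:\Temp\recipients.txt"  # one address per line
$windowCount = 4
$chunkSize   = [math]::Ceiling($recipients.Count / $windowCount)
for ($i = 0; $i -lt $recipients.Count; $i += $chunkSize) {
    $last  = [math]::Min($i + $chunkSize - 1, $recipients.Count - 1)
    $chunk = $recipients[$i..$last] -join ","
    # Each window stays open (-NoExit) so you can watch its progress
    Start-Process powershell.exe -ArgumentList `
        "-NoExit -File C:\Scripts\Search-Chunk.ps1 -Recipients `"$chunk`""
}
```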

Last updated May 21, 2019 to improve several sections based on feedback and optimization. Another version of this script has been posted that has a GUI for all of the initial input, using the Windows Presentation Framework built into Windows (so no special installs needed), at https://www.hornerit.com/2019/05/o365-spam-remover-script-now-with-gui.html. As with any script you get from the internet, no warranty is expressed or implied, so test it and tweak it to your environment. I have tried to make it use UTC and avoid hard-coding any regional settings, but your mileage may vary.


Update 2019-06-17 - I have moved my scripts to a GitHub repository so that updates are easier to make. DO NOT WORRY - I do not make my GitHub look freaking weird with folders and cryptic things that non-developers don't understand...my scripts are right there in the main folder and you can click them to view/copy/download: https://github.com/hornerit/powershell/blob/master/O365-SPAM-REMOVER-NoGUI-Public.ps1

Thursday, April 11, 2019

Getting around the 50k limit for Azure / O365 Groups in Azure AD Sync

Problem: A giant group in your on-premises Active Directory does not sync through Azure Active Directory Sync

Source: Azure / O365 has a limit in Azure AD Sync (AAD Sync) such that it ignores groups with more than 50k members - I even found that groups really close to this limit acted weird.

Resolution: Script a solution that takes your on-prem group and creates mirror groups, each with a maximum number of users, so that all the miniature groups will sync. The script auto-adds/removes users from the mirror groups, and you can use those groups in Azure.
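A rough sketch of the mirroring idea, assuming the RSAT ActiveDirectory module and example group names (the published script below adds the ongoing add/remove sync and error handling):

```powershell
# Sketch only - "BigGroup" and the mirror naming scheme are examples.
Import-Module ActiveDirectory

# Stay safely under the ~50k AAD Sync limit, since groups near it act weird
$maxPerGroup = 45000

# Read members from the group object directly; Get-ADGroupMember can
# choke on very large groups due to AD web service result limits
$members = (Get-ADGroup -Identity "BigGroup" -Properties Member).Member

$groupNum = 0
for ($i = 0; $i -lt $members.Count; $i += $maxPerGroup) {
    $groupNum++
    $mirror = "BigGroup-Mirror$groupNum"
    if (-not (Get-ADGroup -Filter "Name -eq '$mirror'")) {
        New-ADGroup -Name $mirror -GroupScope Universal -GroupCategory Security
    }
    $last = [math]::Min($i + $maxPerGroup - 1, $members.Count - 1)
    Add-ADGroupMember -Identity $mirror -Members $members[$i..$last]
}
```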

Update 2019-06-17 - I have moved this script to a github repository to make updates easier. DO NOT WORRY - it is not some crazy-looking developer page...just a list of scripts from this site. Here is the url to my repository: https://github.com/hornerit/powershell/blob/master/ActiveDirectory-SplitAndSyncGroups-Public.ps1

Azure CloudShell - Store Scripts Centrally for your Team

Problem: Cannot find or run scripts from Azure Cloud Shell that are centrally managed

Source: Azure Cloud Shell runs on a secure Linux VM with PowerShell Core, and that shell has no access to any local resources (aka your file server). You also can't really connect to a git repo made by one of those fancy developers in your area - though you can clone one every time you open your Cloud Shell if that helps, and it may be good for their source control...I'm a sysadmin; I don't really get the git mantra except for backups. Also, many of your IT workers may attempt to access secure resources on insecure devices (aka their random personal tablet, since you called them on a Saturday and they don't want to VPN in using a laptop).

Resolution:

Assuming you have an AD group or Azure/O365 group that has your team members in it like we did:

  1. Log into portal.azure.com with permissions to create stuff in your subscription
  2. Create a Resource Group for your team in the same location as your subscription
  3. Grant Reader permissions for the team to the Resource Group (use the IAM menu option to grant this)
  4. Create a storage account with the cheapest settings possible (see below)
    1. Set the Resource Group to the one you created for your team in step 2
    2. Give it a nice, short, simple name for the Storage Account Name
    3. Set the location to the same location as your resource group (NOTE: I had problems with East US 2; if your subscription is in East US 2, choose US EAST for this)
    4. Set Performance to Standard
    5. Leave Account kind as StorageV2 (general purpose v2)
    6. Set Replication to Locally Redundant Storage (LRS) if possible, it's the cheapest
    7. Set Access Tier to Cool
    8. Use the Review and Create button and create the resource...this will take a minute
  5. Once the storage account is created, navigate to the Storage Account, and grant "Reader and Data Access" for your team
    • If you receive errors about an unauthorized header, close your browser and re-login to Azure portal
  6. Inside the storage account, click Files and create a file share, call it something like "share", and don't set a quota - every new user that accesses Cloud Shell and creates a profile will commit 5 GB to this share, so there's really no point in trying to limit it unless you are scared of the hackers
  7. Click the "share" share so that it opens and click "Upload" and upload scripts and other resources here for your team
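If you'd rather script the portal steps above, something like this with the Az PowerShell module should get you close - the resource group name, storage account name, location, and group object id are all example values, so double-check the SKU and role names against your tenant:

```powershell
# Sketch of steps 2-6 using the Az module (run Connect-AzAccount first).
# "MyTeamRG", "myteamscripts", and the object id are placeholder values.
New-AzResourceGroup -Name "MyTeamRG" -Location "eastus"

# Cheapest practical settings: StorageV2, LRS replication, Cool tier
New-AzStorageAccount -ResourceGroupName "MyTeamRG" -Name "myteamscripts" `
    -Location "eastus" -SkuName Standard_LRS -Kind StorageV2 -AccessTier Cool

# Grant the team group "Reader and Data Access" on the storage account
$scope = (Get-AzStorageAccount -ResourceGroupName "MyTeamRG" `
    -Name "myteamscripts").Id
New-AzRoleAssignment -ObjectId "<team-group-object-id>" `
    -RoleDefinitionName "Reader and Data Access" -Scope $scope

# Create the file share the team will mount in Cloud Shell
New-AzRmStorageShare -ResourceGroupName "MyTeamRG" `
    -StorageAccountName "myteamscripts" -Name "share"
```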
To access these resources, have each member of your team open a new Cloud Shell - hopefully one they've never used before. If they have, they will need to run a command to mount this new storage instead of their original storage (clouddrive mount -s SUBSCRIPTIONGUID -g "YOURRESOURCEGROUPNAME" -n "YOURSTORAGEACCOUNTNAME" -f "YOURSHARENAME")...beware, they may want to back up their stuff before they switch storage like that:

  1. Click the Cloud Shell button immediately to the right of the search bar in Azure Portal
  2. Click PowerShell as the language (you can change this later if you like)
  3. Click Advanced Settings on the right
  4. Make sure the settings all match what you have made (you may have to manually type the Share name)
  5. Click Connect/Yes/whatever allows it to go
Now that it is connected (you can verify with the Get-CloudDrive command, which gives you useful information), you can access those shared resources by typing this:
cd $HOME/clouddrive
Now your PowerShell window is located in the main directory of the share...you can use dir to list the contents and even use tab completion to load your script name and it'll work. One downside: a script run from Cloud Shell cannot access local resources (e.g. local Active Directory or locally installed modules), although you can map the file share itself as a network drive on your own machine. The best way to handle this situation is to create a Hybrid Runbook Worker so that you can create Runbooks (aka scripts), then use the scripts stored in your shared location to call the runbooks and specify that they run on the Hybrid Runbook Worker.
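Putting it together, a typical session inside Cloud Shell looks something like this (MyTeamScript.ps1 is a placeholder for whatever you uploaded in step 7 above):

```powershell
# Confirm the right storage account and share are mounted
Get-CloudDrive

# Jump into the share and run a shared script
cd $HOME/clouddrive
dir                   # list the team's uploaded scripts
./MyTeamScript.ps1    # placeholder name for an uploaded script
```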