tag:blogger.com,1999:blog-45048603124809862432024-03-12T22:48:15.536-04:00Out of the Box SolutionsHelping regular people accomplish great things without programming using
SharePoint, Office, and PowerShell.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.comBlogger54125tag:blogger.com,1999:blog-4504860312480986243.post-80211776838041764312022-02-24T13:13:00.002-05:002022-03-04T15:13:29.693-05:00Bulk DNS Management in PowerShell<p>So your environment got bigger fast, you have a TON of forward and reverse lookup zones, and something is out of whack. Well, I have a tool I've made in PowerShell and used successfully to 1) find DNS A and PTR records related to specific hostnames or IPs, 2) update those records, like adjusting TTL or bulk-renaming to point to a new host, and 3) back up the existing records to a CSV file, just in case.</p><p>This is probably one of the scariest tools I've had to build and comes with ZERO warranty - because, seriously, this is manipulating DNS records in bulk - but if you just want to check what A or PTR records exist for a single IP, this might help you. You run the tool as an admin on a Domain Controller; it IS interactive, and it will tell you that it may take several minutes to retrieve all A and PTR records and mesh them together. After that, it presents a menu to work with. I'm always open to ideas for tweaks - at some point I want an option to delete all orphaned objects or force-recreate PTR records for all A records that are missing them...but that gets weird when you have load balancers and web apps where a ton of stuff should or should not point to one IP. Have fun!</p><p>Requires the DNSServer module from RSAT and to be run locally as an admin on the DC. 
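</p><p>For a taste of what the tool does under the hood, here is a hedged sketch (not the tool itself) of finding the A and PTR records for a single IP - the zone names and IP address are placeholders:</p><pre class="brush: ps"># Assumes the DnsServer module from RSAT, run as admin on a DC
$ip = "192.168.1.50"
# A records whose data matches the IP
Get-DnsServerResourceRecord -ZoneName "contoso.com" -RRType A |
    Where-Object { $_.RecordData.IPv4Address.IPAddressToString -eq $ip }
# PTR records in the matching reverse zone (HostName is the last octet)
Get-DnsServerResourceRecord -ZoneName "1.168.192.in-addr.arpa" -RRType Ptr |
    Where-Object { $_.HostName -eq $ip.Split(".")[3] }
</pre><p>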
Here's the link to the script:</p><p><a href="https://github.com/hornerit/powershell/blob/master/Get-DNSInfo.ps1">hornerit/powershell (github.com)</a></p>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-23369321631813327592020-10-20T17:09:00.000-04:002020-10-20T17:09:42.975-04:00Office 365 Spam Remover - Now supports MFA (Updated to support Content Searches)<b>Problem</b>: A spam campaign has hit your tenant and affected mailboxes and the Campaign options in O365 are either unavailable or don't satisfy the executives yelling at you while you read this.<br />
<br />
<b>Resolution</b>: Adjust this script to replace CONTOSO with your domain (if not, it will prompt you). This will prompt you for your Exchange Admin credentials, <strike>offer you the chance to add more exchange admin accounts to run this under</strike>, prompt for the evil sender(s), the date and time the spam campaign hit, and optionally the subject line(s) of the evil email messages so you don't accidentally remove too many messages. The script uses a message trace of all email sent to your tenant by the evil senders during the specified time frame and NOW CREATES CONTENT SEARCH(ES) to find the message(s) in those mailboxes! From there, it will create the Purge action necessary to delete the message in question.<br />
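For reference, the message-trace step at the heart of the script looks something like this (a hedged sketch, not the script itself; the sender address and dates are placeholders):<br />
<br />
<pre class="brush: ps"># Requires an Exchange Online PowerShell session
# Find everyone who received mail from the evil sender during the campaign window
$trace = Get-MessageTrace -SenderAddress "evil@example.com" -StartDate "2020-10-19 08:00" -EndDate "2020-10-19 12:00" -PageSize 5000
$trace | Select-Object -ExpandProperty RecipientAddress -Unique
</pre>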
<br />
Last Updated May 21, 2019 to improve several sections based on feedback and optimization. Another version of this script has been posted that has a GUI for all of the initial input using the Windows Presentation Foundation built into Windows (so no special installs needed) at <a href="https://www.hornerit.com/2019/05/o365-spam-remover-script-now-with-gui.html">https://www.hornerit.com/2019/05/o365-spam-remover-script-now-with-gui.html</a>. As with any script you get from the internet, no warranty is expressed or implied for this script, so test it and tweak it for your environment. I have tried to make it use UTC and avoid hard-coding any regional settings, but your mileage may vary.<br />
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQaUYnlcIcF96taYpL3TJ3O7tgNw84xRT0469GIbH0hw_XS0PqWLrfyghprZY5XBLiVo8n-BGcdgo0dKlSn-Q1108ENHPeaphuxDzh09Zo0PE_SmSOSo8jWNa-pbsOyYWXm6TH71lIT5A/s1600/imgSPAM.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="534" data-original-width="900" height="189" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQaUYnlcIcF96taYpL3TJ3O7tgNw84xRT0469GIbH0hw_XS0PqWLrfyghprZY5XBLiVo8n-BGcdgo0dKlSn-Q1108ENHPeaphuxDzh09Zo0PE_SmSOSo8jWNa-pbsOyYWXm6TH71lIT5A/s320/imgSPAM.jpg" width="320" /></a></div><br />
Update 2019-06-17 - I have moved my scripts to a github repository so that updates are easier to make. DO NOT WORRY - I do not make my github look freaking weird with folders and cryptic things that non-developers don't understand...my scripts are right there in the main folder and you can click them to view/copy/download: <a href="https://github.com/hornerit/powershell/blob/master/O365-SPAM-REMOVER-NoGUI-Public.ps1">https://github.com/hornerit/powershell/blob/master/O365-SPAM-REMOVER-NoGUI-Public.ps1</a>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-12885405028355238012020-10-20T17:02:00.002-04:002020-10-20T17:02:46.022-04:00O365 Spam Remover Script - now with a GUI and supports MFA (updated to use Content Search)<span style="font-family: "verdana" , sans-serif;">Problem: A spam campaign has hit your company and you want to remove the email from all inboxes in the tenant to help prevent people clicking bad links, freaking out, etc.</span><br />
<span style="font-family: "verdana" , sans-serif;"><br />
</span> <span style="font-family: "verdana" , sans-serif;">Solution: I've created a script as an update to the original script for this post. The newer ExchangeOnlineManagement PowerShell module appeared and the Search-Mailbox cmdlet has been deprecated...so the new version creates a Content Search and also creates the appropriate purge actions to delete all of the email. This script will try to load a GUI for you with several options to control the senders, subject lines, and time frames the spam campaign was sent, to make it much simpler on you to remove that phishing or spam campaign that made it through. If the GUI fails, it will fall back to an interactive command-line script requesting the same info. As with any script you get from the internet, no warranty is expressed or implied for this script, so test it and tweak it for your environment. I have tried to make it use UTC and avoid hard-coding any regional settings, but your mileage may vary.</span><br />
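<span style="font-family: "verdana" , sans-serif;">Under the hood, the content search and purge boil down to something like this (a hedged sketch; the search name and query are placeholders, and the script automates all of it for you):</span><br />
<br />
<pre class="brush: ps"># Requires a Security and Compliance PowerShell session (Connect-IPPSSession)
New-ComplianceSearch -Name "SpamRemoval-Example" -ExchangeLocation All -ContentMatchQuery 'from:evil@example.com AND received>=2020-10-19'
Start-ComplianceSearch -Identity "SpamRemoval-Example"
# Once the search shows as completed, queue the purge (SoftDelete is recoverable by users)
New-ComplianceSearchAction -SearchName "SpamRemoval-Example" -Purge -PurgeType SoftDelete
</pre>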
<span style="font-family: "verdana" , sans-serif;"><br />
</span><br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQaUYnlcIcF96taYpL3TJ3O7tgNw84xRT0469GIbH0hw_XS0PqWLrfyghprZY5XBLiVo8n-BGcdgo0dKlSn-Q1108ENHPeaphuxDzh09Zo0PE_SmSOSo8jWNa-pbsOyYWXm6TH71lIT5A/s1600/imgSPAM.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="534" data-original-width="900" height="189" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQaUYnlcIcF96taYpL3TJ3O7tgNw84xRT0469GIbH0hw_XS0PqWLrfyghprZY5XBLiVo8n-BGcdgo0dKlSn-Q1108ENHPeaphuxDzh09Zo0PE_SmSOSo8jWNa-pbsOyYWXm6TH71lIT5A/s320/imgSPAM.jpg" width="320" /></a></div><a href="https://github.com/hornerit/powershell/blob/master/O365-SPAM-REMOVER-GUI-Public.ps1">https://github.com/hornerit/powershell/blob/master/O365-SPAM-REMOVER-GUI-Public.ps1</a><br />
<br />
<span style="font-family: "verdana" , sans-serif;">Update 10/20/2020 - The code has been overhauled and updated for content searches!</span>
<span style="font-family: "verdana" , sans-serif;">Update 6/17/2019 - Moved the code to GitHub for easier updating. DO NOT WORRY - my github does not look like some giant mess of folders with cryptic things...the powershell files are right there on the screen and you can click any of them to view them in their entirety.</span><br />
<span style="font-family: "verdana" , sans-serif;"><br />
</span> <span style="font-family: "verdana" , sans-serif;">Update on May 22, 2019 - I have added some support to attempt to auto-load the Exchange Online for PowerShell module and prioritize it over basic authentication.</span>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com2tag:blogger.com,1999:blog-4504860312480986243.post-48691782921077184482020-02-06T11:14:00.001-05:002020-03-05T16:41:26.455-05:00PowerShell Scripts - Get all Mailbox and Mailbox Folder permissions in O365 (New Exchange PowerShell)<i>Script updated several times between 2020-02-10 and 2020-03-05 to tweak different aspects when using the new Exchange Online PowerShell cmdlets, which are currently in preview but are generally much more efficient for this task; the updates also include code readability and documentation improvements. I received confirmation that getting mailbox folder permissions will return something more than a display name and that we will eventually be able to use the FolderId instead of the folder path, since folder paths have problems with backslashes and other foreign characters.</i><br />
<br />
<b>Problem</b>:<br />
You need to document, monitor, and manage mailbox and mailbox folder permissions across an entire O365 tenant. We can't even pipe Get-Mailbox to Get-MailboxFolderStatistics to Get-MailboxFolderPermission, and attempting any of this in PowerShell has traditionally been extremely slow.<br />
<br />
<b>Source</b>:<br />
This gets weird when it comes to mailbox permissions (FullAccess, SendAs, SendOnBehalf), Calendar Permissions, and Mailbox Folder Permissions (tons of options) and HR wants you to verify that an employee does not have access to that mailbox even if they are rehired later. The old Exchange PowerShell commands transmitted too much data so the results of pulling large numbers of mailboxes and folders were too slow to be feasible. One could search the Unified Audit Log for folder permissions changes but that doesn't give the baseline.<br />
<br />
<b>Resolution</b>:<br />
You can download my interactive script that does the retrieval, partial retrieval, and resuming here: <a href="https://github.com/hornerit/powershell/blob/master/Get-O365MailboxPermissionsAcrossTenant.ps1">https://github.com/hornerit/powershell/blob/master/Get-O365MailboxPermissionsAcrossTenant.ps1</a>. I have updated several times and maintain updates on it for tweaking special scenarios and will update once the new cmdlets offer different data.<br />
<br />
No matter how you try, you will need to make a local copy of the mailbox/folder permissions somewhere because attempting to query this information is too time-intensive to be useful. With the advent of the new Exchange Online V2 powershell cmdlets, the performance of getting the mailbox, mailbox permissions, and folder permissions is not nearly as horrible as it was before.<br />
Make sure you have an Exchange Admin account and have installed the new Exchange module (Install-Module ExchangeOnlineManagement) - you can connect (with MFA support) using Connect-ExchangeOnline, and all the old Exchange Online commands still work. From there, the more straightforward version for mailbox permissions for 5 mailboxes is something like this:<br />
<br />
<pre class="brush: ps">Get-EXOMailbox -ResultSize 5 |
Get-EXOMailboxPermissions |
Where-Object {
$_.IsInherited -eq $false -and
$_.Deny -eq $false
} |
Select-Object Identity,User,@{Label="AccessRights";Expression={$_.AccessRights -join ","}} |
Export-CSV -Path "C:\someFolder\MailboxPermissions.csv" -Append
</pre><br />
Or for the Mailbox Folder permissions (for 5 mailboxes):<br />
<br />
<pre class="brush: ps">Get-EXOMailbox -ResultSize 5 |
Get-EXOMailboxFolderStatistics |
Where-Object {
$_.SearchFolder -eq $false -and
@("Root","Calendar","Inbox","User Created") -contains $_.FolderType -and
(@("IPF.Note","IPF.Appointment",$null) -contains $_.ContainerClass -or $_.Name -eq "Top of Information Store")
} |
Select-Object @{Label="Identity";Expression={
if($_.Name -eq "Top of Information Store"){
$_.Identity.Substring(0,$_.Identity.IndexOf("\"))
} else {
$_.Identity.Substring(0,$_.Identity.IndexOf("\"))+':'+$_.Identity.Substring($_.Identity.IndexOf("\")).Replace([char]63743,"/")
}
}} |
Get-EXOMailboxFolderPermissions |
Where-Object { $_.AccessRights -ne "None" } |
Select-Object Identity,FolderPath,User,@{Label="AccessRights";Expression={$_.AccessRights -join ","}} |
Export-CSV -Path "C:\someFolder\MailboxFolderPermissions.csv" -Append
</pre><br />
Or combine them effectively (for 5 mailboxes):<br />
<br />
<pre class="brush: ps">Get-EXOMailbox -ResultSize 5 |
Tee-Object -Variable "myMailboxes" |
Get-EXOMailboxPermissions |
Where-Object {
$_.IsInherited -eq $false -and
$_.Deny -eq $false
} |
Select-Object Identity,User,@{Label="AccessRights";Expression={$_.AccessRights -join ","}} |
Export-CSV -Path "C:\someFolder\MailboxPermissions.csv" -Append
$myMailboxes |
Get-EXOMailboxFolderStatistics |
Where-Object {
$_.SearchFolder -eq $false -and
@("Root","Calendar","Inbox","User Created") -contains $_.FolderType -and
(@("IPF.Note","IPF.Appointment",$null) -contains $_.ContainerClass -or $_.Name -eq "Top of Information Store")
} |
Select-Object @{Label="Identity";Expression={
if($_.Name -eq "Top of Information Store"){
$_.Identity.Substring(0,$_.Identity.IndexOf("\"))
} else {
$_.Identity.Substring(0,$_.Identity.IndexOf("\"))+':'+$_.Identity.Substring($_.Identity.IndexOf("\")).Replace([char]63743,"/")
}
}} |
Get-EXOMailboxFolderPermissions |
Where-Object { $_.AccessRights -ne "None" } |
Select-Object Identity,FolderPath,User,@{Label="AccessRights";Expression={$_.AccessRights -join ","}} |
Export-CSV -Path "C:\someFolder\MailboxFolderPermissions.csv" -Append
</pre><br />
If you are a PowerShell person, you may notice 3 oddities with my combining:<ol><li>I did not use a foreach-object</li>
<li>I used a Tee-Object</li>
<li>There are a ton of filters and some weird Select-Object stuff going on in the middle with MailboxFolderStatistics</li>
</ol><br />
ForEach-Object breaks the new Exchange Online cmdlets' multithreading (speed), so I can't say Get-EXOMailbox | ForEach-Object { Get-EXOMailboxPermission; Get-EXOMailboxFolderPermission }... It was actually faster to run each command separately. To keep from having to retrieve the mailboxes twice, Tee-Object allows you to take the output from Get-EXOMailbox, store it in a variable, and use it a second time once your first full command is over - and it does NOT break the PowerShell pipeline, so speed stays happy!<br />
<br />
The reason for all the weirdness with Get-EXOMailboxFolderStatistics is that when someone tries Get-MailboxFolderPermission but only supplies the email address of the mailbox, it only retrieves the folder permissions for the folder "Top of Information Store". So one might think - I will just use the Get-EXOMailboxFolderStatistics to get the list of all the folders within the mailbox and send THAT over to the Get-EXOMailboxFolderPermission...well, it fails miserably because the Identity of the folder from Statistics looks like mailbox@domain.com\NameOfFolder but the MailboxFolderPermission cmdlet is expecting mailbox@domain.com:\NameOfFolder\NameofSubfolder. Right now, the FolderId is not supported in the new cmdlets, only the path. Either way, the lack of nice pairing is stupid, but fixable by selecting the Identity of the folders, fixing the name, and passing that along the pipeline. The other filters I throw in there are so that I don't ask about permissions for every dumb folder in a mailbox that might be some Teams, Yammer, etc. system folder and subfolder, but I do want Calendars.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-34183537131866672572019-09-18T15:44:00.000-04:002019-09-18T15:44:35.288-04:00PowerShell Scripts - Connect to Azure AD with MFA - reusing your existing connection<p>Situation: You are a very good user and have Multi-factor Authentication enabled for your account and you do things in Azure often. You want to connect to Azure without having to be prompted AGAIN for your MFA once you have done so for something else (e.g. the Exchange Online PowerShell module). You could also be creating a script that many people have to run many times a day and you don't want them re-authenticating over MFA every...single...time.</p><p>Resolution:<br />
<br />
<ol><li>Make sure you have the latest and greatest of the Exchange Online PowerShell Module (you can get it at https://aka.ms/exopspreview)</li>
<li>Make sure you have the latest version of the AzureAD Module (I'm using at least version 2.0.2.26 at the time of this writing)</li>
<li>Make a script similar to this (or add to your script...this assumes the current user is the one that installed the EXO module):<br />
<br />
<pre class="brush: ps">[CmdletBinding()]
param(
[string]$MFAPath,
[string]$O365TenantId = "YOUR-TENANT-ID-GOES-HERE",
[string]$O365ClientId = "1b730954-1685-4b74-9bfd-dac224a7b894",
[string]$O365ResourceUrl = "https://graph.windows.net",
[uri]$O365URI = "urn:ietf:wg:oauth:2.0:oob",
[string]$DefaultDomain = "contoso.com"
)
Import-Module AzureAD
#Load the MFA module.
try{
$getChildItemSplat = @{
Path = "$Env:LOCALAPPDATA\Apps\2.0\*\CreateExoPSSession.ps1"
Recurse = $true
ErrorAction = 'Stop'
Verbose = $false
}
$MFAPath = ((Get-ChildItem @getChildItemSplat | Sort-Object LastWriteTime -Descending | where-object {(Test-Path "$($_.PSParentPath)\Microsoft.Exchange.Management.ExoPowershellModule.dll") -eq $true} | Select-Object -First 1 | Select-Object -ExpandProperty fullname).Replace("\CreateExoPSSession.ps1", ""))
. "$MFAPath\CreateExoPSSession.ps1" 3>$null
Write-Verbose -Message "MFA module found in this folder - $MFAPath"
} catch {
Read-Host "MFA Module not found inside the local appdata folder for $ENV:USERNAME. If it is installed for another user already, run powershell as that user. To install the latest module, go to https://aka.ms/exopspreview. Press any key to exit"
exit
}
try {
#Since the MFA module has a file that holds all of the authentication processes, let's use that to authenticate silently
$IdentityPath = Get-ChildItem -Path "$MFAPath\Microsoft.IdentityModel.Clients.ActiveDirectory.dll" -Recurse -Verbose:$false | Sort-Object LastWriteTime -Descending | Select-Object -First 1 -ExpandProperty FullName
Add-Type -Path $IdentityPath
#Using the documentation for the AuthenticationContext, you need a resource url, client ID, URI (special), PromptBehavior (special), and UserIdentifier (special)
#This means that we create a basic context pointing to a common login url for O365 but the AzureAD graph url as the resource url
$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList "https://login.windows.net/$O365TenantId"
#Specifying Auto for this allows MFA to check for an existing token and use it if possible, otherwise prompt for MFA
$MFAPromptBehavior = [Microsoft.IdentityModel.Clients.ActiveDirectory.PromptBehavior]::Auto
#Identifier type 2 (RequiredDisplayableId) means the displayable ID (i.e. the email address) is used as the username for the token
$AADcredential = [Microsoft.IdentityModel.Clients.ActiveDirectory.UserIdentifier]::new(("$ENV:USERNAME@$DefaultDomain"),2)
#Get the authentication...let's hope the result is successful :)
$authResult = $authContext.AcquireToken($O365ResourceUrl,$O365ClientId,$O365URI,$MFAPromptBehavior,$AADcredential)
#Now that all the weird auth stuff has completed, connect to azuread using the resulting data
Connect-azuread -TenantId $authResult.TenantId -AadAccessToken $authResult.AccessToken -AccountId $authResult.UserInfo.DisplayableId
} catch {
write-output "Failure to connect to O365/Azure MSOLService for user account management. Here is the error from MS: $_"
Read-Host "Press any key to exit script"
Exit
}
</pre></li>
</ol></p><p>I learned this from a smattering of blog posts that you may also find helpful (one I found after I figured all this out on my own sadly):<ul><li><a href="https://ingogegenwarth.wordpress.com/2018/02/02/exo-ps-mfa/">Deep dive:Exchange Online PowerShell and MFA</a></li><li><a href="https://www.michev.info/Blog/Post/1771/hacking-your-way-around-modern-authentication-and-the-powershell-modules-for-office-365">Hacking your way around Modern authentication and the PowerShell modules for Office 365</a></li><li><a href="https://www.morgantechspace.com/2018/01/connect-to-microsoft-graph-api-using-powershell.html">How to Connect Microsoft Graph API using PowerShell</a> (though you can use the URI I supplied above and client ID and it works the same as my stuff here...his is much more nicely laid out)</li></ul></p>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-59746089923701729332019-04-11T16:26:00.001-04:002019-06-17T12:03:20.917-04:00Getting around the 50k limit for Azure / O365 Groups in Azure AD SyncProblem: A giant group in your on-premises Active Directory does not sync through Azure Active Directory Sync<br />
<br />
Source: Azure / O365 has a limit in Azure AD Sync (AAD Sync) such that it ignores groups with more than 50k members - I even found that groups really close to this limit acted strangely<br />
<br />
Resolution: Script a solution that will take your on-prem group and create mirror groups, each capped at a maximum number of members, so that all of the miniature groups will sync. The script auto-adds/removes users from the mirror groups, and you can use these groups in Azure.<br />
<br />
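The core idea can be sketched like this (a hedged sketch, not the actual script; the group name, chunk size, and group settings are placeholders):<br />
<br />
<pre class="brush: ps"># Read the member attribute directly to avoid Get-ADGroupMember's size limits
$max = 40000
$members = (Get-ADGroup -Identity "GiantGroup" -Properties Member).Member
for ($i = 0; $i -lt $members.Count; $i += $max) {
    $mirrorName = "GiantGroup-Mirror-$([int]($i / $max) + 1)"
    if (-not (Get-ADGroup -Filter "Name -eq '$mirrorName'")) {
        New-ADGroup -Name $mirrorName -GroupScope Universal -GroupCategory Security
    }
    $end = [Math]::Min($i + $max - 1, $members.Count - 1)
    Add-ADGroupMember -Identity $mirrorName -Members $members[$i..$end]
}
</pre>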
Update 2019-06-17 - I have moved this script to a github repository to make updates easier. DO NOT WORRY - it is not some crazy-looking developer page...just a list of scripts from this site. Here is the url to my repository: <a href="https://github.com/hornerit/powershell/blob/master/ActiveDirectory-SplitAndSyncGroups-Public.ps1">https://github.com/hornerit/powershell/blob/master/ActiveDirectory-SplitAndSyncGroups-Public.ps1</a>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-43366700726289540182019-04-11T15:45:00.003-04:002019-04-15T07:33:29.267-04:00Azure CloudShell - Store Scripts Centrally for your TeamProblem: Cannot find or run scripts from Azure Cloud Shell that are centrally managed<br />
<br />
Source: Azure Cloud Shell uses a secure Linux VM with PowerShell Core running on it, and that shell has no access to any local resources (aka your file server). You also really can't connect to a git repo made by one of those fancy developers in your area - though you can clone one every time you open your cloud shell if that helps, and it may be good for their source control...I'm a sysadmin, I don't really get the git mantra except for backups. Also, many of your IT workers may attempt to access secure resources on insecure devices (aka their random personal tablet, since you called them on a Saturday and they don't want to VPN in using a laptop).<br />
<br />
Resolution:<br />
<br />
Assuming you have an AD group or Azure/O365 group that has your team members in it like we did:<br />
<br />
<ol>
<li>Log into portal.azure.com with permissions to create stuff in your subscription</li>
<li>Create a Resource Group for your team in the same location as your subscription</li>
<li>Grant Reader permissions for the team to the Resource Group (use the IAM menu option to grant this)</li>
<li>Create a storage account with the cheapest settings possible (see below)</li>
<ol>
<li>Set the Resource Group to the one you created for your team in step 2</li>
<li>Give it a nice, short, simple name for the Storage Account Name</li>
<li>Set the location to the same location as your resource group (NOTE: I had problems with East US 2; if your subscription is in East US 2, choose US EAST for this)</li>
<li>Set Performance to Standard</li>
<li>Leave Account kind for StorageV2 (general purpose v2)</li>
<li>Set Replication to Locally Redundant Storage (LRS) if possible, it's the cheapest</li>
<li>Set Access Tier to Cool</li>
<li>Use the Review and Create button and create the resource...this will take a minute</li>
</ol>
<li>Once the storage account is created, navigate to the Storage Account, and grant "Reader and Data Access" for your team</li>
<ul>
<li>If you receive errors about an unauthorized header, close your browser and re-login to Azure portal</li>
</ul>
<li>Inside the storage account, click Files and create a file share, call it something like "share", and don't set a quota - every new user that accesses Cloud Shell and creates a profile will commit 5 GB to this share, so there's really no point in trying to limit it unless you are scared of the hackers</li>
<li>Click the "share" share so that it opens and click "Upload" and upload scripts and other resources here for your team</li>
</ol>
To access these resources, you need to have each member of your team open a new Cloud Shell and, hopefully, they've never used this before. If they have, you will need to have them use a command to mount this new storage instead of their original storage (clouddrive mount -s SUBSCRIPTIONGUID -g "YOURRESOURCEGROUPNAME" -n "YOURSTORAGEACCOUNTNAME" -f "YOURSHARENAME")...beware, they may want to back up their stuff before they switch storage like that:<br />
<br />
<ol>
<li>Click the Cloud Shell button immediately to the right of the search bar in Azure Portal</li>
<li>Click PowerShell as the language (you can change this later if you like)</li>
<li>Click Advanced Settings on the right</li>
<li>Make sure the settings all match what you have made (you may have to manually type the Share name)</li>
<li>Click Connect/Yes/whatever allows it to go</li>
</ol>
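If someone does need to re-mount instead, here is the clouddrive command mentioned above, formatted for readability (substitute your own subscription GUID and names):<br />
<br />
<pre class="brush: ps">clouddrive mount -s SUBSCRIPTIONGUID -g "YOURRESOURCEGROUPNAME" -n "YOURSTORAGEACCOUNTNAME" -f "YOURSHARENAME"
</pre>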
Now that it is connected (you can verify with the Get-CloudDrive command, which gives you useful information), you can access those shared resources by typing this:<br />
<blockquote class="tr_bq">
cd $HOME/clouddrive</blockquote>
Now your PowerShell window is located in the main directory of the share...you can use dir to list the contents and even use tab completion to load your script name, and it'll work. One downside: you cannot access local resources (e.g. local Active Directory or modules) when the script is run within Cloud Shell (though you can map the file share as a network drive on your own machine). The best way to handle this situation is to create a Hybrid Runbook Worker so that you can create Runbooks (aka scripts), then use the scripts stored in your shared location to call the runbooks and specify that they run on the Hybrid Runbook Worker.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-79690005832494068712017-11-30T11:41:00.061-05:002022-05-25T13:34:16.645-04:00Extract Attachments Inside or Embedded Within InfoPath XML FormsSITUATION: It's been a while but I'm back to working more with SharePoint - YAY - and was hit with my first unique request back in 2018: we want to find a way to extract files that are embedded inside of InfoPath attachment controls. Along the way, I also had to figure out how to download all files in a library.<br />
<br />
RESOLUTION: I apologize now: THERE IS NO OOTB WAY TO JUST GRAB THESE SUCKERS...you must use SOME form of code or script :(. I originally figured out two ways and have maintained one way to do this: PowerShell. This way, you can download the files locally and then extract the attachments (it does use more disk space), though it is probably done on a SharePoint admin server - I welcome some assistance in porting this to O365 so that anyone can run it on their own computer if they have appropriate perms.<br />
<br />
******CAVEATS******<br />
The ways presented are definitely beta and should be applied only to a test instance of SharePoint so you don't kill your server or computer. I'm definitely not responsible for you using this script and it messing stuff up - review it, test it, follow your company's standards for verification and testing, etc.<br />
I welcome feedback if you discover a bug in it but feel free to use how you see fit, tweak it, whatever. It'd be nice to keep some credit if you end up doing amazing things with it - just let me know :)!<br />
This was tested using PowerShell on a SharePoint Server 2010 instance. This may or may not work in other versions, I'm sure you could test it and let me know.<br />
******END CAVEATS******<br />
<div><br />
</div><div>Now, it took me a while to get all the pieces together, so here are the links to the information needed to pull this off &lt;begin credits&gt;:<br />
<div><ol><li>MS explains how to do this in visual studio code <a href="https://support.microsoft.com/en-us/help/2517906/how-to-encode-and-to-decode-a-file-attachment-programmatically-by-usin" target="_blank">here</a></li>
<li>Chris White expounds on the same thing in some different detail <a href="http://chrissyblanco.blogspot.ie/2006/07/infopath-2007-file-attachment-control.html" target="_blank">here</a></li>
<li>Different stack exchange questions related to pushing and pulling from XML in a form library as well as handling the attachment raw data and rebuilding it <a href="https://sharepoint.stackexchange.com/questions/222076/reading-and-writing-xml-in-form-library-with-powershell" target="_blank">here</a>, <a href="https://stackoverflow.com/questions/14905396/using-powershell-to-read-modify-rewrite-sharepoint-xml-document" target="_blank">here</a>, and <a href="https://stackoverflow.com/questions/21797299/convert-base64-string-to-arraybuffer" target="_blank">here</a></li>
<li>Masanao Izumo has a cross-browser implementation for converting an unsigned 8-bit integer array back to a string (text) <a href="https://ourcodeworld.com/articles/read/164/how-to-convert-an-uint8array-to-string-in-javascript" target="_blank">here</a></li><li>(2022 update) - MS has some documentation on the byte header that they insert before files that have been embedded within their xml files <a href="https://docs.microsoft.com/en-us/previous-versions/office/troubleshoot/office-developer/encode-and-decode-attachment-using-visual-c-in-infopath-2010" target="_blank">here</a> &lt;end credits&gt;</li>
</ol><b><u>PowerShell Script</u></b> - Most recent update was 2022-05-02. It has 2 requirements and 3 optional pieces for which it prompts: 1) the URL of the SharePoint web (aka the regular site, not the site collection), 2) the name of the form library, 3) if you are extracting InfoPath forms, the data source in the xml file you want to use to create folder(s) on your computer in which to place all of the attachments - if it is an XML attribute instead of a proper data source, that is ok too - and 4) and 5) cutoff dates in case you want to only download items before or after a certain date. The last part that you *might* edit is $filepath, since that is the main path on your computer where you want to stick all these attachment files. Here's the script link: <a href="https://github.com/hornerit/powershell/blob/master/Get-SharePointInfoPathFilesAndAttachments.ps1">https://github.com/hornerit/powershell/blob/master/Get-SharePointInfoPathFilesAndAttachments.ps1</a><br />
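The decode logic, explained step by step in the Further Explanation below, boils down to something like this (a hedged sketch, not the full script; $base64 and the output path are placeholders):<br />
<br />
<pre class="brush: ps"># $base64 holds the attachment node's text pulled from the InfoPath XML
$bytes = [Convert]::FromBase64String($base64)
# Byte 20 of the header holds the filename length in characters (including a null terminator);
# double it because the filename is stored in Unicode (2 bytes per character)
$fileNameLength = $bytes[20] * 2
# The filename starts at byte 24; drop the trailing 2-byte null terminator
$fileName = [System.Text.Encoding]::Unicode.GetString($bytes, 24, $fileNameLength - 2)
# Everything after the fixed header and the filename is the file content itself
$content = $bytes[(24 + $fileNameLength)..($bytes.Length - 1)]
[System.IO.File]::WriteAllBytes("C:\Extracted\$fileName", [byte[]]$content)
</pre>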
</div><div><div><br />
<b><u>Further Explanation:</u></b> So here's how it pulls together:<br />
<br />
<ol><li>InfoPath XML attachment files are first a long string that has been "encoded" into something called Base64. You don't have to know what that means, you just have to know they were encoded - which means to use them we need to decode them. That's the first step once we have the InfoPath XML file.</li>
<li>The attachment files (once decoded) are composed of a header portion and the data portion. The header consists of 2 parts: a fixed header set of information and the filename of the attachment.</li>
<li>Byte 20 of the header tells you, as a number, how long the filename is. That count is in characters, so it gets doubled to find the byte length because the filename is stored as Unicode (UTF-16, two bytes per character).</li>
<li>Once we know how long the filename is, we look *just* past the end of the regular, fixed header (byte 24) and copy every other byte until the end of the filename section - except for the very last one, which is a byte that tells the system this is the end of the file name. The reason we take every other byte is that the ones in between are basically nothing, or what programmers call 'null'. One weird thing - PDFs in my environment didn't have a file extension at the end of their name...so I created a small workaround to just append .pdf if there was no extension mentioned.</li>
<li>Now that we have the filename, we need to separate out the contents from the header, so we create a new array that just starts after the end of the filename.</li>
<li>This new array needs to be somehow converted into a file...in *Windows* Powershell, we use the .NET [System.IO.File]::WriteAllBytes method to make it a file - we just tell it the path where it should write and hand it this blob of goodness.</li>
<li>Getting access to the file itself is easy for Powershell - when the script is done, go look at your folder. There's the "_DownloadedRawFiles" subfolder, which you can ignore for now, and there's all the other folders generated by the extraction process based on the group by aka Folder Structure Node supplied.</li>
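<li>Putting the steps above together - here's a minimal sketch of the same decoding logic in Python (illustrative only; the linked PowerShell script is the real tool, and the blob below is a synthetic demo rather than a real InfoPath file):

```python
import base64
import struct

def decode_infopath_attachment(b64_text):
    """Split a decoded InfoPath attachment into (filename, file_bytes).

    Layout per the steps above: a 24-byte fixed header whose value at
    byte 20 holds the filename length in characters (including the
    trailing null), then the filename in UTF-16LE (hence the doubling),
    then the raw file contents.
    """
    raw = base64.b64decode(b64_text)
    name_chars = struct.unpack_from("<I", raw, 20)[0]    # byte 20: filename length
    name_end = 24 + name_chars * 2                       # filename starts just past byte 24
    filename = raw[24:name_end - 2].decode("utf-16-le")  # drop the 2-byte null terminator
    if "." not in filename:  # workaround from step 4: some PDFs lacked an extension
        filename += ".pdf"
    return filename, raw[name_end:]

# Build a synthetic blob to demonstrate: 20 filler bytes, the length
# value, the UTF-16LE name plus its null terminator, then the contents.
name, payload = "report.txt", b"hello"
blob = base64.b64encode(
    bytes(20) + struct.pack("<I", len(name) + 1)
    + name.encode("utf-16-le") + b"\x00\x00" + payload)
print(decode_infopath_attachment(blob))  # ('report.txt', b'hello')
```
</li>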
</ol></div></div></div>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com10tag:blogger.com,1999:blog-4504860312480986243.post-7216630292019720472014-04-07T10:45:00.000-04:002014-04-07T10:45:13.877-04:00Android Home Button keeps opening recent apps<strong>Problem</strong>: I have a Galaxy S4 and at some point it decided that when I press the home button on the phone it would always open up the recent apps window - even if I didn't hold the home button down! This got frustrating but I started adapting to double-hitting the home button all the time.<br />
<br />
<strong>Resolution</strong>: Found out when working on my wife's phone - this started happening to her after installing the Dolphin browser. So I fiddled with it and I figured it out - Dolphin's "Confirm before exit" setting was causing it, turn that sucker off and restart your phone!<br />
<br />
TL;DR/To fix - open Dolphin, go to Settings, then Exit Settings, then turn off the "Confirm Before Exiting" setting. Restart your phone and BOOM. Worked for me. Others have also mentioned that they noticed this happening when using the Samsung Keyboard or the Samsung preinstalled Swype instead of the Google keyboard or the full Swype+Dragon app.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-48692867579831824802014-03-06T17:11:00.006-05:002019-04-17T09:55:56.668-04:00SharePoint 2013 Workflow App Step won't work, gets suspendedSituation: SharePoint 2013, Workflow Manager 1.0 + the two CUs for Service Bus and Workflow Manager, List with workflow that has an app step (to allow it to read/write to other lists to which the user does not have access). Workflow App Step can't seem to do anything at all! Tried process of elimination and it seems like it cannot look up any field in my current item. When I look at the workflow right after it starts, the status says "Starting" with this message in the little "i" symbol:<br />
<br />
<span style="font-size: small;"><i>Retrying last request. Next attempt scheduled in less than one minute. Details of last request: HTTP NotFound to https://MYSERVER/_vti_bin/client.svc/web/lists/getbyid('SOME_GUID_GOES_HERE')/Items(IdOfYourItem)?%24select=SOMETHING</i></span> <br />
<br />
After that, it gets a status of "Suspended" with detail in that little "i" icon of something awful like this (which is a web service result from that client.svc mentioned up there):<br />
<span style="font-size: x-small;"><i>Details: An unhandled exception occurred during the execution of the workflow instance. Exception details: System.ApplicationException: HTTP 404 {"error":{"code":"-1, System.ArgumentException","message":{"lang":"en-US","value":"Item does not exist. It may have been deleted by another user."}}} {"Transfer-Encoding":["chunked"],"X-SharePointHealthScore":["0"],"SPClientServiceRequestDuration":["18"],"SPRequestGuid":["c38c92af-563c-073b-8950-19e9ab5d00ce"],"request-id":["c38c92af-563c-073b-8950-19e9ab5d00ce"],"X-FRAME-OPTIONS":["SAMEORIGIN"],"MicrosoftSharePointTeamServices":["15.0.0.4420"],"X-Content-Type-Options":["nosniff"],"X-MS-InvokeApp":["1; RequireReadOnly"],"Cache-Control":["max-age=0, private"],"Date":["Thu, 06 Mar 2014 22:07:22 GMT"],"Server":["Microsoft-IIS\/7.5"],"X-AspNet-Version":["4.0.30319"],"X-Powered-By":["ASP.NET"]} at Microsoft.Activities.Hosting.Runtime.Subroutine`1.SubroutineChild.Execute(CodeActivityContext context) at System.Activities.CodeActivity.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager) at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation) </i></span><br />
<br />
Resolution: Turns out, the App Step can't get past those pesky list advanced settings where you set "item-level permissions" to "Read only their own". Yea, for that the workflow "app" would need to have higher or elevated permissions. Seems like the only way to do this is to A) Have an app catalog and B) Set the workflow "app" as having full control using a hidden app permissions page. You can find the step by step <a href="http://msdn.microsoft.com/en-us/library/office/jj822159%28v=office.15%29.aspx" target="_blank">here </a>(Microsoft) or <a href="http://www.fabiangwilliams.com/2013/09/08/actually-resolved-unable-to-create-list-using-sharepoint-2013-rest-api-in-spd2013/" target="_blank">here </a>(Fabian Williams...the better one if you ask me). This applies to any action that might require full control or design-like permissions like creating lists, creating sites, etc. and applies to SharePoint 365 as well.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-33683420713427191102014-03-06T10:35:00.000-05:002014-03-06T10:35:14.679-05:00SharePoint 2013 List Not Shared With You<strong>Situation</strong>: SharePoint 2013, claims-based authentication using Windows NTLM claims (aka your Windows PC passes your username and password to SharePoint's claims security service to log you in), site uses AD group membership to control access, and finally a web part on the homepage to show some items in this list.<br />
<br />
Other users could load this just fine, my user would get no items and, when I clicked on the list itself, would get "<u>Sorry, this site has not been shared with you</u>" (nevermind that it was a list, not a site, the error still said 'site'). Checked permissions on the item, list, site, site collection, masterpage library, themes libraries, web parts, image library, custom javascript library pages, etc. Checked server logs and SharePoint logs that show all sorts of nasty detail on every thing in SharePoint, nothing.<br />
<br />
I also noticed that AD group membership changes didn't sync - even after telling the User Profile Service to do a full sync!<br />
<br />
<strong>Resolution</strong>:<br />
I remember stumbling on something about AD group memberships in SharePoint 2013 (though this applies to 2010 as well with ADFS) regarding claims-based authentication and an expiration or cache of some kind. Using two articles (<a href="http://sergeluca.wordpress.com/2013/07/06/sharepoint-2013-use-ag-groups-yes-butdont-forget-the-security-token-caching-logontokencacheexpirationwindow-and-windowstokenlifetime/" target="_blank">here</a> and <a href="http://blog.randomdust.com/index.php/2013/06/sharepoint-2013-claim-expiration-and-ad-sync/" target="_blank">here</a>) with MS documentation (<a href="http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.administration.spwebservice.tokentimeout(v=office.15).aspx" target="_blank">here</a>, <a href="http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.administration.claims.spsecuritytokenservicemanager.logontokencacheexpirationwindow(v=office.15).aspx" target="_blank">here</a>, <a href="http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.administration.claims.spsecuritytokenservicemanager.windowstokenlifetime.aspx" target="_blank">here</a>, and <a href="http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.administration.claims.spsecuritytokenservicemanager.formstokenlifetime(v=office.15).aspx" target="_blank">here</a>), I found the fix: update the Security Token Service timeout to a shorter time and update the Windows and Forms token timeouts to a shorter time as well.<br />
<br />
Be aware - the following PowerShell (that you have to run from the SharePoint 2013 Management Shell) tells the STS (Security Token Service) to reduce the amount of time a user token is valid (as far as the STS cares) and then lowers the expiration window for claims tokens handed out by the STS. This would increase network traffic (refer to the articles up there for more detailed explanations) and could make your SharePoint a bit unstable if your timeouts aren't in this pattern of TokenTimeout > Windows/FormsTokenLifetimes > LogonTokenCacheExpirationWindow:<br />
<br />
<pre class="brush:ps">$cs = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$cs.TokenTimeout = (New-TimeSpan -hours 4)
$cs.Update()
$sts = Get-SPSecurityTokenServiceConfig
$sts.FormsTokenLifetime = (New-TimeSpan -hours 2)
$sts.WindowsTokenLifetime = (New-TimeSpan -hours 2)
$sts.LogonTokenCacheExpirationWindow = (New-TimeSpan -minutes 30)
$sts.Update()
iisreset</pre>
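That nesting pattern is the part people trip over, so here's a tiny sanity check of the invariant (a Python sketch, purely illustrative - the values mirror the PowerShell above):

```python
from datetime import timedelta

def settings_are_sane(token_timeout, windows_lifetime, forms_lifetime, cache_window):
    """The ContentService TokenTimeout must exceed both token lifetimes,
    which in turn must exceed the LogonTokenCacheExpirationWindow."""
    return token_timeout > max(windows_lifetime, forms_lifetime) > cache_window

# The values from the script above: 4h > 2h > 30min, so this pattern is OK
print(settings_are_sane(timedelta(hours=4), timedelta(hours=2),
                        timedelta(hours=2), timedelta(minutes=30)))  # True
```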
<div>
You can play with the times to be much shorter if you are just trying to fix an issue immediately but the ContentService's TokenTimeout HAS to be greater than the TokenLifetimes which HAVE to be greater than the CacheExpirationWindow. I'm playing with these settings for now and will update later if I end up changing them.</div>
Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-23355600055343986342013-07-08T10:45:00.000-04:002013-07-08T10:45:18.759-04:00SharePoint 2010 People Picker Address Book Error - Namespace xsd not definedProblem: You attempt to use a people picker control in SharePoint 2010 and click the little 'address book' button to find a user. You search for someone and find them (yay!) and click the OK button to add them and you get a giant error message. And then nothing. It doesn't copy the user you picked over to the People Picker or anything. When you give the error message to the admin (or you are the admin and you look up the correlation ID in the logs) then you get an error in the list saying "Namespace 'xsd' is not defined" in an XML error.<br />
<br />
Cause: Customizations to SharePoint masterpage to make it use more recent web technologies like CSS3/HTML5 (via the X-UA-COMPATIBLE meta tag) OR your browser trying to force a square peg (IE8 stuff) into a round hole (IE9 or 10+ browser).<br />
<br />
Resolution:<br />
Sadly, if there are customizations to the masterpage to try to use HTML5 or CSS3 then you have to use javascript to fix it (google for that...I think Matt Oleson had something), use a different browser than IE9 or 10, or fix the masterpage to render the page in IE8 standards (re-setting the X-UA-COMPATIBLE to IE8 instead of IE9 or EDGE or anything else that might be there).<br />
<br />
If, however, you have done no customizations of this modern sort in your environment and you are using IE9 or IE10 and you get this then your problem is a lack of compatibility view. You need to add your SharePoint site to your IE list of sites in the Tools -> Compatibility View Settings. Normally, Intranet sites should be in this list (like sites inside your company network) but if the site contains a period (.) then it wouldn't qualify as an "Intranet" site in this case. We used GPO (Group Policy Objects) to push this setting out to all computers in our environment to force IE to see our site in Compatibility View.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-5043874165812349512013-06-27T13:44:00.002-04:002013-06-27T13:44:12.111-04:00SharePoint Designer 2010 Workflow - Lookup User Information in SharePoint Foundation 2010Problem: In Foundation 2010, SharePoint Designer only allows you to perform a lookup on a user's Name, Display Name, Email Address, and ID...what about custom fields that were added to the User Information List? What about Department, Phone number, or any other field that works fine with a People Picker? Can I do ANYTHING without going into dreaded code or installing a third-party app or upgrading to Enterprise's User Profile Service? The answer is yes, in VERY limited circumstances (so don't get your hopes too high yet)<br />
<br />
Resolution:<br />
Caveats:<br />
<ul>
<li>This will ONLY work on the root site of a site collection (believe me, tried this in many ways but it seems you cannot change the WEB property of a workflow context on the fly)</li>
<li>This requires SharePoint Designer, should be obvious but wanted to mention just in case</li>
<li>The getting of your User Information List GUID (first step) can be accomplished in several ways but almost all of them will require site collection administration privileges</li>
<li>What you are essentially going to do is create a dummy User Information List with the columns you want in it, get your workflow working beautifully with this dummy list based on someone's username being entered in a text field, and then do a bait-n-switch and tell SPD that the list you are using in your lookup isn't your dummy list but actually the User Information List</li>
<li>You may be able to play with it to be able to pass the User's ID to the User Info List instead of the text of the username, but it was easier for me to send it just the username from a text field</li>
</ul>
Setup:<br />
<ul>
<li>Custom List that I titled "TestUserInfoList" (this is my Dummy List) with 3 columns</li>
<ul>
<li>Title (didn't change it at all)</li>
<li>Username (Single Line Of Text)</li>
<li>Manager (People Picker) <-- This is my custom field to populate</li>
</ul>
<li>Custom List with these columns:</li>
<ul>
<li>Employee (Person or Group)</li>
<li>Manager (Person or Group)</li>
<li>Username (Single Line of Text)</li>
</ul>
<li>Workflow with 2 variables:</li>
<ul>
<li>varEmployee (string)</li>
<li>varManager (string)</li>
</ul>
<li>Steps to workflow</li>
<ul>
<li>Set varEmployee to CurrentItem:Username</li>
<li>Set varManager to (workflow lookup):</li>
<ul>
<li>Data Source: TestUserInfoList</li>
<li>Field from source: Manager</li>
<li>Return Field As: As String</li>
<li>Field: TestUserInfoList:Username</li>
<li>(Fx button) -> Workflow Variables -> Variable: varEmployee</li>
</ul>
<li>Set Employee to varEmployee</li>
<li>Set Manager to varManager</li>
</ul>
</ul>
Steps:<br />
<ol>
<li>Get your User Information List GUID (you *might* be able to get away without this but I haven't tested it yet) </li>
<ol>
<li>Go to your site collection homepage, add '/_catalogs/users/simple.aspx' to the URL, and hit Enter (so your full URL would look something like <a href="http://yoursite/_catalogs/users/simple.aspx">http://yoursite/_catalogs/users/simple.aspx</a>)</li>
<li>Click on the "List View" at the top (which is a dropdown) and click "Modify this view"</li>
<li>Take a look at the URL, you will see ViewEdit.aspx?List=BLAH&View=BLAH&Source=BLAH. What you want is to grab the 'List=BLAH', nothing before or after it</li>
<li>Take the BLAH part and replace all the '%2D's in it with dashes (-), the '%7B' with a left curly brace ({) and the '%7D' with a right curly brace (})</li>
<li>So you should now have the GUID of your User Information List which should look like {00000000-0000-0000-0000-000000000000} except yours will not be all zeroes</li>
</ol>
<li>Create dummy list in the root site (where you HAVE to be working), ignore the title field, create a Username field that is a single line of text, and a field with the exact same name and type as the field you wish to return (most likely a single line of text field). For my purposes, I had added a "Manager" field to my User Information List that was a people picker so that's what I recreated in my dummy list</li>
<li>Add a couple of entries to your dummy list</li>
<li>Create your workflow on your custom list (NOT your dummy list) and, in your first step, use the workflow to set a workflow variable (I created one called varManager) and then, in a separate step, set the field(s) in your custom list to the appropriate workflow variable and PUBLISH your workflow</li>
<li>VERIFY THAT IT WORKS RIGHT...if you screw this part up then you'll spend hours troubleshooting later. Just make sure it works, m'kay?</li>
<li>Again, as long as your workflow works beautifully, you are ready to trick SharePoint into using the hidden User Information List: in SPD, click "All Files" at the bottom-left, then click the "Workflows" folder in the main window, then click your workflow, and you should see at least two files, maybe 3, but the one you are looking for has the workflow symbol (a circle with a red checkbox in the middle)...RIGHT-click on this file and then click "Open With..." and choose the SharePoint Designer -> Open as XML option</li>
<li>You will see a bunch of gobbledygook. It's ok, don't panic. What you are looking for is this and it will be somewhere around the 20th line of XML:<br />&lt;SequenceActivity x:Name="ID##" Description="Your Step Description where you set the workflow variables"&gt;</li>
<li>Inside this sequence activity (which is just your step) you will look for your first ns1:FindValueActivity and it has an attribute that looks like ExternalFieldName="Username" and, further to the right, an ExternalListId="{}{0000000-0000-0000-0000-000000000000}"</li>
<li>Replace the GUID there (leaving that first set of curly braces) with the GUID you got in step 1</li>
<li>Go down around 5 lines and you'll see an ns1:LookupActivity where the ListId attribute looks just like the ExternalListId you just fixed...you guessed it, do the same thing to this list ID.</li>
<li>Repeat steps 8 thru 10 for each variable that you looked up in your workflow within this SequenceActivity. You will be essentially replacing all ExternalListId's and their corresponding lookupActivity listID's to the User Information List</li>
<li>Save this file and close SPD. </li>
<li>Re-Open SPD, open your site, and open this workflow</li>
<li>Add a single action to the workflow like "Log to history list" or something, ANYTHING, and you can publish this updated workflow to your site</li>
<li>You should be good now. To verify, delete your entries in your dummy list and create some new items in your custom list and all should work beautifully - you are now using the User Information List to set variables and use them to do your bidding...MWAHAHAHAHAHA</li>
</ol>
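As a footnote to step 1: those %-escapes in the ViewEdit.aspx address are just standard URL encoding, so any URL decoder will recover the GUID for you instead of swapping characters by hand. A quick sketch (Python, purely illustrative - the GUID below is made up):

```python
from urllib.parse import unquote

# The List=... value as it appears in the ViewEdit.aspx address bar
# (fake GUID for illustration): %7B -> {, %2D -> -, %7D -> }
raw = "%7B1A2B3C4D%2D0000%2D0000%2D0000%2D123456789ABC%7D"
print(unquote(raw))  # {1A2B3C4D-0000-0000-0000-123456789ABC}
```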
Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-15160067457873615182012-12-06T12:53:00.000-05:002012-12-06T15:58:04.199-05:00SharePoint 2010 Foundation User Profile Sync<h2>Problem:</h2>SharePoint 2010 Foundation does not provide a user profile synchronizing service between active directory and SharePoint. Sure, whenever a user logs into SharePoint it adds that user but it only copies the name and email address and only to the site collection in question. If there are changes to the name of the user, this can sometimes get stuck and not sync over correctly. Additionally, information like the manager, phone number, and other directory information does not get pushed to the user profiles.<br />
<h3>Why does this matter?</h3>When you use a Person/Group column in SharePoint 2007/2010, it is really just a lookup column to that site collection's User Information List and thus you can return any column from the user information list that you like (Name, email, etc.) even though the input is always the username and click the little Check Names box. This can allow for specialized lookups in things like InfoPath or special forms where you want to automatically know who the person's manager is whenever they add an item to a list.<br />
<h3>Enter Powershell scripting</h3>Oh no, NO NO NO: the word "scripting" made it into a blog about 'out-of-the-box' solutions!? HYPOCRITE. Anyway, Powershell is like the command prompt from windows, something geeks have been using for years to do things that would otherwise require lots of clicks. Powershell is ONLY for system administrators, not regular SharePoint people. But this type of problem is really more of an admin problem anyway.<br />
<h2>Resolution:</h2>I combined two blog entries: <a href="http://technet.microsoft.com/en-us/library/ff730967.aspx">here</a> and <a href="http://blog.falchionconsulting.com/index.php/2011/12/updating-sharepoint-2010-user-information">here</a> to make a script that does the following:<br />
<ol><li>Creates a request to pull from active directory via LDAP (Lightweight Directory Access Protocol). This asks Active Directory (where all usernames/passwords/etc are stored for your company) to find all users they have. Now, if your company has a bajillion users, this would be bad, but it asks for their account name and just a few properties so it's not too terrible (and, hey, I only have like 80 users to deal with).</li>
<li>Gets all the site collections from SharePoint</li>
<li>For each site, try to run a simple "SyncFromAd" command to make sure that the user's automatic info is up-to-date</li>
<li>Go the user information list and update the columns in there with my information</li>
<li>Forgot to mention - I added a couple of custom columns to the user information list for my site collection...that may be a no-no, not totally sure ^_^, but I SET THEM ANYWAY. Bam.</li>
</ol>So, without further ado, here's the script that does the work (and this WILL take updating on your end if you want to use it to specify the domain for your company and whether or not you are adding columns to your user information lists like I did):<br />
<br />
<pre class="brush:ps">$objDomain = New-Object System.DirectoryServices.DirectoryEntry("LDAP://OU=YOUR_USERS_OU,dc=YOUR_DOMAIN_NAME,dc=YOUR_DOMAIN_ENDING_LIKE_COM")
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objDomain
$objSearcher.PageSize = 1000
# Limit the LDAP search to user accounts
$strFilter = "(&(objectCategory=person)(objectClass=user))"
$objSearcher.Filter = $strFilter
$objSearcher.SearchScope = "Subtree"
$varDomain = "YOUR_DOMAIN_NAME"
$colProplist = "samaccountname","title","department","ipphone","mobile","name","distinguishedname","manager"
foreach ($i in $colPropList) {
[void]$objSearcher.PropertiesToLoad.Add($i) # [void] suppresses the index that Add() returns
}
$colResults = $objSearcher.FindAll()
$manResults = $objSearcher.FindAll() # second copy of the results, used below to resolve each user's manager
$sites = Get-SPSite -Limit "ALL"
foreach ($objResult in $colResults) {
$objItem = $objResult.Properties
write-host
$userID = $varDomain+"\"+[string]$objItem.samaccountname
foreach ($site in $sites) {
$web=$site.RootWeb
set-spuser -Identity $userID -SyncFromAD -web $site.url -ErrorAction SilentlyContinue
if(!$error[0]) {
write-host $site.url " - " $userID
$list = $web.Lists["User Information List"]
$query = New-Object Microsoft.SharePoint.SPQuery
$query.Query = "<Where><Eq><FieldRef Name='Name' /><Value Type='Text'>$userID</Value></Eq></Where>"
foreach ($item in $list.GetItems($query)) {
$item["JobTitle"] = [string]$objItem.title
$item["Department"] = [string]$objItem.department
$item["IPPhone"] = [string]$objItem.ipphone
$item["MobilePhone"] = [string]$objItem.mobile
$item["Title"]= [string]$objItem.name
$item["Username"] = [string]$objItem.samaccountname
$manager = $manResults | Where-Object {$_.Properties.distinguishedname -eq $objItem.manager}
$managerClean = $varDomain+"\"+[string]$manager.Properties.samaccountname
$spManager = Get-SPUser -Identity $managerClean -web $site.url
$item["Manager"] = $spManager
$item.SystemUpdate()
}
}
else {
#write-host $site.url " - " $error[0]
}
$error.clear()
$web.Dispose()
$site.Dispose()
}
}</pre>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com2tag:blogger.com,1999:blog-4504860312480986243.post-63119849846599751022012-09-10T08:50:00.002-04:002012-09-10T08:53:36.622-04:00Sending emails between SharePoint lists(Also thought of as SharePoint Workflows emailing other SharePoint Lists.)<br />
<br />
Situation: Wanted to have a single list at the root site where users can submit ideas and problems that should go to IT for review. We figured we would place a simple list there, use the masterpage to create a link at the bottom, and have this list workflow email our own IT Tickets announcements list (different site). One problem: MS built SharePoint so that it will not recognize emails sent to it from itself (security feature for possible infinite loops- KB970818).<br />
<br />
Resolution: There is a resolution out there that involves a VB script placed on the SharePoint server to strip out the SharePoint-y part of the email....I thought that (since I've got Exchange in my environment) I could just use what is known as a "transport rule" to strip out the same stuff so that I don't have to worry about configuring SharePoint all the time. See, here's the deal - when SharePoint sends emails it adds a special header that you don't normally see (in Outlook 2010, you have to open an email and hit File -> Properties to see the message headers). This special header is titled "X-Mailer" and acts like a little field and SharePoint puts inside this field "Microsoft SharePoint" something or another. So I had our exchange administrator setup a transport rule that says "If an email has an X-Mailer header with the word "SharePoint" in it and is being sent to an address that ends in @sharepoint.mydomain.com then remove the X-Mailer header". That's it!<br />
<br />
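The transport rule's logic boils down to a simple header check. Here's a sketch of the same idea in Python (illustrative only - the real fix is the Exchange transport rule, and the domain below is made up):

```python
from email.message import EmailMessage

def strip_sharepoint_header(msg):
    """Mimic the transport rule: if X-Mailer mentions SharePoint and the
    recipient is a SharePoint inbound address, drop the X-Mailer header
    so SharePoint will accept its own mail."""
    if ("sharepoint" in (msg.get("X-Mailer") or "").lower()
            and (msg.get("To") or "").lower().endswith("@sharepoint.mydomain.com")):
        del msg["X-Mailer"]
    return msg

msg = EmailMessage()
msg["To"] = "ittickets@sharepoint.mydomain.com"
msg["X-Mailer"] = "Microsoft SharePoint Foundation 2010"
print(strip_sharepoint_header(msg)["X-Mailer"])  # None
```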
Just to clarify - this would allow me to have two lists email each other in an infinite loop...so your SharePoint admin may not be a fan if you have a large environment and people could setup this sort of thing...only use if you have a structure in-place that would prevent that from happening :)Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-41123620828582022162012-05-25T11:33:00.000-04:002012-09-10T08:55:12.238-04:00SharePoint 2010 LVWP Dynamic Date FilteringSituation: You used SharePoint Designer 2010 to insert what you *thought* was a data view onto a web part page and it is actually a List View Web Part (LVWP) and you want to filter the data related to [Today]. When you click the 'Filter' button in the top-left of SPD2010, it will only let you filter to '[CurrentDate]' and doesn't give you the option to say something like '[Today]-10'.<br />
<br />
Resolution: There are two ways to alter the core query here:<br />
1 - No looking at the dreaded 'code' or 'CAML': Save the page in SharePoint Designer, open the page directly in your browser, modify the web part by clicking its little dropdown and clicking 'Edit Web Part' and clicking 'Edit Current View' underneath the Current View dropdown (and then fix like you would any other view...sometimes this may actually give you an error if your LVWP was being evil like mine).<br />
2 - Must look at 'code' and 'CAML': When you look at the evil 'code' behind your data view, you will notice a section that looks like this near the top of your LVWP:<br />
&lt;View Name="{123jk123-j123-12j1-j1k2l;3l123j}" MobileView="TRUE" Type="HTML"...&gt;<br />
&lt;Query&gt;<br />
&lt;OrderBy /&gt;<br />
&lt;GroupBy /&gt;<br />
&lt;/Query&gt;<br />
<br />
If you have already filtered to '[CurrentDate]' then it will have put a section inside the &lt;Query&gt; that looks similar to this:<br />
&lt;Where&gt;<br />
&lt;Eq&gt;<br />
&lt;FieldRef Name="YourDateField"/&gt;<br />
&lt;Value Type="DateTime"&gt;<br />
&lt;Today/&gt;<br />
&lt;/Value&gt;<br />
&lt;/Eq&gt;<br />
&lt;/Where&gt;<br />
<br />
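(Aside: this CAML is just XML, so if you'd rather not hand-edit it blind, you can prototype the change programmatically first. A quick sketch in Python - purely illustrative - that makes exactly the edit described below:)

```python
import xml.etree.ElementTree as ET

# The stock filter SPD generates: an Eq test against a plain <Today/>
caml = ("<Where><Eq><FieldRef Name='YourDateField'/>"
        "<Value Type='DateTime'><Today/></Value></Eq></Where>")

root = ET.fromstring(caml)
comparison = root.find("Eq")
comparison.tag = "Gt"                                    # Eq -> Gt ("greater than")
comparison.find("Value/Today").set("OffsetDays", "-10")  # i.e. [Today]-10
out = ET.tostring(root, encoding="unicode")
print(out)
```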
What to fix: that lovely &lt;Today/&gt; piece. If you want to say something like 'Show me everything that is greater than (aka after) the last 10 days' then you would need to:<br />
1) Add an 'OffsetDays' attribute to the Today so that it looks like this: &lt;Today OffsetDays="-10"/&gt;<br />
2) Make sure that where I have "Eq", you put "Gt" (short for Greater Than) -- You could do the reverse by setting it to Lt for "Less Than" and making the OffsetDays a positive number to make it something like [Today]+7.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com3tag:blogger.com,1999:blog-4504860312480986243.post-23224428564471728962012-05-15T12:59:00.001-04:002013-01-09T14:00:19.910-05:00SharePoint 2007 and 2010 - Prepopulate or Prefill list fields via Query String aka browser address<em>Edit 1/9/2013 - Updated the time function to round any supplied time to the nearest 5 minutes so that it will properly set the time dropdown.</em><br />
<em>Edit 10/24/2012 - Updated one line where an errant forward slash existed and may be the cause of some users being unable to use the 2010 script.</em><br />
Situation: You want to have users fill out a standard list form but you want to auto-fill some of the fields for them (to make it just easier to fill or else to hide those fields so the user doesn't know they've been filled for them...think customer service feedback and a field specifying which person you spoke to).<br />
Resolution: <br />
<ol><li>Place the jQuery library (<a href="http://docs.jquery.com/Downloading_jQuery">http://docs.jquery.com/Downloading_jQuery</a> download the minified version) in a document library everyone has access to in your SharePoint</li>
<li>If you are using SharePoint 2010, open your list, click the 'List' tab, click the edit 'Default New Form' menu option, and skip to step 6</li>
<li>Go to your list, click the 'New' menu and click 'Item' (or just click 'New')</li>
<li>Notice the URL has your web address and ends in /newform.aspx?BLAHBLAH...delete everything after the question mark and replace it with the following: PageView=Shared&ToolPaneView=2</li>
<li>Your link should now look like <a href="http://sharepointserver/site/lists/yourlist/newform.aspx?PageView=Shared&ToolPaneView=2">http://sharepointserver/site/lists/yourlist/newform.aspx?PageView=Shared&ToolPaneView=2</a> and you just hit enter to visit this new address</li>
<li>Huzzah! You are in edit mode for the page! Just click 'Add a Web Part' and add a Content Editor web part</li>
<li>Once you add the web part, move it to the bottom of the page and edit its properties</li>
<li>Click 'Edit Source' or, if you are using 2010, click in the webpart so you could type and when the menu tab at the top switches to the 'Format Text' tab, click the 'HTML' dropdown option and choose 'Edit HTML Source'</li>
<li>If you are using SharePoint 2007, grab the script from nothingbutsharepoint.com (<a href="https://www.nothingbutsharepoint.com/sites/eusp/Pages/jquery-for-everyone-pre-populate-form-fields.aspx">https://www.nothingbutsharepoint.com/sites/eusp/Pages/jquery-for-everyone-pre-populate-form-fields.aspx</a>) and paste it into your source. If you are using SharePoint 2010, copy and paste my script below these instructions.</li>
<li>Key thing to notice: in the beginning of his script, there's a part referencing Google's jQuery library - replace that with the URL to the document library where you placed your own copy of jQuery</li>
<li>Click ok, save the page, you are now done.</li>
</ol>Here is the script for SharePoint 2010 (If you want the non-commented/more condensed version, I've added it to the very bottom of this post):<br />
<pre class="brush: js"><script type="text/javascript">
if(typeof jQuery=="undefined"){
var jQPath="http://YOURSHAREPOINT/YOURLIBRARY/jquery.min.js";
document.write("<script src='",jQPath,"' type='text/javascript'><\/script>");
}
</script>
<script type="text/javascript">
/*
* Prepopulate form fields in SharePoint
* Copyright (c) 2008 Paul Grenier (endusersharepoint.com now nothingbutsharepoint.com)
* Licensed under the MIT (MIT-LICENSE.txt)
* Updated for 2010 by Brendan Horner for nothingbutsharepoint.com
*/
(function(){
var params = window.location.search.substring(1).split("&"),
kv = {},
opts,
sp=/%20|\+/g,
datetime=/([1-9]|0[1-9]|1[012])[\-\/.]([1-9]|0[1-9]|[12][0-9]|3[01])[\-\/.](19|20)\d\d\s([0-1][0-2]|[0-9]):([0-9]{2})\s(A|P)M/i,
date=/([1-9]|0[1-9]|1[012])[\-\/.]([1-9]|0[1-9]|[12][0-9]|3[01])[\-\/.](19|20)\d\d/,
clean = function(str){
return str.replace(sp," ");
},
getKv = function(){
$.each(params,function(i,e){
var p=e.split("=");
kv[p[0]]=decodeURIComponent(p[1]);
});
return kv;
};
jQuery.prepop = function(){
$.each(getKv(),function(k,v){
k=clean(k);
v=clean(v);
var f=$("[title='"+k+"']"),
job;
if (f.length>0){
if (f[0].type=="text"){job=10;} //text
if (f[0].type=="checkbox"){job=20;} //checkbox
if (f[0].type=="select-one"&&f[0].tagName=="SELECT"){job=10;} //choice dropdown and non-IE lookup
if (f[0].tagName=="TEXTAREA"){job=10;} //Multi-lines of text
if (f[0].type=="text"&&f[0].opt=="_Select"){job=70;} //IE lookup with evil img and hidden input
if (v.match(date)){job=40;} //date
if (v.match(datetime)){job=50;} //datetime
}
if (f.length===0){
var elm = $("nobr:contains('"+k+"')");
if (elm.length>0){
elm = elm.closest("td").next()[0];
var s1 = $(elm).find("select:first"),
s2 = $(elm).find("select:last"),
p1 = $(elm).find("textarea[title='People Picker']"),
p2 = $(elm).find("div[title='People Picker']"),
r1 = $(elm).find("span[title='"+v+"']"),
vals = v.split(","),
r2 = $(elm).find("span[title='"+vals[0]+"']");
if (s1.length>0){job=80;} //multi-select
if (p1.length>0){job=90;} //people picker
if (r1.length>0||r2.length>0){job=30;} //radio button single select or checkbox list
}
}
switch (job){
case 10:
if (v.substring(0,1)=="@"){
opts = f[0].options;
$.each(opts,function(i,e){
if (opts[i].value==v.substring(1)){f[0].selectedIndex=i;}
});
}else{
f.val(v);
}
break;
case 20:
if (v.toUpperCase()=="TRUE"||v=="1"){f[0].checked=true;}
if (v.toUpperCase()=="FALSE"||v=="0"){f[0].checked=false;}
break;
case 30:
$.each(vals, function(i,e){
var V=TrimSpaces(e); //TrimSpaces is a function in core.js of SharePoint 2010
$.each($(elm).find("span.ms-RadioText").find("label"),function(i,e){
if($(e).text()==V){
$(e).prev().attr('checked',true);
}
});
});
break;
case 40:
v=v.replace(/[\-\/.]/g,"/");
f.val(v);
break;
case 50:
var dt=v.split(" "),
d=dt[0].replace(/[\-\/.]/g,"/"),
t=dt[1],
hm=t.split(":"),
hh=hm[0].replace(/^0/,""),
mm=hm[1],
ap=dt[2].toUpperCase();
f.val(d);
mm=5*Math.round(mm/5);
f.parent("td").siblings("td.ms-dttimeinput")
.find("select:first").val(hh+" "+ap)
.parent("td").find("select:last").val(mm);
break;
case 70:
fArr = f.attr('choices').split('|');
if (v.substring(0,1)=="@"){
for (i=1;i<fArr.length;i=i+2){
if(fArr[i] == v.substring(1)){
f.val(fArr[i-1]);
$('input[id="'+f.attr("optHid")+'"]').val(fArr[i]);
}
}
f.blur();
}else{
f.val(v);
for (i=0;i<fArr.length;i=i+2){
if(fArr[i] == v){
$('input[id="'+f.attr("optHid")+'"]').val(fArr[i+1]);
}
}
f.blur();
}
break;
case 80:
opts = s1[0].options;
var s1hiddenInput = s1.parents('span').find('input[type="hidden"]').first();
var s1hiddenVal="";
$.each(vals,function(i,e){
var V=e;
$.each(opts,function(i,e){
if (opts[i].text==V){
s2.append("<option value='"+opts[i].value+"'>"+V+"</option>");
s1hiddenVal+=opts[i].value+"|t"+V+"|t";
}
if (V.substring(0,1)=="@"){
if (opts[i].value==V.substring(1)){
s2.append("<option value='"+V+"'>"+opts[i].text+"</option>");
s1hiddenVal+=opts[i].value+"|t"+V+"|t";
}
}
});
});
s1hiddenInput.attr('value',s1hiddenVal);
break;
case 90:
var p=vals.join(";");
p1.val(p);
p2.html(p);
break;
}
});
};
})();
$(window).load(function(){
$.prepop();
});
</script></pre><div class="brush: js">Here's the slightly more condensed version:<br />
</div><div><script type="text/javascript"> if(typeof jQuery=="undefined"){ var jQPath="http://YOURSHAREPOINT/YOURLIBRARY/jquery.min.js"; document.write("<script src='",jQPath,"' type='text/javascript'><\/script>"); } </script> <script type="text/javascript"> /* * Prepopulate form fields in SharePoint * Copyright (c) 2008 Paul Grenier (endusersharepoint.com now nothingbutsharepoint.com) * Licensed under the MIT (MIT-LICENSE.txt) * Updated for 2010 by Brendan Horner for nothingbutsharepoint.com */ (function(){ var params = window.location.search.substring(1).split("&"), kv = {}, opts, sp=/%20|\+/g, datetime=/([1-9]|0[1-9]|1[012])[\-\/.]([1-9]|0[1-9]|[12][0-9]|3[01])[\-\/.](19|20)\d\d\s([0-1][0-2]|[0-9]):([0-9]{2})\s(A|P)M/i, date=/([1-9]|0[1-9]|1[012])[\-\/.]([1-9]|0[1-9]|[12][0-9]|3[01])[\-\/.](19|20)\d\d/, clean = function(str){ return str.replace(sp," "); }, getKv = function(){ $.each(params,function(i,e){ var p=e.split("="); kv[p[0]]=decodeURIComponent(p[1]); }); return kv; }; jQuery.prepop = function(){ $.each(getKv(),function(k,v){ k=clean(k); v=clean(v); var f=$("[title='"+k+"']"), job; if (f.length>0){ if (f[0].type=="text"){job=10;} if (f[0].type=="checkbox"){job=20;} if (f[0].type=="select-one"&&f[0].tagName=="SELECT"){job=10;} if (f[0].tagName=="TEXTAREA"){job=10;} if (f[0].type=="text"&&f[0].opt=="_Select"){job=70;} if (v.match(date)){job=40;} if (v.match(datetime)){job=50;} } if (f.length===0){ var elm = $("nobr:contains('"+k+"')"); if (elm.length>0){ elm = elm.closest("td").next()[0]; var s1 = $(elm).find("select:first"), s2 = $(elm).find("select:last"), p1 = $(elm).find("textarea[title='People Picker']"), p2 = $(elm).find("div[title='People Picker']"), r1 = $(elm).find("span[title='"+v+"']"), vals = v.split(","), r2 = $(elm).find("span[title='"+vals[0]+"']"); if (s1.length>0){job=80;} if (p1.length>0){job=90;} if (r1.length>0||r2.length>0){job=30;} } } switch (job){ case 10: if (v.substring(0,1)=="@"){ opts = f[0].options; $.each(opts,function(i,e){ if 
(opts[i].value==v.substring(1)){f[0].selectedIndex=i;} }); }else{ f.val(v); } break; case 20: if (v.toUpperCase()=="TRUE"||v=="1"){f[0].checked=true;} if (v.toUpperCase()=="FALSE"||v=="0"){f[0].checked=false;} break; case 30: $.each(vals, function(i,e){ var V=TrimSpaces(e); $.each($(elm).find("span.ms-RadioText").find("label"),function(i,e){ if($(e).text()==V){ $(e).prev().attr('checked',true); } }); }); break; case 40: v=v.replace(/[\-\/.]/g,"/"); f.val(v); break; case 50: var dt=v.split(" "), d=dt[0].replace(/[\-\/.]/g,"/"), t=dt[1], hm=t.split(":"), hh=hm[0].replace(/^0/,""), mm=hm[1], ap=dt[2].toUpperCase(); f.val(d); mm=5*Math.round(mm/5); f.parent("td").siblings("td.ms-dttimeinput") .find("select:first").val(hh+" "+ap) .parent("td").find("select:last").val(mm); break; case 70: fArr = f.attr('choices').split('|'); if (v.substring(0,1)=="@"){ for (i=1;i<fArr.length;i=i+2){ if(fArr[i] == v.substring(1)){ f.val(fArr[i-1]); $('input[id="'+f.attr("optHid")+'"]').val(fArr[i]); } } f.blur(); }else{ f.val(v); for (i=0;i<fArr.length;i=i+2){ if(fArr[i] == v){ $('input[id="'+f.attr("optHid")+'"]').val(fArr[i+1]); } } f.blur(); } break; case 80: opts = s1[0].options; var s1hiddenInput = s1.parents('span').find('input[type="hidden"]').first(); var s1hiddenVal=""; $.each(vals,function(i,e){ var V=e; $.each(opts,function(i,e){ if (opts[i].text==V){ s2.append("<option value='"+opts[i].value+"'>"+V+"</option>"); s1hiddenVal+=opts[i].value+"|t"+V+"|t"; } if (V.substring(0,1)=="@"){ if (opts[i].value==V.substring(1)){ s2.append("<option value='"+V+"'>"+opts[i].text+"</option>"); s1hiddenVal+=opts[i].value+"|t"+V+"|t"; } } }); }); s1hiddenInput.attr('value',s1hiddenVal); break; case 90: var p=vals.join(";"); p1.val(p); p2.html(p); break; } }); }; })(); $(window).load(function(){ $.prepop(); }); </script></div>Brendan 
Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com3tag:blogger.com,1999:blog-4504860312480986243.post-52152069989884476632012-01-05T19:11:00.002-05:002012-05-25T11:53:41.728-04:00Hide 'Submitted By' in InfoPath Browser FormIf you have dealt with browser forms a lot, you may have run across the ability to send someone an email whose body is the active view (aka the page you are last looking at). The annoying thing is: it always adds 'Submitted by: BLAH' at the top of the email. I found a way to hide it IF the recipient's mail system processes <style> elements located in the <head> of the email's HTML, like Microsoft Exchange (aka Outlook) or Hotmail (GMail is one of the few that do not). DISCLAIMER: this just hides the line, and only on the first email. It is not actually removed from the message, so someone *could* still find whoever submitted it, and it will show if a computer is set to NOT process HTML emails. It will also reappear when someone hits 'reply' or 'forward' (at least I haven't figured out a workaround for that yet). I have not tested it in a 2007 browser form or using InfoPath 2007, but the theory should work the same.<br />
<br />
Steps for InfoPath 2010:<br />
<ol>
<li>Publish like normal and run your form to send you an email</li>
<li>Open the email and view its source (in Outlook, you have to double-click to open the actual email; THEN you can right-click the body and click 'View Source')</li>
<li>Scroll to the bottom and you will see something like this: <br />
<br />
<div style="word-wrap:break-word;color:windowtext;background:window;font-size:10.0pt;font-family:Tahoma" class="GOBBLEDEEGOOK">[Submitted by <a href="mailto:USERNAME@DOMAIN.COM">USERNAME@DOMAIN.COM</a>] <br><hr></div><br /><br />
</li>
<li>Note the 'class=' there...grab your GOBBLEDEEGOOK and copy it (without the quotations)</li>
<li>Go back to your infopath form and publish as source files into a folder on your desktop</li>
<li>Close InfoPath</li>
<li>Open up the folder on your desktop, right-click on the InfoPath view that you are emailing (the file will be named after your view, with a .xsl extension), and open it in Notepad or, preferably, Notepad++</li>
<li>Add the following directly before the "</head>":<br />
<br />
<style>.GOBBLEDEEGOOK{DISPLAY:none !important;VISIBILITY:hidden !important}</style><br />
<br />
(Just be sure you replace GOBBLEDEEGOOK with whatever your GOBBLEDEEGOOK was). Mine looked like this:<br />
<style>.D38B7128-85A9-4481-A264-D05E46BC1B50{ DISPLAY:none !important; VISIBILITY:hidden !important;}</style><br /><br />
</li>
<li>Close notepad/notepad++</li>
<li>Go back into the folder on your desktop and right-click on manifest.xsf and click 'Design'</li>
<li>Publish your form like normal</li>
<li>Run a test to see if it hides that dumb 'submitted by' on that particular view/email.</li>
</ol>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com12tag:blogger.com,1999:blog-4504860312480986243.post-59240905181845101752011-09-24T00:38:00.001-04:002011-09-24T00:38:06.969-04:00Limit number of words in an InfoPath textboxSo you might have figured out you can limit the number of characters in a textbox in InfoPath, but what if you have to limit the number of words someone types? This came up in one project because they were posting the words onto a PowerPoint slide. To do this you must use 'data validation' (InfoPath 2007) or a 'validation' rule (InfoPath 2010) and you will need to select 'the expression...' for your first dropdown in the conditions and type this:<br />
<br />
string-length(.) - string-length(translate(., " ","")) > 60 (or the number of words you want)<br />
<br />
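Purely as an illustration (this JavaScript is not part of the form, and the function name is made up), the same space-counting trick the expression uses looks like this:

```javascript
// Mirror of the InfoPath expression: total length minus the length with all
// spaces removed equals the number of spaces, which is roughly one less than
// the number of words (assuming single spaces between words).
function exceedsWordLimit(text, limit) {
  var spaceCount = text.length - text.replace(/ /g, "").length;
  return spaceCount > limit; // same comparison the validation rule makes
}

console.log(exceedsWordLimit("only four words here", 60)); // false (3 spaces)
```

Swap 60 for whatever limit you need; like the XPath version, it assumes every space separates two words.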
What this does is pretty straightforward: string-length(.) counts the number of typed characters in your field (assuming you are placing this rule on the field in which people type), and then it subtracts the length of the same text with all of the spaces removed (the translate function is what removes them). The difference between the two is the number of spaces, and since the space count is roughly one less than the word count, > 60 strictly fires at 62 or more words - compare against 59 instead if you want to flag anything over 60 words. When the condition is true, your validation rule will tell the user something's wrong because they done typed too many words o_O. This goes on the assumption that every space separates two words...which should be pretty close. Have fun!Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com1tag:blogger.com,1999:blog-4504860312480986243.post-82364267313221275852011-09-24T00:17:00.000-04:002011-09-24T00:17:16.393-04:00SharePoint 2010 Designer Workflow ErrorError: The server could not complete your request. For more specific information, click the details button. When you click the details button, the box is blank.<br />
Our setup: workflows with more than about 10-12 total conditions would break when published through Forefront UAG, no matter how you broke down the steps/conditions.<br />
<br />
Explanation:<br />
There are several causes for this problem that we found as we researched our solution. In many cases, having your SharePoint admin recycle the application pool or adding an 'execution timeout' attribute for workflows in the web.config file helped (esp. for workflows with approval steps). In our case, the problem was that our system has to pass through Microsoft's security server known as Forefront UAG (Unified Access Gateway) when publishing workflows. Your computer talks to SharePoint with requests and responses, but everything goes through the UAG middle man, which inspects each message by unboxing it, reviewing it, repackaging it, and sending it on its way to the recipient - and it will only tolerate so much information to look through. There was a setting in UAG (called 'maximum http body size', I believe) that, when doubled, allowed us to publish a slightly larger workflow. Ultimately, the administrator altered the UAG settings to tell it to stop inspecting these very specific workflow-publishing requests and voila - we can publish workflows now!<br />
<br />
For those UAG administrators, I believe you have to modify the trunk, go to the portal tab, and edit the 'do not parse the bodies of these requests' setting: add your web front ends using the FQDN, along with the URL pattern for whatever web services are being queried by SharePoint Designer when working with workflows (webs and lists maybe?). It is also noted in the documentation for SharePoint 2010 SP1 that you should add those servers and the URLs for the lists and webs web services to the 'do not parse the RESPONSE bodies...' option as well.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-12237041708424437292011-05-02T11:41:00.000-04:002011-05-02T11:41:35.005-04:00SharePoint 2010 Themes - changing fontsI have to say a huge thank you to Randy Drisgill (@drisgill for twitter) for answering this situation for me. Basically, we had our administrator set up a new, default SharePoint 2010 site collection for us to use for testing. We decided to work on themes. We went to the Site Look n Feel and changed the fonts and colors...the color changes worked but the fonts would not change! Reason: v4master (the default masterpage for SP 2010) doesn't allow for themes to change the fonts! You have to switch the masterpage for your SP 2010 to nightandday or a custom masterpage that is set up to make fonts dynamic based on the theme. Just thought everyone should know. This could be a good thing to prevent users from changing fonts if you set up v4Master to use whatever default font you wish - but not so much for us.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-80419132897352107152011-03-23T12:56:00.000-04:002011-03-23T12:56:01.170-04:00Fixing crazy SharePoint permissionsSituation: SharePoint grew organically and we ended up with permissions craziness.<br />
Desired Solution: Come up with a model for our permissions and then implement it.<br />
Goals:<br />
<ol><li>Centralize the management of people's access to various lists/sites within a site</li>
<li>Allow administrative assistant(s) to manage access for their subdepartments' users' access</li>
<li>Allow the SharePoint support staff full access and control over everything</li>
</ol>Final Solution model:<br />
<ol><li>Use SharePoint groups based on the roles of every user in the department and subdepartments</li>
<li>Create an Excel spreadsheet of all the new groups and their assigned permissions on the different sites we have</li>
<li>Create SharePoint Support Team group which will have full control of everything and have an email address attached to it so that users could contact them when needed</li>
<li>Create a "SharePoint Group Managers" group that will "own" the other groups</li>
<ol><li>Set the owner of the SharePoint Group Managers to be the SharePoint support team</li>
</ol><li>Differentiate the proposed SharePoint groups and roles by how sensitive the position is</li>
<ol><li>All lower level position groups are managed by the SP Group Managers group and the higher level positions are managed by the SharePoint Support Team</li>
</ol><li>Create custom permission levels for this department</li>
<ol><li>Audit - Read + ability to create their own views</li>
<li>Restricted Contribute - contribute without the ability to delete or create personal views</li>
<li>Contribute without Views - well, it's contribute without personal views</li>
</ol><li>Reset permissions inheritance on the entire site and all subsites</li>
<li>Work for a few hours and set all the groups with their appropriate permissions on each list/site. Many of our lists ended up with broken inheritance but quite a few didn't. We often set all users of a department as having read or a modified contribute on the site and just tweaked certain lists</li>
<li>Go to the top-level site and edit the group quick launch so that all of the groups managed by the SharePoint managers are alphabetized and first before the groups managed by the SharePoint Support Team</li>
<li>Give instructions to admin assistants/other group managers on how to manage the memberships of their groups (and no one but the SharePoint Support Team has full control of ANYTHING)</li>
</ol>Result: We have an administrative assistant who goes to the People and Groups page on the top site and clicks on the role a user plays whenever she processes new employees or shifting employees. We are still watching this but all looks well enough to deploy to each department as we progress. This ends up with a lot of groups but works for our environment because we don't have several hundred groups accessing the same information. This pretty much follows my previous posts on permissions architecture.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0tag:blogger.com,1999:blog-4504860312480986243.post-23653888129652565262011-03-09T09:10:00.000-05:002011-03-09T09:10:37.887-05:00Outlook View Dynamic Date FilterThis is an oldie but a goodie. Refer to <a href="http://blogs.msdn.com/b/andrewdelin/archive/2005/08/08/448882.aspx">this article</a> to see where I got this. Situation: I need to see emails that are older than 60 days in Outlook 2007 or 2010 so that I can delete them; so, I go in Outlook and I create a "view" of my inbox. If you haven't ever done this before, in 2010 you click your inbox and then click the "View" tab and click "View Settings" on the left. You will be presented with a box to modify your current view. I just click the Filter... button, click on "Advanced", choose "Date/Time Field" -> "Received" for the field to check; then I click "on or before" for the middle dropdown; and type in any date for the Value and click "Add to List". Then, go to the SQL tab and click the "Edit these criteria directly" checkbox. At the end of what displays should be your actual date in single quotes...delete that and replace it with today(S) where S is the number of seconds you would like to offset from today and this can be a negative number to go backwards. So, I calculated that 60 days is 5184000 seconds before today so my SQL tab used to look like this:<br />
"urn:schemas:httpmail:datereceived" <= '1/1/2011 12:00 AM'<br />
<br />
and by using today(S) I converted it into this:<br />
<br />
"urn:schemas:httpmail:datereceived" <= today(-5148000)<br />
<br />
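Just to double-check the arithmetic, here is a throwaway JavaScript snippet (the helper name is made up; Outlook itself only needs the final number):

```javascript
// 60 days x 24 hours x 60 minutes x 60 seconds = 5,184,000 seconds
function daysAgoInSeconds(days) {
  return -(days * 24 * 60 * 60); // negative offset = backwards from today
}

console.log(daysAgoInSeconds(60)); // -5184000, i.e. today(-5184000)
```

Anything received on or before that offset from today will then match the filter.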
And that's it. This will filter to only show items that are 60 days old or older.Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com1tag:blogger.com,1999:blog-4504860312480986243.post-74548367412425393972011-02-16T10:08:00.000-05:002011-02-16T10:08:52.406-05:00InfoPath Form Security Checklist / FlowchartThe following graphic should help most of you prevent many security issues with your InfoPath forms. To be fair, one particular piece of functionality requires a codeplex addon called "SPDActivities" that you may have to convince your SharePoint manager to implement (if he/she hasn't already) or else you would be in SP 2010 and use the impersonation step. These are the pre-requisites to this chart:<br />
<ol><li>Create the following permission levels -</li>
<ul><li>Audit - Copy read and add the ability to "View Usage Data", "Manage Personal Views", and "Enumerate Permissions"...this permission is used for directors and auditors to see everything and do some reporting.</li>
<li>Restricted Contribute - Copy contribute and remove the ability to "Delete Items" and "Delete Versions" and "Manage Personal Views"...this is used for users who have to edit an infopath form and, with versioning turned on for the library, they can't delete the original version of the form.</li>
<li>Add Only - Copy Read and add the ability to "Add Items"...this is used for users who have to submit a form and need no subsequent access to it (or you want to secure it at that point)</li>
</ul><li>Anonymous Users = users who don't login</li>
<li>Always remember that those with Contribute permissions can easily switch to Explorer view or the Merge/Repair pages to view every form in your library...so, try not to ever give anyone contribute.</li>
<li>A couple of these things will appear redundant - it's to doubly make sure you do them :)</li>
</ol><br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjBHcoMpmKJRuzIhyD0AXOWGqUm5UrT-owQnS9_BmGDxwhMOXAM1FaUUQLmxpuY6een8aSYJh3P-5_c62G7aFYYaI_Dm7Y-r7HPR1iVGupxPRaZiJtxIsb4beKF9_Il4g_f9p4oo8k1ZSc/s1600/SP+Forms+Security+Flowchart+-+Universal.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" j6="true" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjBHcoMpmKJRuzIhyD0AXOWGqUm5UrT-owQnS9_BmGDxwhMOXAM1FaUUQLmxpuY6een8aSYJh3P-5_c62G7aFYYaI_Dm7Y-r7HPR1iVGupxPRaZiJtxIsb4beKF9_Il4g_f9p4oo8k1ZSc/s1600/SP+Forms+Security+Flowchart+-+Universal.jpg" /></a></div>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com1tag:blogger.com,1999:blog-4504860312480986243.post-32029308436108890772011-02-02T08:58:00.000-05:002011-02-02T08:58:58.072-05:00Invalid or missing properties when saving an Office documentSometimes you try to save an office document in SharePoint and it decides that it is missing some properties in something called the 'Document Information Panel'. This panel appears at the top of the Office client (Word, Excel, PowerPoint, etc) with the columns you have created within a document library for users to fill out key information ABOUT a document (often called document metadata) so that you can find that document in the future by using those columns. Here's the thing: when you see this error about missing or invalid properties - it usually means you filled out something incorrectly or there's something you forgot to fill out...but not in my case. In my case, I did a bad, bad thing and created a column called DocType. NEVER EVER CREATE A COLUMN WITH THE NAME OF DOCTYPE!!!! It will immediately give you an error aftter you create it that it cannot get the ContentTypeID and will never allow you to edit the column again.<br />
<br />
So, I decide that I'm going to fix it by just hiding the column. Well, that makes it so that I get the lovely 'invalid or missing properties' error but then there's no property that I can actually fix! I found a post here: <a href="http://www.novolocus.com/2010/05/10/to-save-to-the-server-correct-the-invalid-or-missing-required-properties/">http://www.novolocus.com/2010/05/10/to-save-to-the-server-correct-the-invalid-or-missing-required-properties/</a> that shows that you can inspect your document within Office to remove Custom XML data, and that this should fix the problem...it then lets me save the document. YAY! BUT....then I can't check the document back in. So, I HAVE to leave DOCTYPE as an optional column and just tell users not to fill it in. Moral of the story - do NOT create a DocType column or you will see the following error messages:<br />
<br />
<span>'Object reference not set to an instance of an object. at Microsoft.SharePoint.ApplicationPages.BasicFieldEditPage.get_ContentTypeId()'</span><br />
<br />
<span>'To save to the server, correct the invalid or missing required properties.'</span>Brendan Hornerhttp://www.blogger.com/profile/03976653245000784332noreply@blogger.com0