Archive

Posts Tagged ‘SharePoint’

[Nintex Workflow] Add user to Site Collection Administrator group with REST API

Helping people to automate their workplace is my passion, and lucky for me I also get paid to do so!

This week I finished working with a partner to improve the (poor) automation steps required by Matter Center, which no one can really complain about since Microsoft made it open source.
The Matter Center documentation requires creating each client as a new site collection in PowerShell, but that is not really possible when the users registering these new clients on a daily basis are regular Office 365 users and not SharePoint Administrators.

Thanks to a few Nintex Workflows we managed to do all the configuration in the background.
Today’s post is not about the site collection creation, so I will spare you the details. In summary, and at a very high level, I developed 4 workflows, 1 CSOM JavaScript snippet executed in the browser, and of course 1 Nintex Form for submitting the new client on desktop or mobile.

This quick blog post is about the challenge we had adding the user as a Site Collection Administrator of that newly created site collection.

Since there is no mention of this in http://bit.ly/1TUw4AY, it may be useful to someone, so here it is:

  1. Create a new Nintex workflow in an Office 365 site list.
  2. Download and import the .NWP workflow file available here to replace the blank workflow.
  3. Edit a few of the actions at the beginning of the workflow to set the variables. (I never hard-code UserName and Password, for instance, so you will see a few Lookup actions against a different list to get those values; you can replace them, since they will show an error once imported into your list.)

Note: In this workflow, the “user” I am adding to the Site Collection Administrators group is actually the “CreatedBy” of the list item. That may sound strange, since you might assume the user running the workflow is the CreatedBy. However, this is NOT the case (refer to the point above: we do not want all users to be SharePoint admins!). Here is how you should sequence the workflows:
1) After the list item is created, a first workflow (run by CreatedBy), e.g. called “Start and Call workflow 2”, is triggered.

2) Within that first workflow we just add a “Start Workflow” action, making sure it is executed in an “App Step” in order to use elevated privileges.

[Image: Nintex Workflow for Office 365]

3) Finally, all the actions happen in Workflow 2 (the one you imported in step 2).
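For reference, one common way to grant site collection admin rights over REST is to set the IsSiteAdmin flag on the SP.User entity, which is the kind of call Workflow 2 performs through its Web Request action. Here is a minimal PowerShell sketch of the equivalent call, not the exact Nintex configuration; the site URL, user ID, and bearer token are hypothetical placeholders you would supply:

# Minimal sketch: grant site collection admin by setting IsSiteAdmin=true
# on an existing site user via REST. $siteUrl, $userId and $accessToken
# are hypothetical; authentication is assumed to be an OAuth bearer token.
$siteUrl     = "https://tenant.sharepoint.com/sites/newclient"  # hypothetical
$userId      = 12                                               # site user ID, hypothetical
$accessToken = "<OAuth bearer token>"

$headers = @{
    "Authorization" = "Bearer $accessToken"
    "Accept"        = "application/json;odata=verbose"
    "X-HTTP-Method" = "MERGE"   # update the existing SP.User entity
}
$body = '{ "__metadata": { "type": "SP.User" }, "IsSiteAdmin": true }'

Invoke-RestMethod -Uri "$siteUrl/_api/web/getuserbyid($userId)" `
    -Method Post -Headers $headers -Body $body `
    -ContentType "application/json;odata=verbose"

The Nintex “Web Request” action inside the App Step issues the same kind of call with the workflow’s elevated credentials.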

 

Hope this helps someone.

François.

via François on SharePoint & more http://bit.ly/1TUwgjP

François Souyri
A French native SharePoint consultant living in London; a crossway between a designer, developer, and system architect. He prefers stretching the limits of out-of-the-box features rather than breaking them into code. When not working with Microsoft SharePoint, François is often found on Web 2.0 news sites and related social networking tools.

This article has been cross posted from sharepointfrancois.wordpress.com/ (original article)


So, you want to delete users with the Azure AD Graph API? Good luck with that!

You might think that deleting users with the Azure AD Graph API would be pretty straightforward, right?  You already have a registered application that succeeds in updating and creating new users.  This link doesn’t provide any warnings about hidden dragons or secret pitfalls.

Rest assured, there is at least one gotcha that’s primed to eat your lunch when it comes to deleting users.  Fortunately for you, True Believers, I’m here to show you how you too can quickly overcome this less than obvious configuration issue.

According to the Azure AD Graph Reference, deleting a user is a simple operation.  All you have to do is send the HTTP verb “DELETE” to the URL of the user you want to delete.

Example:

http://bit.ly/1VZ0GVf{user_id}[?api-version]

The user_id can be the UserPrincipalName; in other words, the e-mail address of the user.
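If you want to script the same call, here is a minimal PowerShell sketch, assuming the standard Azure AD Graph endpoint; the tenant domain, user ID, and bearer token are placeholders:

# Minimal sketch: DELETE a user via the Azure AD Graph API.
# $tenant, $userId and $accessToken are placeholders you must supply.
$tenant      = "mytenant.onmicrosoft.com"           # hypothetical tenant
$userId      = "john.doe@mytenant.onmicrosoft.com"  # the UPN works as user_id
$accessToken = "<OAuth bearer token>"

$uri = "https://graph.windows.net/$tenant/users/$($userId)?api-version=1.6"

# A successful delete returns HTTP 204 (No Content), i.e. an empty body.
Invoke-RestMethod -Uri $uri -Method Delete -Headers @{
    "Authorization" = "Bearer $accessToken"
}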

As an example, I will delete a pesky AD user named “John Doe”.  This John Doe character has got to go!

[Image: Azure]

I use Postman to get my API calls properly formatted.  It also helps to ferret out problems with permissions or configurations. This helps me to *know* that it works before I write my first line of application code.

Note: Notice that I have an OAuth Bearer token specified in the header.  I won’t cover how I got this token in this post.  If you want to know more about how I acquire tokens for console applications, send me an e-mail!

[Image: PostmanDelete1]

Assuming you have your tenant ID, user ID, and OAuth token all set correctly then all you need to do is click “Send”.  Your user is deleted as expected… right?

NOPE! You encounter the following JSON error response:

{
  "odata.error": {
    "code": "Authorization_RequestDenied",
    "message": {
      "lang": "en",
      "value": "Insufficient privileges to complete the operation."
    }
  }
}

Your first reaction may be to verify that your application registration is assigned the proper permissions on the AD Graph.  However, there is no permission that allows you to delete; you can only get variations of reading and writing.

[Image: AzurePermission]

What do you do?  If you Google Bing around a bit, you will find that your application needs to be assigned an administrative role in Azure: it needs a ServicePrincipal.  So, off you go searching the competing, overlapping portals of Azure, trying to figure out how to assign an application a role within a resource.  You may even be successful.  We weren’t.

I had to use remote PowerShell to add my application to the appropriate role in order to delete users from AD.

REMOTE POWERSHELL TO AZURE AD

I used instructions from this MSDN article to download and install the Azure AD Module.  First I downloaded the Microsoft Online Services Sign-In Assistant for IT Professionals RTW.  Next, I grabbed the Active Directory Module for Windows PowerShell (64-bit version).  Once I had my PowerShell environment up and running, I cobbled together a quick script to add my application registration to the “User Account Administrator” role.  Here is how I did it!

THE CODEZ

# Log me into my MSDN tenant using an account I set up as "global admin".
$tenantUser = 'admin@mytenant.onmicrosoft.com'
$tenantPass = ConvertTo-SecureString 'Hawa5835!' -AsPlainText -Force
$tenantCreds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $tenantUser, $tenantPass

Connect-MsolService -Credential $tenantCreds

# Get the Object ID of the application I want to add as an SPN.
$displayName = "MyAppRegistrationName"
$objectId = (Get-MsolServicePrincipal -SearchString $displayName).ObjectId

# Set the role name and add the application as a member of the role.
$roleName = "User Account Administrator"
Add-MsolRoleMember -RoleName $roleName -RoleMemberType ServicePrincipal -RoleMemberObjectId $objectId
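Before you flip back over to the API, you can sanity-check that the role assignment took. Something like this (same session and variables as above) should list your application:

# Confirm the service principal now shows up in the role.
$role = Get-MsolRole -RoleName $roleName
Get-MsolRoleMember -RoleObjectId $role.ObjectId |
    Where-Object { $_.DisplayName -eq $displayName }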

PLAY IT AGAIN SAM

If you execute the PowerShell above (and it’s successful) then you can attempt to invoke the API again.  Click Send!

[Image: DeleteSuccess]

Notice that this time Postman returns an HTTP status of 204 (No Content).  This is the appropriate response for a DELETE.  Let’s check our tenant to ensure Jon Snow is dead, or rather, that John Doe is deleted.

[Image: DeleteProof]

He’s gone!  You are good to go.

CONCLUSION

Azure is a dynamic, new technology.  Documentation is changing almost daily. It can be frustrating to navigate the changing landscape of marketing terms and portals.

All the information you need to sort out this error is out there. However, I found it scattered and not exactly applicable to what I was doing.  The PowerShell snippets existed in parts: one to log in to a remote tenant, another to add the role.  This post simply serves to bring the information together so you can quickly get past this problem and on to writing more code.

 

Cheers!

 

 

Chris Clements
I am a senior software developer and development team lead in Houston, Texas. I am passionate about the “art” of software development. I am particularly interested in software design patterns and the principles of SOLID object-oriented code. I am an evangelist for test driven development. I love to think and write about my day-to-day experiences in the trenches of enterprise IT. I relish the opportunity to share my experiences with others.

From the wire to the presentation, I am a holistic solutions guy. I have broad experience in client-side technologies such as JavaScript, Ajax, AngularJS, Knockout, and Bootstrap. I have extensive experience with MVC, MVVM, and ASP.NET Web Forms. I am strong in SQL databases, performance tuning, and optimization. I also have a background in network engineering, wide-area and inter-networking.

This article has been cross posted from jcclements.wordpress.com/ (original article)

Reading a SharePoint Online (Office 365) List from a Console Application (the easy way)

In a previous post I talked about our strategy of using scheduled console applications to perform tasks that are often performed by SharePoint timer jobs.

As we march “zealously” to the cloud we find ourselves needing to update our batch jobs so that they communicate with our SharePoint Online tenant.  We must update our applications because the authentication flows for on-premises SharePoint 2013 and SharePoint Online are completely different.

Fortunately for us, we found the only change needed to adapt our list-accessing code was to swap instances of the NetworkCredential class for the SharePointOnlineCredentials class.

Imagine that this is your list reading code:

using (var client = new WebClient())
{
    client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
    client.Credentials = _credentials;  // NetworkCredential
    client.Headers.Add(HttpRequestHeader.ContentType, "application/json;odata=nometadata");
    client.Headers.Add(HttpRequestHeader.Accept, "application/json;odata=nometadata");

    /* make the REST call */
    var endpointUri = $"{_site}/_api/web/lists/getbytitle('{_listName}')/Items({itemId})";
    var apiResponse = client.DownloadString(endpointUri);

    /* deserialize the result */
    return _deserializer.Deserialize(apiResponse);
}

The chances are your _credentials object is created like this:

_credentials = new NetworkCredential(username, password, domain);

Here, the username and password are those of a service account specifically provisioned for SharePoint list access.

In order to swap the NetworkCredential class for SharePointOnlineCredentials, you first need to download and install the latest version of the SharePoint Online Client Components SDK here (http://bit.ly/1rKS6N8).

Once the SDK is installed, add a reference to the Microsoft.SharePoint.Client and Microsoft.SharePoint.Client.Runtime libraries.  Assuming a default installation, these binaries can be found here: C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\.

Be certain to reference the 16.0.0.0 version of the DLLs.  If you get the 15.0.0.0 version (which is currently the version in NuGet) your code may not work!

Now you can “new up” your _credentials like this:

_credentials = new SharePointOnlineCredentials(username, password);

But “TV Timeout!” (as a colleague likes to say after a couple brews at the pub) the password argument is a SecureString rather than the garden variety string.  You will need a helper method to transform your plain old string into a SecureString.  Here is how we do it:

// Copies each character of a plain string into a SecureString.
public static SecureString GetSecureString(string myString)
{
    var secureString = new SecureString();
    foreach (var c in myString)
    {
        secureString.AppendChar(c);
    }
    return secureString;
}

One last thing to note: the SharePointOnlineCredentials class implements the System.Net.ICredentials interface. That’s what allows us to simply swap one class for the other.

Therefore,  if you are following the SOLID principles and using dependency injection then the extent of your code changes may look like this:

var securePassword = SecureStringService
    .GetSecureString(settings.SPOPassword);

container.Register<ICredentials>(()
    => new SharePointOnlineCredentials(username, securePassword));

Now that is cool!

Cheers and Happy Coding!

 

Chris Clements

This article has been cross posted from jcclements.wordpress.com/ (original article)

Office 365 is trying humour … will you recognise where the quote is from?

November 18, 2015

After the first funny quotes started appearing some years back on 404 Not Found pages showing “oops… something happened”, making them friendlier and less scary, it seems that the whole IT industry is trying its hand at humour, even on “serious” screens like the Office 365 Admin center…

I think I like it!  Better to have fun at work, right? (And with all the TV geeks in SharePoint, it makes sense…)

Office 365 has humour

via François on SharePoint & more http://bit.ly/1NbbPyF

François Souyri

This article has been cross posted from sharepointfrancois.wordpress.com/ (original article)


Bad OneDrive for Business sync bug (SP31654) if you use it with Office 2013 – install update required

October 21, 2015

Over the past few days several Office 365 users reported a OneDrive for Business synchronisation issue. I have to say that I usually just direct them to the IT helpdesk, but yesterday I decided that there had been one report too many, so I went to troubleshoot it at a user’s desktop myself.

Nothing could be done to fix the random “red” icon that appeared when adding a SharePoint library to sync to the user’s Windows machine: removing the folder from OneDrive, uninstalling and re-installing OneDrive, nothing worked. And it was literally random; some files were synchronising but still marked as red, and the error logs showed “please enter your credentials” but offered no option to enter them…

I was at a dead end until I found out that it is a known issue, reported on 15 October 2015 (5 days ago) and actually clearly showing in the SHD (the Office 365 Service Health Dashboard in the Admin center; see the Office 365 Community post below).  The resolution is to update Office 2013.

But my main takeaway from this is that, as much as I thought no one would seriously read the SHD every morning (and you can’t receive them by email!), I now realise I should have started by searching through the incidents list, so I will pay more attention to it in the future when a user issue comes up.

I believe we have had so much frustration over the years of not finding the answer in Microsoft-provided sources that we (I) have developed the reflex of Googling (Binging…) an issue straight away rather than checking the official source.

Now go to your mobile device and make sure you have the mobile app for viewing the SHD installed!

Office 365 Service Health Dashboard

This issue is now reported on the Service Health Dashboard (SHD) as incident SP31654, starting Thursday, October 15, 2015, at 3:00 PM UTC. The user experience of this incident is: affected users are unable to sync files with OneDrive for Business. Users may see repeated prompts to enter their credentials, but entering them will not result in a successful sync. Tenant administrators can view current information and updates on SHD at the link here.

Source: Onedrive Business Sync – Credentials Required | Manage Office 365 | Microsoft Office 365 Community

via François on SharePoint & more http://bit.ly/1M7wYsZ

François Souyri

This article has been cross posted from sharepointfrancois.wordpress.com/ (original article)


Applying the Concepts of the SharePoint App Model to SharePoint 2010

September 24, 2015

Legacy Code Is Still Out There

The SharePoint 2016 Preview was released in August and many companies are already moving toward the cloud and SharePoint Online.  However, a good number of enterprises still have SharePoint 2010 (and perhaps older) farms hanging around.   It’s likely those on-premises 2010 farms host highly customized SharePoint solutions and perhaps require occasional enhancements.  This is the case in our organization.

Our development team was approached and asked to enhance a SharePoint 2010 solution so that our site could display news feeds from an external vendor.  The site must cache feeds so that the page displays correctly even if the remote site is unavailable at the time of page load.  Naturally, we asked our SharePoint 2010 developer to devise a solution to this problem.  A short while later the developer delivered a technical approach that is steeped in SharePoint tradition.

The SharePoint Way of Doing Things can be Expensive, Time Consuming and Disruptive

The solution proposes to provision content types, site columns, and lists in the usual way, via feature activation.  Two lists would hold the remote URL (feed) and the fetched content from the remote feed.   A timer job would read from the feed configuration list and fetch the data, storing the results in a second SharePoint list.  Lastly, a custom (server-side) web part would be created to read and display the contents of the retrieved news feeds list on the page with all the appropriate sorting, formatting, and style our users expect.

On the surface, this seems like a perfectly reasonable solution for the task at hand.   The use of a full-trust deployed solution to create needed plumbing such as content-types and lists was how it should be done in those heady, salad days of SharePoint 2010.  The proposed solution can confidently claim that it adheres to the best practices of SharePoint 2010.

However, there are drawbacks to going with a traditional SharePoint-based solution.  Before the advent of the sandboxed solution in 2010 it was very easy for a poorly written SharePoint solution to adversely affect the farm on which it was installed.  Custom code has caused many a SharePoint admin sleepless nights. We don’t want to introduce new code to the farm if it’s not completely necessary.

Our team employs both SharePoint developers and .NET developers.  Our contract SharePoint developers command a higher hourly rate than our “run of the mill” .NET developers.  As our industry is extremely cost-sensitive right now, it would be great if we could avoid the use of specialized SharePoint developers for this one-off project.

This last bit could be unique to our organization and may not be applicable to yours.  We have a stringent process for SharePoint deployments.  Suffice it to say that, from the first request to have code promoted to test, a minimum of two weeks must pass before the code is deployed to production.  Content updates, such as adding web parts and editing pages, are not subject to this testing period.  The ideal solution would avoid a “formal” SharePoint deployment.

Why the SharePoint App Model is Cool!

The SharePoint App Model was introduced with Office and SharePoint 2013.   With the App Model, Microsoft no longer recommended that developers create solutions that are deployed directly on the SharePoint farm.  Rather, developers create “apps” that are centrally deployed from an app catalog and run in isolation from SharePoint processes. SharePoint App Model code runs entirely on the client (browser) or in a separate web application on a remote server.  Apps’ access to SharePoint internals is funneled through a restricted, constricted RESTful API.

The App Model prevents even the worst-behaving application from affecting the SharePoint farm.  This means the farm is more stable.  Additionally, applications written using the App Model do not require a deployment to the farm, at least not the type of deployment that would necessitate taking the farm into maintenance or resetting IIS.  Under the App Model, SharePoint remains up even as new applications are made available.  Customers are happy because you can quickly pound out their requests and make them available, and admins are happy because your custom code isn’t taking down their farm (allegedly).

Sadly, the app model doesn’t exist for SharePoint 2010, or does it?  While specific aspects of the App Model do not exist in SharePoint 2010 you can still embrace the spirit of the App Model!  The very heart of the SharePoint App Model concept is running custom code in isolation away from SharePoint.  In our case we really only need to interact with SharePoint at the list level. Fortunately, SharePoint 2010 provides a REST API for reading and writing to lists.

Let’s re-imagine our solution and apply App Model-centric concepts in place of traditional SharePoint dogma.

First, let’s use PowerShell scripts to create our site columns, content types, and lists, rather than having a solution provision these objects on feature activation.
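As a rough sketch, the provisioning script might look something like this (run on a farm server with the SharePoint snap-in; the site URL, field XML, and list name are hypothetical):

# Minimal provisioning sketch for SharePoint 2010 (run on a farm server).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://intranet/sites/news"   # hypothetical site

# A site column to hold the remote feed URL...
$fieldXml = '<Field Type="URL" DisplayName="FeedUrl" Name="FeedUrl" />'
$web.Fields.AddFieldAsXml($fieldXml) | Out-Null

# ...and a list to cache the fetched feed content.
$web.Lists.Add("NewsFeedCache", "Cached feed content", [Microsoft.SharePoint.SPListTemplateType]::GenericList) | Out-Null

$web.Dispose()

Scripts like this can be read, tested, and re-run by an admin without a farm solution deployment.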

Next, let’s replace the SharePoint timer job with a simple Windows console application that can be scheduled as a Windows scheduled task or kicked off by an agent such as Control-M.  This console app will read a SharePoint list using the REST API, then run out to fetch the content from the Internet, writing the results back to a second list using the REST API.
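To give a flavour of the read step (our actual job is a C# console app), here is a minimal PowerShell sketch against SharePoint 2010’s ListData.svc REST endpoint; the site URL, list name, and column names are hypothetical:

# Minimal sketch: read a SharePoint 2010 list over REST (ListData.svc).
$siteUrl = "http://intranet/sites/news"   # hypothetical

# Invoke-RestMethod parses the ATOM response into one entry per list item.
$entries = Invoke-RestMethod -Uri "$siteUrl/_vti_bin/ListData.svc/FeedConfig" `
    -UseDefaultCredentials

foreach ($entry in $entries) {
    $item = $entry.content.properties    # hypothetical columns below
    Write-Host $item.Title $item.FeedUrl
}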

Finally, we can substitute our server-side web part with a Content Editor Web Part that uses JavaScript/jQuery to access our news feed list via, you guessed it, the REST API.  The contents can then be styled with HTML and CSS and displayed to the user.

It’s worth mentioning that the UI aspect of this project could potentially suffer from the lack of a formal App Model, and here a true farm deployment may be superior.  In a true App Model scenario, apps are deployed to a central app catalog and can be deployed to sites across site collections.  In order to deploy this Content Editor Web Part to multiple site collections we would need to manually upload the HTML, CSS, and JavaScript to the Style Library of each site collection.  Imagine having dozens or even hundreds of site collections. An actual solution deployment would have afforded us the ability to place common files in the _layouts folder, where they would be available across site collections. Fortunately for us, the requirement is only for a single site collection.

By cobbling together a set of non-SharePoint components we have, essentially, created an App Model-like solution for SharePoint 2010; a poor-man’s App Model if you will.

In my opinion, this solution is superior to the SharePoint way of doing things in the following ways:

  • Ease of Maintenance / Confidence – Using PowerShell to create columns, content types, and lists is better because scripts can be tested and debugged easily.  Deployments that provision sites are more complicated and time consuming.  From the perspective of a SharePoint admin, PowerShell is likely a known entity. Admins can examine for themselves exactly what this code will be doing to their farm, and perhaps gain a higher level of confidence in the new software being deployed.
  • Lower Development Cost / Ease of Maintenance – A Windows console app is superior to a timer job because you don’t need to pay an expensive SharePoint developer to create or support a solution on a deprecated platform (SP 2010).  Maintaining a console application requires no specific SharePoint experience or knowledge.  In our case, we have an entire team that ensures timed jobs have run successfully and can alert on failure as needed.
  • Reliability / Availability – There is no custom code running within the SharePoint process.  This means there is NO chance of unintended consequences of misbehaving code created for this solution affecting your Farm.
  • Standards Based – HTML, JavaScript, and CSS are basically universal skills among modern developers and standard technologies.
  • No Deployment Outage – This solution can be implemented without taking down the SharePoint farm for a deployment.  Adding a simple content editor web part does not interrupt business operations.
  • Ease of Portability / Migration – Our solution, using a console app, HTML, and JavaScript, works just as well on SharePoint 2013 and Office 365 as it does on SharePoint 2010, whereas a traditional SharePoint solution cannot be directly ported to the cloud.

Conclusion

There is a lot of legacy SharePoint 2010 out there, especially in large enterprises where the adoption and migration to newer platforms can take years. Occasionally, these older solutions need enhancements and support.  However, you want to spend as little time and money as possible on supporting outdated platforms.

We needed a solution that had the following characteristics:

  • We didn’t want to continue to write new server-side code for SharePoint 2010.
  • We wanted a solution that didn’t require an experienced SharePoint developer to create and maintain.
  • We wanted code that was modular and easily migrated to Office 365.
  • We wanted to avoid a formal SharePoint deployment and its associated outage.

A traditional SharePoint solution was not going to get us there.  Therefore, we took the best parts of the SharePoint App Model (isolation, unobtrusive client side code, and RESTful interfaces to SharePoint) and created a holistic solution that fulfilled the customers’ expectations.

-Chris

Chris Clements

This article has been cross posted from jcclements.wordpress.com/ (original article)

Uploading an Existing Local Git Repository to BitBucket.

September 24, 2015

I use BitBucket for all my recreational, educational, and at-home programming projects.  I like the fact that you can have free, private repositories. BitBucket supports Git as well as Mercurial.

Typically, I will create a new BitBucket repository and then use the Git Bash shell or Visual Studio to clone the project from BitBucket and simply add files to the new local repository.  However, there are times when I will start a local repository first and later decide that I like the project and want to save it off to BitBucket.

This is the procedure I use to upload an existing local Git repository to BitBucket.

Step 1 – Create a New Git Repository on BitBucket.

[Image: newRepo]

Step 2 – Open your Git Bash Shell and Navigate to the Location of your Git Repository

Note: The location of the .git folder is the path we are looking for.

$ cd Source/Repos/MyProject/

[Image: navigateRepro]

Step 3 – Add the Remote Origin

Note: You will need the remote path to the repository you created on BitBucket.  You can find this URL on the Overview screen for your repository, in the upper right corner of the page.

$ git remote add origin http://bit.ly/1PuVrG7

[Image: addOrigin]
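Before pushing, it doesn’t hurt to confirm the remote was registered correctly; git will list the fetch and push URLs for each remote:

$ git remote -v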

Step 4 – Push Your Repo and All Its References

$ git push -u origin --all

You will be prompted to enter your BitBucket password.

[Image: pushAll]

Step 5 – Ensure all Tags get Pushed as Well

$ git push -u origin --tags

Again, you will be prompted to enter your BitBucket password.

[Image: pushTags]

If all goes well you will see the “Everything up-to-date” message displayed in the Git Bash shell.

The procedure above will move the entire repository. That means if you created local branches, those are moved up as well. It’s pretty cool, really.  Once the remote origin is set you can commit changes locally and then use Visual Studio’s built-in Git support, or the Git Bash, to sync your changes “to the cloud”.

[Image: sourceView]

Happy Coding!

Chris Clements

This article has been cross posted from jcclements.wordpress.com/ (original article)
