In Part 3 of this series we looked at the plumbing required to add support for Web API 2.x to your SharePoint Apps, as well as some of the integration needed to have it work with SharePoint and CSOM. In Part 4 we’re going to look at the integration with various Azure services in the CloudTopia app.
Here are some quick links to the whole series:
- Open Graph
- Web API
- Azure Integration
- Office 365 Integration
- Cortana, Speech and Windows Phone Integration
To begin with I started the CloudTopia project like any other SharePoint App – I cracked open Visual Studio 2013 Update 2, and then I created a new SharePoint App. I made it a provider-hosted app, and it created two projects for me – one with the application manifest needed for app registration and the other a web site project. Deployment was straightforward but I’ll cover that in the next post; I just wanted to set the stage for how we get things started. Now let’s look at how we took this application and integrated with a variety of Azure services.
Azure Web Sites
CloudTopia is deployed to an Azure web site. You get 10 free web sites with an Azure subscription, so I used one of those to deploy my app. The process for doing so is quite simple of course – you go to the Azure Management Portal, click the Web Sites link in the navigation, and add a new one. There is some subtlety to adding a new one for this project, but I’ll cover that in the SQL Azure section (it will make sense why when we get there). Once my web site was created I just downloaded the publishing profile from the Azure Management Portal page for the web site, and then in Visual Studio I chose the option to publish my site. When the publish wizard ran I gave it the location of the publishing profile file that I had downloaded and away we went. Whenever I made changes I just republished the site and about 30 seconds later my latest code was up and running in the Azure web site. Good stuff.
One other thing worth noting here is debugging. Debugging is possible for web sites hosted in Windows Azure; you just do it a little differently than if you were running your web site locally. I’ve previously posted about this process for debugging your SharePoint Apps that are hosted in an Azure web site – you can find that post here: http://blogs.technet.com/b/speschka/archive/2013/11/25/debugging-sharepoint-apps-that-are-hosted-in-windows-azure-web-sites.aspx.
SQL Azure
I used SQL Azure in the CloudTopia app primarily to simplify the daily task of finding matching tweets for social events. As I described earlier, I was able to take advantage of a free 20MB SQL Azure database that I get with my Azure subscription. You actually create and/or connect it to your application at the time you create your Azure web site – that’s why we’re covering site creation here in the SQL Azure topic. To connect these up you want to first do a custom create for your new Azure web site:
When you do that you’ll have the option of selecting a database. Click the drop down and if you have an MSDN subscription you should see an option to create a free 20MB database (assuming you have not created it already; if you have then you can just select the instance you already created):
Now I’m going to take what might seem like a brief detour but I’ll bring it back around when I’m done. One of the features of the CloudTopia app is that it will take a set of Twitter tags that have been defined for an event and go do a search to find tweets in the previous 24 hours that have used them. Every tweet that is found is added to the discussion on the Yammer Open Graph item that’s associated with the event. That’s how we get this nice integrated discussion in our events:
We’re just running this code once a day, so the process to gather these matching tweets is kicked off when an Azure Scheduler job makes a GET request to our REST endpoint that runs this code. So why am I sharing this information here? Because way back in Part 1 of this series I mentioned that I was using SQL Azure to store some of the CloudTopia data versus just keeping everything in a SharePoint list. Understanding this piece of functionality should help explain why SQL Azure.
As I also mentioned previously in this series, you always need the ID of a Yammer Open Graph item in order to read or write to the discussion that’s associated with it. Also, as I described above, this process kicks off once a day from an Azure Scheduler job. The distinction in this scenario is that there is no human present. That means that I don’t have a user context in order to make a call back into SharePoint. So if I wanted to store ALL of the CloudTopia metadata in a SharePoint list, I would need to configure my app to use an app-only request. While I could certainly do that, it requires an elevated level of permissions versus a simple user-driven request, and that was something I did not want to do. That’s how I landed on using SQL Azure for this purpose. Not only is it free for my application, I’m able to use it without any user context at all – I just use a connection string with a set of credentials for a SQL Azure user that has rights to my CloudTopia database. It’s also significantly easier for most developers to create SQL queries than the “sometimes mystical, sometimes magical, sometimes maddening” world of SharePoint CAML queries. SQL Azure makes it easy to retrieve the information needed for the CloudTopia app, and also doesn’t require a high level of permission from the application itself. Score one for SQL Azure!
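To make that concrete, here’s a minimal sketch of building that kind of connection string with SqlConnectionStringBuilder. The server, database, and credential names are placeholders for illustration, not the real CloudTopia values:

```csharp
using System;
using System.Data.SqlClient;

class ConnectionStringDemo
{
    public static string BuildConnectionString()
    {
        //build a SQL Azure connection string with a SQL user and password;
        //no SharePoint or user context is needed to use it
        SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder
        {
            DataSource = "tcp:myserver.database.windows.net,1433", //placeholder server
            InitialCatalog = "CloudTopiaDB",                       //placeholder database
            UserID = "cloudtopia_user",                            //placeholder SQL user
            Password = "placeholder-password",
            Encrypt = true
        };

        return builder.ConnectionString;
    }

    static void Main()
    {
        Console.WriteLine(BuildConnectionString());
    }
}
```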
Azure Scheduler
The final Azure service I used on CloudTopia is the Azure Scheduler service. This is a pretty straightforward service to use and configure so I’m not going to spend a ton of time talking about it. There are always several options when you are looking to schedule tasks; for CloudTopia though, as the name implies, I wanted it to be 100% hosted in the cloud – and cheap. The Azure Scheduler service is a great solution for these requirements. You get a set number of job iterations for free, and since I’m only making one job run a day – to get the tweets from the last 24 hours – this fits the bill perfectly. When you create your job you can choose an HTTP or HTTPS endpoint to invoke, and you can define whether you want to do a GET, POST, PUT or DELETE. For POST and PUT you can optionally provide a body to send along with the request; for all of them you can add one or more custom HTTP headers to send as well. After you configure your job endpoint you set up your schedule – anything from a one-time run to something that recurs on a regular basis. That is basically it, but here are some pictures to show you the UI that was used to create the CloudTopia Scheduler job:
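The request the Scheduler sends is just a plain HTTP call. As a rough illustration of the shape of it (the endpoint URL and header name here are made up for this example), here’s an equivalent GET with a custom header composed with HttpRequestMessage:

```csharp
using System;
using System.Net.Http;

class SchedulerRequestDemo
{
    public static HttpRequestMessage BuildJobRequest()
    {
        //compose the same kind of request the Azure Scheduler job sends:
        //a GET against the REST endpoint that runs the daily tweet sync
        HttpRequestMessage request = new HttpRequestMessage(
            HttpMethod.Get,
            "https://cloudtopia.example.com/api/tweets"); //hypothetical endpoint URL

        //one or more custom headers can be added per request
        request.Headers.Add("x-cloudtopia-job", "daily-tweet-sync"); //made-up header

        return request;
    }

    static void Main()
    {
        HttpRequestMessage request = BuildJobRequest();
        Console.WriteLine(request.Method + " " + request.RequestUri);
    }
}
```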
When the Scheduler job runs here’s the REST endpoint that it invokes:
public async Task<HttpResponseMessage> Get()
{
    HttpResponseMessage result = Request.CreateResponse(HttpStatusCode.OK);

    try
    {
        await Task.Run(() => UpdateYammerWithTwitterContent());
    }
    catch (Exception ex)
    {
        result = Request.CreateErrorResponse(
            HttpStatusCode.BadRequest, ex.Message, ex);
    }

    return result;
}
So I just go off and run my code to update the Yammer Open Graph item with any tweets from the last 24 hours. If it works I return an HTTP status code 200, and if it fails I decided to return an HTTP status code 400. Yeah, it’s not really a bad request, but I’ve always wanted to return that to someone else for a change.
Here’s an abbreviated version of the code to actually go out and get the tweets and write them to Yammer. First I connect to SQL Azure and get the list of events and their associated Twitter tags and Open Graph IDs:
using (SqlConnection cn = new SqlConnection(conStr))
{
    SqlCommand cm = new SqlCommand("getAllEvents", cn);
    cm.CommandType = CommandType.StoredProcedure;

    SqlDataAdapter da = new SqlDataAdapter(cm);
    DataSet ds = new DataSet();

    //fill the DataSet with the results of the stored procedure
    da.Fill(ds);
}
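Once the DataSet is filled, the event rows get turned into something easier to work with. Here’s a hypothetical sketch of that mapping step; the column names and the SocialEvent class are assumptions for illustration, not the actual CloudTopia schema:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

class SocialEvent
{
    public int OpenGraphId;
    public string TwitterTags;
}

class EventMappingDemo
{
    public static List<SocialEvent> MapEvents(DataTable table)
    {
        List<SocialEvent> events = new List<SocialEvent>();
        foreach (DataRow row in table.Rows)
        {
            events.Add(new SocialEvent
            {
                OpenGraphId = (int)row["OpenGraphId"],   //assumed column name
                TwitterTags = (string)row["TwitterTags"] //assumed column name
            });
        }
        return events;
    }

    static void Main()
    {
        //build an in-memory table shaped like the assumed getAllEvents results
        DataTable table = new DataTable();
        table.Columns.Add("OpenGraphId", typeof(int));
        table.Columns.Add("TwitterTags", typeof(string));
        table.Rows.Add(42, "#cloudtopia,#azure");

        List<SocialEvent> events = MapEvents(table);
        Console.WriteLine(events.Count);          // 1
        Console.WriteLine(events[0].TwitterTags); // #cloudtopia,#azure
    }
}
```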
With my dataset of events I enumerate through each one and go get the event tweets. I start out by getting an access token for Twitter:
//create the authorization key
string appKey = Convert.ToBase64String(Encoding.UTF8.GetBytes(
    HttpUtility.UrlEncode(TWT_CONSUMER_KEY) + ":" +
    HttpUtility.UrlEncode(TWT_CONSUMER_SECRET)));

//set the other data for our post
string contentType = "application/x-www-form-urlencoded;charset=UTF-8";
string postData = "grant_type=client_credentials";

//need to get the oauth token first; MakePostRequest is a local helper,
//and the trailing arguments here (auth key and content type) are assumed
response = MakePostRequest(postData, TWT_OAUTH_URL, null,
    appKey, contentType);

//serialize it into our class
TwitterAccessToken accessToken =
    TwitterAccessToken.GetInstanceFromJson(response);

//plug the value into our local AccessToken variable
TWT_ACCESS_TOKEN = accessToken.AccessToken;
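The credential being built there is just the URL-encoded consumer key and secret joined with a colon and then Base64-encoded, per Twitter’s application-only auth flow. Here it is as a standalone sketch (using Uri.EscapeDataString as a stand-in for HttpUtility.UrlEncode so it runs outside ASP.NET):

```csharp
using System;
using System.Text;

class AppKeyDemo
{
    public static string EncodeAppKey(string consumerKey, string consumerSecret)
    {
        //url-encode each part, join with a colon,
        //then Base64-encode the UTF-8 bytes of the pair
        string pair = Uri.EscapeDataString(consumerKey) + ":" +
            Uri.EscapeDataString(consumerSecret);
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(pair));
    }

    static void Main()
    {
        //sample (fake) credentials
        Console.WriteLine(EncodeAppKey("abc", "def")); // YWJjOmRlZg==
    }
}
```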
Now that I’m sure I have a Twitter access token I can go ahead and query Twitter for the tags I’m interested in:
//now that we have our token we can go search for tweets; MakeGetRequest
//is a local helper, and the query string argument here is assumed
response = MakeGetRequest(TWT_SEARCH_URL +
    HttpUtility.UrlEncode(twitterTags), TWT_ACCESS_TOKEN);

//plug the data back into our return value, which is just a
//custom class with a list of SearchResult so I can work
//with it easily from my code
results = SearchResults.GetInstanceFromJson(response);

//trim out any tweets older than one day, which is how frequently
//this task should get invoked
if (results.Results.Count > 0)
{
    //retrieve items added in the last 24 hours
    var newResults = from SearchResult oneResult in results.Results
                     where DateTime.Now.AddDays(-1) < oneResult.CreatedDate
                     select oneResult;

    results.Results = newResults.ToList<SearchResult>();
}
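That 24-hour trim is just a LINQ filter over the result list. Here it is as a standalone sketch with a minimal SearchResult stand-in (CreatedDate is an assumed property name, not necessarily the actual shape of the CloudTopia class):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class SearchResult
{
    public string Title;
    public DateTime CreatedDate; //assumed property name
}

class TweetFilterDemo
{
    public static List<SearchResult> TrimToLastDay(List<SearchResult> results)
    {
        //keep only tweets created in the last 24 hours
        var newResults = from SearchResult oneResult in results
                         where DateTime.Now.AddDays(-1) < oneResult.CreatedDate
                         select oneResult;

        return newResults.ToList<SearchResult>();
    }

    static void Main()
    {
        List<SearchResult> results = new List<SearchResult>
        {
            new SearchResult { Title = "new", CreatedDate = DateTime.Now.AddHours(-2) },
            new SearchResult { Title = "old", CreatedDate = DateTime.Now.AddDays(-3) }
        };

        List<SearchResult> trimmed = TrimToLastDay(results);
        Console.WriteLine(trimmed.Count);    // 1
        Console.WriteLine(trimmed[0].Title); // new
    }
}
```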
Once I get my search results back, I can add each one to the discussion on the Yammer Open Graph item:
foreach (SearchResult sr in queryResults.Results)
{
    string newPost = "From Twitter: " +
        sr.User.FromUser + " says – " + sr.Title + ". See the post and more at " +
        "https://twitter.com/" + sr.User.FromUser + ". Found on " +
        sr.CreatedDate + "."; //CreatedDate is an assumed property name

    //ogItemId is the Open Graph item ID for this event, retrieved from SQL Azure
    CreateOpenGraphPost(ogItemId, newPost);
}
You may notice that I’m calling the same CreateOpenGraphPost method that I described earlier in this series – I used it previously to create the initial post for new Open Graph items.
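The post text itself is plain string composition; a small sketch of building it in isolation (the parameter names mirror the snippet above and are assumptions about the SearchResult class):

```csharp
using System;

class PostTextDemo
{
    public static string BuildPostText(string fromUser, string title, string foundOn)
    {
        //mirror the string composition used for the Open Graph post
        return "From Twitter: " + fromUser + " says – " + title +
            ". See the post and more at https://twitter.com/" + fromUser +
            ". Found on " + foundOn + ".";
    }

    static void Main()
    {
        Console.WriteLine(BuildPostText("speschka", "Great event!", "8/1/2014"));
    }
}
```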
That’s it for this post. In Part 5 of the series we’ll look at all of the integration that was done with Office 365. It was a lot, so stay tuned.