· Part 3: describes the custom toolkit assembly that provides the glue to connect your claims-aware WCF application in the cloud and your SharePoint farm. I’ll walk through how you use the assembly, the fairly simple custom control you need to create (about 5 lines of code), and how you can host that control in a page in the _layouts directory as a means to retrieve and render data in a web part. The full source code for the sample custom control and _layouts page will also be posted.
· Part 5: a brief walk-through of a couple of sample applications that demonstrate other common scenarios for using the custom control described in Part 3. One uses the control to retrieve user or configuration data and store it in the ASP.NET cache, then consumes it from a custom web part. The other uses the control to retrieve data from Azure and use it in a custom task – in this case, a custom SharePoint timer job. The full source code for these sample applications will also be posted.
1. You really must upgrade to SQL Server 2008 R2 in order to open SQL Azure databases with the SQL Server Management Studio tool. You can technically make SQL Server 2008 work, but it takes a bunch of hacky workaround steps. 2008 R2 has this support baked in, and you will get the best experience there. If you really want to go the 2008 workaround route, check out this link: http://social.technet.microsoft.com/wiki/contents/articles/developing-and-deploying-with-sql-azure.aspx. There are actually a few good tips in that article, so it’s worth a read no matter what.
2. Plan on using T-SQL to do everything. The graphical tools are not available for working with SQL Azure databases, tables, stored procedures, etc. One thing I’ve found helpful, since I’m not a real T-SQL wiz, is to create a local SQL Server database first. In SQL Server Management Studio I create two connections – one to my local SQL instance and one to my SQL Azure instance. I create tables, stored procedures, etc. in the local instance so I can use the graphical tools, then use Script [whatever SQL object] As…CREATE To…New Query Editor Window. That generates the T-SQL script to create my objects, and then I can paste it into a query window that I have open against my SQL Azure database.
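For example, here’s roughly what a scripted version of the Region table used later in this post might look like – the columns are just illustrative, not the real schema. One thing to watch for: SQL Azure requires a clustered index on every table, so keep the primary key in the script:

    -- Illustrative script for a simple Region table.
    -- SQL Azure requires a clustered index on every table,
    -- so don't strip the primary key out of the generated script.
    CREATE TABLE [dbo].[Region](
        [RegionID] [int] NOT NULL,
        [RegionName] [nvarchar](50) NOT NULL,
        CONSTRAINT [PK_Region] PRIMARY KEY CLUSTERED ([RegionID] ASC)
    )
    GO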
3. This one’s important, folks – the default timeout is frequently not long enough to connect to SQL Azure. If you’re using the .NET SqlClient classes, you can change it in the connection string, i.e. add "Connection Timeout=30;". If you’re using SQL Server Management Studio, click the Advanced button on the dialog where you enter the server name and credentials, and change the timeout to at least 30. The default of 15 seconds fails often, but 30 seconds seems to work pretty well. Unfortunately, there isn’t a way to change the connection timeout for an external content type (i.e. a BDC application definition) that uses a SQL Azure data source.
4. Don’t use the database administrator account for connecting to your databases (i.e. the account you use to create the database itself). Create a separate account for working with the data. Unfortunately, SQL Azure only supports SQL accounts, so you can’t directly use the identity of the requesting user to make decisions about access to the data. You can work around that with a WCF application that front-ends the data in SQL Azure and uses claims authentication in Windows Azure – which is exactly the model the CASI Kit uses. Also, it takes a few steps to create a login that can be used to connect to data in a specific database. Here is an example:
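This is a minimal sketch – the login name, password, and the Customers database are placeholders for whatever you use. The CREATE LOGIN statement runs against the master database; the CREATE USER statement and role assignments run against the database the account will work with:

    -- Run against the master database:
    CREATE LOGIN WcfAppUser WITH PASSWORD = 'Str0ngP@ssword!'
    GO

    -- Run against the target database (Customers in this example):
    CREATE USER WcfAppUser FROM LOGIN WcfAppUser
    GO

    -- Grant only the rights the application actually needs:
    EXEC sp_addrolemember 'db_datareader', 'WcfAppUser'
    EXEC sp_addrolemember 'db_datawriter', 'WcfAppUser'
    GO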
5. You must create firewall rules for your SQL Azure server in order to allow communications from different clients. If you want to allow communication from other Azure services, you must create the MicrosoftServices firewall rule (which SQL Azure offers to create for you when you first set things up), covering the range 0.0.0.0 to 0.0.0.0. If you do not create this rule, none of your Windows Azure applications will be able to read, add or edit data in your SQL Azure databases! You should also create a firewall rule to allow communication from whatever server(s) you use to route to the Internet. For example, if you have a cable or DSL router or an RRAS server, you want to use your external address, or the NAT address for something like RRAS.
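You can create these rules in the portal, or with T-SQL against the master database using sp_set_firewall_rule. A quick sketch – the second rule name and IP address here are examples only, so substitute your own:

    -- Run against the master database.
    -- Allow other Azure services (the 0.0.0.0 rule described above):
    EXEC sp_set_firewall_rule N'MicrosoftServices', '0.0.0.0', '0.0.0.0'

    -- Allow your own external or NAT address (placeholder address here):
    EXEC sp_set_firewall_rule N'HomeOffice', '203.0.113.25', '203.0.113.25'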
· Server: precede the name of your SQL Azure server with “tcp:”.
· Trusted_Connection: this should be False, since you’re not using integrated security.
· Encrypt: this should be True for any connection to a cloud-hosted database.
· Connection Timeout: as described above, the default timeout of 15 seconds will frequently not be enough; I set it to 30 seconds here.
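Putting those pieces together, a complete connection string looks something like this – the server name, database, and credentials are placeholders, and note the username@servername format for the User ID (more on that below):

    Server=tcp:myserver.database.windows.net;Database=Customers;User ID=WcfAppUser@myserver;Password=Str0ngP@ssword!;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;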
1. Create a format file – this is used both for exporting the local data and for importing it into the cloud. In this example I’m creating a format file for the Region table in the Test database and saving it to the file region.fmt (the command is shown after this list).
2. Export data from local SQL Server – in this example I’m exporting the Region table in the Test database into a file called RegionData.dat, using the region.fmt format file I created in step 1.
3. Import data to SQL Azure. The main thing to note here is that when you import data into the cloud, you MUST include “@servername” with the user name parameter; the import will fail without it. In this example I’m importing data into the Region table in the Customers database in SQL Azure, from the RegionData.dat file I exported my local data into. My server name (the -S parameter) is the name of my SQL Azure server. For my user name (the -U parameter) I’m using the username@servername format described above. I’m telling it to use the region.fmt format file from step 1, and I’ve set a batch size (the -b parameter) of 1000 records at a time.
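Putting the three steps together, the bcp commands look roughly like this – the SQL Azure server name and credentials are placeholders, and the local commands assume a trusted connection (-T) to your local instance:

    REM Step 1 - create the format file from the local Test database:
    bcp Test.dbo.Region format nul -n -f region.fmt -T

    REM Step 2 - export the local data using that format file:
    bcp Test.dbo.Region out RegionData.dat -f region.fmt -T

    REM Step 3 - import into the Customers database in SQL Azure:
    bcp Customers.dbo.Region in RegionData.dat -f region.fmt -S tcp:myserver.database.windows.net -U WcfAppUser@myserver -P Str0ngP@ssword! -b 1000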