Pimp My Content Deployment Job

As a follow-up to my article series about the content deployment and migration API, here are some tips on how you can “fine-tune” the out-of-the-box content deployment jobs. This includes information about how to adjust settings in Content Deployment Paths and Jobs, and also about how to adjust content deployment global settings which are not available through the UI.

This article demonstrates

  • How to configure the FileMaxSize property for a Content Deployment Job
  • How to configure the Timeout and the Polling Interval for a Content Deployment Job
  • How to configure a Content Deployment Job to keep the temporary Files to simplify troubleshooting
  • How to disable Compression for a Content Deployment Job
  • How to enable Event Receivers during Import if required
  • How to configure a Content Deployment Job to automatically retry after a failure has happened

Content Deployment Settings

On the Operations Tab of the Central Administration you can find a link to the Content Deployment Settings page. Here you can configure the following options:

Accept Content Deployment Jobs

This option allows you to decide whether this farm can be used as the target of content deployment operations or not. By default this option is disabled.

Import Server / Export Server

These options allow you to decide which server in the current farm should act as the import and/or export server. As an import operation can impact the performance of a server significantly – especially the memory consumption can be pretty high – you might want to configure a dedicated server to perform the import operation. Export usually affects server performance less, but that might depend on the specific content being exported.

Be aware that the server acting as the Import and Export server has to host an instance of the Central Administration Website. See section 1 of the following article for details.

Connection Security

Content Deployment first exports all content into one or more cab files and afterwards uploads the content to the destination server using an HTTP upload. If this upload goes over the internet or another insecure line, you might want to use SSL to encrypt the uploaded content. This can be done using this option.

Temporary Files

Content Deployment requires a significant amount of disk space – on both the exporting and the importing server – as the content is first exported and then compressed. So both the flat content and the cab files require space on your hard drive. This option allows you to configure the location for the compressed cab files. To see how to configure the location for the uncompressed content, see section 2 of the following article for details.

Reporting

By default MOSS keeps the last 20 reports for each content deployment job. If you need to keep more or fewer, you can adjust the number of reports using this setting.

Ok, so much for the options that can be configured in the UI. But there are a couple more very interesting options worth looking at which are NOT available through the UI. These can be configured through the object model using the ContentDeploymentConfiguration object. The documentation for this class can be found on MSDN.

The options above can also be configured using this class – plus a couple more, which I will explain now.

Hidden Content Deployment Settings

FileMaxSize

This option allows you to define the maximum size of each generated cab file. By default this size is 10 MB. When creating your own applications to deploy content using the Content Deployment and Migration API you have full control over this parameter, but the UI for the content deployment jobs does not allow you to change it.

RemoteTimeout

During the import the exporting server that is hosting the content deployment job definition polls the destination server at specific intervals to get the status of the import operation. The import goes through different phases, like extracting the package and then performing the actual import. This timeout defines the maximum time that is allowed between changes to the status report. During the actual import the report changes quite often, but there are some phases – e.g. the decompression – which can take some time. If this time exceeds the remote timeout value, then a timeout is reported back to the exporting machine. You might have seen this status in the content deployment jobs. Such a timeout does not mean that the import has failed! It just means that no status update arrived within the configured time. The default here is 600 seconds, which means 10 minutes.

If you have a large content deployment running with several GB of content, it can happen that the decompression phase takes longer than 600 seconds – and then you will see a timeout. Adjusting this parameter allows you to avoid such timeout messages.

RemotePollingInterval

This setting is tightly bound to the previous one. It defines how often the exporting server checks for status changes on the remote server. The default for this setting is 10 seconds. Usually it should not be required to change this setting, as a SOAP call to the importing server every 10 seconds to get a status update should not add measurable load to this server.

As mentioned before, adjusting these hidden properties requires writing some custom code. Also be aware that these settings are – like the settings you can configure on the Content Deployment Settings page – global settings and will affect all content deployment jobs.

Here is some sample code which configures the RemoteTimeout to one hour. It has to be run on the exporting farm:

using System;
using Microsoft.SharePoint.Publishing.Administration;

namespace StefanG.Tools
{
    class AdjustContentDeploymentDeploymentSettings
    {
        static void Main(string[] args)
        {
            // Get the farm-wide content deployment configuration object
            ContentDeploymentConfiguration config = ContentDeploymentConfiguration.GetInstance();

            // RemoteTimeout is specified in seconds: 3600 seconds = one hour
            config.RemoteTimeout = 3600;
            config.Update();
        }
    }
}
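
The other two hidden global settings can be adjusted the same way. Here is a minimal sketch – I assume here that FileMaxSize is specified in MB (the default is 10) and RemotePollingInterval in seconds (the default is 10); double-check the units against the MSDN documentation of the ContentDeploymentConfiguration class before using it:

using System;
using Microsoft.SharePoint.Publishing.Administration;

namespace StefanG.Tools
{
    class AdjustMoreContentDeploymentSettings
    {
        static void Main(string[] args)
        {
            ContentDeploymentConfiguration config = ContentDeploymentConfiguration.GetInstance();

            // Assumption: value in MB – raise the cab file size limit from 10 MB to 100 MB
            config.FileMaxSize = 100;

            // Assumption: value in seconds – poll the importing server every 30 seconds
            config.RemotePollingInterval = 30;

            config.Update();
        }
    }
}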

Content Deployment Paths and Jobs

MOSS stores information about content deployment paths and jobs in two hidden lists in the Central Administration website. You can see all the details using the following URLs:

  • http://url-to-centraladmin/Lists/Content%20Deployment%20Paths
  • http://url-to-centraladmin/Lists/Content%20Deployment%20Jobs

When browsing to these URLs you will find a list item for each configured path or job.

Attention: It is highly recommended not to modify any settings in these lists directly! There are other supported methods available to modify most of the settings you can see here – even if they are not available in the UI where you configure the content deployment path and job.

I would like to highlight the following settings which are visible in the property pages but which are not available through the official UI.

Hidden Content Deployment Path properties

KeepTemporaryFiles

Possible Values: 0 = Never, 1 = Always, 2 = Failure, Default Value: Never.

When performing a content deployment operation, the exporting server first exports the content into a temporary directory on disk and then compresses the data. After the deployment is done, the temporary files are cleaned up – more or less. Actually only the first cab file is removed, not all of them, which makes the remaining content unusable but still means you have to clean up the other files manually to preserve disk space. So why would it be interesting to change this setting? It is interesting when it comes to troubleshooting content deployment problems. Enabling this option allows you to keep and inspect the generated cab files to help you identify what is being deployed. We at Microsoft Support often use this option when working on support cases.

To modify this option please use the following STSADM command:

STSADM -o editcontentdeploymentpath -pathname <pathname> -keeptemporaryfiles Never|Always|Failure

(if your path name contains blanks, enclose it in double quotes)
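
For example, to keep the temporary files only when a deployment fails, for a (hypothetical) path named "Intranet to Internet", the call would look like this:

STSADM -o editcontentdeploymentpath -pathname "Intranet to Internet" -keeptemporaryfiles Failure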

EnableEventReceivers

Possible Values: 0 = No, 1 = Yes, Default Value: No.

This property allows you to control whether event receivers are allowed to fire during the import or not. The default is that event receivers are disabled during import. This configures the SPImportSettings.SuppressAfterEvents setting for the remote import job.

To modify this option please use the following STSADM command:

STSADM -o editcontentdeploymentpath -pathname <pathname> -enableeventreceivers yes | no
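
If you deploy with your own code through the Content Deployment and Migration API instead of the out-of-the-box jobs, the corresponding switch is SPImportSettings.SuppressAfterEvents. Here is a minimal sketch – the URL and file names are placeholders for your environment:

using System;
using Microsoft.SharePoint.Deployment;

namespace StefanG.Tools
{
    class ImportWithEventReceivers
    {
        static void Main(string[] args)
        {
            SPImportSettings settings = new SPImportSettings();
            settings.SiteUrl = "http://destination-server/sites/target";   // placeholder
            settings.FileLocation = @"c:\deployment";                      // placeholder
            settings.BaseFileName = "export.cmp";                          // placeholder

            // false = event receivers are allowed to fire during the import
            // (the API-level equivalent of -enableeventreceivers yes)
            settings.SuppressAfterEvents = false;

            SPImport import = new SPImport(settings);
            import.Run();
        }
    }
}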

EnableCompression

Possible Values: 0 = No, 1 = Yes, Default Value: Yes

This property allows you to control whether the data being deployed is compressed during the deployment or whether the uncompressed data should be sent to the destination server. By default this setting is enabled, to provide acceptable performance even when uploading the content to a server in a different subsidiary – in this situation the overhead of performing the compression is smaller than the additional transport time it would take to upload the uncompressed data to a remote server. But in an intranet, or when deploying content between two site collections on the same server, this might be different. Here the compress and decompress time can cause a significant overhead in the performance of the content deployment jobs, so it might make sense to disable the compression. This setting configures the SPDeploymentSettings.FileCompression property for export and import.

Attention: before disabling this option please check the version of the Microsoft.SharePoint.Publishing.dll on your disk. A known problem with disabling this option was fixed in build 12.0000.6315.5000. If your Microsoft.SharePoint.Publishing.dll has a lower build number you cannot disable this option, as it would cause your content deployment jobs to fail! The first hotfixes containing the solution for this problem are 952698 (WSS fix) and 952704 (MOSS fix). Be sure to install both fixes on your server.

To modify this option please use the following STSADM command:

STSADM -o editcontentdeploymentpath -pathname <pathname> -enablecompression yes | no
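
Again, for custom code using the Content Deployment and Migration API the same behavior is controlled through the FileCompression property of the export and import settings (both derive from SPDeploymentSettings). A minimal sketch with placeholder URLs and paths:

using System;
using Microsoft.SharePoint.Deployment;

namespace StefanG.Tools
{
    class DeployWithoutCompression
    {
        static void Main(string[] args)
        {
            // Export the flat files without packing them into cab files
            SPExportSettings exportSettings = new SPExportSettings();
            exportSettings.SiteUrl = "http://source-server/sites/source";  // placeholder
            exportSettings.FileLocation = @"c:\deployment";                // placeholder
            exportSettings.FileCompression = false;

            SPExport export = new SPExport(exportSettings);
            export.Run();

            // The import side has to use the matching setting
            SPImportSettings importSettings = new SPImportSettings();
            importSettings.SiteUrl = "http://destination-server/sites/target";  // placeholder
            importSettings.FileLocation = @"c:\deployment";                     // placeholder
            importSettings.FileCompression = false;

            SPImport import = new SPImport(importSettings);
            import.Run();
        }
    }
}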

Hidden Content Deployment Job properties

Actually there are only two options I would like to highlight here, as they are really cool new additions introduced in one of the latest hotfix packages. You will not find these options if your Microsoft.SharePoint.Publishing.dll has a build number lower than 12.0000.6315.5000. If your build number is lower and you need this feature, please request the following two hotfixes from Microsoft Support: 952698 (WSS fix) and 952704 (MOSS fix).

RetryOption

Possible Values: 0 = None, 1 = SkipFailedWebs, 2 = SimpleRetry, Default Value: None.

This option has been implemented to allow parallel execution of content deployment jobs. In the past we had a couple of cases where customers ran into problems when executing content deployment jobs in parallel. In such a situation the deployment job was likely to fail due to update conflicts caused by the parallel running content deployment jobs. To overcome this limitation the retry feature has been implemented, which allows a deployment to be redone if it failed. With SkipFailedWebs the failing sites (SPWeb objects) that caused the problem are excluded from the next attempt; with SimpleRetry the export is retried with all sites included. This allows content deployment to be used, in a limited manner, in situations where problems in specific sites would persistently cause content deployment jobs to fail.

ExportRetries

Possible Values: 1-999. A small number (3-5) is recommended.

This option defines how many times the deployment retry is attempted before the job finally ends with Failure. To ensure that the deployment job does not run forever on a persistent problem, you should avoid configuring this to a number higher than 5.

As these options are pretty new and have been introduced through a hotfix, there is no UI and no STSADM command to configure them. To activate them you need to modify the web.config file of the Central Administration website.

The following changes have to be applied:

Inside the following sectionGroup of the web.config

<configuration>
  <configSections>
    <sectionGroup name="SharePoint">

add the following element:

      <section name="ExportRetrySettings"
               type="Microsoft.SharePoint.Publishing.Administration.RetrySectionHandler, Microsoft.SharePoint.Publishing, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" />

Inside the following element

<configuration>
  <SharePoint>

add the following element:

    <ExportRetrySettings DefaultOption="<option>" DefaultRetries="<number-of-retries>">
      <Job Name="<Name-of-Deployment-Job1>" Option="<option>" Retries="<number-of-retries>" />
      <Job Name="<Name-of-Deployment-Job2>" Option="<option>" Retries="<number-of-retries>" />
      …
    </ExportRetrySettings>

<option> can be one of the following: None, SkipFailedWebs, SimpleRetry
<number-of-retries> can be an integer between 1 and 999

If you don’t add any <Job…> elements, then all new jobs will get the values configured as DefaultOption and DefaultRetries. Using the <Job…> element you can configure different jobs with different values.
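
For illustration, a filled-in configuration might look like this – the job name is hypothetical and has to match the name of your content deployment job exactly:

    <ExportRetrySettings DefaultOption="SimpleRetry" DefaultRetries="3">
      <Job Name="Intranet Incremental Deployment" Option="SkipFailedWebs" Retries="5" />
    </ExportRetrySettings>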

After you have added the configuration listed above to the web.config of the Central Administration, you have to open the content deployment job definition once in Central Administration and save it again. During this save the configured options will be silently added to the content deployment job. New jobs created after you changed the web.config will automatically get the configured values.

You can verify this easily using the job list URL given above, which allows you to check the configured settings for each of the content deployment jobs.

52 Comments


  1. Thank you, very helpful – BUT I get this error (I have SharePoint Service Pack 1 installed and version 12.0.6211.1000 of the SharePoint.Publishing dll; I also checked with Reflector and cannot find the type RetrySectionHandler). Any help is appreciated.

    The ERROR:

    An error occurred creating the configuration section handler for SharePoint/ExportRetrySettings: Could not load type ‘Microsoft.SharePoint.Publishing.Administration.RetrySectionHandler’ from assembly ‘Microsoft.SharePoint.Publishing, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c’.


  2. Yesterday a colleague (Patrick Heyde) solved an interesting problem with MOSS 2007 and IIS 7 related


  3. Because it is not possible to write about every subject, and because others do it


  4. Hi Robert,

    sounds as if you have hotfix mixture. Retry comes with build 6315.5000 – not before. So something is on 6315.5000 while other stuff is on SP1.

    Please ensure to install the latest WSS and MOSS hotfixes.

    Cheers,

    Stefan


  5. dude you saved my butt today with this post… along with a little more specifics from http://ecmlounge.blogspot.com/ who had the actual code for setting the FileMaxSize

    though I set it to 2MB based on another post

    thanks, oh great sharepoint guru

    p.s. I will be using a lot more of this post in the future, too


  6. Hi there.

    Apologies for contacting you like this; we’re in a real sticky situation with a content deployment job.

    I am currently running an incremental content deployment job and the SharePoint log file has many of these in it:

    10/01/2008 15:48:24.56 w3wp.exe (0x0A28)                       0x0B14 CMS                           Content Deployment             78cf High     Content Deployment Status is not available

    The job status page in Central Admin never gets updated, it always says that the deployment job has been running for 3 seconds.

    Can you point me to any resources that might help me resolve this? I have restarted the SharePoint timer service, but this has had no effect.

    Many thanks.

    Jas.


  7. Hi Evans,

    sorry I haven’t seen this. I would suggest to open a support case with Microsoft to get this analyzed.

    Cheers,

    Stefan


  8. In the past I have released several blogs about the various problems that can occur with Content Deployment.


  9. Hi Stefan,

    Thank you for your posts.  They are very informative.  We have two separate WSS 3.0 site collections with their own content databases.  I was wondering if it is possible to use the Content Deployment API to export these two site collections and import them into a new WSS 3.0 web application with a new content db.  The reason we want to combine these is to allow WSS search service to be able to index and search all of the data.  We would then extend this web app into a new zone to allow a different authentication method (ie forms authentication or authentication against a second Active directory).  Is this solution possible using Content Deployment?  Any gotchas i should be aware of?  Thank you very much in advance!


  10. Content Deployment can deploy content between different site collections.

    It does not matter if these site collections are in the same or different content DBs.

    So you can deploy two site collections which are in separate content db’s into two site collections in the same content db if you like.

    Cheers,

    Stefan


  11. Hi Stefan

    We pimped our content deployment job..

    the export runs successfully, but when I try to import, the import stops with the following error:

    FatalError: Length cannot be less than zero.

    Parameter name: length

      at System.String.InternalSubStringWithChecks(Int32 startIndex, Int32 length, Boolean fAlwaysCopy)

      at Microsoft.SharePoint.Deployment.ListItemSerializer.GetLookupInfoFromFieldData(Object value, Guid& lookupListId, Int32& lookupItemId, Boolean& isUserLookup, String& userLogin, String& mvlVal)

      at Microsoft.SharePoint.Deployment.ListItemSerializer.UpdateFieldData(SPListItem listItem, ImportObjectManager objectManager, Guid docId, String fieldName, String value, String value2, Guid gFieldId, Boolean& bCreated, Dictionary`2 brokenFields)

      at Microsoft.SharePoint.Deployment.ListItemSerializer.UpdateFieldData(SPListItem listItem, Guid docId, Boolean& bCreated, SPContentTypeId contentTypeId, ImportObjectManager objectManager, Object data)

      at Microsoft.SharePoint.Deployment.ListItemSerializer.SetObjectData(Object obj, SerializationInfo info, StreamingContext context, ISurrogateSelector selector)

      at Microsoft.SharePoint.Deployment.XmlFormatter.ParseObject(Type objectType, Boolean isChildObject)

      at Microsoft.SharePoint.Deployment.XmlFormatter.DeserializeObject(Type objectType, Boolean isChildObject, DeploymentObject envelope)

      at Microsoft.SharePoint.Deployment.XmlFormatter.Deserialize(Stream serializationStream)

      at Microsoft.SharePoint.Deployment.ObjectSerializer.Deserialize(Stream serializationStream)

      at Microsoft.SharePoint.Deployment.ImportObjectManager.ProcessObject(XmlReader xmlReader)

      at Microsoft.SharePoint.Deployment.SPImport.DeserializeObjects()

      at Microsoft.SharePoint.Deployment.SPImport.Run()

    Any Ideas?

    regards

    patrick


  12. Hi Patrick,

    there seems to be a problem with a lookup field.

    Are you doing a full deployment of the whole site collection?

    Cheers,

    Stefan


  13. Thanks for the quick reply

    we do an export of a Site into another site.

    f.ex http://www.1.tld/content/ into http://www.2.tld/content/ with a full deployment. The problem is the same with stsadm import/export.

    when I delete a subsite which caused the error, it then fails e.g. 3 sites later. All the pages share the same content type. I can’t see any logic regarding content type, content or type of creation.

    also, when I try to copy one of the "error" pages to another place with "Manage Content & Structure", I get the same problem

    is there a possibility to know exactly which field causes the problem?

    regards

    patrick


  14. Hi Patrick,

    only by analyzing a memory dump.

    I would suggest to open a support case with Microsoft to get this analyzed.

    Cheers,

    Stefan


  15. We're wrapping up a project where we're building a publicly accessible dot com site in SharePoint


  16. Stefan,

    I need to force full content because the site collections are not the same anymore …. some stuff on the destination site is not working. Is there a way to deploy the collection completely without deleting the old one first …. it is needed there for internet purposes ….


  17. Hi Eddi,

    unfortunately not.

    When performing a full deploy it is a va banque game.

    It could be that it fails – or it could be that it will not.

    But in both cases you cannot be sure if the content on source and target is in sync again.

    The only reliable method is to perform a full deploy into an empty site collection.

    Cheers,

    Stefan


  18. Hi Stefan,

    Thanks for this useful series.

    I may have a solution for Patrick’s problem; after much heartache and a few anger management sessions, we came across this blog post on the subject http://nklbala.blogspot.com/2008/09/length-cannot-be-less-than-zero-error.html

    Unfortunately I have another infuriating problem: I need to export just the list items from a list and then import them into a new deployment of the list, whilst retaining the items’ GUIDs and IDs.

    However when I attempt to do this through the api I get an error

    "Cannot be imported because its parent web.. does not exist"

    The web does exist and if I turn the retain identity flag off the import succeeds fine, except with new GUIDS.

    Do you have any idea why this occurs? or anyway around it? the idea is we can incrementally deploy new versions of our site (new list views ect) and then migrate the content across.

    Cheers,

    Scott.


  19. Hi Scott,

    sorry but what you are trying to achieve is not possible.

    If you use RetainObjectIdentity then the URL, the GUID and the parent object have to be the same on source and target.

    If you need to change one you have to disable RetainObjectIdentity which also causes the creation of a new GUID.

    Cheers,

    Stefan


  20. Hi Stefan,

    Cheers for that(Although that really sucks), we have just further investigated the error "FatalError: Length cannot be less than zero." as there were still a few lists that were giving us the lookup exception.

    It turns out that a user lookup can be represented in two formats: either just the user ID, or the ID + ";#" + the name. SharePoint normally uses the full format but sometimes returns just the ID, I’m not sure why.

    But anyway, the import code doesn’t handle the fact that there are two ways to represent a user in a user lookup and only does a lastIndexOf(";"), which gives us the "length cannot be less than zero" exception.

    So the solution is to loop through all your items and change any lookup ids to use the full shebang before you export.

    This seems like a SharePoint bug to me(sigh).


  21. Hi Scott,

    sounds indeed like a problem in the code.

    I would suggest to open a support case for this to request a fix.

    Cheers,

    Stefan


  22. Hi Stefan,

    What in your opinion is the best way to do incremental deployments of a SharePoint project? We currently have our site as a Visual Studio Project using SharePoint extensions.

    Bearing in mind migrating content with permissions, alerts, long running workflows.

    Cheers,

    Scott.


  23. Hi Scott,

    content deployment can only deploy what its name says: content (including the permissions).

    Alerts and workflow state cannot be deployed.

    Cheers,

    Stefan


  24. Hi Stefan,

    Just encountered another bug with the content migration API. If you export list items with versions and then import them into a list that is version-enabled, you get a null object reference exception.

    It turns out that if one of the exported items does not have multiple versions, the generated XML misses out the version tag; this then causes the import to crash, as it’s expecting version information.

    Our workaround is to go update each item in the versioned list to make sure it has at least one version, though this shouldn’t be necessary…


  25. Hi Scott,

    to get this fixed you would need to raise a support case.

    Cheers,

    Stefan


  26. Hi Stefan

    I’m trying to create a web service that will allow us to run a pre-defined incremental deployment job remotely. Is it possible to use Microsoft.SharePoint.Publishing.Administration.ContentDeploymentJob.Run() on the server to achieve this? I keep hitting "Item does not exist. It may have been deleted by another user" type errors which I assume are security related.

    I can view the job properties and even delete it like this, but can’t use .Run(), .Test() or .Update().

    Is this possible or is there any other way of achieving this? The scenario is that we frequently need to deploy content to the live site but our hosting company can’t allow us access to central admin as they don’t want non-technical users potentially causing issues there so we’re trying to provide a single web page with a single button that will let these users start an incremental update, but not give them access to anything else.


  27. Hi Dave,

    performing this would cause the content deployment export to be done in the w3wp.exe process.

    It will not start the job in OWSTIMER.EXE, which is what you are most likely planning to do.

    Cheers,

    Stefan


  28. Cheers Stefan – So is there no way to trigger a remote content update then? How do other users get around this problem – I can’t imagine we’re alone in having a hosted solution and therefore not having access to central admin…


  29. Hi Dave,

    most people use the scheduling feature of content deployment.

    Cheers,

    Stefan


  30. Thats a shame – we use that already for general updates, but we have frequent time critical updates that need to be made live as soon as they’ve been added to the authoring site, so we can’t wait for the scheduled job to run and we can’t wait for our hosts to run a job manually as it takes too long. I guess the only option is to see if they’ll give us central admin access after all 🙁


  31. Hi Dave,

    that’s what Quick Deploy covers.

    Here an author can tag specific items for a quick deploy which is independent from incremental deployment schedules.

    Cheers,

    Stefan


  32. Thanks – Unfortunately that doesn’t work for us as it’s still a timer job (and the lowest frequency seems to be 10 minutes). As our content deployments are very time dependent we need to be able to get content onto the live site as soon as it has management approval – ie. in less than a minute. Also is it possible to deploy list items with quick deploy? I can’t see any way of tagging those.


  33. Hi Dave,

    you can only tag publishing pages which will deploy the page and the linked resources.

    An alternative might be not to use content deployment at all if you need such direct updates but to author directly on the live site.

    Cheers,

    Stefan


  34. Hi, if you use or want to use content deployment in Microsoft Office SharePoint Server 2007, you should


  35. In working with a customer we ran into this issue from MOSS Content Deployment. This is a good one to


  36. Hi Stefan

    We are facing a weird issue after upgrading our pre-prod and production environment to SP2. We created an empty site on the destination and tried to do a full content deployment from the pre-prod environment. The import failed while importing object 742 with the following error:

    The local device name is already in use. (Exception from HRESULT: 0x80070055)
      at Microsoft.SharePoint.Library.SPRequest.ThrowError(Int32 dwErr)
      at Microsoft.SharePoint.SPContentTypeCollection.AddContentTypeToWeb(SPContentType contentType)
      at Microsoft.SharePoint.SPContentTypeCollection.AddContentType(SPContentType contentType, Boolean checkName, Boolean updateResourceFileProperty)
      at Microsoft.SharePoint.Deployment.ContentTypeSerializer.CreateContentType(SPContentType sourceContentType)
      at Microsoft.SharePoint.Deployment.ContentTypeSerializer.ProcessContentType(SPContentType sourceContentType, String contentTypeXml, ImportObjectManager importObjectManager, Boolean IsParentSystemObject)
      at Microsoft.SharePoint.Deployment.ContentTypeSerializer.SetObjectData(Object obj, SerializationInfo info, StreamingContext context, ISurrogateSelector selector)
      at Microsoft.SharePoint.Deployment.XmlFormatter.ParseObject(Type objectType, Boolean isChildObject)
      at Microsoft.SharePoint.Deployment.XmlFormatter.DeserializeObject(Type objectType, Boolean isChildObject, DeploymentObject envelope)
      at Microsoft.SharePoint.Deployment.XmlFormatter.Deserialize(Stream serializationStream)
      at Microsoft.SharePoint.Deployment.ObjectSerializer.Deserialize(Stream serializationStream)
      at Microsoft.SharePoint.Deployment.ImportObjectManager.ProcessObject(XmlReader xmlReader)
      at Microsoft.SharePoint.Deployment.SPImport.DeserializeObjects()
      at Microsoft.SharePoint.Deployment.SPImport.Run()

    Checked file system, there is enough space, checked the temp folder on the import server and the cab files are copied over there. Can you please provide some pointers ?

    Appreciate any help from you


  37. Hi Binay,

    I haven’t seen this specific issue. I would suggest to open a support case with Microsoft to get it analyzed.

    Cheers,

    Stefan


  38. Thank you. Very helpful to find out about the compression flag


  39. Hi Stefan,

    I had the same error today as Binay.

    How I "achieve" this:

    I took a full backup of our environment (using "stsadm -backup http://myserver -overwrite").

    I restored it on another server which seems to work fine.

    Then I wanted to keep it up to date by doing an import/export (again using stsadm -export and stsadm -import, including user security, version=4, etc…)

    Any help would be much appreciated.

    Thanks.

    Jerry


  40. Hi Jerry,

    sorry this is not possible.

    STSADM cannot do incrementals.

    Cheers,

    Stefan


  41. HI Stefan,

    I use content deployment feature between two farms (Staging and Web)

    I want to disable Compression for a Content Deployment Job and automatically retry after a failure has happened.

    The configurations should be done in both Farms? I mean to run STSADM -o editcontentdeploymentpath -pathname <pathname> -enablecompression no and make the changes in CA web.config.


  42. Hi Martin,

    these changes are only required on the source farm.

    Cheers,

    Stefan


  43. Hi Stefan,

    I have run the content deployment job from Central Administration, and it’s throwing this error:

    "The list item profiles.aspx cannot be imported because its parent web does not exist. at Microsoft.SharePoint.Deployment.ListItemSerializer.GetParentWeb(Guid parentWebId, Guid listItemId, Guid parentId, String dirName, String fileName, Guid& guidWeb, Guid& guidList) at Microsoft.SharePoint.Deployment.ListItemSerializer.DeleteListItem(DeploymentObject deployObject) at Microsoft.SharePoint.Deployment.ListItemSerializer.SetObjectData(Object obj, SerializationInfo info, StreamingContext context, ISurrogateSelector selector) at Microsoft.SharePoint.Deployment.XmlFormatter.DeserializeDeletedObject(DeploymentObject deployObject) at Microsoft.SharePoint.Deployment.XmlFormatter.Deserialize(Stream serializationStream) at Microsoft.SharePoint.Deployment.ObjectSerializer.Deserialize(Stream serializationStream) at Microsoft.SharePoint.Deployment.ImportObjectManager.ProcessObject(XmlReader xmlReader) at Microsoft.SharePoint.Deployment.SPImport.DeserializeObjects() at Microsoft.SharePoint.Deployment.SPImport.Run()"

    but there is no item profiles.aspx. Please help me understand what the reason for the job failing might be.

    Thanks,

    Naresh


  44. Hi Naresh,

    just from the error message it is hard to tell what went wrong. But a common scenario where this can happen is selective deployment – meaning you did not deploy the whole site collection but only some sites in the hierarchy.

    In such a situation this error can happen if a page in the exported hierarchy contains a link to pages outside the exported hierarchy.

    As content deployment exports all dependencies it will in such a situation also export the linked page.

    When the import phase later occurs, the linked page cannot be imported as the parent web was not exported.

    I would suggest to analyze the import report and the manifest file to isolate where the page resided and whether it was exported as a dependency or as a selected object.

    Cheers,

    Stefan


  45. Hi Stefan,

    Thanks a ton for your response.

    I really don’t know where the profiles.aspx page is coming from, but I solved this by creating a new blank site for the destination.

    Thanks,

    Naresh


  46. Hey Stefan,

    We use SharePoint 2007 content deployment and plan to upgrade our source farm from SQL 2005 to SQL 2008.  Are there any issues or any advice you could offer that might keep things smooth?  In theory, we can just detach from the 2005 SQL server and attach to the 2008 SQL server and it should still content deploy OK, right? (Not a backup and restore.)  The SQL server and instance in the 2008 environment will maintain the same names and IPs.

    Thanks,

    Sarah


  47. Hi Stefan,

    We have content deployment setup in SharePoint 2010. All was working fine. We renamed target site (i.e. changed url). Now content deployment fails. Is there a way to update the content deployment target URL after the path has been created? Or is our only option to delete the path and create a new one?

    Thank you


  48. Hi Nishan,

    you would need to update the url in the Content Deployment Path list item in the /lists/Content%20Deployment%20Paths list.

    Cheers,

    Stefan


  49. Hi Stefan –

    On the path you provided we do see the Content Deployment Path and the settings we see within it are:

    Title, Description, IsPathEnabled, SourceServerUrl, SourceSiteCollection, AuthenticationType, includeUserInfoDateTime, IncludeSecurity, DeploymentStatus, KeepTemporaryFiles, EnableEventReceivers, EnableCompression

    We need to modify the TARGET url and don't see any option to modify that particular piece. At this point would we be looking at starting over with a new content database?

    Much appreciated


  50. Hi Nishan,

    might be that the url is not in the default view.

    Cheers,

    Stefan


  51. Hi Nishan,

    I forgot: in 2010 and 2013 some of the columns are marked as hidden. You can only update them using the object model – e.g. using PowerShell.

    Might be that this url is one of these. Don't have a system in front of me to double check right now.

    Cheers,

    Stefan

