Deep Dive into the SharePoint Content Deployment and Migration API – Part 2

[Part 1 – Part 2 – Part 3 – Part 4 – Part 5 – Part 6 – Part 7]

Providing some real-world samples for export

To demonstrate how powerful the Content Deployment and Migration API is, I will now provide some real-world examples:

  1. Export a complete Site Collection
  2. Export a specific sub web
  3. Exporting other elements like lists, document libraries, list items or documents from a document library
  4. Exporting using compression
  5. Incremental Export

The examples below assume that the source site collection is on the local server on port 2000.

Let's start with a scenario that can also be covered by using stsadm.exe:


1) Export a complete Site Collection

Actually there are two different methods to export a complete site collection.

Method 1)

SPExportSettings settings = new SPExportSettings();
settings.SiteUrl = "http://localhost:2000";
settings.ExportMethod = SPExportMethodType.ExportAll;
settings.FileLocation = @"c:\export";
settings.FileCompression = false;
settings.CommandLineVerbose = true;

SPExport export = new SPExport(settings);
export.Run();

So what did we actually do here? First we created an SPExportSettings object to define the general configuration settings for the export we would like to perform. As we did not select a specific object to export, the configured site collection will be exported. Then we created the SPExport object based on the configured settings and started the export by calling the Run method.

Here is a short explanation of the settings used in the code above:

  • SiteUrl – this property defines which site collection the export should use. All objects being exported always have to be in the same site collection. The Content Deployment and Migration API cannot access items in different site collections in a single operation.
  • ExportMethod – this property defines whether to perform an incremental export (value = ExportChanges) or a full export (value = ExportAll). Be aware that ExportChanges requires an export change token to be provided in a separate property.
  • FileLocation – this property defines where to store the exported content. The value should point to an empty directory. If the directory does not exist it will be created during export. If file compression is being used, only the compressed files will be stored in this location. The uncompressed files will be stored in the directory identified by the value of the system-wide TMP environment variable, so you need to ensure that the directory the TMP environment variable points to also has sufficient space available.
  • FileCompression – this property defines whether the content should be compressed into a CAB file. If you need to archive the exported content or need to transfer it to a different machine you should choose to compress. If you only export the content to import it afterwards using code on the same machine and don’t need to archive it (e.g. a copy or move operation) then you should disable compression, as this is significantly quicker.
  • CommandLineVerbose – this property controls whether the API provides verbose output. If you have ever seen the output generated when running STSADM -o export: this is exactly the flag that generates that output. If the value is false, no output is generated.

Method 2)

SPSite site = new SPSite("http://localhost:2000");

SPExportObject exportObject = new SPExportObject();
exportObject.Id = site.ID;
exportObject.Type = SPDeploymentObjectType.Site;

SPExportSettings settings = new SPExportSettings(); 
settings.SiteUrl = "http://localhost:2000";
settings.ExportMethod = SPExportMethodType.ExportAll; 
settings.FileLocation = @"c:\export";
settings.FileCompression = false;
settings.ExportObjects.Add(exportObject);

SPExport export = new SPExport(settings);
export.Run();  

site.Dispose();

What did we actually do here? In this method we explicitly add a specific object to the ExportObjects collection – the SPSite object representing the site collection. This leads to the same result as the first method, where the SPSite object was selected implicitly because no explicit object was added for export.

Here is a short explanation of the settings used in the code above:

  • SPExportObject.Id – GUID of the object that should be exported
  • SPExportObject.Type – The type of the object to be exported. This can be one of the following:
    • Site = SPSite object (the site collection)
    • Web = SPWeb object (often referred to as site)
    • List = SPList object
    • File = SPFile object
    • Folder = SPFolder object
    • ListItem = SPListItem object
  • For the other properties, see Method 1. A short sketch for the Folder and File types, which are not used in the samples below, follows after this list.
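The Site, Web, List and ListItem types are all used in the samples in this article. The Folder and File types are not, so here is a minimal sketch of how a single folder could be selected for export. This is only a sketch: it assumes that, analogous to the list item sample further down, the folder can be addressed by its UniqueId, and the library and folder names are placeholders:

SPSite site = new SPSite("http://localhost:2000");
SPWeb web = site.OpenWeb();

// a folder inside a document library (sample URL)
SPFolder folder = web.GetFolder("Shared Documents/SomeFolder");

SPExportObject exportObject = new SPExportObject();
exportObject.Id = folder.UniqueId;
exportObject.Type = SPDeploymentObjectType.Folder;
// a single file would be selected the same way using
// SPFile.UniqueId and SPDeploymentObjectType.File

SPExportSettings settings = new SPExportSettings();
settings.SiteUrl = "http://localhost:2000";
settings.ExportMethod = SPExportMethodType.ExportAll;
settings.FileLocation = @"c:\export";
settings.FileCompression = false;
settings.ExportObjects.Add(exportObject);

SPExport export = new SPExport(settings);
export.Run();

web.Dispose();
site.Dispose();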


2) Export a specific sub web

The code to export a specific sub web is very similar to the code used in Method 2 to export the complete site collection:

SPSite site = new SPSite("http://localhost:2000");

SPWeb web = site.OpenWeb("/SomeWeb"); 

SPExportObject exportObject = new SPExportObject();
exportObject.Id = web.ID;
exportObject.IncludeDescendants = SPIncludeDescendants.All;

exportObject.Type = SPDeploymentObjectType.Web;

SPExportSettings settings = new SPExportSettings(); 
settings.SiteUrl = "http://localhost:2000";
settings.ExportMethod = SPExportMethodType.ExportAll; 
settings.FileLocation = @"c:\export";
settings.FileCompression = false;
settings.ExcludeDependencies = false;
settings.ExportObjects.Add(exportObject);

SPExport export = new SPExport(settings);
export.Run();

web.Dispose();
site.Dispose();

 

Be aware that exporting a specific sub web using the method above might also export objects outside that sub web: not only the objects in the sub web are exported, but also dependent objects such as images referenced by pages in this sub web and the master page assigned to the sub web.

Whether such dependent objects should be exported or not can be controlled by the ExcludeDependencies property of the SPExportSettings object. Whether child objects of an object are exported can be controlled using the IncludeDescendants property of the SPExportObject object:

  • ExcludeDependencies – This property controls whether to export dependent objects such as referenced images, master pages or page layouts. If it is unclear whether the dependent objects already exist in the destination database, you should export the dependent objects as well to prevent problems.
  • IncludeDescendants – This property controls whether to export child objects of the selected object (such as sub webs, lists and list items) or only the selected object itself. Possible values are None (only the object itself is exported), Content (lists and libraries of a web are exported but not its sub webs) and All (all child content is exported). A short sketch using the Content value follows below.
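To illustrate the IncludeDescendants options, here is a minimal sketch that exports a sub web together with its lists and libraries but without its child webs, i.e. using the Content value (the web URL "/SomeWeb" is the same placeholder as in the sample above):

SPSite site = new SPSite("http://localhost:2000");
SPWeb web = site.OpenWeb("/SomeWeb");

SPExportObject exportObject = new SPExportObject();
exportObject.Id = web.ID;
exportObject.Type = SPDeploymentObjectType.Web;
// Content: export the lists and libraries of this web, but no sub webs
exportObject.IncludeDescendants = SPIncludeDescendants.Content;

SPExportSettings settings = new SPExportSettings();
settings.SiteUrl = "http://localhost:2000";
settings.ExportMethod = SPExportMethodType.ExportAll;
settings.FileLocation = @"c:\export";
settings.FileCompression = false;
settings.ExportObjects.Add(exportObject);

SPExport export = new SPExport(settings);
export.Run();

web.Dispose();
site.Dispose();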


3) Exporting other elements like lists, document libraries, list items or documents from a document library

Other objects or containers can be exported in the same way by adding them to the ExportObjects collection.

SPSite site = new SPSite("http://localhost:2000");
SPWeb web = site.OpenWeb("/SomeWeb");
SPList list = web.Lists["MyList"];
SPListItem listItem = list.Items[0]; // select the first list item

SPExportObject listExportObject = new SPExportObject();
listExportObject.Id = list.ID;
listExportObject.Type = SPDeploymentObjectType.List;

SPExportObject listItemExportObject = new SPExportObject();
listItemExportObject.Id = listItem.UniqueId;
listItemExportObject.Type = SPDeploymentObjectType.ListItem;

// both export objects would then be added to settings.ExportObjects
// as shown in the samples above

...

 

Exporting document libraries can be done using the same code as exporting a list, as internally a document library is implemented as a list. A document in a document library can be exported using the same method as any other list item. The SPFile object of the document will automatically be exported together with the SPListItem object. A short sketch for a single document follows below.
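Here is a minimal sketch of that approach for a single document. It assumes a document library called "Documents" containing a file "SomeDocument.docx" in the sub web; both names are just placeholders:

SPSite site = new SPSite("http://localhost:2000");
SPWeb web = site.OpenWeb("/SomeWeb");

// get the list item behind the document
SPFile file = web.GetFile("Documents/SomeDocument.docx");
SPListItem documentItem = file.Item;

SPExportObject exportObject = new SPExportObject();
exportObject.Id = documentItem.UniqueId;
// the SPFile is exported automatically together with the list item
exportObject.Type = SPDeploymentObjectType.ListItem;

SPExportSettings settings = new SPExportSettings();
settings.SiteUrl = "http://localhost:2000";
settings.ExportMethod = SPExportMethodType.ExportAll;
settings.FileLocation = @"c:\export";
settings.FileCompression = false;
settings.ExportObjects.Add(exportObject);

SPExport export = new SPExport(settings);
export.Run();

web.Dispose();
site.Dispose();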


4) Exporting using compression

When exporting using compression, additional settings in the SPExportSettings object need to be adjusted:

SPExportSettings settings = new SPExportSettings();
settings.FileLocation = @"c:\export";
settings.FileCompression = true;

settings.FileMaxSize = 10; // defines the maximum file size as 10 MB
settings.BaseFileName = "ExportedItems.cmp";
… 

Here is an explanation of these properties; a complete example follows below the list:

  • FileCompression – this property defines whether the content should be compressed into a CAB file. If you need to archive the exported content or need to transfer it to a different machine you should choose to compress. If you only export the content to import it afterwards using code on the same machine and don’t need to archive it (e.g. a copy or move operation) then you should disable compression, as this is significantly quicker.
  • FileLocation – this property defines where to place the export files. If compression is used it defines the location of the generated compressed content migration package.
  • FileMaxSize – this property defines the maximum size (in MB) of each created content migration package file.
  • BaseFileName – this property is only used if compression is enabled. It defines the name of the compressed content migration package file. The default extension (if not specified) is “.cmp”. If FileMaxSize is exceeded, a new file is created using the BaseFileName plus an integer number; e.g. with the above configuration the next file would be ExportedItems1.cmp.
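Putting these settings together, a complete compressed export of the site collection could look like the following minimal sketch (the URL and paths are the same placeholder values used in the earlier samples):

SPExportSettings settings = new SPExportSettings();
settings.SiteUrl = "http://localhost:2000";
settings.ExportMethod = SPExportMethodType.ExportAll;
settings.FileLocation = @"c:\export";            // location of the generated .cmp file(s)
settings.FileCompression = true;
settings.FileMaxSize = 10;                       // split the package into files of at most 10 MB
settings.BaseFileName = "ExportedItems.cmp";     // ExportedItems.cmp, ExportedItems1.cmp, ...
settings.CommandLineVerbose = true;

SPExport export = new SPExport(settings);
export.Run();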

 


5) Incremental Export
 
If an incremental export should be done, it is required to save the change token of the last full or incremental export. This change token then needs to be provided in the export settings to allow the Content Deployment and Migration API to determine which items have changed.
So after a previous export, the change token can be retrieved using the following code:

SPExportSettings settings = new SPExportSettings();

SPExport export = new SPExport(settings);
export.Run();

string changeToken = settings.CurrentChangeToken;

Explanation:

  • CurrentChangeToken – this is a read-only property populated during export. After the export has completed it contains the change token of that export. Performing an incremental export with this change token later will export all items that have been created, changed or deleted in the configured scope after the change token was generated.

An incremental export using this change token would then be started with the following code:

SPExportSettings settings = new SPExportSettings();
settings.ExportMethod = SPExportMethodType.ExportChanges;  
settings.ExportChangeToken = oldChangeToken;
…  

Explanation:

  • ExportMethod – this property defines whether to perform an incremental export (value = ExportChanges) or a full export (value = ExportAll).
  • ExportChangeToken – this property defines which items to export when using incremental deployment. The incremental export will only export items that have been created, changed or deleted in the configured scope after the change token was generated. A combined sketch of the complete workflow follows below.
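To tie the two fragments together, here is a minimal sketch of the complete workflow. It assumes the change token is simply persisted to a text file between the two runs; the file path and URLs are placeholders:

// full export: save the change token for the next run
SPExportSettings fullSettings = new SPExportSettings();
fullSettings.SiteUrl = "http://localhost:2000";
fullSettings.ExportMethod = SPExportMethodType.ExportAll;
fullSettings.FileLocation = @"c:\export\full";
fullSettings.FileCompression = false;

SPExport fullExport = new SPExport(fullSettings);
fullExport.Run();

System.IO.File.WriteAllText(@"c:\export\changetoken.txt", fullSettings.CurrentChangeToken);

// ... some time later: export only what changed since the saved token
string oldChangeToken = System.IO.File.ReadAllText(@"c:\export\changetoken.txt");

SPExportSettings incrementalSettings = new SPExportSettings();
incrementalSettings.SiteUrl = "http://localhost:2000";
incrementalSettings.ExportMethod = SPExportMethodType.ExportChanges;
incrementalSettings.ExportChangeToken = oldChangeToken;
incrementalSettings.FileLocation = @"c:\export\incremental";
incrementalSettings.FileCompression = false;

SPExport incrementalExport = new SPExport(incrementalSettings);
incrementalExport.Run();

// save the new token so the next incremental run can pick up from here
System.IO.File.WriteAllText(@"c:\export\changetoken.txt", incrementalSettings.CurrentChangeToken);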

 

Now that we have seen how to export items, Part 3 of this article series will cover the import part.

114 Comments


  1. Providing some real world samples for import After we managed to export content in part 2 lets now focus


  2. Stefan is back! J Stefan is in Redmond this week, and we had a chance to catch up! In fact, I had dinner


  3. Stefan Goßner taucht in einer vierteiligen Serie in die Tiefen der SharePoint Deployment und Migration


  4. [ Part 1 – Part 2 – Part 3 – Part 4 ] Advanced content deployment scenarios Till now we have covered


  5. Fantastic, incredibly detailed, content on content deployment by Stefan


  6. [via Stefan Gossner ] This past week I was in Redmond teaching my WCM401 development class in an open


  7. Body: Great series of posts by Stefan on how to use the Content Migration API. Very timely for me as


  8. Hello,

    We can have full license for MOSS 2007 for intranet / CMS access for content management, though for the public-facing content delivery part we intend on using Windows SharePoint Services 3.0 (which comes w/ Windows Server 2003), and not have to purchase internet connector for MOSS 2007.

    To get a little deep into the scenario, for authoring and content management, MOSS 2007 with SQL Server 2005 as the database would be used. The content authored (we plan to) would eventually be exported on a different box with Win 2003 machine with WSS V3 and SQL Server 2005 installed. Once this is done we can currently think of the below mentioned scenarios to publish our site –

    –         Write a custom engine that pulls out content from the database (using APIs, if provided by WSS 3.0) on every hit to the page.

    –         Use Out of the box features provided by WSS V3 to pull out content from the database on every hit to the page.

    I would really appreciate if you could give us some insight into the feasibility of the approaches that we are intending to take up.

    Regards

    Sachin

    sachsharma@sapient.com


  9. Hi Sachin,

    I don’t think this method will be practically useful. Without MOSS you will not have the page layout concept for the pages. So you will only see the fields.

    You will not be able to influence how the fields are arranged in the page and within a master page.

    Cheers,

    Stefan


  10. 今天凌晨加班的时候偶然翻到 Stefan Gossner 的这几篇文章,强烈推荐给大家: Deep Dive into the SharePoint Content Deployment and Migration


  11. Hi,

    Thank you for these great articles – they gave me a kickstart on content deployment at the right time!

    I have a scenario where we need to move contents from lists on developer servers, to test and further on to production. The site structure is not neccessarily the same on the different environments, so the sample for importing in a different structure is a great starting point.

    A note on incremental import should be said – it is not supported on lists – only sites and webs. I get around this by using an importhandler that checks the imported ListItem, and if it already exists (checked by CAML query on list-webservice) is deleted again. I find this approach easier to code compared to manipulating the manifest.xml file.

    I think the performance is a bit problematic. Exporting 196 ListItems take in average 10 minutes. I fear the time, when we get 1-2000 items in the list. I am currently looking into ways to improve performance and will be investigating the usage of indexed columns. Any hints are welcome..

    Regards

    Flemming


  12. Great set of articles by Stefan Goßner : Deep Dive Into the SharePoint Content Deployment…


  13. MOSS / SharePoint Content Deployment


    I’m trying to export my "Pages" library with this code:

    SPSite site = new SPSite("http://vm-spwf");

    SPWeb web = site.OpenWeb();

    SPList list = web.Lists["Pages"];

    SPExportObject exportObject = new SPExportObject();

    exportObject.Id = list.ID;

    exportObject.Type = SPDeploymentObjectType.List;

    SPExportSettings settings = new SPExportSettings();

    settings.SiteUrl = "http://vm-spwf";

    settings.ExportMethod = SPExportMethodType.ExportAll;

    settings.FileLocation = @"c:export.1";

    settings.FileCompression = false;

    settings.ExcludeDependencies = false;

    settings.ExportObjects.Add(exportObject);

    SPExport export = new SPExport(settings);

    export.Run();

    web.Close();

    site.Close();

    I received the following Exception:

    Violation of PRIMARY KEY constraint ‘PK__#ExportObjects____73DFBB75’. Cannot insert duplicate key in object ‘dbo.#ExportObjects’.

    The statement has been terminated.

    Where is the problem ?

    Regards

    Tihomir Ignatov


  15. Hi Tihomir,

    please ensure that the latest security fixes for WSS and MOSS are installed. If this does not fix the problem, please open a support case to get this analyzed.

    Cheers,

    Stefan


  16. Hi Stefan,

    I’ve successfully written some tools to run site exports and imports.  Thanks for the examples, they worked well.  I am now trying to write some code to export a single page layout with the corresponding content type.  I was hoping that exporting the page layout would grab the content type as a dependency but this doesn’t seem to be the case.  Since content types are SPContentType objects and do not have GUID IDs, I am having real trouble writing the export code for a content type.  Any chance you can add a sample on this?  

    Thanks!

    Joe


  17. Hi Joe,

    sorry that is not possible this way.

    You should handle content types using custom features.

    Cheers,

    Stefan


  18. Hello again.

     Will the Import/Export functions via the API take care of this situation:

    1.  I have a test site called

       http://mysitetest.com

    2.  I’d like to import the export package into a site called http://mysite.com

       The OOTB tools would create a site called:

       http://mysitetest/mysite.com, which I do not want.  Again, will the API handle this situation?  I need to have a proof of concept done by next Wednesday.  Thanks!


  19. Hi Marees,

    export import does not allow to rename an item. Only to export and import it to different locations.

    When exporting
    http://servername:6000/Test then the web Test and most likely the items underneath are exported.

    When importing this package to a different location you can decide where to place the exported objects. That is what you did. You decided to import at a000123 rather than at the root.

    But still all items in the package (including the exported Test web) will be imported here.

    I assume what you are looking for is to export all items inside the Test web and then import them into the a000123 web.

    If this is what you are looking for then you need to export only the included items (e.g. the document libraries and lists) and child webs and not the Test web itself.

    Then you can decide that all the items in the export package should go below the a000123 web.

    Cheers,

    Stefan


  20. Thanks Stefan for the quick response.

    my requirement:

    Export "http:\servername:6000Test" site using SPExport object.

    When i try to import this "Test" site to other webapplication "http:\servername:6001g000123"

    .Its throwing exception "g000123test" is not found in the server.

    note: there is no site g000123 in this web app. "Test" site should be created in the name of g000123.

    I need the imported site url as http:\servername:6001g000123 not as

    http:\servername:6001g000123Test

    so, i exported "Test" site using SPExport object and then imported the site using stsadm command.

    stsadm -o import -url
    http://servename:6001/g000123 -filename export.cab

    (note:there is no site g000123 in this web app)

    now the "Test" site is created in the name of g000123.

    Is this is right way to do it?

    How this stsadm import command did it?.

    how to do this using API?

    I used SPExport object because it supports exclude subsites option  but stsadm export command does not support this, it export all sites and it subsites.

    Please help me.

    Marees


  21. Hi Marees,

    again: you cannot rename items using the export/import method.

    Please do not export the test web. Only export the items inside.

    Cheers,

    Stefan


  22. Stefan,

    Thanks for reply.

    How to export only the items inside the TestWeb ?

    -marees


  23. Hi Marees,

    you need to enumerate them and add them to the ExportObjects collection one by one.

    Cheers,

    Stefan
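    For illustration, a minimal sketch of this enumeration approach could look like the following. It assumes the items to export are the lists and the direct child webs of the Test web; the server name and URLs are placeholders:

    SPSite site = new SPSite("http://servername:6000");
    SPWeb testWeb = site.OpenWeb("/Test");

    SPExportSettings settings = new SPExportSettings();
    settings.SiteUrl = "http://servername:6000";
    settings.ExportMethod = SPExportMethodType.ExportAll;
    settings.FileLocation = @"c:\export";
    settings.FileCompression = false;

    // add every list of the Test web individually
    foreach (SPList list in testWeb.Lists)
    {
        SPExportObject listExport = new SPExportObject();
        listExport.Id = list.ID;
        listExport.Type = SPDeploymentObjectType.List;
        settings.ExportObjects.Add(listExport);
    }

    // add every direct child web, including all of its content
    foreach (SPWeb childWeb in testWeb.Webs)
    {
        SPExportObject webExport = new SPExportObject();
        webExport.Id = childWeb.ID;
        webExport.Type = SPDeploymentObjectType.Web;
        webExport.IncludeDescendants = SPIncludeDescendants.All;
        settings.ExportObjects.Add(webExport);
        childWeb.Dispose();
    }

    SPExport export = new SPExport(settings);
    export.Run();

    testWeb.Dispose();
    site.Dispose();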


  24. I’m having trouble finding a solution to deploy Sharepoint artifacts (master pages, page layouts) WITHOUT content. We’re in a situation where we’re about to deploy a site to a client’s web environment; they’ll be making changes to content, but we’ll no doubt be making changes to the master pages, page layouts, etc over the coming months and we’re not wanting to overwrite the content by using the backup/restore method. Are the import/export methods either via the command line or through the API capable of acheiving this?


  25. Hi Sammuello,

    these things should be deployed using features – not using content deployment.

    Cheers,

    Stefan


  26. Thanks Stefan – appreciate the quick response. I’m now going down the features route. This probably isn’t the thread to discuss this in detail, but do you happen to know if there are any tools in existence to automatically package up artifacts into a feature, without manaually having to extract the files & describe them in an xml document?

    Cheers,

    Sam


  27. Hi Sam,

    sorry haven’t looked into this.

    Cheers,

    Stefan


  28. Hello Stefan,

    Great series of articles, thanks.

    For incremental updates where is the change token stored? i.e. is it possible to find out what the change token is following the last time a central admin incremental content deployment job ran?

    Also, when writing a custom incremental export where should the value of settings.CurrentChangeToken be stored? When executing the next incremental job the token must be picked up from somewhere?

    Thanks,

    Chris


  29. Hi Chris,

    you would need to keep the change token returned from the earlier run yourself.

    You cannot query it.

    That is what content deployment does as well. It stores the change token in the configuration settings of the deployment job.

    Cheers,

    Stefan


  30. Thanks Stefan,

    That makes sense. So presumably in the case of the content deployment job you can get hold of it through the API and access the settings and change token etc?

    Cheers,

    Chris


  31. That is correct.

    E.g. using the following code:

    ContentDeploymentPathCollection deployPaths = ContentDeploymentPath.GetAllPaths();

    foreach (ContentDeploymentPath path in deployPaths)
    {
        foreach (ContentDeploymentJob job in path.Jobs)
        {
            foreach (SPExportObject o in job.ExportObjects)
            {
                Console.WriteLine("ExportChangeToken=" + o.ExportChangeToken);
            }
        }
    }


  32. Stefan,

    We’ve been having problems pushing content through central admin, timeouts etc. I believe this is because the site is around 35 gigabytes. We can create MANY jobs that each define a few webs, and then everything works. But pushing the entire site collection "/" has been a no go. So, rather than having to define all these jobs I was thinking I could just write some code to do an initial full content export. Then copy and paste those cab files and reimport on the other farm.

    Then what I would *like* to be able to do is keep the one job specified by CA but just push changes. I expect however that this will not work because CA will not have my change token and so it will just try to perform a full content push.

    So, I guess my question is, is it possible for CA to know about the content that I pushed through the API?


  33. Hi Tim,

    your assumption is correct. The Central Admin will not know about this.

    It might be possible to adjust the Change Token in the deployment job using the API but I haven’t tried this.

    Cheers,

    Stefan


  34. Stefan Goßner posted a great series of posts about the Content Migration API (formerly known as PRIME


  35. [ Part 1 – Part 2 – Part 3 – Part 4 – Part 5 – Part 6 ] Requirements for a successful content deployment


  36. Hi Stefan,

    Great series of articles.

    Do you know if it is possible to export/import workflows created with SharePoint Designer using the API?

    stsadm import/export seems to do it (but loses the association to the list/doc lib beacuse of the new guids) so I would have thought the API would also support this.

    Thanks,

    Adam


  37. Hi Adam,

    honestly I haven’t tried it. If it is a GUID related thing it should work when using RetainObjectIdentity on import which preserves the GUIDs.

    Did you try this?

    Cheers,

    Stefan


  38. I’m having some problems while trying to create a tool to copy list items to another list:

    a) When I export the items the file RootObjectMap.xml (which contains the items I can reparent) is empty, so I can’t reparent the items (I export the items one by one adding them to the Export objects collection). It happens only with some lists, with others the items are orphaned as I would like… ¿Any ideas?

    b) When an item with more than one version is imported the last version has the correct modified date, but all the other versions have the time when the import is run. I seems that is a bug… any workarround?

    Thanks!


  39. Hi Enrique,

    a) this will happen if you used the incremental export. You cannot remap files if you used this option.

    b) I haven’t heard about this. Please open a support case for this.

    Cheers,

    Stefan


  40. Thanks for the quick answer, but I specify SPExportMethodType.ExportAll as ExportMethod.

    If I run the same code against two different lists I don’t get consistent results… can I be missing something?


  41. Hi Enrique,

    I haven’t see this.

    Cheers,

    Stefan


  42. SharePoint Site vs. Web exports/imports and Custom Layout Pages


  43. Hi Stefan,

    Can we increase the cab file size.

    By default it breaks the big files because of the 25 mb default.

    Thanks

    Ajay


  44. Hi,

    Curious and urgent ! you don’t talk about the ExcludeChildrens property. I am confused between IncludeDescendants and ExcludeChildrens properties, what’s exactly the difference between descendants and childrens ?

    The MSDN do not explain that, so you are the last hope 😉

    Thanks,


  45. Hi Elhadi,

    if you check above you will find that I have explained the two properties. The IncludeDescendants allows to decide whether referenced items like page layouts and images (which can exist in different sites and libraries) are exported together with the referencing page or not.

    The ExcludeChildren allows to decide whether to exclude child galleries (meaning galleries inside the current web) or not.

    So they are completely different.

    Cheers,

    Stefan


  46. Hi Stefan,

    I am really worried, in the article you explain the properties SPExportSettings.ExcludeDependencies and SPExportObject.IncludeDescendants, but not the SPExportObject.ExcludeChildren.

    I think also that you in your last post you confused the definition of IncludeDescendants with ExcludeDependencies.

    I’m really embarrassed to insist like that but please verify

    Thanks,


  47. Hi Elhadi,

    sorry for this! You are right. What happens when using ExcludeChildren = true is that it sets IncludeDescendants to SPIncludeDescendants.Content; if you set it to false it will set IncludeDescendants to SPIncludeDescendants.All.

    So this is more or less an option to use the opposite logic. You should not mix the use of IncludeDescendants with ExcludeChildren.

    Cheers,

    Stefan


  48. Dear Stefan,

     Does incremental update support SPList? If not, is there any suggestion to implement it? Thanks!


  49. Hi Frankie,

    incremental update also supports SPList and SPListItem objects.

    Cheers,

    Stefan


  50. Dear Stefan,

     But when i used SPExportMethodType.ExportChanges to export a specify list changes, i got below errors.

    Microsoft.SharePoint.SPException: Incremental export is only supported for SPSite and SPWeb objects.

      at Microsoft.SharePoint.Deployment.SPExportSettings.Validate()

      at Microsoft.SharePoint.Deployment.SPExport.InitializeExport()

      at Microsoft.SharePoint.Deployment.SPExport.Run()

      at WSSMigration.ContentMigration.ExportList(String SiteURL, String sWeb, String sListName, String exportFileLocation, Boolean bCompress, String strToken)

      at WSSMigration.Program.Main(String[] args)

     Please give advise. Many thanks!


  51. Hi Frankie,

    sorry I misunderstood you. I thought you are exporting a site or web and would like to know if the included lists will be exported incrementally.

    It seems you have defined a SPList object as SPExportObject. Here the incremental method does not work.

    Cheers,

    Stefan


  52. Hi Stefan,

    I have read your great post. Thanks

    I have a situation. I would like to copy all content from a source site to a destination site but want to just exclude the custom lists from the source site.

    Any idea how i can achieve this?

    Appreciate your help.

    Thanks,

    Freda


  53. Hi Freda,

    that cannot be done out of the box. I think the easiest way would be to export/import everything and afterwards delete the unwanted content using a script.

    Alternatively you could modify the content of the manifest.xml file after export and remove all references to the unwanted content which will ensure that the content is not imported.

    Cheers,

    Stefan


  54. Thanks for the ideas Stefan,

    I will be trying this today…will keep you updated.

    Thanks,

    Freda


  55. Hi Stefan,

    As suggested by you I did the following…

    1)From the 12 hive, ran the stsadm -o export command.This gave a .cab file in BIN folder.

    2)Extracted the contents using winrar and Got a folder with .dat files and .xml files

    3)Opened the manifest.xml file and removed unwanted content.Saved the file.

    5)Since I dint know how to cab it back again. I just re-compressed the same folder using winrar and renamed it to .cab

    6)I then ran an import command from same Bin folder, but unfortunately ended up getting the following error:

    "FatalError: Failed to read package file.

    Error: Unable to read cabinet info from C:Program FilesCommon FilesMicrosoft

    Sharedweb server extensions12BINAuthoringExport.cab

    Start Time: 1/7/2009 3:28:13 AM.

    Progress: Initializing Import.

    Error: Unable to read cabinet info from C:AuthoringExport.cab

    FatalError: Failed to read package file.

      at Microsoft.SharePoint.Deployment.ImportDataFileManager.Uncompress(SPRequest

    request)

      at Microsoft.SharePoint.Deployment.SPImport.Run()

    *** Inner exception:

    Unable to read cabinet info from C:AuthoringExport.cab

      at Microsoft.SharePoint.Library.SPRequest.ExtractFilesFromCabinet(String bstr

    TempDirectory, String bstrCabFileLocation)

      at Microsoft.SharePoint.Deployment.ImportDataFileManager.<>c__DisplayClass2.<

    Uncompress>b__0()

      at Microsoft.SharePoint.SPSecurity.CodeToRunElevatedWrapper(Object state)

      at Microsoft.SharePoint.SPSecurity.<>c__DisplayClass4.<RunWithElevatedPrivile

    ges>b__2()

      at Microsoft.SharePoint.Utilities.SecurityContext.RunAsProcess(CodeToRunEleva

    ted secureCode)

      at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(WaitCallback sec

    ureCode, Object param)

      at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(CodeToRunElevate

    d secureCode)

      at Microsoft.SharePoint.Deployment.ImportDataFileManager.Uncompress(SPRequest

    request)

    Progress: Import Completed.

    Finish Time: 1/7/2009 3:28:14 AM.

    Completed with 0 warnings.

    Completed with 2 errors.

    Log file generated:

           C:AuthoringExport.cab.import.log

    Failed to read package file."

    I dont know if I am doing something wrong?

    Inputs from you are appreciated!

    Thanks,

    Freda Grace


  56. A rar is not a cab.

    Both use different compression. Winrar can decompress many different compression types but it will always create a rar file.

    But you don’t have to compress at all.

    Or uncompress.

    STSADM -o export/import has a parameter which allows to use uncompressed data: -nofilecompression

    Cheers,

    Stefan


    Yes, I used the same parameter immediately after posting the comment to you, and it worked!

    The Custom list was not created in the destination site as was required,but I feel this manual procedure of making changes in the manifest.xml file is quite risky. Also how can we achieve it if we have number of such custom lists to remove. Because for removing content related to just one list (for testing), I had to remove atleast 5 xml tags.

    Can you suggest any better way? Or any way in which we can atleast automate this process?

    Thanks,

    Freda Grace


  58. Hi Freda,

    it is indeed risky. And if the content in the DB would not be correct afterwards it would be your responsibility to correct it.

    If you would like to go this route I would suggest to write a tool which does the cleanup of the manifest.

    Alternatively you would need to delete the lists after import.

    Cheers,

    Stefan


  59. Yes,

    I guess your right…was just wondering if I write a tool to do the clean up, what about the dependent objects? Dont we have to clean them up too? If yes, then is there a way that we can find out dependent objects of a particular list say for example which workflow is being run on it, etc?

    Thanks for your time..

    Freda

    Regards,

    Freda Grace


  60. Hi Freda,

    yes you need to cleanup all dependent objects and also all references in other objects to the removed ones.

    It depends on the complexity of your site structure how much cleanup would be required.

    Cheers,

    Stefan


  61. Hi!

    I’m looking for a code to move pages between webs ( for archival )

    most of the codes use export/import API, but this only copy the page instead of moving it!

    any idea? should i delete the page after export/import? there is no elegant way?

    thank you


  62. Hi Judith,

    after the successful import you have to delete the old page.

    That is what happens when moving in manage content and structure as well.

    Cheers,

    Stefan


  63. In the past I have released several blogs about the various problems that can occur with Content Deployment.


  64. Hi,

    I am using two sharepoint sites one as primary and other as secondary, and these sites are being used to store video’s. These video’s are attached as attachments in a list. I am using Windows Sharepoint Services. Along with that I am using External Blob storage to actually store these video’s.

    I am using Content Deployment API to synchronize data b/w the two sites. I am always using Retain ID’s to ensure no duplicate data is being created.

    My problem is that Content Deployment API while importing List Items updates the meta-data but doesn’t update the attachment if the attachment name remains same , even if the actual attachment has changed. Like in my case, the attachment size and duration has changed but name remains same, yet it doesn’t replaces it.

    Is there anyway where i can force the replacement of attachments. Any other suggestion would be welcome.

    Regards

    Rahul Singh


  65. Hi Rahul,

    I haven’t seen this. Please open a support case with Microsoft to get this analyzed.

    Cheers,

    Stefan


  66. Dear Stefan,

     Currently, We are testing our deployment program within one stand alone MOSS server (As export and import server). But we met below errors. Could you give advise? Many thanks!

    [3/18/2009 9:41:54 AM]: Progress: Importing File WorkflowTasks/NewForm.aspx.

    [3/18/2009 9:42:09 AM]: Progress: Importing List fpdatasources.

    [3/18/2009 9:42:10 AM]: Progress: Importing List List Template Gallery.

    [3/18/2009 9:42:11 AM]: Progress: Importing List Master Page Gallery.

    [3/18/2009 9:42:12 AM]: FatalError: Value does not fall within the expected range.

      at Microsoft.SharePoint.SPFileCollection.get_Item(String urlOfFile)

      at Microsoft.SharePoint.SPContentTypeCollection.Add(SPContentType contentType)

      at Microsoft.SharePoint.Deployment.ContentTypeSerializer.CreateContentType(SPContentType sourceContentType)

      at Microsoft.SharePoint.Deployment.ContentTypeSerializer.ProcessContentType(SPContentType sourceContentType, String contentTypeXml, ImportObjectManager importObjectManager, Boolean IsParentSystemObject)

    Best Regards,

    Frankie


  67. Hi Stefan,

    We were running a full content deployment between two farms, but while it was importing objects there was a connection loss between database and app servers which caused the job to go into running state.

    I have tried to restart the timer job on both the authoring and publishing farms, but it didn’t helped.

    Can you suggest how to fix this.

    Thanks,

    Soumyadev


  68. Hi Soumyadev,

    on source and target please do the following:

    Open Central-Admin – Operations – Timer Job Status.

    Choose "Service" for the "View"

    Select "Central Administration" for the Service

    Can you find a Content Deployment timer job?

    If not, then the job is indeed no longer running.

    In this situation you can change the status to Failed manually to allow restart:

    Browse to http://central-admin-url/Lists/Content%20Deployment%20Jobs

    choose the list item related to the job that shows the incorrect status

    edit the job and change the "LastStatus" field to "2" which means Failed.

    Now you can restart the job manually using the manage content deployment paths and jobs page.

    Cheers,

    Stefan


  69. Hi Stefan,

    First of all thanks a lot for your great sharepoint articles.

    I’ve got a very annoying issue with content deployment.

    I’m using Content deployment to replicate publishing pages.

    I was hoping to benefit from "ExcludeDependencies = false" to retrieve images and such. But i’m starting to think that was not a good idea…

    I had a first issue with SummaryLinks dependencies not included in export. I fixed that exporting them with custom code then fixing Summary links after import.

    Now i have the *BIG* issue.

    Sometimes, didnt find why yet, while i’m exporting only one page, Content deployment include RootWeb as a dependency… 90 MB today on a test server, will surely become a lot more in production environment.

    Anyway what is the reason, RootWeb should never be exported (and hopefully because of a different destination template, the import fails)

    In fact i would like to exclude any SPWeb from export.

    But i found no way to control Dependencies export except with the ExcludeDependencies Flag…

    I’ve spent some time digging the API with reflector, sealed class and internal methods prevent me to override that bad behavior.

    I hope i missed something… Or i’ll have to rewrite my code without Dependencies feature of Content Deployment = i will forget using Content deployment and probably some SPListem.Copy or worse custom Export / Import code (like OCD Export Page)…

    Thanks for your help and sorry for my english, which is not my native language… (hello from paris 😉


  70. Hi Vincent,

    the rootweb is vital for the publishing feature.

    It holds the page layouts, the reusable content, …

    so every publishing page will have a dependency to at least the page layout in the root web.

    Cheers,

    Stefan


  71. Hi Stefan,

    Of course RootWeb is vital, and 99% of times, content deployment dependencies mechanism only adds one master page or Page Layout in my export packages…

    But the last 1% brings the FULL RootWeb, i mean all Lists, masterpages and so on… which is not acceptable, since i bet the RootWeb will be growing fast in production and since i have many transfers to be done…

    I just want to copy only few pages… with the benefits of that dependency mechanism…

    Anyway, thanks a lot for your (very) fast answer.

    /me taking the "full custom code road"…


  72. Me again,

    Since i cant control the exported objects by dependencies, could it be possible to control objects to import, i mean excluding them "on the fly".

    Nothing found in SPImportSettings… (Would like to exlude all SPWebs)

    And on SPImport.Started,  SPIMportObjectCollection is not setable nor mutable…

    Some trick playing with SPImportObject.TargetParentUrl + ignoring errors maybe…? :/

    Vincent.


  73. Hi Vincent,

    you would need to modify the manifest file and remove the items you would like to exclude before starting the import.

    You could do this (e.g.) in the Started event handler. See part 4 for details.

    Cheers,

    Stefan


  74. Thanks again for your answer stefan,

    That’s what i will surely do soon… As a temp workaround.

    After studying my export 10MB (!) manifest file i come to the conclusion that ins ome cases, exporting one single page with dependencies brings not only the RootWeb but the full site collection…

    /me thinking of opening a case @ MS

    Vincent.


  75. Hi Vincent,

    a case might indeed be the best way to isolate this.

    Cheers,

    Stefan


  76. Hi Stefan, your blog here has saved me a few times – thanks for writing this good stuff (and even more for formalising with the Sharepoint product team).

    I’m interested in an incremental migration situation where various SPLists have had items modified or added, but the ExportChangeToken was not saved [1].

    Is it reasonable to use the GetChanges() method of a change-exportable object [2], locate the change of interest [3], and assign:

    anExportObject.ExportChangeToken = anSPChange.ChangeToken.toString()

    ?

    [1] In this case, the changes were made in an SPS 2003 database and the customer wants changes to be ‘re-migrated’

    [2] It puzzles me that SPList and SPListItem cannot be targets of an incremental export.  Do you have any insight as to why this is?

    [3] It would surely have helped if SPChangeQuery had properties for ‘Before’ and ‘After’ which took a DateTime instance.


  77. Hi David,

    yes you can use this method to deploy from a specific change.

    [2] for a list item it does not make sense as the target has to be a container. Regarding the list: it was a design decision to have the granularity on SPWeb level. For lists you can achieve this using an event receiver as outlined in part 3

    Cheers,

    Stefan


  78. If two lists are not identical(different column names) and I performed copy list item operation via API and Sharepoint UI(Collabration Portal>content and structure). The columns (which does not exist at the destination) in the source list are ADDED TO THE DESTINATION LIST but EditItem.aspx pages seems like corrupted and only show the newly added columnns to use/edit.

    Is it a bug? or What is the logic behind that?  


  79. Thanks so much for your help yesterday! One quick question – where in the manifest or any of the other files, can you change where the site will end? Also as a sub web I had to enter the name without the slash i.e. "Toolbox" is that common? Is it because its a sub-site and not a web?

    Thanks again.

    Sara


  80. Hi Sara,

    there is no end for the site. Each object is separate. There is no hierarchical ordering in the manifest file.

    Not sure what you mean with the slash.

    Cheers,

    Stefan


  81. What I meant is, if there is a way to move the site to a different URL completely via the manifest. I saw your answer above, where you recommend the movement of objects, rather than the sites.

    Thanks 🙂 (and cheers!!)


  82. Hi Sara,

    you would need to adjust the URL of the site and all included lists and items and so on.

    Its better to export the site as is and then import it at the desired location using the API properties.

    Cheers,

    Stefan


  83. Dear Stefan,

    I am using spexport/import such that I can simulate replication between 2 sites, (testing is current on a seperate sub-web).

    Export is all content within a spWeb, and the Initial Export where ExportAll is specified, and Import is successful.  (Identities not retained)

    The next export, only use ExportChanges, with the following changes,

    Add new list, with or without items: Import success

    Add item in an exisiting list: Failed to import, as the TargetParentUrl is wrong.  I attented to set this using the OnImportStarted Event, but i get an Invalid File error.

    In additional to this error, I noticed that when Retain Identities = false, Overwrite Changes does not work,  new items are added.

    Can you please advise whether using SPExport/Import is a feasible way to create a replication module.  Furthermore, is there anyway to access the serialised objects, such that I can deserialise manually from the objects?

    Many thanks in advance.

    William


  84. Hi William,

    please read all the other parts of this series. It answers the questions you raised.

    Cheers,

    Stefan


  85. Hi Stefan,

    Many thanks for your reply.  Am I correct in reading that there isn’t really a solution to resolve the Overwrite migrate issues when Retain Identities = false, other than setting this to true.  But for this, guid/names must be unique.

    And if objects need to be different in location for migtation, another tool needs to be written.  I’d already thought of this, but would rather not do it that way.  Just hoped there was something in the API would could be used, or rather we could access the serialised objects and re-write part of the api.

    Many thanks for your reply.

    William


  86. Hi William,

    that is correct. If overwrite is required you have to use RetainObjectIdentity as this is the only method that allows checking if the identical item already exists and has to be replaced or not.

    Cheers,

    Stefan


  87. Hi Stefan,

    I am in a problem. In my scenario, I want to export all sharepoint groups, users and permissions. I can export them if I am exporting and importing complete site collection but if I am attempting to export only one list in a site then sharepoint groups and users are exported and imported but the permissions of them are not exported which will eventually cause them to fail.

    My requirement is so that i want to export one list(or limited number of items but not complete site) only and all the Sharepoint groups, users and permissions should be exported and imported with that.

    Any help is highly appreciated.

    Thanks,

    Vijay


  88. Hi Vijay,

    with the hotfix level available this is not possible.

    A fix to get this working is currently being investigated.

    Cheers,

    Stefan


  89. I have a master list and an archive list; is there a way of moving an item from the master list to the archive list and maintain the same item ID in the archive list. I need to keep the same item ID as this is the primary key for task items related to the master item.


  90. Hi Sham,

    no that is not possible. I would recommend to use a custom field to store the information that should be persisted.

    Cheers,

    Stefan


  91. Is it possible to export a site and exclude the descendants?  I’d like to export just the site definition without the webs, lists and contents.  I’ve set the SPExportObject.IncludeDescendants to SPIncludeDescendants.None but it still traverses the entire site for export.

    This is the code i’m using to perform the export.  Do i have some setting wrong?  

    SPExportSettings settings = new SPExportSettings();

    settings.SiteUrl = site.Url;

    settings.ExportMethod = SPExportMethodType.ExportAll;

    settings.FileLocation = fileLocation;

    settings.FileCompression = false;

    settings.CommandLineVerbose = true;

    settings.ExcludeDependencies = true;

    settings.OverwriteExistingDataFile = true;

    SPExportObject exportObject = new SPExportObject();

    exportObject.Id = site.ID;

    exportObject.Type = SPDeploymentObjectType.Site;

    exportObject.IncludeDescendants = SPIncludeDescendants.None;

    exportObject.ExcludeChildren = false;

    settings.ExportObjects.Add(exportObject);

    SPExport export = new SPExport(settings);

    Thanks


  92. Hi Henry,

    that does not work for SPSite objects.

    SPSite is just a container for all the SPWeb objects.

    You could do it for the SPWeb object of the rootweb. That will ensure that no child webs are imported.

    Cheers,

    Stefan


  93. I am getting SQL deadlock error during export. In one of your article, you mentioned about deadlock error on import, does it possible have deadlock error on export? I also tested with stsadm and export completed successfully. In my code basically I am doing same thing what the stsadm does but I got an error. Thank you very much.

    Here is the my code:

    SPExportSettings settings = new SPExportSettings();

    settings.SiteUrl = web.Url;

    settings.ExportMethod = SPExportMethodType.ExportAll;

    settings.FileLocation = fileLocation;

    settings.FileCompression = false;

    settings.CommandLineVerbose = false;

    settings.ExcludeDependencies = true;

    settings.OverwriteExistingDataFile = true;

    SPExportObject exportObject = new SPExportObject();

    exportObject.Id = web.ID;

    exportObject.Type = SPDeploymentObjectType.Web;

    exportObject.IncludeDescendants = SPIncludeDescendants.All;

    settings.ExportObjects.Add(exportObject);

    SPExport export = new SPExport(settings);

    Error Message:

    [3/19/2010 2:21:29 PM]: FatalError: Transaction (Process ID 612) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

      at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)

      at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)

      at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)

      at System.Data.SqlClient.SqlDataReader.HasMoreRows()

      at System.Data.SqlClient.SqlDataReader.ReadInternal(Boolean setTimeout)

      at System.Data.Common.DataAdapter.FillLoadDataRow(SchemaMapping mapping)

      at System.Data.Common.DataAdapter.FillFromReader(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 startRecord, Int32 maxRecords, DataColumn parentChapterColumn, Object parentChapterValue)

      at System.Data.Common.DataAdapter.Fill(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)

      at System.Data.Common.LoadAdapter.FillFromReader(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)

      at System.Data.DataSet.Load(IDataReader reader, LoadOption loadOption, FillErrorEventHandler errorHandler, DataTable[] tables)

      at System.Data.DataSet.Load(IDataReader reader, LoadOption loadOption, String[] tables)

      at Microsoft.SharePoint.Deployment.ListItemObjectHelper.GetNextBatch()

      at Microsoft.SharePoint.Deployment.ObjectHelper.RetrieveDataFromDatabase(ExportObject exportObject)

      at Microsoft.SharePoint.Deployment.ListItemObjectHelper.RetrieveData(ExportObject exportObject)

      at Microsoft.SharePoint.Deployment.ExportObjectManager.GetObjectData(ExportObject exportObject)

      at Microsoft.SharePoint.Deployment.ExportObjectManager.MoveNext()

      at Microsoft.SharePoint.Deployment.ExportObjectManager.ExportObjectEnumerator.MoveNext()

      at Microsoft.SharePoint.Deployment.SPExport.SerializeObjects()

      at Microsoft.SharePoint.Deployment.SPExport.Run()


  94. Hi Mark,

    sounds as if there are either multiple exports running in parallel or that there are authoring activities interacting with the export.

    Cheers,

    Stefan


  95. Hi Stefan,

    I know that there is only one export process and no parallel export is running. What do you mean "authoring activities", could you explain little bit more? Thanks


  96. Hi Mark,

    users changing items, documents, approving stuff, …

    Cheers,

    Stefan


  97. Hi Stefan,

    I tested in different days and times even out of office hours (once I have tried 11 pm) but always it gives same error message and whenever I try with stsadm export command it works.


  98. Hi Mark,

    I haven’t seen this. If you tested this on the lastest patch level I would suggest to open a support case to get it analyzed.

    Cheers,

    Stefan


  99. Hi Stefan,

    Is incremental export supported at document library or list level?  I tried it but got an error.

    Thanks,

    Nick


  100. Hi Nick,

    no this is not supported. You can use incremental only for SPSite and SPWeb.

    Cheers,

    Stefan


  101. Hi Stefan,

    I do have a specific requirement of excluding hidden libraries like "Master Page Gallery", "Web Part Gallery" while exporting whole site collection including all descendents in the content migration package. I'm using the SharePoint Content Deployment Wizard by Chris O'Brien.

    Would SPExportObject.ExcludeChildren be an useful property to achieve this ? Or any other suggestions / comments ?

    Final result should be Content Migration Package containing Entire Site Collection with all content except these hidden libraries and its corresponding list items / contents.

    Any help would be much appreciated.

    Thanks,

    Vishwajit


  102. Hi Vishwajit,

    the content deployment and migration API does not directly support this.

    To achieve what you are looking for you would need to modify the manifest.xml file between export and import. Here you need to remove the SPObjects for all unwanted libraries, all folders in this library and all documents in this library.

    If you can write a custom tool for this I would suggest to perform the modification in the SPImport.Started event handler which allows you to access the uncompressed manifest files.

    Cheers,

    Stefan


  103. Thanks for the nice article.

    I have a question: I Need to create a timer job OR some sort of utility application OR event handler inorder to run this code and achieve this content migration. right?

    I am new to sharepoint and my requirement is to move bulk list items, in a incremental fashion from one list to archival list on same or different server.

    Any suggestion/help is appreciated..

    Thanks

    Aj


  104. Hi Ajay,

    that is correct.

    Utility or timerjob is recommended.

    Cheers,

    Stefan


  105. Hi i have exported a site successfully from Moss 2007… but i got feature OffWFCommon can not be found error while Importing that to Sharepoint Server 2010…

    is there any solution to skip such error and continue importing?


  106. Hi Aravind,

    you cannot import a package into SP2010 that was exported from MOSS.

    Export/Import only works between servers/farms of the same product version.

    Cheers,

    Stefan


  107. Hi Stefan,

    I wrote the code to allow the copy item (as your article). This worked fine with full access control user, but for other user, it did not work (for ex: testuser)

    Using sharepoint interface, I saw the "copy to …" option with the testuser.

    Do you know where are the exported files if we don't set the FileLocation in SPExportSetting?

    Thanks


  108. Hi Stephanie,

    import requires site collection administrator permission – also for the copy operation. You can retrieve the file location after the export from the FileLocation property if you did not specify it. The export operation will set it.

    Cheers,

    Stefan


  109. Hi Stefan,
    is it possible to migrate documents with versions by using these methods ??


  110. Hi Siva, yes that is possible.


  111. Hello Stefan,
    We’re doing the incremental content deployment on the site(MOSS2007), and the job has been running for a few years. But recently we found the content export sometimes took a long time (1 hour ~ 2 hours) even if there’s no content changes on the site, while it should be just a few minutes to complete in normal. Could you please advise how we can check the issue? Any log we can monitor on the server? Thanks.
    Best regards,
    Lili


    1. Hi Lili,
      please open a support case with Microsoft to get this analyzed.
      Cheers,
      Stefan

