Work items not migrating during release planning data import

We’re attempting to bulk import work items for release planning using the Azure DevOps REST API. The migration process completes without throwing errors, but work item relations aren’t being preserved correctly. Parent-child links between features and user stories are missing, and custom field mappings for our priority scoring system aren’t transferring. We’re also seeing attachment migration failures - about 40% of file attachments linked to requirements aren’t appearing in the target project.

Here’s the API call structure we’re using:

POST https://dev.azure.com/{org}/{project}/_apis/wit/workitems/${type}?api-version=7.0
Content-Type: application/json-patch+json

[
  {
    "op": "add",
    "path": "/fields/System.Title",
    "value": "Migrated Feature"
  }
]

Validation reports show successful imports, but when we open the release planning board, the hierarchy is completely flat. Has anyone dealt with relation preservation during bulk imports? We need to migrate 800+ work items with their complete relationship structure intact.
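For context, here is a minimal sketch of how we assemble each create call (Python, standard library only). The org, project, PAT handling, and the `Custom.PriorityScore` field reference name are placeholders for illustration, not our actual configuration; the body is a JSON Patch array sent with `Content-Type: application/json-patch+json`:

```python
import base64
import json
import urllib.request

def build_create_patch(fields):
    # Azure DevOps expects a JSON Patch *array* of operations,
    # not a single {"op": ...} object.
    return [
        {"op": "add", "path": f"/fields/{name}", "value": value}
        for name, value in fields.items()
    ]

def create_work_item(org, project, work_item_type, fields, pat):
    # Note the literal "$" before the type name in the route.
    url = (f"https://dev.azure.com/{org}/{project}/_apis/wit/"
           f"workitems/${work_item_type}?api-version=7.0")
    token = base64.b64encode(f":{pat}".encode()).decode()
    req = urllib.request.Request(
        url,
        data=json.dumps(build_create_patch(fields)).encode(),
        method="POST",
        headers={
            "Authorization": f"Basic {token}",
            # Required; plain application/json is rejected for patch docs.
            "Content-Type": "application/json-patch+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example patch document (Custom.PriorityScore is a hypothetical
# reference name standing in for our priority scoring field):
doc = build_create_patch({
    "System.Title": "Migrated Feature",
    "Custom.PriorityScore": 7,
})
```

Every item imports cleanly this way, which is why the flat hierarchy surprised us.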

Also, verify that your API token has the proper scopes. Work item relation operations require the ‘vso.work_write’ scope, while attachment uploads need ‘vso.work_write’ plus potentially blob storage permissions, depending on your organization’s security configuration. And check the validation reports more carefully: they can show success on field imports while burying warnings about skipped relations that are easy to miss in the summary view.

Exactly. Upload attachments via the attachments API endpoint, which returns a URL; then reference that URL in a subsequent relation operation on the work item. For parent-child links, add operations with path ‘/relations/-’ and specify the relation type: ‘System.LinkTypes.Hierarchy-Forward’ links from a parent down to a child, ‘System.LinkTypes.Hierarchy-Reverse’ links from a child up to its parent, and ‘System.LinkTypes.Related’ creates peer relations.
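To make that two-step flow concrete, a rough sketch of the patch operations involved (Python; the relation reference names are the documented Azure DevOps ones, but the helper names and the `attributes` payload shape are illustrative, so double-check against your api-version):

```python
# Documented relation reference names:
#   System.LinkTypes.Hierarchy-Forward  -> parent down to child
#   System.LinkTypes.Hierarchy-Reverse  -> child up to parent
#   AttachedFile                        -> work item to uploaded file

def add_relation_op(rel_type, target_url, attributes=None):
    """JSON Patch operation that appends one relation to a work item."""
    op = {
        "op": "add",
        "path": "/relations/-",
        "value": {"rel": rel_type, "url": target_url},
    }
    if attributes:
        op["value"]["attributes"] = attributes
    return op

# Step 1 happens first: POST the raw file bytes to
#   .../_apis/wit/attachments?fileName=<name>&api-version=7.0
# and read the "url" field out of the JSON response.
# Step 2 then PATCHes the work item with ops like these:

def link_attachment(attachment_url, filename):
    return add_relation_op("AttachedFile", attachment_url,
                           {"name": filename})

def link_parent(parent_work_item_url):
    # From the child's side, the parent link is Hierarchy-Reverse.
    return add_relation_op("System.LinkTypes.Hierarchy-Reverse",
                           parent_work_item_url)
```

One practical note for an 800+ item migration: create all items first while recording a source-ID to new-URL map, then do a second pass that emits the relation ops, so every parent URL exists before a child tries to link to it.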

Thanks for the pointer on field reference names. I checked our process template and found several custom fields using different internal identifiers. For attachments, are you suggesting a two-step process where we upload files to blob storage first, then link them?