Process mining event log import fails due to invalid timestamp format

I’m trying to import event logs into Process Mining for our order-to-cash analysis, but the import keeps failing with a timestamp format error. Our CSV file contains event data exported from SAP with timestamps in the format ‘DD.MM.YYYY HH:MM:SS’, but ServiceNow expects ISO 8601 format.

Here’s a sample of what we’re importing:


Case_ID,Activity,Timestamp,Resource
ORD-001,Create Order,15.11.2024 09:30:00,JohnD
ORD-001,Approve Order,15.11.2024 14:45:00,MaryS

The error message says ‘Invalid timestamp format in row 2’. This is blocking our process mining analysis - we have 50,000+ event records waiting to be imported. Do I need to preprocess the CSV before importing, or is there a way to configure the timestamp format in the event log import settings?

Before you go down the transform map route, check if your SAP export can be configured to output ISO 8601 timestamps directly. Many modern SAP versions support configurable export formats. That would be cleaner than doing transformation in ServiceNow. But if that’s not possible, the transform map approach Carlos mentioned is the right solution.

That’s disappointing. We receive these exports weekly from SAP, so manual preprocessing isn’t sustainable. Is there any way to automate the CSV preprocessing within ServiceNow itself? Maybe a transform map or import script that handles the timestamp conversion?

You need a robust solution that handles all three aspects of your event log import challenge. Here’s the complete approach:

Timestamp Format Conversion: Create a Transform Map specifically for your SAP event log imports. Navigate to System Import Sets > Create Transform Map and configure it for your target event log table. The critical part is the timestamp field mapping script:

// Transform map field script: convert 'DD.MM.YYYY HH:MM:SS' to ISO 8601
var sourceTimestamp = (source.timestamp + '').trim(); // adjust to your import set column name
var match = sourceTimestamp.match(/^(\d{2})\.(\d{2})\.(\d{4}) (\d{2}:\d{2}:\d{2})$/);
if (match) {
    // Reassemble as 'YYYY-MM-DD HH:MM:SS'
    answer = match[3] + '-' + match[2] + '-' + match[1] + ' ' + match[4];
} else {
    // Leave empty so malformed rows surface in the import log instead of loading silently
    answer = '';
}

CSV Preprocessing: While the transform map handles format conversion, you should implement preprocessing for data quality validation. Create a Scheduled Script Execution that runs before imports:

// Pseudocode - CSV preprocessing steps:
1. Read staged CSV from import set table
2. Validate required fields (Case_ID, Activity, Timestamp)
3. Check for duplicate events (same case + activity + timestamp)
4. Verify timestamp chronological order within each case
5. Flag anomalies in separate validation table
6. Mark clean records as 'ready_for_transform'

This preprocessing catches issues before they reach the event log import, preventing partial imports that corrupt your process mining analysis.
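The validation steps above can be sketched in plain JavaScript. This is a minimal standalone version of steps 2-4 (required fields, duplicates, chronological order); the function name `validateEvents` and the row shape are assumptions for illustration - in the instance this logic would run inside your Scheduled Script Execution against the import set table rather than against an in-memory array:

```javascript
// Sketch: validate parsed event rows before transformation.
// Each row: { caseId, activity, timestamp } with timestamp as 'DD.MM.YYYY HH:MM:SS'.
function parseTimestamp(ts) {
  var m = ts.match(/^(\d{2})\.(\d{2})\.(\d{4}) (\d{2}):(\d{2}):(\d{2})$/);
  if (!m) return null;
  return new Date(+m[3], +m[2] - 1, +m[1], +m[4], +m[5], +m[6]);
}

function validateEvents(rows) {
  var errors = [];
  var seen = {};          // duplicate detection: case + activity + timestamp
  var lastPerCase = {};   // chronological check within each case
  rows.forEach(function (row, i) {
    if (!row.caseId || !row.activity || !row.timestamp) {
      errors.push({ row: i, reason: 'missing required field' });
      return;
    }
    var when = parseTimestamp(row.timestamp);
    if (!when) {
      errors.push({ row: i, reason: 'invalid timestamp' });
      return;
    }
    var key = row.caseId + '|' + row.activity + '|' + row.timestamp;
    if (seen[key]) {
      errors.push({ row: i, reason: 'duplicate event' });
      return;
    }
    seen[key] = true;
    if (lastPerCase[row.caseId] && when < lastPerCase[row.caseId]) {
      errors.push({ row: i, reason: 'out of chronological order' });
    }
    lastPerCase[row.caseId] = when;
  });
  return errors; // empty array means the batch is ready for transform
}
```

Rows that come back in `errors` would be written to your validation table (step 5) while the rest are marked ready for transform (step 6).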

Event Log Import Configuration: For sustainable weekly imports from SAP, set up an automated import flow:

  1. Create an Import Set table (u_sap_event_staging) matching your CSV structure
  2. Configure a Scheduled Data Import that loads CSV files from a designated directory
  3. Apply your custom Transform Map with timestamp conversion
  4. Set up email notifications for import failures with detailed error logs

The key to handling 50,000+ records reliably is chunking the import. Modify your scheduled import to process records in batches of 5,000:

var batch = new GlideRecord('u_sap_event_staging');
batch.addQuery('state', 'pending');
batch.setLimit(5000);
batch.query();

while (batch.next()) {
  // Transform and import logic
  batch.state = 'processed';
  batch.update();
}

This prevents timeout issues with large imports and allows you to resume if there are failures.
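The same chunking pattern, stripped of the Glide API so it can be shown self-contained (`processInBatches` is a hypothetical helper; in the instance, the GlideRecord loop above against `u_sap_event_staging` plays this role):

```javascript
// Sketch of the batching pattern: process pending records in fixed-size
// chunks, marking each record as it completes so a failure mid-run leaves
// the remainder still 'pending' for the next scheduled run to resume.
function processInBatches(records, batchSize, handler) {
  var processed = 0;
  for (var start = 0; start < records.length; start += batchSize) {
    var batch = records.slice(start, start + batchSize);
    batch.forEach(function (record) {
      handler(record);             // transform-and-import step
      record.state = 'processed';  // a rerun skips records already in this state
      processed++;
    });
  }
  return processed;
}
```

Because progress is persisted per record (the `state` flag) rather than per run, resuming after a failure is just running the job again.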

Additional Recommendations:

  1. Validate the timestamp field post-transformation - for example, an onBefore transform script or business rule that rejects rows whose timestamp did not convert to valid ISO 8601
  2. Implement a rollback mechanism - if an import batch fails validation, automatically revert all records from that batch
  3. Build a dashboard showing import statistics (records processed, errors, processing time) for monitoring

For your immediate need, you can manually fix your current CSV using this Excel formula in a new column: =TEXT(DATEVALUE(MID(A2,7,4)&"-"&MID(A2,4,2)&"-"&LEFT(A2,2))&TIMEVALUE(MID(A2,12,8)),"yyyy-mm-dd hh:mm:ss") where A2 contains your original timestamp. Then import the corrected CSV while you implement the automated solution.

You can definitely automate this with a transform map. Create a custom transform map for your event log import and add a field map with a script that converts the timestamp format. Use JavaScript’s date parsing functions to read the DD.MM.YYYY format and output ISO 8601. I’ve done this for several clients importing from legacy systems.

I recommend using a combination approach. Set up a scheduled import that first stages the CSV data in a temporary table, then runs a background script to clean and transform timestamps before loading into the process mining event log table. This gives you more control over data quality validation and error handling compared to doing everything in a transform map.

ServiceNow Process Mining is strict about timestamp formats. You’ll need to preprocess your CSV to convert timestamps to ISO 8601 format (YYYY-MM-DD HH:MM:SS). I usually use Python or Excel formulas to transform the data before import. There’s no built-in format converter in the import wizard unfortunately.