You need a robust solution that handles all three aspects of your event log import challenge. Here’s the complete approach:
Timestamp Format Conversion:
Create a Transform Map specifically for your SAP event log imports. Navigate to System Import Sets > Create Transform Map and configure it for your target event log table. The critical part is the timestamp field mapping script:
// Transform map field script: convert SAP 'DD.MM.YYYY HH:MM:SS'
// to ISO 8601 'YYYY-MM-DD HH:MM:SS'
var sourceTimestamp = source.timestamp + ''; // coerce to a plain string
var parts = sourceTimestamp.split(' ');
var dateParts = parts.length == 2 ? parts[0].split('.') : [];
if (dateParts.length == 3) {
    answer = dateParts[2] + '-' + dateParts[1] + '-' + dateParts[0] + ' ' + parts[1];
} else {
    answer = ''; // malformed source value; leave blank so validation can flag it
}
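The same conversion can be exercised outside ServiceNow as a plain JavaScript function, which makes it easy to unit test against sample SAP values before wiring it into the transform map (the function name here is illustrative):

```javascript
// Convert an SAP-style 'DD.MM.YYYY HH:MM:SS' timestamp to ISO 8601
// 'YYYY-MM-DD HH:MM:SS'. Returns null for malformed input.
function sapToIso(ts) {
  var match = /^(\d{2})\.(\d{2})\.(\d{4}) (\d{2}:\d{2}:\d{2})$/.exec(ts);
  if (!match) {
    return null; // reject anything not in the expected format
  }
  return match[3] + '-' + match[2] + '-' + match[1] + ' ' + match[4];
}

console.log(sapToIso('31.01.2024 14:05:09')); // '2024-01-31 14:05:09'
console.log(sapToIso('2024-01-31 14:05:09')); // null (already ISO, rejected)
```

Using a strict regular expression rather than split() means half-converted or already-ISO rows come back as null instead of silently producing a garbage date.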
CSV Preprocessing:
While the transform map handles format conversion, you should implement preprocessing for data quality validation. Create a Scheduled Script Execution that runs before imports:
// Pseudocode - CSV preprocessing steps:
1. Read staged CSV from import set table
2. Validate required fields (Case_ID, Activity, Timestamp)
3. Check for duplicate events (same case + activity + timestamp)
4. Verify timestamp chronological order within each case
5. Flag anomalies in separate validation table
6. Mark clean records as 'ready_for_transform'
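Steps 2 through 4 above can be sketched as a plain JavaScript function, assuming rows have already been parsed into objects with caseId, activity, and timestamp properties (all names illustrative):

```javascript
// Validate parsed event rows: required fields, duplicate events, and
// chronological order within each case. Returns { clean, flagged }.
function validateEvents(rows) {
  var seen = {};    // 'case|activity|timestamp' key -> true
  var lastTs = {};  // caseId -> last timestamp seen for that case
  var clean = [];
  var flagged = [];

  rows.forEach(function (row) {
    if (!row.caseId || !row.activity || !row.timestamp) {
      flagged.push({ row: row, reason: 'missing_required_field' });
      return;
    }
    var key = row.caseId + '|' + row.activity + '|' + row.timestamp;
    if (seen[key]) {
      flagged.push({ row: row, reason: 'duplicate_event' });
      return;
    }
    seen[key] = true;
    // ISO 8601 'YYYY-MM-DD HH:MM:SS' strings sort correctly as strings
    if (lastTs[row.caseId] && row.timestamp < lastTs[row.caseId]) {
      flagged.push({ row: row, reason: 'out_of_order' });
      return;
    }
    lastTs[row.caseId] = row.timestamp;
    clean.push(row);
  });

  return { clean: clean, flagged: flagged };
}
```

The flagged entries map directly onto step 5 (write them to a separate validation table with their reason), and the clean list onto step 6.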
This preprocessing catches issues before they reach the event log import, preventing partial imports that corrupt your process mining analysis.
Event Log Import Configuration:
For sustainable weekly imports from SAP, set up an automated import flow:
- Create an Import Set table (u_sap_event_staging) matching your CSV structure
- Configure a Scheduled Data Import that loads CSV files from a designated directory
- Apply your custom Transform Map with timestamp conversion
- Set up email notifications for import failures with detailed error logs
The key to handling 50,000+ records reliably is chunking the import. Modify your scheduled import to process records in batches of 5,000:
var batch = new GlideRecord('u_sap_event_staging');
batch.addQuery('state', 'pending');
batch.setLimit(5000);
batch.query();
while (batch.next()) {
    // Transform and import logic for this record
    batch.state = 'processed';
    batch.update();
}
This prevents timeout issues with large imports and allows you to resume if there are failures.
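The resume behavior this gives you can be illustrated with a framework-free sketch (names and structure illustrative, not ServiceNow API): each run touches only records still marked pending, so a crashed or timed-out run simply leaves the remainder for the next scheduled execution.

```javascript
// Process up to batchSize records whose state is 'pending', marking
// each 'processed'. Returns the number handled this run; 0 means done.
function processBatch(records, batchSize, transform) {
  var done = 0;
  for (var i = 0; i < records.length && done < batchSize; i++) {
    if (records[i].state !== 'pending') {
      continue; // already handled by an earlier run
    }
    transform(records[i]); // if this throws, the record stays 'pending'
    records[i].state = 'processed';
    done++;
  }
  return done;
}
```

Running this repeatedly until it returns 0 mirrors what the scheduled GlideRecord loop does across successive executions: 50,000 records take ten runs at a batch size of 5,000, and a failure mid-run costs you at most one batch of rework.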
Additional Recommendations:
- Add an onBefore transform script that validates the ISO 8601 format post-conversion and rejects malformed records (reference qualifiers only filter reference-field lookups; they can't validate string formats)
- Implement a rollback mechanism - if an import batch fails validation, automatically revert all records from that batch
- Build a dashboard showing import statistics (records processed, errors, processing time) for monitoring
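The rollback idea can be sketched the same way, assuming each import run stamps its records with a batch identifier (field names illustrative): reverting a failed batch just means flipping its processed records back to pending so the next run retries them.

```javascript
// Revert all processed records belonging to a failed batch back to
// 'pending' so the next scheduled run can retry them. Returns the count.
function rollbackBatch(records, batchId) {
  var reverted = 0;
  records.forEach(function (rec) {
    if (rec.batchId === batchId && rec.state === 'processed') {
      rec.state = 'pending';
      reverted++;
    }
  });
  return reverted;
}
```

In ServiceNow itself the same effect comes from a GlideRecord update over the staging table filtered by your batch-identifier field; the sketch just shows the state transition.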
For your immediate need, you can manually fix your current CSV using this Excel formula in a new column: =TEXT(DATEVALUE(MID(A2,7,4)&"-"&MID(A2,4,2)&"-"&LEFT(A2,2))+TIMEVALUE(MID(A2,12,8)),"yyyy-mm-dd hh:mm:ss") where A2 contains your original timestamp (note the + joining the date and time serial values; & would concatenate them as text and produce garbage). Then import the corrected CSV while you implement the automated solution.