Your field mapping issue stems from three distinct configuration gaps that need systematic resolution:
Field Mapping Configuration:
The Integration module requires explicit JSONPath mappings for each external field, especially when field names don’t match your internal schema. Navigate to Integration > External Systems > [Your Sensor System] > Field Mappings. For your environmental sensor payload, configure these mappings:
// Field mapping configuration
source: $.sensor_id → target: device.external_id
source: $.temp_c → target: telemetry.temperature.value
source: $.humidity_pct → target: telemetry.humidity.value
The JSONPath syntax ($.field_name) is mandatory for nested JSON structures. If you’re using dot notation instead of JSONPath, the mapper won’t resolve the fields. Additionally, make sure your mappings cover every required field in your internal schema: a missing mapping for a mandatory field causes the entire record to be rejected during ingestion, even when the other fields map correctly.
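To see why dot notation fails and why a missing mandatory mapping rejects the whole record, here is a minimal standalone sketch of the mapper's behavior; resolvePath, applyMappings, and REQUIRED_TARGETS are illustrative names, not part of the Integration module's API:

```javascript
// Targets that must be populated for a record to be accepted (illustrative)
const REQUIRED_TARGETS = ['device.external_id', 'telemetry.temperature.value'];

// Resolves a simple top-level JSONPath like "$.temp_c" against a payload object
function resolvePath(path, payload) {
  if (!path.startsWith('$.')) return undefined; // plain dot notation never resolves
  return payload[path.slice(2)];
}

function applyMappings(mappings, payload) {
  const record = {};
  for (const [source, target] of Object.entries(mappings)) {
    record[target] = resolvePath(source, payload);
  }
  // Mirrors ingestion behavior: one missing mandatory field rejects the record
  const missing = REQUIRED_TARGETS.filter((t) => record[t] === undefined);
  if (missing.length > 0) {
    throw new Error('Record rejected; missing mandatory fields: ' + missing.join(', '));
  }
  return record;
}
```

Running applyMappings with a source of `temp_c` (no `$.` prefix) leaves the target undefined and the record is rejected, which is exactly the failure mode described above.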
Schema Translation with Data Type Conversion:
Your external sensors send numeric values without explicit units, but your internal schema requires typed decimal values with metadata. Enable Advanced Mapping mode and create a transformation function. Go to Integration > Schema Mappings > Create Custom Transform:
function transformSensorData(payload) {
  return {
    device_id: payload.sensor_id,
    temperature: {
      // Number() unwraps toFixed(), which returns a string, not a number
      value: Number(parseFloat(payload.temp_c).toFixed(2)),
      unit: 'celsius'
    },
    humidity: {
      value: Number(parseFloat(payload.humidity_pct).toFixed(2)),
      unit: 'percent'
    },
    // Ingestion time; use a payload timestamp field instead if the sensor supplies one
    timestamp: new Date().toISOString()
  };
}
This transformation ensures data types match your internal schema expectations and adds required metadata. The integration module executes this function for each incoming payload before writing to analytics tables.
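Before enabling the transform in the module, it is worth checking its output locally against the expected internal shape. The sketch below is a standalone validator, not a module API; validateRecord and the field names are assumptions based on the schema described above. Note in particular that it flags a string-typed value, which is what toFixed() alone produces if Number() is omitted:

```javascript
// Illustrative check that a transformed record matches the internal schema shape
function validateRecord(record) {
  const errors = [];
  if (typeof record.device_id !== 'string') {
    errors.push('device_id must be a string');
  }
  for (const field of ['temperature', 'humidity']) {
    const entry = record[field];
    if (!entry || typeof entry.value !== 'number') {
      errors.push(field + '.value must be a number');
    }
    if (!entry || typeof entry.unit !== 'string') {
      errors.push(field + '.unit must be a string');
    }
  }
  // Date.parse returns NaN for anything that is not a parseable date string
  if (isNaN(Date.parse(record.timestamp))) {
    errors.push('timestamp must be a valid ISO 8601 string');
  }
  return errors;
}
```

An empty error list means the record is safe to hand to the analytics writer; any entry points at the exact field whose type or format would be rejected.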
Integration with External Sensors - Data Flow Validation:
Even with correct mappings, data might not reach analytics tables if the integration pipeline has gaps. Verify the complete data flow: REST API ingestion → Integration mapper → Internal message queue → Analytics writer. Enable detailed logging in Integration > System Settings: set integration.logging.level=DEBUG and integration.trace.enabled=true. Monitor logs at /var/log/cisco-iot/integration/mapper.log to see exactly where transformation fails.
Check that your analytics tables have matching column names and types; a query against your analytics database’s schema catalog (for example, information_schema.columns for the target tables) confirms alignment. If your analytics reports show nulls despite successful ingestion, the issue is likely a view definition that queries different column names than what the integration module writes. Refresh materialized views and verify column mappings in your reporting layer.
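The null-in-reports symptom usually comes down to a name mismatch between what the writer populates and what the view queries. A quick way to surface that is to diff the two column lists; diffColumns and the column names below are illustrative, not module output:

```javascript
// Compare columns the integration writer populates against columns the
// reporting view queries; any mismatch shows up as nulls in reports.
function diffColumns(writerColumns, viewColumns) {
  const written = new Set(writerColumns);
  const queried = new Set(viewColumns);
  return {
    // Columns the view expects but the writer never fills -> nulls in reports
    missingFromWriter: viewColumns.filter((c) => !written.has(c)),
    // Columns written but never surfaced by the view
    unusedByView: writerColumns.filter((c) => !queried.has(c))
  };
}
```

Feed it the column list from your schema catalog query and the columns referenced in the view definition; a non-empty missingFromWriter list names exactly the report fields that will render as null.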
Finally, test the complete pipeline with a single sensor payload using the Integration Test Console (Integration > Test Harness). This shows real-time transformation results and identifies exactly which mapping or transformation step fails. Once your test payload successfully maps and appears in analytics tables, apply the configuration to production ingestion endpoints.