We’re experiencing a critical issue with firmware updates on our IoT gateways after renewing our SSL certificates last week. The update process fails immediately with SSL handshake errors, blocking all firmware deployments to our 200+ edge gateways.
The error appears in gateway logs:
SSL handshake failed: unable to get local issuer certificate
javax.net.ssl.SSLHandshakeException: PKIX path building failed
FirmwareUpdateAgent: Connection refused to update server
We’ve verified the new certificate is valid and the CA chain appears complete in our update server configuration. However, we’re unsure how the gateways manage their trust stores: do they need an agent restart to pick up new certificates, or is there a refresh mechanism? CA chain completeness on the gateway side is our main concern.
This is blocking critical security patches. Has anyone dealt with certificate trust store issues during firmware updates?
Check whether your intermediate CA certificates are included in the chain. We had a similar SSL handshake failure where the root CA was in the gateway trust store but the intermediate certificate wasn’t being sent by the update server. The PKIX path building error you’re seeing is the classic symptom of an incomplete chain. Use openssl s_client to verify which certificates your update server is actually presenting during the handshake.
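To see this failure mode in isolation, here is a self-contained sketch using throwaway certificates (all file names and subject names are placeholders, not your real PKI): verification fails when only the root is trusted and the intermediate is missing, and succeeds once the intermediate is supplied.

```shell
# Build a throwaway three-tier chain: root CA -> intermediate CA -> leaf.
openssl req -x509 -newkey rsa:2048 -nodes -keyout root.key -out root.pem \
  -subj "/CN=Demo Root CA" -days 1
openssl req -newkey rsa:2048 -nodes -keyout int.key -out int.csr \
  -subj "/CN=Demo Intermediate CA"
printf 'basicConstraints=CA:TRUE\n' > ca.ext
openssl x509 -req -in int.csr -CA root.pem -CAkey root.key -CAcreateserial \
  -extfile ca.ext -out int.pem -days 1
openssl req -newkey rsa:2048 -nodes -keyout srv.key -out srv.csr \
  -subj "/CN=update-server"
openssl x509 -req -in srv.csr -CA int.pem -CAkey int.key -CAcreateserial \
  -out srv.pem -days 1

# Root alone cannot build the path -- same class of error as the gateway logs:
# "unable to get local issuer certificate".
openssl verify -CAfile root.pem srv.pem || true

# Supplying the intermediate completes the chain: prints "srv.pem: OK".
openssl verify -CAfile root.pem -untrusted int.pem srv.pem
```

The same `-untrusted` trick is handy for checking whether a chain file you are about to deploy actually bridges the gap between a leaf and your trusted root.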
For oiot-23, you can use the gateway configuration API to push trust store updates remotely. Create a configuration profile with the new CA certificates and deploy it to your gateway device groups. The gateways will pull the updated configuration on their next heartbeat cycle, typically within 5-10 minutes. However, you’ll still need to trigger an agent restart for the changes to take effect. You can schedule this during maintenance windows using the device management console to minimize disruption across your fleet.
I had this exact scenario last quarter and here’s the complete solution that worked:
Certificate Trust Store Management:
First, verify your complete certificate chain on the update server. Export the full chain including root and intermediate CAs:
openssl s_client -connect update-server:8443 -showcerts
# Verify all certificates in chain are present
Gateway Trust Store Update Process:
Navigate to IoT Console → Gateway Management → Configuration Profiles
Create new profile named “Updated-CA-TrustStore-2025”
Under Security Settings, upload your complete CA chain file (PEM format)
The CA chain completeness is critical - include:
Root CA certificate
All intermediate CA certificates
Order the certificates from intermediate(s) up to the root; the server’s leaf certificate belongs in the update server’s presented chain, not in the gateway trust store
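To sanity-check what a bundled PEM file actually contains, and in what order, you can split it apart and print each certificate’s subject. This sketch builds a demo bundle from two throwaway certs; in practice you would point it at the chain file you are about to upload:

```shell
# Demo bundle: two throwaway self-signed certs standing in for your CA chain.
openssl req -x509 -newkey rsa:2048 -nodes -keyout a.key -out a.pem \
  -subj "/CN=Demo Intermediate CA" -days 1
openssl req -x509 -newkey rsa:2048 -nodes -keyout b.key -out b.pem \
  -subj "/CN=Demo Root CA" -days 1
cat a.pem b.pem > chain.pem

# Split the bundle into cert-01.pem, cert-02.pem, ... in file order.
awk '/BEGIN CERTIFICATE/{n++; out=sprintf("cert-%02d.pem", n)}
     out{print > out}
     /END CERTIFICATE/{out=""}' chain.pem

# Print each subject so you can eyeball the order.
for f in cert-*.pem; do
  printf '%s: ' "$f"
  openssl x509 -in "$f" -noout -subject
done
```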
Deployment Steps:
# Apply configuration to gateway groups
iotctl config deploy --profile Updated-CA-TrustStore-2025 --group production-gateways
# Schedule agent restart (required for trust store reload)
iotctl gateway restart --group production-gateways --schedule "2025-03-20T02:00:00Z"
Gateway Agent Restart Required:
This is mandatory. The gateway agent loads its trust store only at startup. You have two options:
Scheduled restart via device management API (recommended for fleet deployment)
Rolling restart with 10-15 minute intervals to avoid overwhelming your backend
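The rolling-restart option boils down to a simple batching loop. Everything below is illustrative: the gateway IDs are placeholders and the restart call is stubbed with echo, since the real command would be the iotctl restart shown above (or your device management API):

```shell
BATCH_SIZE=2     # e.g. 20 in production
PAUSE_SECS=1     # e.g. 600-900 (10-15 minutes) between batches
set -- gw-01 gw-02 gw-03 gw-04 gw-05   # placeholder IDs from your inventory

i=0
for gw in "$@"; do
  # Stub; in production something like: iotctl gateway restart --id "$gw"
  echo "restarting $gw"
  i=$((i + 1))
  if [ $((i % BATCH_SIZE)) -eq 0 ]; then
    echo "batch done, pausing ${PAUSE_SECS}s"
    sleep "$PAUSE_SECS"
  fi
done
```

The pause between batches is what keeps 200+ gateways from re-registering with your backend all at once.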
Verification:
After the restart, test a firmware update on a single gateway before touching the rest of the fleet.
The key issue in your case is that the gateways cached the old trust store. Once you push the updated CA chain via a configuration profile and restart the agents, the SSL handshake will succeed. For your 200+ gateways, use a staged rollout: restart 20 gateways, verify the firmware update works, then proceed with the remaining fleet. This approach prevented our security patch delays, and we now have a documented procedure for certificate renewals.
Pro tip: Set up monitoring alerts for certificate expiration 60 days in advance to avoid this situation. Also configure automatic trust store updates in your gateway provisioning templates.
One thing to watch out for: if you’re using custom CA certificates, make sure they’re in PEM format and properly encoded. I’ve debugged cases where certificate files had Windows line endings or extra whitespace that caused validation failures. Also verify the certificate chain order on the server side: server cert first, then intermediates, then the root CA.
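A quick way to detect and normalize Windows line endings is sketched below with a throwaway certificate (tr stands in for dos2unix, which isn’t always installed on gateways):

```shell
# Throwaway cert, then a copy with CRLF line endings to simulate the problem.
openssl req -x509 -newkey rsa:2048 -nodes -keyout demo.key -out demo.pem \
  -subj "/CN=demo" -days 1
awk '{printf "%s\r\n", $0}' demo.pem > crlf.pem

# Detect carriage returns:
grep -q "$(printf '\r')" crlf.pem && echo "crlf.pem has Windows line endings"

# Normalize (dos2unix equivalent) and confirm the result still parses:
tr -d '\r' < crlf.pem > clean.pem
openssl x509 -in clean.pem -noout -subject
```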
I’ve seen this exact issue. The gateway agents cache the trust store at startup, so they won’t automatically pick up certificate changes. You need to either restart the gateway agents or trigger a trust store refresh if your version supports it.