Hi Blues team!
I ran into a subtle issue with the COBS codec length functions that causes max_decoded_length(max_encoded_length(X)) to sometimes return X - 1 instead of X.
The issue:
When X = 65536:
- NoteBinaryCodecMaxEncodedLength(65536) returns 65796
- NoteBinaryCodecMaxDecodedLength(65796) returns 65535 (not 65536)
This caused our code to reject valid 64KB chunks since it didn’t think the buffer was large enough.
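Minimal repro (assuming note.h is on the include path; this mirrors how our code hit it):

```c
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include "note.h"

int main(void) {
    uint32_t x = 65536;
    uint32_t enc = NoteBinaryCodecMaxEncodedLength(x);    // 65796
    uint32_t dec = NoteBinaryCodecMaxDecodedLength(enc);  // 65535, not 65536
    printf("x=%" PRIu32 " enc=%" PRIu32 " dec=%" PRIu32 "\n", x, enc, dec);
    return (dec >= x) ? 0 : 1;  // currently exits 1
}
```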
Root cause:
In n_cobs.c, _cobsGuaranteedFit() computes overhead based on the encoded buffer size:
```c
uint32_t cobsOverhead = 1 + (bufLen / 254) + COBS_EOP_OVERHEAD;
```
With integer division, 65796 / 254 = 259, while 65536 / 254 = 258. Because the encoded length already includes the overhead bytes, feeding it back through the same formula lands in the next 254-byte bucket, so one more overhead byte is subtracted than was ever added, leaving a 1-byte discrepancy.
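To make the bucket mismatch concrete, here is a standalone sketch of both directions. The forward formula is reconstructed from the numbers above, and encodedMax/guaranteedFit are illustrative names, not note-c's exact code:

```c
#include <stdint.h>
#include <stdio.h>

#define COBS_EOP_OVERHEAD 1  // one end-of-packet delimiter byte

// Forward direction: worst-case encoded size for `len` decoded bytes.
static uint32_t encodedMax(uint32_t len) {
    return len + 1 + (len / 254) + COBS_EOP_OVERHEAD;
}

// Backward direction: the current _cobsGuaranteedFit logic, which
// computes the overhead from the *encoded* buffer length.
static uint32_t guaranteedFit(uint32_t bufLen) {
    uint32_t cobsOverhead = 1 + (bufLen / 254) + COBS_EOP_OVERHEAD;
    return (bufLen <= cobsOverhead) ? 0 : (bufLen - cobsOverhead);
}

int main(void) {
    printf("%u\n", (unsigned)encodedMax(65536));    // 65796
    printf("%u\n", (unsigned)guaranteedFit(65796)); // 65535, not 65536
    return 0;
}
```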
Suggested fix:
Use the exact algebraic inverse instead. The forward direction adds n / 254 overhead bytes for n decoded bytes, so write m = bufLen - fixed_overhead as 255 * q + r; the largest n with n + n / 254 <= m is then 254 * q + min(r, 253):
```c
uint32_t _cobsGuaranteedFit(uint32_t bufLen) {
    uint32_t fixed_overhead = 1 + COBS_EOP_OVERHEAD;
    if (bufLen <= fixed_overhead) return 0;
    // For m = 255*q + r, the largest n with n + n/254 <= m is
    // 254*q + min(r, 253); n + n/254 never hits remainder 254,
    // which is why r is capped at 253.
    uint32_t m = bufLen - fixed_overhead;
    uint32_t r = m % 255;
    return (254 * (m / 255)) + ((r < 253) ? r : 253);
}
```

(Note that a scaled-division inverse like ((bufLen - fixed_overhead) * 254) / 255 is still one byte short for the example above: (65794 * 254) / 255 = 65535, which is why the exact quotient/remainder form is needed.)
This restores the round-trip property: NoteBinaryCodecMaxDecodedLength(NoteBinaryCodecMaxEncodedLength(X)) == X for every X whose encoded length still fits in a uint32_t.
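A quick brute-force sanity check of the round trip, using the same illustrative encodedMax as in the sketch above:

```c
#include <assert.h>
#include <stdint.h>

#define COBS_EOP_OVERHEAD 1

// Illustrative forward formula, reconstructed from the numbers above.
static uint32_t encodedMax(uint32_t len) {
    return len + 1 + (len / 254) + COBS_EOP_OVERHEAD;
}

// The proposed exact inverse.
static uint32_t guaranteedFitFixed(uint32_t bufLen) {
    uint32_t fixed_overhead = 1 + COBS_EOP_OVERHEAD;
    if (bufLen <= fixed_overhead) return 0;
    uint32_t m = bufLen - fixed_overhead;
    uint32_t r = m % 255;
    return (254 * (m / 255)) + ((r < 253) ? r : 253);
}

int main(void) {
    // Exhaustive up to 16 MB; the q/r argument covers all larger
    // sizes where encodedMax() itself doesn't overflow.
    for (uint32_t x = 0; x <= (1u << 24); x++) {
        assert(guaranteedFitFixed(encodedMax(x)) == x);
    }
    return 0;
}
```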