BIT STRING input to asn1_der_coding producing incorrect result
Description of problem: BIT STRING encoding producing incorrect result
Version of libtasn1 used: 4.19.0.12-3f7a
Distributor of libtasn1 (e.g., Ubuntu, Fedora, RHEL): built from source for several platforms
I originally suspected a big-endian-only behavior, since I first saw the issue on s390x Linux, but I've since reproduced the error on macOS and x86 Linux as well.
How reproducible:
Steps to Reproduce:
- Using the asn1Coding executable, which calls asn1_der_coding, I attempt to encode a single-element SEQUENCE containing a BIT STRING.
- Definition file:
  MYEXAMPLE { } DEFINITIONS IMPLICIT TAGS ::= BEGIN mySeq ::= SEQUENCE { myBit BIT STRING } END
- Assignment file:
  dp MYEXAMPLE.mySeq myBit abc123
Actual results:
30 04 03 02 02 60
When the input length is a multiple of 8 characters, only the first character of each group of 8 is encoded. For example, if the input for myBit is a1234567, the output encodes just a as the bit string:
30 04 03 02 00 61
where 61 is the ASCII encoding of a. However, inputs of fewer than 8 characters (like abc123 above) produce yet another result.
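For reference, the two observed outputs can be unpacked by hand. This is a minimal sketch in plain Python (no ASN.1 library) that walks the standard short-form TLV layout of the dumps quoted above; the helper name is mine, not part of libtasn1:

```python
def parse_seq_bitstring(der):
    """Parse SEQUENCE { BIT STRING } with short-form definite lengths."""
    assert der[0] == 0x30                 # SEQUENCE tag
    seq_len = der[1]
    assert der[2] == 0x03                 # BIT STRING tag
    bs_len = der[3]
    unused_bits = der[4]                  # first content byte: unused-bit count
    value = der[5:5 + bs_len - 1]         # remaining content bytes
    assert 2 + seq_len == len(der)        # whole buffer consumed
    return unused_bits, value

# abc123 input: library emitted 2 unused bits and a single value byte 0x60
unused, value = parse_seq_bitstring(bytes.fromhex("300403020260"))
print(unused, value.hex())   # 2 60

# a1234567 input: 0 unused bits, value byte 0x61 (ASCII 'a')
unused, value = parse_seq_bitstring(bytes.fromhex("300403020061"))
print(unused, value.hex())   # 0 61
```

In both cases the library produced a one-byte BIT STRING value rather than the full ASCII content of the input.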
Expected results:
30 09 03 07 00 61 62 63 31 32 33
I expected the same result as with an OCTET STRING, plus the additional "unused bits" prefix byte. For comparison, the OCTET STRING encoding is:
30 08 04 06 61 62 63 31 32 33
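The expected encodings above can be reconstructed by hand from the ASCII bytes of abc123. A minimal sketch in plain Python (short-form definite lengths only, which is all this example needs) showing the OCTET STRING layout next to the BIT STRING layout with its unused-bits prefix byte:

```python
def der_tlv(tag, content):
    # Short-form definite length: valid only for content under 128 bytes.
    assert len(content) < 128
    return bytes([tag, len(content)]) + content

data = b"abc123"                  # ASCII bytes 61 62 63 31 32 33

# SEQUENCE (tag 0x30) of OCTET STRING (tag 0x04): the raw bytes as-is
octet = der_tlv(0x30, der_tlv(0x04, data))

# SEQUENCE of BIT STRING (tag 0x03): one leading byte giving the number
# of unused bits in the final byte (0 here), then the data bytes
bit = der_tlv(0x30, der_tlv(0x03, b"\x00" + data))

print(octet.hex(" "))  # 30 08 04 06 61 62 63 31 32 33
print(bit.hex(" "))    # 30 09 03 07 00 61 62 63 31 32 33
```

The only difference between the two is the single 00 prefix byte, which is why I expected the BIT STRING output to track the OCTET STRING output.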