Currently, using x.509 as the signing method for commits does not result in either a "Verified" or "Unverified" badge being displayed next to the commit hash. Example:
This verified/unverified status should be displayed for x.509-signed commits. Failing that, the documentation for configuring x.509-signed commits should be updated to clearly indicate the absence of this feature for this method of commit signing.
User experience goal
Users who utilize the x.509 key pairs on their smart cards can easily sign and visually verify their commits without having to do so via something like git log --show-signature.
Tim Seagren changed title from x.509-signed commits to not display as "Verified"/"Unverified" as to GPG-signed commits to x.509-signed commits to not display as "Verified"/"Unverified" as do GPG-signed commits
Some notes: support for x509 signatures was added in !17773 (merged).
An example of an Unverified stamp for a x509 signed commit can be seen in 3aa88de7.
This inconsistency with GPG raises the question of whether we are simply aborting halfway on an invalid x509 signature, or something like that. Based on the description of !17773 (merged):
unverified: email within x509 certificate does not match committer email or ca is not trusted or signature invalid
So if a commit has an x509 signature but it is invalid, maybe it should show the Unverified stamp? But can we distinguish an unparseable x509 signature from an invalid signature?
I guess this is a self-signed certificate and one of the following attributes is missing on the signing certificate as written within the docs:
NOTE:
Self-signed certificates without authorityKeyIdentifier, subjectKeyIdentifier, and crlDistributionPoints are not supported. We recommend using certificates from a PKI that are in line with RFC 5280.
@tkuah here is the screenshot you requested, along with a screenshot of the user profile I'm using to push with the matching email address present:
@bufferoverflow, to your question, these are not self-signed certs, but rather the x.509 certs present on Common Access Cards supplied by the DoD. I have imported the Root Trust CAs used by the DoD using the instructions I referenced in the issue description.
I should also note that this is being tested in a dev environment in which we do not have SMTP configured, making it impossible to manually verify the email address I have associated with my user (which matches the cert). I instead force-verified all user emails using this documentation, which resulted in my user's email showing as "Verified":
@chaospuppy Could you double check that your CA is actually intended for digital signature? You should see something like Key Usage: Digital Signature on the CA data.
I'm pasting a small shell tool we use internally to extract this data from commits; if you run this against a signed commit you should see the signature chain metadata:
```shell
#!/bin/bash
# Arguments
# $1 - commit SHA. If not provided, HEAD will be used
# Requirements
# - awk
# - sed
# - openssl cli
# Tested on macOS 10.15.7 and Linux Debian 10
RAW_COMMIT=$(git show $1 --format=raw)
SIGNATURE=$(echo "${RAW_COMMIT}" | awk '/-----BEGIN/{a=1};a;/-----END/{exit}' | sed -e 's/gpgsig //' -e 's/^ *//' -e 's/SIGNED MESSAGE/PKCS7/')
# openssl x509 cannot print all certs in the chain, we need to extract them one at a time
echo "${SIGNATURE}" | openssl pkcs7 -print_certs | awk '{
  if ($0 == "-----BEGIN CERTIFICATE-----") cert=""
  else if ($0 == "-----END CERTIFICATE-----") print cert
  else cert=cert$0
}' | while read CERT; do
  echo "${CERT}" | base64 -d | openssl x509 -inform DER -text
done
```
Will do. Keep in mind that I will not be able to verify my tim.seagren.ctr@us.af.mil email address, because it's not a real email address (don't ask me why they don't make one by default).
Created this repo with a signed commit... And that is showing "Unverified", as expected...
Is there any obvious reason that our Gitlab deployment's UI would be working differently than gitlab.com? Again, the version should be up to date and GPG signatures show up as Unverified or Verified with our deployment.
Could be related to the helm chart or configuration you are using. Maybe worth checking with ruby -ropenssl -e 'puts OpenSSL::X509::DEFAULT_CERT_FILE' whether the content you configured is there.
DEFAULT_CERT_FILE points to /usr/lib/ssl/cert.pem, which doesn't exist on the task-runner pod. However, DEFAULT_CERT_DIR is /usr/lib/ssl/certs/, which is where I've dumped our custom CAs. Is DEFAULT_CERT_DIR unused, and would not having anything at DEFAULT_CERT_FILE explain these failures?
No luck with that either. Having mounted a few of the CAs into the file at /usr/lib/ssl/cert.pem, I still get neither Unverified nor Verified as the commit signature status in the UI, and a call to the API still results in only {"signature_type":"X509"} being returned.
This is almost 100% a configuration issue. You can see here that I just made a signed commit on gitlab.com a couple of days ago (it's shown as unverified because gitlab.com does not trust our intermediate Siemens CA):
@bufferoverflow is much more knowledgeable in this regard, but imho that points to gitlab receiving your signed commit, but not triggering the sidekiq jobs that actually analyze and verify the commits after they're created. I'd check the sidekiq queues for create_commit_signature, and you also should find some errors in the rails logs in case the signature creation is causing issues.
All in all it still points to local server setup :-(
Would it be possible to set up a time to debug our environment via a screenshare? I'm positive that this must be an issue with our deployment, but without anything showing in our sidekiq logs and nothing obviously wrong with the certificates or how we've used the helm chart, it's difficult to know where the issue could be originating.
Commits and tags that have been signed with X.509 certificates do not appear as Verified or Unverified in GitLab 13.5.4, installed using the GitLab Helm chart v4.5.4. This has never worked properly in this instance. Importantly, this does work properly on GitLab.com (commits signed in the same way appear as Unverified).
The custom root CA used by $ORGANIZATION has been defined per the chart docs.
The email addresses associated with the certificates that are in use are not email addresses that accept email. These addresses were force confirmed per the docs.
The authorityKeyIdentifier, subjectKeyIdentifier and crlDistributionPoints attributes are present on the X.509 certificates and have values (are not empty).
The certificates are intended for use with digital signatures.
Moving your custom certificate files to /usr/lib/ssl/cert.pem did not resolve the problem.
There is no useful information about the create_commit_signature queue or the CreateCommitSignatureWorker in the logs that you have provided thus far.
In querying the database through the GitLab Rails console, we confirmed that no commits with signatures are known to the database.
@bcarranza Just for my clarification; using the same x509 cert on a GitLab setup without an external internet connection does actually show the commit as verified, but with an internet connection it doesn't?
Is there any information about the two environments? If they're identical except for the external connection then that would point to a weird issue/bug, but if there are differences then it could be a configuration issue.
Hi @robotmay_gitlab, I can clear this up. The problem is happening in environments that DO have internet gateways. There are no airgapped environments in play here, only gitlab.com itself (where it works) and our production and development environments (where it doesn’t), which all have access to the internet.
Brie is saying that the problems we're seeing are not occurring in an airgap, to rule that out as a possible factor.
Is there any information about the two environments?
Yes; the comment atop this thread has the information we gathered about this environment through the support ticket. As a GitLab team member, you can review the Slack thread for the next ~76 days for additional context. We are open to new suggestions on differences of note to inquire about. GitLab team members who are US citizens with access to the US Federal instance can read more in the ticket.
I've been working on reproducing this this week, but must admit I've not generated x509 certs before, so I'm just trying to get that aspect working at the moment.
One thing we did notice, @chaospuppy, is that you're running 13.5 currently - there was a fix related to GPG/x509 in 13.6 that might be relevant, so if you have plans to upgrade soon then that might be worth checking.
@tkuah You mentioned on Slack recently that you'd used the x509 feature before. Is there any chance you could take a look at this issue when you have time and see if you can reproduce the problem? I can't seem to generate a working x509 certificate in order to sign commits myself.
@chaospuppy We made a new attempt to replicate this this evening and I think we may have made some progress; we're now getting the same problem as you, where no verified/unverified badge appears.
The issue seems to be that the X509CommitSignature database records aren't being created. In our instance we've traced this back to our certificate missing some of the required data - for us it appears to be failing because we don't have any crlDistributionPoints data in the certificate. This is listed as a requirement in the docs, but when the requirement isn't met it appears to fail silently, which isn't very helpful.
Could you confirm if your certificates meet all those requirements listed, in particular this section:
Hi Robert! I can confirm that all three of those fields are present and populated in the cert I've been using to test signatures. This is also the cert I used to successfully sign a commit made to a repo hosted at gitlab.com.
For reference, I'll post the relevant output generated by inspecting this certificate.
Aah excellent, thanks for checking that @chaospuppy!
If possible it would be worth testing the next version up, 13.6, as it included this fix: !46736 (merged), which would also fit the symptoms we're seeing here.
Another thing possibly worth checking is whether any X509CommitSignature records are being created. You could test that by doing the following:
```
gitlab-rails console
> X509CommitSignature.last
```
If that does show a record (or any records), then the fix above would seem more likely; whereas if it returns nothing, it would imply it's failing silently when trying to create that record, like ours did, but for a different reason.
Edit: Just remembered you're on Kubernetes so your Rails console command is likely different from that (we were testing on Omnibus)
@chaospuppy I have a few more things that are worth testing in the Rails console, just to see if we can identify where the certificate verification is failing:
Grab a signed commit and build a test signature object
```ruby
commit = Project.last.commit # find a commit somehow
sig = Gitlab::X509::Signature.new(commit.gpg_commit.signature_text, commit.gpg_commit.signed_text, commit.author.email, commit.created_at) # load it into the signature class
```
Perform some tests against it
sig.x509_certificate - I expect this to be nil
sig.verification_status - likely :unverified because the certificate is nil
sig.send(:valid_signature?) - probably false
sig.send(:p7) - not sure what this will return, but this is probably going to be the most informative. In our test this returns an error of "signer certificate not found"
sig.send(:certificate_crl) - should return your CRL definitions
This issue has been looked into extensively but we are unable to replicate the exact issue as it appears to be tied to the customer's environment or configuration settings. It is possible there is a bug, but at this point that is not confirmed.
The customer is currently on %13.5, please see this note
Great news, I believe !46736 (merged) as highlighted by @robotmay_gitlab may have been the fix we were waiting on. Upgrading our GitLab version to 13.8 results in "Unverified" (at least) showing up. I will follow the documentation to get my email verified.
It seems this was a bug in 13.5 that has been resolved somewhere in 13.6-13.8
Oh that's great to hear! Thanks for going through with the upgrade. If it continues to show as Unverified, then those steps above might help to narrow down which part of it is failing.
One last issue I'm running into while trying to get these commits to show as verified...
My efforts to debug in gitlab-rails console:
```
irb(main):017:0> commit = Project.last.commit # Get last commit (done after I pushed a signed commit)
=> #<Commit id:a85e29136c5942dbdbd3e9d6d59833c3599307a7 tseagren/test@a85e29136c5942dbdbd3e9d6d59833c3599307a7>
irb(main):017:0> sig = Gitlab::X509::Signature.new(commit.gpg_commit.signature_text, commit.gpg_commit.signed_text, commit.author.email, commit.created_at) # load it into the signature class
=> #<Gitlab::X509::Signature:0x00007f4bff5703a8 @signature_text="-----BEGIN SIGNED MESSAGE-----\nMIIL+QYJKoZIhvcNAQcCoIIL6jCCC+YCAQExDTALBglghkgBZQMEAgEwCwYJKoZI\nhvcNAQcBoIIJyTCCBQUwggPtoAMCAQICAxcW6jANBgkqhkiG9w0BAQsFADBdMQsw\nCQYDVQQGEwJVUzEYMBYGA1UEChMPVS5TLiBHb3...
irb(main):017:0> cert_store = sig.send(:cert_store) # Define the certificate store
=> #<OpenSSL::X509::Store:0x00007f4befd1f2f8 @verify_callback=nil, @error=nil, @error_string=nil, @chain=nil, @time=nil, @_httpclient_cert_store_items=[:default]>
irb(main):017:0> p7 = sig.send(:p7) # Create new PKCS7 object using the OpenSSL::PKCS7::PKCS7.new() method
=> #<OpenSSL::PKCS7:0x00007f4bedbaec68 @data=nil, @error_string=nil>
irb(main):017:0> p7.verify([], cert_store, sig.signed_text) # Call p7.verify method
=> false
```
From Robert's suggestions:
sig.x509_certificate returns a nice-looking X509 certificate (I can post the full output if desired)
sig.send(:p7) returns #<OpenSSL::PKCS7:0x00007fdcf985b348 @data="tree 3820eb463973707bcc2b5337a58c10577e7a390d\nparent 387bad95633ebde53aae4e43b78f373c619813c5\nauthor Tim Seagren <tim.seagren.ctr@us.af.mil> 1613516927 -0800\ncommitter Tim Seagren <tim.seagren.ctr@us.af.mil> 1613516927 -0800\n\ntest\n", @error_string=nil>
sig.send(:certificate_crl) returns my CRL definition
Meanwhile, valid_signature? and valid_signing_time? are both returning true. I have also inspected the /etc/ssl/certs/ca-bundle.crt and /etc/ssl/certs/ca-bundle.trust.crt files to verify that our custom CAs are present.
Is there something I should try to get more info out of p7.verify([], cert_store, sig.signed_text)?
Looks like the difference between the two is that valid_signature? passes OpenSSL::PKCS7::NOVERIFY as an extra param to p7.verify.
From the other comments I think this means it skips the system certificate chain check, which would seem to indicate that the root CA isn't being picked up; but as you've checked the bundle and they're in there, that doesn't make a lot of sense.
So this is interesting and works... Executing the following:
```
irb(main):017:0> commit = Project.last.commit # Get last commit (done after I pushed a signed commit)
=> #<Commit id:a85e29136c5942dbdbd3e9d6d59833c3599307a7 tseagren/test@a85e29136c5942dbdbd3e9d6d59833c3599307a7>
irb(main):017:0> sig = Gitlab::X509::Signature.new(commit.gpg_commit.signature_text, commit.gpg_commit.signed_text, commit.author.email, commit.created_at) # load it into the signature class
=> #<Gitlab::X509::Signature:0x00007f4bff5703a8 @signature_text="-----BEGIN SIGNED MESSAGE-----\nMIIL+QYJKoZIhvcNAQcCoIIL6jCCC+YCAQExDTALBglghkgBZQMEAgEwCwYJKoZI\nhvcNAQcBoIIJyTCCBQUwggPtoAMCAQICAxcW6jANBgkqhkiG9w0BAQsFADBdMQsw\nCQYDVQQGEwJVUzEYMBYGA1UEChMPVS5TLiBHb3...
irb(main):017:0> cert_store = sig.send(:cert_store) # Define the certificate store
=> #<OpenSSL::X509::Store:0x00007f4befd1f2f8 @verify_callback=nil, @error=nil, @error_string=nil, @chain=nil, @time=nil, @_httpclient_cert_store_items=[:default]>
irb(main):004:0> cert_store.add_file '/etc/ssl/certs/ca-bundle.crt'
=> #<OpenSSL::X509::Store:0x00007f8bd6dc8c00 @verify_callback=nil, @error=nil, @error_string=nil, @chain=nil, @time=nil, @_httpclient_cert_store_items=[:default, "/etc/ssl/certs/ca-bundle.crt"]>
irb(main):017:0> p7 = sig.send(:p7) # Create new PKCS7 object using the OpenSSL::PKCS7::PKCS7.new() method
=> #<OpenSSL::PKCS7:0x00007f4bedbaec68 @data=nil, @error_string=nil>
irb(main):017:0> p7.verify([], cert_store, sig.signed_text) # Call p7.verify method
=> true
```
Note the only difference between here and what I posted at the top of my last comment is that I am explicitly adding the cert bundle generated by the helm charts (which includes our custom CAs) with irb(main):004:0> cert_store.add_file '/etc/ssl/certs/ca-bundle.crt'. Once that file is added to the cert_store object, verification of the signed_text succeeds as expected.
Oh interesting, looks like it is indeed not loading the certs correctly. I'll see if I can find someone who knows more about the Kubernetes/Helm setup than me to have a look at this, because I don't have anything set up for testing that at the moment.
While our custom CAs are successfully mounted in /etc/pki/tls/certs/ (which is OpenSSL::X509::DEFAULT_CERT_DIR) by the helm chart, they do not appear to be getting used to verify signatures.
However, manually overwriting the file at /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem, which is a symlink for /etc/pki/tls/cert.pem (the value of OpenSSL::X509::DEFAULT_CERT_FILE), results in a successful verification of the signature WITHOUT running anything special to add a new CA bundle to the cert_store used in p7.verify([], cert_store, sig.signed_text).
I think from that it's safe to conclude that OpenSSL::X509::DEFAULT_CERT_DIR isn't being used anywhere in the verification process, and therefore having our custom CAs mounted to that location isn't doing anything.
It's quite weird really, as far as I can tell OpenSSL just ignores those constants. I wonder if they're getting set in our environment after the point where it already loaded them
I'm just exploring patching the X509 integration to forcibly load them, as I've noticed we do the same in a few other places already.
From my testing the DEFAULT_CERT_DIR, even if forcibly set as a store path, still doesn't correctly load the cert either. However force-loading DEFAULT_CERT_FILE does work, so I'm opening an MR that does that.
So we have a working solution in place now until there is a release made to resolve the issues with OpenSSL::X509::DEFAULT_CERT_DIR and OpenSSL::X509::DEFAULT_CERT_FILE.
I believe the troubles we are having with adding our custom CAs go beyond these constants not being loaded by default. For the Helm Chart deployment there is the following process:
An initContainer using the alpine-certificates image is used (in part) to gather our custom CAs from numerous k8s secrets and mount them to /usr/share/pki/ca-trust-source/anchors/
Then, update-ca-trust is run to generate /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem and /etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt which are then copied into /etc/ssl/certs/ca-bundle.crt and /etc/ssl/certs/ca-bundle.trust.crt respectively.
/etc/ssl/certs/ is then mounted to the task-runner container itself.
Nothing is currently done to use these bundles with ruby from there, since OpenSSL::X509::DEFAULT_CERT_* constants look in subdirectories of /etc/pki/, which were NOT carried over from alpine-certificates. So our good certs live in /etc/ssl/certs/, where nothing is watching.
Our solution is to add additional volume mounts to all Deployments, StatefulSets, Jobs, and CronJobs where the /etc/ssl/certs/ volume is mounted. The volume mounts we patched onto these objects are the following:
```yaml
- mountPath: /etc/pki/tls/certs/
  name: etc-ssl-certs # This is the name of the volume containing the certs created by alpine-certificates in /etc/ssl/certs/
  readOnly: true
- mountPath: /etc/pki/tls/cert.pem
  subPath: ca-bundle.crt # Mount specifically ca-bundle.crt from the etc-ssl-certs volume to /etc/pki/tls/cert.pem
  name: etc-ssl-certs
  readOnly: true
```
Doing this results in OpenSSL::X509::DEFAULT_CERT_FILE pointing to /etc/pki/tls/cert.pem and OpenSSL::X509::DEFAULT_CERT_DIR pointing to /etc/pki/tls/certs/, which now contain our custom CAs. The only thing left to do from there is to run gitlab-rake gitlab:x509:update_signatures manually on the task-runner to verify signatures retrospectively using either OpenSSL::X509::DEFAULT_CERT_FILE or OpenSSL::X509::DEFAULT_CERT_DIR.
This allows sidekiq to successfully run the CreateCommitSignature Job to verify signatures against custom CAs and the task-runner to run gitlab:x509:update_signatures successfully.
I'm also observing the same behavior in 15.5.3-ee: I do a proper git commit signing using smimesign, yet no badge appears next to the commit in the UI.
Steps taken:
Added private CA root certificate to /etc/gitlab/trusted-certs and did a reconfigure
Signed commit
Verified commit signature:
```
git log --show-signature
commit 874315746639508f02311b043c2394404669b4dc (HEAD -> main, origin/main, origin/HEAD)
smimesign: Signature made using certificate ID 0x528fa8c4f9ee335907437d8bcfd8bbf11fdf7fc2
smimesign: Good signature from "CN=Sample GPG user,OU=Engineering,O=Test,L=San Jose,ST=CA,C=US"
Author: Sample GPG User <release@demo.com>
Date:   Mon Nov 14 08:13:45 2022 -0800

    signed commit test
```
I was able to run the troubleshooting steps, and it looks like the certificate is received but the signature cannot be verified. I uploaded the issuer CA to the /etc/gitlab/trusted-certs folder and did a reconfigure.
@bufferoverflow Just added you to a test project on gitlab.com. I also ran the same steps to sign the commit using an X.509 certificate similar to the on-prem/self-hosted scenario and there is no verified/un-verified badge for the test commit.
Yes, good catch. Ended up creating a new cert with an updated CRLDP and now at least the Unverified badge is showing up. I'm signing with a cert that was issued by a private CA.
@bufferoverflow Is there any way to add a private CA trust chain to gitlab.com to assist with X.509 chain verification?
Also I ended up testing this with a self-hosted gitlab instance and it is showing the unverified badge. I added the Issuer and Root to /etc/gitlab/trusted-certs and performed a gitlab-ctl reconfigure.
@zosocanuck Yeah, that's not going to work, sadly: your private CAs cannot be added to a SaaS such as gitlab.com. That will only work if you use a self-hosted GitLab, or if you switch your certificates to a publicly trusted CA.
@bufferoverflow @dlouzan Thanks for your feedback. I'm following the docs and placing the trusted chain (both issuer and root) in the /etc/gitlab/trusted-certs folder (I'm using the Omnibus installation on Ubuntu). Based on other users' experiences above, should I be dropping these issuer and root certificates in a different folder?
@bufferoverflow Yes, every time I've made a change to the GitLab trusted cert store this command has been run. Just curious: have you tested a private CA configuration in a self-hosted GitLab environment? If so, I'd love to know what I'm doing wrong.
@zosocanuck Yes, we do have internal CAs, and this works. Reviewing our playbooks, we add all required intermediate & root CAs to both the internal GitLab trust path (the one you just mentioned) and the OS trust path (which depends, of course, on your OS). But I think the latter is just for other internal tasks; I do not think you need it for GitLab.
@dlouzan Thanks for confirming this is a supported use case. Any other troubleshooting approaches or logs that could possibly provide more insight as to what's going wrong?