I just updated to 8.12 via the omnibus package, and now gitlab is returning a 404 loading /assets/application-1840f43fd20c30180b0599c214ddae7e6939d50933870904ebd0e81704658333.css, which is breaking the page layout.
Started GET "/assets/application-1840f43fd20c30180b0599c214ddae7e6939d50933870904ebd0e81704658333.css" for xxx.xxx.xxx.xxx at 2016-09-23 09:37:11 -0400
Processing by ProjectsController#show as HTML
  Parameters: {"namespace_id"=>"assets", "id"=>"application-1840f43fd20c30180b0599c214ddae7e6939d50933870904ebd0e81704658333.css"}
Filter chain halted as :project rendered or redirected
Completed 404 Not Found in 22ms (Views: 0.5ms | ActiveRecord: 2.1ms)
Please advise.
I just got the exact same issue, upgrading an omnibus installation from 8.11.7 to 8.12.0 behind an nginx reverse proxy. It's a different file, but essentially the same problem...
==> /var/log/gitlab/gitlab-rails/production.log <==
Processing by RootController#index as HTML
Completed 200 OK in 58ms (Views: 27.6ms | ActiveRecord: 3.8ms)
Started POST "/api/v3/internal/allowed" for 127.0.0.1 at 2016-09-23 15:42:54 +0000
Started POST "/ci/api/v1/builds/register.json" for xxx.xxx.xxx.xxx at 2016-09-23 15:42:54 +0000
Started GET "/assets/application-204e7b3d5eb4881844cf4ccd8b6a8a0d60ca35d2fda088c667058efbc4aa9499.css" for xxx.xxx.xxx.xxx at 2016-09-23 15:42:54 +0000
Processing by ProjectsController#show as HTML
  Parameters: {"namespace_id"=>"assets", "id"=>"application-204e7b3d5eb4881844cf4ccd8b6a8a0d60ca35d2fda088c667058efbc4aa9499.css"}
Filter chain halted as :project rendered or redirected
Completed 404 Not Found in 16ms (Views: 0.5ms | ActiveRecord: 1.6ms)
I can confirm this as well. Also using the omnibus package and an Apache reverse proxy. However, the 404 also happens when accessing it via localhost, without the reverse proxy.
And indeed the print CSS files can be accessed fine.
Same issue here.
I'm using Archlinux and I installed GitLab v8.11 successfully a couple of weeks ago following the official wiki. Yesterday I updated to v8.12 and now I'm facing the same problem.
In production, Rails should not be serving these static assets. The production.log indicates it's incorrectly getting to the Rails controller for some reason. It should be going through gitlab-workhorse, and you should see some log message such as:
2016-09-23_20:05:52.66931 worker9 gitlab-workhorse: 2016/09/23 20:05:52 Send static file "/opt/gitlab/embedded/service/gitlab-rails/public/assets/select2x2-6fe28d687dc0ed4d96016238c608ba1e7198c9c9accfa0b360b78018b9fb9bc2.png" ("") for GET "/assets/select2x2-6fe28d687dc0ed4d96016238c608ba1e7198c9c9accfa0b360b78018b9fb9bc2.png"
Can you check your gitlab-workhorse logs to see if there are any errors?
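A quick sanity check, sketched here against a sample log line from this thread: asset requests should never show up in the Rails production.log, so any match for the pattern below means the request was routed to unicorn instead of workhorse.

```shell
# A line like this in /var/log/gitlab/gitlab-rails/production.log means Rails
# (via ProjectsController) handled an /assets/ URL, i.e. the request bypassed
# workhorse. Here we grep a sample line; on a live system, grep the log itself.
line='Parameters: {"namespace_id"=>"assets", "id"=>"application-1840f43fd20c30180b0599c214ddae7e6939d50933870904ebd0e81704658333.css"}'
echo "$line" | grep -c '"namespace_id"=>"assets"'   # prints 1: a misrouted asset request
```

On an omnibus install you would run the same grep against `/var/log/gitlab/gitlab-rails/production.log`; zero matches after fixing the proxy confirms assets no longer reach Rails.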
Sure. The favicon and the print...css seem fine, but the JS and the CSS show a 404 although the files are in the assets folder.
Here is the log tail:
Started GET "/" for 91.97.38.233 at 2016-09-23 22:19:12 +0200
Processing by RootController#index as HTML
Completed 200 OK in 171ms (Views: 60.5ms | ActiveRecord: 21.2ms)
Started GET "/assets/application-c8d3f635e65582c3f37efa5258a1edf1832f5d0b02e2fb30d8303a2ae60a5692.css" for 91.97.38.233 at 2016-09-23 22:19:12 +0200
Started GET "/assets/application-591156345de7cf30941cac4744c4744835cb4639743d7a48c884017bc0fc486c.js" for 91.97.38.233 at 2016-09-23 22:19:12 +0200
Processing by ProjectsController#show as HTML
  Parameters: {"namespace_id"=>"assets", "id"=>"application-c8d3f635e65582c3f37efa5258a1edf1832f5d0b02e2fb30d8303a2ae60a5692.css"}
Filter chain halted as :project rendered or redirected
Completed 404 Not Found in 38ms (Views: 1.0ms | ActiveRecord: 5.7ms)
Started GET "/assets/print-68eed6d8135d858318821e790e25da27b2b4b9b8dbb1993fa6765d8e2e3e16ee.css" for 91.97.38.233 at 2016-09-23 22:19:12 +0200
Started GET "/assets/favicon-075eba76312e8421991a0c1f89a89ee81678bcde72319dd3e8047e2a47cd3a42.ico" for 91.97.38.233 at 2016-09-23 22:19:12 +0200
(I already ran rake cache:clear several times, plus assets:precompile, a reconfigure, and a restart.)
And then, gitlab-ctl reconfigure, gitlab-ctl restart
Works like a charm right now... so I'm not automatically upgrading my server anymore. Every update worked fine until this one.
My server is running an omnibus installation with Debian 8 (jessie), for reference.
@brianodonnell @cluxter @svenabels are your reverse proxies pointing at GitLab's nginx, GitLab's unicorn server, or the workhorse? They need to point at either workhorse or nginx, not the unicorn server.
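For anyone running their own nginx in front, here is a minimal sketch of a reverse-proxy block pointed at workhorse. The socket path matches omnibus defaults; the `server_name` and `listen` values are placeholders to adapt:

```nginx
# Hypothetical minimal external-nginx config for GitLab; adapt names and paths.
# The key point from this thread: proxy_pass must target gitlab-workhorse
# (socket or TCP), never unicorn's port 8080.
upstream gitlab-workhorse {
  server unix:/var/opt/gitlab/gitlab-workhorse/socket;
}

server {
  listen 80;
  server_name gitlab.example.com;   # placeholder

  location / {
    proxy_set_header Host $http_host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_pass http://gitlab-workhorse;
  }
}
```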
@twk3 It was working before the upgrade so if it's not pointing to the right server anymore it means the upgrade changed it - which I guess is unlikely, right?
I'm not sure where to check that exactly but in /etc/webapps/gitlab/gitlab.yml my settings didn't change vs v8.11. Under the gitlab section I still have host: cluxter.net, port: 8088 and https: false which is what I set for v8.11. Under trusted_proxies: the 3 examples are still commented out.
@twk3 For me, it's pointing at the unicorn port, but that's always been the case for years and it always worked fine. nginx['enable'] is set to false in our case.
@twk3 @cluxter I just changed the reverse proxy to point at nginx and switched nginx on in GitLab, and now it works fine. :-) However, I wonder why this is the case, because pointing at unicorn worked like a charm the whole time, until upgrading to 8.12.0.
@twk3 I'm still new with the GitLab configuration (I just installed it last week for the first time) so I'm not totally familiar with how the different elements of GitLab work together or what they even are.
Could you please give us more details about this change? For example what file(s) did you change?
Thanks!
@svenabels From what I saw, this file is used by the omnibus package, right? I'm trying, hard, to make it work from the source package, which is not very easy. The gitlab-workhorse service is installed and running according to systemctl status gitlab-workhorse, but I haven't found how to configure the reverse proxy to point at workhorse yet, as suggested by @twk3. I need to understand how workhorse has been configured first, then how to configure the reverse proxy to point at it. Any help greatly appreciated from anyone who has experience configuring GitLab from source ;-)
I'm not sure what changed in this recent release relating to this yet, but we do intend for people to use workhorse instead of directly hitting unicorn. Workhorse handles static files, git over HTTP, repository archiving, etc.
@twk3 is right here: again, if you are seeing errors in the Rails log, that means something is routing the /assets/XXX URL incorrectly to unicorn.
For those of you having trouble, it would really help if you included the settings of your reverse proxy, particularly those of you using your own instead of the one managed by omnibus. gitlab-workhorse logs may also help.
For anyone who lands here via a search engine: if you get a 404 on all static files, do NOT change your reverse proxy settings from proxy_pass http://gitlab-workhorse; to proxy_pass https://gitlab-workhorse;.
Sorry for my poor English, but maybe it's helpful to someone.
Same problem here. After updating the Ubuntu omnibus package from 8.11.x to 8.12, CSS files result in a 404.
Running behind an nginx reverse proxy, but the problem persists when trying to access the css files locally via curl (curl --cookie '_gitlab_session=…' -v http://localhost:8080/assets/application-891a61baf08dd362204cccb62419682e810754e7b9e657eb3d33897e53d5bd96.css).
@twk3 Thank you so much for your advice about using gitlab-workhorse instead of gitlab-unicorn! I configured gitlab-workhorse correctly and I am now able to use it instead of gitlab-unicorn, which almost solved the problem (and it's much faster).
However, my avatar is not showing up anymore. It is successfully uploaded into the /var/lib/gitlab/uploads/user/avatar/2/ folder, though (I am using Archlinux): the timestamp and size are correctly updated every time I upload a new avatar, and it is the right file, because I was able to open it in an image viewer and confirm it was mine. The permissions weren't set right at first according to the checking tool, so I followed its advice to modify them, but no luck. I then set 777 on the uploads folder; it didn't work. I then also set 777 recursively on its subfolders; still no luck. I also tried this:
but it didn't change anything (I obviously reloaded and restarted all the modules every time I changed something).
I am not using nginx or Apache at all - only gitlab-workhorse.
I tried uploading other files as well, in an issue in my project. The file is successfully uploaded as /var/lib/gitlab/uploads/cluxter/test1/bce78ce6d035702303ef795ed0c36acc/Test.odt; I can see it and use it. But I can't download it from my GitLab: the browser keeps waiting until it times out.
If you (or anyone else) could give me another clue on how I could investigate this issue I would be very grateful.
PS: I will write a full description of the steps I followed to solve all this once I'm done with the avatar issue.
Here is how I fixed this issue for Archlinux. Big thanks to @twk3 who gave me the key advice to unlock it. Please do not hesitate to correct me if I'm wrong somewhere or to clarify some points.
Definitions
GitLab can rely on several parts:
gitlab-unicorn: this is an HTTP server which can also act as a reverse HTTP proxy
gitlab-workhorse: from what I understood, basically the same thing as gitlab-unicorn, but much faster, more modern, and better suited for big HTTP requests. So gitlab-workhorse should be preferred over gitlab-unicorn; the GitLab team intends for people to use it instead of gitlab-unicorn.
apache: also an HTTP server + reverse HTTP proxy, so basically the same role as the 2 previous packages
nginx: like apache, but usually with better performance and easier configuration
My situation
I use Archlinux and I didn't want to use Apache or nginx. So far my GitLab was relying on gitlab-unicorn. I understood I should use gitlab-workhorse instead, since some unknown changes on gitlab-unicorn made it unusable. I knew I had installed gitlab-workhorse, but I didn't know how it was set up.
The big change
First I had to see if gitlab-workhorse was loaded:
So I understood I had to change this parameter to use a TCP connection on a network interface instead of a socket. Thus I edited the service config file like this (path given by the "systemctl status" command):
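On a systemd-based source install, one way to do this is with a drop-in that overrides the service's ExecStart. This is a hypothetical sketch: the binary path, unit name, and `-authBackend` address all depend on your distribution's packaging, so check `systemctl cat gitlab-workhorse` first.

```ini
# Hypothetical drop-in: /etc/systemd/system/gitlab-workhorse.service.d/listen.conf
# Makes workhorse listen on TCP instead of a unix socket, then proxy dynamic
# requests to unicorn on localhost:8080.
[Service]
ExecStart=
ExecStart=/usr/bin/gitlab-workhorse -listenNetwork tcp -listenAddr 127.0.0.1:8181 -authBackend http://localhost:8080
```

After adding the drop-in, `systemctl daemon-reload` and `systemctl restart gitlab-workhorse` would apply it; unlike editing the unit file directly, a drop-in survives package upgrades.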
unicorn is still the server used internally for serving the dynamic app pages. But it has a limited number of workers for dealing with long-lived requests, it isn't as fast at serving static assets as the servers most of us are familiar with, and it has request timeouts geared towards connections coming in from the browser (unlike those coming in over a git push/fetch, for example).
So instead, workhorse deals with all the things that unicorn is bad at, and proxies requests to unicorn for the stuff that unicorn is good at (serving the dynamic pages).
The nginx or Apache in front is now mostly used for SSL termination if you are using https, for playing nicely with other sites you may be hosting on the box, and for providing a well-known exposed system for things like your firewall.
We are seeing the same problem in an HA deployment when 2 or more servers are running. It turns out that the application-[hash].css file is not the same on every server. In fact, it changes every time the assets are regenerated. I attached a diff of the changes between asset compilations. It seems that keyframes and animation names are randomized; therefore the resulting hash differs, causing the 404 errors.
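The mechanism above can be illustrated with a small sketch (this is not GitLab's actual pipeline, just the general principle of fingerprinted assets): the compiled filename embeds a digest of the file contents, so any byte-level difference between servers, such as randomized @keyframes names, produces a different filename, and a request built from one server's filename 404s on another.

```shell
# Hypothetical outputs of two servers compiling the "same" stylesheet with
# randomized keyframe names; the digests (and thus the fingerprinted
# filenames derived from them) differ.
printf '@keyframes a1 { }' > /tmp/app_server1.css   # server 1's output
printf '@keyframes b7 { }' > /tmp/app_server2.css   # server 2's output
sha256sum /tmp/app_server1.css /tmp/app_server2.css # two different digests
```

Comparing the digests of `public/assets/application-*.css` across your HA nodes the same way would confirm whether non-deterministic compilation is the cause.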
@cluxter Thanks for giving everyone a summary of what worked for you. I'm not aware of anything that changed from 8.11 to 8.12 that would suddenly cause this problem (the current routing to gitlab-workhorse has been there for some time), but perhaps something else triggered it.
@filipa No, this does not look like a frontend-specific issue; it is more likely a configuration problem. I've removed the label.
I have just done a fresh install of gitlab on a new CentOS 7 machine, and have the same issue. I can see that the css file is present at /opt/gitlab/embedded/service/gitlab-rails/public/assets/application-1840f43fd20c30180b0599c214ddae7e6939d50933870904ebd0e81704658333.css however.
I can confirm that the issue was resolvable entirely in my apache configuration, which was originally implemented for 7.x. Some URLs were being proxied to unicorn instead of workhorse. Thanks for the help.
I don't know if this helps, but I did a gitlab-ctl tail and reloaded the site. The CSS file starting with application- is processed by a ProjectsController, which results in the 401 error. The other asset files are fine, if I understand it correctly. Does this help to find a solution?
==> /var/log/gitlab/gitlab-rails/production.log <==
Started GET "/users/sign_in" for aaa.bbb.ccc.ddd at 2016-10-01 07:16:14 +0200
Processing by SessionsController#new as HTML
Completed 200 OK in 27ms (Views: 7.7ms | ActiveRecord: 2.0ms)
Started GET "/assets/application-481a0db50fbd5baaf710363e35d7ac792af08f710a842cf7e084cfd1cbe786d3.css" for aaa.bbb.ccc.ddd at 2016-10-01 07:16:14 +0200
Processing by ProjectsController#show as HTML
  Parameters: {"namespace_id"=>"assets", "id"=>"application-481a0db50fbd5baaf710363e35d7ac792af08f710a842cf7e084cfd1cbe786d3.css"}
Completed 401 Unauthorized in 21ms (ActiveRecord: 2.1ms)
Started GET "/assets/application-5fa120300b338070e8b574cfe457926b9cea7df9de1053d1b17f3439fb8b379f.js" for aaa.bbb.ccc.ddd at 2016-10-01 07:16:14 +0200
Started GET "/assets/print-68eed6d8135d858318821e790e25da27b2b4b9b8dbb1993fa6765d8e2e3e16ee.css" for aaa.bbb.ccc.ddd at 2016-10-01 07:16:14 +0200
Started GET "/users/sign_in" for aaa.bbb.ccc.ddd at 2016-10-01 07:16:15 +0200
Processing by SessionsController#new as HTML
Completed 200 OK in 27ms (Views: 8.0ms | ActiveRecord: 2.0ms)
@RalfEggert It totally does.
Following what has been discussed in this thread, you need to use (and hence connect to) gitlab-workhorse instead of gitlab-unicorn.
The line with process 34366 shows that gitlab-workhorse has been launched with these parameters (among others):
According to the gitlab-workhorse manual, this means that your gitlab-workhorse is listening on a UNIX socket instead of a TCP/IP address.
So find the Ubuntu init script that launches the gitlab-workhorse service (I don't know anything about the Ubuntu init system), and use these parameters instead:
-listenNetwork tcp -listenAddr 127.0.0.1:8181
BTW these are the default settings; it should work the same if you just delete these 2 parameters.
However this means that you'll have to configure a reverse proxy like Apache or nginx on the same machine to point to 127.0.0.1:8181, because 127.0.0.1 is only accessible from the local machine.
If you want to make gitlab-workhorse directly accessible from your LAN and/or NAT, use something like this (assuming your local IP address is 192.168.0.1):
-listenNetwork tcp -listenAddr 192.168.0.1:8181
Then you will be able to connect to gitlab-workhorse using http://192.168.0.1:8181 from your LAN and redirect ports on your NAT to this IP:port pair.
Thanks for your kind help. I still feel lost, though, because I have no idea where and how to change the configuration.
I did find the /opt/gitlab/service/gitlab-workhorse/run file, which configures listenNetwork to unix and listenAddr to /var/opt/gitlab/gitlab-workhorse/socket. I could change the values, but this file is overwritten by every gitlab-ctl reconfigure run.
In the /etc/gitlab/gitlab.rb configuration file there is no mention of gitlab-workhorse at all. All lines with the configuration for the mentioned unicorn are commented out completely.
So, I have no idea how to change it. Any further help is really appreciated.
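For an omnibus install, the workhorse listener can be changed in /etc/gitlab/gitlab.rb rather than in the generated run file, so the change survives `gitlab-ctl reconfigure`. A sketch, using key names from the omnibus template (verify against the template shipped with your version):

```ruby
# Hypothetical /etc/gitlab/gitlab.rb fragment: switch workhorse from its
# default unix socket to a TCP listener, then run `gitlab-ctl reconfigure`.
gitlab_workhorse['listen_network'] = "tcp"
gitlab_workhorse['listen_addr'] = "127.0.0.1:8181"
```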
Funny enough, I had just stumbled upon the same lines and wanted to add them right before you replied. After reconfiguring, the settings in /opt/gitlab/service/gitlab-workhorse/run have been overwritten as expected. But it still did not help.
I guess I need to set up the reverse proxy like the Apache one you mentioned, and again I have no idea how to do that. Need to dig in again. I found this, but I'm not sure if it is the correct one:
We use Apache as a proxy and were proxying to Rails instead of workhorse. Proxying to workhorse (and starting workhorse) solved this problem.
It also solved a similar issue about not being able to edit .md files.
Same issue here while upgrading from 8.10 to 8.12.4. We use Apache proxying to Rails.
Fixed with what @cluxter is mentioning above (configure gitlab-workhorse to listen to a port and redirect apache to that port instead of rails).
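For Apache users, the change described above can be sketched as a vhost fragment. This is a hypothetical example: port 8181 assumes workhorse was started with `-listenAddr 127.0.0.1:8181`, and `RequestHeader` requires mod_headers.

```apache
# Hypothetical Apache vhost fragment: proxy everything to workhorse on
# 127.0.0.1:8181 instead of the Rails/unicorn port.
ProxyPreserveHost On
ProxyPass        / http://127.0.0.1:8181/
ProxyPassReverse / http://127.0.0.1:8181/
# Only if Apache terminates SSL in front of GitLab:
RequestHeader set X-Forwarded-Proto "https"
```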
Thank you! :-)
Going forward, I think any new installation should create an (updated) gitlab.rb.new, so that we always have the up-to-date config options.
I can't believe I wasted so much time on this. I installed GitLab for the first time (on a fresh Ubuntu) and saw a new process listening on 8080. I thought "OK, that must be GitLab", and it is GitLab, but on 8080 you get these weird problems with CSS.
The actual GitLab, with properly working CSS, is on nginx, port 80 :/
Hoping that this will help someone.
Had a similar issue on CentOS 7, with the CSS on the sign-in page giving a 404 error.
Using feedback from several bugs logged with similar errors, I managed to fix it using the following method. This is based on an omnibus install.
I have run it prior to an upgrade, and the upgrade then succeeds; I have then upgraded to another version again to ensure that it is carried through. It has also been run after the upgrade, which may then need a restart once carried out.
Note: rebuilding the assets may not be considered good practice with an omnibus install.
#### Clear all assets known currently
# gitlab-rake assets:clean
#### Ensure git has the access required to the directory
# setfacl -R -m u:git:rwX /opt/gitlab/embedded/service/gitlab-rails/public/assets/
#### Rebuild all the assets - change the ENV if yours does not match
# gitlab-rake assets:precompile RAILS_ENV=production
#### Give all users read access to that folder
# chmod -R a+rX /opt/gitlab/embedded/service/gitlab-rails/public/assets/
#### Remove git's write access
# setfacl -R -x u:git /opt/gitlab/embedded/service/gitlab-rails/public/assets/
If a restart is required:
#### Restart services using gitlab control
# gitlab-ctl restart
#### Or restart the whole gitlab service tree at the OS level (CentOS)
# systemctl restart gitlab-runsvdir
@carpnutter That fixed it for our machines! We've been having this issue on seemingly random machines and haven't been able to upgrade past 8.15.2 because of it.
I'm going crazy! Nothing works for me... :( I've read so many threads in so many forums!
I have a nginx server which manages websites via subdirectories ( location /foo {...} and location /bar {...}).
The server is behind a dns at home and all sites are working with ssl perfectly.
My GitLab server (example IP: 192.168.1.50) is next to the webserver on the same network. The webserver should forward to the GitLab server, and forwarding works. Only the asset files do not work on https://www.example.com/git; I get the said 404 errors.
After editing the nginx webserver config I reloaded it with systemctl reload nginx. After editing the GitLab config I ran gitlab-ctl reconfigure and gitlab-ctl restart.
Just chiming in to say that two of our CentOS EE instances had this happen again when updating from 8.16.1 -> 8.16.4. Our third instance went from 8.16.2 -> 8.16.4 and didn't have any issues. That third machine has actually never had this happen, for some reason.
Be aware of the warning at the bottom (about "If you are using relative urls remove the block below"); this was working in the past, but broke during a migration from 8.16.2-ce to 8.16.6-ce on Debian (omnibus installation, behind a separate nginx server via a unix socket).
Restart the nginx server (sudo service nginx restart). I tried setting the assets directory permissions, rebuilding the assets, reconfiguring, restarting... well, everything. But commenting out this block did work in my case.
Well, I didn't know I used relative paths, but yeah, I did... No idea how to make the paths absolute, btw.
Recompiling assets appears to have been disabled in the omnibus version somewhere between 8.16.5 and 8.17.3, and now you'll get this error message when trying:
rake aborted!
ExecJS::Error: ExecJS disabled
/opt/gitlab/embedded/bin/bundle:22:in `load'
/opt/gitlab/embedded/bin/bundle:22:in `<main>'
Tasks: TOP => assets:precompile
I didn't set up our nginx configuration, but it certainly doesn't look like the advanced config. I may need to have a talk with the person who did. Looks like we're going to be on 8.16.5 for a while.
@smcgivern Whenever I upgrade some of our EE machines, the CSS stops working. I know the docs say there should be no need to recompile on omnibus, but it fixes it, so I did it anyway.
We aren't using a relative url, but we (unfortunately) have multiple services on the same machine, so we use an external Nginx install to handle the subdomains. I didn't do the configuration for Nginx, but it appears to only have a / block that forwards everything to the workhorse socket, which seems roughly correct.
I could have sworn it happens on one machine with external nginx and on one that is pure omnibus, while another that is pure omnibus is wholly unaffected. I haven't been keeping things as up to date as I should lately, so I may be misremembering slightly.
Once classes are over for the day I can break a machine to scrape some logs if that will help, though I believe it looked a lot like those in the original post.
@smcgivern Here's the tail from loading the sign in page
==> /var/log/gitlab/gitlab-rails/production.log <==
Started GET "/users/sign_in" for 128.206.116.250 at 2017-03-16 12:41:45 -0500
Processing by SessionsController#new as HTML
Completed 200 OK in 140ms (Views: 70.6ms | ActiveRecord: 14.8ms)
==> /var/log/gitlab/gitlab-workhorse/current <==
2017-03-16_17:41:46.18454 git.dsa.missouri.edu @ - - [2017-03-16 12:41:45.841156552 -0500 CDT] "GET /users/sign_in HTTP/1.1" 200 11736 "https://git.dsa.missouri.edu/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0" 0.343307
2017-03-16_17:41:46.23421 git.dsa.missouri.edu @ - - [2017-03-16 12:41:46.233959361 -0500 CDT] "GET /assets/application-a6dd150d84720bf9a3c0d83ce742846db842b2f38248e1dd91159801d5aa5f41.css HTTP/1.1" 404 19 "https://git.dsa.missouri.edu/users/sign_in" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0" 0.000153
2017-03-16_17:41:46.23507 2017/03/16 12:41:46 Send static file "/opt/gitlab/embedded/service/gitlab-rails/public/assets/print-9c3a1eb4a2f45c9f3d7dd4de03f14c2e6b921e757168b595d7f161bbc320fc05.css" ("gzip") for GET "/assets/print-9c3a1eb4a2f45c9f3d7dd4de03f14c2e6b921e757168b595d7f161bbc320fc05.css"
2017-03-16_17:41:46.24735 git.dsa.missouri.edu @ - - [2017-03-16 12:41:46.234860011 -0500 CDT] "GET /assets/print-9c3a1eb4a2f45c9f3d7dd4de03f14c2e6b921e757168b595d7f161bbc320fc05.css HTTP/1.1" 200 350 "https://git.dsa.missouri.edu/users/sign_in" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0" 0.012422
2017-03-16_17:41:46.28646 2017/03/16 12:41:46 Send static file "/opt/gitlab/embedded/service/gitlab-rails/public/assets/webpack/application-70c67ed9d057a03a61dd-v2.js" ("gzip") for GET "/assets/webpack/application-70c67ed9d057a03a61dd-v2.js"
2017-03-16_17:41:46.28757 git.dsa.missouri.edu @ - - [2017-03-16 12:41:46.2862999 -0500 CDT] "GET /assets/webpack/application-70c67ed9d057a03a61dd-v2.js HTTP/1.1" 200 269494 "https://git.dsa.missouri.edu/users/sign_in" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0" 0.001194
==> /var/log/gitlab/redis/current <==
2017-03-16_17:41:47.08736 74734:M 16 Mar 12:41:47.087 * 10000 changes in 60 seconds. Saving...
2017-03-16_17:41:47.08812 74734:M 16 Mar 12:41:47.088 * Background saving started by pid 75560
2017-03-16_17:41:47.09201 75560:C 16 Mar 12:41:47.091 * DB saved on disk
2017-03-16_17:41:47.09245 75560:C 16 Mar 12:41:47.092 * RDB: 6 MB of memory used by copy-on-write
==> /var/log/gitlab/gitlab-workhorse/current <==
2017-03-16_17:41:47.17153 2017/03/16 12:41:47 Send static file "/opt/gitlab/embedded/service/gitlab-rails/public/assets/favicon-075eba76312e8421991a0c1f89a89ee81678bcde72319dd3e8047e2a47cd3a42.ico" ("gzip") for GET "/assets/favicon-075eba76312e8421991a0c1f89a89ee81678bcde72319dd3e8047e2a47cd3a42.ico"
2017-03-16_17:41:47.17162 git.dsa.missouri.edu @ - - [2017-03-16 12:41:47.171329345 -0500 CDT] "GET /assets/favicon-075eba76312e8421991a0c1f89a89ee81678bcde72319dd3e8047e2a47cd3a42.ico HTTP/1.1" 200 1384 "" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0" 0.000232
==> /var/log/gitlab/redis/current <==
2017-03-16_17:41:47.18890 74734:M 16 Mar 12:41:47.188 * Background saving terminated with success
The console in Firefox always reports a MIME issue when this happens, but I'm not sure it's really valid. How can the type mismatch if the file was never received? Maybe it's trying to interpret the 404 error page as the CSS.
GET https://git.dsa.missouri.edu/assets/application-a6dd150d84720bf9a3c0d83ce742846db842b2f38248e1dd91159801d5aa5f41.css [HTTP/1.1 404 Not Found 1ms]
The resource from “https://git.dsa.missouri.edu/assets/application-a6dd150d84720bf9a3c0d83ce742846db842b2f38248e1dd91159801d5aa5f41.css” was blocked due to MIME type mismatch (X-Content-Type-Options: nosniff).
Rake check doesn't reveal anything, but I'll attach it anyway. rake_check.txt
@smcgivern Cloned a different machine with the same issue so we can pick it apart if need be, here's the log tail as well as the contents of /opt/gitlab/embedded/service/gitlab-rails/public/assets/webpack/
==> /var/log/gitlab/gitlab-rails/production.log <==
Started GET "/" for 172.16.20.50 at 2017-03-17 12:40:25 -0500
Processing by RootController#index as HTML
Redirected to https://172.16.11.168/users/sign_in
Filter chain halted as :redirect_unlogged_user rendered or redirected
Completed 302 Found in 20ms (ActiveRecord: 1.6ms)
==> /var/log/gitlab/gitlab-workhorse/current <==
2017-03-17_17:40:25.34483 172.16.11.168 @ - - [2017-03-17 12:40:25.313959779 -0500 CDT] "GET / HTTP/1.1" 302 101 "" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36" 0.030805
==> /var/log/gitlab/nginx/gitlab_access.log <==
172.16.20.50 - - [17/Mar/2017:12:40:25 -0500] "GET / HTTP/2.0" 302 388 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36"
==> /var/log/gitlab/gitlab-rails/production.log <==
Started GET "/users/sign_in" for 172.16.20.50 at 2017-03-17 12:40:25 -0500
Processing by SessionsController#new as HTML
Completed 200 OK in 43ms (Views: 17.7ms | ActiveRecord: 3.8ms)
==> /var/log/gitlab/gitlab-workhorse/current <==
2017-03-17_17:40:25.39413 172.16.11.168 @ - - [2017-03-17 12:40:25.34765579 -0500 CDT] "GET /users/sign_in HTTP/1.1" 200 8154 "" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36" 0.046407
==> /var/log/gitlab/nginx/gitlab_access.log <==
172.16.20.50 - - [17/Mar/2017:12:40:25 -0500] "GET /users/sign_in HTTP/2.0" 200 8458 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36"
==> /var/log/gitlab/gitlab-workhorse/current <==
2017-03-17_17:40:25.39920 172.16.11.168 @ - - [2017-03-17 12:40:25.399042957 -0500 CDT] "GET /assets/application-a6dd150d84720bf9a3c0d83ce742846db842b2f38248e1dd91159801d5aa5f41.css HTTP/1.1" 404 19 "https://172.16.11.168/users/sign_in" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36" 0.000095
2017-03-17_17:40:25.39926 2017/03/17 12:40:25 Send static file "/opt/gitlab/embedded/service/gitlab-rails/public/assets/webpack/application-70c67ed9d057a03a61dd-v2.js" ("gzip") for GET "/assets/webpack/application-70c67ed9d057a03a61dd-v2.js"
==> /var/log/gitlab/nginx/gitlab_access.log <==
172.16.20.50 - - [17/Mar/2017:12:40:25 -0500] "GET /assets/application-a6dd150d84720bf9a3c0d83ce742846db842b2f38248e1dd91159801d5aa5f41.css HTTP/2.0" 404 107 "https://172.16.11.168/users/sign_in" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36"
==> /var/log/gitlab/gitlab-workhorse/current <==
2017-03-17_17:40:25.40109 172.16.11.168 @ - - [2017-03-17 12:40:25.399137624 -0500 CDT] "GET /assets/webpack/application-70c67ed9d057a03a61dd-v2.js HTTP/1.1" 200 269494 "https://172.16.11.168/users/sign_in" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36" 0.001857
==> /var/log/gitlab/nginx/gitlab_access.log <==
172.16.20.50 - - [17/Mar/2017:12:40:25 -0500] "GET /assets/webpack/application-70c67ed9d057a03a61dd-v2.js HTTP/2.0" 200 269650 "https://172.16.11.168/users/sign_in" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36"
==> /var/log/gitlab/gitlab-workhorse/current <==
2017-03-17_17:40:25.43261 2017/03/17 12:40:25 Send static file "/opt/gitlab/embedded/service/gitlab-rails/public/assets/print-9c3a1eb4a2f45c9f3d7dd4de03f14c2e6b921e757168b595d7f161bbc320fc05.css" ("gzip") for GET "/assets/print-9c3a1eb4a2f45c9f3d7dd4de03f14c2e6b921e757168b595d7f161bbc320fc05.css"
2017-03-17_17:40:25.43270 172.16.11.168 @ - - [2017-03-17 12:40:25.432499501 -0500 CDT] "GET /assets/print-9c3a1eb4a2f45c9f3d7dd4de03f14c2e6b921e757168b595d7f161bbc320fc05.css HTTP/1.1" 200 350 "https://172.16.11.168/users/sign_in" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36" 0.000138
==> /var/log/gitlab/nginx/gitlab_access.log <==
172.16.20.50 - - [17/Mar/2017:12:40:25 -0500] "GET /assets/print-9c3a1eb4a2f45c9f3d7dd4de03f14c2e6b921e757168b595d7f161bbc320fc05.css HTTP/2.0" 200 528 "https://172.16.11.168/users/sign_in" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36"
[root@corona-dbg webpack]# ll
total 21016
-rw-r--r--. 1 root root 945194 Mar 7 16:09 application-70c67ed9d057a03a61dd-v2.js
-rw-r--r--. 1 root root 269494 Mar 7 16:09 application-70c67ed9d057a03a61dd-v2.js.gz
-rw-r--r--. 1 root root 5512928 Mar 7 16:09 application-70c67ed9d057a03a61dd-v2.js.map
-rw-r--r--. 1 root root 1372152 Mar 7 16:09 application-70c67ed9d057a03a61dd-v2.js.map.gz
-rw-r--r--. 1 root root 2753 Mar 7 16:09 blob_edit-015d33cb3fb9d27ec767-v2.js
-rw-r--r--. 1 root root 1104 Mar 7 16:09 blob_edit-015d33cb3fb9d27ec767-v2.js.gz
-rw-r--r--. 1 root root 17219 Mar 7 16:09 blob_edit-015d33cb3fb9d27ec767-v2.js.map
-rw-r--r--. 1 root root 4143 Mar 7 16:09 blob_edit-015d33cb3fb9d27ec767-v2.js.map.gz
-rw-r--r--. 1 root root 146615 Mar 7 16:09 boards-662e551ffc94960e61fe-v2.js
-rw-r--r--. 1 root root 45778 Mar 7 16:09 boards-662e551ffc94960e61fe-v2.js.gz
-rw-r--r--. 1 root root 639844 Mar 7 16:09 boards-662e551ffc94960e61fe-v2.js.map
-rw-r--r--. 1 root root 177064 Mar 7 16:09 boards-662e551ffc94960e61fe-v2.js.map.gz
-rw-r--r--. 1 root root 2225 Mar 7 16:09 boards_test-990daf654abf8eb792c5-v2.js
-rw-r--r--. 1 root root 1098 Mar 7 16:09 boards_test-990daf654abf8eb792c5-v2.js.gz
-rw-r--r--. 1 root root 17002 Mar 7 16:09 boards_test-990daf654abf8eb792c5-v2.js.map
-rw-r--r--. 1 root root 4312 Mar 7 16:09 boards_test-990daf654abf8eb792c5-v2.js.map.gz
-rw-r--r--. 1 root root 108347 Mar 7 16:09 commit_pipelines-bc2b628333c1173b4ed6-v2.js
-rw-r--r--. 1 root root 36241 Mar 7 16:09 commit_pipelines-bc2b628333c1173b4ed6-v2.js.gz
-rw-r--r--. 1 root root 470370 Mar 7 16:09 commit_pipelines-bc2b628333c1173b4ed6-v2.js.map
-rw-r--r--. 1 root root 142111 Mar 7 16:09 commit_pipelines-bc2b628333c1173b4ed6-v2.js.map.gz
-rw-r--r--. 1 root root 99574 Mar 7 16:09 cycle_analytics-5feed6bcfb6646b7e987-v2.js
-rw-r--r--. 1 root root 32505 Mar 7 16:09 cycle_analytics-5feed6bcfb6646b7e987-v2.js.gz
-rw-r--r--. 1 root root 374908 Mar 7 16:09 cycle_analytics-5feed6bcfb6646b7e987-v2.js.map
-rw-r--r--. 1 root root 112922 Mar 7 16:09 cycle_analytics-5feed6bcfb6646b7e987-v2.js.map.gz
-rw-r--r--. 1 root root 90615 Mar 7 16:09 diff_notes-6ed778a5a0d5e30b8acb-v2.js
-rw-r--r--. 1 root root 31057 Mar 7 16:09 diff_notes-6ed778a5a0d5e30b8acb-v2.js.gz
-rw-r--r--. 1 root root 392193 Mar 7 16:09 diff_notes-6ed778a5a0d5e30b8acb-v2.js.map
-rw-r--r--. 1 root root 116991 Mar 7 16:09 diff_notes-6ed778a5a0d5e30b8acb-v2.js.map.gz
-rw-r--r--. 1 root root 113364 Mar 7 16:09 environments-17f6d709b1764175f9d5-v2.js
-rw-r--r--. 1 root root 37560 Mar 7 16:09 environments-17f6d709b1764175f9d5-v2.js.gz
-rw-r--r--. 1 root root 491647 Mar 7 16:09 environments-17f6d709b1764175f9d5-v2.js.map
-rw-r--r--. 1 root root 143609 Mar 7 16:09 environments-17f6d709b1764175f9d5-v2.js.map.gz
-rw-r--r--. 1 root root 26436 Mar 7 16:09 filtered_search-f002d80f212885c0b347-v2.js
-rw-r--r--. 1 root root 5945 Mar 7 16:09 filtered_search-f002d80f212885c0b347-v2.js.gz
-rw-r--r--. 1 root root 128471 Mar 7 16:09 filtered_search-f002d80f212885c0b347-v2.js.map
-rw-r--r--. 1 root root 27947 Mar 7 16:09 filtered_search-f002d80f212885c0b347-v2.js.map.gz
-rw-r--r--. 1 root root 163572 Mar 7 16:09 graphs-50167501d51321b9add8-v2.js
-rw-r--r--. 1 root root 56114 Mar 7 16:09 graphs-50167501d51321b9add8-v2.js.gz
-rw-r--r--. 1 root root 1035693 Mar 7 16:09 graphs-50167501d51321b9add8-v2.js.map
-rw-r--r--. 1 root root 242483 Mar 7 16:09 graphs-50167501d51321b9add8-v2.js.map.gz
-rw-r--r--. 1 root root 15121 Mar 7 16:09 issuable-4395f065dec7b1377787-v2.js
-rw-r--r--. 1 root root 3610 Mar 7 16:09 issuable-4395f065dec7b1377787-v2.js.gz
-rw-r--r--. 1 root root 67230 Mar 7 16:09 issuable-4395f065dec7b1377787-v2.js.map
-rw-r--r--. 1 root root 13121 Mar 7 16:09 issuable-4395f065dec7b1377787-v2.js.map.gz
-rw-r--r--. 1 root root 52930 Mar 7 16:09 lib_chart-3b350422be827e3c8b71-v2.js
-rw-r--r--. 1 root root 11970 Mar 7 16:09 lib_chart-3b350422be827e3c8b71-v2.js.gz
-rw-r--r--. 1 root root 320201 Mar 7 16:09 lib_chart-3b350422be827e3c8b71-v2.js.map
-rw-r--r--. 1 root root 68268 Mar 7 16:09 lib_chart-3b350422be827e3c8b71-v2.js.map.gz
-rw-r--r--. 1 root root 152231 Mar 7 16:09 lib_d3-5a909fc0117033fbe705-v2.js
-rw-r--r--. 1 root root 53492 Mar 7 16:09 lib_d3-5a909fc0117033fbe705-v2.js.gz
-rw-r--r--. 1 root root 975144 Mar 7 16:09 lib_d3-5a909fc0117033fbe705-v2.js.map
-rw-r--r--. 1 root root 229011 Mar 7 16:09 lib_d3-5a909fc0117033fbe705-v2.js.map.gz
-rw-r--r--. 1 root root 77389 Mar 7 16:09 lib_vue-39b983fba26af5199758-v2.js
-rw-r--r--. 1 root root 27860 Mar 7 16:09 lib_vue-39b983fba26af5199758-v2.js.gz
-rw-r--r--. 1 root root 311898 Mar 7 16:09 lib_vue-39b983fba26af5199758-v2.js.map
-rw-r--r--. 1 root root 99759 Mar 7 16:09 lib_vue-39b983fba26af5199758-v2.js.map.gz
-rw-r--r--. 1 root root 20271 Mar 7 16:09 manifest.json
-rw-r--r--. 1 root root 76936 Mar 7 16:09 merge_conflicts-5aca6c48c77ac46f4115-v2.js
-rw-r--r--. 1 root root 27061 Mar 7 16:09 merge_conflicts-5aca6c48c77ac46f4115-v2.js.gz
-rw-r--r--. 1 root root 295167 Mar 7 16:09 merge_conflicts-5aca6c48c77ac46f4115-v2.js.map
-rw-r--r--. 1 root root 95358 Mar 7 16:09 merge_conflicts-5aca6c48c77ac46f4115-v2.js.map.gz
-rw-r--r--. 1 root root 1760 Mar 7 16:09 merge_request_widget-8f66cd055bd5899a1cfa-v2.js
-rw-r--r--. 1 root root 718 Mar 7 16:09 merge_request_widget-8f66cd055bd5899a1cfa-v2.js.gz
-rw-r--r--. 1 root root 11461 Mar 7 16:09 merge_request_widget-8f66cd055bd5899a1cfa-v2.js.map
-rw-r--r--. 1 root root 2798 Mar 7 16:09 merge_request_widget-8f66cd055bd5899a1cfa-v2.js.map.gz
-rw-r--r--. 1 root root 87121 Mar 7 16:09 mr_widget_ee-092cf0edd691a2a06d75-v2.js
-rw-r--r--. 1 root root 30039 Mar 7 16:09 mr_widget_ee-092cf0edd691a2a06d75-v2.js.gz
-rw-r--r--. 1 root root 352270 Mar 7 16:09 mr_widget_ee-092cf0edd691a2a06d75-v2.js.map
-rw-r--r--. 1 root root 109111 Mar 7 16:09 mr_widget_ee-092cf0edd691a2a06d75-v2.js.map.gz
-rw-r--r--. 1 root root 9017 Mar 7 16:09 network-26aff28014e5c3396a85-v2.js
-rw-r--r--. 1 root root 3135 Mar 7 16:09 network-26aff28014e5c3396a85-v2.js.gz
-rw-r--r--. 1 root root 53423 Mar 7 16:09 network-26aff28014e5c3396a85-v2.js.map
-rw-r--r--. 1 root root 11954 Mar 7 16:09 network-26aff28014e5c3396a85-v2.js.map.gz
-rw-r--r--. 1 root root 7991 Mar 7 16:09 profile-1b5e086be9f380745af0-v2.js
-rw-r--r--. 1 root root 2714 Mar 7 16:09 profile-1b5e086be9f380745af0-v2.js.gz
-rw-r--r--. 1 root root 41113 Mar 7 16:09 profile-1b5e086be9f380745af0-v2.js.map
-rw-r--r--. 1 root root 9680 Mar 7 16:09 profile-1b5e086be9f380745af0-v2.js.map.gz
-rw-r--r--. 1 root root 15260 Mar 7 16:09 protected_branches-b0453100e933de3cbc85-v2.js
-rw-r--r--. 1 root root 4183 Mar 7 16:09 protected_branches-b0453100e933de3cbc85-v2.js.gz
-rw-r--r--. 1 root root 87158 Mar 7 16:09 protected_branches-b0453100e933de3cbc85-v2.js.map
-rw-r--r--. 1 root root 19201 Mar 7 16:09 protected_branches-b0453100e933de3cbc85-v2.js.map.gz
-rw-r--r--. 1 root root 974 Mar 7 16:09 snippet-11fe6c9225a081b78d02-v2.js
-rw-r--r--. 1 root root 542 Mar 7 16:09 snippet-11fe6c9225a081b78d02-v2.js.gz
-rw-r--r--. 1 root root 8973 Mar 7 16:09 snippet-11fe6c9225a081b78d02-v2.js.map
-rw-r--r--. 1 root root 2339 Mar 7 16:09 snippet-11fe6c9225a081b78d02-v2.js.map.gz
-rw-r--r--. 1 root root 604357 Mar 7 16:09 terminal-90c9d241fa9298c28b0e-v2.js
-rw-r--r--. 1 root root 210611 Mar 7 16:09 terminal-90c9d241fa9298c28b0e-v2.js.gz
-rw-r--r--. 1 root root 1652629 Mar 7 16:09 terminal-90c9d241fa9298c28b0e-v2.js.map
-rw-r--r--. 1 root root 520588 Mar 7 16:09 terminal-90c9d241fa9298c28b0e-v2.js.map.gz
-rw-r--r--. 1 root root 5687 Mar 7 16:09 users-e8a391b905ba4224d868-v2.js
-rw-r--r--. 1 root root 2032 Mar 7 16:09 users-e8a391b905ba4224d868-v2.js.gz
-rw-r--r--. 1 root root 33008 Mar 7 16:09 users-e8a391b905ba4224d868-v2.js.map
-rw-r--r--. 1 root root 7499 Mar 7 16:09 users-e8a391b905ba4224d868-v2.js.map.gz
-rw-r--r--. 1 root root 109912 Mar 7 16:09 vue_pipelines-d7db90bf895a28ab91d2-v2.js
-rw-r--r--. 1 root root 36995 Mar 7 16:09 vue_pipelines-d7db90bf895a28ab91d2-v2.js.gz
-rw-r--r--. 1 root root 476473 Mar 7 16:09 vue_pipelines-d7db90bf895a28ab91d2-v2.js.map
-rw-r--r--. 1 root root 143446 Mar 7 16:09 vue_pipelines-d7db90bf895a28ab91d2-v2.js.map.gz
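(For anyone comparing their own install: a minimal diagnostic sketch of the mismatch being discussed here. The assets path below is the usual omnibus default and may differ on your system; the `requested` fingerprint is the one from the 404 in the original report, so substitute the filename from your own production.log.)

```shell
#!/bin/sh
# Hedged sketch, not an official procedure: check whether the CSS fingerprint
# the page requests actually exists among the compiled assets.
ASSETS_DIR=${ASSETS_DIR:-/opt/gitlab/embedded/service/gitlab-rails/public/assets}

# Fingerprint taken from the 404 line in production.log (replace with yours):
requested="application-1840f43fd20c30180b0599c214ddae7e6939d50933870904ebd0e81704658333.css"

# List the application-*.css fingerprints actually on disk, if any
if [ -d "$ASSETS_DIR" ]; then
  find "$ASSETS_DIR" -maxdepth 1 -name 'application-*.css'
fi

if [ -f "$ASSETS_DIR/$requested" ]; then
  echo "asset is on disk: suspect the proxy/nginx config instead"
else
  echo "asset missing: compiled assets and the page's manifest are out of sync"
fi
```

If the file is on disk but still 404s, the problem is likely in front of Rails (nginx or the reverse proxy); if it is missing, the precompiled assets do not match what the pages link to.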
@smcgivern Well, I suppose that explains the issue. It's 8.17.3 as far as yum is concerned. I just noticed that the timestamps don't match the install time (March 17th). Is the 7th when 8.17.3 was released/built?
We did release it on 7 March, yes. And I'm an idiot - I was looking at the Debian package, but it does actually contain the same CSS file as your installation.
@vilhelmen as you're using EE, would you mind emailing support about this? I tried - and failed - to help here, but they are much better at this than I am.
@smcgivern Had a nice debugging session with support today. Short version: No one knows what the issue is.
We were, however, able to fix it by purging the GitLab install and restoring from a backup, which I suppose was the nuclear option. So, if you're using omnibus, are not using a relative path, and are having the problem, a purge/restore may be the easiest fix. It's actually pretty simple, at least with our basic setup.
Here's how I fixed our Cent7 machines:
Run a backup gitlab-rake gitlab:backup:create
Move the backup from /var/opt/gitlab/backups
Copy the /etc/gitlab directory
Run the gitlab uninstaller gitlab-ctl uninstall (Make sure the etc dir and the backup have been moved)
Stop/disable/kill the systemd service and anything still running (this may be better done before the uninstall): systemctl stop gitlab-runsvdir.service
Uninstall the GitLab package with yum erase gitlab-ee (make sure you remember which version you are on)
Delete all remaining gitlab data (you probably don't have to wipe the log dir) rm -rf /var/opt/gitlab /var/log/gitlab /etc/gitlab /opt/gitlab
Reinstall whatever version you were on yum install gitlab-ee-VERSION
Recheck/start/enable the gitlab systemd service systemctl daemon-reload, systemctl enable gitlab-runsvdir.service, and systemctl start gitlab-runsvdir.service
Edit /etc/gitlab/gitlab.rb to configure the external URL, reconfigure GitLab with gitlab-ctl reconfigure, and set the root password in the browser.
Wipe the /etc/gitlab directory, replace it with the copy from earlier, and reconfigure GitLab again: gitlab-ctl reconfigure
Copy the backup back to /var/opt/gitlab/backups/, chown it to git:git, and copy the timestamp string (looks like EPOCH_YYYY_MM_DD).
Stop the unicorn and sidekiq services: gitlab-ctl stop unicorn and gitlab-ctl stop sidekiq
Restore backup with gitlab-rake gitlab:backup:restore BACKUP=BACKUP_TIMESTAMP_STRING, answer yes to all three(?) prompts
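The steps above can be collected into one sketch. This is a hedged, unofficial condensation for a CentOS 7 omnibus EE install, not a GitLab-supported script: the function name, SAFE_DIR, and the `*_gitlab_backup.tar` glob are my own assumptions, and you should run the steps one at a time (and keep the restore prompts interactive) rather than executing this blindly.

```shell
#!/bin/sh
# Hedged sketch of the purge/restore sequence above. Defined as a function so
# nothing destructive runs on sourcing; invoke it manually, e.g.:
#   gitlab_purge_restore 8.17.3 BACKUP_TIMESTAMP_STRING
gitlab_purge_restore() {
  set -eu
  GITLAB_VERSION="$1"            # the exact version you were on
  BACKUP_TIMESTAMP="$2"          # prefix of the backup tarball name
  SAFE_DIR=/root/gitlab-rescue   # assumption: any path outside the dirs wiped below

  # 1. Back up, then move the tarball and /etc/gitlab out of harm's way
  gitlab-rake gitlab:backup:create
  mkdir -p "$SAFE_DIR"
  mv /var/opt/gitlab/backups/*_gitlab_backup.tar "$SAFE_DIR"/
  cp -a /etc/gitlab "$SAFE_DIR"/etc-gitlab

  # 2. Tear everything down
  gitlab-ctl uninstall
  systemctl stop gitlab-runsvdir.service
  yum erase -y gitlab-ee
  rm -rf /var/opt/gitlab /var/log/gitlab /etc/gitlab /opt/gitlab

  # 3. Reinstall the same version and bring the service back up
  yum install -y "gitlab-ee-$GITLAB_VERSION"
  systemctl daemon-reload
  systemctl enable gitlab-runsvdir.service
  systemctl start gitlab-runsvdir.service
  # (edit external_url in the fresh /etc/gitlab/gitlab.rb here, then:)
  gitlab-ctl reconfigure

  # 4. Restore the saved config and data
  rm -rf /etc/gitlab
  cp -a "$SAFE_DIR"/etc-gitlab /etc/gitlab
  gitlab-ctl reconfigure
  cp "$SAFE_DIR"/*_gitlab_backup.tar /var/opt/gitlab/backups/
  chown git:git /var/opt/gitlab/backups/*_gitlab_backup.tar
  gitlab-ctl stop unicorn
  gitlab-ctl stop sidekiq
  # Timestamp is the tarball name prefix (looks like EPOCH_YYYY_MM_DD);
  # the restore will ask you to confirm a few times.
  gitlab-rake gitlab:backup:restore BACKUP="$BACKUP_TIMESTAMP"
}
```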
Thanks for the summary @vilhelmen I followed this on my Debian server (with some minor changes) and I was finally able to fully repair my GitLab instance!
EDIT: I wrote a blog post covering the Debian version. I couldn't find a way to contact you directly to ask for your permission; if you have any objection, please let me know!
For those running into this problem: can you try moving /opt/gitlab/embedded/service/gitlab-rails/app/assets/stylesheets/application.scss out of the path and checking whether that solves the problem?
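A reversible way to try that suggestion (a sketch, not an endorsed fix): move the file aside rather than deleting it, so it can be restored if nothing changes. The backup destination and the restart step are my assumptions, not part of the suggestion above.

```shell
#!/bin/sh
# Hedged sketch: move the stray stylesheet out of the asset path, keeping a
# copy so the change is reversible. Path is the usual omnibus default.
SCSS=/opt/gitlab/embedded/service/gitlab-rails/app/assets/stylesheets/application.scss
BACKUP_DEST=/root/application.scss.bak   # assumption: any path outside the asset tree

if [ -f "$SCSS" ]; then
  mv "$SCSS" "$BACKUP_DEST"
  gitlab-ctl restart   # assumption: restart so the change takes effect
else
  echo "application.scss not present at the expected omnibus path"
fi
```

To undo, move the file back and restart again.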
I ended up following @vilhelmen's procedure to fix my EE install on Centos 7. It worked well, and as a side effect, exposed the fact my backups had been broken due to low disk space, so I was able to fix that. Thanks!