"Show unchanged lines" fails with a large file
Summary
I discovered this issue while investigating #28644 (closed). I wasn't able to reproduce that bug, but I found this one instead, which sounds a little different.
The "Show unchanged lines" function fails when the file is very big. According to my investigation, it happens when there are roughly between 10,000 and 100,000 lines to load.
Currently, it tries to load and fails with the message "Something went wrong while fetching diff lines."
Steps to reproduce
- Add a large file that has about 100,000 lines to a project
- Make a change at the top of that file
- Open a Merge Request for it
- Click "Show unchanged lines" below the change
- Wait for a while; the error message will appear and the "Show unchanged lines" button is now hidden
Example Project
What is the current bug behavior?
The unchanged lines cannot be loaded for a very big file.
What is the expected correct behavior?
Load unchanged lines
Relevant logs and/or screenshots
I did some investigation, and it seems that the actual error message is currently being swallowed, but this is the stack trace you get if you inspect it manually by capturing the exception object:
```
RangeError: Maximum call stack size exceeded
    at Array.mutator (vue.esm.js?381f:877)
    at addContextLines (utils.js?d389:174)
    at Store.eval (mutations.js?2b76:141)
    at wrappedMutationHandler (vuex.esm.js?6eda:714)
    at commitIterator (vuex.esm.js?6eda:382)
    at Array.forEach (<anonymous>)
    at eval (vuex.esm.js?6eda:381)
    at Store._withCommit (vuex.esm.js?6eda:512)
    at Store.commit (vuex.esm.js?6eda:380)
    at Store.boundCommit [as commit] (vuex.esm.js?6eda:325)
```
Output of checks
This bug happens on GitLab.com
Possible fixes
This is the point where it fails.
```javascript
// app/assets/javascripts/diffs/store/utils.js:172
if (!isExpandDown && options.bottom) {
  inlineLines.push(...contextLines);
  parallelLines.push(...normalizedParallelLines);
} else {
  // ...
```
I suppose Vue has some sort of limit when handling an Array push with lots of objects, but I'm not sure whether it can be solved by using something else. It seems to me that we need to limit how many lines we load at a time, instead of loading the entire file, to avoid this issue.
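As a sketch of one plausible mechanism (an assumption, not confirmed from the trace alone): spreading a very large array into `push` passes every element as a separate function argument, which can overflow the call stack in JavaScript engines regardless of Vue. If that is the cause, pushing in fixed-size chunks avoids it; the helper and array names below are hypothetical, not GitLab code:

```javascript
// Hypothetical workaround: push a large source array into a target
// array in fixed-size chunks, so no single call receives too many
// arguments. Each chunk is small enough to spread safely.
function pushInChunks(target, source, chunkSize = 10000) {
  for (let i = 0; i < source.length; i += chunkSize) {
    target.push(...source.slice(i, i + chunkSize));
  }
  return target;
}

// Simulated data standing in for inlineLines / contextLines.
const inlineLines = [];
const contextLines = Array.from({ length: 200000 }, (_, i) => ({ line: i }));

pushInChunks(inlineLines, contextLines);
console.log(inlineLines.length); // 200000
```

This sidesteps the argument-count limit, but it would not address the cost of loading 100,000+ lines in one request, so capping how many lines are fetched per expansion still seems like the more robust fix.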