rspec migration pg13 single-db-ci-connection 5/8
Passed. Started by @gitlab-bot (🤖 GitLab Bot 🤖)
Running with gitlab-runner 15.9.0~beta.115.g598a7c91 (598a7c91)
  on green-4.shared-gitlab-org.runners-manager.gitlab.com/default x5QiHUKw, system ID: s_b9637080a79e
  feature flags: FF_NETWORK_PER_BUILD:true, FF_USE_FASTZIP:true, FF_USE_IMPROVED_URL_MASKING:true
Using Docker executor with image registry.gitlab.com/gitlab-org/gitlab-build-images/debian-bullseye-ruby-3.0.patched-golang-1.18-rust-1.65-node-16.14-postgresql-13:rubygems-3.4-git-2.36-lfs-2.9-chrome-109-yarn-1.22-graphicsmagick-1.3.36 ...
Starting service postgres:13 ...
Pulling docker image postgres:13 ...
Using docker image sha256:dd421ca1f7f13d81c5c145d77d97d8d84cd0e6f1e045936ee506ce0f50ee397a for postgres:13 with digest postgres@sha256:00f455399f30cc3f2fe4185476601438b7a4959c74653665582d7c313a783d51 ...
Starting service redis:6.2-alpine ...
Pulling docker image redis:6.2-alpine ...
Using docker image sha256:3616f0c0705d2a35d30dde109daf3cbe58ae7284121aafa6f5cfa987db98d1a8 for redis:6.2-alpine with digest redis@sha256:edddbcad5a41d58df2f142d68439922f1860ea902903d016257337c3342f30fc ...
Waiting for services to be up and running (timeout 30 seconds)...
Authenticating with credentials from job payload (GitLab Registry)
Pulling docker image registry.gitlab.com/gitlab-org/gitlab-build-images/debian-bullseye-ruby-3.0.patched-golang-1.18-rust-1.65-node-16.14-postgresql-13:rubygems-3.4-git-2.36-lfs-2.9-chrome-109-yarn-1.22-graphicsmagick-1.3.36 ...
Using docker image sha256:7a1b51158a1ee23b080b514ce6ebbf52ac51585877198fb52709e5cd01805a77 for registry.gitlab.com/gitlab-org/gitlab-build-images/debian-bullseye-ruby-3.0.patched-golang-1.18-rust-1.65-node-16.14-postgresql-13:rubygems-3.4-git-2.36-lfs-2.9-chrome-109-yarn-1.22-graphicsmagick-1.3.36 with digest registry.gitlab.com/gitlab-org/gitlab-build-images/debian-bullseye-ruby-3.0.patched-golang-1.18-rust-1.65-node-16.14-postgresql-13@sha256:412d55913d43377094427ced549cb422ed2f2c7223e6a605d0d3e0151cb892b0 ...
Running on runner-x5qihukw-project-278964-concurrent-0 via runner-x5qihukw-shared-gitlab-org-1680172167-cf78ad84...
$ eval "$CI_PRE_CLONE_SCRIPT"
Fetching changes with git depth set to 20...
Initialized empty Git repository in /builds/gitlab-org/gitlab/.git/
Created fresh repository.
remote: Enumerating objects: 135903, done.
remote: Counting objects: 100% (135903/135903), done.
remote: Compressing objects: 100% (88560/88560), done.
remote: Total 135903 (delta 58315), reused 94618 (delta 42152), pack-reused 0
Receiving objects: 100% (135903/135903), 121.35 MiB | 32.35 MiB/s, done.
Resolving deltas: 100% (58315/58315), done.
 * [new ref] refs/pipelines/822866544 -> refs/pipelines/822866544
Checking out 523abdde as detached HEAD (ref is refs/merge-requests/116270/merge)...
Skipping Git submodules setup
$ git remote set-url origin "${CI_REPOSITORY_URL}"
Checking cache for ruby-gems-debian-bullseye-ruby-3.0-16...
cache.zip is up to date
Successfully extracted cache
Checking cache for gitaly-ruby-gems-debian-bullseye-ruby-3.0-16...
cache.zip is up to date
Successfully extracted cache
Downloading artifacts for compile-test-assets (4031138198)...
Downloading artifacts from coordinator... ok host=storage.googleapis.com id=4031138198 responseStatus=200 OK token=64_n4fNr
Downloading artifacts for detect-tests (4031138213)...
Downloading artifacts from coordinator... ok host=storage.googleapis.com id=4031138213 responseStatus=200 OK token=64_n4fNr
Downloading artifacts for retrieve-tests-metadata (4031138217)...
Downloading artifacts from coordinator... ok host=storage.googleapis.com id=4031138217 responseStatus=200 OK token=64_n4fNr
Downloading artifacts for setup-test-env (4031138204)...
Downloading artifacts from coordinator... ok host=storage.googleapis.com id=4031138204 responseStatus=200 OK token=64_n4fNr
Using docker image sha256:7a1b51158a1ee23b080b514ce6ebbf52ac51585877198fb52709e5cd01805a77 for registry.gitlab.com/gitlab-org/gitlab-build-images/debian-bullseye-ruby-3.0.patched-golang-1.18-rust-1.65-node-16.14-postgresql-13:rubygems-3.4-git-2.36-lfs-2.9-chrome-109-yarn-1.22-graphicsmagick-1.3.36 with digest registry.gitlab.com/gitlab-org/gitlab-build-images/debian-bullseye-ruby-3.0.patched-golang-1.18-rust-1.65-node-16.14-postgresql-13@sha256:412d55913d43377094427ced549cb422ed2f2c7223e6a605d0d3e0151cb892b0 ...
$ echo $FOSS_ONLY
$ [ "$FOSS_ONLY" = "1" ] && rm -rf ee/ qa/spec/ee/ qa/qa/specs/features/ee/ qa/qa/ee/ qa/qa/ee.rb
$ export GOPATH=$CI_PROJECT_DIR/.go
$ mkdir -p $GOPATH
$ source scripts/utils.sh
$ source scripts/prepare_build.sh
Using decomposed database config (config/database.yml.postgresql)
Enabling ci connection (database_tasks: false) in config/database.yml
Geo DB won't be set up.
$ setup_db_user_only
CREATE ROLE
GRANT
==> 'setup_db_user_only' succeeded in 0 seconds.
$ bundle exec rake db:drop db:create db:schema:load db:migrate gitlab:db:lock_writes
Dropped database 'gitlabhq_test'
Created database 'gitlabhq_test'
==> 'bundle exec rake db:drop db:create db:schema:load db:migrate gitlab:db:lock_writes' succeeded in 35 seconds.
$ setup_db_praefect
SELECT pg_catalog.set_config('search_path', '', false);
CREATE DATABASE praefect_test ENCODING 'UTF8';
==> 'setup_db_praefect' succeeded in 0 seconds.
$ source ./scripts/rspec_helpers.sh
$ run_timed_command "gem install knapsack --no-document"
$ gem install knapsack --no-document
Successfully installed knapsack-4.0.0
1 gem installed
==> 'gem install knapsack --no-document' succeeded in 1 seconds.
$ echo -e "\e[0Ksection_start:`date +%s`:gitaly-test-spawn[collapsed=true]\r\e[0KStarting Gitaly"
$ rspec_paralellized_job "--tag ~quarantine --tag ~zoekt"
SKIP_FLAKY_TESTS_AUTOMATICALLY: false
RETRY_FAILED_TESTS_IN_NEW_PROCESS: true
KNAPSACK_GENERATE_REPORT: true
FLAKY_RSPEC_GENERATE_REPORT: true
KNAPSACK_TEST_FILE_PATTERN: spec/{migrations}{,/**/}*_spec.rb
KNAPSACK_LOG_LEVEL: debug
KNAPSACK_REPORT_PATH: knapsack/rspec_migration_pg13_single-db-ci-connection_5_8_report.json
FLAKY_RSPEC_SUITE_REPORT_PATH: rspec/flaky/report-suite.json
FLAKY_RSPEC_REPORT_PATH: rspec/flaky/all_rspec_migration_pg13_single-db-ci-connection_5_8_report.json
NEW_FLAKY_RSPEC_REPORT_PATH: rspec/flaky/new_rspec_migration_pg13_single-db-ci-connection_5_8_report.json
SKIPPED_FLAKY_TESTS_REPORT_PATH: rspec/flaky/skipped_flaky_tests_rspec_migration_pg13_single-db-ci-connection_5_8_report.txt
CRYSTALBALL:
Knapsack report generator started!
Run options: exclude {:quarantine=>true, :zoekt=>true}
Test environment set up in 0.88222522 seconds

ScheduleResetDuplicateCiRunnersTokenValues
  # order random
  #up
ci: == 20220922143143 ScheduleResetDuplicateCiRunnersTokenValues: migrating =======
ci: == 20220922143143 ScheduleResetDuplicateCiRunnersTokenValues: migrated (0.0667s)
    schedules background jobs for each batch of runners
  #down
ci: == 20220922143143 ScheduleResetDuplicateCiRunnersTokenValues: migrating =======
ci: == 20220922143143 ScheduleResetDuplicateCiRunnersTokenValues: migrated (0.0556s)
    deletes all batched migration records

DedupRunnerProjects
main: == 20220124130028 DedupRunnerProjects: migrating ==============================
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 5", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0050s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 5", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0025s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0014s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0060s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0018s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0012s
main: -- indexes(:ci_runner_projects)
main: -> 0.0070s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id"})
main: -> 0.0015s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0013s
main: -- indexes(:ci_runner_projects)
main: -> 0.0052s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"index_ci_runner_projects_on_runner_id_and_project_id"})
main: -> 0.0018s
main: == 20220124130028 DedupRunnerProjects: migrated (0.1052s) =====================
  deduplicates ci_runner_projects table
main: == 20220124130028 DedupRunnerProjects: migrating ==============================
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 10", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0052s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 10", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0021s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0017s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0059s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0022s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0014s
main: -- indexes(:ci_runner_projects)
main: -> 0.0068s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id"})
main: -> 0.0018s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- indexes(:ci_runner_projects)
main: -> 0.0055s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"index_ci_runner_projects_on_runner_id_and_project_id"})
main: -> 0.0016s
main: == 20220124130028 DedupRunnerProjects: migrated (0.1104s) =====================
  merges `duplicated_runner_project_1` with `duplicated_runner_project_2`
main: == 20220124130028 DedupRunnerProjects: migrating ==============================
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 15", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0053s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 15", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0026s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0017s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0059s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0022s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0014s
main: -- indexes(:ci_runner_projects)
main: -> 0.0067s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id"})
main: -> 0.0018s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- indexes(:ci_runner_projects)
main: -> 0.0061s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"index_ci_runner_projects_on_runner_id_and_project_id"})
main: -> 0.0017s
main: == 20220124130028 DedupRunnerProjects: migrated (0.1159s) =====================
  merges `duplicated_runner_project_3` with `duplicated_runner_project_4`
main: == 20220124130028 DedupRunnerProjects: migrating ==============================
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 20", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0047s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 20", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0020s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0019s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0054s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0019s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0013s
main: -- indexes(:ci_runner_projects)
main: -> 0.0064s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id"})
main: -> 0.0016s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0013s
main: -- indexes(:ci_runner_projects)
main: -> 0.0051s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"index_ci_runner_projects_on_runner_id_and_project_id"})
main: -> 0.0014s
main: == 20220124130028 DedupRunnerProjects: migrated (0.1015s) =====================
  does not change non duplicated records
main: == 20220124130028 DedupRunnerProjects: migrating ==============================
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 0", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0045s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:where=>"id > 0", :unique=>true, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0021s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0014s
main: -- index_exists?(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0053s
main: -- add_index(:ci_runner_projects, [:runner_id, :project_id], {:unique=>true, :name=>"index_unique_ci_runner_projects_on_runner_id_and_project_id", :algorithm=>:concurrently})
main: -> 0.0018s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0014s
main: -- indexes(:ci_runner_projects)
main: -> 0.0061s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"tmp_unique_ci_runner_projects_by_runner_id_and_project_id"})
main: -> 0.0015s
main: -- view_exists?(:postgres_partitions)
main: -> 0.0015s
main: -- indexes(:ci_runner_projects)
main: -> 0.0055s
main: -- remove_index(:ci_runner_projects, {:algorithm=>:concurrently, :name=>"index_ci_runner_projects_on_runner_id_and_project_id"})
main: -> 0.0016s
main: == 20220124130028 DedupRunnerProjects: migrated (0.0840s) =====================
  does nothing when there are no runner projects

FinaliseProjectNamespaceMembers
  #up
    when migration is missing
main: == 20220628012902 FinaliseProjectNamespaceMembers: migrating ==================
main: == 20220628012902 FinaliseProjectNamespaceMembers: migrated (0.0079s) =========
      warns migration not found
    with migration present
      when migration finished successfully
main: == 20220628012902 FinaliseProjectNamespaceMembers: migrating ==================
main: == 20220628012902 FinaliseProjectNamespaceMembers: migrated (0.0230s) =========
        does not raise exception
      with different migration statuses
        status: 0, description: "paused"
          behaves like finalizes the migration
            finalizes the migration
        status: 1, description: "active"
          behaves like finalizes the migration
            finalizes the migration
        status: 4, description: "failed"
          behaves like finalizes the migration
            finalizes the migration
        status: 5, description: "finalizing"
          behaves like finalizes the migration
            finalizes the migration

CreateSyncNamespaceDetailsTrigger
  #up
    INSERT trigger
main: == 20220506154054 CreateSyncNamespaceDetailsTrigger: migrating ================
main: -- execute("CREATE OR REPLACE FUNCTION update_namespace_details_from_namespaces()\nRETURNS TRIGGER AS\n$$\nBEGIN\nINSERT INTO\n namespace_details (\n description,\n description_html,\n cached_markdown_version,\n updated_at,\n created_at,\n namespace_id\n )\nVALUES\n (\n NEW.description,\n NEW.description_html,\n NEW.cached_markdown_version,\n NEW.updated_at,\n NEW.updated_at,\n NEW.id\n ) ON CONFLICT (namespace_id) DO\nUPDATE\nSET\n description = NEW.description,\n description_html = NEW.description_html,\n cached_markdown_version = NEW.cached_markdown_version,\n updated_at = NEW.updated_at\nWHERE\n namespace_details.namespace_id = NEW.id;RETURN NULL;\n\nEND\n$$ LANGUAGE PLPGSQL\n")
main: -> 0.0015s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_update\nAFTER UPDATE ON namespaces\nFOR EACH ROW\nWHEN (\n NEW.type <> 'Project' AND (\n OLD.description IS DISTINCT FROM NEW.description OR\n OLD.description_html IS DISTINCT FROM NEW.description_html OR\n OLD.cached_markdown_version IS DISTINCT FROM NEW.cached_markdown_version)\n)\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0021s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_insert\nAFTER INSERT ON namespaces\nFOR EACH ROW\nWHEN (NEW.type <> 'Project')\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0021s
main: == 20220506154054 CreateSyncNamespaceDetailsTrigger: migrated (0.0062s) =======
      creates a namespace_detail record
main: == 20220506154054 CreateSyncNamespaceDetailsTrigger: migrating ================
main: -- execute("CREATE OR REPLACE FUNCTION update_namespace_details_from_namespaces()\nRETURNS TRIGGER AS\n$$\nBEGIN\nINSERT INTO\n namespace_details (\n description,\n description_html,\n cached_markdown_version,\n updated_at,\n created_at,\n namespace_id\n )\nVALUES\n (\n NEW.description,\n NEW.description_html,\n NEW.cached_markdown_version,\n NEW.updated_at,\n NEW.updated_at,\n NEW.id\n ) ON CONFLICT (namespace_id) DO\nUPDATE\nSET\n description = NEW.description,\n description_html = NEW.description_html,\n cached_markdown_version = NEW.cached_markdown_version,\n updated_at = NEW.updated_at\nWHERE\n namespace_details.namespace_id = NEW.id;RETURN NULL;\n\nEND\n$$ LANGUAGE PLPGSQL\n")
main: -> 0.0017s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_update\nAFTER UPDATE ON namespaces\nFOR EACH ROW\nWHEN (\n NEW.type <> 'Project' AND (\n OLD.description IS DISTINCT FROM NEW.description OR\n OLD.description_html IS DISTINCT FROM NEW.description_html OR\n OLD.cached_markdown_version IS DISTINCT FROM NEW.cached_markdown_version)\n)\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0021s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_insert\nAFTER INSERT ON namespaces\nFOR EACH ROW\nWHEN (NEW.type <> 'Project')\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0018s
main: == 20220506154054 CreateSyncNamespaceDetailsTrigger: migrated (0.0067s) =======
      the created namespace_details record has matching attributes
    UPDATE trigger
main: == 20220506154054 CreateSyncNamespaceDetailsTrigger: migrating ================
main: -- execute("CREATE OR REPLACE FUNCTION update_namespace_details_from_namespaces()\nRETURNS TRIGGER AS\n$$\nBEGIN\nINSERT INTO\n namespace_details (\n description,\n description_html,\n cached_markdown_version,\n updated_at,\n created_at,\n namespace_id\n )\nVALUES\n (\n NEW.description,\n NEW.description_html,\n NEW.cached_markdown_version,\n NEW.updated_at,\n NEW.updated_at,\n NEW.id\n ) ON CONFLICT (namespace_id) DO\nUPDATE\nSET\n description = NEW.description,\n description_html = NEW.description_html,\n cached_markdown_version = NEW.cached_markdown_version,\n updated_at = NEW.updated_at\nWHERE\n namespace_details.namespace_id = NEW.id;RETURN NULL;\n\nEND\n$$ LANGUAGE PLPGSQL\n")
main: -> 0.0013s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_update\nAFTER UPDATE ON namespaces\nFOR EACH ROW\nWHEN (\n NEW.type <> 'Project' AND (\n OLD.description IS DISTINCT FROM NEW.description OR\n OLD.description_html IS DISTINCT FROM NEW.description_html OR\n OLD.cached_markdown_version IS DISTINCT FROM NEW.cached_markdown_version)\n)\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0017s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_insert\nAFTER INSERT ON namespaces\nFOR EACH ROW\nWHEN (NEW.type <> 'Project')\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0016s
main: == 20220506154054 CreateSyncNamespaceDetailsTrigger: migrated (0.0051s) =======
      updates the attribute in the synced namespace_details record
  #down
main: -- execute("CREATE OR REPLACE FUNCTION update_namespace_details_from_namespaces()\nRETURNS TRIGGER AS\n$$\nBEGIN\nINSERT INTO\n namespace_details (\n description,\n description_html,\n cached_markdown_version,\n updated_at,\n created_at,\n namespace_id\n )\nVALUES\n (\n NEW.description,\n NEW.description_html,\n NEW.cached_markdown_version,\n NEW.updated_at,\n NEW.updated_at,\n NEW.id\n ) ON CONFLICT (namespace_id) DO\nUPDATE\nSET\n description = NEW.description,\n description_html = NEW.description_html,\n cached_markdown_version = NEW.cached_markdown_version,\n updated_at = NEW.updated_at\nWHERE\n namespace_details.namespace_id = NEW.id;RETURN NULL;\n\nEND\n$$ LANGUAGE PLPGSQL\n")
main: -> 0.0018s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_update\nAFTER UPDATE ON namespaces\nFOR EACH ROW\nWHEN (\n NEW.type <> 'Project' AND (\n OLD.description IS DISTINCT FROM NEW.description OR\n OLD.description_html IS DISTINCT FROM NEW.description_html OR\n OLD.cached_markdown_version IS DISTINCT FROM NEW.cached_markdown_version)\n)\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0021s
main: -- execute("CREATE TRIGGER trigger_update_details_on_namespace_insert\nAFTER INSERT ON namespaces\nFOR EACH ROW\nWHEN (NEW.type <> 'Project')\nEXECUTE PROCEDURE update_namespace_details_from_namespaces();\n")
main: -> 0.0016s
main: -- execute("DROP TRIGGER IF EXISTS trigger_update_details_on_namespace_update ON namespaces")
main: -> 0.0017s
main: -- execute("DROP TRIGGER IF EXISTS trigger_update_details_on_namespace_insert ON namespaces")
main: -> 0.0013s
main: -- execute("DROP FUNCTION IF EXISTS update_namespace_details_from_namespaces()")
main: -> 0.0011s
    drops the trigger

RemoveNotNullContraintOnTitleFromSprints
  #down
main: -- change_column_null(:sprints, :title, true)
main: -> 0.0019s
main: -- execute("UPDATE sprints SET title = id WHERE title IS NULL\n")
main: -> 0.0021s
main: -- change_column_null(:sprints, :title, false)
main: -> 0.0016s
    removes null titles by setting them with ids

EnsureWorkItemTypeBackfillMigrationFinished
  # order random
  #up
    when migration is missing
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrating ======
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrated (0.0652s)
      warns migration not found
    with migration present
      when migrations have finished
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrating ======
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrated (0.1604s)
        does not raise an error
      with different migration statuses
        status: 0, description: "paused"
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrating ======
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrated (1.1031s)
          finalizes the migration
        status: 1, description: "active"
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrating ======
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrated (1.1477s)
          finalizes the migration
        status: 4, description: "failed"
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrating ======
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrated (1.3175s)
          finalizes the migration
        status: 5, description: "finalizing"
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrating ======
main: == 20221115173607 EnsureWorkItemTypeBackfillMigrationFinished: migrated (1.0740s)
          finalizes the migration

FixPartitionIdsForCiBuildTraceMetadata
  # order random
  when on self-managed instance
    #up
      does not schedule background job
    #down
      does not delete background job
  when on saas
    #up
ci: == 20230209092204 FixPartitionIdsForCiBuildTraceMetadata: migrating ===========
ci: == 20230209092204 FixPartitionIdsForCiBuildTraceMetadata: migrated (0.0909s) ==
      schedules background jobs for each batch of ci_build_trace_metadata
    #down
ci: == 20230209092204 FixPartitionIdsForCiBuildTraceMetadata: migrating ===========
ci: == 20230209092204 FixPartitionIdsForCiBuildTraceMetadata: migrated (0.0963s) ==
      deletes all batched migration records

ScheduleFixIncorrectMaxSeatsUsed2
  #up
main: == 20220212120735 ScheduleFixIncorrectMaxSeatsUsed2: migrating ================
main: -- view_exists?(:postgres_partitions)
main: -> 0.0022s
main: -- index_exists?(:gitlab_subscriptions, :id, {:where=>"start_date < '2021-08-02' AND max_seats_used != 0 AND max_seats_used > seats_in_use AND max_seats_used > seats", :name=>"tmp_gitlab_subscriptions_max_seats_used_migration_2", :algorithm=>:concurrently})
main: -> 0.0100s
main: -- add_index(:gitlab_subscriptions, :id, {:where=>"start_date < '2021-08-02' AND max_seats_used != 0 AND max_seats_used > seats_in_use AND max_seats_used > seats", :name=>"tmp_gitlab_subscriptions_max_seats_used_migration_2", :algorithm=>:concurrently})
main: -> 0.0031s
main: == 20220212120735 ScheduleFixIncorrectMaxSeatsUsed2: migrated (0.0445s) =======
    schedules a job on Gitlab.com
main: == 20220212120735 ScheduleFixIncorrectMaxSeatsUsed2: migrating ================
main: -- view_exists?(:postgres_partitions)
main: -> 0.0024s
main: -- index_exists?(:gitlab_subscriptions, :id, {:where=>"start_date < '2021-08-02' AND max_seats_used != 0 AND max_seats_used > seats_in_use AND max_seats_used > seats", :name=>"tmp_gitlab_subscriptions_max_seats_used_migration_2", :algorithm=>:concurrently})
main: -> 0.0128s
main: -- add_index(:gitlab_subscriptions, :id, {:where=>"start_date < '2021-08-02' AND max_seats_used != 0 AND max_seats_used > seats_in_use AND max_seats_used > seats", :name=>"tmp_gitlab_subscriptions_max_seats_used_migration_2", :algorithm=>:concurrently})
main: -> 0.0044s
main: == 20220212120735 ScheduleFixIncorrectMaxSeatsUsed2: migrated (0.0474s) =======
    does not schedule any jobs when not Gitlab.com

EnsureTaskNoteRenamingBackgroundMigrationFinished
  # order random
  #up
    when migration is missing
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrating
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrated (0.0140s)
      warns migration not found
    with migration present
      when migration finished successfully
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrating
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrated (0.0278s)
        does not raise exception
      with different migration statuses
        status: 0, description: "paused"
          behaves like finalizes the migration
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrating
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrated (0.2572s)
            finalizes the migration
        status: 1, description: "active"
          behaves like finalizes the migration
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrating
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrated (0.2498s)
            finalizes the migration
        status: 4, description: "failed"
          behaves like finalizes the migration
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrating
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrated (0.2641s)
            finalizes the migration
        status: 5, description: "finalizing"
          behaves like finalizes the migration
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrating
main: == 20221018193635 EnsureTaskNoteRenamingBackgroundMigrationFinished: migrated (0.2366s)
            finalizes the migration

ScheduleBackfillProjectMemberNamespaceId
  #up
main: == 20220516054011 ScheduleBackfillProjectMemberNamespaceId: migrating =========
main: == 20220516054011 ScheduleBackfillProjectMemberNamespaceId: migrated (0.0762s)
    schedules background jobs for each batch of project members
  #down
main: == 20220516054011 ScheduleBackfillProjectMemberNamespaceId: migrating =========
main: == 20220516054011 ScheduleBackfillProjectMemberNamespaceId: migrated (0.0740s)
    deletes all batched migration records

SchedulePopulateRequirementsIssueId
main: == 20220506124021 SchedulePopulateRequirementsIssueId: migrating ==============
main: -- Scheduled 0 MigrateRequirementsToWorkItems jobs with a maximum of 2 records per batch and an interval of 120 seconds.
The migration is expected to take at least 0 seconds. Expect all jobs to have completed after 2023-03-30 11:51:43 UTC."
main: == 20220506124021 SchedulePopulateRequirementsIssueId: migrated (0.0114s) =====
main: == 20220506124021 SchedulePopulateRequirementsIssueId: reverting ==============
main: == 20220506124021 SchedulePopulateRequirementsIssueId: reverted (0.0004s) =====
main: == 20220506123922 AddNotNullConstraintWithoutValidationToRequirementsIssueId: reverting
main: -- transaction_open?()
main: -> 0.0000s
main: -- transaction_open?()
main: -> 0.0000s
main: -- execute(" ALTER TABLE requirements\n DROP CONSTRAINT IF EXISTS check_requirement_issue_not_null\n")
main: -> 0.0016s
main: == 20220506123922 AddNotNullConstraintWithoutValidationToRequirementsIssueId: reverted (0.0081s)
main: == 20220506124021 SchedulePopulateRequirementsIssueId: migrating ==============
main: -- Scheduled 2 MigrateRequirementsToWorkItems jobs with a maximum of 2 records per batch and an interval of 120 seconds.
The migration is expected to take at least 240 seconds. Expect all jobs to have completed after 2023-03-30 11:55:43 UTC."
main: == 20220506124021 SchedulePopulateRequirementsIssueId: migrated (0.0508s) =====
  schedules jobs for all requirements without issues in sync

MigrateShimoConfluenceServiceCategory
  #up
main: == 20220324032250 MigrateShimoConfluenceServiceCategory: migrating ============
main: -- Scheduled 1 MigrateShimoConfluenceIntegrationCategory jobs with a maximum of 2 records per batch and an interval of 120 seconds.
The migration is expected to take at least 120 seconds. Expect all jobs to have completed after 2023-03-30 11:55:03 UTC."
main: == 20220324032250 MigrateShimoConfluenceServiceCategory: migrated (0.0571s) ===
    correctly schedules background migrations

The application_settings (main) table has 1271 columns.
Recreating the database
Dropped database 'gitlabhq_test'
Created database 'gitlabhq_test'
Databases re-creation done in 7.247873845999493

CleanupAfterFixingRegressionWithNewUsersEmails
  adds primary email to emails for confirmed users that do not have their primary email in emails table
  continues in case of errors with one email

QueueUpdateDelayedProjectRemovalToNullForUserNamespace
  #up
main: == 20220627152642 QueueUpdateDelayedProjectRemovalToNullForUserNamespace: migrating
main: == 20220627152642 QueueUpdateDelayedProjectRemovalToNullForUserNamespace: migrated (0.0737s)
    schedules background jobs for each batch of namespace settings
  #down
main: == 20220627152642 QueueUpdateDelayedProjectRemovalToNullForUserNamespace: migrating
main: == 20220627152642 QueueUpdateDelayedProjectRemovalToNullForUserNamespace: migrated (0.0650s)
    deletes all batched migration records

ResetTooManyTagsSkippedRegistryImports
main: == 20220502173045 ResetTooManyTagsSkippedRegistryImports: migrating ===========
main: -- Scheduled 2 ResetTooManyTagsSkippedRegistryImports jobs with a maximum of 2 records per batch and an interval of 120 seconds.
The migration is expected to take at least 240 seconds. Expect all jobs to have completed after 2023-03-30 12:00:15 UTC."
main: == 20220502173045 ResetTooManyTagsSkippedRegistryImports: migrated (0.0609s) ==
  schedules jobs to reset skipped registry imports

ScheduleBackfillVulnerabilityReadsClusterAgent
main: == 20220525221133 ScheduleBackfillVulnerabilityReadsClusterAgent: migrating ===
main: == 20220525221133 ScheduleBackfillVulnerabilityReadsClusterAgent: migrated (0.0640s)
  schedules background jobs for each batch of vulnerability reads

FinalizeBackfillNullNoteDiscussionIds
main: == 20220524074947 FinalizeBackfillNullNoteDiscussionIds: migrating ============
main: == 20220524074947 FinalizeBackfillNullNoteDiscussionIds: migrated (0.0398s) ===
  performs remaining background migrations

ScheduleDisableLegacyOpenSourceLicenseForProjectsLessThanFiveMb
  # order random
  when on gitlab.com
    #up
      schedules background jobs for each batch of project_settings
    #down
      deletes all batched migration records
  when on self-managed instance
    #up
      does not schedule background job
    #down
      does not delete background job

UpdateStartDateForIterationsCadences
  # order random
  #down
main: -- transaction_open?()
main: -> 0.0003s
main: -- execute("UPDATE iterations_cadences\nSET start_date=ic.first_upcoming_iteration_start_date\nFROM (\n SELECT ic.id, sprints2.first_upcoming_iteration_start_date \n FROM iterations_cadences as ic,\n LATERAL (\n -- For each cadence, query for the due date of its current iteration\n SELECT due_date as current_iteration_due_date FROM sprints\n WHERE iterations_cadence_id=ic.id AND start_date <= current_date AND due_date >= current_date\n LIMIT 1\n ) as sprints1,\n LATERAL (\n -- For each cadence, query for the start date of the first upcoming iteration (i.e, it starts after the current iteration)\n SELECT start_date as first_upcoming_iteration_start_date FROM sprints\n WHERE iterations_cadence_id=ic.id AND start_date > 
sprints1.current_iteration_due_date\n ORDER BY start_date ASC LIMIT 1\n ) as sprints2\n WHERE ic.automatic=true AND ic.id BETWEEN 1 AND 5\n) as ic\nWHERE iterations_cadences.id=ic.id;\n")1257main: -> 0.0040s1258main: -- transaction_open?()1259main: -> 0.0003s1260main: -- execute("UPDATE iterations_cadences\nSET start_date=ic.first_iteration_start_date\nFROM (\n SELECT ic.id, sprints.start_date as first_iteration_start_date\n FROM iterations_cadences as ic,\n LATERAL (\n SELECT start_date FROM sprints WHERE iterations_cadence_id=ic.id ORDER BY start_date ASC LIMIT 1\n ) as sprints\n WHERE ic.automatic=true AND ic.id BETWEEN 1 AND 5\n) as ic\nWHERE iterations_cadences.id=ic.id;\n")1261main: -> 0.0027s1262 updates the start date of an automatic cadence to the start date of its earliest sprint record.1263 #up1264main: -- transaction_open?()1265main: -> 0.0002s1266main: -- execute("UPDATE iterations_cadences\nSET start_date=ic.first_upcoming_iteration_start_date\nFROM (\n SELECT ic.id, sprints2.first_upcoming_iteration_start_date \n FROM iterations_cadences as ic,\n LATERAL (\n -- For each cadence, query for the due date of its current iteration\n SELECT due_date as current_iteration_due_date FROM sprints\n WHERE iterations_cadence_id=ic.id AND start_date <= current_date AND due_date >= current_date\n LIMIT 1\n ) as sprints1,\n LATERAL (\n -- For each cadence, query for the start date of the first upcoming iteration (i.e, it starts after the current iteration)\n SELECT start_date as first_upcoming_iteration_start_date FROM sprints\n WHERE iterations_cadence_id=ic.id AND start_date > sprints1.current_iteration_due_date\n ORDER BY start_date ASC LIMIT 1\n ) as sprints2\n WHERE ic.automatic=true AND ic.id BETWEEN 6 AND 10\n) as ic\nWHERE iterations_cadences.id=ic.id;\n")1267main: -> 0.0029s1268 updates the start date of an automatic cadence to the start date of its first upcoming sprint record.1269BackfillInternalOnNotes1270 # order random1271 #down1272main: == 
20220920124709 BackfillInternalOnNotes: migrating ==========================1273main: == 20220920124709 BackfillInternalOnNotes: migrated (0.0552s) =================1274 deletes all batched migration records1275 #up1276main: == 20220920124709 BackfillInternalOnNotes: migrating ==========================1277main: == 20220920124709 BackfillInternalOnNotes: migrated (0.0596s) =================1278 schedules background jobs for each batch of issues1279SetEmailConfirmationSettingBeforeRemovingSendUserConfirmationEmailColumn1280 # order random1281 #up1282 when 'send_user_confirmation_email' is set to 'true'1283 updates 'email_confirmation_setting' to '2' (hard)1284 when 'send_user_confirmation_email' is set to 'false'1285 updates 'email_confirmation_setting' to '0' (off)1286 #down1287 updates 'email_confirmation_setting' to default value: '0' (off)1288ScheduleDeleteOrphanedOperationalVulnerabilities1289 # order random1290 #down1291 deletes all batched migration records1292 #up1293 schedules background jobs for each batch of vulnerabilities1294AddVulnerabilityAdvisoryForeignKeyToSbomVulnerableComponentVersions1295 # order random1296main: -- foreign_keys(:sbom_vulnerable_component_versions)1297main: -> 0.0053s1298main: == 20220819153725 AddVulnerabilityAdvisoryForeignKeyToSbomVulnerableComponentVersions: migrating 1299main: -- foreign_keys(:sbom_vulnerable_component_versions)1300main: -> 0.0041s1301main: -- execute("ALTER TABLE sbom_vulnerable_component_versions ADD CONSTRAINT fk_d720a1959a FOREIGN KEY (vulnerability_advisory_id) REFERENCES vulnerability_advisories (id) ON DELETE CASCADE NOT VALID;")1302main: -> 0.0016s1303main: -- execute("ALTER TABLE sbom_vulnerable_component_versions VALIDATE CONSTRAINT fk_d720a1959a;")1304main: -> 0.0019s1305main: == 20220819153725 AddVulnerabilityAdvisoryForeignKeyToSbomVulnerableComponentVersions: migrated (0.0321s) 1306main: -- foreign_keys(:sbom_vulnerable_component_versions)1307main: -> 0.0046s1308main: -- 
foreign_keys(:sbom_vulnerable_component_versions)1309main: -> 0.0065s1310 creates and drops the foreign key1311RecountEpicCacheCounts1312 # order random1313 #down1314main: == 20221107094359 RecountEpicCacheCounts: migrating ===========================1315main: == 20221107094359 RecountEpicCacheCounts: migrated (0.0603s) ==================1316 deletes all batched migration records1317 #up1318main: == 20221107094359 RecountEpicCacheCounts: migrating ===========================1319main: == 20221107094359 RecountEpicCacheCounts: migrated (0.0774s) ==================1320 schedules a batched background migration1321ScheduleBackfillEnvironmentTier1322 # order random1323main: == 20221205151917 ScheduleBackfillEnvironmentTier: migrating ==================1324main: == 20221205151917 ScheduleBackfillEnvironmentTier: migrated (0.0627s) =========1325 schedules a new batched migration1326RemoveFlowdockIntegrationRecords1327 # order random1328main: == 20221129124240 RemoveFlowdockIntegrationRecords: migrating =================1329main: == 20221129124240 RemoveFlowdockIntegrationRecords: migrated (0.0258s) ========1330 removes integrations records of type_new Integrations::Flowdock1331BackfillProductAnalyticsDataCollectorHost1332 # order random1333 #up1334 when jitsu host is present1335main: == 20230327123333 BackfillProductAnalyticsDataCollectorHost: migrating ========1336main: -- execute("UPDATE application_settings\nSET product_analytics_data_collector_host = regexp_replace(jitsu_host, '://(.+?\\.)', '://collector.', 'g')\nWHERE jitsu_host IS NOT NULL AND product_analytics_data_collector_host IS NULL\n")1337main: -> 0.0034s1338main: == 20230327123333 BackfillProductAnalyticsDataCollectorHost: migrated (0.0136s) 1339 backfills missing product_analytics_data_collector_host1340main: == 20230327123333 BackfillProductAnalyticsDataCollectorHost: migrating ========1341main: -- execute("UPDATE application_settings\nSET product_analytics_data_collector_host = regexp_replace(jitsu_host, 
'://(.+?\\.)', '://collector.', 'g')\nWHERE jitsu_host IS NOT NULL AND product_analytics_data_collector_host IS NULL\n")1342main: -> 0.0034s1343main: == 20230327123333 BackfillProductAnalyticsDataCollectorHost: migrated (0.0142s) 1344 does not modify existing product_analytics_data_collector_host1345 when jitsu host is not present1346main: == 20230327123333 BackfillProductAnalyticsDataCollectorHost: migrating ========1347main: -- execute("UPDATE application_settings\nSET product_analytics_data_collector_host = regexp_replace(jitsu_host, '://(.+?\\.)', '://collector.', 'g')\nWHERE jitsu_host IS NOT NULL AND product_analytics_data_collector_host IS NULL\n")1348main: -> 0.0034s1349main: == 20230327123333 BackfillProductAnalyticsDataCollectorHost: migrated (0.0146s) 1350 does not backfill product_analytics_data_collector_host1351The application_settings (main) table has 1207 columns.1352Recreating the database1353Dropped database 'gitlabhq_test'1354Created database 'gitlabhq_test'1355Databases re-creation done in 6.8380544080000621356SwapTimelogsNoteIdToBigintForGitlabDotCom1357 # order random1358 #up1359 swaps the integer and bigint columns for GitLab.com, dev, or test1360 is a no-op for other instances1361RemoveIncorrectlyOnboardedNamespacesFromOnboardingProgress1362 # order random1363 #up1364main: == 20230221214519 RemoveIncorrectlyOnboardedNamespacesFromOnboardingProgress: migrating 1365main: == 20230221214519 RemoveIncorrectlyOnboardedNamespacesFromOnboardingProgress: migrated (0.0449s) 1366 deletes the onboarding for namespaces without learn gitlab1367SwapDesignUserMentionsNoteIdToBigintForGitlabDotCom1368 # order random1369 #up1370 swaps the integer and bigint columns for GitLab.com, dev, or test1371 is a no-op for other instances1372EnsureNoteDiffFilesBigintBackfillIsFinishedForGitlabDotCom1373 # order random1374 #up1375main: == 20230322023442 EnsureNoteDiffFilesBigintBackfillIsFinishedForGitlabDotCom: migrating 1376main: == 20230322023442 
EnsureNoteDiffFilesBigintBackfillIsFinishedForGitlabDotCom: migrated (0.0118s) 1377 ensures the migration is completed for GitLab.com, dev, or test1378main: == 20230322023442 EnsureNoteDiffFilesBigintBackfillIsFinishedForGitlabDotCom: migrating 1379main: == 20230322023442 EnsureNoteDiffFilesBigintBackfillIsFinishedForGitlabDotCom: migrated (0.0099s) 1380 skips the check for other instances1381Knapsack report was generated. Preview:1382{1383 "spec/migrations/20220922143143_schedule_reset_duplicate_ci_runners_token_values_spec.rb": 21.815627164000034,1384 "spec/migrations/20220124130028_dedup_runner_projects_spec.rb": 40.41602144999979,1385 "spec/migrations/20220628012902_finalise_project_namespace_members_spec.rb": 33.450221630999295,1386 "spec/migrations/20220506154054_create_sync_namespace_details_trigger_spec.rb": 31.77369752499999,1387 "spec/migrations/remove_not_null_contraint_on_title_from_sprints_spec.rb": 28.10811492599987,1388 "spec/migrations/20221115173607_ensure_work_item_type_backfill_migration_finished_spec.rb": 38.36326207999991,1389 "spec/migrations/20230209092204_fix_partition_ids_for_ci_build_trace_metadata_spec.rb": 17.884268453000004,1390 "spec/migrations/schedule_fix_incorrect_max_seats_used2_spec.rb": 38.928388402000564,1391 "spec/migrations/20221018193635_ensure_task_note_renaming_background_migration_finished_spec.rb": 37.30595753800026,1392 "spec/migrations/20220416054011_schedule_backfill_project_member_namespace_id_spec.rb": 34.38155413499953,1393 "spec/migrations/schedule_populate_requirements_issue_id_spec.rb": 32.934807908999574,1394 "spec/migrations/20220324032250_migrate_shimo_confluence_service_category_spec.rb": 33.39262378100011,1395 "spec/migrations/cleanup_after_fixing_regression_with_new_users_emails_spec.rb": 34.27336447699963,1396 "spec/migrations/20220627152642_queue_update_delayed_project_removal_to_null_for_user_namespace_spec.rb": 31.606612974999734,1397 
"spec/migrations/20220502173045_reset_too_many_tags_skipped_registry_imports_spec.rb": 29.92865371300013,1398 "spec/migrations/20220525221133_schedule_backfill_vulnerability_reads_cluster_agent_spec.rb": 29.76320692399986,1399 "spec/migrations/20220524074947_finalize_backfill_null_note_discussion_ids_spec.rb": 28.598401286999433,1400 "spec/migrations/20221018095434_schedule_disable_legacy_open_source_license_for_projects_less_than_five_mb_spec.rb": 29.299987786999736,1401 "spec/migrations/20220816163444_update_start_date_for_iterations_cadences_spec.rb": 25.7473812169992,1402 "spec/migrations/20220920124709_backfill_internal_on_notes_spec.rb": 21.633176295999874,1403 "spec/migrations/set_email_confirmation_setting_before_removing_send_user_confirmation_email_column_spec.rb": 23.255945793000137,1404 "spec/migrations/20220929213730_schedule_delete_orphaned_operational_vulnerabilities_spec.rb": 22.19939739700021,1405 "spec/migrations/20220819153725_add_vulnerability_advisory_foreign_key_to_sbom_vulnerable_component_versions_spec.rb": 21.4630892819996,1406 "spec/migrations/recount_epic_cache_counts_spec.rb": 21.418036394999945,1407 "spec/migrations/20221205151917_schedule_backfill_environment_tier_spec.rb": 15.6024774790003,1408 "spec/migrations/remove_flowdock_integration_records_spec.rb": 14.663702761999957,1409 "spec/migrations/20230327123333_backfill_product_analytics_data_collector_host_spec.rb": 10.783743777999916,1410 "spec/migrations/swap_timelogs_note_id_to_bigint_for_gitlab_dot_com_spec.rb": 11.578876513999603,1411 "spec/migrations/20230221214519_remove_incorrectly_onboarded_namespaces_from_onboarding_progress_spec.rb": 7.7773611559996425,1412 "spec/migrations/swap_design_user_mentions_note_id_to_bigint_for_gitlab_dot_com_spec.rb": 8.187130033000358,1413 "spec/migrations/ensure_note_diff_files_bigint_backfill_is_finished_for_gitlab_dot_com_spec.rb": 7.0175932629999811414}1415Knapsack global time execution for tests: 13m 03s1416Finished in 28 minutes 54 
seconds (files took 39.66 seconds to load)141775 examples, 0 failures1418Randomized with seed 223141419[TEST PROF INFO] Time spent in factories: 00:00.731 (0.04% of total time)1420RSpec exited with 0.1421No examples to retry, congrats!1423Not uploading cache ruby-gems-debian-bullseye-ruby-3.0-16 due to policy1424Not uploading cache gitaly-ruby-gems-debian-bullseye-ruby-3.0-16 due to policy1426Uploading artifacts...1427coverage/: found 5 matching artifact files and directories 1428crystalball/: found 2 matching artifact files and directories 1429WARNING: deprecations/: no matching files. Ensure that the artifact path is relative to the working directory (/builds/gitlab-org/gitlab) 1430knapsack/: found 3 matching artifact files and directories 1431WARNING: query_recorder/: no matching files. Ensure that the artifact path is relative to the working directory (/builds/gitlab-org/gitlab) 1432rspec/: found 14 matching artifact files and directories 1433WARNING: tmp/capybara/: no matching files. Ensure that the artifact path is relative to the working directory (/builds/gitlab-org/gitlab) 1434log/*.log: found 13 matching artifact files and directories 1435WARNING: Upload request redirected location=https://gitlab.com/api/v4/jobs/4031138395/artifacts?artifact_format=zip&artifact_type=archive&expire_in=31d new-url=https://gitlab.com1436WARNING: Retrying... context=artifacts-uploader error=request redirected1437Uploading artifacts as "archive" to coordinator... 201 Created id=4031138395 responseStatus=201 Created token=64_n4fNr1438Uploading artifacts...1439rspec/junit_rspec.xml: found 1 matching artifact files and directories 1440WARNING: Upload request redirected location=https://gitlab.com/api/v4/jobs/4031138395/artifacts?artifact_format=gzip&artifact_type=junit&expire_in=31d new-url=https://gitlab.com1441WARNING: Retrying... context=artifacts-uploader error=request redirected1442Uploading artifacts as "junit" to coordinator... 
201 Created id=4031138395 responseStatus=201 Created token=64_n4fNr1444Job succeeded
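The Knapsack preview printed above is plain JSON mapping each spec file to its recorded runtime in seconds, so it can be inspected with a few lines of Ruby. A minimal sketch; the inline sample copies two entries from the report above rather than reading the uploaded `knapsack/` artifact:

```ruby
require 'json'

# Two entries copied from the Knapsack preview above, as an inline sample.
report_json = <<~JSON
  {
    "spec/migrations/20220124130028_dedup_runner_projects_spec.rb": 40.41602144999979,
    "spec/migrations/remove_flowdock_integration_records_spec.rb": 14.663702761999957
  }
JSON

report = JSON.parse(report_json)

# Find the slowest spec file by recorded runtime.
slowest_file, slowest_seconds = report.max_by { |_file, seconds| seconds }
puts format('slowest: %.1fs %s', slowest_seconds, slowest_file)

# Sum the runtimes, as Knapsack does for its "global time execution" line.
total = report.values.sum
puts format('total: %.1fs across %d files', total, report.size)
```

Knapsack uses these per-file timings to balance spec files across parallel CI nodes, which is why the report is persisted as a job artifact.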