path: root/includes/filerepo/file/ArchivedFile.php
author    Amir Sarabadani <ladsgroup@gmail.com>  2024-01-17 22:06:37 +0100
committer Ladsgroup <Ladsgroup@gmail.com>        2024-01-22 20:10:11 +0000
commit    16b468b515da957398b95c602f9db7a9bd87cdbc (patch)
tree      53dd860017d727de2398c4fae358bbe2c7d971b8 /includes/filerepo/file/ArchivedFile.php
parent    fd3622e992659d943fe638d2d8ee6e35791cc0f2 (diff)
updateCollation: Simplify and redo how batching works
A long time ago, when we changed collations far more often and HDDs were the norm for database servers, the index on cl_collation helped us speed up the updates and minimize user impact. That is no longer the case, and on top of that we now have the option of copying the table by setting --target-table, so there is no need to run the schema change super fast. Also, the categorylinks table is quite large (on Commons it has reached 210GB, comparable to enwiki's revision table) and an extra index like that is quite taxing on the infrastructure. So let's just do what other maintenance scripts do: go through all rows in batches and take advantage of the cl_from index instead. This is similar to what migrateLinksTable does.

Bug: T342854
Change-Id: Ie4dd91ee29308c980ec0b9b7ee684cb175ffca43
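
As an illustration of the batching pattern the commit message describes, here is a minimal sketch that pages through categorylinks on its (cl_from, cl_to) primary key instead of relying on a cl_collation index. It is not the actual updateCollation.php code: it uses plain PDO rather than MediaWiki's database layer, and the connection details, batch size, collation name, and computeSortKey() helper are placeholders.

<?php
// Hypothetical sketch of primary-key batching over categorylinks.
// NOT the real updateCollation.php; connection details, batch size,
// collation name and computeSortKey() are placeholders.

// Placeholder for whatever sort-key computation the real script performs.
function computeSortKey( string $title ): string {
	return mb_strtoupper( $title );
}

$pdo = new PDO( 'mysql:host=localhost;dbname=wiki', 'wikiuser', 'secret' );
$pdo->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );

$batchSize = 500;
$lastFrom = 0;
$lastTo = '';

do {
	// Page through the table ordered by the (cl_from, cl_to) primary key.
	$select = $pdo->prepare(
		'SELECT cl_from, cl_to FROM categorylinks' .
		' WHERE cl_from > :fromLow' .
		'    OR ( cl_from = :fromEq AND cl_to > :toLow )' .
		' ORDER BY cl_from, cl_to' .
		' LIMIT ' . (int)$batchSize
	);
	$select->execute( [
		':fromLow' => $lastFrom,
		':fromEq'  => $lastFrom,
		':toLow'   => $lastTo,
	] );
	$rows = $select->fetchAll( PDO::FETCH_ASSOC );

	foreach ( $rows as $row ) {
		// Recompute the sort key for this row under the new collation.
		$update = $pdo->prepare(
			'UPDATE categorylinks' .
			' SET cl_sortkey = :sortkey, cl_collation = :collation' .
			' WHERE cl_from = :from AND cl_to = :to'
		);
		$update->execute( [
			':sortkey'   => computeSortKey( $row['cl_to'] ),
			':collation' => 'uca-default',
			':from'      => $row['cl_from'],
			':to'        => $row['cl_to'],
		] );
		$lastFrom = (int)$row['cl_from'];
		$lastTo = $row['cl_to'];
	}
	// A real maintenance script would wait for replication here before
	// starting the next batch.
} while ( count( $rows ) === $batchSize );

The design point is that the primary key already provides an ordered, indexed way to resume after each batch, so no extra index on cl_collation is needed just to drive the update.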
Diffstat (limited to 'includes/filerepo/file/ArchivedFile.php')
0 files changed, 0 insertions, 0 deletions