There might be an easy way around all of this, though: the problem is that the URLs used start out encoded. If you sourced the raw list and, in the extension, ran it live through https://github.com/pieroxy/lz-string/blob/master/libs/lz-str... to write the lz-compressed PAC file to disk (you'll need the filesystem write permission on the extension), then attached that lz-string library at the end and called it to decompress (similar to what you do now, just not manually), it's probably a win-win.
I say win-win because on your side the lz compression will probably net a file ~10x smaller than what you have now at the same decode speed, and on Google's side the URLs are stored in a form they can validate against.
- The linked page covers why they needed to do it
- A Chromium bug report on that issue was linked (i.e. they tried to get the limit removed properly)
- "Encode the rules as data rather than generated code" was the suggestion of a Chromium developer in that bug report