Increasing cache of pages

BaiduGirl

There is a problem: is it possible to increase the cache of rendered pages in RAM? It is almost impossible to work with large documents; you have to wait 8-10 seconds for a page to load.
Is such a feature planned for the future, or is it already implemented and I simply don't know about it?

Joachim_Weisse

I have the same question. When I work with PDFs of 100 to 600 pages, you cannot scroll quickly (left/right arrow keys on the keyboard). It scrolls ten or twenty pages, then it stalls and needs time to load. My documents are stored on a fast SSD.

Is there a way to force Sumatra to cache more pages at startup?
Or is the only option to switch to a faster NVMe drive?

Sumatra 3.3 64-bit, Win10

Best Regards

GitHubRulesOK

I have a 7063-page text document. I can go to the last page, or any other page such as 5000, and it loads instantly, and I can move one page right or left without a delay, i.e. it is NOT loading all 7063 pages as such, or else I would have to wait for the other 4999 to be lifted into memory.

A PDF can be, but rarely is, a sequence of pages like a book. It is more like a room full of scattered crumpled sheets, starting with 0, which with luck have tag numbers 1-7063, plus a map of where sheet 5000 might be found. So there is a short delay while I look at the map to see where 4999 might be located and unfurl it. Thus keeping the index cached in memory is important to minimise the delay in picking up pages, but I don't want to try to memorise every word, nor remember every dot of an i or cross of a t. (Remember, one character could be multiple glyphs with hundreds of strokes or bytes once unpacked.)
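
To make the "map" point concrete, here is a minimal sketch (mine, not anything built into Sumatra) of jumping straight to one page of a large PDF. It uses the PyMuPDF library, a Python wrapper around the same MuPDF engine that Sumatra renders with; the file name and page number are made-up examples.

```python
# Sketch: jump straight to one page of a large PDF without rendering the rest.
# Uses PyMuPDF (pip install pymupdf); "big.pdf" and page 5000 are hypothetical
# stand-ins for the 7063-page document described above.
import time
import fitz  # PyMuPDF

start = time.perf_counter()
doc = fitz.open("big.pdf")      # parses the cross-reference table (the "map"),
                                # not the contents of every page
page = doc.load_page(4999)      # 0-based index, i.e. page 5000
pix = page.get_pixmap(dpi=96)   # only now is this single page decompressed and rendered
pix.save("page5000.png")
print(f"{doc.page_count} pages, one page rendered in {time.perf_counter() - start:.2f}s")
```

Opening the file is cheap because only the index is read; almost all of the time goes into rendering the one page you actually asked for.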

I have another, smaller file with a single page, a highly detailed and compressed aeronautical chart, that takes what seems like hours to load just that one page. I would have landed on the mountain peak in the fog before the navigator saw it on the chart and gave me a bearing. If the cover sheet had a photo of Everest, I would have been forewarned to elevate.

The problems you see usually come down to the need to decompress objects, and especially images, in the area of interest.

The more carelessly the file was authored, or the more heavily compacted the graphics, the longer those pages take to be decompressed from fathoms further down.
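
If you want to check whether heavy images are the culprit in your own file, a rough way (my own sketch using PyMuPDF again, with a made-up file name; not a Sumatra feature) is to list the images on a slow page and see how big they are and how they were compressed:

```python
# Sketch: inspect the images on one page of a PDF to see why it renders slowly.
# Uses PyMuPDF (pip install pymupdf); "slow.pdf" and the page number are hypothetical.
import fitz  # PyMuPDF

doc = fitz.open("slow.pdf")
page = doc.load_page(0)             # 0-based index of the page that feels slow

for xref, *_ in page.get_images(full=True):
    info = doc.extract_image(xref)  # decodes the raw image stream for this object
    print(f"xref {xref}: {info['width']}x{info['height']} px, "
          f"format {info['ext']}, {len(info['image']) / 1024:.0f} KiB")
# Huge pixel dimensions or exotic formats (JPEG 2000 shows up as 'jpx') on a
# single page are usually why that page takes seconds to appear.
```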

For PDFs I always suggest PNG at 96 ppi with its native compression, i.e. not JPEG 2000 (J2K) or any other esoteric format that only slows the user down. For image folders or Zip/CBZ, consider smaller blocks such as editions/chapters (avoid combining 12 issues into one easier-to-download annual).
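
As a rough illustration of the 96 ppi advice (again just a sketch of mine, nothing to do with Sumatra itself; it uses the Pillow library, and the file names and the assumption of an A4-sized page are made up), oversized scans can be shrunk to roughly 96 ppi PNG before they ever go into a PDF or CBZ:

```python
# Sketch: downsample a scanned page image to ~96 ppi PNG before packing it into
# a PDF/CBZ. Uses Pillow (pip install pillow); "scan.png", "scan_96.png" and the
# A4 page size are hypothetical examples.
from PIL import Image

TARGET_PPI = 96
PAGE_WIDTH_IN = 8.27                     # A4 width in inches (assumed page size)
target_width = round(TARGET_PPI * PAGE_WIDTH_IN)   # about 794 px

img = Image.open("scan.png").convert("RGB")
if img.width > target_width:             # only shrink oversized scans, never upscale
    scale = target_width / img.width
    img = img.resize((target_width, round(img.height * scale)), Image.LANCZOS)
img.save("scan_96.png", "PNG")           # PNG keeps its native lossless compression
```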