
client-side filtering does not scale

Looking specifically at the Amazon EC2 images, I see that the image finder currently loads data for more than 15,000 images in the browser, and then relies on client-side filtering to limit the view. This requires an 8.4 MB download every time the provider page is accessed, and requires the browser to process all 15k entries in order to apply filters. This cannot work in production.

In normal operation, the image finder should only retrieve data for images that it is likely to need. In almost all cases, this will be limited to the most recent version of the "release" images. If a user needs to find an older release, or a daily build, then this should be possible if they explicitly request it, but we should not load all the data by default just to facilitate that search. Loading only the latest image data should reduce the document size to well under 100 kB, which will load far quicker and be far more efficient.
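To make the idea concrete, here is a minimal sketch of the "latest release only" reduction, assuming a hypothetical record shape with `channel`, `region`, `arch`, and a sortable `version` field (these names are illustrative, not the image finder's actual schema):

```python
# Sketch: reduce a full image list to only the newest "release" image
# per (region, arch) group. Field names are hypothetical examples.
from itertools import groupby

def latest_release_images(images):
    # Keep only release-channel images; dailies stay server-side
    # until a user explicitly asks for them.
    releases = [i for i in images if i["channel"] == "release"]
    # Sort so each (region, arch) group is contiguous, newest last.
    releases.sort(key=lambda i: (i["region"], i["arch"], i["version"]))
    latest = []
    for _, group in groupby(releases, key=lambda i: (i["region"], i["arch"])):
        latest.append(list(group)[-1])  # newest version in the group
    return latest

images = [
    {"channel": "release", "region": "us-east-1", "arch": "x86_64", "version": 20240101},
    {"channel": "release", "region": "us-east-1", "arch": "x86_64", "version": 20240301},
    {"channel": "daily",   "region": "us-east-1", "arch": "x86_64", "version": 20240315},
]
print(latest_release_images(images))
```

Whether this reduction happens in an API endpoint or at document-generation time is an open question, but either way only the reduced list should reach the browser by default.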

Note also that we have few enough providers that it may be worth pre-generating the provider view and serving it as static content. But we can deal with that later.