The core problem with WeakMap in JS (from a regular developer’s point of view) is that it was originally intended to be what PLT folks call an ephemeron table, rather than what “weak map” means to everyone else :D
The principal use case for them - from the PoV of MarkM and Crockford (IIRC) - was essentially private fields, which is a thing we solved much better with the addition of `Symbol`/private names.
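To illustrate that original use case, here is a rough sketch (names are my own, not from the post): the pre-class-fields pattern of keeping per-instance private state in a module-scoped WeakMap keyed by `this`, next to the symbol-keyed equivalent that makes the side table unnecessary.

```javascript
// Pre-class-fields pattern: private per-instance state lives in a
// WeakMap keyed by the instance itself.
const secrets = new WeakMap();

class Counter {
  constructor() {
    secrets.set(this, { count: 0 });
  }
  increment() {
    return ++secrets.get(this).count;
  }
}

// The same thing with a symbol key - no side table needed, and the
// state's lifetime is simply the instance's lifetime.
const countSym = Symbol('count');

class SymCounter {
  constructor() {
    this[countSym] = 0;
  }
  increment() {
    return ++this[countSym];
  }
}

const c = new Counter();
c.increment();
c.increment(); // count is now 2
```

(Today `#private` class fields make both patterns unnecessary, but the WeakMap version is what the ephemeron-table design was aimed at.)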
That’s why WeakMap keys have to be objects, and as a result most of the uses of WeakMap I have encountered have been incorrect: they inherently end up keeping the key object alive, which in turn keeps the associated value alive (again, JS WeakMap does not match the definition of “weak map” in any other language :-/).
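A hypothetical cache makes the pitfall concrete: WeakMap rejects primitive keys outright, and as long as your code holds the key object anywhere, the (often much larger) value stays reachable too - the “weakness” buys you nothing.

```javascript
const cache = new WeakMap();

// WeakMap keys must be objects (non-registered symbols were also
// allowed in ES2023); primitives are rejected:
let threw = false;
try {
  cache.set('id-42', { big: 'payload' });
} catch (e) {
  threw = e instanceof TypeError;
}

// The common mistake: key the cache on an object you also keep alive.
const key = { id: 42 };
cache.set(key, new Array(1_000_000).fill(0)); // large value

// While `key` sits in some long-lived structure, the entry - and the
// large value - can never be collected.
const liveKeys = [key];
const entry = cache.get(liveKeys[0]); // still reachable, length 1000000
```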
A bunch of folk have made statements to the effect of “depending on the behavior of iteration when iterating a weak map is bad code, the developers are wrong, and preventing iteration to protect against bad code is dumb”. That’s not true for the web, for myriad reasons, but the crux of it is this: if a major site ends up meaningfully dependent on GC behaviour in Chrome, then the site’s support will just say “you need to use Chrome” - and then every other browser will need to work out how to get close-enough semantics. An easy way to see how this can meaningfully affect behaviour is to compare generational collectors with non-generational ones: a generational collector can very easily keep a cache entry alive longer, and iteration then allows that entry to come back to life. The user experience becomes “browser 1 is worse than browser 2 because it has to keep re-downloading data as the cache gets evicted”.
A more period-relevant reason for the “there shall be no iteration” rule is that at the time there was no standardized definition of when references were permitted to (observably) die.
That is less of an issue today: when the language finally added a real weak reference type, that question finally had to be addressed, and the JS spec now specifies event-loop semantics governing when weak references can die, giving a reasonable degree of determinism in behaviour so they can be exposed to web content. So, to an extent, the problems of WeakMap iteration are less of a problem.
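A minimal sketch of the weak reference type the language eventually got (`WeakRef`, plus `FinalizationRegistry` for cleanup notifications). The determinism mentioned above is the spec rule that a target obtained via `deref()` is kept alive for the remainder of the current synchronous job; targets may only be observed as cleared between turns of the event loop.

```javascript
let obj = { data: 'payload' };
const ref = new WeakRef(obj);

// Within the same synchronous job, deref() is guaranteed to return the
// target - it cannot vanish mid-job.
const stillAlive = ref.deref() === obj; // true

// Optional cleanup notification, delivered at some engine-chosen point
// after the target is collected:
const registry = new FinalizationRegistry((heldValue) => {
  console.log(`collected: ${heldValue}`);
});
registry.register(obj, 'obj');

obj = null;
// Whether and when ref.deref() starts returning undefined, and when the
// registry callback fires, is up to the engine - only the per-job
// stability above is guaranteed.
```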