  # lru cache
  
  A cache object that deletes the least-recently-used items.
  
  [![Build Status](https://travis-ci.org/isaacs/node-lru-cache.svg?branch=master)](https://travis-ci.org/isaacs/node-lru-cache) [![Coverage Status](https://coveralls.io/repos/isaacs/node-lru-cache/badge.svg?service=github)](https://coveralls.io/github/isaacs/node-lru-cache)
  
  ## Installation:
  
  ```bash
  npm install lru-cache --save
  ```
  
  ## Usage:
  
  ```javascript
  var assert = require("assert")
  var LRU = require("lru-cache")
    , options = { max: 500
                , length: function (n, key) { return n * 2 + key.length }
                , dispose: function (key, n) { if (n && n.close) n.close() } // only close values that have a close() method
                , maxAge: 1000 * 60 * 60 }
    , cache = LRU(options)
    , otherCache = LRU(50) // sets just the max size
  
  cache.set("key", "value")
  cache.get("key") // "value"
  
  // non-string keys ARE fully supported
  // but note that it must be THE SAME object, not
  // just a JSON-equivalent object.
  var someObject = { a: 1 }
  cache.set(someObject, 'a value')
  // Object keys are not toString()-ed
  cache.set('[object Object]', 'a different value')
  assert.equal(cache.get(someObject), 'a value')
  // A similar object with same keys/values won't work,
  // because it's a different object identity
  assert.equal(cache.get({ a: 1 }), undefined)
  
  cache.reset()    // empty the cache
  ```
  
  If you put more stuff in it than `max`, then the least-recently-used
  items will fall out.

  If you try to put an oversized item in it (larger than `max` according
  to the `length` function), it will fall out right away.
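
  For example, here is a minimal sketch of that eviction behavior (the
  keys and values are placeholders):

  ```javascript
  var LRU = require("lru-cache")

  var cache = LRU({ max: 2 })
  cache.set("a", 1)
  cache.set("b", 2)
  cache.set("c", 3)  // "a" was least recently used, so it falls out
  cache.get("a")     // undefined
  cache.get("b")     // 2

  // With a length function, an entry bigger than max is dropped immediately
  var tiny = LRU({ max: 5, length: function (n) { return n.length } })
  tiny.set("big", "much longer than five characters")
  tiny.get("big")    // undefined
  ```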
  
  ## Options
  
  * `max` The maximum size of the cache, checked by applying the length
    function to all values in the cache.  Not setting this is kind of
    silly, since that's the whole purpose of this lib, but it defaults
    to `Infinity`.
  * `maxAge` Maximum age in ms.  Items are not proactively pruned out
    as they age, but if you try to get an item that is too old, the
    cache will drop it and return `undefined` instead of giving it to you.
  * `length` Function that is used to calculate the length of stored
    items.  If you're storing strings or buffers, then you probably want
    to do something like `function(n, key){return n.length}`.  The default is
    `function(){return 1}`, which is fine if you want to store `max`
    like-sized things.  The item is passed as the first argument, and
    the key is passed as the second argument.
  * `dispose` Function that is called on items when they are dropped
    from the cache.  This can be handy if you want to close file
    descriptors or do other cleanup tasks when items are no longer
    accessible.  Called with `key, value`.  It's called *before*
    actually removing the item from the internal cache, so if you want
    to immediately put it back in, you'll have to do that in a
    `nextTick` or `setTimeout` callback or it won't do anything.
  * `stale` By default, if you set a `maxAge`, it'll only actually pull
    stale items out of the cache when you `get(key)`.  (That is, it's
    not pre-emptively doing a `setTimeout` or anything.)  If you set
    `stale:true`, it'll return the stale value before deleting it.  If
    you don't set this, then it'll return `undefined` when you try to
    get a stale entry, as if it had already been deleted.
  * `noDisposeOnSet` By default, if you set a `dispose()` method, then
    it'll be called whenever a `set()` operation overwrites an existing
    key.  If you set this option, `dispose()` will only be called when a
    key falls out of the cache, not when it is overwritten.
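
  As a rough sketch of how `maxAge`, `stale`, and `dispose` interact
  (the key, value, and timings below are made up for illustration):

  ```javascript
  var LRU = require("lru-cache")

  var cache = LRU({
    max: 100,
    maxAge: 50,     // entries older than 50ms are considered stale
    stale: true,    // hand back the stale value once before deleting it
    dispose: function (key, value) {
      console.log("dropped", key)  // runs when an entry is dropped from the cache
    }
  })

  cache.set("session", { user: "alice" })

  setTimeout(function () {
    cache.get("session")  // returns the stale value before the entry is deleted
    cache.get("session")  // undefined; the entry is gone
  }, 100)
  ```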
  
  ## API
  
  * `set(key, value, maxAge)`
  * `get(key) => value`
  
      Both of these will update the "recently used"-ness of the key.
      They do what you think. `maxAge` is optional and overrides the
      cache `maxAge` option if provided.
  
      If the key is not found, `get()` will return `undefined`.
  
      The key and val can be any value.
  
  * `peek(key)`
  
      Returns the key value (or `undefined` if not found) without
      updating the "recently used"-ness of the key.
  
      (If you find yourself using this a lot, you *might* be using the
      wrong sort of data structure, but there are some use cases where
      it's handy.)
  
  * `del(key)`
  
      Deletes a key out of the cache.
  
  * `reset()`
  
      Clear the cache entirely, throwing away all values.
  
  * `has(key)`
  
      Check if a key is in the cache, without updating the recent-ness
      or deleting it for being stale.
  
  * `forEach(function(value,key,cache), [thisp])`
  
      Just like `Array.prototype.forEach`.  Iterates over all the keys
      in the cache, in order of recent-ness.  (I.e., more recently used
      items are iterated over first.)
  
  * `rforEach(function(value,key,cache), [thisp])`
  
      The same as `cache.forEach(...)` but items are iterated over in
      reverse order.  (I.e., less recently used items are iterated over
      first.)
  
  * `keys()`
  
      Return an array of the keys in the cache.
  
  * `values()`
  
      Return an array of the values in the cache.
  
  * `length`
  
      Return the total length of objects in the cache, taking into
      account the `length` option's function.
  
  * `itemCount`
  
      Return the total number of objects currently in the cache.  Note
      that `stale` items (see options) are included in this count.
  
  * `dump()`
  
      Return an array of the cache entries ready for serialization and usage
      with `destinationCache.load(arr)`.
  
  * `load(cacheEntriesArray)`
  
      Loads another cache's entries array, obtained with `sourceCache.dump()`,
      into the cache.  The destination cache is reset before the new
      entries are loaded.
  
  * `prune()`
  
      Manually iterates over the entire cache, proactively pruning old entries.
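
  To tie the API together, here is a small sketch that exercises several
  of the methods above (the cache names, keys, and values are placeholders):

  ```javascript
  var LRU = require("lru-cache")

  var sourceCache = LRU({ max: 10 })
  sourceCache.set("a", 1)
  sourceCache.set("b", 2)

  sourceCache.has("a")    // true, without refreshing recent-ness
  sourceCache.peek("a")   // 1, also without refreshing recent-ness
  sourceCache.keys()      // ["b", "a"] here (more recently used first)
  sourceCache.values()    // [2, 1]
  sourceCache.itemCount   // 2

  // dump() produces an array of entries that load() understands, so a
  // cache can be serialized (e.g. to JSON) and rebuilt elsewhere.
  var json = JSON.stringify(sourceCache.dump())
  var destinationCache = LRU({ max: 10 })
  destinationCache.load(JSON.parse(json))  // resets destinationCache first
  destinationCache.get("b")                // 2

  sourceCache.del("a")    // remove a single key
  sourceCache.reset()     // empty the cache entirely
  ```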