API
This part of the documentation covers all the interfaces of requests-cache.

Public API

requests_cache.core
Core functions for configuring cache and monkey patching requests.
class requests_cache.core.CachedSession(cache_name='cache', backend=None, expire_after=None, allowable_codes=(200,), allowable_methods=('GET',), **backend_options)

    Requests Session with caching support.

    Parameters:
    - cache_name –
      for the sqlite backend: the cache file will start with this prefix, e.g. cache.sqlite
      for mongodb: it's used as the database name
      for redis: it's used as the namespace, meaning all keys are prefixed with 'cache_name:'
    - backend – cache backend name, e.g. 'sqlite', 'mongodb', 'redis', 'memory' (see Persistence), or an instance of a backend implementation. The default is None, which means use 'sqlite' if available, otherwise fall back to 'memory'.
    - expire_after (float) – number of seconds after which the cache expires, or None (default) to ignore expiration
    - allowable_codes (tuple) – cache only responses with these status codes (default: 200)
    - allowable_methods (tuple) – cache only requests with these methods (default: 'GET')
    - backend_options – options for the chosen backend; see the corresponding sqlite, mongo and redis backends API documentation

    cache_disabled()

        Context manager for temporarily disabling the cache:

        >>> s = CachedSession()
        >>> with s.cache_disabled():
        ...     s.get('http://httpbin.org/ip')
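A minimal usage sketch (the cache name, expiry and URL are illustrative, and from_cache assumes a version that sets that attribute on returned responses):

from requests_cache import CachedSession

session = CachedSession('demo_cache', backend='sqlite', expire_after=300)
session.get('http://httpbin.org/get')      # fetched over the network
r = session.get('http://httpbin.org/get')  # served from demo_cache.sqlite
print(r.from_cache)                        # True on the second call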
requests_cache.core.install_cache(cache_name='cache', backend=None, expire_after=None, allowable_codes=(200,), allowable_methods=('GET',), session_factory=<class 'requests_cache.core.CachedSession'>, **backend_options)

    Installs cache for all Requests requests by monkey-patching Session.

    Parameters are the same as in CachedSession. Additional parameters:

    Parameters: session_factory – Session factory. It must be a class that inherits CachedSession (default).
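A sketch of transparent caching through the patched Session (cache name, expiry and URL are illustrative):

import requests
import requests_cache

requests_cache.install_cache('demo_cache', expire_after=3600)
requests.get('http://httpbin.org/get')  # goes through the cache transparently

requests_cache.uninstall_cache()        # restore the original requests.Session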
requests_cache.core.configure(cache_name='cache', backend=None, expire_after=None, allowable_codes=(200,), allowable_methods=('GET',), session_factory=<class 'requests_cache.core.CachedSession'>, **backend_options)

    Installs cache for all Requests requests by monkey-patching Session.

    Parameters are the same as in CachedSession. Additional parameters:

    Parameters: session_factory – Session factory. It must be a class that inherits CachedSession (default).
requests_cache.core.uninstall_cache()

    Restores requests.Session and disables cache.
requests_cache.core.disabled()

    Context manager for temporarily disabling the globally installed cache.

    Warning: not thread-safe.

    >>> with requests_cache.disabled():
    ...     requests.get('http://httpbin.org/ip')
    ...     requests.get('http://httpbin.org/get')
requests_cache.core.enabled(*args, **kwargs)

    Context manager for temporarily installing the global cache.
    Accepts the same arguments as install_cache().

    Warning: not thread-safe.

    >>> with requests_cache.enabled('cache_db'):
    ...     requests.get('http://httpbin.org/get')
requests_cache.core.get_cache()

    Returns the internal cache object from the globally installed CachedSession.

requests_cache.core.clear()

    Clears the globally installed cache.
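A sketch of inspecting and clearing the global cache (assumes install_cache() was called first; has_url is the BaseCache method documented below):

import requests
import requests_cache

requests_cache.install_cache('demo_cache')
requests.get('http://httpbin.org/get')

cache = requests_cache.get_cache()
print(cache.has_url('http://httpbin.org/get'))  # True once the response is cached

requests_cache.clear()                          # drop all cached responses
print(cache.has_url('http://httpbin.org/get'))  # False after clearing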
Cache backends

requests_cache.backends.base
Contains the BaseCache class, which can be used as an in-memory cache backend or extended to support persistence.
class requests_cache.backends.base.BaseCache(*args, **kwargs)

    Base class for cache implementations; can be used as an in-memory cache.

    To extend it, you can provide dictionary-like objects for keys_map and responses, or override the public methods.

    keys_map = None
        key -> key_in_responses mapping

    responses = None
        key_in_cache -> response mapping

    save_response(key, response)

        Save response to cache.

        Parameters:
        - key – key for this response
        - response – response to save

        Note: the response is reduced before saving (with reduce_response()) to make it picklable.

    add_key_mapping(new_key, key_to_response)

        Adds a mapping of new_key to key_to_response, making it possible to associate many keys with a single response.

        Parameters:
        - new_key – new key (e.g. url from a redirect)
        - key_to_response – key which can be found in responses

    get_response_and_time(key, default=(None, None))

        Retrieves the response and timestamp for key if it's stored in the cache, otherwise returns default.

        Parameters:
        - key – key of resource
        - default – return this if key is not found in the cache

        Returns: tuple (response, datetime)

        Note: the response is restored after unpickling with restore_response().

    delete(key)

        Delete key from the cache. Also deletes all responses from the response history.

    delete_url(url)

        Delete the response associated with url from the cache. Also deletes all responses from the response history. Works only for GET requests.

    clear()

        Clear the cache.

    has_key(key)

        Returns True if the cache has key, False otherwise.

    has_url(url)

        Returns True if the cache has url, False otherwise. Works only for GET request urls.

    reduce_response(response)

        Reduce the response object to make it compatible with pickle.

    restore_response(response)

        Restore the response object after unpickling.
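A sketch of how backends plug in their own storage: subclass BaseCache and supply dictionary-like objects for keys_map and responses, as the persistent backends below do. With plain dicts (the class name here is hypothetical) this behaves like the in-memory backend:

from requests_cache.backends.base import BaseCache

class SimpleDictCache(BaseCache):
    """Backend built by supplying dict-like storages."""
    def __init__(self, *args, **kwargs):
        super(SimpleDictCache, self).__init__(*args, **kwargs)
        self.keys_map = {}   # key -> key_in_responses mapping
        self.responses = {}  # key_in_cache -> response mapping

A persistent backend would replace these dicts with objects like DbDict or MongoDict (documented below).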
requests_cache.backends.sqlite
sqlite3 cache backend

class requests_cache.backends.sqlite.DbCache(location='cache', fast_save=False, extension='.sqlite', **options)

    sqlite cache backend.

    Reading is fast, saving is a bit slower. It can store a big amount of data with low memory usage.

    Parameters:
    - location – database filename prefix (default: 'cache')
    - fast_save – speeds up cache saving up to 50 times, but with the possibility of data loss. See backends.DbDict for more info.
    - extension – extension for filename (default: '.sqlite')
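Backend options like these can be passed through install_cache or CachedSession as **backend_options (a sketch; the cache name and option values are illustrative):

import requests_cache

requests_cache.install_cache('demo_cache', backend='sqlite',
                             extension='.db', fast_save=True)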
requests_cache.backends.mongo
mongo cache backend

class requests_cache.backends.mongo.MongoCache(db_name='requests-cache', **options)

    mongo cache backend.

    Parameters:
    - db_name – database name (default: 'requests-cache')
    - connection – (optional) pymongo.Connection
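A sketch of selecting the mongo backend with an explicit connection (pymongo.Connection matches the old pymongo API referenced above; host and port are illustrative):

from pymongo import Connection
import requests_cache

conn = Connection('localhost', 27017)
requests_cache.install_cache('demo_cache', backend='mongodb', connection=conn)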
Internal modules that can be used outside

requests_cache.backends.dbdict
Dictionary-like objects for saving large data sets to a sqlite database
class requests_cache.backends.storage.dbdict.DbDict(filename, table_name='data', fast_save=False, **options)

    DbDict - a dictionary-like object for saving large datasets to a sqlite database.

    It's possible to create multiple DbDict instances, which will be stored as separate tables in one database:

    d1 = DbDict('test', 'table1')
    d2 = DbDict('test', 'table2')
    d3 = DbDict('test', 'table3')

    All data will be stored in the test.sqlite database, in the corresponding tables: table1, table2 and table3.

    Parameters:
    - filename – filename for database (without extension)
    - table_name – table name
    - fast_save – if True, sqlite will be configured with "PRAGMA synchronous = 0;" to speed up cache saving. Be careful, it's dangerous: tests showed that the insertion order of records can be wrong with this option.
    can_commit = None
        Transactions can be committed if this property is set to True.

    commit(force=False)

        Commits the pending transaction if can_commit or force is True.

        Parameters: force – force commit, ignore can_commit

    bulk_commit()

        Context manager used to speed up insertion of a big number of records:

        >>> d1 = DbDict('test')
        >>> with d1.bulk_commit():
        ...     for i in range(1000):
        ...         d1[i] = i * 2
class requests_cache.backends.storage.dbdict.DbPickleDict(filename, table_name='data', fast_save=False, **options)

    Same as DbDict, but pickles values before saving.

    Parameters:
    - filename – filename for database (without extension)
    - table_name – table name
    - fast_save – if True, sqlite will be configured with "PRAGMA synchronous = 0;" to speed up cache saving. Be careful, it's dangerous: tests showed that the insertion order of records can be wrong with this option.
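Because values are pickled, DbPickleDict can store arbitrary picklable Python objects, not just the types sqlite handles natively (a sketch; the filename, table name and keys are illustrative):

from requests_cache.backends.storage.dbdict import DbPickleDict

d = DbPickleDict('test', 'pickled')
d['item'] = {'status': 200, 'tags': ['a', 'b']}  # a dict survives the round-trip
print(d['item']['tags'])                         # ['a', 'b']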
requests_cache.backends.mongodict
Dictionary-like objects for saving large data sets to a mongodb database
class requests_cache.backends.storage.mongodict.MongoDict(db_name, collection_name='mongo_dict_data', connection=None)

    MongoDict - a dictionary-like interface for a mongo database.

    Parameters:
    - db_name – database name (be careful with production databases)
    - collection_name – collection name (default: mongo_dict_data)
    - connection – pymongo.Connection instance. If it's None (default), a new connection with default options will be created.
class requests_cache.backends.storage.mongodict.MongoPickleDict(db_name, collection_name='mongo_dict_data', connection=None)

    Same as MongoDict, but pickles values before saving.

    Parameters:
    - db_name – database name (be careful with production databases)
    - collection_name – collection name (default: mongo_dict_data)
    - connection – pymongo.Connection instance. If it's None (default), a new connection with default options will be created.
requests_cache.backends.redisdict
Dictionary-like objects for saving large data sets to a redis key-store
class requests_cache.backends.storage.redisdict.RedisDict(namespace, collection_name='redis_dict_data', connection=None)

    RedisDict - a dictionary-like interface for redis key-stores.

    The actual key name on the redis server will be namespace:collection_name.

    Because of how redis stores data and keys, everything, i.e. keys and data, must be pickled.

    Parameters:
    - namespace – namespace to use
    - collection_name – name of the hash map stored in redis (default: redis_dict_data)
    - connection – redis.StrictRedis instance. If it's None (default), a new connection with default options will be created.
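A sketch of RedisDict used as standalone storage (assumes a redis server on localhost; namespace and key names are illustrative):

from redis import StrictRedis
from requests_cache.backends.storage.redisdict import RedisDict

d = RedisDict('demo_namespace', 'demo_data', connection=StrictRedis('localhost'))
d['answer'] = 42    # both key and value are pickled into the redis hash
print(d['answer'])  # 42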