
redis-py

Redis Python Client

Redis-python: setting multiple key/values in one operation

Currently I use the basic mset feature to store a key/value:

from common.redis_client import get_redis_client
cache = get_redis_client()
for k, v in some_dict.items():
    kw = {k: v}
    cache.mset(kw)

# later:
cache.get('key')

I store each key/value separately (not in one JSON blob, for example), since storing the whole dict would turn it into a string and would require me to serialize/deserialize on storing and retrieving, and I really need access to the separate key/values.

My question: is there a way I can mset multiple key/values at once, instead of multiple writes to the Redis DB? And vice versa, can I have multiple reads (gets) in one access? (And yes, I have a lot of Redis activity going on under heavy load, so I do care about this.)
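For what it's worth, mset() already accepts a whole mapping and mget() accepts a list of keys, so both directions can be done in a single round trip. A minimal sketch, assuming a plain StrictRedis client rather than the project's get_redis_client() wrapper:

import redis

cache = redis.StrictRedis(host='localhost', port=6379, db=0)

some_dict = {'alpha': '1', 'beta': '2', 'gamma': '3'}

# One write round trip: mset() takes a mapping of key -> value.
cache.mset(some_dict)

# One read round trip: mget() returns values in the same order as the keys.
values = cache.mget(some_dict.keys())
print(dict(zip(some_dict.keys(), values)))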


Source: (StackOverflow)

redis-py - ConnectionError: Socket closed on remote end - overload?

I'm using Redis from Python via redis-py to store JSON in a sorted set.

Everything works fine until I try to get a certain amount of data out of Redis.


redis_client = redis.StrictRedis(host='localhost', port=6379, db=12)
redis_client.zrange('key', 0, 20, desc=True)

This works fine, as I'm only requesting 20 entries.

As soon as I try anything above 35 I get:

ConnectionError: Socket closed on remote end

I've tried working around it by "chunking" the queries into sets of 5, but it seems I'm hitting Redis so quickly with many small queries that this can still trigger the exception.
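Roughly, the chunked workaround looks like the following sketch (assuming the redis_client and key from above; the chunk size and total count are illustrative):

# Fetch the sorted set in slices of 5 instead of one large ZRANGE.
chunk_size = 5
results = []
for start in range(0, 200, chunk_size):
    stop = start + chunk_size - 1
    results.extend(redis_client.zrange('key', start, stop, desc=True))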

Am I somehow DDOSing redis?


I've tried it on both Windows and Ubuntu.

Last week I actually got away with up to 100 entries at once and chunking worked if I did it in groups of 10, but it seems since then my Redis server has gotten even more sensitive.


Here is a little script that reproduces the error.

import redis
import ujson as json

r = redis.StrictRedis(host="localhost", port=6379, db=12)
dummy_json = {"data":"hfoiashflkasdjaisdäjpagufeiaghaifhaspdas", 
          "more": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more1": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more2": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more3": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more4": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp"}

for score in xrange(0, 6000):
    dummy_json["score"]=score
    r.zadd("test", score, json.dumps(dummy_json))

result = r.zrange('test', 0, 200, desc=True)
print result

You'll see that if you make dummy_json hold less data, or request fewer entries at once, the exception goes away.


Source: (StackOverflow)

redis-py and hgetall behavior

I played around with the Flask microframework and wanted to cache some stats in Redis. Let's say I have this dict:

mydict = {}
mydict["test"] = "test11"

I saved it to redis with

redis.hmset("test:key", mydict)

However, after reading it back:

stored = redis.hgetall("test:key")
print(str(stored))

I see a weird {b'test': b'test11'}, so stored.get("test") gives me None.

The str representation of mydict looks fine: {'test': 'test11'}. So why is this binary marker added to the restored data? I also checked in redis-cli and don't see explicit b markers there. Is something wrong with hgetall?
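For reference, a small sketch of the behaviour being described (assuming a local Redis; under Python 3, redis-py returns bytes unless decode_responses is set):

import redis

# Default client: keys and values come back as bytes under Python 3.
r = redis.StrictRedis(host='localhost', port=6379, db=0)
r.hmset("test:key", {"test": "test11"})
stored = r.hgetall("test:key")
print(stored.get(b"test"))   # b'test11' -- the lookup key must be bytes too

# Alternatively, ask redis-py to decode responses to str up front.
r2 = redis.StrictRedis(host='localhost', port=6379, db=0, decode_responses=True)
print(r2.hgetall("test:key").get("test"))   # 'test11'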


Source: (StackOverflow)

redis.exceptions.ConnectionError: Error -2 connecting to localhost:6379. Name or service not known

I get this error when I run my code on the server. My environment is Debian with Python 2.7.3.

Traceback (most recent call last):
  File "fetcher.py", line 4, in <module>
    import mirad.fetcher_tasks as tasks
  File "/home/mirad/backend/mirad/fetcher_tasks.py", line 75, in <module>
    redis_keys = r.keys('*')
  File "/home/mirad/backend/venv/local/lib/python2.7/site-packages/redis/client.py", line 863, in keys
    return self.execute_command('KEYS', pattern)
  File "/home/mirad/backend/venv/local/lib/python2.7/site-packages/redis/client.py", line 534, in execute_command
    connection.send_command(*args)
  File "/home/mirad/backend/venv/local/lib/python2.7/site-packages/redis/connection.py", line 532, in send_command
    self.send_packed_command(self.pack_command(*args))
  File "/home/mirad/backend/venv/local/lib/python2.7/site-packages/redis/connection.py", line 508, in send_packed_command
    self.connect()
  File "/home/mirad/backend/venv/local/lib/python2.7/site-packages/redis/connection.py", line 412, in connect
    raise ConnectionError(self._error_message(e))
redis.exceptions.ConnectionError: Error -2 connecting to localhost:6379. Name or service not known.

When I run redis-cli it works correctly, without any error:

$ redis-cli 
127.0.0.1:6379> 

Source: (StackOverflow)

What's the difference between the API of Redis and StrictRedis?

I'm working on a project with redis-py. It works when I connect the app with the Redis client, but fails with StrictRedis.

So I want to know the difference between the two, but I've searched and found no satisfying answer.

My project is here: https://github.com/kxxoling/librorum (sorry for the Chinese annotations!).
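For illustration, a sketch of one of the signature differences between the two classes (assuming redis-py 2.x, where the legacy Redis class keeps the old argument order for a few commands such as SETEX):

import redis

strict = redis.StrictRedis(host='localhost', port=6379, db=0)
legacy = redis.Redis(host='localhost', port=6379, db=0)

# StrictRedis follows the official command syntax: SETEX key seconds value
strict.setex('token', 60, 'abc')

# The legacy Redis class keeps redis-py's historical order: setex(name, value, time)
legacy.setex('token', 'abc', 60)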


Source: (StackOverflow)

Share redis connection between django views

While debugging I've noticed that each Redis-accessing Django view uses a separate Redis connection.

Why is this so?
Is Django using a thread per view, with redis-py creating a connection per thread? Or is there some other reason?

How can I make Django share a single connection between the various views?
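One common pattern, sketched here with a hypothetical module name, is to create a single connection pool at module level and have every view build its client from it:

# myapp/redis_client.py  (hypothetical module; the name is illustrative)
import redis

# Created once when the module is first imported; every view that calls
# get_redis() afterwards reuses the sockets held by this shared pool.
POOL = redis.ConnectionPool(host='localhost', port=6379, db=0)

def get_redis():
    # StrictRedis objects are lightweight wrappers; the actual connections
    # live in (and are returned to) the shared pool.
    return redis.StrictRedis(connection_pool=POOL)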


Source: (StackOverflow)

Insert a new database in redis using redis.StrictRedis()

I know that Redis has 16 databases by default, but what if I need to add another database? How can I do that using redis-py?
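As a point of reference, a minimal sketch of how databases are addressed from redis-py (the db index is chosen per connection; the total count is a server-side setting):

import redis

# Each logical database is selected with the db parameter (0-15 by default).
r0 = redis.StrictRedis(host='localhost', port=6379, db=0)
r5 = redis.StrictRedis(host='localhost', port=6379, db=5)

r5.set('only-in-db-5', 'value')

# The number of databases comes from the "databases" directive in redis.conf;
# a client cannot create additional databases on the fly.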


Source: (StackOverflow)

Debugging memory leak in Python service that uses Redis pub/sub

I'm trying to build a Python 2.7 service using Redis publish/subscribe functionality. I use Redis 2.8.17 on Ubuntu 12.04 with redis-py 2.10.3 as the client. Unfortunately, my service seems to be leaking memory. Memory consumption seems to increase roughly linearly with the number of messages the service receives/consumes/handles.

I tried to debug this with the memory_profiler tool by decorating my main subscribe loop. In order to have it print output continuously, I changed it to exit after every hundredth message it receives. The output looks like this:

Line #    Mem usage    Increment   Line Contents
================================================
    62     39.3 MiB      0.0 MiB       @memory_profiler.profile
    63                                 def _listen(self, callback):
    64     40.1 MiB      0.7 MiB           for _ in self.redis_pubsub.listen():
    65     40.1 MiB      0.0 MiB               self.count += 1
    66     40.1 MiB      0.0 MiB               self._consume(callback)
    67     40.1 MiB      0.0 MiB               if self.count == 100:
    68     40.1 MiB      0.0 MiB                   self.count = 0
    69     40.1 MiB      0.0 MiB                   break
    70     40.1 MiB      0.0 MiB           gc.collect()

It reports a similar increase for every hundred messages pushed to the service. The callback is the function that actually does the application work, so line 65 is where I'd actually expect a memory increase if there were something wrong in my app code.

The output made me suspect the Redis client, so I also checked the sizes of the self.redis_pubsub and redis.StrictRedis objects using pympler.asizeof. These objects are small to begin with and do not increase at all as the service receives messages.

Further, when looking for leaking objects using pympler.muppy and pympler.summarize, they do not report any growing object counts or accumulating memory whatsoever. Also, the totals for memory consumption and growth do not resemble the numbers reported by top on Linux.

I'm stuck. Does anyone have any idea what might be going on, or any suggestions on how I can debug this further?


Source: (StackOverflow)

Python redis pubsub: what happen to types when it gets published?

pub.py

import redis
import datetime
import time

def main():
    redis_host = '10.235.13.29'
    r = redis.client.StrictRedis(host=redis_host, port=6379)
    while True:
        now = datetime.datetime.now()
        print 'Sending {0}'.format(now)
        print 'data type is %s' % type(now)
        r.publish('clock', now)
        time.sleep(1)

if __name__ == '__main__':
    main()

OUTPUT:

Sending 2014-10-08 13:10:58.338765
data type is <type 'datetime.datetime'>
Sending 2014-10-08 13:10:59.368707
data type is <type 'datetime.datetime'>
Sending 2014-10-08 13:11:00.378723
data type is <type 'datetime.datetime'>
Sending 2014-10-08 13:11:01.398132
data type is <type 'datetime.datetime'>
Sending 2014-10-08 13:11:02.419030
data type is <type 'datetime.datetime'>

sub.py

import redis
import threading
import time
import datetime

def callback():
    redis_host = '10.235.13.29'
    r = redis.client.StrictRedis(host=redis_host, port=6379)
    sub = r.pubsub()
    sub.subscribe('clock')
    while True:
        for m in sub.listen():
            #print m #'Recieved: {0}'.format(m['data'])
            now = datetime.datetime.now()
            print 'Recieved: %s at %s' % (m['data'], now)
            print 'Data type is %s' % type(m['data'])
            dur = 1
            print 'It took %s to receive' % dur

def main():
    t = threading.Thread(target=callback)
    t.setDaemon(True)
    t.start()
    while True:
        print 'Waiting'
        time.sleep(30)

if __name__ == '__main__':
    main()

OUTPUT:

{}: ./sub.py
Waiting
Recieved: 1 at 2014-10-08 13:09:36.708088
Data type is <type 'long'>
It took 1 to receive
Recieved: 2014-10-08 13:09:37.629664 at 2014-10-08 13:09:37.630479
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:38.630661 at 2014-10-08 13:09:38.631585
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:39.632663 at 2014-10-08 13:09:39.633480
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:40.633662 at 2014-10-08 13:09:40.634464
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:41.634665 at 2014-10-08 13:09:41.635557
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:42.635662 at 2014-10-08 13:09:42.636673
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:43.642665 at 2014-10-08 13:09:43.643441
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:44.643663 at 2014-10-08 13:09:44.644582
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:45.644667 at 2014-10-08 13:09:45.673734
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:46.672918 at 2014-10-08 13:09:46.673874
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:47.673913 at 2014-10-08 13:09:47.675014
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:48.674920 at 2014-10-08 13:09:48.675804
Data type is <type 'str'>
It took 1 to receive
Recieved: 2014-10-08 13:09:49.675912 at 2014-10-08 13:09:49.677346
Data type is <type 'str'>

The type changed from datetime.datetime to str.
Is it possible to preserve the type? I am trying to compute the duration, and I can't subtract a str from a datetime object.
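Pub/sub payloads travel as flat strings, so the datetime needs to be serialized on publish and parsed on receive. A minimal sketch, reusing r and m from the scripts above and assuming a format string that matches str(datetime.datetime.now()):

import datetime

FMT = '%Y-%m-%d %H:%M:%S.%f'   # matches str(datetime.datetime.now())

# Publisher side: serialize explicitly instead of relying on the implicit str().
now = datetime.datetime.now()
r.publish('clock', now.strftime(FMT))

# Subscriber side: skip control messages (e.g. the subscribe confirmation,
# whose data is a number), then parse the payload back into a datetime.
if m['type'] == 'message':
    sent = datetime.datetime.strptime(m['data'], FMT)
    dur = datetime.datetime.now() - sent
    print 'It took %s to receive' % dur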


Source: (StackOverflow)

redis-py "ConnectionError: Socket closed on remote end"

Using redis-py's PubSub class I sometimes get the following exception:

Exception in thread listener_2013-10-24 12:50:31.687000:
Traceback (most recent call last):
  File "c:\Python27\Lib\threading.py", line 551, in __bootstrap_inner
    self.run()
  File "c:\Python27\Lib\threading.py", line 504, in run
    self.__target(*self.__args, **self.__kwargs)
  File "C:\Users\Administrator\Documents\my_proj\my_module.py", line 69, in _listen
    for message in _pubsub.listen():
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\client.py", line 1555, in listen
    r = self.parse_response()
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\client.py", line 1499, in parse_response
    response = self.connection.read_response()
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\connection.py", line 306, in read_response
    response = self._parser.read_response()
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\connection.py", line 106, in read_response
    raise ConnectionError("Socket closed on remote end")
ConnectionError: Socket closed on remote end

What would cause such an event?
If I catch this exception, what would be a reasonable handling logic? Would retrying listen() be futile?

The reason for asking rather than simply trying is that I do not know how to reproduce this problem. It's rare but detrimental, so I need to put some handling logic in place before this error strikes again.
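In the absence of a reproduction, one defensive pattern is to treat the listener as disposable: catch the ConnectionError, rebuild the pubsub object, re-subscribe, and resume. A sketch (the channel name and handle() callback are placeholders):

import time

import redis
from redis.exceptions import ConnectionError

def listen_forever(client, channel, handle):
    # Keep a subscription alive across dropped connections.
    while True:
        pubsub = client.pubsub()
        try:
            pubsub.subscribe(channel)
            for message in pubsub.listen():
                handle(message)
        except ConnectionError:
            # The socket died; pub/sub has no replay, so anything published
            # while disconnected is lost.  Back off briefly and resubscribe.
            time.sleep(1)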


Source: (StackOverflow)

redis client pipeline does not work in twemproxy environment

I use redis-py to operate on Redis, and our environment uses twemproxy as a Redis proxy. But it looks like the client pipeline doesn't work when connecting through twemproxy.

import redis

client = redis.StrictRedis(host=host, port=port, db=0)
pipe = client.pipeline()
pipe.smembers('key')
print pipe.execute()

It throws an exception when calling the execute method:

redis.exceptions.ConnectionError: Socket closed on remote end

In a twemproxy environment, does the client pipeline simply not work, or is this an issue with redis-py?
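One thing worth trying, sketched below with placeholder host/port values: redis-py pipelines wrap commands in MULTI/EXEC by default, and twemproxy does not support transactions, so disabling the transaction leaves a plain batched pipeline that a proxy can forward:

import redis

# Placeholder endpoint for the twemproxy listener; adjust to your setup.
host, port = 'twemproxy-host', 22121

client = redis.StrictRedis(host=host, port=port, db=0)

# transaction=False batches the commands without MULTI/EXEC.
pipe = client.pipeline(transaction=False)
pipe.smembers('key')
print pipe.execute()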


Source: (StackOverflow)

How to set the redis timeout waiting for the response with pipeline in redis-py?

In the code below, is the pipeline timeout 2 seconds?

client = redis.StrictRedis(host=host, port=port, db=0, socket_timeout=2)
pipe = client.pipeline(transaction=False)
for name in namelist:
    key = "%s-%s-%s-%s" % (key_sub1, key_sub2, name, key_sub3)
    pipe.smembers(key)
pipe.execute()

In Redis there are a lot of members in the set "key". The code above always returns the error below:

error Error while reading from socket: ('timed out',)

If I change the socket_timeout value to 10, it returns OK.
Doesn't the "socket_timeout" parameter mean the connection timeout? It behaves more like a response timeout.
The redis-py version is 2.6.7.
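As a sketch of the distinction (placeholder host/port; the socket_connect_timeout parameter only exists in newer redis-py releases, not in 2.6.7): socket_timeout bounds every blocking socket operation, including waiting for the reply, not just establishing the connection.

import redis

# Placeholder endpoint.
host, port = 'redis-host', 6379

# socket_timeout applies to connect, send, and waiting for the reply, so a
# slow SMEMBERS on a very large set can trip it even on a healthy connection.
client = redis.StrictRedis(host=host, port=port, db=0, socket_timeout=10)

# redis-py 2.10+ adds a separate socket_connect_timeout so the two concerns
# can be tuned independently (not available in 2.6.7):
# client = redis.StrictRedis(host=host, port=port, db=0,
#                            socket_timeout=10, socket_connect_timeout=2)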


Source: (StackOverflow)

ZREM on Redis Sorted Set

What will happen if two workers call ZREM on the same element of a sorted set at the same time? Will it return true to the worker that actually removes the element and false to the other, to indicate it no longer exists, or will it return true to both? In other words, is ZREM atomic internally?
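A small sketch of how one might observe the return value from redis-py (using the 2.x zadd signature; the key and member names are illustrative):

import redis

r = redis.StrictRedis(host='localhost', port=6379, db=0)

# redis-py 2.x signature: zadd(name, score, member)
r.zadd('jobs', 1, 'job:42')

# Redis executes each command atomically on its single server thread, so only
# one of two concurrent ZREMs removes the member; that caller gets 1 back and
# the other gets 0.
removed = r.zrem('jobs', 'job:42')
if removed:
    print 'this worker removed the element'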


Source: (StackOverflow)

Is there a NUMSUB command for redis-py?

Is there an equivalent to the PUBSUB NUMSUB command in Redis for the Python client?

I've looked through the documentation and can't find anything other than the publish() method itself, which returns the number of subscribers on that channel. Knowing how many subscribers there are only after the fact is not very useful to me, though.
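For reference, a sketch of two ways this is commonly reached from redis-py (PUBSUB NUMSUB requires Redis 2.8+; pubsub_numsub() exists in redis-py 2.10+, and the channel name is illustrative):

import redis

r = redis.StrictRedis(host='localhost', port=6379, db=0)

# redis-py 2.10+ exposes the command directly; it returns (channel, count) pairs.
print(r.pubsub_numsub('my-channel'))

# On older clients the raw command can still be issued by hand.
print(r.execute_command('PUBSUB', 'NUMSUB', 'my-channel'))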


Source: (StackOverflow)

How can I implement an atomic get or set&get key in redis using python?

I have a Redis server and I want to implement an atomic (or pseudo-atomic) method that does the following (note: my system has multiple sessions open to the Redis server):

  1. If some key K exists, get its value.
  2. Otherwise, call SETNX with a random value generated by some function F (which generates salts).
  3. Ask Redis for the value of key K that was just generated by the current session (or by another session "simultaneously" - a short moment before the current session generated it).

The reasons I don't want to pre-generate a value with F (before checking whether the key exists) and use it if the key doesn't exist are:

  1. I don't want to call F without justification (it may be CPU-intensive).
  2. I want to avoid the following problematic situation:
     T1: Session 1 generates a random value VAL1
     T2: Session 1 asks if key K exists and gets "False"
     T3: Session 2 generates a random value VAL2
     T4: Session 2 asks if key K exists and gets "False"
     T5: Session 2 calls SETNX with VAL2 and uses VAL2 from now on
     T6: Session 1 calls SETNX with VAL1 and uses VAL1 from now on, while the actual value of key K is VAL2

The Python pseudo-code I created is:

    import redis
    r = redis.StrictRedis(host='localhost', port=6379, db=0)
    ''' Gets the value of key K if it exists (r.get(K) if r.exists(K));
    otherwise gets the value of key K if calling SETNX returned True
    (else r.get(K) if r.setnx(K, F())), meaning the value this session just
    sent really is the stored value; otherwise gets the value of key K that
    was generated by another session a short moment ago (else r.get(K)).
    The last branch is redundant and I could write "r.setnx(K, F()) or True"
    to get the latest value instead, but the syntax requires an "else" clause
    at the end. '''
    r.get(K) if r.exists(K) else r.get(K) if r.setnx(K, F()) else r.get(K)

Is there another solution?
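One possible sketch (not the only solution): try a plain GET first, and only if the key is missing call F and attempt SETNX; since SETNX never overwrites, a final GET returns whichever value won the race, for every session:

import redis

r = redis.StrictRedis(host='localhost', port=6379, db=0)

def get_or_create(key, generate):
    # Fast path: the key already exists, so the expensive generator is not called.
    value = r.get(key)
    if value is not None:
        return value
    # The key is missing: generate a candidate and try to claim the key.
    # SETNX never overwrites, so even if another session wins the race the
    # stored value stays consistent for everyone.
    r.setnx(key, generate())
    # Read back whichever value actually won the race.
    return r.get(key)

# Usage, with F as the salt-generating function from the question:
# value = get_or_create('K', F)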


Source: (StackOverflow)