Posted: 2009-11-11 00:37:04 Tags: python, json, rpc

I have come across several guides and packages on implementing a Python JSON-RPC server, e.g.:

They all do a good job in the sense that the server/application implementation is very simple: you just return a Python object as the result, and the framework takes care of serializing it. However, this is not suitable for my needs, mainly because I expect to serialize possibly thousands of records from a database, and such a solution would require me to build a single Python object containing all the records and return that as the result.

The ideal solution I am looking for would involve a framework that provides the application a stream to write the response to, and a JSON encoder that can encode an iterator (in this case a cursor from pyodbc) on the fly, something like this:

def process(self, request, response):
    # Retrieve parameters from the request.

    cursor = self.conn.cursor()
    cursor.execute(sql)  # etc.

    # Dump the column descriptions and the results (an iterator)
    # directly to the response's output stream.
    json.dump([cursor.description, cursor], response.getOut())
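
Failing a library that does this out of the box, I imagine the encoding side could be hand-rolled along these lines. This is just a sketch in which I frame the JSON array manually so that only one row is held in memory at a time; the default=str fallback for values such as dates and decimals is my own assumption:

import json

def stream_rows(out, cursor):
    # Write one JSON array: the column names first, then one row at a
    # time, so the full result set never has to fit in memory.
    out.write('[')
    out.write(json.dumps([col[0] for col in cursor.description]))
    for row in cursor:
        out.write(',')
        # default=str is a crude fallback for dates, decimals, etc.
        out.write(json.dumps(list(row), default=str))
    out.write(']')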

Can someone point me to a server framework that can provide me a stream to write to, and a JSON serialization library that can handle an iterable such as the pyodbc cursor and serialize it on the fly?

If the typical JSON-RPC frameworks don't allow you to dump such huge data effectively, why not just use an HTTP server and return the JSON data yourself? That way you can stream the response (and read it as a stream on the client). A nice side effect is that you can gzip it for faster transfer, and you will be able to use many standard servers too, e.g. Apache.
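
Here is a minimal sketch of that idea as a WSGI app, streaming gzip-compressed JSON one row at a time. The fetch_rows generator is a hypothetical stand-in for iterating your pyodbc cursor, and wbits=31 tells zlib to emit the gzip container format:

import json
import zlib
from wsgiref.simple_server import make_server

def fetch_rows():
    # Hypothetical stand-in for iterating a pyodbc cursor.
    for i in range(100000):
        yield (i, 'name-%d' % i)

def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'application/json'),
                              ('Content-Encoding', 'gzip')])
    def body():
        # Compress JSON fragments on the fly; wbits=31 = gzip format.
        comp = zlib.compressobj(wbits=31)
        yield comp.compress(b'[')
        first = True
        for row in fetch_rows():
            chunk = ('' if first else ',') + json.dumps(list(row), default=str)
            first = False
            yield comp.compress(chunk.encode('utf-8'))
        yield comp.compress(b']')
        yield comp.flush()
    return body()

if __name__ == '__main__':
    make_server('', 8000, app).serve_forever()

Since WSGI lets you return a generator, the server sends each compressed chunk as it is produced, so neither the rows nor the compressed output ever sit in memory all at once.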