When I initially wrote the serialization/deserialization code for the RPC::XML package, it was, after all, meant to be a reference implementation from which I would write extensions to the XML-RPC protocol. When that stirred up a storm of shit with Dave Winer, I decided to focus on code rather than ego, and instead made it a conformant XML-RPC implementation.
But there was a little trap waiting for me, patiently...
I designed it to serialize and deserialize in memory. Even as I wrote support for the Base-64 datatype and wrote tests to validate it, I didn't think about how big something sent as a Base-64 chunk was likely to be.
A project I'm working on at the day job will be using the package to move Base-64-encoded WAV files. That's pretty damned big. And suddenly, keeping it all in memory is a really bad idea. So I'm spending my weekend coming up with a streaming layer for serialization and deserialization in the package. 'Cause I'm going to need it on Monday when I get back into the office.
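This isn't the actual streaming layer I'm writing, but a minimal sketch of the underlying technique in Perl, using MIME::Base64 to encode a file in fixed-size chunks (the file name and handle names are illustrative, not part of RPC::XML's API):

    use strict;
    use warnings;
    use MIME::Base64 qw(encode_base64);

    # Encode a file as Base-64 in fixed-size chunks so nothing larger
    # than the buffer is ever held in memory. Reading in multiples of
    # 57 bytes matters: 57 input bytes become exactly one 76-character
    # Base-64 line, so chunks can be encoded independently and simply
    # concatenated, with padding ('=') appearing only in the final chunk.
    sub stream_base64 {
        my ($in, $out) = @_;    # illustrative names: file in, socket out
        my $buf;
        while (read($in, $buf, 57 * 1024)) {
            print $out encode_base64($buf);
        }
    }

    open my $wav, '<:raw', 'sample.wav' or die "open: $!";  # hypothetical file
    stream_base64($wav, \*STDOUT);
    close $wav;

The same chunking slots into serializing an XML-RPC message: write the opening <base64> tag, stream the encoded chunks, then write the closing tag.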
For every complex problem there is a solution which is simple, neat and wrong. — H.L. Mencken
Re:Big data scares me
rjray on 2003-01-13T09:08:33
The specific case that's driving this development suffers from more design woes than just this. But at present, I have to be prepared to fetch WAV data that was sent as an e-mail attachment (!) over an XML-RPC interface.
And it has occurred to me that even as I do all this work this weekend, I have no idea if their server (for which the XML-RPC component is very new and highly experimental) can even deal with that much data in a single message. They, too, may suffer from having code that expects to read all of a base64 object into memory before encoding it and sending it across the socket.
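For the receive side, here's an equally rough sketch of spooling the decoded bytes straight to disk instead of accumulating them. It assumes the closing </base64> tag arrives on a line of its own; $sock and $out_path are illustrative names:

    use strict;
    use warnings;
    use MIME::Base64 qw(decode_base64);

    # Decode the <base64> payload line by line as it arrives, writing
    # the raw bytes to disk rather than holding them all in memory.
    sub spool_base64 {
        my ($sock, $out_path) = @_;
        open my $out, '>:raw', $out_path or die "open $out_path: $!";
        while (my $line = <$sock>) {
            last if $line =~ m{</base64>};   # end of the payload element
            print $out decode_base64($line);
        }
        close $out or die "close: $!";
    }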
Re:Big data scares me
Matts on 2003-01-13T11:16:26
This is where XLink would be very appropriate, and where Dave Winer needs to get off his high horse about namespaces - they would be very useful here.