[Zope] large images to a database via zope.

ghaley@mail.venaca.com ghaley@mail.venaca.com
Thu, 12 Apr 2001 16:40:36 -0400 (EDT)


hi,

if any of you have had experience with this, some sort of pointers or a
link to the appropriate how-to would be most appreciated.

we are running zope 2.3.n with mysql 3.23.n (i'm not sure of the exact minor
versions).  there is a practical limit on the size of a single insert into
mysql (somewhere around 16MB, i believe set by max_allowed_packet), although
a longblob column should be able to hold over a gig.

[blob is binary large object, a datatype in MySQL]

using a shell script i can split a large file into 5MB chunks, assign each
chunk a part number, and insert the chunks into the database (via the
MySQL++ API).  a second script can retrieve them and recombine them.
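the split/recombine logic the shell script does could be sketched in python
like this (the function names are my own invention; the 5MB chunk size is
the one from above):

```python
CHUNK = 5 * 1024 * 1024  # 5MB, the same chunk size the shell script uses

def split_blob(data, chunk=CHUNK):
    # split a byte string into (part_number, bytes) pairs,
    # with part numbers starting at 1
    return [(i + 1, data[off:off + chunk])
            for i, off in enumerate(range(0, len(data), chunk))]

def join_blob(parts):
    # recombine (part_number, bytes) pairs in part-number order,
    # regardless of the order they were fetched in
    return b''.join(piece for _, piece in sorted(parts))
```

each (part_number, bytes) pair would then become one row in the blob table.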
so, my question is: how can i get zope to do this?  two tables will be
involved:

1.  object identifiers.  containing

 	object_id (a unique, auto_incrementing number) 
	size in bytes

2.  object blob storage.

	id (unique auto_increment) 
	object_id (identical to object_id above)
	object_pt 
	bfile (the blob data).
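in mysql ddl the two tables would look something like this (the column
types are my guesses from the descriptions above, not a tested schema):

```sql
CREATE TABLE object_identifiers (
    object_id  INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    size_bytes INT          -- size in bytes of the whole file
);

CREATE TABLE object_blob_storage (
    id        INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    object_id INT NOT NULL, -- matches object_identifiers.object_id
    object_pt INT NOT NULL, -- part number, starting at 1
    bfile     MEDIUMBLOB    -- one 5MB chunk (MEDIUMBLOB holds up to 16MB)
);
```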

in my calling method, i would do something like (this is pseudo code).

<dtml-in objects>
  <dtml-call "EM.load_blob(object_id)">
  <dtml-comment> python does all the work based on the object_id value passed in </dtml-comment>
</dtml-in>

i am very low on python skills.  since i cannot know how many parts a file
will contain until i start fetching, how can i write the python script to
loop the right number of times to get all the pieces back out?

would
 
    for xx in [object_pt] 

work?  if so, does xx start at 0, or can i set it to start at 1 (this
would determine the values i put into the blob datafile in the first
place)?
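one way around the unknown part count: if the sql method selects all the
rows for one object_id ordered by object_pt, the python method doesn't need
to know the count in advance -- it can just iterate over whatever rows come
back.  a sketch (the row shape is my assumption, not a real zope api):

```python
def load_blob(rows):
    # rows: (object_pt, bfile) pairs for one object_id, e.g. from
    #   SELECT object_pt, bfile ... WHERE object_id = ... ORDER BY object_pt
    # sorting here means the fetch order doesn't matter either
    pieces = []
    for pt, piece in sorted(rows):  # pt is whatever was stored: 1, 2, 3, ...
        pieces.append(piece)
    return b''.join(pieces)
```

(python's own `range()` counts from 0, but here the part numbers come
straight out of the table, so starting them at 1 is fine.)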

i've been told this can be done, but i've not tracked down any examples of
it.

any help will be greatly appreciated.

ciao!
greg.

Gregory Haley
DBA/Web Programmer
Venaca, LLC.