I am under pressure to simplify copying images between data centres, so I've written a command-line Python app to do it. I had to make some updates to ibm_sce.py to make this work.
Summary of changes:
BUG: in the state constants in class VolumeState, the value 12 should map to COPYING
ADDED: ex_get_volume_detail - the API call to /storage/&lt;storageid&gt; is the only way of getting a useful % completion figure in cloneStatus
ADDED: in _to_volume, return cloneStatus in the extra dict
ADDED: ex_get_image_detail to get image info for a single image
QUERY: one time I think I made a parameter too long - maybe the name of a node, I'm not sure - and the API call failed; it would be good if the libcloud implementation could validate or report this sort of thing cleanly
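To show how the new call could be used together with the VolumeState fix, here's a minimal polling sketch. `wait_for_clone` and its `fetch_detail` callable are hypothetical names wrapping ex_get_volume_detail; the (state, extra) return shape is an assumption for illustration.

```python
import time

# Per the VolumeState fix above, SCE reports numeric state 12 while a copy runs.
SCE_STATE_COPYING = 12

def wait_for_clone(fetch_detail, volume_id, interval=30, log=print):
    """Poll until the volume leaves the COPYING state, reporting cloneStatus.

    fetch_detail is a hypothetical callable wrapping ex_get_volume_detail,
    returning (state, extra_dict); the exact return shape is an assumption.
    """
    while True:
        state, extra = fetch_detail(volume_id)
        if state != SCE_STATE_COPYING:
            return state  # clone finished (or failed); caller inspects state
        log('volume %s copying: %s complete'
            % (volume_id, extra.get('cloneStatus', '?')))
        time.sleep(interval)
```

In my app this loop runs with a long interval, since a cross-site clone can take hours.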
My delvings into the API showed up some variations/inconsistencies/deficiencies between the documentation and the API returns for the JSON and XML formats. I raised a cloud support case on this; see https://www.ibm.com/developerworks/mydeveloperworks/groups/service/forum/topicThread?topicUuid=d2791f71-7524-4887-9409-5085bafe3bff&communityUuid=1dba2e59-05da-4b9a-84e4-2444a6cac251&lang=en
One thing that feels awkward about using libcloud for SCE is the mismatch between libcloud calls that expect e.g. volume and image Python objects and lower-level operations that need SCE 'ID's. I had to create a dummy class with a .id attribute to satisfy the libcloud calls when all I had to work with was the SCE ID.
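For the record, the shim amounts to something like this. `SCEResourceRef` is a made-up name, and the claim that .id is the only attribute read is an assumption that held for my calls; anything needing more attributes will fail loudly.

```python
class SCEResourceRef:
    """Stand-in for a libcloud volume/image object when only the SCE ID is known.

    The driver methods I used only read obj.id, so this bare wrapper was
    enough (an assumption -- other calls may want name, size, etc.).
    """

    def __init__(self, id):
        self.id = id

    def __repr__(self):
        return 'SCEResourceRef(id=%r)' % (self.id,)
```

Typical use: pass SCEResourceRef('20034567') to a driver method when only the numeric ID was persisted between runs of the app.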
Also, I'd like to be able to suppress the warning about SSL certificate validation being disabled; it gets wearisome seeing it every time I run my app.
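As a stop-gap I considered muting it at the warnings level. This assumes libcloud emits the notice through the stdlib warnings module (if it simply print()s, this has no effect), and the message pattern below is a guess:

```python
import warnings

# Assumption: the "SSL certificate validation disabled" notice is raised via
# warnings.warn(); match it by message text and drop it.
warnings.filterwarnings('ignore', message='.*(SSL|certificate).*')
```

A proper fix would be a libcloud option to acknowledge the setting once, rather than users pattern-matching the message.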
Also, because my command-line app won't run continuously (image cloning takes so long), I used pickle/shelve to persist settings. This didn't work for the objects returned by libcloud, so I had to use copy.copy() to selectively pull the important data out of them and only fire up libcloud for the short periods I need it.
I haven't tried it with this version of ibm_sce.py, but with a previous version I wanted to build a single-file executable using PyInstaller, and I had to add some explicit extra imports to my .py file to ensure that PyInstaller bundled everything needed. This feels like a libcloud problem rather than an ibm_sce.py problem.
What I found is that for what I want to automate - image copying between data centres - there are some significant gaps in the SCE API which leave no choice but to do manual work for each and every image copy:
No way to retrieve the 'copy allowed' status of an image - has to be checked manually
No way to change the 'copy allowed' status of an image - has to be set manually
No way to share/unshare an image - has to be performed manually
No way to check the user's quotas beforehand to see if they can create a 60G storage - you have to rely on the API call failing
The big one: no way to abort an operation. That's not a problem when everything works or fails relatively quickly, but a clone between RTP and EHN takes a very long time, and being able to abort it would definitely be useful; the same goes for create volume from image, and copy volume to image.
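On the quota point: until a quota query exists, the only pre-flight check I can see is to attempt the create and classify the failure. A sketch of that, where the helper name is mine and the 'quota' substring match is an assumption about how SCE words the error:

```python
def create_volume_or_explain(driver, name, size_gb=60, location=None):
    """Attempt create_volume and turn a quota failure into a clear error.

    There is no SCE API call to check quotas beforehand, so the failed
    request itself is the check.
    """
    try:
        # libcloud's standard create_volume(size, name, location=...) call
        return driver.create_volume(size=size_gb, name=name, location=location)
    except Exception as exc:
        if 'quota' in str(exc).lower():  # assumption: error text mentions quota
            raise RuntimeError(
                'quota exceeded creating %dG volume %r' % (size_gb, name))
        raise
```

It's a workaround, not a substitute: the user still pays the round-trip and any partial setup cost just to learn they were over quota.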
Overall, if I had to 'vote' on libcloud, I'd rate its effectiveness as a unifying API at maybe 3 out of 10 - all the useful stuff is in the cloud-specific .extra data dict and in the ex_...() functions. However, with ibm_sce.py beefed up to better cover the SCE API, libcloud gets maybe 7/10 as a Python binding for SCE - it would score higher if the extensions could be better integrated with the libcloud core.