Why can only textures be attached to framebuffers, and why can textures not be backed by buffers transparently? Or: why can renderbuffers not be read from directly via GetBufferSubData, and why is there no GetTexSubImage?
It is obvious that an attachment to a framebuffer has to carry information about its color/pixel format, so it cannot be a plain buffer. But that does not mean that anything attachable to a framebuffer has to be impossible to read from directly, in parts.
Instead, these supposedly simple operations always have to be composed in clumsy ways. The partial read of a multisampled render target is a prime example: to get at the “evaluated” (resolved) pixels of a multisample framebuffer, one has to create a second framebuffer and blit to it before one can read anything back - which adds yet another layer of abstraction to the operation. What is so special about framebuffer-to-framebuffer transfer that ReadPixels cannot do it? No - please do not tell me…
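For the record, here is a minimal sketch of that detour (function and variable names are my own placeholders; both FBOs are assumed to be complete and of equal size, with an extension loader already set up):

```c
#include <stdlib.h>
#include <GL/gl.h>  /* plus GLEW or another loader for the FBO entry points */

/* Read back the resolved pixels of a multisampled framebuffer.
 * msaaFBO:    complete multisampled FBO (the actual render target)
 * resolveFBO: complete single-sampled FBO of identical dimensions,
 *             needed for nothing but this resolve step.               */
unsigned char *readResolvedPixels(GLuint msaaFBO, GLuint resolveFBO,
                                  GLint w, GLint h)
{
    /* Step 1: the mandatory detour - resolve via blit, because
     * ReadPixels refuses to touch a multisampled framebuffer.
     * Source and destination rectangles must match for a resolve. */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, msaaFBO);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFBO);
    glBlitFramebuffer(0, 0, w, h, 0, 0, w, h,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);

    /* Step 2: only now can the evaluated pixels be read back
     * (glReadBuffer defaults to GL_COLOR_ATTACHMENT0 for FBOs). */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, resolveFBO);
    unsigned char *pixels = malloc((size_t)w * (size_t)h * 4);
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return pixels;
}
```

Two framebuffer bindings, a blit, and a temporary FBO - all to perform what is conceptually a single partial read.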
The remedy - a very good idea IMHO - would be transparently buffer-backed textures. One could render to them and use their data for almost anything, trading some speed for that all-around usability when sampling from them with linear filters - let alone mipmaps, which would have to be regenerated whenever automatic generation is enabled for them.
This brings us to buffer textures as they exist today: those things do not seem to be usable for much of anything. They are one-dimensional (as if y*w+x could not be computed by the GPU), cannot be attached to framebuffers, and cannot even be sampled from with filtering - only fetched texel by texel - which does not really hurt, since nobody samples from one-dimensional textures anyway.
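To illustrate what little they do offer, a minimal setup sketch (names and the GL_RGBA32F format are my own choices, assuming GL 3.1+ or ARB_texture_buffer_object):

```c
#include <GL/gl.h>  /* plus an extension loader for glTexBuffer etc. */

/* Create a buffer texture: the data lives in a plain buffer object,
 * and the texture is merely a one-dimensional, fetch-only view of it. */
GLuint makeBufferTexture(const float *data, GLsizeiptr bytes)
{
    GLuint tbo, tex;
    glGenBuffers(1, &tbo);
    glBindBuffer(GL_TEXTURE_BUFFER, tbo);
    glBufferData(GL_TEXTURE_BUFFER, bytes, data, GL_STATIC_DRAW);

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_BUFFER, tex);
    glTexBuffer(GL_TEXTURE_BUFFER, GL_RGBA32F, tbo); /* 4 floats per texel */
    return tex;
}

/* GLSL side - texelFetch is the only access path, no filtering:
 *   uniform samplerBuffer data;
 *   vec4 v = texelFetch(data, y * width + x); // the very y*w+x
 *                                             // the GPU "cannot" do */
```

No framebuffer attachment, no filtered sampling - just raw indexed fetches.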
IMHO, the generic buffer abstraction should be extended to every aspect of the API that requires “data” in some form - even at the cost of some rendering performance when used naively.