Baking from screen space into texture space

I'm trying to bake a texture from my screen onto my mesh's texture, but my conversion math seems to be off. I pass the vertex shader the mesh UVs as the position (since I'm rendering in texture space), and the first texture coordinate holds the mesh position in object space. The vertex shader transforms that object-space position into clip space with the camera's projection-view matrix and passes it to the fragment shader. As a test, the fragment shader computes the screen-space coordinate from it and samples the paint texture (a 4K square texture) wherever screen-space x is greater than 0.5. However, I get a really weird result.
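
In case it matters: since the UVs are in [0,1], orthoPV should just be an orthographic projection over the unit square, i.e. equivalent to this (a sketch, assuming no flips or offsets in the matrix):

// stand-in for (orthoPV * gl_Vertex), assuming a plain ortho over the [0,1] UV square
gl_Position = vec4(gl_Vertex.xy * 2.0 - 1.0, 0.0, 1.0);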


// vertex shader
#version 120
uniform mat4 orthoPV;
uniform mat4 cameraPV;
uniform mat4 objToWorld;
varying vec2 uv;
varying vec4 cameraPos;
void main() {
  // gl_Vertex carries the mesh UVs, so the mesh is rasterized in texture space
  uv = gl_Vertex.xy;
  // gl_MultiTexCoord0 carries the object-space position; take it to clip space
  cameraPos = cameraPV * objToWorld * vec4(gl_MultiTexCoord0.xyz, 1);
  //screenPos = 0.5 * (vec2(1,1) + (s.xy / s.w));
  //screenPos = gl_MultiTexCoord0.xy;
  gl_Position = orthoPV * gl_Vertex;
  gl_FrontColor = vec4(1,0,0,1);
}

// fragment shader
#version 120
uniform sampler2D meshTexture;
uniform sampler2D paintTexture;
varying vec2 uv;
varying vec4 cameraPos;
void main() {
    // perspective divide on the interpolated clip-space position
    vec2 screenPos = cameraPos.xy / cameraPos.w;
    gl_FragColor = texture2D(meshTexture, uv);
    // test: fill from the paint texture on one side of the screen
    if (screenPos.x > .5)
        gl_FragColor = texture2D(paintTexture, uv);
}
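
For reference, this is the clip-to-screen conversion I believe I need after the perspective divide (the commented-out line in the vertex shader was an earlier attempt at the same remap), assuming cameraPV produces standard clip coordinates:

// sketch of the intended mapping in the fragment shader
vec2 ndc = cameraPos.xy / cameraPos.w;      // perspective divide: NDC, range [-1,1]
vec2 screenPos = 0.5 * (ndc + vec2(1.0));   // remap to [0,1] before sampling/comparing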


This is the quad rendered from the camera's perspective with a test texture.
[ATTACH=CONFIG]696[/ATTACH]

This is what I get in texture space when setting the frag color as above. I would expect the black line to run down the middle.
[ATTACH=CONFIG]697[/ATTACH]