
DirectX9 can only scale up and not down when drawing vertices :(

Started by August 25, 2022 09:50 AM
14 comments, last by JoeJ 2 years, 2 months ago

I honestly can't figure this out.

I should be able to do this, I'd think, but even the examples I've found online don't work in my program.

I can do a full screen quad of a texture and it can display fine 1:1.

I can even change the vertices and make it 1.25x, 1.5x, 2.5x, whatever… it'll work and go off screen.

I can't, however, shrink it to do .5x or .25x or anything.

If I go below 1x, it just doesn't draw at all and I can't figure out why.

I'm using an XYZRHW + TEXCOORD vertex.

These are my vertices for the triangle strip quad (2 primitives):

{ -0.5f,       -0.5f,       1.0f, 1.0f, 0.0f, 0.0f },
{ resX - 0.5f, -0.5f,       1.0f, 1.0f, 1.0f, 0.0f },
{ -0.5f,       resY - 0.5f, 1.0f, 1.0f, 0.0f, 1.0f },
{ resX - 0.5f, resY - 0.5f, 1.0f, 1.0f, 1.0f, 1.0f },

resX = width of the screen

resY = height of the screen

Like I said, it works fine if I keep resX and resY at the screen's width and height OR bigger.

If I try to draw something smaller, it just doesn't draw…

Any ideas?

(I'm trying to use a quad to downscale a texture and I can't do ANY downscaling at all because of this, only up.)

I think you need to provide more information.

You show your data for 4 vertices, but you don't show the code where you apply a scaling factor.

No information on setting up projection.

But my guess is: By scaling down, you move your vertices in front of the front clipping plane, so everything gets culled : )


Hey, thanks for the reply.

Oh, I don't think I have done anything re: setting up projection.

As far as scaling, I just do this:

// undo the half-pixel offset, scale, then re-apply the offset
Quad[1].x = (Quad[1].x + 0.5f) * ScaleX - 0.5f;
Quad[3].x = (Quad[3].x + 0.5f) * ScaleX - 0.5f;
Quad[2].y = (Quad[2].y + 0.5f) * ScaleY - 0.5f;
Quad[3].y = (Quad[3].y + 0.5f) * ScaleY - 0.5f;

ScaleX can be 1.0, 1.5, 0.5, etc.

I guess my mind is thinking, as an example: if I'm rendering at 1920x1080 and set a texture to draw at half that, 960x540… I'm wondering why it doesn't work, because that's half in each dimension and it clearly should fit inside the 1920x1080 screen… and it's super odd that I can go outside of it.
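For what it's worth, plugging the numbers into the scaling code above with ScaleX = ScaleY = 0.5:

right edge:  (1919.5 + 0.5) * 0.5 - 0.5 = 959.5
bottom edge: (1079.5 + 0.5) * 0.5 - 0.5 = 539.5

So the scaled quad spans roughly -0.5 to 959.5 by -0.5 to 539.5, a 960x540 region that's entirely inside the viewport, which is why I don't think x/y clipping is the issue.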

hmm… my idea of clipping issues just became less likely. : (

But I still have no better idea, so I would do some trial and error on your z coordinate. Ideally you could change it at runtime with a slider or key presses, to see what happens.

Idk what the default projection of DirectX is, but no matter what, clipping can't be turned off, even if you only do 2D stuff.
So besides the z coordinate, the near and far clip values might matter. You may need to experiment with some manual ortho projection setup, which also needs those values.

It's quite a common problem: you render your first triangles, but the screen is black, and then you need to figure out a camera projection so they come into view.

I agree that in your case it should just work without such problems, but idk what else to try.
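Something like this is roughly what I mean by a manual ortho setup. Untested sketch, the device pointer and size variables are placeholders, and note it would only apply if you switch to untransformed D3DFVF_XYZ vertices, since as far as I know pre-transformed XYZRHW vertices skip the projection stage entirely:

// Untested sketch: orthographic projection matching the backbuffer,
// with explicit near/far planes so z = 0 vertices are not clipped.
D3DXMATRIX proj, identity;
D3DXMatrixOrthoOffCenterLH(&proj,
    0.0f, (float)screenWidth,     // left, right
    (float)screenHeight, 0.0f,    // bottom, top (y grows downward, like screen space)
    0.0f, 1.0f);                  // near, far - keep your z inside this range
D3DXMatrixIdentity(&identity);

device->SetTransform(D3DTS_WORLD,      &identity);
device->SetTransform(D3DTS_VIEW,       &identity);
device->SetTransform(D3DTS_PROJECTION, &proj);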

Yeah, I don't know either.

I've been using StretchRect as a workaround for now.

I'd love to be able to take advantage of DrawPrimitive and/or DrawPrimitiveUP to downscale stuff.

I've implemented SSAA as well as Bloom into my program.

I thought that, in theory, you should be able to take a 3840x2160 rendered image from SSAA 2x and quad it onto a 1920x1080 backbuffer, but I can't do that. I have to StretchRect it onto the BackBuffer.

For SSAA 4x, which is 7680x4320, I have to StretchRect it twice. Once in half using Bilinear onto a temporary surface, then half it again onto the backbuffer.

Would be nice to just quad it once… (Though that might still run into the problem with Bilinear filtering being better in half steps, so maybe not.)

I also have to StretchRect for Bloom to downsample the original image instead of just being able to use a Shader with a Quad… cuz yeah… it won't downscale at all.
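For reference, the two-pass halving I'm doing looks roughly like this (surface names are made up; NULL rects mean the whole surface):

// Rough sketch of the SSAA 4x workaround: halve 7680x4320 down to 3840x2160,
// then halve again onto the 1920x1080 backbuffer.
t_d3d9->StretchRect(ssaa4xSurface,   NULL, halfSizeSurface,   NULL, D3DTEXF_LINEAR);
t_d3d9->StretchRect(halfSizeSurface, NULL, backBufferSurface, NULL, D3DTEXF_LINEAR);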

But I could quad that 3840x2160 image onto a 1920x1080 backbuffer, though! You'd only see the upper left of the image though, cuz it's too big. :')

I could also take a small image like 128x128 and quad it to fit the entire 1920x1080 backbuffer, no problem.

Soooooooooooooooo odd!

Still digging into this, but downscaling with a quad seems to work if I set the texture sampler filters to NONE.

It won't work if I set them to LINEAR or POINT.

… Still digging. LOL.


So, here's another quad function I tried using...

#define CUSTOM_VERTEX2 (D3DFVF_XYZRHW|D3DFVF_TEX1)

typedef struct _VERTEX3
{
    D3DXVECTOR4 pos;
    D3DXVECTOR2 tex;
} VERTEX3;

void D3D9_Quad(IDirect3DTexture9* lpSrc, float rx, float ry)
{
    DWORD lv, fv;
    VERTEX3 Vertices[4];

    // Half-pixel offset so texel centers line up with pixel centers.
    ry -= 0.5f;
    rx -= 0.5f;

    Vertices[0].pos = { -0.5f, ry,    0.0f, 1.0f };
    Vertices[1].pos = { -0.5f, -0.5f, 0.0f, 1.0f };
    Vertices[2].pos = { rx,    ry,    0.0f, 1.0f };
    Vertices[3].pos = { rx,    -0.5f, 0.0f, 1.0f };

    Vertices[0].tex = { 0.0f, 1.0f }; // Bottom Left
    Vertices[1].tex = { 0.0f, 0.0f }; // Top Left
    Vertices[2].tex = { 1.0f, 1.0f }; // Bottom Right
    Vertices[3].tex = { 1.0f, 0.0f }; // Top Right

    // Save the render states we override and restore them afterwards.
    t_d3d9->GetRenderState(D3DRS_LIGHTING, &lv);
    t_d3d9->GetRenderState(D3DRS_FOGENABLE, &fv);
    t_d3d9->SetRenderState(D3DRS_LIGHTING, 0);
    t_d3d9->SetRenderState(D3DRS_FOGENABLE, 0);

    t_d3d9->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    t_d3d9->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    t_d3d9->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

    t_d3d9->SetTexture(0, lpSrc);
    t_d3d9->SetFVF(CUSTOM_VERTEX2);

    // Full-screen quad as a 2-triangle strip: BL, TL, BR, TR.
    t_d3d9->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, Vertices, sizeof(VERTEX3));

    t_d3d9->SetRenderState(D3DRS_FOGENABLE, fv);
    t_d3d9->SetRenderState(D3DRS_LIGHTING, lv);
}

I'm trying to use this function to draw a 3840x2160 A8R8G8B8 rendertarget texture onto a 1920x1080 X8R8G8B8 backbuffer.

This doesn't work when the filters are set to D3DTEXF_LINEAR or D3DTEXF_POINT (draws nothing), but does work when the filters are set to D3DTEXF_NONE. (Draws pixelated, but fits the texture onto the screen properly.)

Hopefully someone has some idea why this doesn't work and what I should do to correct it.

I will mention that HLSL/FX shaders that downsample also don't work for me.

Even the downsample one included in the Microsoft SDK. The upsample one works fine. I just don't get it.

OK, this has been SOLVED.

Turns out I need mipmaps on my textures to scale down, but not to scale up.

Adding D3DUSAGE_AUTOGENMIPMAP as a flag when creating the texture has done the trick!
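In case it helps anyone, the creation call now looks roughly like this (variable name made up, size and format from my SSAA 2x render target):

// Rough sketch: render-target texture with an auto-generated mip chain,
// so minified sampling has sublevels to pull from.
IDirect3DTexture9* ssaaTex = NULL;
t_d3d9->CreateTexture(3840, 2160,
    0,                                              // Levels = 0: full mip chain
    D3DUSAGE_RENDERTARGET | D3DUSAGE_AUTOGENMIPMAP, // sublevels are filled automatically
    D3DFMT_A8R8G8B8,
    D3DPOOL_DEFAULT,                                // render targets live in the default pool
    &ssaaTex, NULL);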

You don't have to use mipmaps under DirectX9 to get a picture, regardless. Something else is messed up in your code when you're putting up that texture.

Uh…

It looks like you're right.

I think the main problem was my previous CreateTexture statement.

I've been using Levels = 0. I should have been using Levels = 1.

I just tried it with Levels = 1, took off the AUTOGENMIPMAP flag, and it's fine as well.
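If I understand it right (could be off), the original call with Levels = 0 and no AUTOGENMIPMAP allocated a full mip chain, but I only ever rendered into level 0, so a LINEAR or POINT mip filter was sampling empty sublevels whenever the quad minified. With Levels = 1 there are no sublevels at all, roughly:

// Rough sketch of the Levels = 1 variant (variable name made up):
IDirect3DTexture9* ssaaTex = NULL;
t_d3d9->CreateTexture(3840, 2160,
    1,                      // single level: nothing below level 0 to sample from
    D3DUSAGE_RENDERTARGET,
    D3DFMT_A8R8G8B8,
    D3DPOOL_DEFAULT,
    &ssaaTex, NULL);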

I feel dumb AF.

Thank you, Geri.

EDIT: Without mipmaps, the quality of downscaling seems worse tho'.

