Tuesday, 10 May 2011

PNVS redux

Short update on this.

Dispensed with the headache- and nausea-inducing PNVS camera. Copying the physical arrangement virtually just doesn't translate to a 2D video environment. And thanks to the chap who emailed to sympathise while sitting on the pad; that made me feel less of a wimp.

I've got a much, much faster and smarter solution working for the PNVS and I'm very happy with it. A few rough edges still (it needs more of a smooth edge); post-processing will add contrast enhancement and edge detection. Parts of the helo, like the TADS barrel and forward avionics bays, are visible through the floor of the cockpit; these need to be removed using geometry masking when the PNVS system is active. Since the PNVS area always stays in the same place, that's easier to accomplish.

What I need is a Portal hole. Again, shaders to the rescue: I'll employ a texture channel to act as a discard mask when the PNVS is on, unless something simpler comes to mind.
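A minimal sketch of that texture-channel discard mask, with Python standing in for the per-fragment shader logic (all names here are hypothetical; in GLSL this would be a `discard` on the sampled mask channel):

```python
# Sketch of the discard-mask idea (hypothetical names; Python standing in
# for per-fragment shader code). mask_texel is the mask channel sampled at
# this fragment's UV; values near 1.0 mark cockpit geometry that should be
# cut away while the PNVS overlay is active.

def shade_fragment(base_color, mask_texel, pnvs_active):
    """Return the fragment colour, or None to 'discard' the fragment."""
    if pnvs_active and mask_texel > 0.5:
        return None  # equivalent of the GLSL 'discard' keyword
    return base_color

# With PNVS off, masked geometry still renders normally:
assert shade_fragment((0.2, 0.2, 0.2), 1.0, pnvs_active=False) == (0.2, 0.2, 0.2)
# With PNVS on, fragments under the mask are dropped:
assert shade_fragment((0.2, 0.2, 0.2), 1.0, pnvs_active=True) is None
```

Because the mask lives in a texture channel rather than in geometry, the hole stays in a fixed screen region and costs nothing extra per frame.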

Anti-col light half in and out of the PNVS frame.

Dev's quick test zone: lighting and transparency blending.
Transparency and lighting are now blended into the PNVS buffer. Gunfire, explosions etc. are looking quite good at night in refreshing mint-flavour vision (tm).
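The compositing step is just the standard "over" operator applied per channel when a transparent effect lands in the PNVS buffer; a sketch with hypothetical names:

```python
def blend_over(src, src_alpha, dst):
    """Composite a transparent effect (muzzle flash, canopy glass) over the
    PNVS buffer with the standard 'over' operator: out = s*a + d*(1 - a)."""
    return tuple(s * src_alpha + d * (1.0 - src_alpha) for s, d in zip(src, dst))

# A half-transparent white flash over a black night buffer lands at mid-grey:
print(blend_over((1.0, 1.0, 1.0), 0.5, (0.0, 0.0, 0.0)))  # (0.5, 0.5, 0.5)
```

In practice this is what fixed-function alpha blending (SRC_ALPHA, ONE_MINUS_SRC_ALPHA) does in hardware, so it costs nothing beyond the draw itself.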

Some post-processing gives control over contrast and brightness while desaturating the image. Still to do: image enhancement, then the large task of adding heat maps to all necessary assets, which, rendered in a single pass, will be applied to the emissive output channel. And it hardly costs any frames, brilliant.
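As an illustration of that post pass (the actual shader will differ; the constants and the mint tint here are made-up examples): desaturate to luminance, scale contrast about mid-grey, add a brightness offset, then tint toward the green output.

```python
def pnvs_post(rgb, contrast=1.4, brightness=0.05, tint=(0.6, 1.0, 0.7)):
    """Desaturate to Rec.601 luminance, apply contrast about mid-grey,
    add a brightness offset, clamp, then tint toward mint green.
    All parameter values are illustrative, not the shipping settings."""
    r, g, b = rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b       # Rec.601 luma weights
    v = (luma - 0.5) * contrast + 0.5 + brightness  # contrast pivots on 0.5
    v = min(1.0, max(0.0, v))                       # clamp to displayable range
    return tuple(v * t for t in tint)
```

Because every term is a couple of multiply-adds per pixel in a full-screen pass, it's essentially free on modern GPUs, which matches the "hardly costs any frames" observation.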

first heat map test

6 comments:

  1. For what it's worth, if it's not a huge amount of overhead for you to maintain, I'm sure a significant number of the drooling hordes who can't wait to buy this game would love to have the actual physical-arrangement PNVS as a toggled-off-by-default option. Headaches or no headaches, I'd love to fly an Apache that matches the real one in every conceivable way. Just my two cents.

  2. It doesn't work that way though. You need to re-create the physical entities. Even in DCS games you have to fudge a few things for the sake of practicality. On a 2D monitor both eyes see a slightly lagged double image; even if the virtual cameras are arranged like the real machine, the limitations of a desktop PC just can't deliver it in the same way. You would have to pass that image into an eyepiece with appropriate optics so one side of your brain can learn to process it. You can't really simulate that on screen.

    It's a wee bit of extra work to add and, really, it would warrant a whole client PC running the optics system synced over the network. I've seen a car research project doing this with Leadwerks to great effect. Maybe we'll try it as part of a pro feature if it's something worth paying for later (networked instruments and specialist hardware support).

    See how it goes. Just want to get to the next milestone.

  3. Wonderful pics, morrrre please ....

  4. Looks good flex!

  5. If this does get released as a "pro" feature, I wonder whether it would work well on 3D-enabled monitors, with the PNVS image sent to the right eye only. That would be pretty slick.
