Duke4.net Forums: New Polymer feature: highpalookup maps


New Polymer feature: highpalookup maps  "A new system to replace polymost tinting and most alternate pal maps"

Danukem

  • Duke Plus Developer

#61

The Commander, on Jan 14 2011, 09:48 PM, said:

Now certain custom maps that used alt pals that had not been made yet in the HRP will actually look like they should.
(Thinks of maps by Gambini etc)


Pretty much all those mappers who don't use the HRP, although Gambini actually does try to make it look good both ways in his recent work.

This puts the last nail in the coffin of the Polymost/HRP combination. Polymost will be strictly for use with 8-bit art on my computer, and then only because it performs better.

This post has been edited by DeeperThought: 14 January 2011 - 09:52 PM


Plagman

  • Former VP of Media Operations

#62

DeeperThought, on Jan 14 2011, 09:51 PM, said:

Polymost will be strictly for use with 8-bit art on my computer, and then only because it performs better.


Will that still be the case after I commit the code I have to make Polymer use the shade offset lookup tables instead of just darkening 8-bit art? With a (togglable) smooth transition between the shade levels instead of the sharp depth cueing that classic mode has.
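To illustrate what Plagman is describing, here is a toy sketch (not EDuke32 code; the 4-color palette and all helper names are invented for the example): a Build-style palookup table maps (color index, shade level) to another palette index, and the "smooth transition" variant blends between two adjacent shade rows instead of snapping to one, which is what removes the sharp depth-cueing bands.

```python
# Toy illustration of Build-style shade tables (NOT EDuke32 code; the
# demo palette and helper names are invented for this example).

NUM_SHADES = 32

def make_demo_tables():
    # Hypothetical palette: index -> (r, g, b)
    palette = [(0, 0, 0), (255, 0, 0), (0, 255, 0), (255, 255, 255)]

    def nearest(rgb):
        # Snap an RGB value back to the closest palette index.
        return min(range(len(palette)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(palette[i], rgb)))

    # palookup[shade][color] -> remapped palette index; each row is the
    # palette darkened one step further and re-quantized, as in classic mode.
    palookup = []
    for shade in range(NUM_SHADES):
        k = 1.0 - shade / (NUM_SHADES - 1)
        palookup.append([nearest(tuple(c * k for c in palette[i]))
                         for i in range(len(palette))])
    return palette, palookup

def shaded_color(palette, palookup, color_index, shade):
    """Classic behaviour: one hard table lookup per shade level (banding)."""
    s = max(0, min(NUM_SHADES - 1, int(shade)))
    return palette[palookup[s][color_index]]

def shaded_color_smooth(palette, palookup, color_index, shade):
    """Togglable smooth variant: blend between two adjacent shade rows."""
    s0 = max(0, min(NUM_SHADES - 1, int(shade)))
    s1 = min(NUM_SHADES - 1, s0 + 1)
    t = shade - s0
    c0 = palette[palookup[s0][color_index]]
    c1 = palette[palookup[s1][color_index]]
    return tuple(round(a * (1 - t) + b * t) for a, b in zip(c0, c1))
```

The key point of the real feature is the same as in the sketch: the darkening still comes from the 8-bit shade offset tables (so artist-tuned hue shifts survive), only the transition between levels is interpolated.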

Danukem

  • Duke Plus Developer

#63

Plagman, on Jan 14 2011, 10:03 PM, said:

Will that still be the case after I commit the code I have to make Polymer use the shade offset lookup tables instead of just darkening 8-bit art? With a (togglable) smooth transition between the shade levels instead of the sharp depth cueing that classic mode has.


Possibly not!

But you see, there are maps I play using 8-bit art that I would not even currently attempt with Polymer. Maps that use close to the 16384-wall limit, have big view distances, lots of spritework, enemies all over shooting light sources... they are unplayable with Polymer for me. I can actually run something like DNE much better with Polymer than, say, WGR2.

Plagman

  • Former VP of Media Operations

#64

DeeperThought, on Jan 14 2011, 09:32 PM, said:

I want this right now.


BTW, you can get it right now. Just download the file CraigFatman posted, copy the DEFs and you're done.

Jblade

#65

Quote

Will that still be the case after I commit the code I have to make Polymer use the shade offset lookup tables instead of just darkening 8-bit art? With a (togglable) smooth transition between the shade levels instead of the sharp depth cueing that classic mode has.

That sounds amazing. So, effectively, Polymer will be capable of looking much closer to classic mode than Polymost, then? (I'm assuming this isn't possible for Polymost, of course.)

Roma Loom

  • Loomsday Device

#66

Plagman, on Jan 15 2011, 09:33 AM, said:

BTW, you can get it right now. Just download the file CraigFatman posted, copy the DEFs and you're done.


Just submit it to pHRP SVN already, and then write a proggy that parses all the defs and removes the alt-pal def lines.

This post has been edited by Roma Loom: 15 January 2011 - 03:14 AM


Helixhorned

  • EDuke32 Developer

#67

Awesomeness! Can't wait to see this appear in PHRP SVN.

Now that the issue of redundant pal images is out of the way except for gradual improvements, some effort should be invested in making the HRP more consistent with the old content. Right now I can only think of some models that have the wrong size when they substitute for the original art: the "guilty" and "innocent" signs, the first being a particular offender.

DavoX

  • Honored Donor

#68

Greatness man! One more step toward an optimized Polymer :P

Plagman

  • Former VP of Media Operations

#69

Well, some of the highpal swaps make textures look a little "glitchy", so I'm not sure whether we should flip the switch right now or try to help tune Lezing's script first.

#70

I'd be pleased to see any improvement over my script. Of course, if somebody creates a better script, you can discard my color maps. For now, I'm going to introduce automatic selection of colors for processing that will neither interfere nor cause visible distortion. One could also try an iterative stochastic algorithm that draws discrete color patches on the color map to gradually adapt it to a selected look-up palette, but I doubt it would work better than the linear mixing technique I am using.

This post has been edited by CraigFatman: 15 January 2011 - 06:50 PM


Gambini

#71

This is the first awesome thing I've found in Polymer!

AWESOME

Now:

Quote

Will that still be the case after I commit the code I have to make Polymer use the shade offset lookup tables instead of just darkening 8-bit art? With a (togglable) smooth transition between the shade levels instead of the sharp depth cueing that classic mode has.


That's going to be next, as long as it affects 8-bit art too (while using Polymer, of course).

Plagman

  • Former VP of Media Operations

#72

The point is that it only works for 8-bit ART; there's no shade offset lookup table otherwise.

Gambini

#73

So this means that, when not using the HRP, the game will look just like the software renderer but with a widescreen aspect ratio and good texture filtering? Can't wait.

Plagman

  • Former VP of Media Operations

#74

I just committed all of Lezing's highpalookups to the HRP repository (converted to PNG, though I suspect they're poorly optimized), so you guys can give it a try whenever.

Plagman

  • Former VP of Media Operations

#75

Roma Loom just asked me about NVG, which reminds me that the game uses more than one base palette: it also has a water palette and an NVG/slime palette. That means we need three whole sets of highpalookups (bar possible duplicates; I noticed pal normal/6 was exactly the same as pal NVG/6, which makes sense), and that I need to change the DEF language to let you tell the engine which base pal you're defining your highpal for (and to hook it up properly in the engine; right now we only have crappy fullscreen tinting).

#76

Maybe we should just apply one more color-remapping shader to the final image? That is, each pixel would be remapped by highpalookups twice: by the texture palette during rasterization, and then by the slime/water palette. This won't hurt the framerate very much, but it could save memory.

This post has been edited by CraigFatman: 16 January 2011 - 02:08 PM


Plagman

  • Former VP of Media Operations

#77

Yes, I thought about the global highpal being applied as a postprocessing effect (that's what you're suggesting, right?), but it's still one more texture lookup per pixel, so there's always more performance left on the table than with a single highpal map that combines both upfront. When you talk about saving memory, are you concerned about HRP file size or video memory? I don't think video memory is a problem here, because we'll either be using one set or the other, never both at the same time.
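The trade-off being debated here can be sketched in a few lines (a toy model: 1-D look-up tables stand in for the real 3-D highpalookup textures, and both example LUTs are invented). Applying two LUTs in sequence and applying one precomposed LUT give identical results, so the choice is purely speed versus data size:

```python
# Toy sketch: two per-pixel lookups vs. one precomposed lookup.
# 1-D LUTs stand in for the real 3-D highpalookup textures.

def compose(outer, inner):
    """Precompute outer(inner(x)) for every input: one lookup at runtime."""
    return [outer[inner[x]] for x in range(len(inner))]

# Hypothetical LUTs: a "texture pal" that inverts, a "water pal" that halves.
tex_pal = [255 - x for x in range(256)]
water_pal = [x // 2 for x in range(256)]

combined = compose(water_pal, tex_pal)

# Two lookups per pixel vs. one precomposed lookup: same result everywhere.
for v in (0, 17, 128, 255):
    assert water_pal[tex_pal[v]] == combined[v]
```

CraigFatman's proposal is the two-lookup path (less data, one extra fetch per pixel); Plagman's is the precomposed path (more highpal sets on disk, one fetch per pixel).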

#78

I'd say that I'm concerned about data redundancy. Are highpalookups so slow that they can't be used twice per pixel? Increasing the amount of data to load will increase loading times and RAM usage; the former can easily outweigh the realtime performance gain. There's no need to make separate palettes for every combination of base and texture-specific palettes; a two-stage color mapping system should provide a better performance/RAM ratio. And if some day we come to custom base pals (which modders might want to introduce), the renderer will be flexible enough to support them. Another possible approach is to precalculate all the highpalookups needed for single-stage filtering from an initial set of palettes (look-ups + water + NVG) during game load; it's quicker than loading them from disk, but is it worth it? IMHO, applying base palettes as a postprocessing effect is the only practical way to go.

Plagman

  • Former VP of Media Operations

#79

It's not _that_ expensive (though cache locality on datasets that large kind of sucks), but it seems sad to have a performance penalty instead of none. Anyway, on the E1L5 scene from before, I get 149 FPS without highpal, 125 with one pass of highpal, and 111 with two passes. That's roughly 1 ms of overhead per pass, but it might be a lot worse on more complex scenes. I'm not opposed to having it make two passes of highpal, and it makes sense from a technical perspective, but I still don't think it's the optimal approach. BTW, I don't understand your point about support for custom base palettes; why wouldn't modders be able to generate highpals for those just like they would for the regular base pal if they want to change them? Anyway, could you provide slime and water base highpals so I can test the multipass implementation?
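As a sanity check on those figures, converting the frame rates to frame times bears out the "roughly 1 ms per pass" estimate:

```python
# Frame-time check of the numbers above: 149/125/111 FPS converted to ms.
fps = [149, 125, 111]
ms = [1000.0 / f for f in fps]

first_pass = ms[1] - ms[0]    # cost of the first highpal pass
second_pass = ms[2] - ms[1]   # cost of the second highpal pass

print(round(first_pass, 2), round(second_pass, 2))  # 1.29 1.01
```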

#80

Keep in mind that most of the time the second pass will be disabled (while the player is neither underwater nor wearing NVG), so there's no need to worry about the performance hit. Also, have you experimented with less bulky highpalookups (say, 64x64x64)? Can they speed up the processing?

Anyway, here are the water/slime highpalookups generated by a slightly modified "clutstat2.kc" script: http://lzg.duke4.net/basehpal.rar

As for custom base palettes, they would be easier to implement in Polymer if treated as regular highpalookups (the setgamepalette command could support all highpalookups as well as standard base palettes). Of course, this is only possible if we use two-pass color mapping.

Plagman

  • Former VP of Media Operations

#81

64x64x64 isn't an option, look at how ugly the result is:

http://picasaweb.google.com/pierreloup.gri...211307009827970

I'm implementing the alternate palette support now, I'll post an update when I'm done.

#82

Plagman, on Jan 17 2011, 04:39 AM, said:

64x64x64 isn't an option, look at how ugly the result is:

Well, it could provide suitable quality if some interpolation is performed. That is, if trilinear interpolation doesn't slow things down even more (it probably will)...

Plagman

  • Former VP of Media Operations

#83

It's already interpolated; that's why 128x128x128 doesn't look too bad. It looks a little different from full precision, but you can't see a net loss. Without interpolation you can definitely notice something is off, and the same goes for 64x64x64 even with interpolation.

#84

If so, I have no clue why they look different. If the interpolation is performed correctly, the texture shouldn't be distorted even if we select a 2x2x2 highpalookup that is intended to keep all colors unmodified, right? An RGB space in itself is just eight vertices with interpolated values in between. Can you explain where the difference comes from?

Plagman

  • Former VP of Media Operations

#85

Everything you said is right, of course. However, graphics hardware complicates the situation a bit, since texture coordinates 0 and 1 don't refer to the middle of the first and last texel, respectively. Instead, 0 is the beginning of the first texel and 1 is the end of the last texel. When not using interpolation this doesn't show as much, since everything snaps into place, but with linear interpolation it means you don't get the value of the first texel if you sample at 0. Instead, you get a blend of the first texel and the border value (or the last texel if using a wrapping sampler).

While this is generally helpful in most applications (seamless wrapping), in this case it means that you can't directly map the result of the diffuse map fetch to the highpalookup map texture coordinates, as the inbound color values range from 0 to 1. To get this exactly right with an NxNxN colorspace, you would need to scale the inbound color components by ((N-1) / N) and apply a bias of ((1 / N) / 2). I was hoping to avoid this additional instruction cost by relying on the dataset having enough values to mostly hide it. Since the dataset size also helps with a faithful rendition of the palette itself, I think 128x128x128 is good.

I can try comparing the visual results in-game at lower precisions, applying the scale and bias I mentioned above as a hack, to help us make the decision. Can you provide various sizes for a relatively complex palette so that I can give it a try?
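The scale-and-bias Plagman describes can be checked numerically (the helper names below are invented; the formulas are the ones from the post). Scaling by (N-1)/N and biasing by (1/N)/2 lands color 0.0 on the center of the first texel and color 1.0 on the center of the last, which is exactly what linear filtering needs to return unblended values at the ends of the range:

```python
# Numeric check of the scale/bias above (helper names are invented here).

def texel_center_coord(c, n):
    """Remap c in [0, 1] by ((n-1)/n) * c + (1/n)/2, per the post."""
    return c * (n - 1) / n + 1.0 / (2 * n)

def center_of_texel(i, n):
    """Texture coordinate of texel i's center on an n-texel axis."""
    return (i + 0.5) / n

n = 128
assert abs(texel_center_coord(0.0, n) - center_of_texel(0, n)) < 1e-12
assert abs(texel_center_coord(1.0, n) - center_of_texel(n - 1, n)) < 1e-12
```

Without the remap, sampling at 0.0 or 1.0 sits on a texel edge, so the filter blends in the border (or wrapped) value, which is the subtle color shift Plagman saw at lower LUT resolutions.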

#86

OK, in this case scaling + biasing seems to be the wisest way (I guess it's fast compared to caching/interpolation costs). Theoretically, an interpolated highpalookup could work perfectly at the standard high-color precision of 32x64x32 (5:6:5 bit mask), but let's employ uniform dimensions to simplify calculations. I've uploaded palette #7 rendered at sampling levels of 64 and 32 for testing purposes:
http://lzg.duke4.net/hpal7tst.rar

Also, the use of large datasets is reasonable if they contain high-frequency components that can't be reproduced by low-resolution images, or if there is severe quantization noise, as with non-interpolated processing. Actually, we rather need to soften sharp transitions between colors, and a lower resolution might facilitate this by cutting off the higher frequencies.

Stabs

#87

Can Craig start coding for EDuke?

By the sounds of things and his previous work, he would be very good at it.

Plagman

  • Former VP of Media Operations

#88

5:6:5 would be nice, since it means we could use 16bpp instead of 32, but it would triple the cost of the scale and bias, as we wouldn't be able to apply it to the whole texture coordinate vector in one pass. I tested your maps and they look good as well, though the colors and where they shift are obviously a little different. I think 6 bits is fine and still leaves some room if anyone feels like making a very accurate highpal table. Since any memory concerns are now totally out of the equation, I think that, more than ever, base pals need to be collapsed into the highpalookups instead of being kept separate. The cost of two additional texture lookups plus scale/bias for each fragment far outweighs an additional 6 MB of highpals in the HRP.
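For reference, the 5:6:5 layout under discussion packs one texel into 16 bits like this (a generic sketch, not engine code). The tripled scale/bias cost Plagman mentions follows from the axes having different sizes: with 32x64x32 levels, the (N-1)/N scale and 1/(2N) bias differ per channel, so they can no longer be applied to the whole coordinate vector in one instruction.

```python
# Generic 5:6:5 packing sketch: 5 bits red, 6 bits green, 5 bits blue
# in one 16-bit value, hence 32x64x32 levels per axis.

def pack565(r5, g6, b5):
    return (r5 << 11) | (g6 << 5) | b5

def unpack565(v):
    return (v >> 11) & 0x1F, (v >> 5) & 0x3F, v & 0x1F

assert pack565(31, 63, 31) == 0xFFFF      # all-max channels fill all 16 bits
assert unpack565(pack565(10, 20, 30)) == (10, 20, 30)
```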

Plagman

  • Former VP of Media Operations

#89

DanM, on Jan 17 2011, 09:13 PM, said:

Can craig start coding for eduke?

by the sounds of things and his previous work he would be very good at it


Hasn't he already started? Besides, everyone is free to submit changes to the core code whenever they please; the source code is public.

Stabs

#90

Sorta, but some actual commits by CraigFatman on the SourceForge page would be nice. Perhaps he could pursue and improve a certain area of EDuke, like how Helix bitch-slaps Mapster around. I know anyone can do it; it's just a shame that with his oodles of knowledge he hasn't started coding for the core of EDuke yet, because you know he would do something crazy that breaks all the rules.


