Global palettes bugs?
#5
(05-31-2024, 07:15 AM)megamarc Wrote: Hi!

Nice suggestions as always. I'll comment a bit.

By design I didn't want Tilengine to be a 1:1 replica of past systems, imposing artificial restrictions. One of the major differences is palette management. Classic systems had limited color capabilities, with fixed slots of CRAM as you say. Graphic designers had to be very aware of color restrictions and export their hand-crafted palettes to be loaded into CRAM separately.

Current graphics assets (bitmaps, PNGs...) all have built-in palettes that are imported directly in a free-form fashion. Of course graphic designers could craft small palettes and use them across all their graphics, but they don't work this way anymore. They author the bitmap with the palette embedded in it, ready to be loaded by their framework of choice. So I don't think the old way of managing color would fit current workflows.

I wouldn't partition global palettes into "tile" and "sprite" pools; that was just a chip limitation. I would offer a larger global palette pool and let any element use any global palette. Of course you could arrange your palettes in any layout of your choice, but I won't make the engine impose an artificial restriction that forces you to work in a specific manner.

Sprites and tile-based layers don't own palettes; they just hold references, and they neither create nor destroy palettes. Spritesets and Tilesets are the ones that own the palettes. I could implement a TLN_SetSpriteGlobalPalette() function to make a sprite use a global palette instead of its referenced palette, but I don't think it's interesting, because the sprite already has a valid palette obtained from its spriteset.

I think new features should be implemented to provide added value for artists and programmers in their workflow, but the features you're suggesting seem more geared towards protecting the system from being broken on purpose (like deallocating resources that are in use) and reinstating old limitations. This happens with any graphics (or whatever) library written in an unmanaged language. You can easily break wingdi, libcurl, expat, sdl or any other well-established library if you intend to do so.

Don't get me wrong, I really appreciate your suggestions and take them into account (in fact, I've already implemented many of them), but this time I feel they don't align well with my vision.

However, if you can provide a use case where global palettes on sprites would be more suitable or capable than spriteset palettes for achieving a particular effect (and not just to make it harder to deliberately break the engine), I'll be glad to implement it.

On the "blueish" color palette, the renderer is reading memory past the data of the palette itself, so you're getting random data. You colud be getting segmentation fault errors, too. For performance reasons the renderer doesn't compare palette bounds on each pixel, it assumes you provided a suitable palette. By using undersized palettes you're working outside of prerequisites, so unwanted effects are not a bug, just undefined behavior.

Best regards,

As for the global palettes story, I don't have any other use case besides safety (for now, at least), but the reason I use global palettes is... palette animations.
For example, you load a bitmap and set an animation on its palette. Later, you free the bitmap, but the palette is freed with it. The animation keeps accessing that palette, and now you have a typical use-after-free bug. Global palettes are never deleted unless you explicitly ask for it. They also always exist, and you always know where they are.
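Here's a schematic of the failure mode in plain C. These are made-up structs for illustration only, not Tilengine's actual internals; they just show the shape of the bug:

```c
#include <stdlib.h>

/* Hypothetical types for illustration only */
typedef struct { int nColors; unsigned int *colors; } Palette;
typedef struct { Palette *target; int frame; } PaletteAnim;

int main(void)
{
    /* the palette is owned by a loaded bitmap */
    Palette *pal = malloc(sizeof *pal);
    pal->nColors = 16;
    pal->colors = calloc(16, sizeof(unsigned int));

    /* the animation only holds a reference, not ownership */
    PaletteAnim anim = { pal, 0 };

    /* freeing the bitmap frees its embedded palette... */
    free(pal->colors);
    free(pal);

    /* ...but the animation still writes through the dangling
       pointer on the next frame: use-after-free */
    anim.target->colors[0] = 0xFFFFFFFFu;
    return 0;
}
```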

The way I achieve palette management for a bitmap is the following (see the sketch after the list).
When assigning a bitmap to a layer:
1) Delete the bitmap's internal palette and assign global palette 0 to the bitmap (the global palettes are also stored in a list).
2) When deleting the bitmap from memory, first assign a dummy palette (of length 1) to the bitmap, then delete the bitmap, so the global palette isn't freed.
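Roughly, in code. This is a sketch of my framework's wrappers: globalPalettes is my own list (not a Tilengine feature), and I'm assuming TLN_GetBitmapPalette, TLN_SetBitmapPalette, TLN_CreatePalette, TLN_DeletePalette and TLN_SetLayerBitmap behave as in the public headers:

```c
extern TLN_Palette globalPalettes[]; /* my framework's global palette list */

void AssignBitmapToLayer(int layer, TLN_Bitmap bitmap)
{
    /* step 1: drop the embedded palette and point the bitmap
       at global palette 0 instead */
    TLN_DeletePalette(TLN_GetBitmapPalette(bitmap));
    TLN_SetBitmapPalette(bitmap, globalPalettes[0]);
    TLN_SetLayerBitmap(layer, bitmap);
}

void FreeBitmap(TLN_Bitmap bitmap)
{
    /* step 2: swap in a throwaway 1-entry palette before deletion,
       so deleting the bitmap frees the dummy, not the global palette */
    TLN_SetBitmapPalette(bitmap, TLN_CreatePalette(1));
    TLN_DeleteBitmap(bitmap);
}
```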

Nothing to do for the tilemaps, since they prioritize global palettes if at least one is set.
For sprites, I can just assign a global palette from my list to the sprite, and the engine will prioritize it over the spriteset palette. For instance:
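(A one-liner sketch; TLN_SetSpritePalette is assumed from the public API, and the priority over the spriteset palette is my framework's modified behavior, not stock Tilengine.)

```c
/* make sprite 0 use a palette from my global list; in my modified
   engine this takes priority over the spriteset's own palette */
TLN_SetSpritePalette(0, globalPalettes[2]);
```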

For the out-of-bounds undefined behavior, I have an idea to avoid a potential segfault. You can do something like this:
palette[bitmapColorIndex * (bitmapColorIndex < paletteLength)]
If the bounds test fails (the index is out of range), the comparison evaluates to 0 and the multiplication zeroes the index, so you fall back to the very first entry of the palette.
I think a masked form like palette[bitmapColorIndex & -(bitmapColorIndex < paletteLength)] can work too, since negating the comparison's 0-or-1 result gives an all-zeros or all-ones mask.
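A self-contained demo of both forms. This is my own test snippet; it assumes a non-negative color index and two's complement for the mask trick:

```c
#include <stdio.h>
#include <stdint.h>

/* Branchless clamp of an out-of-range color index to entry 0 */
static uint32_t lookup_mul(const uint32_t *palette, int paletteLength, int index)
{
    /* in range: index * 1, out of range: index * 0 */
    return palette[index * (index < paletteLength)];
}

static uint32_t lookup_mask(const uint32_t *palette, int paletteLength, int index)
{
    /* -(1) is all ones and keeps the index, -(0) clears it */
    return palette[index & -(index < paletteLength)];
}

int main(void)
{
    uint32_t palette[4] = { 0xFF000000, 0xFF0000FF, 0xFF00FF00, 0xFFFF0000 };
    printf("%08X\n", (unsigned)lookup_mul(palette, 4, 2));  /* in range: entry 2 */
    printf("%08X\n", (unsigned)lookup_mul(palette, 4, 9));  /* clamped: entry 0 */
    printf("%08X\n", (unsigned)lookup_mask(palette, 4, 9)); /* clamped: entry 0 */
    return 0;
}
```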
What I can also do in my framework is sanitize the bitmaps the user loads: I iterate through each pixel and, if the value is greater than or equal to the global palette length, I zero it (or set it to the highest valid index). For example:
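Something like this, assuming 8-bpp indexed pixel data obtained from the loaded bitmap (pixels, width, height and pitch are whatever your loader gives you):

```c
#include <stdint.h>

/* Zero any pixel whose color index would overflow the palette
   (or use paletteLength - 1 to clamp to the highest valid index) */
void SanitizeBitmap(uint8_t *pixels, int width, int height,
                    int pitch, int paletteLength)
{
    for (int y = 0; y < height; y++)
    {
        uint8_t *row = pixels + y * pitch;
        for (int x = 0; x < width; x++)
        {
            if (row[x] >= paletteLength)
                row[x] = 0;
        }
    }
}
```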

Edit: what I did in my framework is quite interesting. There are ownership validations before assignment, and the resources are garbage collected.