Reply From: Zylann
I think the performance issue comes from how many pixels you are drawing. Stacking a lot of full-resolution, full-screen images can lead to a huge amount of overdraw, even if the images are mostly transparent. The CPU has nothing to do with this scenario, because you only have a few nodes to manage; for the GPU, however, it's a big deal. Having hundreds of different textures can be an issue as well, but you don't have that many. What is your graphics card? OS? Drivers?
You could try to reduce the drawn surface by cutting out the transparent areas. Also, are you using pixel shaders? Custom materials?
Edit: if you are using a laptop, does it have two graphics chipsets? Mine has NVIDIA + Intel graphics. While the latter can render OpenGL stuff, it is ridiculously slow unless I explicitly tell Windows 10 to run the program with the NVIDIA chipset.
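To put rough numbers on the overdraw Zylann describes, here is a back-of-envelope sketch in Python. The resolution, layer count, and frame rate are illustrative assumptions, not figures from the thread; real GPU throughput also depends on blending, texture bandwidth, and driver overhead:

```python
# Back-of-envelope overdraw estimate: every full-screen layer makes the GPU
# touch every pixel, even the fully transparent ones, because the whole
# quad is rasterized and blended.

def fill_rate(width, height, layers, fps=60):
    """Pixels the GPU must blend per frame and per second for `layers`
    stacked full-screen images at the given resolution and frame rate."""
    pixels_per_frame = width * height * layers
    return pixels_per_frame, pixels_per_frame * fps

per_frame, per_second = fill_rate(1920, 1080, layers=8)
print(f"{per_frame / 1e6:.1f} Mpixels/frame, {per_second / 1e9:.2f} Gpixels/s")
# → 16.6 Mpixels/frame, 1.00 Gpixels/s
```

Roughly a gigapixel per second of blended fill is a serious ask for an integrated chip of that era, which is consistent with eight full-screen layers choking an HD 4000.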
Yeah, I know overdraw can get bad, it just seems like 8 layers should be no problem. (I mean, how do you even make a game with fewer than 4 or so?) I guess the facts say otherwise, though. That's why I asked if someone had successfully made a similar game with Godot.
Ah, I assumed the CPU would take over anything the integrated graphics couldn't handle? I'm using Windows 7 and the Intel HD 4000. I've also tried it on a Windows 7 laptop with the HD 3000, an Ubuntu desktop with some other integrated Intel chip I think, and a 10-year-old Dell XPS running Windows XP. They all had fairly similar performance.
The only thing that has a material is the vignette. It has a simple screen-reading shader that desaturates what’s underneath it.
rgrams | 2016-06-05 15:50
Is there a way to specify if and how a sprite uses transparency? I know in Unreal Engine you specify if a material is opaque, masked, or uses the full alpha range.
rgrams | 2016-06-05 15:55
The CPU is never used to draw things in games; that's the job of the graphics card (GPU).
For transparency, I meant that bigger sprites should not always be square, because a square doesn't really represent the shape of what you are drawing:
Let’s say you have a hollow circle sprite covering the whole screen. If you draw it, you’ll ask the GPU to draw 10% filled pixels, while the other 90% will be… blank, but drawn anyway. The solution is usually to draw a shape encompassing the non-empty pixels instead of a quad. There will be a few more polygons, but you save 90% of the screen space to fill. This is even better if the sprite has a pixel shader on it, because you save computation per pixel. Unfortunately, Godot doesn’t have this form of optimization yet; it should become possible with 3.0.
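A quick way to see how much fill the hollow-circle example wastes is to compute what fraction of the sprite's bounding quad the ring actually covers. The radius and thickness below are made-up illustrative values, not anything from the thread:

```python
import math

# Full-screen quad versus a tight mesh that only covers the ring itself:
# the GPU only pays for the pixels inside the shape it rasterizes.

def ring_coverage(outer_radius, thickness):
    """Fraction of the sprite's bounding square actually occupied
    by a hollow ring of the given outer radius and thickness."""
    quad_area = (2 * outer_radius) ** 2
    ring_area = math.pi * (outer_radius**2 - (outer_radius - thickness)**2)
    return ring_area / quad_area

# A ring filling a 1080-pixel-tall square, 30 px thick:
frac = ring_coverage(540, 30)
print(f"{frac:.1%} of the quad's pixels are non-empty")
# → 8.5% of the quad's pixels are non-empty
```

So a tight-fitting mesh would let the GPU skip over 90% of the quad's pixels, matching the rough 10%/90% split Zylann gives above.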
My game has three stacked tilemaps, a few sprites, and one parallax background, and it runs at a solid 60 fps most of the time when maximized. When windowed over the editor it slows down to 40 fps; I'm not sure why, but that's not Godot's fault, because once exported it runs fine on both PC and Android. I never tested with more layers.
Did you test your game after a release export?
I would be curious to test your case, but I guess your project is private at the moment.
Zylann | 2016-06-05 17:01
Yup, all my testing on other computers was with a release build. I just put in a frame rate and frame time display and tested it a bit more. Running the release build does shave off a couple ms compared to running it over the editor, as expected.
I guess you are right, it’s just too much stuff. I’ll have to cut back on the parallax layers, which is frustrating (I had intended to add more, haha), but not game-destroying.
Now I’m curious how something like Ori and the Blind Forest would run on my computer, hmm . . .
rgrams | 2016-06-05 23:24
My main development laptop also uses an HD 4000 under Windows 7 (along with a GTX 620, which is not much better than the Intel GPU). I also get terrible performance with Godot for 2D. In contrast, I am developing a full-HD game with GameMaker Studio that runs smoothly at 60 fps on the same laptop. GameMaker exports to DX9, which has much better performance than Godot's OpenGL.
I really do think that OpenGL ES 2 is letting us down at the moment, and you may have to wait until they incorporate ES3 before you get decent frame rates on cards like the HD 4000.
zendorf | 2016-06-06 04:01
Ori is a 3D game, even if you can't tell at first sight.
But yes, I’ve had the same questions, because there are 32+ parallax layers in that game.
I think we should implement something similar to what other engines are calling 2.5D: 2D physics but 3D rendering. Since the 3D part of Godot is going to be rewritten, it seems like a good time.
timoschwarzer | 2016-06-08 15:45
Ah, of course, it's made with Unity, isn't it? Yeah, I was recently informed of the differences between 2D and 3D rendering, and how 2D engines often don't have any sort of occlusion culling or depth sorting, so every pixel on every layer gets drawn.
This article http://achurch.org/dco2d.pdf about optimizing 2D rendering was pretty interesting, and they got a pretty serious performance boost.
rgrams | 2016-06-08 17:26
I wonder if the game would run faster if you added all the sprites from one scene into 3D space.
Maybe you can try this in a copy of your game: just add the sprites and make the camera movable, so you don't need to reimplement your game logic.
timoschwarzer | 2016-06-08 17:32
2D or 3D is not the point; in either case, you ask the graphics card to draw polygons. In 2D, Godot already skips depth sorting, depth testing, etc., and actually has a form of frustum culling for 2D to speed up rendering. Maybe it batches the geometry, but I didn't dig that far into the code. However, for the sake of simplicity, Godot draws quads regardless of what the images are.
Since adding official 2D support, Unity considers sprites to be actual assets, not just textures you smash on the screen. It also optimizes their shape so that the graphics card spends less time drawing them, so there is an actual mesh drawn there.
Also, maybe it’s due to GLES2 limitations preventing the use of modern techniques, but I'm skeptical about this, since we are just talking about drawing sprites.
Zylann | 2016-06-09 19:28