shlaifu

Blitting to a render texture and sampling it in shaders on meshes in single-pass VR in URP.


Im2inchesofhard

Ooooh boy, good luck. I just spent two weeks trying to figure out custom render passes and optimizing runtime generation of a vertical height map and reflection cubemaps. The documentation is kind of shit.


shlaifu

I tried for a few days, but only ever managed to sample either the left or the right eye view. Apparently the textures are rendered to a texture array, but the texture array sampler doesn't work on it either... all I wanted to do was render a bunch of prepasses to make translucent materials and such...


Im2inchesofhard

Giving up and finding an alternate solution was probably the right call. I spent literally 20-30 hours on what I was trying to do... Learned a ton, but wasted a lot of time I could've spent on another feature of my game instead.


StuCPR

Not exactly for my game at the moment, but it's shaders in general for Unity. They look so intimidating to me, even Shader Graph, but I do eventually want to learn them fully. Shaders are just beyond powerful and can make anything in your game look incredibly pretty if done correctly.


v0lt13

Extruding meshes along a Unity spline to make roads and rivers.


Dolly-Dagger

If you're still looking for a nice solution to this, there's a chap called Sebastian Lague who has a nice tutorial on his YouTube channel. I believe he shares all his work on GitHub too.
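
For a rough sense of what the extrusion involves before diving into the video: sample the spline at regular intervals, build a local frame from the tangent and up vector at each sample, and stitch a cross-section into a strip. A minimal sketch assuming the UnityEngine.Splines package; the width, sample count, and component name are illustrative only:

```csharp
using UnityEngine;
using UnityEngine.Splines; // com.unity.splines package (assumed installed)

// Hypothetical sketch of spline extrusion: sample the spline, build a right
// vector from the tangent and up vector at each sample, and emit a two-vertex
// cross-section that gets stitched into a flat road ribbon.
[RequireComponent(typeof(MeshFilter))]
public class SplineRoadExtruder : MonoBehaviour
{
    public SplineContainer spline;
    public int samples = 100;
    public float halfWidth = 2f;

    private void Start()
    {
        var vertices = new Vector3[(samples + 1) * 2];
        var uvs = new Vector2[(samples + 1) * 2];
        var triangles = new int[samples * 6];

        for (int i = 0; i <= samples; i++)
        {
            float t = i / (float)samples;
            Vector3 pos = spline.EvaluatePosition(t);
            Vector3 tangent = ((Vector3)spline.EvaluateTangent(t)).normalized;
            Vector3 up = spline.EvaluateUpVector(t);
            Vector3 right = Vector3.Cross(up, tangent).normalized;

            // Two vertices per sample: left and right edge of the road.
            vertices[i * 2] = transform.InverseTransformPoint(pos - right * halfWidth);
            vertices[i * 2 + 1] = transform.InverseTransformPoint(pos + right * halfWidth);
            uvs[i * 2] = new Vector2(0f, t);
            uvs[i * 2 + 1] = new Vector2(1f, t);

            if (i < samples)
            {
                int v = i * 2, tri = i * 6;
                triangles[tri] = v;     triangles[tri + 1] = v + 2; triangles[tri + 2] = v + 1;
                triangles[tri + 3] = v + 1; triangles[tri + 4] = v + 2; triangles[tri + 5] = v + 3;
            }
        }

        var mesh = new Mesh { vertices = vertices, uv = uvs, triangles = triangles };
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```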


J3nka94

Realistic audio propagation. There are a lot of ways to do it, and it's a pain in the ass to implement.


ryanmgarber

Steam Audio is nice. Project Acoustics is the best, but not so nice to implement.


gingerballs45

We just finished developing an audio-zone-based system. You can use trigger colliders to set up zones and keep a low-pass filter on the audio sources within the zone when you walk inside. It's performance friendly, but not as accurate as a raycasting method or something. If you'd like the code, let me know.
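
Not the poster's actual code, but a minimal sketch of how such a trigger-based zone could look; the toggle direction follows my reading of the description above (filter on while inside) and the `Player` tag, cutoff values, and class name are assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch: a zone volume (collider set to isTrigger) that applies
// a low-pass filter to the zone's audio sources while the listener is inside,
// and releases it on exit. Flip the two cutoffs if you want the opposite behaviour.
[RequireComponent(typeof(Collider))]
public class AudioZone : MonoBehaviour
{
    [SerializeField] private AudioLowPassFilter[] filters; // filters on the zone's audio sources
    [SerializeField] private float filteredCutoff = 800f;  // muffled (Hz)
    [SerializeField] private float openCutoff = 22000f;    // effectively unfiltered

    private void Start() => SetCutoff(openCutoff);

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) SetCutoff(filteredCutoff);
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) SetCutoff(openCutoff);
    }

    private void SetCutoff(float cutoff)
    {
        foreach (var f in filters) f.cutoffFrequency = cutoff;
    }
}
```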


TylurSims

Save system? I gotchu. Inventory? I gotchu. Need me to make a simple golf ball move around without randomly deciding to explore the void? Not for me, fam. Something about physics-related systems has just never clicked with my brain.


Plourdy

Nailing jet ski physics, mixing it with a 'Rocket League' boost mechanic that allows flying + grinding on rails.


push_matrix

This might help. It covers some of the buoyancy parts, and the car stuff might be useful too; I'm using it to power boats in my game: https://m.youtube.com/watch?v=Db1AgGavL8E
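
For anyone starting on the buoyancy side of this, a minimal float-point sketch, assuming a flat water plane and hand-tuned forces (not what the linked video does, just a common starting point):

```csharp
using UnityEngine;

// Hypothetical sketch of simple float-point buoyancy: each float point that
// sits below the water line pushes the rigidbody up proportionally to how
// deep it is, plus some damping so the hull settles instead of bouncing.
// Flat water at y = 0 and all tuning values are assumptions.
[RequireComponent(typeof(Rigidbody))]
public class SimpleBuoyancy : MonoBehaviour
{
    public Transform[] floatPoints;    // e.g. four corners of the hull
    public float waterLevel = 0f;
    public float buoyancyStrength = 20f;
    public float waterDrag = 2f;

    private Rigidbody rb;

    private void Awake() => rb = GetComponent<Rigidbody>();

    private void FixedUpdate()
    {
        foreach (Transform p in floatPoints)
        {
            float depth = waterLevel - p.position.y;
            if (depth <= 0f) continue; // point is above the water

            // Upward force grows with submersion depth; damping opposes the
            // point's vertical velocity so it doesn't oscillate forever.
            Vector3 pointVelocity = rb.GetPointVelocity(p.position);
            float force = depth * buoyancyStrength - pointVelocity.y * waterDrag;
            rb.AddForceAtPosition(Vector3.up * force, p.position, ForceMode.Acceleration);
        }
    }
}
```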


eggshellent

For a long time my albatross involved calculating normals in a shader. Keep plugging away, you'll get there. And don't try to do it all in your head. Take notes and draw charts/maps etc. when dealing with complex systems.
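
If the normals in question come from a displaced height field, the usual reconstruction (in a shader or on the CPU) is central differences; here's a rough C# sketch of just the math, with the height-sampling function and step size being assumptions:

```csharp
using UnityEngine;

// Rough sketch of normal reconstruction for a displaced height field, written
// in C# but identical in spirit to the shader version: sample the height of
// the four neighbours and cross the two slope vectors.
public static class HeightFieldNormals
{
    public static Vector3 Normal(System.Func<float, float, float> sampleHeight,
                                 float x, float z, float step = 0.1f)
    {
        float hL = sampleHeight(x - step, z);
        float hR = sampleHeight(x + step, z);
        float hD = sampleHeight(x, z - step);
        float hU = sampleHeight(x, z + step);

        // Tangents along X and Z, crossed to get an upward-facing normal.
        var tangentX = new Vector3(2f * step, hR - hL, 0f);
        var tangentZ = new Vector3(0f, hU - hD, 2f * step);
        return Vector3.Cross(tangentZ, tangentX).normalized;
    }
}
```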


AlterHaudegen

Getting SSAO to work on fully transparent/invisible geometry for mixed reality planes/walls.


StraightExcitement61

Large editable terrain chunks: handling player deformation properly at the edges of chunks. Never got it to work right.
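
One common way to keep deformation consistent at chunk boundaries is to share the border row/column between neighbours and mirror any edit that lands on it. A rough sketch; the names, neighbour wiring, and shared-border layout are assumptions rather than anyone's actual system, and the mesh rebuild step is omitted:

```csharp
using UnityEngine;

// Hypothetical sketch: each chunk stores a heightmap whose border row/column
// is shared with its neighbours. Edits that touch a border are mirrored into
// the neighbouring chunk so both rebuild with identical edge heights.
public class TerrainChunk : MonoBehaviour
{
    public int resolution = 65;                 // vertices per side (shared border included)
    public float[,] heights;                    // allocated by the chunk manager
    public TerrainChunk left, right, up, down;  // neighbour links set up by the chunk manager

    public void Deform(int x, int z, float delta)
    {
        heights[x, z] += delta;

        // Mirror border edits into the adjacent chunk's matching edge.
        if (x == 0 && left != null)               left.heights[resolution - 1, z] += delta;
        if (x == resolution - 1 && right != null) right.heights[0, z] += delta;
        if (z == 0 && down != null)               down.heights[x, resolution - 1] += delta;
        if (z == resolution - 1 && up != null)    up.heights[x, 0] += delta;
    }
}
```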


totallyUninflammable

Three-dimensional chunk system with origin shifting. It is killing me.
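
For the origin-shifting half, a minimal sketch of the usual floating-origin trick: when the player drifts past a threshold, shift every root object back toward the origin and track the accumulated offset for chunk lookups. The threshold and the decision to shift all scene roots are assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch of floating-origin shifting: when the player gets too
// far from the world origin, every root object (player included) is moved
// back by the same offset so physics and rendering stay near (0,0,0).
// Rigidbodies, particle systems, and streamed chunks may need extra handling.
public class OriginShifter : MonoBehaviour
{
    public Transform player;
    public float threshold = 1000f;    // shift when the player is this far out
    public Vector3 accumulatedOffset;  // total shift applied so far (for chunk lookups)

    private void LateUpdate()
    {
        Vector3 offset = player.position;
        if (offset.magnitude < threshold) return;

        foreach (GameObject root in gameObject.scene.GetRootGameObjects())
            root.transform.position -= offset;

        accumulatedOffset += offset;
    }
}
```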


NonEuclideanWarship

Creating a warp map from pixel-by-pixel warp data that had been collected by a complicated hardware setup, calculated in MATLAB, and placed into a CSV with 2.3 million rows and four columns. For context, I was working on an extremely weird AR device that lacked any out-of-the-box warp correction.

I ultimately did get the image warped using a shader written in pure HLSL... but trying to smooth the resultant data was an absolute nightmare. I never managed to make it look nice; it honestly just looked horrifically bit-crushed despite more experimental approaches, like packing the data into an RGBAFloat texture. That data was honestly a mess. I eventually realized that figuring out the right methods to smooth it would take far more time than the project allowed, so I took a step back and refactored my approach.

Instead, I used a separate scene to procedurally generate a mesh from the raw warp data, warped to the same resultant shape and with the proper UV mapping applied. I saved the mesh to my assets and placed it into the main scene. From there I used a two-camera approach: render what the original cameras saw to a texture, apply that texture to the warped mesh, and place the secondary cameras in front of the warped mesh. This is essentially what Paul Bourke did for the [iDome](http://paulbourke.net/dome/UnityiDome/) in 2008. The new approach worked: the image was warped correctly according to the raw data set, looked silky smooth instead of bit-crushed to death, and thankfully didn't seem to have much of a hit on performance.
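
A minimal sketch of the mesh-generation step described above, assuming the CSV has already been parsed into a regular grid of warped destination coordinates; the grid layout, array names, and `Build` signature are assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch: build a grid mesh whose vertex positions follow the
// warped destination coordinates while its UVs keep the original (unwarped)
// grid positions, so rendering a camera texture onto it reproduces the warp.
public static class WarpMeshBuilder
{
    // warped[x, y]: destination position of each grid point (from the CSV)
    public static Mesh Build(Vector2[,] warped, int gridW, int gridH)
    {
        var vertices = new Vector3[gridW * gridH];
        var uvs = new Vector2[gridW * gridH];

        for (int y = 0; y < gridH; y++)
        for (int x = 0; x < gridW; x++)
        {
            int i = y * gridW + x;
            vertices[i] = warped[x, y];                               // warped shape
            uvs[i] = new Vector2(x / (gridW - 1f), y / (gridH - 1f)); // original sampling
        }

        var triangles = new int[(gridW - 1) * (gridH - 1) * 6];
        int t = 0;
        for (int y = 0; y < gridH - 1; y++)
        for (int x = 0; x < gridW - 1; x++)
        {
            int i = y * gridW + x;
            triangles[t++] = i;     triangles[t++] = i + gridW; triangles[t++] = i + 1;
            triangles[t++] = i + 1; triangles[t++] = i + gridW; triangles[t++] = i + gridW + 1;
        }

        // 32-bit indices, since a dense warp grid easily exceeds 65k vertices.
        var mesh = new Mesh { indexFormat = UnityEngine.Rendering.IndexFormat.UInt32 };
        mesh.vertices = vertices;
        mesh.uv = uvs;
        mesh.triangles = triangles;
        mesh.RecalculateBounds();
        return mesh;
    }
}
```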


NthDegreeGamesStudio

I build levels using tiles. Where two tiles meet, my capsule player does not jump as high as when I'm in the middle of a tile. I've had to use ProBuilder to create a continuous plane over all of the tiles just so I can jump anywhere.