[-] [email protected] 5 points 2 days ago

This is why I would suffer from crippling transporter anxiety like Barclay. Every time you get transported, you essentially die and are replaced with a copy of yourself. Would you notice? Probably not. But the thought of it would give me nightmares.

[-] [email protected] 85 points 1 month ago

"Due to the close proximity of the insects to each other, the whole swarm explodes into a mist of blood, and parts of elephant carcasses rain down all over the battlefield. Time for everyone to roll..."

[-] [email protected] 67 points 1 month ago

Even if this group shuts down completely, all this does is waste Twitter's money while also telling advertisers to stay the fuck away from ever doing any business on Twitter.

[-] [email protected] 82 points 1 month ago

"taxes on couches"

[-] [email protected] 63 points 7 months ago

Maybe the rumors about Putin having cancer are true, and this is what the scientists told him so that he doesn't accidentally fall out of a window.

[-] [email protected] 59 points 9 months ago* (last edited 9 months ago)

A lot of the implications for ray tracing are on the dev-side of things. It's a bit hard to explain without going into technical details.

Essentially, getting light to look "right" is very, very hard. To do it, devs employ a lot of different techniques. One of the older ones is baking the light on static objects: essentially pre-rendering where light goes and how it bounces. This has been done for a long time; even in Half-Life, the lights are baked for static geometry. So in a way, we have been using ray tracing in games for a long time. However, it isn't real-time ray tracing, as the information gets stored in lightmap textures, so there is no performance impact other than storing the texture in RAM/VRAM and drawing it together with the others.
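
To make the baking idea concrete, here is a hypothetical, deliberately tiny sketch (not any engine's actual baker): for each cell of a static 2D scene, we pre-compute direct light offline, so at runtime lighting is just a texture lookup.

```python
# Minimal sketch of offline lightmap baking for a static 2D grid scene.
# Each cell stores precomputed direct light from a fixed point light.

def bake_lightmap(width, height, light, blockers, samples=32):
    """Return a 2D list of 0..1 intensities (simple inverse-square
    falloff, zero where a blocker cell occludes the path to the light)."""
    lx, ly = light
    lightmap = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # March sample points from the cell toward the light.
            occluded = False
            for i in range(1, samples):
                t = i / samples
                sx = x + (lx - x) * t
                sy = y + (ly - y) * t
                if (round(sx), round(sy)) in blockers:
                    occluded = True
                    break
            if occluded:
                continue
            dist2 = (lx - x) ** 2 + (ly - y) ** 2
            lightmap[y][x] = 1.0 / (1.0 + dist2)  # simple falloff
    return lightmap

# "Bake" once; runtime lighting is then a free array lookup.
lm = bake_lightmap(8, 8, light=(0, 0), blockers={(2, 2)})
```

The expensive visibility tests all happen at bake time, which is exactly why the technique breaks as soon as the light or a blocker moves.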

The inherent problem with that technique is that it only really works for static geometry. If you move your light or any object in the scene, your lightmaps will no longer match. To solve this, there are mixed modes that use real-time lights, dynamic lightmaps, and other tricks. However, these are often subject to problems and/or the limitations of real-time lights: you can only use a limited number before taking a serious performance hit, especially if the lights cast shadows. Soft shadows, shadows across big areas, and very detailed shadows are also extremely hard to do without advanced tricks. And ambient occlusion and global illumination are not something you can simply attach to lights (there are screen-space GI and AO, but they don't look good in all circumstances and offer limited control; some engines also have other real-time GI techniques).

There is also the problem of baked light affecting dynamic objects, such as characters. This has been solved by baking so-called "light probes": invisible spheres that store the light data, where the closest data then gets applied to characters and other dynamic objects. This again has some problems, as it's hard to apply multiple light probes to the same object, so lighting might be off. Light direction is also not accurate, which makes normal maps look very flat under this light, and local shadows do not work with light probes.

The same is done for reflections using static reflection probes. These are essentially 360° "screenshots" storing the reflection at that point in space. However, they cost disk space/RAM/VRAM, and they hold no information about moving objects (that's why you sometimes can't see yourself in a mirror in games). The reflections can also look out of place or distorted when the reflection probe is too far from the reflecting surface (and again, since probes cost RAM and VRAM, you don't want to place one in front of every single reflective surface). It takes a lot of time to find the right balance.

For the rest, screen-space reflections are usually used, as any other real-time reflection is extremely costly: you essentially render the whole scene again for each local reflection. Screen-space reflection works very well for things like reflective floors, but you will quickly see its downsides on mirror-like surfaces, as it lacks information that is not on screen. Some games, like Hitman, mix these techniques extremely well.
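
The light-probe idea described above can be sketched in a few lines (hypothetical and simplified): baked probes store light at fixed points, and a dynamic object simply takes the data of the nearest probe, which is exactly why lighting can look "off" when the object sits between probes.

```python
# Nearest-probe lookup: no blending, just the closest baked sample.

def nearest_probe(position, probes):
    """probes: list of (position, color) pairs; returns the color of
    the probe closest to `position` (by squared distance)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(probes, key=lambda p: dist2(position, p[0]))[1]

probes = [
    ((0.0, 0.0, 0.0), (1.0, 0.9, 0.8)),   # warm probe near a lamp
    ((10.0, 0.0, 0.0), (0.2, 0.2, 0.4)),  # cool probe in shadow
]
character_pos = (2.0, 0.0, 0.0)
color = nearest_probe(character_pos, probes)  # picks the warm probe
```

A character standing at `(5, 0, 0)` would snap from one probe's color to the other's as it crosses the midpoint, which is the popping artifact real engines smooth over with interpolation.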

Coming back to lighting, there are now better techniques used, for example, by Unreal and some other engines (and now Unity, experimentally). The light gets stored in more predictable data structures, such as 3D textures. This way, you can store the direction of all light in each cell, and the light then gets applied to objects passing through those cells. This looks pretty good, and the runtime cost is fairly low, but the storage cost of such textures is a tradeoff between texture resolution and fidelity. These textures cost a lot of VRAM, and without advanced techniques and tricks they have their own limits (e.g. on scene size). They also take a lot of time to create each time you change the scene, and they don't eliminate all the problems mentioned above, like reflections, moving lights, etc.
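
A minimal sketch of that light-volume idea (assumed and heavily simplified; real engines use dense 3D textures with interpolation, not a dict): a 3D grid where each cell stores a dominant light direction and color, and a moving object samples the cell it currently occupies.

```python
# Baked "light volume": per-cell light direction and color.

def make_volume(size):
    """A size^3 grid; each cell holds (direction, color)."""
    default = ((0.0, -1.0, 0.0), (0.0, 0.0, 0.0))  # downward, black
    return {
        (x, y, z): default
        for x in range(size) for y in range(size) for z in range(size)
    }

def cell_of(position, cell_size=1.0):
    return tuple(int(c // cell_size) for c in position)

def sample_volume(volume, position):
    """Objects passing through a cell pick up its stored light."""
    return volume[cell_of(position)]

volume = make_volume(4)
# Bake a bright directional cell, e.g. under a window at cell (1, 2, 1):
volume[(1, 2, 1)] = ((0.7, -0.7, 0.0), (1.0, 1.0, 0.9))

direction, color = sample_volume(volume, (1.5, 2.9, 1.2))
```

The VRAM tradeoff mentioned above falls straight out of this structure: halving the cell size multiplies the number of cells (and the bake time) by eight.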

Specifically, there is the problem of character lighting itself. Using light probes on characters usually looks pretty bad, as it removes a lot of the detail of advanced skin shaders. Even with the techniques mentioned above, character lighting is still extremely hard to do. There are also some other problems, like ambient shadow in already shadowed areas, and balancing character lighting against scene lighting.

For that reason, most AAA games use separate light rigs for characters: essentially floating lights that ONLY affect the character and move with them. When the mixing with the scene lights is done right, the rig adapts to the current situation in terms of light direction, color, and intensity. In most AAA games, you can often spot situations where rim light comes from a direction with no actual light source. This way, however, the devs and artists have full control over lighting the characters, essentially like a real movie production, but without the limitations of the real world.
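
A hypothetical sketch of such a character light rig (names and blend factor are made up for illustration): a key light that follows the character, affects only the character, and blends its direction and color toward the current dominant scene light.

```python
# Character light rig: artist-authored key light, partially adapted
# to the scene's dominant light so the character still "sits" in it.

def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def update_rig(character_pos, scene_light_dir, scene_light_color,
               blend=0.6, key_offset=(1.0, 2.0, 1.0)):
    """Return (position, direction, color) of the rig's key light:
    positioned relative to the character, blended toward the scene."""
    art_dir = (-0.5, -0.8, -0.3)   # artist's "ideal" portrait angle
    art_color = (1.0, 1.0, 1.0)    # neutral fill
    position = tuple(c + o for c, o in zip(character_pos, key_offset))
    direction = lerp(art_dir, scene_light_dir, blend)
    color = lerp(art_color, scene_light_color, blend)
    return position, direction, color

pos, direction, color = update_rig(
    character_pos=(5.0, 0.0, 5.0),
    scene_light_dir=(0.0, -1.0, 0.0),
    scene_light_color=(1.0, 0.6, 0.3),  # warm sunset
)
```

With `blend` below 1.0, the character is never lit purely by the scene, which is how a rig can produce the "impossible" rim light mentioned above while still matching the mood.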

Now, ray tracing as you know it right now is not quite there yet, but eventually it is the solution to a lot of the problems mentioned above. Things like polygon density, light count, global illumination, ambient occlusion, light direction, reflections, and much more are simply "there" for you to use. This doesn't mean it will automatically make everything look great, but given the overwhelming number of tricks current-gen games need to make things look good, it opens a whole new world of possibilities.
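
To illustrate why shadows and occlusion simply "fall out" of ray tracing rather than needing separate tricks, here is a deliberately tiny, hypothetical tracer: one sphere, one point light, and a shadow ray per shaded point. (Real ray tracers add acceleration structures, materials, denoising, and much more.)

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t, or None.
    `direction` is assumed normalized (so the quadratic's a == 1)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def shade(point, light, sphere_center, sphere_radius):
    """Cast a shadow ray toward the light; 0 if blocked, else falloff."""
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    blocked = hit_sphere(point, direction, sphere_center, sphere_radius)
    if blocked is not None and blocked < dist:
        return 0.0  # in shadow: no shadow maps, just another ray
    return 1.0 / (1.0 + dist * dist)

# A point directly below the sphere is shadowed by it:
in_shadow = shade((0.0, -2.0, 0.0), (0.0, 5.0, 0.0),
                  (0.0, 0.0, 0.0), 1.0)  # → 0.0
```

The shadow here costs exactly one extra ray, and the same ray-casting primitive gives you reflections and occlusion too; that uniformity is the appeal, and also why the per-frame cost is so high.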

Also, while this will not directly influence the final game, it will eventually simplify things for devs, so that more time can be invested into other things.

At its current stage, ray tracing is more of a gimmick, because devs will still focus most resources on the established ways of doing lighting, since most people don't have cards with sufficient ray-tracing capabilities. So for the moment, I agree that the performance hit is not worth it. Eventually, however, it might become the default way to render games. While we are not quite there in terms of performance, I think things might become a lot more consistent and predictable with ray tracing over time.

[-] [email protected] 75 points 10 months ago* (last edited 10 months ago)

Though it is correct that they would claim they did.

[-] [email protected] 95 points 11 months ago

About as quickly as alternative platforms can replace them because of the incredible market advantage.

[-] [email protected] 70 points 11 months ago

Covid-19 has made me more cynical than ever. It has shown that people would rather die than accept reality. And compared to climate change, the effort needed to protect against Covid was minuscule on an individual level. But still, too many people couldn't be bothered.

[-] [email protected] 74 points 1 year ago* (last edited 1 year ago)

They have a clause in the announcement that if two games are sufficiently similar in content, they are counted as the same game.

How are they going to determine that, you ask? Probably the same way they determine install counts: pulling it out of their ass.

[-] [email protected] 116 points 1 year ago

Pay—the reason most humans work—remains a major motivator today. When consulting firm McKinsey earlier this year asked workers why they took a new job, nearly all groups gave the same No. 1 reason: More pay.

Getting a new job is usually the easiest way to get a raise, with pay for job switchers consistently rising faster than for those who keep the same job.

Correct me if I'm wrong, but I have the very slight suspicion that it's not actually the workers who are to blame for not staying at a company their whole life.

[-] [email protected] 50 points 1 year ago* (last edited 1 year ago)

Maybe Jesus ran a scam with Judas on the Romans.

"Hey Jesus, how about we get that gold for your head. I know this dude that looks somewhat like you. Let's give him to the Romans."

Then, three days later, someone not in on the plan found Jesus, so they had to pretend he had risen from the dead.

They went to the grave to get rid of the body, so that their story would be more believable.


schema

joined 1 year ago