Techy Gubbins
  • Can't remember which thread the AMD Mantle chat was in, so thought I'd start a new one for general tech blather that doesn't really fit anywhere else.

    A rendering guy's general musings on Mantle:
    http://c0de517e.blogspot.co.uk/2013/12/on-mantle.html
beano
    Nothing to add other than I like the hex-not-hex-really-code (pun not intended) blog url.
    "Better than a tech demo. But mostly a tech demo for now. Exactly what we expected, crashes less and less. No multiplayer."
    - BnB NMS review, PS4, PC
dynamiteReady
    http://www.theverge.com/2013/12/3/5171942/microsoft-xbox-one-usage-stats-zombies-hours

    The interesting thing here is not the Verge's sensationalist sales figure bullshit, but the fact that Microsoft are confident in being able to disclose ballpark figures on rather precise in-game actions...

    Shit might well be way more precise than this "teaser" lets on...

    Next level stuff, as far as UI testing and R&D plans are concerned...
    "I didn't get it. BUUUUUUUUUUUT, you fucking do your thing." - Roujin
    Ninty Code: SW-7904-0771-0996
  • Analytics are prevalent these days - mostly down to wanting to max out IAPs it seems.
    I haven't seen the TRCs, but I wouldn't be surprised if next gen console achievements involve some form of stat/data upload to the achievement server - giving a handy centralised location to farm off stats like "X billion zombies killed", "X million miles driven", "X thousand hats sold at £Y" etc.
dynamiteReady
    djchump wrote:
    ...IAPs ...TRCs....

    Wait... TLAs... 

    WND, CQU*...

    *Is that, 'Will Need Definition, Can't Quite Understand', or 'Women Need Dick, Could Quell Udders'? I can't remember...
    "I didn't get it. BUUUUUUUUUUUT, you fucking do your thing." - Roujin
    Ninty Code: SW-7904-0771-0996
  • In app purchases
    Terms and conditions? Could be a typo by chump
    I am a FREE. I am not MAN. A NUMBER.
  • Ooh, a question.

So, y'all know about that "soap opera effect" motion interpolation thing they whack in tellies now that bumps up the framerate and makes everything look weirdly higher-fidelity - has anyone ever tested this with videogames nominally locked at 30fps?
Sony TRC / Microsoft TCR / Nintendo lotcheck = the mandated list of technical tests and requirements that your game has to pass/comply with to pass certification for that platform (and thus get the go-ahead for gold master). Depending on how cosy you are with the platform holders, or how good your producer is, you may be able to get some waived, but mostly it's pretty basic stuff like "don't fuck up the save game", "render all text within this safe zone to account for different TV setups", etc.

    More info in compliance section here if anyone wants to know more about game QA: http://en.wikipedia.org/wiki/Game_testing


Re: motion interpolation - off the top of my head I can't think of any games that have gone for that internally, as it introduces an extra frame's worth of lag. Presumably people with swanky TVs have the option to get the TV to do it for them anyhow. The general idea behind interpolation (i.e. using data from the previous frame) does get used a hell of a lot though - any rendering technique that mentions "temporal" will be using data from the previous frame(s) to converge on a better visual result. E.g. many SSAO implementations (since GoW2, I think) will iteratively improve the ambient occlusion over several frames while the camera isn't moving, so you see the grounding shadows improve in quality over several frames if you're looking at a static scene, but then reset when you move the camera. Same for several of the newer antialiasing techniques (SMAA) and many motion blur effects.
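That temporal accumulation trick boils down to an exponential moving average per pixel. A minimal sketch, with all names invented for illustration (no real engine code):

```python
import random

# Toy version of temporal accumulation: a noisy per-pixel result
# (e.g. one AO sample per frame) converges over several static frames
# via an exponential moving average, and resets when the camera moves.

def accumulate(history, new_sample, camera_moved, blend=0.1):
    """Blend this frame's noisy sample into the accumulated history."""
    if camera_moved or history is None:
        return new_sample              # reset: history is no longer valid
    return history * (1.0 - blend) + new_sample * blend

# Usage: noisy samples around a true value of 0.5 converge over frames.
random.seed(0)
true_ao, history = 0.5, None
for frame in range(200):
    sample = true_ao + random.uniform(-0.2, 0.2)   # one noisy sample/frame
    history = accumulate(history, sample, camera_moved=(frame == 0))
# 'history' now sits much closer to true_ao than any individual sample
```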
Escape
    Brooks wrote:
    Ooh, a question.

    So, y'all know about that "soap opera effect" motion interpolation thing they whack in tellies now that bumps up the framerate and makes everything look weirdly higher-fidelity - has anyone ever tested this with videogames nominally locked at 30fps?

    Smoother, but less responsive.
  • Aye, at best all the frames will be displayed one frame late (at the original framerate); you can't interpolate without having the frames on both sides.
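A toy illustration of why that adds latency: the in-between frame can only be synthesized once the later source frame exists, so the whole stream comes out one source frame late. Purely illustrative numbers:

```python
# Toy timeline for TV-style motion interpolation: the midpoint between
# source frames N and N+1 can only be generated once N+1 has arrived,
# so every frame is displayed one source frame late.

def interpolate_stream(frames):
    """Double the framerate of a list of frame values by midpoint blending."""
    out = []
    for i in range(1, len(frames)):
        prev, cur = frames[i - 1], frames[i]
        mid = (prev + cur) / 2.0        # synthesized in-between frame
        # both of these can only be emitted after 'cur' exists -> 1 frame late
        out.append(prev)
        out.append(mid)
    out.append(frames[-1])
    return out

source = [0, 10, 20, 30]                # e.g. an object's x position at 30fps
doubled = interpolate_stream(source)    # 60fps-equivalent stream
# doubled == [0, 5.0, 10, 15.0, 20, 25.0, 30]
```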
Escape
    The transitions themselves don't worsen input lag; their processing cost does. Unless we're talking about interpolating a 60fps game, which only accepts that many inputs per second anyway.

    I thought about buying one for GTA IV once, because its frame-rate was so poor that the trade-off didn't seem as bad.
  • I'm glad this thread exists but have nothing to add thus far... well, I don't know, I do have things to fire at chump regarding personal interests (some very ambitious intentions regarding volcano reproduction/simulation). Is that in the remit of this thread?
Escape
    'Tis if I understand it and you employ at least one cccaaawwwssshhh!!!.
  • I can't understand that last bit - is that irony?
Escape
    Can't build a game if you can't design sound, son.
  • The volcano sounds will be super brutal, don't you worry!
  • djchump wrote:
    Analytics are prevalent these days - mostly down to wanting to max out IAPs it seems.

    Analytics are a huge part of balancing and refining a game once it's live, not just to boost IAPs. In fact I'd argue game balancing / tweaking is the key purpose, but I guess that depends what department you're working out of.

    Analytics allow you to spot difficulty spikes / drop off points early and address them, as well as giving info on player interaction with features etc, all of which let you build on what you know is working, and either improve or bin what isn't.

    You can also use analytics to entice sponsors / investors, if you're using ads, "hey look X feature gets Y amount of users per Z, if you advertise there you'll reach XYZ people" or something.

    There are loads of uses, saying it's just for IAPs is kinda lumping it in with the IAP / no-IAP argument, or at least that's the end of the stick I got.
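The difficulty-spike detection described above can be sketched very simply: take counts of players reaching each level and flag any transition that loses an unusually large share. All numbers here are made up for illustration:

```python
# Sketch of a drop-off funnel analysis: given counts of players reaching
# each level, flag level transitions where an unusually large share of
# players quit (a likely difficulty spike or drop-off point).

def drop_off_rates(players_per_level):
    """Fraction of players lost between each consecutive pair of levels."""
    rates = []
    for reached, survived in zip(players_per_level, players_per_level[1:]):
        rates.append(1.0 - survived / reached)
    return rates

def flag_spikes(rates, threshold=0.25):
    """Indices of level transitions losing more than `threshold` of players."""
    return [i for i, r in enumerate(rates) if r > threshold]

# Usage: 1000 players start; the level-3 -> level-4 step loses 40% of them.
funnel = [1000, 900, 850, 510, 480]
rates = drop_off_rates(funnel)
spikes = flag_spikes(rates)   # -> [2], the 850 -> 510 transition
```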
  • I think chump may have been a little broad and just fooling around, but the point was: craploads of player data is collated to be used in many ways, and that data is a goldmine.
  • nick_md wrote:
    djchump wrote:
    Analytics are prevalent these days - mostly down to wanting to max out IAPs it seems.
    Analytics are a huge part of balancing and refining a game once it's live, not just to boost IAPs. In fact I'd argue game balancing / tweaking is the key purpose, but I guess that depends what department you're working out of. Analytics allow you to spot difficulty spikes / drop off points early and address them, as well as giving info on player interaction with features etc, all of which let you build on what you know is working, and either improve or bin what isn't. You can also use analytics to entice sponsors / investors, if you're using ads, "hey look X feature gets Y amount of users per Z, if you advertise there you'll reach XYZ people" or something. There are loads of uses, saying it's just for IAPs is kinda lumping it in with the IAP / no-IAP argument, or at least that's the end of the stick I got.
    Very good points - aye, I was being a tad too cynical and dismissive; kickback against the somewhat prevalent drive to add IAPs and monetisation that I see creeping into all kinds of games that I'd rather not see them in.

    But yeah, as you say, analytics is a huge boon and vital to any kind of balancing - especially of multiplayer games. I'd totally forgotten about the old Halo heatmaps, stats and the like, so in many ways analytics and gameplay data are really quite interesting in and of themselves.
  • LazyGunn wrote:
    I'm glad this thread exists but nothing to add thus far, well, i dont know, i do have things to fire at chump regarding personal interests, (some very ambitious intentions regarding volcano reproduction/simulation)  is that in the remit of this thread?
    Sure, feel free to fire away with any techy stuff - if it's out of my depth I'm sure others like muzzy can help out as well ;-)
  • Right you are then! The challenge is thus:



    To create a sustainable hour-or-two-long analogue or simulation of a volcano at that stage and scale. And whether I should be using particles, volumetric reproduction or a mix, and whether I'm almost certainly looking at implementing it in some form on the GPU. There are several great raymarching examples on Shadertoy one might use to create a volume, but it's going to have to be very fine, and I cannot see it working on its own. Would particles belie their billboarded origin as soon as you get closer to the volcano and navigate around it? Dilemma!
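For reference, the core of those Shadertoy-style raymarchers is just stepping a ray through a density function and accumulating opacity front to back. A CPU sketch with a toy sphere density standing in for a noise-driven plume (a real version runs this per pixel in a fragment shader):

```python
import math

# Minimal raymarch: step a ray through a density field and accumulate
# opacity front-to-back. The soft sphere here is a stand-in for a
# noise-driven smoke plume density function.

def density(x, y, z):
    """Toy density field: a soft sphere of smoke centred at the origin."""
    d = math.sqrt(x * x + y * y + z * z)
    return max(0.0, 1.0 - d)            # 1 at the centre, 0 beyond radius 1

def march(origin, direction, steps=64, step_size=0.05):
    """Accumulate opacity along the ray, front to back."""
    alpha = 0.0
    x, y, z = origin
    dx, dy, dz = direction
    for _ in range(steps):
        a_sample = density(x, y, z) * step_size      # absorption this step
        alpha += (1.0 - alpha) * a_sample            # front-to-back blend
        x, y, z = x + dx * step_size, y + dy * step_size, z + dz * step_size
    return alpha

# A ray through the middle picks up far more opacity than a grazing one.
centre_hit = march((0.0, 0.0, -2.0), (0.0, 0.0, 1.0))
graze_miss = march((0.0, 2.0, -2.0), (0.0, 0.0, 1.0))
```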
acemuzzy
    My answer is 'yes'
  • To which bit!

    Anyways you're the graphics guy chump, muzzy's the maths unit
  • New techy comment time: I'm considering moving all my item modelling to MODO exclusively - it's swish. Never thought it was on Windows! I don't know if it would give you as clean a mesh as, say, ZBrush, but the UI actually looks possible to understand, and it has retopo tools, the works.
  • LazyGunn wrote:
    Right you are then! The challenge is thus: to create a sustainable hour-or-two-long analogue or simulation of a volcano at that stage and scale. And whether I should be using particles, volumetric reproduction or a mix, and whether I'm almost certainly looking at implementing it in some form on the GPU. There are several great raymarching examples on Shadertoy one might use to create a volume, but it's going to have to be very fine, and I cannot see it working on its own. Would particles belie their billboarded origin as soon as you get closer to the volcano and navigate around it? Dilemma!
    Wow, okay, some initial thoughts and I'll have a think about it over the afternoon as well:
    - Most important rendering feature being volumetric shadowing: given it's so dense there's not much light transmission, volumetric lighting maybe isn't all that important, but nailing the shadowing is gonna be pretty vital to the visual sell of the form and scale.
    - Probably almost as important though is the roiling motion - you can get away with all kinds of fudges and shortcuts in the rendering if the motion is believable; it hides a multitude of sins. So how you control/animate/simulate the movement is pretty important. Precanned mesh anim won't cut it (not enough variation), but then again fluid sim is a big ask if you don't have a black-box plugin/renderer to handle the sim for you.
    - Actually, scratch that first comment about the shadowing: I think nailing the motion is THE most important part; everything else can follow on from whatever solution fits best for getting the motion believable.
    - options: 
    1) Particle system: but you're looking at a fuckload of particles, so overdraw is gonna be a killer (especially if you aim to have it on mobile/tablet). Safest/most controllable is to render a low number of fairly opaque particles to an offscreen render target (that has the scene depth buffer) then composite it back into the scene - that gives you the option of a mixed-res render if the expense is high, an offscreen process to add noise/extract normals etc., and you can add further noise in the upscale composite pass. The Battlefield distant-smoke writeup is worth reading, but I'm not sure how well the technique will hold up at the huge scale in the vid, or up close. With an offscreen render you can also get funky with depth writes and go for some kind of volume effect (the DICE translucency work is the obvious one here).
    If using particles as smoke, remember to splay the quad normals out to get the most from the normal map - but you may want to forgo pre-authored normal maps altogether and reconstruct normals from the offscreen render target for that fully dynamic look (which will also cut down on any obvious repeated textures or tiling).
    2) Volumetric - I dunno much about this kinda stuff as I'm increasingly a bit oldskool DX9 these days; I have no experience of full DX11 pipelines. The old vertex texture displacement method could help you get that "noisy, roughly spherical dust sources roiling upwards and outward from each other" look - getting the octaves of Perlin (or whichever noise func you choose) right for that fractal look will help sell it at different scales/viewing distances; have higher-frequency octaves of the noise func blend in as the camera gets closer to add fine-grain detail. There's lots of research about this kind of volume rendering offline (e.g. a SIGGRAPH 2013 paper on clouds for that DreamWorks film - Puss in Boots? Or maybe Jack and the Beanstalk, can't remember) but not all that much for realtime. The write-up for that Unity anim demo they did had a bit about displacement volume rendering for the explosions IIRC - could be repurposed.
    3) Overall sim - a 2D axis-rotated fluid sim will be a bit too mushroom-cloudy, where you need asymmetry for the motion, so it may have to be coarse-grain 3D but with lots of layered, animated noise to add the texture. Instead of fluid sim, a hand-authored curl noise texture (see the Brutal Legend vfx writeup) will give you more precise art control over the movement - it will have to be a 3D texture, but will let you easily add occluders and shape the output.

    Decisions: 
    - Particle vs volumetric: particle systems normally give you good control over spawn, movement, lifespan and texturing, but make it difficult to add depth and volume effects such as the necessary lighting and shadowing. Volumetric: much easier to get the shading, but more difficult to control the motion and get the texturing.
    Best of both? = render spawning spherical(ish) 3D meshes as a volumetric particle system to an offscreen texture/render target - untreated it will give a fast depth-only shadowmap; then for the colour render, treat it by adding noise and vertex displacement, rendering out depth as well, because the density of those clouds basically makes them solid.
    - Leave it easy to swap the noise func, as you'll probably want to try several different types; it'll be fundamental to the final shape and motion.

    If I think of anything else I'll add more :-)
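The "octaves of noise" idea above fits in a few lines. This sketch uses a cheap hash-based value noise rather than real Perlin noise, purely for illustration; a GPU version would do the same sum in a shader:

```python
import math

# Classic fBm (fractal Brownian motion): sum several octaves of a noise
# function, each at double the frequency and half the amplitude. The
# hash-based 1D value noise here is a toy stand-in for Perlin/simplex.

def value_noise_1d(x):
    """Smoothly interpolated pseudo-random values at integer lattice points."""
    def hash01(n):
        return (math.sin(n * 127.1) * 43758.5453) % 1.0
    i, f = math.floor(x), x - math.floor(x)
    t = f * f * (3.0 - 2.0 * f)          # smoothstep fade curve
    return hash01(i) * (1.0 - t) + hash01(i + 1) * t

def fbm(x, octaves=4):
    """Fractal sum: each octave doubles frequency, halves amplitude."""
    total, amplitude, frequency = 0.0, 0.5, 1.0
    for _ in range(octaves):
        total += amplitude * value_noise_1d(x * frequency)
        amplitude *= 0.5
        frequency *= 2.0
    return total              # stays in [0, 0.9375) for 4 octaves

samples = [fbm(x * 0.1) for x in range(100)]
```

Blending in the higher-frequency octaves as the camera approaches, as suggested above, just means increasing `octaves` (or fading in their amplitudes) with proximity.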
  • Re: particle systems and vfx: rather than throwing more low-alpha particles at an effect for mass and density, going with fewer, more opaque particles gives more control over the final texturing, meaning less of a fudgy, blurry final look and more authored control. See the Naughty Dog writeup of the fire fx in Uncharted 3 (and TLOU as well, IIRC) for how effective that can be - rather than throwing more fire particles at the screen, they pretty much went with single-layer mesh sections and went to town on the animation of the fire texturing (via an indirect/warp map) - in large part because the PS3 sucks at alpha ROP so they couldn't afford lots of alpha, but necessity is the mother of invention!
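As a back-of-envelope illustration of that overdraw trade-off (all numbers invented): total fill cost scales with particle count times pixels covered per particle, so fewer, more opaque particles cover the same screen area far more cheaply than a cloud of low-alpha ones.

```python
# Toy overdraw arithmetic: blended pixel writes are what the ROPs pay
# for, so total fill cost ~ particle count x pixels covered per particle.

def fill_cost(num_particles, pixels_per_particle):
    """Total blended pixel writes for a particle effect."""
    return num_particles * pixels_per_particle

many_faint = fill_cost(num_particles=2000, pixels_per_particle=10_000)
few_opaque = fill_cost(num_particles=100, pixels_per_particle=10_000)
# Same screen coverage, but the low-alpha version costs 20x the blending.
```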
  • You guys should do this on a Raspberry Pi, or something even less powerful. Get a B&B demoscene going. :P
  • Cheers for the insight, chump! I'll give a response to the points made.

    Well, the most relieving one you might find is that it's going to be all DirectX 11. I'm not making it with profit in mind, but I am out to impress, my fiancée co-authoring, and if she keeps to that then it's going to be very exciting, because she's a very clever woman who has already had ideas that blew my mind. In terms of narrative I can't help but think this thing is going to be something genuinely original - hence firing my ambition/drive.

    Well - particles: I had a good long look at every explosion going in CoD Ghosts the other day, and I'm going to repeat the process with BF4, probably tonight; I really want to get inside their approaches to that, including any papers that might be available. To fill in: when you see the volcano initially it's 250-300km away; I'll say it's visible because the middle of Iceland is very flat save for the volcanoes pitting its surface. A reasonable LOD approach would be to use particles at a distance, but to work they have to blend seamlessly with things up close.

    I was at my wits' end, in fact, about how to do the volcano up close, but today threw a few surprises at me. Firstly a guy making something for Unity that could use images and shapes for emitters; then a guy whose work I'm quite familiar with was presenting a demo of how he imported FumeFX information into Unity. Not the flames, obviously, but he could get cubes to follow the simulation. Suddenly I'm thinking WHY NOT FLAMES. Which I guess is where I poke people again, because I need a GPU solution simply so things run quickly enough. I can post the video that started all this if you like: it's a GPU-calculated ocean surface with incredible foam effects (3 or 4 additional FFT lumps sent to the compute shader). Not my work, but I put it in an arty setting and man, it looked nice. That would describe my preoccupation with DX11.

    So yes! The flames: again the GPU comes into action. I bought a fluid sim months ago (it was an entry to a Unity DX11 comp) and only played with it today! Mind boggled; it boggled quite a lot. It's more a flame/smoke thing at the moment, but I guess I can illustrate where I'm going with it, with these:

    This is my fluid simulation farting about doing nothing useful, but it can be very flexible:

    [screenshot: 2013-12-14 15.59.13] The guy who got the simulation data in was just doing something like the wind-tunnel demo in an old FumeFX video.

    But what if you can combine them, especially given, say, that old Nvidia smoke-in-a-box video?



    If it's feasible to combine them, you have realtime FumeFX! Which could possibly solve the problem of the main plume where the smoke has lost its form, although the scale is still a bit huge.

    The actual billowing dust clouds being propagated are a whole different challenge, but then I found an unexpected demo in Fluidity's sample scenes. It looks like this:

    https://dl.dropboxusercontent.com/u/12598787/Screenshot 2013-12-14 16.19.02.png

    Nothing special to decent graphics programmers, but that's the first time I've feasibly gotten close to a raymarched volume that I could mess around with. Is it possible to propagate and light that volume (or shadow it) appropriately to recreate those billowing clouds, and what kind of texture would I need to drive it? There's a GPU-powered noise library for Unity that I'm trying to buy ASAP, not just for ocean things but now for feasibly creating cloud layers (although in a funny coup I ended up with quite a pretty cloud solution); if I can turn it into something to help make billowing, undispersed smoke movement then that would be rad. I'm just hoping it can be approximated by some form of noise in the library (it can generate both 2D and 3D noise).

    It's a lot to think about, but I really, really want that in, and I want it to be the most impressive volcano anyone's ever seen, again especially when they've just flown over several GB worth of terrain data to get there (which is supposed to be a bit of a thrill in itself).

    Someone do this for me
  • Worth adding that the guy who did the fluid sim did a great pyroclastic effect that seemed useful; he's currently a games dev in Scotland, worked for Rockstar for a bit? His site is www.dunnalex.com

    Secretly hoping he releases his volumetric light thing to the Unity world, and that someone buys me his realtime fragmentation thing.
  • Check the Unity "Butterfly Effect" animation they did for Unity 4.0 - the making-of video has a tiny bit about their explosion effect. More in the writeup from Simon Green in this (page 50 onwards): https://developer.nvidia.com/sites/default/files/akamai/gamedev/files/gdc12/GDC2012_Mastering_DirectX11_with_Unity.pdf

    Seems a shoo-in for the smokestack - it should LOD reasonably well, too. The texture driving it seems to be pretty bog-standard 3D Perlin - if you have GPU to spare you could go the Shadertoy raymarch method and write your own noise func (fBm) for more control and dynamic changes, or at the very least for prototyping different noise funcs that you can then render out to a 3D control texture.
    If it can render depth, you can use it to shadow, so that shouldn't be a concern. You'll have to recalc/extract the normals as well for the lighting (hence it may be easier to render to an offscreen RT, then process and comp into the scene), but that shouldn't be too hard either.

    I'll dig up the Battlefield paper on their (IIRC) distant skybox smoke effects - but if you go that route, it's gonna be hard to make it look nice right up close, and you don't really want two different rendering methods for the stack, as you'll never be able to get 'em to look the same (and you won't want the hero effect to LOD-pop on approach).

    So, yeah, if you want to get clever, use your fluid sim to drive a low-poly mesh for the overall shape and movement (keep the 3D vector field handy for passing to any other particles in the scene so their movement also matches?). Or spawn your own 3D spheres, and move and scale them using the fluid sim vector field.
    Add pyroclastic noise to whatever meshes you have there.
    Layer with some extra particle effects - rock chunks flying out etc. For the big sell, spawn falling particles from the upper regions of the smokestack mesh and drop those downwards through the turbulence, so the stack isn't too clean and self-contained. Having stuff/particles coming off it will sell it as a physical thing that's embedded in the scene. Shadow the stack and any particles by the shape of the stack itself (so make sure you can have a depth-only render pass of it).
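A minimal sketch of that sphere-advection idea, with an analytic swirl-plus-updraft field standing in for real fluid-sim output (everything here is invented for illustration):

```python
# Spawn smoke "spheres" and advect them through a velocity field each
# frame. The analytic field below is a stand-in for sampling the vector
# field a fluid sim would output.

def velocity(x, y, z):
    """Swirl around the vertical axis plus a constant updraft (y is up)."""
    swirl = 0.5
    return (-z * swirl, 1.0, x * swirl)          # (vx, vy, vz)

def advect(particles, dt=0.1, grow=1.02):
    """Euler-step each (x, y, z, radius) particle through the field."""
    out = []
    for x, y, z, r in particles:
        vx, vy, vz = velocity(x, y, z)
        # move with the flow and expand slightly, like a rising puff
        out.append((x + vx * dt, y + vy * dt, z + vz * dt, r * grow))
    return out

# Usage: two particles spawned near the vent rise, orbit and expand.
cloud = [(0.5, 0.0, 0.0, 0.1), (-0.5, 0.0, 0.0, 0.1)]
for _ in range(50):
    cloud = advect(cloud)
```

Shadowing and pyroclastic surface noise would then be layered on top of these spheres, as described above.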
  • You just broke my head, but it sounds like good stuff to learn! The only problems I had with using the pyroclastic effect were whether it would propagate properly and be sustainable for the hour-or-so period you'll be able to see it, and whether these types of smoke stack would keep rising.

    Making a mesh out of the fluid sim sounds beyond me, but I could make the fluid sim quite large and dense and have some form of attractor pull the whole thing in a direction to recreate what it actually did, which was drift into the jet stream.

    Regarding noise generation, I think I could well have that wrapped up before the end of the night. There's an amazing noise-gen lib for Unity which costs 50 quid or so, so I need to get the moolah, but it has lots and lots of noise methods to play with and animate and so on; very useful not just for the fume formation but cloud formation in general (take iq's now-famous raymarching Shadertoy pretties), and also for driving ocean waves, which is becoming another preoccupation of mine.

    If you have any interest in DX11 and compute stuff, I'd like you to have a look at the ocean. Not really my work - I just 'artified' it; it was someone else's translation of Eric Bruneton's work towards Proland, which, if you haven't seen it, you really should, especially what with the guy who translated my ocean thing now deciding to port the whole thing over to Unity.
