{"id":38,"date":"2026-03-19T10:55:58","date_gmt":"2026-03-19T10:55:58","guid":{"rendered":"https:\/\/marnok.com\/wp\/?p=38"},"modified":"2026-03-23T09:12:42","modified_gmt":"2026-03-23T09:12:42","slug":"dlss5-and-my-old-daydreams","status":"publish","type":"post","link":"https:\/\/marnok.com\/wp\/blog\/2026\/03\/19\/dlss5-and-my-old-daydreams\/","title":{"rendered":"DLSS5 and my old Daydreams"},"content":{"rendered":"\n<p class=\"is-style-text-subtitle is-style-text-subtitle--1\">What I wanted: ENB++ with genuine style. What we got : Instagram Filters.<\/p>\n\n\n\n<p class=\"is-style-default\"><br>The DLSS5 demo has tainted my hope for the future. My personal dream and vision of AI-enhanced rendering of games was dearly held and a quite different to what we see.<\/p>\n\n\n\n<p><a href=\"https:\/\/www.forbes.com\/sites\/danidiplacido\/2026\/03\/19\/gamers-rebel-against-nvidias-dlss-5-ai-slop-filter\/\" data-type=\"link\" data-id=\"https:\/\/www.forbes.com\/sites\/danidiplacido\/2026\/03\/19\/gamers-rebel-against-nvidias-dlss-5-ai-slop-filter\/\">Artists and gamers who&#8217;ve seen the demo share the same concern<\/a>, that highly capable diffusion models tend toward a homogenising influence. To quote the <a href=\"https:\/\/www.forbes.com\/sites\/danidiplacido\/2026\/03\/19\/gamers-rebel-against-nvidias-dlss-5-ai-slop-filter\/\" data-type=\"link\" data-id=\"https:\/\/www.forbes.com\/sites\/danidiplacido\/2026\/03\/19\/gamers-rebel-against-nvidias-dlss-5-ai-slop-filter\/\">Forbes article<\/a> referencing images of <em>Resident Evil\u00a0<\/em>character Grace Ashcroft: <br><br><em>Commentators joked that DLSS 5 had \u201cyassified\u201d Grace, like an AI beauty filter made for social media.<\/em><br><br> Anyone who has tried to get an online image service to produce something other than an Instagram influencer snapshot will know what I mean. 
The expressive choices of the original artists &#8211; the specific way a character&#8217;s face was modelled, the intended mood of a scene&#8217;s lighting &#8211; risk being smoothed into the confident generic aesthetic that modern AI image models favour.<\/p>\n\n\n\n<p>The core idea of DLSS5 &#8211; taking generated game frames and passing them through a diffusion-style layer to produce enhanced visuals &#8211; gives me cautious hope that this application of AI in gaming, which I had discussed and hoped for, is closer to realization than ever. The concern is that a mis-step in implementation might sully the idea itself and stall adoption. History suggests that when a promising technology is deployed carelessly, the backlash attaches to the idea rather than the implementation, and the window doesn&#8217;t always reopen. <br><br>Ever since I first played <em>Everquest<\/em><sup data-fn=\"c7f53668-43ec-410e-8c41-95f1bc311a72\" class=\"fn\"><a href=\"#c7f53668-43ec-410e-8c41-95f1bc311a72\" id=\"c7f53668-43ec-410e-8c41-95f1bc311a72-link\">1<\/a><\/sup> I wished for a future where games would look like the box art, like <em>Dragon<\/em> magazine covers, akin to a <a href=\"https:\/\/clydecaldwell.com\/\" data-type=\"link\" data-id=\"https:\/\/clydecaldwell.com\/\">Caldwell<\/a> or <a href=\"https:\/\/larryelmore.com\/store\/\" data-type=\"link\" data-id=\"https:\/\/larryelmore.com\/store\/\">Elmore<\/a> or <a href=\"https:\/\/www.keithparkinson.com\/\" data-type=\"link\" data-id=\"https:\/\/www.keithparkinson.com\/\">Parkinson<\/a> painting. 
Even then I imagined multiple &#8220;style&#8221; options being available for personal choice.<br><br>I imagined a game outputting the 3D world but with hair and cloaks blowing in the wind, skin rendered as a realistic oil painting, items depicted with material properties filtered through the rules of art as much as science.<\/p>\n\n\n\n<p>And as soon as I saw Stable Diffusion I envisioned a near future where frames could be run through such a process and enhanced using targeted AI, once we had sufficient GPU power to spare. It all seemed within reach.<\/p>\n\n\n\n<p><strong>The ultimate goal of computer games is not to be photographic; the styles are not failures to achieve realism.<\/strong><\/p>\n\n\n\n<p>Every game has a visual identity its artists fought hard to create. The hand-drawn look of <em>Borderlands<\/em>, the exaggerated physiques of 2011&#8217;s <em>Brink<\/em>, the inference of scale in <em>Grounded<\/em>, the strange beauty of <em>Senua&#8217;s Sacrifice<\/em>. The 3D artists, texture designers, animators and more all work to produce a look and feel, which we hope the AI inference layer will amplify and not homogenise.<\/p>\n\n\n\n<p><br><strong>What I had hoped for<\/strong><\/p>\n\n\n\n<p>I had imagined a metadata layer in addition to the image frame. This might look a bit like &#8220;seeing the Matrix&#8221; if we looked at it unprocessed! Give the diffusion model enough information about intent and it has less room to impose its own. A customized, highly trained checkpoint and LoRA models are other ways to supply that intent.<\/p>\n\n\n\n<p>The vision was always that game engines could output additional metadata relating to each pixel and object. <br><em>This is actor #ac05f3 in lighting #00bb43 with expression #505d67&#8230; <\/em><br>a comprehensive description of the intent of the scene. 
Not simply overpainting the frames but maintaining consistency across a playthrough, ensuring that the game engine outputs information about what the game was intending to depict, instead of just relying on image-to-image results to achieve an enhancement conjuring trick. <br><br>Ideally each game would have its own well-trained custom checkpoint model variant, alongside recorded LoRA-style models of the characters, objects and places within the world depicted. Without that, would <em>Deep Rock Galactic<\/em> even render? A generic model has never seen a Glyphid. It has no concept of how <em>Skyrim&#8217;s<\/em> Draugr differ from generic zombies. How will it render them? We know it will try, even in deep ignorance. Without a custom checkpoint trained on each game&#8217;s specific visual vocabulary, the diffusion layer is painting confidently in a language it doesn&#8217;t know that it doesn&#8217;t speak.<br><br>These complexities are the main reasons I left the ideas to one side in my <a href=\"https:\/\/marnok.com\/wp\/glossary\/\" data-type=\"link\" data-id=\"https:\/\/marnok.com\/wp\/glossary\/\">procrastagnation<\/a> pile. If I&#8217;d ever had the expensive hardware to test it on, I like to think I would have experimented, but at the moment even demonstrations of working models require $10k of hardware, before any model training costs are considered.<br><br><strong>Is there a future?<\/strong><br><br>My hope for DLSS5 as it develops is that it acts not as a paintover-and-hope but as a form of super-<a href=\"http:\/\/enbdev.com\/\" data-type=\"link\" data-id=\"http:\/\/enbdev.com\/\">ENB<\/a> &#8211; image enhancement and polish that sticks closely to the important details of the rendered output, maintaining the consistency of characters and objects and the intended properties of materials. <br><br>My fear is that it will lose characterization and consistency, reverting to the generic-looking AI outputs of recent highly polished checkpoints and systems. 
If that happens, the technology as a whole will be rejected by the gaming community and a truly powerful opportunity will be lost.<br><br>Whatever the current limitations or pitfalls, we have taken a big step toward one of the features I was hoping AI would be used for: an uncontroversial, legitimate and ethical use case for the technology. If we can guide the future, we might end up with something wondrous.<br><br><br><\/p>\n\n\n<ol class=\"wp-block-footnotes has-small-font-size\"><li id=\"c7f53668-43ec-410e-8c41-95f1bc311a72\">Showing my age, don&#8217;t judge me, I am nowhere near as old as my date of birth insists I am. <a href=\"#c7f53668-43ec-410e-8c41-95f1bc311a72-link\" aria-label=\"Jump to footnote reference 1\">\u21a9\ufe0e<\/a><\/li><\/ol>","protected":false},"excerpt":{"rendered":"<p>What I wanted: ENB++ with genuine style. What we got: Instagram filters. The DLSS5 demo has tainted my hope for the future. My personal dream and vision of AI-enhanced rendering of games was dearly held, and quite different from what we see. 
Artists and gamers who&#8217;ve seen the demo share the same concern, [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":"[{\"content\":\"Showing my age, don't judge me, I am nowhere near as old as my date of birth insists I am.\",\"id\":\"c7f53668-43ec-410e-8c41-95f1bc311a72\"}]"},"categories":[1],"tags":[],"class_list":["post-38","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/posts\/38","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/comments?post=38"}],"version-history":[{"count":8,"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/posts\/38\/revisions"}],"predecessor-version":[{"id":63,"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/posts\/38\/revisions\/63"}],"wp:attachment":[{"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/media?parent=38"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/categories?post=38"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/marnok.com\/wp\/wp-json\/wp\/v2\/tags?post=38"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}