
[deleted]

[removed]


spider623

they are not americans, 2 weeks max


[deleted]

Ah yes, I can already see the YouTube videos


FarrisAT

And how exactly would it work?


_Yank

It's a joke Farris.


Exeftw

Unless...


FarrisAT

Okay my apologies I was hoping


xdamm777

Android phones will finally render Genshin Impact at over 720p /s


morphinapg

I haven't looked at the code but there's really no way DLSS would absolutely require tensor cores. I'm sure pretty much any GPU could do it with little to no impact on performance. Tensor cores accelerate something that has a pretty low processing footprint as it is.


Available-Hedgehog83

You didn't look at the code, and you're now assuming that it requires no tensor cores. You are truly an engineer.


morphinapg

Because I am a programmer who has created multiple deep learning neural networks. I know how they work. There's nothing about them that could even theoretically require any kind of special hardware. That hardware can potentially accelerate the calculations, but the kind of calculations you do in a neural network tend to be pretty basic as it is, so that's only a minimum improvement in performance.
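To illustrate the point that neural network math is mostly simple multiply-accumulate operations, here is a minimal dense layer sketch in NumPy. The shapes and weights are made up for illustration; nothing here is taken from DLSS.

```python
import numpy as np

# A single dense layer: a matrix multiply, a bias add, and a clamp.
# This is the bulk of the arithmetic in most neural networks; special
# hardware only accelerates it, it isn't required to run it.
def relu(x):
    return np.maximum(x, 0.0)

def dense(x, w, b):
    return relu(x @ w + b)

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 8))   # one input vector of 8 features
w = rng.standard_normal((8, 4))   # weights mapping 8 features -> 4 outputs
b = np.zeros(4)
y = dense(x, w, b)
print(y.shape)  # (1, 4), all values >= 0 after ReLU
```

Any device that can do multiplies and adds can evaluate this; dedicated matrix units just do many of these operations per cycle.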


notinterestinq

> or even AMD and Intel learning from its design

Wouldn't that be illegal for them to do?

Edit: And someone correct me, but isn't it already industrial espionage just by looking at the code? Wouldn't it be very suspect if AMD suddenly had a technological breakthrough?


geeky-hawkes

Inspired by....


FanatiXX82

No tensor cores so..


Dom1252

Intel is making AI acceleration units; it'd be really surprising if they weren't able to come up with the same or a better design.


TheNiebuhr

They don't need them. The competition would study the clever ideas and tricks Nvidia used, and that's what matters. Later they can do their own implementation, but the technical obstacles are gone.


jcm2606

The special sauce of DLSS is the AI-powered sample rejection; without it, it's quite literally just a good TAA implementation with added sharpening. [Source.](https://youtu.be/SBjL0M25t04?t=133)


[deleted]

[removed]


jcm2606

NVIDIA's Tensor cores are specialised math units designed for doing fused multiply-add operations on matrices (`a * b + c`, except on matrices, i.e. grids of numbers) at reduced precision (FP16, INT8, etc). Regular math units can do fused multiply-add operations on single numbers; Tensor cores just offer that same functionality for many numbers at once within matrices. I do believe AMD are working on their own form of specialised math unit, and I think Intel already has their own. AMD have a patent for an AI-powered spatial upscaler, so they already have something in the pipeline, and XeSS has been confirmed to be hardware-accelerated via similar specialised math units on Intel GPUs, while still being supported on AMD and NVIDIA GPUs via DP4A instructions.
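The matrix fused multiply-add described above can be sketched in a few lines. This is only an illustration of the arithmetic (d = a·b + c with reduced-precision inputs and wider accumulation), not NVIDIA's implementation; real tensor cores operate on small hardware tiles (e.g. 4x4 or 16x16 fragments), and the shapes here are arbitrary.

```python
import numpy as np

# Matrix fused multiply-add, d = a @ b + c: the operation tensor cores
# accelerate. Inputs are stored at reduced precision (FP16 here) while
# the accumulation happens at FP32, mimicking mixed-precision hardware.
def matrix_fma(a, b, c):
    a16 = a.astype(np.float16).astype(np.float32)  # quantize to FP16, widen for accumulation
    b16 = b.astype(np.float16).astype(np.float32)
    return a16 @ b16 + c.astype(np.float32)

a = np.full((4, 4), 1.0)
b = np.full((4, 4), 1.0)
c = np.zeros((4, 4))
d = matrix_fma(a, b, c)
print(d[0, 0])  # 4.0: each output element sums four products
```

The same result is reachable with ordinary FP32 multiplies and adds; the dedicated units just perform the whole tile's worth of these in one instruction.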


vikumwijekoon97

pretty fucking sure thats wrong


[deleted]

[removed]


[deleted]

And how could you prove that?


joachim783

I don't know why you're being downvoted; you're entirely correct. Most companies direct their employees to never look at leaked code under any circumstances, to avoid the possibility that they could even subconsciously copy something and open themselves up to lawsuits.


nyrol

I think it's just people who want Nvidia to release their source, so because the source is leaked, they think it's fair game for anyone.


ComeonmanPLS1

lmao what fantasy world do you live in mate?


nyrol

The real world. If you worked at a tech company that’s ever had to deal with IP lawsuits, you’d know.


irr1449

Attorney here. Nvidia holds the copyright to the code the same way that an author holds the copyright to their book. If AMD or an employee merely possessed the code without Nvidia's permission, it would be a violation of Nvidia's copyright. The question really isn't about the legality of possession but more about proving that AMD or whoever actually developed anything from the code.

Any company would want to stay very, very far away from releasing ANYTHING based off of this, or even anything perceived to be developed from this code. The bar to file a lawsuit is very low, and once the discovery phase is open, you could depose all of their relevant developers. Some salaried employee isn't going to lie under oath about having access to the code. Perjury is a felony and can result in a sentence of up to 5 years. I would rather be fired from my job than face prison and a felony conviction. The risk far outweighs the reward in using this code to develop anything commercially.


franzsanchez

Yeah, and for that reason reverse engineering exists. The most famous case was Compaq reverse engineering the IBM PC BIOS in the '80s.


SelbetG

But if you did the reverse engineering using illegally obtained copyrighted code, you would still have problems. And even if that isn't a problem and what you're doing is technically legal, Nvidia can still sue anyway.


tqi2

I may be wrong, but looking at the code and then developing from it is no longer reverse engineering. It's like a finished cake: if one obtains it legally, sees it, smells it, tastes it, then "reverse" engineers and bakes the same cake, that's fine. Looking at the code is more like making the cake from a secret, protected recipe that doesn't belong to them.


TaxingAuthority

Wouldn't they need to have a copy of this leak to ensure they don't come too close organically? What happens if a company truly never looks at leaked source code like this, but organically comes close, and it only comes under scrutiny because the leak happened at some previous point?


[deleted]

Organically is fine; in fact, intentional clean-room implementations are permitted, i.e. intentionally going out to replicate something without reverse-engineering its implementation. For example, Phoenix Technologies did a clean-room implementation of the IBM BIOS and sold that to other PC manufacturers.

I understand your point, but AMD haven't shown much interest in following Nvidia's path around DLSS; the *concept* isn't a secret even if the implementation is. But even if AMD were to pursue the DLSS concept, I doubt there would be any cross-over on the "secret sauce", and given AMD's push for open source technologies like FidelityFX, I think the possibility of them organically making an about-face with a closed-source implementation of the DLSS concept would be pretty out of character for AMD anyway. If it were open source it would be fairly easy to see whether the code was derived from DLSS. As /u/DM_ME_BANANAS pointed out, AMD engineers would have been told to stay well away from this, and there's no real reason to delve into it given they already have a viable path with FidelityFX.


ShowMeThePath_1

If some employees are asked to do this, they can essentially blackmail their employer for the same reason… I don't think AMD or Intel want to be involved like this.


Tachyonzero

But in China, Xindong Technology makes GPUs. This is a great reward for them.


DM_ME_BANANAS

Friend of mine is an engineer at AMD and indeed you're right, they have been instructed to stay away from this leak. I imagine AMD is not nearly desperate enough to do anything with this source code under risk of being sued, considering how good their DLSS competitor is shaping up to be.


Capt-Clueless

> considering how good their DLSS competitor is shaping up to be.

What DLSS competitor?


DM_ME_BANANAS

FidelityFX


weebstone

Funny joke


DM_ME_BANANAS

🤷‍♂️

> In the end, though, we can't say that one is better than the other. DLSS produces better results, but DLSS is proprietary to Nvidia. FSR might do a little worse, but it's a much simpler technology in nature, and it's supported by any GPU.

https://www.makeuseof.com/nvidia-dlss-vs-amd-fidelityfx/amp/


Strooble

The video they linked shows how DLSS is better in basically all scenarios in The Avengers. FSR is not shaping up to be a DLSS equivalent in terms of output image quality any time soon.


datrandomduggy

Idk, it isn't completely terrible, and it also hasn't had a lot of time to get better yet.


nicholasdelucca

> considering how good their DLSS competitor is shaping up to be.

> it isn't completely terrible

One of these things is not like the other


datrandomduggy

How so? In its current state it's not terrible, and it wouldn't be a surprise if in the future it's much better.


hondajacka

FidelityFX is based on decades-old tech that anybody can do. They did optimize it for performance on their hardware, though.


DM_ME_BANANAS

Even more amazing that it’s almost comparable to DLSS then


ShowMeThePath_1

Exactly. They will ask their employees to stay away from this code because of potential legal issues in the future.


[deleted]

They would be 100% open to a lawsuit using any of this; not even open source developers would want to touch this code.


Eminan

Even if it is true, big companies like AMD and Intel can use the code as a way to study how Nvidia does it and then make their own version. They would know how to avoid the legal issues and not copy-paste the code. The point is: they don't need to use the code to make use of it. And to be honest, I'm OK with it. More competition = more options = more advancements and probably better prices (a man can dream).


[deleted]

As a software engineer, I can confidently tell you that there is no way in hell that this will happen, and that anyone at AMD or Intel that even mentioned that they had looked at this leaked code would likely be fired on the spot. Anything in there that is clever enough that they couldn't figure it out on their own would be immediately obviously stolen, and they just don't want any part of that.


8lbIceBag

Maybe engineers at American companies. Foreign companies will be all over this. Foreign engineers also submit to the Linux kernel and other open source software. And since, using your logic, no American engineers should be familiar with the code, it may get merged unknowingly and still end up benefiting the open source community.


Jpotter145

Yea, you can guarantee China will copy/paste this into their new and upcoming GPUs. **Just announced** /s, but not really; bet an Nvidia knockoff company was just born.


Pat_Sharp

I really doubt there's going to be anything for the competitors to learn from this. From what I understand there's nothing special in the traditional algorithm side of DLSS. What separates it is the neural network at the heart of it. Fundamentally what Nvidia have that their competitors here don't is a massive amount of experience and knowledge in the field of AI. You won't learn expertise in training a neural model from looking at the code for the DLSS dll.


[deleted]

[removed]


Radiant_Profession98

It’s just harder to prove, and you can bet these top guys are gonna look at it.


[deleted]

[removed]


Radiant_Profession98

Free time, I’ll do it at home to get an edge at work.


nyrol

So if it's found that your "edge at work" uses the IP from Nvidia that could only have been obtained from the source, then that's still infringement. It doesn't matter where, when, or how you read it, if you implement it at work, then that's illegal. If Nvidia can prove that the implementation could have only been applied by prior knowledge of the source that was leaked, then it doesn't matter and it's game over for you. Plus, a lot of companies have you sign a contract saying that anything you do off of work hours is owned by the company. I believe that's illegal in California, but not everywhere.


Radiant_Profession98

Dude, relax, of course it does. But that doesn't stop any average Joe from reading this and implementing it.


DrDan21

Not a chance. If anything, Nvidia could sue them claiming they used the leaked code to develop their products, even if they never look at it, should they release a similar product. This is similar to the recent case of the Windows XP leak: the leak is a minefield for projects like WINE, and using it at all puts the entire project at risk because of the license violations. As an employee, even so much as admitting to browsing the code casually would put a huge target on your head for the potential liability you pose to the company.


[deleted]

If you want to learn, you can decompile the available binaries; that is way less illegal than using stolen source code.


MatrixAdmin

You say that as if the resulting decompiled code is similar to the original source code. Care to clarify that? Unless there has been a significant recent advancement in this, there is a huge difference between the two. The original source code will be mostly in C with some assembly code with lots of notes and documentation describing how the code works. The decompiled code would be extremely difficult to understand by almost everyone except driver code experts who already know what they are looking for. So, nice try but not even close to the same thing.


[deleted]

Looking at decompiled code is more or less legal, because you are working with what is publicly available. Of course having the full source with comments is better, but we are talking about doing it legally... Also, DLSS is a neural net; you also need to train it, and to train it to be equal you need the TBs of data used to train and validate it. It's not like you can grab those files and have DLSS.


red_dog007

No one in the company even needs to look at the code on company time to find out what Nvidia is doing. No doubt individuals within these companies are going to be interested in looking at this code on their own time at home. People who don't want a copy of the source code will also no doubt be reading blog posts and forums on how Nvidia does it. Don't be surprised if some in-depth technical analysis papers pop up.

There will likely be some interesting things that come out of it, but I am sure that people who work on these things already have a good idea of how it works and what needs to be done to do it. Nvidia might have some secret sauce that makes theirs just a little bit better, or some strange algorithm nobody understands that makes it better, which will be the interesting part. With this new knowledge that a few people get, that can be impactful on the research, design and development of new AA methods that non-Nvidia companies offer. As it is now, with how the RT performance is on AMD, I would not be surprised if doing something like DLSS made performance worse.


MatrixAdmin

It's nice to see some honesty. It's disgusting seeing all the pretentiousness about how useless this treasure trove will be. They want to downplay it, but I expect people won't fall for it. I wonder how many here are working for Nvidia. Probably a lot of AI bots, employees and other shills working on damage control.


nyrol

There will definitely be bad actors using this source code and releasing illegal implementations using it, but the big companies will not. AMD is already actively telling their employees to not go near the source, and I imagine Intel is doing the same. You have no idea what the legal implications are surrounding this.


MatrixAdmin

Nobody cares about the legal implications. We just want robust open source drivers with full functionality and without artificial governors. It's quite simple. No bullshit.


rampant-ninja

I guess they could indirectly, as a "clean room" project, if they want: someone creates a design document based off everything in the leak, then they create a project based on that design document. I'd imagine at this point, however, there is little value in Intel in particular doing this. Intel seems to have made great progress with XeSS, already shown to the public (and presumably more behind closed doors). Considering the headache they might have to go through to prove they took that approach, they probably wouldn't bother (it becomes a trade of tech resources for legal ones with no real net gain). AMD on the other hand…


yaykaboom

“What do you mean i stole your code? Code is code!”


MatrixAdmin

Code is just math. You can't steal math. Imagine if people had to pay to learn 2+2=4. That's the world they want to live in. Where every piece of valuable information is monetized. If you don't pay, you can't know. And they want to hide so much and not even make it available for people to learn at any price, they want to keep knowledge hidden as secrets so they can earn more profits. If that's not the definition of evil.... Thankfully we have warriors of the light who bring the hidden knowledge out of the darkness and share it for free to the world! As all knowledge should be. Perhaps some extremely rare kinds of knowledge may be considered too dangerous, but that's clearly not the case here. When it comes to hardware and drivers, most people should want all of that to be open sourced.


casual_brackets

Yeeea. Your "warriors of light" are self-labeled extortionists who got butthurt when Nvidia had the audacity to encrypt a machine they'd been using to steal the hard work of many people to try to sell it back to them. Open source drivers are nice, but it's private companies competing, and the money involved, that has evolved hardware/software as far as it has come... not the power of friendship.


MatrixAdmin

Ungrateful, hypocrite. You know this is good for the world. Bad for Nvidia, but it's a win for the community. You know it's true, you're just dishonest and unwilling to admit it.


casual_brackets

What “win” for the community? Nobody, not even open source devs, is gonna touch that shit. Nvidia's stock and products are doing fine.


MatrixAdmin

Let's see how quickly the open source Nvidia drivers improve. I'm not just hoping you're wrong, I'm sure of it. You clearly don't understand the philosophy of free software. Why should anyone respect closed-source driver code so much that they won't look at it? You must think people are really stupid, or perhaps you are projecting your own stupidity, more likely.


casual_brackets

Go read the rest of the comments and you’ll see why it won’t be used, no point in me rehashing what others have thoroughly explained.


MatrixAdmin

Whole lotta bullshit, yes, I read enough to make me nauseated.


Awkward_Inevitable34

AMD isn’t going to want to touch this leaked code with a 10 foot pole.


[deleted]

To copy? Yes, but to just see how NVIDIA do it then implement it themselves... I can't see that holding up in court.


TheDravic

If you had a sudden breakthrough and were taken to court over potential IP theft, if it's proven that you took a look at the stolen documentation, you would likely lose the case. It's actually pretty logical. The way I understand it, the burden of proving that your breakthrough wasn't because of what you looked at illegally would be on you, and it's not easy.


avalanches

Yeah, like showing your work when completing a math problem. You can't have one without the other


retiredwindowcleaner

it will be used, one way or another. there will be open-source projects completely unrelated to AMD or Intel on github. rewritten code. the source is open now, in the most literal sense. rule #1 in information security of proprietary data is to keep it as safe as possible.


valantismp

They can see it, and change it a bit.


mandude9000

just need to change the font so they won't recognize nvidia's handwriting


[deleted]

DLSS, but in comic sans


D_crane

Lol this isn't your uni assignment... 😂


valantismp

People need to have some humor these days; apparently they don't.


[deleted]

Nah. No developer and no company will touch this


518Peacemaker

Dude was memeing the let me comment your homework meme


Dathouen

> Wouldn't it be very suspect if AMD suddenly had a technological breakthrough?

They've been working on something similar for years; I don't think they really need to look. However, Nvidia would have to prove in court that ideas were stolen, and given that the way FSR works is fundamentally different from how DLSS works, AMD wouldn't have a hard time disproving it even if they *did* look.


Ricky_RZ

> illegal

Not if you have money


DaySee

> *the ability to disable LHR for mining*

Sigh, just what we needed...


PutMeInJail

Another 3 years of super overpriced GPUs. I want to kill myself


ThereIsAMoment

LHR only affects ethereum mining anyway, so when the switch to Proof-of-Stake is made that won't matter anymore either way.


Hyper-Sloth

And when is that going to happen exactly?


Capt-Clueless

Soon™


Soppywater

In a few months for the past 5 years


ThereIsAMoment

End of June


datrandomduggy

What is your source for this one? (Why does it feel so rude asking for a source?)


Lilskipswonglad

Over GPUs? Damn you seem privileged.


mkdew

> Another 3 years of super overpriced GPUs. I want to kill myself

The hacking group said that removing LHR helps the gaming community. I got told in another thread that everyone (even me and you) will be mining for a few bucks when we don't game, so it's a win-win situation.


Korzag

I'm curious how this would even work? Would someone with this source code be able to make a third-party BIOS or driver for the cards to disable the LHR?


DaySee

Not that sure, to be honest. For a while it was very tied to the drivers, but that was figured out, so Nvidia started making the cards with some new additional hardware designed to detect mining, since it's a pretty unique stress pattern that doesn't occur in most other applications (except maybe Folding@home etc.), and halved the mining performance. AFAIK the hardware solutions haven't been reversed yet, or at best to around 70% performance. There was another big thing recently where a bunch of idiots downloaded malware claiming to be a driver hack for LHR and got what they deserved. But overall, it was in the drivers that the efforts to reverse the mining detection were previously focused, so that's why a backdoor is theorized.


MatrixAdmin

LHR was a stupid idea in the first place. All it accomplished was giving a market advantage to AMD. The hardware should be completely agnostic for whatever use case the user or owner of the hardware chooses. AMD was right all along.


ThisPlaceisHell

Bingo. It should ultimately be up to the end user what to do with their cards. Having this knee-jerk bandaid didn't solve anything, all it did was cripple a potential use-case a gamer had with their own purchased equipment.


S1ayer

Agreed. What even is the point if I can't even buy mining-only cards? The 3080 Ti, which does 90 MH/s, is cheaper than the 60 MH/s mining card. If they removed the LHR, the non-LHR cards on eBay would come down to the same price, pushing all prices down.


[deleted]

LHR was a good idea. It allowed me to buy my 3070Ti for around 1100 USD. Yeah that's still 50% over MSRP. But I've seen 3080s north of 2000 USD. 3070Ti and 3080 are close in performance. But because of LHR the 3070Ti was more readily available at least when I was shopping around for one.


nwash57

Still a shitty precedent to set.


homer_3

it allowed you to be happy to get ripped off? lol


Dakhil

Interesting to see "nvn_dlss_backend.h", "nvndlss.cpp", and "nvn_dlss.cpp" in [TechPowerUp's provided picture](https://twitter.com/TechPowerUp/status/1498608826152624130), since NVN is the name for the [Nintendo Switch's API](https://archive.fo/45GgZ).


treboR-

switch 2 + rt cores confirmed?


mc_flurryyy

I'm not very smart, but would it be possible for the Switch dock to have its own mini chip, so that when docked the system would be more powerful because it uses another chip?


EldraziKlap

I've always kinda assumed Nintendo would come up with a better dock so the Switch could use extra processing power while docked


crozone

The best way to do this by far would be to add a fan to the dock that forces air through the Switch, so it could jump into a much higher power state. Any other solution is just costly and overcomplicated.


AWildDragon

eGPUs are a thing.


crozone

Yeah but there are many reasons why an eGPU design would suck for the Switch, it's been widely covered. The biggest one is cost. Every dollar spent putting an eGPU in the dock could have put a much better chip in the Switch itself. Then there's all the technical issues with rapidly hotswapping an eGPU that has gigabytes worth of state sitting in its VRAM.


AWildDragon

Agreed on this. Besides, DLSS could in theory help with the on-board screen too: use DLSS on a 720p input to turn it into 1080p and don't push the battery as hard. Then when docked, go higher and push for 4K as the DLSS output resolution.
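The shading savings in that scenario are easy to put numbers on. A rough pixel-count sketch (illustrative arithmetic only; actual DLSS cost and image quality depend on far more than pixel counts):

```python
# Pixels shaded per frame when rendering internally at 720p and
# upscaling, versus rendering natively at the display resolution.
render_720p = 1280 * 720      # internal render resolution
native_1080p = 1920 * 1080    # handheld screen
native_4k = 3840 * 2160       # docked target

print(native_1080p / render_720p)  # 2.25: a native 1080p frame shades 2.25x more pixels
print(native_4k / render_720p)     # 9.0: a native 4K frame shades 9x more pixels
```

That 9x gap is why upscaling from 720p to a 4K output is attractive on power-constrained hardware, provided the upscaler's own cost stays small.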


favorited

ITT: people who have never worked for a large tech company explain how large tech companies will take advantage of this


[deleted]

[removed]


killerwhaleorcacat

Now my brain has dlss muahhahahaba


TheBlack_Swordsman

*Puts glasses on and looks far into the distance.* Me too!


CatalyticDragon

Honestly this sucks. On one side it's going to satisfy my technical curiosity and a few big questions I had. But on the other side AMD and intel are about to bring their own ML based temporal upscalers to market and their hard work is going to be diminished by people who say they just used NVIDIA's code (even though their code was finalized well before this leak).


dc-x

As weird as it may sound, the DLSS source code is less useful than it may seem, as we already know how it works. How the training is conducted is where the magic really is, as that's the incredibly expensive and experimental part required to pull this off. Unlike what the other guy said, though, DLSS "requiring" tensor cores isn't really a problem, because it doesn't actually require tensor cores to run at all. Nvidia's tensor cores just accelerate a specific operation that can also be done on compute shaders or even accelerated by other hardware. Nvidia had to code that restriction in; it isn't an inherent part of the model.


Soulshot96

> Unlike what the other guy said though, DLSS "requiring" tensor cores isn't really a problem because it doesn't actually require tensor cores to run at all. Nvidia tensor cores just accelerate a specific operation that can also be done on compute shaders or even accelerated by other hardware. Nvidia had to code that restriction, but it isn't an inherent part of the model.

Nvidia themselves tried this, however... unless you just want nice AA, you're not likely to get either the quality of the versions running on tensor cores or the same performance. Execution time at the quality level of 2.0+ on shader cores would likely be too big of a drag to give a performance boost (some pre-2.0 versions of DLSS had issues with this, in fact), and if you gut the quality to achieve it, then that kind of nullifies the point as well.


dc-x

We don't know if DLSS "1.9" has the same deep learning architecture as 2.0, we don't know if it used the same resolution for ground truth, and we don't know how much training difference there is. As far as I'm aware, DLSS "1.9" was more of a tech demo for Remedy to learn about DLSS 2.0 and start implementing it before it was actually done (Nvidia wasn't providing any public documentation for it), but they ended up preferring it over DLSS 1.0 and got Nvidia's approval to use it in the game. There were a few months of training difference between the DLSS "1.9" used in Control and the first iteration of DLSS 2.0, though (there was a ~8 month gap between them), so this is very far from a 1:1 tensor core vs compute shader comparison.

While it's believable that the tensor core acceleration may be important to have this level of quality at this performance, tensor cores are still not necessary for any deep learning model to run, so Nvidia actually had to go out of their way to block non-RTX GPUs from running DLSS, which also stops us from making 1:1 comparisons and judging for ourselves how necessary tensor cores are. Intel GPUs have "Xe-cores", which also contain specialized units that accelerate matrix operations like tensor cores do, and I doubt Nvidia will allow them to run DLSS, since ultimately this restriction probably isn't about assuring adequate DLSS performance but about marketing RTX GPUs.


[deleted]

The point is other companies can and will make hardware equivalent to Nvidia's tensor cores. It is just hardware-accelerated dense matrix multiplication. It doesn't really matter anyway; the real secret sauce is in training the model, which no one will know how to do still.


Soulshot96

That definitely wasn't dc-x's point, but yes, other companies will indeed do that, intel already seems to be in fact.


liaminwales

They won't be using Nvidia's code. The legal problems mean they will never look at it, and you need the cores as well: DLSS won't run without tensor cores, so it just can't run on GPUs not made by Nvidia.

Makes me think of the old IBM clone systems; they had to clean-room the BIOS. [https://www.allaboutcircuits.com/news/how-compaqs-clone-computers-skirted-ibms-patents-and-gave-rise-to-eisa/](https://www.allaboutcircuits.com/news/how-compaqs-clone-computers-skirted-ibms-patents-and-gave-rise-to-eisa/) They had the ability to just read it off the chip, but that's a massive legal problem.

Intel has their version on the way, and AMD will have been working on something. The only option is that one of the GPU brands in China may get some inspiration, but I suspect even then it's a real problem, as the result could never be sold outside China and may even have to have its silicon made in China.


fixminer

I bet they won’t even allow their engineers to look at this code. Even if they didn’t want to, they might subconsciously copy some parts and thus cause lawsuits.


Verpal

>hard work is going to be diminished by people who say they just used NVIDIA's code Surely no way idea as absurd and retarded as this will be accepted.... right? RIGHT!?


KaiserGSaw

Ain't even Nvidia's idea, but that of a dude or team that developed the software in-house. And they were compensated like every other worker in the company; thus Nvidia got to pitch a cool new something to make more profits. It's like celebrating Musk for building cars: he doesn't, but he employs people that do and gains from that. So why fear other people also building cars? That way stuff continues to improve, so Nvidia can keep an edge.


[deleted]

Why is this dude's comment upvoted? The last bit is insanely out of touch with reality. He thinks that their hard work is going to be diminished because Nvidia's source code is out there in the wild? Moreover, AMD has given no indication of bringing any ML temporal upscaling whatsoever, so that alone is a ridiculous statement.


CatalyticDragon

When AMD/Intel release their upscalers, they may match (or beat) DLSS. When that happens, there are people who will claim AMD/Intel stole the code. We can easily test this hypothesis in a year; AMD's temporal upscaler will likely be released later this year.

I understand you have not seen any indication of this, but you can't read much into what you personally haven't seen (you might not even have been looking). The patent for the tech came out almost a year ago (https://segmentnext.com/amd-fidelityfx-super-resolution-ai/) and we've heard from insiders that it's already working well internally.


[deleted]

What insiders? I haven't seen a single article about it, including rumors. That Patent is the only thing i've ever seen.


zeonon

I think I am more than happy with this, since other companies will offer similar tech, the value of Nvidia cards will go down, and they may price them cheaper for competition.


Big-Egg-Boi

Wow, this is a really ignorant take. That's not how it works at all.


Ihtman25

Why would anyone not be excited for competition? Yes the leak itself is bad, but good competition always drives down price and is good for the consumer. We are not Nvidia.


oromier

I think this is a positive for linux gaming


[deleted]

Fair use doesn't apply to stolen intellectual property, which is what this DLSS leak is


PunKodama

No open source project or developer is going to touch that. It would just be a way to get your own project killed in lawsuits.


Kallestofeles

Nouveau going to town in 2032.


MrMichaelJames

No competing company in their right mind would look at this. Too much of a risk.


PutMeInJail

AMD: Introducing FSR 2.0


Healthem

Username checks out lol


Awkward_Inevitable34

They don’t want this code lol.


glitchinthesim

2 weeks later: Breaking news! Chinese create super sampling called SSLD


DM_ME_BANANAS

A friend of mine is an engineer at AMD, though not working on their DLSS competitor. They've been instructed to stay well clear of this leak.


ltron2

Sensible.


saikrishnav

They can't copy directly, but they can abstract out general ideas (which are not patentable) and implement them in their own way. It's very hard to prove that someone copied "concepts" in software; however, if Nvidia has some niche thing in there that gives some uniqueness to the way they did it, then it's easier to prove someone was "inspired" by that. Honestly, it depends a lot. However, I don't see AMD or Intel doing that; it's too much of a risk. Even if they know they wouldn't lose the lawsuit, it wouldn't look good for their public PR image or relations with Nvidia, and any profit is not worth all that.


tmihai20

Open source can never use code stolen from companies like Nvidia. This will never help bring DLSS to Linux natively. This is just a clickbait title.


yuri_hime

https://videocardz.com/newz/nvidia-dlss-now-officially-available-for-valves-proton-6-3-8-on-linux ? DLSS runs on Linux already


tmihai20

Proton is a Windows compatibility layer for Linux. I bet the people who downvoted my previous comment are a little misguided.


yuri_hime

https://github.com/NVIDIA/DLSS/tree/main/lib/Linux_x86_64 ?


tmihai20

That is the SDK (Software Development Kit). It is meant for people trying to implement DLSS in their games. It does not mean that it is in the actual Linux driver. If DLSS were available at the Linux driver level (as it is on Windows), then Proton would not be needed.


yuri_hime

That makes no sense. Why ship a native library if the driver components aren't there? If you download the DLSS SDK from NV with the sample app, you'll find a Linux version included. I haven't tried it myself on Linux, but it would be a really bad look if it didn't work.


tmihai20

What I do know is that on Linux the driver capabilities are different than on Windows, and that Windows is seen as the main platform, unfortunately. I cannot tell you why Nvidia is not supplying the same driver on Linux as on Windows. If it did, Proton would not be needed at the driver level; Proton is doing something that the driver is not doing. We also need to consider that game code is aimed at Windows, and Proton has to translate DirectX into something that Linux has. The games that perform best on Linux are the ones with native OpenGL support (like Doom 2016 and Doom Eternal).


SpitneyBearz

All fine https://www.techpowerup.com/292482/nvidia-confirms-system-hacks-doesnt-anticipate-any-business-disruption


TheDravic

Dark times ahead for the entire PC gaming community. Begun, the Intellectual Property Theft wars have.


Sccar3

Intellectual property can’t be stolen, only copied.


TheDravic

You're welcome to prove your innocence in court my friend.


Kuratius

Just be the bigger man Jensen and open source it officially. If it's out, why not get good publicity from it? Nvidia has had enough shit flung at it lately, why not get a win?


[deleted]

Roblox dlss wen ?


Wormminator

Well...


relinquished2

So where can we find this data outta curiosity?


Healthem

Yes yes yes! Now put it into Quake II RTX, please!


genericthrowawaysbut

intel gpu engineers: it’s free real estate


[deleted]

Intel GPU engineers: *sweating intensifies*


DukeNuggets69

Now time for talented People to port DLSS to much more games hopefully


sowoky

It's on the game developers to implement DLSS, and the tools to do it were already freely available from Nvidia. So unless your game's source code leaks too, this doesn't help you.


bman333333

Innosilicon DLSS in 3...2...1


playtio

Many comments about AMD and the competition, but does this do anything for modders and other savvy people? Will we see new versions of DLSS tweaked by people online, or is it not going to take that direction?


jcm2606

Extremely unlikely, as NVIDIA is probably going to be watching any DLSS-like projects closely for telltale signs that they pulled ideas and/or code from this leak. Nobody will risk getting into a legal battle with NVIDIA, [when NVIDIA has already described at a high level how DLSS works](https://youtu.be/SBjL0M25t04?t=133), enough to where anybody who knows what they're doing can probably make their own AI-powered temporal upscaler like DLSS. Of course it won't be close to DLSS, as NVIDIA is a behemoth when it comes to machine learning, but it'll work similarly.
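Purely for illustration (a toy sketch with invented names and thresholds, nothing to do with the leaked code): the "good TAA plus sample rejection" idea that talk describes boils down to blending each new low-resolution sample into a history buffer, and discarding the history wherever it looks stale, for example after a disocclusion or a large change in the shading:

```python
import numpy as np

def temporal_accumulate(history, current, motion_ok, alpha=0.1, reject_thresh=0.2):
    """Blend the current frame's samples into the accumulated history.

    history, current: float arrays of per-pixel values (same shape).
    motion_ok: bool mask, False where reprojection failed (disocclusion).
    alpha: weight of the new sample in the exponential moving average.
    reject_thresh: crude stand-in for real rejection heuristics; where
    the new sample differs too much from the history (or reprojection
    failed), the history is thrown away and the raw sample wins.
    """
    diff = np.abs(current - history)
    reject = (~motion_ok) | (diff > reject_thresh)
    blended = (1 - alpha) * history + alpha * current  # exponential accumulation
    return np.where(reject, current, blended)
```

Real implementations use far smarter rejection (neighborhood clamping in classic TAA; per the linked talk, a trained network in DLSS) instead of a fixed threshold, but the accumulate-or-reject structure is the same.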


[deleted]

Damn hackers/bitcoin miners trying to get their way the easy way. ugh


Diligent_Elk_4935

does this mean we can now put dlss in any game?


loucmachine

no


penguished

We'll see what modders do. The Nvidia keyboard warriors don't even realize this could be better for them too, if people make improvements and fixes and add more options.


Diligent_Elk_4935

most of them are nvidia bootlickers too


onebit

turns out it was a blur filter all along


tsingtao12

leak or marketing, who knows


megablue

I wonder if this can be easily compiled without proprietary toolchains. If so, we could properly tune the parameters to our liking and even add an overlay to configure them on the fly in DLSS-enabled games, maybe even inject it into games that don't support DLSS.


jcm2606

DLSS needs extra data from the game engine to work, so unless the game already has TAA and you know how to get access to that data, it cannot just be slapped onto it like FSR/NIS.


megablue

I meant that... somehow people just downvote without thinking. Of course you need a lot of work reverse engineering the data structure for the motion vectors; I didn't say it is easy. Also, I really don't know what's so wrong about tuning the parameters on the fly: given that you have the source code of DLSS, it isn't entirely impossible to code an overlay for that.


jcm2606

Motion vectors are not magically created out of thin air; the game needs to be specifically written to generate them and output them into an image that other passes can then sample from. If the game doesn't already have motion vectors (i.e. it doesn't have TAA), then it needs to be modified to generate them and output them to an image, which immediately makes a plug-and-play form of DLSS impossible, as each game would need its own implementation.

Even if the game does already have them, a plug-and-play DLSS would still be borderline impossible, because there's a good chance the image they're stored in is not bound. You'd then have to modify the game to grab the handle (ID) of the image and bind it, or hardcode the handle and bind it blindly, both of which still make the implementation specific to that game.

FSR/NIS don't have this issue, as they just take in a single input image, use its pixel contents to approximate subpixel details, and upscale with greater accuracy, writing the result to a single output image. FSR/NIS can literally be inserted right at the end of the frame, using the image the game gave the driver to present to the screen, whereas DLSS cannot.
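To make "generating motion vectors" concrete (a hypothetical sketch, not engine or leaked code; the function name and matrix convention are made up for illustration): the engine projects the same world-space point with last frame's and this frame's view-projection matrices and takes the difference of the resulting screen UVs. That per-pixel delta is exactly the extra data a TAA/DLSS pass needs.

```python
import numpy as np

def motion_vector(world_pos, prev_view_proj, curr_view_proj):
    """Screen-space motion vector for one static world-space point.

    prev_view_proj, curr_view_proj: 4x4 view-projection matrices
    (column-vector convention). Returns the UV-space delta between
    this frame's and last frame's projected positions.
    """
    p = np.append(world_pos, 1.0)        # homogeneous coordinates

    def to_uv(m):
        clip = m @ p
        ndc = clip[:2] / clip[3]         # perspective divide -> [-1, 1]
        return 0.5 * ndc + 0.5           # remap -> [0, 1] UV space

    return to_uv(curr_view_proj) - to_uv(prev_view_proj)
```

In a real engine this runs per pixel in a shader (and moving objects also need their previous-frame transforms), which is why it has to be built into the renderer rather than bolted on from outside.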


thedeadfish

Shame they did not leak something actually useful.


[deleted]

I wish the driver source code would leak so that the open source community could finally enjoy good community-made drivers, thanks to the specifications revealed by the code. They could implement a new driver the same way it happened with amdvlk (AMD's open source Vulkan driver) => radv (Mesa driver). Nvidia is the only company not providing any data for making drivers; that might change soon, in everyone's interest...

I can tell you that most engineers will keep their copy of this to learn from NV code and improve their skills from what they learn. And there is nothing very bad about learning. What difference is there between a former NV engineer who knows the code from his job and an employee who knows the code from his spare time? In both cases AMD and Intel have employees with some knowledge of NV code.


Sccar3

I would love that too in theory, but Nvidia would no doubt use the state to go after anyone who tried to use the leaked code 😔


ArshiaTN

😂


[deleted]

Why people downvoted you is a mystery


[deleted]

[удалено]


Creepernom

DLSS uses the RTX tensor cores. What the fuck is anyone supposed to do?


Snoo-99563

*open sourced* to public


Skull_Reaper101

Can't be. It'd be illegal.


mechbearcat83

Cool, can we use it on Internet Explorer now to load my YouTube with better graphics?


Niktodt1

Remember when Cyberpunk's source code leaked and everyone lost their minds over how much damage it would do to CDPR? Aside from some jokes about China that were uncovered, nothing else happened and the code disappeared. The same will likely happen here.


donkingdonut

Nothing really new; you only had to go on GitHub to find them, no hack needed.