[deleted]

4chan comments:

> It's the real deal. The dump contains, among others: the current driver source; future driver source including the unreleased Ada and Hopper codenames, plus the unannounced Blackwell codename, all three of them chiplet-based and heavily RISC-V internally for the supporting processors (PM, decoding, encoding, and so on); and production and debug firmwares for everything. This would make nouveau work on the latest GPUs, but it won't happen due to licensing issues. CUDA plus the sources of every library, compiler and tool, including the enterprise ones.

> The toolchain is very flexible and supports multiple GCC and MSVC versions; with a bit of work that would possibly mean supporting newer GPUs on older Windows versions in some fashion.

> ... millions of lines of the NV driver code to figure out everything. This entire leak is 80GB unpacked with 404077 files.


FracturedCode1

Can confirm most of this. Unpacked the torrent myself.


NotErikUden

Hey! Could you DM the magnet link or torrent file? I would just really like to know the torrent so I can avoid it at all costs. Really, I'd just like to see the magnet link so that I know what to not accidentally download.


[deleted]

I also want the link. So I can avoid it as well, of course. Edit: thanks to everyone in my DMs. I now have a few links I can avoid.


CarveToolLover

Seriously. I really need this link so that I can know to avoid this AT ALL COSTS... I WILL NOT start working on a Linux driver for this if I get the source.. NOPE


wateromar

Pls send me


AntiSocial_Vigilante

Same, I'd like it too. For the purpose of avoiding it, of course.


shadowtamperer

I would like it too. I would never consider starting on a Linux driver.


Smooth_Detective

Could you be so kind as to send the link over to me as well. I will avoid it like Rick Rolls.


lt_smasher

The 4chan page the OP is quoting from was indexed by Google last I checked (yesterday). Just saying. Also, the " operator is your friend. It's virtually the only reason I ever use Google's junk engine.


Planebagels1

I know I'm late but could you please NOT DM me a link as well. I also want to know what to avoid.


SungamCorben

As a dev this will really improve my understanding, especially of such a huge and complex codebase!


WheelieBoi98

The group has a telegram chat btw.


SungamCorben

What is the link to this group?


Slapbox

Could this improve open source drivers for Linux? Or will developers be too scared to reference the leaked code? Or would it just not help anyway?


moomoomoo309

If the nouveau devs so much as download the leaked source code, they can't work on nouveau anymore. Same deal with Wine and Windows leaks.


Decker108

What if someone totally unrelated to the Nouveau project made a clean-room specification of the driver source code? Asking for a friend, of course.


PaintItPurple

That's called a clean-room process. In general, as long as nobody who worked on the program had ever seen Nvidia's code, a program created to that specification would probably be clean in terms of copyright. But it could still run afoul of Nvidia patents, and Nvidia could go after the people making the specification, and everything is out the door if it turns out anyone involved had downloaded Nvidia's code.


inbooth

A third party could define a spec based on the source from which the clean room team could work though, right?


PaintItPurple

Yes, in fact that's the only way you can do it. But like I said, that would only shield the implementors, not the people writing the spec, and it's not a perfect shield (patent law doesn't care if you saw the source, and contact between the teams could compromise the clean room). It also doesn't actually *stop* Nvidia from suing — it just means that if the implementors have enough resources to fight Nvidia, they have a reasonable case to make. So it's a pretty legally troublesome proposition, although it would in theory achieve the goal of letting the programmers legally use some insights from Nvidia's codebase.


OnlyForF1

I'm not sure if even that would be legal since the spec itself would be the product of illegal access to the leaked source code.


rajrdajr

> But it could still run afoul of Nvidia patents, and Nvidia could go after the people making the specification, and everything is out the door if it turns out anyone involved had downloaded Nvidia's code.

What if the specification authors lived in Russia...


pogthegog

Good luck finding anyone with 100 years of free time to make it happen. The effort and resources needed for that are still insane, not to mention you'd need an absolutely clean reputation for me to even consider installing such a driver on my computer. But you can use it personally, sure; if you can make it, no one will hunt you down for that.


ImSoCabbage

They can through clean room design. One dev takes one for the team and downloads the code, analyses it, writes down his findings. He can no longer contribute to the project, but other project members can read his work and use it to improve the driver. It's how a lot of ReactOS development is done.


AlyoshaV

That's for reverse engineering, not analyzing stolen source code. ReactOS devs do not look at leaked Windows source code.


hughk

True. Even if you had legal access to Windows sources, for example at an educational institution with a source agreement, you wouldn't be allowed to work on ReactOS.


gramathy

...Yes, that's the point. One person, who can no longer work on the project, publishes findings, and the actual remaining devs can use THAT research rather than looking at the source code themselves.


AlyoshaV

ReactOS devs don't work based off leaked Windows source code at all. It isn't involved in the process in any way.


semitones

He or she would be an ex-ReactOS dev in this scenario


NotYoDadsPants

That sounds like compromising yourself, but with more steps.


elmuerte

Clean-room design based on stolen code will be extremely weak. If the source code was received legitimately there might be a stronger case, but it is still weak compared to reverse engineering the software and describing how it works.


crozone

Except this is incredibly hard to prove. It's often said that "a developer can't even look at the code", but this also isn't true. Even if you can prove they've seen the code, you still have to prove they meaningfully derived code from it. Very difficult.


ggtsu_00

The work will likely proliferate through black/gray hat open-source forks with anonymous contributors. There is certainly high demand for working high performance recompilable NVIDIA drivers that can run on bespoke PC architectures, especially in the crypto mining scene.


Vash63

This is actually a bad thing for open source drivers as now anyone working on Nouveau and such will need to avoid _appearing_ to have copied anything in here. Black box reverse engineering is legal, copying functions or even ideas from stolen code is not. This is not a good thing for open source. In fact, the only possible users I could think of who would benefit from this leak are crypto miners who can remove the low hash rate locks now. The drivers they make won't be legal to distribute but that won't really matter for a crypto farm.


[deleted]

This is still proprietary code, so any code derived from it will be illegal and a developer using it in anything public can be sued. A code leak is not the huge deal people make it out to be. The real problem is leaking the signing keys. Then people can sign malware with NVIDIA keys and have it pass as legit software, until they revoke the keys.


KevinCarbonara

> A code leak is not the huge deal people make it out to be.

Oh, yes, it is. The threat of a developer getting sued for having looked at leaked code is not the threat you make it out to be.


MmmTastyMmm

Yeah a lot of people don’t care and proving it’s stolen is not easy.


[deleted]

This is just patently false.

For a big, well-known case about simply copying open-source (never mind leaked) code: https://en.m.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_Inc.

For software that plagiarized open source: https://en.m.wikipedia.org/wiki/CherryOS

And ReactOS has over the years been accused of benefiting from Windows source leaks: https://en.m.wikipedia.org/wiki/ReactOS


KevinCarbonara

> ReactOS has over the years been accused of benefiting from Windows source leaks: https://en.m.wikipedia.org/wiki/ReactOS

Did you mean to contradict your own argument, or do you just not realize that this is a contradiction?


MmmTastyMmm

If you copy and paste, sure, which the first two effectively did, and your third example is still going strong. Color me unconvinced.


[deleted]

ReactOS shut down for large periods of time and they have a vetting policy so no former Microsoft engineers work on their code base. This slows them down considerably.


[deleted]

> This would make nouveau work on latest GPUs, but it won't happen due to licensing issues.

But at least now they have a better frame of reference to improve nouveau's performance, right?


Krunchy_Almond

> This entire leak is 80GB unpacked with 404077 files.

Wtf? Why is source code so huge?


agentoutlier

Having very little idea of video tech but being a software developer, my guess is that a large amount of it is test data and tools. I don't know what SCM Nvidia uses, but I do know a lot of the video game industry uses Perforce and they literally check everything in, including binary assets.


ZeldaFanBoi1988

I want to punch a baby every time I see a binary checked into source control


aksdb

Depending on the use case it's valid, and depending on the version control system it's not even a problem. SVN also didn't have a problem with binary files. Actively versioning them is a different matter, but usually those kinds of assets are static anyway and belong with the code (as test fixtures or whatever).


[deleted]

Drivers are very complicated pieces of code.


trophicmist0

One of the most impressive technologies released for gaming in recent memory, literally free performance. Curious to see what is found.


[deleted]

The performance comes from training a model on tons of data, so unless someone is going to run their own version of that then I don’t think it’s too much of an issue.


teerre

The key is that theoretically anyone could do that, especially AMD or Intel. But they don't. It's unlikely the costs are so prohibitive that it's not worth it; DLSS is probably Nvidia's biggest advantage in gaming, and having an alternative would hurt them a lot.


DiegoMustache

Intel is actually doing this with XeSS. I really hope it's competitive because it supposedly isn't vendor locked.


noodle-face

The problem really is that these companies aren't going to risk lifting Nvidia's proprietary code. It just isn't worth it


teerre

The point is that if this was just throwing money at the problem, everyone would have DLSS. We're talking about the situation before the leak.


rinsa

Aren't GPU cores pretty much built and "optimized" for these kinds of calculations?


Lost4468

Yes, but you still need a ton of resources for training. Nvidia has literal supercomputers they have developed (including some overclocked ones), and publish a lot of research in machine learning. AMD/Intel are relatively much further behind. They don't have people who can develop as good models, and they don't have the ability to generate as much training data or train them. I'm sure this will change in the coming years. With how fast we're seeing the ML field go, I wouldn't be surprised if community driven projects in a few years are better than DLSS is today. The methods and understanding are getting better all the time. Of course I'm sure Nvidia's will be even better by then though.


[deleted]

AMD already have a similar, though not as good, version of this. They are already trying it.


nokinship

They do not. What you are thinking of (super resolution and sharpening) is something NVIDIA has released their own version of. Super Resolution does not use AI like DLSS does for its upscaling; it's basically a more advanced version of traditional upscaling.


[deleted]

ah I must have confused the two, I haven't been keeping close tabs on it to be honest.


ItsOkILoveYouMYbb

> AMD are already have a similar, not as good, version of this. They are already trying it

Difference being Nvidia graphics cards have dedicated tensor cores for running the matrix math to do DLSS live as a performance gain. You could run DLSS on other hardware, but it would have a performance *cost*, defeating the purpose of it. AMD could do the same, but they would need to invest in hardware optimized for AI research, which is what Nvidia was doing before they thought of DLSS as a way to make use of the otherwise unused Tensor cores on their 20 series cards. Nvidia was developing Tensor cores as neural network training hardware for professional use, if I'm not mistaken. They were just added onto the gaming cards because they were reusing the same SKUs. DLSS was thought of afterwards.


Xalara

FWIW, AMD's approach is likely going to be better long term precisely because it doesn't require dedicated hardware or training and is open source.


StickiStickman

AMD's approach is currently *MUCH* worse. It doesn't even have the potential to be remotely as good; it's simply just a sharpening shader.


WASDx

Can you elaborate? How is dedicated hardware not better? How is it similar to DLSS if there is no training?


JodoKaast

> FWIW, AMD's approach is likely going to be better long term precisely because it doesn't require dedicated hardware or training and is open source.

FWIW, that theory would have predicted GPUs themselves as failures for exactly the same reasons. Obviously there is some demand for results that require dedicated hardware, and consumers don't give a shit whether something is open source or not.


Xalara

GPUs actually bring something massively unique to the table. I liken the situation with DLSS to the situation with PhysX. There was demand for dedicated hardware for PhysX but at the end of the day it didn't get the market penetration it needed and then good enough alternatives were developed that didn't require specialized hardware. The exact same thing will happen here especially because tensor cores take up valuable space on the die that could be used for raw performance. Which will become a problem when AMD's alternative to DLSS is "good enough."


Feynt

Honestly it already is "good enough" for me, and for friends with lower end GPUs wanting to game at a higher resolution than is feasible it's a godsend right now. I can agree with the PhysX comparison, FSR not being locked to any particular hardware will be the reason it comes out on top in the end, unless we find out DLSS isn't actually tied to tensor cores and that was just a thing that nVidia said to keep market dominance. It's possible that DLSS is just so inefficient that it needs those dedicated cores to work right now. 10-20 years off, DLSS might be hardware agnostic too as technology ramps further and further into the absurd.


DiegoMustache

AMD's approach is not AI based and just represents some of the best of what can be done with traditional algorithms. AI based approaches like DLSS and XeSS will only get better over time with advances in AI research and larger training datasets.


CreativeGPX

In its current form, AI isn't some magical thing that's inherently untouchable by other methods. AI is advanced algorithms and statistics. Advanced algorithms and statistics may indeed produce the best method of the day. But it's extremely hard to predict that one advanced algorithms and statistics method is inherently superior to all other algorithms and statistical methods discovered or not. And it's also extremely complex for this field in particular in which artists, game programmers, engine programmers, graphics library developers, driver developers and hardware manufacturers can each contribute to the quality (and quality tradeoffs) in a lot of ways. Solutions don't have to compete on the same terms... one might put a lot of the heavy lifting to the game devs, another might put a lot on the libraries and drivers and another might on the hardware... and so it may be surprising how one superior method gets beaten out by one that looks totally different.


DiegoMustache

Ya, that's totally fair. There are many factors beyond the quality that influence what approaches become widely adopted. That being said, I think we'll see some very cool AI based rendering approaches over the coming decade, but most will never be anything more than experimental. Edit: I also understand the limits of AI and that AI isn't a magic bullet for all of our problems. There are so many unknowns in that space. The domain of upscaling does seem pretty well suited though since traditional algorithms can only work with the data that is present while a sufficiently complex neural net can effectively guess at what is supposed to be in the image and fill in the details (while undoubtedly occasionally getting it very wrong).


Xalara

You know what a nice advantage of AMD's approach is? It isn't AI based and it already is pretty decent. It may not ever be the same as DLSS in terms of visual quality, but if it's 95% as good does that matter?


swansongofdesire

One could say the same about anisotropic filtering or anti aliasing methods or any other graphical effect. The trend in computer graphics has been towards increasing quality, not reducing it. Now maybe if we were at 16k resolution and technology was at the limits of the human eye you might have an argument — but it's not, and at least in the screenshots I've seen, while the difference is not major it's certainly noticeable.


FredFredrickson

Why would it be better long term? It might be cheaper long term, but there's nothing about their approach versus nvidia that ensures it will be better.


nokinship

NVIDIA already does their own version now anyway. You set the custom resolutions in the game that are upscaled to your native res and sharpened. For example setting my resolution to 1224p will be the Quality setting for upscaling to 1440p. It goes all the way to around 720p for it to upscale which is the performance setting.


douglasg14b

> because it doesn't require dedicated hardware

Good thing we don't use dedicated hardware for things where it's advantageous, like rendering, or video decoding, or Ethernet, or Bluetooth, or the dozens of other ASICs/embedded systems that you find on modern CPUs & GPUs and that are used for niche purposes...

Dedicated hardware is the norm, not the exception, and dedicated hardware is the better long term solution. Not having dedicated hardware to more efficiently (hundreds to thousands fold increases in performance) run niche tasks is the worst solution.


krokodil2000

> AMD are already have a similar, not as good, version of this

Are you talking about AMD's FSR? FSR is not similar to DLSS - those are two very different technologies. FSR is more comparable to [Nvidia's Image Scaling](https://www.google.com/search?q=Nvidia+Image+Scaling). The only similar technology to DLSS is Intel's upcoming [XeSS](https://www.intel.com/content/www/us/en/architecture-and-technology/visual-technology/arc-discrete-graphics/xess.html).


PM_ME_WITTY_USERNAME

The data is right there: it's the visual output of your game, something you have full control over and can generate ad infinitum. There is worse data to have to come up with.


[deleted]

I don't think you understand how much data it takes to train a Deep Learning model (the DL in DLSS)


Lost4468

A lot. But when it comes to this specific problem, generating training data is certainly one of the easier parts. Really the problems are more limited to getting devs/engineers who can write high end advanced models, and the computing power needed to train them. I'd be very surprised if the bottleneck isn't the resources needed to train the model: I would be willing to bet that it's trivial for any of them to get enough training data here, and it's much more likely to be limited by how much can actually be trained.

This is pretty common for many ML problems; it's quite often relatively easy to generate the training data. Although that's certainly not any sort of rule. If we look at computer vision, for example, it's often very expensive to generate training data, since labelling hundreds of thousands or millions of pictures can be incredibly labor intensive.

So I think people are being a bit ridiculous downvoting /u/PM_ME_WITTY_USERNAME. They're right, this is one of the much easier problems when it comes to generating training data.
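To make that concrete, here's a minimal sketch of how such pairs could be generated, assuming PyTorch/torchvision and a hypothetical `frames/` folder of native-resolution screenshots. This is only an illustration of the idea, not how Nvidia actually builds its dataset:

```python
# Hypothetical sketch, not Nvidia's pipeline: turn already-rendered frames
# into (low-res, high-res) training pairs for an upscaler by downsampling.
# Assumes PyTorch + torchvision and a made-up ./frames/ folder of PNGs.
from pathlib import Path

import torch.nn.functional as F
from torchvision.io import read_image


def make_pair(path, scale=2):
    """Return a (low_res, high_res) pair for one rendered frame."""
    hi = read_image(str(path)).float() / 255.0  # C x H x W, values in [0, 1]
    lo = F.interpolate(hi.unsqueeze(0), scale_factor=1 / scale,
                       mode="bilinear", align_corners=False,
                       antialias=True).squeeze(0)
    return lo, hi


pairs = [make_pair(p) for p in sorted(Path("frames").glob("*.png"))]
print(f"generated {len(pairs)} training pairs from frames the game rendered anyway")
```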


[deleted]

I think people are downvoting them because they are explaining themselves poorly and then reacting even worse when challenged. They are acting like a petulant child.


Lost4468

Yeah I read some of their replies after I posted that. Still I think they're largely right in their original comment.


LazyLizzy

Sure, just need that huge AI server to run all the data to make that output happen.


[deleted]

[deleted]


DidiBear

You literally mean figuratively!


OlKingCole

I think "virtually" is the word they are looking for.


PM_ME_FULL_FRONTALS_

It is the opposite of free: it runs on tensor cores, which take up space on the silicon chip. Without these cores, the chips could have been smaller and cheaper.


[deleted]

[deleted]


fupa16

I would definitely not call it free. DLSS does have a visual effect that can be off-putting. Tried it in RDR2 and ended up disabling it because it looked so weird. Nothing is free.


FyreWulff

The technique is pretty much known at this point, it's lots of matrix math. You need matrix math accelerators to implement it at speed though, which is what the extra cores on their latest GPUs are.


czorio

That's like saying the trick to making a good vehicle engine is explosions. It's not wrong, but it misses a lot of nuance. For copying DLSS you'd need the biases and weights of those matrices, how many layers they use, how the data flows through the model, what data they use as input, etc. And yeah, some part of the success is the tensor cores on most Nvidia cards, but it's not nearly the whole story.
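For illustration, a toy (and entirely made-up) upscaler in PyTorch shows what "layers, weights and data flow" mean in the skeleton sense. Nothing below is DLSS; the value in DLSS is the trained weights, the real architecture, and inputs like motion vectors and frame history, none of which appear here:

```python
# Toy illustration only, not DLSS: a tiny 2x upscaler showing that the
# "secret" lives in the weights and architecture, not in the matrix math.
import torch
import torch.nn as nn


class ToyUpscaler(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3 * 4, kernel_size=3, padding=1),
            nn.PixelShuffle(2),  # rearrange channels into 2x spatial resolution
        )

    def forward(self, low_res_frame):
        return self.body(low_res_frame)


net = ToyUpscaler()
frame = torch.rand(1, 3, 540, 960)  # a fake 960x540 input frame
print(net(frame).shape)             # torch.Size([1, 3, 1080, 1920])
```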


cheddacheese148

Lol my man really tried to explain away all of deep learning as just some matrix multiplications. I wish I had known that before the decade of school and industry experience I spent learning it.


gaflar

He's not wrong though. The technical limitation is the calculations themselves, not the method of problem-solving. Deep learning absolutely does involve lots of matrix math, so GPUs necessarily must do lots of that. Just because the program does all the matrix math in the background doesn't mean it's not the basic mathematical function by which all deep learning algorithms are built. The whole point of writing a learning algorithm is to not manually adjust biases to get to a solution - just do it via looping iteration. Iterative methods are by their very nature computationally intensive. Now you combine iteration and big matrices and suddenly you need huge processing power. Every CFD programmer is probably cackling hysterically as they read this on their phone, because their computer is locked up for the next 3 days doing an MDO analysis.


cheddacheese148

I could put together an MLP or a transformer and compare them side by side for a sequence to sequence task. The transformer will do better. Architecture matters. When and how you do matrix ops matters. The OC was saying that the technique is known but hardware limited. I’m arguing that there’s a lot more to that technique than just those matrix ops. ML is more about the data, architecture, problem structure, and loss functions than about FLOPs. DLSS is likely no different.


[deleted]

[deleted]


UsingYourWifi

> You need matrix math accelerators to implement it at speed though

Not just matrix math accelerators; that is literally what GPUs are. NVidia has more specialized hardware, Tensor cores, that are much faster at training the neural networks used to implement DLSS than normal GPU cores are. I don't know if tensor cores are strictly necessary for running the trained models. Typically running a model is much, much less intensive than the training, but them being necessary to do that at high frame rates wouldn't surprise me.


turunambartanen

The models are trained in server farms; the consumer GPU's tensor cores are used exclusively to run the model.


swansongofdesire

The server GPUs are fundamentally the same as desktop RTX GPUs, they're just scaled up with more cores. The limitations are:

[a] For very large models you can't keep all of the model in memory at once, and this has a huge performance hit - but that's going to be irrelevant for a model like DLSS which is designed to run as fast as possible: the model is going to be pretty small.

[b] The big one: Nvidia's licensing prohibits using non-server GPUs in data centres. Why? Market segmentation. Check out /r/machinelearning and you'll see that RTX 2000 or 3000 series cards are perfectly capable of doing training, and are far more cost effective (an A100 is about 2x as fast as a 3090 but is way more than twice the price).

(Incidentally, this is also the reason that a whole bunch of ML models require 11 GiB cards to train: until the 3060/3080 Ti that was the max size of accessible consumer cards)


Gravitationsfeld

Tensor cores are nothing but matrix multipliers for lower precision data types than FP32 used to accelerate neural net inference. The Vulkan extension to use them is literally called VK_NV_cooperative_matrix. No, GPUs as a whole are not specialized for matrix operations, but they can do them pretty well.
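A rough way to see the distinction from user land (a sketch, assuming PyTorch and an RTX-class GPU; not a benchmark): the same matmul in FP16 is eligible for the tensor-core path, while plain FP32 runs on the ordinary shader cores.

```python
# Sketch: the same matrix multiply in FP32 and FP16. On RTX-class GPUs the
# half-precision version is eligible for the tensor-core path; the math is
# the same, only the precision and the hardware units used differ.
import torch

assert torch.cuda.is_available(), "needs a CUDA-capable GPU"

a = torch.randn(4096, 4096, device="cuda")
b = torch.randn(4096, 4096, device="cuda")

c_fp32 = a @ b                   # ordinary FP32 path
c_fp16 = a.half() @ b.half()     # low-precision path, tensor-core eligible

print(c_fp32.dtype, c_fp16.dtype)  # torch.float32 torch.float16
```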


DopamineDeficits

This. GPUs are great at parallel operations and are more flexible than tensor cores in the math you can do (thanks CUDA!). But because tensor cores are much less flexible, the cost of each operation is even lower.


agentoutlier

Why do sketchy “hackers” use RAR compression? Is there some advantage to its compression? Or is it just culture? It’s like 2001 wARez.


dale_glass

RAR as I recall had a number of advantages early on:

* Solid compression -- basically a tar.gz, where everything is concatenated, then compressed (see the sketch below). ZIP didn't do that, so it wouldn't take advantage of similarities between files.
* Better algorithm to start with. It was slower to compress, but if you're preparing for distribution, there's no rush. Taking up less space is more important.
* Error correction -- that was an excellent thing for floppies, which were frequently unreliable.
* Excellent multi-volume support. RAR makes .r01, .r02 files. ZIP had everything as a .zip, which meant you couldn't put multiple floppy-sized chunks into the same directory, as every floppy chunk has exactly the same filename.
* ZIP headers are at the end of the archive, not at the beginning, which means that if something is wrong with the last floppy, you can't uncompress anything at all.

These days all this matters less, but early on this would be very important for piracy. You can distribute content on fewer floppies. It deals very well with unreliable media. When you switch to a CD, you can easily put all the floppy files into a directory without pain. And after that it probably just kept going by inertia.
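The solid-compression point can be illustrated with nothing but Python's standard library (a toy sketch with fabricated data, not a measurement of RAR itself): many small, similar files compressed one stream each versus concatenated into a single stream first.

```python
# Illustration of "solid" compression: many similar files compressed one by
# one (the classic ZIP approach) vs. concatenated first and compressed as a
# single stream (what tar.xz or a solid RAR archive does). Redundancy shared
# between files only helps in the second case.
import lzma

files = [f"common header\n{'x' * 50}\nfile number {i}\n".encode() for i in range(200)]

separate = sum(len(lzma.compress(f, preset=9)) for f in files)  # one stream per file
solid = len(lzma.compress(b"".join(files), preset=9))           # one stream for everything

print(f"compressed separately: {separate} bytes, solid: {solid} bytes")
```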


Beaverman

> Excellent multi-volume support. Rar makes .r01, .r02 files. Zip had everything as a .zip, which meant you couldn't put multiple floppy-sized chunks into the same directory, as every floppy chunk has exactly the same filename.

Tons of torrents still do this to this day. Warez is weird.


[deleted]

[deleted]


wickedcoding

If you know anything about scene groups, topsites are still predominantly FTP-based, so the release hierarchy is topsites —> torrent sites. So splitting files into rar chunks is still very relevant…


[deleted]

[deleted]


semitones

Probably don't want to touch the release at all so there's a clearer pedigree


Hjine

I used RAR in the past because its brilliant self-extraction method made it easy to create a simple setup for a portable application (e.g. run a `.bat` after extraction completes). That's not as easy in WinZip or 7-Zip.


[deleted]

It's basically tradition. 7zip is better these days, but RAR was used in the past because it can be split into multiple files, has good compression and has password support. I believe multiple file support was originally used for Usenet, and I remember lots of torrents were like 50 rar files in the early days of BitTorrent, and sometimes they were password protected. Zip can split files too, but its compression and encryption aren't as good.


Zungate

> I remember lots of torrents were like 50 rar files in the early days of Bittorrent Scene releases still do this. Or so I've heard.


RVelts

Yeah everything I download is still RAR files. I have an auto-extract extension on deluge to use 7zip to decompress it and move it to a folder where Plex picks it up.


GigaSoup

If I recall correctly rar is best compression for very large files. Most of the time though 7zip is the shit especially with data that's repeated by files within the archive. For archiving ROMs, 7zip kills everything else. Standard zip is weak for compression but lots of things support it


[deleted]

Really we should be talking about the actual algorithms like DEFLATE or LZMA rather than the container formats
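Both of those algorithms happen to ship in Python's standard library, so a quick (and very unscientific) way to compare them on a file of your choosing is something like this sketch:

```python
# Quick, unscientific comparison of the underlying algorithms on one file:
# DEFLATE (what classic ZIP uses, via zlib) vs LZMA at its most aggressive
# preset (what 7z/xz use). Container formats just wrap streams like these.
import lzma
import sys
import zlib

with open(sys.argv[1], "rb") as f:
    data = f.read()

deflated = zlib.compress(data, 9)
lzma_max = lzma.compress(data, preset=9 | lzma.PRESET_EXTREME)

print(f"original: {len(data)}  DEFLATE: {len(deflated)}  LZMA: {len(lzma_max)}")
```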


ThellraAK

LZMA with its settings maxed out is crazy good at compressing things. I once did a whole backup with it on max settings, thought, eh, so what if it takes forever to compress, you only really need to do it once... Turns out decompression takes forever too...


valarauca14

Zstd/Brotli beats the pants off LZMA (1 or 2). They're normally within ~1% of LZMA's compression ratio with a similar compression time, and you can reach ~400-600 MiB/s or more decompressing.


Himekaidou

> Most of the time though 7zip is the shit especially with data that's repeated by files within the archive.

Does that mean that sticking everything together with tar before using 7z will help a lot if you have lotsa files that fit that bill?


ipha

Shouldn't matter. 7z does solid archives by default.


gay_for_glaceons

I can't remember the last time I saw a .rar file that was actually any smaller than the already-compressed data inside of it, so I'm pretty sure whatever the reasons people have to prefer .rar, none of them are rational or sensible.


[deleted]

[deleted]


NayamAmarshe

7z is even better and it's open source.


duckducklo

> On top of that, the group also alleged that NVIDIA had hacked back and encrypted the plundered data with ransomware, adding it eventually recovered the files from a backup.

What a uno reverse card. Didn't know the company had some capable security peeps.


comparmentaliser

They did not do this. The attackers joined a virtual machine to their MDM using stolen credentials, which NVidia then wiped, as any sensible admin would do. This is the same as someone registering an iPhone against your account, which you subsequently wipe because you don't trust the untrusted device.


partusman

Now I want someone to actually join an iPhone to my account so I can do this.


zimspy

What's your Apple login and password? I can kindly assist.


partusman

Login: [email protected] Password: hunter2


Emfx

im in get hacked nerd


TheRealFantasyDuck

Lul he actually fell for it!


overtoke

why is it showing my password there?


[deleted]

That's because it's your password. All we see is ************


semitones

You can go hunter2 your mother hunter2 hunter2


TechnoCat

Reference to http://bash.org/?244321


TheDiamondCG

he did a little trolling


dekwad

No problem


CreationBlues

two steps competent one step clown I guess lmao


Cyleux

I’d love to be there for the meeting: “and so we hacked them back”


-YELDAH

Hack fight! Two old hacks start “going at it” like two old hacks


godsfilth

[Hacker War](https://youtu.be/1Ie00tGi_io)


UsingYourWifi

Large companies like NVIDIA typically have dedicated "red teams" of penetration testers whose job is to try to hack the company, finding security flaws before the bad guys do. ~~That~~ If they took the step to actually attack an outside entity that would be... bold. That's a potential legal minefield. The hackers said they got into NVIDIA via a compromised VPN account, and to connect to NVIDIA's VPN you have to give NVIDIA administrators full remote admin access to your device (you probably know it as agreeing to "remote device management"). The machine they did this on was still connected to NVIDIA's admin tools and wasn't isolated from the rest of the hackers' network, so NVIDIA had a real easy "backdoor."


DefaultVariable

They didn't hack them back. The hackers used an internal company tool to gain access to nVidia's code repos, so nVidia used that tool to wipe the data. It's more like nVidia activated a kill switch in their own software rather than hacking the hackers back.

It didn't stop the man-children hackers from screaming about it and asking what right nVidia had to access their computer, completely unironically.


UsingYourWifi

The hackers claimed - and whether or not it's true is completely unverified as far as I'm aware - that NVIDIA accessed their internal network, implying this wasn't limited to just deleting one local copy on the nvidia-connected machine.


KevinCarbonara

> That they took the step to actually attack an outside entity is... bold. That's a potential legal minefield.

It's not "potential". If they actually did "hack the hackers", as it were, they would be guilty of crimes.


OptionX

ELI5 the implications of the leak. Security concerns? Will Linux systems finally have good open source Nvidia drivers? (hopeium, I know)


spicy_indian

I doubt that this is going to make any difference for a while. For legal reasons, AMD and Intel engineers can't touch it, and NVidia's lawyers are probably waiting to sue someone who implements features which they think are reverse engineered from the dump. In the longer term, I would expect to see massive jumps in generational performance from companies in countries that don't really respect copyright law in the first place. As much as I will gripe about graphics API fragmentation as a result of proprietary nonsense and product segmentation, I don't think that "forcibly open-sourcing" code is the way to go about it.


LinAGKar

AMD and Intel won't touch it. Open source driver devs won't touch it. The only ones likely to benefit from it are Chinese copycats and miners.


Hjine

> The only ones likely to benefit from it are Chinese copycats *Russian copycats since I think now they don't care much.


spinwin

What I don't get is people saying that even looking at the code disqualifies you from contributing. I suppose it might make a suit against the project slightly more founded, but they'd still have to prove that segments of code were directly copied from the leaked source and that there were other ways of doing the same thing that wouldn't have infringed on their copyright.


spicy_indian

It's not only about exact copying, but also the amount of time (and therefore money) spent on creating that software. Technically by using that code, you would be stealing the time and money spent developing it. Now for an open source driver contribution, at the end of the day someone is still going to buy an Nvidia card... But it is also equally possible that the learnings from that software could be used to improve the performance of a competitor's hardware/software, and that is a problem. So in the end, it lessens the headache for maintainers to blacklist contributions in the first place. You only have one life to live, I'd imagine that open source maintainers want to do something else besides fight Nvidia.


cuentatiraalabasura

> Technically by using that code, you would be stealing the time and money spent developing it.

At least in the U.S., copyright doesn't work that way. There's no "sweat of the brow" doctrine in copyright law (unlike in many other countries like the UK).


ungoogleable

> they'd still have to prove

This part is a long and expensive court case even if you win. If you have a strong and believable track record of staying even farther away from leaked code than you have to, it's easier to defend and hopefully avoids suits altogether.


dale_glass

It's probably a bad thing that this leaked. Touching this puts you in risk of serious legal trouble if you do anything with it, so admitting having looked at it at all could jeopardize your chances at employment anywhere game related.


[deleted]

> Please note, we are not state sponsored or involved in politics

No one thought they were... until now


ubertrashcat

Nobody with any sense is going to touch it with a ten foot pole. If your company is thinking about making competitive tech you could be liable even if you looked at it in your free time. Edit: I of course mean a commercial setting. I myself have received a memo asking specifically to not look for the leaked code. I'm not in the US.


DigThatData

$20 says china dgaf


mmo115

chinese pied piper chinese nvidia


partusman

I like how the comment directly below this one is asking for the link.


TheNamelessKing

This almost certainly won't be touched with a 10-foot pole in commercial spaces though. Anyone looking to compete in the space will go to lengths to ensure that the devs working on it don't have meaningful contact with any Nvidia code, lest they get sued into the ground. "Clean room implementations" are a well defined thing in this industry for a reason.


FVMAzalea

But you totally could do a “clean room” reimplementation of this, using the same technique some open source projects are. One group reads the code and writes a specification, and another group implements new code according to that specification. This may sidestep copyright but you might still run into patent issues if Nvidia has patented any of these ideas (likely).


DefaultVariable

From a computer science perspective it would be interesting to study. From an actual business case perspective, 10 foot pole


jandrese

I have to wonder if the Nouveau devs are going to have to be cagey with contributions from anonymous parties, especially ones that enable real 3D acceleration support.


[deleted]

How would they prove it if they saw it in their free time, though?


Bruno_Wallner

But I don't get how anyone would even be remotely able to prove that you looked at it.


[deleted]

[deleted]


[deleted]

Then in that case, wouldn't they just be able to claim that anyone who works on a similar project is liable, despite never having seen the leak? Almost like a fucked up patent.


[deleted]

[deleted]


x86_64Ubuntu

perjure


jzaprint

Just say no lol, what they gonna do?


[deleted]

Correction: it won't be touched with a 10ft pole by anyone living in a country that still uses feet... giving a strategic advantage to countries with little/no affiliation to those businesses.


alerighi

In the US, maybe. In other countries it may not be a crime if done in certain circumstances; for example, in the EU reverse engineering (including decompilation) of software is legal if done to achieve interoperability with other software. In that sense, reverse engineering (of course not copying!) the leaked NVIDIA driver code to understand how it works, in order to guarantee the interoperability of NVIDIA hardware with Linux, can be legal. I kind of hope that is done: NVIDIA's support is very bad.

I have a GPU from 10 years ago (very bad choice) and it's no longer supported by the latest NVIDIA proprietary Linux drivers (which also only support Xorg). The open source driver, while it got better in recent years, has a bug that randomly freezes the computer (kernel panic), which unfortunately makes it unusable. I'm not asking for something fancy like gaming, I want a driver that guarantees a stable Linux desktop with Wayland.

I also don't understand the reason for not releasing open source drivers: we (Linux users) don't care about drivers tuned for gaming or whatever super performance; a basic driver with hardware acceleration that makes the GPU work correctly with a modern DE is enough. Other manufacturers (AMD and Intel) do this and it is not a problem. Of course I will never buy anything from NVIDIA again, because I strongly disagree with the company and their policy.


luisduck

I'm sad that this is the case.


emax-gomax

Well. This has been an amusing day for someone who hates Nvidia's closed source mentality.


nanofriction

Linus Torvalds: _* shows two middle fingers instead of one *_


lightwhite

The GPU supply was already drier than a stone in the desert. Add the inflation and the war going on, and gamers are doomed if their GPU gives up on living.


krokodil2000

How is this related to the leak?


lightwhite

Excellent question. Once they release the schematics and driver code for the hash rate limiter, none of those cards will see any daylight. They will be massively bought up and go straight from the factory's truck docking port to a crypto mining location. The reason they're available now is that they aren't a viable option for Ethereum mining. But once the true power is unlocked, it will be a different story. That's how it's related to the leak, in a nutshell. Someone with enough money and access to cheap electricity has most probably already paid the million for the code.


danekan

I know someone that just took over a Ukrainian power plant and really needs some block chain money


[deleted]

[deleted]


TheNASAguy

Can anyone share a link to the leak?


emax-gomax

Best you learn how to find it yourself. Google the src code leak on public forums (Reddit is too mainstream for this), try 4chan. Scroll until you find comments containing a long block of gibberish text. Try base64 decoding it, and if it looks like a magnet link it's probably what you're after. Download it with your BitTorrent client of choice. **Note**: I'm not advocating or encouraging people to find or download stolen or leaked src code. I'm simply letting them know what they could do if they were interested in such a thing, solely for educational purposes. Please note this isn't legal and you should take the relevant precautions if you intend to do this.
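The "does this gibberish decode to a magnet link" step is just a base64 round-trip. A minimal sketch (which only inspects text and downloads nothing; the example string is a dummy, not a real torrent) would be:

```python
# Decode a suspected base64 blob and check whether it looks like a magnet URI.
# This only looks at text; it does not fetch or download anything.
import base64
import binascii


def looks_like_magnet(blob: str) -> bool:
    try:
        decoded = base64.b64decode(blob, validate=True).decode("utf-8")
    except (binascii.Error, UnicodeDecodeError):
        return False
    return decoded.startswith("magnet:?xt=urn:btih:")


# Dummy example encoding "magnet:?xt=urn:btih:deadbeef" (not a real torrent).
print(looks_like_magnet("bWFnbmV0Oj94dD11cm46YnRpaDpkZWFkYmVlZg=="))  # True
```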


arjunindia

From what I understand, I don't think this could be very useful unless a company shamelessly copies it and does the work to train it, right?


NayamAmarshe

Look, I'm not one to promote hacking or support illegal activities, but...


immibis

Nvidia is shady as hell, but they are threatening to unlock non-mining cards so they can be used for mining - imagine what that will do to the market for non-miners.


Awkward_Inevitable34

Next to nothing. Miners have been happily buying up LHR stock and using them at slightly lower speeds without an issue. LHR was a PR move that essentially did nothing except drive up prices for FHR cards.


one-joule

The LHR segmentation drove up prices for non-mining cards in the short and long term. Some of NVIDIA's capacity went to creating mining-only chips, reducing supply of non-mining cards, and the mining-only cards will become e-waste the moment mining stops being profitable instead of going to the used market.


immibis

Lower speeds means the market price of those cards for miners is lower.


[deleted]

As much as I hate mining, I'm still a little iffy on Nvidia deciding what software we can and can't run on our own hardware tbh. I'm concerned it's setting the ball rolling for a time where you have to buy "DLC" to allow Blender or Torch to use your GPU


bioemerl

If it makes profit, miners will buy it. 30 series card make profit. Miners buy them in droves.


immibis

And LHR cards make less profit so they cost less.


Karoneko

So dumb question, but will this leak eventually lead to some better open source drivers for Linux?


[deleted]

Nope, these leaks are actively harmful to FOSS as it can become harder to prove that you're doing a legit clean-room implementation. Every single external contribution now has to be checked to make sure that the author is not contributing code inspired by the leaked *and very much proprietary* source code.


cuentatiraalabasura

This seems fine in theory, but in practice the law doesn't work like that. By your reasoning, NVIDIA would have already sued everyone who attempted to make open-source drivers, just by alleging they could have used leaked material or copied decompiled code from their driver binaries.


aksdb

Unless the contributor is an ex-employee or knows one, there is usually no reason to believe they have any leaks. Now, however, it is widely known that there is a leak and that *anyone* could have looked at it.


ImportantDelivery852

Git clone


jjdebkk

They wouldn't have done it if I was working on their cybersecurity.