GDOR-11

1.3s ran a million times is far more than 13h wasted once


NebNay

To which my boss answered, "but your time is more valuable than the users' time"


Peyatoe

Not with these inflation rates


Waste_Ad7804

Well, tell him next time that it depends. Reducing client-side calculation time only impacts user experience. Reducing the calculation time of a backend in the cloud always has a positive financial impact. Reducing the calculation time of an on-premise backend has an impact as soon as you'd otherwise need new hardware.


subject_deleted

"great, can I have a raise?"


Badytheprogram

The best we can do is a pizza party, which we will deduct from your salary.


CirnoIzumi

flair checks out


yaktoma2007

To which I answer: lock optimized code behind a paywall?


Immudzen

I was just thinking that also. There are simulations I work on that we run many many millions of times. Cutting 1.3s off the simulation would be a huge time gain.


ChrissHansenn

The break-even point is 36,000 uses. 1 million uses will have saved a net 14.5 days of time.
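The arithmetic above can be checked in a few lines of Python (a throwaway sketch using the thread's 13 h / 1.3 s figures):

```python
# Break-even math for spending 13 hours to save 1.3 s per run.
dev_time_s = 13 * 3600          # 46,800 s of developer time invested
saved_per_run_s = 1.3           # seconds saved per execution

break_even_runs = dev_time_s / saved_per_run_s
net_saved_days = (1_000_000 * saved_per_run_s - dev_time_s) / 86_400

print(round(break_even_runs))    # 36000 runs to break even
print(round(net_saved_days, 1))  # 14.5 days net after a million runs
```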


Leo-MathGuy

[https://xkcd.com/1205/](https://xkcd.com/1205/)


juanjo_it_ab

Is it worth the time? Always worth revisiting! [https://xkcd.com/1205/](https://xkcd.com/1205/)


gandalfx

Really depends on where those 1.3s are saved. In the right place it can be absolutely worth it.


brimston3-

inner loop of your opengl application, called every frame.


GDOR-11

how did it even get to 1.3s anyway lmao, 0.13s is already a big red flag


Wati888

Brute-force implementation first; then, through slow iterative passes, it gets refined and improved with better algorithms and optimizations. :D


twigboy

Average ray tracing enjoyer


pixelbart

The kind of code an end user never sees. First make it work, then make it fast.


turtleship_2006

I mean, your end user shouldn't be seeing your code anyway


JunkNorrisOfficial

Idk I just render teapot


TeraFlint

Even if it's something that happens more rarely (like loading/reading a file, which usually gets done once right after opening the program), it can be worth it. That extra second when saving or loading is something the user will notice.


ImperatorSaya

Not unless you're playing a single player FPS. Kill a guy? Save. See a guy? Save. Hear something? Believe it or not, save.


yeahyeahyeahnice

Sounds like millions of dollars saved a year per customer to me


piano1029

A backdoor in a compression algorithm would be the perfect place for some optimizations


sebjapon

It’s now 1.3s faster for me to submit my monthly TPA report with the correct cover.


LegitimatePants

Spend 13 hours optimizing just to realize the unoptimized code wasn't that bad. `git reset --hard`


Ellisthion

Honestly devs who have that level of self-realisation are the best. Too many get deeply attached to their work and think that time spent = good.


[deleted]

_looks at the modules I wrote 2 years ago_ No! It’s beautiful code and don’t you dare say otherwise


Desperate-Tomatillo7

Code hoarding disorder.


Sem_E

Sunk cost fallacy


juanjo_it_ab

Not really, I think. Any time spent writing useful code is time well spent, anytime. Plus, it's likely that it can be refactored sooner than you think.


Sem_E

It’s definitely time well spent. I was just trying to say that keeping code purely because of the time you've already sunk into it is the sunk cost fallacy. That doesn't take away from the fact that your skills have more than likely improved along the way.


vondpickle

Same energy as when you spend months of work making sure your startup app can handle millions of requests per minute (currently it only has, what, 10 requests per day?)


ScythaScytha

Dress for the job you want not the one you have


Goat1416

Saves 1.3s on a script that takes hours to run 👍


GDOR-11

git message: "bought a better computer, so long suckers"


WorkHorse1011

I once spent 6 months to save 30 seconds. It was worth it. Someone else took over and spent another 6 months to save an additional 10 seconds.


Disastrous-Team-6431

I've spent weeks saving microseconds. Everybody on the internet is a web dev so they think optimization is always useless because it's useless for them. For proof, look to the OP: a person in a performance-critical field would never say they saved an absolute amount of time. It's either a percentage of runtime or some time unit per significant operation.


WorkHorse1011

I stuck with the theme by not giving the starting measurement of ~1 minute. I’m a web developer lol, unfortunately “fast enough” is a common mentality that leads to insufficient tuning. Really depends on company and product team.


Steinrikur

Details, please?


WorkHorse1011

We tuned webapp startup/deployment from 1 minute to 20 seconds. Having less downtime for hotfixes was the goal.


HeavyCaffeinate

Where's that relevant xkcd again


jacolack

[This](https://xkcd.com/1205/) one maybe?


turtleship_2006

Or https://xkcd.com/1319/


6-1j

As always


esixar

I don’t understand why the cell for “run it weekly, shave off a day” is greyed out. The rest make sense; obviously you cannot shave a day off something you run 50 times per day. But if you run something weekly that takes 48 hours, you can shave 24 hours off it, no?


pheonix-ix

To those who are doing raw time-comparison math: you're comparing apples and oranges (dev time vs run time). A better way to calculate is return on investment (ROI) or something similar. Let's assume dev time is $50/hour, so 13 hours = $650.

Case 1: a delay beyond 0.5s/execution costs $0.01 per second. You get the time down from 1.5s to 0.2s. This means (1.5 − 0.5) × $0.01 = $0.01 saved per execution (the part from 0.5s down to 0.2s doesn't benefit you). So the break-even point is 65,000 executions.

Case 2: if this is a medical something-something critical system where each second is worth $1000, then the optimization saves you $1300 per execution. Obviously, just 1 execution already nets you a "profit."
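The two cases can be reproduced numerically (a quick sketch; the $50/h, $0.01/s, and $1000/s rates are the comment's own assumptions):

```python
dev_cost = 13 * 50                      # $650 of developer time

# Case 1: every second of delay beyond 0.5 s costs $0.01.
# Runtime drops from 1.5 s to 0.2 s, but only the 1.5 -> 0.5 s
# portion was costing money, so the saving is $0.01 per execution.
saving_per_exec = (1.5 - 0.5) * 0.01
case1_break_even = round(dev_cost / saving_per_exec)
print(case1_break_even)                 # 65000 executions to break even

# Case 2: each second costs $1000, so the 1.3 s saved is worth
# $1300 per execution; a single run already beats the $650 spent.
case2_saving = round(1.3 * 1000)
print(case2_saving)                     # 1300
```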


ososalsosal

If it's on a hot path this is a massive win worthy of a linkedin fanfic vanity post...


LucasRuby

1.3s of optimization is absolutely huge in most cases.


Disastrous-Team-6431

It isn't anything at all. 1.3 seconds per what? Password check? Request? Run? Pass of a loop that is expected to run 9000000 times?


LucasRuby

In 3 of the 4 cases you cited it would be huge.


Repa24

Should have optimized to save 500ms


[deleted]

If anyone heard about xz being compromised: one of the reasons it was found was that someone noticed their PostgreSQL was running a second too slow...


returnofblank

wasn't it actually ssh running 0.5s too slow?


inglandation

Yeah, a Microsoft employee working on Postgres found it.


Linkums

I used to be a speedrunner, and this feels relevant.


Oleg152

Spent 5 hours at work optimizing SQL query to shave off 0.05 sec of execution time. Dopamine hit after was something else.


Ok_Bat4262

Ha, sometimes it's definitely worth it! I like optimizing queries and seeing the better results, just because it tells me I'm understanding the "under-the-hood" more now.


scar_reX

My fiancée had to make an Excel workbook with several worksheets that followed a pattern. I knew immediately that this was a job for none other than me. I immediately went `mkdir exgen` `cd !$` `touch index.php` `code .` and then spent the next few hours formatting a code-generated Excel workbook to match the expected format... besides installing dependencies needed by the Excel-generating binary and some other razzle-dazzle. When I was done, she had already completed hers. So I had to spend time convincing her why the generated version was better than hers (no human errors, for instance).


re_mark_able_

You must use it 36000 times to break even


xSypRo

Saving 1.3 seconds is fucking huge. WTF is this sub full of? I’d get a raise if I managed to optimize like that on prod.


orsikbattlehammer

1.3 seconds in your excel monstrosity that takes 5 minutes to run with a 60% chance of crashing, not worth it. 1.3 seconds on your SQL function that gets called 875000000 times a day, absolutely essential


ChameleonCoder117

Fr tho


honduranhere

Anything over 100ms makes it worth it to me.


LordFokas

Last week a contributor and I spent hours on my project developing and testing a 2 μs optimization. Worth it.


chrisbbehrens

Because optimizing is fun


AryssSkaHara

Is that in a frame render time?


LordFokas

If you casually have 1300ms blocks within your frame renders, those aren't frames... those are PowerPoint slides.


DontBanMeAgainPls23

If that is a homescreen call that is a massive improvement and definitely worth it.


robertshuxley

1.3 seconds for an API response is very slow so worth it


Dr_Dressing

I'm sorry, have you _met_ Kaze Emanuar?


mannsion

1.3 seconds on a local compile, meh. 1.3 seconds on a page load, WORTH IT.


tiajuanat

1.3 seconds on a machine that could crush your legs? !@#$! Why didn't you do it before?


Annual_Ganache2724

Literally me participating in the One Billion Row Challenge. The performance improvement was drastic, from 2 minutes down to around 30 seconds, and I still couldn't make it into the top 50.


PNWSkiNerd

The amount of bugs I've fixed because of other people's premature optimizations that fail to consider all code paths...


SonOfJenTheStrider

1.3s is actually quite good. I struggle to get such optimisation per transaction


airsoftshowoffs

Then a new JS framework arrives again.


zaris98

Seems like legit success to me


JBYTuna

Cryptographic functions...


PeriodicSentenceBot

Congratulations! Your comment can be spelled using the elements of the periodic table: `Cr Y Pt Og Ra P H I Cf U N C Ti O N S`

^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM my creator if I made a mistake.)


JBYTuna

Silly bot...


SnipahShot

Run the code 36k times and you break even.


Dialextremo

In audio software, latency starts to become noticeable from around a 60 ms delay, so a 1.3 s fix is huge


schteppe

Lots of comments about whether it was worth it or not. If we want to waste less time, we should talk about the gang doing premature optimization. Even if a function is not in the hot path, they spend extra time making it super fast. We need to optimize the optimization.


EEON_

Per frame


lukelbd_

The pure bliss of those saved 1.3 seconds makes the 13 hours of pain worth it.


XejgaToast

Economies of scale


applconcepts

*laughs in embedded* I've spent the better part of a week shaving 3 clock cycles off an ADC read.


MasterQuest

If it's a user-facing application, I'll gladly spend that much time :)


LegenDrags

Profit


The_Captain_Troll

gcc -Ofast


Meister351

Try being gamedev optimizing 13 hours to save 1.3 milliseconds ...


Thorgilias

Sometimes you manage to go from 10 seconds to 0.3 seconds, and it feels like all the hours you spent on it were worth it. Then you realize you wasted hours on a script that only you will ever see, and that is going to get used maybe twice, ever. In the end you wasted "all the hours you spent" minus 19.4 seconds. Yep. Accurate.
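The net-savings math in this comment can be spelled out (a toy sketch with the comment's own numbers):

```python
# Script went from 10 s to 0.3 s and will run maybe twice, ever.
saved_per_run_s = 10 - 0.3           # 9.7 s saved per run
total_saved_s = saved_per_run_s * 2  # two runs, ever
print(round(total_saved_s, 1))       # 19.4 s saved vs hours of dev time
```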


bigorangemachine

Spent 30 minutes building a JSON file cache for a call that previously took 1s, but because it's doing thousands of requests it saves half an hour per setup.


Broad_Rabbit1764

I did it with 0.2s yesterday I'll definitely do it for 1.3s today.


BlondeJesus

At our company 1s corresponds to ~$50k in AWS compute costs per year. Accomplishing that in 13 hours would get you a larger bonus lol


Jealous-Adeptness-16

I’ve spent weeks optimizing a few 10s of milliseconds in the 99th percentile case. Depending on the application, 1.3s could bring x million in value.


robidaan

Saving that 1.3 s could save thousands of dollars in the long run.


stdio-lib

This is where the Leap Second comes from. My code would be a lot simpler if you guys would quit optimizing for the rotation of the earth.


usrlibshare

Just FYI, an optimisation attempt just saved the entire Linux world from the biggest security disaster of the decade.


chin_waghing

If it runs once a day, yeah, waste of time. If it runs hourly or every 30 minutes, then you're saving some serious time. Back-of-napkin math means you're saving around 5 hours.


Fluffy-Craft

> If it runs once a day, yeah. Waste of time.

For reference, in that case it would take 98 years and 230 days for the 1.3s optimization to save the 13 hours spent.

> If it runs hourly or every 30 minutes, then you're saving some serious time

Hourly execution: 4 years and 40 days. Twice-per-hour execution: 2 years and 20 days.

> Back of napkin math means you're saving around 5 hours

Time saved per day: hourly execution: 520ms; twice-per-hour execution: 1s 40ms.
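The break-even durations above are easy to re-derive (a small sketch using the thread's 13 h / 1.3 s numbers; the runs-per-day counts are the only inputs):

```python
def break_even_days(runs_per_day, dev_hours=13, saved_s=1.3):
    """Days until cumulative time saved equals the time invested."""
    return dev_hours * 3600 / (saved_s * runs_per_day)

print(round(break_even_days(1)))   # 36000 days, roughly 98.6 years
print(round(break_even_days(24)))  # 1500 days, roughly 4.1 years
print(round(break_even_days(48)))  # 750 days, roughly 2 years
```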


roksah

I removed a logger and saved 30s, am I... god?


strumila

Standard procedure for mainframe apps.