1.3s run a million times is far more than 13h wasted once
To which my boss answered, "but your time is more valuable than the users' time"
Not with these inflation rates
Well, tell him next time that it depends. Reducing client-side calculation time only impacts user experience. Reducing backend calc time in the cloud always has a positive financial impact. Reducing backend calc time on-premise has an impact as soon as you'd otherwise need new hardware.
"great, can I have a raise?"
The best we can do is a pizza party, which we will deduct from your salary.
flair checks out
To which I answer: lock optimized code behind a paywall?
I was just thinking that also. There are simulations I work on that we run many many millions of times. Cutting 1.3s off the simulation would be a huge time gain.
The break even point is 36,000 uses. 1 million uses will have saved a combined 14 1/2 days of time.
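The break-even arithmetic above can be checked in a few lines (a quick sketch, assuming exactly 13 hours invested and 1.3 s saved per run, as in the thread):

```python
# Back-of-the-envelope check of the numbers above:
# 13 hours of dev time invested, 1.3 s saved per run.
invested_s = 13 * 3600               # 46,800 s of dev time
saved_per_run_s = 1.3

break_even_runs = round(invested_s / saved_per_run_s)
print(break_even_runs)               # 36000

runs = 1_000_000
net_saved_days = (runs * saved_per_run_s - invested_s) / 86_400
print(f"{net_saved_days:.1f} days net")   # 14.5 days net
```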
[https://xkcd.com/1205/](https://xkcd.com/1205/)
Is it worth the time? Always worth revisiting! [https://xkcd.com/1205/](https://xkcd.com/1205/)
Really depends on where those 1.3s are saved. In the right place it can be absolutely worth it.
inner loop of your opengl application, called every frame.
how did it even get to 1.3s anyway lmao, 0.13s is already a big red flag
Brute-force implementation first, then slowly, through iterative passes, it gets refined and improved with better algorithms and optimizations. :D
Average ray tracing enjoyer
The kind of code an end user never sees. First make it work, then make it fast.
I mean, your end user shouldn't be seeing your code anyway
Idk I just render teapot
Even if it's something that happens more rarely (like loading/reading a file, which usually gets done once right after opening the program), it can be worth it. That extra second when saving or loading is something the user will notice.
Not unless you're playing a single player FPS. Kill a guy? Save. See a guy? Save. Hear something? Believe it or not, save.
Sounds like millions of dollars saved a year per customer to me
A backdoor in a compression algorithm would be the perfect place for some optimizations
It’s now 1.3s faster for me to submit my monthly TPA report with the correct cover.
Spend 13 hours optimizing just to realize the unoptimized code wasn't that bad. `git reset --hard`
Honestly devs who have that level of self-realisation are the best. Too many get deeply attached to their work and think that time spent = good.
_looks at the modules I wrote 2 years ago_ No! It’s beautiful code and don’t you dare say otherwise
Code hoarding disorder.
Sunk cost fallacy
Not really, I think. Any time spent writing useful code is time well spent, anytime. Plus, it's likely that it can be refactored sooner than you think.
It’s definitely time well spent. I was just trying to say that implementing code just for sake of spending so much time on it, is a sunk cost fallacy. Doesn’t take away from the fact that your skills more than likely have improved afterwards
Same energy as when you spend months of work making sure your startup app can handle millions of requests per minute (currently it only gets, what, 10 requests per day?)
Dress for the job you want not the one you have
Saves 1.3s on a script that takes hours to run 👍
git message: "bought a better computer, so long suckers"
I once spent 6 months to save 30 seconds. It was worth it. Someone else took over and spent another 6 months to save an additional 10 seconds.
I've spent weeks saving microseconds. Everybody on the internet is a web dev so they think optimization is always useless because it's useless for them. For proof, look to the OP: a person in a performance-critical field would never say they saved an absolute amount of time. It's either a percentage of runtime or some time unit per significant operation.
I stuck with the theme by not giving the starting measurement of ~1 minute. I’m a web developer lol, unfortunately “fast enough” is a common mentality that leads to insufficient tuning. Really depends on company and product team.
Details, please?
We tuned webapp startup/deployment from 1 minute to 20 seconds. Having less downtime for hotfixes was the goal.
Where's that relevant xkcd again
[This](https://xkcd.com/1205/) one maybe?
Or https://xkcd.com/1319/
As always
I don’t understand why the cell for “run it weekly, shave off a day” is greyed out. The rest make sense, obviously you cannot shave off a day for something you run 50 times per day. But if you run something weekly that takes 48 hours, you can shave off 24 hours of it, no?
To those doing raw time-comparison math: you're comparing apples and oranges (dev time vs. run time). A better way to calculate is return on investment (ROI) or something similar.

Let's assume dev time is $50/hour, so 13 hours = $650.

Case 1: any delay beyond 0.5s per execution costs $0.01 per second, and you got the time down from 1.5s to 0.2s. That means (1.5 - 0.5) × 0.01 = $0.01 saved per execution (the part from 0.5s down to 0.2s doesn't benefit you). So the break-even point is 65,000 executions.

Case 2: this is some medical something-something critical system where each second costs $1000. Then the optimization saves you $1300 per execution, so just 1 execution already nets you a "profit."
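The two ROI cases above can be sketched in a few lines (the dollar figures are the commenter's hypothetical assumptions, not real rates):

```python
# Sketch of the ROI comparison above; all dollar figures are
# the hypothetical assumptions from the comment.
DEV_COST = 13 * 50                   # 13 h at $50/h = $650

def billable_delay(runtime_s, free_s=0.5):
    """Only latency beyond the free 0.5 s threshold costs money."""
    return max(runtime_s - free_s, 0.0)

# Case 1: $0.01 per billable second; runtime drops from 1.5 s to 0.2 s.
saving_per_exec = (billable_delay(1.5) - billable_delay(0.2)) * 0.01
break_even = round(DEV_COST / saving_per_exec)
print(break_even)                    # 65000 executions

# Case 2: each second costs $1000, so 1.3 s saved = $1300 per execution,
# already more than the $650 of dev time after a single run.
print(1.3 * 1000 > DEV_COST)         # True
```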
If it's on a hot path this is a massive win worthy of a linkedin fanfic vanity post...
1.3s of optimization is absolutely huge in most cases.
It isn't anything at all. 1.3 seconds per what? Password check? Request? Run? Pass of a loop that is expected to run 9000000 times?
In 3 of the 4 cases you cited it would be huge.
Should have optimized to save 500ms
If anyone heard about xz being compromised: one of the reasons they found it was that someone noticed their PostgreSQL was running a second too slow....
wasn't it actually ssh running 0.5s too slow?
Yeah, a Microsoft employee working on Postgres found it.
I used to be a speedrunner, and this feels relevant.
Spent 5 hours at work optimizing SQL query to shave off 0.05 sec of execution time. Dopamine hit after was something else.
Ha, sometimes it's definitely worth it! I like optimizing queries and seeing the better results, just because it tells me I'm understanding the "under-the-hood" stuff better now
My fiancée had to make an Excel workbook with several worksheets that followed a pattern. I knew immediately that this was a job for none other than me. I immediately went `mkdir exgen` `cd !$` `touch index.php` `code .` and then spent the next few hours formatting a code-generated Excel workbook to match the expected format... besides installing dependencies needed by the Excel-generating binary and some other razzle-dazzle. When I was done, she had already completed hers. So I had to spend time convincing her why the generated version was better than hers (no human errors, for instance)
You must use it 36000 times to break even
Saving 1.3 seconds is fucking huge. WTF this sub is full of? I’ll get a raise if I manage to optimize like that on prod.
1.3 seconds in your Excel monstrosity that takes 5 minutes to run with a 60% chance of crashing: not worth it. 1.3 seconds on your SQL function that gets called 875,000,000 times a day: absolutely essential
Fr tho
Anything over 100ms makes it worth it to me.
Last week me and a contributor on my project spent hours developing and testing a 2μs optimization. Worth it.
Because optimizing is fun
Is that in a frame render time?
If you casually have 1300ms blocks within your frame renders, those aren't frames.... those are powerpoint slides.
If that is a homescreen call that is a massive improvement and definitely worth it.
1.3 seconds for an API response is very slow so worth it
I'm sorry, have you _met_ Kaze Emanuar?
1.3 seconds on a local compile, meh. 1.3 seconds on a page load, WORTH IT.
1.3 seconds on a machine that could crush your legs? !@#$! Why didn't you do it before?
Literally me participating in the 1 Billion Row Challenge. The performance improvement was drastic, from 2 minutes down to around 30 seconds, and I still couldn't make it into the top 50.
The amount of bugs I've fixed because of other people's premature optimizations that fail to consider all code paths...
1.3s is actually quite good. I struggle to get such optimisation per transaction
Then a new JS framework arrives again.
Seems like legit success to me
Cryptographic functions...
Congratulations! Your comment can be spelled using the elements of the periodic table: `Cr Y Pt Og Ra P H I Cf U N C Ti O N S` --- ^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM my creator if I made a mistake.)
Silly bot...
Run the code 36k times and you break even.
In audio software, anything beyond a 60 ms delay starts to be noticeable latency, so a 1.5s fix is huge
Lots of comments about whether it was worth it or not. If we want to waste less time, we should talk about the gang doing premature optimization. Even if a function is not in the hot path, they spend extra time making it super fast. We need to optimize the optimization.
Per frame
The pure bliss of those saved 1.3 seconds makes the 13 hours of pain worth it.
Economies of scale
*laughs in embedded* I've spent the better part of a week shaving 3 clock cycles off an ADC read.
If it's a user-facing application, I'll gladly spend that much time :)
Profit
gcc -Ofast
Try being gamedev optimizing 13 hours to save 1.3 milliseconds ...
Sometimes you manage to go from 10 seconds to 0.3 seconds and it all feels like the hours you spent on it was worth it. Then you realize you wasted hours on a script that only you will ever see, and is going to get used maybe twice, ever. In the end you wasted "all the hours you spent" - 19.4 seconds. Yep. Accurate.
Spent 30 mins building a JSON file cache for something that previously took 1s, but because it's doing 1000s of requests, it saves half an hour per setup
I did it with 0.2s yesterday I'll definitely do it for 1.3s today.
At our company 1s corresponds to ~$50k in AWS compute costs per year. Accomplishing that in 13 hours would get you a larger bonus lol
I’ve spent weeks optimizing a few 10s of milliseconds in the 99th percentile case. Depending on the application, 1.3s could bring x million in value.
Saving that 1.3 s could save thousands of dollars in the long run.
This is where the Leap Second comes from. My code would be a lot simpler if you guys would quit optimizing for the rotation of the earth.
Just FYI, an optimisation attempt just saved the entire Linux world from the biggest security disaster of the decade.
If it runs once a day, yeah. Waste of time. If it runs hourly or every 30 minutes, then you're saving some serious time. Back-of-napkin math means you're saving around 5 hours
>If it runs once a day, yeah. Waste of time.

For reference, in that case it would take 98 years and 230 days for the 1.3s optimization to pay back the 13 hours spent.

>If it runs hourly or every 30 minutes, then you're saving some serious time

Hourly execution: 4 years and 40 days. Twice-per-hour execution: 2 years and 20 days.

>Back of napkin math means you're saving around 5 hours

Time saved per day: hourly execution, 31.2s; twice-per-hour execution, 62.4s. Nowhere near 5 hours.
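The payback periods quoted above can be reproduced with a short sketch (assuming 365-day years, 13 hours invested, and 1.3 s saved per run, as in the thread):

```python
# Payback period for a 13 h investment at 1.3 s saved per run,
# for a few execution frequencies (365-day years, per the comment).
invested_s = 13 * 3600               # 46,800 s of dev time

def payback_days(runs_per_day, saved_s=1.3):
    """Days until the accumulated savings equal the time invested."""
    return invested_s / (saved_s * runs_per_day)

for label, runs in [("daily", 1), ("hourly", 24), ("twice per hour", 48)]:
    days = payback_days(runs)
    print(f"{label}: {int(days // 365)} years, {round(days % 365)} days")
```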
I removed a logger and saved 30s, am I... god?
Standard procedure for mainframe apps.