There is no "standard" time. It depends on your camera, your focal ratio, and your filters.
A lot of what you hear out there is a holdover from the CCD and early CMOS days, when cameras had low quantum efficiency and high read noise.
The need for long sub-exposure times pretty much only applies to narrowband filters these days. It's also not just about noise. Have you ever had problems with star colours being tough to bring out? A likely cause is blowing out the pixel wells with sub-exposures that are too long.
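If you want to sanity-check that saturation ceiling on your own rig, here's a rough sketch. Both input numbers are hypothetical placeholders, not measurements from any particular camera; in practice you'd read the peak level of a bright star off a short test sub and scale from there:

```python
# Rough sketch of the saturation ceiling: subs long enough that bright
# star cores hit full well will clip, and clipped channels lose colour.
# Both values below are assumed placeholders for illustration only.

full_well_e = 50_000        # assumed full-well capacity in electrons
star_peak_e_per_s = 500     # assumed e-/s landing in the brightest pixel

t_clip = full_well_e / star_peak_e_per_s
print(f"Bright star cores clip after roughly {t_clip:.0f} s per sub")
```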
A big thing people fail to realize is that once you hit a certain sub-exposure time, there is essentially no benefit to going past it: once the sky background swamps your read noise, total integration time is what matters, not the length of the individual subs. Longer sub-exposures do not automatically mean a better final stack.
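That "swamp the read noise" rule of thumb is what most CMOS sub-exposure calculators are built around. Here's a minimal sketch, assuming you've measured your sky background rate in e-/s/pixel; the camera numbers below are placeholders, and the 10x swamp factor is just one common choice, not gospel:

```python
# Minimal sketch of the "swamp the read noise" criterion. Plug in your
# own camera's read noise at your chosen gain and a measured sky rate.

def min_sub_exposure(read_noise_e, sky_rate_e_per_s, swamp_factor=10):
    """Shortest sub where sky background dominates read noise.

    Criterion: sky electrons per sub >= swamp_factor * read_noise^2,
    so read noise adds almost nothing to the stacked result.
    """
    return swamp_factor * read_noise_e**2 / sky_rate_e_per_s

# Hypothetical example values (not measured from any specific setup):
rn = 1.5     # e- read noise
sky = 0.8    # e-/s/pixel sky background rate
print(f"Minimum useful sub: {min_sub_exposure(rn, sky):.0f} s")
```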
For example, here are the exposure times I have calculated for my FLT91:

Compare that to my RASA 8:

You can see they are vastly different, mostly because of the focal ratio. If you are used to CCD cameras, a lot of these exposure times will look insanely short, but cameras have come a long way.
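The f-ratio alone accounts for most of that gap: sky flux per pixel scales roughly with 1/(f-ratio)^2, so the sub length needed to swamp read noise scales with (f-ratio)^2. A quick back-of-the-envelope using the published f-numbers of the two scopes above:

```python
# Why the f-ratio dominates: for the same camera, required sub length
# scales with the square of the focal ratio. f-numbers are the published
# specs of the two scopes; the ratio is the point of the example.

flt91_f = 5.9   # WO FLT91 at native f/5.9
rasa8_f = 2.0   # Celestron RASA 8 at f/2.0

scale = (flt91_f / rasa8_f) ** 2
print(f"The FLT91 needs roughly {scale:.1f}x longer subs than the RASA 8.")
# -> roughly 8.7x, which is why the two sets of times look so different.
```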
If you have an hour to kill, I'd suggest checking out this video:
https://www.youtube.com/watch?v=3RH93UvP358

Steve Mandel:
It seems like most folks are using 5 minute exposures when shooting with LRGB or Ha OIII SII filters. Why is that? Why not use 10 or 20 minute exposures? Please educate me, I'm new to CMOS!
To actually answer your question, though: 5 minutes seems like a standard because for the longest time people were told longer subs = better, and 5 minutes was a decent compromise on CCD hardware. Again, that advice is from the CCD days and is no longer valid with newer CMOS cameras.