Hi Dimitris,
Light pollution is a signal (typically in the form of a gradient, because the sky is more polluted lower down and less polluted higher up) plus the associated noise, which on average is the square root of the signal.
Let's say for a given image you exposed for 20 seconds and have 30,000/65535 average light pollution; this will be accompanied by sqrt(30,000) ≈ 173/65535 of noise. You needed to expose that long because you are capturing something so faint that it only emits a signal of 10/65535 during the exposure. If you wanted to capture something 10 times fainter, you are out of luck: you cannot expose 10 x 20 = 200 seconds, because light pollution would give you a completely white sub. So this is the first limitation: you cannot expose as long as you'd like, and some targets are simply beyond your reach.
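If you like playing with the numbers, here is a rough back-of-the-envelope sketch of the above in Python (the rates and the little helper are my own illustration of the example figures, not measurements from a real camera):

```python
import math

FULL_SCALE = 65535            # 16-bit sensor, everything in ADU for simplicity
sky_rate = 30_000 / 20        # assumed sky background rate: 1500 ADU/s (30,000 in 20 s)
target_rate = 10 / 20         # assumed faint target rate: 0.5 ADU/s (10 in 20 s)

def sub_stats(exposure_s):
    """Sky signal, sky shot noise and target signal for a single sub."""
    sky = sky_rate * exposure_s
    noise = math.sqrt(sky)            # shot noise ~ sqrt(signal), as above
    target = target_rate * exposure_s
    return sky, noise, target

sky, noise, target = sub_stats(20)
print(f"20 s sub: sky={sky:.0f}, noise={noise:.0f}, target={target:.0f}")
# -> sky=30000, noise=173, target=10

# Longest exposure before the sky alone fills the sensor:
print(f"sky saturates a sub after ~{FULL_SCALE / sky_rate:.0f} s")   # ~44 s here
```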
So you have settled for 20 seconds and have 30,000/65535 of light pollution plus its noise. You can subtract the light pollution itself relatively easily with a gradient removal tool, especially if the subject is darker. Come to think of it, at 200mm your image covers a significant part of the sky, meaning the gradients can be a tad too wild and even mess with your flat field, but overall getting rid of them is doable. However, the light pollution noise remains. In our example, any signal fainter than 173/65535 is completely hidden, and above that you lose dynamic range: there is no way to tell whether a pixel is 540 or 480 when the noise is around 173; essentially your "step" is 173, and it is like having a 9-bit camera (not exactly, but close enough). In a nutshell, imaging under heavy light pollution means the dark stuff is invisible -which is to be expected- and the less dark stuff is posterized -which is a little unexpected but makes perfect sense.
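Here is where the "9-bit" figure comes from, again just a toy calculation of my own rather than anything exact:

```python
import math

def effective_bits(full_scale, noise):
    """Rough bit depth left when the smallest meaningful step equals the noise."""
    return math.log2(full_scale / noise)

print(f"{effective_bits(65535, 173):.1f} bits")   # ~8.6, i.e. roughly a 9-bit camera
```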
It follows that you need to reduce noise (improve your SNR, in scientific lingo), and the only way to do that is by integrating more subs. Noise is effectively divided by the square root of the number of subs: if you take 100 subs like the one in our example, the noise effectively becomes 17 (173 divided by 10, which is the square root of 100). Now stuff as faint as 34/65535 (roughly twice the new noise) is visible and your dynamic range is close to 12 bits. To get the full dynamic range the same equipment would give you at a perfectly dark site, you need more than 20 thousand subs, which is not impossible but has many practical limitations. Planetary imagers can do it because their subs are milliseconds long. For a DSO you need around 20 seconds of exposure at the very least, therefore your 20,000-plus subs take over 100 hours = multiple nights, with the target traversing significantly different parts of the sky, with everything that entails. For example, you will have different seeing from night to night, which is bound to cost you detail, and your light pollution signal will no longer be a simple gradient but a higher-degree 2D polynomial kind of thing, which can be very, very difficult to tackle.
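The stacking arithmetic, same caveats as before (the helper is mine, the figures are the ones from the example):

```python
import math

def stacked_noise(single_sub_noise, n_subs):
    """Noise after averaging n_subs equal subs: divided by sqrt(n_subs)."""
    return single_sub_noise / math.sqrt(n_subs)

print(f"{stacked_noise(173, 100):.0f}")   # ~17: detail down to ~2x that becomes visible

# Subs needed to push the noise all the way down to ~1 ADU (the full 16-bit range):
n_needed = math.ceil(173 ** 2)            # noise / sqrt(N) = 1  =>  N = noise^2
hours = n_needed * 20 / 3600              # at 20 s per sub
print(f"{n_needed} subs, ~{hours:.0f} hours of integration")   # ~30,000 subs, ~166 h
```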
Putting it all together:
- Individual subs cannot/should not be too long because the sky saturates them (you are "sky limited"), so really faint stuff is out of the question. You can capture Andromeda, for example, but definitely not the Hα clouds surrounding it. You can capture the Crescent Nebula, but it will be a crescent, not oval-shaped like it looks from a dark location or in narrowband. And so on.
- Subs will have less faint detail and look more posterized than equivalent subs from a dark site.
- You can compensate by integrating more subs, but only up to a point. Your results will never be as good as from a dark site, particularly for the darker stuff, because returns diminish and can even turn negative after the first few thousand subs.
Cheers,
Dimitris