TL;DR (too long; didn't read): just skip this post, OK?
There are so many points I want to make -- I don't even know where to start...
It's the video bitrate -- not the video resolution -- that has the greatest impact on overall video viewing quality.
**
1080i & 1080p -- don't worry too much about this. In real world viewing, the difference is not that great.
PLEASE / KINDLY understand that in the world of video broadcasting / streaming -- resolution alone does not define the bandwidth requirements of a video; you need to (at least) also state clearly the following:
(a) the framerate
(b) the type of video compression used (if any)
(c) the overall video bitrate
To me,
point (c) is the single most important factor, more so than resolution.
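As a quick back-of-envelope sketch (in Python), here's why resolution alone tells you so little -- note the bits-per-pixel figures below are my own illustrative assumptions, not codec specs:

```python
# Rough bandwidth estimate from resolution, framerate and compression.
# The bits-per-pixel values are illustrative assumptions, not real specs.
def bitrate_mbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e6

# Uncompressed 8-bit 4:2:0 video spends 12 bits per pixel.
raw = bitrate_mbps(1920, 1080, 30, 12)       # ~746 Mbps
# A decent H.264 encode might spend roughly 0.1 bits per pixel (assumed).
h264_30 = bitrate_mbps(1920, 1080, 30, 0.1)  # ~6.2 Mbps
# Same resolution & codec at 60fps -- double the bandwidth.
h264_60 = bitrate_mbps(1920, 1080, 60, 0.1)  # ~12.4 Mbps
print(raw, h264_30, h264_60)
```

Same "1920x1080" label, yet anywhere from ~6 Mbps to ~746 Mbps depending on (a) and (b) -- that's why you can't talk about bandwidth from resolution alone.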
But just looking at point (a) -- did you know you can have videos at 24fps, 30fps, 60fps etc?
That means you can have 1080i@60fps or 1080p@30fps or 1080p@60fps etc, etc.
If all other factors are equal, 1080i@60fps & 1080p@30fps carry exactly the same amount of raw video data -- 60 half-height fields per second versus 30 full frames per second.
With the right hardware, you can easily convert (deinterlace or re-interlace) any video between 1080i@60fps <--> 1080p@30fps.
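The arithmetic behind that equivalence is simple -- each interlaced field is half the lines of a full frame:

```python
# 1080i@60: 60 interlaced fields per second, each field 1920x540 pixels.
# 1080p@30: 30 progressive frames per second, each frame 1920x1080 pixels.
pixels_1080i60 = 60 * 1920 * 540
pixels_1080p30 = 30 * 1920 * 1080
print(pixels_1080i60, pixels_1080p30)  # identical raw pixel rates
```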
If you define 1080p as "TRUE FHD" -- but say 1080i is not "TRUE FHD"...
- Then 1080i@60fps is "TRUE FHD", because 1080i@60fps can be converted to 1080p@30fps
- And 1080p@30fps is NOT "TRUE FHD", because 1080p@30fps can just as easily be converted back to 1080i@60fps
You see (from above),
once you go into details -- after a while it becomes meaningless to define videos based on resolution alone.
**
If you randomly pick people off the street & show them a 1080i@60fps video and a 1080p@30fps video...
The vast majority of them can't tell the difference between 1080i & 1080p.
With a well trained eye, it is easier to see the difference between 1080i & 1080p -- if you've a large TV (at least 42" or bigger).
EVEN if you've a large TV, it also depends on how close you sit (the viewing distance).
It only becomes easier to tell the difference between 1080i & 1080p -- if you've a large TV & you sit really close to the TV.
(But if you always sit too close to a large TV, it'll spoil your eyes in the long run, LOL).
**
Have you ever bought (or borrowed) original Blu-ray discs to watch using a Blu-ray player?
Do you know that most Blu-ray videos are ONLY 1080p@24fps? Yes, ONLY 1080p@24fps.
And yet most Blu-ray videos look good -- even on large TVs.
Why? Because the overall bitrate of Blu-ray videos is HIGH -- point (c) above.
If you watch enough Blu-ray videos and then go watch YouTube 4K -- you'll actually feel that most YouTube 4K@30fps videos look poor/terrible.
To limit bandwidth requirements, all YouTube videos (including 4K videos) are highly compressed, with low bitrates.
The HIGHER the video bitrate, the MORE video data is available. On the other hand, the LOWER the overall video bitrate, the LESS video data is available. This is UNIVERSALLY TRUE regardless of resolution.
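To put rough numbers on that (the bitrates here are ballpark figures I'm assuming, not official specs), compare how many bits each service spends per pixel:

```python
# Bits spent per pixel per frame -- higher means more video data per pixel.
# The Mbps figures below are ballpark assumptions, not measured values.
def bits_per_pixel(mbps, width, height, fps):
    return mbps * 1e6 / (width * height * fps)

bluray_1080p24 = bits_per_pixel(30, 1920, 1080, 24)  # assume ~30 Mbps disc
youtube_4k30 = bits_per_pixel(18, 3840, 2160, 30)    # assume ~18 Mbps stream
print(round(bluray_1080p24, 3), round(youtube_4k30, 3))
```

Under these assumptions the Blu-ray spends roughly 8x more bits per pixel than the YouTube 4K stream -- which is exactly why the lower-resolution disc can look better.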
Bitrate is the single most important factor in overall video viewing quality, more so than resolution.
PERSONALLY, I'll watch a Blu-ray disc at only 1080p@24fps any day -- rather than watch mostly shitty YouTube 4K@30fps.
**
Based on the above,
For the current DVB-T2 channels we receive OTA locally, we can receive broadcasts up to 1920x1080 -- which is good enough, because sufficient frequency/bandwidth has been allocated for a reasonably good video bitrate.
If you have a TV with a high-quality display panel, the DVB-T2 channels we receive OTA can look really good -- much better than most 1080p videos on YouTube.
**
I also agree that we will not get digital 4K broadcast anytime soon, because --
to limit the frequency/bandwidth used, any digital 4K broadcast will SURELY rely on heavy compression.
What will be the video compression standard used? HEVC/H.265? VP9? or another video compression standard?
Most people will have to throw away their current TVs and buy new ones -- or buy new digital receivers -- that support whatever compression standard is used for our local digital 4K broadcast.
Again, local digital 4K broadcast will not be available anytime soon.
Hope the above helps.