I remember back in 2014, just before the World Cup, my old TV packed up. I was a little broke at the time and found myself scrambling to find a decent TV at a reasonable price.
Every retailer was pushing expensive 4K sets at a time when absolutely no 4K content was available.
I lucked out on a “dumb” 1080p set for £299 that gave a great picture and is still going strong now.
For at least the next two years, one of my friends, who at the same time had spent £2.5K on one of those “new-fangled” 4K sets, would remark on what a great picture I got on that TV.
Fast forward to late 2017, when I finally invested in UHD, and that same friend was absolutely gobsmacked at the picture quality on my (only just) sub-£1K set. (It does help that I spent many an hour tweaking the settings on both tellies to get the picture just right, which he, like most people I know, never did.)
For a lot of tech I can be quite an early adopter, but when it comes to TVs I will always let the tech mature before investing.
That being said, I’m in a constant battle with myself at the moment not to spend a large amount of money on either a Samsung or an LG 2019 4K flagship model, lol.
Same here; I think I’ll finally pull the trigger and buy a flagship 4K set in late 2020. I’m still on a 2007 1080p Bravia, though.
I’m extremely happy with my 4K+; the only thing is my anime collection, which has decoding trouble :( So if any upgrade to the 4K+ can fix that, that’s fine by me :) If not, I’m considering buying a new box just because of that, or even building an HTPC :)
As someone mentioned, browsing my NAS is painfully slow compared to my old Mi Box :)
If they’re in 10-bit H.264 format, then no software upgrade to the Vero 4K+ will ever help; it’s a hardware limitation that it can’t decode them with hardware acceleration, and the CPU isn’t powerful enough to decode them in software.
To be fair, no media player can play that format using hardware acceleration, and only one or two are powerful enough to do it in software.
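If you want to check whether your files fall into this category, here’s a rough sketch of flagging 10-bit H.264 from ffprobe’s JSON output. The sample output below is hard-coded for illustration; in practice you’d capture it from something like `ffprobe -v quiet -print_format json -show_streams yourfile.mkv`:

```python
import json

# Example ffprobe-style JSON for one video stream (hard-coded here for
# illustration; normally you would capture it from the ffprobe command above).
ffprobe_output = '''
{"streams": [{"codec_type": "video",
              "codec_name": "h264",
              "pix_fmt": "yuv420p10le"}]}
'''

def is_hi10p(probe_json: str) -> bool:
    """Return True if any video stream is 10-bit H.264 (Hi10P)."""
    for stream in json.loads(probe_json).get("streams", []):
        if (stream.get("codec_type") == "video"
                and stream.get("codec_name") == "h264"
                and "10" in stream.get("pix_fmt", "")):
            return True
    return False

print(is_hi10p(ffprobe_output))  # True -> no hardware decode on most SoCs
```

Anything that prints True there will fall back to software decoding, which the Vero 4K+ CPU can’t keep up with.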
I must confess I’m curious about the reasons why it isn’t, and about what sort of SoC would be appropriate.
I would like to know that too.
I’m developing a Hi10 decoder
This shouldn’t be the case. Please start a new forum post so we can get this solved for you.
I’m curious as to why people stay with 10-bit x264. Why not move to 10-bit x265?
That is pretty much only a thing for certain anime releases, and it exists because the people releasing this content believe they can get better quality at a smaller size than with any other type of encode. Once a release group picks a format, they have a tendency to stick with it, and it is very much a take-it-or-leave-it affair. There are still people doing DivX rips, which makes even less sense, but it is still a thing nonetheless, so there are still going to be people who want player support without having to transcode to something more common.
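For anyone who does want to transcode, a sketch of building the ffmpeg command from Python (file names and the CRF value are hypothetical; `-pix_fmt yuv420p10le` keeps the 10-bit depth so the encode doesn’t reintroduce banding):

```python
import subprocess  # only needed if you actually run the command

def hevc10_cmd(src: str, dst: str, crf: int = 20) -> list[str]:
    """Build an ffmpeg command re-encoding to 10-bit HEVC, copying audio."""
    return ["ffmpeg", "-i", src,
            "-c:v", "libx265", "-preset", "slow", "-crf", str(crf),
            "-pix_fmt", "yuv420p10le",   # stay 10-bit
            "-c:a", "copy",              # leave audio untouched
            dst]

cmd = hevc10_cmd("episode01_hi10p.mkv", "episode01_hevc10.mkv")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually transcode
```

The result is 10-bit HEVC, which current SoCs (including the Vero 4K+) can decode in hardware.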
Thanks, that would explain it.
If you are successful and make it open source, this will be huge. I know that whatever you say now might bite you in the ass later, but do you think it will be released within the Vero 4K+’s lifetime, or after that?
This isn’t actually correct anymore. Currently there is one SoC that can decode Hi10P with hardware acceleration, and that is the Rockchip RK3399, as seen in RK3399 devices running LibreELEC Kodi Leia builds.
On top of that, the Mali-V52 and Mali-V76 (ARM’s VPU implementations) can hardware-decode Hi10P.
I don’t know whether designers like Amlogic based their VPUs on ARM’s implementation, though.
Hi10P was popularized back when HEVC didn’t exist and standard 8-bit H.264 would produce colour banding without a substantially higher bitrate (hence file size). Although HEVC encoding is still quite slow, some fansub groups have transitioned to it.
Thanks; I’m not home at the moment, away on holiday, back in August or so.
Isn’t 10-bit useless for fake UHD?
There are lists of all the movies which are fake UHD and just upscaled.
I think for all upscaled movies you can stay with 1080p x264 or x265.
UHD is more than just high resolution. If the original master film/tape is good enough, it could already have wider gamut and bigger dynamic range than the DVD. Then colorists will regrade it for more ‘pop’. It will need to be 10-bit.
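The arithmetic behind the 10-bit point is simple: with only 8 bits per channel, a smooth gradient has so few code values that the jump between adjacent codes becomes visible as banding, especially once the grade stretches into a wider gamut and dynamic range:

```python
# Code values per channel and the luminance jump between adjacent codes
# across a full 0-100% gradient.
for bits in (8, 10):
    levels = 2 ** bits          # 256 vs 1024 code values
    step = 100 / (levels - 1)   # percentage jump per code value
    print(f"{bits}-bit: {levels:4d} levels, {step:.3f}% per step")
```

Four times the code values means each step is a quarter the size, which is what pushes the banding below the visible threshold.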
Yes, sure, 10-bit, but from a normal Blu-ray, right? I read that you can’t really see a difference between a 10-bit Blu-ray and an Ultra HD Blu-ray, as some movies are just normal upscales and can’t be improved much more. For example, if you compare the Aquaman Blu-ray and the Aquaman UHD, there is no difference; the UHD is just an upscale with a bigger file size but no visible quality improvement.
That’s debatable. The Digital Intermediates that discs are mastered from are usually 2K resolution - that’s 2048x1080 - so, to get it to 1920x1080, it has to be slightly downscaled. You might think that doesn’t represent much loss of resolution, but scaling something in a way which doesn’t introduce artefacts (e.g. ringing or softening of the image) is surprisingly difficult. And it’s easier to upscale without introducing artefacts than it is to downscale.
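A quick illustration of why that downscale is awkward: 2048 to 1920 is a 16:15 ratio, so very few output pixels line up exactly with source pixels, and everything in between has to be interpolated, which is exactly where ringing and softening creep in:

```python
from fractions import Fraction

ratio = Fraction(2048, 1920)
print(ratio)  # a non-integer scale factor

# Output column i samples source position i * 2048/1920; count exact hits.
aligned = sum(1 for i in range(1920) if (i * 2048) % 1920 == 0)
print(f"{aligned} of 1920 output columns hit a source pixel exactly")
```

Only one output column in fifteen lands on a source pixel; the other fourteen are interpolated values.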
On top of that, there’s High Dynamic Range and Wide Colour Gamut to consider - a 1080p blu ray can’t make use of those, and the difference they make can be quite dramatic.
Then there’s the fact that 4K blu rays have a higher bit-rate, and also use HEVC rather than H.264 for their video codec; HEVC means you can get the same playback quality using half the bit-rate, or better quality using the same bit-rate, so you end up with fewer compression artefacts.
So, while a 4K blu ray may be an upscale of a 2K DI, when you add all those factors together, it will probably look a lot better than the 1080p blu ray equivalent.
On the contrary, the HDR and WCG make a huge difference. Watch the scene where the hero and heroine first arrive in Atlantis, with all the glowing coloured lights.
Do you know, for 2.35:1 material, whether the letterboxing bars are stored on the DI or whether it’s stored anamorphically to preserve more resolution? If it’s the latter, then upscaling to 4K will certainly look better than 1080p.