What evidence do you have for this? IIRC it’s kodi that constructs the IEC 61937 preamble, not the AML drivers. I’m away from home atm so can’t check. I may be wrong.
Check it on an HDMI analyzer that reports the IEC 61937 burst preamble. This was brought to Minix's attention when they released the U9H. Minix contacted the Kodi developer (fritsch), who was of the opinion that it was an Amlogic problem and not a Kodi problem. The issue is not reproducible on the nVIDIA Shield or on an Intel LE system. I have seen this on both S905X and S912. It mostly happens with HBR bitstreams.
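For anyone checking captures by hand, the IEC 61937 framing is easy to spot: each burst is preceded by four 16-bit preamble words (Pa, Pb, Pc, Pd). A minimal sketch of how they're laid out (the AC-3 data-type code 0x01 and the length-in-bits semantics are my reading of IEC 61937-3 and should be double-checked against the spec):

```python
import struct

# IEC 61937 burst preamble: four 16-bit words preceding each data burst.
PA = 0xF872  # sync word 1
PB = 0x4E1F  # sync word 2

def burst_preamble(data_type: int, length_code: int) -> bytes:
    """Pack Pa/Pb/Pc/Pd as little-endian 16-bit words.

    data_type: burst-info data-type code (e.g. 0x01 for AC-3 per IEC 61937-3);
    length_code: burst payload length (in bits, for AC-3).
    """
    pc = data_type & 0x1F  # bits 0-4 carry the data-type code
    return struct.pack("<4H", PA, PB, pc, length_code & 0xFFFF)

# Hypothetical AC-3 frame of 1536 bytes -> length code in bits.
hdr = burst_preamble(0x01, 1536 * 8)
print(hdr.hex())  # 72f81f4e01000030
```

In a capture, the 0xF872/0x4E1F pair is what an analyzer locks onto to identify a compressed burst.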
Sorry for slow replies, we have a lot going on at OSMC.
It’s been some time since I looked at this, but I know that we fixed some things here. I would appreciate if you can test again. I’ll also analyse the HDMITX side when I get a bit of free time.
Do you believe this causes HRA issues as well? I don’t think so myself, based on some preliminary testing and some liaising with a few of the more cooperative vendors. The last time I spoke to Peter (fritsch), I believe we reached a consensus that some AVRs are simply shitty.
If there’s an issue that you can help us reproduce, we can definitely take a stab at fixing it.
After a couple of weeks away, I’ve started to look at this. I can play a 32-bit WAV made from a 24-bit FLAC (so the 8 LSBs are zeroed) with aplay through S/PDIF, capture it, and the capture is bit-perfect. So the S905X spdif device is working as expected.
Playing the same file from kodi, it gets scaled but it still comes through as 24-bit.
I don’t have the equipment to do something similar with HDMI but there shouldn’t be any hardware limitations to 24-bit.
That is not what I find in my tests. It is only 16-bit via both HDMI and S/PDIF. How are you capturing the S/PDIF output?
What version of OSMC are you using?
Can you post a debug log via grab-logs -A or My OSMC?
It has the June update. I will see whether I can get the logs tonight.
Here are the logs:
Audio output: AML-M8AUDIO, HDMI
Audio output: AML-M8AUDIO Analog, PCM
HDMI output is only 16-bit in both cases. I only tested in Kodi.
With Analog, PCM output, I do see in the log that the output is PCM32. The audio analyzer I was using today only supports up to 24-bit/192kHz on its S/PDIF input. For whatever reason, the analyzer detected no signal, and I couldn’t verify what the actual bit depth was. I will have to recheck on an R&S UPV analyzer. It did play on my Denon AVR.
I had only looked at AML-M8AUDIO, HDMI output before.
I have a USB soundcard with Toslink input. It has to be set manually to the appropriate sample rate and bit depth. It is possible that the vero is sending all 24 bits but signalling 16-bit in the metadata.
Kodi converts audio internally to 32-bit float. If that is not supported directly by the sound sink, it tries other formats of decreasing bit depth until it finds one the sink supports. The SoC supports 32-bit LE, so from then on in the drivers the signal is treated as 32-bit.
It would be interesting to try sending a 24-bit signal direct to the SoC, eg with aplay to see what comes out.
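Something like this would generate a suitable test file (a sketch using Python’s wave module; the aplay device name below is an assumption, check yours with `aplay -l`):

```python
import math
import struct
import wave

# Write a 24-bit 48 kHz mono 1 kHz sine to test24.wav, for playback
# with e.g. `aplay -D hw:0,0 test24.wav` (device name is an assumption).
RATE, FREQ, SECONDS = 48000, 1000, 2
amp = int(0.5 * (2**23 - 1))  # roughly -6 dBFS in 24-bit full scale
frames = bytearray()
for n in range(RATE * SECONDS):
    s = int(amp * math.sin(2 * math.pi * FREQ * n / RATE))
    frames += struct.pack("<i", s)[:3]  # signed 24-bit little-endian
with wave.open("test24.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(3)  # 3 bytes per sample = 24-bit
    w.setframerate(RATE)
    w.writeframes(bytes(frames))
```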
UPDATE: The way things are at the moment, if I set the Kodi audio output to PCM, I get 24 bits sent to S/PDIF for both stereo and multi-channel (FL/FR only, of course). If I set output to HDMI, I get 16 bits for stereo, 24 bits for multi-channel on S/PDIF.
If someone (like @wesk05) has the time and equipment, I would love to know what comes through on HDMI for these four conditions.
I can only detect 16-bit at HDMI output on all 4 conditions.
What exactly do you mean by this statement: “If I set output to HDMI, I get 16 bits for stereo, 24 bits for multi-channel on S/PDIF”?
Is it changing the number of channels in Kodi?
I checked it with channels set to 2.0 and 7.1 with the AML-M8AUDIO HDMI device. What I see in the log is that when you set it to >2.0, Kodi falls back to the default device (AML-M8AUDIO, Analog) and uses the AE_FMT_S32NE format. When it is set to 2.0, Kodi uses the HDMI device, but changes the format to AE_FMT_S16NE.
PCM32 cannot be transported over HDMI or S/PDIF. It has to be converted somewhere to 24/16-bit.
Edit: OK, I can see in the code where it is converted to 24 bits.
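The conversion itself is just truncation: an IEC 60958 subframe carries at most 24 audio bits, so the 8 LSBs of a 32-bit sample are dropped. A sketch of the idea (my own illustration, not the AML code):

```python
def s32_to_s24(sample: int) -> int:
    """Truncate a signed 32-bit sample to the 24-bit payload that fits
    an IEC 60958 subframe (drop the 8 least-significant bits)."""
    return sample >> 8  # arithmetic shift keeps the sign

print(hex(s32_to_s24(0x12345678)))  # 0x123456
```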
This is what the recorded waveforms from the HDMI output of the Vero 4K and Zidoo X9S look like. The test pattern was a 4kHz -100dBFS sine wave (24-bit 48kHz stereo WAV).
Original Test Signal
Yes, that’s what I mean. If I play a stereo track, or set kodi to 2.0 with the HDMI device selected, vero uses its spdif device, which corresponds to what is labelled HDMI in kodi. Kodi tries various bit depths in turn. The spdif device refuses to play 24 bits (I’m trying to find out why), then refuses 32 bits, so kodi falls back to 16.
If I play a multi-channel clip with kodi set to more than 2.0, or select the PCM Analog device, vero uses its i2s device, which accepts 32-bit signals. For some reason, that 32-bit signal gets passed back to the spdif device as 32 bits, then truncated to 24 bits and output to Toslink. I’m definitely getting 24 bits on Toslink with my poor-man’s setup.
Can you point me to that, please? In Kodi or the drivers?
Thanks for testing. 24-bit is clearly not getting through to HDMI.
Here is what I get capturing a 4kHz -100dB signal from the Toslink output of the Vero 4K into Audacity. The top trace uses aplay to the default device (i2s, aka PCM analog); the bottom one uses Kodi with the PCM Analog device. I had to apply some amplification in Audacity to both in order to show the detail (50dB to the top, 40dB to the bottom), but I think this does demonstrate that 24 bits are getting through.
What’s happening here?
and what does this mean? Does this apply to all non-GXTVBB CPUs?
What version of Audacity are you using? Only versions later than 2.1.2 support 24-bit WASAPI recording in Windows.
I did further testing today at an ATC lab that I know. The table below summarises the findings.
You can see from the analyzer screenshots below that the word length is not set for the S/PDIF output (it is for HDMI output), but the measured bit depth is 24-bit (green: used bit, blue: unused bit). I also checked the recorded waveform, and it is indeed 24-bit.
S/PDIF Channel Status
Your guess is as good as mine. IEC958 32-bit doesn’t exist, of course, but the i2s and spdif drivers/hardware don’t respond to attempts to send 24-bit signals.
It means nothing as the whole of aml_spdif_play is commented out.
So the bit-depth for S/PDIF seems a bit random - is this just your equipment making its own estimate?
I think that’s right. I can’t identify a register that sets it. I’ve found where the HDMI channel status bits are set to 16-bit and have added code to switch to 24-bit if the stream is not IEC 61937. If I can figure out how to package it, you might like to see if that does the trick.
Thanks again for your testing.
I’m going to split this into a new thread as it’s not related to DTS-HD HRA.
Yes, it is the analyzer detecting the bits that are in use. If the state of a bit changes within 100ms or so, the bit is assumed to be in use.
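That heuristic is easy to mimic in software if anyone wants to check a capture the same way (a sketch of the idea, not the analyzer’s actual algorithm): OR together the XOR of consecutive samples; any bit that ever toggles within the window is “in use”.

```python
def bits_in_use(samples):
    """Return a 32-bit mask of the bits that change state anywhere in
    the window -- a software version of the analyzer's heuristic."""
    mask = 0
    for a, b in zip(samples, samples[1:]):
        mask |= (a ^ b) & 0xFFFFFFFF
    return mask

# 24-bit content in a 32-bit container: the low 8 bits never toggle.
samples = [0x00012300, 0x00045600, 0x7FFFFF00]
assert bits_in_use(samples) & 0xFF == 0
```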
You may know this already: you can save the recorded WAV as 32-bit PCM and open the file in a hex editor. If the first four hex digits of each 32-bit sample are 0 (after the header), the recorded WAV is 16-bit; if only the first two are 0, it is 24-bit. This assumes the recording software doesn’t add noise.
I saw some lines related to IEC958 channel status in aml_audio_hw, but couldn’t find anything specific to word length when searching the repository.
I am not sure whether it is just a matter of word length for the HDMI output. If that were the only issue, a channel-status-agnostic HDMI capture would have shown 24-bit, which isn’t the case. In fact, the 16-bit S/PDIF and HDMI output waveforms look identical despite being captured on different sets of instruments.
If your instrument doesn’t care about the width reported by the channel status bits, we will have to look elsewhere.
Continued here: Can Vero do bit-perfect playback?