I can only detect 16-bit at the HDMI output under all four conditions.
What exactly do you mean by this statement: "If I set output to HDMI, I get 16 bits for stereo, 24 bits for multi-channel on S/PDIF."
Is it changing the number of channels in Kodi?
I checked it with channels set to 2 and 7.1 with the AML8AUDIO HDMI device. What I see in the log is that, when you set it to >2.0, Kodi falls back to the default device (AML8AUDIO, Analog) and uses the AE_FMT_S32NE format. When it is set to 2.0, Kodi uses the HDMI device, but changes the format to AE_FMT_S16NE.
PCM32 cannot be transported via HDMI or S/PDIF, so it has to be converted to 24/16-bit somewhere along the way.
Edit: OK, I can see the place in the code where it is converted to 24 bits.
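For anyone following along, the conversion in question amounts to dropping the 8 least-significant bits of each 32-bit sample. A minimal sketch (illustration only, not the actual driver code):

```python
def truncate_32_to_24(samples_s32):
    """Truncate 32-bit PCM samples to 24-bit by zeroing the 8 LSBs.

    The 24-bit result stays left-justified in a 32-bit word, which is
    the usual layout the S/PDIF/HDMI hardware expects.
    """
    return [s & ~0xFF for s in samples_s32]

# Only the bottom byte of each sample is lost:
assert truncate_32_to_24([0x12345678]) == [0x12345600]
```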
This is what the recorded waveforms from the HDMI outputs of the Vero 4K and Zidoo X9S look like. The test pattern was a 4kHz -100dBFS sine wave (24-bit 48kHz stereo WAV).
Yes, that's what I mean. If I play a stereo track or set Kodi to 2.0 with the HDMI device selected, the Vero uses its spdif device, which corresponds to what is labelled HDMI in Kodi. Kodi tries various bit depths in turn. The spdif device refuses to play 24 bits (I'm trying to find out why), then refuses to play 32 bits, so Kodi falls back to 16.
If I play a multi-channel clip with Kodi set to more than 2.0, or select the PCM Analog device, the Vero uses its i2s device, which accepts 32-bit signals. For some reason, that 32-bit signal gets passed back to the spdif device as 32 bits, then truncated to 24 bits and output to Toslink. I'm definitely getting 24 bits on Toslink with my poor man's setup.
Can you point me to that, please? In Kodi or the drivers?
Thanks for testing. 24-bit is clearly not getting through to HDMI.
Here is what I get capturing a 4kHz -100dB signal from the Toslink output of the Vero 4K into Audacity. The top trace is using aplay to the default device (i2s, aka PCM Analog). The bottom one is using Kodi with the PCM Analog device. I had to apply some amplification in Audacity to both in order to show the detail (50dB to the top, 40dB to the bottom), but I think this does demonstrate that 24 bits are getting through.
You can see from the analyzer screenshots below that the word length is not set for S/PDIF output (it is for HDMI output), but the measured bit depth is 24-bit. Green: bit in use; blue: unused bit. I also checked the recorded waveform and it is indeed 24-bit.
Your guess is as good as mine. IEC958 32-bit doesn't exist, of course, but the i2s and spdif drivers/hardware don't respond to attempts to send 24-bit signals.
It means nothing, as the whole of aml_spdif_play is commented out.
2.1.2
So the bit-depth for S/PDIF seems a bit random - is this just your equipment making its own estimate?
I think that's right. I can't identify a register that sets it. I've found where the HDMI channel status bits are set to 16-bit and have added code to switch to 24-bit if the stream is not IEC61937. If I can figure out how to package it, you might like to see if that does the trick.
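For reference, the word-length field lives in byte 4 of the IEC60958 consumer channel status block. A sketch of the two encodings involved, using the bit values from ALSA's iec958.h (exactly how the driver patch flips between them is an assumption on my part):

```python
# Byte 4 of the IEC60958 consumer channel status block.
# Bit 0 selects the maximum word length (0 = 20 bits, 1 = 24 bits);
# bits 1-3 give the actual word length relative to that maximum.
# Names mirror the IEC958_AES4_CON_* defines in ALSA's iec958.h.
IEC958_AES4_CON_MAX_WORDLEN_24 = 1 << 0  # max word length is 24 bits
IEC958_AES4_CON_WORDLEN_20_16  = 1 << 1  # 16 bits (max 20) / 20 bits (max 24)
IEC958_AES4_CON_WORDLEN_24_20  = 5 << 1  # 24 bits (max 24) / 20 bits (max 20)

def channel_status_byte4(bits, iec61937=False):
    """Return channel status byte 4 for the given PCM word length.

    IEC61937 (compressed) frames carry 16-bit payload words, so they
    keep the 16-bit encoding.
    """
    if iec61937 or bits == 16:
        return IEC958_AES4_CON_WORDLEN_20_16                 # 0x02
    return IEC958_AES4_CON_MAX_WORDLEN_24 | IEC958_AES4_CON_WORDLEN_24_20  # 0x0b

assert channel_status_byte4(16) == 0x02
assert channel_status_byte4(24) == 0x0b
```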
Yes, it is the analyzer detecting the bits that are in use. If the state of a bit changes within 100ms or so, the bit is assumed to be in use.
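That heuristic is easy to mimic in software: within a window of samples, a bit counts as "in use" if it toggles at least once. A minimal sketch (my own illustration, not the analyzer's actual algorithm):

```python
def bits_in_use(samples, width=24):
    """Return, per bit position, whether the bit toggles within the window."""
    toggled = 0
    for a, b in zip(samples, samples[1:]):
        toggled |= a ^ b  # any bit that differs between samples is active
    return [bool(toggled & (1 << i)) for i in range(width)]

# A 16-bit signal left-justified in a 24-bit frame never toggles
# its low 8 bits, so the analyzer would mark them as unused:
usage = bits_in_use([s << 8 for s in (0x1234, 0x0FFF, 0x7001)])
assert not any(usage[:8]) and any(usage[8:])
```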
You may know this already: you can save the recorded WAV as 32-bit PCM and open the file in a hex editor. If the first four hex-digit columns of each sample are 0 (after the header), the recorded WAV is 16-bit; if only the first two columns are 0, it is 24-bit. This assumes the recording software doesn't add noise.
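The same check can be automated. A sketch (my own helper, assuming a little-endian 32-bit PCM WAV: a 16-bit signal has its two low bytes zero, a 24-bit signal only its lowest byte):

```python
import struct
import wave

def effective_bit_depth(path):
    """Guess the effective bit depth of a 32-bit PCM WAV from its low bytes.

    Assumes the recorder adds no noise, as noted above.
    """
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 4, "expected 32-bit PCM"
        raw = w.readframes(w.getnframes())
    samples = struct.unpack("<%di" % (len(raw) // 4), raw)
    used = 0
    for s in samples:
        used |= s & 0xFFFFFFFF  # accumulate every bit that is ever set
    if used & 0xFFFF == 0:
        return 16
    if used & 0xFF == 0:
        return 24
    return 32
```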
I saw some lines related to IEC958 channel status in aml_audio_hw, but couldn't find anything specific to word length when searching the repository.
I am not sure whether it is just a matter of word length for HDMI output. If that were the only issue, a channel-status-agnostic HDMI capture would have shown 24-bit, which isn't the case. In fact, the 16-bit S/PDIF and HDMI output waveforms look identical despite being captured on different sets of instruments.