Apple TV Component Output and Broadcom CrystalHD BCM70015 Card

Hello,
I recently installed the Broadcom CrystalHD BCM70015 card, along with OSMC, to take advantage of the card's 1080p decoding. I am running the ATV1 into an older 1080p television that uses Component (YPbPr) inputs, so the intent is to have OSMC use those outputs on the ATV1 to feed this television. After experimenting with this configuration for a while, the following points are still not clear to me:

  1. Does the Broadcom CrystalHD BCM70015 card only decode for the HDMI video output on the ATV1 under OSMC?
  2. Does the edit to the xorg.conf file to "enable" 1080p output on the Component outputs of the ATV1 actually work? I made the edit expecting a difference, but I only see the OSMC splash screen on reboot and then a blank screen (audio works, though). When I plug into an HDMI screen, OSMC works as expected.
  3. Do I also need to include the 720p mode from the edit described in the other post, or can I just use the 1080p edit as I already did (with no success)?
  4. My original ATV OS was v3.0.2, and I followed instructions to set HDMI to the "RGB High" value, but no HDMI settings were present in my ATV OS, perhaps because I was only connected to the television via the Component inputs. Did I need to plug into an HDMI display first, set that value, and then install OSMC, even though I will only be running on the Component outputs?
  5. If all this configuration for Component output is not working or not producing proper output, would you recommend some kind of HDMI-to-Component adapter, since OSMC appears to be friendlier to HDMI? Would I lose anything from the Broadcom CrystalHD BCM70015 card by running it this way?

I did read the post about determining whether OSMC is actually using the Broadcom CrystalHD BCM70015, and I hope to verify this once I resolve these other issues with the Component outputs. Any response is appreciated.

Thanks,
~Dubhead

  1. No, component works.
  2. Yes – tested on an Optoma projector here, and other users use it.
  3. No – but are you sure your TV actually supports 1080p?
  4. You don't need to worry about this if you use component.
  5. This is not correct. With an appropriate configuration, OSMC doesn't care how your display is connected. That said, it's not worth investing money in an (almost) ten-year-old device.


Sam,
Thanks for the very prompt reply. I updated the xorg.conf file with the 720p mode (in addition to the 1080p one) and rebooted, with no success. I then removed the 1080p mode from the xorg.conf file and rebooted; still no success. There must be more to this than just updating the xorg.conf file, but I have included the text of the file I am using in case there is something incorrect in it.

Contents of the xorg.conf file being used:

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig: version 340.46 (buildd@binet) Tue Oct 7 08:03:22 UTC 2014

Section "ServerLayout"
    Identifier "serverlayout0"
    Screen 0 "screen0" 0 0
    InputDevice "Keyboard0" "CoreKeyboard"
    InputDevice "Mouse0" "CorePointer"
EndSection

Section "InputDevice"
    # generated from default
    Identifier "Keyboard0"
    Driver "keyboard"
EndSection

Section "InputDevice"
    # generated from default
    Identifier "Mouse0"
    Driver "mouse"
    Option "Protocol" "auto"
    Option "Device" "/dev/input/mice"
    Option "Emulate3Buttons" "no"
    Option "ZAxisMapping" "4 5"
EndSection

Section "Monitor"
    Identifier "monitor0"
    VendorName "SNY"
    ModelName "Sony TV"
    Option "DPMS"
EndSection

Section "Device"
    Identifier "device0"
    Driver "nvidia"
EndSection

Section "Screen"
    Identifier "screen0"
    Device "device0"
    Monitor "monitor0"
    DefaultDepth 24
    Option "NoLogo" "true"
    Option "RegistryDwords" "RMDisableRenderToSysmem=1"
    Option "DynamicTwinView" "false"
    Option "ModeValidation" "NoVertRefreshCheck, NoVesaModes, NoXServerModes"
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection

Section "Device"
    Identifier "Device0"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    Option "RegistryDwords" "RMDisableRenderToSysmem=1"
    Option "DynamicTwinView" "false"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "Device0"
    Monitor "Monitor0"
    Option "UseDisplayDevice" "TV"
    Option "TVOutFormat" "COMPONENT"
    Option "TVStandard" "HD720p"
    Option "TVOverScan" "0.80"
    DefaultDepth 24
    Option "NoLogo" "True"
    SubSection "Display"
        Modes "1920x1080" "1280x720" "1024x768" "720x480" "800x600" "640x480"
        Depth 24
    EndSubSection
EndSection

Section "Extensions"
    Option "Composite" "Disable"
EndSection

You probably need to remove "1920x1080" from Modes.

There may also be an Xorg.0.log which will help.
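One other thing stands out in the file you posted: it contains two Device sections ("device0" and "Device0") and two Screen sections ("screen0" and "Screen0"), and the ServerLayout only selects "screen0", which does not carry any of the TV options (UseDisplayDevice, TVOutFormat, TVStandard). I'm not certain how the server resolves the near-duplicate identifiers, but consolidating to a single Device and a single Screen section removes the ambiguity. A rough sketch, keeping the component/720p options from your file and leaving your InputDevice and Monitor sections as they are:

```
Section "ServerLayout"
    Identifier "serverlayout0"
    Screen 0 "screen0" 0 0
EndSection

Section "Device"
    Identifier "device0"
    Driver "nvidia"
    Option "RegistryDwords" "RMDisableRenderToSysmem=1"
    Option "DynamicTwinView" "false"
EndSection

Section "Screen"
    Identifier "screen0"
    Device "device0"
    Monitor "monitor0"
    DefaultDepth 24
    Option "UseDisplayDevice" "TV"
    Option "TVOutFormat" "COMPONENT"
    Option "TVStandard" "HD720p"
    Option "TVOverScan" "0.80"
    SubSection "Display"
        Depth 24
        Modes "1280x720" "1024x768" "800x600" "640x480"
    EndSubSection
EndSection
```

This is only a sketch of how the sections could be merged, not a tested configuration for your hardware.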

Hi there, I made the edit to remove that additional mode to see if that worked. Rebooted and no change. Here is the Xorg.0.log file from that reboot: http://paste.osmc.io/fetojucoru.coffee

I'm not sure what to look for in that log to pinpoint the issue.

Well, it has been two weeks and I have not been able to get component output from OSMC working on the Apple TV. I posted my log file earlier, but I am not sure what in that file might indicate a particular configuration issue (or anything else) that could be causing this. I'm frustrated, as I had hoped I wouldn't have to buy a new TV just to get OSMC to work with my Apple TV. Does anyone have any insights on enabling Component (not Composite!) output from the Apple TV?

It would be much cheaper to buy another device, such as a Vero 2 or Raspberry Pi.

  • Check that your TV supports 720p or 1080p via component. Most likely it supports 1366x768.
  • Adjust Xorg.conf for this.

A search for ‘nvidia x11 custom resolution component’ may help.
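For what it's worth, a rough sketch of the xrandr route, run from a terminal while the X session is up. The output name "TV-0" and the 1360x768 substitute are assumptions (check `xrandr -q` for the actual output names and modes your driver exposes):

```shell
# Generate a CVT modeline for the panel resolution.
# cvt rounds widths to multiples of 8, so 1366x768 panels are
# usually driven at 1360x768 or 1368x768 over analog outputs.
cvt 1360 768 60

# Extract the modeline cvt printed (mode name first, then timings),
# stripping the quote characters so the name comes through cleanly.
MODELINE=$(cvt 1360 768 60 | sed -n 's/^Modeline //p' | tr -d '"')

# Register the mode, attach it to the TV output, and switch to it.
# "TV-0" is a guess; substitute the output name xrandr -q reports.
xrandr --newmode $MODELINE
xrandr --addmode TV-0 1360x768_60.00
xrandr --output TV-0 --mode 1360x768_60.00
```

If that produces a picture, the same timings can be baked into xorg.conf as a Modeline in the Monitor section so they survive a reboot.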

I did confirm that the supported resolution is 1366x768. Is the idea to add that resolution alongside the others, or to remove every other resolution listed in the configuration?

Included:
Modes "1366x768" "1280x720" "1024x768" "720x480" "800x600" "640x480"

Removed:
Modes "1366x768"

I assume it is the former, since the other, lower resolutions would likely still need to be specified? Did you learn about this resolution from something in the log file?

I made the change I suggested, using the "Included" Modes statement from my previous post. No change. The log file for this change is here: http://paste.osmc.io/ijoyafigad.coffee

I looked at this log file for a bit, and it seems to indicate that the mode is not being selected correctly: the server falls back to "nvidia-auto-select" to request the mode, and then indicates it chose 1024x768(?). There are also some statements at the end of the log about certain parameters not being defined, or defaults from the EDID being used. I don't know how to interpret these well enough to determine what more needs to be configured in the xorg.conf file, but that appears to be the case, since no luck yet.

I searched a bit around setting a custom resolution with the NVIDIA X11 driver and found a few suggestions about using xrandr instead, but at this point I am not sure of the best path forward.

~Dubhead