On 07/25/2016 12:36 PM, Ian Arkver wrote:
On 25/07/16 18:55, Steve Longerbeam wrote:
On 07/25/2016 05:04 AM, Ian Arkver wrote:
On 23/07/16 18:00, Steve Longerbeam wrote:
Hi Ian, I double-checked the ADV7180 datasheet, this value is
<snip>
See below re this value.
+#define ADV7180_VSYNC_FIELD_CTL_1_NEWAVMODE 0x02
correct. Bit 4, when cleared, _enables_ NEWAVMODE.
Hah, ok. I'm not familiar enough with the history of this chip and didn't
know what "OLDAVMODE" was. So, to enable NEWAVMODE you clear
the NEWAVMODE bit. That makes perfect sense.
Anyway, I still don't see what NEWAVMODE gets you.
With video standard auto-detect disabled in the chip (VID_SEL > 2), NTSC images
captured by the i.MX6Q SabreAuto are corrupted; the best I can describe it is
"extremely fuzzy". Only when NEWAVMODE is enabled do the images look good again
in manual mode. With auto-detect enabled, images look good with or without NEWAVMODE.
The strange thing is, the auto-detected standard is identical to the standard set
explicitly in manual mode (NTSC-M). I did a complete i2c dump of the registers
in both auto-detect and manual mode, and found no differences other than the
auto-detect/manual setting itself.
Tracking this down further would probably require a logic analyzer on the
BT.656 bus, which I don't have access to.
I won't be debugging this further, so NEWAVMODE it will have to remain.
As far as I can see, it just locks down the timings and removes the flexibility
the chip otherwise offers to move the BT.656 SAV and EAV codes around
relative to the incoming video.
In what circumstances would you need to set the newavmode property
and change this default behaviour? We're not coupling the adv7180
back-to-back with an ADV video encoder here, which is what
NEWAVMODE is for and is presumably why AD recommend it for their
eval boards. We're trying to get a BT.656-compliant stream, which is
what the default mode purports to generate.