1 Reply Latest reply on Feb 19, 2010 5:26 AM by frr

    IEGD 10.2/10.3 vs. Scala Player v5: unexpected CRT timing dependency


      First of all, thanks to all the magnificent people at Intel and Scala for providing their software to us - it makes our business life much easier.
      That said, I'd like to present the following problem description:


      We're using a small-form-factor industrial PC based on the Intel Atom with an i945GSE chipset; quite a number of them are already deployed in the field. We run XP Embedded (based on XP SP3) on them, including the Intel IEGD driver (currently experimenting with versions 10.2.2 and 10.3).
      The analog CRT output (the GMCH's native CRT port) is set to 1366x768, a custom video mode created in the IEGD CED, starting from a DTD created from scratch in the "Simple parameters" mode (then switched to IEGD timings mode). We've tried that resolution at both 60 and 50 Hz. The CED did complain that 1366x768 is not a standard CVT mode (true enough: CVT works in 8-pixel character cells, so a 1366-pixel width gets rounded to 1368), but it did calculate some reasonable timings and our displays do sync to them.
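      As an aside, the CED's numbers are easy to sanity-check with plain arithmetic: the vertical refresh is just the pixel clock divided by the total (active + blanking) pixel counts. The figures below are assumed DMT-like timings for 1366x768, not taken from our actual CED configuration:

```python
# Sanity-check a DTD: vertical refresh = pixel clock / (h_total * v_total).
# The numbers below are assumed DMT-like timings for 1366x768,
# NOT the actual output of our CED configuration.
pixel_clock_hz = 85_500_000   # 85.50 MHz
h_total = 1792                # 1366 active + 426 blanking
v_total = 798                 # 768 active + 30 blanking

refresh_hz = pixel_clock_hz / (h_total * v_total)
print(f"{refresh_hz:.2f} Hz")  # ~59.79 Hz, i.e. a "60 Hz" mode

# CVT works in 8-pixel character cells, which is why the CED balks at
# 1366: it is not divisible by 8 (CVT rounds the width up to 1368).
assert 1366 % 8 != 0 and 1368 % 8 == 0
```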

      On that setup, we're running the Scala Player v5, a piece of software used to play back so-called "scripts".


      Scala is a "Digital Signage" content publication platform. A Scala script is a collection of images, video clips, and dynamic graphics and text in general, linked together by a couple of XML-based files (the scripts). The bits of content, thus combined, make up a multi-level looping presentation running on the display.


      We're feeding the analog CRT output to an LCD with a native resolution of 1366x768. Actually, we're using some Elantec/Intersil-based analog video extenders over CAT5, which is the reason for not using DVI in the first place.


      We're having a bit of a disagreement with our LCD display, which tends to "lose sync" repeatedly (the display goes dark for a second and comes back on again). We've tested the problem in the field and in our lab, with an oscilloscope, and we've discovered that the "loss of sync" symptom goes away for good if we provide the display with an active-high HSYNC (the standard for 1366x768 seems to be active low), preferably with a slightly narrower sync pulse (around 1 us, compared to the default ~1.6 us).
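      The pulse widths line up with simple arithmetic: HSYNC duration is just the sync width in pixels divided by the pixel clock. Assuming the common 85.5 MHz clock and a 143-pixel sync for the default 1366x768 mode (assumed figures, not read out of our CED config):

```python
# HSYNC pulse duration = sync width in pixels / pixel clock.
# 85.5 MHz and 143 sync pixels are assumed DMT-like figures for 1366x768,
# not values taken from our actual configuration.
pixel_clock_hz = 85_500_000

def hsync_us(sync_pixels: int) -> float:
    return sync_pixels / pixel_clock_hz * 1e6

print(hsync_us(143))  # ~1.67 us, matching the "default ~1.6 us" on the scope

# To hit the ~1 us pulse the display prefers, shrink the sync width:
print(round(pixel_clock_hz * 1e-6))  # ~86 pixels gives a 1 us pulse
```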


      We discovered this by plugging in a notebook (i965GM) and feeding the LCD 1280x768@60 from the "desktop-flavour" Intel Graphics Driver, which provided a 1 us active-high HSYNC.
      We then verified that theory with the IEGD by modifying the default 1366x768 mode generated by the CED's "simple mode DTD" to a 1 us active-high HSYNC.


      So far so good, that way our LCD is happy. The problem that I'd like to point out is this:

      The modified HSYNC timing and polarity only works on the basic Windows desktop.
      When the Scala Player starts up, it somehow flips the video output to the default 60 Hz settings (or very close). The HSYNC pulse reverts to active low and ~1.6 us width, and maybe it also moves slightly relative to the active (visible) part of the line. In fact, if we choose a 50 Hz vertical refresh variant of the IEGD graphics mode, Scala also shifts gears back to 60 Hz.

      Upon Scala shutdown, the HSYNC flips back to our desired way (as the standard Windows desktop takes over).


      I'm attaching some waveforms / scope screenshots.


      This makes us wonder exactly *how* it happens that Scala, running on top of the pre-set Windows graphics mode, affects such gory timing details, while the basic screen resolution essentially stays intact.
      We can see several possibilities:

      1. Scala explicitly asks Windows graphics for that kind of change, down to the timing details. Is this available to user-space software via the Win32 API? Or to video codecs? Via DirectX maybe? Weird...
      2. Scala switches to full-screen mode, just as some video games do, setting its desired resolution at the same time, and somehow this change also brings about the observed fallback to the default timings. (Heh, does it use the VESA BIOS for that, by any chance? Not likely...)
      3. Scala merely switches to full-screen mode via some standard Windows mechanism, without touching the current display resolution in any explicit way, but enables some special graphics feature (video overlay comes to mind) that in turn, implicitly and behind the scenes, due to some IEGD-internal logic, brings about the observed fallback to the default timings.
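      On possibility 1: a mode switch including the refresh rate is indeed available to any user-space program via the plain Win32 API, namely ChangeDisplaySettingsEx() with DM_DISPLAYFREQUENCY set in a DEVMODE. Whether Scala actually does this is pure speculation; the sketch below (DEVMODEA layout reproduced from the Windows SDK headers, with the printer-only union arm flattened into the equally sized display fields) just shows that the mechanism exists:

```python
import ctypes
import sys

# Field masks from wingdi.h
DM_PELSWIDTH        = 0x00080000
DM_PELSHEIGHT       = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
CDS_FULLSCREEN      = 0x00000004

class DEVMODEA(ctypes.Structure):
    # Display-device variant of DEVMODEA (ANSI); fixed-width types are
    # used so the layout is 156 bytes on any platform, as on Windows.
    _fields_ = [
        ("dmDeviceName",         ctypes.c_char * 32),
        ("dmSpecVersion",        ctypes.c_uint16),
        ("dmDriverVersion",      ctypes.c_uint16),
        ("dmSize",               ctypes.c_uint16),
        ("dmDriverExtra",        ctypes.c_uint16),
        ("dmFields",             ctypes.c_uint32),
        ("dmPositionX",          ctypes.c_int32),
        ("dmPositionY",          ctypes.c_int32),
        ("dmDisplayOrientation", ctypes.c_uint32),
        ("dmDisplayFixedOutput", ctypes.c_uint32),
        ("dmColor",              ctypes.c_int16),
        ("dmDuplex",             ctypes.c_int16),
        ("dmYResolution",        ctypes.c_int16),
        ("dmTTOption",           ctypes.c_int16),
        ("dmCollate",            ctypes.c_int16),
        ("dmFormName",           ctypes.c_char * 32),
        ("dmLogPixels",          ctypes.c_uint16),
        ("dmBitsPerPel",         ctypes.c_uint32),
        ("dmPelsWidth",          ctypes.c_uint32),
        ("dmPelsHeight",         ctypes.c_uint32),
        ("dmDisplayFlags",       ctypes.c_uint32),
        ("dmDisplayFrequency",   ctypes.c_uint32),
        ("dmICMMethod",          ctypes.c_uint32),
        ("dmICMIntent",          ctypes.c_uint32),
        ("dmMediaType",          ctypes.c_uint32),
        ("dmDitherType",         ctypes.c_uint32),
        ("dmReserved1",          ctypes.c_uint32),
        ("dmReserved2",          ctypes.c_uint32),
        ("dmPanningWidth",       ctypes.c_uint32),
        ("dmPanningHeight",      ctypes.c_uint32),
    ]

dm = DEVMODEA()
dm.dmSize = ctypes.sizeof(DEVMODEA)
dm.dmPelsWidth = 1366
dm.dmPelsHeight = 768
dm.dmDisplayFrequency = 60   # this alone re-selects the driver's 60 Hz timing set
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY

if sys.platform == "win32":
    # Apply the mode as a temporary full-screen change (untested guess at
    # what a full-screen player might do; not Scala's confirmed behaviour).
    ctypes.windll.user32.ChangeDisplaySettingsExA(
        None, ctypes.byref(dm), None, CDS_FULLSCREEN, None)
```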

      I've tried disabling overlay in the IEGD via the CED, to no avail. I've tried fiddling with the Scala Player's configuration regarding fullscreen/window mode, to no avail. If Scala is configured to merely use a borderless window the size of the screen, it still flips that HSYNC pulse. Only if Scala runs in a normal window does the graphics mode stay intact.


      Scala is configured to run at a resolution "same as desktop", and the respective configuration menu item specifically mentions 1366x768. It also stubbornly mentions 60 Hz, no matter whether we choose 60 or 50 Hz in the Windows display properties (IEGD driver runtime configuration).

      The top-level Scala "script" indicates that the presentation's "main frame" consists of several rectangles playing different content snippets, and the rectangles sum up to precisely 1366x768 in total. Thus there's no need for any stretching/scaling between Scala's output frame buffer and the LCD's pixel matrix (it's a shame we have to use an analog signal path in between).


      I forgot to mention that, obviously, I have EDID strictly OFF in the CED; only the checkbox saying "use custom DTDs" is ticked. Thus my custom 1366x768@50/60 modes are effectively hardwired into the IEGD. I've also tried the "uninstall utility" that comes packaged with the IEGD, to get rid of any past valid DTDs in the Windows Registry; no effect there.


      I can provide our IEGD config files if anyone's interested. I also have a "stracent" syscall trace of the Scala Player, but it doesn't seem to show anything interesting (stracent is admittedly blind to a number of "special cases"). Unfortunately, I'm not conversant with the graphics-related parts of the Windows API / DirectX / whatever.


      Any ideas and hints are welcome :-)


      Frank Rysanek

        • Re: IEGD 10.2/10.3 vs. Scala Player v5: unexpected CRT timing dependency

          I've managed to diagnose this partial problem a little further and work around it - mostly by accident, I have to say.

          First I wondered if it would help to disable DirectX somehow. Uncle Google quickly led me to "dxdiag", where I disabled DirectDraw altogether. If I do that, my video player hangs while starting up, before switching to full-screen mode. Not much help.

          While fiddling with that, I moved the culprit PC to another room and hooked it up to another display. Suddenly I realized (saw on the scope) that the Scala player was running and the HSYNC pulse looked *my way*! How come? I figured out that I had probably powered up the PC without a monitor attached, and quickly verified that the assumption was right. The next thing I did was break off pin 15 (the DDC clock line) in the DSUB15 cable. Guess what: now the HSYNC pulse works just fine, the way I told it to :-)

          In other words, it seems that when the IEGD is switched into DirectX full-screen exclusive mode (I guess), it asks the monitor via DDC for its EDID parameters (or maybe this is performed by the generic DirectX code) and, most importantly, the monitor's preferred timings get reflected on the output, despite the fact that I told the IEGD not to do this and despite the fact that the basic desktop mode does obey.
          Maybe the EDID block is read at Windows boot, and the data is merely re-used upon the switch to DirectX FSE mode.
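          For what it's worth, the "preferred timing" a driver picks up is simply the first 18-byte Detailed Timing Descriptor in the monitor's EDID base block, and the HSYNC polarity lives in bit 1 of the descriptor's flags byte. A little decoder illustrates where those fields sit; the sample bytes are a hand-built DTD for a 1366x768@60-style mode with active-low HSYNC, not a dump of our actual panel:

```python
# Decode the fields of an 18-byte EDID Detailed Timing Descriptor (DTD)
# that matter here: pixel clock, HSYNC width and HSYNC polarity.
def parse_dtd(d: bytes):
    clock_hz = (d[0] | d[1] << 8) * 10_000            # stored in 10 kHz units
    h_active = d[2] | (d[4] >> 4) << 8
    h_blank  = d[3] | (d[4] & 0x0F) << 8
    h_sync_w = d[9] | ((d[11] >> 4) & 0x03) << 8
    flags = d[17]
    # With flags bits 4..3 == 11 (digital separate sync),
    # bit 1 is the HSYNC polarity: 1 = active high, 0 = active low.
    h_pol = "+" if flags & 0x02 else "-"
    return {
        "clock_mhz": clock_hz / 1e6,
        "h_total":   h_active + h_blank,
        "hsync_px":  h_sync_w,
        "hsync_us":  h_sync_w / clock_hz * 1e6,
        "hsync_pol": h_pol,
    }

# Hand-built sample DTD: 85.50 MHz, 1366+426=1792 total, 143 px HSYNC,
# digital separate sync, HSYNC active low, VSYNC active high.
sample = bytes([
    0x66, 0x21,        # pixel clock: 8550 * 10 kHz
    0x56, 0xAA, 0x51,  # h_active 1366, h_blank 426
    0x00, 0x1E, 0x30,  # v_active 768, v_blank 30
    0x46, 0x8F,        # h_front_porch 70, h_sync 143
    0x33, 0x00,        # v_front_porch 3, v_sync 3 (+ high bits)
    0x00, 0x00, 0x00,  # image size (unused here)
    0x00, 0x00,        # borders
    0x1C,              # flags: digital separate sync, -HSYNC, +VSYNC
])
print(parse_dtd(sample))
```

          If the FSE code path reads this descriptor and programs it verbatim, that would exactly reproduce the active-low ~1.67 us pulse we see on the scope.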

          It looks like a bug in the IEGD. It may be related to the fact that the EDID section in the "configuration" part of the CED GUI is not entirely exhaustive/unambiguous. The top-level choice says "use an EDID display if available", but the dependent tick-boxes and their active/greyed-out logic are not very clear. Perhaps not very clear to the IEGD source code maintainers in the first place.

          Intuitively, I just disabled the use of EDID wherever possible and tried to set my own DTDs in stone, and that's the way it works for me in the regular desktop mode. It worked the same even if I said "use EDID if available" (which un-greyed another section of the EDID config) and there said "use my custom DTDs only". But apparently the "DirectX FSE mode" code path within the driver understands the settings a bit differently :-) In practice it follows the EDID data from the display no matter what. The only way to stop the merry DDC chatter+anarchy, and to make it start working my desired way, was to cripple the DDC I2C link in hardware :-)