
ADVC110 vs. ADVC300


  • #16
    Not sure why your VCR wouldn't be 25 fps?

    As far as the sync goes, essentially the ADVC will try to lock audio samples onto the video clock, hence maintaining audio/video sync.
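    The arithmetic behind that lock can be sketched quickly. This is an illustration of why locking audio to the video clock works so cleanly for PAL DV, not a description of the ADVC's actual circuitry: at DV's 48 kHz locked-audio rate and PAL's 25 fps, every video frame owns an exact whole number of audio samples, so counting frames also counts audio with no cumulative drift.

```python
# Why locking audio samples to the video clock keeps A/V sync for PAL DV:
# each frame corresponds to an exact integer number of audio samples.
AUDIO_RATE_HZ = 48_000   # DV locked-audio sample rate
FRAME_RATE_FPS = 25      # PAL frame rate

samples_per_frame = AUDIO_RATE_HZ / FRAME_RATE_FPS
assert samples_per_frame == 1920  # exact -- no fractional samples to drift
```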


    • #17
      Originally posted by GrassValley_BH
      Not sure why your VCR wouldn't be 25 fps?
      Let me put it another way...

      I don't have any knowledge of how the ADVC110 or the FireWire interface work. All I know is that there's a format which we call "DV" which sends ~25 Mbps of video data over that FireWire interface. For PAL, it's at 25 fps.

      Somewhere I assume there's a master clock for the conversion, and 25fps is this clock divided by some number (it could be 13.5MHz / 540000 to get 25fps - that would be the case for DVD, but I don't know about DV).
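      The divider guess above actually checks out against the ITU-R BT.601 sampling numbers (which DV also uses): a 13.5 MHz pixel clock divided by the samples in one full PAL frame lands exactly on 25 fps. A quick sanity check of that arithmetic:

```python
# Checking the "13.5 MHz / 540000 = 25 fps" divider arithmetic using
# ITU-R BT.601 numbers for PAL.
PIXEL_CLOCK_HZ = 13_500_000   # BT.601 luma sampling rate
SAMPLES_PER_LINE = 864        # total (active + blanking) samples per PAL line
LINES_PER_FRAME = 625         # lines in one PAL frame

samples_per_frame = SAMPLES_PER_LINE * LINES_PER_FRAME
assert samples_per_frame == 540_000
frame_rate = PIXEL_CLOCK_HZ / samples_per_frame
assert frame_rate == 25.0
```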

      The A>D conversion must be driven by a clock from somewhere, but where? From a clock on the Firewire bus, or by a clock in the ADVC110, or by a clock PLL'd to the incoming video signal?

      Nothing in this world is exact, and no video is exactly 25.000000000000000000000000fps. So given that my analogue video is coming out of a piece of machinery where the video rate is dictated by magnetic particles on a tape, read by a spinning piece of metal, moved by motors driven by servos, it's a safe bet that the "25fps" coming out of the VCR will not match the "25fps" derived from a 13.5MHz (or other) crystal (or other) clock.
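      To put a number on the mismatch being described: suppose the VCR's transport runs 0.04% slow. (The 24.99 fps figure below is an invented example for illustration, not a measured VCR spec.) Over an hour of capture, the frame count falls measurably short of what a crystal-derived 25.000 fps clock expects:

```python
# A hypothetical slightly-slow VCR vs. a nominal 25.000 fps clock.
NOMINAL_FPS = 25.0
VCR_FPS = 24.99              # invented example rate, 0.04% slow
SECONDS = 3600               # one hour of capture

frames_from_vcr = VCR_FPS * SECONDS          # ~89,964 frames instead of 90,000
playback_seconds = frames_from_vcr / NOMINAL_FPS
shortfall = SECONDS - playback_seconds       # ~1.44 s lost per hour
```

      So over an hour, the two "25 fps" clocks disagree by about a second and a half of material, which is exactly the reconciliation question being asked.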

      So, what do your ADVC devices do to reconcile this? Is the DV output effectively (directly or indirectly) clocked from the incoming video, and DV Firewire interface is "OK" with this? Or is the DV clocked from the ADVC110 or the PC bus? If it's the latter, how is the incoming (independent!) video signal made to fit to this clock which it has no knowledge of, and is not slaved to?

      In short, do you have to drop or duplicate frames to keep up, or is there another method?

      I guess I'll find out when my ADVC110 arrives in a few days, but if you know the answer, I'm still interested!

      Either way, many thanks for all your help BH.



      • #18
        I don't think anyone here is quite sure what you're asking. In the analog to DV conversion, the ADVC110 simply needs to capture each frame from the analog signal, compress it according to the DV standard, and feed it into the DV stream that is captured by the computer. The ADVC110 sets the attribute flags in the DV stream to indicate it is 25 fps (for PAL).

        So let's say your VCR is somewhat inaccurate and sends a signal at 24.9 fps. The ADVC110 would still capture it a frame at a time and send the resulting DV stream to the computer, which saves it to disk. Presumably the ADVC110 has circuitry that waits until a whole frame is captured before converting; it doesn't just blindly take a snapshot at a point in time and hope it has a whole frame. Of course I'm only guessing that's how it works, but I can't see how else it could work.
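        That frame-driven model can be sketched as a toy simulation. Every name here (`AnalogSource`, `wait_for_complete_frame`, the `rate_flag` field) is invented for illustration; none of it reflects real ADVC internals. The point is just that the converter consumes whole frames as the source delivers them and stamps the output stream 25 fps regardless of the source's exact timing:

```python
# Toy model of frame-driven capture: wait for a whole frame, then emit it
# into a stream whose rate flag is always stamped "25fps" (PAL).
class AnalogSource:
    """Stand-in for an analog input; real hardware locks to embedded sync."""

    def __init__(self, frames):
        self._frames = list(frames)

    def has_signal(self):
        return bool(self._frames)

    def wait_for_complete_frame(self):
        # Hardware would block until a full frame arrives; we just pop one.
        return self._frames.pop(0)


def capture(source):
    stream = []
    while source.has_signal():
        frame = source.wait_for_complete_frame()
        stream.append({"data": frame, "rate_flag": "25fps"})  # PAL flag
    return stream


stream = capture(AnalogSource(["f0", "f1", "f2"]))
```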

        When your playback software reads the DV file, it sees that it should play it at 25 fps and does so.

        If you think of the ADVC110 as a device that captures frames, then it is not a problem.



        • #19
          Brandon's probably more the expert here, but correct me if I'm wrong - if your VCR wasn't outputting a 50 Hz (PAL) signal, your telly wouldn't be showing you a picture, even if only momentarily.

          So in turn, if your equipment isn't outputting 50 Hz, the ADVC too would see a dropout - because your source equipment isn't doing its job right. This would mean that the ADVC would simply break up the signal the same way a display would: (missing) black frames. And that's the biggest sign that either the VCR is stuffed, or the tape is. :)

          Now for Brandon to tell me I got it completely wrong...


          • #20
            You're right Kenneally...

            I think the point where we started getting confused was the difference between the signal format, PAL (50 Hz interlaced, or 25 fps), and the exact signal timing.

            As long as your signal is "regular" PAL, 50 Hz interlaced, then the ADVC is compatible with it.

            As for the exact signal timings, the video sync is embedded into the signal (in the vertical interval), so the receiving device will detect and drive off of that sync.
            Since as you say, nothing is 100% absolute, there's a good deal of tolerance there, especially in an analog world where things just "flow through."

            However, in the digital world, things are far more cut-and-dried and hence less tolerant. If the sync is sufficiently off, then a frame may be lost.

            The ADVCs generally time off of the embedded sync in the video, and if a video frame is missed, the corresponding audio frame is also missed, keeping the audio and video in sync with each other. Of course the higher-end ADVCs have Reference I/Os so you can genlock for recording and playout.
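            That "drop the matching audio with the lost video frame" behavior can be sketched as follows. The function and its inputs are invented for illustration (the real hardware does this in the conversion pipeline, not on arrays after the fact); the key idea is that at PAL DV rates each frame owns exactly 1920 audio samples (48,000 Hz / 25 fps), so discarding a frame's audio chunk along with the frame keeps everything that remains aligned:

```python
# Illustrative sketch: dropping the audio chunk that belongs to a lost
# video frame keeps the surviving audio and video aligned.
SAMPLES_PER_FRAME = 1920  # 48,000 Hz audio / 25 fps PAL video


def drop_matched(video_frames, audio_samples, lost_indices):
    kept_video, kept_audio = [], []
    for i, frame in enumerate(video_frames):
        chunk = audio_samples[i * SAMPLES_PER_FRAME:(i + 1) * SAMPLES_PER_FRAME]
        if i in lost_indices:
            continue  # drop the frame AND its audio chunk together
        kept_video.append(frame)
        kept_audio.extend(chunk)
    return kept_video, kept_audio
```

            If instead only the video frame were dropped, every frame after the loss would play 40 ms ahead of its audio, and the error would accumulate with each lost frame.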

            The ADVCs do need a full frame; they don't handle partial frames well, if at all. That's why the DV signal will be interrupted if you disconnect the video cable or encounter a truly blank area of tape (one that has no sync signal).

            Most of the newer ADVCs also offer a choice between internal and external clocking. The internal option runs off the ADVC's own clock, while the external option syncs off the FireWire bus clock.

            The higher-end ADVCs have PerfectSync, which synchronizes the FireWire clock with the ADVC to ensure there are no skipped or repeated frames.

            Regardless, the ADVCs are very, very good at maintaining audio/video sync, and that is a large reason why we get a lot of folks who have moved over from competing products.