![Very Happy :D](./images/smilies/icon_biggrin.gif)
First of all, I want to apologize for my bad writing, because I'm not from an English-speaking country
and my English is rusty.
![Rolling Eyes :roll:](./images/smilies/icon_rolleyes.gif)
Now... can someone clear up a big doubt I have about VIDEO files encoded with a variable bitrate (VBR)?
I'm in a big debate with some people...
My position is that with a VBR video file, the encoder varies the bitrate as needed,
and that it does so by analyzing the maximum bitrate the hardware can handle,
depending on the characteristics of the graphics hardware in the specific machine.
Of course, always staying within the configured minimum and maximum bitrate (VBR).
Everyone else says I'm wrong: that the encoder decides this entirely on its own,
without analyzing any hardware at all.
I countered that this is ridiculous, because otherwise I would get the same quality with a weak graphics card (500 MB DDR)
as with a more reasonable one (1 GB or 2 GB DDR),
and the encoder would not use the same VBR bitrate on different graphics cards.
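Just so we are all talking about the same thing, here is a toy sketch of what I mean by VBR with a minimum and a maximum (Python I wrote only for illustration; it is not code from any real encoder, and the bitrate numbers are made up):

```python
# Toy illustration of VBR rate control (NOT a real encoder):
# each frame gets a bitrate somewhere between a configured
# minimum and maximum, depending on how complex the frame is.

MIN_BITRATE = 1_000_000   # 1 Mbit/s floor (made-up example value)
MAX_BITRATE = 8_000_000   # 8 Mbit/s ceiling (made-up example value)

def frame_bitrate(complexity: float) -> int:
    """Map a complexity score in [0.0, 1.0] to a bitrate
    between MIN_BITRATE and MAX_BITRATE."""
    complexity = max(0.0, min(1.0, complexity))
    return int(MIN_BITRATE + complexity * (MAX_BITRATE - MIN_BITRATE))

# A still scene gets few bits, a fast action scene gets many:
for label, c in [("static scene", 0.1), ("talking head", 0.4), ("action scene", 0.9)]:
    print(f"{label}: {frame_bitrate(c) / 1e6:.1f} Mbit/s")
```

The question between us is what drives that decision: the content of the video, or the graphics card of the machine doing the encoding.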
Now I need your help with this: am I right or wrong?
![Question :?:](./images/smilies/icon_question.gif)
I don't want to offend anyone, but please, guys, I need a concrete answer rather than personal explanations. Please.
![Crying or Very sad :cry:](./images/smilies/icon_cry.gif)
__________________________________________________________________________________
[NOTE: Edited to change the phrase "theoretical explanations" to "personal explanations".]