Vegas vs. Avidemux AVC - Quality differences...

  • Megagoth1702
    Junior Member
    • Sep 2012
    • 2


    Hey guys,

    Quick question:

    When I render a 1080p video with Avidemux (AVC codec, constant rate factor 26), I get an overall bitrate of about 11.5 Mbps according to MediaInfo. When I render the same video in Vegas Pro 11 Trial with the MainConcept AVC encoder, maximum bitrate set to 15,000 kbps and average to 12,000 kbps, I get a HORRIBLY worse-looking video! The overall bitrate is higher, but I see a LOT of blockiness... So many blocks! Bah!
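
    For reference, here is a rough command-line sketch of what those Avidemux settings correspond to, using ffmpeg's libx264 (the file names are just placeholders):

    Code:
    # Constant-rate-factor encode, roughly matching Avidemux's AVC @ CRF 26;
    # input.avi and output.mp4 are placeholder names
    ffmpeg -i input.avi -c:v libx264 -crf 26 -preset medium output.mp4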

    I'm almost tempted to render with the Lagarith Lossless Codec in Vegas and then re-render the result with Avidemux, but sadly Avidemux does not support GPU acceleration, which I am a BIG FAN of.

    Any tips?

    Is there a better way to use the x264 codec with Vegas? Maybe one where I can actually use the great "constant rate factor" mode? I care all about quality, not about hitting an exact file size, if you know what I mean... (See the sketch below for the workflow I have in mind.)
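
    In case it helps, that two-step workflow would look roughly like this (a sketch; it assumes Vegas has already rendered a Lagarith-compressed AVI, and the file names are placeholders):

    Code:
    # Step 1: render a lossless intermediate (e.g. Lagarith AVI) from the Vegas render dialog.
    # Step 2: re-encode that intermediate with x264 in CRF mode, copying the audio untouched:
    ffmpeg -i vegas_lossless.avi -c:v libx264 -crf 26 -preset slow -c:a copy final.mkv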

    Thanks a lot guys, this is my first post and I hope I'm not making too many mistakes.

    Have a nice one,
    Mega
  • Megagoth1702
    Junior Member
    • Sep 2012
    • 2

    #2
    I know what's wrong now... after a whole night on Google...

    Apparently Vegas, the so-called industry standard, comes with SHITTY H.264 codecs! Why the HELL would I want to use 20k bitrates when I can just save a lossless video and re-render it with HandBrake? No GPU acceleration there YET, but soon.
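
    For anyone who wants to script that HandBrake step, the command-line equivalent is roughly this (a sketch; -q sets HandBrake's constant-quality RF value, and the file names are placeholders):

    Code:
    # Re-encode the lossless intermediate with HandBrake's x264 encoder
    # in constant-quality (RF) mode
    HandBrakeCLI -i vegas_lossless.avi -o final.mp4 -e x264 -q 26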

    Goddammit, the big players in the industry always suck at these little things. Bah.

    • turundus
      New Member
      • Dec 2012
      • 1

      #3
      Still, Vegas wins; you can actually pick and configure the codecs you want to use there.
