PacificaBren
Member
Hi,
I'm digitizing Laserdiscs using an Elgato external USB capture card designed for analog-to-digital conversions.
Today I experimented with setting the resolution to 560 x 360, which is the true resolution of a Laserdisc, but I will probably go back to using 720 x 480, because I didn't like the results.
But the other thing I did for the first time today was go into Settings/Output/Encoder Preset and choose "fast (high CPU usage, high quality)".
I did this because I figured I wouldn't be doing anything else on that computer while the capture was taking place, so it would be fine to devote all the CPU to this process.
But I'm worried that this setting might have been inappropriate for my computer's low specs.
What I'm using is a late 2014 Mac Mini, running Windows 10 via Boot Camp (for those who aren't Mac people, this means the Mac is booted from a Windows partition, running Windows at full speed, without any kind of emulation or virtual machine). This machine has a 2.6 GHz Core i5-4278U processor, which is, sadly, a mere dual-core processor. RAM is just 8GB, and the hard drive is a conventional 5400 RPM rotational device, i.e., not an SSD. The integrated graphics are Intel Iris Graphics, though I don't know if that's relevant.
So please tell me, is it possible this old, slow computer got overwhelmed by the Encoder Preset I selected? Is it possible I might see better results if I selected something like "veryfast (default) (medium CPU usage, standard quality)"?
Here is my most recent log file, btw:
Thank you for your input!