Loek Jehee forwarded your mail to me. I recommend using the "Send e-mail to the author" item in the MacPAR deLuxe Help menu to contact me directly with questions about the program. To answer your question: MPDL is a CPU-heavy program. A lot of calculations take place, in particular in the repair phase. I optimized the program considerably during the last few years, but it still remains a "hog", as you say. The 3 GB of memory you have is more than enough, and adding more will not make any difference. However, when you are sure the program uses 50% of the CPU and still the computer slows down considerably, then apparently some other process is using the other 50% of the CPU. It might be interesting to find out what program that is (or programs are), and see what happens if you manage to stop the other program(s) during the MPDL run. The only other thing I can think of is HDD usage causing OS X to become unresponsive. Anyway, thanks for your mail, and for using MPDL.

So there it is! Too bad I didn't find the e-mail address to do this myself in time. So my guess, according to Gerard: if you stop Azureus or Thoth while you do your MPDL unpacking, things should improve. I'm only guessing you're running all three apps on the same machine. If you're gonna run a dedicated Azureus or Thoth client 24/7, my suggestion is: invest not in more RAM, but in a second Mac. Azureus and Thoth will run on as little as a G3 processor. I have a dedicated Mac on my home network, an old B&W G3 300 with 512MB of RAM, OS X 10.4.8, two 80GB HDs, and 10/100T Ethernet; all it runs is Azureus and Thoth. When stuff is done, I simply transfer it over my network from its download folders to my main machine, unpack it using MPDL, and then burn as necessary. This way the load is managed rather easily between all machines, and I don't have to worry about CPU hogging; every machine has a dedicated purpose. Check around for good deals on old G3's and G4's.
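The author's suggestion, finding out which other process is eating the remaining CPU, can be done from a terminal as well as from Activity Monitor. A minimal sketch (the `renice` PID is a placeholder, not from the thread):

```shell
# List the five processes using the most CPU right now.
# ps's -A/-o flags and the pid/pcpu/comm columns work on both macOS and Linux.
ps -Ao pid,pcpu,comm | sort -k2 -nr | head -n 5

# If the culprit is something you'd rather slow down than quit,
# lower its scheduling priority instead (1234 is a placeholder PID):
#   renice -n 19 -p 1234
```

Running this once while MPDL is repairing and once while it is idle makes it easy to see which process owns the "other 50%".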
When using top or Activity Monitor to watch the processor usage of unrar, it is only at 6 percent at most during extraction, and still OS X becomes unresponsive. On the ASRock I could compile code and watch a movie in XBMC while SABnzbd was extracting files without a problem (probably also because of the famous 224-line patch to the Linux kernel in 2010, which drastically improved multitasking).

By default in the SABnzbd settings, multicore par is greyed out on OS X, btw, so it cannot be changed there. Perhaps it can be activated by manually editing the SABnzbd config file with a text editor? But it doesn't make sense: when multicore par is not activated, wouldn't SABnzbd's unrar and par tasks use fewer resources, since they use only one core instead of two, so the system should have more power, with one core free for other tasks? So it shouldn't freeze at all? Or is it that if you assign a task to more cores, more of the system's resources are used for that specific task, so other tasks get less priority, hence the freezes?

At least this is how it worked on the ASRock under Ubuntu: by default, if I remember correctly, SABnzbd used multicore par/unrar and I had hiccups when watching videos in XBMC/Kodi, so I set it to use one core only, which solved the hiccups. The really slow Atom processor in the ASRock ION was better at handling multitasking with unrar & par than the i5 processor in the Mac mini, lol, go figure.
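One way to keep an unpacker from starving the desktop, without touching SABnzbd's settings at all, is to lower its scheduling priority with `nice`. This is a generic sketch, not a SABnzbd feature, and the archive path is a placeholder:

```shell
# nice lowers a process's CPU scheduling priority; 19 is the lowest.
# A niced process still uses idle CPU but yields to interactive apps.
nice -n 19 sh -c 'echo "running at lowest priority"'

# e.g. for a manual extraction (placeholder paths):
#   nice -n 19 unrar x archive.rar /tmp/extracted/
```

Note that `nice` only affects CPU scheduling; if the slowdown is disk-bound, as the 6% CPU reading above suggests, it may not help much.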
I have the same issues on OS X Yosemite 10.10.1 on a Mac mini 2014: whenever SABnzbd starts extracting a downloaded file, my system gets really slow, Chrome and other apps become unresponsive, XBMC/Kodi stops playback and buffers for 30 seconds or longer, and the dreaded "spinning wheel" shows when opening apps or browsing with Chrome. As soon as the extraction finishes, everything goes back to normal again. I never had these problems on my ASRock 330 ION, which was running Ubuntu on much slower hardware, before I bought the Mac mini as a replacement for my HTPC.

I'm new to this board but have used SABnzbd+ for some time. Today I decided to look for the multicore par2 binary for OS X and found it via a link in the Wiki for SABnzbd+; it was the same place where the Linux multicore par2 can be found. I copied the relevant files into the SABnzbd+ app (app/Contents/Resources/osx/par2/) and entered -t0 in the parameter field under the Switches page in SABnzbd+. Now I want to use multicore par2, but each file I download says "filename..." and then the file is listed as "Completed" without the par2 stage being finished. Do I delete the par2-classic executable and just leave the par2 executable in the directory? I wanted to see if multicore par2 could speed things up a bit when repairing. I have to mention that I am using Snow Leopard right now. Should I just avoid this multicore par2 thing altogether with OS X? Edited.
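Whether the binary in that directory is actually the multicore build can be checked by looking for a threads switch in its help output before pointing SABnzbd at it. A hypothetical helper, demonstrated against `grep` because the par2 path depends on your install:

```shell
# Hypothetical helper: does a command's help text mention a given flag?
supports_flag() {
    "$1" --help 2>&1 | grep -q -- "$2"
}

# Against the binary copied into the app bundle (path from the post):
#   supports_flag ./SABnzbd+.app/Contents/Resources/osx/par2/par2 '-t' \
#       && echo "multicore build (accepts -t0)" \
#       || echo "classic build: remove -t0 from the Switches page"

# Demonstration with a command that is certainly installed
# (GNU grep's --help lists -v, so this prints the message):
supports_flag grep '-v' && echo "grep supports -v"
```

If the check says the binary is the classic build, the -t0 switch would be rejected, which would explain jobs finishing without the par2 stage running.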