I have a Python script, file_creator.py. Its purpose is to take 1037 CSV files and concatenate them into one single file. The problem is that each time I run the script, my whole system freezes. How can I prevent my CPU from reaching 100%? I have this problem with many other scripts and I just don't know how to prevent it, even though I have a very good computer.
How could I modify the script so it runs fine on my computer? I would like advice on how to handle this kind of problem. How can I monitor the CPU so that it does not go over 100%?
Hello Sir, I can update the script to use threading and use less power to suit your needs. Looking forward to your response. Thank you.
17 freelancers are bidding on average $28 for this job
I am a Python expert and I can fix your issue. I can start immediately. Please check my reviews.
Hi there, I checked your script. I think you should not read the CSVs into a list; you should merge multiple pandas DataFrames instead. Check this URL: [login to view URL] Hope it helps.
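To illustrate this suggestion, here is a minimal sketch, assuming pandas is installed and the files share a common header; the function name `concat_csvs` and its arguments are illustrative, not from the original script. Writing each frame out as soon as it is read avoids holding all 1037 DataFrames in memory at once, which is what a single giant `pd.concat` would do:

```python
import glob
import pandas as pd

def concat_csvs(pattern, out_path):
    """Concatenate every CSV matching `pattern` into `out_path`,
    writing each frame out immediately so only one file's data
    is ever in memory at a time."""
    first = True
    for path in sorted(glob.glob(pattern)):
        df = pd.read_csv(path)
        # Write the header only for the first file, then append rows.
        df.to_csv(out_path, mode="w" if first else "a",
                  header=first, index=False)
        first = False
```

Called as, for example, `concat_csvs("data/*.csv", "merged.csv")`.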
I suspect that your computer may be running out of RAM, causing it to write memory to the swap file on disk, so your CPU processes back up waiting for disk access. I've worked with huge files that won't fit into memory…
Hi. Generally, when we need to ease up on the CPU, we should periodically yield execution to the system by using sleep. But in this case the likely culprits are the bulk memory operations executed by read_csv and pd.concat. Since we…
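A sketch of the sleep-between-chunks idea this bid describes, combined with streaming the files instead of bulk-loading them; `concat_with_pauses` and the `pause` parameter are illustrative names, and the files are assumed to share one header line:

```python
import glob
import time

def concat_with_pauses(pattern, out_path, pause=0.01):
    """Stream each matching CSV into `out_path`, sleeping briefly
    between files so other processes get CPU time."""
    with open(out_path, "w", newline="") as out:
        for i, path in enumerate(sorted(glob.glob(pattern))):
            with open(path, "r", newline="") as f:
                header = f.readline()
                if i == 0:
                    out.write(header)  # keep the header from the first file only
                out.write(f.read())
            time.sleep(pause)  # yield to the OS between files
```

The sleep does not make the total job faster, but it keeps the CPU from being pinned at 100% for the whole run, which is what makes the system feel frozen.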
Hello, software engineer here. Questions: why are you using Python for this task? Why not run Windows-native applications for these jobs (if you are in a Windows environment)? I can evaluate your current scripts and give…
Hello! I am a Python developer. I looked at your project and it seems interesting. I have good experience in Python and I am an expert in it. I have all the skills required to be a good developer. I am interested…
Dear friend, this is a placeholder bid. I need to see first what is happening, and then I can diagnose and fix the problem. The cost to you will depend on the actual time spent…
Hello, I will audit the script and find the reason for the high CPU usage. I will make sure everything is in perfect order. Please ping me in the PMB and share all the information. Looking forward to hearing from you. Thanks.
Ready to understand your program and reduce its CPU usage in the shortest possible time. I have experience solving similar problems.
Hi, I'm very interested in this project. I have experience building and debugging Python applications. I reviewed your code; it has a lot of CPU- and memory-intensive operations. If your purpose is just to concat…
Hi, I have 4 years of experience with Python and 6 with Linux. I think your code could be optimized, but I need some example files to test it and the precise specifications of your PC.
Hello, in order to fix a CPU-intensive application, we need to use a profiler to catch the code block or function that consumes too many CPU cycles. But there is another way: looking at the algorithm. Please send me…
If you don't need the CSV parsed in memory and are just copying from input to output, it will be a lot cheaper to avoid parsing at all and copy without building anything up in memory: import shutil …
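The truncated snippet above can be fleshed out as follows; this is a sketch, assuming the files share one header line, and `merge_raw` and the glob pattern are illustrative names. `shutil.copyfileobj` streams bytes in fixed-size chunks, so no file is ever fully loaded and no CSV parsing happens at all:

```python
import glob
import shutil

def merge_raw(pattern, out_path):
    """Byte-for-byte copy of every matching CSV into one output file,
    skipping the header line on all but the first file."""
    with open(out_path, "wb") as out:
        for i, path in enumerate(sorted(glob.glob(pattern))):
            with open(path, "rb") as f:
                if i > 0:
                    f.readline()  # drop the duplicate header line
                shutil.copyfileobj(f, out)  # streams in fixed-size chunks
```

Because nothing is parsed, this is both the fastest and the lightest option when the inputs only need to be stitched together.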
I've been developing with Python for over 5 years and have 15 years of development experience overall. It seems like something is not right with your script. I'd add psutil; this module has a number of methods…
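A sketch of how psutil could be used to watch the CPU, as this bid suggests. Note that psutil is a third-party package (`pip install psutil`), and `throttled` and its parameters are illustrative names, not part of the original script:

```python
import time
import psutil  # third-party: pip install psutil

def throttled(iterable, max_cpu=80.0, pause=0.1):
    """Yield items from `iterable`, pausing whenever system-wide
    CPU usage is above `max_cpu` percent."""
    for item in iterable:
        # cpu_percent samples system CPU usage over a short interval.
        while psutil.cpu_percent(interval=0.05) > max_cpu:
            time.sleep(pause)  # back off until the CPU settles
        yield item
```

It could wrap the file loop, e.g. `for path in throttled(csv_paths): append_file(path)`, so the script backs off whenever the rest of the system gets starved.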