We cannot remove files that require root privileges, like PID files, from Python. We also can't restart services under /etc/init.d.
At least, we can't do it directly with os.unlink or subprocess, because we will run into an error like this:
PermissionError: [Errno 13] Permission denied
How to solve this problem?
This is easy: we can use Fabric's sudo("command") to overcome the problem.
1. Example: restarting a service using Fabric
with cd('%s' % PROJECT_PATH):
    sudo('/etc/init.d/nginx restart')  # nginx is a hypothetical example service
We can use sudo with Fabric without any hassle.
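Under the hood, Fabric's sudo() essentially prefixes the command with sudo (and handles the password prompt for you) before running it over SSH. Here is a minimal local sketch of that idea using only the standard library; the init script path is a hypothetical example:

```python
import subprocess

def sudo_run(command, dry_run=True):
    """Prefix a shell command with sudo, roughly what Fabric's sudo() does.

    With dry_run=True we only return the argv instead of executing it,
    since actually restarting a service needs root privileges.
    """
    argv = ["sudo"] + command.split()
    if dry_run:
        return argv
    return subprocess.run(argv, check=True)

print(sudo_run("/etc/init.d/nginx restart"))
# → ['sudo', '/etc/init.d/nginx', 'restart']
```

In a real fabfile you would simply call sudo('/etc/init.d/nginx restart') inside a task and let Fabric handle the SSH session and password prompt.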
Here is a tutorial on how to set up Bitcoin and start mining on Fedora 18 (Linux). First things first, we need to download the dependencies.
1. Install Qt
Since we will be using the Bitcoin-Qt client, it needs Qt to run.
sudo yum install libQtGTL
2. Download the Bitcoin client for Linux
Go to the Bitcoin download page and pick the Linux package. Extract it and copy the contents of the bin folder into /usr/local/bin. I use the 64-bit version.
sudo cp bin/* /usr/local/bin
3. Start Bitcoin-Qt
Just execute bitcoin-qt from your terminal. The first time it runs, it needs to retrieve the block chain, which means it will download about 9 GB of data. Just leave your laptop on for a day and it will finish.
4. Start solo mining
After the block download has completed, we can start solo mining. But first, we need to install the dependencies:
sudo yum install opencl wxPython git pyserial
git clone https://github.com/Kiv/poclbm
You can keep solo mining, or join a mining pool such as deepbit.net, slush's pool, etc.
Using multiprocessing in Python is a bit tricky. Sometimes, when we use a plain Queue() and join(), the program just hangs.
To make it more stable, we can use a Manager together with a consumer pattern. Remember: a queue created through a multiprocessing Manager behaves better than a plain Queue(). Why?
As the Python documentation warns: "if a child process has put items on a queue (and it has not used JoinableQueue.cancel_join_thread()), then that process will not terminate until all buffered items have been flushed to the pipe. This means that if you try joining that process you may get a deadlock unless you are sure that all items which have been put on the queue have been consumed. Similarly, if the child process is non-daemonic then the parent process may hang on exit when it tries to join all its non-daemonic children. Note that a queue created using a manager does not have this issue. See Programming guidelines."
Here is the problem: I need to read all the HTML files inside a folder and process them, but not sequentially, because that is too slow. That means I should be able to open and read the HTML files in parallel.
We can use multiprocessing to solve this. The first thing we will build is the queues; they will carry all input and output. Here is the manager-based Queue() setup:
import multiprocessing

manager = multiprocessing.Manager()
task_queue = manager.Queue()
todo_queue = manager.Queue()
Then we use a Consumer, a Process subclass that keeps pulling tasks from the queue, instead of a bare Process:
When deploying Django to production with DEBUG = False, we may sometimes get a blank page or a bare "500" response. Even though we already put 500.html in the templates directory, it seems not to work.
Solution? Check ALLOWED_HOSTS in your settings.py:
ALLOWED_HOSTS = ["www.example.com"]  # example.com is a placeholder; use your own domain(s)
Make sure it’s not empty 😀
from bs4 import BeautifulSoup

# Pass a file object (or the file's contents), not the file name:
# BeautifulSoup("your.html") would parse the literal string "your.html".
with open("your.html") as f:
    soup = BeautifulSoup(f, "html.parser")
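Once the document is parsed, the soup object lets us search the tree. A small self-contained example (the HTML snippet here is made up for illustration):

```python
from bs4 import BeautifulSoup

html = "<html><body><p class='intro'>Hello</p></body></html>"
soup = BeautifulSoup(html, "html.parser")

print(soup.find("p").text)      # → Hello
print(soup.find("p")["class"])  # → ['intro']
```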