Hi all,
I've been speaking with Harry (PA dev) briefly about this issue, but I thought I'd open it up to the community, partly to get input from others who have had the same issue and partly to help anyone who comes across it later.
Basically, when I run my code locally it works perfectly. The problem is that when I run it on PA it just stops half way through the process, and only on large jobs.
Essentially, my script searches for links within files and adds them to a queue for processing. On PA the script does work on small queues (I've had it complete a 600-link run), but on larger queues it just stops.
Last night, on Harry's recommendation, I started the large job through a scheduled task (it completes in about 20 minutes on my local machine). This morning it's still just listed in the 'processes running' section and my CPU usage is static. I tested the scheduled task on the small job and, as expected, it worked fine.
Here is a simplified version of my code (with the data processing etc. removed)...
import Queue
from threading import Thread
import csv

# LIFO queue of links waiting to be processed
q = Queue.LifoQueue()

# parse the file to get the links and put them in the queue
def get_put_links(linkfile):
    # process the file and get the links (parsing removed), then:
    q.put(link)

# run the function
get_put_links(linkfile)

# pull from the queue, doing a GET request on each link
def multi():
    while True:
        link = q.get()
        # do stuff with link (GET request)
        q.task_done()
        if q.empty():
            break

# start 5 threads
threads = [Thread(target=multi) for _ in range(5)]
for t in threads:
    t.start()

q.join()
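Writing this up, I did notice that the empty() check in my workers looks racy: if two threads call q.get() when only one item is left, the loser blocks forever on get(), which would look a lot like the 'just stops' behaviour I'm seeing. Here's a minimal sketch of the variant I'm considering, with daemon worker threads so q.join() alone decides when the work is finished (do_get() is just a stand-in for my real request code):

import Queue
from threading import Thread

q = Queue.LifoQueue()

def do_get(link):
    # stand-in for the real GET request / data processing
    pass

def multi():
    while True:
        link = q.get()    # blocks until an item is available
        do_get(link)
        q.task_done()     # lets q.join() know this item is finished

# daemon threads die with the main thread, so a worker blocked
# on q.get() can't keep the process alive after the queue drains
for _ in range(5):
    t = Thread(target=multi)
    t.daemon = True
    t.start()

q.join()    # returns once task_done() has been called for every put()

Does that look like a sensible pattern, or am I barking up the wrong tree?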
That said, I'm also starting to wonder whether there's a bug or restriction on PA around queue sizes? This is (hopefully) going to be the main processing logic of my app, so it's important that it works.
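For what it's worth, I never pass maxsize, and as far as I know Queue.LifoQueue() is unbounded by default, so I wouldn't expect the queue itself to enforce any limit. Just to illustrate what a size restriction would actually look like (the maxsize=2 here is purely for demonstration):

import Queue

bounded = Queue.LifoQueue(maxsize=2)
bounded.put(1)
bounded.put(2)
try:
    bounded.put(3, block=False)   # a full bounded queue raises Queue.Full
except Queue.Full:
    print "bounded queues refuse (or block on) put() when full"

My queue never raises anything like that; the process just goes quiet.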
Thanks in advance and hope you're having a nice weekend. :)