Options for launching a SQL script asynchronously from a web app
I'd like to launch some SQL script asynchronously from a web app and
get some kind of feedback later.
Some form of authentication would be a plus.
Is there anything ready?
--
Ivan Sergio Borgonovo
http://www.webthatworks.it
Authenticate in the web app, drop a script in a directory, and run it with
cron maybe?
Authenticate in the web app, drop a row in a table, and let a long-running
process pull the row out and kick something off in a thread pool?
I've seen both. You've got to monitor both somehow. The second one is much
more work but lets you do more with the output and gives you a nice thread
pool. There might be a simple bash-ie way to make a thread pool, but I
don't know it.
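A rough sketch of the second option follows. All names here are made up, and sqlite3 stands in for PostgreSQL just to keep the sketch self-contained; a real setup would use a PostgreSQL driver and actually execute the stored scripts:

import sqlite3
import threading
from concurrent.futures import ThreadPoolExecutor

# sqlite3 stands in for PostgreSQL; table and column names are hypothetical
db = sqlite3.connect(":memory:", check_same_thread=False)
db_lock = threading.Lock()
db.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, script TEXT, status TEXT)")
db.execute("INSERT INTO jobs (script, status) VALUES ('VACUUM;', 'pending')")
db.commit()

def run_job(job_id, script):
    # a real worker would execute the SQL script here and capture its output
    with db_lock:
        db.execute("UPDATE jobs SET status = 'done' WHERE id = ?", (job_id,))
        db.commit()

# the long-running process: pull pending rows, hand them to a thread pool
with db_lock:
    rows = db.execute(
        "SELECT id, script FROM jobs WHERE status = 'pending'").fetchall()
pool = ThreadPoolExecutor(max_workers=4)
for future in [pool.submit(run_job, j, s) for j, s in rows]:
    future.result()  # wait for completion so status can be reported back

print(db.execute("SELECT status FROM jobs WHERE id = 1").fetchone()[0])

The web app reports feedback simply by reading the status column back later.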
--Nik
On Fri, Nov 7, 2008 at 8:06 PM, Ivan Sergio Borgonovo <mail@webthatworks.it> wrote:
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
Whenever asynchronous execution or batch processing is needed, PgQ in
SkyTools is one option worth evaluating. It has PHP interfaces as well
as tens of useful Python scripts.
On Sat, Nov 8, 2008 at 3:49 AM, Nikolas Everett <nik9000@gmail.com> wrote:
On Sat, 8 Nov 2008 11:36:06 +0200
"Asko Oja" <ascoja@gmail.com> wrote:
> Whenever asynchronous execution or batch processing is needed PgQ
> in SkyTools should be one option to be evaluated. It has PhP
> interfaces as well as tens of useful Python scripts.
Nice to know, and a really nice tool, but it seems not suited to my
needs.
1) it is too complicated, and it's not in Debian etch (it seems to be
in sid at least)
2) it doesn't guarantee that an event is processed only once
3) events are passed in batches; it is up to the consumer to process
one event at a time
4) it seems to be polling the queue (?), not receiving events
I could set it up so that it would look like what I need... but that
looks like more work than writing a Python daemon.
At this stage I think a delay in response of 59 sec may be tolerable,
and I can manage it with cron and lock files.
I don't like it... but I've done it before and I'm still alive.
I just have to trigger an event that runs asynchronously, prevent
other events of the same kind from being triggered while one is
running, and report back the event status.
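A minimal sketch of the cron + lock file approach, assuming a Unix system (the file paths are hypothetical, and the actual SQL execution is elided):

import fcntl
import os
import sys
import tempfile

# hypothetical paths; a cron entry would run this script every minute
LOCKFILE = os.path.join(tempfile.gettempdir(), "sql_job.lock")
STATUSFILE = os.path.join(tempfile.gettempdir(), "sql_job.status")

lock_fd = open(LOCKFILE, "w")
try:
    # non-blocking exclusive lock: fails at once if a previous run holds it
    fcntl.flock(lock_fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
except IOError:
    sys.exit(0)  # same job already running; cron retries next minute

# ... run the SQL script here (e.g. via psql -f) ...
with open(STATUSFILE, "w") as f:
    f.write("ok")  # the web app polls this file to report status back

print(open(STATUSFILE).read())

The flock is released automatically when the process exits, so a crashed run does not leave a stale lock behind the way a hand-rolled lock file can.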
thanks
--
Ivan Sergio Borgonovo
http://www.webthatworks.it
On Sat, Nov 8, 2008 at 12:40 PM, Ivan Sergio Borgonovo <mail@webthatworks.it> wrote:
> On Sat, 8 Nov 2008 11:36:06 +0200
> "Asko Oja" <ascoja@gmail.com> wrote:
> > Whenever asynchronous execution or batch processing is needed PgQ
> > in SkyTools should be one option to be evaluated. It has PhP
> > interfaces as well as tens of useful Python scripts.
> Nice to know and really nice tool but it seems not suited for my
> needs.
> 1) it is too complicated and it's not in Debian etch (it seems to be
> at least in sid anyway)
We are using it on Debian etch, as far as I know.
> 2) It doesn't guarantee that an event is processed only once
By default, no, but if you use pgq_ext it does guarantee that. In many
cases we can build things so that duplicate events are ignored by the
procedure that applies them.
> 3) events are passed in batches. It is up to the consumer to process
> an event at a time
That I do not fully understand. Events are grouped in batches because
they can be processed more efficiently that way.
> 4) it seems it is pooling the queue (?), not receiving events
Yes, it's polling the queue. The usual delay for event processing with
the default configuration is around a couple of seconds.
> I could set it so that it would look like as what I need... but it
> looks much work that writing a python daemon.
Our consumers written in Python are usually really small (the code
fits on screen without scrolling). SkyTools contains several useful
consumers, so for common cases you might not even need to write anything.
Here is the code for one called simple_serial_consumer. It reads
url-encoded events from a queue, decodes them, and for each event
executes a function in some database with parameters filled in from
the event. It also keeps track of which events have been processed,
so it's guaranteed that none will be done twice.
#!/usr/bin/python
import sys, pgq, skytools

skytools.sane_config = 1

class SimpleSerialConsumer(pgq.SerialConsumer):
    def __init__(self, args):
        pgq.SerialConsumer.__init__(self, "simple_serial_consumer",
                                    "src_db", "dst_db", args)
        self.dst_query = self.cf.get("dst_query")
        self.producer = self.cf.get("producer", default="")

    def process_remote_batch(self, batch_id, ev_list, dst_db):
        curs = dst_db.cursor()
        for ev in ev_list:
            # skip events from other producers, if a producer filter is set
            if self.producer != "" and ev.producer != self.producer:
                ev.tag_done()
                continue
            payload = skytools.db_urldecode(ev.payload)
            self.log.debug(self.dst_query % payload)
            curs.execute(self.dst_query, payload)
            try:
                res = curs.dictfetchone()
                self.log.debug(res)
            except:
                pass
            ev.tag_done()

if __name__ == '__main__':
    script = SimpleSerialConsumer(sys.argv[1:])
    script.start()
SkyTools also contains a trigger function, logutriga, that when attached
to a table puts url-encoded events into the queue, thus making inserting
events into the queue as simple as inserting into a table with standard SQL.
The only way to launch an ~asynchronous~ request from a web app is using
XMLHttpRequest. That would be answered by a web server of
some sort. The server can launch any app you want - to run a SQL
script, say. Authentication can be done with passwords, as usual.
Security may be an issue - you should use https and not hard-code
passwords... all the usual security precautions.
Anything ready? Hard to say without knowing more about your
specific needs.
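On the server side, the launching part can be as simple as spawning a process without waiting for it. A sketch (the path and the use of echo in place of a real psql invocation are placeholders):

import subprocess

def launch_sql_script(path):
    # 'echo' stands in for the real command, e.g. psql -f <path>;
    # Popen returns immediately, so the web request is not blocked
    return subprocess.Popen(["echo", path], stdout=subprocess.PIPE)

proc = launch_sql_script("/tmp/report.sql")
out, _ = proc.communicate()  # later: collect output for feedback
print(out.decode().strip())

The handle returned by Popen can be polled later to provide the "feedback" part of the original question.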
TJ O'Donnell
http://www.gnova.com