From: Ciprian Dorin Craciun
Subject: Bridging GProlog with other systems [Was: Re: FFI experimentation, some advice please...]
Date: Fri, 5 Jul 2013 15:40:32 +0300

On Fri, Jul 5, 2013 at 1:56 AM, Sean Charles <address@hidden> wrote:
> I am already having delusions of grandeur and thinking about using RabbitMQ,
> Redis and MySQL and writing non-deterministic code that returns each row of
> a result set or keeps reading from an AMQP queue just because I can. For my
> day job, the ability to showcase Prolog would be great but it has to connect
> to all the usual stuff to be seen to be useful.


    A little bit off topic from the original subject of FFI, but related to "bridging" GProlog with other "modern" systems.

    Indeed, GProlog can be a perfect "secret weapon" behind a lot of solutions, especially since it is simple and small enough to use and extend, and it compiles to machine code (thus no distribution headaches).  However, making it "talk" with today's software is a daunting task (at first glance).  I know there are "raw" sockets and the usual "text" or "binary" consuming predicates, but that means going back to basics on both sides (GProlog and the component sitting at the other end).

    Moreover, regarding the original poster's idea of using MySQL, Redis, etc. directly from GProlog: I think that is a highly impractical decision, especially since all development must be done in C (or by wrapping C++ in C code).  (Is there any "production-ready" RabbitMQ library for C?  How about for Riak, CouchDB, or many other "NoSQL" solutions?)


    However I think one could easily do the following instead:

    * implement an "efficient" (i.e. in C) JSON parser and formatter that reads properly formatted UTF-8 JSON and spits out Prolog terms (terms with some constraints), plus the reverse transformation;
    * implement a "gateway" to the outer world based on ZeroMQ (both a "client" mode to allow "asking" questions, i.e. REQ sockets, and a "server" mode to allow "receiving" questions, i.e. REP sockets);
    * then implement an adapter (or "driver") based on the same technologies (ZeroMQ and JSON) that translates these requests to the wrapped resource (for example, NodeJS, Go or even Java would be perfect candidates for such a task).
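
    To make the first bullet concrete, here is a tiny sketch (in Python, for brevity) of one *hypothetical* JSON-to-term mapping — the `json([key-value, ...])` wrapper and quoted-atom encoding below are conventions I made up for illustration, not necessarily what a C implementation would choose:

```python
import json

def json_to_term(value):
    """Render a parsed JSON value as GProlog-term syntax (one possible encoding)."""
    if value is None:
        return "null"
    if isinstance(value, bool):
        # Must be checked before int: in Python, bool is a subclass of int.
        return "true" if value else "false"
    if isinstance(value, (int, float)):
        return repr(value)
    if isinstance(value, str):
        # Represent JSON strings as quoted Prolog atoms.
        return "'" + value.replace("\\", "\\\\").replace("'", "\\'") + "'"
    if isinstance(value, list):
        return "[" + ",".join(json_to_term(v) for v in value) + "]"
    if isinstance(value, dict):
        # Objects become a json([...]) term holding key-value pairs.
        pairs = ",".join(
            json_to_term(k) + "-" + json_to_term(v) for k, v in value.items()
        )
        return "json([" + pairs + "])"
    raise TypeError(value)

print(json_to_term(json.loads('{"name": "gprolog", "tags": ["small", "fast"]}')))
# → json(['name'-'gprolog','tags'-['small','fast']])
```

    The reverse transformation (terms back to JSON) is symmetric, which is exactly why the terms need "some constraints": only terms produced by a mapping like this round-trip cleanly.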


    Completing these (especially the first two) means GProlog is then able to "communicate" with the rest of the "modern" world.  :)  Overhead and security are not such big issues, since ZeroMQ can be made to talk over UNIX domain sockets, or even "in-memory" if the other end of the communication channel runs as a separate thread in the GProlog process, thus not entangled with the GProlog VM.
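
    The shape of that gateway exchange can be sketched without assuming ZeroMQ is installed: below, a plain UNIX domain socket with newline-delimited JSON stands in for a ZeroMQ REQ/REP pair (ZeroMQ would handle the framing and reconnection for you).  The socket path, the `question`/`answer` keys, and the canned reply are all made up for the sketch:

```python
import json
import os
import socket
import tempfile
import threading

# Hypothetical gateway endpoint: a UNIX domain socket path (made up for the sketch).
SOCK = os.path.join(tempfile.mkdtemp(), "gateway.sock")
ready = threading.Event()

def server():
    # Stands in for the GProlog-side "server" (REP) endpoint: read one JSON
    # question, answer it, and stop.  A real gateway would hand the decoded
    # term to the Prolog engine instead of sending a canned reply.
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
        srv.bind(SOCK)
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn, conn.makefile("rw") as stream:
            request = json.loads(stream.readline())
            reply = {"answer": request["question"] == "father(X, liam)"}
            stream.write(json.dumps(reply) + "\n")
            stream.flush()

threading.Thread(target=server, daemon=True).start()
ready.wait()

# The "client" (REQ) side: connect, send one question, block for the reply.
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as cli:
    cli.connect(SOCK)
    with cli.makefile("rw") as stream:
        stream.write(json.dumps({"question": "father(X, liam)"}) + "\n")
        stream.flush()
        reply_received = json.loads(stream.readline())

print(reply_received)  # {'answer': True}
```

    Swapping the transport for ZeroMQ's `ipc://` or `inproc://` keeps this same request/reply shape while removing the hand-rolled framing.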


    If someone is interested in such a direction, I have a prototype that solves the JSON part (by using the Jackson library) and the ZeroMQ "server" part.  It's open source (LGPL3) and on GitHub at the link below.  To test (or at least see the "look-and-feel" of the solution), see the files in `sources/tests`: in a couple of lines they bridge a Python client with a GProlog server.  To build, you need Ninja (also linked below): just run `ninja` and look in the `.outputs` folder.

      https://github.com/cipriancraciun/volution-gprolog-libraries
      https://github.com/martine/ninja

    (My intention was to use GProlog for describing and enforcing ACLs in a NodeJS web-service.  A "binding" to LevelDB or Memcache was also intended, to cache deduced "facts"...)


    Feedback is welcomed,
    Ciprian.


    P.S.: My Prolog is **extremely** rusty, thus if I've made horrible mistakes, please let me know...  :D


