HTTP (Hypertext Transfer Protocol) is the IETF standard protocol for transferring information between a web-client (e.g., a browser) and a web-server. The protocol is a simple envelope protocol where standard name/value pairs in the header are used to split the stream into messages and communicate about the connection-status. Many languages have client and server libraries to deal with the HTTP protocol, making this protocol an excellent candidate for building client-server applications. In particular, HTTP is a natural fit for networked systems built according to the principles of "Representational State Transfer" (REST).
In this document we describe a modular infrastructure to access web-servers from SWI-Prolog and turn Prolog into a web-server.
This work has been carried out under the following projects: GARP, MIA (dead link), IBROW (dead link), KITS (dead link) and MultiMediaN (dead link).
The following people have pioneered parts of this library and contributed with bug reports and suggestions for improvements: Anjo Anjewierden, Bert Bredeweg, Wouter Jansweijer, Bob Wielinga, Jacco van Ossenbruggen, Michiel Hildebrandt, Matt Lilley and Keri Harris.
Path wildcards (see http_handler/3) have been modelled after the "arouter" add-on pack by Raivo Laanemets. Request rewriting has been added after discussion with Raivo Laanemets and Anne Ogborn on the SWI-Prolog mailinglist.
This package provides two client libraries for accessing HTTP servers:

library(http/http_open)
A lightweight library to open an HTTP URL as a Prolog stream, e.g.:

    setup_call_cleanup(
        http_open(URL, In, []),
        process(In),
        close(In)).

library(http/http_client)
A more high-level library that converts the reply into a suitable Prolog term based on the Content-Type header of the reply. This library supports a plugin infrastructure that can register hooks for converting additional document types.

This library defines http_open/3, which opens a URL as a Prolog stream. The functionality of the library can be extended by loading additional modules that act as plugins:
library(http/http_ssl_plugin)
Adds support for HTTPS. This plugin is lazily loaded if an https URL is requested using a default SSL context. See the plugin for additional information regarding security.

library(zlib)
Adds support for the gzip transfer encoding. This plugin is lazily loaded if a connection is opened that claims this transfer encoding.

library(http/http_chunked)
Adds support for the Transfer-encoding: chunked header.

Here is a simple example to fetch a web-page:
    ?- http_open('http://www.google.com/search?q=prolog', In, []),
       copy_stream_data(In, user_output),
       close(In).
    <!doctype html><head><title>prolog - Google Search</title><script> ...
The example below fetches the modification time of a web-page. Note that Modified is '' (the empty atom) if the web-server does not provide a time-stamp for the resource. See also parse_time/2.

    modified(URL, Stamp) :-
        http_open(URL, In,
                  [ method(head),
                    header(last_modified, Modified)
                  ]),
        close(In),
        Modified \== '',
        parse_time(Modified, Stamp).
The next example uses Google search. It exploits library(uri) to manage URIs, library(sgml) to load an HTML document and library(xpath) to navigate the parsed HTML. Note that you may need to adjust the XPath queries if the data returned by Google changes (this example indeed no longer works and currently fails at the first xpath/3 call).
    :- use_module(library(http/http_open)).
    :- use_module(library(xpath)).
    :- use_module(library(sgml)).
    :- use_module(library(uri)).

    google(For, Title, HREF) :-
        uri_encoded(query_value, For, Encoded),
        atom_concat('http://www.google.com/search?q=', Encoded, URL),
        http_open(URL, In, []),
        call_cleanup(
            load_html(In, DOM, []),
            close(In)),
        xpath(DOM, //h3(@class=r), Result),
        xpath(Result, //a(@href=HREF0, text), Title),
        uri_components(HREF0, Components),
        uri_data(search, Components, Query),
        uri_query_components(Query, Parts),
        memberchk(q=HREF, Parts).
An example query is below:
    ?- google(prolog, Title, HREF).
    Title = 'SWI-Prolog',
    HREF = 'http://www.swi-prolog.org/' ;
    Title = 'Prolog - Wikipedia',
    HREF = 'https://nl.wikipedia.org/wiki/Prolog' ;
    Title = 'Prolog - Wikipedia, the free encyclopedia',
    HREF = 'https://en.wikipedia.org/wiki/Prolog' ;
    Title = 'Pro-Log is logistiek dienstverlener m.b.t. vervoer over water.',
    HREF = 'http://www.pro-log.nl/' ;
    Title = 'Learn Prolog Now!',
    HREF = 'http://www.learnprolognow.org/' ;
    Title = 'Free Online Version - Learn Prolog ...
Options processed by http_open/3 include:

authenticate(+Bool)
If false (default true), do not try to automatically authenticate the client if a 401 (Unauthorized) status code is received.

authorization(+Term)
Send authorization, e.g., basic(User, Password). Digest authentication is only supported if library(http/http_digest) is also loaded.

unix_socket(+Path)
Connect to the given Unix domain socket, comparable to curl(1)'s option `--unix-socket`.

connection(+Connection)
Specify the Connection header. Default is close. The alternative is Keep-alive. This maintains a pool of available connections as determined by keep_connection/1. The library(http/websocket) uses Keep-alive, Upgrade. Keep-alive connections can be closed explicitly using http_close_keep_alive/1. Keep-alive connections may significantly improve repetitive requests on the same server, especially if the IP route is long, HTTPS is used or the connection uses a proxy.
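For example, a minimal sketch that reuses one connection for two requests to the same server (the URL is a placeholder):

```prolog
:- use_module(library(http/http_open)).

fetch_twice(URL) :-
    http_open(URL, In1, [connection('Keep-alive')]),
    close(In1),                  % connection returns to the pool
    http_open(URL, In2, [connection('Keep-alive')]),
    close(In2),
    http_close_keep_alive(_).    % release all pooled connections
```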
headers(-List)
If provided, List is unified with a list of Name(Value) terms describing the reply header, as obtained with the header(Name,Value) option. A pseudo header status_code(Code) is added to provide the HTTP status as an integer. See also raw_headers(-List), which provides the entire HTTP reply header in unparsed representation.

method(+Method)
One of get (default), head, delete, post, put or patch. The head message can be used in combination with the header(Name, Value) option to access information on the resource without actually fetching the resource itself. The returned stream must be closed immediately. If post(Data) is provided, the default is post.
size(-Size)
Size is unified with the integer value of Content-Length in the reply header.

version(-Version)
Version is a pair Major-Minor, where Major and Minor are integers representing the HTTP version in the reply header.

range(+Range)
Ask for partial content. Range is a term Unit(From, To), where From is an integer and To is either an integer or the atom end. HTTP 1.1 only supports Unit = bytes. E.g., to ask for bytes 1000-1999, use the option range(bytes(1000,1999)).

raw_encoding(+Encoding)
Do not decode the indicated content encoding. For example, with raw_encoding('application/gzip') the system will not decompress the stream if it is compressed using gzip. See also headers(-List).
redirect(+Boolean)
If false (default true), do not automatically redirect if a 3XX code is received. Must be combined with status_code(Code) and one of the header options to read the redirect reply. In particular, without status_code(Code) a redirect is mapped to an exception.

timeout(+Timeout)
Raise an exception if no data arrives within Timeout seconds (default is infinite).

post(+Data)
Issue a POST request on the HTTP server. Data is handed to http_post_data/3.

proxy(+Host, +Port)
Use an HTTP proxy to connect to the server. See also the deprecated syntax proxy(+Host:Port).

proxy_authorization(+Authorization)
Send authorization to the proxy. Uses the same format as the authorization option.

bypass_proxy(+Boolean)
If true, bypass proxy hooks. Default is false.

max_redirect(+Max)
Maximum number of redirects followed. Max is a non-negative integer or the atom infinite. The default value is 10.

user_agent(+Agent)
Defines the value of the User-Agent field of the HTTP header. Default is SWI-Prolog.
The hook http:open_options/2 can be used to provide default options based on the broken-down URL. The option status_code(-Code) is particularly useful to query REST interfaces that commonly return status codes other than 200 that need to be processed by the client code.
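For example, a sketch of a REST-style check that inspects the status code rather than relying on exceptions (predicate name and URL are illustrative):

```prolog
:- use_module(library(http/http_open)).

resource_status(URL, Code) :-
    http_open(URL, In,
              [ method(head),
                status_code(Code)   % unify Code instead of throwing on non-2xx
              ]),
    close(In).
```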
URL | is either an atom or string (url) or a list of parts. Only fields that are part of the URL need to be provided, for example:

    http_open([ host('www.example.com'),
                path('/my/path'),
                search([ q='Hello world',
                         lang=en
                       ])
              ], In, [])
|
error(existence_error(url, Id), Context) is raised if the HTTP result code is not in the range 200..299. Context has the shape context(Message, status(Code, TextCode)), where Code is the numeric HTTP code and TextCode is the textual description thereof provided by the server. Message may provide additional details or may be unbound.

https URLs are only supported if library(http/http_ssl_plugin) is loaded.

http_open:map_method(+MethodID, -Method)
Support additional METHOD keywords. Default are the official HTTP methods as defined by the various RFCs.

http:disable_encoding_filter(+ContentType)
Do not use the Content-encoding as Transfer-encoding encoding for specific values of ContentType. This predicate is multifile and can thus be extended by the user.

http_set_authorization(+URL, +Authorization)
Set user/password to supply with URLs that have URL as prefix. If Authorization is the atom -, possibly defined authorization is cleared. For example:

    ?- http_set_authorization('http://www.example.com/private/',
                              basic('John', 'Secret'))
Hook implementation that makes open_any/5 from library(iostream) support http and https URLs for Mode == read.

http_close_keep_alive(+Address)
Close known keep-alive connections. The call http_close_keep_alive(_) closes all currently known keep-alive connections. The hook http:open_options/2 can be used, e.g., to route requests through a proxy for all hosts but localhost:

    :- multifile http:open_options/2.

    http:open_options(Parts, Options) :-
        option(host(Host), Parts),
        Host \== localhost,
        Options = [proxy('proxy.local', 3128)].
This hook may return multiple solutions. The returned options are combined using merge_options/3 where earlier solutions overrule later solutions.
http:write_cookies(+Out, +Parts, +Options)
Emit a Cookie: header for the current connection. Out is an open stream to the HTTP server, Parts is the broken-down request (see uri_components/2) and Options is the list of options passed to http_open. The predicate is called as if using ignore/1. library(http/http_cookie) implements cookie handling on top of these hooks.

http:update_cookies(+CookieData, +Parts, +Options)
Update the cookie database. CookieData is the value of the Set-Cookie field, Parts is the broken-down request (see uri_components/2) and Options is the list of options passed to http_open. library(http/http_cookie) implements cookie handling on top of these hooks.
This library provides the four basic HTTP client actions: GET, DELETE, POST and PUT. In addition, it provides http_read_data/3, which is used by library(http/http_parameters) to decode POST data in server applications.
This library is based on http_open/3, which opens a URL as a Prolog stream. The reply is processed by http_read_data/3. The following content-types are supported. Options passed to http_get/3 and friends are passed to http_read_data/3, which in turn passes them to the conversion predicates. Support for additional content types can be added by extending the multifile predicate http_client:http_convert_data/4.
application/x-www-form-urlencoded
Converted into a list of Name=Value terms.

multipart/form-data
Converted into a list of Name=Value terms, provided library(http/http_multipart_plugin) is loaded. This format should be used to handle web forms that upload a file.

text/html | text/xml
Converted into a parsed document, provided library(http/http_sgml_plugin) is loaded. See load_html/3 and load_xml/3 for details. The output is often processed using xpath/3.

application/json | application/jsonrequest
Converted into a JSON term, provided library(http/http_json) is loaded. The option json_object(As) can be used to return a term json(Attributes) (As is term) or a dict (As is dict).

http_get(+URL, -Data, +Options)
Get data from an HTTP server and convert it to a suitable Prolog representation based on the Content-Type header and plugins. This predicate is the common implementation of the HTTP client operations. The predicates http_delete/3, http_post/4 and http_put/4 call this predicate with an appropriate method(+Method) option and, for http_post/4 and http_put/4, a post(+Data) option.
Options are passed to http_open/3 and http_read_data/3. Other options:

reply_header(-Fields)
Fields is unified with the reply header represented as headers(Fields) from http_open/3. Provided for backward compatibility. Note that http_version(Major-Minor) is missing in the new version.

http_delete(+URL, -Data, +Options)
Execute a DELETE method on the server. Arguments are the same as for http_get/3. Typically one should pass the option status_code(-Code) to assess and evaluate the returned status code. Without it, codes other than 200 are interpreted as an error.
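For example, a sketch deleting a resource while observing the status code (predicate name and URL are placeholders):

```prolog
:- use_module(library(http/http_client)).

delete_resource(URL, Code) :-
    http_delete(URL, _Reply,
                [ status_code(Code)   % e.g., 200, 204 or 404
                ]).
```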
http_post(+URL, +Data, -Reply, +Options)
Issue a POST request. Data is posted using http_post_data/3. The HTTP server reply is returned in Reply, using the same rules as for http_get/3.

http_put(+URL, +Data, -Reply, +Options)
Issue a PUT request. Arguments are the same as for http_post/4.

http_patch(+URL, +Data, -Reply, +Options)
Issue a PATCH request. Arguments are the same as for http_post/4.
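For example, a sketch posting a JSON document, assuming library(http/http_json) is loaded so that json(Term) is accepted by http_post_data/3 (URL and data are placeholders):

```prolog
:- use_module(library(http/http_client)).
:- use_module(library(http/http_json)).

post_event(URL, Reply) :-
    http_post(URL,
              json(json([name=bob, age=42])),  % illustrative payload
              Reply,
              []).
```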
http_read_data(+Request, -Data, +Options)
Read data from an HTTP connection and convert it according to the to(Format) option or based on the Content-type in the Request. The following options are supported:

to(stream(+WriteStream))
Append the content of the message to Stream.

Further options are implemented by library(http/http_multipart_plugin) and apply to processing multipart/form-data content. Without plugins, this predicate handles application/x-www-form-urlencoded content as a list of Name=Value terms.

Request | is a parsed HTTP request as returned by http_read_request/2 or available from the HTTP server's request dispatcher. Request must contain a term input(In) that provides the input stream from the HTTP server. |
Data conversion is hooked, with plugins providing support for JSON (library(http/http_json)) and HTML/XML (library(http/http_sgml_plugin)).

http_disconnect(+Connections)
Close down persistent connections. Currently Connections must be the atom all, closing all connections. See also the keep-alive support in library(http/http_open).

http:post_data_hook(+Data, +Out, +HdrExtra)
Hook to extend the support for the post(Data) option of http_open/3. The default implementation supports prolog(Term), sending a Prolog term as application/x-prolog.
The HTTP server infrastructure consists of a number of small modular libraries that are combined into library(http/http_server).
These modules are:
library(http/thread_httpd)
The core multi-threaded HTTP server.

library(http/http_dyn_workers)
Dynamically adds and removes worker threads based on the server load.

library(http/http_wrapper)
Bridges the raw HTTP protocol and the Prolog goal that handles a request.

library(http/http_dispatch)
Dispatches requests to handlers registered for a path (see http_handler/3).

library(http/http_parameters)
Retrieves and type-checks parameters from GET and POST requests.

library(http/html_write)
Generates HTML by writing a Prolog term representation to current_output, as an alternative to XML-based templates (PWP).

library(http/http_json)
Reads and writes JSON documents.
Most server implementations simply load the library(http/http_server)
library, which loads the above modules and reexports all predicates
except for those used for internal communication and older deprecated
predicates. Specific use cases may load a subset of the individual
libraries and may decide to replace one or more of them.
A typical skeleton for building a server is given below. If this file is loaded as main file (using e.g., swipl server.pl) it creates a simple server that listens on port 8080. If the root is accessed it redirects to the home page and shows Hello world!.
    :- use_module(library(http/http_server)).

    :- initialization
        http_server([port(8080)]).

    :- http_handler(root(.),
                    http_redirect(moved, location_by_id(home_page)),
                    []).
    :- http_handler(root(home), home_page, []).

    home_page(_Request) :-
        reply_html_page(
            title('Demo server'),
            [ h1('Hello world!')
            ]).
The handler (e.g., home_page/1 above) is called with the parsed request (see section 3.13) as argument and current_output set to a temporary buffer. Its task is closely related to the task of a CGI script; it must write a header declaring at least the Content-type field and a body. Below is a simple body writing the request as an HTML table. Note that writing an HTML reply this way is deprecated. In fact, the code is subject to injection attacks as the HTTP request field values are literally injected in the output while HTML reserved characters should be properly escaped.
    reply(Request) :-
        format('Content-type: text/html~n~n', []),
        format('<html>~n', []),
        format('<table border=1>~n'),
        print_request(Request),
        format('~n</table>~n'),
        format('</html>~n', []).

    print_request([]).
    print_request([H|T]) :-
        H =.. [Name, Value],
        format('<tr><td>~w<td>~w~n', [Name, Value]),
        print_request(T).
The infrastructure recognises the header fields described below. Other header lines are passed verbatim to the client. Typical examples are Set-Cookie and authentication headers (see section 3.7).

Content-type
Sets the encoding of the reply. If the type is text/* or the type matches with UTF-8 (case insensitive), the server uses UTF-8 encoding. The user may force UTF-8 encoding for arbitrary content types by adding ; charset=UTF-8 to the end of the Content-type header.

Transfer-encoding
Causes the reply to be sent using chunked transfer encoding; see also the chunked option in http_handler/3.

Location
Used together with a redirection Status header to force a redirect response to the given URL. The message body must be empty. Handling this header is primarily intended for compatibility with the CGI conventions. Prolog code should use http_redirect/3.

Status
Used together with Location, where Status must be one of 301 (moved), 302 (moved temporary, default) or 303 (see other). Using the status field also allows for formulating replies such as 201 (created).
Note that the handler may send any type of document instead of HTML. After the header has been written, the encoding of the current_output stream is established as follows:

If the content type is text/* the stream is switched to UTF-8 encoding. If the content type does not provide attributes, ; charset=UTF-8 is added.

If the content type matches UTF-8 the stream is switched to UTF-8 encoding.

Remaining types are handled by a multifile hook in http_header. The current list deals with JSON, Turtle and SPARQL.
Besides returning a page by writing it to the current output stream, the server goal can raise an exception using throw/1 to generate special pages such as not_found, moved, etc. The defined exceptions are:

http_reply(+Reply, +HdrExtra)
Equivalent to http_reply(Reply, HdrExtra, []).

http_reply(+Reply)
Equivalent to http_reply(Reply, [], []).

http(not_modified)
Equivalent to http_reply(not_modified, []). This exception is for backward compatibility and can be used by the server to indicate the referenced resource has not been modified since it was requested last time.
In addition, the normal "200 OK" reply status may be overruled by writing a CGI Status header prior to the remainder of the message. This is particularly useful for defining REST APIs. The following handler replies with a "201 Created" header:
    handle_request(Request) :-
        process_data(Request, Id),          % application predicate
        format('Status: 201~n'),
        format('Content-type: text/plain~n~n'),
        format('Created object as ~q~n', [Id]).
Most code doesn't need to use this library directly; instead use library(http/http_server), which combines this library with the typical HTTP libraries that most servers need.
This module can be placed between http_wrapper.pl
and
the application code to associate HTTP locations to predicates
that serve the pages. In addition, it associates parameters with
locations that deal with timeout handling and user authentication. The
typical setup is:
    server(Port, Options) :-
        http_server(http_dispatch,
                    [ port(Port)
                    | Options
                    ]).

    :- http_handler('/index.html', write_index, []).

    write_index(Request) :-
        ...
Path is either an absolute path such as '/home.html' or a term Alias(Relative), where Alias is associated with a concrete path using http:location/3 and resolved using http_absolute_location/3. Relative can be a single atom or a term Segment1/Segment2/..., where each element is either an atom or a variable. If a segment is a variable it matches any segment and the binding may be passed to the closure. If the last segment is a variable it may match multiple segments. This allows registering REST paths, for example:
    :- http_handler(root(user/User), user(Method, User),
                    [ method(Method),
                      methods([get,post,put])
                    ]).

    user(get, User, Request) :-
        ...
    user(post, User, Request) :-
        ...
If an HTTP request arrives at the server that matches Path, Closure is called as below, where Request is the parsed HTTP request.
call(Closure, Request)
Options is a list containing the following options:

authentication(+Type)
Demand authentication. Authentication methods are pluggable. The library http_authenticate.pl provides a plugin for user/password based Basic HTTP authentication.

chunked
Use Transfer-encoding: chunked if the client allows for it.

hide_children(+Bool)
If true on a prefix-handler (see prefix), possible children are masked. This can be used to (temporarily) overrule part of the tree.

methods(+ListOfMethods)
List of allowed HTTP methods. The option method(Method) is equivalent to methods([Method]). Using method(*) allows for all methods.

prefix
Call the handler on any location that is a specialisation of Path, for example:

    :- http_handler(/, http_404([index('index.html')]),
                    [spawn(my_pool),prefix]).

time_limit(+Spec)
One of infinite, default or a positive number (seconds). If default, the value from the setting http:time_limit is taken. The default of this setting is 300 (5 minutes). See setting/2.

Note that http_handler/3 is normally invoked as a directive and processed using term-expansion. Using term-expansion ensures proper update through make/0 when the specification is modified.
existence_error(http_location, Location)
permission_error(http_method, Method, Location)
Dispatch a request based on the path member of Request. If multiple handlers match due to the prefix option or variables in path segments (see http_handler/3), the longest specification is used. If multiple specifications of equal length match, the one with the highest priority is used. The selected handler must support the method member of the Request or a permission_error(http_method, Method, Location) is thrown.
The dispatcher supports http_reply(Term, ExtraHeader, Context) exceptions and passes method(Method) as one of the options. Request rewrite goals are called as

    call(Goal, Request0, Request, Options)
If multiple goals are registered they expand the request in a pipeline starting with the expansion hook with the lowest rank.
Besides rewriting the request, for example by validating the user identity based on HTTP authentication or cookies and adding this to the request, the hook may raise HTTP exceptions to indicate a bad request, permission error, etc. See http_status_reply/4.
Initially, auth_expansion/3 is registered with rank 100 to deal with the older http:authenticate/3 hook.
If the term id(ID) appears in the option list of the handler, ID is used and takes preference over using the predicate. Otherwise the handler predicate, possibly qualified as Module:Pred, is used as identifier.

If the handler is declared with a pattern, e.g., root(user/User), the location to access a particular user may be accessed using e.g., user('Bob'). The number of arguments to the compound term must match the number of variables in the path pattern. A plain atom ID can be used to find a handler with a pattern. The returned location is the path up to the first variable, e.g., /user/ in the example above.
User code is advised to use http_link_to_id/3, which can also add query parameters to the URL. This predicate is a helper for http_link_to_id/3.
Errors: existence_error(http_handler_id, Id).

The library(http/html_write) construct location_by_id(ID), or its abbreviation #(ID), can be used to refer to a handler by its identifier. The actual location (e.g., root(user_details)) is irrelevant in this equation and HTTP locations can thus be moved freely without breaking this code fragment.
    :- http_handler(root(user_details), user_details, []).

    user_details(Request) :-
        http_parameters(Request,
                        [ user_id(ID)
                        ]),
        ...

    user_link(ID) -->
        { user_name(ID, Name),
          http_link_to_id(user_details, [id(ID)], HREF)
        },
        html(a([class(user), href(HREF)], Name)).
HandleID | is either an atom, possibly module qualified predicate or a compound term if the handler is defined using a pattern. See http_handler/3 and http_location_by_id/2. |
Parameters | is one of
|
Options:

cache(+Boolean)
If true (default), handle If-modified-since and send modification time.

static_gzip(+Boolean)
If true (default false) and, in addition to the plain file, there is a .gz file that is not older than the plain file and the client accepts gzip encoding, send the compressed file with Transfer-encoding: gzip.

cached_gzip(+Boolean)
If true (default false) the system maintains cached gzipped files in a directory accessible using the file search path http_gzip_cache and serves these similar to the static_gzip(true) option. If the gzip file does not exist or is older than the input the file is recreated.

unsafe(+Boolean)
If false (default), validate that FileSpec does not contain references to parent directories. E.g., specifications such as www('../../etc/passwd') are not allowed.
If caching is not disabled, it processes the request headers If-modified-since and Range.

throws
http_reply(not_modified)
http_reply(file(MimeType, Path))

If FileSpec is a term alias(Sub), then Sub cannot have references to parent directories.

Errors: permission_error(read, file, FileSpec)
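For example, a minimal sketch of a handler serving a single static file (the location, file search path and file name are placeholders):

```prolog
:- use_module(library(http/http_dispatch)).

:- http_handler(root(logo), serve_logo, []).

serve_logo(Request) :-
    % icons(...) is an illustrative file search path alias
    http_reply_file(icons('logo.png'), [], Request).
```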
    :- http_handler(root(.),
                    http_redirect(moved, myapp('index.html')),
                    []).

How | is one of moved, moved_temporary or see_other. |
To | is an atom, an aliased path as defined by http_absolute_location/3, or a term location_by_id(Id) or its abbreviations #(Id) or #(Id)+Parameters. If To is not absolute, it is resolved relative to the current location. |
http_reply(not_found(Path))
Send a "HTTP 101 Switching Protocols" reply. After sending the reply, the HTTP library calls call(Goal, InStream, OutStream), where InStream and OutStream are the raw streams to the HTTP client. This allows the communication to continue using an alternative protocol.

If Goal fails or throws an exception, the streams are closed by the server. Otherwise Goal is responsible for closing the streams. Note that Goal runs in the HTTP handler thread. Typically, the handler should be registered using the spawn option of http_handler/3 or Goal must call thread_create/3 to allow the HTTP worker to return to the worker pool.
The streams use binary (octet) encoding and have their I/O timeout set to the server timeout (default 60 seconds). The predicate set_stream/2 can be used to change the encoding, change or cancel the timeout.
This predicate interacts with the server library by throwing an exception.
The following options are supported:

headers(+Headers)
Additional header fields added to the "101 Switching Protocols" reply.
This module provides a simple API to generate an index for a physical directory. The index can be customised by overruling the dirindex.css CSS file and by defining additional rules for icons using the hook http:file_extension_icon/2.
The calling conventions allows for direct calling from http_handler/3.
The following options are supported:

order_by(+Property)
Sort the files in the directory listing by Property, which is one of name (default), size or time.

order(+AscentDescent)
Sorting order. Default is ascending. The alternative is descending.

Icons are located using absolute_file_name(icons(IconName), Path, []).
Although the SWI-Prolog Web Server is intended to serve documents that are computed dynamically, serving plain files is sometimes necessary. This small module combines the functionality of http_reply_file/3 and http_reply_dirindex/3 to act as a simple web-server. Such a server can be created using the following code sample, which starts a server at port 8080 that serves files from the current directory ('.'). Note that the handler needs a prefix option to specify that it must handle all paths that begin with the registered location of the handler.
    :- use_module(library(http/http_server)).
    :- use_module(library(http/http_files)).

    :- http_handler(root(.), http_reply_from_files('.', []), [prefix]).
    :- initialization(http_server([port(8080)]), main).
If the requested path refers to a directory, the option indexes is used to locate an index file (see below); otherwise http_reply_dirindex/3 creates a listing of the directory.

Options:

indexes(+List)
List of file names tried as index for a directory. Default is ['index.html'].
Note that this handler must be tagged as a prefix handler (see http_handler/3 and module introduction). This also implies that it is possible to override more specific locations in the hierarchy using http_handler/3 with a longer path-specifier.

When using http_handler/3 to bind this predicate to an HTTP location, make sure it is bound to a location that ends in a /. When using http:location/3 to define symbolic names to HTTP locations this is written as

    :- http_handler(aliasname(.), http_reply_from_files(srcdir, []), [prefix]).

Dir | is either a directory or a path-specification as used by absolute_file_name/3. This option provides great flexibility in (re-)locating the physical files and allows merging the files of multiple physical locations into one web-hierarchy by using multiple user:file_search_path/2 clauses that define the same alias. |
The Content-type header is derived from the file name.
This library defines session management based on HTTP cookies.
Session management is enabled simply by loading this module. Details can
be modified using http_set_session_options/1.
By default, this module creates a session whenever a request is processed that is inside the hierarchy defined for session handling (see the path option in http_set_session_options/1). Automatic creation of a session can be stopped using the option create(noauto). The predicate http_open_session/2 must be used to create a session if noauto is enabled. Sessions can be closed using http_close_session/1.
If a session is active, http_in_session/1 returns the current session and http_session_assert/1 and friends maintain data about the session. If the session is reclaimed, all associated data is reclaimed too.
Begin and end of sessions can be monitored using library(broadcast). The broadcasted messages are:

http_session(begin(SessionId, Peer))
Broadcasted if a session is started.

http_session(end(SessionId, Peer))
Broadcasted if a session is ended. See http_close_session/1.
For example, the following calls end_session(SessionId) whenever a session terminates. Please note that session ends are not scheduled to happen at the actual timeout moment of the session. Instead, creating a new session scans the active list for timed-out sessions. This may change in future versions of this library.
    :- listen(http_session(end(SessionId, Peer)),
              end_session(SessionId)).
timeout(+Seconds)
Session timeout in seconds. The value 0 (zero) disables timeout.

cookie(+Cookiename)
Name used for the session cookie. Default is swipl_session.

path(+Path)
Path to which the cookie is associated. Default is /. Cookies are only sent if the HTTP request path is a refinement of Path.

create(+Atom)
Defines when a session is created. This is one of auto (default), which creates a session if there is a request whose path matches the defined session path, or noauto, in which case sessions are only created by calling http_open_session/2 explicitly.

gc(+When)
When is one of active, which starts a thread that performs session cleanup at close to the moment of the timeout, or passive, which runs session GC when a new session is created.

samesite(+Restriction)
One of none, lax (default), or strict. The SameSite attribute prevents the CSRF vulnerability. strict has the best security, but prevents links from external sites from operating properly. lax stops most CSRF attacks against REST endpoints but rarely interferes with legitimate operations. none removes the samesite attribute entirely. Caution: the value none exposes the entire site to CSRF attacks.

In addition, extension libraries can define session_option/2 to make this predicate support more options. In particular, library(http/http_redis_plugin) defines the following additional options:
redis_db(+DB)
Name of the redis database holding the session data.

redis_prefix(+Atom)
Prefix used for the redis keys. Default is 'swipl:http:session'.

Some settings may be overruled on a per-session basis using http_session_set(Setting). Currently the only setting that can be changed this way is timeout.

Errors: permission_error(set, http_session, Setting) if setting a setting that is not supported on a per-session basis.

SessionId | is an atom. |
Errors: existence_error(http_session, _)

The session identifier is obtained from the session(ID) term of the current HTTP request (see http_current_request/1). The value is cached in a backtrackable global variable http_session_id. Using a backtrackable global variable is safe because continuous worker threads use a failure driven loop and spawned threads start without any global variables. This variable can be set from the commandline to fake running a goal from the commandline in the context of a session.
This predicate must be used to create a session if the create option is set to noauto. Options:

renew(+Bool)
If true (default false) and the current request is part of a session, generate a new session-id. By default, this predicate returns the current session as obtained with http_in_session/1.

Throws permission_error(open, http_session, CGI) if this call is used after closing the CGI header. See http_set_session_options/1 for the create option.

Errors: existence_error(http_session, _)
Closing a session broadcasts the message

    http_session(end(SessionId, Peer))

The broadcast is done before the session data is destroyed and the listen-handlers are executed in context of the session that is being closed. Here is an example that destroys a Prolog thread that is associated to a session:
    :- listen(http_session(end(SessionId, _Peer)),
              kill_session_thread(SessionId)).

    kill_session_thread(SessionId) :-
        http_session_data(thread(ThreadID)),
        thread_signal(ThreadID, throw(session_closed)).
Succeed without any effect if SessionID does not refer to an active session.
If http_close_session/1 is called from a handler operating in the current session and the CGI stream is still in state header, this predicate emits a Set-Cookie header to expire the cookie.
type_error(atom, SessionID)
See also library(http/http_redis_plugin), which stores all session data in a redis database.
This small module allows for enabling Cross-Origin Resource Sharing (CORS) for a specific request. Typically, CORS is enabled for API services that you want to have useable from browser client code that is loaded from another domain. An example are the LOD and SPARQL services in ClioPatria.
Because CORS is a security risk (see references), it is disabled by default. It is enabled through the setting http:cors. The value of this setting is a list of domains that are allowed to access the service. Because * is used as a wildcard match, the value [*] allows access from anywhere.
Services for which CORS is relevant must call cors_enable/0 as part of the HTTP response, as shown below. Note that cors_enable/0 is a no-op if the setting http:cors is set to the empty list ([]).
    my_handler(Request) :-
        ....,
        cors_enable,
        reply_json(Response, []).
If a site uses a Preflight OPTIONS request to find the server's capabilities and access politics, cors_enable/2 can be used to formulate an appropriate reply. For example:
    my_handler(Request) :-
        option(method(options), Request),
        !,
        cors_enable(Request,
                    [ methods([get,post,delete])
                    ]),
        format('~n').                % 200 with empty body
cors_enable
Emit the HTTP header Access-Control-Allow-Origin using domains from the setting http:cors. If this setting is [] (default), nothing is written. This predicate is typically used for replying to API HTTP-requests (e.g., replies to an AJAX request that typically serve JSON or XML).

cors_enable(+Request, +Options)
Reply to a Preflight OPTIONS request. Request is the HTTP request. Options provides the allowed methods and headers; the default for the allowed methods is GET, only allowing for read requests. Both methods and headers may use Prolog friendly syntax, e.g., get for a method and content_type for a header.
This module provides the basics to validate an HTTP Authorization header. User and password information are read from a Unix/Apache compatible password file.
This library provides, in addition to the HTTP authentication, predicates to read and write password files.
Validate that Request carries valid Basic authentication and verify the password from PasswordFile. PasswordFile is a file holding usernames and passwords in a format compatible to Unix and Apache. Each line is a record with : separated fields. The first field is the username and the second the password hash. Password hashes are validated using crypt/2.

Successful authorization is cached for 60 seconds to avoid overhead of decoding and lookup of the user and password data.
http_authenticate/3 just validates the header. If authorization is not provided the browser must be challenged, in response to which it normally opens a user-password dialogue. Example code realising this is below. The exception causes the HTTP wrapper code to generate an HTTP 401 reply.
    (   http_authenticate(basic(passwd), Request, Fields)
    ->  true
    ;   throw(http_reply(authorise(basic, Realm)))
    ).
Fields | is a list of fields from the password-file entry. The first element is the user. The hash is skipped. |
Decode the value of an HTTP Authorization header. Data is a term Method(User, Password), where Method is the (downcased) authorization method (typically basic), User is an atom holding the user name and Password is a list of codes holding the password.

Fields | are the fields from the password file File, converted using name/2, which means that numeric values are passed as numbers and other fields as atoms. The password hash is the first element of Fields and is a string. |
passwd(User, Hash, Fields)
passwd(User, Hash, Fields)
Plugin for library(http_dispatch) to perform basic HTTP authentication.
This predicate throws http_reply(authorise(basic, Realm)).
AuthData | must be a term basic(File, Realm) |
Request | is the HTTP request |
Fields | describes the authenticated user with the option user(User) and with the option user_details(Fields) if the password file contains additional fields after the user and password. |
This library implements HTTP Digest Authentication as per RFC2617. Unlike Basic Authentication, digest authentication is based on challenge-response and therefore does not need to send the password over the (insecure) connection. In addition, it provides a count mechanism that ensures that old credentials cannot be reused, which prevents attackers from replaying old credentials with a new request. Digest authentication has the following advantages and disadvantages:
And, of course, the connection itself remains insecure. Digest based authentication is a viable alternative if HTTPS is not a good option and security of the data itself is not an issue.
This library acts as a plugin for library(http/http_dispatch), where the registered handler (http_handler/3) can be given the option below to initiate digest authentication.

authentication(digest(PasswdFile, Realm))

Above, PasswdFile is a file containing lines of the form below, where PasswordHash is computed using http_digest_password_hash/4.
See also library(http/http_authenticate), http_read_passwd_file/2 and http_write_passwd_file/2.
User ":" PasswordHash (":" Extra)*
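Creating such a file from Prolog can be sketched as below, combining http_digest_password_hash/4 with http_write_passwd_file/2. The file name, user, realm and password are examples.

```prolog
% Sketch (file name and credentials are examples): create a digest
% password file with a single user entry.
:- use_module(library(http/http_digest)).
:- use_module(library(http/http_authenticate)).

make_passwd :-
    http_digest_password_hash(bob, 'Secret Area', 'p4ssw0rd', Hash),
    http_write_passwd_file('digest-passwd',
                           [ passwd(bob, Hash, [])
                           ]).
```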
This library also hooks into library(http/http_open)
if
the option
authorization(digest(User, Password))
is given.
Generate the content for a WWW-Authenticate: Digest header field.

Parse the value of a WWW-Authenticate header into a list of Name(Value) terms.
'GET'
Challenge | is a list of Name(Value) terms, normally from http_parse_digest_challenge/2. Must contain realm and nonce. Optionally contains opaque. |
User | is the user we want to authenticate |
Password | is the user's password |
Options | provides additional options |
<user>:<realm>:<password>.

The inexpensive MD5 algorithm makes the hash sensitive to brute-force attacks, while the lack of seeding makes the hashes vulnerable to rainbow-table attacks, although the value of such attacks is somewhat limited because the realm and user are part of the hash.
Plugin for library(http_dispatch) to perform digest HTTP authentication. Note that we keep the authentication details cached to avoid a 'nonce-replay' error in the case that the application tries to verify multiple times.

This predicate throws http_reply(authorise(digest(Digest)))
Digest | is a term digest(File, Realm, Options) |
Request | is the HTTP request |
Fields | describes the authenticated user with the option user(User) and with the option user_details(Fields) if the password file contains additional fields after the user and password. |
This hook is called when http_open/3 is given the option authorization(AuthData); Out is a stream on which to write additional HTTP headers.

On a digest challenge, the library adds a request_header(authorization=Digest) header to Options, causing http_open/3 to retry the request with the additional option.
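On the client side, fetching a digest-protected resource can be sketched as below; the URL and credentials are examples. Loading library(http/http_digest) makes http_open/3 handle the digest challenge when given the authorization(digest(User, Password)) option.

```prolog
% Sketch: fetch a digest-protected URL (URL and credentials are
% examples).
:- use_module(library(http/http_open)).
:- use_module(library(http/http_digest)).

fetch(String) :-
    setup_call_cleanup(
        http_open('http://localhost:8080/secret', In,
                  [ authorization(digest(bob, 'p4ssw0rd'))
                  ]),
        read_string(In, _, String),
        close(In)).
```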
Most code doesn't need to use this directly; instead use
library(http/http_server)
, which combines this library with
the typical HTTP libraries that most servers need.
This module defines hooks into the HTTP framework to dynamically schedule worker threads. Dynamic scheduling relieves us from finding a good value for the size of the HTTP worker pool.
The decision to add a worker follows these rules:
The policy depends on three settings: http:max_workers, http:worker_idle_limit and http:max_load.

The scheduling runs in a separate thread named __http_scheduler, as the hook is called in time-critical code.
It is possible to create arbitrary error pages for responses generated when an http_reply term is thrown. Currently this is only supported for status 401 (authentication required). To do this, instead of throwing http_reply(authorise(Term)), throw http_reply(authorise(Term), [], Key), where Key is an arbitrary term relating to the page you want to generate. You must then also define a clause of the multifile predicate http:status_page_hook/3. The first argument of the hook is either the Term of the http_reply exception or the HTTP status code, i.e., the hook is called twice. New code should use the Term. Context is the third argument of the http_reply exception which was thrown, and CustomHTML is a list of HTML tokens. A page equivalent to the default page for 401 is generated by the example below.
:- multifile http:status_page_hook/3.

http:status_page_hook(authorise(Term), _Context, HTML) :-
    phrase(page([ title('401 Authorization Required')
                ],
                [ h1('Authorization Required'),
                  p(['This server could not verify that you ',
                     'are authorized to access the document ',
                     'requested.  Either you supplied the wrong ',
                     'credentials (e.g., bad password), or your ',
                     'browser doesn\'t understand how to supply ',
                     'the credentials required.'
                    ]),
                  \address
                ]),
           HTML).
This library implements the OpenID protocol (http://openid.net/). OpenID is a protocol to share identities on the network. The protocol itself uses simple basic HTTP, adding reliability using digitally signed messages.
Steps, as seen from the consumer (or relying party):

1. Present a form asking for the openid_identifier.
2. Fetch the page behind the submitted openid_identifier and look up the <link rel="openid.server" href="server"> element.
3. Redirect the browser to the discovered server with a checkid_setup message, asking it to validate the given OpenID.
A consumer (an application that allows OpenID login) typically
uses this library through openid_user/3.
In addition, it must implement the hook http_openid:openid_hook(trusted(OpenId, Server))
to define accepted OpenID servers. Typically, this hook is used to
provide a white-list of acceptable servers. Note that accepting any OpenID server is possible, but anyone on the internet can set up a dummy OpenID server that simply grants and signs every request. Here is an example:
:- multifile http_openid:openid_hook/1.

http_openid:openid_hook(trusted(_, OpenIdServer)) :-
    (   trusted_server(OpenIdServer)
    ->  true
    ;   throw(http_reply(moved_temporary('/openid/trustedservers')))
    ).

trusted_server('http://www.myopenid.com/server').
By default, information about who is logged on is maintained with the session using http_session_assert/1 with the term openid(Identity). The hooks login/logout/logged_in can be used to provide alternative administration of logged-in users (e.g., based on client IP, using cookies, etc.).
To create a server, you must do four things: bind the handlers openid_server/2 and openid_grant/1 to HTTP locations, provide a user page for registered users, and define the grant(Request, Options) hook to verify your users. An example server is provided in <plbase>/doc/packages/examples/demo_openid.pl
handler(Request) :-
    openid_user(Request, OpenID, []),
    ...
If the user is not yet logged on, a sequence of redirects follows, eventually reaching the location verify, which calls openid_verify/2.

Options:
- A list of img structures whose href points to an OpenID 2.0 endpoint. These buttons are displayed below the OpenID URL field. Clicking a button sets the URL field and submits the form. Requires JavaScript support. If the href is relative, clicking it opens the given location after adding 'openid.return_to' and 'stay'.
- If true, show a checkbox that allows the user to stay logged on.

This handler is registered through http_dispatch.pl. Options processed:
- The openid.trust_root attribute. Defaults to the root of the current server (i.e., http://host[:port]/).
- The openid.realm attribute. Default is the trust_root.

The OpenID server will redirect to the openid.return_to URL.
http_reply(moved_temporary(Redirect))
OpenIDLogin | ID as typed by user (canonized) |
OpenID | ID as verified by server |
Server | URL of the OpenID server |
global(true)
.
After openid_verify/2 has redirected the browser to the OpenID server, and the OpenID server did its magic, it redirects the browser back to this address. The work is fairly trivial. If mode is cancel, the OpenID server denied the request. If it is id_res, the OpenID server replied positively, but we must verify what the server told us by checking the HMAC-SHA signature.

This call fails silently if there is no openid.mode field in the request.
Throws openid(cancel) if the request was cancelled by the OpenID server, and openid(signature_mismatch) if the HMAC signature check failed.

If the user answers yes, check the authority (typically the password) and, if all looks good, redirect the browser to ReturnTo, adding the OpenID properties needed by the Relying Party to verify the login.

Same as openid_associate(URL, Handle, Assoc, []).
http://specs.openid.net/auth/2.0
(default) or
http://openid.net/signon/1.1
.
The library library(http/http_parameters)
provides two
predicates to fetch HTTP request parameters as a type-checked list
easily. The library transparently handles both GET and POST requests. It
builds on top of the low-level request representation described in
section 3.13.
If a parameter is missing, the exception error(existence_error(http_parameter, Name), _) is thrown. If the argument cannot be converted to the requested type, an error(existence_error(Type, Value), _) is raised, where the error context indicates the HTTP parameter. If not caught, the server translates both errors into a 400 Bad request HTTP message.
Options fall into three categories: those that handle presence of the parameter, those that guide conversion and restrict types, and those that support automatic generation of documentation. First, the presence options:

list(Type): if this option is present, the options default and optional are ignored and the value is returned as a list. Type-checking options are processed on each value.

The type and conversion options are given below. The type language can be extended by providing clauses for the multifile hook http:convert_parameter/3.
(Type1;Type2): accept the value if either Type1 or Type2 applies, e.g., (nonneg;oneof([infinite])) to specify an integer or a symbolic value.
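As a sketch of extending the type language through the hook, the clause below adds a hypothetical percent type (an integer 0..100); the type name is an example, not part of the library.

```prolog
% Sketch: extend the http_parameters/3 type language with a
% hypothetical percent type accepting integers 0..100.
:- multifile http:convert_parameter/3.

http:convert_parameter(percent, Atom, Percent) :-
    atom_number(Atom, Percent),
    integer(Percent),
    between(0, 100, Percent).
```

After this, a parameter may be declared as, e.g., value(V, [percent]).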
The last set of options is to support automatic generation of HTTP API documentation from the sources. (This facility is under development in ClioPatria; see http_help.pl.)
Below is an example:

reply(Request) :-
    http_parameters(Request,
                    [ title(Title, [ optional(true) ]),
                      name(Name,  [ length >= 2 ]),
                      age(Age,    [ between(0, 150) ])
                    ]),
    ...
Same as http_parameters(Request, Parameters, [])
The option attribute_declarations(Goal) causes http_parameters/3 to use call(Goal, +ParamName, -Options) to find the options. This is intended to share declarations over many calls to http_parameters/3. Using this construct, the above can be written as below.
reply(Request) :-
    http_parameters(Request,
                    [ title(Title),
                      name(Name),
                      age(Age)
                    ],
                    [ attribute_declarations(param)
                    ]),
    ...

param(title, [optional(true)]).
param(name,  [length >= 2]).
param(age,   [integer]).
The body-code (see section 3.1) is
driven by a Request. This request is generated from http_read_request/2
defined in
library(http/http_header)
.
The request is represented as a list of Name(Value) elements. It provides a number of predefined elements for the result of parsing the first line of the request, followed by the additional request parameters. The predefined fields are:
host(Host): if the request contains a Host: Host header, Host is unified with the host name. If Host is of the format <host>:<port>, Host only describes <host> and a field port(Port), where Port is an integer, is added.

method(Method): Method is the lowercase version of the request method (one of delete, get, head, options, patch, post, put or trace). This field is present if the header has been parsed successfully.

peer(Peer): Peer is a term ip(A,B,C,D) containing the IP address of the contacting host.

port(Port): Port is the port number; see host for details.

search(ListOfNameValue): the search specification is the part of the URL after ?, normally used to transfer data from HTML forms that use the HTTP GET method. In the URL it consists of a www-form-encoded list of Name=Value pairs. This is mapped to a list of Prolog Name=Value terms with decoded names and values. This field is only present if the location contains a search specification.
The URL specification does not demand the query part to be of the form name=value. If the field is syntactically incorrect, ListOfNameValue is bound to the empty list ([]).
http_version(Major-Minor): if the first line contains an HTTP/Major.Minor version indicator, this element indicates the HTTP version of the peer. Otherwise this field is not present.

cookie(ListOfNameValue): if the header contains a Cookie line, the value of the cookie is broken down in Name=Value pairs, where the Name is the lowercase version of the cookie name as used for the HTTP fields.

set_cookie(set_cookie(Name, Value, Options)): if the header contains a SetCookie line, the cookie field is broken down into the Name of the cookie, the Value and a list of Name=Value pairs for additional options such as expire, path, domain or secure.
If the first line of the request is tagged with HTTP/Major.Minor, http_read_request/2 reads all input up to the first blank line. This header consists of Name:Value fields. Each such field appears as a term Name(Value) in the Request, where Name is canonicalised for use with Prolog. Canonisation implies that the Name is converted to lower case and all occurrences of - are replaced by _. The value for the Content-length field is translated into an integer.
Here is an example:
?- http_read_request(user_input, X).
|: GET /mydb?class=person HTTP/1.0
|: Host: gollem
|:
X = [ input(user),
      method(get),
      search([ class = person
             ]),
      path('/mydb'),
      http_version(1-0),
      host(gollem)
    ].
Where the HTTP GET
operation is intended to get a
document, using a path and possibly some additional search
information, the POST
operation is intended to hand
potentially large amounts of data to the server for processing.
The Request parameter above contains the term method(post)
.
The data posted is left on the input stream that is available through
the term input(Stream)
from the Request header.
This data can be read using http_read_data/3
from the HTTP client library. Here is a demo implementation simply
returning the parsed posted data as plain text (assuming pp/1
pretty-prints the data).
reply(Request) :-
    member(method(post), Request), !,
    http_read_data(Request, Data, []),
    format('Content-type: text/plain~n~n', []),
    pp(Data).
If the POST is initiated from a browser, content-type is generally
either application/x-www-form-urlencoded
or
multipart/form-data
.
The functionality of the server should be defined in one Prolog file (of course this file is allowed to load other files). Depending on the desired server setup, this 'body' is wrapped into a small Prolog file combining the body with the appropriate server interface. There are three supported server setups. For most applications we advise the multi-threaded server. Examples of this server architecture are the PlDoc documentation system and the SeRQL Semantic Web server infrastructure.
All the server setups may be wrapped in a reverse proxy to make them available from the public web-server as described in section 3.14.7.
library(thread_httpd) for a multi-threaded server. This server is harder to debug due to the involved threading, although the GUI tracer provides reasonable support for multi-threaded applications using the tspy/1 command. It can provide fast communication to multiple clients and can be used for more demanding servers.

library(inetd_httpd) for a server per client. This server is very hard to debug as the server is not connected to the user environment. It provides a robust implementation for servers that can be started quickly.
All the server interfaces provide http_server(:Goal, +Options)
to create the server. The list of options differ, but the servers share
common options:
The library(http/thread_httpd.pl)
provides the
infrastructure to manage multiple clients using a pool of worker-threads.
This realises a popular server design, also seen in Java Tomcat and
Microsoft .NET. As a single persistent server process maintains communication to all clients, startup time is not an important issue and the server can easily maintain state information for all clients.
In addition to the functionality provided by the inetd server, the
threaded server can also be used to realise an HTTPS server exploiting
the library(ssl)
library. See option ssl(+SSLOptions)
below.
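Combined with library(http/http_dispatch), starting such a server can be sketched as below; the port is an example.

```prolog
% Sketch: start a threaded HTTP server on port 8080 that dispatches
% requests through http_dispatch/1.
:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).

server :-
    http_server(http_dispatch, [ port(8080) ]).
```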
Options must provide the port(?Port) option to specify the port the server should listen to. If Port is unbound, an arbitrary free port is selected and Port is unified to this port number. The server consists of a small Prolog thread accepting new connections on Port and dispatching these to a pool of workers. Defined Options are:
If the timeout is infinite, a worker may wait forever on a client that doesn't complete its request. Default is 60 seconds.

Use SSL to realise the https:// protocol. SSL allows for encrypted communication to avoid others from tapping the wire, as well as improved authentication of client and server. The SSLOptions option list is passed to ssl_context/3. The port option of the main option list is forwarded to the SSL layer. See the library(ssl) library for details.
library for details.http
or https
.This can be used to tune the number of workers for performance. Another possible application is to reduce the pool to one worker to facilitate easier debugging.
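Assuming a server running on port 8080 (an example), the worker pool can be resized at runtime with http_workers/2:

```prolog
% Sketch: reduce the worker pool of the server on port 8080 to a
% single worker to simplify debugging.
?- http_workers(8080, 1).
```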
The goal is spawned in the pool given by pool(Pool), or using thread_create/3 if the pool option is not present. If the dispatch module is used (see section 3.2), spawning is normally specified as an option to the http_handler/3 registration.
We recommend the use of thread pools. They allow registration of a set of threads using common characteristics, specify how many can be active, and specify what to do if all threads are active. A typical application may define a small pool of threads with large stacks for computation-intensive tasks, and a large pool of threads with small stacks to serve media. The declaration could be the one below, allowing for at most 3 concurrent solvers with a maximum backlog of 5, and at most 30 tasks creating image thumbnails with a backlog of 100.
:- use_module(library(thread_pool)).

:- thread_pool_create(compute, 3,
                      [ local(20000), global(100000), trail(50000),
                        backlog(5)
                      ]).
:- thread_pool_create(media, 30,
                      [ local(100), global(100), trail(100),
                        backlog(100)
                      ]).

:- http_handler('/solve',     solve,     [spawn(compute)]).
:- http_handler('/thumbnail', thumbnail, [spawn(media)]).
This module provides the logic that is needed to integrate a process into the Unix service (daemon) architecture. It deals with the following aspects, all of which may be used/ignored and configured using commandline options:
port(s) to be used by the server

The typical use scenario is to write a file that loads the following components:
In the code below, ?- [load].
loads the remainder of the
webserver code. This is often a sequence of use_module/1
directives.
:- use_module(library(http/http_unix_daemon)).
:- [load].
The program entry point is http_daemon/0, declared using initialization/2. This may be overruled using a new declaration after loading this library. The new entry point will typically call http_daemon/1 to start the server in a preconfigured way.
:- use_module(library(http/http_unix_daemon)).
:- initialization(run, main).

run :-
    ...
    http_daemon(Options).
Now, the server may be started using the command below. See http_daemon/0 for supported options.
% [sudo] swipl mainfile.pl [option ...]
Below are some examples. Our first example is completely silent,
running on port 80 as user www
.
% swipl mainfile.pl --user=www --pidfile=/var/run/http.pid
Our second example logs HTTP interaction with the syslog daemon for
debugging purposes. Note that the argument to --debug is a Prolog term and must often be escaped to avoid misinterpretation by the Unix shell. The debug option can be repeated to log multiple debug topics.
% swipl mainfile.pl --user=www --pidfile=/var/run/http.pid \ --debug='http(request)' --syslog=http
Broadcasting The library uses broadcast/1 to allow hooking certain events:
- An option --http=Spec or --https=Spec is followed by arguments for that server until the next --http=Spec or --https=Spec or the end of the options.
- If no --http=Spec or --https=Spec appears, one HTTP server is created from the specified parameters.

Examples:

--workers=10 --http --https
--http=8080 --https=8443
--http=localhost:8080 --workers=1 --https=8443 --workers=25
Use --user=User to be able to open ports below 1000. The default port is 80. If --https is used, the default port is 443.

Use --ip=localhost to restrict access to connections from localhost if the server itself is behind an (Apache) proxy server running on the same host.
socket(s)
--pwfile=File
--group=Group: if omitted, the login group of the target user (--user) is used.

--fork[=Bool]: with --no-fork or --fork=false, the process runs in the foreground.

--http[=(Bool|Port|BindTo:Port)]: if the value is true, create a server at the specified or default address. Else use the given port and interface. Thus, --http creates a server at port 80, --http=8080 creates one at port 8080 and --http=localhost:8080 creates one at port 8080 that is only accessible from localhost.

--https[=(Bool|Port|BindTo:Port)]: as --http, but creates an HTTPS server. Use --certfile, --keyfile, --pwfile, --password and --cipherlist to configure SSL for this server.

--pwfile=File: preferred over --password=PW as it allows using file protection to avoid leaking the password. The file is read before the server drops privileges when started with the --user option.

--interactive[=Bool]: if true (default false), implies --no-fork and presents the Prolog toplevel after starting the server.

--sighup=Action: action on kill -HUP <pid>. Default is reload (running make/0). Alternative is quit, stopping the server.

Other options are converted by argv_options/3 and passed to http_server/1; for example, --workers=20 is passed as workers(20).
http_daemon/0 is defined as
below. The start code for a specific server can use this as a starting
point, for example for specifying defaults or additional options. This
uses guided options processing from argv_options/3
from library(main)
. The option definitions are available as http_opt_type/3, http_opt_help/2 and http_opt_meta/2.

http_daemon :-
    current_prolog_flag(argv, Argv),
    argv_options(Argv, _RestArgv, Options),
    http_daemon(Options).
Error handling depends on whether or not interactive(true)
is in effect. If so, the error is printed before entering the toplevel.
In non-interactive mode this predicate calls halt(1)
.
http_server(Handler, Options)
. The default is
provided by start_server/1.
All modern Unix systems handle a large number of the services they run through the super-server inetd or one of its descendants (xinetd, systemd, etc.). Such a program reads a configuration file (for example /etc/inetd.conf) and opens server sockets on all ports defined in this file. As a request comes in, it accepts the connection and starts the associated server such that standard I/O is performed through the socket. This approach has several advantages:
The very small generic script for handling inetd-based connections is in inetd_httpd, defining http_server/1:
Here is the example from demo_inetd:

#!/usr/bin/pl -t main -q -f

:- use_module(demo_body).
:- use_module(inetd_httpd).

main :-
    http_server(reply).
With the above file installed in /home/jan/plhttp/demo_inetd, the following line in /etc/inetd enables the server at port 4001, guarded by tcpwrappers. After modifying inetd, send the daemon the HUP signal to make it reload its configuration.
For more information, please check inetd.conf(5).
4001 stream tcp nowait nobody /usr/sbin/tcpd /home/jan/plhttp/demo_inetd
There are rumours that inetd has been ported to Windows.
To be done.
There are several options for public deployment of a web service. The main decision is whether to run it on a standard port (port 80 for HTTP, port 443 for HTTPS) or a non-standard port such as for example 8000 or 8080. Using a standard port below 1000 requires root access to the machine, and prevents other web services from using the same port. On the other hand, using a non-standard port may cause problems with intermediate proxy- and/or firewall policies that may block the port when you try to access the service from some networks. In both cases, you can either use a physical or a virtual machine running ---for example--- under VMWARE or XEN to host the service. Using a dedicated (physical or virtual) machine to host a service isolates security threats. Isolation can also be achieved using a Unix chroot environment, which is however not a security feature.
To make several different web services reachable on the same (either standard or non-standard) port, you can use a so-called reverse proxy. A reverse proxy uses rules to relay requests to other web services that use their own dedicated ports. This approach has several advantages:
Proxy technology can be combined with isolation methods such as dedicated machines, virtual machines and chroot jails. The proxy can also provide load balancing.
Setting up an Apache reverse proxy
The Apache reverse proxy setup is really simple. Ensure the modules
proxy
and proxy_http
are loaded. Then add two
simple rules to the server configuration. Below is an example that makes
a PlDoc server on port 4000 available from the main Apache server at
port 80.
ProxyPass /pldoc/ http://localhost:4000/pldoc/ ProxyPassReverse /pldoc/ http://localhost:4000/pldoc/
Apache rewrites the HTTP headers passing by, but using the above
rules it does not examine the content. This implies that URLs embedded
in the (HTML) content must use relative addressing. If the locations on
the public and Prolog server are the same (as in the example above) it
is allowed to use absolute locations. I.e. /pldoc/search
is
ok, but http://myhost.com:4000/pldoc/search
is not.
If the locations on the server differ, locations must be relative (i.e., not start with /).
This problem can also be solved using the contributed Apache module
proxy_html
that can be instructed to rewrite URLs embedded
in HTML documents. In our experience, this is not trouble-free, as URLs
can appear in many places in generated documents. JavaScript can create
URLs on the fly, which makes rewriting virtually impossible.
The body is called by the module library(http/http_wrapper.pl)
.
This module realises the communication between the I/O streams and the
body described in section 3.1. The
interface is realised by
http_wrapper/5:
'Keep-alive' if both ends of the connection want to continue the connection, or close if either side wishes to close the connection.
This predicate reads an HTTP request-header from In,
redirects current output to a memory file and then runs call(Goal,
Request)
, watching for exceptions and failure. If Goal
executes successfully it generates a complete reply from the created
output. Otherwise it generates an HTTP server error with additional
context information derived from the exception.
http_wrapper/5 supports the following options:
...,
format('Set-Cookie: ~w=~w; path=~w~n', [Cookie, SessionID, Path]),
...,
If ---for whatever reason--- the conversion is not possible it simply unifies RelPath to AbsPath.
This library finds the public address of the running server. This can
be used to construct URLs that are visible from anywhere on the
internet. This module was introduced to deal with OpenID, where a
request is redirected to the OpenID server, which in turn redirects to
our server (see http_openid.pl
).
The address is established from the settings http:public_host
and
http:public_port
if provided. Otherwise it is deduced from
the request.
If true (default false), try to replace a local hostname by a world-wide accessible name.

This predicate performs the following steps to find the host and port:
1. Use the settings http:public_host and http:public_port.
2. Use the X-Forwarded-Host header, which applies if this server runs behind a proxy.
3. Use the Host header, which applies for HTTP 1.1 if we are contacted directly.
are contacted directly.Request | is the current request. If it is left unbound, and the request is needed, it is obtained with http_current_request/1. |
Simple module for logging HTTP requests to a file. Logging is enabled by loading this file and ensuring the setting http:logfile is not the empty atom. The default file for writing the log is httpd.log. See library(settings) for details.
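Enabling logging to a custom file can be sketched as below; the path is an example.

```prolog
% Sketch (path is an example): load the logging module and point the
% http:logfile setting at a custom file.
:- use_module(library(http/http_log)).
:- use_module(library(settings)).

:- set_setting_default(http:logfile, 'log/httpd.log').
```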
The level of logging can be modified using the multifile predicate
http_log:nolog/1 to hide HTTP request
fields from the logfile and
http_log:password_field/1 to hide
passwords from HTTP search specifications (e.g. /topsecret?password=secret
).
The log file is opened in append mode if the file is not yet open. The log file is determined from the setting http:logfile. If this setting is set to the empty atom (''), this predicate fails.
If a file error is encountered, this is reported using print_message/2, after which this predicate silently fails. Opening is retried every minute when a new message arrives.
Before opening the log file, the message http_log_open(Term)
is broadcasted. This message allows for creating the directory,
renaming, deleting or truncating an existing log file.
Add a line server(Reason, Time) to the logfile. This call is intended for cooperation with the Unix logrotate facility using the following schema:
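As a sketch, a logrotate entry along these lines could be used; the log path, pid file and rotation policy are assumptions, not part of the library.

```
/var/log/httpd.log {
    rotate 5
    weekly
    compress
    missingok
    postrotate
        kill -HUP `cat /var/run/http.pid`
    endscript
}
```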
This hook is called with the value of the Content-Type header. If the hook succeeds, the POST data is not logged. For example, to stop logging anything but application/json messages:
:- multifile http_log:nolog_post_content_type/1.

http_log:nolog_post_content_type(Type) :-
    Type \= (application/json).
Type | is a term MainType/SubType |
If library(http/http_unix_daemon) is used, closing is achieved by sending SIGHUP or SIGUSR1 to the process. See also library(http/http_unix_daemon), which schedules the maintenance actions.

Options:

If true, rotate the log files in the background. This must be used with a timer that broadcasts a maintenance(_,_) message (see broadcast/1). Such a timer is part of library(http/http_unix_daemon).
The library library(http/http_error)
defines a hook that
decorates uncaught exceptions with a stack-trace. This will generate a 500
internal server error document with a stack-trace. To enable this
feature, simply load this library. Please do note that providing error
information to the user simplifies the job of a hacker trying to
compromise your server. It is therefore not recommended to load this
file by default.
The example program calc.pl has the error handler loaded; the handler can be triggered by forcing a divide-by-zero in the calculator.
The library library(http/http_header)
provides
primitives for parsing and composing HTTP headers. Its functionality is
normally hidden by the other parts of the HTTP server and client
libraries.
end_of_file
if FdIn is at the end
of input.html_write.pl
file
, but do not include modification timeHdrExtra | provides additional reply-header
fields, encoded as Name(Value). It can also contain a field
content_length(-Len) to retrieve the value of the
Content-length header that is replied. |
Code | is the numeric HTTP status code sent |
Status can be one of the following:

basic(Realm)
digest(Digest)
authorise(basic(Realm))
. Deprecated.

The document type is announced using format('Content-type: <MIME type>~n'). This hook is called before mime_type_encoding/2. The default defines utf8 for JSON and Turtle-derived application/ MIME types.

If never, even explicit requests are ignored. If on_request, chunked encoding is used if requested through the CGI header and allowed by the client. If if_possible, chunked encoding is used whenever the client allows for it, which is interpreted as the client supporting HTTP 1.1 or higher.
Chunked encoding is more space efficient and allows the client to start processing partial results. The drawback is that errors lead to incomplete pages instead of a nicely formatted complete page.
Used by http_client.pl to send the POST data to the server. Data is one of:
html(+Tokens): result of html//1 from html_write.pl.

json(+Term): posting a JSON query and processing the JSON reply (or any other reply understood by http_read_data/3) is as simple as http_post(URL, json(Term), Reply, []), where Term is a JSON term as described in json.pl and Reply is of the same format if the server replies with JSON, when using the module :- use_module(library(http/http_json)). Note that the module is used in both the HTTP server and HTTP client; see library(http/http_json).

xml(+Term)
Post the result of xml_write/3
using the Mime-type
text/xml
xml(+Type, +Term)
Post the result of xml_write/3
using the given Mime-type and an empty option list to xml_write/3.xml(+Type, +Term, +Options)
Post the result of xml_write/3
using the given Mime-type and option list for xml_write/3.file(+File)
Send contents of a file. Mime-type is
determined by
file_mime_type/2.file(+Type, +File)
Send file with content of indicated
mime-type.memory_file(+Type, +Handle)
Similar to file(+Type, +File)
,
but using a memory file instead of a real file. See new_memory_file/1.codes(+Codes)
As codes(text/plain, Codes)
.codes(+Type, +Codes)
Send Codes using the indicated
MIME-type.bytes(+Type, +Bytes)
Send Bytes using the indicated
MIME-type. Bytes is either a string of character codes 0..255 or list of
integers in the range 0..255. Out-of-bound codes result in a
representation error exception.atom(+Atom)
As atom(text/plain, Atom)
.atom(+Type, +Atom)
Send Atom using the indicated
MIME-type.cgi_stream(+Stream, +Len)
Read the input from Stream
which, like CGI data, starts with a partial HTTP header. The fields of
this header are merged with the provided HdrExtra fields. The
first Len characters of Stream are used.form(+ListOfParameter)
Send data of the MIME type
application/x-www-form-urlencoded as produced by browsers issuing a POST
request from an HTML form. ListOfParameter is a list of Name=Value or
Name(Value).form_data(+ListOfData)
Send data of the MIME type multipart/form-data
as produced by browsers issuing a POST request from an HTML form using
enctype multipart/form-data
. ListOfData is the same as for
the List alternative described below. Below is an example. Repository,
etc. are atoms providing the value, while the last argument provides a
value from a file.
...,
http_post([ protocol(http),
            host(Host),
            port(Port),
            path(ActionPath)
          ],
          form_data([ repository = Repository,
                      dataFormat = DataFormat,
                      baseURI    = BaseURI,
                      verifyData = Verify,
                      data       = file(File)
                    ]),
          _Reply, []),
...,
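The json(Term) data type described above is used similarly. Below is a sketch; the URL and the posted term are illustrative, and library(http/http_json) must be loaded for the conversion:

```prolog
:- use_module(library(http/http_client)).
:- use_module(library(http/http_json)).

% Post a JSON object and receive the (JSON) reply as a Prolog term.
post_user(Reply) :-
    http_post('http://localhost:8080/api/user',
              json(json([name='Bob', age=42])),
              Reply,
              []).
```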
set_cookie(Name, Value, Options)
.
Options is a list consisting of Name=Value or a single atom
(e.g., secure
)bytes(From, To)
, where From is an integer
and To is either an integer or the atom end
.media(Type, TypeParams, Quality, AcceptExts)
. The list is
sorted according to preference.disposition(Name, Attributes)
, where Attributes
is a list of Name=Value pairs.media(Type/SubType, Attributes)
, where
Attributes is a list of Name=Value pairs.As some fields are already parsed in the Request, this predicate is a no-op when called on an already parsed field.
Value | is either an atom, a list of codes or an already parsed header value. |
-->
content_type(text/html)
domain_error(http_request_line, Line)
created(Location)
moved(To)
moved_temporary(To)
see_other(To)
bad_request(ErrorTerm)
authorise(AuthMethod)
forbidden(URL)
not_found(URL)
method_not_allowed(Method,URL)
not_acceptable(Why)
server_error(ErrorTerm)
unavailable(Why)
The hook is tried twice, first using the status term, e.g.,
not_found(URL)
and then with the code, e.g., 404
.
The second call is deprecated and only exists for compatibility.
Context | is the 4th argument of http_status_reply/5,
which is invoked after raising an exception of the format
http_reply(Status, HeaderExtra, Context) . The default
context is [] (the empty list). |
HTMLTokens | is a list of tokens as produced by html//1. It is passed to print_html/2. |
library(http/html_write)
library

Producing output for the web in the form of an HTML document is a requirement for many Prolog programs. Just using format/2 is not satisfactory as it leads to poorly readable programs generating poor HTML. This library is based on using DCG rules.
The library(http/html_write)
structures the generation
of HTML from a program. It is an extensible library, providing a DCG
framework for generating legal HTML under (Prolog) program control. It
is especially useful for the generation of structured pages (e.g. tables)
from Prolog data structures.
The normal way to use this library is through the DCG html//1. This non-terminal provides the central translation from a structured term with embedded calls to additional translation rules to a list of atoms that can then be printed using print_html/[1,2].
//
[]
\
List
\
Term
\
Term but allows for invoking grammar rules in
external packages.
&<Entity>;
or &#<Entity>;
if Entity is an integer. SWI-Prolog atoms and strings are
represented as Unicode. Explicit use of this construct is rarely needed
because code-points that are not supported by the output encoding are
automatically converted into character-entities.
Tag(Content)
Tag(Attributes, Content)
Name(Value)
or
Name=Value. Value is the atomic
attribute value but allows for a limited functional notation:
encode(Atom)
location_by_id(ID)
#
(ID)
location_by_id(ID)
.Name(Value)
. Values are encoded as in the encode option
described above.NAMES
). Each value
in list is separated by a space. This is particularly useful for setting
multiple class
attributes on an element. For example:
... span(class([c1,c2]), ...),
The example below generates a URL that references the predicate
set_lang/1 in
the application with given parameters. The http_handler/3
declaration binds /setlang
to the predicate set_lang/1
for which we provide a very simple implementation. The code between ...
is part of an HTML page showing the English flag which, when pressed,
calls set_lang(Request)
where Request contains
the search parameter lang
= en
. Note that the
HTTP location (path) /setlang
can be moved without
affecting this code.
:- http_handler('/setlang', set_lang, []).

set_lang(Request) :-
    http_parameters(Request,
                    [ lang(Lang, [])
                    ]),
    http_session_retractall(lang(_)),
    http_session_assert(lang(Lang)),
    reply_html_page(title('Switched language'),
                    p(['Switch language to ', Lang])).

...
html(a(href(location_by_id(set_lang) + [lang(en)]),
       img(src('/www/images/flags/en.png')))),
...
//
DOCTYPE
declaration. HeadContent are elements to
be placed in the head
element and BodyContent
are elements to be placed in the body
element.
To achieve common style (background, page header and footer), it is
possible to define DCG non-terminals head//1 and/or body//1.
Non-terminal page//1 checks for the definition of these non-terminals in
the module it is called from as well as in the user
module.
If no definition is found, it creates a head with only the HeadContent
(note that the
title
is obligatory) and a body
with bgcolor
set to white
and the provided BodyContent.
Note that further customisation is easily achieved using html//1 directly as page//2 is (besides handling the hooks) defined as:
page(Head, Body) -->
    html([ \['<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 4.0//EN">\n'],
           html([ head(Head),
                  body(bgcolor(white), Body)
                ])
         ]).
//
DOCTYPE
and the HTML
element. Contents is used to generate both the head and body
of the page.//
html_begin(table) html_begin(table(border(2), align(center)))
This predicate provides an alternative to using the
\
Command syntax in the html//1 specification.
The following two fragments are the same. The preferred solution depends
on your preferences as well as whether the specification is generated or
entered by the programmer.
table(Rows) -->
    html(table([ border(1), align(center), width('80%') ],
               [ \table_header,
                 \table_rows(Rows)
               ])).

% or

table(Rows) -->
    html_begin(table(border(1),
                     align(center),
                     width('80%'))),
    table_header,
    table_rows(Rows),
    html_end(table).
//
The non-terminal html//1 translates a specification into a list of
atoms and layout instructions. Currently the layout instructions are
terms of the format nl(N)
, requesting at least N
newlines. Multiple consecutive nl(1)
terms are combined to
an atom containing the maximum of the requested number of newline
characters.
To simplify handing the data to a client or storing it into a file, the following predicates are available from this library:
reply_html_page(default, Head, Body)
.library(http_wrapper)
(CGI-style). Here is a simple typical example:
reply(Request) :- reply_html_page(title('Welcome'), [ h1('Welcome'), p('Welcome to our ...') ]).
The header and footer of the page can be hooked using the
grammar-rules user:head//2 and user:body//2. The first argument passed
to these hooks is the Style argument of reply_html_page/3
and the second is the 2nd (for head//2) or 3rd (for body//2) argument of reply_html_page/3.
These hooks can be used to restyle the page, typically by embedding the
real body content in a div
. E.g., the following code
provides a menu on top of each page that is identified using the
style
myapp.
:- multifile user:body//2.

user:body(myapp, Body) -->
    html(body([ div(id(top), \application_menu),
                div(id(content), Body)
              ])).
Redefining the head
can be used to pull in scripts, but
typically html_requires//1 provides a more modular approach for pulling
scripts and CSS-files.
DOCTYPE
header,
html
, head
or body
. It is
intended for JavaScript handlers that request a partial document and
insert that somewhere into the existing page DOM. See reply_html_page/3
to reply with a complete (valid) HTML page.Content-length
field of an HTTP reply-header.
Modern HTML commonly uses CSS and JavaScript. This requires <link> elements in the HTML <head> element or <script> elements in the <body>. Unfortunately this seriously harms re-using HTML DCG rules as components, as each of these components may rely on its own style sheets or JavaScript code. We added a 'mailing' system to reposition and collect fragments of HTML. This is implemented by html_post//2, html_receive//1 and html_receive//2.
//
\
-commands are executed by mailman/1
from print_html/1 or html_print_length/2.
These commands are called in the calling context of the html_post//2
call.
A typical usage scenario is to get required CSS links in the document head in a reusable fashion. First, we define css//1 as:
css(URL) -->
    html_post(css,
              link([ type('text/css'),
                     rel('stylesheet'),
                     href(URL)
                   ])).
Next, we insert the unique CSS links into the page head using the following call to reply_html_page/2:
reply_html_page([ title(...),
                  \html_receive(css)
                ],
                ...)
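With these definitions in place, any component can declare the style sheet it depends on at the point where it is used. The component and URL below are illustrative:

```prolog
% A reusable component that posts its own style sheet to the css
% channel; the link ends up in the page head exactly once.
fancy_par(Text) -->
    css('/css/fancy.css'),
    html(p(class(fancy), Text)).
```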
//
//
phrase(Handler, PostedTerms, HtmlTerms, Rest)
Typically, Handler collects the posted terms, creating a term suitable for html//1 and finally calls html//1.
The library predefines the receiver channel head
at the
end of the
head
element for all pages that write the html head
through this library. The following code can be used anywhere inside an
HTML-generating rule to demand a JavaScript file in the header:
js_script(URL) -->
    html_post(head,
              script([ src(URL),
                       type('text/javascript')
                     ], [])).
This mechanism is also exploited to add XML namespace (xmlns
)
declarations to the (outer) html
element using xhtml_ns//2:
//
xmlns
channel. Rdfa (http://www.w3.org/2006/07/SWD/RDFa/syntax/),
embedding RDF in (x)html provides a typical usage scenario where we want
to publish the required namespaces in the header. We can define:
rdf_ns(Id) -->
    { rdf_global_id(Id:'', Value)
    },
    xhtml_ns(Id, Value).
After which we can use rdf_ns//1 as a
normal rule in html//1 to publish
namespaces from library(semweb/rdf_db)
. Note that this
macro only has effect if the dialect is set to xhtml
. In
html
mode it is silently ignored.
The required xmlns
receiver is installed by html_begin//1
using the html
tag and thus is present in any document that
opens the outer html
environment through this library.
In some cases it is practical to extend the translations imposed by
html//1. We used this technique to define translation rules for the
output of the SWI-Prolog library(sgml)
package.
The html//1 non-terminal first calls the multifile ruleset html_write:expand//1.
//
//
<&>
.//
<&>"
.
Though not strictly necessary, the library attempts to generate reasonable layout in SGML output. It does this only by inserting newlines before and after tags. It does this on the basis of the multifile predicate html_write:layout/3
-
,
requesting the output generator to omit the close-tag altogether or empty
,
telling the library that the element has declared empty content. In this
case the close-tag is not emitted either, but in addition html//1
interprets Arg in Tag(Arg)
as a list of
attributes rather than the content.
A tag that does not appear in this table is emitted without additional layout. See also print_html/[1,2]. Please consult the library source for examples.
In the following example we will generate a table of Prolog predicates we find from the SWI-Prolog help system based on a keyword. The primary database is defined by the predicate predicate/5 We will make hyperlinks for the predicates pointing to their documentation.
html_apropos(Kwd) :-
    findall(Pred, apropos_predicate(Kwd, Pred), Matches),
    phrase(apropos_page(Kwd, Matches), Tokens),
    print_html(Tokens).

% emit page with title, header and table of matches

apropos_page(Kwd, Matches) -->
    page([ title(['Predicates for ', Kwd])
         ],
         [ h2(align(center), ['Predicates for ', Kwd]),
           table([ align(center),
                   border(1),
                   width('80%')
                 ],
                 [ tr([ th('Predicate'),
                        th('Summary')
                      ])
                 | \apropos_rows(Matches)
                 ])
         ]).

% emit the rows for the body of the table.

apropos_rows([]) --> [].
apropos_rows([pred(Name, Arity, Summary)|T]) -->
    html([ tr([ td(\predref(Name/Arity)),
                td(em(Summary))
              ])
         ]),
    apropos_rows(T).

% predref(Name/Arity)
%
% Emit Name/Arity as a hyperlink to
%
%     /cgi-bin/plman?name=Name&arity=Arity
%
% we must do form-encoding for the name as it may contain illegal
% characters.  www_form_encode/2 is defined in library(url).

predref(Name/Arity) -->
    { www_form_encode(Name, Encoded),
      sformat(Href, '/cgi-bin/plman?name=~w&arity=~w',
              [Encoded, Arity])
    },
    html(a(href(Href), [Name, /, Arity])).

% Find predicates from a keyword. '$apropos_match' is an internal
% undocumented predicate.

apropos_predicate(Pattern, pred(Name, Arity, Summary)) :-
    predicate(Name, Arity, Summary, _, _),
    (   '$apropos_match'(Pattern, Name)
    ->  true
    ;   '$apropos_match'(Pattern, Summary)
    ).
library(http/html_write)
library

This library is the result of various attempts to arrive at a more satisfactory and Prolog-minded way of producing HTML text from a program. We have been using Prolog for the generation of web pages in a number of projects. Just using format/2 never was a real option, generating error-prone HTML from clumsy syntax. We started with a layer on top of format/2, keeping track of the current nesting and thus always capable of properly closing the environment.
DCG based translation however, naturally exploits Prolog's term-rewriting primitives. If generation fails for whatever reason it is easy to produce an alternative document (for example holding an error message).
In a future version we will probably define a goal_expansion/2
to do compile-time optimisation of the library. Quotation of known text
and invocation of sub-rules using the \
RuleSet
and
<Module>:<RuleSet> operators are
costly operations in the analysis that can be done at compile-time.
This library is a supplement to library(http/html_write)
for producing JavaScript fragments. Its main role is to be able to call
JavaScript functions with valid arguments constructed from Prolog data.
For example, suppose you want to call a JavaScript function to process
a list of names represented as Prolog atoms. This can be done using the
call below, while without this library you would have to be careful to
properly escape special characters.
numbers_script(Names) -->
    html(script(type('text/javascript'),
                [ \js_call('ProcessNumbers'(Names))
                ])).
The accepted arguments are described with js_expression//1.
//
script
element with the given
content.+
operator, which results in concatenation at the client side.
...,
js_script({|javascript(Id, Config)||
           $(document).ready(function() {
              $("#"+Id).tagit(Config);
           });
          |}),
...
The current implementation tokenizes the JavaScript input and yields syntax errors on unterminated comments, strings, etc. No further parsing is implemented, which makes it possible to produce syntactically incorrect and partial JavaScript. Future versions are likely to include a full parser, generating syntax errors.
The parser produces a term \List
, which is suitable for
js_script//1 and html//1.
Embedded variables are mapped to
\js_expression(Var)
, while the remaining text is mapped to
atoms.
//
...
html(script(type('text/javascript'),
            [ \js_call('x.y.z'(hello, 42))
            ])),
//
['var ', Id, ' = new ', \js_call(Term)]
//
//
null
object(Attributes)
object(Attributes)
, providing a more
JavaScript-like syntax. This may be useful if the object appears
literally in the source code, but is generally less friendly to produce
as a result from a computation.json(Term)
true
, false
and null
, but can also be used for emitting JavaScript
symbols (i.e. function- or variable names).symbol(Atom)
//
This module provides an abstract specification of HTTP server locations that is inspired by absolute_file_name/3. The specification is done by adding rules to the dynamic multifile predicate http:location/3. The specification is very similar to user:file_search_path/2, but takes an additional argument with options. Currently only one option is defined:
The default priority is 0. Note however that libraries may decide to provide a fall-back using a negative priority. We suggest -100 for such cases.
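For example, a library can ship a fall-back location at a negative priority, which an application overrides with a higher-priority rule. The location name and paths below are illustrative:

```prolog
:- multifile http:location/3.
:- dynamic   http:location/3.

% Library fall-back: serve icons below root(icons).
http:location(icons, root(icons), [priority(-100)]).

% Application override: the higher priority wins.
http:location(icons, root(assets/icons), [priority(10)]).
```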
This library predefines a single location at priority -100:
http:prefix
To serve additional resource files such as CSS, JavaScript and icons,
see library(http/http_server_files)
.
Here is an example that binds /login
to login/1.
The user can reuse this application while moving all locations using a
new rule for the admin location with the option [priority(10)]
.
:- multifile http:location/3.
:- dynamic   http:location/3.

http:location(admin, /, []).

:- http_handler(admin(login), login, []).

login(Request) :-
    ...
/
. Options
currently only supports the priority of the path. If http:location/3
returns multiple solutions the one with the highest priority is
selected. The default priority is 0.
This library provides a default for the abstract location
root
. This defaults to the setting http:prefix or, when not
available, to the path /
. It is advised to define all
locations (ultimately) relative to root
. For example, use
root('home.html')
rather than '/home.html'
.
http://
) URI
for the abstract specification Spec. Use http_absolute_location/3
to create references to locations on the same server.
This library allows for abstract declaration of available CSS and
Javascript resources and their dependencies using html_resource/2.
Based on these declarations, html generating code can declare that it
depends on specific CSS or Javascript functionality, after which this
library ensures that the proper links appear in the HTML head. The
implementation is based on mail system implemented by html_post/2
of library html_write.pl
.
Declarations come in two forms. First of all http locations are
declared using the http_path.pl
library. Second, html_resource/2
specifies HTML resources to be used in the head
and their
dependencies. Resources are currently limited to Javascript files (.js)
and style sheets (.css). It is trivial to add support for other material
in the head. See
html_include//1.
For usage in HTML generation, there is the DCG rule html_requires//1 that demands named resources in the HTML head.
All calls to html_requires//1 for the page are collected and duplicates are removed. Next, the following steps are taken:
Use ?- debug(html(script)). to see the
requested and final set of resources. All declared resources are in html_resource/3.
The edit/1 command recognises the names of
HTML resources.
true
(default false
), do not include About
itself, but only its dependencies. This allows for defining an alias for
one or more resources.Registering the same About multiple times extends the properties defined for About. In particular, this allows for adding additional dependencies to a (virtual) resource.
//
head
using html_post/2.
The actual dependencies are computed during the HTML output phase by
html_insert_resource//1.//
//
text/css
and text/javascript
are tried. For
example, to include a .pl file as a Prolog script, use:
:- multifile html_head:mime_include//2.

html_head:mime_include(text/'x-prolog', Path) --> !,
    html(script([ type('text/x-prolog'),
                  src(Path)
                ], [])).
This module provides convenience predicates to include PWP (Prolog Well-formed Pages) in a Prolog web-server. It provides the following predicates:
pwp_handler
/
2reply_pwp_page
/
3library(http/http_dispatch)
. In the typical usage scenario,
one needs to define an http location and a file-search path that is used
as the root of the server. E.g., the following declarations create a
self-contained web-server for files in /web/pwp/
.
user:file_search_path(pwp, '/web/pwp').

:- http_handler(root(.), pwp_handler([path_alias(pwp)]), [prefix]).
Options include:
index.pwp
.true
(default is false
), allow for
?view=source to serve PWP file as source.permission_error(index, http_location, Location)
is raised
if the handler resolves to a directory that has no index.Options supported are:
true
, (default false
), process the PWP file
in a module constructed from its canonical absolute path. Otherwise, the
PWP file is processed in the calling module.Initial context:
get
, post
, put
or head
While processing the script, the file-search-path pwp includes the current location of the script. I.e., the following will find myprolog in the same directory as where the PWP file resides.
pwp:ask="ensure_loaded(pwp(myprolog))"
As of version 9.1.5, SWI-Prolog supports IPv6. This has few
implications for the HTTP package because most aspects are handled by library(socket)
and library(uri)
. This section highlights a few aspects.
The client libraries use http_open/3,
which in turn uses
tcp_connect/3.
This causes the client to use addresses returned by
host_address/3,
which is based on the C API getaddrinfo(), in the order provided by
getaddrinfo(). The URL is parsed using library(uri)
, which
allows enclosing IPv6 addresses in []
. The query below
accesses an IPv6 server on localhost at port 8080
?- http_open('http://[::1]:8080', Stream, []).
The predicate http_server/2
can be used to create an IPv6 server using one of the queries below. The
first binds to all interfaces. The second only binds to the IPv6
equivalent of localhost
. Note that the IPv6 address needs
to be quoted to create the desired
Host:Port term.
?- http_server('::':8080, []).
?- http_server('::1':8080, []).
The
HTTP protocol provides for transfer encodings. These define
filters applied to the data described by the Content-type
.
The two most popular transfer encodings are chunked
and
deflate
. The chunked
encoding avoids the need
for a Content-length
header, sending the data in chunks,
each of which is preceded by a length. The deflate
encoding
provides compression.
Transfer-encodings are supported by filters defined as foreign
libraries that realise an encoding/decoding stream on top of another
stream. Currently there are two such libraries: library(http/http_chunked.pl)
and library(zlib.pl)
.
There is an emerging hook interface dealing with transfer encodings.
The
library(http/http_chunked.pl)
provides a hook used by
library(http/http_open.pl)
to support chunked encoding in http_open/3.
Note that both http_open.pl
and http_chunked.pl
must be loaded for http_open/3
to support chunked encoding.
library(http/http_chunked)
library
WebSocket is a lightweight message oriented protocol on top of TCP/IP streams. It is typically used as an upgrade of an HTTP connection to provide bi-directional communication, but can also be used in isolation over arbitrary (Prolog) streams.
The SWI-Prolog interface is based on streams and provides ws_open/3 to create a websocket stream from any Prolog stream. Typically, both an input and output stream are wrapped and then combined into a single object using stream_pair/3.
The high-level interface provides http_upgrade_to_websocket/3 to realise a websocket inside the HTTP server infrastructure and http_open_websocket/3 as a layer over http_open/3 to realise a client connection. After establishing a connection, ws_send/2 and ws_receive/2 can be used to send and receive messages. The predicate ws_close/3 is provided to perform the closing handshake and dispose of the stream objects.
subprotocol(Protocol)
.
Note that clients often provide an Origin header and some
servers require this field. See RFC 6455 for details. By default this
predicate does not set Origin. It may be set using the
request_header
option of http_open/3,
e.g. by passing this in the
Options list:
request_header('Origin' = 'https://www.swi-prolog.org')
The following example exchanges a message with the html5rocks.websocket.org echo service:
?- URL = 'ws://html5rocks.websocket.org/echo',
   http_open_websocket(URL, WS, []),
   ws_send(WS, text('Hello World!')),
   ws_receive(WS, Reply),
   ws_close(WS, 1000, "Goodbye").
URL = 'ws://html5rocks.websocket.org/echo',
WS = <stream>(0xe4a440,0xe4a610),
Reply = websocket{data:"Hello World!", opcode:text}.
WebSocket | is a stream pair (see stream_pair/3) |
call(Goal, WebSocket)
,
where WebSocket is a socket-pair. Options:
true
(default), guard the execution of Goal
and close the websocket on both normal and abnormal termination of Goal.
If false
, Goal itself is responsible for the
created websocket. This can be used to create a single thread that
manages multiple websockets using I/O multiplexing.infinite
.Note that the Request argument is the last for cooperation with http_handler/3. A simple echo server that can be accessed at =/ws/= can be implemented as:
:- use_module(library(http/websocket)).
:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).

:- http_handler(root(ws),
                http_upgrade_to_websocket(echo, []),
                [spawn([])]).

echo(WebSocket) :-
    ws_receive(WebSocket, Message),
    (   Message.opcode == close
    ->  true
    ;   ws_send(WebSocket, Message),
        echo(WebSocket)
    ).
switching_protocols(Goal, Options)
. The recovery from this
exception causes the HTTP infrastructure to call
call(Goal, WebSocket)
.text(+Text)
, but all character codes produced by Content
must be in the range [0..255]. Typically, Content will be an
atom or string holding binary data.text(+Text)
, provided for consistency.opcode
key. Other keys
used are:
format
:
Formatstring
, prolog
or json
.
See ws_receive/3.data
:
TermNote that ws_start_message/3 does not unlock the stream. This is done by ws_send/1. This implies that multiple threads can use ws_send/2 and the messages are properly serialized.
opcode
:
OpCodeclose
and data to the atom
end_of_file
.data
:
Stringrsv
:
RSV
If ping
message is received and WebSocket is
a stream pair,
ws_receive/1 replies with a pong
and waits for the next message.
The predicate ws_receive/3 processes the following options:
close
message if this was not already sent and wait for the close reply.
Code | is the numerical code indicating the close status. This is a 16-bit integer. The codes are defined in section 7.4.1. Defined Status Codes of RFC6455. Notably, 1000 indicates a normal closure. |
Data | is currently interpreted as text. |
websocket_error(unexpected_message, Reply)
if the other
side did not send a close message in reply.server
or client
. If client
,
messages are sent as masked.true
(default), closing WSStream also closes Stream.subprotocols
option of http_open_websocket/3
and
http_upgrade_to_websocket/3.A typical sequence to turn a pair of streams into a WebSocket is here:
...,
Options = [mode(server), subprotocol(chat)],
ws_open(Input,  WsInput,  Options),
ws_open(Output, WsOutput, Options),
stream_pair(WebSocket, WsInput, WsOutput).
This library manages a hub that consists of clients that are connected using a websocket. Messages arriving at any of the websockets are sent to the event queue of the hub. In addition, the hub provides a broadcast interface. A typical usage scenario for a hub is a chat server. A scenario for realising a chat server is:
error
:Error, left
:ClientId, reason
:Reason}read
or write
and Error is the Prolog I/O exception.joined
:ClientId}
The thread(s)
can talk to clients using two predicates:
A hub consists of (currently) four message queues and a simple dynamic fact. Threads that are needed for the communication tasks are created on demand and die if no more work needs to be done.
.
name
queues
.
event
thread(s)
can listen.After creating a hub, the application normally creates a thread that listens to Hub.queues.event and exposes some mechanisms to establish websockets and add them to the hub using hub_add/3.
Message | is either a single message (as accepted by ws_send/2) or a list of such messages. |
call(Condition, Id)
succeeds. Note that this process is
asynchronous: this predicate returns immediately after putting
all requests in a broadcast queue. If a message cannot be delivered due
to a network error, the hub is informed through
io_error/3.
From http://json.org, " JSON (JavaScript Object Notation) is a lightweight data-interchange format. It is easy for humans to read and write. It is easy for machines to parse and generate. It is based on a subset of the JavaScript Programming Language, Standard ECMA-262 3rd Edition - December 1999. JSON is a text format that is completely language independent but uses conventions that are familiar to programmers of the C-family of languages, including C, C++, C#, Java, JavaScript, Perl, Python, and many others. These properties make JSON an ideal data-interchange language."
Although JSON is nowadays used a lot outside the context of web
applications, SWI-Prolog's support for JSON started life as part of the
HTTP package. SWI-Prolog supports two Prolog representations for JSON
terms. The first and oldest maps JSON objects to a term
json(PropertyList)
and use the @
functor to
disambiguate e.g. null
from the string "null"
,
leading to @(null)
. As of SWI-Prolog version 7, JSON
objects may be represented using dict objects and JSON strings
using Prolog strings. Predicates following this convention are suffixed
with _dict
, e.g. json_read_dict/2.
For example, given the JSON document
{ "name": "Bob",
  "children": ["Mary", "John"],
  "age": 42,
  "married": true
}
we get either (using json_read/2):
json([name='Bob', children=['Mary', 'John'], age=42, married= @(true)]).
or (using json_read_dict/2):
_{age:42, children:["Mary", "John"], married:true, name:"Bob"}
The SWI-Prolog JSON interface consists of three libraries:
library(http/json)
provides support for the core JSON
object serialization and parsing.library(http/json_convert)
converts between the primary
representation of JSON terms in Prolog and more application oriented
Prolog terms. E.g. point(X,Y)
vs. object([x=X,y=Y])
.library(http/http_json)
hooks the conversion libraries
into the HTTP client and server libraries.
http_json.pl
links JSON to the HTTP client and server
modules. json_convert.pl
converts JSON Prolog terms to more
comfortable terms.This module supports reading and writing JSON objects. This library supports two Prolog representations (the new representation is only supported in SWI-Prolog version 7 and later):
json(NameValueList)
, a JSON string as an
atom and the JSON constants null
, true
and
false
as @(null), @(true) and @(false).null
, true
and false
.atom
(default),
string
, codes
or chars
.
json(NameValueList)
,
where NameValueList is a list of Name=Value. Name is an atom created
from the JSON string.true
and false
are
mapped -like JPL- to @(true) and @(false).null
is mapped to the Prolog term
@(null)

Here is a complete example in JSON and its corresponding Prolog term.
{ "name":"Demo term",
  "created": { "day":null,
               "month":"December",
               "year":2007
             },
  "confirmed":true,
  "members":[1,2,3]
}
json([ name='Demo term',
       created=json([ day= @null,
                      month='December',
                      year=2007
                    ]),
       confirmed= @true,
       members=[1, 2, 3]
     ])
The following options are processed:
null
. Default
@(null)true
. Default
@(true)false
. Default
@(false)error
):
error
, throw
an unexpected end of file syntax error
Returning an status term is required to process
Concatenated
JSON. Suggested values are @(eof)
or end_of_file
.
atom
.
The alternative is string
, producing a packed string
object. Please note that codes
or chars
would
produce ambiguous output and are therefore not supported.Values can be of the form #(Term), which causes Term to be stringified if it is not an atom or string. Stringification is based on term_string/2.
Rational numbers are emitted as floating point numbers. The hook json_write_hook/4 can be used to realize domain specific alternatives.
The version 7 dict type is supported as well. Optionally, if
the dict has a tag, a property "type":"tag" can be added to the
object. This behaviour can be controlled using the tag
option (see below). For example:
?- json_write(current_output, point{x:1,y:2}).
{ "x":1, "y":2 }
?- json_write(current_output, point{x:1,y:2}, [tag(type)]).
{ "type":"point", "x":1, "y":2 }
In addition to the options recognised by json_read/3, the following options are processed:
- serialize_unknown(Boolean): If true (default false), serialize unknown terms and print them as a JSON string. The default raises a type error. Note that this option only makes sense if you can guarantee that the passed value is not an otherwise valid Prolog representation of a Prolog term.
If a string is emitted, the sequence </ is emitted as <\/. This is valid JSON syntax which ensures that JSON objects can be safely embedded into an HTML <script> element.
Note that this hook is shared by all users of this library. It is generally advised to map a unique compound term to avoid interference with normal output.
State and Options are opaque handles to the current output state and settings. Future versions may provide documented access to these terms. Currently it is advised to ignore these arguments.
In the dict representation, the JSON constants true, false and null are represented using these Prolog atoms, and the value of the type field in an object can assign a tag for the dict.

The predicate json_read_dict/3 processes the same options as json_read/3, but with different defaults. In addition, it processes the tag option. See json_read/3 for details about the shared options.

- tag(Name): If given, the value of the named field in an object assigns the dict tag. By default, the tag option does not apply.
- null(Term): Default is the atom null.
- true(Term): Default is the atom true.
- false(Term): Default is the atom false.
- value_string_as(Type): Default is string, producing a packed string object. The alternative is atom.

Textual input for the conversion predicates may be an atom, string or codes.
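A small sketch (predicate name and data are illustrative) of the dict interface using atom_json_dict/3, which accepts the json_read_dict/3 options:

```prolog
:- use_module(library(http/json)).

% Parse JSON text into a dict, using the value of the "type" field
% as the dict tag (the tag option).
parse_tagged(Atom, Dict) :-
    atom_json_dict(Atom, Dict, [tag(type)]).

% ?- parse_tagged('{"type":"point", "x":1, "y":2}', D).
% D = point{x:1, y:2}.
```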
The mapping of JSON null deserves some attention. Conversion to Prolog could translate @null into a variable if the desired type is not any. Conversion to JSON could map variables to null, though this may be unsafe. If the Prolog term is known to be non-ground and JSON @null is a sensible mapping, we can also use this simple snippet to deal with that fact.
term_variables(Term, Vars), maplist(=(@null), Vars).
The idea behind this module is to provide a flexible high-level
mapping between Prolog terms as you would like to see them in your
application and the standard representation of a JSON object as a Prolog
term. For example, an X-Y point may be represented in JSON as {"x":25, "y":50}. Represented in Prolog this becomes json([x=25,y=50]), but this is a pretty unnatural representation from the Prolog point of view.
This module allows for defining records (just like library(record)) that provide transparent two-way transformation between the two representations.
:- json_object point(x:integer, y:integer).
This declaration causes prolog_to_json/2 to translate the native Prolog representation into a JSON Term:
?- prolog_to_json(point(25,50), X).
X = json([x=25, y=50])
A json_object/1 declaration can define multiple objects separated by a comma (,), similar to the dynamic/1 directive. Optionally, a declaration can be qualified using a module. The conversion predicates prolog_to_json/2 and json_to_prolog/2 first try a conversion associated with the calling module. If not successful, they try conversions associated with the module user.
JSON objects have no type. This can be solved by adding an extra field to the JSON object, e.g., {"type":"point", "x":25, "y":50}. As Prolog records are typed by their functor, we need some notation to handle this gracefully. This is achieved by adding +Fields to the declaration. I.e.
:- json_object point(x:integer, y:integer) + [type=point].
Using this declaration, the conversion becomes:
?- prolog_to_json(point(25,50), X).
X = json([x=25, y=50, type=point])
The predicate json_to_prolog/2 is often used after http_read_json/2 and prolog_to_json/2 before reply_json/1. For now we consider them separate predicates because the transformation may be too general, too slow or not needed for dedicated applications. Using a separate step also simplifies debugging this rather complicated process.
Fields are represented as a list of terms f(Name, Type, Default, Var), ordered by Name. Var is the corresponding variable in Term.

A JSON object declaration takes the same format as library(record). E.g.

:- json_object point(x:int, y:int, z:int=0).

The type arguments are either types as known to library(error) or functor names of other JSON objects. The constant any indicates an untyped argument. If this is a JSON term, it becomes subject to json_to_prolog/2. I.e., using the type list(any) causes the conversion to be executed on each element of the list.
If a field has a default, the default is used if the field is not specified in the JSON object. Extending the record type definition, types can be of the form (Type1|Type2). The type null means that the field may not be present.
Conversion of JSON to Prolog applies if all non-defaulted arguments can be found in the JSON object. If multiple rules match, the term with the highest arity gets preference.
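A hedged illustration (the declaration and data are hypothetical) of defaults and union types:

```prolog
:- use_module(library(http/json_convert)).

% age may be an integer or absent (null); nickname defaults to the
% atom none when the JSON object does not provide it.
:- json_object
       person(name:atom, age:(integer|null), nickname:atom=none).

% ?- json_to_prolog(json([name=bob, age=42]), P).
% P = person(bob, 42, none).
```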
Boolean translation accepts one of true, on, 1 or @true for @true, and one of false, fail, off, 0 or @false for @false.

The translation from Prolog to JSON is based on :- json_object/1 declarations. If a json_object/1 declaration declares a field of type boolean, commonly used truth-values in Prolog are converted to JSON booleans. A type_error(json_term, X) is raised if the term cannot be translated.
The translation from JSON to Prolog is likewise based on :- json_object/1 declarations. An efficient transformation is non-trivial, but we rely on the assumption that, although the order of fields in JSON terms is irrelevant and can therefore vary a lot, practical applications will normally generate the JSON objects in a consistent order.

If a field in a json_object is declared of type boolean, @true and @false are translated to true or false, the most commonly used Prolog representation for truth-values.
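A short sketch (the flag declaration is hypothetical) of boolean conversion in both directions:

```prolog
:- use_module(library(http/json_convert)).

:- json_object flag(name:atom, enabled:boolean).

% Prolog truth-values such as 'on' become JSON booleans:
% ?- prolog_to_json(flag(debug, on), J).
% J = json([name=debug, enabled= @(true)]).
%
% and JSON booleans come back as the true/false atoms:
% ?- json_to_prolog(json([name=debug, enabled= @(true)]), F).
% F = flag(debug, true).
```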
json.pl describes how JSON objects are represented in Prolog terms. json_convert.pl converts between more natural Prolog terms and JSON terms.
Most code doesn't need to use this directly; instead use library(http/http_server), which combines this library with the typical HTTP libraries that most servers need.
This module adds hooks to several parts of the HTTP libraries, making them JSON-aware. Notably:
- Convert application/json and application/jsonrequest request content to a JSON term.
- Support post(json(Term)) to issue a POST request with JSON content.
- Reply using JSON if the Accept header prefers application/json over text/html.

Typically JSON is used by Prolog HTTP servers. This module supports two JSON representations: the classical representation and the new representation supported by the SWI-Prolog version 7 extended data types. Below is a skeleton for handling a JSON request, answering in JSON using the classical interface.
handle(Request) :-
    http_read_json(Request, JSONIn),
    json_to_prolog(JSONIn, PrologIn),
    <compute>(PrologIn, PrologOut),    % application body
    prolog_to_json(PrologOut, JSONOut),
    reply_json(JSONOut).
When using dicts, the conversion step is generally not needed and the code becomes:
handle(Request) :-
    http_read_json_dict(Request, DictIn),
    <compute>(DictIn, DictOut),
    reply_json(DictOut).
This module also integrates JSON support into the http client
provided by http_client.pl
. Posting a JSON query and
processing the JSON reply (or any other reply understood by http_read_data/3)
is as simple as below, where Term is a JSON term as described in json.pl
and Reply is of the same format if the server replies with JSON.
..., http_post(URL, json(Term), Reply, [])
The json_object option may be given the value term or dict. If the value is dict, json_read_dict/3 is used.

MediaType is a term Type/SubType, where both Type and SubType are atoms.
http_post(URL, json(Term), Reply, Options)
http_post(URL, json(Term, Options), Reply, Options)
If Options are passed, these are handed to json_write/3. In addition, this option is processed:
- json_object(As): If the value is dict, json_write_dict/3 is used to write the output. This is the default if json(Dict) is passed.

When reading, the json_object option is one of term (default) to generate a classical Prolog term or dict to exploit the SWI-Prolog version 7 data type extensions. See json_read_dict/3. Errors raised are domain_error(mimetype, Found) if the mimetype is not known (see json_type/1) and domain_error(method, Method) if the request method is not POST, PUT or PATCH.

The reply Content-type is application/json; charset=UTF8. charset=UTF8 should not be required because JSON is defined to be UTF-8 encoded, but some clients insist on it. For the reply, the json_object option is one of term (classical JSON representation) or dict to use the new dict representation. If omitted and Term is a dict, dict is assumed (SWI-Prolog version 7).
Simple and partial implementation of MIME encoding. MIME is covered
by RFC 2045. This library is used by e.g., http_post_data/3
when using the
form_data(+ListOfData)
input specification.
MIME decoding is now arranged through library(mime)
from
the clib package, based on the external librfc2045 library. Most likely
the functionality of this package will be moved to the same library
someday. Packing, however, is a lot simpler than parsing.
Each element of the form Name=Value results in a header

Content-Disposition: form-data; name="Name"[; filename="<File>"]

where filename is present if Value is of the form file(File). Value may be any of the remaining value specifications.
The Content-type is derived from the File using file_mime_type/2. If the content-type is text/_, the file data is copied in text mode, which implies that it is read in the default encoding of the system and written using the encoding of the Out stream. Otherwise the file data is copied in binary mode.

A value of the form mime(Attributes, Value, []) accepts the attributes type(ContentType) and/or character_set(CharSet). This can be used to give a content-type to values that otherwise do not have a content-type. For example:
mime([type(text/html)], '<b>Hello World</b>', [])
Out is a stream opened for writing. Typically, it should be opened in text mode using UTF-8 encoding.
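A minimal sketch of calling mime_pack/3 (the boundary and field values are made up):

```prolog
:- use_module(library(http/mimepack)).

% Pack a plain field and an HTML-typed field into a multipart
% message on current output, using a fixed boundary.
demo_pack :-
    mime_pack([ name = 'Jan',
                comment = mime([type(text/html)],
                               '<b>Hello World</b>', [])
              ],
              current_output,
              'Boundary-4711').
```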
Writing servers is an inherently dangerous job that should be carried out with some care. You have basically started a program on a public terminal and invited strangers to use it. When using the interactive server or inetd-based server, the server runs under your privileges. Using CGI scripts, it runs with the privileges of your web-server. Though it should not be possible to fatally compromise a Unix machine using user privileges, getting unconstrained access to the system is highly undesirable.
Symbolic languages have an additional handicap in their inherent possibilities to modify the running program and dynamically create goals (this also applies to the popular Perl and PHP scripting languages). Here are some guidelines.
Check passed filenames. Not only obvious names such as /etc/passwd, but also ../../../../../etc/passwd are tried by hackers to learn about the system they want to attack. So, expand provided names using absolute_file_name/[2,3] and verify they are inside a folder reserved for the server. Avoid symbolic links from this subtree to the outside world. The example below checks validity of filenames. The first call ensures proper canonisation of the paths to avoid a mismatch due to symbolic links or other filesystem ambiguities.
check_file(File) :-
    absolute_file_name('/path/to/reserved/area', Reserved),
    absolute_file_name(File, Tried),
    sub_atom(Tried, 0, _, _, Reserved).
Check arguments used to construct commands. Before calling open(pipe(Command), ...), verify the argument once more. Use process_create/3 in preference over shell/1, as this avoids stringification of arguments (Unix) or ensures proper quoting of arguments (Windows).
reply(Query) :-
    member(search(Args), Query),
    member(action=Action, Args),
    member(arg=Arg, Args),
    call(Action, Arg).        % NEVER EVER DO THIS!
All your attacker has to do is specify Action as shell and Arg as /bin/sh, and he has an uncontrolled shell!
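A hedged sketch (the handler and action names are hypothetical) of a safer variant that only dispatches whitelisted actions:

```prolog
:- use_module(library(lists)).

% Only actions registered here may be called from a request.
allowed_action(show_status).
allowed_action(list_items).

safe_reply(Query) :-
    member(search(Args), Query),
    member(action=Action, Args),
    member(arg=Arg, Args),
    (   allowed_action(Action)
    ->  call(Action, Arg)
    ;   throw(error(permission_error(call, action, Action), _))
    ).
```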
Avoid placing all handlers directly below the root of the server (/). This is not a good idea. It is advised to have all locations in a server below a directory with an informative name. Consider making the root location something that can be changed using a global setting.
For processing URIs, use library(uri), with predicates such as uri_components/2, uri_data/4, uri_edit/3, uri_normalized/2, etc.
The SWI-Prolog HTTP library is in active use in a large number of projects. It is considered one of the SWI-Prolog core libraries that is actively maintained and regularly extended with new features. This is particularly true for the multi-threaded server. The inetd based server may be applicable for infrequent requests where the startup time is less relevant. The XPCE based server is considered obsolete.
This library is by no means complete and you are free to extend it.