Command line interface

CLI scripts can be used to perform all actions available in the UI

The LinkedDataHub CLI wraps the HTTP API in a set of shell scripts with convenient parameters. The scripts should run on any Unix-based system and can be used for testing, automation, scheduled execution, and similar tasks. Performing actions through the CLI is usually much quicker than through the user interface, and easier to reproduce.

Some scripts correspond to a single request to LinkedDataHub; others combine multiple interdependent requests into a task, such as a CSV import.

You will need to supply the .pem file of your WebID certificate, as well as its password, as script arguments, among others.

The CLI scripts use the environment variable $SCRIPT_ROOT, which should point to the scripts in your LinkedDataHub fork.

They also use Jena's CLI commands internally, so make sure those are on $PATH before running the scripts.
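A minimal environment setup might look like the following sketch. The paths are assumptions, not canonical locations; adjust them to wherever your LinkedDataHub fork and Jena distribution actually live:

```shell
# Assumed locations -- substitute your own paths
export SCRIPT_ROOT="$HOME/LinkedDataHub/scripts"   # scripts in your LinkedDataHub fork
export PATH="$PATH:$HOME/apache-jena/bin"          # Jena CLI commands (riot, sparql, ...)

# Sanity check: warn if the Jena tools cannot be resolved
command -v riot >/dev/null 2>&1 || echo "Jena CLI tools not found on \$PATH" >&2
```

Putting these exports in your shell profile avoids having to set them for every session.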


Common parameters used by most scripts include:

.pem file with the WebID certificate of the agent
Password of the WebID certificate
Base URI of the application
The host this request will be proxied through (optional). It can be used with port 5443, on which client certificate authentication is always enabled, for example --proxy https://localhost:5443/

Other parameters are script-specific.
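In a wrapper script, the common parameters are typically captured as shell variables and passed to every invocation. A sketch, in which all of the values are example assumptions rather than defaults:

```shell
# Common argument values -- every value here is an example assumption
cert_pem_file="$HOME/certs/agent.pem"  # .pem file with the agent's WebID certificate
cert_password="changeit"               # password of the WebID certificate
base="https://localhost:4443/"         # base URI of the application
proxy="https://localhost:5443/"        # optional proxy; port 5443 always has client-cert auth
```

The invocation examples further below assume variables named like these.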


A usage message with the parameters of a script is printed when the script is run without any arguments. There can be named parameters and default (positional) parameters; both of those can be optional. For example:

~/WebRoot/AtomGraph/LinkedDataHub/scripts$ ./
Creates a SPARQL SELECT query.

Usage:  ./ options

  -f, --cert-pem-file CERT_FILE        .pem file with the WebID certificate of the agent
  -p, --cert-password CERT_PASSWORD    Password of the WebID certificate
  -b, --base BASE_URI                  Base URI of the application
  --proxy PROXY_URL                    The host this request will be proxied through (optional)

  --title TITLE                        Title of the chart
  --description DESCRIPTION            Description of the chart (optional)
  --slug STRING                        String that will be used as URI path segment (optional)
  --fragment STRING                    String that will be used as URI fragment identifier (optional)

  --query-file ABS_PATH                Absolute path to the text file with the SPARQL query string
  --service SERVICE_URI                URI of the SPARQL service specific to this query (optional)

The optional parameters are marked with (optional). In this case there is no default parameter, but some scripts require a document (named graph) URI as the default parameter, e.g. an ontology document URL.

This is how an invocation would look:

./ \
  -b "$base" \
  -f "$cert_pem_file" \
  -p "$cert_password" \
  --proxy "$proxy" \
  --title "Select concepts" \
  --slug select-concepts \
  --query-file "$pwd/queries/select-concepts.rq"


Currently supported actions:

Purpose Script
Create document
Update document
Create container
Create item
Create result set chart
Create SELECT query
Create file imports/
Create query imports/
Create CSV import imports/
Import CSV data imports/
Add owl:import to ontology admin/
Clear and reload ontology admin/
Access control
Add agent to group admin/acl/
Create authorization admin/acl/
Create group admin/acl/
Make application publicly readable to any agent admin/acl/
Create class admin/model/
Create CONSTRUCT query admin/model/
Create ontology admin/model/
Create property constraint admin/model/
Create SELECT query admin/model/
Import ontology admin/model/

Usage example:

./ https://localhost:4443/ \
  -f "$cert_pem_file" \
  -p "$cert_password" \
  --title "Friends" \
  --file-slug 646af756-a49f-40da-a25e-ea8d81f6d306 \
  --file friends.csv \
  --file-content-type text/csv

See also the data import user guides.

Find the CLI scripts on GitHub or check out the demo apps that use them.