...

Files Location

For a maintainable operational suite, we recommend the following:

  • start the server in a local /tmp directory (ECF_HOME)

  • define the ECF_FILES and ECF_INCLUDE variables at suite and family level: script wrappers will then be accessible for creation and update under these directories.

  • define ECF_HOME, at suite level, as another directory location, where jobs and related outputs will be found. These dynamic files may then be tarred as a snapshot of the effective work associated with a suite/family, for later analysis or rerun.

  • when the server and the remote job destination do not share a common directory for output files, the ECF_OUT variable needs to be present in the suite definition: it indicates the remote output path. In this situation, the suite designer is responsible for creating the directory structure where the output file will be found. Most queueing systems will not start the job if this directory is absent, and the task may then remain visible as submitted on the ecFlow server side.

  • after sending the job complete command, the job may copy its output to ECF_NODE, to enable direct access from the ecFlow server.

    When a file is requested from the ecflow-server, it is limited to 15k lines, to avoid the server spending too much time delivering very large output files.

    ecFlowview can be configured (globally via Edit > Preferences > Server Options, or locally via the top-node menu > Options, "Read output and other files from disk when possible") to get the best expected behaviour.

  • use the ecf.list file to restrict access to the server, granting read-write or read-only access

    Code Block
    4.0.6
    # ecflow_client --help reloadwsfile
    # ecflow_client --reloadwsfile  # ask the running ecFlow server to re-read this file
    # $USER   # read-write access, aka $LOGNAME
    # -$USER  # read-only access
    # export ECF_LISTS=/path/to/file # before the server starts, to change location or name
    emos
    -rdx
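
Taken together, the recommendations above could be sketched as a suite definition fragment. All paths and names below are illustrative only, not prescriptive:

    # illustrative .def fragment; adapt paths and names to your installation
    suite ops
      edit ECF_HOME    /tmp/ecflow             # jobs and related outputs (dynamic files)
      edit ECF_FILES   /home/user/ops/files    # task wrapper scripts
      edit ECF_INCLUDE /home/user/ops/include  # head.h / tail.h include files
      edit ECF_OUT     /remote/out/ecflow      # only when no common output directory exists
      family f1
        task t1
      endfamily
    endsuite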

Log-Server

It is possible to set up a log-server, to access 'live' output from the jobs. ecFlow is provided with the Perl script ecflow_logsvr.pl.

  • it is configured to deliver only files located under specific directories,

  • its configuration variables are:

    • LOGPORT # 9316
    • LOGPATH # <path1>:<path2>:<path3>
    • LOGMAP # mapping between the requested path and its actual location

    As an example, with two possible storage destinations:

    export LOGPATH=/s2o1/logs:/s2o2/logs                      # two possible locations
    export LOGMAP=/s2o1/logs:/s2o1/logs:/s2o2/logs:/s2o2/logs # each path maps to itself
    export LOGMAP=$LOGMAP:/tmp:/s2o1/logs:/tmp:/s2o2/logs     # also map from /tmp
  • It is started on the remote machine, and the ecFlowview GUI will contact it when the variables ECF_LOGHOST and ECF_LOGPORT are defined in the suite:

    edit ECF_LOGHOST c2a
    edit ECF_LOGPORT 9316
  • it can be tested from the command line with telnet:

    telnet c2a 9316   # then type: get <file>  or: list <directory>
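
To illustrate how LOGMAP pairs resolve a requested path, here is a small shell sketch. The function name `map_path` and its logic are illustrative only, not taken from ecflow_logsvr.pl; the real script may behave differently:

```shell
#!/bin/sh
# LOGMAP is a colon-separated list of from:to pairs, read left to right.
# This sketch applies the first pair whose "from" is a prefix of the request.
LOGMAP=/s2o1/logs:/s2o1/logs:/tmp:/s2o1/logs

map_path() {
  req=$1
  # Split the pair list on ':' and walk it two fields at a time.
  old_ifs=$IFS; IFS=:
  set -- $LOGMAP
  IFS=$old_ifs
  while [ $# -ge 2 ]; do
    from=$1; to=$2; shift 2
    case $req in
      "$from"*) echo "$to${req#"$from"}"; return 0 ;;
    esac
  done
  echo "$req"   # no mapping found: return the path unchanged
}

map_path /tmp/suite/family/task.1   # -> /s2o1/logs/suite/family/task.1
```

With this mapping, a job writing its output under /tmp can still be served as if it lived under the storage destination, which is the purpose of the second LOGMAP export in the example above.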

...