Products, services and company names referenced in this document may be either trademarks or registered trademarks of their respective owners.

Copyright © 2017–2024 EVL Tool, s.r.o.

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.3 or any later version published by the Free Software Foundation; with no Invariant Sections, with no Front-Cover Texts, and with no Back-Cover Texts.

Ls

(since EVL 2.0)

List <dest>, which might be one of:

<local_path>
gdrive://<path>
gs://<bucket>/<path>
hdfs://<path>
s3://<bucket>/<path>
sftp://<path>
smb://<path>

So, for example, when an argument starts with ‘hdfs://’, the destination is assumed to be on an HDFS file system and the function ‘evl_hdfs_ls’ is called, which is by default ‘hadoop fs -ls’.

Likewise, when an argument starts with ‘s3://’, the destination is assumed to be on an S3 file system and the function ‘evl_s3_ls’ is called, which is by default ‘aws s3 ls’.

Otherwise it acts as the usual ‘ls’ command.
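The scheme-based dispatch described above can be sketched in shell as follows. This is a minimal illustration, not the actual implementation; the helper name ‘pick_ls_handler’ is hypothetical, while the default commands are the ones named in this document:

```shell
#!/bin/sh
# Hypothetical sketch: choose the listing command by matching the
# URI scheme prefix of <dest>, mirroring how Ls routes to
# evl_hdfs_ls / evl_s3_ls or falls back to plain ls.
pick_ls_handler() {
  case "$1" in
    hdfs://*) echo "hadoop fs -ls" ;;  # default behind evl_hdfs_ls
    s3://*)   echo "aws s3 ls" ;;      # default behind evl_s3_ls
    *)        echo "ls" ;;             # local path, ordinary ls
  esac
}

pick_ls_handler "hdfs:///data/input"   # prints: hadoop fs -ls
pick_ls_handler "s3://bucket/path/"    # prints: aws s3 ls
pick_ls_handler "/tmp"                 # prints: ls
```

Because the functions are named, an installation could in principle override, say, ‘evl_s3_ls’ to call a different S3 client without changing the jobs that use Ls.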

Ls

is to be used in an EVS job structure definition file or in an EWS workflow structure definition file.

evl ls

is intended for standalone usage, i.e. to be invoked from the command line.

Synopsis

Ls
  [-R|--recursive] <dest>...

evl ls
  [-R|--recursive] <dest>...
  [--verbose]

evl ls
  ( --help | --usage | --version )

Options

Standard options:

--help

print this help and exit

--usage

print short usage information and exit

-v, --verbose

print to stderr info/debug messages of the component

--version

print version and exit

Examples

  1. These simple examples write the result to stdout:
    Ls hdfs:///some/path/????-??-??.csv
    Ls s3://somebucketname/path/
    Ls /some/local/machine/path/*
    
  2. To be used to initiate a flow in EVL job:
    INPUT_FILES=/data/input
    Run   ""    INPUT  "Ls $INPUT_FILES"
    Map   INPUT ...
    ...
    

Then, for the PROD environment, the input files could be defined, for example, as:

INPUT_FILES=hdfs:///data/input
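Because Ls dispatches on the URI scheme, the job itself stays unchanged between environments; only the variable differs. A minimal sketch of such per-environment selection, assuming a hypothetical environment variable ‘EVL_ENV’:

```shell
#!/bin/sh
# Hypothetical sketch: pick INPUT_FILES per environment so the
# "Ls $INPUT_FILES" call in the job definition never changes.
pick_input_files() {
  case "$1" in
    PROD) echo "hdfs:///data/input" ;;  # production data on HDFS
    *)    echo "/data/input" ;;         # local path elsewhere
  esac
}

INPUT_FILES=$(pick_input_files "${EVL_ENV:-DEV}")
echo "$INPUT_FILES"
```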