(since EVL 2.0)
Lists files of the given <dest>, which might be one of:

    <local_path>
    gdrive://<path>
    gs://<bucket>/<path>
    hdfs://<path>
    s3://<bucket>/<path>
    sftp://<path>
    smb://<path>
For example, when an argument starts with ‘hdfs://’, it is assumed to be on an HDFS file system and the function ‘evl_hdfs_ls’ is called, which by default runs ‘hadoop fs -ls’. Likewise, when an argument starts with ‘s3://’, it is assumed to be on an S3 file system and the function ‘evl_s3_ls’ is called, which by default runs ‘aws s3 ls’. Otherwise it acts as the usual ‘ls’ command.
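The prefix dispatch described above can be sketched in shell. This is only an illustration of the mechanism, not EVL's actual implementation; the function names ‘evl_hdfs_ls’ and ‘evl_s3_ls’ are taken from this documentation, everything else is assumed:

```shell
# Sketch of the scheme-based dispatch -- illustrative only, not EVL source.

# Default backends; callers may redefine these functions to swap clients.
evl_hdfs_ls() { hadoop fs -ls "$@"; }
evl_s3_ls()   { aws s3 ls "$@"; }

Ls() {
    for dest in "$@"; do
        case "$dest" in
            hdfs://*) evl_hdfs_ls "$dest" ;;  # HDFS file system
            s3://*)   evl_s3_ls "$dest" ;;    # S3 file system
            *)        ls "$dest" ;;           # local path: plain 'ls'
        esac
        # gdrive://, gs://, sftp:// and smb:// would dispatch analogously
    done
}
```

Because the backends are plain shell functions, redefining ‘evl_s3_ls’, for instance, swaps in a different S3 client without touching any caller of ‘Ls’.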
- Ls
is to be used in an EVS job structure definition file or in an EWS workflow structure definition.
- evl ls
is intended for standalone usage, i.e. to be invoked from the command line.
    Ls [-R|--recursive] <dest>...
    evl ls [-R|--recursive] <dest>... [--verbose]
    evl ls ( --help | --usage | --version )
- --help
print this help and exit
- --usage
print short usage information and exit
- -v, --verbose
print to stderr info/debug messages of the component
- --version
print version and exit
- These simple examples write the result to stdout:

    Ls hdfs:///some/path/????-??-??.csv
    Ls s3://somebucketname/path/
    Ls /some/local/machine/path/*
- To be used to initiate a flow in an EVL job:

    INPUT_FILES=/data/input

    Run "" INPUT "Ls $INPUT_FILES"
    Map INPUT ...
    ...
And then, for the PROD environment, the input files would be defined, for example: