When the argument contains ‘hdfs://’, the path is assumed to be on an HDFS file system and the command from the variable HADOOP_FS_LS is called, which is by default ‘hadoop fs -ls’.
When the argument contains ‘s3://’, the path is assumed to be on the AWS S3 file system and the command from the variable AWS_S3_LS is called, which is by default ‘aws s3 ls’.
Otherwise it acts as the usual ‘ls’ command.
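The dispatch described above can be sketched in plain shell; this is a hypothetical illustration of the behavior, not EVL's actual implementation, and the function name evl_ls is made up:

```shell
# Sketch of the scheme-based dispatch: pick the listing command
# from HADOOP_FS_LS / AWS_S3_LS (with their documented defaults),
# otherwise fall back to plain ls.
evl_ls() {
  case "$1" in
    hdfs://*) ${HADOOP_FS_LS:-hadoop fs -ls} "$1" ;;  # HDFS path
    s3://*)   ${AWS_S3_LS:-aws s3 ls} "$1" ;;         # S3 path
    *)        ls "$@" ;;                               # local path
  esac
}
```

Note that HADOOP_FS_LS and AWS_S3_LS are expanded unquoted on purpose, so a multi-word value such as ‘hadoop fs -ls’ splits into a command and its arguments.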
- Ls
is to be used in an EVL job structure definition file or in an EVL workflow structure definition.
- evl ls
is intended for standalone usage, i.e. to be invoked from command line.
Ls [-dhlqrRStu] (hdfs://<path> | <local_path>)...
Ls [-hR] s3://<bucket>/<path>
evl ls [-dhlqrRStu] (hdfs://<path> | <local_path>)...
evl ls [-hR] s3://<bucket>/<path>
- -d, --directory
list directories themselves, not their contents
- -h, --human-readable
print human readable sizes (e.g. 1K, 234M or 2G)
- -l
use a long listing format; for HDFS this means ‘hdfs dfs -ls’
- -q, --hide-control-chars
print ? instead of nongraphic characters
- -r, --reverse
reverse order while sorting
- -R, --recursive
list subdirectories recursively
- -S
sort by file size, largest first
- -t
sort by modification time, newest first
- -u
with -lt: sort by, and show, access time; with -l: show access time and sort by name; otherwise: sort by access time, newest first
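On a local path the command acts as the usual ‘ls’, so plain ls can illustrate the sort flags. The file names and sizes below are made up for the demonstration:

```shell
# Two files of different sizes to show -S (sort by size) and -r (reverse).
tmp=$(mktemp -d)
printf 'aaaa' > "$tmp/big"    # 4 bytes
printf 'a'    > "$tmp/small"  # 1 byte
ls -S "$tmp"    # largest first: big, then small
ls -Sr "$tmp"   # -r reverses the order: small, then big
```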
- These simple examples write the result to stdout:
Ls hdfs:///some/path/????-??-??.csv
Ls s3://somebucketname/path/
Ls /some/local/machine/path/*
- To be used to initiate a flow in EVL job:
INPUT_FILES=/data/input

Run "" INPUT "Ls $INPUT_FILES"
Map INPUT ...
...
And then, for the PROD environment, the input files could be defined, for example: