Port Hive, Yarn, Flink, Kylin, Camel, Nifi to framework #39

Open · wants to merge 126 commits into base: main

Commits (126)
1bad85b  Collect Configuration Parameters and Tests (wenhsinghuang, Nov 18, 2022)
b841ab7  Automate Project Installation (wenhsinghuang, Nov 24, 2022)
83a46b6  add hint in add_project.sh (wenhsinghuang, Nov 24, 2022)
f963380  modify ctest+const.py (wenhsinghuang, Nov 24, 2022)
3788ad0  support inject (wenhsinghuang, Nov 24, 2022)
9145918  Add Project Specific Constants (wenhsinghuang, Nov 25, 2022)
b1f6410  modify add_project.sh (wenhsinghuang, Nov 27, 2022)
484a842  update file (wenhsinghuang, Nov 28, 2022)
91fdc2d  add project kylin (ConstaT99, Nov 30, 2022)
04d8cb8  update gitignore (ConstaT99, Nov 30, 2022)
de1b9cb  Update default and environmental path variables (CarolSSS, Nov 30, 2022)
6fb4410  update inject file (ConstaT99, Nov 30, 2022)
99b2934  update injection and main in generate ctest (ConstaT99, Nov 30, 2022)
a039c39  Update value generation (CarolSSS, Nov 30, 2022)
09688bd  Merge branch 'main' of https://github.com/ConstaT99/openctest (CarolSSS, Nov 30, 2022)
7c4e7fd  update identify_param and add_project (ConstaT99, Nov 30, 2022)
cead0a4  Update const param (CarolSSS, Nov 30, 2022)
5ec8058  Update const param (CarolSSS, Dec 1, 2022)
fbb6c46  Update const param (CarolSSS, Dec 1, 2022)
eaa60cd  add hadoop yarn common to identify_param (Dec 1, 2022)
da876d4  add hadoop-yarn-common to constant.py (Dec 1, 2022)
d547796  add yarn-common-default.tsv and conf_params.txt (Dec 1, 2022)
b640fac  skipTrace added (Dec 1, 2022)
110db32  added test method list (Dec 1, 2022)
5ddeec6  update ctest_const (Dec 1, 2022)
f09518e  generate ctest update (Dec 1, 2022)
c3a4d53  upload param_unset_getter_map.json (Dec 1, 2022)
4bc6655  run_ctest update (Dec 1, 2022)
6ecaaaa  update value_generation.py (Dec 1, 2022)
38ae8a5  change last part (CarolSSS, Dec 1, 2022)
f9809b7  Finish all first step changing (CarolSSS, Dec 1, 2022)
d3faa6b  Finish generate value (CarolSSS, Dec 2, 2022)
9481940  fixed surefire and push (ConstaT99, Dec 2, 2022)
260806f  added generated mapping (Dec 4, 2022)
877b93d  revert to ctest hadoop (Dec 4, 2022)
612e2a3  add patch file for yarn (chrisshen98, Dec 5, 2022)
eb76861  new repo (ConstaT99, Dec 5, 2022)
5f9b37d  add_project (ConstaT99, Dec 5, 2022)
63b80b3  Update generate value and getter/setter result (CarolSSS, Dec 6, 2022)
23de7b8  update result specific for kylin (CarolSSS, Dec 6, 2022)
d2d628d  add generated values to ctest (CarolSSS, Dec 6, 2022)
e366e49  adding test_result/kylin-common (CarolSSS, Dec 6, 2022)
6c29013  update map (CarolSSS, Dec 6, 2022)
d133526  update generate ctest with fixed data file (CarolSSS, Dec 6, 2022)
36355b9  fixed the bug (ConstaT99, Dec 6, 2022)
fad6287  fix bugs (ConstaT99, Dec 6, 2022)
77bbc0e  update ctest const value (ConstaT99, Dec 6, 2022)
c79aaea  modified for hive (Dec 6, 2022)
9980400  some updates (ConstaT99, Dec 6, 2022)
9c884a0  clean repo (ConstaT99, Dec 6, 2022)
b7e0afd  clean repo (ConstaT99, Dec 6, 2022)
c4af888  clean (ConstaT99, Dec 6, 2022)
89fe040  clean (ConstaT99, Dec 6, 2022)
90156af  clean (ConstaT99, Dec 6, 2022)
8cf0e86  clean (ConstaT99, Dec 6, 2022)
9bc3f87  clean (ConstaT99, Dec 6, 2022)
511832b  clean in progress (ConstaT99, Dec 6, 2022)
ef7812a  modify for nifi (Dec 6, 2022)
387ea76  update some change (ConstaT99, Dec 6, 2022)
d40e854  clean (ConstaT99, Dec 6, 2022)
b947e96  clean (ConstaT99, Dec 6, 2022)
5ded63f  ran identify param (Dec 6, 2022)
ef40289  add hive files (Dec 6, 2022)
406562f  update git ignore (ConstaT99, Dec 6, 2022)
5d95890  fix injection path (Dec 6, 2022)
6e6dad7  adding functions for tool (CarolSSS, Dec 6, 2022)
91ab7ea  update result (CarolSSS, Dec 6, 2022)
063be4d  update ctest (CarolSSS, Dec 6, 2022)
ab67280  update ctest (CarolSSS, Dec 6, 2022)
c9701b0  updates for HIVE (Dec 6, 2022)
00d4cb6  common version (ConstaT99, Dec 6, 2022)
170125a  add interception and logging patch files for hadoop-yarn-common (Dec 8, 2022)
e4b90e4  removed combined patch file (Dec 8, 2022)
62488ab  fix bug (ConstaT99, Dec 8, 2022)
e670de5  update ignore (ConstaT99, Dec 8, 2022)
9d8b4bd  Update-cube storage (CarolSSS, Dec 8, 2022)
c8574c4  Merge branch 'main' of https://github.com/ConstaT99/openctest (CarolSSS, Dec 8, 2022)
138aa77  updare git ignore (ConstaT99, Dec 8, 2022)
bc2262c  update setup_ubuntu.sh (whhuang4, Dec 8, 2022)
013a716  update gitignore (ConstaT99, Dec 9, 2022)
692a8c6  Upate identify_param (CarolSSS, Dec 9, 2022)
69a7bbd  update git ignore (ConstaT99, Dec 9, 2022)
951b650  fix storage not found (CarolSSS, Dec 9, 2022)
59cb74b  update gitignore (ConstaT99, Dec 9, 2022)
5179b52  Delete log (CarolSSS, Dec 9, 2022)
b7f18f6  Delete log (CarolSSS, Dec 9, 2022)
e8fff4b  update error of fogetting adding in const (CarolSSS, Dec 9, 2022)
51f7280  Merge branch 'main' of https://github.com/ConstaT99/openctest (ConstaT99, Dec 9, 2022)
14acea0  Merge branch 'xlab-uiuc:main' into main (ConstaT99, Dec 9, 2022)
5c62bf0  Merge branch 'main' of https://github.com/ConstaT99/openctest (ConstaT99, Dec 9, 2022)
3ebfd22  add patch (ConstaT99, Dec 9, 2022)
5130fd0  update getctest storage (CarolSSS, Dec 9, 2022)
039c1d6  update getctest storage (CarolSSS, Dec 9, 2022)
fb06a15  Update generate ctest (CarolSSS, Dec 9, 2022)
88467b7  push cube (CarolSSS, Dec 9, 2022)
01da495  Update hardcoded value (CarolSSS, Dec 9, 2022)
311127a  update map (ConstaT99, Dec 9, 2022)
6cab976  clean repo (ConstaT99, Dec 9, 2022)
c0652f3  Update add_project.sh (wenhsinghuang, Dec 9, 2022)
40950b6  Merge pull request #1 from xlab-uiuc/main (wenhsinghuang, Dec 9, 2022)
cfe4a23  add patches (whhuang4, Dec 9, 2022)
589cb42  cube done (ConstaT99, Dec 9, 2022)
2af7f3c  clean program input file (ConstaT99, Dec 9, 2022)
c1847a5  run ctest working (Dec 10, 2022)
705c63a  update nifi setup (Dec 10, 2022)
9c22266  add params (Dec 11, 2022)
444a7c2  modification of branches (Dec 11, 2022)
09c6e0d  modify identify runner (Dec 11, 2022)
09f1f7f  update add project (Dec 11, 2022)
5e74038  finish hive (Dec 11, 2022)
bbd2cce  adding patches (Dec 11, 2022)
1b93b40  added mapping (Dec 12, 2022)
9406529  finish nifi (Dec 12, 2022)
fee78df  push patch (Dec 12, 2022)
4204680  Merge remote-tracking branch 'upstream/hive' into main (ramyabygari, Dec 8, 2023)
a6bc7aa  Merge remote-tracking branch 'upstream/nifi' into main (ramyabygari, Dec 8, 2023)
d259c46  docker changes (ramyabygari, Dec 8, 2023)
adefe8d  flink tested and logging patch changed (ramyabygari, Dec 10, 2023)
943650a  camel tested (ramyabygari, Dec 10, 2023)
fbb9e91  hive, nifi, flink and camel work! (ramyabygari, Dec 10, 2023)
2bab8a1  yarn final checks done (ramyabygari, Dec 10, 2023)
fd0dfa8  all projects work! (ramyabygari, Dec 10, 2023)
53f77d9  changes in docker annd readme (ramyabygari, Dec 10, 2023)
623678e  deletion of redundant files (ramyabygari, Dec 10, 2023)
c5541d9  ctest_constchange (ramyabygari, Dec 10, 2023)
6b228f8  nifi-common corrected (ramyabygari, Dec 10, 2023)
2 changes: 1 addition & 1 deletion core/Dockerfile
@@ -7,7 +7,7 @@ RUN \
apt-get install -y git && \
# Install python
apt-get update && \
apt-get install -y python python-dev python-pip python-virtualenv && \
apt-get install -y python python-dev python-pip python-virtualenv python3-pip python3-virtualenv && \
rm -rf /var/lib/apt/lists/* && \
# Install misc
apt-get update && \
2 changes: 1 addition & 1 deletion core/README.md
@@ -59,7 +59,7 @@ To generate ctests or run ctest, you need to first clone the target project.
1. In `openctest/core`, run `./add_project.sh <main project>` to clone the project, switch to and build the branch `ctest-injection`. This branch will be later used by `generate_ctest` and `run_ctest`.
2. In `openctest/core/identify_param`, run `./add_project.sh <main project>` to clone the project, switch to and build the branch `ctest-logging`. This branch will be later used by `identify_param`.

`<main project>` can be `hadoop`, `hbase`, `zookeeper` or `alluxio`.
`<main project>` can be `hadoop`, `hbase`, `zookeeper`, `hive`, `alluxio`, `yarn`, `flink`, `kylin`, `camel` or `nifi`.

## Usage

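The `add_project.sh` changes below follow one pattern per project: clone the fork only if its `app/ctest-<project>` checkout is missing, pin the instrumented branch, then build with tests skipped. A minimal runnable sketch of that idempotent-clone pattern (`ctest-demo` is a hypothetical project name, and `mkdir`/`echo` stand in for `git clone` and `mvn` so the sketch runs offline):

```shell
# Pattern used by the new setup_<project> functions in core/add_project.sh.
# mkdir/echo are stand-ins for git clone and mvn, so this runs without network.
setup_demo() {
  # Clone only when the checkout does not exist yet, so re-runs are cheap.
  [ ! -d "app/ctest-demo" ] && mkdir -p "app/ctest-demo"
  cd "app/ctest-demo"
  echo "would run: git fetch && git checkout ctest-injection"
  echo "would run: mvn clean install -DskipTests"
  cd ../..
}

setup_demo   # first call creates app/ctest-demo
setup_demo   # second call skips the clone step
```

Re-running is safe because the `[ ! -d ... ]` guard makes the clone a no-op once the checkout exists; only the branch checkout and build are repeated.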
46 changes: 45 additions & 1 deletion core/add_project.sh
@@ -11,6 +11,8 @@ function setup_hadoop() {
mvn clean install -DskipTests
cd $home_dir/hadoop-hdfs-project/hadoop-hdfs
mvn package -DskipTests
cd $home_dir/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common
mvn package -DskipTests
}

function setup_hbase() {
@@ -47,6 +49,43 @@ function setup_alluxio() {
cd core
mvn clean install -DskipTests -Dcheckstyle.skip -Dlicense.skip -Dfindbugs.skip -Dmaven.javadoc.skip=true
}
function setup_nifi(){
[ ! -d "app/ctest-nifi" ] && git clone https://github.com/lilacyl/nifi.git app/ctest-nifi
cd app/ctest-nifi
git fetch && git checkout ctest-injection
mvn clean install -DskipTests
}

function setup_hive(){
[ ! -d "app/ctest-hive" ] && git clone https://github.com/lilacyl/hive.git app/ctest-hive
cd app/ctest-hive
git fetch && git checkout ctest-injection
cd common
mvn clean install -DskipTests
}

function setup_flink() {
[ ! -d "app/ctest-flink" ] && git clone https://github.com/jessicahuang523/flink app/ctest-flink
cd app/ctest-flink
git fetch && git checkout ctest-injection
cd flink-core
mvn clean install -DskipTests
}

function setup_camel() {
[ ! -d "app/ctest-camel" ] && git clone https://github.com/wenhsinghuang/camel.git app/ctest-camel
cd app/ctest-camel
git fetch && git checkout ctest-logging
mvn clean install -DskipTests
}


function setup_kylin(){
[ ! -d "app/ctest-kylin" ] && git clone https://github.com/rtao6/kylin.git app/ctest-kylin
cd app/ctest-kylin
git fetch && git checkout ctest-injection
mvn clean install -DskipTests -Dcheckstyle.skip -Dlicense.skip -Dfindbugs.skip -Dmaven.javadoc.skip=true
}

function usage() {
echo "Usage: add_project.sh <main project>"
@@ -64,7 +103,12 @@ function main() {
hbase) setup_hbase ;;
zookeeper) setup_zookeeper ;;
alluxio) setup_alluxio ;;
*) echo "Unexpected project: $project - only support hadoop, hbase, zookeeper and alluxio." ;;
hive) setup_hive ;;
nifi) setup_nifi ;;
flink) setup_flink ;;
camel) setup_camel ;;
kylin) setup_kylin ;;
*) echo "Unexpected project: $project - only support hadoop, hbase, zookeeper, alluxio, hive, nifi, flink, kylin and camel." ;;
esac
fi
}
1 change: 1 addition & 0 deletions core/app/ctest-hive
Submodule ctest-hive added at 0c8f09
63 changes: 60 additions & 3 deletions core/ctest_const.py
@@ -12,18 +12,36 @@
HBASE = "hbase-server"
ZOOKEEPER = "zookeeper-server"
ALLUXIO = "alluxio-core"
HIVE = "hive-common"
NIFI = "nifi-common"
FLINK = "flink-core"
CAMEL = "camel-core"
HYARNCOMMON = "hadoop-yarn-common"
KCOMMON = "kylin-common"


CTEST_HADOOP_DIR = os.path.join(APP_DIR, "ctest-hadoop")
CTEST_HBASE_DIR = os.path.join(APP_DIR, "ctest-hbase")
CTEST_ZK_DIR = os.path.join(APP_DIR, "ctest-zookeeper")
CTEST_ALLUXIO_DIR = os.path.join(APP_DIR, "ctest-alluxio")
CTEST_HIVE_DIR = os.path.join(APP_DIR, "ctest-hive")
CTEST_NIFI_DIR = os.path.join(APP_DIR, "ctest-nifi")
CTEST_FLINK_DIR = os.path.join(APP_DIR, "ctest-flink")
CTEST_CAMEL_DIR = os.path.join(APP_DIR, "ctest-camel")
CTEST_KYLIN_DIR = os.path.join(APP_DIR, "ctest-kylin")

PROJECT_DIR = {
HCOMMON: CTEST_HADOOP_DIR,
HDFS: CTEST_HADOOP_DIR,
HBASE: CTEST_HBASE_DIR,
ZOOKEEPER: CTEST_ZK_DIR,
ALLUXIO: CTEST_ALLUXIO_DIR,
HIVE: CTEST_HIVE_DIR,
NIFI: CTEST_NIFI_DIR,
FLINK: CTEST_FLINK_DIR,
CAMEL: CTEST_CAMEL_DIR,
HYARNCOMMON: CTEST_HADOOP_DIR,
KCOMMON: CTEST_KYLIN_DIR,
}


@@ -34,12 +52,20 @@
HBASE: "hbase-server",
ZOOKEEPER: "zookeeper-server",
ALLUXIO: "core",
HIVE: "common",
NIFI: "nifi-common",
FLINK: "flink-core",
CAMEL: "core/camel-core",
HYARNCOMMON: "hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common",
KCOMMON: "core-common",
}


# surefire report
SUREFIRE_SUBDIR = "target/surefire-reports/"
SUREFIRE_SUBDIR = "../target/surefire-reports/"
SUREFIRE_XML = "TEST-{}.xml" # slot is the classname
HIVE_SUREFIRE_XML = "TEST-org.apache.hadoop.hive.conf.{}.xml" # slot is the classname
SUREFIRE_XML_NIFI = "TEST-org.apache.nifi.util.{}.xml" # slot is the classname
SUREFIRE_TXT = "{}.txt" # testclass
SUREFIRE_OUTTXT = "{}-output.txt" #testclass

@@ -58,6 +84,12 @@
os.path.join(CTEST_ALLUXIO_DIR, MODULE_SUBDIR[ALLUXIO], "server/worker", SUREFIRE_SUBDIR),
os.path.join(CTEST_ALLUXIO_DIR, MODULE_SUBDIR[ALLUXIO], "server/master", SUREFIRE_SUBDIR),
],
HIVE: [os.path.join(CTEST_HIVE_DIR, MODULE_SUBDIR[HIVE], SUREFIRE_SUBDIR)],
NIFI: [os.path.join(CTEST_NIFI_DIR, MODULE_SUBDIR[NIFI],"nifi-properties", SUREFIRE_SUBDIR)],
FLINK: [os.path.join(CTEST_FLINK_DIR, MODULE_SUBDIR[FLINK], SUREFIRE_SUBDIR)],
CAMEL: [os.path.join(CTEST_CAMEL_DIR, MODULE_SUBDIR[CAMEL], SUREFIRE_SUBDIR)],
HYARNCOMMON: [os.path.join(CTEST_HADOOP_DIR, MODULE_SUBDIR[HYARNCOMMON], SUREFIRE_SUBDIR)],
KCOMMON: [os.path.join(CTEST_KYLIN_DIR, MODULE_SUBDIR[KCOMMON], SUREFIRE_SUBDIR)],
}

# default or deprecate conf path
@@ -74,7 +106,13 @@
HDFS: os.path.join(DEFAULT_CONF_DIR, HDFS + "-default.tsv"),
HBASE: os.path.join(DEFAULT_CONF_DIR, HBASE + "-default.tsv"),
ALLUXIO: os.path.join(DEFAULT_CONF_DIR, ALLUXIO + "-default.tsv"),
ZOOKEEPER: os.path.join(DEFAULT_CONF_DIR, ZOOKEEPER + "-default.tsv")
ZOOKEEPER: os.path.join(DEFAULT_CONF_DIR, ZOOKEEPER + "-default.tsv"),
HIVE: os.path.join(DEFAULT_CONF_DIR, HIVE + "-default.tsv"),
NIFI: os.path.join(DEFAULT_CONF_DIR, NIFI + "-default.tsv"),
FLINK: os.path.join(DEFAULT_CONF_DIR, FLINK + "-default.tsv"),
CAMEL: os.path.join(DEFAULT_CONF_DIR, CAMEL + "-default.tsv"),
HYARNCOMMON: os.path.join(DEFAULT_CONF_DIR, HYARNCOMMON + "-default.tsv"),
KCOMMON: os.path.join(DEFAULT_CONF_DIR, KCOMMON + "-default.tsv"),
}


@@ -96,7 +134,26 @@
],
ALLUXIO: [
os.path.join(CTEST_ALLUXIO_DIR, "core/alluxio-ctest.properties")
]
],
HIVE: [
os.path.join(CTEST_HIVE_DIR, "conf/hive-ctest.xml")
],
NIFI: [
os.path.join(CTEST_NIFI_DIR, "nifi-commons/nifi-properties/src/test/resources/NiFiProperties/conf/ctest.properties")
],
FLINK: [
os.path.join(CTEST_FLINK_DIR, "flink-core/core-ctest.yaml")
],
CAMEL: [
os.path.join(CTEST_CAMEL_DIR, "core/camel-core/camel-ctest.properties")
],
HYARNCOMMON: [
os.path.join(CTEST_HADOOP_DIR, "hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/target/classes/yarn-common-ctest.xml")
],
KCOMMON: [
os.path.join(CTEST_KYLIN_DIR, "core-common/src/main/resources/ctest.properties")
# os.path.join(CTEST_KYLIN_DIR, "core-common/target/ctest.properties")
],
}


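Taken together, the new `ctest_const.py` entries let the framework compose concrete surefire-report paths: `PROJECT_DIR` maps a module key to its checkout, `MODULE_SUBDIR` to the Maven module inside it, and `SUREFIRE_SUBDIR` to the report directory. A standalone sketch of that composition for hive-common (re-declaring a few constants rather than importing the module; the `target/surefire-reports/` value and the helper name are illustrative):

```python
import os

# Illustrative re-declarations of a few ctest_const-style constants.
APP_DIR = "app"
CTEST_HIVE_DIR = os.path.join(APP_DIR, "ctest-hive")
MODULE_SUBDIR = {"hive-common": "common"}
SUREFIRE_SUBDIR = "target/surefire-reports/"
HIVE_SUREFIRE_XML = "TEST-org.apache.hadoop.hive.conf.{}.xml"  # slot is the classname

def hive_report_path(test_class):
    """Path where surefire writes the XML report for one hive-common test class."""
    return os.path.join(CTEST_HIVE_DIR, MODULE_SUBDIR["hive-common"],
                        SUREFIRE_SUBDIR, HIVE_SUREFIRE_XML.format(test_class))

print(hive_report_path("TestHiveConf"))
# app/ctest-hive/common/target/surefire-reports/TEST-org.apache.hadoop.hive.conf.TestHiveConf.xml
```

The per-project `*_SUREFIRE_XML` templates exist because hive and nifi report classes under fully qualified package names, so the framework has to re-attach the package prefix when it looks a report up by class name.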
60 changes: 60 additions & 0 deletions core/default_configs/camel-core-default.tsv
@@ -0,0 +1,60 @@
cool.leading " Leading space" Default String
cool.trailing "Trailing space" Default String
cool.both " Both leading and trailing space" Default String
space.leading " \r\n" Default String
space.trailing "\t" Default String
space.both " \r \t \n" Default String
mixed.leading " Leading space\r\n" Default String
mixed.trailing "Trailing space\t" Default String
mixed.both " Both leading and trailing space\r \t \n" Default String
empty.line "" Default String
cool.end mock:result Default endpoint
cool.result result Default result
cool.result.xx result parameterized propertySuffix
cool.end.xx mock:result parameterized propertySuffix
cool.concat mock:{{cool.result}} parameterized propertySuffix
cool.concat.escaped mock:\\{{cool.result\\}}{"query":{"match_all":{}\\}} parameterized propertySuffix
cool.start direct:cool parameterized propertySuffix
cool.showid true parameterized propertySuffix
cool.name Camel parameterized propertySuffix
cool.other.name Cheese parameterized propertySuffix
cool.a {{cool.b}} circular reference test
cool.b {{cool.c}} circular reference test
cool.c {{cool.a}} circular reference test
cool.mock mock circular reference test
myCoolCharset iso-8859-1 circular reference test
slipDelimiter ## circular reference test
myQueueSize 10 circular reference test
myDelayPattern 3:10;5:30;10:50;20:100 circular reference test
stop true circular reference test
onlytwo 2 circular reference test
integration.ftpEnabled true circular reference test
cheese.end mock:cheese No description
cheese.quote Camel rocks No description
cheese.type Gouda No description
cheese.server http://mycoolserver No description
bean.foo foo No description
elephant Hello Thai Elephant จ No description
sslContextParameters.protocol TLS No description
keyManagersParameters.algorithm SunX509 No description
trustManagersParameters.algorithm PKIX No description
sslContextParameters.provider SunJSSE No description
sslContextParameters.sessionTimeout 2 No description
keyStoreParameters.provider SUN No description
trustManagersParameters.provider SunJSSE No description
keyStoreParamerers.password changeit No description
keyManagersParameters.keyPassword changeit No description
keyStoreParameters.resource org/apache/camel/support/jsse/localhost.p12 No description
cipherSuite.0 TLS_AES_256_GCM_SHA384 No description
filterParameters.exclude exclude No description
secureSocketProtocol.0 TLSv1.3 No description
secureRandomParameters.algorithm SHA1PRNG No description
sslContextServerParameters.clientAuthentication REQUIRE No description
keyStoreParameters.type pkcs12 No description
filterParameters.include include No description
keyManagersParameters.provider SunJSSE No description
secureRandomParameters.provider SUN No description
autoStartupProp true No description
noAutoStartupProp false No description
maxKeep 1 Hardcoded In OptionalPropertyPlaceholderTest.java
queue foo No description
32 changes: 32 additions & 0 deletions core/default_configs/flink-core-default.tsv
@@ -0,0 +1,32 @@
fs.default-scheme (none) The default filesystem scheme, used for paths that do not declare a scheme explicitly. May contain an authority, e.g. host:port in case of an HDFS NameNode.
fs.allowed-fallback-filesystems (none) A (semicolon-separated) list of file schemes, for which Hadoop can be used instead of an appropriate Flink plugin. (example: s3;wasb)
pipeline.auto-type-registration true Controls whether Flink is automatically registering all types in the user programs with Kryo.
pipeline.auto-generate-uids true When auto-generated UIDs are disabled, users are forced to manually specify UIDs on DataStream applications.It is highly recommended that users specify UIDs before deploying to production since they are used to match state in savepoints to operators in a job. Because auto-generated ID's are likely to change when modifying a job, specifying custom IDs allow an application to evolve over time without discarding state.
pipeline.auto-watermark-interval 0 ms The interval of the automatic watermark emission. Watermarks are used throughout the streaming system to keep track of the progress of time. They are used, for example, for time based windowing.
pipeline.closure-cleaner-level RECURSIVE Configures the mode in which the closure cleaner works.Possible values:"NONE": Disables the closure cleaner completely."TOP_LEVEL": Cleans only the top-level class without recursing into fields."RECURSIVE": Cleans all fields recursively.
pipeline.force-avro false Forces Flink to use the Apache Avro serializer for POJOs.Important: Make sure to include the flink-avro module.
pipeline.generic-types true If the use of generic types is disabled, Flink will throw an UnsupportedOperationException whenever it encounters a data type that would go through Kryo for serialization.Disabling generic types can be helpful to eagerly find and eliminate the use of types that would go through Kryo serialization during runtime. Rather than checking types individually, using this option will throw exceptions eagerly in the places where generic types are used.We recommend to use this option only during development and pre-production phases, not during actual production use. The application program and/or the input data may be such that new, previously unseen, types occur at some point. In that case, setting this option would cause the program to fail.
pipeline.force-kryo false If enabled, forces TypeExtractor to use Kryo serializer for POJOS even though we could analyze as POJO. In some cases this might be preferable. For example, when using interfaces with subclasses that cannot be analyzed as POJO.
pipeline.global-job-parameters (none) Register a custom, serializable user configuration object. The configuration can be accessed in operators
metrics.latency.interval 0 Defines the interval at which latency tracking marks are emitted from the sources. Disables latency tracking if set to 0 or a negative value. Enabling this feature can significantly impact the performance of the cluster.
state.backend.changelog.periodic-materialize.interval 10 min Defines the interval in milliseconds to perform periodic materialization for state backend. The periodic materialization will be disabled when the value is negative
state.backend.changelog.max-failures-allowed 3 Max number of consecutive materialization failures allowed.
pipeline.max-parallelism -1 The program-wide maximum parallelism used for operators which haven't specified a maximum parallelism. The maximum parallelism specifies the upper limit for dynamic scaling and the number of key groups used for partitioned state.
parallelism.default 1 Default parallelism for jobs.
pipeline.object-reuse false When enabled objects that Flink internally uses for deserialization and passing data to user-code functions will be reused. Keep in mind that this can lead to bugs when the user-code function of an operation is not aware of this behaviour.
task.cancellation.interval 30000 Time interval between two successive task cancellation attempts in milliseconds.
task.cancellation.timeout 180000 Timeout in milliseconds after which a task cancellation times out and leads to a fatal TaskManager error. A value of 0 deactivates the watch dog. Notice that a task cancellation is different from both a task failure and a clean shutdown. Task cancellation timeout only applies to task cancellation and does not apply to task closing/clean-up caused by a task failure or a clean shutdown.
execution.checkpointing.snapshot-compression false Tells if we should use compression for the state snapshot data or not
restart-strategy (none) Defines the restart strategy to use in case of job failures.Accepted values are:none, off, disable: No restart strategy.fixeddelay, fixed-delay: Fixed delay restart strategy. More details can be found here.failurerate, failure-rate: Failure rate restart strategy. More details can be found here.exponentialdelay, exponential-delay: Exponential delay restart strategy. More details can be found here.If checkpointing is disabled, the default value is none. If checkpointing is enabled, the default value is fixed-delay with Integer.MAX_VALUE restart attempts and '1 s' delay.
pipeline.default-kryo-serializers (none) Semicolon separated list of pairs of class names and Kryo serializers class names to be used as Kryo default serializersExample:class:org.example.ExampleClass,serializer:org.example.ExampleSerializer1; class:org.example.ExampleClass2,serializer:org.example.ExampleSerializer2
pipeline.registered-pojo-types (none) Semicolon separated list of types to be registered with the serialization stack. If the type is eventually serialized as a POJO, then the type is registered with the POJO serializer. If the type ends up being serialized with Kryo, then it will be registered at Kryo to make sure that only tags are written.
pipeline.registered-kryo-types (none) Semicolon separated list of types to be registered with the serialization stack. If the type is eventually serialized as a POJO, then the type is registered with the POJO serializer. If the type ends up being serialized with Kryo, then it will be registered at Kryo to make sure that only tags are written.
jobmanager.scheduler Default Determines which scheduler implementation is used to schedule tasks. Accepted values are:'Default': Default scheduler'Adaptive': Adaptive scheduler. More details can be found here.'AdaptiveBatch': Adaptive batch scheduler. More details can be found here.Possible values:"Default""Adaptive""AdaptiveBatch"
compiler.delimited-informat.max-line-samples 10 The maximum number of line samples taken by the compiler for delimited inputs. The samples are used to estimate the number of records. This value can be overridden for a specific input with the input format’s parameters.
compiler.delimited-informat.min-line-samples 2 The minimum number of line samples taken by the compiler for delimited inputs. The samples are used to estimate the number of records. This value can be overridden for a specific input with the input format’s parameters
compiler.delimited-informat.max-sample-len 2097152 The maximal length of a line sample that the compiler takes for delimited inputs. If the length of a single sample exceeds this value (possible because of misconfiguration of the parser), the sampling aborts. This value can be overridden for a specific input with the input format’s parameters.
cluster.intercept-user-system-exit DISABLED Flag to check user code exiting system by terminating JVM (e.g., System.exit()). Note that this configuration option can interfere with cluster.processes.halt-on-fatal-error: In intercepted user-code, a call to System.exit() will not cause the JVM to halt, when THROW is configured.Possible values:"DISABLED": Flink is not monitoring or intercepting calls to System.exit()"LOG": Log exit attempt with stack trace but still allowing exit to be performed"THROW": Throw exception when exit is attempted disallowing JVM termination
cluster.processes.halt-on-fatal-error false Whether processes should halt on fatal errors instead of performing a graceful shutdown. In some environments (e.g. Java 8 with the G1 garbage collector), a regular graceful shutdown can lead to a JVM deadlock. See FLINK-16510 for details.
fs.overwrite-files false Specifies whether file output writers should overwrite existing files by default. Set to "true" to overwrite by default,"false" otherwise.
fs.output.always-create-directory false File writers running with a parallelism larger than one create a directory for the output file path and put the different result files (one per parallel writer task) into that directory. If this option is set to "true", writers with a parallelism of 1 will also create a directory and place a single result file into it. If the option is set to "false", the writer will directly create the file directly at the output path, without creating a containing directory.
rest.bind-address (none) The address that the server binds itself.
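The new `*-default.tsv` files above feed the framework's value generation. A hypothetical sketch of loading such a file, assuming the tab-separated `<name> <default> <description>` layout seen in `flink-core-default.tsv` (the loader name and the inline sample rows are illustrative, not part of the PR):

```python
import csv
import io

# Two sample rows in the assumed <name>\t<default>\t<description> layout.
sample = (
    "parallelism.default\t1\tDefault parallelism for jobs.\n"
    "task.cancellation.interval\t30000\tTime interval between two successive "
    "task cancellation attempts in milliseconds.\n"
)

def load_defaults(text):
    """Map each configuration parameter name to its default value (a string)."""
    defaults = {}
    for row in csv.reader(io.StringIO(text), delimiter="\t"):
        if len(row) >= 2:  # skip blank or malformed lines
            defaults[row[0]] = row[1]
    return defaults

conf = load_defaults(sample)
print(conf["parallelism.default"])   # prints: 1
```

Defaults are kept as strings here; whether a value such as `(none)` or `10 min` needs further parsing is left to the consumer, mirroring how the tsv files mix plain values with unit-annotated ones.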