
some errors #4

Open

bigleaffrog opened this issue Oct 5, 2023 · 3 comments

@bigleaffrog

In the "lad_topic_model.py" file, I made a modification to the line:

temp = OneHotEncoder(sparse=False, handle_unknown='ignore', categories=np.arange(vocab_size).reshape([1, vocab_size]))
.fit_transform(word_labels.reshape([-1,1])) * word_scores

I modified it to:

temp = OneHotEncoder(sparse=False, handle_unknown='ignore', categories=[np.arange(vocab_size)])
.fit_transform(word_labels.reshape([-1, 1])) * word_scores
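For context, here is a minimal standalone check of the fixed call (my own toy values, not code from this repo) that behaves as expected on recent scikit-learn:

```python
# Minimal standalone check (not from the repo) of the `categories` contract in
# recent scikit-learn: it must be 'auto' or a list with one array-like per
# feature, not a 2-D array. `vocab_size` is a small stand-in value here.
import numpy as np
from sklearn.preprocessing import OneHotEncoder

vocab_size = 5
word_labels = np.array([0, 2, 2, 4])

enc = OneHotEncoder(sparse=False, handle_unknown='ignore',
                    categories=[np.arange(vocab_size)])  # list: one entry per feature
one_hot = enc.fit_transform(word_labels.reshape([-1, 1]))
print(one_hot.shape)  # (4, 5) -- one dense row per label
```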

This change fixes the `OneHotEncoder` error. However, there is still an error in the "loss.py" file, specifically in the line:

```python
loss = -attr * loss_mask.cuda() * func.log_softmax(feat, 1)
```

but I don't know how to fix it.
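For reference, one direction I considered (only a guess, since I have not isolated the cause) is making the line device-agnostic, in case the hard-coded `.cuda()` produces a device mismatch; the helper name `masked_attr_loss` and the final reduction below are my own assumptions:

```python
# Illustrative sketch only -- the actual error from loss.py was never posted.
# Moving the mask to feat's device instead of hard-coding .cuda() avoids a
# device mismatch when tensors live on the CPU (an assumption about the bug).
import torch
import torch.nn.functional as func

def masked_attr_loss(attr, loss_mask, feat):
    # attr, loss_mask, feat: (batch, num_classes)-shaped tensors, as in the
    # original line; only the .cuda() call and the reduction differ here.
    loss = -attr * loss_mask.to(feat.device) * func.log_softmax(feat, dim=1)
    return loss.sum(dim=1).mean()  # reduction is my guess; the original isn't shown
```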

Looking forward to your reply.

@Alien9427
Owner

Hi,

Could you please post the error message?

@bigleaffrog
Author

Thank you for your attention. I cannot upload a screenshot, so I am pasting the error as text below. I look forward to your reply.
```
Traceback (most recent call last):
  File "/mnt/data/DataSetOfDL/SeaIceData/XAI4SAR-PGIL-main/src/ICE_PGN_train.py", line 174, in <module>
    train(config)
  File "/mnt/data/DataSetOfDL/SeaIceData/XAI4SAR-PGIL-main/src/ICE_PGN_train.py", line 138, in train
    attr = get_BoT(scat_paths, kmeans_model, lda_model, config['topic_num'])  # shape: batchsize * topic_num
  File "/mnt/data/DataSetOfDL/SeaIceData/XAI4SAR-PGIL-main/src/ICE_PGN_train.py", line 98, in get_BoT
    test_corpus = gen_corpus(test_docs, kmeans, word_win, stride)
  File "/mnt/data/DataSetOfDL/SeaIceData/XAI4SAR-PGIL-main/src/lda_topic_model.py", line 68, in gen_corpus
    temp = OneHotEncoder(sparse=False, handle_unknown='ignore', categories=np.arange(vocab_size).reshape([1,vocab_size]))
  File "/mnt/softs/miniconda3/envs/pytorch1/lib/python3.9/site-packages/sklearn/utils/_set_output.py", line 140, in wrapped
    data_to_wrap = f(self, X, *args, **kwargs)
  File "/mnt/softs/miniconda3/envs/pytorch1/lib/python3.9/site-packages/sklearn/base.py", line 915, in fit_transform
    return self.fit(X, **fit_params).transform(X)
  File "/mnt/softs/miniconda3/envs/pytorch1/lib/python3.9/site-packages/sklearn/base.py", line 1144, in wrapper
    estimator._validate_params()
  File "/mnt/softs/miniconda3/envs/pytorch1/lib/python3.9/site-packages/sklearn/base.py", line 637, in _validate_params
    validate_parameter_constraints(
  File "/mnt/softs/miniconda3/envs/pytorch1/lib/python3.9/site-packages/sklearn/utils/_param_validation.py", line 95, in validate_parameter_constraints
    raise InvalidParameterError(
sklearn.utils._param_validation.InvalidParameterError: The 'categories' parameter of OneHotEncoder must be a str among {'auto'} or an instance of 'list'. Got array([[  0,   1,   2, ..., 497, 498, 499]]) instead.
```

@Alien9427
Owner

The error can occur when using a different version of the scikit-learn package. With scikit-learn 0.23.2, the original code works fine.
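If pinning to 0.23.2 is not an option, a version-tolerant variant of that call might look like the sketch below (illustrative, not repo code; the helper name `encode_words` is mine, and it assumes `word_scores` broadcasts against the one-hot matrix as in the original line):

```python
# Version-tolerant sketch: `categories` as a list of per-feature arrays is
# accepted by both old and new scikit-learn, and `sparse` was renamed
# `sparse_output` in scikit-learn 1.2 (and removed in 1.4).
import numpy as np
from sklearn.preprocessing import OneHotEncoder

def encode_words(word_labels, word_scores, vocab_size):
    kwargs = dict(handle_unknown='ignore', categories=[np.arange(vocab_size)])
    try:
        enc = OneHotEncoder(sparse_output=False, **kwargs)  # scikit-learn >= 1.2
    except TypeError:
        enc = OneHotEncoder(sparse=False, **kwargs)         # scikit-learn < 1.2
    return enc.fit_transform(word_labels.reshape([-1, 1])) * word_scores
```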
