When I import a .csv file, I get the message "The following errors occurred during validation. Please correct the errors and import the file again. Error 94: The dataset has 134 columns, but there are 135 indicators in total"
Looking at previous posts, it was recommended to save the file in the older Excel 2003 .csv format. Although my desktop Excel doesn't offer an option to save as an earlier CSV version, I went to an old laptop that has Excel 2003 and saved the data file as a .csv on that computer. I also tried saving the file as a text file and creating a new project with it. In both cases, I get the same error message. I've checked the file multiple times and confirmed that all 135 columns exist and contain data.
Any advice on how to solve this so I can see the indicators listed and then put them in my model would be greatly appreciated!
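A quick way to narrow down a "134 columns vs. 135 indicators" mismatch is to count the fields in every row of the file yourself, since a single short or long row (often from a stray comma or an unquoted value) can make the importer see a different column count than Excel shows. This is a minimal stdlib sketch, not part of the PLS software; the filename is a placeholder for your own data file:

```python
import csv
from collections import Counter

def column_counts(path):
    """Tally how many rows have each field count, so an odd row stands out.

    A clean file returns a single entry, e.g. {135: 201}; a second entry
    points to the row length the importer is tripping over.
    """
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.reader(f):
            counts[len(row)] += 1
    return counts
```

For example, `column_counts("mydata.csv")` returning `{135: 200, 134: 1}` would mean one row is missing a field; printing the rows whose length differs from the header's would then locate it.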
Errors validating data file
Still didn't work
Dear Sarjana,
Thanks so much for responding to my query! Unfortunately, I am still having problems with both the txt and csv files.
I checked the .csv file, and it has values in either whole numbers or decimals (albeit occasionally in negative decimals, so if that is a problem, it would be helpful to know). So, having values should not be the problem.
I also tried to validate the data using a .txt file. With the .txt file, the problem is that all the data seems to be condensed under the first indicator: I see only one column of data, where each row is a long list of values separated by commas (instead of the multiple columns, one value per cell, that I see when selecting "comma" under "choose delimiter"). When I select "comma" for validation, I get the original error about too few columns. When I select any of the other delimiter options (e.g. "space"), I get the error message "Dataset 1 contains an invalid value [6,4,5...] in column 0", which repeats for "Dataset 2" onward.
Any other recommendations for solutions would be greatly appreciated!
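The "invalid value [6,4,5...] in column 0" error suggests the file really is comma-separated, and choosing any other delimiter leaves whole comma-joined rows in one column. Before importing, you can let Python's `csv.Sniffer` guess the delimiter from a sample of the file and confirm what the importer should be told. This is a generic stdlib sketch, independent of the PLS tool:

```python
import csv

def detect_delimiter(path, candidates=",;\t"):
    """Guess the field delimiter from the first few KB of a text data file.

    Returns the single character csv.Sniffer judges most likely among
    `candidates` (comma, semicolon, or tab by default).
    """
    with open(path, newline="") as f:
        sample = f.read(4096)
    return csv.Sniffer().sniff(sample, delimiters=candidates).delimiter
```

If this reports `","` but selecting "comma" in the importer still fails, the mismatch is more likely a row-length problem (an extra or missing field somewhere) than a delimiter problem.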
I'm having the same problem
Please, have you solved this? If yes, could you tell me how?
Thank you so much.
Solved: Errors validating data file
Hi, I solved the problem.
Thanks to Mr. Hengky Latan, Super Expert PLS from Olah Data. He suggested that one possible cause was having fewer samples than indicators (samples < indicators). I ran into the problem because I had used every questionnaire item score as an indicator: 77 indicators in total, but only 64 samples. I reduced the number of indicators by grouping the items belonging to the same dimension. That brought the count down to 32 indicators, and the problem was solved.
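The grouping step described above (combining items of the same dimension into one composite indicator, sometimes called item parceling) can be sketched in plain Python. The dimension names and item keys below are made-up examples, and averaging is just one common way to combine items:

```python
from statistics import mean

def parcel(row, groups):
    """Collapse item-level scores into one composite score per dimension.

    `row` maps item names to scores for one respondent; `groups` maps each
    dimension name to the list of item names it covers. Each dimension's
    composite is the mean of its items' scores.
    """
    return {dim: mean(row[item] for item in items)
            for dim, items in groups.items()}
```

For example, with `groups = {"dimA": ["q1", "q2"], "dimB": ["q3", "q4"]}` and a respondent `{"q1": 4, "q2": 5, "q3": 3, "q4": 4}`, `parcel` yields two composite indicators instead of four items, which is how 77 indicators can shrink below the sample size of 64.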