Log outputs useless CSV files if region defaults to comma as decimal separator

Valantar

New Member
This is a rather trivial bug, but also an incredibly annoying one to come across after having run a dozen benchmark runs with different hardware configs that now can't be graphed properly. The issue: my PC's region is set to Norway, which means Windows defaults to a comma as the decimal separator, while HWiNFO also defaults to a comma as the CSV item separator.

At the absolute minimum, HWiNFO ought to give a visible warning in the sensor settings if the CSV item separator and decimal separator are set to the same thing, as this renders logs useless. But ideally, there should be no way for this to be the default setting, regardless of region settings.

Beyond this, does anyone know of any semi-automated way of fixing this? Having a dozen files with >500 entries for a whole bunch of sensors, I can import them into Excel and delete the unnecessary columns containing only decimal data (I don't care whether my clock speeds are logged as 4650MHz or 4650.1MHz). But that still leaves me with thousands and thousands of lines where I'd need to manually re-enter the data - at least I don't know of any simple way of making Excel merge two cells with one being inserted after the decimal point in the other. I guess it would be possible to make some sort of script to fix this semi-automatically too (say, one that reads each header, asks if its data has decimals, and if yes replaces the next comma in the CSV with a period). Then again, I know absolutely nothing about programming.
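For anyone who does know a little scripting, the merge step I can't do in Excel would look roughly like this in Python (just a sketch - the example row, the header names and the choice of which columns have decimals are all made up):

Code:
# Sketch only: re-join the integer and fractional parts of the columns known to hold decimals.
# Which columns those are would have to be answered by hand, as described above.
fields = "08:15:30,4650,1,1200".split(",")    # made-up row where a clock of 4650,1 MHz was split in two
has_decimals = [False, True, False]           # Time, Core Clock [MHz], Fan [RPM] - made-up headers

merged, i = [], 0
for decimal in has_decimals:
    if decimal:
        merged.append(fields[i] + "." + fields[i + 1])  # put the decimal point back, drop the extra comma
        i += 2
    else:
        merged.append(fields[i])
        i += 1

print(merged)  # ['08:15:30', '4650.1', '1200']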
 
You can easily change the CSV separator in sensor settings.
Probably the best way to 'fix' already created CSV files is to temporarily change your Windows regional settings and convert the CSV file to a different format.
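For future logs written with a distinct item separator (a semicolon, for example), the comma decimals then parse cleanly in most tools. A minimal sketch, assuming Python with pandas and a placeholder file name:

Code:
# Sketch only: "sensors.CSV" is a placeholder, and the separator/decimal characters
# are assumptions about how the log was configured, not HWiNFO defaults.
import pandas as pd

df = pd.read_csv("sensors.CSV", sep=";", decimal=",")
print(df.head())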
 
Thanks for the reply! I know (now) that I can change it, but I never considered that this might be necessary - partly because I wasn't familiar with CSV files before this, but also because I trusted the default settings to be safe where I obviously shouldn't have. Sadly there's no simple way to fix this - it's not like Excel (or anything else) can tell which comma is a decimal separator and which is an item separator; they're all the same as far as any software is concerned. Attempting to import this into Excel just results in a garbled mess like this:
[Attached screenshot of the garbled Excel import: xWjMngu.png]

After all, no software outside of HWiNFO knows which fields have decimal values and which don't. So the "virtual memory load" column has a single decimal, whose fractional part is instead read as the data for the "physical memory used" column, shifting everything after it over by one column. And this happens for every column with decimal data.
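A made-up example of the shift (the header names are the ones mentioned above, the numbers are just for illustration):

Code:
# Illustration only: one comma-decimal value adds an extra field to the row,
# so every later column lands one position too far to the right.
header = "Time,Virtual Memory Load [%],Physical Memory Used [MB]"
row = "12:00:01,42,5,8192"   # a load of 42,5 % written with a comma decimal

print(header.split(","))     # 3 column names
print(row.split(","))        # ['12:00:01', '42', '5', '8192'] - 4 fields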

I'm entirely sure this is fixable with some relatively simple scripting (which could probably even be done in Excel), but that's way above my skill level.

At the very least, please add some kind of safeguard in the software that reverts these settings to a safe default if regional settings or anything else results in the two separators being the same - default settings producing unusable output is a pretty serious bug, imo.
 
I'm afraid this is not easily fixable. There are hundreds of possible sensors, each with a variable number of decimal digits that can also be changed by the user. It would be much easier to generate a new logfile.
 
I know. It would essentially require a semi-manual script that parses the CSV file, reads through each header sequentially, asks something like "does this field have decimals, y/n", and if yes adds a decimal point and deletes the following comma in the corresponding data field. It would of course need to do that for every row in the file (or remember the pattern and repeat it for each of them). For now, I guess I'll have to settle for two days of benchmarking and a water loop rebuild being wasted, and try to salvage what I can. But for the sake of future users, please add some sort of safety mechanism to avoid this.
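Just to spell out what I mean by that semi-manual script, here is a rough sketch in Python, reusing the merge loop from the earlier snippet. The file names are placeholders, and the layout of the log (a single header row, one extra field per decimal column in every data row) is assumed rather than verified:

Code:
# repair_csv.py - semi-manual repair sketch, assumptions as described above.
import sys

def repair(in_path, out_path):
    with open(in_path, encoding="utf-8", errors="replace") as f:
        lines = [line.rstrip("\n") for line in f if line.strip()]

    headers = lines[0].split(",")

    # Ask once per header whether its values contain decimals.
    has_decimals = [input(f"Does '{h}' have decimals? (y/n): ").strip().lower() == "y"
                    for h in headers]

    fixed = [";".join(headers)]            # write the output with ';' as the item separator
    for line in lines[1:]:
        fields = line.split(",")
        merged, i = [], 0
        for decimal in has_decimals:
            if decimal and i + 1 < len(fields):
                merged.append(fields[i] + "." + fields[i + 1])   # re-join the split value
                i += 2
            else:
                merged.append(fields[i] if i < len(fields) else "")
                i += 1
        fixed.append(";".join(merged))

    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(fixed) + "\n")

if __name__ == "__main__":
    repair(sys.argv[1], sys.argv[2])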
 