Too many maps, and how to fix it



So I've encountered an issue with my own greed. I REALLY want to have tons of maps in my game, but as I add them, the shaders start to break down. Is there a way to fix this? Should I just start deleting maps? Or is there some way to optimize the game, or to tweak OpenJK to increase the number of shaders?

 

[screenshot attached]

 


This is an issue I do not want to reproduce, installing that many mods sounds like too much of a hassle.

I read some of the related code to see if I could figure out what might theoretically be the issue. Your screenshot looks like SP to me, so I checked the SP code, but it should work the same in MP.

The first thing that jumped out is MAX_SHADER_FILES = 4096. You should be able to check whether that's the limit you're running into by adding a Com_Printf log like so:

	if ( numShaderFiles > MAX_SHADER_FILES ) {
		// without a log here, the excess shader files are silently dropped
		Com_Printf( "Too many shader files (%d > %d)\n", numShaderFiles, MAX_SHADER_FILES );
		numShaderFiles = MAX_SHADER_FILES;
	}

at the limit check in ScanAndLoadShaderFiles. But I suspect it's currently impossible to run into that, because you probably run into the MAX_FOUND_FILES limit first, which is also 4096. A message about that could go before the return in the limit check in FS_AddFileToList:

	if ( nfiles == MAX_FOUND_FILES - 1 ) {
		// log which files get dropped once the list is full
		Com_Printf( "MAX_FOUND_FILES (%d) exceeded, ignoring file %s\n", MAX_FOUND_FILES, name );
		return nfiles;
	}

If those are indeed the limits you're hitting, I would recommend you keep doubling both of them until it works. Don't immediately set them to something super large, as increasing the limits will increase the amount of RAM needed, and you only have so much of that.
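In case it's unclear what that means in practice: both are compile-time constants, so doubling them means editing the source and rebuilding. A minimal sketch of the first doubling step, assuming the defines live where I found them (tr_shader.cpp for the shader limit, files.cpp for the file list limit; the exact files may differ between OpenJK branches):

	// tr_shader.cpp -- was 4096
	#define	MAX_SHADER_FILES	8192

	// files.cpp -- was 4096
	#define	MAX_FOUND_FILES		8192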

Also be aware that a single broken shader file can break other files as well, which can also cause the kind of issue you're experiencing. When that happens, the broken mod needs to be removed or fixed.
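To illustrate the kind of breakage I mean, here's a hypothetical example with made-up texture paths: the engine concatenates all shader files into one buffer before parsing, so a missing closing brace in one file can swallow the definitions that come after it.

	textures/some_mod/wall
	{
		map textures/some_mod/wall.jpg
	// note the missing closing brace -- the parser keeps reading,
	// and the following shaders end up inside this broken definition

	textures/some_mod/floor
	{
		map textures/some_mod/floor.jpg
	}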

1 hour ago, mrwonko said:

If those are indeed the limits you're hitting, I would recommend you keep doubling both of them until it works. […]

Will try this

13 hours ago, AshuraDX said:

I have an unfinished project that was designed to identify duplicate files across pk3 archives and, later down the road, produce optimized packages.

If you know anything about Python, I could share the half-finished scripts.

My knowledge is very limited, but I have tweaked Blender plugins with Python before. Mind if I take a look?

1 hour ago, bigphil2695 said:

My knowledge is very limited, but I have tweaked Blender plugins with Python before. Mind if I take a look?

https://drive.google.com/file/d/13yBM74qAsU_o58YCRp-r9RUNjriw_doa/view?usp=sharing

This is a zip archive containing the script and its current output files in JSON format.

Currently this works by dropping it into your base folder and running it. The script will poop out three files:

  1. base_tree.json - A JSON file listing all pk3 archives and their included files, with all the details relevant for this script: the pk3 they're in, the date they were last changed, file size and md5sum
  2. duplicates_by_name.json - Files grouped by identical name/path. Lists all filenames that occur at least twice, alongside additional details
  3. duplicates_by_hash.json - Groups files by md5sum and lists all files that have the same content, regardless of name.

There is no GUI, no automatic resolution of duplicates or anything like that yet. But you could use this to find duplicate files and manually remove obsolete duplicates to reduce the overall number of files used.
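If you want the gist without reading the whole thing: the core idea is just hashing every file inside every pk3 (pk3s are plain zip archives). Roughly like this -- a simplified sketch of the duplicates_by_hash part, not the actual script:

	import hashlib
	import json
	import zipfile
	from collections import defaultdict
	from pathlib import Path

	# group every file in every pk3 by the md5sum of its content;
	# pk3 archives are ordinary zip files, so zipfile reads them directly
	by_hash = defaultdict(list)
	for pk3 in sorted(Path(".").glob("*.pk3")):
	    with zipfile.ZipFile(pk3) as archive:
	        for info in archive.infolist():
	            if info.is_dir():
	                continue
	            md5 = hashlib.md5(archive.read(info.filename)).hexdigest()
	            by_hash[md5].append({"pk3": pk3.name, "file": info.filename})

	# keep only hashes that occur more than once -- those are the duplicates
	duplicates = {h: files for h, files in by_hash.items() if len(files) > 1}
	with open("duplicates_by_hash.json", "w") as f:
	    json.dump(duplicates, f, indent=2)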

7 hours ago, AshuraDX said:

This is a zip archive containing the script and its current output files in JSON format. […]

Thank you very much. I'm no coder, but I'll see what use I can make of it.

