Initial Documentation

Farewell_ 2025-01-20 23:19:14 +01:00
commit 92be0ecac4
29 changed files with 2371 additions and 0 deletions

README.md Normal file

@@ -0,0 +1,88 @@
# Taiko Nijiiro
The tooling used to generate updates
## General information
The AtomCity cabinet is a [Chinese bootleg of Taiko 14](https://web.archive.org/web/20241122064431/https://www.gztomy.com/products/taiko-no-tatsujin-14-chine-direct).
It has been modified to support the latest version of the game.
![Taiko cabinet](./images/bootleg_cab.png)
## Table of contents
- [Taiko Nijiiro](#taiko-nijiiro)
  - [General information](#general-information)
  - [Table of contents](#table-of-contents)
  - [Software](#software)
    - [Game](#game)
    - [Server](#server)
      - [Version](#version)
      - [Administration](#administration)
    - [Tooling](#tooling)
      - [Scripts](#scripts)
      - [Assets](#assets)
  - [Hardware](#hardware)
    - [PC](#pc)
    - [Screen](#screen)
    - [Drums](#drums)
      - [IO Boards](#io-boards)
      - [Design](#design)
    - [Card reader](#card-reader)
## Software
### Game
The version currently running on the cabinet is the 148th update of the game (39.06-JPN), [**Taiko Nijiiro**](https://wikiwiki.jp/taiko-fumen/%E4%BD%9C%E5%93%81/%E6%96%B0AC/%E3%82%A2%E3%83%83%E3%83%97%E3%83%87%E3%83%BC%E3%83%88%E5%B1%A5%E6%AD%B4/%E3%83%8B%E3%82%B8%E3%82%A4%E3%83%AD2023#update148), dated 19/04/2023. The game is launched through [TaikoArcadeLoader](https://github.com/esuo1198/TaikoArcadeLoader) (build [#191](https://github.com/esuo1198/TaikoArcadeLoader/actions/runs/12494345054)).
### Server
The game server is reachable at [tatsuj.in](https://tatsuj.in). It is also available at the old URL [taiko.farewell.dev](https://taiko.farewell.dev).
It is currently hosted on my NAS (KIT!).
The cabinet is connected to the server through a WireGuard VPN in order to protect the backend. The server's IP on that network is `192.168.1.25`.
The WebUI is the only part exposed to the internet.
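A quick way to confirm the cabinet can still see the backend over the tunnel is a plain TCP connect against the VPN address above. This is only a sketch: `192.168.1.25` comes from this section, but the port is a placeholder to adjust to whatever the backend actually listens on.

```python
import socket

def backend_reachable(host: str = "192.168.1.25", port: int = 80, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refusals, timeouts and unreachable networks
        return False
```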
#### Version
The game server runs the development branch of [TaikoLocalServer](https://github.com/asesidaa/TaikoLocalServer/commits/dev) (commit [21d5bc9](https://github.com/asesidaa/TaikoLocalServer/commit/21d5bc97bb45fcd1cbf40fcdd3e9c8fb0f897d7f)).
#### Administration
The users with ADMIN rights on the server are **Siphonight** and myself, **KIT!**.
Please send us any moderation-related requests!
### Tooling
#### Scripts
The [Tooling](./tooling/) folder gathers the scripts used to generate the game's databases. They are not functional as-is: the server database dumps contain the players' hashed passwords, so those dumps have been omitted.
#### Assets
The [Assets](./tooling/Assets/) folder gathers the mods deployed on the cabinet:
- [Display Card ID & Update old Card format](./tooling/Assets/Display%20Card%20ID%20&%20Update%20old%20Card%20format/): server source file that displays a QR code pointing to the WebUI, along with the card number for unregistered players, to ease their onboarding.
- [New Song Intro List](./tooling/Assets/New%20Song%20Intro%20List/): list of songs added to the game's attract mode.
- [QR Login](./tooling/Assets/QRLogin/): image file of the QR code shown when a player is not registered.
- [Unlimited Random Select](./tooling/Assets/Unlimited%20Random%20Select/): Lua mod that allows picking random songs indefinitely (the game normally only lets you pick a random song once).
## Hardware
### PC
### Screen
### Drums
#### IO Boards
#### Design
### Card reader
The [tal-cardreader](https://gitea.farewell.dev/AkaiiKitsune/tal-cardreader/src/branch/atomcity/) plugin was developed to inject EAMUSEMENT-IC-compatible cards into the game.
The cabinet currently uses a [PN5180-cardio](https://github.com/CrazyRedMachine/PN5180-cardio) reader.
![PN5180-cardio](./images/cardio.png)

configs/config.toml Normal file

@@ -0,0 +1,88 @@
[amauth]
server = "127.0.0.1"
port = "54430"
chassis_id = "284111080000"
shop_id = "ATOMCITY"
game_ver = "39.06"
country_code = "FRA"
[patches]
version = "JPN39" # Patch version
# | - auto: hash detection (you need to use the original exe otherwise it will not load).
# | - JPN00: For use with Taiko JPN 00.18
# | - JPN08: For use with Taiko JPN 08.18
# | - JPN39: For use with Taiko JPN 39.06
# | - CHN00: For use with Taiko CHN 00.32
unlock_songs = true
[patches.chn00] # These patches are only available for version CHN00
fix_language = false # Sync test mode language to attract etc
demo_movie = true # Show demo movie
mode_collabo025 = false # Enable one piece collab mode
mode_collabo026 = false # Enable ai soshina mode
[patches.jpn39] # These patches are only available for version JPN39
fix_language = true # Sync test mode language to attract etc
chs_patch = false # Use Chinese font and Simplified Chinese values from the wordlist
# More options are available in the ModManager, in the TestMode menu (Default key is F1)
[emulation]
usio = true # Disable this if you want to use an original Namco USIO board. You need to place bnusio_original.dll (an unmodified bnusio.dll) in the executable folder.
card_reader = true # Disable this if you want to use an original Namco card reader
accept_invalid = true # Enable this if you want to accept cards incompatible with the original readers
qr = true # Disable this if you want to use an original Namco QR code scanner
[graphics]
res = { x = 1920, y = 1080 }
windowed = false
vsync = true
fpslimit = 0
[audio]
wasapi_shared = false # WASAPI shared mode; allows multiple audio sources at once at the cost of higher latency.
asio = true # Use asio audio mode
asio_driver = "ASIO4ALL v2" # Asio driver name
# | If you're not using asio4all, open up regedit then navigate to HKEY_LOCAL_MACHINE\SOFTWARE\ASIO for your driver's name.
# | It is case sensitive.
[qr]
image_path = "" # Path to the image of the QR Code you want to use
[qr.data] # QR data used for other events (ex. gaiden, custom folder)
serial = "" # QR serial
type = 0 # QR type
# | 0: default (serial only)
# | 5: custom folder
song_no = [] # Song numbers used for the custom folder
[controller]
wait_period = 2 # Input interval (if using taiko drum controller, should be set to 0)
analog_input = false # Use analog input (you need a compatible controller, this allows playing small and big notes like on arcade cabinets)
[keyboard]
auto_ime = false # Automatically change to english ime mode upon game startup
jp_layout = false # Use jp layout scan code (if using jp layout keyboard, must be set to true)
[layeredfs]
enabled = false # Replace assets from the game using a layered file system.
# | For example if you want to edit the wordlist, add your edited version like so:
# | .\Data_mods\x64\datatable\wordlist.json
# | You can provide both unencrypted and encrypted files.
[logging]
log_level = "INFO" # Log level, can be one of "NONE", "ERROR", "WARN", "INFO", "DEBUG" or "HOOKS"
# | Keep this as low as possible (INFO is usually more than enough), as more logging will slow down your game
log_to_file = false # Log to file; set this to true to save the logs from your last session to TaikoArcadeLoader.log
# | Again, if you do not have a use for this (debugging mods or whatnot), turn it off.

configs/keyconfig.toml Normal file

@@ -0,0 +1,40 @@
EXIT = ["ESCAPE"]
TEST = ["F1"]
SERVICE = ["F2"]
DEBUG_UP = ["UPARROW"]
DEBUG_DOWN = ["DOWNARROW"]
DEBUG_ENTER = ["ENTER"]
COIN_ADD = ["ENTER", "SDL_START"]
CARD_INSERT_1 = ["F3"]
CARD_INSERT_2 = ["F4"]
QR_DATA_READ = ["Q"]
QR_IMAGE_READ = ["W"]
P1_LEFT_BLUE = ["D", "SDL_LTRIGGER"]
P1_LEFT_RED = ["F", "SDL_LSTICK_PRESS"]
P1_RIGHT_RED = ["J", "SDL_RSTICK_PRESS"]
P1_RIGHT_BLUE = ["K", "SDL_RTRIGGER"]
P2_LEFT_BLUE = ["Z"]
P2_LEFT_RED = ["X"]
P2_RIGHT_RED = ["C"]
P2_RIGHT_BLUE = ["V"]
# ESCAPE F1 through F12
# ` 1 through 0 -= BACKSPACE ^ YEN
# TAB QWERTYUIOP [ ] BACKSLASH @
# CAPS_LOCK ASDFGHJKL ;' ENTER :
# SHIFT ZXCVBNM , . SLASH
# CONTROL L_WIN ALT SPACE R_WIN MENU
# SCROLL_LOCK PAUSE INSERT DELETE HOME END PAGE_UP PAGE_DOWN
# UPARROW LEFTARROW DOWNARROW RIGHTARROW
# NUM0 through NUM9 NUM_LOCK DIVIDE MULTIPLY SUBTRACT ADD DECIMAL
# SCROLL_UP SCROLL_DOWN
# SDL_A SDL_B SDL_X SDL_Y
# SDL_BACK SDL_GUIDE SDL_START
# SDL_LSHOULDER SDL_LTRIGGER SDL_RSHOULDER SDL_RTRIGGER
# SDL_DPAD_UP SDL_DPAD_LEFT SDL_DPAD_DOWN SDL_DPAD_RIGHT
# SDL_MISC SDL_PADDLE1 SDL_PADDLE2 SDL_PADDLE3 SDL_PADDLE4 SDL_TOUCHPAD
# SDL_LSTICK_UP SDL_LSTICK_LEFT SDL_LSTICK_DOWN SDL_LSTICK_RIGHT SDL_LSTICK_PRESS
# SDL_RSTICK_UP SDL_RSTICK_LEFT SDL_RSTICK_DOWN SDL_RSTICK_RIGHT SDL_RSTICK_PRESS

images/bootleg_cab.png Normal file

Binary file not shown. (Size: 2.5 MiB)

images/cardio.png Normal file

Binary file not shown. (Size: 840 KiB)

tooling/.env Normal file

@@ -0,0 +1 @@
DBNAME=taiko20241125.db3

@@ -0,0 +1,223 @@
using GameDatabase.Context;
using Throw;

namespace TaikoLocalServer.Handlers;

public record BaidQuery(string AccessCode) : IRequest<CommonBaidResponse>;

public class BaidQueryHandler(
    TaikoDbContext context,
    ILogger<BaidQueryHandler> logger,
    IGameDataService gameDataService)
    : IRequestHandler<BaidQuery, CommonBaidResponse>
{
    public async Task<CommonBaidResponse> Handle(BaidQuery request, CancellationToken cancellationToken)
    {
        var card = await context.Cards.FindAsync(request.AccessCode);
        var oldAccesscode = ConvertOldUID(request.AccessCode);
        var oldcard = await context.Cards.FindAsync(oldAccesscode);
        if (oldcard is not null)
        {
            logger.LogInformation("{AccessCode} converts to {Converted}, updating...", request.AccessCode, oldcard.AccessCode);
            if (card is null)
            {
                logger.LogInformation("{newCard} wasn't bound, updating...", request.AccessCode);
                Card newCard = new Card();
                newCard.Baid = oldcard.Baid;
                newCard.AccessCode = request.AccessCode;
                await context.Cards.AddAsync(newCard);
                card = newCard;
            }
            else logger.LogInformation("{newCard} was already bound!", request.AccessCode);
            context.Cards.Remove(oldcard);
            await context.SaveChangesAsync(cancellationToken);
        }
        if (card is null)
        {
            logger.LogInformation("New user with access code {AccessCode}", request.AccessCode);
            return new CommonBaidResponse
            {
                Result = 1,
                IsNewUser = true,
                Baid = context.Cards.Any() ? context.Cards.AsEnumerable().Max(c => c.Baid) + 1 : 1
            };
        }
        var baid = card.Baid;
        var credential = context.Credentials.Where(cred => cred.Baid == baid).First();
        var userData = await context.UserData.FindAsync(baid, cancellationToken);
        userData.ThrowIfNull($"User not found for card with Baid {baid}!");
        var songBestData = context.SongBestData.Where(datum => datum.Baid == baid).ToList();
        var achievementDisplayDifficulty = userData.AchievementDisplayDifficulty;
        if (achievementDisplayDifficulty == Difficulty.None)
        {
            achievementDisplayDifficulty = songBestData
                .Where(datum => datum.BestCrown >= CrownType.Clear)
                .Select(datum => datum.Difficulty)
                .DefaultIfEmpty(Difficulty.Easy)
                .Max();
        }
        // For each crown type, calculate how many songs have that crown type
        var crownCountData = songBestData
            .Where(datum => datum.BestCrown >= CrownType.Clear)
            .GroupBy(datum => datum.BestCrown)
            .ToDictionary(datums => datums.Key, datums => (uint)datums.Count());
        var crownCount = new uint[3];
        foreach (var crownType in Enum.GetValues<CrownType>())
        {
            if (crownType != CrownType.None)
            {
                crownCount[(int)crownType - 1] = crownCountData.GetValueOrDefault(crownType, (uint)0);
            }
        }
        var scoreRankData = songBestData
            .Where(datum => datum.BestCrown >= CrownType.Clear)
            .GroupBy(datum => datum.BestScoreRank)
            .ToDictionary(datums => datums.Key, datums => (uint)datums.Count());
        var scoreRankCount = new uint[7];
        foreach (var scoreRank in Enum.GetValues<ScoreRank>())
        {
            if (scoreRank != ScoreRank.None)
            {
                scoreRankCount[(int)scoreRank - 2] = scoreRankData.GetValueOrDefault(scoreRank, (uint)0);
            }
        }
        List<uint> costumeData = [userData.CurrentKigurumi, userData.CurrentHead, userData.CurrentBody, userData.CurrentFace, String.IsNullOrEmpty(credential.Password) ? 200 : userData.CurrentPuchi];
        List<List<uint>> costumeArrays =
            [userData.UnlockedKigurumi, userData.UnlockedHead, userData.UnlockedBody, userData.UnlockedFace, userData.UnlockedPuchi];
        var costumeFlagArrays = gameDataService.GetCostumeFlagArraySizes()
            .Select((size, index) => FlagCalculator.GetBitArrayFromIds(costumeArrays[index], size, logger))
            .ToList();
        var danData = await context.DanScoreData
            .Where(datum => datum.Baid == baid && datum.DanType == DanType.Normal)
            .Include(datum => datum.DanStageScoreData).ToListAsync(cancellationToken);
        var gaidenData = await context.DanScoreData
            .Where(datum => datum.Baid == baid && datum.DanType == DanType.Gaiden)
            .Include(datum => datum.DanStageScoreData).ToListAsync(cancellationToken);
        var maxDan = danData.Where(datum => datum.ClearState != DanClearState.NotClear)
            .Select(datum => datum.DanId)
            .DefaultIfEmpty()
            .Max();
        var danDataDictionary = gameDataService.GetCommonDanDataDictionary();
        var danIdList = danDataDictionary.Keys.ToList();
        var gotDanFlagArray = FlagCalculator.ComputeGotDanFlags(danData, danIdList);
        var gaidenDataDictionary = gameDataService.GetCommonGaidenDataDictionary();
        var gaidenIdList = gaidenDataDictionary.Keys.ToList();
        var gotGaidenFlagArray = FlagCalculator.ComputeGotDanFlags(gaidenData, gaidenIdList);
        var genericInfoFlg = userData.GenericInfoFlgArray;
        var genericInfoFlgLength = genericInfoFlg.Any() ? genericInfoFlg.Max() + 1 : 0;
        var genericInfoFlgArray = FlagCalculator.GetBitArrayFromIds(genericInfoFlg, (int)genericInfoFlgLength, logger);
        var aiRank = (uint)(userData.AiWinCount / 10);
        if (aiRank > 10)
        {
            aiRank = 10;
        }
        return new CommonBaidResponse
        {
            Result = 1,
            IsNewUser = false,
            Baid = baid,
            MyDonName = userData.MyDonName,
            MyDonNameLanguage = userData.MyDonNameLanguage,
            AryCrownCounts = crownCount,
            AryScoreRankCounts = scoreRankCount,
            ColorBody = userData.ColorBody,
            ColorFace = userData.ColorFace,
            ColorLimb = userData.ColorLimb,
            CostumeData = costumeData,
            CostumeFlagArrays = costumeFlagArrays,
            DisplayDan = userData.DisplayDan,
            DispAchievementType = (uint)achievementDisplayDifficulty,
            GenericInfoFlg = genericInfoFlgArray,
            GotDanFlg = gotDanFlagArray,
            GotDanMax = maxDan,
            GotGaidenFlg = gotGaidenFlagArray,
            IsDispAchievementOn = userData.DisplayAchievement,
            LastPlayDatetime = userData.LastPlayDatetime.ToString(Constants.DateTimeFormat),
            LastPlayMode = userData.LastPlayMode,
            SelectedToneId = userData.SelectedToneId,
            Title = String.IsNullOrEmpty(credential.Password) ? FormatStringInGroupsOfFour(request.AccessCode) : userData.Title,
            TitlePlateId = userData.TitlePlateId,
            AiTotalWin = (uint)userData.AiWinCount,
            AiRank = aiRank
        };

        static string FormatStringInGroupsOfFour(string input)
        {
            if (string.IsNullOrEmpty(input))
                return input;
            // Use a StringBuilder for efficient string manipulation
            var stringBuilder = new System.Text.StringBuilder();
            int length = input.Length;
            for (int i = 0; i < length; i++)
            {
                if (i > 0 && i % 4 == 0)
                {
                    stringBuilder.Append(' ');
                }
                stringBuilder.Append(input[i]);
            }
            return stringBuilder.ToString();
        }

        string PadLeftWithZeros(string input, int desiredLength)
        {
            int zerosToAdd = Math.Max(0, desiredLength - input.Length);
            return new string('0', zerosToAdd) + input;
        }

        string ConvertOldUID(string inputCardNum)
        {
            // Convert hexadecimal string to a byte array
            inputCardNum = inputCardNum.ToUpper().Trim();
            //Console.WriteLine(inputCardNum);
            try
            {
                byte[] byteArray = new byte[inputCardNum.Length / 2];
                for (int i = 0; i < inputCardNum.Length; i += 2)
                {
                    byteArray[i / 2] = Convert.ToByte(inputCardNum.Substring(i, 2), 16);
                }
                // Reverse the array if needed (depends on endianness)
                Array.Reverse(byteArray);
                // Convert byte array to an unsigned long integer
                string convertedNumber = PadLeftWithZeros(BitConverter.ToUInt64(byteArray, 0).ToString(), 20);
                //Console.WriteLine($"Hexadecimal: {inputCardNum}");
                //Console.WriteLine($"Decimal: {convertedNumber}");
                return convertedNumber;
            }
            catch { }
            return "";
        }
    }
}
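`ConvertOldUID` above builds the byte array big-endian from the hex string, reverses it, then reads it little-endian, which on a little-endian host is equivalent to interpreting the access code as an ordinary (big-endian) hexadecimal integer. A Python mirror of that conversion, written for illustration only (it is not part of the repo), assuming an 8-byte / 16-hex-digit code like the C# expects:

```python
def convert_old_uid(card_num: str) -> str:
    """Hex access code -> zero-padded 20-digit decimal string; "" on bad input."""
    card_num = card_num.upper().strip()
    try:
        # big-endian read == byte-reverse then little-endian read (the C# dance)
        value = int.from_bytes(bytes.fromhex(card_num), byteorder="big")
    except ValueError:
        return ""  # mirrors the C# catch-all returning ""
    return str(value).zfill(20)

print(convert_old_uid("0000000000000001"))  # 00000000000000000001
```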

@@ -0,0 +1,134 @@
[
{
"setId": 1,
"verupNo": 1,
"mainSongNo": 1191,
"subSongNo": [
1180,
409,
135,
772
]
},
{
"setId": 2,
"verupNo": 1,
"mainSongNo": 1102,
"subSongNo": [
1065,
966,
1008,
916
]
},
{
"setId": 3,
"verupNo": 1,
"mainSongNo": 1091,
"subSongNo": [
1009,
1064,
36,
965
]
},
{
"setId": 4,
"verupNo": 1,
"mainSongNo": 1117,
"subSongNo": [
122,
42,
430,
256
]
},
{
"setId": 5,
"verupNo": 1,
"mainSongNo": 1116,
"subSongNo": [
885,
985,
1003,
1063
]
},
{
"setId": 6,
"verupNo": 1,
"mainSongNo": 1101,
"subSongNo": [
915,
1004,
47,
1054
]
},
{
"setId": 7,
"verupNo": 1,
"mainSongNo": 1111,
"subSongNo": [
1028,
937,
374,
1062
]
},
{
"setId": 8,
"verupNo": 1,
"mainSongNo": 1112,
"subSongNo": [
1065,
1090,
1073,
1087
]
},
{
"setId": 9,
"verupNo": 1,
"mainSongNo": 1090,
"subSongNo": [
1112,
1003,
1007,
1088
]
},
{
"setId": 10,
"verupNo": 1,
"mainSongNo": 1113,
"subSongNo": [
1061,
1056,
1060,
1016
]
},
{
"setId": 11,
"verupNo": 1,
"mainSongNo": 1127,
"subSongNo": [
1038,
665,
1004,
1088
]
},
{
"setId": 12,
"verupNo": 1,
"mainSongNo": 1126,
"subSongNo": [
1077,
1030,
730,
1092
]
}
]
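Each entry in the list above pairs one main song with exactly four sub songs, with sequential `setId`s. A small, hypothetical sanity check for that shape (the `intro_data.json` filename matches the copy script elsewhere in this commit):

```python
def check_intro_sets(sets: list) -> None:
    """Assert every intro set has a main song and exactly four sub songs."""
    for i, entry in enumerate(sets, start=1):
        assert entry["setId"] == i, f"setId {entry['setId']} out of order"
        assert isinstance(entry["mainSongNo"], int), f"set {i} missing mainSongNo"
        assert len(entry["subSongNo"]) == 4, f"set {i} needs 4 sub songs"
```

Run it on the parsed file, e.g. `check_intro_sets(json.load(open("intro_data.json", encoding="utf-8")))`.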

@@ -0,0 +1,82 @@
# Fox list
- [Fox list](#fox-list)
  - [1: Gundam](#1-gundam)
    - [Block](#block)
  - [2: Game medleys](#2-game-medleys)
    - [Block](#block-1)
  - [3: Game medleys](#3-game-medleys)
    - [Block](#block-2)
  - [4: Dragon maid](#4-dragon-maid)
    - [Block](#block-3)
  - [5: I buy sausage](#5-i-buy-sausage)
    - [Block](#block-4)
  - [6: TOUHOES](#6-touhoes)
    - [Block](#block-5)
## 1: Gundam
1191: The Blessing (The Witch from Mercury)
### Block
1180: Senkou (Hathaway)
409: Blazing (Reconguista in G)
135: A Cruel Angel's Thesis (Evangay)
772: Decisive Battle (Evangay)
## 2: Game medleys
327: Phoenix Wright medley
### Block
291: Monster Hunter 4 medley
643: Pokémon Ruby medley
659: Wild Pokémon
692: Pokémon Black medley
## 3: Game medleys
602: Animal Crossing medley
### Block
609: Kirby medley
901: Kirby Star Allies medley
1188: Sonic medley
601: Splatoon
## 4: Dragon maid
189: Aozora no Rhapsody
### Block
460: Yo-kai Watch
486: Database
249: The Path of the Wind (Totoro)
1111: Monster (YOASOBI)
## 5: I buy sausage
708: Motteke! Sailor Fuku
### Block
233: Detective Conan ???
102: Crossing Field ???
644: Anzu no Uta
140: + Denshi
## 6: TOUHOES
421: Necrofantasia
### Block
413: Bad Apple
1140: Scarlet Police (FUNKYYY)
412: Night of Knights
416: Help me erinnn

Binary file not shown.

Binary file not shown.

@@ -0,0 +1,9 @@
import os
import shutil
print("Copying files to game...")
os.makedirs("./Data_exported/Data_mods/x64/model/acce", exist_ok=True)
shutil.copy2("./Assets/QRLogin/acce_200000.nutexb", "./Data_exported/Data_mods/x64/model/acce/")
os.makedirs("./Data_exported/Data_mods/x64/script/song_select", exist_ok=True)
shutil.copy2("./Assets/Unlimited Random Select/song_select_all.tlb", "./Data_exported/Data_mods/x64/script/song_select")

tooling/checkFumen.py Normal file

@@ -0,0 +1,57 @@
from pathlib import Path

# Script used to check the unencrypted fumen file detection in TAL.
def is_fumen_encrypted(filename):
    with open(filename, "rb") as file:
        file.seek(0x214)
        buffer = file.read(24)
        # print(buffer)
    # Expected byte pattern: 24 bytes of 0xFF
    expected_bytes = bytes([0xFF] * 24)
    return buffer != expected_bytes

def check_folder_for_encryption(folder_path):
    folder = Path(folder_path)
    # Iterate over all files recursively in the folder
    for file in folder.rglob("*"):
        if file.is_file():
            if is_fumen_encrypted(file):
                print(f"File '{file}' is NOT valid.")

# Usage example:
folder_path = "./Data_exported/Data_mods/x64/fumen"
check_folder_for_encryption(folder_path)

tooling/checkOutput.py Normal file

@@ -0,0 +1,401 @@
from enum import Enum
from encryption import decrypt_file
import json
import os

# "japaneseText"
# "englishUsText"
# "chineseTText"
# "koreanText"
# "chineseSText"
language = "englishUsText"

# region Loading files
checkFile = {}
infos = json.loads(decrypt_file(input_file="./Data_exported/Data_mods/x64/datatable/musicinfo.bin"))["items"]
usb = json.loads(decrypt_file(input_file="./Data_exported/Data_mods/x64/datatable/music_usbsetting.bin"))["items"]
order = json.loads(decrypt_file(input_file="./Data_exported/Data_mods/x64/datatable/music_order.bin"))["items"]
attributes = json.loads(decrypt_file(input_file="./Data_exported/Data_mods/x64/datatable/music_attribute.bin"))["items"]
words = json.loads(decrypt_file(input_file="./Data_exported/Data_mods/x64/datatable/wordlist.bin"))["items"]
# endregion

# region Classes And Methods
class Genres(Enum):
    Unknown = -1
    Pop = 0
    Anime = 1
    Kids = 2
    Vocaloid = 3
    GameMusic = 4
    NamcoOriginal = 5
    Variety = 6
    Classical = 7

    @classmethod
    def _missing_(cls, value):
        return cls.Unknown

# gv 08.18 with omnimix
# 0 = Pop
# 1 = Anime
# 2 = Kids
# 3 = Vocaloid
# 4 = Game Music
# 5 = Namco Original
# (WITH OMNIMIX)
# 7 = Variety
# 8 = Classical

def findKeyInList(list: list, key: str, keyValue, value=None):
    for object in list:
        if object[key] == keyValue:
            if value is not None:
                return object[value]
            else:
                return object
    if value is not None:
        return ""
    else:
        return None

def findAllObjects(list: list, key: str, keyValue):
    templist = []
    templist.append(list)
    objects = []
    for element in templist[0]:
        if element[key] == keyValue:
            objects.append(element)
    return objects

def findDoubloninList(list: list, key: str, keyValue):
    if len(findAllObjects(list=list, key=key, keyValue=keyValue)) > 1:
        return True
    return False

def doesPathExist(path: str):
    if os.path.exists(path):
        return True
    return False

def initCheckFile():
    global checkFile
    checkFile = {
        "musicinfo.json": {
            "TotalEntries": len(infos),
            "MaxId": max(infos, key=lambda ev: ev["uniqueId"])["uniqueId"],
            "UniqueIdTooHigh": 0,
            "UniqueIdTooHighList": [],
            "UnusedUniqueIds": 0,
            "UnusedUniqueIdsList": [],
            "Doublons": 0,
            "DoublonsList": [],
            "GenreNoList": [],
        },
    }
    if attributes is not None:
        checkFile["music_attribute.json"] = {
            "TotalEntries": len(attributes),
            "Missing": 0,
            "MissingList": [],
            "Mismatch": 0,
            "MismatchList": [],
            "Doublons": 0,
            "DoublonsList": [],
        }
    if order is not None:
        checkFile["music_order.json"] = {
            "TotalEntries": len(order),
            "UniqueEntries": 0,
            "UniqueEntriesList": [],
            "GenreNoList": [],
            "Missing": 0,
            "MissingList": [],
            "Mismatch": 0,
            "MismatchList": [],
        }
    if usb is not None:
        checkFile["music_usbsetting.json"] = {
            "TotalEntries": len(usb),
            "Missing": 0,
            "MissingList": [],
            "Mismatch": 0,
            "MismatchList": [],
            "Doublons": 0,
            "DoublonsList": [],
        }
    if words is not None:
        checkFile["wordlist.json"] = {
            "TotalEntries": len(words),
            "MissingSongName": 0,
            "MissingSongNameList": [],
            "MissingSongSub": 0,
            "MissingSongSubList": [],
            "MissingSongDetail": 0,
            "MissingSongDetailList": [],
            "Doublons": 0,
            "DoublonsList": [],
        }
    checkFile.update(
        {
            "GameFiles": {
                "MissingSound": 0,
                "MissingSoundList": [],
                "MissingFumen": 0,
                "MissingFumenList": [],
            },
        }
    )

class Song:
    id = ""
    uniqueId = -1
    genreNo = -1
    name = ""
    sub = ""
    detail = ""

    def __init__(self, id, uniqueId, genreNo, name, sub, detail):
        self.id = id
        self.uniqueId = uniqueId
        self.genreNo = genreNo
        self.name = name
        self.sub = sub
        self.detail = detail
# endregion

# Loading all songs from musicinfo in an array
songs = []
for song in infos:
    name = findKeyInList(
        list=words,
        key="key",
        keyValue="song_" + song["id"],
        value=language,
    )
    sub = findKeyInList(
        list=words,
        key="key",
        keyValue="song_sub_" + song["id"],
        value=language,
    )
    detail = findKeyInList(
        list=words,
        key="key",
        keyValue="song_detail_" + song["id"],
        value=language,
    )
    songs.append(
        Song(
            id=song["id"],
            uniqueId=song["uniqueId"],
            genreNo=song["genreNo"],
            name=name,
            sub=sub,
            detail=detail,
        )
    )

# Preparing the json file containing the results of this checking script
initCheckFile()

# Checking...
for song in songs:
    # musicinfo.json
    if infos is not None:
        # Checking for too high of an id
        if song.uniqueId > 1599:
            checkFile["musicinfo.json"]["UniqueIdTooHigh"] += 1
            checkFile["musicinfo.json"]["UniqueIdTooHighList"].append(
                {
                    "id": song.id,
                    "uniqueId": song.uniqueId,
                }
            )
        # Listing genres and counting entries for each genres
        genre = {
            "GenreNo": song.genreNo,
            "Name": Genres(song.genreNo).name,
            "NumberofSongs": 0,
        }
        if (
            findKeyInList(
                list=checkFile["musicinfo.json"]["GenreNoList"],
                key="GenreNo",
                keyValue=song.genreNo,
            )
            is None
        ):
            genre["NumberofSongs"] = len(findAllObjects(list=infos, key="genreNo", keyValue=song.genreNo))
            checkFile["musicinfo.json"]["GenreNoList"].append(genre)
        # Search doublons
        if findDoubloninList(list=infos, key="id", keyValue=song.id):
            if song.id not in checkFile["musicinfo.json"]["DoublonsList"]:
                checkFile["musicinfo.json"]["Doublons"] += 1
                checkFile["musicinfo.json"]["DoublonsList"].append(song.id)
    # music_usbsetting.json
    if usb is not None:
        # Check for missing uniqueIds or id and uniqueId mismatches
        orderOccurences = findAllObjects(list=usb, key="id", keyValue=song.id)
        if len(orderOccurences) == 0:
            checkFile["music_usbsetting.json"]["Missing"] += 1
            checkFile["music_usbsetting.json"]["MissingList"].append(song.id)
        else:
            for occurence in orderOccurences:
                if not all([song.id == occurence["id"], song.uniqueId == occurence["uniqueId"]]):
                    if song.id not in checkFile["music_usbsetting.json"]["MismatchList"]:
                        checkFile["music_usbsetting.json"]["Mismatch"] += 1
                        checkFile["music_usbsetting.json"]["MismatchList"].append(
                            {
                                "id": song.id,
                                "ExpectedUniqueId": song.uniqueId,
                                "CurrentUniqueId": occurence["uniqueId"],
                            }
                        )
        # Search doublons
        if findDoubloninList(list=usb, key="id", keyValue=song.id):
            if song.id not in checkFile["music_usbsetting.json"]["DoublonsList"]:
                checkFile["music_usbsetting.json"]["Doublons"] += 1
                checkFile["music_usbsetting.json"]["DoublonsList"].append(song.id)
    # music_attribute.json
    if attributes is not None:
        # Check for missing uniqueIds or id and uniqueId mismatches
        orderOccurences = findAllObjects(list=attributes, key="id", keyValue=song.id)
        if len(orderOccurences) == 0:
            checkFile["music_attribute.json"]["Missing"] += 1
            checkFile["music_attribute.json"]["MissingList"].append(song.id)
        else:
            for occurence in orderOccurences:
                if not all([song.id == occurence["id"], song.uniqueId == occurence["uniqueId"]]):
                    if song.id not in checkFile["music_attribute.json"]["MismatchList"]:
                        checkFile["music_attribute.json"]["Mismatch"] += 1
                        checkFile["music_attribute.json"]["MismatchList"].append(
                            {
                                "id": song.id,
                                "ExpectedUniqueId": song.uniqueId,
                                "CurrentUniqueId": occurence["uniqueId"],
                            }
                        )
        if findDoubloninList(list=attributes, key="id", keyValue=song.id):
            if song.id not in checkFile["music_attribute.json"]["DoublonsList"]:
                checkFile["music_attribute.json"]["Doublons"] += 1
                checkFile["music_attribute.json"]["DoublonsList"].append(song.id)
    # music_order.json
    if order is not None:
        # Check for missing uniqueIds or id and uniqueId mismatches
        orderOccurences = findAllObjects(list=order, key="id", keyValue=song.id)
        if len(orderOccurences) == 0:
            checkFile["music_order.json"]["Missing"] += 1
            checkFile["music_order.json"]["MissingList"].append(song.id)
        else:
            songGenres = []
            for occurence in orderOccurences:
                songGenres.append(occurence["genreNo"])
                if not all([song.id == occurence["id"], song.uniqueId == occurence["uniqueId"]]):
                    if song.id not in checkFile["music_order.json"]["MismatchList"]:
                        checkFile["music_order.json"]["Mismatch"] += 1
                        checkFile["music_order.json"]["MismatchList"].append(
                            {
                                "id": song.id,
                                "ExpectedUniqueId": song.uniqueId,
                                "CurrentUniqueId": occurence["uniqueId"],
                            }
                        )
            # Counting unique entries
            checkFile["music_order.json"]["UniqueEntries"] += 1
            checkFile["music_order.json"]["UniqueEntriesList"].append(
                {
                    song.id: songGenres,
                }
            )
    # wordlist.json
    if words is not None:
        if song.name == "":
            checkFile["wordlist.json"]["MissingSongName"] += 1
            checkFile["wordlist.json"]["MissingSongNameList"].append(song.id)
        if song.sub == "":
            checkFile["wordlist.json"]["MissingSongSub"] += 1
            checkFile["wordlist.json"]["MissingSongSubList"].append(song.id)
        if song.detail == "":
            checkFile["wordlist.json"]["MissingSongDetail"] += 1
            checkFile["wordlist.json"]["MissingSongDetailList"].append(song.id)
    # Gamefiles
    if not doesPathExist("./Data_exported/Data_mods/x64/sound/" + "song_" + song.id + ".nus3bank"):
        checkFile["GameFiles"]["MissingSound"] += 1
        checkFile["GameFiles"]["MissingSoundList"].append(song.id)
    if not doesPathExist("./Data_exported/Data_mods/x64/fumen/" + song.id):
        checkFile["GameFiles"]["MissingFumen"] += 1
        checkFile["GameFiles"]["MissingFumenList"].append(song.id)

# Checking for vacant uniqueIds
for i in range(max(checkFile["musicinfo.json"]["MaxId"], 1600)):
    key = findKeyInList(list=infos, key="uniqueId", keyValue=i)
    if key is not None:
        # Updating GenreNoList of music_order.json
        for song in findAllObjects(list=order, key="uniqueId", keyValue=key["uniqueId"]):
            genre = {
                "GenreNo": song["genreNo"],
                "Name": Genres(song["genreNo"]).name,
                "NumberofSongs": 0,
            }
            if (
                findKeyInList(
                    list=checkFile["music_order.json"]["GenreNoList"],
                    key="GenreNo",
                    keyValue=song["genreNo"],
                )
                is None
            ):
                genre["NumberofSongs"] = len(findAllObjects(list=order, key="genreNo", keyValue=song["genreNo"]))
                checkFile["music_order.json"]["GenreNoList"].append(genre)
    else:
        # Finding unused Ids below 1599
        if i < 1600:
            checkFile["musicinfo.json"]["UnusedUniqueIds"] += 1
            checkFile["musicinfo.json"]["UnusedUniqueIdsList"].append(i)

# Checking for doublons in wordlist
if words is not None:
    for word in words:
        if findDoubloninList(list=words, key="key", keyValue=word["key"]):
            if word["key"] not in checkFile["wordlist.json"]["DoublonsList"]:
                checkFile["wordlist.json"]["Doublons"] += 1
                checkFile["wordlist.json"]["DoublonsList"].append(word["key"])

# Sorting some values for better readability
checkFile["musicinfo.json"]["GenreNoList"].sort(key=lambda x: x["GenreNo"], reverse=False)
checkFile["music_order.json"]["GenreNoList"].sort(key=lambda x: x["GenreNo"], reverse=False)

# Writing everything to checks.json
json_object = json.dumps(checkFile, ensure_ascii=False, indent="\t")
# json_object = json.dumps(jsonList, ensure_ascii=False, indent="\t")
with open("./temp/checks.json", "w", encoding="utf8") as outfile:
    outfile.write(json_object)
print("Wrote checks.\n")

tooling/copyFiles.py Normal file

@@ -0,0 +1,63 @@
import os
import shutil
# region Game
print("Copying Game files...")
# # Copying the LayeredFS folder
# shutil.copytree("./Data_exported/Data_mods", "./Game/Data_mods", dirs_exist_ok=True)
# Copying the game folder
shutil.copytree("./Data_exported/Data_mods", "./Game/Data", dirs_exist_ok=True)
# Copying TaikoArcadeLoader
shutil.copytree("./Assets/TaikoArcadeLoader", "./Game/Executable/Release", dirs_exist_ok=True)
# endregion
# region Server
print("Copying Server files...")
# Making folders
os.makedirs("./TaikoLocalServer/TaikoLocalServer/bin/Release/net8.0/win-x64/wwwroot", exist_ok=True)
os.makedirs("./TaikoLocalServer/TaikoLocalServer/bin/Debug/net8.0/wwwroot/", exist_ok=True)
os.makedirs("./TaikoLocalServer/TaikoLocalServer/Handlers/", exist_ok=True)
# Encrypting the required files for the server to the Server export folder
os.system(
    'py ./encryption.py -i "./Data_decrypted/don_cos_reward.json" -o "./Data_exported/Server/wwwroot/data/datatable/don_cos_reward.bin" --enc'
)
os.system(
    'py ./encryption.py -i "./Data_decrypted/neiro.json" -o "./Data_exported/Server/wwwroot/data/datatable/neiro.bin" --enc'
)
os.system(
    'py ./encryption.py -i "./Data_decrypted/shougou.json" -o "./Data_exported/Server/wwwroot/data/datatable/shougou.bin" --enc'
)
# Copying the datatables to the Server export folder
shutil.copy2("./Data_exported/Data_mods/x64/datatable/wordlist.bin", "./Data_exported/Server/wwwroot/data/datatable")
shutil.copy2("./Data_exported/Data_mods/x64/datatable/musicinfo.bin", "./Data_exported/Server/wwwroot/data/datatable")
shutil.copy2("./Data_exported/Data_mods/x64/datatable/music_order.bin", "./Data_exported/Server/wwwroot/data/datatable")
# Copying the Server export folder to TaikoLocalServer
shutil.copytree("./Data_exported/Server/wwwroot/", "./TaikoLocalServer/TaikoLocalServer/wwwroot/", dirs_exist_ok=True)
# Copying the BaidQuery script to convert old card formats
shutil.copy2(
"./Assets/Display Card ID & Update old Card format/BaidQuery.cs",
"./TaikoLocalServer/TaikoLocalServer/Handlers/BaidQuery.cs",
)
# Copying the Intro song list
shutil.copy2(
"./Assets/New Song Intro List/intro_data.json",
"./TaikoLocalServer/TaikoLocalServer/wwwroot/data/intro_data.json",
)
# Copying the database to the Debug and Release build folders (otherwise the server will regenerate a DB file on first start)
shutil.copy2(
"./Data_exported/Server/wwwroot/taiko.db3",
"./TaikoLocalServer/TaikoLocalServer/bin/Release/net8.0/win-x64/wwwroot/",
)
shutil.copy2(
"./Data_exported/Server/wwwroot/taiko.db3",
"./TaikoLocalServer/TaikoLocalServer/bin/Debug/net8.0/wwwroot/",
)
# endregion

tooling/copyOmniFiles.py Normal file

@ -0,0 +1,31 @@
import json
import os
import shutil
from encryption import encrypt_file
remap = json.load(open(file="./Data_exported/Data_mods/x64/datatable/dec/remap.json", encoding="utf-8"))
os.makedirs("./Data_exported/Data_mods/x64/fumen", exist_ok=True)
os.makedirs("./Data_exported/Data_mods/x64/sound", exist_ok=True)
for entry in remap:
if not os.path.exists("./Assets/Taiko Omnimix v8/Data/x64/fumen/" + entry["id"]) or not os.path.exists(
"./Assets/Taiko Omnimix v8/Data/x64/sound/song_" + entry["id"] + ".nus3bank"
):
        print(entry["id"], "is missing!")
        exit("Make sure the omnimix was extracted properly to ./Assets/Taiko Omnimix v8!")
else:
        # Encrypting the fumen files
indir = "./Assets/Taiko Omnimix v8/Data/x64/fumen/" + entry["id"]
outdir = "./Data_exported/Data_mods/x64/fumen/" + entry["id"]
os.system('py ./encryption.py -i "' + indir + '" -o "' + outdir + '" --enc --fumen')
# shutil.copytree(
# "./Assets/Taiko Omnimix v8/Data/x64/fumen/" + entry["id"],
# "./Data_exported/Data_mods/x64/fumen/" + entry["id"],
# dirs_exist_ok=True,
# )
shutil.copy2(
"./Assets/Taiko Omnimix v8/Data/x64/sound/song_" + entry["id"] + ".nus3bank",
"./Data_exported/Data_mods/x64/sound/",
)
print("Copied files for ", entry["id"])


@ -0,0 +1,105 @@
# -*- coding: utf-8 -*-
import os
import sqlite3
import numpy as np
import json
from dotenv import load_dotenv
load_dotenv()
musicinfo_list = json.load(open(file="./Data_exported/Data_mods/x64/datatable/dec/musicinfo.json", encoding="utf-8"))[
"items"
]
unique_id_list = [entry["uniqueId"] for entry in musicinfo_list]
unique_id_list = np.unique(unique_id_list)
unique_id_list = sorted(unique_id_list)
# Connect to the database
conn = sqlite3.connect("./Data_exported/Server/wwwroot/taiko.db3")
c = conn.cursor()
# Get the AiScoreData table
c.execute("SELECT * FROM AiScoreData")
ai_score_data_list = c.fetchall()
# Column 1 is the SongId
new_ai_score_data_list = [row for row in ai_score_data_list if int(row[1]) in unique_id_list]
# Write back to the database
c.execute("DELETE FROM AiScoreData")
query = "INSERT INTO AiScoreData VALUES (" + ",".join(["?"] * len(new_ai_score_data_list[0])) + ")"
c.executemany(query, new_ai_score_data_list)
# Get the AiSectionScoreData table
c.execute("SELECT * FROM AiSectionScoreData")
ai_section_score_data_list = c.fetchall()
# Column 1 is the SongId
new_ai_section_score_data_list = [row for row in ai_section_score_data_list if int(row[1]) in unique_id_list]
# Write back to the database
c.execute("DELETE FROM AiSectionScoreData")
query = "INSERT INTO AiSectionScoreData VALUES (" + ",".join(["?"] * len(new_ai_section_score_data_list[0])) + ")"
c.executemany(query, new_ai_section_score_data_list)
# Get the SongBestData table
c.execute("SELECT * FROM SongBestData")
song_best_data_list = c.fetchall()
# Column 1 is the SongId
new_song_best_data_list = [row for row in song_best_data_list if int(row[1]) in unique_id_list]
# Write back to the database
c.execute("DELETE FROM SongBestData")
query = "INSERT INTO SongBestData VALUES (" + ",".join(["?"] * len(new_song_best_data_list[0])) + ")"
c.executemany(query, new_song_best_data_list)
# Get the SongPlayData table
c.execute("SELECT * FROM SongPlayData")
song_play_data_list = c.fetchall()
# Column -2 is the SongId
new_song_play_data_list = [row for row in song_play_data_list if int(row[-2]) in unique_id_list]
removed_song_play_data_list = [row for row in song_play_data_list if int(row[-2]) not in unique_id_list]
# print(removed_song_play_data_list)
removed_id_list = []
for entry in removed_song_play_data_list:
if entry[-2] not in removed_id_list:
removed_id_list.append(entry[-2])
removed_id_list = sorted(removed_id_list)
print("removed", len(removed_id_list), "ids", removed_id_list)
# Write back to the database
c.execute("DELETE FROM SongPlayData")
query = "INSERT INTO SongPlayData VALUES (" + ",".join(["?"] * len(new_song_play_data_list[0])) + ")"
c.executemany(query, new_song_play_data_list)
# Get the UserData table
c.execute("SELECT * FROM UserData")
user_data_list = c.fetchall()
new_user_data_list = []
for row in user_data_list:
new_row = list(row)
# Column 12 is the FavoriteSongsArray
favorite_songs_array = json.loads(row[12])
new_favorite_songs_array = [song_id for song_id in favorite_songs_array if int(song_id) in unique_id_list]
new_row[12] = json.dumps(new_favorite_songs_array)
# Column 28 is the UnlockedSongIdList
    try:
        unlocked_song_id_list = json.loads(row[28])
    except (TypeError, json.JSONDecodeError):
        # NULL or malformed column: treat as no unlocked songs
        unlocked_song_id_list = []
new_unlocked_song_id_list = [song_id for song_id in unlocked_song_id_list if int(song_id) in unique_id_list]
new_row[28] = json.dumps(new_unlocked_song_id_list)
new_user_data_list.append(new_row)
# Write back to the database
c.execute("DELETE FROM UserData")
query = "INSERT INTO UserData VALUES (" + ",".join(["?"] * len(new_user_data_list[0])) + ")"
c.executemany(query, new_user_data_list)
conn.commit()
conn.close()
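The script above rewrites each table the same way: fetch all rows, keep only those whose SongId is still a valid uniqueId, then delete everything and re-insert the survivors. The pattern can be sketched against an in-memory database; the miniature table and columns here are illustrative, not the real TaikoLocalServer schema:

```python
import sqlite3

# Miniature of the filter-and-rewrite pattern used on the score tables.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE SongBestData (Baid INTEGER, SongId INTEGER, Score INTEGER)")
c.executemany(
    "INSERT INTO SongBestData VALUES (?, ?, ?)",
    [(1, 10, 500000), (1, 2000, 999000), (2, 10, 700000)],
)

valid_ids = {10}  # uniqueIds still present in musicinfo

c.execute("SELECT * FROM SongBestData")
rows = c.fetchall()
# Column 1 is the SongId, as in the real tables
kept = [row for row in rows if row[1] in valid_ids]

c.execute("DELETE FROM SongBestData")
c.executemany("INSERT INTO SongBestData VALUES (?, ?, ?)", kept)
conn.commit()

c.execute("SELECT COUNT(*) FROM SongBestData")
print(c.fetchone()[0])  # 2
```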

tooling/encryption.py Normal file

@ -0,0 +1,164 @@
import glob
import gzip
import os
from pathlib import Path
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import padding
from argparse import ArgumentParser
from enum import Enum
import binascii
class Keys(Enum):
    Datatable = "3530304242323633353537423431384139353134383346433246464231354534"  # AES key for datatable files (hex-encoded)
    Fumen = "4434423946383537303842433443383030333843444132343339373531353830"  # AES key for fumen files (hex-encoded)
def read_iv_from_file(file_path):
with open(file_path, "rb") as f:
iv = f.read(16)
if len(iv) != 16:
raise Exception("Invalid file")
return iv
def pad_data(data):
padder = padding.PKCS7(128).padder()
return padder.update(data) + padder.finalize()
def remove_pkcs7_padding(data):
unpadder = padding.PKCS7(128).unpadder()
return unpadder.update(data) + unpadder.finalize()
def decrypt_file(input_file, key_type: Keys = Keys.Datatable):
    # Convert the key from hex to bytes
    key = binascii.unhexlify(key_type.value)
# Read the IV from the first 16 bytes of the input file
iv = read_iv_from_file(input_file)
# Create an AES cipher object with CBC mode
cipher = Cipher(algorithms.AES(key), modes.CBC(iv), backend=default_backend())
decryptor = cipher.decryptor()
with open(input_file, "rb") as infile:
# Skip the IV in the input file
infile.seek(16)
# Decrypt the file
decrypted_data = b"" + decryptor.update(infile.read())
# Remove PKCS7 padding
unpadded_data = remove_pkcs7_padding(decrypted_data)
# Gzip decompress the data
decompressed_data = gzip.decompress(unpadded_data)
# return the decompressed data
return decompressed_data
def encrypt_file(input_file, key_type: Keys = Keys.Datatable):
    # Convert the key from hex to bytes
    key = binascii.unhexlify(key_type.value)
# Generate a random 128-bit IV
iv = os.urandom(16)
# Create an AES cipher object with CBC mode
try:
cipher = Cipher(algorithms.AES(key), modes.CBC(iv), backend=default_backend())
encryptor = cipher.encryptor()
except Exception as error:
print(error)
print("You need to set the right AES keys in the encryption.py file")
exit(0)
with open(input_file, "rb") as infile:
# Read the entire file into memory
data = infile.read()
# Gzip compress the data
compressed_data = gzip.compress(data)
# Pad the compressed data, encrypt it, and return the encrypted result
encrypted_data = encryptor.update(pad_data(compressed_data)) + encryptor.finalize()
return iv + encrypted_data
def save_file(file: str, outdir: str, encrypt: bool, key_type=None):
    # Fall back to the module-level `type` set by the CLI when no key is given
    if key_type is None:
        key_type = type
    fileContent = (
        decrypt_file(input_file=file, key_type=key_type)
        if not encrypt
        else encrypt_file(input_file=file, key_type=key_type)
    )
    print("Decrypting" if not encrypt else "Encrypting", file, "to", outdir)
    with open(outdir, "wb") as outfile:
        outfile.write(fileContent)
if __name__ == "__main__":
parser = ArgumentParser()
parser.add_argument(
"-i",
"--input",
help="Input file / folder",
)
parser.add_argument(
"-o",
"--output",
help="Output file / folder",
)
parser.add_argument(
"-e",
"--enc",
action="store_true",
default=False,
help="Use this flag to encrypt a file",
)
parser.add_argument(
"-t",
"--fumen",
action="store_true",
default=False,
help="Datatable is default, use this flag for Fumen",
)
args = parser.parse_args()
if not args.input:
print("Missing input file, pass the argument --help for help")
exit(0)
if not args.output:
print("Missing output file, pass the argument --help for help")
exit(0)
type = Keys.Datatable if not args.fumen else Keys.Fumen
if os.path.isdir(args.input):
for path, subdirs, files in os.walk(args.input):
for name in files:
full_path = os.path.join(path, name)
relative_path = os.path.relpath(full_path, args.input)
outpath = os.path.join(args.output, relative_path)
outdir = os.path.dirname(outpath)
Path(outdir).mkdir(parents=True, exist_ok=True)
if os.path.isfile(full_path):
                    save_file(
                        file=full_path,
                        outdir=outpath,
                        encrypt=args.enc,
                    )
else:
        save_file(
            file=args.input,
            outdir=args.output,
            encrypt=args.enc,
        )
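The container format these helpers produce is a 16-byte random IV followed by the AES-CBC ciphertext of the gzip-compressed, PKCS7-padded payload; decryption reverses the layers. The compression and padding layers can be checked with a stdlib-only sketch (the AES layer is left out here to keep the example dependency-free):

```python
import gzip

def pkcs7_pad(data: bytes, block: int = 16) -> bytes:
    # Pad to a multiple of `block` bytes; every pad byte equals the pad length.
    n = block - len(data) % block
    return data + bytes([n]) * n

def pkcs7_unpad(data: bytes) -> bytes:
    n = data[-1]
    if n < 1 or n > len(data) or data[-n:] != bytes([n]) * n:
        raise ValueError("invalid PKCS7 padding")
    return data[:-n]

# Same layering as encrypt_file/decrypt_file, minus the cipher step:
# compress, pad to the AES block size, then unpad and decompress.
payload = b'{"items": []}'
wrapped = pkcs7_pad(gzip.compress(payload))
restored = gzip.decompress(pkcs7_unpad(wrapped))
print(restored)  # b'{"items": []}'
```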

tooling/helpers.py Normal file

@ -0,0 +1,23 @@
import re
def capfirst(s):
if s is not None:
return s[:1].upper() + s[1:]
return s
def is_cjk(string: str) -> bool:
    # Treat the string as CJK unless it only contains characters that the
    # game's Latin font can display (letters, digits and common symbols).
    return not re.match(
        "^[A-Za-z0-9!@#\$%^&*()_+=\[\]{};:'\",.<>\/\\|λéÓíäā?\-*$~μ♨☆★♥♡♪↑↓◆××・⑨“”°Δ ]*$",
        string,
    )
def fetchKey(key: str, wordlist):
for wordentry in wordlist["items"]:
if wordentry["key"] == key:
return wordentry
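`is_cjk` works as an allowlist: a string made only of Latin letters, digits, and the listed symbols is treated as non-CJK, anything else as CJK (which drives the `englishUsFontType` choice in the other scripts). A condensed sketch of the same idea, with a deliberately smaller allowlist than the real pattern:

```python
import re

# Simplified allowlist in the spirit of helpers.is_cjk: ASCII letters, digits
# and a few symbols. The real pattern in helpers.py allows more characters.
LATIN_OK = re.compile(r"^[A-Za-z0-9 !?'\",.\-+&():/]*$")

def looks_cjk(text: str) -> bool:
    # Anything outside the allowlist (kana, kanji, hangul, ...) counts as CJK
    return LATIN_OK.match(text) is None

print(looks_cjk("Night of Knights"))  # False
print(looks_cjk("千本桜"))            # True
```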

tooling/listPlays.py Normal file

@ -0,0 +1,87 @@
import json
import os
import sqlite3
from dotenv import load_dotenv
load_dotenv()
remap = json.load(open(file="../Migration & Backup/CHN to 39.06/unalteredRemap.json", encoding="utf-8"))
wordlist = json.load(open(file="../08.18 & CHN/Data/x64/datatable/dec/wordlist.json", encoding="utf-8"))
musicinfo = json.load(open(file="../08.18 & CHN/Data/x64/datatable/dec/musicinfo.json", encoding="utf-8"))
# Connect to the database
conn = sqlite3.connect("./Data_exported/Server/wwwroot/taiko.db3")
cursor = conn.cursor()
sql_count_query = "SELECT SongId, COUNT(SongId) AS count FROM SongPlayData GROUP BY SongId ORDER BY SongId;"
cursor.execute(sql_count_query)
results = cursor.fetchall()
id_counts = {row[0]: row[1] for row in results}
omniPlays = {}
for remapped in remap:
remappedId = int(remapped["uniqueIdOriginal"])
    # Songs that were never played have no entry in id_counts
    id_count = id_counts.get(remappedId, 0)
    omniPlays[remappedId] = id_count
omniPlays = dict(sorted(omniPlays.items(), key=lambda item: item[1], reverse=True))
plays = {
"Omnimix": [],
"Regular": [],
}
for key in omniPlays:
value = next((item for item in remap if item["uniqueIdOriginal"] == key), None)
nameKey = next((item for item in wordlist["items"] if item["key"] == "song_" + value["id"]), None)
print(omniPlays[key], ": ", value["id"], "=>", nameKey["englishUsText"])
plays["Omnimix"].append(
{
"id": value["id"],
"plays": omniPlays[key],
"nameJp": nameKey["japaneseText"],
"nameUs": nameKey["englishUsText"],
}
)
regularPlays = {}
for entry in musicinfo["items"]:
    # Songs that were never played have no entry in id_counts
    id_count = id_counts.get(entry["uniqueId"], 0)
if id_count < 1599:
if next((item for item in remap if item["uniqueIdRemap"] == entry["uniqueId"]), None) is None:
regularPlays[int(entry["uniqueId"])] = id_count
regularPlays = dict(sorted(regularPlays.items(), key=lambda item: item[1], reverse=True))
for key in regularPlays:
value = next((item for item in musicinfo["items"] if item["uniqueId"] == key), None)
nameKey = next(
(item for item in wordlist["items"] if item["key"] == "song_" + value["id"]),
{"japaneseText": "", "englishUsText": ""},
)
print(regularPlays[key], "=>", value["id"], "=>", nameKey["englishUsText"])
plays["Regular"].append(
{
"id": value["id"],
"plays": regularPlays[key],
"nameJp": nameKey["japaneseText"],
"nameUs": nameKey["englishUsText"],
}
)
with open("./temp/listPlays.json", "w", encoding="utf8") as outfile:
outfile.write(json.dumps(plays, indent="\t", ensure_ascii=False))
conn.close()
print(len(plays["Regular"]), "regular songs,", len(plays["Omnimix"]), "omnimix songs.")
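The `id_counts` dictionary above comes from a single `GROUP BY` aggregation over `SongPlayData`. The query can be exercised in isolation; the two-column table here is a made-up stand-in for the real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE SongPlayData (Baid INTEGER, SongId INTEGER)")
c.executemany("INSERT INTO SongPlayData VALUES (?, ?)",
              [(1, 5), (1, 5), (2, 5), (1, 7)])

# One row per distinct SongId, with its play count
c.execute("SELECT SongId, COUNT(SongId) AS count "
          "FROM SongPlayData GROUP BY SongId ORDER BY SongId")
id_counts = {song_id: count for song_id, count in c.fetchall()}

print(id_counts)  # {5: 3, 7: 1}
```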

tooling/makeDatabases.py Normal file

@ -0,0 +1,463 @@
import glob
import json
import os
import shutil
from encryption import encrypt_file
from helpers import fetchKey, is_cjk
translationFixes = [
{
"key": "folder_event1",
"japaneseText": "東方Project特集",
"englishUsText": "Touhou Project",
},
{
"key": "folder_intro_event1",
"japaneseText": "東方Projectアレンジの曲をあつめたよ",
"englishUsText": "A collection of Touhou Project songs!",
},
{
"key": "folder_event2",
"japaneseText": "アイドルマスター特集",
"englishUsText": "The Idolmaster",
},
{
"key": "folder_intro_event2",
"japaneseText": "東方Projectアレンジの曲をあつめたよ",
"englishUsText": "A collection of songs from The Idolmaster!",
},
{
"key": "folder_event5",
"japaneseText": "スタジオジブリ特集",
"englishUsText": "Studio Ghibli",
},
{
"key": "folder_intro_event5",
"japaneseText": "東方Projectアレンジの曲をあつめたよ",
"englishUsText": "A collection of Studio Ghibli songs!",
},
{
"key": "folder_event6",
"japaneseText": "妖怪ウォッチ特集",
"englishUsText": "Yokai Watch",
},
{
"key": "folder_intro_event6",
"japaneseText": "東方Projectアレンジの曲をあつめたよ",
"englishUsText": "A collection of Yokai Watch songs!",
},
{
"key": "folder_event7",
"japaneseText": "UUUMクリエイター特集",
"englishUsText": "UUUM Creator Feature",
},
{
"key": "folder_intro_event7",
"japaneseText": "「#コンパス」の曲をあつめたよ!",
"englishUsText": "A collection of songs from UMMM!",
},
{
"key": "folder_event12",
"japaneseText": "#コンパス特集",
"englishUsText": "#Compass Creator Feature",
},
{
"key": "folder_intro_event12",
"japaneseText": "「#コンパス」の曲をあつめたよ!",
"englishUsText": "A collection of songs from the game #Compass!",
},
]
# region Loading json files
# Loading final song list ======================================================================
finalList = json.load(open("./temp/finalList.json", encoding="utf-8"))
# Loading wordlists ============================================================================
wordlist = json.load(open("./Data_decrypted/wordlist.json", encoding="utf-8"))
omni_wordlist_en = json.load(open("../08.18 & CHN/gamefiles/Omni/wordlist_en.json", encoding="utf-8"))
omni_wordlist_jp = json.load(open("../08.18 & CHN/gamefiles/Omni/wordlist_jp.json", encoding="utf-8"))
music_attributes = json.load(open("./Data_decrypted/music_attribute.json", encoding="utf-8"))
omni_music_attributes = json.load(open("../08.18 & CHN/gamefiles/Omni/music_attribute.json", encoding="utf-8"))
# Loading music_order ====================================================
music_orders = json.load(open("./Data_decrypted/music_order.json", encoding="utf-8"))
omni_music_orders = {"items": []}
for item in json.load(open("../08.18 & CHN/gamefiles/Omni/music_order.json", encoding="utf-8"))["items"]:
if item["genreNo"] >= 6:
item["genreNo"] -= 1
omni_music_orders["items"].append(item)
# Loading music_ai_section =============================================
music_ai_section = json.load(open("./Data_decrypted/music_ai_section.json", encoding="utf-8"))
# Loading musicinfo ====================================================
musicinfos = json.load(open("./Data_decrypted/musicinfo.json", encoding="utf-8"))
omni_musicinfos = {"items": []}
for item in json.load(open("../08.18 & CHN/gamefiles/Omni/musicinfo.json", encoding="utf-8"))["items"]:
if item["genreNo"] >= 6:
item["genreNo"] -= 1
omni_musicinfos["items"].append(item)
# endregion
# region Game files
###################
###################
#### endregion ####
# region musicinfo.json, music_usbsetting.json, music_attributes.json, music_ai_section.json.
for newentry in finalList["songs"]:
# we try to find an entry from the final list in the 39.06's musicinfo file
entry = next((item for item in musicinfos["items"] if item["id"] == newentry["id"]), None)
# if we find nothing that means the song is from omnimix
if entry is None:
# we get the musicinfo entry from the omnimix files and append it to the 39.06 file.
omni_entry = next((item for item in omni_musicinfos["items"] if item["id"] == newentry["id"]), None)
omni_entry["spikeOnEasy"] = 0
omni_entry["spikeOnNormal"] = 0
omni_entry["spikeOnHard"] = 0
omni_entry["spikeOnOni"] = 0
omni_entry["spikeOnUra"] = 0
musicinfos["items"].append(omni_entry)
# we generate a list of all unused uniqueIds at or below 1599, up to the highest uniqueId used by a 39.06 song.
higher = 0
usedUniqueIds = []
for song in musicinfos["items"]:
uniqueId = song["uniqueId"]
# find higher id
usedUniqueIds.append(uniqueId)
if uniqueId >= higher:
higher = uniqueId
unusedList = []
overLimitList = []
for i in range(higher):
if all([i not in usedUniqueIds, i <= 1599]):
unusedList.append(i)
if all([i in usedUniqueIds, i > 1599]):
overLimitList.append(i)
# we then remap all songs above id 1599 using the list of unused uniqueIds at or below 1599.
unusedIndex = 0
remapJson = {"items": []}
for song in musicinfos["items"]:
if song["uniqueId"] > 1599:
if len(unusedList) > 0:
if unusedIndex < len(unusedList):
remapJson["items"].append(
{"id": song["id"], "uniqueIdOriginal": song["uniqueId"], "uniqueIdRemap": unusedList[unusedIndex]}
)
song["uniqueId"] = unusedList[unusedIndex]
unusedIndex += 1
music_ai_section["items"].append(
{
"id": song["id"],
"uniqueId": song["uniqueId"],
"easy": 5,
"normal": 5,
"hard": 5,
"oni": 5,
"ura": 3,
"oniLevel11": "",
"uraLevel11": "",
},
)
else:
print("Couldn't remap " + song["id"])
else:
print("Couldn't remap " + song["id"])
print("Remapped " + str(len(remapJson["items"])) + " songs")
json_object = json.dumps(remapJson["items"], indent="\t", ensure_ascii=False)
with open("./Data_exported/Data_mods/x64/datatable/dec/remap.json", "w", encoding="utf8") as outfile:
outfile.write(json_object)
# we generate the music_ai_section file
music_ai_section["items"].sort(key=lambda x: x["uniqueId"], reverse=False)
json_object = json.dumps(music_ai_section, indent="\t", ensure_ascii=False)
with open("./Data_exported/Data_mods/x64/datatable/dec/music_ai_section.json", "w", encoding="utf8") as outfile:
outfile.write(json_object)
# we generate the music_usbsetting file
usbsettingjson = {"items": []}
musicinfos["items"].sort(key=lambda x: x["uniqueId"], reverse=False)
for song in musicinfos["items"]:
usbsetting = {"id": song["id"], "uniqueId": int(song["uniqueId"]), "usbVer": ""}
usbsettingjson["items"].append(usbsetting)
json_object = json.dumps(usbsettingjson, indent="\t", ensure_ascii=False)
with open("./Data_exported/Data_mods/x64/datatable/dec/music_usbsetting.json", "w", encoding="utf8") as outfile:
outfile.write(json_object)
# we generate the music_attribute file
for song in musicinfos["items"]:
entry = next((item for item in music_attributes["items"] if item["id"] == song["id"]), None)
# if we find nothing that means the song is from omnimix
if entry is None:
# we get the musicinfo entry from the omnimix files and append it to the 39.06 file.
omni_entry = next((item for item in omni_music_attributes["items"] if item["id"] == song["id"]), None)
omni_entry["uniqueId"] = song["uniqueId"]
omni_entry["ensoPartsID1"] = 0
omni_entry["ensoPartsID2"] = 0
del omni_entry["canPlayUra"]
music_attributes["items"].append(omni_entry)
music_attributes["items"].sort(key=lambda x: x["uniqueId"], reverse=False)
json_object = json.dumps(music_attributes, indent="\t", ensure_ascii=False)
with open("./Data_exported/Data_mods/x64/datatable/dec/music_attribute.json", "w", encoding="utf8") as outfile:
outfile.write(json_object)
musicinfos["items"].sort(key=lambda x: x["uniqueId"], reverse=False)
json_object = json.dumps(musicinfos, indent="\t", ensure_ascii=False)
with open("./Data_exported/Data_mods/x64/datatable/dec/musicinfo.json", "w", encoding="utf8") as outfile:
outfile.write(json_object)
print("Wrote musicinfo, music_attribute and music_usbsetting.\n")
# # endregion
# region wordlist.json
for song in finalList["songs"]:
    songKey = "song_" + song["id"]
    songSubKey = "song_sub_" + song["id"]
    songDetailKey = "song_detail_" + song["id"]
# song entry
entry = next((item for item in wordlist["items"] if item["key"] == songKey), None)
    if entry is not None:
        # only fill entries that exist in the wordlist but are empty
        if entry["japaneseText"] == "":
            print(songKey, "is already in the wordlist but has an empty string.")
            entry["japaneseText"] = song["nameJp"]
entry["englishUsText"] = song["nameUs"]
entry["englishUsFontType"] = 1 if not is_cjk(song["nameUs"]) else 0
else:
print(songKey, "has been added to the wordlist.")
wordlist["items"].append(
{
"key": songKey,
"japaneseText": song["nameJp"],
"japaneseFontType": 0,
"englishUsText": song["nameUs"],
"englishUsFontType": 1 if not is_cjk(song["nameUs"]) else 0,
},
)
# song sub entry
entry = next((item for item in wordlist["items"] if item["key"] == songSubKey), None)
    if entry is not None:
        # only fill entries that exist in the wordlist but are empty
        if entry["japaneseText"] == "":
subentry = next(
(item for item in omni_wordlist_en["items"] if item["key"] == songSubKey),
{"japaneseText": ""},
)["japaneseText"]
if subentry != "":
print(songKey, "sub is already in the wordlist but has an empty string.")
entry["japaneseText"] = subentry
entry["englishUsText"] = subentry
entry["englishUsFontType"] = 1 if not is_cjk(subentry) else 0
else:
subentry = next(
(item for item in omni_wordlist_en["items"] if item["key"] == songSubKey),
{"japaneseText": ""},
)["japaneseText"]
wordlist["items"].append(
{
"key": songSubKey,
"japaneseText": subentry,
"japaneseFontType": 0,
"englishUsText": subentry,
"englishUsFontType": 1 if not is_cjk(subentry) else 0,
},
)
if subentry != "":
print(songSubKey, "has been added to the wordlist.")
# song detail entry
entry = next((item for item in wordlist["items"] if item["key"] == songDetailKey), None)
    if entry is not None:
        # only fill entries that exist in the wordlist but are empty
        if entry["japaneseText"] == "":
detailentry = next(
(item for item in omni_wordlist_en["items"] if item["key"] == songDetailKey),
{"japaneseText": ""},
)["japaneseText"]
if detailentry != "":
print(songKey, "detail is already in the wordlist but has an empty string.")
entry["japaneseText"] = detailentry
entry["englishUsText"] = detailentry
entry["englishUsFontType"] = 1 if not is_cjk(detailentry) else 0
else:
detailentry = next(
(item for item in omni_wordlist_en["items"] if item["key"] == songDetailKey),
{"japaneseText": ""},
)["japaneseText"]
if detailentry != "":
print(songDetailKey, "has been added to the wordlist.")
wordlist["items"].append(
{
"key": songDetailKey,
"japaneseText": detailentry,
"japaneseFontType": 0,
"englishUsText": detailentry,
"englishUsFontType": 1 if not is_cjk(detailentry) else 0,
},
)
print("Processed wordlist.\n")
# endregion
# region music_order.json
# closedisptype in music_order
# 1 to show subtitle
# 0 to show title
for song in music_attributes["items"]:
# we try to find an entry from the final list in the 39.06's music_order file
entry = next((item for item in music_orders["items"] if item["id"] == song["id"]), None)
name = next((item for item in wordlist["items"] if item["key"] == "song_" + song["id"]), {"englishUsText": ""})
if name["englishUsText"] == "" and song["id"] != "tmap4":
name["englishUsText"] = name["japaneseText"]
name["englishUsFontType"] = 1 if not is_cjk(name["japaneseText"]) else 0
print("Missing title for", name["key"])
# if we find nothing that means the song is from omnimix
if entry is None:
if song["id"] != "tmap4":
for omniEntry in omni_music_orders["items"]:
if omniEntry["id"] == song["id"]:
omniEntry["uniqueId"] = song["uniqueId"]
omniEntry["englishUsText"] = name["englishUsText"]
music_orders["items"].append(omniEntry)
continue
else:
for entry in music_orders["items"]:
if entry["id"] == song["id"]:
entry["englishUsText"] = name["englishUsText"]
# Writing music_order ===============================================================================
# ordering music_order by genre and english name
music_orders["items"].sort(key=lambda x: (x["genreNo"], x["englishUsText"]))
# removing the names from the dict
# for items in music_orders["items"]:
# if "englishUsText" in items:
# del items["englishUsText"]
# writing the music order
json_object = json.dumps(music_orders, indent="\t", ensure_ascii=False)
with open("./Data_exported/Data_mods/x64/datatable/dec/music_order.json", "w", encoding="utf8") as outfile:
outfile.write(json_object)
print("Wrote music_order.\n")
# wordlist["items"].sort(key=lambda x: x["key"], reverse=False)
# removing unused languages from the dict
for items in wordlist["items"]:
if "koreanText" in items:
del items["koreanText"]
del items["koreanFontType"]
# # if "chineseTText" in items:
# # del items["chineseTText"]
# # del items["chineseTFontType"]
for entry in translationFixes:
key = fetchKey(key=entry["key"], wordlist=wordlist)
key["englishUsText"] = entry["englishUsText"]
key["englishUsFontType"] = 1
print(key["japaneseText"], "->", key["englishUsText"])
# exporting the wordlist.
json_object = json.dumps(wordlist, ensure_ascii=False, indent="\t")
with open("./Data_exported/Data_mods/x64/datatable/dec/wordlist.json", "w", encoding="utf8") as outfile:
outfile.write(json_object)
print("Wrote wordlist.\n")
# endregion
# region Encrypting databases
files = glob.glob("./Data_exported/Data_mods/x64/datatable/dec/*")
for f in files:
    out_name = os.path.splitext(os.path.basename(f))[0] + ".bin"
    outdir = os.path.join("./Data_exported/Data_mods/x64/datatable/", out_name)
    if out_name != "remap.bin":
        print("Encrypting " + f + " to " + outdir)
        encrypted = encrypt_file(input_file=f)
        with open(outdir, "wb") as outfile:
            outfile.write(encrypted)
print("Encrypted Datatables.\n")
# endregion
# region Writing server files
#############################
ServerFolderSongsPerType = 20
######## endregion ##########
# region event_folder_data.json
playcounts = json.load(open(file="./temp/listPlays.json", encoding="utf-8"))
eventfolders = json.load(open(file="./Data_decrypted/Server/event_folder_data.json", encoding="utf-8"))
musicinfo = json.load(open(file="./Data_exported/Data_mods/x64/datatable/dec/musicinfo.json", encoding="utf-8"))
# The recommended song folder is accessed with the following key:
# eventfolders[2]["songNo"]
MostPlayedList = []
for song in playcounts["Omnimix"]:
song["Omni"] = True
if song not in MostPlayedList:
MostPlayedList.append(song)
if len(MostPlayedList) > ServerFolderSongsPerType:
break
for song in playcounts["Regular"]:
song["Omni"] = False
if song not in MostPlayedList:
MostPlayedList.append(song)
if len(MostPlayedList) > ServerFolderSongsPerType * 2 - 1:
break
MostPlayedList = sorted(MostPlayedList, key=lambda item: item["plays"], reverse=True)
MostPlayedArray = []
for song in MostPlayedList:
uniqueId = next((item for item in musicinfo["items"] if item["id"] == song["id"]), None)
if not uniqueId:
print(song["id"])
else:
print(str(song["plays"]).zfill(3), "=>", "O" if song["Omni"] else "R", "=>", song["id"], "=>", song["nameUs"])
MostPlayedArray.append(uniqueId["uniqueId"])
print("Exported", len(MostPlayedArray), "songs")
eventfolders[2]["songNo"] = MostPlayedArray
eventfoldersDump = json.dumps(eventfolders, indent=4, ensure_ascii=False)
with open("./Data_exported/Server/wwwroot/data/event_folder_data.json", "w", encoding="utf8") as outfile:
outfile.write(eventfoldersDump)
print("Wrote event_folder_data.\n")
# endregion
# region shop_folder_data.json
with open("./Data_exported/Server/wwwroot/data/shop_folder_data.json", "w", encoding="utf8") as outfile:
outfile.write(json.dumps([], indent=4, ensure_ascii=False))
print("Wrote shop_folder_data.\n")
# endregion
# region movie_data.json
with open("./Data_exported/Server/wwwroot/data/movie_data.json", "w", encoding="utf8") as outfile:
outfile.write(json.dumps([{"movie_id": 20, "enable_days": 999}], indent=4, ensure_ascii=False))
print("Wrote movie_data.\n")
# endregion
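The remap pass in this script frees uniqueIds by moving every song above 1599 into an unused slot at or below 1599. The core allocation, stripped of the datatable plumbing, can be sketched as follows (a simplified illustration, not the exact code above):

```python
# Standalone sketch of the uniqueId remap: collect the free slots <= LIMIT,
# then hand them out, one per song whose id exceeds LIMIT.
LIMIT = 1599

def build_remap(used_ids):
    used = set(used_ids)
    free = [i for i in range(LIMIT + 1) if i not in used]
    mapping, slots = {}, iter(free)
    for uid in sorted(u for u in used if u > LIMIT):
        try:
            mapping[uid] = next(slots)
        except StopIteration:
            break  # ran out of free slots; the song cannot be remapped
    return mapping

print(build_remap({1, 2, 1700, 1800}))  # {1700: 0, 1800: 3}
```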

tooling/makeSongList.py Normal file

@ -0,0 +1,47 @@
import json
minimumPlays = 3
playcounts = json.load(open(file="./temp/listPlays.json", encoding="utf-8"))
musicinfo = json.load(open(file="./Data_decrypted/musicinfo.json", encoding="utf-8"))
musicinfoOmni = json.load(open(file="../08.18 & CHN/gamefiles/Omni/musicinfo.json", encoding="utf-8"))
finalList = {"songs": []}
# Adding all the 39.06 songs in the list
for music in musicinfo["items"]:
finalList["songs"].append({"id": music["id"], "uniqueId": music["uniqueId"]})
# Going over all the Omni songs and adding the ones that have been played more than X times to the final list
omniAdded = 0
omniRemoved = 0
for music in musicinfoOmni["items"]:
    song = next((item for item in playcounts["Omnimix"] if item["id"] == music["id"]), None)
    if song is None:
        # no playcount entry for this song, skip it
        continue
    songPlayCount = song["plays"]
    if songPlayCount > minimumPlays:
        finalList["songs"].append(
            {
                "id": music["id"],
                "uniqueId": music["uniqueId"],
                "nameJp": song["nameJp"],
                "nameUs": song["nameUs"],
                "Played": songPlayCount,
            }
        )
        print("Added", (music["id"] + ": "), song["nameUs"], "from omnimix")
        omniAdded += 1
    else:
        print("Skipped", (music["id"] + ": "), song["nameUs"])
        omniRemoved += 1
print("Kept", omniAdded, "songs")
print("Removed", omniRemoved, "songs")
finalList["songs"].sort(key=lambda x: int(x["uniqueId"]), reverse=False)
updatedWordList = json.dumps(finalList, indent=4, ensure_ascii=False)
with open("./temp/finalList.json", "w", encoding="utf8") as outfile:
outfile.write(updatedWordList)

tooling/readme.md Normal file

@ -0,0 +1,37 @@
# 39.06 Tools
These tools are used to update and maintain the Taiko Nijiiro cab at AtomCity.
Some of these have been open sourced as part of the [nijiiro-toolset](https://github.com/AkaiiKitsune/nijiiro-toolset) project.
Before doing anything, make sure you're in the `/39.06/` folder!
You'll need to do some setup to use the scripts in this folder.
1) Extract the 08.18 Omnimix files to `./Assets/Taiko Omnimix v8`
   * Download them from [here](https://cloud.farewell.dev/s/ewjnXEgp2jRzKjD) (extract `Taiko Omnimix v8`, then `Omnimix v8 Addition`, then `Omnimix_v8_Song_Fix`, in that order)
2) Extract a clean copy of the game to `./Game`
3) Clone a copy of TaikoLocalServer
* Run `git clone https://github.com/asesidaa/TaikoLocalServer.git`
* Go to the TaikoLocalServer folder and run `git checkout dev`
The upgrade procedure for CHN=>39.06 is the following:
1) Export the data from the old install and make the new song list
* Run [remapToOriginal](./remapToOriginal.py) to restore the original omnimix ids
* Run [listPlays](./listPlays.py) to export the playcount of all songs
* Run [makeSongList](./makeSongList.py) to generate the new songlist (39.06 + The most played Omnimix songs)
2) Generate the new datatable files
* Run [makeDatabases](./makeDatabases.py) to generate the updated 39.06 Omnimix databases
3) Copy the Omnimix sound and fumen files
* Run [copyOmniFiles](./copyOmniFiles.py) to copy the song and fumen data from omnimix
4) Translate the costume names and copy the custom QR code puchi
* Run [translateCostumes](./translateCostumes.py) to translate the costumes to english
* Run [addCustomFiles](./addCustomfiles.py) to copy the custom patches
5) Map the omnimix song ids in the server database to their new value
* Run [remapToOmnimix](./remapToOmnimix.py) to remap the new omnimix songs
* Run [deleteMissingSongs](./deleteMissingSongs.py) to remove the scores from entries with invalid IDs (removed songs)
6) Check the datatable output using [checkOutput](./checkOutput.py)
7) Copy the files to the game's folder by running [copyFiles](./copyFiles.py)
You're done!
To do everything at once, run [runAll](./runAll.py).
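The two remap steps (1 and 5) read the same remap file and apply it in opposite directions. A minimal sketch of that inversion, using the `uniqueIdOriginal`/`uniqueIdRemap` field names from the scripts in this folder (the sample entry is made up for illustration):

```python
import json

# Made-up remap entry, in the same shape the scripts read from remap.json.
remap = json.loads('[{"uniqueIdOriginal": 1234, "uniqueIdRemap": 5678}]')

def build_mappings(remap, invert):
    """invert=False maps remapped ids back to originals (remapToOriginal);
    invert=True maps original ids to their remapped values (remapToOmnimix)."""
    mappings = {}
    for entry in remap:
        src = entry["uniqueIdOriginal" if invert else "uniqueIdRemap"]
        dst = entry["uniqueIdRemap" if invert else "uniqueIdOriginal"]
        mappings[src] = dst
    return mappings

print(build_mappings(remap, invert=False))  # {5678: 1234}
print(build_mappings(remap, invert=True))   # {1234: 5678}
```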

91
tooling/remapToOmnimix.py Normal file
View File

@ -0,0 +1,91 @@
import json
import os
import shutil
import sqlite3

from dotenv import load_dotenv

load_dotenv()

remap = json.load(open(file="./Data_exported/Data_mods/x64/datatable/dec/remap.json", encoding="utf-8"))

# Connect to the database
conn = sqlite3.connect("./Data_exported/Server/wwwroot/taiko.db3")
cursor = conn.cursor()


def update_ids_in_table(cursor, table, column, id_mappings):
    """
    Updates the IDs in a specific table and column based on the provided id_mappings,
    removes any rows where the column value is equal to the new ID to avoid duplicates,
    and updates the favorite_songs_array accordingly.

    Args:
        cursor: SQLite cursor object.
        table: Name of the table to update.
        column: Name of the column to update.
        id_mappings: Dictionary with old IDs as keys and new IDs as values.
    """
    for old_id, new_id in id_mappings.items():
        # Step 1: Remove rows where column = new_id to avoid duplicates
        sql_delete_query = f"DELETE FROM {table} WHERE {column} = ?"
        cursor.execute(sql_delete_query, (new_id,))
        if cursor.rowcount > 0:
            print(f"Removed {cursor.rowcount} entries in {table} where {column} = {new_id}")

        # Step 2: Update favorite_songs_array in UserData table (assumes column index 12)
        cursor.execute("SELECT rowid, * FROM UserData")  # `rowid` lets us update specific rows
        user_data_rows = cursor.fetchall()
        for row in user_data_rows:
            row_id = row[0]
            favorite_songs_array = json.loads(row[13]) if row[13] else []  # Load favorite_songs_array from column 12
            # Update the favorite_songs_array, removing any instances of new_id
            updated_favorite_songs = [song_id for song_id in favorite_songs_array if int(song_id) != new_id]
            # Check if the favorite_songs_array was actually changed
            if favorite_songs_array != updated_favorite_songs:
                # Convert back to JSON and update in the database
                updated_favorite_songs_json = json.dumps(updated_favorite_songs)
                cursor.execute(
                    "UPDATE UserData SET FavoriteSongsArray = ? WHERE rowid = ?", (updated_favorite_songs_json, row_id)
                )
                print(f'Updated FavoriteSongsArray for "{row[19]}": Removed song id {new_id}')

        # Step 3: Update the IDs in the specified column (e.g., main table update)
        sql_update_query = f"UPDATE {table} SET {column} = ? WHERE {column} = ?"
        cursor.execute(sql_update_query, (new_id, old_id))
        print(f"Updated {table}.{column} from {old_id} to {new_id}")


# List of tables and their corresponding columns to update
tables_to_update = {
    "AiScoreData": "SongId",
    "AiSectionScoreData": "SongId",
    "SongBestData": "SongId",
    "SongPlayData": "SongId",
}

invert = True
id_mappings = {}
for remapentry in remap:
    RemapFrom = remapentry["uniqueIdOriginal" if invert else "uniqueIdRemap"]
    RemapTo = remapentry["uniqueIdRemap" if invert else "uniqueIdOriginal"]
    id_mappings[RemapFrom] = RemapTo

try:
    # Iterate over each table and update the IDs
    for table, column in tables_to_update.items():
        update_ids_in_table(cursor, table, column, id_mappings)

    # Commit the changes to the database
    conn.commit()
except sqlite3.Error as e:
    print(f"An error occurred: {e}")
    conn.rollback()  # Rollback changes if something goes wrong
finally:
    # Close the database connection
    conn.close()
    print("Database connection closed.")

74
tooling/remapToOriginal.py Normal file
View File

@ -0,0 +1,74 @@
import json
import os
import shutil
import sqlite3

from dotenv import load_dotenv

load_dotenv()

remap = json.load(open(file="../Migration & Backup/CHN to 39.06/unalteredRemap.json", encoding="utf-8"))

os.makedirs("./temp/", exist_ok=True)
os.makedirs("./Data_exported/Server/wwwroot/data/datatable", exist_ok=True)
os.makedirs("./Data_exported/Data_mods/x64/datatable/dec/", exist_ok=True)

shutil.copy2(
    "../Migration & Backup/CHN to 39.06/unaltered" + os.getenv("DBNAME"),
    "./Data_exported/Server/wwwroot/taiko.db3",
)

# Connect to the database
conn = sqlite3.connect("./Data_exported/Server/wwwroot/taiko.db3")
cursor = conn.cursor()


def update_ids_in_table(cursor, table, column, id_mappings):
    """
    Updates the IDs in a specific table and column based on the provided id_mappings.

    Args:
        cursor: SQLite cursor object.
        table: Name of the table to update.
        column: Name of the column to update.
        id_mappings: Dictionary with old IDs as keys and new IDs as values.
    """
    for old_id, new_id in id_mappings.items():
        # SQL query to update the IDs
        sql_update_query = f"UPDATE {table} SET {column} = ? WHERE {column} = ?"
        # Execute the update query
        cursor.execute(sql_update_query, (new_id, old_id))
        print(f"Updated {table}.{column} from {old_id} to {new_id}")


# List of tables and their corresponding columns to update
tables_to_update = {
    "AiScoreData": "SongId",
    "AiSectionScoreData": "SongId",
    "SongBestData": "SongId",
    "SongPlayData": "SongId",
}

invert = False
id_mappings = {}
for remapentry in remap:
    RemapFrom = remapentry["uniqueIdOriginal" if invert else "uniqueIdRemap"]
    RemapTo = remapentry["uniqueIdRemap" if invert else "uniqueIdOriginal"]
    id_mappings[RemapFrom] = RemapTo

try:
    # Iterate over each table and update the IDs
    for table, column in tables_to_update.items():
        update_ids_in_table(cursor, table, column, id_mappings)

    # Commit the changes to the database
    conn.commit()
except sqlite3.Error as e:
    print(f"An error occurred: {e}")
    conn.rollback()  # Rollback changes if something goes wrong
finally:
    # Close the database connection
    conn.close()
    print("Database connection closed.")

3
tooling/requirements.txt Normal file
View File

@ -0,0 +1,3 @@
cryptography==43.0.3
deep_translator==1.11.4
numpy==2.1.2
python-dotenv

14
tooling/runAll.py Normal file
View File

@ -0,0 +1,14 @@
import os
os.system("py ./remapToOriginal.py")
os.system("py ./listPlays.py")
os.system("py ./makeSongList.py")
os.system("py ./makeDatabases.py")
os.system("py ./copyOmniFiles.py")
os.system("py ./addCustomfiles.py")
os.system("py ./remapToOmnimix.py")
os.system("py ./deleteMissingSongs.py")
os.system("py ./copyFiles.py")
print("\nDone!")

46
tooling/translateCostumes.py Normal file
View File

@ -0,0 +1,46 @@
import json

from deep_translator import GoogleTranslator

from helpers import capfirst
from encryption import encrypt_file

# ======================================================================================

destinationLanguage = "english"
destinationKey = "englishUsText"

wordlist = json.load(open(file="./Data_exported/Data_mods/x64/datatable/dec/wordlist.json", encoding="utf-8"))
doncos = json.load(open(file="./Data_decrypted/don_cos_reward.json", encoding="utf-8"))

# ======================================================================================


def fetchKey(key: str):
    for wordentry in wordlist["items"]:
        if wordentry["key"] == key:
            return wordentry


translator = GoogleTranslator(source="japanese", target=destinationLanguage)

for costume in doncos["items"]:
    cosType = costume["cosType"]
    costumeId = costume["uniqueId"]
    costumeNameKey = f"costume_{cosType}_{costumeId}"

    key = fetchKey(key=costumeNameKey)
    key[destinationKey] = capfirst(translator.translate(key["japaneseText"]))
    print(
        cosType,
        (str(costumeId) + ":"),
        key["japaneseText"],
        "->",
        key[destinationKey],
    )

updatedWordList = json.dumps(wordlist, indent=4, ensure_ascii=False)
with open(
    "./Data_exported/Data_mods/x64/datatable/dec/wordlist_costume_translated.json", "w", encoding="utf8"
) as outfile:
    outfile.write(updatedWordList)

file = encrypt_file(input_file="./Data_exported/Data_mods/x64/datatable/dec/wordlist_costume_translated.json")
with open("./Data_exported/Data_mods/x64/datatable/wordlist_costume_translated.bin", "wb") as outfile:
    outfile.write(file)