
Merge branch 'develop'

This commit is contained in:
Kevin Trocolli 2024-08-11 02:23:04 -04:00
commit 31ca45fc68
398 changed files with 33827 additions and 10175 deletions

LICENSE.txt (new binary file, contents not shown)

View File

@ -1,9 +1,114 @@
# Changelog
Documenting updates to ARTEMiS, to be updated every time the master branch is pushed to.
## 20240811
### System
+ Change backend from Twisted to Starlette
+ Implement async handlers
+ Reboot times for multiple games have been fixed (thanks zaphkito!)
### Frontend
+ Edit button changed to View on the user page, which is where you can edit the card memo
+ Add card now works as it should
+ Add event log viewer in the `sys` page for sysadmins
+ Add pages for Pokken, SAO, and maimai
### AimeDB
+ Now rejects all-zero access codes
+ Stores card IDm (for AmusementIC) and MiFare ID (for old aime/banapass)
+ ...unless that MiFare ID is 0x01020304 (the default for segatools)
### maimai
+ Add support for BUDDiES
+ Rivals and Favorite Music support
### Wacca
+ Add option to block unregistered serials from accessing the title server
### DIVA
+ Fix for reading modded content (Thanks ThatzOkay!)
### CHUNITHM
+ Save net battle info
## 20240630
### DIVA
+ Added configurable festa options
## 20240629
### CHUNITHM
+ Add team points
## 20240628
### maimai
+ Add present support
## 20240627
### SAO
+ Fix ghost items, character and player XP, EX Bonuses, unlocks, and much much more
## 20240620
### CHUNITHM
+ CHUNITHM LUMINOUS support
## 20240616
### CHUNITHM
+ Support network encryption for Export/International versions
### DIVA
+ Working frontend with name and level string editing and playlog
## 20240530
### DIVA
+ Fix reader for when difficulty is not an int
## 20240526
### DIVA
+ Fixed missing awaits causing coroutine error
## 20240524
### DIVA
+ Fixed new profile start request causing coroutine error
## 20240523
### DIVA
+ Fixed binary handler & render_POST errors
## 20240408
### System
+ Modified the game specific documentation
## 20240407
### maimai
+ Support maimai DX International [#118](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/118) (Thanks beerpsi!)
+ Fixed the maimai DX reboot time from config [#120](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/120) (Thanks topty!)
## 20240318
### CXB
+ Fixing handle_data_shop_list_detail_request for Sunrise S1
## 20240302
### SAO
+ Fixing new profile creation with right heroes and start VP
+ Fix to the Unanalyzed Log responses returning the wrong rewards
+ Documentation revised
## 20240226
### CXB
+ Fixing paths for rev.py
+ Changed encoding for handle_data_item_list_icon_request
## 20240202
### SAO
+ Added reader assets and edited the game specific documentation
## 20240118
### System
+ Added game version names to the readme
## 20240109
### System
+ Removed `ADD config config` from dockerfile
+ Removed `ADD config config` from dockerfile [#83](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/83) (Thanks zaphkito!)
### AimeDB
+ Fixed an error that resulted from trying to scan a banned or locked card

View File

@ -1,8 +1,182 @@
# Contributing to ARTEMiS
If you would like to contribute to artemis, either by adding features, games, or fixing bugs, you can do so by forking the repo and submitting a pull request [here](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls). Please make sure, if you're submitting a PR for a game or game version, that you're following the n-0/y-1 guidelines, or it will be rejected.
If you would like to contribute to artemis, either by adding features, games, or fixing bugs, you can do so by forking the repo and submitting a pull request [here](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls). This guide assumes you're familiar with git, python, and the libraries that artemis uses.
This document is a work in progress. If you have any questions or notice any errors, please report them on the Discord.
## Adding games
Guide WIP
### Step 0
+ Follow the "n-1" rule of thumb. PRs for game versions that are currently active in arcades will be deleted. If you're unsure, ask!
+ Always PR against the `develop` branch.
+ Check to see if somebody else is already PRing the features/games you want to add. If they are, consider contributing to them rather than making an entirely new PR.
+ We don't technically have a written code style guide (TODO), but try to keep your code consistent with the code that's already there where possible.
### Step 1 (Setup)
1) Fork the gitea repo, clone your fork, and checkout the develop branch.
2) Make a new folder in the `titles` folder and name it some recognisable shorthand for your game (Chunithm becomes chuni, maimai dx is mai2, etc.)
3) In this new folder, create a file named `__init__.py`. This is the first thing that will load when your title module is loaded by the core system, and it acts as sort of a directory for where everything lives in your module. This file will contain the following required items:
+ `index`: must point to a subclass of `BaseServlet` that will handle setup and dispatching of your game.
+ `game_codes`: must be a list of 4 letter SEGA game codes as strings.
It can also contain the following optional fields:
+ `database`: points to a subclass of `Data` that contains one or more subclasses of `BaseData` that act as database transaction handlers. Required for the class to store and retrieve data from the database.
+ `reader`: points to a subclass of `BaseReader` that handles importing static data from game files into the database.
+ `frontend`: points to a subclass of `FE_Base` that handles frontend routes for your game.
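Put together, a finished `__init__.py` might end up looking something like the sketch below. The class names, and the modules they are imported from, are placeholders rather than any real title's layout (the later steps build each piece); only `index` and `game_codes` are required:
```py
# titles/mygame/__init__.py -- hypothetical layout
from .index import MyGameServlet
from .const import MyGameConstants
from .database import MyGameData
from .read import MyGameReader
from .frontend import MyGameFrontend

index = MyGameServlet       # subclass of BaseServlet
database = MyGameData       # subclass of Data (optional)
reader = MyGameReader       # subclass of BaseReader (optional)
frontend = MyGameFrontend   # subclass of FE_Base (optional)
game_codes = [MyGameConstants.GAME_CODE]
```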
The next step will focus on `index`.
### Step 2 (Index)
1) Create another file in your game's folder. By convention, it should be called `index.py`.
2) Inside `index.py`, add the following code, replacing {Game name here} with the name of your game, without spaces or special characters. Look at other titles for examples.
```py
from core.title import BaseServlet
from core import CoreConfig

class {Game name here}Servlet(BaseServlet):
    def __init__(self, core_cfg: CoreConfig, cfg_dir: str) -> None:
        pass
```
3) The `__init__` function should accomplish the following:
+ Reading your game's config
+ Setting up your game's logger
+ Instantiating your game's versions
It's usually safe to copy and paste the `__init__` functions from other games, just make sure you change everything that needs to be changed!
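To make that concrete, here is a rough, non-authoritative sketch of such an `__init__`. The config and version class names are placeholders built in later steps, the full logger setup is omitted, and the `super().__init__` arguments are assumed to mirror the signature shown above:
```py
import logging

from core import CoreConfig
from core.title import BaseServlet

from .config import MyGameConfig     # hypothetical, built in Step 4
from .base import MyGameBase         # hypothetical, see "Adding game versions"


class MyGameServlet(BaseServlet):
    def __init__(self, core_cfg: CoreConfig, cfg_dir: str) -> None:
        super().__init__(core_cfg, cfg_dir)  # assumed to take the same arguments
        self.core_cfg = core_cfg

        # Read the game's config (the yaml loading shown in Step 4 would also go here)
        self.game_cfg = MyGameConfig()

        # Set up the game's logger (file handler/coloredlogs omitted; copy from an existing title)
        self.logger = logging.getLogger("mygame")
        self.logger.setLevel(self.game_cfg.server.loglevel)

        # Instantiate the game's versions, indexed by internal version number
        self.versions = [
            MyGameBase(core_cfg, self.game_cfg),
        ]
```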
4) Go back to the `__init__.py` that you created and add the following:
```py
from .index import {Game name here}Servlet
index = {Game name here}Servlet
```
5) Going back to `index.py`, within the Servlet class, define the following functions from `BaseServlet` as needed (see function documentation):
+ `is_game_enabled`: Returns true if the game is enabled and should be served, false otherwise. Returns false by default, so override this to allow your game to be served.
+ `get_routes`: Returns a list of Starlette routes that your game will serve.
+ `get_allnet_info`: Returns a tuple of strings where the first is the allnet uri and the second is the allnet host. The function takes the game ID, version and keychip ID as parameters, so you can send different responses if need be.
+ `get_mucha_info`: Only used by games that use Mucha for authentication. Returns a tuple where the first element is a bool indicating whether the game is enabled, the second is a list of game CDs as strings that this servlet should handle, and the third is a list of netID prefixes that each game CD should use. If your game does not use mucha, do not define this function.
+ `setup`: Performs any setup your servlet requires, such as spinning up matching servers. It is run once when the server starts. If you don't need any setup, do not define it.
6) Make sure any functions you specify to handle routes in `get_routes` are defined as async, as follows: `async def handle_thing(self, request: Request) -> Response:` where Response is whatever kind of Response class you'll be returning. Make sure all paths in this function return some subclass of Response, otherwise you'll get an error when serving.
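Continuing the sketch from above, a minimal set of overrides might look like the following. The route path, handler name, and response body are illustrative only; the return shapes are the ones described in the list above:
```py
from starlette.requests import Request
from starlette.responses import JSONResponse, Response
from starlette.routing import Route


class MyGameServlet(BaseServlet):
    # __init__ as sketched earlier...

    def get_routes(self) -> list:
        # Every handler referenced here must be async and return a Response subclass.
        return [
            Route("/{version:int}/MyGameServlet/Api", self.handle_api_request, methods=["POST"]),
        ]

    def get_allnet_info(self, game_code: str, game_ver: int, keychip: str) -> tuple:
        # (allnet uri, allnet host) handed back for this game/version/keychip.
        host = f"{self.core_cfg.server.hostname}:{self.core_cfg.server.port}"
        return (f"http://{host}/{game_ver}/MyGameServlet/", host)

    async def handle_api_request(self, request: Request) -> Response:
        version = request.path_params.get("version", 0)
        req_body = await request.body()
        # ...decode the request and dispatch to the correct version handler...
        return JSONResponse({"returnCode": 1})
```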
### Step 3 (Constants)
1) In your game's folder, create a file to store static values for your game. By convention, we call this `const.py`
2) Inside, create a class called `{Game name here}Constants`. Do not define an `__init__` function.
3) Put constants related to your game here. A good example of something to put here is game codes.
```py
class {Game name here}Constants:
    GAME_CODE = "SBXX"
    CONFIG_NAME = "{game name}.yaml"
```
4) If you choose to put game codes in here, add this to your `__init__.py` file:
```py
from .const import {Game name here}Constants
...
game_codes = [{Game name here}Constants.GAME_CODE]
```
### Step 4 (Config)
1) Make a file to store your game's config. By convention, it should be called `config.py`
2) Inside that file, add the following:
```py
from core.config import CoreConfig

class {game name}ServerConfig:
    def __init__(self, parent_config: "{game name}Config") -> None:
        self.__config = parent_config

    @property
    def enable(self) -> bool:
        return CoreConfig.get_config_field(
            self.__config, "{game name}", "server", "enable", default=True
        )

    @property
    def loglevel(self) -> int:
        return CoreConfig.str_to_loglevel(
            CoreConfig.get_config_field(
                self.__config, "{game name}", "server", "loglevel", default="info"
            )
        )

class {game name}Config(dict):
    def __init__(self) -> None:
        self.server = {game name}ServerConfig(self)
```
3) In the `example_config` folder, create a yaml file for your game. By convention, it should be called `{game folder name}.yaml`. Add the following:
```yaml
server:
  enable: True
  loglevel: "info"
```
4) Add any additional config options that you feel the game needs. Look to other games for config examples.
5) In `index.py`, import your config and instantiate it in `__init__` with:
```py
self.game_cfg = {game folder name}Config()
if path.exists(f"{cfg_dir}/{{game folder name}Constants.CONFIG_NAME}"):
    self.game_cfg.update(
        yaml.safe_load(open(f"{cfg_dir}/{{game folder name}Constants.CONFIG_NAME}"))
    )
```
This will attempt to load the config file you specified in your constants, falling back to the defaults specified in `config.py` if it doesn't exist. This game_cfg object can then be passed down to your handlers when you create them.
At this stage your game should be loaded by allnet, and serve whatever routes you put in `get_routes`. See the next section about adding versions and handlers.
### Step 5 (Database)
TODO
### Step 6 (Frontend)
TODO
### Step 7 (Reader)
TODO
## Adding game versions
Guide WIP
See the above section about code expectations and how to PR.
1) In the game's folder, create a python file to contain the version handlers. By convention, the first version is version 0, and is stored in `base.py`. Versions following that increment the version number, and are stored in `{short version name}.py`. See Wacca's folder for an example of how to name versions.
2) Internal version numbers should be defined in `const.py`. The version should change any time the game gets a major update (i.e. a new version or plus version.)
```py
# in const.py, inside the {Game name here}Constants class
VERSION_{game name} = 0
VERSION_{game name}_PLUS = 1
```
3) Inside `base.py` (or whatever your version is named) add the following:
```py
class {game name}Base:
    def __init__(self, cfg: CoreConfig, game_cfg: {game name}Config) -> None:
        self.game_config = game_cfg
        self.core_config = cfg
        self.version = {game name}Constants.VERSION_{game name}
        self.data = {game name}Data(cfg)
        # Any other initialization stuff
```
4) Define your handlers. This will vary wildly by game, but best practice is to keep the naming consistent, so that the main dispatch function in `index.py` can use `getattr` to get the handler, rather than having a static list of what endpoint or request type goes to which handler. See Wacca's `index.py` and `base.py` for examples of how to do this, and the rough sketch below.
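As a loose illustration of that `getattr` pattern, continuing the hypothetical servlet from earlier (the endpoint names, version lookup, and response shape are made up; see Wacca for the real thing):
```py
# inside MyGameServlet (continuing the earlier sketches)
async def handle_api_request(self, request: Request) -> Response:
    # e.g. routed as /{version:int}/MyGameServlet/{endpoint}
    endpoint = request.path_params.get("endpoint", "")
    req_data = await request.body()

    # Map the client's version onto our internal version index however the game requires.
    internal_ver = MyGameConstants.VERSION_MYGAME

    # Look the handler up by its conventional name instead of keeping a static table.
    handler = getattr(self.versions[internal_ver], f"handle_{endpoint}_request", None)
    if handler is None:
        self.logger.warning(f"Unhandled endpoint {endpoint}")
        return JSONResponse({"returnCode": 0})

    resp = await handler(req_data)
    return JSONResponse(resp)
```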
5) If your version is not the base version, make sure it inherits from the base version:
```py
class {game name}Plus({game name}Base):
    def __init__(self, cfg: CoreConfig, game_cfg: {game name}Config) -> None:
        super().__init__(cfg, game_cfg)
        self.version = {game name}Constants.VERSION_{game name}_PLUS
```
6) Back in `index.py`, make sure to import your new class and add it to `__init__`. Some games opt to use a single list called `self.versions` that contains all the version classes at their internal version's index. Others simply define them as separate members. See Wacca for an example of `self.versions`.
7) Add your version to your game's dispatching logic.
8) Test to make sure your game is being handled properly.
9) Submit a PR.
## Adding/improving core services
If you intend to submit improvements or additions to core services (allnet, mucha, billing, aimedb, database, etc.), please get in touch with a maintainer.

View File

@ -1,6 +1,6 @@
from core.config import CoreConfig
from core.allnet import AllnetServlet
from core.aimedb import AimedbFactory
from core.allnet import AllnetServlet, BillingServlet
from core.aimedb import AimedbServlette
from core.title import TitleServlet
from core.utils import Utils
from core.mucha import MuchaServlet

View File

@ -2,5 +2,5 @@ from .base import ADBBaseRequest, ADBBaseResponse, ADBHeader, ADBHeaderException
from .base import CompanyCodes, ReaderFwVer, CMD_CODE_GOODBYE, HEADER_SIZE
from .lookup import ADBLookupRequest, ADBLookupResponse, ADBLookupExResponse
from .campaign import ADBCampaignClearRequest, ADBCampaignClearResponse, ADBCampaignResponse, ADBOldCampaignRequest, ADBOldCampaignResponse
from .felica import ADBFelicaLookupRequest, ADBFelicaLookupResponse, ADBFelicaLookup2Request, ADBFelicaLookup2Response
from .felica import ADBFelicaLookupRequest, ADBFelicaLookupResponse, ADBFelicaLookupExRequest, ADBFelicaLookupExResponse
from .log import ADBLogExRequest, ADBLogRequest, ADBStatusLogRequest, ADBLogExResponse

View File

@ -102,7 +102,7 @@ class ADBHeader:
magic, protocol_ver, cmd, length, status, game_id, store_id, keychip_id = struct.unpack_from("<5H6sI12s", data)
head = cls(magic, protocol_ver, cmd, length, status, game_id, store_id, keychip_id)
if head.length != len(data):
if head.length > len(data):
raise ADBHeaderException(f"Length is incorrect! Expect {head.length}, got {len(data)}")
return head

View File

@ -10,13 +10,14 @@ class ADBFelicaLookupRequest(ADBBaseRequest):
self.pmm = hex(pmm)[2:].upper()
class ADBFelicaLookupResponse(ADBBaseResponse):
def __init__(self, access_code: str = None, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x03, length: int = 0x30, status: int = 1) -> None:
def __init__(self, access_code: str = None, idx: int = 0, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x03, length: int = 0x30, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.access_code = access_code if access_code is not None else "00000000000000000000"
self.idx = idx
@classmethod
def from_req(cls, req: ADBHeader, access_code: str = None) -> "ADBFelicaLookupResponse":
c = cls(access_code, req.game_id, req.store_id, req.keychip_id)
def from_req(cls, req: ADBHeader, access_code: str = None, idx: int = 0) -> "ADBFelicaLookupResponse":
c = cls(access_code, idx, req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
@ -26,7 +27,7 @@ class ADBFelicaLookupResponse(ADBBaseResponse):
"access_code" / Int8ub[10],
Padding(2)
).build(dict(
felica_idx = 0,
felica_idx = self.idx,
access_code = bytes.fromhex(self.access_code)
))
@ -34,7 +35,7 @@ class ADBFelicaLookupResponse(ADBBaseResponse):
return self.head.make() + resp_struct
class ADBFelicaLookup2Request(ADBBaseRequest):
class ADBFelicaLookupExRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.random = struct.unpack_from("<16s", data, 0x20)[0]
@ -45,7 +46,7 @@ class ADBFelicaLookup2Request(ADBBaseRequest):
self.company = CompanyCodes(int.from_bytes(company, 'little'))
self.fw_ver = ReaderFwVer.from_byte(fw_ver)
class ADBFelicaLookup2Response(ADBBaseResponse):
class ADBFelicaLookupExResponse(ADBBaseResponse):
def __init__(self, user_id: Union[int, None] = None, access_code: Union[str, None] = None, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x12, length: int = 0x130, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.user_id = user_id if user_id is not None else -1
@ -55,7 +56,7 @@ class ADBFelicaLookup2Response(ADBBaseResponse):
self.auth_key = [0] * 256
@classmethod
def from_req(cls, req: ADBHeader, user_id: Union[int, None] = None, access_code: Union[str, None] = None) -> "ADBFelicaLookup2Response":
def from_req(cls, req: ADBHeader, user_id: Union[int, None] = None, access_code: Union[str, None] = None) -> "ADBFelicaLookupExResponse":
c = cls(user_id, access_code, req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c

View File

@ -1,9 +1,7 @@
from twisted.internet.protocol import Factory, Protocol
import logging, coloredlogs
from Crypto.Cipher import AES
import struct
from typing import Dict, Tuple, Callable, Union
from typing_extensions import Final
from typing import Dict, Tuple, Callable, Union, Optional
import asyncio
from logging.handlers import TimedRotatingFileHandler
from core.config import CoreConfig
@ -11,15 +9,37 @@ from core.utils import create_sega_auth_key
from core.data import Data
from .adb_handlers import *
class AimedbProtocol(Protocol):
class AimedbServlette():
request_list: Dict[int, Tuple[Callable[[bytes, int], Union[ADBBaseResponse, bytes]], int, str]] = {}
def __init__(self, core_cfg: CoreConfig) -> None:
self.logger = logging.getLogger("aimedb")
self.config = core_cfg
def __init__(self, core_cfg: CoreConfig) -> None:
self.config = core_cfg
self.data = Data(core_cfg)
if core_cfg.aimedb.key == "":
self.logger = logging.getLogger("aimedb")
if not hasattr(self.logger, "initted"):
log_fmt_str = "[%(asctime)s] Aimedb | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "aimedb"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(self.config.aimedb.loglevel)
coloredlogs.install(
level=core_cfg.aimedb.loglevel, logger=self.logger, fmt=log_fmt_str
)
self.logger.initted = True
if not core_cfg.aimedb.key:
self.logger.error("!!!KEY NOT SET!!!")
exit(1)
@ -40,27 +60,31 @@ class AimedbProtocol(Protocol):
self.register_handler(0x13, 0x14, self.handle_log_ex, 'aime_log_ex')
self.register_handler(0x64, 0x65, self.handle_hello, 'hello')
self.register_handler(0x66, 0, self.handle_goodbye, 'goodbye')
def register_handler(self, cmd: int, resp:int, handler: Callable[[bytes, int], Union[ADBBaseResponse, bytes]], name: str) -> None:
self.request_list[cmd] = (handler, resp, name)
def start(self) -> None:
self.logger.info(f"Start on port {self.config.aimedb.port}")
addr = self.config.aimedb.listen_address if self.config.aimedb.listen_address else self.config.server.listen_address
asyncio.create_task(asyncio.start_server(self.dataReceived, addr, self.config.aimedb.port))
async def dataReceived(self, reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
self.logger.debug(f"Connection made from {writer.get_extra_info('peername')[0]}")
while True:
try:
data: bytes = await reader.read(4096)
if len(data) == 0:
self.logger.debug("Connection closed")
return
await self.process_data(data, reader, writer)
await writer.drain()
except ConnectionResetError as e:
self.logger.debug("Connection reset, disconnecting")
return
def append_padding(self, data: bytes):
"""Appends 0s to the end of the data until it's at the correct size"""
length = struct.unpack_from("<H", data, 6)
padding_size = length[0] - len(data)
data += bytes(padding_size)
return data
def connectionMade(self) -> None:
self.logger.debug(f"{self.transport.getPeer().host} Connected")
def connectionLost(self, reason) -> None:
self.logger.debug(
f"{self.transport.getPeer().host} Disconnected - {reason.value}"
)
def dataReceived(self, data: bytes) -> None:
async def process_data(self, data: bytes, reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> Optional[bytes]:
addr = writer.get_extra_info('peername')[0]
cipher = AES.new(self.config.aimedb.key.encode(), AES.MODE_ECB)
try:
@ -68,9 +92,9 @@ class AimedbProtocol(Protocol):
except Exception as e:
self.logger.error(f"Failed to decrypt {data.hex()} because {e}")
return None
return
self.logger.debug(f"{self.transport.getPeer().host} wrote {decrypted.hex()}")
self.logger.debug(f"{addr} wrote {decrypted.hex()}")
try:
head = ADBHeader.from_data(decrypted)
@ -79,7 +103,9 @@ class AimedbProtocol(Protocol):
self.logger.error(f"Error parsing ADB header: {e}")
try:
encrypted = cipher.encrypt(ADBBaseResponse().make())
self.transport.write(encrypted)
writer.write(encrypted)
await writer.drain()
return
except Exception as e:
self.logger.error(f"Failed to encrypt default response because {e}")
@ -89,46 +115,51 @@ class AimedbProtocol(Protocol):
if head.keychip_id == "ABCD1234567" or head.store_id == 0xfff0:
self.logger.warning(f"Request from uninitialized AMLib: {vars(head)}")
if head.cmd == 0x66:
self.logger.info("Goodbye")
writer.close()
return
handler, resp_code, name = self.request_list.get(head.cmd, (self.handle_default, None, 'default'))
if resp_code is None:
self.logger.warning(f"No handler for cmd {hex(head.cmd)}")
elif resp_code > 0:
self.logger.info(f"{name} from {head.keychip_id} ({head.game_id}) @ {self.transport.getPeer().host}")
self.logger.info(f"{name} from {head.keychip_id} ({head.game_id}) @ {addr}")
resp = handler(decrypted, resp_code)
resp = await handler(decrypted, resp_code)
if type(resp) == ADBBaseResponse or issubclass(type(resp), ADBBaseResponse):
resp_bytes = resp.make()
if len(resp_bytes) != resp.head.length:
resp_bytes = self.append_padding(resp_bytes)
elif type(resp) == bytes:
resp_bytes = resp
elif resp is None: # Nothing to send, probably a goodbye
self.logger.warn(f"None return by handler for {name}")
return
else:
self.logger.error(f"Unsupported type returned by ADB handler for {name}: {type(resp)}")
raise TypeError(f"Unsupported type returned by ADB handler for {name}: {type(resp)}")
try:
try:
encrypted = cipher.encrypt(resp_bytes)
self.logger.debug(f"Response {resp_bytes.hex()}")
self.transport.write(encrypted)
writer.write(encrypted)
except Exception as e:
self.logger.error(f"Failed to encrypt {resp_bytes.hex()} because {e}")
def handle_default(self, data: bytes, resp_code: int, length: int = 0x20) -> ADBBaseResponse:
async def handle_default(self, data: bytes, resp_code: int, length: int = 0x20) -> ADBBaseResponse:
req = ADBHeader.from_data(data)
return ADBBaseResponse(resp_code, length, 1, req.game_id, req.store_id, req.keychip_id, req.protocol_ver)
def handle_hello(self, data: bytes, resp_code: int) -> ADBBaseResponse:
return self.handle_default(data, resp_code)
async def handle_hello(self, data: bytes, resp_code: int) -> ADBBaseResponse:
return await self.handle_default(data, resp_code)
def handle_campaign(self, data: bytes, resp_code: int) -> ADBBaseResponse:
async def handle_campaign(self, data: bytes, resp_code: int) -> ADBBaseResponse:
h = ADBHeader.from_data(data)
if h.protocol_ver >= 0x3030:
req = h
@ -143,12 +174,18 @@ class AimedbProtocol(Protocol):
# We don't currently support campaigns
return resp
def handle_lookup(self, data: bytes, resp_code: int) -> ADBBaseResponse:
async def handle_lookup(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBLookupRequest(data)
user_id = self.data.card.get_user_id_from_card(req.access_code)
is_banned = self.data.card.get_card_banned(req.access_code)
is_locked = self.data.card.get_card_locked(req.access_code)
if req.access_code == "00000000000000000000":
self.logger.warn(f"All-zero access code from {req.head.keychip_id}")
ret = ADBLookupResponse.from_req(req.head, -1)
ret.head.status = ADBStatus.BAN_SYS
return ret
user_id = await self.data.card.get_user_id_from_card(req.access_code)
is_banned = await self.data.card.get_card_banned(req.access_code)
is_locked = await self.data.card.get_card_locked(req.access_code)
ret = ADBLookupResponse.from_req(req.head, user_id)
if is_banned and is_locked:
ret.head.status = ADBStatus.BAN_SYS_USER
@ -160,14 +197,26 @@ class AimedbProtocol(Protocol):
self.logger.info(
f"access_code {req.access_code} -> user_id {ret.user_id}"
)
if user_id and user_id > 0:
await self.data.card.update_card_last_login(req.access_code)
if (req.access_code.startswith("010") or req.access_code.startswith("3")) and req.serial_number != 0x04030201: # Default segatools sn
await self.data.card.set_chip_id_by_access_code(req.access_code, req.serial_number)
self.logger.info(f"Attempt to set chip id to {req.serial_number:08X} for access code {req.access_code}")
return ret
def handle_lookup_ex(self, data: bytes, resp_code: int) -> ADBBaseResponse:
async def handle_lookup_ex(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBLookupRequest(data)
user_id = self.data.card.get_user_id_from_card(req.access_code)
if req.access_code == "00000000000000000000":
self.logger.warn(f"All-zero access code from {req.head.keychip_id}")
ret = ADBLookupExResponse.from_req(req.head, -1)
ret.head.status = ADBStatus.BAN_SYS
return ret
user_id = await self.data.card.get_user_id_from_card(req.access_code)
is_banned = self.data.card.get_card_banned(req.access_code)
is_locked = self.data.card.get_card_locked(req.access_code)
is_banned = await self.data.card.get_card_banned(req.access_code)
is_locked = await self.data.card.get_card_locked(req.access_code)
ret = ADBLookupExResponse.from_req(req.head, user_id)
if is_banned and is_locked:
@ -189,40 +238,67 @@ class AimedbProtocol(Protocol):
self.logger.debug(f"Generated auth token {auth_key}")
ret.auth_key = auth_key_full
if user_id and user_id > 0:
await self.data.card.update_card_last_login(req.access_code)
return ret
def handle_felica_lookup(self, data: bytes, resp_code: int) -> bytes:
async def handle_felica_lookup(self, data: bytes, resp_code: int) -> bytes:
"""
On official, I think a card has to be registered for this to actually work, but
I'm making the executive decision to not implement that and just kick back our
faux generated access code. The real felica IDm -> access code conversion is done
on the ADB server, which we do not and will not ever have access to. Because we can
assure that all IDms will be unique, this basic 0-padded hex -> int conversion will
be fine.
On official, the IDm is used as a key to look up the stored access code in a large
database. We do not have access to that database so we have to make due with what we got.
Interestingly, namco games are able to read S_PAD0 and send the server the correct access
code, but aimedb doesn't. Until somebody either enters the correct code manually, or scans
on a game that reads it correctly from the card, this will have to do. It's the same conversion
used on the big boy networks.
"""
req = ADBFelicaLookupRequest(data)
ac = self.data.card.to_access_code(req.idm)
idm = req.idm.zfill(16)
if idm == "0000000000000000":
self.logger.warn(f"All-zero IDm from {req.head.keychip_id}")
ret = ADBFelicaLookupResponse.from_req(req.head, "00000000000000000000")
ret.head.status = ADBStatus.BAN_SYS
return ret
card = await self.data.card.get_card_by_idm(idm)
if not card:
ac = self.data.card.to_access_code(idm)
test = await self.data.card.get_card_by_access_code(ac)
if test:
await self.data.card.set_idm_by_access_code(ac, idm)
else:
ac = card['access_code']
self.logger.info(
f"idm {req.idm} ipm {req.pmm} -> access_code {ac}"
f"idm {idm} ipm {req.pmm.zfill(16)} -> access_code {ac}"
)
return ADBFelicaLookupResponse.from_req(req.head, ac)
def handle_felica_register(self, data: bytes, resp_code: int) -> bytes:
async def handle_felica_register(self, data: bytes, resp_code: int) -> bytes:
"""
I've never seen this used.
Used to register felica moble access codes. Will never be used on our network
because we don't implement felica_lookup properly.
"""
req = ADBFelicaLookupRequest(data)
idm = req.idm.zfill(16)
if idm == "0000000000000000":
self.logger.warn(f"All-zero IDm from {req.head.keychip_id}")
ret = ADBFelicaLookupResponse.from_req(req.head, "00000000000000000000")
ret.head.status = ADBStatus.BAN_SYS
return ret
ac = self.data.card.to_access_code(req.idm)
if self.config.server.allow_user_registration:
user_id = self.data.user.create_user()
user_id = await self.data.user.create_user()
if user_id is None:
self.logger.error("Failed to register user!")
user_id = -1
else:
card_id = self.data.card.create_card(user_id, ac)
card_id = await self.data.card.create_card(user_id, ac)
if card_id is None:
self.logger.error("Failed to register card!")
@ -237,21 +313,49 @@ class AimedbProtocol(Protocol):
f"Registration blocked!: access code {ac} (IDm: {req.idm} PMm: {req.pmm})"
)
if user_id > 0:
await self.data.card.update_card_last_login(ac)
return ADBFelicaLookupResponse.from_req(req.head, ac)
def handle_felica_lookup_ex(self, data: bytes, resp_code: int) -> bytes:
req = ADBFelicaLookup2Request(data)
access_code = self.data.card.to_access_code(req.idm)
user_id = self.data.card.get_user_id_from_card(access_code=access_code)
async def handle_felica_lookup_ex(self, data: bytes, resp_code: int) -> bytes:
req = ADBFelicaLookupExRequest(data)
user_id = None
idm = req.idm.zfill(16)
if idm == "0000000000000000":
self.logger.warn(f"All-zero IDm from {req.head.keychip_id}")
ret = ADBFelicaLookupExResponse.from_req(req.head, -1, "00000000000000000000")
ret.head.status = ADBStatus.BAN_SYS
return ret
card = await self.data.card.get_card_by_idm(idm)
if not card:
access_code = self.data.card.to_access_code(idm)
card = await self.data.card.get_card_by_access_code(access_code)
if card:
user_id = card['user']
await self.data.card.set_idm_by_access_code(access_code, idm)
else:
user_id = card['user']
access_code = card['access_code']
if user_id is None:
user_id = -1
self.logger.info(
f"idm {req.idm} ipm {req.pmm} -> access_code {access_code} user_id {user_id}"
f"idm {idm} ipm {req.pmm} -> access_code {access_code} user_id {user_id}"
)
resp = ADBFelicaLookup2Response.from_req(req.head, user_id, access_code)
resp = ADBFelicaLookupExResponse.from_req(req.head, user_id, access_code)
if user_id > 0:
if card['is_banned'] and card['is_locked']:
resp.head.status = ADBStatus.BAN_SYS_USER
elif card['is_banned']:
resp.head.status = ADBStatus.BAN_SYS
elif card['is_locked']:
resp.head.status = ADBStatus.LOCK_USER
if user_id and user_id > 0 and self.config.aimedb.id_secret:
auth_key = create_sega_auth_key(user_id, req.head.game_id, req.head.store_id, req.head.keychip_id, self.config.aimedb.id_secret, self.config.aimedb.id_lifetime_seconds)
@ -260,10 +364,12 @@ class AimedbProtocol(Protocol):
auth_key_full = auth_key.encode() + (b"\0" * auth_key_extra_len)
self.logger.debug(f"Generated auth token {auth_key}")
resp.auth_key = auth_key_full
if user_id and user_id > 0:
await self.data.card.update_card_last_login(access_code)
return resp
def handle_campaign_clear(self, data: bytes, resp_code: int) -> ADBBaseResponse:
async def handle_campaign_clear(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBCampaignClearRequest(data)
resp = ADBCampaignClearResponse.from_req(req.head)
@ -271,19 +377,25 @@ class AimedbProtocol(Protocol):
# We don't support campaign stuff
return resp
def handle_register(self, data: bytes, resp_code: int) -> bytes:
async def handle_register(self, data: bytes, resp_code: int) -> bytes:
req = ADBLookupRequest(data)
user_id = -1
if req.access_code == "00000000000000000000":
self.logger.warn(f"All-zero access code from {req.head.keychip_id}")
ret = ADBLookupResponse.from_req(req.head, -1)
ret.head.status = ADBStatus.BAN_SYS
return ret
if self.config.server.allow_user_registration:
user_id = self.data.user.create_user()
user_id = await self.data.user.create_user()
if user_id is None:
self.logger.error("Failed to register user!")
user_id = -1
else:
card_id = self.data.card.create_card(user_id, req.access_code)
card_id = await self.data.card.create_card(user_id, req.access_code)
if card_id is None:
self.logger.error("Failed to register card!")
@ -297,25 +409,38 @@ class AimedbProtocol(Protocol):
self.logger.info(
f"Registration blocked!: access code {req.access_code}"
)
if user_id > 0:
if (req.access_code.startswith("010") or req.access_code.startswith("3")) and req.serial_number != 0x04030201: # Default segatools sn:
await self.data.card.set_chip_id_by_access_code(req.access_code, req.serial_number)
self.logger.info(f"Attempt to set chip id to {req.serial_number} for access code {req.access_code}")
elif req.access_code.startswith("0008"):
idm = self.data.card.to_idm(req.access_code)
await self.data.card.set_idm_by_access_code(req.access_code, idm)
self.logger.info(f"Attempt to set IDm to {idm} for access code {req.access_code}")
resp = ADBLookupResponse.from_req(req.head, user_id)
if resp.user_id <= 0:
resp.head.status = ADBStatus.BAN_SYS # Closest we can get to a "You cannot register"
else:
await self.data.card.update_card_last_login(req.access_code)
return resp
# TODO: Save these in some capacity, as deemed relevant
def handle_status_log(self, data: bytes, resp_code: int) -> bytes:
async def handle_status_log(self, data: bytes, resp_code: int) -> bytes:
req = ADBStatusLogRequest(data)
self.logger.info(f"User {req.aime_id} logged {req.status.name} event")
return ADBBaseResponse(resp_code, 0x20, 1, req.head.game_id, req.head.store_id, req.head.keychip_id, req.head.protocol_ver)
def handle_log(self, data: bytes, resp_code: int) -> bytes:
async def handle_log(self, data: bytes, resp_code: int) -> bytes:
req = ADBLogRequest(data)
self.logger.info(f"User {req.aime_id} logged {req.status.name} event, credit_ct: {req.credit_ct} bet_ct: {req.bet_ct} won_ct: {req.won_ct}")
return ADBBaseResponse(resp_code, 0x20, 1, req.head.game_id, req.head.store_id, req.head.keychip_id, req.head.protocol_ver)
def handle_log_ex(self, data: bytes, resp_code: int) -> bytes:
async def handle_log_ex(self, data: bytes, resp_code: int) -> bytes:
req = ADBLogExRequest(data)
strs = []
self.logger.info(f"Recieved {req.num_logs} or {len(req.logs)} logs")
@ -324,43 +449,3 @@ class AimedbProtocol(Protocol):
self.logger.debug(f"User {req.logs[x].aime_id} logged {req.logs[x].status.name} event, credit_ct: {req.logs[x].credit_ct} bet_ct: {req.logs[x].bet_ct} won_ct: {req.logs[x].won_ct}")
return ADBLogExResponse.from_req(req.head)
def handle_goodbye(self, data: bytes, resp_code: int) -> None:
self.logger.info(f"goodbye from {self.transport.getPeer().host}")
self.transport.loseConnection()
return
class AimedbFactory(Factory):
protocol = AimedbProtocol
def __init__(self, cfg: CoreConfig) -> None:
self.config = cfg
log_fmt_str = "[%(asctime)s] Aimedb | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
self.logger = logging.getLogger("aimedb")
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "aimedb"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(self.config.aimedb.loglevel)
coloredlogs.install(
level=cfg.aimedb.loglevel, logger=self.logger, fmt=log_fmt_str
)
if self.config.aimedb.key == "":
self.logger.error("Please set 'key' field in your config file.")
exit(1)
self.logger.info(f"Ready on port {self.config.aimedb.port}")
def buildProtocol(self, addr):
return AimedbProtocol(self.config)

View File

@ -1,20 +1,24 @@
from typing import Dict, List, Any, Optional, Tuple, Union, Final
import logging, coloredlogs
from logging.handlers import TimedRotatingFileHandler
from twisted.web.http import Request
from datetime import datetime
import pytz
import base64
import zlib
import json
import yaml
import logging
import coloredlogs
import urllib.parse
import math
from typing import Dict, List, Any, Optional, Union, Final
from logging.handlers import TimedRotatingFileHandler
from starlette.requests import Request
from starlette.responses import PlainTextResponse
from starlette.applications import Starlette
from starlette.routing import Route
from datetime import datetime
from enum import Enum
from Crypto.PublicKey import RSA
from Crypto.Hash import SHA
from Crypto.Signature import PKCS1_v1_5
from time import strptime
from os import path
import urllib.parse
import math
from os import path, environ, mkdir, access, W_OK
from .config import CoreConfig
from .utils import Utils
@ -90,8 +94,8 @@ class DLI_STATUS(Enum):
return cls.UNKNOWN
class AllnetServlet:
allnet_registry: Dict[str, Any] = {}
def __init__(self, core_cfg: CoreConfig, cfg_folder: str):
super().__init__()
self.config = core_cfg
self.config_folder = cfg_folder
self.data = Data(core_cfg)
@ -120,25 +124,22 @@ class AllnetServlet:
)
self.logger.initialized = True
plugins = Utils.get_all_titles()
def startup(self) -> None:
self.logger.info(f"Ready on port {self.config.allnet.port if self.config.allnet.standalone else self.config.server.port}")
if not TitleServlet.title_registry:
TitleServlet(self.config, self.config_folder)
if len(plugins) == 0:
self.logger.error("No games detected!")
self.logger.info(
f"Serving {len(TitleServlet.title_registry)} game codes port {core_cfg.allnet.port}"
)
def handle_poweron(self, request: Request, _: Dict):
async def handle_poweron(self, request: Request):
request_ip = Utils.get_ip_addr(request)
pragma_header = request.getHeader('Pragma')
is_dfi = pragma_header is not None and pragma_header == "DFI"
pragma_header = request.headers.get('Pragma', "")
is_dfi = pragma_header == "DFI"
data = await request.body()
try:
if is_dfi:
req_urlencode = self.from_dfi(request.content.getvalue())
req_urlencode = self.from_dfi(data)
else:
req_urlencode = request.content.getvalue().decode()
req_urlencode = data
req_dict = self.allnet_req_to_dict(req_urlencode)
if req_dict is None:
@ -155,7 +156,7 @@ class AllnetServlet:
except AllnetRequestException as e:
if e.message != "":
self.logger.error(e)
return b""
return PlainTextResponse()
if req.format_ver == 3:
resp = AllnetPowerOnResponse3(req.token)
@ -166,44 +167,54 @@ class AllnetServlet:
self.logger.debug(f"Allnet request: {vars(req)}")
machine = self.data.arcade.get_machine(req.serial)
machine = await self.data.arcade.get_machine(req.serial)
if machine is None and not self.config.server.allow_unregistered_serials:
msg = f"Unrecognised serial {req.serial} attempted allnet auth from {request_ip}."
self.data.base.log_event(
"allnet", "ALLNET_AUTH_UNKNOWN_SERIAL", logging.WARN, msg
await self.data.base.log_event(
"allnet", "ALLNET_AUTH_UNKNOWN_SERIAL", logging.WARN, msg, {"serial": req.serial}, None, None, None, request_ip, req.game_id, req.ver
)
self.logger.warning(msg)
resp.stat = ALLNET_STAT.bad_machine.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return (urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n").encode("utf-8")
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
if machine is not None:
arcade = self.data.arcade.get_arcade(machine["arcade"])
arcade = await self.data.arcade.get_arcade(machine["arcade"])
if self.config.server.check_arcade_ip:
if arcade["ip"] and arcade["ip"] is not None and arcade["ip"] != req.ip:
msg = f"Serial {req.serial} attempted allnet auth from bad IP {req.ip} (expected {arcade['ip']})."
self.data.base.log_event(
"allnet", "ALLNET_AUTH_BAD_IP", logging.ERROR, msg
msg = f"{req.serial} attempted allnet auth from bad IP {req.ip} (expected {arcade['ip']})."
await self.data.base.log_event(
"allnet", "ALLNET_AUTH_BAD_IP", logging.ERROR, msg, {}, None, arcade['id'], machine['id'], request_ip, req.game_id, req.ver
)
self.logger.warning(msg)
resp.stat = ALLNET_STAT.bad_shop.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return (urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n").encode("utf-8")
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
elif (not arcade["ip"] or arcade["ip"] is None) and self.config.server.strict_ip_checking:
msg = f"Serial {req.serial} attempted allnet auth from bad IP {req.ip}, but arcade {arcade['id']} has no IP set! (strict checking enabled)."
self.data.base.log_event(
"allnet", "ALLNET_AUTH_NO_SHOP_IP", logging.ERROR, msg
msg = f"{req.serial} attempted allnet auth from bad IP {req.ip}, but arcade {arcade['id']} has no IP set! (strict checking enabled)."
await self.data.base.log_event(
"allnet", "ALLNET_AUTH_NO_SHOP_IP", logging.ERROR, msg, {}, None, arcade['id'], machine['id'], request_ip, req.game_id, req.ver
)
self.logger.warning(msg)
resp.stat = ALLNET_STAT.bad_shop.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return (urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n").encode("utf-8")
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
if machine['game'] and machine['game'] != req.game_id:
msg = f"{req.serial} attempted allnet auth with bad game ID {req.game_id} (expected {machine['game']})."
await self.data.base.log_event(
"allnet", "ALLNET_AUTH_BAD_GAME", logging.ERROR, msg, {}, None, arcade['id'], machine['id'], request_ip, req.game_id, req.ver
)
self.logger.warning(msg)
resp.stat = ALLNET_STAT.bad_game.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
country = (
arcade["country"] if machine["country"] is None else machine["country"]
)
@ -211,7 +222,7 @@ class AllnetServlet:
country = AllnetCountryCode.JAPAN.value
resp.country = country
resp.place_id = arcade["id"]
resp.place_id = f"{arcade['id']:04X}"
resp.allnet_id = machine["id"]
resp.name = arcade["name"] if arcade["name"] is not None else ""
resp.nickname = arcade["nickname"] if arcade["nickname"] is not None else ""
@ -235,60 +246,82 @@ class AllnetServlet:
arcade["timezone"] if arcade["timezone"] is not None else "+0900" if req.format_ver == 3 else "+09:00"
)
else:
arcade = None
if req.game_id not in TitleServlet.title_registry:
if not self.config.server.is_develop:
msg = f"Unrecognised game {req.game_id} attempted allnet auth from {request_ip}."
self.data.base.log_event(
"allnet", "ALLNET_AUTH_UNKNOWN_GAME", logging.WARN, msg
await self.data.base.log_event(
"allnet", "ALLNET_AUTH_UNKNOWN_GAME", logging.WARN, msg, {}, None, arcade['id'] if arcade else None, machine['id'] if machine else None, request_ip, req.game_id, req.ver
)
self.logger.warning(msg)
resp.stat = ALLNET_STAT.bad_game.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return (urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n").encode("utf-8")
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
else:
self.logger.info(
f"Allowed unknown game {req.game_id} v{req.ver} to authenticate from {request_ip} due to 'is_develop' being enabled. S/N: {req.serial}"
)
resp.uri = f"http://{self.config.title.hostname}:{self.config.title.port}/{req.game_id}/{req.ver.replace('.', '')}/"
resp.host = f"{self.config.title.hostname}:{self.config.title.port}"
resp.uri = f"http://{self.config.server.hostname}:{self.config.server.port}/{req.game_id}/{req.ver.replace('.', '')}/"
resp.host = f"{self.config.server.hostname}:{self.config.server.port}"
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
resp_str = urllib.parse.unquote(urllib.parse.urlencode(resp_dict))
self.logger.debug(f"Allnet response: {resp_str}")
return (resp_str + "\n").encode("utf-8")
return PlainTextResponse(resp_str + "\n")
int_ver = req.ver.replace(".", "")
resp.uri, resp.host = TitleServlet.title_registry[req.game_id].get_allnet_info(req.game_id, int(int_ver), req.serial)
try:
resp.uri, resp.host = TitleServlet.title_registry[req.game_id].get_allnet_info(req.game_id, int(int_ver), req.serial)
except Exception as e:
self.logger.error(f"Error running get_allnet_info for {req.game_id} - {e}")
resp.stat = ALLNET_STAT.bad_game.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
msg = f"{req.serial} authenticated from {request_ip}: {req.game_id} v{req.ver}"
self.data.base.log_event("allnet", "ALLNET_AUTH_SUCCESS", logging.INFO, msg)
if machine and arcade:
msg = f"{req.serial} authenticated from {request_ip}: {req.game_id} v{req.ver}"
await self.data.base.log_event(
"allnet", "ALLNET_AUTH_SUCCESS", logging.INFO, msg, {}, None, arcade['id'], machine['id'], request_ip, req.game_id, req.ver
)
else:
msg = f"Allow unregistered serial {req.serial} to authenticate from {request_ip}: {req.game_id} v{req.ver}"
await self.data.base.log_event(
"allnet", "ALLNET_AUTH_SUCCESS_UNREG", logging.INFO, msg, {"serial": req.serial}, None, None, None, request_ip, req.game_id, req.ver
)
self.logger.info(msg)
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
resp_str = urllib.parse.unquote(urllib.parse.urlencode(resp_dict))
resp_str = urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n"
self.logger.debug(f"Allnet response: {resp_dict}")
resp_str += "\n"
"""if is_dfi:
request.responseHeaders.addRawHeader('Pragma', 'DFI')
return self.to_dfi(resp_str)"""
if is_dfi:
return PlainTextResponse(
content=self.to_dfi(resp_str) + b"\r\n",
headers={
"Pragma": "DFI",
},
)
return resp_str.encode("utf-8")
return PlainTextResponse(resp_str)
def handle_dlorder(self, request: Request, _: Dict):
async def handle_dlorder(self, request: Request):
request_ip = Utils.get_ip_addr(request)
pragma_header = request.getHeader('Pragma')
is_dfi = pragma_header is not None and pragma_header == "DFI"
pragma_header = request.headers.get('Pragma', "")
is_dfi = pragma_header == "DFI"
data = await request.body()
try:
if is_dfi:
req_urlencode = self.from_dfi(request.content.getvalue())
req_urlencode = self.from_dfi(data)
else:
req_urlencode = request.content.getvalue().decode()
req_urlencode = data.decode()
req_dict = self.allnet_req_to_dict(req_urlencode)
if req_dict is None:
@ -305,7 +338,7 @@ class AllnetServlet:
except AllnetRequestException as e:
if e.message != "":
self.logger.error(e)
return b""
return PlainTextResponse()
self.logger.info(
f"DownloadOrder from {request_ip} -> {req.game_id} v{req.ver} serial {req.serial}"
@ -316,54 +349,71 @@ class AllnetServlet:
not self.config.allnet.allow_online_updates
or not self.config.allnet.update_cfg_folder
):
return urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n"
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n")
else:
machine = await self.data.arcade.get_machine(req.serial)
if not machine or not machine['ota_enable'] or not machine['is_cab'] or machine['is_blacklisted']:
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n")
else: # TODO: Keychip check
if path.exists(
f"{self.config.allnet.update_cfg_folder}/{req.game_id}-{req.ver.replace('.', '')}-app.ini"
):
resp.uri = f"http://{self.config.title.hostname}:{self.config.title.port}/dl/ini/{req.game_id}-{req.ver.replace('.', '')}-app.ini"
resp.uri = f"http://{self.config.server.hostname}:{self.config.server.port}/dl/ini/{req.game_id}-{req.ver.replace('.', '')}-app.ini"
if path.exists(
f"{self.config.allnet.update_cfg_folder}/{req.game_id}-{req.ver.replace('.', '')}-opt.ini"
):
resp.uri += f"|http://{self.config.title.hostname}:{self.config.title.port}/dl/ini/{req.game_id}-{req.ver.replace('.', '')}-opt.ini"
resp.uri += f"|http://{self.config.server.hostname}:{self.config.server.port}/dl/ini/{req.game_id}-{req.ver.replace('.', '')}-opt.ini"
self.logger.debug(f"Sending download uri {resp.uri}")
self.data.base.log_event("allnet", "DLORDER_REQ_SUCCESS", logging.INFO, f"{Utils.get_ip_addr(request)} requested DL Order for {req.serial} {req.game_id} v{req.ver}")
if resp.uri:
self.logger.info(f"Sending download uri {resp.uri}")
await self.data.base.log_event(
"allnet", "DLORDER_REQ_SUCCESS", logging.INFO, f"Send download URI to {req.serial} for {req.game_id} v{req.ver} from {Utils.get_ip_addr(request)}", {"uri": resp.uri}, None,
machine['arcade'], machine['id'], request_ip, req.game_id, req.ver
)
# Maybe add a log event for checkin but no url sent?
res_str = urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n"
"""if is_dfi:
request.responseHeaders.addRawHeader('Pragma', 'DFI')
return self.to_dfi(res_str)"""
if is_dfi:
return PlainTextResponse(
content=self.to_dfi(res_str) + b"\r\n",
headers={
"Pragma": "DFI",
},
)
return res_str
return PlainTextResponse(res_str)
def handle_dlorder_ini(self, request: Request, match: Dict) -> bytes:
if "file" not in match:
return b""
async def handle_dlorder_ini(self, request: Request) -> bytes:
req_file = request.path_params.get("file", "").replace("%0A", "").replace("\n", "")
request_ip = Utils.get_ip_addr(request)
req_file = match["file"].replace("%0A", "")
if not req_file:
return PlainTextResponse(status_code=404)
if path.exists(f"{self.config.allnet.update_cfg_folder}/{req_file}"):
self.logger.info(f"Request for DL INI file {req_file} from {Utils.get_ip_addr(request)} successful")
self.data.base.log_event("allnet", "DLORDER_INI_SENT", logging.INFO, f"{Utils.get_ip_addr(request)} successfully recieved {req_file}")
self.logger.info(f"Request for DL INI file {req_file} from {request_ip} successful")
await self.data.base.log_event(
"allnet", "DLORDER_INI_SENT", logging.INFO, f"{request_ip} successfully recieved {req_file}", {"file": req_file}, ip=request_ip
)
return open(
f"{self.config.allnet.update_cfg_folder}/{req_file}", "rb"
).read()
return PlainTextResponse(open(
f"{self.config.allnet.update_cfg_folder}/{req_file}", "r", encoding="utf-8"
).read())
self.logger.info(f"DL INI File {req_file} not found")
return b""
return PlainTextResponse()
def handle_dlorder_report(self, request: Request, match: Dict) -> bytes:
req_raw = request.content.getvalue()
async def handle_dlorder_report(self, request: Request) -> bytes:
req_raw = await request.body()
client_ip = Utils.get_ip_addr(request)
try:
req_dict: Dict = json.loads(req_raw)
except Exception as e:
self.logger.warning(f"Failed to parse DL Report: {e}")
return "NG"
return PlainTextResponse("NG")
dl_data_type = DLIMG_TYPE.app
dl_data = req_dict.get("appimage", {})
@ -374,24 +424,30 @@ class AllnetServlet:
if dl_data is None or not dl_data:
self.logger.warning(f"Failed to parse DL Report: Invalid format - contains neither appimage nor optimage")
return "NG"
return PlainTextResponse("NG")
rep = DLReport(dl_data, dl_data_type)
if not rep.validate():
self.logger.warning(f"Failed to parse DL Report: Invalid format - {rep.err}")
return "NG"
return PlainTextResponse("NG")
msg = f"{rep.serial} @ {client_ip} reported {rep.rep_type.name} download state {rep.rf_state.name} for {rep.gd} v{rep.dav}:"\
f" {rep.tdsc}/{rep.tsc} segments downloaded for working files {rep.wfl} with {rep.dfl if rep.dfl else 'none'} complete."
self.data.base.log_event("allnet", "DL_REPORT", logging.INFO, msg, dl_data)
machine = await self.data.arcade.get_machine(rep.serial)
if machine:
await self.data.base.log_event("allnet", "DL_REPORT", logging.INFO, msg, dl_data, None, machine['arcade'], machine['id'], client_ip, rep.gd, rep.dav)
else:
msg = "Unknown serial " + msg
await self.data.base.log_event("allnet", "DL_REPORT_UNREG", logging.INFO, msg, dl_data, None, None, None, client_ip, rep.gd, rep.dav)
self.logger.info(msg)
return "OK"
return PlainTextResponse("OK")
def handle_loaderstaterecorder(self, request: Request, match: Dict) -> bytes:
req_data = request.content.getvalue()
async def handle_loaderstaterecorder(self, request: Request) -> bytes:
req_data = await request.body()
sections = req_data.decode("utf-8").split("\r\n")
req_dict = dict(urllib.parse.parse_qsl(sections[0]))
@ -403,130 +459,27 @@ class AllnetServlet:
ip = Utils.get_ip_addr(request)
if serial is None or num_files_dld is None or num_files_to_dl is None or dl_state is None:
return "NG".encode()
return PlainTextResponse("NG")
self.logger.info(f"LoaderStateRecorder Request from {ip} {serial}: {num_files_dld}/{num_files_to_dl} Files download (State: {dl_state})")
return "OK".encode()
def handle_alive(self, request: Request, match: Dict) -> bytes:
return "OK".encode()
def handle_billing_request(self, request: Request, _: Dict):
req_raw = request.content.getvalue()
msg = f"LoaderStateRecorder Request from {ip} {serial}: {num_files_dld}/{num_files_to_dl} Files download (State: {dl_state})"
machine = await self.data.arcade.get_machine(serial)
if machine:
await self.data.base.log_event("allnet", "LSR_REPORT", logging.INFO, msg, req_dict, None, machine['arcade'], machine['id'], ip)
if request.getHeader('Content-Type') == "application/octet-stream":
req_unzip = zlib.decompressobj(-zlib.MAX_WBITS).decompress(req_raw)
else:
req_unzip = req_raw
msg = "Unregistered " + msg
await self.data.base.log_event("allnet", "LSR_REPORT_UNREG", logging.INFO, msg, req_dict, None, None, None, ip)
req_dict = self.billing_req_to_dict(req_unzip)
request_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(f"Failed to parse request {request.content.getvalue()}")
return b""
self.logger.debug(f"request {req_dict}")
rsa = RSA.import_key(open(self.config.billing.signing_key, "rb").read())
signer = PKCS1_v1_5.new(rsa)
digest = SHA.new()
traces: List[TraceData] = []
try:
req = BillingInfo(req_dict[0])
except KeyError as e:
self.logger.error(f"Billing request failed to parse: {e}")
return f"result=5&linelimit=&message=field is missing or formatting is incorrect\r\n".encode()
for x in range(1, len(req_dict)):
if not req_dict[x]:
continue
try:
tmp = TraceData(req_dict[x])
if tmp.trace_type == TraceDataType.CHARGE:
tmp = TraceDataCharge(req_dict[x])
elif tmp.trace_type == TraceDataType.EVENT:
tmp = TraceDataEvent(req_dict[x])
elif tmp.trace_type == TraceDataType.CREDIT:
tmp = TraceDataCredit(req_dict[x])
traces.append(tmp)
except KeyError as e:
self.logger.warn(f"Tracelog failed to parse: {e}")
kc_serial_bytes = req.keychipid.encode()
machine = self.data.arcade.get_machine(req.keychipid)
if machine is None and not self.config.server.allow_unregistered_serials:
msg = f"Unrecognised serial {req.keychipid} attempted billing checkin from {request_ip} for {req.gameid} v{req.gamever}."
self.data.base.log_event(
"allnet", "BILLING_CHECKIN_NG_SERIAL", logging.WARN, msg
)
self.logger.warning(msg)
return f"result=1&requestno={req.requestno}&message=Keychip Serial bad\r\n".encode()
msg = (
f"Billing checkin from {request_ip}: game {req.gameid} ver {req.gamever} keychip {req.keychipid} playcount "
f"{req.playcnt} billing_type {req.billingtype.name} nearfull {req.nearfull} playlimit {req.playlimit}"
)
self.logger.info(msg)
self.data.base.log_event("billing", "BILLING_CHECKIN_OK", logging.INFO, msg)
if req.traceleft > 0:
self.logger.warn(f"{req.traceleft} unsent tracelogs")
kc_playlimit = req.playlimit
kc_nearfull = req.nearfull
return PlainTextResponse("OK")
async def handle_alive(self, request: Request) -> bytes:
return PlainTextResponse("OK")
while req.playcnt > req.playlimit:
kc_playlimit += 1024
kc_nearfull += 1024
playlimit = kc_playlimit
nearfull = kc_nearfull + (req.billingtype.value * 0x00010000)
digest.update(playlimit.to_bytes(4, "little") + kc_serial_bytes)
playlimit_sig = signer.sign(digest).hex()
digest = SHA.new()
digest.update(nearfull.to_bytes(4, "little") + kc_serial_bytes)
nearfull_sig = signer.sign(digest).hex()
# TODO: playhistory
#resp = BillingResponse(playlimit, playlimit_sig, nearfull, nearfull_sig)
resp = BillingResponse(playlimit, playlimit_sig, nearfull, nearfull_sig, req.requestno, req.protocolver)
resp_str = urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\r\n"
self.logger.debug(f"response {vars(resp)}")
if req.traceleft > 0:
self.logger.info(f"Requesting 20 more of {req.traceleft} unsent tracelogs")
return f"result=6&waittime=0&linelimit=20\r\n".encode()
return resp_str.encode("utf-8")
def handle_naomitest(self, request: Request, _: Dict) -> bytes:
self.logger.info(f"Ping from {Utils.get_ip_addr(request)}")
return b"naomi ok"
def billing_req_to_dict(self, data: bytes):
"""
Parses an billing request string into a python dictionary
"""
try:
sections = data.decode("ascii").split("\r\n")
ret = []
for x in sections:
ret.append(dict(urllib.parse.parse_qsl(x)))
return ret
except Exception as e:
self.logger.error(f"billing_req_to_dict: {e} while parsing {data}")
return None
async def handle_naomitest(self, request: Request) -> bytes:
# This could be spam-able, removing
#self.logger.info(f"Ping from {Utils.get_ip_addr(request)}")
return PlainTextResponse("naomi ok")
def allnet_req_to_dict(self, data: str) -> Optional[List[Dict[str, Any]]]:
"""
@ -554,6 +507,167 @@ class AllnetServlet:
zipped = zlib.compress(unzipped)
return base64.b64encode(zipped)
class BillingServlet:
def __init__(self, core_cfg: CoreConfig, cfg_folder: str) -> None:
self.config = core_cfg
self.config_folder = cfg_folder
self.data = Data(core_cfg)
self.logger = logging.getLogger("billing")
if not hasattr(self.logger, "initialized"):
log_fmt_str = "[%(asctime)s] Billing | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "billing"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(core_cfg.allnet.loglevel)
coloredlogs.install(
level=core_cfg.billing.loglevel, logger=self.logger, fmt=log_fmt_str
)
self.logger.initialized = True
def startup(self) -> None:
self.logger.info(f"Ready on port {self.config.billing.port if self.config.billing.standalone else self.config.server.port}")
def billing_req_to_dict(self, data: bytes):
"""
Parses a billing request string into a list of dictionaries, one per CRLF-separated section
"""
try:
sections = data.decode("ascii").split("\r\n")
ret = []
for x in sections:
ret.append(dict(urllib.parse.parse_qsl(x)))
return ret
except Exception as e:
self.logger.error(f"billing_req_to_dict: {e} while parsing {data}")
return None
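For context, a minimal sketch of the body format this helper expects (inferred from the parser itself, not from a protocol spec): URL-encoded key=value pairs, one record per CRLF-separated section, where the first record feeds `BillingInfo` and the remaining ones feed `TraceData`. Every field name and value below is made up for illustration.

```python
import urllib.parse

# Hypothetical two-record billing body: a header record plus one tracelog record.
sample = b"keychipid=A63E-01A00000000&gameid=SDBT&gamever=2.30&playcnt=123&requestno=1\r\ntrace_type=1&cnt=1"

sections = sample.decode("ascii").split("\r\n")
records = [dict(urllib.parse.parse_qsl(s)) for s in sections]

print(records[0])   # header fields, as consumed by BillingInfo
print(records[1:])  # remaining sections, as consumed by TraceData
```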
async def handle_billing_request(self, request: Request):
req_raw = await request.body()
if request.headers.get('Content-Type', '') == "application/octet-stream":
req_unzip = zlib.decompressobj(-zlib.MAX_WBITS).decompress(req_raw)
else:
req_unzip = req_raw
req_dict = self.billing_req_to_dict(req_unzip)
request_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(f"Failed to parse request {req_raw}")
return PlainTextResponse()
self.logger.debug(f"request {req_dict}")
rsa = RSA.import_key(open(self.config.billing.signing_key, "rb").read())
signer = PKCS1_v1_5.new(rsa)
digest = SHA.new()
traces: List[TraceData] = []
try:
req = BillingInfo(req_dict[0])
except KeyError as e:
self.logger.error(f"Billing request failed to parse: {e}")
return PlainTextResponse("result=5&linelimit=&message=field is missing or formatting is incorrect\r\n")
for x in range(1, len(req_dict)):
if not req_dict[x]:
continue
try:
tmp = TraceData(req_dict[x])
if tmp.trace_type == TraceDataType.CHARGE:
tmp = TraceDataCharge(req_dict[x])
elif tmp.trace_type == TraceDataType.EVENT:
tmp = TraceDataEvent(req_dict[x])
elif tmp.trace_type == TraceDataType.CREDIT:
tmp = TraceDataCredit(req_dict[x])
traces.append(tmp)
except KeyError as e:
self.logger.warn(f"Tracelog failed to parse: {e}")
kc_serial_bytes = req.keychipid.encode()
machine = await self.data.arcade.get_machine(req.keychipid)
if machine is None and not self.config.server.allow_unregistered_serials:
msg = f"Unrecognised serial {req.keychipid} attempted billing checkin from {request_ip} for {req.gameid} v{req.gamever}."
await self.data.base.log_event(
"allnet", "BILLING_CHECKIN_NG_SERIAL", logging.WARN, msg, ip=request_ip, game=req.gameid, version=req.gamever
)
self.logger.warning(msg)
return PlainTextResponse(f"result=1&requestno={req.requestno}&message=Keychip Serial bad\r\n")
log_details = {
"playcount": req.playcnt,
"billing_type": req.billingtype.name,
"nearfull": req.nearfull,
"playlimit": req.playlimit,
}
if machine is not None:
await self.data.base.log_event("billing", "BILLING_CHECKIN_OK", logging.INFO, "", log_details, None, machine['arcade'], machine['id'], request_ip, req.gameid, req.gamever)
self.logger.info(
f"Unregistered Billing checkin from {request_ip}: game {req.gameid} ver {req.gamever} keychip {req.keychipid} playcount "
f"{req.playcnt} billing_type {req.billingtype.name} nearfull {req.nearfull} playlimit {req.playlimit}"
)
else:
log_details['serial'] = req.keychipid
await self.data.base.log_event("billing", "BILLING_CHECKIN_OK_UNREG", logging.INFO, "", log_details, None, None, None, request_ip, req.gameid, req.gamever)
self.logger.info(
f"Unregistered Billing checkin from {request_ip}: game {req.gameid} ver {req.gamever} keychip {req.keychipid} playcount "
f"{req.playcnt} billing_type {req.billingtype.name} nearfull {req.nearfull} playlimit {req.playlimit}"
)
if req.traceleft > 0:
self.logger.warn(f"{req.traceleft} unsent tracelogs")
kc_playlimit = req.playlimit
kc_nearfull = req.nearfull
while req.playcnt > req.playlimit:
kc_playlimit += 1024
kc_nearfull += 1024
playlimit = kc_playlimit
nearfull = kc_nearfull + (req.billingtype.value * 0x00010000)
digest.update(playlimit.to_bytes(4, "little") + kc_serial_bytes)
playlimit_sig = signer.sign(digest).hex()
digest = SHA.new()
digest.update(nearfull.to_bytes(4, "little") + kc_serial_bytes)
nearfull_sig = signer.sign(digest).hex()
# TODO: playhistory
resp = BillingResponse(playlimit, playlimit_sig, nearfull, nearfull_sig, req.requestno, req.protocolver)
resp_str = urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\r\n"
self.logger.debug(f"response {vars(resp)}")
if req.traceleft > 0:
self.logger.info(f"Requesting 20 more of {req.traceleft} unsent tracelogs")
return PlainTextResponse("result=6&waittime=0&linelimit=20\r\n")
return PlainTextResponse(resp_str)
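As a standalone illustration of the signing step above (a sketch using the same pycryptodome calls the handler imports, not a drop-in helper from the codebase): each counter is serialized as a 4-byte little-endian integer, concatenated with the keychip serial, hashed with SHA-1, and signed with the billing key via PKCS#1 v1.5; for `nearfull` the billing type is folded into the upper bits first. The key path matches the `signing_key` default; the serial in the comment is invented.

```python
from Crypto.Hash import SHA  # SHA-1, matching the handler
from Crypto.PublicKey import RSA
from Crypto.Signature import PKCS1_v1_5


def sign_counter(value: int, keychip_serial: str, key_path: str = "cert/billing.key") -> str:
    """Sign a playlimit/nearfull counter the way handle_billing_request does."""
    with open(key_path, "rb") as f:
        signer = PKCS1_v1_5.new(RSA.import_key(f.read()))
    digest = SHA.new()
    digest.update(value.to_bytes(4, "little") + keychip_serial.encode())
    return signer.sign(digest).hex()


# e.g. playlimit_sig = sign_counter(1024, "A63E-01A00000000")  # illustrative serial
```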
class AllnetPowerOnRequest:
def __init__(self, req: Dict) -> None:
@ -613,7 +727,6 @@ class AllnetPowerOnResponse3(AllnetPowerOnResponse):
self.minute = None
self.second = None
class AllnetPowerOnResponse2(AllnetPowerOnResponse):
def __init__(self) -> None:
super().__init__()
@ -623,7 +736,6 @@ class AllnetPowerOnResponse2(AllnetPowerOnResponse):
self.timezone = "+09:00"
self.res_class = "PowerOnResponseV2"
class AllnetDownloadOrderRequest:
def __init__(self, req: Dict) -> None:
self.game_id = req.get("game_id", "")
@ -631,7 +743,6 @@ class AllnetDownloadOrderRequest:
self.serial = req.get("serial", "")
self.encode = req.get("encode", "")
class AllnetDownloadOrderResponse:
def __init__(self, stat: int = 1, serial: str = "", uri: str = "") -> None:
self.stat = stat
@ -781,7 +892,6 @@ class BillingResponse:
# playhistory -> YYYYMM/C:...
# YYYY -> 4 digit year, MM -> 2 digit month, C -> Playcount during that period
class AllnetRequestException(Exception):
def __init__(self, message="") -> None:
self.message = message
@ -849,3 +959,48 @@ class DLReport:
return False
return True
cfg_dir = environ.get("ARTEMIS_CFG_DIR", "config")
cfg: CoreConfig = CoreConfig()
if path.exists(f"{cfg_dir}/core.yaml"):
cfg.update(yaml.safe_load(open(f"{cfg_dir}/core.yaml")))
if not path.exists(cfg.server.log_dir):
mkdir(cfg.server.log_dir)
if not access(cfg.server.log_dir, W_OK):
print(
f"Log directory {cfg.server.log_dir} NOT writable, please check permissions"
)
exit(1)
billing = BillingServlet(cfg, cfg_dir)
app_billing = Starlette(
cfg.server.is_develop,
[
Route("/request", billing.handle_billing_request, methods=["POST"]),
Route("/request/", billing.handle_billing_request, methods=["POST"]),
],
on_startup=[billing.startup]
)
allnet = AllnetServlet(cfg, cfg_dir)
route_lst = [
Route("/sys/servlet/PowerOn", allnet.handle_poweron, methods=["GET", "POST"]),
Route("/sys/servlet/DownloadOrder", allnet.handle_dlorder, methods=["GET", "POST"]),
Route("/sys/servlet/LoaderStateRecorder", allnet.handle_loaderstaterecorder, methods=["GET", "POST"]),
Route("/sys/servlet/Alive", allnet.handle_alive, methods=["GET", "POST"]),
Route("/naomitest.html", allnet.handle_naomitest),
]
if cfg.allnet.allow_online_updates:
route_lst += [
Route("/report-api/Report", allnet.handle_dlorder_report, methods=["POST"]),
Route("/dl/ini/{file:str}", allnet.handle_dlorder_ini),
]
app_allnet = Starlette(
cfg.server.is_develop,
route_lst,
on_startup=[allnet.startup]
)

core/app.py Normal file

@ -0,0 +1,92 @@
import yaml
import logging
import coloredlogs
from logging.handlers import TimedRotatingFileHandler
from starlette.routing import Route
from starlette.requests import Request
from starlette.applications import Starlette
from starlette.responses import PlainTextResponse
from os import environ, path, mkdir, W_OK, access
from typing import List
from core import CoreConfig, TitleServlet, MuchaServlet, AllnetServlet, BillingServlet, AimedbServlette
from core.frontend import FrontendServlet
async def dummy_rt(request: Request):
return PlainTextResponse("Service OK")
cfg_dir = environ.get("ARTEMIS_CFG_DIR", "config")
cfg: CoreConfig = CoreConfig()
if path.exists(f"{cfg_dir}/core.yaml"):
cfg.update(yaml.safe_load(open(f"{cfg_dir}/core.yaml")))
if not path.exists(cfg.server.log_dir):
mkdir(cfg.server.log_dir)
if not access(cfg.server.log_dir, W_OK):
print(
f"Log directory {cfg.server.log_dir} NOT writable, please check permissions"
)
exit(1)
logger = logging.getLogger("core")
log_fmt_str = "[%(asctime)s] Core | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(cfg.server.log_dir, "core"), when="d", backupCount=10
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
logger.addHandler(fileHandler)
logger.addHandler(consoleHandler)
log_lv = logging.DEBUG if cfg.server.is_develop else logging.INFO
logger.setLevel(log_lv)
coloredlogs.install(level=log_lv, logger=logger, fmt=log_fmt_str)
logger.info(f"Artemis starting in {'develop' if cfg.server.is_develop else 'production'} mode")
title = TitleServlet(cfg, cfg_dir) # This has to be loaded first to load plugins
mucha = MuchaServlet(cfg, cfg_dir)
route_lst: List[Route] = [
# Mucha
Route("/mucha_front/boardauth.do", mucha.handle_boardauth, methods=["POST"]),
Route("/mucha_front/updatacheck.do", mucha.handle_updatecheck, methods=["POST"]),
Route("/mucha_front/downloadstate.do", mucha.handle_dlstate, methods=["POST"]),
# General
Route("/", dummy_rt),
Route("/robots.txt", FrontendServlet.robots)
]
if not cfg.billing.standalone:
billing = BillingServlet(cfg, cfg_dir)
route_lst += [
Route("/request", billing.handle_billing_request, methods=["POST"]),
Route("/request/", billing.handle_billing_request, methods=["POST"]),
]
if not cfg.allnet.standalone:
allnet = AllnetServlet(cfg, cfg_dir)
route_lst += [
Route("/sys/servlet/PowerOn", allnet.handle_poweron, methods=["GET", "POST"]),
Route("/sys/servlet/DownloadOrder", allnet.handle_dlorder, methods=["GET", "POST"]),
Route("/sys/servlet/LoaderStateRecorder", allnet.handle_loaderstaterecorder, methods=["GET", "POST"]),
Route("/sys/servlet/Alive", allnet.handle_alive, methods=["GET", "POST"]),
Route("/naomitest.html", allnet.handle_naomitest),
]
if cfg.allnet.allow_online_updates:
route_lst += [
Route("/report-api/Report", allnet.handle_dlorder_report, methods=["POST"]),
Route("/dl/ini/{file:str}", allnet.handle_dlorder_ini),
]
for code, game in title.title_registry.items():
route_lst += game.get_routes()
app = Starlette(cfg.server.is_develop, route_lst)
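The resulting `app` is a plain ASGI application, so any ASGI server can host it. A minimal sketch, assuming uvicorn is installed and the process starts from the repository root so `config/core.yaml` resolves (ARTEMiS may ship its own launcher; the host and port here are arbitrary):

```python
import uvicorn

if __name__ == "__main__":
    # Serve the aggregated Starlette app defined in core/app.py.
    uvicorn.run("core.app:app", host="0.0.0.0", port=8080)
```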


@ -1,16 +1,48 @@
import logging, os
from typing import Any
class ServerConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def listen_address(self) -> str:
"""
Address Artemis will bind to and listen on
"""
return CoreConfig.get_config_field(
self.__config, "core", "server", "listen_address", default="127.0.0.1"
)
@property
def hostname(self) -> str:
"""
Hostname sent to games
"""
return CoreConfig.get_config_field(
self.__config, "core", "server", "hostname", default="localhost"
)
@property
def port(self) -> int:
"""
Port the game will listen on
"""
return CoreConfig.get_config_field(
self.__config, "core", "server", "port", default=80
)
@property
def ssl_key(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "server", "ssl_key", default="cert/title.key"
)
@property
def ssl_cert(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "ssl_cert", default="cert/title.pem"
)
@property
def allow_user_registration(self) -> bool:
@ -43,9 +75,23 @@ class ServerConfig:
)
@property
def threading(self) -> bool:
def proxy_port(self) -> int:
"""
What port the proxy is listening on. This will be sent instead of 'port' if
is_using_proxy is True and this value is non-zero
"""
return CoreConfig.get_config_field(
self.__config, "core", "server", "threading", default=False
self.__config, "core", "server", "proxy_port", default=0
)
@property
def proxy_port_ssl(self) -> int:
"""
What port the proxy is listening for secure connections on. This will be sent
instead of 'port' if is_using_proxy is True and this value is non-zero
"""
return CoreConfig.get_config_field(
self.__config, "core", "server", "proxy_port_ssl", default=0
)
@property
@ -66,7 +112,6 @@ class ServerConfig:
self.__config, "core", "server", "strict_ip_checking", default=False
)
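All of these properties resolve against `config/core.yaml` through `CoreConfig.get_config_field`, falling back to the defaults shown when a key is absent. A short sketch of reading them, mirroring how core/app.py loads the config:

```python
import yaml

from core import CoreConfig

cfg = CoreConfig()
with open("config/core.yaml") as f:
    cfg.update(yaml.safe_load(f))

# Keys missing from the YAML fall back to each property's default.
print(cfg.server.listen_address, cfg.server.hostname, cfg.server.port)
print(cfg.server.proxy_port, cfg.server.proxy_port_ssl)
```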
class TitleConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@ -79,36 +124,6 @@ class TitleConfig:
)
)
@property
def hostname(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "hostname", default="localhost"
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(
self.__config, "core", "title", "port", default=8080
)
@property
def port_ssl(self) -> int:
return CoreConfig.get_config_field(
self.__config, "core", "title", "port_ssl", default=0
)
@property
def ssl_key(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "ssl_key", default="cert/title.key"
)
@property
def ssl_cert(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "ssl_cert", default="cert/title.pem"
)
@property
def reboot_start_time(self) -> str:
return CoreConfig.get_config_field(
@ -121,7 +136,6 @@ class TitleConfig:
self.__config, "core", "title", "reboot_end_time", default=""
)
class DatabaseConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@ -159,7 +173,7 @@ class DatabaseConfig:
@property
def protocol(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "database", "type", default="mysql"
self.__config, "core", "database", "protocol", default="mysql"
)
@property
@ -176,16 +190,6 @@ class DatabaseConfig:
)
)
@property
def user_table_autoincrement_start(self) -> int:
return CoreConfig.get_config_field(
self.__config,
"core",
"database",
"user_table_autoincrement_start",
default=10000,
)
@property
def enable_memcached(self) -> bool:
return CoreConfig.get_config_field(
@ -198,13 +202,12 @@ class DatabaseConfig:
self.__config, "core", "database", "memcached_host", default="localhost"
)
class FrontendConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def enable(self) -> int:
def enable(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "frontend", "enable", default=False
)
@ -212,7 +215,7 @@ class FrontendConfig:
@property
def port(self) -> int:
return CoreConfig.get_config_field(
self.__config, "core", "frontend", "port", default=8090
self.__config, "core", "frontend", "port", default=8080
)
@property
@ -222,20 +225,23 @@ class FrontendConfig:
self.__config, "core", "frontend", "loglevel", default="info"
)
)
@property
def secret(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "frontend", "secret", default=""
)
class AllnetConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "allnet", "loglevel", default="info"
)
def standalone(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "standalone", default=False
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(
@ -243,9 +249,11 @@ class AllnetConfig:
)
@property
def ip_check(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "ip_check", default=False
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "allnet", "loglevel", default="info"
)
)
@property
@ -260,10 +268,23 @@ class AllnetConfig:
self.__config, "core", "allnet", "update_cfg_folder", default=""
)
class BillingConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def standalone(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "billing", "standalone", default=True
)
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "billing", "loglevel", default="info"
)
)
@property
def port(self) -> int:
@ -289,11 +310,22 @@ class BillingConfig:
self.__config, "core", "billing", "signing_key", default="cert/billing.key"
)
class AimedbConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def enable(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "aimedb", "enable", default=True
)
@property
def listen_address(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "aimedb", "listen_address", default=""
)
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(
@ -326,17 +358,10 @@ class AimedbConfig:
self.__config, "core", "aimedb", "id_lifetime_seconds", default=86400
)
class MuchaConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def enable(self) -> int:
return CoreConfig.get_config_field(
self.__config, "core", "mucha", "enable", default=False
)
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(
@ -345,13 +370,6 @@ class MuchaConfig:
)
)
@property
def hostname(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "mucha", "hostname", default="localhost"
)
class CoreConfig(dict):
def __init__(self) -> None:
self.server = ServerConfig(self)
@ -373,6 +391,19 @@ class CoreConfig(dict):
return logging.DEBUG
else:
return logging.INFO
@classmethod
def loglevel_to_str(cls, level: int) -> str:
if level == logging.ERROR:
return "error"
elif level == logging.WARN:
return "warn"
elif level == logging.INFO:
return "info"
elif level == logging.DEBUG:
return "debug"
else:
return "notset"
@classmethod
def get_config_field(


@ -1,16 +1,18 @@
from enum import Enum
class MainboardPlatformCodes:
RINGEDGE = "AALE"
RINGWIDE = "AAML"
NU = "AAVE"
NUSX = "AAWE"
ALLS_UX = "ACAE"
ALLS_HX = "ACAX"
class MainboardPlatformCodes(Enum):
RINGEDGE = "AAL"
RINGEDGE2 = "AAS"
RINGWIDE = "AAM"
NU = "AAV"
NUSX = "AAW"
ALLS = "ACA"
#ALLS_UX = "ACAE"
#ALLS_HX = "ACAX"
class MainboardRevisions:
class MainboardRevisions(Enum):
RINGEDGE = 1
RINGEDGE2 = 2
@ -29,11 +31,10 @@ class MainboardRevisions:
ALLS_HX2 = 12
class KeychipPlatformsCodes:
RING = "A72E"
NU = ("A60E", "A60E", "A60E")
NUSX = ("A61X", "A69X")
ALLS = "A63E"
class KeychipPlatformsCodes(Enum):
RING = "72"
NU = ("60", "61", "69")
ALLS = "63"
class AllnetCountryCode(Enum):

core/data/alembic/README Normal file

@ -0,0 +1 @@
Generic single-database configuration.


@ -0,0 +1,64 @@
# A generic, single database configuration.
[alembic]
script_location=.
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
# max length of characters to apply to the
# "slug" field
#truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; this defaults
# to migrations//versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat migrations//versions
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

core/data/alembic/env.py Normal file

@ -0,0 +1,81 @@
from __future__ import with_statement
from alembic import context
from sqlalchemy import engine_from_config, pool
from logging.config import fileConfig
from core.data.schema.base import metadata
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline():
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
raise Exception('Not implemented or configured!')
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url, target_metadata=target_metadata, literal_binds=True)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online():
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
ini_section = config.get_section(config.config_ini_section)
overrides = context.get_x_argument(as_dictionary=True)
for override in overrides:
ini_section[override] = overrides[override]
connectable = engine_from_config(
ini_section,
prefix='sqlalchemy.',
poolclass=pool.NullPool)
with connectable.connect() as connection:
context.configure(
connection=connection,
target_metadata=target_metadata,
compare_type=True,
compare_server_default=True,
)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
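Because this env.py merges `-x` overrides on top of the ini section, the database URL is normally supplied at invocation time, e.g. `alembic -x sqlalchemy.url=... upgrade head` run from `core/data/alembic`. A rough programmatic equivalent is sketched below; the URL is a placeholder, and ARTEMiS may well wrap migrations in its own tooling instead:

```python
from argparse import Namespace

from alembic import command
from alembic.config import Config

# cmd_opts.x is what context.get_x_argument() picks up inside env.py.
# Run from core/data/alembic so script_location="." resolves to this directory.
cfg = Config(
    "alembic.ini",
    cmd_opts=Namespace(x=["sqlalchemy.url=mysql+pymysql://aime:aime@localhost/aime"]),
)
command.upgrade(cfg, "head")
```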


@ -0,0 +1,24 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}


@ -0,0 +1,27 @@
"""chuni_add_net_battle_uk
Revision ID: 1e150d16ab6b
Revises: b23f985100ba
Create Date: 2024-06-21 22:57:18.418488
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '1e150d16ab6b'
down_revision = 'b23f985100ba'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_unique_constraint(None, 'chuni_profile_net_battle', ['user'])
# ### end Alembic commands ###
def downgrade():
op.drop_constraint(None, 'chuni_profile_net_battle', type_='unique')
# ### end Alembic commands ###


@ -0,0 +1,48 @@
"""add_event_log_info
Revision ID: 2bf9f38d9444
Revises: 81e44dd6047a
Create Date: 2024-05-21 23:00:17.468407
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '2bf9f38d9444'
down_revision = '81e44dd6047a'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('event_log', sa.Column('user', sa.INTEGER(), nullable=True))
op.add_column('event_log', sa.Column('arcade', sa.INTEGER(), nullable=True))
op.add_column('event_log', sa.Column('machine', sa.INTEGER(), nullable=True))
op.add_column('event_log', sa.Column('ip', sa.TEXT(length=39), nullable=True))
op.alter_column('event_log', 'when_logged',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
op.create_foreign_key(None, 'event_log', 'machine', ['machine'], ['id'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'event_log', 'arcade', ['arcade'], ['id'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'event_log', 'aime_user', ['user'], ['id'], onupdate='cascade', ondelete='cascade')
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint(None, 'event_log', type_='foreignkey')
op.drop_constraint(None, 'event_log', type_='foreignkey')
op.drop_constraint(None, 'event_log', type_='foreignkey')
op.alter_column('event_log', 'when_logged',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('current_timestamp()'),
existing_nullable=False)
op.drop_column('event_log', 'ip')
op.drop_column('event_log', 'machine')
op.drop_column('event_log', 'arcade')
op.drop_column('event_log', 'user')
# ### end Alembic commands ###


@ -0,0 +1,46 @@
"""add_event_log_game_version
Revision ID: 2d024cf145a1
Revises: 2bf9f38d9444
Create Date: 2024-05-21 23:41:31.445331
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '2d024cf145a1'
down_revision = '2bf9f38d9444'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('event_log', sa.Column('game', sa.TEXT(length=4), nullable=True))
op.add_column('event_log', sa.Column('version', sa.TEXT(length=24), nullable=True))
op.alter_column('event_log', 'ip',
existing_type=mysql.TINYTEXT(),
type_=sa.TEXT(length=39),
existing_nullable=True)
op.alter_column('event_log', 'when_logged',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('event_log', 'when_logged',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('current_timestamp()'),
existing_nullable=False)
op.alter_column('event_log', 'ip',
existing_type=sa.TEXT(length=39),
type_=mysql.TINYTEXT(),
existing_nullable=True)
op.drop_column('event_log', 'version')
op.drop_column('event_log', 'game')
# ### end Alembic commands ###


@ -0,0 +1,54 @@
"""pokken_fix_pokemon_uk
Revision ID: 3657efefc5a4
Revises: 4a02e623e5e6
Create Date: 2024-06-13 23:50:57.611998
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '3657efefc5a4'
down_revision = '4a02e623e5e6'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('pokken_pokemon_data', 'char_id',
existing_type=mysql.INTEGER(display_width=11),
nullable=True)
op.alter_column('pokken_pokemon_data', 'illustration_book_no',
existing_type=mysql.INTEGER(display_width=11),
nullable=False)
op.drop_constraint('pokken_pokemon_data_ibfk_1', table_name='pokken_pokemon_data', type_='foreignkey')
op.drop_index('pokken_pokemon_data_uk', table_name='pokken_pokemon_data')
op.create_unique_constraint('pokken_pokemon_uk', 'pokken_pokemon_data', ['user', 'illustration_book_no'])
op.create_foreign_key("pokken_pokemon_data_ibfk_1", "pokken_pokemon_data", "aime_user", ['user'], ['id'])
op.alter_column('pokken_profile', 'trainer_name',
existing_type=mysql.VARCHAR(length=16),
type_=sa.String(length=14),
existing_nullable=True)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('pokken_profile', 'trainer_name',
existing_type=sa.String(length=14),
type_=mysql.VARCHAR(length=16),
existing_nullable=True)
op.drop_constraint('pokken_pokemon_data_ibfk_1', table_name='pokken_pokemon_data', type_='foreignkey')
op.drop_constraint('pokken_pokemon_uk', 'pokken_pokemon_data', type_='unique')
op.create_index('pokken_pokemon_data_uk', 'pokken_pokemon_data', ['user', 'char_id'], unique=True)
op.create_foreign_key("pokken_pokemon_data_ibfk_1", "pokken_pokemon_data", "aime_user", ['user'], ['id'])
op.alter_column('pokken_pokemon_data', 'illustration_book_no',
existing_type=mysql.INTEGER(display_width=11),
nullable=True)
op.alter_column('pokken_pokemon_data', 'char_id',
existing_type=mysql.INTEGER(display_width=11),
nullable=False)
# ### end Alembic commands ###


@ -0,0 +1,50 @@
"""card_add_idm_chip_id
Revision ID: 48f4acc43a7e
Revises: 1e150d16ab6b
Create Date: 2024-06-21 23:53:34.369134
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '48f4acc43a7e'
down_revision = '1e150d16ab6b'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('aime_card', sa.Column('idm', sa.String(length=16), nullable=True))
op.add_column('aime_card', sa.Column('chip_id', sa.BIGINT(), nullable=True))
op.alter_column('aime_card', 'access_code',
existing_type=mysql.VARCHAR(length=20),
nullable=False)
op.alter_column('aime_card', 'created_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=True)
op.create_unique_constraint(None, 'aime_card', ['chip_id'])
op.create_unique_constraint(None, 'aime_card', ['idm'])
op.create_unique_constraint(None, 'aime_card', ['access_code'])
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint("chip_id", 'aime_card', type_='unique')
op.drop_constraint("idm", 'aime_card', type_='unique')
op.drop_constraint("access_code", 'aime_card', type_='unique')
op.alter_column('aime_card', 'created_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('CURRENT_TIMESTAMP'),
existing_nullable=True)
op.alter_column('aime_card', 'access_code',
existing_type=mysql.VARCHAR(length=20),
nullable=True)
op.drop_column('aime_card', 'chip_id')
op.drop_column('aime_card', 'idm')
# ### end Alembic commands ###


@ -0,0 +1,48 @@
"""mai2_add_favs_rivals
Revision ID: 4a02e623e5e6
Revises: 8ad40a6e7be2
Create Date: 2024-06-08 19:02:43.856395
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '4a02e623e5e6'
down_revision = '8ad40a6e7be2'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('mai2_item_favorite_music',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user', sa.Integer(), nullable=False),
sa.Column('musicId', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user', 'musicId', name='mai2_item_favorite_music_uk'),
mysql_charset='utf8mb4'
)
op.create_table('mai2_user_rival',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user', sa.Integer(), nullable=False),
sa.Column('rival', sa.Integer(), nullable=False),
sa.Column('show', sa.Boolean(), server_default='0', nullable=False),
sa.ForeignKeyConstraint(['rival'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user', 'rival', name='mai2_user_rival_uk'),
mysql_charset='utf8mb4'
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('mai2_user_rival')
op.drop_table('mai2_item_favorite_music')
# ### end Alembic commands ###


@ -0,0 +1,41 @@
"""mai2_presents
Revision ID: 5ea363686347
Revises: 680789dabab3
Create Date: 2024-06-28 14:49:07.666879
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '5ea363686347'
down_revision = '680789dabab3'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('mai2_item_present',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('version', sa.INTEGER(), nullable=True),
sa.Column('user', sa.Integer(), nullable=True),
sa.Column('itemKind', sa.INTEGER(), nullable=False),
sa.Column('itemId', sa.INTEGER(), nullable=False),
sa.Column('stock', sa.INTEGER(), server_default='1', nullable=False),
sa.Column('startDate', sa.TIMESTAMP(), nullable=True),
sa.Column('endDate', sa.TIMESTAMP(), nullable=True),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('version', 'user', 'itemKind', 'itemId', name='mai2_item_present_uk'),
mysql_charset='utf8mb4'
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('mai2_item_present')
# ### end Alembic commands ###


@ -0,0 +1,28 @@
"""card_add_memo
Revision ID: 5ea73f89d982
Revises: 745448d83696
Create Date: 2024-07-06 22:46:56.992152
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '5ea73f89d982'
down_revision = '745448d83696'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('aime_card', sa.Column('memo', sa.VARCHAR(length=16), nullable=True))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('aime_card', 'memo')
# ### end Alembic commands ###


@ -0,0 +1,295 @@
"""sao_player_changes
Revision ID: 680789dabab3
Revises: a616fd164e40
Create Date: 2024-06-26 23:19:16.863778
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '680789dabab3'
down_revision = 'a616fd164e40'
branch_labels = None
depends_on = None
def upgrade():
op.add_column('sao_equipment_data', sa.Column('is_shop_purchase', sa.BOOLEAN(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('is_protect', sa.BOOLEAN(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property1_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property1_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property1_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property2_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property2_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property2_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property3_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property3_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property3_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property4_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property4_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('property4_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_equipment_data', sa.Column('converted_card_num', sa.INTEGER(), server_default='0', nullable=False))
op.alter_column('sao_equipment_data', 'equipment_id',
existing_type=mysql.INTEGER(),
type_=sa.BIGINT(),
existing_nullable=False)
op.alter_column('sao_equipment_data', 'get_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
op.create_foreign_key(None, 'sao_equipment_data', 'sao_static_property', ['property2_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_equipment_data', 'sao_static_property', ['property4_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_equipment_data', 'sao_static_property', ['property3_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_equipment_data', 'sao_static_property', ['property1_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_equipment_data', 'sao_static_equipment_list', ['equipment_id'], ['EquipmentId'], onupdate='cascade', ondelete='cascade')
op.add_column('sao_hero_log_data', sa.Column('max_level_extend_num', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('is_awakenable', sa.BOOLEAN(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('awakening_stage', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('awakening_exp', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('is_shop_purchase', sa.BOOLEAN(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('is_protect', sa.BOOLEAN(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property1_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property1_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property1_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property2_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property2_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property2_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property3_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property3_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property3_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property4_property_id', sa.BIGINT(), server_default='2', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property4_value1', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('property4_value2', sa.INTEGER(), server_default='0', nullable=False))
op.add_column('sao_hero_log_data', sa.Column('converted_card_num', sa.INTEGER(), server_default='0', nullable=False))
op.alter_column('sao_hero_log_data', 'main_weapon',
existing_type=mysql.INTEGER(),
nullable=True)
op.alter_column('sao_hero_log_data', 'sub_equipment',
existing_type=mysql.INTEGER(),
nullable=True)
op.alter_column('sao_hero_log_data', 'skill_slot1_skill_id',
existing_type=mysql.INTEGER(),
type_=sa.BIGINT(),
nullable=True)
op.alter_column('sao_hero_log_data', 'skill_slot2_skill_id',
existing_type=mysql.INTEGER(),
type_=sa.BIGINT(),
nullable=True)
op.alter_column('sao_hero_log_data', 'skill_slot3_skill_id',
existing_type=mysql.INTEGER(),
type_=sa.BIGINT(),
nullable=True)
op.alter_column('sao_hero_log_data', 'skill_slot4_skill_id',
existing_type=mysql.INTEGER(),
type_=sa.BIGINT(),
nullable=True)
op.alter_column('sao_hero_log_data', 'skill_slot5_skill_id',
existing_type=mysql.INTEGER(),
type_=sa.BIGINT(),
nullable=True)
op.alter_column('sao_hero_log_data', 'get_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
op.alter_column("sao_hero_log_data", "user_hero_log_id",
existing_type=sa.Integer(),
new_column_name="hero_log_id",
type_=sa.BIGINT(),
nullable=False)
op.execute(sa.text("UPDATE sao_hero_log_data SET skill_slot1_skill_id = NULL WHERE skill_slot1_skill_id = 0;"))
op.execute(sa.text("UPDATE sao_hero_log_data SET skill_slot2_skill_id = NULL WHERE skill_slot2_skill_id = 0;"))
op.execute(sa.text("UPDATE sao_hero_log_data SET skill_slot3_skill_id = NULL WHERE skill_slot3_skill_id = 0;"))
op.execute(sa.text("UPDATE sao_hero_log_data SET skill_slot4_skill_id = NULL WHERE skill_slot4_skill_id = 0;"))
op.execute(sa.text("UPDATE sao_hero_log_data SET skill_slot5_skill_id = NULL WHERE skill_slot5_skill_id = 0;"))
op.execute(sa.text("UPDATE sao_hero_log_data SET main_weapon = NULL WHERE main_weapon = 0;"))
op.execute(sa.text("UPDATE sao_hero_log_data SET sub_equipment = NULL WHERE sub_equipment = 0;"))
op.execute(sa.text("UPDATE sao_hero_party SET user_hero_log_id_1 = NULL WHERE user_hero_log_id_1 = 0;"))
op.execute(sa.text("UPDATE sao_hero_party SET user_hero_log_id_2 = NULL WHERE user_hero_log_id_2 = 0;"))
op.execute(sa.text("UPDATE sao_hero_party SET user_hero_log_id_3 = NULL WHERE user_hero_log_id_3 = 0;"))
op.execute(sa.text("UPDATE sao_hero_log_data INNER JOIN sao_equipment_data ON sao_hero_log_data.main_weapon = sao_equipment_data.equipment_id SET sao_hero_log_data.main_weapon = sao_equipment_data.id;"))
op.execute(sa.text("UPDATE sao_hero_log_data INNER JOIN sao_equipment_data ON sao_hero_log_data.sub_equipment = sao_equipment_data.equipment_id SET sao_hero_log_data.sub_equipment = sao_equipment_data.id;"))
op.execute(sa.text("UPDATE sao_hero_party INNER JOIN sao_hero_log_data ON sao_hero_party.user_hero_log_id_1 = sao_hero_log_data.hero_log_id SET sao_hero_party.user_hero_log_id_1 = sao_hero_log_data.id;"))
op.execute(sa.text("UPDATE sao_hero_party INNER JOIN sao_hero_log_data ON sao_hero_party.user_hero_log_id_2 = sao_hero_log_data.hero_log_id SET sao_hero_party.user_hero_log_id_2 = sao_hero_log_data.id;"))
op.execute(sa.text("UPDATE sao_hero_party INNER JOIN sao_hero_log_data ON sao_hero_party.user_hero_log_id_3 = sao_hero_log_data.hero_log_id SET sao_hero_party.user_hero_log_id_3 = sao_hero_log_data.id;"))
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_property', ['property4_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_skill', ['skill_slot1_skill_id'], ['SkillId'], onupdate='set null', ondelete='set null')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_skill', ['skill_slot5_skill_id'], ['SkillId'], onupdate='set null', ondelete='set null')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_skill', ['skill_slot2_skill_id'], ['SkillId'], onupdate='set null', ondelete='set null')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_skill', ['skill_slot3_skill_id'], ['SkillId'], onupdate='set null', ondelete='set null')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_equipment_data', ['main_weapon'], ['id'], onupdate='set null', ondelete='set null')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_property', ['property3_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_skill', ['skill_slot4_skill_id'], ['SkillId'], onupdate='set null', ondelete='set null')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_equipment_data', ['sub_equipment'], ['id'], onupdate='set null', ondelete='set null')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_property', ['property1_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_hero_list', ['hero_log_id'], ['HeroLogId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_hero_log_data', 'sao_static_property', ['property2_property_id'], ['PropertyId'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_hero_party', 'sao_hero_log_data', ['user_hero_log_id_3'], ['id'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_hero_party', 'sao_hero_log_data', ['user_hero_log_id_1'], ['id'], onupdate='cascade', ondelete='cascade')
op.create_foreign_key(None, 'sao_hero_party', 'sao_hero_log_data', ['user_hero_log_id_2'], ['id'], onupdate='cascade', ondelete='cascade')
op.alter_column('sao_item_data', 'get_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
op.alter_column('sao_play_sessions', 'play_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
op.add_column('sao_player_quest', sa.Column('quest_type', sa.INTEGER(), server_default='1', nullable=False))
op.alter_column('sao_player_quest', 'play_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
op.alter_column('sao_player_quest', 'episode_id',
existing_type=mysql.INTEGER(),
new_column_name="quest_scene_id",
type_=sa.BIGINT(),
nullable=False)
op.create_foreign_key(None, 'sao_player_quest', 'sao_static_quest', ['quest_scene_id'], ['QuestSceneId'], onupdate='cascade', ondelete='cascade')
op.add_column('sao_profile', sa.Column('my_shop', sa.INTEGER(), nullable=True))
op.add_column('sao_profile', sa.Column('fav_hero', sa.INTEGER(), nullable=True))
op.add_column('sao_profile', sa.Column('when_register', sa.TIMESTAMP(), server_default=sa.text('now()'), nullable=True))
op.add_column('sao_profile', sa.Column('last_login_date', sa.TIMESTAMP(), nullable=True))
op.add_column('sao_profile', sa.Column('last_yui_medal_date', sa.TIMESTAMP(), nullable=True))
op.add_column('sao_profile', sa.Column('last_bonus_yui_medal_date', sa.TIMESTAMP(), nullable=True))
op.add_column('sao_profile', sa.Column('last_comeback_date', sa.TIMESTAMP(), nullable=True))
op.add_column('sao_profile', sa.Column('last_login_bonus_date', sa.TIMESTAMP(), nullable=True))
op.add_column('sao_profile', sa.Column('ad_confirm_date', sa.TIMESTAMP(), nullable=True))
op.add_column('sao_profile', sa.Column('login_ct', sa.INTEGER(), server_default='0', nullable=True))
op.create_foreign_key(None, 'sao_profile', 'sao_hero_log_data', ['fav_hero'], ['id'], onupdate='cascade', ondelete='set null')
def downgrade():
op.drop_constraint("sao_profile_ibfk_2", 'sao_profile', type_='foreignkey')
op.drop_column('sao_profile', 'login_ct')
op.drop_column('sao_profile', 'ad_confirm_date')
op.drop_column('sao_profile', 'last_login_bonus_date')
op.drop_column('sao_profile', 'last_comeback_date')
op.drop_column('sao_profile', 'last_bonus_yui_medal_date')
op.drop_column('sao_profile', 'last_yui_medal_date')
op.drop_column('sao_profile', 'last_login_date')
op.drop_column('sao_profile', 'when_register')
op.drop_column('sao_profile', 'fav_hero')
op.drop_column('sao_profile', 'my_shop')
op.alter_column('sao_player_quest', 'quest_scene_id',
existing_type=mysql.BIGINT(),
new_column_name="episode_id",
type_=sa.INTEGER(),
nullable=False)
op.drop_constraint("sao_player_quest_ibfk_2", 'sao_player_quest', type_='foreignkey')
op.alter_column('sao_player_quest', 'play_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('CURRENT_TIMESTAMP'),
existing_nullable=False)
op.drop_column('sao_player_quest', 'quest_scene_id')
op.drop_column('sao_player_quest', 'quest_type')
op.alter_column('sao_play_sessions', 'play_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('CURRENT_TIMESTAMP'),
existing_nullable=False)
op.alter_column('sao_item_data', 'get_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('CURRENT_TIMESTAMP'),
existing_nullable=False)
op.drop_constraint("sao_hero_party_ibfk_2", 'sao_hero_party', type_='foreignkey')
op.drop_constraint("sao_hero_party_ibfk_3", 'sao_hero_party', type_='foreignkey')
op.drop_constraint("sao_hero_party_ibfk_4", 'sao_hero_party', type_='foreignkey')
op.alter_column("sao_hero_log_data", "hero_log_id",
existing_type=sa.BIGINT(),
new_column_name="user_hero_log_id",
type_=sa.Integer(),
nullable=False)
op.drop_constraint("sao_hero_log_data_ibfk_2", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_3", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_4", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_5", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_6", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_7", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_8", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_9", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_10", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_11", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_12", 'sao_hero_log_data', type_='foreignkey')
op.drop_constraint("sao_hero_log_data_ibfk_13", 'sao_hero_log_data', type_='foreignkey')
op.alter_column('sao_hero_log_data', 'get_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('CURRENT_TIMESTAMP'),
existing_nullable=False)
op.alter_column('sao_hero_log_data', 'skill_slot5_skill_id',
existing_type=sa.BIGINT(),
type_=mysql.INTEGER(),
nullable=False)
op.alter_column('sao_hero_log_data', 'skill_slot4_skill_id',
existing_type=sa.BIGINT(),
type_=mysql.INTEGER(),
nullable=False)
op.alter_column('sao_hero_log_data', 'skill_slot3_skill_id',
existing_type=sa.BIGINT(),
type_=mysql.INTEGER(),
nullable=False)
op.alter_column('sao_hero_log_data', 'skill_slot2_skill_id',
existing_type=sa.BIGINT(),
type_=mysql.INTEGER(),
nullable=False)
op.alter_column('sao_hero_log_data', 'skill_slot1_skill_id',
existing_type=sa.BIGINT(),
type_=mysql.INTEGER(),
nullable=False)
op.alter_column('sao_hero_log_data', 'sub_equipment',
existing_type=mysql.INTEGER(),
nullable=False)
op.alter_column('sao_hero_log_data', 'main_weapon',
existing_type=mysql.INTEGER(),
nullable=False)
op.drop_column('sao_hero_log_data', 'converted_card_num')
op.drop_column('sao_hero_log_data', 'property4_value2')
op.drop_column('sao_hero_log_data', 'property4_value1')
op.drop_column('sao_hero_log_data', 'property4_property_id')
op.drop_column('sao_hero_log_data', 'property3_value2')
op.drop_column('sao_hero_log_data', 'property3_value1')
op.drop_column('sao_hero_log_data', 'property3_property_id')
op.drop_column('sao_hero_log_data', 'property2_value2')
op.drop_column('sao_hero_log_data', 'property2_value1')
op.drop_column('sao_hero_log_data', 'property2_property_id')
op.drop_column('sao_hero_log_data', 'property1_value2')
op.drop_column('sao_hero_log_data', 'property1_value1')
op.drop_column('sao_hero_log_data', 'property1_property_id')
op.drop_column('sao_hero_log_data', 'is_protect')
op.drop_column('sao_hero_log_data', 'is_shop_purchase')
op.drop_column('sao_hero_log_data', 'awakening_exp')
op.drop_column('sao_hero_log_data', 'awakening_stage')
op.drop_column('sao_hero_log_data', 'is_awakenable')
op.drop_column('sao_hero_log_data', 'max_level_extend_num')
op.drop_constraint("sao_equipment_data_ibfk_2", 'sao_equipment_data', type_='foreignkey')
op.drop_constraint("sao_equipment_data_ibfk_3", 'sao_equipment_data', type_='foreignkey')
op.drop_constraint("sao_equipment_data_ibfk_4", 'sao_equipment_data', type_='foreignkey')
op.drop_constraint("sao_equipment_data_ibfk_5", 'sao_equipment_data', type_='foreignkey')
op.drop_constraint("sao_equipment_data_ibfk_6", 'sao_equipment_data', type_='foreignkey')
op.alter_column('sao_equipment_data', 'get_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('CURRENT_TIMESTAMP'),
existing_nullable=False)
op.alter_column('sao_equipment_data', 'equipment_id',
existing_type=sa.BIGINT(),
type_=mysql.INTEGER(),
existing_nullable=False)
op.drop_column('sao_equipment_data', 'converted_card_num')
op.drop_column('sao_equipment_data', 'property4_value2')
op.drop_column('sao_equipment_data', 'property4_value1')
op.drop_column('sao_equipment_data', 'property4_property_id')
op.drop_column('sao_equipment_data', 'property3_value2')
op.drop_column('sao_equipment_data', 'property3_value1')
op.drop_column('sao_equipment_data', 'property3_property_id')
op.drop_column('sao_equipment_data', 'property2_value2')
op.drop_column('sao_equipment_data', 'property2_value1')
op.drop_column('sao_equipment_data', 'property2_property_id')
op.drop_column('sao_equipment_data', 'property1_value2')
op.drop_column('sao_equipment_data', 'property1_value1')
op.drop_column('sao_equipment_data', 'property1_property_id')
op.drop_column('sao_equipment_data', 'is_protect')
op.drop_column('sao_equipment_data', 'is_shop_purchase')


@ -0,0 +1,56 @@
"""GekiChu rating tables
Revision ID: 6a7e8277763b
Revises: d8950c7ce2fc
Create Date: 2024-03-13 12:18:53.210018
"""
from alembic import op
from sqlalchemy import Column, Integer, String
# revision identifiers, used by Alembic.
revision = '6a7e8277763b'
down_revision = 'd8950c7ce2fc'
branch_labels = None
depends_on = None
GEKICHU_RATING_TABLE_NAMES = [
"chuni_profile_rating",
"ongeki_profile_rating",
]
def upgrade():
for table_name in GEKICHU_RATING_TABLE_NAMES:
op.create_table(
table_name,
Column("id", Integer, primary_key=True, nullable=False),
Column("user", Integer, nullable=False),
Column("version", Integer, nullable=False),
Column("type", String(255), nullable=False),
Column("index", Integer, nullable=False),
Column("musicId", Integer),
Column("difficultId", Integer),
Column("romVersionCode", Integer),
Column("score", Integer),
mysql_charset="utf8mb4",
)
op.create_foreign_key(
None,
table_name,
"aime_user",
["user"],
["id"],
ondelete="cascade",
onupdate="cascade",
)
op.create_unique_constraint(
f"{table_name}_uk",
table_name,
["user", "version", "type", "index"],
)
def downgrade():
for table_name in GEKICHU_RATING_TABLE_NAMES:
op.drop_table(table_name)


@ -0,0 +1,28 @@
"""chuni_team_points
Revision ID: 745448d83696
Revises: 5ea363686347
Create Date: 2024-06-29 00:05:22.479187
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '745448d83696'
down_revision = '5ea363686347'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('chuni_profile_team', sa.Column('userTeamPoint', sa.JSON(), nullable=True))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('chuni_profile_team', 'userTeamPoint')
# ### end Alembic commands ###


@ -0,0 +1,28 @@
"""cxb_add_playlog_grade
Revision ID: 7dc13e364e53
Revises: 2d024cf145a1
Create Date: 2024-05-28 22:31:22.264926
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '7dc13e364e53'
down_revision = '2d024cf145a1'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('cxb_playlog', sa.Column('grade', sa.Integer(), nullable=True))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('cxb_playlog', 'grade')
# ### end Alembic commands ###


@ -0,0 +1,68 @@
"""mai2_buddies_support
Revision ID: 81e44dd6047a
Revises: 6a7e8277763b
Create Date: 2024-03-12 19:10:37.063907
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = "81e44dd6047a"
down_revision = "6a7e8277763b"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"mai2_playlog_2p",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("user", sa.Integer(), nullable=False),
sa.Column("userId1", sa.Integer(), nullable=True),
sa.Column("userId2", sa.Integer(), nullable=True),
sa.Column("userName1", sa.String(length=25), nullable=True),
sa.Column("userName2", sa.String(length=25), nullable=True),
sa.Column("regionId", sa.Integer(), nullable=True),
sa.Column("placeId", sa.Integer(), nullable=True),
sa.Column("user2pPlaylogDetailList", sa.JSON(), nullable=True),
sa.ForeignKeyConstraint(
["user"], ["aime_user.id"], onupdate="cascade", ondelete="cascade"
),
sa.PrimaryKeyConstraint("id"),
mysql_charset="utf8mb4",
)
op.add_column(
"mai2_playlog",
sa.Column(
"extBool1", sa.Boolean(), nullable=True, server_default=sa.text("NULL")
),
)
op.add_column(
"mai2_profile_detail",
sa.Column(
"renameCredit", sa.Integer(), nullable=True, server_default=sa.text("NULL")
),
)
op.add_column(
"mai2_profile_detail",
sa.Column(
"currentPlayCount",
sa.Integer(),
nullable=True,
server_default=sa.text("NULL"),
),
)
def downgrade():
op.drop_table("mai2_playlog_2p")
op.drop_column("mai2_playlog", "extBool1")
op.drop_column("mai2_profile_detail", "renameCredit")
op.drop_column("mai2_profile_detail", "currentPlayCount")

View File

@ -0,0 +1,24 @@
"""Initial Migration
Revision ID: 835b862f9bf0
Revises:
Create Date: 2024-01-09 13:06:10.787432
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '835b862f9bf0'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
pass
def downgrade():
pass

View File

@ -0,0 +1,30 @@
"""ongeki: fix clearStatus
Revision ID: 8ad40a6e7be2
Revises: 7dc13e364e53
Create Date: 2024-05-29 19:03:30.062157
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '8ad40a6e7be2'
down_revision = '7dc13e364e53'
branch_labels = None
depends_on = None
def upgrade():
op.alter_column('ongeki_score_best', 'clearStatus',
existing_type=mysql.TINYINT(display_width=1),
type_=sa.Integer(),
existing_nullable=False)
def downgrade():
op.alter_column('ongeki_score_best', 'clearStatus',
existing_type=sa.Integer(),
type_=mysql.TINYINT(display_width=1),
existing_nullable=False)

View File

@ -0,0 +1,437 @@
"""sao_backport
Revision ID: a616fd164e40
Revises: 48f4acc43a7e
Create Date: 2024-06-24 20:28:34.471282
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = 'a616fd164e40'
down_revision = '48f4acc43a7e'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('sao_static_quest')
op.create_table('sao_static_quest',
sa.Column('QuestSceneId', sa.BIGINT(), nullable=False),
sa.Column('SortNo', sa.INTEGER(), nullable=False),
sa.Column('Tutorial', sa.BOOLEAN(), nullable=False),
sa.Column('ColRate', sa.DECIMAL(), nullable=False),
sa.Column('LimitDefault', sa.INTEGER(), nullable=False),
sa.Column('LimitResurrection', sa.INTEGER(), nullable=False),
sa.Column('RewardTableSubId', sa.INTEGER(), nullable=False),
sa.Column('PlayerTraceTableSubId', sa.INTEGER(), nullable=False),
sa.Column('SuccessPlayerExp', sa.INTEGER(), nullable=False),
sa.Column('FailedPlayerExp', sa.INTEGER(), nullable=False),
sa.Column('PairExpRate', sa.INTEGER(), nullable=False),
sa.Column('TrioExpRate', sa.INTEGER(), nullable=False),
sa.Column('SingleRewardVp', sa.INTEGER(), nullable=False),
sa.Column('PairRewardVp', sa.INTEGER(), nullable=False),
sa.Column('TrioRewardVp', sa.INTEGER(), nullable=False),
sa.PrimaryKeyConstraint('QuestSceneId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_property',
sa.Column('PropertyId', sa.BIGINT(), nullable=False),
sa.Column('PropertyTargetType', sa.INTEGER(), nullable=False),
sa.Column('PropertyName', sa.VARCHAR(length=255), nullable=False),
sa.Column('PropertyName_en', sa.VARCHAR(length=255), nullable=True),
sa.Column('PropertyNameFormat', sa.VARCHAR(length=255), nullable=False),
sa.Column('PropertyNameFormat_en', sa.VARCHAR(length=255), nullable=True),
sa.Column('PropertyTypeId', sa.INTEGER(), nullable=False),
sa.Column('Value1Min', sa.INTEGER(), nullable=False),
sa.Column('Value1Max', sa.INTEGER(), nullable=False),
sa.Column('Value2Min', sa.INTEGER(), nullable=False),
sa.Column('Value2Max', sa.INTEGER(), nullable=False),
sa.PrimaryKeyConstraint('PropertyId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_reward',
sa.Column('RewardTableId', sa.BIGINT(), nullable=False),
sa.Column('RewardTableSubId', sa.INTEGER(), nullable=False),
sa.Column('UnanalyzedLogGradeId', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardType', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardId', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardNum', sa.INTEGER(), nullable=False),
sa.Column('StrengthMin', sa.INTEGER(), nullable=False),
sa.Column('StrengthMax', sa.INTEGER(), nullable=False),
sa.Column('PropertyTableSubId', sa.INTEGER(), nullable=False),
sa.Column('QuestInfoDisplayFlag', sa.BOOLEAN(), nullable=False),
sa.Column('Rate', sa.INTEGER(), nullable=False),
sa.PrimaryKeyConstraint('RewardTableId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_skill',
sa.Column('SkillId', sa.BIGINT(), nullable=False),
sa.Column('WeaponTypeId', sa.INTEGER(), nullable=False),
sa.Column('Name', sa.VARCHAR(length=255), nullable=False),
sa.Column('Name_en', sa.VARCHAR(length=255), nullable=True),
sa.Column('Attack', sa.BOOLEAN(), nullable=False),
sa.Column('Passive', sa.BOOLEAN(), nullable=False),
sa.Column('Pet', sa.BOOLEAN(), nullable=False),
sa.Column('Level', sa.INTEGER(), nullable=False),
sa.Column('SkillCondition', sa.INTEGER(), nullable=False),
sa.Column('CoolTime', sa.INTEGER(), nullable=False),
sa.Column('SkillIcon', sa.VARCHAR(length=255), nullable=False),
sa.Column('FriendSkillIcon', sa.VARCHAR(length=255), nullable=False),
sa.Column('InfoText', sa.VARCHAR(length=255), nullable=False),
sa.Column('InfoText_en', sa.VARCHAR(length=255), nullable=True),
sa.PrimaryKeyConstraint('SkillId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_trace_table',
sa.Column('PlayerTraceTableId', sa.BIGINT(), nullable=False),
sa.Column('PlayerTraceTableSubId', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardType', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardId', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardNum', sa.INTEGER(), nullable=False),
sa.Column('Rate', sa.INTEGER(), nullable=False),
sa.PrimaryKeyConstraint('PlayerTraceTableId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_player_beginner_mission',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('user', sa.INTEGER(), nullable=False),
sa.Column('beginner_mission_id', sa.INTEGER(), nullable=False),
sa.Column('condition_id', sa.INTEGER(), nullable=False),
sa.Column('is_seat', sa.BOOLEAN(), server_default='0', nullable=False),
sa.Column('achievement_num', sa.INTEGER(), nullable=False),
sa.Column('complete_flag', sa.BOOLEAN(), server_default='0', nullable=False),
sa.Column('complete_date', sa.TIMESTAMP(), nullable=True),
sa.Column('reward_received_flag', sa.BOOLEAN(), server_default='0', nullable=False),
sa.Column('reward_received_date', sa.TIMESTAMP(), nullable=True),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user'),
sa.UniqueConstraint('user', 'condition_id', name='sao_player_beginner_mission_uk'),
mysql_charset='utf8mb4'
)
op.create_table('sao_player_resource_card',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('user', sa.INTEGER(), nullable=False),
sa.Column('common_reward_type', sa.INTEGER(), nullable=False),
sa.Column('common_reward_id', sa.INTEGER(), nullable=False),
sa.Column('holographic_flag', sa.BOOLEAN(), server_default='0', nullable=False),
sa.Column('serial', sa.VARCHAR(length=20), nullable=True),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('serial'),
mysql_charset='utf8mb4'
)
op.create_table('sao_player_tutorial',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('user', sa.INTEGER(), nullable=False),
sa.Column('tutorial_byte', sa.INTEGER(), nullable=False),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user', 'tutorial_byte', name='sao_player_tutorial_uk'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_episode',
sa.Column('EpisodeId', sa.BIGINT(), nullable=False),
sa.Column('EpisodeChapterId', sa.INTEGER(), nullable=False),
sa.Column('ReleaseEpisodeId', sa.INTEGER(), nullable=False),
sa.Column('Title', sa.VARCHAR(length=255), nullable=False),
sa.Column('CommentSummary', sa.VARCHAR(length=255), nullable=False),
sa.Column('ExBonusTableSubId', sa.INTEGER(), nullable=False),
sa.Column('QuestSceneId', sa.BIGINT(), nullable=True),
sa.ForeignKeyConstraint(['QuestSceneId'], ['sao_static_quest.QuestSceneId'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('EpisodeId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_ex_bonus',
sa.Column('ExBonusTableId', sa.BIGINT(), nullable=False),
sa.Column('ExBonusTableSubId', sa.INTEGER(), nullable=False),
sa.Column('ExBonusConditionId', sa.INTEGER(), nullable=False),
sa.Column('ConditionValue1', sa.INTEGER(), nullable=False),
sa.Column('ConditionValue2', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardType', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardId', sa.INTEGER(), nullable=False),
sa.Column('CommonRewardNum', sa.INTEGER(), nullable=False),
sa.Column('Strength', sa.INTEGER(), nullable=False),
sa.Column('Property1PropertyId', sa.BIGINT(), nullable=False),
sa.Column('Property1Value1', sa.INTEGER(), nullable=False),
sa.Column('Property1Value2', sa.INTEGER(), nullable=False),
sa.Column('Property2PropertyId', sa.BIGINT(), nullable=False),
sa.Column('Property2Value1', sa.INTEGER(), nullable=False),
sa.Column('Property2Value2', sa.INTEGER(), nullable=False),
sa.Column('Property3PropertyId', sa.BIGINT(), nullable=False),
sa.Column('Property3Value1', sa.INTEGER(), nullable=False),
sa.Column('Property3Value2', sa.INTEGER(), nullable=False),
sa.Column('Property4PropertyId', sa.BIGINT(), nullable=False),
sa.Column('Property4Value1', sa.INTEGER(), nullable=False),
sa.Column('Property4Value2', sa.INTEGER(), nullable=False),
sa.ForeignKeyConstraint(['Property1PropertyId'], ['sao_static_property.PropertyId'], onupdate='cascade', ondelete='cascade'),
sa.ForeignKeyConstraint(['Property2PropertyId'], ['sao_static_property.PropertyId'], onupdate='cascade', ondelete='cascade'),
sa.ForeignKeyConstraint(['Property3PropertyId'], ['sao_static_property.PropertyId'], onupdate='cascade', ondelete='cascade'),
sa.ForeignKeyConstraint(['Property4PropertyId'], ['sao_static_property.PropertyId'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('ExBonusTableId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_ex_tower',
sa.Column('ExTowerQuestId', sa.BIGINT(), nullable=False),
sa.Column('ExTowerId', sa.INTEGER(), nullable=False),
sa.Column('ReleaseExTowerQuestId', sa.INTEGER(), nullable=False),
sa.Column('Title', sa.VARCHAR(length=255), nullable=False),
sa.Column('Title_en', sa.VARCHAR(length=255), nullable=True),
sa.Column('ExBonusTableSubId', sa.INTEGER(), nullable=False),
sa.Column('QuestSceneId', sa.BIGINT(), nullable=False),
sa.ForeignKeyConstraint(['QuestSceneId'], ['sao_static_quest.QuestSceneId'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('ExTowerQuestId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_side_quest',
sa.Column('SideQuestId', sa.BIGINT(), nullable=False),
sa.Column('DisplayName', sa.VARCHAR(length=255), nullable=False),
sa.Column('DisplayName_en', sa.VARCHAR(length=255), nullable=True),
sa.Column('EpisodeNum', sa.INTEGER(), nullable=False),
sa.Column('ExBonusTableSubId', sa.INTEGER(), nullable=False),
sa.Column('QuestSceneId', sa.BIGINT(), nullable=False),
sa.ForeignKeyConstraint(['QuestSceneId'], ['sao_static_quest.QuestSceneId'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('SideQuestId'),
sa.UniqueConstraint('SideQuestId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_skill_table',
sa.Column('SkillTableId', sa.BIGINT(), nullable=False),
sa.Column('SkillId', sa.BIGINT(), nullable=False),
sa.Column('SkillTableSubId', sa.INTEGER(), nullable=False),
sa.Column('LevelObtained', sa.INTEGER(), nullable=False),
sa.Column('AwakeningId', sa.INTEGER(), nullable=False),
sa.ForeignKeyConstraint(['SkillId'], ['sao_static_skill.SkillId'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('SkillTableId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_static_tower',
sa.Column('TowerId', sa.BIGINT(), nullable=False),
sa.Column('ReleaseTowerId', sa.INTEGER(), nullable=False),
sa.Column('ExBonusTableSubId', sa.INTEGER(), nullable=False),
sa.Column('QuestSceneId', sa.BIGINT(), nullable=False),
sa.ForeignKeyConstraint(['QuestSceneId'], ['sao_static_quest.QuestSceneId'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('TowerId'),
mysql_charset='utf8mb4'
)
op.create_table('sao_player_ex_bonus',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('user', sa.INTEGER(), nullable=False),
sa.Column('quest_scene_id', sa.BIGINT(), nullable=False),
sa.Column('ex_bonus_table_id', sa.BIGINT(), nullable=False),
sa.Column('quest_clear_flag', sa.BOOLEAN(), server_default='0', nullable=False),
sa.ForeignKeyConstraint(['ex_bonus_table_id'], ['sao_static_ex_bonus.ExBonusTableId'], onupdate='cascade', ondelete='cascade'),
sa.ForeignKeyConstraint(['quest_scene_id'], ['sao_static_quest.QuestSceneId'], onupdate='cascade', ondelete='cascade'),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user', 'quest_scene_id', 'ex_bonus_table_id', name='sao_player_ex_bonus_uk'),
mysql_charset='utf8mb4'
)
op.create_table('sao_player_hero_card',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('user', sa.INTEGER(), nullable=False),
sa.Column('user_hero_id', sa.INTEGER(), nullable=False),
sa.Column('holographic_flag', sa.BOOLEAN(), server_default='0', nullable=False),
sa.Column('serial', sa.VARCHAR(length=20), nullable=True),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.ForeignKeyConstraint(['user_hero_id'], ['sao_hero_log_data.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('serial'),
mysql_charset='utf8mb4'
)
op.alter_column('sao_end_sessions', 'play_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('now()'),
existing_nullable=False)
op.drop_table('sao_static_equipment_list')
op.create_table("sao_static_equipment_list",
sa.Column("EquipmentId", sa.BIGINT, primary_key=True, nullable=False),
sa.Column("EquipmentType", sa.INTEGER, nullable=False),
sa.Column("WeaponTypeId", sa.INTEGER, nullable=False),
sa.Column("Name", sa.VARCHAR(255), nullable=False),
sa.Column("Name_en", sa.VARCHAR(255)),
sa.Column("Rarity", sa.INTEGER, nullable=False),
sa.Column("Power", sa.INTEGER, nullable=False),
sa.Column("StrengthIncrement", sa.INTEGER, nullable=False),
sa.Column("SkillCondition", sa.INTEGER, nullable=False),
sa.Column("Property1PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property1Value1", sa.INTEGER, nullable=False),
sa.Column("Property1Value2", sa.INTEGER, nullable=False),
sa.Column("Property2PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property2Value1", sa.INTEGER, nullable=False),
sa.Column("Property2Value2", sa.INTEGER, nullable=False),
sa.Column("Property3PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property3Value1", sa.INTEGER, nullable=False),
sa.Column("Property3Value2", sa.INTEGER, nullable=False),
sa.Column("Property4PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property4Value1", sa.INTEGER, nullable=False),
sa.Column("Property4Value2", sa.INTEGER, nullable=False),
sa.Column("SalePrice", sa.INTEGER, nullable=False),
sa.Column("CompositionExp", sa.INTEGER, nullable=False),
sa.Column("AwakeningExp", sa.INTEGER, nullable=False),
sa.Column("FlavorText", sa.VARCHAR(255), nullable=False),
sa.Column("FlavorText_en", sa.VARCHAR(255)),
mysql_charset="utf8mb4"
)
op.drop_table('sao_static_hero_list')
op.create_table("sao_static_hero_list",
sa.Column("HeroLogId", sa.BIGINT, primary_key=True, nullable=False),
sa.Column("CharaId", sa.INTEGER, nullable=False),
sa.Column("Name", sa.VARCHAR(255), nullable=False),
sa.Column("Nickname", sa.VARCHAR(255), nullable=False),
sa.Column("Name_en", sa.VARCHAR(255)),
sa.Column("Nickname_en", sa.VARCHAR(255)),
sa.Column("Rarity", sa.INTEGER, nullable=False),
sa.Column("WeaponTypeId", sa.INTEGER, nullable=False),
sa.Column("HeroLogRoleId", sa.INTEGER, nullable=False),
sa.Column("CostumeTypeId", sa.INTEGER, nullable=False),
sa.Column("UnitId", sa.INTEGER, nullable=False),
sa.Column("DefaultEquipmentId1", sa.BIGINT, sa.ForeignKey("sao_static_equipment_list.EquipmentId", ondelete="cascade", onupdate="cascade")),
sa.Column("DefaultEquipmentId2", sa.BIGINT, sa.ForeignKey("sao_static_equipment_list.EquipmentId", ondelete="cascade", onupdate="cascade")),
sa.Column("SkillTableSubId", sa.INTEGER, nullable=False),
sa.Column("HpMin", sa.INTEGER, nullable=False),
sa.Column("HpMax", sa.INTEGER, nullable=False),
sa.Column("StrMin", sa.INTEGER, nullable=False),
sa.Column("StrMax", sa.INTEGER, nullable=False),
sa.Column("VitMin", sa.INTEGER, nullable=False),
sa.Column("VitMax", sa.INTEGER, nullable=False),
sa.Column("IntMin", sa.INTEGER, nullable=False),
sa.Column("IntMax", sa.INTEGER, nullable=False),
sa.Column("Property1PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property1Value1", sa.INTEGER, nullable=False),
sa.Column("Property1Value2", sa.INTEGER, nullable=False),
sa.Column("Property2PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property2Value1", sa.INTEGER, nullable=False),
sa.Column("Property2Value2", sa.INTEGER, nullable=False),
sa.Column("Property3PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property3Value1", sa.INTEGER, nullable=False),
sa.Column("Property3Value2", sa.INTEGER, nullable=False),
sa.Column("Property4PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("Property4Value1", sa.INTEGER, nullable=False),
sa.Column("Property4Value2", sa.INTEGER, nullable=False),
sa.Column("FlavorText", sa.VARCHAR(255), nullable=False),
sa.Column("FlavorText_en", sa.VARCHAR(255)),
sa.Column("SalePrice", sa.INTEGER, nullable=False),
sa.Column("CompositionExp", sa.INTEGER, nullable=False),
sa.Column("AwakeningExp", sa.INTEGER, nullable=False),
sa.Column("Slot4UnlockLevel", sa.INTEGER, nullable=False),
sa.Column("Slot5UnlockLevel", sa.INTEGER, nullable=False),
sa.Column("CollectionEmptyFrameDisplayFlag", sa.BOOLEAN, nullable=False),
mysql_charset="utf8mb4"
)
op.drop_table('sao_static_item_list')
op.create_table("sao_static_item_list",
sa.Column("ItemId", sa.INTEGER, nullable=False, primary_key=True),
sa.Column("ItemTypeId", sa.INTEGER, nullable=False),
sa.Column("Name", sa.VARCHAR(255), nullable=False),
sa.Column("Name_en", sa.VARCHAR(255)),
sa.Column("Rarity", sa.INTEGER, nullable=False),
sa.Column("Value", sa.INTEGER, nullable=False),
sa.Column("PropertyId", sa.BIGINT, sa.ForeignKey("sao_static_property.PropertyId", ondelete="cascade", onupdate="cascade"), nullable=False),
sa.Column("PropertyValue1Min", sa.INTEGER, nullable=False),
sa.Column("PropertyValue1Max", sa.INTEGER, nullable=False),
sa.Column("PropertyValue2Min", sa.INTEGER, nullable=False),
sa.Column("PropertyValue2Max", sa.INTEGER, nullable=False),
sa.Column("FlavorText", sa.VARCHAR(255), nullable=False),
sa.Column("FlavorText_en", sa.VARCHAR(255)),
sa.Column("SalePrice", sa.INTEGER, nullable=False),
sa.Column("ItemIcon", sa.VARCHAR(255), nullable=False),
mysql_charset="utf8mb4"
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('sao_static_item_list')
op.create_table("sao_static_item_list",
sa.Column("id", sa.Integer, primary_key=True, nullable=False),
sa.Column("version", sa.Integer),
sa.Column("itemId", sa.Integer),
sa.Column("itemTypeId", sa.Integer),
sa.Column("name", sa.String(255)),
sa.Column("rarity", sa.Integer),
sa.Column("flavorText", sa.String(255)),
sa.Column("enabled", sa.Boolean),
sa.UniqueConstraint(
"version", "itemId", name="sao_static_item_list_uk"
),
mysql_charset="utf8mb4"
)
op.drop_table('sao_static_hero_list')
op.create_table("sao_static_hero_list",
sa.Column("id", sa.Integer, primary_key=True, nullable=False),
sa.Column("version", sa.Integer),
sa.Column("heroLogId", sa.Integer),
sa.Column("name", sa.String(255)),
sa.Column("nickname", sa.String(255)),
sa.Column("rarity", sa.Integer),
sa.Column("skillTableSubId", sa.Integer),
sa.Column("awakeningExp", sa.Integer),
sa.Column("flavorText", sa.String(255)),
sa.Column("enabled", sa.Boolean),
sa.UniqueConstraint(
"version", "heroLogId", name="sao_static_hero_list_uk"
),
mysql_charset="utf8mb4",
)
op.drop_table('sao_static_equipment_list')
op.create_table("sao_static_equipment_list",
sa.Column("id", sa.Integer, primary_key=True, nullable=False),
sa.Column("version", sa.Integer),
sa.Column("equipmentId", sa.Integer),
sa.Column("equipmentType", sa.Integer),
sa.Column("weaponTypeId", sa.Integer),
sa.Column("name", sa.String(255)),
sa.Column("rarity", sa.Integer),
sa.Column("flavorText", sa.String(255)),
sa.Column("enabled", sa.Boolean),
sa.UniqueConstraint(
"version", "equipmentId", name="sao_static_equipment_list_uk"
),
mysql_charset="utf8mb4"
)
op.alter_column('sao_end_sessions', 'play_date',
existing_type=mysql.TIMESTAMP(),
server_default=sa.text('CURRENT_TIMESTAMP'),
existing_nullable=False)
op.drop_table('sao_player_hero_card')
op.drop_table('sao_player_ex_bonus')
op.drop_table('sao_static_tower')
op.drop_table('sao_static_skill_table')
op.drop_table('sao_static_side_quest')
op.drop_table('sao_static_ex_tower')
op.drop_table('sao_static_ex_bonus')
op.drop_table('sao_static_episode')
op.drop_table('sao_player_tutorial')
op.drop_table('sao_player_resource_card')
op.drop_table('sao_player_beginner_mission')
op.drop_table('sao_static_trace_table')
op.drop_table('sao_static_skill')
op.drop_table('sao_static_reward')
op.drop_table('sao_static_property')
op.drop_table('sao_static_quest')
op.create_table('sao_static_quest',
sa.Column('id', mysql.INTEGER(), autoincrement=True, nullable=False),
sa.Column('enabled', mysql.TINYINT(display_width=1), autoincrement=False, nullable=True),
sa.Column('version', mysql.INTEGER(), autoincrement=False, nullable=True),
sa.Column('questSceneId', mysql.INTEGER(), autoincrement=False, nullable=True),
sa.Column('sortNo', mysql.INTEGER(), autoincrement=False, nullable=True),
sa.Column('name', mysql.VARCHAR(charset='utf8mb4', collation='utf8mb4_general_ci', length=255), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint("version", "questSceneId", name="sao_static_quest_uk"),
mysql_charset='utf8mb4'
)
# ### end Alembic commands ###
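# A minimal seeding sketch, not part of this migration: the recreated SAO static
# tables are keyed on the game's own IDs and chained together with foreign keys,
# so sao_static_property has to be populated before any table that references
# PropertyId (equipment, heroes, items, EX bonuses). Table names are real; the
# connection URL and the inserted values are placeholders.
from sqlalchemy import MetaData, create_engine

def seed_property_first(db_url: str = "mysql+pymysql://user:pass@host/aime") -> None:
    engine = create_engine(db_url)
    meta = MetaData()
    meta.reflect(bind=engine, only=["sao_static_property", "sao_static_item_list"])
    prop = meta.tables["sao_static_property"]
    item = meta.tables["sao_static_item_list"]
    with engine.begin() as conn:
        conn.execute(prop.insert().values(
            PropertyId=1, PropertyTargetType=0,
            PropertyName="None", PropertyNameFormat="{0}",
            PropertyTypeId=0,
            Value1Min=0, Value1Max=0, Value2Min=0, Value2Max=0,
        ))
        # Items (and equipment/heroes) can now point their PropertyId columns at row 1.
        conn.execute(item.insert().values(
            ItemId=1, ItemTypeId=0, Name="Example item", Rarity=1, Value=0,
            PropertyId=1, PropertyValue1Min=0, PropertyValue1Max=0,
            PropertyValue2Min=0, PropertyValue2Max=0,
            FlavorText="", SalePrice=0, ItemIcon="",
        ))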

View File

@ -0,0 +1,87 @@
"""CHUNITHM LUMINOUS
Revision ID: b23f985100ba
Revises: 3657efefc5a4
Create Date: 2024-06-20 08:08:08.759261
"""
from alembic import op
from sqlalchemy import Column, Integer, Boolean, UniqueConstraint
# revision identifiers, used by Alembic.
revision = 'b23f985100ba'
down_revision = '3657efefc5a4'
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"chuni_profile_net_battle",
Column("id", Integer, primary_key=True, nullable=False),
Column("user", Integer, nullable=False),
Column("isRankUpChallengeFailed", Boolean),
Column("highestBattleRankId", Integer),
Column("battleIconId", Integer),
Column("battleIconNum", Integer),
Column("avatarEffectPoint", Integer),
mysql_charset="utf8mb4",
)
op.create_foreign_key(
None,
"chuni_profile_net_battle",
"aime_user",
["user"],
["id"],
ondelete="cascade",
onupdate="cascade",
)
op.create_table(
"chuni_item_cmission",
Column("id", Integer, primary_key=True, nullable=False),
Column("user", Integer, nullable=False),
Column("missionId", Integer, nullable=False),
Column("point", Integer),
UniqueConstraint("user", "missionId", name="chuni_item_cmission_uk"),
mysql_charset="utf8mb4",
)
op.create_foreign_key(
None,
"chuni_item_cmission",
"aime_user",
["user"],
["id"],
ondelete="cascade",
onupdate="cascade",
)
op.create_table(
"chuni_item_cmission_progress",
Column("id", Integer, primary_key=True, nullable=False),
Column("user", Integer, nullable=False),
Column("missionId", Integer, nullable=False),
Column("order", Integer),
Column("stage", Integer),
Column("progress", Integer),
UniqueConstraint(
"user", "missionId", "order", name="chuni_item_cmission_progress_uk"
),
mysql_charset="utf8mb4",
)
op.create_foreign_key(
None,
"chuni_item_cmission_progress",
"aime_user",
["user"],
["id"],
ondelete="cascade",
onupdate="cascade",
)
def downgrade():
op.drop_table("chuni_profile_net_battle")
op.drop_table("chuni_item_cmission")
op.drop_table("chuni_item_cmission_progress")

View File

@ -0,0 +1,29 @@
"""Remove old db mgmt system
Revision ID: d8950c7ce2fc
Revises: 835b862f9bf0
Create Date: 2024-01-09 13:43:51.381175
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'd8950c7ce2fc'
down_revision = '835b862f9bf0'
branch_labels = None
depends_on = None
def upgrade():
op.drop_table("schema_versions")
def downgrade():
op.create_table(
"schema_versions",
sa.Column("game", sa.String(4), primary_key=True, nullable=False),
sa.Column("version", sa.Integer, nullable=False, server_default="1"),
mysql_charset="utf8mb4",
)

View File

@ -1,13 +1,14 @@
import logging, coloredlogs
from typing import Optional, Dict, List
from typing import Optional
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy import create_engine
from logging.handlers import TimedRotatingFileHandler
import importlib, os
import os
import secrets, string
import bcrypt
from hashlib import sha256
import alembic.config
import glob
from core.config import CoreConfig
from core.data.schema import *
@ -15,7 +16,6 @@ from core.utils import Utils
class Data:
current_schema_version = 6
engine = None
session = None
user = None
@ -77,281 +77,177 @@ class Data:
)
self.logger.handler_set = True # type: ignore
def __alembic_cmd(self, command: str, *args: str) -> None:
old_dir = os.path.abspath(os.path.curdir)
base_dir = os.path.join(os.path.abspath(os.path.curdir), 'core', 'data', 'alembic')
alembicArgs = [
"-c",
os.path.join(base_dir, "alembic.ini"),
"-x",
f"script_location={base_dir}",
"-x",
f"sqlalchemy.url={self.__url}",
command,
]
alembicArgs.extend(args)
os.chdir(base_dir)
alembic.config.main(argv=alembicArgs)
os.chdir(old_dir)
def create_database(self):
self.logger.info("Creating databases...")
try:
metadata.create_all(self.__engine.connect())
except SQLAlchemyError as e:
self.logger.error(f"Failed to create databases! {e}")
return
games = Utils.get_all_titles()
for game_dir, game_mod in games.items():
try:
if hasattr(game_mod, "database") and hasattr(
game_mod, "current_schema_version"
):
game_mod.database(self.config)
metadata.create_all(self.__engine.connect())
self.base.touch_schema_ver(
game_mod.current_schema_version, game_mod.game_codes[0]
)
except Exception as e:
self.logger.warning(
f"Could not load database schema from {game_dir} - {e}"
)
self.logger.info(f"Setting base_schema_ver to {self.current_schema_version}")
self.base.set_schema_ver(self.current_schema_version)
self.logger.info(
f"Setting user auto_incrememnt to {self.config.database.user_table_autoincrement_start}"
)
self.user.reset_autoincrement(
self.config.database.user_table_autoincrement_start
metadata.create_all(
self.engine,
checkfirst=True,
)
def recreate_database(self):
self.logger.info("Dropping all databases...")
self.base.execute("SET FOREIGN_KEY_CHECKS=0")
try:
metadata.drop_all(self.__engine.connect())
except SQLAlchemyError as e:
self.logger.error(f"Failed to drop databases! {e}")
return
for _, mod in Utils.get_all_titles().items():
if hasattr(mod, "database"):
mod.database(self.config)
metadata.create_all(
self.engine,
checkfirst=True,
)
for root, dirs, files in os.walk("./titles"):
for dir in dirs:
if not dir.startswith("__"):
try:
mod = importlib.import_module(f"titles.{dir}")
# Stamp the end revision as if alembic had created it, so it can take off after this.
self.__alembic_cmd(
"stamp",
"head",
)
try:
if hasattr(mod, "database"):
mod.database(self.config)
metadata.drop_all(self.__engine.connect())
def schema_upgrade(self, ver: str = None):
self.__alembic_cmd(
"upgrade",
"head" if not ver else ver,
)
except Exception as e:
self.logger.warning(
f"Could not load database schema from {dir} - {e}"
)
def schema_downgrade(self, ver: str):
self.__alembic_cmd(
"downgrade",
ver,
)
except ImportError as e:
self.logger.warning(
f"Failed to load database schema dir {dir} - {e}"
)
break
self.base.execute("SET FOREIGN_KEY_CHECKS=1")
self.create_database()
def migrate_database(self, game: str, version: Optional[int], action: str) -> None:
old_ver = self.base.get_schema_ver(game)
sql = ""
if version is None:
if not game == "CORE":
titles = Utils.get_all_titles()
for folder, mod in titles.items():
if not mod.game_codes[0] == game:
continue
if hasattr(mod, "current_schema_version"):
version = mod.current_schema_version
else:
self.logger.warning(
f"current_schema_version not found for {folder}"
)
else:
version = self.current_schema_version
if version is None:
self.logger.warning(
f"Could not determine latest version for {game}, please specify --version"
)
if old_ver is None:
self.logger.error(
f"Schema for game {game} does not exist, did you run the creation script?"
)
return
if old_ver == version:
self.logger.info(
f"Schema for game {game} is already version {old_ver}, nothing to do"
)
return
if action == "upgrade":
for x in range(old_ver, version):
if not os.path.exists(
f"core/data/schema/versions/{game.upper()}_{x + 1}_{action}.sql"
):
self.logger.error(
f"Could not find {action} script {game.upper()}_{x + 1}_{action}.sql in core/data/schema/versions folder"
)
return
with open(
f"core/data/schema/versions/{game.upper()}_{x + 1}_{action}.sql",
"r",
encoding="utf-8",
) as f:
sql = f.read()
result = self.base.execute(sql)
if result is None:
self.logger.error("Error execuing sql script!")
return None
else:
for x in range(old_ver, version, -1):
if not os.path.exists(
f"core/data/schema/versions/{game.upper()}_{x - 1}_{action}.sql"
):
self.logger.error(
f"Could not find {action} script {game.upper()}_{x - 1}_{action}.sql in core/data/schema/versions folder"
)
return
with open(
f"core/data/schema/versions/{game.upper()}_{x - 1}_{action}.sql",
"r",
encoding="utf-8",
) as f:
sql = f.read()
result = self.base.execute(sql)
if result is None:
self.logger.error("Error execuing sql script!")
return None
result = self.base.set_schema_ver(version, game)
if result is None:
self.logger.error("Error setting version in schema_version table!")
return None
self.logger.info(f"Successfully migrated {game} to schema version {version}")
def create_owner(self, email: Optional[str] = None) -> None:
async def create_owner(self, email: Optional[str] = None, code: Optional[str] = "00000000000000000000") -> None:
pw = "".join(
secrets.choice(string.ascii_letters + string.digits) for i in range(20)
)
hash = bcrypt.hashpw(pw.encode(), bcrypt.gensalt())
user_id = self.user.create_user(email=email, permission=255, password=hash)
user_id = await self.user.create_user(username="sysowner", email=email, password=hash.decode(), permission=255)
if user_id is None:
self.logger.error(f"Failed to create owner with email {email}")
return
card_id = self.card.create_card(user_id, "00000000000000000000")
card_id = await self.card.create_card(user_id, code)
if card_id is None:
self.logger.error(f"Failed to create card for owner with id {user_id}")
return
self.logger.warning(
f"Successfully created owner with email {email}, access code 00000000000000000000, and password {pw} Make sure to change this password and assign a real card ASAP!"
f"Successfully created owner with email {email}, access code {code}, and password {pw} Make sure to change this password and assign a real card ASAP!"
)
def migrate_card(self, old_ac: str, new_ac: str, should_force: bool) -> None:
if old_ac == new_ac:
self.logger.error("Both access codes are the same!")
return
new_card = self.card.get_card_by_access_code(new_ac)
if new_card is None:
self.card.update_access_code(old_ac, new_ac)
return
if not should_force:
self.logger.warning(
f"Card already exists for access code {new_ac} (id {new_card['id']}). If you wish to continue, rerun with the '--force' flag."
f" All exiting data on the target card {new_ac} will be perminently erased and replaced with data from card {old_ac}."
)
return
self.logger.info(
f"All exiting data on the target card {new_ac} will be perminently erased and replaced with data from card {old_ac}."
)
self.card.delete_card(new_card["id"])
self.card.update_access_code(old_ac, new_ac)
hanging_user = self.user.get_user(new_card["user"])
if hanging_user["password"] is None:
self.logger.info(f"Delete hanging user {hanging_user['id']}")
self.user.delete_user(hanging_user["id"])
def delete_hanging_users(self) -> None:
"""
Finds and deletes users that have not registered for the webui and have no cards associated with them.
"""
unreg_users = self.user.get_unregistered_users()
if unreg_users is None:
self.logger.error("Error occoured finding unregistered users")
for user in unreg_users:
cards = self.card.get_user_cards(user["id"])
if cards is None:
self.logger.error(f"Error getting cards for user {user['id']}")
continue
if not cards:
self.logger.info(f"Delete hanging user {user['id']}")
self.user.delete_user(user["id"])
def autoupgrade(self) -> None:
all_game_versions = self.base.get_all_schema_vers()
if all_game_versions is None:
self.logger.warning("Failed to get schema versions")
return
all_games = Utils.get_all_titles()
all_games_list: Dict[str, int] = {}
for _, mod in all_games.items():
if hasattr(mod, "current_schema_version"):
all_games_list[mod.game_codes[0]] = mod.current_schema_version
for x in all_game_versions:
failed = False
game = x["game"].upper()
update_ver = int(x["version"])
latest_ver = all_games_list.get(game, 1)
if game == "CORE":
latest_ver = self.current_schema_version
if update_ver == latest_ver:
self.logger.info(f"{game} is already latest version")
continue
for y in range(update_ver + 1, latest_ver + 1):
if os.path.exists(f"core/data/schema/versions/{game}_{y}_upgrade.sql"):
with open(
f"core/data/schema/versions/{game}_{y}_upgrade.sql",
"r",
encoding="utf-8",
) as f:
sql = f.read()
result = self.base.execute(sql)
if result is None:
self.logger.error(
f"Error execuing sql script for game {game} v{y}!"
)
failed = True
break
else:
self.logger.warning(f"Could not find script {game}_{y}_upgrade.sql")
failed = True
if not failed:
self.base.set_schema_ver(latest_ver, game)
def show_versions(self) -> None:
all_game_versions = self.base.get_all_schema_vers()
for ver in all_game_versions:
self.logger.info(f"{ver['game']} -> v{ver['version']}")
async def migrate(self) -> None:
exist = await self.base.execute("SELECT * FROM alembic_version")
if exist is not None:
self.logger.warn("No need to migrate as you have already migrated to alembic. If you are trying to upgrade the schema, use `upgrade` instead!")
return
self.logger.info("Upgrading to latest with legacy system")
if not await self.legacy_upgrade():
self.logger.warn("No need to migrate as you have already deleted the old schema_versions system. If you are trying to upgrade the schema, use `upgrade` instead!")
return
self.logger.info("Done")
self.logger.info("Stamp with initial revision")
self.__alembic_cmd(
"stamp",
"835b862f9bf0",
)
self.logger.info("Upgrade")
self.__alembic_cmd(
"upgrade",
"head",
)
async def legacy_upgrade(self) -> bool:
vers = await self.base.execute("SELECT * FROM schema_versions")
if vers is None:
self.logger.warn("Cannot legacy upgrade, schema_versions table unavailable!")
return False
db_vers = {}
vers_list = vers.fetchall()
for x in vers_list:
db_vers[x['game']] = x['version']
core_now_ver = int(db_vers['CORE']) + 1
while os.path.exists(f"core/data/schema/versions/CORE_{core_now_ver}_upgrade.sql"):
with open(f"core/data/schema/versions/CORE_{core_now_ver}_upgrade.sql", "r") as f:
result = await self.base.execute(f.read())
if result is None:
self.logger.error(f"Invalid upgrade script CORE_{core_now_ver}_upgrade.sql")
break
result = await self.base.execute(f"UPDATE schema_versions SET version = {core_now_ver} WHERE game = 'CORE'")
if result is None:
self.logger.error(f"Failed to update schema version for CORE to {core_now_ver}")
break
self.logger.info(f"Upgrade CORE to version {core_now_ver}")
core_now_ver += 1
for _, mod in Utils.get_all_titles().items():
game_codes = getattr(mod, "game_codes", [])
for game in game_codes:
if game not in db_vers:
self.logger.warn(f"{game} does not have an antry in schema_versions, skipping")
continue
now_ver = int(db_vers[game]) + 1
while os.path.exists(f"core/data/schema/versions/{game}_{now_ver}_upgrade.sql"):
with open(f"core/data/schema/versions/{game}_{now_ver}_upgrade.sql", "r") as f:
result = await self.base.execute(f.read())
if result is None:
self.logger.error(f"Invalid upgrade script {game}_{now_ver}_upgrade.sql")
break
result = await self.base.execute(f"UPDATE schema_versions SET version = {now_ver} WHERE game = '{game}'")
if result is None:
self.logger.error(f"Failed to update schema version for {game} to {now_ver}")
break
self.logger.info(f"Upgrade {game} to version {now_ver}")
now_ver += 1
return True
async def create_revision(self, message: str) -> None:
if not message:
self.logger.info("Message is required for create-revision")
return
self.__alembic_cmd(
"revision",
"-m",
message,
)
async def create_revision_auto(self, message: str) -> None:
if not message:
self.logger.info("Message is required for create-revision")
return
for _, mod in Utils.get_all_titles().items():
if hasattr(mod, "database"):
mod.database(self.config)
self.__alembic_cmd(
"revision",
"--autogenerate",
"-m",
message,
)
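# A sketch of what the __alembic_cmd() wrapper above boils down to, for anyone who
# wants to drive alembic by hand. Equivalent CLI (the URL is a placeholder; the real
# value comes from the server's database config, self.__url in the helper above):
#
#   alembic -c core/data/alembic/alembic.ini \
#       -x script_location=core/data/alembic \
#       -x sqlalchemy.url=mysql+pymysql://user:pass@host/aime \
#       upgrade head
#
# The same thing from Python; note the real helper also chdir()s into the alembic
# directory before invoking it.
import os
import alembic.config

def run_alembic(command: str, *args: str, db_url: str = "mysql+pymysql://user:pass@host/aime") -> None:
    base_dir = os.path.join(os.path.abspath(os.path.curdir), "core", "data", "alembic")
    argv = [
        "-c", os.path.join(base_dir, "alembic.ini"),
        "-x", f"script_location={base_dir}",
        "-x", f"sqlalchemy.url={db_url}",
        command, *args,
    ]
    alembic.config.main(argv=argv)

# Data.migrate() above is then just: run the remaining legacy .sql upgrades, stamp
# revision 835b862f9bf0, and run_alembic("upgrade", "head").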

View File

@ -69,7 +69,7 @@ arcade_owner = Table(
class ArcadeData(BaseData):
def get_machine(self, serial: str = None, id: int = None) -> Optional[Row]:
async def get_machine(self, serial: str = None, id: int = None) -> Optional[Row]:
if serial is not None:
serial = serial.replace("-", "")
if len(serial) == 11:
@ -89,12 +89,12 @@ class ArcadeData(BaseData):
self.logger.error(f"{__name__ }: Need either serial or ID to look up!")
return None
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_machine(
async def create_machine(
self,
arcade_id: int,
serial: str = "",
@ -102,21 +102,21 @@ class ArcadeData(BaseData):
game: str = None,
is_cab: bool = False,
) -> Optional[int]:
if arcade_id:
if not arcade_id:
self.logger.error(f"{__name__ }: Need arcade id!")
return None
sql = machine.insert().values(
arcade=arcade_id, keychip=serial, board=board, game=game, is_cab=is_cab
arcade=arcade_id, serial=serial, board=board, game=game, is_cab=is_cab
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
def set_machine_serial(self, machine_id: int, serial: str) -> None:
result = self.execute(
async def set_machine_serial(self, machine_id: int, serial: str) -> None:
result = await self.execute(
machine.update(machine.c.id == machine_id).values(keychip=serial)
)
if result is None:
@ -125,8 +125,8 @@ class ArcadeData(BaseData):
)
return result.lastrowid
def set_machine_boardid(self, machine_id: int, boardid: str) -> None:
result = self.execute(
async def set_machine_boardid(self, machine_id: int, boardid: str) -> None:
result = await self.execute(
machine.update(machine.c.id == machine_id).values(board=boardid)
)
if result is None:
@ -134,29 +134,29 @@ class ArcadeData(BaseData):
f"Failed to update board id for machine {machine_id} -> {boardid}"
)
def get_arcade(self, id: int) -> Optional[Row]:
async def get_arcade(self, id: int) -> Optional[Row]:
sql = arcade.select(arcade.c.id == id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_arcade_machines(self, id: int) -> Optional[List[Row]]:
async def get_arcade_machines(self, id: int) -> Optional[List[Row]]:
sql = machine.select(machine.c.arcade == id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_arcade(
async def create_arcade(
self,
name: str,
name: str = None,
nickname: str = None,
country: str = "JPN",
country_id: int = 1,
state: str = "",
city: str = "",
regional_id: int = 1,
region_id: int = 1,
) -> Optional[int]:
if nickname is None:
nickname = name
@ -168,65 +168,104 @@ class ArcadeData(BaseData):
country_id=country_id,
state=state,
city=city,
regional_id=regional_id,
region_id=region_id,
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
def get_arcades_managed_by_user(self, user_id: int) -> Optional[List[Row]]:
async def get_arcades_managed_by_user(self, user_id: int) -> Optional[List[Row]]:
sql = select(arcade).join(arcade_owner, arcade_owner.c.arcade == arcade.c.id).where(arcade_owner.c.user == user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return False
return result.fetchall()
def get_manager_permissions(self, user_id: int, arcade_id: int) -> Optional[int]:
async def get_manager_permissions(self, user_id: int, arcade_id: int) -> Optional[int]:
sql = select(arcade_owner.c.permissions).where(and_(arcade_owner.c.user == user_id, arcade_owner.c.arcade == arcade_id))
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return False
return result.fetchone()
def get_arcade_owners(self, arcade_id: int) -> Optional[Row]:
async def get_arcade_owners(self, arcade_id: int) -> Optional[Row]:
sql = select(arcade_owner).where(arcade_owner.c.arcade == arcade_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def add_arcade_owner(self, arcade_id: int, user_id: int) -> None:
async def add_arcade_owner(self, arcade_id: int, user_id: int) -> None:
sql = insert(arcade_owner).values(arcade=arcade_id, user=user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
async def get_arcade_by_name(self, name: str) -> Optional[List[Row]]:
sql = arcade.select(or_(arcade.c.name.like(f"%{name}%"), arcade.c.nickname.like(f"%{name}%")))
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
async def get_arcades_by_ip(self, ip: str) -> Optional[List[Row]]:
sql = arcade.select().where(arcade.c.ip == ip)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
async def get_num_generated_keychips(self) -> Optional[int]:
result = await self.execute(select(func.count("serial LIKE 'A69A%'")).select_from(machine))
if result:
return result.fetchone()['count_1']
self.logger.error("Failed to count machine serials that start with A69A!")
def format_serial(
self, platform_code: str, platform_rev: int, serial_num: int, append: int = 4152
self, platform_code: str, platform_rev: int, serial_letter: str, serial_num: int, append: int, dash: bool = False
) -> str:
return f"{platform_code}{platform_rev:02d}A{serial_num:04d}{append:04d}" # 0x41 = A, 0x52 = R
return f"{platform_code}{'-' if dash else ''}{platform_rev:02d}{serial_letter}{serial_num:04d}{append:04d}"
def validate_keychip_format(self, serial: str) -> bool:
if re.fullmatch(r"^A[0-9]{2}[E|X][-]?[0-9]{2}[A-HJ-NP-Z][0-9]{4}([0-9]{4})?$", serial) is None:
# For the 2nd letter, E and X are the only "real" values that have been observed
if re.fullmatch(r"^A[0-9]{2}[A-Z][-]?[0-9]{2}[A-HJ-NP-Z][0-9]{4}([0-9]{4})?$", serial) is None:
return False
return True
# Thanks bottersnike!
def get_keychip_suffix(self, year: int, month: int) -> str:
assert year > 1957
assert 1 <= month <= 12
def get_arcade_by_name(self, name: str) -> Optional[List[Row]]:
sql = arcade.select(or_(arcade.c.name.like(f"%{name}%"), arcade.c.nickname.like(f"%{name}%")))
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
year -= 1957
# Jan/Feb/Mar are from the previous tax year
if month < 4:
year -= 1
assert year >= 1 and year <= 99
def get_arcades_by_ip(self, ip: str) -> Optional[List[Row]]:
sql = arcade.select().where(arcade.c.ip == ip)
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
month = ((month - 1) + 9) % 12 # Offset so April=0
return f"{year:02}{month // 6:01}{month % 6 + 1:01}"
def parse_keychip_suffix(self, suffix: str) -> tuple[int, int]:
year = int(suffix[0:2])
half = int(suffix[2])
assert half in (0, 1)
period = int(suffix[3])
assert period in (1, 2, 3, 4, 5, 6)
month = half * 6 + (period - 1)
month = ((month + 3) % 12) + 1 # Offset so Jan=1
# Jan/Feb/Mar are from the previous tax year
if month < 4:
year += 1
year += 1957
return (year, month)
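# A quick round-trip check of the two helpers above (worked out from the code, not
# from any official documentation): the suffix packs the fiscal year since 1957,
# which half of that year, and the month within that half, with Jan-Mar counted
# against the previous fiscal year.
def _demo_keychip_suffix(arcade_data) -> None:
    # arcade_data is an ArcadeData instance
    assert arcade_data.get_keychip_suffix(2024, 8) == "6705"    # Aug 2024 -> FY67, first half, 5th month
    assert arcade_data.get_keychip_suffix(2025, 1) == "6714"    # Jan 2025 still counts as FY67
    assert arcade_data.parse_keychip_suffix("6705") == (2024, 8)
    assert arcade_data.parse_keychip_suffix("6714") == (2025, 1)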

View File

@ -8,21 +8,14 @@ from sqlalchemy.engine.base import Connection
from sqlalchemy.sql import text, func, select
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy import MetaData, Table, Column
from sqlalchemy.types import Integer, String, TIMESTAMP, JSON
from sqlalchemy.types import Integer, String, TIMESTAMP, JSON, INTEGER, TEXT
from sqlalchemy.schema import ForeignKey
from sqlalchemy.dialects.mysql import insert
from core.config import CoreConfig
metadata = MetaData()
schema_ver = Table(
"schema_versions",
metadata,
Column("game", String(4), primary_key=True, nullable=False),
Column("version", Integer, nullable=False, server_default="1"),
mysql_charset="utf8mb4",
)
event_log = Table(
"event_log",
metadata,
@ -30,6 +23,12 @@ event_log = Table(
Column("system", String(255), nullable=False),
Column("type", String(255), nullable=False),
Column("severity", Integer, nullable=False),
Column("user", INTEGER, ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade")),
Column("arcade", INTEGER, ForeignKey("arcade.id", ondelete="cascade", onupdate="cascade")),
Column("machine", INTEGER, ForeignKey("machine.id", ondelete="cascade", onupdate="cascade")),
Column("ip", TEXT(39)),
Column("game", TEXT(4)),
Column("version", TEXT(24)),
Column("message", String(1000), nullable=False),
Column("details", JSON, nullable=False),
Column("when_logged", TIMESTAMP, nullable=False, server_default=func.now()),
@ -43,11 +42,11 @@ class BaseData:
self.conn = conn
self.logger = logging.getLogger("database")
def execute(self, sql: str, opts: Dict[str, Any] = {}) -> Optional[CursorResult]:
async def execute(self, sql: str, opts: Dict[str, Any] = {}) -> Optional[CursorResult]:
res = None
try:
self.logger.info(f"SQL Execute: {''.join(str(sql).splitlines())}")
self.logger.debug(f"SQL Execute: {''.join(str(sql).splitlines())}")
res = self.conn.execute(text(sql), opts)
except SQLAlchemyError as e:
@ -82,62 +81,24 @@ class BaseData:
"""
return randrange(10000, 9999999)
def get_all_schema_vers(self) -> Optional[List[Row]]:
sql = select(schema_ver)
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_schema_ver(self, game: str) -> Optional[int]:
sql = select(schema_ver).where(schema_ver.c.game == game)
result = self.execute(sql)
if result is None:
return None
row = result.fetchone()
if row is None:
return None
return row["version"]
def touch_schema_ver(self, ver: int, game: str = "CORE") -> Optional[int]:
sql = insert(schema_ver).values(game=game, version=ver)
conflict = sql.on_duplicate_key_update(version=schema_ver.c.version)
result = self.execute(conflict)
if result is None:
self.logger.error(
f"Failed to update schema version for game {game} (v{ver})"
)
return None
return result.lastrowid
def set_schema_ver(self, ver: int, game: str = "CORE") -> Optional[int]:
sql = insert(schema_ver).values(game=game, version=ver)
conflict = sql.on_duplicate_key_update(version=ver)
result = self.execute(conflict)
if result is None:
self.logger.error(
f"Failed to update schema version for game {game} (v{ver})"
)
return None
return result.lastrowid
def log_event(
self, system: str, type: str, severity: int, message: str, details: Dict = {}
async def log_event(
self, system: str, type: str, severity: int, message: str, details: Dict = {}, user: int = None,
arcade: int = None, machine: int = None, ip: str = None, game: str = None, version: str = None
) -> Optional[int]:
sql = event_log.insert().values(
system=system,
type=type,
severity=severity,
user=user,
arcade=arcade,
machine=machine,
ip=ip,
game=game,
version=version,
message=message,
details=json.dumps(details),
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
self.logger.error(
@ -147,9 +108,9 @@ class BaseData:
return result.lastrowid
def get_event_log(self, entries: int = 100) -> Optional[List[Dict]]:
sql = event_log.select().limit(entries).all()
result = self.execute(sql)
async def get_event_log(self, entries: int = 100) -> Optional[List[Row]]:
sql = event_log.select().order_by(event_log.c.id.desc()).limit(entries)
result = await self.execute(sql)
if result is None:
return None
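# A minimal sketch of how a handler might use the widened log_event() signature,
# assuming `data` is the Data instance from core.data. The system/type labels,
# game code, and version string here are purely illustrative.
import logging

async def log_bad_keychip(data, keychip: str, ip: str) -> None:
    await data.base.log_event(
        "allnet",
        "bad_auth",
        logging.WARN,
        f"Unrecognized keychip {keychip}",
        details={"keychip": keychip},
        ip=ip,
        game="SDEZ",
        version="1.30",
    )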

View File

@ -1,6 +1,6 @@
from typing import Dict, List, Optional
from sqlalchemy import Table, Column, UniqueConstraint
from sqlalchemy.types import Integer, String, Boolean, TIMESTAMP
from sqlalchemy.types import Integer, String, Boolean, TIMESTAMP, BIGINT, VARCHAR
from sqlalchemy.sql.schema import ForeignKey
from sqlalchemy.sql import func
from sqlalchemy.engine import Row
@ -11,107 +11,149 @@ aime_card = Table(
"aime_card",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("access_code", String(20)),
Column("user", ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column("access_code", String(20), nullable=False, unique=True),
Column("idm", String(16), unique=True),
Column("chip_id", BIGINT, unique=True),
Column("created_date", TIMESTAMP, server_default=func.now()),
Column("last_login_date", TIMESTAMP, onupdate=func.now()),
Column("is_locked", Boolean, server_default="0"),
Column("is_banned", Boolean, server_default="0"),
Column("memo", VARCHAR(16)),
UniqueConstraint("user", "access_code", name="aime_card_uk"),
mysql_charset="utf8mb4",
)
class CardData(BaseData):
def get_card_by_access_code(self, access_code: str) -> Optional[Row]:
moble_os_codes = set([0x06, 0x07, 0x10, 0x12, 0x13, 0x14, 0x15, 0x17, 0x18])
card_os_codes = set([0x20, 0xF0, 0xF1, 0xF2, 0xF3, 0xF4, 0xF5, 0xF6, 0xF7])
async def get_card_by_access_code(self, access_code: str) -> Optional[Row]:
sql = aime_card.select(aime_card.c.access_code == access_code)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_card_by_id(self, card_id: int) -> Optional[Row]:
async def get_card_by_id(self, card_id: int) -> Optional[Row]:
sql = aime_card.select(aime_card.c.id == card_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def update_access_code(self, old_ac: str, new_ac: str) -> None:
async def update_access_code(self, old_ac: str, new_ac: str) -> None:
sql = aime_card.update(aime_card.c.access_code == old_ac).values(
access_code=new_ac
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
self.logger.error(
f"Failed to change card access code from {old_ac} to {new_ac}"
)
def get_user_id_from_card(self, access_code: str) -> Optional[int]:
async def get_user_id_from_card(self, access_code: str) -> Optional[int]:
"""
Given a 20 digit access code as a string, get the user id associated with that card
"""
card = self.get_card_by_access_code(access_code)
card = await self.get_card_by_access_code(access_code)
if card is None:
return None
return int(card["user"])
def get_card_banned(self, access_code: str) -> Optional[bool]:
async def get_card_banned(self, access_code: str) -> Optional[bool]:
"""
Given a 20 digit access code as a string, check if the card is banned
"""
card = self.get_card_by_access_code(access_code)
card = await self.get_card_by_access_code(access_code)
if card is None:
return None
if card["is_banned"]:
return True
return False
def get_card_locked(self, access_code: str) -> Optional[bool]:
async def get_card_locked(self, access_code: str) -> Optional[bool]:
"""
Given a 20 digit access code as a string, check if the card is locked
"""
card = self.get_card_by_access_code(access_code)
card = await self.get_card_by_access_code(access_code)
if card is None:
return None
if card["is_locked"]:
return True
return False
def delete_card(self, card_id: int) -> None:
async def delete_card(self, card_id: int) -> None:
sql = aime_card.delete(aime_card.c.id == card_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to delete card with id {card_id}")
def get_user_cards(self, aime_id: int) -> Optional[List[Row]]:
async def get_user_cards(self, aime_id: int) -> Optional[List[Row]]:
"""
Returns all cards owned by a user
"""
sql = aime_card.select(aime_card.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def create_card(self, user_id: int, access_code: str) -> Optional[int]:
async def create_card(self, user_id: int, access_code: str) -> Optional[int]:
"""
Given an aime_user id and a 20 digit access code as a string, create a card and return the ID if successful
"""
sql = aime_card.insert().values(user=user_id, access_code=access_code)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
async def update_card_last_login(self, access_code: str) -> None:
sql = aime_card.update(aime_card.c.access_code == access_code).values(
last_login_date=func.now()
)
result = await self.execute(sql)
if result is None:
self.logger.warn(f"Failed to update last login time for {access_code}")
async def get_card_by_idm(self, idm: str) -> Optional[Row]:
result = await self.execute(aime_card.select(aime_card.c.idm == idm))
if result:
return result.fetchone()
async def get_card_by_chip_id(self, chip_id: int) -> Optional[Row]:
result = await self.execute(aime_card.select(aime_card.c.chip_id == chip_id))
if result:
return result.fetchone()
async def set_chip_id_by_access_code(self, access_code: str, chip_id: int) -> Optional[Row]:
result = await self.execute(aime_card.update(aime_card.c.access_code == access_code).values(chip_id=chip_id))
if not result:
self.logger.error(f"Failed to update chip ID to {chip_id} for {access_code}")
async def set_idm_by_access_code(self, access_code: str, idm: str) -> Optional[Row]:
result = await self.execute(aime_card.update(aime_card.c.access_code == access_code).values(idm=idm))
if not result:
self.logger.error(f"Failed to update IDm to {idm} for {access_code}")
async def set_access_code_by_access_code(self, old_ac: str, new_ac: str) -> None:
result = await self.execute(aime_card.update(aime_card.c.access_code == old_ac).values(access_code=new_ac))
if not result:
self.logger.error(f"Failed to change card access code from {old_ac} to {new_ac}")
async def set_memo_by_access_code(self, access_code: str, memo: str) -> None:
result = await self.execute(aime_card.update(aime_card.c.access_code == access_code).values(memo=memo))
if not result:
self.logger.error(f"Failed to add memo to card {access_code}")
def to_access_code(self, luid: str) -> str:
"""
Given a FeliCa card's internal 16 hex character luid, convert it to a 0-padded 20 digit access code as a string
@ -122,4 +164,4 @@ class CardData(BaseData):
"""
Given a 20 digit access code as a string, return the 16 hex character luid
"""
return f"{int(access_code):0{16}x}"
return f"{int(access_code):0{16}X}"

View File

@ -1,4 +1,3 @@
from enum import Enum
from typing import Optional, List
from sqlalchemy import Table, Column
from sqlalchemy.types import Integer, String, TIMESTAMP
@ -24,15 +23,8 @@ aime_user = Table(
mysql_charset="utf8mb4",
)
class PermissionBits(Enum):
PermUser = 1
PermMod = 2
PermSysAdmin = 4
class UserData(BaseData):
def create_user(
async def create_user(
self,
id: int = None,
username: str = None,
@ -60,20 +52,20 @@ class UserData(BaseData):
username=username, email=email, password=password, permissions=permission
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_user(self, user_id: int) -> Optional[Row]:
async def get_user(self, user_id: int) -> Optional[Row]:
sql = select(aime_user).where(aime_user.c.id == user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return False
return result.fetchone()
def check_password(self, user_id: int, passwd: bytes = None) -> bool:
usr = self.get_user(user_id)
async def check_password(self, user_id: int, passwd: bytes = None) -> bool:
usr = await self.get_user(user_id)
if usr is None:
return False
@ -85,39 +77,50 @@ class UserData(BaseData):
return bcrypt.checkpw(passwd, usr["password"].encode())
def reset_autoincrement(self, ai_value: int) -> None:
# ALTER TABLE isn't in sqlalchemy so we do this the ugly way
sql = f"ALTER TABLE aime_user AUTO_INCREMENT={ai_value}"
self.execute(sql)
def delete_user(self, user_id: int) -> None:
async def delete_user(self, user_id: int) -> None:
sql = aime_user.delete(aime_user.c.id == user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to delete user with id {user_id}")
def get_unregistered_users(self) -> List[Row]:
async def get_unregistered_users(self) -> List[Row]:
"""
Returns a list of users who have not registered with the webui. They may or may not have cards.
"""
sql = select(aime_user).where(aime_user.c.password == None)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def find_user_by_email(self, email: str) -> Row:
async def find_user_by_email(self, email: str) -> Row:
sql = select(aime_user).where(aime_user.c.email == email)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return False
return result.fetchone()
def find_user_by_username(self, username: str) -> List[Row]:
async def find_user_by_username(self, username: str) -> List[Row]:
sql = aime_user.select(aime_user.c.username.like(f"%{username}%"))
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return False
return result.fetchall()
async def change_password(self, user_id: int, new_passwd: str) -> bool:
sql = aime_user.update(aime_user.c.id == user_id).values(password = new_passwd)
result = await self.execute(sql)
return result is not None
async def change_username(self, user_id: int, new_name: str) -> bool:
sql = aime_user.update(aime_user.c.id == user_id).values(username = new_name)
result = await self.execute(sql)
return result is not None
async def get_user_by_username(self, username: str) -> Optional[Row]:
result = await self.execute(aime_user.select(aime_user.c.username == username))
if result: return result.fetchone()

View File

@ -1,8 +1,8 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE ongeki_user_event_point DROP COLUMN version;
ALTER TABLE ongeki_user_event_point DROP COLUMN rank;
ALTER TABLE ongeki_user_event_point DROP COLUMN type;
ALTER TABLE ongeki_user_event_point DROP COLUMN `rank`;
ALTER TABLE ongeki_user_event_point DROP COLUMN `type`;
ALTER TABLE ongeki_user_event_point DROP COLUMN date;
ALTER TABLE ongeki_user_tech_event DROP COLUMN version;
@ -19,4 +19,4 @@ DROP TABLE ongeki_static_tech_music;
DROP TABLE ongeki_static_client_testmode;
DROP TABLE ongeki_static_game_point;
SET FOREIGN_KEY_CHECKS=1;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -1,8 +1,8 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE ongeki_user_event_point ADD COLUMN version INTEGER NOT NULL;
ALTER TABLE ongeki_user_event_point ADD COLUMN rank INTEGER;
ALTER TABLE ongeki_user_event_point ADD COLUMN type INTEGER NOT NULL;
ALTER TABLE ongeki_user_event_point ADD COLUMN `rank` INTEGER;
ALTER TABLE ongeki_user_event_point ADD COLUMN `type` INTEGER NOT NULL;
ALTER TABLE ongeki_user_event_point ADD COLUMN date VARCHAR(25);
ALTER TABLE ongeki_user_tech_event ADD COLUMN version INTEGER NOT NULL;
@ -12,87 +12,87 @@ ALTER TABLE ongeki_user_mission_point ADD COLUMN version INTEGER NOT NULL;
ALTER TABLE ongeki_static_events ADD COLUMN endDate TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP;
CREATE TABLE ongeki_tech_event_ranking (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
user INT NOT NULL,
version INT NOT NULL,
date VARCHAR(25),
eventId INT NOT NULL,
rank INT,
totalPlatinumScore INT NOT NULL,
totalTechScore INT NOT NULL,
UNIQUE KEY ongeki_tech_event_ranking_uk (user, eventId),
CONSTRAINT ongeki_tech_event_ranking_ibfk1 FOREIGN KEY (user) REFERENCES aime_user(id) ON DELETE CASCADE ON UPDATE CASCADE
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
user INT NOT NULL,
version INT NOT NULL,
date VARCHAR(25),
eventId INT NOT NULL,
`rank` INT,
totalPlatinumScore INT NOT NULL,
totalTechScore INT NOT NULL,
UNIQUE KEY ongeki_tech_event_ranking_uk (user, eventId),
CONSTRAINT ongeki_tech_event_ranking_ibfk1 FOREIGN KEY (user) REFERENCES aime_user(id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE TABLE ongeki_static_music_ranking_list (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
musicId INT NOT NULL,
point INT NOT NULL,
userName VARCHAR(255),
UNIQUE KEY ongeki_static_music_ranking_list_uk (version, musicId)
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
musicId INT NOT NULL,
point INT NOT NULL,
userName VARCHAR(255),
UNIQUE KEY ongeki_static_music_ranking_list_uk (version, musicId)
);
CREATE TABLE ongeki_static_rewards (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
rewardId INT NOT NULL,
rewardName VARCHAR(255) NOT NULL,
itemKind INT NOT NULL,
itemId INT NOT NULL,
UNIQUE KEY ongeki_tech_event_ranking_uk (version, rewardId)
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
rewardId INT NOT NULL,
rewardName VARCHAR(255) NOT NULL,
itemKind INT NOT NULL,
itemId INT NOT NULL,
UNIQUE KEY ongeki_tech_event_ranking_uk (version, rewardId)
);
CREATE TABLE ongeki_static_present_list (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
presentId INT NOT NULL,
presentName VARCHAR(255) NOT NULL,
rewardId INT NOT NULL,
stock INT NOT NULL,
message VARCHAR(255),
startDate VARCHAR(25) NOT NULL,
endDate VARCHAR(25) NOT NULL,
UNIQUE KEY ongeki_static_present_list_uk (version, presentId, rewardId)
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
presentId INT NOT NULL,
presentName VARCHAR(255) NOT NULL,
rewardId INT NOT NULL,
stock INT NOT NULL,
message VARCHAR(255),
startDate VARCHAR(25) NOT NULL,
endDate VARCHAR(25) NOT NULL,
UNIQUE KEY ongeki_static_present_list_uk (version, presentId, rewardId)
);
CREATE TABLE ongeki_static_tech_music (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
eventId INT NOT NULL,
musicId INT NOT NULL,
level INT NOT NULL,
UNIQUE KEY ongeki_static_tech_music_uk (version, musicId, eventId)
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
eventId INT NOT NULL,
musicId INT NOT NULL,
level INT NOT NULL,
UNIQUE KEY ongeki_static_tech_music_uk (version, musicId, eventId)
);
CREATE TABLE ongeki_static_client_testmode (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
regionId INT NOT NULL,
placeId INT NOT NULL,
clientId VARCHAR(11) NOT NULL,
updateDate TIMESTAMP NOT NULL,
isDelivery BOOLEAN NOT NULL,
groupId INT NOT NULL,
groupRole INT NOT NULL,
continueMode INT NOT NULL,
selectMusicTime INT NOT NULL,
advertiseVolume INT NOT NULL,
eventMode INT NOT NULL,
eventMusicNum INT NOT NULL,
patternGp INT NOT NULL,
limitGp INT NOT NULL,
maxLeverMovable INT NOT NULL,
minLeverMovable INT NOT NULL,
UNIQUE KEY ongeki_static_client_testmode_uk (clientId)
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
regionId INT NOT NULL,
placeId INT NOT NULL,
clientId VARCHAR(11) NOT NULL,
updateDate TIMESTAMP NOT NULL,
isDelivery BOOLEAN NOT NULL,
groupId INT NOT NULL,
groupRole INT NOT NULL,
continueMode INT NOT NULL,
selectMusicTime INT NOT NULL,
advertiseVolume INT NOT NULL,
eventMode INT NOT NULL,
eventMusicNum INT NOT NULL,
patternGp INT NOT NULL,
limitGp INT NOT NULL,
maxLeverMovable INT NOT NULL,
minLeverMovable INT NOT NULL,
UNIQUE KEY ongeki_static_client_testmode_uk (clientId)
);
CREATE TABLE ongeki_static_game_point (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
type INT NOT NULL,
cost INT NOT NULL,
startDate VARCHAR(25) NOT NULL DEFAULT "2000-01-01 05:00:00.0",
endDate VARCHAR(25) NOT NULL DEFAULT "2099-01-01 05:00:00.0",
UNIQUE KEY ongeki_static_game_point_uk (type)
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
`type` INT NOT NULL,
cost INT NOT NULL,
startDate VARCHAR(25) NOT NULL DEFAULT "2000-01-01 05:00:00.0",
endDate VARCHAR(25) NOT NULL DEFAULT "2099-01-01 05:00:00.0",
UNIQUE KEY ongeki_static_game_point_uk (`type`)
);
SET FOREIGN_KEY_CHECKS=1;
SET FOREIGN_KEY_CHECKS=1;

File diff suppressed because it is too large Load Diff

View File

@ -1,4 +0,0 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>{{ arcade.name }}</h1>
{% endblock content %}

View File

@ -1,5 +0,0 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
{% include "core/frontend/widgets/err_banner.jinja" %}
<h1>Machine Management</h1>
{% endblock content %}

View File

@ -1,103 +0,0 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>System Management</h1>
<div class="row" id="rowForm">
{% if sesh.permissions >= 2 %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="usrLookup" name="usrLookup" action="/sys/lookup.user" class="form-inline">
<h3>User Search</h3>
<div class="form-group">
<label for="usrId">User ID</label>
<input type="number" class="form-control" id="usrId" name="usrId">
</div>
OR
<div class="form-group">
<label for="usrName">Username</label>
<input type="text" class="form-control" id="usrName" name="usrName">
</div>
OR
<div class="form-group">
<label for="usrEmail">Email address</label>
<input type="email" class="form-control" id="usrEmail" name="usrEmail" aria-describedby="emailHelp">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
{% endif %}
{% if sesh.permissions >= 4 %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="arcadeLookup" name="arcadeLookup" action="/sys/lookup.arcade" class="form-inline" >
<h3>Arcade Search</h3>
<div class="form-group">
<label for="arcadeId">Arcade ID</label>
<input type="number" class="form-control" id="arcadeId" name="arcadeId">
</div>
OR
<div class="form-group">
<label for="arcadeName">Arcade Name</label>
<input type="text" class="form-control" id="arcadeName" name="arcadeName">
</div>
OR
<div class="form-group">
<label for="arcadeUser">Owner User ID</label>
<input type="number" class="form-control" id="arcadeUser" name="arcadeUser">
</div>
OR
<div class="form-group">
<label for="arcadeIp">Assigned IP Address</label>
<input type="text" class="form-control" id="arcadeIp" name="arcadeIp">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
<div class="col-sm-6" style="max-width: 25%;">
<form id="cabLookup" name="cabLookup" action="/sys/lookup.cab" class="form-inline" >
<h3>Machine Search</h3>
<div class="form-group">
<label for="cabId">Machine ID</label>
<input type="number" class="form-control" id="cabId" name="cabId">
</div>
OR
<div class="form-group">
<label for="cabSerial">Machine Serial</label>
<input type="text" class="form-control" id="cabSerial" name="cabSerial">
</div>
OR
<div class="form-group">
<label for="cabAcId">Arcade ID</label>
<input type="number" class="form-control" id="cabAcId" name="cabAcId">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
{% endif %}
</div>
<div class="row" id="rowResult" style="margin: 10px;">
{% if sesh.permissions >= 2 %}
<div id="userSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for usr in usrlist %}
<a href=/user/{{ usr.id }}><pre>{{ usr.id }} | {{ usr.username if usr.username != None else "<i>No Name Set</i>"}}</pre></a>
{% endfor %}
</div>
{% endif %}
{% if sesh.permissions >= 4 %}
<div id="arcadeSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for ac in aclist %}
<pre><a href=/arcade/{{ ac.id }}>{{ ac.id }} | {{ ac.name if ac.name != None else "<i>No Name Set</i>" }} | {{ ac.ip if ac.ip != None else "<i>No IP Assigned</i>"}}</pre></a>
{% endfor %}
</div
><div id="cabSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for cab in cablist %}
<a href=/cab/{{ cab.id }}><pre>{{ cab.id }} | {{ cab.game if cab.game != None else "<i>ANY </i>" }} | {{ cab.serial }}</pre></a>
{% endfor %}
</div>
{% endif %}
</div>
<div class="row" id="rowAdd">
</div>
{% endblock content %}

View File

@ -1,41 +0,0 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>Management for {{ username }}</h1>
<h2>Cards <button class="btn btn-success" data-bs-toggle="modal" data-bs-target="#card_add">Add</button></h2>
<ul style="font-size: 20px;">
{% for c in cards %}
<li>{{ c.access_code }}: {{ c.status }}&nbsp;{% if c.status == 'Active'%}<button class="btn-warning btn">Lock</button>{% elif c.status == 'Locked' %}<button class="btn-warning btn">Unlock</button>{% endif %}&nbsp;<button class="btn-danger btn">Delete</button></li>
{% endfor %}
</ul>
{% if arcades is defined %}
<h2>Arcades</h2>
<ul style="font-size: 20px;">
{% for a in arcades %}
<li><a href=/arcade/{{ a.id }}>{{ a.name }}</a></li>
{% endfor %}
</ul>
{% endif %}
<div class="modal fade" id="card_add" tabindex="-1" aria-labelledby="card_add_label" aria-hidden="true">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h1 class="modal-title fs-5" id="card_add_label">Add Card</h1>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
HOW TO:<br>
Scan your card on any networked game and press the "View Access Code" button (varies by game) and enter the 20 digit code below.<br>
!!FOR AMUSEIC CARDS: DO NOT ENTER THE CODE SHOWN ON THE BACK OF THE CARD ITSELF OR IT WILL NOT WORK!!
<p /><label for="card_add_frm_access_code">Access Code:&nbsp;</label><input id="card_add_frm_access_code" maxlength="20" type="text" required>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-primary">Add</button>
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</div>
</div>
{% endblock content %}

View File

@ -1,18 +0,0 @@
{% if error > 0 %}
<div class="err-banner">
<h3>Error</h3>
{% if error == 1 %}
Card not registered, or wrong password
{% elif error == 2 %}
Missing or malformed access code
{% elif error == 3 %}
Failed to create user
{% elif error == 4 %}
Arcade not found
{% elif error == 5 %}
Machine not found
{% else %}
An unknown error occoured
{% endif %}
</div>
{% endif %}

View File

@ -1,8 +1,8 @@
from typing import Dict, Any, Optional, List
from typing import Dict, Any, Optional
import logging, coloredlogs
from logging.handlers import TimedRotatingFileHandler
from twisted.web import resource
from twisted.web.http import Request
from starlette.requests import Request
from starlette.responses import PlainTextResponse
from datetime import datetime
from Crypto.Cipher import Blowfish
import pytz
@ -10,9 +10,11 @@ import pytz
from .config import CoreConfig
from .utils import Utils
from .title import TitleServlet
from .data import Data
from .const import *
class MuchaServlet:
mucha_registry: List[str] = []
mucha_registry: Dict[str, Dict[str, str]] = {}
def __init__(self, cfg: CoreConfig, cfg_dir: str) -> None:
self.config = cfg
self.config_dir = cfg_dir
@ -36,90 +38,148 @@ class MuchaServlet:
self.logger.setLevel(cfg.mucha.loglevel)
coloredlogs.install(level=cfg.mucha.loglevel, logger=self.logger, fmt=log_fmt_str)
self.data = Data(cfg)
for _, mod in TitleServlet.title_registry.items():
if hasattr(mod, "get_mucha_info"):
enabled, game_cd = mod.get_mucha_info(
self.config, self.config_dir
)
if enabled:
self.mucha_registry.append(game_cd)
enabled, game_cds, netids = mod.get_mucha_info(self.config, self.config_dir)
if enabled:
for x in range(len(game_cds)):
self.mucha_registry[game_cds[x]] = { "netid_prefix": netids[x] }
self.logger.info(f"Serving {len(self.mucha_registry)} games")
def handle_boardauth(self, request: Request, _: Dict) -> bytes:
req_dict = self.mucha_preprocess(request.content.getvalue())
async def handle_boardauth(self, request: Request) -> bytes:
bod = await request.body()
req_dict = self.mucha_preprocess(bod)
client_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(
f"Error processing mucha request {request.content.getvalue()}"
f"Error processing mucha request {bod}"
)
return b"RESULTS=000"
return PlainTextResponse("RESULTS=000")
req = MuchaAuthRequest(req_dict)
self.logger.info(f"Boardauth request from {client_ip} for {req.gameVer}")
self.logger.debug(f"Mucha request {vars(req)}")
if not req.gameCd or not req.gameVer or not req.sendDate or not req.countryCd or not req.serialNum:
self.logger.warn(f"Missing required fields - {vars(req)}")
return PlainTextResponse("RESULTS=000")
if req.gameCd not in self.mucha_registry:
self.logger.warning(f"Unknown gameCd {req.gameCd}")
return b"RESULTS=000"
minfo = self.mucha_registry.get(req.gameCd, {})
if not minfo:
self.logger.warning(f"Unknown gameCd {req.gameCd} from {client_ip}")
return PlainTextResponse("RESULTS=000")
# TODO: Decrypt S/N
b_key = b""
for x in range(8):
b_key += req.sendDate[(x - 1) & 7].encode()
b_iv = b_key # what the fuck namco
cipher = Blowfish.new(b_key, Blowfish.MODE_ECB)
sn_decrypt = cipher.decrypt(bytes.fromhex(req.serialNum))
self.logger.debug(f"Decrypt SN to {sn_decrypt.hex()}")
cipher = Blowfish.new(b_key, Blowfish.MODE_CBC, b_iv)
try:
sn_decrypt = cipher.decrypt(bytes.fromhex(req.serialNum))[:12].decode()
except Exception as e:
self.logger.error(f"Decrypt SN {req.serialNum} failed! - {e}")
return PlainTextResponse("RESULTS=000")
self.logger.info(f"Boardauth request from {sn_decrypt} ({client_ip}) for {req.gameVer}")
resp = MuchaAuthResponse(
f"{self.config.mucha.hostname}{':' + str(self.config.allnet.port) if self.config.server.is_develop else ''}"
f"{self.config.server.hostname}{':' + str(self.config.server.port) if not self.config.server.is_using_proxy else ''}"
)
netid = minfo.get('netid_prefix', "ABxN") + sn_decrypt[5:]
cab = await self.data.arcade.get_machine(netid)
if cab:
arcade = await self.data.arcade.get_arcade(cab['id'])
if not arcade:
self.logger.error(f"Failed to get arcade with id {cab['id']}")
return PlainTextResponse("RESULTS=000")
resp.AREA_0 = arcade["region_id"] or AllnetJapanRegionId.AICHI.name
resp.AREA_0_EN = arcade["region_id"] or AllnetJapanRegionId.AICHI.name
resp.AREA_FULL_0 = arcade["region_id"] or AllnetJapanRegionId.AICHI.name
resp.AREA_FULL_0_EN = arcade["region_id"] or AllnetJapanRegionId.AICHI.name
resp.AREA_1 = arcade["country"] or cab['country'] or AllnetCountryCode.JAPAN.value
resp.AREA_1_EN = arcade["country"] or cab['country'] or AllnetCountryCode.JAPAN.value
resp.AREA_FULL_1 = arcade["country"] or cab['country'] or AllnetCountryCode.JAPAN.value
resp.AREA_FULL_1_EN = arcade["country"] or cab['country'] or AllnetCountryCode.JAPAN.value
resp.AREA_2 = arcade["city"] if arcade["city"] else ""
resp.AREA_2_EN = arcade["city"] if arcade["city"] else ""
resp.AREA_FULL_2 = arcade["city"] if arcade["city"] else ""
resp.AREA_FULL_2_EN = arcade["city"] if arcade["city"] else ""
resp.AREA_3 = ""
resp.AREA_3_EN = ""
resp.AREA_FULL_3 = ""
resp.AREA_FULL_3_EN = ""
resp.PREFECTURE_ID = arcade['region_id']
resp.COUNTRY_CD = arcade['country'] or cab['country'] or AllnetCountryCode.JAPAN.value
resp.PLACE_ID = req.placeId if req.placeId else f"{arcade['country'] or cab['country'] or AllnetCountryCode.JAPAN.value}{arcade['id']:04X}"
resp.SHOP_NAME = arcade['name']
resp.SHOP_NAME_EN = arcade['name']
resp.SHOP_NICKNAME = arcade['nickname']
resp.SHOP_NICKNAME_EN = arcade['nickname']
elif self.config.server.allow_unregistered_serials:
self.logger.info(f"Allow unknown serial {netid} ({sn_decrypt}) to auth")
else:
self.logger.warn(f'Auth failed for NetID {netid}')
return PlainTextResponse("RESULTS=000")
self.logger.debug(f"Mucha response {vars(resp)}")
return self.mucha_postprocess(vars(resp))
return PlainTextResponse(self.mucha_postprocess(vars(resp)))
def handle_updatecheck(self, request: Request, _: Dict) -> bytes:
req_dict = self.mucha_preprocess(request.content.getvalue())
async def handle_updatecheck(self, request: Request) -> bytes:
bod = await request.body()
req_dict = self.mucha_preprocess(bod)
client_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(
f"Error processing mucha request {request.content.getvalue()}"
f"Error processing mucha request {bod}"
)
return b"RESULTS=000"
return PlainTextResponse("RESULTS=000")
req = MuchaUpdateRequest(req_dict)
self.logger.info(f"Updatecheck request from {client_ip} for {req.gameVer}")
self.logger.info(f"Updatecheck request from {req.serialNum} ({client_ip}) for {req.gameVer}")
self.logger.debug(f"Mucha request {vars(req)}")
if req.gameCd not in self.mucha_registry:
self.logger.warning(f"Unknown gameCd {req.gameCd}")
return b"RESULTS=000"
return PlainTextResponse("RESULTS=000")
resp = MuchaUpdateResponse(req.gameVer, f"{self.config.mucha.hostname}{':' + str(self.config.allnet.port) if self.config.server.is_develop else ''}")
resp = MuchaUpdateResponse(req.gameVer, f"{self.config.server.hostname}{':' + str(self.config.server.port) if not self.config.server.is_using_proxy else ''}")
self.logger.debug(f"Mucha response {vars(resp)}")
return self.mucha_postprocess(vars(resp))
return PlainTextResponse(self.mucha_postprocess(vars(resp)))
def handle_dlstate(self, request: Request, _: Dict) -> bytes:
req_dict = self.mucha_preprocess(request.content.getvalue())
async def handle_dlstate(self, request: Request) -> bytes:
bod = await request.body()
req_dict = self.mucha_preprocess(bod)
client_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(
f"Error processing mucha request {request.content.getvalue()}"
f"Error processing mucha request {bod}"
)
return b""
return PlainTextResponse("RESULTS=000")
req = MuchaDownloadStateRequest(req_dict)
self.logger.info(f"DownloadState request from {client_ip} for {req.gameCd} -> {req.updateVer}")
self.logger.info(f"DownloadState request from {req.serialNum} ({client_ip}) for {req.gameCd} -> {req.updateVer}")
self.logger.debug(f"request {vars(req)}")
return b"RESULTS=001"
return PlainTextResponse("RESULTS=001")
def mucha_preprocess(self, data: bytes) -> Optional[Dict]:
try:
@ -169,7 +229,7 @@ class MuchaAuthResponse:
self.RESULTS = "001"
self.AUTH_INTERVAL = "86400"
self.SERVER_TIME = datetime.strftime(datetime.now(), "%Y%m%d%H%M")
self.UTC_SERVER_TIME = datetime.strftime(datetime.now(pytz.UTC), "%Y%m%d%H%M")
self.SERVER_TIME_UTC = datetime.strftime(datetime.now(pytz.UTC), "%Y%m%d%H%M")
self.CHARGE_URL = f"https://{mucha_url}/charge/"
self.FILE_URL = f"https://{mucha_url}/file/"
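For reference, the serial-number handling added to handle_boardauth builds an 8-byte Blowfish key from the request's sendDate field (characters taken in the order 7, 0, 1, ..., 6), uses that same value as the IV, and keeps the first 12 bytes of the decrypted serial. A standalone sketch of that scheme; the encrypt helper and the sample sendDate/serial values are assumptions added only so the round trip can be demonstrated:

```python
from Crypto.Cipher import Blowfish  # pycryptodome, as imported by this module

def mucha_sn_key(send_date: str) -> bytes:
    # Key bytes are the sendDate characters at indices 7, 0, 1, ..., 6
    return b"".join(send_date[(x - 1) & 7].encode() for x in range(8))

def decrypt_serial(send_date: str, serial_hex: str) -> str:
    key = mucha_sn_key(send_date)
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, key)  # IV == key, as in the handler
    return cipher.decrypt(bytes.fromhex(serial_hex))[:12].decode()

def encrypt_serial(send_date: str, serial: str) -> str:
    # Hypothetical inverse used only for this demo; pads the 12-char serial to 16 bytes
    key = mucha_sn_key(send_date)
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, key)
    return cipher.encrypt(serial.encode().ljust(16, b"\x00")).hex()

if __name__ == "__main__":
    send_date = "20240811"                            # example value only
    enc = encrypt_serial(send_date, "ABGN1234567X")   # hypothetical 12-character serial
    print(decrypt_serial(send_date, enc))             # -> ABGN1234567X
```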

View File

@ -0,0 +1,19 @@
{% extends "core/templates/index.jinja" %}
{% block content %}
{% if arcade is defined %}
<h1>{{ arcade.name }}</h1>
<h2>PCBs assigned to this arcade <button class="btn btn-success" id="btn_add_cab" onclick="toggle_add_cab_form()">Add</button></h2>
{% if success is defined and success == 3 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Cab added successfully
</div>
{% endif %}
<ul style="font-size: 20px;">
{% for c in arcade.cabs %}
<li><a href="/cab/{{ c.id }}">{{ c.serial }}</a> ({{ c.game if c.game else "Any" }})&nbsp;<button class="btn btn-secondary" onclick="prep_edit_form()">Edit</button>&nbsp;<button class="btn-danger btn">Delete</button></li>
{% endfor %}
</ul>
{% else %}
<h3>Arcade Not Found</h3>
{% endif %}
{% endblock content %}

View File

@ -1,4 +1,4 @@
{% extends "core/frontend/index.jinja" %}
{% extends "core/templates/index.jinja" %}
{% block content %}
<h1>Create User</h1>
<form id="create" style="max-width: 240px; min-width: 10%;" action="/gate/gate.create" method="post">

View File

@ -1,7 +1,7 @@
{% extends "core/frontend/index.jinja" %}
{% extends "core/templates/index.jinja" %}
{% block content %}
<h1>Gate</h1>
{% include "core/frontend/widgets/err_banner.jinja" %}
{% include "core/templates/widgets/err_banner.jinja" %}
<style>
/* Chrome, Safari, Edge, Opera */
input::-webkit-outer-spin-button,
@ -15,18 +15,18 @@
-moz-appearance: textfield;
}
</style>
<form id="login" style="max-width: 240px; min-width: 10%;" action="/gate/gate.login" method="post">
<form id="login" style="max-width: 240px; min-width: 15%;" action="/gate/gate.login" method="post">
<div class="form-group row">
<label for="access_code">Card Access Code</label><br>
<input form="login" class="form-control" name="access_code" id="access_code" type="number" placeholder="00000000000000000000" maxlength="20" required>
<label for="access_code">Access Code or Username</label><br>
<input form="login" class="form-control" name="access_code" id="access_code" placeholder="00000000000000000000" maxlength="20" required aria-describedby="access_code_help">
<div id="access_code_help" class="form-text">20 Digit access code from a card registered to your account, or your account username. (NOT your username from a game!)</div>
</div>
<div class="form-group row">
<label for="passwd">Password</label><br>
<input id="passwd" class="form-control" name="passwd" type="password" placeholder="password">
<input id="passwd" class="form-control" name="passwd" type="password" placeholder="password" aria-describedby="passwd_help">
<div id="passwd_help" class="form-text">Leave blank if registering for the webui. Your card must have been used on a game connected to this server to register.</div>
</div>
<p></p>
<input id="submit" class="btn btn-primary" style="display: block; margin: 0 auto;" form="login" type="submit" value="Login">
</form>
<h6>*To register for the webui, type in the access code of your card, as shown in a game, and leave the password field blank.</h6>
<h6>*If you have not registered a card with this server, you cannot create a webui account.</h6>
{% endblock content %}

View File

@ -84,7 +84,7 @@
</style>
</head>
<body>
{% include "core/frontend/widgets/topbar.jinja" %}
{% include "core/templates/widgets/topbar.jinja" %}
{% block content %}
<h1>{{ server_name }}</h1>
{% endblock content %}

View File

@ -0,0 +1,4 @@
{% extends "core/templates/index.jinja" %}
{% block content %}
<h1>Machine Management</h1>
{% endblock content %}

View File

@ -0,0 +1,192 @@
{% extends "core/templates/index.jinja" %}
{% block content %}
<h1>System Management</h1>
{% if error is defined %}
{% include "core/templates/widgets/err_banner.jinja" %}
{% endif %}
<h2>Search</h2>
<div class="row" id="rowForm">
{% if "{:08b}".format(sesh.permissions)[6] == "1" %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="usrLookup" name="usrLookup" action="/sys/lookup.user" class="form-inline">
<h3>User Search</h3>
<div class="form-group">
<label for="usrId">User ID</label>
<input type="number" class="form-control" id="usrId" name="usrId">
</div>
OR
<div class="form-group">
<label for="usrName">Username</label>
<input type="text" class="form-control" id="usrName" name="usrName">
</div>
OR
<div class="form-group">
<label for="usrEmail">Email address</label>
<input type="email" class="form-control" id="usrEmail" name="usrEmail">
</div>
OR
<div class="form-group">
<label for="usrAc">Access Code</label>
<input type="text" class="form-control" id="usrAc" name="usrAc" maxlength="20" placeholder="00000000000000000000">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
{% endif %}
{% if "{:08b}".format(sesh.permissions)[5] == "1" %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="shopLookup" name="shopLookup" action="/sys/lookup.shop" class="form-inline">
<h3>Shop search</h3>
<div class="form-group">
<label for="shopId">Shop ID</label>
<input type="number" class="form-control" id="shopId" name="shopId">
</div>
OR
<div class="form-group">
<label for="serialNum">Serial Number</label>
<input type="text" class="form-control" id="serialNum" name="serialNum" maxlength="15">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
<div class="col-sm-6" style="max-width: 25%;">
<a href="/sys/logs"><button class="btn btn-primary">Event Logs</button></a>
</div>
{% endif %}
</div>
<div class="row" id="rowResult" style="margin: 10px;">
{% if "{:08b}".format(sesh.permissions)[6] == "1" %}
<div id="userSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for usr in usrlist %}
<a href=/user/{{ usr.id }}><pre>{{ usr.username if usr.username is not none else "<i>No Name Set</i>"}}</pre></a>
{% endfor %}
</div>
{% endif %}
{% if "{:08b}".format(sesh.permissions)[5] == "1" %}
<div id="shopSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for shop in shoplist %}
<a href="/shop/{{ shop.id }}"><pre>{{ shop.name if shop.name else "<i>No Name Set</i>"}}</pre></a>
{% endfor %}
</div>
{% endif %}
</div>
<h2>Add</h2>
<div class="row" id="rowAdd">
{% if "{:08b}".format(sesh.permissions)[6] == "1" %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="usrAdd" name="usrAdd" action="/sys/add.user" class="form-inline" method="POST">
<h3>Add User</h3>
<div class="form-group">
<label for="usrName">Username</label>
<input type="text" class="form-control" id="usrName" name="usrName">
</div>
<br>
<div class="form-group">
<label for="usrEmail">Email address</label>
<input type="email" class="form-control" id="usrEmail" name="usrEmail" required>
</div>
<br>
<div class="form-group">
<label for="usrPerm">Permission Level</label>
<input type="number" class="form-control" id="usrPerm" name="usrPerm" value="1">
</div>
<br />
<button type="submit" class="btn btn-primary">Add</button>
</form>
</div>
<div class="col-sm-6" style="max-width: 25%;">
<form id="cardAdd" name="cardAdd" action="/sys/add.card" class="form-inline" method="POST">
<h3>Add Card</h3>
<div class="form-group">
<label for="cardUsr">User ID</label>
<input type="number" class="form-control" id="cardUsr" name="cardUsr" required>
</div>
<br>
<div class="form-group">
<label for="cardAc">Access Code</label>
<input type="text" class="form-control" id="cardAc" name="cardAc" maxlength="20" placeholder="00000000000000000000" required>
</div>
<br>
<div class="form-group">
<label for="cardIdm">IDm/Chip ID</label>
<input type="text" class="form-control" id="cardIdm" name="cardIdm" disabled>
</div>
<br />
<button type="submit" class="btn btn-primary">Add</button>
</form>
</div>
{% endif %}
{% if "{:08b}".format(sesh.permissions)[5] == "1" %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="shopAdd" name="shopAdd" action="/sys/add.shop" class="form-inline" method="POST">
<h3>Add Shop</h3>
<div class="form-group">
<label for="shopName">Name</label>
<input type="text" class="form-control" id="shopName" name="shopName">
</div>
<br>
<div class="form-group">
<label for="shopCountry">Country Code</label>
<input type="text" class="form-control" id="shopCountry" name="shopCountry" maxlength="3" placeholder="JPN">
</div>
<br />
<div class="form-group">
<label for="shopIp">VPN IP</label>
<input type="text" class="form-control" id="shopIp" name="shopIp">
</div>
<br />
<button type="submit" class="btn btn-primary">Add</button>
</form>
</div>
<div class="col-sm-6" style="max-width: 25%;">
<form id="cabAdd" name="cabAdd" action="/sys/add.cab" class="form-inline" method="POST">
<h3>Add Machine</h3>
<div class="form-group">
<label for="cabShop">Shop ID</label>
<input type="number" class="form-control" id="cabShop" name="cabShop" required>
</div>
<br>
<div class="form-group">
<label for="cabSerial">Serial</label>
<input type="text" class="form-control" id="cabSerial" name="cabSerial">
</div>
<br />
<div class="form-group">
<label for="cabGame">Game Code</label>
<input type="text" class="form-control" id="cabGame" name="cabGame" maxlength="4" placeholder="SXXX">
</div>
<br />
<button type="submit" class="btn btn-primary">Add</button>
</form>
</div>
{% endif %}
</div>
<div class="row" id="rowAddResult" style="margin: 10px;">
{% if "{:08b}".format(sesh.permissions)[6] == "1" %}
<div id="userAddResult" class="col-sm-6" style="max-width: 25%;">
{% if usradd is defined %}
<pre>Added user {{ usradd.username if usradd.username is not none else "with no name"}} with id {{usradd.id}} and password {{ usradd.password }}</pre>
{% endif %}
</div>
<div id="cardAddResult" class="col-sm-6" style="max-width: 25%;">
{% if cardadd is defined %}
<pre>Added {{ cardadd.access_code }} with id {{cardadd.id}} to user {{ cardadd.user }}</pre>
{% endif %}
</div>
{% endif %}
{% if "{:08b}".format(sesh.permissions)[5] == "1" %}
<div id="shopAddResult" class="col-sm-6" style="max-width: 25%;">
{% if shopadd is defined %}
<pre>Added Shop {{ shopadd.id }}</pre></a>
{% endif %}
</div>
<div id="cabAddResult" class="col-sm-6" style="max-width: 25%;">
{% if cabadd is defined %}
<pre>Added Machine {{ cabadd.id }} with serial {{ cabadd.serial }}</pre></a>
{% endif %}
</div>
{% endif %}
</div>
{% endblock content %}

View File

@ -0,0 +1,202 @@
{% extends "core/templates/index.jinja" %}
{% block content %}
<h1>Event Logs</h1>
<table class="table table-dark table-striped-columns" id="tbl_events">
<caption>Viewing last 100 logs</caption>
<thead>
<tr>
<th>Severity</th>
<th>Timestamp</th>
<th>System</th>
<th>Name</th>
<th>User</th>
<th>Arcade</th>
<th>Machine</th>
<th>Game</th>
<th>Version</th>
<th>Message</th>
<th>Params</th>
</tr>
</thead>
{% if events is not defined or events|length == 0 %}
<tr>
<td colspan="11" style="text-align:center"><i>No Events</i></td>
</tr>
{% endif %}
</table>
<div id="div_tbl_ctrl">
<select id="sel_per_page" onchange="update_tbl()">
<option value="10" selected>10</option>
<option value="25">25</option>
<option value="50">50</option>
<option value="100">100</option>
</select>
&nbsp;
<button class="btn btn-primary" id="btn_prev" disabled onclick="chg_page(-1)"><<</button>
<button class="btn btn-primary" id="btn_next" onclick="chg_page(1)">>></button>
</div>
<script type="text/javascript">
{% if events is defined %}
const TBL_DATA = {{events}};
{% else %}
const TBL_DATA = [];
{% endif %}
var per_page = 0;
var page = 0;
function update_tbl() {
if (TBL_DATA.length == 0) {
document.getElementById("btn_next").disabled = true;
document.getElementById("btn_prev").disabled = true;
return;
}
var tbl = document.getElementById("tbl_events");
for (var i = 0; i < per_page; i++) {
try{
tbl.deleteRow(1);
} catch {
break;
}
}
per_page = document.getElementById("sel_per_page").value;
if (per_page >= TBL_DATA.length) {
page = 0;
document.getElementById("btn_next").disabled = true;
document.getElementById("btn_prev").disabled = true;
}
for (var i = 0; i < per_page; i++) {
let off = (page * per_page) + i;
if (off >= TBL_DATA.length) {
if (page != 0) {
document.getElementById("btn_next").disabled = true;
document.getElementById("btn_prev").disabled = false;
}
break;
}
var data = TBL_DATA[off];
var row = tbl.insertRow(i + 1);
var cell_severity = row.insertCell(0);
switch (data.severity) {
case 10:
cell_severity.innerHTML = "DEBUG";
row.classList.add("table-success");
break;
case 20:
cell_severity.innerHTML = "INFO";
row.classList.add("table-info");
break;
case 30:
cell_severity.innerHTML = "WARN";
row.classList.add("table-warning");
break;
case 40:
cell_severity.innerHTML = "ERROR";
row.classList.add("table-danger");
break;
case 50:
cell_severity.innerHTML = "CRITICAL";
row.classList.add("table-danger");
break;
default:
cell_severity.innerHTML = "---";
row.classList.add("table-primary");
break;
}
var cell_ts = row.insertCell(1);
cell_ts.innerHTML = data.when_logged;
var cell_mod = row.insertCell(2);
cell_mod.innerHTML = data.system;
var cell_name = row.insertCell(3);
cell_name.innerHTML = data.type;
var cell_usr = row.insertCell(4);
if (data.user == 'NONE') {
cell_usr.innerHTML = "---";
} else {
cell_usr.innerHTML = "<a href=\"/user/" + data.user + "\">" + data.user + "</a>";
}
var cell_arcade = row.insertCell(5);
if (data.arcade == 'NONE') {
cell_arcade.innerHTML = "---";
} else {
cell_arcade.innerHTML = "<a href=\"/shop/" + data.arcade + "\">" + data.arcade + "</a>";
}
var cell_machine = row.insertCell(6);
if (data.arcade == 'NONE') {
cell_machine.innerHTML = "---";
} else {
cell_machine.innerHTML = "<a href=\"/cab/" + data.machine + "\">" + data.machine + "</a>";
}
var cell_game = row.insertCell(7);
if (data.game == 'NONE') {
cell_game.innerHTML = "---";
} else {
cell_game.innerHTML = data.game;
}
var cell_version = row.insertCell(8);
if (data.version == 'NONE') {
cell_version.innerHTML = "---";
} else {
cell_version.innerHTML = data.version;
}
var cell_msg = row.insertCell(9);
if (data.message == '') {
cell_msg.innerHTML = "---";
} else {
cell_msg.innerHTML = data.message;
}
var cell_deets = row.insertCell(10);
if (data.details == '{}') {
cell_deets.innerHTML = "---";
} else {
cell_deets.innerHTML = data.details;
}
}
}
function chg_page(num) {
var max_page = TBL_DATA.length / per_page;
console.log(max_page);
page = page + num;
if (page > max_page && max_page >= 1) {
page = max_page;
document.getElementById("btn_next").disabled = true;
document.getElementById("btn_prev").disabled = false;
return;
} else if (page < 0) {
page = 0;
document.getElementById("btn_next").disabled = false;
document.getElementById("btn_prev").disabled = true;
return;
} else if (page == 0) {
document.getElementById("btn_next").disabled = TBL_DATA.length == 0;
document.getElementById("btn_prev").disabled = true;
} else {
document.getElementById("btn_next").disabled = false;
document.getElementById("btn_prev").disabled = false;
}
update_tbl();
}
update_tbl();
</script>
{% endblock content %}

View File

@ -0,0 +1,213 @@
{% extends "core/templates/index.jinja" %}
{% block content %}
<script type="text/javascript">
function toggle_new_name_form() {
let frm = document.getElementById("new_name_form");
let btn = document.getElementById("btn_toggle_form");
if (frm.style['display'] != "") {
frm.style['display'] = "";
frm.style['max-height'] = "";
btn.innerText = "Cancel";
} else {
frm.style['display'] = "none";
frm.style['max-height'] = "0px";
btn.innerText = "Edit";
}
}
function toggle_add_card_form() {
let btn = document.getElementById("btn_add_card");
let dv = document.getElementById("add_card_container")
if (dv.style['display'] != "") {
btn.innerText = "Cancel";
dv.style['display'] = "";
} else {
btn.innerText = "Add";
dv.style['display'] = "none";
}
}
function toggle_idm_disabled(is_disabled) {
document.getElementById("btn_add_card");
let dv = document.getElementById("add_card_container")
if (dv.style['display'] != "") {
btn.innerText = "Cancel";
dv.style['display'] = "";
} else {
btn.innerText = "Add";
dv.style['display'] = "none";
}
}
function prep_edit_form(access_code, chip_id, idm, card_type, u_memo, card_id) {
ac = document.getElementById("card_edit_frm_access_code");
cid = document.getElementById("card_edit_frm_chip_id");
fidm = document.getElementById("card_edit_frm_idm");
memo = document.getElementById("card_edit_frm_memo");
document.getElementById("card_edit_frm_card_id").value = card_id;
if (chip_id == "None" || chip_id == undefined) {
chip_id = ""
}
if (idm == "None" || idm == undefined) {
idm = ""
}
if (u_memo == "None" || u_memo == undefined) {
u_memo = ""
}
ac.value = access_code;
cid.value = chip_id;
fidm.value = idm;
memo.value = u_memo;
if (access_code.startsWith("3") || access_code.startsWith("010")) {
cid.disabled = false;
fidm.disabled = true;
} else if (access_code.startsWith("5") || access_code.startsWith("0008")) {
cid.disabled = true;
fidm.disabled = false;
} else {
cid.disabled = true;
fidm.disabled = true;
}
}
</script>
<h1>Management for {{ username }}&nbsp;<button onclick="toggle_new_name_form()" class="btn btn-secondary" id="btn_toggle_form">Edit</button></h1>
{% if error is defined %}
{% include "core/templates/widgets/err_banner.jinja" %}
{% endif %}
{% if success is defined and success == 2 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Update successful
</div>
{% endif %}
<form style="max-width: 33%; display: none; max-height: 0px;" action="/user/update.name" method="post" id="new_name_form">
<div class="mb-3">
<label for="new_name" class="form-label">New Nickname</label>
<input type="text" class="form-control" id="new_name" name="new_name" aria-describedby="new_name_help">
<div id="new_name_help" class="form-text">Must be 10 characters or less</div>
</div>
<button type="submit" class="btn btn-primary">Submit</button>
</form>
<p></p>
<h2>Cards <button class="btn btn-success" id="btn_add_card" onclick="toggle_add_card_form()">Add</button></h2>
{% if success is defined and success == 3 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Card added successfully
</div>
{% endif %}
<div id="add_card_container" style="display: none; max-width: 33%;">
<form action="/user/add.card" method="post", id="frm_add_card">
<div class="form-check">
<input type="radio" id="card_add_frm_type_aicc" value="0" name="card_add_frm_type" aria-describedby="aicc_help" onclick="document.getElementById('card_add_frm_idm').disabled = false;">
<label class="form-label" for="card_add_frm_type_aicc">AmusementIC</label>
<div id="aicc_help" class="form-text">Starts with 5. If you don't have the IDm, use the 0008 access code shown in-game</div>
<br>
<input type="radio" id="card_add_frm_type_old" value="1" name="card_add_frm_type" aria-describedby="old_help" onclick="document.getElementById('card_add_frm_idm').disabled = true;">
<label class="form-label" for="card_add_frm_type_old">Old Aime/Banapass</label>
<div id="old_help" class="form-text">Starts with 010 (aime) or 3 (banapass)</div>
</div>
<label class="form-label" for="card_add_frm_access_code">Access Code:</label>
<input class="form-control" name="add_access_code" id="card_add_frm_access_code" maxlength="20" type="text" required aria-describedby="ac_help">
<div id="ac_help" class="form-text">20 digit code on the back of the card.</div>
<label class="form-label" for="card_add_frm_access_code">IDm:</label>
<input class="form-control" name="add_idm" id="card_add_frm_idm" maxlength="16" type="text" aria-describedby="idm_help">
<div id="idm_help" class="form-text">AmusementIC cards only! 16 hexidecimal digits, sometimes called the serial number, gotten by scanning the card with a reader.</div>
<br>
<button type="submit" class="btn btn-primary">Add</button>
</form>
<br>
</div>
{% if success is defined and success == 4 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Update successful
</div>
{% endif %}
<ul style="font-size: 20px;">
{% for c in cards %}
<li>{{ c.access_code }} ({{ c.type if c.memo is none or not c.memo else c.memo }}): {{ c.status }}&nbsp;<button onclick="prep_edit_form('{{ c.access_code }}', '{{ c.chip_id}}', '{{ c.idm }}', '{{ c.type }}', '{{ c.memo }}', '{{ c.id }}')" data-bs-toggle="modal" data-bs-target="#card_edit" class="btn btn-secondary" id="btn_edit_card_{{ c.access_code }}">View</button>&nbsp;{% if c.status == 'Active'%}<button class="btn-warning btn">Lock</button>{% elif c.status == 'Locked' %}<button class="btn-warning btn">Unlock</button>{% endif %}&nbsp;<button class="btn-danger btn" {{ "disabled" if cards|length == 1 else ""}}>Delete</button></li>
{% endfor %}
</ul>
<h2>Reset Password</h2>
{% if success is defined and success == 1 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Update successful
</div>
{% endif %}
<form style="max-width: 33%;" action="/user/update.pw" method="post">
<div class="mb-3">
<label for="current_pw" class="form-label">Current Password</label>
<input type="password" class="form-control" id="current_pw" name="current_pw">
</div>
<div class="mb-3">
<label for="password1" class="form-label">New Password</label>
<input type="password" class="form-control" id="password1" name="password1" aria-describedby="password_help">
<div id="password_help" class="form-text">Password must be at least 10 characters long, contain an upper and lowercase character, number, and special character</div>
</div>
<div class="mb-3">
<label for="password2" class="form-label">Retype New Password</label>
<input type="password" class="form-control" id="password2" name="password2">
</div>
<button type="submit" class="btn btn-primary">Submit</button>
</form>
{% if arcades is defined and arcades|length > 0 %}
<h2>Arcades</h2>
<ul>
{% for a in arcades %}
<li><h3>{{ a.name }}</h3>
{% if a.machines|length > 0 %}
<table>
<tr><th>Serial</th><th>Game</th><th>Last Seen</th></tr>
{% for m in a.machines %}
<tr><td>{{ m.serial }}</td><td>{{ m.game }}</td><td>{{ m.last_seen }}</td></tr>
{% endfor %}
</table>
{% endif %}
</li>
{% endfor %}
</ul>
{% endif %}
<div class="modal fade" id="card_edit" tabindex="-1" aria-labelledby="card_edit_label" aria-hidden="true">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h1 class="modal-title fs-5" id="card_edit_label">Card Information</h1>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
<form action="/user/edit.card" method="post" id="frm_edit_card">
<input type="hidden" readonly name="card_edit_frm_card_id" id="card_edit_frm_card_id">
<label class="form-label" for="card_edit_frm_access_code">Access Code:</label>
<input class="form-control-plaintext" readonly name="card_edit_frm_access_code" id="card_edit_frm_access_code" maxlength="20" type="text" required aria-describedby="ac_help">
<div id="ac_help" class="form-text">20 digit code on the back of the card. If this is incorrect, contact a sysadmin.</div>
<label class="form-label" for="card_edit_frm_idm" id="card_edit_frm_idm_lbl">FeliCa IDm:</label>
<input class="form-control-plaintext" aria-describedby="idm_help" name="add_felica_idm" id="card_edit_frm_idm" maxlength="16" type="text" readonly>
<div id="idm_help" class="form-text">8 bytes that uniquly idenfites a FeliCa card. Obtained by reading the card with an NFC reader.</div>
<label class="form-label" for="card_edit_frm_chip_id" id="card_edit_frm_chip_id_lbl">Mifare UID:</label>
<input class="form-control-plaintext" aria-describedby="chip_id_help" name="add_mifare_chip_id" id="card_edit_frm_chip_id" maxlength="8" type="text" readonly>
<div id="chip_id_help" class="form-text">4 byte integer that uniquly identifies a Mifare card. Obtained by reading the card with an NFC reader.</div>
<label class="form-label" for="card_edit_frm_memo" id="card_edit_frm_memo_lbl">Memo:</label>
<input class="form-control" aria-describedby="memo_help" name="add_memo" id="card_edit_frm_memo" maxlength="16" type="text">
<div id="memo_help" class="form-text">Must be 16 characters or less.</div>
</div>
<div class="modal-footer">
<button type="submit" class="btn btn-primary">Update Memo</button>
</form>
</div>
</div>
</div>
</div>
{% endblock content %}

View File

@ -0,0 +1,34 @@
{% if error > 0 %}
<div class="err-banner">
<h3>Error</h3>
{% if error == 1 %}
Card not registered, or wrong password
{% elif error == 2 %}
Missing or malformed access code
{% elif error == 3 %}
Failed to create user
{% elif error == 4 %}
Required field not filled or invalid
{% elif error == 5 %}
Incorrect old password
{% elif error == 6 %}
Passwords don't match
{% elif error == 7 %}
New password not acceptable
{% elif error == 8 %}
New Nickname too long
{% elif error == 9 %}
You must be logged in to perform this action
{% elif error == 10 %}
Invalid serial number
{% elif error == 11 %}
Access Denied
{% elif error == 12 %}
Card already registered
{% elif error == 13 %}
AmusementIC Access Codes beginning with 5 must have IDm
{% else %}
An unknown error occurred
{% endif %}
</div>
{% endif %}

View File

@ -3,19 +3,20 @@
</div>
<div style="background: #333; color: #f9f9f9; width: 80%; height: 50px; line-height: 50px; padding-left: 10px; float: left;">
<a href=/><button class="btn btn-primary">Home</button></a>&nbsp;
{% for game in game_list %}
<a href=/game/{{ game.url }}><button class="btn btn-success">{{ game.name }}</button></a>&nbsp;
{% for game, data in game_list|items %}
<a href=/game{{ data.url }}/><button class="btn btn-success">{{ game }}</button></a>&nbsp;
{% endfor %}
</div>
</div>
<div style="background: #333; color: #f9f9f9; width: 10%; height: 50px; line-height: 50px; text-align: center; float: left;">
{% if sesh is defined and sesh["permissions"] >= 2 %}
<a href="/sys"><button class="btn btn-primary">System</button></a>
<a href="/sys/"><button class="btn btn-primary">System</button></a>
{% endif %}
{% if sesh is defined and sesh["userId"] > 0 %}
<a href="/user"><button class="btn btn-primary">Account</button></a>
{% if sesh is defined and sesh["user_id"] > 0 %}
<a href="/user/"><button class="btn btn-primary">Account</button></a>
<a href="/user/logout"><button class="btn btn-danger">Logout</button></a>
{% else %}
<a href="/gate"><button class="btn btn-primary">Gate</button></a>
<a href="/gate/"><button class="btn btn-primary">Gate</button></a>
{% endif %}
</div>

View File

@ -1,12 +1,24 @@
from typing import Dict, List, Tuple
from typing import Dict, List, Tuple, Any
import json
import logging, coloredlogs
from logging.handlers import TimedRotatingFileHandler
from twisted.web.http import Request
from starlette.requests import Request
from starlette.responses import Response
from starlette.routing import Route
from core.config import CoreConfig
from core.data import Data
from core.utils import Utils
class JSONResponseNoASCII(Response):
media_type = "application/json"
def render(self, content: Any) -> bytes:
return json.dumps(
content,
ensure_ascii=False,
).encode("utf-8")
class BaseServlet:
def __init__(self, core_cfg: CoreConfig, cfg_dir: str) -> None:
self.core_cfg = core_cfg
@ -28,18 +40,16 @@ class BaseServlet:
"""
return False
def get_endpoint_matchers(self) -> Tuple[List[Tuple[str, str, Dict]], List[Tuple[str, str, Dict]]]:
def get_routes(self) -> List[Route]:
"""Called during boot to get all matcher endpoints this title servlet handles
Returns:
Tuple[List[Tuple[str, str, Dict]], List[Tuple[str, str, Dict]]]: A 2-length tuple where offset 0 is GET and offset 1 is POST,
containing a list of 3-length tuples where offset 0 is the name of the function in the handler that should be called, offset 1
is the matching string, and offset 2 is a dict containing rules for the matcher.
List[Route]: A list of Routes, WebSocketRoutes, or similar classes
"""
return (
[("render_GET", "/{game}/{version}/{endpoint}", {'game': R'S...'})],
[("render_POST", "/{game}/{version}/{endpoint}", {'game': R'S...'})]
)
return [
Route("/{game}/{version}/{endpoint}", self.render_POST, methods=["POST"]),
Route("/{game}/{version}/{endpoint}", self.render_GET, methods=["GET"]),
]
def setup(self) -> None:
"""Called once during boot, should contain any additional setup the handler must do, such as starting any sub-services
@ -58,11 +68,11 @@ class BaseServlet:
Tuple[str, str]: A tuple where offset 0 is the allnet uri field, and offset 1 is the allnet host field
"""
if not self.core_cfg.server.is_using_proxy and Utils.get_title_port(self.core_cfg) != 80:
return (f"http://{self.core_cfg.title.hostname}:{Utils.get_title_port(self.core_cfg)}/{game_code}/{game_ver}/", "")
return (f"http://{self.core_cfg.server.hostname}:{Utils.get_title_port(self.core_cfg)}/{game_code}/{game_ver}/", "")
return (f"http://{self.core_cfg.title.hostname}/{game_code}/{game_ver}/", "")
return (f"http://{self.core_cfg.server.hostname}/{game_code}/{game_ver}/", "")
def get_mucha_info(self, core_cfg: CoreConfig, cfg_dir: str) -> Tuple[bool, str]:
def get_mucha_info(self, core_cfg: CoreConfig, cfg_dir: str) -> Tuple[bool, List[str], List[str]]:
"""Called once during boot to check if this game is a mucha game
Args:
@ -70,17 +80,18 @@ class BaseServlet:
cfg_dir (str): Config directory
Returns:
Tuple[bool, str]: Tuple where offset 0 is true if the game is enabled, false otherwise, and offset 1 is the game CD
Tuple[bool, List[str], List[str]]: Tuple where offset 0 is true if the game is enabled, false otherwise, and offset 1 is the game CDs handled
by this servlet, and offset 2 is the mucha netID prefixes that should be used for each game CD.
"""
return (False, "")
return (False, [], [])
def render_POST(self, request: Request, game_code: str, matchers: Dict) -> bytes:
self.logger.warn(f"{game_code} Does not dispatch POST")
return None
async def render_POST(self, request: Request) -> bytes:
self.logger.warn(f"Game Does not dispatch POST")
return Response()
def render_GET(self, request: Request, game_code: str, matchers: Dict) -> bytes:
self.logger.warn(f"{game_code} Does not dispatch GET")
return None
async def render_GET(self, request: Request) -> bytes:
self.logger.warn(f"Game Does not dispatch GET")
return Response()
class TitleServlet:
title_registry: Dict[str, BaseServlet] = {}
@ -136,7 +147,7 @@ class TitleServlet:
self.logger.error(f"{folder} missing game_code or index in __init__.py, or is_game_enabled in index")
self.logger.info(
f"Serving {len(self.title_registry)} game codes {'on port ' + str(core_cfg.title.port) if core_cfg.title.port > 0 else ''}"
f"Serving {len(self.title_registry)} game codes {'on port ' + str(core_cfg.server.port) if core_cfg.server.port > 0 else ''}"
)
def render_GET(self, request: Request, endpoints: dict) -> bytes:
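Taken together, the reworked BaseServlet methods above define the interface a game servlet implements under Starlette: get_routes returns plain Route objects, and get_mucha_info now returns lists of game CDs and netID prefixes. A hypothetical minimal servlet built on that interface; the game code "SXXX", the endpoint, and the netID prefix are placeholders rather than a real game, and the import path for BaseServlet is assumed from the `from .title import TitleServlet` line in mucha.py:

```python
from typing import List, Tuple

from starlette.requests import Request
from starlette.responses import Response
from starlette.routing import Route

from core.config import CoreConfig
from core.title import BaseServlet  # assumed location, per core/mucha.py's import

class ExampleServlet(BaseServlet):
    async def handle_hello(self, request: Request) -> Response:
        # A game-specific endpoint; a real servlet would parse the request body here
        return Response("OK")

    def get_routes(self) -> List[Route]:
        # Routes are plain Starlette Route objects under the new scheme
        return [
            Route("/SXXX/{version:int}/hello", self.handle_hello, methods=["POST"]),
        ]

    def get_mucha_info(self, core_cfg: CoreConfig, cfg_dir: str) -> Tuple[bool, List[str], List[str]]:
        # Only needed for Mucha-based games: one netID prefix per game CD
        return (True, ["SXXX"], ["ABxN"])
```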

View File

@ -1,6 +1,6 @@
from typing import Dict, Any, Optional
from types import ModuleType
from twisted.web.http import Request
from starlette.requests import Request
import logging
import importlib
from os import walk
@ -34,33 +34,22 @@ class Utils:
@classmethod
def get_ip_addr(cls, req: Request) -> str:
return (
req.getAllHeaders()[b"x-forwarded-for"].decode()
if b"x-forwarded-for" in req.getAllHeaders()
else req.getClientAddress().host
)
ip = req.headers.get("x-forwarded-for", req.client.host)
return ip.split(", ")[0]
@classmethod
def get_title_port(cls, cfg: CoreConfig):
if cls.real_title_port is not None: return cls.real_title_port
if cfg.title.port == 0:
cls.real_title_port = cfg.allnet.port
else:
cls.real_title_port = cfg.title.port
cls.real_title_port = cfg.server.proxy_port if cfg.server.is_using_proxy and cfg.server.proxy_port else cfg.server.port
return cls.real_title_port
@classmethod
def get_title_port_ssl(cls, cfg: CoreConfig):
if cls.real_title_port_ssl is not None: return cls.real_title_port_ssl
if cfg.title.port_ssl == 0:
cls.real_title_port_ssl = 443
else:
cls.real_title_port_ssl = cfg.title.port_ssl
cls.real_title_port_ssl = cfg.server.proxy_port_ssl if cfg.server.is_using_proxy and cfg.server.proxy_port_ssl else 443
return cls.real_title_port_ssl

View File

@ -1,9 +1,12 @@
import yaml
#!/usr/bin/env python3
import argparse
import logging
from core.config import CoreConfig
from os import mkdir, path, access, W_OK
import yaml
import asyncio
from core.data import Data
from os import path, mkdir, access, W_OK
from core.config import CoreConfig
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Database utilities")
@ -16,19 +19,10 @@ if __name__ == "__main__":
type=str,
help="Version of the database to upgrade/rollback to",
)
parser.add_argument(
"--game",
"-g",
type=str,
help="Game code of the game who's schema will be updated/rolled back. Ex. SDFE",
)
parser.add_argument("--email", "-e", type=str, help="Email for the new user")
parser.add_argument("--old_ac", "-o", type=str, help="Access code to transfer from")
parser.add_argument("--new_ac", "-n", type=str, help="Access code to transfer to")
parser.add_argument("--force", "-f", type=bool, help="Force the action to happen")
parser.add_argument(
"action", type=str, help="DB Action, create, recreate, upgrade, or rollback"
)
parser.add_argument("--access_code", "-a", type=str, help="Access code for new/transfer user", default="00000000000000000000")
parser.add_argument("--message", "-m", type=str, help="Revision message")
parser.add_argument("action", type=str, help="create, upgrade, downgrade, create-owner, migrate, create-revision, create-autorevision")
args = parser.parse_args()
cfg = CoreConfig()
@ -48,44 +42,31 @@ if __name__ == "__main__":
data = Data(cfg)
loop = asyncio.get_event_loop()
if args.action == "create":
data.create_database()
elif args.action == "upgrade":
data.schema_upgrade(args.version)
elif args.action == "recreate":
data.recreate_database()
elif args.action == "upgrade" or args.action == "rollback":
if args.version is None:
data.logger.warning("No version set, upgrading to latest")
if args.game is None:
data.logger.warning("No game set, upgrading core schema")
data.migrate_database(
"CORE",
int(args.version) if args.version is not None else None,
args.action,
)
else:
data.migrate_database(
args.game,
int(args.version) if args.version is not None else None,
args.action,
)
elif args.action == "autoupgrade":
data.autoupgrade()
elif args.action == "downgrade":
if not args.version:
logging.getLogger("database").error(f"Version argument required for downgrade")
exit(1)
data.schema_downgrade(args.version)
elif args.action == "create-owner":
data.create_owner(args.email)
loop.run_until_complete(data.create_owner(args.email, args.access_code))
elif args.action == "migrate-card":
data.migrate_card(args.old_ac, args.new_ac, args.force)
elif args.action == "migrate":
loop.run_until_complete(data.migrate())
elif args.action == "cleanup":
data.delete_hanging_users()
elif args.action == "version":
data.show_versions()
elif args.action == "create-revision":
loop.run_until_complete(data.create_revision(args.message))
data.logger.info("Done")
elif args.action == "create-autorevision":
loop.run_until_complete(data.create_revision_auto(args.message))
else:
logging.getLogger("database").info(f"Unknown action {args.action}")

107
docs/INSTALL_LINUX.md Normal file
View File

@ -0,0 +1,107 @@
# Installing ARTEMiS on Linux
This guide assumes a fresh install of Debian 12 or Raspberry Pi OS. If you're using a different distribution, your package manager commands and package names may differ from what's listed below. Please check your distribution's package repository for the correct package names.
## Install prerequisites
### Python
Some installs may come with python already installed. You can verify this by trying the following commands:
- `python --version`
- `python3 --version`
- `python3.<minor version> --version` where `<minor version>` is a python 3 release (e.g. 11, 10)
If your python version is at least 3.7, you can move on to the next step.
### Libraries and other software
ARTEMiS depends on MySQL (or MariaDB) and memcached. As stated above, package names may vary by distribution, but this is generally what you should expect to install.
#### Raspberry Pi OS
`sudo apt install git mariadb-server python3-pip memcached libmemcached-dev`
#### Debian 12
`sudo apt install git mariadb-server python3-pip memcached libmemcached-dev default-libmysqlclient-dev pkg-config`
### Optional: Install proxy
If you intend to use a proxy (recommended for public-facing production setups), we recommend nginx.
`sudo apt install nginx`
## Database setup
### mysql_secure_installation
If you already have your database installed and configured, and are able to log in, skip down to the [Creating the database](#creating-the-database) section below. Otherwise, setup your newly installed database.
`sudo mysql_secure_installation`
When prompted for the current root password, leave it blank; do not switch to unix socket authentication; do set the root password to something secure; and answer yes to the rest of the prompts. You can then log into your database with `sudo mysql`.
### Creating the database
Once you're logged in, run the following commands, as root, to set up our database. Make sure you note down whatever you decide to make the password for the aime account, as you will need it to configure artemis.
```sql
CREATE USER 'aime'@'localhost' IDENTIFIED BY '<password>';
CREATE DATABASE aime;
GRANT Alter,Create,Delete,Drop,Index,Insert,References,Select,Update ON aime.* TO 'aime'@'localhost';
quit
```
We have now set up our new user, `aime`, created a database called `aime` and given our user all the permissions it needs on every table of that database.
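Optionally, you can quickly check that the new account works before moving on. This is just a normal client login using the password you chose above:
```shell
# Log in as the new aime user against the aime database (you will be prompted for the password)
mysql -u aime -p aime
```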
### Configure memcached
Under the file /etc/memcached.conf, please make sure the following parameters are set:
```
# Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
# Note that the daemon will grow to this size, but does not start out holding this much
# memory
-I 128m
-m 1024
```
**This is mandatory to avoid memcached overload caused by Crossbeats or by massive profiles.**
Restart memcached using `sudo systemctl restart memcached`.
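If you want to confirm the new limits took effect, the following is one way to check; the `grep` is only there to spot accidental duplicate `-I`/`-m` entries, which have caused problems in the past:
```shell
# Check that memcached restarted cleanly and that the size flags are not duplicated
sudo systemctl status memcached
grep -E '^-I|^-m' /etc/memcached.conf
```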
## Getting ARTEMiS
### Clone from gitea
Use `git clone https://gitea.tendokyu.moe/Hay1tsme/artemis.git` to pull down ARTEMiS into a folder called `artemis`, created in your current working directory. `cd` into `artemis`.
### Optional: Create a venv
Python venvs are a way to install and manage packages on a per-project basis and are recommended on systems that will have multiple python scripts running on them, to avoid dependency issues. If this server will be running ARTEMiS and ONLY ARTEMiS, then it is possible to get away without creating one. If you do want to create one, you will have to install an additional package:
`sudo apt install python3-venv` (like above, package name may vary depending on distro and python version)
Now, simply run `python -m venv .venv` (you may have to use `python3` or `python3.11` instead of `python`) to create your virtual environment in the folder `.venv`. In order to install packages and run scripts in this environment, you have to 'activate' it by running `source .venv/bin/activate`. Your terminal prompt should now show the venv name (e.g. `(.venv)`).
### Optional: Use the develop branch
By default, pulling down ARTEMiS from gitea will pull the `master` branch. This branch is updated less frequently, but is considered stable and ready for production use. If you'd rather have more updates, but a possibility for instability or bugs, you can switch to the develop branch by running `git checkout develop`. You can run `git checkout master` to switch back to stable.
## Install python libraries
Run `pip install -r requirements.txt` to install all of ARTEMiS' dependencies. If any installs fail, you may have missed a step in the [Install prerequisites](#install-prerequisites) section above. If you're absolutely sure you didn't, submit an issue on gitea.
## Configuration
### Copy example configs
From the `artemis` directory, run `cp -r example_config config` to copy the example configuration files to a new folder called `config`. All of the config changes you make will be done in the `config` folder.
### Optional: Generate AimeDB and Frontend JWT Secrets
AimeDB and the frontend utilize JSON Web Tokens (JWT) for card authentication and session cookies respectively. While generating a secret for AimeDB is optional, if you intend to run the frontend, a secret is required. You can generate a secret easily by running:
`openssl rand -base64 64`
With 64 being the number of bytes. You shouldn't need to go higher than 64, but you can if desired. **NOTE: When pasting secrets into the config file, make sure you remove any newlines!**
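For example, the following strips any newlines from the generated secret so it can be pasted straight into the config file:
```shell
# 64 random bytes, base64-encoded, with newlines removed
openssl rand -base64 64 | tr -d '\n'; echo
```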
### Edit `core.yaml`
Before editing `core.yaml`, you should familiarize yourself with the name and function of each of the config options. You can find a full list in [config.md](config.md).
Open `core.yaml` in the `config` folder in your preferred text editor. The only configuration option that is absolutely mandatory to change is `aimedb`->`key`. This key must be set for the server to start, and the key must be correct, otherwise you will not be able to process aimedb requests. The correct key is floating around online, and finding it is left as an exercise to the reader.
Another option that should be changed is `database`->`password` to be the password you set when you created your database user. You did write it down somewhere, right?
Since you are presumably not running the games on the same computer you're installing this server on, you're going to want to change `server`->`hostname` to whatever hostname or IP address other PCs can reach this server by. Note that some games reject IPs and require hostnames, so setting a hostname is always recommended over an IP.
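Putting the three changes above together, the relevant parts of `core.yaml` end up looking something like this (the values are placeholders, not real keys or passwords):
```yaml
server:
  hostname: "your.hostname.here"  # hostname or IP that other PCs can reach this server by

database:
  password: "<the password you set for the aime user>"

aimedb:
  key: "<the correct aimedb key>"  # not provided here
```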
### Edit game configs
Every game has its own yaml file with settings that you may want to tweak. `InitialD Zero` and `Pokken` both have `hostname` fields in their config files that you should edit, and some games support encryption, if supplied with proper keys.
### A note about IDZ
InitialD Zero is currently the only game where it is required to specify encryption information (the AES key and at least one RSA key) for the game to start. These keys are, like the aimedb key, floating around online and will not be provided. If you don't have the keys, and don't plan on anybody connecting to your server playing InitialD Zero, it's best to set `enabled` to `False` in idz.yaml to disable the game.
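A minimal sketch of disabling the game, assuming `idz.yaml` uses the same `server` block layout as the other example game configs in this repo (the exact key may be `enable` or `enabled` depending on your version; check your copy of `example_config/idz.yaml`):
```yaml
server:
  enable: False  # assumption: key name may differ in your example_config/idz.yaml
```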
## Create database tables
ARTEMiS uses alembic to manage database versioning. `dbutils.py` acts as a wrapper for alembic, and can execute some necessary database functions. To create the database tables, run `python dbutils.py create`. Confirm that there are no errors, and you're good to go. If you intend to use the frontend, you may also want to run `python dbutils.py create-owner -a <your 20 digit access code here>` to create a superuser account to log in with.
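The two commands together, in order (the access code is a placeholder; `create-owner` also accepts an email via `-e`, per `dbutils.py`):
```shell
python dbutils.py create
# Optional: create a frontend superuser; replace the access code with your own 20 digit code
python dbutils.py create-owner -e you@example.com -a <your 20 digit access code here>
```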
## Run ARTEMiS
Once you have everything configured properly, simply run `python index.py` to start ARTEMiS. Verify that clients can connect to all services (allnet, billing, aimedb, and game servers) and setup is complete.

View File

@ -1,129 +0,0 @@
# ARTEMiS - Ubuntu 20.04 LTS Guide
This step-by-step guide assumes that you are using a fresh install of Ubuntu 20.04 LTS, some of the steps can be skipped if you already have an installation with MySQL 5.7 or even some of the modules already present on your environment
# Setup
## Install memcached module
1. sudo apt-get install memcached
2. Under the file /etc/memcached.conf, please make sure the following parameters are set:
```
# Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
# Note that the daemon will grow to this size, but does not start out holding this much
# memory
-I 128m
-m 1024
```
** This is mandatory to avoid memcached overload caused by Crossbeats or by massive profiles
3. Restart memcached using: sudo systemctl restart memcached
## Install MySQL 5.7
```
sudo apt update
sudo apt install wget -y
wget https://dev.mysql.com/get/mysql-apt-config_0.8.12-1_all.deb
sudo dpkg -i mysql-apt-config_0.8.12-1_all.deb
```
1. During the first prompt, select Ubuntu Bionic
2. Select the default option
3. Select MySQL 5.7
4. Select the last option
```
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 467B942D3A79BD29
sudo apt-get update
sudo apt-cache policy mysql-server
sudo apt install -f mysql-client=5.7* mysql-community-server=5.7* mysql-server=5.7*
```
## Default Configuration for MySQL Server
1. sudo mysql_secure_installation
> Make sure to follow the steps that will be prompted such as changing the mysql root password and such
2. Test your MySQL Server login by doing the following command :
> mysql -u root -p
## Create the default ARTEMiS database and user
1. mysql -u root -p
2. Please change the password indicated in the next line for a custom secure one and continue with the next commands
```
CREATE USER 'aime'@'localhost' IDENTIFIED BY 'MyStrongPass.';
CREATE DATABASE aime;
GRANT Alter,Create,Delete,Drop,Index,Insert,References,Select,Update ON aime.* TO 'aime'@'localhost';
FLUSH PRIVILEGES;
exit;
```
3. sudo systemctl restart mysql
## Install Python modules
```
sudo apt-get install python3-dev default-libmysqlclient-dev build-essential mysql-client libmysqlclient-dev libmemcached-dev
sudo apt install libpython3.8-dev
sudo apt-get install python3-software-properties
sudo apt install python3-pip
sudo pip3 install --upgrade pip testresources
sudo pip3 install --upgrade pip setuptools
sudo apt-get install python3-tk
```
7. Change your work path to the ARTEMiS root folder using 'cd' and install the requirements:
> sudo python3 -m pip install -r requirements.txt
## Copy/Rename the folder example_config to config
## Adjust /config/core.yaml
1. Make sure to change the server listen_address to be set to your local machine IP (ex.: 192.168.1.xxx)
2. Adjust the proper MySQL information you created earlier
3. Add the AimeDB key at the bottom of the file
## Create the database tables for ARTEMiS
1. sudo python3 dbutils.py create
2. If you get "No module named Crypto", run the following command:
```
sudo pip uninstall crypto
sudo pip uninstall pycrypto
sudo pip install pycrypto
```
## Firewall Adjustements
```
sudo ufw allow 80
sudo ufw allow 443
sudo ufw allow 8443
sudo ufw allow 22345
sudo ufw allow 8090
sudo ufw allow 8444
sudo ufw allow 8080
```
## Running the ARTEMiS instance
1. sudo python3 index.py
# Troubleshooting
## Game does not connect to ARTEMiS Allnet server
1. Double-check your core.yaml, the listen_address is most likely either not binded to the proper IP or the port is not opened
## Game does not connect to Title Server
1. Verify that your core.yaml is setup properly for both the server listen_address and title hostname
2. Boot your game and verify that an AllNet response does show and if it does, attempt to open the URI that is shown under a browser such as Edge, Chrome & Firefox.
3. If a page is shown, the server is working properly and if it doesn't, double check your port forwarding and also that you have entered the proper local IP under the Title hostname in core.yaml.
## Unhandled command under AimeDB
1. Double check your AimeDB key under core.yaml, it is incorrect.
## Memcache failed, error 3
1. Make sure memcached is properly installed and running. You can check the status of the service using the following command:
> sudo systemctl status memcached
2. If it is failing, double check the /etc/memcached.conf file, it may have duplicated arguments like the -I and -m
3. If it is still not working afterward, you can proceed with a workaround by manually editing the /core/data/cache.py file.
```
# Make memcache optional
try:
has_mc = False
except ModuleNotFoundError:
has_mc = False
```

View File

@ -1,102 +1,77 @@
# ARTEMiS - Windows 10/11 Guide
This step-by-step guide assumes that you are using a fresh install of Windows 10/11 without MySQL installed, some of the steps can be skipped if you already have an installation with MySQL 8.0 or even some of the modules already present on your environment
# Installing ARTEMiS on Windows
This guide assumes a fresh install of Windows 10. Please be aware that due to the lack of memcached and the general woes of running a server on Windows, this is only recommended for local setups or small hosting-for-the-homies type servers.
# Setup
## Install Python Python 3.9 (recommended) or 3.10
1. Download Python 3.9 : [Link](https://www.python.org/ftp/python/3.9.13/python-3.9.13-amd64.exe)
2. Install python-3.9.13-amd64.exe
1. Select Customize installation
2. Make sure that pip, tcl/tk, and the for all users are checked and hit Next
3. Make sure that you enable "Create shortcuts for installed applications" and "Add Python to environment variables" and hit Install
## Install prerequisites
### Python
- Python versions from 3.8 to 3.11 work with ARTEMiS. We recommend 3.11.
- https://www.python.org/ftp/python/3.11.7/python-3.11.7-amd64.exe
- Install using whichever options best suit your environment, making sure that the Python executable is on path, such that you can open CMD, type `python --version` and see the version of Python you have installed.
- If you already have a working version of Python installed, skip this step.
## Install MySQL 8.0
1. Download MySQL 8.0 Server : [Link](https://dev.mysql.com/get/Downloads/MySQLInstaller/mysql-installer-community-8.0.34.0.msi)
2. Install mysql-installer-web-community-8.0.34.0.msi
1. Click on "Add ..." on the side
2. Click on the "+" next to MySQL Servers
3. Make sure MySQL Server 8.0.34 - X64 is under the products to be installed.
4. Hit Next and Next once installed
5. Select the configuration type "Development Computer"
6. Hit Next
7. Select "Use Legacy Authentication Method (Retain MySQL 5.x compatibility)" and hit Next
8. Enter a root password and then hit Next >
9. Leave everything under Windows Service as default and hit Next >
10. Click on Execute and for it to finish and hit Next> and then Finish
3. Open MySQL 8.0 Command Line Client and login as your root user
4. Change `<Enter Password Here>` to a new password for the user aime, type those commands to create your user and the database
### MariaDB
- It is always recommended to use MariaDB over MySQL because Oracle is a terrible company.
- While the latest release of v10 is recommended, as it is an LTS release, v11 should work fine.
- https://ftp.osuosl.org/pub/mariadb//mariadb-10.11.6/winx64-packages/mariadb-10.11.6-winx64.msi
- REMEMBER YOUR ROOT PASSWORD SO YOU CAN LOG IN IN FUTURE STEPS.
```sql
CREATE USER 'aime'@'localhost' IDENTIFIED BY '<Enter Password Here>';
### Git
- While technically optional, it is strongly recommended to obtain ARTEMiS via git clone instead of just downloading it.
- https://git-scm.com/download/win
- It is recommended to use Notepad++ as the default editor (if you have it installed), other than that, the default settings should be fine.
### Optional: GUI database viewer
- Having a GUI database editor is recommended but not required.
- MariaDB will try to install HeidiSQL, but we recommend DBeaver.
- https://dbeaver.io/download/
## Obtain ARTEMiS
### Via git (recommended)
- `git clone https://gitea.tendokyu.moe/Hay1tsme/artemis.git` via cmd in whatever folder you want to install ARTEMiS.
- You can switch to the develop branch for latest changes via `git checkout develop`.
### Via http download
- Download [here](https://gitea.tendokyu.moe/Hay1tsme/artemis/archive/master.zip).
- Develop branch can be found [here](https://gitea.tendokyu.moe/Hay1tsme/artemis/archive/develop.zip).
- Extract the zip file somewhere.
## Database setup
- Log into your server as root, either via GUI (recommended) or CMD
- Create the `aime` user, replace `<password>` with a password you choose. Remember it!
```
CREATE USER 'aime'@'localhost' IDENTIFIED BY '<password>';
CREATE DATABASE aime;
GRANT Alter,Create,Delete,Drop,Index,Insert,References,Select,Update ON aime.* TO 'aime'@'localhost';
FLUSH PRIVILEGES;
exit;
```
- If you create the database via a GUI, make sure you grant all the above permissions.
## Install Python modules
1. Change your work path to the artemis-master folder using 'cd' and install the requirements:
## Create a venv
- Python virtual environments are a good way to manage packages and make dealing with python and pip easier.
- `python -m venv venv`
- `venv\Scripts\activate.bat` to activate the venv whenever you need to interact with ARTEMiS.
- All the rest of the steps assume your venv is activated.
```shell
pip install -r requirements.txt
```
## Install pip modules
- `pip install -r requirements.txt`
## Copy/Rename the folder `example_config` to `config`
## Setup configuration
- Create a new `config` folder and copy the files in `example_config` over.
- edit `core.yaml`
- Put the password you created for the aime user into the `database` section.
- Put in the aimedb key (YOU DO NOT GENERATE THIS KEY, FIND IT SOMEWHERE).
- Set your hostname to be whatever hostname or IP address games can reach your server at (many games reject localhost and 127.0.0.1).
- Optional: generate base64-encoded secrets for aimedb and frontend using something like `openssl rand -base64 64`. It is advised to make all secrets different.
- See [config.md](docs/config.md) for a full list of options.
- edit `idz.yaml`
- If you don't plan on anyone using your server to play Initial D Zero, it is best to disable it to cut down on console spam on boot.
- Edit other game yamls
- Add keys, set hostnames, ports, etc. Specific settings will depend on the game. See [game_specific_info](docs/game_specific_info.md).
## Adjust `config/core.yaml`
## Create Database Tables
- `python dbutils.py create`
1. Make sure to change the server `hostname` to be set to your local machine IP (ex.: 192.168.xxx.xxx)
- In case you want to run this only locally, set the following values:
## Firewall
- If you're planning on serving games not running on your PC, open at least ports 80, 8443, and 22345 in Windows Firewall.
- Also set `listen_address` to either your local IP to serve on your LAN, or `0.0.0.0` for all interfaces, to accept connections from other places.
```yaml
server:
listen_address: 0.0.0.0
title:
hostname: 192.168.xxx.xxx
```
1. Adjust the proper MySQL information you created earlier
```yaml
database:
host: "localhost"
username: "aime"
password: "<Enter Password Here>"
name: "aime"
```
3. Add the AimeDB key at the bottom of the file
4. If the webui is needed, change the flag from False to True
## Create the database tables for ARTEMiS
```shell
python dbutils.py create
```
## Firewall Adjustements
Make sure the following ports are open both on your router and local Windows firewall in case you want to use this for public use (NOT recommended):
> Port 80 (TCP), 443 (TCP), 8443 (TCP), 22345 (TCP), 8080 (TCP), 8090 (TCP) **webui, 8444 (TCP) **mucha
## Running the ARTEMiS instance
```shell
python index.py
```
# Troubleshooting
## Game does not connect to ARTEMiS Allnet server
1. Double-check your core.yaml, the listen_address is most likely either not binded to the proper IP or the port is not opened
## Game does not connect to Title Server
1. Verify that your core.yaml is setup properly for both the server listen_address and title hostname
2. Boot your game and verify that an AllNet response does show and if it does, attempt to open the URI that is shown under a browser such as Edge, Chrome & Firefox.
3. If a page is shown, the server is working properly and if it doesn't, double check your port forwarding and also that you have entered the proper local IP under the Title hostname in core.yaml.
## Unhandled command under AimeDB
1. Double check your AimeDB key under core.yaml, it is incorrect.
## AttributeError: module 'collections' has no attribute 'Hashable'
1. This means the pyYAML module is obsolete, simply rerun pip with the -U (force update) flag, as shown below.
- Change your work path to the artemis-master (or artemis-develop) folder using 'cd' and run the following commands:
```shell
pip install -r requirements.txt -U
```
## Start ARTEMiS
- `python index.py`

View File

@ -1,23 +1,24 @@
# ARTEMiS Configuration
## Server
- `listen_address`: IP Address or hostname that the server will listen for connections on. Set to 127.0.0.1 for local only, or 0.0.0.0 for all interfaces. Default `127.0.0.1`
- `hostname`: Hostname that gets sent to clients to tell them where to connect. Games must be able to connect to your server via the hostname or IP you specify here. Note that most games will reject `localhost` or `127.0.0.1`. Default `localhost`
- `port`: Port that the server will listen for connections on. Default `80`
- `ssl_key`: Location of the ssl server key for the secure title server. Ignored if you don't use SSL. Default `cert/title.key`
- `ssl_cert`: Location of the ssl server certificate for the secure title server. Must not be a self-signed SSL. Ignored if you don't use SSL. Default `cert/title.pem`
- `allow_user_registration`: Allows users to register in-game via the AimeDB `register` function. Disable to be able to control who can use cards on your server. Default `True`
- `allow_unregistered_serials`: Allows games that do not have registered keychips to connect and authenticate. Disable to restrict who can connect to your server. Recommended to disable for production setups. Default `True`
- `name`: Name for the server, used by some games in their default MOTDs. Default `ARTEMiS`
- `is_develop`: Flags that the server is a development instance without a proxy standing in front of it. Setting to `False` tells the server not to listen for SSL, because the proxy should be handling all SSL-related things, among other things. Default `True`
- `threading`: Flags that `reactor.run` should be called via the `Thread` standard library. May provide a speed boost, but removes the ability to kill the server via `Ctrl + C`. Default: `False`
- `check_arcade_ip`: Checks IPs against the `arcade` table in the database, if one is defined. Default `False`
- `strict_ip_checking`: Rejects clients if there is no IP in the `arcade` table for the respective arcade
- `is_develop`: Flags that the server is a development instance, and enables some useful development features. Disable for production setups. Default `True`.
- `is_using_proxy`: Flags that you'll be using some other software, such as nginx, to proxy requests, and to send `proxy_port` or `proxy_port_ssl` to games instead of `port`. Default `False`
- `proxy_port`: Which port your front-facing proxy will be listening on. Ignored if `is_using_proxy` is `False` or if set to `0`. Default `0`
- `proxy_port_ssl`: Which port your front-facing proxy will be listening for ssl connections on. Ignored if `is_using_proxy` is `False` or if set to `0`. Default `0`
- `log_dir`: Directory to store logs. Server MUST have read and write permissions to this directory or you will have issues. Default `logs`
- `check_arcade_ip`: Checks IPs against the `arcade` table in the database, if one is defined. Default `False`
- `strict_ip_checking`: Rejects clients if there is no IP in the `arcade` table for the respective arcade. Default `False`
## Title
- `loglevel`: Logging level for the title server. Default `info`
- `hostname`: Hostname that gets sent to clients to tell them where to connect. Games must be able to connect to your server via the hostname or IP you spcify here. Note that most games will reject `localhost` or `127.0.0.1`. Default `localhost`
- `port`: Port that the title server will listen for connections on. Set to 0 to use the Allnet handler to reduce the port footprint. Default `8080`
- `port_ssl`: Port that the secure title server will listen for connections on. Set to 0 to use the Allnet handler to reduce the port footprint. Default `0`
- `ssl_key`: Location of the ssl server key for the secure title server. Ignored if `port_ssl` is set to `0` or `is_develop` set to `False`. Default `cert/title.key`
- `ssl_cert`: Location of the ssl server certificate for the secure title server. Must not be a self-signed SSL. Ignored if `port_ssl` is set to `0` or `is_develop` is set to `False`. Default `cert/title.pem`
- `reboot_start_time`: 24 hour JST time that clients will see as the start of maintenance period. Leave blank for no maintenance time. Default: ""
- `reboot_end_time`: 24 hour JST time that clients will see as the end of maintenance period. Leave blank for no maintenance time. Default: ""
- `reboot_start_time`: 24 hour JST time that clients will see as the start of the maintenance period, e.g. `04:00`. A few games or early versions will report errors if it is empty, e.g. maimai DX 1.00
- `reboot_end_time`: 24 hour JST time that clients will see as the end of the maintenance period, e.g. `07:00`. Some games require this to be set to `07:00`; please do not change it.
## Database
- `host`: Host of the database. Default `localhost`
- `username`: Username of the account the server should connect to the database with. Default `aime`
@ -25,24 +26,32 @@
- `name`: Name of the database the server should expect. Default `aime`
- `port`: Port the database server is listening on. Default `3306`
- `protocol`: Protocol used in the connection string, e.i `mysql` would result in `mysql://...`. Default `mysql`
- `sha2_password`: Weather or not the password in the connection string should be hashed via SHA2. Default `False`
- `loglevel`: Logging level for the database. Default `warn`
- `user_table_autoincrement_start`: What the `aime_user` table ID autoincrememnt should start with. Default `10000`
- `sha2_password`: Whether or not the password in the connection string should be hashed via SHA2. Default `False`
- `loglevel`: Logging level for the database. Default `info`
- `memcached_host`: Host of the memcached server. Default `localhost`
## Frontend
- `enable`: Weather or not the frontend should be enabled. Default `False`
- `port`: Port the frontend should listen for connections on. Default `8090`
- `enable`: Whether or not the frontend servlet should run. Frontend can still be run via `python -m uvicorn core.frontend:app` even if this is set to `False`. Default `False`
- `port`: Port the frontend should listen on. Default `8080`
- `loglevel`: Logging level for the frontend server. Default `info`
- `secret`: Base64-encoded JWT secret for session cookies, generated by you. Default `""`
## Allnet
- `standalone`: Whether allnet should launch its own servlet on its own port, or be part of the main servlet on the default port. Disable if you either have something proxying `naominet.jp` requests to port 80, or have port 80 set in `server` -> `port`
- `port`: Port the allnet server should listen for connections on if it's running standalone. Games are hardcoded to ask for port `80` so only change if you have a proxy redirecting properly. Ignored if `standalone` is `False`. Default `80`
- `loglevel`: Logging level for the allnet server. Default `info`
- `port`: Port the allnet server should listen for connections on. Games are hardcoded to ask for port `80` so only change if you have a proxy redirecting properly. Default `80`
- `allow_online_updates`: Allow allnet to distribute online updates via DownloadOrders. This system is currently non-functional, so leave it disabled. Default `False`
- `update_cfg_folder`: Folder where delivery INI files will be checked for. Ignored if `allow_online_updates` is `False`. Default `""`
## Billing
- `port`: Port the billing server should listen for connections on. Games are hardcoded to ask for port `8443` so only change if you have a proxy redirecting properly. Set to 0 to use the allnet handler to reduce the number of ports the server eats up. Default `8443`
- `ssl_key`: Location of the ssl server key for the billing server. Ignored if `port` is set to `0` or `is_develop` set to `False`. Default `cert/server.key`
- `ssl_cert`: Location of the ssl server certificate for the billing server. Must match the CA distributed to users or the billing server will not connect. Ignored if `port` is set to `0` or `is_develop` is set to `False`. Default `cert/server.pem`
- `standalone`: Whether the billing server should launch its own servlet on its own port, or be part of the main servlet on the default port. Setting this to `True` requires that you have `ssl_key` and `ssl_cert` set. Default `False`
- `loglevel`: Logging level for the billing server. Default `info`
- `port`: Port the billing server should listen for connections on. Games are hardcoded to ask for port `8443` so only change if you have a proxy redirecting properly. Ignored if `standalone` is `False`. Default `8443`
- `ssl_key`: Location of the ssl server key for the billing server. Ignored if `standalone` is `False`. Default `cert/server.key`
- `ssl_cert`: Location of the ssl server certificate for the billing server. Ignored if `standalone` is `False`. Must match the CA distributed to users or the billing server will not connect. Default `cert/server.pem`
- `signing_key`: Location of the RSA Private key used to sign billing requests. Must match the public key distributed to users or the billing server will not connect. Default `cert/billing.key`
## Aimedb
- `enable`: Whether or not aimedb should run. Default `True`
- `listen_address`: IP Address or hostname that the aimedb server will listen for connections on. Leave this blank to use the listen address under `server`. Default `""`
- `loglevel`: Logging level for the aimedb server. Default `info`
- `port`: Port the aimedb server should listen for connections on. Games are hardcoded to ask for port `22345` so only change if you have a proxy redirecting properly. Default `22345`
- `key`: Key to encrypt/decrypt aimedb requests and responses. MUST be set or the server will not start. If set incorrectly, your server will not properly handle aimedb requests. Default `""`
- `key`: Key to encrypt/decrypt aimedb requests and responses. MUST be set or the server will not start. If set incorrectly, your server will not properly handle aimedb requests. Default `""`
- `id_secret`: Base64-encoded JWT secret for Sega Auth IDs. Leaving this blank disables this feature. Default `""`
- `id_lifetime_seconds`: Number of seconds a generated JWT should be valid for. Default `86400` (1 day)

View File

@ -9,7 +9,15 @@ using the megaime database. Clean installations always create the latest databas
To upgrade the core database and the database for every game, execute:
```shell
python dbutils.py autoupgrade
python dbutils.py upgrade
```
If you are using the old master branch that was not set up with alembic, make sure to do the following steps in order:
- Pull down latest master/develop
- Update core.yaml
- Back up your existing database
```shell
python dbutils.py migrate
```
# Table of content
@ -22,7 +30,7 @@ python dbutils.py autoupgrade
- [Card Maker](#card-maker)
- [WACCA](#wacca)
- [Sword Art Online Arcade](#sao)
- [Initial D THE ARCADE](#initial-d-the-arcade)
- [Initial D THE ARCADE](#initial-d-the-arcade)
# Supported Games
@ -55,6 +63,7 @@ Games listed below have been tested and confirmed working.
| 12 | CHUNITHM NEW PLUS!! |
| 13 | CHUNITHM SUN |
| 14 | CHUNITHM SUN PLUS |
| 15 | CHUNITHM LUMINOUS |
### Importer
@ -79,13 +88,19 @@ Config file is located in `config/chuni.yaml`.
| `crypto` | This option is used to enable the TLS Encryption |
**If you would like to use network encryption, the following will be required underneath but key, iv and hash are required:**
If you would like to use network encryption, add the keys to the `keys` section under `crypto`, where the key
is the version ID for Japanese (SDHD) versions and `"{versionID}_int"` for Export (SDGS) versions, and the value
is an array containing `[key, iv, salt, iter_count]` in order.
`iter_count` is optional for all Japanese (SDHD) versions but may be required for some Export (SDGS) versions.
You will receive an error in the logs if it needs to be specified.
```yaml
crypto:
encrypted_only: False
keys:
13: ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000"]
"13_int": ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000", 42]
```
### Database upgrade
@ -93,7 +108,7 @@ crypto:
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDBT upgrade
python dbutils.py upgrade
```
### Online Battle
@ -167,6 +182,14 @@ Config file is located in `config/cxb.yaml`.
## maimai DX
### Presents
Presents are items given to the user when they log in, with a little animation (for example, the KOP song was given to the finalists as a present). To add a present, you must insert it into the `mai2_item_present` table. In that table, a NULL version means any version, a NULL user means any user, a NULL start date means always open, and a NULL end date means it never expires. A hedged SQL sketch follows the table below. Below is a list of presents one might wish to add:
| Game Version | Item ID | Item Kind | Item Description | Present Description |
|--------------|---------|-----------|-------------------------------------------------|------------------------------------------------|
| BUDDiES (21) | 409505 | Icon (3) | 旅行スタンプ(月面基地) (Travel Stamp - Moon Base) | Officially obtained on the webui with a serial |
| | | | | number, for project raputa |
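As a sketch only: the column names below are assumptions based on the description above (version, user, item kind, item ID, start/end dates), not confirmed schema, so check the actual `mai2_item_present` table definition before running anything like this.
```sql
-- Hypothetical insert: give the BUDDiES travel stamp icon (item 409505, kind 3) to every user,
-- on any version, with no expiry. Column names are assumptions; verify against your schema.
INSERT INTO mai2_item_present (version, user, item_kind, item_id, start_date, end_date)
VALUES (NULL, NULL, 3, 409505, NULL, NULL);
```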
### Versions
| Game Code | Version ID | Version Name |
@ -192,6 +215,7 @@ Config file is located in `config/cxb.yaml`.
| SDEZ | 18 | maimai DX UNiVERSE PLUS |
| SDEZ | 19 | maimai DX FESTiVAL |
| SDEZ | 20 | maimai DX FESTiVAL PLUS |
| SDEZ | 21 | maimai DX BUDDiES |
### Importer
@ -215,7 +239,7 @@ The importer for maimai Pre-DX will import Events and Music. Not all games will
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDEZ upgrade
python dbutils.py upgrade
```
Pre-Dx uses the same database as DX, so only upgrade using the SDEZ game code!
@ -245,10 +269,14 @@ the Shop, Modules and Customizations.
Config file is located in `config/diva.yaml`.
| Option | Info |
| -------------------- | ----------------------------------------------------------------------------------------------- |
| `unlock_all_modules` | Unlocks all modules (costumes) by default, if set to `False` all modules need to be purchased |
| `unlock_all_items` | Unlocks all items (customizations) by default, if set to `False` all items need to be purchased |
| Option | Info |
| -------------------- | ------------------------------------------------------------------------------------------------ |
| `festa_enable` | Enable or disable the ingame festa |
| `festa_add_VP` | Set the extra VP you get when clearing a song, if festa is not enabled no extra VP will be given |
| `festa_multiply_VP` | Multiplier for festa add VP |
| `festa_end_time` | Set the date time for when festa will end and not show up in game anymore |
| `unlock_all_modules` | Unlocks all modules (costumes) by default, if set to `False` all modules need to be purchased |
| `unlock_all_items` | Unlocks all items (customizations) by default, if set to `False` all items need to be purchased |
### Custom PV Lists (databanks)
@ -259,7 +287,7 @@ In order to use custom PV Lists, simply drop in your .dat files inside of /title
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SBZV upgrade
python dbutils.py upgrade
```
## O.N.G.E.K.I.
@ -315,7 +343,7 @@ crypto:
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDDT upgrade
python dbutils.py upgrade
```
### Controlling Events (Ranking Event, Technical Challenge Event, Mission Event)
@ -406,6 +434,7 @@ After that, on next login the present should be received (or whenever it suppose
* UNiVERSE PLUS: Yes
* FESTiVAL: Yes (added in A031)
* FESTiVAL PLUS: Yes (added in A035)
* BUDDiES: Yes (added in A039)
* O.N.G.E.K.I. bright MEMORY: Yes
@ -542,7 +571,7 @@ Config file is located in `config/wacca.yaml`.
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDFE upgrade
python dbutils.py upgrade
```
### VIP Rewards
@ -594,7 +623,7 @@ Below is a list of VIP rewards. Currently, VIP is not implemented, and thus thes
In order to use the importer locate your game installation folder and execute:
```shell
python read.py --game SDEW --version <version ID> --binfolder /path/to/game/extractedassets
python read.py --game SDEW --version 0 --binfolder /titles/sao/data/
```
The importer for SAO will import all items, heroes, support skills and titles data.
@ -615,21 +644,23 @@ Config file is located in `config/sao.yaml`.
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDEW upgrade
python dbutils.py upgrade
```
### Notes
- Defrag Match will crash at loading
- Co-Op Online is not supported
- Shop is not functionnal
- Shop is displayed but cannot purchase heroes or items
- Player title is currently static and cannot be changed in-game
- QR Card Scanning currently only load a static hero
**Network hashing in GssSite.dll must be disabled**
- Ex-quests progression not supported yet
- Daily Missions not implemented
- EX TOWER 1,2 & 3 are not yet supported
- Daily Yui coin not yet fixed
### Credits for SAO support:
- Midorica - Limited Network Support
- Midorica - Network Support
- Dniel97 - Helping with network base
- tungnotpunk - Source
@ -678,7 +709,7 @@ Config file is located in `config/idac.yaml`.
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDGT upgrade
python dbutils.py upgrade
```
### Notes

34
docs/migrating.md Normal file
View File

@ -0,0 +1,34 @@
# Migrating from an older build of ARTEMiS
If you haven't updated artemis in a while, you may find that configuration options have moved, been renamed, or no longer exist. This document exists to help migrate from legacy versions of artemis to newer builds.
## Dependencies
Make sure your dependencies are up to date with what's required to run artemis. A simple `pip install -r requirements.txt` will get you up to date.
## Database
Database migration is required if you are using a version of artemis that still uses the old custom-rolled database versioning system (raw SQL scripts). Artemis now uses alembic to manage database versioning, and you will need to move to this new system.
**BEFORE DOING ANY DATABASE WORK, ALWAYS MAKE SURE YOU HAVE FUNCTIONAL, UP-TO-DATE BACKUPS!!**
For almost all situations, simply running `python dbutils.py migrate` will do the job. This will upgrade you to the latest version of the old system, move you over to alembic, then upgrade you to the newest alembic version. If you encounter any errors or data loss, you should report this as a bug to our issue tracker.
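For example, assuming a MariaDB/MySQL setup like the one in the install guides, a minimal backup-then-migrate sequence might look like this (`mysqldump` is just one way to take a backup):
```shell
# Back up the aime database as root first, then run the migration
mysqldump -u root -p aime > aime_backup.sql
python dbutils.py migrate
```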
## Configuration
Configuration management is the sewage cleaning of the sysadmin world. It sucks and nobody likes to do it, but it needs to be done or everyone ends up in deep shit. This section will walk through what configuration options have changed, and how to set them properly.
### core.yaml
`title`->`hostname` is now `server`->`hostname`. This hostname is what gets sent to clients in response to auth requests, so it should be both accessible from wherever the client is, and point properly to the title server.
With the move to starlette and uvicorn, different services now run as separate ASGI applications. `billing`->`standalone` and `allnet`->`standalone` are flags that determine whether the service runs as a stand-alone service on its own separate port, or as part of the whole application. For example, setting `billing`->`standalone` to `True` will cause a separate instance of the billing server to spin up, listening on 8443 with SSL using the certs listed in the config file. Setting it to `False` will just allow the main server to also serve `/request/`, and assumes that something is standing in front of it proxying 8443 SSL to whatever `server`->`port` is set to.
Previously, if `server`->`is_develop` was `False`, the server assumed that there was a proxy standing in front of it, proxying requests to the proper channels. This was, in hindsight, a very dumb assumption. Now, `server`->`is_using_proxy` is what flags the server as having nginx or another proxy in front of it. The effects of setting this to `True` are somewhat game-dependent, but generally artemis will use the port listed in `server`->`proxy_port` (and `server`->`proxy_port_ssl` for SSL connections, as defined by the games) instead of `server`->`port`. If set to `0`, `server`->`proxy_port` will default to `server`->`port` (and `server`->`proxy_port_ssl` will default to 443), so make sure to set them accordingly. Note that some title servers have their own needs and specify their own specific ports. Refer to [game_specific_info.md](docs/game_specific_info.md) for more information. (For example, pokken requires SSL using an older, weaker certificate, and always requires the port to be sent, even if it's port 443.)
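As a concrete sketch, a proxied setup where nginx accepts games on 80/443 and forwards to artemis on 8080 would use values along these lines:
```yaml
server:
  listen_address: "127.0.0.1"
  port: 8080            # what artemis actually listens on
  is_using_proxy: True
  proxy_port: 80        # what games are told to connect to (plain)
  proxy_port_ssl: 443   # what games are told to connect to (SSL)
```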
`index.py`'s args have changed. You can now override what port the title server listens on with `-p` and tell the server to use ssl with `-s`.
Rather than having a `standalone` config variable, the frontend is a separate ASGI app entirely. Setting `enable` to `True` will launch it on the port specified in the config file. Otherwise, the frontend will not run.
`title`->`reboot_start_time`/`reboot_end_time` allow you to specify when the games should be told network maintenance is happening. Its exact implementation depends on the game. Do note that many games will behave unexpectedly if `reboot_end_time` is not `07:00`.
If you wish to make use of aimedb's SegaAuthId system to better protect the few title servers that actually use it, set `aimedb`->`id_secret` to base64-encoded random bytes (32 is a good length) using something like `openssl rand -base64 64`. If you intend to use the frontend, the same thing must be done for `frontend`->`secret` or you won't be able to log in.
`mucha`'s only option is now just log level.
`aimedb` now has its own `listen_address` field, in case you want to proxy everything but aimedb, so it can still listen on `0.0.0.0` instead of `127.0.0.1`.

View File

@ -1,41 +1,34 @@
# ARTEMiS Production mode
Production mode is a configuration option that changes how the server listens to be more friendly to a production environment. This mode assumes that a proxy (for this guide, nginx) is standing in front of the server to handle port mapping and TLS. In order to activate production mode, simply change `is_develop` to `False` in `core.yaml`. Next time you start the server, you should see "Starting server in production mode".
ARTEMiS is designed to run in one of two ways: development/local mode, which assumes you're just trying to set up something to save your scores and make the games work, and have patched your games to disable SSL, cert checks, encryption, and the like; and production mode. In production mode, artemis assumes you have a proxy server, such as nginx or apache, standing in front of artemis doing HTTPS and port management. This document will cover how to properly set up a production instance of ARTEMiS.
## ARTEMiS configuration
Step 1 is to edit your artemis configuration. Some recommended changes are listed below (a consolidated YAML sketch follows the list):
### `server`
- `listen_address` -> `127.0.0.1`
- `is_develop` -> `False`
- `is_using_proxy` -> `True`
- `port` -> The port nginx will send proxied requests to. If you're using the example config, set this to 8080.
- `proxy_port` -> The port your proxy will be accepting title server connections on. If you're using the example config, set this to 80.
- `proxy_port_ssl` -> The port your proxy will be accepting secure title server connections on. If you're using the example config, set this to 443.
- `allow_unregistered_serials` -> `False`
### `billing`
- `standalone` -> `False`
### `allnet`
- `standalone` -> `False`
### `frontend`
- `enable` -> `True` if you want the frontend
- `port` -> `8090` if you're using the default nginx config, otherwise whatever port your proxy will be sending requests to
### `aimedb`
- `listen_address` -> `0.0.0.0` unless you're proxying aimedb requests (not recommended at this time), in which case, leave this option unchanged
If you plan to serve artemis behind a VPN, these additional settings are also recommended:
- `check_arcade_ip` -> `True`
- `strict_ip_checking` -> `True`
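Consolidated, the recommendations above look roughly like this in `core.yaml` (only the touched keys are shown):
```yaml
server:
  listen_address: "127.0.0.1"
  is_develop: False
  is_using_proxy: True
  port: 8080
  proxy_port: 80
  proxy_port_ssl: 443
  allow_unregistered_serials: False
  check_arcade_ip: True       # optional, if serving behind a VPN
  strict_ip_checking: True    # optional, if serving behind a VPN
billing:
  standalone: False
allnet:
  standalone: False
frontend:
  enable: True   # only if you want the frontend
  port: 8090
aimedb:
  listen_address: "0.0.0.0"   # unless you're proxying aimedb
```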
## Nginx Configuration
### Port forwarding
Artemis requires that the following ports be forwarded to allow internet traffic to access the server. This will not change regardless of what you set in the config, as many of these ports are hard-coded in the games.
`tcp:80` all.net, non-ssl titles
`tcp:8443` billing
`tcp:22345` aimedb
`tcp:443` frontend, SSL titles
For most cases, the config in `example_config` will suffice. It makes the following assumptions:
- ARTEMiS is running on port 8080
- Billing is set to not be standalone
- You're not using cloudflare in front of your frontend
### A note about external proxy services (cloudflare, etc)
Due to the way that artemis functions, it is currently not possible to put the server behind something like Cloudflare. Cloudflare only proxies web traffic on the standard ports (80, 443) and, as shown above, this does not work with artemis. Server administrators should seek other means to protect their network (VPS hosting, VPN, etc)
### SSL Certificates
You will need to generate SSL certificates for some games. The certificates vary in security and validity requirements. Please see the general guide below
- General Title: The certificate for the general title server should be valid, not self-signed, and match the CN that the game will be reaching out to (i.e. if your games are reaching out to titles.hostname.here, your ssl certificate should be valid for titles.hostname.here, or *.hostname.here)
- CXB: Same requirements as the title server. It must not be self-signed, and the CN must match. Recommended to get a wildcard cert if possible, and use it for both Title and CXB
- Pokken: Pokken can be self-signed, and the CN doesn't have to match, but it MUST use 2048-bit RSA. Due to the game's age, anything stronger than that will be rejected. (One way to generate such a certificate is sketched below.)
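One way (an assumption, not the only way) to generate a self-signed 2048-bit RSA certificate that satisfies the Pokken requirement:
```shell
# Self-signed 2048-bit RSA cert valid for 10 years; the CN does not need to match
openssl req -x509 -newkey rsa:2048 -nodes -days 3650 \
  -keyout pokken.key -out pokken.pem -subj "/CN=pokken.hostname.here"
```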
### Port mappings
An example config is provided in the `config` folder called `nginx_example.conf`. It is set up for the following:
`naominet.jp:tcp:80` -> `localhost:tcp:8000` for allnet
`ib.naominet.jp:ssl:8443` -> `localhost:tcp:8444` for the billing server
`your.hostname.here:ssl:443` -> `localhost:tcp:8080` for the SSL title server
`your.hostname.here:tcp:80` -> `localhost:tcp:8080` for the non-SSL title server
`cxb.hostname.here:ssl:443` -> `localhost:tcp:8080` for crossbeats (appends /SDCA/104/ to the request)
`pokken.hostname.here:ssl:443` -> `localhost:tcp:8080` for pokken
`frontend.hostname.here:ssl:443` -> `localhost:tcp:8090` for the frontend, includes https redirection
If you're using this as a guide, be sure to replace your.hostname.here with the hostname you specified in core.yaml under `titles->hostname`. Do *not* change naominet.jp, or allnet/billing will fail. Also remember to specifiy certificate paths correctly, as in the example they are simply placeholders.
### Multi-service ports
It is possible to use nginx to redirect billing and title server requests to the same port that all.net uses. By setting `port` to 0 under billing and title server, you can change the nginx config to serve the following (entries not shown here should be the same)
`ib.naominet.jp:ssl:8443` -> `localhost:tcp:8000` for the billing server
`your.hostname.here:ssl:443` -> `localhost:tcp:8000` for the SSL title server
`your.hostname.here:tcp:80` -> `localhost:tcp:8000` for the non-SSL title server
`cxb.hostname.here:ssl:443` -> `localhost:tcp:8000` for crossbeats (appends /SDCA/104/ to the request)
`pokken.hostname.here:ssl:443` -> `localhost:tcp:8000` for pokken
This will allow you to only use 3 ports locally, but you will still need to forward the same internet-facing ports as before.
If this describes you, your only configuration needs are to edit the `server_name` and `certificate_*` directives. Otherwise, please see nginx configuration documentation to configure it to best suit your setup.

View File

@ -22,6 +22,9 @@ version:
14:
rom: 2.15.00
data: 2.15.00
15:
rom: 2.20.00
data: 2.20.00
crypto:
encrypted_only: False

View File

@ -1,25 +1,24 @@
server:
listen_address: "127.0.0.1"
listen_address: "127.0.0.1"
hostname: "localhost"
port: 80
ssl_key: "cert/title.key"
ssl_cert: "cert/title.crt"
allow_user_registration: True
allow_unregistered_serials: True
name: "ARTEMiS"
is_develop: True
is_using_proxy: False
threading: False
proxy_port: 0
proxy_port_ssl: 0
log_dir: "logs"
check_arcade_ip: False
strict_ip_checking: False
title:
loglevel: "info"
hostname: "localhost"
port: 8080
port_ssl: 0
ssl_cert: "cert/title.crt"
ssl_key: "cert/title.key"
reboot_start_time: "04:00"
reboot_end_time: "05:00"
reboot_end_time: "07:00" # this must be set to 7:00 am for some game, please do not change it
database:
host: "localhost"
@ -29,30 +28,34 @@ database:
port: 3306
protocol: "mysql"
sha2_password: False
loglevel: "warn"
user_table_autoincrement_start: 10000
loglevel: "info"
enable_memcached: True
memcached_host: "localhost"
frontend:
enable: False
port: 8090
port: 8080
loglevel: "info"
secret: ""
allnet:
loglevel: "info"
standalone: False
port: 80
ip_check: False
loglevel: "info"
allow_online_updates: False
update_cfg_folder: ""
billing:
standalone: True
loglevel: "info"
port: 8443
ssl_key: "cert/server.key"
ssl_cert: "cert/server.pem"
signing_key: "cert/billing.key"
aimedb:
enable: True
listen_address: ""
loglevel: "info"
port: 22345
key: ""
@ -60,6 +63,4 @@ aimedb:
id_lifetime_seconds: 86400
mucha:
enable: False
hostname: "localhost"
loglevel: "info"

View File

@ -1,3 +1,4 @@
server:
enable: True
loglevel: "info"
loglevel: "info"
use_https: True

View File

@ -1,6 +1,10 @@
server:
enable: True
loglevel: "info"
festa_enable: True
festa_add_VP: "20,5"
festa_multiply_VP: "1,2"
festa_end_time: "2029-01-01 00:00:00.0"
mods:
unlock_all_modules: True

View File

@ -12,3 +12,6 @@ uploads:
photos_dir: ""
movies: False
movies_dir: ""
crypto:
encrypted_only: False

View File

@ -6,7 +6,7 @@ server {
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://localhost:8000/;
proxy_pass http://127.0.0.1:8080/;
}
}
@ -18,7 +18,7 @@ server {
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://localhost:8080/;
proxy_pass http://127.0.0.1:8080/;
}
}
@ -38,11 +38,13 @@ server {
ssl_prefer_server_ciphers off;
location / {
proxy_pass http://localhost:8080/;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://127.0.0.1:8080/;
}
}
# Billing
# Billing, comment this out if running billing standalone
server {
listen 8443 ssl;
server_name ib.naominet.jp;
@ -57,30 +59,10 @@ server {
ssl_ciphers "ALL:@SECLEVEL=0";
ssl_prefer_server_ciphers off;
location / {
proxy_pass http://localhost:8444/;
}
}
# Pokken, comment this out if you don't plan on serving pokken.
server {
listen 443 ssl;
server_name pokken.hostname.here;
ssl_certificate /path/to/cert/pokken.pem;
ssl_certificate_key /path/to/cert/pokken.key;
ssl_session_timeout 1d;
ssl_session_cache shared:MozSSL:10m;
ssl_session_tickets off;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2 TLSv1.3;
ssl_ciphers "ALL:@SECLEVEL=0";
ssl_prefer_server_ciphers off;
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://localhost:8080/;
proxy_pass http://127.0.0.1:8080/;
}
}
@ -91,12 +73,12 @@ server {
location / {
return 301 https://$host$request_uri;
# If you don't want https redirection, comment the line above and uncomment the line below
# proxy_pass http://localhost:8090/;
# If you don't want https redirection, or are using something like cloudflare to manage HTTPS, comment out the line above and uncomment the line below
# proxy_pass http://127.0.0.1:8090/;
}
}
# Frontend HTTPS. Comment out if you on't intend to use the frontend
# Frontend HTTPS. Comment out if you on't intend to use the frontend, or have cloudflare or something managing https for you.
server {
listen 443 ssl;
server_name frontend.hostname.here;
@ -118,6 +100,6 @@ server {
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://localhost:8090/;
proxy_pass http://127.0.0.1:8090/;
}
}

View File

@ -2,12 +2,19 @@ server:
enable: True
loglevel: "info"
auto_register: True
photon_app_id: "7df3a2f6-d69d-4073-aafe-810ee61e1cea"
data_version: 1
game_version: 33
crypt:
enable: False
key: ""
iv: ""
hash:
verify_hash: False
hash_base: ""
hash_base: ""
card:
enable: True
crypt_password: ""
crypt_salt: ""

400
index.py
View File

@ -1,335 +1,117 @@
#!/usr/bin/env python3
import argparse
import logging, coloredlogs
from logging.handlers import TimedRotatingFileHandler
from typing import Dict
import yaml
from os import path, mkdir, access, W_OK
from core import *
from os import path, environ
import uvicorn
import logging
import asyncio
from twisted.web import server, resource
from twisted.internet import reactor, endpoints
from twisted.web.http import Request
from routes import Mapper
from threading import Thread
from core import CoreConfig, AimedbServlette
class HttpDispatcher(resource.Resource):
def __init__(self, cfg: CoreConfig, config_dir: str):
super().__init__()
self.config = cfg
self.isLeaf = True
self.map_get = Mapper()
self.map_post = Mapper()
self.logger = logging.getLogger("core")
self.title = TitleServlet(cfg, config_dir)
self.allnet = AllnetServlet(cfg, config_dir)
self.mucha = MuchaServlet(cfg, config_dir)
self.map_get.connect(
"allnet_downloadorder_ini",
"/dl/ini/{file}",
controller="allnet",
action="handle_dlorder_ini",
conditions=dict(method=["GET"]),
async def launch_main(cfg: CoreConfig, ssl: bool) -> None:
if ssl:
server_cfg = uvicorn.Config(
"core.app:app",
host=cfg.server.listen_address,
port=cfg.server.port if args.port == 0 else args.port,
reload=cfg.server.is_develop,
log_level="info" if cfg.server.is_develop else "critical",
ssl_version=3,
ssl_certfile=cfg.server.ssl_cert,
ssl_keyfile=cfg.server.ssl_key
)
else:
server_cfg = uvicorn.Config(
"core.app:app",
host=cfg.server.listen_address,
port=cfg.server.port if args.port == 0 else args.port,
reload=cfg.server.is_develop,
log_level="info" if cfg.server.is_develop else "critical"
)
server = uvicorn.Server(server_cfg)
await server.serve()
self.map_post.connect(
"allnet_downloadorder_report",
"/report-api/Report",
controller="allnet",
action="handle_dlorder_report",
conditions=dict(method=["POST"]),
)
async def launch_billing(cfg: CoreConfig) -> None:
server_cfg = uvicorn.Config(
"core.allnet:app_billing",
host=cfg.server.listen_address,
port=cfg.billing.port,
reload=cfg.server.is_develop,
log_level="info" if cfg.server.is_develop else "critical",
ssl_version=3,
ssl_certfile=cfg.billing.ssl_cert,
ssl_keyfile=cfg.billing.ssl_key,
ssl_ciphers="DEFAULT:!aNULL:!eNULL:!MD5:!3DES:!DES:!RC4:!IDEA:!SEED:!aDSS:!SRP:!PSK",
)
server = uvicorn.Server(server_cfg)
await server.serve()
self.map_get.connect(
"allnet_ping",
"/naomitest.html",
controller="allnet",
action="handle_naomitest",
conditions=dict(method=["GET"]),
)
self.map_post.connect(
"allnet_poweron",
"/sys/servlet/PowerOn",
controller="allnet",
action="handle_poweron",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"allnet_downloadorder",
"/sys/servlet/DownloadOrder",
controller="allnet",
action="handle_dlorder",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"allnet_loaderstaterecorder",
"/sys/servlet/LoaderStateRecorder",
controller="allnet",
action="handle_loaderstaterecorder",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"allnet_alive",
"/sys/servlet/Alive",
controller="allnet",
action="handle_alive",
conditions=dict(method=["POST"]),
)
self.map_get.connect(
"allnet_alive",
"/sys/servlet/Alive",
controller="allnet",
action="handle_alive",
conditions=dict(method=["GET"]),
)
self.map_post.connect(
"allnet_billing",
"/request",
controller="allnet",
action="handle_billing_request",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"allnet_billing",
"/request/",
controller="allnet",
action="handle_billing_request",
conditions=dict(method=["POST"]),
)
async def launch_frontend(cfg: CoreConfig) -> None:
server_cfg = uvicorn.Config(
"core.frontend:app",
host=cfg.server.listen_address,
port=cfg.frontend.port,
reload=cfg.server.is_develop,
log_level="info" if cfg.server.is_develop else "critical",
)
server = uvicorn.Server(server_cfg)
await server.serve()
# Maintain compatability
self.map_post.connect(
"mucha_boardauth",
"/mucha/boardauth.do",
controller="mucha",
action="handle_boardauth",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"mucha_updatacheck",
"/mucha/updatacheck.do",
controller="mucha",
action="handle_updatecheck",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"mucha_dlstate",
"/mucha/downloadstate.do",
controller="mucha",
action="handle_dlstate",
conditions=dict(method=["POST"]),
)
async def launch_allnet(cfg: CoreConfig) -> None:
server_cfg = uvicorn.Config(
"core.allnet:app_allnet",
host=cfg.server.listen_address,
port=cfg.allnet.port,
reload=cfg.server.is_develop,
log_level="info" if cfg.server.is_develop else "critical",
)
server = uvicorn.Server(server_cfg)
await server.serve()
self.map_post.connect(
"mucha_boardauth",
"/mucha_front/boardauth.do",
controller="mucha",
action="handle_boardauth",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"mucha_updatacheck",
"/mucha_front/updatacheck.do",
controller="mucha",
action="handle_updatecheck",
conditions=dict(method=["POST"]),
)
self.map_post.connect(
"mucha_dlstate",
"/mucha_front/downloadstate.do",
controller="mucha",
action="handle_dlstate",
conditions=dict(method=["POST"]),
)
for code, game in self.title.title_registry.items():
get_matchers, post_matchers = game.get_endpoint_matchers()
for m in get_matchers:
self.map_get.connect(
"title_get",
m[1],
controller="title",
action="render_GET",
title=code,
subaction=m[0],
conditions=dict(method=["GET"]),
requirements=m[2],
)
for m in post_matchers:
self.map_post.connect(
"title_post",
m[1],
controller="title",
action="render_POST",
title=code,
subaction=m[0],
conditions=dict(method=["POST"]),
requirements=m[2],
)
def render_GET(self, request: Request) -> bytes:
test = self.map_get.match(request.uri.decode())
client_ip = Utils.get_ip_addr(request)
if test is None:
self.logger.debug(
f"Unknown GET endpoint {request.uri.decode()} from {client_ip} to port {request.getHost().port}"
)
request.setResponseCode(404)
return b"Endpoint not found."
return self.dispatch(test, request)
def render_POST(self, request: Request) -> bytes:
test = self.map_post.match(request.uri.decode())
client_ip = Utils.get_ip_addr(request)
if test is None:
self.logger.debug(
f"Unknown POST endpoint {request.uri.decode()} from {client_ip} to port {request.getHost().port}"
)
request.setResponseCode(404)
return b"Endpoint not found."
return self.dispatch(test, request)
def dispatch(self, matcher: Dict, request: Request) -> bytes:
controller = getattr(self, matcher["controller"], None)
if controller is None:
self.logger.error(
f"Controller {matcher['controller']} not found via endpoint {request.uri.decode()}"
)
request.setResponseCode(404)
return b"Endpoint not found."
handler = getattr(controller, matcher["action"], None)
if handler is None:
self.logger.error(
f"Action {matcher['action']} not found in controller {matcher['controller']} via endpoint {request.uri.decode()}"
)
request.setResponseCode(404)
return b"Endpoint not found."
url_vars = matcher
url_vars.pop("controller")
url_vars.pop("action")
ret = handler(request, url_vars)
if type(ret) == str:
return ret.encode()
elif type(ret) == bytes or type(ret) == tuple: # allow for bytes or tuple (data, response code) responses
return ret
elif ret is None:
self.logger.warning(f"None returned by controller for {request.uri.decode()} endpoint")
return b""
else:
self.logger.warning(f"Unknown data type returned by controller for {request.uri.decode()} endpoint")
return b""
async def launcher(cfg: CoreConfig, ssl: bool) -> None:
task_list = [asyncio.create_task(launch_main(cfg, ssl))]
if cfg.billing.standalone:
task_list.append(asyncio.create_task(launch_billing(cfg)))
if cfg.frontend.enable:
task_list.append(asyncio.create_task(launch_frontend(cfg)))
if cfg.allnet.standalone:
task_list.append(asyncio.create_task(launch_allnet(cfg)))
if cfg.aimedb.enable:
AimedbServlette(cfg).start()
done, pending = await asyncio.wait(
task_list,
return_when=asyncio.FIRST_COMPLETED,
)
logging.getLogger("core").info("Shutdown")
for pending_task in pending:
pending_task.cancel("Another service died, server is shutting down")
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="ARTEMiS main entry point")
parser = argparse.ArgumentParser(description="Artemis main entry point")
parser.add_argument(
"--config", "-c", type=str, default="config", help="Configuration folder"
)
parser.add_argument(
"--port", "-p", type=int, default=0, help="Port override"
)
parser.add_argument(
"--ssl", "-s", type=bool, help="Launch with SSL"
)
args = parser.parse_args()
if not path.exists(f"{args.config}/core.yaml"):
print(
f"The config folder you specified ({args.config}) does not exist or does not contain core.yaml.\nDid you copy the example folder?"
f"The config folder you specified ({args.config}) does not exist or does not contain core.yaml. Defaults will be used.\nDid you copy the example folder?"
)
exit(1)
cfg: CoreConfig = CoreConfig()
if path.exists(f"{args.config}/core.yaml"):
cfg.update(yaml.safe_load(open(f"{args.config}/core.yaml")))
if not path.exists(cfg.server.log_dir):
mkdir(cfg.server.log_dir)
environ["ARTEMIS_CFG_DIR"] = args.config
if not access(cfg.server.log_dir, W_OK):
print(
f"Log directory {cfg.server.log_dir} NOT writable, please check permissions"
)
exit(1)
logger = logging.getLogger("core")
log_fmt_str = "[%(asctime)s] Core | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(cfg.server.log_dir, "core"), when="d", backupCount=10
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
logger.addHandler(fileHandler)
logger.addHandler(consoleHandler)
log_lv = logging.DEBUG if cfg.server.is_develop else logging.INFO
logger.setLevel(log_lv)
coloredlogs.install(level=log_lv, logger=logger, fmt=log_fmt_str)
if not cfg.aimedb.key:
logger.error("!!AIMEDB KEY BLANK, SET KEY IN CORE.YAML!!")
exit(1)
logger.info(
f"ARTEMiS starting in {'develop' if cfg.server.is_develop else 'production'} mode"
)
allnet_server_str = f"tcp:{cfg.allnet.port}:interface={cfg.server.listen_address}"
title_server_str = f"tcp:{cfg.title.port}:interface={cfg.server.listen_address}"
title_https_server_str = f"ssl:{cfg.title.port_ssl}:interface={cfg.server.listen_address}:privateKey={cfg.title.ssl_key}:certKey={cfg.title.ssl_cert}"
adb_server_str = f"tcp:{cfg.aimedb.port}:interface={cfg.server.listen_address}"
frontend_server_str = (
f"tcp:{cfg.frontend.port}:interface={cfg.server.listen_address}"
)
billing_server_str = f"tcp:{cfg.billing.port}:interface={cfg.server.listen_address}"
if cfg.server.is_develop:
billing_server_str = (
f"ssl:{cfg.billing.port}:interface={cfg.server.listen_address}"
f":privateKey={cfg.billing.ssl_key}:certKey={cfg.billing.ssl_cert}"
)
dispatcher = HttpDispatcher(cfg, args.config)
endpoints.serverFromString(reactor, allnet_server_str).listen(
server.Site(dispatcher)
)
endpoints.serverFromString(reactor, adb_server_str).listen(AimedbFactory(cfg))
if cfg.frontend.enable:
endpoints.serverFromString(reactor, frontend_server_str).listen(
server.Site(FrontendServlet(cfg, args.config))
)
if cfg.billing.port > 0:
endpoints.serverFromString(reactor, billing_server_str).listen(
server.Site(dispatcher)
)
if cfg.title.port > 0:
endpoints.serverFromString(reactor, title_server_str).listen(
server.Site(dispatcher)
)
if cfg.title.port_ssl > 0:
endpoints.serverFromString(reactor, title_https_server_str).listen(
server.Site(dispatcher)
)
if cfg.server.threading:
Thread(target=reactor.run, args=(False,)).start()
else:
reactor.run()
asyncio.run(launcher(cfg, args.ssl))
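For reference, the concurrent-service pattern used by the new entry point above (one `uvicorn.Server` per ASGI app, all awaited together and torn down as soon as any of them exits) can be reduced to a minimal standalone sketch. The module paths and ports below are placeholders, not ARTEMiS values:

```python
import asyncio
import uvicorn

async def serve(app_path: str, port: int) -> None:
    # One uvicorn server per ASGI app, mirroring launch_main/launch_billing above.
    config = uvicorn.Config(app_path, host="127.0.0.1", port=port, log_level="info")
    await uvicorn.Server(config).serve()

async def launcher() -> None:
    tasks = [
        asyncio.create_task(serve("example.app:main_app", 8080)),     # placeholder app path
        asyncio.create_task(serve("example.app:billing_app", 8443)),  # placeholder app path
    ]
    # Stop everything as soon as any one service exits, as the launcher above does.
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()

if __name__ == "__main__":
    asyncio.run(launcher())
```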

read.py

@ -1,4 +1,4 @@
# vim: set fileencoding=utf-8
#!/usr/bin/env python3
import argparse
import re
import os
@ -6,6 +6,7 @@ import yaml
from os import path
import logging
import coloredlogs
import asyncio
from logging.handlers import TimedRotatingFileHandler
from typing import List, Optional
@ -38,6 +39,9 @@ class BaseReader:
ret.append(f"{root}/{dir}")
return ret
async def read(self) -> None:
pass
if __name__ == "__main__":
@ -136,6 +140,8 @@ if __name__ == "__main__":
for dir, mod in titles.items():
if args.game in mod.game_codes:
handler = mod.reader(config, args.version, bin_arg, opt_arg, args.extra)
handler.read()
loop = asyncio.get_event_loop()
loop.run_until_complete(handler.read())
logger.info("Done")
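A side note on the change above: `asyncio.get_event_loop().run_until_complete(...)` still works, but newer Python versions may warn when no event loop is running yet. A minimal equivalent, assuming the same `handler.read()` coroutine, would be:

```python
import asyncio

# asyncio.run() creates the event loop, runs the coroutine, and closes the loop.
# `handler` is the game-specific reader instantiated a few lines earlier.
asyncio.run(handler.read())
```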


@ -4,37 +4,69 @@ A network service emulator for games running SEGA'S ALL.NET service, and similar
# Supported games
Games listed below have been tested and confirmed working. Only game versions older than the version currently active in arcades, or game versions that have not received a major update in over one year, are supported.
+ CHUNITHM
+ All versions up to SUN PLUS
+ crossbeats REV.
+ All versions + omnimix
+ maimai DX
+ All versions up to FESTiVAL PLUS
+ Hatsune Miku: Project DIVA Arcade
+ All versions
+ Card Maker
+ 1.30
+ 1.35
+ O.N.G.E.K.I.
+ All versions up to bright MEMORY
+ CHUNITHM INTL
+ SUPERSTAR
+ SUPERSTAR PLUS
+ NEW
+ NEW PLUS
+ SUN
+ SUN PLUS
+ WACCA
+ Lily R
+ Reverse
+ CHUNITHM JP
+ AIR
+ AIR PLUS
+ AMAZON
+ AMAZON PLUS
+ CRYSTAL
+ CRYSTAL PLUS
+ PARADISE
+ PARADISE LOST
+ NEW
+ NEW PLUS
+ SUN
+ SUN PLUS
+ crossbeats REV.
+ Crossbeats REV.
+ Crossbeats REV. SUNRiSE S1
+ Crossbeats REV. SUNRiSE S2 + omnimix
+ Hatsune Miku: Project DIVA Arcade
+ Future Tone Arcade - All versions
+ Initial D THE ARCADE
+ Season 2
+ maimai DX
+ Splash
+ Splash Plus
+ UNiVERSE
+ UNiVERSE PLUS
+ FESTiVAL
+ FESTiVAL PLUS
+ BUDDiES
+ O.N.G.E.K.I.
+ SUMMER
+ SUMMER PLUS
+ R.E.D.
+ R.E.D. PLUS
+ bright
+ bright MEMORY
+ POKKÉN TOURNAMENT
+ Final Online
+ Sword Art Online Arcade (partial support)
+ Final
+ Sword Art Online Arcade
+ Final (Single player only)
+ Initial D THE ARCADE
+ Season 2
+ WACCA
+ Lily R
+ Reverse
## Requirements
- python 3 (tested working with 3.9 and 3.10, other versions YMMV)
@ -43,7 +75,7 @@ Games listed below have been tested and confirmed working. Only game versions ol
- mysql/mariadb server
## Setup guides
Follow the platform-specific guides for [windows](docs/INSTALL_WINDOWS.md), [ubuntu](docs/INSTALL_UBUNTU.md) or [docker](docs/INSTALL_DOCKER.md) to setup and run the server.
Follow the platform-specific guides for [windows](docs/INSTALL_WINDOWS.md), [linux (Debian 12 or Raspberry Pi OS recommended, but anything works)](docs/INSTALL_LINUX.md) or [docker](docs/INSTALL_DOCKER.md) to set up and run the server.
## Game specific information
Read [Game specific info](docs/game_specific_info.md) for all supported games: importer settings, configuration options, and database upgrades.


@ -1,10 +1,11 @@
from titles.chuni.index import ChuniServlet
from titles.chuni.const import ChuniConstants
from titles.chuni.database import ChuniData
from titles.chuni.read import ChuniReader
from .index import ChuniServlet
from .const import ChuniConstants
from .database import ChuniData
from .read import ChuniReader
from .frontend import ChuniFrontend
index = ChuniServlet
database = ChuniData
reader = ChuniReader
frontend = ChuniFrontend
game_codes = [ChuniConstants.GAME_CODE, ChuniConstants.GAME_CODE_NEW, ChuniConstants.GAME_CODE_INT]
current_schema_version = 5


@ -11,7 +11,7 @@ class ChuniAir(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_AIR
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.10.00"
return ret


@ -11,7 +11,7 @@ class ChuniAirPlus(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_AIR_PLUS
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.15.00"
return ret


@ -13,7 +13,7 @@ class ChuniAmazon(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_AMAZON
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.30.00"
return ret


@ -13,7 +13,7 @@ class ChuniAmazonPlus(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_AMAZON_PLUS
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.35.00"
return ret


@ -22,7 +22,7 @@ class ChuniBase:
self.game = ChuniConstants.GAME_CODE
self.version = ChuniConstants.VER_CHUNITHM
def handle_game_login_api_request(self, data: Dict) -> Dict:
async def handle_game_login_api_request(self, data: Dict) -> Dict:
"""
Handles the login bonus logic, required for the game because
getUserLoginBonus gets called after getUserItem and therefore the
@ -38,20 +38,20 @@ class ChuniBase:
return {"returnCode": 1}
user_id = data["userId"]
login_bonus_presets = self.data.static.get_login_bonus_presets(self.version)
login_bonus_presets = await self.data.static.get_login_bonus_presets(self.version)
for preset in login_bonus_presets:
# check if a user already has some progress and if not add the
# login bonus entry
user_login_bonus = self.data.item.get_login_bonus(
user_login_bonus = await self.data.item.get_login_bonus(
user_id, self.version, preset["presetId"]
)
if user_login_bonus is None:
self.data.item.put_login_bonus(
await self.data.item.put_login_bonus(
user_id, self.version, preset["presetId"]
)
# yeah i'm lazy
user_login_bonus = self.data.item.get_login_bonus(
user_login_bonus = await self.data.item.get_login_bonus(
user_id, self.version, preset["presetId"]
)
@ -67,7 +67,7 @@ class ChuniBase:
bonus_count = user_login_bonus["bonusCount"] + 1
last_update_date = datetime.now()
all_login_boni = self.data.static.get_login_bonus(
all_login_boni = await self.data.static.get_login_bonus(
self.version, preset["presetId"]
)
@ -91,13 +91,13 @@ class ChuniBase:
is_finished = True
# grab the item for the corresponding day
login_item = self.data.static.get_login_bonus_by_required_days(
login_item = await self.data.static.get_login_bonus_by_required_days(
self.version, preset["presetId"], bonus_count
)
if login_item is not None:
# now add the present to the database so the
# handle_get_user_item_api_request can grab them
self.data.item.put_item(
await self.data.item.put_item(
user_id,
{
"itemId": login_item["presentId"],
@ -107,7 +107,7 @@ class ChuniBase:
},
)
self.data.item.put_login_bonus(
await self.data.item.put_login_bonus(
user_id,
self.version,
preset["presetId"],
@ -119,12 +119,12 @@ class ChuniBase:
return {"returnCode": 1}
def handle_game_logout_api_request(self, data: Dict) -> Dict:
async def handle_game_logout_api_request(self, data: Dict) -> Dict:
# self.data.base.log_event("chuni", "logout", logging.INFO, {"version": self.version, "user": data["userId"]})
return {"returnCode": 1}
def handle_get_game_charge_api_request(self, data: Dict) -> Dict:
game_charge_list = self.data.static.get_enabled_charges(self.version)
async def handle_get_game_charge_api_request(self, data: Dict) -> Dict:
game_charge_list = await self.data.static.get_enabled_charges(self.version)
if game_charge_list is None or len(game_charge_list) == 0:
return {"length": 0, "gameChargeList": []}
@ -145,8 +145,8 @@ class ChuniBase:
)
return {"length": len(charges), "gameChargeList": charges}
def handle_get_game_event_api_request(self, data: Dict) -> Dict:
game_events = self.data.static.get_enabled_events(self.version)
async def handle_get_game_event_api_request(self, data: Dict) -> Dict:
game_events = await self.data.static.get_enabled_events(self.version)
if game_events is None or len(game_events) == 0:
self.logger.warning("No enabled events, did you run the reader?")
@ -177,10 +177,10 @@ class ChuniBase:
"gameEventList": event_list,
}
def handle_get_game_idlist_api_request(self, data: Dict) -> Dict:
async def handle_get_game_idlist_api_request(self, data: Dict) -> Dict:
return {"type": data["type"], "length": 0, "gameIdlistList": []}
def handle_get_game_message_api_request(self, data: Dict) -> Dict:
async def handle_get_game_message_api_request(self, data: Dict) -> Dict:
return {
"type": data["type"],
"length": 1,
@ -193,14 +193,14 @@ class ChuniBase:
}]
}
def handle_get_game_ranking_api_request(self, data: Dict) -> Dict:
rankings = self.data.score.get_rankings(self.version)
async def handle_get_game_ranking_api_request(self, data: Dict) -> Dict:
rankings = await self.data.score.get_rankings(self.version)
return {"type": data["type"], "gameRankingList": rankings}
def handle_get_game_sale_api_request(self, data: Dict) -> Dict:
async def handle_get_game_sale_api_request(self, data: Dict) -> Dict:
return {"type": data["type"], "length": 0, "gameSaleList": []}
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
# if reboot start/end time is not defined use the default behavior of being a few hours ago
if self.core_cfg.title.reboot_start_time == "" or self.core_cfg.title.reboot_end_time == "":
reboot_start = datetime.strftime(
@ -240,8 +240,8 @@ class ChuniBase:
"isDumpUpload": "false",
"isAou": "false",
}
def handle_get_user_activity_api_request(self, data: Dict) -> Dict:
user_activity_list = self.data.profile.get_profile_activity(
async def handle_get_user_activity_api_request(self, data: Dict) -> Dict:
user_activity_list = await self.data.profile.get_profile_activity(
data["userId"], data["kind"]
)
@ -261,8 +261,8 @@ class ChuniBase:
"userActivityList": activity_list,
}
def handle_get_user_character_api_request(self, data: Dict) -> Dict:
characters = self.data.item.get_characters(data["userId"])
async def handle_get_user_character_api_request(self, data: Dict) -> Dict:
characters = await self.data.item.get_characters(data["userId"])
if characters is None:
return {
"userId": data["userId"],
@ -296,8 +296,8 @@ class ChuniBase:
"userCharacterList": character_list,
}
def handle_get_user_charge_api_request(self, data: Dict) -> Dict:
user_charge_list = self.data.profile.get_profile_charge(data["userId"])
async def handle_get_user_charge_api_request(self, data: Dict) -> Dict:
user_charge_list = await self.data.profile.get_profile_charge(data["userId"])
charge_list = []
for charge in user_charge_list:
@ -312,15 +312,15 @@ class ChuniBase:
"userChargeList": charge_list,
}
def handle_get_user_recent_player_api_request(self, data: Dict) -> Dict:
async def handle_get_user_recent_player_api_request(self, data: Dict) -> Dict:
return {
"userId": data["userId"],
"length": 0,
"userRecentPlayerList": [], # playUserId, playUserName, playDate, friendPoint
}
def handle_get_user_course_api_request(self, data: Dict) -> Dict:
user_course_list = self.data.score.get_courses(data["userId"])
async def handle_get_user_course_api_request(self, data: Dict) -> Dict:
user_course_list = await self.data.score.get_courses(data["userId"])
if user_course_list is None:
return {
"userId": data["userId"],
@ -354,8 +354,8 @@ class ChuniBase:
"userCourseList": course_list,
}
def handle_get_user_data_api_request(self, data: Dict) -> Dict:
p = self.data.profile.get_profile_data(data["userId"], self.version)
async def handle_get_user_data_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_data(data["userId"], self.version)
if p is None:
return {}
@ -366,8 +366,8 @@ class ChuniBase:
return {"userId": data["userId"], "userData": profile}
def handle_get_user_data_ex_api_request(self, data: Dict) -> Dict:
p = self.data.profile.get_profile_data_ex(data["userId"], self.version)
async def handle_get_user_data_ex_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_data_ex(data["userId"], self.version)
if p is None:
return {}
@ -378,8 +378,8 @@ class ChuniBase:
return {"userId": data["userId"], "userDataEx": profile}
def handle_get_user_duel_api_request(self, data: Dict) -> Dict:
user_duel_list = self.data.item.get_duels(data["userId"])
async def handle_get_user_duel_api_request(self, data: Dict) -> Dict:
user_duel_list = await self.data.item.get_duels(data["userId"])
if user_duel_list is None:
return {}
@ -396,8 +396,8 @@ class ChuniBase:
"userDuelList": duel_list,
}
def handle_get_user_rival_data_api_request(self, data: Dict) -> Dict:
p = self.data.profile.get_rival(data["rivalId"])
async def handle_get_user_rival_data_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_rival(data["rivalId"])
if p is None:
return {}
userRivalData = {
@ -409,14 +409,14 @@ class ChuniBase:
"userRivalData": userRivalData
}
def handle_get_user_rival_music_api_request(self, data: Dict) -> Dict:
async def handle_get_user_rival_music_api_request(self, data: Dict) -> Dict:
rival_id = data["rivalId"]
next_index = int(data["nextIndex"])
max_count = int(data["maxCount"])
user_rival_music_list = []
# Fetch all the rival music entries for the user
all_entries = self.data.score.get_rival_music(rival_id)
all_entries = await self.data.score.get_rival_music(rival_id)
# Process the entries based on max_count and nextIndex
for music in all_entries:
@ -462,12 +462,12 @@ class ChuniBase:
return result
def handle_get_user_favorite_item_api_request(self, data: Dict) -> Dict:
async def handle_get_user_favorite_item_api_request(self, data: Dict) -> Dict:
user_fav_item_list = []
# still needs to be implemented on WebUI
# 1: Music, 2: User, 3: Character
fav_list = self.data.item.get_all_favorites(
fav_list = await self.data.item.get_all_favorites(
data["userId"], self.version, fav_kind=int(data["kind"])
)
if fav_list is not None:
@ -482,17 +482,17 @@ class ChuniBase:
"userFavoriteItemList": user_fav_item_list,
}
def handle_get_user_favorite_music_api_request(self, data: Dict) -> Dict:
async def handle_get_user_favorite_music_api_request(self, data: Dict) -> Dict:
"""
This is handled via the webui, which we don't have right now
"""
return {"userId": data["userId"], "length": 0, "userFavoriteMusicList": []}
def handle_get_user_item_api_request(self, data: Dict) -> Dict:
async def handle_get_user_item_api_request(self, data: Dict) -> Dict:
kind = int(int(data["nextIndex"]) / 10000000000)
next_idx = int(int(data["nextIndex"]) % 10000000000)
user_item_list = self.data.item.get_items(data["userId"], kind)
user_item_list = await self.data.item.get_items(data["userId"], kind)
if user_item_list is None or len(user_item_list) == 0:
return {
@ -526,9 +526,9 @@ class ChuniBase:
"userItemList": items,
}
def handle_get_user_login_bonus_api_request(self, data: Dict) -> Dict:
async def handle_get_user_login_bonus_api_request(self, data: Dict) -> Dict:
user_id = data["userId"]
user_login_bonus = self.data.item.get_all_login_bonus(user_id, self.version)
user_login_bonus = await self.data.item.get_all_login_bonus(user_id, self.version)
# ignore the loginBonus request if it's disabled in config
if user_login_bonus is None or not self.game_cfg.mods.use_login_bonus:
return {"userId": user_id, "length": 0, "userLoginBonusList": []}
@ -552,8 +552,8 @@ class ChuniBase:
"userLoginBonusList": user_login_list,
}
def handle_get_user_map_api_request(self, data: Dict) -> Dict:
user_map_list = self.data.item.get_maps(data["userId"])
async def handle_get_user_map_api_request(self, data: Dict) -> Dict:
user_map_list = await self.data.item.get_maps(data["userId"])
if user_map_list is None:
return {}
@ -570,8 +570,8 @@ class ChuniBase:
"userMapList": map_list,
}
def handle_get_user_music_api_request(self, data: Dict) -> Dict:
music_detail = self.data.score.get_scores(data["userId"])
async def handle_get_user_music_api_request(self, data: Dict) -> Dict:
music_detail = await self.data.score.get_scores(data["userId"])
if music_detail is None:
return {
"userId": data["userId"],
@ -629,8 +629,8 @@ class ChuniBase:
"userMusicList": song_list, # 240
}
def handle_get_user_option_api_request(self, data: Dict) -> Dict:
p = self.data.profile.get_profile_option(data["userId"])
async def handle_get_user_option_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_option(data["userId"])
option = p._asdict()
option.pop("id")
@ -638,8 +638,8 @@ class ChuniBase:
return {"userId": data["userId"], "userGameOption": option}
def handle_get_user_option_ex_api_request(self, data: Dict) -> Dict:
p = self.data.profile.get_profile_option_ex(data["userId"])
async def handle_get_user_option_ex_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_option_ex(data["userId"])
option = p._asdict()
option.pop("id")
@ -650,11 +650,11 @@ class ChuniBase:
def read_wtf8(self, src):
return bytes([ord(c) for c in src]).decode("utf-8")
def handle_get_user_preview_api_request(self, data: Dict) -> Dict:
profile = self.data.profile.get_profile_preview(data["userId"], self.version)
async def handle_get_user_preview_api_request(self, data: Dict) -> Dict:
profile = await self.data.profile.get_profile_preview(data["userId"], self.version)
if profile is None:
return None
profile_character = self.data.item.get_character(
profile_character = await self.data.item.get_character(
data["userId"], profile["characterId"]
)
@ -692,8 +692,8 @@ class ChuniBase:
"userNameEx": profile["userName"],
}
def handle_get_user_recent_rating_api_request(self, data: Dict) -> Dict:
recent_rating_list = self.data.profile.get_profile_recent_rating(data["userId"])
async def handle_get_user_recent_rating_api_request(self, data: Dict) -> Dict:
recent_rating_list = await self.data.profile.get_profile_recent_rating(data["userId"])
if recent_rating_list is None:
return {
"userId": data["userId"],
@ -707,7 +707,7 @@ class ChuniBase:
"userRecentRatingList": recent_rating_list["recentRating"],
}
def handle_get_user_region_api_request(self, data: Dict) -> Dict:
async def handle_get_user_region_api_request(self, data: Dict) -> Dict:
# TODO: Region
return {
"userId": data["userId"],
@ -715,23 +715,33 @@ class ChuniBase:
"userRegionList": [],
}
def handle_get_user_team_api_request(self, data: Dict) -> Dict:
async def handle_get_user_team_api_request(self, data: Dict) -> Dict:
# Default values
team_id = 65535
team_name = self.game_cfg.team.team_name
team_rank = 0
team_user_point = 0
# Get user profile
profile = self.data.profile.get_profile_data(data["userId"], self.version)
profile = await self.data.profile.get_profile_data(data["userId"], self.version)
if profile is None:
return {"userId": data["userId"], "teamId": 0}
if profile and profile["teamId"]:
# Get team by id
team = self.data.profile.get_team_by_id(profile["teamId"])
team = await self.data.profile.get_team_by_id(profile["teamId"])
if team:
team_id = team["id"]
team_name = team["teamName"]
team_rank = self.data.profile.get_team_rank(team["id"])
team_rank = await self.data.profile.get_team_rank(team["id"])
team_point = team["teamPoint"]
if team["userTeamPoint"] is not None and team["userTeamPoint"] != "":
user_team_point_data = json.loads(team["userTeamPoint"])
for user_point_data in user_team_point_data:
if user_point_data["user"] == data["userId"]:
team_user_point = int(user_point_data["userPoint"])
# Return nothing if no default team name is configured and the player has no team set
if not profile["teamId"] and team_name == "":
return {"userId": data["userId"], "teamId": 0}
@ -741,16 +751,17 @@ class ChuniBase:
"teamId": team_id,
"teamRank": team_rank,
"teamName": team_name,
"assaultTimeRate": 1, # TODO: Figure out assaultTime, which might be team point boost?
"userTeamPoint": {
"userId": data["userId"],
"teamId": team_id,
"orderId": 1,
"teamPoint": 1,
"orderId": 0,
"teamPoint": team_user_point,
"aggrDate": data["playDate"],
},
}
def handle_get_team_course_setting_api_request(self, data: Dict) -> Dict:
async def handle_get_team_course_setting_api_request(self, data: Dict) -> Dict:
return {
"userId": data["userId"],
"length": 0,
@ -758,7 +769,7 @@ class ChuniBase:
"teamCourseSettingList": [],
}
def handle_get_team_course_setting_api_request_proto(self, data: Dict) -> Dict:
async def handle_get_team_course_setting_api_request_proto(self, data: Dict) -> Dict:
return {
"userId": data["userId"],
"length": 1,
@ -782,7 +793,7 @@ class ChuniBase:
],
}
def handle_get_team_course_rule_api_request(self, data: Dict) -> Dict:
async def handle_get_team_course_rule_api_request(self, data: Dict) -> Dict:
return {
"userId": data["userId"],
"length": 0,
@ -790,7 +801,7 @@ class ChuniBase:
"teamCourseRuleList": []
}
def handle_get_team_course_rule_api_request_proto(self, data: Dict) -> Dict:
async def handle_get_team_course_rule_api_request_proto(self, data: Dict) -> Dict:
return {
"userId": data["userId"],
"length": 1,
@ -807,10 +818,16 @@ class ChuniBase:
],
}
def handle_upsert_user_all_api_request(self, data: Dict) -> Dict:
async def handle_upsert_user_all_api_request(self, data: Dict) -> Dict:
upsert = data["upsertUserAll"]
user_id = data["userId"]
if int(user_id) & 0x1000000000001 == 0x1000000000001:
place_id = int(user_id) & 0xFFFC00000000
self.logger.info("Guest play from place ID %d, ignoring.", place_id)
return {"returnCode": "1"}
if "userData" in upsert:
try:
upsert["userData"][0]["userName"] = self.read_wtf8(
@ -819,58 +836,58 @@ class ChuniBase:
except Exception:
pass
self.data.profile.put_profile_data(
await self.data.profile.put_profile_data(
user_id, self.version, upsert["userData"][0]
)
if "userDataEx" in upsert:
self.data.profile.put_profile_data_ex(
await self.data.profile.put_profile_data_ex(
user_id, self.version, upsert["userDataEx"][0]
)
if "userGameOption" in upsert:
self.data.profile.put_profile_option(user_id, upsert["userGameOption"][0])
await self.data.profile.put_profile_option(user_id, upsert["userGameOption"][0])
if "userGameOptionEx" in upsert:
self.data.profile.put_profile_option_ex(
await self.data.profile.put_profile_option_ex(
user_id, upsert["userGameOptionEx"][0]
)
if "userRecentRatingList" in upsert:
self.data.profile.put_profile_recent_rating(
await self.data.profile.put_profile_recent_rating(
user_id, upsert["userRecentRatingList"]
)
if "userCharacterList" in upsert:
for character in upsert["userCharacterList"]:
self.data.item.put_character(user_id, character)
await self.data.item.put_character(user_id, character)
if "userMapList" in upsert:
for map in upsert["userMapList"]:
self.data.item.put_map(user_id, map)
await self.data.item.put_map(user_id, map)
if "userCourseList" in upsert:
for course in upsert["userCourseList"]:
self.data.score.put_course(user_id, course)
await self.data.score.put_course(user_id, course)
if "userDuelList" in upsert:
for duel in upsert["userDuelList"]:
self.data.item.put_duel(user_id, duel)
await self.data.item.put_duel(user_id, duel)
if "userItemList" in upsert:
for item in upsert["userItemList"]:
self.data.item.put_item(user_id, item)
await self.data.item.put_item(user_id, item)
if "userActivityList" in upsert:
for activity in upsert["userActivityList"]:
self.data.profile.put_profile_activity(user_id, activity)
await self.data.profile.put_profile_activity(user_id, activity)
if "userChargeList" in upsert:
for charge in upsert["userChargeList"]:
self.data.profile.put_profile_charge(user_id, charge)
await self.data.profile.put_profile_charge(user_id, charge)
if "userMusicDetailList" in upsert:
for song in upsert["userMusicDetailList"]:
self.data.score.put_score(user_id, song)
await self.data.score.put_score(user_id, song)
if "userPlaylogList" in upsert:
for playlog in upsert["userPlaylogList"]:
@ -881,7 +898,7 @@ class ChuniBase:
playlog["playedUserName2"] = self.read_wtf8(playlog["playedUserName2"])
if playlog["playedUserName3"] is not None:
playlog["playedUserName3"] = self.read_wtf8(playlog["playedUserName3"])
self.data.score.put_playlog(user_id, playlog, self.version)
await self.data.score.put_playlog(user_id, playlog, self.version)
if "userTeamPoint" in upsert:
team_points = upsert["userTeamPoint"]
@ -889,7 +906,7 @@ class ChuniBase:
for tp in team_points:
if tp["teamId"] != '65535':
# Fetch the current team data
current_team = self.data.profile.get_team_by_id(tp["teamId"])
current_team = await self.data.profile.get_team_by_id(tp["teamId"])
# Calculate the new teamPoint
new_team_point = int(tp["teamPoint"]) + current_team["teamPoint"]
@ -900,24 +917,24 @@ class ChuniBase:
}
# Update the team data
self.data.profile.update_team(tp["teamId"], team_data)
await self.data.profile.update_team(tp["teamId"], team_data)
except:
pass # Probably a better way to catch if the team is not set yet (new profiles), but let's just pass
if "userMapAreaList" in upsert:
for map_area in upsert["userMapAreaList"]:
self.data.item.put_map_area(user_id, map_area)
await self.data.item.put_map_area(user_id, map_area)
if "userOverPowerList" in upsert:
for overpower in upsert["userOverPowerList"]:
self.data.profile.put_profile_overpower(user_id, overpower)
await self.data.profile.put_profile_overpower(user_id, overpower)
if "userEmoneyList" in upsert:
for emoney in upsert["userEmoneyList"]:
self.data.profile.put_profile_emoney(user_id, emoney)
await self.data.profile.put_profile_emoney(user_id, emoney)
if "userLoginBonusList" in upsert:
for login in upsert["userLoginBonusList"]:
self.data.item.put_login_bonus(
await self.data.item.put_login_bonus(
user_id, self.version, login["presetId"], isWatched=True
)
@ -925,31 +942,67 @@ class ChuniBase:
for rp in upsert["userRecentPlayerList"]:
pass
for rating_type in {"userRatingBaseList", "userRatingBaseHotList", "userRatingBaseNextList"}:
if rating_type not in upsert:
continue
await self.data.profile.put_profile_rating(
user_id,
self.version,
rating_type,
upsert[rating_type],
)
# added in LUMINOUS
if "userCMissionList" in upsert:
for cmission in upsert["userCMissionList"]:
mission_id = cmission["missionId"]
await self.data.item.put_cmission(
user_id,
{
"missionId": mission_id,
"point": cmission["point"],
},
)
for progress in cmission["userCMissionProgressList"]:
await self.data.item.put_cmission_progress(user_id, mission_id, progress)
if "userNetBattleData" in upsert:
net_battle = upsert["userNetBattleData"][0]
# fix the boolean
net_battle["isRankUpChallengeFailed"] = (
False if net_battle["isRankUpChallengeFailed"] == "false" else True
)
await self.data.profile.put_net_battle(user_id, net_battle)
return {"returnCode": "1"}
def handle_upsert_user_chargelog_api_request(self, data: Dict) -> Dict:
async def handle_upsert_user_chargelog_api_request(self, data: Dict) -> Dict:
# add tickets after they got bought, this makes sure the tickets are
# still valid after an unsuccessful logout
self.data.profile.put_profile_charge(data["userId"], data["userCharge"])
await self.data.profile.put_profile_charge(data["userId"], data["userCharge"])
return {"returnCode": "1"}
def handle_upsert_client_bookkeeping_api_request(self, data: Dict) -> Dict:
async def handle_upsert_client_bookkeeping_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_upsert_client_develop_api_request(self, data: Dict) -> Dict:
async def handle_upsert_client_develop_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_upsert_client_error_api_request(self, data: Dict) -> Dict:
async def handle_upsert_client_error_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_upsert_client_setting_api_request(self, data: Dict) -> Dict:
async def handle_upsert_client_setting_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_upsert_client_testmode_api_request(self, data: Dict) -> Dict:
async def handle_upsert_client_testmode_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_get_user_net_battle_data_api_request(self, data: Dict) -> Dict:
async def handle_get_user_net_battle_data_api_request(self, data: Dict) -> Dict:
return {
"userId": data["userId"],
"userNetBattleData": {"recentNBSelectMusicList": []},
}
}
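One structural note on the team handling added above: the per-user point lookup in `handle_get_user_team_api_request` treats the team row's `userTeamPoint` column as a JSON array of `{"user": ..., "userPoint": ...}` entries. A minimal sketch of that parse, using made-up IDs and point values:

```python
import json

# Hypothetical contents of team["userTeamPoint"]; the IDs and points are made up.
user_team_point_json = '[{"user": 10000, "userPoint": "1200"}, {"user": 10001, "userPoint": "800"}]'

def get_user_team_point(raw: str, user_id: int) -> int:
    # Mirrors the loop in handle_get_user_team_api_request: default to 0,
    # then take the entry whose "user" field matches the requesting player.
    team_user_point = 0
    if raw:
        for entry in json.loads(raw):
            if entry["user"] == user_id:
                team_user_point = int(entry["userPoint"])
    return team_user_point

print(get_user_team_point(user_team_point_json, 10000))  # -> 1200
```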


@ -1,3 +1,6 @@
from enum import Enum
class ChuniConstants:
GAME_CODE = "SDBT"
GAME_CODE_NEW = "SDHD"
@ -16,10 +19,13 @@ class ChuniConstants:
VER_CHUNITHM_CRYSTAL = 8
VER_CHUNITHM_CRYSTAL_PLUS = 9
VER_CHUNITHM_PARADISE = 10
VER_CHUNITHM_NEW = 11
VER_CHUNITHM_NEW_PLUS = 12
VER_CHUNITHM_SUN = 13
VER_CHUNITHM_SUN_PLUS = 14
VER_CHUNITHM_LUMINOUS = 15
VERSION_NAMES = [
"CHUNITHM",
"CHUNITHM PLUS",
@ -35,9 +41,53 @@ class ChuniConstants:
"CHUNITHM NEW!!",
"CHUNITHM NEW PLUS!!",
"CHUNITHM SUN",
"CHUNITHM SUN PLUS"
"CHUNITHM SUN PLUS",
"CHUNITHM LUMINOUS",
]
SCORE_RANK_INTERVALS_OLD = [
(1007500, "SSS"),
(1000000, "SS"),
( 975000, "S"),
( 950000, "AAA"),
( 925000, "AA"),
( 900000, "A"),
( 800000, "BBB"),
( 700000, "BB"),
( 600000, "B"),
( 500000, "C"),
( 0, "D"),
]
SCORE_RANK_INTERVALS_NEW = [
(1009000, "SSS+"), # New only
(1007500, "SSS"),
(1005000, "SS+"), # New only
(1000000, "SS"),
( 990000, "S+"), # New only
( 975000, "S"),
( 950000, "AAA"),
( 925000, "AA"),
( 900000, "A"),
( 800000, "BBB"),
( 700000, "BB"),
( 600000, "B"),
( 500000, "C"),
( 0, "D"),
]
@classmethod
def game_ver_to_string(cls, ver: int):
return cls.VERSION_NAMES[ver]
return cls.VERSION_NAMES[ver]
class MapAreaConditionType(Enum):
UNLOCKED = 0
MAP_CLEARED = 1
MAP_AREA_CLEARED = 2
TROPHY_OBTAINED = 3
class MapAreaConditionLogicalOperator(Enum):
AND = 1
OR = 2


@ -13,7 +13,7 @@ class ChuniCrystal(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_CRYSTAL
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.40.00"
return ret


@ -13,7 +13,7 @@ class ChuniCrystalPlus(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.45.00"
return ret

titles/chuni/frontend.py

@ -0,0 +1,299 @@
from typing import List
from starlette.routing import Route, Mount
from starlette.requests import Request
from starlette.responses import Response, RedirectResponse
from os import path
import yaml
import jinja2
from core.frontend import FE_Base, UserSession
from core.config import CoreConfig
from .database import ChuniData
from .config import ChuniConfig
from .const import ChuniConstants
def pairwise(iterable):
# https://docs.python.org/3/library/itertools.html#itertools.pairwise
# but for Python < 3.10. pairwise('ABCDEFG') → AB BC CD DE EF FG
iterator = iter(iterable)
a = next(iterator, None)
for b in iterator:
yield a, b
a = b
def calculate_song_rank(score: int, game_version: int) -> str:
if game_version >= ChuniConstants.VER_CHUNITHM_NEW:
intervals = ChuniConstants.SCORE_RANK_INTERVALS_NEW
else:
intervals = ChuniConstants.SCORE_RANK_INTERVALS_OLD
for (min_score, rank) in intervals:
if score >= min_score:
return rank
return "D"
def calculate_song_rating(score: int, chart_constant: float, game_version: int) -> float:
is_new = game_version >= ChuniConstants.VER_CHUNITHM_NEW
if is_new: # New and later
max_score = 1009000
max_rating_modifier = 2.15
else: # Up to Paradise Lost
max_score = 1007500
max_rating_modifier = 2.0
if (score < 500000):
return 0.0 # D
elif (score >= max_score):
return chart_constant + max_rating_modifier # SSS/SSS+
# Okay, we're doing this the hard way.
# Rating goes up linearly between breakpoints listed below.
# Pick the score interval in which we are in, then calculate
# the position between possible ratings.
score_intervals = [
( 500000, 0.0), # C
( 800000, max(0.0, (chart_constant - 5.0) / 2)), # BBB
( 900000, max(0.0, (chart_constant - 5.0))), # A
( 925000, max(0.0, (chart_constant - 3.0))), # AA
( 975000, chart_constant), # S
(1000000, chart_constant + 1.0), # SS
(1005000, chart_constant + 1.5), # SS+
(1007500, chart_constant + 2.0), # SSS
(1009000, chart_constant + max_rating_modifier), # SSS+!
]
for ((lo_score, lo_rating), (hi_score, hi_rating)) in pairwise(score_intervals):
if not (lo_score <= score < hi_score):
continue
interval_pos = (score - lo_score) / (hi_score - lo_score)
return lo_rating + ((hi_rating - lo_rating) * interval_pos)
class ChuniFrontend(FE_Base):
def __init__(
self, cfg: CoreConfig, environment: jinja2.Environment, cfg_dir: str
) -> None:
super().__init__(cfg, environment)
self.data = ChuniData(cfg)
self.game_cfg = ChuniConfig()
if path.exists(f"{cfg_dir}/{ChuniConstants.CONFIG_NAME}"):
self.game_cfg.update(
yaml.safe_load(open(f"{cfg_dir}/{ChuniConstants.CONFIG_NAME}"))
)
self.nav_name = "Chunithm"
def get_routes(self) -> List[Route]:
return [
Route("/", self.render_GET, methods=['GET']),
Route("/rating", self.render_GET_rating, methods=['GET']),
Mount("/playlog", routes=[
Route("/", self.render_GET_playlog, methods=['GET']),
Route("/{index}", self.render_GET_playlog, methods=['GET']),
]),
Route("/update.name", self.update_name, methods=['POST']),
Route("/version.change", self.version_change, methods=['POST']),
]
async def render_GET(self, request: Request) -> bytes:
template = self.environment.get_template(
"titles/chuni/templates/chuni_index.jinja"
)
usr_sesh = self.validate_session(request)
if not usr_sesh:
usr_sesh = UserSession()
if usr_sesh.user_id > 0:
versions = await self.data.profile.get_all_profile_versions(usr_sesh.user_id)
profile = []
if versions:
# chunithm_version is -1 means it is not initialized yet, select a default version from existing.
if usr_sesh.chunithm_version < 0:
usr_sesh.chunithm_version = versions[0]
profile = await self.data.profile.get_profile_data(usr_sesh.user_id, usr_sesh.chunithm_version)
resp = Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
user_id=usr_sesh.user_id,
profile=profile,
version_list=ChuniConstants.VERSION_NAMES,
versions=versions,
cur_version=usr_sesh.chunithm_version
), media_type="text/html; charset=utf-8")
if usr_sesh.chunithm_version >= 0:
encoded_sesh = self.encode_session(usr_sesh)
resp.set_cookie("ARTEMIS_SESH", encoded_sesh)
return resp
else:
return RedirectResponse("/gate/", 303)
async def render_GET_rating(self, request: Request) -> bytes:
template = self.environment.get_template(
"titles/chuni/templates/chuni_rating.jinja"
)
usr_sesh = self.validate_session(request)
if not usr_sesh:
usr_sesh = UserSession()
if usr_sesh.user_id > 0:
if usr_sesh.chunithm_version < 0:
return RedirectResponse("/game/chuni/", 303)
profile = await self.data.profile.get_profile_data(usr_sesh.user_id, usr_sesh.chunithm_version)
rating = await self.data.profile.get_profile_rating(usr_sesh.user_id, usr_sesh.chunithm_version)
hot_list=[]
base_list=[]
if profile and rating:
song_records = []
for song in rating:
music_chart = await self.data.static.get_music_chart(usr_sesh.chunithm_version, song.musicId, song.difficultId)
if not music_chart:
continue
rank = calculate_song_rank(song.score, profile.version)
rating = calculate_song_rating(song.score, music_chart.level, profile.version)
song_rating = int(rating * 10 ** 2) / 10 ** 2
song_records.append({
"difficultId": song.difficultId,
"musicId": song.musicId,
"title": music_chart.title,
"level": music_chart.level,
"score": song.score,
"type": song.type,
"rank": rank,
"song_rating": song_rating,
})
hot_list = [obj for obj in song_records if obj["type"] == "userRatingBaseHotList"]
base_list = [obj for obj in song_records if obj["type"] == "userRatingBaseList"]
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
profile=profile,
hot_list=hot_list,
base_list=base_list,
), media_type="text/html; charset=utf-8")
else:
return RedirectResponse("/gate/", 303)
async def render_GET_playlog(self, request: Request) -> bytes:
template = self.environment.get_template(
"titles/chuni/templates/chuni_playlog.jinja"
)
usr_sesh = self.validate_session(request)
if not usr_sesh:
usr_sesh = UserSession()
if usr_sesh.user_id > 0:
if usr_sesh.chunithm_version < 0:
return RedirectResponse("/game/chuni/", 303)
path_index = request.path_params.get('index')
if not path_index or int(path_index) < 1:
index = 0
else:
index = int(path_index) - 1 # 0 and 1 are 1st page
user_id = usr_sesh.user_id
playlog_count = await self.data.score.get_user_playlogs_count(user_id)
if playlog_count < index * 20 :
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
playlog_count=0
), media_type="text/html; charset=utf-8")
playlog = await self.data.score.get_playlogs_limited(user_id, index, 20)
playlog_with_title = []
for record in playlog:
music_chart = await self.data.static.get_music_chart(usr_sesh.chunithm_version, record.musicId, record.level)
if music_chart:
difficultyNum=music_chart.level
artist=music_chart.artist
title=music_chart.title
else:
difficultyNum=0
artist="unknown"
title="musicid: " + str(record.musicId)
playlog_with_title.append({
"raw": record,
"title": title,
"difficultyNum": difficultyNum,
"artist": artist,
})
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
user_id=usr_sesh.user_id,
playlog=playlog_with_title,
playlog_count=playlog_count
), media_type="text/html; charset=utf-8")
else:
return RedirectResponse("/gate/", 303)
async def update_name(self, request: Request) -> bytes:
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
form_data = await request.form()
new_name: str = form_data.get("new_name")
new_name_full = ""
if not new_name:
return RedirectResponse("/gate/?e=4", 303)
if len(new_name) > 8:
return RedirectResponse("/gate/?e=8", 303)
for x in new_name: # FIXME: This will let some invalid characters through atm
o = ord(x)
try:
if o == 0x20:
new_name_full += chr(0x3000)
elif o < 0x7F and o > 0x20:
new_name_full += chr(o + 0xFEE0)
elif o <= 0x7F:
self.logger.warn(f"Invalid ascii character {o:02X}")
return RedirectResponse("/gate/?e=4", 303)
else:
new_name_full += x
except Exception as e:
self.logger.error(f"Something went wrong parsing character {o:04X} - {e}")
return RedirectResponse("/gate/?e=4", 303)
if not await self.data.profile.update_name(usr_sesh.user_id, new_name_full):
return RedirectResponse("/gate/?e=999", 303)
return RedirectResponse("/game/chuni/?s=1", 303)
async def version_change(self, request: Request):
usr_sesh = self.validate_session(request)
if not usr_sesh:
usr_sesh = UserSession()
if usr_sesh.user_id > 0:
form_data = await request.form()
chunithm_version = form_data.get("version")
self.logger.debug(f"version change to: {chunithm_version}")
if(chunithm_version.isdigit()):
usr_sesh.chunithm_version=int(chunithm_version)
encoded_sesh = self.encode_session(usr_sesh)
self.logger.debug(f"Created session with JWT {encoded_sesh}")
resp = RedirectResponse("/game/chuni/", 303)
resp.set_cookie("ARTEMIS_SESH", encoded_sesh)
return resp
else:
return RedirectResponse("/gate/", 303)


@ -1,5 +1,8 @@
from twisted.web.http import Request
import logging, coloredlogs
from starlette.requests import Request
from starlette.routing import Route
from starlette.responses import Response
import logging
import coloredlogs
from logging.handlers import TimedRotatingFileHandler
import zlib
import yaml
@ -32,13 +35,13 @@ from .new import ChuniNew
from .newplus import ChuniNewPlus
from .sun import ChuniSun
from .sunplus import ChuniSunPlus
from .luminous import ChuniLuminous
class ChuniServlet(BaseServlet):
def __init__(self, core_cfg: CoreConfig, cfg_dir: str) -> None:
super().__init__(core_cfg, cfg_dir)
self.game_cfg = ChuniConfig()
self.hash_table: Dict[Dict[str, str]] = {}
self.hash_table: Dict[str, Dict[str, str]] = {}
if path.exists(f"{cfg_dir}/{ChuniConstants.CONFIG_NAME}"):
self.game_cfg.update(
yaml.safe_load(open(f"{cfg_dir}/{ChuniConstants.CONFIG_NAME}"))
@ -60,6 +63,7 @@ class ChuniServlet(BaseServlet):
ChuniNewPlus,
ChuniSun,
ChuniSunPlus,
ChuniLuminous,
]
self.logger = logging.getLogger("chuni")
@ -88,30 +92,65 @@ class ChuniServlet(BaseServlet):
)
self.logger.inited = True
known_iter_counts = {
ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS: 67,
f"{ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS}_int": 25, # SUPERSTAR
ChuniConstants.VER_CHUNITHM_PARADISE: 44,
f"{ChuniConstants.VER_CHUNITHM_PARADISE}_int": 51, # SUPERSTAR PLUS
ChuniConstants.VER_CHUNITHM_NEW: 54,
f"{ChuniConstants.VER_CHUNITHM_NEW}_int": 49,
ChuniConstants.VER_CHUNITHM_NEW_PLUS: 25,
f"{ChuniConstants.VER_CHUNITHM_NEW_PLUS}_int": 31,
ChuniConstants.VER_CHUNITHM_SUN: 70,
f"{ChuniConstants.VER_CHUNITHM_SUN}_int": 35,
ChuniConstants.VER_CHUNITHM_SUN_PLUS: 36,
f"{ChuniConstants.VER_CHUNITHM_SUN_PLUS}_int": 36,
ChuniConstants.VER_CHUNITHM_LUMINOUS: 8,
f"{ChuniConstants.VER_CHUNITHM_LUMINOUS}_int": 8,
}
for version, keys in self.game_cfg.crypto.keys.items():
if len(keys) < 3:
continue
self.hash_table[version] = {}
if isinstance(version, int):
version_idx = version
else:
version_idx = int(version.split("_")[0])
salt = bytes.fromhex(keys[2])
if len(keys) >= 4:
iter_count = keys[3]
elif (iter_count := known_iter_counts.get(version)) is None:
self.logger.error(
"Number of iteration rounds for version %s is not known, but it is not specified in the config",
version,
)
continue
self.hash_table[version] = {}
method_list = [
method
for method in dir(self.versions[version])
for method in dir(self.versions[version_idx])
if not method.startswith("__")
]
for method in method_list:
method_fixed = inflection.camelize(method)[6:-7]
# number of iterations was changed to 70 in SUN and then to 36
if version == ChuniConstants.VER_CHUNITHM_SUN_PLUS:
iter_count = 36
elif version == ChuniConstants.VER_CHUNITHM_SUN:
iter_count = 70
else:
iter_count = 44
# This only applies for CHUNITHM NEW International and later for some reason.
# CHUNITHM SUPERSTAR (PLUS) did not add "Exp" to the endpoint when hashing.
if (
isinstance(version, str)
and version.endswith("_int")
and version_idx >= ChuniConstants.VER_CHUNITHM_NEW
):
method_fixed += "C3Exp"
hash = PBKDF2(
method_fixed,
bytes.fromhex(keys[2]),
salt,
128,
count=iter_count,
hmac_hash_module=SHA1,
@ -121,18 +160,9 @@ class ChuniServlet(BaseServlet):
self.hash_table[version][hashed_name] = method_fixed
self.logger.debug(
f"Hashed v{version} method {method_fixed} with {bytes.fromhex(keys[2])} to get {hash.hex()}"
f"Hashed v{version} method {method_fixed} with {salt} to get {hashed_name}"
)
def get_endpoint_matchers(self) -> Tuple[List[Tuple[str, str, Dict]], List[Tuple[str, str, Dict]]]:
return (
[],
[
("render_POST", "/{game}/{version}/ChuniServlet/{endpoint}", {}),
("render_POST", "/{game}/{version}/ChuniServlet/MatchingServer/{endpoint}", {})
]
)
@classmethod
def is_game_enabled(
cls, game_code: str, core_cfg: CoreConfig, cfg_dir: str
@ -150,19 +180,25 @@ class ChuniServlet(BaseServlet):
def get_allnet_info(self, game_code: str, game_ver: int, keychip: str) -> Tuple[str, str]:
if not self.core_cfg.server.is_using_proxy and Utils.get_title_port(self.core_cfg) != 80:
return (f"http://{self.core_cfg.title.hostname}:{Utils.get_title_port(self.core_cfg)}/{game_code}/{game_ver}/", self.core_cfg.title.hostname)
return (f"http://{self.core_cfg.server.hostname}:{Utils.get_title_port(self.core_cfg)}/{game_code}/{game_ver}/", self.core_cfg.server.hostname)
return (f"http://{self.core_cfg.title.hostname}/{game_code}/{game_ver}/", self.core_cfg.title.hostname)
return (f"http://{self.core_cfg.server.hostname}/{game_code}/{game_ver}/", self.core_cfg.server.hostname)
def render_POST(self, request: Request, game_code: str, matchers: Dict) -> bytes:
endpoint = matchers['endpoint']
version = int(matchers['version'])
game_code = matchers['game']
def get_routes(self) -> List[Route]:
return [
Route("/{game:str}/{version:int}/ChuniServlet/{endpoint:str}", self.render_POST, methods=['POST']),
Route("/{game:str}/{version:int}/ChuniServlet/MatchingServer/{endpoint:str}", self.render_POST, methods=['POST']),
]
async def render_POST(self, request: Request) -> bytes:
endpoint: str = request.path_params.get('endpoint')
version: int = request.path_params.get('version')
game_code: str = request.path_params.get('game')
if endpoint.lower() == "ping":
return zlib.compress(b'{"returnCode": "1"}')
return Response(zlib.compress(b'{"returnCode": "1"}'))
req_raw = request.content.getvalue()
req_raw = await request.body()
encrtped = False
internal_ver = 0
@ -197,47 +233,61 @@ class ChuniServlet(BaseServlet):
internal_ver = ChuniConstants.VER_CHUNITHM_NEW_PLUS
elif version >= 210 and version < 215: # SUN
internal_ver = ChuniConstants.VER_CHUNITHM_SUN
elif version >= 215: # SUN
elif version >= 215 and version < 220: # SUN PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_SUN_PLUS
elif version >= 220: # LUMINOUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS
elif game_code == "SDGS": # Int
if version < 110: # SUPERSTAR
internal_ver = ChuniConstants.PARADISE
if version < 105: # SUPERSTAR
internal_ver = ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS
elif version >= 105 and version < 110: # SUPERSTAR PLUS *Cursed but needed due to different encryption key
internal_ver = ChuniConstants.VER_CHUNITHM_PARADISE
elif version >= 110 and version < 115: # NEW
internal_ver = ChuniConstants.VER_CHUNITHM_NEW
elif version >= 115 and version < 120: # NEW PLUS!!
internal_ver = ChuniConstants.VER_CHUNITHM_NEW_PLUS
elif version >= 120 and version < 125: # SUN
internal_ver = ChuniConstants.VER_CHUNITHM_SUN
elif version >= 125: # SUN PLUS
elif version >= 125 and version < 130: # SUN PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_SUN_PLUS
elif version >= 130: # LUMINOUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS
if all(c in string.hexdigits for c in endpoint) and len(endpoint) == 32:
# If we get a 32 character long hex string, it's a hash and we're
# doing encrypted. The likelyhood of false positives is low but
# doing encrypted. The likelihood of false positives is low but
# technically not 0
if game_code == "SDGS":
crypto_cfg_key = f"{internal_ver}_int"
hash_table_key = f"{internal_ver}_int"
else:
crypto_cfg_key = internal_ver
hash_table_key = internal_ver
if internal_ver < ChuniConstants.VER_CHUNITHM_NEW:
endpoint = request.getHeader("User-Agent").split("#")[0]
endpoint = request.headers.get("User-Agent").split("#")[0]
else:
if internal_ver not in self.hash_table:
if hash_table_key not in self.hash_table:
self.logger.error(
f"v{version} does not support encryption or no keys entered"
)
return zlib.compress(b'{"stat": "0"}')
return Response(zlib.compress(b'{"stat": "0"}'))
elif endpoint.lower() not in self.hash_table[internal_ver]:
elif endpoint.lower() not in self.hash_table[hash_table_key]:
self.logger.error(
f"No hash found for v{version} endpoint {endpoint}"
)
return zlib.compress(b'{"stat": "0"}')
return Response(zlib.compress(b'{"stat": "0"}'))
endpoint = self.hash_table[internal_ver][endpoint.lower()]
endpoint = self.hash_table[hash_table_key][endpoint.lower()]
try:
crypt = AES.new(
bytes.fromhex(self.game_cfg.crypto.keys[internal_ver][0]),
bytes.fromhex(self.game_cfg.crypto.keys[crypto_cfg_key][0]),
AES.MODE_CBC,
bytes.fromhex(self.game_cfg.crypto.keys[internal_ver][1]),
bytes.fromhex(self.game_cfg.crypto.keys[crypto_cfg_key][1]),
)
req_raw = crypt.decrypt(req_raw)
@ -246,7 +296,7 @@ class ChuniServlet(BaseServlet):
self.logger.error(
f"Failed to decrypt v{version} request to {endpoint} -> {e}"
)
return zlib.compress(b'{"stat": "0"}')
return Response(zlib.compress(b'{"stat": "0"}'))
encrtped = True
@ -258,7 +308,7 @@ class ChuniServlet(BaseServlet):
self.logger.error(
f"Unencrypted v{version} {endpoint} request, but config is set to encrypted only: {req_raw}"
)
return zlib.compress(b'{"stat": "0"}')
return Response(zlib.compress(b'{"stat": "0"}'))
try:
unzip = zlib.decompress(req_raw)
@ -267,14 +317,20 @@ class ChuniServlet(BaseServlet):
self.logger.error(
f"Failed to decompress v{version} {endpoint} request -> {e}"
)
return b""
return Response(zlib.compress(b'{"stat": "0"}'))
req_data = json.loads(unzip)
self.logger.info(f"v{version} {endpoint} request from {client_ip}")
self.logger.debug(req_data)
endpoint = endpoint.replace("C3Exp", "") if game_code == "SDGS" else endpoint
if game_code == "SDGS" and version >= 110:
endpoint = endpoint.replace("C3Exp", "")
elif game_code == "SDGS" and version < 110:
endpoint = endpoint.replace("Exp", "")
else:
endpoint = endpoint
func_to_find = "handle_" + inflection.underscore(endpoint) + "_request"
handler_cls = self.versions[internal_ver](self.core_cfg, self.game_cfg)
@ -285,13 +341,13 @@ class ChuniServlet(BaseServlet):
else:
try:
handler = getattr(handler_cls, func_to_find)
resp = handler(req_data)
resp = await handler(req_data)
except Exception as e:
self.logger.error(f"Error handling v{version} method {endpoint} - {e}")
return zlib.compress(b'{"stat": "0"}')
return Response(zlib.compress(b'{"stat": "0"}'))
if resp == None:
if resp is None:
resp = {"returnCode": 1}
self.logger.debug(f"Response {resp}")
@ -299,14 +355,14 @@ class ChuniServlet(BaseServlet):
zipped = zlib.compress(json.dumps(resp, ensure_ascii=False).encode("utf-8"))
if not encrtped:
return zipped
return Response(zipped)
padded = pad(zipped, 16)
crypt = AES.new(
bytes.fromhex(self.game_cfg.crypto.keys[internal_ver][0]),
bytes.fromhex(self.game_cfg.crypto.keys[crypto_cfg_key][0]),
AES.MODE_CBC,
bytes.fromhex(self.game_cfg.crypto.keys[internal_ver][1]),
bytes.fromhex(self.game_cfg.crypto.keys[crypto_cfg_key][1]),
)
return crypt.encrypt(padded)
return Response(crypt.encrypt(padded))
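As a rough standalone sketch of the response path above (JSON, then zlib, then optional AES-CBC with PKCS#7 padding), assuming pycryptodome; the key and IV are placeholders for the per-version values in the crypto config.

```python
import json
import zlib
from typing import Optional

from Crypto.Cipher import AES
from Crypto.Util.Padding import pad

def wrap_chuni_response(resp: dict, key: Optional[bytes] = None, iv: Optional[bytes] = None) -> bytes:
    # Responses are zlib-compressed JSON; encryption is only applied when the
    # request itself arrived encrypted.
    zipped = zlib.compress(json.dumps(resp, ensure_ascii=False).encode("utf-8"))
    if key is None or iv is None:
        return zipped
    return AES.new(key, AES.MODE_CBC, iv).encrypt(pad(zipped, 16))

# Placeholder 16-byte key/IV, for illustration only:
print(len(wrap_chuni_response({"returnCode": 1}, key=bytes(16), iv=bytes(16))))
```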

titles/chuni/luminous.py (new file, 298 lines)

@ -0,0 +1,298 @@
from datetime import timedelta
from typing import Dict
from core.config import CoreConfig
from titles.chuni.sunplus import ChuniSunPlus
from titles.chuni.const import ChuniConstants, MapAreaConditionLogicalOperator, MapAreaConditionType
from titles.chuni.config import ChuniConfig
class ChuniLuminous(ChuniSunPlus):
def __init__(self, core_cfg: CoreConfig, game_cfg: ChuniConfig) -> None:
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_LUMINOUS
async def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = await super().handle_cm_get_user_preview_api_request(data)
# Does CARD MAKER 1.35 work this far up?
user_data["lastDataVersion"] = "2.20.00"
return user_data
async def handle_get_user_c_mission_api_request(self, data: Dict) -> Dict:
user_id = data["userId"]
mission_id = data["missionId"]
progress_list = []
point = 0
mission_data = await self.data.item.get_cmission(user_id, mission_id)
progress_data = await self.data.item.get_cmission_progress(user_id, mission_id)
if mission_data and progress_data:
point = mission_data["point"]
for progress in progress_data:
progress_list.append(
{
"order": progress["order"],
"stage": progress["stage"],
"progress": progress["progress"],
}
)
return {
"userId": user_id,
"missionId": mission_id,
"point": point,
"userCMissionProgressList": progress_list,
}
async def handle_get_user_net_battle_ranking_info_api_request(self, data: Dict) -> Dict:
user_id = data["userId"]
net_battle = {}
net_battle_data = await self.data.profile.get_net_battle(user_id)
if net_battle_data:
net_battle = {
"isRankUpChallengeFailed": net_battle_data["isRankUpChallengeFailed"],
"highestBattleRankId": net_battle_data["highestBattleRankId"],
"battleIconId": net_battle_data["battleIconId"],
"battleIconNum": net_battle_data["battleIconNum"],
"avatarEffectPoint": net_battle_data["avatarEffectPoint"],
}
return {
"userId": user_id,
"userNetBattleData": net_battle,
}
async def handle_get_game_map_area_condition_api_request(self, data: Dict) -> Dict:
# There is no game data for this; everything is handled server-side.
# However, we can selectively show/hide map areas as event data is imported into
# the server (an illustrative condition entry follows this handler).
events = await self.data.static.get_enabled_events(self.version)
event_by_id = {evt["eventId"]: evt for evt in events}
conditions = []
# The Mystic Rainbow of LUMINOUS map unlocks when any mainline LUMINOUS area
# (ep. I, ep. II, or ep. III) is completed.
mystic_area_1_conditions = {
"mapAreaId": 3229301, # Mystic Rainbow of LUMINOUS Area 1
"length": 0,
"mapAreaConditionList": [],
}
mystic_area_1_added = False
# Secret AREA: MUSIC GAME
if 14029 in event_by_id:
start_date = event_by_id[14029]["startDate"].strftime(self.date_time_format)
mission_in_progress_end_date = "2099-12-31 00:00:00.0"
# The "MISSION in progress" trophy required to trigger the secret area
# is only available in the first CHUNITHM mission. If the second mission
# (event ID 14214) was imported into ARTEMiS, we disable the requirement
# for this trophy.
if 14214 in event_by_id:
mission_in_progress_end_date = (event_by_id[14214]["startDate"] - timedelta(hours=2)).strftime(self.date_time_format)
conditions.extend([
{
"mapAreaId": 2206201, # BlythE ULTIMA
"length": 1,
# Obtain the trophy "MISSION in progress".
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6832,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": mission_in_progress_end_date,
}
],
},
{
"mapAreaId": 2206202, # PRIVATE SERVICE ULTIMA
"length": 1,
# Obtain the trophy "MISSION in progress".
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6832,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": mission_in_progress_end_date,
}
],
},
{
"mapAreaId": 2206203, # New York Back Raise
"length": 1,
# SS NightTheater's EXPERT chart and get the title
# "今宵、劇場に映し出される景色とは――――。"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6833,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206204, # Spasmodic
"length": 2,
# - Get 1 miss on Random (any difficulty) and get the title "当たり待ち"
# - Get 1 miss on 花たちに希望を (any difficulty) and get the title "花たちに希望を"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6834,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6835,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206205, # ΩΩPARTS
"length": 2,
# - S Sage EXPERT to get the title "マターリ進行キボンヌ"
# - Equip this title and play cab-to-cab with another person who also has it
# to get "マターリしようよ". Disabled because cab-to-cab play is difficult
# on data setups. A network operator may re-enable it by uncommenting
# the second condition.
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6836,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
# {
# "type": MapAreaConditionType.TROPHY_OBTAINED.value,
# "conditionId": 6837,
# "logicalOpe": MapAreaConditionLogicalOperator.AND.value,
# "startDate": start_date,
# "endDate": "2099-12-31 00:00:00.0",
# },
],
},
{
"mapAreaId": 2206206, # Blow My Mind
"length": 1,
# SS on CHAOS EXPERT, Hydra EXPERT, Surive EXPERT and Jakarta PROGRESSION EXPERT
# to get the title "Can you hear me?"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6838,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206207, # VALLIS-NERIA
"length": 6,
# Finish the 6 other areas
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_AREA_CLEARED.value,
"conditionId": x,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
}
for x in range(2206201, 2206207)
],
},
])
# LUMINOUS ep. I
if 14005 in event_by_id:
start_date = event_by_id[14005]["startDate"].strftime(self.date_time_format)
if not mystic_area_1_added:
conditions.append(mystic_area_1_conditions)
mystic_area_1_added = True
mystic_area_1_conditions["length"] += 1
mystic_area_1_conditions["mapAreaConditionList"].append(
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020701,
"logicalOpe": MapAreaConditionLogicalOperator.OR.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
}
)
conditions.append(
{
"mapAreaId": 3229302, # Mystic Rainbow of LUMINOUS Area 2,
"length": 1,
# Unlocks when LUMINOUS ep. I is completed.
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020701,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
}
)
# LUMINOUS ep. II
if 14251 in event_by_id:
start_date = event_by_id[14251]["startDate"].strftime(self.date_time_format)
if not mystic_area_1_added:
conditions.append(mystic_area_1_conditions)
mystic_area_1_added = True
mystic_area_1_conditions["length"] += 1
mystic_area_1_conditions["mapAreaConditionList"].append(
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020702,
"logicalOpe": MapAreaConditionLogicalOperator.OR.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
}
)
conditions.append(
{
"mapAreaId": 3229303, # Mystic Rainbow of LUMINOUS Area 3,
"length": 1,
# Unlocks when LUMINOUS ep. II is completed.
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020702,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
}
)
return {
"length": len(conditions),
"gameMapAreaConditionList": conditions,
}
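All of these unlock conditions are defined server-side, so an operator can extend this handler by appending entries of the same shape before the final return. A purely hypothetical entry for illustration; the IDs and dates below are placeholders, not real game data.

```python
# Hypothetical, illustration-only condition entry appended inside
# handle_get_game_map_area_condition_api_request before the return.
from titles.chuni.const import MapAreaConditionLogicalOperator, MapAreaConditionType

conditions: list = []
conditions.append(
    {
        "mapAreaId": 0,  # placeholder map area ID
        "length": 1,
        "mapAreaConditionList": [
            {
                "type": MapAreaConditionType.TROPHY_OBTAINED.value,
                "conditionId": 0,  # placeholder trophy ID
                "logicalOpe": MapAreaConditionLogicalOperator.AND.value,
                "startDate": "2024-01-01 00:00:00.0",
                "endDate": "2099-12-31 00:00:00.0",
            }
        ],
    }
)
```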


@ -32,8 +32,10 @@ class ChuniNew(ChuniBase):
return "210"
if self.version == ChuniConstants.VER_CHUNITHM_SUN_PLUS:
return "215"
if self.version == ChuniConstants.VER_CHUNITHM_LUMINOUS:
return "220"
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
# Use UTC time and convert it to JST by adding +9 hours.
# Matching therefore starts one hour before the current time and lasts for
# 8 hours (see the sketch after this handler).
match_start = datetime.strftime(
@ -82,27 +84,27 @@ class ChuniNew(ChuniBase):
"matchErrorLimit": self.game_cfg.matching.match_error_limit,
"romVersion": self.game_cfg.version.version(self.version)["rom"],
"dataVersion": self.game_cfg.version.version(self.version)["data"],
"matchingUri": f"http://{self.core_cfg.title.hostname}:{t_port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
"matchingUriX": f"http://{self.core_cfg.title.hostname}:{t_port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
"matchingUri": f"http://{self.core_cfg.server.hostname}:{t_port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
"matchingUriX": f"http://{self.core_cfg.server.hostname}:{t_port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
# might be really important for online battle to connect the cabs via UDP port 50201
"udpHolePunchUri": f"http://{self.core_cfg.title.hostname}:{self.core_cfg.title.port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
"reflectorUri": f"http://{self.core_cfg.title.hostname}:{self.core_cfg.title.port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
"udpHolePunchUri": f"http://{self.core_cfg.server.hostname}:{self.core_cfg.server.port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
"reflectorUri": f"http://{self.core_cfg.server.hostname}:{self.core_cfg.server.port}/SDHD/{self._interal_ver_to_intver()}/ChuniServlet/",
},
"isDumpUpload": False,
"isAou": False,
}
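A hedged sketch of the matching window described in the comment at the top of this handler: UTC now shifted to JST (+9 hours), opening one hour early and staying open for eight hours. The exact arithmetic and timestamp format ARTEMiS uses may differ; this is illustrative only.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional, Tuple

def matching_window(now_utc: Optional[datetime] = None) -> Tuple[str, str]:
    # Illustration only: JST is UTC+9; the window opens one hour before "now"
    # and stays open for eight hours.
    now_utc = now_utc or datetime.now(timezone.utc)
    jst_now = now_utc + timedelta(hours=9)
    start = jst_now - timedelta(hours=1)
    end = start + timedelta(hours=8)
    fmt = "%Y-%m-%d %H:%M:%S"
    return start.strftime(fmt), end.strftime(fmt)

print(matching_window())
```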
def handle_remove_token_api_request(self, data: Dict) -> Dict:
async def handle_remove_token_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_delete_token_api_request(self, data: Dict) -> Dict:
async def handle_delete_token_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_create_token_api_request(self, data: Dict) -> Dict:
async def handle_create_token_api_request(self, data: Dict) -> Dict:
return {"returnCode": "1"}
def handle_get_user_map_area_api_request(self, data: Dict) -> Dict:
user_map_areas = self.data.item.get_map_areas(data["userId"])
async def handle_get_user_map_area_api_request(self, data: Dict) -> Dict:
user_map_areas = await self.data.item.get_map_areas(data["userId"])
map_areas = []
for map_area in user_map_areas:
@ -113,14 +115,14 @@ class ChuniNew(ChuniBase):
return {"userId": data["userId"], "userMapAreaList": map_areas}
def handle_get_user_symbol_chat_setting_api_request(self, data: Dict) -> Dict:
async def handle_get_user_symbol_chat_setting_api_request(self, data: Dict) -> Dict:
return {"userId": data["userId"], "symbolCharInfoList": []}
def handle_get_user_preview_api_request(self, data: Dict) -> Dict:
profile = self.data.profile.get_profile_preview(data["userId"], self.version)
async def handle_get_user_preview_api_request(self, data: Dict) -> Dict:
profile = await self.data.profile.get_profile_preview(data["userId"], self.version)
if profile is None:
return None
profile_character = self.data.item.get_character(
profile_character = await self.data.item.get_character(
data["userId"], profile["characterId"]
)
@ -164,8 +166,8 @@ class ChuniNew(ChuniBase):
}
return data1
def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
p = self.data.profile.get_profile_data(data["userId"], self.version)
async def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_data(data["userId"], self.version)
if p is None:
return {}
@ -177,17 +179,17 @@ class ChuniNew(ChuniBase):
"isLogin": False,
}
def handle_printer_login_api_request(self, data: Dict) -> Dict:
async def handle_printer_login_api_request(self, data: Dict) -> Dict:
return {"returnCode": 1}
def handle_printer_logout_api_request(self, data: Dict) -> Dict:
async def handle_printer_logout_api_request(self, data: Dict) -> Dict:
return {"returnCode": 1}
def handle_get_game_gacha_api_request(self, data: Dict) -> Dict:
async def handle_get_game_gacha_api_request(self, data: Dict) -> Dict:
"""
returns all current active banners (gachas)
"""
game_gachas = self.data.static.get_gachas(self.version)
game_gachas = await self.data.static.get_gachas(self.version)
# clean the database rows
game_gacha_list = []
@ -213,11 +215,11 @@ class ChuniNew(ChuniBase):
"registIdList": [],
}
def handle_get_game_gacha_card_by_id_api_request(self, data: Dict) -> Dict:
async def handle_get_game_gacha_card_by_id_api_request(self, data: Dict) -> Dict:
"""
returns all valid cards for a given gachaId
"""
game_gacha_cards = self.data.static.get_gacha_cards(data["gachaId"])
game_gacha_cards = await self.data.static.get_gacha_cards(data["gachaId"])
game_gacha_card_list = []
for gacha_card in game_gacha_cards:
@ -237,8 +239,8 @@ class ChuniNew(ChuniBase):
"ssrBookCalcList": [],
}
def handle_cm_get_user_data_api_request(self, data: Dict) -> Dict:
p = self.data.profile.get_profile_data(data["userId"], self.version)
async def handle_cm_get_user_data_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_data(data["userId"], self.version)
if p is None:
return {}
@ -262,8 +264,8 @@ class ChuniNew(ChuniBase):
],
}
def handle_get_user_gacha_api_request(self, data: Dict) -> Dict:
user_gachas = self.data.item.get_user_gachas(data["userId"])
async def handle_get_user_gacha_api_request(self, data: Dict) -> Dict:
user_gachas = await self.data.item.get_user_gachas(data["userId"])
if user_gachas is None:
return {"userId": data["userId"], "length": 0, "userGachaList": []}
@ -281,8 +283,8 @@ class ChuniNew(ChuniBase):
"userGachaList": user_gacha_list,
}
def handle_get_user_printed_card_api_request(self, data: Dict) -> Dict:
user_print_list = self.data.item.get_user_print_states(
async def handle_get_user_printed_card_api_request(self, data: Dict) -> Dict:
user_print_list = await self.data.item.get_user_print_states(
data["userId"], has_completed=True
)
if user_print_list is None:
@ -316,10 +318,10 @@ class ChuniNew(ChuniBase):
"userPrintedCardList": print_list,
}
def handle_get_user_card_print_error_api_request(self, data: Dict) -> Dict:
async def handle_get_user_card_print_error_api_request(self, data: Dict) -> Dict:
user_id = data["userId"]
user_print_states = self.data.item.get_user_print_states(
user_print_states = await self.data.item.get_user_print_states(
user_id, has_completed=False
)
@ -338,13 +340,13 @@ class ChuniNew(ChuniBase):
"userCardPrintStateList": card_print_state_list,
}
def handle_cm_get_user_character_api_request(self, data: Dict) -> Dict:
return super().handle_get_user_character_api_request(data)
async def handle_cm_get_user_character_api_request(self, data: Dict) -> Dict:
return await super().handle_get_user_character_api_request(data)
def handle_cm_get_user_item_api_request(self, data: Dict) -> Dict:
return super().handle_get_user_item_api_request(data)
async def handle_cm_get_user_item_api_request(self, data: Dict) -> Dict:
return await super().handle_get_user_item_api_request(data)
def handle_roll_gacha_api_request(self, data: Dict) -> Dict:
async def handle_roll_gacha_api_request(self, data: Dict) -> Dict:
"""
Handle a gacha roll API request, with:
gachaId: the gachaId where the cards should be pulled from
@ -362,14 +364,14 @@ class ChuniNew(ChuniBase):
# characterId should be returned
if chara_id != -1:
# get the
card = self.data.static.get_gacha_card_by_character(gacha_id, chara_id)
card = await self.data.static.get_gacha_card_by_character(gacha_id, chara_id)
tmp = card._asdict()
tmp.pop("id")
rolled_cards.append(tmp)
else:
gacha_cards = self.data.static.get_gacha_cards(gacha_id)
gacha_cards = await self.data.static.get_gacha_cards(gacha_id)
# get the card id for each roll
for _ in range(num_rolls):
@ -386,7 +388,7 @@ class ChuniNew(ChuniBase):
return {"length": len(rolled_cards), "gameGachaCardList": rolled_cards}
def handle_cm_upsert_user_gacha_api_request(self, data: Dict) -> Dict:
async def handle_cm_upsert_user_gacha_api_request(self, data: Dict) -> Dict:
upsert = data["cmUpsertUserGacha"]
user_id = data["userId"]
place_id = data["placeId"]
@ -396,7 +398,7 @@ class ChuniNew(ChuniBase):
user_data.pop("rankUpChallengeResults")
user_data.pop("userEmoney")
self.data.profile.put_profile_data(user_id, self.version, user_data)
await self.data.profile.put_profile_data(user_id, self.version, user_data)
# save the user gacha
user_gacha = upsert["userGacha"]
@ -404,16 +406,16 @@ class ChuniNew(ChuniBase):
user_gacha.pop("gachaId")
user_gacha.pop("dailyGachaDate")
self.data.item.put_user_gacha(user_id, gacha_id, user_gacha)
await self.data.item.put_user_gacha(user_id, gacha_id, user_gacha)
# save all user items
if "userItemList" in upsert:
for item in upsert["userItemList"]:
self.data.item.put_item(user_id, item)
await self.data.item.put_item(user_id, item)
# add every gameGachaCard to the database
for card in upsert["gameGachaCardList"]:
self.data.item.put_user_print_state(
await self.data.item.put_user_print_state(
user_id,
hasCompleted=False,
placeId=place_id,
@ -423,7 +425,7 @@ class ChuniNew(ChuniBase):
# retrieve every game gacha card which has been added in order to get
# the orderId for the next request
user_print_states = self.data.item.get_user_print_states_by_gacha(
user_print_states = await self.data.item.get_user_print_states_by_gacha(
user_id, gacha_id, has_completed=False
)
card_print_state_list = []
@ -441,7 +443,7 @@ class ChuniNew(ChuniBase):
"userCardPrintStateList": card_print_state_list,
}
def handle_cm_upsert_user_printlog_api_request(self, data: Dict) -> Dict:
async def handle_cm_upsert_user_printlog_api_request(self, data: Dict) -> Dict:
return {
"returnCode": 1,
"orderId": 0,
@ -449,7 +451,7 @@ class ChuniNew(ChuniBase):
"apiName": "CMUpsertUserPrintlogApi",
}
def handle_cm_upsert_user_print_api_request(self, data: Dict) -> Dict:
async def handle_cm_upsert_user_print_api_request(self, data: Dict) -> Dict:
user_print_detail = data["userPrintDetail"]
user_id = data["userId"]
@ -465,7 +467,7 @@ class ChuniNew(ChuniBase):
)
# add the entry to the user print table with the random serialId
self.data.item.put_user_print_detail(user_id, serial_id, user_print_detail)
await self.data.item.put_user_print_detail(user_id, serial_id, user_print_detail)
return {
"returnCode": 1,
@ -474,7 +476,7 @@ class ChuniNew(ChuniBase):
"apiName": "CMUpsertUserPrintApi",
}
def handle_cm_upsert_user_print_subtract_api_request(self, data: Dict) -> Dict:
async def handle_cm_upsert_user_print_subtract_api_request(self, data: Dict) -> Dict:
upsert = data["userCardPrintState"]
user_id = data["userId"]
place_id = data["placeId"]
@ -482,37 +484,37 @@ class ChuniNew(ChuniBase):
# save all user items
if "userItemList" in data:
for item in data["userItemList"]:
self.data.item.put_item(user_id, item)
await self.data.item.put_item(user_id, item)
# set the card print state to success and use the orderId as the key
self.data.item.put_user_print_state(
await self.data.item.put_user_print_state(
user_id, id=upsert["orderId"], hasCompleted=True
)
return {"returnCode": "1", "apiName": "CMUpsertUserPrintSubtractApi"}
def handle_cm_upsert_user_print_cancel_api_request(self, data: Dict) -> Dict:
async def handle_cm_upsert_user_print_cancel_api_request(self, data: Dict) -> Dict:
order_ids = data["orderIdList"]
user_id = data["userId"]
# set the card print state to success and use the orderId as the key
for order_id in order_ids:
self.data.item.put_user_print_state(user_id, id=order_id, hasCompleted=True)
await self.data.item.put_user_print_state(user_id, id=order_id, hasCompleted=True)
return {"returnCode": "1", "apiName": "CMUpsertUserPrintCancelApi"}
def handle_ping_request(self, data: Dict) -> Dict:
async def handle_ping_request(self, data: Dict) -> Dict:
# matchmaking ping request
return {"returnCode": "1"}
def handle_begin_matching_api_request(self, data: Dict) -> Dict:
async def handle_begin_matching_api_request(self, data: Dict) -> Dict:
room_id = 1
# check if there is a free matching room
matching_room = self.data.item.get_oldest_free_matching(self.version)
matching_room = await self.data.item.get_oldest_free_matching(self.version)
if matching_room is None:
# grab the latest roomId and add 1 for the new room
newest_matching = self.data.item.get_newest_matching(self.version)
newest_matching = await self.data.item.get_newest_matching(self.version)
if newest_matching is not None:
room_id = newest_matching["roomId"] + 1
@ -522,12 +524,12 @@ class ChuniNew(ChuniBase):
# create the new room with room_id and the current user id (host)
# user id is required for the countdown later on
self.data.item.put_matching(
await self.data.item.put_matching(
self.version, room_id, [new_member], user_id=new_member["userId"]
)
# get the newly created matching room
matching_room = self.data.item.get_matching(self.version, room_id)
matching_room = await self.data.item.get_matching(self.version, room_id)
else:
# a room already exists, so just add the new member to it
matching_member_list = matching_room["matchingMemberInfoList"]
@ -537,7 +539,7 @@ class ChuniNew(ChuniBase):
matching_member_list.append(new_member)
# add the updated room to the database, make sure to set isFull correctly!
self.data.item.put_matching(
await self.data.item.put_matching(
self.version,
matching_room["roomId"],
matching_member_list,
@ -554,8 +556,8 @@ class ChuniNew(ChuniBase):
return {"roomId": 1, "matchingWaitState": matching_wait}
def handle_end_matching_api_request(self, data: Dict) -> Dict:
matching_room = self.data.item.get_matching(self.version, data["roomId"])
async def handle_end_matching_api_request(self, data: Dict) -> Dict:
matching_room = await self.data.item.get_matching(self.version, data["roomId"])
members = matching_room["matchingMemberInfoList"]
# only set the host user to role 1 and every other member to 0?
@ -564,7 +566,7 @@ class ChuniNew(ChuniBase):
for m in members
]
self.data.item.put_matching(
await self.data.item.put_matching(
self.version,
matching_room["roomId"],
members,
@ -579,13 +581,13 @@ class ChuniNew(ChuniBase):
# no idea, maybe to differentiate between CPUs and real players?
"matchingMemberRoleList": role_list,
# TCP/UDP connection?
"reflectorUri": f"{self.core_cfg.title.hostname}",
"reflectorUri": f"{self.core_cfg.server.hostname}",
}
def handle_remove_matching_member_api_request(self, data: Dict) -> Dict:
async def handle_remove_matching_member_api_request(self, data: Dict) -> Dict:
# get all matching rooms, because Chuni only returns the userId
# not the actual roomId
matching_rooms = self.data.item.get_all_matchings(self.version)
matching_rooms = await self.data.item.get_all_matchings(self.version)
if matching_rooms is None:
return {"returnCode": "1"}
@ -599,10 +601,10 @@ class ChuniNew(ChuniBase):
# if the last user got removed, delete the matching room
if len(new_members) <= 0:
self.data.item.delete_matching(self.version, room["roomId"])
await self.data.item.delete_matching(self.version, room["roomId"])
else:
# remove the user from the room
self.data.item.put_matching(
await self.data.item.put_matching(
self.version,
room["roomId"],
new_members,
@ -612,10 +614,10 @@ class ChuniNew(ChuniBase):
return {"returnCode": "1"}
def handle_get_matching_state_api_request(self, data: Dict) -> Dict:
async def handle_get_matching_state_api_request(self, data: Dict) -> Dict:
polling_interval = 1
# get the current active room
matching_room = self.data.item.get_matching(self.version, data["roomId"])
matching_room = await self.data.item.get_matching(self.version, data["roomId"])
members = matching_room["matchingMemberInfoList"]
rest_sec = matching_room["restMSec"]
@ -638,7 +640,7 @@ class ChuniNew(ChuniBase):
current_member["userName"] = self.read_wtf8(current_member["userName"])
members[i] = current_member
self.data.item.put_matching(
await self.data.item.put_matching(
self.version,
data["roomId"],
members,


@ -11,8 +11,8 @@ class ChuniNewPlus(ChuniNew):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_NEW_PLUS
def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = super().handle_cm_get_user_preview_api_request(data)
async def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = await super().handle_cm_get_user_preview_api_request(data)
# hardcode lastDataVersion for CardMaker 1.35 A028
user_data["lastDataVersion"] = "2.05.00"


@ -13,7 +13,7 @@ class ChuniParadise(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_PARADISE
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.50.00"
return ret


@ -11,7 +11,7 @@ class ChuniPlus(ChuniBase):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_PLUS
def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = super().handle_get_game_setting_api_request(data)
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
ret = await super().handle_get_game_setting_api_request(data)
ret["gameSetting"]["dataVersion"] = "1.05.00"
return ret


@ -28,7 +28,7 @@ class ChuniReader(BaseReader):
self.logger.error(f"Invalid chunithm version {version}")
exit(1)
def read(self) -> None:
async def read(self) -> None:
data_dirs = []
if self.bin_dir is not None:
data_dirs += self.get_data_directories(self.bin_dir)
@ -38,19 +38,18 @@ class ChuniReader(BaseReader):
for dir in data_dirs:
self.logger.info(f"Read from {dir}")
self.read_events(f"{dir}/event")
self.read_music(f"{dir}/music")
self.read_charges(f"{dir}/chargeItem")
self.read_avatar(f"{dir}/avatarAccessory")
self.read_login_bonus(f"{dir}/")
await self.read_events(f"{dir}/event")
await self.read_music(f"{dir}/music")
await self.read_charges(f"{dir}/chargeItem")
await self.read_avatar(f"{dir}/avatarAccessory")
await self.read_login_bonus(f"{dir}/")
def read_login_bonus(self, root_dir: str) -> None:
async def read_login_bonus(self, root_dir: str) -> None:
for root, dirs, files in walk(f"{root_dir}loginBonusPreset"):
for dir in dirs:
if path.exists(f"{root}/{dir}/LoginBonusPreset.xml"):
with open(f"{root}/{dir}/LoginBonusPreset.xml", "rb") as fp:
bytedata = fp.read()
strdata = bytedata.decode("UTF-8")
with open(f"{root}/{dir}/LoginBonusPreset.xml", "r", encoding="utf-8") as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
@ -60,7 +59,7 @@ class ChuniReader(BaseReader):
True if xml_root.find("disableFlag").text == "false" else False
)
result = self.data.static.put_login_bonus_preset(
result = await self.data.static.put_login_bonus_preset(
self.version, id, name, is_enabled
)
@ -98,7 +97,7 @@ class ChuniReader(BaseReader):
bonus_root.find("loginBonusCategoryType").text
)
result = self.data.static.put_login_bonus(
result = await self.data.static.put_login_bonus(
self.version,
id,
bonus_id,
@ -117,13 +116,12 @@ class ChuniReader(BaseReader):
f"Failed to insert login bonus {bonus_id}"
)
def read_events(self, evt_dir: str) -> None:
async def read_events(self, evt_dir: str) -> None:
for root, dirs, files in walk(evt_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/Event.xml"):
with open(f"{root}/{dir}/Event.xml", "rb") as fp:
bytedata = fp.read()
strdata = bytedata.decode("UTF-8")
with open(f"{root}/{dir}/Event.xml", "r", encoding="utf-8") as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
@ -132,7 +130,7 @@ class ChuniReader(BaseReader):
for substances in xml_root.findall("substances"):
event_type = substances.find("type").text
result = self.data.static.put_event(
result = await self.data.static.put_event(
self.version, id, event_type, name
)
if result is not None:
@ -140,13 +138,12 @@ class ChuniReader(BaseReader):
else:
self.logger.warning(f"Failed to insert event {id}")
def read_music(self, music_dir: str) -> None:
async def read_music(self, music_dir: str) -> None:
for root, dirs, files in walk(music_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/Music.xml"):
with open(f"{root}/{dir}/Music.xml", "rb") as fp:
bytedata = fp.read()
strdata = bytedata.decode("UTF-8")
with open(f"{root}/{dir}/Music.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
@ -185,7 +182,7 @@ class ChuniReader(BaseReader):
)
we_chara = None
result = self.data.static.put_music(
result = await self.data.static.put_music(
self.version,
song_id,
chart_id,
@ -206,13 +203,12 @@ class ChuniReader(BaseReader):
f"Failed to insert music {song_id} chart {chart_id}"
)
def read_charges(self, charge_dir: str) -> None:
async def read_charges(self, charge_dir: str) -> None:
for root, dirs, files in walk(charge_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/ChargeItem.xml"):
with open(f"{root}/{dir}/ChargeItem.xml", "rb") as fp:
bytedata = fp.read()
strdata = bytedata.decode("UTF-8")
with open(f"{root}/{dir}/ChargeItem.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
@ -222,7 +218,7 @@ class ChuniReader(BaseReader):
consumeType = xml_root.find("consumeType").text
sellingAppeal = bool(xml_root.find("sellingAppeal").text)
result = self.data.static.put_charge(
result = await self.data.static.put_charge(
self.version,
id,
name,
@ -236,13 +232,12 @@ class ChuniReader(BaseReader):
else:
self.logger.warning(f"Failed to insert charge {id}")
def read_avatar(self, avatar_dir: str) -> None:
async def read_avatar(self, avatar_dir: str) -> None:
for root, dirs, files in walk(avatar_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/AvatarAccessory.xml"):
with open(f"{root}/{dir}/AvatarAccessory.xml", "rb") as fp:
bytedata = fp.read()
strdata = bytedata.decode("UTF-8")
with open(f"{root}/{dir}/AvatarAccessory.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
@ -254,7 +249,7 @@ class ChuniReader(BaseReader):
for texture in xml_root.findall("texture"):
texturePath = texture.find("path").text
result = self.data.static.put_avatar(
result = await self.data.static.put_avatar(
self.version, id, name, category, iconPath, texturePath
)


@ -243,9 +243,39 @@ matching = Table(
mysql_charset="utf8mb4",
)
cmission = Table(
"chuni_item_cmission",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("missionId", Integer, nullable=False),
Column("point", Integer),
UniqueConstraint("user", "missionId", name="chuni_item_cmission_uk"),
mysql_charset="utf8mb4",
)
cmission_progress = Table(
"chuni_item_cmission_progress",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("user", ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column("missionId", Integer, nullable=False),
Column("order", Integer),
Column("stage", Integer),
Column("progress", Integer),
UniqueConstraint(
"user", "missionId", "order", name="chuni_item_cmission_progress_uk"
),
mysql_charset="utf8mb4",
)
class ChuniItemData(BaseData):
def get_oldest_free_matching(self, version: int) -> Optional[Row]:
async def get_oldest_free_matching(self, version: int) -> Optional[Row]:
sql = matching.select(
and_(
matching.c.version == version,
@ -253,46 +283,46 @@ class ChuniItemData(BaseData):
)
).order_by(matching.c.roomId.asc())
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_newest_matching(self, version: int) -> Optional[Row]:
async def get_newest_matching(self, version: int) -> Optional[Row]:
sql = matching.select(
and_(
matching.c.version == version
)
).order_by(matching.c.roomId.desc())
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_all_matchings(self, version: int) -> Optional[List[Row]]:
async def get_all_matchings(self, version: int) -> Optional[List[Row]]:
sql = matching.select(
and_(
matching.c.version == version
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_matching(self, version: int, room_id: int) -> Optional[Row]:
async def get_matching(self, version: int, room_id: int) -> Optional[Row]:
sql = matching.select(
and_(matching.c.version == version, matching.c.roomId == room_id)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_matching(
async def put_matching(
self,
version: int,
room_id: int,
@ -314,22 +344,22 @@ class ChuniItemData(BaseData):
restMSec=rest_sec, matchingMemberInfoList=matching_member_info_list
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def delete_matching(self, version: int, room_id: int):
async def delete_matching(self, version: int, room_id: int):
sql = delete(matching).where(
and_(matching.c.roomId == room_id, matching.c.version == version)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
def get_all_favorites(
async def get_all_favorites(
self, user_id: int, version: int, fav_kind: int = 1
) -> Optional[List[Row]]:
sql = favorite.select(
@ -340,12 +370,12 @@ class ChuniItemData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_login_bonus(
async def put_login_bonus(
self, user_id: int, version: int, preset_id: int, **login_bonus_data
) -> Optional[int]:
sql = insert(login_bonus).values(
@ -354,12 +384,12 @@ class ChuniItemData(BaseData):
conflict = sql.on_duplicate_key_update(presetId=preset_id, **login_bonus_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_all_login_bonus(
async def get_all_login_bonus(
self, user_id: int, version: int, is_finished: bool = False
) -> Optional[List[Row]]:
sql = login_bonus.select(
@ -370,12 +400,12 @@ class ChuniItemData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_login_bonus(
async def get_login_bonus(
self, user_id: int, version: int, preset_id: int
) -> Optional[Row]:
sql = login_bonus.select(
@ -386,12 +416,12 @@ class ChuniItemData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_character(self, user_id: int, character_data: Dict) -> Optional[int]:
async def put_character(self, user_id: int, character_data: Dict) -> Optional[int]:
character_data["user"] = user_id
character_data = self.fix_bools(character_data)
@ -399,30 +429,30 @@ class ChuniItemData(BaseData):
sql = insert(character).values(**character_data)
conflict = sql.on_duplicate_key_update(**character_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_character(self, user_id: int, character_id: int) -> Optional[Dict]:
async def get_character(self, user_id: int, character_id: int) -> Optional[Dict]:
sql = select(character).where(
and_(character.c.user == user_id, character.c.characterId == character_id)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_characters(self, user_id: int) -> Optional[List[Row]]:
async def get_characters(self, user_id: int) -> Optional[List[Row]]:
sql = select(character).where(character.c.user == user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_item(self, user_id: int, item_data: Dict) -> Optional[int]:
async def put_item(self, user_id: int, item_data: Dict) -> Optional[int]:
item_data["user"] = user_id
item_data = self.fix_bools(item_data)
@ -430,12 +460,12 @@ class ChuniItemData(BaseData):
sql = insert(item).values(**item_data)
conflict = sql.on_duplicate_key_update(**item_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_items(self, user_id: int, kind: int = None) -> Optional[List[Row]]:
async def get_items(self, user_id: int, kind: int = None) -> Optional[List[Row]]:
if kind is None:
sql = select(item).where(item.c.user == user_id)
else:
@ -443,12 +473,12 @@ class ChuniItemData(BaseData):
and_(item.c.user == user_id, item.c.itemKind == kind)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_duel(self, user_id: int, duel_data: Dict) -> Optional[int]:
async def put_duel(self, user_id: int, duel_data: Dict) -> Optional[int]:
duel_data["user"] = user_id
duel_data = self.fix_bools(duel_data)
@ -456,20 +486,20 @@ class ChuniItemData(BaseData):
sql = insert(duel).values(**duel_data)
conflict = sql.on_duplicate_key_update(**duel_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_duels(self, user_id: int) -> Optional[List[Row]]:
async def get_duels(self, user_id: int) -> Optional[List[Row]]:
sql = select(duel).where(duel.c.user == user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_map(self, user_id: int, map_data: Dict) -> Optional[int]:
async def put_map(self, user_id: int, map_data: Dict) -> Optional[int]:
map_data["user"] = user_id
map_data = self.fix_bools(map_data)
@ -477,20 +507,20 @@ class ChuniItemData(BaseData):
sql = insert(map).values(**map_data)
conflict = sql.on_duplicate_key_update(**map_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_maps(self, user_id: int) -> Optional[List[Row]]:
async def get_maps(self, user_id: int) -> Optional[List[Row]]:
sql = select(map).where(map.c.user == user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_map_area(self, user_id: int, map_area_data: Dict) -> Optional[int]:
async def put_map_area(self, user_id: int, map_area_data: Dict) -> Optional[int]:
map_area_data["user"] = user_id
map_area_data = self.fix_bools(map_area_data)
@ -498,28 +528,28 @@ class ChuniItemData(BaseData):
sql = insert(map_area).values(**map_area_data)
conflict = sql.on_duplicate_key_update(**map_area_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_map_areas(self, user_id: int) -> Optional[List[Row]]:
async def get_map_areas(self, user_id: int) -> Optional[List[Row]]:
sql = select(map_area).where(map_area.c.user == user_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_user_gachas(self, aime_id: int) -> Optional[List[Row]]:
async def get_user_gachas(self, aime_id: int) -> Optional[List[Row]]:
sql = gacha.select(gacha.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_user_gacha(
async def put_user_gacha(
self, aime_id: int, gacha_id: int, gacha_data: Dict
) -> Optional[int]:
sql = insert(gacha).values(user=aime_id, gachaId=gacha_id, **gacha_data)
@ -527,14 +557,14 @@ class ChuniItemData(BaseData):
conflict = sql.on_duplicate_key_update(
user=aime_id, gachaId=gacha_id, **gacha_data
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(f"put_user_gacha: Failed to insert! aime_id: {aime_id}")
return None
return result.lastrowid
def get_user_print_states(
async def get_user_print_states(
self, aime_id: int, has_completed: bool = False
) -> Optional[List[Row]]:
sql = print_state.select(
@ -544,12 +574,12 @@ class ChuniItemData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_user_print_states_by_gacha(
async def get_user_print_states_by_gacha(
self, aime_id: int, gacha_id: int, has_completed: bool = False
) -> Optional[List[Row]]:
sql = print_state.select(
@ -560,16 +590,16 @@ class ChuniItemData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_user_print_state(self, aime_id: int, **print_data) -> Optional[int]:
async def put_user_print_state(self, aime_id: int, **print_data) -> Optional[int]:
sql = insert(print_state).values(user=aime_id, **print_data)
conflict = sql.on_duplicate_key_update(user=aime_id, **print_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
@ -578,7 +608,7 @@ class ChuniItemData(BaseData):
return None
return result.lastrowid
def put_user_print_detail(
async def put_user_print_detail(
self, aime_id: int, serial_id: str, user_print_data: Dict
) -> Optional[int]:
sql = insert(print_detail).values(
@ -586,7 +616,7 @@ class ChuniItemData(BaseData):
)
conflict = sql.on_duplicate_key_update(user=aime_id, **user_print_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
@ -594,3 +624,66 @@ class ChuniItemData(BaseData):
)
return None
return result.lastrowid
async def put_cmission_progress(
self, user_id: int, mission_id: int, progress_data: Dict
) -> Optional[int]:
progress_data["user"] = user_id
progress_data["missionId"] = mission_id
sql = insert(cmission_progress).values(**progress_data)
conflict = sql.on_duplicate_key_update(**progress_data)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
async def get_cmission_progress(
self, user_id: int, mission_id: int
) -> Optional[List[Row]]:
sql = cmission_progress.select(
and_(
cmission_progress.c.user == user_id,
cmission_progress.c.missionId == mission_id,
)
).order_by(cmission_progress.c.order.asc())
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
async def get_cmission(self, user_id: int, mission_id: int) -> Optional[Row]:
sql = cmission.select(
and_(cmission.c.user == user_id, cmission.c.missionId == mission_id)
)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
async def put_cmission(self, user_id: int, mission_data: Dict) -> Optional[int]:
mission_data["user"] = user_id
sql = insert(cmission).values(**mission_data)
conflict = sql.on_duplicate_key_update(**mission_data)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
async def get_cmissions(self, user_id: int) -> Optional[List[Row]]:
sql = cmission.select(cmission.c.user == user_id)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
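A short usage sketch of the new cmission accessors defined above, assuming an async context and an existing ChuniItemData instance; the mission ID and values are placeholders.

```python
# Illustration only: the mission ID and values are placeholders.
async def import_example_mission(item, user_id: int) -> None:
    # One chuni_item_cmission row per (user, mission) holding the point total...
    await item.put_cmission(user_id, {"missionId": 5001, "point": 100})
    # ...and one chuni_item_cmission_progress row per (user, mission, order).
    await item.put_cmission_progress(user_id, 5001, {"order": 1, "stage": 1, "progress": 0})

    # handle_get_user_c_mission_api_request reads these back like so:
    mission = await item.get_cmission(user_id, 5001)
    progress_rows = await item.get_cmission_progress(user_id, 5001)
    print(mission["point"], len(progress_rows))
```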


@ -1,10 +1,10 @@
import json
from typing import Dict, List, Optional
from sqlalchemy import Table, Column, UniqueConstraint, PrimaryKeyConstraint, and_
from sqlalchemy.types import Integer, String, TIMESTAMP, Boolean, JSON, BigInteger
from sqlalchemy.engine.base import Connection
from sqlalchemy import Table, Column, UniqueConstraint, and_
from sqlalchemy.types import Integer, String, Boolean, JSON, BigInteger
from sqlalchemy.schema import ForeignKey
from sqlalchemy.engine import Row
from sqlalchemy.sql import func, select
from sqlalchemy.sql import select, delete
from sqlalchemy.dialects.mysql import insert
from core.data.schema import BaseData, metadata
@ -390,12 +390,56 @@ team = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column("teamName", String(255)),
Column("teamPoint", Integer),
Column("userTeamPoint", JSON),
mysql_charset="utf8mb4",
)
rating = Table(
"chuni_profile_rating",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("version", Integer, nullable=False),
Column("type", String(255), nullable=False),
Column("index", Integer, nullable=False),
Column("musicId", Integer),
Column("difficultId", Integer),
Column("romVersionCode", Integer),
Column("score", Integer),
UniqueConstraint("user", "version", "type", "index", name="chuni_profile_rating_best_uk"),
mysql_charset="utf8mb4",
)
net_battle = Table(
"chuni_profile_net_battle",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("user", Integer, ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"), nullable=False, unique=True),
Column("isRankUpChallengeFailed", Boolean),
Column("highestBattleRankId", Integer),
Column("battleIconId", Integer),
Column("battleIconNum", Integer),
Column("avatarEffectPoint", Integer),
mysql_charset="utf8mb4",
)
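The LUMINOUS handler shown earlier calls self.data.profile.get_net_battle(user_id) against this table. Below is a hedged sketch of such accessors as methods on ChuniProfileData, mirroring the select/upsert pattern used throughout this file; the real implementation in ARTEMiS may differ.

```python
# Hedged sketch only; method bodies mirror the surrounding schema class.
async def get_net_battle(self, aime_id: int) -> Optional[Row]:
    sql = select(net_battle).where(net_battle.c.user == aime_id)

    result = await self.execute(sql)
    if result is None:
        return None
    return result.fetchone()

async def put_net_battle(self, aime_id: int, net_battle_data: Dict) -> Optional[int]:
    net_battle_data["user"] = aime_id

    sql = insert(net_battle).values(**net_battle_data)
    conflict = sql.on_duplicate_key_update(**net_battle_data)

    result = await self.execute(conflict)
    if result is None:
        self.logger.warning(f"put_net_battle: failed to upsert! aime_id: {aime_id}")
        return None
    return result.lastrowid
```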
class ChuniProfileData(BaseData):
def put_profile_data(
async def update_name(self, user_id: int, new_name: str) -> bool:
sql = profile.update(profile.c.user == user_id).values(
userName=new_name
)
result = await self.execute(sql)
if result is None:
self.logger.warning(f"Failed to set user {user_id} name to {new_name}")
return False
return True
async def put_profile_data(
self, aime_id: int, version: int, profile_data: Dict
) -> Optional[int]:
profile_data["user"] = aime_id
@ -407,26 +451,26 @@ class ChuniProfileData(BaseData):
sql = insert(profile).values(**profile_data)
conflict = sql.on_duplicate_key_update(**profile_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(f"put_profile_data: Failed to update! aime_id: {aime_id}")
return None
return result.lastrowid
def get_profile_preview(self, aime_id: int, version: int) -> Optional[Row]:
async def get_profile_preview(self, aime_id: int, version: int) -> Optional[Row]:
sql = (
select([profile, option])
.join(option, profile.c.user == option.c.user)
.filter(and_(profile.c.user == aime_id, profile.c.version <= version))
).order_by(profile.c.version.desc())
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_profile_data(self, aime_id: int, version: int) -> Optional[Row]:
async def get_profile_data(self, aime_id: int, version: int) -> Optional[Row]:
sql = select(profile).where(
and_(
profile.c.user == aime_id,
@ -434,12 +478,12 @@ class ChuniProfileData(BaseData):
)
).order_by(profile.c.version.desc())
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_profile_data_ex(
async def put_profile_data_ex(
self, aime_id: int, version: int, profile_ex_data: Dict
) -> Optional[int]:
profile_ex_data["user"] = aime_id
@ -449,7 +493,7 @@ class ChuniProfileData(BaseData):
sql = insert(profile_ex).values(**profile_ex_data)
conflict = sql.on_duplicate_key_update(**profile_ex_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
@ -458,7 +502,7 @@ class ChuniProfileData(BaseData):
return None
return result.lastrowid
def get_profile_data_ex(self, aime_id: int, version: int) -> Optional[Row]:
async def get_profile_data_ex(self, aime_id: int, version: int) -> Optional[Row]:
sql = select(profile_ex).where(
and_(
profile_ex.c.user == aime_id,
@ -466,17 +510,17 @@ class ChuniProfileData(BaseData):
)
).order_by(profile_ex.c.version.desc())
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_profile_option(self, aime_id: int, option_data: Dict) -> Optional[int]:
async def put_profile_option(self, aime_id: int, option_data: Dict) -> Optional[int]:
option_data["user"] = aime_id
sql = insert(option).values(**option_data)
conflict = sql.on_duplicate_key_update(**option_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
@ -485,22 +529,22 @@ class ChuniProfileData(BaseData):
return None
return result.lastrowid
def get_profile_option(self, aime_id: int) -> Optional[Row]:
async def get_profile_option(self, aime_id: int) -> Optional[Row]:
sql = select(option).where(option.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_profile_option_ex(
async def put_profile_option_ex(
self, aime_id: int, option_ex_data: Dict
) -> Optional[int]:
option_ex_data["user"] = aime_id
sql = insert(option_ex).values(**option_ex_data)
conflict = sql.on_duplicate_key_update(**option_ex_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
@ -509,15 +553,15 @@ class ChuniProfileData(BaseData):
return None
return result.lastrowid
def get_profile_option_ex(self, aime_id: int) -> Optional[Row]:
async def get_profile_option_ex(self, aime_id: int) -> Optional[Row]:
sql = select(option_ex).where(option_ex.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_profile_recent_rating(
async def put_profile_recent_rating(
self, aime_id: int, recent_rating_data: List[Dict]
) -> Optional[int]:
sql = insert(recent_rating).values(
@ -525,7 +569,7 @@ class ChuniProfileData(BaseData):
)
conflict = sql.on_duplicate_key_update(recentRating=recent_rating_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
f"put_profile_recent_rating: Failed to update! aime_id: {aime_id}"
@ -533,15 +577,15 @@ class ChuniProfileData(BaseData):
return None
return result.lastrowid
def get_profile_recent_rating(self, aime_id: int) -> Optional[Row]:
async def get_profile_recent_rating(self, aime_id: int) -> Optional[Row]:
sql = select(recent_rating).where(recent_rating.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_profile_activity(self, aime_id: int, activity_data: Dict) -> Optional[int]:
async def put_profile_activity(self, aime_id: int, activity_data: Dict) -> Optional[int]:
# The game just uses "id" but we need to distinguish that from the db column "id"
activity_data["user"] = aime_id
activity_data["activityId"] = activity_data["id"]
@ -549,7 +593,7 @@ class ChuniProfileData(BaseData):
sql = insert(activity).values(**activity_data)
conflict = sql.on_duplicate_key_update(**activity_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
@ -558,24 +602,24 @@ class ChuniProfileData(BaseData):
return None
return result.lastrowid
def get_profile_activity(self, aime_id: int, kind: int) -> Optional[List[Row]]:
async def get_profile_activity(self, aime_id: int, kind: int) -> Optional[List[Row]]:
sql = (
select(activity)
.where(and_(activity.c.user == aime_id, activity.c.kind == kind))
.order_by(activity.c.sortNumber.desc()) # to get the last played track
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_profile_charge(self, aime_id: int, charge_data: Dict) -> Optional[int]:
async def put_profile_charge(self, aime_id: int, charge_data: Dict) -> Optional[int]:
charge_data["user"] = aime_id
sql = insert(charge).values(**charge_data)
conflict = sql.on_duplicate_key_update(**charge_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(
@ -584,40 +628,40 @@ class ChuniProfileData(BaseData):
return None
return result.lastrowid
def get_profile_charge(self, aime_id: int) -> Optional[List[Row]]:
async def get_profile_charge(self, aime_id: int) -> Optional[List[Row]]:
sql = select(charge).where(charge.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def add_profile_region(self, aime_id: int, region_id: int) -> Optional[int]:
async def add_profile_region(self, aime_id: int, region_id: int) -> Optional[int]:
pass
def get_profile_regions(self, aime_id: int) -> Optional[List[Row]]:
async def get_profile_regions(self, aime_id: int) -> Optional[List[Row]]:
pass
def put_profile_emoney(self, aime_id: int, emoney_data: Dict) -> Optional[int]:
async def put_profile_emoney(self, aime_id: int, emoney_data: Dict) -> Optional[int]:
emoney_data["user"] = aime_id
sql = insert(emoney).values(**emoney_data)
conflict = sql.on_duplicate_key_update(**emoney_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_profile_emoney(self, aime_id: int) -> Optional[List[Row]]:
async def get_profile_emoney(self, aime_id: int) -> Optional[List[Row]]:
sql = select(emoney).where(emoney.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_profile_overpower(
async def put_profile_overpower(
self, aime_id: int, overpower_data: Dict
) -> Optional[int]:
overpower_data["user"] = aime_id
@ -625,31 +669,31 @@ class ChuniProfileData(BaseData):
sql = insert(overpower).values(**overpower_data)
conflict = sql.on_duplicate_key_update(**overpower_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_profile_overpower(self, aime_id: int) -> Optional[List[Row]]:
async def get_profile_overpower(self, aime_id: int) -> Optional[List[Row]]:
sql = select(overpower).where(overpower.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_team_by_id(self, team_id: int) -> Optional[Row]:
async def get_team_by_id(self, team_id: int) -> Optional[Row]:
sql = select(team).where(team.c.id == team_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_team_rank(self, team_id: int) -> int:
async def get_team_rank(self, team_id: int) -> int:
# Normal ranking system, likely the one used in the real servers
# Query all teams sorted by 'teamPoint'
result = self.execute(
result = await self.execute(
select(team.c.id).order_by(team.c.teamPoint.desc())
)
@ -663,16 +707,40 @@ class ChuniProfileData(BaseData):
# Return the rank if found, or a default rank otherwise
return rank if rank is not None else 0
# RIP scaled team ranking. Gone, but forgotten
# def get_team_rank_scaled(self, team_id: int) -> int:
def update_team(self, team_id: int, team_data: Dict) -> bool:
async def update_team(self, team_id: int, team_data: Dict, user_id: str, user_point_delta: int) -> bool:
# Update the team data
team_data["id"] = team_id
existing_team = self.get_team_by_id(team_id)
if existing_team is None or "userTeamPoint" not in existing_team:
self.logger.warn(
f"update_team: Failed to update team! team id: {team_id}. Existing team data not found."
)
return False
user_team_point_data = []
if existing_team["userTeamPoint"] is not None and existing_team["userTeamPoint"] != "":
user_team_point_data = json.loads(existing_team["userTeamPoint"])
updated = False
# Try to find the user in the existing data and update their points
for user_point_data in user_team_point_data:
if user_point_data["user"] == user_id:
user_point_data["userPoint"] = str(int(user_point_delta))
updated = True
break
# If the user was not found, add them to the data with the new points
if not updated:
user_team_point_data.append({"user": user_id, "userPoint": str(user_point_delta)})
# Update the team's userTeamPoint field in the team data
team_data["userTeamPoint"] = json.dumps(user_team_point_data)
# Update the team in the database
sql = insert(team).values(**team_data)
conflict = sql.on_duplicate_key_update(**team_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warn(
@ -680,16 +748,17 @@ class ChuniProfileData(BaseData):
)
return False
return True
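
The reworked `update_team` now takes the contributing user and their point total and folds them into the JSON stored in `userTeamPoint`. A minimal usage sketch, not part of this diff; the helper name, the point values, and the way the caller accumulates `teamPoint` are assumptions:

```python
async def award_team_points(profile_db, team_id: int, aime_id: int, points: int) -> bool:
    # profile_db is assumed to be a ChuniProfileData instance
    team_row = await profile_db.get_team_by_id(team_id)
    if team_row is None:
        return False

    team_data = {
        "teamName": team_row["teamName"],
        # the caller is assumed to keep the running team total itself
        "teamPoint": int(team_row["teamPoint"] or 0) + points,
    }
    # update_team rewrites this user's entry inside the userTeamPoint JSON,
    # stored as [{"user": <aime_id>, "userPoint": "<points>"}, ...]
    return await profile_db.update_team(team_id, team_data, aime_id, points)
```
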
def get_rival(self, rival_id: int) -> Optional[Row]:
async def get_rival(self, rival_id: int) -> Optional[Row]:
sql = select(profile).where(profile.c.user == rival_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_overview(self) -> Dict:
async def get_overview(self) -> Dict:
# Fetch and add up all the playcounts
playcount_sql = self.execute(select(profile.c.playCount))
playcount_sql = await self.execute(select(profile.c.playCount))
if playcount_sql is None:
self.logger.warn(
@ -697,9 +766,84 @@ class ChuniProfileData(BaseData):
)
return 0
total_play_count = 0;
total_play_count = 0
for row in playcount_sql:
total_play_count += row[0]
return {
"total_play_count": total_play_count
}
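
`get_overview` simply sums `playCount` across every stored profile. A tiny consumption sketch, with the wrapper name assumed:

```python
async def total_plays(profile_db) -> int:
    overview = await profile_db.get_overview()
    # on success this is {"total_play_count": <sum of playCount over all rows>};
    # note the method itself falls back to a bare 0 if the query fails
    if not isinstance(overview, dict):
        return 0
    return overview["total_play_count"]
```
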
async def put_profile_rating(
self,
aime_id: int,
version: int,
rating_type: str,
rating_data: List[Dict],
):
inserted_values = [
{"user": aime_id, "version": version, "type": rating_type, "index": i, **x}
for (i, x) in enumerate(rating_data)
]
sql = insert(rating).values(inserted_values)
update_dict = {x.name: x for x in sql.inserted if x.name != "id"}
sql = sql.on_duplicate_key_update(**update_dict)
result = await self.execute(sql)
if result is None:
self.logger.warn(
f"put_profile_rating: Could not insert {rating_type}, aime_id: {aime_id}",
)
return
return result.lastrowid
async def get_profile_rating(self, aime_id: int, version: int) -> Optional[List[Row]]:
sql = select(rating).where(and_(
rating.c.user == aime_id,
rating.c.version <= version,
))
result = await self.execute(sql)
if result is None:
self.logger.warning(f"Rating of user {aime_id}, version {version} was None")
return None
return result.fetchall()
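
`put_profile_rating` bulk-upserts one row per list entry, keyed by user, version, type, and list index. A hedged sketch of how an upsert handler could feed it; the `rating_type` strings used here are assumptions, not taken from this diff:

```python
async def save_rating_lists(profile_db, aime_id: int, version: int, upsert: dict) -> None:
    # the keys below are illustrative; any list the game sends can be stored
    # under its own rating_type string
    for rating_type in ("userRatingBaseList", "userRatingBaseHotList", "userRatingBaseNextList"):
        entries = upsert.get(rating_type)
        if entries:
            # each entry keeps its position in the list via the "index" column
            await profile_db.put_profile_rating(aime_id, version, rating_type, entries)
```
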
async def get_all_profile_versions(self, aime_id: int) -> Optional[List[Row]]:
sql = select([profile.c.version]).where(profile.c.user == aime_id)
result = await self.execute(sql)
if result is None:
self.logger.warning(f"user {aime_id}, has no profile")
return None
else:
versions_raw = result.fetchall()
versions = [row[0] for row in versions_raw]
return sorted(versions, reverse=True)
async def put_net_battle(self, aime_id: int, net_battle_data: Dict) -> Optional[int]:
sql = insert(net_battle).values(
user=aime_id,
isRankUpChallengeFailed=net_battle_data['isRankUpChallengeFailed'],
highestBattleRankId=net_battle_data['highestBattleRankId'],
battleIconId=net_battle_data['battleIconId'],
battleIconNum=net_battle_data['battleIconNum'],
avatarEffectPoint=net_battle_data['avatarEffectPoint'],
)
conflict = sql.on_duplicate_key_update(
isRankUpChallengeFailed=net_battle_data['isRankUpChallengeFailed'],
highestBattleRankId=net_battle_data['highestBattleRankId'],
battleIconId=net_battle_data['battleIconId'],
battleIconNum=net_battle_data['battleIconNum'],
avatarEffectPoint=net_battle_data['avatarEffectPoint'],
)
result = await self.execute(conflict)
if result:
return result.inserted_primary_key['id']
self.logger.error(f"Failed to put net battle data for user {aime_id}")
async def get_net_battle(self, aime_id: int) -> Optional[Row]:
result = await self.execute(net_battle.select(net_battle.c.user == aime_id))
if result:
return result.fetchone()
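
`put_net_battle` reads exactly the five keys shown above from `net_battle_data`. A hedged example payload; the values are made up, only the keys match the code:

```python
async def save_net_battle(profile_db, aime_id: int) -> None:
    net_battle_data = {
        "isRankUpChallengeFailed": False,  # illustrative values only
        "highestBattleRankId": 10,
        "battleIconId": 1,
        "battleIconNum": 0,
        "avatarEffectPoint": 100,
    }
    await profile_db.put_net_battle(aime_id, net_battle_data)
```
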


@ -142,55 +142,72 @@ playlog = Table(
class ChuniScoreData(BaseData):
def get_courses(self, aime_id: int) -> Optional[Row]:
async def get_courses(self, aime_id: int) -> Optional[Row]:
sql = select(course).where(course.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_course(self, aime_id: int, course_data: Dict) -> Optional[int]:
async def put_course(self, aime_id: int, course_data: Dict) -> Optional[int]:
course_data["user"] = aime_id
course_data = self.fix_bools(course_data)
sql = insert(course).values(**course_data)
conflict = sql.on_duplicate_key_update(**course_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_scores(self, aime_id: int) -> Optional[Row]:
async def get_scores(self, aime_id: int) -> Optional[Row]:
sql = select(best_score).where(best_score.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_score(self, aime_id: int, score_data: Dict) -> Optional[int]:
async def put_score(self, aime_id: int, score_data: Dict) -> Optional[int]:
score_data["user"] = aime_id
score_data = self.fix_bools(score_data)
sql = insert(best_score).values(**score_data)
conflict = sql.on_duplicate_key_update(**score_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_playlogs(self, aime_id: int) -> Optional[Row]:
async def get_playlogs(self, aime_id: int) -> Optional[Row]:
sql = select(playlog).where(playlog.c.user == aime_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_playlog(self, aime_id: int, playlog_data: Dict, version: int) -> Optional[int]:
async def get_playlogs_limited(self, aime_id: int, index: int, count: int) -> Optional[Row]:
sql = select(playlog).where(playlog.c.user == aime_id).order_by(playlog.c.id.desc()).limit(count).offset(index * count)
result = await self.execute(sql)
if result is None:
self.logger.warning(f" aime_id {aime_id} has no playlog ")
return None
return result.fetchall()
async def get_user_playlogs_count(self, aime_id: int) -> Optional[Row]:
sql = select(func.count()).where(playlog.c.user == aime_id)
result = await self.execute(sql)
if result is None:
self.logger.warning(f" aime_id {aime_id} has no playlog ")
return None
return result.scalar()
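
The two new playlog helpers pair up for paging: `get_user_playlogs_count` gives the total, and `get_playlogs_limited` fetches one page (`index` is the zero-based page number, `count` the page size). A rough sketch of a paged fetch; the wrapper name and page size are assumptions:

```python
async def get_playlog_page(score_db, aime_id: int, page: int, per_page: int = 20):
    total = await score_db.get_user_playlogs_count(aime_id)
    if not total:
        return [], 0
    playlogs = await score_db.get_playlogs_limited(aime_id, page, per_page)
    total_pages = (total + per_page - 1) // per_page  # ceiling division
    return playlogs or [], total_pages
```
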
async def put_playlog(self, aime_id: int, playlog_data: Dict, version: int) -> Optional[int]:
# Calculate the ROM version that should be inserted into the DB, based on the version of the game being inserted
# We only need this for Version 10 (Plost) and earlier, as newer versions include romVersion in their upsert
# This matters both for gameRankings, as well as a future DB update to keep version data separate
@ -216,15 +233,17 @@ class ChuniScoreData(BaseData):
sql = insert(playlog).values(**playlog_data)
conflict = sql.on_duplicate_key_update(**playlog_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_rankings(self, version: int) -> Optional[List[Dict]]:
async def get_rankings(self, version: int) -> Optional[List[Dict]]:
# Calculates the ROM version that should be fetched for rankings, based on the game version being retrieved
# This prevents tracks that are not accessible in your version from counting towards the 10 results
romVer = {
15: "2.20%",
14: "2.15%",
13: "2.10%",
12: "2.05%",
11: "2.00%",
@ -241,7 +260,7 @@ class ChuniScoreData(BaseData):
0: "1.00%"
}
sql = select([playlog.c.musicId.label('id'), func.count(playlog.c.musicId).label('point')]).where((playlog.c.level != 4) & (playlog.c.romVersion.like(romVer.get(version, "%")))).group_by(playlog.c.musicId).order_by(func.count(playlog.c.musicId).desc()).limit(10)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
@ -249,10 +268,10 @@ class ChuniScoreData(BaseData):
rows = result.fetchall()
return [dict(row) for row in rows]
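
`get_rankings` returns at most ten rows labelled `id` (the musicId) and `point` (the play count), restricted to playlogs whose `romVersion` matches the pattern for the requested game version. A small consumption sketch; the version number passed in is only an example:

```python
async def print_top_tracks(score_db) -> None:
    # version 13 maps to the "2.10%" romVersion pattern in the table above
    rankings = await score_db.get_rankings(13)
    # rows come back shaped like {"id": <musicId>, "point": <play count>}
    for entry in rankings or []:
        print(entry["id"], entry["point"])
```
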
def get_rival_music(self, rival_id: int) -> Optional[List[Dict]]:
async def get_rival_music(self, rival_id: int) -> Optional[List[Dict]]:
sql = select(best_score).where(best_score.c.user == rival_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()


@ -175,7 +175,7 @@ login_bonus = Table(
class ChuniStaticData(BaseData):
def put_login_bonus(
async def put_login_bonus(
self,
version: int,
preset_id: int,
@ -207,12 +207,12 @@ class ChuniStaticData(BaseData):
loginBonusCategoryType=login_bonus_category_type,
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_login_bonus(
async def get_login_bonus(
self,
version: int,
preset_id: int,
@ -224,12 +224,12 @@ class ChuniStaticData(BaseData):
)
).order_by(login_bonus.c.needLoginDayCount.desc())
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_login_bonus_by_required_days(
async def get_login_bonus_by_required_days(
self, version: int, preset_id: int, need_login_day_count: int
) -> Optional[Row]:
sql = login_bonus.select(
@ -240,12 +240,12 @@ class ChuniStaticData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_login_bonus_preset(
async def put_login_bonus_preset(
self, version: int, preset_id: int, preset_name: str, is_enabled: bool
) -> Optional[int]:
sql = insert(login_bonus_preset).values(
@ -259,12 +259,12 @@ class ChuniStaticData(BaseData):
presetName=preset_name, isEnabled=is_enabled
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_login_bonus_presets(
async def get_login_bonus_presets(
self, version: int, is_enabled: bool = True
) -> Optional[List[Row]]:
sql = login_bonus_preset.select(
@ -274,12 +274,12 @@ class ChuniStaticData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_event(
async def put_event(
self, version: int, event_id: int, type: int, name: str
) -> Optional[int]:
sql = insert(events).values(
@ -288,19 +288,19 @@ class ChuniStaticData(BaseData):
conflict = sql.on_duplicate_key_update(name=name)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def update_event(
async def update_event(
self, version: int, event_id: int, enabled: bool
) -> Optional[bool]:
sql = events.update(
and_(events.c.version == version, events.c.eventId == event_id)
).values(enabled=enabled)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
self.logger.warning(
f"update_event: failed to update event! version: {version}, event_id: {event_id}, enabled: {enabled}"
@ -315,35 +315,35 @@ class ChuniStaticData(BaseData):
return None
return event["enabled"]
def get_event(self, version: int, event_id: int) -> Optional[Row]:
async def get_event(self, version: int, event_id: int) -> Optional[Row]:
sql = select(events).where(
and_(events.c.version == version, events.c.eventId == event_id)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_enabled_events(self, version: int) -> Optional[List[Row]]:
async def get_enabled_events(self, version: int) -> Optional[List[Row]]:
sql = select(events).where(
and_(events.c.version == version, events.c.enabled == True)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_events(self, version: int) -> Optional[List[Row]]:
async def get_events(self, version: int) -> Optional[List[Row]]:
sql = select(events).where(events.c.version == version)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_music(
async def put_music(
self,
version: int,
song_id: int,
@ -376,12 +376,12 @@ class ChuniStaticData(BaseData):
worldsEndTag=we_tag,
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def put_charge(
async def put_charge(
self,
version: int,
charge_id: int,
@ -406,38 +406,38 @@ class ChuniStaticData(BaseData):
sellingAppeal=selling_appeal,
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_enabled_charges(self, version: int) -> Optional[List[Row]]:
async def get_enabled_charges(self, version: int) -> Optional[List[Row]]:
sql = select(charge).where(
and_(charge.c.version == version, charge.c.enabled == True)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_charges(self, version: int) -> Optional[List[Row]]:
async def get_charges(self, version: int) -> Optional[List[Row]]:
sql = select(charge).where(charge.c.version == version)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_music(self, version: int) -> Optional[List[Row]]:
async def get_music(self, version: int) -> Optional[List[Row]]:
sql = music.select(music.c.version <= version)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_music_chart(
async def get_music_chart(
self, version: int, song_id: int, chart_id: int
) -> Optional[List[Row]]:
sql = select(music).where(
@ -448,21 +448,21 @@ class ChuniStaticData(BaseData):
)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_song(self, music_id: int) -> Optional[Row]:
async def get_song(self, music_id: int) -> Optional[Row]:
sql = music.select(music.c.id == music_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_avatar(
async def put_avatar(
self,
version: int,
avatarAccessoryId: int,
@ -487,12 +487,12 @@ class ChuniStaticData(BaseData):
texturePath=texturePath,
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
def put_gacha(
async def put_gacha(
self,
version: int,
gacha_id: int,
@ -513,33 +513,33 @@ class ChuniStaticData(BaseData):
**gacha_data,
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(f"Failed to insert gacha! gacha_id {gacha_id}")
return None
return result.lastrowid
def get_gachas(self, version: int) -> Optional[List[Dict]]:
async def get_gachas(self, version: int) -> Optional[List[Dict]]:
sql = gachas.select(gachas.c.version <= version).order_by(
gachas.c.gachaId.asc()
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_gacha(self, version: int, gacha_id: int) -> Optional[Dict]:
async def get_gacha(self, version: int, gacha_id: int) -> Optional[Dict]:
sql = gachas.select(
and_(gachas.c.version <= version, gachas.c.gachaId == gacha_id)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_gacha_card(
async def put_gacha_card(
self, gacha_id: int, card_id: int, **gacha_card
) -> Optional[int]:
sql = insert(gacha_cards).values(gachaId=gacha_id, cardId=card_id, **gacha_card)
@ -548,21 +548,21 @@ class ChuniStaticData(BaseData):
gachaId=gacha_id, cardId=card_id, **gacha_card
)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(f"Failed to insert gacha card! gacha_id {gacha_id}")
return None
return result.lastrowid
def get_gacha_cards(self, gacha_id: int) -> Optional[List[Dict]]:
async def get_gacha_cards(self, gacha_id: int) -> Optional[List[Dict]]:
sql = gacha_cards.select(gacha_cards.c.gachaId == gacha_id)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_gacha_card_by_character(
async def get_gacha_card_by_character(
self, gacha_id: int, chara_id: int
) -> Optional[Dict]:
sql_sub = (
@ -574,26 +574,26 @@ class ChuniStaticData(BaseData):
and_(gacha_cards.c.gachaId == gacha_id, gacha_cards.c.cardId == sql_sub)
)
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
def put_card(self, version: int, card_id: int, **card_data) -> Optional[int]:
async def put_card(self, version: int, card_id: int, **card_data) -> Optional[int]:
sql = insert(cards).values(version=version, cardId=card_id, **card_data)
conflict = sql.on_duplicate_key_update(**card_data)
result = self.execute(conflict)
result = await self.execute(conflict)
if result is None:
self.logger.warning(f"Failed to insert card! card_id {card_id}")
return None
return result.lastrowid
def get_card(self, version: int, card_id: int) -> Optional[Dict]:
async def get_card(self, version: int, card_id: int) -> Optional[Dict]:
sql = cards.select(and_(cards.c.version <= version, cards.c.cardId == card_id))
result = self.execute(sql)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()

Some files were not shown because too many files have changed in this diff.