Finish cleaning up repo, add bitstream build and docs

This commit is contained in:
spicyjpeg 2024-06-10 18:43:57 +02:00
parent 3a76ba6d16
commit ffced10553
No known key found for this signature in database
GPG Key ID: 5CC87404C01DF393
19 changed files with 255 additions and 600 deletions

4
.gitignore vendored

@ -13,7 +13,3 @@ __pycache__/
# Do not include user-specific workspace and configuration files.
*.code-workspace
CMakeUserPresets.json
# Do not include the dumps used to generate the cartdb files.
#data/dumps/
#data/tests/


BIN
data/fpga.bit Normal file

Binary file not shown.


@ -1,5 +1,5 @@
# Data formats
# File and data formats
## Security cartridge dump (.573 file)

149
doc/fpga.md Normal file

@ -0,0 +1,149 @@
# Digital I/O board FPGA bitstream
## Overview
The System 573's digital I/O board has the bulk of its logic split across two
different chips:
- an XCS40XL Spartan-XL FPGA, implementing pretty much all of the board's
functionality and driving most of the light outputs;
- an XC9536 CPLD, responsible for driving the remaining outputs and bringing up
the FPGA.
While the CPLD is factory-programmed and its registers can be accessed without
any prior initialization, the FPGA must be configured by uploading a bitstream
prior to accessing anything connected to it. This includes the DS2401 that holds
the board's identifier, so a bitstream is required by the tool even though it
does not otherwise make use of the MP3 decoder, additional RAM or any other
hardware on the board.
The `fpga` directory contains the source code for a simple bitstream that
implements a small subset of the functionality provided by Konami's bitstreams,
allowing the tool to control light outputs and read the DS2401 without having to
redistribute any files extracted from games. See below for instructions on
building it.
For more information about the board's hardware and wiring, see:
- [Digital I/O board](https://psx-spx.consoledev.net/konamisystem573/#digital-io-board-gx894-pwbba)
- [XCS40XL FPGA pin mapping](https://psx-spx.consoledev.net/konamisystem573/#xcs40xl-fpga-pin-mapping)
## Register map
### `0x1f640080`: Magic number
| Bits | RW | Description |
| ---: | :- | :---------------------- |
| 0-15 | R | Magic number (`0x573f`) |
Note that the number is different from the one used by Konami (`0x1234`).
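As a sketch of how the magic number might be used to detect whether this bitstream (rather than a Konami one, or none at all) is loaded, consider the following; `isCustomBitstreamLoaded` is an illustrative helper, not part of the tool's actual code, and on real hardware its argument would come from a volatile 16-bit read of `0x1f640080`:

```c
#include <stdint.h>

#define CUSTOM_MAGIC 0x573f /* returned by this bitstream */
#define KONAMI_MAGIC 0x1234 /* returned by Konami's bitstreams */

/* Illustrative helper: takes a value read from the magic number register
 * and reports whether the custom bitstream responded. */
static int isCustomBitstreamLoaded(uint16_t magic) {
	return magic == CUSTOM_MAGIC;
}
```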
### `0x1f6400e0`: Light output bank A (outputs A4-A7)
| Bits | RW | Description |
| ---: | :- | :----------------------------------- |
| 0-11 | | _Unused_ |
| 12 | W | Output A4 (0 = grounded, 1 = high-z) |
| 13 | W | Output A5 (0 = grounded, 1 = high-z) |
| 14 | W | Output A6 (0 = grounded, 1 = high-z) |
| 15 | W | Output A7 (0 = grounded, 1 = high-z) |
### `0x1f6400e2`: Light output bank A (outputs A0-A3)
| Bits | RW | Description |
| ---: | :- | :----------------------------------- |
| 0-11 | | _Unused_ |
| 12 | W | Output A0 (0 = grounded, 1 = high-z) |
| 13 | W | Output A1 (0 = grounded, 1 = high-z) |
| 14 | W | Output A2 (0 = grounded, 1 = high-z) |
| 15 | W | Output A3 (0 = grounded, 1 = high-z) |
### `0x1f6400e4`: Light output bank B
| Bits | RW | Description |
| ---: | :- | :----------------------------------- |
| 0-11 | | _Unused_ |
| 12 | W | Output B4 (0 = grounded, 1 = high-z) |
| 13 | W | Output B5 (0 = grounded, 1 = high-z) |
| 14 | W | Output B6 (0 = grounded, 1 = high-z) |
| 15 | W | Output B7 (0 = grounded, 1 = high-z) |
### `0x1f6400e6`: Light output bank D
| Bits | RW | Description |
| ---: | :- | :----------------------------------- |
| 0-11 | | _Unused_ |
| 12 | W | Output D0 (0 = grounded, 1 = high-z) |
| 13 | W | Output D1 (0 = grounded, 1 = high-z) |
| 14 | W | Output D2 (0 = grounded, 1 = high-z) |
| 15 | W | Output D3 (0 = grounded, 1 = high-z) |
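Since each of the registers above exposes four outputs in bits 12-15 with inverted polarity (a lit lamp is pulled to ground), a small helper can translate a mask of lit outputs into the value to write. This is an illustrative sketch, not the tool's actual API:

```c
#include <stdint.h>

/* Illustrative helper: convert a 4-bit mask of lit outputs (bit 0 = first
 * output of the bank) into the register value. Lit outputs are grounded
 * (bit = 0), all others are left floating (bit = 1). Bits 0-11 are unused
 * and written as zero. */
static uint16_t packLightBank(unsigned litMask) {
	return (uint16_t) ((~litMask & 0xf) << 12);
}
```

Writing `packLightBank(0x1)` to `0x1f6400e2` would thus turn on output A0 only, assuming the mapping above.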
### `0x1f6400ee`: 1-wire bus
When read:
| Bits | RW | Description |
| ----: | :- | :------------------------ |
| 0-11 | | _Unused_ |
| 12 | R | DS2401 1-wire bus readout |
| 13 | R | DS2433 1-wire bus readout |
| 14-15 | | _Unused_ |
When written:
| Bits | RW | Description |
| ----: | :- | :----------------------------------------------------------- |
| 0-11 | | _Unused_ |
| 12 | W | Drive DS2401 1-wire bus low (1 = pull to ground, 0 = high-z) |
| 13 | W | Drive DS2433 1-wire bus low (1 = pull to ground, 0 = high-z) |
| 14-15 | | _Unused_ |
Bit 13 is mapped to the bus of the (normally unpopulated) DS2433 footprint. It
is currently unclear whether and how Konami's bitstreams expose this bus.
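To illustrate how the register maps onto 1-wire bus transactions, here is a hedged sketch of a single read slot on the DS2401 bus. The `OneWireBus` callbacks and the exact delays are assumptions for illustration only; the tool's real I/O layer and timing may differ:

```c
#include <stdint.h>

/* Bit positions in the 0x1f6400ee register. */
#define DS2401_BIT 12
#define DS2433_BIT 13

/* Decode the bus levels from a value read back from the register. */
static int ds2401BusLevel(uint16_t value) { return (value >> DS2401_BIT) & 1; }
static int ds2433BusLevel(uint16_t value) { return (value >> DS2433_BIT) & 1; }

/* Hypothetical callbacks standing in for the tool's register access and
 * delay primitives. */
typedef struct {
	void     (*writeReg)(uint16_t value);
	uint16_t (*readReg)(void);
	void     (*delayUs)(unsigned us);
} OneWireBus;

/* Read one bit from the DS2401: pull the bus low briefly, release it,
 * sample while the device drives (or releases) the line, then let the
 * rest of the ~60us time slot elapse. Delays are illustrative 1-wire
 * timings, not measured against Konami's hardware. */
static int ds2401ReadBit(const OneWireBus *bus) {
	bus->writeReg(1 << DS2401_BIT); /* 1 = pull to ground */
	bus->delayUs(2);
	bus->writeReg(0);               /* release (high-z) */
	bus->delayUs(10);

	int bit = ds2401BusLevel(bus->readReg());

	bus->delayUs(50);
	return bit;
}
```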
## Building the bitstream
**NOTE**: building the bitstream is *not* required in order to compile the
project, as a prebuilt copy is provided in the `data` directory. This section is
only relevant if you wish to modify the source files in the `fpga/src`
directory, for instance to add new functionality.
You will have to obtain and install a copy of Xilinx Foundation ISE 3.3. Later
ISE releases such as 4.2 (the last one to support Spartan-XL devices) may also
work but have not been tested. The toolchain is Windows-only but seems to work
under Wine; the installer does not, but it can be sidestepped by manually
invoking the Java-based extractor included in the installer as follows:
```bash
# Replace /opt/xilinx with a suitable target location and run from the
# installation package's root
find car -iname '*.car' -exec \
java -cp ce/CarExpand.jar:ce/marimba.zip:ce/tuner.zip \
com.xilinx.carexp.CarExp '{}' /opt/xilinx \;
```
Due to ISE's limitations, the full absolute path to the target directory
(`C:\Xilinx` by default) must be less than 64 characters long and cannot contain
any spaces. You will additionally need a recent version of
[Yosys](https://github.com/YosysHQ/yosys), which can be installed as part of
the [OSS CAD Suite](https://github.com/YosysHQ/oss-cad-suite-build#installation)
and should be added to the `PATH` environment variable.
Once both are installed, you may compile the bitstream by running the following
commands from the project's `fpga` directory (replace the ISE path
appropriately):
```batch
:: Windows
set XILINX=C:\Xilinx
.\build.bat
```

```bash
# Linux (requires Wine)
export XILINX=/opt/xilinx
chmod +x build.sh
./build.sh
```
The bitstream can then be inspected by loading the generated `build/fpga.ncd`
file into the ISE FPGA editor (`bin/nt/fpga_editor.exe`).


@ -9,6 +9,7 @@ if errorlevel 1 (
added to PATH in order to run this script.
exit /b 1
)
if not exist "%XILINX%\bin\nt\" (
echo The XILINX environment variable must be set to the root of a valid ^
Xilinx ISE 3.3 ^(Windows^) installation in order to run this script. Note that ^
@ -16,6 +17,8 @@ the path cannot contain spaces due to ISE limitations.
exit /b 1
)
set PATH="%XILINX%\bin\nt";%PATH%
cd /d "%~dp0"
if exist "%BUILD_DIR%" (


@ -2,8 +2,24 @@
BUILD_DIR="build"
if ! which yosys >/dev/null 2>&1; then
echo \
"Yosys (https://github.com/YosysHQ/yosys) must be installed and added" \
"to PATH in order to run this script."
exit 1
fi
if [ ! -d "$XILINX/bin/nt" ]; then
echo \
"The XILINX environment variable must be set to the root of a valid" \
"Xilinx ISE 3.3 (Windows) installation in order to run this script." \
"Note that the path cannot contain spaces due to ISE limitations."
exit 1
fi
case "$(uname -s)" in
CYGWIN*|MINGW*|MSYS*)
export PATH="$XILINX/bin/nt:$WINEPATH"
ISE_RUNNER=""
;;
*)
@ -14,24 +30,11 @@ case "$(uname -s)" in
exit 1
fi
export WINEPATH="$(winepath -w "$XILINX");$WINEPATH"
ISE_RUNNER="wine"
;;
esac
if ! which yosys >/dev/null 2>&1; then
echo \
"Yosys (https://github.com/YosysHQ/yosys) must be installed and added" \
" to PATH in order to run this script."
exit 1
fi
if [ ! -d "$XILINX/bin/nt" ]; then
echo \
"The XILINX environment variable must be set to the root of a valid" \
"Xilinx ISE 3.3 (Windows) installation in order to run this script." \
"Note that the path cannot contain spaces due to ISE limitations."
exit 1
fi
cd "$(dirname "$0")"
mkdir -p "$BUILD_DIR"


@ -6,8 +6,6 @@ set COVER_MODE=area
set OPTIMIZATION_MODE=speed
set OPTIMIZATION_LEVEL=high
set PATH="%XILINX%\bin\nt";%PATH%
cd /d "%~dp0\build"
ngdbuild synth.edf synth.ngd ^


@ -109,6 +109,12 @@
"name": "data/fpga.bit",
"source": "${PROJECT_SOURCE_DIR}/data/fpga.bit"
},
{
"type": "db",
"name": "data/games.db",
"source": "${PROJECT_SOURCE_DIR}/data/games.json"
},
{
"type": "binary",
"name": "data/x76f041.db",
@ -118,11 +124,6 @@
"type": "binary",
"name": "data/zs01.db",
"source": "${PROJECT_SOURCE_DIR}/data/zs01.db"
},
{
"type": "binary",
"name": "data/flash.db",
"source": "${PROJECT_SOURCE_DIR}/data/flash.db"
}
]
}


@ -97,6 +97,11 @@
"name": "assets/about.txt",
"source": "${PROJECT_BINARY_DIR}/about.txt"
},
{
"type": "db",
"name": "data/games.db",
"source": "${PROJECT_SOURCE_DIR}/data/games.json"
},
{
"type": "binary",
@ -112,11 +117,6 @@
"type": "binary",
"name": "data/zs01.db",
"source": "${PROJECT_SOURCE_DIR}/data/zs01.db"
},
{
"type": "binary",
"name": "data/flash.db",
"source": "${PROJECT_SOURCE_DIR}/data/flash.db"
}
]
}


@ -23,7 +23,7 @@
"properties": {
"type": {
"title": "Entry type",
"description": "Must be 'empty', 'text', 'binary', 'tim', 'metrics', 'palette' or 'strings'.",
"description": "Must be 'empty', 'text', 'binary', 'tim', 'metrics', 'palette', 'strings' or 'db'.",
"type": "string",
"enum": [
@ -33,7 +33,8 @@
"tim",
"metrics",
"palette",
"strings"
"strings",
"db"
]
},
"name": {
@ -173,12 +174,12 @@
"additionalProperties": false,
"properties": {
"type": { "pattern": "^metrics|palette|strings$" },
"type": { "pattern": "^metrics|palette|strings|db$" },
"name": { "type": "string" },
"source": {
"title": "Path to source file",
"description": "Path to the JSON file containing font metrics, palette entries or strings (if such data is not part of the entry object), relative to the configuration file's directory by default.",
"description": "Path to the JSON file containing font metrics, palette entries, strings or the game list (if such data is not part of the entry object), relative to the configuration file's directory by default.",
"type": "string",
"format": "uri-reference"
@ -229,6 +230,21 @@
"type": "object"
}
}
},
{
"required": [ "db" ],
"additionalProperties": false,
"properties": {
"type": { "const": "db" },
"name": { "type": "string" },
"strings": {
"title": "Game database",
"description": "Game database root object. If not specified, the source attribute must be a path to a JSON file containing this object.",
"type": "object"
}
}
}
]
}


@ -189,10 +189,10 @@ def main():
parser: ArgumentParser = createParser()
args: Namespace = parser.parse_args()
with args.configFile as _file:
configFile: dict[str, Any] = json.load(_file)
with args.configFile as file:
configFile: dict[str, Any] = json.load(file)
sourceDir: Path = \
args.source_dir or Path(_file.name).parent
args.source_dir or Path(file.name).parent
iso: PyCdlib = PyCdlib()
paddingFile: PaddingFile = PaddingFile()
@ -264,17 +264,17 @@ def main():
restoreISOFileOrder(iso, isoEntries, not args.quiet)
with args.output as _file:
with args.output as file:
iso.write_fp(
outfp = _file,
outfp = file,
progress_cb = None if args.quiet else showProgress
)
iso.close()
if args.system_area:
with args.system_area as inputFile:
_file.seek(0)
_file.write(inputFile.read(iso.logical_block_size * 16))
file.seek(0)
file.write(inputFile.read(iso.logical_block_size * 16))
if __name__ == "__main__":
main()


@ -1,278 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
__version__ = "0.4.2"
__author__ = "spicyjpeg"
import json, logging, os, re
from argparse import ArgumentParser, FileType, Namespace
from collections import Counter, defaultdict
from pathlib import Path
from struct import Struct
from typing import Any, Mapping, Sequence, TextIO
from common.cart import CartDump, DumpFlag
from common.cartdata import *
from common.games import GameDB, GameDBEntry
from common.util import setupLogger
## MAME NVRAM file parser
_MAME_X76F041_STRUCT: Struct = Struct("< 4x 8s 8s 8s 8s 512s")
_MAME_X76F100_STRUCT: Struct = Struct("< 4x 8s 8s 112s")
_MAME_ZS01_STRUCT: Struct = Struct("< 4x 8s 8s 8s 112s")
def parseMAMEDump(dump: bytes) -> CartDump:
systemID: bytes = bytes(8)
cartID: bytes = bytes(8)
zsID: bytes = bytes(8)
config: bytes = bytes(8)
flags: DumpFlag = \
DumpFlag.DUMP_PUBLIC_DATA_OK | DumpFlag.DUMP_PRIVATE_DATA_OK
match int.from_bytes(dump[0:4], "big"):
case 0x1955aa55:
chipType: ChipType = ChipType.X76F041
_, _, dataKey, config, data = _MAME_X76F041_STRUCT.unpack(dump)
flags |= DumpFlag.DUMP_CONFIG_OK
case 0x1900aa55:
chipType: ChipType = ChipType.X76F100
dataKey, readKey, data = _MAME_X76F100_STRUCT.unpack(dump)
if dataKey != readKey:
raise RuntimeError(
chipType,
"X76F100 dumps with different read/write keys are not "
"supported"
)
case 0x5a530001:
chipType: ChipType = ChipType.ZS01
_, dataKey, config, data = _MAME_ZS01_STRUCT.unpack(dump)
#zsID = _MAME_ZS_ID
flags |= DumpFlag.DUMP_CONFIG_OK | DumpFlag.DUMP_ZS_ID_OK
case _id:
raise RuntimeError(
ChipType.NONE, f"unrecognized chip ID: 0x{_id:08x}"
)
#if data.find(_MAME_CART_ID) >= 0:
#cartID = _MAME_CART_ID
#flags |= DumpFlag.DUMP_HAS_CART_ID | DumpFlag.DUMP_CART_ID_OK
#if data.find(_MAME_SYSTEM_ID) >= 0:
#systemID = _MAME_SYSTEM_ID
#flags |= DumpFlag.DUMP_HAS_SYSTEM_ID | DumpFlag.DUMP_SYSTEM_ID_OK
return CartDump(
chipType, flags, systemID, cartID, zsID, dataKey, config, data
)
## Dump processing
def processDump(
dump: CartDump, gameDB: GameDB, nameHints: Sequence[str] = [],
exportFile: TextIO | None = None
) -> CartDBEntry:
parser: CartParser = newCartParser(dump)
# If the parser could not find a valid game code in the dump, attempt to
# parse it from the provided hints.
if parser.region is None:
raise RuntimeError("can't parse game region from dump")
if parser.code is None:
for hint in nameHints:
code: re.Match | None = GAME_CODE_REGEX.search(
hint.upper().encode("ascii")
)
if code is not None:
parser.code = code.group().decode("ascii")
break
if parser.code is None:
raise RuntimeError(
"can't parse game code from dump nor from filename"
)
matches: list[GameDBEntry] = sorted(
gameDB.lookupByCode(parser.code, parser.region)
)
if exportFile:
_, flags = str(parser.flags).split(".", 1)
matchList: str = " ".join(
(game.mameID or f"[{game}]") for game in matches
)
exportFile.write(
f"{dump.chipType.name},"
f"{' '.join(nameHints)},"
f"{parser.code},"
f"{parser.region},"
f"{matchList},"
f"{parser.getFormatType().name},"
f"{flags}\n"
)
if not matches:
raise RuntimeError(
f"{parser.code} {parser.region} not found in game list"
)
# If more than one match is found, use the first result.
game: GameDBEntry = matches[0]
if game.hasCartID():
if not (parser.flags & DataFlag.DATA_HAS_CART_ID):
raise RuntimeError("game has a cartridge ID but dump does not")
else:
if parser.flags & DataFlag.DATA_HAS_CART_ID:
raise RuntimeError("dump has a cartridge ID but game does not")
if game.hasSystemID() and game.cartLockedToIOBoard:
if not (parser.flags & DataFlag.DATA_HAS_SYSTEM_ID):
raise RuntimeError("game has a system ID but dump does not")
else:
if parser.flags & DataFlag.DATA_HAS_SYSTEM_ID:
raise RuntimeError("dump has a system ID but game does not")
logging.info(f"imported {dump.chipType.name}: {game.getFullName()}")
return CartDBEntry(parser.code, parser.region, game.name, dump, parser)
## Main
_MAME_DUMP_SIZES: Sequence[int] = (
_MAME_X76F041_STRUCT.size,
_MAME_X76F100_STRUCT.size,
_MAME_ZS01_STRUCT.size
)
def createParser() -> ArgumentParser:
parser = ArgumentParser(
description = \
"Recursively scans a directory for MAME dumps of X76F041 and ZS01 "
"cartridges, analyzes them and generates .db files.",
add_help = False
)
group = parser.add_argument_group("Tool options")
group.add_argument(
"-h", "--help",
action = "help",
help = "Show this help message and exit"
)
group.add_argument(
"-v", "--verbose",
action = "count",
help = "Enable additional logging levels"
)
group = parser.add_argument_group("File paths")
group.add_argument(
"-o", "--output",
type = Path,
default = os.curdir,
help = "Path to output directory (current directory by default)",
metavar = "dir"
)
group.add_argument(
"-e", "--export",
type = FileType("wt"),
help = "Export CSV table of all dumps parsed to specified path",
metavar = "file"
)
group.add_argument(
"gameList",
type = FileType("rt"),
help = "Path to JSON file containing game list"
)
group.add_argument(
"input",
type = Path,
nargs = "+",
help = "Paths to input directories"
)
return parser
def main():
parser: ArgumentParser = createParser()
args: Namespace = parser.parse_args()
setupLogger(args.verbose)
with args.gameList as _file:
gameList: Sequence[Mapping[str, Any]] = json.load(_file)
gameDB: GameDB = GameDB(gameList)
failures: Counter[ChipType] = Counter()
entries: defaultdict[ChipType, list[CartDBEntry]] = defaultdict(list)
if args.export:
args.export.write(
"# chipType,nameHints,code,region,matchList,formatType,flags\n"
)
for inputPath in args.input:
for rootDir, _, files in os.walk(inputPath):
root: Path = Path(rootDir)
for dumpName in files:
path: Path = root / dumpName
size: int = os.stat(path).st_size
# Skip files whose size does not match any of the known dump
# formats.
if size not in _MAME_DUMP_SIZES:
logging.warning(f"ignoring: {dumpName}, invalid size")
continue
with open(path, "rb") as _file:
data: bytes = _file.read()
try:
dump: CartDump = parseMAMEDump(data)
except RuntimeError as exc:
logging.error(f"failed to parse: {path}, {exc}")
continue
hints: Sequence[str] = dumpName, root.name
try:
entries[dump.chipType].append(
processDump(dump, gameDB, hints, args.export)
)
except RuntimeError as exc:
logging.error(
f"failed to import {dump.chipType.name}: {path}, {exc}"
)
failures[dump.chipType] += 1
if args.export:
args.export.close()
# Sort all entries and generate the .db files.
for chipType, _entries in entries.items():
if not _entries:
logging.warning(f"no entries generated for {chipType.name}")
continue
_entries.sort()
with open(args.output / f"{chipType.name.lower()}.db", "wb") as _file:
for entry in _entries:
_file.write(entry.serialize())
logging.info(
f"{chipType.name}: {len(_entries)} entries saved, "
f"{failures[chipType]} failures"
)
if __name__ == "__main__":
main()


@ -1,233 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
__version__ = "0.4.2"
__author__ = "spicyjpeg"
import json, logging, os, re
from argparse import ArgumentParser, FileType, Namespace
from pathlib import Path
from typing import ByteString, Mapping, TextIO
from common.cart import DumpFlag, ROMHeaderDump
from common.cartdata import *
from common.games import GameDB, GameDBEntry
from common.util import InterleavedFile, setupLogger
## Flash dump "parser"
_ROM_HEADER_LENGTH: int = 0x20
_MAME_SYSTEM_ID: bytes = bytes.fromhex("01 12 34 56 78 9a bc 3d")
def parseFlashDump(dump: bytes) -> ROMHeaderDump:
return ROMHeaderDump(
DumpFlag.DUMP_HAS_SYSTEM_ID | DumpFlag.DUMP_SYSTEM_ID_OK,
_MAME_SYSTEM_ID,
dump[0:_ROM_HEADER_LENGTH]
)
## Dump processing
def processDump(
dump: ROMHeaderDump, gameDB: GameDB, nameHints: Sequence[str] = [],
exportFile: TextIO | None = None
) -> ROMHeaderDBEntry:
parser: ROMHeaderParser = newROMHeaderParser(dump)
# If the parser could not find a valid game code in the dump, attempt to
# parse it from the provided hints.
if parser.region is None:
raise RuntimeError("can't parse game region from dump")
if parser.code is None:
for hint in nameHints:
code: re.Match | None = GAME_CODE_REGEX.search(
hint.upper().encode("ascii")
)
if code is not None:
parser.code = code.group().decode("ascii")
break
if parser.code is None:
raise RuntimeError(
"can't parse game code from dump nor from filename"
)
matches: list[GameDBEntry] = sorted(
gameDB.lookupByCode(parser.code, parser.region)
)
if exportFile:
_, flags = str(parser.flags).split(".", 1)
matchList: str = " ".join(
(game.mameID or f"[{game}]") for game in matches
)
exportFile.write(
f"{' '.join(nameHints)},"
f"{parser.code},"
f"{parser.region},"
f"{matchList},"
f"{parser.getFormatType().name},"
f"{flags}\n"
)
if not matches:
raise RuntimeError(
f"{parser.code} {parser.region} not found in game list"
)
# If more than one match is found, use the first result.
game: GameDBEntry = matches[0]
if game.hasSystemID() and game.flashLockedToIOBoard:
if not (parser.flags & DataFlag.DATA_HAS_SYSTEM_ID):
raise RuntimeError("game has a system ID but dump has no signature")
else:
if parser.flags & DataFlag.DATA_HAS_SYSTEM_ID:
raise RuntimeError("dump has a signature but game has no system ID")
logging.info(f"imported: {game.getFullName()}")
return ROMHeaderDBEntry(parser.code, parser.region, game.name, parser)
## Main
_FULL_DUMP_SIZE: int = 0x1000000
_EVEN_ODD_DUMP_SIZE: int = 0x200000
def createParser() -> ArgumentParser:
parser = ArgumentParser(
description = \
"Recursively scans a directory for subdirectories containing MAME "
"flash dumps, analyzes them and generates .db files.",
add_help = False
)
group = parser.add_argument_group("Tool options")
group.add_argument(
"-h", "--help",
action = "help",
help = "Show this help message and exit"
)
group.add_argument(
"-v", "--verbose",
action = "count",
help = "Enable additional logging levels"
)
group = parser.add_argument_group("File paths")
group.add_argument(
"-o", "--output",
type = Path,
default = os.curdir,
help = "Path to output directory (current directory by default)",
metavar = "dir"
)
group.add_argument(
"-e", "--export",
type = FileType("wt"),
help = "Export CSV table of all dumps parsed to specified path",
metavar = "file"
)
group.add_argument(
"gameList",
type = FileType("rt"),
help = "Path to JSON file containing game list"
)
group.add_argument(
"input",
type = Path,
nargs = "+",
help = "Paths to input directories"
)
return parser
def main():
parser: ArgumentParser = createParser()
args: Namespace = parser.parse_args()
setupLogger(args.verbose)
with args.gameList as _file:
gameList: Sequence[Mapping[str, Any]] = json.load(_file)
gameDB: GameDB = GameDB(gameList)
failures: int = 0
entries: list[ROMHeaderDBEntry] = []
if args.export:
args.export.write(
"# nameHints,code,region,matchList,formatType,flags\n"
)
for inputPath in args.input:
for rootDir, _, files in os.walk(inputPath):
root: Path = Path(rootDir)
for dumpName in files:
path: Path = root / dumpName
size: int = os.stat(path).st_size
match path.suffix.lower():
case ".31m":
oddPath: Path = Path(rootDir, f"{path.stem}.27m")
if not oddPath.is_file():
logging.warning(f"ignoring: {path}, no .27m file")
continue
if size != _EVEN_ODD_DUMP_SIZE:
logging.warning(f"ignoring: {path}, invalid size")
continue
with \
open(path, "rb") as even, \
open(oddPath, "rb") as odd:
data: ByteString = InterleavedFile(even, odd) \
.read(_ROM_HEADER_LENGTH)
case ".27m":
evenPath: Path = Path(rootDir, f"{path.stem}.31m")
if not evenPath.is_file():
logging.warning(f"ignoring: {path}, no .31m file")
continue
case _:
if size != _FULL_DUMP_SIZE:
logging.warning(f"ignoring: {path}, invalid size")
continue
with open(path, "rb") as _file:
data: ByteString = _file.read(_ROM_HEADER_LENGTH)
dump: ROMHeaderDump = parseFlashDump(data)
hints: Sequence[str] = dumpName, root.name
try:
entries.append(
processDump(dump, gameDB, hints, args.export)
)
except RuntimeError as exc:
logging.error(f"failed to import: {path}, {exc}")
failures += 1
if args.export:
args.export.close()
# Sort all entries and generate the .db file.
if not entries:
logging.warning("no entries generated")
return
entries.sort()
with open(args.output / "flash.db", "wb") as _file:
for entry in entries:
_file.write(entry.serialize())
logging.info(f"{len(entries)} entries saved, {failures} failures")
if __name__ == "__main__":
main()


@ -1,7 +1,7 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
__version__ = "0.4.2"
__version__ = "0.4.6"
__author__ = "spicyjpeg"
import json
@ -73,10 +73,10 @@ def main():
parser: ArgumentParser = createParser()
args: Namespace = parser.parse_args()
with args.configFile as _file:
configFile: dict[str, Any] = json.load(_file)
with args.configFile as file:
configFile: dict[str, Any] = json.load(file)
sourceDir: Path = \
args.source_dir or Path(_file.name).parent
args.source_dir or Path(file.name).parent
assetList: list[dict[str, Any]] = configFile["resources"]
@ -87,12 +87,12 @@ def main():
data: ByteString = bytes(int(asset.get("size", 0)))
case "text":
with open(sourceDir / asset["source"], "rt") as _file:
data: ByteString = _file.read().encode("ascii")
with open(sourceDir / asset["source"], "rt") as file:
data: ByteString = file.read().encode("ascii")
case "binary":
with open(sourceDir / asset["source"], "rb") as _file:
data: ByteString = _file.read()
with open(sourceDir / asset["source"], "rb") as file:
data: ByteString = file.read()
case "tim":
ix: int = int(asset["imagePos"]["x"])
@ -105,7 +105,7 @@ def main():
if image.mode != "P":
image = image.quantize(
int(asset["quantize"]), dither = Image.NONE
int(asset.get("quantize", 16)), dither = Image.NONE
)
data: ByteString = generateIndexedTIM(image, ix, iy, cx, cy)
@ -114,8 +114,8 @@ def main():
if "metrics" in asset:
metrics: dict = asset["metrics"]
else:
with open(sourceDir / asset["source"], "rt") as _file:
metrics: dict = json.load(_file)
with open(sourceDir / asset["source"], "rt") as file:
metrics: dict = json.load(file)
data: ByteString = generateFontMetrics(metrics)
@ -123,8 +123,8 @@ def main():
if "palette" in asset:
palette: dict = asset["palette"]
else:
with open(sourceDir / asset["source"], "rt") as _file:
palette: dict = json.load(_file)
with open(sourceDir / asset["source"], "rt") as file:
palette: dict = json.load(file)
data: ByteString = generateColorPalette(palette)
@ -132,8 +132,8 @@ def main():
if "strings" in asset:
strings: dict = asset["strings"]
else:
with open(sourceDir / asset["source"], "rt") as _file:
strings: dict = json.load(_file)
with open(sourceDir / asset["source"], "rt") as file:
strings: dict = json.load(file)
data: ByteString = generateStringTable(strings)


@ -544,8 +544,8 @@ def parseInstruction(address: int, inst: int) -> Instruction:
## Executable analyzer
def parseStructFromFile(_file: BinaryIO, _struct: Struct) -> tuple:
return _struct.unpack(_file.read(_struct.size))
def parseStructFromFile(file: BinaryIO, _struct: Struct) -> tuple:
return _struct.unpack(file.read(_struct.size))
_EXE_HEADER_STRUCT: Struct = Struct("< 8s 8x 4I 16x 2I 20x 1972s")
_EXE_HEADER_MAGIC: bytes = b"PS-X EXE"
@ -553,7 +553,7 @@ _EXE_HEADER_MAGIC: bytes = b"PS-X EXE"
_FUNCTION_RETURN: bytes = bytes.fromhex("08 00 e0 03") # jr $ra
class PSEXEAnalyzer:
def __init__(self, _file: BinaryIO):
def __init__(self, file: BinaryIO):
(
magic,
entryPoint,
@ -564,7 +564,7 @@ class PSEXEAnalyzer:
stackLength,
_
) = \
parseStructFromFile(_file, _EXE_HEADER_STRUCT)
parseStructFromFile(file, _EXE_HEADER_STRUCT)
if magic != _EXE_HEADER_MAGIC:
raise RuntimeError("file is not a valid PS1 executable")
@ -572,9 +572,9 @@ class PSEXEAnalyzer:
self.entryPoint: int = entryPoint
self.startAddress: int = startAddress
self.endAddress: int = startAddress + length
self.body: bytes = _file.read(length)
self.body: bytes = file.read(length)
#_file.close()
#file.close()
def __getitem__(self, key: int | slice) -> Any:
if isinstance(key, slice):


@ -9,7 +9,7 @@ customizing the region string (used by some emulators to determine whether they
should start in PAL or NTSC mode by default). Requires no external dependencies.
"""
__version__ = "0.1.1"
__version__ = "0.1.2"
__author__ = "spicyjpeg"
from argparse import ArgumentParser, FileType, Namespace
@ -26,13 +26,13 @@ def alignToMultiple(data: bytearray, alignment: int):
if padAmount < alignment:
data.extend(b"\0" * padAmount)
def parseStructFromFile(_file: BinaryIO, _struct: Struct) -> tuple:
return _struct.unpack(_file.read(_struct.size))
def parseStructFromFile(file: BinaryIO, _struct: Struct) -> tuple:
return _struct.unpack(file.read(_struct.size))
def parseStructsFromFile(
_file: BinaryIO, _struct: Struct, count: int
file: BinaryIO, _struct: Struct, count: int
) -> Generator[tuple, None, None]:
data: bytes = _file.read(_struct.size * count)
data: bytes = file.read(_struct.size * count)
for offset in range(0, len(data), _struct.size):
yield _struct.unpack(data[offset:offset + _struct.size])
@ -79,9 +79,9 @@ class Segment:
(self.flags & (ProgHeaderFlag.WRITE | ProgHeaderFlag.EXECUTE))
class ELF:
def __init__(self, _file: BinaryIO):
def __init__(self, file: BinaryIO):
# Parse the file header and perform some minimal validation.
_file.seek(0)
file.seek(0)
(
magic,
@ -103,7 +103,7 @@ class ELF:
secHeaderCount,
_
) = \
parseStructFromFile(_file, ELF_HEADER_STRUCT)
parseStructFromFile(file, ELF_HEADER_STRUCT)
if magic != ELF_HEADER_MAGIC:
raise RuntimeError("file is not a valid ELF")
@ -124,7 +124,7 @@ class ELF:
# Parse the program headers and extract all loadable segments.
self.segments: list[Segment] = []
_file.seek(progHeaderOffset)
file.seek(progHeaderOffset)
for (
headerType,
@ -135,13 +135,13 @@ class ELF:
length,
flags,
_
) in parseStructsFromFile(_file, PROG_HEADER_STRUCT, progHeaderCount):
) in parseStructsFromFile(file, PROG_HEADER_STRUCT, progHeaderCount):
if headerType != ProgHeaderType.LOAD:
continue
# Retrieve the segment and trim or pad it if necessary.
_file.seek(fileOffset)
data: bytes = _file.read(fileLength)
file.seek(fileOffset)
data: bytes = file.read(fileLength)
if length > len(data):
data = data.ljust(length, b"\0")
@ -150,7 +150,7 @@ class ELF:
self.segments.append(Segment(address, data, flags))
#_file.close()
#file.close()
def flatten(self, stripReadOnly: bool = False) -> tuple[int, bytearray]:
# Find the lower and upper boundaries of the segments' address space.
@ -241,9 +241,9 @@ def main():
parser: ArgumentParser = createParser()
args: Namespace = parser.parse_args()
with args.input as _file:
with args.input as file:
try:
elf: ELF = ELF(_file)
elf: ELF = ELF(file)
except RuntimeError as err:
parser.error(err.args[0])
@ -269,9 +269,9 @@ def main():
region # Region string
)
with args.output as _file:
_file.write(header)
_file.write(data)
with args.output as file:
file.write(header)
file.write(data)
if __name__ == "__main__":
main()


@ -95,8 +95,8 @@ def main():
args: Namespace = parser.parse_args()
if args.input:
with args.input as _file:
data: bytes = _file.read()
with args.input as file:
data: bytes = file.read()
try:
dump: CartDump = parseCartDump(data)
@ -110,8 +110,8 @@ def main():
if args.log:
printDumpInfo(dump, args.log)
if args.export:
with args.export as _file:
_file.write(dump.serialize())
with args.export as file:
file.write(dump.serialize())
if __name__ == "__main__":
main()


@ -17,12 +17,12 @@ _FLASH_BANK_SIZE: int = 0x400000
_PCMCIA_BANK_SIZE: int = 0x400000
def splitFlash(inputPath: Path, outputPath: Path):
with open(inputPath, "rb") as _file:
with open(inputPath, "rb") as file:
for bank in _FLASH_BANKS:
with \
open(outputPath / f"29f016a.31{bank}", "wb") as even, \
open(outputPath / f"29f016a.27{bank}", "wb") as odd:
data: ByteString = _file.read(_FLASH_BANK_SIZE)
data: ByteString = file.read(_FLASH_BANK_SIZE)
even.write(data[0::2])
odd.write(data[1::2])
@ -30,12 +30,12 @@ def splitFlash(inputPath: Path, outputPath: Path):
def splitPCMCIACard(inputPath: Path, outputPath: Path, card: int, size: int):
name: str = f"pccard{card}_{size // 0x100000}mb"
with open(inputPath, "rb") as _file:
with open(inputPath, "rb") as file:
for bank in range(1, (size // _PCMCIA_BANK_SIZE) + 1):
with \
open(outputPath / f"{name}_{bank}l", "wb") as even, \
open(outputPath / f"{name}_{bank}u", "wb") as odd:
data: ByteString = _file.read(_PCMCIA_BANK_SIZE)
data: ByteString = file.read(_PCMCIA_BANK_SIZE)
even.write(data[0::2])
odd.write(data[1::2])