First commit of group-ironmen-master directory.

commit a8467389ef
2025-10-27 08:25:16 +08:00
26390 changed files with 35396 additions and 0 deletions

6
.gitignore vendored Normal file

@@ -0,0 +1,6 @@
group-ironmen-master/.github
group-ironmen-master/backup
group-ironmen-master/.gitignore
group-ironmen-master/.ignore
group-ironmen-tracker-master/*
tasks-tracker-plugin-master/*

group-ironmen-master/.env.example Normal file

@@ -0,0 +1,7 @@
HOST_URL=http://localhost:5000 # Set to the URL (including http:// or https://) of the backend. This is used for API calls from the frontend.
PG_USER=postgres # This can be anything if using docker-compose, otherwise match it with the user in your external DB.
PG_PASSWORD=postgres # This can be anything if using docker-compose, otherwise match it with the password in your external DB.
PG_HOST=postgres # Only change this if you are connecting to an external postgres database.
PG_PORT=5432 # Only change this if you are connecting to an external postgres database on a non-standard port.
PG_DB=groupironman_db # This can be anything if using docker-compose, otherwise match it with the database name in your external DB.
BACKEND_SECRET=somerandomkey # This can be anything, generate a random secret using some generator. It is used by the backend for hashing.

group-ironmen-master/LICENSE Normal file

@@ -0,0 +1,25 @@
BSD 2-Clause License
Copyright (c) 2022, Christopher Brown
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

group-ironmen-master/README.md Normal file

@@ -0,0 +1,66 @@
# Group Ironmen Tracker Frontend and Backend
Website: [groupiron.men](https://groupiron.men)
Source for plugin: [https://github.com/christoabrown/group-ironmen-tracker](https://github.com/christoabrown/group-ironmen-tracker)
This repo is for the frontend website and backend of the above plugin.
This plugin tracks information about your group ironman player and sends it to a server where you and your other group members can view it. Currently it tracks:
* Inventory, equipment, bank, rune pouch, and shared bank
* Skill XP
* World position, viewable in an interactive map
* HP, prayer, energy, and current world, as well as inactivity
* Quest state - completed, finished, in progress
# Self-hosting
It is possible to self-host the frontend and backend rather than use [groupiron.men](https://groupiron.men).
In the plugin settings, put the URL that you are hosting the website on. Leaving it blank will default to https://groupiron.men.
![](https://i.imgur.com/0JFD7D5.png)
## With Docker
Prerequisites
* Docker
* docker-compose
### With docker-compose
Copy the `docker-compose.yml`, `.env.example`, and `schema.sql` (found in `server/src/sql`) files onto your server.
Copy the contents of `.env.example` into a new file named `.env` in the same directory and fill it with your secrets.
The `.env` file explains what should go into each secret.
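For example, a minimal way to bootstrap the file on a Unix-like shell (`openssl` here is just one of many ways to generate a random secret):
```sh
cp .env.example .env
# generate a random value to use for BACKEND_SECRET
openssl rand -hex 32
```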
The `docker-compose.yml` has a line that takes the path to the `schema.sql`. Make sure to update this to the relative or absolute path of the file on your server.
After you have set up the `.env` file and `schema.sql` path, you can run `docker-compose up -d` and this will spin up both the frontend and backend. The backend should be available on port 5000 and the frontend on port 4000, although these can be changed in the docker-compose file.
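Once the containers are up, a quick sanity check (assuming the default ports from the compose file):
```sh
docker-compose ps              # all three containers should be "Up"
curl -I http://localhost:4000  # frontend
curl -I http://localhost:5000  # backend
```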
### Without docker-compose (untested)
If you are not using docker-compose, you will need to set up the Postgres database yourself and pass secrets in using Docker environment variables. See the [Without Docker](#without-docker) section below for how to set up the database.
You can then run the following to start the frontend image, filling in the values of the environment variables:
```sh
docker run -d -p 4000:4000 -e HOST_URL= chrisleeeee/group-ironmen-tracker-frontend
```
Same thing for the backend:
```sh
docker run -d -p 8080:8080 -e PG_USER= -e PG_PASSWORD= -e PG_HOST= -e PG_PORT= -e PG_DB= -e BACKEND_SECRET= chrisleeeee/group-ironmen-tracker-backend
```
Check `.env.example` for an explanation of what each environment variable should be.
Once it's running, the backend should be available on port 8080 and the frontend on port 4000.
## Without Docker
To be filled...

11
group-ironmen-master/cache/.gitignore vendored Normal file

@@ -0,0 +1,11 @@
runelite
node_modules
item-data
item-images
.#*
items_need_images.csv
item_data.json
cache
map-data
output_files
output.dzi

128
group-ironmen-master/cache/Cache.java vendored Normal file

@@ -0,0 +1,128 @@
package net.runelite.cache;
import net.runelite.cache.definitions.loaders.ModelLoader;
import net.runelite.cache.definitions.providers.ModelProvider;
import net.runelite.cache.fs.Archive;
import net.runelite.cache.fs.Index;
import net.runelite.cache.fs.Store;
import net.runelite.cache.item.ItemSpriteFactory;
import org.apache.commons.cli.*;
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
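// Replacement entry point for RuneLite's cache tool: renders inventory sprites for a CSV list of item ids.
// Invoked from update.js as: java -jar cache-*-jar-with-dependencies.jar -c <cache dir> -ids items_need_images.csv -output ./item-images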
public class Cache {
public static void main(String[] args) throws IOException {
Options options = new Options();
options.addOption("c", "cache", true, "cache base");
options.addOption(null, "ids", true, "csv file with item ids to create images from");
options.addOption(null, "output", true, "directory to dump item model images to");
CommandLineParser parser = new DefaultParser();
CommandLine cmd;
try {
cmd = parser.parse(options, args);
} catch (ParseException ex) {
System.err.println("Error parsing command line options: " + ex.getMessage());
System.exit(-1);
return;
}
String cache = cmd.getOptionValue("cache");
Store store = loadStore(cache);
if (cmd.hasOption("output") && cmd.hasOption("ids")) {
String outputDir = cmd.getOptionValue("output");
String imageIdsFile = cmd.getOptionValue("ids");
if (outputDir == null) {
System.err.println("Item image directory must be specified");
return;
}
if (imageIdsFile == null) {
System.err.println("Image ID CSV file must be specified");
return;
}
List<Integer> itemIds = new ArrayList<>();
try (BufferedReader br = new BufferedReader(new FileReader(imageIdsFile))) {
String line = br.readLine();
if (line != null) {
String[] values = line.split(",");
for (String value : values) {
Integer itemId = Integer.parseInt(value);
itemIds.add(itemId);
}
}
}
System.out.println("Dumping item model images to " + outputDir);
dumpItemModelImages(store, new File(outputDir), itemIds);
} else {
System.err.println("Nothing to do");
}
}
private static Store loadStore(String cache) throws IOException {
Store store = new Store(new File(cache));
store.load();
return store;
}
private static void dumpItemModelImages(Store store, File outputDir, List<Integer> itemIds) throws IOException {
ItemManager dumper = new ItemManager(store);
dumper.load();
ModelProvider modelProvider = modelId -> {
Index models = store.getIndex(IndexType.MODELS);
Archive archive = models.getArchive(modelId);
byte[] data = archive.decompress(store.getStorage().loadArchive(archive));
return new ModelLoader().load(modelId, data);
};
SpriteManager spriteManager = new SpriteManager(store);
spriteManager.load();
TextureManager textureManager = new TextureManager(store);
textureManager.load();
if (!outputDir.exists()) {
outputDir.mkdir();
}
for (Integer itemId : itemIds) {
try {
final int border = 1;
final int shadowColor = 0x111111;
final boolean noted = false;
BufferedImage sprite = ItemSpriteFactory.createSprite(
dumper,
modelProvider,
spriteManager,
textureManager,
itemId,
1,
border,
shadowColor,
noted);
File out = new File(outputDir, itemId + ".png");
assert sprite != null;
ImageIO.write(sprite, "PNG", out);
} catch (Exception ex) {
System.err.println("error dumping item " + itemId);
}
}
}
}

group-ironmen-master/cache/CollectionLogDumper.java vendored Normal file

@@ -0,0 +1,226 @@
package net.runelite.cache;
import com.google.common.collect.ImmutableList;
import com.google.common.io.Files;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import lombok.Data;
import net.runelite.cache.definitions.EnumDefinition;
import net.runelite.cache.definitions.ItemDefinition;
import net.runelite.cache.definitions.ScriptDefinition;
import net.runelite.cache.definitions.StructDefinition;
import net.runelite.cache.definitions.loaders.EnumLoader;
import net.runelite.cache.definitions.loaders.ScriptLoader;
import net.runelite.cache.fs.Archive;
import net.runelite.cache.fs.ArchiveFiles;
import net.runelite.cache.fs.FSFile;
import net.runelite.cache.fs.Index;
import net.runelite.cache.fs.Storage;
import net.runelite.cache.fs.Store;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.DefaultParser;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import java.io.File;
import java.io.IOException;
import java.nio.charset.Charset;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class CollectionLogDumper
{
private static String outputDirectory;
private static final List<Integer> COLLECTION_LOG_TAB_STRUCT_IDS = ImmutableList.of(
471, // Bosses
472, // Raids
473, // Clues
474, // Minigames
475 // Other
);
private static final int COLLECTION_LOG_TAB_ENUM_PARAM_ID = 683;
private static final int COLLECTION_LOG_PAGE_NAME_PARAM_ID = 689;
private static final int COLLECTION_LOG_PAGE_ITEMS_ENUM_PARAM_ID = 690;
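// ClientScript whose string table and switch map are mined below for each page's completion-count labels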
private static final int COLLECTION_CATEGORY_COUNT_SCRIPT = 2735;
private static final Gson gson = new GsonBuilder().setPrettyPrinting().create();
@Data
static class CollectionLogItem
{
public Integer id;
public String name;
}
@Data
static class CollectionLogPage
{
public String name;
public List<String> completion_labels = new ArrayList<>();
public List<CollectionLogItem> items = new ArrayList<>();
}
@Data
static class CollectionLogTab
{
public Integer tabId;
public List<CollectionLogPage> pages = new ArrayList<>();
}
public static void main(String[] args) throws IOException
{
Options options = new Options();
options.addOption(Option.builder().longOpt("cachedir").hasArg().required().build());
options.addOption(Option.builder().longOpt("outputdir").hasArg().required().build());
CommandLineParser parser = new DefaultParser();
CommandLine cmd;
try
{
cmd = parser.parse(options, args);
}
catch (ParseException ex)
{
System.err.println("Error parsing command line options: " + ex.getMessage());
System.exit(-1);
return;
}
final String cacheDirectory = cmd.getOptionValue("cachedir");
outputDirectory = cmd.getOptionValue("outputdir");
File base = new File(cacheDirectory);
File outDir = new File(outputDirectory);
outDir.mkdirs();
try (Store store = new Store(base))
{
store.load();
Storage storage = store.getStorage();
Index index = store.getIndex(IndexType.CONFIGS);
Archive archive = index.getArchive(ConfigType.ENUM.getId());
byte[] archiveData = storage.loadArchive(archive);
ArchiveFiles files = archive.getFiles(archiveData);
EnumLoader enumLoader = new EnumLoader();
StructManager structManager = new StructManager(store);
structManager.load();
Index scriptIndex = store.getIndex(IndexType.CLIENTSCRIPT);
Archive collectionCategoryCountScript = scriptIndex.getArchive(COLLECTION_CATEGORY_COUNT_SCRIPT);
byte[] collectionCategoryCountScriptData = storage.loadArchive(collectionCategoryCountScript);
FSFile collectionCategoryCountScriptFile = collectionCategoryCountScript.getFiles(collectionCategoryCountScriptData).findFile(0);
ScriptLoader scriptLoader = new ScriptLoader();
ScriptDefinition collectionCategoryCountScriptDefinition = scriptLoader.load(COLLECTION_CATEGORY_COUNT_SCRIPT, collectionCategoryCountScriptFile.getContents());
Map<Integer, List<String>> completionLabels = new HashMap<>();
String[] labelStrings = collectionCategoryCountScriptDefinition.getStringOperands();
int offset = 1;
for (Integer pageId : collectionCategoryCountScriptDefinition.getSwitches()[0].keySet())
{
List<String> labels = new ArrayList<>();
int count = 0;
// Every completion count return has 3 strings
for (int i = offset; count < 3; ++i, ++offset)
{
String label = labelStrings[i];
String previousLabel = labelStrings[i - 1];
// If the previous value is another valid label then it is part of some argument to a proc
// and we only want the first one. Example is the "High-level Gambles" which can change to not
// include the word "Gamble" on mobile.
if (label != null && !label.trim().startsWith("<") && (previousLabel == null || !previousLabel.trim().endsWith(":")))
{
++count;
// non-empty labels should always have a <col></col> tag that we can advance to avoid
// any other empty string up until then.
if (label.trim().endsWith(":"))
{
for (; labelStrings[i] == null || !labelStrings[i].trim().equals("</col>"); ++i, ++offset);
labels.add(label.trim().substring(0, label.trim().length() - 1));
}
}
}
Collections.reverse(labels);
completionLabels.put(pageId, labels);
}
ItemManager itemManager = new ItemManager(store);
itemManager.load();
List<CollectionLogTab> collectionLog = new ArrayList<>();
int tabIdx = 0;
for (Integer collectionLogTabStructId : COLLECTION_LOG_TAB_STRUCT_IDS)
{
StructDefinition tabStruct = structManager.getStruct(collectionLogTabStructId);
Integer tabEnumId = (Integer) tabStruct.getParams().get(COLLECTION_LOG_TAB_ENUM_PARAM_ID);
EnumDefinition tabEnum = getEnumDefinition(enumLoader, files, tabEnumId);
CollectionLogTab collectionLogTab = new CollectionLogTab();
collectionLogTab.tabId = tabIdx++;
collectionLog.add(collectionLogTab);
for (Integer pageStructId : tabEnum.getIntVals())
{
StructDefinition pageStruct = structManager.getStruct(pageStructId);
String pageName = (String) pageStruct.getParams().get(COLLECTION_LOG_PAGE_NAME_PARAM_ID);
Integer pageItemsEnumId = (Integer) pageStruct.getParams().get(COLLECTION_LOG_PAGE_ITEMS_ENUM_PARAM_ID);
EnumDefinition pageItemsEnum = getEnumDefinition(enumLoader, files, pageItemsEnumId);
CollectionLogPage collectionLogPage = new CollectionLogPage();
collectionLogPage.name = pageName;
collectionLogPage.completion_labels = completionLabels.getOrDefault(pageStructId, new ArrayList<>());
collectionLogTab.pages.add(collectionLogPage);
for (Integer pageItemId : pageItemsEnum.getIntVals())
{
CollectionLogItem collectionLogItem = new CollectionLogItem();
ItemDefinition item = itemManager.getItem(pageItemId);
collectionLogItem.id = item.getId();
collectionLogItem.name = item.getName();
collectionLogPage.items.add(collectionLogItem);
}
}
}
Files.asCharSink(new File(outputDirectory, "collection_log_info.json"), Charset.defaultCharset()).write(gson.toJson(collectionLog));
}
}
private static EnumDefinition getEnumDefinition(EnumLoader enumLoader, ArchiveFiles files, Integer enumId) throws IOException
{
FSFile enumFile = null;
for (FSFile file : files.getFiles())
{
if (file.getFileId() == enumId)
{
enumFile = file;
break;
}
}
if (enumFile == null)
{
throw new IOException("Unable to find enum with id " + enumId);
}
byte[] b = enumFile.getContents();
EnumDefinition enumDefinition = enumLoader.load(enumFile.getFileId(), b);
if (enumDefinition == null)
{
throw new IOException("Unable to load enum definition for enum id " + enumId);
}
return enumDefinition;
}
}

group-ironmen-master/cache/ItemSpriteFactory.java vendored Normal file

@@ -0,0 +1,442 @@
/*
* Copyright (c) 2018, Adam <Adam@sigterm.info>
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
*
* 1. Redistributions of source code must retain the above copyright notice, this
* list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright notice,
* this list of conditions and the following disclaimer in the documentation
* and/or other materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
* ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
* ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
* SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package net.runelite.cache.item;
import java.awt.image.BufferedImage;
import java.io.IOException;
import net.runelite.cache.definitions.ItemDefinition;
import net.runelite.cache.definitions.ModelDefinition;
import net.runelite.cache.definitions.providers.ItemProvider;
import net.runelite.cache.definitions.providers.ModelProvider;
import net.runelite.cache.definitions.providers.SpriteProvider;
import net.runelite.cache.definitions.providers.TextureProvider;
import net.runelite.cache.models.FaceNormal;
import net.runelite.cache.models.JagexColor;
import net.runelite.cache.models.VertexNormal;
public class ItemSpriteFactory
{
public static BufferedImage createSprite(ItemProvider itemProvider, ModelProvider modelProvider,
SpriteProvider spriteProvider, TextureProvider textureProvider,
int itemId, int quantity, int border, int shadowColor,
boolean noted) throws IOException
{
SpritePixels spritePixels = createSpritePixels(itemProvider, modelProvider, spriteProvider, textureProvider,
itemId, quantity, border, shadowColor, noted);
return spritePixels == null ? null : spritePixels.toBufferedImage();
}
private static SpritePixels createSpritePixels(ItemProvider itemProvider, ModelProvider modelProvider,
SpriteProvider spriteProvider, TextureProvider textureProvider,
int itemId, int quantity, int border, int shadowColor,
boolean noted) throws IOException
{
ItemDefinition item = itemProvider.provide(itemId);
if (quantity > 1 && item.countObj != null)
{
int stackItemID = -1;
for (int i = 0; i < 10; ++i)
{
if (quantity >= item.countCo[i] && item.countCo[i] != 0)
{
stackItemID = item.countObj[i];
}
}
if (stackItemID != -1)
{
item = itemProvider.provide(stackItemID);
}
}
Model itemModel = getModel(modelProvider, item);
if (itemModel == null)
{
return null;
}
SpritePixels auxSpritePixels = null;
if (item.notedTemplate != -1)
{
auxSpritePixels = createSpritePixels(itemProvider, modelProvider, spriteProvider, textureProvider,
item.notedID, 10, 1, 0, true);
if (auxSpritePixels == null)
{
return null;
}
}
else if (item.boughtTemplateId != -1)
{
auxSpritePixels = createSpritePixels(itemProvider, modelProvider, spriteProvider, textureProvider,
item.boughtId, quantity, border, 0, false);
if (auxSpritePixels == null)
{
return null;
}
}
else if (item.placeholderTemplateId != -1)
{
auxSpritePixels = createSpritePixels(itemProvider, modelProvider, spriteProvider, textureProvider,
item.placeholderId, quantity, 0, 0, false);
if (auxSpritePixels == null)
{
return null;
}
}
RSTextureProvider rsTextureProvider = new RSTextureProvider(textureProvider, spriteProvider);
int width = 36 * 2;
int height = 32 * 2;
SpritePixels spritePixels = new SpritePixels(width, height);
Graphics3D graphics = new Graphics3D(rsTextureProvider);
graphics.setBrightness(JagexColor.BRIGHTNESS_MAX);
graphics.setRasterBuffer(spritePixels.pixels, width, height);
graphics.reset();
graphics.setRasterClipping();
graphics.setOffset(16 * 2, 16 * 2);
graphics.rasterGouraudLowRes = false;
if (item.placeholderTemplateId != -1)
{
auxSpritePixels.drawAtOn(graphics, 0, 0);
}
int zoom2d;
// The holy symbol item for some reason needs a different zoom value otherwise it gets cut off
if (itemId == 1716 || itemId == 1718) {
zoom2d = (int) (((double) item.zoom2d) / 1.5D);
} else {
zoom2d = (int) (((double) item.zoom2d) / 1.95D);
}
if (noted)
{
zoom2d = (int) ((double) zoom2d * 1.5D);
}
else if (border == 2)
{
zoom2d = (int) ((double) zoom2d * 1.04D);
}
int var17 = zoom2d * Graphics3D.SINE[item.xan2d] >> 16;
int var18 = zoom2d * Graphics3D.COSINE[item.xan2d] >> 16;
itemModel.calculateBoundsCylinder();
itemModel.projectAndDraw(graphics, 0,
item.yan2d,
item.zan2d,
item.xan2d,
item.xOffset2d,
itemModel.modelHeight / 2 + var17 + item.yOffset2d,
var18 + item.yOffset2d);
if (item.boughtTemplateId != -1)
{
auxSpritePixels.drawAtOn(graphics, 0, 0);
}
if (border >= 1)
{
spritePixels.drawBorder(1);
}
if (border >= 2)
{
spritePixels.drawBorder(0xffffff);
}
if (shadowColor != 0)
{
spritePixels.drawShadow(shadowColor);
}
graphics.setRasterBuffer(spritePixels.pixels, width, height);
if (item.notedTemplate != -1)
{
auxSpritePixels.drawAtOn(graphics, 0, 0);
}
graphics.setRasterBuffer(graphics.graphicsPixels,
graphics.graphicsPixelsWidth,
graphics.graphicsPixelsHeight);
graphics.setRasterClipping();
graphics.rasterGouraudLowRes = true;
return spritePixels;
}
private static Model getModel(ModelProvider modelProvider, ItemDefinition item) throws IOException
{
Model itemModel;
ModelDefinition inventoryModel = modelProvider.provide(item.inventoryModel);
if (inventoryModel == null)
{
return null;
}
if (item.resizeX != 128 || item.resizeY != 128 || item.resizeZ != 128)
{
inventoryModel.resize(item.resizeX, item.resizeY, item.resizeZ);
}
if (item.colorFind != null)
{
for (int i = 0; i < item.colorFind.length; ++i)
{
inventoryModel.recolor(item.colorFind[i], item.colorReplace[i]);
}
}
if (item.textureFind != null)
{
for (int i = 0; i < item.textureFind.length; ++i)
{
inventoryModel.retexture(item.textureFind[i], item.textureReplace[i]);
}
}
itemModel = light(inventoryModel, item.ambient + 64, item.contrast + 768, -50, -10, -50);
return itemModel;
}
private static Model light(ModelDefinition def, int ambient, int contrast, int x, int y, int z)
{
def.computeNormals();
int somethingMagnitude = (int) Math.sqrt((double) (z * z + x * x + y * y));
int var7 = somethingMagnitude * contrast >> 8;
Model litModel = new Model();
litModel.faceColors1 = new int[def.faceCount];
litModel.faceColors2 = new int[def.faceCount];
litModel.faceColors3 = new int[def.faceCount];
if (def.numTextureFaces > 0 && def.textureCoords != null)
{
int[] var9 = new int[def.numTextureFaces];
int var10;
for (var10 = 0; var10 < def.faceCount; ++var10)
{
if (def.textureCoords[var10] != -1)
{
++var9[def.textureCoords[var10] & 255];
}
}
litModel.numTextureFaces = 0;
for (var10 = 0; var10 < def.numTextureFaces; ++var10)
{
if (var9[var10] > 0 && def.textureRenderTypes[var10] == 0)
{
++litModel.numTextureFaces;
}
}
litModel.texIndices1 = new int[litModel.numTextureFaces];
litModel.texIndices2 = new int[litModel.numTextureFaces];
litModel.texIndices3 = new int[litModel.numTextureFaces];
var10 = 0;
for (int i = 0; i < def.numTextureFaces; ++i)
{
if (var9[i] > 0 && def.textureRenderTypes[i] == 0)
{
litModel.texIndices1[var10] = def.texIndices1[i] & '\uffff';
litModel.texIndices2[var10] = def.texIndices2[i] & '\uffff';
litModel.texIndices3[var10] = def.texIndices3[i] & '\uffff';
var9[i] = var10++;
}
else
{
var9[i] = -1;
}
}
litModel.textureCoords = new byte[def.faceCount];
for (int i = 0; i < def.faceCount; ++i)
{
if (def.textureCoords[i] != -1)
{
litModel.textureCoords[i] = (byte) var9[def.textureCoords[i] & 255];
}
else
{
litModel.textureCoords[i] = -1;
}
}
}
for (int faceIdx = 0; faceIdx < def.faceCount; ++faceIdx)
{
byte faceType;
if (def.faceRenderTypes == null)
{
faceType = 0;
}
else
{
faceType = def.faceRenderTypes[faceIdx];
}
byte faceAlpha;
if (def.faceTransparencies == null)
{
faceAlpha = 0;
}
else
{
faceAlpha = def.faceTransparencies[faceIdx];
}
short faceTexture;
if (def.faceTextures == null)
{
faceTexture = -1;
}
else
{
faceTexture = def.faceTextures[faceIdx];
}
if (faceAlpha == -2)
{
faceType = 3;
}
if (faceAlpha == -1)
{
faceType = 2;
}
VertexNormal vertexNormal;
int tmp;
FaceNormal faceNormal;
if (faceTexture == -1)
{
if (faceType != 0)
{
if (faceType == 1)
{
faceNormal = def.faceNormals[faceIdx];
tmp = (y * faceNormal.y + z * faceNormal.z + x * faceNormal.x) / (var7 / 2 + var7) + ambient;
litModel.faceColors1[faceIdx] = method2608(def.faceColors[faceIdx] & '\uffff', tmp);
litModel.faceColors3[faceIdx] = -1;
}
else if (faceType == 3)
{
litModel.faceColors1[faceIdx] = 128;
litModel.faceColors3[faceIdx] = -1;
}
else
{
litModel.faceColors3[faceIdx] = -2;
}
}
else
{
int var15 = def.faceColors[faceIdx] & '\uffff';
vertexNormal = def.vertexNormals[def.faceIndices1[faceIdx]];
tmp = (y * vertexNormal.y + z * vertexNormal.z + x * vertexNormal.x) / (var7 * vertexNormal.magnitude) + ambient;
litModel.faceColors1[faceIdx] = method2608(var15, tmp);
vertexNormal = def.vertexNormals[def.faceIndices2[faceIdx]];
tmp = (y * vertexNormal.y + z * vertexNormal.z + x * vertexNormal.x) / (var7 * vertexNormal.magnitude) + ambient;
litModel.faceColors2[faceIdx] = method2608(var15, tmp);
vertexNormal = def.vertexNormals[def.faceIndices3[faceIdx]];
tmp = (y * vertexNormal.y + z * vertexNormal.z + x * vertexNormal.x) / (var7 * vertexNormal.magnitude) + ambient;
litModel.faceColors3[faceIdx] = method2608(var15, tmp);
}
}
else if (faceType != 0)
{
if (faceType == 1)
{
faceNormal = def.faceNormals[faceIdx];
tmp = (y * faceNormal.y + z * faceNormal.z + x * faceNormal.x) / (var7 / 2 + var7) + ambient;
litModel.faceColors1[faceIdx] = bound2to126(tmp);
litModel.faceColors3[faceIdx] = -1;
}
else
{
litModel.faceColors3[faceIdx] = -2;
}
}
else
{
vertexNormal = def.vertexNormals[def.faceIndices1[faceIdx]];
tmp = (y * vertexNormal.y + z * vertexNormal.z + x * vertexNormal.x) / (var7 * vertexNormal.magnitude) + ambient;
litModel.faceColors1[faceIdx] = bound2to126(tmp);
vertexNormal = def.vertexNormals[def.faceIndices2[faceIdx]];
tmp = (y * vertexNormal.y + z * vertexNormal.z + x * vertexNormal.x) / (var7 * vertexNormal.magnitude) + ambient;
litModel.faceColors2[faceIdx] = bound2to126(tmp);
vertexNormal = def.vertexNormals[def.faceIndices3[faceIdx]];
tmp = (y * vertexNormal.y + z * vertexNormal.z + x * vertexNormal.x) / (var7 * vertexNormal.magnitude) + ambient;
litModel.faceColors3[faceIdx] = bound2to126(tmp);
}
}
litModel.verticesCount = def.vertexCount;
litModel.verticesX = def.vertexX;
litModel.verticesY = def.vertexY;
litModel.verticesZ = def.vertexZ;
litModel.indicesCount = def.faceCount;
litModel.indices1 = def.faceIndices1;
litModel.indices2 = def.faceIndices2;
litModel.indices3 = def.faceIndices3;
litModel.facePriorities = def.faceRenderPriorities;
litModel.faceTransparencies = def.faceTransparencies;
litModel.faceTextures = def.faceTextures;
return litModel;
}
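// Scales the lightness of a packed HSL color: the low 7 bits are lightness, the remaining bits (0xFF80 = 65408) carry hue and saturation.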
static int method2608(int var0, int var1)
{
var1 = ((var0 & 127) * var1) >> 7;
var1 = bound2to126(var1);
return (var0 & 65408) + var1;
}
static int bound2to126(int var0)
{
if (var0 < 2)
{
var0 = 2;
}
else if (var0 > 126)
{
var0 = 126;
}
return var0;
}
}

File diff suppressed because it is too large

group-ironmen-master/cache/MapLabelDumper.java vendored Normal file

@@ -0,0 +1,264 @@
package net.runelite.cache;
import com.google.gson.Gson;
import lombok.extern.slf4j.Slf4j;
import net.runelite.cache.definitions.AreaDefinition;
import net.runelite.cache.definitions.FontDefinition;
import net.runelite.cache.definitions.SpriteDefinition;
import net.runelite.cache.definitions.WorldMapElementDefinition;
import net.runelite.cache.fs.Store;
import net.runelite.cache.region.Position;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.DefaultParser;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import javax.imageio.ImageIO;
import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class MapLabelDumper
{
private static String outputDirectory;
public static void main(String[] args) throws IOException
{
Options options = new Options();
options.addOption(Option.builder().longOpt("cachedir").hasArg().required().build());
options.addOption(Option.builder().longOpt("outputdir").hasArg().required().build());
CommandLineParser parser = new DefaultParser();
CommandLine cmd;
try
{
cmd = parser.parse(options, args);
}
catch (ParseException ex)
{
System.err.println("Error parsing command line options: " + ex.getMessage());
System.exit(-1);
return;
}
final String cacheDirectory = cmd.getOptionValue("cachedir");
outputDirectory = cmd.getOptionValue("outputdir");
File base = new File(cacheDirectory);
File outDir = new File(outputDirectory);
outDir.mkdirs();
try (Store store = new Store(base))
{
store.load();
WorldMapManager worldMapManager = new WorldMapManager(store);
worldMapManager.load();
AreaManager areas = new AreaManager(store);
areas.load();
FontManager fonts = new FontManager(store);
fonts.load();
SpriteManager sprites = new SpriteManager(store);
sprites.load();
List<Object[]> result = new ArrayList<>();
FontName[] fontSizes = new FontName[]{FontName.VERDANA_11, FontName.VERDANA_13, FontName.VERDANA_15};
List<WorldMapElementDefinition> elements = worldMapManager.getElements();
int x = 0;
for (WorldMapElementDefinition element : elements)
{
AreaDefinition area = areas.getArea(element.getAreaDefinitionId());
Position worldPosition = element.getWorldPosition();
if (area == null || area.getName() == null)
{
continue;
}
result.add(new Object[]{
worldPosition.getX(),
worldPosition.getY(),
worldPosition.getZ()
});
FontName fontSize = fontSizes[area.getTextScale()];
FontDefinition font = fonts.findFontByName(fontSize.getName());
String areaLabel = area.getName();
String[] lines = areaLabel.split("<br>");
int ascent = 0;
int startImageWidth = 0;
for (String line : lines)
{
startImageWidth = Math.max(startImageWidth, font.stringWidth(line));
}
startImageWidth += 200;
int startImageHeight = 300;
BufferedImage image = new BufferedImage(startImageWidth, startImageHeight, BufferedImage.TYPE_INT_ARGB);
for (String line : lines)
{
int advance = 0;
int stringWidth = font.stringWidth(line);
for (int i = 0; i < line.length(); ++i)
{
char c = line.charAt(i);
SpriteDefinition sprite = sprites.findSpriteByArchiveName(fontSize.getName(), c);
if (sprite.getWidth() != 0 && sprite.getHeight() != 0)
{
blitGlyph(image,
advance + (startImageWidth / 2) - (stringWidth / 2),
ascent + (startImageHeight / 2),
area.getTextColor(),
sprite
);
}
advance += font.getAdvances()[c];
}
ascent += font.getAscent() / 2;
}
int imageTop = 0;
int imageBottom = 0;
for (int y = 0; y < startImageHeight; ++y) {
boolean lineHasPixels = false;
for (int xx = 0; xx < startImageWidth; ++xx)
{
if (image.getRGB(xx, y) != 0)
{
lineHasPixels = true;
}
}
if (lineHasPixels)
{
if (imageTop == 0)
{
imageTop = y;
}
else
{
imageBottom = Math.max(imageBottom, y);
}
}
}
imageBottom += 1;
int imageLeft = 0;
int imageRight = 0;
for (int xx = 0; xx < startImageWidth; ++xx)
{
boolean columnHasPixels = false;
for (int y = 0; y < startImageHeight; ++y)
{
if (image.getRGB(xx, y) != 0)
{
columnHasPixels = true;
}
}
if (columnHasPixels)
{
if (imageLeft == 0)
{
imageLeft = xx;
}
else
{
imageRight = Math.max(imageRight, xx);
}
}
}
imageRight += 1;
int imageHeight = imageBottom - imageTop;
int imageWidth = imageRight - imageLeft;
BufferedImage finalImage = new BufferedImage(imageWidth, imageHeight, BufferedImage.TYPE_INT_ARGB);
Graphics g = finalImage.createGraphics();
g.drawImage(
image,
0,
0,
imageWidth,
imageHeight,
imageLeft,
imageTop,
imageRight,
imageBottom,
null
);
File imageFile = new File(outDir, "" + (x++) + ".png");
ImageIO.write(finalImage, "png", imageFile);
}
try {
Gson gson = new Gson();
File jsonFile = new File(outDir, "map-labels.json");
FileWriter writer = new FileWriter(jsonFile);
gson.toJson(result, writer);
writer.flush();
writer.close();
} catch (Exception ex) {
log.error("Failed to write map-labels.json", ex);
}
}
}
private static void blitGlyph(BufferedImage dst, int x, int y, int color, SpriteDefinition glyph)
{
int[] pixels = glyph.getPixels();
int[] shadowPixels = new int[pixels.length];
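// NOTE: recolors the glyph's pixel buffer in place while building a solid-black copy of it for the drop shadow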
for (int i = 0; i < pixels.length; ++i)
{
if (pixels[i] != 0)
{
pixels[i] = color;
shadowPixels[i] = 0xFF000000;
}
}
SpriteDefinition shadow = new SpriteDefinition();
shadow.setPixels(shadowPixels);
shadow.setOffsetX(glyph.getOffsetX());
shadow.setOffsetY(glyph.getOffsetY());
shadow.setWidth(glyph.getWidth());
shadow.setHeight(glyph.getHeight());
blitIcon(dst, x + 1, y + 1, shadow);
blitIcon(dst, x, y, glyph);
}
private static void blitIcon(BufferedImage dst, int x, int y, SpriteDefinition sprite)
{
x += sprite.getOffsetX();
y += sprite.getOffsetY();
int ymin = Math.max(0, -y);
int ymax = Math.min(sprite.getHeight(), dst.getHeight() - y);
int xmin = Math.max(0, -x);
int xmax = Math.min(sprite.getWidth(), dst.getWidth() - x);
for (int yo = ymin; yo < ymax; yo++)
{
for (int xo = xmin; xo < xmax; xo++)
{
int rgb = sprite.getPixels()[xo + (yo * sprite.getWidth())];
if (rgb != 0)
{
dst.setRGB(x + xo, y + yo, rgb | 0xFF000000);
}
}
}
}
}

1259
group-ironmen-master/cache/package-lock.json generated vendored Normal file

File diff suppressed because it is too large

19
group-ironmen-master/cache/package.json vendored Normal file

@@ -0,0 +1,19 @@
{
"name": "osrs-cache",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"update": "node update.js"
},
"author": "",
"license": "ISC",
"dependencies": {
"async": "^3.2.3",
"axios": "^0.26.1",
"glob": "^8.0.1",
"sharp": "^0.31.1",
"unzipper": "^0.12.3",
"xml2js": "^0.4.23"
}
}

513
group-ironmen-master/cache/update.js vendored Normal file

@@ -0,0 +1,513 @@
const child_process = require('child_process');
const fs = require('fs');
const xml2js = require('xml2js');
const glob = require('glob');
const nAsync = require('async');
const path = require('path');
const axios = require('axios');
const sharp = require('sharp');
const unzipper = require('unzipper');
// NOTE: sharp will keep some files open and prevent them from being deleted
sharp.cache(false);
const xmlParser = new xml2js.Parser();
const xmlBuilder = new xml2js.Builder();
const runelitePath = './runelite';
const cacheProjectPath = `${runelitePath}/cache`;
const cachePomPath = `${cacheProjectPath}/pom.xml`;
const cacheJarOutputDir = `${cacheProjectPath}/target`;
const osrsCacheDirectory = './cache/cache';
const siteItemDataPath = '../site/public/data/item_data.json';
const siteMapIconMetaPath = "../site/public/data/map_icons.json";
const siteMapLabelMetaPath = "../site/public/data/map_labels.json";
const siteItemImagesPath = '../site/public/icons/items';
const siteMapImagesPath = '../site/public/map';
const siteMapLabelsPath = '../site/public/map/labels';
const siteMapIconPath = "../site/public/map/icons/map_icons.webp";
const tileSize = 256;
function exec(command, options) {
console.log(command);
options = options || {};
options.stdio = 'inherit';
try {
child_process.execSync(command, options);
} catch (err) {
console.log(err);
process.exit(1);
}
}
async function retry(fn, skipLast) {
const attempts = 10;
for (let i = 0; i < attempts; ++i) {
try {
await fn();
return;
} catch (ex) {
await new Promise(resolve => setTimeout(resolve, 100));
if (i === (attempts - 1) && skipLast) {
console.error(ex);
}
}
}
if (!skipLast) {
fn();
}
}
async function setMainClassInCachePom(mainClass) {
console.log(`Setting mainClass of ${cachePomPath} to ${mainClass}`);
xmlParser.reset();
const cachePomData = fs.readFileSync(cachePomPath, 'utf8');
const cachePom = await xmlParser.parseStringPromise(cachePomData);
const plugins = cachePom.project.build[0].plugins[0].plugin;
const mavenAssemblyPlugin = plugins.find((plugin) => plugin.artifactId[0] === 'maven-assembly-plugin');
const configuration = mavenAssemblyPlugin.configuration[0];
configuration.archive = [{ manifest: [{ mainClass: [mainClass] }] }];
const cachePomResult = xmlBuilder.buildObject(cachePom);
fs.writeFileSync(cachePomPath, cachePomResult);
}
function execRuneliteCache(params) {
const jars = glob.sync(`${cacheJarOutputDir}/cache-*-jar-with-dependencies.jar`);
let cacheJar = jars[0];
let cacheJarmtime = fs.statSync(cacheJar).mtime;
for (const jar of jars) {
const mtime = fs.statSync(jar).mtime;
if (mtime > cacheJarmtime) {
cacheJarmtime = mtime;
cacheJar = jar;
}
}
const cmd = `java -Xmx8g -jar ${cacheJar} ${params}`;
exec(cmd);
}
async function readAllItemFiles() {
const itemFiles = glob.sync(`./item-data/*.json`);
const result = {};
const q = nAsync.queue((itemFile, callback) => {
fs.promises.readFile(itemFile, 'utf8').then((itemFileData) => {
const item = JSON.parse(itemFileData);
if (isNaN(item.id)) console.log(item);
result[item.id] = item;
callback();
});
}, 50);
for (const itemFile of itemFiles) {
q.push(itemFile);
}
await q.drain();
return result;
}
function buildCacheProject() {
exec(`mvn install -Dmaven.test.skip=true -f pom.xml`, { cwd: cacheProjectPath });
}
async function setupRunelite() {
console.log('Step: Setting up runelite');
if (!fs.existsSync(runelitePath)) {
exec(`git clone "git@github.com:runelite/runelite.git"`);
}
exec(`git fetch origin master`, { cwd: runelitePath });
exec(`git reset --hard origin/master`, { cwd: runelitePath });
}
async function dumpItemData() {
console.log('\nStep: Unpacking item data from cache');
await setMainClassInCachePom('net.runelite.cache.Cache');
buildCacheProject();
execRuneliteCache(`-c ${osrsCacheDirectory} -items ./item-data`);
}
async function getNonAlchableItemNames() {
console.log('\nStep: Fetching unalchable items from wiki');
const nonAlchableItemNames = new Set();
let cmcontinue = '';
do {
const url = `https://oldschool.runescape.wiki/api.php?cmtitle=Category:Items_that_cannot_be_alchemised&action=query&list=categorymembers&format=json&cmlimit=500&cmcontinue=${cmcontinue}`;
const response = await axios.get(url);
const itemNames = response.data.query.categorymembers.map((member) => member.title).filter((title) => !title.startsWith('File:') && !title.startsWith('Category:'));
itemNames.forEach((name) => nonAlchableItemNames.add(name));
cmcontinue = response.data?.continue?.cmcontinue || null;
} while(cmcontinue);
return nonAlchableItemNames;
}
async function buildItemDataJson() {
console.log('\nStep: Build item_data.json');
const items = await readAllItemFiles();
const includedItems = {};
const allIncludedItemIds = new Set();
for (const [itemId, item] of Object.entries(items)) {
if (item.name && item.name.trim().toLowerCase() !== 'null') {
const includedItem = {
name: item.name,
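// high alch value is 60% of an item's base store price, rounded down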
highalch: Math.floor(item.cost * 0.6)
};
const stackedList = [];
if (item.countCo && item.countObj && item.countCo.length > 0 && item.countObj.length > 0) {
for (let i = 0; i < item.countCo.length; ++i) {
const stackBreakPoint = item.countCo[i];
const stackedItemId = item.countObj[i];
if (stackBreakPoint > 0 && stackedItemId === 0) {
console.log(`${itemId}: Item has a stack breakpoint without an associated item id for that stack.`);
} else if (stackBreakPoint > 0 && stackedItemId > 0) {
allIncludedItemIds.add(stackedItemId);
stackedList.push([stackBreakPoint, stackedItemId]);
}
}
if (stackedList.length > 0) {
includedItem.stacks = stackedList;
}
}
allIncludedItemIds.add(item.id);
includedItems[itemId] = includedItem;
}
}
const nonAlchableItemNames = await getNonAlchableItemNames();
let itemsMadeNonAlchable = 0;
for (const item of Object.values(includedItems)) {
const itemName = item.name;
if (nonAlchableItemNames.has(itemName)) {
// NOTE: High alch value = 0 just means unalchable in the context of this program
item.highalch = 0;
itemsMadeNonAlchable++;
}
// NOTE: The wiki data does not list every variant of an item such as 'Abyssal lantern (yew logs)'
// which is also not alchable. So this step is to handle that case by searching for the non variant item.
if (itemName.trim().endsWith(')') && itemName.indexOf('(') !== -1) {
const nonVariantItemName = itemName.substring(0, itemName.indexOf('(')).trim();
if (nonAlchableItemNames.has(nonVariantItemName)) {
item.highalch = 0;
itemsMadeNonAlchable++;
}
}
}
console.log(`${itemsMadeNonAlchable} items were updated to be unalchable`);
fs.writeFileSync('./item_data.json', JSON.stringify(includedItems));
return allIncludedItemIds;
}
async function dumpItemImages(allIncludedItemIds) {
console.log('\nStep: Extract item model images');
console.log(`Generating images for ${allIncludedItemIds.size} items`);
fs.writeFileSync('items_need_images.csv', Array.from(allIncludedItemIds.values()).join(','));
const imageDumperDriver = fs.readFileSync('./Cache.java', 'utf8');
fs.writeFileSync(`${cacheProjectPath}/src/main/java/net/runelite/cache/Cache.java`, imageDumperDriver);
const itemSpriteFactory = fs.readFileSync('./ItemSpriteFactory.java', 'utf8');
fs.writeFileSync(`${cacheProjectPath}/src/main/java/net/runelite/cache/item/ItemSpriteFactory.java`, itemSpriteFactory);
buildCacheProject();
execRuneliteCache(`-c ${osrsCacheDirectory} -ids ./items_need_images.csv -output ./item-images`);
const itemImages = glob.sync(`./item-images/*.png`);
let p = [];
for (const itemImage of itemImages) {
p.push(new Promise(async (resolve) => {
const itemImageData = await sharp(itemImage).webp({ lossless: true }).toBuffer();
fs.unlinkSync(itemImage);
await sharp(itemImageData).webp({ lossless: true, effort: 6 }).toFile(itemImage.replace(".png", ".webp")).then(resolve);
}));
}
await Promise.all(p);
}
async function convertXteasToRuneliteFormat() {
const xteas = JSON.parse(fs.readFileSync(`${osrsCacheDirectory}/../xteas.json`, 'utf8'));
let result = xteas.map((region) => ({
region: region.mapsquare,
keys: region.key
}));
const location = `${osrsCacheDirectory}/../xteas-runelite.json`;
fs.writeFileSync(location, JSON.stringify(result));
return location;
}
async function dumpMapData(xteasLocation) {
console.log('\nStep: Dumping map data');
const mapImageDumper = fs.readFileSync('./MapImageDumper.java', 'utf8');
fs.writeFileSync(`${cacheProjectPath}/src/main/java/net/runelite/cache/MapImageDumper.java`, mapImageDumper);
await setMainClassInCachePom('net.runelite.cache.MapImageDumper');
buildCacheProject();
execRuneliteCache(`--cachedir ${osrsCacheDirectory} --xteapath ${xteasLocation} --outputdir ./map-data`);
}
async function dumpMapLabels() {
console.log('\nStep: Dumping map labels');
const mapLabelDumper = fs.readFileSync('./MapLabelDumper.java', 'utf8');
fs.writeFileSync(`${cacheProjectPath}/src/main/java/net/runelite/cache/MapLabelDumper.java`, mapLabelDumper);
await setMainClassInCachePom('net.runelite.cache.MapLabelDumper');
buildCacheProject();
execRuneliteCache(`--cachedir ${osrsCacheDirectory} --outputdir ./map-data/labels`);
const mapLabels = glob.sync("./map-data/labels/*.png");
let p = [];
for (const mapLabel of mapLabels) {
p.push(new Promise(async (resolve) => {
const mapLabelImageData = await sharp(mapLabel).webp({ lossless: true }).toBuffer();
fs.unlinkSync(mapLabel);
await sharp(mapLabelImageData).webp({ lossless: true, effort: 6 }).toFile(mapLabel.replace(".png", ".webp")).then(resolve);
}));
}
await Promise.all(p);
}
async function dumpCollectionLog() {
console.log('\nStep: Dumping collection log');
const collectionLogDumper = fs.readFileSync('./CollectionLogDumper.java', 'utf8');
fs.writeFileSync(`${cacheProjectPath}/src/main/java/net/runelite/cache/CollectionLogDumper.java`, collectionLogDumper);
await setMainClassInCachePom('net.runelite.cache.CollectionLogDumper');
buildCacheProject();
execRuneliteCache(`--cachedir ${osrsCacheDirectory} --outputdir ../server`);
}
async function tilePlane(plane) {
await retry(() => fs.rmSync('./output_files', { recursive: true, force: true }));
const planeImage = sharp(`./map-data/img-${plane}.png`, { limitInputPixels: false }).flip();
await planeImage.webp({ lossless: true }).tile({
size: tileSize,
depth: "one",
background: { r: 0, g: 0, b: 0, alpha: 0 },
skipBlanks: 0
}).toFile('output.dz');
}
async function outputTileImage(s, plane, x, y) {
return s.flatten({ background: '#000000' })
.webp({ lossless: true, alphaQuality: 0, effort: 6 })
.toFile(`./map-data/tiles/${plane}_${x}_${y}.webp`);
}
async function finalizePlaneTiles(plane, previousTiles) {
const tileImages = glob.sync('./output_files/0/*.webp');
for (const tileImage of tileImages) {
const filename = path.basename(tileImage, '.webp');
const [x, y] = filename.split('_').map((coord) => parseInt(coord, 10));
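// The dumped plane images appear to be cropped relative to the full game map; these pixel offsets shift tile indices back into map coordinates.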
const finalX = x + (4608 / tileSize);
const finalY = y + (4864 / tileSize);
let s;
if (plane > 0) {
const backgroundPath = `./map-data/tiles/${plane-1}_${finalX}_${finalY}.webp`;
const backgroundExists = fs.existsSync(backgroundPath);
if (backgroundExists) {
const tile = await sharp(tileImage).flip().webp({ lossless: true }).toBuffer();
const background = await sharp(backgroundPath).linear(0.5).webp({ lossless: true }).toBuffer();
s = sharp(background)
.composite([
{ input: tile }
]);
}
}
if (!s) {
s = sharp(tileImage).flip();
}
previousTiles.add(`${plane}_${finalX}_${finalY}`);
await outputTileImage(s, plane, finalX, finalY);
}
// NOTE: This is just so the plane will have a darker version of the tile below it
// even if the plane does not have its own image for a tile.
if (plane > 0) {
const belowTiles = [...previousTiles].filter(x => x.startsWith(`${plane - 1}_`));
for (const belowTile of belowTiles) {
const [belowPlane, x, y] = belowTile.split('_');
const lookup = `${plane}_${x}_${y}`;
if (!previousTiles.has(lookup)) {
const outputPath = `./map-data/tiles/${plane}_${x}_${y}.webp`;
if (fs.existsSync(outputPath) === true) {
throw new Error(`Filling tile ${outputPath} but it already exists!`);
}
const s = sharp(`./map-data/tiles/${belowTile}.webp`).linear(0.5);
previousTiles.add(lookup);
await outputTileImage(s, plane, x, y);
}
}
}
}
async function generateMapTiles() {
console.log('\nStep: Generate map tiles');
fs.rmSync('./map-data/tiles', { recursive: true, force: true });
fs.mkdirSync('./map-data/tiles');
const previousTiles = new Set();
const planes = 4;
for (let i = 0; i < planes; ++i) {
console.log(`Tiling map plane ${i + 1}/${planes}`);
await tilePlane(i);
console.log(`Finalizing map plane ${i + 1}/${planes}`);
await finalizePlaneTiles(i, previousTiles);
}
}
async function moveFiles(globSource, destination) {
const files = glob.sync(globSource);
for (const file of files) {
const base = path.parse(file).base;
if (base) {
await retry(() => fs.renameSync(file, `${destination}/${base}`), true);
}
}
}
async function moveResults() {
console.log('\nStep: Moving results to site');
await retry(() => fs.renameSync('./item_data.json', siteItemDataPath), true);
await moveFiles('./item-images/*.webp', siteItemImagesPath);
await moveFiles("./map-data/tiles/*.webp", siteMapImagesPath);
await moveFiles("./map-data/labels/*.webp", siteMapLabelsPath);
// Create a tile sheet of the map icons
const mapIcons = glob.sync("./map-data/icons/*.png");
let mapIconsCompositeOpts = [];
const iconIdToSpriteMapIndex = {};
for (let i = 0; i < mapIcons.length; ++i) {
mapIconsCompositeOpts.push({
input: mapIcons[i],
left: 15 * i,
top: 0
});
iconIdToSpriteMapIndex[path.basename(mapIcons[i], '.png')] = i;
}
await sharp({
create: {
width: 15 * mapIcons.length,
height: 15,
channels: 4,
background: { r: 0, g: 0, b: 0, alpha: 0 }
}
}).composite(mapIconsCompositeOpts).webp({ lossless: true, effort: 6 }).toFile(siteMapIconPath);
// Convert the map-icon locations to be keyed by the X and Y of the regions
// that they are in. This is done so that the canvas map component can quickly look up
// all of the icons in each of the regions that are being shown.
const mapIconsMeta = JSON.parse(fs.readFileSync("./map-data/icons/map-icons.json", 'utf8'));
const locationByRegion = {};
for (const [iconId, coordinates] of Object.entries(mapIconsMeta)) {
for (let i = 0; i < coordinates.length; i += 2) {
const x = coordinates[i] + 128;
const y = coordinates[i + 1] + 1;
const regionX = Math.floor(x / 64);
const regionY = Math.floor(y / 64);
const spriteMapIndex = iconIdToSpriteMapIndex[iconId];
if (spriteMapIndex === undefined) {
throw new Error("Could not find sprite map index for map icon: " + iconId);
}
locationByRegion[regionX] = locationByRegion[regionX] || {};
locationByRegion[regionX][regionY] = locationByRegion[regionX][regionY] || {};
locationByRegion[regionX][regionY][spriteMapIndex] = locationByRegion[regionX][regionY][spriteMapIndex] || [];
locationByRegion[regionX][regionY][spriteMapIndex].push(x, y);
}
}
fs.writeFileSync(siteMapIconMetaPath, JSON.stringify(locationByRegion));
// Do the same for map labels
const mapLabelsMeta = JSON.parse(fs.readFileSync("./map-data/labels/map-labels.json", 'utf8'));
const labelByRegion = {};
for (let i = 0; i < mapLabelsMeta.length; ++i) {
const coordinates = mapLabelsMeta[i];
const x = coordinates[0] + 128;
const y = coordinates[1] + 1;
const z = coordinates[2];
const regionX = Math.floor(x / 64);
const regionY = Math.floor(y / 64);
labelByRegion[regionX] = labelByRegion[regionX] || {};
labelByRegion[regionX][regionY] = labelByRegion[regionX][regionY] || {};
labelByRegion[regionX][regionY][z] = labelByRegion[regionX][regionY][z] || [];
labelByRegion[regionX][regionY][z].push(x, y, i);
}
fs.writeFileSync(siteMapLabelMetaPath, JSON.stringify(labelByRegion));
}
async function getLatestGameCache() {
if (!fs.existsSync('./cache')) {
fs.mkdirSync('./cache');
}
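// OpenRS2 archives every cache revision; pick the newest live oldschool one and sanity-check its completeness before downloading.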
const caches = (await axios.get('https://archive.openrs2.org/caches.json')).data;
const latestOSRSCache = caches.filter((cache) => {
return cache.scope === 'runescape' && cache.game === 'oldschool' && cache.environment === 'live' && !!cache.timestamp;
}).sort((a, b) => (new Date(b.timestamp)) - (new Date(a.timestamp)))[0];
console.log(latestOSRSCache);
const pctValidArchives = latestOSRSCache.valid_indexes / latestOSRSCache.indexes;
if (pctValidArchives < 1) {
throw new Error(`valid_indexes was less than indexes valid_indexes=${latestOSRSCache.valid_indexes} indexes=${latestOSRSCache.indexes} pctValidArchives=${pctValidArchives}`);
}
const pctValidGroups = latestOSRSCache.valid_groups / latestOSRSCache.groups;
if (pctValidGroups < 1) {
throw new Error(`valid_groups was less than groups valid_groups=${latestOSRSCache.valid_groups} groups=${latestOSRSCache.groups} pctValidGroups=${pctValidGroups}`);
}
const pctValidKeys = latestOSRSCache.valid_keys / latestOSRSCache.keys;
if (pctValidKeys < 0.97) {
throw new Error(`pctValidKeys was less than 97% valid_keys=${latestOSRSCache.valid_keys} keys=${latestOSRSCache.keys} pctValidKeys=${pctValidKeys}`);
}
const cacheFilesResponse = await axios.get(`https://archive.openrs2.org/caches/${latestOSRSCache.scope}/${latestOSRSCache.id}/disk.zip`, {
responseType: 'arraybuffer'
});
const cacheFiles = await unzipper.Open.buffer(cacheFilesResponse.data);
await cacheFiles.extract({ path: './cache' });
const xteas = (await axios.get(`https://archive.openrs2.org/caches/${latestOSRSCache.scope}/${latestOSRSCache.id}/keys.json`)).data;
fs.writeFileSync('./cache/xteas.json', JSON.stringify(xteas));
}
(async () => {
await getLatestGameCache();
await setupRunelite();
await dumpItemData();
const allIncludedItemIds = await buildItemDataJson();
await dumpItemImages(allIncludedItemIds);
const xteasLocation = await convertXteasToRuneliteFormat();
await dumpMapData(xteasLocation);
await generateMapTiles();
await dumpMapLabels();
await dumpCollectionLog();
await moveResults();
})();

group-ironmen-master/docker-compose.yml Normal file

@@ -0,0 +1,39 @@
version: "3.8"
services:
group-ironmen-tracker-frontend:
image: chrisleeeee/group-ironmen-tracker-frontend
environment:
- HOST_URL=${HOST_URL}
restart: always
container_name: group-ironmen-tracker-frontend
ports:
- 4000:4000 # replace this if using a docker-compatible reverse proxy like traefik
group-ironmen-tracker-backend:
# build:
# context: ./server
# dockerfile: Dockerfile
image: chrisleeeee/group-ironmen-tracker-backend
environment:
- PG_USER=${PG_USER}
- PG_PASSWORD=${PG_PASSWORD}
- PG_HOST=${PG_HOST}
- PG_PORT=${PG_PORT}
- PG_DB=${PG_DB}
- BACKEND_SECRET=${BACKEND_SECRET}
restart: always
depends_on:
- "postgres"
container_name: group-ironmen-tracker-backend
ports:
- 5000:8080 # replace this if using a docker-compatible reverse proxy like traefik
postgres:
image: postgres
restart: always
container_name: group-ironmen-tracker-postgres
environment:
POSTGRES_USER: ${PG_USER}
POSTGRES_PASSWORD: ${PG_PASSWORD}
POSTGRES_DB: ${PG_DB}
volumes:
- ./pg-data:/var/lib/postgresql/data # change the left-hand side of : to the path you prefer the DB to store the data in
- ./server/src/sql/schema.sql:/docker-entrypoint-initdb.d/schema.sql # change the left-hand side of the : to the path that contains the schema.sql

group-ironmen-master/server/.dockerignore Normal file

@@ -0,0 +1,11 @@
**/node_modules
**/.husky
**/.eslint*
**/prettier*
README.md
.gitignore
config.toml
Dockerfile
secret
target
.dockerignore

group-ironmen-master/server/.gitignore Normal file

@@ -0,0 +1,5 @@
config.toml
target
secret
node_modules
flamegraph*.svg

3154
group-ironmen-master/server/Cargo.lock generated Normal file

File diff suppressed because it is too large

group-ironmen-master/server/Cargo.toml Normal file

@@ -0,0 +1,42 @@
[package]
name = "server"
version = "0.1.0"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
tokio-postgres = { version = "0.7.11", features = ["with-serde_json-1", "with-chrono-0_4"] }
actix-web = "4.11.0"
tokio = { version = "1.47.1", features = ["time"] }
serde = { version = "1.0.219", features = ["derive"] }
deadpool-postgres = { version = "0.14.1", features = ["serde"] }
config = "0.15.15"
derive_more = "2.0.1"
env_logger = { version = "0.11.8", default-features = false, features = ["humantime"] }
log = "0.4.28"
serde_json = "1.0.143"
futures = "0.3.31"
uuid = { version = "1.18.1", features = ["v4"] }
actix-service = "2.0.3"
chrono = { version = "0.4.42", features = ["serde"] }
actix-cors = "0.7.1"
blake2 = "0.10.6"
data-encoding = { version = "2.9.0", features = ["alloc"] }
lazy_static = "1.5.0"
regex = "1.11.2"
arc-swap = "1.7.1"
reqwest = { version = "0.12.23", features = ["json"] }
mimalloc = "0.1.48"
[profile.dev]
opt-level = 0
# Enable high optimizations for dependencies but not for our code:
[profile.dev.package."*"]
opt-level = 3
[profile.release]
lto = true
codegen-units = 1
panic = "abort"

group-ironmen-master/server/Dockerfile Normal file

@@ -0,0 +1,22 @@
###############################################
# Backend Image
###############################################
FROM rust:1.73 as builder
WORKDIR /app
COPY src ./src
COPY Cargo.toml .
COPY Cargo.lock .
COPY collection_log_info.json .
RUN cargo build --release
FROM debian:bookworm-slim
WORKDIR /app
RUN apt-get update && apt-get install -y openssl ca-certificates && rm -rf /var/lib/apt/lists/*
COPY --from=builder /app/target/release/server ./
COPY --from=builder /app/collection_log_info.json ./
COPY ./docker-entrypoint.sh ./
ENTRYPOINT ["/app/docker-entrypoint.sh"]
CMD ["/app/server"]


@@ -0,0 +1 @@
RUSTFLAGS="-C target-cpu=native" cargo build --release

File diff suppressed because it is too large

group-ironmen-master/server/docker-entrypoint.sh Normal file

@@ -0,0 +1,31 @@
#!/bin/bash
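# Renders config.toml and the backend secret file from environment variables, then execs the server binary passed as CMD.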
CONFIG_FILE=config.toml
echo "[entrypoint] Creating $CONFIG_FILE"
if [ -e $CONFIG_FILE ]
then
echo "[entrypoint] $CONFIG_FILE already exists, deleting and starting fresh"
rm $CONFIG_FILE
fi
echo "[pg]" >> $CONFIG_FILE
echo "user = \"$PG_USER\"" >> $CONFIG_FILE
echo "password = \"$PG_PASSWORD\"" >> $CONFIG_FILE
echo "host = \"$PG_HOST\"" >> $CONFIG_FILE
echo "port = $PG_PORT" >> $CONFIG_FILE
echo "dbname = \"$PG_DB\"" >> $CONFIG_FILE
echo "pool.max_size = 16" >> $CONFIG_FILE
SECRET_FILE=secret
echo "[entrypoint] Creating $SECRET_FILE"
if [ -e $SECRET_FILE ]
then
echo "[entrypoint] $SECRET_FILE already exists, deleting and starting fresh"
rm $SECRET_FILE
fi
echo "$BACKEND_SECRET" >> $SECRET_FILE
echo "[entrypoint] Running run"
exec "$@"


@@ -0,0 +1,143 @@
use crate::db;
use actix_web::{
body::BoxBody,
dev::{Service, ServiceRequest, ServiceResponse, Transform},
web, Error, FromRequest, HttpMessage, HttpRequest,
};
use deadpool_postgres::Pool;
use futures::{
future::{ready, LocalBoxFuture, Ready},
FutureExt,
};
use std::rc::Rc;
pub struct AuthenticateMiddlewareFactory;
impl AuthenticateMiddlewareFactory {
pub fn new() -> Self {
AuthenticateMiddlewareFactory {}
}
}
impl<S, B> Transform<S, ServiceRequest> for AuthenticateMiddlewareFactory
where
S: Service<ServiceRequest, Response = ServiceResponse<B>, Error = Error> + 'static,
B: actix_web::body::MessageBody + 'static,
{
type Response = ServiceResponse<BoxBody>;
type Error = Error;
type InitError = ();
type Transform = AuthenticateMiddleware<S>;
type Future = Ready<Result<Self::Transform, Self::InitError>>;
fn new_transform(&self, service: S) -> Self::Future {
ready(Ok(AuthenticateMiddleware {
service: Rc::new(service),
}))
}
}
pub struct AuthenticationResult {
pub group_id: i64,
}
type AuthenticationInfo = Rc<AuthenticationResult>;
pub struct Authenticated(AuthenticationInfo);
impl std::ops::Deref for Authenticated {
type Target = AuthenticationInfo;
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl FromRequest for Authenticated {
type Error = Error;
type Future = Ready<Result<Self, Self::Error>>;
fn from_request(req: &HttpRequest, _payload: &mut actix_web::dev::Payload) -> Self::Future {
let value = req.extensions().get::<AuthenticationInfo>().cloned();
let result = match value {
Some(v) => Ok(Authenticated(v)),
None => Err(actix_web::error::ErrorUnauthorized("")),
};
ready(result)
}
}
pub struct AuthenticateMiddleware<S> {
service: Rc<S>,
}
impl<S, B> Service<ServiceRequest> for AuthenticateMiddleware<S>
where
S: Service<ServiceRequest, Response = ServiceResponse<B>, Error = Error> + 'static,
B: actix_web::body::MessageBody + 'static,
{
type Response = ServiceResponse<BoxBody>;
type Error = Error;
type Future = LocalBoxFuture<'static, Result<Self::Response, Self::Error>>;
actix_service::forward_ready!(service);
fn call(&self, req: ServiceRequest) -> Self::Future {
let srv = Rc::clone(&self.service);
async move {
let group_name = match req.match_info().get("group_name") {
Some(group_name) => group_name,
None => {
return Ok(req.error_response(actix_web::error::ErrorBadRequest(
"Missing group name from request",
)));
}
};
if group_name != "_" {
let auth_header = match req.headers().get("Authorization") {
Some(auth_header) => auth_header,
None => {
return Ok(req.error_response(actix_web::error::ErrorBadRequest(
"Authorization header missing from request",
)));
}
};
let token = match auth_header.to_str() {
Ok(token) => token,
Err(_) => {
return Ok(req.error_response(actix_web::error::ErrorBadRequest(
"Unable to parse Authorization header",
)));
}
};
let db_pool = match req.app_data::<web::Data<Pool>>() {
Some(db_pool) => db_pool,
None => {
return Ok(
req.error_response(actix_web::error::ErrorInternalServerError(""))
);
}
};
let client = match db_pool.get().await {
Ok(client) => client,
Err(err) => {
log::error!("Failed to get db client: {}", err);
return Ok(
req.error_response(actix_web::error::ErrorInternalServerError(""))
);
}
};
let group_id = match db::get_group(&client, group_name, token).await {
Ok(group) => group,
Err(_) => {
// log::error!("{}", err);
return Ok(req.error_response(actix_web::error::ErrorUnauthorized("")));
}
};
let authentication_result = AuthenticationResult { group_id };
req.extensions_mut()
.insert::<AuthenticationInfo>(Rc::new(authentication_result));
}
let res = srv.call(req).await?;
Ok(res.map_into_boxed_body())
}
.boxed_local()
}
}
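
To make the contract concrete, here is a client-side sketch (not part of this repo) of a request this middleware accepts: the group name is a path segment under /api/group, and the raw group token travels in the Authorization header. The group name, token, and port are illustrative (the real token is issued by create-group, and main.rs later in this diff binds port 8080); it assumes reqwest plus tokio with the macros and rt-multi-thread features.

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    // Hypothetical group "my-group" and token; from_time must be RFC 3339.
    let body = reqwest::Client::new()
        .get("http://localhost:8080/api/group/my-group/get-group-data")
        .query(&[("from_time", "2022-01-01T00:00:00Z")])
        .header("Authorization", "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee")
        .send()
        .await?
        .text()
        .await?;
    println!("{}", body);
    Ok(())
}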


@@ -0,0 +1,197 @@
use crate::auth_middleware::Authenticated;
use crate::collection_log::{CollectionLog, CollectionLogInfo};
use crate::db;
use crate::error::ApiError;
use crate::models::{
AmIInGroupRequest, GroupMember, GroupSkillData, RenameGroupMember, SHARED_MEMBER,
};
use crate::validators::{valid_name, validate_collection_log, validate_member_prop_length};
use actix_web::{delete, get, post, put, web, Error, HttpResponse};
use chrono::{DateTime, Utc};
use deadpool_postgres::{Client, Pool};
use serde::Deserialize;
use std::collections::HashMap;
#[post("/add-group-member")]
pub async fn add_group_member(
auth: Authenticated,
group_member: web::Json<GroupMember>,
db_pool: web::Data<Pool>,
) -> Result<HttpResponse, Error> {
if group_member.name.eq(SHARED_MEMBER) {
return Ok(
HttpResponse::BadRequest().body(format!("Member name {} not allowed", SHARED_MEMBER))
);
}
if !valid_name(&group_member.name) {
return Ok(HttpResponse::BadRequest()
.body(format!("Member name {} is not valid", group_member.name)));
}
let client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
db::add_group_member(&client, auth.group_id, &group_member.name).await?;
Ok(HttpResponse::Created().finish())
}
#[delete("/delete-group-member")]
pub async fn delete_group_member(
auth: Authenticated,
group_member: web::Json<GroupMember>,
db_pool: web::Data<Pool>,
) -> Result<HttpResponse, Error> {
if group_member.name.eq(SHARED_MEMBER) {
return Ok(
HttpResponse::BadRequest().body(format!("Member name {} not allowed", SHARED_MEMBER))
);
}
let mut client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
db::delete_group_member(&mut client, auth.group_id, &group_member.name).await?;
Ok(HttpResponse::Ok().finish())
}
#[put("/rename-group-member")]
pub async fn rename_group_member(
auth: Authenticated,
rename_member: web::Json<RenameGroupMember>,
db_pool: web::Data<Pool>,
) -> Result<HttpResponse, Error> {
if rename_member.original_name.eq(SHARED_MEMBER) || rename_member.new_name.eq(SHARED_MEMBER) {
return Ok(
HttpResponse::BadRequest().body(format!("Member name {} not allowed", SHARED_MEMBER))
);
}
if !valid_name(&rename_member.new_name) {
return Ok(HttpResponse::BadRequest().body(format!(
"Member name {} is not valid",
rename_member.new_name
)));
}
let client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
db::rename_group_member(
&client,
auth.group_id,
&rename_member.original_name,
&rename_member.new_name,
)
.await?;
Ok(HttpResponse::Ok().finish())
}
#[post("/update-group-member")]
pub async fn update_group_member(
auth: Authenticated,
group_member: web::Json<GroupMember>,
db_pool: web::Data<Pool>,
collection_log_info: web::Data<CollectionLogInfo>,
) -> Result<HttpResponse, Error> {
let client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
let in_group: bool = db::is_member_in_group(&client, auth.group_id, &group_member.name).await?;
if !in_group {
return Ok(HttpResponse::Unauthorized().body("Player is not a member of this group"));
}
let mut group_member_inner: GroupMember = group_member.into_inner();
validate_member_prop_length("stats", &group_member_inner.stats, 7, 7)?;
validate_member_prop_length("coordinates", &group_member_inner.coordinates, 3, 3)?;
validate_member_prop_length("skills", &group_member_inner.skills, 23, 24)?;
validate_member_prop_length("quests", &group_member_inner.quests, 0, 220)?;
validate_member_prop_length("inventory", &group_member_inner.inventory, 56, 56)?;
validate_member_prop_length("equipment", &group_member_inner.equipment, 28, 28)?;
validate_member_prop_length("bank", &group_member_inner.bank, 0, 3000)?;
validate_member_prop_length("shared_bank", &group_member_inner.shared_bank, 0, 1000)?;
validate_member_prop_length("rune_pouch", &group_member_inner.rune_pouch, 6, 8)?;
validate_member_prop_length("seed_vault", &group_member_inner.seed_vault, 0, 500)?;
validate_member_prop_length("deposited", &group_member_inner.deposited, 0, 200)?;
validate_member_prop_length("diary_vars", &group_member_inner.diary_vars, 0, 62)?;
validate_collection_log(&collection_log_info, &mut group_member_inner.collection_log)?;
db::update_group_member(
&client,
auth.group_id,
group_member_inner,
collection_log_info,
)
.await?;
Ok(HttpResponse::Ok().finish())
}
#[derive(Deserialize)]
#[serde(deny_unknown_fields)]
pub struct GetGroupDataQuery {
pub from_time: DateTime<Utc>,
}
#[get("/get-group-data")]
pub async fn get_group_data(
auth: Authenticated,
db_pool: web::Data<Pool>,
query: web::Query<GetGroupDataQuery>,
) -> Result<web::Json<Vec<GroupMember>>, Error> {
let from_time = query.from_time;
let client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
let group_members = db::get_group_data(&client, auth.group_id, &from_time).await?;
Ok(web::Json(group_members))
}
#[derive(Deserialize)]
pub enum SkillDataPeriod {
Day,
Week,
Month,
Year,
}
#[derive(Deserialize)]
#[serde(deny_unknown_fields)]
pub struct GetSkillDataQuery {
pub period: SkillDataPeriod,
}
#[get("/get-skill-data")]
pub async fn get_skill_data(
auth: Authenticated,
db_pool: web::Data<Pool>,
query: web::Query<GetSkillDataQuery>,
) -> Result<web::Json<GroupSkillData>, Error> {
let client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
let aggregate_period = match query.period {
SkillDataPeriod::Day => db::AggregatePeriod::Day,
SkillDataPeriod::Week => db::AggregatePeriod::Month,
SkillDataPeriod::Month => db::AggregatePeriod::Month,
SkillDataPeriod::Year => db::AggregatePeriod::Year,
};
let group_skill_data =
db::get_skills_for_period(&client, auth.group_id, aggregate_period).await?;
Ok(web::Json(group_skill_data))
}
#[get("/collection-log")]
pub async fn get_collection_log(
auth: Authenticated,
db_pool: web::Data<Pool>,
) -> Result<web::Json<HashMap<String, Vec<CollectionLog>>>, Error> {
let client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
let collection_logs = db::get_collection_log_for_group(&client, auth.group_id).await?;
Ok(web::Json(collection_logs))
}
#[get("/am-i-logged-in")]
pub async fn am_i_logged_in(_auth: Authenticated) -> Result<HttpResponse, Error> {
Ok(HttpResponse::Ok().finish())
}
#[get("/am-i-in-group")]
pub async fn am_i_in_group(
auth: Authenticated,
db_pool: web::Data<Pool>,
q: web::Query<AmIInGroupRequest>,
) -> Result<HttpResponse, Error> {
let client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
let in_group: bool = db::is_member_in_group(&client, auth.group_id, &q.member_name).await?;
if !in_group {
return Ok(HttpResponse::Unauthorized().body("Player is not a member of this group"));
}
Ok(HttpResponse::Ok().finish())
}


@@ -0,0 +1,174 @@
use lazy_static::lazy_static;
use serde::{Deserialize, Serialize};
use std::collections::{HashMap, HashSet};
#[derive(Deserialize, Serialize)]
#[serde(deny_unknown_fields)]
pub struct CollectionLog {
pub tab: i16,
pub page_name: String,
pub completion_counts: Vec<i32>,
pub items: Vec<i32>,
#[serde(skip_deserializing)]
pub new_items: Vec<i32>,
}
#[derive(Serialize, Clone)]
pub struct CollectionLogInfo {
#[serde(skip_serializing)]
page_name_to_id_lookup: HashMap<String, i16>,
#[serde(skip_serializing)]
page_id_item_set_lookup: HashMap<i16, HashSet<i32>>,
#[serde(skip_serializing)]
item_name_to_id_lookup: HashMap<String, i32>,
#[serde(skip_serializing)]
item_id_to_page_id_lookup: HashMap<i32, HashSet<i16>>,
}
#[derive(Deserialize)]
pub struct CollectionLogItemInfo {
pub id: i32,
pub name: String,
}
#[derive(Deserialize)]
pub struct CollectionLogPageInfo {
pub name: String,
pub items: Vec<CollectionLogItemInfo>,
}
#[allow(non_snake_case)]
#[derive(Deserialize)]
pub struct CollectionLogTabInfo {
pub tabId: i16,
pub pages: Vec<CollectionLogPageInfo>,
}
impl CollectionLogInfo {
pub fn new(pages_db: Vec<(i16, i16, String)>) -> Self {
let mut page_name_to_id_lookup = HashMap::new();
for page in &pages_db {
page_name_to_id_lookup.insert(page.2.clone(), page.1);
}
let mut item_id_to_page_id_lookup = HashMap::new();
let mut item_name_to_id_lookup = HashMap::new();
let mut page_id_item_set_lookup = HashMap::new();
for tab in COLLECTION_LOG_INFO.iter() {
for page in tab.pages.iter() {
let page_id = page_name_to_id_lookup.get(&page.name).unwrap();
let page_items = page_id_item_set_lookup
.entry(*page_id)
.or_insert_with(HashSet::new);
for item in page.items.iter() {
item_name_to_id_lookup.insert(item.name.clone(), item.id);
page_items.insert(item.id);
item_id_to_page_id_lookup
.entry(item.id)
.or_insert_with(HashSet::new)
.insert(*page_id);
}
}
}
}
Self {
page_name_to_id_lookup,
page_id_item_set_lookup,
item_name_to_id_lookup,
item_id_to_page_id_lookup,
}
}
pub fn page_name_to_id(&self, page_name: &String) -> Option<&i16> {
match self.page_name_to_id_lookup.get(page_name) {
Some(x) => Some(x),
None => match COLLECTION_PAGE_REMAP.get(page_name) {
Some(x) => self.page_name_to_id_lookup.get(x),
None => None,
},
}
}
pub fn has_item(&self, page_id: i16, item_id: i32) -> bool {
match self.page_id_item_set_lookup.get(&page_id) {
None => false,
Some(x) => x.contains(&item_id),
}
}
pub fn remap_item_id(&self, item_id: i32) -> i32 {
match COLLECTION_ITEM_ID_REMAP.get(&item_id) {
Some(x) => *x,
None => item_id,
}
}
pub fn item_name_to_id(&self, item_name: &String) -> Option<&i32> {
match self.item_name_to_id_lookup.get(item_name) {
Some(x) => Some(x),
None => match COLLECTION_ITEM_REMAP.get(item_name) {
Some(x) => self.item_name_to_id_lookup.get(x),
None => None,
},
}
}
pub fn page_ids_for_item(&self, item_id: i32) -> Option<&HashSet<i16>> {
self.item_id_to_page_id_lookup.get(&item_id)
}
pub fn number_of_items_in_page(&self, page_id: i16) -> usize {
match self.page_id_item_set_lookup.get(&page_id) {
None => 0,
Some(x) => x.len(),
}
}
}
lazy_static! {
// Some RuneLite plugins rename the page value before we receive it. This remaps
// known plugin renames of boss pages back to their canonical names. Is there a better way to handle this?
pub static ref COLLECTION_PAGE_REMAP: HashMap<String, String> = HashMap::from([
("The Grumbler".to_string(), "Phantom Muspah".to_string())
]);
pub static ref COLLECTION_ITEM_REMAP: HashMap<String, String> = HashMap::from([
("Pharaoh's sceptre".to_string(), "Pharaoh's sceptre (uncharged)".to_string())
]);
pub static ref COLLECTION_ITEM_ID_REMAP: HashMap<i32, i32> = HashMap::from([
(25627, 12019), // coal bag
(25628, 12020), // gem bag
(25629, 24882), // plank sack
(25617, 10859), // tea flask
(25618, 10877), // plain satchel
(25619, 10878), // green satchel
(25620, 10879), // red satchel
(25621, 10880), // black satchel
(25622, 10881), // gold satchel
(25623, 10882), // rune satchel
(25624, 13273), // unsired pet
(25630, 12854), // Flamtaer bag
(29992, 29990), // Alchemist's amulet
(30805, 30803), // Dossier
]);
pub static ref COLLECTION_LOG_DATA: String = {
let path = concat!(env!("CARGO_MANIFEST_DIR"), "/collection_log_info.json");
std::fs::read_to_string(path).expect(&format!("Could not read collection log info file at {}", path))
};
pub static ref COLLECTION_LOG_INFO: Vec<CollectionLogTabInfo> = {
serde_json::from_str(&COLLECTION_LOG_DATA).unwrap()
};
}


@@ -0,0 +1,57 @@
use config::{ConfigError, File};
use serde::{Deserialize, Serialize};
#[derive(Deserialize, Clone)]
pub enum LogLevel {
Info,
Warn,
Error,
}
impl LogLevel {
pub fn as_str(&self) -> &'static str {
match self {
LogLevel::Info => "info",
LogLevel::Warn => "warn",
LogLevel::Error => "error",
}
}
}
#[derive(Deserialize, Clone)]
pub struct LoggerConfig {
pub level: LogLevel,
}
#[derive(Serialize, Deserialize, Clone)]
pub struct CaptchaConfig {
pub enabled: bool,
pub sitekey: String,
#[serde(skip_serializing)]
pub secret: String,
}
#[derive(Deserialize, Clone)]
pub struct Config {
pub pg: deadpool_postgres::Config,
#[serde(default = "default_logger_config")]
pub logger: LoggerConfig,
#[serde(default = "default_captcha_config")]
pub hcaptcha: CaptchaConfig,
}
fn default_logger_config() -> LoggerConfig {
LoggerConfig {
level: LogLevel::Info,
}
}
fn default_captcha_config() -> CaptchaConfig {
CaptchaConfig {
enabled: false,
sitekey: "".to_string(),
secret: "".to_string(),
}
}
impl Config {
pub fn from_env() -> Result<Self, ConfigError> {
let cfg = ::config::Config::builder()
.add_source(File::with_name("config"))
.build()?;
cfg.try_deserialize()
}
}
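
For reference, a minimal sketch of the config file that Config::from_env() loads (despite its name it reads a file named config.* in the working directory, which is what docker-entrypoint.sh generates). The [pg] keys come from deadpool_postgres::Config; [logger] and [hcaptcha] are optional and fall back to the defaults above. All values here are illustrative.

[pg]
user = "dbuser"
password = "changeme"
host = "localhost"
port = 5432
dbname = "groupironman"
pool.max_size = 16

[logger]
level = "Info"

[hcaptcha]
enabled = false
sitekey = ""
secret = ""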


@@ -0,0 +1,27 @@
use blake2::{Blake2s256, Digest};
use data_encoding::HEXLOWER;
use lazy_static::lazy_static;
use std::fs;
lazy_static! {
static ref SECRET: String = {
let path = concat!(env!("CARGO_MANIFEST_DIR"), "/secret");
fs::read_to_string(path).expect(&format!("Could not find secret file at {}", path))
};
}
pub fn hash(value: &str, salt: &str, iterations: u32) -> std::vec::Vec<u8> {
let mut hasher = Blake2s256::new();
let v = value.as_bytes();
for _ in 0..iterations {
hasher.update(v);
}
hasher.update(salt);
hasher.update(&SECRET.as_str());
hasher.finalize().to_vec()
}
pub fn token_hash(token: &str, salt: &str) -> String {
let hashed_token = hash(token, salt, 2);
HEXLOWER.encode(&hashed_token)
}
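
A sketch of a test (not in the repo) pinning down the contract the schema relies on: Blake2s256 digests are 32 bytes, so token_hash() always yields 64 lowercase hex characters, which is why schema.sql declares group_token_hash as CHAR(64). Note that the SECRET lazy_static requires a secret file next to Cargo.toml, so this only runs where that file exists.

#[cfg(test)]
mod token_hash_tests {
    use super::*;

    #[test]
    fn token_hash_is_64_lowercase_hex_chars() {
        // Blake2s256 = 32-byte digest = 64 hex characters once encoded.
        let hashed = token_hash("example-token", "example-salt");
        assert_eq!(hashed.len(), 64);
        assert!(hashed.chars().all(|c| c.is_ascii_hexdigit() && !c.is_ascii_uppercase()));
    }
}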

File diff suppressed because it is too large.


@@ -0,0 +1,85 @@
use actix_web::{HttpResponse, ResponseError};
use deadpool_postgres::PoolError;
use derive_more::{Display, From};
#[derive(Debug, Display, From)]
pub enum ApiError {
PoolError(PoolError),
PGError(tokio_postgres::error::Error),
SerdeJsonError(serde_json::Error),
#[from(ignore)]
GroupCreationError(tokio_postgres::error::Error),
#[from(ignore)]
UpdateGroupMemberError(tokio_postgres::error::Error),
#[from(ignore)]
GetGroupError(tokio_postgres::error::Error),
#[from(ignore)]
AddMemberError(tokio_postgres::error::Error),
#[from(ignore)]
GetGroupDataError(tokio_postgres::error::Error),
#[from(ignore)]
DeleteGroupMemberError(tokio_postgres::error::Error),
#[from(ignore)]
RenameGroupMemberError(tokio_postgres::error::Error),
#[from(ignore)]
IsMemberInGroupError(tokio_postgres::error::Error),
#[from(ignore)]
GetSkillsDataError(tokio_postgres::error::Error),
#[from(ignore)]
GetCollectionLogError(tokio_postgres::error::Error),
GroupFullError,
ReqwestError(reqwest::Error),
GroupMemberValidationError(String),
}
impl std::error::Error for ApiError {}
fn handle_pg_error(err: &tokio_postgres::error::Error, name: &str) -> HttpResponse {
match err.as_db_error() {
Some(db_error) => log::error!("{}: {}", name, db_error.message()),
None => log::error!("{}: {}", name, err),
};
HttpResponse::InternalServerError().finish()
}
impl ResponseError for ApiError {
fn error_response(&self) -> HttpResponse {
match *self {
ApiError::PoolError(ref err) => {
log::error!("PoolError: {}", err);
HttpResponse::InternalServerError().body(format!("PoolError: {}", err))
}
ApiError::GroupCreationError(ref err) => handle_pg_error(err, "GroupCreationError"),
ApiError::UpdateGroupMemberError(ref err) => {
handle_pg_error(err, "UpdateGroupMemberError")
}
ApiError::PGError(ref err) => handle_pg_error(err, "PGError"),
ApiError::GetGroupError(ref err) => handle_pg_error(err, "GetGroupError"),
ApiError::AddMemberError(ref err) => handle_pg_error(err, "AddMemberError"),
ApiError::GetGroupDataError(ref err) => handle_pg_error(err, "GetGroupDataError"),
ApiError::IsMemberInGroupError(ref err) => handle_pg_error(err, "IsMemberInGroupError"),
ApiError::GetSkillsDataError(ref err) => handle_pg_error(err, "GetSkillsDataError"),
ApiError::GetCollectionLogError(ref err) => {
handle_pg_error(err, "GetCollectionLogError")
}
ApiError::DeleteGroupMemberError(ref err) => {
handle_pg_error(err, "DeleteGroupMemberError")
}
ApiError::RenameGroupMemberError(ref err) => {
handle_pg_error(err, "RenameGroupMemberError")
}
ApiError::SerdeJsonError(ref err) => {
log::error!("SerdeJsonError: {}", err);
HttpResponse::InternalServerError().body(format!("SerdeJsonError: {}", err))
}
ApiError::GroupFullError => HttpResponse::BadRequest()
.body("Group has already reached the maximum amount of players"),
ApiError::ReqwestError(ref err) => {
log::error!("ReqwestError: {}", err);
HttpResponse::InternalServerError().body(format!("ReqwestError: {}", err))
}
ApiError::GroupMemberValidationError(ref reason) => {
log::error!("Validation error: {}", reason);
HttpResponse::BadRequest().body(reason.clone())
}
}
}
}


@@ -0,0 +1,86 @@
mod auth_middleware;
mod authed;
mod collection_log;
mod config;
mod crypto;
mod db;
mod error;
mod models;
mod unauthed;
mod validators;
use crate::auth_middleware::AuthenticateMiddlewareFactory;
use crate::collection_log::CollectionLogInfo;
use crate::config::Config;
use actix_cors::Cors;
use actix_web::{http::header, middleware, web, App, HttpServer};
use tokio_postgres::NoTls;
use mimalloc::MiMalloc;
#[global_allocator]
static GLOBAL: MiMalloc = MiMalloc;
#[actix_web::main]
async fn main() -> std::io::Result<()> {
let config = Config::from_env().unwrap();
let pool = config.pg.create_pool(None, NoTls).unwrap();
env_logger::init_from_env(
env_logger::Env::new().default_filter_or(config.logger.level.as_str()),
);
let mut client = pool.get().await.unwrap();
db::update_schema(&mut client).await.unwrap();
let collection_log_info: CollectionLogInfo =
db::get_collection_log_info(&client).await.unwrap();
unauthed::start_ge_updater();
unauthed::start_skills_aggregator(pool.clone());
HttpServer::new(move || {
let unauthed_scope = web::scope("/api")
.service(unauthed::create_group)
.service(unauthed::get_ge_prices)
.service(unauthed::captcha_enabled)
.service(unauthed::collection_log_info);
let authed_scope = web::scope("/api/group/{group_name}")
.wrap(AuthenticateMiddlewareFactory::new())
.service(authed::update_group_member)
.service(authed::get_group_data)
.service(authed::add_group_member)
.service(authed::delete_group_member)
.service(authed::rename_group_member)
.service(authed::am_i_logged_in)
.service(authed::am_i_in_group)
.service(authed::get_skill_data)
.service(authed::get_collection_log);
let json_config = web::JsonConfig::default().limit(100000);
let cors = Cors::default()
.allow_any_origin()
.send_wildcard()
.allowed_methods(vec!["GET", "POST", "DELETE", "PUT", "OPTIONS"])
.allowed_headers(vec![
header::AUTHORIZATION,
header::ACCEPT,
header::CONTENT_TYPE,
header::CONTENT_LENGTH,
])
.max_age(3600);
App::new()
.wrap(middleware::Logger::new(
"\"%r\" %s %b \"%{User-Agent}i\" %D",
))
.wrap(middleware::Compress::default())
.wrap(cors)
.app_data(web::PayloadConfig::new(100000))
.app_data(json_config)
.app_data(web::Data::new(pool.clone()))
.app_data(web::Data::new(config.clone()))
.app_data(web::Data::new(collection_log_info.clone()))
.service(authed_scope)
.service(unauthed_scope)
})
.bind(("0.0.0.0", 8080))?
.run()
.await
}


@@ -0,0 +1,118 @@
use crate::collection_log::CollectionLog;
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
pub const SHARED_MEMBER: &str = "@SHARED";
#[derive(Serialize, Deserialize)]
#[serde(deny_unknown_fields)]
pub struct Coordinates {
x: i32,
y: i32,
plane: i32,
}
#[derive(Serialize, Deserialize)]
#[serde(deny_unknown_fields)]
pub struct Interacting {
name: String,
scale: i32,
ratio: i32,
location: Coordinates,
#[serde(default = "default_last_updated")]
last_updated: DateTime<Utc>,
}
fn default_last_updated() -> DateTime<Utc> {
Utc::now()
}
#[derive(Deserialize)]
#[serde(deny_unknown_fields)]
pub struct RenameGroupMember {
pub original_name: String,
pub new_name: String,
}
#[derive(Deserialize, Serialize)]
pub struct GroupMember {
pub name: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub stats: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub coordinates: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub skills: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub quests: Option<Vec<u8>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub inventory: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub equipment: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub bank: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub shared_bank: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub rune_pouch: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub interacting: Option<Interacting>,
#[serde(skip_serializing_if = "Option::is_none")]
pub seed_vault: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub deposited: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub diary_vars: Option<Vec<i32>>,
#[serde(skip_serializing)]
pub collection_log: Option<Vec<CollectionLog>>,
#[serde(skip_serializing)]
pub collection_log_new: Option<Vec<String>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub last_updated: Option<DateTime<Utc>>,
}
#[derive(Serialize)]
pub struct AggregateSkillData {
pub time: DateTime<Utc>,
pub data: Vec<i32>,
}
#[derive(Serialize)]
pub struct MemberSkillData {
pub name: String,
pub skill_data: Vec<AggregateSkillData>,
}
pub type GroupSkillData = Vec<MemberSkillData>;
#[derive(Deserialize, Serialize)]
#[serde(deny_unknown_fields)]
pub struct CreateGroup {
pub name: String,
pub member_names: Vec<String>,
#[serde(default, skip_serializing)]
pub captcha_response: String,
#[serde(default = "default_token")]
#[serde(skip_deserializing)]
pub token: String,
}
fn default_token() -> String {
uuid::Uuid::new_v4().hyphenated().to_string()
}
#[derive(Deserialize)]
#[serde(deny_unknown_fields)]
pub struct AmIInGroupRequest {
pub member_name: String,
}
#[derive(Deserialize)]
pub struct WikiGEPrice {
pub high: Option<i64>,
pub low: Option<i64>,
}
#[derive(Deserialize)]
pub struct WikiGEPrices {
pub data: std::collections::HashMap<i32, WikiGEPrice>,
}
pub type GEPrices = std::collections::HashMap<i32, i64>;
#[derive(Deserialize)]
pub struct CaptchaVerifyResponse {
pub success: bool,
// NOTE: unused
// #[serde(rename = "error-codes", default)]
// pub error_codes: std::vec::Vec<String>,
}
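
As a concrete illustration, a minimal update-group-member payload built from this model (name and values hypothetical). Every field besides name is optional; when a field is present its length is enforced by the validators wired up in authed.rs, e.g. stats must contain exactly 7 values and coordinates exactly 3.

fn main() {
    let payload = serde_json::json!({
        "name": "alice",
        "stats": [99, 99, 99, 100, 100, 50, 302], // exactly 7 entries
        "coordinates": [3222, 3218, 0]            // 3 entries (cf. the Coordinates struct: x, y, plane)
    });
    println!("{}", payload);
}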


@@ -0,0 +1,8 @@
CREATE SCHEMA IF NOT EXISTS groupironman;
CREATE TABLE IF NOT EXISTS groupironman.groups(
group_id BIGSERIAL UNIQUE,
group_name TEXT NOT NULL,
group_token_hash CHAR(64) NOT NULL,
PRIMARY KEY (group_name, group_token_hash)
);


@@ -0,0 +1,197 @@
use crate::collection_log::COLLECTION_LOG_DATA;
use crate::config::Config;
use crate::db;
use crate::error::ApiError;
use crate::models::{CaptchaVerifyResponse, CreateGroup, GEPrices, WikiGEPrices};
use crate::validators::valid_name;
use actix_web::{get, http::header::ContentType, post, web, Error, HttpResponse};
use arc_swap::{ArcSwap, ArcSwapAny};
use deadpool_postgres::{Client, Pool};
use lazy_static::lazy_static;
use std::sync::Arc;
use std::time::Duration;
use tokio::{task, time};
lazy_static! {
static ref GE_PRICES: ArcSwapAny<Arc<String>> = ArcSwap::from(Arc::new(String::default()));
static ref HTTP_CLIENT: reqwest::Client = reqwest::Client::new();
}
pub async fn fetch_latest_prices() -> Result<WikiGEPrices, ApiError> {
let res = HTTP_CLIENT
.get("https://prices.runescape.wiki/api/v1/osrs/latest")
.header("User-Agent", "Group Ironmen - Dprk#8740")
.send()
.await
.map_err(ApiError::ReqwestError)?;
let wiki_ge_prices = res
.json::<WikiGEPrices>()
.await
.map_err(ApiError::ReqwestError)?;
Ok(wiki_ge_prices)
}
pub async fn update_ge_prices() -> Result<(), ApiError> {
let wiki_ge_prices = fetch_latest_prices().await?;
let mut ge_prices: GEPrices = std::collections::HashMap::new();
for (item_id, wiki_ge_price) in wiki_ge_prices.data {
let mut avg_ge_price: i64 = 0;
match wiki_ge_price.high {
Some(high) => avg_ge_price = high,
None => (),
}
match wiki_ge_price.low {
Some(low) => {
if avg_ge_price > 0 {
avg_ge_price = (avg_ge_price + low) / 2
} else {
avg_ge_price = low
}
}
None => (),
}
ge_prices.insert(item_id, avg_ge_price);
}
GE_PRICES.store(Arc::new(serde_json::to_string(&ge_prices)?));
Ok(())
}
pub fn start_ge_updater() {
task::spawn(async {
let mut interval = time::interval(Duration::from_secs(14400));
loop {
interval.tick().await;
log::info!("Fetching latest ge prices");
match update_ge_prices().await {
Ok(_) => (),
Err(err) => {
log::error!("Failed to fetch latest ge prices: {}", err);
}
}
}
});
}
pub fn start_skills_aggregator(db_pool: Pool) {
task::spawn(async move {
let mut interval = time::interval(Duration::from_secs(1800));
loop {
interval.tick().await;
log::info!("Running skill aggregator");
match db_pool.get().await {
Ok(mut client) => {
match db::aggregate_skills(&mut client).await {
Ok(_) => (),
Err(err) => {
log::error!("Failed to aggregate skills: {}", err);
}
}
match db::apply_skills_retention(&mut client).await {
Ok(_) => (),
Err(err) => {
log::error!("Failed to apply skills retention: {}", err);
}
}
}
Err(err) => {
log::error!("Failed to get db client: {}", err);
}
}
}
});
}
#[get("/ge-prices")]
pub async fn get_ge_prices() -> Result<HttpResponse, Error> {
let ge_prices_opt = GE_PRICES.load();
let res: String = (&**ge_prices_opt).clone();
Ok(HttpResponse::Ok()
.append_header(("Cache-Control", "public, max-age=86400"))
.content_type("application/json")
.body(res))
}
pub async fn verify_captcha(
response: &String,
secret: &String,
) -> Result<CaptchaVerifyResponse, ApiError> {
let body = [("response", response), ("secret", secret)];
let res = HTTP_CLIENT
.post("https://hcaptcha.com/siteverify")
.form(&body)
.send()
.await
.map_err(ApiError::ReqwestError)?;
let captcha_verify_response = res
.json::<CaptchaVerifyResponse>()
.await
.map_err(ApiError::ReqwestError)?;
Ok(captcha_verify_response)
}
#[post("/create-group")]
pub async fn create_group(
create_group: web::Json<CreateGroup>,
db_pool: web::Data<Pool>,
config: web::Data<Config>,
) -> Result<HttpResponse, Error> {
let mut create_group_inner = create_group.into_inner();
if config.hcaptcha.enabled {
let captcha_verify_response = verify_captcha(
&create_group_inner.captcha_response,
&config.hcaptcha.secret,
)
.await?;
if !captcha_verify_response.success {
return Ok(HttpResponse::BadRequest().body("Captcha response verification failed"));
}
}
if create_group_inner.member_names.len() > 5 {
return Ok(HttpResponse::BadRequest().body("Too many member names provided"));
}
create_group_inner.name = create_group_inner.name.trim().to_string();
if !valid_name(&create_group_inner.name) {
return Ok(HttpResponse::BadRequest().body("Provided group name is not valid"));
}
create_group_inner
.member_names
.retain(|member_name| member_name.trim().len() > 0);
for member_name in &create_group_inner.member_names {
if !valid_name(&member_name) {
return Ok(HttpResponse::BadRequest()
.body(format!("Member name {} is not valid", member_name)));
}
}
let mut client: Client = db_pool.get().await.map_err(ApiError::PoolError)?;
db::create_group(&mut client, &create_group_inner).await?;
Ok(HttpResponse::Created().json(&create_group_inner))
}
#[get("captcha-enabled")]
pub async fn captcha_enabled(config: web::Data<Config>) -> Result<HttpResponse, Error> {
Ok(HttpResponse::Ok().json(&config.hcaptcha))
}
#[get("collection-log-info")]
pub async fn collection_log_info() -> HttpResponse {
HttpResponse::Ok()
.content_type(ContentType::json())
.body(&**COLLECTION_LOG_DATA)
}
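
For reference, a sketch of a create-group request body as implied by the CreateGroup model: captcha_response has a serde default so it may be empty (or omitted) when hcaptcha is disabled, token is generated server-side and skipped on input, and the handler rejects more than 5 member names. The names here are hypothetical.

fn main() {
    let body = serde_json::json!({
        "name": "my-group",
        "member_names": ["alice", "bob"],
        "captcha_response": ""
    });
    println!("{}", body);
}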


@@ -0,0 +1,151 @@
use crate::collection_log::{CollectionLog, CollectionLogInfo};
use crate::error::ApiError;
use lazy_static::lazy_static;
use regex::Regex;
#[cfg(test)]
mod valid_name_tests {
use super::*;
#[test]
fn valid_names() {
let valid_names = [
"test",
"with space",
"with 1234",
"123",
"with-dash",
"dash-and space",
"CAPITAL LETTERS",
"MiXeD case-123",
"0123456789",
"underscore_name",
" space",
"space ",
];
for name in valid_names {
assert!(valid_name(name), "{} should have been a valid name", name);
}
}
#[test]
fn invalid_names() {
let invalid_names = [
"@SHARED",
"invalid!",
"@",
"-=+[];'./,<>?\"\\|`~",
"=",
"+",
"[",
"]",
";",
"'",
".",
"/",
",",
"<",
">",
"?",
"\"",
"\\",
"|",
"`",
"~",
"",
" ",
" ",
];
for name in invalid_names {
assert!(
!valid_name(name),
"{} should have been an invalid name",
name
);
}
}
}
pub fn valid_name(name: &str) -> bool {
lazy_static! {
static ref NAME_RE: Regex = Regex::new("[^A-Za-z 0-9-_]").unwrap();
}
let len = name.len();
(1..=16).contains(&len) && name.is_ascii() && !NAME_RE.is_match(name) && name.trim().len() > 0
}
pub fn validate_member_prop_length<T>(
prop_name: &str,
value: &Option<Vec<T>>,
min: usize,
max: usize,
) -> Result<(), ApiError> {
match value {
None => Ok(()),
Some(x) => {
if (min..=max).contains(&x.len()) {
Ok(())
} else {
Err(ApiError::GroupMemberValidationError(format!(
"{} length violated range constraint {}..={} actual={}",
prop_name,
min,
max,
x.len()
)))
}
}
}
}
pub fn validate_collection_log(
collection_log_info: &actix_web::web::Data<CollectionLogInfo>,
collection_logs: &mut Option<Vec<CollectionLog>>,
) -> Result<(), ApiError> {
let Some(logs) = collection_logs else {
return Ok(());
};
for collection_log in logs {
let page_id = *collection_log_info
.page_name_to_id(&collection_log.page_name)
.ok_or_else(|| {
ApiError::GroupMemberValidationError(format!(
"invalid collection log page {}",
collection_log.page_name
))
})?;
// Items arrive as flat pairs; even indices are item ids.
let number_of_items: usize = collection_log.items.len() / 2;
if number_of_items > collection_log_info.number_of_items_in_page(page_id) {
return Err(ApiError::GroupMemberValidationError(format!(
"{} is too many items for collection log {}",
number_of_items, collection_log.page_name
)));
}
for i in (0..collection_log.items.len()).step_by(2) {
let item_id = collection_log_info.remap_item_id(collection_log.items[i]);
collection_log.items[i] = item_id;
if !collection_log_info.has_item(page_id, item_id) {
return Err(ApiError::GroupMemberValidationError(format!(
"collection log {} does not have item id {}",
collection_log.page_name, item_id
)));
}
}
}
Ok(())
}


@@ -0,0 +1,6 @@
**/node_modules
**/.husky
**/.eslint*
**/prettier*
README.md
.gitignore


@@ -0,0 +1,30 @@
{
"extends": [
"eslint:recommended"
],
"rules": {
"padding-line-between-statements": [
"error",
{ "blankLine": "always", "prev": "function", "next": "function" },
{ "blankLine": "always", "prev": "*", "next": "class" },
{ "blankLine": "always", "prev": "*", "next": "export" },
{ "blankLine": "always", "prev": "import", "next": "function" },
{ "blankLine": "always", "prev": "import", "next": "const" },
{ "blankLine": "always", "prev": "import", "next": "let" }
],
"lines-between-class-members": ["error", "always"],
"no-unused-vars": "error",
"no-console": ["error", { "allow": ["warn", "error"] }],
"no-empty": "off"
},
"env": {
"es6": true,
"browser": true
},
"parserOptions": {
"ecmaVersion": 2020,
"sourceType": "module",
"ecmaFeatures": {
}
}
}

group-ironmen-master/site/.gitignore (vendored, 29 lines)

@@ -0,0 +1,29 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Dependency directories
node_modules/
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
public/*.js
public/*.html
public/*.map
nocommit


@@ -0,0 +1,5 @@
#!/bin/sh
. "$(dirname "$0")/_/husky.sh"
cd site
npm run precommit


@@ -0,0 +1,9 @@
*~
\#*#
/.emacs.desktop
/.emacs.desktop.lock
*.elc
auto-save-list
tramp
.#*
*.html


@@ -0,0 +1,5 @@
{
"tabWidth": 2,
"semi": true,
"printWidth": 120
}


@@ -0,0 +1,14 @@
###############################################
# Frontend Image
###############################################
FROM node:16.10.0-alpine as production-frontend
WORKDIR /app
COPY ["./package.json", "./package-lock.json*", "./"]
RUN npm install --ignore-scripts
COPY . .
ENTRYPOINT ["/app/scripts/docker-entrypoint.sh"]
CMD ["npm","run","serve"]


@@ -0,0 +1,177 @@
const fs = require('fs');
const path = require('path');
const { minify } = require("terser");
const { performance } = require('perf_hooks');
const CleanCSS = require('clean-css');
const cleanCSSInstance = new CleanCSS({});
const productionMode = process.argv.some((arg) => arg === '--prod');
if (productionMode) {
console.log("Production mode is enabled");
}
const mapJsonPlugin = {
name: 'mapTilesJson',
setup(build) {
const mapImageFiles = fs.readdirSync("public/map").filter((file) => file.endsWith('.webp')).map((file) => path.basename(file, '.webp'));
const tiles = [[], [], [], []];
for (const mapImageFile of mapImageFiles) {
const [plane, x, y] = mapImageFile.split('_').map((x) => parseInt(x, 10));
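// Cantor pairing: encode the (x, y) tile coordinates as a single integer id.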
tiles[plane].push(((x + y) * (x + y + 1)) / 2 + y);
}
const icons = JSON.parse(fs.readFileSync("public/data/map_icons.json", 'utf8'));
const labels = JSON.parse(fs.readFileSync("public/data/map_labels.json", 'utf8'));
const result = {
tiles,
icons,
labels
};
fs.writeFileSync('public/data/map.json', JSON.stringify(result));
}
}
const componentBuildPlugin = {
name: 'componentBuild',
setup(build) {
const components = new Set(JSON.parse(fs.readFileSync('components.json', 'utf8')));
build.onLoad({ filter: /\.js$/ }, async (args) => {
const componentDir = path.dirname(args.path);
const componentName = path.basename(args.path, '.js');
const isComponent = components.has(componentName);
let jsText = await fs.promises.readFile(args.path, 'utf8');
if (isComponent) {
try {
let htmlText = await fs.promises.readFile(`${componentDir}/${componentName}.html`, 'utf8');
jsText = jsText.replace(`{{${componentName}.html}}`, htmlText);
} catch {}
}
return {
contents: jsText,
loader: 'js'
};
});
}
}
const buildLoggingPlugin = {
name: "buildLogging",
setup(build) {
let start;
build.onStart(() => {
start = performance.now();
console.log('\nBuild started');
});
build.onEnd(() => {
console.log(`Build finished in ${(performance.now() - start).toFixed(1)}ms`);
});
}
};
const htmlBuildPlugin = {
name: "htmlBuild",
setup(build) {
const components = JSON.parse(fs.readFileSync('components.json', 'utf8'));
const imagesToInline = [
"/ui/border-button.png",
"/ui/border-button-dark.png",
"/ui/checkbox.png",
"/ui/border.png",
"/ui/border-dark.png",
"/ui/border-tiny.png",
"/ui/border-tiny-dark.png",
"/ui/297-0.png",
"/ui/297-0-dark.png"
];
build.onEnd(async () => {
let htmlFile = await fs.promises.readFile("src/index.html", "utf8");
const cssFiles = ['src/main.css', ...components.map((component) => `./src/${component}/${component}.css`)];
const cssReadResults = await Promise.all(cssFiles.map((cssFile) => fs.promises.readFile(cssFile, "utf8")));
let css = cssReadResults.join('');
for (const imagePath of imagesToInline) {
const imageData = await fs.promises.readFile(`public/${imagePath}`, "base64");
css = css.replace(imagePath, `data:image/png;base64,${imageData}`);
}
if (productionMode) {
css = cleanCSSInstance.minify(css).styles;
}
htmlFile = htmlFile.replace("{{style}}", css);
const jsContent = await fs.promises.readFile('public/app.js', 'utf8');
htmlFile = htmlFile.replace("{{js}}", jsContent);
await fs.promises.writeFile("public/index.html", htmlFile);
});
}
};
const minifyJsPlugin = {
name: "minifyJs",
setup(build) {
build.onEnd(async () => {
if (!productionMode) return;
console.log('Minifying app.js');
const code = await fs.promises.readFile("public/app.js", "utf8");
const result = await minify(code, {
sourceMap: {
filename: "app.js",
url: "app.js.map"
},
ecma: "2017",
mangle: {
keep_classnames: false,
keep_fnames: false,
module: true,
reserved: [],
toplevel: true
},
compress: {
ecma: "2017"
},
module: true
});
await fs.promises.writeFile("public/app.js", result.code);
await fs.promises.writeFile("public/app.js.map", result.map);
});
}
};
function build() {
require('esbuild').build({
entryPoints: ['src/index.js'],
bundle: true,
sourcemap: true,
minify: false,
format: 'esm',
outfile: 'public/app.js',
plugins: [componentBuildPlugin, minifyJsPlugin, htmlBuildPlugin, buildLoggingPlugin, mapJsonPlugin]
}).catch((error) => console.error(error));
}
const watch = process.argv.find((arg) => arg === "--watch");
if (watch) {
const chokidar = require('chokidar');
const watcher = chokidar.watch('src', {
ignorePermissionErrors: true,
ignored: ".#*"
});
watcher.on('change', () => {
build();
});
}
build();


@@ -0,0 +1 @@
["search-element","inventory-item","inventory-pager","app-navigation","items-page","app-route","map-page","side-panel","player-panel","player-stats","player-inventory","player-skills","skill-box","player-equipment","xp-dropper","rs-tooltip","item-box","total-level-box","player-quests","men-homepage","wrap-routes","create-group","men-link","setup-instructions","app-initializer","group-settings","member-name-input","men-input","edit-member","loading-screen","login-page","logout-page","demo-page","social-links","rune-pouch","stat-bar","player-interacting","skills-graphs","skill-graph","confirm-dialog","panels-page","diary-dialog","player-diaries","diary-completion","canvas-map","collection-log","collection-log-page","collection-log-tab","collection-log-item","player-icon","donate-button"]

group-ironmen-master/site/package-lock.json (generated, 6564 lines): diff suppressed because it is too large.


@@ -0,0 +1,38 @@
{
"name": "group-ironmen",
"version": "1.0.0",
"description": "",
"devDependencies": {
"axios": "^0.26.1",
"chokidar": "^3.5.3",
"clean-css": "^5.3.2",
"compression": "^1.7.4",
"concurrently": "^6.2.2",
"esbuild": "^0.17.10",
"eslint": "^7.32.0",
"express": "^4.17.1",
"express-winston": "^4.2.0",
"glob": "^7.2.0",
"husky": "^7.0.2",
"jsdom": "^17.0.0",
"prettier": "^2.4.1",
"terser": "^5.16.5",
"winston": "^3.3.3"
},
"scripts": {
"bundle": "npm run clean && node build.js --prod",
"clean": "node scripts/clean.js",
"start": "concurrently \"npm run watch\" \"npm run serve -- --backend https://groupiron.men\"",
"start:local-api": "concurrently \"npm run watch\" \"npm run serve -- --backend http://127.0.0.1:8080\"",
"serve": "node scripts/server.js",
"watch": "node build --watch",
"lint": "eslint --ext .js src",
"format": "prettier --write src/",
"format:check": "prettier --check src/",
"fix": "npm run lint -- --fix",
"generate-component": "node scripts/generate-component.js",
"prepare": "cd .. && husky install site/.husky",
"precommit": "npm run format:check && npm run lint"
},
"author": "Christopher Brown"
}

3 binary image files added (12 KiB, 13 KiB, 12 KiB); contents not shown.


@@ -0,0 +1,9 @@
<?xml version="1.0" encoding="utf-8"?>
<browserconfig>
<msapplication>
<tile>
<square150x150logo src="/mstile-150x150.png"/>
<TileColor>#da532c</TileColor>
</tile>
</msapplication>
</browserconfig>

6 file diffs suppressed because one or more lines are too long.

42 binary image files added (UI and icon assets, 262 B to 15 KiB); contents not shown.

Some files were not shown because too many files have changed in this diff.