Hi @quigua,
Thanks for your comment; indeed it has grown into an extended API. I'm often tempted to turn it into a Python package.
@kalkulus is also busy with an spltools (Python) repo. Check out his latest post:
https://peakd.com/hive-13323/@kalkulus/a-strategy-for-spending-glint-based-on-current-collection-progress-and-updates-to-spltools
Indeed, I currently only use CSV files; it still performs well enough, with 3 accounts monitored since 2023-04-30. But I should be moving towards SQL, or at least Parquet files.
When you start with your tool and have questions, let me know 👍
I understand that you have CSV files per user. How big are those one-year files? I downloaded the battle data from the last few days for all active users, and the CSV files are already around 5 GB. If I plan to keep permanent statistics on all users, I have to optimize the storage a lot. That's why I asked about using SQLite, but Parquet files may be a better solution.
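For the SQLite route, a minimal sketch of importing battle rows from CSV into a table keyed on battle_id (the column names here are made up for illustration; your actual schema will differ). Using the id as the primary key with `INSERT OR IGNORE` makes re-imports idempotent, so overlapping downloads don't create duplicates:

```python
import csv
import io
import sqlite3

def import_battles(conn, csv_text):
    """Load battle rows from CSV text into SQLite, skipping duplicate battle_ids."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS battles (
               battle_id TEXT PRIMARY KEY,
               player    TEXT,
               result    TEXT
           )"""
    )
    reader = csv.DictReader(io.StringIO(csv_text))
    # Named placeholders accept the dicts produced by DictReader directly.
    conn.executemany(
        "INSERT OR IGNORE INTO battles VALUES (:battle_id, :player, :result)",
        reader,
    )
    conn.commit()

# Usage with an in-memory database; note the repeated b1 row is ignored.
conn = sqlite3.connect(":memory:")
sample = "battle_id,player,result\nb1,alice,win\nb2,bob,loss\nb1,alice,win\n"
import_battles(conn, sample)
print(conn.execute("SELECT COUNT(*) FROM battles").fetchone()[0])  # → 2
```

SQLite also lets you query aggregate statistics per user without loading the whole dataset into memory, which matters once the data outgrows what a 5 GB pile of CSVs can comfortably support.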
I also wanted to try building it with MongoDB. I ran a small experiment with it, and it is also a nice fit for the way I handle the transactions.
This method with CSV is also just easy for others to use, without the knowledge needed to set up a database.
For now my biggest file is 40 MB, not that much. I do not store all the data, just the data I need.
The most important field is the transaction id (or battle_id); that can always be used to migrate and extend the data if it turns out I'm missing something in the future.