{{Infobox 3DS homebrew
| title = eShop analysis tool
| image = https://dlhb.gamebrew.org/3dshomebrew/3DS-eShop-analysis-tool.jpeg|250px
| type = Utilities
| version = v1.0
| source = https://dlhb.gamebrew.org/3dshomebrew/3DS-eShop-analysis-tool-3DS.rar
}}
= 3DS eShop analysis tool =
''A Python script to help you get deeper insights into the Nintendo 3DS eShops''
== Requirements ==
To run this script, you need the following:
* an up-to-date installation of Python 3 (at least 3.6.x)
* the requests module installed (<code>py -3 -m pip install requests</code> from the command line)
== What it does ==
This script scrapes the Nintendo 3DS eShops for all available regions (more than 200), merges the data, and compares it with [http://www.3dsdb.com/ 3dsdb] and the data from ''that titlekey site''. Among other things, it shows which titles are available globally and which titles have already been archived.
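The merge-and-compare step described above can be sketched in a few lines of Python. This is a toy illustration only: the data shapes, title IDs, and the <code>archived</code> set are made up for the example and do not reflect the script's actual internals.

```python
from collections import defaultdict

# Hypothetical per-region dumps: region code -> set of title IDs seen in
# that region's eShop. The real script builds these by scraping; this is
# toy data for illustration.
region_dumps = {
    "US": {"0004000000030800", "00040000000EE000"},
    "GB": {"0004000000030800"},
    "JP": {"0004000000030800", "1111111111111111"},
}

# Merge: map each title ID to the set of regions where it appears.
regions_by_title = defaultdict(set)
for region, titles in region_dumps.items():
    for title_id in titles:
        regions_by_title[title_id].add(region)

# A title is available "globally" if it appears in every scraped region.
all_regions = set(region_dumps)
global_titles = {t for t, r in regions_by_title.items() if r == all_regions}

# Compare against an archive list (stand-in for 3dsdb / titlekey data)
# to find titles that have not been archived yet.
archived = {"0004000000030800"}
unarchived = set(regions_by_title) - archived
```

With the toy data above, only <code>0004000000030800</code> is present in all three regions, and the other two titles come out as unarchived.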
== How to run ==
Just run the script via <code>py -3 eat.py</code> (or <code>python3 eat.py</code> on Unix). To include titlekey information in the results '''(highly recommended)''', add <code>-t [TITLEKEYURL]</code> or <code>--titlekeyurl [TITLEKEYURL]</code>, where <code>[TITLEKEYURL]</code> is the URL (with 'http://') of ''that titlekeys site''. If you don't want to do this every time, you may also edit <code>titlekeyurl</code> in the source code; it is right at the top. To add proper title IDs and title sizes to the results '''(also highly recommended)''', you need to provide <code>ctr-common-1.crt</code> and <code>ctr-common-1.key</code>. You may also limit the scope of analysed regions via <code>-r [REGION]</code> or <code>--region=[REGION]</code>, where <code>[REGION]</code> is <code>english</code>, <code>main</code>, or the two-letter country code of a specific region.

Resulting CSV files are written to the <code>results</code> subdirectory; intermediate dumps are written to the <code>dumped</code> subdirectory.
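The flag handling described above could look roughly like the following <code>argparse</code> sketch. The flag names match the article, but the defaults and help text are assumptions, not the script's actual source.

```python
import argparse

# Minimal sketch of the command-line interface described above.
# Flag names come from the article; defaults/help text are assumed.
parser = argparse.ArgumentParser(prog="eat.py")
parser.add_argument("-t", "--titlekeyurl",
                    help="URL (with http://) of that titlekeys site")
parser.add_argument("-r", "--region", default="main",
                    help="'english', 'main', or a two-letter country code")

# Example invocation, equivalent to:
#   py -3 eat.py -t http://example.invalid --region=english
args = parser.parse_args(["-t", "http://example.invalid", "--region=english"])
```

Both flag spellings from the article (<code>-r [REGION]</code> and <code>--region=[REGION]</code>) parse to the same attribute with <code>argparse</code>.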
== Credits ==
I actually learnt Python writing this script, and doing so wouldn't have been possible without @ihaveamac's help. @ihaveamac also started this by providing the eShop parser function. Thanks a gigaton!
Revision as of 11:26, 20 April 2020