From bd86f4fc950cdc5bb4cb346f48c14a6e356dc4fb Mon Sep 17 00:00:00 2001
From: David Luevano Alvarado
Date: Thu, 7 Mar 2024 21:55:16 -0600
Subject: stop tracking live/

---
 live/blog/a/learned_go_and_lua_hard_way.html | 159 ---------------------------
 1 file changed, 159 deletions(-)
 delete mode 100644 live/blog/a/learned_go_and_lua_hard_way.html

diff --git a/live/blog/a/learned_go_and_lua_hard_way.html b/live/blog/a/learned_go_and_lua_hard_way.html
deleted file mode 100644
index dc62d96..0000000
--- a/live/blog/a/learned_go_and_lua_hard_way.html
+++ /dev/null
@@ -1,159 +0,0 @@
-I had to learn Go and Lua the hard way -- Luévano's Blog

I had to learn Go and Lua the hard way


TL;DR: I learned Go and Lua the hard way by forking (for fixing):

- metafates/mangal
- metafates/mangal-lua-libs
- metafates/mangal-scrapers


In the last couple of days I've been setting up a Komga server for manga downloaded using metafates/mangal (upcoming setup entry about it), and everything was fine until I tried to download One Piece from MangaDex, for which mangal has a built-in scraper. Long story short, the issue was that MangaDex's API only allows requesting manga chapters in chunks of 500, and the way that was being handled was completely wrong; specifics can be found in my commit (and the subsequent minor fix commit).
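For reference, here's a minimal Go sketch of the kind of paging that fix needed: request the chapter feed in chunks of 500 and advance the offset until the total reported by the API has been covered. The endpoint, query parameters and response fields are my assumptions about the MangaDex API, not code lifted from mangal.

```go
package scraper

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// feedPage models only the fields needed for pagination; the real
// response has many more (assumed MangaDex chapter feed schema).
type feedPage struct {
	Data  []json.RawMessage `json:"data"`
	Total int               `json:"total"`
}

// fetchAllChapters walks /manga/{id}/feed in chunks of 500, bumping
// the offset until every chapter reported by the API is collected.
func fetchAllChapters(mangaID string) ([]json.RawMessage, error) {
	const limit = 500 // assumed per-request cap on the feed endpoint
	var chapters []json.RawMessage

	for offset := 0; ; offset += limit {
		url := fmt.Sprintf(
			"https://api.mangadex.org/manga/%s/feed?limit=%d&offset=%d",
			mangaID, limit, offset,
		)
		resp, err := http.Get(url)
		if err != nil {
			return nil, err
		}

		var page feedPage
		err = json.NewDecoder(resp.Body).Decode(&page)
		resp.Body.Close()
		if err != nil {
			return nil, err
		}

		chapters = append(chapters, page.Data...)
		if len(page.Data) == 0 || offset+limit >= page.Total {
			break
		}
	}
	return chapters, nil
}
```

The key point is that the loop keys off the total field returned by the API instead of assuming a single request covers the whole series.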


I tried to do a PR, but the project hasn't been active since Feb 2023 (same reason I didn't even try to do PRs on the other repos), so I closed it and will start working on my own fork, probably just merging everything Belphemur's fork has to offer, as he's been working on mangal actively. I could probably just fork from him and/or submit PRs to him, but I think I saw some changes I didn't really like; I'll have to look more into it.


Also, while trying to use some of the custom scrapers, I ran into issues with the headless Chrome browser implementation: it didn't close after each manga chapter download, causing my CPU and memory usage to max out and making me lose control of the system. So I also had to fork metafates/mangal-lua-libs and "fix" the issue (I say "fix" because that wasn't the real issue in the end; it was how the custom scrapers were using it, shitty documentation) by adding the browser.Close() function to the headless Lua API (commit), and I merged some commits from the original vadv/gopher-lua-libs just to pull in any features the Lua libs needed.
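For context, this is roughly what exposing a close function to Lua looks like with yuin/gopher-lua, the library that gopher-lua-libs (and hence mangal-lua-libs) is built on. The browser type and the headless module name below are stand-ins I made up for illustration, not the actual mangal-lua-libs code.

```go
package main

import (
	lua "github.com/yuin/gopher-lua"
)

// browser is a stand-in for the real headless Chrome handle.
type browser struct{}

// Close would shut down the underlying Chrome process.
func (b *browser) Close() error { return nil }

func main() {
	L := lua.NewState()
	defer L.Close()

	b := &browser{}

	// Register a `headless` table whose `close` function releases the
	// browser, so Lua scrapers can clean up after each chapter.
	mod := L.NewTable()
	L.SetField(mod, "close", L.NewFunction(func(L *lua.LState) int {
		if err := b.Close(); err != nil {
			L.Push(lua.LString(err.Error()))
			return 1 // hand the error message back to the Lua caller
		}
		return 0
	}))
	L.SetGlobal("headless", mod)

	// A scraper would call this once it is done with a download.
	if err := L.DoString(`headless.close()`); err != nil {
		panic(err)
	}
}
```

Whether close hangs off a module table or a browser userdata object is an implementation detail; the point is that without an explicit close exposed to Lua, every scraped chapter can leave a Chrome process running.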


Finally, I forked metafates/mangal-scrapers (actually NotPhantomX's fork of it, since they had included more scrapers from some pull requests) to be able to have updated custom Lua scrapers (in which I also fixed the headless bullshit) and use them with my mangal.


So, I went down the rabbit hole of manga scraping because I wanted to set up my Komga server, and, more importantly, I had to quickly learn Go and Lua (Lua was easier). I have to say Go's module management is super convoluted: all the research I did led me to totally different answers, but that's just because of the different Go versions and how old the answers were.
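As a concrete example of the module juggling this involves, a go.mod can point the upstream module paths at forks with replace directives while the forks are being worked on. The module paths and versions below are hypothetical, just to show the shape of it:

```
module github.com/example/mangal-fork // hypothetical fork of metafates/mangal

go 1.21

require github.com/metafates/mangal-lua-libs v0.0.0 // placeholder version

// Use the forked Lua libs instead of the upstream module.
replace github.com/metafates/mangal-lua-libs => github.com/example/mangal-lua-libs v0.0.0
```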

\ No newline at end of file
--
cgit v1.2.3-54-g00ecf