• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • I don’t pirate software anymore. If I do the math, the cost of even a mediocre AAA title is dwarfed by what I’d spend on a night out for the same amount of enjoyment, so the value is there for me. On top of that, the risk of malware (or the effort of mitigating it) isn’t really worth it.

    TV and movies? Pirate them. The streaming services are garbage and the content has too much crap in it for me to want to pay a corporation for it. If it became too hard to pirate, I just wouldn’t watch it anymore.

    Books kind of fall in the middle. Happy to pay for ebooks if the author makes it practical, but I’m not keen on buying through Amazon.


  • It’s a little worrisome, actually. Professionally written software still needs a human to verify things are correct, consistent, and safe, but the tasks we used to foist off on more junior developers are increasingly being done by AI.

    Part of that is fine: offloading minor documentation updates and “trivial” tasks to AI is easy to do and review while remaining productive. But it comes at the cost of depriving the next generation of junior developers of the tasks they need to gain experience and work toward a more senior level.

    If companies lean too hard into that, we’re going to have serious problems when this generation of developers starts retiring and the next generation is understaffed, underpopulated, and probably underpaid.





  • Sonarr and Radarr are there for managing your requests: they’ll handle things like downloading a release when it becomes available (either because it’s new or because a torrent/nzb wasn’t readily available when you added it), upgrading an existing file to a higher-quality version if one shows up, and sourcing a new copy if you mark the one it found as bad (e.g. huge, hard-coded Korean subtitles ruining your movie).

    If you’re trying to find new stuff based on vague criteria (like “90s action movie”), I don’t think any of the self-hosted apps are a huge help. You’re probably better off sourcing ideas from an external site like IMDb or TVDB (maybe even Rotten Tomatoes?). Those sites maintain their own rich indexes of content and tags, whereas the self-hosted stuff is built more around “I’ll make an API request once I know what you’re looking for”, which sucks when you don’t really know what you’re looking for.

    I think there are even browser extensions for IMDb that will add a button to the IMDb movie page letting you automatically add it to Radarr if you like the look of it.
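
    If you’d rather script that than install an extension, Radarr exposes an HTTP API you can hit yourself. Here’s a minimal sketch, assuming a default install on port 7878 and a v3-style API; the field names and endpoint can vary by version (and some versions want a lookup call first to fill in title metadata), so check your own instance’s API docs:

    ```python
    # Hedged sketch: add a movie to Radarr via its HTTP API once you have a TMDB id
    # (e.g. taken from the IMDb/TMDB page you were browsing). Verify field names
    # against your own instance's API docs before relying on this.
    import requests

    RADARR_URL = "http://localhost:7878"   # assumption: default Radarr port
    API_KEY = "your-radarr-api-key"        # Settings -> General in the Radarr UI

    def add_movie(tmdb_id: int, quality_profile_id: int = 1,
                  root_folder: str = "/movies") -> None:
        payload = {
            "tmdbId": tmdb_id,
            "qualityProfileId": quality_profile_id,
            "rootFolderPath": root_folder,
            "monitored": True,
            # some Radarr versions also expect title/year from a prior lookup call
            "addOptions": {"searchForMovie": True},  # kick off a search immediately
        }
        resp = requests.post(
            f"{RADARR_URL}/api/v3/movie",
            json=payload,
            headers={"X-Api-Key": API_KEY},
            timeout=30,
        )
        resp.raise_for_status()

    if __name__ == "__main__":
        add_movie(603)  # example TMDB id only
    ```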


  • I can’t recommend an all-in-one primer, but if you want to look up guides independently, you’ll probably be most interested in these tools/services:

    • a Usenet host (paid; they’re largely the same, so look for deals)
    • a Usenet indexer site (analogous to a Pirate Bay type search engine). I like nzbgeek but there are hundreds. Many require a small annual fee and this may be worth it to you, but you can use free ones to test your initial setup.

    A Usenet indexer is going to let you download .nzb files, which is analogous to downloading .torrent files from a torrent indexer. The nzb describes which posts in which newsgroups contain the files for a particular release (there’s a short sketch of what that looks like right after this list).

    • SABnzbd (download client, analogous to a torrent client like Transmission)
    • browser plugins to simplify clicking an nzb download link and sending it to SABnzbd (not always needed if you’re running everything on your local machine, but important if your SAB instance runs on another server or in a Docker container)
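
    To make the nzb point concrete: the file itself is just XML, where each `<file>` entry lists the newsgroups it was posted to and the message-ids of its segments. A rough sketch of reading one with Python’s standard library (the namespace is the one nzb files commonly declare, and the file name is a placeholder):

    ```python
    # Rough sketch: summarize an .nzb file's contents. Each <file> lists the
    # newsgroups it was posted to and the message-ids (segments) to fetch.
    # Adjust the namespace if your indexer emits something different.
    import xml.etree.ElementTree as ET

    NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}

    def summarize_nzb(path: str) -> None:
        root = ET.parse(path).getroot()
        for f in root.findall("nzb:file", NS):
            subject = f.get("subject", "?")
            groups = [g.text for g in f.findall("nzb:groups/nzb:group", NS)]
            segments = f.findall("nzb:segments/nzb:segment", NS)
            size = sum(int(s.get("bytes", 0)) for s in segments)
            print(f"{subject}\n  groups: {', '.join(groups)}\n"
                  f"  {len(segments)} segments, ~{size / 1_000_000:.1f} MB")

    if __name__ == "__main__":
        summarize_nzb("example.nzb")  # hypothetical file name
    ```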

    If you’re looking to set up some extra infrastructure for automating a lot of the steps, there are also web apps to cover a ton of video use cases, like:

    • Sonarr and Radarr (for monitoring specific TV shows and movies and automatically searching for nzbs, downloading them, and moving them to a final home on disk)
    • Plex or Jellyfin (for providing a Netflix-like UI you can use to look for something to watch and then stream it to your browser/phone/TV)
    • Overseerr (for a single interface to look for shows and movies and have them automatically added to Sonarr/Radarr)

    I’d highly recommend setting up Docker and putting each of these apps into its own container. Linuxserver maintains easy-to-set-up and easy-to-update Docker images for all of these things. It’s also a great resource for finding other web apps you didn’t know you needed.
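
    Most people wire this up with docker-compose, but if it helps to see the moving parts spelled out, here’s a rough sketch using the Python Docker SDK to start the linuxserver Radarr image. The PUID/PGID/TZ variables and the /config and /movies mounts follow the usual linuxserver conventions; the host paths and IDs are placeholders, so check the image’s own docs:

    ```python
    # Rough sketch using the Python Docker SDK (pip install docker) to run the
    # linuxserver Radarr image. Host paths and user/group IDs are placeholders.
    import docker

    client = docker.from_env()

    container = client.containers.run(
        "lscr.io/linuxserver/radarr:latest",
        name="radarr",
        detach=True,
        environment={"PUID": "1000", "PGID": "1000", "TZ": "Etc/UTC"},
        ports={"7878/tcp": 7878},  # Radarr's default web UI port
        volumes={
            "/srv/radarr/config": {"bind": "/config", "mode": "rw"},
            "/srv/media/movies": {"bind": "/movies", "mode": "rw"},
        },
        restart_policy={"Name": "unless-stopped"},
    )
    print(container.name, container.status)
    ```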







  • My org has issues with e2e tests, but we keep them because they usually signal that something, somewhere, isn’t quite right.

    Our CI/CD pipeline is configured to automatically re-run a build if a test in the e2e suite fails. If it fails a second time, then it sends up the usual alerts and a human has to get involved.
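
    Not our actual pipeline config, but the logic amounts to roughly this (the suite command and the alert hook are placeholders):

    ```python
    # Rough sketch of the retry policy described above: run the e2e suite,
    # retry once on failure, and only alert a human if it fails twice in a row.
    import subprocess

    E2E_COMMAND = ["npm", "run", "test:e2e"]  # placeholder for your suite runner
    MAX_ATTEMPTS = 2

    def run_suite() -> bool:
        return subprocess.run(E2E_COMMAND).returncode == 0

    def notify_humans() -> None:
        print("e2e failed twice -- paging the on-call channel")  # stand-in for real alerting

    def main() -> None:
        for attempt in range(1, MAX_ATTEMPTS + 1):
            if run_suite():
                print(f"e2e passed on attempt {attempt}")
                return
            print(f"e2e failed on attempt {attempt}")
        notify_humans()
        raise SystemExit(1)

    if __name__ == "__main__":
        main()
    ```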

    In addition to that, we track transient failures on the main branch and have stats on which ones are the noisiest. Someone is always peeling the noisiest one off the stack to address why it’s failing (usually time zones or browser async issues that are easy to fix).

    It’s imperfect, and it still results in a lot of developers just spam-retrying builds to “get stuff done”, but we’ve decided the signal-to-noise ratio is good enough that we want to keep things the way they are.


  • It works for us to some extent, but there are ultimately people who specialize in various areas. Nobody can be a true master of absolutely every area of a large system with enough knowledge on tap to operate as an independent power player, and I think that’s clear to anybody who works in or near the trenches.

    It should be expected that any member of a full stack team can figure out how to do any particular task, but it’s a waste of resources not to have people lean on teammates who have more recent deep experience in a particular area.

    This goes for both technical areas as well as domains. If my teammate just spent a couple of weeks adding a feature that makes heavy use of, I dunno, database callbacks in the user authentication area, and I just picked up a task to fix a bug in that same area, I have two choices. I can consult with my teammate and get their take on what I’m doing and have them preemptively point out any areas of risk/hidden complexity. Or, I can just blunder into that area and spend my own couple of weeks getting properly ramped up so I can be confident I made the right change and didn’t introduce a subtle regression or performance issue.

    As the number of tech and domain concerns increases, it becomes more and more important for people to collaborate intelligently.

    My team oscillates between 4 and 6 developers, and this has been our practice with pretty high success. Quality is high (measured by our ability to deliver within our initial estimates and by our bug count), morale is good, and everyone is still learning with minimal frustration.

    I think the issue of team cohesion comes into play when people all work independently and also stop talking. If I take the couple of weeks in my previous example instead of talking to my teammate for some knowledge transfer, I’m disincentivized from leaving my personal bubble at all. I’m also less likely to stay within overall software patterns if I think I’m always trailblazing. The first time you write new code, you’ve established a pattern. The first time someone repeats what you did, they’ve cemented that pattern. That’s often a nasty double-edged sword, but I generally lean toward consistency rather than one-off “brilliant” code.

    Where this can start to break down is when people become pigeonholed in a certain role. “Oh, you’re the CSS guy, so why don’t you take this styling ticket?” and “Kevin, you do all the database migrations so here’s another one” are great ways to damage morale and increase risk if someone leaves.

    Tl;dr: everyone should be able to pick up any task, and they should endeavor to keep their axe sharp in every conceivable area, but that needs to be weighed against the time wasted by refusing to lean on other people.

    As with most things, it’s usually a matter of personalities and not policies.